Sample records for analysis methods developed

  1. Cloud Based Development Issues: A Methodical Analysis

    Directory of Open Access Journals (Sweden)

    Sukhpal Singh


Full Text Available Cloud based development is a challenging task for various software engineering projects, especially for those which demand extraordinary quality, reusability and security along with a general architecture. In this paper we present a report on a methodical analysis of cloud based development problems published in major computer science and software engineering journals and conferences. Research papers were collected from different scholarly databases using search engines within a particular period of time. A total of 89 research papers were analyzed in this methodical study and categorized into four classes according to the problems they address. The majority of the research papers focused on quality (24 papers) associated with cloud based development, and 16 papers focused on analysis and design. By considering the areas covered by existing authors and the gaps between them, untouched areas of cloud based development can be identified for future research.

  2. Scientific and methodical approaches to analysis of enterprise development potential

    Directory of Open Access Journals (Sweden)

    Hrechina Iryna V.


Full Text Available The modern state of the Ukrainian economy urges enterprises to search for new possibilities for their development, which makes the subject of this study topical. The article systemises existing approaches to the analysis of enterprise development potential and marks out two main scientific approaches: the first is directed at analysing prospects of self-development of the economic system; the second at analysing the probability of growth opportunities. In order to improve the quality of the process of forming methods for analysing enterprise development potential, the article offers an organisational model of such methods and characterises its main elements. It develops analysis methods based on indicators of potentialogical sustainability. The scientific novelty of the obtained results lies in the possibility of identifying the main directions of enterprise development with the use of the enterprise development potential ratio: self-development or the probability of augmenting opportunities, which is traced through the interconnection of resources and profit.

  3. Recent Developments in Helioseismic Analysis Methods and Solar Data Assimilation

    CERN Document Server

    Schad, Ariane; Duvall, Tom L; Roth, Markus; Vorontsov, Sergei V


We review recent advances and results in enhancing and developing helioseismic analysis methods and in solar data assimilation. In the first part of this paper we focus on selected developments in time-distance and global helioseismology. In the second part, we review the application of data assimilation methods to solar data. Relating solar surface observations and helioseismic proxies to solar dynamo models by means of data assimilation techniques is a promising new approach to explore, and to predict, the magnetic activity cycle of the Sun.

  4. Development of Photogrammetric Methods of Stress Analysis and Quality Control

    CERN Document Server

Kubik, Donna L.; Greenwood, John A.


A photogrammetric method of stress analysis has been developed to test thin, nonstandard windows designed for hydrogen absorbers, major components of a muon cooling channel. The purpose of the absorber window tests is to demonstrate an understanding of the window behavior and strength as a function of applied pressure. This is done by comparing the deformation of the window, measured via photogrammetry, to the deformation predicted by finite element analysis (FEA). The FEA results indicate a strong sensitivity of strain to window thickness. Photogrammetric methods were chosen to measure the thickness of the window, thus providing more accurate input data for the FEA. This, plus improvements made in hardware and testing procedures, resulted in a precision of 5 microns in all dimensions and substantial agreement with FEA predictions.

  5. The development of a 3D risk analysis method. (United States)

    I, Yet-Pole; Cheng, Te-Lung


Much attention has been paid to quantitative risk analysis (QRA) research in recent years due to the increasingly severe disasters that have occurred in the process industries. Owing to the computational complexity, very few software packages, such as SAFETI, can really make the risk presentation meet practical requirements. Moreover, the traditional risk presentation method, like the individual risk contour in SAFETI, is mainly based on the consequence analysis results of dispersion modeling, which usually assumes that the vapor cloud disperses over a constant ground roughness on flat terrain with no obstructions and no concentration fluctuations; this is quite different from the real situation in a chemical process plant. Such models usually over-predict the hazardous regions in order to remain conservative, which also increases the uncertainty of the simulation results. A more rigorous model, such as a computational fluid dynamics (CFD) model, can resolve these limitations, but it cannot resolve the complexity of the risk calculations. In this research, a conceptual three-dimensional (3D) risk calculation method was proposed that combines the results of a series of CFD simulations with post-processing procedures to obtain 3D individual risk iso-surfaces. It is believed that such a technique will not be limited to risk analysis at ground level, but can also be extended to aerial, submarine, or space risk analyses in the near future.
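The post-processing step this abstract describes, turning per-scenario consequence fields into an individual-risk field on a 3D grid, can be sketched in a few lines. The scenario frequencies, concentrations, and the toy dose-response function below are invented for illustration; they are not the paper's models:

```python
# Sketch: combine per-scenario consequence results into 3D individual risk.
# Each scenario contributes frequency * P(fatality) at every grid point;
# the individual risk field is the sum over scenarios. All values invented.

def fatality_probability(concentration_ppm, threshold_ppm=5000.0):
    """Toy dose-response: linear ramp up to a lethal threshold."""
    return min(1.0, max(0.0, concentration_ppm / threshold_ppm))

def individual_risk(scenarios, grid_points):
    """scenarios: list of (frequency_per_year, {point: concentration_ppm})."""
    risk = {p: 0.0 for p in grid_points}
    for freq, conc_field in scenarios:
        for p in grid_points:
            risk[p] += freq * fatality_probability(conc_field.get(p, 0.0))
    return risk

# Two hypothetical release scenarios on a tiny 3D grid of (x, y, z) points:
grid = [(0, 0, 0), (0, 0, 10), (10, 0, 0)]
scenarios = [
    (1e-4, {(0, 0, 0): 8000.0, (0, 0, 10): 2500.0}),   # large release
    (1e-3, {(0, 0, 0): 1000.0, (10, 0, 0): 500.0}),    # small release
]
risk = individual_risk(scenarios, grid)
for p in grid:
    print(p, f"{risk[p]:.2e}")   # per-year individual risk at each point
```

A 3D iso-surface (e.g. the 1e-6 per year surface) would then be extracted from this scalar field with a contouring algorithm such as marching cubes.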

  6. Leadership Development Expertise: A Mixed-Method Analysis (United States)

    Okpala, Comfort O.; Hopson, Linda B.; Chapman, Bernadine; Fort, Edward


In this study, the impact of graduate curriculum, experience, and standards on the development of leadership expertise was examined. The major goals of the study were to (1) examine the impact of college content curriculum on the development of leadership expertise, (2) examine the impact of on-the-job experience on the development of leadership…

  7. Analysis of cultural development of Isfahan city Using Factor analysis method

    Directory of Open Access Journals (Sweden)



Full Text Available Extended abstract. 1 – Introduction: Cultural spaces are considered one of the main factors of development. Cultural development is a qualitative and value-laden process; to assess it, quantitative indicators in cultural planning are used to pursue development objectives in the pattern of goods and services. The aim of the study is to determine and analyze the level of cultural development and the regional inequality of the different districts of Isfahan using the factor analysis technique. The statistical population of the study is the 14 districts of the Isfahan municipality. The dominant approach of this study is quantitative, descriptive and analytical. In this study, 35 indices have been summarized by the factor analysis method, reduced to 5 factors, combined into significant ones and delivered. 2 – Theoretical bases: The most important objective of spatial planning, considering the limitation of resources, is the optimum distribution of facilities and services among the different locations in which people live. To do this, there is a need to identify different locations in terms of their facilities and services, so that developed locations are specified and planners can work toward spatial equilibrium and reducing the privileged distance between districts. The present study has been conducted to move toward equal development in Isfahan's urban districts, following identification of the situation and the distribution of the selected cultural development indices across the districts. 3 – Discussion: The cultural development of societies is evaluated by considering the changes and improvement of its indices and is measured in quantitative terms. Cultural development indices are the most important tools for cultural planning in a particular district of a society. In this study, cultural development indices have been used to determine the levels of the districts. By using the factor analysis model, the share of influential factors in the cultural

  8. Development of automatic image analysis methods for high-throughput and high-content screening

    NARCIS (Netherlands)

    Di, Zi


This thesis focuses on the development of image analysis methods for ultra-high-content analysis of high-throughput screens, in which cellular phenotype responses to various genetic or chemical perturbations are under investigation. Our primary goal is to deliver efficient and robust image analysis

  9. Development of Microcomputer Methods for Analysis and Simulation of Clinical Pharmacokinetic Data Relevant to New Drug Development. (United States)


applications of new microcomputer graphics technology to the analysis and interpretation of clinical pharmacological data. This involves continuing…

  10. Chemical analysis of solid residue from liquid and solid fuel combustion: Method development and validation

    Energy Technology Data Exchange (ETDEWEB)

Trkmic, M. [University of Zagreb, Faculty of Mechanical Engineering and Naval Architecture, Zagreb (Croatia); Curkovic, L. [University of Zagreb, Faculty of Chemical Engineering and Technology, Zagreb (Croatia); Asperger, D. [HEP-Proizvodnja, Thermal Power Plant Department, Zagreb (Croatia)


This paper deals with the development and validation of methods for identifying the composition of the solid residue left after liquid and solid fuel combustion in thermal power plant furnaces. The methods were developed for analysis with an energy dispersive X-ray fluorescence (EDXRF) spectrometer. Because of the different fuels used, the different compositions, and the different locations where solid residue forms, it was necessary to develop two methods. The first method is used for identifying the composition of solid residue after fuel oil combustion (Method 1), while the second is used for identifying the composition of solid residue after the combustion of solid fuels, i.e. coal (Method 2). Method calibration was performed on sets of 12 (Method 1) and 6 (Method 2) certified reference materials (CRMs). CRMs and analysis test samples were prepared in pellet form using a hydraulic press. For the purpose of method validation, the linearity, accuracy, precision and specificity were determined, and the measurement uncertainty of each method was assessed for each analyte separately. The methods were applied in the analysis of real furnace residue samples. (Copyright 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)

  11. A Product Analysis Method and its Staging to Develop Redesign Competences

    DEFF Research Database (Denmark)

    Hansen, Claus Thorp; Lenau, Torben Anker


Most product development work in industrial practice is incremental, i.e. the company has had a product in production and on the market for some time, and now the time has come to design an upgraded variant. This type of redesign project requires that the engineering designers have the competences to carry through an analysis of the existing product, encompassing both a user-oriented and a technical perspective, as well as to synthesise solution proposals for the upgraded variant. In the course module Product Analysis and Redesign we have developed a product analysis method and a staging of it which seems to be very productive. In this paper we present the product analysis method and its staging, and we reflect on the students' application of it. We conclude that the method is a valid contribution to developing the students' redesign competences.

  12. Development of direct-inverse 3-D methods for applied transonic aerodynamic wing design and analysis (United States)

    Carlson, Leland A.


An inverse wing design method was developed around an existing transonic wing analysis code. The original analysis code, TAWFIVE, has as its core the numerical potential flow solver FLO30, developed by Jameson and Caughey. Features of the analysis code include a finite-volume formulation; a wing- and fuselage-fitted curvilinear grid mesh; and a viscous boundary layer correction that also accounts for viscous wake thickness and curvature. The development of the inverse method as an extension of previous methods for design in Cartesian coordinates is presented. Results are shown for inviscid wing design cases in supercritical flow regimes. The test cases selected also demonstrate the versatility of the design method in designing an entire wing or discontinuous sections of a wing.

  13. Development of quantitative duplex real-time PCR method for screening analysis of genetically modified maize. (United States)

    Oguchi, Taichi; Onishi, Mari; Minegishi, Yasutaka; Kurosawa, Yasunori; Kasahara, Masaki; Akiyama, Hiroshi; Teshima, Reiko; Futo, Satoshi; Furui, Satoshi; Hino, Akihiro; Kitta, Kazumi


A duplex real-time PCR method was developed for quantitative screening analysis of GM maize. The duplex real-time PCR simultaneously detects two GM-specific segments, namely the cauliflower mosaic virus (CaMV) 35S promoter (P35S) segment and an event-specific segment for GA21 maize, which does not contain P35S. Calibration was performed with a plasmid calibrant specially designed for the duplex PCR. The results of an in-house evaluation suggested that the analytical precision of the developed method is almost equivalent to that of the simplex real-time PCR methods which have been adopted as ISO standard methods for the analysis of GMOs in foodstuffs and which have also been employed for GMO analysis in Japan. In addition, this method will reduce both the cost and the time requirement of routine GMO analysis by half. The high analytical performance demonstrated in the current study should make the developed method useful for practical quantitative screening analysis of GM maize, although interlaboratory collaborative studies should be conducted to confirm this.
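The quantification step behind a calibrant-based real-time PCR method of this kind can be sketched generically: a standard curve Cq = slope · log10(copies) + intercept is fitted from a dilution series of the calibrant, and unknowns are read back from their measured Cq. The dilution series, Cq values, and the endogenous-reference comparison below are invented for illustration, not the paper's protocol:

```python
import math

# Generic standard-curve quantification for real-time PCR (illustrative only).
# A calibration line Cq = slope * log10(copies) + intercept is fitted from a
# plasmid-calibrant dilution series; unknown samples are read back from Cq.

def fit_standard_curve(dilutions):
    """dilutions: list of (copy_number, measured_Cq). Least-squares line."""
    xs = [math.log10(c) for c, _ in dilutions]
    ys = [cq for _, cq in dilutions]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

def copies_from_cq(cq, slope, intercept):
    return 10 ** ((cq - intercept) / slope)

# Ideal 10-fold dilution series (slope -3.32 corresponds to ~100% efficiency):
series = [(10 ** k, 38.0 - 3.32 * k) for k in range(2, 7)]
slope, intercept = fit_standard_curve(series)
target = copies_from_cq(28.04, slope, intercept)      # GM-specific target
reference = copies_from_cq(24.72, slope, intercept)   # endogenous reference
print(f"GM ratio ~ {100 * target / reference:.1f}%")
```

In a duplex assay the two targets are measured in one reaction with differently labeled probes, but each channel is quantified against its calibration line in the same way.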

  14. Preliminary Tests For Development Of A Non-Pertechnetate Analysis Method

    Energy Technology Data Exchange (ETDEWEB)

    Diprete, D. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); McCabe, D. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)


The objective of this task was to develop a non-pertechnetate analysis method that the 222-S laboratory could easily implement. The initial scope involved working with 222-S laboratory personnel to adapt the existing Tc analytical method to fractionate the non-pertechnetate and pertechnetate. SRNL then developed and tested a method using commercial sorbents containing Aliquat® 336 to extract the pertechnetate (thereby separating it from non-pertechnetate), followed by oxidation, extraction, and stripping steps, and finally analysis by beta counting and mass spectrometry. Several additional items were partially investigated, including the impact of a 137Cs removal step. The method was initially tested on SRS tank waste samples to determine its viability. Although SRS tank waste does not contain non-pertechnetate, testing with it was useful for investigating the compatibility, separation efficiency, interference removal efficacy, and sensitivity of the method.

  15. SENSITIVITY ANALYSIS as a methodical approach to the development of design strategies for environmentally sustainable buildings

    DEFF Research Database (Denmark)

    Hansen, Hanne Tine Ring

architecture through an integrated design approach. The findings of the literature study and the qualitative interviews have directed the PhD project towards the importance of project-specific design strategies and an integrated, multiprofessional approach to environmentally sustainable building design. The project therefore focuses on the issue of design strategy development in an experimental application of sensitivity analysis as a methodical approach to the development of a design strategy for a new energy-efficient residential building in Denmark. The outset of the analysis is a single-family reference building, through which the sensitivity of parameters relating to energy and residential building design is analysed. In conclusion, the PhD project discusses the strengths and weaknesses of sensitivity analysis as a methodical approach to design strategy development, and makes a suggestion…



    Directory of Open Access Journals (Sweden)

    A. A. Chakhirova


Full Text Available The article presents studies on the development of cosmetics containing complex extracts from the herb of Bidens, flowers of Calendula, and flowers of Matricaria. The analysis methods for the obtained extract and for a drug based on it are presented.

  18. Analysis of Pre-Service Science Teachers' Views about the Methods Which Develop Reflective Thinking (United States)

    Töman, Ufuk; Odabasi Çimer, Sabiha; Çimer, Atilla


In this study, we investigate pre-service science and technology teachers' opinions about the methods that develop reflective thinking, and we determine their level of reflective thinking. This is a descriptive study. Open-ended questions were used to determine the views of the pre-service teachers. The questions used in the statistical analysis of…

  19. History and Development of the Schmidt-Hunter Meta-Analysis Methods (United States)

    Schmidt, Frank L.


    In this article, I provide answers to the questions posed by Will Shadish about the history and development of the Schmidt-Hunter methods of meta-analysis. In the 1970s, I headed a research program on personnel selection at the US Office of Personnel Management (OPM). After our research showed that validity studies have low statistical power, OPM…

  20. Development of high performance liquid chromatography method for miconazole analysis in powder sample (United States)

    Hermawan, D.; Suwandri; Sulaeman, U.; Istiqomah, A.; Aboul-Enein, H. Y.


A simple high performance liquid chromatography (HPLC) method has been developed in this study for the analysis of miconazole, an antifungal drug, in a powder sample. The optimized HPLC system using a C8 column was achieved with a mobile phase composition of methanol:water (85:15, v/v), a flow rate of 0.8 mL/min, and UV detection at 220 nm. The calibration graph was linear in the range from 10 to 50 mg/L with r² of 0.9983. The limit of detection (LOD) and limit of quantitation (LOQ) obtained were 2.24 mg/L and 7.47 mg/L, respectively. The present HPLC method is applicable for the determination of miconazole in the powder sample with a recovery of 101.28% (RSD = 0.96%, n = 3). The developed HPLC method provides short analysis time, high reproducibility and high sensitivity.
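Figures of merit like the LOD and LOQ reported above are conventionally derived from the calibration line using the ICH Q2 formulas LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the residual standard deviation and S the slope. A minimal sketch of that calculation follows; the calibration data are invented, not the paper's:

```python
import math

# Sketch of the ICH Q2 calibration-based LOD/LOQ estimates:
# fit the calibration line, take sigma = residual standard deviation,
# then LOD = 3.3*sigma/slope and LOQ = 10*sigma/slope.

def linear_fit(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    resid = [y - (slope * x + intercept) for x, y in zip(xs, ys)]
    sigma = math.sqrt(sum(r * r for r in resid) / (n - 2))  # n-2 dof
    return slope, intercept, sigma

conc = [10, 20, 30, 40, 50]        # mg/L standards (invented)
area = [102, 205, 296, 404, 498]   # detector response (invented)
slope, intercept, sigma = linear_fit(conc, area)
lod = 3.3 * sigma / slope
loq = 10 * sigma / slope
print(f"LOD ~ {lod:.2f} mg/L, LOQ ~ {loq:.2f} mg/L")
```

The same residual-based estimate applies to any linear calibration, which is why LOD and LOQ are routinely quoted alongside r² in method-validation abstracts like this one.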

  1. Whole genome sequence analysis of unidentified genetically modified papaya for development of a specific detection method. (United States)

    Nakamura, Kosuke; Kondo, Kazunari; Akiyama, Hiroshi; Ishigaki, Takumi; Noguchi, Akio; Katsumata, Hiroshi; Takasaki, Kazuto; Futo, Satoshi; Sakata, Kozue; Fukuda, Nozomi; Mano, Junichi; Kitta, Kazumi; Tanaka, Hidenori; Akashi, Ryo; Nishimaki-Mogami, Tomoko


Identification of transgenic sequences in an unknown genetically modified (GM) papaya (Carica papaya L.) by whole genome sequence analysis was demonstrated. Whole genome sequence data were generated for a GM-positive fresh papaya fruit commodity detected during monitoring using real-time polymerase chain reaction (PCR). The sequences obtained were mapped against an open database of the papaya genome sequence. Transgenic construct- and event-specific sequences were identified as belonging to a GM papaya developed to resist infection by Papaya ringspot virus. Based on the transgenic sequences, a specific real-time PCR detection method for this GM papaya, applicable to various food commodities, was developed. Whole genome sequence analysis enabled the identification of unknown transgenic construct- and event-specific sequences in GM papaya and the development of a reliable method for detecting them in papaya food commodities.

  2. Development of methods of the Fractal Dimension estimation for the ecological data analysis

    CERN Document Server

    Jura, Jakub; Mironovová, Martina


This paper deals with estimating the fractal dimension of hydrometeorological variables, such as air temperature or humidity, at different sites in a landscape (to be further evaluated from the land-use point of view). Three algorithms and methods for estimating the fractal dimension of hydrometeorological time series were developed. The first results indicate that the developed methods are usable for the analysis of hydrometeorological variables and for testing their relation to the autoregulation functions of an ecosystem.
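The abstract does not spell out its three algorithms, but one widely used estimator for the fractal dimension of a time series is Higuchi's method, sketched below as a point of reference: the mean curve length L(k) at scale k behaves as k^(-D), so D is recovered from the slope of log L(k) versus log k. This is an illustrative stand-in, not the paper's own algorithms:

```python
import math

# Higuchi's method for the fractal dimension of a time series (illustrative).
# For each scale k, subsampled "curve lengths" are averaged over the k
# possible starting offsets; L(k) ~ k^(-D) gives D from a log-log slope.

def higuchi_fd(x, kmax=8):
    n = len(x)
    log_k, log_l = [], []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):                     # starting offset
            n_i = (n - 1 - m) // k             # number of steps at scale k
            if n_i < 1:
                continue
            dist = sum(abs(x[m + i * k] - x[m + (i - 1) * k])
                       for i in range(1, n_i + 1))
            # normalisation puts every (m, k) subseries on a common footing
            lengths.append(dist * (n - 1) / (n_i * k) / k)
        log_k.append(math.log(k))
        log_l.append(math.log(sum(lengths) / len(lengths)))
    # least-squares slope of log L(k) vs log k; D = -slope
    mk = sum(log_k) / len(log_k)
    ml = sum(log_l) / len(log_l)
    slope = sum((a - mk) * (b - ml) for a, b in zip(log_k, log_l)) / \
            sum((a - mk) ** 2 for a in log_k)
    return -slope

ramp = [0.1 * i for i in range(500)]   # smooth signal: D should be close to 1
print(f"D(ramp) ~ {higuchi_fd(ramp):.2f}")
```

A smooth series gives D near 1, while a very rough one (e.g. white noise) approaches 2, which is what makes the estimate useful for comparing sites such as those in the study.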

  3. Method development and validation for pharmaceutical tablets analysis using transmission Raman spectroscopy. (United States)

    Li, Yi; Igne, Benoît; Drennen, James K; Anderson, Carl A


The objective of the study is to demonstrate the development and validation of a transmission Raman spectroscopic method using the ICH Q2 Guidance as a template. Specifically, Raman spectroscopy was used to determine niacinamide content in tablet cores. A 3-level, 2-factor full factorial design was utilized to generate a partial least-squares model for active pharmaceutical ingredient quantification. Validation of the transmission Raman model focused on figures of merit from three independent batches manufactured at pilot scale. The resultant model statistics were evaluated along with the linearity, accuracy, precision and robustness assessments. Method specificity was demonstrated by accurate determination of niacinamide in the presence of niacin (an expected related substance). The method was demonstrated as fit for purpose and had the desirable characteristic of very short analysis times (~2.5 s per tablet). The resulting method was used for routine content uniformity analysis of single dosage units in a stability study.

  4. Development of a Probabilistic Component Mode Synthesis Method for the Analysis of Non-Deterministic Substructures (United States)

    Brown, Andrew M.; Ferri, Aldo A.


    Standard methods of structural dynamic analysis assume that the structural characteristics are deterministic. Recognizing that these characteristics are actually statistical in nature, researchers have recently developed a variety of methods that use this information to determine probabilities of a desired response characteristic, such as natural frequency, without using expensive Monte Carlo simulations. One of the problems in these methods is correctly identifying the statistical properties of primitive variables such as geometry, stiffness, and mass. This paper presents a method where the measured dynamic properties of substructures are used instead as the random variables. The residual flexibility method of component mode synthesis is combined with the probabilistic methods to determine the cumulative distribution function of the system eigenvalues. A simple cantilever beam test problem is presented that illustrates the theory.
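The abstract's starting point, that even a simple eigenvalue becomes a random variable when the structural properties are statistical, can be illustrated with the brute-force Monte Carlo baseline the paper's probabilistic method is designed to avoid. The one-DOF system and the stiffness distribution below are invented for illustration:

```python
import math
import random
from bisect import bisect_right

# Illustration of the underlying problem: for a one-DOF system the natural
# frequency f = sqrt(k/m)/(2*pi) becomes a random variable when the stiffness
# k is uncertain. A Monte Carlo baseline estimates the CDF of f; probabilistic
# methods (like the paper's) aim to get this CDF without the sampling expense.

random.seed(0)
m = 2.0                        # kg, assumed deterministic
k_mean, k_sd = 8.0e4, 4.0e3    # N/m, stiffness treated as Gaussian (invented)

samples = sorted(math.sqrt(random.gauss(k_mean, k_sd) / m) / (2 * math.pi)
                 for _ in range(20000))

def cdf(f_hz):
    """Empirical P(natural frequency <= f_hz)."""
    return bisect_right(samples, f_hz) / len(samples)

f_nominal = math.sqrt(k_mean / m) / (2 * math.pi)
print(f"nominal f = {f_nominal:.1f} Hz, P(f <= nominal) ~ {cdf(f_nominal):.2f}")
```

Because f is monotone in k, P(f ≤ f_nominal) equals P(k ≤ k_mean) ≈ 0.5 here; the point of methods like probabilistic component mode synthesis is to obtain such distribution information, for full substructured models, without the 20,000 reanalyses.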

  5. Development and validation of an extraction method for the analysis of perfluoroalkyl substances in human hair. (United States)

    Kim, Da-Hye; Oh, Jeong-Eun


    Human hair has many advantages as a non-invasive sample; however, analytical methods for detecting perfluoroalkyl substances (PFASs) in human hair are still in the development stage. Therefore, the aim of this study was to develop and validate a method for monitoring 11 PFASs in human hair. Solid-phase extraction (SPE), ion-pairing extraction (IPE), a combined method (SPE+IPE) and solvent extraction with ENVI-carb clean-up were compared to develop an optimal extraction method using two types of hair sample (powder and piece forms). Analysis of PFASs was performed using liquid chromatography and tandem mass spectrometry. Among the four different extraction procedures, the SPE method using powdered hair showed the best extraction efficiency and recoveries ranged from 85.8 to 102%. The method detection limits for the SPE method were 0.114-0.796 ng/g and good precision (below 10%) and accuracy (66.4-110%) were obtained. In light of these results, SPE is considered the optimal method for PFAS extraction from hair. It was also successfully used to detect PFASs in human hair samples.

  6. Nonlinear modal propagation analysis method in multimode interference coupler for operation development (United States)

    Tajaldini, Mehdi; Mat Jafri, Mohd Zubir Mat


In this study, we propose a novel approach, called the nonlinear modal propagation analysis method (NMPA), for MMI couplers; it analyses nonlinear wave propagation in terms of guided-mode interference in nonlinear regimes, such that the modal fields are measurable at any point of the coupler and at the output facets. An ultra-short MMI coupler is then optimized as a building block in a micro-ring resonator to investigate the method's efficiency against the previously used method. Modeling results demonstrate greater efficiency and accuracy at shorter lengths of the multimode interference coupler. Therefore, NMPA can be used as a method to study compact-dimension couplers and to develop their performance in applications. Furthermore, access to all-optical switching is assumed to be possible with one continuous MMI, as proof of the improved performance in nonlinear regimes.


    Directory of Open Access Journals (Sweden)

    Veronica S. Moertini


Full Text Available Along with the growth of the Internet, the trend shows that e-commerce has been growing significantly in the last several years. This means business opportunities for small-medium enterprises (SMEs), which are recognized as the backbone of the economy. SMEs may develop and run small-to-medium-size e-commerce websites as solutions to specific business opportunities. Certainly, the websites should be developed so as to support business success. In developing the websites, the key elements of the e-commerce business model that are necessary to ensure success should be resolved at the requirements stage of development. In this paper, we propose an enhancement of requirement analysis methods found in the literature such that it includes activities to resolve these key elements. The method has been applied in three case studies based on situations in Indonesia, and we conclude that it is suitable for adoption by SMEs.

  8. Ion beam analysis - development and application of nuclear reaction analysis methods, in particular at a nuclear microprobe

    Energy Technology Data Exchange (ETDEWEB)

    Sjoeland, K.A.


This thesis treats the development of Ion Beam Analysis methods, principally for the analysis of light elements at a nuclear microprobe. The light elements in this context are defined as having an atomic number less than approx. 13. The work reported is to a large extent based on multiparameter methods. Several signals are recorded simultaneously, and the data can be effectively analyzed to reveal structures that cannot be observed through one-parameter collection. The different techniques are combined in a new set-up at the Lund Nuclear Microprobe. The various detectors for reaction products are arranged in such a way that they can be used for the simultaneous analysis of hydrogen, lithium, boron and fluorine, together with traditional PIXE analysis and Scanning Transmission Ion Microscopy as well as photon-tagged Nuclear Reaction Analysis. 48 refs.

  9. Development of Hydrophilic Interaction Liquid Chromatography Method for the Analysis of Moxonidine and Its Impurities

    Directory of Open Access Journals (Sweden)

    Slavica Filipic


Full Text Available A fast and simple hydrophilic interaction liquid chromatography (HILIC) method was developed and validated for the analysis of moxonidine and its four impurities (A, B, C, and D) in a pharmaceutical dosage form. All experiments were performed on an Agilent Technologies 1200 high-performance liquid chromatography (HPLC) system using a Zorbax RX-SIL, 250 mm × 4.6 mm, 5 μm column as the stationary phase (T = 25°C, F = 1 mL/min, λ = 255 nm) and a mixture of acetonitrile and 40 mM ammonium formate buffer (pH 2.8), 80:20 (v/v), as the mobile phase. Under the optimal chromatographic conditions, selected by central composite design, separation and analysis of moxonidine and its four impurities are achieved within 12 minutes. Validation of the method was conducted in accordance with ICH guidelines. Based on the obtained results, the selectivity, linearity (r ≥ 0.9976), accuracy (recovery: 93.66%–114.08%), precision (RSD: 0.56%–2.55%), and robustness of the method were confirmed. The obtained limits of detection and quantification revealed that the method can be used to determine impurity levels below 0.1%. The validated method was applied to the determination of moxonidine and its impurities in a commercially available tablet formulation. The results confirmed that the validated method is fast, simple, and reliable for the analysis of moxonidine and its impurities in tablets.

  10. Integrated support for medical image analysis methods: from development to clinical application. (United States)

    Olabarriaga, Sílvia D; Snel, Jeroen G; Botha, Charl P; Belleman, Robert G


    Computer-aided image analysis is becoming increasingly important to efficiently and safely handle large amounts of high-resolution images generated by advanced medical imaging devices. The development of medical image analysis (MIA) software with the required properties for clinical application, however, is difficult and labor-intensive. Such development should be supported by systems providing scalable computational capacity and storage space, as well as information management facilities. This paper describes the properties of distributed systems to support and facilitate the development, evaluation, and clinical application of MIA methods. First, the main characteristics of existing systems are presented. Then, the phases in a method's lifecycle are analyzed (development, parameter optimization, evaluation, clinical routine), identifying the types of users, tasks, and related computational issues. A scenario is described where all tasks are performed with the aid of computational tools integrated into an ideal supporting environment. The requirements for this environment are described, proposing a grid-oriented paradigm that emphasizes virtual collaboration among users, pieces of software, and devices distributed among geographically dispersed healthcare, research, and development enterprises. Finally, the characteristics of the existing systems are analyzed according to these requirements. The proposed requirements offer a useful framework to evaluate, compare, and improve the existing systems that support MIA development.

  11. The development of controlled damage mechanisms-based design method for nonlinear static pushover analysis

    Directory of Open Access Journals (Sweden)

    Ćosić Mladen


    Full Text Available This paper presents an original method of controlled building damage mechanisms based on Nonlinear Static Pushover Analysis (NSPA-DMBD). The optimal building damage mechanism is determined based on the solution of the Capacity Design Method (CDM), and the response of the building is considered in incremental situations. The development of the damage mechanism of the system in these incremental situations is controlled at the strain level, by examining the relationship between current and limit strains in concrete and reinforcement steel. Since the analysis of the system damage mechanism according to the NSPA-DMBD method is implemented iteratively, with design checks after the strain reaches its limit, the term Iterative-Interactive Design (IID) has been introduced for this analysis. By selecting, monitoring and controlling the optimal damage mechanism of the system with the developed NSPA-DMBD method, the damage mechanism of the building is controlled and its resistance to early collapse is increased. [Project of the Ministry of Science of the Republic of Serbia, no. TR 36043]
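
    The strain-level control described above amounts to a demand/capacity check at each incremental (pushover) step. A minimal sketch, with illustrative limit strains (typical code-type ultimate strains, not values taken from the paper):

```python
# Illustrative limit strains for concrete crushing and steel rupture;
# the paper's actual limits are not given in the abstract.
LIMIT_STRAIN = {"concrete": 0.0035, "steel": 0.01}

def damage_state(strains):
    """Return the governing material, its demand/capacity strain ratio,
    and whether a limit strain has been reached in this step."""
    ratios = {m: strains[m] / LIMIT_STRAIN[m] for m in strains}
    governing = max(ratios, key=ratios.get)
    return governing, ratios[governing], ratios[governing] >= 1.0

# One incremental step: concrete at 0.0021, steel at 0.004.
material, ratio, failed = damage_state({"concrete": 0.0021, "steel": 0.004})
print(material, round(ratio, 2), failed)  # → concrete 0.6 False
```

    Repeating this check over the incremental steps is what lets the damage mechanism be monitored and controlled rather than only checked at the final state.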


    Directory of Open Access Journals (Sweden)

    О. Domkina


    Full Text Available In this article, we study the main approaches to the assessment of investment risk, aiming to find the most appropriate method for estimating the risks of investment in personnel development while considering the human factor. We analyze the pros and cons of the existing methods. As a result, we suggest using a combination of expert and ranking methods, as it provides wide opportunities for risk factor analysis under data scarcity, despite these methods' limitation of subjectivity in expert judgments, which can, however, be reduced by some of the advanced expert methods. Additionally, we consider the application of the analytical method, which provides factor analysis and a foundation for the further risk management of these factors. The use of the statistical group of methods, although promising, is not yet feasible in practice because of the paucity of the required data and the difficulty of obtaining it from companies, which have no incentive to provide such sensitive information. Logically, the next step of this research should be a practical application of the listed methods, a test of the presented hypotheses, and an evaluation of the obtained results, with an accent on the quality of risk indicators, data demands, and the utility and complexity of the methods' practical application.

  13. The development of a task analysis method applicable to the tasks of nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Wan Chul; Park, Ji Soo; Baek, Dong Hyeon; Ham, Dong Han; Kim, Huhn [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)


    While task analysis is one of the essential processes for human factors studies, traditional methods reveal weaknesses in dealing with cognitive aspects, which become more critical in modern complex systems. This report proposes a cognitive task analysis (CTA) method for identifying the cognitive requirements of operators' tasks in nuclear power plants. The proposed CTA method is characterized by an information-oriented concept and a procedure-based approach. The task prescription identifies the information requirements and traces the information flow to reveal the cognitive organization of the task procedure, with emphasis on the relations among the information requirements. The cognitive requirements are then analyzed in terms of the cognitive span of task information, the cognitive envelope and working memory relief points of the procedures, and working memory load. The proposed method is relatively simple and, possibly being incorporated in a full task analysis scheme, directly applicable to the design and evaluation of human-machine interfaces and operating procedures. A prototype of a computerized support system was developed to support the practicality of the proposed method. (Author) 104 refs., 8 tabs., 7 figs.

  14. Method development for trace analysis of heteroaromatic compounds in contaminated groundwater

    DEFF Research Database (Denmark)

    Johansen, Sys Stybe; Hansen, Asger B.; Mosbæk, Hans


    Water analysis, environmental analysis, extraction methods, aromatic compounds, heteroaromatic compounds, creosote, dichloromethane, diethyl ether, pentane.

  15. Development and Validation of an HPLC Method for the Analysis of Sirolimus in Drug Products

    Directory of Open Access Journals (Sweden)

    Hadi Valizadeh


    Full Text Available Purpose: The aim of this study was to develop a simple, rapid and sensitive reverse phase high performance liquid chromatography (RP-HPLC) method for quantification of sirolimus (SRL) in pharmaceutical dosage forms. Methods: The chromatographic system employs isocratic elution using a Knauer C18 column (5 μm, 4.6 × 150 mm). The mobile phase, consisting of acetonitrile and ammonium acetate buffer, was set at a flow rate of 1.5 mL/min. The analyte was detected and quantified at 278 nm using an ultraviolet detector. The method was validated as per ICH guidelines. Results: The standard curve was found to have a linear relationship (r² > 0.99) over the analytical range of 125–2000 ng/mL. For all quality control (QC) standards in the intraday and interday assays, accuracy and precision ranged from -0.96 to 6.30 and from 0.86 to 13.74, respectively, demonstrating precision and accuracy over the analytical range. Samples were stable during the preparation and analysis procedure. Conclusion: The developed rapid and sensitive method can therefore be used for routine analysis of sirolimus, such as dissolution and stability assays of pre- and post-marketing dosage forms.
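
    The accuracy and precision figures reported for the QC standards above are, in the usual convention, percent relative error and percent RSD. A minimal sketch with an invented QC level and invented replicate measurements (the abstract gives only the summary ranges):

```python
import numpy as np

# Hypothetical intraday replicates at one assumed QC level (ng/mL);
# not the study's data.
nominal = 500.0
measured = np.array([498.0, 512.0, 505.0, 495.0, 508.0])

accuracy = (measured.mean() - nominal) / nominal * 100.0   # % bias
rsd = measured.std(ddof=1) / measured.mean() * 100.0       # % RSD

print(round(accuracy, 2), round(rsd, 2))  # → 0.72 1.39
```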

  16. Recent developments of nanoparticle-based enrichment methods for mass spectrometric analysis in proteomics

    Institute of Scientific and Technical Information of China (English)


    In proteome research, rapid and effective separation strategies are essential for successful protein identification due to the broad dynamic range of proteins in biological samples. Some important proteins are expressed in ultra-low abundance, making a pre-concentration procedure before mass spectrometric analysis a prerequisite. The main purpose of enrichment is to isolate target molecules from complex mixtures to reduce sample complexity and facilitate the subsequent analysis steps. The introduction of nanoparticles into this field has accelerated the development of enrichment methods. In this review, we mainly focus on recent developments in using different nanomaterials for pre-concentration of low-abundance peptides/proteins, including those containing post-translational modifications such as phosphorylation and glycosylation, prior to mass spectrometric analysis.

  17. Advanced organic analysis and analytical methods development: FY 1995 progress report. Waste Tank Organic Safety Program

    Energy Technology Data Exchange (ETDEWEB)

    Wahl, K.L.; Campbell, J.A.; Clauss, S.A. [and others]


    This report describes the work performed during FY 1995 by Pacific Northwest Laboratory in developing and optimizing analysis techniques for identifying organics present in Hanford waste tanks. The main focus was to provide a means for rapidly obtaining the most useful information concerning the organics present in tank waste, with minimal sample handling and with minimal waste generation. One major focus has been to optimize analytical methods for organic speciation. Select methods, such as atmospheric pressure chemical ionization mass spectrometry and matrix-assisted laser desorption/ionization mass spectrometry, were developed to increase the speciation capabilities, while minimizing sample handling. A capillary electrophoresis method was developed to improve separation capabilities while minimizing additional waste generation. In addition, considerable emphasis has been placed on developing a rapid screening tool, based on Raman and infrared spectroscopy, for determining organic functional group content when complete organic speciation is not required. This capability would allow for a cost-effective means to screen the waste tanks to identify tanks that require more specialized and complete organic speciation to determine tank safety.

  18. Fast analysis of glibenclamide and its impurities: quality by design framework in capillary electrophoresis method development. (United States)

    Furlanetto, Sandra; Orlandini, Serena; Pasquini, Benedetta; Caprini, Claudia; Mura, Paola; Pinzauti, Sergio


    A fast capillary zone electrophoresis method for the simultaneous analysis of glibenclamide and its impurities (I(A) and I(B)) in pharmaceutical dosage forms was fully developed within a quality by design framework. Critical quality attributes were represented by I(A) peak efficiency, critical resolution between glibenclamide and I(B), and analysis time. Experimental design was efficiently used for rapid and systematic method optimization. A 3^5//16 symmetric screening matrix was chosen for investigation of the five selected critical process parameters throughout the knowledge space, and the results obtained were the basis for planning the subsequent response surface study. A Box-Behnken design for three factors allowed the contour plots to be drawn and the design space to be identified by introduction of the concept of probability. The design space corresponded to the multidimensional region where all the critical quality attributes reached the desired values with a degree of probability π ≥ 90%. Under the selected working conditions, full separation of the analytes was obtained in less than 2 min. A full factorial design simultaneously allowed the design space to be validated and method robustness to be tested. A control strategy was finally implemented by means of a system suitability test. The method was fully validated and was applied to real samples of glibenclamide tablets.
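
    A Box-Behnken design for three factors, as used above, consists of the edge midpoints of the coded factor cube plus replicate center points. A minimal generic construction (not tied to the paper's particular factors or levels):

```python
from itertools import combinations, product

def box_behnken(n_factors, n_center=3):
    """Coded (-1, 0, +1) Box-Behnken design: for every pair of factors,
    run all ±1 combinations with the remaining factors at 0, then add
    center points."""
    runs = []
    for i, j in combinations(range(n_factors), 2):
        for a, b in product((-1, 1), repeat=2):
            run = [0] * n_factors
            run[i], run[j] = a, b
            runs.append(run)
    runs += [[0] * n_factors] * n_center
    return runs

design = box_behnken(3)
print(len(design))  # 12 edge midpoints + 3 center points → 15
```

    Fitting a quadratic response surface to the responses at these 15 runs is what yields the contour plots and the probability-based design space described above.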

  19. Developing a Self-Report-Based Sequential Analysis Method for Educational Technology Systems: A Process-Based Usability Evaluation (United States)

    Lin, Yi-Chun; Hsieh, Ya-Hui; Hou, Huei-Tse


    The development of a usability evaluation method for educational systems or applications, called the self-report-based sequential analysis, is described herein. The method aims to extend the current practice by proposing self-report-based sequential analysis as a new usability method, which integrates the advantages of self-report in survey…

  20. Development and validation of a reversed phase liquid chromatographic method for analysis of griseofulvin and impurities. (United States)

    Kahsay, Getu; Adegoke, Aremu Olajire; Van Schepdael, Ann; Adams, Erwin


    A simple and robust reversed phase liquid chromatographic method was developed and validated for the quantitative determination of griseofulvin (GF) and its impurities in drug substances and drug products (tablets). Chromatographic separation was achieved on a Discovery C18 (250 mm × 4.6 mm, 5 μm) column kept at 30°C. The mobile phase consisted of a gradient mixture of mobile phase A (water-0.1% formic acid pH 4.5, 80:20, v/v) and B (ACN-water-0.1% formic acid pH 4.5, 65:15:20, v/v/v) pumped at a flow rate of 1.0 mL/min. UV detection was performed at 290 nm. The method was validated for its robustness, sensitivity, precision, accuracy and linearity based on ICH guidelines. The robustness study was performed by means of an experimental design and multivariate analysis. Satisfactory results were obtained from the validation studies. The use of volatile mobile phases allowed the identification of three main impurities present above the identification threshold using mass spectrometry (MS). The developed LC method has been applied for the assay and impurity determination of GF drug substances and tablets. The method could be very useful for the quality control of GF and its impurities in bulk and formulated dosage forms.

  1. Development of an improved HRA method: A technique for human error analysis (ATHEANA)

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, J.H.; Luckas, W.J. [Brookhaven National Lab., Upton, NY (United States)]; Wreathall, J. [John Wreathall & Co., Dublin, OH (United States)] [and others]


    Probabilistic risk assessment (PRA) has become an increasingly important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. The NRC recently published a final policy statement, SECY-95-126, encouraging the use of PRA in regulatory activities. Human reliability analysis (HRA) is a critical element of PRA, but its limitations in the analysis of human actions have long been recognized as a constraint when using PRA, and better integration of HRA into the PRA process has long been an NRC issue. Of particular concern has been the omission of errors of commission, those errors associated with inappropriate interventions by operators in operating systems. To address these concerns, the NRC identified the need to develop an improved HRA method, so that human reliability can be better represented and integrated into PRA modeling and quantification.

  2. Methods and considerations for longitudinal structural brain imaging analysis across development

    Directory of Open Access Journals (Sweden)

    Kathryn L. Mills


    Full Text Available Magnetic resonance imaging (MRI) has allowed the unprecedented capability to measure the human brain in vivo. This technique has paved the way for longitudinal studies exploring brain changes across the entire life span. Results from these studies have given us a glimpse into the remarkably extended and multifaceted development of our brain, converging with evidence from anatomical and histological studies. Ever-evolving techniques and analytical methods provide new avenues to explore and questions to consider, requiring researchers to balance excitement with caution. This review addresses what MRI studies of structural brain development in children and adolescents typically measure, and how. We focus on measurements of brain morphometry (e.g., volume, cortical thickness, surface area, folding patterns), as well as measurements derived from diffusion tensor imaging (DTI). By integrating findings from multiple longitudinal investigations, we give an update on current knowledge of structural brain development and how it relates to other aspects of biological development and possible underlying physiological mechanisms. Further, we review and discuss current strategies in image processing, analysis techniques and modeling of brain development. We hope this review will aid current and future longitudinal investigations of brain development, as well as evoke a discussion amongst researchers regarding best practices.


    Directory of Open Access Journals (Sweden)

    B. Anupama


    Full Text Available A rapid, simple and validated reversed-phase high-performance liquid chromatographic method has been developed for analysis of Racecadotril in tablet dosage form. Racecadotril was separated on a Phenomenex C18 column (250 mm length, 4.6 mm internal diameter and 5 µm particle size) with a 60:40 (v/v) mixture of acetonitrile and phosphate buffer as mobile phase at a flow rate of 1.0 mL min-1. The effluent was monitored by UV detection at 228 nm. Calibration plots were linear in the range of 10 to 50 μg mL-1, and the LOD and LOQ were 0.635 and 1.94 μg mL-1, respectively. The high recovery and low relative standard deviation values confirm the suitability of the method for routine quality control determination of Racecadotril in tablets.

  4. Development of a Bayesian method for the analysis of inertial confinement fusion experiments on the NIF

    CERN Document Server

    Gaffney, Jim A; Sonnad, Vijay; Libby, Stephen B


    The complex nature of inertial confinement fusion (ICF) experiments results in a very large number of experimental parameters that are only known with limited reliability. These parameters, combined with the myriad physical models that govern target evolution, make the reliable extraction of physics from experimental campaigns very difficult. We develop an inference method that allows all important experimental parameters, and previous knowledge, to be taken into account when investigating underlying microphysics models. The result is framed as a modified $\\chi^{2}$ analysis which is easy to implement in existing analyses, and quite portable. We present a first application to a recent convergent ablator experiment performed at the NIF, and investigate the effect of variations in all physical dimensions of the target (very difficult to do using other methods). We show that for well characterised targets in which dimensions vary at the 0.5% level there is little effect, but 3% variations change the results of i...
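
    The modified χ² described above can be sketched as the usual data misfit augmented with Gaussian prior terms that penalise departures of the uncertain experimental parameters (e.g. target dimensions) from their characterised values. Everything below is illustrative: a toy forward model and invented numbers, not the NIF analysis itself:

```python
import numpy as np

def modified_chi2(data, sigma_data, model, theta, theta_prior, sigma_theta):
    """Data-model misfit plus prior penalties on nuisance parameters."""
    misfit = np.sum(((data - model(theta)) / sigma_data) ** 2)
    prior = np.sum(((theta - theta_prior) / sigma_theta) ** 2)
    return misfit + prior

# Toy forward model: the observable scales linearly with a single
# dimension scale factor theta[0].
model = lambda theta: 10.0 * theta[0] * np.ones(3)
data = np.array([10.1, 9.9, 10.2])

chi2 = modified_chi2(data, 0.2, model, np.array([1.0]),
                     np.array([1.0]), np.array([0.005]))  # 0.5% characterisation
print(round(chi2, 2))  # → 1.5
```

    Minimising this quantity over both physics parameters and nuisance parameters is what lets prior knowledge of the target be taken into account within an ordinary χ² framework.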

  5. Priority-sequence of mineral resources’ development and utilization based on grey relational analysis method

    Institute of Scientific and Technical Information of China (English)

    Wang Ying; Zhang Chang; Jiang Gaopeng


    Generally, the sequencing of the development and utilization of Chinese mineral resources is based on national and provincial overall plans for mineral resources. Such plans usually cannot reflect the relative suitability of developing and utilizing different mineral resources. To solve this problem, the paper selects endowment conditions, market conditions, technological conditions, socio-economic conditions and environmental conditions as starting points to analyze the factors that influence the priority sequence of mineral resources' development and utilization. These 5 conditions are further specified into 9 evaluation indicators to establish an evaluation indicator system. We then propose a decision model of the priority sequence based on the grey relational analysis method and rank the observation objects by a suitability-of-development index. Finally, the mineral resources of a certain province in China were analyzed as an example. The calculation results indicate that silver (2.0057), coal (1.9955), zinc (1.9442), cement limestone (1.9077), solvent limestone (1.5624) and other minerals in the province are suitable for development and utilization.
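
    The grey relational decision model described above can be sketched in a few steps: normalize the indicator values, measure each alternative's closeness to an ideal reference sequence via grey relational coefficients, and rank by the resulting relational grade. The data matrix and the equal indicator weighting below are invented for illustration, not the paper's data:

```python
import numpy as np

# Rows = mineral resources, columns = benefit-type evaluation
# indicators (larger is better). Illustrative values only.
X = np.array([[0.8, 0.6, 0.9],
              [0.5, 0.9, 0.7],
              [0.9, 0.4, 0.6]])

# 1. Normalise each indicator to [0, 1].
Xn = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
# 2. Reference sequence = ideal alternative (all ones after normalisation).
delta = np.abs(1.0 - Xn)
# 3. Grey relational coefficients, distinguishing coefficient rho = 0.5.
rho = 0.5
coeff = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
# 4. Grey relational grade = mean coefficient per alternative; rank them.
grade = coeff.mean(axis=1)
order = np.argsort(-grade)  # development priority sequence
print(order)
```

    With indicator weights instead of a plain mean, the grade plays the role of the suitability index used to order the minerals above.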


    Directory of Open Access Journals (Sweden)

    N. V. Zhelninskaya


    Full Text Available Statistical methods play an important role in the objective evaluation of quantitative and qualitative characteristics of a process and are among the most important elements of a production quality assurance system and of total quality management. To produce a quality product, one must know the real accuracy of the existing equipment, determine whether the accuracy of the selected technological process complies with the specified accuracy of the products, and assess process stability. Most random events in practice, particularly in manufacturing and scientific research, are characterized by a large number of random factors and are described by the normal distribution, which is central to many practical studies. Modern statistical methods are quite difficult to grasp and to apply widely in practice without in-depth mathematical training of all participants in the process. When the distribution of a random variable is known, all the characteristics of a batch of products can be obtained, including the mean value and the variance. Statistical control and quality control methods were used in the analysis of the accuracy and stability of the technological process for production of epoxy resin ED-20. Numerical characteristics of the distribution law of the controlled parameters were estimated, and the percentage of defects in the investigated products was determined. For assessing the stability of the manufacturing process of epoxy resin ED-20, Shewhart control charts using quantitative data were selected: charts of individual values X and moving range R. Pareto charts were used to identify the causes that affect low dynamic viscosity to the largest extent. The causes of low dynamic viscosity values were analyzed using Ishikawa diagrams, which show the most typical factors behind the variability of the process results.
    To resolve the problem, it is recommended to modify the polymer composition with carbon fullerenes and to use the developed method for the production of
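
    The individuals/moving-range (X-mR) Shewhart chart selected above reduces to simple control-limit formulas using the mean moving range. The viscosity-like data below are invented for illustration:

```python
import numpy as np

# Illustrative sequence of individual measurements (e.g. dynamic
# viscosity of successive batches); not the ED-20 data.
x = np.array([12.1, 11.8, 12.4, 12.0, 11.9, 12.3, 12.2, 11.7])

mr = np.abs(np.diff(x))              # moving ranges of successive points
mr_bar = mr.mean()
ucl_x = x.mean() + 2.66 * mr_bar     # 2.66 = 3 / d2 for subgroup size 2
lcl_x = x.mean() - 2.66 * mr_bar
ucl_mr = 3.267 * mr_bar              # D4 constant for subgroup size 2

out_of_control = (x > ucl_x) | (x < lcl_x)
print(out_of_control.any())  # → False (this toy process is stable)
```

    Points outside the X limits, or moving ranges above the mR limit, signal special-cause variation of the kind the Pareto and Ishikawa analyses above then try to explain.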

  7. Development and Analysis of Train Brake Curve Calculation Methods with Complex Simulation

    Directory of Open Access Journals (Sweden)

    Bela Vincze


    Full Text Available This paper describes an efficient method using simulation for developing and analyzing train brake curve calculation methods for the on-board computer of the ETCS system. An application example with actual measurements is also presented.


    Directory of Open Access Journals (Sweden)

    Pavel Holba


    Full Text Available A short history of the development of thermometric methods is reviewed, accentuating the role of Rudolf Bárta in underpinning special thermoanalytical conferences and the new journal Silikáty in the fifties, as well as that of Vladimír Šatava in the creation of the Czech school of thermoanalytical kinetics. This review surveys the innovative papers dealing with thermal analysis and related fields (e.g. calorimetry, kinetics) which were published by noteworthy postwar Czechoslovak scholars and scientists and by their disciples in 1950-1980. The 227 itemized references with titles show a rich scientific productivity, revealing that many of these works were ahead of their time, even in an international context.

  9. Development and optimization of a metabolomic method for analysis of adherent cell cultures. (United States)

    Danielsson, Anders P H; Moritz, Thomas; Mulder, Hindrik; Spégel, Peter


    In this investigation, a gas chromatography/mass spectrometry (GC/MS)-based metabolomic protocol for adherent cell cultures was developed using statistical design of experiments. Cell disruption, metabolite extraction, and the GC/MS settings were optimized aiming at a gentle, unbiased, sensitive, and high-throughput metabolomic protocol. Due to the heterogeneity of the metabolome and the inherent selectivity of all analytical techniques, development of unbiased protocols is highly complex. Changing one parameter of the protocol may change the response of many groups of metabolites. In this investigation, statistical design of experiments and multivariate analysis also allowed such interaction effects to be taken into account. The protocol was validated with respect to linear range, precision, and limit of detection in a clonal rat insulinoma cell line (INS-1 832/13). The protocol allowed high-throughput profiling of metabolites covering the major metabolic pathways. The majority of metabolites displayed a linear range from a single well in a 96-well plate up to a 10 cm culture dish. The method allowed a total of 47 analyses to be performed in 24 h.
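
    Statistical design of experiments as applied above can be sketched with a two-level full factorial and a main-effect estimate per factor. The factor names and responses below are hypothetical stand-ins for protocol parameters, not the study's actual design:

```python
from itertools import product

# Hypothetical protocol parameters at coded low/high levels.
factors = ["solvent_ratio", "disruption_time", "inj_temp"]
design = list(product((-1, 1), repeat=3))       # 8 runs, coded levels
response = [52, 55, 61, 63, 50, 54, 60, 65]     # e.g. metabolites detected

def main_effect(k):
    """Mean response at the high level minus mean at the low level."""
    hi = [y for run, y in zip(design, response) if run[k] == 1]
    lo = [y for run, y in zip(design, response) if run[k] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

effects = {f: main_effect(k) for k, f in enumerate(factors)}
print(effects)
```

    Interaction effects, which the study emphasises, are estimated the same way using products of coded columns instead of a single column.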

  10. Development of thermal analysis method for the near field of HLW repository using ABAQUS

    Energy Technology Data Exchange (ETDEWEB)

    Kuh, Jung Eui; Kang, Chul Hyung; Park, Jeong Hwa [Korea Atomic Energy Research Institute, Taejon (Korea)


    An appropriate tool is needed to evaluate the thermo-mechanical stability of a high level radioactive waste (HLW) repository. In this report, a thermal analysis methodology for the near field of an HLW repository is developed using ABAQUS, a multi-purpose FEM code that has been used in many engineering areas. The main contents of this methodology development are structural and material modelling to simulate a repository; setup of side conditions, e.g., boundary conditions, load conditions, and initial conditions; and the procedure for selecting proper material parameters. In addition, interface programs were developed for effective production of input data and for effective change of model size in sensitivity analyses for disposal concept development. The results of this work will be applied to evaluate the thermal stability of the HLW repository and used as main input data for its mechanical analysis. (author). 20 refs., 15 figs., 5 tabs.

  11. Analysis of Perfluorinated Chemicals and Their Fluorinated Precursors in Sludge: Method Development and Initial Results (United States)

    A rigorous method was developed to maximize the extraction efficacy for perfluorocarboxylic acids (PFCAs), perfluorosulfonates (PFSAs), fluorotelomer alcohols (FTOHs), fluorotelomer acrylates (FTAc), perfluorosulfonamides (FOSAs), and perfluorosulfonamidoethanols (FOSEs) from was...

  12. Development and Validation of a Chromatographic Method for the Analysis of Multicompound Pharmaceutical Preparations


    Ferreyra, Carola; Ortiz, Cristina; Bertorello, M. M. De


    A reverse phase high performance liquid chromatographic assay was carried out for the simultaneous determination of two out of three active principles present in a pharmaceutical preparation. This method was developed to assess the quality of the product.

  13. Development of a segmentation method for analysis of Campos basin typical reservoir rocks

    Energy Technology Data Exchange (ETDEWEB)

    Rego, Eneida Arendt; Bueno, Andre Duarte [Universidade Estadual do Norte Fluminense Darcy Ribeiro (UENF), Macae, RJ (Brazil). Lab. de Engenharia e Exploracao de Petroleo (LENEP)]


    This paper represents a master's thesis proposal in Exploration and Reservoir Engineering whose objective is to develop a specific segmentation method for digital images of reservoir rocks that produces better results than the global methods available in the literature for determining physical properties of rocks, such as porosity and permeability. (author)

  14. Pathways to Lean Software Development: An Analysis of Effective Methods of Change (United States)

    Hanson, Richard D.


    This qualitative Delphi study explored the challenges that exist in delivering software on time, within budget, and with the original scope identified. The literature review identified many attempts over the past several decades to reform the methods used to develop software. These attempts found that the classical waterfall method, which is…

  15. Developments of the neutron scattering analysis method for the determination of magnetic structures

    Energy Technology Data Exchange (ETDEWEB)

    Park, Je-Geun; Chung, Jae Gwan; Park, Jung Hwan; Kong, Ung Girl [Inha Univ., Incheon (Korea); So, Ji Yong [Seoul National University, Seoul(Korea)


    Neutron diffraction is up to now almost the only, and a very important, experimental method for determining the magnetic structure of materials. Unlike studies of crystallographic structure, however, the use of neutron diffraction for magnetic structure determination is not easily accessible to non-experts because of the complexity of magnetic group theory, which is central to magnetic structure analysis. With the recent development of computer codes for magnetic groups, it is now time to revisit these difficulties. In this work, we have used the computer code for magnetic groups (Mody-2) and the Fullprof refinement program to study the magnetic structure of YMnO{sub 3} and other interesting materials. YMnO{sub 3} forms in a hexagonal structure and shows both ferroelectric and antiferromagnetic phase transitions. Since it was recently found that YMnO{sub 3} can be used as a nonvolatile memory device, there has been much applied research on this material. We used neutron diffraction to determine the magnetic structure and, in particular, to investigate the correlation between the order parameters of the ferroelectric and antiferromagnetic phase transitions. From this study, we have demonstrated that with proper use of the computer code for magnetic groups one can overcome most of the difficulties arising from magnetic group theory. 4 refs., 8 figs., 5 tabs. (Author)

  16. Development of Evaluation Methods for Lower Limb Function between Aged and Young Using Principal Component Analysis (United States)

    Nomoto, Yohei; Yamashita, Kazuhiko; Ohya, Tetsuya; Koyama, Hironori; Kawasumi, Masashi

    There is increasing concern in society with preventing falls among the aged. Improvements in aged people's lower-limb muscular strength, postural control and walking ability are important for quality of life and fall prevention. The aim of this study was to develop multiple evaluation methods in order to advise on the improvement and maintenance of lower limb function in aged and young people. The subjects were 16 healthy young volunteers (mean ± S.D.: 19.9 ± 0.6 years) and 10 healthy aged volunteers (mean ± S.D.: 80.6 ± 6.1 years). Measurement items related to lower limb function were selected from items we have used previously: distance of extroversion of the toe, angle of flexion of the toe, maximum step width, knee elevation, moving distance of the greater trochanter, walking balance, toe-gap force and rotation range of the ankle joint. The measurement items were summarized by principal component analysis into lower limb evaluation methods covering walking ability, muscle strength of the lower limb and flexibility of the ankle. The young group's assessment score for walking ability was a factor of 1.6 greater than that of the aged group; for muscle strength of the lower limb, a factor of 1.4 greater; and for flexibility of the ankle, a factor of 1.2 greater. The results suggest that it is possible to assess the lower limb function of aged and young people numerically and to advise on their foot function.
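
    The principal component analysis step described above, summarizing eight measurement items into a few ability scores, can be sketched with an SVD-based PCA. The subject data below are random stand-ins, not the study's measurements:

```python
import numpy as np

# 26 subjects x 8 lower-limb measurement items (random stand-in data).
rng = np.random.default_rng(0)
X = rng.normal(size=(26, 8))

Xc = (X - X.mean(axis=0)) / X.std(axis=0)    # standardise each item
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T                           # principal component scores
explained = s**2 / np.sum(s**2)              # variance explained per component

print(scores.shape, round(float(explained[0]), 3))
```

    The leading components play the role of the composite "walking ability", "muscle strength" and "flexibility" scores on which the two groups are compared above.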


    Energy Technology Data Exchange (ETDEWEB)

    Jurgensen, A; David Missimer, D; Ronny Rutherford, R


    The x-ray fluorescence laboratory (XRF) in the Analytical Development Directorate (ADD) of the Savannah River National Laboratory (SRNL) was requested to develop an x-ray fluorescence spectrometry method for elemental characterization of the Hanford Tank Waste Treatment and Immobilization Plant (WTP) pretreated low activity waste (LAW) stream to the LAW Vitrification Plant. The WTP is evaluating the potential for using XRF as a rapid turnaround technique to support LAW product compliance and glass former batching. The overall objective of this task was to develop an XRF analytical method that provides rapid turnaround time (<8 hours), while providing sufficient accuracy and precision to determine variations in waste.

  18. Development and assessment of analysis methods for MATMOS trace gas retrievals (United States)

    Olsen, K. S.; Toon, G. C.; Boone, C.; Strong, K.


    The Mars Atmospheric Trace Molecule Occultation Spectrometer (MATMOS) mission will deploy a high-resolution infrared Fourier transform spectrometer (FTS) to Mars on-board the ExoMars Trace Gas Orbiter in 2016. MATMOS is a joint investigation between the California Institute of Technology, NASA's Jet Propulsion Laboratory, and the Canadian Space Agency (CSA). The instrument will be similar to, and derives a strong heritage from, the CSA's Atmospheric Chemistry Experiment Fourier Transform Spectrometer (ACE-FTS) on-board SCISAT. Both MATMOS and ACE-FTS measure the absorption spectra of the atmosphere during solar occultation, making up to 30 measurements per day, one at each satellite sunrise and sunset. ACE-FTS analyzes CO2 to determine temperature and pressure as functions of altitude, then simultaneously retrieves vertical profiles of volume mixing ratio (VMR) for more than two dozen gases using least squares minimization. MATMOS will make use of the GGG analysis suite developed for the MkIV balloon flights, the Atmospheric Trace Molecule Spectroscopy experiment and the ground-based Total Carbon Column Observatory Network. Both retrieval methods divide the spectra into smaller spectral windows and use a Voigt instrument line shape. Their forward models are similar and divide the atmosphere into layers that are assumed to have constant temperature, pressure, and VMR for each molecule. GGG uses the inverse method and its own nonlinear least-squares fitting program to derive partial columns along the optical slant path and to retrieve VMR profiles. MATMOS will inventory the composition of the Martian atmosphere with a sensitivity 2-3 orders of magnitude better than any previous instrument. In order to prepare the GGG analysis suite for the upcoming MATMOS mission, 25 ACE occultations have been analyzed using GGG and compared to the ACE-FTS v3.0 retrievals. This work examines the differences between the two algorithms, identifies those that produce inconsistencies

  19. Pathways to lean software development: An analysis of effective methods of change (United States)

    Hanson, Richard D.

    This qualitative Delphi study explored the challenges that exist in delivering software on time, within budget, and with the original scope identified. The literature review identified many attempts over the past several decades to reform the methods used to develop software. These efforts found that the classical waterfall method, which is firmly entrenched in American business today, was to blame for this difficulty (Chatterjee, 2010). Each of the proponents of new methods sought to remove waste, lighten the process, and implement lean principles in software development. Through this study, the experts evaluated the barriers to effective development principles and defined the leadership qualities necessary to overcome these barriers. The barriers identified were resistance to change, risk and reward issues, and management buy-in. Thirty experts in software development from several Fortune 500 companies across the United States explored each of these issues in detail. The conclusion reached by these experts was that visionary leadership is necessary to overcome these challenges.

  20. Development and validation of HPLC method for analysis of dexamethasone acetate in microemulsions

    Directory of Open Access Journals (Sweden)

    Maria Cristina Cocenza Urban


    Full Text Available A simple, rapid, accurate and sensitive method was developed for the quantitative analysis of dexamethasone acetate in microemulsions using high-performance liquid chromatography (HPLC) with UV detection. The chromatography parameters were: a stainless-steel Lichrospher 100 RP-18 column (250 mm x 4 mm i.d., 5 μm particle size) at 30 ± 2 ºC. The isocratic mobile phase was methanol:water (65:35, v/v) at a flow rate of 1.0 mL.min-1. The determinations were performed using a UV-Vis detector set at 239 nm. Samples were prepared with methanol and the injected volume was 20 μL. The analytical curve was linear (r² = 0.9995) over a wide concentration range (2.0-30.0 μg.mL-1). The presence of the microemulsion components did not interfere with the results of the analysis. The method showed adequate precision, with a relative standard deviation (RSD) smaller than 3%. The accuracy was analyzed by standard addition of the drug, and good recovery values were obtained for all drug concentrations used. The HPLC method developed in this study showed specificity and selectivity, with linearity in the working range and good precision and accuracy, making it very suitable for the quantification of dexamethasone in microemulsions. The analytical procedure is reliable and offers advantages in terms of speed and low reagent cost.
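The linearity figure behind such a validation is a straight-line regression of detector response on concentration. The sketch below uses simulated peak areas (not the paper's data) over the reported 2.0-30.0 μg/mL working range:

```python
import numpy as np

conc = np.array([2.0, 5.0, 10.0, 15.0, 20.0, 30.0])   # ug/mL calibration levels
area = 1250.0 * conc + 80.0                           # idealized detector response (invented)

# Fit the calibration line and compute the coefficient of determination
slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept
ss_res = np.sum((area - pred) ** 2)
ss_tot = np.sum((area - area.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
print(round(slope, 1), round(r2, 4))
```

An unknown sample's concentration is then read back as `(area - intercept) / slope`.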

  1. The Social Memoir: An Analysis of Developing Reflective Ability in a Pre-Service Methods Course (United States)

    Braun, Joseph A., Jr.; Crumpler, Thomas P.


    This descriptive study uses narrative analysis to examine the nature and quality of pre-service teachers' initial attempts at reflection via the genre of memoir writing in a social studies methods course. The paper begins by reviewing other uses of narrative reflection and autobiographical writing in teacher education. This is followed by an…

  2. Recursive Frame Analysis: Reflections on the Development of a Qualitative Research Method (United States)

    Keeney, Hillary; Keeney, Bradford


    The origin of recursive frame analysis (RFA) is revisited and discussed as a postmodern alternative to modernist therapeutic models and research methods that foster hegemony of a preferred therapeutic metaphor, narrative, or strategy. It encourages improvisational performance while enabling a means of scoring the change and movement of the…

  3. Development of LC-MS/MS method for analysis of polyphenolic compounds in juice, tea and coffee samples (United States)

    A simple and fast method for the analysis of a wide range of polyphenolic compounds in juice, tea, and coffee samples was developed using liquid chromatography-tandem mass spectrometry (LC-MS/MS). The method was based on a simple sample preparation “dilute and shoot” approach, and LC-MS/MS triple qu...

  4. Level set method for computational multi-fluid dynamics: A review on developments, applications and analysis

    Indian Academy of Sciences (India)

    Atul Sharma


    The functions and conservation equations, as well as the subsidiary equations, of the Level Set Method (LSM) are presented. After the mathematical formulation, improvements in the numerical methodology for LSM are reviewed for advection schemes, reinitialization methods, hybrid methods, adaptive-grid LSM, dual-resolution LSM, sharp-interface LSM, conservative LSM, parallel computing, and extension from two to multiple fluids/phases as well as to various types of two-phase flow. In the second part of this article, LSM-based Computational Multi-Fluid Dynamics (CMFD) applications and analysis are reviewed for four different types of multi-phase flow: separated and parallel internal flow, drop/bubble dynamics during jet break-up, drop impact dynamics on a solid or liquid surface, and boiling. In the last twenty years, LSM has established itself as a method that is easy to program and is accurate as well as computationally efficient.
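As a minimal illustration of the advection step that the reviewed LSM improvements build on, the sketch below advects a 1D signed-distance function with a first-order upwind scheme and tracks the interface as the zero crossing; the grid, speed, and time step are arbitrary choices for the example, not from the review.

```python
import numpy as np

nx, dx, u, dt = 400, 0.01, 1.0, 0.005        # CFL number = u*dt/dx = 0.5
x = np.arange(nx) * dx
phi = x - 1.0                                # signed distance; interface at x = 1.0

# Solve phi_t + u*phi_x = 0 with first-order upwinding (u > 0)
for _ in range(200):                         # advance to t = 1.0
    phi[1:] = phi[1:] - u * dt / dx * (phi[1:] - phi[:-1])
    phi[0] = phi[1] - dx                     # simple inflow extrapolation

# The interface (zero level set) should now sit near x = 1.0 + u*t = 2.0
i = np.argmin(np.abs(phi))
print(x[i])
```

Real LSM solvers add the reinitialization step the review discusses, because after advection by a non-uniform velocity field phi drifts away from a signed-distance function.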

  5. A Product Analysis Method and Its Staging to Develop Redesign Competences (United States)

    Hansen, Claus Thorp; Lenau, Torben Anker


    Most product development work in industrial practice is incremental, i.e., the company has had a product in production and on the market for some time, and now time has come to design an upgraded variant. This type of redesign project requires that the engineering designers have competences to carry through an analysis of the existing product…

  6. Development of an unbiased statistical method for the analysis of unigenic evolution

    Directory of Open Access Journals (Sweden)

    Shilton Brian H


    Full Text Available Abstract Background Unigenic evolution is a powerful genetic strategy involving random mutagenesis of a single gene product to delineate functionally important domains of a protein. This method involves selection of variants of the protein which retain function, followed by statistical analysis comparing expected and observed mutation frequencies of each residue. Resultant mutability indices for each residue are averaged across a specified window of codons to identify hypomutable regions of the protein. As originally described, the effect of changes to the length of this averaging window was not fully elucidated. In addition, it was unclear when sufficient functional variants had been examined to conclude that residues conserved in all variants have important functional roles. Results We demonstrate that the length of averaging window dramatically affects identification of individual hypomutable regions and delineation of region boundaries. Accordingly, we devised a region-independent chi-square analysis that eliminates loss of information incurred during window averaging and removes the arbitrary assignment of window length. We also present a method to estimate the probability that conserved residues have not been mutated simply by chance. In addition, we describe an improved estimation of the expected mutation frequency. Conclusion Overall, these methods significantly extend the analysis of unigenic evolution data over existing methods to allow comprehensive, unbiased identification of domains and possibly even individual residues that are essential for protein function.
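The core of such a region-independent chi-square analysis can be sketched with invented counts: compare the observed mutation count at each residue with the count expected under a uniform mutability model, with no window averaging at all.

```python
import numpy as np

# Invented mutation counts per residue from a set of functional variants
observed = np.array([12, 3, 15, 1, 14, 2, 13, 4])
# Uniform-mutability expectation: total mutations spread evenly over residues
expected = np.full(observed.shape, observed.sum() / observed.size)

chi2 = np.sum((observed - expected) ** 2 / expected)
dof = observed.size - 1
# For 7 degrees of freedom the 5% critical value is about 14.1, so a
# statistic this large indicates non-uniform mutability across residues.
print(chi2, dof)
```

In practice the expected counts would come from the improved expected-frequency estimate the paper describes, not from a flat distribution.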

  7. An Observational Analysis of Coaching Behaviors for Career Development Event Teams: A Mixed Methods Study (United States)

    Ball, Anna L.; Bowling, Amanda M.; Sharpless, Justin D.


    School Based Agricultural Education (SBAE) teachers can use coaching behaviors, along with their agricultural content knowledge to help their Career Development Event (CDE) teams succeed. This mixed methods, collective case study observed three SBAE teachers preparing multiple CDEs throughout the CDE season. The teachers observed had a previous…

  8. Sequential Pattern Analysis: Method and Application in Exploring How Students Develop Concept Maps (United States)

    Chiu, Chiung-Hui; Lin, Chien-Liang


    Concept mapping is a technique that represents knowledge in graphs. It has been widely adopted in science education and cognitive psychology to aid learning and assessment. To realize the sequential manner in which students develop concept maps, most research relies upon human-dependent, qualitative approaches. This article proposes a method for…

  9. Development and Validation of a Triple Quad LC/MS Method for Fiber Dye Analysis (United States)

    Connolly-Ingram, Ceirin M.

    This study aims to determine whether the analysis of dyed fiber through liquid chromatography (HPLC) with triple-quadrupole mass spectrometry (MS) can be used as a reliable alternative to the current chemical techniques used to differentiate dyes. Other methods of analysis involving HPLC and MS have proven capable of distinguishing chemically different dyes within a few dye classifications, but none have proven capable of providing a complete alternative to the currently accepted technique of thin-layer chromatography (TLC). In theory, HPLC-triple quad MS is capable of providing more reproducible and reliable data than conventional TLC methods, with a much greater depth of measurable information with which to characterize dye components. In this study, dyes will be extracted from various types of fibers, including commonly worn types such as cotton, polyester, nylon, and wool, and dyes from most of the eight different dye classes will be examined.

  10. Analysis of numerical methods

    CERN Document Server

    Isaacson, Eugene


    This excellent text for advanced undergraduates and graduate students covers norms, numerical solution of linear systems and matrix factoring, iterative solutions of nonlinear equations, eigenvalues and eigenvectors, polynomial approximation, and other topics. It offers a careful analysis and stresses techniques for developing new methods, plus many examples and problems. 1966 edition.

  11. Cooperative method development

    DEFF Research Database (Denmark)

    Dittrich, Yvonne; Rönkkö, Kari; Eriksson, Jeanette;


    The development of methods, tools and process improvements is best based on an understanding of the development practice to be supported. Qualitative research has been proposed as a method for understanding the social and cooperative aspects of software development. However, qualitative research is not easily combined with the improvement orientation of an engineering discipline. During the last 6 years, we have applied an approach we call `cooperative method development', which combines qualitative social science fieldwork with problem-oriented method, technique and process improvement. The action research based approach, focusing on shop floor software development practices, allows an understanding of how contextual contingencies influence the deployment and applicability of methods, processes and techniques. This article summarizes the experiences and discusses the further development…

  12. Value analysis: a method for teaching nursing ethics and promoting the moral development of students. (United States)

    Frisch, N C


    In an investigation of junior-level baccalaureate nursing students, the value analysis teaching strategy was used to teach content related to nursing ethics. It was hypothesized that such a strategy, which emphasized the need for careful evaluation and weighing of facts preparatory to drawing conclusions, would impact on the student's level of cognitive moral development. Cognitive moral development was defined in accord with Kohlberg's theory of moral development. Control and experimental populations were derived from two groups of students sequentially enrolled in the same course. Pre- and post-testing using scores on Rest's Defining Issues Test (DIT) showed significant differences in gain between control and experimental subjects. There was a strong association between DIT score gains and self-report of peer discussion of ethical issues. The major implication of this study is that instructional intervention produces measurable change in some students' level of moral judgment.

  13. Analysis of heavy oils: Method development and application to Cerro Negro heavy petroleum

    Energy Technology Data Exchange (ETDEWEB)

    Carbognani, L.; Hazos, M.; Sanchez, V. (INTEVEP, Filial de Petroleos de Venezuela, SA, Caracas (Venezuela)); Green, J.A.; Green, J.B.; Grigsby, R.D.; Pearson, C.D.; Reynolds, J.W.; Shay, J.Y.; Sturm, G.P. Jr.; Thomson, J.S.; Vogh, J.W.; Vrana, R.P.; Yu, S.K.T.; Diehl, B.H.; Grizzle, P.L.; Hirsch, D.E; Hornung, K.W.; Tang, S.Y.


    On March 6, 1980, the US Department of Energy (DOE) and the Ministry of Energy and Mines of Venezuela (MEMV) entered into a joint agreement which included analysis of heavy crude oils from the Venezuelan Orinoco oil belt. The purpose of this report is to present compositional data and describe new analytical methods obtained from work on the Cerro Negro Orinoco belt crude oil since 1980. Most of the chapters focus on the methods rather than the resulting data on Cerro Negro oil, and results from other oils obtained during verification of the methods are included. In addition, published work on the analysis of heavy oils, tar sand bitumens, and similar materials is reviewed, and the overall state of the art in analytical methodology for heavy fossil liquids is assessed. The various phases of the work included: distillation and determination of "routine" physical/chemical properties (Chapter 1); preliminary separation of >200 °C distillates and the residue into acid, base, neutral, saturated-hydrocarbon and neutral-aromatic concentrates (Chapter 2); further separation of the acid, base, and neutral concentrates into subtypes (Chapters 3-5); and determination of the distribution of metal-containing compounds in all fractions (Chapter 6).

  14. Development and validation of high performance liquid chromatographic method for analysis of clozapine. (United States)

    Kaur, Harpreet; Bassi, Pallavi; Monif, Tausif; Khuroo, Arshad; Kaur, Gurpreet


    In this study, a rapid, simple and sensitive assay to quantify clozapine in human plasma using a reverse-phase high-performance liquid chromatographic method has been developed. Clozapine was extracted from human plasma using a mixture of chloroform:n-hexane (50:50) employing a liquid-liquid extraction method. The calibration curve was found to be linear in the concentration range of 25-800 ng/ml. The inter-day and intra-day assay accuracy and precision fulfilled the criteria specified by the USFDA Guidance for Industry: Bioanalytical Method Validation. Clozapine was found to be stable in human plasma after 6 h incubation at room temperature, 50 days of storage at -27°C and freeze-thaw cycles, as well as after reconstitution with mobile phase following 24 h of storage in a refrigerator. The validated method offers the advantage of using a minimum injection volume (25 μl) and plasma sample volume (300 μl). The extraction method is simple and single-step with no back-extraction step, making this method applicable to the determination of pharmacokinetic profiles and parameters.

  15. Developing A New Sampling and Analysis Method for Hydrazine and Monomethyl Hydrazine (United States)

    Allen, John R.


    Solid phase microextraction (SPME) will be used to develop a method for detecting monomethyl hydrazine (MMH) and hydrazine (Hz). A derivatizing agent, pentafluorobenzoyl chloride (PFBCl), is known to react readily with MMH and Hz. The SPME fiber can either be coated with PFBCl and introduced into a gaseous stream containing MMH, or PFBCl and MMH can react first in a syringe barrel and after a short equilibration period a SPME is used to sample the resulting solution. These methods were optimized and compared. Because Hz and MMH can degrade the SPME, letting the reaction occur first gave better results. Only MMH could be detected using either of these methods. Future research will concentrate on constructing calibration curves and determining the detection limit.

  16. Methods developed for the mass sampling analysis of CO and carboxyhemoglobin in man

    Energy Technology Data Exchange (ETDEWEB)

    Baretta, E.D.; Stewart, R.D.; Graff, S.A.; Donahoo, K.K.


    Gas chromatography was used to quantitate CO in air and also as an indirect means of determining %COHb in blood. The blood was then used to calibrate four CO-Oximeters used in a survey to determine average COHb levels in various segments of the U.S. population. Mean differences, both between the two methods of analysis and between pairs of CO-Oximeters, were less than 0.1% COHb saturation. COHb values obtained on consecutive days using one CO-Oximeter were repeatable within an S.D. of ±0.13% COHb.

  17. Development of advanced methods for analysis of experimental data in diffusion (United States)

    Jaques, Alonso V.

    There are numerous experimental configurations and data analysis techniques for the characterization of diffusion phenomena. However, the mathematical methods for estimating diffusivities traditionally do not take into account the effects of experimental errors in the data, and often require smooth, noiseless data sets to perform the necessary analysis steps. The current methods used for data smoothing require strong assumptions which can introduce numerical "artifacts" into the data, affecting confidence in the estimated parameters. The Boltzmann-Matano method is used extensively in the determination of concentration-dependent diffusivities, D(C), in alloys. In the course of analyzing experimental data, numerical integrations and differentiations of the concentration profile are performed. These methods require smoothing of the data prior to analysis. We present here an approach to the Boltzmann-Matano method that is based on a regularization method to estimate a differentiation operation on the data, i.e., estimate the concentration gradient term, which is important in the analysis process for determining the diffusivity. This approach, therefore, has the potential to be less subjective, and in numerical simulations shows an increased accuracy in the estimated diffusion coefficients. We present a regression approach to estimate linear multicomponent diffusion coefficients that eliminates the need to pre-treat or pre-condition the concentration profile. This approach fits the data to a functional form of the mathematical expression for the concentration profile, and allows us to determine the diffusivity matrix directly from the fitted parameters. Reformulation of the equation for the analytical solution is done in order to reduce the size of the problem and accelerate the convergence.
The objective function for the regression can incorporate point estimations for error in the concentration, improving the statistical confidence in the estimated diffusivity matrix
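A toy version of the regression idea, assuming a thin-film (Gaussian) concentration profile rather than the multicomponent case treated in the work: fitting the functional form directly yields the diffusivity from a fitted slope, with no numerical differentiation of the data.

```python
import numpy as np

# Synthetic thin-film diffusion profile: C(x) = C0 * exp(-x^2 / (4*D*t))
D_true, t = 2.0e-13, 3600.0                   # diffusivity (m^2/s) and anneal time (s)
x = np.linspace(0.0, 4.0e-5, 40)              # depth below the surface, m
C = 100.0 * np.exp(-x**2 / (4.0 * D_true * t))

# ln C is linear in x^2 with slope -1/(4*D*t), so a single linear
# regression recovers D from the fitted parameters directly.
slope, _ = np.polyfit(x**2, np.log(C), 1)
D_fit = -1.0 / (4.0 * slope * t)
print(D_fit)
```

With noisy data the same fit averages over all points instead of amplifying noise the way a pointwise concentration-gradient estimate does, which is the motivation stated in the abstract.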

  18. Contribution of ion beam analysis methods to the development of second generation high temperature superconducting wires (United States)

    Usov, I. O.; Arendt, P. N.; Foltyn, S. R.; Stan, L.; DePaula, R. F.; Holesinger, T. G.


    One of the crucial steps in the second generation high temperature superconducting wire program was development of the buffer-layer architecture. The architecture designed at the Superconductivity Technology Center at Los Alamos National Laboratory consists of several oxide layers wherein each layer plays a specific role, namely: nucleation layer, diffusion barrier, biaxially textured template, and intermediate layer providing a suitable lattice match to the superconducting Y1Ba2Cu3O7 (YBCO) compound. This report demonstrates how a wide range of ion beam analysis techniques (SIMS, RBS, channeling, PIXE, PIGE, NRA and ERD) was employed for analysis of each buffer layer and the YBCO film. These results assisted in understanding of a variety of physical processes occurring during the buffer layer fabrication and helped to optimize the buffer-layer architecture as a whole.

  19. Development and application of an automated analysis method for individual cerebral perfusion single photon emission tomography images

    CERN Document Server

    Cluckie, A J


    Neurological images may be analysed by performing voxel by voxel comparisons with a group of control subject images. An automated, 3D, voxel-based method has been developed for the analysis of individual single photon emission tomography (SPET) scans. Clusters of voxels are identified that represent regions of abnormal radiopharmaceutical uptake. Morphological operators are applied to reduce noise in the clusters, then quantitative estimates of the size and degree of the radiopharmaceutical uptake abnormalities are derived. Statistical inference has been performed using a Monte Carlo method that has not previously been applied to SPET scans, or for the analysis of individual images. This has been validated for group comparisons of SPET scans and for the analysis of an individual image using comparison with a group. Accurate statistical inference was obtained independent of experimental factors such as degrees of freedom, image smoothing and voxel significance level threshold. The analysis method has been eval...
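The voxel-by-voxel comparison of an individual scan with a control group can be sketched as a z-score map with a threshold (the work itself uses a more careful Monte Carlo procedure for inference). All data below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)
controls = rng.normal(100.0, 5.0, size=(20, 8, 8, 8))   # 20 control scans
patient = np.full((8, 8, 8), 100.0)                      # noiseless test scan for clarity
patient[2:4, 2:4, 2:4] = 60.0                            # simulated region of reduced uptake

# Voxel-wise z-scores of the individual against the control distribution
mu = controls.mean(axis=0)
sd = controls.std(axis=0, ddof=1)
z = (patient - mu) / sd

# One-sided threshold flags clusters of abnormally low uptake
abnormal = z < -3.0
print(int(abnormal.sum()))
```

In the method described, the resulting clusters would then be cleaned with morphological operators before size and intensity estimates are derived.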

  20. Effective methods of consumer protection in Brazil. An analysis in the context of property development contracts

    Directory of Open Access Journals (Sweden)

    Deborah Alcici Salomão


    Full Text Available This study examines consumer protection in arbitration, especially under the example of property development contract disputes in Brazil. This is a very current issue in light of the presidential veto of consumer arbitration on May 26, 2015. The article discusses the arbitrability of these disputes based on Brazilian legislation and relevant case law. It also analyzes the advantages, disadvantages and trends of consumer arbitration in the context of real estate contracts. The paper concludes by providing suggestions specific to consumer protection in arbitration based on this analysis.

  1. Development of a preparation and staining method for fetal erythroblasts in maternal blood : Simultaneous immunocytochemical staining and FISH analysis

    NARCIS (Netherlands)

    Oosterwijk, JC; Mesker, WE; Ouwerkerk-van Velzen, MCM; Knepfle, CFHM; Wiesmeijer, KC; van den Burg, MJM; Beverstock, GC; Bernini, LF; van Ommen, Gert-Jan B; Kanhai, HHH; Tanke, HJ


    In order to detect fetal nucleated red blood cells (NRBCs) in maternal blood, a protocol was developed which aimed at producing a reliable staining method for combined immunocytochemical and FISH analysis. The technique had to be suitable for eventual automated screening of slides. Chorionic villi w

  2. Alternative method of highway traffic safety analysis for developing countries using delphi technique and Bayesian network. (United States)

    Mbakwe, Anthony C; Saka, Anthony A; Choi, Keechoo; Lee, Young-Jae


    Highway traffic accidents all over the world result in more than 1.3 million fatalities annually. An alarming number of these fatalities occurs in developing countries. There are many risk factors that are associated with frequent accidents, heavy loss of lives, and property damage in developing countries. Unfortunately, poor record-keeping practices are a very difficult obstacle to overcome in striving to obtain near-accurate casualty and safety data. In light of the fact that there are numerous accident causes, any attempt to curb the escalating death and injury rates in developing countries must include the identification of the primary accident causes. This paper, therefore, seeks to show that the Delphi Technique is a suitable alternative method that can be exploited in generating highway traffic accident data through which the major accident causes can be identified. In order to authenticate the technique used, Korea, a country that underwent similar problems when it was in its early stages of development, and which has excellent highway safety records in its database, is chosen and utilized for this purpose. Validation of the methodology confirms that the technique is suitable for application in developing countries. Furthermore, the Delphi Technique, in combination with the Bayesian Network Model, is utilized in modeling highway traffic accidents and forecasting accident rates in the countries of research.
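How expert-elicited Delphi estimates can feed a Bayesian network is easy to sketch with a toy two-parent network; every probability below is invented and the node structure is hypothetical, not the paper's model.

```python
# Toy discrete Bayesian network: accident depends on road condition
# and driver error. Conditional probabilities stand in for values an
# expert panel might supply via a Delphi process.
p_bad_road = 0.3                     # P(road = bad)
p_error = 0.2                        # P(driver error)
p_acc = {(True, True): 0.50, (True, False): 0.10,
         (False, True): 0.25, (False, False): 0.02}

# Marginal accident probability: sum P(acc | parents) * P(parents)
p_accident = sum(
    p_acc[(road, err)]
    * (p_bad_road if road else 1 - p_bad_road)
    * (p_error if err else 1 - p_error)
    for road in (True, False) for err in (True, False)
)
print(round(p_accident, 4))
```

Scaling this enumeration to many risk factors is what dedicated Bayesian-network tooling automates; the arithmetic per node is exactly the weighted sum shown here.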

  3. Method developments approaches in supercritical fluid chromatography applied to the analysis of cosmetics. (United States)

    Lesellier, E; Mith, D; Dubrulle, I


    Analyses of complex samples of cosmetics, such as creams or lotions, are generally achieved by HPLC. These analyses often require multistep gradients, due to the presence of compounds with a large range of polarity. For instance, the bioactive compounds may be polar, while the matrix contains lipid components that are rather non-polar, since cosmetic formulations are usually oil-water emulsions. Supercritical fluid chromatography (SFC) uses mobile phases composed of carbon dioxide and organic co-solvents, allowing for good solubility of both the active compounds and the matrix excipients. Moreover, the classical and well-known properties of these mobile phases yield fast analyses and ensure rapid method development. However, due to the large number of stationary phases available for SFC and to the varied additional parameters acting on both retention and separation factors (co-solvent nature and percentage, temperature, backpressure, flow rate, column dimensions and particle size), a simplified approach can be followed to ensure fast method development. First, suitable stationary phases should be carefully selected for an initial screening, and then the other operating parameters can be limited to the co-solvent nature and percentage, keeping the oven temperature and back-pressure constant. To describe simple method development guidelines in SFC, three sample applications are discussed in this paper: UV filters (sunscreens) in sunscreen cream, glyceryl caprylate in eye liner and caffeine in eye serum. Firstly, five stationary phases (ACQUITY UPC(2)) are screened with isocratic elution conditions (10% methanol in carbon dioxide). Complementarity of the stationary phases is assessed based on our spider-diagram classification, which compares a large number of stationary phases based on five molecular interactions. Secondly, the one or two best stationary phases are retained for further optimization of the mobile phase composition, with isocratic elution conditions or, when

  4. Development of a Probabilistic Dynamic Synthesis Method for the Analysis of Nondeterministic Structures (United States)

    Brown, A. M.


    Accounting for the statistical geometric and material variability of structures in analysis has been a topic of considerable research for the last 30 years. The determination of quantifiable measures of statistical probability of a desired response variable, such as natural frequency, maximum displacement, or stress, to replace experience-based "safety factors" has been a primary goal of these studies. There are, however, several problems associated with their satisfactory application to realistic structures, such as bladed disks in turbomachinery. These include the accurate definition of the input random variables (rv's), the large size of the finite element models frequently used to simulate these structures, which makes even a single deterministic analysis expensive, and accurate generation of the cumulative distribution function (CDF) necessary to obtain the probability of the desired response variables. The research presented here applies a methodology called probabilistic dynamic synthesis (PDS) to solve these problems. The PDS method uses dynamic characteristics of substructures measured from modal test as the input rv's, rather than "primitive" rv's such as material or geometric uncertainties. These dynamic characteristics, which are the free-free eigenvalues, eigenvectors, and residual flexibility (RF), are readily measured and for many substructures, a reasonable sample set of these measurements can be obtained. The statistics for these rv's accurately account for the entire random character of the substructure. Using the RF method of component mode synthesis, these dynamic characteristics are used to generate reduced-size sample models of the substructures, which are then coupled to form system models. 
These sample models are used to obtain the CDF of the response variable by either applying Monte Carlo simulation or by generating data points for use in the response surface reliability method, which can perform the probabilistic analysis with an order of
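The Monte Carlo route to the CDF can be sketched with a single-degree-of-freedom system standing in for the paper's substructure models: sample the random variables, compute the response variable (here the natural frequency), and read probabilities off the empirical CDF. The distributions below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 20000
k = rng.normal(1.0e6, 5.0e4, n)              # stiffness samples, N/m
m = rng.normal(10.0, 0.5, n)                 # mass samples, kg
f = np.sqrt(k / m) / (2.0 * np.pi)           # natural frequency of each sample, Hz

# Empirical CDF of the response variable
f_sorted = np.sort(f)
cdf = np.arange(1, n + 1) / n

# Quantifiable probability statement replacing a safety factor,
# e.g. the probability that the frequency stays below 52 Hz:
p_below = np.interp(52.0, f_sorted, cdf)
print(round(f.mean(), 1), round(p_below, 3))
```

The response-surface alternative mentioned in the abstract replaces the brute-force sampling with a fitted surrogate when each model evaluation is expensive.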

  5. Development of a carbohydrate silylation method in ionic liquids for their gas chromatographic analysis. (United States)

    Ruiz-Aceituno, L; Carrero-Carralero, C; Ramos, L; Martinez-Castro, I; Sanz, M L


    This paper reports on the feasibility of silylation of low molecular weight carbohydrates dissolved in different ionic liquids (ILs) for their further analysis by gas chromatography (GC). Derivatization reagents (nature and amounts), temperature and time of reaction and stirring conditions were evaluated for different carbohydrates (i.e., glucose, mannose, fructose and lactose) dissolved in 1-ethyl-3-methylimidazolium dicyanamide [EMIM][DCA]. Evaluation of conformational isomerism of glucose dissolved in [EMIM][DCA] revealed the effect of the time of dissolution in the equilibration of α- and β-furanoses (up to 3% and 6%, respectively, after 70 h of incubation) and that 21 h sufficed to obtain results similar to those provided by the reference method involving pyridine. Once optimized, the proposed derivatization procedure provided satisfactory yields (i.e., close to 100%) using 100 μL of trimethylsilylimidazole (TMSI) at mild conditions (25°C) for a relatively short time (1h) for most of the investigated carbohydrates. Under these experimental conditions, linear responses (i.e., R(2) better than 0.974) were obtained in the tested range of 0.25-1mg of the derivatized target compounds. Other reagents, such as N,O-bis(trimethylsilyl)trifluoroacetamide (BSTFA)+1% trimethylchlorosilane (TMCS), were successfully used under ultrasonic conditions for aldose monosaccharides and disaccharides derivatization, while BSTFA was useful for ketose monosaccharides. The possibility of using the proposed method for the derivatization of selected carbohydrates dissolved in different ILs and the efficiency of the method applied to the analysis of carbohydrates present in real samples (fruit juices) have also been investigated.

  6. Performance analysis of a modified moving shadow elimination method developed for indoor scene activity tracking (United States)

    Mitra, Bhargav Kumar; Fiaz, Muhammad Kamran; Kypraios, Ioannis; Birch, Philip; Young, Rupert; Chatwin, Chris


    Moving shadow detection is an important step in automated robust surveillance systems in which a dynamic object is to be segmented and tracked. Rejection of the shadow region significantly reduces the erroneous tracking of non-target objects within the scene. A method to eliminate such shadows in indoor video sequences has been developed by the authors. The objective has been met through the use of a pixel-wise shadow search process that utilizes a computational model in the RGB colour space to demarcate the moving shadow regions from the background scene and the foreground objects. However, it has been observed that the robustness and efficiency of the method can be significantly enhanced through the deployment of a binary-mask based shadow search process. This, in turn, calls for the use of a prior foreground object segmentation technique. The authors have also automated a standard foreground object segmentation technique through the deployment of some popular statistical outlier-detection based strategies. The paper analyses the performance of the modified moving shadow elimination method, i.e. its effectiveness as a shadow detector, its discrimination potential, and its processing time, on the basis of some standard evaluation metrics.
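The pixel-wise RGB shadow cue that such methods build on can be sketched as follows: a cast shadow dims the background roughly equally in all three channels, while a true foreground object changes the chromaticity. The thresholds and toy images below are illustrative assumptions, not the authors' computational model.

```python
import numpy as np

background = np.full((4, 4, 3), 200.0)       # static background estimate
frame = background.copy()
frame[0, 0] = [120.0, 118.0, 122.0]          # shadowed pixel: uniform dimming
frame[1, 1] = [30.0, 200.0, 40.0]            # foreground object: chromaticity change

# Per-channel ratio of current frame to background
ratio = frame / background
darker = ratio.mean(axis=2) < 0.9            # noticeably darker than background
uniform = ratio.std(axis=2) < 0.05           # dimming similar across R, G, B
shadow_mask = darker & uniform               # binary shadow mask

print(shadow_mask[0, 0], shadow_mask[1, 1])
```

Pixels flagged by `shadow_mask` are removed from the foreground segmentation, which is what prevents the tracker from following the shadow instead of the object.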


    Directory of Open Access Journals (Sweden)



    Full Text Available The study of organizational activity and the highlighting of problem situations that require specific solutions call for a detailed analysis of the models defined for the real system of economic companies, regarded not as a sum of assets but as organizations in which activities are related into processes. In addition to the usual approach of using modeling languages in the development of information systems, in this paper we intend to present some examples that demonstrate the usefulness of a standard modeling language (UML) for analyzing organizational activities and for reporting problem situations that may occur in the management of data recorded on primary documents or in processes that bring together activities. The examples, which are focused on a travel agency, can be extrapolated to any other organization, and the diagrams can be used in different contexts, depending on the complexity of the activities identified.

  8. Development of distinction method of production area of ginsengs by using a neutron activation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Youngjin; Chung, Yongsam; Sim, Chulmuu; Sun, Gwangmin; Lee, Yuna; Yoo, Sangho


    During the last 2 years of the project, we have tried to develop technology to distinguish the production areas of Korean ginsengs cultivated in various provinces in Korea and in foreign countries. It will contribute to securing the safety of health food for the public and the stability of its market. In this year, we collected ginseng samples cultivated in the northeastern provinces of the Chinese mainland, namely Liaoning province, Jilin province and the Baekdu mountain area within Jilin province. Ten ginseng samples were collected in each area. The elemental concentrations in the ginseng were analyzed by using a neutron activation analysis technique at the HANARO research reactor. The distinction of production area was made using statistical software. As a result, the Chinese Korean ginsengs were clearly differentiated from those cultivated in the famous provinces in Korea, though with the limitation that the number of samples analyzed was very small.
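
    The abstract does not name the statistical software or algorithm used for the distinction, so the following is only a generic sketch of how elemental concentrations can separate production areas, here with a simple nearest-centroid rule and invented concentration values:

```python
import numpy as np

# Illustrative sketch of production-area distinction from elemental
# concentrations (values invented, not the paper's data).
# Rows: samples; columns: hypothetical [K, Fe, Zn] concentrations (mg/kg).
korea = np.array([[23.0, 1.1, 0.30], [24.5, 1.0, 0.28], [22.8, 1.2, 0.31]])
china = np.array([[18.0, 2.3, 0.45], [17.5, 2.1, 0.48], [18.8, 2.4, 0.44]])

centroids = {"Korea": korea.mean(axis=0), "China": china.mean(axis=0)}

def classify(sample):
    # Nearest-centroid rule in Euclidean distance.
    return min(centroids, key=lambda k: np.linalg.norm(sample - centroids[k]))

unknown = np.array([18.2, 2.2, 0.46])
print(classify(unknown))  # -> China
```

A real study would use more elements, replicate samples, and a proper multivariate method (e.g. discriminant analysis) with cross-validation, especially given the small sample size noted above.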

  9. Development of breached pin performance analysis code SAFFRON (System of Analyzing Failed Fuel under Reactor Operation by Numerical method)

    Energy Technology Data Exchange (ETDEWEB)

    Ukai, Shigeharu [Power Reactor and Nuclear Fuel Development Corp., Oarai, Ibaraki (Japan). Oarai Engineering Center


    On the assumption of fuel pin failure, the breached pin performance analysis code SAFFRON was developed to evaluate fuel pin behavior in relation to the delayed neutron signal response during operation beyond cladding failure. The following characteristic behaviors of a breached fuel pin are modeled with a 3-dimensional finite element method: pellet swelling by the fuel-sodium reaction, fuel temperature change, and the resultant extension of the cladding breach and release of delayed neutron precursors into the coolant. In particular, a practical numerical algorithm for the finite element method was originally developed in order to solve the 3-dimensional non-linear contact problem between the pellet, swollen due to the fuel-sodium reaction, and the breached cladding. (author).

  10. Methods and tools for developing virtual territories for scenario analysis of agro-ecosystems

    Directory of Open Access Journals (Sweden)

    Carlo Giupponi


    combinations of different typologies or levels of climate, physical conditions, socio-economic development, etc.; the efficiency and the flexibility of the tools adopted to easily generate realistic landscape and their variants. The approach is demonstrated through the development of erosion analysis under climate change scenarios.

  11. Nano-sized aerosol classification, collection and analysis--method development using dental composite materials. (United States)

    Bogdan, Axel; Buckett, Mary I; Japuntich, Daniel A


    This article presents a methodical approach for generating, collecting, and analyzing nano-size (1-100 nm) aerosol from abraded dental composite materials. Existing aerosol sampling instruments were combined with a custom-made sampling chamber to create and sample a fresh, steady-state aerosol size distribution before significant Brownian coagulation. Morphological, size, and compositional information was obtained by Transmission Electron Microscopy (TEM). To create sample sizes suitable for TEM analysis, aerosol concentrations in the test chamber had to be much higher than one would typically expect in a dental office, and therefore, these results do not represent patient or dental personnel exposures. Results show that nano-size aerosol was produced by the dental drill alone, with and without cooling water drip, prior to abrasion of dental composite. During abrasion, aerosol generation seemed independent of the percent filler load of the restorative material and the operator who generated the test aerosol. TEM investigation showed that "chunks" of filler and resin were generated in the nano-size range; however, free nano-size filler particles were not observed. The majority of observed particles consisted of oil droplets, ash, and graphitic structures.

  12. Development and validation of a GC-FID method for quantitative analysis of oleic acid and related fatty acids☆

    Institute of Scientific and Technical Information of China (English)

    Honggen Zhang; Zhenyu Wang; Oscar Liu


    Oleic acid is a common pharmaceutical excipient that has been widely used in various dosage forms. Gas chromatography (GC) has often been used as the quantitation method for fatty acids normally requiring a derivatization step. The aim of this study was to develop a simple, robust, and derivatization-free GC method that is suitable for routine analysis of all the major components in oleic acid USP-NF (United States Pharmacopeia-National Formulary) material. A gas chromatography-flame ionization detection (GC-FID) method was developed for direct quantitative analysis of oleic acid and related fatty acids in oleic acid USP-NF material. Fifteen fatty acids were separated using a DB-FFAP (nitroterephthalic acid modified polyethylene glycol) capillary GC column (30 m × 0.32 mm i.d.) with a total run time of 20 min. The method was validated in terms of specificity, linearity, precision, accuracy, sensitivity, and robustness. The method can be routinely used for the purpose of oleic acid USP-NF material analysis.

  13. The development and application of the k0-standardization method of neutron activation analysis at the Es-Salam research reactor

    Energy Technology Data Exchange (ETDEWEB)

    Alghem, L. [Departement d'Analyse par Activation Neutronique, CRNB, BP 180, Ain Oussera 17200, W Djelfa (Algeria)]. E-mail:; Ramdhane, M. [Departement de physique, Universite Mentouri de Constantine (Algeria); Khaled, S. [Departement d'Analyse par Activation Neutronique, CRNB, BP 180, Ain Oussera 17200, W Djelfa (Algeria); Akhal, T. [Departement d'Analyse par Activation Neutronique, CRNB, BP 180, Ain Oussera 17200, W Djelfa (Algeria)


    In recent years the k0-NAA method has been applied and developed at the 15 MW Es-Salam research reactor. This work includes: (1) the detection efficiency calibration of the γ-spectrometer used in k0-NAA, (2) the determination of reactor neutron spectrum parameters, such as the α and f factors, in the irradiation channel, and (3) the validation of the developed k0-NAA procedure by analysing an SRM (AIEA Soil-7) and a CRM (IGGE GSV-4). The analysis results obtained by k0-NAA for 27 elements of the Soil-7 standard and 14 elements of the GSV-4 standard were compared with certified values. The deviations between experimental and certified values were mostly less than 10%. The k0-NAA procedure established at the Es-Salam research reactor is regarded as a reliable standardization method of NAA that is available for practical applications.
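
    The abstract does not reproduce the k0 equations; for reference, the elemental concentration in k0-NAA (with gold as the comparator) is conventionally computed from the standard k0 formula, which ties together the calibration steps listed above:

```latex
% Standard k0-NAA concentration formula (gold comparator), as conventionally
% written; N_p = net peak area, t_m = measuring time, S, D, C = saturation,
% decay and counting factors, w, W = sample and monitor masses:
\rho_a =
  \frac{\left( N_p / (t_m\, S\, D\, C\, w) \right)_a}
       {\left( N_p / (t_m\, S\, D\, C\, W) \right)_{\mathrm{Au}}}
  \cdot \frac{1}{k_{0,\mathrm{Au}}(a)}
  \cdot \frac{f + Q_{0,\mathrm{Au}}(\alpha)}{f + Q_{0,a}(\alpha)}
  \cdot \frac{\varepsilon_{p,\mathrm{Au}}}{\varepsilon_{p,a}}
```

Here f is the thermal-to-epithermal flux ratio and α the epithermal flux shape parameter, i.e. the quantities determined in step (2), while the full-energy peak efficiencies ε_p come from the detector calibration of step (1).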

  14. Development of soil-structure interaction analysis method (II) - Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Chang, S. P.; Ko, H. M.; Park, H. K. and others [Seoul National Univ., Seoul (Korea, Republic of)


    This project includes the following six items: free field analysis for the determination of site input motions; impedance analysis, which simplifies the effects of soil-structure interaction by using lumped parameters; soil-structure interaction analysis including the material nonlinearity of soil depending on the level of strains; strong geometric nonlinearity due to the uplifting of the base; seismic analysis of underground structures such as buried pipes; and seismic analysis of liquid storage tanks. Each item contains the following contents, respectively: a state-of-the-art review of each item and database construction on past research, a theoretical review of the technology of soil-structure interaction analysis, a proposal of the preferable technology and an estimate of its domestic applicability, and proposed guidelines for the evaluation of safety and analysis schemes.

  15. [Ocra Method: development of a new procedure for analysis of multiple tasks subject to infrequent rotation]. (United States)

    Occhipinti, E; Colombini, Daniela; Occhipinti, M


    In the Ocra methods (Ocra index and Ocra Checklist), when computing the final indices (Ocra index or checklist score) in the case of more than one repetitive task, a "traditional" procedure has already been proposed, the results of which can be defined as a "time-weighted average". This approach appears to be appropriate when considering rotations among tasks that are performed very frequently, for instance almost once every hour (or for shorter periods). However, when rotation among repetitive tasks is less frequent (i.e. once every 1 1/2 or more hours), the "time-weighted average" approach can result in an underestimation of the exposure level (as it practically flattens peaks of high exposure). For those scenarios an alternative approach based on the "most stressful task as minimum" might be more realistic. This latter approach has already been included in the NIOSH approach for multiple sequential lifting tasks and, given the recent availability in the Ocra method of more detailed duration multipliers (practically a different DuM for each one-hour step of duration of the repetitive task), it is now possible to define a procedure to compute the complex Ocra Multitask Index (cOCRA) and the complex Checklist Score (cCHESCO) for the analysis of two or more repetitive tasks when rotations are infrequent (every 1 1/2 hours or more). The result of this approach will be at least equal to the index of the most stressful task considered for its individual daily duration and at most equal to the index of the most stressful task when it is (only theoretically) considered as lasting for the overall daily duration of all examined repetitive tasks. The procedure is based on the following formula: Complex Ocra Multitask Index (cOCRA) = ocra1(Dum1) + (Δocra1 × K), where 1, 2, 3, ..., N = repetitive tasks ordered by Ocra index values (1 = highest; N = lowest) computed considering the respective real duration multipliers (Dumi).
    ocra1 = ocra index of
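
    The bounding behaviour described above can be made concrete. The sketch below is only an illustration of the quoted formula, not the authors' full procedure: the exact definition of the weighting factor K is truncated in this record, so K is treated here simply as a value in [0, 1]. The complex index then interpolates between the most stressful task scored at its own duration and the same task scored at the overall duration:

```python
def complex_ocra(ocra1_own_duration, ocra1_total_duration, k):
    """Sketch of the complex Ocra Multitask Index (cOCRA).

    ocra1_own_duration:   Ocra index of the most stressful task, computed with
                          the duration multiplier for its own daily duration.
    ocra1_total_duration: the same task scored as if it lasted the overall
                          daily duration of all repetitive tasks.
    k: weighting factor in [0, 1]; its exact definition (based on the other
       tasks' scores and durations) is truncated in this record, so it is
       an assumption of this sketch.
    """
    delta_ocra1 = ocra1_total_duration - ocra1_own_duration
    return ocra1_own_duration + delta_ocra1 * k

# Bounds stated in the abstract: k = 0 gives the most stressful task at its
# own duration, k = 1 gives it at the overall daily duration.
print(complex_ocra(3.5, 5.0, 0.0))  # -> 3.5
print(complex_ocra(3.5, 5.0, 1.0))  # -> 5.0
```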

  16. Development and in house validation of a new thermogravimetric method for water content analysis in soft brown sugar. (United States)

    Ducat, Giseli; Felsner, Maria L; da Costa Neto, Pedro R; Quináia, Sueli P


    Recently the use of brown sugar has increased due to its nutritional characteristics, thus requiring stricter quality control. The development of a method for water content analysis in soft brown sugar was carried out for the first time by TG/DTA with the application of different statistical tests. The results of the optimization study suggest that a heating rate of 5 °C min⁻¹ and an alumina sample holder improve the efficiency of the drying process. The validation study showed that thermogravimetry presents good accuracy and precision for water content analysis in soft brown sugar samples. This technique offers advantages over other analytical methods as it does not use toxic and costly reagents or solvents, it does not need any sample preparation, and it allows the identification of the temperature at which water is completely eliminated relative to other volatile degradation products. This is an important advantage over the official method (loss on drying).
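
    The arithmetic underlying a thermogravimetric water determination is simple: the water content is the relative mass loss up to the temperature at which water elimination is complete, read from the TG/DTA curves. A sketch with invented data:

```python
# Sketch: water content from a thermogravimetric (TG) curve as the relative
# mass loss up to the temperature where water elimination is complete.
# Temperatures (°C) and masses (mg) below are illustrative, not the paper's data.
temperature = [25, 60, 100, 140, 180]
mass = [500.0, 498.2, 495.6, 495.5, 495.5]   # plateau after ~140 °C

t_water_end = 140  # assumed end of water loss, read from the TG/DTA curves
m0 = mass[0]
m_dry = mass[temperature.index(t_water_end)]
water_percent = 100.0 * (m0 - m_dry) / m0
print(round(water_percent, 2))  # -> 0.9
```

Identifying t_water_end from the curves is exactly the advantage the abstract claims over plain loss on drying, where mass loss from other volatiles is indistinguishable from water.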

  17. Factor analysis methods and validity evidence: A systematic review of instrument development across the continuum of medical education (United States)

    Wetzel, Angela Payne

    Previous systematic reviews indicate a lack of reporting of reliability and validity evidence in subsets of the medical education literature. Psychology and general education reviews of factor analysis also indicate gaps between current and best practices; yet, a comprehensive review of exploratory factor analysis in instrument development across the continuum of medical education had not been previously identified. Therefore, the purpose of this study was a critical review of instrument development articles employing exploratory factor or principal component analysis published in medical education (2006-2010) to describe and assess the reporting of methods and validity evidence based on the Standards for Educational and Psychological Testing and factor analysis best practices. Data extraction from 64 articles measuring a variety of constructs published throughout the peer-reviewed medical education literature indicates significant errors in the translation of exploratory factor analysis best practices to current practice. Further, techniques for establishing validity evidence tend to derive from a limited scope of methods, including reliability statistics to support internal structure and support for test content. Instruments reviewed for this study lacked supporting evidence based on relationships with other variables and response process, and evidence based on consequences of testing was not evident. Findings suggest a need for further professional development within the medical education researcher community related to (1) appropriate factor analysis methodology and reporting and (2) the importance of pursuing multiple sources of reliability and validity evidence to construct a well-supported argument for the inferences made from the instrument. Medical education researchers and educators should be cautious in adopting instruments from the literature and carefully review available evidence. 
    Finally, editors and reviewers are encouraged to recognize
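
    One reporting practice such reviews examine is factor retention. Below is a minimal, numpy-only sketch of the (much-criticised but still common) Kaiser criterion, counting eigenvalues of the item correlation matrix greater than 1, on simulated two-factor data; best practice would complement this with parallel analysis and scree inspection:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated instrument data: 6 items driven by two underlying factors
# (entirely illustrative, not from any reviewed study).
n = 500
f1 = rng.normal(size=n)
f2 = rng.normal(size=n)
items = np.column_stack([
    f1 + 0.3 * rng.normal(size=n),
    f1 + 0.3 * rng.normal(size=n),
    f1 + 0.3 * rng.normal(size=n),
    f2 + 0.3 * rng.normal(size=n),
    f2 + 0.3 * rng.normal(size=n),
    f2 + 0.3 * rng.normal(size=n),
])

corr = np.corrcoef(items, rowvar=False)
eigenvalues = np.sort(np.linalg.eigvalsh(corr))[::-1]

# Kaiser criterion: retain components with eigenvalue > 1.
retained = int(np.sum(eigenvalues > 1.0))
print(retained)  # -> 2
```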

  18. Development of an Ion Chromatography Method for Analysis of Organic Anions (Fumarate, Oxalate, Succinate, and Tartrate) in Single Chromatographic Conditions. (United States)

    Kaviraj, Yarbagi; Srikanth, B; Moses Babu, J; Venkateswara Rao, B; Paul Douglas, S


    A single organic counterion analysis method was developed by using an ion chromatography separation technique and conductivity detection. This allows the rapid characterization of an API to support clinical studies and to fulfil the regulatory requirements for the quantitation of fumarate, oxalate, succinate, and tartrate counterions in active pharmaceutical ingredients (quetiapine fumarate, escitalopram oxalate, sumatriptan succinate, and tolterodine tartrate). The method was developed by using the Metrohm Metrosep A Supp 1 (250 × 4.0 mm, 5.0 µm particle size) column with a mobile phase containing an isocratic mixture of solution A: 7.5 mM sodium carbonate and 2.0 mM sodium bicarbonate in Milli-Q water and solution B: acetonitrile. The flow rate was set at 1.0 mL/min and the run time was 25 minutes. The developed method was validated as per ICH guidelines, and the method parameters were chosen to ensure the simultaneous quantitation of all four anions. The method was validated for all four anions to demonstrate its applicability to common anions present in various APIs.


    Institute of Scientific and Technical Information of China (English)

    Wei-nan E; Ping-bing Ming


    The heterogeneous multiscale method gives a general framework for the analysis of multiscale methods. In this paper, we demonstrate this by applying this framework to two canonical problems: The elliptic problem with multiscale coefficients and the quasicontinuum method.
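
    As a hedged illustration of the framework (a standard textbook form, not reproduced from the paper), the elliptic model problem and the missing-data estimation step of the heterogeneous multiscale method can be sketched as:

```latex
% Elliptic model problem with rapidly oscillating coefficients:
-\nabla\cdot\left( a\!\left(x/\varepsilon\right)\nabla u^{\varepsilon} \right) = f
   \quad \text{in } \Omega .
% HMM solves a macroscale equation with an unknown effective coefficient,
-\nabla\cdot\left( A_H(x)\,\nabla U \right) = f ,
% estimating the missing effective flux at each macro quadrature point x_K
% by averaging the microscale flux over a small cell I_\delta(x_K):
A_H(x_K)\,\nabla U \;\approx\; \frac{1}{|I_\delta|}
  \int_{I_\delta(x_K)} a\!\left(x/\varepsilon\right)\nabla u^{\varepsilon}\,dx ,
% where u^\varepsilon solves the fine-scale equation on I_\delta with
% boundary conditions constrained to match the macro state U.
```

The analysis framework referred to above then bounds the macro error in terms of the accuracy of this micro-to-macro data estimation.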

  20. Development of Optimized Core Design and Analysis Methods for High Power Density BWRs (United States)

    Shirvan, Koroush

    temperature was kept the same for the BWR-HD and ABWR, which resulted in a 4 K cooler core inlet temperature for the BWR-HD, given that its feedwater makes up a larger fraction of total core flow. The stability analysis using the STAB and S3K codes showed satisfactory results for the hot channel, coupled regional out-of-phase and coupled core-wide in-phase modes. A RELAP5 model of the ABWR system was constructed and applied to six transients for the BWR-HD and ABWR. The ΔMCPRs during all the transients were found to be equal or smaller for the new design, and the core remained covered for both. The lower void coefficient along with the smaller core volume proved to be advantageous for the simulated transients. Helical Cruciform Fuel (HCF) rods were proposed in prior MIT studies to enhance the fuel surface-to-volume ratio. In this work, higher fidelity models (e.g. CFD instead of subchannel methods for the hydraulic behaviour) are used to investigate the resolution needed for accurate assessment of the HCF design. For neutronics, conserving the fuel area of cylindrical rods results in a different reactivity level with a lower void coefficient for the HCF design. In single-phase flow, for which experimental results existed, the friction factor is found to be sensitive to the HCF geometry and cannot be calculated using current empirical models. A new approach for analysis of flow crisis conditions for HCF rods in the context of Departure from Nucleate Boiling (DNB) and dryout using the two-phase interface tracking method was proposed and initial results are presented. It is shown that the twist of the HCF rods promotes detachment of a vapour bubble along the elbows, which indicates no possibility of an early DNB for the HCF rods and in fact a potential for a higher DNB heat flux. 
    Under annular flow conditions, it was found that the twist suppressed the liquid film thickness on the HCF rods at the locations of the highest heat flux, which increases the possibility of reaching early dryout. It

  1. Development of Distinction Method of Production Area of Ginsengs by Using a Neutron Activation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Jin; Chung, Yong Sam; Sun, Gwang Min; Lee, Yu Na; Yoo, Sang Ho [KAERI, Daejeon (Korea, Republic of)


    Distinction of the production area of Korean ginsengs has been attempted by using neutron activation techniques such as instrumental neutron activation analysis (INAA) and prompt gamma activation analysis (PGAA). The distribution of elements varies according to the part of the plant due to differences in enrichment effects and the influence of the soil in which the plants have been grown, so the correlation between plants and soil has been an issue. In this study, the distribution of trace elements within a Korean ginseng was investigated by using an instrumental neutron activation analysis

  2. Development and comparison of advanced reduced-basis methods for the transient structural analysis of unconstrained structures (United States)

    Mcgowan, David M.; Bostic, Susan W.; Camarda, Charles J.


    The development of two advanced reduced-basis methods, the force derivative method and the Lanczos method, and two widely used modal methods, the mode displacement method and the mode acceleration method, for transient structural analysis of unconstrained structures is presented. Two example structural problems are studied: an undamped, unconstrained beam subject to a uniformly distributed load which varies as a sinusoidal function of time and an undamped high-speed civil transport aircraft subject to a normal wing tip load which varies as a sinusoidal function of time. These example problems are used to verify the methods and to compare the relative effectiveness of each of the four reduced-basis methods for performing transient structural analyses on unconstrained structures. The methods are verified with a solution obtained by integrating directly the full system of equations of motion, and they are compared using the number of basis vectors required to obtain a desired level of accuracy and the associated computational times as comparison criteria.
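
    The mode displacement method mentioned above can be illustrated for steady-state harmonic response, where the reduced-basis step is just a projection onto a subset of modes. Below is a minimal sketch with an invented 3-DOF stiffness matrix and a unit mass matrix (the papers' beam and aircraft models are of course far larger):

```python
import numpy as np

# Sketch of the mode-displacement method for the steady-state harmonic
# problem (K - Ω² M) u = f, taking M = I for simplicity. Matrices invented.
K = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])
f = np.array([0.0, 1.0, 0.0])       # spatial pattern of a force f·sin(Ωt)
omega_sq, phi = np.linalg.eigh(K)   # modal frequencies² and mode shapes
Omega2 = 0.1                        # forcing frequency², below the first ω²

def mode_displacement(n_modes):
    # Project the force onto the first n_modes modes and superpose the
    # per-mode steady-state amplitudes q_i = (φ_iᵀ f) / (ω_i² - Ω²).
    q = (phi[:, :n_modes].T @ f) / (omega_sq[:n_modes] - Omega2)
    return phi[:, :n_modes] @ q

u_direct = np.linalg.solve(K - Omega2 * np.eye(3), f)
u_1mode = mode_displacement(1)   # truncated reduced basis
u_full = mode_displacement(3)    # full basis reproduces the direct solution
print(np.allclose(u_full, u_direct))  # -> True
```

The comparison criterion used in the paper, i.e. how many basis vectors are needed for a given accuracy, corresponds here to how quickly u_1mode, u_2mode, ... approach u_direct.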

  3. Developments in Surrogating Methods

    Directory of Open Access Journals (Sweden)

    Hans van Dormolen


    Full Text Available In this paper, I would like to talk about the developments in surrogating methods for preservation. My main focus will be on the technical aspects of preservation surrogates. This means that I will tell you something about my job as Quality Manager Microfilming for the Netherlands’ national preservation program, Metamorfoze, which is coordinated by the National Library. I am responsible for the quality of the preservation microfilms, which are produced for Metamorfoze. Firstly, I will elaborate on developments in preservation methods in relation to the following subjects:
    · Preservation microfilms
    · Scanning of preservation microfilms
    · Preservation scanning
    · Computer Output Microfilm
    In the closing paragraphs of this paper, I would like to tell you something about the methylene blue test. This is an important test for long-term storage of preservation microfilms. Also, I will give you a brief report on the Cellulose Acetate Microfilm Conference that was held at the British Library in London, May 2005.

  4. Development of direct-inverse 3-D methods for applied aerodynamic design and analysis (United States)

    Carlson, Leland A.


    Several inverse methods have been compared and initial results indicate that differences in results are primarily due to coordinate systems and fuselage representations and not to design procedures. Further, results from a direct-inverse method that includes 3-D wing boundary layer effects, wake curvature, and wake displacement are presented. These results show that boundary layer displacements must be included in the design process for accurate results.

  5. Development and validation of a reversed-phase liquid chromatographic method for analysis of demeclocycline and related impurities. (United States)

    Kahsay, Getu; Maxa, Jaroslav; Van Schepdael, Ann; Hoogmartens, Jos; Adams, Erwin


    A simple, robust, and rapid reversed-phase high-performance liquid chromatographic method for the analysis of demeclocycline and its impurities is described. Chromatographic separations were achieved on a Symmetry Shield RP8 (75 mm × 4.6 mm, 3.5 μm) column kept at 40 °C. The mobile phase was a gradient mixture of acetonitrile, 0.06 M sodium edetate (pH 7.5), 0.06 M tetrapropylammonium hydrogen sulphate (pH 7.5) and water, A (2:35:35:28 v/v/v/v) and B (30:35:35:0 v/v/v/v), pumped at a flow rate of 1 mL/min. UV detection was performed at 280 nm. The developed method was validated according to the ICH guidelines for specificity, limit of detection, limit of quantification, linearity, precision, and robustness. An experimental design was applied for the robustness study. Results show that the peak shape, the chromatographic resolution between the impurities, and the total analysis time are satisfactory and better than previous methods. The method has been applied for the analysis of commercial demeclocycline bulk samples available on the market.

  6. Development and validation of a reversed phase liquid chromatographic method for analysis of oxytetracycline and related impurities. (United States)

    Kahsay, Getu; Shraim, Fairouz; Villatte, Philippe; Rotger, Jacques; Cassus-Coussère, Céline; Van Schepdael, Ann; Hoogmartens, Jos; Adams, Erwin


    A simple, robust and fast high-performance liquid chromatographic method is described for the analysis of oxytetracycline and its related impurities. The principal peak and impurities are all baseline separated in 20 min using an Inertsil C₈ (150 mm × 4.6 mm, 5 μm) column kept at 50 °C. The mobile phase consists of a gradient mixture of mobile phases A (0.05% trifluoroacetic acid in water) and B (acetonitrile-methanol-tetrahydrofuran, 80:15:5, v/v/v) pumped at a flow rate of 1.3 ml/min. UV detection was performed at 254 nm. The developed method was validated for its robustness, sensitivity, precision and linearity in the range from limit of quantification (LOQ) to 120%. The limits of detection (LOD) and LOQ were found to be 0.08 μg/ml and 0.32 μg/ml, respectively. This method allows the separation of oxytetracycline from all known and 5 unknown impurities, which is better than previously reported in the literature. Moreover, the simple mobile phase composition devoid of non-volatile buffers made the method suitable to interface with mass spectrometry for further characterization of unknown impurities. The developed method has been applied for determination of related substances in oxytetracycline bulk samples available from four manufacturers. The validation results demonstrate that the method is reliable for quantification of oxytetracycline and its impurities.

  7. Development and Implementation of Efficiency-Improving Analysis Methods for the SAGE III on ISS Thermal Model Originating (United States)

    Liles, Kaitlin; Amundsen, Ruth; Davis, Warren; Scola, Salvatore; Tobin, Steven; McLeod, Shawn; Mannu, Sergio; Guglielmo, Corrado; Moeller, Timothy


    The Stratospheric Aerosol and Gas Experiment III (SAGE III) instrument is the fifth in a series of instruments developed for monitoring aerosols and gaseous constituents in the stratosphere and troposphere. SAGE III will be delivered to the International Space Station (ISS) via the SpaceX Dragon vehicle in 2015. A detailed thermal model of the SAGE III payload has been developed in Thermal Desktop (TD). Several novel methods have been implemented to facilitate efficient payload-level thermal analysis, including the use of a design of experiments (DOE) methodology to determine the worst-case orbits for SAGE III while on ISS, use of TD assemblies to move payloads from the Dragon trunk to the Enhanced Operational Transfer Platform (EOTP) to its final home on the Expedite the Processing of Experiments to Space Station (ExPRESS) Logistics Carrier (ELC)-4, incorporation of older models in varying unit sets, ability to change units easily (including hardcoded logic blocks), case-based logic to facilitate activating heaters and active elements for varying scenarios within a single model, incorporation of several coordinate frames to easily map to structural models with differing geometries and locations, and streamlined results processing using an Excel-based text file plotter developed in-house at LaRC. This document presents an overview of the SAGE III thermal model and describes the development and implementation of these efficiency-improving analysis methods.

  8. Development and evaluation of the piecewise Prony method for evoked potential analysis. (United States)

    Garoosi, V; Jansen, B H


    A new method is presented to decompose nonstationary signals into a summation of oscillatory components with time varying frequency, amplitude, and phase characteristics. This method, referred to as piecewise Prony method (PPM), is an improvement over the classical Prony method, which can only deal with signals containing components with fixed frequency, amplitude and phase, and monotonically increasing or decreasing rate of change. PPM allows the study of the temporal profile of post-stimulus signal changes in single-trial evoked potentials (EPs), which can lead to new insights in EP generation. We have evaluated this method on simulated data to test its limitations and capabilities, and also on single-trial EPs. The simulation experiments showed that the PPM can detect amplitude changes as small as 10%, rate changes as small as 10%, and 0.15 Hz of frequency changes. The capabilities of the PPM were demonstrated using single electroencephalogram/EP trials of flash visual EPs recorded from one normal subject. The trial-by-trial results confirmed that the stimulation drastically attenuates the alpha activity shortly after stimulus presentation, with the alpha activity returning about 0.5 s later. The PPM results also provided evidence that delta activity undergoes phase alignment following stimulus presentation.
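
    The classical Prony step that PPM applies piecewise can be sketched as follows: estimate linear-prediction coefficients in least squares, take the roots of the characteristic polynomial as poles, and read frequency and damping from each pole. The signal below is invented for illustration; this is the single-segment building block, not the authors' full piecewise procedure:

```python
import numpy as np

def prony_poles(x, p):
    # Linear prediction: x[n] = c1*x[n-1] + ... + cp*x[n-p], solved in least
    # squares over all available samples.
    N = len(x)
    A = np.column_stack([x[p - 1 - i:N - 1 - i] for i in range(p)])
    b = x[p:]
    c, *_ = np.linalg.lstsq(A, b, rcond=None)
    # Poles are the roots of z^p - c1*z^(p-1) - ... - cp = 0.
    return np.roots(np.concatenate(([1.0], -c)))

dt = 0.01
t = np.arange(200) * dt
x = np.exp(-0.5 * t) * np.cos(2 * np.pi * 10.0 * t)  # damped 10 Hz oscillation

poles = prony_poles(x, 2)
freqs = np.abs(np.angle(poles)) / (2 * np.pi * dt)    # pole frequencies (Hz)
print(round(freqs.max(), 2))  # -> 10.0
```

The damping rate is recovered analogously as log|z|/dt for each pole; the piecewise variant re-estimates poles and amplitudes on successive segments to follow time-varying frequency, amplitude, and phase.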

  9. Development of new software as a convenient analysis method for dental microradiography. (United States)

    Matsuda, Yasuhiro; Murata, Yukie; Tanaka, Toru; Komatsu, Hisanori; Sano, Hidehiko


    With the aim of developing a convenient research tool to calculate the mineralization status of teeth in detail, a new program was developed using Visual Basic for Applications combined with Microsoft Excel 2004. To demonstrate the usefulness of this program, it was used to analyze tooth enamel mineralization after acid exposure. Transverse microradiography (TMR) images of specimens were digitalized with a charge-coupled device camera attached to a microscope (CCD camera) and a digital film scanner (FS). Subsequently, the mineral content profile of each specimen after de- and remineralization studies was calculated using Angmar's formula. The newly developed program was applied to calculating the mineral loss (ΔZ), lesion depth (Ld), surface zone depth (SZd), and lesion body depth (LBd) of tooth specimens. In addition, the outer surface zone (OSZ), inner lesion body (ILB), and sandwich area (SA) between OSZ and ILB, which together constituted ΔZ, were calculated by the newly developed program. Data obtained with the newly developed program were in good agreement for both the CCD camera and the FS, indicating that the program is reliable for tooth enamel mineralization research studies.
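
    The quantities ΔZ and Ld above are conventionally derived from the TMR mineral-content profile: ΔZ as the integrated mineral deficit relative to sound enamel, and Ld as the depth at which the profile recovers to a fixed fraction (often 95%) of the sound level. The sketch below uses an invented profile and these conventional definitions, not necessarily the exact ones implemented in the authors' program:

```python
import numpy as np

# Sketch: mineral loss (ΔZ) and lesion depth (Ld) from a TMR mineral profile.
# Profile values (mineral vol%) at 10 μm depth steps are invented.
depth = np.arange(0, 160, 10.0)            # μm
sound = 87.0                               # sound enamel mineral content (vol%)
profile = np.array([30, 35, 42, 50, 58, 65, 71, 76, 80, 83,
                    85, 86.5, 87, 87, 87, 87], dtype=float)

# ΔZ: trapezoidal integral of the mineral deficit (vol% · μm).
deficit = np.clip(sound - profile, 0.0, None)
delta_z = float(np.sum((deficit[1:] + deficit[:-1]) / 2.0 * np.diff(depth)))

# Ld: first depth at which the profile reaches 95% of the sound level.
ld = depth[np.argmax(profile >= 0.95 * sound)]
print(delta_z, ld)
```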

  10. Spatial positioning : method development for spatial analysis of interaction in buildings


    Markhede, Henrik


    In offices, knowledge sharing largely depends on everyday face-to-face interaction patterns. These interaction patterns may depend on how employees move through the office space. This thesis explores how these spatial relations influence individual choices with respect to employee movements or routes. Space syntax related research has shown a strong relationship between spatial configuration and pedestrian movement in cities, yet field of space syntax has not applied spatial analysis to the o...

  11. Development of Methods for Sampling and Analysis of Polychlorinated Naphthalenes in Ambient Air (United States)

    Erickson, Mitchell D.; And Others


    The procedure and sampler described permit detection of less than 50 pg of one polychlorinated naphthalene (PCN) isomer. The method uses gas chromatography-mass spectrometry. The PCNs are collected on a glass fiber filter and two polyurethane foam plugs and extracted with toluene at 25 degrees Celsius. (BB)

  12. Methodical Approaches To Analysis And Forecasting Of Development Fuel And Energy Complex And Gas Industry In The Region

    Directory of Open Access Journals (Sweden)

    Vladimir Andreyevich Tsybatov


    Full Text Available The fuel and energy complex (FEC) is one of the main elements of the economy of any territory, in which the interests of all economic entities intertwine. To ensure economic growth, a region should maintain an internal balance of energy resources, developed with account of the regional specifics of economic growth and energy security. The study examines the state of this equilibrium as reflected in the fuel and energy balance of the region (TEB). The aim of the research is the development of a fuel and energy balance that makes it possible to determine exactly how many and which resources are insufficient to support the regional development strategy, and which resources need to be brought in. The energy balance brings into focus all issues of regional development, so the TEB is necessary both as a mechanism for analysing current issues of economic development and, in its forward-looking version, as a tool for a future vision of the fuel and energy complex, energy threats, and ways of overcoming them. The variety of relationships between the energy sector and other sectors and aspects of society means that the development of the fuel and energy balance of a region has to go beyond the energy sector itself, involving the analysis of other sectors of the economy as well as systems such as the banking, budgetary, legislative, and tax systems. Owing to the complexity of the problems discussed, there is an obvious need to develop an appropriate forecasting and analytical system allowing regional authorities to make evidence-based predictions of the consequences of management decisions. Such a system should support multivariant scenario studies of the development of the fuel and energy complex and of individual industries, the use of project-based management methods, and the harmonized application of state strategic regulation and market mechanisms to the operational directions of development of the fuel and energy complex in the regional economy.

  13. Development of a capillary electrophoresis-mass spectrometry method using polymer capillaries for metabolomic analysis of yeast. (United States)

    Tanaka, Yoshihide; Higashi, Tetsuji; Rakwal, Randeep; Wakida, Shin-ichi; Iwahashi, Hitoshi


    Metabolomics is an emerging field in analytical biochemistry, and the development of a method for comprehensive and quantitative analysis of organic acids, carbohydrates, and nucleotides is a necessity in the era of functional genomics. When a concentrated yeast extract was analyzed by CE-MS using a successive multiple ionic-polymer layer (SMIL)-coated capillary, the adsorption of contaminants on the capillary wall caused severe problems such as no elution, band broadening, and asymmetric peaks. Therefore, an analytical method for anionic metabolites in yeast was developed by pressure-assisted CE using inert polymer capillaries made from poly(ether ether ketone) (PEEK) and PTFE. We preferred the PEEK capillary over the PTFE capillary for CE-MS because of its ease of handling and high durability. The separation of anionic metabolites was successfully achieved with ammonium hydrogencarbonate/formate buffer (pH 6.0) as the electrolyte solution. Washing with 2-propanol after every electrophoresis run not only eliminated wall-adsorption phenomena but also gave good repeatability of migration times in the metabolomic analysis.

  14. Analysis of the Difficulties and Improvement Method on Introduction of PBL Approach in Developing Country (United States)

    Okano, Takasei; Sessa, Salvatore

    In the field of international cooperation, it is increasingly common to introduce the Japanese engineering education model in developing countries to improve the quality of education and research activity. A naive implementation of such a model in a different culture and educational system may lead to several problems. In this paper, we evaluated the Project Based Learning (PBL) class developed at Waseda University in Japan and employed in the Egyptian educational context at the Egypt-Japan University of Science and Technology (E-JUST). We found difficulties such as non-homogeneous student backgrounds, disconnection from the students' research, weak learning-style adaptation, and irregular course conduction. To solve these difficulties at E-JUST, we proposed the introduction of groupware, project theme choice based on student motivation, and curriculum modification.

  15. Generation of Synthetic Transcriptome Data with Defined Statistical Properties for the Development and Testing of New Analysis Methods

    Institute of Scientific and Technical Information of China (English)

    Guillaume Brysbaert; Sebastian Noth; Arndt Benecke


    We have previously developed a combined signal/variance distribution model that accounts for the particular statistical properties of datasets generated on the Applied Biosystems AB1700 transcriptome system. Here we show that this model can be efficiently used to generate synthetic datasets with statistical properties virtually identical to those of the actual data, with the aid of the Java application creator 1.0 that we have developed. The fundamentally different structure of AB1700 transcriptome profiles requires re-evaluation, adaptation, or even redevelopment of many standard microarray analysis methods, both to avoid misinterpretation of the data and to draw full benefit from their increased specificity and sensitivity. Our composite data model and the creator 1.0 application thereby not only provide proof of the correctness of our parameter estimation, but also a tool for the generation of synthetic test data that will be useful for further development and testing of analysis methods.
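    The general recipe, drawing a signal from a fitted distribution and adding signal-dependent noise, can be sketched in a few lines. The snippet below is an illustrative stand-in, not the authors' creator 1.0 tool; the log-normal signal model, the noise law, and all parameter values are assumptions for demonstration only.

    ```python
    import math
    import random

    def synthesize_profile(n_genes, mu=6.0, sigma=1.5, cv=0.2, seed=42):
        """Generate a synthetic expression profile: log-normal 'true' signals
        with a signal-dependent noise term (an additive floor dominates at
        low intensities). All distributional choices here are hypothetical."""
        rng = random.Random(seed)
        profile = []
        for _ in range(n_genes):
            signal = math.exp(rng.gauss(mu, sigma))   # true intensity
            noise_sd = cv * signal + 5.0              # signal-dependent noise level
            profile.append(max(0.0, signal + rng.gauss(0.0, noise_sd)))
        return profile

    data = synthesize_profile(10_000)
    ```

    A generator of this shape makes it easy to vary one statistical property at a time and observe how an analysis method responds.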

  16. Estimation of Bouguer Density Precision: Development of Method for Analysis of La Soufriere Volcano Gravity Data

    Directory of Open Access Journals (Sweden)

    Hendra Gunawan


    Full Text Available The precision of topographic density (Bouguer density) estimation by the Nettleton approach is based on a minimum correlation between the Bouguer gravity anomaly and topography. The other method, the Parasnis approach, is based on a minimum correlation between the Bouguer gravity anomaly and the Bouguer correction. The precision of Bouguer density estimates was investigated by both methods on simple 2D synthetic models, under the assumption that the free-air anomaly consists of a topographic effect, an intracrustal effect, and an isostatic compensation. Based on the simulation results, Bouguer density estimates were then investigated for a 2005 gravity survey of the La Soufriere Volcano, Guadeloupe area (Antilles Islands). The Bouguer density based on the Parasnis approach is 2.71 g/cm3 for the whole area, except for the edifice area, where the average topographic density estimate is 2.21 g/cm3; Bouguer density estimates from a previous gravity survey of 1975 are 2.67 g/cm3. The Bouguer density in La Soufriere Volcano was estimated with an uncertainty of 0.1 g/cm3. For the studied area, the density deduced from refraction seismic data is coherent with the recent Bouguer density estimates. A new Bouguer anomaly map based on these Bouguer density values allows a better geological interpretation.
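    The Nettleton criterion described above, scanning candidate densities for the one whose Bouguer anomaly correlates least with topography, can be sketched as follows. The infinite-slab correction constant, the density grid, and the synthetic survey are illustrative assumptions, not values from the La Soufriere data.

    ```python
    import math
    import random

    TWO_PI_G = 0.0419  # infinite-slab Bouguer correction, mGal per (g/cm^3 * m)

    def pearson(x, y):
        """Pearson correlation coefficient of two equal-length sequences."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = math.sqrt(sum((a - mx) ** 2 for a in x))
        sy = math.sqrt(sum((b - my) ** 2 for b in y))
        return sxy / (sx * sy)

    def nettleton_density(free_air, elev, candidates):
        """Return the candidate density whose Bouguer anomaly is least
        correlated with station elevation (Nettleton's criterion)."""
        def abs_corr(rho):
            bouguer = [fa - TWO_PI_G * rho * h for fa, h in zip(free_air, elev)]
            return abs(pearson(bouguer, elev))
        return min(candidates, key=abs_corr)

    # synthetic check: free-air anomaly built with a known density of 2.5 g/cm3
    rng = random.Random(0)
    elev = [rng.uniform(0.0, 500.0) for _ in range(200)]                  # m
    free_air = [TWO_PI_G * 2.5 * h + rng.gauss(0.0, 0.5) for h in elev]  # mGal
    rho_hat = nettleton_density(free_air, elev, [2.0 + 0.05 * i for i in range(21)])
    ```

    With low-noise synthetic data the scan recovers the density used to build the anomaly, which is exactly the self-consistency check the synthetic-model part of the study performs.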

  17. Episode-Based Evolution Pattern Analysis of Haze Pollution: Method Development and Results from Beijing, China. (United States)

    Zheng, Guangjie; Duan, Fengkui; Ma, Yongliang; Zhang, Qiang; Huang, Tao; Kimoto, Takashi; Cheng, Yafang; Su, Hang; He, Kebin


    Haze episodes occurred repeatedly in Beijing in 2013, resulting in 189 polluted days. These episodes differed in sources, formation processes, and chemical composition, and thus require different control policies; an overview of the similarities and differences among them is therefore needed. For this purpose, we conducted one year of online observations and developed a program that can simultaneously divide haze episodes and identify their shapes. A total of 73 episodes were identified, and their shapes were linked with synoptic conditions. Pure-haze events dominated in wintertime, whereas mixed haze-dust (characterized by the PM2.5/PM10 ratio) and haze-fog (Aerosol Water/PM2.5 ∼ 0.3) events dominated in spring and summer-autumn, respectively. For all types, the increase in the PM2.5/PM10 ratio was typically completed before PM2.5 reached ∼150 μg/m(3). Of all the PM2.5 species observed, organic matter (OM) was always the most abundant component (18-60%), but it was rarely the driving factor: its relative contribution usually decreased as the pollution level increased. The only OM-driven episode observed was associated with intensive biomass-burning activities. In comparison, haze evolution generally coincided with increasing sulfur and nitrogen oxidation ratios (SOR and NOR), indicating enhanced production of secondary inorganic species. The applicability of these conclusions requires further testing with simultaneous multisite observations.
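    The episode-division step can be illustrated with a generic threshold-run segmentation (the authors' program is more elaborate and also classifies episode shapes; the concentration threshold and minimum length here are hypothetical):

    ```python
    def find_episodes(pm25, threshold=75.0, min_len=3):
        """Split a PM2.5 time series into pollution episodes: maximal runs of
        consecutive samples above `threshold`, keeping runs of at least
        `min_len` samples. Returns (start, end) index pairs, inclusive."""
        episodes, start = [], None
        for i, c in enumerate(pm25):
            if c > threshold and start is None:
                start = i                      # episode begins
            elif c <= threshold and start is not None:
                if i - start >= min_len:
                    episodes.append((start, i - 1))
                start = None                   # episode ends
        if start is not None and len(pm25) - start >= min_len:
            episodes.append((start, len(pm25) - 1))
        return episodes

    series = [20, 30, 90, 120, 150, 80, 40, 30, 100, 110, 95, 20]
    eps = find_episodes(series)  # → [(2, 5), (8, 10)]
    ```

    Once episodes are delimited this way, per-episode statistics (duration, peak, composition) can be compared across the year.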

  18. The NASA/Industry Design Analysis Methods for Vibrations (DAMVIBS) Program - A government overview. [of rotorcraft technology development using finite element method (United States)

    Kvaternik, Raymond G.


    An overview is presented of government contributions to the Design Analysis Methods for Vibrations (DAMVIBS) program, which sought to develop finite-element-based analyses of rotorcraft vibrations. NASA initiated the program with a finite-element modeling effort for the CH-47D tandem-rotor helicopter. The DAMVIBS program emphasized four areas: airframe finite-element modeling, studies of difficult components, coupled rotor-airframe vibrations, and airframe structural optimization. Key accomplishments of the program include industrywide standards for modeling metal and composite airframes, improved industrial designs for vibrations, and the identification of critical structural contributors to airframe vibratory responses. The program also demonstrated the value of incorporating secondary modeling details to improve correlation, and its findings provide the basis for an improved finite-element-based dynamics design-analysis capability.

  19. Development of achiral and chiral 2D HPLC methods for analysis of albendazole metabolites in microsomal fractions using multivariate analysis for the in vitro metabolism. (United States)

    Belaz, Kátia Roberta A; Pereira-Filho, Edenir Rodrigues; Oliveira, Regina V


    In this work, the development of two multidimensional liquid chromatography methods coupled to a fluorescence detector is described for direct analysis of microsomal fractions obtained from rat livers. The chiral multidimensional method was then applied to the optimization of the in vitro metabolism of albendazole by experimental design. Albendazole was selected as a model drug because of its anthelmintic properties and recent potential for cancer treatment. Two fully automated achiral-chiral and chiral-chiral high performance liquid chromatography (HPLC) methods were developed for the determination of albendazole (ABZ) and its metabolites albendazole sulphoxide (ABZ-SO), albendazole sulphone (ABZ-SO2) and albendazole 2-aminosulphone (ABZ-SO2NH2) in microsomal fractions. These methods involve the use of a phenyl (RAM-phenyl-BSA) or octyl (RAM-C8-BSA) restricted-access-media bovine serum albumin column for sample clean-up, followed by an achiral phenyl column (15.0×0.46 cm I.D.) or a chiral amylose tris(3,5-dimethylphenylcarbamate) column (15.0×0.46 cm I.D.). The chiral 2D HPLC method was applied to establish a compromise condition for the in vitro metabolism of ABZ by means of experimental design involving multivariate analysis.

  20. Development of an offline bidimensional high-performance liquid chromatography method for analysis of stereospecific triacylglycerols in cocoa butter equivalents. (United States)

    Kadivar, Sheida; De Clercq, Nathalie; Nusantoro, Bangun Prajanto; Le, Thien Trung; Dewettinck, Koen


    Acyl migration is a serious problem in the enzymatic modification of fats and oils, particularly in the production of cocoa butter equivalents (CBEs) through enzymatic acidolysis, as it leads to the formation of non-symmetrical triacylglycerols (TAGs) from symmetrical TAGs. Non-symmetrical TAGs may affect the physical properties of final products and are therefore often undesired. Consequently, an accurate method is needed to determine positional isomer TAGs during CBE production. A bidimensional high-performance liquid chromatography (HPLC) method, combining non-aqueous reversed-phase HPLC and silver-ion HPLC with an evaporative light scattering detector, was successfully developed for the analysis of stereospecific TAGs. The best separation of positional isomer standards was obtained with a heptane/acetone mobile-phase gradient at 25 °C and 1 mL/min. The developed method was then used in the multidimensional determination of TAG positional isomers in fat and oil blends and successfully identified the TAGs and possible isomers in an enzymatically acidolyzed CBE.

  1. SIFT-MS and FA-MS methods for ambient gas phase analysis: developments and applications in the UK. (United States)

    Smith, David; Španěl, Patrik


    Selected ion flow tube mass spectrometry, SIFT-MS, a relatively new gas/vapour phase analytical method, is derived from the much earlier selected ion flow tube, SIFT, used for the study of gas phase ion-molecule reactions. Both the SIFT and SIFT-MS techniques were conceived and developed in the UK, the former at Birmingham University, the latter at Keele University along with the complementary flowing afterglow mass spectrometry, FA-MS, technique. The focus of this short review is largely to describe the origins, development and, most importantly, the unique features of SIFT-MS as an analytical tool for ambient analysis; to indicate its growing use to analyse humid air, especially exhaled breath; its unique place as an on-line, real-time analytical method; and its growing application as a non-invasive diagnostic in clinical diagnosis and therapeutic monitoring, principally within several UK universities and hospitals, and briefly in the wider world. A few case studies are outlined that show the potential of SIFT-MS and FA-MS in the detection and quantification of metabolites in exhaled breath as a step towards recognising pathophysiology indicative of disease and the presence of bacterial and fungal infection of the airways and lungs. Particular cases include the detection of Pseudomonas aeruginosa infection of the airways of patients with cystic fibrosis (SIFT-MS) and the measurement of total body water in patients with chronic kidney disease (FA-MS). The growing exploitation of SIFT-MS in other areas of research and commerce is briefly noted to show the wide utility of this unique UK-developed analytical method, and future prospects and developments are alluded to.

  2. Digital representation of meso-geomaterial spatial distribution and associated numerical analysis of geomechanics:methods,applications and developments

    Institute of Scientific and Technical Information of China (English)

    YUE Zhongqi


    This paper presents the author's efforts over the past decade to establish a practical approach for the digital representation of the geomaterial distribution of different minerals, particles, and components in the meso-scale range (0.1 to 500 mm). The primary goal of the approach is to offer a possible solution to two intrinsic problems of the current mainstream methods of geomechanics: (1) the constitutive models and parameters of soils and rocks cannot be given accurately in geomechanical prediction; and (2) there are numerous constitutive models of soils and rocks in the literature. These problems are possibly caused by the homogenization or averaging used in analyzing laboratory test results to establish constitutive models and parameters. The averaging method assumes that the test samples can be represented by a homogeneous medium, ignoring the fact that geomaterial samples consist of a number of materials and components whose properties may differ significantly. In the proposed approach, digital image processing methods are used as measurement tools to construct a digital representation of the actual spatial distribution of the different materials and components in geomaterial samples. The digital data are further processed to automatically generate meshes or grids for numerical analysis. These meshes or grids can be easily incorporated into existing numerical software packages for further mechanical analysis and failure prediction of the geomaterials under external loading. The paper presents case studies to illustrate the proposed approach. Further discussion is given on how the approach can be used to develop geomechanics that takes into account geomaterial behavior at the micro-scale, meso-scale and macro-scale levels. A literature review of the related developments is given by examining the SCI papers in the database of Science Citation
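    The image-to-grid step can be sketched as a grey-level classification that assigns a material ID to each pixel, which then serves directly as element material assignments for numerical analysis. The thresholds, the toy image, and the material naming below are illustrative only, not the paper's actual processing pipeline.

    ```python
    def classify_materials(image, thresholds):
        """Map each pixel's grey value to a material ID using ordered
        thresholds; the resulting 2D grid of IDs can be handed to a mesh
        generator as per-cell material assignments."""
        def material_id(v):
            for mid, t in enumerate(thresholds):
                if v <= t:
                    return mid
            return len(thresholds)
        return [[material_id(v) for v in row] for row in image]

    # toy 4x4 "image": dark = mineral A, mid-grey = mineral B, bright = cement/voids
    img = [[10, 10, 200, 200],
           [10, 90, 90, 200],
           [90, 90, 10, 10],
           [200, 200, 10, 10]]
    grid = classify_materials(img, thresholds=[50, 150])  # material IDs 0, 1, 2
    ```

    Each cell of `grid` then carries its own constitutive model and parameters in the subsequent mechanical analysis, which is the heterogeneity the averaging approach discards.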

  3. Sensitivity and Uncertainty Analysis of Coupled Reactor Physics Problems: Method Development for Multi-Physics in Reactors

    NARCIS (Netherlands)

    Perkó, Z.


    This thesis presents novel adjoint and spectral methods for the sensitivity and uncertainty (S&U) analysis of multi-physics problems encountered in the field of reactor physics. The first part focuses on the steady state of reactors and extends the adjoint sensitivity analysis methods well establish

  4. Generic Analysis Methods for Gas Turbine Engine Performance: The development of the gas turbine simulation program GSP

    NARCIS (Netherlands)

    Visser, W.P.J.


    Numerical modelling and simulation have played a critical role in the research and development towards today’s powerful and efficient gas turbine engines for both aviation and power generation. The simultaneous progress in modelling methods, numerical methods, software development tools and methods,

  5. Advanced Durability and Damage Tolerance Design and Analysis Methods for Composite Structures: Lessons Learned from NASA Technology Development Programs (United States)

    Harris, Charles E.; Starnes, James H., Jr.; Shuart, Mark J.


    Aerospace vehicles are designed to be durable and damage tolerant. Durability is largely an economic life-cycle design consideration whereas damage tolerance directly addresses the structural airworthiness (safety) of the vehicle. However, both durability and damage tolerance design methodologies must address the deleterious effects of changes in material properties and the initiation and growth of microstructural damage that may occur during the service lifetime of the vehicle. Durability and damage tolerance design and certification requirements are addressed for commercial transport aircraft and NASA manned spacecraft systems. The state-of-the-art in advanced design and analysis methods is illustrated by discussing the results of several recently completed NASA technology development programs. These programs include the NASA Advanced Subsonic Technology Program demonstrating technologies for large transport aircraft and the X-33 hypersonic test vehicle demonstrating technologies for a single-stage-to-orbit space launch vehicle.

  6. Probabilistic methods for rotordynamics analysis (United States)

    Wu, Y.-T.; Torng, T. Y.; Millwater, H. R.; Fossum, A. F.; Rheinfurth, M. H.


    This paper summarizes the development of the methods and a computer program to compute the probability of instability of dynamic systems that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the eigenvalues or Routh-Hurwitz test functions are investigated. Computational methods based on a fast probability integration concept and an efficient adaptive importance sampling method are proposed to perform efficient probabilistic analysis. A numerical example is provided to demonstrate the methods.
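    For a single-degree-of-freedom system, the Routh-Hurwitz criterion mentioned above reduces to requiring positive damping and stiffness, so a crude Monte Carlo estimate of the probability of instability can be sketched as below. This is a plain-sampling stand-in for the paper's fast probability integration and adaptive importance sampling; the parameter distributions are hypothetical.

    ```python
    import random

    def probability_of_instability(n_samples=50_000, seed=1):
        """Monte Carlo estimate of P(instability) for m*x'' + c*x' + k*x = 0.
        For this second-order system the Routh-Hurwitz criterion reduces to:
        stable iff c > 0 and k > 0 (with m > 0 fixed)."""
        rng = random.Random(seed)
        unstable = 0
        for _ in range(n_samples):
            c = rng.gauss(2.0, 1.0)    # uncertain damping coefficient
            k = rng.gauss(50.0, 5.0)   # uncertain stiffness
            if c <= 0.0 or k <= 0.0:
                unstable += 1
        return unstable / n_samples

    p = probability_of_instability()  # roughly P(c <= 0), about 0.02 here
    ```

    Plain sampling needs many draws to resolve small failure probabilities, which is precisely why the paper proposes fast probability integration and importance sampling instead.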

  7. A bottom-up method for module-based product platform development through mapping, clustering and matching analysis

    Institute of Scientific and Technical Information of China (English)

    ZHANG Meng; LI Guo-xi; CAO Jian-ping; GONG Jing-zhong; WU Bao-zhong


    Designing a product platform can be an effective and efficient solution for manufacturing firms. Product platforms enable firms to provide increased product variety for the marketplace with as little variety between products as possible. Consumer products and modules already developed within a firm can be further investigated to explore the possibility of product platform creation. A bottom-up method is proposed for module-based product platform development through mapping, clustering and matching analysis. The framework and the parametric model of the method are presented, consisting of three steps: (1) mapping parameters from existing product families to functional modules; (2) clustering the modules within existing module families based on their parameters so as to generate module clusters, and selecting satisfactory module clusters based on commonality; and (3) matching the parameters of the module clusters to the functional modules in order to capture platform elements. In addition, a parameter matching criterion and a mismatching treatment are put forward to ensure the effectiveness of the platform process, and standardization and serialization of the platform elements are presented. A design case of a belt conveyor is studied to demonstrate the feasibility of the proposed method.
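    Step (2), clustering modules by their parameters, can be illustrated with a simple greedy distance-based grouping. The tolerance, module names, and parameter vectors below are hypothetical, and the paper's commonality-based cluster selection is richer than this sketch.

    ```python
    import math

    def cluster_modules(modules, tol=1.0):
        """Greedy clustering of (name, parameter-vector) modules: a module
        joins the first cluster whose running centroid lies within `tol`
        (Euclidean distance); otherwise it seeds a new cluster."""
        clusters = []  # each cluster: {'members': [names], 'centroid': [params]}
        for name, params in modules:
            for cl in clusters:
                if math.dist(params, cl['centroid']) <= tol:
                    cl['members'].append(name)
                    n = len(cl['members'])
                    # incremental centroid update
                    cl['centroid'] = [c + (p - c) / n
                                      for c, p in zip(cl['centroid'], params)]
                    break
            else:
                clusters.append({'members': [name], 'centroid': list(params)})
        return clusters

    # hypothetical belt-conveyor modules: (name, (diameter_mm, power_kW))
    belt_modules = [('drum_A', (500, 30)), ('drum_B', (501, 31)),
                    ('idler_A', (120, 8)), ('idler_B', (119, 9))]
    groups = cluster_modules(belt_modules, tol=5.0)  # → 2 clusters of 2
    ```

    Clusters whose members are nearly identical (high commonality) are the natural candidates for platform elements in step (3).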

  8. Methods of Multivariate Analysis

    CERN Document Server

    Rencher, Alvin C


    Praise for the Second Edition "This book is a systematic, well-written, well-organized text on multivariate analysis packed with intuition and insight . . . There is much practical wisdom in this book that is hard to find elsewhere."-IIE Transactions Filled with new and timely content, Methods of Multivariate Analysis, Third Edition provides examples and exercises based on more than sixty real data sets from a wide variety of scientific fields. It takes a "methods" approach to the subject, placing an emphasis on how students and practitioners can employ multivariate analysis in real-life sit


    Development of a Novel Method for Analysis of Transcriptional Changes in Transitional Epithelium from Urinary Bladders of Rats Exposed to Drinking Water Disinfection By-products. Epidemiologic studies in human populations that drink chemically disinfected drinking wa...

  10. Pesticide hair analysis: development of a GC-NCI-MS method to assess chronic exposure to diazinon in rats. (United States)

    Tutudaki, Maria; Tsatsakis, Aristidis M


    The present study aimed to improve the gas chromatography-mass spectrometry (GC-MS) method, already developed in our laboratory, for trace analysis of diazinon in hair, and to compare the disposition of the pesticide in the hair of two different animal species, one susceptible to diazinon toxicity and one resistant, under identical experimental conditions. Sprague Dawley rats were systemically exposed to two dose levels (6 mg/kg/day and 3 mg/kg/day) of the pesticide through their drinking water for a period of one and a half months. Hair samples from the backs of the rats were taken before commencing the experiment and at the end of the dosing period. Diazinon was selectively isolated from pulverized hair (sample or spiked) by stepwise sequential extractions with methanol and ethyl acetate and quantified by GC-negative chemical ionization-MS. The concentration of diazinon in the hair of exposed animals was dose dependent: 0.24 +/- 0.01 ng/mg (n = 5) and 0.53 +/- 0.05 ng/mg (n = 5) for the low and high dosage, respectively. The concentrations in both dose groups were much higher than those in corresponding rabbit hair (rabbits exposed to the pesticide under similar experimental conditions), as previously reported. Our results strongly point to the possibility of using hair analysis for low-level exposure monitoring of diazinon.

  11. Development of a Matlab/Simulink tool to facilitate system analysis and simulation via the adjoint and covariance methods

    NARCIS (Netherlands)

    Bucco, D.; Weiss, M.


    The COVariance and ADjoint Analysis Tool (COVAD) is a specially designed software tool, written for the Matlab/Simulink environment, which allows the user the capability to carry out system analysis and simulation using the adjoint, covariance or Monte Carlo methods. This paper describes phase one o

  12. Combined fluvial and pluvial urban flood hazard analysis: method development and application to Can Tho City, Mekong Delta, Vietnam

    Directory of Open Access Journals (Sweden)

    H. Apel


    Full Text Available Many urban areas experience both fluvial and pluvial floods, because locations next to rivers are preferred settlement areas and the predominantly sealed urban surface prevents infiltration and facilitates surface inundation. The latter problem is worse in cities with insufficient or non-existent sewer systems. While there are a number of approaches to analysing either fluvial or pluvial flood hazard, studies of combined fluvial and pluvial flood hazard are hardly available. Thus, this study aims at analysing fluvial and pluvial flood hazard individually, but also at developing a method for the analysis of combined pluvial and fluvial flood hazard. This combined fluvial-pluvial flood hazard analysis is performed taking Can Tho city, the largest city in the Vietnamese part of the Mekong Delta, as an example. In this tropical environment, the annual monsoon-triggered floods of the Mekong River can coincide with heavy local convective precipitation events, causing both fluvial and pluvial flooding at the same time. Fluvial flood hazard was estimated with a copula-based bivariate extreme value statistic for the gauge Kratie at the upper boundary of the Mekong Delta and a large-scale hydrodynamic model of the Mekong Delta; this provided the boundaries for 2-dimensional hydrodynamic inundation simulation for Can Tho city. Pluvial hazard was estimated by a peak-over-threshold frequency estimation based on local rain gauge data and a stochastic rain storm generator. Inundation was simulated by a 2-dimensional hydrodynamic model implemented on a Graphics Processing Unit (GPU) for time-efficient flood propagation modelling. All hazards – fluvial, pluvial and combined – were accompanied by an uncertainty estimation considering the natural variability of the flood events. This resulted in probabilistic flood hazard maps showing the maximum inundation depths for a selected set of probabilities of occurrence, with maps showing the expectation (median
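    The pluvial peak-over-threshold step can be sketched with an exponential excess model, a simplification of the generalized Pareto distribution usually fitted in such studies. The rainfall series, threshold, and return period below are illustrative values, not data from Can Tho.

    ```python
    import math

    def pot_return_level(series, threshold, return_period_events):
        """Peak-over-threshold estimate with an exponential excess model
        (method of moments: scale = mean excess). Returns the level expected
        to be exceeded once per `return_period_events` observations."""
        excesses = [x - threshold for x in series if x > threshold]
        rate = len(excesses) / len(series)      # exceedance probability per observation
        scale = sum(excesses) / len(excesses)   # exponential scale parameter
        return threshold + scale * math.log(return_period_events * rate)

    # hypothetical event rainfall depths (mm)
    rain = [5, 12, 48, 7, 55, 3, 62, 9, 41, 70, 6, 52, 8, 45, 58, 4]
    level_100 = pot_return_level(rain, threshold=40, return_period_events=100)
    ```

    The estimated return level extrapolates beyond the observed maximum, which is the whole point of the tail model; a generalized Pareto fit would additionally allow heavy-tailed behaviour.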

  13. Factor Analysis Methods and Validity Evidence: A Systematic Review of Instrument Development across the Continuum of Medical Education (United States)

    Wetzel, Angela Payne


    Previous systematic reviews indicate a lack of reporting of reliability and validity evidence in subsets of the medical education literature. Psychology and general education reviews of factor analysis also indicate gaps between current and best practices; yet, a comprehensive review of exploratory factor analysis in instrument development across…

  14. Social Phenomenological Analysis as a Research Method in Art Education: Developing an Empirical Model for Understanding Gallery Talks (United States)

    Hofmann, Fabian


    Social phenomenological analysis is presented as a research method to study gallery talks or guided tours in art museums. The research method is based on the philosophical considerations of Edmund Husserl and sociological/social science concepts put forward by Max Weber and Alfred Schuetz. Its starting point is the everyday lifeworld; the…


    Directory of Open Access Journals (Sweden)

    Lenchyk L.V.


    Full Text Available Introduction. Bird cherry, Padus avium Mill. (Rosaceae), is widespread in Ukraine, especially in forest and forest-steppe areas. Bird cherry fruits have long been used in medicine and are a valuable medicinal raw material. They are stated to possess astringent, anti-inflammatory and phytoncidal properties. Bird cherry fruits are included in the USSR Pharmacopoeia IX ed., the State Pharmacopoeia of the Russian Federation and the State Pharmacopoeia of the Republic of Belarus. In Ukraine there are no contemporary normative documents for this medicinal plant material, so it is relevant to develop draft national monographs "Dry bird cherry fruit" and "Fresh bird cherry fruit" for inclusion in the State Pharmacopoeia of Ukraine. Following the European Pharmacopoeia recommendations, thin-layer chromatography (TLC) is prescribed only for the identification of the herbal drug. The principles of thin-layer chromatography and the application of the technique in pharmaceutical analysis are described in the State Pharmacopoeia of Ukraine. As it is effective and easy to perform, and the equipment required is inexpensive, the technique is frequently used for evaluating medicinal plant materials and their preparations. The TLC is aimed at elucidating the chromatogram of the drug with respect to selected reference compounds that are described for inclusion as reagents. The aim of this study was to develop methods of qualitative analysis of bird cherry fruits for a monograph in the State Pharmacopoeia of Ukraine (SPU). Materials and Methods. The objects of our study were dried bird cherry fruits (7 samples) and fresh bird cherry fruits (7 samples) harvested in 2013-2015 in the Kharkiv, Poltava, Luhansk, Sumy, Lviv and Mykolaiv regions and the city of Mariupol. Samples were registered in the department of the SPU State Enterprise "Pharmacopeia center". In accordance with the Ph. Eur. and SPU requirements, the "identification C" determination was performed by TLC. TLC was performed on

  16. Forces in bolted joints: analysis methods and test results utilized for nuclear core applications (LWBR Development Program)

    Energy Technology Data Exchange (ETDEWEB)

    Crescimanno, P.J.; Keller, K.L.


    Analytical methods and test data employed in the core design of bolted joints for the LWBR core are presented. The effects of external working loads, thermal expansion, and material stress relaxation are considered in the formulation developed to analyze joint performance. Extensions of these methods are also provided for bolted joints having both axial and bending flexibilities, and for the effect of plastic deformation on internal forces developed in a bolted joint. Design applications are illustrated by examples.
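    The load-sharing idea underlying such joint analyses can be illustrated with the textbook stiffness-share model for a preloaded bolted joint. This is a generic formulation, not the LWBR-specific one, which additionally covers thermal expansion, stress relaxation, and bending flexibility; the preload and stiffness values are hypothetical.

    ```python
    def bolt_force(preload, external_load, k_bolt, k_members):
        """Stiffness-share model for a preloaded bolted joint: the bolt sees
        only a fraction C = kb / (kb + km) of the external tensile load, the
        rest being relieved from the clamped members."""
        C = k_bolt / (k_bolt + k_members)
        return preload + C * external_load

    # hypothetical joint: 20 kN preload, 10 kN working load, members 4x stiffer
    F = bolt_force(preload=20e3, external_load=10e3,
                   k_bolt=2e8, k_members=8e8)  # → 22000.0 N
    ```

    Because the clamped members are usually much stiffer than the bolt, the bolt force rises only modestly under working load, which is why preload, thermal expansion and relaxation dominate joint design.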

  17. Space Radiation Transport Methods Development (United States)

    Wilson, J.; Tripathi, R.; Qualls, G.; Cucinotta, F.; Prael, R.; Norbury, J.

    Early space radiation shield code development relied on Monte Carlo methods for proton, neutron and pion transport and made important contributions to the space program. More recently, the Monte Carlo code LAHET has been upgraded to include high-energy multiply charged light ions for GCR simulations and continues to be expanded in capability. To compensate for low computational efficiency, Monte Carlo methods have resorted to restricted one-dimensional problems, leading to imperfect representations of appropriate boundary conditions. Even so, the computational requirements were intensive; shield evaluation was performed near the end of the design process, and resolving shielding issues usually had a negative impact on the design. We evaluate the implications of these common one-dimensional assumptions on the evaluation of the Shuttle internal radiation field. Improved spacecraft shield design requires early entry of radiation constraints into the design process to maximize performance and minimize costs. As a result, we have been investigating high-speed computational procedures to allow shield analysis from the preliminary design concepts through to the final design. In particular, we discuss progress towards a fully three-dimensional and computationally efficient deterministic code, of which the current HZETRN evaluates the lowest-order asymptotic term. HZETRN is the first deterministic solution to the Boltzmann equation, allowing field mapping within the International Space Station (ISS) in tens of minutes using standard Finite Element Method (FEM) geometry common to engineering design practice, enabling the development of integrated multidisciplinary design optimization methods. A single ray trace in the ISS FEM geometry requires 14 milliseconds, which severely limits the application of Monte Carlo methods to such engineering models. A potential means of improving Monte Carlo efficiency in coupling to spacecraft geometry is given in terms of reconfigurable computing and could be

  18. Motion as perturbation. II. Development of the method for dosimetric analysis of motion effects with fixed-gantry IMRT

    Energy Technology Data Exchange (ETDEWEB)

    Nelms, Benjamin E. [Canis Lupus LLC, Merrimac, Wisconsin 53561 (United States); Opp, Daniel; Zhang, Geoffrey; Moros, Eduardo; Feygelman, Vladimir [Department of Radiation Oncology, Moffitt Cancer Center, Tampa, Florida 33612 (United States)


    Purpose: In this work, the feasibility of implementing a motion-perturbation approach to accurately estimate volumetric dose in the presence of organ motion, previously demonstrated for VMAT, is studied for static-gantry IMRT. The method's accuracy is improved for voxels that have very low planned dose but acquire appreciable dose due to motion. The study describes the modified algorithm and its experimental validation and provides an example of a clinical application. Methods: A contoured region of interest is propagated according to a predefined motion kernel through time-resolved 4D phantom dose grids. This timed series of 3D dose grids is produced by the measurement-guided dose reconstruction algorithm, based on an irradiation of a static ARCCHECK (AC) helical dosimeter array (Sun Nuclear Corp., Melbourne, FL). Each moving voxel collects dose over the dynamic simulation. The difference between dose to the moving voxel and dose to the static voxel in-phantom forms the basis of a motion perturbation correction that is applied to the corresponding voxel in the patient dataset. A new method to synchronize the accelerator and dosimeter clocks, applicable to fixed-gantry IMRT, was developed. Refinements to the algorithm account for the excursion of low-dose voxels into high-dose regions, which causes an appreciable dose increase due to motion (LDVE correction). For experimental validation, four plans using TG-119 structure sets and objectives were produced using segmented IMRT direct machine parameter optimization in the Pinnacle treatment planning system (v. 9.6, Philips Radiation Oncology Systems, Fitchburg, WI). All beams were delivered with a gantry angle of 0°. Each beam was delivered three times: (1) to the static AC centered on the room lasers; (2) to a static phantom containing a MAPCHECK2 (MC2) planar diode array dosimeter (Sun Nuclear); and (3) to the moving MC2 phantom. The motion trajectory was an ellipse in the IEC XY plane, with 3 and 1.5 cm axes. The period
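    The core perturbation idea, dose collected by the moving voxel minus dose to the same voxel held static, accumulated over the time-resolved dose grids, can be sketched as follows. The grids, trajectory, and indices are toy values, not reconstructed dosimeter data.

    ```python
    def accumulated_dose(dose_grids, trajectory):
        """Sum the dose a moving voxel collects across time-resolved 3D dose
        grids: at each phase the voxel samples the grid at its displaced index."""
        total = 0.0
        for grid, (i, j, k) in zip(dose_grids, trajectory):
            total += grid[i][j][k]
        return total

    def motion_perturbation(dose_grids, trajectory, static_index):
        """Perturbation = dose to the moving voxel minus dose the voxel would
        collect if it stayed at `static_index` for every phase."""
        i, j, k = static_index
        static = sum(g[i][j][k] for g in dose_grids)
        return accumulated_dose(dose_grids, trajectory) - static

    # toy example: two time-resolved 2x2x2 dose grids; the voxel shifts along k
    g = [[[1.0, 2.0], [3.0, 4.0]], [[5.0, 6.0], [7.0, 8.0]]]
    grids = [g, g]
    delta = motion_perturbation(grids, [(0, 0, 0), (0, 0, 1)], static_index=(0, 0, 0))
    ```

    In the full method this per-voxel perturbation, computed in the phantom, is applied as a correction to the corresponding voxel of the patient dose grid.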

  19. Development of a physiologically relevant dripping analytical method using simulated nasal mucus for nasal spray formulation analysis

    Directory of Open Access Journals (Sweden)

    Tina Masiuk


    Full Text Available Current methods for nasal spray formulations have been elementary, evaluating only the dripping characteristics of a formulation, and have not assessed the behavior of the nasal formulation in the presence of varying types of mucus depending on the indication or diseased state. This research investigated the effects of nasal mucus on the dripping behavior of nasal formulations and focused on developing an improved in vitro analytical test method that is more physiologically relevant in characterizing nasal formulation dripping behavior. Method development was performed using simulated nasal mucus preparations for both healthy and diseased states as coatings for the dripping experiment representing a wide range of viscosity. Factors evaluated during development of this in vitro test method included amount of mucus, application of mucus, drying times, and compatibility of the mucus on a C18 Thin Layer Chromatography (TLC) substrate. The dripping behavior of nasal formulations containing a range of 1% Avicel to 3.5% Avicel was assessed by actuating the nasal spray on a perpendicular TLC plate coated with either healthy or diseased simulated nasal mucus. After actuation of the nasal spray, the dripping of the formulation on the coated TLC plate was measured after the plate was repositioned vertically. The method that was developed generated reproducible results on the dripping behavior of nasal formulations and provided critical information about the compatibility of the formulation with the nasal mucus for different diseased states, aiding in nasal spray formulation development and physical characterization of the nasal spray.

  20. Development of a physiologically relevant dripping analytical method using simulated nasal mucus for nasal spray formulation analysis

    Institute of Scientific and Technical Information of China (English)

    Tina Masiuk; Parul Kadakia; Zhenyu Wang


    Current methods for nasal spray formulations have been elementary, evaluating only the dripping characteristics of a formulation, and have not assessed the behavior of the nasal formulation in the presence of varying types of mucus depending on the indication or diseased state. This research investigated the effects of nasal mucus on the dripping behavior of nasal formulations and focused on developing an improved in vitro analytical test method that is more physiologically relevant in characterizing nasal formulation dripping behavior. Method development was performed using simulated nasal mucus preparations for both healthy and diseased states as coatings for the dripping experiment representing a wide range of viscosity. Factors evaluated during development of this in vitro test method included amount of mucus, application of mucus, drying times, and compatibility of the mucus on a C18 Thin Layer Chromatography (TLC) substrate. The dripping behavior of nasal formulations containing a range of 1% Avicel to 3.5% Avicel was assessed by actuating the nasal spray on a perpendicular TLC plate coated with either healthy or diseased simulated nasal mucus. After actuation of the nasal spray, the dripping of the formulation on the coated TLC plate was measured after the plate was repositioned vertically. The method that was developed generated reproducible results on the dripping behavior of nasal formulations and provided critical information about the compatibility of the formulation with the nasal mucus for different diseased states, aiding in nasal spray formulation development and physical characterization of the nasal spray.

  1. The Development of a SPME-GC/MS Method for the Analysis of VOC Emissions from Historic Plastic and Rubber Materials


    Curran, K.; Underhill, M.; Gibson, L. T.; Strlic, M.


    Analytical methods have been developed for the analysis of VOC emissions from historic plastic and rubber materials using SPME-GC/MS. Parameters such as analysis temperature, sampling time and choice of SPME fibre coating were investigated and sampling preparation strategies explored, including headspace sampling in vials and in gas sampling bags. The repeatability of the method was evaluated. It was found that a 7 d accumulation time at room temperature, followed by sampling using a DVB/CAR/...

  2. Development testing of the chemical analysis automation polychlorinated biphenyl standard analysis method during surface soils sampling at the David Witherspoon 1630 site

    Energy Technology Data Exchange (ETDEWEB)

    Hunt, M.A.; Klatt, L.N.; Thompson, D.H. [and others]


    The Chemical Analysis Automation (CAA) project is developing standardized, software-driven, site-deployable robotic laboratory systems with the objective of lowering the per-sample analysis cost, decreasing sample turnaround time, and minimizing human exposure to hazardous and radioactive materials associated with DOE remediation projects. The first integrated system developed by the CAA project is designed to determine polychlorinated biphenyls (PCB) content in soil matrices. A demonstration and development testing of this system was conducted in conjunction with surface soil characterization activities at the David Witherspoon 1630 Site in Knoxville, Tennessee. The PCB system consists of five hardware standard laboratory modules (SLMs), one software SLM, the task sequence controller (TSC), and the human-computer interface (HCI). Four of the hardware SLMs included a four-channel Soxhlet extractor, a high-volume concentrator, a column cleanup, and a gas chromatograph. These SLMs performed the sample preparation and measurement steps within the total analysis protocol. The fifth hardware module was a robot that transports samples between the SLMs and the required consumable supplies to the SLMs. The software SLM is an automated data interpretation module that receives raw data from the gas chromatograph SLM and analyzes the data to yield the analyte information. The TSC is a software system that provides the scheduling, management of system resources, and the coordination of all SLM activities. The HCI is a graphical user interface that presents the automated laboratory to the analyst in terms of the analytical procedures and methods. Human control of the automated laboratory is accomplished via the HCI. Sample information required for processing by the automated laboratory is entered through the HCI. Information related to the sample and the system status is presented to the analyst via graphical icons.

  3. Analysis and assessment on heavy metal sources in the coastal soils developed from alluvial deposits using multivariate statistical methods. (United States)

    Li, Jinling; He, Ming; Han, Wei; Gu, Yifan


    An investigation on heavy metal sources, i.e., Cu, Zn, Ni, Pb, Cr, and Cd in the coastal soils of Shanghai, China, was conducted using multivariate statistical methods (principal component analysis, clustering analysis, and correlation analysis). All the results of the multivariate analysis showed that: (i) Cu, Ni, Pb, and Cd had anthropogenic sources (e.g., overuse of chemical fertilizers and pesticides, industrial and municipal discharges, animal wastes, sewage irrigation, etc.); (ii) Zn and Cr were associated with parent materials and therefore had natural sources (e.g., the weathering of parent materials and subsequent pedogenesis in the alluvial deposits). The effect of heavy metals in the soils was greatly affected by soil formation, atmospheric deposition, and human activities. These findings provided essential information on the possible sources of heavy metals, which would contribute to the monitoring and assessment of agricultural soils worldwide.
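The principal component analysis step described above can be sketched in a few lines: classical PCA is an eigendecomposition of the inter-element correlation matrix, and metals that load together on a component plausibly share a source. The data below are synthetic and the factor structure, thresholds, and metal assignments are invented for illustration; they are not the study's data.

```python
import numpy as np

# Hypothetical concentrations (rows = soil samples, columns = metals).
rng = np.random.default_rng(0)
metals = ["Cu", "Zn", "Ni", "Pb", "Cr", "Cd"]
anthro = rng.normal(0, 1, 40)           # shared "anthropogenic" factor
natural = rng.normal(0, 1, 40)          # shared "parent material" factor
data = np.column_stack([
    anthro + 0.1 * rng.normal(0, 1, 40),   # Cu
    natural + 0.1 * rng.normal(0, 1, 40),  # Zn
    anthro + 0.1 * rng.normal(0, 1, 40),   # Ni
    anthro + 0.1 * rng.normal(0, 1, 40),   # Pb
    natural + 0.1 * rng.normal(0, 1, 40),  # Cr
    anthro + 0.1 * rng.normal(0, 1, 40),   # Cd
])

# Standardize, then eigendecompose the correlation matrix (classical PCA).
z = (data - data.mean(axis=0)) / data.std(axis=0)
corr = np.corrcoef(z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]          # largest variance first
loadings = eigvecs[:, order]

# Metals loading together on a principal component share a likely source.
pc1 = loadings[:, 0]
print({m: round(float(l), 2) for m, l in zip(metals, pc1)})
```

With this construction, the four "anthropogenic" metals load strongly on the first component while the two "natural" metals do not, mirroring the kind of grouping the study reports.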

  4. LC-MS/MS method development for quantitative analysis of acetaminophen uptake by the aquatic fungus Mucor hiemalis. (United States)

    Esterhuizen-Londt, Maranda; Schwartz, Katrin; Balsano, Evelyn; Kühn, Sandra; Pflugmacher, Stephan


    Acetaminophen is a pharmaceutical frequently found in surface water as a contaminant. Bioremediation, in particular mycoremediation, of acetaminophen is a method to remove this compound from waters. Owing to the lack of a quantitative analytical method for acetaminophen in aquatic organisms, the present study aimed to develop a method for the determination of acetaminophen using LC-MS/MS in the aquatic fungus Mucor hiemalis. The method was then applied to evaluate the uptake of acetaminophen by M. hiemalis cultured in pellet morphology. The method was robust, sensitive, and reproducible, with a lower limit of quantification of 5 pg acetaminophen on column. It was found that M. hiemalis internalizes the pharmaceutical and bioaccumulates it with time. Therefore, M. hiemalis was deemed a suitable candidate for further studies to elucidate its pharmaceutical tolerance and longevity in mycoremediation applications.
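Quantitation in an LC-MS/MS method of this kind ultimately rests on a linear calibration curve, with back-calculated amounts below the lower limit of quantification (LLOQ) rejected. A minimal sketch using invented peak areas and the 5 pg on-column LLOQ mentioned above:

```python
import numpy as np

# Hypothetical calibration standards: pg acetaminophen on column vs. peak area.
amounts = np.array([5, 10, 25, 50, 100, 250])          # pg (LLOQ = 5 pg)
areas = np.array([102, 205, 498, 1010, 1985, 5020])    # arbitrary counts

# Least-squares linear calibration: area = slope * amount + intercept.
slope, intercept = np.polyfit(amounts, areas, 1)

def quantify(peak_area, lloq=5.0):
    """Back-calculate amount; report values below the LLOQ as unquantifiable."""
    amount = (peak_area - intercept) / slope
    return amount if amount >= lloq else None

print(round(quantify(760), 1))   # sample within the calibrated range
```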

  5. Systematic errors in detecting biased agonism: Analysis of current methods and development of a new model-free approach (United States)

    Onaran, H. Ongun; Ambrosio, Caterina; Uğur, Özlem; Madaras Koncz, Erzsebet; Grò, Maria Cristina; Vezzi, Vanessa; Rajagopal, Sudarshan; Costa, Tommaso


    Discovering biased agonists requires a method that can reliably distinguish the bias in signalling due to unbalanced activation of diverse transduction proteins from that of differential amplification inherent to the system being studied, which invariably results from the non-linear nature of biological signalling networks and their measurement. We have systematically compared the performance of seven methods of bias diagnostics, all of which are based on the analysis of concentration-response curves of ligands according to classical receptor theory. We computed bias factors for a number of β-adrenergic agonists by comparing BRET assays of receptor-transducer interactions with Gs, Gi and arrestin. Using the same ligands, we also compared responses at signalling steps originated from the same receptor-transducer interaction, among which no biased efficacy is theoretically possible. In either case, we found a high level of false positive results and a general lack of correlation among methods. Altogether, this analysis shows that all tested methods, including some of the most widely used in the literature, fail to distinguish true ligand bias from “system bias” with confidence. We also propose two novel semi-quantitative methods of bias diagnostics that appear to be more robust and reliable than currently available strategies. PMID:28290478
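One widely used classical-receptor-theory diagnostic of the kind this study compares is the ΔΔlog(Emax/EC50) bias factor, in which a ligand's relative activity in two pathways is normalized to a reference agonist. A toy sketch with invented potency and efficacy values (not data from the study, and not the authors' proposed model-free method):

```python
import math

def log_activity(emax, ec50):
    """log(Emax/EC50), a common transduction-coefficient surrogate."""
    return math.log10(emax / ec50)

def bias_factor(test, ref):
    """Delta-delta-log(Emax/EC50) between two pathways vs. a reference agonist.

    test/ref: {"pathway": (Emax, EC50 in M)}; all values hypothetical.
    """
    dd = (log_activity(*test["Gs"]) - log_activity(*test["arrestin"])) - \
         (log_activity(*ref["Gs"]) - log_activity(*ref["arrestin"]))
    return 10 ** dd   # fold-bias toward Gs relative to the reference

reference = {"Gs": (100, 1e-8), "arrestin": (100, 1e-8)}  # balanced by definition
ligand = {"Gs": (95, 2e-9), "arrestin": (60, 5e-8)}
print(round(bias_factor(ligand, reference), 1))
```

The study's point is precisely that such factors can register "system bias" as ligand bias, which is why the normalization to a reference agonist, and the choice of that reference, matters.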

  6. Development of automated extraction method of biliary tract from abdominal CT volumes based on local intensity structure analysis (United States)

    Koga, Kusuto; Hayashi, Yuichiro; Hirose, Tomoaki; Oda, Masahiro; Kitasaka, Takayuki; Igami, Tsuyoshi; Nagino, Masato; Mori, Kensaku


    In this paper, we propose an automated biliary tract extraction method from abdominal CT volumes. The biliary tract is the path by which bile is transported from the liver to the duodenum. No method has been reported for the automated extraction of the biliary tract from common contrast CT volumes. Our method consists of three steps: (1) extraction of extrahepatic bile duct (EHBD) candidate regions, (2) extraction of intrahepatic bile duct (IHBD) candidate regions, and (3) combination of these candidate regions. The IHBD has linear structures whose intensities are low in CT volumes. We use a dark linear structure enhancement (DLSE) filter based on a local intensity structure analysis method using the eigenvalues of the Hessian matrix for the IHBD candidate region extraction. The EHBD region is extracted using a thresholding process and a connected component analysis. In the combination process, we connect the IHBD candidate regions to each EHBD candidate region and select a bile duct region from the connected candidate regions. We applied the proposed method to 22 cases of CT volumes. The average Dice coefficient of the extraction results was 66.7%.
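The idea behind Hessian-eigenvalue line enhancement can be illustrated in 2D: a dark line on a bright background produces one strongly positive eigenvalue across the line and a near-zero eigenvalue along it. The sketch below is a simplified 2D analogue with an invented threshold, not the authors' 3D DLSE implementation:

```python
import numpy as np

def dark_line_enhance(img):
    """Toy dark-linear-structure enhancement on a 2D image.

    For a dark line on a bright background, one Hessian eigenvalue is
    strongly positive across the line while the other stays near zero
    along it, so each pixel is scored by max(lambda_large, 0) when
    |lambda_small| is comparatively small.
    """
    gy, gx = np.gradient(img.astype(float))
    hxx = np.gradient(gx, axis=1)
    hyy = np.gradient(gy, axis=0)
    hxy = np.gradient(gx, axis=0)
    # Closed-form eigenvalues of the symmetric 2x2 Hessian per pixel.
    tmp = np.sqrt(((hxx - hyy) / 2) ** 2 + hxy ** 2)
    lam1 = (hxx + hyy) / 2 + tmp      # larger eigenvalue
    lam2 = (hxx + hyy) / 2 - tmp      # smaller eigenvalue
    return np.where(np.abs(lam2) < 0.5 * np.abs(lam1),
                    np.maximum(lam1, 0.0), 0.0)

# Bright image with a dark horizontal line: response peaks on the line.
img = np.full((21, 21), 200.0)
img[10, :] = 50.0
resp = dark_line_enhance(img)
print(resp[10, 10] > resp[5, 10])
```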


    Directory of Open Access Journals (Sweden)

    Georgiana Cristina NUKINA


    Full Text Available Through the developed risk analysis model, it is decided whether control measures are suitable for implementation. The analysis also determines whether the benefits of a given control option outweigh its implementation cost.

  8. Method development for liquid chromatographic/triple quadrupole mass spectrometric analysis of trace level perfluorocarboxylic acids in articles of commerce (United States)

    An analytical method to identify and quantify trace levels of C5 to C12 perfluorocarboxylic acids (PFCAs) in articles of commerce (AOC) is developed and rigorously validated. Solid samples were extracted in methanol, and liquid samples were diluted with a solvent consisting of 60...

  9. Methods for RNA Analysis

    DEFF Research Database (Denmark)

    Olivarius, Signe

    While increasing evidence appoints diverse types of RNA as key players in the regulatory networks underlying cellular differentiation and metabolism, the potential functions of thousands of conserved RNA structures encoded in mammalian genomes remain to be determined. Since the functions of most RNAs rely on interactions with proteins, the establishment of protein-binding profiles is essential for the characterization of RNAs. Aiming to facilitate RNA analysis, this thesis introduces proteomics- as well as transcriptomics-based methods for the functional characterization of RNA. First, RNA-protein pulldown combined with mass spectrometry analysis is applied for in vivo as well as in vitro identification of RNA-binding proteins, the latter succeeding in verifying known RNA-protein interactions. Secondly, acknowledging the significance of flexible promoter usage for the diversification...

  10. Comparison of infrared spectroscopy techniques: developing an efficient method for high resolution analysis of sediment properties from long records (United States)

    Hahn, Annette; Rosén, Peter; Kliem, Pierre; Ohlendorf, Christian; Persson, Per; Zolitschka, Bernd; Pasado Science Team


    the sample is necessary. This could not be accomplished, therefore absorbance in higher wavelengths was not recorded correctly. As a result of the poor spectral quality no calibration model was established for BSi using the Equinox device. Since this is by far the most time-consuming and elaborate conventional measurement, results give clear advantages for the Alpha device. Further calibration models were developed using spectra from the Visible Near Infrared Spectroscopy (VNIRS) region (400-2500 nm). Sample preparation for VNIRS analysis also is faster than for DRIFTS. However, FTIRS calibrations seem to perform better than those for VNIRS which show an R of 0.75 (BSi), 0.93 (TOC), 0.93 (TN), and 0.89 (TIC). NIRS primarily measures overtones of molecular vibrations and is typically used for quantitative measurement of organic functional groups. FTIRS is similar to NIRS, but uses longer wavelengths and directly monitors molecular vibrations. As a consequence, FTIRS allows more detailed structural and compositional analyses of both organic and inorganic compounds. Statistical analysis of the FTIRS-PLS models shows that the calibration depends on specific wave numbers, which compare well with spectra of pure compounds. The VNIRS technique gives rise to a spectrum with broad peaks and many overlapping signals which makes interpretation difficult without statistical analyses. In conclusion, the DRIFTS technique shows the best statistical performance for the analysis of biogeochemical properties. However, the VNIRS techniques and especially the ATR-FTIRS Alpha device show comparable results and can also be used as a rapid screening tool when time and costs are limiting factors. Kellner R, Mermet J-M, Otto M, Widmer HM (1998) Analytical chemistry. Wiley-VCH, Weinheim, etc. 
Rosén P, Vogel H, Cunnigham L, Reuss N, Conley DJ, Persson P (2009) Fourier transform infrared spectroscopy, a new method for rapid determination of total organic and inorganic carbon and biogenic silica

  11. Guidelines for Analysis of Health Sector Financing in Developing Countries. Volume 8: Health Sector Financing in Developing Countries. International Health Planning Methods Series. (United States)

    Robertson, Robert L.; And Others

    Intended to assist Agency for International Development officers, advisors, and health officials in incorporating health planning into national plans for economic development, this eighth of ten manuals in the International Health Planning Methods series provides a methodology for conducting a study of health sector financing. It presents an…



    Rozet, Eric; Debrus, Benjamin; Lebrun, Pierre; Boulanger, B.; Hubert, Philippe


    As defined by ICH [1] and FDA, Quality by Design (QbD) stands for “a systematic approach to development that begins with predefined objectives and emphasizes product and process understanding and process control, based on sound science and quality risk management”. A risk-based QbD-compliant approach is proposed for the robust development of analytical methods. This methodology based on Design of Experiments (DoE) to study the experimental domain models the retention times at the beginning, t...

  13. Development and validation of an alternative to conventional pretreatment methods for residue analysis of butachlor in water, soil, and rice. (United States)

    Xue, Jiaying; Jiang, Wenqing; Liu, Fengmao; Zhao, Huiyu; Wang, Suli; Peng, Wei


    A rapid and effective alternative analytical method for residues of butachlor in water, soil, and rice was established. The operating variables affecting performance of this method, including different extraction conditions and cleanup adsorbents, were evaluated. The determination of butachlor residues in soil, straw, rice hull, and husked rice was performed using GC/MS after extraction with n-hexane and cleanup with graphite carbon black. The average recoveries ranged from 81.5 to 102.7%, with RSDs of 0.6-7.7% for all of the matrixes investigated. The limits of quantitation were 0.05 mg/kg in water and rice plant, and 0.01 mg/kg in soil, straw, rice hull, and husked rice. A comparison among this proposed method, the conventional liquid-liquid extraction, the Quick, Easy, Cheap, Effective, Rugged, and Safe method, and Soxhlet extraction indicated that this method was more suitable for analyzing butachlor in rice samples. Further validation of the proposed method was carried out against Soxhlet extraction for the determination of butachlor residues in the husked rice samples, and the residue results showed no obvious difference between the two methods. Samples from a rice field were found to contain butachlor residues below the maximum residue limits set by China (0.5 mg/kg) and Japan (0.1 mg/kg). The proposed method has a strong potential for application in routine screening and processing of large numbers of samples. This study developed a more effective alternative to the conventional analytical methods for analyzing butachlor residues in various matrixes.
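Validation figures of this kind, mean percent recovery and relative standard deviation (RSD) over replicate spiked samples, reduce to a short calculation. A sketch with invented replicate values at the 0.05 mg/kg spike level (the numbers are illustrative, not the study's data, though they fall inside its reported 81.5-102.7% recovery and 0.6-7.7% RSD ranges):

```python
from statistics import mean, stdev

def recovery_stats(measured, spiked):
    """Mean percent recovery and RSD for replicate spiked samples."""
    recoveries = [100.0 * m / spiked for m in measured]
    avg = mean(recoveries)
    rsd = 100.0 * stdev(recoveries) / avg   # sample standard deviation
    return avg, rsd

# Five hypothetical replicates of soil spiked at 0.05 mg/kg butachlor.
measured = [0.047, 0.049, 0.046, 0.048, 0.050]
avg, rsd = recovery_stats(measured, 0.05)
print(f"mean recovery {avg:.1f}%, RSD {rsd:.1f}%")
```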

  14. Development of a Direct Headspace Collection Method from Arabidopsis Seedlings Using HS-SPME-GC-TOF-MS Analysis

    Directory of Open Access Journals (Sweden)

    Kazuki Saito


    Full Text Available Plants produce various volatile organic compounds (VOCs, which are thought to be a crucial factor in their interactions with harmful insects, plants and animals. Composition of VOCs may differ when plants are grown under different nutrient conditions, i.e., macronutrient-deficient conditions. However, in plants, relationships between macronutrient assimilation and VOC composition remain unclear. In order to identify the kinds of VOCs that can be emitted when plants are grown under various environmental conditions, we established a conventional method for VOC profiling in Arabidopsis thaliana (Arabidopsis involving headspace-solid-phase microextraction-gas chromatography-time-of-flight-mass spectrometry (HS-SPME-GC-TOF-MS. We grew Arabidopsis seedlings in an HS vial to directly perform HS analysis. To maximize the analytical performance of VOCs, we optimized the extraction method and the analytical conditions of HS-SPME-GC-TOF-MS. Using the optimized method, we conducted VOC profiling of Arabidopsis seedlings, which were grown under two different nutrition conditions, nutrition-rich and nutrition-deficient conditions. The VOC profiles clearly showed a distinct pattern with respect to each condition. This study suggests that HS-SPME-GC-TOF-MS analysis has immense potential to detect changes in the levels of VOCs in not only Arabidopsis, but other plants grown under various environmental conditions.

  15. Methods and Model Development for Coupled RELAP5/PARCS Analysis of the Atucha-II Nuclear Power Plant

    Directory of Open Access Journals (Sweden)

    Andrew M. Ward


    Full Text Available In order to analyze the steady state and transient behavior of CNA-II, several tasks were required. Methods and models were developed in several areas. HELIOS lattice models were developed and benchmarked against WIMS/MCNP5 results generated by NA-SA. Cross-sections for the coupled RELAP5/PARCS calculation were extracted from HELIOS within the GenPMAXS framework. The validation of both HELIOS and PARCS was performed primarily by comparisons to WIMS/PUMA and MCNP for idealized models. Special methods were developed to model the control rods and boron injection systems of CNA-II. The insertion of the rods is oblique, and a special routine was added to PARCS to treat this effect. CFD results combined with specialized mapping routines were used to model the boron injection system. In all cases there was good agreement in the results which provided confidence in the neutronics methods and modeling. A coupled code benchmark between U of M and U of Pisa is ongoing and results are still preliminary. Under a LOCA transient, the best estimate behavior of the core appears to be acceptable.

  16. Developing a New Sampling And Analysis Method For Hydrazine And Monomethyl Hydrazine: Using a Derivatizing Agent With Solid Phase Microextraction (United States)

    Allen, John


    Solid phase microextraction (SPME) will be used to develop a method for detecting monomethyl hydrazine (MMH) and hydrazine (Hz). A derivatizing agent, pentafluorobenzoyl chloride (PFBCl), is known to react readily with MMH and Hz. The SPME fiber can either be coated with PFBCl and introduced into a gaseous stream containing MMH, or PFBCl and MMH can react first in a syringe barrel and, after a short equilibration period, a SPME fiber is used to sample the resulting solution. These methods were optimized and compared. Because Hz and MMH can degrade the SPME fiber, letting the reaction occur first gave better results. Only MMH could be detected using either of these methods. Future research will concentrate on constructing calibration curves and determining the detection limit.

  17. Analysis of Scientific and Methodical Approaches to Portfolio Investment as a Tool of Financial Provision of Sustainable Economic Development

    Directory of Open Access Journals (Sweden)

    Leus Daryna V.


    Full Text Available The article analyses scientific and methodical approaches to portfolio investment. It develops recommendations on specifying the categorical apparatus of portfolio investment in the context of differentiating strategic (direct and portfolio investments as alternative approaches to the conduct of investment activity. It identifies the composition and functions of objects and subjects of portfolio investment under conditions of globalisation of the world financial markets. It studies the main postulates of portfolio theory and justifies the necessity of identifying the place, role and functions of subjects of portfolio investment in them for ensuring sustainable development of the economy. As one way of further developing portfolio theories, it proposes to specify a separate direction in the financial provision of the economy that takes account of ecological and social components: socially responsible investment.

  18. Contribution of ion beam analysis methods to the development of 2nd generation high temperature superconducting (HTS) wires

    Energy Technology Data Exchange (ETDEWEB)

    Usov, Igor O [Los Alamos National Laboratory; Arendt, Paul N [Los Alamos National Laboratory; Stan, Liliana [Los Alamos National Laboratory; Holesinger, Terry G [Los Alamos National Laboratory; Foltyn, Steven R [Los Alamos National Laboratory; Depaula, Raymond F [Los Alamos National Laboratory


    One of the crucial steps in the second generation high temperature superconducting wire program was development of the buffer layer architecture. The architecture designed at the Superconductivity Technology Center at Los Alamos National Laboratory consists of several oxide layers wherein each layer plays a specific role, namely: nucleation layer, diffusion barrier, biaxially textured template, and an intermediate layer with a good match to the lattice parameter of the superconducting Y₁Ba₂Cu₃O₇ (YBCO) compound. This report demonstrates how a wide range of ion beam analysis techniques (SIMS, RBS, channeling, PIXE, PIGE, NRA, ERD) was employed for analysis of each buffer layer and the YBCO films. These results assisted in the understanding of a variety of physical processes occurring during buffer layer fabrication and helped to optimize the buffer layer architecture as a whole.

  19. Developing a 3D constrained variational analysis method to obtain accurate gridded atmospheric vertical velocity and horizontal advections (United States)

    Tang, S.; Zhang, M.


    Based on the constrained variational analysis (CVA) algorithm developed by Zhang and Lin (1997), a 3-dimensional (3D) version of CVA has been developed. The new algorithm uses gridded surface and TOA observations as constraints to adjust atmospheric state variables at each grid point so that column-integrated mass, moisture, and static energy are conserved. From this adjustment process, a set of high-quality 3D large-scale forcing data (vertical velocity and horizontal advections) can be derived to drive single-column models (SCM), cloud-resolving models (CRM), and large-eddy simulations (LES) to evaluate and improve parameterizations. Since 3D CVA can adjust gridded state variables from any data source with observed precipitation, radiation, and surface fluxes, the algorithm could potentially also be used in data assimilation systems to assimilate precipitation and radiation data.
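The essence of a constrained variational analysis, a minimum weighted adjustment of first-guess state variables subject to a column-integrated budget constraint, has a closed form for a single linear constraint via a Lagrange multiplier. The sketch below uses invented values and a single moisture-budget constraint; the actual CVA enforces mass, moisture, and static energy conservation jointly:

```python
import numpy as np

# Toy column of moisture values (x0) that should satisfy a linear budget
# constraint a.x = b (e.g., the column-integrated moisture budget balancing
# observed precipitation). All names and numbers are illustrative.
x0 = np.array([3.1, 2.4, 1.7, 0.9])   # first-guess state at 4 levels
a = np.array([0.4, 0.3, 0.2, 0.1])    # layer-thickness weights
b = 2.0                                # observed column-integrated budget
w = np.array([1.0, 1.0, 2.0, 4.0])    # confidence (larger = adjust less)

# Minimize sum(w * (x - x0)**2) subject to a.x = b; the Lagrange-multiplier
# solution spreads the residual in proportion to a/w:
residual = b - a @ x0
x = x0 + residual * (a / w) / (a @ (a / w))

print(np.round(x, 3), round(float(a @ x), 3))
```

After adjustment the constraint is satisfied exactly, and levels flagged as more trustworthy (larger weight) are moved less, which is the defining behavior of the variational approach.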

  20. Development of a method of analysis and computer program for calculating the inviscid flow about the windward surfaces of space shuttle configurations at large angles of attack (United States)

    Maslen, S. H.


    A general method developed for the analysis of inviscid hypersonic shock layers is discussed for application to the case of the shuttle vehicle at high (65 deg) angle of attack. The associated extensive subsonic flow region caused convergence difficulties whose resolution is discussed. It is required that the solution be smoother than anticipated.

  1. Development of microwave-assisted drying methods for sample preparation for dried spot micro-X-ray fluorescence analysis. (United States)

    Link, Dirk D; Kingston, H M; Havrilla, George J; Colletti, Lisa P


    Although dried spot micro X-ray fluorescence (MXRF) is an effective analytical technique for trace elemental analysis, the sample preparation procedures currently used suffer from a number of drawbacks. These drawbacks include relatively long preparation times, lack of control of the sample preparation environment, and possibility of loss of volatile analytes during the drying process. Microwave-assisted drying offers several advantages for dried spot preparation, including control of the environment and minimized volatility because of the differences between microwave heating and conventional heating. A microwave-assisted drying technique has been evaluated for use in preparing dried spots for trace analysis. Two apparatus designs for microwave drying were constructed and tested using multielement standard solutions, a standard reference material, and a "real-world" semiconductor cleaning solution. Following microwave-assisted drying of these aqueous samples, the residues were redissolved and analyzed by ICPMS. Effective recovery was obtained using the microwave drying methods, demonstrating that the microwave drying apparatus and methods described here may be more efficient alternatives for dried spot sample preparation.

  2. Development of a Novel Nuclear Safety Culture Evaluation Method for an Operating Team Using Probabilistic Safety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Han, Sangmin; Lee, Seung Min; Seong, Poong Hyun [KAIST, Daejeon (Korea, Republic of)


    IAEA defined safety culture as follows: 'Safety Culture is that assembly of characteristics and attitudes in organizations and individuals which establishes that, as an overriding priority, nuclear plant safety issues receive the attention warranted by their significance'. The behavioral scientist Cooper likewise defined safety culture as 'that observable degree of effort by which all organizational members direct their attention and actions toward improving safety on a daily basis', within his internal psychological, situational, and behavioral context model. Given these various definitions and criteria of safety culture, several safety culture assessment methods have been developed to improve and manage safety culture. To develop a new quantitative safety culture evaluation method for an operating team, we unified and redefined safety culture assessment items. We then modeled a new safety culture evaluation by adopting the level 1 PSA concept. Finally, we suggested criteria for obtaining nominal success probabilities of assessment items by using an 'operational definition'. To validate the suggested evaluation method, we analyzed audio-visual recording data collected from a full scope main control room simulator of a NPP in Korea.
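The level 1 PSA analogy can be illustrated by treating assessment items as independent basic events with nominal success probabilities: a team-level measure is then the product of the item probabilities, and its complement plays the role of a failure probability. The item names and numbers below are invented for illustration and are not the authors' assessment items:

```python
from math import prod

# Hypothetical nominal success probabilities for safety culture
# assessment items, treated as independent basic events (level 1 PSA style).
items = {"communication": 0.98, "procedure_adherence": 0.95,
         "questioning_attitude": 0.97}

# Team-level success is the product of item probabilities;
# the complement is the modeled probability of a safety culture lapse.
p_success = prod(items.values())
p_failure = 1.0 - p_success
print(round(p_success, 4), round(p_failure, 4))
```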

  3. The development of human behavior analysis techniques - A study on knowledge representation methods for operator cognitive model

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Park, Young Tack [Soongsil University, Seoul (Korea, Republic of)


    The main objective of this project is modeling of the human operator in a main control room of a Nuclear Power Plant. For this purpose, we carried out research on knowledge representation and inference methods based on Rasmussen's decision ladder structure, and we have developed SACOM (Simulation Analyzer with a Cognitive Operator Model) using the G2 shell on Sun workstations. SACOM consists of an Operator Model, an Interaction Analyzer, and a Situation Generator. The cognitive model aims to build a more detailed model of human operators in an effective way. SACOM is designed to model knowledge-based behavior of human operators more easily. The following are the main research topics carried out this year. First, in order to model knowledge-based behavior of human operators, more detailed scenarios were constructed, and knowledge representation and inference methods were developed to support the scenarios. Second, meta knowledge structures were studied to support human operators' four types of diagnoses. This work includes a study on meta and scheduler knowledge structures for generate-and-test, topographic, decision tree and case-based approaches. Third, domain knowledge structures were improved to support meta knowledge. In particular, domain knowledge structures were developed to support the topographic diagnosis model. Fourth, a more applicable interaction analyzer and situation generator were designed and implemented. The new version is implemented in G2 on Sun workstations. 35 refs., 49 figs. (author)

  4. Negotiating a Systems Development Method (United States)

    Karlsson, Fredrik; Hedström, Karin

    Systems development methods (or methods) are often applied in tailored versions to fit the actual situation. In most of the existing literature, method tailoring is viewed as either (a) a highly rational process with the method engineer as the driver, where the project members are passive information providers, or (b) an unstructured process where the systems developer makes individual choices, a selection process without any driver. The purpose of this chapter is to illustrate that important design decisions during method tailoring are made by project members through negotiation. The study has been carried out using the perspective of actor-network theory. Our narratives depict method tailoring as more complex than either (a) or (b): the driver role rotates between the project members, and design decisions are based on influences from several project members. However, these design decisions are not consensus decisions.

  5. The order and priority of research and design method application within an assistive technology new product development process: a summative content analysis of 20 case studies. (United States)

    Torrens, George Edward


    Summative content analysis was used to define the methods and heuristics applied in each case study. The review process had two parts: (1) a literature review to identify conventional research methods and (2) a summative content analysis of published case studies, based on the identified methods and heuristics, to suggest an order and priority of where and when they were used. Over 200 research and design methods and design heuristics were identified. From the review of the 20 case studies, 42 were identified as being applied. The majority of methods and heuristics were applied in phase two, market choice. There appeared to be a disparity between the limited number of methods frequently used (under 10 within the 20 case studies) and the hundreds available. Implications for Rehabilitation The communication highlights a number of issues that have implications for those involved in assistive technology new product development: • The study defined over 200 well-established research and design methods and design heuristics that are available for use by those who specify and design assistive technology products, providing a comprehensive reference list for practitioners in the field; • The review within the study suggests only a limited number of research and design methods are regularly used by industrial-design-focused assistive technology new product developers; and • Debate is required among practitioners working in this field to reflect on how a wider range of potentially more effective methods and heuristics may be incorporated into daily working practice.

  6. Development of New Method for Simultaneous Analysis of Piracetam and Levetiracetam in Pharmaceuticals and Biological Fluids: Application in Stability Studies (United States)

    Siddiqui, Farhan Ahmed; Sher, Nawab; Shafi, Nighat; Wafa Sial, Alisha; Ahmad, Mansoor; Mehjebeen


    An RP-HPLC method with ultraviolet detection for the simultaneous quantification of piracetam and levetiracetam has been developed and validated. Chromatographic separation was achieved on a Nucleosil C18 column (25 cm × 0.46 cm, 10 μm). The mobile phase was a (70:30, v/v) mixture of 0.1 g/L triethylamine and acetonitrile, delivered at a flow rate of 1 mL/min, with detection at 205 nm. Results were evaluated through statistical parameters that confirm the method's reproducibility and selectivity for the quantification of piracetam, levetiracetam, and their impurities, hence proving its stability-indicating properties. The proposed method is significant in permitting the separation of the main constituent piracetam from levetiracetam. Linear behavior was observed between 20 ng/mL and 10,000 ng/mL for both drugs. The proposed method was checked with bulk drugs, dosage formulations, physiological conditions, and clinical investigations, and excellent outcomes were observed. PMID:25114921
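The linearity claim (20–10,000 ng/mL) is usually demonstrated by regressing peak area on concentration and checking the correlation. A minimal sketch follows; the peak-area values are hypothetical illustration, not data from the paper.

```python
# Sketch of a chromatographic linearity check: ordinary least-squares fit of
# peak area vs. concentration, with r-squared as the acceptance statistic.
# Peak areas below are made-up illustrative numbers.

def linear_fit_r2(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    r2 = sxy * sxy / (sxx * syy)
    return slope, intercept, r2

conc = [20, 100, 500, 1000, 5000, 10000]      # ng/mL calibration levels
area = [41, 205, 1010, 2020, 10100, 20150]    # hypothetical peak areas
slope, intercept, r2 = linear_fit_r2(conc, area)
```

In validation practice r² is expected to be very close to 1 over the whole claimed range.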

  7. Development of an optimized method for the detection of airborne viruses with real-time PCR analysis

    Directory of Open Access Journals (Sweden)

    Legaki Euaggelia


    Full Text Available Abstract Background Airborne viruses remain one of the major public health issues worldwide. Detection and quantification of airborne viruses are essential in order to provide information for public health risk assessment. Findings In this study, an optimized new, simple, low-cost method for the sampling of airborne viruses using Low Melting Agarose (LMA) plates and a conventional microbial air sampling device has been developed. The use of LMA plates permits direct nucleic acid extraction of the captured viruses without the need for any preliminary elution step. Molecular detection and quantification of airborne viruses is performed using the real-time quantitative (RT-)PCR (Q(RT-)PCR) technique. The method has been tested using Adenoviruses (AdVs) and Noroviruses (NoVs) GII as representative DNA and RNA viruses, respectively. Moreover, the method has been tested successfully in outdoor experiments, by detecting and quantifying human adenoviruses (HAdVs) in the airborne environment of a wastewater treatment plant. Conclusions The great advantage of LMA is that nucleic acid extraction is performed directly on the LMA plates, while the eluted nucleic acids are totally free of inhibitory substances. Coupled with QPCR, the whole procedure can be completed in less than three (3) hours.
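The quantification step in QPCR relies on the linear relation between Ct and log10(copies) established from a dilution series. The sketch below illustrates that standard-curve workflow; all Ct values are invented for illustration and are not the study's data.

```python
# Minimal sketch of Q(RT-)PCR absolute quantification via a standard curve:
# Ct is linear in log10(copies); fit on serial dilutions, invert for unknowns.
# The Ct readings below are made-up illustrative numbers.

def fit_standard_curve(log10_copies, ct_values):
    """Ordinary least-squares fit Ct = slope * log10(copies) + intercept."""
    n = len(log10_copies)
    mx = sum(log10_copies) / n
    my = sum(ct_values) / n
    sxx = sum((x - mx) ** 2 for x in log10_copies)
    sxy = sum((x - mx) * (y - my) for x, y in zip(log10_copies, ct_values))
    slope = sxy / sxx
    intercept = my - slope * mx
    return slope, intercept

def copies_from_ct(ct, slope, intercept):
    return 10 ** ((ct - intercept) / slope)

# Ten-fold dilution series (hypothetical): 10^6 down to 10^2 copies
dilutions = [6, 5, 4, 3, 2]
cts = [18.1, 21.5, 24.8, 28.2, 31.6]   # illustrative Ct readings
slope, intercept = fit_standard_curve(dilutions, cts)
unknown_copies = copies_from_ct(25.0, slope, intercept)
```

A slope near -3.32 corresponds to ~100% amplification efficiency, which is why slopes in that neighborhood are usually required before quantifying unknowns.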

  8. Structural analysis of plate-type fuel assemblies and development of a non-destructive method to assess their integrity

    Energy Technology Data Exchange (ETDEWEB)

    Caresta, Mauro, E-mail: [School of Mechanical and Manufacturing Engineering, The University of New South Wales, Sydney 2052, NSW (Australia)]; Wassink, David [Australian Nuclear Science and Technology Organisation (ANSTO), Lucas Heights 2234, NSW (Australia)]


    Highlights: • A plate-type fuel assembly is made of thin plates mounted in a box-like structure. • Drag force from the coolant can shift the plates. • A non-invasive method is proposed to test the strength of the plate connections. • The shift in natural frequencies is used to assess the fuel integrity. -- Abstract: This work is concerned with the structural behaviour and the integrity of parallel-plate-type nuclear fuel assemblies. A plate-type assembly consists of several thin plates mounted in a box-like structure and is subjected to a coolant flow that can result in a considerable drag force. A finite element model of an assembly is presented to study the sensitivity of the natural frequencies to the stiffness of the plates' junctions. It is shown that the shift in the natural frequencies of the torsional modes can be used to check the global integrity of the fuel assembly, while the local natural frequencies of the inner plates can be used to estimate the maximum drag force they can resist. Finally, a non-destructive method is developed to assess the capacity of the inner plates to bear an applied load. Extensive computational and experimental results are presented to prove the applicability of the method.
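The physical idea, that natural frequency falls as junction stiffness degrades, can be shown with a toy single-mode model. This is not the paper's finite element model: the torsional inertia and stiffness values are hypothetical, chosen only to make the trend visible.

```python
# Toy sketch of the idea behind the non-destructive check: the natural
# frequency of a plate restrained by its junctions drops as the junction
# stiffness degrades. A 1-DOF torsional oscillator stands in for the plate.
import math

def natural_frequency_hz(k_junction, inertia):
    """Undamped natural frequency f = (1/2*pi)*sqrt(k/J) of a single
    torsional mode restrained by the plate-to-frame junction stiffness."""
    return math.sqrt(k_junction / inertia) / (2 * math.pi)

J = 0.05                                       # kg*m^2, hypothetical inertia
f_intact = natural_frequency_hz(2.0e4, J)      # healthy junction stiffness
f_degraded = natural_frequency_hz(1.0e4, J)    # 50 % stiffness loss
shift = (f_intact - f_degraded) / f_intact     # relative frequency shift
```

Because f scales with sqrt(k), a 50% stiffness loss produces a ~29% frequency drop, which is the kind of measurable shift the non-destructive method exploits.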

  9. Development of New Method for Simultaneous Analysis of Piracetam and Levetiracetam in Pharmaceuticals and Biological Fluids: Application in Stability Studies

    Directory of Open Access Journals (Sweden)

    Farhan Ahmed Siddiqui


    Full Text Available An RP-HPLC method with ultraviolet detection for the simultaneous quantification of piracetam and levetiracetam has been developed and validated. Chromatographic separation was achieved on a Nucleosil C18 column (25 cm × 0.46 cm, 10 μm). The mobile phase was a (70:30, v/v) mixture of 0.1 g/L triethylamine and acetonitrile, delivered at a flow rate of 1 mL/min, with detection at 205 nm. Results were evaluated through statistical parameters that confirm the method's reproducibility and selectivity for the quantification of piracetam, levetiracetam, and their impurities, hence proving its stability-indicating properties. The proposed method is significant in permitting the separation of the main constituent piracetam from levetiracetam. Linear behavior was observed between 20 ng/mL and 10,000 ng/mL for both drugs. The proposed method was checked with bulk drugs, dosage formulations, physiological conditions, and clinical investigations, and excellent outcomes were observed.

  10. Development of new method for simultaneous analysis of piracetam and levetiracetam in pharmaceuticals and biological fluids: application in stability studies. (United States)

    Siddiqui, Farhan Ahmed; Sher, Nawab; Shafi, Nighat; Wafa Sial, Alisha; Ahmad, Mansoor; Mehjebeen; Naseem, Huma


    An RP-HPLC method with ultraviolet detection for the simultaneous quantification of piracetam and levetiracetam has been developed and validated. Chromatographic separation was achieved on a Nucleosil C18 column (25 cm × 0.46 cm, 10 μm). The mobile phase was a (70:30, v/v) mixture of 0.1 g/L triethylamine and acetonitrile, delivered at a flow rate of 1 mL/min, with detection at 205 nm. Results were evaluated through statistical parameters that confirm the method's reproducibility and selectivity for the quantification of piracetam, levetiracetam, and their impurities, hence proving its stability-indicating properties. The proposed method is significant in permitting the separation of the main constituent piracetam from levetiracetam. Linear behavior was observed between 20 ng/mL and 10,000 ng/mL for both drugs. The proposed method was checked with bulk drugs, dosage formulations, physiological conditions, and clinical investigations, and excellent outcomes were observed.

  11. The Role of R&D and Innovation in Regional Development: An Interregional Analysis with DEMATEL-Based Analytic Network Process (DANP and TOPSIS Methods

    Directory of Open Access Journals (Sweden)

    Enver Çakın


    Full Text Available R&D, innovation, and information-based activities have become more important in recent years for regional development and for the elimination of regional development differences. Countries have also recognized that innovation is the most important factor in economic growth and have begun to allocate more resources to R&D investments. In this study, for the years 2010, 2011 and 2012, the innovation performance of the 12 regions at the first level of the Nomenclature of Units for Territorial Statistics (NUTS) in Turkey has been evaluated by taking basic R&D and innovation indicators into consideration. In this context, regression analysis, the DEMATEL-Based Analytic Network Process (DANP) and the TOPSIS method have been applied. The criteria have been weighted using the regression coefficients obtained through regression analysis in the DEMATEL method, and subsequently a performance ranking of the regions has been produced by the TOPSIS method.
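The TOPSIS ranking step described above follows a standard recipe: vector-normalize the decision matrix, weight it, and score each alternative by its closeness to the ideal solution. The sketch below implements that textbook procedure; the regions, criteria and scores are entirely hypothetical, not the study's data.

```python
# Hedged sketch of the standard TOPSIS procedure: normalize, weight,
# compute distances to the ideal and anti-ideal solutions, then rank by
# the closeness coefficient. All input numbers are illustrative.
import math

def topsis(matrix, weights, benefit):
    """matrix[i][j]: score of alternative i on criterion j.
    benefit[j]: True if larger is better for criterion j."""
    m, n = len(matrix), len(matrix[0])
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m))) for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)] for i in range(m)]
    ideal = [max(v[i][j] for i in range(m)) if benefit[j]
             else min(v[i][j] for i in range(m)) for j in range(n)]
    anti = [min(v[i][j] for i in range(m)) if benefit[j]
            else max(v[i][j] for i in range(m)) for j in range(n)]
    scores = []
    for i in range(m):
        d_pos = math.sqrt(sum((v[i][j] - ideal[j]) ** 2 for j in range(n)))
        d_neg = math.sqrt(sum((v[i][j] - anti[j]) ** 2 for j in range(n)))
        scores.append(d_neg / (d_pos + d_neg))
    return scores

# Three hypothetical regions scored on R&D spend, patents, unemployment
data = [[3.2, 120, 9.1], [2.1, 80, 7.5], [4.0, 150, 11.0]]
w = [0.4, 0.4, 0.2]                       # e.g. from regression coefficients
scores = topsis(data, w, benefit=[True, True, False])
ranking = sorted(range(len(scores)), key=scores.__getitem__, reverse=True)
```

The closeness coefficient lies in [0, 1]; the region with the highest value is ranked first.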

  12. Semianalytical analysis of shear walls with the use of discrete-continual finite element method. Part 2: Numerical examples, future development

    Directory of Open Access Journals (Sweden)

    Akimov Pavel


    Full Text Available This paper is devoted to the two-dimensional semi-analytical solution of boundary problems in the analysis of shear walls with the use of the discrete-continual finite element method (DCFEM). This approach yields the exact analytical solution in one direction (the so-called “basic” direction) and reduces the problem to a one-dimensional conventional finite element analysis. Two numerical examples of structural analysis with the use of DCFEM are considered, with the conventional finite element method (FEM) used for verification purposes. The presented examples show some of the advantages of the suggested approach to the semi-analytical analysis of shear walls. Future development of DCFEM, particularly associated with a multigrid approach, is under consideration as well.

  13. Trial Sequential Methods for Meta-Analysis (United States)

    Kulinskaya, Elena; Wood, John


    Statistical methods for sequential meta-analysis also have applications for the design of new trials. Existing methods are based on group sequential methods developed for single trials and start with the calculation of a required information size. This works satisfactorily within the framework of fixed-effects meta-analysis, but conceptual…
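The "required information size" starting point mentioned above is the meta-analysis analogue of a two-arm sample-size calculation. A minimal sketch for a continuous outcome follows; the effect size and standard deviation are illustrative assumptions, and fixed z-quantiles replace a proper normal-quantile call to keep the example self-contained.

```python
# Sketch of a fixed-effect required information size (RIS) calculation:
# the total number of participants a meta-analysis needs before its
# cumulative evidence is treated like an adequately powered trial.
Z_ALPHA = 1.959964   # normal quantile for two-sided alpha = 0.05
Z_BETA = 1.281552    # normal quantile for 90 % power

def required_information_size(delta, sigma):
    """Total participants (both arms combined) needed to detect a mean
    difference `delta` given outcome standard deviation `sigma`."""
    n_per_arm = 2 * (Z_ALPHA + Z_BETA) ** 2 * sigma ** 2 / delta ** 2
    return 2 * n_per_arm

ris = required_information_size(delta=0.5, sigma=1.0)
```

Trial sequential analysis then monitors the cumulative meta-analysis against boundaries defined over this information size, typically inflating it further to allow for heterogeneity.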

  14. Recent development of positron annihilation methods

    CERN Document Server

    Doyama, M


    When a positron enters a solid or liquid, it moves through the matter and, upon pair annihilation with an electron, emits two gamma rays in opposite directions, each of about 511 keV. Positron annihilation experiments have developed along three methods: angular correlation between the two gamma rays, energy analysis of the emitted gamma rays, and positron lifetime measurement. The angular correlation between the two gamma rays is determined with a gamma-ray position detector. The energy analysis is performed by S-W analysis and the Coincidence Doppler Broadening (CDB) method. Positron lifetimes are determined by the gamma-gamma lifetime measurement method, the β⁺-gamma lifetime measurement method, and other methods using the waveform of the photomultiplier and the determination of the time and frequency of the gamma rays. Positron beams are applied to positron scattering, positron diffraction, low energy positron diffraction (LEPD), PELS, LEPSD, PAES, the positron re-emission imaging microscope (PRIM) and positron channeling. The example of the CDB method...

  15. Hair analysis for drugs of abuse: evaluation of analytical methods, environmental issues, and development of reference materials. (United States)

    Welch, M J; Sniegoski, L T; Allgood, C C; Habram, M


    Methods for extraction of cocaine, some of its metabolites, morphine, and codeine from hair and methods for analyzing the extracts have been investigated. Results of these studies have shown that extractions with 0.1N HCl are efficient at removing the target compounds from hair and appear to be as effective as enzymatic digestions that dissolve the hair. GC/MS with either electron ionization or chemical ionization was found to provide accurate and unambiguous determinations of the target compounds. Tandem mass spectrometry (MS/MS) also provided accurate results when performed on extracts from hair, but results were ambiguous when MS/MS was performed on hair segments directly. Environmental issues, including the removal of powdered and vapor-deposited cocaine from the hair surface and the effect of various hair treatments on the levels of cocaine entrapped in hair, have also been investigated. Removal of cocaine deposited on hair was incomplete with all approaches tested, making it difficult to differentiate the hair of cocaine users from hair with environmental exposure to cocaine. Cocaethylene, a cocaine metabolite believed to be formed in the liver, was found in the hair of some cocaine users and may be a good marker for proving drug use. Common hair treatments, such as shampoos, conditioners, and peroxide bleaches, reduced cocaine levels in a fortified hair material by 60 to 80% after 30 treatments. Finally, to assist laboratories in evaluating the accuracy of their methods, two human hair reference materials with recommended concentrations of cocaine, benzoylecgonine, morphine, and codeine determined by GC/MS have been developed.

  16. New developments for the analysis of archaeological and artistic artifacts by optical and ion beam methods at LAMFI

    Energy Technology Data Exchange (ETDEWEB)

    Rizzutto, Marcia A.; Tabacniks, Manfredo H.; Barbosa, Marcel D. L.; Added, Nemitala; Curado, Jessica F.; Kajiya, Elizabet M.; Campos, Pedro H.O.V. de [Universidade de Sao Paulo (USP), SP (Brazil). Inst. de Fisica]


    Full text: Since 2005, the analysis of artistic and cultural heritage objects at LAMFI-USP (Laboratorio de Analises de Materiais com Feixes Ionicos), initially restricted to ion beam methods, has been growing steadily. Since then, alternative methodologies and procedures have been incorporated to better characterize these objects, which possess distinctive physical characteristics as well as high cultural and monetary value. The examinations were expanded to other non-destructive analytical techniques such as portable XRF (X-ray fluorescence) analysis, X-ray radiography, and visible, UV (ultraviolet) and IR (infrared) light imaging, which are helping to better understand these art objects, particularly paintings, where the techniques help to assess the conservation state and also reveal underlying drawings that illuminate the creative process of the artist. The external beam arrangement at LAMFI was recently updated for simultaneous PIXE (particle-induced X-ray emission), RBS (Rutherford backscattering), PIGE (particle-induced gamma-ray emission) and IBL (ion beam luminescence) analysis in open air. The new setup comprises a 2π star-like detector assembly with 7 collimated telescopes: two openings have laser beams for optical alignment of the target, 2 are used for X-ray detectors, 1 for a particle detector, 1 for an optical spectrometer, and 1 for imaging. The particle and X-ray detector telescopes can be evacuated to reduce signal losses. The 2 telescopes with the X-ray detectors have absorbers to selectively filter low-energy X-rays, optimizing the PIXE detection limits. The beam exit window is made of an 8 μm aluminum foil, allowing the integrated beam charge to be monitored by measuring the Al gamma rays with a NaI detector. The geometry and materials of the assembly have been carefully designed to shield the X-ray detectors from the X-rays of the exit beam window as well as to reduce the detection of Ar Kα from the in-air beam path.
The

  17. Gait analysis methods in rehabilitation

    Directory of Open Access Journals (Sweden)

    Baker Richard


    Full Text Available Abstract Introduction Brand's four reasons for clinical tests and his analysis of the characteristics of valid biomechanical tests for use in orthopaedics are taken as a basis for determining what methodologies are required for gait analysis in a clinical rehabilitation context. Measurement methods in clinical gait analysis The state of the art of optical systems capable of measuring the positions of retro-reflective markers placed on the skin is sufficiently advanced that they are probably no longer a significant source of error in clinical gait analysis. Determining the anthropometry of the subject and compensating for soft tissue movement in relation to the underlying bones are now the principal problems. Techniques for using functional tests to determine joint centres and axes of rotation are starting to be used successfully. Probably the last great challenge for optical systems is in using computational techniques to compensate for soft tissue movements. In the long-term future it is possible that direct imaging of bones and joints in three dimensions (using MRI or fluoroscopy) may replace marker-based systems. Methods for interpreting gait analysis data There is still no accepted general theory of why we walk the way we do. In the absence of this, many explanations of walking address the mechanisms by which specific movements are achieved by particular muscles. A whole new methodology is developing to determine the functions of individual muscles. This needs further development and validation. A particular requirement is for subject-specific models incorporating 3-dimensional imaging data of the musculo-skeletal anatomy with kinematic and kinetic data. Methods for understanding the effects of intervention Clinical gait analysis is extremely limited if it does not allow clinicians to choose between alternative possible interventions or to predict outcomes. This can be achieved either by rigorously planned clinical trials or using

  18. Hydrophilic interaction liquid chromatography in analysis of granisetron HCl and its related substances. Retention mechanisms and method development. (United States)

    Maksić, Jelena; Tumpa, Anja; Stajić, Ana; Jovanović, Marko; Rakić, Tijana; Jančić-Stojanović, Biljana


    In this paper the separation of granisetron and its two related substances in HILIC mode is presented. Separation was performed on a silica column derivatized with sulfoalkylbetaine groups (ZIC-HILIC). Firstly, retention mechanisms were assessed, whereby the retention factors of the substances were followed over a wide range of acetonitrile content (80-97%), at a constant concentration of aqueous buffer (10 mM) and a constant pH value of 3.0. Further, in order to develop an optimal HILIC method, Design of Experiments (DoE) methodology was applied. For optimization, a 3² full factorial design was employed. The influences of acetonitrile content and ammonium acetate concentration were investigated while the pH of the water phase was kept at 3.3. The adequacy of the obtained mathematical models was confirmed by ANOVA. The optimization goals (α > 1.15 and minimal run time) were accomplished with 94.7% acetonitrile in the mobile phase and 70 mM ammonium acetate in the water phase. The optimal point was in the middle of the defined Design Space. In the next phase, robustness was experimentally tested by a Rechtschaffen design. The investigated factors and their levels were: acetonitrile content (±1%), ammonium acetate molarity in the water phase (±2 mM), pH value of the water phase (±0.2) and column temperature (±4 °C). The validation scope included selectivity, linearity, accuracy and precision, as well as determination of the limit of detection (LOD) and limit of quantification (LOQ) for the related substances. The validation acceptance criteria were met in all cases. Finally, the proposed method can be successfully utilized for the estimation of granisetron HCl and its related substances in tablets and parenteral dosage forms, as well as for monitoring degradation under various stress conditions.
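A 3² full factorial design of the kind used here is just every combination of two factors at three levels, nine runs in all. The sketch below generates such a design; the factor ranges are illustrative values chosen near those quoted in the abstract, not the authors' actual design table.

```python
# Sketch of a 3^2 full factorial design: two factors (acetonitrile %,
# ammonium acetate mM) at three coded levels each, giving 9 runs.
# Factor ranges are illustrative assumptions, not the paper's table.
from itertools import product

LEVELS = (-1, 0, 1)
FACTORS = {"acetonitrile_pct": (92.0, 94.5, 97.0),
           "ammonium_acetate_mM": (30.0, 60.0, 90.0)}

runs = []
for coded in product(LEVELS, repeat=2):
    run = {name: FACTORS[name][code + 1]        # map -1/0/+1 to real values
           for name, code in zip(FACTORS, coded)}
    run["coded"] = coded
    runs.append(run)
# A quadratic model y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
# can then be fitted to the 9 responses and checked by ANOVA, as in the paper.
```

Three levels per factor are what make the quadratic (curvature) terms of the model estimable.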

  19. Double divisor mean centering of ratio spectra as a developed spectrophotometric method for the analysis of five-component mixture in drug analysis

    Directory of Open Access Journals (Sweden)

    R’afat Mahmoud Nejem


    Full Text Available In this paper a simple method was developed for the simultaneous determination of five-component mixtures, without prior separation steps. The method is based on the combination of the double divisor ratio derivative method and the mean centering of ratio spectra method. The mathematical explanation of the procedure is illustrated. The linear determination ranges were 0–30, 0–20, 0–20, 0–45 and 0–100 μg ml−1 for paracetamol, methylparaben, propylparaben, chlorpheniramine maleate and pseudoephedrine hydrochloride in 0.1 M HCl, respectively. The proposed method was validated using synthetic five-component mixtures and applied to the simultaneous determination of these drugs in Decamol Flu syrup. No published spectrophotometric method has been reported for the simultaneous determination of the five components of this mixture, so the results of the double divisor mean centering of ratio method (DD-MCR) were statistically compared with those of a proposed classical least squares method (CLS).
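The core arithmetic of mean centering of ratio spectra is easy to show on a two-component toy case: dividing the mixture spectrum by one component's spectrum turns that component's contribution into a constant, which mean centering then removes. The spectra below are short, made-up absorbance vectors, not the paper's five-component data.

```python
# Minimal numeric sketch of the mean centering of ratio spectra (MCR) step:
# divide the mixture spectrum by a divisor spectrum, then subtract the mean
# of the resulting ratio spectrum. All spectra are synthetic toy vectors.

def mean_center_ratio(mixture, divisor):
    ratio = [m / d for m, d in zip(mixture, divisor)]
    mean = sum(ratio) / len(ratio)
    return [r - mean for r in ratio]

# Synthetic two-component mixture: A + 2*B at five wavelengths
spec_a = [0.10, 0.30, 0.50, 0.30, 0.10]
spec_b = [0.40, 0.20, 0.10, 0.20, 0.40]
mixture = [a + 2 * b for a, b in zip(spec_a, spec_b)]

mc = mean_center_ratio(mixture, spec_b)
# Dividing by B's spectrum turns B's contribution into the constant 2,
# which mean centering removes, leaving only A's (centered) ratio shape.
```

The double-divisor variant in the paper iterates this idea with a sum of component spectra as the divisor, which is what extends it to five components.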

  20. Development of CAD based on ANN analysis of power spectra for pneumoconiosis in chest radiographs: effect of three new enhancement methods. (United States)

    Okumura, Eiichiro; Kawashita, Ikuo; Ishida, Takayuki


    We have been developing a computer-aided detection (CAD) scheme for pneumoconiosis based on a rule-based plus artificial neural network (ANN) analysis of power spectra. In this study, we have developed three enhancement methods for the abnormal patterns to reduce false-positive and false-negative values. The image database consisted of 2 normal and 15 abnormal chest radiographs. The International Labour Organization standard chest radiographs with pneumoconiosis were categorized by subcategory, size, and shape of pneumoconiosis. Regions of interest (ROIs) with a matrix size of 32 × 32 were selected from normal and abnormal lungs. The three new enhancement methods were based on a window function, top-hat transformation, and gray-level co-occurrence matrix analysis. We calculated the power spectrum (PS) of all ROIs by Fourier transform. For the classification between normal and abnormal ROIs, we applied a combined analysis using the rule-based plus ANN method. To evaluate the overall performance of this CAD scheme, we employed ROC analysis for distinguishing between normal and abnormal ROIs. On the chest radiographs of the highest categories (severe pneumoconiosis) and the lowest categories (early pneumoconiosis), this CAD scheme achieved area under the curve (AUC) values of 0.93 ± 0.02 and 0.72 ± 0.03, respectively. The combined rule-based plus ANN method with the three new enhancement methods obtained the highest classification performance for distinguishing between abnormal and normal ROIs. Our CAD system based on the three new enhancement methods should be useful in assisting radiologists in the classification of pneumoconiosis.
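The "power spectrum of an ROI" feature can be illustrated in one dimension: take the discrete Fourier transform of an intensity profile and use |F(u)|² as the texture descriptor. This is a didactic stand-in for the paper's 32 × 32 two-dimensional case, with a synthetic sinusoidal profile rather than radiograph data.

```python
# Tiny sketch of power-spectrum feature extraction: DFT of an ROI profile,
# with |F(u)|^2 as the texture feature fed to a rule-based + ANN classifier.
# A 1-D synthetic profile stands in for the 32x32 radiograph ROI.
import cmath
import math

def power_spectrum(signal):
    n = len(signal)
    spectrum = []
    for u in range(n):
        f = sum(signal[x] * cmath.exp(-2j * math.pi * u * x / n)
                for x in range(n))
        spectrum.append(abs(f) ** 2 / n)
    return spectrum

# Synthetic periodic "small opacity" texture: 4 cycles across the ROI
n = 32
profile = [math.sin(2 * math.pi * 4 * x / n) for x in range(n)]
ps = power_spectrum(profile)
peak_bin = max(range(1, n // 2), key=ps.__getitem__)
```

Periodic micronodular texture concentrates spectral power at a characteristic frequency band, which is what makes the power spectrum a usable discriminating feature.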

  1. Methods in algorithmic analysis

    CERN Document Server

    Dobrushkin, Vladimir A


    …helpful to any mathematics student who wishes to acquire a background in classical probability and analysis … This is a remarkably beautiful book that would be a pleasure for a student to read, or for a teacher to make into a year's course.-Harvey Cohn, Computing Reviews, May 2010

  2. Development and validation of a stability-indicating RP-UPLC method for the quantitative analysis of nabumetone in tablet dosage form. (United States)

    Sethi, Neha; Anand, Ankit; Chandrul, Kaushal K; Jain, Garima; Srinivas, Kona S


    High efficiency and short run times are the basic requirements of high-speed chromatographic separations. To fulfill these requirements, a new separation technique, ultra-performance liquid chromatography (UPLC), has shown promising developments. A rapid, specific, sensitive, and precise reversed-phase UPLC method was developed for the determination of nabumetone in tablet dosage form. In this work, a new isocratic chromatographic method is developed that is applicable to assay determination of the active pharmaceutical ingredient. The chromatographic separation is achieved on a Waters Acquity BEH column (100 mm × 2.1 mm i.d., 1.7 µm) within a short run time of 2 min using a mobile phase of 5 mM ammonium acetate-acetonitrile (25:75, v/v) at a flow rate of 0.3 mL/min at ambient temperature. Quantification is achieved with photodiode array detection at 230 nm over the concentration range of 0.05-26 µg/mL. Forced degradation studies are also performed on nabumetone bulk drug samples to demonstrate the stability-indicating power of the UPLC method. A comparison of system performance with conventional high-performance liquid chromatography is made with respect to analysis time, efficiency, and sensitivity. The method is validated according to the ICH guidelines and is applied successfully to the determination of nabumetone in tablets.

  3. A thermogravimetric analysis (TGA) method developed for estimating the stoichiometric ratio of solid-state {alpha}-cyclodextrin-based inclusion complexes

    Energy Technology Data Exchange (ETDEWEB)

    Bai, Yuxiang; Wang, Jinpeng; Bashari, Mohanad; Hu, Xiuting [The State Key Laboratory of Food Science and Technology, School of Food Science and Technology, Jiangnan University, Wuxi 214122 (China)]; Feng, Tao [School of Perfume and Aroma Technology, Shanghai Institute of Technology, Shanghai 201418 (China)]; Xu, Xueming [The State Key Laboratory of Food Science and Technology, School of Food Science and Technology, Jiangnan University, Wuxi 214122 (China)]; Jin, Zhengyu, E-mail: [The State Key Laboratory of Food Science and Technology, School of Food Science and Technology, Jiangnan University, Wuxi 214122 (China)]; Tian, Yaoqi, E-mail: [The State Key Laboratory of Food Science and Technology, School of Food Science and Technology, Jiangnan University, Wuxi 214122 (China)]


    Highlights: • We develop a TGA method for the measurement of the stoichiometric ratio. • A series of formulas is deduced to calculate the stoichiometric ratio. • Four α-CD-based inclusion complexes were successfully prepared. • The developed method is applicable. - Abstract: An approach mainly based on thermogravimetric analysis (TGA) was developed to evaluate the stoichiometric ratio (SR, guest to host) of guest-α-cyclodextrin (guest-α-CD) inclusion complexes (4-cresol-α-CD, benzyl alcohol-α-CD, ferrocene-α-CD and decanoic acid-α-CD). Data obtained from Fourier transform infrared (FT-IR) spectroscopy showed that all the α-CD-based inclusion complexes were successfully prepared in solid-state form. The stoichiometric ratios of α-CD to the respective guests (4-cresol, benzyl alcohol, ferrocene and decanoic acid) determined by the developed method were 1:1, 1:2, 2:1 and 1:2, respectively. These SR data were well corroborated by the previously reported X-ray diffraction (XRD) method and by NMR confirmatory experiments, except that the SR of decanoic acid, with its larger size and longer chain, was not consistent. It is therefore suggested that the TGA-based method is applicable for following the stoichiometric ratio of polycrystalline α-CD-based inclusion complexes with smaller and shorter-chain guests.
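The arithmetic underlying such a TGA-based stoichiometry estimate can be sketched as follows: the mass-loss fraction assigned to the guest, together with the molar masses of guest and host, gives the guest:host molar ratio. The mass-loss fraction used below is a hypothetical illustration, not a value from the paper.

```python
# Hedged sketch of the stoichiometry idea behind a TGA method: convert the
# guest's mass-loss fraction of the complex into a guest:host molar ratio.
# The 18.2 % mass loss below is an illustrative, made-up number.

def stoichiometric_ratio(guest_mass_fraction, m_guest, m_host):
    """Moles of guest per mole of host, from the TGA mass-loss step
    assigned to the guest (host here is alpha-CD, M ~ 972.8 g/mol)."""
    host_fraction = 1.0 - guest_mass_fraction
    return (guest_mass_fraction / m_guest) / (host_fraction / m_host)

M_ACD = 972.8               # g/mol, alpha-cyclodextrin
M_BENZYL_ALCOHOL = 108.14   # g/mol
# A complex losing ~18.2 % of its mass as benzyl alcohol -> SR near 2:1
sr = stoichiometric_ratio(0.182, M_BENZYL_ALCOHOL, M_ACD)
```

In practice the guest's mass-loss step must first be separated from water loss and host decomposition on the thermogram, which is where the paper's deduced formulas come in.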

  4. Nonlinear structural analysis using integrated force method

    Indian Academy of Sciences (India)

    N R B Krishnam Raju; J Nagabhushanam


    Though the use of the integrated force method for linear investigations is well recognised, no efforts had been made to extend this method to nonlinear structural analysis. This paper presents attempts to use this method for analysing nonlinear structures. A general formulation of nonlinear structural analysis is given. Typical highly nonlinear benchmark problems are considered; the characteristic matrices of the elements used in these problems are developed and the structures are then analysed. The results of the analysis are compared with those of the displacement method. It has been demonstrated that the integrated force method is equally viable and efficient compared to the displacement method.

  5. Development of a Univariate Membrane-Based Mid-Infrared Method for Protein Quantitation and Total Lipid Content Analysis of Biological Samples

    Directory of Open Access Journals (Sweden)

    Ivona Strug


    Full Text Available Biological samples present a range of complexities, from homogeneous purified protein to multicomponent mixtures. Accurate qualification of such samples is paramount to downstream applications. We describe the development of an MIR-spectroscopy-based analytical method offering simultaneous protein quantitation (0.25–5 mg/mL) and analysis of total lipid or detergent species, as well as the identification of other biomolecules present in biological samples. The method utilizes a hydrophilic PTFE membrane engineered for the presentation of aqueous samples in a dried format compatible with fast infrared analysis. Unlike classical quantification techniques, the reported method is amino acid sequence independent and thus applicable to complex samples of unknown composition. Compared with existing platforms, this MIR-based method enables direct quantification using minimal sample volume (2 µL); it is well suited where repeat access and limited sample size are critical parameters. Further, accurate results can be derived without specialized training or knowledge of IR spectroscopy. Overall, the simplified application and analysis system provides a more cost-effective alternative to high-throughput IR systems for research laboratories with minimal throughput demands. In summary, the MIR-based system provides a viable alternative to current protein quantitation methods; it also uniquely offers simultaneous qualification of other components, notably lipids and detergents.

  6. Development of liquid chromatography methods coupled to mass spectrometry for the analysis of substances with a wide variety of polarity in meconium. (United States)

    Meyer-Monath, Marie; Chatellier, Claudine; Cabooter, Deirdre; Rouget, Florence; Morel, Isabelle; Lestremau, Francois


    Meconium is the first fecal excretion of newborns. This complex, accumulative matrix allows the exposure of the fetus to xenobiotics during the last six months of pregnancy to be assessed. To determine the potential effects of fetal exposure to micropollutants from this matrix, robust and sensitive analytical methods must be developed. This article describes the development of liquid chromatography methods coupled to triple quadrupole mass spectrometry for relevant pollutants. The 28 selected target compounds had different physico-chemical properties, from very polar (glyphosate) to non-polar molecules (pyrethroids). Tests were performed with three different types of columns: reversed phase, ion exchange and HILIC. As a single method could not be devised for the simultaneous analysis of all compounds, three columns were selected and suitable chromatographic methods were optimized for each. Similar results were observed for the separation of the target compounds dissolved in either meconium extract or solvent for the reversed phase and ion exchange columns. However, for HILIC, the matrix had a significant influence on the peak shape and robustness of the method. Finally, the analytical methods were applied to "real" meconium samples.

  7. Developments in CTG analysis. (United States)

    Van Geijn, H P


    FHR monitoring has been the subject of many debates. The technique, in itself, can be considered to be accurate and reliable both in the antenatal period, when using the Doppler signal in combination with autocorrelation techniques, and during the intrapartum period, in particular when the FHR signal can be obtained from a fetal ECG electrode placed on the presenting part. The major problems with FHR monitoring relate to the reading and interpretation of the CTG tracings. Since the FHR pattern is primarily an expression of the activity of the control by the central and peripheral nervous system over cardiovascular haemodynamics, it is possibly too indirect a signal. In other specialities such as neonatology, anaesthesiology and cardiology, monitoring and graphic display of heart rate patterns have not gained wide acceptance among clinicians. Digitized archiving, numerical analysis and even more advanced techniques, as described in this chapter, have primarily found a place in obstetrics. This can be easily explained, since the obstetrician is fully dependent on indirectly collected information regarding the fetal condition, such as (a) movements experienced by the mother, observed with ultrasound or recorded with kinetocardiotocography (Schmidt, 1994), (b) perfusion of various vessels, as assessed by Doppler velocimetry, (c) the amount of amniotic fluid or (d) changes reflected in the condition of the mother, such as the development of gestation-induced hypertension and (e) the easily, continuously obtainable FHR signal. It is of particular comfort to the obstetrician that a normal FHR tracing reliably predicts the birth of the infant in a good condition, which makes cardiotocography so attractive for widespread application. However, in the intrapartum period, many traces cannot fulfil the criteria of normality, especially in the second stage. In this respect, cardiotocography remains primarily a screening and not so much a diagnostic method. As long as continuous

  8. Development of a potentiometric EDTA method for determination of molybdenum. Use of the analysis for molybdenite concentrates (United States)

    Khristova, R.; Vanmen, M.


    Based on theoretical considerations and experimental data, the interference of sulfate ions in the potentiometric EDTA titration with FeCl3 was confirmed. The back-complexometric titration method for molybdenum of Nonova and Gasheva was improved by replacing hydrazine sulfate with hydrazine hydrochloride for the reduction of Mo(VI) to Mo(V). The method can be used for amounts of molybdenum from about one milligram down to tenths of a milligram, with a standard deviation of 0.04 mg. A specific procedure for the determination of molybdenum in molybdenite concentrates is presented.

  9. Development of a dynamic headspace gas chromatography-mass spectrometry method for on-site analysis of sulfur mustard degradation products in sediments. (United States)

    Magnusson, R; Nordlander, T; Östin, A


    Sampling teams performing work at sea in areas where chemical munitions may have been dumped require rapid and reliable analytical methods for verifying sulfur mustard leakage from suspected objects. Here we present such an on-site analysis method based on dynamic headspace GC-MS for analysis of five cyclic sulfur mustard degradation products that have previously been detected in sediments from chemical weapon dumping sites: 1,4-oxathiane, 1,3-dithiolane, 1,4-dithiane, 1,4,5-oxadithiephane, and 1,2,5-trithiephane. An experimental design involving authentic Baltic Sea sediments spiked with the target analytes was used to develop an optimized protocol for sample preparation, headspace extraction and analysis that afforded recoveries of up to 60-90%. The optimized method needs no organic solvents, uses only two grams of sediment on a dry weight basis and involves a unique sample presentation whereby sediment is spread uniformly as a thin layer inside the walls of a glass headspace vial. The method showed good linearity for analyte concentrations of 5-200 ng/g dw, good repeatability, and acceptable carry-over. The method's limits of detection for spiked sediment samples ranged from 2.5 to 11 μg/kg dw, with matrix interference being the main limiting factor. The instrumental detection limits were one to two orders of magnitude lower. Full-scan GC-MS analysis enabled the use of automated mass spectral deconvolution for rapid identification of target analytes. Using this approach, analytes could be identified in spiked sediment samples at concentrations down to 13-65 μg/kg dw. On-site validation experiments conducted aboard the research vessel R/V Oceania demonstrated the method's practical applicability, enabling the successful identification of four cyclic sulfur mustard degradation products at concentrations of 15-308 μg/kg in sediments immediately after being collected near a wreck at the Bornholm Deep dumpsite in the Baltic Sea.
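The spike-recovery and detection-limit figures quoted above come from routine validation arithmetic, which can be sketched as follows. The replicate values and the 3-sigma LOD convention are illustrative assumptions, not the authors' data:

```python
import statistics

# Hypothetical spike-recovery experiment: a known amount is added to
# sediment and replicate determinations are made.
spiked_ng_per_g = 50.0                       # amount added (ng/g dw)
measured = [41.0, 43.5, 39.8, 42.2, 40.6]    # replicate results (ng/g dw)

recovery_pct = [m / spiked_ng_per_g * 100 for m in measured]
mean_recovery = statistics.mean(recovery_pct)

# A common LOD convention: 3 x standard deviation of replicate
# low-level spikes, in concentration units.
lod = 3 * statistics.stdev(measured)

print(f"mean recovery {mean_recovery:.1f}%, LOD ~{lod:.1f} ng/g dw")
```

The same calculation repeated per analyte and per matrix yields the recovery ranges and matrix-limited LODs reported in the abstract.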

  10. Studies on application of neutron activation analysis -Applied research on air pollution monitoring and development of analytical method of environmental samples

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Yong Sam; Moon, Jong Hwa; Chung, Young Ju; Jeong, Eui Sik; Lee, Sang Mi; Kang, Sang Hun; Cho, Seung Yeon; Kwon, Young Sik; Chung, Sang Wuk; Lee, Kyu Sung; Chun, Ki Hong; Kim, Nak Bae; Lee, Kil Yong; Yoon, Yoon Yeol; Chun, Sang Ki


    This research report presents the results of applied research on air pollution monitoring using instrumental neutron activation analysis. For identification and standardization of the analytical method, 24 environmental samples were analyzed quantitatively, and the accuracy and precision of the method were measured. Using airborne particulate matter and a biomonitor chosen as environmental indicators, trace element concentrations in samples collected monthly at urban and rural sites were determined, and then statistical calculations and factor analysis were carried out to investigate emission sources. Facilities for NAA were installed in the new HANARO reactor, and functional tests were performed for routine operation. In addition, a unified software code for NAA was developed to improve the accuracy, precision and capabilities of the analytical processes. (author). 103 refs., 61 tabs., 19 figs.
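The factor-analysis step for source identification can be illustrated with a principal-component sketch: elements that co-vary month to month load on the same factor and are attributed to a common emission source. The element table below is entirely synthetic (two invented source factors), not data from the report:

```python
import numpy as np

# Toy element-concentration table (rows: monthly samples, columns:
# elements). Two hidden "sources" drive the four elements.
rng = np.random.default_rng(0)
soil = rng.normal(1.0, 0.1, size=(12, 1))       # crustal factor
traffic = rng.normal(1.0, 0.1, size=(12, 1))    # anthropogenic factor
# Columns: Al, Fe follow the soil factor; Pb, Zn follow traffic.
X = np.hstack([soil * [5.0, 3.0], traffic * [2.0, 4.0]])

Xc = X - X.mean(axis=0)                          # mean-centre
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)  # PCA via SVD
explained = s**2 / np.sum(s**2)                  # variance per component

print("variance explained by first two factors:", explained[:2].sum())
```

Because the table is driven by exactly two latent factors, two components capture essentially all variance; with real monitoring data the number of meaningful factors suggests the number of distinct emission sources.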

  11. Development and validation of stability-indicating TLC-densitometric method for determination of betaxolol with LC-ESI/MS analysis of degradation product. (United States)

    Kwiecień, Anna; Krzek, Jan; Walczak, Maria; Mazur, Mateusz


    The purpose of this work was to develop a sensitive stability-indicating TLC-densitometric method for the determination of betaxolol (Bx) in pharmaceutical preparations and to study the stability of Bx in acidic solutions. The method was developed on TLC aluminium plates precoated with silica gel F254, using the mobile phase chloroform-methanol-25% ammonia (18:4:0.2, v/v/v), which gives compact spots for Bx (R(f) approximately 0.64) and its degradation product (R(f) approximately 0.39). Densitometric analysis was carried out in UV at 280 nm. The developed method is highly sensitive (LOD = 66.6 ng/spot, LOQ = 200 ng/spot), precise (RSD = 2.73%) and accurate (mean recovery = 100.28% at the 100% level). Bx was subjected to acidic and alkaline hydrolysis, but degradation was observed only in acidic solutions. The degradation process was described with kinetic and thermodynamic parameters. Based on LC-ESI/MS analysis, it was found that Bx decomposes in acidic solution to produce ethoxyphenoxy-3-[(1-methylethyl)amino]propan-2-ol.
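The kinetic parameters mentioned above are typically derived from first-order degradation arithmetic, sketched here with invented numbers (not the betaxolol results):

```python
import math

# Hypothetical first-order degradation: 80% of the drug remains
# after 2 hours of acid hydrolysis.
c0, c_t, t_h = 100.0, 80.0, 2.0

k = math.log(c0 / c_t) / t_h      # first-order rate constant (1/h)
t_half = math.log(2) / k          # half-life (h)
t90 = math.log(100 / 90) / k      # time to 90% remaining (shelf-life style)

print(f"k = {k:.4f} 1/h, t1/2 = {t_half:.2f} h, t90 = {t90:.2f} h")
```

Repeating the fit at several temperatures and applying the Arrhenius relation then yields the thermodynamic (activation) parameters the abstract refers to.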


    Directory of Open Access Journals (Sweden)

    Ivan Valeriu


    Full Text Available Keeping a company among the top performers in the relevant market depends not only on its ability to develop continually, sustainably and in a balanced manner, to the standards set by customers and the competition, but also on its ability to protect its strategic information and to know the strategic information of the competition in advance. In addition, given that economic markets, regardless of their profile, enable interconnection not only among domestic companies but also between domestic and foreign companies, the issue of economic competition moves from national economies to the field of interest of regional and international economic organizations. The stake for each economic player is to keep ahead of the competition and to be always prepared to face market challenges. Therefore, it needs to know as early as possible how to react to competitors' strategies in terms of research, production and sales. If a competitor is planning to produce more and more cheaply, then the company must be prepared to counteract this move quickly. Competitive intelligence helps to evaluate the capabilities of competitors in the market, legally and ethically, and to develop response strategies. One of the main goals of competitive intelligence is early warning and the prevention of surprises that could have a major impact on a company's market share, reputation, turnover and profitability in the medium and long term. This paper presents some aspects of competitive intelligence, mainly in terms of information analysis and intelligence generation. The presentation is theoretical and addresses a structured method of information analysis - the scenarios method - in a version that combines several types of analysis in order to reveal some interconnected aspects of the factors governing the activity of a company.

  13. Development of methods for multiresidue analysis of rice post-emergence herbicides in loam soil and their possible applications to soils of different composition. (United States)

    Niell, Silvina; Pareja, Lucia; Asteggiante, Lucía Geis; Roehrs, Rafael; Pizzutti, Ionara R; García, Claudio; Heinzen, Horacio; Cesio, María Verónica


    Two simple and straightforward sample preparation methods were developed for the multiresidue analysis, in loam soil, of post-emergence herbicides commonly used in rice cultivation. Several soil extraction and cleanup strategies were evaluated. The instrumental analysis was performed by HPLC with a diode array detector. The best compromise between good recoveries (69-98%) and repeatability (low RSDs) was obtained when the herbicides, including clomazone, were analyzed simultaneously. Quinclorac and bispyribac sodium were also assayed, but their recoveries were below 50%. Both methods had an LOD of 0.7 microg/kg and could accurately determine the residues at the 2 microg/kg level. These two methods could not be applied directly to other soil types, as the recoveries strongly depended on the soil composition. The developed methodologies were successfully applied in monitoring 87 real-world soil samples, in which only propanil (6 to 12 microg/kg) and clomazone (15 to 20 microg/kg) residues could be detected.

  14. Analysis of human serum and whole blood for mineral content by ICP-MS and ICP-OES: development of a mineralomics method. (United States)

    Harrington, James M; Young, Daniel J; Essader, Amal S; Sumner, Susan J; Levine, Keith E


    Minerals are inorganic compounds that are essential to the support of a variety of biological functions. Understanding the range and variability of the content of these minerals in biological samples can provide insight into the relationships between mineral content and the health of individuals. In particular, abnormal mineral content may serve as an indicator of illness. The development of robust, reliable analytical methods for the determination of the mineral content of biological samples is essential to developing biological models for understanding the relationship between minerals and illnesses. This paper describes a method for the analysis of the mineral content of small volumes of serum and whole blood samples from healthy individuals. Interday and intraday precision for the mineral content of the blood (250 μL) and serum (250 μL) samples was measured for eight essential minerals--sodium (Na), calcium (Ca), magnesium (Mg), potassium (K), iron (Fe), zinc (Zn), copper (Cu), and selenium (Se)--by plasma spectrometric methods and ranged from 0.635 to 10.1% relative standard deviation (RSD) for serum and 0.348-5.98% for whole blood. A comparison of the determined ranges for ten serum samples and six whole blood samples provided good agreement with literature reference ranges. The results demonstrate that the digestion and analysis methods can be used to reliably measure the content of these minerals and potentially of other minerals.
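The interday/intraday precision figures quoted (0.635-10.1% RSD) are relative standard deviations of replicate determinations; the calculation is simple enough to sketch with invented replicate values (not the paper's data):

```python
import statistics

# Hypothetical replicate Fe results for one serum sample (mg/L),
# e.g. the same sample measured on five different days.
fe_serum = [1.02, 1.05, 0.99, 1.03, 1.01]

mean = statistics.mean(fe_serum)
rsd_pct = statistics.stdev(fe_serum) / mean * 100   # interday precision

print(f"mean = {mean:.3f} mg/L, RSD = {rsd_pct:.2f}%")
```

An RSD in this low single-digit range, per mineral and per matrix, is what the abstract reports as acceptable method precision.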

  15. Analysis of Human Serum and Whole Blood for Mineral Content by ICP-MS and ICP-OES: Development of a Mineralomics Method (United States)

    Harrington, James M.; Young, Daniel J.; Essader, Amal S.; Sumner, Susan J.; Levine, Keith E.


    Minerals are inorganic compounds that are essential to the support of a variety of biological functions. Understanding the range and variability of the content of these minerals in biological samples can provide insight into the relationships between mineral content and the health of individuals. In particular, abnormal mineral content may serve as an indicator of illness. The development of robust, reliable analytical methods for the determination of the mineral content of biological samples is essential to developing biological models for understanding the relationship between minerals and illnesses. This manuscript describes a method for the analysis of the mineral content of small volumes of serum and whole blood samples from healthy individuals. Interday and intraday precision for the mineral content of the blood (250 μl) and serum (250 μl) samples was measured for eight essential minerals, sodium (Na), calcium (Ca), magnesium (Mg), potassium (K), iron (Fe), zinc (Zn), copper (Cu), and selenium (Se) by plasma spectrometric methods and ranged from 0.635 – 10.1% relative standard deviation (RSD) for serum and 0.348 – 5.98% for whole blood. A comparison of the determined ranges for ten serum samples and six whole blood samples provided good agreement with literature reference ranges. The results demonstrate that the digestion and analysis methods can be used to reliably measure the content of these minerals, and potentially of other minerals. PMID:24917052

  16. Development of an SDS-gel electrophoresis method on SU-8 microchips for protein separation with LIF detection: Application to the analysis of whey proteins. (United States)

    Del Mar Barrios-Romero, Maria; Crevillén, Agustín G; Diez-Masa, José Carlos


    This work describes the development of an SDS-gel electrophoresis method for the analysis of major whey proteins (α-lactalbumin, β-lactoglobulin, and BSA) carried out in SU-8 microchips. The method uses a low-viscosity solution of dextran as a sieving polymer. A commercial coating agent (EOTrol LN) was added to the separation buffer to control the EOF of the chips. The potential of this coating agent to prevent protein adsorption on the walls of the SU-8 channels was also evaluated. Additionally, the fluorescence background of the SU-8 material was studied to improve the sensitivity of the method. By selecting an excitation wavelength of 532 nm at which the background fluorescence remains low and by replacing the mercury arc lamp by a laser in the detection system, an LOD in the nanomolar range was achieved for proteins derivatized with the fluorogenic reagent Chromeo P540. Finally, the method was applied to the analysis of milk samples, demonstrating the potential of SU-8 microchips for the analysis of proteins in complex food samples.


    Directory of Open Access Journals (Sweden)

    Bhoomi P. Shah, Suresh Jain, Krishna K. Prajapati and Nasimabanu Y. Mansuri


    Full Text Available High-performance liquid chromatography is one of the most accurate methods widely used for the quantitative as well as qualitative analysis of drug products and is used for determining drug product stability. Stability-indicating HPLC methods are used to separate the various drug-related impurities that are formed during the synthesis or manufacture of a drug product. This article discusses the strategies and issues regarding the development of a stability-indicating HPLC method for a drug substance. A number of key chromatographic factors were evaluated in order to optimize the detection of all potentially relevant degradants. The method should be carefully examined for its ability to distinguish the primary drug components from the impurities. New chemical entities and drug products must undergo forced degradation studies, which are helpful in developing and demonstrating the specificity of such stability-indicating methods. Practical recommendations are provided for every stage of drug development, which will help to avoid failures.

  18. Comparative urine analysis by liquid chromatography-mass spectrometry and multivariate statistics : Method development, evaluation, and application to proteinuria

    NARCIS (Netherlands)

    Kemperman, Ramses F. J.; Horvatovich, Peter L.; Hoekman, Berend; Reijmers, Theo H.; Muskiet, Frits A. J.; Bischoff, Rainer


    We describe a platform for the comparative profiling of urine using reversed-phase liquid chromatography-mass spectrometry (LC-MS) and multivariate statistical data analysis. Urinary compounds were separated by gradient elution and subsequently detected by electrospray Ion-Trap MS. The lower limit o

  19. Ethnographic Contributions to Method Development

    DEFF Research Database (Denmark)

    Leander, Anna


    Contrary to common assumptions, there is much to be learned about methods from constructivist/post-structuralist approaches to International Relations (IR) broadly speaking. This article develops this point by unpacking the contributions of one specific method—ethnography—as used in one subfield of IR—Critical Security Studies. Ethnographic research works with what has been termed a “strong” understanding of objectivity. When this understanding is taken seriously, it must lead to a refashioning of the processes of gathering, analyzing, and presenting data in ways that reverse many standard...

  20. Development of seismic sloshing analysis method of liquid coolant sodium in the KALIMER reactor vessel including several cylindrical components

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jae Han; Yoo, Bong


    It is important to establish a highly accurate technique for evaluating the sloshing behavior of the liquid sodium coolant during an earthquake, to ensure the structural integrity of the KALIMER reactor vessel and internals. An analysis procedure for sloshing behavior was established using the finite element computer program ANSYS, and the effectiveness of the procedure was confirmed by comparison with theoretical and experimental results from the literature; the analysis results agree well with the experimental ones. Based on this procedure, the sloshing characteristics of the liquid sodium coolant in the KALIMER reactor vessel, including the reactor internal components, were evaluated. The maximum response height of the sodium free surface at the reactor vessel is about 55 cm when subjected to a horizontal safe shutdown earthquake (SSE) of 0.3 g.
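As a point of reference, finite element sloshing results of this kind are commonly checked against the classical linear-theory natural frequencies for a rigid upright cylindrical tank (a general textbook formula, not a value from the report):

```latex
\omega_n^2 \;=\; \frac{g\,\xi_n}{R}\,
\tanh\!\left(\frac{\xi_n H}{R}\right),
\qquad \xi_1 \approx 1.8412
```

where R is the tank radius, H the liquid depth, g the gravitational acceleration, and \xi_n the n-th root of J_1'(\xi) = 0; internal components and vessel flexibility shift these frequencies, which is why a dedicated FE procedure is needed.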

  1. Development of liquid chromatography-tandem mass spectrometry method for analysis of polyphenolic compounds in liquid samples of grape juice, green tea and coffee. (United States)

    Sapozhnikova, Yelena


    A simple and fast method for the analysis of a wide range of polyphenolic compounds in juice, tea, and coffee samples was developed using liquid chromatography-tandem mass spectrometry (LC-MS/MS). The method was based on a simple sample preparation "dilute and shoot" approach, and LC-MS/MS quantification using genistein-d4 as an internal standard. The performance of six different syringeless filter devices was tested for sample preparation. The method was evaluated for recoveries of polyphenols at three spiking levels in juice, tea, and coffee samples. The recoveries of the majority of polyphenols were satisfactory (70-120%), but some varied significantly (20-138%) depending on the matrix. NIST Standard Reference Materials (SRM) 3257 Catechin Calibration Solutions and 3255 Camellia sinensis (Green Tea) Extract with certified concentrations of catechin and epicatechin were used for method validation. The measurement accuracy in two SRMs was 71-113%. The method was successfully applied to the analysis of liquid samples of grape juice, green tea, and coffee.
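The internal-standard quantification step (with genistein-d4 as IS) works on the analyte/IS peak-area ratio so that injection and matrix variations cancel. A minimal sketch with invented calibration numbers, not the paper's data:

```python
import numpy as np

# Hypothetical calibration standards: known analyte concentrations and
# the measured analyte/IS (genistein-d4) peak-area ratios.
conc = np.array([0.1, 0.5, 1.0, 5.0, 10.0])            # ug/mL
area_ratio = np.array([0.05, 0.26, 0.51, 2.54, 5.02])  # analyte / IS

slope, intercept = np.polyfit(conc, area_ratio, 1)

def concentration(sample_ratio):
    """Concentration of an unknown from its analyte/IS area ratio."""
    return (sample_ratio - intercept) / slope

print(f"unknown with ratio 1.25 -> {concentration(1.25):.2f} ug/mL")
```

Because every sample and standard carries the same amount of deuterated IS, a "dilute and shoot" sample can be quantified without recovery correction as long as the IS tracks the analyte.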

  2. Quality by Design approach in the development of hydrophilic interaction liquid chromatographic method for the analysis of iohexol and its impurities. (United States)

    Jovanović, Marko; Rakić, Tijana; Tumpa, Anja; Jančić Stojanović, Biljana


    This study presents the development of a hydrophilic interaction liquid chromatographic method for the analysis of iohexol, its endo-isomer and three impurities following the Quality by Design (QbD) approach. The main objective of the method was to identify the conditions where adequate separation quality in minimal analysis duration could be achieved within a robust region that guarantees the stability of method performance. The relationship between critical process parameters (acetonitrile content in the mobile phase, pH of the water phase and ammonium acetate concentration in the water phase) and critical quality attributes was established by applying design of experiments methodology. The defined mathematical models and Monte Carlo simulation were used to evaluate the risk of uncertainty in model predictions and of imprecision in adjusting the process parameters, and to identify the design space. The borders of the design space were experimentally verified, confirming that the quality of the method is preserved in this region. Moreover, Plackett-Burman design was applied for experimental robustness testing and the method was fully validated to verify the adequacy of the selected optimal conditions: analytical column ZIC HILIC (100 mm × 4.6 mm, 5 μm particle size); mobile phase consisting of acetonitrile-water phase (72 mM ammonium acetate, pH adjusted to 6.5 with glacial acetic acid) (86.7:13.3) v/v; column temperature 25 °C; mobile phase flow rate 1 mL min(-1); detection wavelength 254 nm.
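The Monte Carlo design-space idea can be sketched as follows: propagate small random errors in setting the critical process parameters through a fitted response model and estimate the probability that the acceptance criterion is still met. The response model, error sizes and criterion below are invented placeholders, not the authors' fitted model:

```python
import random

random.seed(1)

def resolution(acn_pct, ph, buffer_mm):
    """Hypothetical fitted model: separation quality vs. the three
    critical process parameters, centred on the working point."""
    return (3.0 - 0.08 * abs(acn_pct - 86.7)
                - 0.5 * abs(ph - 6.5)
                - 0.01 * abs(buffer_mm - 72))

n, ok = 20000, 0
for _ in range(n):
    # random errors in adjusting each parameter around its set point
    acn = random.gauss(86.7, 0.5)
    ph = random.gauss(6.5, 0.05)
    buf = random.gauss(72.0, 2.0)
    if resolution(acn, ph, buf) >= 2.0:   # acceptance criterion
        ok += 1

print(f"probability of meeting the criterion: {ok / n:.3f}")
```

A working point qualifies for the design space when this probability stays above a chosen threshold (often 90-99%); scanning the factor grid with the same test maps out the robust region.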

  3. High-performance liquid chromatography analysis methods developed for quantifying enzymatic esterification of flavonoids in ionic liquids

    DEFF Research Database (Denmark)

    Lue, Bena-Marie; Guo, Zheng; Xu, X.B.


    Methods using reversed-phase high-performance liquid chromatography (RP-HPLC) with ELSD were investigated to quantify enzymatic reactions of flavonoids with fatty acids in the presence of diverse room temperature ionic liquids (RTILs). A buffered salt (preferably triethylamine-acetate) was found...

  4. Neurocognitive Pattern Analysis of an Auditory and Visual Numeric Motor Control Task. Part 1. Development of Methods. (United States)


    of stereophotogrammetry developed for orthodontic use. A series of stereo pictures of the participant’s head with electrode cap in place was taken...Prossalentis, A., Bordas-Ferrer, M., Covello, L., Iacob, M. and Mempel, E., Atlas of Stereotaxic Anatomy of the Telencephalon: Anatomo-Radiological

  5. Method development for the redox speciation analysis of iron by ion chromatography-inductively coupled plasma mass spectrometry and carryover assessment using isotopically labeled analyte analogues. (United States)

    Wolle, Mesay Mulugeta; Fahrenholz, Timothy; Rahman, G M Mizanur; Pamuku, Matt; Kingston, H M 'Skip'; Browne, Damien


    An ion chromatography-inductively coupled plasma mass spectrometry (IC-ICP-MS) method was developed for the redox speciation analysis of iron (Fe) based on in-column complexation of Fe(2+) and Fe(3+) by dipicolinic acid (DPA). The effects of column type, mobile phase composition and molecular ion interference were studied in the method optimization. The carryover of the target species in the IC-ICP-MS method was uniquely and effectively evaluated using isotopically enriched analogues of the analytes ((54)Fe(2+) and (57)Fe(3+)). Standard solutions of the enriched standards were injected into the system following analysis of a sample, and the ratios of the isotopes of iron in the enriched standards were calculated based on the chromatographic peak areas. The concentrations of the analytes carried over from the sample to the enriched standards were determined using the quantitative relationship in isotope dilution mass spectrometry (IDMS). In contrast to the routine way of evaluating carryover effect by injecting a blank solution after sample analysis, the use of isotopically enriched standards identified significant analyte carryover in the present method. Extensive experiments were carried out to systematically identify the source of the carryover and to eliminate the problem; the separation column was found to be the exclusive source. More than 95% of the analyte carryover was eliminated by reducing the length of the column. The detection limit of the IC-ICP-MS method (MDL) for the iron species was 2 ng g(-1). The method was used to determine Fe(2+) and Fe(3+) in synthetic aqueous standard solutions and a beverage sample.
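The isotope-dilution carryover estimate rests on a two-component isotope balance: natural-abundance iron carried over from the previous sample shifts the measured 56Fe/54Fe ratio of the enriched standard toward the natural ratio. A simplified sketch with invented ratios and amounts (not the paper's values):

```python
# Two-component isotope balance for carryover estimation.
R_nat = 91.75 / 5.85     # natural 56Fe/54Fe abundance ratio (~15.7)
R_spike = 0.02           # 56Fe/54Fe in the enriched 54Fe standard alone
R_measured = 0.35        # ratio observed when the spike is injected
                         # right after a natural-abundance sample
amount_spike_ng = 100.0  # 54Fe injected with the enriched standard

# Mixing balance: R_m = (R_spike*S + R_nat*N) / (S + N), solved for the
# carried-over natural 54Fe amount N given the spike amount S:
carryover_ng = amount_spike_ng * (R_measured - R_spike) / (R_nat - R_measured)

print(f"estimated carryover: {carryover_ng:.2f} ng (as 54Fe)")
```

A blank injection would miss analyte retained on the column, whereas the shifted isotope ratio of the enriched standard reveals it directly, which is the advantage the abstract describes.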

  6. Development and validation of a LC-MS/MS method for quantitative analysis of uraemic toxins p-cresol sulphate and indoxyl sulphate in saliva. (United States)

    Giebułtowicz, Joanna; Korytowska, Natalia; Sankowski, Bartłomiej; Wroczyński, Piotr


    p-Cresol sulphate (pCS) and indoxyl sulphate (IS) are uraemic toxins whose concentrations in serum correlate with the stage of renal failure. The aim of this study was to develop and validate a high-performance liquid chromatography-tandem mass spectrometry (LC-MS/MS) method for the analysis of pCS and IS in saliva. This is the first time, to our knowledge, that such a method has been developed using saliva. Unstimulated, fasting saliva was collected from healthy volunteers in the morning and pooled for the validation assay. The method was validated for linearity, precision, accuracy, stability (freeze/thaw stability, stability in the autosampler, short- and long-term stability, stock solution stability), dilution integrity and matrix effect. The analysed validation criteria were fulfilled. Neither salivary flow (pCS: p=0.678; IS: p=0.238) nor the type of swab in the Salivette device influenced the results. Finally, using the novel validated method, saliva samples from healthy people (n=70) of various ages were analysed. We observed a tendency for salivary toxin concentrations to increase in the elderly. This could be a result of age-related diseases, e.g., diabetes, and of declining kidney function. We conclude that the novel LC-MS/MS method can be used for the determination of pCS and IS in human saliva. The results encourage the validation of saliva as a clinical sample for monitoring toxin levels.

  7. Development of methods for body composition studies

    Energy Technology Data Exchange (ETDEWEB)

    Mattsson, Soeren [Department of Radiation Physics, Lund University, Malmoe University Hospital, SE-205 02 Malmoe (Sweden); Thomas, Brian J [School of Physical and Chemical Sciences, Queensland University of Technology, Brisbane, QLD 4001 (Australia)


    This review is focused on experimental methods for determination of the composition of the human body, its organs and tissues. It summarizes the development and current status of fat determinations from body density, total body water determinations through the dilution technique, whole and partial body potassium measurements for body cell mass estimates, in vivo neutron activation analysis for body protein measurements, dual-energy absorptiometry (DEXA), computed tomography (CT) and magnetic resonance imaging (MRI, fMRI) and spectroscopy (MRS) for body composition studies on tissue and organ levels, as well as single- and multiple-frequency bioimpedance (BIA) and anthropometry as simple easily available methods. Methods for trace element analysis in vivo are also described. Using this wide range of measurement methods, together with gradually improved body composition models, it is now possible to quantify a number of body components and follow their changes in health and disease. (review)

  8. Development of genetic diagnosing method for diabetes and cholecystitis based on gene analysis of CCK-A receptor

    Energy Technology Data Exchange (ETDEWEB)

    Kono, Akira [National Kyushu Cancer Center, Fukuoka (Japan)


    Based on the gene analysis of the cholecystokinin type A receptor (CCKAR) from normal mouse and its sequence analysis in the previous year, a CCKAR knock-out gene was constructed which allows mRNA expression of the {beta}-galactosidase gene instead of the CCKAR gene. Since abnormalities in the CCKAR gene are thought to be a causal factor in diabetes and cholecystitis, a knock-out mouse that expressed LacZ but not CCKAR was constructed to investigate the correlation between the clinical features of diabetes and cholecystitis and CCKAR gene abnormalities. F2 mice that had mutations in the CCKAR gene were born according to Mendel's law. The expression of the CCKAR gene was investigated in detail based on the expression of the LacZ gene in various tissues of homozygous (-/-) and heterozygous (-/+) knockout mice. A comparative study of blood sugar level, blood insulin level, the formation of biliary calculus, etc. is underway with wild-type, heterozygous and homozygous knockout mice. (M.N.)

  9. Ultra-fast gradient LC method for omeprazole analysis using a monolithic column: assay development, validation, and application to the quality control of omeprazole enteric-coated pellets. (United States)

    Borges, Keyller Bastos; Sánchez, Antonio José Macías; Pupo, Mônica Tallarico; Bonato, Pierina Sueli; Collado, Isidro González


    A method was optimized for the analysis of omeprazole (OMZ) by ultra-high speed LC with diode array detection using a monolithic Chromolith Fast Gradient RP 18 endcapped column (50 x 2.0 mm id). The analyses were performed at 30 degrees C using a mobile phase consisting of 0.15% (v/v) trifluoroacetic acid (TFA) in water (solvent A) and 0.15% (v/v) TFA in acetonitrile (solvent B) under a linear gradient of 5 to 90% B in 1 min at a flow rate of 1.0 mL/min and detection at 220 nm. Under these conditions, OMZ retention time was approximately 0.74 min. Validation parameters, such as selectivity, linearity, precision, accuracy, and robustness, showed results within the acceptable criteria. The method developed was successfully applied to OMZ enteric-coated pellets, showing that this assay can be used in the pharmaceutical industry for routine QC analysis. Moreover, the analytical conditions established allow for the simultaneous analysis of OMZ metabolites, 5-hydroxyomeprazole and omeprazole sulfone, in the same run, showing that this method can be extended to other matrixes with adequate procedures for sample preparation.

  10. Development of a gas-liquid chromatographic method for the analysis of fatty acid tryptamides in cocoa products. (United States)

    Hug, Bernadette; Golay, Pierre-Alain; Giuffrida, Francesca; Dionisi, Fabiola; Destaillats, Frédéric


    The determination of the occurrence and level of cocoa shells in cocoa products and chocolate is an important analytical issue. The recent European Union directive on cocoa and chocolate products (2000/36/EC) has not retained the former limit of a maximum amount of 5% of cocoa shells in cocoa nibs (based on fat-free dry matter), previously authorized for the elaboration of cocoa products such as cocoa mass. In the present study, we report a reliable gas-liquid chromatography procedure suitable for the determination of the occurrence of cocoa shells in cocoa products by detection of fatty acid tryptamides (FATs). The precision of the method was evaluated by analyzing nine different samples (cocoa liquors with different ranges of shells) six times (replicate repeatability). The variations of the robust coefficient of variation of the repeatability demonstrated that FAT(C22), FAT(C24), and total FATs are good markers for the detection of shells in cocoa products. The trueness of the method was evaluated by determining the FAT content in two spiked matrices (cocoa liquors and cocoa shells) at different levels (from 1 to 50 mg/100 g). A good relation was found between the results obtained and the spiking (recovery varied between 90 and 130%), and the linearity range was established between 1 and 50 mg/100 g in cocoa products. For total FAT contents of cocoa liquor containing 5% shells, the measurement uncertainty allows us to conclude that FAT is equal to 4.01 +/- 0.8 mg/100 g. This validated method is perfectly suitable to determine shell contents in cocoa products using FAT(C22), FAT(C24), and total FATs as markers. The results also confirmed that cocoa shells contain FAT(C24) and FAT(C22) in a constant ratio of nearly 2:1.
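The recovery and shell-content arithmetic behind such a validation can be sketched briefly. This is an illustrative calculation only; the helper names and the linear-through-origin calibration are my assumptions, anchored to the reported 4.01 mg/100 g total FATs at 5% shells:

```python
def recovery_pct(measured, spiked):
    """Spike recovery in percent."""
    return 100.0 * measured / spiked

# Hypothetical spike of 10 mg/100 g total FATs recovered as 10.8 mg/100 g:
assert 90 <= recovery_pct(10.8, 10.0) <= 130  # within the reported acceptance window

def shell_pct_from_fat(total_fat_mg_per_100g, fat_at_5pct=4.01):
    """Rough shell-content estimate, assuming total FAT scales linearly with
    shell fraction through the origin, calibrated on 4.01 mg/100 g at 5% shells."""
    return 5.0 * total_fat_mg_per_100g / fat_at_5pct
```

A measured total FAT of 4.01 mg/100 g then maps back to the 5% shell level used in the study.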

  11. Development of a low-cost method to estimate the seismic signature of a geothermal field from ambient noise analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Tibuleac, Ileana [Univ. of Nevada, Reno, NV (United States)


    A new, cost-effective and non-invasive exploration method using ambient seismic noise has been tested at Soda Lake, NV, with promising results. The material included in this report demonstrates that, with the advantage of initial S-velocity models estimated from ambient-noise surface waves, the ambient-noise reflection survey, although of lower resolution, reproduces the results of the active survey when the ambient seismic noise is not contaminated by strong cultural noise. Ambient-noise resolution is lower at depth (below 1000 m) than that of the active survey. In general, the results are promising and useful information can be recovered from ambient seismic noise, including dipping features and fault locations.
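Although the report gives no implementation details, ambient-noise methods of this kind commonly rest on cross-correlating long noise records from two stations, which concentrates the inter-station travel time at the correlation peak. A toy sketch under strong assumptions (a single shared noise source, a pure sample delay, no attenuation):

```python
import numpy as np

rng = np.random.default_rng(0)
n, delay = 2048, 25            # record length; true inter-station delay in samples
source = rng.standard_normal(n + delay)

sta_a = source[delay:]          # station A sees the wavefield first
sta_b = source[:n]              # station B receives it `delay` samples later

# Cross-correlation over all lags; the peak recovers the travel time.
xc = np.correlate(sta_b, sta_a, mode="full")
lags = np.arange(-(n - 1), n)
estimated_delay = lags[np.argmax(xc)]
print(estimated_delay)  # 25
```

Real surveys stack many such correlations over long time windows so that the coherent surface-wave part emerges above the incoherent noise.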

  12. Development of Chiral LC-MS Methods for Small Molecules and Their Applications in the Analysis of Enantiomeric Composition and Pharmacokinetic Studies

    Energy Technology Data Exchange (ETDEWEB)

    Meera Jay Desai


    The purpose of this research was to develop sensitive LC-MS methods for enantiomeric separation and detection, and then apply these methods to the determination of enantiomeric composition and the study of pharmacokinetic and pharmacodynamic properties of a chiral nutraceutical. Our first study evaluated the use of reversed phase and polar organic modes for chiral LC-API/MS method development. Reversed phase methods containing high water content were found to decrease ionization efficiency in electrospray, while polar organic methods offered good compatibility and low limits of detection with ESI. The use of lower flow rates dramatically increased the sensitivity by an order of magnitude. Additionally, for rapid chiral screening, the coupled Chirobiotic column afforded great applicability for LC-MS method development. Our second study continued chiral LC-MS method development, in this case for the normal phase mode. Ethoxynonafluorobutane (ENFB), a fluorocarbon with low flammability and no flashpoint, was used as a substitute for hexane/heptane mobile phases in LC-APCI/MS. Comparable chromatographic resolutions and selectivities were found using ENFB-substituted mobile phase systems, although peak efficiencies were significantly diminished. Limits of detection were either comparable or better for ENFB-MS over heptane-PDA detection. The miscibility of ENFB with a variety of commonly used organic modifiers provided flexibility in method development. For APCI, lower flow rates did not increase sensitivity as significantly as was previously found for ESI-MS detection. The chiral analysis of native amino acids was evaluated using both APCI and ESI sources. For free amino acids and small peptides, APCI was found to have better sensitivities than ESI at high flow rates. For larger peptides, however, sensitivity was greatly improved with the use of electrospray. Additionally, sensitivity was enhanced with the use of non-volatile additives. This optimized method was then


  14. ARK methods: some recent developments (United States)

    Moir, Nicolette


    Almost Runge-Kutta methods are a sub-class of the family of methods known as general linear methods, used for solving ordinary differential equations. They combine many of the favourable properties of traditional Runge-Kutta methods with some additional advantages. We will introduce these methods, concentrating on methods of order four, and present some recent results.


    Directory of Open Access Journals (Sweden)

    I. V. Zhukovski


    Full Text Available The paper considers the problem of evaluating the efficiency of innovation activity in 63 countries with developed and developing economies using the data envelopment analysis method. The following innovation outputs were used to calculate an efficiency score: export of high-technology products as a percentage of industrial product exports, export of ICT services as a percentage of services exports, and payments received from the licensing of intellectual property rights (in US dollars). An output-oriented data envelopment analysis model with variable returns to scale (output-oriented VRS model), which maximizes the obtained outputs, was used for the analysis. The evaluation showed that countries such as the USA, Israel, Sweden and some others achieve maximum efficiency in transforming resources into innovation output. The analysis also revealed that the Republic of Belarus has potential for improving its innovation performance indices.
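An output-oriented VRS DEA score of the kind described can be computed as a small linear program. A sketch with hypothetical toy data (one input, one output); `output_oriented_vrs` is my naming, and a score phi of 1 marks an efficient unit while phi > 1 gives the feasible proportional expansion of its outputs:

```python
import numpy as np
from scipy.optimize import linprog

def output_oriented_vrs(X, Y, o):
    """Output-oriented VRS DEA score phi for DMU o.
    X: (n, m) inputs, Y: (n, s) outputs. phi = 1 means efficient."""
    n, m = X.shape
    s = Y.shape[1]
    # Decision variables z = [phi, lambda_1..lambda_n]; maximize phi -> minimize -phi.
    c = np.zeros(n + 1)
    c[0] = -1.0
    # Input constraints: sum_j lambda_j * x_ij <= x_io
    A_in = np.hstack([np.zeros((m, 1)), X.T])
    # Output constraints: phi * y_ro - sum_j lambda_j * y_rj <= 0
    A_out = np.hstack([Y[o].reshape(-1, 1), -Y.T])
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.concatenate([X[o], np.zeros(s)])
    # VRS convexity constraint: sum_j lambda_j = 1
    A_eq = np.hstack([[0.0], np.ones(n)]).reshape(1, -1)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0])
    return res.x[0]

# Toy data: one input, one output for four hypothetical countries.
X = np.array([[1.0], [2.0], [3.0], [2.0]])
Y = np.array([[1.0], [4.0], [5.0], [2.0]])
scores = [output_oriented_vrs(X, Y, o) for o in range(4)]
```

The fourth unit (input 2, output 2) gets phi = 2: the frontier peer with the same input produces twice its output.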

  16. [Development of Determination Method of Fluoroquinolone Antibiotics in Sludge Based on Solid Phase Extraction and HPLC-Fluorescence Detection Analysis]. (United States)

    Dai, Xiao-hu; Xue, Yong-gang; Liu, Hua-jie; Dai, Ling-ling; Yan, Han; Li, Ning


    Fluoroquinolone antibiotics (FQs), common pharmaceuticals and personal care products (PPCPs), are widespread in the environment. FQs contained in wastewater are ultimately enriched in sludge, posing a potential threat to subsequent sludge utilization. To optimize an analytical method applicable to the determination of FQs in sludge, the authors selected ofloxacin (OFL), norfloxacin (NOR), ciprofloxacin (CIP) and lomefloxacin (LOM) as the target FQs, and established a method based on cell lysis, FQ extraction with triethylamine/methanol/water solution, solid phase extraction (SPE) and HPLC-fluorescence detection (FLD) determination. Phosphoric acid-triethylamine was selected as the buffer salt, and methanol was chosen as the organic mobile phase. A gradient fluorescence scanning strategy was also shown to be necessary for optimal detection. Furthermore, in designed orthogonal experiments, the effects of the extraction materials, pH and the eluents on the efficiency of SPE extraction were evaluated, from which the optimal extraction conditions were determined. As a result, FQs in liquid samples could be analyzed using an HLB extraction cartridge, and the recovery rates of the four FQs were in the range of 82%-103%. For solid samples, the recovery rates of the four FQs reached 71%-101%. Finally, the adsorptivity of the sludge from the different tanks (anaerobic, anoxic and oxic) was investigated, showing a gradual decrease in adsorption capacity, although all sludges adsorbed over 90% of the FQs. This conclusion also confirmed that 50% removal of FQs in the domestic wastewater treatment plant was realized by sludge adsorption.

  17. Remote sensing analysis of depositional landforms in alluvial settings: Method development and application to the Taquari megafan, Pantanal (Brazil) (United States)

    Zani, Hiran; Assine, Mario Luis; McGlue, Michael Matthew


    Traditional Shuttle Radar Topography Mission (SRTM) topographic datasets hold limited value in the geomorphic analysis of low-relief terrains. To address this shortcoming, this paper presents a series of techniques designed to enhance digital elevation models (DEMs) of environments dominated by low-amplitude landforms, such as a fluvial megafan system. These techniques were validated through the study of a wide depositional tract composed of several megafans located within the Brazilian Pantanal. The Taquari megafan is the most remarkable of these features, covering an area of approximately 49,000 km2. To enhance the SRTM-DEM, the megafan global topography was calculated and found to be accurately represented by a second order polynomial. Simple subtraction of the global topography from altitude produced a new DEM product, which greatly enhanced low amplitude landforms within the Taquari megafan. A field campaign and optical satellite images were used to ground-truth features on the enhanced DEM, which consisted of both depositional (constructional) and erosional features. The results demonstrate that depositional lobes are the dominant landforms on the megafan. A model linking baselevel change, avulsion, clastic sedimentation, and erosion is proposed to explain the microtopographic features on the Taquari megafan surface. The study confirms the promise of enhanced DEMs for geomorphological research in alluvial settings.
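The enhancement step described, subtracting a fitted second-order polynomial surface from the DEM, can be sketched with a least-squares fit. Synthetic data stand in for SRTM here, and the grid size and lobe amplitude are my assumptions:

```python
import numpy as np

def detrend_quadratic(dem):
    """Remove a fitted second-order polynomial surface
    z = a + b*x + c*y + d*x^2 + e*x*y + f*y^2 from a DEM grid,
    returning the residual (local relief) grid."""
    ny, nx = dem.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    x = xx.ravel().astype(float)
    y = yy.ravel().astype(float)
    A = np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])
    coeffs, *_ = np.linalg.lstsq(A, dem.ravel(), rcond=None)
    trend = (A @ coeffs).reshape(dem.shape)
    return dem - trend

# Synthetic "megafan": a smooth quadratic slope plus a 2 m depositional lobe
# that is invisible against the ~25 m of overall elevation fall.
ny, nx = 50, 50
yy, xx = np.mgrid[0:ny, 0:nx]
dem = 200.0 - 0.5 * xx - 0.002 * xx**2 + 0.1 * yy
dem[20:30, 20:30] += 2.0        # low-amplitude lobe
residual = detrend_quadratic(dem)
```

In the residual grid the global slope is gone and the lobe stands out, which is the essence of the enhanced-DEM product described above.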

  18. A Portfolio-Analysis Method for Selecting Armament Development Candidates

    Institute of Scientific and Technical Information of China (English)



    In the overarching design of armament development, it is important to construct and select an armament development scheme from many candidates. A portfolio-analysis method is studied. First, a series of armament development scheme candidates is constructed for detailed assessment. Second, various measures are selected to assess each candidate, including the degree of mission accomplishment, the risk in executing multiple missions, the cost of armament options and the risk in the development process. Finally, the decision maker ranks the candidates according to their objectives and intentions based on the assessment results. A concluding example verifies the portfolio-analysis method for selecting armament development candidates.
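A minimal sketch of the assessment-and-ranking step, using a weighted sum over normalized measures; all candidate names, measure values and weights are hypothetical, and the original paper's aggregation scheme may differ:

```python
# Hypothetical candidate schemes scored on the four measures named above.
candidates = {
    "scheme_A": {"mission_degree": 0.9, "mission_risk": 0.4, "cost": 80.0, "dev_risk": 0.5},
    "scheme_B": {"mission_degree": 0.7, "mission_risk": 0.2, "cost": 55.0, "dev_risk": 0.3},
    "scheme_C": {"mission_degree": 0.8, "mission_risk": 0.3, "cost": 60.0, "dev_risk": 0.2},
}
# Decision-maker weights reflecting objectives and intentions (assumed values).
weights = {"mission_degree": 0.4, "mission_risk": 0.2, "cost": 0.2, "dev_risk": 0.2}

def score(measures):
    """Weighted sum of min-max normalized measures; cost/risk measures inverted."""
    total = 0.0
    for key, w in weights.items():
        vals = [c[key] for c in candidates.values()]
        lo, hi = min(vals), max(vals)
        norm = 0.5 if hi == lo else (measures[key] - lo) / (hi - lo)
        if key != "mission_degree":      # lower risk and cost are better
            norm = 1.0 - norm
        total += w * norm
    return total

ranking = sorted(candidates, key=lambda k: score(candidates[k]), reverse=True)
```

With these weights the balanced scheme_C outranks the high-capability but costly and risky scheme_A; changing the weights changes the ranking, which is exactly the decision-maker lever the abstract describes.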

  19. Development and Validation of a Reliable and Robust Method for the Analysis of Cannabinoids and Terpenes in Cannabis. (United States)

    Giese, Matthew W; Lewis, Mark A; Giese, Laura; Smith, Kevin M


    The requirements for an acceptable cannabis assay have changed dramatically over the years resulting in a large number of laboratories using a diverse array of analytical methodologies that have not been properly validated. Due to the lack of sufficiently validated methods, we conducted a single-laboratory validation study for the determination of cannabinoids and terpenes in a variety of commonly occurring cultivars. The procedure involves high-throughput homogenization to prepare sample extract, which is then profiled for cannabinoids and terpenes by HPLC-diode array detector and GC-flame ionization detector, respectively. Spike recovery studies for terpenes in the range of 0.03-1.5% were carried out with analytical standards, while recovery studies for Δ9-tetrahydrocannabinolic acid, cannabidiolic acid, Δ9-tetrahydrocannabivarinic acid, and cannabigerolic acid and their neutral counterparts in the range of 0.3-35% were carried out using cannabis extracts. In general, accuracy at all levels was within 5%, and RSDs were less than 3%. The interday and intraday repeatabilities of the procedure were evaluated with five different cultivars of varying chemotype, again resulting in acceptable RSDs. As an example of the application of this assay, it was used to illustrate the variability seen in cannabis coming from very advanced indoor cultivation operations.

  20. Novel methods to help develop healthier eating habits for eating and weight disorders: A systematic review and meta-analysis. (United States)

    Turton, Robert; Bruidegom, Kiki; Cardi, Valentina; Hirsch, Colette R; Treasure, Janet


    This paper systematically reviews novel interventions developed and tested in healthy controls that may be able to change the over- or under-controlled eating behaviours in eating and weight disorders. Electronic databases were searched for interventions targeting habits related to eating behaviours (implementation intentions; food-specific inhibition training and attention bias modification). These were assessed in accordance with the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) guidelines. In healthy controls the implementation intention approach produces a small increase in healthy food intake and reduction in unhealthy food intake post-intervention. The size of these effects decreases over time and no change in weight was found. Unhealthy food intake was moderately reduced by food-specific inhibition training and attention bias modification post-intervention. This work may have important implications for the treatment of populations with eating and weight disorders. However, these findings are preliminary as there is a moderate to high level of heterogeneity in implementation intention studies and to date there are few food-specific inhibition training and attention bias modification studies.

  1. The behavioral satiety sequence in pigeons (Columba livia). Description and development of a method for quantitative analysis. (United States)

    Spudeit, William Anderson; Sulzbach, Natalia Saretta; Bittencourt, Myla de A; Duarte, Anita Maurício Camillo; Liang, Hua; Lino-de-Oliveira, Cilene; Marino-Neto, José


    The postprandial event known as the specific dynamic action is an evolutionarily conserved physiological set of metabolic responses to feeding. Its behavioral counterpart, a sequence of drinking, maintenance (e.g., grooming) and sleep-like behaviors known as the behavioral satiety sequence (BSS), has been thoroughly described in rodents and has enabled the refined evaluation of potential appetite modifiers. However, the presence and attributes of a BSS have not been systematically studied in non-mammalian species. Here, we describe the BSS induced in pigeons (Columba livia) by 1) the presentation of a palatable seed mixture (SM) food to free-feeding animals (SM+FF condition) and 2) re-feeding after a 24-h fasting period (FD24h+SM), which was examined by continuous behavioral recording for 2h. We then compare these patterns to those observed in free-feeding (FF) animals. A set of graphic representations and indexes, drawn from these behaviors (latency, time-to-peak, inter-peak intervals and the first intersection between feeding curves and those of other BSS-typical behaviors) were used to describe the temporal structure and sequential relationships between the pigeon's BSS components. Cramér-von Mises-based statistical procedures and bootstrapping-based methods to compare pairs of complex behavioral curves were described and used for comparisons among the behavioral profiles during the free-feeding recordings and after fasting- and SM-induced BSS. FD24h+SM- and SM+FF-induced feeding were consistently followed by a similar sequence of increased bouts of drinking, followed by preening and then sleep, which were significantly different from that of FF birds. The sequential and temporal patterns of the pigeon's BSS were not affected by differences in food intake or by dissimilarity in motivational content of feeding stimuli. The present data indicated that a BSS pattern can be reliably evoked in the pigeon, in a chronological succession and sequence that strongly
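The Cramér-von Mises comparison of behavioural distributions mentioned above can be illustrated with SciPy's two-sample test. The event-time data below are simulated and purely illustrative, not drawn from the study:

```python
import numpy as np
from scipy.stats import cramervonmises_2samp

rng = np.random.default_rng(42)
# Hypothetical times (min) of drinking bouts after feeding in two conditions.
free_feeding = rng.normal(loc=30.0, scale=8.0, size=150)
after_fast   = rng.normal(loc=18.0, scale=8.0, size=150)   # bouts shifted earlier
same_cond    = rng.normal(loc=30.0, scale=8.0, size=150)   # same distribution again

# Two-sample Cramér-von Mises test on each pair of behavioural samples.
res_shift = cramervonmises_2samp(free_feeding, after_fast)
res_same  = cramervonmises_2samp(free_feeding, same_cond)
```

The shifted condition yields a large statistic and a small p-value, while the matched condition does not; in the study this style of comparison (plus bootstrapping) is applied to whole behavioural curves rather than single event-time samples.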

  2. Development of the Method of Bacterial Leaching of Metals out of Low-Grade Ores, Rocks, and Industrial Wastes Using Neutron Activation Analysis

    CERN Document Server

    Tsertsvadze, L A; Petriashvili, Sh G; Chutkerashvili, D G; Kirkesali, E I; Frontasyeva, M V; Pavlov, S S; Gundorina, S F


    The results of preliminary investigations aimed at the development of an economical and easy to apply technique of bacterial leaching of rare and valuable metals out of low-grade ores, complex composition ores, rocks, and industrial wastes in Georgia are discussed. The main groups of the microbiological community of the peat suspension used in the experiments of bacterial leaching are investigated and the activity of particular microorganisms in the leaching of probes with different mineral compositions is assessed. The element composition of the primary and processed samples was investigated by the epithermal neutron activation analysis method and the enrichment/subtraction level is estimated for various elements. The efficiency of the developed technique to purify wastes, extract some scarce metals, and enrich ores or rocks in some elements, e.g. Au, U, Th, Cs, Sr, Rb, Sc, Zr, Hf, Ta, Gd, Er, Lu, Ce, etc., is demonstrated.

  3. Development of a microwave assisted extraction method for the analysis of 2,4,6-trichloroanisole in cork stoppers by SIDA-SBSE-GC-MS. (United States)

    Vestner, Jochen; Fritsch, Stefanie; Rauhut, Doris


    This research focused on replacing the time-consuming soaking of cork stoppers, which is mainly used as a screening method for cork lots in connection with sensory analysis and/or analytical methods to detect releasable 2,4,6-trichloroanisole (TCA) in natural cork stoppers. Releasable TCA from whole cork stoppers was analysed by applying a microwave assisted extraction method (MAE) in combination with stir bar sorptive extraction (SBSE). The soaking of corks (SOAK) was used as a reference method to optimise the MAE parameters. Cork lots of different quality and TCA contamination levels were used to adapt MAE. Pre-tests indicated that MAE at 40 degrees C for 120 min with 90 min of cooling time is suitable to avoid over-extraction of TCA from low and medium tainted cork stoppers in comparison to SOAK. These MAE parameters allow almost the same amounts of releasable TCA to be measured as with the soaking procedure in the relevant range. Stable isotope dilution analysis (SIDA) was applied to optimise quantification of the released TCA with deuterium-labelled TCA (TCA-d(5)) using a time-saving GC-MS technique in single ion monitoring (SIM) mode. The developed MAE method allows releasable TCA from the whole cork stopper to be measured under improved conditions, with low solvent use and a higher sample throughput.
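The isotope-dilution quantification step reduces to a peak-area ratio against the spiked TCA-d5. A sketch of that arithmetic (the function name, peak areas, spike level and unit response factor are my assumptions):

```python
def sida_concentration(area_analyte, area_istd, conc_istd_ng_per_l, response_factor=1.0):
    """TCA concentration by stable isotope dilution: the unlabelled/labelled
    peak-area ratio times the spiked TCA-d5 concentration, corrected by a
    response factor determined from calibration (assumed 1.0 here)."""
    return (area_analyte / area_istd) * conc_istd_ng_per_l / response_factor

# Hypothetical SIM peak areas with 5 ng/L TCA-d5 spiked before extraction:
print(sida_concentration(12000.0, 20000.0, 5.0))  # 3.0
```

Because the labelled standard experiences the same extraction losses as the analyte, the ratio cancels recovery variations, which is the point of SIDA.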

  4. A platform analytical quality by design (AQbD) approach for multiple UHPLC-UV and UHPLC-MS methods development for protein analysis. (United States)

    Kochling, Jianmei; Wu, Wei; Hua, Yimin; Guan, Qian; Castaneda-Merced, Juan


    A platform analytical quality by design (AQbD) approach for method development is presented in this paper. This approach is not limited to method development following the same logical AQbD process; it is also exploited across a range of applications in method development with commonality in equipment and procedures. As demonstrated by the development process of three methods, the systematic strategy offers a thorough understanding of the scientific strengths of each method. The knowledge gained from the UHPLC-UV peptide mapping method can be easily transferred to the UHPLC-MS oxidation method and the UHPLC-UV C-terminal heterogeneity method for the same protein. In addition, the platform AQbD strategy ensures that method robustness is built in during development. In early phases, a good method can generate reliable data for product development, allowing confident decision making. Methods generated following the AQbD approach have great potential for avoiding extensive post-approval analytical method changes. In the commercial phase, high quality data ensure timely data release, reduced regulatory risk and lower lab operational cost. Moreover, the large, reliable database and the knowledge gained during AQbD method development provide strong justification during regulatory filing for the selection of important parameters or parameter changes for method validation, and help to justify the removal of unnecessary tests from product specifications.

  5. Development of the HS-SPME-GC-MS/MS method for analysis of chemical warfare agent and their degradation products in environmental samples. (United States)

    Nawała, Jakub; Czupryński, Krzysztof; Popiel, Stanisław; Dziedzic, Daniel; Bełdowski, Jacek


    After World War II, approximately 50,000 tons of chemical weapons were dumped in the Baltic Sea by the Soviet Union under the provisions of the Potsdam Conference on Disarmament. These dumped chemical warfare agents still pose a major threat to the marine environment and to human life. Therefore, continued monitoring of these munitions is essential. In this work, we present the application of new solid phase microextraction fibers in the analysis of chemical warfare agents and their degradation products. It can be concluded that the best fiber for analysis of sulfur mustard and its degradation products is butyl acrylate (BA), whereas for analysis of organoarsenic compounds and chloroacetophenone the best fiber is a co-polymer of methyl acrylate and methyl methacrylate (MA/MMA). In order to achieve the lowest LOD and LOQ, the samples should be divided into two subsamples: one analyzed using a BA fiber, and the second using a MA/MMA fiber. When fast analysis is required, the microextraction should be performed with a butyl acrylate fiber, because the extraction efficiency of organoarsenic compounds on this fiber is acceptable. Next, we developed an HS-SPME-GC-MS/MS method for the analysis of CWA degradation products in environmental samples using the laboratory-made fibers. The analytical method for organosulfur and organoarsenic compounds was optimized and validated. The LODs for all target chemicals were between 0.03 and 0.65 ppb. The analytical method we developed was then used for the analysis of sediment and pore water samples from the Baltic Sea. During these studies, 80 samples were analyzed. It was found that 25 sediment and 5 pore water samples contained CWA degradation products such as 1,4-dithiane, 1,4-oxathiane or triphenylarsine, the latter being a component of arsine oil. The obtained data are evidence that the CWAs present in the Baltic Sea have leaked into the general marine environment.

  6. A new method for change-point detection developed for on-line analysis of the heart beat variability during sleep (United States)

    Staudacher, M.; Telser, S.; Amann, A.; Hinterhuber, H.; Ritsch-Marte, M.


    We present a novel scaling-dependent measure for time series analysis, the progressive detrended fluctuation analysis (PDFA). Since this method progressively includes and analyzes all data points of the time series, it is suitable for on-line change-point detection: sudden changes in the statistics of the data points, in the type of correlation, in the statistical variance, or both, are reliably indicated and localized in time. This is first shown for numerous artificially generated data sets of Gaussian random numbers. Time series with various non-stationarities, such as non-polynomial trends and “spiking”, are also included as examples. Although generally applicable, our method was specifically developed as a tool for numerical sleep evaluation based on heart rate variability in the ECG channel of polysomnographic whole-night recordings. It is demonstrated that PDFA can detect specific sleep stage transitions, typically ascending transitions involving sympathetic activation such as short episodes of wakefulness, and that the method is capable of discerning between NREM sleep and REM sleep.
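A greatly simplified, progressive detrended-fluctuation-style statistic can illustrate the idea of on-line change-point indication: detrend each growing prefix of the series and track its RMS fluctuation, which bends when the statistics of the incoming points change. This is an illustrative proxy, not the published PDFA algorithm:

```python
import numpy as np

def progressive_detrended_rms(x, min_len=50):
    """For each growing prefix of x, remove a linear trend and record the RMS
    fluctuation of the residual. A sustained change in the statistics of the
    newly included points bends the resulting curve."""
    stats = np.full(len(x), np.nan)
    t = np.arange(len(x), dtype=float)
    for n in range(min_len, len(x) + 1):
        seg, tt = x[:n], t[:n]
        slope, intercept = np.polyfit(tt, seg, 1)        # linear detrend
        resid = seg - (slope * tt + intercept)
        stats[n - 1] = np.sqrt(np.mean(resid ** 2))
    return stats

rng = np.random.default_rng(7)
# Variance jump at sample 500, mimicking e.g. a sleep-stage transition.
x = np.concatenate([rng.normal(0.0, 1.0, 500), rng.normal(0.0, 3.0, 500)])
s = progressive_detrended_rms(x)
```

Before the jump the curve sits near 1; after the higher-variance points are progressively included it climbs toward sqrt((1 + 9) / 2) ≈ 2.24, localizing the change.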

  7. Development, validation and comparison of two stability-indicating RP-LC methods using charged aerosol and UV detectors for analysis of lisdexamfetamine dimesylate in capsules

    Directory of Open Access Journals (Sweden)

    Graciela Carlos


    Full Text Available Two new stability-indicating liquid chromatographic methods using two detectors, an ultraviolet (UV) and a charged aerosol detector (CAD) simultaneously connected in series, were validated for the assessment of lisdexamfetamine dimesylate (LDX) in capsules. The method was optimized and the influence of individual parameters on UV and CAD response and sensitivity was studied. Chromatography was performed on a Zorbax CN column (250 mm × 4.6 mm, 5 μm) in an isocratic elution mode, using acetonitrile and 20 mM ammonium formate at pH 4.0 (50:50, v/v) as mobile phase and UV detection at 207 nm. The developed method was validated according to ICH guidelines, and specificity, limit of detection, limit of quantitation, linearity, accuracy, precision and robustness were evaluated. CAD is considered a non-linear detector over a wide dynamic range; however, the method was linear over the concentration range of 70–130 μg mL−1 with both detectors. The method was precise and accurate. The robustness study was performed by a Plackett–Burman design, delivering results within the acceptable range. Neither the excipients nor the degradation products interfered in the method after studies of specificity as well as under stress conditions. The results of the LC-UV and LC-CAD methods were statistically compared through ANOVA and showed no significant difference (p > 0.05). Both proposed methods could be considered interchangeable and stability-indicating, and can be applied as an appropriate quality control tool for routine analysis of LDX in capsules.

  8. Development of a capillary electrophoresis method for the analysis in alkaline media as polyoxoanions of two strategic metals: Niobium and tantalum. (United States)

    Deblonde, Gauthier J-P; Chagnes, Alexandre; Cote, Gérard; Vial, Jérôme; Rivals, Isabelle; Delaunay, Nathalie


    Tantalum (Ta) and niobium (Nb) are two strategic metals essential to several key sectors, like the aerospace, gas and oil, nuclear and electronic industries, but their separation is very difficult due to their almost identical chemical properties. Whereas they are currently produced by hydrometallurgical processes using fluoride-based solutions, efforts are being made to develop cleaner processes by replacing the fluoride media with alkaline ones. However, methods to analyze Nb and Ta simultaneously in alkaline samples are lacking. In this work, we developed a capillary zone electrophoresis (CE) method able to separate and quantify Nb and Ta directly in alkaline media. This method takes advantage of the hexaniobate and hexatantalate ions, which are naturally formed at pH > 9 and absorb in the UV domain. First, the detection conditions, the background electrolyte (BGE) pH, the nature of the BGE co-ion and the internal standard (IS) were optimized by a systematic approach. As the nature of the BGE counter-ion modified the speciation of both ions, sodium- and lithium-based BGEs were tested. For each alkaline cation, the BGE ionic strength and separation temperature were optimized using experimental designs. Since changes in the migration order of IS, Nb and Ta were observed within the experimental domain, the resolution was not a monotonic function of ionic strength and separation temperature. This forced us to develop an original data treatment for the prediction of the optimum separation conditions. Depending on the consideration of either peak widths or peak symmetries, with or without additional robustness constraints, four optima were predicted for each tested alkaline cation. The eight predicted optima were tested experimentally and the best experimental optimum was selected considering analysis time, resolution and robustness. The best separation was obtained at 31.0°C in a BGE containing 10 mM LiOH and 35 mM LiCH3COO. The separation voltage was finally optimized

  9. Development of a method for comprehensive and quantitative analysis of plant hormones by highly sensitive nanoflow liquid chromatography-electrospray ionization-ion trap mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Izumi, Yoshihiro; Okazawa, Atsushi; Bamba, Takeshi; Kobayashi, Akio [Department of Biotechnology, Graduate School of Engineering, Osaka University, 2-1 Yamadaoka, Suita, Osaka 565-0871 (Japan); Fukusaki, Eiichiro, E-mail: [Department of Biotechnology, Graduate School of Engineering, Osaka University, 2-1 Yamadaoka, Suita, Osaka 565-0871 (Japan)


    In recent plant hormone research, there is an increased demand for a highly sensitive and comprehensive analytical approach to elucidate the hormonal signaling networks, functions, and dynamics. We have demonstrated the high sensitivity of a comprehensive and quantitative analytical method developed with nanoflow liquid chromatography-electrospray ionization-ion trap mass spectrometry (LC-ESI-IT-MS/MS) under multiple-reaction monitoring (MRM) in plant hormone profiling. Unlabeled and deuterium-labeled isotopomers of four classes of plant hormones and their derivatives, auxins, cytokinins (CK), abscisic acid (ABA), and gibberellins (GA), were analyzed by this method. The optimized nanoflow-LC-ESI-IT-MS/MS method showed ca. 5-10-fold greater sensitivity than capillary-LC-ESI-IT-MS/MS, and the detection limits (S/N = 3) of several plant hormones were in the sub-fmol range. The results showed excellent linearity (R{sup 2} values of 0.9937-1.0000) and reproducibility of elution times (relative standard deviations, RSDs, <1.1%) and peak areas (RSDs, <10.7%) for all target compounds. Further, sample purification using Oasis HLB and Oasis MCX cartridges significantly decreased the ion-suppressing effects of biological matrix as compared to the purification using only Oasis HLB cartridge. The optimized nanoflow-LC-ESI-IT-MS/MS method was successfully used to analyze endogenous plant hormones in Arabidopsis and tobacco samples. The samples used in this analysis were extracted from only 17 tobacco dry seeds (1 mg DW), indicating that the efficiency of analysis of endogenous plant hormones strongly depends on the detection sensitivity of the method. Our analytical approach will be useful for in-depth studies on complex plant hormonal metabolism.
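The figures of merit quoted above (calibration linearity as R{sup 2}, reproducibility as RSD of elution times and peak areas) are simple statistics; the sketch below shows how they are computed. The concentration levels and peak areas are hypothetical, not data from the study.

```python
import statistics

def linearity_r2(conc, response):
    """Coefficient of determination R^2 of the least-squares calibration
    line (response vs. concentration)."""
    n = len(conc)
    mx, my = sum(conc) / n, sum(response) / n
    sxx = sum((x - mx) ** 2 for x in conc)
    sxy = sum((x - mx) * (y - my) for x, y in zip(conc, response))
    syy = sum((y - my) ** 2 for y in response)
    return sxy * sxy / (sxx * syy)

def rsd_percent(values):
    """Relative standard deviation (%) of replicate peak areas or times."""
    return 100 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical calibration levels (fmol) and peak areas, plus replicate areas
conc = [1, 5, 10, 50, 100]
area = [210, 1015, 2060, 10150, 20300]
print(round(linearity_r2(conc, area), 4))
print(round(rsd_percent([2060, 2102, 2071, 2055, 2088]), 2))
```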

  10. Developments of an Interactive Sail Design Method

    Directory of Open Access Journals (Sweden)

    S. M. Malpede


    Full Text Available This paper presents a new tool for performing the integrated design and analysis of a sail. The features of the system are the geometrical definition of a sail shape, using the Bezier surface method, the creation of a finite element model for the non-linear structural analysis, and a fluid-dynamic model for the aerodynamic analysis. The system has been developed using MATLAB®. Recent sail design efforts have been focused on solving the aeroelastic behavior of the sail. The pressure distribution on a sail changes continuously, by virtue of cloth stretch and flexing. The sail shape determines the pressure distribution and, at the same time, the pressure distribution on the sail stretches and flexes the sail material, determining its shape. This characteristic non-linear behavior requires iterative solution strategies to obtain the equilibrium configuration and evaluate the forces involved. The aeroelastic problem is tackled by combining structural with aerodynamic analysis. Firstly, pressure loads for a known sail shape are computed (aerodynamic analysis). Secondly, the sail shape is analyzed for the obtained external loads (structural analysis). The final solution is obtained by using an iterative analysis process, which involves both the aerodynamic and the structural analysis. When the solution converges, it is possible to make design modifications.

  11. Bayesian Methods for Statistical Analysis


    Puza, Borek


    Bayesian methods for statistical analysis is a book on statistical methods for analysing a wide variety of data. The book consists of 12 chapters, starting with basic concepts and covering numerous topics, including Bayesian estimation, decision theory, prediction, hypothesis testing, hierarchical models, Markov chain Monte Carlo methods, finite population inference, biased sampling and nonignorable nonresponse. The book contains many exercises, all with worked solutions, including complete c...

  12. Scientific methods for developing ultrastable structures

    Energy Technology Data Exchange (ETDEWEB)

    Gamble, M.; Thompson, T.; Miller, W.


    Scientific methods used by the Los Alamos National Laboratory for developing an ultrastable structure for study of silicon-based elementary particle tracking systems are addressed. In particular, the design, analysis, and monitoring of this system are explored. The development methodology was based on a triad of analytical, computational, and experimental techniques. These were used to achieve a significant degree of mechanical stability (alignment accuracy >1 {mu}rad) and yet allow dynamic manipulation of the system. Estimates of system thermal and vibratory stability and component performance are compared with experimental data collected using laser interferometry and accelerometers. 8 refs., 5 figs., 4 tabs.

  13. Further development of probabilistic analysis method for lifetime determination of piping and vessels. Final report; Weiterentwicklung probabilistischer Analysemethoden zur Lebensdauerbestimmung von Rohrleitungen und Behaeltern. Abschlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Heckmann, K.; Grebner, H.; Sievers, J.


    Within the framework of research project RS1196, the computer code PROST (Probabilistic Structure Calculation) for the quantitative evaluation of the structural reliability of pipe components has been further developed. Models were provided and tested for the consideration of the damage mechanism 'stable crack growth' to determine leak and break probabilities in cylindrical structures of ferritic and austenitic reactor steels. These models are now available in addition to the models for the damage mechanisms 'fatigue' and 'corrosion'. Moreover, a crack initiation model has been established to supplement the treatment of initial cracks. Furthermore, the application range of the code was extended to the calculation of the growth of wall-penetrating cracks. This is important for surface cracks growing until the formation of a stable leak. Calculating the growth of the wall-penetrating crack until break occurs improves the estimation of the break probability. For this purpose, program modules were developed to calculate stress intensity factors and critical crack lengths for wall-penetrating cracks. In the frame of this work, a restructuring of PROST was performed, including possibilities to combine damage mechanisms during a calculation. Furthermore, several additional fatigue crack growth laws were implemented. The implementation of methods to estimate leak areas and leak rates of wall-penetrating cracks was completed by the inclusion of leak detection boundaries. The improved analysis methods were tested by recalculating cases treated before. Furthermore, comparative analyses have been performed for several tasks within the international activity BENCH-KJ. Altogether, the analyses show that with the provided flexible probabilistic analysis method, quantitative determination of leak and break probabilities of a crack in a complex structure geometry under thermal-mechanical loading as
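The abstract does not give PROST's internal algorithms, but the kind of computation it describes (a probabilistic leak estimate driven by a fatigue crack growth law) can be sketched as a Monte Carlo simulation. The Paris-law constants, load distribution, wall thickness and initial crack-size distribution below are all assumed for illustration and are not PROST inputs.

```python
import math
import random

def leak_probability(n_samples=5000, cycles=200000, seed=1):
    """Monte Carlo estimate of the probability that a part-wall crack grows
    through the wall (forming a leak) under cyclic loading, using the Paris
    law da/dN = C * (dK)^m with dK = Y * dsigma * sqrt(pi * a).
    All parameter values are illustrative assumptions."""
    random.seed(seed)
    wall = 8.0e-3                    # wall thickness [m]
    C, m, Y = 3.0e-10, 3.0, 1.12     # Paris constants (MPa units), geometry factor
    leaks = 0
    for _ in range(n_samples):
        a = random.lognormvariate(math.log(0.5e-3), 0.4)  # initial depth [m]
        dsigma = random.gauss(60e6, 8e6)                  # stress range [Pa]
        for _ in range(0, cycles, 1000):                  # 1000-cycle Euler steps
            dK = Y * dsigma * math.sqrt(math.pi * a) / 1e6  # [MPa*sqrt(m)]
            a += C * dK ** m * 1000
            if a >= wall:            # crack penetrates the wall: a leak
                leaks += 1
                break
    return leaks / n_samples

print(round(leak_probability(), 3))
```

Sampling the initial crack depth and the stress range per trial is what turns the deterministic growth law into a leak probability.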

  14. Analytical Quality by Design in pharmaceutical quality assurance: Development of a capillary electrophoresis method for the analysis of zolmitriptan and its impurities. (United States)

    Orlandini, Serena; Pasquini, Benedetta; Caprini, Claudia; Del Bubba, Massimo; Pinzauti, Sergio; Furlanetto, Sandra


    A fast and selective CE method for the determination of zolmitriptan (ZOL) and its five potential impurities has been developed applying the analytical Quality by Design principles. Voltage, temperature, buffer concentration, and pH were investigated as critical process parameters that can influence the critical quality attributes, represented by critical resolution values between peak pairs, analysis time, and peak efficiency of ZOL-dimer. A symmetric screening matrix was employed for investigating the knowledge space, and a Box-Behnken design was used to evaluate the main, interaction, and quadratic effects of the critical process parameters on the critical quality attributes. Contour plots were drawn highlighting important interactions between buffer concentration and pH, and the gained information was merged into the sweet spot plots. Design space (DS) was established by the combined use of response surface methodology and Monte Carlo simulations, introducing a probability concept and thus allowing the quality of the analytical performances to be assured in a defined domain. The working conditions (with the interval defining the DS) were as follows: BGE, 138 mM (115-150 mM) phosphate buffer pH 2.74 (2.54-2.94); temperature, 25°C (24-25°C); voltage, 30 kV. A control strategy was planned based on method robustness and system suitability criteria. The main advantages of applying the Quality by Design concept consisted of a great increase of knowledge of the analytical system, obtained through multivariate techniques, and of the achievement of analytical assurance of quality, derived by the probability-based definition of the DS. The developed method was finally validated and applied to the analysis of ZOL tablets.
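The probability-based definition of the design space described above (a response-surface model combined with Monte Carlo simulation) can be illustrated with a toy model. The quadratic model, its coefficients, the operational tolerances and the acceptance limit Rs >= 1.5 below are all hypothetical assumptions, not the paper's fitted model.

```python
import random

def resolution_model(buffer_mM, pH):
    """Hypothetical quadratic response-surface model for a critical
    resolution; the coefficients are illustrative only."""
    b = (buffer_mM - 138) / 35        # coded buffer concentration
    p = (pH - 2.74) / 0.4             # coded pH
    return 2.0 + 0.30 * b - 0.25 * p - 0.20 * b * p - 0.15 * p * p

def prob_rs_ok(buffer_mM, pH, n=20000, seed=0):
    """Monte Carlo probability that resolution >= 1.5 when the working point
    fluctuates within assumed operational tolerances; the design space is
    the region where this probability is acceptably high."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(n):
        rs = resolution_model(buffer_mM + rng.gauss(0, 5),   # +-5 mM buffer
                              pH + rng.gauss(0, 0.05))       # +-0.05 pH unit
        rs += rng.gauss(0, 0.1)       # model/measurement error
        if rs >= 1.5:
            ok += 1
    return ok / n

print(prob_rs_ok(138, 2.74))          # near the centre of the domain
```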

  15. Development of a mixed-mode solid phase extraction method and further gas chromatography mass spectrometry for the analysis of 3-alkyl-2-methoxypyrazines in wine. (United States)

    López, Ricardo; Gracia-Moreno, Elisa; Cacho, Juan; Ferreira, Vicente


    A new method for analysing 3-isopropyl-2-methoxypyrazine, 3-sec-butyl-2-methoxypyrazine and 3-isobutyl-2-methoxypyrazine in wine has been developed and applied to wine. The analytes are extracted from 25 mL of wine in a solid-phase extraction cartridge filled with 60 mg of cation-exchange mixed-mode sorbent. Analytes are recovered with triethylamine in dichloromethane and the organic extract is analysed by GC-SIM-MS using 3-isopropyl-2-ethoxypyrazine as internal standard. The detection limits of the method are in all cases under 1 ng/L, below the olfactory thresholds of the compounds in wine. The repeatability of the method is around 15% for levels in wine of 2 ng/L. Linearity is satisfactory and recoveries are in all cases close to 100% with RSD between 13% and 20%. The method has been applied to the analysis of 12 Chilean white and 8 Spanish red wines. The levels found suggest that 3-alkyl-2-methoxypyrazines can exert a significant sensory contribution to the aroma of Chilean Sauvignon Blanc wines, while most likely they play a nearly negligible role on traditional Ribera and Rioja Spanish red wines.

  16. Development of the MAE/UHPLC-MS-TOF method for determination of benzodiazepines in human bio-fluids for toxicological analysis. (United States)

    Woźniakiewicz, Aneta; Wietecha-Posłuszny, Renata; Woźniakiewicz, Michał; Nowak, Julia; Kościelniak, Paweł


    A rapid method of microwave-assisted extraction (MAE) followed by ultrahigh performance liquid chromatography with mass spectrometry with time of flight detection (UHPLC-MS-TOF) was optimized and validated for the purpose of determination of five benzodiazepines in human serum and blood samples. Extraction parameters and conditions of the UHPLC-MS-TOF method were defined. Validation of the developed method was performed at three concentration levels: 10, 100 and 250 ng/mL of each drug for both serum and blood samples. For serum and blood the limit of detection was found in the ranges 0.46-2.58 ng/mL and 0.43-1.87 ng/mL, precision (RSD): 0.3-6.7% and 0.9-8.4%, accuracy of the assay (RE): -5.3 to +2.4% and -5.7 to +7.6%, recovery: 80.5-104.3% and 79.9-106.9%, matrix effects: 95.9-110.5% and 97.5-114.2%, respectively. Moreover, the optimized and validated MAE/UHPLC-MS-TOF method was applied to analysis of blood samples.
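The validation figures reported above (precision as RSD, accuracy as relative error, matrix effect as a percentage) are simple statistics of replicate results. A minimal sketch with the usual bioanalytical definitions; the replicate values and peak areas are hypothetical:

```python
import statistics

def validation_metrics(measured, nominal, blank_matrix_area, neat_area):
    """Precision (RSD, %), accuracy (relative error, %), and matrix effect (%)
    from replicate results at one concentration level. The formulas are the
    customary definitions; the example inputs are hypothetical."""
    mean = statistics.mean(measured)
    rsd = 100 * statistics.stdev(measured) / mean        # precision
    re = 100 * (mean - nominal) / nominal                # accuracy (RE)
    matrix_effect = 100 * blank_matrix_area / neat_area  # ME (%)
    return round(rsd, 1), round(re, 1), round(matrix_effect, 1)

# Five hypothetical replicates of a 100 ng/mL serum spike
print(validation_metrics([97.2, 101.5, 99.8, 96.4, 102.1],
                         nominal=100.0,
                         blank_matrix_area=1.05e6, neat_area=1.00e6))
```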

  17. Analysis of Precision of Activation Analysis Method

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Nørgaard, K.


    The precision of an activation-analysis method prescribes the estimation of the precision of a single analytical result. The adequacy of these estimates to account for the observed variation between duplicate results from the analysis of different samples and materials is tested by the statistic T, which is shown to be approximated by a χ2 distribution. Application of this test to the results of determinations of manganese in human serum by a method of established precision led to the detection of airborne pollution of the serum during the sampling process. The subsequent improvement in sampling conditions was shown to give not only increased precision, but also improved accuracy of the results.
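The statistic T mentioned above can be sketched for duplicate determinations. Assuming the usual form for such a precision test, each duplicate pair contributes its squared difference divided by the sum of the two a priori variances, and the total is referred to a χ2 distribution with one degree of freedom per pair. The duplicate values and uncertainties below are hypothetical.

```python
def t_statistic(duplicates, sigmas):
    """Precision test statistic for duplicate results: each pair (x1, x2)
    with a priori standard deviations (s1, s2) contributes
    (x1 - x2)^2 / (s1^2 + s2^2); the sum is compared with a chi-square
    distribution with one degree of freedom per pair. An improbably large T
    means the stated precision does not account for the observed scatter."""
    return sum((x1 - x2) ** 2 / (s1 ** 2 + s2 ** 2)
               for (x1, x2), (s1, s2) in zip(duplicates, sigmas))

# Hypothetical duplicate determinations with their estimated uncertainties
pairs = [(0.52, 0.55), (0.48, 0.47), (0.60, 0.66), (0.51, 0.50)]
sigs = [(0.02, 0.02), (0.02, 0.02), (0.03, 0.03), (0.02, 0.02)]
print(round(t_statistic(pairs, sigs), 2))  # compare with chi-square, 4 d.o.f.
```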

  18. An Analysis Method of Business Application Framework

    Institute of Scientific and Technical Information of China (English)


    We discuss the evolution of the object-oriented software development process based on software patterns. For developing mature software frameworks and components, we advocate eliciting and incorporating software patterns to ensure the quality and reusability of software frameworks. Based on the analysis of requirement specifications for the business application domain, we present an analysis method and a basic role model of a software framework. We also elicit the analysis pattern of the framework architecture, and design basic role classes and their structure.

  19. An integrated quality by design and mixture-process variable approach in the development of a capillary electrophoresis method for the analysis of almotriptan and its impurities. (United States)

    Orlandini, S; Pasquini, B; Stocchero, M; Pinzauti, S; Furlanetto, S


    The development of a capillary electrophoresis (CE) method for the assay of almotriptan (ALM) and its main impurities using an integrated Quality by Design and mixture-process variable (MPV) approach is described. A scouting phase was initially carried out by evaluating different CE operative modes, including the addition of pseudostationary phases and additives to the background electrolyte, in order to approach the analytical target profile. This step made it possible to select normal polarity microemulsion electrokinetic chromatography (MEEKC) as operative mode, which allowed a good selectivity to be achieved in a short analysis time. On the basis of a general Ishikawa diagram for MEEKC methods, a screening asymmetric matrix was applied in order to screen the effects of the process variables (PVs) voltage, temperature, buffer concentration and buffer pH, on critical quality attributes (CQAs), represented by critical separation values and analysis time. A response surface study was then carried out considering all the critical process parameters, including both the PVs and the mixture components (MCs) of the microemulsion (borate buffer, n-heptane as oil, sodium dodecyl sulphate/n-butanol as surfactant/cosurfactant). The values of PVs and MCs were simultaneously changed in an MPV study, making it possible to find significant interaction effects. The design space (DS) was defined as the multidimensional combination of PVs and MCs where the probability for the different considered CQAs to be acceptable was higher than a quality level π=90%. DS was identified by risk of failure maps, which were drawn on the basis of Monte-Carlo simulations, and verification points spanning the design space were tested. Robustness testing of the method, performed by a D-optimal design, and system suitability criteria allowed a control strategy to be designed. The optimized method was validated following ICH Guideline Q2(R1) and was applied to a real sample of ALM coated tablets.

  20. Method development and analysis of free HS and HS in proteoglycans from pre- and postmenopausal women: evidence for biosynthetic pathway changes in sulfotransferase and sulfatase enzymes. (United States)

    Wei, Wei; Miller, Rebecca L; Leary, Julie A


    Heparan sulfate (HS) is one of the most complex and informative biopolymers found on the cell surface or in the extracellular matrix as either free HS fragments or constituents of HS proteoglycans (HSPGs). Analysis of free HS and HSPG sugar chains in human serum at the disaccharide level has great potential for early disease diagnosis and prognosis; however, the low concentration of HS in human serum, together with the complexity of the serum matrix, limits the information on HS. In this study, we present and validate the development of a new sensitive method for in-depth compositional analysis of free HS and HSPG sugar chains. This protocol involved several steps including weak anion exchange chromatography, ultrafiltration, and solid-phase extraction for enhanced detection prior to LC-MS/MS analysis. Using this protocol, a total of 51 serum samples from 26 premenopausal and 25 postmenopausal women were analyzed. Statistically significant differences in heparin/HS disaccharide profiles were observed. The proportion of N-acetylation and N-sulfation in both free HS and HSPG sugar chains were significantly different between pre- and postmenopausal women, indicating changes in N-deacetylase/N-sulfotransferases (NDSTs), the enzymes involved in the initial step of the biosynthetic pathway. Differences in the proportion of 6-O-sulfation suggest that 6-O-sulfotransferase and/or 6-O-sulfatase enzymes may also be implicated.

  1. Development of the high-order decoupled direct method in three dimensions for particulate matter: enabling advanced sensitivity analysis in air quality models

    Directory of Open Access Journals (Sweden)

    W. Zhang


    Full Text Available The high-order decoupled direct method in three dimensions for particulate matter (HDDM-3D/PM) has been implemented in the Community Multiscale Air Quality (CMAQ) model to enable advanced sensitivity analysis. The major effort of this work is to develop high-order DDM sensitivity analysis of ISORROPIA, the inorganic aerosol module of CMAQ. A case-specific approach has been applied, and the sensitivities of activity coefficients and water content are explicitly computed. Stand-alone tests are performed for ISORROPIA by comparing the sensitivities (first- and second-order) computed by HDDM and the brute force (BF) approximations. Similar comparison has also been carried out for CMAQ sensitivities simulated using a week-long winter episode for a continental US domain. Second-order sensitivities of aerosol species (e.g., sulfate, nitrate, and ammonium) with respect to domain-wide SO2, NOx, and NH3 emissions show agreement with BF results, yet exhibit less noise in locations where BF results are demonstrably inaccurate. Second-order sensitivity analysis elucidates poorly understood nonlinear responses of secondary inorganic aerosols to their precursors and competing species. Adding second-order sensitivity terms to the Taylor series projection of the nitrate concentrations with a 50% reduction in domain-wide NOx or SO2 emissions rates improves the prediction with statistical significance.
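The Taylor series projection mentioned in the last sentence can be made concrete: with semi-normalized first- and second-order sensitivities s1 and s2, the response to a fractional emission change delta is approximated by C0 + s1*delta + 0.5*s2*delta**2. A minimal sketch with illustrative values, not results from the simulations:

```python
def taylor_projection(c0, s1, s2, delta):
    """Project a concentration response to a fractional emission change
    delta (e.g. -0.5 for a 50% reduction) from semi-normalized first- and
    second-order sensitivities: C(delta) ~ c0 + s1*delta + 0.5*s2*delta**2."""
    return c0 + s1 * delta + 0.5 * s2 * delta ** 2

# Illustrative (not simulated) nitrate response to NOx emissions: the
# negative second-order term moderates the first-order projection.
c0, s1, s2 = 4.0, 2.5, -1.8                      # ug/m3
print(taylor_projection(c0, s1, 0.0, -0.5))      # first-order only
print(taylor_projection(c0, s1, s2, -0.5))       # with second-order term
```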

  2. Development of the high-order decoupled direct method in three dimensions for particulate matter: enabling advanced sensitivity analysis in air quality models

    Directory of Open Access Journals (Sweden)

    W. Zhang


    Full Text Available The high-order decoupled direct method in three dimensions for particulate matter (HDDM-3D/PM) has been implemented in the Community Multiscale Air Quality (CMAQ) model to enable advanced sensitivity analysis. The major effort of this work is to develop high-order DDM sensitivity analysis of ISORROPIA, the inorganic aerosol module of CMAQ. A case-specific approach has been applied, and the sensitivities of activity coefficients and water content are explicitly computed. Stand-alone tests are performed for ISORROPIA by comparing the sensitivities (first- and second-order) computed by HDDM and the brute force (BF) approximations. Similar comparison has also been carried out for CMAQ results simulated using a week-long winter episode for a continental US domain. Second-order sensitivities of aerosol species (e.g., sulfate, nitrate, and ammonium) with respect to domain-wide SO2, NOx, and NH3 emissions show agreement with BF results, yet exhibit less noise in locations where BF results are demonstrably inaccurate. Second-order sensitivity analysis elucidates nonlinear responses of secondary inorganic aerosols to their precursors and competing species that have not yet been well understood with other approaches. Including second-order sensitivity coefficients in the Taylor series projection of the nitrate concentrations with a 50% reduction in domain-wide NOx emissions shows a statistically significant improvement compared to the first-order Taylor series projection.

  3. Development of an HPLC-UV Method for the Analysis of Drugs Used for Combined Hypertension Therapy in Pharmaceutical Preparations and Human Plasma. (United States)

    Kepekci Tekkeli, Serife Evrim


    A simple, rapid, and selective HPLC-UV method was developed for the determination of antihypertensive drug substances: amlodipine besilat (AML), olmesartan medoxomil (OLM), valsartan (VAL), and hydrochlorothiazide (HCT) in pharmaceuticals and plasma. These substances are mostly used as combinations. The combinations are found in various forms, especially in current pharmaceuticals as threesome components: OLM, AML, and HCT (combination I) and AML, VAL, and HCT (combination II). The separation was achieved by using an RP-CN column, and acetonitrile-methanol-10 mmol orthophosphoric acid pH 2.5 (7 : 13 : 80, v/v/v) was used as a mobile phase; the detector wavelength was set at 235 nm. The linear ranges were found as 0.1-18.5 μg/mL, 0.4-25.6 μg/mL, 0.3-15.5 μg/mL, and 0.3-22 μg/mL for AML, OLM, VAL, and HCT, respectively. In order to check the selectivity of the method for pharmaceutical preparations, forced degradation studies were carried out. According to the validation studies, the developed method was found to be reproducible and accurate as shown by RSD ≤6.1%, 5.7%, 6.9%, and 4.6% and relative mean error (RME) ≤10.6%, 5.8%, 6.5%, and 6.8% for AML, OLM, VAL, and HCT, respectively. Consequently, the method was applied to the analysis of tablets and plasma of patients using drugs containing these substances.

  4. Development of an HPLC-UV Method for the Analysis of Drugs Used for Combined Hypertension Therapy in Pharmaceutical Preparations and Human Plasma

    Directory of Open Access Journals (Sweden)

    Serife Evrim Kepekci Tekkeli


    Full Text Available A simple, rapid, and selective HPLC-UV method was developed for the determination of antihypertensive drug substances: amlodipine besilat (AML), olmesartan medoxomil (OLM), valsartan (VAL), and hydrochlorothiazide (HCT) in pharmaceuticals and plasma. These substances are mostly used as combinations. The combinations are found in various forms, especially in current pharmaceuticals as threesome components: OLM, AML, and HCT (combination I) and AML, VAL, and HCT (combination II). The separation was achieved by using an RP-CN column, and acetonitrile-methanol-10 mmol orthophosphoric acid pH 2.5 (7 : 13 : 80, v/v/v) was used as a mobile phase; the detector wavelength was set at 235 nm. The linear ranges were found as 0.1–18.5 μg/mL, 0.4–25.6 μg/mL, 0.3–15.5 μg/mL, and 0.3–22 μg/mL for AML, OLM, VAL, and HCT, respectively. In order to check the selectivity of the method for pharmaceutical preparations, forced degradation studies were carried out. According to the validation studies, the developed method was found to be reproducible and accurate as shown by RSD ≤6.1%, 5.7%, 6.9%, and 4.6% and relative mean error (RME) ≤10.6%, 5.8%, 6.5%, and 6.8% for AML, OLM, VAL, and HCT, respectively. Consequently, the method was applied to the analysis of tablets and plasma of the patients using drugs including those substances.

  5. Validation of a Non-Targeted LC-MS Approach for Identifying Ancient Proteins: Method Development on Bone to Improve Artifact Residue Analysis

    Directory of Open Access Journals (Sweden)

    Andrew Barker


    Full Text Available Identification of protein residues from prehistoric cooking pottery using mass spectrometry is challenging because proteins are removed from original tissues, are degraded from cooking, may be poorly preserved due to diagenesis, and occur in a palimpsest of exogenous soil proteins. In contrast, bone proteins are abundant and well preserved. This research is part of a larger method-development project for innovation and improvement of liquid chromatography – mass spectrometry analysis of protein residues from cooking pottery; here we validate the potential of our extraction and characterization approach via application to ancient bone proteins. Because of its preservation potential for proteins and given that our approach is destructive, ancient bone identified via skeletal morphology represents an appropriate verification target. Proteins were identified from zooarchaeological turkey (Meleagris gallopavo Linnaeus, Phasianidae), rabbit (Lagomorpha), and squirrel (Sciuridae) remains excavated from ancient pueblo archaeological sites in southwestern Colorado using a non-targeted LC-MS/MS approach. The data have been deposited to the ProteomeXchange Consortium with the dataset identifier PXD002440. Improvement of highly sensitive targeted LC-MS/MS approaches is an avenue for future method development related to the study of protein residues from artifacts such as stone tools and pottery.

  6. Research on Development Strategy of YO Company Based on SWOT Analysis Method%基于SWOT分析法的YO公司发展战略研究

    Institute of Scientific and Technical Information of China (English)



    Current business operators attach great importance to strategic management, but their decision-making styles are mostly imperative and visionary, so employees are not actively involved. Primitive means of competition such as low price, low cost and wide product variety can no longer meet the diverse needs of modern society. Based on the SWOT analysis method and the actual situation of the YO construction company, this paper studies the company's development strategy from multiple dimensions and proposes optimization measures covering product pricing strategy, marketing channels, corporate culture, human resource management and finance. A case study verifies the feasibility of the proposed development strategy.

  7. Application of Software Safety Analysis Methods

    Energy Technology Data Exchange (ETDEWEB)

    Park, G. Y.; Hur, S.; Cheon, S. W.; Kim, D. H.; Lee, D. Y.; Kwon, K. C. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Lee, S. J.; Koo, Y. H. [Doosan Heavy Industries and Construction Co., Daejeon (Korea, Republic of)


    A fully digitalized reactor protection system, which is called the IDiPS-RPS, was developed through the KNICS project. The IDiPS-RPS has four redundant and separated channels. Each channel is mainly composed of a group of bistable processors which redundantly compare process variables with their corresponding setpoints and a group of coincidence processors that generate a final trip signal when a trip condition is satisfied. Each channel also contains a test processor called the ATIP and a display and command processor called the COM. All the functions were implemented in software. During the development of the safety software, various software safety analysis methods were applied, in parallel to the verification and validation (V and V) activities, along the software development life cycle. The software safety analysis methods employed were the software hazard and operability (Software HAZOP) study, the software fault tree analysis (Software FTA), and the software failure modes and effects analysis (Software FMEA)

  8. Chromolithic method development, validation and system suitability analysis of ultra-sound assisted extraction of glycyrrhizic acid and glycyrrhetinic acid from Glycyrrhiza glabra. (United States)

    Gupta, Suphla; Sharma, Rajni; Pandotra, Pankaj; Jaglan, Sundeep; Gupta, Ajai Prakash


    An ultrasound-assisted extraction and chromolithic LC method was developed for simultaneous determination of glycyrrhizic acid (GA) and glycyrrhetinic acid (GL) from the root extract of Glycyrrhiza glabra using RPLC-PDA. The developed method was validated according to the International Conference on Harmonisation guidelines. The method exhibited good linearity (r2 > 0.9989) with high precision and achieved good accuracies between 97.5% and 101.3% in quantitative results. The method is more sensitive and faster (resolved within ten minutes) than the earlier developed methods using normal LC columns.

  9. Development of a microwave assisted extraction method for the analysis of 2,4,6-trichloroanisole in cork stoppers by SIDA-SBSE-GC-MS

    Energy Technology Data Exchange (ETDEWEB)

    Vestner, Jochen [Forschungsanstalt Geisenheim, Fachgebiet Mikrobiologie und Biochemie, Von-Lade-Strasse 1, D-65366 Geisenheim (Germany); Hochschule RheinMain, Fachbereich Geisenheim, Von-Lade-Strasse 1, D-65366 Geisenheim (Germany); Fritsch, Stefanie [Forschungsanstalt Geisenheim, Fachgebiet Mikrobiologie und Biochemie, Von-Lade-Strasse 1, D-65366 Geisenheim (Germany); Rauhut, Doris, E-mail: [Forschungsanstalt Geisenheim, Fachgebiet Mikrobiologie und Biochemie, Von-Lade-Strasse 1, D-65366 Geisenheim (Germany)


    This research work focused on replacing the time-consuming soaking of cork stoppers, which is mainly used as a screening method for cork lots in connection with sensory analysis and/or analytical methods to detect releasable 2,4,6-trichloroanisole (TCA) of natural cork stoppers. Releasable TCA from whole cork stoppers was analysed with the application of a microwave assisted extraction method (MAE) in combination with stir bar sorptive extraction (SBSE). The soaking of corks (SOAK) was used as a reference method to optimise MAE parameters. Cork lots of different quality and TCA contamination levels were used to adapt MAE. Pre-tests indicated that MAE at 40 °C for 120 min with 90 min of cooling time provides suitable conditions to avoid an over-extraction of TCA from low and medium tainted cork stoppers in comparison to SOAK. These MAE parameters allow measurement of almost the same amounts of releasable TCA as the soaking procedure in the relevant range (<25 ng L{sup -1} releasable TCA from one cork) to evaluate the TCA level of cork stoppers. Stable isotope dilution assay (SIDA) with deuterium-labelled TCA (TCA-d{sub 5}) was applied to optimise quantification of the released TCA, using a time-saving GC-MS technique in single ion monitoring (SIM) mode. The developed MAE method allows measurement of releasable TCA from the whole cork stopper under improved conditions, with low solvent use and a higher sample throughput.
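Quantification by stable isotope dilution as described above reduces to scaling the spiked amount of labelled standard by the analyte/standard peak-area ratio. A minimal sketch; the peak areas, spike level and response factor are illustrative, not values from the study:

```python
def sida_concentration(area_tca, area_tca_d5, spiked_tca_d5_ng_l,
                       response_factor=1.0):
    """Stable isotope dilution quantification: the analyte concentration is
    the spiked labelled-standard concentration scaled by the analyte/standard
    peak-area ratio; response_factor corrects any residual response
    difference between TCA and TCA-d5 (1.0 assumed here)."""
    return spiked_tca_d5_ng_l * (area_tca / area_tca_d5) / response_factor

# Hypothetical SIM peak areas for an extract spiked with 10 ng/L TCA-d5
print(sida_concentration(area_tca=4200, area_tca_d5=8400,
                         spiked_tca_d5_ng_l=10.0))
```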

  10. Development and validation of an HPLC method for the analysis of dipyridamole in the presence of its degradation products

    Directory of Open Access Journals (Sweden)



    Full Text Available A sensitive, specific, precise and cost-effective high-performance liquid chromatographic method of analysis for dipyridamole in the presence of its degradation products was developed and validated. The method employed a Targa C8 column (250 × 4.6 mm, 5 μm particle size) as the stationary phase. The mobile phase consists of acetonitrile and pH 3.0 buffer in the ratio of 35:65%. It is pumped through the chromatographic system at a flow rate of 1.2 mL/min. The UV detector is operated at 282 nm. This system was found to give good resolution between dipyridamole and its degradation products. The method was validated as per ICH guidelines.

  11. Development of unconventional forming methods

    Directory of Open Access Journals (Sweden)

    S. Rusz


    Full Text Available Purpose: The paper presents results of progress in the ECAP processing method for achieving an ultra-fine grained (UFG) structure. The properties and microstructure are influenced by technological factors during application of the ECAP method. Design/methodology/approach: A summary is presented of the methods studied at the Department of Technology at the Machining Faculty of VŠB-TU Ostrava in co-operation with the Institute of Engineering Materials and Biomaterials, Silesian University of Technology. Findings: Achievement of an ultra-fine grained structure in the initial material leads to a substantial increase of plasticity and makes it possible to form materials in conditions of a „superplastic state“. Achievement of the required structure depends notably on the tool geometry, the number of passes through the matrix, the obtained deformation magnitude and strain rate, the process temperature and the lubrication conditions. High deformation at comparatively low homologous temperatures is an efficient method of production of ultra-fine grained solid materials. The new technologies, which use severe plastic deformation, comprise namely these techniques: High Pressure Torsion, Equal Channel Angular Pressing (ECAP), Cyclic Channel Die Compression (CCDC), Cyclic Extrusion Compression (CEC), Continuous Extrusion Forming (CONFORM), Accumulative Roll Bonding, Constrained Groove Pressing. Research limitations/implications: Achieved hardness and microstructure characteristics will be determined by further research. Practical implications: The results may be utilized to relate the structure and properties of the investigated materials in a future manufacturing process. Originality/value: These results contribute to a complex evaluation of the properties of new metals after application of unconventional forming methods. The results of this paper are intended for research workers dealing with the process of severe plastic deformation.

  12. Development of sample preparation method for auxin analysis in plants by vacuum microwave-assisted extraction combined with molecularly imprinted clean-up procedure. (United States)

    Hu, Yuling; Li, Yuanwen; Zhang, Yi; Li, Gongke; Chen, Yueqin


    A novel sample preparation method for auxin analysis in plant samples was developed: vacuum microwave-assisted extraction (VMAE) followed by a molecularly imprinted clean-up procedure. The method is based on two steps. In the first, conventional solvent extraction is replaced by VMAE for the extraction of auxins from plant tissues. This step provides efficient extraction of 3-indole acetic acid (IAA) from plants with dramatically decreased extraction time and, furthermore, prevents degradation of the auxins by creating a reduced-oxygen environment under vacuum. In the second step, the raw VMAE extract is further subjected to a clean-up procedure with magnetic molecularly imprinted polymer (MIP) beads. Owing to the high molecular recognition ability of the magnetic MIP beads for IAA and 3-indole-butyric acid (IBA), the two target auxins can be selectively enriched and interfering substances eliminated by a simple magnetic separation procedure. Both the VMAE and the molecularly imprinted clean-up conditions were investigated. The proposed sample preparation method was coupled with high-performance liquid chromatography and fluorescence detection for the determination of IAA and IBA in peas and rice. The detection limits obtained for IAA and IBA were 0.47 and 1.6 ng/mL and the relative standard deviations were 2.3% and 2.1%, respectively. The IAA contents of pea seeds, pea embryo, pea roots and rice seeds were determined. Recoveries ranged from 70.0% to 85.6%. The proposed method was also applied to investigate the developmental profiles of IAA concentration in pea seeds and rice seeds during seed germination.

  13. Developing Scoring Algorithms (Earlier Methods) (United States)

    We developed scoring procedures to convert screener responses to estimates of individual dietary intake for fruits and vegetables, dairy, added sugars, whole grains, fiber, and calcium using the What We Eat in America 24-hour dietary recall data from the 2003-2006 NHANES.


    Directory of Open Access Journals (Sweden)



    Full Text Available Audit sampling involves the application of audit procedures to less than 100% of the items within an account balance or class of transactions. Our article studies audit sampling in the audit of financial statements. As a widely used audit technique, in both its statistical and non-statistical forms, the method is very important for auditors. It should be applied correctly to give a fair view of the financial statements and to satisfy the needs of all financial users. To be applied correctly, the method must be understood by all its users, and mainly by auditors; otherwise, the risk of applying it incorrectly could lead to loss of reputation and discredit, litigation and even imprisonment. Since there is no unified practice and methodology for applying the technique, the risk of applying it incorrectly is fairly high. SWOT analysis is a technique that identifies strengths, weaknesses, opportunities and threats. We applied SWOT analysis to study the sampling method from the perspective of three players: the audit company, the audited entity and the users of financial statements. The study shows that by applying the sampling method both the audit company and the audited entity save time, effort and money. The disadvantages of the method are the difficulty of applying it and of understanding it in depth. Being widely used as an audit method, and being a factor in a correct audit opinion, the sampling method's strengths, weaknesses, opportunities and threats must be understood by auditors.
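The abstract gives no formulas, but the statistical branch of audit sampling it discusses often starts from a zero-expected-deviation ("discovery") sample-size rule. A minimal sketch, assuming the standard attribute-sampling formula n = ln(1 - confidence) / ln(1 - tolerable deviation rate); the function name is ours, not the article's:

```python
import math

def discovery_sample_size(confidence: float, tolerable_rate: float) -> int:
    """Smallest sample size such that, if the true deviation rate equals the
    tolerable rate, at least one deviation is found with the given confidence
    (attribute sampling with zero expected deviations)."""
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - tolerable_rate))

print(discovery_sample_size(0.95, 0.05))  # 59 items
```

At 95% confidence and a 5% tolerable deviation rate this gives 59 items, in line with commonly published attribute-sampling tables.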

  15. Learning Global Citizenship Skills in a Democratic Setting: An Analysis of the Efficacy of a Moral Development Method Applied in China



    To reduce the limitations of current measures of morality, and to focus on an effective approach to moral development among university students, the researchers adopted the Konstanz Method of Dilemma Discussion to probe whether this method can also be applied to Chinese students for the development of moral competence. 'Moral competence', also known as 'moral-democratic' competence, is a concept developed by the German psychologist Professor Georg Lind. It...

  16. Measurement of -OH groups in coals of different rank using microwave methodology, and the development of quantitative solid state n.m.r. methods for in situ analysis

    Energy Technology Data Exchange (ETDEWEB)

    Monsef-Mirzai, P.; McWhinnie, W.R.; Perry, M.C.; Burchill, P. [Aston University, Birmingham (United Kingdom). Dept. of Chemical Engineering and Applied Chemistry


    Experiments with both model compounds (substituted phenols) and with 11 coals (nine British and two American) have established that microwave heating greatly accelerates silylation reactions of the phenolic -OH groups, e.g. for Creswell coal complete silylation of -OH groups occurs in 35 min in the microwave oven, whereas 24 h is required using a bench reflux technique. Microwave reaction times for coals vary from 35 min to 3 h for denser coals such as Cortonwood. These observations have allowed the development of a 'one pot' silylation of coal, followed by in situ analysis of the added Me{sub 3}Si- groups by quantitative {sup 29}Si magic angle spinning nuclear magnetic resonance (MAS n.m.r.) spectroscopy. The development of a quantitative n.m.r. method required the determination of {sup 29}Si spin-lattice relaxation times, T{sub 1}: e.g. for silylated coals T{sub 1} {approximately} 8 s; for silylated phenols, T{sub 1} {approximately} 25 s; for the synthetic smectite clay laponite, T{sub 1} {approximately} 25 s; and for Ph{sub 3}SiH, T{sub 1} {approximately} 64 s. Inert laponite was selected as the standard. The requirement to wait five times T{sub 1 max} between pulses, together with the relatively low natural abundance of {sup 29}Si (4.71%), results in rather long accumulation times to obtain spectra of analytical quality (8-48 h). However, in comparison with other methods, even in the most unfavourable case, the total time from commencement of analysis to result may be described as 'rapid'. The results for O{sub OH}/O{sub total} obtained are compared with other literature data. Comparison with ketene data, for example, shows agreement varying from excellent (Creswell) through satisfactory (Cortonwood) to poor (Pittsburgh). Even in cases where agreement with ketene data is less good, the silylation results may be close to estimates made via other acetylation methods. Possible reasons for the observed variations are discussed. 
18 refs., 2 figs., 7 tabs.
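The 8-48 h accumulation times quoted above follow directly from the recycle-delay rule of waiting five times T{sub 1 max} between pulses. A small sketch of that arithmetic; the scan count is illustrative, using the laponite standard's T1 of about 25 s:

```python
def accumulation_time_hours(n_scans: int, t1_max_s: float) -> float:
    """Total acquisition time when waiting 5 * T1(max) between pulses."""
    return n_scans * 5.0 * t1_max_s / 3600.0

recycle_delay_s = 5 * 25                 # 125 s between pulses for T1 ~ 25 s
scans_in_8_hours = int(8 * 3600 / recycle_delay_s)
print(scans_in_8_hours)                  # ~230 scans fit in an 8 h run
```

With only a few hundred scans per 8 h and the low natural abundance of {sup 29}Si, the long accumulation windows quoted in the abstract are unsurprising.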

  17. Statistical methods for bioimpedance analysis

    Directory of Open Access Journals (Sweden)

    Christian Tronstad


    Full Text Available This paper gives a basic overview of relevant statistical methods for the analysis of bioimpedance measurements, with an aim to answer questions such as: How do I begin with planning an experiment? How many measurements do I need to take? How do I deal with large amounts of frequency sweep data? Which statistical test should I use, and how do I validate my results? Beginning with the hypothesis and the research design, the methodological framework for making inferences based on measurements and statistical analysis is explained. This is followed by a brief discussion on correlated measurements and data reduction before an overview is given of statistical methods for comparison of groups, factor analysis, association, regression and prediction, explained in the context of bioimpedance research. The last chapter is dedicated to the validation of a new method by different measures of performance. A flowchart is presented for selection of statistical method, and a table is given for an overview of the most important terms of performance when evaluating new measurement technology.
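As one concrete example of the "measures of performance" mentioned for validating a new measurement method, Bland-Altman limits of agreement are a common way to compare a new method against a reference. A minimal sketch with made-up measurement values, not data from the paper:

```python
import numpy as np

def bland_altman(reference: np.ndarray, new_method: np.ndarray):
    """Bias and 95% limits of agreement between two measurement methods."""
    diff = new_method - reference
    bias = diff.mean()
    sd = diff.std(ddof=1)                      # sample standard deviation
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired impedance readings (same subjects, two methods):
ref = np.array([10.1, 12.3, 9.8, 11.5, 10.9])
new = np.array([10.4, 12.1, 10.0, 11.9, 11.0])
bias, (lo, hi) = bland_altman(ref, new)
```

If most differences fall within (lo, hi) and the bias is negligible for the application, the methods may be considered interchangeable.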

  18. Developments in geophysical exploration methods

    CERN Document Server


    One of the themes in current geophysical development is the bringing together of the results of observations made on the surface and those made in the subsurface. Several benefits result from this association. The detailed geological knowledge obtained in the subsurface can be extrapolated for short distances with more confidence when the geological detail has been related to well-integrated subsurface and surface geophysical data. This is of value when assessing the characteristics of a partially developed petroleum reservoir. Interpretation of geophysical data is generally improved by the experience of seeing the surface and subsurface geophysical expression of a known geological configuration. On the theoretical side, the understanding of the geophysical processes themselves is furthered by the study of the phenomena in depth. As an example, the study of the progress of seismic wave trains downwards and upwards within the earth has proved most instructive. This set of original papers deals with some of ...

  19. Development of a fast isocratic LC-MS/MS method for the high-throughput analysis of pyrrolizidine alkaloids in Australian honey. (United States)

    Griffin, Caroline T; Mitrovic, Simon M; Danaher, Martin; Furey, Ambrose


    Honey samples originating from Australia were purchased and analysed for targeted pyrrolizidine alkaloids (PAs) using a new and rapid isocratic LC-MS/MS method. This isocratic method was developed from, and is comparable with, a gradient elution method, with no loss of sensitivity or degradation of chromatographic peak shape. Isocratic elution allows significantly shorter run times (6 min), eliminates the need for column equilibration periods and thus facilitates high-throughput analysis, which is particularly important for regulatory testing laboratories. More than two hundred injections are possible with this new isocratic methodology within a 24-h period, a more than 50% improvement on all previously published methodologies. Good linear calibrations were obtained for all 10 PAs and four PA N-oxides (PANOs) in spiked honey samples (3.57-357.14 µg l(-1); R(2) ≥ 0.9987). Acceptable inter-day repeatability was achieved for the target analytes in honey, with RSD values (n = 4) below 7.4%. Limits of detection (LOD) and limits of quantitation (LOQ) were determined with spiked PA and PANO samples, giving an average LOD of 1.6 µg kg(-1) and LOQ of 5.4 µg kg(-1). The method was successfully applied to Australian and New Zealand honey samples sourced from supermarkets in Australia. Analysis showed that 41 of the 59 honey samples were contaminated by PAs, with a mean total PA content of 153 µg kg(-1). Echimidine and lycopsamine were predominant, found in 76% and 88% of the positive samples, respectively. The average daily exposures, based on the results presented in this study, were 0.051 µg kg(-1) bw day(-1) for adults and 0.204 µg kg(-1) bw day(-1) for children. These results are a cause for concern when compared with the proposed European Food Safety Authority (EFSA), Committee on Toxicity (COT) and Bundesinstitut für Risikobewertung (BfR, Federal Institute for Risk Assessment, Germany) maximum
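The exposure figures quoted above follow from the standard dose formula: concentration × daily intake / body weight. A sketch of that arithmetic; the 23.3 g/day honey intake and 70 kg body weight below are hypothetical values chosen only to show how a figure like 0.051 µg kg(-1) bw day(-1) can arise, not the assumptions actually used in the study:

```python
def daily_exposure_ug_per_kg_bw(conc_ug_per_kg: float,
                                intake_g_per_day: float,
                                body_weight_kg: float) -> float:
    """PA exposure: honey PA concentration x daily honey intake / body weight."""
    return conc_ug_per_kg * (intake_g_per_day / 1000.0) / body_weight_kg

# Hypothetical adult scenario using the study's mean PA content of 153 ug/kg:
print(round(daily_exposure_ug_per_kg_bw(153, 23.3, 70), 3))  # ~0.051
```

The same formula with a child's smaller body weight (and a proportionally larger intake per kg) yields the higher child exposure figure.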

  20. Analysis of mixed data methods & applications

    CERN Document Server

    de Leon, Alexander R


    A comprehensive source on mixed data analysis, Analysis of Mixed Data: Methods & Applications summarizes the fundamental developments in the field. Case studies are used extensively throughout the book to illustrate interesting applications from economics, medicine and health, marketing, and genetics. Carefully edited for smooth readability and seamless transitions between chapters. All chapters follow a common structure, with an introduction and a concluding summary, and include illustrative examples from real-life case studies in developmental toxicolog

  1. Development of liquid chromatographic method for the analysis of dabigatran etexilate mesilate and its ten impurities supported by quality-by-design methodology. (United States)

    Pantović, Jasmina; Malenović, Anđelija; Vemić, Ana; Kostić, Nađa; Medenica, Mirjana


    In this paper, the development of a reversed-phase liquid chromatographic method for the analysis of dabigatran etexilate mesilate and its ten impurities, supported by a quality by design (QbD) approach, is presented. The defined analytical target profile (ATP) was efficient baseline separation and accurate determination of the investigated analytes. The selected critical quality attributes (CQAs) were the separation criteria between the critical peak pairs, because the complexity of the mixture imposed a gradient elution mode. The critical process parameters (CPPs) studied in this research were the acetonitrile content at the beginning of the gradient program, the acetonitrile content at the end of the gradient program, and the gradient time. The plan of experiments was defined by a Box-Behnken design. The experimental domains of the three selected factors x1 (content of acetonitrile at the start of the linear gradient), x2 (content of acetonitrile at the end of the linear gradient) and x3 (gradient time, tG) were [10%, 30%], [48%, 60%] and [8 min, 15 min], respectively. In order to define the design space (DS) as a zone where the desired quality criteria are met, also providing quality assurance, Monte Carlo simulations were performed. A uniform error distribution equal to the calculated standard error was added to the model coefficient estimates. The Monte Carlo simulation included 5000 iterations in each of 3969 defined grid points, and the region having a probability π ≥ 95% of achieving satisfactory values of all defined CQAs was computed. As the working point, the following chromatographic conditions, situated in the middle of the DS, were chosen: 22% acetonitrile at the start of the gradient program, 55.5% acetonitrile at the end of the gradient program, and a gradient time of 11.5 min. The developed method was validated in order to prove its reliability.
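The design-space step described above (perturbing fitted coefficients with uniform error, then counting the fraction of simulations that meet the quality criteria at each grid point) can be sketched as follows. The response model, coefficients, standard error and resolution threshold below are invented for illustration and are not the paper's fitted values:

```python
import random

# Hypothetical linear response model for one CQA (resolution of a critical
# peak pair) in coded factors x1, x2, x3; coefficients are illustrative.
COEF = {"b0": 2.1, "b1": -0.35, "b2": 0.40, "b3": 0.25}
SE = 0.05  # uniform error half-width added to each coefficient estimate

def predicted_resolution(x1, x2, x3, rng):
    b = {k: v + rng.uniform(-SE, SE) for k, v in COEF.items()}
    return b["b0"] + b["b1"] * x1 + b["b2"] * x2 + b["b3"] * x3

def probability_criterion_met(x1, x2, x3, threshold=1.5, n_iter=5000, seed=1):
    """Fraction of Monte Carlo iterations meeting the CQA at one grid point."""
    rng = random.Random(seed)
    hits = sum(predicted_resolution(x1, x2, x3, rng) >= threshold
               for _ in range(n_iter))
    return hits / n_iter

# A grid point belongs to the design space if this probability is >= 0.95:
p = probability_criterion_met(0.0, 0.5, 0.5)
```

Repeating this over every grid point in the factor domain carves out the π ≥ 95% region that the paper calls the design space.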

  2. Nonlinear programming analysis and methods

    CERN Document Server

    Avriel, Mordecai


    Comprehensive and complete, this overview provides a single-volume treatment of key algorithms and theories. The author provides clear explanations of all theoretical aspects, with rigorous proof of most results. The two-part treatment begins with the derivation of optimality conditions and discussions of convex programming, duality, generalized convexity, and analysis of selected nonlinear programs. The second part concerns techniques for numerical solutions and unconstrained optimization methods, and it presents commonly used algorithms for constrained nonlinear optimization problems. This g

  3. Analysis methods for airborne radioactivity


    Ala-Heikkilä, Jarmo J


    High-resolution gamma-ray spectrometry is an analysis method well suitable for monitoring airborne radioactivity. Many of the natural radionuclides and a majority of anthropogenic nuclides are prominent gamma-ray emitters. With gamma-ray spectrometry different radionuclides are readily observed at minute concentrations that are far from health hazards. The gamma-ray spectrometric analyses applied in air monitoring programmes can be divided into particulate measurements and gas measurements. I...

  4. Modern methods of wine quality analysis

    Directory of Open Access Journals (Sweden)

    Галина Зуфарівна Гайда


    Full Text Available In this paper, physical-chemical and enzymatic methods for the quantitative analysis of the basic wine components are reviewed. Results of the authors' own experiments on the development of enzyme- and cell-based amperometric sensors for ethanol, lactate, glucose and arginine are presented.

  5. Chromatographic methods for analysis of triazine herbicides. (United States)

    Abbas, Hana Hassan; Elbashir, Abdalla A; Aboul-Enein, Hassan Y


    Gas chromatography (GC) and high-performance liquid chromatography (HPLC), coupled to different detectors and in combination with different sample extraction methods, are the most widely used techniques for the analysis of triazine herbicides in environmental samples. Nowadays, many variations and modifications of extraction and sample preparation methods, such as solid-phase microextraction (SPME), hollow fiber-liquid phase microextraction (HF-LPME), stir bar sorptive extraction (SBSE), headspace-solid phase microextraction (HS-SPME), dispersive liquid-liquid microextraction (DLLME), dispersive liquid-liquid microextraction based on solidification of a floating organic droplet (DLLME-SFO), ultrasound-assisted emulsification microextraction (USAEME), and others, have been introduced and developed to obtain sensitive and accurate methods for the analysis of these hazardous compounds. In this review, analytical properties such as linearity, sensitivity, repeatability, and accuracy are discussed for each developed method, and excellent results were obtained for most of the developed methods combining GC and HPLC techniques for the analysis of triazine herbicides. This review gives an overview of recent publications on the application of GC and HPLC for the analysis of triazine herbicide residues in various samples.

  6. Hybrid methods for cybersecurity analysis :

    Energy Technology Data Exchange (ETDEWEB)

    Davis, Warren Leon; Dunlavy, Daniel M.


    Early 2010 saw a significant change in adversarial techniques aimed at network intrusion: a shift from malware delivered via email attachments toward the use of hidden, embedded hyperlinks to initiate sequences of downloads and interactions with web sites and network servers containing malicious software. Enterprise security groups were well poised and experienced in defending against the former attacks, but the new types of attacks were larger in number, more challenging to detect, dynamic in nature, and required the development of new technologies and analytic capabilities. The Hybrid LDRD project was aimed at delivering new capabilities in large-scale data modeling and analysis to enterprise security operators and analysts, and at understanding the challenges of detection and prevention of emerging cybersecurity threats. Leveraging previous LDRD research efforts and capabilities in large-scale relational data analysis, large-scale discrete data analysis and visualization, and streaming data analysis, new modeling and analysis capabilities were quickly brought to bear on the problems of email phishing and spear phishing attacks in the Sandia enterprise security operational groups at the onset of the Hybrid project. As part of this project, a software development and deployment framework was created within the security analyst workflow tool sets to facilitate the delivery and testing of new capabilities as they became available, and machine learning algorithms were developed to address the challenge of dynamic threats. Furthermore, researchers from the Hybrid project were embedded in the security analyst groups for almost a full year, engaged in daily operational activities and routines, creating an atmosphere of trust and collaboration between the researchers and security personnel. The Hybrid project has altered the way that research ideas can be incorporated into the production environments of Sandia's enterprise security groups, reducing time to deployment from months and

  7. Development of a multi-preservative method based on solid-phase microextraction-gas chromatography-tandem mass spectrometry for cosmetic analysis. (United States)

    Alvarez-Rivera, Gerardo; Vila, Marlene; Lores, Marta; Garcia-Jares, Carmen; Llompart, Maria


    A simple methodology based on solid-phase microextraction (SPME) followed by gas chromatography-tandem mass spectrometry (GC-MS/MS) has been developed for the simultaneous analysis of different classes of preservatives, including benzoates, bronidox, 2-phenoxyethanol, parabens, BHA, BHT and triclosan, in cosmetic products. In situ acetylation and subsequent organic modifier addition were successfully implemented in the SPME process as an effective extraction strategy for matrix effect compensation and improved chromatographic performance. The main factors affecting the SPME procedure, such as fiber coating, sampling mode, extraction temperature and salt addition (NaCl), were evaluated by means of a 3×2(3-1) factorial experimental design. The optimal experimental conditions were established as follows: direct solid-phase microextraction (SPME) at 40°C with addition of NaCl (20%, w/v), using a DVB/CAR/PDMS fiber coating. Given the complexity of the studied matrices, method performance was evaluated in a representative variety of both rinse-off and leave-on samples, demonstrating a broad linear range (R(2)>0.9964). In general, quantitative recoveries (>85% in most cases) and satisfactory precision (RSD) were obtained for creams, deodorants, sunscreen, bath gel, dental cream and make-up products, among others, demonstrating this to be a reliable multi-preservative methodology for routine control.

  8. Development and validation of an improved reversed-phase liquid chromatographic method combined with pulsed electrochemical detection for the analysis of netilmicin. (United States)

    Manyanga, Vicky; Hoogmartens, Jos; Adams, Erwin


    Netilmicin is one of the aminoglycoside antibiotics that lacks a strong UV absorbing chromophore. However, the application of pulsed electrochemical detection has been used successfully for the direct analysis of aminoglycoside antibiotics. This study describes an improved LC method combined with pulsed electrochemical detection for the analysis of netilmicin. Using a Zorbax SB C-18 column (250 mm x 4.6 mm id, 5 microm), isocratic elution was carried out with a mobile phase containing sodium sulfate (20 g/L), sodium octanesulfonate (0.3 g/L), THF (20 mL/L), and 0.2 M phosphate buffer pH 3.0 (50.0 mL/L). The robustness of the method was examined by means of an experimental design. The method proved to be sensitive, repeatable, linear, and robust. The method has also been used to analyze some commercial netilmicin samples.

  9. Enabling Predictive Simulation and UQ of Complex Multiphysics PDE Systems by the Development of Goal-Oriented Variational Sensitivity Analysis and a-Posteriori Error Estimation Methods

    Energy Technology Data Exchange (ETDEWEB)

    Estep, Donald [Colorado State Univ., Fort Collins, CO (United States)


    This project addressed the challenge of predictive computational analysis of strongly coupled, highly nonlinear multiphysics systems characterized by multiple physical phenomena that span a large range of length- and time-scales. Specifically, the project was focused on computational estimation of numerical error and sensitivity analysis of computational solutions with respect to variations in parameters and data. In addition, the project investigated the use of accurate computational estimates to guide efficient adaptive discretization. The project developed, analyzed and evaluated new variational adjoint-based techniques for integration, model, and data error estimation/control and sensitivity analysis, in evolutionary multiphysics multiscale simulations.
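A toy illustration of the adjoint-based error-estimation idea described above: for a linear model Ax = b and a quantity of interest q = g·x, the QoI error of an approximate solution equals the adjoint solution dotted with the residual. This is a generic sketch of the technique, not code from the project:

```python
import numpy as np

# Adjoint identity: g.(x - x_app) = y.r, where r = b - A x_app is the
# residual and y solves the adjoint problem A^T y = g.
A = np.array([[4.0, 1.0], [2.0, 3.0]])
b = np.array([1.0, 2.0])
g = np.array([1.0, -1.0])            # functional defining the QoI

x_app = np.array([0.1, 0.6])         # some inexact (e.g. iterative) solution
r = b - A @ x_app                    # residual of the approximate solution
y = np.linalg.solve(A.T, g)          # adjoint solution

estimate = y @ r                     # adjoint-based QoI error estimate
exact = g @ (np.linalg.solve(A, b) - x_app)
```

For linear problems the identity is exact; for the nonlinear multiphysics systems targeted by the project, linearization introduces additional terms that the variational analysis controls.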

  10. Moral counselling: a method in development. (United States)

    de Groot, Jack; Leget, Carlo


    This article describes a method of moral counselling developed in the Radboud University Medical Centre Nijmegen (The Netherlands). The authors apply insights of Paul Ricoeur to the non-directive counselling method of Carl Rogers in their work of coaching patients with moral problems in health care. The developed method was shared with other health care professionals in a training course. Experiences in the course and further practice led to further improvement of the method.

  11. Infinitesimal methods of mathematical analysis

    CERN Document Server

    Pinto, J S


    This modern introduction to infinitesimal methods is a translation of the book Métodos Infinitesimais de Análise Matemática by José Sousa Pinto of the University of Aveiro, Portugal and is aimed at final year or graduate level students with a background in calculus. Surveying modern reformulations of the infinitesimal concept with a thoroughly comprehensive exposition of important and influential hyperreal numbers, the book includes previously unpublished material on the development of hyperfinite theory of Schwartz distributions and its application to generalised Fourier transforms and harmon

  12. Enabling Predictive Simulation and UQ of Complex Multiphysics PDE Systems by the Development of Goal-Oriented Variational Sensitivity Analysis and A Posteriori Error Estimation Methods

    Energy Technology Data Exchange (ETDEWEB)

    Ginting, Victor


    It was demonstrated that a posteriori analyses in general, and in particular those using adjoint methods, can accurately and efficiently compute numerical error estimates and sensitivities for critical Quantities of Interest (QoIs) that depend on a large number of parameters. Activities included: analysis and implementation of several time integration techniques for solving systems of ODEs as typically obtained from spatial discretization of PDE systems; multirate integration methods for ordinary differential equations; formulation and analysis of an iterative multi-discretization Galerkin finite element method for multi-scale reaction-diffusion equations; investigation of an inexpensive postprocessing technique to estimate the error of finite element solutions of second-order quasi-linear elliptic problems measured in some global metrics; investigation of an application of residual-based a posteriori error estimates to the symmetric interior penalty discontinuous Galerkin method for solving a class of second-order quasi-linear elliptic problems; a posteriori analysis of explicit time integrations for systems of linear ordinary differential equations; derivation of accurate a posteriori goal-oriented error estimates for a user-defined quantity of interest for two classes of first- and second-order IMEX schemes for advection-diffusion-reaction problems; postprocessing of finite element solutions; and a Bayesian framework for uncertainty quantification of porous media flows.

  13. Development and Validation of a Stability-Indicating HPTLC Method for Analysis of Rasagiline Mesylate in the Bulk Drug and Tablet Dosage Form

    Directory of Open Access Journals (Sweden)

    Singaram Kathirvel


    Full Text Available A simple and sensitive thin-layer chromatographic method has been established for the analysis of rasagiline mesylate in a pharmaceutical dosage form. Chromatography on silica gel 60 F254 plates with butanol-methanol-water, 6:1:2 (v/v/v), as mobile phase furnished compact spots at Rf 0.76±0.01. Densitometric analysis was performed at 254 nm. To show the specificity of the method, rasagiline mesylate was subjected to acid, base and neutral hydrolysis, oxidation, photolysis, and thermal decomposition, and the peaks of the degradation products were well resolved from that of the pure drug. Linear regression analysis revealed a good linear relationship between peak area and amount of rasagiline mesylate in the range of 100–350 ng/band. The minimum amounts of rasagiline mesylate that could be reliably detected and quantified were 11.12 and 37.21 ng/band, respectively. The method was validated in accordance with ICH guidelines for precision, accuracy, and robustness. Since the method can effectively separate the drug from its degradation products, it can be regarded as stability indicating.
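Detection and quantitation limits like those reported above are commonly computed from the calibration line via the ICH Q2(R1) formulas LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the residual standard deviation and S the slope. A sketch with invented calibration data in the method's 100-350 ng/band range (not the paper's measurements):

```python
import numpy as np

# Hypothetical peak-area calibration data (amount in ng/band):
amount = np.array([100, 150, 200, 250, 300, 350], dtype=float)
area = np.array([1205, 1790, 2410, 2985, 3620, 4180], dtype=float)

slope, intercept = np.polyfit(amount, area, 1)       # least-squares line
resid = area - (slope * amount + intercept)
sigma = resid.std(ddof=2)                            # residual std deviation

lod = 3.3 * sigma / slope                            # ICH Q2(R1) formulas
loq = 10.0 * sigma / slope
```

Note that LOQ/LOD is fixed at 10/3.3 ≈ 3.0 by construction; the published pair (11.12 and 37.21 ng/band) shows roughly this ratio.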

  14. Practical Fourier analysis for multigrid methods

    CERN Document Server

    Wienands, Roman


    Before applying multigrid methods to a project, mathematicians, scientists, and engineers need to answer questions related to the quality of convergence, whether a development will pay off, whether multigrid will work for a particular application, and what the numerical properties are. Practical Fourier Analysis for Multigrid Methods uses a detailed and systematic description of local Fourier k-grid (k=1,2,3) analysis for general systems of partial differential equations to provide a framework that answers these questions. This volume contains software that confirms written statements about the convergence and efficiency of algorithms and is easily adapted to new applications. Providing theoretical background and the linkage between theory and practice, the text and software quickly combine learning by reading and learning by doing. The book enables understanding of the basic principles of multigrid and local Fourier analysis, and also describes the theory important to those who need to delve deeper into the detai...
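The flavor of local Fourier (smoothing) analysis can be shown on the simplest case: weighted Jacobi applied to the 1D Poisson stencil [-1, 2, -1] has Fourier symbol 1 - 2ω sin²(θ/2), and the smoothing factor is its maximum modulus over the high frequencies θ ∈ [π/2, π]. A short sketch (ours, not the book's software) reproducing the classic value 1/3 for ω = 2/3:

```python
import math

def jacobi_symbol(theta: float, omega: float) -> float:
    """Fourier symbol of weighted Jacobi for the 1D Poisson stencil [-1,2,-1]."""
    return 1.0 - 2.0 * omega * math.sin(theta / 2.0) ** 2

def smoothing_factor(omega: float, n: int = 10000) -> float:
    """Max |symbol| over the high-frequency range theta in [pi/2, pi]."""
    thetas = [math.pi / 2 + i * (math.pi / 2) / n for i in range(n + 1)]
    return max(abs(jacobi_symbol(t, omega)) for t in thetas)

print(round(smoothing_factor(2.0 / 3.0), 4))  # classic result: 1/3
```

A smoothing factor of 1/3 means each smoothing sweep damps every high-frequency error component by at least a factor of three, which is what makes the coarse-grid correction effective.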

  15. Simple gas chromatographic method for furfural analysis. (United States)

    Gaspar, Elvira M S M; Lopes, João F


    A new, simple gas chromatographic method was developed for the direct analysis of 5-hydroxymethylfurfural (5-HMF), 2-furfural (2-F) and 5-methylfurfural (5-MF) in liquid and water-soluble foods, using direct-immersion SPME coupled to GC-FID and/or GC-TOF-MS. The fiber (DVB/CAR/PDMS) conditions were optimized: pH effect, temperature, adsorption and desorption times. The method is simple and accurate; detection limits by GC-TOF-MS were 0.3, 1.2 and 0.9 ng mL(-1) for 2-F, 5-MF and 5-HMF, respectively. It was applied to different commercial food matrices: honey, white, demerara, brown and yellow table sugars, and white and red balsamic vinegars. This one-step, sensitive and direct method for the analysis of furfurals will help to characterise and quantify their presence in the human diet.

  16. Quality and methods of developing practice guidelines

    Directory of Open Access Journals (Sweden)

    Clark Otavio


    Full Text Available Abstract Background It is not known whether there are differences in quality and recommendations between evidence-based (EB) and consensus-based (CB) guidelines. We used breast cancer guidelines as a case study to assess these differences. Methods Five different instruments for evaluating the quality of guidelines were identified by a literature search. We also searched MEDLINE and the Internet to locate 8 breast cancer guidelines. These guidelines were classified in three categories: evidence based, consensus based, and consensus based with explicit consideration of evidence (CB-EB). Each guideline was evaluated by three of the authors using each of the instruments. For each guideline we assessed the agreement at 14 decision points selected from the NCCN (National Comprehensive Cancer Network) guidelines algorithm. For each decision point we recorded the quality level of the information used to support it. A regression analysis was performed to assess whether the percentage of high-quality evidence used in guideline development was related to the overall quality of the guidelines. Results Three guidelines were classified as EB, three as CB-EB and two as CB. The EB guidelines scored better than the CB, with the CB-EB scoring in the middle, on all instruments for guideline quality assessment. No major disagreement in recommendations was detected among the guidelines regardless of the development method, but the EB guidelines had better agreement with the benchmark guideline at every decision point. When the sources of evidence used to support a decision were of high quality, we found a higher level of full agreement among the guidelines' recommendations. Up to 94% of the variation in the quality score among guidelines could be explained by the quality of the evidence used for guideline development. Conclusion EB guidelines have better quality than CB and CB-EB guidelines. 
Explicit use of high quality evidence

  17. Development of rapid slurry methods for flame and direct current plasma emission and graphite furnace atomic absorption analysis of solid animal tissue

    Energy Technology Data Exchange (ETDEWEB)

    Fietkau, R.


    Studies are presented describing developments in the rapid, direct atomic spectrochemical analysis of meat samples by the technique of slurry atomization. The number of elements that can be determined in meat slurry samples has been increased, and the concentration range that can be detected has been extended to include analysis at the part-per-billion as well as the percent level. Slurry atomization involves a rapid preparation procedure whereby the sample is simply homogenized with deionized distilled water prior to analysis. In this manner, rapid, quantitative analysis of hot dogs (processed meat) for dietary salt (Na, K) was achieved by premixed air-natural gas flame emission spectrometry. Quantitative analysis of mechanically separated meat for residual bone fragments (as Ca) was attained using a simple photometer when the premixed air-acetylene flame was used. The phosphate interference with the Ca emission signal was overcome by placing an insert in the spray chamber, which decreased the droplet size of the aerosol reaching the flame. Slight matrix modification in the form of 2% nitric acid was necessary to solubilize the Ca from the bone fragments. Determination of elements present at very low concentrations, i.e., part-per-billion levels, in homogenized beef liver was evaluated using graphite furnace atomic absorption and shown to be viable for Pb, Cd, Cr, and Ni. Qualitative multielement analysis of several types of meat slurries by direct current plasma (DCP) emission spectrometry, using both photographic and electronic modes of detection, was reported for the first time.

  18. How composition methods are developed and validated. (United States)

    Rogers, Hilary A


    Method validation is a critical prerequisite to performing analytical methods in the laboratory. A given analytical method is validated for a specific matrix or matrices. If the matrix to be tested is not included in the original scope of method validation, a validation must be performed to determine whether the method is applicable to that particular matrix. A number of organizations, such as AOAC and ISO, publish peer-reviewed methods for cross-industry matrices, whereas others, such as AOCS and AACC, focus on specific industry segments (fats/oils and cereal grains, respectively). When no validated method is available for the analyte of interest, method development and validation must first be performed to ensure that correct identification and quantification of the analyte are being observed and measured. Development of a new method requires an understanding of the chemistry and properties of the analyte to be tested, as well as of the various types of instrumentation currently available. Method development and improvement is a continuous process, as technology advances and new instrumentation and techniques become available. This paper addresses some of the decisions related to method development but focuses primarily on validation as it applies to compositional testing of foods, crops, and commodities, the factors that determine method selection, and how extensive the validation needs to be.

  19. Development of a Radial Deconsolidation Method

    Energy Technology Data Exchange (ETDEWEB)

    Helmreich, Grant W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Montgomery, Fred C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hunn, John D. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)


    A series of experiments has been initiated to determine the retention or mobility of fission products in AGR fuel compacts [Petti, et al. 2010]. This information is needed to refine fission product transport models. The AGR-3/4 irradiation test involved half-inch-long compacts that each contained twenty designed-to-fail (DTF) particles with 20-μm-thick carbon-coated kernels, whose coatings were deliberately fabricated such that they would crack under irradiation, providing a known source of post-irradiation isotopes. The DTF particles in these compacts were axially distributed along the compact centerline so that the diffusion of fission products released from the DTF kernels would be radially symmetric [Hunn, et al. 2012; Hunn et al. 2011; Kercher, et al. 2011; Hunn, et al. 2007]. Compacts containing DTF particles were irradiated at Idaho National Laboratory (INL) in the Advanced Test Reactor (ATR) [Collin, 2015]. Analysis of the diffusion of these various post-irradiation isotopes through the compact requires a method to radially deconsolidate the compacts so that nested annular volumes may be analyzed for post-irradiation isotope inventory in the compact matrix, TRISO outer pyrolytic carbon (OPyC), and DTF kernels. An effective radial deconsolidation method and apparatus appropriate to this application has been developed and parametrically characterized.

  20. Development of sample preparation method for eleutheroside B and E analysis in Acanthopanax senticosus by ionic liquids-ultrasound based extraction and high-performance liquid chromatography detection. (United States)

    Yang, Lei; Ge, Hongshuang; Wang, Wenjie; Zu, Yuangang; Yang, Fengjian; Zhao, Chunjian; Zhang, Lin; Zhang, Ying


    An ionic liquid-based ultrasonic-assisted extraction (ILUAE) method was successfully developed for extracting eleutheroside B and E from Radix Acanthopanax senticosus. Thirteen 1-alkyl-3-methylimidazolium ionic liquids with different cations and anions were investigated, and 1-butyl-3-methylimidazolium bromide ([C4mim]Br) solution was selected as the solvent. The conditions for ILUAE, including the ionic liquid concentration, soaking time, ultrasonic power, ultrasonic time, solid-liquid ratio and number of extraction cycles, were optimized. With the proposed method, the extraction time was reduced to 30 min, whereas the conventional method requires about 4 h. The proposed method had good recovery (97.96-103.39%) and reproducibility (RSD, n=5; 3.3% for eleutheroside B, 4.6% for eleutheroside E). ILUAE was an efficient, rapid and simple sample preparation technique that showed high reproducibility and was environmentally friendly.

  1. Analysis and estimation of risk management methods

    Directory of Open Access Journals (Sweden)

    Kankhva Vadim Sergeevich


    Full Text Available At the present time risk management is an integral part of state policy in all countries with a developed market economy. Companies providing consulting services and implementing risk management systems have carved out a niche. Unfortunately, conscious preventive risk management in Russia is still far from being a standardized part of construction company activity, which often leads to scandals and disapproval when projects are implemented unprofessionally. The authors present the results of an investigation into the modern understanding of the existing classification of methodologies and offer their own classification matrix of risk management methods. The matrix is constructed by analyzing each method in terms of its incoming and outgoing (transformed) information, which may include different elements of the risk control stages. The proposed approach thus makes it possible to analyze the capabilities of each method.

  2. Development of a liquid chromatography-electrospray ionization-tandem mass spectrometry method for the simultaneous analysis of intact glucosinolates and isothiocyanates in Brassicaceae seeds and functional foods. (United States)

    Franco, P; Spinozzi, S; Pagnotta, E; Lazzeri, L; Ugolini, L; Camborata, C; Roda, A


    A new high-pressure liquid chromatography-electrospray ionization-tandem mass spectrometry method for the simultaneous determination of glucosinolates, namely glucoraphanin and glucoerucin, and the corresponding isothiocyanates, sulforaphane and erucin, was developed and applied to quantify these compounds in Eruca sativa defatted seed meals and enriched functional foods. The method involved solvent extraction; separation was achieved in gradient mode, using water with 0.5% formic acid and acetonitrile with 0.5% formic acid on a reversed-phase C18 column. The electrospray ion source operated in negative and positive mode for the detection of glucosinolates and isothiocyanates, respectively, and multiple reaction monitoring (MRM) was selected as the acquisition mode. The method was validated following the ICH guidelines. Replicate experiments demonstrated good accuracy (bias% < 10%) and precision (CV% < 10%). Detection and quantification limits are in the range of 1-400 ng/mL for each analyte. Calibration curves were validated over concentration ranges from 0.05 to 50 μg/mL. The method proved suitable for glucosinolate and isothiocyanate determination both in biomasses and in complex matrices such as food products enriched with glucosinolates, or nutraceutical bakery products. In addition, the developed method was applied to the simultaneous determination of glucosinolates and isothiocyanates in a bakery product enriched with glucosinolates, to evaluate their thermal stability across different industrial processes from cultivation to consumer processing.
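
    Detection and quantification limits of the kind reported here are commonly estimated under the ICH guidelines from calibration-curve statistics as LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the residual standard deviation of the fit and S its slope. A minimal sketch, using hypothetical calibration data rather than values from the paper:

```python
import numpy as np

# Illustrative calibration data (concentration in µg/mL vs. peak area);
# these numbers are hypothetical, not taken from the study.
conc = np.array([0.05, 0.5, 5.0, 25.0, 50.0])
area = np.array([410.0, 4120.0, 41500.0, 206000.0, 411000.0])

# Least-squares fit: area = slope * conc + intercept
slope, intercept = np.polyfit(conc, area, 1)
residuals = area - (slope * conc + intercept)
sigma = residuals.std(ddof=2)  # residual standard deviation of the fit

# ICH Q2-style estimates based on calibration-curve statistics
lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(f"LOD ≈ {lod:.4f} µg/mL, LOQ ≈ {loq:.4f} µg/mL")
```

    By construction LOQ/LOD is always 10/3.3, so the two limits move together as the calibration scatter changes.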

  3. Fundamental analysis and development of the current and voltage control method by changing the driving frequency for the transcutaneous energy transmission system. (United States)

    Miura, Hidekazu; Yamada, Akihiro; Shiraishi, Yasuyuki; Yambe, Tomoyuki


    We have been developing a transcutaneous energy transmission system (TETS), with high efficiency and a compact size, for a ventricular assist device, shape memory alloy (SMA) fibered artificial organs and other applications. In this paper, we summarize the development, design method and characteristics of the TETS. New control methods for stabilizing the output voltage or current of the TETS are proposed. These methods act on the primary side, outside the body, and do not depend on a communication link from inside the body. Basically, the TETS operates at a fixed frequency with a suitable compensation capacitor so that the internal impedance is minimized and a flat load characteristic is obtained. However, when the coil shifts from the optimal position, the coupling factor changes and the output fluctuates. The TETS has a resonant property; its output can be controlled by changing the driving frequency. A continuous-current to continuous-voltage driving method was implemented by changing the driving frequency and setting a lower limit on the frequency. This method is useful for the battery charging system of electrically driven artificial hearts, and also for SMA fibered artificial organs, which need intermittent high peak power consumption. In this system, the internal storage capacitor is charged slowly while the fibers are turned off and discharges its energy when the fibers are turned on. We examined the effect of the system. It was found that the size and maximum output of the TETS could be reduced.
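
    The frequency-control idea above (output falls as the drive frequency moves away from resonance) can be sketched with a simple series-resonant divider model. The component values below are illustrative, not taken from the actual TETS:

```python
import numpy as np

# Hypothetical series-resonant link: coil inductance L, compensation
# capacitor C, and equivalent load resistance R (illustrative values).
L, C, R = 20e-6, 0.2e-6, 5.0             # H, F, ohms
f0 = 1.0 / (2 * np.pi * np.sqrt(L * C))  # resonant frequency

def output_ratio(f):
    """|Vout/Vin| for a series RLC divider, taken across the load R."""
    w = 2 * np.pi * f
    z = R + 1j * (w * L - 1.0 / (w * C))
    return abs(R / z)

# Output is maximal at resonance and falls off as the drive detunes,
# which is what allows regulation by shifting the inverter frequency.
print(f"f0 = {f0 / 1e3:.1f} kHz")
print(output_ratio(f0), output_ratio(1.3 * f0))
```

    In a real TETS the coupled-coil transformer adds a second resonance and the load varies, but the same monotone fall-off away from the operating peak is what the frequency controller exploits.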

  4. Development and comparison of two multi-residue methods for the analysis of select pesticides in honey bees, pollen, and wax by gas chromatography-quadrupole mass spectrometry. (United States)

    Li, Yuanbo; Kelley, Rebecca A; Anderson, Troy D; Lydy, Michael J


    One of the hypotheses that may help explain the loss of honey bee colonies worldwide is the increasing potential for exposure of honey bees to complex mixtures of pesticides. To better understand this phenomenon, two multi-residue methods based on different extraction and cleanup procedures were developed and compared for the determination of 11 relevant pesticides in honey bees, pollen, and wax by gas chromatography-quadrupole mass spectrometry. Sample preparation methods included solvent extraction followed by gel permeation chromatography (GPC) cleanup, and cleanup by dispersive solid-phase extraction with zirconium-based sorbents (Z-Sep). Matrix effects, method detection limits, recoveries, and reproducibility were evaluated and compared. Method detection limits (MDLs) of the pesticides for the GPC method in honey bees, pollen, and wax ranged from 0.65 to 5.92 ng/g dw, 0.56 to 6.61 ng/g dw, and 0.40 to 8.30 ng/g dw, respectively, while MDLs for the Z-Sep method were 0.33 to 4.47 ng/g dw, 0.42 to 5.37 ng/g dw, and 0.51 to 5.34 ng/g dw, respectively. The mean recoveries in all matrices at three spiking concentrations ranged from 64.4% to 149.5% for the GPC method and 71.9% to 126.2% for the Z-Sep method, with relative standard deviations of 1.5-25.3% and 1.3-15.9%, respectively. The results showed that the Z-Sep method was more suitable for the determination of the target pesticides, especially chlorothalonil, in beehive samples. The Z-Sep method was then validated using a series of field-collected beehive samples taken from honey bee colonies in Virginia.
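
    Method detection limits of the kind compared here are conventionally computed (for example, in the US EPA MDL procedure) as the one-tailed 99% Student's t value times the standard deviation of replicate low-level spikes. A minimal sketch with hypothetical replicate data, not values from this study:

```python
import statistics

# Hypothetical replicate results (ng/g dw) for one pesticide spiked near
# the expected detection limit; seven replicates as in the EPA procedure.
replicates = [1.02, 0.88, 1.10, 0.95, 1.05, 0.91, 0.99]

s = statistics.stdev(replicates)
t_99 = 3.143  # one-tailed Student's t at 99% confidence, n - 1 = 6 df
mdl = t_99 * s
print(f"MDL ≈ {mdl:.2f} ng/g dw")
```

    Spiking at roughly 1-5 times the estimated MDL and iterating is the usual practice; a single replicate set only gives a first estimate.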

  5. On Software Development of Characteristic Set Method

    Institute of Scientific and Technical Information of China (English)

    WU Yong-wei; WANG Ding-kang; YANG Hong; LIN Dong-dai


    The characteristic set method for polynomial equation solving has been widely disseminated, and its implementation in software has attracted increasing attention in recent years. Several packages for the method have been implemented in computer algebra systems such as REDUCE and Maple. In order to improve the efficiency of the method, we have developed a computer algebra system, "ELIMINO", written in the C language and implemented on the Linux operating system on a PC. The authors wish to share with the reader their knowledge and experience of the design and development of a software package for the characteristic set method.
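
    The core operation of the characteristic set method is successive pseudo-division under a variable ordering, which turns a polynomial system into a triangular one. A minimal sketch of one elimination step using SymPy's `prem` as a stand-in (ELIMINO itself is not assumed to be available):

```python
from sympy import symbols, prem, simplify

x, y = symbols('x y')

# Toy system under the ordering y > x: a circle and a line.
f1 = y**2 + x**2 - 1
f2 = y - x

# One step of Wu's elimination: pseudo-divide f1 by f2 with respect to y.
# The pseudo-remainder r is free of y, so {r, f2} is a triangular set.
r = prem(f1, f2, y)
print(r)
```

    Solving r = 0 for x and back-substituting into f2 = 0 recovers the common zeros; in the full method the zeros must also be checked against the initials of the triangular set.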


    Institute of Scientific and Technical Information of China (English)

    LI Jinghua


    Over the past two decades, structural decomposition analysis (SDA) has developed into a major analytical tool in the field of input-output (IO) techniques, but the method has been found to suffer from one or more of the following problems: the decomposition forms used to measure the contribution of a specific determinant are not unique (a multitude of equivalent forms exist), irrational (the weights of different determinants do not match), or inexact (large interaction terms remain). In this paper, a decomposition method is derived to overcome these deficiencies, and we prove that its result equals the Shapley value in cooperative games, from which several properties of the method are obtained. Beyond that, the two approaches that have been used predominantly in the literature are shown to be approximate solutions of the method.
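
    The Shapley-value equivalence can be illustrated directly: each determinant's contribution is its marginal effect averaged over all orders in which the determinants are switched from base-year to end-year values, so the contributions sum exactly to the total change with no interaction term left over. A minimal sketch with a hypothetical two-determinant example, not the paper's IO model:

```python
from itertools import permutations

def shapley_decomposition(v, x0, x1):
    """Average each determinant's marginal change in v over all orders of
    switching determinants from base (x0) to final (x1) values.
    (Enumerates all n! orders, so only practical for small n.)"""
    n = len(x0)
    contrib = [0.0] * n
    perms = list(permutations(range(n)))
    for order in perms:
        state = list(x0)
        for i in order:
            before = v(state)
            state[i] = x1[i]
            contrib[i] += v(state) - before
    return [c / len(perms) for c in contrib]

# Toy SDA: output V = intensity * scale (hypothetical two determinants).
v = lambda s: s[0] * s[1]
parts = shapley_decomposition(v, [2.0, 10.0], [3.0, 12.0])
total = v([3.0, 12.0]) - v([2.0, 10.0])
print(parts, sum(parts), total)  # contributions sum exactly to the total
```

    For two determinants this reduces to averaging the two polar decompositions, which is the familiar symmetric SDA form.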

  7. Development and validation of an automated extraction method (accelerated solvent extraction) and a reverse-phase HPLC analysis method for assay of ivermectin in a meat-based chewable formulation. (United States)

    Abend, Andreas M; Chung, Le; McCollum, David G; Wuelfing, W Peter


    A new method for monitoring ivermectin content in HEARTGARD CHEWABLES has been developed and validated. The method consists of automated extraction of ivermectin from the meat-based formulation under conditions of elevated temperature and pressure (accelerated solvent extraction, ASE) and determination of the active by reversed-phase high-performance liquid chromatography (HPLC). The method resolves both active species of ivermectin (components H2B1a and H2B1b) from the formulation matrix.

  8. Method development for compositional analysis of low molecular weight poly(vinyl acetate) by matrix-assisted/laser desorption-mass spectrometry and its application to analysis of chewing gum. (United States)

    Tisdale, Evgenia; Wilkins, Charles


    The influence of the sample preparation parameters (the choice of solvent and of the matrix:analyte ratio) was investigated, and optimal conditions were established for MALDI mass spectrometry analysis of pristine low molecular weight poly(vinyl acetate) (PVAc). It was demonstrated that comparison of the polymer's and the solvent's Hansen solubility parameters can be used as a guide when choosing the solvent for MALDI sample preparation. The highest-intensity PVAc signals were obtained when ethyl acetate was used as the solvent along with the lowest matrix:analyte ratio (2,5-dihydroxybenzoic acid was used as the matrix in all experiments). The structure of the PVAc was established with high accuracy using matrix-assisted laser desorption/ionization-Fourier transform mass spectrometry (MALDI-FTMS) analysis. It was demonstrated that PVAc undergoes unimolecular decomposition by losing acetic acid molecules from its backbone under the conditions of FTMS measurements. Number- and weight-average molecular weights as well as polydispersity indices were determined with both MALDI-TOF and MALDI-FTMS methods. The sample preparation protocol developed was applied to the analysis of a chewing gum, and the molecular weight and structure of the poly(vinyl acetate) present in the sample were established. Thus, it was shown that optimized MALDI mass spectrometry can be used successfully for characterization of poly(vinyl acetate) in commercially available chewing gum.
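
    The Hansen solubility comparison mentioned above reduces to the distance Ra between the polymer's and the solvent's (δD, δP, δH) triples, with smaller distances suggesting better solvents. A minimal sketch; the parameter values below are illustrative of the kind tabulated in Hansen's handbook and should be checked against authoritative tables for real work:

```python
import math

def hansen_distance(a, b):
    """Ra between two (δD, δP, δH) triples in MPa^0.5; Hansen's convention
    gives the dispersion term a factor of 4."""
    dD, dP, dH = (a[i] - b[i] for i in range(3))
    return math.sqrt(4 * dD**2 + dP**2 + dH**2)

# Illustrative parameter sets (MPa^0.5), not authoritative values.
pvac          = (20.9, 11.3, 9.7)
ethyl_acetate = (15.8, 5.3, 7.2)
acetone       = (15.5, 10.4, 7.0)

for name, solvent in [("ethyl acetate", ethyl_acetate), ("acetone", acetone)]:
    print(name, round(hansen_distance(pvac, solvent), 2))
```

    In practice Ra is compared with the polymer's interaction radius R0; solvents with Ra/R0 < 1 are expected to dissolve the polymer.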

  9. Development of a method for the analysis of four plant growth regulators (PGRs) residues in soybean sprouts and mung bean sprouts by liquid chromatography-tandem mass spectrometry. (United States)

    Zhang, Fengzu; Zhao, Pengyue; Shan, Weili; Gong, Yong; Jian, Qiu; Pan, Canping


    A method has been developed for the simultaneous determination of four plant growth regulator (PGR) residues in soybean sprouts and mung bean sprouts. The sample preparation procedure was based on a QuEChERS method. The method showed excellent linearity (r² ≥ 0.9985) and precision (RSDs ≤ 13.0%). Average recoveries of the four PGRs ranged between 74.9% and 106.3% at spiking levels of 0.05, 0.5 and 1 mg kg⁻¹. The LODs and LOQs were in the ranges of 0.27-9.3 μg kg⁻¹ and 0.90-31 μg kg⁻¹, respectively. The procedure was applied to 18 bean sprout samples, and benzyladenine was found in some of the analyzed samples.

  10. Development of a quantitative analysis method for mRNA from Mycobacterium leprae and slow-growing acid-fast bacteria

    Energy Technology Data Exchange (ETDEWEB)

    Nakanaga, Kazue; Maeda Shinji; Matsuoka, Masanori; Kashiwabara, Yoshiko [National Inst. of Infectious Diseases, Tokyo (Japan)


    This study aimed to develop a specific method for detection and quantitative determination of mRNA that allows estimation of viable counts of M. leprae and other mycobacteria. mRNA of the 65-kDa heat-shock protein (hsp65) was used as an indicator to discriminate living cells from dead ones. To compare mRNA detection by RNase protection assay (RPA) and Northern blot hybridization (NBH), labelled anti-sense RNA for the hsp65 gene of M. leprae was synthesized using plasmid pUC8/N5, from template DNA containing about 580 bp (positions 194 to 762) of the hsp65 gene. Compared with the NBH method, the RPA method required 1/30 or less of the probe, and its detection sensitivity was 10 times higher. In addition, complicated procedures were needed to eliminate non-specific reactions in the NBH method. These results indicated that the RPA method is more convenient and superior for mRNA detection. However, radioactive decay of the label in the probe used for the RPA method might affect the results; therefore, ³³P or ³⁵S, whose decay energy is lower than that of ³²P, should be used for labelling. Total RNA was effectively extracted from M. chelonae and M. marinum by the AGPC method, but not from M. leprae. In conclusion, RPA is a very effective detection method for these mRNAs, but it seems necessary to further improve the detection sensitivity for small amounts of test material. (M.N.)

  11. Numerical methods and analysis of multiscale problems

    CERN Document Server

    Madureira, Alexandre L


    This book is about numerical modeling of multiscale problems, and introduces several asymptotic analysis and numerical techniques which are necessary for a proper approximation of equations that depend on different physical scales. Aimed at advanced undergraduate and graduate students in mathematics, engineering and physics (or researchers seeking a no-nonsense approach), it discusses examples in their simplest possible settings, removing mathematical hurdles that might hinder a clear understanding of the methods. The problems considered are given by singularly perturbed reaction-advection-diffusion equations in one- and two-dimensional domains, partial differential equations in domains with rough boundaries, and equations with oscillatory coefficients. This work shows how asymptotic analysis can be used to develop and analyze models and numerical methods that are robust and work well for a wide range of parameters.

  12. Development of a method to determine the specific environment as a starting point for the strategic analysis and the approach to competitors' Knowledge: presentation and applications

    Directory of Open Access Journals (Sweden)

    Emilio García Vega


    Full Text Available The determination of the specific environment is important for formulating effective enterprise strategies on the basis of a properly focused strategic analysis. This paper proposes a method to help delimit and identify that environment. It aims to offer a simple and practical tool that allows a more accurate identification of the industry to be analysed, as well as a clearer specification of direct and substitute competitors. With this tool, those managing a business idea, an established organization or a new one will have an approach to these issues, which are of strategic importance in any type of management. Two applications of the proposed method are presented: the first oriented to a business idea, and the second to high-service supermarkets in Lima, Peru.

  13. Space Debris Reentry Analysis Methods and Tools

    Institute of Scientific and Technical Information of China (English)

    WU Ziniu; HU Ruifeng; QU Xi; WANG Xiang; WU Zhe


    The reentry of an uncontrolled spacecraft may break it into many pieces of debris at an altitude in the range of 75-85 km. The surviving fragments can pose great hazard and risk to people and property on the ground. In recent years, methods and tools for predicting and analyzing debris reentry and for ground risk assessment have been studied and developed at the National Aeronautics and Space Administration (NASA), the European Space Agency (ESA) and other organizations, including the present authors' group. This paper briefly reviews the current progress on debris reentry. We outline the Monte Carlo method for uncertainty analysis, breakup prediction, and the parameters affecting the survivability of debris. The existing analysis tools can be classified into two categories, i.e. object-oriented and spacecraft-oriented methods, the latter being more accurate than the former. Past object-oriented tools include objects of only simple shapes. For more realistic simulation, we present here an object-oriented tool, the debris reentry and ablation prediction system (DRAPS), developed by the present authors, which extends the object shapes to 15 types and includes 51 predefined motions and the relevant aerodynamic and aerothermal models. The aerodynamic and aerothermal models in DRAPS are validated using the direct simulation Monte Carlo (DSMC) method.
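
    The Monte Carlo uncertainty analysis outlined above can be sketched in miniature: sample uncertain fragment parameters and propagate them through a physical model to obtain the distribution of a risk-relevant quantity. The model and numbers below are a toy illustration (a terminal-velocity proxy for impact speed), not DRAPS itself:

```python
import math
import random
import statistics

random.seed(1)

# Hypothetical uncertainty model for one fragment class:
# ballistic coefficient beta = m / (Cd * A), lognormally distributed.
RHO0, G = 1.225, 9.81  # sea-level air density (kg/m^3), gravity (m/s^2)

def sample_impact_speed():
    beta = random.lognormvariate(math.log(100.0), 0.4)  # kg/m^2
    # Terminal-velocity proxy for the ground impact speed of a survivor.
    return math.sqrt(2.0 * beta * G / RHO0)

speeds = [sample_impact_speed() for _ in range(20000)]
mean = statistics.fmean(speeds)
p95 = sorted(speeds)[int(0.95 * len(speeds))]
print(f"mean impact speed ≈ {mean:.1f} m/s, 95th percentile ≈ {p95:.1f} m/s")
```

    Real tools sample many more parameters (breakup altitude, attitude, material properties) and propagate them through full trajectory and ablation models, but the statistical machinery is the same.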

  14. Towards A Broader Adoption of Agile Software Development Methods

    Directory of Open Access Journals (Sweden)

    Abdallah Alashqur


    Full Text Available Traditionally, software design and development has followed the engineering approach exemplified by the waterfall model, where specifications have to be fully detailed and agreed upon prior to starting the software construction process. Agile software development is a relatively new approach in which, among other characteristics, specifications are allowed to evolve even after the beginning of the development process. Agile methods thus provide more flexibility than the waterfall model, which is a very useful feature in many projects. The adoption of agile methods in software development projects can be further encouraged if certain of their practices and techniques are improved. In this paper, several practices and techniques of agile methods that may hinder their broader acceptance are analyzed, and solutions are proposed to improve these practices and consequently facilitate a wider adoption of agile methods in software development.

  15. Semantic Web Application Development Method and Example Analysis

    Institute of Scientific and Technical Information of China (English)

    李新龙; 黄映辉


    With the continuous development of semantic Web technology, semantic Web applications are receiving more and more attention, but domestic research on them remains limited and development methods are lacking. Based on a study of semantic Web applications, combined with a comparative analysis against conventional Web applications, this paper gives a definition, an architecture and a development method for semantic Web applications, and describes in detail the structural characteristics and construction process of a semantic Web application built on a three-layer architecture of data layer, logic layer and presentation layer. The proposed development method is then verified by constructing an example semantic Web application, achieving the expected result.


    N-Nitrosodimethylamine (NDMA) is a probable human carcinogen that has been identified as a drinking water contaminant of concern. United States Environmental Protection Agency (USEPA) Method 521 has been developed for the analysis of NDMA and six additional N-nitrosamines in dri...

  17. Strategic Options Development and Analysis (United States)

    Ackermann, Fran; Eden, Colin

    Strategic Options Development and Analysis (SODA) enables a group or individual to construct a graphical representation of a problematic situation, and thus explore options and their ramifications with respect to a complex system of goals or objectives. In addition, the method aims to help groups arrive at a negotiated agreement about how to act to resolve the situation. It is based upon the use of causal mapping - a formally constructed means-ends network - as the representation form. Because the picture has been constructed using the natural language of the problem owners, it becomes a model of the situation that is 'owned' by those who define the problem. The use of formalities for the construction of the model makes it amenable to a range of analyses as well as encouraging reflection and a deeper understanding. These analyses can be used in a 'rough and ready' manner by visual inspection or through the use of specialist causal mapping software (Decision Explorer). Each of the analyses helps a group or individual discover important features of the problem situation, and these features facilitate agreeing a good solution. The SODA process is aimed at helping a group learn about the situation they face before they reach agreements. Most significantly, the exploration through the causal map leads to a higher probability of more creative solutions, and promotes solutions that are more likely to be implemented because the problem construction process is wider and more likely to include richer social dimensions about the blockages to action and organizational change. The basic theories that inform SODA derive from cognitive psychology and social negotiation, where the model acts as a continuously changing representation of the problematic situation - changing as the views of a person or group shift through learning and exploration. This chapter is jointly written by two leading practitioner academics, Colin Eden and Fran Ackermann, the original developers of SODA.
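
    One of the 'rough and ready' analyses mentioned, domain analysis, simply counts the links in and out of each concept to flag candidate key issues in the causal map. A minimal sketch on a hypothetical map (Decision Explorer's actual analyses are considerably richer):

```python
from collections import defaultdict

# Toy causal map: "cause -> effect" links between concepts (hypothetical).
links = [
    ("improve team skills", "reduce defects"),
    ("automate testing", "reduce defects"),
    ("reduce defects", "increase customer trust"),
    ("increase customer trust", "grow revenue"),
]

# Domain analysis: total degree (in-links + out-links) of each concept.
degree = defaultdict(int)
for cause, effect in links:
    degree[cause] += 1
    degree[effect] += 1

# Concepts with the highest total degree are candidate "key issues".
for concept, d in sorted(degree.items(), key=lambda kv: -kv[1]):
    print(d, concept)
```

    Here "reduce defects" surfaces as the busiest concept, the kind of node a facilitator would probe first in a SODA workshop.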

  18. Development and application of a multilocus sequence analysis method for the identification of genotypes within genus Bradyrhizobium and for establishing nodule occupancy of soybean (Glycine max L. Merr). (United States)

    van Berkum, Peter; Elia, Patrick; Song, Qijian; Eardly, Bertrand D


    A multilocus sequence typing (MLST) method based on allelic variation at seven chromosomal loci was developed for characterizing genotypes (GT) within the genus Bradyrhizobium. With the method, 29 distinct multilocus GT were identified among 190 culture collection soybean strains. The occupancy of 347 nodules taken from uninoculated field-grown soybean plants was also determined. The bacteroid GT were either the same as or closely related to GT identified among strains in the culture collection. Double-nodule occupancy estimates of 2.9% were much lower than published values based on serology. Of the 347 nodules examined, 337 and 10 were occupied by Bradyrhizobium japonicum and B. elkanii, respectively. The collection strains within the species B. japonicum and B. elkanii were also compared with Bradyrhizobium cultures from other legumes. In many cases, the observed GT varied more according to their geographic origin than by their trap hosts of isolation. In other cases, there were no apparent relationships with either the legume or the geographic source. The MLST method that was developed should be a useful tool in determining the influence of geographic location, temperature, season, soil type, and host plant cultivar on the distribution of GT of Bradyrhizobium spp.
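
    The genotype assignment underlying MLST can be sketched simply: each isolate's allele numbers at the seven loci form a profile, identical profiles share a genotype, and profiles differing at a single locus count as closely related. The profiles below are hypothetical, not the study's data:

```python
# Minimal MLST-style genotype assignment: isolates with identical
# seven-locus allelic profiles share a genotype (profiles hypothetical).
profiles = {
    "USDA110":   (1, 4, 2, 2, 7, 1, 3),
    "isolate_A": (1, 4, 2, 2, 7, 1, 3),   # same profile as USDA110
    "isolate_B": (1, 4, 2, 5, 7, 1, 3),   # single-locus variant
}

genotypes = {}  # profile -> genotype number, in order of discovery
assigned = {}
for strain, profile in profiles.items():
    gt = genotypes.setdefault(profile, len(genotypes) + 1)
    assigned[strain] = gt
print(assigned)

# Single-locus variants (Hamming distance 1) are "closely related" profiles.
def locus_differences(p, q):
    return sum(a != b for a, b in zip(p, q))
print(locus_differences(profiles["USDA110"], profiles["isolate_B"]))
```

    Real MLST schemes first map each locus's sequence to an allele number via a curated database; the profile-to-genotype step is exactly this lookup.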

  19. Uni-dimensional double development HPTLC-densitometry method for simultaneous analysis of mangiferin and lupeol content in mango (Mangifera indica) pulp and peel during storage. (United States)

    Jyotshna; Srivastava, Pooja; Killadi, Bharti; Shanker, Karuna


    Mango (Mangifera indica) fruit is one of the important commercial fruit crops of India. Like other tropical fruits, it is highly perishable. During storage/ripening, changes in its physico-chemical quality parameters, viz. firmness, titratable acidity, total soluble solid content (TSSC), carotenoid content, and other biochemicals, are inevitable. A uni-dimensional double-development high-performance thin-layer chromatography (UDDD-HPTLC) method was developed for the real-time monitoring of mangiferin and lupeol in mango pulp and peel during storage. The quantitative determination of the two compounds, which belong to different classes, was achieved by densitometric HPTLC. Silica gel 60F254 HPTLC plates and two solvent systems, viz. toluene/EtOAC/MeOH and EtOAC/MeOH, respectively, were used for optimum separation and selective evaluation. Densitometric quantitation of mangiferin was performed at 390 nm, and of lupeol at 610 nm after post-chromatographic derivatization. The validated method was used for real-time monitoring of mangiferin and lupeol content during storage in four Indian cultivars, e.g. Bombay green (Bgreen), Dashehari, Langra, and Chausa. Significant correlations (p < 0.05) with quality changes in pulp and peel during storage were also observed.

  20. The Impact of Wind Development on County-Level Income and Employment: A Review of Methods and an Empirical Analysis (Fact Sheet). Wind And Water Power Program (WWPP).

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Jason P. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Pender, John [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Wiser, Ryan [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Lantz, Eric [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Hoen, Ben [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)


    The economic development potential from wind power installations has been a driver of public and policy support for the industry at the local and state levels for many years. The possibility for economic development has been particularly salient in rural areas of the country where new investment, earnings growth, and employment opportunities have, in many cases, otherwise trended downward for some time. Despite frequent mention of the economic development potential of wind power projects, however, questions persist on the magnitude, distribution, and durability of these impacts. Of particular concern for rural communities is whether new investment in wind power projects stimulates long-term local economic growth and employment. Questions about the economic development and employment impacts of wind power also persist at the national level. However, such debates tend to be more concerned with potential economic losses associated with displacement of other energy sources or land uses and the macroeconomic effects of policy support for renewable energy and changes in electricity rates that might result from wind energy deployment. The present analysis focuses solely on county-level impacts.

  1. Genetic analysis of the Saimiri breeding colony of the Pasteur Institute (French Guiana): development of a molecular typing method using a combination of nuclear and mitochondrial DNA markers. (United States)

    Lavergne, Anne; Catzeflis, François; Lacôte, Sandra; Barnaud, Antoine; Bordier, Marion; Mercereau-Puijalon, Odile; Contamin, Hugues


    Saimiri (Cebidae) groups a complex of species and subspecies that present a large morphological plasticity. Genetic analysis is complicated by the absence of consensus on classification criteria and the paucity of molecular tools available for the genus. As the squirrel monkey is widely used in biomedical research, breeding centers have been established, but the genetic make-up and diversity of many of the existing colonies are unknown, precluding a rational breeding policy. To develop a genetic typing strategy for the Saimiri breeding colony of the Pasteur Institute of French Guiana, we used Cytochrome b, a mitochondrial marker, and nuclear microsatellites. Cytochrome b sequences from wild-caught Saimiri boliviensis, Saimiri sciureus sciureus and S. s. collinsi reference specimens and captive animals identified 11 haplotypes, grouped into three distinct clades. An estimate of the genetic variability within each captive morphotype, and of the extent of molecular divergence between the Bolivian, Guyanese and Brazilian breeds, was obtained from the analysis of three nuclear microsatellites. Taxon-specific microsatellites enabled typing of F0-F3 animals, but did not differentiate Brazilian from Guyanese animals. Three-locus microsatellite analysis of a representative sample from each generation showed no trend toward loss of heterozygosity, and identified hybrid animals between the Bolivian and the two other subspecies. These data provide novel evidence for taxonomic classification and a rational strategy to further type the whole colony.

  2. A Method for Collection Development Based on Circulation Analysis

    Institute of Scientific and Technical Information of China (English)

    赖院根; 丹英; 王星


    Library automation has improved access to circulation data in the networked environment, which facilitates circulation analysis for collection development. In view of the limitations of current research, the paper presents a model based on usage quantity and usage pattern. The analysis of usage quantity is meant to understand the disparity of users' needs among subjects and to provide a reference for resource allocation across different areas, while the analysis of usage pattern is meant to reveal users' demand patterns and thus inform the establishment of selection standards for different subjects. The empirical results show that the model can provide strong support for decisions about collection development in libraries.
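The paper's two-angle analysis (request volume per subject, and request pattern per subject) can be sketched in a few lines of Python; the record structure, subject labels, and patron IDs below are hypothetical illustrations, not data from the paper:

```python
from collections import Counter

# Hypothetical circulation records as (subject, patron_id) pairs.
records = [
    ("computing", "p1"), ("computing", "p2"), ("computing", "p1"),
    ("history", "p3"), ("history", "p3"),
    ("chemistry", "p4"),
]

def usage_quantity(records):
    """Requests per subject: informs resource allocation across subjects."""
    return Counter(subject for subject, _ in records)

def usage_pattern(records):
    """Distinct patrons per subject: a crude proxy for breadth of demand."""
    by_subject = {}
    for subject, patron in records:
        by_subject.setdefault(subject, set()).add(patron)
    return {s: len(patrons) for s, patrons in by_subject.items()}
```

Here `usage_quantity` would guide how much to allocate per subject, while `usage_pattern` hints at whether demand is broad or driven by a few heavy users.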

  3. Development of an extraction method and LC-MS analysis for N-acylated-l-homoserine lactones (AHLs) in wastewater treatment biofilms. (United States)

    Wang, Jinfeng; Ding, Lili; Li, Kan; Schmieder, Wilhelm; Geng, Jinju; Xu, Ke; Zhang, Yan; Ren, Hongqiang


    N-Acylated-l-homoserine lactones (AHLs) play a vital role in Gram-negative bacterial communication by promoting the formation of extracellular polymeric substances (EPS) and biofilms. However, the low concentration of these AHL signals makes the process difficult to understand. A robust and sensitive pretreatment method for AHL detection was developed in this work. After comparing eight different solid-phase extraction (SPE) columns and three solid extraction methods, we found that ultrasonic extraction (UE) with an Oasis hydrophilic-lipophilic-balanced (HLB) sorbent in column format, combined with ultra-performance liquid chromatography coupled to tandem mass spectrometry (UPLC-MS/MS), can be successfully used to systematically pretreat moving bed biofilm reactor (MBBR) biological samples, extract AHLs, and determine AHL concentrations in wastewater treatment biofilms. This easy-to-follow protocol makes it ideal for quantitative analyses of AHLs in wastewater treatment biofilms.

  4. Development and validation of automatic HS-SPME with a gas chromatography-ion trap/mass spectrometry method for analysis of volatiles in wines. (United States)

    Paula Barros, Elisabete; Moreira, Nathalie; Elias Pereira, Giuliano; Leite, Selma Gomes Ferreira; Moraes Rezende, Claudia; Guedes de Pinho, Paula


    An automated headspace solid-phase microextraction (HS-SPME) method combined with gas chromatography-ion trap/mass spectrometry (GC-IT/MS) was developed in order to quantify a large number of volatile compounds in wines, such as alcohols, esters, norisoprenoids and terpenes. The procedure was optimized for SPME fiber selection, pre-incubation temperature and time, extraction temperature and time, and salt addition. A central composite experimental design was used in the optimization of the extraction conditions. The volatile compounds showed optimal extraction using a DVB/CAR/PDMS fiber, incubation of 5 ml of wine with 2 g NaCl at 45 °C for 5 min, and subsequent extraction for 30 min at the same temperature. The method allowed the identification of 64 volatile compounds. Afterwards, the method was validated successfully for the most significant compounds and was applied to study the volatile composition of different white wines.

  5. Parallel development of chromatographic and mass-spectrometric methods for quantitative analysis of glycation on an IgG1 monoclonal antibody. (United States)

    Viski, Kornél; Gengeliczki, Zsolt; Lenkey, Krisztián; Baranyáné Ganzler, Katalin


    Monitoring post-translational modifications (PTMs) in biotherapeutics is of paramount importance. In the pharmaceutical industry, chromatography with optical detection is the standard choice for quantitation of product-related impurities, and mass spectrometry is used only for characterization. Parallel development of a boronate affinity chromatography (BAC) method and a mass spectrometric method for quantitative measurement of glycation on a monoclonal antibody (mAb) shed light on the importance of certain characteristics of the individual methods. Non-specific interactions in BAC have to be suppressed with a so-called shielding reagent. We found that an excessive amount of shielding reagent in the chromatographic solvents may cause significant underestimation of glycation. Although contamination of the retained peak with non-glycated isoforms is unavoidable in BAC, our work shows that it can be characterized and quantitated by mass spectrometry. It has been demonstrated that glycation can be measured by mass spectrometry at the intact-protein level with an LOQ value of 3.0% and an error bar of ±0.5%. The BAC and MS methods were found to provide equivalent results. These methods had not been compared from these points of view before.

  6. Data Analysis Methods for Library Marketing (United States)

    Minami, Toshiro; Kim, Eunja

    Our society is rapidly changing into an information society, in which people's needs and requests for information access differ widely from person to person. A library's mission is to provide its users, or patrons, with the most appropriate information, and to fulfill this role libraries have to know the profiles of their patrons. The aim of library marketing is to develop methods based on library data, such as circulation records, book catalogs, and book-usage data. In this paper we first discuss the methodology and importance of library marketing. We then demonstrate its usefulness through examples of analysis methods applied to the circulation records of Kyushu University and Guacheon Library, and the implications obtained from these analyses. Our research is a first step toward a future in which library marketing is an indispensable tool.

  7. Developing a multi-method approach to data collection and analysis for explaining the learning during simulation in undergraduate nurse education. (United States)

    Bland, Andrew J; Tobbell, Jane


    Simulation has become an established feature of undergraduate nurse education and as such requires extensive investigation. Research limited to pre-constructed categories imposed by some questionnaire and interview methods may only provide partial understanding. This is problematic in understanding the mechanisms of learning in simulation-based education as contemporary distributed theories of learning posit that learning can be understood as the interaction of individual identity with context. This paper details a method of data collection and analysis that captures interaction of individuals within the simulation experience which can be analysed through multiple lenses, including context and through the lens of both researcher and learner. The study utilised a grounded theory approach involving 31 under-graduate third year student nurses. Data was collected and analysed through non-participant observation, digital recordings of simulation activity and focus group deconstruction of their recorded simulation by the participants and researcher. Focus group interviews enabled further clarification. The method revealed multiple levels of dynamic data, concluding that in order to better understand how students learn in social and active learning strategies, dynamic data is required enabling researchers and participants to unpack what is happening as it unfolds in action.

  8. Rapid coal proximate analysis by thermogravimetric method

    Energy Technology Data Exchange (ETDEWEB)

    Mao Jianxiong; Yang Dezhong; Zhao Baozhong


    Rapid coal proximate analysis by thermogravimetric analysis (TGA) can be used as an alternative to the standard proximate analysis. This paper presents a program set up to rapidly perform coal proximate analysis using a thermal analyzer and a TGA module. A comparison between coal proximate analyses by the standard method (GB) and by TGA is also given. It shows that most data from TGA fall within the tolerance limits of the standard method.
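The TGA route to proximate analysis reduces to reading plateau masses off the thermogram and taking differences; a minimal sketch of that arithmetic, with illustrative masses rather than data from the paper, assuming the usual sequence of drying in N2, devolatilization, and combustion in air:

```python
def proximate_from_tga(m0, m_dry, m_devol, m_ash):
    """Proximate analysis (wt%, as-received) from four TGA plateau masses:
    m0      initial sample mass
    m_dry   mass after drying (~110 C, N2)         -> moisture
    m_devol mass after devolatilization (~900 C)   -> volatile matter
    m_ash   residue after combustion in air        -> ash
    Fixed carbon is taken by difference."""
    moisture = 100 * (m0 - m_dry) / m0
    volatile = 100 * (m_dry - m_devol) / m0
    ash = 100 * m_ash / m0
    fixed_carbon = 100 - moisture - volatile - ash
    return moisture, volatile, fixed_carbon, ash

# Illustrative 20 mg sample with invented plateau masses.
moisture, vm, fc, ash = proximate_from_tga(20.0, 19.0, 12.0, 2.0)
```

The four fractions sum to 100% by construction, mirroring the standard proximate-analysis convention of reporting fixed carbon by difference.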

  9. Quantitative gold nanoparticle analysis methods: A review. (United States)

    Yu, Lei; Andriola, Angelo


    Research and development in the preparation, characterization, and applications of gold nanoparticles (AuNPs) have burgeoned in recent years. Many of the techniques and protocols are very mature, but the mass production and consumption of AuNP-based products raise two major concerns. First, how many AuNPs exist in a dispersion? Second, where are the AuNPs after digestion by the environment, and how many remain? To answer these two questions, reliable and reproducible methods are needed to analyze the existence and the population of AuNPs in samples. This review summarized the most recent chemical and particle quantitative analysis methods that have been used to characterize the concentration (in moles of gold per liter) or population (in particles per mL) of AuNPs. The methods summarized in this review include mass spectrometry, electroanalytical methods, spectroscopic methods, and particle counting methods. These methods may count the number of AuNPs directly or analyze the total concentration of elemental gold in an AuNP dispersion.
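The review's two reporting units (moles of gold per liter versus particles per mL) are linked by the particle size; a sketch of that conversion for spherical AuNPs, assuming monodisperse particles and bulk gold density (19.3 g/cm³), which are simplifying assumptions rather than anything stated in the review:

```python
import math

AVOGADRO = 6.022e23   # mol^-1
RHO_AU = 19.3         # g/cm^3, bulk gold density
M_AU = 196.97         # g/mol, molar mass of gold

def atoms_per_particle(d_nm):
    """Gold atoms in one spherical AuNP of diameter d_nm (nanometers)."""
    volume_cm3 = (math.pi / 6) * (d_nm * 1e-7) ** 3  # 1 nm = 1e-7 cm
    return volume_cm3 * RHO_AU / M_AU * AVOGADRO

def particles_per_ml(c_mol_per_l, d_nm):
    """Convert total gold concentration (mol/L) to particle number per mL."""
    atoms_per_ml = c_mol_per_l / 1000 * AVOGADRO
    return atoms_per_ml / atoms_per_particle(d_nm)
```

For example, a 10 nm particle contains roughly 3 x 10^4 gold atoms, so a 0.1 mM gold dispersion of 10 nm particles holds on the order of 10^12 particles per mL.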

  10. Development of Safety Analysis Technology for LMR

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Y. B.; Kwon, Y. M.; Kim, E. K. [KAERI, Daejeon (Korea, Republic of)


    In the safety analysis code system development area, an analysis code for flow blockage was brought to completion through an integrated validation of MATRA-LMR-FB. The safety analysis code SSC-K has evolved through the addition of detailed reactivity models and a three-dimensional core T/H model, and the development of a Windows version. A basic analysis module for SFR features has also been developed, incorporating a numerical method, best-estimate correlations, and a code structure module. For the analysis of the HCDA initiating phase, a sodium boiling model to be linked to SSC-K and a fuel transient performance/cladding failure model have been developed, together with a state-of-the-art study on molten fuel movement models. Scoping analysis models for the post-accident heat removal phase have been developed as well. In the safety analysis area, the safety criteria for the KALIMER-600 have been set up, and internal flow channel blockage and local faults have been analyzed for the assembly safety evaluation, while key safety concepts of the KALIMER-600 have been investigated through analyses of ATWS as well as design basis accidents such as TOP and LOF, from which the inherent safety due to core reactivity feedback has been assessed. The HCDA analysis for the initiating phase and an estimation of the core energy release subsequently followed, along with setup of the safety criteria and T/H analysis for the core catcher. The thermal-hydraulic behaviors, released radioactivity sources, and dose rates in the containment have been analyzed for its performance evaluation. Finally, the display of a database of research products on the KALIMER website and the detailed process planning with its status analysis have been made feasible by achievements in the area of integrated technology development and establishment.

  11. Development of a flow method for the determination of phosphate in estuarine and freshwaters-Comparison of flow cells in spectrophotometric sequential injection analysis

    Energy Technology Data Exchange (ETDEWEB)

    Mesquita, Raquel B.R. [CBQF/Escola Superior de Biotecnologia, Universidade Catolica Portuguesa, R. Dr. Antonio Bernardino de Almeida, 4200-072 Porto (Portugal); Laboratory of Hydrobiology, Institute of Biomedical Sciences Abel Salazar (ICBAS) and Institute of Marine Research (CIIMAR), Universidade do Porto, Lg. Abel Salazar 2, 4099-003 Porto (Portugal); Ferreira, M. Teresa S.O.B. [CBQF/Escola Superior de Biotecnologia, Universidade Catolica Portuguesa, R. Dr. Antonio Bernardino de Almeida, 4200-072 Porto (Portugal); Toth, Ildiko V. [REQUIMTE, Departamento de Quimica, Faculdade de Farmacia, Universidade de Porto, Rua Anibal Cunha, 164, 4050-047 Porto (Portugal); Bordalo, Adriano A. [Laboratory of Hydrobiology, Institute of Biomedical Sciences Abel Salazar (ICBAS) and Institute of Marine Research (CIIMAR), Universidade do Porto, Lg. Abel Salazar 2, 4099-003 Porto (Portugal); McKelvie, Ian D. [School of Chemistry, University of Melbourne, Victoria 3010 (Australia); Rangel, Antonio O.S.S., E-mail: [CBQF/Escola Superior de Biotecnologia, Universidade Catolica Portuguesa, R. Dr. Antonio Bernardino de Almeida, 4200-072 Porto (Portugal)


    Highlights: • Sequential injection determination of phosphate in estuarine and freshwaters. • Alternative spectrophotometric flow cells are compared. • Minimization of the schlieren effect was assessed. • The proposed method can cope with wide salinity ranges. • The multi-reflective cell shows clear advantages. - Abstract: A sequential injection system with a dual analytical line was developed and applied to the comparison of two different detection systems under the same experimental conditions: a conventional spectrophotometer with a commercial flow cell, and a multi-reflective flow cell coupled with a photometric detector. The study was based on the spectrophotometric determination of phosphate using the molybdenum-blue chemistry. The two alternative flow cells were compared in terms of their response to variation of sample salinity, susceptibility to interferences, and refractive index changes. The developed method was applied to the determination of phosphate in natural waters (estuarine, river, well and ground waters). The achieved detection limit (0.007 μM PO₄³⁻) is consistent with the requirements of the target water samples, and a wide quantification range (0.024-9.5 μM) was achieved using both detection systems.
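A detection limit like the one quoted above is conventionally estimated from replicate blank measurements and the calibration slope (the 3σ criterion). A sketch of that calculation with invented blank readings and an invented slope, not the paper's data:

```python
import statistics

def detection_limit(blank_signals, slope):
    """3-sigma detection limit: 3 * SD(blank replicates) / calibration slope."""
    return 3 * statistics.stdev(blank_signals) / slope

# Hypothetical blank absorbances and a hypothetical slope (absorbance per uM).
blanks = [0.010, 0.012, 0.011, 0.009, 0.013]
lod = detection_limit(blanks, slope=0.65)
```

The same arithmetic applied to each flow cell's own blank noise and slope is what makes the two detection systems directly comparable.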

  12. Numerical analysis method for linear induction machines. (United States)

    Elliott, D. G.


    A numerical analysis method has been developed for linear induction machines such as liquid metal MHD pumps and generators and linear motors. Arbitrary phase currents or voltages can be specified and the moving conductor can have arbitrary velocity and conductivity variations from point to point. The moving conductor is divided into a mesh and coefficients are calculated for the voltage induced at each mesh point by unit current at every other mesh point. Combining the coefficients with the mesh resistances yields a set of simultaneous equations which are solved for the unknown currents.
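The final step described above (combining the induced-voltage coefficients with the mesh resistances into simultaneous equations and solving for the unknown currents) is a dense linear solve. A toy two-mesh sketch with invented impedance and voltage values, using a small pure-Python solver:

```python
def solve_linear(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    # Build an augmented matrix so the originals are untouched.
    M = [row[:] + [b_i] for row, b_i in zip(A, b)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# Invented 2-point mesh: diagonal terms combine mesh resistance,
# off-diagonal terms are the mutual induced-voltage coefficients.
Z = [[2.0, 1.0],
     [1.0, 3.0]]
V = [3.0, 5.0]          # specified voltages at the mesh points
I = solve_linear(Z, V)  # unknown currents at the mesh points
```

A real machine model would have one row per mesh point and position-dependent conductivity folded into the diagonal, but the solve itself has this shape.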

  13. Metallurgical and chemical characterization of copper alloy reference materials within laser ablation inductively coupled plasma mass spectrometry: Method development for minimally-invasive analysis of ancient bronze objects

    Energy Technology Data Exchange (ETDEWEB)

    Walaszek, Damian, E-mail: [Laboratory for Analytical Chemistry, Swiss Federal Laboratories for Materials Science and Technology, Überlandstrasse 129, CH-8600 Dübendorf (Switzerland); University of Warsaw, Faculty of Chemistry, Pasteura 1, 02-093 Warsaw (Poland); Senn, Marianne [Laboratory for Analytical Chemistry, Swiss Federal Laboratories for Materials Science and Technology, Überlandstrasse 129, CH-8600 Dübendorf (Switzerland); Faller, Markus [Laboratory for Jointing Technology and Corrosion, Swiss Federal Laboratories for Materials Science and Technology, Überlandstrasse 129, CH-8600 Dübendorf (Switzerland); Philippe, Laetitia [Laboratory for Mechanics of Materials and Nanostructures, Swiss Federal Laboratories for Materials Science and Technology, Feuerwerkstrasse 39, CH-3602 Thun (Switzerland); Wagner, Barbara; Bulska, Ewa [University of Warsaw, Faculty of Chemistry, Pasteura 1, 02-093 Warsaw (Poland); Ulrich, Andrea [Laboratory for Analytical Chemistry, Swiss Federal Laboratories for Materials Science and Technology, Überlandstrasse 129, CH-8600 Dübendorf (Switzerland)


    The chemical composition of ancient metal objects provides important information for manufacturing studies and authenticity verification of ancient copper or bronze artifacts. Non- or minimally-destructive analytical methods are preferred to mitigate visible damage. Laser ablation inductively coupled plasma mass spectrometry (LA-ICPMS) enables the determination of major elements as well as impurities down to low ppm levels; however, the accuracy and precision of the analysis strongly depend on the homogeneity of the reference materials used for calibration. Moreover, appropriate analytical procedures are required, e.g. in terms of ablation strategies (scan mode, spot size, etc.). This study reviews available copper alloy (certified) reference materials — (C)RMs — from different sources and contributes new metallurgical data on homogeneity and spatial elemental distribution. Investigations of the standards were performed by optical and scanning electron microscopy with energy-dispersive X-ray spectrometry (SEM-EDX) for the following copper alloy and bronze (certified) reference materials: NIST 454, BAM 374, BAM 211, BAM 227, BAM 374, BAM 378, BAS 50.01-2, BAS 50.03-4, and BAS 50.04-4. Additionally, the influence of inhomogeneities on different ablation and calibration strategies is evaluated to define an optimum analytical strategy in terms of line scan versus single spot ablation, variation of spot size, and selection of the most appropriate RMs or minimum number of calibration reference materials. - Highlights: • New metallographic data for copper alloy reference materials are provided. • The influence of RM homogeneity on the quality of LA-ICPMS analysis was evaluated. • Ablation and calibration strategies were critically discussed. • An LA-ICPMS method is proposed for analyzing the most typical ancient copper alloys.

  14. Relativity Concept Inventory: Development, Analysis, and Results (United States)

    Aslanides, J. S.; Savage, C. M.


    We report on a concept inventory for special relativity: the development process, data analysis methods, and results from an introductory relativity class. The Relativity Concept Inventory tests understanding of relativistic concepts. An unusual feature is confidence testing for each question. This can provide additional information; for example,…

  15. Method development and HPLC analysis of retail foods and beverages for copper chlorophyll (E141[i]) and chlorophyllin (E141[ii]) food colouring materials. (United States)

    Scotter, Michael J; Castle, Laurence; Roberts, Dominic


    An analytical method using high performance liquid chromatography with photodiode array and fluorescence detection has been developed and applied to the determination of the food colour additives copper chlorophylls and copper chlorophyllins (E141[i] and [ii]) in foods and beverages. The analytical procedures from previously reported methods have been refined to cover a range of food colour formulations and retail foods. The method was single-laboratory validated. Recoveries of the polar copper chlorophyllins from spiked samples (at 14.5 mg/kg in all but one case) were in the range 79-109%, except for jelly sweets (49%). Recoveries of the relatively non-polar copper chlorophylls were in the range 77-107% (except for 'made' jelly at 50%). The %RSD for recoveries was generally below 12%. Quantitative estimates of the total copper chlorophyll/chlorophyllin content of a small range of food commodities are reported, based on the use of trisodium copper chlorophyllin as a surrogate standard. The majority of E141-containing foods and colour formulations analysed exhibited a multiplicity of components due to the various extraction and purification processes used to obtain these colour additives. This was confounded by the presence of overwhelming amounts of native chlorophylls in certain samples (e.g. mint sauce). Food commodities containing significant amounts of emulsifiers (i.e. ice cream), gelatine or fats were problematic during extraction; hence further development of extraction regimes is desirable for such products. All of the samples analysed with added E141 had estimated total copper chlorophyllin contents below 15 mg/kg (range 0.7-13.0 mg/kg).

  16. UHPLC/MS-MS Analysis of Six Neonicotinoids in Honey by Modified QuEChERS: Method Development, Validation, and Uncertainty Measurement

    Directory of Open Access Journals (Sweden)

    Michele Proietto Galeano


    Rapid and reliable multiresidue analytical methods were developed and validated for the determination of six neonicotinoid pesticides (acetamiprid, clothianidin, imidacloprid, nitenpyram, thiacloprid, and thiamethoxam) in honey. A modified QuEChERS method allowed a very rapid and efficient single-step extraction, while detection was performed by UHPLC/MS-MS. The recovery studies were carried out by spiking the samples at two concentration levels (10 and 40 μg/kg). The methods were subjected to a thorough validation procedure. The mean recovery was in the range of 75 to 114%, with repeatability below 20%. The limits of detection were below 2.5 μg/kg, while the limits of quantification did not exceed 4.0 μg/kg. The total uncertainty was evaluated taking the main independent uncertainty sources into consideration. The expanded uncertainty did not exceed 49% for the 10 μg/kg concentration level and was in the range of 16-19% for the 40 μg/kg fortification level.
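The validation figures above (mean recovery, repeatability as %RSD, and expanded uncertainty) follow standard formulas: recovery is mean found over spiked level, repeatability is the relative standard deviation, and expanded uncertainty combines independent standard uncertainties in quadrature with a coverage factor (conventionally k = 2). A sketch with invented replicate data, not the paper's measurements:

```python
import math
import statistics

def recovery_pct(measured, spiked):
    """Mean found concentration over spiked level, as a percentage."""
    return 100 * statistics.mean(measured) / spiked

def rsd_pct(measured):
    """Relative standard deviation (repeatability), percent."""
    return 100 * statistics.stdev(measured) / statistics.mean(measured)

def expanded_uncertainty(u_components, k=2):
    """Combine independent standard uncertainties in quadrature, expand by k."""
    return k * math.sqrt(sum(u * u for u in u_components))

# Invented replicate results (ug/kg) at a hypothetical 10 ug/kg spike level.
found = [9.0, 10.0, 11.0]
```

With these invented numbers the recovery is 100% and the RSD 10%, both inside the acceptance ranges the abstract reports.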

  17. Development of a LC-MS/MS Method for the Analysis of Enniatins and Beauvericin in Whole Fresh and Ensiled Maize

    DEFF Research Database (Denmark)

    Sørensen, Jens Laurids; Nielsen, Kristian Fog; Rasmussen, Peter Have


    An LC-MS/MS method for the detection of beauvericin and the four enniatins A, A1, B, and B1 in maize and maize silage was developed. The method uses direct injection of maize extracts without any tedious and laborious cleanup procedures. The limit of quantification was determined at 13 ng g⁻¹ for beauvericin and at 17, 34, 24, and 26 ng g⁻¹ for enniatins A, A1, B, and B1, respectively. The method was used in surveys of the compounds in fresh maize samples collected at harvest in 2005 and 2006. All samples had the same distribution of the enniatins: B > B1 > A1 > A. Enniatin B was present in 90… [The occurrence of the compounds] was examined in 3-month-old maize silage stacks from 20 different farms. As observed in fresh maize, enniatin B was the most abundant compound in ensiled maize and was found in 19 stacks at levels up to 218 ng g⁻¹. The stability of enniatin B in maize silage was assessed by analyzing samples from 10…

  18. Development and Validation of a Normal Phase Chiral HPLC Method for Analysis of Afoxolaner Using a Chiralpak® AD-3 Column. (United States)

    Zhuang, Jinyou; Kumar, Satish; Rustum, Abu


    Afoxolaner is a new antiparasitic molecule from the isoxazoline family that acts on the insect and acarine gamma-aminobutyric acid and glutamate receptors. The isoxazoline family of compounds has been employed as the active pharmaceutical ingredient in drug products prescribed for the control of fleas and ticks in dogs. Afoxolaner, with a chiral center on the isoxazoline ring, exists as a racemic mixture. A normal phase chiral high performance liquid chromatography analytical method has been developed to verify that afoxolaner is a racemic mixture, as demonstrated by specific rotation, as well as to determine the enantiomeric purity of single-enantiomer samples. A Chiralpak® AD-3 column (150 × 4.6 mm I.D.) maintained at 35°C was used in the method. Analytes were analyzed with an isocratic elution using n-hexane/IPA/MeOH (89:10:1, v/v/v) as the mobile phase with a detection wavelength of 312 nm. The desired separation of the two enantiomers was achieved in <10 minutes, with resolution and selectivity factors of 5.0 and 1.54, respectively. The analytical method was appropriately validated according to ICH guidelines for its intended use.

  19. Development of a fast extraction method and optimization of liquid chromatography-mass spectrometry for the analysis of phenolic compounds in lentil seed coats. (United States)

    Mirali, Mahla; Ambrose, Stephen J; Wood, Stephen A; Vandenberg, Albert; Purves, Randy W


    A systematic set of optimization experiments was conducted to design an efficient extraction and analysis protocol for screening six different sub-classes of phenolic compounds in the seed coat of various lentil (Lens culinaris Medik.) genotypes. Different compounds from the anthocyanidin, flavan-3-ol, proanthocyanidin, flavanone, flavone, and flavonol sub-classes were first optimized for use as standards for liquid chromatography-mass spectrometry (LC-MS) with UV detection. The effects of maceration duration, reconstitution solvent, and extraction solvent were investigated using lentil genotype CDC Maxim. Chromatographic conditions were optimized by examining column separation efficiencies, organic composition, and solvent gradient. The results showed that a 1 h maceration step was sufficient and that non-acidified solvents were more appropriate; a 70:30 acetone:water (v/v) solvent was ultimately selected. Using a Kinetex PFP column, the organic concentration, gradient, and flow rate were optimized to maximize the resolution of phenolic compounds within a short 30-min analysis time. The optimized method was applied to three lentil genotypes with different phenolic compound profiles to provide information of value to breeding programs.

  20. Prognostic Analysis System and Methods of Operation (United States)

    MacKey, Ryan M. E. (Inventor); Sneddon, Robert (Inventor)


    A prognostic analysis system and methods of operating the system are provided. In particular, a prognostic analysis system for the analysis of physical-system health, applicable to mechanical, electrical, chemical and optical systems, and methods of operating the system are described herein.

  1. Analysis of the Development Methods of SCM (Single-Chip Microcomputer) Application Systems

    Institute of Scientific and Technical Information of China (English)



    At present, along with China's overall favorable development of science and technology and the deepening reform of the relevant technologies, the single-chip microcomputer (SCM) has come into wide use in many fields in China and has achieved good application results. To ensure the effectiveness of SCMs in specific uses, the relevant personnel must do a good job in developing SCM application systems. This paper briefly analyzes the development methods of SCM application systems.

  2. Development and validation of a hydrophilic interaction chromatography method coupled with a charged aerosol detector for quantitative analysis of nonchromophoric α-hydroxyamines, organic impurities of metoprolol. (United States)

    Xu, Qun; Tan, Shane; Petrova, Katya


    The European Pharmacopeia (EP) metoprolol impurities M and N are polar, nonchromophoric α-hydroxyamines, which are poorly retained in a conventional reversed-phase chromatographic system and are invisible to UV detection. Impurities M and N are currently analyzed by TLC methods in the EP as specified impurities and in the United States Pharmacopeia-National Formulary (USP-NF) as unspecified impurities. In order to modernize the USP monographs of metoprolol drug substances and related drug products, a hydrophilic interaction chromatography (HILIC) method coupled with a charged aerosol detector (CAD) was explored for the analysis of the two impurities. A comprehensive column screening covering a variety of HILIC stationary phases (underivatized silica, amide, diol, amino, zwitterionic, polysuccinimide, cyclodextrin, and mixed-mode) and optimization of HPLC conditions led to the identification of a Halo Penta HILIC column (4.6 × 150 mm, 5 μm) and a mobile phase comprising 85% acetonitrile and 15% ammonium formate buffer (100 mM, pH 3.2). Efficient separations of metoprolol, succinic acid, and EP metoprolol impurities M and N were achieved within a short time frame. The method was applied to metoprolol drug substance (metoprolol succinate) and drug products (metoprolol tartrate injection and metoprolol succinate extended-release tablets).

  3. Development of a technique using MCNPX code for determination of nitrogen content of explosive materials using prompt gamma neutron activation analysis method

    Energy Technology Data Exchange (ETDEWEB)

    Nasrabadi, M.N., E-mail: [Department of Nuclear Engineering, Faculty of Advanced Sciences and Technologies, University of Isfahan, Isfahan 81746-73441 (Iran, Islamic Republic of); Bakhshi, F.; Jalali, M.; Mohammadi, A. [Department of Nuclear Engineering, Faculty of Advanced Sciences and Technologies, University of Isfahan, Isfahan 81746-73441 (Iran, Islamic Republic of)


    Nuclear-based explosive detection methods can detect explosives by identifying their elemental components, especially nitrogen. Thermal neutron capture reactions have been used for detecting prompt gamma 10.8 MeV following radioactive neutron capture by {sup 14}N nuclei. We aimed to study the feasibility of using field-portable prompt gamma neutron activation analysis (PGNAA) along with improved nuclear equipment to detect and identify explosives, illicit substances or landmines. A {sup 252}Cf radio-isotopic source was embedded in a cylinder made of high-density polyethylene (HDPE) and the cylinder was then placed in another cylindrical container filled with water. Measurements were performed on high nitrogen content compounds such as melamine (C{sub 3}H{sub 6}N{sub 6}). Melamine powder in a HDPE bottle was placed underneath the vessel containing water and the neutron source. Gamma rays were detected using two NaI(Tl) crystals. The results were simulated with MCNP4c code calculations. The theoretical calculations and experimental measurements were in good agreement indicating that this method can be used for detection of explosives and illicit drugs.

  4. Simultaneous quantitation of 14 active components in Yinchenhao decoction with an ultrahigh performance liquid chromatography-diode array detector: Method development and ingredient analysis of different commonly prepared samples. (United States)

    Yi, YaXiong; Zhang, Yong; Ding, Yue; Lu, Lu; Zhang, Tong; Zhao, Yuan; Xu, XiaoJun; Zhang, YuXin


    J. Sep. Sci. 2016, 39, 4147-4157 DOI: 10.1002/jssc.201600284 Yinchenhao decoction (YCHD) is a famous Chinese herbal formula recorded in the Shang Han Lun, which was compiled by Zhang Zhongjing during 150-219 AD. A novel quantitative analysis method was developed, based on ultrahigh performance liquid chromatography coupled with a diode array detector, for the simultaneous determination of 14 main active components in Yinchenhao decoction. Furthermore, the method has been applied to a compositional difference analysis of the 14 components in eight normal extraction samples of Yinchenhao decoction, with the aid of hierarchical clustering analysis and similarity analysis. The present research could help hospitals, factories and labs choose the best way to prepare Yinchenhao decoction with better efficacy.

  5. Development of dissolution method for benznidazole tablets

    Directory of Open Access Journals (Sweden)

    Ádley Antonini Neves de Lima


    Full Text Available The aim of this work was the development of a dissolution method for benznidazole (BNZ) tablets. Three different dissolution media, two stirring speeds, and apparatus 2 (paddle) were used. The drug dissolution profiles were compared through the dissolution efficiency. The assay was performed by spectrophotometry at 324 nm. The best conditions were: sodium chloride-hydrochloric acid buffer pH 1.2, stirring speed of 75 rpm, volume of 900 mL, and the paddle apparatus. Based on the results, it can be concluded that the developed method is an efficient alternative for dissolution assays of benznidazole tablets.
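The dissolution-efficiency comparison described above reduces to an area calculation: DE% is the area under the percent-dissolved curve divided by the area of 100% dissolution over the same interval. A minimal sketch; the time points and percent-dissolved values below are illustrative assumptions, not the study's measurements:

```python
# Dissolution efficiency (DE%): area under the percent-dissolved curve
# (trapezoidal rule) over the area of the 100%-dissolution rectangle.
# Data points are illustrative only, not from the benznidazole study.
times = [0, 5, 10, 15, 30, 45, 60]         # minutes
dissolved = [0, 20, 45, 65, 85, 92, 95]    # percent of label claim

def dissolution_efficiency(t, q):
    auc = sum((t[i + 1] - t[i]) * (q[i + 1] + q[i]) / 2
              for i in range(len(t) - 1))
    return 100.0 * auc / (100.0 * (t[-1] - t[0]))

de = dissolution_efficiency(times, dissolved)   # about 72.4 %
```

Two formulations (or two sets of conditions) are then compared through their DE values, as done in the study.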

  6. Convergence analysis of combinations of different methods

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Y. [Clarkson Univ., Potsdam, NY (United States)


    This paper provides a convergence analysis for combinations of different numerical methods for solving systems of differential equations. The author proves that combinations of two convergent linear multistep methods or Runge-Kutta methods produce a new convergent method of which the order is equal to the smaller order of the two original methods.
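The stated result (the combination inherits the smaller of the two orders) can be checked numerically. A sketch assuming alternating steps of explicit Euler (order 1) and Heun's method (order 2) on the test problem y' = y; this method pair and test problem are our illustration, not taken from the paper:

```python
import math

def euler_step(f, t, y, h):          # order 1
    return y + h * f(t, y)

def heun_step(f, t, y, h):           # order 2
    k1 = f(t, y)
    k2 = f(t + h, y + h * k1)
    return y + h / 2.0 * (k1 + k2)

def combined_solve(f, t0, y0, t_end, n):
    # Alternate the two convergent one-step methods, step by step.
    h = (t_end - t0) / n
    t, y = t0, y0
    for i in range(n):
        y = euler_step(f, t, y, h) if i % 2 == 0 else heun_step(f, t, y, h)
        t += h
    return y

f = lambda t, y: y                   # exact solution: e^t
e1 = abs(combined_solve(f, 0.0, 1.0, 1.0, 200) - math.e)
e2 = abs(combined_solve(f, 0.0, 1.0, 1.0, 400) - math.e)
order = math.log2(e1 / e2)           # observed order, close to 1 (the smaller)
```

Halving the step size roughly halves the error, so the observed order is about 1, matching the theorem.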

  7. Probabilistic methods in combinatorial analysis

    CERN Document Server

    Sachkov, Vladimir N


    This 1997 work explores the role of probabilistic methods for solving combinatorial problems. These methods not only provide the means of efficiently using such notions as characteristic and generating functions, the moment method and so on but also let us use the powerful technique of limit theorems. The basic objects under investigation are nonnegative matrices, partitions and mappings of finite sets, with special emphasis on permutations and graphs, and equivalence classes specified on sequences of finite length consisting of elements of partially ordered sets; these specify the probabilist


    Directory of Open Access Journals (Sweden)

    Orlov A. I.


    Full Text Available On the basis of objective analysis it must be noted that the arsenal of managers, especially foreign ones, contains practically no fundamentally new methods and tools of controlling, according to the executive director of the Russian Association of Controllers, Prof. S. G. Falco. However, promising mathematical and instrumental methods of controlling are actively developed in our country, and it is necessary to implement them. For example, managers can use the techniques discussed in the book by Orlov A.I., Lutsenko E.V., and Loikaw V.I., "Advanced Mathematical and Instrumental Methods of Controlling" (2015). These methods are based on the modern development of mathematics as a whole, on system interval fuzzy mathematics (see the book of the same name by Orlov A.I. and Lutsenko E.V., 2014). The considered methods are developed in accordance with the new paradigm of mathematical methods of research, which includes new paradigms of applied statistics, mathematical statistics, mathematical methods of economics, and methods of analysis of statistical and expert data in management and control. In the XXI century, more than 10 books developed in accordance with this new paradigm have been issued. The systems approach to solving specific applications often requires going beyond economics. The procedures for the introduction of innovative methods and tools are very important. In this article we consider the above research results in their interconnection.

  9. Applying critical analysis - main methods

    Directory of Open Access Journals (Sweden)

    Miguel Araujo Alonso


    Full Text Available What is the usefulness of critical appraisal of literature? Critical analysis is a fundamental condition for the correct interpretation of any study that is subject to review. In epidemiology, in order to learn how to read a publication, we must be able to analyze it critically. Critical analysis allows us to check whether a study fulfills certain previously established methodological inclusion and exclusion criteria. This is frequently used in conducting systematic reviews, although eligibility criteria are generally limited to the study design. Critical analysis of literature can be done implicitly while reading an article, as in reading for personal interest, or can be conducted in a structured manner, using explicit and previously established criteria. The latter is done when formally reviewing a topic.

  10. Development of ELISA-analysis methods for the quantification of bioactive natural products in plants, phytomedicines and in humans or similar. (United States)

    Tanaka, H; Shoyama, Y


    In the course of a program developing new ELISA methods for the quantification of bioactive natural products in plants, phytomedicines and animals on a μg and ng scale, monoclonal antibodies against various natural products of medicinal and analytical importance have been developed. The ratio of hapten to bovine serum albumin (BSA) in an antigen conjugate was determined by matrix-assisted laser desorption/ionization (MALDI) mass spectrometry. A hybridoma secreting monoclonal antibodies (MAb) was produced by fusing splenocytes immunized with an antigen-BSA conjugate with HAT-sensitive mouse myeloma cells. The cross-reaction of anti-forskolin antibodies with 7-deacetyl-forskolin was 5.6%. A very small cross-reaction appeared with other derivatives. The full measuring range of the assay extends from 5 ng to 5 μg/ml of forskolin. Immunoaffinity column chromatography using anti-forskolin MAbs appears to be far superior to previously published separation methods. The capacity of the immunoaffinity column as determined by ELISA is 9 μg/ml. Forskolin has been isolated directly from the crude extracts of tuberous roots and the callus culture of Coleus forskohlii. A MAb against Δ(1)-tetrahydrocannabinolic acid (THCA) was produced. The cross-reaction of the anti-Δ(1)-THCA antibody against other cannabinoids was very wide. Many cannabinoids and a spiro-compound were reactive, but other phenolics did not react. It became evident that this ELISA method could be applied to biotransformation experiments of cannabinoids in a plant tissue culture system. Anti-solamargine MAbs were produced. A method for the determination of solasodine glycosides using TLC-immunostaining was established. Solasodine glycosides separated by silica gel TLC were transferred to a polyvinylidene difluoride (PVDF) membrane. The membrane was treated with NaIO(4) solution followed by BSA, resulting in a solasodine glycoside-BSA conjugate. Immunostaining of solasodine glycosides was more

  11. Development of new hole expansion testing method (United States)

    Kim, Hyunok; Shang, Jianhui; Beam, Kevin; Samant, Anoop; Hoschouer, Cliff; Dykeman, Jim


    This paper introduces a new hole expansion (HE) testing method that could be more relevant to the edge cracking problem observed in stamping advanced high strength steel (AHSS). The new testing method adopted a large hole diameter of 75 mm, compared to the standard hole diameter of 10 mm. An inline monitoring system was developed to visually monitor the hole edge cracking during the test and to synchronize the load-displacement data with the recorded video for capturing the initial crack. The new hole expansion testing method was found to be effective in evaluating edge cracking by considering the effects of material properties and trimming methods. It showed a much larger difference in the HE ratio between DP980 and TRIP780, up to 11%, compared to the standard HE testing method, which gave less than a 2% difference.
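The HE ratio compared above is conventionally the relative increase in hole diameter at the first through-thickness crack. A sketch assuming that standard definition; the example diameters are illustrative, not measurements from the paper:

```python
def hole_expansion_ratio(d0, df):
    """HE ratio (%) from initial hole diameter d0 and diameter df at cracking."""
    return 100.0 * (df - d0) / d0

# Illustrative only: a 75 mm hole (the new method's size) that expands
# to 97.5 mm before the first through-thickness crack appears.
he = hole_expansion_ratio(75.0, 97.5)   # 30 % expansion
```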

  12. Cost Analysis: Methods and Realities. (United States)

    Cummings, Martin M.


    Argues that librarians need to be concerned with cost analysis of library functions and services because, in the allocation of resources, decision makers will favor library managers who demonstrate understanding of the relationships between costs and productive outputs. Factors that should be included in a reliable scheme for cost accounting are…

  13. Development of medical application methods using radiation. Radionuclide therapy

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Chang Woon; Lim, S. M.; Kim, E.H.; Woo, K. S.; Chung, W. S.; Lim, S. J.; Choi, T. H.; Hong, S. W.; Chung, H. Y.; No, W. C. [Korea Atomic Energy Research Institute. Korea Cancer Center Hospital, Seoul, (Korea, Republic of); Oh, B. H. [Seoul National University. Hospital, Seoul (Korea, Republic of); Hong, H. J. [Antibody Engineering Research Unit, Taejon (Korea, Republic of)


    In this project, we studied the following subjects: 1. development of monoclonal antibodies and radiopharmaceuticals; 2. clinical applications of radionuclide therapy; 3. radioimmunoguided surgery; 4. prevention of restenosis with intracoronary radiation. The results can be applied toward the following objectives: (1) radionuclide therapy will be applied in clinical practice to treat cancer patients or other diseases in multi-center trials. (2) The newly developed monoclonal antibodies and biomolecules can be used in biology, chemistry or other basic life science research. (3) The new methods for the analysis of therapeutic effects, such as dosimetry and quantitative analysis of radioactivity, can be applied in basic research such as radiation oncology and radiation biology.

  14. Using numerical analysis to develop and evaluate the method of high temperature sous-vide to soften carrot texture in different-sized packages. (United States)

    Hong, Yoon-Ki; Uhm, Joo-Tae; Yoon, Won Byong


    The high-temperature sous-vide (HTSV) method was developed to prepare carrots with a soft texture at the appropriate degree of pasteurization. The effect of heating conditions, such as temperature and time, was investigated on various package sizes. Heating temperatures of 70, 80, and 90 °C and heating times of 10 and 20 min were used to evaluate the HTSV method. A 3-dimensional conduction model and numerical simulations were used to estimate the temperature distribution and the rate of heat transfer to samples with various geometries. Four different-sized packages were prepared by stacking carrot sticks of identical size (9.6 × 9.6 × 90 mm) in a row. The sizes of the packages used were as follows: (1) 9.6 × 86.4 × 90, (2) 19.2 × 163.2 × 90, (3) 28.8 × 86.4 × 90, and (4) 38.4 × 86.4 × 90 mm. Although only a moderate change in color (L*, a*, and b*) was observed following HTSV cooking, there was a significant decrease in carrot hardness. The geometry of the package and the heating conditions significantly influenced the degree of pasteurization and the final texture of the carrots. Numerical simulations successfully described the effect of geometry on samples at different heating conditions.
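The conduction model behind the simulations can be sketched in one dimension (the study uses a 3-D model; 1-D is shown here for brevity). The diffusivity, geometry, bath temperature, and heating time below are illustrative assumptions, not the paper's parameters:

```python
# 1-D explicit finite-difference sketch of transient heat conduction in a
# slab held at the bath temperature on both faces. All values illustrative.
alpha = 1.4e-7              # thermal diffusivity, m^2/s (typical plant tissue)
L = 0.0096                  # slab thickness, m (one carrot-stick dimension)
n = 21                      # grid nodes across the slab
dx = L / (n - 1)
dt = 0.4 * dx * dx / alpha  # satisfies the stability limit dt <= dx^2/(2*alpha)
T = [20.0] * n              # uniform initial temperature, degC
T_bath = 90.0               # sous-vide bath temperature, degC

elapsed = 0.0
while elapsed < 60.0:       # simulate the first minute of heating
    T[0] = T[-1] = T_bath   # surfaces held at the bath temperature
    T = [T[i] + alpha * dt / dx ** 2 * (T[i + 1] - 2 * T[i] + T[i - 1])
         if 0 < i < n - 1 else T[i]
         for i in range(n)]
    elapsed += dt

center = T[n // 2]          # slowest-heating point; governs pasteurization
```

The same update, extended to three dimensions and to the four package geometries, yields the temperature distributions the paper uses to estimate the degree of pasteurization.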

  15. Development and validation of a simple thin-layer chromatographic method for the analysis of p-chlorophenol in treated wastewater

    Directory of Open Access Journals (Sweden)

    Tešić Živoslav


    Full Text Available A thin-layer chromatographic method with densitometric detection was established for the quantification of p-chlorophenol in wastewater. The degradation efficiency of p-chlorophenol was monitored after each treatment of the wastewater samples. Degradation of p-chlorophenol was performed with advanced oxidation processes (AOPs) using UV, H2O2/UV, O3/H2O2/UV, O3 and O3/UV. The developed TLC procedure was found to be simple, rapid and precise. The method was characterized by high sensitivity (limit of detection 11 ng per band and limit of quantification 35 ng per band), a linear range from 75 to 500 ng per band (r = 0.9965), and high precision, accuracy (mean percentage recovery 98.6%) and specificity. Additionally, the efficiency of degradation was monitored using HPLC, giving results comparable with the RP-TLC measurements. [Acknowledgement. This work was performed within the framework of the research project No. 172017 supported by the Ministry of Education and Science of Serbia.]
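Detection and quantification limits like those quoted are commonly derived from a calibration line via the ICH 3.3σ/S and 10σ/S convention (residual standard deviation σ over slope S). A sketch assuming that convention, with made-up calibration points spanning the method's 75-500 ng per band range:

```python
# LOD/LOQ from a linear calibration using the ICH 3.3*sigma/S and 10*sigma/S
# convention. The calibration data are invented for illustration.
import statistics

amounts = [75, 150, 250, 350, 500]          # ng per band (nominal)
signals = [1510, 3050, 5010, 7020, 9990]    # densitometric response (made up)

n = len(amounts)
mean_x = statistics.mean(amounts)
mean_y = statistics.mean(signals)
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(amounts, signals))
         / sum((x - mean_x) ** 2 for x in amounts))
intercept = mean_y - slope * mean_x
residuals = [y - (slope * x + intercept) for x, y in zip(amounts, signals)]
sigma = (sum(r * r for r in residuals) / (n - 2)) ** 0.5  # residual SD

lod = 3.3 * sigma / slope    # ng per band
loq = 10.0 * sigma / slope   # ng per band
```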

  16. Development of a sensitive and reliable high performance liquid chromatography method with fluorescence detection for high-throughput analysis of multi-class mycotoxins in Coix seed. (United States)

    Kong, Wei-Jun; Li, Jun-Yuan; Qiu, Feng; Wei, Jian-He; Xiao, Xiao-He; Zheng, Yuguo; Yang, Mei-Hua


    As an edible and medicinal plant, Coix seed is readily contaminated by more than one group of mycotoxins resulting in potential risk to human health. A reliable and sensitive method has been developed to determine seven mycotoxins (aflatoxins B1, B2, G1, G2, zearalenone, α-zearalenol, and β-zearalenol) simultaneously in 10 batches of Coix seed marketed in China. The method is based on a rapid ultrasound-assisted solid-liquid extraction (USLE) using methanol/water (80/20) followed by immunoaffinity column (IAC) clean-up, on-line photochemical derivatization (PCD), and high performance liquid chromatography coupled with fluorescence detection (HPLC-FLD). Careful optimization of extraction, clean-up, separation and detection conditions was accomplished to increase sample throughput and to attain rapid separation and sensitive detection. Method validation was performed by analyzing samples spiked at three different concentrations for the seven mycotoxins. Recoveries were from 73.5% to 107.3%, with relative standard deviations (RSDs) lower than 7.7%. The intra- and inter-day precisions, expressed as RSDs, were lower than 4% for all studied analytes. Limits of detection and quantification ranged from 0.01 to 50.2 μg kg(-1), and from 0.04 to 125.5 μg kg(-1), respectively, which were below the tolerance levels for mycotoxins set by the European Union. Samples that tested positive were further analyzed by HPLC tandem electrospray ionization mass spectrometry for confirmatory purposes. This is the first application of USLE-IAC-HPLC-PCD-FLD for detecting the occurrence of multi-class mycotoxins in Coix seed.
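The validation figures quoted (recoveries of 73.5-107.3% with RSDs below 7.7%) come from simple spike-recovery arithmetic. A sketch with illustrative replicate values, not the study's data:

```python
# Spike-recovery validation step: recovery (%) is the amount found over the
# amount spiked; precision is the RSD of replicates. Numbers are invented.
import statistics

def recovery_pct(found, spiked):
    return 100.0 * found / spiked

def rsd_pct(values):
    # Relative standard deviation (%) of replicate determinations.
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

replicates = [18.4, 18.9, 18.1]    # found amounts for a 20.0 ug/kg spike
rec = recovery_pct(statistics.mean(replicates), 20.0)
rsd = rsd_pct(replicates)
```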

  17. Method development for the determination of bromine in coal using high-resolution continuum source graphite furnace molecular absorption spectrometry and direct solid sample analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pereira, Éderson R.; Castilho, Ivan N.B. [Departamento de Química, Universidade Federal de Santa Catarina, 88040-900 Florianópolis, SC (Brazil); Welz, Bernhard, E-mail: [Departamento de Química, Universidade Federal de Santa Catarina, 88040-900 Florianópolis, SC (Brazil); Instituto Nacional de Ciência e Tecnologia do CNPq, INCT de Energia e Ambiente, Universidade Federal da Bahia, 40170-115 Salvador, BA (Brazil); Gois, Jefferson S. [Departamento de Química, Universidade Federal de Santa Catarina, 88040-900 Florianópolis, SC (Brazil); Borges, Daniel L.G. [Departamento de Química, Universidade Federal de Santa Catarina, 88040-900 Florianópolis, SC (Brazil); Instituto Nacional de Ciência e Tecnologia do CNPq, INCT de Energia e Ambiente, Universidade Federal da Bahia, 40170-115 Salvador, BA (Brazil); Carasek, Eduardo [Departamento de Química, Universidade Federal de Santa Catarina, 88040-900 Florianópolis, SC (Brazil); Andrade, Jailson B. de [Instituto Nacional de Ciência e Tecnologia do CNPq, INCT de Energia e Ambiente, Universidade Federal da Bahia, 40170-115 Salvador, BA (Brazil)


    This work reports a simple approach for Br determination in coal using direct solid sample analysis in a graphite tube furnace and high-resolution continuum source molecular absorption spectrometry. The molecular absorbance of the calcium mono-bromide (CaBr) molecule has been measured using the rotational line at 625.315 nm. Different chemical modifiers (zirconium, ruthenium, palladium and a mixture of palladium and magnesium nitrates) have been evaluated in order to increase the sensitivity of the CaBr absorption, and Zr showed the best overall performance. The pyrolysis and vaporization temperatures were 800 °C and 2200 °C, respectively. Accuracy and precision of the method have been evaluated using certified coal reference materials (BCR 181, BCR 182, NIST 1630a, and NIST 1632b) with good agreement (between 98 and 103%) with the informed values for Br. The detection limit was around 4 ng Br, which corresponds to about 1.5 μg g{sup −1} Br in coal, based on a sample mass of 3 mg. In addition, the results were in agreement with those obtained using electrothermal vaporization inductively coupled plasma mass spectrometry, based on a Student t-test at a 95% confidence level. A mechanism for the formation of the CaBr molecule is proposed, which might be considered for other diatomic molecules as well. - Highlights: • Bromine has been determined in coal using direct solid sample analysis. • Calibration has been carried out against aqueous standard solutions. • The coal samples and the molecule-forming reagent have been separated in order to avoid interferences. • The results make it possible to draw conclusions about the mechanisms of molecule formation.
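The two detection limits quoted are linked by the sample mass: an absolute LOD of about 4 ng Br in a 3 mg solid sample corresponds to 4/3 ≈ 1.3 μg g⁻¹, consistent with the ~1.5 μg g⁻¹ stated. As arithmetic:

```python
# Convert the absolute instrumental LOD to a concentration in the coal,
# using the figures from the abstract (4 ng Br, 3 mg sample mass).
lod_ng = 4.0
sample_mg = 3.0
lod_ug_per_g = lod_ng / sample_mg   # ng/mg is numerically equal to ug/g
```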

  18. Developing a TQM quality management method model

    NARCIS (Netherlands)

    Zhang, Zhihai


    From an extensive review of total quality management literature, the external and internal environment affecting an organization's quality performance and the eleven primary elements of TQM are identified. Based on the primary TQM elements, a TQM quality management method model is developed. This mo

  19. Automating Object-Oriented Software Development Methods

    NARCIS (Netherlands)

    Tekinerdogan, Bedir; Saeki, Motoshi; Sunyé, Gerson; Broek, van den Pim; Hruby, Pavel


    Current software projects generally have to deal with producing and managing large and complex software products. It is generally believed that applying software development methods is useful in coping with this complexity and in supporting quality. As such, numerous object-oriented software devel

  20. Automating Object-Oriented Software Development Methods

    NARCIS (Netherlands)

    Tekinerdogan, Bedir; Saeki, Motoshi; Sunyé, Gerson; Broek, van den Pim; Hruby, Pavel; Frohner, A´ kos


    Current software projects generally have to deal with producing and managing large and complex software products. It is generally believed that applying software development methods is useful in coping with this complexity and in supporting quality. As such, numerous object-oriented software develo

  1. Usability Evaluation Method for Agile Software Development

    Directory of Open Access Journals (Sweden)

    Saad Masood Butt


    Full Text Available Agile methods are the best fit for the tremendously growing software industry due to their flexible and dynamic nature. But does software developed using agile methods meet usability standards? To answer this question, we can see that the majority of agile software development projects currently involve interactive user interface designs, which are only possible by following User Centered Design (UCD) in agile methods. The question here is how to integrate UCD with agile models. Both agile models and UCD are iterative in nature, but agile models focus on coding and development of the software, whereas UCD focuses on the user interface of the software. Similarly, both have testing features: the agile model involves automatically tested code, while UCD involves an expert or a user testing the user interface. In this paper, a new agile usability model is proposed and an evaluation of the proposed model is presented by practically implementing it in three real-life projects. Key results from these projects clearly show that the proposed agile model incorporates usability evaluation methods and improves the collaboration between usability experts and agile software experts; in addition, it allows agile developers to incorporate the results from UCD into subsequent iterations.

  2. Benchmarking Learning and Teaching: Developing a Method (United States)

    Henderson-Smart, Cheryl; Winning, Tracey; Gerzina, Tania; King, Shalinie; Hyde, Sarah


    Purpose: To develop a method for benchmarking teaching and learning in response to an institutional need to validate a new program in Dentistry at the University of Sydney, Australia. Design/methodology/approach: After a collaborative partner, University of Adelaide, was identified, the areas of teaching and learning to be benchmarked, PBL…

  3. Testing and Further Development of Improved Etches and Etching Methods for the Analysis of Bridgman Grown Semiconductor Crystals with an Emphasis on Lead-Tin-Telluride (United States)

    Barber, Patrick G.


    The goals outlined for the research project for this year have been completed, and the following supporting documentation is attached: 1. A copy of the proposal outlining the principal goals: (a) Improved the characterization of semiconductor crystals through new etches and etching procedures. (b) Developed a novel voltammetric method to characterize semiconductor crystals as a result of searching for improved etches for lead-tin-telluride. (c) Presented a paper at ACCG-10. (d) Prepared manuscripts for publication; completed additional testing suggested by reviewers and re-submitted the manuscripts. (e) Worked with an undergraduate student on this project to provide her an opportunity to have a significant research experience prior to graduation. 2. In addition to the anticipated goals, the following were also accomplished: (a) Submitted the newly developed procedures for consideration as a patent or a NASA Tech Brief. (b) Submitted a paper for presentation at the forthcoming ICCG-12 conference. 3. A copy of the final draft of the publication as submitted to the editors of the Journal of Crystal Growth.

  4. Rapid Method Development in Hydrophilic Interaction Liquid Chromatography for Pharmaceutical Analysis Using a Combination of Quantitative Structure-Retention Relationships and Design of Experiments. (United States)

    Taraji, Maryam; Haddad, Paul R; Amos, Ruth I J; Talebi, Mohammad; Szucs, Roman; Dolan, John W; Pohl, Chris A


    A design-of-experiment (DoE) model was developed, able to describe the retention times of a mixture of pharmaceutical compounds in hydrophilic interaction liquid chromatography (HILIC) under all possible combinations of acetonitrile content, salt concentration, and mobile-phase pH with R(2) > 0.95. Further, a quantitative structure-retention relationship (QSRR) model was developed to predict retention times for new analytes, based only on their chemical structures, with a root-mean-square error of prediction (RMSEP) as low as 0.81%. A compound classification based on the concept of similarity was applied prior to QSRR modeling. Finally, we utilized a combined QSRR-DoE approach to propose an optimal design space in a quality-by-design (QbD) workflow to facilitate the HILIC method development. The mathematical QSRR-DoE model was shown to be highly predictive when applied to an independent test set of unseen compounds in unseen conditions with a RMSEP value of 5.83%. The QSRR-DoE computed retention time of pharmaceutical test analytes and subsequently calculated separation selectivity was used to optimize the chromatographic conditions for efficient separation of targets. A Monte Carlo simulation was performed to evaluate the risk of uncertainty in the model's prediction, and to define the design space where the desired quality criterion was met. Experimental realization of peak selectivity between targets under the selected optimal working conditions confirmed the theoretical predictions. These results demonstrate how discovery of optimal conditions for the separation of new analytes can be accelerated by the use of appropriate theoretical tools.
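The headline RMSEP figures above are the usual root-mean-square error of prediction over a held-out test set. A minimal sketch with invented retention times (not the study's data):

```python
import math

def rmsep(observed, predicted):
    """Root-mean-square error of prediction over a test set."""
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(observed, predicted))
                     / len(observed))

# Illustrative retention times in minutes, not the paper's values.
obs  = [4.2, 5.1, 6.8, 7.9]
pred = [4.3, 5.0, 6.6, 8.1]
err = rmsep(obs, pred)
```

In the paper the statistic is reported as a percentage of the retention time rather than in absolute units, but the underlying computation is the same.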

  5. Development of a Chemoenzymatic-like and Photoswitchable Method for the High-Throughput creation of Protein Microarrays. Application to the Analysis of the Protein/Protein Interactions Involved in the YOP Virulon from Yersinia pestis.

    Energy Technology Data Exchange (ETDEWEB)

    Camarero, J A


    Protein arrays are ideal tools for the rapid analysis of whole proteomes as well as for the development of reliable and cheap biosensors. The objective of this proposal is to develop a new ligand-assisted ligation method based on the naturally occurring protein trans-splicing process. This method has been used for the generation of spatially addressable arrays of multiple protein components by standard micro-lithographic techniques. Key to our approach is the use of the protein trans-splicing process. This naturally occurring process allows the development of a truly generic and highly efficient method for the covalent attachment of proteins through their C-terminus to any solid support. This technology has been used for the creation of protein chips containing several virulence factors from the human pathogen Y. pestis.

  6. The development of spectrophotometric and electroanalytical methods for ascorbic acid and acetaminophen and their applications in the analysis of effervescent dosage forms. (United States)

    Săndulescu, R; Mirel, S; Oprean, R


    The electroanalytical study of ascorbic acid, acetaminophen and several mixtures of these compounds in different ratios was performed using a carbon paste electrode (CPE, graphite:solid paraffin 2:1) as the working electrode and an Ag/AgCl reference electrode. The potential curves were recorded using different concentrations of ascorbic acid and acetaminophen by measuring samples between 10 and 50 microl. The oxidation reactions were studied in a potential range from -0.1 to +1.3 V with different sweep rates, at different current sensitivities, in stationary working conditions and with stirring before each replicate. The oxidation of ascorbic acid occurs at +0.31 +/- 0.02 V and the oxidation of acetaminophen at +0.60 +/- 0.05 V; the current varies linearly over the following concentration ranges: 10(-3)-10(-2) M for ascorbic acid and 3 x 10(-6)-7.5 x 10(-3) M for acetaminophen (r2 = 0.999 for both ascorbic acid and acetaminophen). The mixtures of ascorbic acid and acetaminophen were prepared as follows: 1:1, 1:2, 1:3, 2:1, and 3:1. The studies revealed the alteration of the voltammograms, processed according to the validation methodology. The best potential variation range for different current sensitivities and the influence of the sweep rate, the solvent volume and the pH were studied. The mutual interferences of the compounds in the mixtures and of the electroactive compounds in the pharmaceutical dosage forms, especially effervescent ones, were also part of the research. The same mixtures were studied using the direct spectrophotometric method, which revealed many spectral interferences. In order to solve this problem, an appropriate separation or an indirect spectrophotometric method (the apparent content curves method) was used. The spectrophotometric and voltammetric methods developed were used to determine ascorbic acid and acetaminophen in different dosage forms (vials, tablets, suppositories and effervescent dosage forms). The results

  7. Probabilistic Analysis Methods for Hybrid Ventilation

    DEFF Research Database (Denmark)

    Brohus, Henrik; Frier, Christian; Heiselberg, Per

    This paper discusses a general approach for the application of probabilistic analysis methods in the design of ventilation systems. The aims and scope of probabilistic versus deterministic methods are addressed with special emphasis on hybrid ventilation systems. A preliminary application...

  8. The Functional Methods of Discourse Analysis

    Institute of Scientific and Technical Information of China (English)



    From the macroscopic angle of function, methods of discourse analysis are classified in order to identify two important methods in pragmatics, through which discourse can be better understood.

  9. Exploration of Enterprise Development Strategy Analysis Method Based on Enterprise Value System Methodology

    Institute of Scientific and Technical Information of China (English)



    This paper first reviews value engineering theory and the historical development and theoretical system of enterprise value system methodology. It then proposes an enterprise development strategy analysis method based on enterprise value system methodology, which applies PEST analysis, Porter's five forces model, SWOT analysis and AHP for environmental analysis, uses RWFJ analysis to analyze the coupling relationships of key elements with affairs and environment, and applies Boston matrix analysis to formulate a business portfolio development strategy. Finally, conclusions and precautions are given.

  10. Prioritizing pesticide compounds for analytical methods development (United States)

    Norman, Julia E.; Kuivila, Kathryn M.; Nowell, Lisa H.


    The U.S. Geological Survey (USGS) has a periodic need to re-evaluate pesticide compounds in terms of priorities for inclusion in monitoring and studies and, thus, must also assess the current analytical capabilities for pesticide detection. To meet this need, a strategy has been developed to prioritize pesticides and degradates for analytical methods development. Screening procedures were developed to separately prioritize pesticide compounds in water and sediment. The procedures evaluate pesticide compounds in existing USGS analytical methods for water and sediment and compounds for which recent agricultural-use information was available. Measured occurrence (detection frequency and concentrations) in water and sediment, predicted concentrations in water and predicted likelihood of occurrence in sediment, potential toxicity to aquatic life or humans, and priorities of other agencies or organizations, regulatory or otherwise, were considered. Several existing strategies for prioritizing chemicals for various purposes were reviewed, including those that identify and prioritize persistent, bioaccumulative, and toxic compounds, and those that determine candidates for future regulation of drinking-water contaminants. The systematic procedures developed and used in this study rely on concepts common to many previously established strategies. The evaluation of pesticide compounds resulted in the classification of compounds into three groups: Tier 1 for high-priority compounds, Tier 2 for moderate-priority compounds, and Tier 3 for low-priority compounds. For water, a total of 247 pesticide compounds were classified as Tier 1 and, thus, are high priority for inclusion in analytical methods for monitoring and studies. Of these, about three-quarters are included in some USGS analytical method; however, many of these compounds are included in research methods that are expensive and for which there are few data on environmental samples. The remaining quarter of Tier 1
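A tiered screening like the one described can be sketched as a simple scoring rule. The factor names and cutoffs below are hypothetical stand-ins for illustration, not the USGS's actual criteria:

```python
def assign_tier(detection_freq, toxicity_exceeded, agency_priority):
    """Assign a monitoring-priority tier from simple screening flags.
    Cutoffs and factors are illustrative, not the published procedure."""
    score = 0
    if detection_freq > 0.05:   # detected in >5% of samples (hypothetical cutoff)
        score += 1
    if toxicity_exceeded:       # measured/predicted concentration exceeds a benchmark
        score += 1
    if agency_priority:         # listed as a priority by another agency
        score += 1
    if score >= 2:
        return 1                # Tier 1: high priority
    if score >= 1:
        return 2                # Tier 2: moderate priority
    return 3                    # Tier 3: low priority
```

A compound flagged on two or more factors lands in Tier 1, on one factor in Tier 2, otherwise Tier 3.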

  11. Simultaneous quantitation of 14 active components in Yinchenhao decoction by using ultra high performance liquid chromatography with diode array detection: Method development and ingredient analysis of different commonly prepared samples. (United States)

    Yi, YaXiong; Zhang, Yong; Ding, Yue; Lu, Lu; Zhang, Tong; Zhao, Yuan; Xu, XiaoJun; Zhang, YuXin


    We developed a novel quantitative analysis method based on ultra high performance liquid chromatography coupled with diode array detection for the simultaneous determination of the 14 main active components in Yinchenhao decoction. All components were separated on an Agilent SB-C18 column by using a gradient solvent system of acetonitrile/0.1% phosphoric acid solution at a flow rate of 0.4 mL/min for 35 min. Subsequently, linearity, precision, repeatability, and accuracy tests were implemented to validate the method. Furthermore, the method was applied to a compositional difference analysis of the 14 components in eight normally extracted Yinchenhao decoction samples, accompanied by hierarchical clustering analysis and similarity analysis. All samples fell into three groups based on component content, demonstrating that the extraction method (decocting, refluxing, or ultrasonication) and the extraction solvent (water or ethanol) affect component composition and should be considered in clinical application. The results also indicated that a sample prepared by patients at home by water extraction in a casserole was almost the same as one prepared in a stainless-steel kettle, as mostly used in pharmaceutical factories. This research should help patients select the best and most convenient method for preparing Yinchenhao decoction.
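The similarity analysis used to group samples by component content can be illustrated with cosine similarity over component-content vectors. The vectors below are made-up illustrations, not the paper's measured values:

```python
import math

def cosine_similarity(a, b):
    # similarity between two samples' component-content vectors
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# hypothetical content vectors (e.g. mg/g) for 3 of the 14 components
water_casserole = [1.2, 0.8, 3.1]
water_kettle    = [1.1, 0.9, 3.0]
ethanol_reflux  = [2.5, 0.1, 1.0]

sim_water   = cosine_similarity(water_casserole, water_kettle)
sim_solvent = cosine_similarity(water_casserole, ethanol_reflux)
```

Two water extractions score nearly 1.0 and cluster together, while the ethanol extraction scores markedly lower, mirroring the grouping reported above.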

  12. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture (United States)

    West, Phillip B.; Novascone, Stephen R.; Wright, Jerry P.


    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  13. Transport Test Problems for Hybrid Methods Development

    Energy Technology Data Exchange (ETDEWEB)

    Shaver, Mark W.; Miller, Erin A.; Wittman, Richard S.; McDonald, Benjamin S.


    This report presents 9 test problems to guide testing and development of hybrid calculations for the ADVANTG code at ORNL. These test cases can be used for comparing different types of radiation transport calculations, as well as for guiding the development of variance reduction methods. Cases are drawn primarily from existing or previous calculations with a preference for cases which include experimental data, or otherwise have results with a high level of confidence, are non-sensitive, and represent problem sets of interest to NA-22.

  14. 共形天线分析综合方法研究进展%Development of the Method of Analysis and Synthesis for Conformal Antenna

    Institute of Scientific and Technical Information of China (English)

    刘元柱; 肖绍球; 唐明春; 王秉中


    共形天线阵列技术是天线技术发展的重要方向,在军民用雷达与通信系统中具有广阔的应用前景,其分析与综合问题是天线领域研究的热点与难点课题。本文对共形天线分析综合方法的发展进行了详细总结,并展望了未来共形天线研究的发展趋势。%The technology of conformal antenna array is of significant importance in antenna investi- gation, and has wide potential application in military/civil radars and communication systems, while its a- nalysis and synthesis are witnessing difficulties in such hot research. In this paper, the methods of analy- sis and synthesis for conformal antenna are detailed summarized, and the outlook of developing trend of conformal antenna in the future is also forecasted.

  15. Analysis of Vibration Diagnostics Methods for Induction Motors

    Directory of Open Access Journals (Sweden)

    A. Kalinov


    Full Text Available The paper presents an analysis of existing vibration diagnostics methods. To evaluate the efficiency of applying each method, the following criteria are proposed: the volume of input data required for establishing a diagnosis, data content, software and hardware level, and execution time for vibration diagnostics. According to these criteria, the paper classifies vibration diagnostics methods, determines their advantages and disadvantages, and identifies directions for their development and improvement. The paper contains a comparative estimation of the methods in accordance with the proposed criteria; according to this estimation, the most efficient methods are spectral analysis and spectral analysis of the vibration signal envelope.
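The envelope-spectrum technique singled out above can be sketched with an FFT-based Hilbert transform: take the instantaneous amplitude of the analytic signal, then the spectrum of that envelope. The synthetic bearing-type signal (1 kHz carrier amplitude-modulated at 30 Hz) is an illustrative example:

```python
import numpy as np

def envelope_spectrum(x, fs):
    """Spectrum of the vibration signal envelope, computed via an
    FFT-based Hilbert transform (analytic signal)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)                 # analytic-signal frequency weights
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    env = np.abs(np.fft.ifft(X * h))   # instantaneous amplitude (envelope)
    env = env - env.mean()             # drop the DC component
    spec = np.abs(np.fft.rfft(env)) / n
    freqs = np.fft.rfftfreq(n, 1.0 / fs)
    return freqs, spec

# synthetic defect signature: 1 kHz carrier modulated at 30 Hz
fs = 8192
t = np.arange(fs) / fs
x = (1.0 + 0.5 * np.sin(2 * np.pi * 30 * t)) * np.sin(2 * np.pi * 1000 * t)
freqs, spec = envelope_spectrum(x, fs)
modulation = freqs[1:][np.argmax(spec[1:])]   # dominant envelope frequency
```

The envelope spectrum peaks at the 30 Hz modulation frequency even though the raw spectrum is dominated by the 1 kHz carrier, which is why the method is favored for fault diagnostics.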

  16. The Development of Cluster and Histogram Methods (United States)

    Swendsen, Robert H.


    This talk will review the history of both cluster and histogram methods for Monte Carlo simulations. Cluster methods are based on the famous exact mapping by Fortuin and Kasteleyn from general Potts models onto a percolation representation. I will discuss the Swendsen-Wang algorithm, as well as its improvement and extension to more general spin models by Wolff. The Replica Monte Carlo method further extended cluster simulations to deal with frustrated systems. The history of histograms is quite extensive, and can only be summarized briefly in this talk. It goes back at least to work by Salsburg et al. in 1959. Since then, it has been forgotten and rediscovered several times. The modern use of the method has exploited its ability to efficiently determine the location and height of peaks in various quantities, which is of prime importance in the analysis of critical phenomena. The extensions of this approach to the multiple histogram method and multicanonical ensembles have allowed information to be obtained over a broad range of parameters. Histogram simulations and analyses have become standard techniques in Monte Carlo simulations.
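The core histogram-reweighting idea, estimating averages at a nearby temperature from samples collected at a single temperature, can be sketched as follows. The two-level test system is an illustrative check, not an example from the talk:

```python
import numpy as np

def reweight_mean(obs, energy, beta0, beta):
    """<O> at inverse temperature beta, from samples taken at beta0:
    w_i = exp(-(beta - beta0) * E_i), shifted by mean(E) for stability."""
    w = np.exp(-(beta - beta0) * (energy - energy.mean()))
    return float(np.sum(obs * w) / np.sum(w))

# check on a two-level system (E = 0 or 1): exact <E> = 1 / (1 + e^beta)
rng = np.random.default_rng(0)
beta0 = 1.0
p1 = 1.0 / (1.0 + np.exp(beta0))               # P(E = 1) at beta0
energy = (rng.random(200_000) < p1).astype(float)
est = reweight_mean(energy, energy, beta0, 0.5)  # reweight to beta = 0.5
exact = 1.0 / (1.0 + np.exp(0.5))
```

The samples drawn at beta0 = 1 reproduce the exact mean energy at beta = 0.5 to within sampling error, which is the property the multiple-histogram and multicanonical extensions build on.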

  17. Development of a method for holistic energy renovation

    DEFF Research Database (Denmark)

    Morelli, Martin

    . Measurements of temperature and relative humidity showed that conditions for mould growth were present. However, no signs of mould growth were documented at dismantling of the interior insulation. A method was developed for the design of energy saving measures based on both Failure Mode and Effect Analysis...

  18. Automatization for development of HPLC methods. (United States)

    Pfeffer, M; Windt, H


    Within the frame of inprocess analytics of the synthesis of pharmaceutical drugs a lot of HPLC methods are required for checking the quality of intermediates and drug substances. The methods have to be developed in terms of optimal selectivity and low limit of detection, minimum running time and chromatographic robustness. The goal was to shorten the method development process. Therefore, the screening of stationary phases was automated by means of switching modules equipped with 12 HPLC columns. Mobile phase and temperature could be optimized by using Drylab after evaluating chromatograms of gradient elutions performed automatically. The column switching module was applied for more than three dozens of substances, e.g. steroidal intermediates. Resolution (especially of isomeres), peak shape and number of peaks turned out to be the criteria for selection of the appropriate stationary phase. On the basis of the "best" column the composition of the "best" eluent was usually defined rapidly and with less effort. This approach leads to savings in manpower by more than one third. Overnight, impurity profiles of the intermediates were obtained yielding robust HPLC methods with high selectivity and minimized elution time.

  19. Probabilistic structural analysis by extremum methods (United States)

    Nafday, Avinash M.


    The objective is to demonstrate discrete extremum methods of structural analysis as a tool for structural system reliability evaluation. Specifically, linear and multiobjective linear programming models for analysis of rigid plastic frames under proportional and multiparametric loadings, respectively, are considered. Kinematic and static approaches for analysis form a primal-dual pair in each of these models and have a polyhedral format. Duality relations link extreme points and hyperplanes of these polyhedra and lead naturally to dual methods for system reliability evaluation.
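The kinematic side of the primal-dual pair can be illustrated with the classic rigid-plastic portal-frame collapse calculation: the collapse load factor is the minimum, over candidate mechanisms, of internal plastic work over external work. The frame geometry, loads and plastic moment below are hypothetical numbers, not from the paper:

```python
def collapse_load_factor(mechanisms):
    # kinematic (upper-bound) theorem: the collapse load factor is the
    # minimum over candidate mechanisms of internal plastic work rate
    # (Mp * sum of hinge rotations) over external work rate of the loads
    return min(internal / external for internal, external in mechanisms)

# hypothetical portal frame: span L, height h, plastic moment Mp,
# vertical load V at midspan and horizontal load H at beam level
Mp, L, h, V, H = 100.0, 4.0, 3.0, 100.0, 50.0
mechanisms = [
    (4 * Mp, V * L / 2),           # beam mechanism
    (4 * Mp, H * h),               # sway mechanism
    (6 * Mp, V * L / 2 + H * h),   # combined mechanism
]
lam = collapse_load_factor(mechanisms)
```

Here the combined mechanism governs (lam = 600/350, about 1.71). Casting the same minimization over all mechanisms as a linear program gives the primal form whose dual is the static (equilibrium) approach.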

  20. Recent Developments in the Methods of Estimating Shooting Distance

    Directory of Open Access Journals (Sweden)

    Arie Zeichner


    Full Text Available A review of developments during the past 10 years in the methods of estimating shooting distance is provided. This review discusses the examination of clothing targets, cadavers, and exhibits that cannot be processed in the laboratory. The methods include visual/microscopic examinations, color tests, and instrumental analysis of the gunshot residue deposits around the bullet entrance holes. The review does not cover shooting distance estimation from shotguns that fired pellet loads.

  1. Sensitivity Analysis Using Simple Additive Weighting Method

    Directory of Open Access Journals (Sweden)

    Wayne S. Goodridge


    Full Text Available The output of a multiple criteria decision method often has to be analyzed using some sensitivity analysis technique. The SAW MCDM method is commonly used in management sciences, and there is a critical need for a robust approach to sensitivity analysis given that uncertain data are often present in decision models. Most of the sensitivity analysis techniques for the SAW method involve Monte Carlo simulation on the initial data. These methods are computationally intensive and often require complex software. In this paper, the SAW method is extended to include an objective function which makes it easy to analyze the influence of specific changes in certain criteria values, thus making it easy to perform sensitivity analysis.
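The underlying SAW scoring step (before any sensitivity analysis) can be sketched as a normalize-then-weighted-sum procedure. The decision matrix, weights and benefit/cost labels below are illustrative assumptions, not the paper's data:

```python
def saw_scores(matrix, weights, benefit):
    """Simple additive weighting: normalize each criterion column, then
    take the weighted sum for every alternative (row)."""
    cols = list(zip(*matrix))
    norm = []
    for j, col in enumerate(cols):
        if benefit[j]:
            best = max(col)
            norm.append([v / best for v in col])   # larger is better
        else:
            best = min(col)
            norm.append([best / v for v in col])   # smaller is better
    return [sum(w * norm[j][i] for j, w in enumerate(weights))
            for i in range(len(matrix))]

# hypothetical alternatives scored on quality (benefit) and cost
matrix = [[7, 300], [9, 500], [8, 250]]
scores = saw_scores(matrix, [0.6, 0.4], benefit=[True, False])
```

Sensitivity analysis then amounts to perturbing a weight or a criterion value and re-running `saw_scores` to see whether the ranking changes.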

  2. Development of an Aerodynamic Analysis Method and Database for the SLS Service Module Panel Jettison Event Utilizing Inviscid CFD and MATLAB (United States)

    Applebaum, Michael P.; Hall, Leslie, H.; Eppard, William M.; Purinton, David C.; Campbell, John R.; Blevins, John A.


    This paper describes the development, testing, and utilization of an aerodynamic force and moment database for the Space Launch System (SLS) Service Module (SM) panel jettison event. The database is a combination of inviscid Computational Fluid Dynamic (CFD) data and MATLAB code written to query the data at input values of vehicle/SM panel parameters and return the aerodynamic force and moment coefficients of the panels as they are jettisoned from the vehicle. The database encompasses over 5000 CFD simulations with the panels either in the initial stages of separation where they are hinged to the vehicle, in close proximity to the vehicle, or far enough from the vehicle that body interference effects are neglected. A series of viscous CFD check cases were performed to assess the accuracy of the Euler solutions for this class of problem and good agreement was obtained. The ultimate goal of the panel jettison database was to create a tool that could be coupled with any 6-Degree-Of-Freedom (DOF) dynamics model to rapidly predict SM panel separation from the SLS vehicle in a quasi-unsteady manner. Results are presented for panel jettison simulations that utilize the database at various SLS flight conditions. These results compare favorably to an approach that directly couples a 6-DOF model with the Cart3D Euler flow solver and obtains solutions for the panels at exact locations. This paper demonstrates a method of using inviscid CFD simulations coupled with a 6-DOF model that provides adequate fidelity to capture the physics of this complex multiple moving-body panel separation event.
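The database query step, returning coefficients at input parameter values that fall between the stored CFD points, can be sketched as bilinear interpolation on a two-parameter grid. The parameter names and table values below are placeholders, not the SLS data:

```python
import bisect

def bilinear(xs, ys, table, x, y):
    """Interpolate table[i][j] (the value at xs[i], ys[j]) at point (x, y);
    queries outside the grid are clamped to the edge cells."""
    i = max(0, min(bisect.bisect_right(xs, x) - 1, len(xs) - 2))
    j = max(0, min(bisect.bisect_right(ys, y) - 1, len(ys) - 2))
    tx = (x - xs[i]) / (xs[i + 1] - xs[i])
    ty = (y - ys[j]) / (ys[j + 1] - ys[j])
    return ((1 - tx) * (1 - ty) * table[i][j]
            + tx * (1 - ty) * table[i + 1][j]
            + (1 - tx) * ty * table[i][j + 1]
            + tx * ty * table[i + 1][j + 1])

# placeholder grid: e.g. Mach number vs. panel distance from the vehicle
xs, ys = [0.0, 1.0, 2.0], [0.0, 1.0]
table = [[2 * x + 3 * y for y in ys] for x in xs]   # placeholder coefficients
cz = bilinear(xs, ys, table, 0.5, 0.5)
```

A 6-DOF integrator can call such a lookup each time step to get panel force and moment coefficients without re-running CFD, which is the quasi-unsteady coupling described above.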

  3. Text analysis methods, text analysis apparatuses, and articles of manufacture (United States)

    Whitney, Paul D; Willse, Alan R; Lopresti, Charles A; White, Amanda M


    Text analysis methods, text analysis apparatuses, and articles of manufacture are described according to some aspects. In one aspect, a text analysis method includes accessing information indicative of data content of a collection of text comprising a plurality of different topics, using a computing device, analyzing the information indicative of the data content, and using results of the analysis, identifying a presence of a new topic in the collection of text.

  4. Behaviour change techniques: the development and evaluation of a taxonomic method for reporting and describing behaviour change interventions (a suite of five studies involving consensus methods, randomised controlled trials and analysis of qualitative data). (United States)

    Michie, Susan; Wood, Caroline E; Johnston, Marie; Abraham, Charles; Francis, Jill J; Hardeman, Wendy


    BACKGROUND Meeting global health challenges requires effective behaviour change interventions (BCIs). This depends on advancing the science of behaviour change which, in turn, depends on accurate intervention reporting. Current reporting often lacks detail, preventing accurate replication and implementation. Recent developments have specified intervention content into behaviour change techniques (BCTs) - the 'active ingredients', for example goal setting and self-monitoring of behaviour. BCTs are 'the smallest components compatible with retaining the postulated active ingredients, i.e. the proposed mechanisms of change. They can be used alone or in combination with other BCTs' (Michie S, Johnston M. Theories and techniques of behaviour change: developing a cumulative science of behaviour change. Health Psychol Rev 2012;6:1-6). Domain-specific taxonomies of BCTs have been developed, for example for healthy eating and physical activity, smoking cessation and alcohol consumption. We need to build on these to develop an internationally shared language for specifying and developing interventions. This technology can be used for synthesising evidence, implementing effective interventions and testing theory. It has enormous potential added value for science and global health. OBJECTIVE (1) To develop a method of specifying content of BCIs in terms of component BCTs; (2) to lay a foundation for a comprehensive methodology applicable to different types of complex interventions; (3) to develop resources to support application of the taxonomy; and (4) to achieve multidisciplinary and international acceptance for future development. DESIGN AND PARTICIPANTS Four hundred participants (systematic reviewers, researchers, practitioners, policy-makers) from 12 countries engaged in investigating, designing and/or delivering BCIs.
Development of the taxonomy involved a Delphi procedure, an iterative process of revisions and consultation with 41 international experts; hierarchical structure

  5. Analysis of Cryptocurrencies Price Development

    Directory of Open Access Journals (Sweden)

    Jan Lansky


    Full Text Available Cryptocurrencies are a type of digital currencies based on cryptography principles. Cryptocurrencies are a unique combination of three characteristics: they provide anonymity, they are independent of central authority and they provide protection from double spending attack. The aim of this paper is to capture trends in the area of significant cryptocurrencies price developments and to explain their causes. The current research in this area is exclusively limited to an analysis of the price developments of the most important Bitcoin cryptocurrency; our research is the first to focus on other cryptocurrencies too. The economic perspective on cryptocurrencies is based on IT knowledge regarding the principles of their functioning. We have created a database of prices of 1278 cryptocurrencies from 2013 to 2016. This database is publicly available. To analyse the data, SQL query language was used.

  6. Developing an interactive microsimulation method in pharmacology. (United States)

    Collins, Angela S; Graves, Barbara A; Gullette, Donna; Edwards, Rebecca


    Pharmacology decision making requires clinical judgment. The authors created an interactive microsimulation applying drug information to varying patient situations. The theory-based microsimulation requires situational analysis for each scenario. The microsimulation uses an interactive format that allows the participant to navigate through three separate virtual clients' situations. Correct clinical decisions are rewarded by sounds and by video footage of the patient improving. Conversely, incorrect choices show video footage of the patient decompensating. This microsimulation was developed to help students learn from the consequences of incorrect medication decision making in the virtual world without harming patients. The feedback of watching an incorrect decision play out helps students associate cause and effect on patient outcomes. The microsimulation reinforces the ease with which medication errors can occur and the extent of possible sequelae. The development process used to incorporate the technology in the nursing curriculum is discussed.

  7. Model correction factor method for system analysis

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager; Johannesen, Johannes M.


    The Model Correction Factor Method (MCFM) is an intelligent response surface method based on simplified modeling. MCFM is aimed at reliability analysis in the case of a limit state defined by an elaborate model. Herein it is demonstrated that the method is applicable for elaborate limit state surfaces on which...

  8. Matrix methods for bare resonator eigenvalue analysis. (United States)

    Latham, W P; Dente, G C


    Bare resonator eigenvalues have traditionally been calculated using Fox and Li iterative techniques or the Prony method presented by Siegman and Miller. A theoretical framework for bare resonator eigenvalue analysis is presented. Several new methods are given and compared with the Prony method.
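The Fox and Li iterative technique mentioned above amounts to a power iteration on the resonator's round-trip propagation operator, converging to the dominant transverse mode and its eigenvalue. A generic sketch, using a small stand-in matrix rather than a real discretized propagation kernel:

```python
def power_iteration(apply_op, v, iters=500):
    """Power iteration: repeatedly apply the round-trip operator and
    renormalize. The normalization factor converges to the magnitude of
    the dominant eigenvalue (the round-trip loss of the lowest-loss mode)."""
    lam = 0.0
    for _ in range(iters):
        w = apply_op(v)
        lam = max(abs(c) for c in w)     # max-norm normalization factor
        v = [c / lam for c in w]
    return lam, v

# stand-in for a discretized round-trip operator (2x2 matrix);
# a real resonator kernel would be a large complex matrix
M = [[0.9, 0.2], [0.1, 0.5]]
op = lambda v: [sum(M[i][k] * v[k] for k in range(2)) for i in range(2)]
lam, mode = power_iteration(op, [1.0, 1.0])
```

Prony-type methods improve on this by extracting several eigenvalues from the iteration history instead of only the dominant one.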

  9. Current situation and development of analysis methods of power system transient stability%电力系统暂态稳定分析方法的现状与发展

    Institute of Scientific and Technical Information of China (English)

    李晨; 蒋德珑; 程生安


    As power systems develop, interconnected power grids expand rapidly and the transient stability problem becomes increasingly serious; reliable transient stability analysis is one of the keys to the safe operation of power systems. This paper reviews the development history and current state of power system transient stability analysis by introducing the commonly used analysis methods. The features and applicability of the various methods are analyzed in detail, and the development prospects of power system transient stability analysis are discussed. It is pointed out that wavelet analysis has broad room for development in transient stability analysis, especially in processing transient signals, making it a valuable research direction.
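Why wavelets suit transient signal processing can be illustrated with a single-level Haar detail transform, which localizes an abrupt change in time, something a global Fourier spectrum cannot do directly. The test signal is synthetic:

```python
def haar_detail(x):
    # first-level Haar wavelet detail coefficients: a large magnitude
    # localizes an abrupt change between two adjacent samples
    return [(x[2 * i] - x[2 * i + 1]) / 2.0 for i in range(len(x) // 2)]

# synthetic transient: steady signal with a sudden jump between
# samples 100 and 101 (i.e. inside detail pair 50)
signal = [1.0] * 101 + [4.0] * 99
detail = haar_detail(signal)
onset = max(range(len(detail)), key=lambda i: abs(detail[i]))
```

The largest detail coefficient pinpoints the disturbance instant, which is the kind of time localization that makes wavelet analysis attractive for fault and transient detection.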

  10. Water Hammer Analysis by Characteristic Method

    Directory of Open Access Journals (Sweden)

    A. R. Lohrasbi


    Full Text Available Rapid changes in the velocity of fluid in closed conduits generate large pressures, which are transmitted through the system with the speed of sound. When the fluid medium is a liquid, the pressure surges and related phenomena are described as water hammer. Water hammer is caused by normal operation of the system, such as valve opening or closure, pump starts and stoppages, and by abnormal conditions, such as power failure. Problem statement: Water hammer causes additional pressure in water networks. This pressure may damage pipes and connections. The likely effects of water hammer must be taken into account in the structural design of pipelines and in the design of operating procedures for pumps, valves, etc. Approach: The physical phenomena of water hammer and the mathematical model which provides the basis for design computations are described. Most water hammer analysis involves computer solution by the method of characteristics. In this study, water hammer is modelled with this method, and the effects of valve opening and closure are surveyed with a program written for this purpose and with a numerical example. Results: The more rapid the closure of the valve, the more rapid the change in momentum and, hence, the greater the additional pressure developed. Conclusions/Recommendations: To prevent water hammer damage, it is recommended that valves be opened or closed slowly. Moreover, with the method of characteristics, entire pipe networks can be modelled and the effects of water hammer observed.
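The method-of-characteristics update at interior grid nodes, and the Joukowsky surge it reproduces for instantaneous valve closure, can be sketched as follows. The pipe data (wave speed, area, flow) are illustrative:

```python
def moc_step(H, Q, B, R):
    # method of characteristics, interior nodes:
    #   C+ : cp = H[i-1] + B*Q[i-1] - R*Q[i-1]*|Q[i-1]|
    #   C- : cm = H[i+1] - B*Q[i+1] + R*Q[i+1]*|Q[i+1]|
    #   H = (cp + cm) / 2 ,  Q = (cp - cm) / (2B)
    Hn, Qn = H[:], Q[:]
    for i in range(1, len(H) - 1):
        cp = H[i - 1] + B * Q[i - 1] - R * Q[i - 1] * abs(Q[i - 1])
        cm = H[i + 1] - B * Q[i + 1] + R * Q[i + 1] * abs(Q[i + 1])
        Hn[i] = 0.5 * (cp + cm)
        Qn[i] = (cp - cm) / (2.0 * B)
    return Hn, Qn

# illustrative frictionless pipe: a = 1000 m/s, A = 0.1 m^2, steady Q = 0.05 m^3/s
a, area, g = 1000.0, 0.1, 9.81
B = a / (g * area)                 # characteristic impedance term a/(gA)
H, Q = [50.0] * 5, [0.05] * 5      # steady head and flow on 5 grid nodes
# instantaneous closure at the last node: Q = 0, head from the C+ line
h_valve = H[-2] + B * Q[-2]
surge = h_valve - 50.0             # equals the Joukowsky rise a*V0/g, V0 = Q/A
```

The head rise at the valve equals the Joukowsky value a*V0/g (about 51 m here), the maximum surge that slow closure is meant to avoid; boundary nodes (reservoir, valve) are handled by combining the single available characteristic with the boundary condition.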

  11. Chapter 11. Community analysis-based methods

    Energy Technology Data Exchange (ETDEWEB)

    Cao, Y.; Wu, C.H.; Andersen, G.L.; Holden, P.A.


    Microbial communities are each a composite of populations whose presence and relative abundance in water or other environmental samples are a direct manifestation of environmental conditions, including the introduction of microbe-rich fecal material and factors promoting persistence of the microbes therein. As shown by culture-independent methods, different animal-host fecal microbial communities appear distinctive, suggesting that their community profiles can be used to differentiate fecal samples and to potentially reveal the presence of host fecal material in environmental waters. Cross-comparisons of microbial communities from different hosts also reveal relative abundances of genetic groups that can be used to distinguish sources. In increasing order of their information richness, several community analysis methods hold promise for MST applications: phospholipid fatty acid (PLFA) analysis, denaturing gradient gel electrophoresis (DGGE), terminal restriction fragment length polymorphism (TRFLP), cloning/sequencing, and PhyloChip. Specific case studies involving TRFLP and PhyloChip approaches demonstrate the ability of community-based analyses of contaminated waters to confirm a diagnosis of water quality based on host-specific marker(s). The success of community-based MST for comprehensively confirming fecal sources relies extensively upon using appropriate multivariate statistical approaches. While community-based MST is still under evaluation and development as a primary diagnostic tool, results presented herein demonstrate its promise. Coupled with its inherently comprehensive ability to capture an unprecedented amount of microbiological data that is relevant to water quality, the tools for microbial community analysis are increasingly accessible, and community-based approaches have unparalleled potential for translation into rapid, perhaps real-time, monitoring platforms.

  12. A concise method for mine soils analysis

    Energy Technology Data Exchange (ETDEWEB)

    Winkler, S.; Wildeman, T.; Robinson, R.; Herron, J.


    A large number of abandoned hard rock mines exist in Colorado and other mountain west states, many on public property. Public pressure and resulting policy changes have become a driving force in the reclamation of these sites. Two of the key reclamation issues for these sites are the occurrence of acid forming materials (AFMs) in mine soils and acid mine drainage (AMD) issuing from mine adits. An AMD treatment system design project for the Forest Queen mine in Colorado's San Juan mountains raised the need for a simple, usable method for analysis of mine land soils, both for suitability as a construction material and to determine the AFM content and potential for acid release. The authors have developed a simple, stepwise, go/no-go test for the analysis of mine soils. Samples were collected from a variety of sites in the Silverton, CO area, and subjected to three tiers of tests including: paste pH, Eh, and 10% HCl fizz test; then total digestion in HNO{sub 3}/HCl, neutralization potential, exposure to meteoric water, and toxicity characteristic leaching procedure (TCLP). All elemental analyses were performed with an inductively-coupled plasma (ICP) spectrometer. Elimination of samples via the first two testing tiers left two remaining samples, which were subsequently subjected to column and sequential batch tests, with further elemental analysis by ICP. Based on these tests, one sample was chosen for suitability as a construction material for the Forest Queen treatment system basins. Further simplification, and testing on two pairs of independent soil samples, has resulted in a final analytical method suitable for general use.

  13. Computational Aeroacoustic Analysis System Development (United States)

    Hadid, A.; Lin, W.; Ascoli, E.; Barson, S.; Sindir, M.


    Many industrial and commercial products operate in a dynamic flow environment and the aerodynamically generated noise has become a very important factor in the design of these products. In light of the importance in characterizing this dynamic environment, Rocketdyne has initiated a multiyear effort to develop an advanced general-purpose Computational Aeroacoustic Analysis System (CAAS) to address these issues. This system will provide a high fidelity predictive capability for aeroacoustic design and analysis. The numerical platform is able to provide high temporal and spatial accuracy that is required for aeroacoustic calculations through the development of a high order spectral element numerical algorithm. The analysis system is integrated with well-established CAE tools, such as a graphical user interface (GUI) through PATRAN, to provide cost-effective access to all of the necessary tools. These include preprocessing (geometry import, grid generation and boundary condition specification), code set up (problem specification, user parameter definition, etc.), and postprocessing. The purpose of the present paper is to assess the feasibility of such a system and to demonstrate the efficiency and accuracy of the numerical algorithm through numerical examples. Computations of vortex shedding noise were carried out in the context of a two-dimensional low Mach number turbulent flow past a square cylinder. The computational aeroacoustic approach that is used in CAAS relies on coupling a base flow solver to the acoustic solver throughout a computational cycle. The unsteady fluid motion, which is responsible for both the generation and propagation of acoustic waves, is calculated using a high order flow solver. The results of the flow field are then passed to the acoustic solver through an interpolator to map the field values into the acoustic grid. 
The acoustic field, which is governed by the linearized Euler equations, is then calculated using the flow results computed

  14. Validation of a hybrid life-cycle inventory analysis method. (United States)

    Crawford, Robert H


    The life-cycle inventory analysis step of a life-cycle assessment (LCA) may currently suffer from several limitations, mainly concerned with the use of incomplete and unreliable data sources and methods of assessment. Many past LCA studies have used traditional inventory analysis methods, namely process analysis and input-output analysis. More recently, hybrid inventory analysis methods have been developed, combining these two traditional methods in an attempt to minimise their limitations. In light of recent improvements, these hybrid methods need to be compared and validated, as these too have been considered to have several limitations. This paper evaluates a recently developed hybrid inventory analysis method which aims to improve the limitations of previous methods. It was found that the truncation associated with process analysis can be up to 87%, reflecting the considerable shortcomings in the quantity of process data currently available. Capital inputs were found to account for up to 22% of the total inputs to a particular product. These findings suggest that current best-practice methods are sufficiently accurate for most typical applications, but this is heavily dependent upon data quality and availability. The use of input-output data assists in improving the system boundary completeness of life-cycle inventories. However, the use of input-output analysis alone does not always provide an accurate model for replacing process data. Further improvements in the quantity of process data currently available are needed to increase the reliability of life-cycle inventories.
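The hybrid idea of filling process-data truncation with input-output totals can be sketched numerically. The figures echo the truncation magnitude reported above but are otherwise illustrative, and the simple "fill the remainder" rule is a generic sketch of hybrid analysis, not the specific method evaluated in the paper:

```python
def truncation_ratio(process_total, io_total):
    # share of the full input-output system boundary missed by process data
    return 1.0 - process_total / io_total

def hybrid_total(process_total, io_total, io_covered):
    # keep the more specific process data where available and fill the
    # rest of the system boundary with input-output estimates
    return process_total + (io_total - io_covered)

# illustrative numbers: process data cover pathways the IO model values at 13
# units (of an IO total of 100), but measure them more precisely at 15 units
trunc = truncation_ratio(13.0, 100.0)     # 87% of the boundary is truncated
total = hybrid_total(15.0, 100.0, 13.0)   # process values replace IO values
```

The hybrid total retains the process-level detail while restoring the system-boundary completeness of the input-output model.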

  15. Generalized Analysis of a Distribution Separation Method

    Directory of Open Access Journals (Sweden)

    Peng Zhang


    Full Text Available Separating two probability distributions from a mixture model that is made up of the combinations of the two is essential to a wide range of applications. For example, in information retrieval (IR, there often exists a mixture distribution consisting of a relevance distribution that we need to estimate and an irrelevance distribution that we hope to get rid of. Recently, a distribution separation method (DSM was proposed to approximate the relevance distribution, by separating a seed irrelevance distribution from the mixture distribution. It was successfully applied to an IR task, namely pseudo-relevance feedback (PRF, where the query expansion model is often a mixture term distribution. Although initially developed in the context of IR, DSM is indeed a general mathematical formulation for probability distribution separation. Thus, it is important to further generalize its basic analysis and to explore its connections to other related methods. In this article, we first extend DSM’s theoretical analysis, which was originally based on the Pearson correlation coefficient, to entropy-related measures, including the KL-divergence (Kullback–Leibler divergence, the symmetrized KL-divergence and the JS-divergence (Jensen–Shannon divergence. Second, we investigate the distribution separation idea in a well-known method, namely the mixture model feedback (MMF approach. We prove that MMF also complies with the linear combination assumption, and then, DSM’s linear separation algorithm can largely simplify the EM algorithm in MMF. These theoretical analyses, as well as further empirical evaluation results demonstrate the advantages of our DSM approach.
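DSM's linear combination assumption, and the KL-divergence used in the extended analysis, can be sketched as follows. The distributions are illustrative, and the mixing coefficient is assumed known here (estimating it is the harder part of DSM):

```python
import math

def separate(mixture, seed, lam):
    """Recover the target distribution from
    mixture = lam*seed + (1-lam)*target, per the linear combination
    assumption; lam is taken as given in this sketch."""
    target = [(m - lam * s) / (1.0 - lam) for m, s in zip(mixture, seed)]
    target = [max(t, 0.0) for t in target]    # clip numerical negatives
    z = sum(target)
    return [t / z for t in target]

def kl_divergence(p, q):
    # Kullback-Leibler divergence D(p || q)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# illustrative distributions (not from the paper)
target  = [0.5, 0.3, 0.2]     # "relevance" distribution to recover
seed    = [0.1, 0.2, 0.7]     # seed irrelevance distribution
lam     = 0.4                 # mixing coefficient, assumed known
mixture = [lam * s + (1 - lam) * t for s, t in zip(seed, target)]
recovered = separate(mixture, seed, lam)
```

With the true mixing coefficient, the separation is exact and the KL-divergence between the true and recovered distributions vanishes.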

  16. Development and Multi-laboratory Verification of US EPA Method 543 for the Analysis of Drinking Water Contaminants by Online Solid Phase Extraction-LC-MS-MS. (United States)

    Shoemaker, Jody A


    A drinking water method for seven pesticides and pesticide degradates is presented that addresses the occurrence monitoring needs of the US Environmental Protection Agency (EPA) for a future Unregulated Contaminant Monitoring Regulation (UCMR). The method employs online solid phase extraction-liquid chromatography-tandem mass spectrometry (SPE-LC-MS-MS). Online SPE-LC-MS-MS has the potential to offer cost-effective, faster, more sensitive and more rugged methods than the traditional offline SPE approach due to complete automation of the SPE process, as well as seamless integration with the LC-MS-MS system. The method uses 2-chloroacetamide, ascorbic acid and Trizma to preserve the drinking water samples for up to 28 days. The mean recoveries in drinking water (from a surface water source) fortified with method analytes are 87.1-112% with relative standard deviations of requirements for sample collection and storage, precision, accuracy, and sensitivity.

  17. Methods for developing and validating survivability distributions

    Energy Technology Data Exchange (ETDEWEB)

    Williams, R.L.


    A previous report explored and discussed statistical methods and procedures that may be applied to validate the survivability of a complex system of systems that cannot be tested as an entity. It described a methodology where Monte Carlo simulation was used to develop the system survivability distribution from the component distributions using a system model that registers the logical interactions of the components to perform system functions. This paper discusses methods that can be used to develop the required survivability distributions based upon three sources of knowledge. These are (1) available test results; (2) little or no available test data, but a good understanding of the physical laws and phenomena which can be applied by computer simulation; and (3) neither test data nor adequate knowledge of the physics are known, in which case, one must rely upon, and quantify, the judgement of experts. This paper describes the relationship between the confidence bounds that can be placed on survivability and the number of tests conducted. It discusses the procedure for developing system level survivability distributions from the distributions for lower levels of integration. It demonstrates application of these techniques by defining a communications network for a Hypothetical System Architecture. A logic model for the performance of this communications network is developed, as well as the survivability distributions for the nodes and links based on two alternate data sets, reflecting the effects of increased testing of all elements. It then shows how this additional testing could be optimized by concentrating only on those elements contained in the low-order fault sets which the methodology identifies.

  18. Beam-propagation method - Analysis and assessment (United States)

    van Roey, J.; van der Donk, J.; Lagasse, P. E.


    A method for the calculation of the propagation of a light beam through an inhomogeneous medium is presented. A theoretical analysis of this beam-propagation method is given, and a set of conditions necessary for the accurate application of the method is derived. The method is illustrated by the study of a number of integrated-optic structures, such as thin-film waveguides and gratings.

  19. Fractal methods in image analysis and coding


    Neary, David


    In this thesis we present an overview of image processing techniques which use fractal methods in some way. We show how these fields relate to each other, and examine various aspects of fractal methods in each area. The three principal fields of image processing and analysis th a t we examine are texture classification, image segmentation and image coding. In the area of texture classification, we examine fractal dimension estimators, comparing these methods to other methods in use, a...



    Fedoseeva O. S.


    The article proposes a method for landscape area analysis, which consists of four stages. Technique is proposed as a tool for the practical application of pre-project research materials in the design solutions for landscape areas planning and organization

  1. Landfill mining: Developing a comprehensive assessment method. (United States)

    Hermann, Robert; Wolfsberger, Tanja; Pomberger, Roland; Sarc, Renato


    In Austria, the first basic technological and economic examinations of mass-waste landfills with the purpose to recover secondary raw materials have been carried out by the 'LAMIS - Landfill Mining Österreich' pilot project. A main focus of its research, and the subject of this article, is the first conceptual design of a comprehensive assessment method for landfill mining plans, including not only monetary factors (like costs and proceeds) but also non-monetary ones, such as the concerns of adjoining owners or the environmental impact. Detailed reviews of references, the identification of influences and system boundaries to be included in planning landfill mining, several expert workshops and talks with landfill operators have been performed followed by a division of the whole assessment method into preliminary and main assessment. Preliminary assessment is carried out with a questionnaire to rate juridical feasibility, the risk and the expenditure of a landfill mining project. The results of this questionnaire are compiled in a portfolio chart that is used to recommend, or not, further assessment. If a detailed main assessment is recommended, defined economic criteria are rated by net present value calculations, while ecological and socio-economic criteria are examined in a utility analysis and then transferred into a utility-net present value chart. If this chart does not support making a definite statement on the feasibility of the project, the results must be further examined in a cost-effectiveness analysis. Here, the benefit of the particular landfill mining project per capital unit (utility-net present value ratio) is determined to make a final distinct statement on the general benefit of a landfill mining project.

  2. Analysis and development of spatial hp-refinement methods for solving the neutron transport equation; Analyse et developpement de methodes de raffinement hp en espace pour l'equation de transport des neutrons

    Energy Technology Data Exchange (ETDEWEB)

    Fournier, D.


    The different neutronic parameters have to be calculated with a higher accuracy in order to design the 4. generation reactor cores. As memory storage and computation time are limited, adaptive methods are a solution to solve the neutron transport equation. The neutronic flux, solution of this equation, depends on the energy, angle and space. The different variables are successively discretized. The energy with a multigroup approach, considering the different quantities to be constant on each group, the angle by a collocation method called SN approximation. Once the energy and angle variable are discretized, a system of spatially-dependent hyperbolic equations has to be solved. Discontinuous finite elements are used to make possible the development of hp-refinement methods. Thus, the accuracy of the solution can be improved by spatial refinement (h-refinement), consisting into subdividing a cell into sub-cells, or by order refinement (p-refinement), by increasing the order of the polynomial basis. In this thesis, the properties of this methods are analyzed showing the importance of the regularity of the solution to choose the type of refinement. Thus, two error estimators are used to lead the refinement process. Whereas the first one requires high regularity hypothesis (analytical solution), the second one supposes only the minimal hypothesis required for the solution to exist. The comparison of both estimators is done on benchmarks where the analytic solution is known by the method of manufactured solutions. Thus, the behaviour of the solution as a regard of the regularity can be studied. It leads to a hp-refinement method using the two estimators. Then, a comparison is done with other existing methods on simplified but also realistic benchmarks coming from nuclear cores. These adaptive methods considerably reduces the computational cost and memory footprint. To further improve these two points, an approach with energy-dependent meshes is proposed. 
Actually, as the

  3. Method development towards qualitative and semi-quantitative analysis of multiple pesticides from food surfaces and extracts by desorption electrospray ionization mass spectrometry as a preselective tool for food control. (United States)

    Gerbig, Stefanie; Stern, Gerold; Brunn, Hubertus E; Düring, Rolf-Alexander; Spengler, Bernhard; Schulz, Sabine


    Direct analysis of fruit and vegetable surfaces is an important tool for in situ detection of food contaminants such as pesticides. We tested three different ways to prepare samples for the qualitative desorption electrospray ionization mass spectrometry (DESI-MS) analysis of 32 pesticides found on nine authentic fruits collected from food control. Best recovery rates for topically applied pesticides (88%) were found by analyzing the surface of a glass slide which had been rubbed against the surface of the food. Pesticide concentration in all samples was at or below the maximum residue level allowed. In addition to the high sensitivity of the method for qualitative analysis, quantitative or, at least, semi-quantitative information is needed in food control. We developed a DESI-MS method for the simultaneous determination of linear calibration curves of multiple pesticides of the same chemical class using normalization to one internal standard (ISTD). The method was first optimized for food extracts and subsequently evaluated for the quantification of pesticides in three authentic food extracts. Next, pesticides and the ISTD were applied directly onto food surfaces, and the corresponding calibration curves were obtained. The determination of linear calibration curves was still feasible, as demonstrated for three different food surfaces. This proof-of-principle method was used to simultaneously quantify two pesticides on an authentic sample, showing that the method developed could serve as a fast and simple preselective tool for disclosure of pesticide regulation violations. Graphical Abstract Multiple pesticide residues were detected and quantified in-situ from an authentic set of food items and extracts in a proof of principle study.

  4. An introduction to numerical methods and analysis

    CERN Document Server

    Epperson, James F


    Praise for the First Edition "". . . outstandingly appealing with regard to its style, contents, considerations of requirements of practice, choice of examples, and exercises.""-Zentralblatt MATH "". . . carefully structured with many detailed worked examples.""-The Mathematical Gazette The Second Edition of the highly regarded An Introduction to Numerical Methods and Analysis provides a fully revised guide to numerical approximation. The book continues to be accessible and expertly guides readers through the many available techniques of numerical methods and analysis. An Introduction to

  5. [Framework analysis method in qualitative research]. (United States)

    Liao, Xing; Liu, Jian-ping; Robison, Nicola; Xie, Ya-ming


    In recent years a number of qualitative research methods have gained popularity within the health care arena. Despite this popularity, different qualitative analysis methods pose many challenges to most researchers. The present paper responds to the needs expressed by recent Chinese medicine researches. The present paper is mainly focused on the concepts, nature, application of framework analysis, especially on how to use it, in such a way to assist the newcomer of Chinese medicine researchers to engage with the methodology.

  6. REVIEW: Development of methods for body composition studies (United States)

    Mattsson, Sören; Thomas, Brian J.


    This review is focused on experimental methods for determination of the composition of the human body, its organs and tissues. It summarizes the development and current status of fat determinations from body density, total body water determinations through the dilution technique, whole and partial body potassium measurements for body cell mass estimates, in vivo neutron activation analysis for body protein measurements, dual-energy absorptiometry (DEXA), computed tomography (CT) and magnetic resonance imaging (MRI, fMRI) and spectroscopy (MRS) for body composition studies on tissue and organ levels, as well as single- and multiple-frequency bioimpedance (BIA) and anthropometry as simple easily available methods. Methods for trace element analysis in vivo are also described. Using this wide range of measurement methods, together with gradually improved body composition models, it is now possible to quantify a number of body components and follow their changes in health and disease.

  7. Schedulability Analysis Method of Timing Constraint Petri Nets

    Institute of Scientific and Technical Information of China (English)

    李慧芳; 范玉顺


    Timing constraint Petri nets (TCPNs) can be used to model a real-time system specification and to verify the timing behavior of the system. This paper describes the limitations of the reachability analysis method in analyzing complex systems for existing TCPNs. Based on further research on the schedulability analysis method with various topology structures, a more general state reachability analysis method is proposed. To meet various requirements of timely response for actual systems, this paper puts forward a heuristic method for selecting decision-spans of transitions and develops a heuristic algorithm for schedulability analysis of TCPNs. Examples are given showing the practicality of the method in the schedulability analysis for real-time systems with various structures.

  8. Advanced Software Methods for Physics Analysis (United States)

    Lista, L.


    Unprecedented data analysis complexity is experienced in modern High Energy Physics experiments. The complexity arises from the growing size of recorded data samples, the large number of data analyses performed by different users in each single experiment, and the level of complexity of each single analysis. For this reason, the requirements on software for data analysis impose a very high level of reliability. We present two concrete examples: the former from BaBar experience with the migration to a new Analysis Model with the definition of a new model for the Event Data Store, the latter about a toolkit for multivariate statistical and parametric Monte Carlo analysis developed using generic programming.

  9. Two MIS Analysis Methods: An Experimental Comparison. (United States)

    Wang, Shouhong


    In China, 24 undergraduate business students applied data flow diagrams (DFD) to a mini-case, and 20 used object-oriented analysis (OOA). DFD seemed easier to learn, but after training, those using the OOA method for systems analysis made fewer errors. (SK)

  10. Analysis within the systems development life-cycle

    CERN Document Server

    Rock-Evans, Rosemary


    Analysis within the Systems Development Life-Cycle: Book 4, Activity Analysis-The Methods describes the techniques and concepts for carrying out activity analysis within the systems development life-cycle. Reference is made to the deliverables of data analysis and more than one method of analysis, each a viable alternative to the other, are discussed. The """"bottom-up"""" and """"top-down"""" methods are highlighted. Comprised of seven chapters, this book illustrates how dependent data and activities are on each other. This point is especially brought home when the task of inventing new busin

  11. Quantitative Risk Analysis: Method And Process

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA


    Full Text Available Recent and past studies (King III report, 2009: 73-75; Stoney 2007;Committee of Sponsoring Organisation-COSO, 2004, Bartell, 2003; Liebenberg and Hoyt, 2003; Reason, 2000; Markowitz 1957 lament that although, the introduction of quantifying risk to enhance degree of objectivity in finance for instance was quite parallel to its development in the manufacturing industry, it is not the same in Higher Education Institution (HEI. In this regard, the objective of the paper was to demonstrate the methods and process of Quantitative Risk Analysis (QRA through likelihood of occurrence of risk (phase I. This paper serves as first of a two-phased study, which sampled hundred (100 risk analysts in a University in the greater Eastern Cape Province of South Africa.The analysis of likelihood of occurrence of risk by logistic regression and percentages were conducted to investigate whether there were a significant difference or not between groups (analyst in respect of QRA.The Hosmer and Lemeshow test was non-significant with a chi-square(X2 =8.181; p = 0.300, which indicated that there was a good model fit, since the data did not significantly deviate from the model. The study concluded that to derive an overall likelihood rating that indicated the probability that a potential risk may be exercised within the construct of an associated threat environment, the following governing factors must be considered: (1 threat source motivation and capability (2 nature of the vulnerability (3 existence and effectiveness of current controls (methods and process.

  12. Development and validation of an in-house quantitative analysis method for cylindrospermopsin using hydrophilic interaction liquid chromatography-tandem mass spectrometry: Quantification demonstrated in 4 aquatic organisms. (United States)

    Esterhuizen-Londt, Maranda; Kühn, Sandra; Pflugmacher, Stephan


    The cyanobacterial toxin cylindrospermopsin (CYN) is of great concern in aquatic environments because of its incidence, multiple toxicity endpoints, and, therefore, the severity of health implications. It may bioaccumulate in aquatic food webs, resulting in high exposure concentrations to higher-order trophic levels, particularly humans. Because of accumulation at primary levels resulting from exposure to trace amounts of toxin, a sensitive analytical technique with proven aquatic applications is required. In the present study, a hydrophilic interaction liquid chromatographic-tandem mass spectrometric method with a lower limit of detection of 200 fg on column (signal-to-noise ratio = 3, n = 9) and a lower limit of quantification of 1 pg on column (signal-to-noise ratio = 11, n = 9) with demonstrated application in 4 aquatic organisms is described. The analytical method was optimized and validated with a linear range (r(2) = 0.999) from 0.1 ng mL(-1) to 100 ng mL(-1) CYN. Mean recovery of the extraction method was 98 ± 2%. Application of the method was demonstrated by quantifying CYN uptake in Scenedesmus subspicatus (green algae), Egeria densa (Brazilian waterweed), Daphnia magna (water flea), and Lumbriculus variegatus (blackworm) after 24 h of static exposure to 50 μg L(-1) CYN. Uptake ranged from 0.05% to 0.11% of the nominal CYN exposure amount. This constitutes a sensitive and reproducible method for extraction and quantification of unconjugated CYN with demonstrated application in 4 aquatic organisms, which can be used in further aquatic toxicological investigations.

  13. Epistemological development and judgments and reasoning about teaching methods. (United States)

    Spence, Sarah; Helwig, Charles C


    Children's, adolescents', and adults' (N = 96 7-8, 10-11, and 13-14-year-olds and university students) epistemological development and its relation to judgments and reasoning about teaching methods was examined. The domain (scientific or moral), nature of the topic (controversial or noncontroversial), and teaching method (direct instruction by lectures versus class discussions) were systematically varied. Epistemological development was assessed in the aesthetics, values, and physical truth domains. All participants took the domain, nature of the topic, and teaching method into consideration in ways that showed age-related variations. Epistemological development in the value domain alone was predictive of preferences for class discussions and a critical perspective on teacher-centered direct instruction, even when age was controlled in the analysis.

  14. Analysis of intestinal flora development in breast-fed and formula-fed infants by using molecular identification and detection methods

    NARCIS (Netherlands)

    Harmsen, HJM; Wildeboer-Veloo, ACM; Raangs, GC; Wagendorp, AA; Klijn, N; Bindels, JG; Welling, GW


    Background: An obvious difference between breast-fed and formula-fed newborn infants is the development of the intestinal flora, considered to be of importance for protection against harmful micro-organisms and for the maturation of the intestinal immune system. In this study, novel molecular identi




  16. Causal Moderation Analysis Using Propensity Score Methods (United States)

    Dong, Nianbo


    This paper is based on previous studies in applying propensity score methods to study multiple treatment variables to examine the causal moderator effect. The propensity score methods will be demonstrated in a case study to examine the causal moderator effect, where the moderators are categorical and continuous variables. Moderation analysis is an…

  17. Error Analysis of Band Matrix Method


    Taniguchi, Takeo; Soga, Akira


    Numerical error in the solution of the band matrix method based on the elimination method in single precision is investigated theoretically and experimentally, and the behaviour of the truncation error and the roundoff error is clarified. Some important suggestions for the useful application of the band solver are proposed by using the results of above error analysis.

  18. Relating Actor Analysis Methods to Policy Problems

    NARCIS (Netherlands)

    Van der Lei, T.E.


    For a policy analyst the policy problem is the starting point for the policy analysis process. During this process the policy analyst structures the policy problem and makes a choice for an appropriate set of methods or techniques to analyze the problem (Goeller 1984). The methods of the policy anal

  19. Empirical likelihood method in survival analysis

    CERN Document Server

    Zhou, Mai


    Add the Empirical Likelihood to Your Nonparametric ToolboxEmpirical Likelihood Method in Survival Analysis explains how to use the empirical likelihood method for right censored survival data. The author uses R for calculating empirical likelihood and includes many worked out examples with the associated R code. The datasets and code are available for download on his website and CRAN.The book focuses on all the standard survival analysis topics treated with empirical likelihood, including hazard functions, cumulative distribution functions, analysis of the Cox model, and computation of empiric

  20. Development of automated conjunctival hyperemia analysis software. (United States)

    Sumi, Tamaki; Yoneda, Tsuyoshi; Fukuda, Ken; Hoshikawa, Yasuhiro; Kobayashi, Masahiko; Yanagi, Masahide; Kiuchi, Yoshiaki; Yasumitsu-Lovell, Kahoko; Fukushima, Atsuki


    Conjunctival hyperemia is observed in a variety of ocular inflammatory conditions. The evaluation of hyperemia is indispensable for the treatment of patients with ocular inflammation. However, the major methods currently available for evaluation are based on nonquantitative and subjective methods. Therefore, we developed novel software to evaluate bulbar hyperemia quantitatively and objectively. First, we investigated whether the histamine-induced hyperemia of guinea pigs could be quantified by image analysis. Bulbar conjunctival images were taken by means of a digital camera, followed by the binarization of the images and the selection of regions of interest (ROIs) for evaluation. The ROIs were evaluated by counting the number of absolute pixel values. Pixel values peaked significantly 1 minute after histamine challenge was performed and were still increased after 5 minutes. Second, we applied the same method to antigen (ovalbumin)-induced hyperemia of sensitized guinea pigs, acquiring similar results except for the substantial upregulation in the first 5 minutes after challenge. Finally, we analyzed human bulbar hyperemia using the new software we developed especially for human usage. The new software allows the automatic calculation of pixel values once the ROIs have been selected. In our clinical trials, the percentage of blood vessel coverage of ROIs was significantly higher in the images of hyperemia caused by allergic conjunctival diseases and hyperemia induced by Bimatoprost, compared with those of healthy volunteers. We propose that this newly developed automated hyperemia analysis software will be an objective clinical tool for the evaluation of ocular hyperemia.

  1. Development of quantitative analysis method for mRNA in Mycobacterium leprae and slow-growing acid-fast bacteria using radioisotope

    Energy Technology Data Exchange (ETDEWEB)

    Nakanaga, Kazue; Maeda, Shinji; Matsuoka, Masanori; Kashiwabara, Yoshiko [National Inst. of Infectious Deseases, Tokyo (Japan)


    Since RNase protection assay (RPA) system for specific detection of mRNA from M. lepra was established in the previous year, modification of the system was attempted to detect a trace amount of mRNA in this study. Thus, RNA amplification was examined using nucleic aid sequence-based amplification method (NASBA). Since {sup 32}P CTP was used as an isotope for synthesis of anti-sense RNA probe in the previous method, the label compound was exchanged to that with a lower energy in this study, resulting that the half life of the probe was increased and handling of the probe became easier. Several short bands consisting of 100-130b were detected in total RNA sample of M.marinum and M.choelonae by RPA using T1 probe (194-762, 580b). Whereas the new probe M1 detected longer bands of about 350b from M.marinum RNA and of 250b from M.chelonae, M. bovis BCG and M. kansaii. However, T1 probe was more suitable for specific detection of M.leprae hsp 65 than M1 probe because high and low homogeneous regions are coexisting in the gene. Specific mRNA was detectable from only 3 pg of total RNA by the use of NASBA. RNA recovery for QIAGEN was about 50%, however, the sensitivity of NASBA method was estimated to be several ten to hundred thousands times higher, suggesting that this method is very effective for detection and determination of trace amount of mRNA. (M.N.)

  2. XRSW method, its application and development

    Energy Technology Data Exchange (ETDEWEB)

    Zheludeva, S.I.; Kovalchuk, M.V. [Russian Academy of Sciences, Institute of Crystallography, Moscow (Russian Federation)


    X-Ray Standing Waves (XRSW) may be obtained under dynamical diffraction in periodic structures or under total external reflection conditions (TR) is stratified medium. As the incident angle varies, XRSW nodes and antinodes move in the direction perpendicular to the reflecting planes, leading to drastic variation of photoelectron interaction of X-ray with matter and resulting in specific angular dependencies of secondary radiation yields (photoelectrons, fluorescence, internal photoeffect, photoluminescence, Compton and thermal diffuse scattering). The structural information - the position of investigated atoms in the direction of XRSW movement (coherent position), the distribution of atoms about this position (coherent fraction) - is obtained with the accuracy about several percents from XRSW period D. The objects under investigation are: semiconductor surface layers, heterostructure, multicomponent crystals, interfaces, adsorbed layers. Besides the development of XRSW method allow to obtain structure, geometrical and optical parameters of ultrathin films (crystalline and disordered, organic and inorganic) and nanostructures on their base.


    Directory of Open Access Journals (Sweden)

    Andrea Valéria Steil


    Full Text Available Work analysis is a process used to understand what the important tasks of the job are, how they are performed, and what human attributes are necessary to carry them out successfully. Work analysis is an attempt to develop a theory of human behavior about the job in question to support management decisions. This paper defines work analysis, discusses its main uses in organizations, and presents the objects of study and the methods of work analysis. This paper also discusses how work analysis is done, considering the following steps:  types of data to be collected, data sources, data collecting methods, summary of the information and work analysis reports. This paper ends with the differentiation of work analysis and individual modeling skills and brings arguments to endorse work analysis as an intervention of work and organizational psychology.

  4. Methods for Developing Emissions Scenarios for Integrated Assessment Models

    Energy Technology Data Exchange (ETDEWEB)

    Prinn, Ronald [MIT; Webster, Mort [MIT


    The overall objective of this research was to contribute data and methods to support the future development of new emissions scenarios for integrated assessment of climate change. Specifically, this research had two main objectives: 1. Use historical data on economic growth and energy efficiency changes, and develop probability density functions (PDFs) for the appropriate parameters for two or three commonly used integrated assessment models. 2. Using the parameter distributions developed through the first task and previous work, we will develop methods of designing multi-gas emission scenarios that usefully span the joint uncertainty space in a small number of scenarios. Results on the autonomous energy efficiency improvement (AEEI) parameter are summarized, an uncertainty analysis of elasticities of substitution is described, and the probabilistic emissions scenario approach is presented.

  5. Development and prototypical application of analysis methods for complex anion mixtures in waters and heavy metal organyls in sediments; Entwicklung und prototypische Anwendung von Analysenverfahren fuer komplexe Anionengemische in Waessern und Schwermetallorganylen in Sedimenten. Schlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Anders, B.; Knoechel, A.; Potgeter, H.; Staub, S.; Stocker, M.


    When it comes to assessing the hazards emanating from heavy pollutants in waters mere elemental analysis provides too little information. Due to the great differences in toxicity and mobility it is important to know more about the exact species in question. This is particularly true of heavy metals that form stable organyls, specifically As, Pb, Sn and Hg, but also of alkylated arsenic acids, which need to be measured in complex anion mixtures. The purpose of the present project was to develop robust, powerful analysis methods and thus overcome the existing deficit in reliable analysis methods for these substances. An important approach in this connection is the use of coupled chromatography and detection systems for separation and analysis. [German] Hinsichtlich der von einer Schwermetallbelastung in Gewaessern ausgehenden Gefahren liefert die reine Elementanalytik nur unzureichende Aussagen. Aufgrund der grossen Unterschiede in Toxiditaet und Mobilitaet ist die Kenntnis der jeweils vorliegenden Spezies bedeutungsvoll. Dies gilt in besonderem Masse fuer die stabile Organyle bildenden Schwermetalle As, Pb, Sn und Hg sowie die alkylierten Arsensaeuren, die es innerhalb komplexer Anionengemische zu bestimmen gilt. Hinsichtlich ihrer sicheren Bestimmung bestehen methodische Defizite, die das vorliegende Projekt durch die Entwicklung robuster, nachweisstarker Analysenverfahren zu beseitigen versucht. Grosse Bedeutung kommt dabei gekoppelten Systemen aus Chromatographie und Detektion als Trenn- und Bestimmungsmethode zu. (orig.)

  6. Analysis of proteins and peptides by electromigration methods in microchips. (United States)

    Štěpánová, Sille; Kašička, Václav


    This review presents the developments and applications of microchip electromigration methods in the separation and analysis of peptides and proteins in the period 2011-mid-2016. The developments in sample preparation and preconcentration, microchannel material, and surface treatment are described. Separations by various microchip electromigration methods (zone electrophoresis in free and sieving media, affinity electrophoresis, isotachophoresis, isoelectric focusing, electrokinetic chromatography, and electrochromatography) are demonstrated. Advances in detection methods are reported and novel applications in the areas of proteomics and peptidomics, quality control of peptide and protein pharmaceuticals, analysis of proteins and peptides in biomatrices, and determination of physicochemical parameters are shown.

  7. Current Developments in Nuclear Density Functional Methods

    CERN Document Server

    Dobaczewski, J


    Density functional theory (DFT) became a universal approach to compute ground-state and excited configurations of many-electron systems held together by an external one-body potential in condensed-matter, atomic, and molecular physics. At present, the DFT strategy is also intensely studied and applied in the area of nuclear structure. The nuclear DFT, a natural extension of the self-consistent mean-field theory, is a tool of choice for computations of ground-state properties and low-lying excitations of medium-mass and heavy nuclei. Over the past thirty-odd years, a lot of experience was accumulated in implementing, adjusting, and using the density-functional methods in nuclei. This research direction is still extremely actively pursued. In particular, current developments concentrate on (i) attempts to improve the performance and precision delivered by the nuclear density-functional methods, (ii) derivations of density functionals from first principles rooted in the low-energy chromodynamics and effective th...

  8. Some selected quantitative methods of thermal image analysis in Matlab. (United States)

    Koprowski, Robert


    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images, and shows the practical implementation of these image analysis methods in Matlab. The approach enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of applying the proposed image analysis methods to skin areas of a human foot and face. The full source code of the developed application is provided as an attachment.
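
    The kind of fully automated region-of-interest measurement described can be sketched outside Matlab as well. A minimal Python analogue follows; the array values and the segmentation threshold are illustrative assumptions, not the paper's actual algorithm:

```python
import numpy as np

def roi_stats(thermal, mask):
    """Basic quantitative parameters for a region of interest in a thermal image."""
    roi = thermal[mask]
    return {"mean": roi.mean(), "max": roi.max(),
            "min": roi.min(), "std": roi.std()}

# Illustrative 4x4 "thermal image" (degrees C) and an automatic ROI mask.
img = np.array([[30.1, 30.4, 31.0, 30.2],
                [30.3, 33.5, 33.9, 30.1],
                [30.2, 33.7, 34.2, 30.0],
                [30.1, 30.2, 30.3, 30.4]])
mask = img > 33.0          # segment the warm region by a fixed threshold
stats = roi_stats(img, mask)
```

    Because the segmentation and the statistics are both computed, the measurement is reproducible: rerunning on the same image always yields the same numbers.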

  9. Methods development for total organic carbon accountability (United States)

    Benson, Brian L.; Kilgore, Melvin V., Jr.


    This report describes the efforts completed during the contract period beginning November 1, 1990 and ending April 30, 1991. Samples of product hygiene and potable water from WRT 3A were supplied by NASA/MSFC prior to contract award on July 24, 1990. Humidity condensate samples were supplied on August 3, 1990. During the course of this contract, chemical analyses were performed on these samples to qualitatively determine the specific components comprising the measured organic carbon concentration. In addition, these samples and known standard solutions were used to identify and develop methodology useful to future comprehensive characterization of similar samples. Standard analyses including pH, conductivity, and total organic carbon (TOC) were conducted. Colorimetric and enzyme-linked assays for total protein, bile acid, B-hydroxybutyric acid, methylene blue active substances (MBAS), urea nitrogen, ammonia, and glucose were also performed. Gas chromatographic procedures for non-volatile fatty acids and EPA priority pollutants were also performed. Liquid chromatography was used to screen for non-volatile, water-soluble compounds not amenable to GC techniques. Methods development efforts were initiated to separate and quantitate certain chemical classes not classically analyzed in water and wastewater samples. These included carbohydrates, organic acids, and amino acids. Finally, efforts were initiated to identify useful concentration techniques to enhance detection limits and recovery of non-volatile, water-soluble compounds.
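
    A TOC "accountability" check of the kind pursued here can be expressed as the fraction of the measured total organic carbon that is explained by individually identified compounds. The compounds, concentrations, and carbon mass fractions below are invented placeholders, purely to show the arithmetic:

```python
# Hypothetical illustration of TOC accountability: compare the carbon
# contributed by identified compounds against the measured total organic carbon.

# mg/L of each identified compound and the fraction of its mass that is carbon.
identified = {
    "urea":    (12.0, 0.200),
    "glucose": (5.0,  0.400),
    "acetate": (3.0,  0.407),
}

measured_toc = 6.0  # mg C / L, as reported by a TOC analyzer (invented value)

accounted_c = sum(conc * frac for conc, frac in identified.values())
accountability = 100.0 * accounted_c / measured_toc   # percent of TOC explained
```

    A shortfall from 100% would indicate organic carbon from species not yet identified, which is exactly the gap the class-specific methods (carbohydrates, organic acids, amino acids) aim to close.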

  10. Robust methods for multivariate data analysis A1

    DEFF Research Database (Denmark)

    Frosch, Stina; Von Frese, J.; Bro, Rasmus


    Outliers may hamper proper classical multivariate analysis, and lead to incorrect conclusions. To remedy the problem of outliers, robust methods have been developed in statistics and chemometrics. Robust methods reduce or remove the effect of outlying data points and allow the 'good' data to primarily...

  11. Interactive radio instruction: developing instructional methods. (United States)

    Friend, J


    The USAID has, since 1972, funded the development of a new methodology for educational radio for young children through 3 projects: the Radio Mathematics Project of Nicaragua, the Radio Language Arts Project of Kenya, and the Radio Science Project of Papua New Guinea. These projects developed math programs for grades 1-4 and English as a second language for grades 1-3; programs to teach science in grades 4-6 are now being developed. Appropriate techniques were developed to engage young children actively in the learning process. Lessons are planned as a "conversation" between the children and the radio; scripts are written as 1/2 of a dialogue, with pauses carefully timed so that students can contribute their 1/2. Teaching techniques used in all 3 projects include choral responses, simultaneous individual seatwork, and activities using simple materials such as pebbles and rulers. Certain techniques were specific to the subject being taught, or to the circumstances in which the lessons were to be used. Patterned oral drill was used frequently in the English lessons, including sound-cued drills. "Deferred" oral responses were used often in the math lessons. In this method, the children are instructed to solve a problem silently, not giving the answer aloud until requested, thus allowing time for even the slower children to participate. "One-child" questions were used in both English and science: the radio asks a question to be answered by a single child, who is selected on the spot by the classroom teacher. This allows for open-ended questions, but also requires constant supervision by the classroom teacher. Songs and games were used in all programs, and extensively for didactic purposes in the teaching of English. Instructions for science activities are often more complex than in other courses, particularly when the children are using science apparatus, especially when they work in pairs to share scarce

  12. Scope-Based Method Cache Analysis

    DEFF Research Database (Denmark)

    Huber, Benedikt; Hepp, Stefan; Schoeberl, Martin


    The quest for time-predictable systems has led to the exploration of new hardware architectures that simplify analysis and reasoning in the temporal domain, while still providing competitive performance. For the instruction memory, the method cache is a conceptually attractive solution, as it requests memory transfers at well-defined instructions only. In this article, we present a new cache analysis framework that generalizes and improves work on cache persistence analysis. The analysis demonstrates that a global view on the cache behavior permits the precise analysis of caches which are hard...

  13. Development of an automatic method for americium and plutonium separation and preconcentration using a multisyringe flow injection analysis-multipumping flow system. (United States)

    Fajardo, Yamila; Ferrer, Laura; Gómez, Enrique; Garcias, Francesca; Casas, Monserrat; Cerdà, Víctor


    A new procedure for the automatic separation and preconcentration of 241Am and 239+240Pu from interfering matrixes using transuranide (TRU) resin is proposed. Combining the multisyringe flow injection analysis and multipumping flow system techniques with the TRU resin allows the sample treatment and separation to be carried out in a short time using large sample volumes. Americium is eluted from the column with 4 mol L(-1) hydrochloric acid, and then plutonium is separated via on-column reduction of Pu(IV) to Pu(III) with titanium(III) chloride. The corresponding alpha activities are measured off-line, with a relative standard deviation of 3% and a lower limit of detection of 0.004 Bq mL(-1), using a multiplanchet low-background proportional counter.

  14. Real-time analysis of δ13C- and δD-CH4 in ambient air with laser spectroscopy: method development and first intercomparison results

    Directory of Open Access Journals (Sweden)

    S. Eyer


    Full Text Available In situ and simultaneous measurement of the three most abundant isotopologues of methane using mid-infrared laser absorption spectroscopy is demonstrated. A field-deployable, autonomous platform is realized by coupling a compact quantum cascade laser absorption spectrometer (QCLAS) to a preconcentration unit, called TRace gas EXtractor (TREX). This unit enhances CH4 mole fractions by a factor of up to 500 above ambient levels and quantitatively separates interfering trace gases such as N2O and CO2. The analytical precision of the QCLAS isotope measurement on the preconcentrated (750 ppm, parts-per-million, μmole/mole) methane is 0.1 and 0.5 ‰ for δ13C- and δD-CH4 at 10 min averaging time. Based on replicate measurements of compressed air during a two-week intercomparison campaign, the repeatability of the TREX-QCLAS was determined to be 0.19 and 1.9 ‰ for δ13C- and δD-CH4, respectively. In this intercomparison campaign the new in situ technique was compared to isotope-ratio mass spectrometry (IRMS) based on glass flask and bag sampling, and to real-time CH4 isotope analysis by two commercially available laser spectrometers. Both laser-based analyzers were limited to methane mole fraction and δ13C-CH4 analysis, and only one of them, a cavity ring-down spectrometer, was capable of delivering meaningful data for the isotopic composition. After correcting for scale offsets, the average differences between TREX-QCLAS data and bag/flask sampling-IRMS values are within the extended WMO compatibility goals of 0.2 and 5 ‰ for δ13C- and δD-CH4, respectively. Thus, the intercomparison also reveals the need for reference air samples with accurately determined isotopic composition of CH4 to further improve interlaboratory compatibility.
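
    The δ notation used above expresses an isotope ratio relative to a reference standard, in per mil. A minimal sketch of the arithmetic (the sample ratio below is an illustrative value constructed to correspond to δ13C of -47 ‰, roughly typical of ambient methane; VPDB is the conventional 13C/12C reference):

```python
# Delta notation: delta = (R_sample / R_reference - 1) * 1000, in per mil.
R_VPDB = 0.0111802          # commonly cited 13C/12C ratio of the VPDB standard

def delta_permil(r_sample, r_reference):
    return (r_sample / r_reference - 1.0) * 1000.0

# Construct an illustrative sample ratio corresponding to delta13C = -47 per mil.
r_sample = R_VPDB * (1.0 - 47.0 / 1000.0)
d13c = delta_permil(r_sample, R_VPDB)
```

    Instrument precisions such as "0.1 ‰ for δ13C" are statements about the repeatability of this derived quantity, not of the raw ratio itself.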

  15. Critical evaluation of the role of scientific analysis in UK local authority AQMA decision-making: method development and preliminary results. (United States)

    Woodfield, N K; Longhurst, J W S; Beattie, C I; Laxen, D P H


    Over the past 4 years, local government in the UK has undertaken a process of scientific review and assessment of air quality, which has culminated in a suite of designated air quality management areas (AQMAs) in over 120 of the 403 local authorities in England (including London), Scotland and Wales. Methods to identify specific pollution hot-spots have involved the use of advanced and complex air-quality dispersion modelling and monitoring techniques, and the UK government has provided guidance on both the general and technical methods for undertaking local air quality review and assessments. Approaches to implementing UK air quality policy through the local air quality management (LAQM) process (Air Quality Strategy 2000) have not been uniform across the UK, an inevitable consequence of non-prescriptive guidelines. This has led to a variety of outcomes with respect to how different tools and techniques have been applied, how scientific uncertainty has been interpreted, and how caution has been exercised. A technique to appraise the scientific approaches undertaken by local government and a survey of local government officers involved in the LAQM process have been devised, and a conceptual model is proposed to identify the main influences in the process of determining AQMAs. The modelling tools used, and the consideration of modelling uncertainty, error and model inputs, have played a significant role in AQMA decision-making in the majority of local authorities declaring AQMAs in the UK.

  16. Analysis of Maths Learning Activities Developed By Pre-service Teachers in Terms of the Components of Content, Purpose, Application Methods

    Directory of Open Access Journals (Sweden)

    Çağla Toprak


    Full Text Available Today, as the reform movement undertaken to bring the educational system up to date continues to exert its influence, studies have underlined the importance of teachers in students' learning and in achieving what is expected from the education system (Hazır & Bıkmaz, 2006). Teachers play a critical role both in preparing teaching materials and in using them (Stein & Smith, 1998b; Swan, 2007). When existing curricula, in particular the maths and geometry curricula, are analyzed, it can be observed that activities are the most significant teaching materials (Bozkurt, 2012). In fact, it is possible to characterize the existing curricula as activity-based (Report of the Workshop Examining the Content of Primary School Curriculums According to Branches, 2010; Epö, 2005). Therefore, what sorts of learning activities there are, what qualities they need to have, and how to design and apply them are topics that must be elaborated (Uğurel et al., 2010). At this point, our study, aimed at improving pre-service teachers' skills in developing activities, was conducted with 27 pre-service teachers (19 girls, 8 boys) studying in the 4th year of the Mathematics Education Department at a state university in the Aegean Region. The activity designs the pre-service teachers developed, following the given patterns after a series of practice sessions, were subjected to document analysis in terms of the aim of the design and the form of practice. As a result, it is observed that the pre-service teachers dealt with topics from the maths curriculum across different grade levels. The examination of the target component suggests that the developed activities aim primarily at providing learning, followed by reinforcing concepts already learned. Pre-service teachers mostly prefer small-group (cooperative) studies in the activities they develop.

  17. Dynamic analysis and assessment for sustainable development

    Institute of Scientific and Technical Information of China (English)


    The assessment of sustainable development is crucial for constituting sustainable development strategies. Assessment methods that exist so far usually only use an indicator system for making sustainability judgements, and these indicators rarely reflect dynamic characteristics. However, sustainable development is influenced by changes in the socio-economic system and in the eco-environmental system at different times. Besides its spatial character, sustainable development has a temporal character that cannot be neglected; therefore the research system should also be dynamic. This paper focuses on this dynamic trait, so that the assessment results obtained provide more information for judgements in decision-making processes. First, the dynamic characteristics of sustainable development are analyzed, showing that the track of sustainable development is an upward undulating curve. According to the dynamic character and the development rules of social, economic and ecological systems, a flexible assessment approach based on tendency analysis, restrictive conditions and a feedback system is then proposed for sustainable development.

  18. Development of a non-destructive micro-analytical method for stable carbon isotope analysis of transmission electron microscope (TEM) samples (United States)

    Hode, Tomas; Kristiansson, Per; Elfman, Mikael; Hugo, Richard C.; Cady, Sherry L.


    The biogenicity of ancient morphological microfossil-like objects can be established by linking morphological (e.g. cell remnants and extracellular polymeric matrix) and chemical (e.g. isotopes, biomarkers and biominerals) evidence indicative of microorganisms or microbial activity. We have developed a non-destructive micro-analytical ion beam system capable of measuring with high spatial resolution the stable carbon isotope ratios of thin samples used for transmission electron microscopy. The technique is based on elastic scattering of alpha particles with an energy of 2.751 MeV. At this energy the 13C cross section is enhanced relative to the pure Rutherford cross section for 13C, whereas the 12C cross section is reduced relative to its pure Rutherford cross section. Here we report the initial results of this experimental approach used to characterize ultramicrotomed sections of sulfur-embedded graphite and microbial cells.

  19. A Review of Scaling Agile Methods in Large Software Development

    Directory of Open Access Journals (Sweden)

    Mashal Alqudah


    Full Text Available Agile methods such as the Dynamic Systems Development Method (DSDM), Extreme Programming (XP), SCRUM, Agile Modeling (AM) and Crystal Clear enable small teams to execute assigned tasks at their best. However, larger organizations also aim to adopt Agile methods, even though their application is predominantly tailored to small teams. The scope in which large firms are interested extends the original Agile methods to cover larger teams, coordination, communication among teams and customers, as well as oversight. Choosing a particular software method is always challenging for software companies, whether start-ups, small-to-medium or large enterprises. Most large organizations develop large-scale projects with teams of teams, or teams of teams of teams. Therefore, most recognized first-generation Agile methods such as XP and SCRUM need to be modified before they are employed in large organizations, which is not an easy task. Accomplishing this task requires large organizations to pick and choose from the scaling Agile methods to accommodate a single vision for large and multiple teams. Making the right choice requires a wholesome understanding of each method, including its strengths and weaknesses, as well as when and how it makes sense. Therefore, the main aim of this paper is to review the existing literature on scaling Agile methods by defining, discussing and comparing them. In-depth reviews of the literature were performed to juxtapose the methods in an impartial manner, and content analysis was used to analyse the resultant data. The results indicate that DAD, LeSS, LeSS Huge, SAFe, Spotify, Nexus and RAGE are the scaling Agile methods adopted at large organizations. They seem similar, but there are discrepancies among them in terms of team size, training and certification, methods and practices adopted, technical practices required and organizational

  20. Microarray Analysis of the Developing Rat Mandible

    Institute of Scientific and Technical Information of China (English)

    Hideo KABURAGI; Naoyuki SUGANO; Maiko OSHIKAWA; Ryosuke KOSHI; Naoki SENDA; Kazuhiro KAWAMOTO; Koichi ITO


    To analyze the molecular events that occur in the developing mandible, we examined the expression of 8803 genes in samples taken at different time points during rat postnatal mandible development. Total RNA was extracted from the mandibles of 1-day-old, 1-week-old, and 2-week-old rats. Complementary RNA (cRNA) was synthesized from cDNA and biotinylated. Fragmented cRNA was hybridized to RGU34A GeneChip arrays. Among the 8803 genes tested, 4344 were detectable. We identified 148 genes with significantly increased expression and 19 genes with significantly decreased expression. Comprehensive analysis appears to be an effective method of studying the complex process of development.
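
    A minimal sketch of this kind of expression screen, flagging genes whose expression changes between time points, follows. The expression values, gene names, and the two-fold threshold are illustrative assumptions, not the study's actual data or criteria:

```python
import math

# Illustrative normalized expression values at two time points (day 1, week 2).
expression = {
    "geneA": (120.0, 510.0),   # strongly increased
    "geneB": (300.0, 290.0),   # essentially unchanged
    "geneC": (400.0, 90.0),    # strongly decreased
}

increased, decreased = [], []
for gene, (day1, week2) in expression.items():
    log2_fc = math.log2(week2 / day1)   # log2 fold change between time points
    if log2_fc >= 1.0:                  # at least two-fold up
        increased.append(gene)
    elif log2_fc <= -1.0:               # at least two-fold down
        decreased.append(gene)
```

    Real microarray analyses add normalization and statistical significance testing on replicates before applying such a fold-change filter.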

  1. Quality by design in the chiral separation strategy for the determination of enantiomeric impurities: development of a capillary electrophoresis method based on dual cyclodextrin systems for the analysis of levosulpiride. (United States)

    Orlandini, S; Pasquini, B; Del Bubba, M; Pinzauti, S; Furlanetto, S


    Quality by design (QbD) concepts, in accordance with the International Conference on Harmonisation Pharmaceutical Development guideline Q8(R2), represent an innovative strategy for the development of analytical methods. In this paper QbD principles have been comprehensively applied in the set-up of a capillary electrophoresis method aimed at quantifying enantiomeric impurities. The test compound was the chiral drug substance levosulpiride (S-SUL), and the developed method is intended for routine analysis of the pharmaceutical product. The target of the analytical QbD approach is to establish a design space (DS) of critical process parameters (CPPs) within which the critical quality attributes (CQAs) of the method are assured to fulfil the desired requirements with a selected probability. QbD can improve the understanding of the enantioseparation process, including both the electrophoretic behavior of the enantiomers and their separation, thereby enabling its control. The CQAs were enantioresolution and analysis time. The scouting phase made it possible to select a separation system composed of sulfated-β-cyclodextrin and a neutral cyclodextrin, operating in reverse polarity mode. The type of neutral cyclodextrin was included among the other CPPs, both instrumental and related to background electrolyte composition, which were evaluated in a screening phase by an asymmetric screening matrix. Response surface methodology was carried out with a Doehlert design and allowed the contour plots to be drawn, highlighting significant interactions between some of the CPPs. The DS was defined by applying Monte-Carlo simulations and corresponded to the following intervals: sulfated-β-cyclodextrin concentration, 9-12 mM; methyl-β-cyclodextrin concentration, 29-38 mM; Britton-Robinson buffer pH, 3.24-3.50; voltage, 12-14 kV. Robustness of the method was examined by a Plackett-Burman matrix and the obtained results, together with system repeatability data, led to define a method
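
    The design-space step described above, i.e. finding CPP regions where the CQAs meet their requirements with a selected probability via Monte-Carlo simulation, can be sketched as follows. The response model, noise level, and acceptance limit are invented for illustration; they are not the paper's fitted models:

```python
import random

# Invented quadratic response model: resolution as a function of two CPPs,
# a cyclodextrin concentration (mM) and the applied voltage (kV).
def resolution(cd_conc, voltage):
    return 0.15 * cd_conc + 0.1 * voltage - 0.004 * cd_conc ** 2

def prob_meeting_spec(cd_conc, voltage, min_res=2.0, n=10_000, noise=0.1):
    """Estimate P(resolution >= min_res) under Gaussian experimental noise."""
    hits = sum(
        resolution(cd_conc, voltage) + random.gauss(0.0, noise) >= min_res
        for _ in range(n)
    )
    return hits / n

random.seed(0)
p = prob_meeting_spec(cd_conc=10.5, voltage=13.0)   # one candidate CPP setting
inside_design_space = p >= 0.95                      # selected probability level
```

    Repeating this estimate over a grid of CPP settings traces out the region that qualifies as the design space.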

  2. Advanced analysis methods in particle physics

    Energy Technology Data Exchange (ETDEWEB)

    Bhat, Pushpalatha C.; /Fermilab


    Each generation of high energy physics experiments is grander in scale than the previous - more powerful, more complex and more demanding in terms of data handling and analysis. The spectacular performance of the Tevatron and the beginning of operations of the Large Hadron Collider, have placed us at the threshold of a new era in particle physics. The discovery of the Higgs boson or another agent of electroweak symmetry breaking and evidence of new physics may be just around the corner. The greatest challenge in these pursuits is to extract the extremely rare signals, if any, from huge backgrounds arising from known physics processes. The use of advanced analysis techniques is crucial in achieving this goal. In this review, I discuss the concepts of optimal analysis, some important advanced analysis methods and a few examples. The judicious use of these advanced methods should enable new discoveries and produce results with better precision, robustness and clarity.
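
    The "optimal analysis" referred to above is usually framed via the Neyman-Pearson lemma: the likelihood ratio of the signal to the background density is the most powerful test statistic for separating rare signals from large backgrounds. A toy sketch, with invented Gaussian signal and background distributions (the means, widths, and units are illustrative only):

```python
import math

def gaussian(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def likelihood_ratio(x):
    # Invented densities: a narrow signal peak at 125 over a broad background
    # centered at 100 (arbitrary units).
    return gaussian(x, 125.0, 5.0) / gaussian(x, 100.0, 20.0)

# Events near the signal peak get a large ratio; background-like events a small one.
signal_like = likelihood_ratio(124.0)
background_like = likelihood_ratio(90.0)
```

    Practical advanced methods (neural networks, boosted decision trees) can be viewed as approximating this ratio when the densities are not known in closed form.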

  4. Development of multiscale analysis and some applications (United States)

    Wang, Lipo


    For most complex systems, the interaction of different scales is among the most interesting and challenging features, and typically different scale regimes have different physical properties. Commonly used analysis approaches such as structure functions and Fourier analysis have their respective limitations, for instance the mixing of large- and small-scale information, i.e. the so-called infrared and ultraviolet effects. To improve on this, a new method, segment structure analysis (SSA), has been developed to study multiscale statistics. The method can detect regime scaling based on conditional extremal points, depicting geometrical features directly in physical space. From standard test cases (e.g. fractional Brownian motion) to real turbulence data, results show that SSA can appropriately distinguish the different scale effects. A successful application is the scaling of the Lagrangian velocity structure function: this long-controversial topic has been confirmed using the present method. In principle, SSA can be applied to a wide variety of problems.
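
    A rough sketch of the segment idea follows, assuming "segments" are the spans between successive local extrema of a signal and that the statistics of interest are segment length and height difference. This illustrates the concept in a few lines; it is not the published SSA algorithm:

```python
def local_extrema(signal):
    """Indices of interior local minima/maxima of a 1-D sequence, plus endpoints."""
    idx = [0]
    for i in range(1, len(signal) - 1):
        # Sign change of the discrete slope marks a local extremum.
        if (signal[i] - signal[i - 1]) * (signal[i + 1] - signal[i]) < 0:
            idx.append(i)
    idx.append(len(signal) - 1)
    return idx

def segment_stats(signal):
    """(length, height difference) for each span between successive extrema."""
    ext = local_extrema(signal)
    return [(b - a, signal[b] - signal[a]) for a, b in zip(ext, ext[1:])]

stats = segment_stats([0.0, 1.0, 3.0, 2.0, 2.5, 1.0])
```

    Because segments are defined in physical space, statistics conditioned on segment length probe one scale regime at a time, avoiding the mixing of scales inherent in Fourier methods.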

  5. Methods for Rapid Screening in Woody Plant Herbicide Development

    Directory of Open Access Journals (Sweden)

    William Stanley


    Full Text Available Methods for woody plant herbicide screening were assayed with the goal of reducing the resources and time required to conduct preliminary screenings of new products. The rapid screening methods tested included greenhouse seedling screening, germinal screening, and seed screening. Triclopyr and eight experimental herbicides from Dow AgroSciences (DAS 313, 402, 534, 548, 602, 729, 779, and 896) were tested on black locust, loblolly pine, red maple, sweetgum, and water oak. Screening detected differences among herbicides and species in all experiments in much less time (days to weeks) than traditional field screenings and consumed significantly fewer resources (<500 mg acid equivalent per herbicide per screening). Using regression analysis, the various rapid screening methods were linked into a system capable of rapidly and inexpensively assessing herbicide efficacy and spectrum of activity. Implementation of such a system could streamline early-stage herbicide development leading to field trials, potentially freeing resources for the development of beneficial new herbicide products.

  6. Developments and retrospectives in Lie theory algebraic methods

    CERN Document Server

    Penkov, Ivan; Wolf, Joseph


    This volume reviews and updates a prominent series of workshops in representation/Lie theory, and reflects the widespread influence of those  workshops in such areas as harmonic analysis, representation theory, differential geometry, algebraic geometry, and mathematical physics.  Many of the contributors have had leading roles in both the classical and modern developments of Lie theory and its applications. This Work, entitled Developments and Retrospectives in Lie Theory, and comprising 26 articles, is organized in two volumes: Algebraic Methods and Geometric and Analytic Methods. This is the Algebraic Methods volume. The Lie Theory Workshop series, founded by Joe Wolf and Ivan Penkov and joined shortly thereafter by Geoff Mason, has been running for over two decades. Travel to the workshops has usually been supported by the NSF, and local universities have provided hospitality. The workshop talks have been seminal in describing new perspectives in the field covering broad areas of current research.  Mos...

  7. Developments and retrospectives in Lie theory geometric and analytic methods

    CERN Document Server

    Penkov, Ivan; Wolf, Joseph


    This volume reviews and updates a prominent series of workshops in representation/Lie theory, and reflects the widespread influence of those  workshops in such areas as harmonic analysis, representation theory, differential geometry, algebraic geometry, and mathematical physics.  Many of the contributors have had leading roles in both the classical and modern developments of Lie theory and its applications. This Work, entitled Developments and Retrospectives in Lie Theory, and comprising 26 articles, is organized in two volumes: Algebraic Methods and Geometric and Analytic Methods. This is the Geometric and Analytic Methods volume. The Lie Theory Workshop series, founded by Joe Wolf and Ivan Penkov and joined shortly thereafter by Geoff Mason, has been running for over two decades. Travel to the workshops has usually been supported by the NSF, and local universities have provided hospitality. The workshop talks have been seminal in describing new perspectives in the field covering broad areas of current re...

  8. Formal methods in the development of safety critical software systems

    Energy Technology Data Exchange (ETDEWEB)

    Williams, L.G. [Software Engineering Research, Boulder, CO (United States)


    As the use of computers in critical control systems such as aircraft controls, medical instruments, defense systems, missile controls, and nuclear power plants has increased, concern for the safety of those systems has also grown. Much of this concern has focused on the software component of those computer-based systems. This is primarily due to historical experience with software systems that often exhibit larger numbers of errors than their hardware counterparts, and the fact that the consequences of a software error may endanger human life, property, or the environment. A number of different techniques have been used to address the issue of software safety. Some are standard software engineering techniques aimed at reducing the number of faults in a software product, such as reviews and walkthroughs. Others, including fault tree analysis, are based on identifying and reducing hazards. This report examines the role of one such technique, formal methods, in the development of software for safety critical systems. The use of formal methods to increase the safety of software systems is based on their role in reducing the possibility of software errors that could lead to hazards. The use of formal methods in the development of software systems is controversial. Proponents claim that the use of formal methods can eliminate errors from the software development process, and produce programs that are provably correct. Opponents claim that they are difficult to learn and that their use increases development costs unacceptably. This report discusses the potential of formal methods for reducing failures in safety critical software systems.


    Directory of Open Access Journals (Sweden)

    R. M. Cherkesov


    Full Text Available The socio-political and socio-economic changes currently taking place in the country require corresponding changes in departmental vocational training. The physical preparation of employees of the internal affairs bodies of the Russian Federation is one of the areas in need of thorough modification. The growing volume and complexity of the problems solved by the internal affairs bodies, together with the character of modern crime, make it necessary to improve officers' physical training. To maintain law and order and successfully confront crime, a police officer should possess both legal skills and decent physical training in equal measure; this principle of operation is to become a professional duty of a police officer. The article describes the basic methods of developing general and special endurance, as well as innovative developments in this field. The use of physical force has become commonplace in the everyday work of law enforcement officers. This entails the risk of serious consequences, in some cases threatening human health and life. Therefore, of particular importance are the problems of ensuring the rights of citizens and the rule of law in the activities of the internal affairs bodies.

  10. Echinacea purpurea: Pharmacology, phytochemistry and analysis methods

    Directory of Open Access Journals (Sweden)

    Azadeh Manayi


    Full Text Available Echinacea purpurea (Asteraceae) is a perennial medicinal herb with important immunostimulatory and anti-inflammatory properties, used especially for the alleviation of cold symptoms. The plant has also attracted scientists' attention to other aspects of its beneficial effects; for instance, anti-anxiety, antidepressant, cytotoxic, and antimutagenic effects induced by the plant have been revealed in various studies. The findings of the clinical trials are controversial in terms of side effects. While some studies revealed beneficial effects of the plant on patients and no severe adverse effects, others have reported serious side effects including abdominal pain, angioedema, dyspnea, nausea, pruritus, rash, erythema, and urticaria. Other biological activities of the plant, such as antioxidant, antibacterial, antiviral, and larvicidal activities, have been reported in previous experimental studies. Different classes of secondary metabolites of the plant, such as alkamides, caffeic acid derivatives, polysaccharides, and glycoproteins, are believed to be biologically and pharmacologically active. Concurrent determination and single analysis of cichoric acid and alkamides have been successfully developed, mainly using high-performance liquid chromatography (HPLC) coupled with different detectors, including UV spectrophotometric, coulometric electrochemical, and electrospray ionization mass spectrometric detectors. The partly contradictory results of these studies reveal that, in spite of the major experiments successfully accomplished using E. purpurea, many questions remain unanswered, and future investigations may aim for complete recognition of the plant's mechanism of action using new, complementary methods.

  11. Prediction Method Study of Policy Development Trends Based on the Content Analysis of Policy Texts

    Institute of Scientific and Technical Information of China (English)

    赵筱媛; 浦墨; 王娟娟; 詹淑琳


    Researchers have generally used either qualitative or quantitative methods to judge and predict the development trends and evolution of public policies, relying on qualitative methods more often than quantitative ones. Building on a survey of domestic and foreign research on predicting policy development, and on a summary of the features of the various methods, this article proposes a prediction method combining qualitative and quantitative analysis that integrates the ideas of content analysis and comparative analogy. The feasibility of the method for predicting policy trends is demonstrated through an empirical study.

  12. Analytical Method Development & Validation for Related Substances Method of Busulfan Injection by Ion Chromatography Method

    Directory of Open Access Journals (Sweden)

    Rewaria S


    Full Text Available A new, simple, accurate, precise and reproducible ion chromatography method has been developed for the estimation of methanesulfonic acid in Busulfan injectable dosage form. The developed method was also validated in complete compliance with current regulatory guidelines, using well-established analytical method validation techniques and tools covering parameters such as linearity, LOD and LOQ determination, accuracy, method precision, specificity, system suitability, robustness and ruggedness. With the current method the linearity obtained is close to 0.999, showing that the method is capable of giving a good detector response, and the calculated recoveries were within the range of 85% to 115% of the specification limits.

  13. Development and validation of a single RP-HPLC assay method for analysis of bulk raw material batches of four parabens that are widely used as preservatives in pharmaceutical and cosmetic products. (United States)

    Kumar, S; Mathkar, S; Romero, C; Rustum, A M


    A stability-indicating, robust, fast, and user-friendly reversed-phase high-performance liquid chromatographic (RP-HPLC) assay method has been developed and validated for the analysis of commercial raw material batches of methylparaben, ethylparaben, propylparaben, and butylparaben. These four parabens are widely used as preservatives in pharmaceutical and cosmetic products. An accurate assay value for each paraben in its commercial lots is critical for determining the correct weight of the paraben needed to obtain the target concentration in a specific lot of pharmaceutical or cosmetic product. Currently, no single HPLC assay method (validated per ICH requirements) is available in the literature that can be used to analyze the commercial lots of each of the four parabens. The analytical method reported herein analyzes all four parabens in less than 10 min and was successfully validated per ICH guidelines. Therefore, this method can be implemented in QC laboratories to analyze and assay the commercial bulk lots of the four parabens.

  14. Linear Algebraic Method for Non-Linear Map Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Yu,L.; Nash, B.


    We present a newly developed method to analyze some non-linear dynamics problems such as the Henon map using a matrix analysis method from linear algebra. Choosing the Henon map as an example, we analyze the spectral structure, the tune-amplitude dependence, the variation of tune and amplitude during the particle motion, etc., using the method of Jordan decomposition which is widely used in conventional linear algebra.
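
    The matrix-analysis idea in the abstract can be illustrated in a few lines. The sketch below is not the authors' Jordan-decomposition machinery: it simply locates a fixed point of the classic Henon map and reads the local linear dynamics off the eigenvalues of the Jacobian there; the parameter values are conventional illustrative choices.

```python
import numpy as np

A, B = 1.4, 0.3  # classic Henon parameters (illustrative choice)

def henon(x, y):
    """One iteration of the Henon map."""
    return 1.0 - A * x**2 + y, B * x

# Fixed point: solve x = 1 - A x^2 + B x for the positive root
x_star = (-(1.0 - B) + np.sqrt((1.0 - B)**2 + 4.0 * A)) / (2.0 * A)
y_star = B * x_star

# Jacobian of the map evaluated at the fixed point
J = np.array([[-2.0 * A * x_star, 1.0],
              [B,                 0.0]])

eigvals, eigvecs = np.linalg.eig(J)
print("fixed point:", (x_star, y_star))
print("eigenvalues:", eigvals)
```

    The eigenvalues play the role of the local "spectral structure": one multiplier larger than 1 in magnitude and one smaller marks the fixed point as a saddle, and their product equals the (constant) Jacobian determinant -B.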

  15. Application of computer intensive data analysis methods to the analysis of digital images and spatial data

    DEFF Research Database (Denmark)

    Windfeld, Kristian


    Computer-intensive methods for data analysis in a traditional setting have developed rapidly in the last decade. The application and adaptation of some of these methods to the analysis of multivariate digital images and spatial data are explored, evaluated and compared to well-established classical linear methods. Different strategies for selecting projections (linear combinations) of multivariate images are presented. An exploratory, iterative method for finding interesting projections, originating in data analysis, is compared to principal components. A method for introducing spatial context...

  16. An introduction to numerical methods and analysis

    CERN Document Server

    Epperson, J F


    Praise for the First Edition: ". . . outstandingly appealing with regard to its style, contents, considerations of requirements of practice, choice of examples, and exercises." (Zentralblatt Math) ". . . carefully structured with many detailed worked examples . . ." (The Mathematical Gazette) ". . . an up-to-date and user-friendly account . . ." (Mathematika) An Introduction to Numerical Methods and Analysis addresses the mathematics underlying approximation and scientific computing and successfully explains where approximation methods come from, why they sometimes work (or d

  17. Complexity of software trustworthiness and its dynamical statistical analysis methods

    Institute of Scientific and Technical Information of China (English)

    ZHENG ZhiMing; MA ShiLong; LI Wei; JIANG Xin; WEI Wei; MA LiLi; TANG ShaoTing


    Developing trusted software has become an important trend and a natural choice in the development of software technology and applications. At present, methods of measuring and assessing software trustworthiness cannot completely and effectively guarantee the safe and reliable operation of software systems. Based on the study of dynamical systems, this paper interprets the behavioral characteristics of software systems and the basic scientific problems of software trustworthiness complexity, analyzes the characteristics of that complexity, and proposes studying software trustworthiness measurement in terms of it. Using dynamical statistical analysis methods, the paper advances an invariant-measure based assessment of software trustworthiness by statistical indices, and thereby provides a dynamical criterion for the untrustworthiness of software systems. The feasibility of the proposed dynamical statistical analysis method in software trustworthiness measurement is demonstrated by an example, using numerical simulations and theoretical analysis.

  18. Methods for genetic linkage analysis using trisomies

    Energy Technology Data Exchange (ETDEWEB)

    Feingold, E. [Emory Univ. School of Public Health, Atlanta, GA (United States); Lamb, N.E.; Sherman, S.L. [Emory Univ., Atlanta, GA (United States)


    Certain genetic disorders are rare in the general population, but more common in individuals with specific trisomies. Examples of this include leukemia and duodenal atresia in trisomy 21. This paper presents a linkage analysis method for using trisomic individuals to map genes for such traits. It is based on a very general gene-specific dosage model that posits that the trait is caused by specific effects of different alleles at one or a few loci and that duplicate copies of "susceptibility" alleles inherited from the nondisjoining parent give increased likelihood of having the trait. Our mapping method is similar to identity-by-descent-based mapping methods using affected relative pairs and also to methods for mapping recessive traits using inbred individuals by looking for markers with greater than expected homozygosity by descent. In the trisomy case, one would take trisomic individuals and look for markers with greater than expected homozygosity in the chromosomes inherited from the nondisjoining parent. We present statistical methods for performing such a linkage analysis, including a test for linkage to a marker, a method for estimating the distance from the marker to the trait gene, a confidence interval for that distance, and methods for computing power and sample sizes. We also resolve some practical issues involved in implementing the methods, including how to use partially informative markers and how to test candidate genes. 20 refs., 5 figs., 1 tab.
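
    The core statistical idea, testing markers for greater-than-expected homozygosity in the chromosomes inherited from the nondisjoining parent, can be sketched as a one-sided binomial test. The counts and the baseline rate p0 below are hypothetical; the paper's actual methods handle partially informative markers, meiotic stage and distance estimation, which this sketch ignores.

```python
import math

def binom_sf(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p), computed exactly."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical data: trisomic, affected individuals scored for marker
# homozygosity in the chromosomes from the nondisjoining parent.
n_affected = 40       # number of affected trisomic individuals
n_homozygous = 28     # of them homozygous at the marker
p0 = 0.5              # assumed baseline homozygosity rate under no linkage

p_value = binom_sf(n_homozygous, n_affected, p0)
print(p_value)        # a small value suggests linkage to the marker
```

    Under no linkage the homozygosity count should scatter around n * p0; an excess that is unlikely under the binomial null is the signal for linkage.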

  19. [Development of trace metal ion analysis]. (United States)

    Kobayashi, J


    Analyses of trace levels of biologically essential or toxic ionic compounds found in the environment are very important. However, a lack of sensitivity and interference from coexisting components are often serious problems. To determine trace levels of metal ions without these problems, new preconcentration and analytical methods have been developed. First, three methods for the selective preconcentration of metal ions are described: 1) 3-Chloropyridazine-6-carbohydrazide was immobilized on glass bead supports for use as a column packing material. Multiple metal ions were concentrated on the column and eluted selectively with several buffers and hydrochloric acid. The eluate was analyzed off-line by flame atomic absorption spectrometry (AAS). This method was able to determine sub-ppb levels of copper and cadmium ions in environmental samples. 2) Salicylideneamino-2-thiophenol was immobilized on the supports. Aluminum ion was concentrated selectively on the column and eluted with nitric acid. The eluate was analyzed off-line by flameless AAS or on-line by flow injection analysis using pyrocatechol violet as a post-column colorimetric reagent. These methods were able to determine ppb-ppt levels of aluminum in environmental samples and were suitable for its state (speciation) analysis. 3) Bathocuproinesulfonic acid was immobilized on the supports. Copper ion was concentrated selectively on the column and eluted with nitric acid. The eluate was analyzed on-line by flow injection analysis using bathocuproinesulfonic acid. This method was able to determine sub-ppb levels of copper in environmental samples. In addition, to analyze trace metal ions and anions simultaneously, capillary electrophoresis was performed using ethylenediaminetetraacetic acid as an electrolyte component. Simultaneous determination of several ions in mineral waters was achieved with this system.

  20. Game data analysis tools and methods

    CERN Document Server

    Coupart, Thibault


    This book features an introduction to the basic theoretical tenets of data analysis from a game developer's point of view, as well as a practical guide to performing gameplay analysis on a real-world game. This book is ideal for video game developers who want to try and experiment with the game analytics approach for their own productions. It will provide a good overview of the themes you need to pay attention to, and will pave the way for success. Furthermore, the book also provides a wide range of concrete examples that will be useful for any game data analysts or scientists who want to impro

  1. Numerical analysis in electromagnetics the TLM method

    CERN Document Server

    Saguet, Pierre


    The aim of this book is to give a broad overview of the TLM (Transmission Line Matrix) method, which is one of the "time-domain numerical methods". These methods are reputed for their significant reliance on computer resources. However, they have the advantage of being highly general.The TLM method has acquired a reputation for being a powerful and effective tool by numerous teams and still benefits today from significant theoretical developments. In particular, in recent years, its ability to simulate various situations with excellent precision, including complex materials, has been

  2. Numerical methods in software and analysis

    CERN Document Server

    Rice, John R


    Numerical Methods, Software, and Analysis, Second Edition introduces science and engineering students to the methods, tools, and ideas of numerical computation. Introductory courses in numerical methods face a fundamental problem: there is too little time to learn too much. This text solves that problem by using high-quality mathematical software. In fact, the objective of the text is to present scientific problem solving using standard mathematical software. This book discusses numerous programs and software packages focusing on the IMSL library (including the PROTRAN system) and ACM Algorithm

  3. Single-cell analysis - Methods and protocols


    Carlo Alberto Redi


    This is certainly a timely volume in the Methods in Molecular Biology series: we have already entered the synthetic biology era, and thus we need to be aware of the new methodological advances able to fulfill the emerging needs of biologists, biotechnologists and nano-biotechnologists. Notably, among these, the possibility of performing single-cell analysis allows researchers to capture single-cell responses....

  4. Systems and methods for sample analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cooks, Robert Graham; Li, Guangtao; Li, Xin; Ouyang, Zheng


    The invention generally relates to systems and methods for sample analysis. In certain embodiments, the invention provides a system for analyzing a sample that includes a probe including a material connected to a high voltage source, a device for generating a heated gas, and a mass analyzer.

  5. Integrated numerical methods for hypersonic aircraft cooling systems analysis (United States)

    Petley, Dennis H.; Jones, Stuart C.; Dziedzic, William M.


    Numerical methods have been developed for the analysis of hypersonic aircraft cooling systems. A general purpose finite difference thermal analysis code is used to determine areas which must be cooled. Complex cooling networks of series and parallel flow can be analyzed using a finite difference computer program. Both internal fluid flow and heat transfer are analyzed, because increased heat flow causes a decrease in the flow of the coolant. The steady state solution is a successive point iterative method. The transient analysis uses implicit forward-backward differencing. Several examples of the use of the program in studies of hypersonic aircraft and rockets are provided.
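
    The finite-difference machinery described above can be illustrated on a toy problem. The sketch below is not the NASA code: it applies an implicit Crank-Nicolson scheme (a forward-backward average in time) to one-dimensional transient conduction in a rod with fixed end temperatures; the material properties and grid are illustrative choices.

```python
import numpy as np

# Minimal 1D transient conduction sketch: rod with fixed end temperatures,
# Crank-Nicolson (implicit forward-backward averaged) time stepping.
N = 21                      # grid points
L_rod = 1.0                 # rod length, m (illustrative)
alpha = 1e-4                # thermal diffusivity, m^2/s (illustrative)
dx = L_rod / (N - 1)
dt = 5.0                    # time step, s
r = alpha * dt / dx**2

T = np.zeros(N)
T[0], T[-1] = 500.0, 300.0  # hot end and cooled end, K

# Assemble (I - r/2 * D2) T_new = (I + r/2 * D2) T_old for interior nodes
D2 = np.zeros((N, N))
for i in range(1, N - 1):
    D2[i, i - 1], D2[i, i], D2[i, i + 1] = 1.0, -2.0, 1.0
Ident = np.eye(N)
A_mat = Ident - 0.5 * r * D2
B_mat = Ident + 0.5 * r * D2
for j in (0, N - 1):        # Dirichlet rows: keep boundary values fixed
    A_mat[j, :] = 0.0; A_mat[j, j] = 1.0
    B_mat[j, :] = 0.0; B_mat[j, j] = 1.0

for _ in range(2000):       # march the transient toward steady state
    T = np.linalg.solve(A_mat, B_mat @ T)

print(T[N // 2])            # midpoint temperature, K
```

    At steady state the 1D conduction profile is linear between the two end temperatures (about 400 K at the midpoint here), which gives a quick sanity check on the marching scheme.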

  6. A New Venture Analysis Method and Its Application

    Institute of Scientific and Technical Information of China (English)

    WANG Wen-jie(王文杰); Ronald K. Mitchell; TANG Bing-yong(汤兵勇)


    New venture analysis is the foundation of venture development. In this paper, 14 venture prototypes are proposed based on the attributes of ventures. Then, a new venture analysis method is discussed that matches the new venture with the corresponding prototype. Considering the fuzziness of human subjective grading, L-R fuzzy numbers are used to express the variables and the corresponding fuzzy algorithms are applied in the analysis. Finally, an application example indicates the effectiveness of the method.
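
    The role of L-R fuzzy numbers in subjective grading can be sketched with their simplest special case, triangular fuzzy numbers. The attribute scores and weights below are hypothetical, and this is only an illustration of fuzzy weighted aggregation followed by centroid defuzzification, not the authors' prototype-matching algorithm.

```python
# A triangular fuzzy number (l, m, r) is a common special case of an
# L-R fuzzy number: l and r bound the grade, m is its most likely value.
def add(a, b):
    """Sum of two triangular fuzzy numbers."""
    return tuple(x + y for x, y in zip(a, b))

def scale(a, k):
    """Triangular fuzzy number scaled by a nonnegative crisp weight."""
    return tuple(k * x for x in a)

def defuzzify(a):
    """Centroid of a triangular membership function."""
    l, m, r = a
    return (l + m + r) / 3.0

# Hypothetical fuzzy scores of a venture against one prototype's attributes
scores = [(0.5, 0.7, 0.9), (0.3, 0.5, 0.7), (0.6, 0.8, 1.0)]
weights = [0.5, 0.3, 0.2]   # crisp attribute weights (hypothetical)

total = (0.0, 0.0, 0.0)
for s, w in zip(scores, weights):
    total = add(total, scale(s, w))

print(defuzzify(total))     # crisp matching degree, used to rank prototypes
```

    Repeating the aggregation against each of the prototypes and ranking the defuzzified degrees is the kind of matching step the abstract describes.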

  7. A biosegmentation benchmark for evaluation of bioimage analysis methods

    Directory of Open Access Journals (Sweden)

    Kvilekval Kristian


    Full Text Available Abstract:
    Background: We present a biosegmentation benchmark that includes infrastructure, datasets with associated ground truth, and validation methods for biological image analysis. The primary motivation for creating this resource comes from the fact that it is very difficult, if not impossible, for an end-user to choose from the wide range of segmentation methods available in the literature for a particular bioimaging problem. No single algorithm is likely to be equally effective on a diverse set of images, and each method has its own strengths and limitations. We hope that our benchmark resource will be of considerable help both to bioimaging researchers looking for novel image processing methods and to image processing researchers exploring the application of their methods to biology.
    Results: Our benchmark consists of different classes of images and ground truth data, ranging in scale from subcellular and cellular to tissue level, each of which poses its own set of challenges to image analysis. The associated ground truth data can be used to evaluate the effectiveness of different methods, to improve methods and to compare results. Standard evaluation methods and some analysis tools are integrated into a database framework that is available online at
    Conclusion: This online benchmark will facilitate integration and comparison of image analysis methods for bioimages. While the primary focus is on biological images, we believe that the dataset and infrastructure will be of interest to researchers and developers working with biological image analysis, image segmentation and object tracking in general.


    Directory of Open Access Journals (Sweden)

    Fedorova Svetlana Yurievna


    Full Text Available The education and training of gifted children should today be considered an important strategic task of modern society. In this context, the purpose of the research is the development of motor giftedness, which is particularly relevant at the stage of pre-school education given the age characteristics of preschoolers. Preschoolers' motor giftedness is considered by the author as a developing integrated quality, including psychomotor skills, inclinations, and increased motivation for motor activity. The following methods were used: study and analysis of the scientific and methodological literature, questionnaires, interviews, physical fitness testing, and statistical data processing. The result of the research is a methodic for developing motor giftedness through physical education at preschool. The author's methodic consists of four steps: diagnostic, prognostic, practice-and-activity, and social-and-pedagogical. Each step provides for the inclusion of preschool children in a sports and developmental environment that meets their abilities and needs through the creation of certain social and educational conditions. The results can be applied in preschools and in the system of professional development of teachers.



  10. Further development of microparticle image velocimetry analysis for characterisation of gas streams as a novel method of fuel cell development. Final report; Weiterentwicklung des Mikro-Particle Image Velocimetry Analyseverfahrens zur Charakterisierung von Gasstroemungen als neuartige Entwicklungsmethodik fuer Brennstoffzellen. Schlussbericht

    Energy Technology Data Exchange (ETDEWEB)



    The project aimed at a better understanding of the complex fluid-mechanical processes in the small ducts of bipolar plates. So far, no measurement technology was available that allows the fluid-mechanical processes in the microducts to be measured in situ under real operating conditions and correlated with the instantaneous cell power. The project therefore focused on the further development of microparticle image velocimetry in order to enable analysis of the local velocity distribution of a gas stream in a microduct. A second goal of the project was to carry out such measurements in the microducts of a fuel cell under the more difficult conditions of actual operation.

  11. Spectroscopic Chemical Analysis Methods and Apparatus (United States)

    Hug, William F. (Inventor); Reid, Ray D. (Inventor); Bhartia, Rohit (Inventor); Lane, Arthur L. (Inventor)


    Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted along with photoluminescence spectroscopy (i.e. fluorescence and/or phosphorescence spectroscopy) to provide high levels of sensitivity and specificity in the same instrument.

  12. Multiple predictor smoothing methods for sensitivity analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Helton, Jon Craig; Storlie, Curtis B.


    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (1) locally weighted regression (LOESS), (2) additive models, (3) projection pursuit regression, and (4) recursive partitioning regression. The indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present.
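
    The idea of scoring model inputs by how much output variance a nonparametric smooth explains can be sketched compactly. The example below is not LOESS, additive models, projection pursuit, or recursive partitioning from the abstract; it uses a crude Gaussian-kernel smoother as a stand-in, applied to a made-up two-input model in which only the first input matters and does so nonlinearly.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: y depends nonlinearly on x1 and not at all on x2
n = 2000
X = rng.uniform(-1.0, 1.0, size=(n, 2))
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.normal(size=n)

def smooth_r2(x, y, bandwidth=0.1):
    """Fraction of the variance of y explained by a Gaussian-kernel
    smooth on x (a crude stand-in for LOESS)."""
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    fitted = np.empty_like(ys)
    for i, xi in enumerate(xs):
        w = np.exp(-0.5 * ((xs - xi) / bandwidth) ** 2)
        fitted[i] = np.sum(w * ys) / np.sum(w)
    return 1.0 - np.var(ys - fitted) / np.var(ys)

scores = [smooth_r2(X[:, j], y) for j in range(X.shape[1])]
print(scores)  # x1 explains most of the variance, x2 next to none
```

    A linear-regression R-squared would understate the influence of x1 here, which is the kind of situation where the smoothing-based procedures described in the abstract outperform traditional linear sensitivity measures.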

  13. Digital Forensics Analysis of Spectral Estimation Methods

    CERN Document Server

    Mataracioglu, Tolga


    Steganography is the art and science of writing hidden messages in such a way that no one apart from the intended recipient knows of the existence of the message. Today it is widely used to secure information. In this paper, the traditional spectral estimation methods are introduced, and the performance of each method is examined through comparison of all the spectral estimation methods. Finally, drawing on those performance analyses, brief pros and cons of the spectral estimation methods are given. We also present a steganography demo: information is hidden in a sound signal and then recovered (i.e., the true frequency of the information signal is extracted from the sound) by means of the spectral estimation methods.
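
    The demo described at the end of the abstract can be reproduced in miniature with the simplest classical estimator, the FFT periodogram. The sample rate, tone amplitude and hidden frequency below are illustrative choices.

```python
import numpy as np

fs = 8000.0                  # sample rate, Hz (illustrative)
t = np.arange(0, 1.0, 1.0 / fs)
hidden_freq = 1234.0         # the "message": a weak tone at this frequency

rng = np.random.default_rng(1)
cover = rng.normal(scale=1.0, size=t.size)               # noise-like cover
stego = cover + 0.5 * np.sin(2 * np.pi * hidden_freq * t)

# Classical (periodogram) spectral estimate via the FFT
spectrum = np.abs(np.fft.rfft(stego)) ** 2
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
recovered = freqs[np.argmax(spectrum)]
print(recovered)             # the periodogram peaks at the hidden tone
```

    With a one-second window the frequency resolution is 1 Hz, so the peak bin lands exactly on the hidden tone; higher-resolution parametric estimators trade this simplicity for better performance on short records.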

  14. Development of a simultaneous high resolution typing method for three SLA class II genes, SLA-DQA, SLA-DQB1, and SLA-DRB1 and the analysis of SLA class II haplotypes. (United States)

    Le, MinhThong; Choi, Hojun; Choi, Min-Kyeung; Cho, Hyesun; Kim, Jin-Hoi; Seo, Han Geuk; Cha, Se-Yeon; Seo, Kunho; Dadi, Hailu; Park, Chankyu


    The characterization of the genetic variations of major histocompatibility complex (MHC) is essential to understand the relationship between the genetic diversity of MHC molecules and disease resistance and susceptibility in adaptive immunity. We previously reported the development of high-resolution individual locus typing methods for three of the most polymorphic swine leukocyte antigens (SLA) class II loci, namely, SLA-DQA, SLA-DQB1, and SLA-DRB1. In this study, we extensively modified our previous protocols and developed a method for the simultaneous amplification of the three SLA class II genes and subsequent analysis of individual loci using direct sequencing. The unbiased and simultaneous amplification of alleles from the all three hyper-polymorphic and pseudogene containing genes such as MHC genes is extremely challenging. However, using this method, we demonstrated the successful typing of SLA-DQA, SLA-DQB1, and SLA-DRB1 for 31 selected individuals comprising 26 different SLA class II haplotypes which were identified from 700 animals using the single locus typing methods. The results were identical to the known genotypes from the individual locus typing. The new method has significant benefits over the individual locus typing, including lower typing cost, use of less biomaterial, less effort and fewer errors in handling large samples for multiple loci. We also extensively characterized the haplotypes of SLA class II genes and reported three new haplotypes. Our results should serve as a basis to investigate the possible association between polymorphisms of MHC class II and differences in immune responses to exogenous antigens.

  15. Heteroscedastic regression analysis method for mixed data

    Institute of Scientific and Technical Information of China (English)

    FU Hui-min; YUE Xiao-rui


    A heteroscedastic regression model is established and a heteroscedastic regression analysis method presented for mixed data composed of complete data, type-I censored data and type-II censored data from the location-scale distribution. The best unbiased estimates of the regression coefficients, as well as confidence limits for the location and scale parameters, are given. Furthermore, point estimates and confidence limits of percentiles are obtained. Thus the traditional multiple regression analysis method, which is suitable only for complete data from the normal distribution, is extended to heteroscedastic mixed data and the location-scale distribution, giving the presented method a broad range of promising applications.

  16. Analysis of spectral methods for the homogeneous Boltzmann equation

    KAUST Repository

    Filbet, Francis


    The development of accurate and fast algorithms for the Boltzmann collision integral and their analysis represent a challenging problem in scientific computing and numerical analysis. Recently, several works were devoted to the derivation of spectrally accurate schemes for the Boltzmann equation, but very few of them were concerned with the stability analysis of the method. In particular there was no result of stability except when the method was modified in order to enforce the positivity preservation, which destroys the spectral accuracy. In this paper we propose a new method to study the stability of homogeneous Boltzmann equations perturbed by smoothed balanced operators which do not preserve positivity of the distribution. This method takes advantage of the "spreading" property of the collision, together with estimates on regularity and entropy production. As an application we prove stability and convergence of spectral methods for the Boltzmann equation, when the discretization parameter is large enough (with explicit bound). © 2010 American Mathematical Society.

  17. Application of the Systematic Method to the Analysis of Scientific Development Problems

    Institute of Scientific and Technical Information of China (English)



    The systematic method is introduced into the analysis of problems of national economic and social development in order to reveal the nature of the scientific development problem. Through analysis of the objectification of the subjective and the subjectification of the objective, together with combined qualitative and quantitative analysis of the system's elements and their relations, it is concluded from historical logic that our national economic and social development needs to comprehensively coordinate various structural relationships between the economic structure and social development, such as those between economic growth and natural resources, between humans and the natural environment, between individuals, and between the present and the future, as well as the relationship between national economic and social development and enterprise development. Extending this to the practice of enterprise management, it is pointed out that the relationship between government and market must be rationalized and the economic structure optimized and adjusted, in particular by resolving excess production capacity. The ultimate purpose of the study is to realize the all-round development of our national economy and society.

  18. Statistical methods of SNP data analysis with applications

    CERN Document Server

    Bulinski, Alexander; Shashkin, Alexey; Yaskov, Pavel


    Various statistical methods important for genetic analysis are considered and developed. Namely, we concentrate on the multifactor dimensionality reduction, logic regression, random forests and stochastic gradient boosting. These methods and their new modifications, e.g., the MDR method with "independent rule", are used to study the risk of complex diseases such as cardiovascular ones. The roles of certain combinations of single nucleotide polymorphisms and external risk factors are examined. To perform the data analysis concerning the ischemic heart disease and myocardial infarction the supercomputer SKIF "Chebyshev" of the Lomonosov Moscow State University was employed.

  19. Unascertained Factor Method of Dynamic Characteristic Analysis for Antenna Structures

    Institute of Scientific and Technical Information of China (English)

    ZHU Zeng-qing; LIANG Zhen-tao; CHEN Jian-jun


    The dynamic characteristic analysis model of antenna structures is built, in which the structural physical parameters and geometrical dimensions are all treated as unascertained variables, and a structural dynamic characteristic analysis method based on the unascertained factor method is given. The computational expression of the structural characteristic is developed from the mathematical expression of the unascertained factor and the principles of unascertained rational number arithmetic. An example is given, in which the possible values and confidence degrees of the unascertained structural characteristics are obtained. The calculated results show that the method is feasible and effective.

  20. Statistical models and methods for reliability and survival analysis

    CERN Document Server

    Couallier, Vincent; Huber-Carol, Catherine; Mesbah, Mounir; Limnios, Nikolaos; Gerville-Reache, Leo


    Statistical Models and Methods for Reliability and Survival Analysis brings together contributions by specialists in statistical theory as they discuss their applications providing up-to-date developments in methods used in survival analysis, statistical goodness of fit, stochastic processes for system reliability, amongst others. Many of these are related to the work of Professor M. Nikulin in statistics over the past 30 years. The authors gather together various contributions with a broad array of techniques and results, divided into three parts - Statistical Models and Methods, Statistical

  1. Global/local methods research using a common structural analysis framework (United States)

    Knight, Norman F., Jr.; Ransom, Jonathan B.; Griffin, O. H., Jr.; Thompson, Danniella M.


    Methodologies for global/local stress analysis are described including both two- and three-dimensional analysis methods. These methods are being developed within a common structural analysis framework. Representative structural analysis problems are presented to demonstrate the global/local methodologies being developed.

  2. Developments in mycotoxin analysis: an update for 2010-2011

    NARCIS (Netherlands)

    Shephard, G.S.; Berthiller, F.; Burdaspal, P.; Crews, C.; Jonker, M.A.; Krska, R.; MacDonald, S.; Malone, R.J.; Maragos, C.; Solfrizzo, M.; Egmond, van H.P.; Whitaker, T.B.


    This review highlights developments in mycotoxin analysis and sampling over a period between mid-2010 and mid-2011. It covers the major mycotoxins: aflatoxins, Alternaria toxins, ergot alkaloids, fumonisins, ochratoxin, patulin, trichothecenes, and zearalenone. Analytical methods for mycotoxins cont

  3. Developments in mycotoxin analysis: an update for 2009-2010

    NARCIS (Netherlands)

    Shephard, G.S.; Berthiller, F.; Burdaspal, P.; Crews, C.; Jonker, M.A.; Krska, R.; MacDonald, S.; Malone, B.; Maragos, C.; Sabino, M.; Solfrizzo, M.; Egmond, van H.P.; Whitaker, T.B.


    This review highlights developments in mycotoxin analysis and sampling over a period between mid-2009 and mid-2010. It covers the major mycotoxins aflatoxins, Alternaria toxins, ergot alkaloids, fumonisins, ochratoxin, patulin, trichothecenes, and zearalenone. New and improved methods for mycotoxins

  4. Multiscale Methods for Nuclear Reactor Analysis (United States)

    Collins, Benjamin S.

    The ability to accurately predict local pin powers in nuclear reactors is necessary to understand the mechanisms that cause fuel pin failure during steady state and transient operation. In the research presented here, methods are developed to improve the local solution using high order methods with boundary conditions from a low order global solution. Several different core configurations were tested to determine the improvement in the local pin powers compared to the standard techniques that use diffusion theory and pin power reconstruction (PPR). Two different multiscale methods were developed and analyzed: the post-refinement multiscale method and the embedded multiscale method. The post-refinement multiscale methods use the global solution to determine boundary conditions for the local solution. The local solution is solved using either a fixed boundary source or an albedo boundary condition; this solution is "post-refinement" and thus has no impact on the global solution. The embedded multiscale method allows the local solver to change the global solution to provide an improved global and local solution. The post-refinement multiscale method is assessed using three core designs. When the local solution has more energy groups, the fixed source method has some difficulties near the interface; however, the albedo method works well for all cases. In order to remedy the issue with boundary condition errors for the fixed source method, a buffer region is used to act as a filter, which decreases the sensitivity of the solution to the boundary condition. Both the albedo and fixed source methods benefit from the use of a buffer region. Unlike the post-refinement method, the embedded multiscale method alters the global solution. The ability to change the global solution allows for refinement in areas where the errors in the few-group nodal diffusion are typically large. The embedded method is shown to improve the global solution when it is applied to a MOX/LEU assembly.

  5. Development of quality-by-design analytical methods. (United States)

    Vogt, Frederick G; Kord, Alireza S


    Quality-by-design (QbD) is a systematic approach to drug development, which begins with predefined objectives, and uses science and risk management approaches to gain product and process understanding and ultimately process control. The concept of QbD can be extended to analytical methods. QbD mandates the definition of a goal for the method, and emphasizes thorough evaluation and scouting of alternative methods in a systematic way to obtain optimal method performance. Candidate methods are then carefully assessed in a structured manner for risks, and are challenged to determine if robustness and ruggedness criteria are satisfied. As a result of these studies, the method performance can be understood and improved if necessary, and a control strategy can be defined to manage risk and ensure the method performs as desired when validated and deployed. In this review, the current state of analytical QbD in the industry is detailed with examples of the application of analytical QbD principles to a range of analytical methods, including high-performance liquid chromatography, Karl Fischer titration for moisture content, vibrational spectroscopy for chemical identification, quantitative color measurement, and trace analysis for genotoxic impurities.

  6. Development on Vulnerability Assessment Methods of PPS

    Institute of Scientific and Technical Information of China (English)

    MIAO; Qiang; ZHANG; Wen-liang; BU; Li-xin; YIN; Hong-he; LI; Xin-jun; FANG; Xin


    Drawing on information gathered at home and abroad, combined with domestic assessment experience, we present a set of physical protection system (PPS) vulnerability assessment methods for operating nuclear power plants and for nuclear facilities under design. The methods will help to strengthen and upgrade the security measures of the nuclear facilities, improve the effectiveness and

  7. Development and application of a method for analysis of lufenuron in wheat flour by gas chromatography-mass spectrometry and confirmation of bio-efficacy against Tribolium castaneum (Herbst) (Coleoptera: Tenebrionidae). (United States)

    Ahire, Kedar C; Arora, Manjit S; Mukherjee, Samindra N


    A new analytical method using gas chromatography with mass spectrometry (GC-MS) for the quantitative determination of lufenuron, an insecticide of the benzoylphenylurea (BPU) class, in wheat flour has been developed and applied for time-dependent residue monitoring in treated wheat flour. The analyte was extracted from wheat flour by a single-step solid-liquid extraction using ethyl acetate and subsequently cleaned up using primary secondary amine as a sorbent prior to GC-MS analysis. The present method provides sufficient sensitivity as reflected by the values of the limit of detection (LOD) and limit of quantification (LOQ), 5 ng/mL (S/N approximately 3) and 50 ng/mL (the lowest validation point on the calibration curve), respectively. The calibration curve showed excellent linearity in the concentration range of 50-1000 ng/mL (r2=0.998). The average recovery for spiked samples at three concentrations (150, 300, and 450 ng/g) was 98.23+/-2.52% R.S.D. The method was applied to the determination of lufenuron residues in treated wheat flour samples. Simultaneous determination of the bio-efficacy of lufenuron residues was also carried out against the red flour beetle, Tribolium castaneum, to correlate the actual residual effect of lufenuron, as detected by the analytical method, over a period of 3 months. The findings revealed that the residual concentrations of lufenuron were neither uniform nor in descending order over the 3-month period in wheat flour, possibly because of uneven dispersal in the treated wheat that was subsequently milled into flour, as confirmed by GC-MS analysis. However, the residues of lufenuron were sufficient to produce 100% mortality of T. castaneum larvae up to 3 months. The results are discussed in view of the potential of lufenuron as a candidate molecule for the control of stored product pests.
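The calibration and recovery arithmetic summarized in this record can be sketched in a few lines. The standard concentrations match the stated 50-1000 ng/mL range, but the peak areas, the sample response, and the spike level below are invented for illustration.

```python
# A sketch of ordinary least-squares calibration followed by percent-recovery
# calculation; instrument responses here are hypothetical.
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return b, my - b * mx  # slope, intercept

conc = [50, 100, 250, 500, 750, 1000]          # ng/mL standards
area = [510, 1020, 2540, 5080, 7600, 10150]    # hypothetical peak areas

slope, intercept = fit_line(conc, area)
measured = (3060 - intercept) / slope          # back-calculate a sample response
recovery = 100 * measured / 300                # hypothetical spike at 300 ng/g
print(f"measured {measured:.1f} ng/mL, recovery {recovery:.1f}%")
```

In a validated method the slope and intercept would come from replicate standards and the recovery would be averaged over the three spike levels the record reports.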

  8. Methods for genetic linkage analysis using trisomies

    Energy Technology Data Exchange (ETDEWEB)

    Feingold, E.; Lamb, N.E.; Sherman, S.L. [Emory Univ., Atlanta, GA (United States)


    Certain genetic disorders (e.g. congenital cataracts, duodenal atresia) are rare in the general population, but more common in people with Down's syndrome. We present a method for using individuals with trisomy 21 to map genes for such traits. Our methods are analogous to methods for mapping autosomal dominant traits using affected relative pairs by looking for markers with greater than expected identity-by-descent. In the trisomy case, one would take trisomic individuals and look for markers with greater than expected reduction to homozygosity in the chromosomes inherited from the non-disjoining parent. We present statistical methods for performing such a linkage analysis, including a test for linkage to a marker, a method for estimating the distance from the marker to the gene, a confidence interval for that distance, and methods for computing power and sample sizes. The methods are described in the context of a gene-dosage model for the etiology of the disorder, but can be extended to other models. We also resolve some practical issues involved in implementing the methods, including how to use partially informative markers, how to test candidate genes, and how to handle the effect of reduced recombination associated with maternal meiosis I non-disjunction.
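The core statistical idea here, testing for a greater than expected reduction to homozygosity at a marker, can be sketched as an exact binomial test; the counts and the null proportion below are hypothetical, not taken from the abstract.

```python
# Hypothetical sketch: count trisomic individuals "reduced" (homozygous) at a
# marker in the chromosomes from the non-disjoining parent, and compare the
# observed proportion with a null expectation p0 via an exact binomial test.
from math import comb

def binomial_sf(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Illustrative numbers (not from the paper): 40 of 100 affected trisomic
# individuals reduced at the marker, versus a null expectation of 25%.
n, k, p0 = 100, 40, 0.25
p_value = binomial_sf(k, n, p0)
print(f"observed {k}/{n} reduced, null p0={p0}, one-sided p = {p_value:.4g}")
```

A small one-sided p-value suggests linkage between the marker and the trait locus under this simplified model.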

  9. A Case Study of a Mixed Methods Study Engaged in Integrated Data Analysis (United States)

    Schiazza, Daniela Marie


    The nascent field of mixed methods research has yet to develop a cohesive framework of guidelines and procedures for mixed methods data analysis (Greene, 2008). To support the field's development of analytical frameworks, this case study reflects on the development and implementation of a mixed methods study engaged in integrated data analysis.…

  10. Method of thermal derivative gradient analysis (TDGA)

    Directory of Open Access Journals (Sweden)

    M. Cholewa


    Full Text Available In this work a concept of thermal analysis is presented that uses the derivatives of temperature with respect to time and direction to describe crystallization kinetics. The method of thermal derivative gradient analysis (TDGA) is intended for the investigation of alloys and metals as well as cast composites in the solidification range. The construction and operating characteristics of the test stand are presented, including the processing modules and probes together with the thermocouple locations. The authors present examples of result interpretation for AlSi11 alloy castings with diversified wall thickness and at different pouring temperatures.

  11. Development of New Bituminous Pavement Design Method

    DEFF Research Database (Denmark)

    Ullidtz, Per

    The report and work of COST Action 333 sets in place the foundation for a coherent, cost-effective and harmonised European pavement design method. In order to do this, the work programme focused on information gathering, identification of requirements and the selection of the necessary design elements for such a European design method. During the course of the Action, 20 countries signed the Memorandum of Understanding and participated in the work, thus showing a very high level of interest in the work programme and results. The document contains a very thorough review of the position in Europe at the present time and identifies the basic framework for a design method. It then clearly shows the necessary steps to be taken in the future in order to arrive at a method based on current best practice in the shorter term, and then to move to improved versions.

  12. Development of gait segmentation methods for wearable foot pressure sensors. (United States)

    Crea, S; De Rossi, S M M; Donati, M; Reberšek, P; Novak, D; Vitiello, N; Lenzi, T; Podobnik, J; Munih, M; Carrozza, M C


    We present an automated segmentation method based on the analysis of plantar pressure signals recorded from two synchronized wireless foot insoles. Given the strict limits on computational power and power consumption typical of wearable electronic components, our aim is to investigate the capability of a Hidden Markov Model machine-learning method to detect gait phases from wearable pressure sensor signals processed at different levels of complexity. Therefore three different datasets are developed: raw voltage values, calibrated sensor signals, and a calibrated estimation of the total ground reaction force and the position of the plantar center of pressure. The method is tested on a pool of 5 healthy subjects through a leave-one-out cross validation. The results show high classification performance achieved using estimated biomechanical variables, averaging 96%. Calibrated signals and raw voltage values show higher delays and dispersions in phase transition detection, suggesting a lower reliability for online applications.
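A minimal sketch of the Hidden Markov Model machinery behind such gait-phase detection, here reduced to two states and discretized pressure observations decoded with the Viterbi algorithm; the states and all probabilities are illustrative, not the paper's trained model.

```python
# Toy two-state HMM (stance/swing) over discretized plantar-pressure
# observations, decoded with the Viterbi algorithm in log-space.
import math

states = ["stance", "swing"]
trans = {"stance": {"stance": 0.8, "swing": 0.2},
         "swing":  {"stance": 0.3, "swing": 0.7}}
emit = {"stance": {"high": 0.9, "low": 0.1},   # high pressure in stance
        "swing":  {"high": 0.2, "low": 0.8}}   # low pressure in swing
start = {"stance": 0.5, "swing": 0.5}

def viterbi(obs):
    """Most likely state sequence for an observation sequence."""
    V = [{s: math.log(start[s]) + math.log(emit[s][obs[0]]) for s in states}]
    back = []
    for o in obs[1:]:
        row, ptr = {}, {}
        for s in states:
            best = max(states, key=lambda p: V[-1][p] + math.log(trans[p][s]))
            row[s] = V[-1][best] + math.log(trans[best][s]) + math.log(emit[s][o])
            ptr[s] = best
        V.append(row)
        back.append(ptr)
    last = max(states, key=lambda s: V[-1][s])
    path = [last]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return path[::-1]

print(viterbi(["high", "high", "low", "low", "high"]))
```

The real system works with multi-phase gait models and continuous sensor features, but the decoding step has this same shape.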

  13. Current trends of the development of chemical analysis

    Directory of Open Access Journals (Sweden)

    Rema Matakova


    Full Text Available This paper presents the dynamics of the development of all stages of chemical analysis during the last 15 years. The ways of improving the quality of chemical analysis and of its considerable advancement into the field of trace concentrations of substances are shown. Features of the development of analytical methods, modern techniques for the concentration and separation of substances, as well as chemometric processing of results are analyzed. The great importance of computerization and automation of the analysis is shown.

  14. Development of a measuring and evaluation method for X-ray analysis of residual stresses in the surface region of polycrystalline materials; Entwicklung eines Mess- und Auswerteverfahrens zur roentgenographischen Analyse des Eigenspannungszustandes im Oberflaechenbereich vielkristalliner Werkstoffe

    Energy Technology Data Exchange (ETDEWEB)

    Genzel, C.


    The topic of the habilitation thesis is the development of an X-ray diffraction method for measurement and depth-resolved analysis of internal stresses in the surface region of polycrystalline materials. The method relies on the basic approach of varying the penetration depth {tau} of the X-rays in the material by stepwise rotation of the specimen about the scattering vector g{sub {theta}{psi}}. Thus, depth profiles of the interlattice plane distances d(hkl) in the specimen system can be derived for given azimuth and inclination angles {theta} and {psi} of the scattering vector. This offers the possibility to identify individual components of the stress tensor in the basic equation of X-ray diffraction analysis, and to perform separate analyses of those components. For calculation of the relevant internal stress distributions {sigma}{sub ij}({tau}) from the interlattice plane distance profiles, a self-consistent method is established which takes into account the high sensitivity of the derived internal stresses to the interlattice plane distance d{sub 0}(hkl) of the stress-free crystal lattice. The evaluation yields the depth profiles as well as the strain-free interlattice plane distance d{sub 0}(hkl), so that a quantitative analysis of tri-axial internal stress states in the surface region of the material is possible. (orig./CB)

  15. Developing New Testing Methods for Nanosatellites Project (United States)

    National Aeronautics and Space Administration — Thermal modeling and Test plan to be carried out and developed by Goddard Space Flight Center. This project will be done in collaboration with partners at MIT and...

  16. Model-based methods for linkage analysis. (United States)

    Rice, John P; Saccone, Nancy L; Corbett, Jonathan


    The logarithm of an odds ratio (LOD) score method originated in a seminal article by Newton Morton in 1955. The method is broadly concerned with issues of power and the posterior probability of linkage, ensuring that a reported linkage has a high probability of being a true linkage. In addition, the method is sequential so that pedigrees or LOD curves may be combined from published reports to pool data for analysis. This approach has been remarkably successful for 50 years in identifying disease genes for Mendelian disorders. After discussing these issues, we consider the situation for complex disorders where the maximum LOD score statistic shares some of the advantages of the traditional LOD score approach, but is limited by unknown power and the lack of sharing of the primary data needed to optimally combine analytic results. We may still learn from the LOD score method as we explore new methods in molecular biology and genetic analysis to utilize the complete human DNA sequence and the cataloging of all human genes.
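For phase-known meioses the LOD score described above has a closed form: the log10 likelihood ratio of a recombination fraction theta against free recombination (theta = 0.5). A minimal sketch with invented counts:

```python
# LOD score for r recombinants observed in n informative, phase-known meioses.
from math import log10

def lod(r, n, theta):
    """log10 likelihood ratio of recombination fraction theta vs. 0.5."""
    return (r * log10(theta) + (n - r) * log10(1 - theta)) - n * log10(0.5)

# 2 recombinants in 20 meioses: evidence for linkage at theta = 0.1.
print(f"LOD(theta=0.1) = {lod(2, 20, 0.1):.3f}")  # about 3.2, above the
                                                  # classical threshold of 3
```

In practice the LOD curve is maximized over theta, and, as the record notes, curves from independent pedigrees can simply be added.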

  17. Wavelet methods in mathematical analysis and engineering

    CERN Document Server

    Damlamian, Alain


    This book gives a comprehensive overview of both the fundamentals of wavelet analysis and related tools, and of the most active recent developments towards applications. It offers a state-of-the-art account of several active areas of research where wavelet ideas, or more generally multiresolution ideas, have proved particularly effective. The main applications covered are in the numerical analysis of PDEs, and signal and image processing. Recently introduced techniques such as Empirical Mode Decomposition (EMD) and new trends in the recovery of missing data, such as compressed sensing, are also presented.

  18. Development Activities Locator and Assessment Method (DALAM) (United States)


    Consortium (2011). White Paper: Being Smart About Development in Afghanistan. - Oxfam America (2008). Smart Development: Why U.S. Foreign Aid Demands Major... - Haims et al. ... - Oxfam (1995). The ...

  19. Spatial Analysis Methods of Road Traffic Collisions

    DEFF Research Database (Denmark)

    Loo, Becky P. Y.; Anderson, Tessa Kate

    Spatial Analysis Methods of Road Traffic Collisions centers on the geographical nature of road crashes, and uses spatial methods to provide a greater understanding of the patterns and processes that cause them. Written by internationally known experts in the field of transport geography, the book outlines the key issues in identifying hazardous road locations (HRLs), considers current approaches used for reducing and preventing road traffic collisions, and outlines a strategy for improved road safety. The book covers spatial accuracy, validation, and other statistical issues, as well as link-attribute and event-based approaches, cluster identification, and risk exposure.


    Directory of Open Access Journals (Sweden)

    Józef DREWNIAK


    Full Text Available In the present paper, a planetary automatic transmission is modeled by means of contour graphs. The goals of the modeling are versatile: ratio calculation via algorithmic equation generation, and analysis of velocities and accelerations. The running of exemplary gears is analyzed; several drives/gears are considered consecutively, discussing their functional schemes, the assigned contour graphs, and the generated systems of equations and their solutions. The advantages of the method are its algorithmic approach and its generality, with particular drives being cases of the generally created model. Moreover, the method allows for further analysis and synthesis tasks, e.g. checking the isomorphism of design solutions.


    Directory of Open Access Journals (Sweden)

    L. Avtonomova


    Full Text Available A complex of theoretical, computational and applied questions concerning elements of transport machines is studied. Coupled-field analyses are useful for solving problems where the coupled interaction of phenomena from various disciplines of physical science is significant. There are basically three methods of coupling, distinguished by the finite element formulation techniques used to develop the matrix equations.

  2. An Autoregressive Method for Simulation Output Analysis. (United States)


    ...the precision of point estimates can be approximated arbitrarily closely by the spectral density function at zero of a finite-order autoregressive process... Approximation theorems for continuous spectral density functions are also developed. It is then demonstrated that a continuous spectral density function
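The quantity at the heart of this record, the spectral density at zero of an autoregressive process, governs the precision of a simulation's sample mean. A small AR(1) sketch (the model and parameters are illustrative, not from the report):

```python
# For a stationary AR(1) output series X_t = phi*X_{t-1} + e_t, the long-run
# variance of the sample mean is driven by the spectral density at zero, so
# Var(mean) ~ sigma^2 / ((1 - phi)^2 * n).
import random

random.seed(1)
phi, sigma, n = 0.6, 1.0, 200_000
x, series = 0.0, []
for _ in range(n):
    x = phi * x + random.gauss(0.0, sigma)
    series.append(x)

mean = sum(series) / n
# Theoretical standard error of the mean from the AR(1) spectral density at 0:
se = (sigma / (1 - phi)) / n ** 0.5
print(f"sample mean {mean:.4f}, AR(1) standard error {se:.4f}")
```

Treating the autocorrelated output as i.i.d. would understate this standard error by a factor of 1/(1 - phi), which is exactly the pitfall the autoregressive method addresses.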

  3. Child Development in Developing Countries: Introduction and Methods (United States)

    Bornstein, Marc H.; Britto, Pia Rebello; Nonoyama-Tarumi, Yuko; Ota, Yumiko; Petrovic, Oliver; Putnick, Diane L.


    The Multiple Indicator Cluster Survey (MICS) is a nationally representative, internationally comparable household survey implemented to examine protective and risk factors of child development in developing countries around the world. This introduction describes the conceptual framework, nature of the MICS3, and general analytic plan of articles…

  4. Text analysis devices, articles of manufacture, and text analysis methods (United States)

    Turner, Alan E; Hetzler, Elizabeth G; Nakamura, Grant C


    Text analysis devices, articles of manufacture, and text analysis methods are described according to some aspects. In one aspect, a text analysis device includes processing circuitry configured to analyze initial text to generate a measurement basis usable in analysis of subsequent text, wherein the measurement basis comprises a plurality of measurement features from the initial text, a plurality of dimension anchors from the initial text and a plurality of associations of the measurement features with the dimension anchors, and wherein the processing circuitry is configured to access a viewpoint indicative of a perspective of interest of a user with respect to the analysis of the subsequent text, and wherein the processing circuitry is configured to use the viewpoint to generate the measurement basis.

  5. Development of safety analysis technology for LMR

    Energy Technology Data Exchange (ETDEWEB)

    Hahn, Do Hee; Kwon, Y. M.; Kim, K. D. [and others


    The analysis methodologies as well as the analysis computer code system for the transient, HCDA, and containment performance analyses, which are required for KALIMER safety analyses, have been developed. The SSC-K code has been developed from SSC-L, an analysis code for loop-type LMRs, by improving models necessary for the KALIMER system analysis, and additional models have been added to the code. In addition, an HCDA analysis model has been developed and the containment performance analysis code has also been improved. The preliminary basis for the safety analysis has been established, and preliminary safety analyses for the key design features have been performed. In addition, a state-of-the-art review of LMR PSA and of overseas safety and licensing requirements has been carried out. A design database for the systematic management of design documents as well as design processes has been established as well.

  6. Single-cell analysis - Methods and protocols

    Directory of Open Access Journals (Sweden)

    Carlo Alberto Redi


    Full Text Available This is certainly a timely volume in the Methods in Molecular Biology series: we have already entered the synthetic biology era, and thus we need to be aware of the new methodological advances able to fulfill the needs of biologists, biotechnologists and nano-biotechnologists. Notably, among these, the possibility to perform single-cell analysis allows researchers to capture single-cell responses....

  7. Homotopy analysis method for solving KdV equations

    Directory of Open Access Journals (Sweden)

    Hossein Jafari


    Full Text Available A scheme is developed for the numerical study of the Korteweg-de Vries (KdV) and the Korteweg-de Vries-Burgers (KdVB) equations with initial conditions by a homotopy approach. Numerical solutions obtained by the homotopy analysis method are compared with the exact solution. The comparison shows that the obtained solutions are in excellent agreement.

  8. Application of Stochastic Sensitivity Analysis to Integrated Force Method

    Directory of Open Access Journals (Sweden)

    X. F. Wei


    Full Text Available As a new formulation in structural analysis, the Integrated Force Method has been successfully applied to many structures in civil, mechanical, and aerospace engineering due to its accurate estimation of forces. It is now being further extended to the probabilistic domain. For the assessment of uncertainty effects in system optimization and identification, the probabilistic sensitivity analysis of IFM was further investigated in this study. A stochastic sensitivity analysis formulation of the Integrated Force Method was developed using the perturbation method. Numerical examples are presented to illustrate its application. Its efficiency and accuracy were also substantiated with direct Monte Carlo simulations and the reliability-based sensitivity method. The numerical algorithm was shown to be readily adaptable to the existing program, since the models of stochastic finite elements and stochastic design sensitivity are almost identical.

  9. Development of a real-time PCR method for the differential detection and quantification of four solanaceae in GMO analysis: potato (Solanum tuberosum), tomato (Solanum lycopersicum), eggplant (Solanum melongena), and pepper (Capsicum annuum). (United States)

    Chaouachi, Maher; El Malki, Redouane; Berard, Aurélie; Romaniuk, Marcel; Laval, Valérie; Brunel, Dominique; Bertheau, Yves


    The labeling of products containing genetically modified organisms (GMO) is linked to their quantification since a threshold for the presence of fortuitous GMOs in food has been established. This threshold is calculated from a combination of two absolute quantification values: one for the specific GMO target and the second for an endogenous reference gene specific to the taxon. Thus, the development of reliable methods to quantify GMOs using endogenous reference genes in complex matrixes such as food and feed is needed. Plant identification can be difficult in the case of closely related taxa, which moreover are subject to introgression events. Based on the homology of beta-fructosidase sequences obtained from public databases, two couples of consensus primers were designed for the detection, quantification, and differentiation of four Solanaceae: potato (Solanum tuberosum), tomato (Solanum lycopersicum), pepper (Capsicum annuum), and eggplant (Solanum melongena). Sequence variability was studied first using lines and cultivars (intraspecies sequence variability), then using taxa involved in gene introgressions, and finally, using taxonomically close taxa (interspecies sequence variability). This study allowed us to design four highly specific TaqMan-MGB probes. A duplex real time PCR assay was developed for simultaneous quantification of tomato and potato. For eggplant and pepper, only simplex real time PCR tests were developed. The results demonstrated the high specificity and sensitivity of the assays. We therefore conclude that beta-fructosidase can be used as an endogenous reference gene for GMO analysis.
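The quantification scheme described above, ratioing absolute copy numbers of the GMO target and the taxon-specific endogenous reference gene, reduces to standard-curve arithmetic. The curve parameters and Ct values below are hypothetical; only the structure follows the record.

```python
# GMO content as the ratio of two absolute copy-number estimates, each
# back-calculated from a real-time PCR standard curve
# (Ct = slope * log10(copies) + intercept).
def copies_from_ct(ct, slope, intercept):
    """Back-calculate the copy number from a Ct value via the standard curve."""
    return 10 ** ((ct - intercept) / slope)

# Hypothetical standard curves for the GMO target and the endogenous
# reference gene (e.g. beta-fructosidase), and measured Ct values.
gmo_copies = copies_from_ct(ct=31.2, slope=-3.32, intercept=38.0)
ref_copies = copies_from_ct(ct=24.5, slope=-3.35, intercept=37.5)
gmo_percent = 100 * gmo_copies / ref_copies
print(f"estimated GMO content: {gmo_percent:.2f}%")
```

A slope near -3.32 corresponds to 100% amplification efficiency; the estimated percentage would then be compared against the labeling threshold.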

  10. Using Qualitative Methods to Inform Scale Development (United States)

    Rowan, Noell; Wulff, Dan


    This article describes the process by which one study utilized qualitative methods to create items for a multidimensional scale to measure twelve-step program affiliation. The process included interviewing fourteen addicted persons while in twelve-step-focused treatment about specific pros (things they like or would miss out on by not being…

  11. Statistical evaluation of texture analysis from the biocrystallization method


    Meelursarn, Aumaporn


    Consumers are becoming more concerned about food quality, especially regarding how, when and where foods are produced (Haglund et al., 1999; Kahl et al., 2004; Alföldi et al., 2006). Therefore, during recent years there has been a growing interest in methods for food quality assessment, especially in picture-development methods as a complement to traditional chemical analysis of single compounds (Kahl et al., 2006). The biocrystallization as one of the picture-developin...

  12. Extended Finite Element Method for Fracture Analysis of Structures

    CERN Document Server

    Mohammadi, Soheil


    This important textbook provides an introduction to the concepts of the newly developed extended finite element method (XFEM) for fracture analysis of structures, as well as for other related engineering applications. One of the main advantages of the method is that it avoids any need for remeshing or geometric crack modelling in numerical simulation, while generating discontinuous fields along a crack and around its tip. The second major advantage is that with a small increase in the number of degrees of freedom, far more accurate solutions can be obtained. The method has recently been


    Institute of Scientific and Technical Information of China (English)

    Dou Lei; Wang Zhiquan


    The MacCormack method is applied to the analysis of multiconductor transmission lines by introducing a new technique that does not require decoupling. This method can be used to analyze a wide range of problems and does not have to consider the matrix forms of distributed parameters. We have developed software named MacCormack Transmission Line Analyzer based on the proposed method. Numerical examples are presented to demonstrate the accuracy and efficiency of the method and illustrate its application to analyzing multiconductor transmission lines.
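A minimal version of the MacCormack predictor-corrector applied to a single lossless transmission line (the telegrapher's equations) can be sketched as below. This is not the cited software; the line parameters, grid, and boundary handling are illustrative assumptions, and a multiconductor version would carry vectors of voltages and currents per node without decoupling, as the abstract describes.

```python
import numpy as np

# Lossless telegrapher's equations for one line:
#   dV/dt = -(1/C) dI/dx,   dI/dt = -(1/L) dV/dx
L_per_m, C_per_m = 250e-9, 100e-12      # H/m, F/m (illustrative)
nx, length = 200, 1.0
dx = length / (nx - 1)
v_phase = 1.0 / np.sqrt(L_per_m * C_per_m)
dt = 0.9 * dx / v_phase                  # CFL-limited time step

x = np.linspace(0.0, length, nx)
V = np.exp(-((x - 0.3) / 0.05) ** 2)     # initial Gaussian pulse
I = np.zeros(nx)

for _ in range(50):
    # Predictor: forward spatial differences
    Vp, Ip = V.copy(), I.copy()
    Vp[:-1] = V[:-1] - dt / C_per_m * (I[1:] - I[:-1]) / dx
    Ip[:-1] = I[:-1] - dt / L_per_m * (V[1:] - V[:-1]) / dx
    # Corrector: backward spatial differences on predicted values
    V[1:] = 0.5 * (V[1:] + Vp[1:] - dt / C_per_m * (Ip[1:] - Ip[:-1]) / dx)
    I[1:] = 0.5 * (I[1:] + Ip[1:] - dt / L_per_m * (Vp[1:] - Vp[:-1]) / dx)

# The initial pulse splits into forward and backward traveling waves;
# the solution stays bounded at this CFL number.
print(bool(np.isfinite(V).all()), float(abs(V).max()))
```

The scheme is second-order accurate in space and time and, as noted in the abstract, needs no modal decoupling of the line equations.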

  14. Methods for generating hydroelectric power development alternatives

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Shoou-yuh; Liaw, Shu-liang; Sale, M.J.; Railsback, S.F.


    Hydropower development on large rivers can result in a number of environmental impacts, including potential reductions in dissolved oxygen (DO) concentrations. This study presents a methodology for generating different hydropower development alternatives for evaluation. This methodology employs a Streeter-Phelps model to simulate DO, and the Bounded Implicit Enumeration algorithm to solve an optimization model formulated to maximize hydroelectric energy production subject to acceptable DO limits. The upper Ohio River basin was used to illustrate the use and characteristics of the methodology. The results indicate that several alternatives which meet the specified DO constraints can be generated efficiently, meeting both power and environmental objectives. 17 refs., 2 figs., 1 tab.
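The Streeter-Phelps model used as the DO constraint above has a closed-form sag curve, sketched here with illustrative rate constants (this is the classical model, not the study's calibrated Ohio River parameters).

```python
import math

def do_deficit(t, L0, D0, kd, ka):
    """Streeter-Phelps dissolved-oxygen deficit (mg/L) at time t (days).

    L0: initial BOD, D0: initial DO deficit, kd: deoxygenation rate,
    ka: reaeration rate (1/day). Requires ka != kd.
    """
    return (kd * L0 / (ka - kd)) * (math.exp(-kd * t) - math.exp(-ka * t)) \
        + D0 * math.exp(-ka * t)

def critical_time(L0, D0, kd, ka):
    """Time of maximum deficit, from setting d(deficit)/dt = 0."""
    return (1.0 / (ka - kd)) * math.log(
        (ka / kd) * (1.0 - D0 * (ka - kd) / (kd * L0)))

L0, D0, kd, ka = 10.0, 1.0, 0.3, 0.6    # illustrative river reach
tc = critical_time(L0, D0, kd, ka)
Dc = do_deficit(tc, L0, D0, kd, ka)
# With a saturation DO of 9 mg/L, check a 5 mg/L water-quality limit:
print(round(tc, 2), round(Dc, 2), (9.0 - Dc) >= 5.0)
```

In the optimization described above, a check like the last line becomes the acceptable-DO constraint evaluated for each candidate hydropower release pattern.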

  15. Methods of Fourier analysis and approximation theory

    CERN Document Server

    Tikhonov, Sergey


    Different facets of the interplay between harmonic analysis and approximation theory are covered in this volume. The topics included are Fourier analysis, function spaces, optimization theory, partial differential equations, and their links to modern developments in approximation theory. The articles in this collection originated from two events. The first took place during the 9th ISAAC Congress in Krakow, Poland, 5th-9th August 2013, in the section “Approximation Theory and Fourier Analysis”. The second was the conference on Fourier Analysis and Approximation Theory at the Centre de Recerca Matemàtica (CRM), Barcelona, 4th-8th November 2013, organized by the editors of this volume. All articles selected for this collection were carefully reviewed.

  16. Analysis of flexible aircraft longitudinal dynamics and handling qualities. Volume 1: Analysis methods (United States)

    Waszak, M. R.; Schmidt, D. S.


    As aircraft become larger and lighter to meet design requirements for increased payload and improved fuel efficiency, they will also become more flexible. For highly flexible vehicles, handling qualities may not be accurately predicted by conventional methods. This study applies two analysis methods to a family of flexible aircraft in order to investigate how and when structural (especially dynamic aeroelastic) effects alter the dynamic characteristics of aircraft. The first is an open-loop model analysis technique that considers the effect of modal residue magnitudes on vehicle handling qualities. The second is a pilot-in-the-loop analysis procedure that considers several closed-loop system characteristics. Volume 1 covers the development and application of these two analysis methods.

  17. Optical methods for the analysis of dermatopharmacokinetics (United States)

    Lademann, Juergen; Weigmann, Hans-Juergen; von Pelchrzim, R.; Sterry, Wolfram


    Tape stripping in combination with spectroscopic measurements is a simple, noninvasive method for analyzing the dermatopharmacokinetics of cosmetic products and topically applied drugs. The absorbance at 430 nm was used to characterize the amount of corneocytes on the tape strips, and was compared to the increase in weight of the tapes after their removal from the skin surface. The penetration profiles of two UV filter substances used in sunscreens were determined. The combined method of tape stripping and spectroscopic measurements can also be used to investigate the dermatopharmacokinetics of topically applied drugs passing through the skin. Differences in the penetration profiles of the steroid clobetasol, applied at the same concentration in different formulations, are presented.
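How the absorbance measurement turns strip-by-strip data into a penetration profile can be sketched as follows. All numbers are invented for illustration: the sketch only shows the bookkeeping (absorbance at 430 nm taken as proportional to the stratum corneum removed per strip, cumulative absorbance used as a relative depth axis), not the study's actual measurements.

```python
# Per-strip data (illustrative, not measured values):
absorbance_430nm = [0.30, 0.26, 0.22, 0.18, 0.15, 0.12]  # corneocyte proxy
drug_amount_ug = [4.1, 2.6, 1.6, 1.0, 0.6, 0.4]          # drug per strip

# Cumulative absorbance, normalized, serves as a relative depth axis:
total = sum(absorbance_430nm)
depth = []
cumulative = 0.0
for a in absorbance_430nm:
    cumulative += a
    depth.append(cumulative / total)     # fraction of removed SC

profile = list(zip(depth, drug_amount_ug))
for d, amount in profile:
    print(f"relative depth {d:.2f}: {amount:.1f} ug")
```

Plotting drug amount against this normalized depth, rather than against strip number, corrects for the fact that successive strips remove unequal amounts of stratum corneum.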

  18. Developing Unconstrained Methods for Enzyme Evolution (United States)


    …methods fail to produce catalytically efficient enzymes. This study has broad application in many technologies, from chemical synthesis to human health and the environment. Our work centers around the…minimal media with N-15 labeled ammonia. After several months of screening, we finally identified conditions that allowed us to obtain labeled protein in

  19. Recent developments in time-frequency analysis

    CERN Document Server

    Loughlin, Patrick


    Recent Developments in Time-Frequency Analysis brings together in one place important contributions and up-to-date research results in this fast-moving area, and serves as an excellent reference, providing insight into some of the most challenging research issues in the field.

  20. Current status of fluoride volatility method development

    Energy Technology Data Exchange (ETDEWEB)

    Uhlir, J.; Marecek, M.; Skarohlid, J. [UJV - Nuclear Research Institute, Research Centre Rez, CZ-250 68 Husinec - Rez 130 (Czech Republic)


    The Fluoride Volatility Method is based on a separation process that exploits the specific property of uranium, neptunium and plutonium to form volatile hexafluorides, whereas most fission products (mainly lanthanides) and higher transplutonium elements (americium, curium) present in irradiated fuel form non-volatile trifluorides. The method itself is based on direct fluorination of the spent fuel, but before the fluorination step, the cladding material must be removed and the fuel transformed into a powder with a suitable grain size. Fluorination is performed with fluorine gas in a flame fluorination reactor, where the volatile fluorides (mostly UF{sub 6}) are separated from the non-volatile ones (trivalent minor actinides and the majority of fission products). The subsequent operations necessary for partitioning the volatile fluorides are condensation and evaporation of the volatile fluorides, thermal decomposition of PuF{sub 6}, and finally distillation and sorption for purification of the uranium product. The Fluoride Volatility Method is considered a promising advanced pyrochemical reprocessing technology, mainly applicable to the reprocessing of oxide spent fuels coming from future GEN IV fast reactors.
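The core separation step above is a partition by fluoride volatility, which can be sketched as a toy classifier. The element lists paraphrase the abstract and are illustrative only; this is bookkeeping for the flowsheet logic, not a chemistry model, and it ignores the later PuF6 decomposition and distillation steps.

```python
# Elements forming volatile hexafluorides vs. non-volatile trifluorides,
# per the process description (illustrative, not exhaustive):
VOLATILE_HEXAFLUORIDES = {"U", "Np", "Pu"}          # e.g. UF6
NON_VOLATILE_EXAMPLES = {"Am", "Cm", "La", "Nd"}    # trifluorides

def fluorination_split(spent_fuel):
    """Partition a spent-fuel composition (element -> mass fraction)
    into the volatile off-gas stream and the non-volatile residue
    produced by the flame fluorination step."""
    off_gas = {el: x for el, x in spent_fuel.items()
               if el in VOLATILE_HEXAFLUORIDES}
    residue = {el: x for el, x in spent_fuel.items()
               if el not in VOLATILE_HEXAFLUORIDES}
    return off_gas, residue

fuel = {"U": 0.93, "Pu": 0.01, "Am": 0.001, "Nd": 0.004}  # toy composition
off_gas, residue = fluorination_split(fuel)
print(sorted(off_gas), sorted(residue))
```

In the real process the off-gas stream then goes through condensation, PuF{sub 6} thermal decomposition, and distillation/sorption to yield the purified uranium product.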