WorldWideScience

Sample records for combined tool approach

  1. Angular approach combined to mechanical model for tool breakage detection by eddy current sensors

    OpenAIRE

    Ritou , Mathieu; Garnier , Sébastien; Furet , Benoît; Hascoët , Jean-Yves

    2014-01-01

The paper presents a new complete approach for Tool Condition Monitoring (TCM) in milling. The aim is the early detection of small damage so that catastrophic tool failures are prevented. A versatile in-process monitoring system is introduced for reliability concerns. The tool condition is determined from estimates of the radial eccentricity of the teeth. An adequate criterion is proposed, combining a mechanical model of milling and an angular approach. Then, a new solution i...

  2. Angular approach combined to mechanical model for tool breakage detection by eddy current sensors

    Science.gov (United States)

    Ritou, M.; Garnier, S.; Furet, B.; Hascoet, J. Y.

    2014-02-01

The paper presents a new complete approach for Tool Condition Monitoring (TCM) in milling. The aim is the early detection of small damage so that catastrophic tool failures are prevented. A versatile in-process monitoring system is introduced for reliability concerns. The tool condition is determined from estimates of the radial eccentricity of the teeth. An adequate criterion is proposed, combining a mechanical model of milling and an angular approach. Then, a new solution is proposed for estimating the cutting force using eddy current sensors implemented close to the spindle nose. Signals are analysed in the angular domain, notably by a synchronous averaging technique. Phase shifts induced by changes of machining direction are compensated. Results are compared with cutting forces measured with a dynamometer table. The proposed method is implemented in an industrial case of a pocket machining operation. One of the cutting edges was slightly damaged during the machining, as shown by a direct measurement of the tool. A control chart is established with the estimates of cutter eccentricity obtained during machining from the eddy current sensor signals. The efficiency and reliability of the method are demonstrated by a successful detection of the damage.
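
    The angular-domain synchronous averaging mentioned in this abstract is a standard rotating-machinery technique. As an illustrative sketch only (not the authors' implementation; the signal values, sample counts and 4-tooth pattern below are invented for the example), it can be prototyped in a few lines:

```python
import numpy as np

def synchronous_average(signal, samples_per_rev):
    """Average a signal over complete revolutions at fixed angular positions.

    Reshapes the signal into one row per spindle revolution and averages down
    the columns, so components synchronous with rotation (e.g. the
    tooth-passing pattern) are retained while asynchronous noise averages out.
    """
    n_revs = len(signal) // samples_per_rev            # complete revolutions only
    trimmed = np.asarray(signal[: n_revs * samples_per_rev], dtype=float)
    return trimmed.reshape(n_revs, samples_per_rev).mean(axis=0)

# Toy example: a 4-tooth cutter pattern buried in noise, 360 samples per rev.
rng = np.random.default_rng(0)
theta = np.linspace(0, 2 * np.pi, 360, endpoint=False)
pattern = np.sin(4 * theta)                            # tooth-passing component
raw = np.tile(pattern, 50) + rng.normal(0.0, 1.0, 360 * 50)
avg = synchronous_average(raw, 360)                    # pattern re-emerges
```

    Averaging over 50 revolutions reduces the standard deviation of the asynchronous noise by roughly a factor of sqrt(50), which is what lets a small per-tooth eccentricity stand out against measurement noise.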

  3. INDUSTRIAL APPROBATION OF COMBINED COUNTERSINK-TAP TOOL

    Directory of Open Access Journals (Sweden)

    Nurulla M. Vagabov

    2017-01-01

Abstract. Objectives: Based on a critical analysis of the results of complex studies, we set out to demonstrate the advantages, as compared with existing technologies, of a developed technology that uses a new cutting scheme with a combined countersink-tap tool. Methods: One way to improve processing capacity, tool life and the quality of a cut thread is to reduce the torque and the strain hardening of the processed material by employing a new cutting approach that completely eliminates the friction of the lateral sides of the tooth on the surface of the cut thread. It was necessary for this technology to be checked in real production conditions. Results: The conducted production tests of a combined countersink-tap tool with the new cutting scheme developed by the inventors have shown that, as a result of a significant reduction in the torque and a decrease in the strain hardening of the processed material, it is possible to increase the cutting speed and raise labour productivity by more than 2 times as compared with thread cutting using taps with staggered teeth, 1.2 times as compared to taps with a corrected structure, and more than 6 times as compared to standard taps. At the same time, the stability of the tool is increased 3-5 times and the number of breakages is sharply reduced. Conclusion: It has been established that the accuracy of the geometric parameters, as well as the strength and quality of the thread surface cut by the combined countersink-tap tool with the new cutting scheme in hard-to-work materials, is much higher than the same thread parameters obtained by processing with standard and other known taps. The studies also indicated its high reliability, operability and expediency of application for processing the above-mentioned materials. The advantages of the combined tool also include a reduction in thread cutting time as compared to separate machining of the threaded hole (countersinking with a standard

  4. Combined tool approach is 100% successful for emergency football face mask removal.

    Science.gov (United States)

    Copeland, Aaron J; Decoster, Laura C; Swartz, Erik E; Gattie, Eric R; Gale, Stephanie D

    2007-11-01

Objective: To compare the effectiveness of two techniques for removing football face masks: cutting loop straps [cutting tool: FMXtractor (FMX)] or removing screws with a cordless screwdriver and using the FMXtractor as needed for failed removals [combined tool (CT)]. Null hypotheses: no differences in face mask removal success, removal time or difficulty between techniques or helmet characteristics. Design: Retrospective, cross-sectional. Setting: NOCSAE-certified helmet reconditioning plants. Participants: 600 used high school helmets. Intervention: Face mask removal attempted with the two techniques. Main outcome measures: Success, removal time, rating of perceived exertion (RPE). Results: Both techniques were effective [CT 100% (300/300); FMX 99.4% (298/300)]. Use of the backup FMXtractor in CT trials was required in 19% of trials. There was significantly (Pfootball player's helmet.

  5. Factors that affect micro-tooling features created by direct printing approach

    Science.gov (United States)

    Kumbhani, Mayur N.

The current market requires faster-paced production of smaller, better and improved products in a shorter amount of time. Traditional high-rate manufacturing processes such as hot embossing, injection molding, compression molding, etc. use tooling to replicate features on a product. Miniaturization of many products in the biomedical, electronic, optical and microfluidic fields is occurring on a daily basis. There is a constant need to produce cheaper and faster tooling that can be utilized by existing manufacturing processes. Traditionally, in order to manufacture micron-size tooling features, processes such as micro-machining, Electrical Discharge Machining (EDM), etc. are utilized. Due to the greater difficulty of producing smaller features and the longer production cycle times, various additive manufacturing approaches have been proposed, e.g. selective laser sintering (SLS), inkjet printing (3DP), fused deposition modeling (FDM), etc. Most of these approaches can produce net-shaped products from different materials such as metals, ceramics or polymers. Several attempts have been made to produce tooling features using additive manufacturing approaches, but most of the resulting tooling was not cost effective, and its life cycle was reported to be short. In this research, a method to produce tooling features using a direct printing approach is presented, in which a highly filled feedstock is dispensed on a substrate. Different natural binders, such as guar gum, xanthan gum and sodium carboxymethyl cellulose (NaCMC), and their combinations were evaluated. The best binder combination was then used to evaluate the effect of different metal particle sizes (316L stainless steel (3 μm), 316 stainless steel (45 μm), and 304 stainless steel (45 μm)) on feature quality. Finally, the effect of direct printing process variables such as dispensing tip internal diameter (500 μm and 333 μm) at different printing speeds was evaluated.

  6. Design of Friction Stir Spot Welding Tools by Using a Novel Thermal-Mechanical Approach.

    Science.gov (United States)

    Su, Zheng-Ming; Qiu, Qi-Hong; Lin, Pai-Chen

    2016-08-09

A simple thermal-mechanical model for friction stir spot welding (FSSW) was developed to obtain similar weld performance from different weld tools. Use of the thermal-mechanical model and a combined approach enabled the design of weld tools of various sizes but similar quality. Three weld tools for weld radii of 4, 5, and 6 mm were made to join 6061-T6 aluminum sheets. Performance evaluations of the three weld tools compared the fracture behavior, microstructure, micro-hardness distribution, and welding temperature of welds in lap-shear specimens. For welds made by the three weld tools under identical processing conditions, failure loads were approximately proportional to tool size. Failure modes, microstructures, and micro-hardness distributions were similar. Welding temperatures correlated with frictional heat generation rate densities. Because the three weld tools sufficiently met all design objectives, the proposed approach is considered a simple and feasible guideline for preliminary tool design.

  7. Managing and monitoring tuberculosis using web-based tools in combination with traditional approaches.

    Science.gov (United States)

    Chapman, Ann Ln; Darton, Thomas C; Foster, Rachel A

    2013-01-01

    Tuberculosis (TB) remains a global health emergency. Ongoing challenges include the coordination of national and international control programs, high levels of drug resistance in many parts of the world, and availability of accurate and rapid diagnostic tests. The increasing availability and reliability of Internet access throughout both affluent and resource-limited countries brings new opportunities to improve TB management and control through the integration of web-based technologies with traditional approaches. In this review, we explore current and potential future use of web-based tools in the areas of TB diagnosis, treatment, epidemiology, service monitoring, and teaching and training.

  8. A combined rheology and time domain NMR approach for determining water distributions in protein blends

    NARCIS (Netherlands)

    Dekkers, Birgit L.; Kort, de Daan W.; Grabowska, Katarzyna J.; Tian, Bei; As, Van Henk; Goot, van der Atze Jan

    2016-01-01

    We present a combined time domain NMR and rheology approach to quantify the water distribution in a phase separated protein blend. The approach forms the basis for a new tool to assess the microstructural properties of phase separated biopolymer blends, making it highly relevant for many food and

  9. Thinkering through Experiments: Nurturing Transdisciplinary Approaches to the Design of Testing Tools

    Directory of Open Access Journals (Sweden)

    Kathryn B. Francis

    2017-11-01

In order to assess and understand human behavior, traditional approaches to experimental design incorporate testing tools that are often artificial and devoid of corporeal features. Whilst these offer experimental control in situations in which, methodologically, real behaviors cannot be examined, there is increasing evidence that responses given in these contextually deprived experiments fail to trigger genuine responses. This may result from a lack of consideration regarding the material makeup and associations connected with the fabric of experimental tools. In a two-year collaboration, we began to experiment with the physicality of testing tools using the domain of moral psychology as a case study. This collaboration involved thinkering and prototyping methods that included direct contact and consideration of the materials involved in experimentation. Having explored the embodied nature of morality, we combined approaches from experimental psychology, moral philosophy, design thinking, and computer science to create a new testing tool for simulated moral behavior. Although the testing tool itself generated fruitful results, this paper considers the collaborative methodology through which it was produced as a route to highlight material questions within psychological research.

  10. A combined scanning tunneling microscope-atomic layer deposition tool.

    Science.gov (United States)

    Mack, James F; Van Stockum, Philip B; Iwadate, Hitoshi; Prinz, Fritz B

    2011-12-01

We have built a combined scanning tunneling microscope-atomic layer deposition (STM-ALD) tool that performs in situ imaging of deposition. It operates from room temperature up to 200 °C, and at pressures from 1 × 10⁻⁶ Torr to 1 × 10⁻² Torr. The STM-ALD system has a complete passive vibration isolation system that counteracts both seismic and acoustic excitations. The instrument can be used as an observation tool to monitor the initial growth phases of ALD in situ, as well as a nanofabrication tool by applying an electric field with the tip to laterally pattern deposition. In this paper, we describe the design of the tool and demonstrate its capability for atomic resolution STM imaging, atomic layer deposition, and the combination of the two techniques for in situ characterization of deposition.

  11. Managing and monitoring tuberculosis using web-based tools in combination with traditional approaches

    Directory of Open Access Journals (Sweden)

    Chapman AL

    2013-11-01

Ann LN Chapman,1 Thomas C Darton,2 Rachel A Foster1 (1Department of Infection and Tropical Medicine, Sheffield Teaching Hospitals NHS Foundation Trust, Sheffield, UK; 2Oxford Vaccine Group, Centre for Clinical Vaccinology and Tropical Medicine, University of Oxford, Oxford, UK). Abstract: Tuberculosis (TB) remains a global health emergency. Ongoing challenges include the coordination of national and international control programs, high levels of drug resistance in many parts of the world, and availability of accurate and rapid diagnostic tests. The increasing availability and reliability of Internet access throughout both affluent and resource-limited countries brings new opportunities to improve TB management and control through the integration of web-based technologies with traditional approaches. In this review, we explore current and potential future use of web-based tools in the areas of TB diagnosis, treatment, epidemiology, service monitoring, and teaching and training. Keywords: tuberculosis, information communication technology, Internet

  12. XQCAT eXtra Quark Combined Analysis Tool

    CERN Document Server

    Barducci, D; Buchkremer, M; Marrouche, J; Moretti, S; Panizzi, L

    2015-01-01

XQCAT (eXtra Quark Combined Analysis Tool) is a tool aimed at determining exclusion Confidence Levels (eCLs) for scenarios of new physics characterised by the presence of one or multiple heavy extra quarks (XQ) which interact through Yukawa couplings with any of the Standard Model (SM) quarks. The code uses a database of efficiencies for pre-simulated processes of Quantum Chromo-Dynamics (QCD) pair production and on-shell decays of extra quarks. In version 1.0 of XQCAT the efficiencies have been computed for a set of seven publicly available search results from the CMS experiment, and the package is subject to future updates to include further searches by both the ATLAS and CMS collaborations. The input to the code is a text file in which the masses, branching ratios (BRs) and dominant chirality of the couplings of the new quarks are provided. The output of the code is the eCL of the test point for each implemented experimental analysis, considered individually and, when possible, in statistical combination.

  13. Minimization of the LCA impact of thermodynamic cycles using a combined simulation-optimization approach

    International Nuclear Information System (INIS)

    Brunet, Robert; Cortés, Daniel; Guillén-Gosálbez, Gonzalo; Jiménez, Laureano; Boer, Dieter

    2012-01-01

This work presents a computational approach for the simultaneous minimization of the total cost and environmental impact of thermodynamic cycles. Our method combines process simulation, multi-objective optimization and life cycle assessment (LCA) within a unified framework that identifies in a systematic manner optimal design and operating conditions according to several economic and LCA impacts. Our approach takes advantage of the complementary strengths of process simulation (in which mass and energy balances and thermodynamic calculations are implemented in an easy manner) and rigorous deterministic optimization tools. We demonstrate the capabilities of this strategy by means of two case studies in which we address the design of a 10 MW Rankine cycle modeled in Aspen Hysys, and a 90 kW ammonia-water absorption cooling cycle implemented in Aspen Plus. Numerical results show that it is possible to achieve environmental and cost savings using our rigorous approach. - Highlights: ► Novel framework for the optimal design of thermodynamic cycles. ► Combined use of simulation and optimization tools. ► Optimal design and operating conditions according to several economic and LCA impacts. ► Design of a 10 MW Rankine cycle in Aspen Hysys, and a 90 kW absorption cycle in Aspen Plus.

  14. Minimization of the LCA impact of thermodynamic cycles using a combined simulation-optimization approach

    Energy Technology Data Exchange (ETDEWEB)

Brunet, Robert; Cortes, Daniel [Departament d'Enginyeria Quimica, Escola Tecnica Superior d'Enginyeria Quimica, Universitat Rovira i Virgili, Campus Sescelades, Avinguda Paisos Catalans 26, 43007 Tarragona (Spain)]; Guillen-Gosalbez, Gonzalo [Departament d'Enginyeria Quimica, Escola Tecnica Superior d'Enginyeria Quimica, Universitat Rovira i Virgili, Campus Sescelades, Avinguda Paisos Catalans 26, 43007 Tarragona (Spain)]; Jimenez, Laureano [Departament d'Enginyeria Quimica, Escola Tecnica Superior d'Enginyeria Quimica, Universitat Rovira i Virgili, Campus Sescelades, Avinguda Paisos Catalans 26, 43007 Tarragona (Spain)]; Boer, Dieter [Departament d'Enginyeria Mecanica, Escola Tecnica Superior d'Enginyeria, Universitat Rovira i Virgili, Campus Sescelades, Avinguda Paisos Catalans 26, 43007 Tarragona (Spain)]

    2012-12-15

This work presents a computational approach for the simultaneous minimization of the total cost and environmental impact of thermodynamic cycles. Our method combines process simulation, multi-objective optimization and life cycle assessment (LCA) within a unified framework that identifies in a systematic manner optimal design and operating conditions according to several economic and LCA impacts. Our approach takes advantage of the complementary strengths of process simulation (in which mass and energy balances and thermodynamic calculations are implemented in an easy manner) and rigorous deterministic optimization tools. We demonstrate the capabilities of this strategy by means of two case studies in which we address the design of a 10 MW Rankine cycle modeled in Aspen Hysys, and a 90 kW ammonia-water absorption cooling cycle implemented in Aspen Plus. Numerical results show that it is possible to achieve environmental and cost savings using our rigorous approach. - Highlights: ► Novel framework for the optimal design of thermodynamic cycles. ► Combined use of simulation and optimization tools. ► Optimal design and operating conditions according to several economic and LCA impacts. ► Design of a 10 MW Rankine cycle in Aspen Hysys, and a 90 kW absorption cycle in Aspen Plus.

Approaches, tools and methods used for setting priorities in health research in the 21st century.

    Science.gov (United States)

    Yoshida, Sachiyo

    2016-06-01

    Health research is difficult to prioritize, because the number of possible competing ideas for research is large, the outcome of research is inherently uncertain, and the impact of research is difficult to predict and measure. A systematic and transparent process to assist policy makers and research funding agencies in making investment decisions is a permanent need. To obtain a better understanding of the landscape of approaches, tools and methods used to prioritize health research, I conducted a methodical review using the PubMed database for the period 2001-2014. A total of 165 relevant studies were identified, in which health research prioritization was conducted. They most frequently used the CHNRI method (26%), followed by the Delphi method (24%), James Lind Alliance method (8%), the Combined Approach Matrix (CAM) method (2%) and the Essential National Health Research method (priorities were set. A further 19% used a combination of expert panel interview and focus group discussion ("consultation process") but provided few details, while a further 2% used approaches that were clearly described, but not established as a replicable method. Online surveys that were not accompanied by face-to-face meetings were used in 8% of studies, while 9% used a combination of literature review and questionnaire to scrutinise the research options for prioritization among the participating experts. The number of priority setting exercises in health research published in PubMed-indexed journals is increasing, especially since 2010. These exercises are being conducted at a variety of levels, ranging from the global level to the level of an individual hospital. With the development of new tools and methods which have a well-defined structure - such as the CHNRI method, James Lind Alliance Method and Combined Approach Matrix - it is likely that the Delphi method and non-replicable consultation processes will gradually be replaced by these emerging tools, which offer more

  16. Approaches, tools and methods used for setting priorities in health research in the 21st century

    Science.gov (United States)

    Yoshida, Sachiyo

    2016-01-01

Background: Health research is difficult to prioritize, because the number of possible competing ideas for research is large, the outcome of research is inherently uncertain, and the impact of research is difficult to predict and measure. A systematic and transparent process to assist policy makers and research funding agencies in making investment decisions is a permanent need. Methods: To obtain a better understanding of the landscape of approaches, tools and methods used to prioritize health research, I conducted a methodical review using the PubMed database for the period 2001–2014. Results: A total of 165 relevant studies were identified, in which health research prioritization was conducted. They most frequently used the CHNRI method (26%), followed by the Delphi method (24%), James Lind Alliance method (8%), the Combined Approach Matrix (CAM) method (2%) and the Essential National Health Research method (priorities were set. A further 19% used a combination of expert panel interview and focus group discussion ("consultation process") but provided few details, while a further 2% used approaches that were clearly described, but not established as a replicable method. Online surveys that were not accompanied by face-to-face meetings were used in 8% of studies, while 9% used a combination of literature review and questionnaire to scrutinise the research options for prioritization among the participating experts. Conclusion: The number of priority setting exercises in health research published in PubMed-indexed journals is increasing, especially since 2010. These exercises are being conducted at a variety of levels, ranging from the global level to the level of an individual hospital. With the development of new tools and methods which have a well-defined structure – such as the CHNRI method, James Lind Alliance Method and Combined Approach Matrix – it is likely that the Delphi method and non-replicable consultation processes will gradually be

  17. A Signal Detection Approach in a Multiple Cohort Study: Different Admission Tools Uniquely Select Different Successful Students

    Directory of Open Access Journals (Sweden)

    Linda van Ooijen-van der Linden

    2018-05-01

Using multiple admission tools in university admission procedures is common practice. This is particularly useful if different admission tools uniquely select different subgroups of students who will be successful in university programs. A signal-detection approach was used to investigate the accuracy of Secondary School grade point average (SSGPA), an admission test score (ACS), and a non-cognitive score (NCS) in uniquely selecting successful students. This was done for three consecutive first-year cohorts of a broad psychology program. Each applicant's score on SSGPA, ACS, or NCS alone, and on seven combinations of these scores, all considered separate "admission tools", was compared at two different (medium and high) cut-off scores (criterion levels). Each of the tools selected successful students who were not selected by any of the other tools. Both sensitivity and specificity were enhanced by implementing multiple tools. The signal-detection approach distinctively provided useful information for decisions on admission instruments and cut-off scores.
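
    The signal-detection framing in this study treats each admission tool as a binary classifier of later student success. A minimal sketch of that framing follows; all scores, cut-offs and outcome sets below are hypothetical toy data (not the study's data), and the helper names are invented for illustration:

```python
def selected(scores, cutoff):
    """Indices of applicants admitted by a tool at a given cut-off."""
    return {i for i, s in enumerate(scores) if s >= cutoff}

def sensitivity_specificity(chosen, successful, n):
    """Hit rate and correct-rejection rate of an admission decision."""
    tp = len(chosen & successful)                     # admitted and successful
    tn = len((set(range(n)) - chosen) - successful)   # rejected and unsuccessful
    return tp / len(successful), tn / (n - len(successful))

ssgpa = [7.5, 6.0, 8.2, 5.5, 7.0, 6.8]   # secondary-school GPA (hypothetical)
acs   = [60, 85, 55, 70, 90, 50]         # admission-test score (hypothetical)
success = {0, 1, 2, 4}                   # applicants who later passed year 1

by_gpa  = selected(ssgpa, 6.9)           # medium cut-off on SSGPA
by_test = selected(acs, 65)              # medium cut-off on ACS
unique_to_test = (by_test & success) - by_gpa   # successes only ACS finds
```

    In this toy example the test score admits a successful student the GPA misses, which is the "unique selection" pattern the study reports; combining tools (e.g. admitting anyone either tool selects) changes both sensitivity and specificity.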

  18. APPROACHES AND TOOLS FOR QUALITY EXAMINATION OF E-LEARNING TOOLS

    Directory of Open Access Journals (Sweden)

    Galina P. Lavrentieva

    2011-02-01

The article highlights scientific and methodological approaches to the quality examination of e-learning tools for general education. The terms of the research are defined, and the essence of the main components and stages of the examination is described. A methodology for elaborating quality-estimation tools is outlined, which should be based on identifying evaluation criteria and parameters. A complex of psycho-pedagogical and ergonomic requirements to be used in organizing the examination is justified, and the most expedient ways of implementing them are examined.

  19. An approach for investigation of secure access processes at a combined e-learning environment

    Science.gov (United States)

    Romansky, Radi; Noninska, Irina

    2017-12-01

The article discusses an approach to investigating processes for regulating security and privacy control in a heterogeneous e-learning environment realized as a combination of traditional and cloud means and tools. The authors' proposal for a combined architecture of an e-learning system is presented, and the main subsystems and procedures are discussed. A formalization of the processes for using different types of resources (public, private internal and private external) is proposed. The apparatus of Markov chains (MC) is used for modeling and analytical investigation of secure access to the resources, and some assessments are presented.
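
    A minimal sketch of the Markov-chain machinery such an analysis relies on, with an invented 3-state access model (the states and transition probabilities below are illustrative assumptions, not taken from the paper):

```python
import numpy as np

# Hypothetical access states: 0 = Idle, 1 = PublicResource, 2 = PrivateResource.
# P[i][j] is the probability of moving from state i to state j per step.
P = np.array([
    [0.5, 0.3, 0.2],   # Idle -> ...
    [0.4, 0.4, 0.2],   # PublicResource -> ...
    [0.6, 0.1, 0.3],   # PrivateResource -> ...
])

def stationary(P, iters=200):
    """Long-run state occupancy via power iteration on the chain."""
    pi = np.full(P.shape[0], 1.0 / P.shape[0])   # start from a uniform guess
    for _ in range(iters):
        pi = pi @ P                              # one step of the chain
    return pi

pi = stationary(P)   # fraction of time spent in each access state
```

    The stationary vector gives the long-run share of requests hitting each resource class, which is the kind of quantitative assessment a Markov-chain model of secure access yields.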

  20. Modifiable Combining Functions

    OpenAIRE

    Cohen, Paul; Shafer, Glenn; Shenoy, Prakash P.

    2013-01-01

    Modifiable combining functions are a synthesis of two common approaches to combining evidence. They offer many of the advantages of these approaches and avoid some disadvantages. Because they facilitate the acquisition, representation, explanation, and modification of knowledge about combinations of evidence, they are proposed as a tool for knowledge engineers who build systems that reason under uncertainty, not as a normative theory of evidence.

  1. Combining genomic and proteomic approaches for epigenetics research

    Science.gov (United States)

    Han, Yumiao; Garcia, Benjamin A

    2014-01-01

    Epigenetics is the study of changes in gene expression or cellular phenotype that do not change the DNA sequence. In this review, current methods, both genomic and proteomic, associated with epigenetics research are discussed. Among them, chromatin immunoprecipitation (ChIP) followed by sequencing and other ChIP-based techniques are powerful techniques for genome-wide profiling of DNA-binding proteins, histone post-translational modifications or nucleosome positions. However, mass spectrometry-based proteomics is increasingly being used in functional biological studies and has proved to be an indispensable tool to characterize histone modifications, as well as DNA–protein and protein–protein interactions. With the development of genomic and proteomic approaches, combination of ChIP and mass spectrometry has the potential to expand our knowledge of epigenetics research to a higher level. PMID:23895656

  2. Prioritizing the refactoring need for critical component using combined approach

    Directory of Open Access Journals (Sweden)

    Rajni Sehgal

    2018-10-01

One of the most promising strategies for smoothing out the maintainability issues of software is refactoring. Due to the lack of a proper design approach, code often inherits bad smells which may lead to improper functioning, especially when it is subject to change and requires maintenance. A lot of studies have been performed to optimize the refactoring strategy, which is also a very expensive process. In this paper, a component-based system is considered, and a Fuzzy Multi-Criteria Decision Making (FMCDM) model is proposed by combining subjective and objective weights to rank the components as per their urgency of refactoring. The JDeodorant tool is used to detect the code smells in the individual components of a software system. The objective method uses the entropy approach to rank the components having code smells. The subjective method uses the fuzzy TOPSIS approach, based on decision makers' judgement, to identify the criticality and dependency of these code smells on the overall software. The suggested approach is implemented on component-based software having 15 components. The constituent components are ranked based on refactoring requirements.
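
    The entropy approach named here is a standard objective-weighting step in multi-criteria decision making. A minimal sketch follows, with a hypothetical decision matrix of smell counts (the numbers are invented for illustration, not taken from the paper):

```python
import numpy as np

def entropy_weights(X):
    """Objective criterion weights from Shannon entropy of a decision matrix.

    Rows are alternatives (components), columns are criteria (e.g. counts of
    each code-smell type). Criteria whose values vary more across components
    carry more discriminating information and receive larger weights.
    """
    X = np.asarray(X, dtype=float)
    m = X.shape[0]
    p = X / X.sum(axis=0)                        # column-wise proportions
    with np.errstate(divide="ignore", invalid="ignore"):
        plogp = np.where(p > 0, p * np.log(p), 0.0)
    e = -plogp.sum(axis=0) / np.log(m)           # entropy per criterion, in [0, 1]
    d = 1.0 - e                                  # degree of divergence
    return d / d.sum()                           # normalized weights

# Hypothetical smell counts for 4 components across 3 smell types.
X = [[5, 2, 7],
     [5, 9, 1],
     [5, 4, 4],
     [5, 1, 8]]
w = entropy_weights(X)
```

    A criterion with identical values across all components carries no discriminating information (its entropy is maximal), so it gets essentially zero weight; the resulting objective weights can then be combined with subjective fuzzy-TOPSIS judgements as the abstract describes.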

  3. MaLT - Combined Motor and Language Therapy Tool for Brain Injury Patients Using Kinect.

    Science.gov (United States)

    Wairagkar, Maitreyee; McCrindle, Rachel; Robson, Holly; Meteyard, Lotte; Sperrin, Malcom; Smith, Andy; Pugh, Moyra

    2017-03-23

    presented as graphs plotted from patient game data. Performance indicators include reaction time, accuracy, number of incorrect responses and hand use. The resultant games have also been tested by the PPI with a positive response and further suggestions for future modifications made. MaLT provides a tool that innovatively combines motor and language therapy for high dosage rehabilitation in the home. It has demonstrated that motion sensor technology can be successfully combined with a language therapy task to target both upper limb and linguistic impairment in patients following brain injury. The initial studies on stroke survivors have demonstrated that the combined therapy approach is viable and the outputs of this study will inform planned larger scale future trials.

  4. Natural training tools of informatics in conditions of embodied and mental approach realization

    Directory of Open Access Journals (Sweden)

    Daria A. Barkhatova

    2017-01-01

Modern processes of globalization and informatization of human activity necessitate a change of the educational paradigm in the field of information training, focused on the formation of strong fundamental knowledge and abilities that are necessary for a person's information activities and lifelong self-education. In connection with these requirements, it is necessary to pay attention to new approaches in education based on the achievements of cognitive science and modern pedagogy. One such approach is the embodied-and-mental approach. The paper describes a way of realizing the embodied-and-mental approach in the teaching of informatics through the application of natural tools, providing a fuller and deeper understanding of the educational material and developing the cognitive abilities of students. The paper carries out a theoretical analysis of the psychological-pedagogical and methodological literature on the research subject, generalizes the results, models the natural tools and describes the results of their partial approbation. The necessary quality of education is to be achieved through the use of modern techniques focused on developing cognitive abilities and improving the quality of knowledge. In the conditions of information education, the combination of embodied and mental approaches allows students to grasp the essence of the studied subject by activating the motor area of memory and the kinesthetic and visual perception channels. Natural tools in informatics are proposed as the instrument for realizing this idea, which is motivated by the age-related features of students' cognitive abilities and individual preferences in the ways of perceiving and mastering the material, matched to the level of their knowledge. The research results describe the models of natural tools developed by students and lecturers of the basic Department of Informatics

  5. Ensemble approach combining multiple methods improves human transcription start site prediction

    LENUS (Irish Health Repository)

    Dineen, David G

    2010-11-30

    Abstract Background The computational prediction of transcription start sites is an important unsolved problem. Some recent progress has been made, but many promoters, particularly those not associated with CpG islands, are still difficult to locate using current methods. These methods use different features and training sets, along with a variety of machine learning techniques, and result in different prediction sets. Results We demonstrate the heterogeneity of current prediction sets, and take advantage of this heterogeneity to construct a two-level classifier ('Profisi Ensemble') using predictions from 7 programs, along with 2 other data sources. Support vector machines using 'full' and 'reduced' data sets are combined in an either/or approach. We achieve a 14% increase in performance over the current state-of-the-art, as benchmarked by a third-party tool. Conclusions Supervised learning methods are a useful way to combine predictions from diverse sources.
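The "either/or" two-level combination described here can be sketched in miniature. The version below is a hypothetical weighted-vote stand-in (the real Profisi Ensemble trains SVMs over scores from 7 programs); tool names, positions, weights and thresholds are all invented for illustration:

```python
# Hypothetical weighted-vote sketch of an "either/or" two-level combination.
# Tool names, positions, weights and thresholds are invented for illustration.

def vote_score(position, predictions, weights):
    """Sum of the weights of the tools that predict this TSS position."""
    return sum(w for tool, w in weights.items() if position in predictions[tool])

def either_or(calls_full, calls_reduced):
    """A site is reported if either the 'full' or the 'reduced' pass accepts it."""
    return calls_full | calls_reduced

predictions = {                       # per-tool TSS calls (genomic positions)
    "toolA": {100, 250, 900},
    "toolB": {100, 260},
    "toolC": {100, 900},
}
weights = {"toolA": 0.5, "toolB": 0.3, "toolC": 0.2}
candidates = {p for calls in predictions.values() for p in calls}

# A strict ('full') and a permissive ('reduced') pass, then their union.
full    = {p for p in candidates if vote_score(p, predictions, weights) >= 0.7}
reduced = {p for p in candidates if vote_score(p, predictions, weights) >= 0.5}
print(sorted(either_or(full, reduced)))  # [100, 250, 900]
```

The union step mirrors the abstract's point: a site missed by the strict classifier can still be rescued by the permissive one.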

  6. SMARTE: SUSTAINABLE MANAGEMENT APPROACHES AND REVITALIZATION TOOLS-ELECTRONIC (BELFAST, IRELAND)

    Science.gov (United States)

    The U.S.-German Bilateral Working Group is developing Site-specific Management Approaches and Redevelopment Tools (SMART). In the U.S., the SMART compilation is housed in a web-based, decision support tool called SMARTe. All tools within SMARTe that are developed specifically for...

  7. Cleaner Production and Workplace Health and Safety: A combined approach. A case study from South Africa

    DEFF Research Database (Denmark)

    Hedlund, Frank Huess

    Environmental goals may be pursued narrow-mindedly with no attention paid to the workplace. This book examines combined approaches in cleaner production projects. It explores two main avenues. First, integration into the project specification. The planning tools in use by assistance agencies......, integration of management systems is an option. A study on the South African Nosa 5-Star system refutes earlier criticism of dismal performance of top-down systems. It is argued that integration at this level is viable. For small companies, less formalistic approaches are required. ILO's network concept WISE...

  8. Developing a New Biophysical Tool to Combine Magneto-Optical Tweezers with Super-Resolution Fluorescence Microscopy

    Directory of Open Access Journals (Sweden)

    Zhaokun Zhou

    2015-06-01

    Full Text Available We present a novel experimental setup in which magnetic and optical tweezers are combined for torque and force transduction onto single filamentous molecules, in a transverse configuration that allows simultaneous mechanical measurement and manipulation. Previously we developed a super-resolution imaging module which, in conjunction with advanced imaging techniques such as Blinking-assisted Localisation Microscopy (BaLM), achieves localisation precision of ~30 nm for single fluorescent dye molecules bound to DNA along the contour of the molecule; our work here describes developments in producing a system which combines tweezing and super-resolution fluorescence imaging. The instrument also features an acousto-optic deflector that temporally divides the laser beam to form multiple traps for high-throughput statistics collection. Our motivation for developing the new tool is to enable direct observation of detailed molecular topological transformation and protein binding event localisation in a stretching/twisting mechanical assay that could previously only be deduced indirectly from the end-to-end length variation of DNA. Our approach is simple and robust enough for reproduction in the lab without the requirement of precise hardware engineering, yet is capable of unveiling the elastic and dynamic properties of filamentous molecules that have been hidden using traditional tools.

  9. Modeling of Principal Flank Wear: An Empirical Approach Combining the Effect of Tool, Environment and Workpiece Hardness

    Science.gov (United States)

    Mia, Mozammel; Al Bashir, Mahmood; Dhar, Nikhil Ranjan

    2016-10-01

    Hard turning is increasingly employed in machining to replace the time-consuming sequence of conventional turning followed by grinding. An excessive amount of tool wear in hard turning is one of the main hurdles to be overcome. Many researchers have developed tool wear models, but most apply only to a particular work-tool-environment combination. No aggregate model has been developed that can predict the amount of principal flank wear for a specific machining time. An empirical model of principal flank wear (VB) has been developed for different workpiece hardnesses (HRC40, HRC48 and HRC56) in turning by coated carbide inserts with different configurations (SNMM and SNMG) under both dry and high-pressure coolant conditions. Unlike other models, this model includes dummy variables along with the base empirical equation to capture the effect of any change in the input conditions on the response. The base empirical equation for principal flank wear is formulated by fitting the Exponential Associate Function to the experimental results. The coefficient of each dummy variable reflects the shift of the response from one set of machining conditions to another and is determined by simple linear regression. The independent cutting parameters (speed, feed rate, depth of cut) are kept constant while formulating and analyzing the model. The developed model is validated against different sets of machining responses in turning hardened medium-carbon steel by coated carbide inserts. For any particular set, the model can predict the amount of principal flank wear for a specific machining time. Since the predicted results agree well with the experimental data and the average percentage error is below 10%, the model can be used to predict principal flank wear under the stated conditions.
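As a hedged illustration of the modeling idea only (not the paper's fitted model), an Exponential Associate base curve with a dummy-variable shift can be sketched as follows; every coefficient is invented:

```python
import math

# Hedged sketch: the abstract's base model is an "Exponential Associate
# Function" plus dummy-variable shifts; all coefficients below are invented.

def flank_wear(t, a=0.3, b=40.0, dummy=0.0, c=0.05):
    """Principal flank wear VB (mm) after t minutes of machining.

    a, b  : base-curve coefficients (wear asymptote and time constant)
    dummy : 0/1 indicator for a change of machining condition
    c     : response shift contributed when the dummy is active
    """
    return a * (1.0 - math.exp(-t / b)) + c * dummy

vb_base    = flank_wear(60)             # base machining condition
vb_shifted = flank_wear(60, dummy=1.0)  # same time, shifted condition
```

The dummy coefficient `c` plays the role described in the abstract: a constant offset that moves the whole wear curve from one machining condition to another, estimable by simple linear regression.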

  10. New user-friendly approach to obtain an Eisenberg plot and its use as a practical tool in protein sequence analysis.

    Science.gov (United States)

    Keller, Rob C A

    2011-01-01

    The Eisenberg plot, or hydrophobic moment plot, methodology is one of the most frequently used methods of bioinformatics. Bioinformatics is increasingly recognized as a helpful tool in the Life Sciences in general, and recent developments in approaches for recognizing lipid-binding regions in proteins are promising in this respect. In this study, a bioinformatics approach specialized in identifying lipid-binding helical regions in proteins was used to obtain an Eisenberg plot. The validity of the Heliquest-generated hydrophobic moment plot was checked and exemplified. This study indicates that the Eisenberg plot methodology can be transferred to another hydrophobicity scale, yielding a user-friendly approach that can be utilized in routine checks of protein-lipid interaction and in characterization studies of protein and peptide lipid binding. A combined approach seems to be advantageous and results in a powerful tool in the search for helical lipid-binding regions in proteins and peptides. The strengths and limitations of the Eisenberg plot approach itself are discussed as well. The presented approach not only leads to a better understanding of the nature of protein-lipid interactions but also provides a user-friendly tool for the search for lipid-binding regions in proteins and peptides.
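The quantity behind an Eisenberg plot can be computed directly: each sequence window is placed at (mean hydrophobicity, mean helical hydrophobic moment). The sketch below uses the standard 100° twist per residue for an alpha helix; the abridged hydrophobicity values and demo sequence are illustrative, not a full published scale:

```python
import math

# Minimal sketch of the hydrophobic moment underlying an Eisenberg plot.
# The abridged hydrophobicity values cover only the residues in the demo
# sequence; a real analysis would use a complete published scale.

HYDROPHOBICITY = {"A": 1.8, "L": 3.8, "K": -3.9, "F": 2.8, "E": -3.5}

def hydrophobic_moment(seq, delta=100.0):
    """Mean helical hydrophobic moment; delta = 100 degrees per residue."""
    sin_sum = sum(HYDROPHOBICITY[r] * math.sin(math.radians(i * delta))
                  for i, r in enumerate(seq))
    cos_sum = sum(HYDROPHOBICITY[r] * math.cos(math.radians(i * delta))
                  for i, r in enumerate(seq))
    return math.hypot(sin_sum, cos_sum) / len(seq)

def mean_hydrophobicity(seq):
    return sum(HYDROPHOBICITY[r] for r in seq) / len(seq)

seq = "LKALKEAF"
# One Eisenberg-plot point for this window: (mean H, mean hydrophobic moment).
print(round(mean_hydrophobicity(seq), 4), round(hydrophobic_moment(seq), 4))
```

A high moment at moderate mean hydrophobicity is the classic signature of an amphipathic, potentially lipid-binding helix.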

  11. Manufacture of functional surfaces through combined application of tool manufacturing processes and Robot Assisted Polishing

    DEFF Research Database (Denmark)

    Eriksen, Rasmus Solmer; Arentoft, Mogens; Grønbæk, J.

    2012-01-01

    The tool surface topography is often a key parameter in the tribological performance of modern metal forming tools. A new generation of multifunctional surfaces is achieved by combination of conventional tool manufacturing processes with a novel Robot Assisted Polishing process. This novel surface...

  12. Setting research priorities by applying the combined approach matrix.

    Science.gov (United States)

    Ghaffar, Abdul

    2009-04-01

    Priority setting in health research is a dynamic process. Different organizations and institutes have been working in the field of research priority setting for many years. In 1999 the Global Forum for Health Research presented a research priority setting tool called the Combined Approach Matrix, or CAM. Since its development, the CAM has been successfully applied to set research priorities for diseases, conditions and programmes at global, regional and national levels. This paper briefly explains the CAM methodology and how it can be applied in different settings, giving examples, describing challenges encountered in the process of setting research priorities, and providing recommendations for further work in this field. The construct and design of the CAM are explained, along with the different steps needed, including the planning and organization of a priority-setting exercise. The application of the CAM is described using three examples. The first concerns setting research priorities for a global programme, the second describes application at the country level, and the third setting research priorities for diseases. Effective application of the CAM in different and diverse environments proves its utility as a tool for setting research priorities. Potential challenges encountered in the process of research priority setting are discussed and some recommendations for further work in this field are provided.

  13. An Approach to Flooding Inundation Combining the Streamflow Prediction Tool (SPT) and Downscaled Soil Moisture

    Science.gov (United States)

    Cotterman, K. A.; Follum, M. L.; Pradhan, N. R.; Niemann, J. D.

    2017-12-01

    Flooding impacts numerous aspects of society, from localized flash floods to continental-scale flood events. Many numerical flood models focus solely on riverine flooding, with some capable of capturing both localized and continental-scale flood events. However, these models neglect flooding away from channels caused by excessive ponding, typically found in areas with flat terrain and poorly draining soils. In order to obtain a holistic view of flooding, we combine flood results from the Streamflow Prediction Tool (SPT), a riverine flood model, with soil moisture downscaling techniques to determine whether a better representation of flooding is obtained. This allows a more holistic understanding of potentially flood-prone areas, increasing the opportunity for more accurate warnings and evacuations during flooding conditions. Thirty-five years of near-global historical streamflow are reconstructed with continental-scale flow routing of runoff from global land surface models. Elevation data were also obtained worldwide to establish a relationship between topographic attributes and soil moisture patterns. The derived soil moisture data are validated against observed soil moisture, increasing confidence in the ability to accurately capture soil moisture patterns. Potential flooding situations can be examined worldwide, with this study focusing on the United States, Central America, and the Philippines.

  14. PAH plant uptake prediction: Evaluation of combined availability tools and modeling approach

    OpenAIRE

    Ouvrard, Stéphanie; DUPUY, Joan; Leglize, Pierre; Sterckeman, Thibault

    2015-01-01

    Transfer to plants is one of the main human exposure pathways to polycyclic aromatic hydrocarbons (PAHs) from contaminated soils. However, existing models implemented in risk assessment tools mostly rely on i) total contaminant concentration and ii) plant uptake models based on hydroponic experiments established with pesticides (Briggs et al., 1982, 1983). Total concentrations of soil contaminants are useful for indicating pollution, but they do not necessarily indicate risk. Me...

  15. Prediction of a service demand using combined forecasting approach

    Science.gov (United States)

    Zhou, Ling

    2017-08-01

    Forecasting helps a logistics service provider cut operational and management costs while ensuring service level. Our case study investigates how to forecast short-term logistics demand for an LTL carrier. A combined approach relies on several forecasting methods simultaneously instead of a single method: it can offset the weakness of one method with the strength of another, improving prediction accuracy. The main issues in combined forecast modeling are how to select the methods to combine and how to determine the weight coefficients among them. The principles of method selection are that each method should suit the forecasting problem itself, and that the methods should differ in categorical features as much as possible. Based on these principles, exponential smoothing, ARIMA and a neural network are chosen to form the combined approach. The least-squares technique is then employed to determine the optimal weight coefficients among the forecasting methods. Simulation results show the advantage of the combined approach over the three single methods. The work done in the paper helps managers select a prediction method in practice.
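The least-squares weighting step can be sketched as follows. For brevity the sketch combines only two methods (stand-ins for, say, exponential smoothing and ARIMA) on invented data; the paper combines three methods in the same manner:

```python
# Hedged sketch of least-squares forecast combination: given past forecasts
# from two methods and the realized demand, solve the 2x2 normal equations
# for the combination weights. All numbers are invented for illustration.

def combination_weights(f1, f2, y):
    """Unconstrained least-squares weights minimizing sum((w1*f1 + w2*f2 - y)^2)."""
    s11 = sum(a * a for a in f1)
    s22 = sum(b * b for b in f2)
    s12 = sum(a * b for a, b in zip(f1, f2))
    r1 = sum(a * c for a, c in zip(f1, y))
    r2 = sum(b * c for b, c in zip(f2, y))
    det = s11 * s22 - s12 * s12
    return (s22 * r1 - s12 * r2) / det, (s11 * r2 - s12 * r1) / det

f1 = [102.0, 98.0, 105.0, 110.0]   # method-1 forecasts
f2 = [96.0, 101.0, 99.0, 108.0]    # method-2 forecasts
y  = [100.0, 99.0, 103.0, 109.0]   # realized demand

w1, w2 = combination_weights(f1, f2, y)
combined = [w1 * a + w2 * b for a, b in zip(f1, f2)]
```

By construction, the in-sample squared error of the combined forecast can never exceed that of either single method, which is the rationale the abstract gives for combining.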

  16. Transbasal versus endoscopic endonasal versus combined approaches for olfactory groove meningiomas: importance of approach selection.

    Science.gov (United States)

    Liu, James K; Silva, Nicole A; Sevak, Ilesha A; Eloy, Jean Anderson

    2018-04-01

    OBJECTIVE There has been much debate regarding the optimal surgical approach for resecting olfactory groove meningiomas (OGMs). In this paper, the authors analyzed the factors involved in approach selection and reviewed the surgical outcomes in a series of OGMs. METHODS A retrospective review of 28 consecutive OGMs from a prospective database was conducted. Each tumor was treated via one of 3 approaches: transbasal approach (n = 15), pure endoscopic endonasal approach (EEA; n = 5), and combined (endoscope-assisted) transbasal-EEA (n = 8). RESULTS The mean tumor volume was greatest in the transbasal (92.02 cm³) and combined (101.15 cm³) groups. Both groups had significant lateral dural extension over the orbits (transbasal 73.3%). Resection of more than 95% of the tumor was achieved in 20% of transbasal and 37.5% of combined cases, all due to tumor adherence to the critical neurovascular structures. The rate of CSF leakage was 0% in the transbasal and combined groups, and there was 1 leak in the EEA group (20%), resulting in an overall CSF leakage rate of 3.6%. Olfaction was preserved in 66.7% in the transbasal group. There was no significant difference in length of stay or 30-day readmission rate between the 3 groups. The mean modified Rankin Scale score was 0.79 after the transbasal approach, 2.0 after EEA, and 2.4 after the combined approach (p = 0.0604). The mean follow-up was 14.5 months (range 1-76 months). CONCLUSIONS The transbasal approach provided the best clinical outcomes with the lowest rate of complications for large tumors (> 40 mm) and for smaller tumors invading the sinonasal cavity. Careful patient selection using an individualized, tailored strategy is important to optimize surgical outcomes.

  17. Combining accounting approaches to practice valuation.

    Science.gov (United States)

    Schwartzben, D; Finkler, S A

    1998-06-01

    Healthcare organizations that wish to acquire physician or ambulatory care practices can choose from a variety of practice valuation approaches. Basic accounting methods assess the value of a physician practice on the basis of a historical, balance-sheet description of tangible assets. Yet these methods alone are inadequate to determine the true financial value of a practice. By using a combination of accounting approaches to practice valuation that consider factors such as fair market value, opportunity cost, and discounted cash flow over a defined time period, organizations can more accurately assess a practice's actual value.
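The discounted-cash-flow component mentioned above reduces to a one-line present-value calculation; the projected cash flows, horizon and discount rate below are invented for illustration:

```python
# Toy discounted-cash-flow valuation of a practice over a defined time
# period; the cash flows and the discount rate are invented.

def discounted_cash_flow(cash_flows, rate):
    """Present value of year-end cash flows at a constant discount rate."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

practice_value = discounted_cash_flow([120_000, 125_000, 130_000], rate=0.08)
```

A full valuation, as the abstract notes, would weigh this figure against fair market value and opportunity cost rather than use it alone.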

  18. Approach and tool for computer animation of fields in electrical apparatus

    International Nuclear Information System (INIS)

    Miltchev, Radoslav; Yatchev, Ivan S.; Ritchie, Ewen

    2002-01-01

    The paper presents a technical approach and post-processing tool for creating and displaying computer animation. The approach enables the handling of two- and three-dimensional physical field results obtained from finite element software, or the display of movement processes in electrical apparatus simulations. The main goal of this work is to extend the auxiliary features built into general-purpose CAD software working in the Windows environment. Different storage techniques were examined and the one employing image capturing was chosen. The developed tool provides the benefits of independent visualisation, the creation of scenarios, and facilities for exporting animations in common file formats for distribution on different computer platforms. It also provides a valuable educational tool. (Author)

  19. Quantifying the Metabolome of Pseudomonas taiwanensis VLB120: Evaluation of Hot and Cold Combined Quenching/Extraction Approaches

    DEFF Research Database (Denmark)

    Wordofa, Gossa Garedew; Kristensen, Mette; Schrübbers, Lars

    2017-01-01

    Absolute quantification of free intracellular metabolites is a valuable tool in both pathway discovery and metabolic engineering. In this study, we conducted a comprehensive examination of different hot and cold combined quenching/extraction approaches to extract and quantify intracellular...... (such as cold methanol/acetonitrile/water, hot water, and boiling ethanol/water, as well as cold ethanol/water) were tested and evaluated for P. taiwanensis VLB120 metabolome analysis. In total 94 out of 107 detected intracellular metabolites were quantified using an isotope-ratio-based approach....... The quantified metabolites include amino acids, nucleotides, central carbon metabolism intermediates, redox cofactors, and others. The acquired data demonstrate that the pressure driven fast filtration approach followed by boiling ethanol quenching/extraction is the most adequate technique for P. taiwanensis VLB...

  20. A combined MOIP-MCDA approach to building and screening atmospheric pollution control strategies in urban regions.

    Science.gov (United States)

    Mavrotas, George; Ziomas, Ioannis C; Diakouaki, Danae

    2006-07-01

    This article presents a methodological approach for the formulation of control strategies capable of reducing atmospheric pollution at the standards set by European legislation. The approach was implemented in the greater area of Thessaloniki and was part of a project aiming at the compliance with air quality standards in five major cities in Greece. The methodological approach comprises two stages: in the first stage, the availability of several measures contributing to a certain extent to reducing atmospheric pollution indicates a combinatorial problem and favors the use of Integer Programming. More specifically, Multiple Objective Integer Programming is used in order to generate alternative efficient combinations of the available policy measures on the basis of two conflicting objectives: public expenditure minimization and social acceptance maximization. In the second stage, these combinations of control measures (i.e., the control strategies) are then comparatively evaluated with respect to a wider set of criteria, using tools from Multiple Criteria Decision Analysis, namely, the well-known PROMETHEE method. The whole procedure is based on the active involvement of local and central authorities in order to incorporate their concerns and preferences, as well as to secure the adoption and implementation of the resulting solution.
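The second-stage PROMETHEE ranking can be illustrated with a toy net-flow computation. The strategies, criterion scores and weights below are hypothetical, and the simplest "usual" preference function (1 if strictly better, else 0) stands in for the generalized criteria used in practice:

```python
# Toy PROMETHEE-II net-flow ranking of three hypothetical control strategies.
# Criterion scores (higher = better) and weights are invented; the "usual"
# preference function is used for brevity.

def net_flows(scores, weights):
    """Net outranking flow phi(a) = mean over b of (preference(a,b) - preference(b,a))."""
    alts = list(scores)
    phi = {}
    for a in alts:
        flow = 0.0
        for b in alts:
            if a == b:
                continue
            pref_ab = sum(w for w, x, y in zip(weights, scores[a], scores[b]) if x > y)
            pref_ba = sum(w for w, x, y in zip(weights, scores[b], scores[a]) if x > y)
            flow += pref_ab - pref_ba
        phi[a] = flow / (len(alts) - 1)
    return phi

scores = {"S1": [0.9, 0.4, 0.7], "S2": [0.5, 0.9, 0.5], "S3": [0.6, 0.6, 0.6]}
weights = [0.5, 0.3, 0.2]   # e.g. expenditure, acceptance, effectiveness
phi = net_flows(scores, weights)
ranking = sorted(phi, key=phi.get, reverse=True)
print(ranking)  # ['S1', 'S3', 'S2']
```

In the approach described above, the alternatives fed to this ranking step would be the efficient measure combinations generated by the first-stage Multiple Objective Integer Programming model.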

  2. A Holistic Approach to Interdisciplinary Innovation Supported by a Simple Tool

    DEFF Research Database (Denmark)

    Denise J. Stokholm, Marianne

    2008-01-01

    Innovation is recognised as a strategy to achieve competitive businesses and products. Managing innovation at all levels requires integration of knowledge and interdisciplinary cooperation. Different understandings and approaches to innovation between professions often result in communication...... problems. To overcome this barrier a common ground is needed. This paper describes a holistic approach to innovation and presents a simple tool for facilitating cooperation on a diversity of innovation matters. It describes the development and use of the tool and demonstrates its capacity to support...

  3. Combining multi-criteria decision analysis and mini-health technology assessment: A funding decision-support tool for medical devices in a university hospital setting.

    Science.gov (United States)

    Martelli, Nicolas; Hansen, Paul; van den Brink, Hélène; Boudard, Aurélie; Cordonnier, Anne-Laure; Devaux, Capucine; Pineau, Judith; Prognon, Patrice; Borget, Isabelle

    2016-02-01

    At the hospital level, decisions about purchasing new and oftentimes expensive medical devices must take into account multiple criteria simultaneously. Multi-criteria decision analysis (MCDA) is increasingly used for health technology assessment (HTA). One of the most successful hospital-based HTA approaches is mini-HTA, of which a notable example is the Matrix4value model. To develop a funding decision-support tool combining MCDA and mini-HTA, based on Matrix4value, suitable for medical devices for individual patient use in French university hospitals - known as the IDA tool, short for 'innovative device assessment'. Criteria for assessing medical devices were identified from a literature review and a survey of 18 French university hospitals. Weights for the criteria, representing their relative importance, were derived from a survey of 25 members of a medical devices committee using an elicitation technique involving pairwise comparisons. As a test of its usefulness, the IDA tool was applied to two new drug-eluting beads (DEBs) for transcatheter arterial chemoembolization. The IDA tool comprises five criteria and weights for each of two over-arching categories: risk and value. The tool revealed that the two new DEBs conferred no additional value relative to DEBs currently available. Feedback from participating decision-makers about the IDA tool was very positive. The tool could help to promote a more structured and transparent approach to HTA decision-making in French university hospitals. Copyright © 2015 Elsevier Inc. All rights reserved.

  4. Obtaining Global Picture From Single Point Observations by Combining Data Assimilation and Machine Learning Tools

    Science.gov (United States)

    Shprits, Y.; Zhelavskaya, I. S.; Kellerman, A. C.; Spasojevic, M.; Kondrashov, D. A.; Ghil, M.; Aseev, N.; Castillo Tibocha, A. M.; Cervantes Villa, J. S.; Kletzing, C.; Kurth, W. S.

    2017-12-01

    Increasing volumes of satellite measurements require the deployment of new tools that can utilize such vast amounts of data. Satellite measurements are usually limited to a single location in space, which complicates data analysis geared towards reproducing the global state of the space environment. In this study we show how measurements can be combined by means of data assimilation, and how machine learning can help analyze large amounts of data and develop global models trained on single-point measurements. Data assimilation: Manual analysis of satellite measurements is a challenging task, while automated analysis is complicated by the fact that measurements are given at various locations in space, have different instrumental errors, and often vary by orders of magnitude. We show results of the long-term reanalysis of radiation belt measurements along with fully operational real-time predictions using the data-assimilative VERB code. Machine learning: We present an application of machine learning tools to the analysis of NASA Van Allen Probes upper-hybrid frequency measurements. Using the obtained data set we train a new global predictive neural network. The results for the Van Allen Probes based neural network are compared with historical IMAGE satellite observations. We also show examples of predictions of geomagnetic indices using neural networks. Combination of machine learning and data assimilation: We discuss how data assimilation tools and machine learning tools can be combined, so that physics-based insight into the dynamics of a particular system is joined with empirical knowledge of its non-linear behavior.
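The core data assimilation step, blending a model forecast with a single-point observation according to their error variances, reduces to the scalar Kalman update sketched below. The numbers are illustrative; a reanalysis such as the one described would apply a full Kalman filter over the whole radiation belt state:

```python
# Minimal scalar Kalman update: the basic data assimilation ingredient that
# blends a model forecast with a single-point observation according to their
# error variances. All numbers are illustrative.

def kalman_update(x_forecast, var_forecast, obs, var_obs):
    gain = var_forecast / (var_forecast + var_obs)       # Kalman gain
    x_analysis = x_forecast + gain * (obs - x_forecast)  # corrected state
    var_analysis = (1.0 - gain) * var_forecast           # reduced uncertainty
    return x_analysis, var_analysis

# Equal variances -> the analysis sits halfway between forecast and observation.
x, v = kalman_update(x_forecast=2.0, var_forecast=1.0, obs=3.0, var_obs=1.0)
print(x, v)  # 2.5 0.5
```

The gain makes the trade-off explicit: the noisier the observation relative to the model, the less the single-point measurement pulls the global analysis.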

  5. Hybrid Design Tools Intuit Interaction

    NARCIS (Netherlands)

    Wendrich, Robert E.; Kyvsgaard Hansen, P.; Rasmussen, J.; Jorgensen, K.A.; Tollestrup, C.

    2012-01-01

    Non-linear, non-explicit, non-standard thinking and ambiguity in design tools have a great impact on the enhancement of creativity during ideation and conceptualization. Tacit-tangible representation, based on a mere idiosyncratic and individual approach combined with computational assistance, allows the

  6. Surgical treatment of traumatic cervical facet dislocation: anterior, posterior or combined approaches?

    Directory of Open Access Journals (Sweden)

    Catarina C. Lins

    Full Text Available ABSTRACT Surgical treatment is well accepted for patients with traumatic cervical facet joint dislocations (CFD), but there is uncertainty over which approach is better: anterior, posterior or combined. We performed a systematic literature review to evaluate the indications for anterior and posterior approaches in the management of CFD. Anterior approaches can restore cervical lordosis, and cause less postoperative pain and fewer wound problems. Posterior approaches are useful for direct reduction of locked facet joints and provide stronger fixation from a biomechanical point of view. Combined approaches can be used in more complex cases. Although both anterior and posterior approaches can be used interchangeably, some patients may benefit from one over the other, as discussed in this review. Surgeons who treat cervical spine trauma should be able to perform both procedures, as well as combined approaches, to adequately manage CFD and improve patients’ final outcomes.

  7. Bioequivalence evaluation of two brands of amoxicillin/clavulanic acid 250/125 mg combination tablets in healthy human volunteers: use of replicate design approach.

    Science.gov (United States)

    Idkaidek, Nasir M; Al-Ghazawi, Ahmad; Najib, Naji M

    2004-12-01

    The purpose of this study was to apply a replicate design approach to a bioequivalence study of amoxicillin/clavulanic acid combination following a 250/125 mg oral dose to 23 subjects, and to compare the analysis of individual bioequivalence with average bioequivalence. This was conducted as a 2-treatment 2-sequence 4-period crossover study. Average bioequivalence was shown, while the results from the individual bioequivalence approach had no success in showing bioequivalence. In conclusion, the individual bioequivalence approach is a strong statistical tool to test for intra-subject variances and also subject-by-formulation interaction variance compared with the average bioequivalence approach. copyright (c) 2004 John Wiley & Sons, Ltd.
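The average-bioequivalence criterion referenced here can be sketched numerically: the 90% confidence interval of the test/reference geometric mean ratio must lie within 0.80-1.25. The sketch substitutes a normal quantile for the usual t-quantile and uses invented per-subject AUC ratios:

```python
import math
import statistics

# Hedged sketch of average bioequivalence: the 90% CI of the test/reference
# geometric mean ratio must fall within 0.80-1.25. A normal quantile stands
# in for the usual t-quantile, and the per-subject ratios are invented.

def average_bioequivalence(ratios, alpha=0.10):
    logs = [math.log(r) for r in ratios]       # analysis on the log scale
    mean = statistics.fmean(logs)
    se = statistics.stdev(logs) / math.sqrt(len(logs))
    z = statistics.NormalDist().inv_cdf(1.0 - alpha / 2.0)
    lo, hi = math.exp(mean - z * se), math.exp(mean + z * se)
    return lo, hi, 0.80 <= lo and hi <= 1.25

ratios = [0.95, 1.02, 0.98, 1.05, 0.99, 1.01, 0.97, 1.03]  # AUC_test / AUC_ref
lo, hi, bioequivalent = average_bioequivalence(ratios)
```

Individual bioequivalence, by contrast, additionally compares intra-subject variances and the subject-by-formulation interaction, which is why it is the stricter test highlighted in the abstract.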

  8. Constructed Wetlands for Combined Sewer Overflow Treatment—Comparison of German, French and Italian Approaches

    Directory of Open Access Journals (Sweden)

    Daniel Meyer

    2012-12-01

    Full Text Available Combined sewer systems are designed to transport stormwater surface runoff in addition to dry-weather flows, up to defined limits. In most European countries, hydraulic loads greater than the design flow are discharged directly into receiving water bodies with minimal treatment (screening, sedimentation) or with no treatment at all. One feasible solution to protect receiving waters from strong negative impacts appears to be the application of vertical-flow constructed wetlands. In Germany, the first attempts to use this ecological technology were made in the early 1990s. Since then, development continued until a high level of treatment performance was reached. In recent years the national "state of the art" (defined in 2005) was adapted in other European countries, including France and Italy. Against the background of differing national requirements in combined sewer system design, substantial developmental steps were taken. The use of coarser filter media, in combination with alternating loading of separated filter beds, allows direct feeding with untreated combined runoff. Permanent water storage in deep layers of the wetland improves the system's robustness against extended dry periods, but carries operational risks. Besides similar functions (but different designs and layouts), the correct dimensioning of all approaches suffers from uncertainties in long-term rainfall predictions as well as in sewer system simulation tools.

  9. A combined reaction class approach with integrated molecular orbital+molecular orbital (IMOMO) methodology: A practical tool for kinetic modeling

    International Nuclear Information System (INIS)

    Truong, Thanh N.; Maity, Dilip K.; Truong, Thanh-Thai T.

    2000-01-01

    We present a new practical computational methodology for predicting thermal rate constants of reactions involving large molecules or a large number of elementary reactions in the same class. This methodology combines the integrated molecular orbital+molecular orbital (IMOMO) approach with our recently proposed reaction class models for tunneling. With the new methodology, we show that it is possible to significantly reduce the computational cost by several orders of magnitude while compromising the accuracy in the predicted rate constants by less than 40% over a wide range of temperatures. Another important result is that the computational cost increases only slightly as the system size increases. (c) 2000 American Institute of Physics

  10. Combining web-based tools for transparent evaluation of data for risk assessment: developmental effects of bisphenol A on the mammary gland as a case study.

    Science.gov (United States)

    Molander, Linda; Hanberg, Annika; Rudén, Christina; Ågerstrand, Marlene; Beronius, Anna

    2017-03-01

    Different tools have been developed that facilitate systematic and transparent evaluation and handling of toxicity data in the risk assessment process. The present paper sets out to explore the combined use of two web-based tools for study evaluation and identification of reliable data relevant to health risk assessment. For this purpose, a case study was performed using in vivo toxicity studies investigating low-dose effects of bisphenol A (BPA) on mammary gland development. The reliability of the mammary gland studies was evaluated using the Science in Risk Assessment and Policy (SciRAP) criteria for toxicity studies. The Health Assessment Workspace Collaborative (HAWC) was used for characterizing and visualizing the mammary gland data in terms of the type of effects investigated and reported, and the distribution of these effects within the dose interval. It was then investigated whether there was any relationship between study reliability and the type of effects reported and/or their distribution in the dose interval. The combination of the SciRAP and HAWC tools allowed for transparent evaluation and visualization of the studies investigating developmental effects of BPA on the mammary gland. The use of these tools showed that there were no apparent differences in the type of effects and their distribution in the dose interval between the five studies assessed as most reliable and the whole data set. Combining the SciRAP and HAWC tools was found to be a useful approach for evaluating in vivo toxicity studies and identifying reliable and sensitive information relevant to regulatory risk assessment of chemicals. Copyright © 2016 John Wiley & Sons, Ltd.

  11. Liquid-phase microextraction approaches combined with atomic detection: A critical review

    International Nuclear Information System (INIS)

    Pena-Pereira, Francisco; Lavilla, Isela; Bendicho, Carlos

    2010-01-01

    Liquid-phase microextraction (LPME) displays unique characteristics such as excellent preconcentration capability, simplicity, low cost, sample cleanup and integration of steps. Even though LPME approaches have the potential to be combined with almost every analytical technique, their use in combination with atomic detection techniques has not been exploited until recently. A comprehensive review dealing with the applications of liquid-phase microextraction combined with atomic detection techniques is presented. Theoretical features, possible strategies for these combinations as well as the effect of key experimental parameters influencing method development are addressed. Finally, a critical comparison of the different LPME approaches in terms of enrichment factors achieved, extraction efficiency, precision, selectivity and simplicity of operation is provided.

  12. MediPlEx - a tool to combine in silico & experimental gene expression profiles of the model legume Medicago truncatula

    Directory of Open Access Journals (Sweden)

    Stutz Leonhard J

    2010-10-01

    Background Expressed Sequence Tags (ESTs) are in general used to gain a first insight into gene activities from a species of interest. Subsequently, and typically based on a combination of EST and genome sequences, microarray-based expression analyses are performed for a variety of conditions. In some cases, a multitude of EST and microarray experiments are conducted for one species, covering different tissues, cell states, and cell types. Under these circumstances, the challenge arises to combine results derived from the different expression profiling strategies, with the goal of uncovering novel information on the basis of the integrated datasets. Findings Using our new analysis tool, MediPlEx (MEDIcago truncatula multiPLe EXpression analysis), expression data from EST experiments, oligonucleotide microarrays and Affymetrix GeneChips® can be combined and analyzed, leading to a novel approach to integrated transcriptome analysis. We have validated our tool via the identification of a set of well-characterized AM-specific and AM-induced marker genes, identified by MediPlEx on the basis of in silico and experimental gene expression profiles from roots colonized with AM fungi. Conclusions MediPlEx offers an integrated analysis pipeline for different sets of expression data generated for the model legume Medicago truncatula. As expected, in silico and experimental gene expression data that cover the same biological condition correlate well. The collection of differentially expressed genes identified via MediPlEx provides a starting point for functional studies in plant mutants. MediPlEx can be freely used at http://www.cebitec.uni-bielefeld.de/mediplex.

  13. Experimental and analytical combined thermal approach for local tribological understanding in metal cutting

    International Nuclear Information System (INIS)

    Artozoul, Julien; Lescalier, Christophe; Dudzinski, Daniel

    2015-01-01

    Metal cutting is a highly complex thermo-mechanical process. Knowledge of the temperature in the chip forming zone is essential to understand it. Conventional experimental methods such as thermocouples only provide global information which is incompatible with the high stress and temperature gradients met in the chip forming zone. Field measurements are essential to understand the localized thermo-mechanical problem. An experimental protocol has been developed using advanced infrared imaging in order to measure the temperature distribution in both the tool and the chip during an orthogonal or oblique cutting operation. It also provides information on the chip formation process, such as geometrical characteristics (tool-chip contact length, chip thickness, primary shear angle) and thermo-mechanical quantities (heat flux dissipated in the deformation zone, local interface heat partition ratio). A study is carried out on the effects of cutting conditions, i.e. cutting speed, feed and depth of cut, on the temperature distribution along the contact zone for an elementary operation. An analytical thermal model has been developed to process the experimental data and derive further information, i.e. local stress or heat flux distributions. - Highlights: • A thermal analytical model is proposed for orthogonal cutting process. • IR thermography is used during cutting tests. • Combined experimental and modeling approaches are applied. • Heat flux and stress distribution at the tool-chip interface are determined. • The decomposition into sticking and sliding zones is defined.

  14. Combined Interhemispheric and Transsylvian Approach for Resection of Craniopharyngioma.

    Science.gov (United States)

    Inoue, Tomohiro; Ono, Hideaki; Tamura, Akira; Saito, Isamu

    2018-04-01

    We present the case of a 37-year-old man with a huge cystic suprasellar craniopharyngioma, who presented with significant memory disturbance due to obstructive hydrocephalus. A combined interhemispheric and pterional approach was chosen to resect the huge suprasellar tumor. The interhemispheric trans-lamina terminalis approach was quite effective for resecting the third ventricular part of the tumor, while the pterional approach was useful for dissecting the tumor off the basilar perforators and the stalk. The link to the video can be found at: https://youtu.be/BoYIPa96kdo.

  15. Predicting prices of agricultural commodities in Thailand using combined approach emphasizing on data pre-processing technique

    Directory of Open Access Journals (Sweden)

    Thoranin Sujjaviriyasup

    2018-02-01

    In this research, a combined approach emphasizing a data pre-processing technique is developed to forecast prices of agricultural commodities in Thailand. Future prices play a significant role in decisions about which crops to cultivate in the following year. The proposed model exploits the ability of MODWT to decompose the original time series into more stable and explicit subseries, and an SVR model to formulate the complex forecasting function. The experimental results indicate that the proposed model outperforms traditional forecasting models on the MAE and MAPE criteria. Furthermore, the results show that the proposed model can serve as a useful forecasting tool for prices of agricultural commodities in Thailand.
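    As a rough sketch of the decompose-then-forecast idea (not the authors' implementation: the transform depth, lag window, and SVR settings below are illustrative assumptions), a Haar à trous decomposition stands in for MODWT, and one scikit-learn SVR is fitted per subseries:

```python
import numpy as np
from sklearn.svm import SVR

def haar_modwt(x, levels=2):
    """Additive Haar a-trous decomposition (a simple stand-in for MODWT):
    returns `levels` detail subseries plus the final smooth, whose sum
    reconstructs the input exactly. Circular boundaries for brevity."""
    smooth = np.asarray(x, dtype=float)
    parts = []
    for j in range(levels):
        shifted = np.roll(smooth, 2 ** j)
        parts.append((smooth - shifted) / 2.0)  # detail at level j+1
        smooth = (smooth + shifted) / 2.0       # smoother approximation
    parts.append(smooth)
    return parts

def lagged(series, n_lags):
    """Sliding-window design matrix: predict series[t] from the n_lags values before it."""
    X = np.column_stack([series[i:len(series) - n_lags + i] for i in range(n_lags)])
    return X, series[n_lags:]

def forecast_next(series, levels=2, n_lags=3):
    """One-step-ahead forecast: fit one SVR per subseries and sum the parts."""
    total = 0.0
    for sub in haar_modwt(series, levels):
        X, y = lagged(sub, n_lags)
        model = SVR(kernel="rbf", C=10.0).fit(X, y)
        total += model.predict(sub[-n_lags:].reshape(1, -1))[0]
    return total
```

    Because the subseries sum back to the original series, forecasting each one separately and adding the forecasts is a coherent way to recombine the decomposed parts.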

  16. Combining Upper Limb Robotic Rehabilitation with Other Therapeutic Approaches after Stroke: Current Status, Rationale, and Challenges

    Directory of Open Access Journals (Sweden)

    Stefano Mazzoleni

    2017-01-01

    A better understanding of the neural substrates that underlie motor recovery after stroke has led to the development of innovative rehabilitation strategies and tools that incorporate key elements of motor skill relearning, that is, intensive motor training involving goal-oriented repeated movements. Robotic devices for the upper limb are increasingly used in rehabilitation. Studies have demonstrated the effectiveness of these devices in reducing motor impairments, but less so for the improvement of upper limb function. Other studies have begun to investigate the benefits of combined approaches that target muscle function (functional electrical stimulation and botulinum toxin injections), modulate neural activity (noninvasive brain stimulation), and enhance motivation (virtual reality) in an attempt to potentiate the benefits of robot-mediated training. The aim of this paper is to overview the current status of such combined treatments and to analyze the rationale behind them.

  17. Combining medical informatics and bioinformatics toward tools for personalized medicine.

    Science.gov (United States)

    Sarachan, B D; Simmons, M K; Subramanian, P; Temkin, J M

    2003-01-01

    Key bioinformatics and medical informatics research areas need to be identified to advance knowledge and understanding of disease risk factors and molecular disease pathology in the 21st century toward new diagnoses, prognoses, and treatments. Three high-impact informatics areas are identified: predictive medicine (to identify significant correlations within clinical data using statistical and artificial intelligence methods), along with pathway informatics and cellular simulations, which combine biological knowledge with advanced informatics to elucidate molecular disease pathology. Initial predictive models have been developed for a pilot study in Huntington's disease. An initial bioinformatics platform has been developed for the reconstruction and analysis of pathways, and work has begun on pathway simulation. A bioinformatics research program has been established at GE Global Research Center as an important technology toward next generation medical diagnostics. We anticipate that 21st century medical research will be a combination of informatics tools with traditional biology wet lab research, and that this will translate to increased use of informatics techniques in the clinic.

  18. An approach to combine radar and gauge based rainfall data under consideration of their qualities in low mountain ranges of Saxony

    Directory of Open Access Journals (Sweden)

    N. Jatho

    2010-03-01

    An approach to combining gauge and radar data with additional quality information is presented. Development focused on improving the diagnosis of temporally (one hour) and spatially (1×1 km²) highly resolved precipitation data. The method is embedded in an online tool and was applied to the target area of Saxony, Germany. The aim of the tool is to provide accurate spatial rainfall estimates. The results can be used for rainfall-runoff modelling, e.g. in a flood management system.

    Quality information allows a better assessment of the input data and of the resulting precipitation field. It is stored in corresponding quality fields that represent the static and dynamic uncertainties of radar and gauge data. The objective combination of the various precipitation and quality fields is realised using a cost function.

    The findings of cross validation reveal that the proposed combination method merged the benefits and disadvantages of interpolated gauge and radar data and leads to mean estimates. The sampling point validation implies that the presented method slightly overestimated the areal rain as well as high rain intensities in the case of convective and advective events, where the pure interpolation method performed better. In general, the use of the presented cost function avoids false rainfall amounts in areas of low input data quality and improves reliability in areas of high data quality. Notably, the combined product retains the small-scale variability of the radar, which is seen as the important benefit of the presented combination approach. Local improvements of the final rain field are possible by considering gauges that were not used for radar calibration, e.g. in topographically distinct regions.
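    The quality-weighted merging step can be illustrated with a minimal sketch; the paper's actual cost function is more elaborate, and the per-cell quadratic cost below is an assumption for illustration. Minimising q_r·(p − p_radar)² + q_g·(p − p_gauge)² in each cell yields a quality-weighted mean of the two fields:

```python
import numpy as np

def combine_fields(radar, gauge, q_radar, q_gauge, eps=1e-9):
    """Per-cell combination of radar and gauge precipitation fields.
    Minimizing q_r*(p - radar)^2 + q_g*(p - gauge)^2 in each cell gives
    a quality-weighted mean; low-quality inputs contribute less."""
    w = q_gauge / (q_gauge + q_radar + eps)
    return w * gauge + (1.0 - w) * radar
```

    With equal quality fields this reduces to a plain average; where radar quality drops (e.g. due to beam blockage), the estimate moves toward the interpolated gauge value.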

  19. Combined Hydrologic (AGWA-KINEROS2) and Hydraulic (HEC2) Modeling for Post-Fire Runoff and Inundation Risk Assessment through a Set of Python Tools

    Science.gov (United States)

    Barlow, J. E.; Goodrich, D. C.; Guertin, D. P.; Burns, I. S.

    2016-12-01

    Wildfires in the Western United States can alter landscapes by removing vegetation and changing soil properties. These altered landscapes produce more runoff than pre-fire landscapes, which can lead to post-fire flooding that damages infrastructure and impairs natural resources. Resources, structures, historical artifacts, and other assets that could be impacted by increased runoff are considered values at risk. The Automated Geospatial Watershed Assessment tool (AGWA) allows users to quickly set up and execute the Kinematic Runoff and Erosion model (KINEROS2 or K2) in the ESRI ArcMap environment. The AGWA-K2 workflow leverages the visualization capabilities of GIS to facilitate rapid watershed assessments for post-fire planning efforts. A high relative change in peak discharge, as simulated by K2, provides a visual and numeric indicator of the channels in the watershed that should be evaluated in more detail, especially if values at risk are within or near those channels. Modeling inundation extent along a channel would provide more specific guidance about risk along that channel. HEC-2 and HEC-RAS can be used for hydraulic modeling efforts at the reach and river-system scale. These models have been used to delineate flood boundaries and, accordingly, flood risk. However, data collection and organization for hydraulic models can be time consuming, and therefore a combined hydrologic-hydraulic modeling approach is not often employed for rapid assessments. A simplified approach could streamline this process and provide managers with a simple workflow and tool to perform a quick risk assessment for a single reach. By focusing on a single reach highlighted by a large relative change in peak discharge, data collection efforts can be minimized and the hydraulic computations can be performed to supplement risk analysis. The incorporation of hydraulic analysis through a suite of Python tools (as outlined by HEC-2) with AGWA-K2 will allow more rapid assessment of post-fire flood risk.
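    The screening step (flagging reaches whose simulated pre- to post-fire peak discharge change exceeds a threshold) can be sketched as follows; the 50% threshold and the input format are illustrative assumptions, not AGWA's actual interface:

```python
def flag_reaches(pre_peaks, post_peaks, threshold=0.5):
    """Return (reach_index, relative_change) for reaches whose simulated peak
    discharge increased by more than `threshold` (fractional change) after fire.
    These are the candidates for follow-up hydraulic (HEC-2/HEC-RAS) analysis."""
    flagged = []
    for reach, (q_pre, q_post) in enumerate(zip(pre_peaks, post_peaks)):
        relative_change = (q_post - q_pre) / q_pre
        if relative_change > threshold:
            flagged.append((reach, relative_change))
    return flagged
```

    Restricting detailed hydraulic modeling to the flagged reaches is what keeps the combined hydrologic-hydraulic workflow rapid.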

  20. Lightweight approach to model traceability in a CASE tool

    Science.gov (United States)

    Vileiniskis, Tomas; Skersys, Tomas; Pavalkis, Saulius; Butleris, Rimantas; Butkiene, Rita

    2017-07-01

    The term "model-driven" is not at all a new buzzword within the ranks of the system development community. Nevertheless, the ever-increasing complexity of model-driven approaches keeps fueling discussions around this paradigm and pushes researchers to develop new and more effective approaches to system development. With this increasing complexity, model traceability, and model management as a whole, become indispensable activities of the model-driven system development process. The main goal of this paper is to present the conceptual design and implementation of a practical lightweight approach to model traceability in a CASE tool.

  1. Combined transoral and endoscopic approach for total maxillectomy: a pioneering report.

    Science.gov (United States)

    Liu, Zhuofu; Yu, Huapeng; Wang, Dehui; Wang, Jingjing; Sun, Xicai; Liu, Juan

    2013-06-01

    Total maxillectomy is sometimes necessary especially for malignant tumors originating from the maxillary sinus. Here we describe a combined transoral and endoscopic approach for total maxillectomy for the treatment of malignant maxillary sinus tumors and evaluate its short-term outcome. This approach was evaluated in terms of the physiological function, aesthetic outcome, and complications. Six patients underwent the above-mentioned approach for resection of malignant maxillary sinus tumors from May 2010 to June 2011. This combined transoral and endoscopic approach includes five basic steps: total sphenoethmoidectomy, sublabial incision, incision of the frontal process of the maxilla, incision of the zygomaticomaxillary fissure, and hard palate osteotomy. All patients with malignant maxillary sinus tumors successfully underwent the planned total endoscopic maxillectomy without the need for facial incision or transfixion of the nasal septum; there were no significant complications. Five patients received preoperative radiation therapy. All patients were well and had no recurrence at follow-up from 13 to 27 months. The combined approach is feasible and can be performed in carefully selected patients. The benefit of the absence of facial incisions or transfixion of the nasal septum, potential improvement in hemostasis, and visual magnification may help to decrease the morbidity of traditional open approaches.

  2. Improvement of tool support of the spatial approach to regional planning: problems, specifics, trends

    Directory of Open Access Journals (Sweden)

    Nataliya Gennadievna Yushkova

    2015-01-01

    The emerging imperatives of innovation-driven economic development in Russia shape the conceptual and institutional constraints on the development of regional economic systems (RES). They position the regional planning system as a leading priority, inseparable from modern public administration tasks. However, the practice of developing long-term plans in the subjects of the Russian Federation shows that the innovation challenges of economic policy are either not properly reflected in them or significantly distorted. The following factors reduce the effectiveness of modernization processes in the RF subjects and hamper an appropriate response of RES to them: the lack of coordination between socio-economic and spatial regional plans, the imbalance of interaction between state authorities engaged in long-term planning, and the lack of real prerequisites for implementing innovation initiatives in the regions. Systematization and analysis of long-term plans make it possible to substantiate the consistency of the spatial approach to regional planning, expressed in the dominance of the transformational function that synchronizes the configuration and parameters of RES, and to establish ways to integrate spatial components into the system of regional planning through optimization of its tool support. The change in the content of this tool support is based on a synthesis of the predominant characteristics of the existing tools used in the isolated subsystems of regional planning of socio-economic and territorial development. The study establishes a system of tool support for regional planning that adapts to changes in both internal and external factors in the development of RES. Three main groups of tools (organizing, regulating, and coordinating) are defined by their typing in accordance with the groups of management functions. The article proposes the modeling of combinations of tools subordinated to the corresponding management functions.

  3. Working with text tools, techniques and approaches for text mining

    CERN Document Server

    Tourte, Gregory J L

    2016-01-01

    Text mining tools and technologies have long been a part of the repository world, where they have been applied to a variety of purposes, from pragmatic aims to support tools. Research areas as diverse as biology, chemistry, sociology and criminology have seen effective use made of text mining technologies. Working With Text collects a subset of the best contributions from the 'Working with text: Tools, techniques and approaches for text mining' workshop, alongside contributions from experts in the area. Text mining tools and technologies in support of academic research include supporting research on the basis of a large body of documents, facilitating access to and reuse of extant work, and bridging between the formal academic world and areas such as traditional and social media. Jisc have funded a number of projects, including NaCTem (the National Centre for Text Mining) and the ResDis programme. Contents are developed from workshop submissions and invited contributions, including: Legal considerations in te...

  4. Quantitative Framework and Management Expectation Tool for the Selection of Bioremediation Approaches at Chlorinated Solvent Sites

    Science.gov (United States)

    2015-03-19

    SERDP & ESTCP Webinar Series (#11), March 19, 2015: Quantitative Framework and Management Expectation Tool for the Selection of Bioremediation Approaches at Chlorinated Solvent Sites. Presenters include Ms. Carmen Lebrón, Independent Consultant (20 minutes + Q&A).

  5. Recognition of chemical entities: combining dictionary-based and grammar-based approaches

    Science.gov (United States)

    2015-01-01

    Background The past decade has seen an upsurge in the number of publications in chemistry. The ever-swelling volume of available documents makes it increasingly hard to extract relevant new information from such unstructured texts. The BioCreative CHEMDNER challenge invites the development of systems for the automatic recognition of chemicals in text (CEM task) and for ranking the recognized compounds at the document level (CDI task). We investigated an ensemble approach where dictionary-based named entity recognition is used along with grammar-based recognizers to extract compounds from text. We assessed the performance of ten different commercial and publicly available lexical resources using an open source indexing system (Peregrine), in combination with three different chemical compound recognizers and a set of regular expressions to recognize chemical database identifiers. The effect of different stop-word lists, case-sensitivity matching, and use of chunking information was also investigated. We focused on lexical resources that provide chemical structure information. To rank the different compounds found in a text, we used a term confidence score based on the normalized ratio of the term frequencies in chemical and non-chemical journals. Results The use of stop-word lists greatly improved the performance of the dictionary-based recognition, but there was no additional benefit from using chunking information. A combination of ChEBI and HMDB as lexical resources, the LeadMine tool for grammar-based recognition, and the regular expressions, outperformed any of the individual systems. On the test set, the F-scores were 77.8% (recall 71.2%, precision 85.8%) for the CEM task and 77.6% (recall 71.7%, precision 84.6%) for the CDI task. Missed terms were mainly due to tokenization issues, poor recognition of formulas, and term conjunctions. Conclusions We developed an ensemble system that combines dictionary-based and grammar-based approaches for chemical named entity recognition.

  6. Recognition of chemical entities: combining dictionary-based and grammar-based approaches.

    Science.gov (United States)

    Akhondi, Saber A; Hettne, Kristina M; van der Horst, Eelke; van Mulligen, Erik M; Kors, Jan A

    2015-01-01

    The past decade has seen an upsurge in the number of publications in chemistry. The ever-swelling volume of available documents makes it increasingly hard to extract relevant new information from such unstructured texts. The BioCreative CHEMDNER challenge invites the development of systems for the automatic recognition of chemicals in text (CEM task) and for ranking the recognized compounds at the document level (CDI task). We investigated an ensemble approach where dictionary-based named entity recognition is used along with grammar-based recognizers to extract compounds from text. We assessed the performance of ten different commercial and publicly available lexical resources using an open source indexing system (Peregrine), in combination with three different chemical compound recognizers and a set of regular expressions to recognize chemical database identifiers. The effect of different stop-word lists, case-sensitivity matching, and use of chunking information was also investigated. We focused on lexical resources that provide chemical structure information. To rank the different compounds found in a text, we used a term confidence score based on the normalized ratio of the term frequencies in chemical and non-chemical journals. The use of stop-word lists greatly improved the performance of the dictionary-based recognition, but there was no additional benefit from using chunking information. A combination of ChEBI and HMDB as lexical resources, the LeadMine tool for grammar-based recognition, and the regular expressions, outperformed any of the individual systems. On the test set, the F-scores were 77.8% (recall 71.2%, precision 85.8%) for the CEM task and 77.6% (recall 71.7%, precision 84.6%) for the CDI task. Missed terms were mainly due to tokenization issues, poor recognition of formulas, and term conjunctions. We developed an ensemble system that combines dictionary-based and grammar-based approaches for chemical named entity recognition, outperforming any of the individual systems.
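    The ranking idea, a term confidence score from the normalized ratio of a term's frequencies in chemical and non-chemical journals, can be sketched as follows; the exact normalization used by the authors may differ:

```python
def term_confidence(count_chem, total_chem, count_other, total_other, eps=1e-12):
    """Confidence that a term denotes a chemical entity, from the normalized
    ratio of its frequency in a chemical-journal corpus vs. a non-chemical
    corpus. Counts are raw occurrences; totals are corpus sizes in tokens."""
    f_chem = count_chem / total_chem    # normalized frequency, chemical corpus
    f_other = count_other / total_other # normalized frequency, other corpus
    return f_chem / (f_chem + f_other + eps)
```

    A term seen mostly in chemical journals scores near 1, a domain-neutral term near 0.5, so sorting recognized compounds by this score yields the document-level ranking needed for the CDI task.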

  7. RooStats: Statistical Tools for the LHC

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    RooStats provides statistical tools for the analysis of LHC data, with emphasis on discoveries, confidence intervals, and combined measurements in both the Bayesian and Frequentist approaches. The tools are built on top of the RooFit data modeling language and the core ROOT mathematics libraries and persistence technology. They have been developed in collaboration with the LHC experiments and used by them to produce numerous physics results, such as the combination of ATLAS and CMS Higgs searches, which resulted in a model with more than 200 parameters. We will review new developments in RooStats and the performance optimizations required to cope with such complex models used by the LHC experiments. We will also show the parallelization capability of these statistical tools using multiple processors via PROOF.

  8. Approaches to modernize the combination drug development paradigm

    Directory of Open Access Journals (Sweden)

    Daphne Day

    2016-10-01

    Recent advances in genomic sequencing and omics-based capabilities are uncovering tremendous therapeutic opportunities and rapidly transforming the field of cancer medicine. Molecularly targeted agents aim to exploit key tumor-specific vulnerabilities such as oncogenic or non-oncogenic addiction and synthetic lethality. Additionally, immunotherapies targeting the host immune system are proving to be another promising and complementary approach. Owing to substantial tumor genomic and immunologic complexities, combination strategies are likely to be required to adequately disrupt intricate molecular interactions and provide meaningful long-term benefit to patients. To optimize the therapeutic success and application of combination therapies, systematic scientific discovery will need to be coupled with novel and efficient clinical trial approaches. Indeed, a paradigm shift is required to drive precision medicine forward, from the traditional “drug-centric” model of clinical development in pursuit of small incremental benefits in large heterogeneous groups of patients, to a “strategy-centric” model to provide customized transformative treatments in molecularly stratified subsets of patients or even in individual patients. Crucially, to combat the numerous challenges facing combination drug development—including our growing but incomplete understanding of tumor biology, technical and informatics limitations, and escalating financial costs—aligned goals and multidisciplinary collaboration are imperative to collectively harness knowledge and fuel continual innovation.

  9. Rectal duplication cyst: a combined abdominal and endoanal operative approach.

    Science.gov (United States)

    Rees, Clare M; Woodward, Mark; Grier, David; Cusick, Eleri

    2007-04-01

    Rectal duplication cysts are rare. Early excision is the treatment of choice, and a number of surgical approaches have been described. We present a 3-week-old infant with a 3 cm cyst that was excised using a previously unreported combined abdominal and endoanal approach.

  10. Nanotechnology-based combinational drug delivery: an emerging approach for cancer therapy.

    Science.gov (United States)

    Parhi, Priyambada; Mohanty, Chandana; Sahoo, Sanjeeb Kumar

    2012-09-01

    Combination therapy for the treatment of cancer is becoming more popular because it generates synergistic anticancer effects, reduces individual drug-related toxicity and suppresses multi-drug resistance through different mechanisms of action. In recent years, nanotechnology-based combination drug delivery to tumor tissues has emerged as an effective strategy by overcoming many biological, biophysical and biomedical barriers that the body stages against successful delivery of anticancer drugs. The sustained, controlled and targeted delivery of chemotherapeutic drugs in a combination approach enhanced therapeutic anticancer effects with reduced drug-associated side effects. In this article, we have reviewed the scope of various nanotechnology-based combination drug delivery approaches and also summarized the current perspective and challenges facing the successful treatment of cancer. Copyright © 2012 Elsevier Ltd. All rights reserved.

  11. Safe manning of merchant ships: an approach and computer tool

    DEFF Research Database (Denmark)

    Alapetite, Alexandre; Kozin, Igor

    2017-01-01

    In the shipping industry, staffing expenses have become a vital competition parameter. In this paper, an approach and a software tool are presented to support decisions on the staffing of merchant ships. The tool is implemented in the form of a Web user interface that makes use of discrete-event simulation and allows estimation of the workload and of whether different scenarios are successfully performed, taking into account the number of crewmembers, watch schedules, distribution of competencies, and other factors. The software library ‘SimManning’ at the core of the project is provided as open source.

  12. Introducing Product Lines through Open Source Tools

    OpenAIRE

    Haugen, Øystein

    2008-01-01

    We present an approach to introducing product lines to companies that lowers their initial risk by applying open source tools and offers a smooth learning curve into the use and creation of domain-specific modeling combined with standardized variability modeling.

  13. An applied artificial intelligence approach towards assessing building performance simulation tools

    Energy Technology Data Exchange (ETDEWEB)

    Yezioro, Abraham [Faculty of Architecture and Town Planning, Technion IIT (Israel); Dong, Bing [Center for Building Performance and Diagnostics, School of Architecture, Carnegie Mellon University (United States); Leite, Fernanda [Department of Civil and Environmental Engineering, Carnegie Mellon University (United States)

    2008-07-01

    With the development of modern computer technology, a large number of building energy simulation tools are available on the market. When choosing which simulation tool to use in a project, the user must consider the tool's accuracy and reliability, in view of the building information they have at hand, which will serve as input for the tool. This paper presents an approach for assessing building performance simulation results against actual measurements, using artificial neural networks (ANN) to predict building energy performance. Training and testing of the ANN were carried out with energy consumption data acquired over 1 week in the case building, called the Solar House. The predicted results show a good fit, with a mean absolute error of 0.9%. Moreover, four building simulation tools were selected in this study in order to compare their results with the ANN-predicted energy consumption: Energy-10, the Green Building Studio web tool, eQuest and EnergyPlus. The results showed that the more detailed simulation tools have the best simulation performance in terms of heating and cooling electricity consumption, within 3% mean absolute error. (author)
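    The reported fit between predictions and measurements is expressed as a mean absolute error in percent. A minimal sketch of one common form of that metric (the paper's exact formula may differ):

```python
def mean_absolute_percent_error(measured, predicted):
    """Mean absolute error of predictions relative to measured consumption,
    in percent -- one common form of the '0.9%'/'3%' style of figure."""
    errs = [abs(p - m) / m for m, p in zip(measured, predicted)]
    return 100.0 * sum(errs) / len(errs)
```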

  14. Management of interstitial ectopic pregnancies with a combined intra-amniotic and systemic approach.

    Science.gov (United States)

    Swank, Morgan L; Harken, Tabetha R; Porto, Manuel

    2013-08-01

    Approximately 2% of all pregnancies are ectopic; of these, 4% are interstitial or cervical. There exists no clear consensus as to whether surgical or medical management is superior. We present three cases of advanced nonfallopian tube ectopic pregnancies from 6 to 8 weeks of gestation. Our first two cases were managed with a combined intrafetal, intra-amniotic and systemic approach using methotrexate and potassium chloride, whereas our third case was managed with an intra-amniotic approach alone. Our combined approach cases were successful, with resolution of human chorionic gonadotropin in 50 and 34 days, whereas our single approach case re-presented with bleeding requiring uterine artery embolization and operative removal of products of conception. Patients presenting with advanced interstitial or cervical pregnancies who are clinically stable can be offered medical management with a combined approach.

  15. COMPARISONS BETWEEN AND COMBINATIONS OF DIFFERENT APPROACHES TO ACCELERATE ENGINEERING PROJECTS

    Directory of Open Access Journals (Sweden)

    H. Steyn

    2012-01-01


    ENGLISH ABSTRACT: In this article, traditional project management methods such as PERT and CPM, as well as fast-tracking and systems approaches, viz. concurrent engineering and critical chain, are reviewed with specific reference to their contribution to reducing the duration of the execution phase of engineering projects. Each of these techniques has a role to play in accelerating project execution. Combinations of approaches are evaluated by considering the potential of sets consisting of two different approaches each. While PERT and CPM have been combined for many years in a technique called PERT/CPM, new combinations of approaches are discussed. Certain assumptions inherent to PERT, which are often wrong, are not made by the critical chain approach.

    AFRIKAANSE OPSOMMING: In this article, traditional project management approaches such as PERT and CPM, as well as project acceleration and systems approaches, namely concurrent engineering and critical chain, are examined with regard to the contribution each can make to accelerating the execution phase of engineering projects. Each of these approaches can make a specific contribution to the acceleration of projects. Combinations, each consisting of two different approaches, are evaluated. While PERT and CPM have already been used in combination for many years, new combinations are also discussed here. Certain assumptions inherent to the PERT approach are often wrong; these assumptions are not made by the critical-chain approach.
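    The CPM part of the techniques reviewed above rests on a critical-path computation. As a generic illustration (not taken from the article), a forward pass over an acyclic task graph gives the minimum project duration:

```python
def critical_path_length(durations, preds):
    """Earliest-finish forward pass of CPM. durations: {task: duration},
    preds: {task: [predecessor tasks]}. Assumes the task graph is acyclic."""
    finish = {}
    def earliest_finish(t):
        if t not in finish:
            before = max((earliest_finish(p) for p in preds.get(t, [])), default=0)
            finish[t] = before + durations[t]
        return finish[t]
    return max(earliest_finish(t) for t in durations)
```

The task on the longest such chain is, by definition, on the critical path; shortening any other task does not accelerate the project.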

  16. Combining non-invasive transcranial brain stimulation with neuroimaging and electrophysiology: Current approaches and future perspectives.

    Science.gov (United States)

    Bergmann, Til Ole; Karabanov, Anke; Hartwigsen, Gesa; Thielscher, Axel; Siebner, Hartwig Roman

    2016-10-15

    Non-invasive transcranial brain stimulation (NTBS) techniques such as transcranial magnetic stimulation (TMS) and transcranial current stimulation (TCS) are important tools in human systems and cognitive neuroscience because they are able to reveal the relevance of certain brain structures or neuronal activity patterns for a given brain function. It is nowadays feasible to combine NTBS, either consecutively or concurrently, with a variety of neuroimaging and electrophysiological techniques. Here we discuss what kind of information can be gained from combined approaches, which often are technically demanding. We argue that the benefit from this combination is twofold. Firstly, neuroimaging and electrophysiology can inform subsequent NTBS, providing the required information to optimize where, when, and how to stimulate the brain. Information can be achieved both before and during the NTBS experiment, requiring consecutive and concurrent applications, respectively. Secondly, neuroimaging and electrophysiology can provide the readout for neural changes induced by NTBS. Again, using either concurrent or consecutive applications, both "online" NTBS effects immediately following the stimulation and "offline" NTBS effects outlasting plasticity-inducing NTBS protocols can be assessed. Finally, both strategies can be combined to close the loop between measuring and modulating brain activity by means of closed-loop brain state-dependent NTBS. In this paper, we will provide a conceptual framework, emphasizing principal strategies and highlighting promising future directions to exploit the benefits of combining NTBS with neuroimaging or electrophysiology. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  17. Combined endoscopic approach in the management of suprasellar craniopharyngioma.

    Science.gov (United States)

    Deopujari, Chandrashekhar E; Karmarkar, Vikram S; Shah, Nishit; Vashu, Ravindran; Patil, Rahul; Mohanty, Chandan; Shaikh, Salman

    2018-05-01

    Craniopharyngiomas are dysontogenic tumors with benign histology but aggressive behavior. The surgical challenges posed by the tumor are well recognized. Neuroendoscopy has recently contributed to its surgical management. This study focuses on our experience in managing craniopharyngiomas in recent years, highlighting the role of a combined endoscopic trans-ventricular and endonasal approach. Ninety-two patients were treated for craniopharyngioma from 2000 to 2016 by the senior author. A total of 125 procedures, microsurgical (58) and endoscopic (67), were undertaken. A combined endoscopic approach was carried out in 18 of these patients, 16 children and 2 young adults. All of these patients presented with a large cystic suprasellar mass associated with hydrocephalus. In the first instance, they were treated with a transventricular endoscopic procedure to decompress the cystic component. This was followed by an endonasal transsphenoidal procedure for excision within the next 2 to 6 days. All these patients improved after the initial cyst decompression, with relief of hydrocephalus, while awaiting remaining tumor removal in a more elective setting. Gross total resection could be achieved in 84% of these patients. Diabetes insipidus was the most common postsurgical complication, seen in 61% of patients in the immediate period, but was persistent in only two patients at 1-year follow-up. None of the children in this group developed morbid obesity. There was one case of CSF leak requiring repair after initial surgery. Peri-operative mortality occurred in one patient, secondary to ventriculitis. The patients who benefit most from the combined approach are those who present with raised intracranial pressure secondary to a large tumor with a cyst causing hydrocephalus. Intraventricular endoscopic cyst drainage allows resolution of hydrocephalus with restoration of normal intracranial pressure, gives time for proper preoperative work-up, and has reduced incidence of CSF leak after…

  18. Hypnotherapy: A Combined Approach Using Psychotherapy and Behavior Modification.

    Science.gov (United States)

    Goldberg, Bruce

    1987-01-01

    Discusses use of hypnosis in traditional psychoanalysis, compares use of hypnosis in behavior modification therapy versus psychoanalysis, and presents a hypno-behavioral model which combines both approaches using hypnosis as the medium. (Author/NB)

  19. Combined SAFE/SNAP approach to safeguards evaluation

    International Nuclear Information System (INIS)

    Engi, D.; Chapman, L.D.; Grant, F.H.; Polito, J.

    1980-01-01

    The scope of a safeguards evaluation model can efficiently address one of two issues: (1) global safeguards effectiveness or (2) vulnerability analysis for individual scenarios. The Safeguards Automated Facility Evaluation (SAFE) focuses on the first issue, while the Safeguards Network Analysis Procedure (SNAP) is directed towards the second. A combined SAFE/SNAP approach to the problem of safeguards evaluation is described and illustrated through an example. 4 refs

  20. New Approaches for the Use of the Classical Tools of Scenario Planning

    Directory of Open Access Journals (Sweden)

    Ricardo Balieiro Fischer

    2016-04-01

    The future is yet to be built: it is multiple and uncertain. Within the social sciences, scenarios can be defined as a description of a future situation and of the course of events that allows one to move from the present position toward that future situation. Currently, a multiplicity of methods and tools is available for building scenarios, including methods with an essentially rationalist approach, such as Michel Godet's method. The purpose of this work is to use the hypothetico-deductive method to reduce, starting from Michel Godet's scenario method and its tools, the complexity of the scenario-building process while maintaining the robustness of the findings. For this purpose, two different approaches are proposed: (1) to integrate, in one step, the structural analysis and the cross-impact matrix, so that the first derives automatically while filling in the second; (2) to use the concept of Bayesian networks as a method to integrate the cross-impact matrix and the morphological analysis. Both approaches aim to reduce the amount of information needed to feed the tools and to improve the feedback criteria, resulting in greater flexibility during the process and a better holistic view of the system. Scientifically, these approaches open a new field of studies in scenario planning, as they appropriate the concept of Bayesian networks, widely used in other areas of knowledge (artificial intelligence, geological studies, medical diagnostics, pattern classification, etc.), and bring it to the field of social sciences.
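    The second proposed approach treats cross-impacts as conditional probabilities in a Bayesian network. A two-driver toy version (driver names and numbers invented for illustration) shows how scenario probabilities then follow from the chain rule:

```python
from itertools import product

def scenario_probabilities(p_a, p_b_given_a):
    """Tiny two-driver Bayesian-network sketch: driver A has prior p_a; the
    probability of driver B depends on A via p_b_given_a = {True: .., False: ..}
    (the 'cross-impact'). Returns P(scenario) for all four A/B combinations."""
    probs = {}
    for a, b in product([True, False], repeat=2):
        pa = p_a if a else 1 - p_a
        pb = p_b_given_a[a] if b else 1 - p_b_given_a[a]
        probs[(a, b)] = pa * pb       # chain rule: P(A) * P(B | A)
    return probs
```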

  1. Combining operational models and data into a dynamic vessel risk assessment tool for coastal regions

    Science.gov (United States)

    Fernandes, R.; Braunschweig, F.; Lourenço, F.; Neves, R.

    2016-02-01

    The technological evolution in terms of computational capacity, data acquisition systems, numerical modelling and operational oceanography is providing opportunities for designing and building holistic approaches and complex tools for newer and more efficient management (planning, prevention and response) of coastal water pollution risk events. A combined methodology to dynamically estimate time- and space-variable individual vessel accident risk levels and shoreline contamination risk from ships has been developed, integrating numerical metocean forecasts and oil spill simulations with vessel tracking automatic identification systems (AIS). The risk rating combines the likelihood of an oil spill occurring from a vessel navigating in a study area - the Portuguese continental shelf - with the assessed consequences to the shoreline. The spill likelihood is based on dynamic marine weather conditions and statistical information from previous accidents. The shoreline consequences reflect the virtual spilled oil amount reaching the shoreline and its environmental and socio-economic vulnerabilities. The oil reaching the shoreline is quantified with an oil spill fate and behaviour model running multiple virtual spills from vessels over time, or, as an alternative, with a correction factor based on vessel distance from the coast. Shoreline risks can be computed in real time or from previously obtained data. Results show the ability of the proposed methodology to estimate risk properly, sensitive to dynamic metocean conditions and to oil transport behaviour. The integration of meteo-oceanic and oil spill models with coastal vulnerability and AIS data in the quantification of risk enhances maritime situational awareness and decision support, providing a more realistic approach in the assessment of shoreline impacts. The risk assessment from historical data can help find typical risk patterns ("hot spots") or develop sensitivity analyses for specific conditions, whereas real…
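    The risk rating described above combines spill likelihood with shoreline consequence. A deliberately simplified, hypothetical sketch of such a multiplicative rating (the authors' actual formulation is richer, involving metocean statistics and oil fate modelling):

```python
def shoreline_risk(spill_likelihood, amount_reaching_shore, vulnerability):
    """Hypothetical risk rating in the spirit of the paper: likelihood of a
    spill from the vessel times the consequence to the shoreline, where the
    consequence combines the virtual oil amount reaching shore (normalised)
    with an environmental/socio-economic vulnerability weight. The names and
    the multiplicative form are illustrative assumptions, not the paper's."""
    consequence = amount_reaching_shore * vulnerability
    return spill_likelihood * consequence
```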

  2. Antiviral Combination Approach as a Perspective to Combat Enterovirus Infections.

    Science.gov (United States)

    Galabov, Angel S; Nikolova, Ivanka; Vassileva-Pencheva, Ralitsa; Stoyanova, Adelina

    2015-01-01

    Human enteroviruses, distributed worldwide, are causative agents of a broad spectrum of diseases with extremely high morbidity, including a series of severe illnesses of the central nervous system, heart, endocrine pancreas, skeletal muscles, etc., as well as the common cold, which contributes to the development of chronic respiratory diseases, including chronic obstructive pulmonary disease. These diseases, along with the significantly high morbidity and mortality in children and in high-risk populations (immunodeficient patients, neonates), clearly establish chemotherapy as the main tool for the control of enterovirus infections. At present, clinically effective antivirals for use in the treatment of enteroviral infection do not exist, in spite of the large amount of work carried out in this field. The main reason for this is the development of drug resistance. We studied the process of development of resistance to the strongest inhibitors of enteroviruses, the WIN compounds (blockers of the VP1 protein hydrophobic pocket), especially in in vivo models of Coxsackievirus B (CV-B) infection in mice. We introduced the tracing of a panel of phenotypic markers (MIC50 value, plaque shape and size, stability at 50 °C, pathogenicity in mice) for characterization of the drug mutants (resistant and dependent) as a very important stage in the study of enterovirus inhibitors. Moreover, as a result of VP1 RNA sequence analysis performed on the model of disoxaril mutants of CVB1, we determined the molecular basis of the drug resistance. Monotherapy courses were the only approach used until now. For the first time in the research on anti-enterovirus antivirals, our team introduced testing of the combined effect of selective inhibitors of enterovirus replication with different modes of action. This study resulted in the selection of a number of very effective in vitro double combinations with synergistic effect and a broad spectrum of sensitive…

  3. Combining SLBL routine with landslide-generated tsunami model for a quick hazard assessment tool

    Science.gov (United States)

    Franz, Martin; Rudaz, Benjamin; Jaboyedoff, Michel; Podladchikov, Yury

    2016-04-01

    Regions with steep topography are potentially subject to landslide-induced tsunamis, because of the proximity between lakes, rivers or sea shores and potential instabilities. The concentration of population and infrastructure on water body shores and in downstream valleys could lead to catastrophic consequences. In order to assess this phenomenon comprehensively, together with the induced risks, we have developed a tool which allows the construction of the landslide geometry, and which is able to simulate its propagation, the generation and propagation of the wave, and eventually the spread on the shores or the associated downstream flow. The tool is developed in the Matlab© environment, with a graphical user interface (GUI) to select the parameters in a user-friendly manner. The whole process is done in three steps involving different methods. Firstly, the geometry of the sliding mass is constructed using the Sloping Local Base Level (SLBL) concept. Secondly, the propagation of this volume is performed using a model based on viscous flow equations. Finally, the wave generation and its propagation are simulated using the shallow water equations stabilized by the Lax-Friedrichs scheme. The transition between wet and dry bed is handled by the combination of the two latter sets of equations. The intensity map is based on the flooding criterion used in Switzerland, provided by the OFEG, and results from the multiplication of the velocity and the depth obtained during the simulation. The tool can be used for hazard assessment in the case of well-known landslides, where the SLBL routine can be constrained and checked for a realistic construction of the geometrical model. In less well-known cases, various failure plane geometries can be built automatically within a given range, and thus a multi-scenario approach is used. In any case, less well-known parameters such as the landslide velocity, its run-out distance, etc. can also be set to vary within given ranges, leading to multi…
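    The wave-propagation step names the shallow water equations stabilized by the Lax-Friedrichs scheme. A minimal 1D sketch of that scheme (not the authors' Matlab code; boundary handling and the wet/dry transition are omitted):

```python
import numpy as np

def lax_friedrichs_step(h, hu, dx, dt, g=9.81):
    """One Lax-Friedrichs step for the 1D shallow water equations in
    conserved variables h (depth) and hu (discharge); boundary cells are
    held fixed. A sketch of the named scheme, not the paper's implementation."""
    f1 = hu                                              # flux of h
    f2 = hu**2 / np.maximum(h, 1e-12) + 0.5 * g * h**2   # flux of hu
    def step(u, f):
        un = u.copy()
        un[1:-1] = 0.5 * (u[2:] + u[:-2]) - dt / (2.0 * dx) * (f[2:] - f[:-2])
        return un
    return step(h, f1), step(hu, f2)
```

The Swiss-style intensity mentioned in the abstract would then be evaluated per cell from the simulated depth `h` and velocity `hu / h`.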

  4. Life cycle tools combined with multi-criteria and participatory methods for agricultural sustainability: Insights from a systematic and critical review.

    Science.gov (United States)

    De Luca, Anna Irene; Iofrida, Nathalie; Leskinen, Pekka; Stillitano, Teodora; Falcone, Giacomo; Strano, Alfio; Gulisano, Giovanni

    2017-10-01

    Life cycle (LC) methodologies have attracted a great interest in agricultural sustainability assessments, even if, at the same time, they have sometimes been criticized for making unrealistic assumptions and subjective choices. To cope with these weaknesses, Multi-Criteria Decision Analysis (MCDA) and/or participatory methods can be used to balance and integrate different sustainability dimensions. The purpose of this study is to highlight how life cycle approaches were combined with MCDA and participatory methods to address agricultural sustainability in the published scientific literature. A systematic and critical review was developed, highlighting the following features: which multi-criterial and/or participatory methods have been associated with LC tools; how they have been integrated or complemented (methodological relationships); the intensity of the involvement of stakeholders (degree of participation); and which synergies have been achieved by combining the methods. The main typology of integration was represented by multi-criterial frameworks integrating LC evaluations. LC tools can provide MCDA studies with local and global information on how to reduce negative impacts and avoid burden shifts, while MCDA methods can help LC practitioners deal with subjective assumptions in an objective way, to take into consideration actors' values and to overcome trade-offs among the different dimensions of sustainability. Considerations concerning the further development of Life Cycle Sustainability Assessment (LCSA) have been identified as well. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. A Systematic Approach to Food Variety Classification as a Tool in ...

    African Journals Online (AJOL)

    A Systematic Approach to Food Variety Classification as a Tool in Dietary ... and food variety (count of all dietary items consumed during the recall period up to the ... This paper presents a pilot study carried out with an aim of demonstrating the ...

  6. A combined modeling approach for wind power feed-in and electricity spot prices

    International Nuclear Information System (INIS)

    Keles, Dogan; Genoese, Massimo; Möst, Dominik; Ortlieb, Sebastian; Fichtner, Wolf

    2013-01-01

    Wind power generation and its impact on electricity prices have strongly increased in the EU. Appropriate mark-to-market evaluation of new investments in wind power and energy storage plants should therefore consider the fluctuating generation of wind power and uncertain electricity prices, which are affected by wind power feed-in (WPF). To obtain the input data for WPF and electricity prices, simulation models, such as econometric models, can serve as a data basis. This paper describes a combined modeling approach for the simulation of WPF series and electricity prices that considers the impact of WPF on prices, based on an autoregressive approach. WPF series are first simulated for each hour of the year and then integrated into the electricity price model to generate an hourly resolved price series for a year. The model results demonstrate that the WPF model delivers satisfactory WPF series and that the extended electricity price model considering WPF leads to a significant improvement of the electricity price simulation compared to a model version without WPF effects. As the simulated series of WPF and electricity prices also contain the correlation between both series, market evaluation of wind power technologies can be done accurately based on these series. - Highlights: • Wind power feed-in can be directly simulated with stochastic processes. • Non-linear relationship between wind power feed-in and electricity prices. • Price reduction effect of wind power feed-in depends on the actual load. • Considering wind power feed-in effects improves the electricity price simulation. • Combined modeling of both parameters delivers a data basis for evaluation tools.
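    The structure of such a combined model, with WPF entering an autoregressive price equation as an exogenous regressor, can be sketched in a toy ARX form. All coefficients below are invented for the illustration; the paper's model is more elaborate (non-linear WPF effect, load dependence):

```python
import random

def simulate_prices(wpf, phi=0.9, mu=50.0, gamma=-0.005, sigma=2.0, seed=1):
    """Toy ARX price simulation: p_t = mu + phi*(p_{t-1} - mu) + gamma*wpf_t + e_t,
    with e_t ~ N(0, sigma^2). The negative gamma encodes the price-reducing
    effect of wind power feed-in; coefficients are made up for the sketch."""
    rng = random.Random(seed)
    prices, p = [], mu
    for w in wpf:
        p = mu + phi * (p - mu) + gamma * w + rng.gauss(0.0, sigma)
        prices.append(p)
    return prices
```

Fitting `phi` and `gamma` to historical hourly prices and WPF, then driving the equation with simulated WPF series, reproduces the correlation between the two series that the market evaluation relies on.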

  7. Mechanics and energetics in tool manufacture and use: a synthetic approach.

    Science.gov (United States)

    Wang, Liyu; Brodbeck, Luzius; Iida, Fumiya

    2014-11-06

    Tool manufacture and use are observed not only in humans but also in other animals such as mammals, birds and insects. Manufactured tools are used for biomechanical functions such as effective control of fluids and small solid objects and extension of reaching. These tools are passive and used with gravity and the animal users' own energy. From the perspective of evolutionary biology, manufactured tools are extended phenotypes of the genes of the animal and exhibit phenotypic plasticity. This incurs energetic cost of manufacture as compared to the case with a fixed tool. This paper studies mechanics and energetics aspects of tool manufacture and use in non-human beings. Firstly, it investigates possible mechanical mechanisms of the use of passive manufactured tools. Secondly, it formulates the energetic cost of manufacture and analyses when phenotypic plasticity benefits an animal tool maker and user. We take a synthetic approach and use a controlled physical model, i.e. a robot arm. The robot is capable of additively manufacturing scoop and gripper structures from thermoplastic adhesives to pick and place fluid and solid objects, mimicking primates and birds manufacturing tools for a similar function. We evaluate the effectiveness of tool use in pick-and-place and explain the mechanism for gripper tools picking up solid objects with a solid-mechanics model. We propose a way to formulate the energetic cost of tool manufacture that includes modes of addition and reshaping, and use it to analyse the case of scoop tools. Experiment results show that with a single motor trajectory, the robot was able to effectively pick and place water, rice grains, a pebble and a plastic box with a scoop tool or gripper tools that were manufactured by itself. They also show that by changing the dimension of scoop tools, the energetic cost of tool manufacture and use could be reduced. The work should also be interesting for engineers to design adaptive machines. © 2014 The Author

  8. Mechanics and energetics in tool manufacture and use: a synthetic approach

    Science.gov (United States)

    Wang, Liyu; Brodbeck, Luzius; Iida, Fumiya

    2014-01-01

    Tool manufacture and use are observed not only in humans but also in other animals such as mammals, birds and insects. Manufactured tools are used for biomechanical functions such as effective control of fluids and small solid objects and extension of reaching. These tools are passive and used with gravity and the animal users' own energy. From the perspective of evolutionary biology, manufactured tools are extended phenotypes of the genes of the animal and exhibit phenotypic plasticity. This incurs energetic cost of manufacture as compared to the case with a fixed tool. This paper studies mechanics and energetics aspects of tool manufacture and use in non-human beings. Firstly, it investigates possible mechanical mechanisms of the use of passive manufactured tools. Secondly, it formulates the energetic cost of manufacture and analyses when phenotypic plasticity benefits an animal tool maker and user. We take a synthetic approach and use a controlled physical model, i.e. a robot arm. The robot is capable of additively manufacturing scoop and gripper structures from thermoplastic adhesives to pick and place fluid and solid objects, mimicking primates and birds manufacturing tools for a similar function. We evaluate the effectiveness of tool use in pick-and-place and explain the mechanism for gripper tools picking up solid objects with a solid-mechanics model. We propose a way to formulate the energetic cost of tool manufacture that includes modes of addition and reshaping, and use it to analyse the case of scoop tools. Experiment results show that with a single motor trajectory, the robot was able to effectively pick and place water, rice grains, a pebble and a plastic box with a scoop tool or gripper tools that were manufactured by itself. They also show that by changing the dimension of scoop tools, the energetic cost of tool manufacture and use could be reduced. The work should also be interesting for engineers to design adaptive machines. PMID:25209405

  9. A Combined Raindrop Aggregate Destruction Test-Settling Tube (RADT-ST) Approach to Identify the Settling Velocity of Sediment

    Directory of Open Access Journals (Sweden)

    Liangang Xiao

    2015-10-01

    The use of sediment settling velocities based on mineral grain size distributions in erosion models ignores the effects of aggregation on settling velocity. The alternative approach, wet-sieved aggregate size distribution, on the other hand, cannot represent all the destructive processes that eroded soils may experience under impacting raindrops. Therefore, without considering raindrop impact, both methods may lead to biased predictions of the redistribution of sediment and associated substances across landscapes. Rainfall simulation is an effective way to reproduce natural raindrop impact under controlled laboratory conditions. However, very few methods have been developed that integrate rainfall simulation with the settling velocity of eroded sediment. This study aims to develop a new proxy, based on rainfall simulation, to identify the actual settling velocity distribution of aggregated sediment. A combined Raindrop Aggregate Destruction Test-Settling Tube (RADT-ST) approach was developed to (1) simulate aggregate destruction under a series of simulated rainfalls and (2) measure the actual settling velocity distribution of the destroyed aggregates. The Mean Weight Settling Velocity (MWSV) of aggregates was used to investigate the settling behavior of different soils as rainfall kinetic energy increased. The results show that the settling velocity of silt-rich, raindrop-impacted aggregates is likely to be underestimated by at least six times if based on mineral grain size distribution. The RADT-ST designed in this study effectively captures the effects of aggregation on settling behavior. The settling velocity distribution should be regarded as an evolving, rather than steady-state, parameter during erosion events. The combined RADT-ST approach is able to generate quasi-natural sediment under controlled simulated rainfall conditions and is sufficiently sensitive to measure the actual settling velocities of differently aggregated soils. This combined approach provides…
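    The Mean Weight Settling Velocity used above is, in essence, a mass-weighted mean over measured settling fractions. A sketch under that assumption (the paper may define MWSV slightly differently):

```python
def mean_weight_settling_velocity(fractions):
    """Mass-weighted mean settling velocity.
    fractions = [(mass_g, velocity_m_per_s), ...] for each settling-tube class."""
    total_mass = sum(m for m, _ in fractions)
    return sum(m * v for m, v in fractions) / total_mass
```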

  10. Combined approach for gynecomastia.

    Science.gov (United States)

    El-Sabbagh, Ahmed Hassan

    2016-01-01

    Gynecomastia is a deformity of the male chest. Treatment of gynecomastia has varied from direct surgical excision to other techniques (mainly liposuction) to a combination of both. Skin excision is performed according to the grade. In this study, experience with liposuction as an adjuvant to surgical excision is described. Between September 2012 and April 2015, a total of 14 patients were treated with liposuction and surgical excision through a periareolar incision. Preoperative evaluation was done in all cases to exclude any underlying cause of gynecomastia. All fourteen patients were treated bilaterally (28 breasts). Their ages ranged between 13 and 33 years. Two patients were classified as grade I, and four each as grades IIa, IIb and III. The first three patients developed seroma. Partial superficial epidermolysis of the areola occurred in two cases. Superficial infection of the incision occurred in one case and was treated conservatively. All grades of gynecomastia were managed by the same approach. Skin excision was added for one patient who had severe skin excess with limited activity and poor skin complexion. No case required a second operation or sought a second opinion.

  11. A quantitative benefit-risk assessment approach to improve decision making in drug development: Application of a multicriteria decision analysis model in the development of combination therapy for overactive bladder.

    Science.gov (United States)

    de Greef-van der Sandt, I; Newgreen, D; Schaddelee, M; Dorrepaal, C; Martina, R; Ridder, A; van Maanen, R

    2016-04-01

    A multicriteria decision analysis (MCDA) approach was developed and used to estimate the benefit-risk of solifenacin and mirabegron and their combination in the treatment of overactive bladder (OAB). The objectives were 1) to develop an MCDA tool to compare drug effects in OAB quantitatively, 2) to establish transparency in the evaluation of the benefit-risk profile of various dose combinations, and 3) to quantify the added value of combination use compared to monotherapies. The MCDA model was developed using efficacy, safety, and tolerability attributes, and the results of a phase II factorial-design combination study were evaluated. Combinations of solifenacin 5 mg with mirabegron 25 mg or mirabegron 50 mg (5+25 and 5+50) scored the highest clinical utility and supported the development of solifenacin-mirabegron combination therapy at these dose regimens for phase III. This case study underlines the benefit of using a quantitative approach in clinical drug development programs. © 2015 The American Society for Clinical Pharmacology and Therapeutics.
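    An MCDA clinical-utility score of this kind is commonly a weighted sum over benefit and risk attributes. A hypothetical sketch (attribute names, scores and weights invented, not the study's model):

```python
def clinical_utility(scores, weights):
    """Weighted-sum MCDA sketch. scores: {attribute: value on a 0..1 scale}
    for one treatment arm; weights: {attribute: importance}, summing to 1.
    Illustrative only -- the published model uses its own attributes/weights."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[a] * scores[a] for a in weights)
```

Ranking the candidate dose combinations by this score is what makes the benefit-risk trade-off transparent: the attributes, their scales and their weights are all explicit.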

  12. Assessment of the predictive accuracy of five in silico prediction tools, alone or in combination, and two metaservers to classify long QT syndrome gene mutations.

    Science.gov (United States)

    Leong, Ivone U S; Stuckey, Alexander; Lai, Daniel; Skinner, Jonathan R; Love, Donald R

    2015-05-13

    Long QT syndrome (LQTS) is an autosomal dominant condition predisposing to sudden death from malignant arrhythmia. Genetic testing identifies many missense single nucleotide variants of uncertain pathogenicity. Establishing genetic pathogenicity is an essential prerequisite to family cascade screening. Many laboratories use in silico prediction tools, either alone or in combination, or metaservers, in order to predict pathogenicity; however, their accuracy in the context of LQTS is unknown. We evaluated the accuracy of five in silico programs and two metaservers in the analysis of LQTS 1-3 gene variants. The in silico tools SIFT, PolyPhen-2, PROVEAN, SNPs&GO and SNAP, either alone or in all possible combinations, and the metaservers Meta-SNP and PredictSNP, were tested on 312 KCNQ1, KCNH2 and SCN5A gene variants that have previously been characterised by either in vitro or co-segregation studies as either "pathogenic" (283) or "benign" (29). The accuracy, sensitivity, specificity and Matthews Correlation Coefficient (MCC) were calculated to determine the best combination of in silico tools for each LQTS gene, and when all genes are combined. The best combination of in silico tools for KCNQ1 is PROVEAN, SNPs&GO and SIFT (accuracy 92.7%, sensitivity 93.1%, specificity 100% and MCC 0.70). The best combination of in silico tools for KCNH2 is SIFT and PROVEAN, or PROVEAN, SNPs&GO and SIFT. Both combinations have the same scores for accuracy (91.1%), sensitivity (91.5%), specificity (87.5%) and MCC (0.62). In the case of SCN5A, SNAP and PROVEAN provided the best combination (accuracy 81.4%, sensitivity 86.9%, specificity 50.0%, and MCC 0.32). When all three LQT genes are combined, SIFT, PROVEAN and SNAP is the combination with the best performance (accuracy 82.7%, sensitivity 83.0%, specificity 80.0%, and MCC 0.44). Both metaservers performed better than the single in silico tools; however, they did not perform better than the best performing combination of in silico tools.
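    The performance measures used above to rank tool combinations all derive from the 2x2 confusion matrix. A small sketch:

```python
import math

def classifier_metrics(tp, tn, fp, fn):
    """Accuracy, sensitivity, specificity and Matthews Correlation Coefficient
    from true/false positive and negative counts of a binary classifier."""
    acc = (tp + tn) / (tp + tn + fp + fn)
    sens = tp / (tp + fn)                 # true positive rate
    spec = tn / (tn + fp)                 # true negative rate
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    mcc = (tp * tn - fp * fn) / denom if denom else 0.0
    return acc, sens, spec, mcc
```

With only 29 benign variants against 283 pathogenic ones, accuracy alone would reward always predicting "pathogenic"; the MCC is reported precisely because it stays informative on such imbalanced sets.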

  13. Combining engineering and data-driven approaches

    DEFF Research Database (Denmark)

    Fischer, Katharina; De Sanctis, Gianluca; Kohler, Jochen

    2015-01-01

Two general approaches may be followed for the development of a fire risk model: statistical models based on observed fire losses can support simple cost-benefit studies but are usually not detailed enough for engineering decision-making. Engineering models, on the other hand, require many assumptions that may result in a biased risk assessment. In two related papers we show how engineering and data-driven modelling can be combined by developing generic risk models that are calibrated to statistical data on observed fire events. The focus of the present paper is on the calibration procedure, applied to the calibration of a generic fire risk model for single family houses to Swiss insurance data. The example demonstrates that the bias in the risk estimation can be strongly reduced by model calibration.

  14. A Combinational Digital Logic Design Tool for Practice and Assessment in Engineering Education

    Directory of Open Access Journals (Sweden)

    Rasha Morsi

    2016-08-01

Full Text Available As technology advances, computers are being used almost everywhere. In a 2013 US Census report (File and Ryan, 2014), 83.8% (up from 78.9% in 2012) of U.S. households reported owning a computer, with 74.4% reporting internet use (73.4% high-speed internet). In recent years, the shift in educational technologies has been moving towards gaming, more specifically serious gaming. Although this is an important trend, there is still much to be said about e-learning through a step-by-step interactive process using an online practice tool. This paper presents a detailed description of the Combinational Logic Design Tool (CLDT) (Morsi and Russell, 2007). CLDT was designed and developed under the CCLI project, #0737242, funded by the National Science Foundation, which aimed to develop and disseminate a novel online practice tool for on-demand review and assessment in Electrical and Computer Engineering education. The paper also reports on a formal assessment conducted in a Digital Logic Design classroom and presents the results of this assessment.

  15. Advanced REACH Tool: Development and application of the substance emission potential modifying factor

    NARCIS (Netherlands)

    Tongeren, M. van; Fransman, W.; Spankie, S.; Tischer, M.; Brouwer, D.; Schinkel, J.; Cherrie, J.W.; Tielemans, E.

    2011-01-01

    The Advanced REACH Tool (ART) is an exposure assessment tool that combines mechanistically modelled inhalation exposure estimates with available exposure data using a Bayesian approach. The mechanistic model is based on nine independent principal modifying factors (MF). One of these MF is the

  16. Combined endoscopic approaches to the cardiac sphincter achalasia treatment

    Directory of Open Access Journals (Sweden)

    V. N. Klimenko

    2015-12-01

Full Text Available Aim. To assess combined endoscopic approaches to the treatment of achalasia of the cardiac sphincter. Results. The article describes preliminary treatment results and the technique of combined endoscopic pneumocardiodilatation with injections of botulinum toxin type A ('Disport') in achalasia cardia. Aetio-pathogenetic aspects of the development of achalasia cardia and the actions of botulinum toxin type A and balloon pneumocardiodilatation of the esophagus are described, and a modern roentgen-endoscopic classification of achalasia cardia is given. A prognostic scale for estimating the feasibility of further combined endoscopic or surgical treatment has been defined and is being developed further. Conclusion. The clinical cases described vividly demonstrate the variety of clinical manifestations of achalasia cardia and also help determine indications for earlier surgical treatment.

  17. Combining Results from Distinct MicroRNA Target Prediction Tools Enhances the Performance of Analyses

    Directory of Open Access Journals (Sweden)

    Arthur C. Oliveira

    2017-05-01

Full Text Available Target prediction is generally the first step toward recognition of bona fide microRNA (miRNA)-target interactions in living cells. Several target prediction tools are now available, which use distinct criteria and stringency to provide the best set of candidate targets for a single miRNA or a subset of miRNAs. However, there are many false-negative predictions, and consensus about the optimum strategy to select and use the output information provided by the target prediction tools is lacking. We compared the performance of four tools cited in the literature—TargetScan (TS), miRanda-mirSVR (MR), Pita, and RNA22 (R22)—and we determined the most effective approach for analyzing target prediction data (individual, union, or intersection). For this purpose, we calculated the sensitivity, specificity, precision, and correlation of these approaches using 10 miRNAs (miR-1-3p, miR-17-5p, miR-21-5p, miR-24-3p, miR-29a-3p, miR-34a-5p, miR-124-3p, miR-125b-5p, miR-145-5p, and miR-155-5p) and 1,400 genes (700 validated and 700 non-validated as targets of these miRNAs). The four tools provided a subset of high-quality predictions and returned few false-positive predictions; however, they could not identify several known true targets. We demonstrate that the union of TS/MR and TS/MR/R22 enhanced the quality of in silico prediction analysis of miRNA targets. We conclude that the union rather than the intersection of the aforementioned tools is the best strategy for maximizing performance while minimizing the loss of time and resources in subsequent in vivo and in vitro experiments for functional validation of miRNA-target interactions.
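The union-versus-intersection comparison in this record can be sketched with plain set operations; the gene names and the validated set below are hypothetical, used only to show why union favours sensitivity and intersection favours precision:

```python
# Hypothetical prediction sets for one miRNA (not the study's data).
ts  = {"GENE1", "GENE2", "GENE3"}          # TargetScan predictions
mr  = {"GENE2", "GENE3", "GENE4"}          # miRanda-mirSVR predictions
r22 = {"GENE3", "GENE5"}                   # RNA22 predictions
validated = {"GENE2", "GENE3", "GENE5"}    # experimentally validated targets

def evaluate(predicted, validated):
    """Sensitivity and precision of a predicted target set."""
    tp = len(predicted & validated)
    sensitivity = tp / len(validated)
    precision = tp / len(predicted) if predicted else 0.0
    return sensitivity, precision

for name, preds in [("union TS|MR|R22", ts | mr | r22),
                    ("intersection TS&MR&R22", ts & mr & r22)]:
    sens, prec = evaluate(preds, validated)
    print(f"{name}: sensitivity={sens:.2f} precision={prec:.2f}")
```

With these toy sets, the union recovers every validated target at the cost of extra candidates, while the intersection keeps only the one target all tools agree on, which is the trade-off the study quantifies.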

  18. Combining a survey approach and energy and indoor environment auditing in historic buildings

    DEFF Research Database (Denmark)

    Rohdin, Patrik; Dalewski, Mariusz; Moshfegh, Bahram

    2016-01-01

    Purpose – This paper presents an approach where a survey study is combined with energy and indoor environment auditing in the built environment. The combination of methods presented in this paper is one way to obtain a wider perspective on the indoor environment and energy use and also let...... this research project. Design/methodology/approach – A combination of energy and indoor environment auditing and standardized occupant surveys. Findings – The main findings in the paper are related to the good agreement between results from standardized occupant surveys and physical measurements...

  19. 3D measurement using combined Gray code and dual-frequency phase-shifting approach

    Science.gov (United States)

    Yu, Shuang; Zhang, Jing; Yu, Xiaoyang; Sun, Xiaoming; Wu, Haibin; Liu, Xin

    2018-04-01

    The combined Gray code and phase-shifting approach is a commonly used 3D measurement technique. In this technique, an error that equals integer multiples of the phase-shifted fringe period, i.e. period jump error, often exists in the absolute analog code, which can lead to gross measurement errors. To overcome this problem, the present paper proposes 3D measurement using a combined Gray code and dual-frequency phase-shifting approach. Based on 3D measurement using the combined Gray code and phase-shifting approach, one set of low-frequency phase-shifted fringe patterns with an odd-numbered multiple of the original phase-shifted fringe period is added. Thus, the absolute analog code measured value can be obtained by the combined Gray code and phase-shifting approach, and the low-frequency absolute analog code measured value can also be obtained by adding low-frequency phase-shifted fringe patterns. Then, the corrected absolute analog code measured value can be obtained by correcting the former by the latter, and the period jump errors can be eliminated, resulting in reliable analog code unwrapping. For the proposed approach, we established its measurement model, analyzed its measurement principle, expounded the mechanism of eliminating period jump errors by error analysis, and determined its applicable conditions. Theoretical analysis and experimental results show that the proposed approach can effectively eliminate period jump errors, reliably perform analog code unwrapping, and improve the measurement accuracy.
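The core of the correction step described above (snapping an absolute analog code that may be off by an integer number of fringe periods back toward the coarse low-frequency estimate) can be sketched as follows; function names and values are illustrative, not the paper's implementation:

```python
# A high-frequency absolute analog code may contain period jump errors,
# i.e. offsets of integer multiples of the fringe period T. A coarse but
# jump-free low-frequency code is used to remove them. Illustrative only.
def correct_period_jump(code_high, code_low, period):
    jumps = round((code_high - code_low) / period)
    return code_high - jumps * period

T = 16.0                       # high-frequency fringe period (pixels)
true_code = 100.3
measured = true_code + 2 * T   # a two-period jump error
coarse = 101.1                 # low-frequency estimate: noisy, jump-free
print(correct_period_jump(measured, coarse, T))  # ≈ 100.3, jump removed
```

The correction is reliable as long as the low-frequency code's error stays below half a high-frequency period, which corresponds to the applicable conditions the paper derives.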

  20. Non-coding RNA detection methods combined to improve usability, reproducibility and precision

    Directory of Open Access Journals (Sweden)

    Kreikemeyer Bernd

    2010-09-01

Full Text Available Abstract Background Non-coding RNAs gain more attention as their diverse roles in many cellular processes are discovered. At the same time, the need for efficient computational prediction of ncRNAs increases with the pace of sequencing technology. Existing tools are based on various approaches and techniques, but none of them provides a reliable ncRNA detector yet. Consequently, a natural approach is to combine existing tools. Due to a lack of standard input and output formats, combination and comparison of existing tools is difficult. Also, for genomic scans they often need to be incorporated in detection workflows using custom scripts, which decreases transparency and reproducibility. Results We developed a Java-based framework to integrate existing tools and methods for ncRNA detection. This framework enables users to construct transparent detection workflows and to combine and compare different methods efficiently. We demonstrate the effectiveness of combining detection methods in case studies with the small genomes of Escherichia coli, Listeria monocytogenes and Streptococcus pyogenes. With the combined method, we gained 10% to 20% precision for sensitivities from 30% to 80%. Further, we investigated Streptococcus pyogenes for novel ncRNAs. Using multiple methods, integrated by our framework, we determined four highly probable candidates. We verified all four candidates experimentally using RT-PCR. Conclusions We have created an extensible framework for practical, transparent and reproducible combination and comparison of ncRNA detection methods. We have proven the effectiveness of this approach in tests and by guiding experiments to find new ncRNAs. The software is freely available under the GNU General Public License (GPL), version 3, at http://www.sbi.uni-rostock.de/moses along with source code, screen shots, examples and tutorial material.

  1. Combined approach for gynecomastia

    Directory of Open Access Journals (Sweden)

    El-Sabbagh, Ahmed Hassan

    2016-02-01

Full Text Available Background: Gynecomastia is a deformity of the male chest. Treatment of gynecomastia has varied from direct surgical excision to other techniques (mainly liposuction) to a combination of both. Skin excision is done according to the grade. In this study, experience of using liposuction as an adjuvant to surgical excision is described. Patients and methods: Between September 2012 and April 2015, a total of 14 patients were treated with liposuction and surgical excision through a periareolar incision. Preoperative evaluation was done in all cases to exclude any underlying cause of gynecomastia. Results: All fourteen patients were treated bilaterally (28 breast tissues). Their ages ranged between 13 and 33 years. Two patients were classified as grade I, and four each as grade IIa, IIb and III, respectively. One patient developed a seroma. Partial superficial epidermolysis of the areola occurred in 2 cases. Superficial infection of the incision occurred in one case and was treated conservatively. Conclusion: All grades of gynecomastia were managed by the same approach. Skin excision was added for one patient who had severe skin excess with limited activity and poor skin complexion. No case required a second session or asked for a second opinion.

  2. Spray-formed tooling

    Science.gov (United States)

    McHugh, K. M.; Key, J. F.

    The United States Council for Automotive Research (USCAR) has formed a partnership with the Idaho National Engineering Laboratory (INEL) to develop a process for the rapid production of low-cost tooling based on spray forming technology developed at the INEL. Phase 1 of the program will involve bench-scale system development, materials characterization, and process optimization. In Phase 2, prototype systems will be designed, constructed, evaluated, and optimized. Process control and other issues that influence commercialization will be addressed during this phase of the project. Technology transfer to USCAR, or a tooling vendor selected by USCAR, will be accomplished during Phase 3. The approach INEL is using to produce tooling, such as plastic injection molds and stamping dies, combines rapid solidification processing and net-shape materials processing into a single step. A bulk liquid metal is pressure-fed into a de Laval spray nozzle transporting a high velocity, high temperature inert gas. The gas jet disintegrates the metal into fine droplets and deposits them onto a tool pattern made from materials such as plastic, wax, clay, ceramics, and metals. The approach is compatible with solid freeform fabrication techniques such as stereolithography, selective laser sintering, and laminated object manufacturing. Heat is extracted rapidly, in-flight, by convection as the spray jet entrains cool inert gas to produce undercooled and semi-solid droplets. At the pattern, the droplets weld together while replicating the shape and surface features of the pattern. Tool formation is rapid; deposition rates in excess of 1 ton/h have been demonstrated for bench-scale nozzles.

  3. From theoretical concept to organizational tool for public sector improvement:

    DEFF Research Database (Denmark)

    Ernst, Jette; Hindhede, Anette Lykke; Andersen, Vibeke

    2018-01-01

    Purpose – The purpose of this paper is to examine, first, how social capital was crafted and transformed from a theoretical concept to an organizational tool for public sector improvement that was adopted by a Danish region and implemented in all regional hospitals. Second, the paper examines...... produce a pressure on the department management and the nurses. Originality/value – The explanatory critical framework combined with the ethnographic approach is a useful approach for theorizing and understanding social capital as an example of the emergence and consequences of new managerial tools...

  4. A method and tool for combining differential or inclusive measurements obtained with simultaneously constrained uncertainties

    Science.gov (United States)

    Kieseler, Jan

    2017-11-01

A method is discussed that allows combining sets of differential or inclusive measurements. It is assumed that at least one measurement was obtained by simultaneously fitting a set of nuisance parameters representing sources of systematic uncertainties. As a result of beneficial constraints from the data, all such fitted parameters are correlated among each other. The best approach for a combination of these measurements would be the maximization of a combined likelihood, for which the full fit model of each measurement and the original data are required. However, only in rare cases is this information publicly available. In the absence of this information, most commonly used combination methods are not able to account for these correlations between uncertainties, which can lead to severe biases, as shown in this article. The method discussed here provides a solution for this problem. It relies on the public result and its covariance or Hessian only, and is validated against the combined-likelihood approach. A dedicated software package implementing this method is also presented. It provides a text-based user interface alongside a C++ interface. The latter also interfaces to ROOT classes for simple combination of binned measurements such as differential cross sections.
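For context, the standard covariance-weighted (BLUE / generalized-least-squares) combination that such methods extend can be sketched as below; this is the textbook baseline, not the specific algorithm of the paper, and the numbers are illustrative:

```python
# Covariance-weighted (BLUE) combination of two correlated measurements
# of the same quantity. Illustrative numbers; shown as baseline context,
# not the paper's method, which additionally handles fitted nuisance
# parameters from the published covariance or Hessian.
import numpy as np

x = np.array([10.2, 9.6])             # two measurements
cov = np.array([[0.25, 0.05],
                [0.05, 0.16]])        # their covariance matrix

w = np.linalg.solve(cov, np.ones(2))  # unnormalized BLUE weights
w /= w.sum()                          # weights sum to one
combined = w @ x                      # minimum-variance estimate
var = w @ cov @ w                     # variance of the combination
print(f"combined = {combined:.3f} +/- {np.sqrt(var):.3f}")
```

The bias the record warns about arises exactly when the off-diagonal terms of `cov` are wrong or unavailable, since the weights and the combined value depend directly on them.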

  5. A method and tool for combining differential or inclusive measurements obtained with simultaneously constrained uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Kieseler, Jan [CERN, Geneva (Switzerland)]

    2017-11-15

A method is discussed that allows combining sets of differential or inclusive measurements. It is assumed that at least one measurement was obtained by simultaneously fitting a set of nuisance parameters representing sources of systematic uncertainties. As a result of beneficial constraints from the data, all such fitted parameters are correlated among each other. The best approach for a combination of these measurements would be the maximization of a combined likelihood, for which the full fit model of each measurement and the original data are required. However, only in rare cases is this information publicly available. In the absence of this information, most commonly used combination methods are not able to account for these correlations between uncertainties, which can lead to severe biases, as shown in this article. The method discussed here provides a solution for this problem. It relies on the public result and its covariance or Hessian only, and is validated against the combined-likelihood approach. A dedicated software package implementing this method is also presented. It provides a text-based user interface alongside a C++ interface. The latter also interfaces to ROOT classes for simple combination of binned measurements such as differential cross sections. (orig.)

  6. Developing a Crowdsourcing Approach and Tool for Pharmacovigilance Education Material Delivery.

    Science.gov (United States)

    Bate, Andrew; Beckmann, Jürgen; Dodoo, Alexander; Härmark, Linda; Hartigan-Go, Kenneth; Hegerius, Anna; Lindquist, Marie; van Puijenbroek, Eugène; Tuccori, Marco; Hagemann, Ulrich

    2017-03-01

The number of pharmacovigilance professionals worldwide is increasing, with high staff turnover. There is a constant stream of new colleagues with an interest or need to learn about the discipline, and consequently an increasing need for training in pharmacovigilance. An important step towards this has been made through developing and publishing the World Health Organization (WHO)-International Society of Pharmacovigilance (ISoP) Pharmacovigilance Curriculum. To use the Pharmacovigilance Curriculum effectively, it should be supplemented with comprehensive training material from various sources, making it an attractive, high-utility product. We describe a pilot of the development and initial evaluation of a crowdsourcing tool for the provision of pharmacovigilance education material. Pharmacovigilance experts shared links to their material under the relevant sections of the curriculum hierarchy, and a small group of organisations conducted initial testing. In this pilot, we have shown the usability of such a web-based tool. The strengths of this approach include the potential for a routine 'democratic' approach to sharing educational material with a wider community and openness of access.

  7. The Planets Approach to Migration Tools

    DEFF Research Database (Denmark)

    Zierau, Eld; van Wijk, Caroline

    2008-01-01

The first claim is that the market will cover the required tools for commonly used formats. The second claim is that in the long term fewer tools will be required due to the growing use of archival standard formats. The Planets view on the current situation, the scope of tool development and the claims stated are...

  8. A new approach for heparin standardization: combination of scanning UV spectroscopy, nuclear magnetic resonance and principal component analysis.

    Directory of Open Access Journals (Sweden)

    Marcelo A Lima

Full Text Available The year 2007 was marked by widespread adverse clinical responses to heparin use, leading to a global recall of potentially affected heparin batches in 2008. Several analytical methods have since been developed to detect impurities in heparin preparations; however, many are costly and dependent on instrumentation with only limited accessibility. A method based on a simple UV-scanning assay, combined with principal component analysis (PCA), was developed to detect impurities, such as glycosaminoglycans, other complex polysaccharides and aromatic compounds, in heparin preparations. Results were confirmed by NMR spectroscopy. This approach provides an additional, sensitive tool to determine heparin purity and safety, even when NMR spectroscopy failed, requiring only standard laboratory equipment and computing facilities.

  9. Advances in combining gene therapy with cell and tissue engineering-based approaches to enhance healing of the meniscus.

    Science.gov (United States)

    Cucchiarini, M; McNulty, A L; Mauck, R L; Setton, L A; Guilak, F; Madry, H

    2016-08-01

    Meniscal lesions are common problems in orthopaedic surgery and sports medicine, and injury or loss of the meniscus accelerates the onset of knee osteoarthritis (OA). Despite a variety of therapeutic options in the clinics, there is a critical need for improved treatments to enhance meniscal repair. In this regard, combining gene-, cell-, and tissue engineering-based approaches is an attractive strategy to generate novel, effective therapies to treat meniscal lesions. In the present work, we provide an overview of the tools currently available to improve meniscal repair and discuss the progress and remaining challenges for potential future translation in patients. Copyright © 2016 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.

  10. uVis: A Formula-Based Visualization Tool

    DEFF Research Database (Denmark)

    Pantazos, Kostas; Xu, Shangjin; Kuhail, Mohammad Amin

    Several tools use programming approaches for developing advanced visualizations. Others can with a few steps create simple visualizations with built-in patterns, and users with limited IT experience can use them. However, it is programming and time demanding to create and customize...... these visualizations. We introduce uVis, a tool that allows users with advanced spreadsheet-like IT knowledge and basic database understanding to create simple as well as advanced visualizations. These users construct visualizations by combining building blocks (i.e. controls, shapes). They specify spreadsheet...

  11. Interdisciplinary Approach to Tool-Handle Design Based on Medical Imaging

    Directory of Open Access Journals (Sweden)

    G. Harih

    2013-01-01

Full Text Available Products are becoming increasingly complex; therefore, designers are faced with the challenging task of incorporating new functionality, higher performance and optimal shape design. Traditional user-centered design techniques, such as designing with anthropometric data, do not incorporate enough subject data to design products with an optimal shape for best fit to the target population. To overcome these limitations, we present an interdisciplinary approach based on medical imaging. The use of this approach is presented through the development of an optimally sized and shaped tool handle, where the hand is imaged using a magnetic resonance imaging machine. The obtained images of the hand are reconstructed and imported into computer-aided design software, where the optimal shape of the handle is obtained with Boolean operations. The methods can be used to develop fully customized products with an optimal shape that provides the best fit to the target population. This increases subjective comfort rating and performance, and can prevent acute and cumulative trauma disorders. The methods provided are especially suited for products where high stresses and exceptional performance are expected (high-performance tools, professional sports, military equipment, etc.). With the use of these interdisciplinary methods, the value of the product is increased, which also increases the competitiveness of the product on the market.

  12. OPTIMIZING USABILITY OF AN ECONOMIC DECISION SUPPORT TOOL: PROTOTYPE OF THE EQUIPT TOOL.

    Science.gov (United States)

    Cheung, Kei Long; Hiligsmann, Mickaël; Präger, Maximilian; Jones, Teresa; Józwiak-Hagymásy, Judit; Muñoz, Celia; Lester-George, Adam; Pokhrel, Subhash; López-Nicolás, Ángel; Trapero-Bertran, Marta; Evers, Silvia M A A; de Vries, Hein

    2018-01-01

Economic decision-support tools can provide valuable information for tobacco control stakeholders, but their usability may impact the adoption of such tools. This study aims to illustrate a mixed-method usability evaluation of an economic decision-support tool for tobacco control, using the EQUIPT ROI tool prototype as a case study. A cross-sectional mixed-methods design was used, including a heuristic evaluation, a thinking-aloud approach, and a questionnaire testing and exploring the usability of the Return on Investment tool. A total of sixty-six users evaluated the tool (thinking aloud) and completed the questionnaire. For the heuristic evaluation, four experts evaluated the interface. In total, twenty-one percent of the respondents perceived good usability. A total of 118 usability problems were identified, of which twenty-six were categorized as most severe, indicating high priority to fix them before implementation. Combining user-based and expert-based evaluation methods is recommended, as these were shown to identify unique usability problems. The evaluation provides input to optimize the usability of a decision-support tool, and may serve as a vantage point for other developers conducting usability evaluations to refine similar tools before wide-scale implementation. Such studies could reduce implementation gaps by optimizing usability, enhancing in turn the research impact of such interventions.

  13. Darcy Tools version 3.4. User's Guide

    International Nuclear Information System (INIS)

    Svensson, Urban; Ferry, Michel

    2010-12-01

DarcyTools is a computer code for the simulation of flow and transport in porous and/or fractured media. The fractured medium in mind is fractured rock, and the porous medium the soil cover on top of the rock; the class of flows in mind is hence groundwater flow. DarcyTools has been developed through a collaborative effort by SKB (the Swedish Nuclear Fuel and Waste Management Company), MFRDC (Michel Ferry R&D Consulting) and CFE AB (Computer-aided Fluid Engineering AB). It builds upon earlier development of groundwater models, carried out by CFE during the last twenty years. In the earlier work the CFD code PHOENICS (Spalding 1981) was used as an equation solver. DarcyTools is based on a solver called MIGAL (Ferry 2002). It has, however, been carefully verified that the two solvers produce very similar solutions, so the earlier work remains valid as background for DarcyTools. The present report focuses on the software that constitutes DarcyTools. Two accompanying reports cover other aspects: - Concepts, Methods and Equations (Svensson et al. 2010) (hereafter denoted Report 1). - Verification, Validation and Demonstration (Svensson 2010) (hereafter denoted Report 2). Two basic approaches in groundwater modelling can be identified: in one we define grid cell conductivities (sometimes called the continuum porous-medium (CPM) approach, i.e. Jackson et al. 2000); in the other we calculate the flow through the fracture network directly (the DFN approach). Both approaches have their merits and drawbacks, which will not be discussed here (for a discussion, see Sahimi 1995). In DarcyTools the two approaches are combined, meaning that we first generate a fracture network and then represent the network as grid cell properties. Further background information is given in the two reports mentioned above.

  14. Darcy Tools version 3.4. User's Guide

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, Urban [Computer-aided Fluid Engineering AB, Lyckeby (Sweden)]; Ferry, Michel [MFRDC, Orvault (France)]

    2010-12-15

DarcyTools is a computer code for the simulation of flow and transport in porous and/or fractured media. The fractured medium in mind is fractured rock, and the porous medium the soil cover on top of the rock; the class of flows in mind is hence groundwater flow. DarcyTools has been developed through a collaborative effort by SKB (the Swedish Nuclear Fuel and Waste Management Company), MFRDC (Michel Ferry R&D Consulting) and CFE AB (Computer-aided Fluid Engineering AB). It builds upon earlier development of groundwater models, carried out by CFE during the last twenty years. In the earlier work the CFD code PHOENICS (Spalding 1981) was used as an equation solver. DarcyTools is based on a solver called MIGAL (Ferry 2002). It has, however, been carefully verified that the two solvers produce very similar solutions, so the earlier work remains valid as background for DarcyTools. The present report focuses on the software that constitutes DarcyTools. Two accompanying reports cover other aspects: - Concepts, Methods and Equations (Svensson et al. 2010) (hereafter denoted Report 1). - Verification, Validation and Demonstration (Svensson 2010) (hereafter denoted Report 2). Two basic approaches in groundwater modelling can be identified: in one we define grid cell conductivities (sometimes called the continuum porous-medium (CPM) approach, i.e. Jackson et al. 2000); in the other we calculate the flow through the fracture network directly (the DFN approach). Both approaches have their merits and drawbacks, which will not be discussed here (for a discussion, see Sahimi 1995). In DarcyTools the two approaches are combined, meaning that we first generate a fracture network and then represent the network as grid cell properties. Further background information is given in the two reports mentioned above.

  15. BrainCheck - a very brief tool to detect incipient cognitive decline: optimized case-finding combining patient- and informant-based data.

    Science.gov (United States)

    Ehrensperger, Michael M; Taylor, Kirsten I; Berres, Manfred; Foldi, Nancy S; Dellenbach, Myriam; Bopp, Irene; Gold, Gabriel; von Gunten, Armin; Inglin, Daniel; Müri, René; Rüegger, Brigitte; Kressig, Reto W; Monsch, Andreas U

    2014-01-01

Optimal identification of subtle cognitive impairment in the primary care setting requires a very brief tool combining (a) patients' subjective impairments, (b) cognitive testing, and (c) information from informants. The present study developed a new, very quick and easily administered case-finding tool combining these assessments ('BrainCheck') and tested the feasibility and validity of this instrument in two independent studies. We developed a case-finding tool comprising patient-directed (a) questions about memory and depression and (b) clock drawing, and (c) the informant-directed 7-item version of the Informant Questionnaire on Cognitive Decline in the Elderly (IQCODE). Feasibility study: 52 general practitioners rated the feasibility and acceptance of the patient-directed tool. Validation study: An independent group of 288 Memory Clinic patients (mean ± SD age = 76.6 ± 7.9, education = 12.0 ± 2.6; 53.8% female) with diagnoses of mild cognitive impairment (n = 80), probable Alzheimer's disease (n = 185), or major depression (n = 23) and 126 demographically matched, cognitively healthy volunteer participants (age = 75.2 ± 8.8, education = 12.5 ± 2.7; 40% female) took part. All patient and healthy control participants were administered the patient-directed tool, and informants of 113 patient and 70 healthy control participants completed the very short IQCODE. Feasibility study: General practitioners rated the patient-directed tool as highly feasible and acceptable. Validation study: A Classification and Regression Tree analysis generated an algorithm to categorize patient-directed data which resulted in a correct classification rate (CCR) of 81.2% (sensitivity = 83.0%, specificity = 79.4%). Critically, the CCR of the combined patient- and informant-directed instruments (BrainCheck) reached nearly 90% (that is, 89.4%; sensitivity = 97.4%, specificity = 81.6%). A new and very brief instrument for

  16. Weighting Performance Evaluation Criteria Base in Balanced Score Card Approach with Use of Combination Method Shapley value & Bull's-eye

    Directory of Open Access Journals (Sweden)

    Mohammad Hassan Kamfiroozi

    2014-05-01

    Full Text Available Performance evaluation is a control tool used by managers in organizations and manufacturing companies. In this paper we present a new model for performance evaluation and ranking of industrial companies under uncertain conditions. Performance evaluation is implemented using the balanced score card (BSC) method. In addition, three-parameter interval grey numbers are used in place of linguistic variables. Evaluation and weighting of the four indicators is then done using the combined Bull's-eye-Shapley method, which constitutes the new approach in this article. The reason for using three-parameter interval grey numbers and the combination method is to reduce environmental uncertainty in the data and the model. This combined weighting method can be used as a new method in decision-making science. Finally, a case study on industrial companies (nail makers) is presented, in which the companies are ranked using the grey TOPSIS method (a generalization of classic TOPSIS to three-parameter interval grey numbers).
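
    The Shapley value used in the combined weighting step treats each evaluation criterion as a player in a cooperative game and weights it by its average marginal contribution over all orderings. A toy sketch with an invented three-criterion characteristic function (the coalition values are illustrative only, not from the paper):

```python
from itertools import permutations

def shapley(players, v):
    """Average marginal contribution of each player over all orderings."""
    phi = {p: 0.0 for p in players}
    perms = list(permutations(players))
    for order in perms:
        coalition = frozenset()
        for p in order:
            phi[p] += v(coalition | {p}) - v(coalition)
            coalition = coalition | {p}
    return {p: phi[p] / len(perms) for p in players}

# Hypothetical characteristic function: the value each coalition of
# BSC criteria contributes to the overall evaluation.
values = {
    frozenset(): 0, frozenset("A"): 10, frozenset("B"): 20, frozenset("C"): 30,
    frozenset("AB"): 40, frozenset("AC"): 50, frozenset("BC"): 60,
    frozenset("ABC"): 90,
}
weights = shapley("ABC", lambda s: values[frozenset(s)])
```

    By construction the weights sum to the grand-coalition value (here 90), which is what makes the Shapley value attractive for fairly distributing criterion importance.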

  17. "Combining equity and utilitarianism"-additional insights into a novel approach

    NARCIS (Netherlands)

    Lemmen-Gerdessen, van Joke; Kanellopoulos, Argyris; Claassen, Frits

    2018-01-01

    Recently, a novel approach (to be referred to as CEU) was introduced for the frequently arising problem of combining the conflicting criteria of equity and utilitarianism. This paper provides additional insights into CEU and assesses its added value for practice by comparing it with a commonly used

  18. Cell manipulation tool with combined microwell array and optical tweezers for cell isolation and deposition

    International Nuclear Information System (INIS)

    Wang, Xiaolin; Gou, Xue; Chen, Shuxun; Yan, Xiao; Sun, Dong

    2013-01-01

    Isolation of rare cells and accurate deposition of sorted cells for further study are critical to a wide range of biomedical applications. In the current paper, we report an automated cell manipulation tool combining optical tweezers with a uniquely designed microwell array, which enables recognition, isolation, assembly, transport, and deposition of cells of interest. The microwell array allows passive hydrodynamic docking of cells, while offering the opportunity to inspect cell phenotypes of interest with high spatio-temporal resolution based on flexible image processing. In addition, dynamic and parallel cell manipulation in three dimensions enables levitation of target cells from the microwells and pattern assembly with multiple optical traps. Integrated with a programmed motorized stage, the optically levitated and assembled cells can be transported and deposited to a predefined microenvironment, so the tool can be integrated with other on-chip functionalities for further study without removing the isolated cells from the chip. Experiments on human embryonic stem cells and yeast cells demonstrate the effectiveness of the proposed cell manipulation tool. Beyond cell isolation and deposition, three other biological applications of the tool are also presented. (paper)

  19. Commissioning software tools at the Advanced Photon Source

    International Nuclear Information System (INIS)

    Emery, L.

    1995-01-01

    A software tool-oriented approach has been adopted in the commissioning of the Advanced Photon Source (APS) at Argonne National Laboratory, particularly in the commissioning of the Positron Accumulator Ring (PAR). The general philosophy is to decompose a complicated procedure involving measurement, data processing, and control into a series of simpler steps, each accomplished by a generic toolkit program. The implementation is greatly facilitated by adopting the SDDS (self-describing data set) protocol, which comes with its own toolkit. The combined toolkit has made accelerator physics measurements easier. For instance, the measurement of the optical functions of the PAR and the beamlines connected to it has been largely automated. Complicated measurements are feasible with a combination of tools running independently.

  20. MURMoT: Design and Application of Microbial Uranium Reduction Monitoring Tools

    Energy Technology Data Exchange (ETDEWEB)

    Pennell, Kurt [Tufts Univ., Medford, MA (United States)

    2014-12-31

    The overarching project goal of the MURMoT project was the design of tools to elucidate the presence, abundance, dynamics, spatial distribution, and activity of metal- and radionuclide-transforming bacteria. To accomplish these objectives, an integrated approach that combined nucleic acid-based tools, proteomic workflows, uranium isotope measurements, and U(IV) speciation and structure analyses using the Advanced Photon Source (APS) at Argonne National Laboratory was developed.

  1. MURMoT: Design and Application of Microbial Uranium Reduction Monitoring Tools

    International Nuclear Information System (INIS)

    Pennell, Kurt

    2014-01-01

    The overarching project goal of the MURMoT project was the design of tools to elucidate the presence, abundance, dynamics, spatial distribution, and activity of metal- and radionuclide-transforming bacteria. To accomplish these objectives, an integrated approach that combined nucleic acid-based tools, proteomic workflows, uranium isotope measurements, and U(IV) speciation and structure analyses using the Advanced Photon Source (APS) at Argonne National Laboratory was developed.

  2. Management of advanced intracranial intradural juvenile nasopharyngeal angiofibroma: combined single-stage rhinosurgical and neurosurgical approach.

    Science.gov (United States)

    Naraghi, Mohsen; Saberi, Hooshang; Mirmohseni, Atefeh Sadat; Nikdad, Mohammad Sadegh; Afarideh, Mohsen

    2015-07-01

    Although intracranial extension of juvenile nasopharyngeal angiofibroma (JNA) occurs commonly, intradural penetration is extremely rare. Management of such tumors is a challenging issue in skull-base surgery, necessitating their removal via combined approaches. In this work, we share our experience in the management of extensive intradural JNA. In a university hospital-based setting of 2 tertiary care academic centers, the retrospective charts of 6 male patients (5 between 15 and 19 years old) were reviewed. Patients presented chiefly with nasal obstruction, epistaxis, and proptosis. One case was an aggressive recurrent tumor in a 32-year-old patient. All cases underwent combined transnasal, transmaxillary, and craniotomy approaches assisted by image-guided endoscopic surgery, with craniotomy preceding the rhinosurgical approach in 3 cases. Adding a transcranial approach to the transnasal and transmaxillary endoscopic approaches provided 2-sided exposure and appreciable access to the huge intradural JNAs. One postoperative cerebrospinal fluid leak and 1 postoperative recurrence at the site of the infratemporal fossa were treated successfully. Otherwise, the course was uneventful in the remaining cases. Management of intracranial intradural JNA requires a multidisciplinary approach of combined open and endoscopic-assisted rhinosurgery and neurosurgery, because of the greater risk for complications during the dissection. Carotid rupture and brain damage remain 2 catastrophic complications that should always be kept in mind. A combined rhinosurgical and neurosurgical approach also has the advantage of very modest cosmetic complications. © 2015 ARS-AAOA, LLC.

  3. Hybrid Neural Network Approach Based Tool for the Modelling of Photovoltaic Panels

    Directory of Open Access Journals (Sweden)

    Antonino Laudani

    2015-01-01

    Full Text Available A tool based on a hybrid neural network approach for identifying the photovoltaic one-diode model is presented. The generalization capabilities of neural networks are used together with the robustness of the reduced form of the one-diode model. Indeed, from the studies performed by the authors and the works present in the literature, it was found that a direct computation of the five parameters via a multiple-input, multiple-output neural network is a very difficult task. The reduced form consists of a series of explicit formulae supporting the neural network, which in our case is aimed at predicting just two of the five parameters identifying the model; the other three parameters are computed by the reduced form. The present hybrid approach is efficient from the computational cost point of view and accurate in the estimation of the five parameters. It constitutes a complete and extremely easy-to-use tool suitable for implementation in a microcontroller-based architecture. Validations are made on about 10000 PV panels belonging to the California Energy Commission database.
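
    The reduced-form idea is that once the network predicts some of the five one-diode parameters, the remaining ones follow from explicit datasheet relations. A simplified sketch of that step, assuming the network supplies the ideality factor n and the resistances Rs and Rsh (the split between predicted and computed parameters, and the specific short-circuit/open-circuit approximations used here, are illustrative — not the paper's exact formulae):

```python
from math import exp

# Datasheet values at STC for a hypothetical 60-cell module.
Isc, Voc, Ns, T = 8.5, 37.0, 60, 298.15
k, q = 1.380649e-23, 1.602176634e-19
Vt = k * T / q                              # thermal voltage of one cell

def remaining_parameters(n, Rs, Rsh):
    """Given (n, Rs, Rsh), recover Iph and I0 from the short-circuit and
    open-circuit conditions (common approximations; the tiny diode term
    at short circuit is neglected)."""
    Iph = Isc * (Rs + Rsh) / Rsh                                # short circuit
    I0 = (Iph - Voc / Rsh) / (exp(Voc / (n * Ns * Vt)) - 1.0)   # open circuit
    return Iph, I0

Iph, I0 = remaining_parameters(n=1.1, Rs=0.3, Rsh=300.0)
```

    This is why the network only has to learn two quantities: the other three are algebraically pinned down by the datasheet operating points.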

  4. Synchronous method and engineering tool for the strategic factory planning

    OpenAIRE

    Abdul Rahman, O.; Jaeger, J.; Constantinescu, C.

    2011-01-01

    This paper presents an approach combining two reference methods and engineering tools, for "Factory Performance and Investment Planning" as well as "Value Added Ideal Production Network Planning". The resulting synchronous method aims to support factories in strategic planning as well as in network planning. The corresponding engineering tool is employed for assessment planning, sales planning, capacity planning and production cost planning under the consideration of dynamic and stoc...

  5. A combined stochastic programming and optimal control approach to personal finance and pensions

    DEFF Research Database (Denmark)

    Konicz, Agnieszka Karolina; Pisinger, David; Rasmussen, Kourosh Marjani

    2015-01-01

    The paper presents a model that combines a dynamic programming (stochastic optimal control) approach and a multi-stage stochastic linear programming approach (SLP), integrated into one SLP formulation. Stochastic optimal control produces an optimal policy that is easy to understand and implement....

  6. arXiv A method and tool for combining differential or inclusive measurements obtained with simultaneously constrained uncertainties

    CERN Document Server

    Kieseler, Jan

    2017-11-22

    A method is discussed that allows combining sets of differential or inclusive measurements. It is assumed that at least one measurement was obtained by simultaneously fitting a set of nuisance parameters representing sources of systematic uncertainties. As a result of beneficial constraints from the data, all such fitted parameters are correlated among each other. The best approach for a combination of these measurements would be the maximization of a combined likelihood, for which the full fit model of each measurement and the original data are required. However, this information is only rarely publicly available. In the absence of this information, most commonly used combination methods are not able to account for these correlations between uncertainties, which can lead to severe biases, as shown in this article. The method discussed here provides a solution for this problem. It relies on the public result and its covariance or Hessian only, and is validated against the combined-likelihood approach. A d...
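
    In the simplest two-measurement case, the covariance-aware combination the article motivates reduces to a generalized weighted average: with variances s1², s2² and covariance c between the measurements, the minimum-variance unbiased weight on the first measurement is w = (s2² − c)/(s1² + s2² − 2c). A toy sketch with invented numbers:

```python
def combine(x1, s1, x2, s2, c):
    """Minimum-variance combination of two correlated measurements.
    c is the covariance between them (c = rho * s1 * s2)."""
    w = (s2**2 - c) / (s1**2 + s2**2 - 2 * c)
    xhat = w * x1 + (1 - w) * x2
    var = w**2 * s1**2 + (1 - w)**2 * s2**2 + 2 * w * (1 - w) * c
    return xhat, var**0.5

# Two hypothetical measurements of the same quantity whose systematic
# uncertainties are 40% correlated.
xhat, err = combine(172.5, 0.8, 173.1, 1.0, c=0.4 * 0.8 * 1.0)
```

    With strong positive correlation the combined uncertainty improves only modestly on the better single measurement — which is exactly why ignoring these correlations can badly bias a naive weighted average.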

  7. Concrete Plant Operations Optimization Using Combined Simulation and Genetic Algorithms

    NARCIS (Netherlands)

    Cao, Ming; Lu, Ming; Zhang, Jian-Ping

    2004-01-01

    This work presents a new approach for concrete plant operations optimization by combining a ready mixed concrete (RMC) production simulation tool (called HKCONSIM) with a genetic algorithm (GA) based optimization procedure. A revamped HKCONSIM computer system can be used to automate the simulation

  8. Dural opening/removal for combined petrosal approach: technical note.

    Science.gov (United States)

    Terasaka, Shunsuke; Asaoka, Katsuyuki; Kobayashi, Hiroyuki; Sugiyama, Taku; Yamaguchi, Shigeru

    2011-03-01

    Detailed descriptions of stepwise dural opening/removal for combined petrosal approach are presented. Following maximum bone work, the first dural incision was made along the undersurface of the temporal lobe parallel to the superior petrosal sinus. Posterior extension of the dural incision was made in a curved fashion, keeping away from the transverse-sigmoid junction and taking care to preserve the vein of Labbé. A second incision was made perpendicular to the first incision. After sectioning the superior petrosal sinus around the porus trigeminus, the incision was extended toward the posterior fossa dura in the middle fossa region. The tentorium was incised toward the incisura at a point just posterior to the entrance of the trochlear nerve. A third incision was made longitudinally between the superior petrosal sinus and the jugular bulb. A final incision was initiated perpendicular to the third incision in the presigmoid region and extended parallel to the superior petrosal sinus connecting the second incision. The dural complex consisting of the temporal lobe dura, the posterior fossa dura, and the freed tentorium could then be removed. In addition to extensive bone resection, our strategic cranial base dural opening/removal can yield true advantages for the combined petrosal approach.

  9. Urban Green Infrastructure as a tool for urban heat mitigation

    NARCIS (Netherlands)

    Saaroni, H.; Amorim, J.H.; Hiemstra, J.A.; Pearlmutter, D.

    2018-01-01

    The combined trends of urban heat island intensification and global warming are focusing attention on greening of cities as a tool for urban heat mitigation. Our study examines the range of research approaches and findings regarding the role of urban green infrastructure (UGI) in mitigating urban

  10. Application of MCDM based hybrid optimization tool during turning of ASTM A588

    Directory of Open Access Journals (Sweden)

    Himadri Majumder

    2017-07-01

    Full Text Available The multi-criteria decision making (MCDM) approach is one of the most capable tools for solving troublesome, tangled optimization problems in the machining area, owing to its ability to handle complex optimization problems in the production process. Turning is widely used in manufacturing as it offers enormous advantages such as good product quality, customer satisfaction, economy, and relative ease of application. A contemporary approach, MOORA coupled with PCA, was used to ascertain the optimal combination of input parameters (spindle speed, depth of cut, and feed rate) for the given output parameters (power consumption, average surface roughness, and frequency of tool vibration), using an L27 orthogonal array for turning of ASTM A588 mild steel. A comparison between MOORA-PCA and TOPSIS-PCA shows the effectiveness of the MOORA method over TOPSIS. The optimum parameter combination for multi-performance characteristics of ASTM A588 mild steel was established as spindle speed 160 rpm, depth of cut 0.1 mm, and feed rate 0.08 mm/rev. This study therefore demonstrates the application of the hybrid MCDM approach as a vital decision-making tool for multi-objective optimization problems.
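
    The MOORA method at the core of this record ranks alternatives by vector-normalizing each criterion column, then summing the normalized beneficial criteria and subtracting the non-beneficial ones. A minimal sketch on invented turning data (the PCA coupling and the actual L27 experimental design are omitted; the alternatives and criteria below are illustrative):

```python
def moora(matrix, beneficial):
    """matrix[i][j]: score of alternative i on criterion j.
    beneficial[j]: True if higher is better for criterion j."""
    ncrit = len(matrix[0])
    # Vector normalization: divide each column by its Euclidean norm.
    norms = [sum(row[j] ** 2 for row in matrix) ** 0.5 for j in range(ncrit)]
    scores = []
    for row in matrix:
        s = sum((1 if beneficial[j] else -1) * row[j] / norms[j]
                for j in range(ncrit))
        scores.append(s)
    return scores

# Hypothetical alternatives: [roughness um, power kW, removal rate cm3/min]
matrix = [[1.2, 2.0, 30.0],
          [0.9, 2.4, 25.0],
          [1.5, 1.8, 40.0]]
scores = moora(matrix, beneficial=[False, False, True])
best = scores.index(max(scores))  # index of the best-ranked alternative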

  11. Pesticides in the Lake Kinneret basin: a combined approach towards mircopollutant management

    Science.gov (United States)

    Gaßmann, M.; Friedler, E.; Dubwoski, Y.; Dinerman, E.; Olsson, O.; Bauer, M.

    2009-04-01

    Lake Kinneret is the only large surface waterbody in Israel, supplying about 27% of the country's freshwater. Water quality in Lake Kinneret is of major concern, and improving the ecological status of this large water body is now a national priority. While many studies in the past focused on nutrient inflows and phytoplankton dynamics, less research has been done on assessing the fate and pathways of micropollutants in semi-arid environments in general, and in Lake Kinneret in particular. Since the watershed area of Lake Kinneret is used primarily for agriculture, it is important to evaluate the fate and dynamic transfer of organic micropollutants such as pesticides and herbicides in the watershed streams and in the lake itself. This study introduces a combined concept of extensive measurements and modelling tools to observe and simulate the pesticide release chain: (i) application, (ii) diffuse release to rivers, (iii) transport in the river, and (iv) accumulation in the lake. The available information on application zones (i) and the amounts of pesticides used is based on stakeholder interviews, a survey of the different crop types and orchards, and a comparison with sold amounts of the target pesticides (Melman and Bar-Ilan 2008). In the current research, a single-field mass balance of pesticides is carried out to determine the field release to rivers (ii) through an extensive measurement campaign on the different compartments (soil, vegetation, atmosphere) and phases (water, air, solids) of a single field. The mass balance yields a pesticide release pattern, which is then carried over into the modelling approach. Transport of pesticides in rivers (iii) is modelled on the basis of a recently developed stream network model for ephemeral streams (MOHID River), introducing important in-stream fate processes of pesticides and supported by six in-stream measurement stations for hydrological and pesticide data in the basin. To determine the final

  12. The function of prehistoric lithic tools: a combined study of use-wear analysis and FTIR microspectroscopy.

    Science.gov (United States)

    Nunziante Cesaro, Stella; Lemorini, Cristina

    2012-02-01

    The application of combined use-wear analysis and FTIR microspectroscopy to the investigation of flint and obsidian tools from the archaeological sites of Masseria Candelaro (Foggia, Italy) and Sant'Anna di Oria (Brindisi, Italy), aiming to clarify their functional use, is described. A very high percentage of the tools excavated at the former site showed spectroscopically detectable residues on their working edges. The identification of micro-deposits is based on comparison with a large number of replicas studied under the same experimental conditions. The FTIR data confirmed the use-wear analysis suggestions in almost all cases and added details about the materials processed and the working procedures. Copyright © 2011 Elsevier B.V. All rights reserved.

  13. Approach to the problem of combined radiation and environmental effect standardization

    International Nuclear Information System (INIS)

    Burykina, L.N.; Ajzina, N.L.; Vasil'eva, L.A.; Veselovskaya, K.A.; Likhachev, Yu.P.; Ponomareva, V.L.; Satarina, S.M.; Shmeleva, E.V.

    1978-01-01

    Rats were used to study combined forms of damage caused by radioactive substances with various types of distribution (¹³¹I and ¹⁴⁷Pm) and by external radiation sources (γ, X). Damage caused by combined radiation and dust factors was also studied. Synergism of the combined effect of the tolerance dose of introduced ¹⁴⁷Pm and preceding external general γ-irradiation was determined. The combined action of ¹³¹I and external γ- and X-ray radiation exhibited an additive effect on rat thyroid glands. The combined action of dust and radiation factors showed that the biological effect depended on the dose absorbed in a critical organ (the lungs). The results of the investigations point to an important role of critical organs (systems) and the degree of their radiosensitivity in the response of the body to combined internal and external irradiation. The facts presented show that the approach to standardizing radiation factors from the position of partial summation should be changed. This may be accomplished by using a combination factor, determined experimentally, which reflects the relative biological efficiency of the combined effects as compared to separate ones

  14. A combined telemetry - tag return approach to estimate fishing and natural mortality rates of an estuarine fish

    Science.gov (United States)

    Bacheler, N.M.; Buckel, J.A.; Hightower, J.E.; Paramore, L.M.; Pollock, K.H.

    2009-01-01

    A joint analysis of tag return and telemetry data should improve estimates of mortality rates for exploited fishes; however, the combined approach has thus far only been tested in terrestrial systems. We tagged subadult red drum (Sciaenops ocellatus) with conventional tags and ultrasonic transmitters over 3 years in coastal North Carolina, USA, to test the efficacy of the combined telemetry - tag return approach. There was a strong seasonal pattern to monthly fishing mortality rate (F) estimates from both conventional and telemetry tags; highest F values occurred in fall months and lowest levels occurred during winter. Although monthly F values were similar in pattern and magnitude between conventional tagging and telemetry, information on F in the combined model came primarily from conventional tags. The estimated natural mortality rate (M) in the combined model was low (estimated annual rate ± standard error: 0.04 ± 0.04) and was based primarily upon the telemetry approach. Using high-reward tagging, we estimated different tag reporting rates for state agency and university tagging programs. The combined telemetry - tag return approach can be an effective approach for estimating F and M as long as several key assumptions of the model are met.
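
    The instantaneous-rates framework behind such tag return models links fishing mortality F and natural mortality M through total mortality Z = F + M: the expected fraction of tags recovered in a period is the Baranov exploitation fraction (F/Z)(1 − e^(−Z)) scaled by the tag reporting rate. A toy sketch that inverts this relation for F (the observed return fraction and reporting rate are invented; M = 0.04 echoes the study's estimate):

```python
from math import exp

def expected_return(F, M, reporting):
    """Baranov exploitation fraction times the tag reporting rate."""
    Z = F + M
    return reporting * (F / Z) * (1.0 - exp(-Z))

def solve_F(observed, M, reporting, lo=1e-6, hi=5.0):
    """Bisection; expected_return is monotonically increasing in F."""
    for _ in range(80):
        mid = (lo + hi) / 2.0
        if expected_return(mid, M, reporting) < observed:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

# Hypothetical: 18% of tags returned in a year, 90% reporting rate.
F = solve_F(0.18, M=0.04, reporting=0.9)
```

    In the actual combined model F, M, and the reporting rates are estimated jointly by maximum likelihood, with telemetry relocations informing M and conventional returns informing F.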

  15. Combined SAFE/SNAP approach to safeguards evaluation

    International Nuclear Information System (INIS)

    Engi, D.; Chapman, L.D.; Grant, F.H.; Polito, J.

    1980-01-01

    Generally, the scope of a safeguards evaluation model can efficiently address one of two issues: (1) global safeguards effectiveness, or (2) vulnerability analysis for individual scenarios. The Safeguards Automated Facility Evaluation (SAFE) focuses on (1), while the Safeguards Network Analysis Procedure (SNAP) is directed at (2). SAFE addresses (1) in that it considers the entire facility, i.e., the composite system of hardware and human components, in one global analysis. SNAP addresses (2) by providing a safeguards modeling symbology sufficiently flexible to represent quite complex scenarios from the standpoint of hardware interfaces while also accounting for a rich variety of human decision making. A combined SAFE/SNAP approach to the problem of safeguards evaluation is described and illustrated through an example.

  16. Darcy Tools version 3.4. User's Guide

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, Urban (Computer-aided Fluid Engineering AB, Lyckeby (Sweden)); Ferry, Michel (MFRDC, Orvault (France))

    2010-12-15

    DarcyTools is a computer code for simulation of flow and transport in porous and/or fractured media. The fractured medium in mind is fractured rock, and the porous medium is the soil cover on top of the rock; it is hence groundwater flow that is the class of flows in mind. DarcyTools is developed through a collaborative effort by SKB (the Swedish Nuclear Fuel and Waste Management Company), MFRDC (Michel Ferry R&D Consulting) and CFE AB (Computer-aided Fluid Engineering AB). It builds upon earlier development of groundwater models carried out by CFE during the last twenty years. In the earlier work the CFD code PHOENICS (Spalding 1981) was used as the equation solver; DarcyTools is based on a solver called MIGAL (Ferry 2002). It has, however, been carefully evaluated that the two solvers produce very similar solutions, and the earlier work is thus still valid as background for DarcyTools. The present report focuses on the software that constitutes DarcyTools. Two accompanying reports cover other aspects: Concepts, Methods and Equations (Svensson et al. 2010), hereafter denoted Report 1; and Verification, Validation and Demonstration (Svensson 2010), hereafter denoted Report 2. Two basic approaches to groundwater modelling can be identified: in one, grid cell conductivities are defined (sometimes called the continuum porous-medium (CPM) approach, e.g. Jackson et al. 2000); in the other, the flow through the fracture network is calculated directly (the DFN approach). Both approaches have their merits and drawbacks, which however will not be discussed here (for a discussion, see Sahimi 1995). In DarcyTools the two approaches are combined, meaning that a fracture network is first generated and then represented as grid cell properties. Further background information is given in the two reports mentioned
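
    The combination of the CPM and DFN approaches described above amounts to generating discrete fractures and then folding each fracture's transmissivity into the conductivity of the grid cells it intersects. A heavily simplified sketch of that upscaling step on a 1D column of cells (the geometry, values, and conversion rule are invented for illustration and are not DarcyTools' actual algorithm):

```python
# Toy upscaling: fractures are intervals on a 1D column of grid cells.
# Each fracture adds its transmissivity, spread over the cells it cuts,
# on top of a background matrix conductivity.
CELL = 10.0                       # cell size [m]
NCELLS = 10
background = 1e-9                 # matrix hydraulic conductivity [m/s]

fractures = [                     # (start [m], end [m], transmissivity [m2/s])
    (12.0, 47.0, 1e-6),
    (30.0, 95.0, 5e-7),
]

cond = [background] * NCELLS
for start, end, T in fractures:
    first, last = int(start // CELL), min(int(end // CELL), NCELLS - 1)
    for i in range(first, last + 1):
        # Fraction of the cell actually crossed by this fracture.
        overlap = min(end, (i + 1) * CELL) - max(start, i * CELL)
        cond[i] += T * (overlap / CELL) / CELL   # T -> K over cell thickness
```

    Cells untouched by any fracture keep the matrix conductivity, while intersected cells acquire an enhanced effective conductivity — the "grid cell properties" representation of the network.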

  17. Integrated Transport Planning Framework Involving Combined Utility Regret Approach

    DEFF Research Database (Denmark)

    Wang, Yang; Monzon, Andres; Di Ciommo, Floridea

    2014-01-01

    Sustainable transport planning requires an integrated approach involving strategic planning, impact analysis, and multicriteria evaluation. This study aimed at relaxing the utility-based decision-making assumption by newly embedding anticipated-regret and combined utility-regret decision mechanisms in a framework for integrated transport planning. The framework consisted of a two-round Delphi survey, an integrated land use and transport model for Madrid, and multicriteria analysis. Results show that (a) the regret-based ranking has a similar mean but larger variance than the utility-based ranking does, (b) the least-regret scenario forms a compromise between the desired and the expected scenarios, (c) the least-regret scenario can lead to higher user benefits in the short term and lower user benefits in the long term, (d) the utility-based, the regret-based, and the combined utility- and regret
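
    The anticipated-regret mechanism compares each scenario's payoff with the best payoff attainable under the same future state; the least-regret scenario is then the one minimizing the worst-case regret. A minimal sketch with invented payoffs:

```python
def minimax_regret(payoff):
    """payoff[i][j]: utility of scenario i under future state j.
    Returns the index of the least-regret scenario and the regret matrix."""
    nstates = len(payoff[0])
    best = [max(row[j] for row in payoff) for j in range(nstates)]
    regret = [[best[j] - row[j] for j in range(nstates)] for row in payoff]
    worst = [max(r) for r in regret]
    return worst.index(min(worst)), regret

# Hypothetical user benefits for 3 transport scenarios under 2 futures.
payoff = [[10, 4],
          [7, 7],
          [3, 9]]
choice, regret = minimax_regret(payoff)
```

    Here the middle scenario wins: it is best under neither future, but its worst-case regret is smallest — the "compromise between the desired and the expected scenarios" the abstract describes.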

  18. RELAP5 simulation of surge line break accident using combined and best estimate plus uncertainty approaches

    International Nuclear Information System (INIS)

    Kristof, Marian; Kliment, Tomas; Petruzzi, Alessandro; Lipka, Jozef

    2009-01-01

    Licensing calculations in a majority of countries worldwide still rely on the application of a combined approach: a best estimate computer code without evaluation of the code models' uncertainty, together with conservative assumptions on initial and boundary conditions, on the availability of systems and components, and additional conservative assumptions. However, the best estimate plus uncertainty (BEPU) approach, representing the state of the art in the area of safety analysis, has a clear potential to replace the currently used combined approach. There are several applications of the BEPU approach in the area of licensing calculations, but some questions remain under discussion, namely from the regulatory point of view. In order to find a proper solution to these questions and to support the BEPU approach in becoming a standard approach for licensing calculations, a broad comparison of both approaches for various transients is necessary. Results of one such comparison, on the example of the VVER-440/213 NPP pressurizer surge line break event, are described in this paper. A Kv-scaled simulation based on the PH4-SLB experiment from the PMK-2 integral test facility, applying its volume and power scaling factor, is performed for qualitative assessment of the RELAP5 computer code calculation using the VVER-440/213 plant model. Existing hardware differences are identified and explained. The CIAU method is adopted for performing the uncertainty evaluation. Results using the combined and BEPU approaches are in agreement with the experimental values from the PMK-2 facility. Only a minimal difference between the combined and BEPU approaches has been observed in the evaluation of the safety margins for the peak cladding temperature. Benefits of the CIAU uncertainty method are highlighted.

  19. Multilayer composition coatings for cutting tools: formation and performance properties

    Science.gov (United States)

    Tabakov, Vladimir P.; Vereschaka, Anatoly S.; Vereschaka, Alexey A.

    2018-03-01

    The paper considers the concept of a multi-layer coating architecture in which each layer has a predetermined functionality. The latest generation of multi-layered coatings for cutting tools reflects the dual nature of the coating: it should not only improve the mechanical and physical characteristics of the cutting tool material, but also reduce the thermo-mechanical load on the cutting tool that determines wear intensity. Presented here are the results of the development of combined methods for forming multi-layer coatings with improved properties. A combined coating method using a pulsed laser made it possible to reduce the excessively high compressive residual stress and to increase the micro-hardness of the multilayered coatings. Tests of coated HSS tools showed that additional pulsed laser processing increases tool life by up to 3 times. Filtered cathodic vacuum arc deposition of multilayer coatings based on the TiAlN compound increased the wear resistance of carbide tools twofold compared with the tool life of cutting tools with commercial TiN coatings. The aim of this study was to develop an innovative methodological approach to the deposition of multilayer coatings for cutting tools, with selection of the coating's functional architecture, properties, and parameters based on sound knowledge of coating failure in the machining process.

  20. Computational Tools and Algorithms for Designing Customized Synthetic Genes

    Energy Technology Data Exchange (ETDEWEB)

    Gould, Nathan [Department of Computer Science, The College of New Jersey, Ewing, NJ (United States); Hendy, Oliver [Department of Biology, The College of New Jersey, Ewing, NJ (United States); Papamichail, Dimitris, E-mail: papamicd@tcnj.edu [Department of Computer Science, The College of New Jersey, Ewing, NJ (United States)

    2014-10-06

    Advances in DNA synthesis have enabled the construction of artificial genes, gene circuits, and genomes of bacterial scale. Freedom in de novo design of synthetic constructs provides significant power in studying the impact of mutations in sequence features, and verifying hypotheses on the functional information that is encoded in nucleic and amino acids. To aid this goal, a large number of software tools of variable sophistication have been implemented, enabling the design of synthetic genes for sequence optimization based on rationally defined properties. The first generation of tools dealt predominantly with singular objectives such as codon usage optimization and unique restriction site incorporation. Recent years have seen the emergence of sequence design tools that aim to evolve sequences toward combinations of objectives. The design of optimal protein-coding sequences adhering to multiple objectives is computationally hard, and most tools rely on heuristics to sample the vast sequence design space. In this review, we study some of the algorithmic issues behind gene optimization and the approaches that different tools have adopted to redesign genes and optimize desired coding features. We utilize test cases to demonstrate the efficiency of each approach, as well as identify their strengths and limitations.

  1. Computational Tools and Algorithms for Designing Customized Synthetic Genes

    International Nuclear Information System (INIS)

    Gould, Nathan; Hendy, Oliver; Papamichail, Dimitris

    2014-01-01

    Advances in DNA synthesis have enabled the construction of artificial genes, gene circuits, and genomes of bacterial scale. Freedom in de novo design of synthetic constructs provides significant power in studying the impact of mutations in sequence features, and verifying hypotheses on the functional information that is encoded in nucleic and amino acids. To aid this goal, a large number of software tools of variable sophistication have been implemented, enabling the design of synthetic genes for sequence optimization based on rationally defined properties. The first generation of tools dealt predominantly with singular objectives such as codon usage optimization and unique restriction site incorporation. Recent years have seen the emergence of sequence design tools that aim to evolve sequences toward combinations of objectives. The design of optimal protein-coding sequences adhering to multiple objectives is computationally hard, and most tools rely on heuristics to sample the vast sequence design space. In this review, we study some of the algorithmic issues behind gene optimization and the approaches that different tools have adopted to redesign genes and optimize desired coding features. We utilize test cases to demonstrate the efficiency of each approach, as well as identify their strengths and limitations.
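
    Codon usage optimization, the singular first-generation objective mentioned in the review, can be as simple as substituting each amino acid's most frequent synonymous codon in the host organism. A minimal sketch with a tiny invented usage table (real tools use full organism-specific tables and balance many more objectives, such as restriction sites and secondary structure):

```python
# Tiny, invented codon-usage table: amino acid -> [(codon, host frequency)].
USAGE = {
    "M": [("ATG", 1.00)],
    "K": [("AAA", 0.74), ("AAG", 0.26)],
    "F": [("TTT", 0.57), ("TTC", 0.43)],
    "*": [("TAA", 0.61), ("TGA", 0.30), ("TAG", 0.09)],
}
# Precompute each amino acid's most frequent codon.
PREFERRED = {aa: max(codons, key=lambda c: c[1])[0]
             for aa, codons in USAGE.items()}

def optimize(protein):
    """One-pass 'most frequent codon' redesign of a protein sequence."""
    return "".join(PREFERRED[aa] for aa in protein)

dna = optimize("MKF*")  # hypothetical 3-residue peptide plus stop
```

    Evolving sequences toward combinations of objectives, as the later tools do, turns this one-pass substitution into a constrained search over the huge space of synonymous encodings.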

  2. Process planning optimization on turning machine tool using a hybrid genetic algorithm with local search approach

    Directory of Open Access Journals (Sweden)

    Yuliang Su

    2015-04-01

Full Text Available A turning machine tool is a new type of machine tool equipped with more than one spindle and turret. The distinctive simultaneous and parallel processing abilities of a turning machine tool increase the complexity of process planning. The operations must not only be sequenced to satisfy precedence constraints, but also scheduled against multiple objectives such as minimizing machining cost and maximizing utilization of the turning machine tool. To solve this problem, a hybrid genetic algorithm was proposed to generate optimal process plans based on a mixed 0-1 integer programming model. An operation precedence graph is used to represent precedence constraints and to help generate a feasible initial population for the hybrid genetic algorithm. An encoding strategy based on data structures was developed to represent process plans digitally and form the solution space. In addition, a local search approach for optimizing the assignment of available turrets is added to integrate scheduling with process planning. A real-world case is used to prove that the proposed approach avoids infeasible solutions and effectively generates a globally optimal process plan.
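One building block the abstract mentions, generating a feasible initial population from an operation precedence graph, can be sketched as follows; the operation names and precedence relation are hypothetical:

```python
import random

# Sketch: draw random topological orders of an operation precedence graph so
# every individual in the GA's initial population satisfies the constraints.
# Keys are operations; values are the operations that must come first (toy data).

PRECEDENCE = {"rough_turn": [], "finish_turn": ["rough_turn"],
              "drill": ["rough_turn"], "thread": ["drill"]}

def random_feasible_sequence(prec, rng):
    remaining, order = dict(prec), []
    while remaining:
        # operations whose predecessors have all been scheduled are "ready"
        ready = [op for op, before in remaining.items()
                 if all(b not in remaining for b in before)]
        op = rng.choice(ready)
        order.append(op)
        del remaining[op]
    return order

rng = random.Random(0)
population = [random_feasible_sequence(PRECEDENCE, rng) for _ in range(10)]
# every individual respects the precedence constraints by construction
assert all(s.index("rough_turn") < s.index("finish_turn") for s in population)
```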

  3. Mediated Authentic Video: A Flexible Tool Supporting a Developmental Approach to Teacher Education

    Science.gov (United States)

    Stutchbury, Kris; Woodward, Clare

    2017-01-01

    YouTube now has more searches than Google, indicating that video is a motivating and, potentially, powerful learning tool. This paper investigates how we can embrace video to support improvements in teacher education. It will draw on innovative approaches to teacher education, developed by the Open University UK, in order to explore in more depth…

  4. A Web-based Tool Combining Different Type Analyses

    DEFF Research Database (Denmark)

    Henriksen, Kim Steen; Gallagher, John Patrick

    2006-01-01

    of both, and they can be goal-dependent or goal-independent. We describe a prototype tool that can be accessed from a web browser, allowing various type analyses to be run. The first goal of the tool is to allow the analysis results to be examined conveniently by clicking on points in the original program...... the minimal "domain model" of the program with respect to the corresponding pre-interpretation, which can give more precise information than the original descriptive type....

  5. An eHealth Diary and Symptom-Tracking Tool Combined With Person-Centered Care for Improving Self-Efficacy After a Diagnosis of Acute Coronary Syndrome: A Substudy of a Randomized Controlled Trial.

    Science.gov (United States)

    Wolf, Axel; Fors, Andreas; Ulin, Kerstin; Thorn, Jörgen; Swedberg, Karl; Ekman, Inger

    2016-02-23

    Patients with cardiovascular diseases managed by a person-centered care (PCC) approach have been observed to have better treatment outcomes and satisfaction than with traditional care. eHealth may facilitate the often slow transition to more person-centered health care by increasing patients' beliefs in their own capacities (self-efficacy) to manage their care trajectory. eHealth is being increasingly used, but most studies continue to focus on health care professionals' logic of care. Knowledge is lacking regarding the effects of an eHealth tool on self-efficacy when combined with PCC for patients with chronic heart diseases. The objective of our study was to investigate the effect of an eHealth diary and symptom-tracking tool in combination with PCC for patients with acute coronary syndrome (ACS). This was a substudy of a randomized controlled trial investigating the effects of PCC in patients hospitalized with ACS. In total, 199 patients with ACS aged eHealth tool, or both, for at least 2 months after hospital discharge. The primary end point was a composite score of changes in general self-efficacy, return to work or prior activity level, and rehospitalization or death 6 months after discharge. Of the 94 patients in the intervention arm, 37 (39%) used the eHealth tool at least once after the index hospitalization. Most of these (24/37, 65%) used the mobile app and not the Web-based app as the primary source of daily self-rating input. Patients used the eHealth tool a mean of 38 times during the first 8 weeks (range 1-118, SD 33) and 64 times over a 6-month period (range 1-597, SD 104). Patients who used the eHealth tool in combination with the PCC intervention had a 4-fold improvement in the primary end point compared with the control group (odds ratio 4.0, 95% CI 1.5-10.5; P=.005). This improvement was driven by a significant increase in general self-efficacy compared with the control group (P=.011). 
Patients in the PCC group who did not use the eHealth tool

  6. Conformal polishing approach: Tool footprint analysis

    Directory of Open Access Journals (Sweden)

    José A Dieste

    2016-02-01

Full Text Available Polishing is one of the most critical manufacturing processes in metal part production because it determines the final quality of the product. Free-form surface polishing is a handmade process with many rejected parts, scrap generation, and high time and energy consumption. Two research lines are being developed: prediction models of the final surface quality parameters, and an analysis of the amount of material removed as a function of the polishing parameters to predict the tool footprint during the polishing task. This research lays the foundations for a future automatic conformal polishing system. It is based on a rotational and translational tool with dry abrasive on its front face, mounted at the end of a robot. A tool-to-part concept is used, useful for large or heavy workpieces. Results are applied to different curved parts typically used in the tooling, aeronautics or automotive industries. A mathematical model has been developed to predict the amount of material removed as a function of the polishing parameters. The model has been fitted for different abrasives and raw materials. Results have shown deviations under 20%, which implies a reliable and controllable process. Smaller amounts of material can be removed in controlled areas of a three-dimensional workpiece.

  7. An approach to combining heuristic and qualitative reasoning in an expert system

    Science.gov (United States)

    Jiang, Wei-Si; Han, Chia Yung; Tsai, Lian Cheng; Wee, William G.

    1988-01-01

An approach to combining the heuristic reasoning from shallow knowledge with the qualitative reasoning from deep knowledge is described. The shallow knowledge is represented in production rules and is under the direct control of the inference engine. The deep knowledge is represented in frames, which may be stored in a relational database management system. This approach takes advantage of both reasoning schemes, resulting in improved efficiency as well as expanded problem-solving ability.
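The division of labor the abstract describes, production rules under an inference engine consulting frame-based deep knowledge, can be roughly sketched as below; the domain, frame slots, and rules are all invented for illustration:

```python
# Sketch of combining shallow rule-based and deep frame-based knowledge.
# The pump-diagnosis domain, frames, and rules are hypothetical examples.

FRAMES = {  # deep knowledge: component frames, as if pulled from a relational DBMS
    "pump": {"is_a": "device", "max_temp_c": 80, "driven_by": "motor"},
    "motor": {"is_a": "device", "max_temp_c": 105},
}

RULES = [  # shallow heuristic knowledge as production rules: (condition, conclusion)
    (lambda obs: obs["temp_c"] > FRAMES[obs["part"]]["max_temp_c"], "overheating"),
    (lambda obs: obs["vibration"] > 5.0, "imbalance"),
]

def diagnose(obs):
    """Inference engine: fire every rule whose condition holds.

    Rule conditions reach into the frames for component-specific limits,
    so heuristic and deep knowledge cooperate in one inference pass."""
    return [concl for cond, concl in RULES if cond(obs)]

print(diagnose({"part": "pump", "temp_c": 95, "vibration": 2.0}))  # -> ['overheating']
```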

  8. A survey of approaches combining safety and security for industrial control systems

    International Nuclear Information System (INIS)

    Kriaa, Siwar; Pietre-Cambacedes, Ludovic; Bouissou, Marc; Halgand, Yoran

    2015-01-01

    The migration towards digital control systems creates new security threats that can endanger the safety of industrial infrastructures. Addressing the convergence of safety and security concerns in this context, we provide a comprehensive survey of existing approaches to industrial facility design and risk assessment that consider both safety and security. We also provide a comparative analysis of the different approaches identified in the literature. - Highlights: • We raise awareness of safety and security convergence in numerical control systems. • We highlight safety and security interdependencies for modern industrial systems. • We give a survey of approaches combining safety and security engineering. • We discuss the potential of the approaches to model safety and security interactions

  9. Interactive and Approachable Web-Based Tools for Exploring Global Geophysical Data Records

    Science.gov (United States)

    Croteau, M. J.; Nerem, R. S.; Merrifield, M. A.; Thompson, P. R.; Loomis, B. D.; Wiese, D. N.; Zlotnicki, V.; Larson, J.; Talpe, M.; Hardy, R. A.

    2017-12-01

Making global and regional data accessible and understandable for non-experts can be both challenging and hazardous. While data products are often developed with end users in mind, the ease of use of these data can vary greatly. Scientists must take care to provide detailed guides for how to use data products to ensure users are not incorrectly applying data to their problem. For example, terrestrial water storage data from the Gravity Recovery and Climate Experiment (GRACE) satellite mission is notoriously difficult for non-experts to access and correctly use. However, allowing these data to be easily accessible to scientists outside the GRACE community is desirable because this would allow the data to see much more widespread use. We have developed a web-based interactive mapping and plotting tool that provides easy access to geophysical data. This work presents an intuitive method for making such data widely accessible to experts and non-experts alike, making the data approachable and ensuring its proper use. This tool has proven helpful to experts by providing fast and detailed access to the data. Simultaneously, the tool allows non-experts to gain familiarity with the information contained in the data and access to that information for both scientific studies and public use. In this presentation, we discuss the development of this tool and its application to both GRACE and ocean altimetry satellite missions, and demonstrate the capabilities of the tool. Focusing on the data visualization aspects of the tool, we showcase our integrations of the Mapbox API and the D3.js data-driven web document framework. We then explore the potential of these tools in other web-based visualization projects, and how incorporation of such tools into science can improve the presentation of research results. We demonstrate how the development of an interactive and exploratory resource can enable further layers of exploratory and scientific discovery.

  10. Constraint satisfaction adaptive neural network and heuristics combined approaches for generalized job-shop scheduling.

    Science.gov (United States)

    Yang, S; Wang, D

    2000-01-01

This paper presents a constraint satisfaction adaptive neural network, together with several heuristics, to solve the generalized job-shop scheduling problem, an NP-complete constraint satisfaction problem. The proposed neural network can be easily constructed and can adaptively adjust its connection weights and unit biases based on the sequence and resource constraints of the job-shop scheduling problem during processing. Several heuristics that can be combined with the neural network are also presented. In the combined approaches, the neural network is used to obtain feasible solutions, while the heuristic algorithms are used to improve the performance of the neural network and the quality of the obtained solutions. Simulations have shown that the proposed neural network and its combined approaches are efficient with respect to solution quality and solving speed.
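A dispatching heuristic of the kind that can be combined with such a network can be sketched as follows; this greedy list scheduler and its two-job instance are illustrative, not the paper's algorithm:

```python
# Sketch of a simple job-shop dispatching heuristic: schedule each job's
# operations in order, placing every operation at the earliest time allowed
# by both its machine and its job predecessor (toy 2-job instance).

JOBS = {  # job -> ordered list of (machine, duration)
    "J1": [("M1", 3), ("M2", 2)],
    "J2": [("M2", 2), ("M1", 4)],
}

def greedy_schedule(jobs):
    machine_free = {}          # machine -> time it becomes free
    schedule = {}              # (job, op_index) -> (start, end)
    for job, ops in jobs.items():
        t = 0                  # job-sequence constraint: ops run in order
        for i, (machine, dur) in enumerate(ops):
            start = max(t, machine_free.get(machine, 0))
            schedule[(job, i)] = (start, start + dur)
            machine_free[machine] = start + dur
            t = start + dur
    return schedule

sched = greedy_schedule(JOBS)
makespan = max(end for _, end in sched.values())
print(makespan)  # -> 11
```

A feasible schedule like this is the kind of starting point a metaheuristic then improves.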

  11. Why do fearful facial expressions elicit behavioral approach? Evidence from a combined approach-avoidance implicit association test.

    Science.gov (United States)

    Hammer, Jennifer L; Marsh, Abigail A

    2015-04-01

    Despite communicating a "negative" emotion, fearful facial expressions predominantly elicit behavioral approach from perceivers. It has been hypothesized that this seemingly paradoxical effect may occur due to fearful expressions' resemblance to vulnerable, infantile faces. However, this hypothesis has not yet been tested. We used a combined approach-avoidance/implicit association test (IAT) to test this hypothesis. Participants completed an approach-avoidance lever task during which they responded to fearful and angry facial expressions as well as neutral infant and adult faces presented in an IAT format. Results demonstrated an implicit association between fearful facial expressions and infant faces and showed that both fearful expressions and infant faces primarily elicit behavioral approach. The dominance of approach responses to both fearful expressions and infant faces decreased as a function of psychopathic personality traits. Results suggest that the prosocial responses to fearful expressions observed in most individuals may stem from their associations with infantile faces. (PsycINFO Database Record (c) 2015 APA, all rights reserved).

  12. Building flexible, distributed collaboration tools using type-based publish/subscribe - The Distributed Knight case

    DEFF Research Database (Denmark)

    Hansen, Klaus Marius; Damm, Christian Heide

    2004-01-01

Distributed collaboration is becoming increasingly important also in software development. Combined with an increasing interest in experimental and agile approaches to software development, this poses challenges to tool support for software development. Specifically, tool support is needed...... for flexible, distributed collaboration. We introduce the Distributed Knight tool that provides flexible and lightweight support for distributed collaboration in object-oriented modelling. The Distributed Knight implementation builds crucially on the type-based publish/subscribe distributed communication...... paradigm, which provides an effective and natural abstraction for developing distributed collaboration tools....

  13. Estimating Hantavirus Risk in Southern Argentina: A GIS-Based Approach Combining Human Cases and Host Distribution

    Directory of Open Access Journals (Sweden)

    Veronica Andreo

    2014-01-01

Full Text Available We use a Species Distribution Modeling (SDM) approach along with Geographic Information Systems (GIS) techniques to examine the potential distribution of hantavirus pulmonary syndrome (HPS) caused by Andes virus (ANDV) in southern Argentina and, more precisely, define and estimate the area with the highest infection probability for humans, through the combination with the distribution map for the competent rodent host (Oligoryzomys longicaudatus). Sites with confirmed cases of HPS in the period 1995–2009 were mostly concentrated in a narrow strip (~90 km × 900 km) along the Andes range from northern Neuquén to central Chubut province. This area is characterized by high mean annual precipitation (~1,000 mm on average), but dry summers (less than 100 mm), very low percentages of bare soil (~10% on average) and low temperatures in the coldest month (minimum average temperature −1.5 °C), as compared to the HPS-free areas, features that coincide with sub-Antarctic forests and shrublands (especially those dominated by the invasive plant Rosa rubiginosa), where rodent host abundances and ANDV prevalences are known to be the highest. Through the combination of predictive distribution maps of the reservoir host and disease cases, we found that the area with the highest probability for HPS to occur overlaps only 28% with the most suitable habitat for O. longicaudatus. With this approach, we made a step forward in the understanding of the risk factors that need to be considered in the forecasting and mapping of risk at the regional/national scale. We propose the implementation and use of thematic maps, such as the one built here, as a basic tool allowing public health authorities to focus surveillance efforts and normally scarce resources for prevention and control actions in vast areas like southern Argentina.

  14. Estimating Hantavirus Risk in Southern Argentina: A GIS-Based Approach Combining Human Cases and Host Distribution

    Science.gov (United States)

    Andreo, Veronica; Neteler, Markus; Rocchini, Duccio; Provensal, Cecilia; Levis, Silvana; Porcasi, Ximena; Rizzoli, Annapaola; Lanfri, Mario; Scavuzzo, Marcelo; Pini, Noemi; Enria, Delia; Polop, Jaime

    2014-01-01

    We use a Species Distribution Modeling (SDM) approach along with Geographic Information Systems (GIS) techniques to examine the potential distribution of hantavirus pulmonary syndrome (HPS) caused by Andes virus (ANDV) in southern Argentina and, more precisely, define and estimate the area with the highest infection probability for humans, through the combination with the distribution map for the competent rodent host (Oligoryzomys longicaudatus). Sites with confirmed cases of HPS in the period 1995–2009 were mostly concentrated in a narrow strip (~90 km × 900 km) along the Andes range from northern Neuquén to central Chubut province. This area is characterized by high mean annual precipitation (~1,000 mm on average), but dry summers (less than 100 mm), very low percentages of bare soil (~10% on average) and low temperatures in the coldest month (minimum average temperature −1.5 °C), as compared to the HPS-free areas, features that coincide with sub-Antarctic forests and shrublands (especially those dominated by the invasive plant Rosa rubiginosa), where rodent host abundances and ANDV prevalences are known to be the highest. Through the combination of predictive distribution maps of the reservoir host and disease cases, we found that the area with the highest probability for HPS to occur overlaps only 28% with the most suitable habitat for O. longicaudatus. With this approach, we made a step forward in the understanding of the risk factors that need to be considered in the forecasting and mapping of risk at the regional/national scale. We propose the implementation and use of thematic maps, such as the one built here, as a basic tool allowing public health authorities to focus surveillance efforts and normally scarce resources for prevention and control actions in vast areas like southern Argentina. PMID:24424500
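The map-combination step, overlapping a thresholded risk surface with the host-suitability surface, can be sketched as below; the grids and thresholds are toy values, not the study's data:

```python
import numpy as np

# Sketch of combining two probability surfaces: threshold a disease-risk map
# and a host-suitability map, then measure how much of the high-risk area
# falls inside suitable habitat. The 4x4 grids are hypothetical values.

risk = np.array([[0.9, 0.8, 0.2, 0.1],
                 [0.7, 0.6, 0.3, 0.1],
                 [0.2, 0.3, 0.1, 0.0],
                 [0.1, 0.1, 0.0, 0.0]])
host = np.array([[0.9, 0.1, 0.1, 0.1],
                 [0.8, 0.2, 0.1, 0.1],
                 [0.7, 0.6, 0.1, 0.1],
                 [0.6, 0.5, 0.1, 0.1]])

high_risk = risk >= 0.5          # cells with high disease probability
suitable = host >= 0.5           # most suitable habitat for the host

overlap = (high_risk & suitable).sum() / high_risk.sum()
print(f"{overlap:.0%} of the high-risk area overlaps suitable habitat")
```

Real SDM outputs are continuous rasters in a projected coordinate system, but the cell-wise boolean logic is the same.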

  15. Technology Combination Analysis Tool (TCAT) for Active Debris Removal

    Science.gov (United States)

    Chamot, B.; Richard, M.; Salmon, T.; Pisseloup, A.; Cougnet, C.; Axthelm, R.; Saunder, C.; Dupont, C.; Lequette, L.

    2013-08-01

This paper presents the work of the Swiss Space Center EPFL within the CNES-funded OTV-2 study. In order to find the most performant Active Debris Removal (ADR) mission architectures and technologies, a tool was developed to design and compare ADR spacecraft and to plan ADR campaigns to remove large debris. Two types of architectures are considered efficient: the Chaser (single-debris spacecraft) and the Mothership/Kits (multiple-debris spacecraft). Both are able to perform controlled re-entry. The tool includes modules to optimise launch dates and the order of capture, to design missions and spacecraft, and to select launch vehicles. The propulsion, power and structure subsystems are sized by the tool using high-level parametric models, while the other subsystems are defined by their mass and power consumption. Final results are still under investigation by the consortium, but two concrete examples of the tool's outputs are presented in the paper.

  16. GMOseek: a user friendly tool for optimized GMO testing.

    Science.gov (United States)

    Morisset, Dany; Novak, Petra Kralj; Zupanič, Darko; Gruden, Kristina; Lavrač, Nada; Žel, Jana

    2014-08-01

With the increasing pace of new Genetically Modified Organisms (GMOs) authorized or in the pipeline for commercialization worldwide, the task of laboratories charged with testing the compliance of food, feed or seed samples with the relevant regulations has become difficult and costly. Many of them have already adopted the so-called "matrix approach" to rationalize resources and increase their efficiency within a limited budget. Most of the time, the "matrix approach" is implemented using limited information and proprietary (if any) computational tools to make efficient use of the available data. The developed GMOseek software is designed to support decision making in all phases of routine GMO laboratory testing, including the interpretation of wet-lab results. The tool makes use of a tabulated matrix of GM events and their genetic elements, of the laboratory analysis history, and of the available information about the sample at hand. The tool uses an optimization approach to suggest the screening assays best suited to the given sample. The practical GMOseek user interface allows the user to customize the search for a cost-efficient combination of screening assays to be employed on a given sample. It further guides the user in selecting appropriate analyses to determine the presence of individual GM events in the analyzed sample, and it helps in taking a final decision regarding the GMO composition of the sample. GMOseek can also be used to evaluate new, previously unused GMO screening targets and to estimate the profitability of developing new GMO screening methods. The presented freely available software tool offers GMO testing laboratories the possibility to select combinations of assays (e.g. quantitative real-time PCR tests) needed for their task, by allowing the expert to express his/her preferences in terms of multiplexing and cost.
The utility of GMOseek is exemplified by analyzing selected food, feed and seed samples from a national reference
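The core optimization resembles weighted set cover: choose a cheap set of screening assays whose combined detection profile covers all GM events of concern. A minimal greedy sketch follows; the assay/event matrix and the costs are hypothetical, not GMOseek's actual data:

```python
# Greedy weighted set cover as a sketch of assay selection in the
# "matrix approach". The detection matrix and costs below are invented.

ASSAYS = {  # assay -> (cost, GM events it detects)
    "p35S":  (1.0, {"MON810", "Bt11", "GT73"}),
    "tNOS":  (1.0, {"MON810", "MON89788"}),
    "pat":   (1.5, {"Bt11", "T25"}),
    "epsps": (2.0, {"GT73", "MON89788"}),
}

def select_assays(assays, targets):
    """Repeatedly pick the assay with the lowest cost per newly covered event."""
    chosen, uncovered = [], set(targets)
    while uncovered:
        name, (cost, events) = min(
            ((n, a) for n, a in assays.items() if a[1] & uncovered),
            key=lambda item: item[1][0] / len(item[1][1] & uncovered))
        chosen.append(name)
        uncovered -= events
    return chosen

print(select_assays(ASSAYS, {"MON810", "Bt11", "GT73", "MON89788", "T25"}))
# -> ['p35S', 'tNOS', 'pat']
```

A real tool would also weigh multiplexing options and user preferences, as the abstract notes.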

  17. Customer Experience Marketing : Concepts and Tools

    OpenAIRE

    Kalaoja, Petteri

    2015-01-01

    This work studies what Customer Experience Management (CEM) is and how it can be implemented in modern marketing with an emphasis on B2C. This work takes a look into the concepts which are needed to achieve a versatile CEM approach. The tools and technologies that are needed to operate the CEM concept are also evaluated. This thesis explains that a Customer Experience Management strategy consists of a certain combination of concepts. These concepts usually include customer data, data-driv...

  18. Designing a Tool for History Textbook Analysis

    Directory of Open Access Journals (Sweden)

    Katalin Eszter Morgan

    2012-11-01

Full Text Available This article describes the process by which a five-dimensional tool for history textbook analysis was conceptualized and developed in three stages. The first stage consisted of a grounded theory approach to code the content of the sampled chapters of the books inductively. After that, the findings from this coding process were combined with principles of text analysis as derived from the literature, specifically focusing on the notion of semiotic mediation as theorized by Lev VYGOTSKY. We explain how we then entered the third stage of the development of the tool, comprising five dimensions. Towards the end of the article we show how the tool could be adapted to serve other disciplines as well. The argument we forward in the article is for systematic and well-theorized tools with which to investigate textbooks as semiotic mediators in education. By implication, textbook authors can also use these as guidelines. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs130170

  19. Dry metal forming of high alloy steel using laser generated aluminum bronze tools

    Directory of Open Access Journals (Sweden)

    Freiße Hannes

    2015-01-01

Full Text Available With regard to optimizing forming technology in economic and environmental terms, avoiding lubricants is one approach to realizing the vision of a new green technology. The resulting direct contact between tool and sheet in non-lubricated deep drawing causes higher stress and depends mainly on the material combination. The tribological system in dry sliding has to be assessed in terms of both the resulting friction coefficient and the wear of the tool and sheet material. The investigations showed the potential to generate tailored tribological systems for dry metal forming by using different material combinations and applying different laser cladding process parameters. Furthermore, the feasibility of additively manufacturing a deep-drawing tool was demonstrated. The tool was successfully applied to form circular cups in a dry metal forming process.

  20. Design of the tool for periodic not evolvent profiles

    Directory of Open Access Journals (Sweden)

    Anisimov Roman

    2017-01-01

Full Text Available A new approach to profiling tools for machining parts with periodic non-involute profiles is considered in the article. A discriminatory analysis of periodic profiles is offered, covering repetition of the profile both in the plane perpendicular to the part's axis and in the plane passing along the part's axis. The proposed profiling method is based on the idea of shaping the tool surface from the rated surface of the product. A major advantage of the proposed approach is its combination with the analysis of the machining process parameters, which allows the accuracy and surface quality of a product with a non-involute periodic profile to be predicted. Using the proposed approach, a pinion cutter for machining wheels with internal triangular teeth and a mill for machining the screw of a liquid flow meter, whose complex profile consists of several forming surfaces, have been obtained.

  1. A Preliminary Report on Combined Penoscrotal and Perineal Approach for Placement of Penile Prosthesis with Corporal Fibrosis

    Directory of Open Access Journals (Sweden)

    John P. Brusky

    2008-01-01

Full Text Available Purpose. This paper describes the combined penoscrotal and perineal approach for placement of a penile prosthesis in cases of severe corporal fibrosis and scarring. Materials and methods. Three patients with extensive corporal fibrosis underwent penile prosthesis placement via a combined penoscrotal and perineal approach from 1997 to 2006. Follow-up ranged from 15 to 129 months. Results. All patients underwent successful implantation of a semirigid penile prosthesis. There were no short- or long-term complications. Conclusions. Results of the combined penoscrotal and perineal approach to penile prosthetic surgery in this preliminary series suggest that it is a safe technique that increases the chance of a successful outcome in the surgical management of severe corporal fibrosis.

  2. PPI finder: a mining tool for human protein-protein interactions.

    Directory of Open Access Journals (Sweden)

    Min He

Full Text Available BACKGROUND: The exponential increase in published biomedical literature prompts the use of text mining tools to manage the information overload automatically. One of the most common applications is to mine protein-protein interactions (PPIs) from PubMed abstracts. Currently, most tools for mining PPIs from the literature use co-occurrence-based or rule-based approaches. Hybrid methods (frame-based approaches) combining these two may have better performance in predicting PPIs. However, the PPIs predicted by these methods are rarely evaluated against known PPI databases and co-occurring terms in the Gene Ontology (GO) database. METHODOLOGY/PRINCIPAL FINDINGS: We here developed a web-based tool, PPI Finder, to mine human PPIs from PubMed abstracts based on their co-occurrences and interaction words, followed by evidence in human PPI databases and shared terms in the GO database. Only 28% of the co-occurring pairs in PubMed abstracts appeared in any of the commonly used human PPI databases (HPRD, BioGRID and BIND). On the other hand, of the known PPIs in HPRD, 69% showed co-occurrences in the literature, and 65% shared GO terms. CONCLUSIONS: PPI Finder provides a useful tool for biologists to uncover potential novel PPIs. It is freely accessible at http://liweilab.genetics.ac.cn/tm/.
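The co-occurrence step of such mining can be sketched as below; the abstracts, gene lexicon, and interaction-word list are toy examples, and real tools use proper named-entity recognition rather than whitespace tokenization:

```python
from itertools import combinations
from collections import Counter

# Sketch of co-occurrence-based PPI mining: count how often two protein names
# appear in the same abstract containing an interaction word. Toy data only.

ABSTRACTS = [
    "TP53 binds MDM2 and regulates apoptosis.",
    "MDM2 inhibits TP53 activity in tumors.",
    "BRCA1 was sequenced in this cohort.",
]
PROTEINS = {"TP53", "MDM2", "BRCA1"}
INTERACTION_WORDS = {"binds", "inhibits", "activates", "interacts"}

def cooccurring_pairs(abstracts, proteins, interaction_words):
    pairs = Counter()
    for text in abstracts:
        tokens = {t.strip(".,") for t in text.split()}
        if tokens & interaction_words:            # require an interaction verb
            found = sorted(tokens & proteins)
            for a, b in combinations(found, 2):
                pairs[(a, b)] += 1
    return pairs

print(cooccurring_pairs(ABSTRACTS, PROTEINS, INTERACTION_WORDS))
# -> Counter({('MDM2', 'TP53'): 2})
```

Candidate pairs found this way would then be checked against PPI databases and shared GO terms, as the abstract describes.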

  3. Computational Tools and Algorithms for Designing Customized Synthetic Genes

    Directory of Open Access Journals (Sweden)

    Nathan eGould

    2014-10-01

    Full Text Available Advances in DNA synthesis have enabled the construction of artificial genes, gene circuits, and genomes of bacterial scale. Freedom in de-novo design of synthetic constructs provides significant power in studying the impact of mutations in sequence features, and verifying hypotheses on the functional information that is encoded in nucleic and amino acids. To aid this goal, a large number of software tools of variable sophistication have been implemented, enabling the design of synthetic genes for sequence optimization based on rationally defined properties. The first generation of tools dealt predominantly with singular objectives such as codon usage optimization and unique restriction site incorporation. Recent years have seen the emergence of sequence design tools that aim to evolve sequences toward combinations of objectives. The design of optimal protein coding sequences adhering to multiple objectives is computationally hard, and most tools rely on heuristics to sample the vast sequence design space. In this review we study some of the algorithmic issues behind gene optimization and the approaches that different tools have adopted to redesign genes and optimize desired coding features. We utilize test cases to demonstrate the efficiency of each approach, as well as identify their strengths and limitations.

  4. RSP Tooling Technology

    Energy Technology Data Exchange (ETDEWEB)

    None

    2001-11-20

RSP Tooling™ is a spray forming technology tailored for producing molds and dies. The approach combines rapid solidification processing and net-shape materials processing in a single step. The general concept involves converting a mold design described by a CAD file to a tooling master using a suitable rapid prototyping (RP) technology such as stereolithography. A pattern transfer is made to a castable ceramic, typically alumina or fused silica (Figure 1). This is followed by spray forming a thick deposit of a tooling alloy on the pattern to capture the desired shape, surface texture, and detail. The resultant metal block is cooled to room temperature and separated from the pattern. The deposit's exterior walls are machined square, allowing it to be used as an insert in a standard mold base. The overall turnaround time for tooling is about 3 to 5 days, starting with a master. Molds and dies produced in this way have been used in high volume production runs in plastic injection molding and die casting. A Cooperative Research and Development Agreement (CRADA) between the Idaho National Engineering and Environmental Laboratory (INEEL) and Grupo Vitro has been established to evaluate the feasibility of using RSP Tooling technology for producing molds and dies of interest to Vitro. This report summarizes results from Phase I of this agreement, and describes work scope and budget for Phase II activities. The main objective in Phase I was to demonstrate the feasibility of applying the Rapid Solidification Process (RSP) Tooling method to produce molds for the manufacture of glass and other components of interest to Vitro. This objective was successfully achieved.

  5. The DPSIR approach applied to marine eutrophication in LCIA as a learning tool

    DEFF Research Database (Denmark)

    Cosme, Nuno Miguel Dias; Olsen, Stig Irving

    assessment and response design ultimately benefit from spatial differentiation in the results. DPSIR based on LCIA seems a useful tool to improve communication and learning, as it bridges science and management while promoting the basic elements of sustainable development in a practical educational...... eutrophication. The goal is to promote an educational example of environmental impacts assessment through science-based tools to predict the impacts, communicate knowledge and support decisions. The example builds on the (D) high demand for fixation of reactive nitrogen that supports several socio......: environmentally sustainable, technologically feasible, economically viable, socially desirable, legally permissible, and administratively achievable. Specific LCIA indicators may provide preliminary information to support a precautionary approach to act earlier on D-P and contribute to sustainability. Impacts...

  6. First Approach to a Holistic Tool for Assessing RES Investment Feasibility

    Directory of Open Access Journals (Sweden)

    José María Flores-Arias

    2018-04-01

    Full Text Available Combining availability, viability, sustainability, technical options, and environmental impact in an energy-planning project is a difficult job in itself for today's engineers. It becomes harder still if potential investors also need to be persuaded, and harder again if various consumption profiles are considered, as their patterns depend to a large extent on the type of facility and the activity. It is therefore essential to develop tools to assess the balance between generation and demand in a given installation. In this paper, a valuable tool is developed for the seamless calculation of the integration possibilities of renewable energies and the assessment of the derived technical, financial and environmental impacts. Furthermore, it also considers their interaction with the power grid or other networks, raising awareness of the polluting emissions responsible for global warming. Through a series of Structured Query Language databases and dynamic data parameterization, the software is provided with sufficient information to encode, calculate, simulate and graphically display information on the generation and demand of electric, thermal and transport energy, all in a user-friendly environment, finally providing an evaluation and feasibility report.

  7. An Approach for the Development and Implementation of an Assessment Tool for Interprofessional Education Learning Activities

    Directory of Open Access Journals (Sweden)

    Lisa Salvati

    2017-10-01

    Full Text Available The Accreditation Council for Pharmacy Education Standards 2016 state that colleges of pharmacy must assess student achievement and readiness to contribute as a member of an interprofessional collaborative patient care team. There are a limited number of assessment tools available to achieve this part of the Standards. The purpose of this Case Study Report is to describe the process that one college of pharmacy took to develop an interprofessional education (IPE) assessment tool to be used for their longitudinal assessment approach for IPE in the didactic portion of the curriculum. Strategies for the development of an assessment tool are provided through three themes: continuous refinement, collaboration and streamlining. Next steps for the implementation of the assessment tool, as well as evaluating its validity and reliability, are discussed.   Type: Case Study

  8. Efficient Use of Behavioral Tools to Reduce Electricity Demand of Domestic Consumers

    Directory of Open Access Journals (Sweden)

    Elbaz Shimon

    2016-12-01

    Full Text Available Purpose: The present study investigated the main literature on the subject of methods and policies for reducing the electricity demand of domestic consumers, in order to identify the place of behavioral tools. Methodology: We used secondary sources, performing a literature review, together with analysis and synthesis. Findings: Policy makers prefer to use tools offered by neoclassical economics, such as various forms of taxation, fines and financial incentives, in order to make domestic electricity consumers save electricity, on the assumption that consumers will make rational decisions while maximizing their personal benefit. However, studies conducted in recent years in the field of behavioral economics, which are based on the assumption that consumers’ decisions are not rational and are affected by cognitive biases, showed that behavioral tools, such as detailed online information (feedback), social comparison information, information on varying rates (dynamic pricing) and general information (advertising campaigns), are no less appropriate than the tools neoclassical economics offers, mainly because electricity is an invisible product and consumers are unable to assess it by normal cognitive measures. Using an interdisciplinary combination of behavioral tools drawn from a wide variety of approaches and academic fields, it is possible to achieve efficient results in reducing electricity demand. Implications: Although neoclassical economics still remains the fundamental theory used by policymakers, it is recommended to consider behavioral economics as a complementary approach, and to include behavioral tools in the policymakers’ toolbox, especially when those tools do not require a significant financial investment, thus efficiently maximizing the reduction of electricity demand among domestic consumers. These theoretical results will be

  9. Image-Based Airborne Sensors: A Combined Approach for Spectral Signatures Classification through Deterministic Simulated Annealing

    Science.gov (United States)

    Guijarro, María; Pajares, Gonzalo; Herrera, P. Javier

    2009-01-01

    The increasing technology of high-resolution image airborne sensors, including those on board Unmanned Aerial Vehicles, demands automatic solutions for processing, either on-line or off-line, the huge amounts of image data sensed during the flights. The classification of natural spectral signatures in images is one potential application. The current trend in classification is towards the combination of simple classifiers. In this paper we propose a combined strategy based on the Deterministic Simulated Annealing (DSA) framework. The simple classifiers used are the well-tested supervised parametric Bayesian estimator and the Fuzzy Clustering. The DSA is an optimization approach, which minimizes an energy function. The main contribution of DSA is its ability to avoid local minima during the optimization process thanks to the annealing scheme. It outperforms the simple classifiers used for the combination and some combined strategies, including a scheme based on fuzzy cognitive maps and an optimization approach based on the Hopfield neural network paradigm. PMID:22399989
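
The annealing idea behind this record can be sketched numerically. The following toy Python sketch is not the authors' implementation: the averaging of the two classifiers' supports, the 1-D neighbourhood used as a stand-in for the image lattice, and all parameter values are illustrative assumptions. It shows the core mechanism of a deterministic (mean-field) annealing combination: softmax updates at a temperature that is gradually lowered.

```python
import numpy as np

def combine_dsa(support_a, support_b, n_iters=50, t0=1.0, cooling=0.9, smooth=0.5):
    """Toy deterministic-annealing combination of two classifiers' supports.

    support_a, support_b: (n_pixels, n_classes) soft outputs in [0, 1].
    A 1-D chain of pixels stands in for the image lattice (assumption).
    Returns hard class labels after annealed mean-field updates.
    """
    s = 0.5 * (support_a + support_b)           # combined external support
    q = s / s.sum(axis=1, keepdims=True)        # initial soft assignment
    t = t0
    for _ in range(n_iters):
        # neighbour average on the toy 1-D lattice (wrap-around)
        nb = 0.5 * (np.roll(q, 1, axis=0) + np.roll(q, -1, axis=0))
        field = s + smooth * nb                 # data term + smoothness term
        # softmax at temperature t, stabilised against overflow
        q = np.exp((field - field.max(axis=1, keepdims=True)) / t)
        q /= q.sum(axis=1, keepdims=True)
        t *= cooling                            # annealing schedule
    return q.argmax(axis=1)
```

As the temperature drops, the soft assignments sharpen toward a consistent labelling, which is how the annealing scheme helps escape poor local minima.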

  10. Combination approaches with immune checkpoint blockade in cancer therapy

    Directory of Open Access Journals (Sweden)

    Maarten Swart

    2016-11-01

    Full Text Available In healthy individuals, immune checkpoint molecules prevent autoimmune responses and limit immune cell-mediated tissue damage. Tumors frequently exploit these molecules to evade eradication by the immune system. Over the past years, immune checkpoint blockade of cytotoxic T lymphocyte antigen-4 (CTLA-4) and programmed death-1 (PD-1) emerged as promising strategies to activate anti-tumor cytotoxic T cell responses. Although complete regression and long-term survival are achieved in some patients, not all patients respond. This review describes promising, novel combination approaches involving immune checkpoint blockade, aimed at increasing response rates to the single treatments.

  11. FEM-DEM coupling simulations of the tool wear characteristics in prestressed machining superalloy

    Directory of Open Access Journals (Sweden)

    Ruitao Peng

    2016-01-01

    Full Text Available Due to the complicated contact loading at the tool-chip interface, ceramic tool wear in prestressed machining of superalloy is rather difficult to evaluate by experimental approaches alone. This study aims to develop a methodology to predict the tool wear evolution by using combined FEM and DEM numerical simulations. First, a finite element model for prestressed cutting is established; next, a discrete element model describing the tool-chip behaviour is built from the boundary conditions obtained in the FEM simulations; finally, the simulated results are validated experimentally. The predicted tool wear results show good agreement with experiments. The simulation indicates that, within a certain range, a higher cutting speed effectively results in less wear of Sialon ceramic tools, while a greater depth of cut leads to more serious tool wear.

  12. Development of a User Interface for a Regression Analysis Software Tool

    Science.gov (United States)

    Ulbrich, Norbert Manfred; Volden, Thomas R.

    2010-01-01

    An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was developed from the start to run on computers that use the Windows, Macintosh, Linux, or UNIX operating system. Many user interface features were specifically designed such that a novice or inexperienced user can apply the regression analysis tool with confidence. Therefore, the user interface's design minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen, which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface. This choice keeps future software maintenance costs to a reasonable limit. Examples of both the text user interface and graphical user interface are discussed in order to illustrate the user interface's overall design approach.

  13. Combination of Evidence with Different Weighting Factors: A Novel Probabilistic-Based Dissimilarity Measure Approach

    Directory of Open Access Journals (Sweden)

    Mengmeng Ma

    2015-01-01

    Full Text Available To solve the invalidation problem of the Dempster-Shafer theory of evidence (DS) under high conflict in multisensor data fusion, this paper presents a novel combination approach for conflicting evidence with different weighting factors using a new probabilistic dissimilarity measure. Firstly, an improved probabilistic transformation function is proposed to map basic belief assignments (BBAs) to probabilities. Then, a new dissimilarity measure integrating fuzzy nearness and an introduced correlation coefficient is proposed to characterize not only the difference between BBAs but also the divergence degree of the hypotheses that two BBAs support. Finally, the weighting factors used to reassign conflicts on BBAs are developed and Dempster's rule is chosen to combine the discounted sources. Simple numerical examples are employed to demonstrate the merit of the proposed method. Through analysis and comparison of the results, the new combination approach can effectively solve the problem of conflict management with better convergence performance and robustness.
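
For readers unfamiliar with the machinery this record builds on, a minimal Python sketch of the two standard operations it mentions — source discounting and Dempster's rule of combination — follows. This is textbook Dempster-Shafer, not the paper's improved transformation or its new dissimilarity measure; BBAs are represented as dictionaries mapping `frozenset` hypotheses to masses.

```python
def discount(bba, alpha, frame):
    """Classical discounting: scale masses by alpha and move the
    remaining 1 - alpha of mass to the whole frame of discernment."""
    out = {h: alpha * m for h, m in bba.items()}
    out[frame] = out.get(frame, 0.0) + (1.0 - alpha)
    return out

def dempster(m1, m2):
    """Dempster's rule for two BBAs over frozenset hypotheses.

    Products of masses on intersecting hypotheses are accumulated;
    mass assigned to empty intersections is the conflict, which is
    normalised away (assumes conflict < 1).
    """
    combined, conflict = {}, 0.0
    for h1, v1 in m1.items():
        for h2, v2 in m2.items():
            inter = h1 & h2
            if inter:
                combined[inter] = combined.get(inter, 0.0) + v1 * v2
            else:
                conflict += v1 * v2
    k = 1.0 - conflict
    return {h: v / k for h, v in combined.items()}
```

The paper's contribution sits upstream of these operations: its dissimilarity measure determines the weighting factors with which each source is discounted before Dempster's rule is applied.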

  14. Combined time-varying forecast based on the proper scoring approach for wind power generation

    DEFF Research Database (Denmark)

    Chen, Xingying; Jiang, Yu; Yu, Kun

    2017-01-01

    Compared with traditional point forecasts, combined forecasts have been proposed as an effective method to provide more accurate forecasts than any individual model. However, the literature and research focused on wind-power combined forecasts are relatively limited. Here, based on forecasting error...... distribution, a proper scoring approach is applied to combine plausible models to form an overall time-varying model for next-day forecasts, rather than a weights-based combination. To validate the effectiveness of the proposed method, three years of real data were used for testing. Simulation results...... demonstrate that the proposed method improves the accuracy of overall forecasts, even compared with a numerical weather prediction....
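
The time-varying selection idea can be illustrated with a short Python sketch. This is not the paper's method: trailing mean absolute error stands in for the proper scoring rule, and the window length is an arbitrary assumption. It only shows the shape of the approach, i.e. picking, for each day, the model that scored best on recent history instead of fixing combination weights.

```python
import numpy as np

def time_varying_combine(forecasts, actuals, window=7):
    """Day-by-day model selection based on a trailing score.

    forecasts: (n_models, n_days) point forecasts; actuals: (n_days,).
    Trailing MAE is an illustrative stand-in for a proper scoring rule.
    Days with insufficient history fall back to model 0 (assumption).
    """
    n_models, n_days = forecasts.shape
    combined = forecasts[0].copy()
    for d in range(window, n_days):
        scores = np.abs(forecasts[:, d - window:d] - actuals[d - window:d]).mean(axis=1)
        combined[d] = forecasts[scores.argmin(), d]
    return combined
```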

  15. Combined Yamamoto approach for simultaneous estimation of adsorption isotherm and kinetic parameters in ion-exchange chromatography.

    Science.gov (United States)

    Rüdt, Matthias; Gillet, Florian; Heege, Stefanie; Hitzler, Julian; Kalbfuss, Bernd; Guélat, Bertrand

    2015-09-25

    Application of model-based design is appealing to support the development of protein chromatography in the biopharmaceutical industry. However, the required efforts for parameter estimation are frequently perceived as time-consuming and expensive. In order to speed up this work, a new parameter estimation approach for modelling ion-exchange chromatography in linear conditions was developed. It aims at reducing the time and protein demand for the model calibration. The method combines the estimation of kinetic and thermodynamic parameters based on the simultaneous variation of the gradient slope and the residence time in a set of five linear gradient elutions. The parameters are estimated from a Yamamoto plot and a gradient-adjusted Van Deemter plot. The combined approach increases the information extracted per experiment compared to the individual methods. As a proof of concept, the combined approach was successfully applied for a monoclonal antibody on a cation-exchanger and for a Fc-fusion protein on an anion-exchange resin. The individual parameter estimations for the mAb confirmed that the new approach maintained the accuracy of the usual Yamamoto and Van Deemter plots. In the second case, offline size-exclusion chromatography was performed in order to estimate the thermodynamic parameters of an impurity (high molecular weight species) simultaneously with the main product. Finally, the parameters obtained from the combined approach were used in a lumped kinetic model to simulate the chromatography runs. The simulated chromatograms obtained for a wide range of gradient lengths and residence times showed only small deviations compared to the experimental data. Copyright © 2015 Elsevier B.V. All rights reserved.
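
The Yamamoto plot mentioned in the abstract is, at its core, a linear regression. In the standard Yamamoto method, log(GH) is plotted against the log of the salt concentration at peak elution, and the slope of the fit equals B + 1, where B is the characteristic charge. The sketch below assumes that standard formulation (the record itself gives no equations) and simply wraps a least-squares line fit:

```python
import numpy as np

def yamamoto_fit(gh, i_r):
    """Fit the Yamamoto plot: log10(GH) versus log10(I_R).

    gh  : normalised gradient slopes (GH values) of the linear-gradient runs
    i_r : salt concentrations at peak elution for each run
    Returns (B, intercept): in the standard Yamamoto method the slope of
    the fitted line is B + 1, with B the characteristic charge; the
    intercept lumps the equilibrium constant.
    """
    slope, intercept = np.polyfit(np.log10(i_r), np.log10(gh), 1)
    return slope - 1.0, intercept
```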

  16. Combining Narrative and Numerical Simulation

    DEFF Research Database (Denmark)

    Hansen, Mette Sanne; Ladeby, Klaes Rohde; Rasmussen, Lauge Baungaard

    2011-01-01

    for decision makers to systematically test several different outputs of possible solutions in order to prepare for future consequences. The CSA can be a way to evaluate risks and address possible unforeseen problems in a more methodical way than either guessing or forecasting. This paper contributes...... to the decision making in operations and production management by providing new insights into modelling and simulation based on the combined narrative and numerical simulation approach as a tool for strategy making. The research question asks, “How can the CSA be applied in a practical context to support strategy...... making?” The paper uses a case study where interviews and observations were carried out in a Danish corporation. The CSA is a new way to address decision making and has both practical value and further expands the use of strategic simulation as a management tool....

  17. Comet Methy-sens and DNMTs transcriptional analysis as a combined approach in epigenotoxicology

    Directory of Open Access Journals (Sweden)

    Alessio Perotti

    2015-05-01

    In conclusion, our data demonstrate that Comet Methy-sens, in combination with the analysis of transcriptional levels of DNA methyl transferases, represents a simple and multifunctional approach to implement biomonitoring studies on epigenotoxicological effects of known and unknown xenobiotics.

  18. A Combined Structural and Electromechanical FE Approach for Industrial Ultrasonic Devices Design

    Science.gov (United States)

    Schorderet, Alain; Prenleloup, Alain; Colla, Enrico

    2011-05-01

    Ultrasonic assistance is widely used in manufacturing, both for conventional (e.g. grinding, drilling) and non-conventional (e.g. EDM) processes. Ultrasonic machining is also used as a stand-alone process, for instance for micro-drilling. Industrial application of these processes requires increasingly efficient and accurate development tools to predict the performance of the ultrasonic device: the so-called sonotrode and the piezo-transducer. This electromechanical system consists of a structural part and of a piezo-electrical part (actuator). In this paper, we show how to combine two simulation software packages—one for structures and one for electromechanical devices—to perform a complete design analysis and optimization of a sonotrode for ultrasonic drilling applications. The usual design criteria are the eigenfrequencies of the desired vibrational modes. In addition, during the optimization phase, one also needs to consider the maximum achievable displacement for a given applied voltage. Therefore, one must be able to predict the electromechanical behavior of the integrated piezo-structure system, in order to define, adapt and optimize the electric power supply as well as the control strategy (search, tracking of the eigenfrequency). In this procedure, numerical modelling follows a two-step approach, by means of a solid mechanics FE code (ABAQUS) and of an electromechanical simulation software (ATILA). The example presented illustrates the approach and describes the obtained results for the development of an industrial sonotrode system dedicated to ultrasonic micro-drilling of ceramics. The 3D model of the sonotrode serves as input for generating the FE mesh in ABAQUS, and this mesh is then translated into an input file for ATILA. ABAQUS results are used to perform the first optimization step in order to obtain a sonotrode design leading to the requested modal behaviour—eigenfrequency and corresponding dynamic amplification. The second step aims at evaluating the dynamic

  19. Multirule Based Diagnostic Approach for the Fog Predictions Using WRF Modelling Tool

    Directory of Open Access Journals (Sweden)

    Swagata Payra

    2014-01-01

    Full Text Available The prediction of fog onset remains difficult despite the progress in numerical weather prediction. It is a complex process and requires adequate representation of the local perturbations in weather prediction models. It mainly depends upon microphysical and mesoscale processes that act within the boundary layer. This study utilizes a multirule based diagnostic (MRD) approach using postprocessing of the model simulations for fog predictions. The empiricism involved in this approach is mainly to bridge the gap between mesoscale and microscale variables, which are related to the mechanism of fog formation. Fog occurrence is a common phenomenon during the winter season over Delhi, India, with the passage of western disturbances across the northwestern part of the country accompanied by a significant amount of moisture. This study implements the above-cited approach for the prediction of occurrences of fog and its onset time over Delhi. For this purpose, a high resolution weather research and forecasting (WRF) model is used for fog simulations. The study involves depiction of model validation and postprocessing of the model simulations for the MRD approach and its subsequent application to fog predictions. Through this approach, the model identified foggy and nonfoggy days successfully 94% of the time. Further, the onset of fog events is well captured within an accuracy of 30–90 minutes. This study demonstrates that the multirule based postprocessing approach is a useful and highly promising tool in improving fog predictions.
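
The multirule postprocessing idea can be sketched as a conjunction of threshold checks on model output. The rules and thresholds below are illustrative assumptions only — the record does not state the paper's actual rule set — but they show the shape of such a diagnostic: near-saturation humidity, light wind, and a small dew-point depression near the surface.

```python
def fog_likely(rh2m, wind10m, t2m, td2m):
    """Toy multirule fog diagnostic on post-processed NWP output.

    rh2m    : 2-m relative humidity (%)
    wind10m : 10-m wind speed (m/s)
    t2m     : 2-m temperature (deg C)
    td2m    : 2-m dew-point temperature (deg C)
    All thresholds are hypothetical placeholders, not the paper's values.
    """
    rules = [
        rh2m >= 95.0,            # near saturation
        wind10m <= 2.0,          # light wind, weak mixing
        (t2m - td2m) <= 1.0,     # small dew-point depression
    ]
    return all(rules)
```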

  20. Development of a practice tool for community-based nurses: the Heart Failure Palliative Approach to Care (HeFPAC).

    Science.gov (United States)

    Strachan, Patricia H; Joy, Cathy; Costigan, Jeannine; Carter, Nancy

    2014-04-01

    Patients living with advanced heart failure (HF) require a palliative approach to reduce suffering. Nurses have described significant knowledge gaps about the disease-specific palliative care (PC) needs of these patients. An intervention is required to facilitate appropriate end-of-life care for HF patients. The purpose of this study was to develop a user-friendly, evidence-informed HF-specific practice tool for community-based nurses to facilitate care and communication regarding a palliative approach to HF care. Guided by the Knowledge to Action framework, we identified key HF-specific issues related to advanced HF care provision within the context of a palliative approach to care. Informed by current evidence and subsequent iterative consultation with community-based and specialist PC and HF nurses, a pocket guide tool for community-based nurses was created. We developed the Heart Failure Palliative Approach to Care (HeFPAC) pocket guide to promote communication and a palliative approach to care for HF patients. The HeFPAC has potential to improve the quality of care and experiences for patients with advanced HF. It will be piloted in community-based practice and in a continuing education program for nurses. The HeFPAC pocket guide offers PC nurses a concise, evidence-informed and practical point-of-care tool to communicate with other clinicians and patients about key HF issues that are associated with improving disease-specific HF palliative care and the quality of life of patients and their families. Pilot testing will offer insight as to its utility and potential for modification for national and international use.

  1. Systematic review and meta-analysis: tools for the information age.

    Science.gov (United States)

    Weatherall, Mark

    2017-11-01

    The amount of available biomedical information is vast and growing. Clinicians and researchers face two natural limitations in approaching this treasure trove: the information may be difficult to locate, and once located, cognitive biases may lead to its inappropriate use. Systematic reviews and meta-analyses represent important tools in the information age to improve knowledge and action. Systematic reviews represent a census approach to identifying literature to avoid non-response bias. They are a necessary prelude to producing combined quantitative summaries of associations or treatment effects. Meta-analysis comprises the arithmetical techniques for producing combined summaries from individual study reports. Careful, thoughtful and rigorous use of these tools is likely to enhance knowledge and action. Use of standard guidelines, such as the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines, or embedding these activities within collaborative groups such as the Cochrane Collaboration, are likely to lead to more useful systematic review and meta-analysis reporting. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
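
The "arithmetical techniques" of meta-analysis that this record refers to can be illustrated with the simplest of them, the fixed-effect inverse-variance pooling of study estimates (one of several standard models; random-effects pooling adds a between-study variance term not shown here):

```python
import math

def fixed_effect_meta(estimates, std_errors):
    """Inverse-variance fixed-effect meta-analysis.

    Each study is weighted by the inverse of its variance; the pooled
    estimate is the weighted mean and its standard error is
    1 / sqrt(sum of weights).
    """
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se
```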

  2. SUSTAINABLE MANAGEMENT APPROACHES AND REVITALIZATION TOOLS-ELECTRONIC (SMARTE): OVERVIEW AND DEMONSTRATION FOR FINAL PHASE 3 CONFERENCE

    Science.gov (United States)

    The U.S. contingent of the U.S.-German Bilateral Working Group is developing Sustainable Management Approaches and Revitalization Tools-electronic (SMARTe). SMARTe is a web-based, decision support system designed to assist stakeholders in developing and evaluating alternative reu...

  3. Ultimate intra-wafer critical dimension uniformity control by using lithography and etch tool corrections

    Science.gov (United States)

    Kubis, Michael; Wise, Rich; Reijnen, Liesbeth; Viatkina, Katja; Jaenen, Patrick; Luca, Melisa; Mernier, Guillaume; Chahine, Charlotte; Hellin, David; Kam, Benjamin; Sobieski, Daniel; Vertommen, Johan; Mulkens, Jan; Dusa, Mircea; Dixit, Girish; Shamma, Nader; Leray, Philippe

    2016-03-01

    With shrinking design rules, the overall patterning requirements are getting aggressively tighter. For the 7-nm node and below, allowable CD uniformity variations are entering the Angstrom region (ref [1]). Optimizing inter- and intra-field CD uniformity of the final pattern requires a holistic tuning of all process steps. In previous work, CD control with either litho cluster or etch tool corrections has been discussed. Today, we present a holistic CD control approach, combining the correction capability of the etch tool with the correction capability of the exposure tool. The study is done on 10-nm logic node wafers, processed with a test vehicle stack patterning sequence. We include wafer-to-wafer and lot-to-lot variation and apply optical scatterometry to characterize the fingerprints. Making use of all available correction capabilities (lithography and etch), we investigated single application of exposure tool corrections and of etch tool corrections as well as combinations of both to reach the lowest CD uniformity. Results of the final pattern uniformity based on single and combined corrections are shown. We conclude on the application of this holistic lithography and etch optimization to 7nm High-Volume manufacturing, paving the way to ultimate within-wafer CD uniformity control.

  4. Adaptive scallop height tool path generation for robot-based incremental sheet metal forming

    Science.gov (United States)

    Seim, Patrick; Möllensiep, Dennis; Störkle, Denis Daniel; Thyssen, Lars; Kuhlenkötter, Bernd

    2016-10-01

    Incremental sheet metal forming is an emerging process for the production of individualized products or prototypes in low batch sizes and with short times to market. In these processes, the desired shape is produced by the incremental inward motion of the workpiece-independent forming tool in the depth direction and its movement along the contour in the lateral direction. Because the shape is produced this way, tool path generation is a key factor for, e.g., the resulting geometric accuracy, the resulting surface quality, and the working time. This paper presents an innovative tool path generation based on a commercial milling CAM package considering the surface quality and working time. This approach offers the ability to define a specific scallop height as an indicator of the surface quality for specific faces of a component. Moreover, it decreases the required working time for the production of the entire component compared to the use of a commercial software package without this adaptive approach. Different forming experiments have been performed to verify the newly developed tool path generation. Above all, this approach helps resolve the existing trade-off between working time and surface quality in incremental sheet metal forming.
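
The scallop height mentioned here has a simple geometric relationship to the lateral step-over of a spherical (ball-end) tool on a locally flat surface: h = R - sqrt(R² - (s/2)²), which inverts to s = 2·sqrt(h(2R - h)). The sketch below applies that standard formula; it is a geometric illustration, not the paper's adaptive CAM algorithm, and ignores surface curvature.

```python
import math

def stepover_for_scallop(tool_radius, scallop_height):
    """Lateral step-over of a spherical tool tip that leaves a given
    scallop height on a locally flat surface: s = 2 * sqrt(h * (2R - h))."""
    h, r = scallop_height, tool_radius
    return 2.0 * math.sqrt(h * (2.0 * r - h))
```

Specifying a smaller scallop height for selected faces therefore tightens the step-over (more passes, longer working time) only where the surface quality demands it, which is the trade-off the adaptive approach manages.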

  5. Combined transnasal and transoral endoscopic approach to a transsphenoidal encephalocele in an infant.

    Science.gov (United States)

    Tan, Sien Hui; Mun, Kein Seong; Chandran, Patricia Ann; Manuel, Anura Michelle; Prepageran, Narayanan; Waran, Vicknes; Ganesan, Dharmendra

    2015-07-01

    This paper reports an unusual case of a transsphenoidal encephalocele and discusses our experience with a minimally invasive management. To the best of our knowledge, we present the first case of a combined endoscopic transnasal and transoral approach to a transsphenoidal encephalocele in an infant. A 17-day-old boy, who was referred for further assessment of upper airway obstruction, presented with respiratory distress and feeding difficulties. Bronchoscopy and imaging revealed a transsphenoidal encephalocele. At the age of 48 days, he underwent a combined endoscopic transnasal and transoral excision of the nasal component of the encephalocele. This approach, with the aid of neuronavigation, allows good demarcation of the extra-cranial neck of the transsphenoidal encephalocele. We were able to cauterize and carefully dissect the sac prior to excision. The defect of the neck was clearly visualized, and Valsalva manoeuvre was performed to exclude any CSF leak. As the defect was small, it was allowed to heal by secondary intention. The patient's recovery was uneventful, and he tolerated full feeds orally on day 2. Postoperative imaging demonstrated no evidence of recurrence of the nasal encephalocele. Endoscopic follow-up showed good healing of the mucosa and no cerebrospinal fluid leak. The surgical management of transsphenoidal encephalocele in neonates and infants is challenging. We describe a safe technique with low morbidity in managing such a condition. The combined endoscopic transnasal and transoral approach with neuronavigation is a minimally invasive, safe and feasible alternative, even for children below 1 year of age.

  6. MAGIC: A Tool for Combining, Interpolating, and Processing Magnetograms

    Science.gov (United States)

    Allred, Joel

    2012-01-01

    Transients in the solar coronal magnetic field are ultimately the source of space weather. Models which seek to track the evolution of the coronal field require magnetogram images to be used as boundary conditions. These magnetograms are obtained by numerous instruments with different cadences and resolutions. A tool is required which allows modelers to find all available data and use them to craft accurate and physically consistent boundary conditions for their models. We have developed a software tool, MAGIC (MAGnetogram Interpolation and Composition), to perform exactly this function. MAGIC can manage the acquisition of magnetogram data, cast it into a source-independent format, and then perform the necessary spatial and temporal interpolation to provide magnetic field values as requested onto model-defined grids. MAGIC has the ability to patch magnetograms from different sources together, providing a more complete picture of the Sun's field than is possible from single magnetograms. In doing this, care must be taken so as not to introduce nonphysical current densities along the seam between magnetograms. We have designed a method which minimizes these spurious current densities. MAGIC also includes a number of post-processing tools which can provide additional information to models. For example, MAGIC includes an interface to the DAVE4VM tool which derives surface flow velocities from the time evolution of the surface magnetic field. MAGIC has been developed as an application of the KAMELEON data formatting toolkit which has been developed by the CCMC.
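
The temporal interpolation step can be illustrated with a minimal stand-in. MAGIC's actual interpolation is not described in detail in this record; the sketch below assumes simple linear interpolation in time between two co-registered magnetogram arrays, which is the most basic form such a step could take.

```python
import numpy as np

def interpolate_magnetograms(b1, t1, b2, t2, t):
    """Linear temporal interpolation between two co-registered magnetograms.

    b1, b2 : 2-D field arrays observed at times t1 < t2 (same grid).
    Returns the field estimate at time t in [t1, t2].
    This is an illustrative stand-in, not MAGIC's algorithm.
    """
    w = (t - t1) / (t2 - t1)
    return (1.0 - w) * b1 + w * b2
```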

  7. Quality assurance tool for organ at risk delineation in radiation therapy using a parametric statistical approach.

    Science.gov (United States)

    Hui, Cheukkai B; Nourzadeh, Hamidreza; Watkins, William T; Trifiletti, Daniel M; Alonso, Clayton E; Dutta, Sunil W; Siebers, Jeffrey V

    2018-02-26

    To develop a quality assurance (QA) tool that identifies inaccurate organ at risk (OAR) delineations. The QA tool computed volumetric features from prior OAR delineation data from 73 thoracic patients to construct a reference database. All volumetric features of the OAR delineation are computed in three-dimensional space. Volumetric features of a new OAR are compared with respect to those in the reference database to discern delineation outliers. A multicriteria outlier detection system warns users of specific delineation outliers based on combinations of deviant features. Fifteen independent experimental sets including automatic, propagated, and clinically approved manual delineation sets were used for verification. The verification OARs included manipulations to mimic common errors. Three experts reviewed the experimental sets to identify and classify errors, first without the QA tool, and then, one week later, with it. In the cohort of manual delineations with manual manipulations, the QA tool detected 94% of the mimicked errors. Overall, it detected 37% of the minor and 85% of the major errors. The QA tool improved reviewer error detection sensitivity from 61% to 68% for minor errors (P = 0.17), and from 78% to 87% for major errors (P = 0.02). The QA tool assists users to detect potential delineation errors. QA tool integration into clinical procedures may reduce the frequency of inaccurate OAR delineation, and potentially improve safety and quality of radiation treatment planning. © 2018 American Association of Physicists in Medicine.
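
The comparison of a new delineation's features against a reference cohort can be sketched with a per-feature z-score check. This is an illustrative stand-in for the paper's parametric statistical approach (the record does not give its exact statistics or thresholds), showing only the basic pattern of flagging deviant features against a reference distribution:

```python
import numpy as np

def flag_outliers(reference_features, new_features, z_thresh=3.0):
    """Flag features of a new OAR delineation that deviate from a cohort.

    reference_features : (n_patients, n_features) values from prior data.
    new_features       : (n_features,) values of the new delineation.
    Returns a boolean array; True marks a feature whose absolute z-score
    against the reference cohort exceeds z_thresh (threshold is an
    illustrative assumption).
    """
    mu = reference_features.mean(axis=0)
    sigma = reference_features.std(axis=0, ddof=1)
    z = np.abs((new_features - mu) / sigma)
    return z > z_thresh
```

A multicriteria system like the one described would then combine such per-feature flags into specific warnings rather than reporting them individually.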

  8. Report on tool transfer and alignment methods

    DEFF Research Database (Denmark)

    Tosello, Guido; Gasparin, Stefania; De Grave, Arnaud

    2010-01-01

    In the last few years, research work has been carried out regarding the feature miniaturization and tooling performance achievable with specific process chains combining different micro machining processes. On the other hand, technologies, strategies and tool design rules in order to optimize the...... process chain for tooling (i.e. hybrid tooling) based on the combination of micro milling and micro electrical discharge machining (EDM) milling, both at machine tool and at machining technology levels....

  9. Investigating Learner Attitudes toward E-Books as Learning Tools: Based on the Activity Theory Approach

    Science.gov (United States)

    Liaw, Shu-Sheng; Huang, Hsiu-Mei

    2016-01-01

    This paper investigates the use of e-books as learning tools in terms of learner satisfaction, usefulness, behavioral intention, and learning effectiveness. Based on the activity theory approach, this research develops a research model to understand learner attitudes toward e-books in two physical sizes: 10″ and 7″. Results suggest that screen…

  10. The tools for evaluating logistics processes

    Directory of Open Access Journals (Sweden)

    Michał Adamczak

    2013-12-01

    Full Text Available Background: The growing importance of the business process approach and dynamic management is triggered by market expectations for lead time reductions and the pressure for cost cuts. Efficient process management requires measurement and assessment skills. This article is intended to present the tools used in evaluating processes and the way in which they work together under simulated conditions. Methods: The project's authors believe that a process can be assessed by measuring its attributes: cost, time and quality. An assessment tool has been developed for each of those attributes. For costs it could be activity-based costing; for time, value stream mapping; for quality, statistical process control. Each tool allows for evaluating one of the attributes of any element in the process hierarchy. The methods presented in the paper have been supplemented with process modelling and simulation. Results: In order to show how process assessment tools are combined with process simulation, the authors show a sample process in three versions (serial, parallel and mixed). A variant simulation (using iGrafx software) allows for determining the values of attributes in the entire process based on the data set for its components (activities). In the example under investigation, the process variant has no impact on quality; process cost and time are affected. Conclusions: The tools for identifying attribute values, in combination with process modelling and simulation, can prove very beneficial when applied in business practice. In the first place they allow for evaluating a process based on the value of the attributes pertaining to its particular activities, which, in turn, raises the possibility of process configuration at the design stage. The solution presented in the paper can be developed further with a view to process standardization and best variant recommendation.
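The attribute roll-up performed in such simulations can be sketched for the time and cost attributes: lead time is the sum of activity times in a serial variant and the maximum in a parallel one, and activity-based cost is duration times resource rate. This is a toy stand-in for the iGrafx model, with hypothetical activities and rates.

```python
def process_time(activities, variant):
    """Lead time of a process: activities run one after another in the
    serial variant, all at once in the parallel variant."""
    times = [t for _, t in activities]
    if variant == "serial":
        return sum(times)
    if variant == "parallel":
        return max(times)
    raise ValueError(f"unknown variant: {variant}")

def process_cost(activities, rates):
    """Activity-based costing: each activity costs duration x resource rate."""
    return sum(t * rates[name] for name, t in activities)
```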

  11. Degradation of 2,4-dichlorophenol using combined approach based on ultrasound, ozone and catalyst.

    Science.gov (United States)

    Barik, Arati J; Gogate, Parag R

    2017-05-01

    The present work investigates the application of ultrasound and ozone operated individually and in combination with catalyst (ZnO and CuO) for establishing the possible synergistic effects for the degradation of 2,4-dichlorophenol. The dependency of extent of degradation on the operating parameters like temperature (over the range of 30-36°C), initial pH (3-9), catalyst as ZnO (loading of 0.025-0.15g/L) and CuO (loading of 0.02-0.1g/L) and initial concentration of 2,4-DCP (20-50ppm) has been established to maximize the efficacy of ultrasound (US) induced degradation. Using only US, the maximum degradation of 2,4-DCP obtained was 28.85% under optimized conditions of initial concentration as 20ppm, pH of 5 and temperature of 34°C. A study of the effect of ozone flow rate for the ozone-only approach revealed that maximum degradation was obtained at a 400mg/h ozone flow rate. The combined approaches such as US+O3, US+ZnO, US+CuO, O3+ZnO, O3+CuO, US+O3+ZnO and US+O3+CuO have been subsequently investigated under optimized conditions and observed to be more efficient as compared to individual approaches. The maximum extent of degradation for the combined operation of US+O3 (400mg/h)+ZnO (0.1g/L) and US+O3 (400mg/h)+CuO (0.08g/L) has been obtained as 95.66% and 97.03% respectively. The degradation products of 2,4-DCP have been identified using GC-MS analysis and the toxicity analysis has also been performed based on the anti-microbial activity test (agar-well diffusion method) for the different treatment strategies. The present work has conclusively established that the combined approach of US+O3+CuO was the most efficient treatment scheme resulting in near complete degradation of 2,4-DCP with production of less toxic intermediates. Copyright © 2016 Elsevier B.V. All rights reserved.
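Studies of this kind commonly model degradation with first-order kinetics and quantify synergy as the ratio of the combined rate constant to the sum of the individual ones. A minimal sketch under those assumptions (the record does not state its kinetic model; the rate constants below are illustrative, not the paper's):

```python
import math

def degradation_extent(k, t):
    """Fraction of pollutant degraded after time t, assuming first-order
    kinetics with rate constant k (a common model for such studies)."""
    return 1.0 - math.exp(-k * t)

def synergy_index(k_combined, k_individuals):
    """Ratio of the combined rate constant to the sum of the individual
    ones; values above 1 indicate a synergistic effect."""
    return k_combined / sum(k_individuals)
```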

  12. Integration of distributed system simulation tools for a holistic approach to integrated building and system design

    NARCIS (Netherlands)

    Radosevic, M.; Hensen, J.L.M.; Wijsman, A.J.T.M.; Hensen, J.L.M.; Lain, M.

    2004-01-01

    Advanced architectural developments require an integrated approach to design, where the simulation tools available today deal only with a small subset of the overall problem. The aim of this study is to enable run-time exchange of necessary data at suitable frequency between different simulation

  13. TIPS Placement via Combined Transjugular and Transhepatic Approach for Cavernous Portal Vein Occlusion: Targeted Approach

    Directory of Open Access Journals (Sweden)

    Natanel Jourabchi

    2013-01-01

    Full Text Available Purpose. We report a novel technique which aided recanalization of an occluded portal vein for transjugular intrahepatic portosystemic shunt (TIPS creation in a patient with symptomatic portal vein thrombosis with cavernous transformation. Some have previously considered cavernous transformation a contraindication to TIPS. Case Presentation. 62-year-old man with chronic pancreatitis, portal vein thrombosis, portal hypertension and recurrent variceal bleeding presents with melena and hematemesis. The patient was severely anemic, hemodynamically unstable, and required emergent portal decompression. Attempts to recanalize the main portal vein using traditional transjugular access were unsuccessful. After percutaneous transhepatic right portal vein access and navigation of a wire through the occluded main portal vein, an angioplasty balloon was inflated at the desired site of shunt takeoff. The balloon was targeted and punctured from the transjugular approach, and a wire was passed into the portal system. TIPS placement then proceeded routinely. Conclusion. Although occlusion of the portal vein increases difficulty of performing TIPS, it should not be considered an absolute contraindication. We have described a method for recanalizing an occluded portal vein using a combined transhepatic and transjugular approach for TIPS. This approach may be useful to relieve portal hypertension in patients who fail endoscopic and/or surgical therapies.

  14. Combining Digital Archives Content with Serious Game Approach to Create a Gamified Learning Experience

    Directory of Open Access Journals (Sweden)

    D.-T. Shih

    2015-08-01

    Full Text Available This paper presents an interdisciplinary approach to developing a content-aware application that combines a game with learning on specific categories of digital archives. The employment of a content-oriented game enhances the gamification and efficacy of learning in culture education on the architectures and history of Hsinchu County, Taiwan. The gamified form of the application is used as a backbone to support and provide a strong stimulation to engage users in learning art and culture; this research is therefore implemented under the goal of “The Digital ARt/ARchitecture Project”. The purpose of the abovementioned project is to develop interactive serious game approaches and applications for Hsinchu County historical archives and architectures. Therefore, we present two applications, “3D AR for Hukou Old ” and “Hsinchu County History Museum AR Tour”, which take the form of augmented reality (AR). By using AR imaging techniques to blend real objects and virtual content, users can immerse themselves in virtual exhibitions of Hukou Old Street and Hsinchu County History Museum, and learn in a ubiquitous computing environment. This paper proposes a content system that includes tools and materials used to create representations of digitized cultural archives including historical artifacts, documents, customs, religion, and architectures. The Digital ARt / ARchitecture Project is based on the concept of serious games and consists of three aspects: content creation, target management, and AR presentation. The project focuses on developing a proper approach to serve as an interactive game, and to offer a learning opportunity for appreciating historic architectures by playing AR cards. Furthermore, the card game aims to provide multi-faceted understanding and learning experience to help users learn through 3D objects, hyperlinked web data, and the manipulation of learning modes, and then effectively develop their learning levels on cultural and historical archives in Hsinchu County.

  15. Combining Digital Archives Content with Serious Game Approach to Create a Gamified Learning Experience

    Science.gov (United States)

    Shih, D.-T.; Lin, C. L.; Tseng, C.-Y.

    2015-08-01

    This paper presents an interdisciplinary approach to developing a content-aware application that combines a game with learning on specific categories of digital archives. The employment of a content-oriented game enhances the gamification and efficacy of learning in culture education on the architectures and history of Hsinchu County, Taiwan. The gamified form of the application is used as a backbone to support and provide a strong stimulation to engage users in learning art and culture; this research is therefore implemented under the goal of "The Digital ARt/ARchitecture Project". The purpose of the abovementioned project is to develop interactive serious game approaches and applications for Hsinchu County historical archives and architectures. Therefore, we present two applications, "3D AR for Hukou Old " and "Hsinchu County History Museum AR Tour", which take the form of augmented reality (AR). By using AR imaging techniques to blend real objects and virtual content, users can immerse themselves in virtual exhibitions of Hukou Old Street and Hsinchu County History Museum, and learn in a ubiquitous computing environment. This paper proposes a content system that includes tools and materials used to create representations of digitized cultural archives including historical artifacts, documents, customs, religion, and architectures. The Digital ARt / ARchitecture Project is based on the concept of serious games and consists of three aspects: content creation, target management, and AR presentation. The project focuses on developing a proper approach to serve as an interactive game, and to offer a learning opportunity for appreciating historic architectures by playing AR cards. Furthermore, the card game aims to provide multi-faceted understanding and learning experience to help users learn through 3D objects, hyperlinked web data, and the manipulation of learning modes, and then effectively develop their learning levels on cultural and historical archives in Hsinchu County.

  16. Web analytics tools and web metrics tools: An overview and comparative analysis

    Directory of Open Access Journals (Sweden)

    Ivan Bekavac

    2015-10-01

    Full Text Available The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and proper choice of web tools for particular business models are also reviewed. The research is divided into two sections. First, a qualitative focus is placed on reviewing web analytics tools to explore their functionalities and ability to be integrated into the respective business model. Web analytics tools support the business analyst’s efforts in obtaining useful and relevant insights into market dynamics. Thus, generally speaking, selecting a web analytics and web metrics tool should be based on an investigative approach, not a random decision. The second section takes a quantitative focus, shifting from theory to an empirical approach, and subsequently presents output data resulting from a study based on perceived user satisfaction of web analytics tools. The empirical study was carried out on employees from 200 Croatian firms from either an IT or a marketing branch. The paper contributes to highlighting the support for management that web analytics and web metrics tools available on the market have to offer, based on the growing need to understand and predict global market trends.

  17. A combined segmenting and non-segmenting approach to signal quality estimation for ambulatory photoplethysmography

    International Nuclear Information System (INIS)

    Wander, J D; Morris, D

    2014-01-01

    Continuous cardiac monitoring of healthy and unhealthy patients can help us understand the progression of heart disease and enable early treatment. Optical pulse sensing is an excellent candidate for continuous mobile monitoring of cardiovascular health indicators, but optical pulse signals are susceptible to corruption from a number of noise sources, including motion artifact. Therefore, before higher-level health indicators can be reliably computed, corrupted data must be separated from valid data. This is an especially difficult task in the presence of artifact caused by ambulation (e.g. walking or jogging), which shares significant spectral energy with the true pulsatile signal. In this manuscript, we present a machine-learning-based system for automated estimation of signal quality of optical pulse signals that performs well in the presence of periodic artifact. We hypothesized that signal processing methods that identified individual heart beats (segmenting approaches) would be more error-prone than methods that did not (non-segmenting approaches) when applied to data contaminated by periodic artifact. We further hypothesized that a fusion of segmenting and non-segmenting approaches would outperform either approach alone. Therefore, we developed a novel non-segmenting approach to signal quality estimation that we then utilized in combination with a traditional segmenting approach. Using this system we were able to robustly detect differences in signal quality as labeled by expert human raters (Pearson’s r = 0.9263). We then validated our original hypotheses by demonstrating that our non-segmenting approach outperformed the segmenting approach in the presence of contaminated signal, and that the combined system outperformed either individually. Lastly, as an example, we demonstrated the utility of our signal quality estimation system in evaluating the trustworthiness of heart rate measurements derived from optical pulse signals. (paper)
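One way to realize a non-segmenting quality estimate, as a sketch only: the peak of the normalized autocorrelation over lags spanning plausible pulse periods, fused with a segmenting score by weighted averaging. The sampling rate (20 Hz assumed for the lag range) and the naive fusion rule are assumptions; the paper's machine-learning features are not reproduced here.

```python
import numpy as np

def nonsegmenting_sqi(x, lag_lo=10, lag_hi=60):
    """Quality index without beat detection: peak of the normalized
    autocorrelation over lags covering plausible pulse periods
    (lags 10-60 samples, i.e. roughly 0.33-2 Hz at an assumed 20 Hz)."""
    x = np.asarray(x, float) - np.mean(x)
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]
    ac = ac / ac[0]
    return float(ac[lag_lo:lag_hi].max())

def fused_sqi(seg_score, nonseg_score, w=0.5):
    """Naive fusion of segmenting and non-segmenting quality scores."""
    return w * seg_score + (1 - w) * nonseg_score
```

A clean periodic pulse scores near 1; broadband noise scores much lower.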

  18. Combined supra-transorbital keyhole approach for treatment of delayed intraorbital encephalocele: A minimally invasive approach for an unusual complication of decompressive craniectomy

    Science.gov (United States)

    di Somma, Lucia; Iacoangeli, Maurizio; Nasi, Davide; Balercia, Paolo; Lupi, Ettore; Girotto, Riccardo; Polonara, Gabriele; Scerrati, Massimo

    2016-01-01

    Background: Intraorbital encephalocele is a rare entity characterized by the herniation of cerebral tissue inside the orbital cavity through a defect of the orbital roof. In patients who have experienced head trauma, intraorbital encephalocele is usually secondary to orbital roof fracture. Case Description: We describe here a case of a patient who presented an intraorbital encephalocele 2 years after severe traumatic brain injury, treated by decompressive craniectomy and subsequent autologous cranioplasty, without any evidence of orbital roof fracture. The encephalocele removal and the subsequent orbital roof reconstruction were performed by using a modification of the supraorbital keyhole approach, in which we combine an orbital osteotomy with a supraorbital minicraniotomy to facilitate view and access to both the anterior cranial fossa and orbital compartment and to preserve the already osseointegrated autologous cranioplasty. Conclusions: The peculiarities of this case are the orbital encephalocele without an orbital roof traumatic fracture, and the combined minimally invasive approach used to fix both the encephalocele and the orbital roof defect. Delayed intraorbital encephalocele is probably a complication related to an unintentional opening of the orbit during decompressive craniectomy through which the brain herniated following the restoration of physiological intracranial pressure gradients after the bone flap repositioning. The reconstruction of the orbital roof was performed by using a combined supra-transorbital minimally invasive approach aiming at achieving adequate surgical exposure while preserving the autologous cranioplasty, already osteointegrated. To the best of our knowledge, this approach has not been previously used to address intraorbital encephalocele. PMID:26862452

  19. Combined object-oriented approach for development of process control systems

    International Nuclear Information System (INIS)

    Antonova, I.; Batchkova, I.

    2013-01-01

    Full text: The traditional approaches for the development of software control systems in automation and information technology, based on direct code creation, are no longer effective or successful enough. The response to these challenges is Model Driven Engineering (MDE) or its counterpart in the field of architectures, Model Driven Architecture (MDA). One of the most promising approaches supporting MDE and MDA is UML. It does not specify a methodology for software or system design but aims to provide an integrated modeling framework for structural, functional and behavioral descriptions. The success of UML in many object-oriented approaches led to the idea of applying UML to the design of multi-agent systems. The approach proposed in this paper applies a modified Harmony methodology and is based on the combined use of the UML profile for systems engineering, the IEC 61499 standard and FIPA standard protocols. The benefits of the object-oriented paradigm and the models of the IEC 61499 standard are extended with UML/SysML and FIPA notations. The development phases are illustrated with the UML models of a simple process control system. The main benefits of using the proposed approach can be summarized as follows: it provides consistency in the syntax and underlying semantics; it increases the potential and likelihood of reuse; it supports the whole software development life cycle in the field of process control. Including the SysML features, based on extended activity and parametric diagrams, flow ports and items, in the proposed approach opens possibilities for modeling continuous systems and supports development in the field of process control. Another advantage, connected to the UML/MARTE profile, is the possibility of analyzing the designed system and of detailed design of the hardware and software platform of the modeled application. Key words: object-oriented modeling, control system, UML, SysML, IEC 61499

  20. From Physical Process to Economic Cost - Integrated Approaches of Landslide Risk Assessment

    Science.gov (United States)

    Klose, M.; Damm, B.

    2014-12-01

    The nature of landslides is complex in many respects, with landslide hazard and impact being dependent on a variety of factors. This obviously requires an integrated assessment for a fundamental understanding of landslide risk. Integrated risk assessment, according to the approach presented in this contribution, implies combining prediction of future landslide occurrence with analysis of landslide impact in the past. A critical step for assessing landslide risk in an integrated perspective is to analyze what types of landslide damage affected people and property in which way, and how people contributed and responded to these damage types. In integrated risk assessment, the focus is on systematic identification and monetization of landslide damage, and analytical tools that allow deriving economic costs from physical landslide processes are at the heart of this approach. The broad spectrum of landslide types and process mechanisms as well as nonlinearity between landslide magnitude, damage intensity, and direct costs are some main factors explaining recent challenges in risk assessment. The two prevailing approaches for assessing the impact of landslides in economic terms are cost survey (ex-post) and risk analysis (ex-ante). Both approaches are able to complement each other, but a combination of them has not yet been realized. It is common practice today to derive landslide risk without considering landslide process-based cause-effect relationships, since integrated concepts or new modeling tools expanding conventional methods are still widely missing. The approach introduced in this contribution is based on a systematic framework that combines cost survey and GIS-based tools for hazard or cost modeling with methods to assess interactions between land use practices and landslides in historical perspective. Fundamental understanding of landslide risk also requires knowledge about the economic and fiscal relevance of landslide losses, wherefore analysis of their

  1. Combining Statistical Methodologies in Water Quality Monitoring in a Hydrological Basin - Space and Time Approaches

    OpenAIRE

    Costa, Marco; A. Manuela Gonçalves

    2012-01-01

    This work discusses statistical approaches that combine multivariate statistical techniques and time series analysis in order to describe and model spatial patterns and temporal evolution by observing hydrological series of water quality variables recorded in time and space. These approaches are illustrated with a data set collected in the River Ave hydrological basin located in the Northwest region of Portugal.
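A minimal example of the kind of space-time summary such combined approaches start from: averaging a water-quality variable by station and month. The data layout is a toy assumption; the paper's actual multivariate and time-series models are far richer.

```python
def seasonal_profile(records):
    """Mean of a water-quality variable by (station, month) from
    (station, month, value) tuples - a basic space-time summary."""
    groups = {}
    for station, month, value in records:
        groups.setdefault((station, month), []).append(value)
    return {key: sum(vals) / len(vals) for key, vals in groups.items()}
```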

  2. Approaches for assessing sustainable remediation

    DEFF Research Database (Denmark)

    Søndergaard, Gitte Lemming; Binning, Philip John; Bjerg, Poul Løgstrup

    Sustainable remediation seeks to reduce direct contaminant point source impacts on the environment, while minimizing the indirect cost of remediation to the environment, society and economy. This paper presents an overview of available approaches for assessing the sustainability of alternative...... remediation strategies for a contaminated site. Most approaches use multi-criteria assessment methods (MCA) to structure a decision support process. Different combinations of environmental, social and economic criteria are employed, and are assessed either in qualitative or quantitative forms with various...... tools such as life cycle assessment and cost benefit analysis. Stakeholder involvement, which is a key component of sustainable remediation, is conducted in various ways. Some approaches involve stakeholders directly in the evaluation or weighting of criteria, whereas other approaches only indirectly...

  3. Combined Tensor Fitting and TV Regularization in Diffusion Tensor Imaging Based on a Riemannian Manifold Approach.

    Science.gov (United States)

    Baust, Maximilian; Weinmann, Andreas; Wieczorek, Matthias; Lasser, Tobias; Storath, Martin; Navab, Nassir

    2016-08-01

    In this paper, we consider combined TV denoising and diffusion tensor fitting in DTI using the affine-invariant Riemannian metric on the space of diffusion tensors. Instead of first fitting the diffusion tensors, and then denoising them, we define a suitable TV-type energy functional which incorporates the measured DWIs (using an inverse problem setup) and which measures the nearness of neighboring tensors in the manifold. To approach this functional, we propose generalized forward-backward splitting algorithms which combine an explicit and several implicit steps performed on a decomposition of the functional. We validate the performance of the derived algorithms on synthetic and real DTI data. In particular, we work on real 3D data. To our knowledge, the present paper describes the first approach to TV regularization in a combined manifold and inverse problem setup.
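A scalar Euclidean analogue of such a combined model conveys the structure of the functional: a data-fidelity term plus a TV penalty, here with the absolute value smoothed so plain gradient descent applies. This is a sketch only; the paper's setting is manifold-valued and uses generalized forward-backward splitting, not this toy scheme.

```python
import numpy as np

def tv_denoise_1d(y, lam=0.5, eps=0.1, step=0.05, iters=500):
    """Minimize 0.5*||x - y||^2 + lam * sum_i sqrt((x_{i+1}-x_i)^2 + eps^2)
    by gradient descent (eps smooths the absolute value in the TV term)."""
    x = y.astype(float).copy()
    for _ in range(iters):
        d = np.diff(x)
        g = d / np.sqrt(d * d + eps * eps)  # derivative of smoothed |d|
        grad = x - y                        # data-fidelity gradient
        grad[:-1] -= lam * g                # TV term pulls left node up
        grad[1:] += lam * g                 # ... and right node down
        x -= step * grad
    return x
```

The TV contributions telescope, so the mean of the signal is preserved while jumps between neighbors shrink.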

  4. Human in vitro 3D co-culture model to engineer vascularized bone-mimicking tissues combining computational tools and statistical experimental approach.

    Science.gov (United States)

    Bersini, Simone; Gilardi, Mara; Arrigoni, Chiara; Talò, Giuseppe; Zamai, Moreno; Zagra, Luigi; Caiolfa, Valeria; Moretti, Matteo

    2016-01-01

    The generation of functional, vascularized tissues is a key challenge for both tissue engineering applications and the development of advanced in vitro models analyzing interactions among circulating cells, endothelium and organ-specific microenvironments. Since vascularization is a complex process guided by multiple synergic factors, it is critical to analyze the specific role that different experimental parameters play in the generation of physiological tissues. Our goals were to design a novel meso-scale model bridging the gap between microfluidic and macro-scale studies, and high-throughput screen the effects of multiple variables on the vascularization of bone-mimicking tissues. We investigated the influence of endothelial cell (EC) density (3-5 Mcells/ml), cell ratio among ECs, mesenchymal stem cells (MSCs) and osteo-differentiated MSCs (1:1:0, 10:1:0, 10:1:1), culture medium (endothelial, endothelial + angiopoietin-1, 1:1 endothelial/osteo), hydrogel type (100%fibrin, 60%fibrin+40%collagen), tissue geometry (2 × 2 × 2, 2 × 2 × 5 mm³). We optimized the geometry and oxygen gradient inside hydrogels through computational simulations and we analyzed microvascular network features including total network length/area and vascular branch number/length. Particularly, we employed the “Design of Experiment” statistical approach to identify key differences among experimental conditions. We combined the generation of 3D functional tissue units with the fine control over the local microenvironment (e.g. oxygen gradients), and developed an effective strategy to enable the high-throughput screening of multiple experimental parameters. Our approach allowed to identify synergic correlations among critical parameters driving microvascular network development within a bone-mimicking environment and could be translated to any vascularized tissue. Copyright © 2015 Elsevier Ltd. All rights reserved.
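Design-of-Experiments screening over parameters like those listed amounts to enumerating level combinations and estimating main effects on a response. A minimal sketch with two of the factors above (levels abbreviated; the effect computation is the generic textbook one, not the paper's specific analysis):

```python
from itertools import product

def full_factorial(factors):
    """All level combinations of the screened parameters."""
    names = list(factors)
    return [dict(zip(names, levels)) for levels in product(*factors.values())]

def main_effect(runs, responses, factor, level_a, level_b):
    """Difference in mean response between two levels of one factor."""
    a = [r for run, r in zip(runs, responses) if run[factor] == level_a]
    b = [r for run, r in zip(runs, responses) if run[factor] == level_b]
    return sum(a) / len(a) - sum(b) / len(b)
```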

  5. Comparison of online, hands-on, and a combined approach for teaching cautery disbudding technique to dairy producers.

    Science.gov (United States)

    Winder, Charlotte B; LeBlanc, Stephen J; Haley, Derek B; Lissemore, Kerry D; Godkin, M Ann; Duffield, Todd F

    2018-01-01

    The use of pain control for disbudding and dehorning is important from both an animal and industry perspective. Best practices include the use of local anesthetic, commonly given as a cornual nerve block (CNB), and a nonsteroidal anti-inflammatory drug. The proportion is decreasing, but many dairy producers do not use local anesthesia, perhaps in part due to lack of knowledge of the CNB technique. Although this skill is typically learned in person from a veterinarian, alternative methods may be useful. The objective of this trial was to determine if there were differences in the efficacy of online training (n = 23), hands-on training (n = 20), and a combined approach (n = 23) for teaching producers to successfully administer a CNB and disbud a calf. The primary outcome was block efficacy, defined as a lack of established pain behaviors during iron application. Secondary outcomes were background knowledge (assessed by a written quiz), CNB and disbudding technique (evaluated by rubric scoring), time taken, and self-confidence before and after evaluation. Associations between training group and outcome were assessed with logistic regression, ordered logistic regression, and Cox-proportional hazard models, with a random effect for workshop. Block efficacy was not different between training groups, with 91% successful in both combined and online groups, and 75% in the hands-on trained group. Online learners had poorer technical scores than hands-on trainees. The combined group was not different from hands-on. Time to block completion tended to be longer for the online group (62 ± 11 s), whereas time to disbudding completion was not different between hands-on (41 ± 5 s) or combined trainees (41 ± 5 s). The combined group had the highest pre-evaluation confidence score, and remained higher after evaluation than online but was not different than hands-on. Although we saw some statistical differences between groups, absolute differences were small and block efficacy was

  6. Use of an interdisciplinary, participatory design approach to develop a usable patient self-assessment tool in atrial fibrillation

    Directory of Open Access Journals (Sweden)

    MacCallum L

    2013-11-01

    Full Text Available Lori MacCallum,1,2 Heather McGaw,1 Nazanin Meshkat,3 Alissia Valentinis,4 Leslie Beard Ashley,5 Rajan Sacha Bhatia,3,6,7 Kaye Benson,7 Noah Ivers,6,8 Kori Leblanc,2,7 Dante Morra3,5,7 1Li Ka Shing Knowledge Institute, St Michael's Hospital, Toronto, 2Leslie Dan Faculty of Pharmacy, University of Toronto, 3Department of Medicine, University of Toronto, Toronto, 4Taddle Creek Family Health Team, Toronto, 5Trillium Health Partners, Mississauga, 6Women's College Hospital, Toronto, 7Centre for Innovation in Complex Care, University Health Network, Toronto, 8Department of Family and Community Medicine, University of Toronto, Toronto, ON, Canada Abstract: After identifying that significant care gaps exist within the management of atrial fibrillation (AF), a patient-focused tool was developed to help patients better assess and manage their AF. This tool aims to provide education and awareness regarding the management of symptoms and stroke risk associated with AF, while engaging patients to identify if their condition is optimally managed and to become involved in their own care. An interdisciplinary group of health care providers and designers worked together in a participatory design approach to develop the tool with input from patients. Usability testing was completed with 22 patients of varying demographics to represent the characteristics of the patient population. The findings from usability testing interviews were used to further improve and develop the tool to improve ease of use. A physician-facing tool was also developed to help to explain the tool and provide a brief summary of the 2012 Canadian Cardiovascular Society atrial fibrillation guidelines. By incorporating patient input and human-centered design with the knowledge, experience, and medical expertise of health care providers, we have used an approach in developing the tool that tries to more effectively meet patients' needs.
Keywords: patient education, atrial fibrillation, care gaps

  7. Recommender System and Web 2.0 Tools to Enhance a Blended Learning Model

    Science.gov (United States)

    Hoic-Bozic, Natasa; Dlab, Martina Holenko; Mornar, Vedran

    2016-01-01

    Blended learning models that combine face-to-face and online learning are of great importance in modern higher education. However, their development should be in line with the recent changes in e-learning that emphasize a student-centered approach and use tools available on the Web to support the learning process. This paper presents research on…

  8. Orthology detection combining clustering and synteny for very large datasets

    OpenAIRE

    Lechner, Marcus; Hernandez-Rosales, Maribel; Doerr, Daniel; Wieseke, Nicolas; Thévenin, Annelyse; Stoye, Jens; Hartmann, Roland K.; Prohaska, Sonja J.; Stadler, Peter F.

    2014-01-01

    The elucidation of orthology relationships is an important step both in gene function prediction as well as towards understanding patterns of sequence evolution. Orthology assignments are usually derived directly from sequence similarities for large datasets because more exact approaches incur prohibitively high computational costs. Here we present PoFF, an extension for the standalone tool Proteinortho, which enhances orthology detection by combining clustering, sequence similarity, and synteny. In the ...

  9. Tool life and surface roughness of ceramic cutting tool when turning AISI D2 tool steel

    International Nuclear Information System (INIS)

    Wan Emri Wan Abdul Rahaman

    2007-01-01

    The tool life of physical vapor deposition (PVD) titanium nitride (TiN) coated ceramic when turning AISI D2 tool steel of hardness 54-55 HRC was investigated. The experiments were conducted at various cutting speed and feed rate combinations with constant depth of cut and under dry cutting conditions. The tool life of the cutting tool for all cutting conditions was obtained. The tool failure modes and wear mechanisms were also investigated. The wear mechanisms responsible for the wear forms are abrasion and diffusion. Flank wear and crater wear are the main wear forms found when turning AISI D2 grade hardened steel of 54-55 HRC using the KY 4400 ceramic cutting tool. Additionally, catastrophic failure was observed at a cutting speed of 183 m/min and feed rate of 0.16 mm/rev. (author)

  10. Developing a learning analytics tool

    DEFF Research Database (Denmark)

    Wahl, Christian; Belle, Gianna; Clemmensen, Anita Lykke

    This poster describes how learning analytics and collective intelligence can be combined in order to develop a tool for providing support and feedback to learners and teachers regarding students' self-initiated learning activities.

  11. Evaluating the importance of surface soil contributions to reservoir sediment in alpine environments: a combined modelling and fingerprinting approach in the Posets-Maladeta Natural Park

    Science.gov (United States)

    Palazón, L.; Gaspar, L.; Latorre, B.; Blake, W. H.; Navas, A.

    2014-09-01

    Soil in alpine environments plays a key role in the development of ecosystem services and in order to maintain and preserve this important resource, information is required on processes that lead to soil erosion. Similar to other mountain alpine environments, the Benasque catchment is characterised by temperatures below freezing that can last from November to April, intense rainfall events, typically in spring and autumn, and rugged topography which makes assessment of erosion challenging. Indirect approaches to soil erosion assessment, such as combined model approaches, offer an opportunity to evaluate soil erosion in such areas. In this study (i) the SWAT (Soil and Water Assessment Tool) hydrological and erosion model and (ii) sediment fingerprinting procedures were used in parallel to assess the viability of a combined modelling and tracing approach to evaluate soil erosion processes in the area of the Posets-Maladeta Natural Park (central Spanish Pyrenees). Soil erosion rates and sediment contribution of potential sediment sources defined by soil type (Kastanozems/Phaeozems; Fluvisols and Cambisols) were assessed. The SWAT model suggested that, with the highest specific sediment yields, Cambisols are the main source of sediment in the Benasque catchment and Phaeozems and Fluvisols were identified as the lowest sediment contributors. Spring and winter model runs gave the highest and lowest specific sediment yield, respectively. In contrast, sediment fingerprinting analysis identified Fluvisols, which dominate the riparian zone, as the main sediment source at the time of sampling. This indicates the importance of connectivity as well as potential differences in the source dynamic of material in storage versus that transported efficiently from the system at times of high flow. 
The combined approach enabled us to better understand soil erosion processes in the Benasque alpine catchment, wherein SWAT identified areas of potential high sediment yield in large flood
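The fingerprinting step described above rests on a linear un-mixing model: tracer values measured in the reservoir sediment are treated as a proportion-weighted mixture of the source-soil signatures. A minimal sketch, with invented tracer values (not the paper's data), solves for non-negative source proportions that sum to one:

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical tracer signatures (rows: tracers, cols: sources).
# Sources ordered as Kastanozems/Phaeozems, Fluvisols, Cambisols.
A = np.array([
    [12.0, 30.0, 18.0],   # tracer 1 (e.g. a radionuclide activity)
    [ 5.0,  9.0,  7.0],   # tracer 2 (e.g. total organic carbon)
    [ 1.2,  0.4,  0.9],   # tracer 3 (e.g. a geochemical element)
])
b = np.array([24.0, 8.0, 0.7])  # mixture (reservoir sediment) values

# Enforce proportions summing to 1 by appending a heavily weighted
# row of ones (a common augmentation trick for constrained least squares).
w = 1e3
A_aug = np.vstack([A, w * np.ones(3)])
b_aug = np.append(b, w * 1.0)

p, _ = nnls(A_aug, b_aug)   # non-negative source proportions
p = p / p.sum()             # tidy up residual rounding
print(dict(zip(["Kastanozems/Phaeozems", "Fluvisols", "Cambisols"], p.round(3))))
```

The heavy weight on the appended row softly enforces the sum-to-one constraint; dedicated fingerprinting mixing models additionally propagate tracer uncertainty, which is omitted here.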

  12. An approach for flood monitoring by the combined use of Landsat 8 optical imagery and COSMO-SkyMed radar imagery

    Science.gov (United States)

    Tong, Xiaohua; Luo, Xin; Liu, Shuguang; Xie, Huan; Chao, Wei; Liu, Shuang; Liu, Shijie; Makhinov, A. N.; Makhinova, A. F.; Jiang, Yuying

    2018-02-01

    Remote sensing techniques offer potential for effective flood detection with the advantages of low-cost, large-scale, and real-time surface observations. The easily accessible data sources of optical remote sensing imagery provide abundant spectral information for accurate surface water body extraction, and synthetic aperture radar (SAR) systems represent a powerful tool for flood monitoring because of their all-weather capability. This paper introduces a new approach for flood monitoring by the combined use of both Landsat 8 optical imagery and COSMO-SkyMed radar imagery. Specifically, the proposed method applies support vector machine and the active contour without edges model for water extent determination in the periods before and during the flood, respectively. A map difference method is used for the flood inundation analysis. The proposed approach is particularly suitable for large-scale flood monitoring, and it was tested on a serious flood that occurred in northeastern China in August 2013, which caused immense loss of human lives and properties. High overall accuracies of 97.46% for the optical imagery and 93.70% for the radar imagery are achieved by the use of the techniques presented in this study. The results show that about 12% of the whole study area was inundated, corresponding to 5466 km2 of land surface.
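The map-difference step described above reduces, per pixel, to a Boolean comparison of the pre-flood and during-flood water masks. A toy sketch on synthetic masks (the classification itself, via SVM and active contours, is not reproduced here; the pixel size is an assumption):

```python
import numpy as np

# Toy 'water masks' standing in for classified scenes: pre-flood from
# Landsat 8, during-flood from COSMO-SkyMed; True = water pixel.
pre_flood = np.zeros((100, 100), dtype=bool)
pre_flood[40:60, 45:55] = True              # a permanent river reach

during_flood = pre_flood.copy()
during_flood[35:65, 40:60] = True           # water has spread outward

# Map-difference step: newly inundated pixels appear only during the flood.
inundated = during_flood & ~pre_flood
pixel_area_km2 = 0.0009                     # assuming 30 m x 30 m pixels
print(f"inundated area: {inundated.sum() * pixel_area_km2:.2f} km2")
print(f"fraction of scene newly flooded: {inundated.mean():.1%}")
```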

  13. A combined analytic-numeric approach for some boundary-value problems

    Directory of Open Access Journals (Sweden)

    Mustafa Turkyilmazoglu

    2016-02-01

    Full Text Available A combined analytic-numeric approach is undertaken in the present work for the solution of boundary-value problems in finite or semi-infinite domains. The equations to be treated arise specifically from the boundary layer analysis of some two- and three-dimensional flows in fluid mechanics. The purpose is to find quick but sufficiently accurate solutions. Taylor expansions at either boundary are computed, which are then matched to the other asymptotic or exact boundary conditions. The technique is applied to the well-known Blasius and Karman flows. Solutions obtained in terms of series compare favorably with existing ones in the literature.
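For context, the Blasius problem mentioned above is f''' + (1/2) f f'' = 0 with f(0) = f'(0) = 0 and f'(η→∞) = 1. The sketch below does not reproduce the paper's Taylor-matching technique; it computes the standard reference value f''(0) ≈ 0.33206 by an ordinary shooting method, the kind of benchmark such series solutions are compared against:

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

# Blasius equation: f''' + 0.5 * f * f'' = 0,
# with f(0) = f'(0) = 0 and f'(eta -> inf) = 1.
def rhs(eta, y):
    f, fp, fpp = y
    return [fp, fpp, -0.5 * f * fpp]

def fprime_at_infinity(fpp0, eta_max=10.0):
    sol = solve_ivp(rhs, (0, eta_max), [0.0, 0.0, fpp0], rtol=1e-9, atol=1e-9)
    return sol.y[1, -1] - 1.0   # residual of the far-field condition

# Shoot on the unknown wall value f''(0); the bracket contains a sign change.
fpp0 = brentq(fprime_at_infinity, 0.1, 1.0)
print(f"f''(0) = {fpp0:.6f}")   # classical value ~0.332057
```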

  14. Single Molecule Analysis Research Tool (SMART): an integrated approach for analyzing single molecule data.

    Directory of Open Access Journals (Sweden)

    Max Greenfeld

    Full Text Available Single molecule studies have expanded rapidly over the past decade and have the ability to provide an unprecedented level of understanding of biological systems. A common challenge upon introduction of novel, data-rich approaches is the management, processing, and analysis of the complex data sets that are generated. We provide a standardized approach for analyzing these data in the freely available software package SMART: Single Molecule Analysis Research Tool. SMART provides a format for organizing and easily accessing single molecule data, a general hidden Markov modeling algorithm for fitting an array of possible models specified by the user, a standardized data structure and graphical user interfaces to streamline the analysis and visualization of data. This approach guides experimental design, facilitating acquisition of the maximal information from single molecule experiments. SMART also provides a standardized format to allow dissemination of single molecule data and transparency in the analysis of reported data.
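As a flavor of the hidden Markov modeling that SMART automates, the sketch below decodes a synthetic two-state single-molecule trace with a hand-rolled Viterbi recursion; all parameters (state means, noise level, transition probabilities) are illustrative, not SMART's defaults:

```python
import numpy as np

# Two-state Gaussian-emission HMM Viterbi decoder -- a minimal stand-in
# for the kind of hidden Markov fitting SMART provides.
rng = np.random.default_rng(1)
true_states = np.repeat([0, 1, 0], [30, 40, 30])     # idealized trace
means, sigma = np.array([0.2, 0.8]), 0.05
obs = rng.normal(means[true_states], sigma)          # noisy observations

log_A = np.log(np.array([[0.95, 0.05], [0.05, 0.95]]))  # transitions
log_pi = np.log([0.5, 0.5])                              # initial probs

def log_emis(x):
    return -0.5 * ((x - means) / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi))

# Viterbi recursion.
T, K = len(obs), 2
delta = np.zeros((T, K))
psi = np.zeros((T, K), dtype=int)
delta[0] = log_pi + log_emis(obs[0])
for t in range(1, T):
    scores = delta[t - 1][:, None] + log_A   # scores[i, j]: from i to j
    psi[t] = scores.argmax(axis=0)
    delta[t] = scores.max(axis=0) + log_emis(obs[t])

# Backtrack the most likely state path.
path = np.zeros(T, dtype=int)
path[-1] = delta[-1].argmax()
for t in range(T - 2, -1, -1):
    path[t] = psi[t + 1, path[t + 1]]

# Typically ~100% for this well-separated toy trace.
print(f"decoding accuracy: {(path == true_states).mean():.0%}")
```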

  15. Oral Development for LSP via Open Source Tools

    Directory of Open Access Journals (Sweden)

    Alejandro Curado Fuentes

    2015-11-01

    Full Text Available For the development of oral abilities in LSP, few computer-based teaching and learning resources have actually focused intensively on web-based listening and speaking; many more focus on reading, writing, vocabulary, and grammar activities. Our aim in this paper is to approach oral communication in the online environment of Moodle by striving to make it suitable for a learning project which incorporates oral skills. The paper describes a blended process in which both individual and collaborative learning strategies can be combined and exploited through the implementation of specific tools and resources which may go hand in hand with traditional face-to-face conversational classes. The challenge with this new perspective is, ultimately, to provide effective tools for oral LSP development in an apparently writing-skill-focused medium.

  16. A new technology perspective and engineering tools approach for large, complex and distributed mission and safety critical systems components

    Science.gov (United States)

    Carrio, Miguel A., Jr.

    1988-01-01

    Rapidly emerging technology and methodologies have out-paced the systems development processes' ability to use them effectively, if at all. At the same time, the tools used to build systems are becoming obsolescent themselves as a consequence of the same technology lag that plagues systems development. The net result is that systems development activities have not been able to take advantage of available technology and have become equally dependent on aging and ineffective computer-aided engineering tools. New methods and tools approaches are essential if the demands of non-stop and Mission and Safety Critical (MASC) components are to be met.

  17. Partial maxillectomy for ameloblastoma of the maxilla with infratemporal fossa involvement: A combined endoscopic endonasal and transoral approach.

    Science.gov (United States)

    Guha, A; Hart, L; Polachova, H; Chovanec, M; Schalek, P

    2018-02-21

    Ameloblastoma represents the most common epithelial odontogenic tumor. Because of the proximity of maxillary tumors to the orbit and skull base, it should be managed as radically as possible. Maxillectomy, mainly via the transfacial or transoral approach, represents the most common type of surgical procedure. A drawback of these approaches is limited control of the superomedial extent of the tumour in the paranasal area. We report the use of a combined endoscopic endonasal and transoral approach to manage maxillary plexiform ameloblastoma in a 48-year-old male patient. The combined endoscopic endonasal and transoral approach enabled the radical removal of the tumour with a 1.5cm margin of radiographically intact bone and good control from both intrasinusal and intraoral aspects. Adequate visualization of the extent of the lesion (e.g. orbit, infratemporal fossa, anterior cranial base) was achieved. Non-complicated healing was achieved. This technique of partial maxillectomy led to very good aesthetic and functional results. No recurrence was noted during review appointments. The combination of endoscopic endonasal and transoral approaches for a partial maxillectomy allows sufficient reduction of the defect, thus eliminating the necessity for reconstruction and reducing the associated morbidity. Copyright © 2018 Elsevier Masson SAS. All rights reserved.

  18. PepMapper: a collaborative web tool for mapping epitopes from affinity-selected peptides.

    Directory of Open Access Journals (Sweden)

    Wenhan Chen

    Full Text Available Epitope mapping from affinity-selected peptides has become popular in epitope prediction, and correspondingly many Web-based tools have been developed in recent years. However, the performance of these tools varies in different circumstances. To address this problem, we employed an ensemble approach that incorporates two popular Web tools, MimoPro and Pep-3D-Search, to take advantage of the strengths of both methods and give users more options for their specific epitope-peptide mapping purposes. The combined operation of Union finds as many associated peptides as possible from both methods, which increases sensitivity in finding potential epitopic regions on a given antigen surface. The combined operation of Intersection achieves to some extent the mutual verification by the two methods and hence increases the likelihood of locating the genuine epitopic region on a given antigen in relation to the interacting peptides. The Consistency between Intersection and Union is an indirect sufficient condition to assess the likelihood of successful peptide-epitope mapping. On average over 27 tests, the combined operations of PepMapper outperformed either MimoPro or Pep-3D-Search alone. Therefore, PepMapper is another multipurpose mapping tool for epitope prediction from affinity-selected peptides. The Web server can be freely accessed at: http://informatics.nenu.edu.cn/PepMapper/
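The Union, Intersection, and Consistency operations described above are, at their core, set algebra over the two tools' predictions. A minimal sketch with invented residue identifiers (not real tool output):

```python
# Candidate epitope residues predicted by two hypothetical tool runs;
# the residue labels below are made up for illustration.
mimopro     = {"D45", "K46", "R47", "E80", "G81"}
pep3dsearch = {"K46", "R47", "E80", "N112"}

union        = mimopro | pep3dsearch        # maximize sensitivity
intersection = mimopro & pep3dsearch        # mutual verification

# Consistency: how much of the union both tools agree on --
# a rough proxy for confidence in the mapping.
consistency = len(intersection) / len(union)
print(sorted(intersection))                 # ['E80', 'K46', 'R47']
print(f"consistency = {consistency:.2f}")   # 3/6 = 0.50
```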

  19. Hybrid Modeling and Optimization of Manufacturing Combining Artificial Intelligence and Finite Element Method

    CERN Document Server

    Quiza, Ramón; Davim, J Paulo

    2012-01-01

    Artificial intelligence (AI) techniques and the finite element method (FEM) are both powerful computing tools, which are extensively used for modeling and optimizing manufacturing processes. The combination of these tools has resulted in a new flexible and robust approach, as several recent studies have shown. This book aims to review the work already done in this field as well as to present new possibilities and foreseen trends. The book is expected to be useful for postgraduate students and researchers working in the area of modeling and optimization of manufacturing processes.

  20. A Benders decomposition approach for a combined heat and power economic dispatch

    International Nuclear Information System (INIS)

    Abdolmohammadi, Hamid Reza; Kazemi, Ahad

    2013-01-01

    Highlights: • Benders decomposition algorithm to solve combined heat and power economic dispatch. • Decomposing the CHPED problem into master problem and subproblem. • Considering non-convex heat-power feasible region efficiently. • Solving 4 units and 5 units system with 2 and 3 co-generation units, respectively. • Obtaining better or as well results in terms of objective values. - Abstract: Recently, cogeneration units have played an increasingly important role in the utility industry. Therefore the optimal utilization of multiple combined heat and power (CHP) systems is an important optimization task in power system operation. Unlike power economic dispatch, which has a single equality constraint, two equality constraints must be met in combined heat and power economic dispatch (CHPED) problem. Moreover, in the cogeneration units, the power capacity limits are functions of the unit heat productions and the heat capacity limits are functions of the unit power generations. Thus, CHPED is a complicated optimization problem. In this paper, an algorithm based on Benders decomposition (BD) is proposed to solve the economic dispatch (ED) problem for cogeneration systems. In the proposed method, combined heat and power economic dispatch problem is decomposed into a master problem and subproblem. The subproblem generates the Benders cuts and master problem uses them as a new inequality constraint which is added to the previous constraints. The iterative process will continue until upper and lower bounds of the objective function optimal values are close enough and a converged optimal solution is found. Benders decomposition based approach is able to provide a good framework to consider the non-convex feasible operation regions of cogeneration units efficiently. In this paper, a four-unit system with two cogeneration units and a five-unit system with three cogeneration units are analyzed to exhibit the effectiveness of the proposed approach. In all cases, the
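The master/subproblem loop with Benders cuts can be illustrated on a toy LP (deliberately far simpler than the non-convex CHPED model). The subproblem below is solved analytically and returns optimality cuts θ ≥ a + b·y; the master is a small linear program solved with SciPy:

```python
import numpy as np
from scipy.optimize import linprog

# Tiny Benders decomposition sketch on a toy LP (NOT the CHPED model):
#   min 2x + 3y  s.t.  x + 2y >= 8,  3x + y >= 9,  x, y >= 0
# y is the master ("complicating") variable; the subproblem optimizes x
# for fixed y and returns an optimality cut theta >= a + b*y.
def subproblem(y):
    # v(y) = min 2x  s.t.  x >= 8 - 2y,  x >= (9 - y)/3,  x >= 0
    cands = [(16 - 4 * y, 16, -4),        # from x >= 8 - 2y
             (6 - 2 * y / 3, 6, -2 / 3),  # from 3x >= 9 - y
             (0.0, 0, 0)]                 # from x >= 0
    val, a, b = max(cands)                # active constraint gives the cut
    return val, a, b

cuts = [(0.0, 0.0)]                       # theta >= 0 bounds the master
lower, upper = -np.inf, np.inf
for _ in range(20):
    # Master: min 3y + theta  s.t.  theta >= a + b*y for each cut.
    A_ub = [[b, -1.0] for (a, b) in cuts]
    b_ub = [-a for (a, b) in cuts]
    m = linprog([3.0, 1.0], A_ub=A_ub, b_ub=b_ub,
                bounds=[(0, None), (None, None)])
    y, theta = m.x
    lower = m.fun                         # master gives a lower bound
    val, a, b = subproblem(y)
    upper = min(upper, 3 * y + val)       # feasible point gives upper bound
    if upper - lower < 1e-6:              # bounds met: converged
        break
    cuts.append((a, b))

# Toy optimum: x = 2, y = 3, objective 13.
print(f"y* = {y:.3f}, objective = {upper:.3f}")
```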

  1. Micro tooling technologies for polymer micro replication: direct, indirect and hybrid process chains

    DEFF Research Database (Denmark)

    Tosello, Guido; Hansen, Hans Nørgaard

    2009-01-01

    The increasing employment of micro products, of products containing micro parts and of products with micro-structured surfaces calls for mass fabrication technologies based on replication processes. In many cases, a suitable solution is given by the use of polymer micro products, whose production … and performance of the corresponding micro mould. Traditional methods of micro tooling, such as various machining processes (e.g. micro milling, micro electrical discharge machining) have already reached their limitations with decreasing dimensions of mould inserts and cavities. In this respect, tooling process … chains based on combination of micro manufacturing processes (defined as hybrid tooling) have been established in order to obtain further feature miniaturization and increased accuracy. In this paper, examples and performance of different hybrid tooling approaches as well as challenges, opportunities…

  2. A simple network agreement-based approach for combining evidences in a heterogeneous sensor network

    Directory of Open Access Journals (Sweden)

    Raúl Eusebio-Grande

    2015-12-01

    Full Text Available In this research we investigate how the evidence provided by both static and mobile nodes that are part of a heterogeneous sensor network can be combined to produce trustworthy results. A solution relying on a network agreement-based approach was implemented and tested.

  3. The Use of a Combined Bioinformatics Approach to Locate Antibiotic Resistance Genes on Plasmids From Whole Genome Sequences of Salmonella enterica Serovars From Humans in Ghana

    Directory of Open Access Journals (Sweden)

    Egle Kudirkiene

    2018-05-01

    Full Text Available In the current study, we identified plasmids carrying antimicrobial resistance genes in draft whole genome sequences of 16 selected Salmonella enterica isolates representing six different serovars from humans in Ghana. The plasmids and the location of resistance genes in the genomes were predicted using a combination of PlasmidFinder, ResFinder, plasmidSPAdes and BLAST genomic analysis tools. Subsequently, S1-PFGE was employed for analysis of plasmid profiles. Whole genome sequencing confirmed the presence of antimicrobial resistance genes in Salmonella isolates showing multidrug resistance phenotypically. ESBL genes, either blaTEM-52B or blaCTX-M-15, were present in two cephalosporin-resistant isolates of S. Virchow and S. Poona, respectively. The systematic genome analysis revealed the presence of different plasmids in different serovars, with or without insertion of antimicrobial resistance genes. In S. Enteritidis, resistance genes were carried predominantly on plasmids of IncN type, in S. Typhimurium on plasmids of IncFII(S)/IncFIB(S)/IncQ1 type. In S. Virchow and in S. Poona, resistance genes were detected on plasmids of IncX1 and TrfA/IncHI2/IncHI2A type, respectively. The latter two plasmids were described for the first time in these serovars. The combination of genomic analytical tools allowed nearly full mapping of the resistance plasmids in all Salmonella strains analyzed. The results suggest that the improved analytical approach used in the current study may be used to identify plasmids that are specifically associated with resistance phenotypes in whole genome sequences. Such knowledge would allow the development of rapid multidrug resistance tracking tools in Salmonella populations using WGS.
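The cross-referencing idea, placing resistance-gene hits on replicon-typed contigs, can be sketched as a simple join on contig IDs. All identifiers below are invented for illustration; real ResFinder/PlasmidFinder outputs are richer tabular files with coordinates and identity scores:

```python
# Hypothetical ResFinder-style gene hits and PlasmidFinder-style
# replicon calls, keyed by assembly contig name (all IDs invented).
resfinder_hits = [
    {"gene": "blaTEM-52B", "contig": "contig_7"},
    {"gene": "tet(A)",     "contig": "contig_7"},
    {"gene": "gyrA_mut",   "contig": "contig_1"},   # likely chromosomal
]
plasmid_replicons = {"contig_7": "IncX1", "contig_12": "IncN"}

# Join: a resistance gene sitting on a replicon-typed contig is
# assigned to that plasmid; everything else stays unplaced.
for hit in resfinder_hits:
    replicon = plasmid_replicons.get(hit["contig"], "chromosome/unplaced")
    print(f'{hit["gene"]:>12} -> {replicon}')
```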

  4. Social Informatics: Natural Tools for Students' Information Training in The Conditions of Embodied and Mental Approaches Being Employed

    Directory of Open Access Journals (Sweden)

    Daria Barkhatova

    2017-09-01

    Full Text Available The relevance of the problem under study is due to society's requirements for high-quality information training of a personality, oriented to forming solid fundamental knowledge as well as to developing the cognitive capacities needed for solving mental tasks. With regard to this, the paper is aimed at finding out the opportunities of applying natural tools in the information training of students from the standpoints of embodied and mental approaches. The main idea of these approaches is the integrated study of an object, beginning with learning it in an "embodied" way and finishing with abstract models formed in the human memory. The leading approach of the research is the integrated one, taking into account the psychological and pedagogical, didactic and methodological constituents. It allows identifying the psychological and pedagogical conditions for the application of natural tools as well as the possible ways of their use. As the main results, the authors describe models of natural tools for computer science training in individual sections of the school course. The materials of the paper are of practical value in methods of teaching computer science to students at various stages of education.

  5. Design mentoring tool.

    Science.gov (United States)

    2011-01-01

    In 2004 a design engineer on-line mentoring tool was developed and implemented. The purpose of the tool was to assist senior engineers mentoring new engineers in the INDOT design process and improve their technical competency. This approach saves se...

  6. New approaches for improving cardiovascular risk assessment.

    Science.gov (United States)

    Paredes, Simão; Rocha, Teresa; Mendes, Diana; Carvalho, Paulo; Henriques, Jorge; Morais, João; Ferreira, Jorge; Mendes, Miguel

    2016-01-01

    Clinical guidelines recommend the use of cardiovascular risk assessment tools (risk scores) to predict the risk of events such as cardiovascular death, since these scores can aid clinical decision-making and thereby reduce the social and economic costs of cardiovascular disease (CVD). However, despite their importance, risk scores present important weaknesses that can diminish their reliability in clinical contexts. This study presents a new framework, based on current risk assessment tools, that aims to minimize these limitations. Appropriate application and combination of existing knowledge is the main focus of this work. Two different methodologies are applied: (i) a combination scheme that enables data to be extracted and processed from various sources of information, including current risk assessment tools and the contributions of the physician; and (ii) a personalization scheme based on the creation of patient groups with the purpose of identifying the most suitable risk assessment tool to assess the risk of a specific patient. Validation was performed based on a real patient dataset of 460 patients at Santa Cruz Hospital, Lisbon, Portugal, diagnosed with non-ST-segment elevation acute coronary syndrome. Promising results were obtained with both approaches, which achieved sensitivity, specificity and geometric mean of 78.79%, 73.07% and 75.87%, and 75.69%, 69.79% and 72.71%, respectively. The proposed approaches present better performances than current CVD risk scores; however, additional datasets are required to back up these findings. Copyright © 2015 Sociedade Portuguesa de Cardiologia. Published by Elsevier España. All rights reserved.
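The geometric mean quoted in the results above is the standard G-mean of sensitivity and specificity, a convenient single score when outcome classes are imbalanced. A quick sketch with an invented confusion matrix (the 460-patient dataset is not public, so the counts below are illustrative only):

```python
import math

# Sensitivity, specificity and their geometric mean (G-mean)
# from a confusion matrix.
def metrics(tp, fn, tn, fp):
    sens = tp / (tp + fn)                 # true positive rate
    spec = tn / (tn + fp)                 # true negative rate
    return sens, spec, math.sqrt(sens * spec)

# Illustrative counts only, not the study's actual confusion matrix.
sens, spec, g = metrics(tp=78, fn=21, tn=264, fp=97)
print(f"sens={sens:.2%} spec={spec:.2%} g-mean={g:.2%}")
```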

  7. The Multimorbidity Cluster Analysis Tool: Identifying Combinations and Permutations of Multiple Chronic Diseases Using a Record-Level Computational Analysis

    Directory of Open Access Journals (Sweden)

    Kathryn Nicholson

    2017-12-01

    Full Text Available Introduction: Multimorbidity, or the co-occurrence of multiple chronic health conditions within an individual, is an increasingly dominant presence and burden in modern health care systems. To fully capture its complexity, further research is needed to uncover the patterns and consequences of these co-occurring health states. As such, the Multimorbidity Cluster Analysis Tool and the accompanying Multimorbidity Cluster Analysis Toolkit have been created to allow researchers to identify distinct clusters that exist within a sample of participants or patients living with multimorbidity. Development: The Tool and Toolkit were developed at Western University in London, Ontario, Canada. This open-access computational program (JAVA code and executable file) was developed and tested to support an analysis of thousands of individual records and up to 100 disease diagnoses or categories. Application: The computational program can be adapted to the methodological elements of a research project, including type of data, type of chronic disease reporting, measurement of multimorbidity, sample size and research setting. The computational program will identify all existing, and mutually exclusive, combinations and permutations within the dataset. An application of this computational program is provided as an example, in which more than 75,000 individual records and 20 chronic disease categories resulted in the detection of 10,411 unique combinations and 24,647 unique permutations among female and male patients. Discussion: The Tool and Toolkit are now available for use by researchers interested in exploring the complexities of multimorbidity. Its careful use, and the comparison between results, will be valuable additions to the nuanced understanding of multimorbidity.
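The record-level counting of combinations versus permutations can be sketched with hashable keys: a frozenset per record identifies a combination (diagnosis order ignored), a tuple identifies a permutation (order kept). The records below are invented:

```python
from collections import Counter

# Each record lists chronic conditions in order of diagnosis.
records = [
    ["diabetes", "hypertension"],
    ["hypertension", "diabetes"],   # same combination, new permutation
    ["diabetes", "hypertension"],   # duplicate permutation
    ["copd", "diabetes", "hypertension"],
]

combinations = Counter(frozenset(r) for r in records)   # order-free
permutations = Counter(tuple(r) for r in records)       # order-sensitive

print(f"{len(combinations)} unique combinations")   # 2
print(f"{len(permutations)} unique permutations")   # 3
```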

  8. Connecting UML and VDM++ with Open Tool Support

    DEFF Research Database (Denmark)

    Lausdahl, Kenneth; Listrup, Hans Kristian; Larsen, Peter Gorm

    2009-01-01

    Most formal method notations are text based, while tools used in industry often use graphical notations, such as UML. This paper demonstrates how the power of both approaches can be combined by providing the automatic translation of VDM++ models to and from UML. The translation is implemented as a plugin for the popular Eclipse development environment by the open-source Overture initiative. Both UML class diagrams and sequence diagrams can be translated, the latter enabling the novel ability to link with the combinatorial test facility of Overture.

  9. Combination could be another tool for bowel preparation?

    Science.gov (United States)

    Soh, Jae Seung; Kim, Kyung-Jo

    2016-01-01

    Optimal bowel preparation increases the cecal intubation rate and detection of neoplastic lesions while decreasing the procedural time and procedural-related complications. Although high-volume polyethylene glycol (PEG) solution is the most frequently used preparation for bowel cleansing, patients are often unwilling to take PEG solution due to its large volume, poor palatability, and high incidence of adverse events, such as abdominal bloating and nausea. Other purgatives include osmotic agents (e.g., sodium phosphate, magnesium citrate, and sodium sulfate), stimulant agents (e.g., senna, bisacodyl, and sodium picosulfate), and prokinetic agents (e.g., cisapride, mosapride, and itopride). A combination of PEG with an osmotic, stimulant, or prokinetic agent could effectively reduce the PEG solution volume and increase patients’ adherence. Some such solutions have been found in several published studies to not be inferior to PEG alone in terms of bowel cleansing quality. Although combination methods showed similar efficacy and safety, the value of these studies is limited by shortcomings in study design. New effective and well-tolerated combination preparations are required, in addition to rigorous new validated studies. PMID:26973388

  10. Combined numerical and experimental determination of the convective heat transfer coefficient between an AlCrN-coated Vanadis 4 tool and Rhenus oil

    DEFF Research Database (Denmark)

    Üstünyagiz, Esmeray; Nielsen, Chris V.; Tiedje, Niels S.

    2018-01-01

    Abstract Regardless of the field of application, the reliability of numerical simulations depends on correct description of boundary conditions. In thermal simulation, determination of heat transfer coefficients is important because they vary with material properties and process conditions. This paper shows a combined experimental and numerical analysis applied for determination of the heat transfer coefficient between an AlCrN-coated Vanadis 4 tool and Rhenus LA722086 oil in an unloaded condition, i.e. without the tool being in contact with a workpiece. It is found that the heat transfer coefficient in unloaded conditions at 80°C oil temperature is 0.1 kW/(m2∙K) between the selected stamping tool and mineral oil. A sensitivity analysis of the numerical model was performed to verify the effects of mesh discretization, temperature measurement location and tool geometry. Among these parameters…
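As a much-simplified stand-in for the paper's coupled experimental-numerical procedure, a lumped-capacitance model shows how a heat transfer coefficient can be backed out of a cooling curve. The geometry and material values are assumed round numbers, and the "measurement" is synthetic:

```python
import numpy as np

# Lumped-capacitance sketch: T(t) = T_oil + (T0 - T_oil) * exp(-t/tau),
# tau = rho*c*V / (h*A). Fitting the log-decay slope recovers h.
rho, c, V, A = 7800.0, 460.0, 1e-4, 0.01   # steel-ish tool, SI units (assumed)
T_oil, T0 = 80.0, 200.0                    # oil and initial tool temperature
h_true = 100.0                             # W/(m2 K) = 0.1 kW/(m2 K)

t = np.linspace(0, 600, 61)
tau = rho * c * V / (h_true * A)
T = T_oil + (T0 - T_oil) * np.exp(-t / tau)   # synthetic 'measurement'

# Inverse step: slope of ln(T - T_oil) vs t gives -1/tau, hence h.
slope = np.polyfit(t, np.log(T - T_oil), 1)[0]
h_est = -slope * rho * c * V / A
print(f"h = {h_est:.1f} W/(m2 K)")   # recovers the assumed value
```

With noisy real measurements the fit would scatter, and the paper's coupled FE approach also accounts for spatial temperature gradients that the lumped model ignores.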

  11. Perception as a Tool

    Directory of Open Access Journals (Sweden)

    Jovana Komnenič

    2014-03-01

    Full Text Available The article presents a project of providing guidelines on art education for the blind and visually impaired, which was entitled Perception as a Tool and presented at the Berlin Biennale on 6 October 2010. It focuses on potential aspects of art education with regard to people with special needs and seeks to discover what happens with art if we cannot see it. This approach to art education combines elements of conventional tours of exhibitions and involves the participants through play. The methods that were used in our work included establishing dramatic tension and insecurity in the group as well as mutual trust by relying on different resources, including sensory perception, personal biography and different forms of knowledge and skills. A major part of the project is finding hidden, invisible or forgotten stories that are not directly linked to the exhibition and the aspects directly related to the exhibition. Such a generally inclusive approach enabled us to formulate political questions on the issue of ’invisibility’.

  12. A holistic method to assess building energy efficiency combining D-S theory and the evidential reasoning approach

    International Nuclear Information System (INIS)

    Yao Runming; Yang Yulan; Li Baizhan

    2012-01-01

    The assessment of building energy efficiency is one of the most effective measures for reducing building energy consumption. This paper proposes a holistic method (HMEEB) for assessing and certifying energy efficiency of buildings based on the D-S (Dempster-Shafer) theory of evidence and the Evidential Reasoning (ER) approach. HMEEB has three main features: (i) it provides both a method to assess and certify building energy efficiency, and exists as an analytical tool to identify improvement opportunities; (ii) it combines a wealth of information on building energy efficiency assessment, including identification of indicators and a weighting mechanism; and (iii) it provides a method to identify and deal with inherent uncertainties within the assessment procedure. This paper demonstrates the robustness, flexibility and effectiveness of the proposed method, using two examples to assess the energy efficiency of two residential buildings, both located in the ‘Hot Summer and Cold Winter’ zone in China. The proposed certification method provides detailed recommendations for policymakers in the context of carbon emission reduction targets and promoting energy efficiency in the built environment. The method is transferable to other countries and regions, using an indicator weighting system to modify local climatic, economic and social factors. - Highlights: ► Assessing energy efficiency of buildings holistically; ► Applying the D-S (Dempster-Shafer) theory of evidence and the Evidential Reasoning (ER) approach; ► Involving extensive information and uncertainties in the energy efficiency decision-making process. ► Rigorous measures for policymakers to meet carbon emission reduction targets.
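The evidence-fusion step at the heart of such D-S methods is Dempster's rule of combination. A minimal sketch for two mass functions over a three-grade frame of discernment (the grades and mass values are illustrative, not the paper's indicator set):

```python
from itertools import product

# Dempster's rule: multiply masses over all focal-element pairs, keep
# the intersections, renormalize by the non-conflicting mass.
def combine(m1, m2):
    fused, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            fused[inter] = fused.get(inter, 0.0) + x * y
        else:
            conflict += x * y            # empty intersection = conflict
    k = 1.0 - conflict                   # normalization constant
    return {s: v / k for s, v in fused.items()}

# Two 'indicators' grading a building as Good / Average / Poor.
G, A, P = frozenset("G"), frozenset("A"), frozenset("P")
m1 = {G: 0.6, A: 0.3, G | A | P: 0.1}    # last entry = ignorance mass
m2 = {G: 0.5, A: 0.4, G | A | P: 0.1}

fused = combine(m1, m2)
print({"".join(sorted(s)): round(v, 3) for s, v in fused.items()})
```

Combining the two sources sharpens the belief in "Good" (conflicting mass 0.39 is redistributed), which is the basic mechanism a weighted multi-indicator scheme like HMEEB builds on.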

  13. In Search of Samoan Research Approaches to Education: Tofa'a'Anolasi and the Foucauldian Tool Box

    Science.gov (United States)

    Galuvao, Akata Sisigafu'aapulematumua

    2018-01-01

    This article introduces Tofa'a'anolasi, a novel Samoan research framework created by drawing on the work of other Samoan and Pacific education researchers, in combination with adapting the 'Foucauldian tool box' to use for research carried out from a Samoan perspective. The article starts with an account and explanation of the process of…

  14. Combining optimisation and simulation in an energy systems analysis of a Swedish iron foundry

    International Nuclear Information System (INIS)

    Mardan, Nawzad; Klahr, Roger

    2012-01-01

    To face global competition, and also reduce environmental and climate impact, industry-wide changes are needed, especially regarding energy use, which is closely related to global warming. Energy efficiency is therefore an essential task for the future as it has a significant impact on both business profits and the environment. For the analysis of possible changes in industrial production processes, and to choose what changes should be made, various modelling tools can be used as decision support. This paper uses two types of energy analysis tool: Discrete Event Simulation (DES) and Energy Systems Optimisation (ESO). The aim of this study is to describe how a DES and an ESO tool can be combined. A comprehensive five-step approach is proposed for reducing system costs and making a more robust production system. A case study representing a new investment in part of a Swedish iron foundry is also included to illustrate the method's use. The method described in this paper is based on the use of the DES program QUEST and the ESO tool reMIND. The method combination itself is generic, i.e. other similar programs can be used as well with some adjustments and adaptations. The results from the case study show that when different boundary conditions are used, the result obtained from the simulation tool is not optimal; in other words, it represents only a feasible solution, not necessarily the best way to run the factory. It is therefore important to use the optimisation tool in such cases in order to obtain the optimum operating strategy. By using the optimisation tool a substantial amount of resources can be saved. The results also show that the combination of optimisation and simulation tools is useful to provide very detailed information about how the system works and to predict system behaviour as well as to minimise the system cost. -- Highlights: ► This study describes how a simulation and an optimisation tool can be combined. ► A case study representing a new

  15. Assessing impacts of ionizing radiation on non-human biota: the ERICA tool

    International Nuclear Information System (INIS)

    Brown, Justin; Lilanda, Astrid; Hosseini, Ali; Alfonso, B.; Avila, R.; Beresford, N.A.; Proehl, G.; Ulanovsky, A.

    2008-01-01

    There have been significant developments in the last few years concerning methods to explicitly quantify impacts on the environment arising from exposure to ionising radiation. Central to the ERICA integrated approach is the quantification of environmental risk, whereby data on environmental transfer and dosimetry are combined to provide a measure of exposure which is compared to exposure levels at which detrimental effects are known to occur. In view of the large data sets underpinning the assessment approach and the potential to introduce errors when performing numerous calculations manually, a supporting computer-based tool (the ERICA Tool) has been developed. The ERICA Tool is a computerised, flexible software system that has a structure based upon the ERICA Integrated Assessment tiered approach to assessing the radiological risk to biota. The user is guided through the assessment process, recording information and decisions as the assessment progresses. The tool allows the necessary calculations to be performed to estimate risks to selected biota. Tier 1 assessments use pre-calculated environmental media concentration limits to estimate risk quotients and require inputs in the form of media concentrations. At Tier 2, dose rates are calculated, and at this stage the user is allowed to examine and edit most of the parameters used in the calculation. For Tier 3 assessments, the same flexibility as at Tier 2 is allowed, but assessments may be run probabilistically if the underlying parameter probability distribution functions are defined. Results from the tool can be put into context using incorporated data on dose-effect relationships and background dose rates. (author)
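The Tier 1 screening step described above reduces to a simple risk quotient. A minimal sketch, with hypothetical media concentration and EMCL values (the real tool supplies pre-calculated EMCLs per radionuclide and reference organism):

```python
def tier1_risk_quotient(media_conc, emcl, safety_factor=1.0):
    """Tier 1 screening: risk quotient = measured media concentration
    divided by the pre-calculated Environmental Media Concentration Limit."""
    if emcl <= 0:
        raise ValueError("EMCL must be positive")
    return safety_factor * media_conc / emcl

# Hypothetical values: 0.8 Bq/L measured vs an assumed EMCL of 5.0 Bq/L
rq = tier1_risk_quotient(0.8, 5.0)
passes_tier1 = rq < 1.0  # RQ below 1 screens the situation out at Tier 1
```

If the quotient exceeds 1, the assessment proceeds to the more detailed Tier 2 dose-rate calculation rather than stopping at screening.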

  16. Is Combining Child Labour and School Education the Right Approach? Investigating the Cambodian Case

    Science.gov (United States)

    Kim, Chae-Young

    2009-01-01

    The paper considers whether letting children combine work and school is a valid and effective approach in Cambodia. Policy makers' suggestions that child labour should be allowed to some extent due to household poverty appear ungrounded as no significant relation between children's work and household poverty is found while arranging school…

  17. Refurbishment of damaged tools using the combination of GTAW and laser beam welding

    Directory of Open Access Journals (Sweden)

    J. Tušek

    2014-10-01

    Full Text Available This paper presents the use of two welding processes for the refurbishment of damaged industrial tools. In the first part the problem is presented followed by the comparison of GTAW and laser welding in terms of repair welding of damaged tools. The macrosections of the welds show the difference between both welding processes in repairing of damaged tools. At the conclusion the main findings are presented. In many cases it is useful to use both welding processes in order to achieve better weld quality and to make welding more economical. The order of the technology used depends on the tool material, the use of the tool and the tool damage.

  18. DC Voltage Droop Control Implementation in the AC/DC Power Flow Algorithm: Combinational Approach

    DEFF Research Database (Denmark)

    Akhter, F.; Macpherson, D.E.; Harrison, G.P.

    2015-01-01

    of operational flexibility, as more than one VSC station controls the DC link voltage of the MTDC system. This model enables the study of the effects of DC droop control on the power flows of the combined AC/DC system for steady state studies after VSC station outages or transient conditions without needing...... to use its complete dynamic model. Further, the proposed approach can be extended to include multiple AC and DC grids for combined AC/DC power flow analysis. The algorithm is implemented by modifying the MATPOWER based MATACDC program and the results show that the algorithm works efficiently....

  19. An Integrated Approach of Fuzzy Linguistic Preference Based AHP and Fuzzy COPRAS for Machine Tool Evaluation.

    Directory of Open Access Journals (Sweden)

    Huu-Tho Nguyen

    Full Text Available Globalization of business and competitiveness in manufacturing has forced companies to improve their manufacturing facilities to respond to market requirements. Machine tool evaluation involves an essential decision using imprecise and vague information, and plays a major role in improving productivity and flexibility in manufacturing. The aim of this study is to present an integrated approach for decision-making in machine tool selection. This paper is focused on the integration of a consistent fuzzy AHP (Analytic Hierarchy Process and a fuzzy COmplex PRoportional ASsessment (COPRAS for multi-attribute decision-making in selecting the most suitable machine tool. In this method, the fuzzy linguistic reference relation is integrated into AHP to handle the imprecise and vague information, and to simplify the data collection for the pair-wise comparison matrix of the AHP, which determines the weights of attributes. The output of the fuzzy AHP is imported into the fuzzy COPRAS method for ranking alternatives through the closeness coefficient. The application of the proposed model is illustrated by a numerical example based on data collected by questionnaire and from the literature. The results highlight the integration of the improved fuzzy AHP and the fuzzy COPRAS as a precise tool and provide effective multi-attribute decision-making for evaluating machine tools in an uncertain environment.
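To make the ranking stage concrete, here is a minimal crisp (non-fuzzy) COPRAS sketch; in the paper's method, defuzzified weights from the fuzzy AHP stage would feed into a fuzzy version of this calculation. The machine-tool scores, criteria and weights below are invented for illustration.

```python
def copras_rank(matrix, weights, benefit):
    """Rank alternatives with (crisp) COPRAS.
    matrix: rows = alternatives, cols = criteria; weights sum to 1;
    benefit[j] is True for benefit criteria, False for cost criteria."""
    n_alt, n_crit = len(matrix), len(matrix[0])
    col_sums = [sum(row[j] for row in matrix) for j in range(n_crit)]
    # Weighted, column-sum normalised decision matrix
    d = [[weights[j] * matrix[i][j] / col_sums[j] for j in range(n_crit)]
         for i in range(n_alt)]
    s_plus = [sum(d[i][j] for j in range(n_crit) if benefit[j]) for i in range(n_alt)]
    s_minus = [sum(d[i][j] for j in range(n_crit) if not benefit[j]) for i in range(n_alt)]
    inv_sum = sum(1.0 / s for s in s_minus)
    total_minus = sum(s_minus)
    # Relative significance: benefit part plus transformed cost part
    q = [s_plus[i] + total_minus / (s_minus[i] * inv_sum) for i in range(n_alt)]
    best = max(q)
    return [qi / best for qi in q]  # utility degree; 1.0 marks the best alternative

# Three hypothetical machine tools scored on cost (minimise), precision and capacity
scores = [[40000, 0.8, 120], [55000, 0.9, 150], [48000, 0.7, 100]]
util = copras_rank(scores, weights=[0.4, 0.35, 0.25], benefit=[False, True, True])
```

The utility degrees directly express how close each alternative is to the best one, which is the role the closeness coefficient plays in the fuzzy variant.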

  20. COMPOSE-HPC: A Transformational Approach to Exascale

    Energy Technology Data Exchange (ETDEWEB)

    Bernholdt, David E [ORNL; Allan, Benjamin A. [Sandia National Laboratories (SNL); Armstrong, Robert C. [Sandia National Laboratories (SNL); Chavarria-Miranda, Daniel [Pacific Northwest National Laboratory (PNNL); Dahlgren, Tamara L. [Lawrence Livermore National Laboratory (LLNL); Elwasif, Wael R [ORNL; Epperly, Tom [Lawrence Livermore National Laboratory (LLNL); Foley, Samantha S [ORNL; Hulette, Geoffrey C. [Sandia National Laboratories (SNL); Krishnamoorthy, Sriram [Pacific Northwest National Laboratory (PNNL); Prantl, Adrian [Lawrence Livermore National Laboratory (LLNL); Panyala, Ajay [Louisiana State University; Sottile, Matthew [Galois, Inc.

    2012-04-01

    The goal of the COMPOSE-HPC project is to 'democratize' tools for automatic transformation of program source code so that it becomes tractable for the developers of scientific applications to create and use their own transformations reliably and safely. This paper describes our approach to this challenge, the creation of the KNOT tool chain, which includes tools for the creation of annotation languages to control the transformations (PAUL), to perform the transformations (ROTE), and optimization and code generation (BRAID), which can be used individually and in combination. We also provide examples of current and future uses of the KNOT tools, which include transforming code to use different programming models and environments, providing tests that can be used to detect errors in software or its execution, as well as composition of software written in different programming languages, or with different threading patterns.

  1. Combined action of ionizing radiation with another factor: common rules and theoretical approach

    International Nuclear Information System (INIS)

    Kim, Jin Kyu; Roh, Changhyun; Komarova, Ludmila N.; Petin, Vladislav G.

    2013-01-01

    Two or more factors can act simultaneously and produce combined effects on biological objects. This study focused on a theoretical approach to the synergistic interaction arising from the combined action of radiation and another factor on cell inactivation. A mathematical model for the synergistic interaction of different environmental agents was suggested for the quantitative prediction of irreversibly damaged cells after combined exposures. The model takes into account the synergistic interaction of agents and is based on the supposition that the additional effective damages responsible for the synergy are irreversible and originate from an interaction of ineffective sublesions. Experimental results regarding the irreversible component of radiation damage in diploid yeast cells simultaneously exposed to heat combined with ionizing radiation or UV light are presented. A good agreement of the experimental results with the model predictions was demonstrated. The importance of the results obtained for the interpretation of the mechanism of synergistic interaction of various environmental factors is discussed. (author)

  2. Combined action of ionizing radiation with another factor: common rules and theoretical approach

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jin Kyu; Roh, Changhyun, E-mail: jkkim@kaeri.re.kr [Korea Atomic Energy Research Institute, Jeongeup (Korea, Republic of); Komarova, Ludmila N.; Petin, Vladislav G., E-mail: vgpetin@yahoo.com [Medical Radiological Research Center, Obninsk (Russian Federation)

    2013-07-01

    Two or more factors can act simultaneously and produce combined effects on biological objects. This study focused on a theoretical approach to the synergistic interaction arising from the combined action of radiation and another factor on cell inactivation. A mathematical model for the synergistic interaction of different environmental agents was suggested for the quantitative prediction of irreversibly damaged cells after combined exposures. The model takes into account the synergistic interaction of agents and is based on the supposition that the additional effective damages responsible for the synergy are irreversible and originate from an interaction of ineffective sublesions. Experimental results regarding the irreversible component of radiation damage in diploid yeast cells simultaneously exposed to heat combined with ionizing radiation or UV light are presented. A good agreement of the experimental results with the model predictions was demonstrated. The importance of the results obtained for the interpretation of the mechanism of synergistic interaction of various environmental factors is discussed. (author)
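The synergy discussed above is commonly quantified as an enhancement ratio comparing the combined effect with the sum of the individual effects. A minimal sketch under the common assumption of exponential inactivation (effect taken as −ln of the surviving fraction); the surviving fractions below are hypothetical, not the paper's yeast data:

```python
import math

def synergy_enhancement_ratio(s_combined, s_agent1, s_agent2):
    """Synergistic enhancement ratio on the scale of inactivation effects,
    with effect = -ln(surviving fraction) for exponential inactivation.
    A ratio above 1 indicates synergy: the combined effect exceeds additivity."""
    effect = lambda s: -math.log(s)
    return effect(s_combined) / (effect(s_agent1) + effect(s_agent2))

# Hypothetical surviving fractions: heat alone 0.60, radiation alone 0.50,
# simultaneous exposure 0.20
k = synergy_enhancement_ratio(0.20, 0.60, 0.50)
```

A ratio of 1 would mean purely additive action; values above 1 correspond to the extra irreversible damage the model attributes to interacting sublesions.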

  3. A combined approach for the enhancement and segmentation of mammograms using modified fuzzy C-means method in wavelet domain

    OpenAIRE

    Srivastava, Subodh; Sharma, Neeraj; Singh, S. K.; Srivastava, R.

    2014-01-01

    In this paper, a combined approach for enhancement and segmentation of mammograms is proposed. In preprocessing stage, a contrast limited adaptive histogram equalization (CLAHE) method is applied to obtain the better contrast mammograms. After this, the proposed combined methods are applied. In the first step of the proposed approach, a two dimensional (2D) discrete wavelet transform (DWT) is applied to all the input images. In the second step, a proposed nonlinear complex diffusion based uns...

  4. Quantitative Analysis of the Usage of a Pedagogical Tool Combining Questions Listed as Learning Objectives and Answers Provided as Online Videos

    Directory of Open Access Journals (Sweden)

    Odette Laneuville

    2015-05-01

    Full Text Available To improve the learning of basic concepts in molecular biology of an undergraduate science class, a pedagogical tool was developed, consisting of learning objectives listed at the end of each lecture and answers to those objectives made available as videos online. The aim of this study was to determine if the pedagogical tool was used by students as instructed, and to explore students’ perception of its usefulness. A combination of quantitative survey data and measures of online viewing was used to evaluate the usage of the pedagogical practice. A total of 77 short videos linked to 11 lectures were made available to 71 students, and 64 completed the survey. Using online tracking tools, a total of 7046 views were recorded. Survey data indicated that most students (73.4% accessed all videos, and the majority (98.4% found the videos to be useful in assisting their learning. Interestingly, approximately half of the students (53.1% always or most of the time used the pedagogical tool as recommended, and consistently answered the learning objectives before watching the videos. While the proposed pedagogical tool was used by the majority of students outside the classroom, only half used it as recommended limiting the impact on students’ involvement in the learning of the material presented in class.

  5. A phenomenological approach for the analysis of combined fatigue and creep

    International Nuclear Information System (INIS)

    Bui-Quoc, T.; Biron, A.

    1982-01-01

    An approach is proposed for the life prediction, under cumulative damage conditions, for fatigue and for creep. An interaction effect is introduced to account for a modification in the material behavior due to previous loading. A predictive technique is then developed which is applied to several materials for fatigue and which could potentially be used for creep. With due consideration to the similarity of the formulation for both phenomena, the analysis for the combination of fatigue and creep is then carried out through a straightforward sequential use of the two damage functions. Several patterns are studied without and with an interaction effect. (orig.)
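The sequential use of damage functions can be illustrated with a linear (Miner-type) summation plus a simple interaction penalty. The actual model uses nonlinear damage functions, so this is only a schematic sketch with an invented interaction coefficient:

```python
def remaining_life_fraction(cycle_fractions, interaction=0.0):
    """Linear (Miner-type) damage summation with an optional interaction
    term that adds extra damage for each block after the first, reflecting
    a modification of material behaviour by previous loading.
    Failure is reached when accumulated damage >= 1."""
    damage = 0.0
    for i, beta in enumerate(cycle_fractions):  # beta = n_i / N_i per loading block
        damage += beta
        if i > 0:
            damage += interaction * beta  # penalty from prior loading history
    return max(0.0, 1.0 - damage)

# Two loading blocks consuming 40% and 30% of life, with a mild interaction effect
left = remaining_life_fraction([0.4, 0.3], interaction=0.1)
```

With the interaction term set to zero this reduces to the classical linear damage rule, which is the no-interaction baseline the paper's patterns are compared against.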

  6. Systems Prototyping with Fourth Generation Tools: One Answer to the Productivity Puzzle? AIR 1983 Annual Forum Paper.

    Science.gov (United States)

    Sholtys, Phyllis A.

    The development of information systems using an engineering approach employing both traditional programming techniques and nonprocedural languages is described. A fourth generation application tool is used to develop a prototype system that is revised and expanded as the user clarifies individual requirements. When fully defined, a combination of…

  7. Combining new tools to assess renal function and morphology: a holistic approach to study the effects of aging and a congenital nephron deficit.

    Science.gov (United States)

    Geraci, Stefania; Chacon-Caldera, Jorge; Cullen-McEwen, Luise; Schad, Lothar R; Sticht, Carsten; Puelles, Victor G; Bertram, John F; Gretz, Norbert

    2017-09-01

    Recently, new methods for assessing renal function in conscious mice (transcutaneous assessment) and for counting and sizing all glomeruli in whole kidneys (MRI) have been described. In the present study, these methods were used to assess renal structure and function in aging mice, and in mice born with a congenital low-nephron endowment. Age-related nephron loss was analyzed in adult C57BL/6 mice (10-50 wk of age), and congenital nephron deficit was assessed in glial cell line-derived neurotrophic factor heterozygous (GDNF HET)-null mutant mice. Renal function was measured through the transcutaneous quantitation of fluorescein isothiocyanate-sinistrin half-life (t1/2) in conscious mice. MRI was used to image, count, and size cationic-ferritin labeled glomeruli in whole kidneys ex vivo. Design-based stereology was used to validate the MRI measurements of glomerular number and mean volume. In adult C57BL/6 mice, older age was associated with fewer and larger glomeruli, and a rightward shift in the glomerular size distribution. These changes coincided with a decrease in renal function. GDNF HET mice had a congenital nephron deficit that was associated with glomerular hypertrophy and exacerbated by aging. These findings suggest that glomerular hypertrophy and hyperfiltration are compensatory processes that can occur in conjunction with both age-related nephron loss and congenital nephron deficiency. The combination of measurement of renal function in conscious animals and quantitation of glomerular number, volume, and volume distribution provides a powerful new tool for investigating aspects of renal aging and functional changes. Copyright © 2017 the American Physiological Society.
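Estimating a half-life t1/2 from a transcutaneously recorded decay curve amounts to fitting a single exponential. A minimal log-linear least-squares sketch, using synthetic FITC-sinistrin data with a known half-life (the values are illustrative, not the study's measurements):

```python
import math

def half_life(times, signals):
    """Estimate elimination half-life from a single-exponential decay,
    via a least-squares fit of ln(signal) against time."""
    logs = [math.log(s) for s in signals]
    n = len(times)
    t_mean = sum(times) / n
    l_mean = sum(logs) / n
    slope = (sum((t - t_mean) * (l - l_mean) for t, l in zip(times, logs))
             / sum((t - t_mean) ** 2 for t in times))
    return math.log(2) / -slope  # elimination rate constant k = -slope

# Synthetic fluorescence curve with a true half-life of 15 min
times = [0, 10, 20, 30, 40]
signals = [100 * 0.5 ** (t / 15) for t in times]
t_half = half_life(times, signals)
```

Real transcutaneous curves include a distribution phase, so published analyses typically fit a two-compartment model; the single-exponential fit above captures only the elimination phase.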

  8. An integrated approach using high time-resolved tools to study the origin of aerosols

    Energy Technology Data Exchange (ETDEWEB)

    Di Gilio, A. [Chemistry Department, University of Bari, via Orabona, 4, 70126 Bari (Italy); ARPA PUGLIA, Corso Trieste, 27, 70126 Bari (Italy); Gennaro, G. de, E-mail: gianluigi.degennaro@uniba.it [Chemistry Department, University of Bari, via Orabona, 4, 70126 Bari (Italy); ARPA PUGLIA, Corso Trieste, 27, 70126 Bari (Italy); Dambruoso, P. [Chemistry Department, University of Bari, via Orabona, 4, 70126 Bari (Italy); ARPA PUGLIA, Corso Trieste, 27, 70126 Bari (Italy); Ventrella, G. [Chemistry Department, University of Bari, via Orabona, 4, 70126 Bari (Italy)

    2015-10-15

    Long-range transport of natural and/or anthropogenic particles can contribute significantly to PM10 and PM2.5 concentrations and some European cities often fail to comply with PM daily limit values due to the additional impact of particles from remote sources. For this reason, reliable methodologies to identify long-range transport (LRT) events would be useful to better understand air pollution phenomena and support proper decision-making. This study explores the potential of an integrated and high time-resolved monitoring approach for the identification and characterization of local, regional and long-range transport events of high PM. In particular, a goal of this work was also the identification of time-limited events. For this purpose, a high time-resolved monitoring campaign was carried out at an urban background site in Bari (southern Italy) for about 20 days (1st–20th October 2011). The integration of collected data, such as the hourly measurements of inorganic ions in PM2.5 and their gas precursors and of the natural radioactivity, in addition to the analyses of aerosol maps and hourly back trajectories (BT), provided useful information for the identification and chemical characterization of local sources and trans-boundary intrusions. Non-sea salt (nss) sulfate levels were found to increase when air masses came from northeastern Europe and higher dispersive conditions of the atmosphere were detected. Instead, higher nitrate and lower nss-sulfate concentrations were registered in correspondence with air mass stagnation and attributed to local traffic sources. In some cases, combinations of local and trans-boundary sources were observed. Finally, statistical investigations such as the principal component analysis (PCA) applied on hourly ion concentrations and the cluster analyses, the Potential Source Contribution Function (PSCF) and the Concentration Weighted Trajectory (CWT) models computed on hourly back-trajectories enabled the completion of a cognitive framework.

  9. An integrated approach using high time-resolved tools to study the origin of aerosols

    International Nuclear Information System (INIS)

    Di Gilio, A.; Gennaro, G. de; Dambruoso, P.; Ventrella, G.

    2015-01-01

    Long-range transport of natural and/or anthropogenic particles can contribute significantly to PM10 and PM2.5 concentrations and some European cities often fail to comply with PM daily limit values due to the additional impact of particles from remote sources. For this reason, reliable methodologies to identify long-range transport (LRT) events would be useful to better understand air pollution phenomena and support proper decision-making. This study explores the potential of an integrated and high time-resolved monitoring approach for the identification and characterization of local, regional and long-range transport events of high PM. In particular, a goal of this work was also the identification of time-limited events. For this purpose, a high time-resolved monitoring campaign was carried out at an urban background site in Bari (southern Italy) for about 20 days (1st–20th October 2011). The integration of collected data, such as the hourly measurements of inorganic ions in PM2.5 and their gas precursors and of the natural radioactivity, in addition to the analyses of aerosol maps and hourly back trajectories (BT), provided useful information for the identification and chemical characterization of local sources and trans-boundary intrusions. Non-sea salt (nss) sulfate levels were found to increase when air masses came from northeastern Europe and higher dispersive conditions of the atmosphere were detected. Instead, higher nitrate and lower nss-sulfate concentrations were registered in correspondence with air mass stagnation and attributed to local traffic sources. In some cases, combinations of local and trans-boundary sources were observed.
Finally, statistical investigations such as the principal component analysis (PCA) applied on hourly ion concentrations and the cluster analyses, the Potential Source Contribution Function (PSCF) and the Concentration Weighted Trajectory (CWT) models computed on hourly back-trajectories enabled the completion of a cognitive framework.

  10. An Approach to Design of Power-Mechatronic Systems

    DEFF Research Database (Denmark)

    Andersen, T. O.; Hansen, M. R.; Conrad, Finn

    2003-01-01

    successful mechatronic engineers. The paper presents and discusses a new mechatronic design approach and design methods and IT-tools used to demonstrate this approach, and an IT-tool for modelling, simulation and control useful in the analysis, synthesis, design and application of mechatronic systems with fluid power......The paper focuses on how today's cost-effective electronics, microcomputers, and digital signal processors have brought new advanced technology to appliances and consumer products. Systems with precision sensors and actuators have increased performance by an order of magnitude over what was once possible....... The paper discusses what sets these new, high-performance, cost-effective systems and devices apart from those of the past. Is it more than just technological advancement? There are many designs where electronics and control are combined with mechanical components, but with very little synergy and poor...

  11. A Hybrid Computational Intelligence Approach Combining Genetic Programming And Heuristic Classification for Pap-Smear Diagnosis

    DEFF Research Database (Denmark)

    Tsakonas, Athanasios; Dounias, Georgios; Jantzen, Jan

    2001-01-01

    The paper suggests the combined use of different computational intelligence (CI) techniques in a hybrid scheme, as an effective approach to medical diagnosis. Getting to know the advantages and disadvantages of each computational intelligence technique in the recent years, the time has come...

  12. Design tools and materials in creative work

    DEFF Research Database (Denmark)

    Hansen, Nicolai Brodersen; Dalsgaard, Peter; Halskov, Kim

    2017-01-01

    -oriented perspectives, we wish to examine the potentials and limitations in current uses of design tools and materials, and discuss and explore when and how we can introduce new ones. Participation in the workshop requires participants to document and analyse central themes in a case, and the resulting material will serve......This workshop aims to examine and discuss the role and nature of design tools and materials in creative work, and to explore how novel tools can meaningfully combine existing and novel tools to support and augment creative work. By exploring and combining methodological, theoretical, and design...

  13. Acoustic Emission Methodology to Evaluate the Fracture Toughness in Heat Treated AISI D2 Tool Steel

    Science.gov (United States)

    Mostafavi, Sajad; Fotouhi, Mohamad; Motasemi, Abed; Ahmadi, Mehdi; Sindi, Cevat Teymuri

    2012-10-01

    In this article, the fracture toughness behavior of tool steel was investigated using Acoustic Emission (AE) monitoring. Fracture toughness (KIC) values of a specific tool steel were determined by applying various approaches based on conventional AE parameters, such as Acoustic Emission Cumulative Count (AECC), Acoustic Emission Energy Rate (AEER), and the combination of mechanical characteristics and AE information called the sentry function. The critical fracture toughness values during crack propagation were achieved by means of the relationship between the integral of the sentry function and cumulative fracture toughness (KICUM). Specimens were selected from AISI D2 cold-work tool steel and were heat treated at four different tempering conditions (300, 450, 525, and 575 °C). The results achieved through the AE approaches were then compared with a methodology proposed by compact specimen testing according to ASTM standard E399. It was concluded that AE information provides an efficient method to investigate fracture characteristics.

  14. Phytophagous insects on native and non-native host plants: combining the community approach and the biogeographical approach.

    Directory of Open Access Journals (Sweden)

    Kim Meijer

    Full Text Available During the past centuries, humans have introduced many plant species in areas where they do not naturally occur. Some of these species establish populations and in some cases become invasive, causing economic and ecological damage. Which factors determine the success of non-native plants is still incompletely understood, but the absence of natural enemies in the invaded area (Enemy Release Hypothesis; ERH is one of the most popular explanations. One of the predictions of the ERH, a reduced herbivore load on non-native plants compared with native ones, has been repeatedly tested. However, many studies have either used a community approach (sampling from native and non-native species in the same community or a biogeographical approach (sampling from the same plant species in areas where it is native and where it is non-native. Either method can sometimes lead to inconclusive results. To resolve this, we here add to the small number of studies that combine both approaches. We do so in a single study of insect herbivory on 47 woody plant species (trees, shrubs, and vines in the Netherlands and Japan. We find higher herbivore diversity, higher herbivore load and more herbivory on native plants than on non-native plants, generating support for the enemy release hypothesis.

  15. NEMO. A novel techno-economic tool suite for simulating and optimizing solutions for grid integration of electric vehicles and charging stations

    Energy Technology Data Exchange (ETDEWEB)

    Erge, Thomas; Stillahn, Thies; Dallmer-Zerbe, Kilian; Wille-Haussmann, Bernhard [Frauenhofer Institut for Solar Energy Systems ISE, Freiburg (Germany)

    2013-07-01

    With an increasing use of electric vehicles (EV), grid operators need to predict energy flows depending on electromobility use profiles in order to adjust grid infrastructure and operation control accordingly. Tools and methodologies are required to characterize grid problems resulting from the interconnection of EV with the grid. The simulation and optimization tool suite NEMO (Novel E-MObility grid model) was developed within a European research project and is currently being tested using realistic showcases. It is a combination of three professional tools. One of the tools aims at a combined techno-economic design and operation, primarily modeling plants on contracts or the spot market, at the same time participating in balancing markets. The second tool is designed for planning grid extension or reinforcement while the third tool is mainly used to quickly discover potential conflicts of grid operation approaches through load flow analysis. The tool suite is used to investigate real showcases in Denmark, Germany and the Netherlands. First studies show that significant alleviation of stress on distribution grid lines could be achieved by few but intelligent restrictions to EV charging procedures.

  16. NEMO. A novel techno-economic tool suite for simulating and optimizing solutions for grid integration of electric vehicles and charging stations

    International Nuclear Information System (INIS)

    Erge, Thomas; Stillahn, Thies; Dallmer-Zerbe, Kilian; Wille-Haussmann, Bernhard

    2013-01-01

    With an increasing use of electric vehicles (EV), grid operators need to predict energy flows depending on electromobility use profiles in order to adjust grid infrastructure and operation control accordingly. Tools and methodologies are required to characterize grid problems resulting from the interconnection of EV with the grid. The simulation and optimization tool suite NEMO (Novel E-MObility grid model) was developed within a European research project and is currently being tested using realistic showcases. It is a combination of three professional tools. One of the tools aims at a combined techno-economic design and operation, primarily modeling plants on contracts or the spot market, at the same time participating in balancing markets. The second tool is designed for planning grid extension or reinforcement while the third tool is mainly used to quickly discover potential conflicts of grid operation approaches through load flow analysis. The tool suite is used to investigate real showcases in Denmark, Germany and the Netherlands. First studies show that significant alleviation of stress on distribution grid lines could be achieved by few but intelligent restrictions to EV charging procedures.

  17. Combined Simulated Annealing and Genetic Algorithm Approach to Bus Network Design

    Science.gov (United States)

    Liu, Li; Olszewski, Piotr; Goh, Pong-Chai

    A new method, a combined simulated annealing (SA) and genetic algorithm (GA) approach, is proposed to solve the problem of bus route design and frequency setting for a given road network with fixed bus stop locations and fixed travel demand. The method involves two steps: a set of candidate routes is generated first, and then the best subset of these routes is selected by the combined SA and GA procedure. SA is the main process searching for a better solution that minimizes the total system cost, comprising user and operator costs. GA is used as a sub-process to generate new solutions. Bus demand assignment on two alternative paths is performed at the solution evaluation stage. The method was implemented on four theoretical grid networks of different sizes and a benchmark network. Several GA operators (crossover and mutation) were utilized and tested for their effectiveness. The results show that the proposed method can efficiently converge to the optimal solution on a small network, but computation time increases significantly with network size. The method can also be used for other transport operation management problems.
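    The two-step scheme described in this abstract (generate candidate routes first, then select a subset by SA with a GA-style operator generating new solutions) can be sketched as follows. All data here are invented toy values, and the cost function is a simplified stand-in for the user-plus-operator cost of the paper:

```python
import math
import random

random.seed(7)

# Step 1 (toy version): 30 candidate routes, each covering a set of demand
# nodes out of 20 and carrying an operator cost. Values invented.
routes = [{"nodes": frozenset(random.sample(range(20), 5)),
           "cost": random.uniform(1.0, 3.0)} for _ in range(30)]

def total_cost(subset):
    """User cost (penalty per uncovered demand node) + operator cost."""
    covered = set().union(*(routes[i]["nodes"] for i in subset)) if subset else set()
    return 10.0 * (20 - len(covered)) + sum(routes[i]["cost"] for i in subset)

def mutate(subset):
    """GA-style mutation sub-process: flip membership of one random route."""
    s = set(subset)
    s.symmetric_difference_update({random.randrange(len(routes))})
    return frozenset(s)

def simulated_annealing(iters=5000, t0=5.0, alpha=0.999):
    """Step 2: SA main process selecting the best route subset."""
    current = frozenset(random.sample(range(len(routes)), 8))
    best, t = current, t0
    for _ in range(iters):
        cand = mutate(current)                       # new solution from GA operator
        d = total_cost(cand) - total_cost(current)
        if d < 0 or random.random() < math.exp(-d / t):
            current = cand                           # accept better, or worse with prob.
        if total_cost(current) < total_cost(best):
            best = current
        t *= alpha                                   # geometric cooling schedule
    return best

best = simulated_annealing()
print(sorted(best), round(total_cost(best), 2))
```

The selected subset should cover all demand nodes with far fewer routes (and lower total cost) than running every candidate route.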

  18. The need for combining IEA and IE tools. The potential effects of a global ban on PVC on climate change

    International Nuclear Information System (INIS)

    Kleijn, Rene; Van der Voet, Ester; Udo de Haes, Helias A.

    2008-01-01

    Over the last decades the concepts of Integrated Environmental Assessment (IEA) and Industrial Ecology (IE), both claiming to provide analyses and solutions for sustainability issues, have been developed separately, as they emerged in response to questions from different policy fields. In both fields, specific tools are used to support national and international environmental policy. The focus of IEA and IE tools, however, is different. IEA tools focus on one or a limited number of specific environmental issues. They often model the chain of environmental processes with high spatial (and temporal) resolution, but have a low resolution for the material structure of the economy and only partly take into account indirect effects that occur via physical and socio-economic linkages. IE tools take into account all environmental issues related to a specific substance or product. They have a high resolution for the material structure of the economy and take into account indirect effects that occur via physical linkages; however, their environmental modelling is very limited. Both IE and IEA tools have proven to be very useful and neither is superior to the other. However, a combination of both can provide additional information that can be used for more effective policy making. We use the case of a hypothetical world-wide ban on PVC to show that a measure that is not directly related to climate change could still have significant climate effects. This indirect effect is a result of the linkages of material flows in society. We show that IEA tools are not well suited to include these types of effects and that IE tools can fill this gap partially. What is really needed is a broader systems perspective that takes into account the full range of possible side-effects of environmental policy measures. (author)

  19. ANALYSIS OF COMBINED UAV-BASED RGB AND THERMAL REMOTE SENSING DATA: A NEW APPROACH TO CROWD MONITORING

    Directory of Open Access Journals (Sweden)

    S. Schulte

    2017-08-01

    Full Text Available Collecting vast amounts of data does not by itself fulfil information needs related to crowd monitoring; rather, it is important to collect data that is suitable to meet specific information requirements. In order to address this issue, a prototype was developed to facilitate the combination of UAV-based RGB and thermal remote sensing datasets. In an experimental approach, image sensors were mounted on a remotely piloted aircraft and captured two video datasets over a crowd. A group of volunteers performed diverse movements depicting real-world scenarios. The prototype, programmed in MATLAB, derives the movement on the ground. This novel detection approach using combined data is then evaluated against detection algorithms that use only a single data source. Our tests show that the combination of RGB and thermal remote sensing data is beneficial for the field of crowd monitoring regarding the detection of crowd movement.
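    The paper's prototype is implemented in MATLAB; the core fusion idea (a moving region must also be warm to count as a person) can be sketched in a hypothetical NumPy form. The frames, thresholds and blob positions below are invented stand-ins for co-registered RGB-derived and thermal imagery:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for two consecutive grayscale frames (from the RGB
# camera) and one co-registered thermal frame. All values invented.
h, w = 64, 64
prev_gray = rng.uniform(0, 1, (h, w))
curr_gray = prev_gray.copy()
thermal = np.full((h, w), 15.0)          # cool background, ~15 "degrees"

curr_gray[20:30, 20:30] += 0.5           # moving AND warm blob: a person
thermal[20:30, 20:30] = 32.0
curr_gray[40:50, 40:50] += 0.5           # moving but cold blob: e.g. debris

motion_mask = np.abs(curr_gray - prev_gray) > 0.2   # RGB cue: something moved
warm_mask = thermal > 25.0                          # thermal cue: body temperature
person_mask = motion_mask & warm_mask               # fused cue: warm AND moving

print(motion_mask.sum(), person_mask.sum())
```

RGB-only motion detection flags both 10×10 blobs (200 pixels), while the fused mask keeps only the warm one (100 pixels), which is the benefit the abstract reports for combined data.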

  20. Multiport Combined Endoscopic Approach to Nonembolized Juvenile Nasopharyngeal Angiofibroma with Parapharyngeal Extension: An Emerging Concept

    Directory of Open Access Journals (Sweden)

    Tiruchy Narayanan Janakiram

    2016-01-01

    Full Text Available Background. Surgical approaches to the parapharyngeal space (PPS) are challenging by virtue of its deep location and neurovascular content. Juvenile Nasopharyngeal Angiofibroma (JNA) is a formidable hypervascular tumor that involves multiple compartments as it increases in size. In tumors with extension to the parapharyngeal space, the endonasal approach was observed to be inadequate. The combined Endoscopic Endonasal Approach and Endoscopic Transoral Surgery (EEA-ETOS) approach provides a customized multicorridor alternative for accessing JNA for its safe and efficient resection. Methods. The study presents a case series of patients with JNA with prestyloid parapharyngeal space extension operated on by an endoscopic endonasal and endoscopic transoral approach for tumor excision. Results. The multiport EEA-ETOS approach was used to provide wide exposure to access JNA in the parapharyngeal space. No major complications were observed. No conversion to an external approach was required. Postoperative morbidity was low and postoperative scans showed no residual tumor. A one-year follow-up was maintained and there was no evidence of disease recurrence. Conclusion. Although preliminary, our experience demonstrates the safety and efficacy of the multiport approach in providing access to multiple compartments, facilitating total excision of JNA in selected cases.

  1. Language Management Tools

    DEFF Research Database (Denmark)

    Sanden, Guro Refsum

    This paper offers a review of existing literature on the topic of language management tools – the means by which language is managed – in multilingual organisations. By drawing on a combination of sociolinguistics and international business and management studies, a new taxonomy of language...... management tools is proposed, differentiating between three categories of tools. Firstly, corporate policies are the deliberate control of issues pertaining to language and communication developed at the managerial level of a firm. Secondly, corporate measures are the planned activities the firm’s leadership...... may deploy in order to address the language needs of the organisation. Finally, front-line practices refer to the use of informal, emergent language management tools available to staff members. The language management tools taxonomy provides a framework for operationalising the management of language...

  2. Global Practical Stabilization and Tracking for an Underactuated Ship - A Combined Averaging and Backstepping Approach

    Directory of Open Access Journals (Sweden)

    Kristin Y. Pettersen

    1999-10-01

    Full Text Available We solve both the global practical stabilization and tracking problem for an underactuated ship, using a combined integrator backstepping and averaging approach. Exponential convergence to an arbitrarily small neighbourhood of the origin and of the reference trajectory, respectively, is proved. Simulation results are included.

  3. [Suicide and evaluation. Review of French tools: Non-dimensional approach and self-assessment].

    Science.gov (United States)

    Ducher, J-L; de Chazeron, I; Llorca, P-M

    2016-06-01

    Suicide prevention represents a major challenge to public health, and suicide risk is a permanent concern in psychiatry; the main difficulty is its diagnosis. What resources are available in French to help therapists in this process? We can distinguish the non-dimensional approach, the use of self-administered questionnaires, and interviewer-administered questionnaires. In this paper, for reasons of editing constraints, we are interested only in the non-dimensional approach and direct assessment measures by self-assessment, analysing the strengths and limitations of each and taking into account the scientific studies that have been devoted to them and their clinical relevance. We first consider various aspects of the non-dimensional approach through suicidal risk factor research, the concepts of suicidal emergency and suicidal potential, the Shea approach, the model of Mann and some recommended evaluations. This type of approach has a number of advantages, but also limitations. A dimensional approach allows going further. In this article, we also discuss the existing self-assessment tools in French, for example the dedicated item of the Beck Depression Inventory (BDI) or specific scales such as the Reasons for Living Inventory (RFL), the Suicidal Probability Scale (SPS), the Beck Hopelessness Scale (BHS) and the self-administered Suicide Risk Assessment Scale of Ducher (aRSD). These last two seem to be used as a priority given the results of their validation studies. The strong correlation between the self-administered questionnaire aRSD and the interviewer-administered Suicide Risk Assessment Scale of Ducher (RSD) (r=0.92; P<10(-7)) shows the ability of patients to express their suicidal ideation if we invite them to do so. Copyright © 2015 L'Encéphale, Paris. Published by Elsevier Masson SAS. All rights reserved.

  4. Use of Monocrystalline Silicon as Tool Material for Highly Accurate Blanking of Thin Metal Foils

    International Nuclear Information System (INIS)

    Hildering, Sven; Engel, Ulf; Merklein, Marion

    2011-01-01

    The trend towards miniaturisation of metallic mass-production components combined with increased component functionality is still unbroken. Manufacturing these components by forming and blanking offers economical and ecological advantages combined with the needed accuracy. The complexity of producing tools with geometries below 50 μm by conventional manufacturing methods becomes disproportionately higher. Expensive serial finishing operations are required to achieve an adequate surface roughness combined with accurate geometry details. A novel approach for producing such tools is the use of advanced etching technologies for monocrystalline silicon that are well established in microsystems technology. High-precision vertical geometries with a width down to 5 μm are possible. The present study shows a novel concept using this potential for the blanking of thin copper foils with monocrystalline silicon as a tool material. A self-contained machine tool with compact outer dimensions was designed to avoid tensile stresses in the brittle silicon punch through an accurate, careful alignment of the punch, die and metal foil. A microscopic analysis of the monocrystalline silicon punch shows appropriate properties regarding flank angle, edge geometry and surface quality for the blanking process. Using a monocrystalline silicon punch with a width of 70 μm, blanking experiments on as-rolled copper foils with a thickness of 20 μm demonstrate the general applicability of this material for micro production processes.

  5. Towards a New Approach of the Economic Intelligence Process: Basic Concepts, Analysis Methods and Informational Tools

    Directory of Open Access Journals (Sweden)

    Sorin Briciu

    2009-04-01

    Full Text Available One of the obvious trends in the current business environment is increased competition. In this context, organizations are becoming more and more aware of the importance of knowledge as a key factor in obtaining competitive advantage. A possible solution in knowledge management is Economic Intelligence (EI), which involves the collection, evaluation, processing, analysis and dissemination of economic data (about products, clients, competitors, etc.) inside organizations. The availability of massive quantities of data, correlated with advances in information and communication technology allowing for the filtering and processing of these data, provides new tools for the production of economic intelligence. The research is focused on innovative aspects of the economic intelligence process (models of analysis, activities, methods and informational tools) and provides practical guidelines for initiating this process. In this paper, we try: (a) to contribute to a coherent view of the economic intelligence process (approaches, stages, fields of application); (b) to describe the most important models of analysis related to this process; (c) to analyze the activities, methods and tools associated with each stage of an EI process.

  6. Synergy Maps: exploring compound combinations using network-based visualization.

    Science.gov (United States)

    Lewis, Richard; Guha, Rajarshi; Korcsmaros, Tamás; Bender, Andreas

    2015-01-01

    The phenomenon of super-additivity of biological response to compounds applied jointly, termed synergy, has the potential to provide many therapeutic benefits. Therefore, high-throughput screening of compound combinations has recently received a great deal of attention. Large compound libraries and the feasibility of all-pairs screening can easily generate large, information-rich datasets. Previously, these datasets have been visualized using either a heat-map or a network approach; however, these visualizations only partially represent the information encoded in the dataset. A new visualization technique for pairwise combination screening data, termed "Synergy Maps", is presented. In a Synergy Map, information about the synergistic interactions of compounds is integrated with information about their properties (chemical structure, physicochemical properties, bioactivity profiles) to produce a single visualization. As a result the relationships between compound and combination properties may be investigated simultaneously, and thus may afford insight into the synergy observed in the screen. An interactive web app implementation, available at http://richlewis42.github.io/synergy-maps, has been developed for public use, which may find use in navigating and filtering larger-scale combination datasets. This tool is applied to a recent all-pairs dataset of anti-malarials, tested against Plasmodium falciparum, and a preliminary analysis is given as an example, illustrating the disproportionate synergism of histone deacetylase inhibitors previously described in the literature, as well as suggesting new hypotheses for future investigation. Synergy Maps improve the state of the art in compound combination visualization by simultaneously representing individual compound properties and their interactions. The web-based tool allows straightforward exploration of combination data, and easier identification of correlations between compound properties and interactions.
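    The core data behind such a map is a per-pair synergy score. The paper does not prescribe a specific scoring model here, so as an illustration the sketch below uses Bliss independence (a common choice: expected combined effect Ea + Eb − Ea·Eb) and keeps only pairs whose observed effect exceeds that expectation by a threshold, yielding the edge list of a synergy network. All compound names and effect values are invented:

```python
# Single-compound fractional effects in [0, 1] (invented illustrative values).
effect = {"A": 0.30, "B": 0.50, "C": 0.10}

# Observed effects of the pairwise combinations from a hypothetical screen.
observed = {("A", "B"): 0.85, ("A", "C"): 0.37, ("B", "C"): 0.80}

def bliss_excess(a, b):
    """Synergy score: observed combined effect minus Bliss-independence expectation."""
    expected = effect[a] + effect[b] - effect[a] * effect[b]
    return observed[(a, b)] - expected

# Edges of the synergy network: pairs whose excess passes a threshold.
edges = {pair: round(bliss_excess(*pair), 3)
         for pair in observed if bliss_excess(*pair) > 0.1}
print(edges)
```

Pair (A, C) behaves additively (observed ≈ expected) and drops out, while (A, B) and (B, C) survive as synergy edges; node positions in the actual Synergy Map would then be driven by compound properties such as structural similarity.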

  7. Towards a Platform of Investigative Tools for Biomimicry as a New Approach for Energy-Efficient Building Design

    Directory of Open Access Journals (Sweden)

    Natasha Chayaamor-Heil

    2017-03-01

    Full Text Available Major worldwide problems are environmental concerns and energy shortages, along with the high consumption of energy in buildings and the lack of sources. Buildings are the most intensive energy consumers, accounting for 40% of worldwide energy use, which is much more than transportation. In the next 25 years, CO2 emissions from buildings are projected to grow faster than in other sectors. Thus, architects must attempt to find solutions for managing buildings' energy consumption. One innovative approach is biomimicry, defined as the applied science that derives inspiration for solutions to human problems through the study of natural designs' principles. Although biomimicry is considered a new approach for achieving sustainable architecture, there is still not enough access for architects to make use of it, especially to implement a biomimetic design strategy in an architectural project. The main objective of this paper is to give architects better and more accessible means of using biomimetic strategies. We propose to create a tool that formalizes and bridges biological and architectural knowledge, along with investigative tools to assess the potential for reducing energy consumption by applying biomimetic strategies to energy-efficient building design. This article proposes a hypothetical investigative tool based on Bayesian networks for rapidly testing choices of natural devices against specific multi-criteria requirements in each case study.
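    The abstract's proposed tool ranks natural "devices" against multi-criteria requirements with a Bayesian network. A minimal sketch of that idea, with a naive-Bayes structure, invented strategy names and invented probability values (the paper's actual network and knowledge base are not specified here), could look like:

```python
# Hypothetical knowledge base: for each natural "device" (strategy), the
# probability that it satisfies each design criterion. All values invented.
strategies = {
    "termite-mound ventilation": {"passive_cooling": 0.9, "low_cost": 0.7},
    "polar-bear-fur insulation":  {"passive_cooling": 0.1, "low_cost": 0.6},
}
prior = {"termite-mound ventilation": 0.5, "polar-bear-fur insulation": 0.5}

def posterior(evidence):
    """P(strategy | observed criteria) by direct enumeration over the network."""
    scores = {}
    for s, p_crit in strategies.items():
        p = prior[s]
        for criterion, satisfied in evidence.items():
            p *= p_crit[criterion] if satisfied else 1 - p_crit[criterion]
        scores[s] = p
    z = sum(scores.values())           # normalise over all strategies
    return {s: p / z for s, p in scores.items()}

# Query: which strategy best fits a hot-climate, low-budget case study?
post = posterior({"passive_cooling": True, "low_cost": True})
print(max(post, key=post.get), round(post["termite-mound ventilation"], 3))
```

Given evidence that both passive cooling and low cost are required, enumeration strongly favours the cooling-oriented strategy, which is the "rapid result of choices" the abstract describes.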

  8. An alternative approach to the determination of scaling law expressions for the L–H transition in Tokamaks utilizing classification tools instead of regression

    International Nuclear Information System (INIS)

    Gaudio, P; Gelfusa, M; Lupelli, I; Murari, A; Vega, J

    2014-01-01

    A new approach to determining the power law expressions for the threshold between the H and L modes of confinement is presented. The method is based on two powerful machine learning tools for classification: neural networks and support vector machines. Using as inputs clear examples of the system on either side of the transition, the machine learning tools learn the input–output mapping corresponding to the equations of the boundary separating the confinement regimes. Systematic tests with synthetic data show that the machine learning tools provide results competitive with traditional statistical regression and more robust against random noise and systematic errors. The developed tools have then been applied to the multi-machine International Tokamak Physics Activity International Global Threshold Database of validated ITER-like Tokamak discharges. The machine learning tools converge on the same scaling law parameters obtained with non-linear regression. On the other hand, the developed tools allow a reduction of 50% in the uncertainty of the extrapolations to ITER. The proposed approach can therefore effectively complement traditional regression, since its application poses much less stringent requirements on the experimental data used to determine the scaling laws: it does not require examples exactly at the moment of the transition. (paper)
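    The key idea here, that a classifier trained on examples from either side of the transition recovers the power law, follows because a power-law threshold is a hyperplane in log-space, so the orientation of a linear decision boundary directly gives the exponents. A sketch with synthetic data (variable names, exponent values and sample sizes are all invented; the real database uses physical plasma parameters) could look like:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(42)
n_pts = 2000

# Synthetic "discharges": the threshold power follows P_th = C * n^a * B^b,
# with exponents chosen arbitrarily for this illustration.
a_true, b_true, logC = 0.7, 0.8, 0.0
log_n = rng.uniform(-1, 1, n_pts)
log_B = rng.uniform(-1, 1, n_pts)
log_P = rng.uniform(-2, 2, n_pts)

# Label: H-mode (1) if the heating power exceeds the power-law threshold.
y = (log_P > logC + a_true * log_n + b_true * log_B).astype(int)

# Linear SVM in log-space: the separating plane w.x + b = 0 IS the scaling law,
# since log P_th = log C + a log n + b log B is linear in the logs.
X = np.column_stack([log_P, log_n, log_B])
clf = SVC(kernel="linear", C=100.0).fit(X, y)

w = clf.coef_[0]
a_est = -w[1] / w[0]   # exponent of n, read off the boundary orientation
b_est = -w[2] / w[0]   # exponent of B
print(round(a_est, 2), round(b_est, 2))
```

Note that no example needs to lie exactly at the transition: the classifier only needs points labelled as being on one side or the other, which is the relaxed data requirement the abstract emphasises.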

  9. Developing a planning tool for South African prosecution resources: challenges and approach

    Directory of Open Access Journals (Sweden)

    R Koen

    2012-12-01

    Full Text Available In every country the prosecution of criminal cases is governed by different laws, policies and processes. In South Africa, the National Prosecuting Authority (NPA has the responsibility of planning and managing all prosecution functions. The NPA has certain unique characteristics that make it different from other similar organisations internationally. The development of a planning tool that the NPA could use to plan their future resource requirements over the short to medium term required extensive modelling, and its final form included features which, to the best knowledge of the development team, make it unique both locally and internationally. Model design was largely influenced by the challenges emanating from the special requirements and context of the problem. Resources were not forecasted directly, but were derived with the help of simulation models that traced docket flows through various resource-driven processes. Docket flows were derived as a proportion of reported crimes, and these were forecasted using a multivariate statistical model which could take into account explanatory variables as well as the correlations between the patterns observed within different crime categories. The simulation consisted of a number of smaller models which could be run independently, and not of one overarching model. This approach was found to make the best use of available data, and compensated for the fact that certain parameters, linking different courts and court types, were not available. In addition, it simplified scenario testing and sensitivity analysis. The various components of the planning tool, including inputs and outputs of the simulation models and the linkages between the forecasts and the simulation models, were implemented in a set of spreadsheets. By using spreadsheets as a common user interface, the planning tool could be used by prosecutors and managers who may not have extensive mathematical or modelling experience.

  10. Combined Use of Morphological and Molecular Tools to Resolve Species Mis-Identifications in the Bivalvia: The Case of Glycymeris glycymeris and G. pilosa.

    Science.gov (United States)

    Purroy, Ariadna; Šegvić-Bubić, Tanja; Holmes, Anna; Bušelić, Ivana; Thébault, Julien; Featherstone, Amy; Peharda, Melita

    Morphological and molecular tools were combined to resolve the misidentification between Glycymeris glycymeris and Glycymeris pilosa from Atlantic and Mediterranean populations. The ambiguous literature on the taxonomic status of these species requires this confirmation as a baseline for studies on their ecology and sclerochronology. We used classical and landmark-based morphometric approaches and performed bivariate and multivariate analyses to test for shell character interactions at the individual and population level. Both approaches generated complementary information. The former showed the shell width-to-length ratio and the valve asymmetry to be the main discriminant characters between Atlantic and Mediterranean populations. Additionally, the external microsculpture of additional and finer secondary ribs in G. glycymeris discriminates it from G. pilosa. Likewise, landmark-based geometric morphometrics revealed a stronger opisthogyrate beak and prosodetic ligament in G. pilosa than in G. glycymeris. Our Bayesian and maximum likelihood phylogenetic analyses based on the COI and ITS2 genes identified that G. glycymeris and G. pilosa form two separate monophyletic clades with a mean interspecific divergence of 11% and 0.9% for COI and ITS2, respectively. The congruent patterns of the morphometric analysis together with the mitochondrial and nuclear phylogenetic reconstructions indicated the separation of the two coexisting species. The intraspecific divergence occurred during the Eocene and accelerated during the late Pliocene and Pleistocene. Glycymeris pilosa showed a high level of genetic diversity, appearing as a more robust species whose tolerance of environmental conditions allowed its expansion throughout the Mediterranean.

  11. A Psychometric Tool for a Virtual Reality Rehabilitation Approach for Dyslexia

    Directory of Open Access Journals (Sweden)

    Elisa Pedroli

    2017-01-01

    Full Text Available Dyslexia is a chronic problem that affects subjects' lives and often influences their life choices. The standard rehabilitation methods all use a classic paper-and-pencil training format, but these exercises are boring and demanding for children, who may have difficulty completing the treatments. It is important to develop a new rehabilitation program that helps children in a fun and engaging way. A Wii-based game was developed to demonstrate that a short treatment with an action video game, rather than phonological or orthographic training, may improve reading abilities in dyslexic children. According to the results, an approach using cues in the context of a virtual environment may represent a promising tool to improve attentional skills. On the other hand, our results do not demonstrate an immediate effect on reading performance, suggesting that a more prolonged protocol may be a future direction.

  12. A novel approach for honey pollen profile assessment using an electronic tongue and chemometric tools

    Energy Technology Data Exchange (ETDEWEB)

    Dias, Luís G., E-mail: ldias@ipb.pt [Escola Superior Agrária, Instituto Politécnico de Bragança, Campus Santa Apolónia, 5301-855 Bragança (Portugal); CQ-VR, Centro de Química – Vila Real, University of Trás-os-Montes e Alto Douro, Apartado 1013, 5001-801 Vila Real (Portugal); Veloso, Ana C.A. [Instituto Politécnico de Coimbra, ISEC, DEQB, Rua Pedro Nunes, Quinta da Nora, 3030-199 Coimbra (Portugal); CEB-Centre of Biological Engineering, University of Minho, Campus de Gualtar, 4710-057 Braga (Portugal); Sousa, Mara E.B.C.; Estevinho, Letícia [CIMO-Escola Superior Agrária, Instituto Politécnico de Bragança, Campus Santa Apolónia, 5301-855 Bragança (Portugal); Machado, Adélio A.S.C. [LAQUIPAI – Laboratório de Química Inorgânica Pura e de Aplicação Interdisciplinar, Departamento de Química, Faculdade de Ciências da, Universidade do Porto, Rua Campo Alegre n°. 687, 4169-007 Porto (Portugal); and others

    2015-11-05

    Nowadays the main honey-producing countries require accurate labeling of honey before commercialization, including floral classification. Traditionally, this classification is made by melissopalynology analysis, an accurate but time-consuming task requiring laborious sample pre-treatment and highly skilled technicians. In this work the potential use of a potentiometric electronic tongue for pollinic assessment is evaluated, using monofloral and polyfloral honeys. The results showed that, after splitting honeys according to color (white, amber and dark), the novel methodology enabled quantifying the relative percentage of the main pollens (Castanea sp., Echium sp., Erica sp., Eucaliptus sp., Lavandula sp., Prunus sp., Rubus sp. and Trifolium sp.). Multiple linear regression models were established for each type of pollen, based on the best sensor subsets selected using the simulated annealing algorithm. To minimize the overfitting risk, a repeated K-fold cross-validation procedure was implemented, ensuring that at least 10–20% of the honeys were used for internal validation. With this approach, a minimum average determination coefficient of 0.91 ± 0.15 was obtained. Also, the proposed technique enabled the correct classification of 92% and 100% of monofloral and polyfloral honeys, respectively. The quite satisfactory performance of the novel procedure for quantifying the relative pollen frequency suggests its applicability for honey labeling and geographical origin identification. Nevertheless, this approach is not a full alternative to the traditional melissopalynology analysis; it may be seen as a practical complementary tool for preliminary honey floral classification, leaving only problematic cases for pollinic evaluation. - Highlights: • Honey's floral origin labeling is a legal requirement. • Melissopalynology analysis usually used to evaluate pollens profile is laborious. • A novel E-tongue based approach is applied to assess pollens
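    The modelling pipeline this abstract describes (simulated-annealing selection of a sensor subset, a multiple linear regression per pollen type, and repeated K-fold cross-validation with ~20% held out per fold to guard against overfitting) can be sketched as follows. The data are a synthetic stand-in: 20 invented "sensor" signals of which only four carry information, instead of real potentiometric E-tongue measurements:

```python
import math
import random

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import RepeatedKFold, cross_val_score

rng = np.random.default_rng(1)
random.seed(1)

# Synthetic stand-in for E-tongue data: 120 honeys x 20 sensors, where the
# pollen percentage depends on only 4 sensors. All values invented.
n_samples, n_sensors = 120, 20
X = rng.normal(size=(n_samples, n_sensors))
true_sensors = [2, 5, 11, 17]
y = X[:, true_sensors] @ np.array([1.5, -2.0, 1.0, 0.5]) + rng.normal(0, 0.1, n_samples)

# Repeated K-fold CV: each fold holds out 20% of the samples, repeated 3 times.
cv = RepeatedKFold(n_splits=5, n_repeats=3, random_state=0)

def cv_r2(subset):
    """Mean cross-validated R^2 of a multiple linear regression on a sensor subset."""
    return cross_val_score(LinearRegression(), X[:, sorted(subset)], y,
                           scoring="r2", cv=cv).mean()

# Simulated annealing over sensor subsets; maximising the CROSS-VALIDATED R^2
# (rather than the training fit) is what limits overfitting.
current = set(random.sample(range(n_sensors), 6))
best, t = set(current), 0.1
for _ in range(200):
    cand = set(current)
    cand.symmetric_difference_update({random.randrange(n_sensors)})  # flip one sensor
    if cand:
        d = cv_r2(cand) - cv_r2(current)
        if d > 0 or random.random() < math.exp(d / t):
            current = cand
        if cv_r2(current) > cv_r2(best):
            best = set(current)
    t *= 0.97
print(sorted(best), round(cv_r2(best), 3))
```

In the real study one such model was built per pollen type (Castanea sp., Echium sp., etc.) after first grouping honeys by colour; the sketch covers a single response variable only.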

  13. A novel approach for honey pollen profile assessment using an electronic tongue and chemometric tools

    International Nuclear Information System (INIS)

    Dias, Luís G.; Veloso, Ana C.A.; Sousa, Mara E.B.C.; Estevinho, Letícia; Machado, Adélio A.S.C.

    2015-01-01

    Nowadays the main honey-producing countries require accurate labeling of honey before commercialization, including floral classification. Traditionally, this classification is made by melissopalynology analysis, an accurate but time-consuming task requiring laborious sample pre-treatment and highly skilled technicians. In this work the potential use of a potentiometric electronic tongue for pollinic assessment is evaluated, using monofloral and polyfloral honeys. The results showed that, after splitting honeys according to color (white, amber and dark), the novel methodology enabled quantifying the relative percentage of the main pollens (Castanea sp., Echium sp., Erica sp., Eucaliptus sp., Lavandula sp., Prunus sp., Rubus sp. and Trifolium sp.). Multiple linear regression models were established for each type of pollen, based on the best sensor subsets selected using the simulated annealing algorithm. To minimize the overfitting risk, a repeated K-fold cross-validation procedure was implemented, ensuring that at least 10–20% of the honeys were used for internal validation. With this approach, a minimum average determination coefficient of 0.91 ± 0.15 was obtained. Also, the proposed technique enabled the correct classification of 92% and 100% of monofloral and polyfloral honeys, respectively. The quite satisfactory performance of the novel procedure for quantifying the relative pollen frequency suggests its applicability for honey labeling and geographical origin identification. Nevertheless, this approach is not a full alternative to the traditional melissopalynology analysis; it may be seen as a practical complementary tool for preliminary honey floral classification, leaving only problematic cases for pollinic evaluation. - Highlights: • Honey's floral origin labeling is a legal requirement. • Melissopalynology analysis usually used to evaluate pollens profile is laborious. • A novel E-tongue based approach is applied to assess pollens relative

  14. Tools for a collective management of the climate risks. An analysis framework centered on the means to derive a signal-price

    International Nuclear Information System (INIS)

    Godard, O.; Hourcade, J.C.

    1991-01-01

    The greenhouse effect appears as a particular problem (collective uncertainty, imprecise understanding, etc.) that necessitates a specific strategic approach and adequate criteria for tool selection. The applicability of any environmental regulation concerning greenhouse gas abatement must take into account aspects such as efficiency, predictability, political sovereignty, perverse effects and practicability. Combined tools have to be coordinated: regulations, negotiation, financial incentives, exchangeable emission rights..., which could be controlled by a signal-price system

  15. How to Conduct Multimethod Field Studies in the Operating Room: The iPad Combined With a Survey App as a Valid and Reliable Data Collection Tool.

    Science.gov (United States)

    Tscholl, David W; Weiss, Mona; Spahn, Donat R; Noethiger, Christoph B

    2016-01-05

    Tablet computers such as the Apple iPad are progressively replacing traditional paper-and-pencil-based data collection. We combined the iPad with the ready-to-use survey software, iSurvey (from Harvestyourdata), to create a straightforward tool for data collection during the Anesthesia Pre-Induction Checklist (APIC) study, a hospital-wide multimethod intervention study involving observation of team performance and team member surveys in the operating room (OR). We aimed to provide an analysis of the factors that led to the use of the iPad- and iSurvey-based tool for data collection, illustrate our experiences with the use of this data collection tool, and report the results of an expert survey about user experience with this tool. We used an iPad- and iSurvey-based tool to observe anesthesia inductions conducted by 205 teams (N=557 team members) in the OR. In Phase 1, expert raters used the iPad- and iSurvey-based tool to rate team performance during anesthesia inductions, and anesthesia team members were asked to indicate their perceptions after the inductions. In Phase 2, we surveyed the expert raters about their perceptions regarding the use of the iPad- and iSurvey-based tool to observe, rate, and survey teams in the ORs. The results of Phase 1 showed that training data collectors on the iPad- and iSurvey-based data collection tool was effortless and there were no serious problems during data collection, upload, download, and export. Interrater agreement of the combined data collection tool was found to be very high for the team observations (median Fleiss' kappa=0.88, 95% CI 0.78-1.00). The results of the follow-up expert rater survey (Phase 2) showed that the raters did not prefer a paper-and-pencil-based data collection method they had used during other earlier studies over the iPad- and iSurvey-based tool (median response 1, IQR 1-1; 1=do not agree, 2=somewhat disagree, 3=neutral, 4=somewhat agree, 5=fully agree). They found the iPad (median 5, IQR 4

  16. Adapting the capacities and vulnerabilities approach: a gender analysis tool.

    Science.gov (United States)

    Birks, Lauren; Powell, Christopher; Hatfield, Jennifer

    2017-12-01

    Gender analysis methodology is increasingly being considered essential to health research because 'women's social, economic and political status undermine their ability to protect and promote their own physical, emotional and mental health, including their effective use of health information and services' {World Health Organization [Gender Analysis in Health: a review of selected tools. 2003; www.who.int/gender/documents/en/Gender.pdf (20 February 2008, date last accessed)]}. By examining gendered roles, responsibilities and norms through the lens of gender analysis, we can develop an in-depth understanding of social power differentials, and be better able to address gender inequalities and inequities within institutions and between men and women. When conducting gender analysis, tools and frameworks may aid community engagement and provide a framework to ensure that relevant gendered nuances are assessed. The capacities and vulnerabilities approach (CVA) is one such gender analysis framework that critically considers gender and its associated roles, responsibilities and power dynamics in a particular community and seeks to meet a social need of that particular community. Although the original intent of the CVA was to guide humanitarian intervention and disaster preparedness, we adapted this framework to a different context, which focuses on identifying and addressing emerging problems and social issues in a particular community or area that affect their specific needs, such as an infectious disease outbreak or difficulty accessing health information and resources. We provide an example of our CVA adaptation, which served to facilitate a better understanding of how health-related disparities affect Maasai women in a remote, resource-poor setting in Northern Tanzania. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  17. Benchmarking Tool Kit.

    Science.gov (United States)

    Canadian Health Libraries Association.

    Nine Canadian health libraries participated in a pilot test of the Benchmarking Tool Kit between January and April, 1998. Although the Tool Kit was designed specifically for health libraries, the content and approach are useful to other types of libraries as well. Used to its full potential, benchmarking can provide a common measuring stick to…

  18. Lathe tool bit and holder for machining fiberglass materials

    Science.gov (United States)

    Winn, L. E. (Inventor)

    1972-01-01

    A lathe tool and holder combination for machining resin impregnated fiberglass cloth laminates is described. The tool holder and tool bit combination is designed to accommodate a conventional carbide-tipped, round shank router bit as the cutting medium, and provides an infinite number of cutting angles in order to produce a true and smooth surface in the fiberglass material workpiece with every pass of the tool bit. The technique utilizes damaged router bits which ordinarily would be discarded.

  19. A new spatial multi-criteria decision support tool for site selection for implementation of managed aquifer recharge.

    Science.gov (United States)

    Rahman, M Azizur; Rusteberg, Bernd; Gogu, R C; Lobo Ferreira, J P; Sauter, Martin

    2012-05-30

    This study reports the development of a new spatial multi-criteria decision analysis (SMCDA) software tool for selecting suitable sites for Managed Aquifer Recharge (MAR) systems. The new SMCDA software tool combines existing multi-criteria evaluation methods with modern decision analysis techniques. More specifically, non-compensatory screening, criteria standardization and weighting, and the Analytical Hierarchy Process (AHP) have been combined with Weighted Linear Combination (WLC) and Ordered Weighted Averaging (OWA). This SMCDA tool may be implemented with a wide range of decision makers' preferences. The tool's user-friendly interface helps guide the decision maker through the sequential steps for site selection, namely constraint mapping, criteria hierarchy, criteria standardization and weighting, and criteria overlay. The tool offers some predetermined default criteria and standard methods to improve the trade-off between ease of use and efficiency. Integrated into ArcGIS, the tool has the advantage of using GIS tools for spatial analysis, and herein data may be processed and displayed. The tool is non-site specific, adaptive, and comprehensive, and may be applied to any type of site-selection problem. To demonstrate the robustness of the new tool, a case study was planned and executed in the Algarve Region, Portugal. The efficiency of the SMCDA tool in the decision making process for selecting suitable sites for MAR was also demonstrated. Specific aspects of the tool such as built-in default criteria, explicit decision steps, and flexibility in choosing different options were key features, which benefited the study. The new SMCDA tool can be augmented by groundwater flow and transport modeling so as to achieve a more comprehensive approach to the selection process for the best locations of the MAR infiltration basins, as well as the locations of recovery wells and areas of groundwater protection.
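    The two aggregation rules at the heart of such a tool can be illustrated with a small sketch. The criteria names, standardized scores and weights below are hypothetical, and the sketch shows only the WLC and OWA rules, not the actual SMCDA software:

```python
# Hypothetical Weighted Linear Combination (WLC) and Ordered Weighted
# Averaging (OWA) scores for one candidate site. Criteria are assumed
# to be already standardized to [0, 1]; names and weights are made up.

def wlc_score(criteria, weights):
    """WLC: sum of criterion weight times standardized criterion value."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[name] * value for name, value in criteria.items())

def owa_score(criteria, order_weights):
    """OWA: order weights apply to ranked values, not to named criteria."""
    ranked = sorted(criteria.values(), reverse=True)
    return sum(w * v for w, v in zip(order_weights, ranked))

site = {"infiltration_rate": 0.8, "depth_to_water": 0.6, "land_slope": 0.9}
weights = {"infiltration_rate": 0.5, "depth_to_water": 0.3, "land_slope": 0.2}

print(round(wlc_score(site, weights), 2))          # 0.76
print(round(owa_score(site, [0.5, 0.3, 0.2]), 2))  # 0.81
```

    Varying the order weights moves OWA between a pessimistic "min" and an optimistic "max" operator, which is how such tools express different decision-maker attitudes toward risk.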

  20. The combined theoretical and experimental approach to arrive at optimum parameters in friction stir welding

    Science.gov (United States)

    Jagadeesha, C. B.

    2017-12-01

    Although friction stir welding was invented as far back as 1991 by TWI, England, no method, procedure or approach has so far been developed that quickly yields the optimum or exact parameters for a good or sound weld. An approach has been developed in which an equation is derived from which an approximate rpm can be obtained; by setting a range of ±100 or ±50 rpm around this approximate rpm, and by setting the welding speed to 60 mm/min or 50 mm/min, one can conduct FSW experiments that quickly converge on the optimum parameters, i.e. the rpm and welding speed that yield a sound weld. This approach can be used effectively to obtain sound welds for all similar and dissimilar combinations of materials such as steel, Al, Mg, Ti, etc.
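    The sweep described above can be sketched as follows; the starting rpm would come from the derived equation (not reproduced here), and all numbers are illustrative:

```python
# Enumerate trial FSW parameters around an approximate rpm, as in the
# approach above: rpm swept +/-100 in steps of 50 at a fixed welding
# speed of 60 mm/min. The approximate rpm and step sizes are made up.

def trial_parameters(approx_rpm, rpm_window=100, rpm_step=50, weld_speed=60):
    """Candidate (rpm, welding speed in mm/min) pairs around the estimate."""
    rpms = range(approx_rpm - rpm_window, approx_rpm + rpm_window + 1, rpm_step)
    return [(rpm, weld_speed) for rpm in rpms]

print(trial_parameters(1000))
# [(900, 60), (950, 60), (1000, 60), (1050, 60), (1100, 60)]
```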

  1. Improving mass measurement accuracy in mass spectrometry based proteomics by combining open source tools for chromatographic alignment and internal calibration.

    Science.gov (United States)

    Palmblad, Magnus; van der Burgt, Yuri E M; Dalebout, Hans; Derks, Rico J E; Schoenmaker, Bart; Deelder, André M

    2009-05-02

    Accurate mass determination enhances peptide identification in mass spectrometry based proteomics. We here describe the combination of two previously published open source software tools to improve mass measurement accuracy in Fourier transform ion cyclotron resonance mass spectrometry (FTICRMS). The first program, msalign, aligns one MS/MS dataset with one FTICRMS dataset. The second software, recal2, uses peptides identified from the MS/MS data for automated internal calibration of the FTICR spectra, resulting in sub-ppm mass measurement errors.
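    The internal-calibration idea can be illustrated with a generic linear recalibration of measured m/z values against the theoretical masses of identified peptides. This is a simplified sketch, not recal2's actual algorithm, and the masses are invented:

```python
# Generic internal-calibration sketch: fit a linear correction of
# measured m/z against the theoretical masses of peptides identified
# from MS/MS, then apply it to the rest of the spectrum. This is a
# simplification of what recal2 does; all mass values are made up.

def fit_linear_calibration(measured, theoretical):
    """Least-squares fit of theoretical = a * measured + b."""
    n = len(measured)
    mx = sum(measured) / n
    my = sum(theoretical) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(measured, theoretical))
         / sum((x - mx) ** 2 for x in measured))
    return a, my - a * mx

def recalibrate(mz_values, a, b):
    """Apply the fitted correction to arbitrary m/z values."""
    return [a * x + b for x in mz_values]

# Calibrant peptides: measured vs theoretical monoisotopic m/z.
measured = [500.270, 800.430, 1200.660]
theoretical = [500.268, 800.427, 1200.655]
a, b = fit_linear_calibration(measured, theoretical)
corrected = recalibrate([900.500], a, b)
```

    With calibrants spread across the mass range, the fitted correction removes the systematic (mass-dependent) part of the error, which is what pushes the residual errors into the sub-ppm regime described above.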

  2. Tool geometry and damage mechanisms influencing CNC turning efficiency of Ti6Al4V

    Science.gov (United States)

    Suresh, Sangeeth; Hamid, Darulihsan Abdul; Yazid, M. Z. A.; Nasuha, Nurdiyanah; Ain, Siti Nurul

    2017-12-01

    Ti6Al4V, or Grade 5 titanium alloy, is widely used in the aerospace, medical, automotive and fabrication industries due to its distinctive combination of mechanical and physical properties. Ti6Al4V has nevertheless always been difficult to machine, ironically because of the same mix of properties. Machining Ti6Al4V has resulted in short cutting tool life, which has led to objectionable surface integrity and rapid failure of the machined parts. However, the proven functional relevance of this material has prompted extensive research into the optimization of machine parameters and cutting tool characteristics. Cutting tool geometry plays a vital role in ensuring dimensional and geometric accuracy in machined parts. In this study, an experimental investigation is carried out to optimize the nose radius and relief angles of the cutting tools and their interaction with different levels of machining parameters. The low elastic modulus and thermal conductivity of Ti6Al4V contribute to rapid tool damage, and the impact of these properties on tool tip damage is studied. An experimental design approach is utilized in the CNC turning of Ti6Al4V to statistically analyze and propose optimum levels of the input parameters that lengthen tool life and enhance the surface characteristics of the machined parts. A greater tool nose radius with a straight flank, combined with low feed rates, resulted in desirable surface integrity. The presence of a relief angle proved to aggravate tool damage as well as dimensional instability in the CNC turning of Ti6Al4V.

  3. A software tool for ecosystem services assessments

    Science.gov (United States)

    Riegels, Niels; Klinting, Anders; Butts, Michael; Middelboe, Anne Lise; Mark, Ole

    2017-04-01

    proposed project can be estimated to determine whether the project affects drivers, pressures, states or a combination of these. • In part III, information about impacts on drivers, pressures, and states is used to identify ESS impacted by a proposed project. Potential beneficiaries of impacted ESS are also identified. • In part IV, changes in ESS are estimated. These estimates include changes in the provision of ESS, the use of ESS, and the value of ESS. • A sustainability assessment in Part V estimates the broader impact of a proposed project according to social, environmental, governance and other criteria. The ESS evaluation software tool is designed to assist an evaluation or study leader carrying out an ESS assessment. The tool helps users move through the logic of the ESS evaluation and make sense of relationships between elements of the DPSIR framework, the CICES classification scheme, and the FEGS approach. The tool also provides links to useful indicators and assessment methods in order to help users quantify changes in ESS and ESS values. The software tool is developed in collaboration with the DESSIN user group, who will use the software to estimate changes in ESS resulting from the implementation of green technologies addressing water quality and water scarcity issues. Although the software is targeted to this user group, it will be made available for free to the public after the conclusion of the project.

  4. Expert tool use

    DEFF Research Database (Denmark)

    Thorndahl, Kathrine Liedtke; Ravn, Susanne

    2017-01-01

    According to some phenomenologists, a tool can be experienced as incorporated when, as a result of habitual use or deliberate practice, someone is able to manipulate it without conscious effort. In this article, we specifically focus on the experience of expert tool use in elite sport. Based on a case study of elite rope skipping, we argue that the phenomenological concept of incorporation does not suffice to adequately describe how expert tool users feel when interacting with their tools. By analyzing a combination of insights gained from participant observation of 11 elite rope skippers and autoethnographic material from one former elite skipper, we take some initial steps toward the development of a more nuanced understanding of the concept of incorporation; one that is able to accommodate the experiences of expert tool users. In sum, our analyses indicate that the possibility for experiencing…

  5. Tool steels

    DEFF Research Database (Denmark)

    Højerslev, C.

    2001-01-01

    On designing a tool steel, its composition and heat treatment parameters are chosen to provide a hardened and tempered martensitic matrix in which carbides are evenly distributed. In this condition the matrix has an optimum combination of hardness and toughness; the primary carbides provide resistance against abrasive wear, and secondary carbides (if any) increase the resistance against plastic deformation. Tool steels are alloyed with carbide-forming elements (typically vanadium, tungsten, molybdenum and chromium); furthermore, some steel types contain cobalt. The addition of alloying elements serves primarily two purposes: (i) to improve the hardenability and (ii) to provide harder and thermally more stable carbides than cementite. Assuming proper heat treatment, the properties of a tool steel depend on which alloying elements are added and their respective concentrations.

  6. Estimation of the influence of tool wear on force signals: A finite element approach in AISI 1045 orthogonal cutting

    Science.gov (United States)

    Equeter, Lucas; Ducobu, François; Rivière-Lorphèvre, Edouard; Abouridouane, Mustapha; Klocke, Fritz; Dehombreux, Pierre

    2018-05-01

    Industrial concerns arise regarding the significant cost of cutting tools in the machining process. In particular, an improper replacement policy can lead either to scraps, or to early tool replacements, which waste fine tools. ISO 3685 provides the flank wear end-of-life criterion. Flank wear is also the nominal type of wear for the longest tool lifetimes in optimal cutting conditions. Its consequences include bad surface roughness and dimensional discrepancies. In order to aid the replacement decision process, several tool condition monitoring techniques have been suggested. Force signals were shown in the literature to be strongly linked with tool flank wear. It can therefore be assumed that force signals are highly relevant for monitoring the condition of cutting tools and providing decision-aid information in the framework of their maintenance and replacement. The objective of this work is to correlate tool flank wear with numerically computed force signals. The present work uses a Finite Element Model with a Coupled Eulerian-Lagrangian approach. The geometry of the tool is changed for different runs of the model, in order to obtain results that are specific to a certain level of wear. The model is assessed by comparison with experimental data gathered earlier on fresh tools. Using the model at constant cutting parameters, force signals under different tool wear states are computed for each studied tool geometry. These signals are qualitatively compared with relevant data from the literature. At this point, no quantitative comparison could be performed on worn tools because the reviewed literature failed to provide similar studies in this material, either numerical or experimental. Therefore, further development of this work should include experimental campaigns aimed at collecting cutting force signals and assessing the numerical results achieved through this work.

  7. Combined perventricular septal defect closure and patent ductus arteriosus ligation via the lower ministernotomy approach.

    Science.gov (United States)

    Voitov, Alexey; Omelchenko, Alexander; Gorbatykh, Yuriy; Bogachev-Prokophiev, Alexander; Karaskov, Alexander

    2018-02-01

    Over the past decade, minimally invasive approaches have been advocated for surgical correction of congenital defects to reduce costs related to hospitalization and for improved cosmesis. Minimal skin incisions and partial sternotomy reduce surgical trauma, however these techniques might not be successful in treating a number of congenital pathological conditions, particularly for combined congenital defects. We focused on cases with a combined presentation of ventricular septal defect and patent ductus arteriosus. We studied 12 infants who successfully underwent surgical treatment for a combined single-stage ventricular septal defect and patent ductus arteriosus closure through a lower ministernotomy without using cardiopulmonary bypass and X-rays. No intraoperative and early postoperative complications or mortality were noted. Postoperative echocardiography did not reveal residual shunts. The proposed technique is safe and reproducible in infants. © Crown copyright 2017.

  8. Combined computational and experimental approach to improve the assessment of mitral regurgitation by echocardiography.

    Science.gov (United States)

    Sonntag, Simon J; Li, Wei; Becker, Michael; Kaestner, Wiebke; Büsen, Martin R; Marx, Nikolaus; Merhof, Dorit; Steinseifer, Ulrich

    2014-05-01

    Mitral regurgitation (MR) is one of the most frequent valvular heart diseases. To assess MR severity, color Doppler imaging (CDI) is the clinical standard. However, inadequate reliability, poor reproducibility and heavy user-dependence are known limitations. A novel approach combining computational and experimental methods is currently under development aiming to improve the quantification. A flow chamber for a circulatory flow loop was developed. Three different orifices were used to mimic variations of MR. The flow field was recorded simultaneously by a 2D Doppler ultrasound transducer and Particle Image Velocimetry (PIV). Computational Fluid Dynamics (CFD) simulations were conducted using the same geometry and boundary conditions. The resulting computed velocity field was used to simulate synthetic Doppler signals. Comparison between PIV and CFD shows a high level of agreement. The simulated CDI exhibits the same characteristics as the recorded color Doppler images. The feasibility of the proposed combination of experimental and computational methods for the investigation of MR is shown and the numerical methods are successfully validated against the experiments. Furthermore, it is discussed how the approach can be used in the long run as a platform to improve the assessment of MR quantification.

  9. Improvement of visual debugging tool. Shortening the elapsed time for getting data and adding new functions to compare/combine a set of visualized data

    International Nuclear Information System (INIS)

    Matsuda, Katsuyuki; Takemiya, Hiroshi

    2001-03-01

    The visual debugging tool 'vdebug', designed for debugging programs for scientific computing, has been improved in two respects: (1) shortening the elapsed time required for getting appropriate data to visualize; (2) adding new functions that enable comparing and/or combining sets of visualized data originating from two or more different programs. As for shortening the elapsed time for getting data, the improved version of 'vdebug' achieved a more than hundred-fold reduction in elapsed time with dbx and pdbx on the SX-4, and a more than ten-fold reduction with ndb on the SR2201. As for the new functions to compare/combine visualized data, it was confirmed that we could easily check the consistency between the computational results obtained at each calculational step on two different computers: SP and ONYX. In this report, we illustrate with an example how the tool 'vdebug' has been improved. (author)

  10. System approach as a tool for optimization of the dismantling technological process of NPP decommissioning

    International Nuclear Information System (INIS)

    Bylkin, B.K.; Shpitser, V.Ya.

    1994-01-01

    The concept of NPP unit decommissioning has been considered, with special attention paid to the stage of dismantling NPP unit equipment. Employing a systematic approach as a tool for optimizing dismantling processes permits formalizing manipulations with certified quality indices and allows an objective assessment of the dismantling technology level attained during design, as compared with the basic one. It seems appropriate to develop a basic project of NPP unit decommissioning as a technical means of planning, predicting and evaluating ecological and social aftereffects.

  11. A novel combined interventional radiologic and hepatobiliary surgical approach to a complex traumatic hilar biliary stricture

    Directory of Open Access Journals (Sweden)

    Rachel E. NeMoyer

    Full Text Available Introduction: Benign strictures of the biliary system are challenging and uncommon conditions requiring a multidisciplinary team for appropriate management. Presentation of case: The patient is a 32-year-old male who developed a hilar stricture as a sequela of a gunshot wound. Due to the complex nature of the stricture and scarring at the porta hepatis, a combined interventional radiologic and surgical approach was carried out to reach the hilum of the right and left hepatic ducts. The location of the stricture was found intraoperatively by ultrasound guidance, using a balloon-tipped catheter placed under fluoroscopy in the interventional radiology suite prior to surgery. This allowed the surgeons to select the line of parenchymal transection for best visualization of the stricture. A left hepatectomy was performed, the internal stent located, and the right hepatic duct opened tangentially to allow a side-to-side Roux-en-Y hepaticojejunostomy (a Puestow-like anastomosis). Discussion: Injury to the intrahepatic biliary ductal confluence is rarely fatal; however, the associated injuries lead to severe morbidity, as seen in this example. Management of these injuries poses a considerable challenge to the surgeon and treating physicians. Conclusion: Here we describe an innovative multidisciplinary approach to the repair of this rare injury. Keywords: Combined approach, Interventional radiology, Hepatobiliary surgery, Complex traumatic hilar biliary stricture, Case report

  12. A combined data mining approach using rough set theory and case-based reasoning in medical datasets

    Directory of Open Access Journals (Sweden)

    Mohammad Taghi Rezvan

    2014-06-01

    Full Text Available Case-based reasoning (CBR) is the process of solving new cases by retrieving the most relevant ones from an existing knowledge base. Since irrelevant or redundant features not only remarkably increase memory requirements but also increase the time complexity of case retrieval, reducing the number of dimensions is an issue worth considering. This paper uses rough set theory (RST) to reduce the number of dimensions in a CBR classifier with the aim of increasing accuracy and efficiency. CBR exploits a distance measure based on the co-occurrence of categorical data to assess the similarity of cases. This distance is based on the proportional distribution of the different categorical values of features, and the weight used for a feature is the average of the co-occurrence values of the features. The combination of RST and CBR has been applied to the real categorical datasets Wisconsin Breast Cancer, Lymphography, and Primary cancer. The 5-fold cross-validation method is used to evaluate the performance of the proposed approach. The results show that this combined approach lowers computational costs and improves performance metrics, including accuracy and interpretability, compared with other approaches developed in the literature.
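    The retrieval step of such a CBR classifier can be sketched as follows. The distance here is a simple weighted feature-mismatch count, and the cases are invented; the paper's co-occurrence-based distance and the RST feature reduction are more elaborate and not reproduced:

```python
# Minimal CBR retrieval sketch for categorical cases: return the stored
# case nearest to a query. Distance is a weighted count of mismatched
# feature values; feature names, weights and cases are illustrative.

def distance(case_a, case_b, weights):
    """Weighted number of features on which the two cases disagree."""
    return sum(w for f, w in weights.items() if case_a[f] != case_b[f])

def retrieve(query, case_base, weights):
    """Nearest-neighbour retrieval over the case base."""
    return min(case_base, key=lambda c: distance(query, c["features"], weights))

case_base = [
    {"features": {"clump": "high", "mitoses": "low"}, "label": "benign"},
    {"features": {"clump": "high", "mitoses": "high"}, "label": "malignant"},
]
weights = {"clump": 0.4, "mitoses": 0.6}
query = {"clump": "high", "mitoses": "high"}

print(retrieve(query, case_base, weights)["label"])  # malignant
```

    Feature reduction such as RST then simply shrinks the `weights` dictionary to the selected features, which is what cuts both memory use and retrieval time.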

  13. Prognostic factors in invasive bladder carcinoma treated by combined modality protocol (organ-sparing approach)

    International Nuclear Information System (INIS)

    Matos, Tadeja; Cufer, Tanja; Cervek, Jozica; Borstnar, Simona; Kragelj, Borut; Zumer-Pregelj, Mirjana

    2000-01-01

    Purpose: The results of bladder sparing approach for the treatment of muscle-invasive bladder cancer, using a combination of transurethral resection (TUR), chemotherapy, and radiotherapy, are encouraging. The survival of patients treated by this method is similar to the survival of patients treated by radical cystectomy. The aim of our study was to find out which pretreatment characteristics influence the survival of patients treated by organ sparing approach that would enable us to identify the patients most suitable for this type of treatment. Methods and Materials: The prognostic value of different factors, such as age, gender, performance status, hemoglobin level, clinical stage, histologic grade, presence of obstructive uropathy, and completeness of TUR, has been studied in 105 patients with invasive bladder cancer, who received a bladder sparing treatment in the period from 1988 to 1995. They were treated with a combination of TUR, followed by 2-4 cycles of methotrexate, cisplatinum, and vinblastine polychemotherapy. In complete responders the treatment was completed by radiotherapy (50 Gy to the bladder and 40 Gy to the regional lymph nodes), whereas nonresponders underwent cystectomy whenever feasible. Results: Our study has confirmed an independent prognostic value of performance status, histologic grade, and obstructive uropathy, for the disease-specific survival (DSS) of bladder cancer patients treated by a conservative approach. We believe that performance status best reflects the extent of disease and exerts significant influence on the extent and course of treatment, while obstructive uropathy is a good indicator of local spread of the disease, better than clinical T-stage. Our finding that histologic grade is one of the strongest prognostic factors shows that tumor biology also is a very important prognostic factor in patients treated by conservative approach. Conclusion: Patients with muscle-invasive bladder cancer who are most likely to benefit

  14. An Interactive Simulation Tool for Production Planning in Bacon Factories

    DEFF Research Database (Denmark)

    Nielsen, Jens Frederik Dalsgaard; Nielsen, Kirsten Mølgaard

    1994-01-01

    The paper describes an interactive simulation tool for production planning in bacon factories. The main aim of the tool is to make it possible to combine the production plans of all parts of the factory…

  15. Multi-UAV Flight using Virtual Structure Combined with Behavioral Approach

    Directory of Open Access Journals (Sweden)

    Kownacki Cezary

    2016-06-01

    Full Text Available Implementations of multi-UAV systems can be divided mainly into two different approaches: a centralised system that synchronises the positions of each vehicle through a ground station, and an autonomous system based on decentralised control, which offers more flexibility and independence. Decentralisation of multi-UAV control entails the need for information sharing between all vehicles, which in some cases could be problematic due to the significant amount of data to be sent over the wireless network. To improve the reliability and throughput of information sharing inside a formation of UAVs, this paper proposes an approach that combines a virtual structure having a leader with two flocking behaviours. Each UAV is assigned a different virtual migration point referenced to the leader's position, which is simultaneously the origin of a formation reference frame; together, all migration points create a virtual rigid structure. Each vehicle uses the local behaviours of cohesion and repulsion, respectively, to track its assigned point in the structure and to avoid a collision with the previous UAV in the structure. To calculate the parameters of the local behaviours, each UAV needs the position and attitude of the leader, to define the formation reference frame, and the actual position of the previous UAV in the structure. Hence, information sharing can be based on a chain of local peer-to-peer communication between consecutive vehicles in the structure, in which the information about the leader is sequentially transmitted from one UAV to another. Numerical simulations were prepared and carried out to verify the effectiveness of the presented approach. Trajectories recorded during those simulations show collective, coherent and collision-free flights of a formation of five UAVs.
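    The scheme can be sketched in a few lines of kinematic 2-D code: a migration point fixed in the leader's reference frame serves as the cohesion target, plus a repulsion term against the preceding UAV. Gains, offsets and positions are illustrative, not the paper's actual controller:

```python
# 2-D kinematic sketch of the virtual-structure-plus-behaviours scheme:
# each UAV tracks a migration point fixed in the leader's frame
# (cohesion) and keeps clear of the preceding UAV (repulsion).
# All gains and geometry below are made-up illustrative values.
import math

def migration_point(leader_pos, leader_heading, offset):
    """Rotate the UAV's structure offset into the world frame."""
    ox, oy = offset
    c, s = math.cos(leader_heading), math.sin(leader_heading)
    return (leader_pos[0] + c * ox - s * oy, leader_pos[1] + s * ox + c * oy)

def velocity_command(pos, target, prev_uav, k_coh=1.0, k_rep=2.0, safe_dist=5.0):
    """Cohesion toward the migration point plus repulsion from the previous UAV."""
    vx = k_coh * (target[0] - pos[0])
    vy = k_coh * (target[1] - pos[1])
    dx, dy = pos[0] - prev_uav[0], pos[1] - prev_uav[1]
    d = math.hypot(dx, dy)
    if 0.0 < d < safe_dist:  # push away only when closer than the safe distance
        vx += k_rep * (safe_dist - d) * dx / d
        vy += k_rep * (safe_dist - d) * dy / d
    return vx, vy

# A UAV 10 m behind a leader at (100, 50) heading north (pi/2 rad).
target = migration_point((100.0, 50.0), math.pi / 2, (-10.0, 0.0))
cmd = velocity_command((98.0, 38.0), target, (99.0, 44.0))
```

    Because each command needs only the leader's pose and the previous UAV's position, the chain of peer-to-peer links described above is sufficient to drive every vehicle.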

  16. Modified Lip Repositioning with Esthetic Crown Lengthening: A Combined Approach to Treating Excessive Gingival Display.

    Science.gov (United States)

    Sánchez, Isis M; Gaud-Quintana, Sadja; Stern, Jacob K

    Lip repositioning surgery to address excessive gingival display induced by different etiologies has received major attention recently. Several techniques and variations have been reported, including myotomy or repositioning of the levator labii superioris muscle, Le Fort impaction, maxillary gingivectomies, botulinum toxin injections, and lip stabilization. This study reports a case of excessive gingival display treated by a modified combined approach. A 25-year-old woman with a 4- to 8-mm gingival display when smiling caused by a combination of short clinical crowns induced by an altered passive eruption and hypermobility of the upper lip underwent a staged esthetic crown-lengthening procedure followed by a modified lip repositioning technique. A description of the technique and a comparison with other modes of therapy is discussed. This modified approach for treating the hypermobile lip included a bilateral removal of a partial-thickness strip of mucosa from the maxillary buccal vestibule without severing the muscle, leaving the midline frenum intact and suturing the lip mucosa to the mucogingival line. The narrower vestibule and increased tooth length resulted in a symmetric and pleasing gingival display when smiling that remained stable over time. With proper diagnosis and sequence of therapy, modified lip repositioning surgery combined with esthetic crown lengthening can be used predictably to treat excessive gingival display and enhance smile esthetics.

  17. Gas fired combined cycle plant in Singapore: energy use, GWP and cost - a life cycle approach

    International Nuclear Information System (INIS)

    Kannan, R.; Leong, K.C.; Osman, Ramli; Ho, H.K.; Tso, C.P.

    2005-01-01

    A life cycle assessment was performed to quantify the non-renewable (fossil) energy use and global warming potential (GWP) of electricity generation from a typical gas fired combined cycle power plant in Singapore. The cost of electricity generation was estimated using a life cycle cost analysis (LCCA) tool. The life cycle assessment (LCA) of a 367.5 MW gas fired combined cycle power plant operating in Singapore revealed that hidden processes consume about 8% additional energy on top of the fuel embedded energy, and that the hidden GWP is about 18%. The natural gas consumed during the operational phase accounted for 82% of the life cycle cost of electricity generation. An empirical relation between plant efficiency and life cycle energy use and GWP has been established, in addition to a scenario for electricity cost with varying gas prices and plant efficiency.

  18. A goal-based approach for qualification of new technologies: Foundations, tool support, and industrial validation

    International Nuclear Information System (INIS)

    Sabetzadeh, Mehrdad; Falessi, Davide; Briand, Lionel; Di Alesio, Stefano

    2013-01-01

    New technologies typically involve innovative aspects that are not addressed by the existing normative standards and hence are not assessable through common certification procedures. To ensure that new technologies can be implemented in a safe and reliable manner, a specific kind of assessment is performed, which in many industries, e.g., the energy sector, is known as Technology Qualification (TQ). TQ aims at demonstrating with an acceptable level of confidence that a new technology will function within specified limits. Expert opinion plays an important role in TQ, both to identify the safety and reliability evidence that needs to be developed and to interpret the evidence provided. Since there are often multiple experts involved in TQ, it is crucial to apply a structured process for eliciting expert opinions, and to use this information systematically when analyzing the satisfaction of the technology's safety and reliability objectives. In this paper, we present a goal-based approach for TQ. Our approach enables analysts to quantitatively reason about the satisfaction of the technology's overall goals and further to identify the aspects that must be improved to increase goal satisfaction. The approach is founded on three main components: goal models, expert elicitation, and probabilistic simulation. We describe a tool, named Modus, that we have developed in support of our approach. We provide an extensive empirical validation of our approach through two industrial case studies and a survey
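    The probabilistic-simulation component can be illustrated with a small Monte Carlo sketch: expert estimates for leaf goals are sampled and propagated up an AND-decomposed goal tree. The goal structure, triangular distributions and min-aggregation rule are illustrative assumptions, not the Modus tool's actual model:

```python
# Monte Carlo sketch of goal-satisfaction propagation: each leaf goal's
# satisfaction is elicited as a (low, mode, high) triangular estimate,
# sampled, and combined with min() for an AND-decomposed top goal.
# Structure, numbers and aggregation rule are illustrative assumptions.
import random

def simulate_top_goal(leaf_estimates, runs=10000, seed=42):
    """Estimate top-goal satisfaction for an AND-decomposition of leaves."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(runs):
        samples = [rng.triangular(lo, hi, mode) for lo, mode, hi in leaf_estimates]
        total += min(samples)  # the weakest sub-goal dominates
    return total / runs

# Expert-elicited (low, mode, high) satisfaction for three leaf goals.
estimates = [(0.6, 0.8, 0.95), (0.7, 0.85, 0.9), (0.5, 0.75, 0.9)]
score = simulate_top_goal(estimates)
```

    Rerunning the simulation with one leaf's distribution shifted upward shows how much that sub-goal contributes, which is the quantitative "what must be improved" analysis the approach aims at.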

  19. ED-WAVE tool design approach: Case of a textile wastewater treatment plant in Blantyre, Malawi

    Science.gov (United States)

    Chipofya, V.; Kraslawski, A.; Avramenko, Y.

    The ED-WAVE tool is a PC-based package for imparting training on wastewater treatment technologies. The system consists of four modules, viz. Reference Library, Process Builder, Case Study Manager, and Treatment Adviser. The principles of case-based design and case-based reasoning as applied in the ED-WAVE tool are used in this paper to evaluate the design approach of the wastewater treatment plant at the Mapeto David Whitehead & Sons (MDW&S) textile and garments factory, Blantyre, Malawi. The case compared with MDW&S in the ED-WAVE tool is Textile Case 4 in Sri Lanka (2003). Equalisation, coagulation, and rotating biological contactors form the sequence of treatment units at Textile Case 4 in Sri Lanka; screening, oxidation ditches, and sedimentation form the sequence of treatment units at the MDW&S textile and garments factory. The study suggests that aerobic biological treatment is necessary in the treatment of wastewater from a textile and garments factory. MDW&S incorporates a sedimentation process, which is necessary for the removal of settleable matter before the effluent is discharged to the municipal wastewater treatment plant. The study confirmed the practical use of the ED-WAVE tool in the design of wastewater treatment systems, where, after encountering a new situation, already collected decision scenarios (cases) are invoked and modified in order to arrive at a particular design alternative. What is necessary, however, is to appropriately modify the case retrieved through the Case Study Manager in order to come up with a design appropriate to the local situation, taking into account technical, socio-economic and environmental aspects.

  20. Attitudes Toward Combining Psychological, Mind-Body Therapies and Nutritional Approaches for the Enhancement of Mood.

    Science.gov (United States)

    Lores, Taryn Jade; Henke, Miriam; Chur-Hansen, Anna

    2016-01-01

    Context • Interest has been rising in the use of complementary and alternative medicine (CAM) for the promotion of health and treatment of disease. To date, the majority of CAM research has focused on exploring the demographic characteristics, attitudes, and motivations of CAM users and on the efficacy of different therapies and products. Less is known with respect to the psychological characteristics of people who use CAM. Previous research has not investigated the usefulness of integrating mind-body therapies with natural products in a combined mood intervention. Objective • The study intended to investigate attitudes toward a proposed new approach to the treatment of mood, one that integrates psychological mind-body therapies and natural nutritional products. Design • Participants completed an online survey covering demographics, personality traits, locus of control, use of CAM, attitudes toward the proposed psychonutritional approach, and mood. Setting • This study was conducted at the University of Adelaide School of Psychology (Adelaide, SA, Australia). Participants • Participants were 333 members of the Australian general public, who were recruited online via the social-media platform Facebook. The majority were women (83.2%), aged between 18 and 81 y. Outcome Measures • Measures included the Multidimensional Health Locus of Control Scale Form B, the Ten-Item Personality Inventory, and the Depression, Anxiety and Stress Scale. Results • Participants were positive about the proposed approach and were likely to try it to enhance their moods. The likelihood of use of the combined approach was significantly higher in the female participants and was associated with higher levels of the personality trait openness and an internal health locus of control, after controlling for all other variables. Conclusions • Interest exists for an intervention for mood that incorporates both psychological and nutritional approaches. Further research into the

  1. Application of DETECTER, an evolutionary genomic tool to analyze genetic variation, to the cystic fibrosis gene family

    Directory of Open Access Journals (Sweden)

    De Kee Danny W

    2006-03-01

    Full Text Available Abstract Background The medical community requires computational tools that distinguish missense genetic differences having phenotypic impact within the vast number of sense mutations that do not. Tools that do this will become increasingly important for those seeking to use human genome sequence data to predict disease, make prognoses, and customize therapy to individual patients. Results An approach, termed DETECTER, is proposed to identify sites in a protein sequence where amino acid replacements are likely to have a significant effect on phenotype, including causing genetic disease. This approach uses a model-dependent tool to estimate the normalized replacement rate at individual sites in a protein sequence, based on a history of those sites extracted from an evolutionary analysis of the corresponding protein family. This tool identifies sites that have higher-than-average, average, or lower-than-average rates of change in the lineage leading to the sequence in the population of interest. The rates are then combined with sequence data to determine the likelihoods that particular amino acids were present at individual sites in the evolutionary history of the gene family. These likelihoods are used to predict whether any specific amino acid replacements, if introduced at the site in a modern human population, would have a significant impact on fitness. The DETECTER tool is used to analyze the cystic fibrosis transmembrane conductance regulator (CFTR) gene family. Conclusion In this system, DETECTER retrodicts amino acid replacements associated with the cystic fibrosis disease with greater accuracy than alternative approaches. While this result validates this approach for this particular family of proteins only, the approach may be applicable to the analysis of polymorphisms generally, including SNPs in a human population.

  2. Assessment of multi-version NPP I and C systems safety. Metric-based approach, technique and tool

    International Nuclear Information System (INIS)

    Kharchenko, Vyacheslav; Volkovoy, Andrey; Bakhmach, Eugenii; Siora, Alexander; Duzhyi, Vyacheslav

    2011-01-01

    The challenges related to the problem of assessing the actual diversity level and evaluating the safety of diversity-oriented NPP I and C systems are analyzed. There are risks of inaccurate assessment and problems of insufficiently decreasing the probability of common cause failures (CCFs). The CCF probability of safety-critical systems may be essentially decreased by applying several different types of diversity (multi-diversity). Different diversity types of FPGA-based NPP I and C systems, the general approach, and the stages of diversity and safety assessment as a whole are described. Objectives of the report are: (a) analysis of the challenges caused by use of the diversity approach in NPP I and C systems in the context of FPGA and other modern technologies; (b) development of a multi-version NPP I and C systems assessment technique and tool based on a check-list and metric-oriented approach; (c) a case study of the technique: assessment of a multi-version FPGA-based NPP I and C system developed using the Radiy TM Platform. (author)

  3. A combined energetic and economic approach for the sustainable design of geothermal plants

    International Nuclear Information System (INIS)

    Franco, Alessandro; Vaccaro, Maurizio

    2014-01-01

    Highlights: • Exploitation of medium to low temperature geothermal sources: ORC power plants. • Integrated energetic and economic approach for the analysis of geothermal power plants. • A brief overview of the cost items of geothermal power plants. • Analysis of specific cost of geothermal power plants based on the method proposed. • Analysis of sustainability of geothermal energy systems based on resource durability. - Abstract: The perspectives for future development of geothermal power plants, mainly small plants exploiting medium–low temperature reservoirs, are discussed and analyzed in the present paper. Despite general interest in new power plants and recognized investment in this sector, few new installations are being built: the apparent advantage of a zero-cost energy source is offset by high drilling and installation costs. A key element in the design of a geothermal plant for a medium temperature geothermal source is the definition of the power of the plant (size): this is important in order to define not only the economic plan but also the durability of the reservoir. Considering that the development of the geothermal industry cannot be driven by an economic perspective alone, the authors propose a method for joining energetic and economic approaches. The result of the combined energetic and economic analysis is particularly interesting in the case of Organic Rankine Cycle (ORC) power plants, in order to define a suitable, optimal size and to maximize resource durability. The method is illustrated with reference to some particular case studies, showing that the sustainability of small geothermal plants can be achieved only if the search for more economical solutions is combined with efforts to increase efficiency.

  4. Tools for the automation of large control systems

    CERN Document Server

    Gaspar, Clara

    2005-01-01

    The new LHC experiments at CERN will have very large numbers of channels to operate. In order to be able to configure and monitor such large systems, a high degree of parallelism is necessary. The control system is built as a hierarchy of sub-systems distributed over several computers. A toolkit – SMI++, combining two approaches, finite state machines and rule-based programming, allows for the description of the various sub-systems as decentralized deciding entities, reacting in real-time to changes in the system, thus providing for the automation of standard procedures and for the automatic recovery from error conditions in a hierarchical fashion. In this paper we will describe the principles and features of SMI++ as well as its integration with an industrial SCADA tool for use by the LHC experiments, and we will try to show that such tools can provide a very convenient mechanism for the automation of large scale, high complexity applications.
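The hierarchical finite-state-machine idea can be sketched in a few lines. This is not SMI++ (its State Manager Language and rule syntax are not reproduced here), just a hypothetical illustration of how a parent control unit derives its own state from its children and applies a recovery rule when one of them reports an error.

```python
class Device:
    """Leaf FSM: holds a state and notifies listeners on every transition."""
    def __init__(self, name, state="OFF"):
        self.name, self.state = name, state
        self.listeners = []
    def set_state(self, state):
        self.state = state
        for cb in self.listeners:
            cb()

class ControlUnit:
    """Parent node: derives its own state from children and issues commands."""
    def __init__(self, children):
        self.children = children
        for c in children:
            c.listeners.append(self.evaluate)
        self.state = "UNKNOWN"
        self.evaluate()
    def evaluate(self):
        states = {c.state for c in self.children}
        if states == {"ON"}:
            self.state = "READY"
        elif "ERROR" in states:
            self.state = "RECOVERING"
            for c in self.children:          # rule: reset any child in ERROR
                if c.state == "ERROR":
                    c.set_state("OFF")
        else:
            self.state = "NOT_READY"

hv, daq = Device("HV"), Device("DAQ")
top = ControlUnit([hv, daq])
hv.set_state("ON"); daq.set_state("ON")
state_after_on = top.state               # "READY": all children are ON
daq.set_state("ERROR")                   # triggers the recovery rule
print(state_after_on, "->", top.state, "; DAQ reset to", daq.state)
```

Real deployments distribute these deciding entities over many machines; here the callback chain stands in for that message passing.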

  5. Computer-aided translation tools

    DEFF Research Database (Denmark)

    Christensen, Tina Paulsen; Schjoldager, Anne

    2016-01-01

    The paper reports on a questionnaire survey from 2013 of the uptake and use of computer-aided translation (CAT) tools by Danish translation service providers (TSPs) and discusses how these tools appear to have impacted on the Danish translation industry. According to our results, the uptake in Denmark is rather high in general, but limited in the case of machine translation (MT) tools: While most TSPs use translation-memory (TM) software, often in combination with a terminology management system (TMS), only very few have implemented MT, which is criticised for its low quality output, especially...

  6. Combined acute ecotoxicity of malathion and deltamethrin to Daphnia magna (Crustacea, Cladocera): comparison of different data analysis approaches.

    Science.gov (United States)

    Toumi, Héla; Boumaiza, Moncef; Millet, Maurice; Radetski, Claudemir Marcos; Camara, Baba Issa; Felten, Vincent; Masfaraud, Jean-François; Férard, Jean-François

    2018-04-19

    We studied the combined acute effect (i.e., after 48 h) of deltamethrin (a pyrethroid insecticide) and malathion (an organophosphate insecticide) on Daphnia magna. Two approaches were used to examine the potential interaction effects of eight mixtures of deltamethrin and malathion: (i) calculation of the mixture toxicity index (MTI) and safety factor index (SFI) and (ii) response surface methodology coupled with an isobole-based statistical model (using a generalized linear model). According to the calculation of MTI and SFI, one tested mixture was found additive while the two other tested mixtures were found non-additive (MTI) or antagonistic (SFI), but these differences between index responses are only due to differences in terminology related to the two indices. Through the surface response approach and isobologram analysis, we concluded that there was a significant antagonistic effect of the binary mixtures of deltamethrin and malathion on D. magna immobilization after 48 h of exposure. Index approaches and the surface response approach are complementary. Calculation of the mixture toxicity index and safety factor index identifies the type of interaction for each tested mixture individually, while the surface response approach with isobologram analysis integrates all the data, providing a global outcome about the type of interactive effect. Only the surface response approach and isobologram analysis allowed the statistical assessment of the ecotoxicological interaction. Nevertheless, we recommend the use of both approaches (i) to identify the combined effects of contaminants and (ii) to improve risk assessment and environmental management.
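The additivity reasoning behind indices such as the MTI can be illustrated with a toxic-unit sketch. The EC50s and mixture concentrations below are invented, and this is a simplified concentration-addition check rather than the paper's MTI or SFI formulas: each component's dose is expressed in toxic units TU = concentration / EC50, and concentration addition predicts a 50% effect when the TUs sum to 1; a mixture needing a larger sum indicates antagonism, a smaller sum synergism.

```python
def toxic_units(conc, ec50):
    """Express a component concentration relative to its single-substance EC50."""
    return conc / ec50

# Hypothetical single-substance EC50s (ug/L) and a mixture observed to
# immobilise 50% of the daphnids after 48 h
ec50 = {"deltamethrin": 0.6, "malathion": 2.0}
mixture_at_ec50 = {"deltamethrin": 0.45, "malathion": 1.2}

tu_sum = sum(toxic_units(c, ec50[s]) for s, c in mixture_at_ec50.items())
if abs(tu_sum - 1) < 0.2:
    verdict = "roughly additive"
elif tu_sum > 1:
    verdict = "antagonistic (mixture less toxic than predicted)"
else:
    verdict = "synergistic (mixture more toxic than predicted)"
print(f"sum of toxic units = {tu_sum:.2f} -> {verdict}")
```

With these invented numbers the mixture requires 1.35 toxic units for a 50% effect, i.e. the antagonistic pattern the abstract reports.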

  7. Tools for voltage stability analysis, including a probabilistic approach

    Energy Technology Data Exchange (ETDEWEB)

    Vieira Filho, X; Martins, N; Bianco, A; Pinto, H J.C.P. [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil); Pereira, M V.F. [Power System Research (PSR), Inc., Rio de Janeiro, RJ (Brazil); Gomes, P; Santos, M.G. dos [ELETROBRAS, Rio de Janeiro, RJ (Brazil)

    1994-12-31

    This paper reviews some voltage stability analysis tools that are being used or envisioned for expansion and operational planning studies in the Brazilian system, as well as their applications. The paper also shows that deterministic tools can be linked together in a probabilistic framework, so as to provide complementary help to the analyst in choosing the most adequate operation strategies, or the best planning solutions for a given system. (author) 43 refs., 8 figs., 8 tabs.

  8. Real Analysis A Historical Approach

    CERN Document Server

    Stahl, Saul

    2011-01-01

    A provocative look at the tools and history of real analysis This new edition of Real Analysis: A Historical Approach continues to serve as an interesting read for students of analysis. Combining historical coverage with a superb introductory treatment, this book helps readers easily make the transition from concrete to abstract ideas. The book begins with an exciting sampling of classic and famous problems first posed by some of the greatest mathematicians of all time. Archimedes, Fermat, Newton, and Euler are each summoned in turn, illuminating the utility of infinite, power, and trigonome

  9. Estimation of raw material performance in mammalian cell culture using near infrared spectra combined with chemometrics approaches.

    Science.gov (United States)

    Lee, Hae Woo; Christie, Andrew; Liu, Jun Jay; Yoon, Seongkyu

    2012-01-01

    Understanding variability in raw materials and their impacts on product quality is of critical importance in the biopharmaceutical manufacturing processes. For this purpose, several spectroscopic techniques have been studied for raw material characterization, providing fast and nondestructive ways to measure quality of raw materials. However, investigations of correlation between spectra of raw materials and cell culture performance have been scarce due to their complexity and uncertainty. In this study, near-infrared spectra and bioassays of multiple soy hydrolysate lots manufactured by different vendors were analyzed using chemometrics approaches in order to address variability of raw materials as well as correlation between raw material properties and corresponding cell culture performance. Principal component analysis revealed that near-infrared spectra of different soy lots contain enough physicochemical information about soy hydrolysates to allow identification of lot-to-lot variability as well as vendor-to-vendor differences. The identified compositional variability was further analyzed in order to estimate cell growth and protein production of two mammalian cell lines under the condition of varying soy dosages using partial least square regression combined with optimal variable selection. The performance of the resulting models demonstrates the potential of near-infrared spectroscopy as a robust lot selection tool for raw materials while providing a biological link between chemical composition of raw materials and cell culture performance. Copyright © 2012 American Institute of Chemical Engineers (AIChE).
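The regression step can be sketched on synthetic data. The code below implements a minimal one-response PLS (PLS1) via the NIPALS algorithm in NumPy, as a bare-bones stand-in for the chemometrics software used in the study and without the optimal variable selection step; the "spectra" and the cell-culture response are simulated from two latent factors.

```python
import numpy as np

rng = np.random.default_rng(0)

def pls1_fit(X, y, n_components=2):
    """Minimal PLS1 via NIPALS; returns regression coefficients for centred data."""
    X = X - X.mean(axis=0)
    y = y - y.mean()
    W, P, Q = [], [], []
    for _ in range(n_components):
        w = X.T @ y                      # weight: covariance direction
        w /= np.linalg.norm(w)
        t = X @ w                        # score
        p = X.T @ t / (t @ t)            # X loading
        q = (y @ t) / (t @ t)            # y loading
        X = X - np.outer(t, p)           # deflate X
        y = y - q * t                    # deflate y
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    return W @ np.linalg.pinv(P.T @ W) @ Q   # coefficients in original X space

# Synthetic "spectra": 40 lots x 50 wavelengths, response driven by two latent factors
latent = rng.normal(size=(40, 2))
loadings = rng.normal(size=(2, 50))
X = latent @ loadings + 0.05 * rng.normal(size=(40, 50))
y = latent @ np.array([1.5, -0.7]) + 0.05 * rng.normal(size=40)

beta = pls1_fit(X, y, n_components=2)
pred = (X - X.mean(axis=0)) @ beta + y.mean()
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"in-sample R^2 with 2 latent variables: {r2:.3f}")
```

In practice the number of latent variables would be chosen by cross-validation and the model validated on held-out lots before any use for lot selection.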

  10. Neutral current Drell-Yan with combined QCD and electroweak corrections in the POWHEG BOX

    CERN Document Server

    Barze', Luca; Nason, Paolo; Nicrosini, Oreste; Piccinini, Fulvio; Vicini, Alessandro

    2013-01-01

    Following recent work on the combination of electroweak and strong radiative corrections to single W-boson hadroproduction in the POWHEG BOX framework, we generalize the above treatment to cover the neutral current Drell-Yan process. According to the POWHEG method, we combine both the next-to-leading order (NLO) electroweak and QED multiple photon corrections with the native NLO and Parton Shower QCD contributions. We show comparisons with the predictions of the electroweak generator HORACE, to validate the reliability and accuracy of the approach. We also present phenomenological results obtained with the new tool for physics studies at the LHC.

  11. The failure combination method: presentation, application to a simple collection of systems

    International Nuclear Information System (INIS)

    Llory, M.; Villemeur, A.

    1981-11-01

    The main advantages of a particular method for analyzing the reliability and safety of systems, the failure combination method, are presented. This is an inductive method of analysis; it makes it possible to pursue the Failure Modes and Effects Analysis (FMEA) until overall failures are obtained. In this manner, through an inductive approach, all the combinations of failure modes leading to abnormal functioning of systems are obtained. The method also makes it possible to carry out the overall study of complex interacting systems and a systematic inventory of their abnormal functioning, starting from the failure modes of the components and their combinations. It can be used from the design stage of systems onwards and is an excellent dialogue tool between the various specialists concerned with problems of safety, operation and reliability [fr

  12. Image edge detection based tool condition monitoring with morphological component analysis.

    Science.gov (United States)

    Yu, Xiaolong; Lin, Xin; Dai, Yiquan; Zhu, Kunpeng

    2017-07-01

    The measurement and monitoring of tool condition are key to product precision in automated manufacturing. To meet this need, this study proposes a novel tool wear monitoring approach based on edge detection in the monitored images. Image edge detection has been a fundamental tool for obtaining image features. The approach extracts the tool edge with morphological component analysis. Through the decomposition of the original tool wear image, the approach reduces the influence of texture and noise on edge measurement. Based on sparse representation of the target image and edge detection, the approach can accurately extract the tool wear edge with a continuous and complete contour, and is convenient for characterizing tool conditions. Compared to the celebrated algorithms developed in the literature, this approach improves the integrity and connectivity of edges, and the results have shown that it achieves better geometric accuracy and a lower error rate in the estimation of tool conditions. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
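As a greatly simplified stand-in for such a pipeline, the sketch below extracts edges from a synthetic image with a plain morphological gradient (dilation minus erosion over a 3x3 neighbourhood); the actual method first separates texture and cartoon components with morphological component analysis, which is not reproduced here.

```python
import numpy as np

def morphological_gradient(img):
    """Dilation minus erosion with a 3x3 structuring element (edge-padded)."""
    padded = np.pad(img, 1, mode="edge")
    # stack the 3x3 neighbourhood of every pixel
    neigh = np.stack([padded[i:i + img.shape[0], j:j + img.shape[1]]
                      for i in range(3) for j in range(3)])
    return neigh.max(axis=0) - neigh.min(axis=0)

# Synthetic image: dark background with a bright square standing in for a tool flank
img = np.zeros((32, 32))
img[8:24, 8:24] = 1.0

edges = morphological_gradient(img) > 0.5
print(f"{int(edges.sum())} edge pixels detected")
```

On real tool images the gradient would be swamped by texture and noise, which is exactly the problem the MCA decomposition in the paper addresses.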

  13. CytoBayesJ: software tools for Bayesian analysis of cytogenetic radiation dosimetry data.

    Science.gov (United States)

    Ainsbury, Elizabeth A; Vinnikov, Volodymyr; Puig, Pedro; Maznyk, Nataliya; Rothkamm, Kai; Lloyd, David C

    2013-08-30

    A number of authors have suggested that a Bayesian approach may be most appropriate for analysis of cytogenetic radiation dosimetry data. In the Bayesian framework, probability of an event is described in terms of previous expectations and uncertainty. Previously existing, or prior, information is used in combination with experimental results to infer probabilities or the likelihood that a hypothesis is true. It has been shown that the Bayesian approach increases both the accuracy and quality assurance of radiation dose estimates. New software entitled CytoBayesJ has been developed with the aim of bringing Bayesian analysis to cytogenetic biodosimetry laboratory practice. CytoBayesJ takes a number of Bayesian or 'Bayesian like' methods that have been proposed in the literature and presents them to the user in the form of simple user-friendly tools, including testing for the most appropriate model for distribution of chromosome aberrations and calculations of posterior probability distributions. The individual tools are described in detail and relevant examples of the use of the methods and the corresponding CytoBayesJ software tools are given. In this way, the suitability of the Bayesian approach to biological radiation dosimetry is highlighted and its wider application encouraged by providing a user-friendly software interface and manual in English and Russian. Copyright © 2013 Elsevier B.V. All rights reserved.
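The core Bayesian update behind such tools can be shown with a conjugate toy model (this is an illustration, not CytoBayesJ itself): dicentric counts per cell are modelled as Poisson with yield lambda, and a Gamma prior on lambda gives a closed-form posterior. The prior parameters and counts below are hypothetical.

```python
def posterior_yield(a, b, dicentrics, cells):
    """Gamma(a, b) prior + Poisson counts -> Gamma(a + dicentrics, b + cells).

    Returns the posterior mean and variance of the aberration yield lambda
    (dicentrics per cell)."""
    a_post, b_post = a + dicentrics, b + cells
    return a_post / b_post, a_post / b_post ** 2

# Weak hypothetical prior (a=1, b=10) updated with 25 dicentrics found in 500 cells
mean, var = posterior_yield(1, 10, 25, 500)
print(f"posterior yield: {mean:.4f} dicentrics/cell (sd {var ** 0.5:.4f})")
```

A dosimetry tool would then map the posterior on lambda through a calibration curve (e.g. a linear-quadratic dose response) to obtain a posterior dose distribution, which is where prior information pays off in accuracy.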

  14. Ceramic-bonded abrasive grinding tools

    Science.gov (United States)

    Holcombe, Jr., Cressie E.; Gorin, Andrew H.; Seals, Roland D.

    1994-01-01

    Abrasive grains such as boron carbide, silicon carbide, alumina, diamond, cubic boron nitride, and mullite are combined with a cement primarily comprised of zinc oxide and a reactive liquid setting agent and solidified into abrasive grinding tools. Such grinding tools are particularly suitable for grinding and polishing stone, such as marble and granite.

  15. Ceramic-bonded abrasive grinding tools

    Science.gov (United States)

    Holcombe, C.E. Jr.; Gorin, A.H.; Seals, R.D.

    1994-11-22

    Abrasive grains such as boron carbide, silicon carbide, alumina, diamond, cubic boron nitride, and mullite are combined with a cement primarily comprised of zinc oxide and a reactive liquid setting agent and solidified into abrasive grinding tools. Such grinding tools are particularly suitable for grinding and polishing stone, such as marble and granite.

  16. A Web tool for calculating k0-NAA uncertainties

    International Nuclear Information System (INIS)

    Younes, N.; Robouch, P.

    2003-01-01

    The calculation of uncertainty budgets is becoming a standard step in reporting analytical results. This gives rise to the need for simple, easily accessed tools to calculate uncertainty budgets. An example of such a tool is the Excel spreadsheet approach of Robouch et al. An internet application which calculates uncertainty budgets for k0-NAA is presented. The Web application has built-in 'Literature' values for standard isotopes and accepts as inputs fixed information, such as the thermal to epithermal neutron flux ratio, as well as experiment-specific data, such as the mass of the sample. The application calculates and displays intermediate uncertainties as well as the final combined uncertainty of the element concentration in the sample. The interface only requires access to a standard browser and is thus easily accessible to researchers and laboratories. This may facilitate and standardize the calculation of k0-NAA uncertainty budgets. (author)
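For a multiplicative measurement model, relative standard uncertainties combine in quadrature, which is the core of such a budget calculation. The component names and values below are illustrative placeholders, not the Web tool's built-in data.

```python
def combined_relative_uncertainty(components):
    """Root-sum-of-squares of relative uncertainties (%) for a product/quotient model."""
    return sum(u ** 2 for u in components.values()) ** 0.5

components = {          # hypothetical relative standard uncertainties, in %
    "net peak area": 1.2,
    "k0 factor": 0.8,
    "flux ratio f": 1.5,
    "detection efficiency": 2.0,
    "sample mass": 0.1,
}
u_c = combined_relative_uncertainty(components)
print(f"combined relative uncertainty: {u_c:.2f}%  (expanded, k=2: {2 * u_c:.2f}%)")
```

Displaying the squared contributions alongside the total, as the tool does for intermediate uncertainties, immediately shows which component dominates the budget.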

  17. Combining radiotherapy and immunotherapy: A revived partnership

    International Nuclear Information System (INIS)

    Demaria, Sandra; Bhardwaj, Nina; McBride, William H.; Formenti, Silvia C.

    2005-01-01

    Ionizing radiation therapy (RT) is an important local modality for the treatment of cancer. The current rationale for its use is based largely on the ability of RT to kill the cancer cells by a direct cytotoxic effect. Nevertheless, considerable evidence indicates that RT effects extend beyond the mere elimination of the more radiosensitive fraction of cancer cells present within a tumor at the time of radiation exposure. For instance, a large body of evidence is accumulating on the ability of RT to modify the tumor microenvironment and generate inflammation. This might have far-reaching consequences regarding the response of a patient to treatment, especially if radiation-induced tumor cell kill were to translate into the generation of effective antitumor immunity. Although much remains to be learned about how radiation can impact tumor immunogenicity, data from preclinical studies provide the proof of principle that different immunotherapeutic strategies can be combined with RT to enhance antitumor effects. Conversely, RT could be a useful tool to combine with immunotherapy. This article will briefly summarize what is known about the impact of RT on tumor immunity, including tumor-associated antigens, antigen-presenting cells, and effector mechanisms. In addition, the experimental evidence supporting the contention that RT can be used as a tool to induce antitumor immunity is discussed, and a new approach to radioimmunotherapy of cancer is proposed

  18. On Combining Language Models: Oracle Approach

    National Research Council Canada - National Science Library

    Hacioglu, Kadri; Ward, Wayne

    2001-01-01

    In this paper, we address the problem of combining several language models (LMs). We find that simple interpolation methods, like log-linear and linear interpolation, improve the performance but fall short of the performance of an oracle...
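The two baseline schemes mentioned, linear and log-linear interpolation, can be sketched on a toy vocabulary. The probabilities and weight below are invented; a real LM would condition on history and tune the weight on held-out data.

```python
import math

# Toy next-word distributions from two hypothetical component models
p_model_a = {"cat": 0.5, "dog": 0.3, "bird": 0.2}
p_model_b = {"cat": 0.2, "dog": 0.6, "bird": 0.2}
vocab = p_model_a.keys()

def linear(w, lam=0.6):
    """Linear interpolation: weighted arithmetic mean of probabilities."""
    return lam * p_model_a[w] + (1 - lam) * p_model_b[w]

def log_linear(w, lam=0.6):
    """Log-linear interpolation: weighted geometric mean, renormalised over vocab."""
    score = lambda v: math.exp(lam * math.log(p_model_a[v])
                               + (1 - lam) * math.log(p_model_b[v]))
    return score(w) / sum(score(v) for v in vocab)

for w in vocab:
    print(f"{w:5s}  linear {linear(w):.3f}   log-linear {log_linear(w):.3f}")
```

An oracle, by contrast, would pick per word whichever component model assigns the truth the highest probability, which is why it upper-bounds both schemes.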

  19. A Survey of Automatic Protocol Reverse Engineering Approaches, Methods, and Tools on the Inputs and Outputs View

    OpenAIRE

    Baraka D. Sija; Young-Hoon Goo; Kyu-Seok Shim; Huru Hasanova; Myung-Sup Kim

    2018-01-01

    A network protocol defines rules that control communications between two or more machines on the Internet, whereas Automatic Protocol Reverse Engineering (APRE) defines the way of extracting the structure of a network protocol without accessing its specifications. Enough knowledge on undocumented protocols is essential for security purposes, network policy implementation, and management of network resources. This paper reviews and analyzes a total of 39 approaches, methods, and tools towards ...

  20. Laparoscopic complete mesocolic excision via combined medial and cranial approaches for transverse colon cancer.

    Science.gov (United States)

    Mori, Shinichiro; Kita, Yoshiaki; Baba, Kenji; Yanagi, Masayuki; Tanabe, Kan; Uchikado, Yasuto; Kurahara, Hiroshi; Arigami, Takaaki; Uenosono, Yoshikazu; Mataki, Yuko; Okumura, Hiroshi; Nakajo, Akihiro; Maemura, Kosei; Natsugoe, Shoji

    2017-05-01

    To evaluate the safety and feasibility of laparoscopic complete mesocolic excision via combined medial and cranial approaches with three-dimensional visualization around the gastrocolic trunk and middle colic vessels for transverse colon cancer. We evaluated prospectively collected data of 30 consecutive patients who underwent laparoscopic complete mesocolic excision between January 2010 and December 2015, 6 of whom we excluded, leaving 24 for the analysis. We assessed the completeness of excision, operative data, pathological findings, length of large bowel resected, complications, length of hospital stay, and oncological outcomes. Complete mesocolic excision completeness was graded as the mesocolic and intramesocolic planes in 21 and 3 patients, respectively. Eleven, two, eight, and three patients had T1, T2, T3, and T4a tumors, respectively; none had lymph node metastases. A mean of 18.3 lymph nodes was retrieved, and a mean of 5.4 lymph nodes was retrieved around the origin of the MCV. The mean large bowel length was 21.9 cm, operative time 274 min, intraoperative blood loss 41 mL, and length of hospital stay 15 days. There were no intraoperative and two postoperative complications. Our procedure for laparoscopic complete mesocolic excision via combined medial and cranial approaches is safe and feasible for transverse colon cancer.

  1. EnergiTools(R) - a power plant performance monitoring and diagnosis tool

    International Nuclear Information System (INIS)

    Ancion, P.V.; Bastien, R.; Ringdahl, K.

    2000-01-01

    Westinghouse EnergiTools(R) is a performance diagnostic tool for power generation plants that combines the power of on-line process data acquisition with advanced diagnostics methodologies. The system uses analytical models based on thermodynamic principles combined with the knowledge of component diagnostic experts. An issue in modeling expert knowledge is to have a framework that can represent and process uncertainty in complex systems, where it is nearly impossible to build deterministic models for the effects of faults on symptoms. A methodology based on causal probabilistic graphs, more specifically on Bayesian belief networks, has been implemented in EnergiTools(R) to capture the fault-symptom relationships. The methodology estimates the likelihood of the various component failures using the fault-symptom relationships. The system also has the ability to use neural networks for processes that are difficult to model analytically. One application is the estimation of reactor power in a nuclear power plant by interpreting several plant indicators. EnergiTools(R) is used for on-line performance monitoring and diagnostics at the Vattenfall Ringhals nuclear power plants in Sweden. It has led to the diagnosis of various performance issues with plant components. Two case studies are presented. In the first case, an overestimate of the thermal power due to a faulty instrument was found, which had led to plant operation below its optimal power. The paper shows how the problem was discovered using the analytical thermodynamic calculations. The second case shows an application of EnergiTools(R) to the diagnosis of a condenser failure using causal probabilistic graphs
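The fault-symptom reasoning can be pictured with a tiny causal probabilistic graph, evaluated by brute-force enumeration. The faults, symptoms, priors, and noisy-OR style conditional probabilities below are invented for illustration and are unrelated to EnergiTools(R)'s actual models.

```python
from itertools import product

p_fault = {"fouled_condenser": 0.02, "drifting_sensor": 0.05}

def p_symptom(symptom, faults):
    """Noisy-OR: each present fault independently fails to explain the symptom."""
    base = {"low_vacuum": 0.01, "power_mismatch": 0.02}[symptom]
    lift = {("low_vacuum", "fouled_condenser"): 0.9,
            ("power_mismatch", "fouled_condenser"): 0.4,
            ("power_mismatch", "drifting_sensor"): 0.8}
    p_miss = 1 - base
    for f in faults:
        p_miss *= 1 - lift.get((symptom, f), 0.0)
    return 1 - p_miss

observed = {"low_vacuum": True, "power_mismatch": True}

# Enumerate the four fault combinations and weight each by prior x likelihood
posterior = {}
for fc, ds in product([False, True], repeat=2):
    faults = [f for f, on in zip(p_fault, (fc, ds)) if on]
    prior = ((p_fault["fouled_condenser"] if fc else 1 - p_fault["fouled_condenser"]) *
             (p_fault["drifting_sensor"] if ds else 1 - p_fault["drifting_sensor"]))
    like = 1.0
    for s, seen in observed.items():
        ps = p_symptom(s, faults)
        like *= ps if seen else 1 - ps
    posterior[(fc, ds)] = prior * like

z = sum(posterior.values())
p_condenser = sum(v for (fc, _), v in posterior.items() if fc) / z
print(f"P(fouled condenser | symptoms) = {p_condenser:.2f}")
```

Even with a 2% prior, observing both symptoms makes the condenser fault by far the most likely explanation, which mirrors the condenser-diagnosis case study in spirit.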

  2. Quantifying traces of tool use: a novel morphometric analysis of damage patterns on percussive tools.

    Directory of Open Access Journals (Sweden)

    Matthew V Caruana

    Full Text Available Percussive technology continues to play an increasingly important role in understanding the evolution of tool use. Comparing the archaeological record with extractive foraging behaviors in nonhuman primates has focused on percussive implements as a key to investigating the origins of lithic technology. Despite this, archaeological approaches towards percussive tools have been obscured by a lack of standardized methodologies. Central to this issue have been the use of qualitative, non-diagnostic techniques to identify percussive tools from archaeological contexts. Here we describe a new morphometric method for distinguishing anthropogenically-generated damage patterns on percussive tools from naturally damaged river cobbles. We employ a geomatic approach through the use of three-dimensional scanning and geographical information systems software to statistically quantify the identification process in percussive technology research. This will strengthen current technological analyses of percussive tools in archaeological frameworks and open new avenues for translating behavioral inferences of early hominins from percussive damage patterns.

  3. Quantifying Traces of Tool Use: A Novel Morphometric Analysis of Damage Patterns on Percussive Tools

    Science.gov (United States)

    Caruana, Matthew V.; Carvalho, Susana; Braun, David R.; Presnyakova, Darya; Haslam, Michael; Archer, Will; Bobe, Rene; Harris, John W. K.

    2014-01-01

    Percussive technology continues to play an increasingly important role in understanding the evolution of tool use. Comparing the archaeological record with extractive foraging behaviors in nonhuman primates has focused on percussive implements as a key to investigating the origins of lithic technology. Despite this, archaeological approaches towards percussive tools have been obscured by a lack of standardized methodologies. Central to this issue have been the use of qualitative, non-diagnostic techniques to identify percussive tools from archaeological contexts. Here we describe a new morphometric method for distinguishing anthropogenically-generated damage patterns on percussive tools from naturally damaged river cobbles. We employ a geomatic approach through the use of three-dimensional scanning and geographical information systems software to statistically quantify the identification process in percussive technology research. This will strengthen current technological analyses of percussive tools in archaeological frameworks and open new avenues for translating behavioral inferences of early hominins from percussive damage patterns. PMID:25415303

  4. Environmental, Economic and Technological Residues Management Demands: An Optimization Tool.

    Directory of Open Access Journals (Sweden)

    Marisa Soares Borges

    2012-12-01

    Full Text Available Industrial residues management is a very demanding task, since many different goals must be achieved. Combining the different approaches used by people from different staff is a very challenging activity that can misuse a residue's potential value and applicability. An interactive web-based tool to integrate different sectors and overcome residues management difficulties is presented. The system must be loaded with all data concerning the residue life cycle and, through data integration and a modeling routine, will give the best alternative as output. The wider and more complete the system data becomes, through information loaded from different segments, the more efficient residues management becomes. The user-friendly tool will encourage the participation of industries, labs and research institutions to obtain qualified information about industrial residues inventory, raw materials recovery, characteristics, treatment and alternative uses, in order to achieve residues management sustainability.

  5. Efficiency of stormwater control measures for combined sewer retrofitting under varying rain conditions: Quantifying the Three Points Approach (3PA)

    DEFF Research Database (Denmark)

    Sørup, Hjalte Jomo Danielsen; Lerer, Sara Maria; Arnbjerg-Nielsen, Karsten

    2016-01-01

    We present a method to assess and communicate the efficiency of stormwater control measures for retrofitting existing urban areas. The tool extends the Three Points Approach to quantitatively distinguish three rainfall domains: (A) rainwater resource utilisation, (B) urban stormwater drainage pip...

  6. Tool Wear Monitoring Using Time Series Analysis

    Science.gov (United States)

    Song, Dong Yeul; Ohara, Yasuhiro; Tamaki, Haruo; Suga, Masanobu

    A tool wear monitoring approach considering the nonlinear behavior of the cutting mechanism caused by tool wear and/or localized chipping is proposed, and its effectiveness is verified through cutting experiments and actual turning machining. Moreover, the variation in the surface roughness of the machined workpiece is also discussed using this approach. In this approach, the residual error between the actually measured vibration signal and the estimated signal obtained from a time series model corresponding to the dynamic model of cutting is introduced as the diagnostic feature. Consequently, it is found that the early tool wear state (i.e. flank wear under 40µm) can be monitored, and that the optimal tool exchange time and the tool wear state in actual turning machining can be judged from this change in the residual error. Moreover, the variation of surface roughness Pz in the range of 3 to 8µm can be estimated by monitoring the residual error.
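The residual-error idea above can be sketched as follows: fit an autoregressive (time series) model to vibration from a healthy cut, then flag wear when the one-step prediction residual grows. This is a minimal sketch; the synthetic signals and AR order are invented stand-ins for the paper's measured vibration data and cutting model.

```python
# Residual-based monitoring sketch: an AR model of the healthy signal
# fails to predict the wear-altered signal, so the residual grows.
import numpy as np

def fit_ar(x, order=4):
    """Least-squares AR coefficients for one-step prediction."""
    X = np.column_stack([x[i:len(x) - order + i] for i in range(order)])
    y = x[order:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def residual_rms(x, coef):
    """RMS of the one-step prediction residual under the given AR model."""
    order = len(coef)
    X = np.column_stack([x[i:len(x) - order + i] for i in range(order)])
    return float(np.sqrt(np.mean((x[order:] - X @ coef) ** 2)))

rng = np.random.default_rng(0)
t = np.arange(2000)
healthy = np.sin(0.3 * t) + 0.05 * rng.standard_normal(t.size)
# Wear adds a spectral component the healthy model cannot predict.
worn = np.sin(0.3 * t) + 0.4 * np.sin(0.9 * t) + 0.05 * rng.standard_normal(t.size)

coef = fit_ar(healthy)                 # model of the healthy cutting dynamics
print(residual_rms(healthy, coef))     # small: signal matches the model
print(residual_rms(worn, coef))        # larger: the wear signature stands out
```

Tracking this residual RMS over successive cuts, as the paper does, turns it into a feature for deciding tool exchange time.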

  7. Analytical Tools for Space Suit Design

    Science.gov (United States)

    Aitchison, Lindsay

    2011-01-01

    As indicated by the implementation of multiple small project teams within the agency, NASA is adopting a lean approach to hardware development that emphasizes quick product realization and rapid response to shifting program and agency goals. Over the past two decades, space suit design has been evolutionary in approach, with emphasis on building prototypes and then testing with the largest practical range of subjects possible. The results of these efforts show continuous improvement but make scaled design and performance predictions almost impossible with limited budgets and little time. Thus, in an effort to start changing the way NASA approaches space suit design and analysis, the Advanced Space Suit group has initiated the development of an integrated design and analysis tool. It is a multi-year, if not decadal, development effort that, when fully implemented, is envisioned to generate analysis of any given space suit architecture or, conversely, predictions of ideal space suit architectures given specific mission parameters. The master tool will exchange information to and from a set of five sub-tool groups in order to generate the desired output. The basic functions of each sub-tool group, the initial relationships between the sub-tools, and a comparison to state-of-the-art software and tools are discussed.

  8. Frequency Response Studies using Receptance Coupling Approach in High Speed Spindles

    Science.gov (United States)

    Shaik, Jakeer Hussain; Ramakotaiah, K.; Srinivas, J.

    2018-01-01

    In order to assess the stability of high speed machining, estimating the frequency response at the tool tip is of great importance. Evaluating the dynamic response of several combinations of integrated spindle-tool holder-tool units would consume a lot of time. This paper presents the coupled-field dynamic response at the tool tip for the entire integrated spindle-tool unit. The spindle unit is assumed to rest on the front and rear bearings and is investigated using Timoshenko beam theory to arrive at the receptances at different locations of the spindle-tool unit. The responses are further validated with a conventional finite element model as well as with experiments. This approach permits quick outputs without losing solution accuracy, and these methods are further used to analyze the effect of various design variables on system dynamics. The results obtained through this analysis are needed to design a better spindle unit in an attempt to reduce the frequency amplitudes at the tool tip and thus improve milling stability during the cutting process.
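The receptance coupling step can be sketched numerically. In rigid receptance coupling substructure analysis, the assembled tool-tip FRF follows G11 = H11 - H12 (H22 + Hs)^-1 H21, where H are the tool-side receptances and Hs the spindle-holder receptance at the joint. The single-degree-of-freedom substructures and all parameters below are invented for the sketch, not the paper's Timoshenko beam model.

```python
# Rigid receptance coupling of two invented SDOF substructures.
import numpy as np

def sdof_receptance(w, m, c, k):
    """Receptance X/F of a single-DOF oscillator at frequencies w [rad/s]."""
    return 1.0 / (-m * w**2 + 1j * c * w + k)

w = np.linspace(1.0, 4000.0, 2000)   # frequency grid, rad/s

# Tool side collapsed to one SDOF: direct/cross receptances coincide here.
H11 = H12 = H21 = H22 = sdof_receptance(w, m=0.5, c=20.0, k=2e6)
Hs = sdof_receptance(w, m=5.0, c=400.0, k=5e6)   # spindle-holder at the joint

# Assembled tool-tip receptance via the rigid coupling formula.
G11 = H11 - H12 * (H22 + Hs) ** -1 * H21

# Coupling to the spindle lowers the tool-tip peak amplitude in this example.
print(abs(G11).max() < abs(H11).max())
```

Synthesizing G11 this way for each candidate spindle-holder-tool combination is what avoids re-measuring or re-meshing the whole assembly every time.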

  9. Approaches of data combining for reliability assessments with taking into account the priority of data application

    International Nuclear Information System (INIS)

    Zelenyj, O.V.; Pecheritsa, A.V.

    2004-01-01

    Based upon the available experience from assessments of risk from Ukrainian NPPs' operational events, as well as on the results of the State review of PSA studies for pilot units, it should be noted that historical information on domestic NPPs' operation is not always available or used properly during the implementation of the mentioned activities. Several approaches for combining available generic and specific information for reliability parameter assessment (taking into account the priority of data application) are briefly described in the article, along with some recommendations on how to apply these approaches.
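One standard way to combine generic and plant-specific reliability data is a Bayesian conjugate update: a gamma prior encoding the generic failure rate is updated with plant-specific operating experience through the Poisson likelihood. Whether this matches the article's particular approaches is not stated, and the numbers below are purely illustrative.

```python
# Gamma-Poisson conjugate update of a failure rate (all numbers invented).

# Generic data: mean failure rate 1e-3 /h with substantial uncertainty,
# encoded as a gamma(alpha, beta) prior with mean alpha/beta.
alpha0, beta0 = 0.5, 500.0          # prior mean = 1e-3 per hour

# Plant-specific evidence: 2 failures in 10,000 operating hours.
failures, hours = 2, 10_000.0

# Conjugate update: add observed counts and exposure time.
alpha1, beta1 = alpha0 + failures, beta0 + hours
posterior_mean = alpha1 / beta1

print(posterior_mean)   # pulled from the 1e-3 prior toward the 2e-4 plant rate
```

The priority of data application then amounts to how heavily the prior is weighted (here via beta0) against the specific evidence.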

  10. Measuring combined exposure to environmental pressures in urban areas: an air quality and noise pollution assessment approach.

    Science.gov (United States)

    Vlachokostas, Ch; Achillas, Ch; Michailidou, A V; Moussiopoulos, Nu

    2012-02-01

    This study presents a methodological scheme developed to provide a combined air and noise pollution exposure assessment based on measurements from personal portable monitors. Provided that air and noise pollution are considered in a co-exposure approach, they represent a significant environmental hazard to public health. The methodology is demonstrated for the city of Thessaloniki, Greece. The results of an extensive field campaign are presented and the variations in personal exposure between modes of transport, routes, streets and transport microenvironments are evaluated. Air pollution and noise measurements were performed simultaneously along several commuting routes, during the morning and evening rush hours. Combined exposure to environmental pollutants is highlighted based on the Combined Exposure Factor (CEF) and Combined Dose and Exposure Factor (CDEF). The CDEF takes into account the potential relative uptake of each pollutant by considering the physical activities of each citizen. Rather than viewing environmental pollutants separately for planning and environmental sustainability considerations, the possibility of an easy-to-comprehend co-exposure approach based on these two indices is demonstrated. Furthermore, they provide for the first time a combined exposure assessment to these environmental pollutants for Thessaloniki and in this sense they could be of importance for local public authorities and decision makers. A considerable environmental burden for the citizens of Thessaloniki, especially for VOCs and noise pollution levels is observed. The material herein points out the importance of measuring public health stressors and the necessity of considering urban environmental pollution in a holistic way. Copyright © 2011 Elsevier Ltd. All rights reserved.

  11. Treatment of premature ejaculation: a new combined approach

    Directory of Open Access Journals (Sweden)

    Adel Kurkar

    2015-01-01

    Causes of PE differ considerably. In this paper, we compared the outcomes of two single treatment lines together with a combination of both. The combination therapy was more effective than either line alone.

  12. Combined simulation of energy and thermal management for an electric vehicle

    Energy Technology Data Exchange (ETDEWEB)

    Mohrmann, Bjoern; Jeck, Peter [Institut fuer Kraftfahrzeuge Aachen (Germany); Simon, Carsten [fortiss GmbH, Muenchen (Germany); Ungermann, Jochen [Audi AG, Ingolstadt (Germany)

    2012-11-01

    The project eperformance, which is funded by the BMBF, is conducted by project partners from RWTH Aachen, Audi, Bosch Engineering and fortiss GmbH, in order to demonstrate the concept of an electric vehicle on the basis of a holistic development approach. To support this, several simulation platforms come into use, i.e. CFD Simulation for cooling concepts, electromagnetic simulations for electric machine design, physical simulation of cooling circuits as well as vehicle mechanics and controller design. To develop an energy efficient vehicle management, some of these simulation domains have to be combined, to simulate interdependencies between for example usage of high-voltage batteries, their thermal response and the impact for controller strategies. Within the project it was decided to use the Tool TISC (TLK Inter Software Connector) to combine as well a physical model, based on Modelica/Dymola to simulate thermal behaviours of components with a longitudinal vehicle model and a controller model, both based in MATLAB/Simulink. Advantages of such a coupled simulation are the re-usability of existing models in both tools with their tool-specific benefits as well as the possibility to cluster the models on different computers. The article will explain how the combined simulation is set up and parameterized, and will show two use cases: the thermal management of the two independent battery systems of the demonstrator vehicle and the torque distribution on the three electric machines in the vehicle, depending on the drive situation and the thermal state of the machines. (orig)
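The coupled-simulation pattern described above, two models exchanging variables at a fixed coupling step, can be sketched generically. Both the lumped thermal model and the derating controller below are invented stand-ins with made-up parameters, not the project's actual Modelica/Dymola and MATLAB/Simulink models coupled via TISC.

```python
# Generic fixed-step co-simulation loop: at each step the controller reads the
# battery temperature and feeds back a power limit to the thermal model.

def thermal_step(T, power, dt, ambient=20.0, r_th=0.05, c_th=5000.0):
    """Lumped battery thermal model: heating by losses, cooling to ambient."""
    dT = (0.02 * power - (T - ambient) / r_th) / c_th
    return T + dT * dt

def controller(T, requested=30000.0, T_max=40.0):
    """Derate the requested power linearly once temperature nears the limit."""
    derate = max(0.0, min(1.0, (T_max - T) / 5.0))
    return requested * derate

T, dt = 20.0, 1.0
for _ in range(3600):                 # one simulated hour, 1 s coupling step
    power = controller(T)             # exchange 1: temperature -> controller
    T = thermal_step(T, power, dt)    # exchange 2: power limit -> thermal model

print(T)   # settles below the 40 degree limit thanks to the coupled derating
```

The point of the coupling middleware is exactly this handshake: each tool keeps its own solver and model, and only the interface variables (here temperature and power) cross the boundary each step.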

  13. Combined analgesics in (headache) pain therapy: shotgun approach or precise multi-target therapeutics?

    Directory of Open Access Journals (Sweden)

    Fiebich Bernd L

    2011-03-01

    Full Text Available Abstract Background Pain in general and headache in particular are characterized by a change in activity in brain areas involved in pain processing. The therapeutic challenge is to identify drugs with molecular targets that restore the healthy state, resulting in meaningful pain relief or even freedom from pain. Different aspects of pain perception, i.e. sensory and affective components, also explain why there is not just one single target structure for therapeutic approaches to pain. A network of brain areas ("pain matrix") are involved in pain perception and pain control. This diversification of the pain system explains why a wide range of molecularly different substances can be used in the treatment of different pain states and why in recent years more and more studies have described a superior efficacy of a precise multi-target combination therapy compared to therapy with monotherapeutics. Discussion In this article, we discuss the available literature on the effects of several fixed-dose combinations in the treatment of headaches and discuss the evidence in support of the role of combination therapy in the pharmacotherapy of pain, particularly of headaches. The scientific rationale behind multi-target combinations is the therapeutic benefit that could not be achieved by the individual constituents and that the single substances of the combinations act together additively or even multiplicatively and cooperate to achieve a completeness of the desired therapeutic effect. As an example the fixed-dose combination of acetylsalicylic acid (ASA), paracetamol (acetaminophen) and caffeine is reviewed in detail. The major advantage of using such a fixed combination is that the active ingredients act on different but distinct molecular targets and thus are able to act on more signalling cascades involved in pain than most single analgesics without adding more side effects to the therapy. 
Summary Multitarget therapeutics like combined analgesics broaden

  14. Combined analgesics in (headache) pain therapy: shotgun approach or precise multi-target therapeutics?

    Science.gov (United States)

    2011-01-01

    Background Pain in general and headache in particular are characterized by a change in activity in brain areas involved in pain processing. The therapeutic challenge is to identify drugs with molecular targets that restore the healthy state, resulting in meaningful pain relief or even freedom from pain. Different aspects of pain perception, i.e. sensory and affective components, also explain why there is not just one single target structure for therapeutic approaches to pain. A network of brain areas ("pain matrix") are involved in pain perception and pain control. This diversification of the pain system explains why a wide range of molecularly different substances can be used in the treatment of different pain states and why in recent years more and more studies have described a superior efficacy of a precise multi-target combination therapy compared to therapy with monotherapeutics. Discussion In this article, we discuss the available literature on the effects of several fixed-dose combinations in the treatment of headaches and discuss the evidence in support of the role of combination therapy in the pharmacotherapy of pain, particularly of headaches. The scientific rationale behind multi-target combinations is the therapeutic benefit that could not be achieved by the individual constituents and that the single substances of the combinations act together additively or even multiplicatively and cooperate to achieve a completeness of the desired therapeutic effect. As an example the fixed-dose combination of acetylsalicylic acid (ASA), paracetamol (acetaminophen) and caffeine is reviewed in detail. The major advantage of using such a fixed combination is that the active ingredients act on different but distinct molecular targets and thus are able to act on more signalling cascades involved in pain than most single analgesics without adding more side effects to the therapy. 
Summary Multitarget therapeutics like combined analgesics broaden the array of therapeutic

  15. Combined analgesics in (headache) pain therapy: shotgun approach or precise multi-target therapeutics?

    Science.gov (United States)

    Straube, Andreas; Aicher, Bernhard; Fiebich, Bernd L; Haag, Gunther

    2011-03-31

    Pain in general and headache in particular are characterized by a change in activity in brain areas involved in pain processing. The therapeutic challenge is to identify drugs with molecular targets that restore the healthy state, resulting in meaningful pain relief or even freedom from pain. Different aspects of pain perception, i.e. sensory and affective components, also explain why there is not just one single target structure for therapeutic approaches to pain. A network of brain areas ("pain matrix") are involved in pain perception and pain control. This diversification of the pain system explains why a wide range of molecularly different substances can be used in the treatment of different pain states and why in recent years more and more studies have described a superior efficacy of a precise multi-target combination therapy compared to therapy with monotherapeutics. In this article, we discuss the available literature on the effects of several fixed-dose combinations in the treatment of headaches and discuss the evidence in support of the role of combination therapy in the pharmacotherapy of pain, particularly of headaches. The scientific rationale behind multi-target combinations is the therapeutic benefit that could not be achieved by the individual constituents and that the single substances of the combinations act together additively or even multiplicatively and cooperate to achieve a completeness of the desired therapeutic effect.As an example the fixed-dose combination of acetylsalicylic acid (ASA), paracetamol (acetaminophen) and caffeine is reviewed in detail. The major advantage of using such a fixed combination is that the active ingredients act on different but distinct molecular targets and thus are able to act on more signalling cascades involved in pain than most single analgesics without adding more side effects to the therapy. 
Multitarget therapeutics like combined analgesics broaden the array of therapeutic options, enable the completeness

  16. Decision-Making Tool for Cost-Efficient and Environmentally Friendly Wood Mobilisation

    Directory of Open Access Journals (Sweden)

    Matevž Triplat

    2015-06-01

    Full Text Available Background and Purpose: With the development of forest management technologies, the efficiency of wood production was significantly improved, and thus the impact on forests has changed as well. The article presents a practical decision-making tool for the selection of the most suitable harvesting system, considering the given terrain as well as the expected soil conditions on harvesting sites. The decision-making tool should support cost-efficient and environmentally friendly mobilisation of wood. Materials and Methods: The presented decision-making tool is based on ground bearing capacities (the relevant environmental parameter) and nominal ground pressure (a harvesting system characteristic). Soil and terrain (slope) characteristics were taken into account for the selection of the most suitable harvesting system. A three-step methodological approach was suggested: soil and terrain conditions were defined in the first step, harvesting systems were described using wood process charts (“functiogramms”) in the second step, and in the final step ecological and technological requirements were matched. Results: To exemplify the three-step methodology, a decision-making tool was prepared for three selected harvesting systems. The proposed harvesting systems differ in technological, ecological and economic aspects, but each is limited by at least one of these aspects. Conclusions: The decision-making tool in combination with the presented wood process charts (“functiogramms”) can simplify and facilitate forest production planning, and it can also be used in case of unforeseen events, e.g. changes in soil moisture, machinery failure or insufficient current capacities. Considering the envisaged quantities and types of forest wooden assortments, it is possible to use the decision-making tool for a basic selection of the most appropriate harvesting systems. The main idea behind the suggested three-step methodological approach is that forest workers can prepare individual decision
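The matching step described above can be sketched as a simple filter: keep only the harvesting systems whose nominal ground pressure does not exceed the site's ground bearing capacity, subject to a slope limit. The systems and all thresholds below are illustrative assumptions, not the article's actual data.

```python
# Hypothetical harvesting-system catalogue for the matching sketch.
HARVESTING_SYSTEMS = [
    # (name, nominal ground pressure [kPa], max workable slope [%])
    ("harvester + forwarder", 60, 35),
    ("chainsaw + cable skidder", 45, 60),
    ("chainsaw + cable yarder", 10, 100),
]

def suitable_systems(bearing_capacity_kpa, slope_pct):
    """Return systems meeting both the ecological and terrain requirements."""
    return [name for name, pressure, max_slope in HARVESTING_SYSTEMS
            if pressure <= bearing_capacity_kpa and slope_pct <= max_slope]

# Soft soil (bearing capacity 50 kPa), gentle slope: the heavy harvester
# exceeds the soil's bearing capacity, so only cable-based options remain.
print(suitable_systems(bearing_capacity_kpa=50, slope_pct=30))
```

Because bearing capacity varies with soil moisture, re-running the same filter with updated inputs also covers the unforeseen-event case the article mentions.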

  17. Combination Drug Delivery Approaches in Metastatic Breast Cancer

    Directory of Open Access Journals (Sweden)

    Jun H. Lee

    2012-01-01

    Full Text Available Disseminated metastatic breast cancer needs aggressive treatment due to its reduced response to anticancer treatment and hence low survival and quality of life. Although in theory a combination drug therapy has advantages over single-agent therapy, no appreciable survival enhancement is generally reported, whereas increased toxicity is frequently seen in combination treatment, especially in chemotherapy. Currently used combination treatments in metastatic breast cancer will be discussed along with their challenges, leading to the introduction of novel combination anticancer drug delivery systems that aim to overcome these challenges. Widely studied drug delivery systems such as liposomes, dendrimers, polymeric nanoparticles, and water-soluble polymers can concurrently carry multiple anticancer drugs in one platform. These carriers can provide improved target specificity achieved by passive and/or active targeting mechanisms.

  18. Design mentoring tool : [technical summary].

    Science.gov (United States)

    2011-01-01

    In 2004 a design engineer on-line mentoring tool was developed and implemented. The purpose of the tool was to assist senior engineers in mentoring new engineers in the INDOT design process and improve their technical competency. This approach saves seni...

  19. Receiver Architectures for MIMO-OFDM Based on a Combined VMP-SP Algorithm

    DEFF Research Database (Denmark)

    Manchón, Carles Navarro; Kirkelund, Gunvor Elisabeth; Riegler, Erwin

    2011-01-01

    Iterative information processing, either based on heuristics or analytical frameworks, has been shown to be a very powerful tool for the design of efficient, yet feasible, wireless receiver architectures. Within this context, algorithms performing message-passing on a probabilistic graph, such as the sum-product (SP) and variational message passing (VMP) algorithms, have become increasingly popular. In this contribution, we apply a combined VMP-SP message-passing technique to the design of receivers for MIMO-OFDM systems. The message-passing equations of the combined scheme can be obtained from ... Assessment of our solutions, based on Monte Carlo simulations, corroborates the high performance of the proposed algorithms and their superiority to heuristic approaches.

  20. Unrewarded Object Combinations in Captive Parrots

    Directory of Open Access Journals (Sweden)

    Alice Marie Isabel Auersperg

    2014-11-01

    Full Text Available In primates, complex object combinations during play are often regarded as precursors of functional behavior. Here we investigate combinatory behaviors during unrewarded object manipulation in seven parrot species, including kea, African grey parrots and Goffin cockatoos, three species previously used as model species for technical problem solving. We further examine a habitual tool-using species, the black palm cockatoo. Moreover, we incorporate three neotropical species, the yellow- and the black-billed Amazon and the burrowing parakeet. Paralleling previous studies on primates and corvids, free object-object combinations and complex object-substrate combinations, such as inserting objects into tubes/holes or stacking rings onto poles, prevailed in the species previously linked to advanced physical cognition and tool use. In addition, free object-object combinations were intrinsically structured in Goffin cockatoos and in kea.

  1. Endoscopic combined “transseptal/transnasal” approach for pituitary adenoma: reconstruction of skull base using pedicled nasoseptal flap in 91 consecutive cases

    Directory of Open Access Journals (Sweden)

    Yasunori Fujimoto

    2015-07-01

    Full Text Available Objective: The purpose of this study was to describe the endoscopic combined “transseptal/transnasal” approach with a pedicled nasoseptal flap for pituitary adenoma and skull base reconstruction, especially with respect to cerebrospinal fluid (CSF) fistula. Method: Ninety-one consecutive patients with pituitary adenomas were retrospectively reviewed. All patients underwent the endoscopic combined “transseptal/transnasal” approach performed by a single team including otorhinolaryngologists and neurosurgeons. Postoperative complications related to the flap were analyzed. Results: Intra- and postoperative CSF fistulae were observed in 36 (40%) and 4 (4.4%) patients, respectively. Among the 4 patients, lumbar drainage and bed rest healed the CSF fistula in 3 patients, and reoperation for revision was necessary in one patient. Other flap-related complications included nasal bleeding in 3 patients (3.3%). Conclusion: The endoscopic combined “transseptal/transnasal” approach is most suitable for a two-surgeon technique, and a pedicled nasoseptal flap is a reliable technique for preventing postoperative CSF fistula in pituitary surgery.

  2. Integrated Systems-Based Approach for Reaching Acceptable End Points for Groundwater - 13629

    International Nuclear Information System (INIS)

    Lee, M. Hope; Wellman, Dawn; Truex, Mike; Freshley, Mark D.; Sorenson, Kent S. Jr.; Wymore, Ryan

    2013-01-01

    The sheer mass and nature of contaminated materials at DOE and DoD sites make it impractical to completely restore these sites to pre-disposal conditions. DOE faces long-term challenges, particularly in developing monitoring and end state approaches for clean-up that are protective of the environment, technically based and documented, sustainable, and, most importantly, cost effective. Integrated systems-based monitoring approaches (e.g., tools for characterization and monitoring, multi-component strategies, geophysical modeling) could provide novel approaches and a framework to (a) define risk-informed endpoints and/or conditions that constitute completion of cleanup and (b) provide the understanding for implementation of advanced scientific approaches to meet cleanup goals. Multi-component strategies, which combine site conceptual models; biological, chemical, and physical remediation strategies; and iterative review and optimization, have proven successful at several DOE sites. Novel tools such as enzyme probes and quantitative PCR for DNA and RNA, and innovative modeling approaches for complex subsurface environments, have been successful at facilitating the reduced operation or shutdown of pump-and-treat facilities and the transition of clean-up activities into monitored natural attenuation remedies. Integrating novel tools with site conceptual models and other lines of evidence to characterize, optimize, and monitor long-term remedial approaches for complex contaminant plumes is critical for transitioning active remediation into cost-effective, yet technically defensible, endpoint strategies. (authors)

  3. A cloud based tool for knowledge exchange on local scale flood risk.

    Science.gov (United States)

    Wilkinson, M E; Mackay, E; Quinn, P F; Stutter, M; Beven, K J; MacLeod, C J A; Macklin, M G; Elkhatib, Y; Percy, B; Vitolo, C; Haygarth, P M

    2015-09-15

    There is an emerging and urgent need for new approaches for the management of environmental challenges such as flood hazard in the broad context of sustainability. This requires a new way of working which bridges disciplines and organisations, and that breaks down science-culture boundaries. With this, there is growing recognition that the appropriate involvement of local communities in catchment management decisions can result in multiple benefits. However, new tools are required to connect organisations and communities. The growth of cloud based technologies offers a novel way to facilitate this process of exchange of information in environmental science and management; however, stakeholders need to be engaged with as part of the development process from the beginning rather than being presented with a final product at the end. Here we present the development of a pilot Local Environmental Virtual Observatory Flooding Tool. The aim was to develop a cloud based learning platform for stakeholders, bringing together fragmented data, models and visualisation tools that will enable these stakeholders to make scientifically informed environmental management decisions at the local scale. It has been developed by engaging with different stakeholder groups in three catchment case studies in the UK and a panel of national experts in relevant topic areas. However, these case study catchments are typical of many northern latitude catchments. The tool was designed to communicate flood risk in locally impacted communities whilst engaging with landowners/farmers about the risk of runoff from the farmed landscape. It has been developed iteratively to reflect the needs, interests and capabilities of a wide range of stakeholders. The pilot tool combines cloud based services, local catchment datasets, a hydrological model and bespoke visualisation tools to explore real time hydrometric data and the impact of flood risk caused by future land use changes. The novel aspects of the

  4. A Survey of Automatic Protocol Reverse Engineering Approaches, Methods, and Tools on the Inputs and Outputs View

    Directory of Open Access Journals (Sweden)

    Baraka D. Sija

    2018-01-01

    Full Text Available A network protocol defines rules that control communications between two or more machines on the Internet, whereas Automatic Protocol Reverse Engineering (APRE) defines the way of extracting the structure of a network protocol without accessing its specifications. Sufficient knowledge of undocumented protocols is essential for security purposes, network policy implementation, and management of network resources. This paper reviews and analyzes a total of 39 approaches, methods, and tools for Protocol Reverse Engineering (PRE) and classifies them into four divisions: approaches that reverse engineer protocol finite state machines, protocol formats, or both, and approaches that focus directly on neither reverse engineering protocol formats nor protocol finite state machines. The efficiency of each approach's outputs based on its selected inputs is analyzed in general, along with the appropriate reverse engineering input formats. Additionally, we present a discussion and an extended classification in terms of automated versus manual approaches, known and novel categories of reverse engineered protocols, and a literature review of reverse engineered protocols in relation to the seven-layer OSI (Open Systems Interconnection) model.

  5. A new methodology for predictive tool wear

    Science.gov (United States)

    Kim, Won-Sik

    An empirical approach to tool wear, which requires a series of machining tests for each combination of insert and work material, has been a standard practice for industries since the early part of the twentieth century. With many varieties of inserts and work materials available for machining, the empirical approach is so experiment-intensive that the demand for the development of a model-based approach is increasing. With a model-based approach, the developed wear equation can be extended without additional machining experiments. The main idea is that the temperatures on the primary wear areas increase such that the physical properties of the tool material degrade substantially and consequently tool wear increases. Dissolution and abrasion are identified as the main mechanisms of tool wear. Flank wear is predominantly a phenomenon of abrasion, as evidenced by the presence of scoring marks on the flank surface. Based on this, it is reasonable to expect that the flank-wear rate would increase with the content of hard inclusions. However, experimental flank wear results did not necessarily correspond to the content of the cementite phase present in the steels. Hence, other phenomena are believed to significantly affect wear behavior under certain conditions. When the flank interface is subjected to high enough temperatures, the pearlitic structure austenitizes. During the formation of the new austenitic phase, the existing carbon is dissolved into the ferrite matrix, which reduces the abrasive action. To verify the austenitic transformation, turning tests were conducted with plain carbon steels. The machined surface areas were imaged using X-ray diffraction, scanning electron microscopy (SEM) and transmission electron microscopy (TEM). On the other hand, crater wear occurs as a result of dissolution wear and abrasive wear. To verify the wear mechanisms of crater wear, various coated inserts as well as uncoated inserts were
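The abrasion mechanism named above is commonly modelled with Archard's wear law, V = k F s / H: wear volume grows with load F and sliding distance s and falls with the hardness H of the softer surface. The sketch below is a generic illustration of that law with invented numbers; it is not the thesis' own wear equation, which couples abrasion with dissolution.

```python
# Archard's wear law sketch: hardness degradation at temperature raises wear.

def archard_wear_volume(k, load_n, sliding_m, hardness_pa):
    """Archard wear volume in m^3 for wear coefficient k."""
    return k * load_n * sliding_m / hardness_pa

# Hypothetical comparison: same cut, but the heated tool is softer.
cold = archard_wear_volume(k=1e-4, load_n=500.0, sliding_m=100.0, hardness_pa=15e9)
hot = archard_wear_volume(k=1e-4, load_n=500.0, sliding_m=100.0, hardness_pa=9e9)

print(hot > cold)   # degraded hot hardness gives the larger wear volume
```

This captures the record's core argument in miniature: cutting temperature degrades the tool's physical properties (here, hardness), and the wear rate rises accordingly.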

  6. On the bioavailability of trace metals in surface sediments: a combined geochemical and biological approach.

    Science.gov (United States)

    Roosa, Stéphanie; Prygiel, Emilie; Lesven, Ludovic; Wattiez, Ruddy; Gillan, David; Ferrari, Benoît J D; Criquet, Justine; Billon, Gabriel

    2016-06-01

    The bioavailability of metals was estimated in three river sediments (Sensée, Scarpe, and Deûle Rivers) impacted by different levels of Cu, Cd, Pb, and Zn (Northern France). For that purpose, a combination of geochemistry and biological responses (bacteria and chironomids) was used. The results obtained illustrate the complexity of the notion of "bioavailability." Indeed, geochemical indexes suggested a low toxicity, even in surface sediments with high concentrations of total metals and predicted severe effect levels for the organisms. This was also suggested by the abundance of total bacteria as determined by DAPI counts, with high bacterial cell numbers even in contaminated areas. However, a fraction of the metals may be bioavailable, as shown for chironomid larvae, which were able to accumulate a substantial quantity of metals in surface sediments within just a few days. We concluded that (1) the best approach to estimate bioavailability in the selected sediments is a combination of geochemical and biological approaches and that (2) the sediments in the Deûle and Scarpe Rivers are highly contaminated and may impact not only bacterial populations but also benthic invertebrates.

  7. Large central lesions compressing the hypothalamus and brainstem. Operative approaches and combination treatment with radiosurgery

    Energy Technology Data Exchange (ETDEWEB)

    Inoue, Hiroshi K.; Negishi, Masatoshi; Kohga, Hideaki; Hirato, Masafumi; Ohye, Chihiro [Gunma Univ., Maebashi (Japan). School of Medicine; Shibazaki, Tohru

    1998-09-01

    A major aim of minimally invasive neurosurgery is to preserve function in the brain and cranial nerves. Based on previous results of radiosurgery for central lesions (19 craniopharyngiomas, 46 pituitary adenomas, 9 meningeal tumors), combined micro- and/or radiosurgery was applied for large lesions compressing the hypothalamus and/or brainstem. A basal interhemispheric approach via superomedial orbitotomy or a transcallosal-transforaminal approach was used for these large tumors. Tumors left behind in the hypothalamus or cavernous sinus were treated with radiosurgery using a gamma unit. Preoperative hypothalamo-pituitary functions were preserved in most of these patients. Radiosurgical results were evaluated in patients followed for more than 2 years after treatment. All 9 craniopharyngiomas decreased in size after radiosurgery, although a second treatment was required in 4 patients. All 20 pituitary adenomas were stable or decreased in size and 5 of 7 functioning adenomas showed normalized values of hormones in the serum. All 3 meningeal tumors were stable or decreased in size after treatment. No cavernous sinus symptoms developed after radiosurgery. We conclude that combined micro- and radio-neurosurgery is an effective and less invasive treatment for large central lesions compressing the hypothalamus and brainstem. (author)

  8. Large central lesions compressing the hypothalamus and brainstem. Operative approaches and combination treatment with radiosurgery

    International Nuclear Information System (INIS)

    Inoue, Hiroshi K.; Negishi, Masatoshi; Kohga, Hideaki; Hirato, Masafumi; Ohye, Chihiro; Shibazaki, Tohru

    1998-01-01

    A major aim of minimally invasive neurosurgery is to preserve function in the brain and cranial nerves. Based on previous results of radiosurgery for central lesions (19 craniopharyngiomas, 46 pituitary adenomas, 9 meningeal tumors), combined micro- and/or radiosurgery was applied for large lesions compressing the hypothalamus and/or brainstem. A basal interhemispheric approach via superomedial orbitotomy or a transcallosal-transforaminal approach was used for these large tumors. Tumors left behind in the hypothalamus or cavernous sinus were treated with radiosurgery using a gamma unit. Preoperative hypothalamo-pituitary functions were preserved in most of these patients. Radiosurgical results were evaluated in patients followed for more than 2 years after treatment. All 9 craniopharyngiomas decreased in size after radiosurgery, although a second treatment was required in 4 patients. All 20 pituitary adenomas were stable or decreased in size and 5 of 7 functioning adenomas showed normalized values of hormones in the serum. All 3 meningeal tumors were stable or decreased in size after treatment. No cavernous sinus symptoms developed after radiosurgery. We conclude that combined micro- and radio-neurosurgery is an effective and less invasive treatment for large central lesions compressing the hypothalamus and brainstem. (author)

  9. A Promising Tool to Assess Long Term Public Health Effects of Natural Disasters: Combining Routine Health Survey Data and Geographic Information Systems to Assess Stunting after the 2001 Earthquake in Peru.

    Science.gov (United States)

    Rydberg, Henny; Marrone, Gaetano; Strömdahl, Susanne; von Schreeb, Johan

    2015-01-01

    Research on long-term health effects of earthquakes is scarce, especially in low- and middle-income countries, which are disproportionately affected by disasters. To date, progress in this area has been hampered by the lack of tools to accurately measure these effects. Here, we explored whether long-term public health effects of earthquakes can be assessed using a combination of readily available data sources on public health and the geographic distribution of seismic activity. We used childhood stunting as a proxy for public health effects. Data on stunting were obtained from Demographic and Health Surveys. Earthquake data were obtained from the U.S. Geological Survey's ShakeMaps, geographic information system-based maps that divide earthquake-affected areas into different shaking intensity zones. We combined these two data sources to categorize the surveyed children into different earthquake exposure groups, based on how much their area of residence was affected by the earthquake. We assessed the feasibility of the approach using a real earthquake case: a magnitude 8.4 earthquake that hit southern Peru in 2001. Our results indicate that the combination of health survey data and disaster data may offer a readily accessible and accurate method for determining the long-term public health consequences of a natural disaster. Our work allowed us to make pre- and post-earthquake comparisons of stunting, an important indicator of the well-being of a society, as well as comparisons between populations with different levels of exposure to the earthquake. Furthermore, the detailed GIS-based data provided a precise and objective definition of earthquake exposure. Our approach should be considered in future public health and disaster research exploring the long-term effects of earthquakes and potentially other natural disasters.
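The exposure-grouping step described above (assigning surveyed children to categories by the shaking intensity at their area of residence) can be sketched as follows. This is an illustrative reconstruction, not the authors' code; the intensity thresholds and category names are hypothetical.

```python
# Hypothetical sketch: map a ShakeMap-style intensity value at a child's
# residence to an exposure category, then bucket a survey accordingly.

def exposure_group(mmi):
    """Map a Modified Mercalli Intensity value to an exposure category
    (thresholds are made up for illustration)."""
    if mmi < 4.0:
        return "unexposed"
    elif mmi < 6.0:
        return "moderate"
    return "severe"

def categorize(children):
    """children: list of (child_id, intensity_at_residence) tuples."""
    groups = {"unexposed": [], "moderate": [], "severe": []}
    for child_id, mmi in children:
        groups[exposure_group(mmi)].append(child_id)
    return groups

survey = [("c1", 3.2), ("c2", 5.1), ("c3", 7.8)]
groups = categorize(survey)
```

Stunting prevalence could then be compared across the resulting groups, pre- and post-earthquake.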

  10. A Promising Tool to Assess Long Term Public Health Effects of Natural Disasters: Combining Routine Health Survey Data and Geographic Information Systems to Assess Stunting after the 2001 Earthquake in Peru.

    Directory of Open Access Journals (Sweden)

    Henny Rydberg

    Full Text Available Research on long-term health effects of earthquakes is scarce, especially in low- and middle-income countries, which are disproportionately affected by disasters. To date, progress in this area has been hampered by the lack of tools to accurately measure these effects. Here, we explored whether long-term public health effects of earthquakes can be assessed using a combination of readily available data sources on public health and the geographic distribution of seismic activity. We used childhood stunting as a proxy for public health effects. Data on stunting were obtained from Demographic and Health Surveys. Earthquake data were obtained from the U.S. Geological Survey's ShakeMaps, geographic information system-based maps that divide earthquake-affected areas into different shaking intensity zones. We combined these two data sources to categorize the surveyed children into different earthquake exposure groups, based on how much their area of residence was affected by the earthquake. We assessed the feasibility of the approach using a real earthquake case: a magnitude 8.4 earthquake that hit southern Peru in 2001. Our results indicate that the combination of health survey data and disaster data may offer a readily accessible and accurate method for determining the long-term public health consequences of a natural disaster. Our work allowed us to make pre- and post-earthquake comparisons of stunting, an important indicator of the well-being of a society, as well as comparisons between populations with different levels of exposure to the earthquake. Furthermore, the detailed GIS-based data provided a precise and objective definition of earthquake exposure. Our approach should be considered in future public health and disaster research exploring the long-term effects of earthquakes and potentially other natural disasters.

  11. Travel fosters tool use in wild chimpanzees.

    Science.gov (United States)

    Gruber, Thibaud; Zuberbühler, Klaus; Neumann, Christof

    2016-07-19

    Ecological variation influences the appearance and maintenance of tool use in animals, either due to necessity or opportunity, but little is known about the relative importance of these two factors. Here, we combined long-term behavioural data on feeding and travelling with six years of field experiments in a wild chimpanzee community. In the experiments, subjects engaged with natural logs, which contained energetically valuable honey that was only accessible through tool use. Engagement with the experiment was highest after periods of low fruit availability involving more travel between food patches, while instances of actual tool-using were significantly influenced by prior travel effort only. Additionally, combining data from the main chimpanzee study communities across Africa supported this result, insofar as groups with larger travel efforts had larger tool repertoires. Travel thus appears to foster tool use in wild chimpanzees and may also have been a driving force in early hominin technological evolution.

  12. Modeling a Decision Support Tool for Buildable and Sustainable Building Envelope Designs

    Directory of Open Access Journals (Sweden)

    Natee Singhaputtangkul

    2015-05-01

    Full Text Available Sustainability and buildability requirements in building envelope design have gained significantly more importance nowadays, yet there is a lack of an appropriate decision support system (DSS) that can help a building design team to incorporate these requirements and manage their tradeoffs at once. The main objective of this study is to build such a tool to facilitate a building design team in taking sustainability and buildability criteria into account when assessing building envelopes of high-rise residential buildings in Singapore. Literature reviews were conducted to identify a comprehensive set of sustainability and buildability criteria, and the tool was developed using a Quality Function Deployment (QFD) approach combined with fuzzy set theory. A building design team was engaged to test the tool, with the aim of evaluating its usefulness in managing the tradeoffs among the sustainability and buildability criteria. The results from a qualitative data analysis suggested that the tool allowed the design team to effectively find a balance among the tradeoffs when assessing multiple building envelope design alternatives. The main contributions of the tool are a more efficient assessment of building envelopes and a more sustainable and buildable building envelope design.
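The fuzzy-QFD scoring idea behind such a tool can be sketched in a few lines: criterion ratings are expressed as triangular fuzzy numbers, aggregated by weighted averaging, and defuzzified for ranking. This is a generic illustration under assumed conventions, not the paper's actual model; all weights and ratings are made up.

```python
# Illustrative fuzzy scoring: ratings are triangular fuzzy numbers (l, m, u);
# an alternative's score is the weight-averaged rating, defuzzified by centroid.

def fuzzy_weighted_score(ratings, weights):
    """ratings: list of (l, m, u) triangular fuzzy ratings; weights: crisp."""
    total_w = sum(weights)
    l = sum(w * r[0] for w, r in zip(weights, ratings)) / total_w
    m = sum(w * r[1] for w, r in zip(weights, ratings)) / total_w
    u = sum(w * r[2] for w, r in zip(weights, ratings)) / total_w
    return (l, m, u)

def defuzzify(tfn):
    """Centroid of a triangular fuzzy number."""
    return sum(tfn) / 3.0

# Two envelope alternatives rated on sustainability and buildability
# (hypothetical numbers), with criterion weights 0.6 and 0.4:
alt_a = fuzzy_weighted_score([(6, 7, 8), (4, 5, 6)], [0.6, 0.4])
alt_b = fuzzy_weighted_score([(5, 6, 7), (6, 7, 8)], [0.6, 0.4])
```

Ranking the defuzzified scores lets the design team compare alternatives while the fuzzy spread retains the uncertainty in the ratings.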

  13. Tools for Microbiological risk assessment

    DEFF Research Database (Denmark)

    Bassett, john; Nauta, Maarten; Lindqvist, Roland

    Microbiological Risk Assessment (MRA) has emerged as a comprehensive and systematic approach for addressing the risk of pathogens in specific foods and/or processes. At government level, MRA is increasingly recognised as a structured and objective approach to understand the level of risk in a given food/pathogen scenario. Tools developed so far support qualitative and quantitative assessments of the risk that a food pathogen poses to a particular population. Risk can be expressed as absolute numbers or as relative (ranked) risks. The food industry is beginning to appreciate that the tools for MRA can increase the understanding of microbiological risks in foods. It is timely to inform food safety professionals about the availability and utility of MRA tools. Therefore, the focus of this report is to aid the food safety manager by providing a concise summary of the tools available for the MRA...

  14. A teaching skills assessment tool inspired by the Calgary-Cambridge model and the patient-centered approach.

    Science.gov (United States)

    Sommer, Johanna; Lanier, Cédric; Perron, Noelle Junod; Nendaz, Mathieu; Clavet, Diane; Audétat, Marie-Claude

    2016-04-01

    The aim of this study was to develop a descriptive tool for peer review of clinical teaching skills. Two analogies framed our research: (1) between the patient-centered and the learner-centered approach; (2) between the structures of clinical encounters (Calgary-Cambridge communication model) and teaching sessions. Over the course of one year, each step of the action research was carried out in collaboration with twelve clinical teachers from an outpatient general internal medicine clinic and with three experts in medical education. The content validation consisted of a literature review, expert opinion and the participatory research process. Interrater reliability was evaluated by three clinical teachers coding thirty audiotaped standardized learner-teacher interactions. The tool contains sixteen items covering the process and content of clinical supervisions. Descriptors define the expected teaching behaviors for three levels of competence. Interrater reliability was significant for eleven items (Kendall's coefficient), supporting the tool's use for peer review of clinical teaching skills. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  15. Model-based setup assistant for progressive tools

    Science.gov (United States)

    Springer, Robert; Gräler, Manuel; Homberg, Werner; Henke, Christian; Trächtler, Ansgar

    2018-05-01

    In the field of production systems, globalization and technological progress lead to increasing requirements regarding part quality, delivery time and costs. Hence, today's production is challenged much more than a few years ago: it has to be very flexible and produce small batch sizes economically to satisfy consumers' demands and avoid unnecessary stock. Furthermore, a trend towards increasing functional integration continues to lead to an ongoing miniaturization of sheet metal components. In the electric connectivity industry, for example, miniaturized connectors are manufactured by progressive tools, which are usually used for very large batches. These tools are installed in mechanical presses and then set up by a technician, who has to manually adjust a wide range of punch-bending operations. Disturbances like material thickness, temperatures, lubrication or tool wear complicate the setup procedure. Given the increasing demand for production flexibility, this time-consuming process has to be handled more and more often. In this paper, a new approach for a model-based setup assistant is proposed as a solution and exemplarily applied in combination with a progressive tool. First, progressive tools and, more specifically, their setup process are described, and the associated challenges are pointed out. Building on that, a systematic process to set up the machines is introduced. The process is then investigated with an FE analysis regarding the effects of the disturbances. In the next step, design of experiments is used to systematically develop a regression model of the system's behaviour. This model is integrated within an optimization in order to calculate optimal machine parameters and the necessary subsequent adjustment of the progressive tool due to the disturbances. Finally, the assistant is tested in a production environment and the results are discussed.
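The core loop of such an assistant, fitting a regression surrogate from a few calibration runs and then optimizing the machine setting against it, can be sketched as below. This is a hypothetical one-parameter illustration, not the paper's implementation; the calibration data and grid are made up.

```python
# Surrogate-model sketch: fit a quadratic model of a quality deviation over
# one machine setting from three calibration runs, then grid-search the
# setting that minimises the fitted model.

def fit_quadratic(points):
    """Exact quadratic through three (x, y) calibration points (Lagrange form)."""
    (x0, y0), (x1, y1), (x2, y2) = points
    def model(x):
        return (y0 * (x - x1) * (x - x2) / ((x0 - x1) * (x0 - x2))
                + y1 * (x - x0) * (x - x2) / ((x1 - x0) * (x1 - x2))
                + y2 * (x - x0) * (x - x1) / ((x2 - x0) * (x2 - x1)))
    return model

def best_setting(model, candidates):
    """Grid search: candidate machine setting minimising the fitted model."""
    return min(candidates, key=model)

# three synthetic calibration runs: (setting, measured deviation from target)
runs = [(0.0, 7.25), (2.0, 1.25), (4.0, 3.25)]
model = fit_quadratic(runs)
setting = best_setting(model, [i / 20 for i in range(81)])  # 0.00 .. 4.00
```

In the paper's setting the regression model comes from a designed experiment over several disturbances; the sketch only shows the fit-then-optimize pattern.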

  16. A linear combination of pharmacophore hypotheses as a new tool in search of new active compounds--an application for 5-HT1A receptor ligands.

    Directory of Open Access Journals (Sweden)

    Dawid Warszycki

    Full Text Available This study explores a new approach to pharmacophore screening involving the use of an optimized linear combination of models instead of a single hypothesis. The implementation and evaluation of the developed methodology are performed for the complete known chemical space of 5-HT1AR ligands (3616 active compounds with Ki < 100 nM) acquired from the ChEMBL database. Clusters generated from three different methods were the basis for the individual pharmacophore hypotheses, which were assembled into optimal combinations to maximize different coefficients, namely MCC, accuracy and recall, to measure screening performance. Various factors that influence filtering efficiency were investigated, including the clustering methods, the composition of the test sets (random, most diverse, and cluster population-dependent) and the hit mode (whether a compound must fit at least one or two models from the final combination). This method outmatched both the single-hypothesis and random linear combination approaches.
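The coefficients used above to score a model combination are standard confusion-matrix measures; a minimal sketch (not the authors' code) of computing them from screening counts:

```python
# Screening-performance measures from a confusion matrix of a pharmacophore
# filter: tp/fn = actives retrieved/missed, tn/fp = inactives rejected/passed.
import math

def mcc(tp, tn, fp, fn):
    """Matthews correlation coefficient; 0.0 when undefined by convention."""
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    if denom == 0:
        return 0.0
    return (tp * tn - fp * fn) / denom

def recall(tp, fn):
    return tp / (tp + fn)

def accuracy(tp, tn, fp, fn):
    return (tp + tn) / (tp + tn + fp + fn)
```

Optimizing a combination for MCC rather than recall alone penalizes combinations that pass too many inactive compounds.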

  17. Alternative approaches to postgraduate supervision: A planning tool ...

    African Journals Online (AJOL)

    Increased demands on academics due to the changing work and higher educational environments challenge traditional approaches to postgraduate supervision. Supervisors often tend to follow the apprenticeship approach uncritically. Supervisors therefore need to be aware of alternative approaches to supervision and of ...

  18. Induction of angiogenesis in tissue-engineered scaffolds designed for bone repair: a combined gene therapy-cell transplantation approach.

    Science.gov (United States)

    Jabbarzadeh, Ehsan; Starnes, Trevor; Khan, Yusuf M; Jiang, Tao; Wirtel, Anthony J; Deng, Meng; Lv, Qing; Nair, Lakshmi S; Doty, Steven B; Laurencin, Cato T

    2008-08-12

    One of the fundamental principles underlying tissue engineering approaches is that newly formed tissue must maintain sufficient vascularization to support its growth. Efforts to induce vascular growth into tissue-engineered scaffolds have recently been dedicated to developing novel strategies to deliver specific biological factors that direct the recruitment of endothelial cell (EC) progenitors and their differentiation. The challenge, however, lies in orchestration of the cells, appropriate biological factors, and optimal factor doses. This study reports an approach as a step forward to resolving this dilemma by combining an ex vivo gene transfer strategy and EC transplantation. The utility of this approach was evaluated by using 3D poly(lactide-co-glycolide) (PLAGA) sintered microsphere scaffolds for bone tissue engineering applications. Our goal was achieved by isolation and transfection of adipose-derived stromal cells (ADSCs) with adenovirus encoding the cDNA of VEGF. We demonstrated that the combination of VEGF releasing ADSCs and ECs results in marked vascular growth within PLAGA scaffolds. We thereby delineate the potential of ADSCs to promote vascular growth into biomaterials.

  19. Induction of angiogenesis in tissue-engineered scaffolds designed for bone repair: A combined gene therapy–cell transplantation approach

    Science.gov (United States)

    Jabbarzadeh, Ehsan; Starnes, Trevor; Khan, Yusuf M.; Jiang, Tao; Wirtel, Anthony J.; Deng, Meng; Lv, Qing; Nair, Lakshmi S.; Doty, Steven B.; Laurencin, Cato T.

    2008-01-01

    One of the fundamental principles underlying tissue engineering approaches is that newly formed tissue must maintain sufficient vascularization to support its growth. Efforts to induce vascular growth into tissue-engineered scaffolds have recently been dedicated to developing novel strategies to deliver specific biological factors that direct the recruitment of endothelial cell (EC) progenitors and their differentiation. The challenge, however, lies in orchestration of the cells, appropriate biological factors, and optimal factor doses. This study reports an approach as a step forward to resolving this dilemma by combining an ex vivo gene transfer strategy and EC transplantation. The utility of this approach was evaluated by using 3D poly(lactide-co-glycolide) (PLAGA) sintered microsphere scaffolds for bone tissue engineering applications. Our goal was achieved by isolation and transfection of adipose-derived stromal cells (ADSCs) with adenovirus encoding the cDNA of VEGF. We demonstrated that the combination of VEGF releasing ADSCs and ECs results in marked vascular growth within PLAGA scaffolds. We thereby delineate the potential of ADSCs to promote vascular growth into biomaterials. PMID:18678895

  20. Tools for the Automation of Large Distributed Control Systems

    CERN Document Server

    Gaspar, Clara

    2005-01-01

    The new LHC experiments at CERN will have very large numbers of channels to operate. In order to configure and monitor such large systems, a high degree of parallelism is necessary. The control system is built as a hierarchy of sub-systems distributed over several computers. A toolkit, SMI++, combining two approaches (finite state machines and rule-based programming), allows the various sub-systems to be described as decentralized deciding entities that react in real time to changes in the system, thus providing for the automation of standard procedures and for automatic recovery from error conditions in a hierarchical fashion. In this paper we describe the principles and features of SMI++ as well as its integration with an industrial SCADA tool for use by the LHC experiments, and we try to show that such tools can provide a very convenient mechanism for the automation of large-scale, high-complexity applications.
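The two ideas SMI++ combines, sub-systems as finite state machines plus rules that derive a parent's state from its children, can be illustrated with a toy hierarchy. This sketch is not SMI++ itself; the class names, states and the single derivation rule are invented for illustration.

```python
# Toy control hierarchy: commands propagate down to child FSMs,
# and a rule derives the parent's state from the children's states.

class Device:
    """Leaf finite state machine."""
    def __init__(self, name):
        self.name = name
        self.state = "OFF"

    def handle(self, command):
        if command == "START":
            self.state = "RUNNING"
        elif command == "STOP":
            self.state = "OFF"

class SubSystem:
    """Parent node: state is derived, commands are forwarded."""
    def __init__(self, name, children):
        self.name = name
        self.children = children

    @property
    def state(self):
        # Rule: RUNNING only when every child is RUNNING.
        states = {c.state for c in self.children}
        return "RUNNING" if states == {"RUNNING"} else "NOT_READY"

    def handle(self, command):
        for c in self.children:
            c.handle(command)

daq = SubSystem("DAQ", [Device("fe1"), Device("fe2")])
daq.handle("START")
```

Automatic error recovery in this style amounts to a rule on the parent that reacts to a child entering an error state by re-issuing commands.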

  1. The Arabidopsis co-expression tool (act): a WWW-based tool and database for microarray-based gene expression analysis

    DEFF Research Database (Denmark)

    Jen, C. H.; Manfield, I. W.; Michalopoulos, D. W.

    2006-01-01

    We present a new WWW-based tool for plant gene analysis, the Arabidopsis Co-Expression Tool (act), based on a large Arabidopsis thaliana microarray data set obtained from the Nottingham Arabidopsis Stock Centre. The co-expression analysis tool allows users to identify genes whose expression ... be examined using the novel clique finder tool to determine the sets of genes most likely to be regulated in a similar manner. In combination, these tools offer three levels of analysis: creation of correlation lists of co-expressed genes, refinement of these lists using two-dimensional scatter plots ...

  2. Estimation and optimization of flank wear and tool lifespan in finish turning of AISI 304 stainless steel using desirability function approach

    Directory of Open Access Journals (Sweden)

    Lakhdar Bouzid

    2018-10-01

    Full Text Available The wear of cutting tools remains a major obstacle: its effects are detrimental not only to tool lifespan and productivity but also to surface quality. The present work deals with machinability studies on flank wear, surface roughness, and lifespan in finish turning of AISI 304 stainless steel using multilayer Ti(C,N)/Al2O3/TiN coated carbide inserts. The machining experiments are conducted based on response surface methodology (RSM). The combined effects of three cutting parameters, namely cutting speed, feed rate and cutting time, on the two performance outputs (i.e. VB and Ra), and the combined effects of two cutting parameters, namely cutting speed and feed rate, on lifespan (T), are explored employing analysis of variance (ANOVA). The relationship between the variables and the technological parameters is determined using a quadratic regression model, and optimal cutting conditions for each performance level are established through desirability function approach (DFA) optimization. The results show that flank wear is influenced principally by cutting time and, at a second level, by cutting speed. In addition, cutting time is the dominant factor affecting workpiece surface roughness, followed by feed rate, while lifespan is influenced by cutting speed. The optimum level of input parameters for composite desirability was found to be Vc1-f1-t1 for VB and Ra, and Vc1-f1 for T, with a maximum error of 6.38%.
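The desirability function approach named above can be sketched generically: each response is mapped to a desirability d in [0, 1] and candidate settings are ranked by the geometric mean (composite desirability). This is a textbook-style illustration under assumed smaller-is-better transforms; the bounds and response values are made up, not the paper's data.

```python
# Generic DFA sketch: smaller-is-better desirability per response,
# composite desirability as the geometric mean.

def desirability_smaller_is_better(y, low, high):
    """d = 1 at or below 'low', 0 at or above 'high', linear in between."""
    if y <= low:
        return 1.0
    if y >= high:
        return 0.0
    return (high - y) / (high - low)

def composite_desirability(responses, bounds):
    """Geometric mean of the individual desirabilities."""
    ds = [desirability_smaller_is_better(y, lo, hi)
          for y, (lo, hi) in zip(responses, bounds)]
    prod = 1.0
    for d in ds:
        prod *= d
    return prod ** (1.0 / len(ds))

# e.g. flank wear VB = 0.15 mm (bounds 0.1-0.3 mm) and roughness
# Ra = 1.2 um (bounds 0.8-2.4 um); purely illustrative numbers:
D = composite_desirability([0.15, 1.2], [(0.1, 0.3), (0.8, 2.4)])
```

The cutting-condition combination maximising D over the experimental region is the reported optimum (Vc1-f1-t1 style levels in the paper).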

  3. SNMP-SI: A Network Management Tool Based on Slow Intelligence System Approach

    Science.gov (United States)

    Colace, Francesco; de Santo, Massimo; Ferrandino, Salvatore

    The last decade has witnessed an intense spread of computer networks, further accelerated by the introduction of wireless networks. This growth has been accompanied by a significant increase in network management problems. Especially in small companies, where no personnel are assigned to these tasks, the management of such networks is often complex, and malfunctions can have significant impacts on their businesses. A possible solution is the adoption of the Simple Network Management Protocol (SNMP), a standard protocol used to exchange network management information and part of the Transmission Control Protocol/Internet Protocol (TCP/IP) suite. SNMP provides a tool for network administrators to manage network performance, find and solve network problems, and plan for network growth. SNMP has a big disadvantage, however: its simple design means that the information it deals with is neither detailed nor well organized enough to meet expanding modern networking requirements. Over the past years much effort has been devoted to overcoming the shortcomings of SNMP, and new frameworks have been developed; a promising approach involves the use of ontologies. This is the starting point of this paper, where a novel approach to network management based on Slow Intelligence System methodologies and ontology-based techniques is proposed. A Slow Intelligence System is a general-purpose system characterized by the ability to improve its performance over time through a process involving enumeration, propagation, adaptation, elimination and concentration. The proposed approach therefore aims to develop a system able to acquire, according to the SNMP standard, information from the various hosts in the managed networks and apply solutions in order to solve problems. To check the feasibility of this model, first experimental results in a real scenario are shown.

  4. Efficiency of stormwater control measures for combined sewer retrofitting under varying rain conditions: Quantifying the Three Points Approach (3PA)

    DEFF Research Database (Denmark)

    Sørup, Hjalte Jomo Danielsen; Lerer, Sara Maria; Arnbjerg-Nielsen, Karsten

    2016-01-01

    We present a method to assess and communicate the efficiency of stormwater control measures for retrofitting existing urban areas. The tool extends the Three Points Approach to quantitatively distinguish three rainfall domains: (A) rainwater resource utilisation, (B) urban stormwater drainage pipe......, stormwater drainage and flood risks....

  5. Fast scattering simulation tool for multi-energy x-ray imaging

    Energy Technology Data Exchange (ETDEWEB)

    Sossin, A., E-mail: artur.sossin@cea.fr [CEA-LETI MINATEC Grenoble, F-38054 Grenoble (France); Tabary, J.; Rebuffel, V. [CEA-LETI MINATEC Grenoble, F-38054 Grenoble (France); Létang, J.M.; Freud, N. [Université de Lyon, CREATIS, CNRS UMR5220, Inserm U1044, INSA-Lyon, Université Claude Bernard Lyon 1, Centre Léon Bérard (France); Verger, L. [CEA-LETI MINATEC Grenoble, F-38054 Grenoble (France)

    2015-12-01

    A combination of Monte Carlo (MC) and deterministic approaches was employed as a means of creating a simulation tool capable of providing energy-resolved x-ray primary and scatter images within a reasonable time interval. Libraries from Sindbad, previously developed x-ray simulation software, were used in the development. The scatter simulation capabilities of the tool were validated through simulation with the aid of GATE and through experimentation using a spectrometric CdTe detector. A simple cylindrical phantom with cavities and an aluminum insert was used. Cross-validation with GATE showed good agreement, with a global spatial error of 1.5% and a maximum scatter spectrum error of around 6%. Experimental validation also supported the accuracy of the simulations obtained from the developed software, with a global spatial error of 1.8% and a maximum error of around 8.5% in the scatter spectra.

  6. Quantitative Assessment of Arrhythmia Using Non-linear Approach: A Non-invasive Prognostic Tool

    Science.gov (United States)

    Chakraborty, Monisha; Ghosh, Dipak

    2018-04-01

    Accurate prognostic tools to identify the severity of arrhythmia are yet to be established, owing to the complexity of the ECG signal. In this paper, we show that quantitative assessment of arrhythmia is possible using a non-linear technique based on Hurst rescaled range analysis. Although the concept of applying non-linearity to the study of various cardiac dysfunctions is not entirely new, the novel objective of this paper is to identify the severity of the disease, to monitor different medicines and their doses, and to assess the efficiency of different medicines. The approach presented in this work is simple, which in turn will help doctors in efficient disease management. In this work, arrhythmia ECG time series are collected from the MIT-BIH database. Normal ECG time series are acquired using the POLYPARA system. Both time series are analyzed in the light of the non-linear approach following the rescaled range analysis method. The quantitative parameter, fractal dimension (D), is obtained from both types of time series. The major finding is that arrhythmia ECG shows lower values of D compared to normal ECG. Further, this information can be used to assess the severity of arrhythmia quantitatively, which opens a new direction for prognosis; adequate software may also be developed for use in medical practice.
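The rescaled range (R/S) procedure named in the abstract can be sketched compactly: compute the R/S statistic over windows of increasing size, fit the slope of log(R/S) versus log(n) to estimate the Hurst exponent H, and report the fractal dimension D = 2 - H. This is a minimal pure-Python illustration of the standard method, not the authors' software; window sizes and the test series are illustrative.

```python
# Hurst rescaled-range (R/S) analysis sketch: estimate H from the scaling
# of R/S with window size n, then report fractal dimension D = 2 - H.
import math

def rescaled_range(chunk):
    """R/S statistic of one window of the series."""
    n = len(chunk)
    mean = sum(chunk) / n
    dev = [x - mean for x in chunk]
    z, cum = 0.0, []
    for d in dev:
        z += d
        cum.append(z)
    r = max(cum) - min(cum)                      # range of cumulative deviations
    s = math.sqrt(sum(d * d for d in dev) / n)   # population standard deviation
    return r / s

def fractal_dimension(series, window_sizes):
    """Fit log(R/S) ~ H * log(n) by least squares; return D = 2 - H."""
    xs, ys = [], []
    for n in window_sizes:
        chunks = [series[i:i + n] for i in range(0, len(series) - n + 1, n)]
        avg_rs = sum(rescaled_range(c) for c in chunks) / len(chunks)
        xs.append(math.log(n))
        ys.append(math.log(avg_rs))
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    h = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return 2.0 - h
```

For an ECG time series, lower D (higher H, more persistence) would then be compared between arrhythmia and normal recordings, as the abstract describes.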

  7. Comparison of BrainTool to other UML modeling and model transformation tools

    Science.gov (United States)

    Nikiforova, Oksana; Gusarovs, Konstantins

    2017-07-01

    In the last 30 years, numerous model-driven software development approaches have been offered to address problems with development productivity and the resulting software quality. CASE tools developed to date are advertised as having "complete code-generation capabilities". Nowadays the Object Management Group (OMG) makes similar claims regarding Unified Modeling Language (UML) models at different levels of abstraction. It is said that software development automation using CASE tools enables a significant level of automation. Today's CASE tools usually offer a combination of several features: starting with a model editor and a model repository for the traditional ones, and ending, for the most advanced ones, with a code generator (which may use a scripting or domain-specific language (DSL)), a transformation tool to produce new artifacts from manually created ones, and a transformation definition editor to define new transformations. The present paper contains the results of a comparison of CASE tools (mainly UML editors) against the level of automation they offer.

  8. Approaching Sentient Building Performance Simulation Systems

    DEFF Research Database (Denmark)

    Negendahl, Kristoffer; Perkov, Thomas; Heller, Alfred

    2014-01-01

    Sentient BPS systems can combine one or more high-precision BPS tools and provide near-instantaneous performance feedback directly in the design tool, thus providing speed and precision of building performance assessment in the early design stages. Sentient BPS systems essentially combine: 1) design tools, 2) parametric tools, 3) BPS tools, 4) dynamic databases, 5) interpolation techniques and 6) prediction techniques into a fast and valid simulation system for the early design stage.

  9. Combined Heat and Power

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2008-07-01

    At their 2007 Summit in Heiligendamm, G8 leaders called on countries to 'adopt instruments and measures to significantly increase the share of combined heat and power (CHP) in the generation of electricity.' As a result, energy, economic, environmental and utility regulators are looking for tools and information to understand the potential of CHP and to identify appropriate policies for their national circumstances. This report forms the first part of the response. It includes answers to policy makers' questions about the potential economic, energy and environmental benefits of an increased policy commitment to CHP. It also includes for the first time integrated IEA data on global CHP installations, and analyses the benefits of increased CHP investment in the G8+5 countries. A companion report will be produced later in 2008 to document best practice policy approaches that have been used to expand the use of CHP in a variety of countries.

  10. When one model is not enough: combining epistemic tools in systems biology.

    Science.gov (United States)

    Green, Sara

    2013-06-01

    In recent years, the philosophical focus of the modeling literature has shifted from descriptions of general properties of models to an interest in different model functions. It has been argued that the diversity of models and their correspondingly different epistemic goals are important for developing intelligible scientific theories (Leonelli, 2007; Levins, 2006). However, more knowledge is needed on how a combination of different epistemic means can generate and stabilize new entities in science. This paper draws on Rheinberger's practice-oriented account of knowledge production. The conceptual repertoire of Rheinberger's historical epistemology offers important insights for an analysis of modelling practice. I illustrate this with a case study on network modeling in systems biology, where engineering approaches are applied to the study of biological systems. I shall argue that the use of multiple representational means is an essential part of the dynamic of knowledge generation. It is because of, rather than in spite of, the diversity of constraints of different models that the interlocking use of different epistemic means creates a potential for knowledge production.

  11. Bridging the gap between LCA, LCC and CBA as sustainability assessment tools

    Energy Technology Data Exchange (ETDEWEB)

    Hoogmartens, Rob, E-mail: rob.hoogmartens@uhasselt.be [Hasselt University, Faculty of Business Economics, Centre for Environmental Sciences, Agoralaan, Building D, 3590 Diepenbeek (Belgium); Van Passel, Steven, E-mail: steven.vanpassel@uhasselt.be [Hasselt University, Faculty of Business Economics, Centre for Environmental Sciences, Agoralaan, Building D, 3590 Diepenbeek (Belgium); Van Acker, Karel, E-mail: karel.vanacker@lrd.kuleuven.be [Katholieke Universiteit Leuven, Department of Metallurgy and Materials Engineering, Kasteelpark Arenberg 44, 3001 Leuven (Belgium); Dubois, Maarten, E-mail: maarten.dubois@kuleuven.be [Katholieke Universiteit Leuven, Policy Research Centre for Sustainable Materials, Kasteelpark Arenberg 44, 3001 Leuven (Belgium)

    2014-09-15

    Increasing interest in sustainability has led to the development of sustainability assessment tools such as Life Cycle Analysis (LCA), Life Cycle Costing (LCC) and Cost–Benefit Analysis (CBA). Due to the methodological disparity of these three tools, conflicting assessment results generate confusion for many policy and business decisions. In order to interpret and integrate assessment results, the paper provides a framework that clarifies the connections and coherence between the included assessment methodologies. Building on this framework, the paper further focuses on key aspects to adapt any of the methodologies to full sustainability assessments. Aspects dealt with in the review include, for example, the reported metrics, the scope, data requirements, discounting, product- or project-related orientation, and approaches with respect to scarcity and labor requirements. In addition to these key aspects, the review shows that important connections exist: (i) the three tools can cope with social inequality, (ii) processes such as valuation techniques for LCC and CBA are common, (iii) Environmental Impact Assessment (EIA) is used as input in both LCA and CBA and (iv) LCA can be used in parallel with LCC. Furthermore, the most integrated sustainability approach combines elements of LCA and LCC to achieve the Life Cycle Sustainability Assessment (LCSA). The key aspects and the connections referred to in the review are illustrated with a case study on the treatment of end-of-life automotive glass. - Highlights: • Proliferation of assessment tools creates ambiguity and confusion. • The developed assessment framework clarifies connections between assessment tools. • Broadening LCA, key aspects are metric and data requirements. • Broadening LCC, key aspects are scope, time frame and discounting. • Broadening CBA, focus point, timespan, references, labor and scarcity are key.

  12. Bridging the gap between LCA, LCC and CBA as sustainability assessment tools

    International Nuclear Information System (INIS)

    Hoogmartens, Rob; Van Passel, Steven; Van Acker, Karel; Dubois, Maarten

    2014-01-01

    Increasing interest in sustainability has led to the development of sustainability assessment tools such as Life Cycle Analysis (LCA), Life Cycle Costing (LCC) and Cost–Benefit Analysis (CBA). Due to the methodological disparity of these three tools, conflicting assessment results generate confusion for many policy and business decisions. In order to interpret and integrate assessment results, the paper provides a framework that clarifies the connections and coherence between the included assessment methodologies. Building on this framework, the paper further focuses on key aspects to adapt any of the methodologies to full sustainability assessments. Aspects dealt with in the review include, for example, the reported metrics, the scope, data requirements, discounting, product- or project-related orientation, and approaches with respect to scarcity and labor requirements. In addition to these key aspects, the review shows that important connections exist: (i) the three tools can cope with social inequality, (ii) processes such as valuation techniques for LCC and CBA are common, (iii) Environmental Impact Assessment (EIA) is used as input in both LCA and CBA and (iv) LCA can be used in parallel with LCC. Furthermore, the most integrated sustainability approach combines elements of LCA and LCC to achieve the Life Cycle Sustainability Assessment (LCSA). The key aspects and the connections referred to in the review are illustrated with a case study on the treatment of end-of-life automotive glass. - Highlights: • Proliferation of assessment tools creates ambiguity and confusion. • The developed assessment framework clarifies connections between assessment tools. • Broadening LCA, key aspects are metric and data requirements. • Broadening LCC, key aspects are scope, time frame and discounting. • Broadening CBA, focus point, timespan, references, labor and scarcity are key.

  13. Assessing the role of learning devices and geovisualisation tools for collective action in natural resource management: Experiences from Vietnam.

    Science.gov (United States)

    Castella, Jean-Christophe

    2009-02-01

    In northern Vietnam uplands the successive policy reforms that accompanied agricultural decollectivisation triggered very rapid changes in land use in the 1990s. From a centralized system of natural resource management, a multitude of individual strategies emerged which contributed to new production interactions among farming households, changes in landscape structures, and conflicting strategies among local stakeholders. Within this context of agrarian transition, learning devices can help local communities to collectively design their own course of action towards sustainable natural resource management. This paper presents a collaborative approach combining a number of participatory methods and geovisualisation tools (e.g., spatially explicit multi-agent models and role-playing games) with the shared goal to analyse and represent the interactions between: (i) decision-making processes by individual farmers based on the resource profiles of their farms; (ii) the institutions which regulate resource access and usage; and (iii) the biophysical and socioeconomic environment. This methodological pathway is illustrated by a case study in Bac Kan Province where it successfully led to a communication platform on natural resource management. In a context of rapid socioeconomic changes, learning devices and geovisualisation tools helped embed the participatory approach within a process of community development. The combination of different tools, each with its own advantages and constraints, proved highly relevant for supporting collective natural resource management.

  14. Two-dimensional gap analysis: a tool for efficient conservation planning and biodiversity policy implementation.

    Science.gov (United States)

    Angelstam, Per; Mikusiński, Grzegorz; Rönnbäck, Britt-Inger; Ostman, Anders; Lazdinis, Marius; Roberge, Jean-Michel; Arnberg, Wolter; Olsson, Jan

    2003-12-01

    The maintenance of biodiversity by securing representative and well-connected habitat networks in managed landscapes requires a wise combination of protection, management, and restoration of habitats at several scales. We suggest that the integration of natural and social sciences in the form of "two-dimensional gap analysis" is an efficient tool for the implementation of biodiversity policies. The tool links biologically relevant "horizontal" ecological issues with "vertical" issues related to institutions and other societal matters. Using forest biodiversity as an example, we illustrate how one can combine ecological and institutional aspects of biodiversity conservation, thus facilitating environmentally sustainable regional development. In particular, we use regional gap analysis for the identification of focal forest types and habitat modelling for ascertaining the functional connectivity of "green infrastructures" as tools for the horizontal gap analysis. For the vertical dimension, we suggest how the social sciences can be used to assess the success of biodiversity policy implementation in real landscapes by identifying institutional obstacles encountered while implementing policies. We argue that this interdisciplinary approach could be applied to a whole range of other environments, including other terrestrial biota and aquatic ecosystems, where functional habitat connectivity, nonlinear responses to habitat loss and a multitude of economic and social interests co-occur in the same landscape.

  15. Conceptual Framework To Extend Life Cycle Assessment Using Near-Field Human Exposure Modeling and High-Throughput Tools for Chemicals.

    Science.gov (United States)

    Csiszar, Susan A; Meyer, David E; Dionisio, Kathie L; Egeghy, Peter; Isaacs, Kristin K; Price, Paul S; Scanlon, Kelly A; Tan, Yu-Mei; Thomas, Kent; Vallero, Daniel; Bare, Jane C

    2016-11-01

    Life Cycle Assessment (LCA) is a decision-making tool that accounts for multiple impacts across the life cycle of a product or service. This paper presents a conceptual framework to integrate human health impact assessment with risk screening approaches to extend LCA to include near-field chemical sources (e.g., those originating from consumer products and building materials) that have traditionally been excluded from LCA. A new generation of rapid human exposure modeling and high-throughput toxicity testing is transforming chemical risk prioritization and provides an opportunity for integration of screening-level risk assessment (RA) with LCA. The combined LCA and RA approach considers environmental impacts of products alongside risks to human health, which is consistent with regulatory frameworks addressing RA within a sustainability mindset. A case study is presented to juxtapose LCA and risk screening approaches for a chemical used in a consumer product. The case study demonstrates how these new risk screening tools can be used to inform toxicity impact estimates in LCA and highlights needs for future research. The framework provides a basis for developing tools and methods to support decision making on the use of chemicals in products.

  16. Group decision-making approach for flood vulnerability identification using the fuzzy VIKOR method

    Science.gov (United States)

    Lee, G.; Jun, K. S.; Chung, E.-S.

    2015-04-01

    This study proposes an improved group decision making (GDM) framework that combines the VIKOR method with data fuzzification to quantify spatial flood vulnerability over multiple criteria. In general, the GDM method is an effective tool for formulating a compromise solution that involves various decision makers, since various stakeholders may have different perspectives on their flood risk/vulnerability management responses. The GDM approach is designed to achieve consensus building that reflects the viewpoints of each participant. The fuzzy VIKOR method was developed to solve multi-criteria decision making (MCDM) problems with conflicting and noncommensurable criteria. This compromise-ranking method can be used to obtain a nearly ideal solution according to all established criteria. By combining the GDM method and the fuzzy VIKOR method, the approach can effectively propose compromise decisions. The spatial flood vulnerability of the southern Han River obtained using the GDM approach combined with the fuzzy VIKOR method was compared with that obtained using general MCDM methods, such as fuzzy TOPSIS, and classical GDM methods (i.e., Borda, Condorcet, and Copeland). As a result, the proposed fuzzy GDM approach can reduce the uncertainty in data confidence and weight derivation techniques. Thus, the combination of the GDM approach with the fuzzy VIKOR method can provide robust prioritization because it actively reflects the opinions of various groups and considers uncertainty in the input data.
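
    The crisp VIKOR core that the fuzzy variant builds on can be sketched as follows; this is a generic textbook formulation, assuming benefit-type criteria and the usual compromise weight v = 0.5, not the authors' fuzzified implementation.

```python
import numpy as np

def vikor(matrix, weights, v=0.5):
    """Rank alternatives with the (crisp) VIKOR compromise method.

    matrix:  rows = alternatives, columns = benefit criteria (higher is better).
    weights: criterion weights summing to 1.
    v:       weight of group utility versus individual regret.
    Returns S (group utility), R (individual regret) and Q (compromise index);
    a lower Q indicates a better compromise alternative.
    """
    m = np.asarray(matrix, dtype=float)
    w = np.asarray(weights, dtype=float)
    f_best = m.max(axis=0)
    f_worst = m.min(axis=0)
    span = np.where(f_best == f_worst, 1.0, f_best - f_worst)
    norm = w * (f_best - m) / span          # weighted distance from the ideal
    S = norm.sum(axis=1)                    # group utility per alternative
    R = norm.max(axis=1)                    # worst single-criterion regret
    S_span = S.max() - S.min() or 1.0       # guard against degenerate cases
    R_span = R.max() - R.min() or 1.0
    Q = v * (S - S.min()) / S_span + (1 - v) * (R - R.min()) / R_span
    return S, R, Q
```

    In a GDM setting, each stakeholder group's criterion weights would yield its own Q ranking, and the rankings are then aggregated toward consensus.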

  17. Hybrid approach to structure modeling of the histamine H3 receptor: Multi-level assessment as a tool for model verification.

    Directory of Open Access Journals (Sweden)

    Jakub Jończyk

    The crucial role of G-protein coupled receptors and the significant achievements associated with a better understanding of the spatial structure of known receptors in this family encouraged us to undertake a study on the histamine H3 receptor, whose crystal structure is still unresolved. The latest literature data and the availability of different software enabled us to build homology models of higher accuracy than previously published ones. The new models are expected to be closer to crystal structures and, therefore, much more helpful in the design of potential ligands. In this article, we describe the generation of homology models with the use of diverse tools and a hybrid assessment. Our study incorporates a hybrid assessment connecting knowledge-based scoring algorithms with a two-step ligand-based docking procedure. Knowledge-based scoring employs probability theory for global energy minimum determination, based on information about native amino acid conformations from a dataset of experimentally determined protein structures. For the two-step docking procedure, two programs were applied: GOLD was used in the first step and Glide in the second. Hybrid approaches offer advantages by combining various theoretical methods in one modeling algorithm. The biggest advantage of hybrid methods is their intrinsic ability to self-update and self-refine when additional structural data are acquired. Moreover, the diversity of computational methods and structural data used in hybrid approaches for structure prediction limits inaccuracies resulting from theoretical approximations or fuzziness of experimental data. The results of docking to the new H3 receptor model allowed us to analyze ligand-receptor interactions for reference compounds.

  18. Reification in the Learning of Square Roots in a Ninth Grade Classroom: Combining Semiotic and Discursive Approaches

    Science.gov (United States)

    Shinno, Yusuke

    2018-01-01

    This paper reports on combining semiotic and discursive approaches to reification in classroom interactions. It focuses on the discursive characteristics and semiotic processes involved in the teaching and learning of square roots in a ninth grade classroom in Japan. The purpose of this study is to characterize the development of mathematical…

  19. Nonlinear Prediction As A Tool For Determining Parameters For Phase Space Reconstruction In Meteorology

    Science.gov (United States)

    Miksovsky, J.; Raidl, A.

    Time-delay phase space reconstruction is one of the useful tools of nonlinear time series analysis, enabling a number of applications. Its utilization requires the value of the time delay to be known, as well as the value of the embedding dimension. There are several methods of estimating both these parameters; typically, the time delay is computed first, followed by the embedding dimension. Our approach is slightly different: we reconstructed the phase space for various combinations of these parameters and used it for prediction by means of the nearest neighbours in the phase space. Then some measure of the prediction's success was computed (e.g., correlation or RMSE). The position of its global maximum (or minimum) should indicate a suitable combination of time delay and embedding dimension. Several meteorological (particularly climatological) time series were used for the computations. We have also created an MS-Windows-based program implementing this approach; its basic features will be presented as well.
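
    The reconstruct-and-predict step that is scanned over (time delay, embedding dimension) pairs can be sketched for a single parameter combination; the function names and the single-nearest-neighbour predictor below are illustrative assumptions, not the presented program.

```python
import numpy as np

def embed(series, delay, dim):
    """Time-delay embedding: row i is (x_i, x_{i+tau}, ..., x_{i+(m-1)tau})."""
    n = len(series) - (dim - 1) * delay
    return np.array([series[i:i + (dim - 1) * delay + 1:delay] for i in range(n)])

def nn_forecast(series, delay, dim, horizon=1):
    """Predict `horizon` steps ahead using the nearest neighbour in phase space."""
    series = np.asarray(series, dtype=float)
    vectors = embed(series, delay, dim)
    query = vectors[-1]
    # candidates must leave room for a `horizon`-step continuation,
    # which also excludes the query vector itself
    candidates = vectors[:-horizon]
    dists = np.linalg.norm(candidates - query, axis=1)
    nearest = np.argmin(dists)
    # scalar observation `horizon` steps after the matched vector's last entry
    return series[nearest + (dim - 1) * delay + horizon]
```

    Scanning a grid of (delay, dim) pairs and recording, e.g., the RMSE of such forecasts against held-out values reproduces the selection procedure described above.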

  20. Spreadsheet tool for estimating noise reduction costs

    International Nuclear Information System (INIS)

    Frank, L.; Senden, V.; Leszczynski, Y.

    2009-01-01

    The Northeast Capital Industrial Association (NCIA) represents industry in Alberta's industrial heartland. The organization is in the process of developing a regional noise management plan (RNMP) for its member companies. The RNMP includes the development of a noise reduction cost spreadsheet tool to review the practical noise control treatments available for individual plant equipment, including the ranges of noise attenuation achievable, and to produce a budgetary prediction of the installed cost of practical noise control treatments. This paper discussed the noise reduction cost spreadsheet tool, with particular reference to best-practice noise control approaches and spreadsheet tool development, including prerequisites, assembly of the required data, approach, and the unit pricing database. Use and optimization of the noise reduction cost spreadsheet tool were also discussed. It was concluded that the noise reduction cost spreadsheet tool is an easy, interactive tool for estimating the implementation costs of different strategies and options for noise control mitigation measures, and that it was very helpful in gaining insight for noise control planning purposes. 2 tabs.

  1. Clinical treatment approach of a child with molar incisor hypomineralization (MIH combined with malocclusion.

    Directory of Open Access Journals (Sweden)

    Rossitza Kabaktchieva

    2012-04-01

    Introduction. Molar incisor hypomineralization (MIH) is defined as "hypomineralisation of systemic origin of the permanent first molars, frequently associated with affected incisors". MIH includes the presence of demarcated opacities, post-eruptive enamel breakdown and atypical restorations. The suggested management approach comprises risk identification, early diagnosis, remineralization for the prevention of caries and post-eruptive breakdown, and restorations. Clinicians very seldom notice that children with MIH usually have both hypomineralisation and malocclusion, and they do not consider a combined treatment plan.
    Aim. To present our interdisciplinary approach to a patient with MIH combined with malocclusion.
    Material and methods. We present a 9-year-old child with contusio and fractura coronae dentis noncomplicata, distal occlusion, overjet, overbite and retrusion. Two consecutive stages were defined.
    First stage:
    - Professional oral hygiene and local remineralisation therapy
    - Vital pulp therapy of tooth 21
    - Space gaining for restoration of the lost height of the molars by means of a posterior bite-plane removable appliance
    - Restoration of the molars with metal inlays
    - Lingual tipping of the lower incisors
    Second stage:
    - Class II correction
    - Growth control
    Results. First phase: tooth 21 was restored with an aesthetic composite material, and the occlusion was raised with occlusal restorations (inlays) and an orthodontic appliance. Second phase: medialisation of the mandible and restraint of maxillary growth with a functional appliance and occipital EOA until Class I occlusal relations were achieved.
    Conclusion. Children with MIH should be examined and treated comprehensively, in collaboration with an orthodontist and, if necessary, other specialists.

  2. Minilaparoscopic technique for inguinal hernia repair combining transabdominal pre-peritoneal and totally extraperitoneal approaches.

    Science.gov (United States)

    Carvalho, Gustavo L; Loureiro, Marcelo P; Bonin, Eduardo A; Claus, Christiano P; Silva, Frederico W; Cury, Antonio M; Fernandes, Flavio A M

    2012-01-01

    Endoscopic surgical repair of inguinal hernia is currently conducted using 2 techniques: the totally extraperitoneal (TEP) and the transabdominal pre-peritoneal (TAPP) hernia repair. The TEP procedure is technically advantageous because of the use of no mesh fixation and the elimination of the peritoneal flap, leading to less postoperative pain and faster recovery. The drawback is that TEP is not performed as frequently because of its complexity and longer learning curve. In this study, we propose a hybrid technique that could potentially become the gold standard of minimally invasive inguinal hernia surgery. This will be achieved by combining the established advantages of TEP and TAPP with the precision and cosmetics of minilaparoscopy (MINI). Between January and July 2011, 22 patients were admitted for endoscopic inguinal hernia repair. The combined technique was initiated with TAPP inspection and direct visualization of a minilaparoscopic trocar dissection of the preperitoneal space. A 10-mm trocar was then placed inside the previously dissected preperitoneal space, using the same umbilical TAPP skin incision. Minilaparoscopic retroperitoneal dissection was completed by TEP, and the surgical procedure was finalized with intraperitoneal review and correction of the preperitoneal work. The minilaparoscopic TEP-TAPP combined approach for inguinal hernia is feasible, safe, and allows a simple endoscopic repair. This is achieved by combining features and advantages of both the TAPP and TEP techniques using precise and sophisticated MINI instruments. Minilaparoscopic preperitoneal dissection allows a faster and easier creation of the preperitoneal space for the TEP component of the procedure.

  3. Combined Antimicrobial Activity of Photodynamic Inactivation and Antimicrobials–State of the Art

    Directory of Open Access Journals (Sweden)

    Agata Wozniak

    2018-05-01

    Antimicrobial photodynamic inactivation (aPDI) is a promising tool for the eradication of life-threatening pathogens with different resistance profiles. This study presents the state-of-the-art published studies dedicated to analyzing the bactericidal effects of combining aPDI and routinely applied antibiotics in in vitro (biofilm and planktonic cultures) and in vivo experiments. Furthermore, the current paper reviews the methodology used to obtain the published data describing the synergy between these antimicrobial approaches. The authors are convinced that even though the combined efficacy of aPDI and antimicrobials can be investigated with a wide range of methods, a unified experimental methodology in agreement with antimicrobial susceptibility testing (AST) is required to investigate possible synergistic cooperation between aPDI and antimicrobials. Conclusions concerning possible synergistic activity between the two treatments can be drawn only when appropriate assays are employed. It must be noted that some of the described papers aimed only at determining whether combined treatments exert an enhanced antibacterial outcome, without following the standard methodology for evaluating synergistic effects; nevertheless, in most of them (18 out of 27) the authors indicated the existence of synergy between the described antibacterial approaches. In general, an increase in bacterial inactivation was observed when both therapies were used in combination.

  4. Facilitating high resolution mass spectrometry data processing for screening of environmental water samples: An evaluation of two deconvolution tools

    International Nuclear Information System (INIS)

    Bade, Richard; Causanilles, Ana; Emke, Erik; Bijlsma, Lubertus; Sancho, Juan V.; Hernandez, Felix; Voogt, Pim de

    2016-01-01

    A screening approach was applied to influent and effluent wastewater samples. After injection into an LC-LTQ-Orbitrap, data analysis was performed using two deconvolution tools, MsXelerator (modules MPeaks and MS Compare) and Sieve 2.1. The outputs were searched incorporating an in-house database of > 200 pharmaceuticals and illicit drugs or ChemSpider. This hidden target screening approach led to the detection of numerous compounds, including the illicit drug cocaine and its metabolite benzoylecgonine and the pharmaceuticals carbamazepine, gemfibrozil and losartan. The compounds found using both approaches were combined, and isotopic pattern and retention time prediction were used to filter out false positives. The remaining potential positives were reanalysed in MS/MS mode and their product ions were compared with literature and/or mass spectral libraries. The inclusion of the chemical database ChemSpider led to the tentative identification of several metabolites, including paraxanthine, theobromine, theophylline and carboxylosartan, as well as the pharmaceutical phenazone. The first three of these compounds are isomers, and they were subsequently distinguished based on their product ions and predicted retention times. This work has shown that the use of deconvolution tools facilitates non-target screening and enables the identification of a higher number of compounds. - Highlights: • A hidden target non-target screening method is utilised using two databases • Two software tools (MsXelerator and Sieve 2.1) used for both methods • 22 compounds tentatively identified following MS/MS reinjection • More information gleaned from this combined approach than individually

  5. Facilitating high resolution mass spectrometry data processing for screening of environmental water samples: An evaluation of two deconvolution tools

    Energy Technology Data Exchange (ETDEWEB)

    Bade, Richard [Research Institute for Pesticides and Water, University Jaume I, Avda. Sos Baynat s/n, E-12071 Castellón (Spain); Causanilles, Ana; Emke, Erik [KWR Watercycle Research Institute, Chemical Water Quality and Health, P.O. Box 1072, 3430 BB Nieuwegein (Netherlands); Bijlsma, Lubertus; Sancho, Juan V.; Hernandez, Felix [Research Institute for Pesticides and Water, University Jaume I, Avda. Sos Baynat s/n, E-12071 Castellón (Spain); Voogt, Pim de, E-mail: w.p.devoogt@uva.nl [KWR Watercycle Research Institute, Chemical Water Quality and Health, P.O. Box 1072, 3430 BB Nieuwegein (Netherlands); Institute for Biodiversity and Ecosystem Dynamics, University of Amsterdam, P.O. Box 94248, 1090 GE Amsterdam (Netherlands)

    2016-11-01

    A screening approach was applied to influent and effluent wastewater samples. After injection into an LC-LTQ-Orbitrap, data analysis was performed using two deconvolution tools, MsXelerator (modules MPeaks and MS Compare) and Sieve 2.1. The outputs were searched incorporating an in-house database of > 200 pharmaceuticals and illicit drugs or ChemSpider. This hidden target screening approach led to the detection of numerous compounds, including the illicit drug cocaine and its metabolite benzoylecgonine and the pharmaceuticals carbamazepine, gemfibrozil and losartan. The compounds found using both approaches were combined, and isotopic pattern and retention time prediction were used to filter out false positives. The remaining potential positives were reanalysed in MS/MS mode and their product ions were compared with literature and/or mass spectral libraries. The inclusion of the chemical database ChemSpider led to the tentative identification of several metabolites, including paraxanthine, theobromine, theophylline and carboxylosartan, as well as the pharmaceutical phenazone. The first three of these compounds are isomers, and they were subsequently distinguished based on their product ions and predicted retention times. This work has shown that the use of deconvolution tools facilitates non-target screening and enables the identification of a higher number of compounds. - Highlights: • A hidden target non-target screening method is utilised using two databases • Two software tools (MsXelerator and Sieve 2.1) used for both methods • 22 compounds tentatively identified following MS/MS reinjection • More information gleaned from this combined approach than individually.

  6. Combining non-invasive transcranial brain stimulation with neuroimaging and electrophysiology: Current approaches and future perspectives

    DEFF Research Database (Denmark)

    Bergmann, Til Ole; Karabanov, Anke; Hartwigsen, Gesa

    2016-01-01

    Non-invasive transcranial brain stimulation (NTBS) techniques such as transcranial magnetic stimulation (TMS) and transcranial current stimulation (TCS) are important tools in human systems and cognitive neuroscience because they are able to reveal the relevance of certain brain structures...... are technically demanding. We argue that the benefit from this combination is twofold. Firstly, neuroimaging and electrophysiology can inform subsequent NTBS, providing the required information to optimize where, when, and how to stimulate the brain. Information can be achieved both before and during the NTBS...... experiment, requiring consecutive and concurrent applications, respectively. Secondly, neuroimaging and electrophysiology can provide the readout for neural changes induced by NTBS. Again, using either concurrent or consecutive applications, both "online" NTBS effects immediately following the stimulation...

  7. Chemical entity recognition in patents by combining dictionary-based and statistical approaches

    Science.gov (United States)

    Akhondi, Saber A.; Pons, Ewoud; Afzal, Zubair; van Haagen, Herman; Becker, Benedikt F.H.; Hettne, Kristina M.; van Mulligen, Erik M.; Kors, Jan A.

    2016-01-01

    We describe the development of a chemical entity recognition system and its application in the CHEMDNER-patent track of BioCreative 2015. This community challenge includes a Chemical Entity Mention in Patents (CEMP) recognition task and a Chemical Passage Detection (CPD) classification task. We addressed both tasks by an ensemble system that combines a dictionary-based approach with a statistical one. For this purpose the performance of several lexical resources was assessed using Peregrine, our open-source indexing engine. We combined our dictionary-based results on the patent corpus with the results of tmChem, a chemical recognizer using a conditional random field classifier. To improve the performance of tmChem, we utilized three additional features, viz. part-of-speech tags, lemmas and word-vector clusters. When evaluated on the training data, our final system obtained an F-score of 85.21% for the CEMP task, and an accuracy of 91.53% for the CPD task. On the test set, the best system ranked sixth among 21 teams for CEMP with an F-score of 86.82%, and second among nine teams for CPD with an accuracy of 94.23%. The differences in performance between the best ensemble system and the statistical system separately were small. Database URL: http://biosemantics.org/chemdner-patents PMID:27141091

  8. Microgrid Analysis Tools Summary

    Energy Technology Data Exchange (ETDEWEB)

    Jimenez, Antonio [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Haase, Scott G [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Mathur, Shivani [Formerly NREL

    2018-03-05

    The over-arching goal of the Alaska Microgrid Partnership is to reduce the use of total imported fuel into communities to secure all energy services by at least 50% in Alaska's remote microgrids without increasing system life cycle costs while also improving overall system reliability, security, and resilience. One goal of the Alaska Microgrid Partnership is to investigate whether a combination of energy efficiency and high-contribution (from renewable energy) power systems can reduce total imported energy usage by 50% while reducing life cycle costs and improving reliability and resiliency. This presentation provides an overview of the following four renewable energy optimization tools: the Distributed Energy Resources Customer Adoption Model (DER-CAM), the Microgrid Design Toolkit (MDT), the Renewable Energy Optimization (REopt) tool, and the Hybrid Optimization Model for Electric Renewables (HOMER). Information is from respective tool websites, tool developers, and author experience.

  9. Combining multiple approaches and optimized data resolution for an improved understanding of stream temperature dynamics of a forested headwater basin in the Southern Appalachians

    Science.gov (United States)

    Belica, L.; Mitasova, H.; Caldwell, P.; McCarter, J. B.; Nelson, S. A. C.

    2017-12-01

    Thermal regimes of forested headwater streams continue to be an area of active research as climatic, hydrologic, and land cover changes can influence water temperature, a key aspect of aquatic ecosystems. Widespread monitoring of stream temperatures has provided an important data source, yielding insights on the temporal and spatial patterns and the underlying processes that influence stream temperature. However, small forested streams remain challenging to model due to the high spatial and temporal variability of stream temperatures and of the climatic and hydrologic conditions that drive them. Technological advances and increased computational power continue to provide new tools and measurement methods and have allowed spatially explicit analyses of dynamic natural systems at greater temporal resolutions than previously possible. With the goal of understanding how current stream temperature patterns and processes may respond to changing land cover and hydroclimatological conditions, we combined high-resolution, spatially explicit geospatial modeling with deterministic heat flux modeling approaches, using data sources that ranged from traditional hydrological and climatological measurements to emerging remote sensing techniques. Initial analyses of stream temperature monitoring data revealed that high temporal resolution (5 minutes) and fine measurement resolution were needed to guide field data collection for further heat flux modeling. By integrating multiple approaches and optimizing data resolution for the processes being investigated, small but ecologically significant differences in stream thermal regimes were revealed. In this case, multi-approach research contributed to the identification of the dominant mechanisms driving stream temperature in the study area and advanced our understanding of the current thermal fluxes and how they may change as environmental conditions change in the future.

  10. Combining Approach in Stages with Least Squares for fits of data in hyperelasticity

    Science.gov (United States)

    Beda, Tibi

    2006-10-01

    The present work concerns a method of continuous approximation by block of a continuous function; a method of approximation combining the Approach in Stages with the finite domains Least Squares. An identification procedure by sub-domains: basic generating functions are determined step-by-step permitting their weighting effects to be felt. This procedure allows one to be in control of the signs and to some extent of the optimal values of the parameters estimated, and consequently it provides a unique set of solutions that should represent the real physical parameters. Illustrations and comparisons are developed in rubber hyperelastic modeling. To cite this article: T. Beda, C. R. Mecanique 334 (2006).

  11. A linguistic approach to solving of the problem of technological adjustment of combines

    Directory of Open Access Journals (Sweden)

    Lyudmila V. Borisova

    2017-06-01

    Full Text Available Introduction: The article presents a linguistic approach to the technological adjustment of combine harvesters in field conditions. A short characterisation of the subject domain is given, and the place of the task of adjusting the combine harvester working bodies within the harvesting process is considered. Several groups of attributes of this task are distinguished: external signs of degraded work quality, adjustable machine parameters, and parameters of technical condition. Numerical data characterising the interrelations between the external signs and the machine parameters are provided. Materials and Methods: A combine harvester is a complex dynamic system operating under constantly changing external conditions, which constrains the methods of technological adjustment that can be applied. Both quantitative and qualitative information is used to control harvesting. The presence of different types of uncertainty in the semantic spaces of environmental factors and machine parameters motivates a method of technological adjustment based on fuzzy inference. Results: As a result of the analysis, a decision-making methodology for fuzzy environments is adapted to the subject domain. A generalised scheme of fuzzy control of the technological adjustment of the machine is proposed, and models of the semantic spaces under study are considered. The feasibility of deductive and inductive inference of decisions for various tasks of preliminary setup and adjustment is shown. A formal logical scheme of the decision-making process based on fuzzy expert knowledge is proposed; it includes the main stages of the solution: fuzzification, composition and defuzzification. The question of the quantitative assessment of the consistency of expert knowledge is considered. 
The examples of the formulation
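    The fuzzification-composition-defuzzification chain mentioned in the abstract can be illustrated with a minimal Mamdani-style controller. The membership functions, rule base, and output scale below are invented for illustration and are not taken from the article:

```python
# Illustrative fuzzy-inference sketch: a hypothetical two-rule base
# linking observed grain loss to a fan-speed correction.

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def adjust_fan_speed(grain_loss):
    # Fuzzification: degree to which the observed loss is "low" or "high".
    low = tri(grain_loss, -1.0, 0.0, 2.0)
    high = tri(grain_loss, 1.0, 3.0, 5.0)
    # Composition (Mamdani max-min) over a discretized output universe,
    # then centroid defuzzification.
    xs = [i * 0.1 for i in range(-20, 21)]           # speed correction units
    num = den = 0.0
    for x in xs:
        mu = max(min(low, tri(x, -2.0, 0.0, 1.0)),   # IF loss low THEN hold
                 min(high, tri(x, 0.0, 1.5, 2.0)))   # IF loss high THEN raise
        num += mu * x
        den += mu
    return num / den if den else 0.0

print(round(adjust_fan_speed(0.5), 2))  # small loss -> small correction
print(round(adjust_fan_speed(3.0), 2))  # large loss -> larger correction
```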

  12. IPeak: An open source tool to combine results from multiple MS/MS search engines.

    Science.gov (United States)

    Wen, Bo; Du, Chaoqin; Li, Guilin; Ghali, Fawaz; Jones, Andrew R; Käll, Lukas; Xu, Shaohang; Zhou, Ruo; Ren, Zhe; Feng, Qiang; Xu, Xun; Wang, Jun

    2015-09-01

    Liquid chromatography coupled tandem mass spectrometry (LC-MS/MS) is an important technique for detecting peptides in proteomics studies. Here, we present an open source software tool, termed IPeak, a peptide identification pipeline that is designed to combine the Percolator post-processing algorithm and a multi-search strategy to enhance the sensitivity of peptide identifications without compromising accuracy. IPeak provides a graphical user interface (GUI) as well as a command-line interface, is implemented in Java, and works on all three major operating system platforms: Windows, Linux/Unix and OS X. IPeak has been designed to work with the mzIdentML standard from the Proteomics Standards Initiative (PSI) as an input and output, and has also been fully integrated into the associated mzidLibrary project, providing access to the overall pipeline, as well as modules for calling Percolator on individual search engine result files. The integration thus enables IPeak (and Percolator) to be used in conjunction with any software packages implementing the mzIdentML data standard. IPeak is freely available and can be downloaded under an Apache 2.0 license at https://code.google.com/p/mzidentml-lib/. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
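    As a rough illustration of what a multi-search strategy buys, the sketch below pools peptide-spectrum matches (PSMs) from several engines and keeps, per spectrum, the identification supported by the most engines (ties broken by best score). IPeak itself rescores combined results with Percolator rather than voting like this; all engine names, peptides, and scores are hypothetical:

```python
# Toy multi-engine PSM combination: majority vote per spectrum, then
# best score. An illustration of the general strategy only.

from collections import defaultdict

def combine_psms(results_per_engine):
    """results_per_engine: {engine: {spectrum_id: (peptide, score)}}.
    Returns {spectrum_id: peptide} by vote count, then best score."""
    votes = defaultdict(lambda: defaultdict(list))
    for engine, psms in results_per_engine.items():
        for spec, (pep, score) in psms.items():
            votes[spec][pep].append(score)
    combined = {}
    for spec, peps in votes.items():
        combined[spec] = max(
            peps.items(), key=lambda kv: (len(kv[1]), max(kv[1]))
        )[0]
    return combined

results = {
    "engineA": {"s1": ("PEPTIDE", 0.9), "s2": ("SEQVENCE", 0.7)},
    "engineB": {"s1": ("PEPTIDE", 0.8), "s2": ("OTHERPEP", 0.75)},
    "engineC": {"s2": ("SEQVENCE", 0.6)},
}
print(combine_psms(results))  # spectrum s2 resolves by vote count
```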

  13. An empirical approach for evaluating the usability of model-driven tools

    NARCIS (Netherlands)

    Condori-Fernandez, Nelly; Panach, Jose Ignacio; Baars, Arthur Iwan; Vos, Tanja; Pastor, Oscar

    2013-01-01

    MDD tools are very useful to draw conceptual models and to automate code generation. Even though this would bring many benefits, wide adoption of MDD tools is not yet a reality. Various research activities are being undertaken to find why and to provide the required solutions. However, insufficient

  14. Systems Prototyping with Fourth Generation Tools.

    Science.gov (United States)

    Sholtys, Phyllis

    1983-01-01

    The development of information systems using an engineering approach that uses both traditional programming techniques and fourth generation software tools is described. Fourth generation applications tools are used to quickly develop a prototype system that is revised as the user clarifies requirements. (MLW)

  15. Successful Recanalization of a Complete Lobar Bronchial Stenosis in a Lung Transplant Patient Using a Combined Percutaneous and Bronchoscopic Approach

    International Nuclear Information System (INIS)

    Miraglia, Roberto; Vitulo, Patrizio; Maruzzelli, Luigi; Burgio, Gaetano; Caruso, Settimo; Bertani, Alessandro; Callari, Adriana; Luca, Angelo

    2016-01-01

    Airway stenosis is a major complication after lung transplantation that is usually managed with a combination of interventional endoscopic techniques, including endobronchial debridement, balloon dilation, and stent placement. Herein, we report a successful case of recanalization of a complete stenosis of the right middle lobe bronchus in a lung transplant patient, by using a combined percutaneous–bronchoscopic approach after the failure of endobronchial debridement

  16. Successful Recanalization of a Complete Lobar Bronchial Stenosis in a Lung Transplant Patient Using a Combined Percutaneous and Bronchoscopic Approach

    Energy Technology Data Exchange (ETDEWEB)

    Miraglia, Roberto, E-mail: rmiraglia@ismett.edu [Mediterranean Institute for Transplantation and Advanced Specialized Therapies (ISMETT), Radiology Service, Department of Diagnostic and Therapeutic Services (Italy); Vitulo, Patrizio, E-mail: pvitulo@ismett.edu [Mediterranean Institute for Transplantation and Advanced Specialized Therapies (ISMETT), Pulmonology Unit, Department for the Treatment and Study of Cardiothoracic Diseases and Cardiothoracic Transplantation (Italy); Maruzzelli, Luigi, E-mail: lmaruzzelli@ismett.edu [Mediterranean Institute for Transplantation and Advanced Specialized Therapies (ISMETT), Radiology Service, Department of Diagnostic and Therapeutic Services (Italy); Burgio, Gaetano, E-mail: gburgio@ismett.edu [Mediterranean Institute for Transplantation and Advanced Specialized Therapies (ISMETT), Operating Room Service, Department of Anesthesia and Intensive Care (Italy); Caruso, Settimo, E-mail: secaruso@ismett.edu [Mediterranean Institute for Transplantation and Advanced Specialized Therapies (ISMETT), Radiology Service, Department of Diagnostic and Therapeutic Services (Italy); Bertani, Alessandro, E-mail: abertani@ismett.edu [Mediterranean Institute for Transplantation and Advanced Specialized Therapies (ISMETT), Thoracic Surgery and Lung Transplantation Unit, Department for the Treatment and Study of Cardiothoracic Diseases and Cardiothoracic Transplantation (Italy); Callari, Adriana, E-mail: acallari@ismett.edu [Mediterranean Institute for Transplantation and Advanced Specialized Therapies (ISMETT), Pulmonology Unit, Department for the Treatment and Study of Cardiothoracic Diseases and Cardiothoracic Transplantation (Italy); Luca, Angelo, E-mail: aluca@ismett.edu [Mediterranean Institute for Transplantation and Advanced Specialized Therapies (ISMETT), Radiology Service, Department of Diagnostic and Therapeutic Services (Italy)

    2016-03-15

    Airway stenosis is a major complication after lung transplantation that is usually managed with a combination of interventional endoscopic techniques, including endobronchial debridement, balloon dilation, and stent placement. Herein, we report a successful case of recanalization of a complete stenosis of the right middle lobe bronchus in a lung transplant patient, by using a combined percutaneous–bronchoscopic approach after the failure of endobronchial debridement.

  17. Willingness to pay and price elasticities of demand for energy-efficient appliances: Combining the hedonic approach and demand systems

    Energy Technology Data Exchange (ETDEWEB)

    Galarraga, Ibon, E-mail: ibon.galarraga@bc3research.org; Gonzalez-Eguino, Mikel, E-mail: mikel.gonzalez@bc3research.org; Markandya, Anil, E-mail: anil.markandya@bc3research.org

    2011-12-15

    This article proposes a combined approach for estimating willingness to pay for the attributes represented by energy efficiency labels and for providing reliable price elasticities of demand (own and cross) for close substitutes (e.g. those with low energy efficiency and those with higher energy efficiency). This is done by using the results of the hedonic approach together with the Quantity Based Demand System (QBDS) model. The elasticity results obtained with the latter are then compared with those simulated using the Linear Almost Ideal Demand System (LA/AIDS). The methodology is applied to the dishwasher market in Spain: it is found that 15.6% of the final price is actually paid for the energy efficiency attribute. This accounts for about Euro 80 of the average market price. The elasticity results confirm that energy efficient appliances are more price elastic than regular ones. - Highlights: • The article shows a combined approach for estimating willingness to pay for energy efficiency labels and price elasticities. • The results of the hedonic approach are used together with the Quantity Based Demand System (QBDS) model. • The elasticity results are compared with those simulated using the Linear Almost Ideal Demand System (LA/AIDS). • The methodology is applied to the dishwasher market in Spain.
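    The hedonic step can be sketched as a regression of log price on appliance attributes including an energy-label dummy, whose coefficient approximates the price share paid for the label. The data, attributes, and coefficients below are synthetic stand-ins, not the Spanish dishwasher data used in the article:

```python
# Toy hedonic regression: log(price) on an efficiency-label dummy and
# capacity. All data are synthetic; the "true" premium is set to 0.145.

import math, random

def ols(X, y):
    """Plain least squares via normal equations (tiny problems only)."""
    k = len(X[0])
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    Xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    for i in range(k):                      # Gauss-Jordan elimination
        p = XtX[i][i]
        XtX[i] = [v / p for v in XtX[i]]; Xty[i] /= p
        for j in range(k):
            if j != i:
                f = XtX[j][i]
                XtX[j] = [a - f * b for a, b in zip(XtX[j], XtX[i])]
                Xty[j] -= f * Xty[i]
    return Xty

random.seed(0)
rows, prices = [], []
for _ in range(200):
    label = random.random() < 0.5           # energy-label dummy
    capacity = random.uniform(9, 15)        # e.g. place settings
    logp = 5.5 + 0.145 * label + 0.02 * capacity + random.gauss(0, 0.02)
    rows.append([1.0, float(label), capacity])
    prices.append(logp)

beta = ols(rows, prices)
premium = math.exp(beta[1]) - 1             # price share paid for the label
print(f"estimated label premium: {premium:.1%}")
```

With synthetic data the regression recovers a premium near the planted 14.5%, mirroring how the article backs out the 15.6% share from observed prices.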

  18. Analysis and Transformation Tools for Constrained Horn Clause Verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2014-01-01

    Several techniques and tools have been developed for verification of properties expressed as Horn clauses with constraints over a background theory (CHC). Current CHC verification tools implement intricate algorithms and are often limited to certain subclasses of CHC problems. Our aim in this work...... is to investigate the use of a combination of off-the-shelf techniques from the literature in analysis and transformation of Constraint Logic Programs (CLPs) to solve challenging CHC verification problems. We find that many problems can be solved using a combination of tools based on well-known techniques from...... abstract interpretation, semantics-preserving transformations, program specialisation and query-answer transformations. This gives insights into the design of automatic, more general CHC verification tools based on a library of components....

  19. The Halden Reactor Project workshop on improved system development using case-tools based on formal methods

    International Nuclear Information System (INIS)

    Gran, Bjoern Axel; Sivertsen, Terje; Stoelen, Ketil; Thunem, Harald; Zhang, Wenhui

    1999-02-01

    The workshop 'Improved system development using case-tools based on formal methods' was organised in Halden, December 1-2, 1998. The purpose of the workshop was to present and discuss the state-of-the-art with respect to formal approaches. The workshop had two invited presentations: 'Formality in specification and modelling: developments in software engineering practice' by John Fitzgerald (Centre for Software Reliability, UK), and 'Formal methods in industry - reaching results when correctness is not the only issue' by Oeystein Haugen (Ericsson NorARC, Norway). The workshop also had several presentations divided into three sessions on industrial experience, tools, and combined approaches. Each day there was a discussion. The first was on the effect of formalization, while the second was on the role of formal verification. At the end of the workshop, the presentations and discussions were summarised into specific recommendations. This report summarises the presentations of the speakers, the discussions, the recommendations, and the demonstrations given at the workshop (author) (ml)

  20. Setup Analysis: Combining SMED with Other Tools

    Directory of Open Access Journals (Sweden)

    Stadnicka Dorota

    2015-02-01

    Full Text Available The purpose of this paper is to propose a methodology for setup analysis that can be implemented mainly in small and medium enterprises which are not yet convinced of the value of setup improvement. The methodology was developed after research that delineated the problem: companies still have difficulties with long setup times, and many of them do nothing to decrease these times. A long setup is, by itself, not a sufficient reason for companies to undertake any actions towards setup time reduction. To encourage companies to implement SMED, it is essential to analyse changeovers in order to discover problems. The proposed methodology can genuinely encourage management to take a decision about SMED implementation, and this was verified in a production company. The setup analysis methodology is made up of seven steps. Four of them concern a setup analysis in a chosen area of a company, such as a work stand which is a bottleneck with many setups; here the goal is to convince the management to begin actions concerning setup improvement. The last three steps are related to a specific setup, and there the goal is to reduce the setup time and the risk of problems which can appear during the setup. In this paper, tools such as SMED, Pareto analysis, statistical analysis and FMEA, among others, were used.
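    The Pareto-analysis step of such a setup analysis can be sketched as ranking setup activities by time consumed and isolating the vital few that account for roughly 80% of total setup time. Activity names and durations below are invented for illustration:

```python
# Toy Pareto analysis of changeover activities: which activities make
# up the first ~80% of total setup time?

def pareto_vital_few(durations, threshold=0.8):
    """durations: {activity: minutes}. Return the activities covering
    the first `threshold` share of total time, longest first."""
    total = sum(durations.values())
    ranked = sorted(durations.items(), key=lambda kv: kv[1], reverse=True)
    vital, cum = [], 0.0
    for activity, minutes in ranked:
        vital.append(activity)
        cum += minutes / total
        if cum >= threshold:
            break
    return vital

setup = {"tool search": 35, "die change": 25, "adjustment trials": 20,
         "cleaning": 10, "paperwork": 6, "walking": 4}
print(pareto_vital_few(setup))
```

In a SMED study, the vital-few activities identified this way are the first candidates for conversion from internal to external setup.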

  1. ApkCombiner: Combining Multiple Android Apps to Support Inter-App Analysis

    OpenAIRE

    Li , Li; Bartel , Alexandre; Bissyandé , Tegawendé ,; Klein , Jacques; Traon , Yves ,

    2015-01-01

    Part 8: Mobile and Cloud Services Security; International audience; Android apps are made of components which can leak information between one another using the ICC mechanism. With the growing momentum of Android, a number of research contributions have led to tools for the intra-app analysis of Android apps. Unfortunately, these state-of-the-art approaches, and the associated tools, have long left out the security flaws that arise across the boundaries of single apps, in the interaction betw...

  2. Advanced strategies for end-stage heart failure: combining regenerative approaches with LVAD, a new horizon?

    Directory of Open Access Journals (Sweden)

    Cheyenne Tseng

    2015-04-01

    Full Text Available Despite the improved treatment of cardiovascular diseases, the population with end-stage heart failure is progressively growing. The scarcity of the gold standard therapy, heart transplantation, demands novel therapeutic approaches. For patients awaiting transplantation, ventricular assist devices have been of great benefit to survival. To allow explantation of the assist device and obviate heart transplantation, sufficient and durable myocardial recovery is necessary. However, explant rates so far are low. Combining mechanical circulatory support with regenerative therapies such as cell(-based) therapy and biomaterials might give rise to improved long-term results. Although synergistic effects are suggested with mechanical support and stem cell therapy, evidence in both the preclinical and clinical setting is lacking. This review focuses on advanced and innovative strategies for the treatment of end-stage heart failure and furthermore appraises clinical experience with combined strategies.

  3. Integrating uncertainties to the combined environmental and economic assessment of algal biorefineries: A Monte Carlo approach.

    Science.gov (United States)

    Pérez-López, Paula; Montazeri, Mahdokht; Feijoo, Gumersindo; Moreira, María Teresa; Eckelman, Matthew J

    2018-06-01

    The economic and environmental performance of microalgal processes has been widely analyzed in recent years. However, few studies propose an integrated process-based approach to evaluate economic and environmental indicators simultaneously. Biodiesel is usually the single product and the effect of environmental benefits of co-products obtained in the process is rarely discussed. In addition, there is wide variation of the results due to inherent variability of some parameters as well as different assumptions in the models and limited knowledge about the processes. In this study, two standardized models were combined to provide an integrated simulation tool allowing the simultaneous estimation of economic and environmental indicators from a unique set of input parameters. First, a harmonized scenario was assessed to validate the joint environmental and techno-economic model. The findings were consistent with previous assessments. In a second stage, a Monte Carlo simulation was applied to evaluate the influence of variable and uncertain parameters in the model output, as well as the correlations between the different outputs. The simulation showed a high probability of achieving favorable environmental performance for the evaluated categories and a minimum selling price ranging from $11 gal⁻¹ to $106 gal⁻¹. Greenhouse gas emissions and minimum selling price were found to have the strongest positive linear relationship, whereas eutrophication showed weak correlations with the other indicators (namely greenhouse gas emissions, cumulative energy demand and minimum selling price). Process parameters (especially biomass productivity and lipid content) were the main source of variation, whereas uncertainties linked to the characterization methods and economic parameters had limited effect on the results. Copyright © 2018 Elsevier B.V. All rights reserved.
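    The Monte Carlo idea can be sketched as sampling the uncertain process parameters, pushing each draw through the combined techno-economic/environmental model, and inspecting output spread and correlations. The toy model, parameter ranges, and constants below are stand-ins, not the authors' actual simulation:

```python
# Toy Monte Carlo propagation: sample productivity and lipid content,
# derive a yield proxy, and compute stand-in cost (MSP) and emissions
# (GHG) outputs, then check their correlation.

import random

def pearson(xs, ys):
    """Sample Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return sxy / (sx * sy)

random.seed(42)
msp, ghg = [], []                                # selling price, emissions
for _ in range(5000):
    productivity = random.uniform(15, 35)        # g/m2/day (assumed range)
    lipid_frac = random.uniform(0.15, 0.40)      # lipid content (assumed)
    fuel_yield = productivity * lipid_frac       # toy yield proxy
    msp.append(2000.0 / fuel_yield)              # cost inversely tied to yield
    ghg.append(50.0 / fuel_yield + random.gauss(0, 0.5))

r = pearson(msp, ghg)
print(f"corr(MSP, GHG) = {r:.2f}")
```

Because both toy outputs scale with the same yield proxy, the correlation comes out strongly positive, echoing the strong MSP-GHG relationship reported in the abstract.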

  4. Visualization Tools for Teaching Computer Security

    Science.gov (United States)

    Yuan, Xiaohong; Vega, Percy; Qadah, Yaseen; Archer, Ricky; Yu, Huiming; Xu, Jinsheng

    2010-01-01

    Using animated visualization tools has been an important teaching approach in computer science education. We have developed three visualization and animation tools that demonstrate various information security concepts and actively engage learners. The information security concepts illustrated include: packet sniffer and related computer network…

  5. A protein prioritization approach tailored for the FA/BRCA pathway.

    Directory of Open Access Journals (Sweden)

    Anneke Haitjema

    Full Text Available Fanconi anemia (FA) is a heterogeneous recessive disorder associated with a markedly elevated risk to develop cancer. To date sixteen FA genes have been identified, three of which predispose heterozygous mutation carriers to breast cancer. The FA proteins work together in a genome maintenance pathway, the so-called FA/BRCA pathway, which is important during the S phase of the cell cycle. Since not all FA patients can be linked to (one of) the sixteen known complementation groups, new FA genes remain to be identified. In addition the complex FA network remains to be further unravelled. One of the FA genes, FANCI, has been identified via a combination of bioinformatic techniques exploiting FA protein properties and genetic linkage. The aim of this study was to develop a prioritization approach for proteins of the entire human proteome that potentially interact with the FA/BRCA pathway or are novel candidate FA genes. To this end, we combined the original bioinformatics approach based on the properties of the first thirteen FA proteins identified with publicly available tools for protein-protein interactions, literature mining (Nermal) and a protein function prediction tool (FuncNet). Importantly, the three newest FA proteins FANCO/RAD51C, FANCP/SLX4, and XRCC2 displayed scores in the range of the already known FA proteins. Likewise, a prime candidate FA gene based on next generation sequencing, which had a very low score, was subsequently disproven for the FA phenotype by functional studies. Furthermore, the approach strongly enriches for GO terms such as DNA repair, response to DNA damage stimulus, and cell cycle-regulated genes. Additionally, overlaying the top 150 with a haploinsufficiency probability score renders the approach more tailored for identifying breast cancer related genes. This approach may be useful for prioritization of putative novel FA or breast cancer genes from next generation sequencing efforts.

  6. Language Model Combination and Adaptation Using Weighted Finite State Transducers

    Science.gov (United States)

    Liu, X.; Gales, M. J. F.; Hieronymus, J. L.; Woodland, P. C.

    2010-01-01

    In speech recognition systems language models (LMs) are often constructed by training and combining multiple n-gram models. They can either be used to represent different genres or tasks found in diverse text sources, or to capture stochastic properties of different linguistic symbol sequences, for example, syllables and words. Unsupervised LM adaptation may also be used to further improve robustness to varying styles or tasks. When using these techniques, extensive software changes are often required. In this paper an alternative and more general approach based on weighted finite state transducers (WFSTs) is investigated for LM combination and adaptation. As it is entirely based on well-defined WFST operations, minimal change to decoding tools is needed. A wide range of LM combination configurations can be flexibly supported. An efficient on-the-fly WFST decoding algorithm is also proposed. Significant error rate gains of 7.3% relative were obtained on a state-of-the-art broadcast audio recognition task using a history-dependently adapted multi-level LM modelling both syllable and word sequences.
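    Stripped of the WFST machinery, the underlying effect of combining component LMs can be illustrated by simple linear interpolation over a shared vocabulary. The two toy unigram models and the weight below are hypothetical; real systems interpolate over n-gram histories (and, in the paper, express this as WFST operations):

```python
# Toy linear LM interpolation: P(w) = lam * P1(w) + (1 - lam) * P2(w).

def interpolate_lms(p1, p2, lam=0.6):
    """Interpolate two word-probability tables over their union vocabulary."""
    vocab = set(p1) | set(p2)
    return {w: lam * p1.get(w, 0.0) + (1 - lam) * p2.get(w, 0.0)
            for w in vocab}

news_lm = {"market": 0.5, "report": 0.3, "goal": 0.2}
sport_lm = {"goal": 0.6, "match": 0.3, "report": 0.1}
mixed = interpolate_lms(news_lm, sport_lm)
print(f"P(goal) = {mixed['goal']:.2f}")  # 0.6*0.2 + 0.4*0.6 = 0.36
```

The interpolated table remains a valid distribution (it still sums to one), which is why such combinations can be dropped into a decoder without further normalization.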

  7. The combination of two training approaches to improve older adults' driving safety.

    Science.gov (United States)

    Bédard, Michel; Porter, Michelle M; Marshall, Shawn; Isherwood, Ivy; Riendeau, Julie; Weaver, Bruce; Tuokko, Holly; Molnar, Frank; Miller-Polgar, Jan

    2008-03-01

    An increasing number of older adults rely on the automobile for transportation. Educational approaches based on the specific needs of older drivers may help to optimize safe driving. We examined if the combination of an in-class education program with on-road education would lead to improvements in older drivers' knowledge of safe driving practices and on-road driving evaluations. We used a multisite, randomized controlled trial approach. Participants in the intervention group received the in-class and on-road education; those in the control group waited and were offered the education afterwards. We measured knowledge of safe driving practices before and after the in-class component of the program and on-road driving skills before and after the whole program. Participants' knowledge improved from 61% of correct answers before the in-class education component to 81% after (p < .001). The on-road evaluation results suggested improvements on some aspects of safe driving (e.g., moving in roadway, p < .05) but not on others. The results of this study demonstrate that education programs focused on the needs of older drivers may help improve their knowledge of safe driving practices and actual driving performance. Further research is required to determine if these changes will affect other variables such as driver confidence and crash rates.

  8. The Tools, Approaches and Applications of Visual Literacy in the Visual Arts Department of Cross River University of Technology, Calabar, Nigeria

    Science.gov (United States)

    Ecoma, Victor

    2016-01-01

    The paper reflects upon the tools, approaches and applications of visual literacy in the Visual Arts Department of Cross River University of Technology, Calabar, Nigeria. The objective of the discourse is to examine how the visual arts training and practice equip students with skills in visual literacy through methods of production, materials and…

  9. Reprocessed and combined thorium fuel cycles in a PWR system with a micro-heterogeneous approach

    International Nuclear Information System (INIS)

    Monteiro, Fabiana B.A.; Castro, Victor F.; Faria, Rochkhudson B. de; Pereira, Claubia; Fortini, Angela

    2015-01-01

    Micro-heterogeneous approaches were used to study the behavior of reprocessed fuel spiked with thorium in a PWR fuel element, considering a (TRU-Th) cycle. The goal is to achieve a higher burnup using three different configurations to model the fuel element in SCALE 6.0. The reprocessed fuels were obtained using the ORIGEN 2.1 code from a spent PWR standard fuel (burned to 33,000 MWd/tHM) with 3.1% initial enrichment. The spent fuel remained in the cooling pool for five years and was then reprocessed using the UREX+ technique. Three configurations of the micro-heterogeneous approach were analyzed, and the k-inf and plutonium evolution during burnup were evaluated. The preliminary results show that the behavior of advanced fuel based on transuranic elements spiked with thorium and a micro-heterogeneous approach is satisfactory in PWRs, and the configuration that uses a combination of Th and TRU (configuration 1) seems to be the most promising, since it has higher values of k-inf during burnup compared with the other configurations. (author)

  10. Synthetic approaches towards new polymer systems by the combination of living carbocationic and anionic polymerizations

    DEFF Research Database (Denmark)

    Feldthusen, Jesper; Ivan, Bela; Muller, Axel. H.E.

    1996-01-01

    Recent efforts to obtain block copolymers by a combination of living carbocationic and anionic polymerizations are presented. When tolyl-ended polyisobutylene was used as a macroinitiator for the anionic polymerization of methacrylate derivatives, mixtures of homopolymers and block copolymers were formed due...... to incomplete lithiation of this chain end. In another approach a new functionalization method was developed by end-quenching living polyisobutylene with 1,1-diphenylethylene. After transformation of the groups into 2,2-diphenylvinyl end groups and lithiation, polymers were synthesized from protected acrylate

  11. Review: Henry E. Brady & David Collier (Eds.) (2004). Rethinking Social Inquiry: Diverse Tools, Shared Standards

    Directory of Open Access Journals (Sweden)

    Matthias Catón

    2006-03-01

    Full Text Available The book Rethinking Social Inquiry, edited by Henry E. BRADY and David COLLIER, is a response to a book by KING, KEOHANE and VERBA (1994) that aimed to introduce quantitative standards to qualitative research. The authors of the book reviewed here criticize many of the suggestions made there, arguing that qualitative research requires other tools. Nevertheless, they agree that the foundations of research design are similar. The book comprises a comprehensive critique of mainstream quantitative techniques, describes a set of qualitative tools for research, and addresses how to combine qualitative and quantitative approaches to maximize analytical leverage. It is an excellent contribution to the methodological debate in the social sciences. URN: urn:nbn:de:0114-fqs0602309

  12. Systematic review of sex work interventions in sub-Saharan Africa: examining combination prevention approaches.

    Science.gov (United States)

    Awungafac, George; Delvaux, Therese; Vuylsteke, Bea

    2017-08-01

    The incidence of HIV and sexually transmitted infections is disproportionately high among sex workers (SW). We aimed to update the evidence on the effectiveness of SW interventions in sub-Saharan Africa and to provide more insight into combination prevention. The systematic review followed PRISMA guidelines in a search of PUBMED and POPLINE for peer-reviewed literature published between 1 January 2000 and 22 July 2016 (registration number on PROSPERO: CRD42016042529). We considered cohort interventions, randomised controlled trials and cross-sectional surveys of SW programmes. A framework was used to describe and map interventions to desired outcomes. Twenty-six papers (reporting on 25 studies) were included. A strategy that empowered peer-educator leaders to steer community activities showed a twofold increase in coverage of behaviour change communication and utilisation of health facilities among SW. A brief alcohol harm-reduction effort demonstrated a significant effect on sexual violence and engagement in sex trading. A risk-reduction counselling intervention among drug-injecting SW showed an effect on alcohol and substance use and engagement in sex work. No study on a promising intervention such as PrEP among SW was found. We observed that interventions combining structural components with biomedical and behavioural strategies tend to accumulate more desired outcomes. The evidence base that can be considered in intervention designs to prevent HIV among SW in SSA is vast. The health sector should consider interventions to reduce binge alcohol intake and intravenous drug use among sex workers. Programmes should strongly consider multicomponent approaches that explore community-based structural approaches. © 2017 John Wiley & Sons Ltd.

  13. Tests of Local Hadron Calibration Approaches in ATLAS Combined Beam Tests

    International Nuclear Information System (INIS)

    Grahn, Karl-Johan; Kiryunin, Andrey; Pospelov, Guennadi

    2011-01-01

    Three ATLAS calorimeters in the region of the forward crack at |η| = 3.2 in the nominal ATLAS setup, and a typical section of the two barrel calorimeters at |η| = 0.45 of ATLAS, have been exposed to combined beam tests with single electrons and pions. Detailed shower shape studies of electrons and pions, with comparisons to various Geant4-based simulations utilizing different physics lists, are presented for the endcap beam test. The local hadron calibration approach as used in the full ATLAS setup has been applied to the endcap beam test data. An extension of it using layer correlations has been tested with the barrel test beam data. Both methods utilize modular correction steps based on shower shape variables to correct for invisible energy inside the reconstructed clusters in the calorimeters (compensation) and for lost energy deposits outside of the reconstructed clusters (dead material and out-of-cluster deposits). Results for both methods and comparisons to Monte Carlo simulations are presented.

  14. Promoting a combination approach to paediatric HIV psychosocial support.

    Science.gov (United States)

    Amzel, Anouk; Toska, Elona; Lovich, Ronnie; Widyono, Monique; Patel, Tejal; Foti, Carrie; Dziuban, Eric J; Phelps, B Ryan; Sugandhi, Nandita; Mark, Daniella; Altschuler, Jenny

    2013-11-01

    There is still limited evidence demonstrating which interventions have positive effects on the well-being of HIV-infected children. Interventions that improve the psychosocial well-being of children living with HIV must be replicable in resource-limited settings, avoiding dependence on specialized staff for implementation. This paper advocates for combination approaches that strengthen the capacity of service providers, expand the availability of age-appropriate and family-centred support, and equip schools to be more protective and supportive of children living with HIV. Coordination of care with other community-based interventions is also needed to foster more supportive and less stigmatizing environments. To ensure effective, feasible and scalable interventions, improving the evidence base to document improved outcomes and longer-term impact, as well as operational studies to document delivery approaches, are needed.

  15. Performance of in silico prediction tools for the classification of rare BRCA1/2 missense variants in clinical diagnostics.

    Science.gov (United States)

    Ernst, Corinna; Hahnen, Eric; Engel, Christoph; Nothnagel, Michael; Weber, Jonas; Schmutzler, Rita K; Hauke, Jan

    2018-03-27

    The use of next-generation sequencing approaches in clinical diagnostics has led to a tremendous increase in data and a vast number of variants of uncertain significance that require interpretation. Therefore, prediction of the effects of missense mutations using in silico tools has become a frequently used approach. The aim of this study was to assess the reliability of in silico prediction as a basis for clinical decision making in the context of hereditary breast and/or ovarian cancer. We tested the performance of four prediction tools (Align-GVGD, SIFT, PolyPhen-2, MutationTaster2) using a set of 236 BRCA1/2 missense variants that had previously been classified by expert committees. However, a major pitfall in the creation of a reliable evaluation set for our purpose is the generally accepted classification of BRCA1/2 missense variants using the multifactorial likelihood model, which is partially based on Align-GVGD results. To overcome this drawback, we identified 161 variants whose classification is independent of any previous in silico prediction. In addition to their performance as stand-alone tools, we examined the sensitivity, specificity, accuracy and Matthews correlation coefficient (MCC) of combined approaches. PolyPhen-2 achieved the lowest sensitivity (0.67), specificity (0.67), accuracy (0.67) and MCC (0.39). Align-GVGD achieved the highest values of specificity (0.92), accuracy (0.92) and MCC (0.73), but was outperformed in sensitivity (0.90) by SIFT (1.00) and MutationTaster2 (1.00). All tools suffered from poor specificities, resulting in an unacceptable proportion of false positive results in a clinical setting. This shortcoming could not be bypassed by combining these tools. In the best-case scenario, 138 families would be affected by the misclassification of neutral variants within the cohort of patients of the German Consortium for Hereditary Breast and Ovarian Cancer. We show that, due to low specificities, state-of-the-art in silico
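The four performance measures reported above all follow from confusion-matrix counts. A minimal sketch of their computation (the counts below are illustrative only, not data from the study):

```python
from math import sqrt

def confusion_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, accuracy and Matthews correlation
    coefficient (MCC) from confusion-matrix counts."""
    sens = tp / (tp + fn)                    # true positive rate
    spec = tn / (tn + fp)                    # true negative rate
    acc = (tp + tn) / (tp + fp + tn + fn)
    denom = sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    mcc = (tp * tn - fp * fn) / denom if denom else 0.0
    return sens, spec, acc, mcc

# illustrative counts only (not the study's data)
sens, spec, acc, mcc = confusion_metrics(tp=90, fp=8, tn=92, fn=10)
```

Unlike accuracy, the MCC remains informative when the two classes are imbalanced, which is why it is often reported alongside sensitivity and specificity in variant-classification benchmarks.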

  16. Designing Tool Support for Translating Use Cases and UML 2.0 Sequence Diagrams into a Coloured Petri Net

    DEFF Research Database (Denmark)

    Fernandes, Joao Miguel; Tjell, Simon; Jørgensen, Jens Bæk

    2007-01-01

    Based on a case study of the specification of an elevator controller, this paper presents an approach that can translate given UML descriptions into a Coloured Petri Net (CPN) model. The UML descriptions must be specified in the form of Use Cases (UCs) and UML 2.0 Sequence Diagrams (SDs). The CPN model constitutes one single, coherent and executable representation of all possible behaviours that are specified by the given UML artefacts. CPN is a formal modelling language that enables construction and analysis of scalable, executable models of behaviour. A combined use of UML and CPN can be useful in several projects. CPN is well supported by the tool called CPN Tools, and the work we present here is aimed at building a CPN Tools front-end engine that implements the proposed translation.

  17. Getting The Picture: Our Changing Climate- A new learning tool for climate science

    Science.gov (United States)

    Yager, K.; Balog, J. D.

    2014-12-01

    Earth Vision Trust (EVT), founded by photographer and scientist James Balog, has developed a free, online, multimedia climate science education tool for students and educators. Getting The Picture (GTP) creates a new learning experience, drawing upon the Extreme Ice Survey's powerful archives of unique photographs and time-lapse videos of changing glaciers around the world. GTP presents the latest climate science through interactive tools that make its basic scientific tenets accessible and easy to understand. The aim is to use a multidisciplinary approach to encourage critical thinking about the way our planet is changing due to anthropogenic activities, and to inspire students to find their own voice regarding our changing climate. The essence of this resource is storytelling through inspiring images, field expedition notes and dynamic multimedia tools. EVT presents climate education in a new light, illustrating the complex interaction between humans and nature through its Art + Science approach. The overarching goal is to educate and empower young people to take personal action. GTP is aligned with national educational and science standards (NGSS, CCSS, Climate Literacy), so it may be used in conventional classrooms as well as education centers, museum kiosks or anywhere with Internet access. Getting The Picture extends far beyond traditional learning to provide an engaging experience for students, educators and all those who wish to explore the latest in climate science.

  18. Characterization of bioactive compounds of Annona cherimola L. leaves using a combined approach based on HPLC-ESI-TOF-MS and NMR.

    Science.gov (United States)

    Díaz-de-Cerio, Elixabet; Aguilera-Saez, Luis Manuel; Gómez-Caravaca, Ana María; Verardo, Vito; Fernández-Gutiérrez, Alberto; Fernández, Ignacio; Arráez-Román, David

    2018-06-01

    Annona cherimola Mill. (cherimoya) has been widely used as a food crop. The leaves of this tree possess several health benefits, which are generally attributed to their bioactive composition. However, literature concerning a comprehensive characterization of these leaves based on a combined approach of nuclear magnetic resonance (NMR) and high-performance liquid chromatography coupled with time-of-flight mass spectrometry (HPLC-TOF-MS) is scarce. Thus, the aim of this work was to study the polar profile of full extracts of cherimoya leaves using these tools. In total, 77 compounds were characterized, 12 of which were identified by both techniques. Briefly, 23 compounds were classified as amino acids, organic acids, carbohydrates, cholines, phenolic acid derivatives and flavonoids by NMR, while 66 metabolites were divided into sugars, amino acids, phenolic acids and derivatives, flavonoids, phenylpropanoids and other polar compounds by HPLC-TOF-MS. Different solvent mixtures were also tested and the total phenolic content (TPC) in the extracts was quantified via HPLC-TOF-MS. The tendency observed was EtOH/water 80/20 (v/v) (17.0 ± 0.2 mg TPC/g leaf dry weight (d.w.)) ≥ acetone/water 70/30 (v/v) (16.1 ± 0.7 mg TPC/g leaf d.w.) > EtOH/water 70/30 (v/v) (14.0 ± 0.3 mg TPC/g leaf d.w.) > acetone/water 80/20 (v/v) (13.5 ± 0.4 mg TPC/g leaf d.w.). Importantly, flavonoid derivatives accounted for between 63 and 76% of the TPC in those extracts. The major compounds were sucrose, glucose (α and β) and proline by NMR, and chlorogenic acid and rutin by HPLC-TOF-MS. Graphical abstract: The combined use of LC-HRMS and NMR is a potentially synergic combination for comprehensive metabolite characterization of cherimoya leaves.

  19. Contemporary approach to neurologic prognostication of coma after cardiac arrest.

    Science.gov (United States)

    Ben-Hamouda, Nawfel; Taccone, Fabio S; Rossetti, Andrea O; Oddo, Mauro

    2014-11-01

    Coma after cardiac arrest (CA) is an important cause of admission to the ICU. Prognosis of post-CA coma has significantly improved over the past decade, particularly because of aggressive postresuscitation care and the use of therapeutic targeted temperature management (TTM). TTM and the sedatives used to maintain controlled cooling might delay neurologic reflexes and reduce the accuracy of clinical examination. In the early ICU phase, patients who will go on to a good recovery may often be indistinguishable, based on neurologic examination alone, from patients who will eventually have a poor prognosis. Prognostication of post-CA coma has therefore evolved toward a multimodal approach that combines neurologic examination with EEG and evoked potentials. Blood biomarkers (eg, neuron-specific enolase [NSE] and soluble 100-β protein) are useful complements for coma prognostication; however, results vary among commercial laboratory assays, and applying a single cutoff level (eg, > 33 μg/L for NSE) for poor prognostication is not recommended. Neuroimaging, mainly diffusion MRI, is emerging as a promising tool for prognostication, but its precise role needs further study before it can be widely used. This multimodal approach might reduce false-positive rates of poor prognosis, thereby providing optimal prognostication of comatose CA survivors. The aim of this review is to summarize studies and the principal tools presently available for outcome prediction and to describe a practical approach to the multimodal prognostication of coma after CA, with a particular focus on neuromonitoring tools. We also propose an algorithm for the optimal use of such multimodal tools during the early ICU phase of post-CA coma.

  20. Biocompatible Nanoemulsions for Improved Aceclofenac Skin Delivery: Formulation Approach Using Combined Mixture-Process Experimental Design.

    Science.gov (United States)

    Isailović, Tanja; Ðorđević, Sanela; Marković, Bojan; Ranđelović, Danijela; Cekić, Nebojša; Lukić, Milica; Pantelić, Ivana; Daniels, Rolf; Savić, Snežana

    2016-01-01

    We aimed to develop lecithin-based nanoemulsions intended for effective aceclofenac (ACF) skin delivery, utilizing sucrose esters [sucrose palmitate (SP) and sucrose stearate (SS)] as additional stabilizers and penetration enhancers. To find the surfactant mixtures and levels of process variables (homogenization pressure and number of cycles in the high-pressure homogenization manufacturing method) that result in drug-loaded nanoemulsions with minimal droplet size and narrow size distribution, a combined mixture-process experimental design was employed. Based on the optimization data, selected nanoemulsions were evaluated regarding morphology, surface charge, drug-excipient interactions, physical stability, and in vivo skin performance (skin penetration and irritation potential). The predicted physicochemical properties and storage stability proved satisfactory for ACF-loaded nanoemulsions containing 2% SP blended with 0%-1% SS and 1%-2% egg lecithin (produced at 50°C/20 cycles/800 bar). Additionally, in vivo tape stripping demonstrated superior ACF skin absorption from these nanoemulsions, particularly from those containing 2% SP, 0.5% SS, and 1.5% egg lecithin, compared with the sample costabilized by a conventional surfactant (polysorbate 80). In summary, the combined mixture-process experimental design proved a feasible tool for the formulation development of multisurfactant-based nanosized delivery systems with potentially improved overall product performance.

  1. Chemical entity recognition in patents by combining dictionary-based and statistical approaches.

    Science.gov (United States)

    Akhondi, Saber A; Pons, Ewoud; Afzal, Zubair; van Haagen, Herman; Becker, Benedikt F H; Hettne, Kristina M; van Mulligen, Erik M; Kors, Jan A

    2016-01-01

    We describe the development of a chemical entity recognition system and its application in the CHEMDNER-patent track of BioCreative 2015. This community challenge includes a Chemical Entity Mention in Patents (CEMP) recognition task and a Chemical Passage Detection (CPD) classification task. We addressed both tasks with an ensemble system that combines a dictionary-based approach with a statistical one. For this purpose, the performance of several lexical resources was assessed using Peregrine, our open-source indexing engine. We combined our dictionary-based results on the patent corpus with the results of tmChem, a chemical recognizer using a conditional random field classifier. To improve the performance of tmChem, we utilized three additional features, viz. part-of-speech tags, lemmas and word-vector clusters. When evaluated on the training data, our final system obtained an F-score of 85.21% for the CEMP task and an accuracy of 91.53% for the CPD task. On the test set, the best system ranked sixth among 21 teams for CEMP with an F-score of 86.82%, and second among nine teams for CPD with an accuracy of 94.23%. The differences in performance between the best ensemble system and the statistical system alone were small. Database URL: http://biosemantics.org/chemdner-patents. © The Author(s) 2016. Published by Oxford University Press.
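The core of such an ensemble is merging the entity spans proposed by a dictionary matcher with those of a statistical tagger. A minimal sketch of the idea (the lexicon, text and "statistical" spans are hypothetical toy data; the real system combines Peregrine and tmChem and is far more sophisticated):

```python
def dict_match(text, lexicon):
    """Longest-match dictionary recognition: returns (start, end, term) spans."""
    spans, i, low = [], 0, text.lower()
    terms = sorted(lexicon, key=len, reverse=True)  # try longest entries first
    while i < len(text):
        for t in terms:
            if low.startswith(t.lower(), i):
                spans.append((i, i + len(t), text[i:i + len(t)]))
                i += len(t)
                break
        else:
            i += 1
    return spans

def merge_spans(dict_spans, stat_spans):
    """Ensemble step: union of both recognizers' spans, keeping the
    longer span when two candidates overlap."""
    merged = []
    for s in sorted(dict_spans + stat_spans,
                    key=lambda x: (x[0], -(x[1] - x[0]))):
        if merged and s[0] < merged[-1][1]:  # overlaps the previous span
            continue
        merged.append(s)
    return merged

text = "Reaction of acetic acid with ethanol"
d = dict_match(text, {"acetic acid", "ethanol"})
m = merge_spans(d, [(12, 18, "acetic")])  # hypothetical statistical output
```

Here the dictionary's longer span "acetic acid" wins over the overlapping statistical span "acetic"; other resolution policies (e.g. preferring the statistical tagger) are equally possible.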

  2. Combining engineering and data-driven approaches: Development of a generic fire risk model facilitating calibration

    DEFF Research Database (Denmark)

    De Sanctis, G.; Fischer, K.; Kohler, J.

    2014-01-01

    Fire risk models support decision making for engineering problems under the consistent consideration of the associated uncertainties. Empirical approaches can be used for cost-benefit studies when enough data about the decision problem are available, but often the empirical approaches are not detailed enough. Engineering risk models, on the other hand, may be detailed but typically involve assumptions that may result in a biased risk assessment and make a cost-benefit study problematic. In two related papers it is shown how engineering and data-driven modeling can be combined by developing a generic risk model that is calibrated to observed fire loss data. Generic risk models assess the risk of buildings based on specific risk indicators and support risk assessment at a portfolio level. After an introduction to the principles of generic risk assessment, the focus of the present paper...

  3. A Model-Free Diagnosis Approach for Intake Leakage Detection and Characterization in Diesel Engines

    Directory of Open Access Journals (Sweden)

    Ghaleb Hoblos

    2015-07-01

    Full Text Available Feature selection is an essential step in data classification used for fault detection and diagnosis. In this work, a new approach is proposed which combines a feature selection algorithm and a neural network tool for leak detection and characterization in diesel engine air paths. The chi-square criterion is used as the feature selection algorithm, and a neural network trained with the Levenberg-Marquardt algorithm is used for system behavior modeling. The obtained neural network is used for leak detection and characterization. The model is learned and validated using data generated by xMOD, and the same tool is used again for testing. The effectiveness of the proposed approach is illustrated in simulation when the system operates at low speed/load and the leak affecting the air path is very small.
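The chi-square criterion scores a candidate feature by how strongly its values depend on the class labels. A minimal sketch for binary features over a 2x2 contingency table (the data below are toy values, not engine measurements):

```python
def chi2_score(feature, labels):
    """Chi-square statistic of a binary feature against binary class
    labels (2x2 contingency table); higher means stronger dependence,
    so features are ranked by this score and the top ones kept."""
    counts = {(f, y): 0 for f in (0, 1) for y in (0, 1)}
    for f, y in zip(feature, labels):
        counts[(f, y)] += 1
    n = len(feature)
    chi2 = 0.0
    for f in (0, 1):
        for y in (0, 1):
            row = counts[(f, 0)] + counts[(f, 1)]   # marginal of the feature
            col = counts[(0, y)] + counts[(1, y)]   # marginal of the label
            expected = row * col / n                # under independence
            if expected:
                chi2 += (counts[(f, y)] - expected) ** 2 / expected
    return chi2

informative = chi2_score([0, 0, 1, 1], [0, 0, 1, 1])  # tracks the label
irrelevant = chi2_score([0, 1, 0, 1], [0, 0, 1, 1])   # independent of it
```

A perfectly label-tracking feature scores n (here 4), an independent one scores 0, which is the ordering a feature-selection step exploits.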

  4. ARTICLE Robust Diagnosis of Mechatronics System by Bond Graph Approach

    Directory of Open Access Journals (Sweden)

    Abderrahmene Sellami

    2018-03-01

    Full Text Available This article presents the design of a robust diagnostic system based on a bond graph model for a mechatronic system. Mechatronics is the synergistic and systemic combination of mechanics, electronics and computer science. The design of a mechatronic system modeled by the bond graph becomes easier and more general. The bond graph is a unified graphical language for all areas of the engineering sciences, and is confirmed as a structured approach to the modeling and simulation of multidisciplinary systems.

  5. A new flowsheeting tool for flue gas treating

    NARCIS (Netherlands)

    van Elk, E. P.; Arendsen, A. R. J.; Versteeg, G. F.

    2009-01-01

    A new flowsheeting tool, specifically designed for steady-state simulation of acid gas treating processes, has been developed. The models implemented in the new tool combine all issues relevant for the design, optimization and analysis of acid gas treating processes, including post-combustion and

  6. Using an innovative combination of quality-by-design and green analytical chemistry approaches for the development of a stability indicating UHPLC method in pharmaceutical products.

    Science.gov (United States)

    Boussès, Christine; Ferey, Ludivine; Vedrines, Elodie; Gaudin, Karen

    2015-11-10

    An innovative combination of green chemistry and quality by design (QbD) approaches is presented through the development of an UHPLC method for the analysis of the main degradation products of dextromethorphan hydrobromide. The QbD strategy was integrated into the field of green analytical chemistry to improve method understanding while assuring quality and minimizing environmental impact and analyst exposure. The analytical method was thoroughly evaluated by applying risk assessment and multivariate analysis tools. After a scouting phase aimed at selecting a suitable stationary phase and an organic solvent in accordance with green chemistry principles, quality risk assessment tools were applied to determine the critical process parameters (CPPs). The effects of the CPPs on the critical quality attributes (CQAs), i.e., resolutions, efficiencies, and solvent consumption, were further evaluated by means of a screening design. A response surface methodology was then carried out to model the CQAs as functions of the selected CPPs, and the optimal separation conditions were determined through a desirability analysis. The resulting contour plots enabled establishment of the design space (DS) (method operable design region) where all CQAs fulfilled the requirements. An experimental validation of the DS proved that quality within the DS was guaranteed; therefore, no further robustness study was required before validation. Finally, the UHPLC method was validated using the concept of total error and was used to analyze a pharmaceutical drug product. Copyright © 2015 Elsevier B.V. All rights reserved.
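A desirability analysis of the kind mentioned above maps each CQA response onto a [0, 1] desirability and combines them into one score to optimize. A minimal Derringer-style sketch (the bounds and response values are hypothetical, not the paper's chromatographic data):

```python
from math import prod

def desirability_max(y, lo, hi):
    """One-sided Derringer desirability for a response to maximize:
    0 at or below lo, 1 at or above hi, linear in between."""
    if y <= lo:
        return 0.0
    if y >= hi:
        return 1.0
    return (y - lo) / (hi - lo)

def overall_desirability(ds):
    """Overall desirability: geometric mean of the individual scores,
    so a single zero (one failed CQA) rejects the operating point."""
    return prod(ds) ** (1.0 / len(ds))

# hypothetical CQAs: a resolution and an efficiency, both to maximize
d_res = desirability_max(2.0, lo=1.5, hi=2.5)
d_eff = desirability_max(12000, lo=8000, hi=10000)
overall = overall_desirability([d_res, d_eff])
```

The optimizer then searches the CPP space for the settings that maximize `overall`; minimization targets (such as solvent consumption) use the mirrored desirability function.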

  7. Factors affecting economies of scale in combined sewer systems.

    Science.gov (United States)

    Maurer, Max; Wolfram, Martin; Anja, Herlyn

    2010-01-01

    A generic model is introduced that represents the combined sewer infrastructure of a settlement quantitatively. A catchment area module first calculates the length and size distribution of the required sewer pipes on the basis of rain patterns, housing densities and area size. These results are fed into the sewer-cost module in order to estimate the combined sewer costs of the entire catchment area. A detailed analysis of the relevant input parameters for Swiss settlements is used to identify the influence of size on costs. The simulation results confirm that an economy of scale exists for combined sewer systems. This is the result of two main opposing cost factors: (i) increased construction costs for larger sewer systems due to larger pipes and increased rain runoff in larger settlements, and (ii) lower costs due to higher population and building densities in larger towns. In Switzerland, the more or less organically grown settlement structures and limited land availability emphasise the second factor to show an apparent economy of scale. This modelling approach proved to be a powerful tool for understanding the underlying factors affecting the cost structure for water infrastructures.
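The two opposing factors can be illustrated with a toy per-capita cost model (all coefficients are invented for illustration and are not the paper's calibrated Swiss values):

```python
def per_capita_sewer_cost(population):
    """Toy combined-sewer cost model: cost per metre grows with
    settlement size (larger pipes, more rain runoff), while density
    also rises with size, spreading pipe length over more people."""
    density = 20.0 * population ** 0.25               # persons/ha (toy)
    pipe_length_per_capita = 500.0 / density          # m per person (toy)
    cost_per_metre = 300.0 + 5.0 * population ** 0.3  # larger pipes cost more
    return pipe_length_per_capita * cost_per_metre

small = per_capita_sewer_cost(1_000)
large = per_capita_sewer_cost(100_000)
```

With these (invented) exponents the density effect dominates, so per-capita cost falls with settlement size, i.e. an apparent economy of scale of the kind the paper reports for Swiss settlements.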

  8. Are people with epilepsy using eHealth-tools?

    Science.gov (United States)

    Leenen, Loes A M; Wijnen, Ben F M; de Kinderen, Reina J A; van Heugten, Caroline M; Evers, Silvia M A A; Majoie, Marian H J M

    2016-11-01

    Self-management for people with epilepsy (PWE) should lead to shared decision-making and thus to adherence to the treatment plan. eHealth is an important way of supporting PWE in their self-management. In this survey, we used a mixed method to explore the following: 1) which factors were monitored by PWE and how (using pen and paper or eHealth-tools); 2) how many PWE own a computer or smartphone; and 3) how they perceive the use of eHealth. A consecutive series of 1000 PWE attending the outpatient clinic of a tertiary epilepsy center were asked to fill in a questionnaire. In comparison with the general population, fewer PWE owned a computer or smartphone. They were, however, more likely to self-monitor their health than other patients suffering from a chronic condition. Although PWE did not use eHealth-tools often, they perceived them as user-friendly tools that promote health behavior as well as adherence. On the other hand, problems with privacy and the perception that not everyone is able to use eHealth were considered disadvantages by PWE. Promoting self-care was perceived as both an advantage and a disadvantage: an advantage when PWE mentioned eHealth-tools as a way to gain insight into their epilepsy, and a disadvantage because it confronts PWE with their disease, which causes emotional stress. The high level of self-monitoring combined with the low usage of eHealth-tools seems to indicate a need for a more tailored approach to stimulate the use of eHealth-tools by PWE. Further research should focus on this aspect, e.g., what PWE need in order to make more use of eHealth-tools in their self-care. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. Automated Extraction of Cranial Landmarks from Computed Tomography Data using a Combined Method of Knowledge and Pattern Based Approaches

    Directory of Open Access Journals (Sweden)

    Roshan N. RAJAPAKSE

    2016-03-01

    Full Text Available Accurate identification of anatomical structures from medical imaging data is a significant and critical function in the medical domain. Past studies in this context have mainly utilized two approaches: knowledge-based and learning-based methods. Further, most previously reported studies have focused on identification of landmarks from lateral X-ray Computed Tomography (CT) data, particularly in the field of orthodontics. This study, however, focused on extracting cranial landmarks from large sets of cross-sectional CT slices using a combination of the two aforementioned approaches. The proposed method is centered mainly on template data sets, which were created using the actual contour patterns extracted from CT cases for each of the landmarks in consideration. Firstly, these templates were used to devise rules, which are characteristic of the knowledge-based method. Secondly, the same template sets were employed to perform template matching, related to the learning-based approach. The proposed method was tested on two landmarks, the Dorsum sellae and the Pterygoid plate, using CT cases of 5 subjects. The results indicate that, out of the 10 tests, the output images were within the expected range (desired accuracy) in 7 instances and the acceptable range (near accuracy) in 2 instances, thus verifying the effectiveness of the combined template-set-centric approach proposed in this study.
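The template-matching step can be sketched as an exhaustive normalized cross-correlation search; the toy array below stands in for a CT slice, and real systems use much larger templates plus search-space pruning (this is a generic sketch, not the paper's implementation):

```python
import numpy as np

def best_match(image, template):
    """Return the (row, col) of the window that best matches the
    template under normalized cross-correlation (exhaustive search)."""
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    best_score, best_pos = -np.inf, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            w = image[r:r + th, c:c + tw]
            w = w - w.mean()
            denom = np.sqrt((w ** 2).sum()) * t_norm
            score = (w * t).sum() / denom if denom else 0.0
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos

# toy "slice" with a landmark-like pattern embedded at (2, 3)
image = np.zeros((6, 6))
image[2:4, 3:5] = [[1, 2], [3, 4]]
template = np.array([[1.0, 2.0], [3.0, 4.0]])
pos = best_match(image, template)
```

Normalizing each window makes the score invariant to local brightness and contrast, which matters for CT data acquired with different scanners or settings.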

  10. Nanotechnology tools in pharmaceutical R&D

    OpenAIRE

    Challa S.S.R. Kumar

    2010-01-01

    Nanotechnology is a new approach to problem solving and can be considered a collection of tools and ideas which can be applied in the pharmaceutical industry. Application of nanotechnology tools in pharmaceutical R&D is likely to move the industry from the 'blockbuster drug' model to 'personalized medicine'. There are compelling applications in the pharmaceutical industry where inexpensive nanotechnology tools can be utilized. The review explores the possibility of categorizing various nan...

  11. High-speed micro electrode tool fabrication by a twin-wire EDM system

    International Nuclear Information System (INIS)

    Sheu, Dong-Yea

    2008-01-01

    This paper describes a new machining process which combines a twin electro-wire with two electro-discharge circuits to rapidly fabricate micro electrode tools. The results show that transistor and RC electro-discharge circuits can coexist to perform both rough and finish machining of micro tools on the same machine. Compared to conventional wire electro discharge grinding (WEDG) technology, a twin-wire EDM system that combines rough and finish machining into one process allows the efficient fabrication of micro tools. This high-speed micro tool fabrication process can be applied not only to micro electrode machining but also to the machining of micro punching tools and micro probing tips.

  12. Design Tools for Integrated Asynchronous Electronic Circuits

    National Research Council Canada - National Science Library

    Martin, Alain

    2003-01-01

    ..., simulation, verification, at the logical and physical levels. Situs has developed a business model for the commercialization of the CAD tools, and has designed the prototype of the tool suite based on this business model and the Caltech approach...

  13. Big data analytics in immunology: a knowledge-based approach.

    Science.gov (United States)

    Zhang, Guang Lan; Sun, Jing; Chitkushev, Lou; Brusic, Vladimir

    2014-01-01

    With the vast amount of immunological data available, immunology research is entering the big data era. These data vary in granularity, quality, and complexity and are stored in various formats, including publications, technical reports, and databases. The challenge is to make the transition from data to actionable knowledge and wisdom and bridge the knowledge gap and application gap. We report a knowledge-based approach based on a framework called KB-builder that facilitates data mining by enabling fast development and deployment of web-accessible immunological data knowledge warehouses. Immunological knowledge discovery relies heavily on both the availability of accurate, up-to-date, and well-organized data and the proper analytics tools. We propose the use of knowledge-based approaches by developing knowledgebases combining well-annotated data with specialized analytical tools and integrating them into analytical workflow. A set of well-defined workflow types with rich summarization and visualization capacity facilitates the transformation from data to critical information and knowledge. By using KB-builder, we enabled streamlining of normally time-consuming processes of database development. The knowledgebases built using KB-builder will speed up rational vaccine design by providing accurate and well-annotated data coupled with tailored computational analysis tools and workflow.

  14. Big Data Analytics in Immunology: A Knowledge-Based Approach

    Directory of Open Access Journals (Sweden)

    Guang Lan Zhang

    2014-01-01

    With the vast amount of immunological data available, immunology research is entering the big data era. These data vary in granularity, quality, and complexity and are stored in various formats, including publications, technical reports, and databases. The challenge is to make the transition from data to actionable knowledge and wisdom and bridge the knowledge gap and application gap. We report a knowledge-based approach based on a framework called KB-builder that facilitates data mining by enabling fast development and deployment of web-accessible immunological data knowledge warehouses. Immunological knowledge discovery relies heavily on both the availability of accurate, up-to-date, and well-organized data and the proper analytics tools. We propose the use of knowledge-based approaches by developing knowledgebases combining well-annotated data with specialized analytical tools and integrating them into analytical workflow. A set of well-defined workflow types with rich summarization and visualization capacity facilitates the transformation from data to critical information and knowledge. By using KB-builder, we enabled streamlining of normally time-consuming processes of database development. The knowledgebases built using KB-builder will speed up rational vaccine design by providing accurate and well-annotated data coupled with tailored computational analysis tools and workflow.

  15. Orthology detection combining clustering and synteny for very large datasets.

    Science.gov (United States)

    Lechner, Marcus; Hernandez-Rosales, Maribel; Doerr, Daniel; Wieseke, Nicolas; Thévenin, Annelyse; Stoye, Jens; Hartmann, Roland K; Prohaska, Sonja J; Stadler, Peter F

    2014-01-01

    The elucidation of orthology relationships is an important step both in gene function prediction and towards understanding patterns of sequence evolution. For large datasets, orthology assignments are usually derived directly from sequence similarities, because more exact approaches incur prohibitively high computational costs. Here we present PoFF, an extension for the standalone tool Proteinortho, which enhances orthology detection by combining clustering, sequence similarity, and synteny. In the course of this work, FFAdj-MCS, a heuristic that assesses pairwise gene order using adjacencies (a similarity measure related to the breakpoint distance), was adapted to support multiple linear chromosomes and extended to detect duplicated regions. PoFF largely reduces the number of false positives and enables more fine-grained predictions than purely similarity-based approaches. The extension maintains the low memory requirements and the efficient concurrency options of its basis, Proteinortho, making the software applicable to very large datasets.

  16. Combining formal and functional approaches to topic structure

    NARCIS (Netherlands)

    Zellers, M.; Post, B.

    2012-01-01

    Fragmentation between formal and functional approaches to prosodic variation is an ongoing problem in linguistic research. In particular, the frameworks of the Phonetics of Talk-in-Interaction (PTI) and Empirical Phonology (EP) take very different theoretical and methodological approaches to this

  17. An FMS Dynamic Production Scheduling Algorithm Considering Cutting Tool Failure and Cutting Tool Life

    International Nuclear Information System (INIS)

    Setiawan, A; Wangsaputra, R; Halim, A H; Martawirya, Y Y

    2016-01-01

    This paper deals with Flexible Manufacturing System (FMS) production rescheduling due to unavailability of cutting tools, caused either by cutting tool failure or by reaching the tool life limit. The FMS consists of parallel identical machines integrated with an automatic material handling system, and it runs fully automatically. Each machine has the same cutting tool configuration, consisting of different geometrical cutting tool types in each tool magazine. A job usually takes two stages, each with sequential operations allocated to machines in consideration of the cutting tool life. In practice, a cutting tool can fail before its life limit is reached. The objective of this paper is to develop a dynamic scheduling algorithm for when a cutting tool breaks during unmanned operation and rescheduling is needed. The algorithm consists of four steps: the first step generates the initial schedule, the second determines the cutting tool failure time, the third determines the system status at the failure time, and the fourth reschedules the unfinished jobs. The approaches used to solve the problem are complete-reactive scheduling and robust-proactive scheduling. The new schedules differ from the initial schedule in the starting and completion times of each operation. (paper)
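The four-step outline above (initial schedule, failure time, system status, rescheduling of unfinished jobs) can be sketched in code. The data layout (operations as (job, machine, start, end) tuples) and the tool-replacement delay parameter are illustrative assumptions, not details from the paper:

```python
def reschedule(schedule, failure_time, failed_machine, delay):
    """Sketch of steps 2-4: classify each operation's status at the failure
    time, then shift unfinished work on the failed machine past the
    tool-replacement delay (a simple complete-reactive policy)."""
    ready = failure_time + delay              # failed machine usable again here
    rescheduled = []
    for job, machine, start, end in sorted(schedule, key=lambda op: op[2]):
        if machine != failed_machine or end <= failure_time:
            rescheduled.append((job, machine, start, end))   # unaffected
        else:
            remaining = end - max(start, failure_time)       # interrupted or queued work
            new_start = max(ready, start)
            rescheduled.append((job, machine, new_start, new_start + remaining))
            ready = new_start + remaining                    # keep the machine serialized
    return rescheduled

# Toy instance: the tool on M1 breaks at t=3 and takes 2 time units to replace
initial = [("J1", "M1", 0, 5), ("J2", "M1", 4, 9), ("J3", "M2", 0, 6)]
print(reschedule(initial, failure_time=3, failed_machine="M1", delay=2))
```

Operations on unaffected machines keep their original slots; interrupted work restarts with only its remaining duration once the tool has been replaced.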

  18. Enhancement of loss detection capability using a combination of the Kalman Filter/Linear Smoother and controllable unit accounting approach

    International Nuclear Information System (INIS)

    Pike, D.H.; Morrison, G.W.

    1979-01-01

    An approach to loss detection is presented which combines the optimal loss detection capability of state estimation techniques with a controllable unit accounting approach. The state estimation theory makes use of a linear system model which is capable of modeling the interaction of various controllable unit areas within a given facility. An example is presented which illustrates the increase in loss detection probability which is realizable with state estimation techniques. Comparisons are made with a Shewhart Control Chart and the CUSUM statistic
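For comparison, the CUSUM statistic mentioned above can be sketched in a few lines; the slack and threshold values below are conventional illustrative choices (0.5 and 4 standard deviations), not the paper's settings:

```python
def cusum(residuals, k=0.5, h=4.0):
    """One-sided upper CUSUM on standardized inventory residuals.
    k is the slack (allowance) and h the decision threshold, both in
    standard-deviation units. Returns the index of the first alarm,
    or None if no alarm is raised."""
    s = 0.0
    for i, r in enumerate(residuals):
        s = max(0.0, s + r - k)   # accumulate deviations above the slack
        if s > h:
            return i              # alarm: cumulative drift exceeds threshold
    return None

# In-control noise raises no alarm; a sustained one-sigma loss eventually does.
in_control = [0.1, -0.2, 0.3, 0.0, -0.1, 0.2, -0.3, 0.1]
shifted = in_control + [1.0] * 10   # persistent loss of ~1 sigma per period
print(cusum(in_control), cusum(shifted))
```

The cumulative statistic is what gives CUSUM its sensitivity to small persistent losses compared with a Shewhart chart, which tests each period in isolation.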

  19. Innovation in healthcare services – creating a Combined Contingency Theory and Ecosystems Approach

    Science.gov (United States)

    Engelseth, Per; Kritchanchai, Duangpun

    2018-04-01

    The purpose of this conceptual paper is to develop an analytical framework for process development in healthcare services. Healthcare services imply a form of operations management that demands an adapted research approach. The introduction therefore first highlights the challenges of healthcare services as the rationale for this study. It is a type of service of high societal and therefore ethical concern, but one that at the same time needs to be carried out efficiently to economise on service production resources. Combined business and ethics concerns need to be balanced in this service supply system. In the literature review that forms the bulk of this paper, the particularities of service industry processes are considered first. This is followed by a review of contingency theory to characterise the supply chain context of healthcare service processes, highlighting interdependencies and appropriate technology use. This view is then expanded into an ecosystems approach that encompasses the environment, extending the analysis to consider, in a balanced manner, features of business, society and nature. A research model for directing both further research on the healthcare service industry and innovation of such services in practice is introduced.

  20. SitesIdentify: a protein functional site prediction tool

    Directory of Open Access Journals (Sweden)

    Doig Andrew J

    2009-11-01

    Background: The rate at which protein structures are deposited in the Protein Data Bank surpasses the capacity to characterise them experimentally, so computational methods for analysing these structures have become increasingly important. Identifying the region of a protein most likely to be involved in function is useful for gaining information about its potential role. Many approaches to predicting functional sites exist, but many are not made available via a publicly accessible application. Results: Here we present a functional site prediction tool (SitesIdentify), based on combining sequence conservation information with geometry-based cleft identification, that is freely available via a web server. We have shown that SitesIdentify compares favourably to other functional site prediction tools in a comparison of seven methods on a non-redundant set of 237 enzymes with annotated active sites. Conclusion: SitesIdentify produces accuracy in predicting functional sites comparable to its closest available counterpart, and in addition achieves improved accuracy for proteins with few characterised homologues. SitesIdentify is available via a web server at http://www.manchester.ac.uk/bioinformatics/sitesidentify/

  1. An empowerment-based approach to developing innovative e-health tools for self-management

    NARCIS (Netherlands)

    Alpay, L.; Boog, P. van der; Dumaij, A.

    2011-01-01

    E-health is seen as an important technological tool in achieving self-management; however, there is little evidence of how effective e-health is for self-management. Example tools remain experimental and there is limited knowledge yet about the design, use, and effects of this class of tools. By way

  2. Design of automation tools for management of descent traffic

    Science.gov (United States)

    Erzberger, Heinz; Nedell, William

    1988-01-01

    The design of an automated air traffic control system based on a hierarchy of advisory tools for controllers is described. Compatibility of the tools with the human controller, a key objective of the design, is achieved by a judicious selection of tasks to be automated and careful attention to the design of the controller system interface. The design comprises three interconnected subsystems referred to as the Traffic Management Advisor, the Descent Advisor, and the Final Approach Spacing Tool. Each of these subsystems provides a collection of tools for specific controller positions and tasks. This paper focuses primarily on the Descent Advisor which provides automation tools for managing descent traffic. The algorithms, automation modes, and graphical interfaces incorporated in the design are described. Information generated by the Descent Advisor tools is integrated into a plan view traffic display consisting of a high-resolution color monitor. Estimated arrival times of aircraft are presented graphically on a time line, which is also used interactively in combination with a mouse input device to select and schedule arrival times. Other graphical markers indicate the location of the fuel-optimum top-of-descent point and the predicted separation distances of aircraft at a designated time-control point. Computer generated advisories provide speed and descent clearances which the controller can issue to aircraft to help them arrive at the feeder gate at the scheduled times or with specified separation distances. Two types of horizontal guidance modes, selectable by the controller, provide markers for managing the horizontal flightpaths of aircraft under various conditions. The entire system consisting of descent advisor algorithm, a library of aircraft performance models, national airspace system data bases, and interactive display software has been implemented on a workstation made by Sun Microsystems, Inc. It is planned to use this configuration in operational

  3. Tool to assess causality of direct and indirect adverse events associated with therapeutic interventions.

    Science.gov (United States)

    Zorzela, Liliane; Mior, Silvano; Boon, Heather; Gross, Anita; Yager, Jeromy; Carter, Rose; Vohra, Sunita

    2018-03-01

    To develop and test a tool to assess the causality of direct and indirect adverse events associated with therapeutic interventions. The intervention was one or more drugs and/or natural health products, a device, or a practice (the professional delivering the intervention). Through the assessment of causality of adverse events, we can learn about the factors contributing to the harm and consider what modifications may prevent its recurrence. Existing scales (WHO-UMC, Naranjo and Horn) were adapted to develop a tool (algorithm and table) to evaluate cases of serious harmful events reported through a national surveillance study. We also incorporated a novel approach that assesses indirect harm (caused by a delay in diagnosis/treatment) and the health provider delivering the intervention (practice). The tool was tested, revised and then implemented to assess all reported cases of serious events resulting from the use of complementary therapies; the use of complementary therapies was the trigger to report the event. Each case was evaluated by two assessors, drawn from a panel of five representing different health care professions. The tool was used in the assessment of eight serious adverse events, each evaluated independently by two assessors. The algorithm facilitated assessment of serious direct or indirect harm. Assessors agreed on the final score in seven of eight cases (weighted kappa coefficient of 0.75). A tool to support the assessment of causality of adverse events was developed and tested. We propose a novel method to assess direct and indirect harms related to product(s), device(s), practice, or a combination of these. Further research will probably help evaluate this approach across different settings and interventions.
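The inter-rater agreement reported above (weighted kappa of 0.75) can be reproduced in principle with a linear-weighted Cohen's kappa. The scale labels and scores below are made-up ordinal ratings for illustration, not the study's data or its actual causality scale:

```python
def weighted_kappa(r1, r2, categories):
    """Linear-weighted Cohen's kappa for two raters' ordinal scores:
    1 minus the ratio of observed to chance-expected disagreement."""
    n, k = len(r1), len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    w = [[abs(i - j) / (k - 1) for j in range(k)] for i in range(k)]  # linear weights
    obs = sum(w[idx[a]][idx[b]] for a, b in zip(r1, r2)) / n          # observed disagreement
    p1 = [sum(a == c for a in r1) / n for c in categories]            # marginal frequencies
    p2 = [sum(b == c for b in r2) / n for c in categories]
    exp = sum(w[i][j] * p1[i] * p2[j] for i in range(k) for j in range(k))
    return 1.0 - obs / exp

# Two assessors scoring eight cases on a hypothetical 3-level causality scale
a1 = ["unlikely", "possible", "probable", "probable", "possible", "unlikely", "probable", "possible"]
a2 = ["unlikely", "possible", "probable", "probable", "probable", "unlikely", "probable", "possible"]
print(weighted_kappa(a1, a2, ["unlikely", "possible", "probable"]))
```

With one adjacent-category disagreement out of eight cases, the statistic lands in the "substantial agreement" range, consistent with the value the abstract reports.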

  4. A combined coalescence gene-dropping tool for evaluating genomic selection in complex scenarios (ms2gs).

    Science.gov (United States)

    Pérez-Enciso, M; Legarra, A

    2016-04-01

    We present ms2gs, a combined coalescence - gene dropping (i.e. backward-forward) simulator for complex traits. It therefore aims at combining the advantages of both approaches. It is primarily conceived for very short term, recent scenarios such as those that are of interest in animal and plant breeding. It is very flexible in terms of defining QTL architecture and SNP ascertainment bias, and it allows for easy modelling of alternative markers such as RADs. It can use real sequence or chip data or generate molecular polymorphisms via the coalescence. It can generate QTL conditional on extant molecular information, such as low-density genotyping. It models (simplistically) sequence, imputation or genotyping errors. It requires as input both genotypic data in plink or ms formats, and a pedigree that is used to perform the gene dropping. By default, it compares accuracy for BLUP, SNP ascertained data, sequence, and causal SNPs. It employs VanRaden's linear (GBLUP) and nonlinear method for incorporating molecular information. To illustrate the program, we present a small application in a half-sib population and a multiparental (MAGIC) cross. The program, manual and examples are available at https://github.com/mperezenciso/ms2gs. © 2016 Blackwell Verlag GmbH.
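VanRaden's linear (GBLUP) method mentioned above rests on a genomic relationship matrix. A minimal pure-Python sketch of its construction (method 1, with allele frequencies estimated from the sample itself, and a toy genotype matrix) is:

```python
def vanraden_G(M):
    """VanRaden's (method 1) genomic relationship matrix from genotypes coded
    0/1/2 (rows = individuals, columns = SNPs). Estimating allele frequencies
    from the sample itself is an illustrative shortcut; base-population
    frequencies are normally preferred."""
    n, m = len(M), len(M[0])
    p = [sum(row[j] for row in M) / (2.0 * n) for j in range(m)]       # allele frequencies
    Z = [[M[i][j] - 2.0 * p[j] for j in range(m)] for i in range(n)]   # centred genotypes
    denom = 2.0 * sum(pj * (1.0 - pj) for pj in p)                     # scales G like the pedigree A matrix
    return [[sum(Z[i][k] * Z[j][k] for k in range(m)) / denom for j in range(n)]
            for i in range(n)]

# Toy data: 3 individuals genotyped at 4 SNPs
M = [[0, 1, 2, 1],
     [2, 1, 0, 1],
     [1, 1, 1, 2]]
G = vanraden_G(M)
print([[round(g, 3) for g in row] for row in G])
```

The resulting G replaces the pedigree-based relationship matrix in the mixed-model equations, which is how marker data enter GBLUP.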

  5. A business intelligence approach using web search tools and online data reduction techniques to examine the value of product-enabled services

    DEFF Research Database (Denmark)

    Tanev, Stoyan; Liotta, Giacomo; Kleismantas, Andrius

    2015-01-01

    in Canada and Europe. It adopts an innovative methodology based on online textual data that could be implemented in advanced business intelligence tools aiming at the facilitation of innovation, marketing and business decision making. Combinations of keywords referring to different aspects of service value......-service innovation as a competitive advantage on the marketplace. On the other hand, the focus of EU firms on innovative hybrid offerings is not explicitly related to business differentiation and competitiveness....

  6. Powered mobility intervention: understanding the position of tool use learning as part of implementing the ALP tool.

    Science.gov (United States)

    Nilsson, Lisbeth; Durkin, Josephine

    2017-10-01

    To explore the knowledge necessary for adoption and implementation of the Assessment of Learning Powered mobility use (ALP) tool in different practice settings for both adults and children. To consult with a diverse population of professionals working with adults and children, in different countries and various settings; who were learning about or using the ALP tool, as part of exploring and implementing research findings. Classical grounded theory with a rigorous comparative analysis of data from informants together with reflections on our own rich experiences of powered mobility practice and comparisons with the literature. A core category learning tool use and a new theory of cognizing tool use, with its interdependent properties: motivation, confidence, permissiveness, attentiveness and co-construction has emerged which explains in greater depth what enables the application of the ALP tool. The scientific knowledge base on tool use learning and the new theory conveys the information necessary for practitioner's cognizing how to apply the learning approach of the ALP tool in order to enable tool use learning through powered mobility practice as a therapeutic intervention in its own right. This opens up the possibility for more children and adults to have access to learning through powered mobility practice. Implications for rehabilitation Tool use learning through powered mobility practice is a therapeutic intervention in its own right. Powered mobility practice can be used as a rehabilitation tool with individuals who may not need to become powered wheelchair users. Motivation, confidence, permissiveness, attentiveness and co-construction are key properties for enabling the application of the learning approach of the ALP tool. Labelling and the use of language, together with honing observational skills through viewing video footage, are key to developing successful learning partnerships.

  7. A Tool for Interactive Data Visualization: Application to Over 10,000 Brain Imaging and Phantom MRI Data Sets

    OpenAIRE

    Panta, Sandeep R.; Wang, Runtang; Fries, Jill; Kalyanam, Ravi; Speer, Nicole; Banich, Marie; Kiehl, Kent; King, Margaret; Milham, Michael; Wager, Tor D.; Turner, Jessica A.; Plis, Sergey M.; Calhoun, Vince D.

    2016-01-01

    In this paper we propose a web-based approach for quick visualization of big data from brain magnetic resonance imaging (MRI) scans using a combination of an automated image capture and processing system, nonlinear embedding, and interactive data visualization tools. We draw upon thousands of MRI scans captured via the COllaborative Imaging and Neuroinformatics Suite (COINS). We then interface the output of several analysis pipelines based on structural and functional data to a t-distributed ...

  8. Combined spectroscopy approaches towards the study of truly 1D carbon-based structures

    Energy Technology Data Exchange (ETDEWEB)

    Ayala, Paola, E-mail: paola.ayala@univie.ac.at [University of Vienna (Austria)]

    2016-07-01

    Full text: The applicability of nanostructured materials owes a great part of its success to the proper understanding of their physical properties and their interaction with the surrounding environment. Applications related to improving solar cell efficiency are among the fields in which understanding the behavior of nanomaterials is critical. In this talk I will present an overview and progress report on the use of different spectroscopy techniques, such as Raman, photoemission and X-ray absorption spectroscopy, as key tools to understand the properties of low dimensional carbon systems with sp² hybridization, as well as one-dimensional carbyne chains. Keeping in mind that the properties of sp² hybridized materials can be nicely tuned via different functionalization methods such as substitutional doping, lattice modifications and adsorption of species, among others, this overview will provide an approach to how these techniques can be utilized to understand and analyze changes in the site-selective valence and conduction bands of single walled carbon nanotubes and graphene. (author)

  9. A Quantum Hybrid PSO Combined with Fuzzy k-NN Approach to Feature Selection and Cell Classification in Cervical Cancer Detection

    Directory of Open Access Journals (Sweden)

    Abdullah M. Iliyasu

    2017-12-01

    A quantum hybrid (QH) intelligent approach that blends the adaptive search capability of the quantum-behaved particle swarm optimisation (QPSO) method with the intuitionistic rationality of the traditional fuzzy k-nearest neighbours (Fuzzy k-NN) algorithm (known simply as the Q-Fuzzy approach) is proposed for efficient feature selection and classification of cells in cervical smear (CS) images. From an initial multitude of 17 features describing the geometry, colour, and texture of the CS images, the QPSO stage of our proposed technique is used to select the best subset of features (i.e., global best particles) that represent a pruned-down collection of seven features. Using a dataset of almost 1000 images, performance evaluation of our proposed Q-Fuzzy approach assesses the impact of our feature selection on classification accuracy by way of three experimental scenarios that are compared alongside two other approaches: the All-features approach (i.e., classification without prior feature selection) and another hybrid technique combining the standard PSO algorithm with the Fuzzy k-NN technique (P-Fuzzy approach). In the first and second scenarios, we further divided the assessment criteria in terms of classification accuracy based on the choice of best features and in terms of the different categories of cervical cells. In the third scenario, we introduced new QH hybrid techniques, i.e., QPSO combined with other supervised learning methods, and compared their classification accuracy alongside our proposed Q-Fuzzy approach. Furthermore, we employed statistical approaches to establish qualitative agreement with regard to the feature selection in experimental scenarios 1 and 3. The synergy between QPSO and Fuzzy k-NN in the proposed Q-Fuzzy approach improves classification accuracy, as manifest in the reduction in the number of cell features, which is crucial for effective cervical cancer detection and diagnosis.
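The Fuzzy k-NN half of the hybrid can be sketched compactly. The 2-D toy points, class names and crisp (0/1) training memberships below are illustrative simplifications of Keller-style fuzzy k-NN, not the paper's 17-feature cervical-cell setup:

```python
import math

def fuzzy_knn(train, labels, x, k=3, m=2.0):
    """Fuzzy k-NN sketch with crisp training memberships: each of the k
    nearest neighbours votes with an inverse-distance weight, and the
    result is a membership degree per class rather than a hard label."""
    nearest = sorted((math.dist(p, x), lab) for p, lab in zip(train, labels))[:k]
    classes = set(labels)
    weights = {c: 0.0 for c in classes}
    total = 0.0
    for d, lab in nearest:
        w = 1.0 / (d ** (2.0 / (m - 1.0)) + 1e-12)   # fuzzifier m controls the weighting
        weights[lab] += w
        total += w
    return {c: weights[c] / total for c in classes}

# Toy 2-D feature space with two well-separated cell classes
train = [(0, 0), (0, 1), (1, 0), (5, 5), (5, 6), (6, 5)]
labels = ["normal", "normal", "normal", "abnormal", "abnormal", "abnormal"]
membership = fuzzy_knn(train, labels, (0.2, 0.2))
print(membership)
```

In the full Q-Fuzzy pipeline, the QPSO stage would first prune the feature space before distances like these are computed.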

  10. Developing Coastal Adaptation to Climate Change in the New York City Infrastructure-Shed: Process, Approach, Tools, and Strategies

    Science.gov (United States)

    Rosenzweig, Cynthia; Solecki, William D.; Blake, Reginald; Bowman, Malcolm; Faris, Craig; Gornitz, Vivien; Horton, Radley; Jacob, Klaus; LeBlanc, Alice; Leichenko, Robin

    2010-01-01

    While current rates of sea level rise and associated coastal flooding in the New York City region appear to be manageable by the stakeholders responsible for communications, energy, transportation, and water infrastructure, projected future sea level rise and associated flooding, especially that associated with rapid ice melt of the Greenland and West Antarctic ice sheets, may be beyond the range of current capacity, because an extreme event might cause flooding and inundation beyond the planning and preparedness regimes. This paper describes the comprehensive process, approach, and tools developed by the New York City Panel on Climate Change (NPCC) in conjunction with the region's stakeholders who manage its critical infrastructure, much of which lies near the coast. It presents the adaptation approach and the sea level rise and storm projections related to coastal risks developed through the stakeholder process. Climate change adaptation planning in New York City is characterized by a multi-jurisdictional stakeholder-scientist process, state-of-the-art scientific projections and mapping, and the development of adaptation strategies based on a risk-management approach.

  11. Development of advanced risk informed asset management tool based on system dynamics approach for nuclear power plant

    International Nuclear Information System (INIS)

    Lee, Gyoung Cheol

    2007-02-01

    In the competitive electricity industry, the economic efficiency of a generation facility is the most important factor in increasing its competitiveness. For a nuclear power plant (NPP), safety is also an essential factor. Over the past several years, international efforts have continued to develop safety-conscious methods, processes and tools for maximizing financial assets, and a Risk-Informed Asset Management (RIAM) methodology has been suggested by the Electric Power Research Institute (EPRI). The RIAM methodology is expected to provide plant operators with a project prioritization and life cycle management planning tool for making long-term maintenance plans, guiding plant budgeting, and determining the sensitivity of a plant's economic risk to the reliability and availability of systems, structures, and components (SSC), as well as to other technical and economic parameters. The focus of this study is to develop a model that supports resource allocation and reveals the effect of such allocations on plant economic and safety performance. The detailed research process toward this goal is as follows. The first step in developing the advanced RIAM model is a review of the current EPRI RIAM model; this part describes the overall RIAM methodology, including its conceptual model, implementation process, modular approach, etc. The second step is a feasibility study of EPRI's current RIAM model by means of a case study; this part presents the results of the feasibility study and a discussion of the results. Finally, the concept of an advanced RIAM model is developed based on a system dynamics approach and the parameter relationships are formulated. In the advanced RIAM model, identification of the effect of scheduled maintenance on other parameters and the relationship between PM activity and failure rate are the most important factors. In this study, these relationships are formulated based on a system dynamics approach. Creation of this modeling tool using Vensim
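The link between preventive-maintenance (PM) activity and failure rate lends itself to a tiny system-dynamics sketch. The functional form and every parameter value below are illustrative assumptions, not the thesis's formulation:

```python
def simulate_failure_rate(pm_level, periods=60, base_rate=0.02, aging=0.0005, pm_effect=0.8):
    """Toy stock-and-flow model: the failure rate drifts upward with
    equipment aging and is pulled back toward the base rate in proportion
    to preventive-maintenance effort (pm_level in [0, 1])."""
    rate = base_rate
    history = [rate]
    for _ in range(periods):
        recovery = pm_effect * pm_level * (rate - base_rate)  # PM pulls the hazard down
        rate = max(base_rate, rate + aging - recovery)        # aging pushes it up
        history.append(rate)
    return history

low_pm = simulate_failure_rate(pm_level=0.1)
high_pm = simulate_failure_rate(pm_level=0.9)
print(round(low_pm[-1], 4), round(high_pm[-1], 4))
```

Each PM level settles at its own equilibrium failure rate, which is the kind of parameter relationship a Vensim-style model would expose for budget trade-off analysis.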

  12. A Combined Approach to Measure Micropollutant Behaviour during Riverbank Filtration

    Science.gov (United States)

    van Driezum, Inge; Saracevic, Ernis; Derx, Julia; Kirschner, Alexander; Sommer, Regina; Farnleitner, Andreas; Blaschke, Alfred Paul

    2016-04-01

    Riverbank filtration (RBF) systems are widely used as a natural treatment process. The advantages of RBF over surface water abstraction are the elimination of, for example, suspended solids, biodegradable compounds (such as specific micropollutants), bacteria and viruses (Hiscock and Grischek, 2002). However, in contrast to its importance, remarkably little is known about the respective external (e.g. industrial or municipal sewage) and internal (e.g. wildlife and agricultural influence) sources of contaminants, the environmental availability and fate of the various hazardous substances, and their potential transport during soil and aquifer passage. The goal of this study is to gain insight into the behaviour of various micropollutants and microbial indicators during riverbank filtration. Field measurements were combined with numerical modelling approaches. The study area comprises an alluvial backwater and floodplain area downstream of Vienna. The river is highly dynamic, with discharges ranging from 900 m3/s during low flow to 11000 m3/s during flood events. Samples were taken in several monitoring wells along a transect extending from the river towards a backwater river in the floodplain. Three of the piezometers were situated within the first 20 metres of the river in order to obtain information about micropollutant behaviour close to the river. A total of 9 different micropollutants were analysed in grab samples taken under different river flow conditions (n=33). Following enrichment using SPE, analysis was performed using high-performance liquid chromatography-tandem mass spectrometry. Faecal indicators (E. coli and enterococci) and bacterial spores were enumerated in sample volumes of 1 L each using cultivation-based methods (ISO 16649-1, ISO 7899-2:2000 and ISO 6222). The analysis showed that some compounds, e.g. ibuprofen and diclofenac, were only found in the river. These compounds were already degraded within the first ten metres from the river. Analysis of

  13. Fatigue in cold-forging dies: Tool life analysis

    DEFF Research Database (Denmark)

    Skov-Hansen, P.; Bay, Niels; Grønbæk, J.

    1999-01-01

    In the present investigation it is shown how the tool life of heavily loaded cold-forging dies can be predicted. Low-cycle fatigue and fatigue crack growth testing of the tool materials are used in combination with finite element modelling to obtain predictions of tool lives. In the models, the number of forming cycles is calculated first to crack initiation and then during crack growth to fatal failure. An investigation of a critical die insert in an industrial cold-forging tool as regards the influence of notch radius, the amount and method of pre-stressing and the selected tool material...
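The crack-growth phase of such a two-phase life model is commonly handled by integrating a Paris-type growth law. The sketch below does this numerically; the material constants, geometry factor and stress range are order-of-magnitude placeholders, not the measured die-steel data:

```python
import math

def cycles_to_failure(a0, ac, dsigma, C, m, Y=1.0, steps=10000):
    """Integrate Paris' law da/dN = C*(dK)^m from an initial crack a0 to a
    critical crack ac (metres), with dK = Y * dsigma * sqrt(pi*a) in
    MPa*sqrt(m). Midpoint rule over a uniform crack-length grid."""
    n = 0.0
    da = (ac - a0) / steps
    a = a0
    for _ in range(steps):
        dK = Y * dsigma * math.sqrt(math.pi * (a + da / 2.0))  # stress intensity range
        n += da / (C * dK ** m)                                # cycles spent on this increment
        a += da
    return n

# Toy values: a 0.1 mm flaw grown to 2 mm under a 600 MPa stress range
N = cycles_to_failure(a0=1e-4, ac=2e-3, dsigma=600.0, C=1e-12, m=3.0)
print(round(N), "cycles")
```

Adding a separately estimated initiation life to N gives the total predicted tool life, mirroring the two-stage calculation the abstract describes.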

  14. The perfection of the construction of a combined cutting tool on the basis of the results of mathematical modelling of working cutting processes in RecurDyn

    Directory of Open Access Journals (Sweden)

    Poddubny Vladimir

    2017-01-01

    As the title implies, the article describes how to optimise the construction of a combined cutting tool, using the example of a developed face milling cutter design with regulable rigidity of damping elements, in order to improve the vibration resistance of the cutting process. RecurDyn is proposed: it is widely used for creating models of different mechanical systems, for their analysis and for optimisation of their construction, and it uses the ideology of visual object-oriented programming and computer research of volume solid-state models. Much attention is given to the description of the mechanical and mathematical model of the face milling cutter in RecurDyn, and to the results of mathematical modelling of the face milling cutter with damping elements consisting of individual elements, with the possibility of programmatically controlling its operation in the process of cutting. The application of RecurDyn made it possible to carry out a complex assessment of the influence of separate design elements of the combined cutting tool on quantitative and qualitative parameters of the milling process, and to define optimal values of the input and output parameters of the technological process of machining for various damping elements.

  15. A Toolkit Modeling Approach for Sustainable Forest Management Planning: Achieving Balance between Science and Local Needs

    Directory of Open Access Journals (Sweden)

    Brian R. Sturtevant

    2007-12-01

    To assist forest managers in balancing an increasing diversity of resource objectives, we developed a toolkit modeling approach for sustainable forest management (SFM). The approach inserts a meta-modeling strategy into a collaborative modeling framework grounded in adaptive management philosophy that facilitates participation among stakeholders, decision makers, and local domain experts in the meta-model building process. The modeling team works iteratively with each of these groups to define essential questions, identify data resources, and then determine whether available tools can be applied or adapted, or whether new tools can be rapidly created to fit the need. The desired goal of the process is a linked series of domain-specific models (tools) that balances generalized "top-down" models (i.e., scientific models developed without input from the local system) with case-specific customized "bottom-up" models that are driven primarily by local needs. Information flow between models is organized according to vertical (i.e., between-scale) and horizontal (i.e., within-scale) dimensions. We illustrate our approach within a 2.1 million hectare forest planning district in central Labrador, a forested landscape where social and ecological values receive a higher priority than economic values. However, the focus of this paper is on the process of how SFM modeling tools and concepts can be rapidly assembled and applied in new locations, balancing efficient transfer of science with adaptation to local needs. We use the Labrador case study to illustrate strengths and challenges uniquely associated with a meta-modeling approach to integrated modeling as it fits within the broader collaborative modeling framework. Principal advantages of the approach include the scientific rigor introduced by peer-reviewed models, combined with the adaptability of meta-modeling. A key challenge is the limited transparency of scientific models to different participatory groups

  16. Tools and approaches for simplifying serious games development in educational settings

    OpenAIRE

    Calvo, Antonio; Rotaru, Dan C.; Freire, Manuel; Fernandez-Manjon, Baltasar

    2016-01-01

    Serious Games can benefit from the commercial video games industry by taking advantage of current development tools. However, the economics and requirements of serious games and commercial games are very different. In this paper, we describe the factors that impact the total cost of ownership of serious games used in educational settings, review the specific requirements of games used as learning material, and analyze the different development tools available in the industry highlighting thei...

  17. External validation of approaches to prediction of falls during hospital rehabilitation stays and development of a new simpler tool

    Directory of Open Access Journals (Sweden)

    Angela Vratsistas-Curto

    2017-12-01

    Full Text Available Objectives: To test the external validity of 4 approaches to fall prediction in a rehabilitation setting (Predict_FIRST, Ontario Modified STRATIFY (OMS), physiotherapists’ judgement of fall risk (PT_Risk), and falls in the past year (Past_Falls)), and to develop and test the validity of a simpler tool for fall prediction in rehabilitation (Predict_CM2). Participants: A total of 300 consecutively-admitted rehabilitation inpatients. Methods: Prospective inception cohort study. Falls during the rehabilitation stay were monitored. Potential predictors were extracted from medical records. Results: Forty-one patients (14%) fell during their rehabilitation stay. The external validity, area under the receiver operating characteristic curve (AUC), for predicting future fallers was: 0.71 (95% confidence interval (95% CI): 0.61–0.81) for OMS (Total_Score); 0.66 (95% CI: 0.57–0.74) for Predict_FIRST; 0.65 (95% CI: 0.57–0.73) for PT_Risk; and 0.52 (95% CI: 0.46–0.60) for Past_Falls. A simple 3-item tool (Predict_CM2) was developed from the most predictive individual items (impaired mobility/transfer ability, impaired cognition, and male sex). The accuracy of Predict_CM2 was 0.73 (95% CI: 0.66–0.81), comparable to OMS (Total_Score) (p = 0.52), significantly better than Predict_FIRST (p = 0.04) and Past_Falls (p < 0.001), and approaching significantly better than PT_Risk (p = 0.09). Conclusion: Predict_CM2 is a simpler screening tool with accuracy for predicting fallers in rehabilitation similar to OMS (Total_Score) and better than Predict_FIRST or Past_Falls. External validation of Predict_CM2 is required.
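A minimal sketch of how a 3-item screener like Predict_CM2 can be scored and evaluated. The three items come from the abstract; the unit weights and the rank-based (Mann-Whitney) AUC formula are illustrative assumptions, as the validated tool's exact scoring is not given here.

```python
def predict_cm2_score(impaired_mobility: bool, impaired_cognition: bool, male_sex: bool) -> int:
    """Sum of three binary risk items (0-3); unit weighting is an assumption."""
    return int(impaired_mobility) + int(impaired_cognition) + int(male_sex)

def auc(faller_scores, non_faller_scores):
    """Rank-based AUC: probability that a random faller outscores a random non-faller."""
    wins = ties = 0
    for f in faller_scores:
        for n in non_faller_scores:
            if f > n:
                wins += 1
            elif f == n:
                ties += 1
    return (wins + 0.5 * ties) / (len(faller_scores) * len(non_faller_scores))
```

An AUC of about 0.5 corresponds to chance discrimination (the Past_Falls level above), while values around 0.73 correspond to the Predict_CM2 level.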

  18. Combining in Theory Building

    Directory of Open Access Journals (Sweden)

    Uolevi Lehtinen

    2013-07-01

    Full Text Available This article describes the idea and rationale of combining, i.e., why, when, and how to develop theoretically new combined approaches. Business administration, especially marketing, is then used as a theoretical and empirical illustrative area. The methodology rests on inductive and deductive logic; the empirical examples draw on surveys, case analyses, and secondary data. The article introduces a promising way to develop, in the long run, new comprehensive approaches and even paradigms for different disciplines, subdisciplines, and branches of subdisciplines. The ultimate message of the article is therefore to challenge researchers to put the idea and rationale of combining to the test in their own research fields and, where possible, to build new combined and comprehensive approaches. This message is multidisciplinary, concerning, for example, economics, social sciences, and political sciences in addition to business administration.

  19. A holistic approach to corporate social responsibility as a prerequisite for sustainable development: Empirical evidence

    Directory of Open Access Journals (Sweden)

    Zlatanović Dejana

    2015-01-01

    Full Text Available The growing importance of sustainable development and corporate social responsibility (CSR) for contemporary organizations demands appropriate holistic tools. The paper highlights how Soft Systems Methodology (SSM), a relevant holistic, i.e., soft systems approach, supports the conceptualization and management of the complex issues of CSR and sustainable development. The SSM’s key methodological tools are used: rich picture, root definitions, and conceptual models. Empirical research compares a selected sample of enterprises in the automotive industry in the Republic of Serbia, to identify possible systemically desirable and culturally feasible changes to improve their CSR behaviour through promoting their sustainable development. Some limitations of this research and of SSM application are discussed. Combining SSM with some other systems approaches, such as System Dynamics or Critical Systems Heuristics, is recommended for future research.

  20. Efficient Discovery of Novel Multicomponent Mixtures for Hydrogen Storage: A Combined Computational/Experimental Approach

    Energy Technology Data Exchange (ETDEWEB)

    Wolverton, Christopher [Northwestern Univ., Evanston, IL (United States). Dept. of Materials Science and Engineering; Ozolins, Vidvuds [Univ. of California, Los Angeles, CA (United States). Dept. of Materials Science and Engineering; Kung, Harold H. [Northwestern Univ., Evanston, IL (United States). Dept. of Chemical and Biological Engineering; Yang, Jun [Ford Scientific Research Lab., Dearborn, MI (United States); Hwang, Sonjong [California Inst. of Technology (CalTech), Pasadena, CA (United States). Dept. of Chemistry and Chemical Engineering; Shore, Sheldon [The Ohio State Univ., Columbus, OH (United States). Dept. of Chemistry and Biochemistry

    2016-11-28

    The objective of the proposed program is to discover novel mixed hydrides for hydrogen storage, which enable the DOE 2010 system-level goals. Our goal is to find a material that desorbs 8.5 wt.% H2 or more at temperatures below 85°C. The research program will combine first-principles calculations of reaction thermodynamics and kinetics with material and catalyst synthesis, testing, and characterization. We will combine materials from distinct categories (e.g., chemical and complex hydrides) to form novel multicomponent reactions. Systems to be studied include mixtures of complex hydrides and chemical hydrides [e.g. LiNH2+NH3BH3] and nitrogen-hydrogen based borohydrides [e.g. Al(BH4)3(NH3)3]. The 2010 and 2015 FreedomCAR/DOE targets for hydrogen storage systems are very challenging, and cannot be met with existing materials. The vast majority of the work to date has delineated materials into various classes, e.g., complex and metal hydrides, chemical hydrides, and sorbents. However, very recent studies indicate that mixtures of storage materials, particularly mixtures between various classes, hold promise to achieve technological attributes that materials within an individual class cannot reach. Our project involves a systematic, rational approach to designing novel multicomponent mixtures of materials with fast hydrogenation/dehydrogenation kinetics and favorable thermodynamics using a combination of state-of-the-art scientific computing and experimentation. We will use the accurate predictive power of first-principles modeling to understand the thermodynamic and microscopic kinetic processes involved in hydrogen release and uptake and to design new material/catalyst systems with improved properties. Detailed characterization and atomic-scale catalysis experiments will elucidate the effect of dopants and nanoscale catalysts in achieving fast kinetics and reversibility. And

  1. A novel framework for diagnosing automatic tool changer and tool life based on cloud computing

    Directory of Open Access Journals (Sweden)

    Shang-Liang Chen

    2016-03-01

    Full Text Available Tool change is one of the most frequently performed machining processes, and improper percussion as the tool’s position is changed can damage the spindle bearing. A spindle malfunction can cause problems such as a dropped tool or bias in a machined hole. The measures currently available in machine tools to avoid such issues only determine whether the clamped tool’s state is correct, using the spindle and the air adhesion method, which is also used to satisfy the high precision required of mechanical components. These measures therefore cannot be used with every type of machine tool; in addition, improper tapping of the spindle during an automatic tool change cannot be detected. This study therefore proposes a new diagnostic framework that combines cloud computing and vibration sensors, in which tool change is automatically diagnosed using an architecture that identifies abnormalities, thereby enhancing the reliability and productivity of the machine and equipment.

  2. Assessment of MYCN amplification status in Tunisian neuroblastoma: CISH and MLPA combining approach.

    Science.gov (United States)

    H'Mida Ben Brahim, Dorra; Trabelsi, Saoussen; Chabchoub, Imen; Gargouri, Inesse; Harrabi, Imed; Moussa, Adnene; Chourabi, Maroua; Haddaji, Marwa; Sassi, Sihem; Mougou, Soumaya; Gribaa, Moez; Ben Ahmed, Slim; Zakhama, Abdelfattah; Nouri, Abdellatif; Saad, Ali

    2015-01-01

    Neuroblastoma (NB) shows a complex combination of genetic aberrations. Some of them represent poor genetic prognostic factors that require specific and intensive chemotherapy. MYCN amplification constitutes the major poor-outcome prognostic factor; it is indeed frequently observed in aggressive neuroblastomas. To date, different methods are used for MYCN status detection. The primary aim of our study was to provide a critical assessment of MYCN status using 2 molecular techniques, CISH and MLPA. We also focused on the correlation between neuroblastoma genetic markers and patients' clinical course among 15 Tunisian patients. We developed a descriptive study that includes 15 pediatric Tunisian patients referred to our laboratory from 2004 to 2011. We report the analysis of fresh and FFPE NB tumor tissues. No significant correlation was found between COG grade and patients' overall survival. Assessment of MYCN gene copy number by the kappa statistic test revealed high concordance between the CISH and MLPA tests (kappa coefficient = 0.02). Despite misdiagnosing MYCN status below 5 copies, MLPA remains an effective molecular technique that enables screening for a large panel of genomic aberrations. Thus, combining CISH and MLPA is an effective molecular approach adopted in our laboratory. Our results allow pediatric oncologists to set up the first neuroblastoma therapeutic strategy based on molecular markers in Tunisia.
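The CISH/MLPA concordance above is quantified with Cohen's kappa; a minimal sketch of the statistic on paired categorical assay calls (the labels below are hypothetical):

```python
def cohens_kappa(calls_a, calls_b):
    """Agreement between two assays beyond what chance alone would produce."""
    assert len(calls_a) == len(calls_b) and calls_a
    n = len(calls_a)
    # Observed agreement: fraction of samples where the two assays agree.
    observed = sum(a == b for a, b in zip(calls_a, calls_b)) / n
    # Chance agreement: product of each assay's marginal category frequencies.
    categories = set(calls_a) | set(calls_b)
    expected = sum((calls_a.count(c) / n) * (calls_b.count(c) / n) for c in categories)
    return (observed - expected) / (1 - expected)  # undefined when expected == 1
```

Kappa is 1 for perfect agreement and 0 when agreement is no better than chance.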

  3. Combining Formal and Functional Approaches to Topic Structure

    Science.gov (United States)

    Zellers, Margaret; Post, Brechtje

    2012-01-01

    Fragmentation between formal and functional approaches to prosodic variation is an ongoing problem in linguistic research. In particular, the frameworks of the Phonetics of Talk-in-Interaction (PTI) and Empirical Phonology (EP) take very different theoretical and methodological approaches to this kind of variation. We argue that it is fruitful to…

  4. Reducing the operational energy demand in buildings using building information modeling tools and sustainability approaches

    Directory of Open Access Journals (Sweden)

    Mojtaba Valinejad Shoubi

    2015-03-01

    Full Text Available A sustainable building is constructed of materials that can decrease environmental impacts, such as energy usage, during the lifecycle of the building. Building Information Modeling (BIM) has been identified as an effective tool for building performance analysis virtually in the design stage. The main aims of this study were to assess various combinations of materials using BIM and identify alternative, sustainable solutions to reduce operational energy consumption. The amount of energy consumed by a double-story bungalow house in Johor, Malaysia, and assessments of alternative material configurations to determine the best energy performance were evaluated using Revit Architecture 2012 and Autodesk Ecotect Analysis software, to show which of the materials helped reduce the operational energy use of the building to the greatest extent throughout its annual life cycle. Finally, some alternative, sustainable designs in terms of energy savings are suggested.

  5. A Decision Support Tool for Appropriate Glucose-Lowering Therapy in Patients with Type 2 Diabetes

    DEFF Research Database (Denmark)

    Ampudia-Blasco, F Javier; Benhamou, Pierre Yves; Charpentier, Guillaume

    2014-01-01

    Abstract Background: Optimal glucose-lowering therapy in type 2 diabetes mellitus requires a patient-specific approach. Although a good framework, current guidelines are insufficiently detailed to address the different phenotypes and individual needs of patients seen in daily practice. We developed...... a patient-specific decision support tool based on a systematic analysis of expert opinion. Materials and Methods: Based on the American Diabetes Association (ADA)/European Association for the Study of Diabetes (EASD) 2012 position statement, a panel of 12 European experts rated the appropriateness (RAND....... The panel recommendations were embedded in an online decision support tool (DiaScope(®); Novo Nordisk Health Care AG, Zürich, Switzerland). Results: Treatment appropriateness was associated with (combinations of) the patient variables mentioned above. As second-line agents, dipeptidyl peptidase-4 inhibitors...

  6. Combined approach branchial sinusectomy: a new technique for excision of second branchial cleft sinus.

    Science.gov (United States)

    Olusesi, A D

    2009-10-01

    Branchial cleft anomalies are well described, with the second arch anomaly being the commonest. Following surgical excision, recurrence occurs in 2 to 22 per cent of cases, and is believed to be due largely to incomplete resection. This report aims to describe a simple surgical technique for treatment of second branchial cleft sinus in the older paediatric age group and adults. An 11-year-old girl underwent surgical excision of a second branchial sinus. Prior to surgery, she was assessed by means of an imaging sonogram, and by direct methylene blue dye injection into the sinus on the operating table, followed by insertion of a metallic probe. Dissection was of the 'step ladder' incision type, but the incision was completed via an oropharyngeal approach. Histological examination of the lesion after excision established the diagnosis. No recurrence had been observed at the time of writing. Although they are congenital lesions, second branchial cleft abnormalities usually present in the older paediatric age group or even in adulthood. In the case reported, a simple combined approach ensured completeness of resection.

  7. A Hybrid Hierarchical Approach for Brain Tissue Segmentation by Combining Brain Atlas and Least Square Support Vector Machine

    Science.gov (United States)

    Kasiri, Keyvan; Kazemi, Kamran; Dehghani, Mohammad Javad; Helfroush, Mohammad Sadegh

    2013-01-01

    In this paper, we present a new semi-automatic brain tissue segmentation method based on a hybrid hierarchical approach that combines a brain atlas as a priori information and a least-squares support vector machine (LS-SVM). The method consists of three steps. In the first two steps, the skull is removed and the cerebrospinal fluid (CSF) is extracted. These two steps are performed using the FMRIB Automated Segmentation Tool integrated in the FSL software (FSL-FAST), developed at the Oxford Centre for Functional MRI of the Brain (FMRIB). Then, in the third step, the LS-SVM is used to segment grey matter (GM) and white matter (WM). The training samples for the LS-SVM are selected from the registered brain atlas. Voxel intensities and spatial positions are selected as the two feature groups for training and testing. The SVM is a powerful discriminator able to handle nonlinear classification problems; however, it cannot provide posterior probabilities. Thus, we use a sigmoid function to map the SVM output into probabilities. The proposed method is used to segment CSF, GM and WM from simulated magnetic resonance imaging (MRI) generated with the Brainweb MRI simulator and from real data provided by the Internet Brain Segmentation Repository. The semi-automatically segmented brain tissues were evaluated by comparing them to the corresponding ground truth. The Dice and Jaccard similarity coefficients, sensitivity, and specificity were calculated for quantitative validation of the results. The quantitative results show that the proposed method segments brain tissues accurately with respect to the corresponding ground truth. PMID:24696800
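The sigmoid mapping of SVM decision values to posterior probabilities described above is commonly done with Platt-style scaling; a minimal sketch, where the slope and intercept defaults are illustrative assumptions (in practice they are fit by maximum likelihood on held-out decision values):

```python
import math

def sigmoid_probability(decision_value: float, a: float = -1.0, b: float = 0.0) -> float:
    """Map a raw SVM decision value to a pseudo-probability in (0, 1).

    a and b are the Platt parameters; the defaults here are illustrative,
    not fitted values from the paper.
    """
    return 1.0 / (1.0 + math.exp(a * decision_value + b))
```

With the defaults, a decision value of 0 (on the margin) maps to probability 0.5, and large positive or negative values saturate toward 1 or 0.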

  8. Fast in-situ tool inspection based on inverse fringe projection and compact sensor heads

    Science.gov (United States)

    Matthias, Steffen; Kästner, Markus; Reithmeier, Eduard

    2016-11-01

    Inspection of machine elements is an important task in production processes in order to ensure the quality of produced parts and to gather feedback for the continuous improvement process. A new measuring system is presented, which is capable of performing the inspection of critical tool geometries, such as gearing elements, inside the forming machine. To meet the constraints on sensor head size and inspection time imposed by the limited space inside the machine and the cycle time of the process, the measuring device employs a combination of endoscopy techniques with the fringe projection principle. Compact gradient index lenses enable a compact design of the sensor head, which is connected to a CMOS camera and a flexible micro-mirror based projector via flexible fiber bundles. Using common fringe projection patterns, the system achieves measuring times of less than five seconds. To further reduce the time required for inspection, the generation of inverse fringe projection patterns has been implemented for the system. Inverse fringe projection speeds up the inspection process by employing object-adapted patterns, which enable the detection of geometry deviations in a single image. Two different approaches to generate object adapted patterns are presented. The first approach uses a reference measurement of a manufactured tool master to generate the inverse pattern. The second approach is based on a virtual master geometry in the form of a CAD file and a ray-tracing model of the measuring system. Virtual modeling of the measuring device and inspection setup allows for geometric tolerancing for free-form surfaces by the tool designer in the CAD-file. A new approach is presented, which uses virtual tolerance specifications and additional simulation steps to enable fast checking of metric tolerances. Following the description of the pattern generation process, the image processing steps required for inspection are demonstrated on captures of gearing geometries.

  9. Systemic Approach to Architectural Performance

    Directory of Open Access Journals (Sweden)

    Marie Davidova

    2017-04-01

    Full Text Available First-hand experiences in several design projects that were based on media richness and collaboration are described in this article. Although complex design processes are generally considered socio-technical systems, they are deeply involved with natural systems. My collaborative research in the field of performance-oriented design combines digital and physical conceptual sketches, simulations, and prototyping. GIGA-mapping is applied to organise the data. The design process uses the most suitable tools for the subtasks at hand, and the use of media is mixed according to particular requirements. These tools include digital and physical GIGA-mapping, parametric computer-aided design (CAD), digital simulation of analyses, as well as sampling and 1:1 prototyping. Also discussed in this article are the methodologies used in several design projects to strategize these tools, and the developments and trends in the tools employed. The paper argues that digital tools tend to produce similar results through given pre-sets that often do not correspond to real needs. Thus, there is a significant need for mixed methods, including prototyping, in the creative design process. Media mixing and cooperation across disciplines are unavoidable in a holistic approach to contemporary design, which includes the consideration of diverse biotic and abiotic agents. I argue that physical and digital GIGA-mapping is a crucial tool for coping with this complexity. Furthermore, I propose integrating physical and digital outputs in one GIGA-map, and bringing the participation and co-design of biotic and abiotic agents into one rich design research space, resulting in an ever-evolving, time-based research-design process.

  10. Design principles of metal-cutting machine tools

    CERN Document Server

    Koenigsberger, F

    1964-01-01

    Design Principles of Metal-Cutting Machine Tools discusses the fundamentals aspects of machine tool design. The book covers the design consideration of metal-cutting machine, such as static and dynamic stiffness, operational speeds, gearboxes, manual, and automatic control. The text first details the data calculation and the general requirements of the machine tool. Next, the book discusses the design principles, which include stiffness and rigidity of the separate constructional elements and their combined behavior under load, as well as electrical, mechanical, and hydraulic drives for the op

  11. Multi-criteria approach with linear combination technique and analytical hierarchy process in land evaluation studies

    Directory of Open Access Journals (Sweden)

    Orhan Dengiz

    2018-01-01

    Full Text Available Land evaluation analysis is a prerequisite to achieving optimum utilization of available land resources. Lack of knowledge of the best combination of factors that suit production has contributed to low yields. The aim of this study was to determine the most suitable areas for agricultural use. To determine the land suitability classes of the study area, a multi-criteria approach was used with the linear combination technique and the analytical hierarchy process, taking into consideration land and soil physico-chemical characteristics such as slope, texture, depth, drainage, stoniness, erosion, pH, EC, CaCO3 and organic matter. These data and the land mapping units were taken from a digital detailed soil map at 1:5000 scale. In addition, a GIS program was used to produce the land suitability map of the study area. This study was carried out in the Mahmudiye, Karaamca, Yazılı, Çiçeközü, Orhaniye and Akbıyık villages in the Yenişehir district of Bursa province. The total study area is 7059 ha; 6890 ha of it is used for irrigated agriculture, dry farming and pasture, while 169 ha is used for non-agricultural purposes such as settlement, roads and water bodies. Average annual temperature and precipitation of the study area are 16.1°C and 1039.5 mm, respectively. After determining the land suitability distribution classes for the study area, it was found that 15.0% of the study area is highly (S1) or moderately (S2) suitable, while 85% is marginally suitable or unsuitable, coded as S3 and N. Relations were also examined by comparing the results of the linear combination technique with other hierarchical approaches, such as the Land Use Capability Classification and the Suitability Class for Agricultural Use methods.
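The weighted linear combination step can be sketched as follows; the criterion weights are assumed to come from AHP pairwise comparisons, and the class thresholds are hypothetical, since the paper's exact limits are not given in the abstract.

```python
def suitability_index(criterion_scores, weights):
    """Weighted linear combination: S = sum(w_i * x_i), with weights summing to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9
    return sum(w * x for w, x in zip(weights, criterion_scores))

def suitability_class(index):
    """Map the index to FAO-style suitability classes; thresholds are illustrative."""
    if index >= 0.75:
        return "S1"  # highly suitable
    if index >= 0.50:
        return "S2"  # moderately suitable
    if index >= 0.25:
        return "S3"  # marginally suitable
    return "N"       # not suitable
```

For example, normalized scores of (1.0, 0.5, 0.0) for slope, texture and drainage with AHP weights (0.5, 0.3, 0.2) give an index of 0.65, class S2.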

  12. Cooling Systems Design in Hot Stamping Tools by a Thermal-Fluid-Mechanical Coupled Approach

    Directory of Open Access Journals (Sweden)

    Tao Lin

    2014-06-01

    Full Text Available Hot stamping tools with cooling systems are the key facilities for the hot stamping of ultrahigh-strength steels (UHSS) in the automotive industry. Hot stamping tools have a significant influence on the final microstructure and properties of the hot stamped parts. In series production, the tools should be rapidly cooled by cooling water. Hence, the design of hot stamping tools with cooling systems is important not only for workpieces of good quality but also for tools with good cooling performance and long life. In this paper, a new multifield simulation method is proposed for the design of hot stamping tools with cooling systems. The deformation of the tools is also analyzed by this method. Based on MpCCI (Mesh-based parallel Code Coupling Interface), thermal-fluid and thermal-fluid-mechanical coupled simulations were performed. Subsequently, the geometrical parameters of the cooling system were investigated for the design. The results show that both the distance between the ducts and the distance between the ducts and the tool's loaded contour have a significant influence on the quenching effect, and that a better quenching effect can be achieved with a shorter distance from the tool surface and a smaller distance between ducts. It is also shown that thermal expansion is the main cause of deformation of the hot forming tools, which distorts the cooling ducts and concentrates stress at the corners of the ducts.

  13. Creating a strategic plan for configuration management using computer aided software engineering (CASE) tools

    International Nuclear Information System (INIS)

    Smith, P.R.; Sarfaty, R.

    1993-01-01

    This paper provides guidance in the definition, documentation, measurement, enhancement, and validation of processes for a strategic plan for configuration management (CM). The approach and methodology used in establishing a strategic plan are the same for any enterprise, including the Department of Energy (DOE), commercial nuclear plants, the Department of Defense (DOD), and large industrial complexes. The principles and techniques presented are used worldwide by some of the largest corporations. The authors used industry knowledge and the areas of their current employment to illustrate and provide examples. Developing a strategic configuration and information management plan for DOE Idaho Field Office (DOE-ID) facilities is discussed in this paper. A good knowledge of CM principles is the key to successful strategic planning. This paper describes and defines CM elements, and discusses how CM integrates the facility's physical configuration, design basis, and documentation. The strategic plan does not require the support of a computer aided software engineering (CASE) tool. However, the use of a CASE tool provides a methodology for consistency in approach, graphics, and database capability, combined to form an encyclopedia and a method of presentation that is easily understood and aids the process of reengineering. CASE tools have much more capability than stated above; for example, they can support a joint application development (JAD) group in preparing a software functional specification document and, if necessary, automatically generate software application code. This paper briefly discusses the characteristics and capabilities of two CASE tools that use different methodologies to generate similar deliverables.

  14. Combination of real options and game-theoretic approach in investment analysis

    Science.gov (United States)

    Arasteh, Abdollah

    2016-09-01

    Investments in technology account for a large share of capital investment by major companies, and assessing such investment projects is critical to the efficient assignment of resources. Viewing investment projects as real options, this paper develops a method for assessing technology investment decisions in the presence of uncertainty and competition. It combines game-theoretic models of strategic market interactions with a real options approach. Several key characteristics underlie the model. First, our study shows how investment strategies depend on competitive interactions: under the force of competition, firms hurry to exercise their options early, and the resulting "hurry equilibrium" destroys the option value of waiting and induces aggressive investment behavior. Second, we derive optimal investment policies and critical investment thresholds. This suggests that integration will be unavoidable in some information product markets. The model yields new intuitions about the forces that shape market behavior as observed in the information technology industry. It can be used to specify optimal investment policies for technology innovations and adoptions, multistage R&D, and investment projects in information technology.

  15. Combining 2-m temperature nowcasting and short range ensemble forecasting

    Directory of Open Access Journals (Sweden)

    A. Kann

    2011-12-01

    Full Text Available During recent years, numerical ensemble prediction systems have become an important tool for estimating the uncertainties of dynamical and physical processes as represented in numerical weather models. The latest generation of limited area ensemble prediction systems (LAM-EPSs) allows for probabilistic forecasts at high resolution in both space and time. However, these systems still suffer from systematic deficiencies. Especially for nowcasting (0–6 h) applications, the ensemble spread is smaller than the actual forecast error. This paper tries to generate probabilistic short range 2-m temperature forecasts by combining a state-of-the-art nowcasting method and a limited area ensemble system, and compares the results with statistical methods. The Integrated Nowcasting Through Comprehensive Analysis (INCA) system, which has been in operation at the Central Institute for Meteorology and Geodynamics (ZAMG) since 2006 (Haiden et al., 2011), provides short range deterministic forecasts at high temporal (15–60 min) and spatial (1 km) resolution. An INCA Ensemble (INCA-EPS) of 2-m temperature forecasts is constructed by applying a dynamical approach, a statistical approach, and a combined dynamic-statistical method. The dynamical method takes uncertainty information (i.e. ensemble variance) from the operational limited area ensemble system ALADIN-LAEF (Aire Limitée Adaptation Dynamique Développement InterNational Limited Area Ensemble Forecasting), which runs operationally at ZAMG (Wang et al., 2011). The purely statistical method assumes a well-calibrated spread-skill relation and applies ensemble spread according to the skill of the INCA forecast of the most recent past. The combined dynamic-statistical approach adapts the ensemble variance gained from ALADIN-LAEF with non-homogeneous Gaussian regression (NGR), which yields a statistical correction of the first and second moment (mean bias and dispersion) for Gaussian distributed continuous
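The NGR step adjusts the ensemble's first two moments; a minimal sketch of the predictive-distribution parameters, where the coefficients passed in are assumptions for illustration (in practice they are fit, e.g. by minimizing the CRPS over a training period):

```python
def ngr_adjust(ens_mean: float, ens_var: float, a: float, b: float, c: float, d: float):
    """Non-homogeneous Gaussian regression: predictive N(mu, sigma^2) with
    mu = a + b * ens_mean          (mean-bias correction)
    sigma^2 = c + d * ens_var      (spread/dispersion calibration)
    """
    return a + b * ens_mean, c + d * ens_var
```

For a raw ensemble mean of 20.0 °C and variance 1.0 with illustrative coefficients (0.5, 1.0, 0.2, 1.5), the calibrated forecast is N(20.5, 1.7): the mean is shifted by the bias term and the variance is inflated to counter the underdispersion noted above.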

  16. Rapid Benefit Indicator (RBI) Checklist Tool - Quick Start ...

    Science.gov (United States)

    The Rapid Benefits Indicators (RBI) approach consists of five steps and is outlined in Assessing the Benefits of Wetland Restoration – A Rapid Benefits Indicators Approach for Decision Makers. This checklist tool is intended to be used to record information as you answer the questions in that guide. When performing an RBI assessment on wetland restoration site(s), results can be recorded and reviewed using this VBA-enabled MS Excel checklist tool.

  17. Combining sap flow and eddy covariance approaches to derive stomatal and non-stomatal O3 fluxes in a forest stand

    International Nuclear Information System (INIS)

    Nunn, A.J.; Cieslik, S.; Metzger, U.; Wieser, G.; Matyssek, R.

    2010-01-01

    Stomatal O3 fluxes to a mixed beech/spruce stand (Fagus sylvatica/Picea abies) in Central Europe were determined using two different approaches. The sap flow technique yielded the tree-level transpiration, whereas the eddy covariance method provided the stand-level evapotranspiration. Both data sets were then converted into stomatal ozone fluxes, exemplifying this novel concept for July 2007. Sap flow-based stomatal O3 flux was 33% of the total O3 flux, whereas derivation from evapotranspiration rates in combination with the Penman-Monteith algorithm amounted to 47%. In addition to this proportional difference, the sap flow-based assessment yielded lower levels of stomatal O3 flux and reflected stomatal regulation rather than O3 exposure, paralleling the daily courses of canopy conductance for water vapor and eddy covariance-based total stand-level O3 flux. The demonstrated combination of sap flow and eddy covariance approaches supports the development of O3 risk assessment in forests from O3 exposure towards flux-based concepts. - Combined tree sap flow and eddy covariance-based methodologies yield stomatal O3 flux as 33% of total stand flux.

  18. Cockpit System Situational Awareness Modeling Tool

    Science.gov (United States)

    Keller, John; Lebiere, Christian; Shay, Rick; Latorella, Kara

    2004-01-01

    This project explored the possibility of predicting pilot situational awareness (SA) using human performance modeling techniques for the purpose of evaluating developing cockpit systems. The Improved Performance Research Integration Tool (IMPRINT) was combined with the Adaptive Control of Thought-Rational (ACT-R) cognitive modeling architecture to produce a tool that can model both the discrete tasks of pilots and the cognitive processes associated with SA. The techniques for using this tool to predict SA were demonstrated using the newly developed Aviation Weather Information (AWIN) system. By providing an SA prediction tool to cockpit system designers, cockpit concepts can be assessed early in the design process while providing a cost-effective complement to the traditional pilot-in-the-loop experiments and data collection techniques.

  19. FACET CLASSIFICATIONS OF E-LEARNING TOOLS

    Directory of Open Access Journals (Sweden)

    Olena Yu. Balalaieva

    2013-12-01

    Full Text Available The article deals with the classification of e-learning tools based on the facet method, which separates a parallel set of objects into independent classification groups; it does not assume a rigid, pre-built classification structure with finite groups, since classification groups are formed by combining values taken from the relevant facets. An attempt to systematize the existing classifications of e-learning tools from the standpoint of classification theory is made for the first time. Modern Ukrainian and foreign facet classifications of e-learning tools are described, and their positive and negative features compared to classifications based on a hierarchical method are analyzed. The author's original facet classification of e-learning tools is proposed.

  20. Combining independent de novo assemblies optimizes the coding transcriptome for nonconventional model eukaryotic organisms.

    Science.gov (United States)

    Cerveau, Nicolas; Jackson, Daniel J

    2016-12-09

    Next-generation sequencing (NGS) technologies are arguably the most revolutionary technical development to join the list of tools available to molecular biologists since PCR. For researchers working with nonconventional model organisms one major problem with the currently dominant NGS platform (Illumina) stems from the obligatory fragmentation of nucleic acid material that occurs prior to sequencing during library preparation. This step creates a significant bioinformatic challenge for accurate de novo assembly of novel transcriptome data. This challenge becomes apparent when a variety of modern assembly tools (of which there is no shortage) are applied to the same raw NGS dataset. With the same assembly parameters these tools can generate markedly different assembly outputs. In this study we present an approach that generates an optimized consensus de novo assembly of eukaryotic coding transcriptomes. This approach does not represent a new assembler, rather it combines the outputs of a variety of established assembly packages, and removes redundancy via a series of clustering steps. We test and validate our approach using Illumina datasets from six phylogenetically diverse eukaryotes (three metazoans, two plants and a yeast) and two simulated datasets derived from metazoan reference genome annotations. All of these datasets were assembled using three currently popular assembly packages (CLC, Trinity and IDBA-tran). In addition, we experimentally demonstrate that transcripts unique to one particular assembly package are likely to be bioinformatic artefacts. For all eight datasets our pipeline generates more concise transcriptomes that in fact possess more unique annotatable protein domains than any of the three individual assemblers we employed. Another measure of assembly completeness (using the purpose built BUSCO databases) also confirmed that our approach yields more information. Our approach yields coding transcriptome assemblies that are more likely to be
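    The core idea of merging several assemblers' outputs and then removing redundancy can be sketched in a few lines. This is a toy illustration only; the function name and the substring criterion are mine, and the published pipeline relies on dedicated clustering software rather than naive substring checks:

```python
def merge_assemblies(assemblies):
    """Merge contig sets from several assemblers and remove redundancy.

    A contig is dropped if it is an exact duplicate of, or fully contained
    in, a longer contig already kept -- a crude stand-in for the series of
    clustering steps a real consensus-assembly pipeline performs.
    """
    # pool all contigs, deduplicate, and consider longest first
    contigs = sorted({c for asm in assemblies for c in asm}, key=len, reverse=True)
    kept = []
    for c in contigs:
        if not any(c in longer for longer in kept):
            kept.append(c)
    return kept
```

In practice the clustering would operate on sequence similarity (allowing mismatches), not exact containment, but the redundancy-removal logic has the same shape.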

  1. SCENARIOS EVALUATION TOOL FOR CHLORINATED SOLVENT MNA

    Energy Technology Data Exchange (ETDEWEB)

    Vangelas, K; Looney, B; Truex, M. J.; Newell, C. J.

    2006-08-16

    journal articles, as well as in the technical and regulatory documents being developed within the ITRC. Three topic areas were identified for development during this project: mass balance, Enhanced Attenuation (EA), and new characterization and monitoring tools and approaches to support MNA and EA. Each of these topics is documented in a stand-alone report (WSRC-STI-2006-00082, WSRC-STI-2006-00083, and WSRC-STI-2006-00084, respectively). In brief, the mass balance efforts examine methods and tools that allow a site to be evaluated as a system in which the inputs and internal processes are compared to the outputs, and that clarify which attenuation processes may be occurring and how likely they are to occur within the system. Enhanced Attenuation is a new concept that serves as a transition step between primary treatments and MNA, used when the natural attenuation processes are not sufficient to allow direct transition from the primary treatment to MNA. EA technologies are designed either to boost the natural attenuation processes or to decrease the loading of contaminants to the system for a period of time sufficient to allow the remedial goals to be met over the long term. For characterization and monitoring, a phased approach based on documenting the site-specific mass balance was developed. Tools and techniques to support the approach included direct measures of the biological processes and various tools to support cost-effective long-term monitoring of systems where the natural attenuation processes are the main treatment remedies. The effort revealed opportunities for integrating attenuation mechanisms into a systematic set of "combined remedies" for contaminated sites.

  2. Dereplication, Aggregation and Scoring Tool (DAS Tool) v1.0

    Energy Technology Data Exchange (ETDEWEB)

    2017-03-01

    Communities of uncultivated microbes are critical to ecosystem function and microorganism health, and a key objective of metagenomic studies is to analyze organism-specific metabolic pathways and reconstruct community interaction networks. This requires accurate assignment of genes to genomes, yet existing binning methods often fail to predict a reasonable number of genomes and report many bins of low quality and completeness. Furthermore, the performance of existing algorithms varies between samples and biotopes. Here, we present a dereplication, aggregation and scoring strategy, DAS Tool, that combines the strengths of a flexible set of established binning algorithms. DAS Tool, applied to a constructed community, generated more accurate bins than any automated method. Further, when applied to samples of different complexity, including soil, natural oil seeps, and the human gut, DAS Tool recovered substantially more near-complete genomes than any single binning method alone, including three genomes from a novel lineage. The ability to reconstruct many near-complete genomes from metagenomics data will greatly advance genome-centric analyses of ecosystems.
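    The dereplication-and-scoring idea can be sketched as a greedy selection over candidate bins from all binners. This is a simplified illustration, not DAS Tool's actual algorithm: the score formula, the weight of 5.0, and the field names are assumptions of mine (DAS Tool uses single-copy-gene-based scores and iterative re-scoring):

```python
def select_bins(candidate_bins, weight=5.0):
    """Greedy sketch of dereplication, aggregation and scoring: rank candidate
    bins from all binning tools by a quality score (completeness penalized by
    contamination) and keep the best-scoring bins, discarding any candidate
    that shares contigs with a bin already selected."""
    scored = sorted(candidate_bins,
                    key=lambda b: b["completeness"] - weight * b["contamination"],
                    reverse=True)
    selected, used = [], set()
    for b in scored:
        if not used & set(b["contigs"]):   # no contig may belong to two bins
            selected.append(b)
            used |= set(b["contigs"])
    return selected
```

The key property this preserves is that every contig ends up in at most one output bin, while the highest-quality candidate wins whenever two binners disagree.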

  3. Practical, general parser combinators

    NARCIS (Netherlands)

    A. Izmaylova (Anastasia); A. Afroozeh (Ali); T. van der Storm (Tijs)

    2016-01-01

    Parser combinators are a popular approach to parsing where context-free grammars are represented as executable code. However, conventional parser combinators do not support left recursion, and can have worst-case exponential runtime. These limitations hinder the expressivity and
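    The executable-grammar idea can be illustrated with a minimal combinator sketch (the combinator names and the toy grammar are mine, not from the paper). It also shows the left-recursion limitation the abstract mentions: the grammar below must be written right-recursively, because the left-recursive form would call itself before consuming any input and recurse forever:

```python
# A parser is a function: (input string, position) -> list of (value, new position).

def char(c):
    """Match a single literal character."""
    return lambda s, i: [(c, i + 1)] if i < len(s) and s[i] == c else []

def seq(p, q):
    """Run p, then q on the remainder; pair up their results."""
    return lambda s, i: [((v1, v2), k) for v1, j in p(s, i) for v2, k in q(s, j)]

def alt(p, q):
    """Try both alternatives, collecting all parses."""
    return lambda s, i: p(s, i) + q(s, i)

# Grammar: A -> 'a' A | 'a'   (right-recursive; the left-recursive form
# A -> A 'a' | 'a' would make naive combinators like these loop forever)
def a_plus(s, i):
    return alt(seq(char('a'), a_plus), char('a'))(s, i)
```

Returning all parses from every position is also what drives the worst-case exponential behaviour of naive combinators; the paper's contribution is precisely avoiding that while still supporting left recursion.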

  4. A combined approach for the enhancement and segmentation of mammograms using modified fuzzy C-means method in wavelet domain.

    Science.gov (United States)

    Srivastava, Subodh; Sharma, Neeraj; Singh, S K; Srivastava, R

    2014-07-01

    In this paper, a combined approach for enhancement and segmentation of mammograms is proposed. In the preprocessing stage, a contrast limited adaptive histogram equalization (CLAHE) method is applied to obtain better-contrast mammograms. After this, the proposed combined methods are applied. In the first step of the proposed approach, a two-dimensional (2D) discrete wavelet transform (DWT) is applied to all the input images. In the second step, a proposed nonlinear complex diffusion based unsharp masking and crispening method is applied on the approximation coefficients of the wavelet-transformed images to further highlight abnormalities such as micro-calcifications, tumours, etc., and to reduce the false positives (FPs). Thirdly, a modified fuzzy c-means (FCM) segmentation method is applied on the output of the second step. In the modified FCM method, mutual information is proposed as a similarity measure in place of the conventional Euclidean distance based dissimilarity measure for FCM segmentation. Finally, the inverse 2D-DWT is applied. The efficacy of the proposed unsharp masking and crispening method for image enhancement is evaluated in terms of signal-to-noise ratio (SNR), and that of the proposed segmentation method is evaluated in terms of Rand index (RI), global consistency error (GCE), and variation of information (VoI). The performance of the proposed segmentation approach is compared with other commonly used segmentation approaches such as Otsu's thresholding, texture based, k-means, and FCM clustering, as well as thresholding. From the obtained results, it is observed that the proposed segmentation approach performs better and takes less processing time in comparison to the standard FCM and the other segmentation methods in consideration.
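    For reference, the standard FCM iteration that the paper modifies alternates a membership update and a center update. A minimal 1-D sketch follows, using the conventional Euclidean distance (the paper replaces this with a mutual-information measure); the function name, initialization, and parameters are mine:

```python
def fcm(points, c=2, m=2.0, iters=50):
    """Standard fuzzy c-means on 1-D data with Euclidean distance."""
    # spread initial centers evenly across the data range
    lo, hi = min(points), max(points)
    centers = [lo + (hi - lo) * j / (c - 1) for j in range(c)]
    for _ in range(iters):
        # membership update: u_ij = 1 / sum_k (d_ij / d_ik)^(2/(m-1))
        u = []
        for x in points:
            d = [abs(x - v) or 1e-12 for v in centers]  # avoid divide-by-zero
            u.append([1.0 / sum((d[j] / d[k]) ** (2 / (m - 1)) for k in range(c))
                      for j in range(c)])
        # center update: v_j = sum_i u_ij^m x_i / sum_i u_ij^m
        centers = [sum(u[i][j] ** m * points[i] for i in range(len(points))) /
                   sum(u[i][j] ** m for i in range(len(points)))
                   for j in range(c)]
    return sorted(centers)
```

Swapping the distance `d` for a different (dis)similarity measure, as the paper proposes, leaves this alternating structure unchanged.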

  5. A synchrotron-based local computed tomography combined with data-constrained modelling approach for quantitative analysis of anthracite coal microstructure

    International Nuclear Information System (INIS)

    Chen, Wen Hao; Yang, Sam Y. S.; Xiao, Ti Qiao; Mayo, Sherry C.; Wang, Yu Dan; Wang, Hai Peng

    2014-01-01

    A quantitative local computed tomography approach combined with data-constrained modelling has been developed. The method can distinctly improve the spatial resolution and the composition resolution in a sample larger than the field of view, for quantitative characterization of three-dimensional distributions of material compositions and voids. Quantifying three-dimensional spatial distributions of pores and material compositions in samples is a key materials characterization challenge, particularly in samples where compositions are distributed across a range of length scales, and where such compositions have similar X-ray absorption properties, as in coal. Consequently, obtaining detailed information within sub-regions of a multi-length-scale sample by conventional approaches may not provide the resolution and level of detail one might desire. Herein, an approach for quantitative high-definition determination of material compositions from X-ray local computed tomography combined with a data-constrained modelling method is proposed. The approach is capable of dramatically improving the spatial resolution, enabling finer details to be revealed within a region of interest of a sample larger than the field of view than is possible with conventional techniques. A coal sample containing distributions of porosity and several mineral compositions is employed to demonstrate the approach. The optimal experimental parameters are pre-analyzed. The quantitative results demonstrate that the approach can reveal significantly finer details of compositional distributions in the sample region of interest. The elevated spatial resolution is crucial for coal-bed methane reservoir evaluation and for understanding the transformation of the minerals during coal processing. The method is generic and can be applied to three-dimensional compositional characterization of other materials.

  6. Quantifying Traces of Tool Use: A Novel Morphometric Analysis of Damage Patterns on Percussive Tools

    OpenAIRE

    Caruana, Matthew V.; Carvalho, Susana; Braun, David R.; Presnyakova, Darya; Haslam, Michael; Archer, Will; Bobe, Rene; Harris, John W. K.

    2014-01-01

    Percussive technology continues to play an increasingly important role in understanding the evolution of tool use. Comparing the archaeological record with extractive foraging behaviors in nonhuman primates has focused on percussive implements as a key to investigating the origins of lithic technology. Despite this, archaeological approaches towards percussive tools have been obscured by a lack of standardized methodologies. Central to this issue have been the use of qualitative, non-diagnost...

  7. Modelling and analysis of tool wear and surface roughness in hard turning of AISI D2 steel using response surface methodology

    Directory of Open Access Journals (Sweden)

    M. Junaid Mir

    2018-01-01

    Full Text Available The present work deals with machinability studies on tool wear and surface roughness in finish hard turning of AISI D2 steel using PCBN, mixed ceramic and coated carbide inserts. The machining experiments are conducted based on response surface methodology (RSM). The combined effects of three cutting parameters, viz. cutting speed, cutting time and tool hardness, on the two performance outputs (i.e. VB and Ra) are explored employing the analysis of variance (ANOVA). The relationships between input variables and the response parameters are determined using a quadratic regression model. The results show that tool wear was influenced principally by the cutting time and secondarily by the cutting tool hardness. On the other hand, cutting time was the dominant factor affecting workpiece surface roughness, followed by cutting speed. Finally, the multiple-response optimization of tool wear and surface roughness was carried out using the desirability function approach (DFA).
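    The quadratic regression models used in RSM fit responses such as VB or Ra as second-order polynomials in the factors. A minimal one-factor slice of that idea can be sketched with ordinary least squares over the design matrix [1, t, t²]; the function name and data are mine, and a real RSM model would include all three factors plus their interactions:

```python
def fit_quadratic(ts, ys):
    """Least-squares fit of y = b0 + b1*t + b2*t^2 (a one-factor slice of
    the second-order response-surface models used in RSM)."""
    # build the normal equations X^T X b = X^T y for design matrix [1, t, t^2]
    rows = [[1.0, t, t * t] for t in ts]
    xtx = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    xty = [sum(r[i] * y for r, y in zip(rows, ys)) for i in range(3)]
    # solve by Gauss-Jordan elimination
    for i in range(3):
        piv = xtx[i][i]
        xtx[i] = [v / piv for v in xtx[i]]
        xty[i] /= piv
        for k in range(3):
            if k != i:
                f = xtx[k][i]
                xtx[k] = [a - f * b for a, b in zip(xtx[k], xtx[i])]
                xty[k] -= f * xty[i]
    return xty  # [b0, b1, b2]
```

ANOVA then partitions the variance explained by each fitted term to rank factor significance, which is how conclusions like "cutting time dominates tool wear" are reached.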

  8. Sindbad: a multi-purpose and scalable X-ray simulation tool for NDE and medical imaging

    Energy Technology Data Exchange (ETDEWEB)

    Guillemaud, R.; Tabary, J.; Hugonnard, P.; Mathy, F.; Koenig, A.; Gliere, A

    2003-07-01

    In a unified framework, Sindbad is a multi-purpose X-ray simulation software package which provides a scalable approach to computation and very efficient results by combining analytical and Monte Carlo simulations. The software has been validated experimentally. It is also easy to use, with a strong emphasis on a user-friendly GUI, simple description of objects (CAD or volume) and visualization tools. The next developments will focus on accelerating the Monte Carlo simulation for scatter fraction computation and on adding new types of detector. (N.C.)

  9. Sequential extraction combined with isotope analysis as a tool for the investigation of lead mobilisation in soils: Application to organic-rich soils in an upland catchment in Scotland

    International Nuclear Information System (INIS)

    Bacon, Jeffrey R.; Farmer, John G.; Dunn, Sarah M.; Graham, Margaret C.; Vinogradoff, Susan I.

    2006-01-01

    Sequential extraction (modified BCR procedure) combined with isotope analysis has been investigated as a tool for assessing mobilisation of lead into streams at an upland catchment in NE Scotland. The maximum lead concentrations (up to 110 mg kg⁻¹ in air-dried soil) occurred not at the surface but at about 10 cm depth. The lowest ²⁰⁶Pb/²⁰⁷Pb ratios in any profile occurred, with one exception, at 2.5-5 cm depth. In the one exception, closest to the only road in the area, significantly lower ²⁰⁶Pb/²⁰⁷Pb ratios in the surface soil together with much increased chloride concentrations (in comparison to other surface waters) indicated the possible mobilisation of roadside lead and transfer to the stream. The ²⁰⁶Pb/²⁰⁷Pb ratios in extractable fractions tended at depth towards the ratio measured in the residual phase but the ratios in the oxidizable fraction increased to a value higher than that of the residual phase. - Sequential extraction combined with isotope analysis was used as a tool to assess mobilisation of lead into streams
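    Reasoning from ²⁰⁶Pb/²⁰⁷Pb ratios to lead sources typically uses a two-end-member mixing model. The sketch below is the standard textbook form, not a calculation from this paper, and for simplicity it ignores the concentration weighting a rigorous apportionment would apply; the ratio values in the test are illustrative only:

```python
def source_fraction(r_sample, r_source, r_background):
    """Two-end-member mixing: fraction of lead attributable to a source
    end-member (e.g. petrol-derived lead, with a low 206Pb/207Pb ratio)
    versus a background end-member, from the measured isotope ratio.
    Simplified: assumes ratios mix linearly (no concentration weighting)."""
    return (r_sample - r_background) / (r_source - r_background)
```

A sample ratio midway between the two end-members thus implies roughly equal contributions from each.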

  10. A Novel Supra-Brow Combined with Infra-Brow Lift Approach for Asian Women.

    Science.gov (United States)

    Shu, Maoguo; He, Lin; Su, Yingjun; Shi, Junli; Zhang, Xi; Liu, Xiangyu; Yu, Xueyuan

    2016-06-01

    Direct brow lift surgery remains popular among Asian women despite its disadvantages. The traditional direct brow lift via a supra-brow incision is not suitable for Asian women because of their unique facial features, such as higher eyebrows, wider upper eyelids, and more orbital fat. Therefore, we designed a novel brow lift technique via a supra-brow combined with an infra-brow approach for Asian women. An area of skin above and below the eyebrow was measured, demarcated, and surgically removed. The redundant orbicularis oculi muscle (OOM) was excised while keeping the frontalis muscle intact. The OOM in the inferior flap was elevated and sutured to the frontalis muscle. In cases of puffy eyelids, orbital fat was partially removed through an infra-brow incision. Finally, a series of modifications were performed to reduce post-operative scarring. A total of 496 patients underwent this surgery from July 2009 to December 2013, and 432 patients were followed up for at least 6 months after surgery. Post-operative scars in most patients (428/432) were inconspicuous. No facial nerve injuries were documented, and eight patients reported transient forehead numbness. The height of the palpebral fissure was increased, but there was no marked increase in the distance between the upper eyelid edge and the eyebrow. In follow-up visits, 409 of 432 patients (94.7%) were satisfied with their surgical results. This new brow lift technique via a supra-brow combined with an infra-brow approach provided a simple and safe surgical repair of lateral brow ptosis, upper eyelid hooding, and crow's feet in Asian women. The surgical outcomes were predictable and the scars were inconspicuous.

  11. Process development and tooling design for intrinsic hybrid composites

    Science.gov (United States)

    Riemer, M.; Müller, R.; Drossel, W. G.; Landgrebe, D.

    2017-09-01

    Hybrid parts, which combine the advantages of different material classes, are moving into the focus of lightweight applications. This development is amplified by their high potential for use in crash-relevant structures. In the current state of the art, hybrid parts are mainly made in separate, subsequent forming and joining processes. Using the concept of an intrinsic hybrid, the shaping of the part and the joining of the different materials are performed in a single process step, shortening the overall processing time and thereby reducing manufacturing costs. The investigated hybrid part is made from continuous fibre-reinforced plastic (FRP), into which a metallic reinforcement structure is integrated. The connection between these layered components is realized by a combination of adhesive bonding and a geometrical form fit. The form-fit elements are intrinsically generated during the forming process. This contribution covers the development of the forming process and the design of the forming tool for single-step production of a hybrid part. To this end a forming tool is developed which combines the thermoforming and metal forming processes. The main challenge in designing the tool is the temperature management of the tool elements for the variothermal forming process. The process parameters are determined in basic tests and finite element (FE) simulation studies. On the basis of these investigations, a control concept for steering the motion axes and the tool temperature is developed. Forming tests are carried out with the developed tool and the manufactured parts are analysed by computed tomography (CT) scans.

  12. Monitoring hemodynamics and oxygenation of the kidney in rats by a combined near-infrared spectroscopy and invasive probe approach

    Science.gov (United States)

    Grosenick, Dirk; Cantow, Kathleen; Arakelyan, Karen; Wabnitz, Heidrun; Flemming, Bert; Skalweit, Angela; Ladwig, Mechthild; Macdonald, Rainer; Niendorf, Thoralf; Seeliger, Erdmann

    2015-07-01

    We have developed a hybrid approach to investigate the dynamics of perfusion and oxygenation in the kidney of rats under pathophysiologically relevant conditions. Our approach combines near-infrared spectroscopy to quantify hemoglobin concentration and oxygen saturation in the renal cortex, and an invasive probe method for measuring total renal blood flow by an ultrasonic probe, perfusion by laser-Doppler fluxmetry, and tissue oxygen tension via fluorescence quenching. Hemoglobin concentration and oxygen saturation were determined from experimental data by a Monte Carlo model. The hybrid approach was applied to investigate and compare temporal changes during several types of interventions such as arterial and venous occlusions, as well as hyperoxia, hypoxia and hypercapnia induced by different mixtures of the inspired gas. The approach was also applied to study the effects of the x-ray contrast medium iodixanol on the kidney.

  13. Classification and optimization of training tools for NPP simulator

    International Nuclear Information System (INIS)

    Billoen, G. van

    1994-01-01

    The training cycle of nuclear power plant (NPP) operators has evolved during the last decade in parallel with the evolution of the training tools. The phases of the training cycle can be summarized as follows: (1) basic principle learning, (2) specific functional training, (3) full operating range training, and (4) detailed accident analyses. The progress in simulation technology and man/machine interface (MMI) design gives training centers new opportunities to improve their training methods and their effectiveness in the transfer of knowledge. To take advantage of these new opportunities, a significant investment in simulation tools may be required. It is therefore important to propose an optimized approach when dealing with the overall equipment program for these training centers. An overall look at the tools offered on the international simulation market shows that there is a need for a systematic approach in this field. Classification of the different training tools needed for each phase of the training cycle is the basis for an optimized approach in terms of the hardware configuration and software specifications of the equipment to install in training centers. The 'Multi-Function Simulator' is one such approach. (orig.) (3 tabs.)

  14. Orthology detection combining clustering and synteny for very large datasets.

    Directory of Open Access Journals (Sweden)

    Marcus Lechner

    Full Text Available The elucidation of orthology relationships is an important step both in gene function prediction and towards understanding patterns of sequence evolution. For large datasets, orthology assignments are usually derived directly from sequence similarities because more exact approaches exhibit prohibitively high computational costs. Here we present PoFF, an extension for the standalone tool Proteinortho, which enhances orthology detection by combining clustering, sequence similarity, and synteny. In the course of this work, FFAdj-MCS, a heuristic that assesses pairwise gene order using adjacencies (a similarity measure related to the breakpoint distance), was adapted to support multiple linear chromosomes and extended to detect duplicated regions. PoFF largely reduces the number of false positives and enables more fine-grained predictions than purely similarity-based approaches. The extension maintains the low memory requirements and the efficient concurrency options of its basis Proteinortho, making the software applicable to very large datasets.

  15. Pathology economic model tool: a novel approach to workflow and budget cost analysis in an anatomic pathology laboratory.

    Science.gov (United States)

    Muirhead, David; Aoun, Patricia; Powell, Michael; Juncker, Flemming; Mollerup, Jens

    2010-08-01

    The need for higher efficiency, maximum quality, and faster turnaround time is a continuous focus for anatomic pathology laboratories and drives changes in work scheduling, instrumentation, and management control systems. To determine the costs of generating routine, special, and immunohistochemical microscopic slides in a large, academic anatomic pathology laboratory using a top-down approach. The Pathology Economic Model Tool was used to analyze workflow processes at The Nebraska Medical Center's anatomic pathology laboratory. Data from the analysis were used to generate complete cost estimates, which included not only materials, consumables, and instrumentation but also specific labor and overhead components for each of the laboratory's subareas. The cost data generated by the Pathology Economic Model Tool were compared with the cost estimates generated using relative value units. Despite the use of automated systems for different processes, the workflow in the laboratory was found to be relatively labor intensive. The effect of labor and overhead on per-slide costs was significantly underestimated by traditional relative-value unit calculations when compared with the Pathology Economic Model Tool. Specific workflow defects with significant contributions to the cost per slide were identified. The cost of providing routine, special, and immunohistochemical slides may be significantly underestimated by traditional methods that rely on relative value units. Furthermore, a comprehensive analysis may identify specific workflow processes requiring improvement.
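    The top-down point of the study is that area-level labor and overhead must be folded into the per-slide figure, rather than counting only consumables as relative-value-unit estimates tend to do. A deliberately simple sketch of that accounting (function name and example figures are mine, not from the paper):

```python
def cost_per_slide(labor, overhead, consumables, instrumentation, n_slides):
    """Top-down per-slide cost for one laboratory subarea: allocate the
    subarea's full labor, overhead, consumable and instrumentation spend
    across the slides it produced, rather than pricing materials alone."""
    return (labor + overhead + consumables + instrumentation) / n_slides
```

Comparing this figure against a consumables-only estimate for the same subarea exposes how much labor and overhead are understated by the simpler method.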

  16. Advanced control approach for hybrid systems based on solid oxide fuel cells

    International Nuclear Information System (INIS)

    Ferrari, Mario L.

    2015-01-01

    Highlights: • Advanced new control system for SOFC based hybrid plants. • Proportional–Integral approach with feed-forward technology. • Good control of fuel cell temperature. • All critical properties maintained inside safe conditions. - Abstract: This paper presents a new advanced control approach for the operation of hybrid systems equipped with solid oxide fuel cell technology. This new tool, which combines feed-forward and standard proportional–integral techniques, controls the system during load changes, avoiding failures and stress conditions detrimental to component life. This approach was selected to combine simplicity and good control performance. Moreover, the new approach presented in this paper eliminates the need for mass flow rate meters and other expensive probes usually required for a commercial plant. Compared to previous works, better performance is achieved in controlling fuel cell temperature (maximum gradient significantly lower than 3 K/min), reducing the pressure gap between cathode and anode sides (at least a 30% decrease during transient operations), and generating a higher safety margin (at least a 10% increase) for the Steam-to-Carbon Ratio. This new control system was developed and optimized using a hybrid system transient model implemented, validated and tested in previous works. The plant, comprising the coupling of a tubular solid oxide fuel cell stack with a microturbine, is equipped with a bypass valve able to connect the compressor outlet with the turbine inlet duct for rotational speed control. Following model development and tuning activities, several operating conditions were considered to show the increased performance of the new control system compared to previous tools (the same hybrid system model was used with the new control approach). Special attention was devoted to electrical load steps and ramps, considering significant changes in ambient conditions.
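    The combined feed-forward plus proportional-integral structure can be sketched generically: the feed-forward term anticipates the new load, and the PI term trims whatever error remains. This is a minimal illustration of the control structure only; the class name, gains, and the unit-gain plant in the test are assumptions of mine, not the paper's tuned controller:

```python
class PIFeedForward:
    """Discrete PI controller augmented with a feed-forward term."""

    def __init__(self, kp, ki, feedforward, dt=1.0):
        self.kp, self.ki, self.ff, self.dt = kp, ki, feedforward, dt
        self.integral = 0.0

    def step(self, setpoint, measurement):
        err = setpoint - measurement
        self.integral += err * self.dt
        # feed-forward anticipates the load change; PI corrects the residual
        return self.ff(setpoint) + self.kp * err + self.ki * self.integral
```

Because the feed-forward map carries most of the actuation, the PI gains can stay small, which is one way such schemes keep transients (and hence quantities like the cell temperature gradient) gentle.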

  17. Combining empirical and theory-based land-use modelling approaches to assess economic potential of biofuel production avoiding iLUC: Argentina as a case study

    NARCIS (Netherlands)

    Diogo, V.; van der Hilst, F.; van Eijck, J.; Verstegen, J.A.; Hilbert, J.; Carballo, S.; Volante, J.; Faaij, A.

    2014-01-01

    In this paper, a land-use modelling framework is presented combining empirical and theory-based modelling approaches to determine economic potential of biofuel production avoiding indirect land-use changes (iLUC) resulting from land competition with other functions. The empirical approach explores

  18. Combining transcranial magnetic stimulation and functional imaging in cognitive brain research: possibilities and limitations.

    Science.gov (United States)

    Sack, Alexander T; Linden, David E J

    2003-09-01

    Transcranial magnetic stimulation (TMS) is a widely used tool for the non-invasive study of basic neurophysiological processes and the relationship between brain and behavior. We review the physical and physiological background of TMS and discuss the large body of perceptual and cognitive studies, mainly in the visual domain, that have been performed with TMS in the past 15 years. We compare TMS with other neurophysiological and neuropsychological research tools and propose that TMS, compared with the classical neuropsychological lesion studies, can make its own unique contribution. As the main focus of this review, we describe the different approaches of combining TMS with functional neuroimaging techniques. We also discuss important shortcomings of TMS, especially the limited knowledge concerning its physiological effects, which often make the interpretation of TMS results ambiguous. We conclude with a critical analysis of the resulting conceptual and methodological limitations that the investigation of functional brain-behavior relationships still has to face. We argue that while some of the methodological limitations of TMS applied alone can be overcome by combination with functional neuroimaging, others will persist until its physical and physiological effects can be controlled.

  19. Modeling Tools for Drilling, Reservoir Navigation, and Formation Evaluation

    Directory of Open Access Journals (Sweden)

    Sushant Dutta

    2012-06-01

    Full Text Available The oil and gas industry routinely uses borehole tools for measuring or logging rock and fluid properties of geologic formations to locate hydrocarbons and maximize their production. Pore fluids in formations of interest are usually hydrocarbons or water. Resistivity logging is based on the fact that oil and gas have a substantially higher resistivity than water. The first resistivity log was acquired in 1927, and resistivity logging is still the foremost measurement used for drilling and evaluation. However, the acquisition and interpretation of resistivity logging data has grown in complexity over the years. Resistivity logging tools operate in a wide range of frequencies (from DC to GHz) and encounter extremely high (several orders of magnitude) conductivity contrasts between the metal mandrel of the tool and the geologic formation. Typical challenges include arbitrary angles of tool inclination, full tensor electric and magnetic field measurements, and interpretation of complicated anisotropic formation properties. These challenges combine to form some of the most intractable computational electromagnetic problems in the world. Reliable, fast, and convenient numerical modeling of logging tool responses is critical for tool design, sensor optimization, virtual prototyping, and log data inversion. This spectrum of applications necessitates both depth and breadth of modeling software, from blazing fast one-dimensional (1-D) modeling codes to advanced three-dimensional (3-D) modeling software, and from in-house developed codes to commercial modeling packages. In this paper, with the help of several examples, we demonstrate our approach of using different modeling software to address different drilling and evaluation applications. In one example, fast 1-D modeling provides proactive geosteering information from a deep-reading azimuthal propagation resistivity measurement. In the second example, a 3-D model with multiple vertical resistive fractures

  20. Additive manufacturing of tools for lapping glass

    Science.gov (United States)

    Williams, Wesley B.

    2013-09-01

    Additive manufacturing technologies have the ability to directly produce parts with complex geometries without the need for secondary processes, tooling or fixtures. This ability was used to produce concave lapping tools with a VFlash 3D printer from 3D Systems. The lapping tools were first designed in Creo Parametric with a defined constant radius and radial groove pattern. The models were converted to stereolithography files which the VFlash used in building the parts, layer by layer, from a UV curable resin. The tools were rotated at 60 rpm and used with 120 grit and 220 grit silicon carbide lapping paste to lap 0.750" diameter fused silica workpieces. The samples developed a matte appearance on the lapped surface that started as a ring at the edge of the workpiece and expanded to the center. This indicated that as material was removed, the workpiece radius was beginning to match the tool radius. The workpieces were then cleaned and lapped on a second tool (with equivalent geometry) using a 3000 grit corundum aluminum oxide lapping paste, until a near specular surface was achieved. By using lapping tools that have been additively manufactured, fused silica workpieces can be lapped to approach a specified convex geometry. This approach may enable more rapid lapping of near net shape workpieces that minimize the material removal required by subsequent polishing. This research may also enable development of new lapping tool geometry and groove patterns for improved loose abrasive finishing.