WorldWideScience

Sample records for intelligent debris analysis

  1. Forewarning of Debris flows using Intelligent Geophones

    Science.gov (United States)

    PK, I.; Ramesh, M. V.

    2017-12-01

    Landslides are among the most catastrophic disasters, causing significant damage to human life and civil structures. Heavy rainfall on landslide-prone areas can trigger dangerous debris flows, in which material such as mud, sand, soil, rock, water, and air moves down the mountain at high velocity. This sudden slope instability can lead to loss of human life and infrastructure. To our knowledge, the minute factors that lead to the initiation of a landslide have not yet been identified. In this work, we aim to study the landslide phenomenon in depth, using the landslide laboratory set up at our university. This unique mechanical simulator of landslide initiation can generate rainfall, seepage, etc., in the laboratory setup. Using this setup, we aim to study several landslide initiation scenarios generated by varying different parameters. The complete setup will be equipped with heterogeneous sensors such as a rain gauge, moisture sensor, pore-pressure sensor, strain gauges, tiltmeter, inclinometer, extensometer, and geophones. Our work focuses on the signals received from the intelligent geophone system for identifying underground vibrations during a debris flow. Using the large volume of signals derived from the laboratory setup, we have performed detailed signal processing and data analysis to determine the forewarning signals captured by these heterogeneous sensors. Detailed study of these heterogeneous signals has provided insights for forewarning the community based on the signals generated during the laboratory tests. In this work we describe the design, development, methodology, results, inferences, and suggestions for the next steps toward detection and forewarning. The response of the intelligent geophone sensors at the time of failure, the failure style, and the subsequent debris flow for heterogeneous soil layers were studied, thus helping in the development of forewarning
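The record does not state the detection algorithm used on the geophone traces. A common baseline for picking the onset of ground-vibration events of this kind is a short-term-average/long-term-average (STA/LTA) trigger; the Python sketch below applies one to a synthetic trace. The sampling rate, window lengths, and trigger threshold are assumptions for illustration, not values from the study.

```python
import numpy as np

def sta_lta(signal, fs, sta_win=0.5, lta_win=10.0):
    """Classic STA/LTA ratio for onset detection in a 1-D vibration trace.

    signal : 1-D array of geophone samples
    fs     : sampling rate in Hz
    sta_win, lta_win : short/long averaging windows in seconds (illustrative values)
    """
    energy = signal.astype(float) ** 2
    n_sta = max(1, int(sta_win * fs))
    n_lta = max(1, int(lta_win * fs))
    sta = np.convolve(energy, np.ones(n_sta) / n_sta, mode="same")
    lta = np.convolve(energy, np.ones(n_lta) / n_lta, mode="same") + 1e-12  # avoid divide-by-zero
    return sta / lta

if __name__ == "__main__":
    fs = 250.0                                    # assumed sampling rate (Hz)
    t = np.arange(0.0, 60.0, 1.0 / fs)
    trace = 0.02 * np.random.randn(t.size)        # background microtremor noise
    onset = t > 40.0                              # synthetic debris-flow onset at t = 40 s
    trace[onset] += 0.5 * np.sin(2 * np.pi * 15.0 * t[onset]) * np.exp(-(t[onset] - 40.0))

    ratio = sta_lta(trace, fs)
    threshold = 5.0                               # trigger threshold is an assumption
    if ratio.max() > threshold:
        trigger_idx = np.argmax(ratio > threshold)
        print(f"forewarning trigger at t = {t[trigger_idx]:.1f} s")
    else:
        print("no trigger")
```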

  2. TMI-2 core debris analysis

    International Nuclear Information System (INIS)

    Cook, B.A.; Carlson, E.R.

    1985-01-01

    One of the ongoing examination tasks for the damaged TMI-2 reactor is analysis of samples of debris obtained from the debris bed presently at the top of the core. This paper summarizes the results reported in the TMI-2 Core Debris Grab Sample Examination and Analysis Report, which will be available early in 1986. The sampling and analysis procedures are presented, and information is provided on the key results as they relate to the present core condition, peak temperatures during the transient, temperature history, chemical interactions, and core relocation. The results are then summarized

  3. Professionalizing Intelligence Analysis

    Directory of Open Access Journals (Sweden)

    James B. Bruce

    2015-09-01

    This article examines the current state of professionalism in national security intelligence analysis in the U.S. Government. Since the introduction of major intelligence reforms directed by the Intelligence Reform and Terrorism Prevention Act (IRTPA) in December 2004, we have seen notable strides in many aspects of intelligence professionalization, including in analysis. But progress is halting, uneven, and by no means permanent. To consolidate its gains, and if it is to continue improving, the U.S. intelligence community (IC) should commit itself to a new program of further professionalization of analysis to ensure that it develops an analytic cadre fully prepared to deal with the complexities of the emerging multipolar and highly dynamic world that the IC itself is forecasting. Some recent reforms in intelligence analysis can be assessed against established standards of more fully developed professions; these may well fall short of moving the IC closer to the more fully professionalized analytical capability required to produce the kind of analysis the United States needs now.

  4. Intelligent audio analysis

    CERN Document Server

    Schuller, Björn W

    2013-01-01

    This book provides the reader with the knowledge necessary for comprehension of the field of Intelligent Audio Analysis. It first introduces standard methods and discusses the typical Intelligent Audio Analysis chain, going from audio data to audio features to audio recognition. Introductions to audio source separation, enhancement, and robustness follow. After the introductory parts, the book presents several applications for the three types of audio: speech, music, and general sound. Each task is briefly introduced, followed by a description of the specific data and methods applied, experiments and results, and a conclusion for that task. The book provides benchmark results and standardized test-beds for a broad range of audio analysis tasks. The main focus lies on the parallel advancement of realism in audio analysis, as today's results are too often overly optimistic owing to idealized testing conditions, and it serves to stimulate synergies arising from transfer of ...

  5. Intelligence analysis – the royal discipline of Competitive Intelligence

    OpenAIRE

    František Bartes

    2011-01-01

    The aim of this article is to propose a work methodology for Competitive Intelligence teams in one specific area of the intelligence cycle, the so-called "Intelligence Analysis". Intelligence Analysis is one of the stages of the Intelligence Cycle in which data from both primary and secondary research are analyzed. The main result of the effort is the creation of added value for the information collected. Company Competitive Intelligence, correctly understood and implemented in busines...

  6. Space Debris Removal: A Game Theoretic Analysis

    Directory of Open Access Journals (Sweden)

    Richard Klima

    2016-08-01

    We analyse active space debris removal efforts from a strategic, game-theoretical perspective. Space debris consists of non-manoeuvrable, human-made objects orbiting Earth, which pose a significant threat to operational spacecraft. Active debris removal missions have been considered and investigated by different space agencies with the goal of protecting valuable assets present in strategic orbital environments. An active debris removal mission is costly, but has a positive effect for all satellites in the same orbital band. This leads to a dilemma: each agency faces a choice between the individually costly action of debris removal, which has a positive impact on all players, or waiting and hoping that others jump in and do the 'dirty' work. The risk of the latter is that, if everyone waits, the joint outcome will be catastrophic, leading to what in game theory is referred to as the 'tragedy of the commons'. We introduce and thoroughly analyse this dilemma using empirical game theory and a space debris simulator. We consider two- and three-player settings, investigate the strategic properties and equilibria of the game, and find that the cost/benefit ratio of debris removal strongly affects the game dynamics.
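As a complement to the abstract's description, the sketch below casts the removal dilemma as a minimal two-player normal-form game and enumerates its pure-strategy Nash equilibria. The payoff numbers, the shared 'collapse' penalty when nobody removes debris, and the cost/benefit ratios scanned are invented for illustration and are not taken from the paper or its simulator.

```python
import itertools

def payoffs(actions, b=4.0, c=3.0, collapse_penalty=10.0):
    """Two agencies each choose REMOVE (1, paying cost c) or WAIT (0).
    Every removal benefits everyone by b; if nobody removes, all suffer a shared loss.
    All numbers are illustrative placeholders."""
    n_removers = sum(actions)
    out = []
    for a in actions:
        p = n_removers * b - (c if a else 0.0)
        if n_removers == 0:               # nobody removes: shared 'tragedy' loss
            p -= collapse_penalty
        out.append(p)
    return tuple(out)

def pure_nash(b, c):
    """Enumerate pure-strategy Nash equilibria of the 2-player game."""
    equilibria = []
    for profile in itertools.product([0, 1], repeat=2):
        stable = True
        for i in (0, 1):
            alt = list(profile)
            alt[i] = 1 - profile[i]       # unilateral deviation of player i
            if payoffs(tuple(alt), b, c)[i] > payoffs(profile, b, c)[i]:
                stable = False
        if stable:
            equilibria.append(profile)
    return equilibria

for ratio in (0.5, 1.5):                  # cost/benefit ratio drives the dynamics
    b = 4.0
    print(f"c/b = {ratio}: equilibria (1 = remove) -> {pure_nash(b, ratio * b)}")
```

With a low cost/benefit ratio both agencies remove debris in equilibrium; with a high ratio only the free-riding equilibria remain, which is the tragedy-of-the-commons structure the paper analyses.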

  7. Alternative fuels in fire debris analysis: biodiesel basics.

    Science.gov (United States)

    Stauffer, Eric; Byron, Doug

    2007-03-01

    Alternative fuels are becoming more prominent on the market today, and fire debris analysts will soon start seeing them in liquid samples or in fire debris samples. Biodiesel is one of the most common alternative fuels and is now readily available in many parts of the United States and around the world. This article introduces biodiesel to fire debris analysts. Biodiesel fuel is manufactured from vegetable oils and/or animal oils/fats. It is composed of fatty acid methyl esters (FAMEs) and is sold pure or as a blend with diesel fuel. When biodiesel may be present in fire debris samples, it is recommended to extract the debris using passive headspace concentration on activated charcoal, possibly followed by a solvent extraction. The gas chromatographic analysis of the extract is first carried out with the same program as for regular ignitable liquid residues, and second with a program adapted to the analysis of FAMEs.

  8. ASTM standards for fire debris analysis: a review.

    Science.gov (United States)

    Stauffer, Eric; Lentini, John J

    2003-03-12

    The American Society for Testing and Materials (ASTM) recently updated its standards E 1387 and E 1618 for the analysis of fire debris. The changes in the classification of ignitable liquids are presented in this review. Furthermore, a new standard on extraction of fire debris with solid phase microextraction (SPME) was released. Advantages and drawbacks of this technique are presented and discussed. Also, the standard on cleanup by acid stripping has not been reapproved. Fire debris analysts that use the standards should be aware of these changes.

  9. Bremsstrahlung converter debris shields: test and analysis

    International Nuclear Information System (INIS)

    Reedy, E.D. Jr.; Perry, F.C.

    1983-10-01

    Electron beam accelerators are commonly used to create bremsstrahlung x-rays for effects testing. Typically, the incident electron beam strikes a sandwich of three materials: (1) a conversion foil, (2) an electron scavenger, and (3) a debris shield. Several laboratories, including Sandia National Laboratories, are developing bremsstrahlung x-ray sources with much larger test areas (approx. 200 to 500 cm²) than ever used before. Accordingly, the debris shield will be much larger than before and subject to loads which could cause shield failure. To prepare for this eventuality, a series of tests were run on the Naval Surface Weapons Center's Casino electron beam accelerator (approx. 1 MeV electrons, 100 ns FWHM pulse, 45 kJ beam energy). The primary goal of these tests was to measure the stress pulse which loads a debris shield. These measurements were made with carbon gages mounted on the back of the converter sandwich. At an electron beam fluence of about 1 kJ/cm², the measured peak compressive stress was typically in the 1 to 2 kbar range. Measured peak compressive stress scaled in a roughly linear manner with fluence level as the fluence level was increased to 10 kJ/cm². The duration of the compressive pulse was on the order of microseconds. In addition to the stress wave measurements, a limited number of tests were made to investigate the type of damage generated in several potential shield materials

  10. Economic analysis requirements in support of orbital debris regulatory policy

    Science.gov (United States)

    Greenberg, Joel S.

    1996-10-01

    As the number of Earth-orbiting objects increases, so does the potential for generating orbital debris, with a consequent increase in the likelihood of impacting and damaging operating satellites. Various debris remediation approaches are being considered that encompass both in-orbit and return-to-Earth schemes and have varying degrees of operations, cost, international competitiveness, and safety implications. Because of the diversity of issues, concerns and long-term impacts, there is a clear need for setting government policies that will lead to an orderly abatement of potential orbital debris hazards. These policies may require the establishment of a supportive regulatory regime. The Department of Transportation is likely to have regulatory responsibilities relating to orbital debris stemming from its charge to protect public health and safety, the safety of property, and the national security and foreign policy interests of the United States. This paper describes DOT's potential regulatory role relating to orbital debris remediation, the myriad issues concerning the need for establishing government policies relating to orbital debris remediation and their regulatory implications, and the proposed technological solutions and their economic and safety implications. Particular emphasis is placed upon cost-effectiveness and economic analyses as they relate to economic impact analysis in support of regulatory impact analysis.

  11. Intelligence analysis – the royal discipline of Competitive Intelligence

    Directory of Open Access Journals (Sweden)

    František Bartes

    2011-01-01

    The aim of this article is to propose a work methodology for Competitive Intelligence teams in one specific area of the intelligence cycle, the so-called "Intelligence Analysis". Intelligence Analysis is one of the stages of the Intelligence Cycle in which data from both primary and secondary research are analyzed. The main result of the effort is the creation of added value for the information collected. Company Competitive Intelligence, correctly understood and implemented in business practice, is the "forecasting of the future": forecasting that forms the basis for strategic decisions made by the company's top management. To implement that requirement in corporate practice, the author perceives Competitive Intelligence as a systemic application discipline. This approach allows him to propose a "Work Plan" for Competitive Intelligence as a fundamental standardized document to steer Competitive Intelligence team activities. The author divides the Competitive Intelligence team work plan into five basic parts. Those parts are derived from the five-stage model of the intelligence cycle, which, in the author's opinion, is more appropriate for complicated cases of Competitive Intelligence.

  12. Sensitivity Analysis of Launch Vehicle Debris Risk Model

    Science.gov (United States)

    Gee, Ken; Lawrence, Scott L.

    2010-01-01

    As part of an analysis of the loss-of-crew risk associated with an ascent abort system for a manned launch vehicle, a model was developed to predict the risk of debris from an explosion of the launch vehicle impacting the crew module. The model consisted of a debris catalog describing the number, size and imparted velocity of each piece of debris, a method to compute the trajectories of the debris, and a method to calculate the impact risk given the abort trajectory of the crew module. The model provided a point estimate of the strike probability as a function of the debris catalog, the time of abort, and the delay time between the abort and destruction of the launch vehicle. A study was conducted to determine the sensitivity of the strike probability to the various model input parameters and to develop a response surface model for use in the sensitivity analysis of the overall ascent abort risk model. The results of the sensitivity analysis and the response surface model are presented in this paper.
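The NASA model itself is not reproduced here; the sketch below is a toy Monte Carlo estimate of strike probability, in which debris fragments are thrown with random directions and speeds and a strike is recorded when any fragment's lateral miss distance at the crew module's separation falls below the module radius. Fragment counts, speeds, separations, and the module radius are all assumed placeholder values.

```python
import numpy as np

rng = np.random.default_rng(0)

def strike_probability(n_fragments, v_mean, separation, module_radius, n_trials=10000):
    """Toy Monte Carlo strike-probability estimate (not the NASA debris risk model).

    Each trial throws n_fragments with isotropic directions and Rayleigh-distributed
    speeds around v_mean; a 'strike' is any fragment whose lateral miss distance at
    the module's along-track separation is below module_radius."""
    hits = 0
    for _ in range(n_trials):
        speeds = rng.rayleigh(scale=v_mean, size=n_fragments)
        vec = rng.normal(size=(n_fragments, 3))
        vec /= np.linalg.norm(vec, axis=1, keepdims=True)     # random unit directions
        vx = speeds * vec[:, 0]
        forward = vx > 1.0                                    # fragments moving toward the module
        t = separation / vx[forward]                          # time to cover the separation
        lateral = np.hypot(speeds[forward] * vec[forward, 1] * t,
                           speeds[forward] * vec[forward, 2] * t)
        if np.any(lateral < module_radius):
            hits += 1
    return hits / n_trials

# Crude sensitivity scan over the separation distance (a proxy for abort delay time).
for d in (200.0, 500.0, 1000.0):                              # metres, assumed values
    p = strike_probability(n_fragments=50, v_mean=80.0, separation=d, module_radius=3.0)
    print(f"separation {d:6.0f} m -> strike probability ~ {p:.4f}")
```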

  13. Computational Intelligence in Intelligent Data Analysis

    CERN Document Server

    Nürnberger, Andreas

    2013-01-01

    Complex systems and their phenomena are ubiquitous: they can be found in biology, finance, the humanities, management sciences, medicine, physics and similar fields. For many problems in these fields, there are no conventional ways to solve them mathematically or analytically at low cost. On the other hand, nature has already solved many optimization problems efficiently. Computational intelligence attempts to mimic nature-inspired problem-solving strategies and methods. These strategies can be used to study, model and analyze complex systems so that it becomes feasible to handle them. Key areas of computational intelligence are artificial neural networks, evolutionary computation and fuzzy systems. As one of only a few researchers in the field, Rudolf Kruse has contributed in many important ways to the understanding, modeling and application of computational intelligence methods. On the occasion of his 60th birthday, a collection of original papers by leading researchers in the field of computational intell...

  14. Intelligence analysis in corporate security

    Directory of Open Access Journals (Sweden)

    Manojlović Dragan

    2014-01-01

    The findings presented in this survey indicate the importance, for protecting a corporation and its internal and external interests, of quality data for intelligence analysis and of corporate security. Furthermore, the results indicate that applying not only the practical but also the scientific knowledge of intelligence analysis provides an epistemologically oriented critique of the traditional techniques used in corporate security for the analysis of challenges, risks and threats. On the question of whether competitive intelligence activity can and should be understood only as a form of corporate espionage, or as a new concept in the theory and practice of corporate security that involves a range of different methods, techniques and expedient activities to be implemented integrally and continuously within corporate security, the work offers multiple answers. The privatization of intelligence activities, an irreversible process that engulfed the western hemisphere decades ago, was accepted in Europe in the first decade of the third millennium, in the sense that corporations at national and multinational levels use systematic intelligence analysis not only for their own security but also for competitive purposes, and no less for company growth and profits. It has become a resource that helps managers in corporations make timely and appropriate decisions. The research has shown that intelligence analysis in corporate security is a factor that draws on the diversity of people and gives corporations an advantage not only in time, but much more in the market and in products.

  15. Reliability analysis in intelligent machines

    Science.gov (United States)

    Mcinroy, John E.; Saridis, George N.

    1990-01-01

    Given an explicit task to be executed, an intelligent machine must be able to find the probability of success, or reliability, of alternative control and sensing strategies. By using concepts from information theory and reliability theory, new techniques are proposed for finding the reliability corresponding to alternative subsets of control and sensing strategies, such that a desired set of specifications can be satisfied. The analysis is straightforward, provided that a set of Gaussian random state variables is available. An example problem illustrates the technique, and general reliability results are presented for visual servoing with a computed torque-control algorithm. Moreover, the example illustrates the principle of increasing precision with decreasing intelligence at the execution level of an intelligent machine.
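A minimal sketch of the reliability idea, assuming the task error for each strategy is a Gaussian random variable and "success" means staying within a tolerance; the standard deviations and tolerance below are invented, and the strategies are hypothetical stand-ins for the paper's control/sensing alternatives.

```python
from scipy.stats import norm

def reliability(sigma, tolerance, bias=0.0):
    """Probability that a Gaussian error with the given bias and standard deviation
    falls inside +/- tolerance, used here as the probability of task success."""
    return norm.cdf((tolerance - bias) / sigma) - norm.cdf((-tolerance - bias) / sigma)

# Compare two hypothetical strategies: coarse/fast vs. fine/slow visual servoing.
strategies = {
    "coarse visual servoing": 2.0,   # mm, assumed 1-sigma position error
    "fine visual servoing":   0.5,
}
tolerance = 1.5                       # mm, assumed task specification
for name, sigma in strategies.items():
    print(f"{name}: reliability = {reliability(sigma, tolerance):.3f}")
```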

  16. Sampling and Analysis Plan for K Basins Debris

    International Nuclear Information System (INIS)

    WESTCOTT, J.L.

    2000-01-01

    This Sampling and Analysis Plan presents the rationale and strategy for sampling and analysis activities to support removal of debris from the K-East and K-West Basins located in the 100K Area at the Hanford Site. This project is focused on characterization to support waste designation for disposal of waste at the Environmental Restoration Disposal Facility (ERDF). This material has previously been dispositioned at the Hanford Low-Level Burial Grounds or Central Waste Complex. The structures that house the basins are classified as radioactive material areas. Therefore, all materials removed from the buildings are presumed to be radioactively contaminated. Because most of the materials that will be addressed under this plan will be removed from the basins, and because of the cost associated with screening materials for release, it is anticipated that all debris will be managed as low-level waste. Materials will be surveyed, however, to estimate radionuclide content for disposal and to determine that the debris is not contaminated with levels of transuranic radionuclides that would designate the debris as transuranic waste

  17. COMPETITIVE INTELLIGENCE ANALYSIS - SCENARIOS METHOD

    Directory of Open Access Journals (Sweden)

    Ivan Valeriu

    2014-07-01

    Keeping a company among the top-performing players in the relevant market depends not only on its ability to develop continually, sustainably and in a balanced way, to the standards set by the customer and the competition, but also on its ability to protect its strategic information and to know the strategic information of the competition in advance. In addition, given that economic markets, regardless of their profile, enable interconnection not only among domestic companies but also between domestic and foreign companies, the issue of economic competition moves from national economies to the field of interest of regional and international economic organizations. The stake for each economic player is to keep ahead of the competition and to be always prepared to face market challenges. Therefore, a company needs to know as early as possible how to react to others' strategies in terms of research, production and sales. If a competitor is planning to produce more and more cheaply, then the company must be prepared to counteract this move quickly. Competitive intelligence helps to evaluate the capabilities of competitors in the market, legally and ethically, and to develop response strategies. One of the main goals of competitive intelligence is early warning and the prevention of surprises that could have a major impact on a company's market share, reputation, turnover and profitability in the medium and long term. This paper presents some aspects of competitive intelligence, mainly in terms of information analysis and intelligence generation. The presentation is theoretical and addresses a structured method of information analysis - the scenarios method - in a version that combines several types of analysis in order to reveal interconnections among the factors governing the activity of a company.

  18. Analysis of accelerants and fire debris using aroma detection technology

    Energy Technology Data Exchange (ETDEWEB)

    Barshick, S.A.

    1997-01-17

    The purpose of this work was to investigate the utility of electronic aroma detection technologies for the detection and identification of accelerant residues in suspected arson debris. Through the analysis of known accelerant residues, a trained neural network was developed for classifying suspected arson samples. Three unknown fire debris samples were classified using this neural network. The item corresponding to diesel fuel was correctly identified every time. For the other two items, wide variations in sample concentration and excessive water content, producing high sample humidities, were shown to influence the sensor response. Sorbent sampling prior to aroma detection was demonstrated to reduce these problems and to allow proper neural network classification of the remaining items corresponding to kerosene and gasoline.
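The trained network and sensor data from the report are not available; the sketch below shows the classification step with scikit-learn's MLPClassifier on synthetic 8-channel "electronic nose" response vectors. The accelerant classes, channel count, and noise level are assumptions for illustration.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
classes = ["gasoline", "kerosene", "diesel"]

# Synthetic 8-channel aroma-sensor responses: one noisy prototype pattern per accelerant.
prototypes = rng.uniform(0.2, 1.0, size=(len(classes), 8))
X = np.vstack([p + 0.05 * rng.standard_normal((40, 8)) for p in prototypes])
y = np.repeat(classes, 40)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0, stratify=y)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X_tr, y_tr)

print("held-out accuracy:", clf.score(X_te, y_te))
print("prediction for one unknown debris extract:", clf.predict(X_te[:1])[0])
```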

  19. Global analysis of anthropogenic debris ingestion by sea turtles.

    Science.gov (United States)

    Schuyler, Qamar; Hardesty, Britta Denise; Wilcox, Chris; Townsend, Kathy

    2014-02-01

    Ingestion of marine debris can have lethal and sublethal effects on sea turtles and other wildlife. Although researchers have reported on ingestion of anthropogenic debris by marine turtles and implied incidences of debris ingestion have increased over time, there has not been a global synthesis of the phenomenon since 1985. Thus, we analyzed 37 studies published from 1985 to 2012 that report on data collected from before 1900 through 2011. Specifically, we investigated whether ingestion prevalence has changed over time, what types of debris are most commonly ingested, the geographic distribution of debris ingestion by marine turtles relative to global debris distribution, and which species and life-history stages are most likely to ingest debris. The probability of green (Chelonia mydas) and leatherback turtles (Dermochelys coriacea) ingesting debris increased significantly over time, and plastic was the most commonly ingested debris. Turtles in nearly all regions studied ingest debris, but the probability of ingestion was not related to modeled debris densities. Furthermore, smaller, oceanic-stage turtles were more likely to ingest debris than coastal foragers, whereas carnivorous species were less likely to ingest debris than herbivores or gelatinovores. Our results indicate oceanic leatherback turtles and green turtles are at the greatest risk of both lethal and sublethal effects from ingested marine debris. To reduce this risk, anthropogenic debris must be managed at a global level. © 2013 The Authors. Conservation Biology published by Wiley Periodicals, Inc., on behalf of the Society for Conservation Biology.
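The study's statistical model is not reproduced here; a minimal logistic-regression sketch on synthetic necropsy records (year of stranding vs. whether debris was found) illustrates the kind of trend test behind the reported increase in ingestion probability over time. The years, sample size, and underlying trend are invented.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)

# Synthetic necropsy records: year of stranding and whether debris was found (0/1).
years = rng.integers(1985, 2013, size=300)
p_true = 1.0 / (1.0 + np.exp(-0.08 * (years - 1998)))   # invented upward trend
ingested = (rng.random(300) < p_true).astype(int)

model = LogisticRegression()
model.fit(years.reshape(-1, 1), ingested)

for y in (1985, 2000, 2012):
    p = model.predict_proba([[y]])[0, 1]
    print(f"estimated ingestion probability in {y}: {p:.2f}")
print("per-year log-odds slope:", model.coef_[0][0])
```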

  20. Analysis Of The 2009 July Impact Debris In Jupiter'S Atmosphere

    Science.gov (United States)

    Sanchez-Lavega, Agustin; Hueso, R.; Legarreta, J.; Pérez-Hoyos, S.; García-Melendo, E.; Gómez, J. M.; Rojas, J. F.; Orton, G. S.; Wesley, A.; IOPW International Outer Planet Watch Team

    2009-09-01

    We report the analysis of images obtained by the contributors to the International Outer Planet Watch (IOPW) of the debris left in the atmosphere of Jupiter by the object that impacted the planet between 18 and 19 July 2009. The discovery images by Anthony Wesley in July 19.625 and the first two days of its tracking, shows a dark debris spot (continuum wavelength) located at planetocentric latitude -55.1 deg and 304.5 deg System III longitude. The imaging survey indicates that the spot was not present in July 18.375, so the impact occurred during a window between both dates. The main spot had a size of about 4,500 km and to its Northwest a thin debris halo of similar size was initially observed. Methane band images at a wavelength of 890 nm shows the spot to be bright indicating that the debris aerosols are highly placed in the atmosphere relative to surrounding clouds. At the central latitude of the impact, the Jovian flow has nearly zero speed but anticyclonic vorticity bounded by jets at -51.5 deg (directed westward with velocity -10 m/s) and at -57.5 deg (directed eastward with velocity 25 m/s). The morphology in the continuum and the spot brightness in the methane band strongly suggest that the feature was caused by a cometary or asteroidal impact, similar in behaviour to the SL9 impacts of 1994. This work has been funded by Spanish MEC AYA2006-07735 with FEDER support and Grupos Gobierno Vasco IT-464-07. RH acknowledges a "Ramón y Cajal” contract from MEC.

  1. Analysis of the Herschel DEBRIS Sun-like star sample

    Science.gov (United States)

    Sibthorpe, B.; Kennedy, G. M.; Wyatt, M. C.; Lestrade, J.-F.; Greaves, J. S.; Matthews, B. C.; Duchêne, G.

    2018-04-01

    This paper presents a study of circumstellar debris around Sun-like stars using data from the Herschel DEBRIS Key Programme. DEBRIS is an unbiased survey comprising the nearest ~90 stars of each spectral type A-M. Analysis of the 275 F-K stars shows that excess emission from a debris disc was detected around 47 stars, giving a detection rate of 17.1^{+2.6}_{-2.3} per cent, with lower rates for later spectral types. For each target a blackbody spectrum was fitted to the dust emission to determine its fractional luminosity and temperature. The derived underlying distribution of fractional luminosity versus blackbody radius in the population showed that most detected discs are concentrated at f ~ 10^-5 and at temperatures corresponding to blackbody radii 7-40 au, which scales to ~40 au for realistic dust properties (similar to the current Kuiper belt). Two outlying populations are also evident; five stars have exceptionally bright emission (f > 5 × 10^-5), and one has unusually hot dust <4 au. The excess emission distributions at all wavelengths were fitted with a steady-state evolution model, showing that these are compatible with all stars being born with a narrow belt that then undergoes collisional grinding. However, the model cannot explain the hot dust systems - likely originating in transient events - and bright emission systems - arising potentially from atypically massive discs or recent stirring. The emission from the present-day Kuiper belt is predicted to be close to the median of the population, suggesting that half of stars have either depleted their Kuiper belts (similar to the Solar system) or had a lower planetesimal formation efficiency.
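The survey's fitting procedure involves stellar photosphere subtraction and flux calibration that are glossed over here; the sketch below simply fits a scaled blackbody to made-up far-infrared photometry with scipy.optimize.curve_fit to recover a dust temperature, and converts it to a blackbody radius for an assumed 1 L_sun star.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.constants import h, c, k

def scaled_blackbody(wavelength_um, scale, T):
    """Scaled Planck spectrum B_nu(T) evaluated at the given wavelengths (micron)."""
    nu = c / (wavelength_um * 1e-6)
    return scale * 2.0 * h * nu**3 / c**2 / np.expm1(h * nu / (k * T))

# Invented Herschel-like photometry of a disc excess (wavelength in micron, flux in mJy).
wavelengths = np.array([70.0, 100.0, 160.0])
fluxes = np.array([45.0, 60.0, 48.0])
errors = np.array([4.0, 5.0, 6.0])

# Initial guesses chosen near the expected scale of the problem.
popt, pcov = curve_fit(scaled_blackbody, wavelengths, fluxes, p0=[2e15, 50.0],
                       sigma=errors, absolute_sigma=True)
scale, T_dust = popt
print(f"best-fit dust temperature ~ {T_dust:.0f} K")

# Blackbody radius for a star of 1 L_sun (278.3 K is the blackbody temperature at 1 au from the Sun).
r_bb_au = (278.3 / T_dust) ** 2
print(f"blackbody radius for a 1 L_sun star ~ {r_bb_au:.1f} au")
```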

  2. Approaches to Enhance Sensemaking for Intelligence Analysis

    National Research Council Canada - National Science Library

    McBeth, Michael

    2002-01-01

    ..., and to apply persuasion skills to interact more productively with others. Each approach is explained from a sensemaking perspective and linked to Richard Heuer's Psychology of Intelligence Analysis...

  3. Integrating Oil Debris and Vibration Measurements for Intelligent Machine Health Monitoring. Degree awarded by Toledo Univ., May 2002

    Science.gov (United States)

    Dempsey, Paula J.

    2003-01-01

    A diagnostic tool for detecting damage to gears was developed. Two different measurement technologies, oil debris analysis and vibration were integrated into a health monitoring system for detecting surface fatigue pitting damage on gears. This integrated system showed improved detection and decision-making capabilities as compared to using individual measurement technologies. This diagnostic tool was developed and evaluated experimentally by collecting vibration and oil debris data from fatigue tests performed in the NASA Glenn Spur Gear Fatigue Rig. An oil debris sensor and the two vibration algorithms were adapted as the diagnostic tools. An inductance type oil debris sensor was selected for the oil analysis measurement technology. Gear damage data for this type of sensor was limited to data collected in the NASA Glenn test rigs. For this reason, this analysis included development of a parameter for detecting gear pitting damage using this type of sensor. The vibration data was used to calculate two previously available gear vibration diagnostic algorithms. The two vibration algorithms were selected based on their maturity and published success in detecting damage to gears. Oil debris and vibration features were then developed using fuzzy logic analysis techniques, then input into a multi sensor data fusion process. Results show combining the vibration and oil debris measurement technologies improves the detection of pitting damage on spur gears. As a result of this research, this new diagnostic tool has significantly improved detection of gear damage in the NASA Glenn Spur Gear Fatigue Rigs. This research also resulted in several other findings that will improve the development of future health monitoring systems. Oil debris analysis was found to be more reliable than vibration analysis for detecting pitting fatigue failure of gears and is capable of indicating damage progression. Also, some vibration algorithms are as sensitive to operational effects as they
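The thesis's specific membership functions and fusion rule are not reproduced; the sketch below maps an oil-debris mass reading and two common gear-vibration metrics (FM4 and NA4, used here as stand-ins for the two unnamed algorithms) to [0, 1] damage memberships with piecewise-linear ramps and fuses them with a weighted sum. All thresholds, weights, and decision levels are assumptions for illustration.

```python
import numpy as np

def ramp_membership(x, low, high):
    """Piecewise-linear membership: 0 below `low`, 1 above `high` (illustrative breakpoints)."""
    return float(np.clip((x - low) / (high - low), 0.0, 1.0))

def gear_damage_level(oil_debris_mg, fm4, na4, w_oil=0.5, w_vib=0.5):
    """Fuse accumulated oil-debris mass with two vibration metrics into one damage
    indicator in [0, 1]. Thresholds and weights are assumptions, not the values
    used in the NASA study."""
    mu_oil = ramp_membership(oil_debris_mg, low=20.0, high=100.0)
    mu_fm4 = ramp_membership(fm4, low=3.5, high=7.0)
    mu_na4 = ramp_membership(na4, low=4.0, high=8.0)
    mu_vib = max(mu_fm4, mu_na4)          # either vibration metric can flag damage
    return w_oil * mu_oil + w_vib * mu_vib

for debris, fm4, na4 in [(5.0, 3.0, 3.5), (60.0, 4.5, 3.8), (140.0, 8.0, 9.0)]:
    level = gear_damage_level(debris, fm4, na4)
    status = "shutdown" if level > 0.8 else "inspect" if level > 0.4 else "healthy"
    print(f"debris={debris:5.1f} mg, FM4={fm4}, NA4={na4} -> damage={level:.2f} ({status})")
```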

  4. Summary of Disposable Debris Shields (DDS) Analysis for Development of Solid Debris Collection at NIF

    International Nuclear Information System (INIS)

    Shaughnessy, D.A.; Moody, K.J.; Grant, P.M.; Lewis, L.A.; Hutcheon, I.D.; Lindvall, R.; Gostic, J.M.

    2011-01-01

    Collection of solid debris from the National Ignition Facility (NIF) is being developed both as a diagnostic tool and as a means for measuring nuclear reaction cross sections relevant to the Stockpile Stewardship Program and nuclear astrophysics. The concept is straightforward; following a NIF shot, the debris that is produced as a result of the capsule and hohlraum explosion would be collected and subsequently extracted from the chamber. The number of nuclear activations that occurred in the capsule would then be measured through a combination of radiation detection and radiochemical processing followed by mass spectrometry. Development of the catcher is challenging due to the complex environment of the NIF target chamber. The collector surface is first exposed to a large photon flux, followed by the debris wind that is produced. The material used in the catcher must be mechanically strong in order to withstand the large amount of energy it is exposed to, as well as be chemically compatible with the form and composition of the debris. In addition, the location of the catcher is equally important. If it is positioned too close to the center of the target chamber, it will be significantly ablated, which could interfere with the ability of the debris to reach the surface and stick. If it is too far away, the fraction of the debris cloud collected will be too small to result in a statistically significant measurement. Material, geometric configuration, and location must all be tested in order to design the optimal debris collection system for NIF. One of the first ideas regarding solid debris collection at NIF was to use the disposable debris shields (DDS), which are fielded over the final optics assemblies (FOA) 7 m away from the center of the target chamber. The DDS are meant to be replaced after a certain number of shots, and if the shields could be subsequently analyzed after removal, it would serve as a mechanism for fielding a relatively large collection area

  5. Micrometeoroid and Orbital Debris (MMOD) Shield Ballistic Limit Analysis Program

    Science.gov (United States)

    Ryan, Shannon

    2013-01-01

    This software implements penetration limit equations for common micrometeoroid and orbital debris (MMOD) shield configurations, windows, and thermal protection systems. Allowable MMOD risk is formulated in terms of the probability of no penetration (PNP) of the spacecraft pressure hull. For calculating the risk, spacecraft geometry models, mission profiles, debris environment models, and penetration limit equations for installed shielding configurations are required. Risk assessment software such as NASA's BUMPERII is used to calculate mission PNP; however, such software is unsuitable for use in shield design and preliminary analysis studies. This software defines a single equation for the design and performance evaluation of common MMOD shielding configurations, windows, and thermal protection systems, along with a description of their validity range and guidelines for their application. Recommendations are based on preliminary reviews of fundamental assumptions and on accuracy in predicting experimental impact test results. The software is programmed in Visual Basic for Applications for installation as a simple add-in for Microsoft Excel. The user is directed to a graphical user interface (GUI) that requires user inputs and provides solutions directly in Microsoft Excel workbooks.
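The validated equations live in the Excel add-in and the NASA MMOD handbook; the sketch below evaluates one commonly quoted form of the Whipple-shield ballistic limit (critical projectile diameter in the hypervelocity regime). Treat the coefficient, exponents, and the example inputs as illustrative only and verify against the handbook before any real use.

```python
import math

def whipple_critical_diameter(t_wall_cm, rho_proj, rho_bumper, v_kms,
                              theta_deg, standoff_cm, sigma_ksi=70.0):
    """Critical (just-no-perforation) projectile diameter in cm for a Whipple shield,
    in the form commonly quoted for the hypervelocity regime (normal-component
    velocity >= ~7 km/s). Coefficient and exponents are shown as commonly published
    and should be treated as illustrative."""
    v_normal = v_kms * math.cos(math.radians(theta_deg))
    return (3.918 * t_wall_cm ** (2.0 / 3.0)
            * rho_proj ** (-1.0 / 3.0) * rho_bumper ** (-1.0 / 9.0)
            * v_normal ** (-2.0 / 3.0)
            * standoff_cm ** (1.0 / 3.0)
            * (sigma_ksi / 70.0) ** (1.0 / 3.0))

# Example: aluminium bumper and rear wall, 10 cm standoff, 7 km/s normal impact (assumed values).
d_crit = whipple_critical_diameter(t_wall_cm=0.2, rho_proj=2.7, rho_bumper=2.7,
                                   v_kms=7.0, theta_deg=0.0, standoff_cm=10.0)
print(f"critical debris diameter ~ {d_crit:.2f} cm")
```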

  6. Treatment technology analysis for mixed waste containers and debris

    International Nuclear Information System (INIS)

    Gehrke, R.J.; Brown, C.H.; Langton, C.A.; Askew, N.M.; Kan, T.; Schwinkendorf, W.E.

    1994-03-01

    A team was assembled to identify technology needs and develop strategies for treatment of mixed waste debris and empty containers in the Department of Energy (DOE) complex, and to determine the advantages and disadvantages of applying the Debris and Empty Container Rules to these wastes. These rules, issued by the Environmental Protection Agency (EPA), apply only to the hazardous component of mixed debris. Hazardous debris that is subject to regulation under the Atomic Energy Act because of its radioactivity (i.e., mixed debris) is also subject to the debris treatment standards. The issue of treating debris per the Resource Conservation and Recovery Act (RCRA) at the same time as, or in conjunction with, decontamination of the radioactive component was also addressed. Resolution of this issue requires that DOE Headquarters develop policy on de minimis concentrations of radioactivity and on release of material to Subtitle D landfills or into the commercial sector. Since alternative treatment technologies (for the hazardous component) are Best Demonstrated Available Technology (BDAT), the task team recommends that: (1) funding should focus on demonstration, testing, and evaluation of BDAT on mixed debris, (2) funding should also consider verification of alternative treatments for the decontamination of radioactive debris, and (3) DOE should establish criteria for the recycle/reuse or disposal of treated and decontaminated mixed debris as municipal waste

  7. Requirement analysis for autonomous systems and intelligent ...

    African Journals Online (AJOL)

    user

    Danish Power System and a requirement analysis for the use of intelligent agents and ..... tries to make an optimal islanding plan at this state and tries to blackstart. ... 4 Foundation for Physical Intelligent Agents (FIPA): http://www.fipa.org ...

  8. Space Debris Attitude Simulation - IOTA (In-Orbit Tumbling Analysis)

    Science.gov (United States)

    Kanzler, R.; Schildknecht, T.; Lips, T.; Fritsche, B.; Silha, J.; Krag, H.

    Today, there is little knowledge on the attitude state of decommissioned intact objects in Earth orbit. Observational means have advanced in the past years, but are still limited with respect to an accurate estimate of motion vector orientations and magnitude. Especially for the preparation of Active Debris Removal (ADR) missions as planned by ESA's Clean Space initiative or contingency scenarios for ESA spacecraft like ENVISAT, such knowledge is needed. The In-Orbit Tumbling Analysis tool (IOTA) is a prototype software, currently in development within the framework of ESA's “Debris Attitude Motion Measurements and Modelling” project (ESA Contract No. 40000112447), which is led by the Astronomical Institute of the University of Bern (AIUB). The project goal is to achieve a good understanding of the attitude evolution and the considerable internal and external effects which occur. To characterize the attitude state of selected targets in LEO and GTO, multiple observation methods are combined. Optical observations are carried out by AIUB, Satellite Laser Ranging (SLR) is performed by the Space Research Institute of the Austrian Academy of Sciences (IWF) and radar measurements and signal level determination are provided by the Fraunhofer Institute for High Frequency Physics and Radar Techniques (FHR). Developed by Hyperschall Technologie Göttingen GmbH (HTG), IOTA will be a highly modular software tool to perform short- (days), medium- (months) and long-term (years) propagation of the orbit and attitude motion (six degrees-of-freedom) of spacecraft in Earth orbit. The simulation takes into account all relevant acting forces and torques, including aerodynamic drag, solar radiation pressure, gravitational influences of Earth, Sun and Moon, eddy current damping, impulse and momentum transfer from space debris or micro meteoroid impact, as well as the optional definition of particular spacecraft specific influences like tank sloshing, reaction wheel behaviour

  9. Analysis of a space debris laser removal system

    Science.gov (United States)

    Gjesvold, Evan; Straub, Jeremy

    2017-05-01

    As long as man ventures into space, he will leave behind debris, and as long as he ventures into space, this debris will pose a threat to him and his projects. Space debris must be located and decommissioned. Lasers may prove to be the ideal method, as they can operate at a distance from the debris, have a theoretically infinite supply of energy from the sun, and are a seemingly readily available technology. This paper explores the requirements and reasoning for such a laser debris removal method. A case is made for the negligibility of eliminating rotational velocity from certain systems, while a design schematic is also presented for the implementation of a cube satellite proof of concept.

  10. Techniques for Intelligence Analysis of Networks

    National Research Council Canada - National Science Library

    Cares, Jeffrey R

    2005-01-01

    ...) there are significant intelligence analysis manifestations of these properties; and (4) a more satisfying theory of Networked Competition than currently exists for NCW/NCO is emerging from this research...

  11. Content Analysis for Proactive Protective Intelligence

    Energy Technology Data Exchange (ETDEWEB)

    Sanfilippo, Antonio P.

    2010-12-15

    The aim of this paper is to outline a plan for developing and validating a Proactive Protective Intelligence approach that prevents targeted violence through the analysis and assessment of threats overtly or covertly expressed in abnormal communications to USSS protectees.

  12. Technology Combination Analysis Tool (TCAT) for Active Debris Removal

    Science.gov (United States)

    Chamot, B.; Richard, M.; Salmon, T.; Pisseloup, A.; Cougnet, C.; Axthelm, R.; Saunder, C.; Dupont, C.; Lequette, L.

    2013-08-01

    This paper presents the work of the Swiss Space Center EPFL within the CNES-funded OTV-2 study. In order to find the most performant Active Debris Removal (ADR) mission architectures and technologies, a tool was developed to design and compare ADR spacecraft and to plan ADR campaigns to remove large debris. Two types of architectures are considered to be efficient: the Chaser (single-debris spacecraft) and the Mothership/Kits (multiple-debris spacecraft). Both are able to perform controlled re-entry. The tool includes modules to optimise the launch dates and the order of capture, to design missions and spacecraft, and to select launch vehicles. The propulsion, power and structure subsystems are sized by the tool using high-level parametric models, whilst the other subsystems are defined by their mass and power consumption. Final results are still under investigation by the consortium, but two concrete examples of the tool's outputs are presented in the paper.
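The tool's parametric sizing models are not given in the abstract; as a minimal stand-in for the propulsion-sizing step, the sketch below applies the Tsiolkovsky rocket equation to estimate propellant for an assumed controlled re-entry burn of a chaser plus captured debris. The masses, delta-v, and specific impulses are invented.

```python
import math

def propellant_mass(dry_mass_kg, delta_v_ms, isp_s, g0=9.80665):
    """Tsiolkovsky rocket equation: propellant needed to give `dry_mass_kg` a
    velocity change of `delta_v_ms` with a thruster of specific impulse `isp_s`."""
    ve = isp_s * g0
    return dry_mass_kg * (math.exp(delta_v_ms / ve) - 1.0)

# Illustrative chaser mission: a 500 kg chaser grabs a 2000 kg debris object and
# performs a 150 m/s de-orbit burn (all numbers are assumptions).
stack_dry = 500.0 + 2000.0
for isp, label in [(220.0, "monopropellant"), (320.0, "bipropellant")]:
    mp = propellant_mass(stack_dry, 150.0, isp)
    print(f"{label:15s}: ~{mp:.0f} kg propellant for the de-orbit burn")
```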

  13. Interactive analysis of geodata based intelligence

    Science.gov (United States)

    Wagner, Boris; Eck, Ralf; Unmüessig, Gabriel; Peinsipp-Byma, Elisabeth

    2016-05-01

    When a spatiotemporal event happens, multi-source intelligence data are gathered to understand the problem, and strategies for solving the problem are investigated. The difficulties arising from handling spatial and temporal intelligence data represent the main problem. The map can be the bridge that visualizes the data and provides the most understandable model for all stakeholders. For the analysis of geodata-based intelligence data, software was developed as a working environment that combines geodata with optimized ergonomics. Interaction with the common operational picture (COP) is thereby substantially facilitated. The composition of the COP is based on geodata services, which are normalized by international standards of the Open Geospatial Consortium (OGC). The basic geodata are combined with intelligence data from images (IMINT) and humans (HUMINT), stored in a NATO Coalition Shared Data server (CSD). These intelligence data can be combined with further information sources, i.e., live sensors. As a result, a COP is generated and an interaction suitable for the specific workspace is added. This allows users to work interactively with the COP, i.e., searching with an on-board CSD client for suitable intelligence data and integrating them into the COP. Furthermore, users can enrich the scenario with findings from the data of interactive live sensors and add data from other sources. This allows intelligence services to contribute effectively to the process by which military and disaster management operations are organized.

  14. Mapping coastal marine debris using aerial imagery and spatial analysis.

    Science.gov (United States)

    Moy, Kirsten; Neilson, Brian; Chung, Anne; Meadows, Amber; Castrence, Miguel; Ambagis, Stephen; Davidson, Kristine

    2017-12-19

    This study is the first to systematically quantify, categorize, and map marine macro-debris across the main Hawaiian Islands (MHI), including remote areas (e.g., Niihau, Kahoolawe, and northern Molokai). Aerial surveys were conducted over each island to collect high resolution photos, which were processed into orthorectified imagery and visually analyzed in GIS. The technique provided precise measurements of the quantity, location, type, and size of macro-debris (>0.05 m²), identifying 20,658 total debris items. Northeastern (windward) shorelines had the highest density of debris. Plastics, including nets, lines, buoys, floats, and foam, comprised 83% of the total count. In addition, the study located six vessels from the 2011 Tōhoku tsunami. These results created a baseline of the location, distribution, and composition of marine macro-debris across the MHI. Resource managers and communities may target high priority areas, particularly along remote coastlines where macro-debris counts were largely undocumented. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Global Analysis of Anthropogenic Debris Ingestion by Sea Turtles

    Science.gov (United States)

    Schuyler, Qamar; Hardesty, Britta Denise; Wilcox, Chris; Townsend, Kathy

    2014-01-01

    Ingestion of marine debris can have lethal and sublethal effects on sea turtles and other wildlife. Although researchers have reported on ingestion of anthropogenic debris by marine turtles and implied incidences of debris ingestion have increased over time, there has not been a global synthesis of the phenomenon since 1985. Thus, we analyzed 37 studies published from 1985 to 2012 that report on data collected from before 1900 through 2011. Specifically, we investigated whether ingestion prevalence has changed over time, what types of debris are most commonly ingested, the geographic distribution of debris ingestion by marine turtles relative to global debris distribution, and which species and life-history stages are most likely to ingest debris. The probability of green (Chelonia mydas) and leatherback turtles (Dermochelys coriacea) ingesting debris increased significantly over time, and plastic was the most commonly ingested debris. Turtles in nearly all regions studied ingest debris, but the probability of ingestion was not related to modeled debris densities. Furthermore, smaller, oceanic-stage turtles were more likely to ingest debris than coastal foragers, whereas carnivorous species were less likely to ingest debris than herbivores or gelatinovores. Our results indicate oceanic leatherback turtles and green turtles are at the greatest risk of both lethal and sublethal effects from ingested marine debris. To reduce this risk, anthropogenic debris must be managed at a global level.

  16. Supercritical kinetic analysis in simplified system of fuel debris using integral kinetic model

    International Nuclear Information System (INIS)

    Tuya, Delgersaikhan; Obara, Toru

    2016-01-01

    Highlights: • Kinetic analysis of a simplified, weakly coupled fuel debris system was performed. • The integral kinetic model was used to simulate criticality accidents. • The fission power and energy released during the simulated accident were obtained. • The coupling between debris regions and its effect on the fission power was obtained. Abstract: Preliminary prompt-supercritical kinetic analyses of a simplified coupled system of fuel debris, designed to roughly resemble the melted core of a nuclear reactor, were performed using an integral kinetic model. The integral kinetic model, which can describe the region- and time-dependent fission rate in a coupled system of arbitrary geometry, was used because the fuel debris system is weakly coupled in terms of neutronics. The results revealed some important characteristics of coupled systems, such as the coupling between debris regions and the effect of that coupling on the fission rate and released energy in each debris region during the simulated criticality accident. In brief, this study showed that the integral kinetic model can be applied to supercritical kinetic analysis of fuel debris systems and that it can be a useful tool for investigating the effect of coupling on the consequences of a supercriticality accident.
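The integral kinetic model itself is not reproduced here. As a much simpler illustration of the kind of excursion such an analysis quantifies, the sketch below integrates the standard one-region point-kinetics equations with one delayed-neutron group through a prompt-supercritical reactivity insertion; the kinetic parameters and inserted reactivity are assumed values.

```python
# One-region point-kinetics sketch of a prompt-supercritical excursion.
# This is NOT the integral kinetic model of the paper (which resolves region- and
# time-dependent fission rates in a weakly coupled debris system); it only shows
# the prompt power rise that such an analysis quantifies.
beta, lam, Lambda = 0.0065, 0.08, 1e-4     # delayed fraction, decay const (1/s), generation time (s)
rho = 1.5 * beta                            # assumed prompt-supercritical reactivity insertion

dt, t_end = 1e-6, 0.05                      # explicit Euler time step and simulated interval (s)
n = 1.0                                     # relative fission power
C = beta * n / (lam * Lambda)               # delayed-neutron precursors at initial equilibrium
for _ in range(int(t_end / dt)):
    dn = ((rho - beta) / Lambda * n + lam * C) * dt
    dC = (beta / Lambda * n - lam * C) * dt
    n, C = n + dn, C + dC

print(f"relative fission power after {t_end * 1000:.0f} ms: {n:.2e}")
print(f"prompt period ~ {Lambda / (rho - beta) * 1000:.1f} ms")
```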

  17. Epistasis analysis using artificial intelligence.

    Science.gov (United States)

    Moore, Jason H; Hill, Doug P

    2015-01-01

    Here we introduce artificial intelligence (AI) methodology for detecting and characterizing epistasis in genetic association studies. The ultimate goal of our AI strategy is to analyze genome-wide genetics data as a human would using sources of expert knowledge as a guide. The methodology presented here is based on computational evolution, which is a type of genetic programming. The ability to generate interesting solutions while at the same time learning how to solve the problem at hand distinguishes computational evolution from other genetic programming approaches. We provide a general overview of this approach and then present a few examples of its application to real data.

  18. Risk Analysis Reveals Global Hotspots for Marine Debris Ingestion by Sea Turtles

    Science.gov (United States)

    Schuyler, Q. A.; Wilcox, C.; Townsend, K.; Wedemeyer-Strombel, K.; Balazs, G.; van Sebille, E.; Hardesty, B. D.

    2016-02-01

    Plastic marine debris pollution is rapidly becoming one of the critical environmental concerns facing wildlife in the 21st century. Here we present a risk analysis for plastic ingestion by sea turtles on a global scale. We combined global marine plastic distributions based on ocean drifter data with sea turtle habitat maps to predict exposure levels to plastic pollution. Empirical data from necropsies of deceased animals were then utilised to assess the consequence of exposure to plastics. We modelled the risk (probability of debris ingestion) by incorporating exposure to debris and consequence of exposure, and included life history stage, species of sea turtle, and date of stranding observation as possible additional explanatory factors. Life history stage is the best predictor of debris ingestion, but the best-fit model also incorporates encounter rates within a limited distance from stranding location, marine debris predictions specific to the date of the stranding study, and turtle species. There was no difference in ingestion rates between stranded turtles vs. those caught as bycatch from fishing activity, suggesting that stranded animals are not a biased representation of debris ingestion rates in the background population. Oceanic life-stage sea turtles are at the highest risk of debris ingestion, and olive ridley turtles are the most at-risk species. The regions of highest risk to global sea turtle populations are off of the east coasts of the USA, Australia, and South Africa; the east Indian Ocean, and Southeast Asia. Model results can be used to predict the number of sea turtles globally at risk of debris ingestion. Based on currently available data, initial calculations indicate that up to 52% of sea turtles may have ingested debris.

  19. Risk analysis reveals global hotspots for marine debris ingestion by sea turtles.

    Science.gov (United States)

    Schuyler, Qamar A; Wilcox, Chris; Townsend, Kathy A; Wedemeyer-Strombel, Kathryn R; Balazs, George; van Sebille, Erik; Hardesty, Britta Denise

    2016-02-01

    Plastic marine debris pollution is rapidly becoming one of the critical environmental concerns facing wildlife in the 21st century. Here we present a risk analysis for plastic ingestion by sea turtles on a global scale. We combined global marine plastic distributions based on ocean drifter data with sea turtle habitat maps to predict exposure levels to plastic pollution. Empirical data from necropsies of deceased animals were then utilised to assess the consequence of exposure to plastics. We modelled the risk (probability of debris ingestion) by incorporating exposure to debris and consequence of exposure, and included life history stage, species of sea turtle and date of stranding observation as possible additional explanatory factors. Life history stage is the best predictor of debris ingestion, but the best-fit model also incorporates encounter rates within a limited distance from stranding location, marine debris predictions specific to the date of the stranding study and turtle species. There is no difference in ingestion rates between stranded turtles vs. those caught as bycatch from fishing activity, suggesting that stranded animals are not a biased representation of debris ingestion rates in the background population. Oceanic life-stage sea turtles are at the highest risk of debris ingestion, and olive ridley turtles are the most at-risk species. The regions of highest risk to global sea turtle populations are off of the east coasts of the USA, Australia and South Africa; the east Indian Ocean, and Southeast Asia. Model results can be used to predict the number of sea turtles globally at risk of debris ingestion. Based on currently available data, initial calculations indicate that up to 52% of sea turtles may have ingested debris. © 2015 John Wiley & Sons Ltd.

  20. Event tree analysis using artificial intelligence techniques

    International Nuclear Information System (INIS)

    Dixon, B.W.; Hinton, M.F.

    1985-01-01

    Artificial Intelligence (AI) techniques used in Expert Systems and Object Oriented Programming are discussed as they apply to Event Tree Analysis. A SeQUence IMPortance calculator, SQUIMP, is presented to demonstrate the implementation of these techniques. Benefits of using AI methods include ease of programming, efficiency of execution, and flexibility of application. The importance of an appropriate user interface is stressed.
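SQUIMP itself is not described in enough detail to reproduce; the plain-Python sketch below shows the underlying event-tree computation: enumerating branch sequences, summing the probabilities of those that reach the undesired end state, and computing a crude importance measure per top event. The event names, probabilities, and damage logic are illustrative.

```python
from itertools import product

# Illustrative top events and failure probabilities (not from SQUIMP).
TOP_EVENTS = [("offsite power", 1e-2), ("diesel generators", 5e-2), ("aux feedwater", 1e-3)]

def sequence_probability(outcomes, events):
    """Probability of one branch sequence; True means the top event failed."""
    p = 1.0
    for (_, pf), failed in zip(events, outcomes):
        p *= pf if failed else 1.0 - pf
    return p

def leads_to_core_damage(outcomes):
    power_lost = outcomes[0] and outcomes[1]      # both AC power sources fail
    return power_lost or outcomes[2]               # or decay-heat removal fails

def damage_probability(events):
    return sum(sequence_probability(o, events)
               for o in product([False, True], repeat=len(events))
               if leads_to_core_damage(o))

base = damage_probability(TOP_EVENTS)
print(f"core-damage probability per demand ~ {base:.2e}")

# Crude importance measure: how much the result drops if one event is made perfect.
for i, (name, _) in enumerate(TOP_EVENTS):
    perfect = [(n, 0.0) if j == i else (n, pf) for j, (n, pf) in enumerate(TOP_EVENTS)]
    print(f"  importance of {name:17s}: {base - damage_probability(perfect):.2e}")
```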

  1. Research of Classical and Intelligent Information System Solutions for Criminal Intelligence Analysis

    OpenAIRE

    Šimović, Vladimir

    2001-01-01

    The objective of this study is to present research on classical and intelligent information system solutions used in criminal intelligence analysis in Croatian security system theory. The study analyses objective and classical methods of information science, including artificial intelligence and other scientific methods. The intelligence and classical software solutions researched, proposed, and presented in this study were used in developing the integrated information system for the Croatian...

  2. Constructing an Intelligent Patent Network Analysis Method

    Directory of Open Access Journals (Sweden)

    Chao-Chan Wu

    2012-11-01

    Patent network analysis, an advanced method of patent analysis, is a useful tool for technology management. This method visually displays all the relationships among the patents and enables analysts to intuitively comprehend the overview of a set of patents in the field of the technology being studied. Although patent network analysis possesses relative advantages over traditional methods of patent analysis, it is subject to several crucial limitations. To overcome the drawbacks of the current method, this study proposes a novel patent analysis method, called the intelligent patent network analysis method, to make a visual network with great precision. Based on artificial intelligence techniques, the proposed method provides an automated procedure for searching patent documents, extracting patent keywords, and determining the weight of each patent keyword in order to generate a sophisticated visualization of the patent network. This study proposes a detailed procedure for generating an intelligent patent network that is helpful for improving the efficiency and quality of patent analysis. Furthermore, patents in the field of Carbon Nanotube Backlight Unit (CNT-BLU) were analyzed to verify the utility of the proposed method.
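The paper's AI-based keyword extraction and weighting are not reproduced; the sketch below only shows the network-construction step, linking toy patent records that share keywords and weighting edges by the number of shared keywords using networkx. Patent IDs and keywords are invented.

```python
import itertools
import networkx as nx

# Toy patent records: id -> extracted keywords. In the paper this extraction and
# weighting are automated with AI techniques; here the keywords are given directly.
patents = {
    "US-001": {"carbon nanotube", "backlight", "field emission"},
    "US-002": {"carbon nanotube", "cathode", "field emission"},
    "US-003": {"backlight", "liquid crystal", "brightness"},
    "US-004": {"carbon nanotube", "backlight", "brightness"},
}

G = nx.Graph()
G.add_nodes_from(patents)
for (a, kw_a), (b, kw_b) in itertools.combinations(patents.items(), 2):
    shared = kw_a & kw_b
    if shared:                                   # connect patents that share keywords
        G.add_edge(a, b, weight=len(shared), keywords=sorted(shared))

# Weighted degree as a crude centrality: candidate 'core' patents of the field.
for node, deg in sorted(G.degree(weight="weight"), key=lambda x: -x[1]):
    print(node, "weighted degree:", deg)
print("strongest link:", max(G.edges(data=True), key=lambda e: e[2]["weight"]))
```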

  3. Sensitivity analysis for maximum heat removal from debris in the lower head

    International Nuclear Information System (INIS)

    Kim, Yong Hoon; Suh, Kune Y.

    2000-01-01

    Sensitivity analyses were performed to determine the maximum heat removal capability from the debris and the reactor pressure vessel (RPV) wall through the gap that may be formed during a core melt relocation accident. Cases studied included four different nuclear power plants (TMI-2, KORI-2, YGN 3&4 and KNGR) according to their thermal power output. Results of the analysis show that heat removal through gap cooling, relative to flooding, is efficacious for up to about 40% of the core material accumulated in the lower plenum in the case of the TMI-2 reactor. In excess of 40%, however, gap cooling alone was found not to be enough for heat removal from the core debris. Despite uncertainties about the assumptions made in the present study, the analyses yield consistent results. If different cooling effects are considered, heat removal may be greatly enhanced. The LAVA experiments were performed at the Korea Atomic Energy Research Institute (KAERI) using Al₂O₃/Fe thermite melt relocating down to the scaled vessel of a reactor lower head filled with preheated water. Test results indicated a cooling effect of water ingression through the debris-to-vessel gap and the intra-debris pores and crevices. If the cooling capacity of the intra-debris pores and crevices is comparable to the debris-to-vessel heat removal capability, heat removal from the debris will be augmented well beyond that achieved by gap cooling alone. The three nuclear reactor (KORI-2, YGN 3&4 and KNGR) calculation results for heat removal through a debris-to-vessel gap size of about 1 mm were compared with the TMI-2 reactor calculation results for the case of gap cooling alone. (author)

  4. Estimating construction and demolition debris generation using a materials flow analysis approach.

    Science.gov (United States)

    Cochran, K M; Townsend, T G

    2010-11-01

    The magnitude and composition of a region's construction and demolition (C&D) debris should be understood when developing rules, policies and strategies for managing this segment of the solid waste stream. In the US, several national estimates have been conducted using a weight-per-construction-area approximation; national estimates using alternative procedures such as those used for other segments of the solid waste stream have not been reported for C&D debris. This paper presents an evaluation of a materials flow analysis (MFA) approach for estimating C&D debris generation and composition for a large region (the US). The consumption of construction materials in the US and typical waste factors used for construction materials purchasing were used to estimate the mass of solid waste generated as a result of construction activities. Debris from demolition activities was predicted from various historical construction materials consumption data and estimates of average service lives of the materials. The MFA approach estimated that approximately 610–780 × 10⁶ Mg of C&D debris was generated in 2002. This predicted mass exceeds previous estimates using other C&D debris predictive methodologies and reflects the large waste stream that exists. Copyright © 2010 Elsevier Ltd. All rights reserved.
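
    A worked toy example helps make the two terms of the materials flow balance concrete: construction debris from current consumption multiplied by a waste factor, plus demolition debris from the consumption recorded one service life earlier. All quantities below are invented placeholders, not the paper's data.

# Illustrative materials-flow sketch (all numbers are hypothetical):
# construction waste from current consumption and on-site waste factors,
# demolition waste from consumption one average service life ago.
consumption_2002 = {"concrete": 100.0, "wood": 20.0}       # 10^6 Mg
waste_factor     = {"concrete": 0.05,  "wood": 0.10}        # fraction wasted on site
service_life_yr  = {"concrete": 50,    "wood": 40}
historical_consumption = {("concrete", 1952): 30.0, ("wood", 1962): 8.0}  # 10^6 Mg

construction_debris = sum(consumption_2002[m] * waste_factor[m] for m in consumption_2002)
demolition_debris = sum(historical_consumption.get((m, 2002 - service_life_yr[m]), 0.0)
                        for m in service_life_yr)
print(construction_debris + demolition_debris, "x 10^6 Mg (illustrative total)")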

  5. Apical extrusion of debris in four different endodontic instrumentation systems: A meta-analysis.

    Science.gov (United States)

    Western, J Sylvia; Dicksit, Daniel Devaprakash

    2017-01-01

    All endodontic instrumentation systems tested so far promote apical extrusion of debris, which is one of the main causes of postoperative pain, flare-ups, and delayed healing. The aim of this meta-analysis was to collect and analyze in vitro studies quantifying apically extruded debris while using Hand ProTaper (manual), ProTaper Universal (rotary), Wave One (reciprocating), and self-adjusting file (SAF; vibratory) endodontic instrumentation systems and to determine which methods produced less apical extrusion of debris. An extensive electronic database search was done in PubMed, Scopus, Cochrane, LILACS, and Google Scholar from inception until February 2016 using the key terms "Apical Debris Extrusion, extruded material, and manual/rotary/reciprocating/SAF systems." A systematic search strategy was followed to extract 12 potential articles from a total of 1352 articles. The overall effect size was calculated from the raw mean difference of weight of apically extruded debris. Statistically significant differences were seen in the comparisons between the SAF and ProTaper systems. Apical extrusion of debris was invariably present in all the instrumentation systems analyzed. The SAF system seemed to be periapical tissue friendly as it caused reduced apical extrusion compared to rotary ProTaper and Wave One.

  6. Terrorism Risk Modeling for Intelligence Analysis and Infrastructure Protection

    National Research Council Canada - National Science Library

    Willis, Henry H; LaTourrette, Tom; Kelly, Terrence K; Hickey, Scot; Neill, Samuel

    2007-01-01

    ...? The Office of Intelligence and Analysis (OI&A) at DHS is responsible for using information and intelligence from multiple sources to identify and assess current and future threats to the United States...

  7. The Role of Intelligence Analysis in the War on Terrorism

    National Research Council Canada - National Science Library

    Ormond, Valerie

    2002-01-01

    The United States government must provide the intelligence community's analytical force with the necessary resources and capabilities in order to use intelligence analysis as an effective weapon in the War on Terrorism...

  8. Artificial Intelligence techniques for big data analysis

    OpenAIRE

    Aditya Khatri

    2017-01-01

    During my stay in Salamanca (Spain), I was fortunate enough to participate in the BISITE Research Group of the University of Salamanca. The University of Salamanca is the oldest university in Spain and in 2018 it celebrates its 8th centenary. As a computer science researcher, I participated in one of the many international projects that the research group has active, especially in big data analysis using Artificial Intelligence (AI) techniques. AI is one of BISITE's main lines of rese...

  9. Debris flow rheology: Experimental analysis of fine-grained slurries

    Science.gov (United States)

    Major, Jon J.; Pierson, Thomas C.

    1992-01-01

    The rheology of slurries consisting of ≤2-mm sediment from a natural debris flow deposit was measured using a wide-gap concentric-cylinder viscometer. The influence of sediment concentration and size and distribution of grains on the bulk rheological behavior of the slurries was evaluated at concentrations ranging from 0.44 to 0.66. The slurries exhibit diverse rheological behavior. At shear rates above 5 s⁻¹ the behavior approaches that of a Bingham material; below 5 s⁻¹, sand exerts more influence and slurry behavior deviates from the Bingham idealization. Sand grain interactions dominate the mechanical behavior when sand concentration exceeds 0.2; transient fluctuations in measured torque, time-dependent decay of torque, and hysteresis effects are observed. Grain rubbing, interlocking, and collision cause changes in packing density, particle distribution, grain orientation, and formation and destruction of grain clusters, which may explain the observed behavior. Yield strength and plastic viscosity exhibit order-of-magnitude variation when sediment concentration changes as little as 2–4%. Owing to these complexities, it is unlikely that debris flows can be characterized by a single rheological model.
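
    For reference, the Bingham idealization mentioned in the abstract relates shear stress to shear rate through the two parameters whose variation is reported:

\tau = \tau_y + \mu_p \, \dot{\gamma} \quad \text{for } \tau > \tau_y, \qquad \dot{\gamma} = 0 \quad \text{for } \tau \le \tau_y

    where \tau_y is the yield strength and \mu_p the plastic viscosity.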

  10. [Technologies for Complex Intelligent Clinical Data Analysis].

    Science.gov (United States)

    Baranov, A A; Namazova-Baranova, L S; Smirnov, I V; Devyatkin, D A; Shelmanov, A O; Vishneva, E A; Antonova, E V; Smirnov, V I

    2016-01-01

    The paper presents a system for intelligent analysis of clinical information. The authors describe methods implemented in the system for clinical information retrieval, intelligent diagnostics of chronic diseases, assessment of the importance of patient features, and detection of hidden dependencies between features. Results of the experimental evaluation of these methods are also presented. Healthcare facilities generate a large flow of both structured and unstructured data which contain important information about patients. Test results are usually retained as structured data, but some data are retained in the form of natural language texts (medical history, the results of physical examination, and the results of other examinations, such as ultrasound, ECG or X-ray studies). Many tasks arising in clinical practice can be automated by applying methods for intelligent analysis of the accumulated structured and unstructured data, leading to improvement of healthcare quality. The aim of the work was the creation of the complex system for intelligent data analysis in the multi-disciplinary pediatric center. The authors propose methods for information extraction from clinical texts in Russian. The methods are carried out on the basis of deep linguistic analysis. They retrieve terms of diseases, symptoms, areas of the body and drugs. The methods can recognize additional attributes such as "negation" (indicates that the disease is absent), "no patient" (indicates that the disease refers to the patient's family member, but not to the patient), "severity of illness", "disease course", and "body region to which the disease refers". The authors use a set of hand-drawn templates and various techniques based on machine learning to retrieve information using a medical thesaurus. The extracted information is used to solve the problem of automatic diagnosis of chronic diseases. A machine learning method for classification of patients with similar nosology and the method for determining the most informative patients' features are

  11. Debris flow run-out simulation and analysis using a dynamic model

    Science.gov (United States)

    Melo, Raquel; van Asch, Theo; Zêzere, José L.

    2018-02-01

    Only two months after a huge forest fire occurred in the upper part of a valley located in central Portugal, several debris flows were triggered by intense rainfall. The event caused infrastructural and economic damage, although no lives were lost. The present research aims to simulate the run-out of two debris flows that occurred during the event as well as to calculate via back-analysis the rheological parameters and the excess rain involved. Thus, a dynamic model was used, which integrates surface runoff, concentrated erosion along the channels, propagation and deposition of flow material. Afterwards, the model was validated using 32 debris flows triggered during the same event that were not considered for calibration. The rheological and entrainment parameters obtained for the most accurate simulation were then used to perform three scenarios of debris flow run-out on the basin scale. The results were confronted with the existing buildings exposed in the study area and the worst-case scenario showed a potential inundation that may affect 345 buildings. In addition, six streams where debris flow occurred in the past and caused material damage and loss of lives were identified.

  12. Identification, testing, and analysis of a meteorite debris from jhelum, pakistan

    International Nuclear Information System (INIS)

    Kayani, S.

    2012-01-01

    In this research paper, X-ray diffraction (XRD) and X-ray fluorescence (XRF) spectrometry have been used to determine the mineralogical and elemental composition of a stone sample recovered from a location near the village of Lehri in district Jhelum, Pakistan. The test data are compared with previous findings (as reported in the literature and included in the references) to identify this sample stone as part of a prehistoric meteorite ablation debris. The carbon content of a specimen of the meteorite debris has also been determined through combustion analysis. This carbon abundance has been compared with the carbon wt% values of a certain type of meteorites to establish the origin and nature of the parent body of this particular meteorite debris. (author)

  13. Risk analysis reveals global hotspots for marine debris ingestion by sea turtles

    NARCIS (Netherlands)

    Schuyler, Qamar A.; Wilcox, Chris; Townsend, Kathy A.; Wedemeyer-Strombel, Kathryn R.; Balazs, George; van Sebille, Erik|info:eu-repo/dai/nl/304831921; Hardesty, Britta Denise

    2016-01-01

    Plastic marine debris pollution is rapidly becoming one of the critical environmental concerns facing wildlife in the 21st century. Here we present a risk analysis for plastic ingestion by sea turtles on a global scale. We combined global marine plastic distributions based on ocean drifter data with

  14. Debris flows risk analysis and direct loss estimation: the case study of Valtellina di Tirano, Italy

    Czech Academy of Sciences Publication Activity Database

    Blahůt, Jan; Glade, T.; Sterlacchini, S.

    2014-01-01

    Roč. 11, č. 2 (2014), s. 288-307 ISSN 1672-6316 Institutional support: RVO:67985891 Keywords : Debris flows * Risk analysis * Economic losses * Central Alps * Italy Subject RIV: DE - Earth Magnetism, Geodesy, Geography OBOR OECD: Physical geography Impact factor: 0.963, year: 2014

  15. Interaction of debris with a solid obstacle: Numerical analysis

    International Nuclear Information System (INIS)

    Kosinska, Anna

    2010-01-01

    The subject of this research is the propagation of a cloud of solid particles formed from an explosion-damaged construction. The main objective is the interaction of the cloud (debris) with a solid beam located at some distance from the explosion. The mathematical model involves the flow of the gas using standard conservation equations, and this part of the model is solved numerically. The solid particles are treated as a system of solid points (so-called Lagrangian approach), whose motion is the result of the flowing gas as well as collisions with obstacles. These two issues are described respectively by Newton's second law and the hard-sphere model. The model is used to simulate various cases where the influence of different parameters like the value of the pressure of the explosion, the particle size, the number of particles and the obstacle location are investigated. The results are presented as snapshots of particle location, and also as the particle total momentum during collision with the beam.

  16. Interaction of debris with a solid obstacle: numerical analysis.

    Science.gov (United States)

    Kosinska, Anna

    2010-05-15

    The subject of this research is the propagation of a cloud of solid particles formed from an explosion-damaged construction. The main objective is the interaction of the cloud (debris) with a solid beam located at some distance from the explosion. The mathematical model involves the flow of the gas using standard conservation equations, and this part of the model is solved numerically. The solid particles are treated as a system of solid points (so-called Lagrangian approach), whose motion is the result of the flowing gas as well as collisions with obstacles. These two issues are described respectively by Newton's second law and the hard-sphere model. The model is used to simulate various cases where the influence of different parameters like the value of the pressure of the explosion, the particle size, the number of particles and the obstacle location are investigated. The results are presented as snapshots of particle location, and also as the particle total momentum during collision with the beam. Copyright (c) 2009 Elsevier B.V. All rights reserved.
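
    Neither record gives the solver in detail, but the Lagrangian particle-tracking idea they describe (Newton's second law driven by drag from the gas flow) can be sketched as below. The drag law, material properties and gas velocity are illustrative assumptions, and the hard-sphere collision handling discussed in the papers is omitted.

# Minimal Lagrangian sketch of the particle side of such a model: Newton's second
# law with a simple quadratic drag force from the gas. All values are placeholders.
import numpy as np

def drag_force(u_gas, v_p, d_p, rho_gas=1.2, cd=0.44):
    """Quadratic drag on a sphere of diameter d_p [m]."""
    area = np.pi * d_p**2 / 4.0
    rel = u_gas - v_p
    return 0.5 * rho_gas * cd * area * np.linalg.norm(rel) * rel

def step(x, v, u_gas, d_p, rho_p=2500.0, dt=1e-4):
    """Explicit Euler update of particle position and velocity."""
    m = rho_p * np.pi * d_p**3 / 6.0
    a = drag_force(u_gas, v, d_p) / m + np.array([0.0, -9.81])
    return x + v * dt, v + a * dt

x, v = np.zeros(2), np.array([50.0, 10.0])          # ejected particle, m/s
for _ in range(1000):
    x, v = step(x, v, u_gas=np.array([5.0, 0.0]), d_p=1e-3)
print(x, v)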

  17. Artificial intelligence applied to process signal analysis

    Science.gov (United States)

    Corsberg, Dan

    1988-01-01

    Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge base approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and proper system response monitored.

  18. Application of CAMP code to analysis of debris coolability experiments in ALPHA program

    International Nuclear Information System (INIS)

    Maruyama, Yu; Moriyama, Kiyofumi; Park, Hyun-Sun; Yang, Yanhua; Sugimoto, Jun

    1999-01-01

    An analytical code for thermo-fluid dynamics of a molten debris, CAMP, was applied to the analysis of the ex-vessel and in-vessel debris coolability experiments performed in ALPHA program. The analysis on the ex-vessel debris coolability experiments, where water was added onto a layer of thermite melt, indicated that the upper surface of the melt was remained molten during a period when melt eruptions followed by a mild steam explosion were observed. This might imply that a coarse mixing between the melt and the overlying water could have been formed if a sufficient force was generated at the interface between the two fluids. In the analysis of the in-vessel debris coolability experiments, where an aluminum oxide (Al₂O₃) melt was poured into a water-filled lower head experimental vessel, a temperature increase at the outer surface of the vessel was qualitatively reproduced when a gap was assumed to be at the interface between the solidified Al₂O₃ and the vessel wall. (author)

  19. Fire debris analysis for forensic fire investigation using laser induced breakdown spectroscopy (LIBS)

    Science.gov (United States)

    Choi, Soojin; Yoh, Jack J.

    2017-08-01

    The feasibility of a first attempt to apply LIBS to arson investigation was verified. LIBS has capabilities for real-time in-situ analysis and depth profiling. It can provide valuable information about fire debris that is complementary to the classification of original sample components and combustion residues. In this study, fire debris was analyzed to determine the ignition source and the existence of a fire accelerant using LIBS spectra and depth profiling analysis. The chemical composition of the fire debris and the carbon layer thickness determine the possible ignition source, while the carbon layer thickness of combusted samples represents the degree of sample carbonization. When a sample is combusted with fire accelerants, a thicker carbon layer is formed because the burning rate is increased. Therefore, depth profiling can confirm the existence of combustion accelerants, which is evidence of arson. Investigation of fire debris by depth profiling is still possible when a fire has been extinguished with water from a fire hose. Such data analysis and in-situ detection of forensic signals via LIBS may assist fire investigation at crime scenes.

  20. Sensor fusion for intelligent alarm analysis

    International Nuclear Information System (INIS)

    Nelson, C.L.; Fitzgerald, D.S.

    1996-01-01

    The purpose of an intelligent alarm analysis system is to provide complete and manageable information to a central alarm station operator by applying alarm processing and fusion techniques to sensor information. This paper discusses the sensor fusion approach taken to perform intelligent alarm analysis for the Advanced Exterior Sensor (AES). The AES is an intrusion detection and assessment system designed for wide-area coverage, quick deployment, low false/nuisance alarm operation, and immediate visual assessment. It combines three sensor technologies (visible, infrared, and millimeter wave radar) collocated on a compact and portable remote sensor module. The remote sensor module rotates at a rate of 1 revolution per second to detect and track motion and provide assessment in a continuous 360 degree field-of-regard. Sensor fusion techniques are used to correlate and integrate the track data from these three sensors into a single track for operator observation. Additional inputs to the fusion process include environmental data, knowledge of sensor performance under certain weather conditions, sensor priority, and recent operator feedback. A confidence value is assigned to the track as a result of the fusion process. This helps to reduce nuisance alarms and to increase operator confidence in the system while reducing the workload of the operator

  1. Debris Flow Risk Management Framework and Risk Analysis in Taiwan, A Preliminary Study

    Science.gov (United States)

    Tsao, Ting-Chi; Hsu, Wen-Ko; Chiou, Lin-Bin; Cheng, Chin-Tung; Lo, Wen-Chun; Chen, Chen-Yu; Lai, Cheng-Nong; Ju, Jiun-Ping

    2010-05-01

    Taiwan is located on a seismically active mountain belt between the Philippine Sea plate and the Eurasian plate. After 1999's Chi-Chi earthquake (Mw=7.6), landslides and debris flows occurred frequently. In Aug. 2009, Typhoon Morakot struck Taiwan and numerous landslide and debris flow events, some with tremendous fatalities, were observed. With limited resources, authorities should establish a disaster management system to cope with slope disaster risks more effectively. Since 2006, Taiwan's authority in charge of debris flow management, the Soil and Water Conservation Bureau (SWCB), has completed the basic investigation and data collection for 1,503 potential debris flow creeks around Taiwan. During 2008 and 2009, a debris flow quantitative risk analysis (QRA) framework, based on the landslide risk management framework of Australia, was proposed and applied to 106 creeks in the 30 villages with a history of debris flow hazards. Information and values of several types of elements at risk (bridges, roads, buildings and crops) were gathered and integrated into a GIS layer, and a vulnerability model for each element at risk was applied. Through studying the historical hazard events of the 30 villages, numerical simulations of debris flow hazards with different magnitudes (5, 10, 25, 50, 100 and 200 year return periods) were conducted, and the economic losses and fatalities of each scenario were calculated for each creek. When taking the annual exceedance probability into account, the annual total risk of each creek was calculated, and the results displayed on a debris flow risk map. The number of fatalities and the frequency were calculated, and the F-N curves of the 106 creeks were provided. For the F-N curves, an individual risk to life per year of 1.0E-04 and a slope of 1, which match international standards, were considered to be the acceptable risk. Applying the results of the 106 creeks onto the F-N curve, they were divided into 3 categories: Unacceptable, ALARP (As Low As Reasonably Practicable) and
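
    The annualised risk described in the record combines the annual exceedance probability of each return-period scenario with its estimated consequence. A back-of-envelope sketch of that combination, with entirely invented loss figures, is shown below.

# Illustrative annual-risk sketch (all loss values invented): each scenario band has
# an incremental annual probability derived from its return period, and annual risk
# is the probability-weighted sum of losses over the bands.
return_periods = [5, 10, 25, 50, 100, 200]                        # years
losses = {5: 0.1, 10: 0.3, 25: 0.8, 50: 1.5, 100: 2.4, 200: 4.0}  # million USD, hypothetical

probs = [1.0 / t for t in return_periods] + [0.0]                 # annual exceedance probabilities
annual_risk = 0.0
for i, t in enumerate(return_periods):
    dp = probs[i] - probs[i + 1]                                  # probability of this band per year
    annual_risk += dp * losses[t]
print(f"annual expected loss ≈ {annual_risk:.3f} million USD")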

  2. Improvement and evaluation of debris coolability analysis module in severe accident analysis code SAMPSON using LIVE experiment

    International Nuclear Information System (INIS)

    Wei, Hongyang; Erkan, Nejdet; Okamoto, Koji; Gaus-Liu, Xiaoyang; Miassoedov, Alexei

    2017-01-01

    Highlights: • Debris coolability analysis module in SAMPSON is validated. • Model for heat transfer between melt pool and pressure vessel wall is modified. • Modified debris coolability analysis module is found to give reasonable results. - Abstract: The purpose of this work is to validate the debris coolability analysis (DCA) module in the severe accident analysis code SAMPSON by simulating the first steady stage of the LIVE-L4 test. The DCA module is used for debris cooling in the lower plenum and for predicting the safety margin of present reactor vessels during a severe accident. In the DCA module, the spreading and cooling of molten debris, gap cooling, heating of a three-dimensional reactor vessel, and natural convection heat transfer are all considered. The LIVE experiment is designed to investigate the formation and stability of melt pools in a reactor pressure vessel (RPV). By comparing the simulation results and experimental data in terms of the average melt pool temperature and the heat flux along the vessel wall, a bug is found in the code and the model for the heat transfer between the melt pool and RPV wall is modified. Based on the Asfia–Dhir and Jahn–Reineke correlations, the modified version of the DCA module is found to give reasonable results for the average melt pool temperature, crust thickness in the steady state, and crust growth rate.

  3. An artificial intelligence approach towards disturbance analysis

    International Nuclear Information System (INIS)

    Fiedler, U.; Lindner, A.; Baldeweg, F.; Klebau, J.

    1986-01-01

    The scale and degree of sophistication of technological plants, e.g. nuclear power plants, have increased essentially during the last decades. Conventional disturbance analysis systems have proved to work successfully in well-known situations. But in cases of emergencies, the operator needs more advanced assistance in realizing diagnosis and therapy control. The significance of introducing artificial intelligence (AI) methods in nuclear power technology is emphasized. The main features of the on-line disturbance analysis system SAAP-2, which is being developed for application to nuclear power plants, are reported. Problems related to man-machine communication are discussed in more detail, because their solution will considerably influence end-user acceptance. (author)

  4. HOW TO MAKE ANALYSIS WORK IN BUSINESS INTELLIGENCE SOFTWARE

    OpenAIRE

    Axner, Dr Lilit

    2009-01-01

    Competitive Intelligence (CI) has been defined by many authors. These definitions do have certain differences but all of them have a main common feature: They put the accent on the analysis. The most precise definition is given by the Society for Competitive Intelligence Professionals (SCIP): “A systematic and ethical program for gathering, analyzing, and managing external information that can affect your company’s plans, decisions, and operations”. Business Intelligence (BI) is much broader ...

  5. Determination of Ignitable Liquids in Fire Debris: Direct Analysis by Electronic Nose

    Directory of Open Access Journals (Sweden)

    Marta Ferreiro-González

    2016-05-01

    Full Text Available Arsonists usually use an accelerant in order to start or accelerate a fire. The most widely used analytical method to determine the presence of such accelerants consists of a pre-concentration step of the ignitable liquid residues followed by chromatographic analysis. A rapid analytical method based on headspace-mass spectrometry electronic nose (E-Nose) has been developed for the analysis of Ignitable Liquid Residues (ILRs). The working conditions for the E-Nose analytical procedure were optimized by studying different fire debris samples. The optimized experimental variables were related to headspace generation, specifically, incubation temperature and incubation time. The optimal conditions were 115 °C and 10 min for these two parameters. Chemometric tools such as hierarchical cluster analysis (HCA) and linear discriminant analysis (LDA) were applied to the MS data (45–200 m/z) to establish the most suitable spectroscopic signals for the discrimination of several ignitable liquids. The optimized method was applied to a set of fire debris samples. In order to simulate post-burn samples several ignitable liquids (gasoline, diesel, citronella, kerosene, paraffin) were used to ignite different substrates (wood, cotton, cork, paper and paperboard). A full discrimination was obtained on using discriminant analysis. This method reported here can be considered as a green technique for fire debris analyses.

  6. Determination of Ignitable Liquids in Fire Debris: Direct Analysis by Electronic Nose

    Science.gov (United States)

    Ferreiro-González, Marta; Barbero, Gerardo F.; Palma, Miguel; Ayuso, Jesús; Álvarez, José A.; Barroso, Carmelo G.

    2016-01-01

    Arsonists usually use an accelerant in order to start or accelerate a fire. The most widely used analytical method to determine the presence of such accelerants consists of a pre-concentration step of the ignitable liquid residues followed by chromatographic analysis. A rapid analytical method based on headspace-mass spectrometry electronic nose (E-Nose) has been developed for the analysis of Ignitable Liquid Residues (ILRs). The working conditions for the E-Nose analytical procedure were optimized by studying different fire debris samples. The optimized experimental variables were related to headspace generation, specifically, incubation temperature and incubation time. The optimal conditions were 115 °C and 10 min for these two parameters. Chemometric tools such as hierarchical cluster analysis (HCA) and linear discriminant analysis (LDA) were applied to the MS data (45–200 m/z) to establish the most suitable spectroscopic signals for the discrimination of several ignitable liquids. The optimized method was applied to a set of fire debris samples. In order to simulate post-burn samples several ignitable liquids (gasoline, diesel, citronella, kerosene, paraffin) were used to ignite different substrates (wood, cotton, cork, paper and paperboard). A full discrimination was obtained on using discriminant analysis. This method reported here can be considered as a green technique for fire debris analyses. PMID:27187407
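
    The chemometric step described in both versions of this record (LDA applied to headspace-MS intensities over m/z 45–200 to discriminate ignitable liquids) can be sketched as follows. The data here are random placeholders standing in for the measured spectra.

# Sketch of the discriminant-analysis step: LDA on headspace-MS intensity vectors
# (156 channels ~ m/z 45-200). Spectra below are synthetic, not the paper's data.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
n_per_class, n_mz = 10, 156
classes = ["gasoline", "diesel", "kerosene"]
X = np.vstack([rng.normal(loc=i, scale=1.0, size=(n_per_class, n_mz))
               for i, _ in enumerate(classes)])
y = np.repeat(classes, n_per_class)

lda = LinearDiscriminantAnalysis(n_components=2)
scores = lda.fit_transform(X, y)       # 2-D projection used for discrimination plots
print(lda.score(X, y), scores.shape)   # classification accuracy on the training set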

  7. Three dimensional computational fluid dynamic analysis of debris transport under emergency cooling water recirculation

    International Nuclear Information System (INIS)

    Park, Jong Woon

    2010-01-01

    This paper provides a computational fluid dynamics (CFD) analysis method for the evaluation of debris transport under the emergency recirculation mode after a loss of coolant accident at a nuclear power plant. A three-dimensional geometrical model of the reactor building floor is constructed, including flow obstacles larger than 6 inches such as mechanical components and equipment, and considering various inlet flow paths from the upper reactor building such as break and spray flow. In the modeling of the inlet flows from the upper floors, the effect of gravitational force was also reflected. For the precision of the analysis, 3 million tetrahedral meshes were generated. The reference calculation showed physically reasonable results. Sensitivity studies for mesh type and turbulence model showed very similar results to the reference case. This study provides useful information on the application of CFD to the evaluation of debris transport fraction for the design of new emergency sump filters. (orig.)

  8. Artificial intelligence in mitral valve analysis.

    Science.gov (United States)

    Jeganathan, Jelliffe; Knio, Ziyad; Amador, Yannis; Hai, Ting; Khamooshian, Arash; Matyal, Robina; Khabbaz, Kamal R; Mahmood, Feroze

    2017-01-01

    Echocardiographic analysis of mitral valve (MV) has become essential for diagnosis and management of patients with MV disease. Currently, the various software used for MV analysis require manual input and are prone to interobserver variability in the measurements. The aim of this study is to determine the interobserver variability in an automated software that uses artificial intelligence for MV analysis. Retrospective analysis of intraoperative three-dimensional transesophageal echocardiography data acquired from four patients with normal MV undergoing coronary artery bypass graft surgery in a tertiary hospital. Echocardiographic data were analyzed using the eSie Valve Software (Siemens Healthcare, Mountain View, CA, USA). Three examiners analyzed three end-systolic (ES) frames from each of the four patients. A total of 36 ES frames were analyzed and included in the study. A multiple mixed-effects ANOVA model was constructed to determine if the examiner, the patient, and the loop had a significant effect on the average value of each parameter. A Bonferroni correction was used to correct for multiple comparisons, and P = 0.0083 was considered to be significant. Examiners did not have an effect on any of the six parameters tested. Patient and loop had an effect on the average parameter value for each of the six parameters as expected (P < 0.0083 for both). We were able to conclude that using automated analysis, it is possible to obtain results with good reproducibility, which only requires minimal user intervention.
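
    The significance threshold quoted in this and the companion records of the same study is simply the Bonferroni-corrected level for the six MV parameters tested:

\alpha_{\text{adj}} = \alpha / m = 0.05 / 6 \approx 0.0083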

  9. Close Approach Prediction Analysis of the Earth Science Constellation with the Fengyun-1C Debris

    Science.gov (United States)

    Duncan, Matthew; Rand, David K.

    2008-01-01

    Routine satellite operations for the Earth Science Constellation (ESC) include collision risk assessment between members of the constellation and other orbiting space objects. Each day, close approach predictions are generated by a U.S. Department of Defense Joint Space Operations Center Orbital Safety Analyst using the high accuracy Space Object Catalog maintained by the Air Force's 1st Space Control Squadron. Prediction results and other ancillary data such as state vector information are sent to NASA/Goddard Space Flight Center's (GSFC's) Collision Risk Assessment analysis team for review. Collision analysis is performed and the GSFC team works with the ESC member missions to develop risk reduction strategies as necessary. This paper presents various close approach statistics for the ESC. The ESC missions have been affected by debris from the recent anti-satellite test which destroyed the Chinese Fengyun-1C satellite. The paper also presents the percentage of close approach events induced by the Fengyun-1C debris, and presents analysis results which predict the future effects on the ESC caused by this event. Specifically, the Fengyun-1C debris is propagated for twenty years using high-performance computing technology and close approach predictions are generated for the ESC. The percent increase in the total number of conjunction events is considered to be an estimate of the collision risk due to the Fengyun-1C break-up.

  10. Behavior Analysis and the Quest for Machine Intelligence.

    Science.gov (United States)

    Stephens, Kenneth R.; Hutchison, William R.

    1993-01-01

    Discusses three approaches to building intelligent systems: artificial intelligence, neural networks, and behavior analysis. BANKET, an object-oriented software system, is explained; a commercial application of BANKET is described; and a collaborative effort between the academic and business communities for the use of BANKET is discussed.…

  11. Extensions to SCDAP/RELAP5/MOD2 debris analysis models for the severe accident analysis of Savannah River Site (SRS) reactors preliminary design report

    International Nuclear Information System (INIS)

    Siefken, L.J.; Moore, R.L.

    1989-06-01

    Proposed extensions to the debris analysis model in the SCDAP/RELAP5 code to perform severe accident analyses of Savannah River Plant reactors are described. Designs are presented for the following areas of development: (a) calculating convective and radiative heat transfer at the surfaces of a debris region; (b) calculating heatup of a structure and supported debris that interfaces with several fluid control volumes; (c) modeling the addition of transported material to the surfaces of any structure represented by the debris analysis model; (d) calculating the two-dimensional heatup of an arbitrary number of structures in the reactor system; (e) modeling the effect of natural convection of liquefied material on heat transfer in a debris bed; and (f) modeling fission product release and aerosol generation in a debris bed. 11 refs., 12 figs., 7 tabs

  12. Temporary disaster debris management site identification using binomial cluster analysis and GIS.

    Science.gov (United States)

    Grzeda, Stanislaw; Mazzuchi, Thomas A; Sarkani, Shahram

    2014-04-01

    An essential component of disaster planning and preparation is the identification and selection of temporary disaster debris management sites (DMS). However, since DMS identification is a complex process involving numerous variable constraints, many regional, county and municipal jurisdictions initiate this process during the post-disaster response and recovery phases, typically a period of severely stressed resources. Hence, a pre-disaster approach in identifying the most likely sites based on the number of locational constraints would significantly contribute to disaster debris management planning. As disasters vary in their nature, location and extent, an effective approach must facilitate scalability, flexibility and adaptability to variable local requirements, while also being generalisable to other regions and geographical extents. This study demonstrates the use of binomial cluster analysis in potential DMS identification in a case study conducted in Hamilton County, Indiana. © 2014 The Author(s). Disasters © Overseas Development Institute, 2014.

  13. Artificial intelligence in mitral valve analysis

    Directory of Open Access Journals (Sweden)

    Jelliffe Jeganathan

    2017-01-01

    Full Text Available Background: Echocardiographic analysis of mitral valve (MV) has become essential for diagnosis and management of patients with MV disease. Currently, the various software used for MV analysis require manual input and are prone to interobserver variability in the measurements. Aim: The aim of this study is to determine the interobserver variability in an automated software that uses artificial intelligence for MV analysis. Settings and Design: Retrospective analysis of intraoperative three-dimensional transesophageal echocardiography data acquired from four patients with normal MV undergoing coronary artery bypass graft surgery in a tertiary hospital. Materials and Methods: Echocardiographic data were analyzed using the eSie Valve Software (Siemens Healthcare, Mountain View, CA, USA). Three examiners analyzed three end-systolic (ES) frames from each of the four patients. A total of 36 ES frames were analyzed and included in the study. Statistical Analysis: A multiple mixed-effects ANOVA model was constructed to determine if the examiner, the patient, and the loop had a significant effect on the average value of each parameter. A Bonferroni correction was used to correct for multiple comparisons, and P = 0.0083 was considered to be significant. Results: Examiners did not have an effect on any of the six parameters tested. Patient and loop had an effect on the average parameter value for each of the six parameters as expected (P < 0.0083 for both). Conclusion: We were able to conclude that using automated analysis, it is possible to obtain results with good reproducibility, which only requires minimal user intervention.

  14. Artificial Intelligence in Mitral Valve Analysis

    Science.gov (United States)

    Jeganathan, Jelliffe; Knio, Ziyad; Amador, Yannis; Hai, Ting; Khamooshian, Arash; Matyal, Robina; Khabbaz, Kamal R; Mahmood, Feroze

    2017-01-01

    Background: Echocardiographic analysis of mitral valve (MV) has become essential for diagnosis and management of patients with MV disease. Currently, the various software used for MV analysis require manual input and are prone to interobserver variability in the measurements. Aim: The aim of this study is to determine the interobserver variability in an automated software that uses artificial intelligence for MV analysis. Settings and Design: Retrospective analysis of intraoperative three-dimensional transesophageal echocardiography data acquired from four patients with normal MV undergoing coronary artery bypass graft surgery in a tertiary hospital. Materials and Methods: Echocardiographic data were analyzed using the eSie Valve Software (Siemens Healthcare, Mountain View, CA, USA). Three examiners analyzed three end-systolic (ES) frames from each of the four patients. A total of 36 ES frames were analyzed and included in the study. Statistical Analysis: A multiple mixed-effects ANOVA model was constructed to determine if the examiner, the patient, and the loop had a significant effect on the average value of each parameter. A Bonferroni correction was used to correct for multiple comparisons, and P = 0.0083 was considered to be significant. Results: Examiners did not have an effect on any of the six parameters tested. Patient and loop had an effect on the average parameter value for each of the six parameters as expected (P < 0.0083 for both). Conclusion: We were able to conclude that using automated analysis, it is possible to obtain results with good reproducibility, which only requires minimal user intervention. PMID:28393769

  15. Advanced Analysis Cognition: Improving the Cognition of Intelligence Analysis

    Science.gov (United States)

    2013-09-01


  16. Dynamic Analysis of The Intelligent Sprayer Boom

    DEFF Research Database (Denmark)

    Wiggers, Sine Leergaard; Maagaard, Jørgen; Terp, Christian Istjord

    As part of the 3 year project “The intelligent Sprayer Boom”, financed by The Danish National Advanced Technology Foundation, the dynamics of the sprayer boom is to be analysed. In order to minimize the amount of herbicides used to kill the weeds in agriculture, a new sprayer boom is being developed, called “The intelligent sprayer boom”. For the sprayer boom the primary challenge is to hit the weeds with precision from a movable platform. Since the sprayer boom is mounted on a tractor, the system will react to bumps in the field. The intelligent sprayer boom has an integrated camera technology...

  17. TMI-2 core debris grab samples: Examination and analysis: Part 1

    International Nuclear Information System (INIS)

    Akers, D.W.; Carlson, E.R.; Cook, B.A.; Ploger, S.A.; Carlson, J.O.

    1986-09-01

    Six samples of particulate debris were removed from the TMI-2 core rubble bed during September and October 1983, and five more samples were obtained in March 1984. The samples (up to 174 g each) were obtained at two locations in the core: H8 (center) and E9 (mid-radius). Ten of the eleven samples were examined at the Idaho National Engineering Laboratory to obtain data on the physical and chemical nature of the debris and the postaccident condition of the core. Portions of the samples also were subjected to differential thermal analysis at Rockwell Hanford Operations and metallurgical and chemical examinations at Argonne National Laboratories. This report presents results of the examination of the core debris grab samples, including physical, metallurgical, chemical, and radiochemical analyses. The results indicate that temperatures in the core reached at least 3100 K during the TMI-2 accident, fuel melting and significant mixing of core structural material occurred, and large fractions of some radionuclides (e.g., 90 Sr and 144 Ce) were retained in the core

  18. A Ballistic Limit Analysis Program for Shielding Against Micrometeoroids and Orbital Debris

    Science.gov (United States)

    Ryan, Shannon; Christiansen, Erie

    2010-01-01

    A software program has been developed that enables the user to quickly and simply perform ballistic limit calculations for common spacecraft structures that are subject to hypervelocity impact of micrometeoroid and orbital debris (MMOD) projectiles. This analysis program consists of two core modules: design and performance. The design module enables a user to calculate preliminary dimensions of a shield configuration (e.g., thicknesses/areal densities, spacing, etc.) for a “design” particle (diameter, density, impact velocity, incidence). The performance module enables a more detailed shielding analysis, providing the performance of a user-defined shielding configuration over the range of relevant in-orbit impact conditions.
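
    The record does not state which ballistic limit equations the program implements. Purely as an illustration of what such a calculation involves, the sketch below evaluates one widely published form of the Whipple-shield ballistic limit equation for the hypervelocity regime (normal impact velocity above roughly 7 km/s); the equation choice and all input values are assumptions, not taken from the program.

# Illustrative sketch only: one commonly quoted Whipple-shield ballistic limit
# equation (hypervelocity regime). Units: cm, g/cm^3, km/s, ksi; output in cm.
import math

def whipple_critical_diameter(tw_cm, S_cm, sigma_ksi, rho_p, rho_b, v_kms, theta_deg):
    """Critical (just-failing) projectile diameter for a two-wall Whipple shield."""
    cos_t = math.cos(math.radians(theta_deg))
    return (3.918 * tw_cm**(2.0 / 3.0) * rho_p**(-1.0 / 3.0) * rho_b**(-1.0 / 9.0)
            * (v_kms * cos_t)**(-2.0 / 3.0) * S_cm**(1.0 / 3.0)
            * (sigma_ksi / 70.0)**(1.0 / 3.0))

# Example (hypothetical shield): 1 mm aluminium rear wall, 10 cm standoff,
# aluminium projectile at 7 km/s, normal incidence.
print(whipple_critical_diameter(0.10, 10.0, 40.0, 2.8, 2.8, 7.0, 0.0), "cm")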

  19. Development of intelligent system for a thermal analysis instrument

    International Nuclear Information System (INIS)

    Xu Xiaoli; Wu Guoxin; Shi Yongchao

    2005-01-01

    The key techniques for the intelligent analysis instrument developed are proposed. Based on virtual instrumentation technology, an intelligent PID control algorithm for controlling the temperature of the thermal analysis instrument is described. The dynamic character and robust performance of traditional PID control are improved by introducing a dynamic gain factor, a temperature change-rate factor, a forecast factor, and a temperature correction factor. Using the graphic development environment of LabVIEW, the design of system modularization and the graphic display are implemented. By means of multiple mathematical modules, intelligent data processing is realized.
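
    The abstract names a dynamic gain factor among the enhancements to the PID loop but gives no equations. The sketch below shows a conventional PID controller with a crude error-dependent gain adjustment to illustrate the idea; the adjustment rule, tuning constants and toy plant model are invented for illustration.

# Sketch of a PID loop with a simple "dynamic gain" adjustment (all constants invented).
class AdaptivePID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral, self.prev_err = 0.0, 0.0

    def update(self, setpoint, measured):
        err = setpoint - measured
        gain = 1.0 if abs(err) > 5.0 else 0.5     # soften proportional action near setpoint
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return gain * self.kp * err + self.ki * self.integral + self.kd * deriv

pid = AdaptivePID(kp=2.0, ki=0.1, kd=0.5, dt=1.0)
temp = 25.0
for _ in range(50):                               # toy furnace ramp toward 100 degC
    power = pid.update(setpoint=100.0, measured=temp)
    temp += 0.05 * power                          # crude first-order plant model
print(round(temp, 1))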

  20. Comparative Analysis of the Main Business Intelligence Solutions

    OpenAIRE

    Alexandra RUSANEANU

    2013-01-01

    Nowadays, Business Intelligence solutions are the main tools for analyzing and monitoring the company’s performance at any organizational level. This paper presents a comparative analysis of the most powerful Business Intelligence solutions using a set of technical features such as infrastructure of the platform, development facilities, complex analysis tools, interactive dashboards and scorecards, mobile integration and complex implementation of performance management methodologies.

  1. An Artificial Intelligence-Based Environment Quality Analysis System

    OpenAIRE

    Oprea , Mihaela; Iliadis , Lazaros

    2011-01-01

    Part 20: Informatics and Intelligent Systems Applications for Quality of Life information Services (ISQLIS) Workshop; International audience; The paper describes an environment quality analysis system based on a combination of some artificial intelligence techniques, artificial neural networks and rule-based expert systems. Two case studies of the system use are discussed: air pollution analysis and flood forecasting with their impact on the environment and on the population health. The syste...

  2. A framework for intelligent reliability centered maintenance analysis

    International Nuclear Information System (INIS)

    Cheng Zhonghua; Jia Xisheng; Gao Ping; Wu Su; Wang Jianzhao

    2008-01-01

    To improve the efficiency of reliability-centered maintenance (RCM) analysis, case-based reasoning (CBR), as a kind of artificial intelligence (AI) technology, was successfully introduced into the RCM analysis process, and a framework for intelligent RCM analysis (IRCMA) was studied. The idea behind IRCMA is based on the fact that the historical records of RCM analysis on similar items can be referenced and used for the current RCM analysis of a new item. Because many common or similar items may exist in the analyzed equipment, the repeated tasks of RCM analysis can be considerably simplified or avoided by revising the similar cases when conducting RCM analysis. Based on previous theoretical studies, an intelligent RCM analysis system (IRCMAS) prototype was developed. This research has focused on the description of the definition, basic principles and framework of IRCMA, and discussion of critical techniques in IRCMA. Finally, the IRCMAS prototype is presented based on a case study.
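
    The core CBR step, retrieving the most similar stored RCM case for a new item, can be sketched with a weighted attribute-matching score. The attributes, weights and cases below are illustrative assumptions, not the IRCMAS knowledge base.

# Minimal case-based reasoning retrieval sketch: score stored RCM cases against a
# new item by weighted attribute similarity and reuse the closest case.
cases = [
    {"item": "hydraulic pump A", "function": "pumping", "failure_mode": "seal leak",
     "environment": "vehicle", "task": "scheduled seal replacement"},
    {"item": "cooling fan B", "function": "cooling", "failure_mode": "bearing wear",
     "environment": "fixed plant", "task": "on-condition vibration monitoring"},
]
weights = {"function": 0.5, "failure_mode": 0.3, "environment": 0.2}

def similarity(case, query):
    """Weighted exact-match similarity over the chosen attributes."""
    return sum(w for attr, w in weights.items() if case[attr] == query[attr])

query = {"function": "pumping", "failure_mode": "seal leak", "environment": "vessel"}
best = max(cases, key=lambda c: similarity(c, query))
print(best["item"], "->", best["task"], f"(score {similarity(best, query):.1f})")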

  3. Coastal debris analysis in beaches of Chonburi Province, eastern of Thailand as implications for coastal conservation

    International Nuclear Information System (INIS)

    Thushari, Gajahin Gamage Nadeeka; Chavanich, Suchana; Yakupitiyage, Amararatne

    2017-01-01

    This study quantified coastal debris along 3 beaches (Angsila, Bangsaen, Samaesarn) on the eastern coast of Thailand. Debris samples were collected from the lower and upper strata of these beaches during wet and dry seasons. The results showed that Bangsaen had the highest average debris density (15.5 m⁻²), followed by Samaesarn (8.10 m⁻²) and Angsila (5.54 m⁻²). Among the 12 debris categories, the most abundant debris type was plastics (> 45% of the total debris) at all beach locations. Coastal debris distribution was related to economic activities in the vicinity. Fishery and shell-fish aquaculture activities were the primary sources of debris in Angsila, while tourism activities were the main sources in Bangsaen and Samaesarn. Site-specific pollution control mechanisms (environmental awareness, reuse and recycling) are recommended to reduce public littering. Management actions in Angsila should focus on fishery and shell-fish culture practices, while those in Bangsaen and Samaesarn should be directed toward leisure activities promoting waste management. - Highlights: • Beach debris assessment was conducted in Chonburi Province, the eastern part of Thailand. • Coastal debris accumulation rates and sizes in the study sites depended on beach characteristics and seasons. • Anthropogenic sources were major contributors of coastal debris in the study sites. • Debris control programs need to focus on site-specific coastal pollution issues for effective pollution management actions.

  4. Intelligence

    Science.gov (United States)

    Sternberg, Robert J.

    2012-01-01

    Intelligence is the ability to learn from experience and to adapt to, shape, and select environments. Intelligence as measured by (raw scores on) conventional standardized tests varies across the lifespan, and also across generations. Intelligence can be understood in part in terms of the biology of the brain—especially with regard to the functioning in the prefrontal cortex—and also correlates with brain size, at least within humans. Studies of the effects of genes and environment suggest that the heritability coefficient (ratio of genetic to phenotypic variation) is between .4 and .8, although heritability varies as a function of socioeconomic status and other factors. Racial differences in measured intelligence have been observed, but race is a socially constructed rather than biological variable, so such differences are difficult to interpret. PMID:22577301

  5. Intelligence.

    Science.gov (United States)

    Sternberg, Robert J

    2012-03-01

    Intelligence is the ability to learn from experience and to adapt to, shape, and select environments. Intelligence as measured by (raw scores on) conventional standardized tests varies across the lifespan, and also across generations. Intelligence can be understood in part in terms of the biology of the brain-especially with regard to the functioning in the prefrontal cortex-and also correlates with brain size, at least within humans. Studies of the effects of genes and environment suggest that the heritability coefficient (ratio of genetic to phenotypic variation) is between .4 and .8, although heritability varies as a function of socioeconomic status and other factors. Racial differences in measured intelligence have been observed, but race is a socially constructed rather than biological variable, so such differences are difficult to interpret.

  6. Emotional Intelligence and Nurse Recruitment: Rasch and confirmatory factor analysis of the trait emotional intelligence questionnaire short form.

    Science.gov (United States)

    Snowden, Austyn; Watson, Roger; Stenhouse, Rosie; Hale, Claire

    2015-12-01

    To examine the construct validity of the Trait Emotional Intelligence Questionnaire Short form. Emotional intelligence involves the identification and regulation of our own emotions and the emotions of others. It is therefore a potentially useful construct in the investigation of recruitment and retention in nursing and many questionnaires have been constructed to measure it. Secondary analysis of existing dataset of responses to Trait Emotional Intelligence Questionnaire Short form using concurrent application of Rasch analysis and confirmatory factor analysis. First year undergraduate nursing and computing students completed Trait Emotional Intelligence Questionnaire-Short Form in September 2013. Responses were analysed by synthesising results of Rasch analysis and confirmatory factor analysis. Participants (N = 938) completed Trait Emotional Intelligence Questionnaire Short form. Rasch analysis showed the majority of the Trait Emotional Intelligence Questionnaire-Short Form items made a unique contribution to the latent trait of emotional intelligence. Five items did not fit the model and differential item functioning (gender) accounted for this misfit. Confirmatory factor analysis revealed a four-factor structure consisting of: self-confidence, empathy, uncertainty and social connection. All five misfitting items from the Rasch analysis belonged to the 'social connection' factor. The concurrent use of Rasch and factor analysis allowed for novel interpretation of Trait Emotional Intelligence Questionnaire Short form. Much of the response variation in Trait Emotional Intelligence Questionnaire Short form can be accounted for by the social connection factor. Implications for practice are discussed. © 2015 John Wiley & Sons Ltd.

  7. Improvement of molten core-concrete interaction model of the debris spreading analysis model in the SAMPSON code - 15193

    International Nuclear Information System (INIS)

    Hidaka, M.; Fujii, T.; Sakai, T.

    2015-01-01

    A debris spreading analysis (DSA) module has been developed and improved. The module is used in the severe accident analysis code SAMPSON, and it has models for 3-dimensional natural convection with simultaneous spreading, melting and solidification. The existing quasi-3D boundary transportation analysis method for simulating downward concrete erosion in the evaluation of molten-core concrete interaction (MCCI) was improved to full 3D in order to solve, for instance, debris lateral erosion under concrete floors at the bottom of the sump pit. In the advanced MCCI model, buffer cells were defined in order to solve numerical problems in case of trammel formation. Mass, momentum, and the advection term of energy are solved between the debris melt cells and the buffer cells. On the other hand, only heat transfer and thermal conduction are solved between the debris melt cells and the structure cells, and between the crust cells and the structure cells. As a preliminary analysis, a validation calculation was performed for the erosion that occurred in the core-concrete interaction (CCI-2) test of the OECD/MCCI program. Comparison between the calculation and the CCI-2 test results showed that the analysis has the ability to simulate debris lateral erosion under concrete floors. (authors)

  8. Explanatory analysis in business intelligence systems

    NARCIS (Netherlands)

    Caron, E.A.M.; Daniëls, H.A.M.; Dinter, B.; Smolnik, S.

    2012-01-01

    In this paper we describe a method for the discovery of exceptional values in business intelligence (BI) systems, in particular OLAP information systems. We also show how exceptional values can be explained by underlying causes. OLAP applications offer a support tool for business analysts and

  9. Requirement analysis for autonomous systems and intelligent ...

    African Journals Online (AJOL)

    First we review innovative control architectures in electric power systems such as Microgrids, Virtual power plants and Cell based systems. We evaluate application of autonomous systems and intelligent agents in each of these control architectures particularly in the context of Denmark's strategic energy plans. The second ...

  10. New approaches in intelligent image analysis techniques, methodologies and applications

    CERN Document Server

    Nakamatsu, Kazumi

    2016-01-01

    This book presents an Introduction and 11 independent chapters, which are devoted to various new approaches of intelligent image processing and analysis. The book also presents new methods, algorithms and applied systems for intelligent image processing, on the following basic topics: Methods for Hierarchical Image Decomposition; Intelligent Digital Signal Processing and Feature Extraction; Data Clustering and Visualization via Echo State Networks; Clustering of Natural Images in Automatic Image Annotation Systems; Control System for Remote Sensing Image Processing; Tissue Segmentation of MR Brain Images Sequence; Kidney Cysts Segmentation in CT Images; Audio Visual Attention Models in Mobile Robots Navigation; Local Adaptive Image Processing; Learning Techniques for Intelligent Access Control; Resolution Improvement in Acoustic Maps. Each chapter is self-contained with its own references. Some of the chapters are devoted to the theoretical aspects while the others are presenting the practical aspects and the...

  11. Assessing accumulated hard-tissue debris using micro-computed tomography and free software for image processing and analysis.

    Science.gov (United States)

    De-Deus, Gustavo; Marins, Juliana; Neves, Aline de Almeida; Reis, Claudia; Fidel, Sandra; Versiani, Marco A; Alves, Haimon; Lopes, Ricardo Tadeu; Paciornik, Sidnei

    2014-02-01

    The accumulation of debris occurs after root canal preparation procedures, specifically in fins, isthmuses, irregularities, and ramifications. The aim of this study was to present a step-by-step description of a new method used to longitudinally identify, measure, and 3-dimensionally map the accumulation of hard-tissue debris inside the root canal after biomechanical preparation using free software for image processing and analysis. Three mandibular molars presenting the mesial root with a large isthmus width and a type II Vertucci's canal configuration were selected and scanned. The specimens were assigned to 1 of 3 experimental approaches: (1) 5.25% sodium hypochlorite + 17% EDTA, (2) bidistilled water, and (3) no irrigation. After root canal preparation, high-resolution scans of the teeth were accomplished, and free software packages were used to register and quantify the amount of accumulated hard-tissue debris in either canal space or isthmus areas. Canal preparation without irrigation resulted in 34.6% of the canal volume filled with hard-tissue debris, whereas the use of bidistilled water or NaOCl followed by EDTA showed a reduction in the percentage volume of debris to 16% and 11.3%, respectively. The closer the distance to the isthmus area, the larger the amount of accumulated debris, regardless of the irrigation protocol used. Through the present method, it was possible to calculate the volume of hard-tissue debris in the isthmuses and in the root canal space. Free-software packages used for image reconstruction, registering, and analysis have been shown to be promising for end-user application. Copyright © 2014. Published by Elsevier Inc.
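
    The quantification step described above, registering segmented micro-CT volumes and expressing debris as a percentage of the canal space, reduces to simple voxel counting once binary masks are available. The sketch below shows that arithmetic on synthetic stand-in masks; the voxel size and the masks themselves are invented, and the study's registration and segmentation pipeline is not reproduced.

        import numpy as np

        # Stand-in binary masks from registered pre- and post-preparation micro-CT volumes:
        # canal_mask  - voxels of the canal/isthmus space before preparation
        # debris_mask - voxels classified as hard-tissue debris after preparation
        rng = np.random.default_rng(0)
        canal_mask = rng.random((120, 120, 200)) < 0.05
        debris_mask = canal_mask & (rng.random(canal_mask.shape) < 0.15)

        voxel_volume_mm3 = 0.014 ** 3                 # assumed isotropic 14 micrometre voxels
        canal_volume = canal_mask.sum() * voxel_volume_mm3
        debris_volume = debris_mask.sum() * voxel_volume_mm3

        print(f"canal volume  = {canal_volume:.3f} mm^3")
        print(f"debris volume = {debris_volume:.3f} mm^3 "
              f"({100 * debris_volume / canal_volume:.1f}% of canal space)")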

  12. Intelligent flame analysis for an optimized combustion

    Energy Technology Data Exchange (ETDEWEB)

    Stephan Peper; Dirk Schmidt [ABB Utilities GmbH, Mannheim (Germany)

    2003-07-01

    One of the primary challenges in the area of process control is to ensure that many competing optimization goals are accomplished at the same time and considered in a timely manner. This paper describes a successful approach that uses advanced pattern recognition technology and an intelligent optimization tool to model combustion processes more precisely and to optimize them based on a holistic view. 17 PowerPoint slides are also available in the proceedings. 5 figs., 1 tab.

  13. Army Intelligence Analysis: Transforming Army Intelligence Analysis Training and Doctrine to Serve the Reasonable Expectations and Needs of Echelons Corps and Below Commanders, Consumers, and Customers

    National Research Council Canada - National Science Library

    Lewis, George E., III

    2005-01-01

    ... of intelligence professionals. Now, when faced with modern adaptive and complex asymmetric threats, the need for human analysis has risen to the forefront, but Army Intelligence is ill-equipped to deliver what commanders need...

  14. A Review of Intelligent Driving Style Analysis Systems and Related Artificial Intelligence Algorithms

    Directory of Open Access Journals (Sweden)

    Gys Albertus Marthinus Meiring

    2015-12-01

    Full Text Available In this paper the various driving style analysis solutions are investigated. An in-depth investigation is performed to identify the relevant machine learning and artificial intelligence algorithms utilised in current driver behaviour and driving style analysis systems. This review therefore serves as a trove of information, and will inform the specialist and the student regarding the current state of the art in driver style analysis systems, the application of these systems and the underlying artificial intelligence algorithms applied to these applications. The aim of the investigation is to evaluate the possibilities for unique driver identification utilizing the approaches identified in other driver behaviour studies. It was found that Fuzzy Logic inference systems, Hidden Markov Models and Support Vector Machines offer promising capabilities for unique driver identification if model complexity can be reduced.

  15. A Review of Intelligent Driving Style Analysis Systems and Related Artificial Intelligence Algorithms.

    Science.gov (United States)

    Meiring, Gys Albertus Marthinus; Myburgh, Hermanus Carel

    2015-12-04

    In this paper the various driving style analysis solutions are investigated. An in-depth investigation is performed to identify the relevant machine learning and artificial intelligence algorithms utilised in current driver behaviour and driving style analysis systems. This review therefore serves as a trove of information, and will inform the specialist and the student regarding the current state of the art in driver style analysis systems, the application of these systems and the underlying artificial intelligence algorithms applied to these applications. The aim of the investigation is to evaluate the possibilities for unique driver identification utilizing the approaches identified in other driver behaviour studies. It was found that Fuzzy Logic inference systems, Hidden Markov Models and Support Vector Machines offer promising capabilities for unique driver identification if model complexity can be reduced.
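
    The review points to Support Vector Machines (among other methods) as promising for unique driver identification; a minimal, self-contained sketch of that idea is shown below, training an SVM on synthetic per-trip feature vectors (mean speed, acceleration variance, hard-braking rate). The features and data are invented for illustration and do not come from the review.

        import numpy as np
        from sklearn.svm import SVC
        from sklearn.model_selection import train_test_split
        from sklearn.preprocessing import StandardScaler
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(0)

        # Synthetic trips for 3 drivers: [mean speed, accel variance, hard-brake rate]
        def trips(center, n=60):
            return rng.normal(center, [3.0, 0.05, 0.5], size=(n, 3))

        X = np.vstack([trips([55, 0.20, 2.0]), trips([70, 0.35, 4.0]), trips([62, 0.15, 1.0])])
        y = np.repeat([0, 1, 2], 60)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)
        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
        clf.fit(X_tr, y_tr)
        print("driver-ID accuracy on held-out trips:", clf.score(X_te, y_te))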

  16. Simulation analysis of impulse characteristics of space debris irradiated by multi-pulse laser

    Science.gov (United States)

    Lin, Zhengguo; Jin, Xing; Chang, Hao; You, Xiangyu

    2018-02-01

    Cleaning space debris with laser is a hot topic in the field of space security research. Impulse characteristics are the basis of cleaning space debris with laser. In order to study the impulse characteristics of rotating irregular space debris irradiated by multi-pulse laser, the impulse calculation method of rotating space debris irradiated by multi-pulse laser is established based on the area matrix method. The calculation method of impulse and impulsive moment under multi-pulse irradiation is given. The calculation process of total impulse under multi-pulse irradiation is analyzed. With a typical piece of non-planar space debris (a cube) as an example, the impulse characteristics of space debris irradiated by multi-pulse laser are simulated and analyzed. The effects of initial angular velocity, spot size and pulse frequency on impulse characteristics are investigated.
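
    A greatly simplified version of the bookkeeping described above can be written down directly: each pulse delivers an impulse proportional to the absorbed energy through a momentum coupling coefficient, projected onto the illuminated facet, and the per-pulse impulses and moments are accumulated over the pulse train. This is a minimal sketch; the coupling coefficient, geometry, and the paper's area-matrix method are replaced by assumed toy values.

        import numpy as np

        C_m   = 5.0e-5        # N*s/J, momentum coupling coefficient (assumed)
        E_p   = 100.0         # J, energy per pulse (assumed)
        f_rep = 1.0           # Hz, pulse repetition frequency (assumed)
        n_pulses = 50
        omega = 0.5           # rad/s, spin rate of the debris about z (assumed)

        normal0  = np.array([1.0, 0.0, 0.0])   # facet outward normal at t = 0
        r_facet  = np.array([0.0, 0.2, 0.0])   # facet centre relative to centre of mass, m
        beam_dir = np.array([-1.0, 0.0, 0.0])  # laser propagation direction

        total_impulse = np.zeros(3)
        total_moment  = np.zeros(3)
        for k in range(n_pulses):
            t = k / f_rep
            # rotate facet normal about z as the debris spins
            c, s = np.cos(omega * t), np.sin(omega * t)
            R = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
            n_hat = R @ normal0
            cos_inc = max(0.0, float(n_hat @ (-beam_dir)))  # facet must face the beam
            dJ = C_m * E_p * cos_inc * (-n_hat)             # recoil opposite to the surface normal
            total_impulse += dJ
            total_moment  += np.cross(R @ r_facet, dJ)

        print("total impulse [N*s]:", total_impulse)
        print("total impulsive moment [N*m*s]:", total_moment)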

  17. StreakDet data processing and analysis pipeline for space debris optical observations

    Science.gov (United States)

    Virtanen, Jenni; Flohrer, Tim; Muinonen, Karri; Granvik, Mikael; Torppa, Johanna; Poikonen, Jonne; Lehti, Jussi; Santti, Tero; Komulainen, Tuomo; Naranen, Jyri

    We describe a novel data processing and analysis pipeline for optical observations of space debris. The monitoring of space object populations requires reliable acquisition of observational data to support the development and validation of space debris environment models and the build-up and maintenance of a catalogue of orbital elements. In addition, data is needed for the assessment of conjunction events and for the support of contingency situations or launches. The currently available, mature image processing algorithms for detection and astrometric reduction of optical data cover objects that cross the sensor field-of-view comparatively slowly, and within a rather narrow, predefined range of angular velocities. By applying specific tracking techniques, the objects appear point-like or as short trails in the exposures. However, the general survey scenario is always a “track before detect” problem, resulting in streaks, i.e., object trails of arbitrary lengths, in the images. The scope of the ESA-funded StreakDet (Streak detection and astrometric reduction) project is to investigate solutions for detecting and reducing streaks from optical images, particularly in the low signal-to-noise ratio (SNR) domain, where algorithms are not readily available yet. For long streaks, the challenge is to extract position information and the related registered epochs with sufficient precision. Although some considerations for low-SNR processing of streak-like features are available in the current image processing and computer vision literature, there is a need to discuss and compare these approaches for space debris analysis, in order to develop and evaluate prototype implementations. In the StreakDet project, we develop algorithms applicable to single images (as compared to consecutive frames of the same field) obtained with any observing scenario, including space-based surveys and both low- and high-altitude populations. The proposed processing pipeline starts from the
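
    The StreakDet algorithms themselves are not given in this abstract, but a generic baseline for picking long trails out of a single frame can be sketched with standard tools: smooth, threshold, and search for line segments with a probabilistic Hough transform. This is a minimal sketch on a synthetic image; it is not the StreakDet pipeline, and the parameters are arbitrary.

        import numpy as np
        from skimage.filters import gaussian, threshold_otsu
        from skimage.transform import probabilistic_hough_line

        rng = np.random.default_rng(1)

        # Synthetic frame: background noise plus one faint diagonal streak
        img = rng.normal(100.0, 5.0, size=(256, 256))
        rr = np.arange(40, 200)
        img[rr, rr + 10] += 12.0          # low-SNR streak

        # Smooth, then threshold to a binary map of bright pixels
        smooth = gaussian(img, sigma=2)
        binary = smooth > threshold_otsu(smooth)

        # Extract candidate line segments (streaks) from the binary map
        segments = probabilistic_hough_line(binary, threshold=10, line_length=60, line_gap=5)
        for (x0, y0), (x1, y1) in segments:
            print(f"streak candidate from ({x0},{y0}) to ({x1},{y1})")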

  18. Emotional Intelligence and Academic Success: A Conceptual Analysis for Educational Leaders

    Science.gov (United States)

    Labby, Sandy; Lunenburg, Frederick C.; Slate, John R.

    2012-01-01

    In this review of the literature, we briefly examined the development of intelligence theories as they lead to the emergence of the concept of emotional intelligence(s). In our analysis, we noted that only limited attention had been focused on the emotional intelligence skills of school administrators. Accordingly, we examined the role of…

  19. Analysis of Debris Trajectories at the Scaled Wind Farm Technology (SWiFT) Facility

    Energy Technology Data Exchange (ETDEWEB)

    White, Jonathan R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Burnett, Damon J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-01-01

    Sandia National Laboratories operates the Scaled Wind Farm Technology Facility (SWiFT) on behalf of the Department of Energy Wind and Water Power Technologies Office. An analysis was performed to evaluate the hazards associated with debris thrown from one of SWiFT’s operating wind turbines, assuming a catastrophic failure. A Monte Carlo analysis was conducted to assess the complex variable space associated with debris throw hazards that included wind speed, wind direction, azimuth and pitch angles of the blade, and percentage of the blade that was separated. In addition, a set of high fidelity explicit dynamic finite element simulations were performed to determine the threshold impact energy envelope for the turbine control building located on-site. Assuming that all of the layered, independent, passive and active engineered safety systems and administrative procedures failed (a 100% failure rate of the safety systems), the likelihood of the control building being struck was calculated to be less than 5/10,000 and ballistic simulations showed that the control building would not provide passive protection for the majority of impact scenarios. Although options exist to improve the ballistic resistance of the control building, the recommendation is not to pursue them because there is a low probability of strike and there is an equal likelihood personnel could be located at similar distances in other areas of the SWiFT facility which are not passively protected, while the turbines are operating. A fenced exclusion area has been created around the turbines which restricts access to the boundary of the 1/100 strike probability. The overall recommendation is to neither relocate nor improve passive protection of the control building as the turbine safety systems have been improved to have no less than two independent, redundant, high quality engineered safety systems. Considering this, in combination with a control building strike probability of less than 5/10,000, the
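
    Although the record is cut off above, the Monte Carlo structure it describes can be illustrated with a toy fragment-throw simulation that samples release conditions and scores how often a drag-free trajectory lands inside a target band. All release parameters, the drag-free assumption, and the target geometry below are invented for illustration and are unrelated to the SWiFT analysis or its reported probabilities.

        import numpy as np

        rng = np.random.default_rng(42)
        g = 9.81
        n_trials = 100_000

        hub_height = 30.0                                      # m, assumed
        tip_speed  = rng.uniform(40.0, 80.0, n_trials)         # m/s at release (assumed range)
        azimuth    = rng.uniform(0.0, 2 * np.pi, n_trials)     # blade azimuth at separation
        frac       = rng.uniform(0.1, 1.0, n_trials)           # fraction of tip speed carried by fragment

        # Velocity components at release (vertical plane of the rotor), drag neglected
        vx = frac * tip_speed * np.cos(azimuth)
        vz = frac * tip_speed * np.sin(azimuth)
        z0 = hub_height

        # Time of flight until ground impact: z0 + vz*t - 0.5*g*t^2 = 0
        t_imp = (vz + np.sqrt(vz**2 + 2 * g * z0)) / g
        x_imp = np.abs(vx * t_imp)                             # horizontal throw distance

        # Probability that a fragment reaches a band located 100-110 m downrange (assumed)
        hit = (x_imp >= 100.0) & (x_imp <= 110.0)
        print(f"P(strike band) ~ {hit.mean():.4f}, max range {x_imp.max():.0f} m")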

  20. Artificial intelligence analysis of paraspinal power spectra.

    Science.gov (United States)

    Oliver, C W; Atsma, W J

    1996-10-01

    OBJECTIVE: As an aid to the discrimination of back pain sufferers, an artificial intelligence neural network was constructed to differentiate paraspinal power spectra. DESIGN: Clinical investigation using surface electromyography. METHOD: The surface electromyogram power spectra from 60 subjects, 33 non-back-pain sufferers and 27 chronic back pain sufferers, were used to construct a back propagation neural network that was then tested. Subjects were placed on a test frame in 30 degrees of lumbar forward flexion. An isometric load of two-thirds maximum voluntary contraction was held constant for 30 s whilst surface electromyograms were recorded at the level of the L(4-5). Paraspinal power spectra were calculated and loaded into the input layer of a three-layer back propagation network. The neural network classified the spectra into normal or back pain type. RESULTS: The back propagation neural network was shown to have satisfactory convergence with a specificity of 79% and a sensitivity of 80%. CONCLUSIONS: Artificial intelligence neural networks appear to be a useful method of differentiating paraspinal power spectra in back-pain sufferers.
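
    The study's classifier is not available, but the general approach, feeding a binned EMG power spectrum into a small feed-forward network and classifying it as normal or back-pain type, can be sketched with synthetic spectra. The spectra, labels, and network size below are fabricated for illustration, and the reported 79%/80% figures are not reproduced.

        import numpy as np
        from sklearn.neural_network import MLPClassifier
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        freqs = np.linspace(10, 250, 64)          # Hz bins of the power spectrum

        def spectrum(median_freq, n):
            # crude synthetic EMG-like spectrum peaked near the given median frequency
            base = np.exp(-((freqs[None, :] - median_freq) / 40.0) ** 2)
            return base + 0.05 * rng.random((n, freqs.size))

        # Back-pain subjects are modelled here with lower median frequencies (an assumption)
        X = np.vstack([spectrum(80.0, 100), spectrum(60.0, 100)])
        y = np.array([0] * 100 + [1] * 100)       # 0 = no back pain, 1 = back pain

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1, stratify=y)
        net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=1)
        net.fit(X_tr, y_tr)
        print("held-out accuracy:", net.score(X_te, y_te))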

  1. Quantitative analysis and predictors of embolic filter debris load during carotid artery stenting in asymptomatic patients.

    Science.gov (United States)

    Piazza, Michele; Squizzato, Francesco; Chincarini, Chiara; Fedrigo, Marny; Castellani, Chiara; Angelini, Annalisa; Grego, Franco; Antonello, Michele

    2018-03-01

    The objective of this study was to perform a quantitative analysis and to identify predictors of embolic filter debris (EFD) load during carotid artery stenting (CAS) in asymptomatic patients. All patients with asymptomatic carotid stenosis >70% undergoing CAS between 2008 and 2016 were included in a prospective database. A distal filter protection device was used in all patients. At the end of the procedure, the filter was fixed in formalin and then analyzed with a stereomicroscope. Morphometric analysis was performed with Image-Pro Plus software (Media Cybernetics, Rockville, Md). The total area of the filter membrane and the area covered by particulate material were quantified. The quantity of membrane occupied by debris was expressed as percentage of covered surface area. Anatomic and clinical variables were evaluated for their association with EFD load using multiple logistic regression. Among the 278 patients undergoing CAS, an open-cell stent was implanted in 211 patients (76%); 67 patients (24%) received a closed-cell stent. Overall technical success and clinical success were both 99%; no perioperative death was reported. Stroke rate was 1.8% (major, n = 1 [0.4%]; minor, n = 4 [1.4%]); transient ischemic attacks occurred in 5% of cases (n = 14). The quantitative analysis of the filter revealed that EFD was present in 74% of cases (n = 207). The mean EFD load was 10% of the filter surface (median, 1; range, 0-80); it was 31% in 22 (8%). Patients with any type of ischemic neurologic event after CAS (stroke and transient ischemic attack) had a significantly higher mean EFD load compared with uneventful cases (26.7% ± 19.0% vs 8.5% ± 13.5%; P 12.5% EFD load as the optimal cutoff for the association with clinically relevant perioperative ischemic events (sensitivity, 78%; specificity, 77%; area under the curve, 0.81). The multivariate analysis demonstrated that age >75 years (odds ratio [OR], 2.56; P = .003), pre-existing ipsilateral ischemic
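
    The statistical workflow summarized above (multiple logistic regression for predictors of EFD load, plus a receiver operating characteristic cutoff for clinically relevant events) can be illustrated generically with synthetic data. The variables, coefficients, and cutoff produced below are invented and are not the study's results.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_curve, auc

        rng = np.random.default_rng(3)
        n = 278

        # Synthetic predictors (invented): age and EFD load as % of filter surface
        age   = rng.normal(72, 8, n)
        efd   = np.clip(rng.gamma(1.5, 8.0, n), 0, 80)
        event = (rng.random(n) < 1 / (1 + np.exp(-(-6.0 + 0.25 * efd)))).astype(int)

        # Logistic model: do age and EFD load predict neurologic events?
        X = np.column_stack([age, efd])
        model = LogisticRegression(max_iter=1000).fit(X, event)
        print("coefficients (age, EFD load):", model.coef_[0])

        # ROC analysis of EFD load alone; the Youden index gives an "optimal" cutoff
        fpr, tpr, thr = roc_curve(event, efd)
        j = np.argmax(tpr - fpr)
        print(f"AUC = {auc(fpr, tpr):.2f}, cutoff ~ {thr[j]:.1f}% EFD load "
              f"(sens {tpr[j]:.2f}, spec {1 - fpr[j]:.2f})")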

  2. Sense-making for intelligence analysis on social media data

    Science.gov (United States)

    Pritzkau, Albert

    2016-05-01

    Social networks, in particular online social networks as a subset, enable the analysis of social relationships which are represented by interaction, collaboration, or other sorts of influence between people. Any set of people and their internal social relationships can be modelled as a general social graph. These relationships are formed by exchanging emails, making phone calls, or carrying out a range of other activities that build up the network. This paper presents an overview of current approaches to utilizing social media as a ubiquitous sensor network in the context of national and global security. Exploitation of social media is usually an interdisciplinary endeavour, in which the relevant technologies and methods are identified and linked in order ultimately to demonstrate selected applications. Effective and efficient intelligence is usually accomplished in a combined human and computer effort. Indeed, the intelligence process heavily depends on combining a human's flexibility, creativity, and cognitive ability with the bandwidth and processing power of today's computers. To improve the usability and accuracy of the intelligence analysis we will have to rely on data-processing tools at the level of natural language. Especially the collection and transformation of unstructured data into actionable, structured data requires scalable computational algorithms ranging from Artificial Intelligence, via Machine Learning, to Natural Language Processing (NLP). To support intelligence analysis on social media data, social media analytics is concerned with developing and evaluating computational tools and frameworks to collect, monitor, analyze, summarize, and visualize social media data. Analytics methods are employed to extract significant patterns that might not be obvious. As a result, different data representations rendering distinct aspects of content and interactions serve as a means to adapt the focus of the intelligence analysis to specific information
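
    As a concrete counterpart to the "general social graph" mentioned above, the sketch below builds a small interaction graph from made-up communication records and ranks accounts by centrality, one of the simplest structured representations used in this kind of analysis. The data are invented, and the paper's framework is not reproduced.

        import networkx as nx

        # Invented interaction records: (sender, receiver, number of messages)
        interactions = [
            ("alice", "bob", 12), ("alice", "carol", 3), ("bob", "carol", 7),
            ("carol", "dave", 15), ("dave", "alice", 2), ("eve", "carol", 9),
        ]

        G = nx.DiGraph()
        for src, dst, count in interactions:
            G.add_edge(src, dst, weight=count)

        # Rank accounts by weighted degree and by betweenness centrality
        degree = dict(G.degree(weight="weight"))
        between = nx.betweenness_centrality(G)
        for node in sorted(G, key=lambda n: between[n], reverse=True):
            print(f"{node:6s} degree={degree[node]:3d} betweenness={between[node]:.2f}")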

  3. A comparative analysis of hazard models for predicting debris flows in Madison County, VA

    Science.gov (United States)

    Morrissey, Meghan M.; Wieczorek, Gerald F.; Morgan, Benjamin A.

    2001-01-01

    During the rainstorm of June 27, 1995, roughly 330-750 mm of rain fell within a sixteen-hour period, initiating floods and over 600 debris flows in a small area (130 km2) of Madison County, Virginia. Field studies showed that the majority (70%) of these debris flows initiated with a thickness of 0.5 to 3.0 m in colluvium on slopes from 17° to 41° (Wieczorek et al., 2000). This paper evaluated and compared the approaches of SINMAP, LISA, and Iverson's (2000) transient response model for slope stability analysis by applying each model to the landslide data from Madison County. Of these three stability models, only Iverson's transient response model evaluated stability conditions as a function of time and depth. Iverson's model would be the preferred method of the three models to evaluate landslide hazards on a regional scale in areas prone to rain-induced landslides as it considers both the transient and spatial response of pore pressure in its calculation of slope stability. The stability calculation used in SINMAP and LISA is similar and utilizes probability distribution functions for certain parameters. Unlike SINMAP that only considers soil cohesion, internal friction angle and rainfall-rate distributions, LISA allows the use of distributed data for all parameters, so it is the preferred model to evaluate slope stability over SINMAP. Results from all three models suggested similar soil and hydrologic properties for triggering the landslides that occurred during the 1995 storm in Madison County, Virginia. The colluvium probably had cohesion of less than 2 kPa. The root-soil system is above the failure plane and consequently root strength and tree surcharge had negligible effect on slope stability. The result that the final location of the water table was near the ground surface is supported by the water budget analysis of the rainstorm conducted by Smith et al. (1996).
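
    All three models compared above rest on the infinite-slope limit-equilibrium idea, in which a factor of safety is computed from cohesion, friction angle, slope angle, and pore-water pressure; the sketch below evaluates that classical expression for a range of water-table positions. The soil parameters are illustrative values in the range reported for the colluvium, not the study's calibrated inputs.

        import numpy as np

        # Infinite-slope factor of safety:
        # FS = [c + (gamma*z*cos^2(beta) - u) * tan(phi)] / [gamma*z*sin(beta)*cos(beta)]
        c       = 2000.0           # Pa, cohesion (illustrative, ~2 kPa)
        phi     = np.radians(30)   # friction angle (assumed)
        beta    = np.radians(30)   # slope angle, within the 17-41 degree range observed
        gamma   = 19000.0          # N/m^3, unit weight of colluvium (assumed)
        gamma_w = 9810.0           # N/m^3, unit weight of water
        z       = 1.5              # m, depth of potential failure plane

        for m in [0.0, 0.5, 1.0]:  # fraction of the soil column that is saturated
            u  = gamma_w * m * z * np.cos(beta) ** 2        # pore pressure on the slip plane
            fs = (c + (gamma * z * np.cos(beta) ** 2 - u) * np.tan(phi)) / \
                 (gamma * z * np.sin(beta) * np.cos(beta))
            print(f"water table fraction m={m:.1f}: FS = {fs:.2f}")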

  4. Woody debris

    Science.gov (United States)

    Donna B. Scheungrab; Carl C. Trettin; Russ Lea; Martin F. Jurgensen

    2000-01-01

    Woody debris can be defined as any dead, woody plant material, including logs, branches, standing dead trees, and root wads. Woody debris is an important part of forest and stream ecosystems because it has a role in carbon budgets and nutrient cycling, is a source of energy for aquatic ecosystems, provides habitat for terrestrial and aquatic organisms, and contributes...

  5. Coastal debris analysis in beaches of Chonburi Province, eastern of Thailand as implications for coastal conservation.

    Science.gov (United States)

    Thushari, Gajahin Gamage Nadeeka; Chavanich, Suchana; Yakupitiyage, Amararatne

    2017-03-15

    This study quantified coastal debris along 3 beaches (Angsila, Bangsaen, Samaesarn) on the eastern coast of Thailand. Debris samples were collected from lower and upper strata of these beaches during wet and dry seasons. The results showed that Bangsaen had the highest average debris density (15.5 m⁻²) followed by Samaesarn (8.10 m⁻²), and Angsila (5.54 m⁻²). Among the 12 debris categories, the most abundant debris type was plastics (>45% of the total debris) in all beach locations. Coastal debris distribution was related to economic activities in the vicinity. Fishery and shell-fish aquaculture activities were primary sources of debris in Angsila while tourism activities were the main sources in Bangsaen and Samaesarn. Site-specific pollution control mechanisms (environmental awareness, reuse and recycling) are recommended to reduce public littering. Management actions in Angsila should focus on fishery and shell-fish culture practices, while Bangsaen and Samaesarn should be directed toward leisure activities promoting waste management. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Tornado Damage Assessment: Reconstructing the Wind Through Debris Tracking and Treefall Pattern Analysis

    Science.gov (United States)

    Godfrey, C. M.; Peterson, C. J.; Lombardo, F.

    2017-12-01

    Efforts to enhance the resilience of communities to tornadoes require an understanding of the interconnected nature of debris and damage propagation in both the built and natural environment. A first step toward characterizing the interconnectedness of these elements within a given community involves detailed post-event surveys of tornado damage. Such damage surveys immediately followed the 22 January 2017 EF3 tornadoes in the southern Georgia towns of Nashville and Albany. After assigning EF-scale ratings to impacted structures, the authors geotagged hundreds of pieces of debris scattered around selected residential structures and outbuildings in each neighborhood and paired each piece of debris with its source structure. Detailed information on trees in the vicinity of the structures supplements the debris data, including the species, dimensions, location, fall direction, and level of damage. High-resolution satellite imagery helps to identify the location and fall direction of hundreds of additional forest trees. These debris and treefall patterns allow an estimation of the near-surface wind field using a Rankine vortex model coupled with both a tree stability model and an infrastructure fragility model that simulates debris flight. Comparisons between the modeled damage and the actual treefall and debris field show remarkable similarities for a selected set of vortex parameters, indicating the viability of this approach for estimating enhanced Fujita scale levels, determining the near-surface wind field of a tornado during its passage through a neighborhood, and identifying how debris may contribute to the overall risk from tornadoes.
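
    The wind-field reconstruction described above couples a Rankine vortex with tree and structure fragility models; the vortex part on its own is compact enough to sketch: tangential speed grows linearly inside the core radius and decays as 1/r outside, and a translation speed is added vectorially. The vortex parameters below are placeholders, not values fitted to the Nashville or Albany tracks.

        import numpy as np

        def rankine_wind(x, y, v_max=65.0, r_max=150.0, v_trans=(12.0, 0.0)):
            """Horizontal wind (u, v) of a translating Rankine vortex centred at the origin.
            v_max: peak tangential speed [m/s]; r_max: core radius [m]; v_trans: translation [m/s].
            All parameter values here are illustrative placeholders."""
            r = np.hypot(x, y)
            v_tan = np.where(r <= r_max, v_max * r / r_max, v_max * r_max / np.maximum(r, 1e-9))
            # unit vector tangential to circles around the centre (counter-clockwise rotation)
            u = -v_tan * y / np.maximum(r, 1e-9) + v_trans[0]
            v =  v_tan * x / np.maximum(r, 1e-9) + v_trans[1]
            return u, v

        # Wind speed sampled along a west-east transect 50 m north of the vortex centre
        for x in np.linspace(-400, 400, 9):
            u, v = rankine_wind(x, 50.0)
            print(f"x={x:6.0f} m  speed={np.hypot(u, v):5.1f} m/s")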

  7. Grain-Size Analysis of Debris Flow Alluvial Fans in Panxi Area along Jinsha River, China

    Directory of Open Access Journals (Sweden)

    Wen Zhang

    2015-11-01

    Full Text Available The basic geometric parameters of 236 debris flow catchments were determined by interpreting SPOT5 remote sensing images with a resolution of 2.5 m in a 209 km section along the Jinsha River in the Panxi area, China. A total of 27 large-scale debris flow catchments were selected for detailed in situ investigation. Samples were taken from two profiles in the deposition zone for each debris flow catchment. The φ value gradation method of the grain size was used to obtain 54 histograms with abscissa in a logarithmic scale. Five types of debris flows were summarized from the outline of the histogram. Four grain size parameters were calculated: mean grain size, standard deviation, coefficient of skewness, and coefficient of kurtosis. These four values were used to evaluate the features of the histogram. The grain index that reflects the transport (kinetic energy) information of debris flows was defined to describe the characteristics of the debris-flow materials. Furthermore, a normalized grain index based on the catchment area was proposed to allow evaluation of the debris flow mobility. The characteristics of the debris-flow materials were well-described by the histogram of grain-size distribution and the normalized grain index.
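
    The four grain-size parameters named above (mean, standard deviation, skewness, kurtosis in φ units) can be computed directly from a binned φ distribution by the method of moments; the sketch below does this for a made-up sieve analysis. The weight fractions are invented, and the paper's specific grain-index definition is not reproduced.

        import numpy as np

        # Mid-points of phi classes and weight fractions from a hypothetical sieve analysis
        phi_mid = np.array([-4, -3, -2, -1, 0, 1, 2, 3, 4], dtype=float)
        weight  = np.array([ 5, 10, 18, 22, 18, 12,  8,  5,  2], dtype=float)
        f = weight / weight.sum()

        mean_phi = np.sum(f * phi_mid)                                  # mean grain size (moments)
        sigma    = np.sqrt(np.sum(f * (phi_mid - mean_phi) ** 2))       # sorting (standard deviation)
        skew     = np.sum(f * (phi_mid - mean_phi) ** 3) / sigma ** 3   # coefficient of skewness
        kurt     = np.sum(f * (phi_mid - mean_phi) ** 4) / sigma ** 4   # coefficient of kurtosis

        print(f"mean = {mean_phi:.2f} phi, sorting = {sigma:.2f} phi, "
              f"skewness = {skew:.2f}, kurtosis = {kurt:.2f}")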

  8. Dynamic Analysis of Emotions through Artificial Intelligence

    Directory of Open Access Journals (Sweden)

    Susana Mejía M.

    2016-04-01

    Full Text Available Emotions have been demonstrated to be an important aspect of human intelligence and to play a significant role in human decision-making processes. Emotions are not only feelings but also processes of establishing, maintaining or disrupting the relation between the organism and the environment. In the present paper, several features of social and developmental Psychology are introduced, especially concepts that are related to Theories of Emotions and the Mathematical Tools applied in psychology (i.e., Dynamic Systems and Fuzzy Logic). Later, five models that infer emotions from a single event, in AV-Space, are presented and discussed along with the finding that fuzzy logic can measure human emotional states.

  9. Using Business Analysis Software in a Business Intelligence Course

    Science.gov (United States)

    Elizondo, Juan; Parzinger, Monica J.; Welch, Orion J.

    2011-01-01

    This paper presents an example of a project used in an undergraduate business intelligence class which integrates concepts from statistics, marketing, and information systems disciplines. SAS Enterprise Miner software is used as the foundation for predictive analysis and data mining. The course culminates with a competition and the project is used…

  10. Convergence Analysis of a Class of Computational Intelligence Approaches

    Directory of Open Access Journals (Sweden)

    Junfeng Chen

    2013-01-01

    Full Text Available Computational intelligence approaches form a relatively new interdisciplinary field of research with many promising application areas. Although computational intelligence approaches have gained huge popularity, their convergence is difficult to analyze. In this paper, a computational model is built for a class of computational intelligence approaches represented by the canonical forms of genetic algorithms, ant colony optimization, and particle swarm optimization, in order to describe the common features of these algorithms. Then, two quantification indices, the variation rate and the progress rate, are defined to indicate, respectively, the variety and the optimality of the solution sets generated in the search process of the model. Moreover, four types of probabilistic convergence are given for the solution set updating sequences, and their relations are discussed. Finally, sufficient conditions are derived for the almost sure weak convergence and the almost sure strong convergence of the model by introducing martingale theory into the Markov chain analysis.

  11. Decision Tree and Texture Analysis for Mapping Debris-Covered Glaciers in the Kangchenjunga Area, Eastern Himalaya

    Directory of Open Access Journals (Sweden)

    Adina Racoviteanu

    2012-10-01

    Full Text Available In this study we use visible, short-wave infrared and thermal Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) data validated with high-resolution Quickbird (QB) and Worldview2 (WV2) for mapping debris cover in the eastern Himalaya using two independent approaches: (a) a decision tree algorithm, and (b) texture analysis. The decision tree algorithm was based on multi-spectral and topographic variables, such as band ratios, surface reflectance, kinetic temperature from ASTER bands 10 and 12, slope angle, and elevation. The decision tree algorithm resulted in 64 km2 classified as debris-covered ice, which represents 11% of the glacierized area. Overall, for ten glacier tongues in the Kangchenjunga area, there was an area difference of 16.2 km2 (25%) between the ASTER and the QB areas, with mapping errors mainly due to clouds and shadows. Texture analysis techniques included co-occurrence measures, geostatistics and filtering in spatial/frequency domain. Debris cover had the highest variance of all terrain classes, highest entropy and lowest homogeneity compared to the other classes, for example a mean variance of 15.27 compared to 0 for clouds and 0.06 for clean ice. Results of the texture image for debris-covered areas were comparable with those from the decision tree algorithm, with 8% area difference between the two techniques.
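
    The texture measures cited for separating debris cover from clean ice (variance, entropy, homogeneity) are standard grey-level co-occurrence matrix statistics; the sketch below computes them for synthetic image patches with scikit-image. This is a minimal sketch, and the study's thresholds and reported values (e.g., 15.27 and 0.06) are not reproduced.

        import numpy as np
        from skimage.feature import graycomatrix, graycoprops   # scikit-image >= 0.19 spelling

        rng = np.random.default_rng(7)

        # Synthetic 8-bit patches: "debris" = rough texture, "clean ice" = smooth bright surface
        debris = rng.integers(0, 256, size=(64, 64)).astype(np.uint8)
        ice    = np.clip(rng.normal(220, 3, size=(64, 64)), 0, 255).astype(np.uint8)

        for name, patch in [("debris", debris), ("clean ice", ice)]:
            glcm = graycomatrix(patch, distances=[1], angles=[0], levels=256,
                                symmetric=True, normed=True)
            p = glcm[:, :, 0, 0]
            homogeneity = graycoprops(glcm, "homogeneity")[0, 0]
            entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))
            variance = patch.var()
            print(f"{name:10s} variance={variance:8.1f} entropy={entropy:5.2f} "
                  f"homogeneity={homogeneity:.3f}")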

  12. ENERGY AUDIT ANALYSIS BY BUSINESS INTELLIGENCE APPLICATION

    Directory of Open Access Journals (Sweden)

    Alfa Firdaus

    2015-12-01

    Full Text Available An energy audit is one of the first tasks to be performed in the accomplishment of an effective energy cost control program. To obtain the best information for a successful energy audit, the auditor must make some measurements during the audit visit. One of the tools primarily used in the audit visit is the portable Power Quality Analyzer (PQA) for measuring single- to three-phase lines with a high degree of precision and accuracy. It is utilized for monitoring and recording power supply anomalies. For most survey applications, changing currents make it mandatory for data to be compiled over a period of time, producing an enormous amount of electricity data. Hence, this paper proposes a Business Intelligence approach that can help the auditor to quickly analyze the PQA data. Five Key Performance Indicators (KPIs) are displayed for analysis in the form of a dashboard. The method used to construct the dashboard combines classification and association rules, with the help of the Orange data mining toolkit. The classification method is used to display the data distributions by frequency on a bar chart. Once the frequent item sets are obtained, association rules can be extracted among them, giving statements about how likely two sets of items are to co-occur or to occur conditionally. The result of this paper is a dashboard of five scorecards, namely unbalanced voltage, unbalanced currents, voltage harmonics, current harmonics, and power factor.
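
    The association-rule step described above reduces to simple support, confidence, and lift bookkeeping once the measurements have been discretized into items; the sketch below shows that bookkeeping on a few invented records, without relying on the Orange toolkit. The item names and thresholds are assumptions for illustration.

        from itertools import combinations
        from collections import Counter

        # Invented, already-discretized PQA records (one set of "items" per measurement interval)
        records = [
            {"unbalanced_voltage", "low_power_factor"},
            {"unbalanced_voltage", "current_harmonics", "low_power_factor"},
            {"voltage_harmonics"},
            {"unbalanced_voltage", "low_power_factor"},
            {"current_harmonics", "low_power_factor"},
            {"unbalanced_voltage", "current_harmonics"},
        ]
        n = len(records)

        item_count = Counter(i for r in records for i in r)
        pair_count = Counter(frozenset(p) for r in records for p in combinations(sorted(r), 2))

        # Rules A -> B with support, confidence, and lift
        for pair, cnt in pair_count.items():
            a, b = sorted(pair)
            support = cnt / n
            for ante, cons in [(a, b), (b, a)]:
                confidence = cnt / item_count[ante]
                lift = confidence / (item_count[cons] / n)
                if support >= 0.3 and confidence >= 0.6:
                    print(f"{ante} -> {cons}: support={support:.2f} "
                          f"confidence={confidence:.2f} lift={lift:.2f}")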

  13. Intelligent Data Analysis in the 21st Century

    Science.gov (United States)

    Cohen, Paul; Adams, Niall

    When IDA began, data sets were small and clean, data provenance and management were not significant issues, workflows and grid computing and cloud computing didn’t exist, and the world was not populated with billions of cellphone and computer users. The original conception of intelligent data analysis — automating some of the reasoning of skilled data analysts — has not been updated to account for the dramatic changes in what skilled data analysis means, today. IDA might update its mission to address pressing problems in areas such as climate change, habitat loss, education, and medicine. It might anticipate data analysis opportunities five to ten years out, such as customizing educational trajectories to individual students, and personalizing medical protocols. Such developments will elevate the conference and our community by shifting our focus from arbitrary measures of the performance of isolated algorithms to the practical, societal value of intelligent data analysis systems.

  14. Analysis of an intelligent temperature transmitter

    African Journals Online (AJOL)

    temperature sensors and analyze a typical Rosemount Intelligent Temperature Transmitter (RITT) with a view to identifying and ... material science and communication technologies [2]. ... Some benefits of the 4-20mA transmission standard.

  15. Distributed Collaborative Analysis: A New Approach for Intelligence Analysis

    National Research Council Canada - National Science Library

    Greene, Gus

    2001-01-01

    ... calls for resource reductions by the public. At the same time, the rapid pace of this growth has caused decision makers at all echelons - tactical to strategic - to challenge the Intelligence Community to become more responsive and agile...

  16. Revisiting the Psychology of Intelligence Analysis: From Rational Actors to Adaptive Thinkers

    Science.gov (United States)

    Puvathingal, Bess J.; Hantula, Donald A.

    2012-01-01

    Intelligence analysis is a decision-making process rife with ambiguous, conflicting, irrelevant, important, and excessive information. The U.S. Intelligence Community is primed for psychology to lend its voice to the "analytic transformation" movement aimed at improving the quality of intelligence analysis. Traditional judgment and decision making…

  17. Analysis of debris-flow recordings in an instrumented basin: confirmations and new findings

    Directory of Open Access Journals (Sweden)

    M. Arattano

    2012-03-01

    Full Text Available On 24 August 2006, a debris flow took place in the Moscardo Torrent, a basin of the Eastern Italian Alps instrumented for debris-flow monitoring. The debris flow was recorded by two seismic networks located in the lower part of the basin and on the alluvial fan, respectively. The event was also recorded by a pair of ultrasonic sensors installed on the fan, close to the lower seismic network. The comparison between the different recordings outlines particular features of the August 2006 debris flow, different from those of events recorded in previous years. A typical debris-flow wave was observed at the upper seismic network, with a main front abruptly appearing in the torrent, followed by a gradual decrease of flow height. On the contrary, on the alluvial fan the wave displayed an irregular pattern, with low flow depth and the main peak occurring in the central part of the surge both in the seismic recording and in the hydrographs. Recorded data and field evidence indicate that the surge observed on the alluvial fan was not a debris flow, and probably consisted of a water surge laden with fine to medium-sized sediment. The change in shape and characteristics of the wave can be ascribed to the attenuation of the surge caused by the torrent control works implemented in the lower basin during the last years.

  18. Capability Challenges in the Human Domain for Intelligence Analysis: Report on Community-Wide Discussions with Canadian Intelligence Professionals

    Science.gov (United States)

    2012-03-01

    ...consultation discussed with members of the Canadian intelligence community. This research builds on an earlier study of a small...community); − Conduct social network analysis research of the intelligence community to identify any gaps in collaboration; 10.2.2 - Developing a...DRDC Toronto. Fischhoff, B., & Chauvin, C. (Eds.). (2011). Intelligence Analysis: Behavioral and Social Scientific Foundations (Committee on

  19. Intelligent control of HVAC systems. Part II: perceptron performance analysis

    Directory of Open Access Journals (Sweden)

    Ioan URSU

    2013-09-01

    Full Text Available This is the second part of a paper on intelligent type control of Heating, Ventilating, and Air-Conditioning (HVAC) systems. The whole study proposes a unified approach to the design of intelligent control for such systems, to ensure high energy efficiency and improved air quality. In the first part of the study, a single thermal space HVAC system is considered as the benchmark system, for which a mathematical model of the controlled system and a mathematical model (algorithm) of intelligent control synthesis are assigned. The conception of the intelligent control is of switching type, between a simple neural network, a perceptron, which aims to decrease (optimize) a cost index, and a fuzzy logic component, having a supervisory antisaturating role for the neuro-control. Based on numerical simulations, this Part II focuses on the analysis of system operation in the presence of only the neural control component. Working of the entire neuro-fuzzy system will be reported in a third part of the study.

  20. The use of vapour phase ultra-violet spectroscopy for the analysis of arson accelerants in fire scene debris.

    Science.gov (United States)

    McCurdy, R J; Atwell, T; Cole, M D

    2001-12-01

    A method has been developed for the analysis of arson accelerants in fire scene debris by vapour phase ultra-violet (UV) spectroscopy. The method is rapid, inexpensive, simple to use and is sufficiently sensitive and discriminating to be of use for the analysis of crime scene samples. Application to casework samples is described. On occasion, the method offers additional information to that which can be obtained by gas chromatography-flame ionisation detection (GC-FID) and gas chromatography-mass spectrometry (GC-MS) and represents a useful adjunct to these techniques. In addition, the method offers advantages where the use of GC-MS analysis of arson accelerants in fire scene debris is not a practical proposition.

  1. Numerical models for the analysis of thermal behavior and coolability of a particulate debris bed in reactor lower head

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, Kwang Il; Kim, Sang Baik; Kim, Byung Seok [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2002-04-01

    This report provides three distinctive, but closely related, numerical models developed for the analysis of the thermal behavior and coolability of a particulate debris bed that may be formed inside the reactor lower head during the late phases of a severe accident. The first numerical module presented in the report, MELTPRO-DRY, is used to analyze numerically the heat-up and melting process of the dry particle bed, the downward and sideward relocation of the liquid melt under the gravity and capillary forces acting among porous particles, and the solidification of the liquid melt relocated into colder regions. The second module, MELTPROG-WET, is used to simulate numerically the cooling process of the particulate debris bed in the presence of water, for which two types of numerical models are provided. The first type of WET module utilizes distinct models that parametrically simulate the water cooling process, that is, the quenching region, the dryout region, and the transition region. The choice of each parametric model depends on the temperature gradient between the cooling water and the debris particles. The second type of WET module utilizes a two-phase flow model that mechanistically simulates the cooling process of the debris bed. For a consistent simulation from water cooling to the dried-out debris bed, the aforementioned two modules, MELTPROG-DRY and MELTPROG-WET, were integrated into a single computer program, DBCOOL. Each of the computational models was verified through limited applications to a heat-generating particulate bed contained in a rectangular cavity. 22 refs., 5 figs., 2 tabs. (Author)

  2. Structural analysis consultation using artificial intelligence

    Science.gov (United States)

    Melosh, R. J.; Marcal, P. V.; Berke, L.

    1978-01-01

    The primary goal of consultation is the definition of the best strategy to deal with a structural engineering analysis objective. The knowledge base designed to meet this need identifies the type of numerical analysis, the needed modeling detail, and the specific analysis data required. Decisions are constructed on the basis of the data in the knowledge base (material behavior, relations between geometry and structural behavior, measures of the importance of time and temperature changes) and user-supplied specifics: characteristics of the spectrum of analysis types, the relation between accuracy and model detail, and details of the structure, its mechanical loadings, and its temperature states. Existing software demonstrated the feasibility of the approach, encompassing the 36 analysis classes spanning nonlinear, temperature-affected, incremental analyses which track the behavior of structural systems.

  3. Information Foraging Theory: A Framework for Intelligence Analysis

    Science.gov (United States)

    2014-11-01

    oceanographic information, human intelligence (HUMINT), open-source intelligence (OSINT), and information provided by other governmental departments [1][5]... Acronyms: HUMINT - Human Intelligence; IFT - Information Foraging Theory; LSA - Latent Semantic Similarity; MVT - Marginal Value Theorem; OFT - Optimal Foraging Theory; OSINT - Open-Source Intelligence

  4. Models for dryout in debris beds. Review and application to the analysis of PAHR

    International Nuclear Information System (INIS)

    Yamakoshi, Yoshinori

    2000-03-01

    There are many models for dryout in debris beds, and various conditions under which these models are applicable. For a reliable analysis of post-accident heat removal (PAHR), it is important that the characteristics and applicability of each model be made clear. In this report, the formulation of the models for dryout and their applicability are studied through comparison with experimental data. A new model for dryout prediction is also discussed here. It is difficult to predict the dryout power, especially for a relatively shallow bed, using a conventional model for channeled beds. The new model, which is based on the one-dimensional model derived by Lipinski, has the permeability of channels in the governing equation, and enables us to predict the dryout power for relatively shallow beds. The following conclusions are derived from comparing the predicted dryout power with experimental data. The model for series heat removal is applicable to a packed bed, while DEBRIS-MD underestimates the dryout power for it. Either the original model assuming channel formation on the top of the bed or the modified model is applicable to a relatively deep bed with channels. For a relatively shallow bed with channels, the dryout power predicted by the modified model agrees with the experimental data better than the predictions of the other models. (author)
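
    As a rough illustration of the type of closure such dryout models involve, the sketch below evaluates a commonly quoted turbulent-limit, one-dimensional dryout heat flux for a homogeneous bed, built from an Ergun-type passability of the particle bed. The bed properties and the particular correlation form (including its prefactor) are assumptions for illustration; this is not the report's channel-permeability model.

        import numpy as np

        # Bed and fluid properties (illustrative values for a debris bed under water at 1 bar)
        d     = 1.0e-3        # m, particle diameter (assumed)
        eps   = 0.4           # bed porosity (assumed)
        rho_l = 958.0         # kg/m^3, saturated liquid water
        rho_g = 0.60          # kg/m^3, saturated steam
        h_fg  = 2.257e6       # J/kg, latent heat of vaporization
        g     = 9.81

        # Ergun-type passability of the bed (turbulent resistance length scale)
        eta = eps**3 * d / (1.75 * (1.0 - eps))

        # Turbulent-limit dryout heat flux of the Lipinski type (one common approximate form;
        # the exact prefactor varies between formulations and is an assumption here)
        q_dry = h_fg * np.sqrt(eta * g * rho_g * (rho_l - rho_g)) / (1.0 + (rho_g / rho_l) ** 0.25) ** 2

        print(f"passability eta = {eta:.2e} m, dryout heat flux ~ {q_dry / 1e3:.0f} kW/m^2")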

  5. Integral analysis of debris material and heat transport in reactor vessel lower plenum

    International Nuclear Information System (INIS)

    Suh, K.Y.; Henry, R.E.

    1994-01-01

    An integral, fast-running, two-region model has been developed to characterize the debris material and heat transport in the reactor lower plenum under severe accident conditions. The debris bed is segregated into the oxidic pool and an overlying metallic layer. Debris crusts can develop on three surfaces: the top of the molten pool, the RPV wall, and the internal structures. To account for the decay heat generation, the crust temperature profile is assumed to be parabolic. The oxidic debris pool is homogeneously mixed and has the same material composition, and hence the same thermophysical properties, as the crusts, while the metallic constituents are assumed to rise to the top of the debris pool. Steady-state relationships are used to describe the heat transfer rates, with the assessment of solid or liquid state, and the liquid superheat in the pool being based on the average debris temperature. Natural convection heat transfer from the molten debris pool to the upper, lower and embedded crusts is calculated based on the pool Rayleigh number with the conduction heat transfer from the crusts being determined by the crust temperature profile. The downward heat flux is transferred to the lowest part of the RPV lower head through a crust-to-RPV contact resistance. The sideward heat flux is transferred to the upper regions of the RPV lower head as well as to the internal structures. The upward heat flux goes to the metal layer, water, or available heat sink structures above. Quenching due to water ingression is modeled separately from the energy transfer through the crust. The RPV wall temperature distribution and the primary system pressure are utilized to estimate challenges to the RPV integrity. ((orig.))

  6. Debris-flow risk analysis in a managed torrent based on a stochastic life-cycle performance

    International Nuclear Information System (INIS)

    Ballesteros Cánovas, J.A.; Stoffel, M.; Corona, C.; Schraml, K.; Gobiet, A.; Tani, S.; Sinabell, F.; Fuchs, S.; Kaitna, R.

    2016-01-01

    Two key factors can affect the functional ability of protection structures in mountain torrents, namely (i) the maintenance of existing infrastructure (as a majority of existing works is in the second half of their life cycle), and (ii) changes in debris-flow activity as a result of ongoing and expected future climatic changes. Here, we explore the applicability of a stochastic life-cycle performance to assess debris-flow risk in the heavily managed Wartschenbach torrent (Lienz region, Austria) and to quantify associated, expected economic losses. We do so by considering maintenance costs to restore infrastructure in the aftermath of debris-flow events as well as by assessing the probability of check dam failure (e.g., as a result of overload). Our analysis comprises two different management strategies as well as three scenarios defining future changes in debris-flow activity resulting from climatic changes. At the study site, an average debris-flow frequency of 21 events per decade was observed for the period 1950–2000; activity at the site is projected to change by + 38% to − 33%, according to the climate scenario used. Comparison of the different management alternatives suggests that the current mitigation strategy will allow expected damage to infrastructure and population to be reduced almost fully (by 89%). However, to guarantee a comparable level of safety, maintenance costs are expected to increase by 57–63%, with an increase of about 50% for each intervention. Our analysis therefore also highlights the importance of taking maintenance costs into account for risk assessments realized in managed torrent systems, as these costs result from both progressive and event-related deterioration. We conclude that the stochastic life-cycle performance adopted in this study indeed represents an integrated approach to assess the long-term effects and costs of prevention structures in managed torrents. - Highlights: • Debris flows are considered

  7. Debris-flow risk analysis in a managed torrent based on a stochastic life-cycle performance

    Energy Technology Data Exchange (ETDEWEB)

    Ballesteros Cánovas, J.A., E-mail: juan.ballesteros@dendrolab.ch [Dendrolab.ch. Institute for Geological Sciences, University of Bern, Baltzerstrasse 1 + 3, CH-3012 Bern (Switzerland); Climate Change an Climate Impacts (C3i) Institute for Environmental Sciences, University of Geneva, 66 Boulevard Carl-Vogt, CH-1205 Geneva (Switzerland); Stoffel, M. [Dendrolab.ch. Institute for Geological Sciences, University of Bern, Baltzerstrasse 1 + 3, CH-3012 Bern (Switzerland); Climate Change an Climate Impacts (C3i) Institute for Environmental Sciences, University of Geneva, 66 Boulevard Carl-Vogt, CH-1205 Geneva (Switzerland); Department of Earth Sciences, University of Geneva, 13 rue des Maraîchers, CH-1205 Geneva (Switzerland); Corona, C. [Centre National de la Recherche Scientifique (CNRS) UMR6042 Geolab, 4 rue Ledru, F-63057 Clermont-Ferrand Cedex (France); Schraml, K. [Institute for Alpine Hazards, University of Natural Resources and Life Sciences, Vienna (BOKU), A-1190 Vienna (Austria); Gobiet, A. [University of Graz, Wegener Center for Climate and Global Change (WegCenter), A-8010 Graz (Austria); Central Office for Meteorology and Geodynamics (ZAMG), A-1190 Vienna (Austria); Tani, S. [University of Graz, Wegener Center for Climate and Global Change (WegCenter), A-8010 Graz (Austria); Sinabell, F. [Austrian Institute of Economic Research, A-1030 Vienna (Austria); Fuchs, S.; Kaitna, R. [Institute for Alpine Hazards, University of Natural Resources and Life Sciences, Vienna (BOKU), A-1190 Vienna (Austria)

    2016-07-01

    Two key factors can affect the functional ability of protection structures in mountain torrents, namely (i) the maintenance of existing infrastructure (as a majority of existing works is in the second half of their life cycle), and (ii) changes in debris-flow activity as a result of ongoing and expected future climatic changes. Here, we explore the applicability of a stochastic life-cycle performance to assess debris-flow risk in the heavily managed Wartschenbach torrent (Lienz region, Austria) and to quantify associated, expected economic losses. We do so by considering maintenance costs to restore infrastructure in the aftermath of debris-flow events as well as by assessing the probability of check dam failure (e.g., as a result of overload). Our analysis comprises two different management strategies as well as three scenarios defining future changes in debris-flow activity resulting from climatic changes. At the study site, an average debris-flow frequency of 21 events per decade was observed for the period 1950–2000; activity at the site is projected to change by + 38% to − 33%, according to the climate scenario used. Comparison of the different management alternatives suggests that the current mitigation strategy will allow expected damage to infrastructure and population to be reduced almost fully (by 89%). However, to guarantee a comparable level of safety, maintenance costs are expected to increase by 57–63%, with an increase of about 50% for each intervention. Our analysis therefore also highlights the importance of taking maintenance costs into account for risk assessments realized in managed torrent systems, as these costs result from both progressive and event-related deterioration. We conclude that the stochastic life-cycle performance adopted in this study indeed represents an integrated approach to assess the long-term effects and costs of prevention structures in managed torrents. - Highlights: • Debris flows are considered
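
    The life-cycle logic summarized above (random debris-flow occurrence, possible check-dam failure, event-related maintenance costs) lends itself to a simple Monte Carlo sketch. The frequencies, costs, and failure probability below are invented placeholders, not the Wartschenbach values; only the +38% activity scenario and the roughly 21 events per decade are taken from the abstract.

        import numpy as np

        rng = np.random.default_rng(2016)

        years            = 50          # planning horizon
        base_rate        = 2.1         # debris flows per year (roughly 21 per decade, as observed)
        rate_change      = 0.38        # climate scenario: +38% activity (one of the scenarios cited)
        p_dam_failure    = 0.02        # probability a check dam fails (overload) per event (assumed)
        c_maintenance    = 40_000.0    # EUR, maintenance cost per event (assumed)
        c_failure_damage = 2_000_000.0 # EUR, damage if protection fails (assumed)
        n_sim            = 20_000

        rate = base_rate * (1.0 + rate_change)
        totals = np.zeros(n_sim)
        for i in range(n_sim):
            n_events = rng.poisson(rate * years)
            failures = rng.random(n_events) < p_dam_failure
            totals[i] = n_events * c_maintenance + failures.sum() * c_failure_damage

        print(f"expected life-cycle cost: {totals.mean() / 1e6:.2f} M EUR "
              f"(95th percentile {np.percentile(totals, 95) / 1e6:.2f} M EUR)")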

  8. Intelligent Performance Analysis with a Natural Language Interface

    Science.gov (United States)

    Juuso, Esko K.

    2017-09-01

    Performance improvement is taken as the primary goal in the asset management. Advanced data analysis is needed to efficiently integrate condition monitoring data into the operation and maintenance. Intelligent stress and condition indices have been developed for control and condition monitoring by combining generalized norms with efficient nonlinear scaling. These nonlinear scaling methodologies can also be used to handle performance measures used for management since management oriented indicators can be presented in the same scale as intelligent condition and stress indices. Performance indicators are responses of the process, machine or system to the stress contributions analyzed from process and condition monitoring data. Scaled values are directly used in intelligent temporal analysis to calculate fluctuations and trends. All these methodologies can be used in prognostics and fatigue prediction. The meanings of the variables are beneficial in extracting expert knowledge and representing information in natural language. The idea of dividing the problems into the variable specific meanings and the directions of interactions provides various improvements for performance monitoring and decision making. The integrated temporal analysis and uncertainty processing facilitates the efficient use of domain expertise. Measurements can be monitored with generalized statistical process control (GSPC) based on the same scaling functions.

  9. An Intelligent technical analysis using neural network

    Directory of Open Access Journals (Sweden)

    Reza Raei

    2011-07-01

    Full Text Available Technical analysis has been one of the most popular methods for stock market predictions for the past few decades. Numerous technical analysis methods have been developed to study the behavior of the market for different kinds of trading instruments such as currencies, commodities or stocks. In this paper, we propose two different methods based on the volume adjusted moving average and ease of movement for stock trading. These methods are used with and without generalized regression neural network methods and the results are compared with each other. The preliminary results on the historical stock prices of 20 firms indicate that there is no meaningful difference among the various models proposed in this paper.
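
    The two indicators named above are standard and easy to state explicitly: the ease-of-movement value relates the mid-price change to a volume-scaled box ratio, and a simple volume-weighted moving average stands in here for the volume adjusted moving average. The sketch below computes both on made-up OHLCV data; the volume scaling constant and window are arbitrary choices rather than the paper's settings, and the GRNN stage is omitted.

        import numpy as np
        import pandas as pd

        # Made-up daily OHLCV data
        rng = np.random.default_rng(11)
        n = 30
        close = 100 + np.cumsum(rng.normal(0, 1, n))
        high, low = close + rng.uniform(0.5, 1.5, n), close - rng.uniform(0.5, 1.5, n)
        volume = rng.integers(50_000, 150_000, n).astype(float)
        df = pd.DataFrame({"high": high, "low": low, "close": close, "volume": volume})

        mid = (df["high"] + df["low"]) / 2
        distance = mid.diff()                                   # mid-price move
        box_ratio = (df["volume"] / 1e5) / (df["high"] - df["low"])
        df["emv"] = distance / box_ratio                        # ease of movement (1-period)
        df["emv_14"] = df["emv"].rolling(14).mean()             # smoothed EMV

        # Simple volume-weighted moving average over a 10-day window
        w = 10
        df["vwma_10"] = (df["close"] * df["volume"]).rolling(w).sum() / df["volume"].rolling(w).sum()

        print(df[["close", "emv_14", "vwma_10"]].tail())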

  10. Computational intelligence for big data analysis frontier advances and applications

    CERN Document Server

    Dehuri, Satchidananda; Sanyal, Sugata

    2015-01-01

    The work presented in this book is a combination of theoretical advancements in big data analysis and cloud computing and their potential applications in scientific computing. The theoretical advancements are supported with illustrative examples and their applications in handling real-life problems. The applications are mostly drawn from real-life situations. The book discusses major issues pertaining to big data analysis using computational intelligence techniques and some issues of cloud computing. An elaborate bibliography is provided at the end of each chapter. The material in this book includes concepts, figures, graphs, and tables to guide researchers in the area of big data analysis and cloud computing.

  11. Scientific & Intelligence Exascale Visualization Analysis System

    Energy Technology Data Exchange (ETDEWEB)

    2017-07-14

    SIEVAS provides an immersive visualization framework for connecting multiple systems in real time for data science. SIEVAS provides the ability to connect multiple COTS and GOTS products in a seamless fashion for data fusion, data analysis, and viewing. It provides this capability by using a combination of microservices, real-time messaging, and a web-service-compliant back-end system.

  12. Analysis of dryout behaviour in laterally non-homogeneous debris beds using the MEWA-2D code

    International Nuclear Information System (INIS)

    Rahman, Saidur; Buerger, Manfred; Buck, Michael; Pohlner, Georg; Kulenovic, Rudi; Nayak, Arun Kumar; Sehgal, Bal Raj

    2009-01-01

    The present study analyses the impact of lateral non-homogeneities on the coolability of heated, initially water-filled debris beds. Debris beds which may be formed in a postulated severe accident in light water reactors cannot be expected to have a homogeneous structure. Lateral non-homogeneities arise, for example, already from a variation in height, as in a heap of debris. Internally, less porous or more porous regions may occur; the latter, especially in the form of downcomer-like structures, are considered to favour the supply of water to the bed and thus coolability. In previous work it has been shown that such non-homogeneities often strongly enhance coolability, as compared to earlier investigations on laterally homogeneous beds. The present contribution aims at extending this view by analysing further cases of non-homogeneities with the MEWA-2D code. In particular, effects of capillary forces are considered, in contrast to earlier analyses. Part of the paper deals with specific experiments performed in the POMECO facility at KTH in which a laterally stratified debris bed was considered, with a strong jump of porosity, from 0.26 to 0.38. Astonishingly, under top as well as bottom flooding, dryout in these experiments occurred first in the lateral layer with higher porosity. Understanding is now provided by the effect of capillary forces: water is drawn from this layer to the less porous one. This effect improves the cooling in the less porous layer while it reduces the coolability of the more porous layer. Contrary to expectations, no real loop behaviour, with inflow via the higher-porosity region and subsequent upflow in the less porous layer, is established here. Other cases (different lateral heating in an otherwise homogeneous bed, a closed downcomer in a homogeneous bed, and heap-like debris) show, on the other hand, strongly improved coolability through such loops, which establish themselves due to the lateral differences in void and the corresponding pressure differences.
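
    The explanation offered above, water being drawn from the 38%-porosity layer into the 26%-porosity layer by capillarity, follows from the usual Leverett-type scaling in which capillary pressure grows with the square root of porosity over permeability. The sketch below compares that scale for the two layers using a Carman-Kozeny permeability; the particle size, surface tension, and contact angle are assumed values, and this is only the scaling argument, not the MEWA-2D closure.

        import numpy as np

        sigma = 0.059           # N/m, water surface tension near saturation (assumed)
        theta = np.radians(40)  # contact angle on the particles (assumed)
        d_p   = 1.5e-3          # m, particle diameter (assumed)

        def carman_kozeny_K(eps, d):
            # Carman-Kozeny permeability of a packed bed
            return eps**3 * d**2 / (180.0 * (1.0 - eps) ** 2)

        for label, eps in [("less porous layer", 0.26), ("more porous layer", 0.38)]:
            K = carman_kozeny_K(eps, d_p)
            pc_scale = sigma * np.cos(theta) * np.sqrt(eps / K)   # Leverett capillary-pressure scale
            print(f"{label}: K = {K:.2e} m^2, capillary pressure scale ~ {pc_scale:.0f} Pa")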

  13. Monitoring of Wind Turbine Gearbox Condition through Oil and Wear Debris Analysis: A Full-Scale Testing Perspective

    Energy Technology Data Exchange (ETDEWEB)

    Sheng, Shuangwen

    2016-10-01

    Despite the wind industry's dramatic development during the past decade, it is still challenged by premature turbine subsystem/component failures, especially for turbines rated above 1 MW. Because a crane is needed for each replacement, gearboxes have been a focal point for improvement in reliability and availability. Condition monitoring (CM) is a technique that can help improve these factors, leading to reduced turbine operation and maintenance costs and, subsequently, lower cost of energy for wind power. Although the technical benefits of CM for the wind industry are generally recognized, there is a lack of published information on the advantages and limitations of each CM technique confirmed by objective data from full-scale tests. This article presents first-hand oil and wear debris analysis results obtained through tests based on full-scale wind turbine gearboxes rated at 750 kW. The tests were conducted at the 2.5-MW dynamometer test facility at the National Wind Technology Center at the National Renewable Energy Laboratory. The gearboxes were tested in three conditions: run-in, healthy, and damaged. The investigated CM techniques include real-time oil condition and wear debris monitoring using both inline and online sensors, and offline oil sample and wear debris analysis at both onsite and offsite laboratories. The reported results and observations help increase wind industry awareness of the benefits and limitations of oil and debris analysis technologies and highlight the challenges in these technologies and other tribological fields for the Society of Tribologists and Lubrication Engineers and other organizations to help address, leading to extended gearbox service life.
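
    As a simple illustration of how online wear-debris counts might be trended to flag gearbox damage, the sketch below fits a particle-generation rate over a sliding window and raises an alarm when that rate exceeds a limit. The window length, rate limit, and synthetic data are illustrative assumptions, not values from the NREL tests.

```python
import numpy as np

def flag_wear_acceleration(hours, particle_counts, window=20, rate_limit=5.0):
    """Flag time points where the fitted particle-generation rate
    (counts per hour over a sliding window) exceeds rate_limit.
    hours and particle_counts are equal-length 1-D arrays of cumulative counts.
    The rate limit is illustrative, not an industry alarm level."""
    hours = np.asarray(hours, dtype=float)
    counts = np.asarray(particle_counts, dtype=float)
    alarms = []
    for i in range(window, len(hours)):
        h = hours[i - window:i]
        c = counts[i - window:i]
        slope, _ = np.polyfit(h, c, 1)  # counts per hour in this window
        if slope > rate_limit:
            alarms.append((hours[i], slope))
    return alarms

# Synthetic data: slow baseline wear, then accelerated wear after 300 hours.
t = np.arange(0, 500, 5.0)
cum_counts = np.where(t < 300, 0.5 * t, 0.5 * 300 + 8.0 * (t - 300))
print(flag_wear_acceleration(t, cum_counts)[:3])
```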

  14. Applications Of Artificial Intelligence In Control System Analysis And Design

    Science.gov (United States)

    Birdwell, J. D.

    1987-10-01

    To date, applications of artificial intelligence in control system analysis and design are primarily associated with the design process. These applications take the form of knowledge bases incorporating expertise on a design method, such as multivariable linear controller design, or on a field such as identification. My experience has demonstrated that, while such expert systems are useful, perhaps a greater benefit will come from applications in the maintenance of technical databases, as are found in real-time data acquisition systems, and of modeling and design databases, which represent the status of a computer-aided design process for a human user. This reflects the observation that computers are best at maintaining relations about large sets of objects, whereas humans are best at maintaining knowledge of depth, as occurs when a design option involving a sequence of steps is explored. This paper will discuss some of these issues, and will provide some examples which illustrate the potential of artificial intelligence.

  15. THE SPITZER INFRARED SPECTROGRAPH DEBRIS DISK CATALOG. I. CONTINUUM ANALYSIS OF UNRESOLVED TARGETS

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Christine H. [Space Telescope Science Institute, 3700 San Martin Drive, Baltimore, MD 21218 (United States); Mittal, Tushar [Department of Earth and Planetary Science, University of California Berkeley, Berkeley, CA 94720-4767 (United States); Kuchner, Marc [NASA Goddard Space Flight Center, Exoplanets and Stellar Astrophysics Laboratory, Code 667, Greenbelt, MD 20771 (United States); Forrest, William J.; Watson, Dan M. [Department of Physics and Astronomy, University of Rochester, Rochester, NY 14627 (United States); Lisse, Carey M. [Johns Hopkins University Applied Physics Laboratory, 11100 Johns Hopkins Road, Laurel, MD 20723 (United States); Manoj, P. [Tata Institute of Fundamental Research, Homi Bhabha Road, Mumbai 400 005 (India); Sargent, Benjamin A., E-mail: cchen@stsci.edu [Center for Imaging Science and Laboratory for Multiwavelength Astrophysics, Rochester Institute of Technology, 54 Lomb Memorial Drive, Rochester, NY 14623 (United States)

    2014-04-01

    During the Spitzer Space Telescope cryogenic mission, Guaranteed Time Observers, Legacy Teams, and General Observers obtained Infrared Spectrograph (IRS) observations of hundreds of debris disk candidates. We calibrated the spectra of 571 candidates, including 64 new IRAS and Multiband Imaging Photometer for Spitzer (MIPS) debris disk candidates, modeled their stellar photospheres, and produced a catalog of excess spectra for unresolved debris disks. For 499 targets with IRS excess but without strong spectral features (and a subset of 420 targets with additional MIPS 70 μm observations), we modeled the IRS (and MIPS data) assuming that the dust thermal emission was well-described using either a one- or two-temperature blackbody model. We calculated the probability for each model and computed the average probability to select among models. We found that the spectral energy distributions for the majority of objects (∼66%) were better described using a two-temperature model with warm (T_gr ∼ 100-500 K) and cold (T_gr ∼ 50-150 K) dust populations analogous to zodiacal and Kuiper Belt dust, suggesting that planetary systems are common in debris disks and zodiacal dust is common around host stars with ages up to ∼1 Gyr. We found that younger stars generally have disks with larger fractional infrared luminosities and higher grain temperatures and that higher-mass stars have disks with higher grain temperatures. We show that the increasing distance of dust around debris disks is inconsistent with self-stirred disk models, expected if these systems possess planets at 30-150 AU. Finally, we illustrate how observations of debris disks may be used to constrain the radial dependence of material in the minimum mass solar nebula.
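
    To make the continuum modeling concrete, the sketch below fits a two-temperature blackbody excess model to a synthetic infrared excess spectrum with scipy. The wavelength grid, amplitudes, and noise level are invented for the example; this is not the catalog's calibration or model-selection machinery, which also weighs one- against two-temperature models by their probabilities.

```python
import numpy as np
from scipy.optimize import curve_fit

h, c, k = 6.626e-34, 2.998e8, 1.381e-23  # SI constants

def planck_lambda(lam_um, T):
    """Blackbody spectral radiance B_lambda(T) at wavelength lam_um (microns)."""
    lam = lam_um * 1e-6
    return (2 * h * c**2 / lam**5) / np.expm1(h * c / (lam * k * T))

def two_temp_excess(lam_um, a_warm, T_warm, a_cold, T_cold):
    """Excess flux modeled as the sum of two scaled blackbodies (arbitrary units)."""
    return a_warm * planck_lambda(lam_um, T_warm) + a_cold * planck_lambda(lam_um, T_cold)

# Synthetic IRS-like excess spectrum between 10 and 70 microns with 5% noise.
lam = np.linspace(10.0, 70.0, 40)
rng = np.random.default_rng(0)
obs = two_temp_excess(lam, 1.0, 300.0, 5.0, 80.0) * (1 + 0.05 * rng.standard_normal(lam.size))

popt, _ = curve_fit(two_temp_excess, lam, obs, p0=[1.0, 250.0, 5.0, 100.0], maxfev=20000)
print("fitted grain temperatures (K): warm ~%.0f, cold ~%.0f" % (popt[1], popt[3]))
```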

  16. DebriSat Project Update and Planning

    Science.gov (United States)

    Sorge, M.; Krisko, P. H.

    2016-01-01

    DebriSat Reporting Topics: DebriSat Fragment Analysis Calendar; Near-term Fragment Extraction Strategy; Fragment Characterization and Database; HVI (High-Velocity Impact) Considerations; Requirements Document.

  17. 3rd Euro-China Conference on Intelligent Data Analysis and Applications

    CERN Document Server

    Snášel, Václav; Sung, Tien-Wen; Wang, Xiao

    2017-01-01

    This book gathers papers presented at the ECC 2016, the Third Euro-China Conference on Intelligent Data Analysis and Applications, which was held in Fuzhou City, China from November 7 to 9, 2016. The aim of the ECC is to provide an internationally respected forum for scientific research in the broad areas of intelligent data analysis, computational intelligence, signal processing, and all associated applications of artificial intelligence (AI). The third installment of the ECC was jointly organized by Fujian University of Technology, China, and VSB-Technical University of Ostrava, Czech Republic. The conference was co-sponsored by Taiwan Association for Web Intelligence Consortium, and Immersion Co., Ltd.

  18. Streak detection and analysis pipeline for space-debris optical images

    Science.gov (United States)

    Virtanen, Jenni; Poikonen, Jonne; Säntti, Tero; Komulainen, Tuomo; Torppa, Johanna; Granvik, Mikael; Muinonen, Karri; Pentikäinen, Hanna; Martikainen, Julia; Näränen, Jyri; Lehti, Jussi; Flohrer, Tim

    2016-04-01

    We describe a novel data-processing and analysis pipeline for optical observations of moving objects, either of natural (asteroids, meteors) or artificial origin (satellites, space debris). The monitoring of the space object populations requires reliable acquisition of observational data, to support the development and validation of population models and to build and maintain catalogues of orbital elements. The orbital catalogues are, in turn, needed for the assessment of close approaches (for asteroids, with the Earth; for satellites, with each other) and for the support of contingency situations or launches. For both types of populations, there is also increasing interest to detect fainter objects corresponding to the small end of the size distribution. The ESA-funded StreakDet (streak detection and astrometric reduction) activity has aimed at formulating and discussing suitable approaches for the detection and astrometric reduction of object trails, or streaks, in optical observations. Our two main focuses are objects in lower altitudes and space-based observations (i.e., high angular velocities), resulting in long (potentially curved) and faint streaks in the optical images. In particular, we concentrate on single-image (as compared to consecutive frames of the same field) and low-SNR detection of objects. Particular attention has been paid to the process of extraction of all necessary information from one image (segmentation), and subsequently, to efficient reduction of the extracted data (classification). We have developed an automated streak detection and processing pipeline and demonstrated its performance with an extensive database of semisynthetic images simulating streak observations both from ground-based and space-based observing platforms. The average processing time per image is about 13 s for a typical 2k-by-2k image. For long streaks (length >100 pixels), primary targets of the pipeline, the detection sensitivity (true positives) is about 90% for
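
    As a rough, generic illustration of single-image streak extraction (not the StreakDet segmentation and classification algorithms themselves), the sketch below thresholds pixels a few sigma above the sky background and then extracts long line segments with OpenCV's probabilistic Hough transform. The thresholds and the synthetic frame are assumptions made for the example.

```python
import cv2
import numpy as np

def detect_streaks(image, min_length=100, snr_sigma=3.0):
    """Very simplified single-image streak finder: threshold pixels a few
    sigma above the sky background, then extract long line segments with a
    probabilistic Hough transform. Illustrates the general idea only."""
    img = np.asarray(image, dtype=np.float32)
    background, noise = np.median(img), np.std(img)
    mask = ((img > background + snr_sigma * noise) * 255).astype(np.uint8)
    segments = cv2.HoughLinesP(mask, rho=1, theta=np.pi / 180.0, threshold=50,
                               minLineLength=min_length, maxLineGap=5)
    return [] if segments is None else [tuple(seg[0]) for seg in segments]

# Synthetic 2k-by-2k frame: Gaussian sky noise plus one faint diagonal streak.
rng = np.random.default_rng(1)
frame = rng.normal(100.0, 5.0, (2048, 2048))
rows = np.linspace(300, 650, 600).astype(int)
cols = np.linspace(200, 600, 600).astype(int)
frame[rows, cols] = frame[rows + 1, cols] = 140.0  # ~2 px wide streak
print(detect_streaks(frame)[:3])  # candidate endpoints (x1, y1, x2, y2)
```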

  19. Project X: competitive intelligence data mining and analysis

    Science.gov (United States)

    Gilmore, John F.; Pagels, Michael A.; Palk, Justin

    2001-03-01

    Competitive Intelligence (CI) is a systematic and ethical program for gathering and analyzing information about your competitors' activities and general business trends to further your own company's goals. CI allows companies to gather extensive information on their competitors and to analyze what the competition is doing in order to maintain or gain a competitive edge. In commercial business this potentially translates into millions of dollars in annual savings or losses. The Internet provides an overwhelming portal of information for CI analysis. The problem is how a company can automate the translation of voluminous information into valuable and actionable knowledge. This paper describes Project X, an agent-based data mining system specifically developed for extracting and analyzing competitive information from the Internet. Project X gathers CI information from a variety of sources including online newspapers, corporate websites, industry sector reporting sites, speech archiving sites, video news casts, stock news sites, weather sites, and rumor sites. It uses individual industry specific (e.g., pharmaceutical, financial, aerospace, etc.) commercial sector ontologies to form the knowledge filtering and discovery structures/content required to filter and identify valuable competitive knowledge. Project X is described in detail and an example competitive intelligence case is shown demonstrating the system's performance and utility for business intelligence.

  20. BUSINESS INTELLIGENCE TOOLS FOR DATA ANALYSIS AND DECISION MAKING

    Directory of Open Access Journals (Sweden)

    DEJAN ZDRAVESKI

    2011-04-01

    Full Text Available Every business is dynamic in nature and is affected by various external and internal factors. These factors include external market conditions, competitors, internal restructuring and re-alignment, operational optimization and paradigm shifts in the business itself. New regulations and restrictions, in combination with the above factors, contribute to the constant evolutionary nature of compelling, business-critical information; the kind of information that an organization needs to sustain and thrive. Business intelligence ("BI") is a broad term that encapsulates the process of gathering information pertaining to a business and the market it functions in. This information, when collated and analyzed in the right manner, can provide vital insights into the business and can be a tool to improve efficiency, reduce costs, reduce time lags and bring many positive changes. A business intelligence application helps to achieve precisely that. Successful organizations maximize the use of their data assets through business intelligence technology. The first data warehousing and decision support tools introduced companies to the power and benefits of accessing and analyzing their corporate data. Business users at every level found new, more sophisticated ways to analyze and report on the information mined from their vast data warehouses. Choosing a Business Intelligence offering is an important decision for an enterprise, one that will have a significant impact throughout the enterprise. The choice of a BI offering will affect people up and down the chain of command (senior management, analysts, and line managers) and across functional areas (sales, finance, and operations). It will affect business users, application developers, and IT professionals. BI applications include the activities of decision support systems (DSS), query and reporting, online analytical processing (OLAP), statistical analysis, forecasting, and data mining. Another way of phrasing this is

  1. Analysis of Damage in Laminated Architectural Glazing Subjected to Wind Loading and Windborne Debris Impact

    Directory of Open Access Journals (Sweden)

    Daniel S. Stutts

    2013-05-01

    Full Text Available Wind loading and windborne debris (missile) impact are the two primary mechanisms that result in window glazing damage during hurricanes. Windborne debris is categorized into two types: small hard missiles, such as roof gravel, and large soft missiles, representing lumber from wood-framed buildings. Laminated architectural glazing (LAG) may be used in buildings where impact resistance is needed. The glass plies in LAG undergo internal damage before total failure. The bulk of the published work on this topic either deals with the stress and dynamic analyses of undamaged LAG or the total failure of LAG. The pre-failure damage response of LAG due to the combination of wind loading and windborne debris impact is studied. A continuum damage mechanics (CDM) based constitutive model is developed and implemented via an axisymmetric finite element code to study the failure and damage behavior of laminated architectural glazing subjected to combined loading of wind and windborne debris impact. The effect of geometric and material properties on the damage pattern is studied parametrically.
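
    As a generic illustration of how a scalar continuum-damage law degrades stiffness (not the constitutive model developed in the paper), the sketch below uses a textbook linear-softening damage function; the threshold and failure strains and the glass-like modulus are assumed values.

```python
import numpy as np

def linear_softening_damage(strain, eps0=0.001, eps_f=0.005):
    """Scalar damage D for an isotropic CDM law with linear softening:
    undamaged below eps0, fully damaged at eps_f. A generic textbook form,
    not the model calibrated in the paper."""
    strain = np.asarray(strain, dtype=float)
    s = np.maximum(strain, eps0)  # avoids division by zero below the threshold
    D = np.where(strain <= eps0, 0.0,
                 eps_f * (s - eps0) / (s * (eps_f - eps0)))
    return np.clip(D, 0.0, 1.0)

def nominal_stress(strain, E=70e9, **kwargs):
    """Nominal stress (Pa) for a glass-like modulus E with damage applied."""
    return (1.0 - linear_softening_damage(strain, **kwargs)) * E * np.asarray(strain)

strains = np.linspace(0.0, 0.006, 7)
print(np.round(nominal_stress(strains) / 1e6, 1))  # stress in MPa
```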

  2. Review on the NEI Methodology of Debris Transport Analysis in Sump Blockage Issue for APR1400

    International Nuclear Information System (INIS)

    Kim, Jong Uk; Lee, Jeong Ik; Hong, Soon Joon; Lee, Byung Chul; Bang, Young Seok

    2007-01-01

    Since the USNRC (United States Nuclear Regulatory Commission) initially addressed post-accident sump performance under Unresolved Safety Issue USI A-43, the sump blockage issue has gone through GSI-191, Regulatory Guide 1.82, Rev. 3 (RG 1.82 Rev. 3), and Generic Letter 2004-02 for PWRs (Pressurized Water Reactors). In response to these USNRC activities, NEI 04-07 was issued in order to evaluate the post-accident performance of a plant's recirculation sump. The baseline methodology of NEI 04-07 is composed of break selection, debris generation, latent debris, debris transport, and head loss. In the analytical refinement of NEI 04-07, computational fluid dynamics (CFD) is suggested for the evaluation of debris transport in the emergency core cooling (ECC) recirculation mode, as guided by RG 1.82 Rev. 3. The Korean nuclear industry also keeps pace with international activities on this safety issue, with the Kori 1 plant at the leading edge. The Korean nuclear industry has also been pursuing the development of an advanced PWR, the APR1400, which incorporates several improved safety features. One of the key features, considering the sump blockage issue, is the adoption of the IRWST (In-containment Refueling Water Storage Tank). This device, as the acronym implies, changes the emergency core cooling water injection pattern. This fact prompts a review of the applicability of the NEI 04-07 methodology. In this paper we discuss the applicability of the NEI 04-07 methodology and, moreover, propose a new methodology. Finally, a preliminary debris transport analysis is presented.

  3. Debris flow analysis with a one dimensional dynamic run-out model that incorporates entrained material

    Science.gov (United States)

    Luna, Byron Quan; Remaître, Alexandre; van Asch, Theo; Malet, Jean-Philippe; van Westen, Cees

    2010-05-01

    Estimating the magnitude and the intensity of rapid landslides like debris flows is fundamental to quantitatively evaluating the hazard at a specific location. Intensity varies through the travelled course of the flow and can be described by physical features such as deposited volume, velocities, height of the flow, impact forces and pressures. Dynamic run-out models are able to characterize the distribution of the material, its intensity and define the zone where the elements will experience an impact. These models can provide valuable inputs for vulnerability and risk calculations. However, most dynamic run-out models assume a constant volume during the motion of the flow, ignoring the important role of material entrained along its path. Consequently, they neglect that the increase of volume enhances the mobility of the flow and can significantly influence the size of the potential impact area. An appropriate erosion mechanism needs to be established in the analyses of debris flows to improve the results of dynamic modeling and consequently the quantitative evaluation of risk. The objective is to present and test a simple 1D debris flow model with a material entrainment concept based on limit equilibrium considerations and the generation of excess pore water pressure through undrained loading of the in situ bed material. The debris flow propagation model is based on a one-dimensional finite difference solution of a depth-averaged form of the Navier-Stokes equations of fluid motion. The flow is treated as a laminar one-phase material, whose behavior is controlled by a visco-plastic Coulomb-Bingham rheology. The model parameters are evaluated and the model performance is tested on a debris flow event that occurred in 2003 in the Faucon torrent (Southern French Alps).

  4. Composable Analytic Systems for next-generation intelligence analysis

    Science.gov (United States)

    DiBona, Phil; Llinas, James; Barry, Kevin

    2015-05-01

    Lockheed Martin Advanced Technology Laboratories (LM ATL) is collaborating with Professor James Llinas, Ph.D., of the Center for Multisource Information Fusion at the University at Buffalo (State of NY), researching concepts for a mixed-initiative associate system for intelligence analysts to facilitate reduced analysis and decision times while proactively discovering and presenting relevant information based on the analyst's needs, current tasks and cognitive state. Today's exploitation and analysis systems have largely been designed for a specific sensor, data type, and operational context, leading to difficulty in directly supporting the analyst's evolving tasking and work product development preferences across complex Operational Environments. Our interactions with analysts illuminate the need to impact the information fusion, exploitation, and analysis capabilities in a variety of ways, including understanding data options, algorithm composition, hypothesis validation, and work product development. Composable Analytic Systems, an analyst-driven system that increases flexibility and capability to effectively utilize Multi-INT fusion and analytics tailored to the analyst's mission needs, holds promise to address current and future intelligence analysis needs as US forces engage threats in contested and denied environments.

  5. Analysis of Thermal Comfort in an Intelligent Building

    Science.gov (United States)

    Majewski, Grzegorz; Telejko, Marek; Orman, Łukasz J.

    2017-06-01

    Analysis of thermal comfort in the ENERGIS Building, an intelligent building on the campus of the Kielce University of Technology, Poland, is the focus of this paper. For this purpose, air temperature, air relative humidity, air flow rate and carbon dioxide concentration were measured, and the mean radiant temperature was determined. Thermal sensations of the students occupying the rooms of the building were evaluated with the use of a questionnaire. The students used a seven-point scale of thermal comfort. The microclimate measurement results were used to determine the Predicted Mean Vote and the Predicted Percentage Dissatisfied indices.
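
    For reference, the two indices mentioned above are linked by a standard relation: once the Predicted Mean Vote (PMV) has been computed from the measured microclimate parameters, the Predicted Percentage Dissatisfied (PPD) follows directly. The sketch below shows only that ISO 7730 relation; it does not reproduce the paper's measurements or the full PMV calculation.

```python
import math

def ppd_from_pmv(pmv):
    """Predicted Percentage Dissatisfied (%) as a function of the
    Predicted Mean Vote, per the standard ISO 7730 relation."""
    return 100.0 - 95.0 * math.exp(-0.03353 * pmv**4 - 0.2179 * pmv**2)

# A PMV of 0 (neutral) still leaves ~5% of occupants dissatisfied;
# |PMV| <= 0.5 is a common comfort target (PPD of roughly 10% or less).
for pmv in (-1.0, -0.5, 0.0, 0.5, 1.0):
    print(f"PMV {pmv:+.1f} -> PPD {ppd_from_pmv(pmv):.1f}%")
```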

  6. A meta-analysis of emotional intelligence and work attitudes

    OpenAIRE

    Miao, Chao; Humphrey, Ronald; Qian, Shanshan

    2017-01-01

    Our meta-analysis of emotional intelligence (EI) demonstrates that: First, all three types of EI are significantly related with job satisfaction (ability EI: ρ̂ = .08; self-report EI: ρ̂ = .32; and mixed EI: ρ̂ = .39). Second, both self-report EI and mixed EI exhibit modest yet statistically significant incremental validity (ΔR² = .03 for self-report EI and ΔR² = .06 for mixed EI) and large relative importance (31.3% for self-report EI and 42.8% for mixed EI) in the presence of cognitive a...

  7. Intelligent Data Analysis in the EMERCOM Information System

    Science.gov (United States)

    Elena, Sharafutdinova; Tatiana, Avdeenko; Bakaev, Maxim

    2017-01-01

    The paper describes an information system development project for the Russian Ministry of Emergency Situations (MES, whose international operations body is known as EMERCOM), which was attended by representatives of both the IT industry and academia. Besides the general description of the system, we put forward OLAP and Data Mining-based approaches towards the intelligent analysis of the data accumulated in the database. In particular, some operational OLAP reports and an example of a multi-dimensional information space based on an OLAP Data Warehouse are presented. Finally, we outline a Data Mining application to support decision-making regarding security inspection planning and the consideration of inspection results.
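
    As a toy illustration of the OLAP-style, multi-dimensional aggregation mentioned above (not the actual EMERCOM data model), the sketch below builds a small cube over hypothetical incident records with pandas; all field names and values are invented.

```python
import pandas as pd

# Hypothetical incident records; the field names are illustrative only.
incidents = pd.DataFrame({
    "region":   ["North", "North", "South", "South", "South", "East"],
    "year":     [2015, 2016, 2015, 2016, 2016, 2016],
    "category": ["fire", "flood", "fire", "fire", "flood", "fire"],
    "losses":   [1.2, 3.4, 0.8, 2.1, 5.6, 0.4],
})

# OLAP-style cube slice: total losses by region x year, split per category.
cube = pd.pivot_table(incidents, values="losses",
                      index=["region", "year"], columns="category",
                      aggfunc="sum", fill_value=0.0, margins=True)
print(cube)
```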

  8. Event Sequence Analysis of the Air Intelligence Agency Information Operations Center Flight Operations

    National Research Council Canada - National Science Library

    Larsen, Glen

    1998-01-01

    This report applies Event Sequence Analysis, methodology adapted from aircraft mishap investigation, to an investigation of the performance of the Air Intelligence Agency's Information Operations Center (IOC...

  9. Synthetic-Creative Intelligence and Psychometric Intelligence: Analysis of the Threshold Theory and Creative Process

    Science.gov (United States)

    Ferrando, Mercedes; Soto, Gloria; Prieto, Lola; Sáinz, Marta; Ferrándiz, Carmen

    2016-01-01

    There has been an increasing body of research to uncover the relationship between creativity and intelligence. This relationship usually has been examined using traditional measures of intelligence and seldom using new approaches (i.e. Ferrando et al. 2005). In this work, creativity is measured by tools developed based on Sternberg's successful…

  10. A review of intelligent systems for heart sound signal analysis.

    Science.gov (United States)

    Nabih-Ali, Mohammed; El-Dahshan, El-Sayed A; Yahia, Ashraf S

    2017-10-01

    Intelligent computer-aided diagnosis (CAD) systems can enhance the diagnostic capabilities of physicians and reduce the time required for accurate diagnosis. CAD systems could provide physicians with a suggestion about the diagnosis of heart diseases. The objective of this paper is to review recently published preprocessing, feature extraction and classification techniques and the state of the art of phonocardiogram (PCG) signal analysis. The published literature reviewed in this paper shows the potential of machine learning techniques as a design tool in PCG CAD systems and reveals that CAD systems for PCG signal analysis are still an open problem. Related studies are compared in terms of their datasets, feature extraction techniques and the classifiers they used. Current achievements and limitations in developing CAD systems for PCG signal analysis using machine learning techniques are presented and discussed. In the light of this review, a number of future research directions for PCG signal analysis are provided.
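
    To make the generic workflow concrete (preprocessing, feature extraction, classification), the sketch below extracts a few simple time-domain statistics from synthetic recordings and cross-validates an SVM with scikit-learn. The features, data, and classifier choice are illustrative assumptions, not any specific published PCG CAD system.

```python
import numpy as np
from scipy.stats import kurtosis, skew
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def pcg_features(signal):
    """Toy time-domain features for one heart-sound recording."""
    return [np.mean(signal), np.std(signal), skew(signal),
            kurtosis(signal), np.sqrt(np.mean(signal**2))]

# Synthetic stand-in data: rows are recordings, labels 0 = normal, 1 = abnormal.
rng = np.random.default_rng(0)
normal = [rng.normal(0, 1.0, 2000) for _ in range(40)]
abnormal = [rng.normal(0, 1.6, 2000) for _ in range(40)]
X = np.array([pcg_features(s) for s in normal + abnormal])
y = np.array([0] * 40 + [1] * 40)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```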

  11. Sentiment analysis and ontology engineering an environment of computational intelligence

    CERN Document Server

    Chen, Shyi-Ming

    2016-01-01

    This edited volume provides the reader with a fully updated, in-depth treatise on the emerging principles, conceptual underpinnings, algorithms and practice of Computational Intelligence in the realization of concepts and implementation of models of sentiment analysis and ontology-oriented engineering. The volume involves studies devoted to key issues of sentiment analysis, sentiment models, and ontology engineering. The book is structured into three main parts. The first part offers a comprehensive and prudently structured exposure to the fundamentals of sentiment analysis and natural language processing. The second part consists of studies devoted to the concepts, methodologies, and algorithmic developments elaborating on fuzzy linguistic aggregation to emotion analysis, carrying out interpretability of computational sentiment models, emotion classification, sentiment-oriented information retrieval, a methodology of adaptive dynamics in knowledge acquisition. The third part includes a plethora of applica...

  12. Transient core-debris bed heat-removal experiments and analysis

    International Nuclear Information System (INIS)

    Ginsberg, T.; Klein, J.; Klages, J.; Schwarz, C.E.; Chen, J.C.

    1982-08-01

    An experimental investigation is reported of the thermal interaction between superheated core debris and water during postulated light-water reactor degraded core accidents. Data are presented for the heat transfer characteristics of packed beds of 3 mm spheres which are cooled by overlying pools of water. Results of transient bed temperature and steam flow rate measurements are presented for bed heights in the range 218 mm-433 mm and initial particle bed temperatures between 530 K and 972 K. The results display a two-part sequential quench process: initial frontal cooling leaves pockets or channels of unquenched spheres. The data suggest that the heat transfer process is limited by a mechanism of countercurrent two-phase flow. An analytical model which combines a bed energy equation with either a quasisteady version of the Lipinski debris bed model or a critical heat flux model predicts the characteristic features of the bed quench process reasonably well. Implications with respect to reactor safety are discussed.
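
    As a back-of-envelope illustration of coupling a bed energy balance to a heat-removal limit (not the report's actual Lipinski-based model), the sketch below estimates the time to quench a particle bed when heat removal is capped at a user-supplied dryout heat flux; the property values are generic assumptions rather than data from these experiments.

```python
def quench_time_estimate(bed_height_m, bed_temp_K, sat_temp_K,
                         dryout_flux_W_m2, porosity=0.4,
                         rho_particle=8000.0, cp_particle=500.0):
    """Time (s) to remove the sensible heat stored in a particle bed of unit
    cross-section, assuming heat removal is capped at a counter-current-flow-
    limited dryout heat flux supplied by the caller. Property values are
    illustrative (steel-like spheres), not measured debris properties."""
    solid_fraction = 1.0 - porosity
    stored_energy = (rho_particle * cp_particle * solid_fraction
                     * bed_height_m * (bed_temp_K - sat_temp_K))  # J per m^2
    return stored_energy / dryout_flux_W_m2

# Example: a 0.4 m deep bed at 972 K quenched against a 1 MW/m^2 limit.
print(f"{quench_time_estimate(0.4, 972.0, 373.0, 1.0e6):.0f} s")
```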

  13. Cooling of particulate debris beds: analysis of the initial D-series experiments

    International Nuclear Information System (INIS)

    Rivard, J.B.

    1978-01-01

    In an effort to provide basic data on the cooling of fast reactor debris, three in-pile experiments employing oxide fuel particulate in liquid sodium were completed in late 1977. Preliminary results from these experiments were reported shortly after their completion at the Third Post-Accident Heat Removal Information Exchange, at Argonne National Laboratory. In these experiments, a distribution of 100 μm to 1000 μm-sized particles of enriched UO2 was fission-heated to simulate decay-heated debris. In each experiment, the UO2 particles were contained in a closed, flat-bottomed vessel 012 mm in diameter which was insulated on the diameter and bottom. Sufficient sodium was included to saturate the bed of particles and to provide a volume of bulk sodium above the bed at a controlled temperature. Parameters of interest in the experiments are given.

  14. An Analysis of the Ecology and Public Perception of Coarse Woody Debris in Virginia

    OpenAIRE

    Fuhrman, Nicholas E.

    2004-01-01

    Coarse woody debris (CWD) is an important habitat component for wildlife, fish, and plants and is important in nutrient cycling and soil formation. Knowledge of the volume, distribution, and use of CWD across Virginia would be useful to forest managers modeling nutrient budgets in southeastern forests and is important to wildlife management efforts. Knowledge of the effectiveness of informational brochures and cooperative learning activities/presentations at influencing public perception of C...

  15. Joint Intelligence Analysis Complex: DOD Needs to Fully Incorporate Best Practices into Future Cost Estimates

    Science.gov (United States)

    2016-11-01

    Excerpts from report GAO-17-29: the February 2015 Joint Intelligence Analysis Complex (JIAC) cost estimate is compared to best practices; staff of the House Permanent Select Committee on Intelligence conducted a review of the JIAC consolidation, conducted an evaluation of DOD's decision to consolidate the JIAC at RAF Croughton, and developed a business case analysis.

  16. The Lawrence Livermore National Laboratory Intelligent Actinide Analysis System

    International Nuclear Information System (INIS)

    Buckley, W.M.; Carlson, J.B.; Koenig, Z.M.

    1993-07-01

    The authors have developed an Intelligent Actinide Analysis System (IAAS) for Materials Management to use in the Plutonium Facility at the Lawrence Livermore National Laboratory. The IAAS will measure isotopic ratios for plutonium and other actinides non-destructively by high-resolution gamma-ray spectrometry. This system will measure samples in a variety of matrices and containers. It will provide automated control of many aspects of the instrument that previously required manual intervention and/or control. The IAAS is a second-generation instrument, based on the authors' experience in fielding gamma isotopic systems, that is intended to advance non-destructive actinide analysis for nuclear safeguards in performance, automation, ease of use, adaptability, systems integration and extensibility to robotics. It uses a client-server distributed monitoring and control architecture. The IAAS uses MGA 3 as the isotopic analysis code. The design of the IAAS reduces the need for operator intervention, operator training, and operator exposure

  17. The Lawrence Livermore National Laboratory Intelligent Actinide Analysis System

    International Nuclear Information System (INIS)

    Buckley, W.M.; Carlson, J.B.; Koenig, Z.M.

    1993-01-01

    The authors have developed an Intelligent Actinide Analysis System (IAAS) for Materials Management to use in the Plutonium Facility at the Lawrence Livermore National Laboratory. The IAAS will measure isotopic ratios for plutonium and other actinides non-destructively by high-resolution gamma-ray spectrometry. This system will measure samples in a variety of matrices and containers. It will provide automated control of many aspects of the instrument that previously required manual intervention and/or control. The IAAS is a second-generation instrument, based on experience in fielding gamma isotopic systems, that is intended to advance non-destructive actinide analysis for nuclear safeguards in performance, automation, ease of use, adaptability, systems integration and extensibility to robotics. It uses a client-server distributed monitoring and control architecture. The IAAS uses MGA as the isotopic analysis code. The design of the IAAS reduces the need for operator intervention, operator training, and operator exposure

  18. DARHT Multi-intelligence Seismic and Acoustic Data Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Stevens, Garrison Nicole [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Van Buren, Kendra Lu [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hemez, Francois M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-07-21

    The purpose of this report is to document the analysis of seismic and acoustic data collected at the Dual-Axis Radiographic Hydrodynamic Test (DARHT) facility at Los Alamos National Laboratory for robust, multi-intelligence decision making. The data utilized herein are obtained from two tri-axial seismic sensors and three acoustic sensors, resulting in a total of nine data channels. The goal of this analysis is to develop a generalized, automated framework to determine internal operations at DARHT using informative features extracted from measurements collected external to the facility. Our framework involves four components: (1) feature extraction, (2) data fusion, (3) classification, and finally (4) robustness analysis. Two approaches are taken for extracting features from the data. The first of these, generic feature extraction, involves extraction of statistical features from the nine data channels. The second approach, event detection, identifies specific events relevant to traffic entering and leaving the facility as well as explosive activities at DARHT and nearby explosive testing sites. Event detection is completed using a two-stage method, first utilizing signatures in the frequency domain to identify outliers and second extracting short-duration events of interest among these outliers by evaluating residuals of an autoregressive exogenous time series model. Features extracted from each data set are then fused to perform analysis with a multi-intelligence paradigm, where information from multiple data sets is combined to generate more information than is available through analysis of each independently. The fused feature set is used to train a statistical classifier and predict the state of operations to inform a decision maker. We demonstrate this classification using both generic statistical features and event detection and provide a comparison of the two methods. Finally, the concept of decision robustness is presented through a preliminary analysis where
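
    The second-stage idea, flagging short-duration events from the residuals of an autoregressive fit, can be illustrated with the minimal sketch below. It uses a plain least-squares AR model and a fixed sigma threshold on synthetic data; the report itself uses an autoregressive exogenous (ARX) model and real sensor channels, so treat this only as a conceptual stand-in.

```python
import numpy as np

def ar_residual_events(x, order=8, sigma_thresh=4.0):
    """Fit an AR(order) model by least squares and flag samples whose
    one-step-ahead residual exceeds sigma_thresh standard deviations.
    A stand-in for the ARX-based event detector described in the report."""
    x = np.asarray(x, dtype=float)
    # Lagged design matrix: predict x[t] from x[t-1..t-order].
    X = np.column_stack([x[order - k - 1:len(x) - k - 1] for k in range(order)])
    y = x[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    residuals = y - X @ coeffs
    threshold = sigma_thresh * residuals.std()
    return np.flatnonzero(np.abs(residuals) > threshold) + order

# Synthetic channel: smooth colored noise with a short transient injected.
rng = np.random.default_rng(2)
signal = np.convolve(rng.standard_normal(5000), np.ones(5) / 5, mode="same")
signal[3000:3010] += 3.0
print(ar_residual_events(signal))
```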

  19. Links between Bloom's Taxonomy and Gardner's Multiple Intelligences: The issue of Textbook Analysis

    Directory of Open Access Journals (Sweden)

    Mahmoud Abdi Tabari

    2015-02-01

    Full Text Available The major thrust of this research was to investigate the cognitive aspect of the high school textbooks and the Interchange series, due to their extensive use, through content analysis based on Bloom's taxonomy and Gardner's Multiple Intelligences (MI). This study embraced the two perspectives in a grid in order to broaden and deepen the analysis by determining the numbers and the types of intelligences, with respect to their learning objectives, tapped in the textbooks, and comparing them. Through codification of Bloom's learning objectives and Gardner's MI, the results showed that there was a significant difference between the numbers of intelligences with respect to their learning objectives in the textbooks. However, the Interchange series featured a large number of the spatial and the interpersonal intelligences across the eight levels of learning objectives, whereas it had the fewest instances of the intrapersonal, the musical, and the bodily-kinesthetic intelligences across the knowledge, understanding and application levels. Keywords: learning objectives, multiple intelligences, textbook analysis

  20. Morphometric Analysis and Delineation of Debris Flow Susceptible Alluvial Fans in the Philippines after the 2015 Koppu and Melor Typhoon Events

    Science.gov (United States)

    Llanes, F.; Rodolfo, K. S.; Lagmay, A. M. A.

    2017-12-01

    On 17 October 2015, Typhoon Koppu brought heavy rains that generated debris flows in the municipalities of Bongabon, Laur, and Gabaldon in Nueva Ecija province. Roughly two months later, on 15 December, Typhoon Melor made landfall in the province of Oriental Mindoro, bringing heavy rains that also generated debris flows in multiple watersheds in the municipality of Baco. Although Bongabon, Gabaldon, and Laur were not in the direct path of this typhoon, debris flows were again triggered there, whereas old debris-flow deposits were remobilized in Dingalan, a coastal town in Aurora province adjacent to Gabaldon. During the onslaught of Typhoons Koppu and Melor, landslides of rock, soil, and debris converged in the mountain stream networks, where they were remobilized into debris flows that destroyed numerous houses and structures situated on alluvial fans. Satellite images before and after the two typhoons were compared to calculate the deposit extents on the fans and to determine the number and extent of landslides in each watershed. The affected alluvial fans were investigated in the field to determine whether they are debris-flow or flood-prone, using a set of established geomorphic and sedimentary characteristics that differentiate deposits of the two processes. The Melton ratio, watershed length, and other significant morphometric indices were calculated and analyzed for the affected watersheds using a geographic information system (GIS) and high-resolution digital terrain models. A GIS model that can delineate debris-flow-susceptible alluvial fans in the Philippines was derived and developed from the analysis. Limitations of the model are discussed, as well as recommendations to improve and refine it.
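
    As a minimal illustration of the morphometric screening described above, the sketch below computes the Melton ratio (basin relief divided by the square root of basin area) and applies commonly cited literature thresholds to flag debris-flow-prone watersheds. The threshold values and example numbers are assumptions made for illustration; the study derives its own GIS-based model calibrated to Philippine watersheds.

```python
import math

def melton_ratio(relief_m, area_km2):
    """Melton ratio R = basin relief / sqrt(basin area), with relief
    converted to km so R is dimensionless."""
    return (relief_m / 1000.0) / math.sqrt(area_km2)

def classify_fan(relief_m, area_km2, watershed_length_km):
    """Screen a fan's source watershed as debris-flow, debris-flood, or
    flood prone. The thresholds (0.3, 0.6, 2.7 km) follow commonly cited
    literature values and are illustrative only."""
    r = melton_ratio(relief_m, area_km2)
    if r >= 0.6 and watershed_length_km < 2.7:
        return "debris flow"
    if r >= 0.3:
        return "debris flood"
    return "flood"

# Example: a steep, short watershed typical of a debris-flow fan source.
print(melton_ratio(1200.0, 4.0), classify_fan(1200.0, 4.0, 2.2))
```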

  1. Social media for intelligence: practical examples of analysis for understanding

    Science.gov (United States)

    Juhlin, Jonas A.; Richardson, John

    2016-05-01

    Social media has become a dominating feature in modern life. Platforms like Facebook, Twitter, and Google have users all over the world. People from all walks of life use social media. For the intelligence services, social media is an element that cannot be ignored. It holds an immense amount of information, and the potential to extract useful intelligence cannot be ignored. Social media has been around for sufficient time that most intelligence services recognize the fact that social media needs some form of attention. However, for the intelligence collector and analyst several aspects must be uncovered in order to fully exploit social media for intelligence purposes. This paper will present Project Avatar, an experiment in obtaining effective intelligence from social media sources, and several emerging analytic techniques to expand the intelligence gathered from these sources.

  2. Social Media for Intelligence: Practical Examples of Analysis for Understanding

    DEFF Research Database (Denmark)

    Juhlin, Jonas Alastair

    2016-01-01

    Social media has become a dominating feature in modern life. Platforms like Facebook, Twitter, and Google have users all over the world. People from all walks of life use social media. For the intelligence services, social media is an element that cannot be ignored. It holds an immense amount of information, and the potential to extract useful intelligence cannot be ignored. Social media has been around for sufficient time that most intelligence services recognize the fact that social media needs some form of attention. However, for the intelligence collector and analyst several aspects must be uncovered in order to fully exploit social media for intelligence purposes. This paper will present Project Avatar, an experiment in obtaining effective intelligence from social media sources, and several emerging analytic techniques to expand the intelligence gathered from these sources.

  3. Towards an intelligent framework for multimodal affective data analysis.

    Science.gov (United States)

    Poria, Soujanya; Cambria, Erik; Hussain, Amir; Huang, Guang-Bin

    2015-03-01

    An increasingly large amount of multimodal content is posted on social media websites such as YouTube and Facebook every day. In order to cope with the growth of such multimodal data, there is an urgent need to develop an intelligent multimodal analysis framework that can effectively extract information from multiple modalities. In this paper, we propose a novel multimodal information extraction agent, which infers and aggregates the semantic and affective information associated with user-generated multimodal data in contexts such as e-learning, e-health, automatic video content tagging and human-computer interaction. In particular, the developed intelligent agent adopts an ensemble feature extraction approach by exploiting the joint use of tri-modal (text, audio and video) features to enhance the multimodal information extraction process. In preliminary experiments using the eNTERFACE dataset, our proposed multimodal system is shown to achieve an accuracy of 87.95%, outperforming the best state-of-the-art system by more than 10%, or in relative terms, a 56% reduction in error rate. Copyright © 2014 Elsevier Ltd. All rights reserved.
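
    A bare-bones illustration of feature-level (early) fusion, one way of exploiting the joint use of tri-modal features, is sketched below: per-clip feature vectors from each modality are concatenated and a single classifier is trained on the joint vector. The synthetic features, dimensions, and classifier are invented for the example and do not reflect the paper's actual agent or the eNTERFACE features.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
n = 200                       # number of synthetic clips
y = rng.integers(0, 2, n)     # binary affect labels

# Stand-ins for per-clip feature vectors from each modality.
text_feats = rng.normal(y[:, None] * 0.8, 1.0, (n, 20))
audio_feats = rng.normal(y[:, None] * 0.5, 1.0, (n, 30))
video_feats = rng.normal(y[:, None] * 0.3, 1.0, (n, 40))

# Feature-level fusion: concatenate modalities into one joint vector per clip.
X_fused = np.hstack([text_feats, audio_feats, video_feats])

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
print("fused-feature accuracy:", cross_val_score(clf, X_fused, y, cv=5).mean())
```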

  4. Debris interactions in reactor vessel lower plena during a severe accident. II. Integral analysis

    International Nuclear Information System (INIS)

    Suh, K.Y.; Henry, R.E.

    1996-01-01

    For pt.I see ibid., p.147-63, 1996. The integral physico-numerical model for the reactor vessel lower head response has been exercised for the TMI-2 accident and possible severe accident scenarios in PWR and BWR designs. The proposed inherent cooling mechanism of the reactor material creep and subsequent water ingression implemented in this predictive model provides a consistent representation of how the debris was finally cooled in the TMI-2 accident and how the reactor lower head integrity was maintained during the course of the incident. It should be recalled that in order for this strain to occur, the vessel lower head had to achieve temperatures in excess of 1000 C. This is certainly in agreement with the temperatures determined by metallographic examinations during the TMI-2 vessel inspection program. The integral model was also applied to typical PWR and BWR lower plena with and without structures under pressurized conditions spanning the first relocation of core material to the reactor vessel failure due to creep without recovery actions. The design application results are presented with particular attention being focused on water ingression into the debris bed through the gap formed between the debris and the vessel wall. As an illustration of the accident management application, the lower plenum with structures was recovered after an extensive amount of creep had damaged the vessel wall. The computed lower head temperatures were found to be significantly lower (by more than 300 K in this particular example) with recovery relative to the case without recovery. This clearly demonstrates the potential for in-vessel cooling of the reactor vessel without a need to externally submerge the lower head should such a severe accident occur as core melting and relocation. (orig.)

  5. THE SPITZER INFRARED SPECTROGRAPH DEBRIS DISK CATALOG. II. SILICATE FEATURE ANALYSIS OF UNRESOLVED TARGETS

    Energy Technology Data Exchange (ETDEWEB)

    Mittal, Tushar [Department of Earth and Planetary Science, University of California Berkeley, Berkeley, CA 94720-4767 (United States); Chen, Christine H. [Space Telescope Science Institute, 3700 San Martin Drive Baltimore, MD 21218 (United States); Jang-Condell, Hannah [Department of Physics and Astronomy, University of Wyoming, Laramie, WY 82071 (United States); Manoj, P. [Tata Institute of Fundamental Research, Homi Bhabha Road, Mumbai 400 005 (India); Sargent, Benjamin A. [Center for Imaging Science and Laboratory for Multiwavelength Astrophysics, Rochester Institute of Technology, 54 Lomb Memorial Drive, Rochester, NY 14623 (United States); Watson, Dan M. [Department of Physics and Astronomy, University of Rochester, Rochester, NY 14627 (United States); Lisse, Carey M., E-mail: cchen@stsci.edu [Johns Hopkins University Applied Physics Laboratory, 11100 Johns Hopkins Road, Laurel, MD 20723 (United States)

    2015-01-10

    During the Spitzer Space Telescope cryogenic mission, astronomers obtained Infrared Spectrograph (IRS) observations of hundreds of debris disk candidates that have been compiled in the Spitzer IRS Debris Disk Catalog. We have discovered 10 and/or 20 μm silicate emission features toward 120 targets in the catalog and modeled the IRS spectra of these sources, consistent with MIPS 70 μm observations, assuming that the grains are composed of silicates (olivine, pyroxene, forsterite, and enstatite) and are located either in a continuous disk with power-law size and surface density distributions or thin rings that are well-characterized using two separate dust grain temperatures. For systems better fit by the continuous disk model, we find that (1) the dust size distribution power-law index is consistent with that expected from a collisional cascade, q = 3.5-4.0, with a large number of values outside this range, and (2) the minimum grain size, a_min, increases with stellar luminosity, L_*, but the dependence of a_min on L_* is weaker than expected from radiation pressure alone. In addition, we also find that (3) the crystalline fraction of dust in debris disks evolves as a function of time with a large dispersion in crystalline fractions for stars of any particular stellar age or mass, (4) the disk inner edge is correlated with host star mass, and (5) there exists substantial variation in the properties of coeval disks in Sco-Cen, indicating that the observed variation is probably due to stochasticity and diversity in planet formation.
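
    The comparison between the minimum grain size and the prediction of radiation pressure alone can be illustrated with the classical blowout-size estimate (grains with beta = F_rad/F_grav > 0.5 are unbound when released from circular orbits), following the standard Burns, Lamy and Soter expression. The grain density and radiation-pressure efficiency Q_pr below are assumed values, and this is only an order-of-magnitude sketch, not the paper's modeling.

```python
import math

G = 6.674e-11        # m^3 kg^-1 s^-2
C = 2.998e8          # m s^-1
L_SUN = 3.828e26     # W
M_SUN = 1.989e30     # kg

def blowout_radius_um(L_star_Lsun, M_star_Msun, rho=2500.0, Q_pr=1.0):
    """Grain radius (microns) below which radiation pressure unbinds grains
    released from circular orbits (beta = F_rad/F_grav > 0.5). Density and
    Q_pr are assumed values for compact silicate grains."""
    L = L_star_Lsun * L_SUN
    M = M_star_Msun * M_SUN
    s_blow = 3.0 * L * Q_pr / (8.0 * math.pi * G * M * C * rho)  # metres
    return s_blow * 1e6

# A solar-type star vs. a 2 L_sun, 1.5 M_sun star: a_min would rise roughly
# with L*/M* if radiation pressure alone set the small-grain cutoff.
print(blowout_radius_um(1.0, 1.0), blowout_radius_um(2.0, 1.5))
```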

  6. Rheological analysis of fine-grained natural debris-flow material

    Science.gov (United States)

    Major, Jon J.; Pierson, Thomas C.; ,

    1990-01-01

    Experiments were conducted on large samples of fine-grained material (<2 mm) from a natural debris flow using a wide-gap concentric-cylinder viscometer. The rheological behavior of this material is compatible with a Bingham model at shear rates in excess of 5 s⁻¹. At lower shear rates, the rheological behavior of the material deviates from the Bingham model, and when the sand concentration of the slurry exceeds 20 percent by volume, particle interaction between sand grains dominates the mechanical behavior. Yield strength and plastic viscosity are extremely sensitive to sediment concentration.
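
    To illustrate how Bingham parameters are read off viscometer data of this kind, the sketch below fits the linear branch of the flow curve above ~5 s⁻¹, returning a yield strength and plastic viscosity. The example data are invented, not measurements from the 1990 study.

```python
import numpy as np

def fit_bingham(shear_rate, shear_stress, min_rate=5.0):
    """Least-squares fit of the Bingham model
        tau = tau_y + mu_p * gamma_dot
    to viscometer data, using only shear rates above min_rate (1/s), where
    the material behaves in a Bingham-like way. Returns (tau_y, mu_p)."""
    g = np.asarray(shear_rate, dtype=float)
    t = np.asarray(shear_stress, dtype=float)
    sel = g >= min_rate
    mu_p, tau_y = np.polyfit(g[sel], t[sel], 1)
    return tau_y, mu_p

# Illustrative flow-curve data (shear rate in 1/s, shear stress in Pa).
gamma_dot = np.array([1, 2, 5, 10, 20, 40, 60], dtype=float)
tau = np.array([38, 42, 47, 55, 71, 103, 135], dtype=float)
tau_y, mu_p = fit_bingham(gamma_dot, tau)
print(f"yield strength ~{tau_y:.1f} Pa, plastic viscosity ~{mu_p:.2f} Pa*s")
```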

  7. Active space debris removal—A preliminary mission analysis and design

    Science.gov (United States)

    Castronuovo, Marco M.

    2011-11-01

    The active removal of five to ten large objects per year from the low Earth orbit (LEO) region is the only way to prevent debris collisions from cascading. Among the three orbital regions near the Earth where most catastrophic collisions are predicted to occur, the one corresponding to a sun-synchronous condition is considered the most relevant. Forty-one large rocket bodies orbiting in this belt have been identified as the priority targets for removal. As part of a more comprehensive system engineering solution, a space mission dedicated to the de-orbiting of five rocket bodies per year from this orbital regime has been designed. The selected concept of operations envisages the launch of a satellite carrying a number of de-orbiting devices, such as solid propellant kits. The satellite performs a rendezvous with an identified object and mates with it by means of a robotic arm. A de-orbiting device is attached to the object by means of a second robotic arm, the object is released and the device is activated. The spacecraft then travels to the next target. The present paper shows that an active debris removal mission capable of de-orbiting 35 large objects in 7 years is technically feasible, and the resulting propellant mass budget is compatible with many existing platforms.
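
    A rough feel for the propellant budget behind such solid-propellant de-orbit kits comes from the impulsive delta-v needed to drop a rocket body's perigee into the atmosphere, via the vis-viva equation. The altitudes below are illustrative, and the single-burn idealization ignores attitude, staging and drag details; it is not the paper's mission analysis.

```python
import math

MU_EARTH = 3.986004418e14  # gravitational parameter, m^3/s^2
R_EARTH = 6371e3           # mean Earth radius, m

def deorbit_delta_v(circ_alt_km, target_perigee_km=60.0):
    """Impulsive delta-v (m/s) to drop the perigee of an initially circular
    orbit to a re-entry altitude, from the vis-viva equation. A single-burn
    idealization for sizing de-orbit kits, not a full mission design."""
    r = R_EARTH + circ_alt_km * 1e3
    rp = R_EARTH + target_perigee_km * 1e3
    v_circ = math.sqrt(MU_EARTH / r)
    a_transfer = 0.5 * (r + rp)
    v_after = math.sqrt(MU_EARTH * (2.0 / r - 1.0 / a_transfer))
    return v_circ - v_after

# Typical sun-synchronous rocket-body altitude (~800 km) as an example.
print(f"{deorbit_delta_v(800.0):.0f} m/s per object")
```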

  8. Microscopical analysis of synovial fluid wear debris from failing CoCr hip prostheses

    Science.gov (United States)

    Ward, M. B.; Brown, A. P.; Cox, A.; Curry, A.; Denton, J.

    2010-07-01

    Metal-on-metal hip joint prostheses are now commonly implanted in patients with hip problems. Although hip replacements largely go ahead problem-free, some complications can arise, such as infection immediately after surgery and aseptic necrosis caused by vascular complications due to surgery. A recent observation made at Manchester is that some cobalt-chromium (CoCr) implants are causing chronic pain, with the source as yet unidentified. This form of replacement failure is independent of surgeon or hospital, and so some underlying body/implant interface process is thought to be the problem. When the synovial fluid from a failed joint is examined, particles of metal (wear debris) can be found. Transmission Electron Microscopy (TEM) has been used to look at fixed and sectioned samples of the synovial fluid, and this has identified fine (< 100 nm) metal and metal oxide particles within the fluid. TEM EDX and Electron Energy Loss Spectroscopy (EELS) have been employed to examine the composition of the particles, showing them to be chromium rich. This gives rise to concern that the failure mechanism may be associated with the debris.

  9. Time Series Analysis, Modeling and Applications A Computational Intelligence Perspective

    CERN Document Server

    Chen, Shyi-Ming

    2013-01-01

    Temporal and spatiotemporal data form an inherent fabric of the society as we are faced with streams of data coming from numerous sensors, data feeds, recordings associated with numerous areas of application embracing physical and human-generated phenomena (environmental data, financial markets, Internet activities, etc.). A quest for a thorough analysis, interpretation, modeling and prediction of time series comes with an ongoing challenge for developing models that are both accurate and user-friendly (interpretable). The volume is aimed to exploit the conceptual and algorithmic framework of Computational Intelligence (CI) to form a cohesive and comprehensive environment for building models of time series. The contributions covered in the volume are fully reflective of the wealth of the CI technologies by bringing together ideas, algorithms, and numeric studies, which convincingly demonstrate their relevance, maturity and visible usefulness. It reflects upon the truly remarkable diversity of methodological a...

  10. Integrated intelligent instruments using supercritical fluid technology for soil analysis

    International Nuclear Information System (INIS)

    Liebman, S.A.; Phillips, C.; Fitzgerald, W.; Levy, E.J.

    1994-01-01

    Contaminated soils pose a significant challenge for characterization and remediation programs that require rapid, accurate and comprehensive data in the field or laboratory. Environmental analyzers based on supercritical fluid (SF) technology have been designed and developed for meeting these global needs. The analyzers are designated the CHAMP Systems (Chemical Hazards Automated Multimedia Processors). The prototype instrumentation features SF extraction (SFE) and on-line capillary gas chromatographic (GC) analysis with chromatographic and/or spectral identification detectors, such as ultraviolet, Fourier transform infrared and mass spectrometers. Illustrations are given for a highly automated SFE-capillary GC/flame ionization detection (FID) configuration to provide validated screening analysis for total extractable hydrocarbons within ca. 5-10 min, as well as a full qualitative/quantitative analysis in 25-30 min. Data analysis using optional expert system and neural network software is demonstrated for test gasoline and diesel oil mixtures in this integrated intelligent instrument approach to trace organic analysis of soils and sediments.

  11. Debris Examination Using Ballistic and Radar Integrated Software

    Science.gov (United States)

    Griffith, Anthony; Schottel, Matthew; Lee, David; Scully, Robert; Hamilton, Joseph; Kent, Brian; Thomas, Christopher; Benson, Jonathan; Branch, Eric; Hardman, Paul

    2012-01-01

    The Debris Examination Using Ballistic and Radar Integrated Software (DEBRIS) program was developed to provide rapid and accurate analysis of debris observed by the NASA Debris Radar (NDR). This software provides a greatly improved analysis capacity over earlier manual processes, allowing for up to four times as much data to be analyzed by one-quarter of the personnel required by earlier methods. There are two applications that comprise the DEBRIS system: the Automated Radar Debris Examination Tool (ARDENT) and the primary DEBRIS tool.

  12. An Analysis of the Influence of Signals Intelligence Through Wargaming

    National Research Council Canada - National Science Library

    McCaffrey, Charles

    2000-01-01

    Signals intelligence (SIGINT), information derived from the monitoring, interception, decryption and evaluation of an adversary's electronic communications, has long been viewed as a significant factor in modern warfare...

  13. A conceptual analysis of the role of competitive intelligence in Zimbabwe’s banking sector

    Directory of Open Access Journals (Sweden)

    Alexander Maune

    2014-11-01

    Full Text Available This article aims to provide a conceptual framework and analysis of the role of competitive intelligence in Zimbabwe's banking sector. The article used a literature-based and conceptual research approach. The literature review has shown the concept of competitive intelligence to be multidimensional, with a multitude of varying definitions, as well as multifaceted and fuzzy. The concept of competitive intelligence has been presented variously as a process, a function, a product or a mix of all three. The literature review has also shown numerous intelligence concepts that are linked to the concept of competitive intelligence. This article will increase the academic understanding of the concept of competitive intelligence and its state in Zimbabwe's banking sector, as well as assist the entire banking sector.

  14. The relation between intelligence and religiosity: a meta-analysis and some proposed explanations.

    Science.gov (United States)

    Zuckerman, Miron; Silberman, Jordan; Hall, Judith A

    2013-11-01

    A meta-analysis of 63 studies showed a significant negative association between intelligence and religiosity. The association was stronger for college students and the general population than for participants younger than college age; it was also stronger for religious beliefs than religious behavior. For college students and the general population, means of weighted and unweighted correlations between intelligence and the strength of religious beliefs ranged from -.20 to -.25 (mean r = -.24). Three possible interpretations were discussed. First, intelligent people are less likely to conform and, thus, are more likely to resist religious dogma. Second, intelligent people tend to adopt an analytic (as opposed to intuitive) thinking style, which has been shown to undermine religious beliefs. Third, several functions of religiosity, including compensatory control, self-regulation, self-enhancement, and secure attachment, are also conferred by intelligence. Intelligent people may therefore have less need for religious beliefs and practices.

  15. Customer Data Analysis Model using Business Intelligence Tools in Telecommunication Companies

    Directory of Open Access Journals (Sweden)

    Monica LIA

    2015-10-01

    Full Text Available This article presents a customer data analysis model in a telecommunication company and business intelligence tools for data modelling, transformation, data visualization and dynamic report building. For a mature market, knowing the information inside the data and making forecasts for strategic decisions become more important in the Romanian market. Business intelligence tools are used in business organizations as support for decision-making.

  16. Structural failure analysis of reactor vessels due to molten core debris

    International Nuclear Information System (INIS)

    Pfeiffer, P.A.

    1993-01-01

    Maintaining structural integrity of the reactor vessel during a postulated core melt accident is an important safety consideration in the design of the vessel. This paper addresses the failure predictions of the vessel due to thermal and pressure loadings from the molten core debris depositing on the lower head of the vessel. Different loading combinations were considered based on a wet or dry cavity and pressurization of the vessel at the operating pressure or at atmospheric pressure (pipe break). The analyses considered both short-term (minutes) and long-term (days) failure modes. Short-term failure modes include creep at elevated temperatures and plastic instabilities of the structure. Long-term failure modes are caused by creep rupture that leads to plastic instability of the structure. The analyses predict the reactor vessel will remain intact after the core melt has deposited on the lower vessel head.

  17. PIXE microbeam analysis of the metallic debris release around endosseous implants

    International Nuclear Information System (INIS)

    Buso, G.P.; Galassini, S.; Moschini, G.; Passi, P.; Zadro, A.; Uzunov, N.M.; Doyle, B.L.; Rossi, P.; Provencio, P.

    2005-01-01

    The mechanical friction that occurs during the surgical insertion of endosseous implants, both in dentistry and orthopaedics, may cause the detachment of metal debris, which is dislodged into the peri-implant tissues and can lead to adverse clinical effects. This phenomenon is more likely with coated or roughened implants, which are the most widely employed. The present study examined dental implant screws made of commercially pure titanium and coated using the titanium plasma-spray (TPS) technique. The implants were inserted in the tibia of rabbits and removed 'en bloc' with the surrounding bone after one month. After proper processing and mounting on plastic holders, bone samples were analysed with the EDXRF setup at the National Laboratories of Legnaro, INFN, Italy, and subsequently with the 3 MeV proton microbeam setup at Sandia National Laboratories. Elemental maps were drawn, showing the occasional presence of metal particles in the peri-implant bone.

  18. Predicting Intelligibility Gains in Dysarthria through Automated Speech Feature Analysis

    Science.gov (United States)

    Fletcher, Annalise R.; Wisler, Alan A.; McAuliffe, Megan J.; Lansford, Kaitlin L.; Liss, Julie M.

    2017-01-01

    Purpose: Behavioral speech modifications have variable effects on the intelligibility of speakers with dysarthria. In the companion article, a significant relationship was found between measures of speakers' baseline speech and their intelligibility gains following cues to speak louder and reduce rate (Fletcher, McAuliffe, Lansford, Sinex, &…

  19. Multiple Intelligences Theory and Iranian Textbooks: An Analysis

    Science.gov (United States)

    Taase, Yoones

    2012-01-01

    The purpose of this study is to investigate locally designed ELT textbooks in the light of multiple intelligences theory. Three textbooks (grades 1, 2 and 3) used in guidance schools of the Iranian educational system were analyzed using the MI checklist developed by Botelho, Mario do Rozario, and the kinds of intelligences catered for in the activities and exercises…

  20. Sibling analysis of adolescent intelligence and chronic diseases in older adulthood.

    Science.gov (United States)

    Jokela, Markus; Batty, G David; Deary, Ian J; Silventoinen, Karri; Kivimäki, Mika

    2011-07-01

    We examined whether associations of adolescent intelligence with chronic diseases in adulthood are explained by socioeconomic factors, health behaviors, or common sources of variance in intelligence and chronic disease risk. A prospective cohort study (Wisconsin Longitudinal Study) of high school graduates and their siblings with intelligence assessed in adolescence and chronic diseases reported in adulthood (n = 10,168; mean age 53.9 and n = 9051; mean age 64.8 in two follow-ups). After adjustment for age and sex, greater intelligence was associated with lower risk of heart disease (odds ratio per 1 SD advantage in intelligence 0.93; 95% confidence interval 0.87-0.99), circulation problems (0.85; 0.79-0.92), stroke (0.80; 0.70-0.91), and diabetes (0.88; 0.81-0.95). Participants' risk of stroke and circulation problems also was predicted by their sibling's intelligence, suggesting potential common causes for intelligence and cerebrovascular diseases. Sibling analysis provided no support for shared family environment in explaining associations between intelligence and disease outcomes because between-families and within-siblings regression models were not different. Adjusting for common risk factors had little impact on these associations. In contrast, adjusting for adult socioeconomic status attenuated the associations by 25%-100% (66% on average). Multiple mechanisms may link intelligence with occurrence of chronic diseases of major public health importance. Copyright © 2011 Elsevier Inc. All rights reserved.

  1. JSC Orbital Debris Website Description

    Science.gov (United States)

    Johnson, Nicholas L.

    2006-01-01

    required. These data also help in the analysis and interpretation of impact features on returned spacecraft surfaces. 4) Mitigation - Controlling the growth of the orbital debris population is a high priority for NASA, the United States, and the major space-faring nations of the world to preserve near-Earth space for future generations. Mitigation measures can take the form of curtailing or preventing the creation of new debris, designing satellites to withstand impacts by small debris, and implementing operational procedures ranging from utilizing orbital regimes with less debris, adopting specific spacecraft attitudes, and even maneuvering to avoid collisions with debris. Downloadable items include several documents in PDF format and executable software. 5) Reentry - Because of the increasing number of objects in space, NASA has adopted guidelines and assessment procedures to reduce the number of non-operational spacecraft and spent rocket upper stages orbiting the Earth. One method of postmission disposal is to allow reentry of these spacecraft, either from orbital decay (uncontrolled entry) or with a controlled entry. Orbital decay may be achieved by firing engines to lower the perigee altitude so that atmospheric drag will eventually cause the spacecraft to enter. However, the surviving debris impact footprint cannot be guaranteed to avoid inhabited landmasses. Controlled entry normally occurs by using a larger amount of propellant with a larger propulsion system to drive the spacecraft to enter the atmosphere at a steeper flight path angle. It will then enter at a more precise latitude, longitude, and footprint in a nearly uninhabited impact region, generally located in the ocean.

  2. Intelligent Systems Approaches to Product Sound Quality Analysis

    Science.gov (United States)

    Pietila, Glenn M.

    As a product market becomes more competitive, consumers become more discriminating in the way in which they differentiate between engineered products. The consumer often makes a purchasing decision based on the sound emitted from the product during operation by using the sound to judge quality or annoyance. Therefore, in recent years, many sound quality analysis tools have been developed to evaluate the consumer preference as it relates to a product sound and to quantify this preference based on objective measurements. This understanding can be used to direct a product design process in order to help differentiate the product from competitive products or to establish an impression on consumers regarding a product's quality or robustness. The sound quality process is typically a statistical tool that is used to model subjective preference, or merit score, based on objective measurements, or metrics. In this way, new product developments can be evaluated in an objective manner without the laborious process of gathering a sample population of consumers for subjective studies each time. The most common model used today is the Multiple Linear Regression (MLR), although recently non-linear Artificial Neural Network (ANN) approaches are gaining popularity. This dissertation will review publicly available published literature and present additional intelligent systems approaches that can be used to improve on the current sound quality process. The focus of this work is to address shortcomings in the current paired comparison approach to sound quality analysis. This research will propose a framework for an adaptive jury analysis approach as an alternative to the current Bradley-Terry model. The adaptive jury framework uses statistical hypothesis testing to focus on sound pairings that are most interesting and is expected to address some of the restrictions required by the Bradley-Terry model. It will also provide a more amicable framework for an intelligent systems approach
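
    Since the record above contrasts its adaptive jury approach with the Bradley-Terry model for paired comparisons, a minimal sketch of a Bradley-Terry fit is given below; the win counts are invented and the minorization-maximization update shown is a generic one, not the dissertation's own implementation.

      import numpy as np

      # invented paired-comparison data: wins[i][j] = times sound i preferred over sound j
      wins = np.array([[0, 8, 6],
                       [2, 0, 7],
                       [4, 3, 0]], dtype=float)

      n_items = wins.shape[0]
      p = np.ones(n_items)              # initial merit scores
      pairs = wins + wins.T             # comparisons per pair

      for _ in range(200):              # generic minorization-maximization iterations
          total_wins = wins.sum(axis=1)
          denom = np.array([sum(pairs[i, j] / (p[i] + p[j])
                                for j in range(n_items) if j != i)
                            for i in range(n_items)])
          p = total_wins / denom
          p /= p.sum()                  # normalize, scores are defined up to a constant

      print("estimated merit scores:", np.round(p, 3))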

  3. Tomographic flow cytometry assisted by intelligent wavefronts analysis

    Science.gov (United States)

    Merola, F.; Memmolo, P.; Miccio, L.; Mugnano, M.; Ferraro, P.

    2017-06-01

    High-throughput single-cell analysis is a challenging target for implementing advanced biomedical applications. An excellent candidate for this aim is label-free tomographic phase microscopy. However, in-line tomography is very difficult to implement in practice, as it requires a complex setup for rotating the sample and/or illuminating the cell along numerous directions [1]. We exploit the random rolling of cells while they are flowing along a microfluidic channel, demonstrating that it is possible to obtain in-line phase-contrast tomography by adopting strategies for intelligent wavefront analysis, thus achieving complete retrieval of both the 3D position and the orientation of rotating cells [2]. Thus, by numerical wavefront analysis, a-priori knowledge of such information is no longer needed. This approach makes continuous-flow cyto-tomography suitable for practical operation in real-world single-cell analysis, with substantial simplification of the optical system and without any mechanical/optical scanning of the light source. Demonstration is given for different classes of biosamples: red blood cells (RBCs), diatom algae and fibroblast cells [3]. Accurate characterization of each type of cell is reported despite their very different nature and material content, showing that the proposed method can be extended, by adopting two alternate strategies of wavefront analysis, to many classes of cells. In particular, for RBCs we furnish important parameters such as 3D morphology, Corpuscular Hemoglobin (CH), Volume (V), and refractive index (RI) for each single cell in the total population [3]. This could open a new route in blood disease diagnosis, for example for the isolation and characterization of "foreign" cells in the blood stream, the so-called Circulating Tumor Cells (CTCs), an early manifestation of metastasis.

  4. The potential of using laser ablation inductively coupled plasma time of flight mass spectrometry (LA-ICP-TOF-MS) in the forensic analysis of micro debris.

    Science.gov (United States)

    Scadding, Cameron J; Watling, R John; Thomas, Allen G

    2005-08-15

    The majority of crimes result in the generation of some form of physical evidence, which is available for collection by crime scene investigators or police. However, this debris is often limited in amount as modern criminals become more aware of its potential value to forensic scientists. The requirement to obtain robust evidence from increasingly smaller sized samples has required refinement and modification of old analytical techniques and the development of new ones. This paper describes a new method for the analysis of oxy-acetylene debris, left behind at a crime scene, and the establishment of its co-provenance with single particles of equivalent debris found on the clothing of persons of interest (POI). The ability to rapidly determine and match the elemental distribution patterns of debris collected from crime scenes to those recovered from persons of interest is essential in ensuring successful prosecution. Traditionally, relatively large amounts of sample (up to several milligrams) have been required to obtain a reliable elemental fingerprint of this type of material [R.J. Watling, B.F. Lynch, D. Herring, J. Anal. At. Spectrom. 12 (1997) 195]. However, this quantity of material is unlikely to be recovered from a POI. This paper describes the development and application of laser ablation inductively coupled plasma time of flight mass spectrometry (LA-ICP-TOF-MS), as an analytical protocol, which can be applied more appropriately to the analysis of micro-debris than conventional quadrupole based mass spectrometry. The resulting data, for debris as small as 70 μm in diameter, was unambiguously matched between a single spherule recovered from a POI and a spherule recovered from the scene of crime, in an analytical procedure taking less than 5 min.

  5. Applying an energy balance model of a debris covered glacier through the Himalayan seasons - insights from the field and sensitivity analysis

    Science.gov (United States)

    Steiner, Jakob; Pellicciotti, Francesca; Buri, Pascal; Brock, Ben

    2016-04-01

    Although some recent studies have attempted to model melt below debris cover in the Himalaya as well as the European Alps, field measurements remain rare and the uncertainties of a number of parameters are difficult to constrain. The difficulty of accurately measuring sub-debris melt at one location over a longer period of time with stakes adds to the challenge of calibrating models adequately, as moving debris tends to tilt stakes. Based on measurements of sub-debris melt with stakes as well as air and surface temperature at the same location during three years from 2012 to 2014 at Lirung Glacier in the Nepalese Himalaya, we investigate results with the help of an earlier developed energy balance model. We compare stake readings to cumulative melt as well as observed to modelled surface temperatures. With time series stretching through the pre-Monsoon, Monsoon and post-Monsoon of different years, we can show how sensitive parameters differ between these seasons. Using radiation measurements from the AWS, we can use a temporally variable time series of albedo. A thorough analysis of thermistor data showing the stratigraphy of the temperature through the debris layer allows a detailed discussion of the variability as well as the uncertainty range of thermal conductivity. Distributed wind data as well as results from a distributed surface roughness assessment allow us to constrain the variability of turbulent fluxes between the different locations of the stakes. We show that model results are especially sensitive to thermal conductivity, a value that changes substantially between the seasons. Values obtained from the field are compared to earlier studies, which shows large differences between locations in the Himalaya. We also show that wind varies by more than a factor of two between depressions and debris mounds, which has a significant influence on turbulent fluxes. Albedo decreases from the dry to the wet season and likely has some spatial variability that is
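
    To make the stated sensitivity to thermal conductivity concrete, the sketch below assumes a steady-state linear temperature gradient through the debris layer, a common simplification that is not necessarily the model used in the record above; all numbers are illustrative.

      RHO_ICE = 900.0        # ice density, kg m-3
      L_FUSION = 3.34e5      # latent heat of fusion, J kg-1

      def melt_rate_m_per_day(conductivity, surface_temp_c, debris_thickness_m):
          """Sub-debris melt (m of ice per day) from the conductive flux k * dT / dz."""
          flux = conductivity * (surface_temp_c - 0.0) / debris_thickness_m   # W m-2
          return max(flux, 0.0) / (RHO_ICE * L_FUSION) * 86400.0

      # plausible seasonal range of debris thermal conductivity (W m-1 K-1), illustrative only
      for k in (0.5, 1.0, 1.5):
          print(k, round(melt_rate_m_per_day(k, surface_temp_c=15.0, debris_thickness_m=0.3), 4))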

  6. Emotional intelligence in professional nursing practice: A concept review using Rodgers's evolutionary analysis approach

    Directory of Open Access Journals (Sweden)

    Angelina E. Raghubir

    2018-04-01

    Full Text Available Background: Knowledge around emotional intelligence originated in the 1990s from research regarding thoughts, emotions and abilities. The concept of emotional intelligence has evolved over the last 25 years; however, its understanding and use are still unclear. Despite this, emotional intelligence has been a widely considered concept within professions such as business, management and education, and within the last 10 years it has gained traction within nursing practice. Aims and objectives: The aim of this concept review is to clarify the understanding of the concept of emotional intelligence, what attributes signify emotional intelligence, and what its antecedents, consequences, related terms and implications for advanced nursing practice are. Method: A computerized search was guided by Rodgers's evolutionary concept analysis. Data sources included CINAHL, PsycINFO, Scopus, EMBASE and ProQuest, focusing on articles published in Canada and the United States during 1990–2017. Results: A total of 23 articles from various disciplines were included in this integrative concept review. The analysis reveals many inconsistencies regarding the description of emotional intelligence; however, four common attributes were discovered: self-awareness, self-management, social awareness and social/relationship management. These attributes facilitate emotional well-being among advanced practice nurses and enhance the ability to practice in a way that will benefit patients, families, colleagues and advanced practice nurses as working professionals and as individuals. Conclusion: The integration of emotional intelligence is supported within several disciplines, as there is consensus on the impact that emotional intelligence has on job satisfaction, stress level and burnout, and on its role in facilitating a positive environment. Explicit to advanced practice nursing, emotional intelligence is a concept that may be central to nursing practice as it has the

  7. ANALYSIS OF DEBRIS FLOW DISASTER DUE TO HEAVY RAIN BY X-BAND MP RADAR DATA

    Directory of Open Access Journals (Sweden)

    M. Nishio

    2016-06-01

    Full Text Available On August 20, 2014, Hiroshima City (Japan) was struck by local heavy rain from an autumnal rain front. The resultant debris flow disaster claimed 75 victims and destroyed many buildings. From 1:30 am to 4:30 am on August 20, the accumulated rainfall in Hiroshima City exceeded 200 mm. Serious damage occurred in the Asakita and Asaminami wards of Hiroshima City. As a disaster prevention measure, local heavy rain (localized torrential rain) is usually observed by the Automated Meteorological Data Acquisition System (AMeDAS) operated by the Japan Meteorological Agency (JMA) and by the C-band radar operated by the Ministry of Land, Infrastructure, Transport and Tourism (MLIT) of Japan, with spatial resolutions of 2.5 km and 1 km, respectively. The new X-band MP radar system enables more detailed rainfall observations than the C-band radar. In fact, this radar can observe local rainfall throughout Japan in near-real time over a minimum mesh size of 250 m. A fine-scale accumulated rainfall monitoring system is crucial for disaster prevention, and alerts for potential disasters can be issued based on the hazard levels of the accumulated rainfall.

  8. Using Monte Carlo techniques and parallel processing for debris hazard analysis of rocket systems

    Energy Technology Data Exchange (ETDEWEB)

    LaFarge, R.A.

    1994-02-01

    Sandia National Laboratories has been involved with rocket systems for many years. Some of these systems have carried high explosive onboard, while others have had FTS for destruction purposes whenever a potential hazard is detected. Recently, Sandia has also been involved with flight tests in which a target vehicle is intentionally destroyed by a projectile. Such endeavors always raise questions about the safety of personnel and the environment in the event of a premature detonation of the explosive or an activation of the FTS, as well as intentional vehicle destruction. Previous attempts to investigate fragmentation hazards for similar configurations have analyzed fragment size and shape in detail but have computed only a limited number of trajectories to determine the probabilities of impact and casualty expectations. A computer program SAFETIE has been written in support of various SNL flight experiments to compute better approximations of the hazards. SAFETIE uses the AMEER trajectory computer code and the Engineering Sciences Center LAN of Sun workstations to determine more realistically the probability of impact for an arbitrary number of exclusion areas. The various debris generation models are described.
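
    The Monte Carlo idea behind a debris-hazard code such as SAFETIE can be reduced to a toy sketch: sample fragment impact points and count the fraction falling inside an exclusion area. The dispersion parameters and exclusion zone below are invented, not values from the SNL analyses.

      import math
      import random

      random.seed(1)

      EXCLUSION_CENTER = (1200.0, -300.0)   # m, hypothetical keep-out zone
      EXCLUSION_RADIUS = 500.0              # m

      def sample_impact_point():
          """Draw one impact point (m) from an assumed dispersion about a nominal point."""
          return random.gauss(1000.0, 400.0), random.gauss(0.0, 400.0)

      n_trials = 100_000
      hits = sum(
          math.hypot(x - EXCLUSION_CENTER[0], y - EXCLUSION_CENTER[1]) <= EXCLUSION_RADIUS
          for x, y in (sample_impact_point() for _ in range(n_trials))
      )
      print("estimated probability of impact:", hits / n_trials)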

  9. Application and analysis of debris-flow early warning system in Wenchuan earthquake-affected area

    Science.gov (United States)

    Liu, D. L.; Zhang, S. J.; Yang, H. J.; Zhao, L. Q.; Jiang, Y. H.; Tang, D.; Leng, X. P.

    2016-02-01

    Debris-flow (DF) activity in the Wenchuan earthquake-affected area increased significantly after the earthquake on 12 May 2008, and DFs threaten the lives and property of local people. A physics-based early warning system (EWS) for DF forecasting was developed and applied in this earthquake area. This paper introduces an application of the system in the Wenchuan earthquake-affected area and analyzes the prediction results via a comparison to the DF events triggered by the strong rainfall events reported by the local government. The prediction accuracy and efficiency were first compared with those of a contribution-factor-based system currently used by the weather bureau of Sichuan province. The storm on 17 August 2012 was used as a case study for this comparison. The comparison shows that the false negative rate and false positive rate of the new system are, respectively, 19% and 21% lower than those of the contribution-factor-based system. Consequently, the prediction accuracy is clearly higher than that of the contribution-factor-based system, with higher operational efficiency. At the invitation of the weather bureau of Sichuan province, the authors upgraded their DF prediction system to this new system before the 2013 monsoon in the Wenchuan earthquake-affected area. Two prediction cases, on 9 July 2013 and 10 July 2014, were chosen to further demonstrate that the new EWS has high stability, efficiency, and prediction accuracy.

  10. Automated Machinery Health Monitoring Using Stress Wave Analysis & Artificial Intelligence

    National Research Council Canada - National Science Library

    Board, David

    1998-01-01

    .... Army, for application to helicopter drive train components. The system will detect structure-borne, high-frequency acoustic data, and process it with feature extraction and polynomial network artificial intelligence software...

  11. Special Operations Reconnaissance (SOR) Scenario: Intelligence Analysis and Mission Planning

    National Research Council Canada - National Science Library

    Warner, Norman; Burkman, Lisa; Biron, H. C

    2008-01-01

    ...) scenario and the methodology used to generate and validate the scenario. The face of military team collaboration has changed due to gathering intelligence from broader and more diverse sources...

  12. Business and Social Behaviour Intelligence Analysis Using PSO

    OpenAIRE

    Vinay S Bhaskar; Abhishek Kumar Singh; Jyoti Dhruw; Anubha Parashar; Mradula Sharma

    2014-01-01

    The goal of this paper is to elaborate on swarm intelligence for business intelligence decision making and the improvement of business rules management. The paper introduces a decision-making model based on the application of Artificial Neural Networks (ANNs) and the Particle Swarm Optimization (PSO) algorithm. Essentially, the business spatial data illustrate group behaviors. Swarm optimization, which is highly influenced by the behavior of creatures, operates in groups. The Spatial dat...
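
    A minimal sketch of the Particle Swarm Optimization loop named above is given below, applied to a toy two-dimensional function; the swarm parameters are conventional defaults, not values from the paper.

      import random

      random.seed(0)

      def objective(x, y):
          return (x - 3.0) ** 2 + (y + 1.0) ** 2   # toy function, minimum at (3, -1)

      N, ITERS = 30, 200
      W, C1, C2 = 0.7, 1.5, 1.5                    # inertia and acceleration weights

      pos = [[random.uniform(-10, 10), random.uniform(-10, 10)] for _ in range(N)]
      vel = [[0.0, 0.0] for _ in range(N)]
      pbest = [p[:] for p in pos]                  # personal best positions
      pbest_val = [objective(*p) for p in pos]
      g = min(range(N), key=lambda i: pbest_val[i])
      gbest, gbest_val = pbest[g][:], pbest_val[g] # global best

      for _ in range(ITERS):
          for i in range(N):
              for d in range(2):
                  r1, r2 = random.random(), random.random()
                  vel[i][d] = (W * vel[i][d]
                               + C1 * r1 * (pbest[i][d] - pos[i][d])
                               + C2 * r2 * (gbest[d] - pos[i][d]))
                  pos[i][d] += vel[i][d]
              val = objective(*pos[i])
              if val < pbest_val[i]:
                  pbest[i], pbest_val[i] = pos[i][:], val
                  if val < gbest_val:
                      gbest, gbest_val = pos[i][:], val

      print("best position found:", [round(v, 3) for v in gbest])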

  13. Analysis of the relation between intelligence and criminal behavior

    Directory of Open Access Journals (Sweden)

    Dragan Jovanovic

    2012-12-01

    Full Text Available Introduction: One of the cognitive aspects of personality is intelligence. A large number of previous studies have shown that intelligence within the criminal population is decreased, particularly in its verbal aspect. The aim of this study is to determine whether there is a link between intelligence and criminal behavior and how it is manifested. Methods: The research involved criminal inmates of the Correctional institutes of the Republic of Srpska and the Court Department of the Psychiatry Clinic Sokolac who committed homicide and various non-homicide acts. The test group consisted of 60 inmates who had committed homicide (homicide offenders) and a control group of 60 inmates who did not commit homicide (non-homicide offenders). The study was a controlled, transverse (cross-sectional) study. Results: The average intelligence of inmates (homicidal and non-homicidal) was IQ 95.7. The intelligence of homicide inmates was IQ 97.4 and of non-homicide inmates IQ 94.09. Intelligence coefficients for the non-homicide inmate subgroups were as follows: robbery offenders (IQ 96.9), theft perpetrators (IQ 93.83), and other criminal offenders (IQ 92.8). The verbal intellectual ability (IQw) of homicide inmates was 91.22, and that of non-homicide inmates was 91.10. Intellectual abilities in the nonverbal or manipulative part were average, but they were higher in the homicide inmate group (IQm 103.65) than in the non-homicide group (IQm 97.08). Conclusion: The average intelligence of the investigated inmates (homicide and non-homicide) is lower than in the general population and corresponds to low average. The verbal part of intelligence is lowered while the nonverbal part is within the average range.

  14. Intelligent simulations for on-line transient analysis

    International Nuclear Information System (INIS)

    Hassberger, J.A.; Lee, J.C.

    1987-01-01

    A unique combination of simulation, parameter estimation and expert systems technology is applied to the problem of diagnosing nuclear power plant transients. Knowledge-based reasoning is used to monitor plant data and hypothesize about the status of the plant. Fuzzy logic is employed as the inferencing mechanism, and an implication scheme based on observations is developed and employed to handle scenarios involving competing failures. Hypothesis testing is performed by simulating the behavior of faulted components using numerical models. A filter has been developed for systematically adjusting key model parameters to force agreement between simulations and actual plant data. Pattern recognition is employed as a decision analysis technique for choosing among several hypotheses based on simulation results. An artificial intelligence framework based on a critical functions approach is used to deal with the complexity of a nuclear plant system. Detailed simulation results of various nuclear power plant accident scenarios are presented to demonstrate the performance and robustness properties of the diagnostic algorithm developed. The system is shown to be successful in diagnosing and identifying fault parameters for a normal reactor scram, loss-of-feedwater (LOFW) and small loss-of-coolant (LOCA) transients occurring together in a scenario similar to the accident at Three Mile Island.

  15. Energy Demand Forecasting: Combining Cointegration Analysis and Artificial Intelligence Algorithm

    Directory of Open Access Journals (Sweden)

    Junbing Huang

    2018-01-01

    Full Text Available Energy is vital for the sustainable development of China. Accurate forecasts of annual energy demand are essential to schedule energy supply and provide valuable suggestions for developing related industries. In the existing literature on energy use prediction, the artificial intelligence-based (AI-based) model has received considerable attention. However, few econometric and statistical evidences exist that can prove the reliability of the current AI-based model, an area that still needs to be addressed. In this study, a new energy demand forecasting framework is presented first. On the basis of historical annual data of electricity usage over the period 1985–2015, the coefficients of the linear and quadratic forms of the AI-based model are optimized by combining an adaptive genetic algorithm and a cointegration analysis, shown as an example. Prediction results of the proposed model indicate that the annual growth rate of electricity demand in China will slow down. However, China will still demand about 13 trillion kilowatt-hours in 2030 because of population growth, economic growth, and urbanization. In addition, the model has greater accuracy and reliability compared with other single optimization methods.
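
    As a hedged sketch of the coefficient-optimization step, the code below fits a quadratic demand model with a simple genetic algorithm (truncation selection, uniform crossover, Gaussian mutation); the data and operators are illustrative and do not reproduce the adaptive GA or the cointegration analysis of the study.

      import random

      random.seed(42)

      # invented (driver x, demand y) observations
      data = [(1.0, 3.1), (2.0, 5.2), (3.0, 7.9), (4.0, 11.1), (5.0, 15.2)]

      def mse(coeffs):
          a, b, c = coeffs
          return sum((a + b * x + c * x * x - y) ** 2 for x, y in data) / len(data)

      def crossover(p1, p2):
          return [random.choice(pair) for pair in zip(p1, p2)]   # uniform crossover

      def mutate(coeffs, sigma=0.1):
          return [v + random.gauss(0.0, sigma) for v in coeffs]  # Gaussian mutation

      pop = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(50)]
      for _ in range(300):
          pop.sort(key=mse)
          parents = pop[:10]                                     # truncation selection
          children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                      for _ in range(40)]
          pop = parents + children

      best = min(pop, key=mse)
      print("fitted (a, b, c):", [round(v, 3) for v in best], " MSE:", round(mse(best), 4))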

  16. Artificial intelligence applications to design validation and sneak function analysis

    International Nuclear Information System (INIS)

    Stratton, R.C.

    1985-01-01

    An objective of the US space reactor program is to design systems with high reliability and safety of control over long operating lifetimes. Argonne National Laboratory (ANL) is a participant in the National Man-Machine Integration (MMI) program for Liquid Metal Fast Breeder Reactors (LMFBR). A purpose of this program is to promote the development of concepts and technologies that enhance the operational safety and reliability of fast-breeder reactors. Much of the work is directly applicable to the space reactor program. This paper reports on one of the MMI projects being developed by ANL. The project reported pertains to an automated system that demonstrates the use of artificial intelligence (AI) for design validation (DA) and sneak function analysis (SFA). The AI system models the design specification and the physical design of the cooling process assigned to the Argon Cooling System (ACS) at Experimental Breeder Reactor II (EBR-II). The models are developed using heuristic knowledge and natural laws. 13 refs

  17. Analysis of Debris Flow Kuranji River in Padang City Using Rainfall Data, Remote Sensing and Geographic Information System

    International Nuclear Information System (INIS)

    Umar, Z; Wan Mohd Akib, W A A; Ahmad, A

    2014-01-01

    Flash floods are among the most common environmental hazards worldwide. This phenomenon usually occurs due to intense and prolonged rainfall on saturated ground. When a rapid rise in water levels and high stream flow velocities occur, the channel overflows and the result is a flash flood. Flash floods normally produce a dangerous wall of roaring water carrying rocks, mud and other debris. On Tuesday, July 24, 2012 at 18:00, a flash flood (debris flow) struck the Kuranji River, affecting 19 urban villages in seven (7) sub-districts of the city of Padang. The provisional loss estimate reported by the West Sumatra Provincial Government is 40 billion US dollars, owing to extensive damage to built infrastructure, including 878 damaged houses, 15 mosques, 12 irrigation works, 6 bridges, 2 schools and 1 health post. Widely used methods for landslide studies are Geographic Information System (GIS) and remote sensing techniques. The landslide information extracted from remotely sensed products mainly relates to the morphology, vegetation and hydrologic conditions of a slope, while GIS is used for database creation, data management, data display and analysis of data such as thematic maps of land use/land cover, the normalized difference vegetation index (NDVI), rainfall data and soil texture. This paper highlights the analysis of the condition of the Kuranji River watershed, which experiences flash floods, using Landsat ETM 7 satellite images from 2009 and 2012 and a Geographic Information System (GIS). Furthermore, the data were analyzed to determine whether this flash flood occurred due to extreme rain or the collapse of natural dams upstream of the Kuranji River.
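
    One of the remote-sensing layers mentioned above, NDVI, can be computed directly from the red and near-infrared bands; the sketch below uses tiny invented reflectance grids in place of real Landsat imagery.

      import numpy as np

      red = np.array([[0.10, 0.12], [0.30, 0.25]])   # red-band reflectance (invented)
      nir = np.array([[0.40, 0.45], [0.32, 0.28]])   # near-infrared reflectance (invented)

      ndvi = (nir - red) / (nir + red + 1e-9)        # small epsilon avoids division by zero
      print(np.round(ndvi, 3))                       # values near 1 indicate dense vegetation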

  18. Operation Iraqi Freedom 04 - 06: Opportunities to Apply Quantitative Methods to Intelligence Analysis

    National Research Council Canada - National Science Library

    Hansen, Eric C

    2005-01-01

    The purpose of this presentation is to illustrate the need for a quantitative analytical capability within organizations and staffs that provide intelligence analysis to Army, Joint, and Coalition Force headquarters...

  19. Army Intelligence Analysis and Interpretation: Assessing the Utility and Limitations of Computational Diagnostic Reasoning

    National Research Council Canada - National Science Library

    Powell, Gerald M

    2004-01-01

    .... For fusion in the Army, little has been published reflecting an elaboration of functionality on levels 2 and 3 of this model, both of which are viewed as critical elements of intelligence analysis and interpretation. Walsh (2002...

  20. Development of a fiber-coupled laser-induced breakdown spectroscopy instrument for analysis of underwater debris in a nuclear reactor core

    International Nuclear Information System (INIS)

    Saeki, Morihisa; Iwanade, Akio; Ohba, Hironori; Ito, Chikara; Wakaida, Ikuo; Thornton, Blair; Sakka, Tetsuo

    2014-01-01

    To inspect the post-accident reactor cores of the TEPCO Fukushima Daiichi nuclear power plant (F1-NPP), a transportable fiber-coupled laser-induced breakdown spectroscopy (LIBS) instrument has been developed. The instrument was designed to analyze underwater samples in a high-radiation field by single-pulse breakdown with gas flow or by double-pulse breakdown. To check the feasibility of the assembled fiber-coupled LIBS instrument for the analysis of debris material (a mixture of fuel, fuel cladding, construction material and so on) in the F1-NPP, we investigated the influence of radiation dose on the optical transmittance of the laser delivery fiber, compared data quality among various LIBS techniques for an underwater sample, and studied the feasibility of the fiber-coupled LIBS system in the analysis of an underwater sample of simulated F1-NPP debris. In a feasibility study using simulated debris, a mixture of CeO2 (a surrogate of UO2), ZrO2 and Fe, we selected atomic lines suitable for the analysis of the materials and prepared calibration curves for the component elements. The feasibility study confirmed that the developed fiber-coupled LIBS system is applicable to analyzing the debris materials in the F1-NPP. (author)
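
    The calibration-curve step described above amounts to a least-squares fit of line intensity against known concentration; the sketch below uses invented intensity/concentration pairs, not the study's measurements.

      import numpy as np

      conc = np.array([0.0, 5.0, 10.0, 20.0, 40.0])         # element content of standards (wt%)
      intensity = np.array([0.02, 0.11, 0.21, 0.43, 0.82])  # background-corrected line intensity

      slope, intercept = np.polyfit(conc, intensity, 1)     # linear calibration curve
      unknown_intensity = 0.30
      print(f"estimated concentration: {(unknown_intensity - intercept) / slope:.1f} wt%")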

  1. A genome-wide analysis of putative functional and exonic variation associated with extremely high intelligence.

    Science.gov (United States)

    Spain, S L; Pedroso, I; Kadeva, N; Miller, M B; Iacono, W G; McGue, M; Stergiakouli, E; Davey Smith, G; Putallaz, M; Lubinski, D; Meaburn, E L; Plomin, R; Simpson, M A

    2016-08-01

    Although individual differences in intelligence (general cognitive ability) are highly heritable, molecular genetic analyses to date have had limited success in identifying specific loci responsible for its heritability. This study is the first to investigate exome variation in individuals of extremely high intelligence. Under the quantitative genetic model, sampling from the high extreme of the distribution should provide increased power to detect associations. We therefore performed a case-control association analysis with 1409 individuals drawn from the top 0.0003 (IQ >170) of the population distribution of intelligence and 3253 unselected population-based controls. Our analysis focused on putative functional exonic variants assayed on the Illumina HumanExome BeadChip. We did not observe any individual protein-altering variants that are reproducibly associated with extremely high intelligence and within the entire distribution of intelligence. Moreover, no significant associations were found for multiple rare alleles within individual genes. However, analyses using genome-wide similarity between unrelated individuals (genome-wide complex trait analysis) indicate that the genotyped functional protein-altering variation yields a heritability estimate of 17.4% (s.e. 1.7%) based on a liability model. In addition, investigation of nominally significant associations revealed fewer rare alleles associated with extremely high intelligence than would be expected under the null hypothesis. This observation is consistent with the hypothesis that rare functional alleles are more frequently detrimental than beneficial to intelligence.

  2. IEDA [Intelligent Eddy Current Data Analysis] helps make sense of eddy current data [steam generators

    International Nuclear Information System (INIS)

    Clark, R.

    1989-01-01

    The increasing sophistication of eddy current signal interpretation in steam generator tubing has improved capabilities, but has also made the process of analysis more complex and time consuming. Westinghouse has developed an intelligent computerised tool - the IEDA (Intelligent Eddy Current Data Analysis) system, to lighten the load on analysts. Since 1985, 44 plants have been inspected with IEDA, representing over 400,000 tubes. The system has provided a repeatability and a consistency not achieved by human operators. (U.K.)

  3. An analysis of the application of AI to the development of intelligent aids for flight crew tasks

    Science.gov (United States)

    Baron, S.; Feehrer, C.

    1985-01-01

    This report presents the results of a study aimed at developing a basis for applying artificial intelligence to the flight deck environment of commercial transport aircraft. In particular, the study was comprised of four tasks: (1) analysis of flight crew tasks, (2) survey of the state-of-the-art of relevant artificial intelligence areas, (3) identification of human factors issues relevant to intelligent cockpit aids, and (4) identification of artificial intelligence areas requiring further research.

  4. Tutor system for the application of programming through intelligence analysis

    Directory of Open Access Journals (Sweden)

    Ivelisse Teresa Machín-Torres

    2017-05-01

    Full Text Available The present article is part of research into the development of an intelligent tutor system for teaching programming at the José Martí University of Sancti Spíritus. The objective of implementing this system is to enhance the management of knowledge related to programming issues and to improve guidance in problem solving at the university. To carry out the implementation, the intelligent tutor systems currently used in the programming area were reviewed, and the tools and technologies used in the developed solution (methodology, patterns, software, programming languages, etc.) are described, which allowed an efficient implementation of the proposed system in a short time. This is reflected positively in better student satisfaction and therefore in higher performance in the university's teaching-learning process.

  5. Small satellites and space debris issues

    Science.gov (United States)

    Yakovlev, M.; Kulik, S.; Agapov, V.

    2001-10-01

    The objective of this report is the analysis of trends in the design of small satellites (SS) and of the effect of small satellites on the space debris population. It is shown that SS, including nano- and pico-satellites, should be considered a particularly dangerous source of space debris when elaborating international standards and legal documents concerning the space debris problem, in particular the "International Space Debris Mitigation Standard". These issues are in accordance with the IADC goals in its main activity areas and should be carefully considered within the IADC framework.

  6. Hypsometric Analysis of Glacial Features: A Survey of Lobate Debris Apron Populations in Eastern Hellas Basin and Deuteronilus Mensae, Mars

    Science.gov (United States)

    Rutledge, A. M.; Christensen, P. R.

    2014-07-01

    Hypsometric curves of lobate debris apron populations in Hellas Basin and Deuteronilus Mensae were evaluated and compared with respect to inferred ice accumulation and flow. Curve types are elevation-dependent, indicating a past shift in climate.
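
    A hypsometric curve of the kind evaluated above is the fraction of area lying above each relative elevation; the sketch below computes the curve and the elevation-relief approximation of the hypsometric integral from a stand-in random elevation grid rather than a real DEM.

      import numpy as np

      # stand-in elevation grid (metres); a real analysis would load a DEM clipped to one apron
      elev = np.random.default_rng(0).uniform(1000.0, 1800.0, size=(50, 50))

      z = np.sort(elev.ravel())
      rel_height = (z - z.min()) / (z.max() - z.min())            # h / H for each cell
      frac_area_above = 1.0 - np.arange(1, z.size + 1) / z.size   # a / A above that elevation

      # elevation-relief ratio, a standard approximation of the hypsometric integral
      hi = (elev.mean() - elev.min()) / (elev.max() - elev.min())
      print("sample curve points:", list(zip(frac_area_above[::600].round(2),
                                             rel_height[::600].round(2))))
      print(f"hypsometric integral ~ {hi:.3f}")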

  7. Effects of Fire Suppression Agents and Weathering in the Analysis of Fire Debris by HS-MS eNose

    Directory of Open Access Journals (Sweden)

    Barbara Falatová

    2018-06-01

    Full Text Available In arson attacks the detection of ignitable liquid residues (ILRs) at fire scenes provides key evidence, since ignitable liquids, such as gasoline, are commonly used to initiate the fire. In most forensic laboratories gas chromatography-mass spectrometry is employed for the analysis of ILRs. When a fire occurs, suppression agents are used to extinguish the fire and, before the scene is investigated, the samples at the scene are subjected to a variety of processes such as weathering, which can significantly modify the chemical composition and thus lead to erroneous conclusions. In order to avoid this possibility, the application of chemometric tools that help the analyst to extract useful information from data is very advantageous. The study described here concerned the application of a headspace-mass spectrometry electronic nose (HS-MS eNose) combined with chemometric tools to determine the presence/absence of gasoline in weathered fire debris samples. The effect of applying two suppression agents (Cafoam Aquafoam AF-6 and Pyro-chem PK-80 Powder) and delays in the sampling time (from 0 to 48 h) were studied. It was found that, although the suppression systems affect the mass spectra, the HS-MS eNose in combination with suitable pattern recognition chemometric tools, such as linear discriminant analysis, is able to identify the presence of gasoline in any of the studied situations (100% correct classification).
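
    The pattern-recognition step named above can be sketched with scikit-learn's linear discriminant analysis; the binned-spectrum feature vectors and labels below are invented and far smaller than a real HS-MS eNose data set.

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      # each row: summed intensities in a few m/z bins (invented values)
      X = np.array([[0.9, 0.1, 0.4],
                    [0.8, 0.2, 0.5],
                    [0.1, 0.7, 0.2],
                    [0.2, 0.8, 0.1]])
      y = np.array([1, 1, 0, 0])        # 1 = gasoline present, 0 = substrate only

      model = LinearDiscriminantAnalysis().fit(X, y)
      print(model.predict([[0.85, 0.15, 0.45]]))   # expected: class 1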

  8. Orbital debris: a technical assessment

    National Research Council Canada - National Science Library

    Committee on Space Debris, National Research Council

    ..., and other debris created as a byproduct of space operations. Orbital Debris examines the methods we can use to characterize orbital debris, estimates the magnitude of the debris population, and assesses the hazard that this population poses to spacecraft...

  9. Spatial-temporal analysis of marine debris on beaches of Niterói, RJ, Brazil: Itaipu and Itacoatiara.

    Science.gov (United States)

    Silva, Melanie Lopes da; Araújo, Fábio Vieira de; Castro, Rebeca Oliveira; Sales, Alessandro Souza

    2015-03-15

    In many areas of the world, studies of marine debris are conducted with an emphasis on analyzing its composition, quantification and distribution on sandy beaches. However, in Brazil, studies are still restricted to some areas of the coast, and the quantities and the spatial and temporal patterns are unknown. To enhance the marine debris information in these areas, we selected the Itaipu and Itacoatiara beaches in Niterói, RJ, to collect, quantify and qualify the solid residues present in their sands. We collected 12 samples and recorded 118.39 kg of residues in Itaipu and 62.94 kg in Itacoatiara. At both beaches, the largest portion of debris was located on the upper part of the beach. Several debris items were related to food and drink consumption on the beaches, which indicated the contribution of beach users to pollution. Most of the debris was plastic. The greatest amount of debris was found at Itaipu in January and February and at Itacoatiara in January and March, months related to both the holiday season and abundant rainfall. The results demonstrated the necessity of implementing an Environmental Education project for these areas to reduce their degradation. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Application of artificial intelligence to risk analysis for forested ecosystems

    Science.gov (United States)

    Daniel L. Schmoldt

    2001-01-01

    Forest ecosystems are subject to a variety of natural and anthropogenic disturbances that extract a penalty from human population values. Such value losses (undesirable effects) combined with their likelihoods of occurrence constitute risk. Assessment or prediction of risk for various events is an important aid to forest management. Artificial intelligence (AI)...

  11. Business intelligence gap analysis: a user, supplier and academic perspective

    NARCIS (Netherlands)

    Molensky, L.; Ketter, W.; Collins, J.; Bloemhof, J.M.; Koppel, van der H.

    2010-01-01

    Business intelligence (BI) takes many different forms, as indicated by the varying definitions of BI that can be found in industry and academia. These different definitions help us understand what BI issues are important to the main players in the field of BI: users, suppliers and academics. The

  12. Eye-Movement Analysis Demonstrates Strategic Influences on Intelligence

    Science.gov (United States)

    Vigneau, Francois; Caissie, Andre F.; Bors, Douglas A.

    2006-01-01

    Taking into account various models and findings pertaining to the nature of analogical reasoning, this study explored quantitative and qualitative individual differences in intelligence using latency and eye-movement data. Fifty-five university students were administered 14 selected items of the Raven's Advanced Progressive Matrices test. Results…

  13. Thinking and Writing: Cognitive Science and Intelligence Analysis

    Science.gov (United States)

    2010-02-01

    TheAtlantic.com, 6 October 2009: A former chief technology officer at the Defense Intelligence Agency . . . [stated,] "in some cases we are seeing IT departments...may be enlightening for the participants, but nothing about them presses participants toward consensus or closure. Their mode is conversational

  14. Analysis of Bridge Player Profiles According to Their Intelligence Areas

    Science.gov (United States)

    Bilir, Fatma Pervin; Sirin, Yeliz

    2017-01-01

    The aim of this study is to determine the profiles of bridge players and to analyze them according to their intelligence areas. The sample of the study consists of 100 volunteers out of 200 bridge players who attended the "Çukurova open double bridge championship" in Adana, Turkey, in February 2016. Data have been collected via…

  15. Global Research on Artificial Intelligence from 1990–2014: Spatially-Explicit Bibliometric Analysis

    Directory of Open Access Journals (Sweden)

    Jiqiang Niu

    2016-05-01

    Full Text Available In this article, we evaluated artificial intelligence research from 1990–2014 using bibliometric analysis. We introduced spatial analysis and social network analysis as geographic information retrieval methods for spatially-explicit bibliometric analysis. This study is based on the analysis of data obtained from the databases of the Science Citation Index Expanded (SCI-Expanded) and the Conference Proceedings Citation Index-Science (CPCI-S). Our results revealed scientific outputs, subject categories and main journals, author productivity and geographic distribution, international productivity and collaboration, and hot issues and research trends. The growth of article outputs in artificial intelligence research has exploded since the 1990s, along with increasing collaboration, references, and citations. Computer science and engineering were the most frequently used subject categories in artificial intelligence studies. The top twenty productive authors are distributed in countries with high investment in research and development. The United States has the highest number of top research institutions in artificial intelligence, producing most single-country and collaborative articles. Although there is more and more collaboration among institutions, cooperation, especially international cooperation, is not as prevalent in artificial intelligence research as expected. The keyword analysis revealed interesting research preferences, confirming that methods, models, and applications are in a central position in artificial intelligence. Further, we found interesting related keywords with high co-occurrence frequencies, which have helped identify new models and application areas in recent years. Bibliometric analysis results from our study will greatly facilitate the understanding of the progress and trends in artificial intelligence, in particular for those researchers interested in domain-specific AI-driven problem-solving. This will be

  16. LightForce Photon-pressure Collision Avoidance: Efficiency Analysis in the Current Debris Environment and Long-Term Simulation Perspective

    Science.gov (United States)

    Yang, Fan Y.; Nelson, Bron; Carlino, Roberto; Perez, Andres D.; Faber, Nicolas; Henze, Chris; Karacahoglu, Arif G.; O'Toole, Conor; Swenson, Jason; Stupl, Jan

    2015-01-01

    This work provides an efficiency analysis of the LightForce space debris collision avoidance scheme in the current debris environment and describes a simulation approach to assess its impact on the long-term evolution of the space debris environment. LightForce aims to provide just-in-time collision avoidance by utilizing photon pressure from ground-based industrial lasers. These ground stations impart minimal accelerations to increase the miss distance for a predicted conjunction between two objects. In the first part of this paper we will present research that investigates the short-term effect of a few systems consisting of 10 kW class lasers directed by 1.5 m diameter telescopes using adaptive optics. The results show that such a network of ground stations could mitigate more than 85 percent of conjunctions and lower the expected number of collisions in Low Earth Orbit (LEO) by an order of magnitude. While these are impressive numbers that indicate LightForce's utility in the short term, the remaining 15 percent of possible collisions include (among others) conjunctions between two massive objects that would add a large amount of debris if they collide. Still, conjunctions between massive objects and smaller objects can be mitigated. Hence we choose to expand the capabilities of the simulation software to investigate the overall effect of a network of LightForce stations on the long-term debris evolution. In the second part of this paper, we will present the planned simulation approach for that effort.
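
    The order of magnitude of the photon-pressure effect exploited by LightForce can be estimated from the intercepted laser power; the intercepted power, reflectivity factor, object mass and engagement time below are illustrative assumptions, not values from the study.

      C = 299_792_458.0               # speed of light, m/s

      intercepted_power_w = 1_000.0   # laser power actually reaching the object (assumed)
      reflectivity_factor = 1.5       # 1 = fully absorbing, 2 = perfect mirror (assumed)
      mass_kg = 1.0                   # assumed debris mass
      engagement_s = 300.0            # assumed duration of one pass over a ground station

      accel = intercepted_power_w * reflectivity_factor / (C * mass_kg)   # m/s^2
      delta_v = accel * engagement_s
      print(f"acceleration ~ {accel:.2e} m/s^2, delta-v per pass ~ {delta_v:.2e} m/s")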

  17. Integrated analysis of core debris interactions and their effects on containment integrity using the CONTAIN computer code

    International Nuclear Information System (INIS)

    Carroll, D.E.; Bergeron, K.D.; Williams, D.C.; Tills, J.L.; Valdez, G.D.

    1987-01-01

    The CONTAIN computer code includes a versatile system of phenomenological models for analyzing the physical, chemical and radiological conditions inside the containment building during severe reactor accidents. Important contributors to these conditions are the interactions which may occur between released corium and cavity concrete. The phenomena associated with interactions between ejected corium debris and the containment atmosphere (Direct Containment Heating or DCH) also pose a potential threat to containment integrity. In this paper, we describe recent enhancements of the CONTAIN code which allow an integrated analysis of these effects in the presence of other mitigating or aggravating physical processes. In particular, the recent inclusion of the CORCON and VANESA models is described and a calculation example presented. With this capability CONTAIN can model core-concrete interactions occurring simultaneously in multiple compartments and can couple the aerosols thereby generated to the mechanistic description of all atmospheric aerosol components. Also discussed are some recent results of modeling the phenomena involved in Direct Containment Heating. (orig.)

  18. Genome-wide association meta-analysis of 78,308 individuals identifies new loci and genes influencing human intelligence.

    Science.gov (United States)

    Sniekers, Suzanne; Stringer, Sven; Watanabe, Kyoko; Jansen, Philip R; Coleman, Jonathan R I; Krapohl, Eva; Taskesen, Erdogan; Hammerschlag, Anke R; Okbay, Aysu; Zabaneh, Delilah; Amin, Najaf; Breen, Gerome; Cesarini, David; Chabris, Christopher F; Iacono, William G; Ikram, M Arfan; Johannesson, Magnus; Koellinger, Philipp; Lee, James J; Magnusson, Patrik K E; McGue, Matt; Miller, Mike B; Ollier, William E R; Payton, Antony; Pendleton, Neil; Plomin, Robert; Rietveld, Cornelius A; Tiemeier, Henning; van Duijn, Cornelia M; Posthuma, Danielle

    2017-07-01

    Intelligence is associated with important economic and health-related life outcomes. Despite intelligence having substantial heritability (0.54) and a confirmed polygenic nature, initial genetic studies were mostly underpowered. Here we report a meta-analysis for intelligence of 78,308 individuals. We identify 336 associated SNPs (METAL P < 5 × 10-8). Despite the difference in heritability of intelligence in childhood (0.45) and adulthood (0.80), we show substantial genetic correlation (rg = 0.89, LD score regression P = 5.4 × 10-29). These findings provide new insight into the genetic architecture of intelligence.

  19. Data Acquisition, Management, and Analysis in Support of the Audiology and Hearing Conservation and the Orbital Debris Program Office

    Science.gov (United States)

    Dicken, Todd

    2012-01-01

    My internship at Johnson Space Center, Houston, TX, consisted of working simultaneously in the Space Life Science Directorate (Clinical Services Branch, SD3) in Audiology and Hearing Conservation and in the Astromaterials Research and Exploration Sciences Directorate in the Orbital Debris Program Office (KX). The purpose of the project supporting the Audiology and Hearing Conservation Clinic (AuHCon) is to organize and analyze auditory test data obtained from tests conducted onboard the International Space Station (ISS) and in Johnson Space Center's clinic. Astronauts undergo a special type of auditory test called an On-Orbit Hearing Assessment (OOHA), which monitors hearing function while crewmembers are exposed to noise and microgravity during long-duration spaceflight. Data needed to be formatted to assist the audiologist in studying, analyzing and reporting OOHA results from all ISS missions, with comparison to conventional preflight and postflight audiometric test results of crewmembers. Orbital debris is the #1 threat to manned spacecraft; therefore, NASA is investing in different measurement techniques to acquire information on orbital debris. These measurements are taken with telescopes in different parts of the world to acquire brightness variations over time, from which size, rotation rates and material information can be determined for orbital debris. Currently, many assumptions are made to resolve size and material from observed brightness; therefore, a laboratory (the Optical Measurement Center) is used to simulate the space environment and acquire information on known targets best suited to model the orbital debris population. In the Orbital Debris Program Office (ODPO), telescopic data were acquired and analyzed to better assess the orbital debris population.

  20. Sampling and analysis validates acceptable knowledge on LANL transuranic, heterogeneous, debris waste, or ''Cutting the Gordian knot that binds WIPP''

    International Nuclear Information System (INIS)

    Kosiewicz, S.T.; Triay, I.R.; Souza, L.A.

    1999-01-01

    Through sampling and toxicity characteristic leaching procedure (TCLP) analyses, LANL and the DOE validated that a LANL transuranic (TRU) waste (TA-55-43, Lot No. 01) was not a Resource Conservation and Recovery Act (RCRA) hazardous waste. This paper describes the sampling and analysis project as well as the statistical assessment of the analytical results. The analyses were conducted according to the requirements and procedures in the sampling and analysis plan approved by the New Mexico Environment Department. The plan used a statistical approach that was consistent with the stratified, random sampling requirements of SW-846. LANL adhered to the plan during sampling and chemical analysis of randomly selected items of the five major types of materials in this heterogeneous, radioactive, debris waste. To generate portions of the plan, LANL analyzed a number of non-radioactive items that were representative of the mix of items present in the waste stream. Data from these cold surrogates were used to generate the means and variances needed to optimize the design. Based on statistical arguments alone, only two samples from the entire waste stream were deemed necessary; however, a decision was made to analyze at least two samples of each of the five major waste types. To obtain these samples, nine TRU waste drums were opened. Sixty-six radioactively contaminated and four non-radioactive grab samples were collected. Portions of the samples were composited for chemical analyses. In addition, a radioactively contaminated sample of rust-colored powder of interest to the New Mexico Environment Department (NMED) was collected and qualitatively identified as rust.

  1. Energy Demand Forecasting: Combining Cointegration Analysis and Artificial Intelligence Algorithm

    OpenAIRE

    Huang, Junbing; Tang, Yuee; Chen, Shuxing

    2018-01-01

    Energy is vital for the sustainable development of China. Accurate forecasts of annual energy demand are essential to schedule energy supply and provide valuable suggestions for developing related industries. In the existing literature on energy use prediction, the artificial intelligence-based (AI-based) model has received considerable attention. However, few econometric and statistical evidences exist that can prove the reliability of the current AI-based model, an area that still needs to ...

  2. Analysis and reconstructed modelling of the debris flow event of the 21st of July 2012 of St. Lorenzen (Styria, Austria)

    Science.gov (United States)

    Janu, Stefan; Mehlhorn, Susanne; Moser, Markus

    2013-04-01

    The village of St. Lorenzen in the Styrian Palten valley is situated on the banks of the Lorenz torrent, in which a debris flow event occurred in the early morning hours of the 21st of July 2012, causing catastrophic damage to residential buildings and other infrastructural facilities. In the ministry-approved hazard zone map of 2009, the flood water discharge and bedload volume associated with a 150-year event were estimated at 34 m³/s and 25,000 m³, respectively, for the 5.84 km² catchment area. The bedload transport capacity of the torrent was classified as ranging from 'heavy' to 'capable of producing debris flows'. The dominant process type of the mass movement event may be described as a fine-grained debris flow. The damage in the residential area of St. Lorenzen was caused by a debris flow pulse in the lower reach of the Lorenz torrent. This debris flow pulse was in turn caused by numerous landslides along the middle reaches of the torrent, some of which caused blockages, ultimately leading to an outburst event in the main torrent. Discharge cross-sections ranging from 65 - 90 m², and over 100 m² in a few instances, were measured upstream of the St. Lorenzen residential area. Back-calculations of velocities yielded an average debris flow velocity along the middle reaches of the torrent between 11 and 16 m/s. An average velocity of 9 m/s was calculated for the debris flow at the neck of the alluvial fan directly behind the center of the village. Due to both the high discharge values as well as to the height of the mass movement deposits, the natural hazard event of 21 July 2012 in St. Lorenzen is clearly to be described as having had an extreme intensity. A total of 67 buildings were damaged along the Lorenz torrent, 7 of which were completely destroyed. According to the Austrian Service for

  3. Low Power Multi-Hop Networking Analysis in Intelligent Environments.

    Science.gov (United States)

    Etxaniz, Josu; Aranguren, Gerardo

    2017-05-19

    Intelligent systems are driven by the latest technological advances in many different areas such as sensing, embedded systems, wireless communications or context recognition. This paper focuses on some of those areas. Concretely, the paper deals with wireless communications issues in embedded systems. More precisely, the paper combines the multi-hop networking with Bluetooth technology and a quality of service (QoS) metric, the latency. Bluetooth is a radio license-free worldwide communication standard that makes low power multi-hop wireless networking available. It establishes piconets (point-to-point and point-to-multipoint links) and scatternets (multi-hop networks). As a result, many Bluetooth nodes can be interconnected to set up ambient intelligent networks. Then, this paper presents the results of the investigation on multi-hop latency with park and sniff Bluetooth low power modes conducted over the hardware test bench previously implemented. In addition, the empirical models to estimate the latency of multi-hop communications over Bluetooth Asynchronous Connectionless Links (ACL) in park and sniff mode are given. The designers of devices and networks for intelligent systems will benefit from the estimation of the latency in Bluetooth multi-hop communications that the models provide.
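
    The published empirical latency models are not reproduced in the abstract, so the following is only a hedged sketch of the general form such a model might take: end-to-end latency growing roughly linearly with the hop count, with an extra per-hop penalty for the low-power (sniff or park) mode. The per-hop cost and mode overheads below are placeholder assumptions, not the paper's fitted coefficients.

```python
# Hedged sketch of a generic per-hop latency model for Bluetooth ACL scatternets.
def multihop_latency_ms(hops, per_hop_ms=15.0, mode_overhead_ms=0.0):
    """Estimate end-to-end latency as a linear function of the hop count."""
    return hops * (per_hop_ms + mode_overhead_ms)

# Compare hypothetical active, sniff and park penalties over a 4-hop scatternet path
for mode, overhead in (("active", 0.0), ("sniff", 40.0), ("park", 120.0)):
    print(f"{mode:6s}: {multihop_latency_ms(4, mode_overhead_ms=overhead):.0f} ms")
```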

  4. Business and Social Behaviour Intelligence Analysis Using PSO

    Directory of Open Access Journals (Sweden)

    Vinay S Bhaskar

    2014-06-01

    Full Text Available The goal of this paper is to elaborate swarm intelligence for business intelligence decision making and the improvement of business rules management. The paper introduces a decision-making model based on the application of Artificial Neural Networks (ANNs) and the Particle Swarm Optimization (PSO) algorithm. Essentially, business spatial data illustrate group behaviors. Swarm optimization is strongly influenced by the behavior of creatures that act in groups. Spatial data are defined as data represented by 2D or 3D images; SQL Server currently supports only 2D images. Location is an essential part of organizational as well as business data: enterprises maintain customer address lists, own property, ship goods from and to warehouses, manage transport flows among their workforce, and perform many other activities. In other words, a great deal of spatial data is used and processed by enterprises, organizations and other bodies in order to make things more visible and self-descriptive. From the experiments, we found that PSO can facilitate intelligence in social and business behaviour
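
    As a rough illustration of the optimization algorithm named above, the following is a minimal, self-contained PSO sketch (standard inertia-weight formulation); the objective function, swarm size and coefficients are generic placeholders rather than the configuration used in the paper.

```python
import random

def pso(objective, dim, n_particles=30, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimisation of a continuous objective function."""
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                     # personal best positions
    pbest_val = [objective(p) for p in pos]
    gbest = min(zip(pbest_val, pbest))[1][:]        # global best position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest_val[i], pbest[i] = val, pos[i][:]
                if val < objective(gbest):
                    gbest = pos[i][:]
    return gbest

# Example: minimise a simple sphere function as a stand-in for a decision objective
print(pso(lambda x: sum(v * v for v in x), dim=2))
```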

  5. Breastfeeding and intelligence: a systematic review and meta-analysis.

    Science.gov (United States)

    Horta, Bernardo L; Loret de Mola, Christian; Victora, Cesar G

    2015-12-01

    This study was aimed at systematically reviewing evidence of the association between breastfeeding and performance in intelligence tests. Two independent searches were carried out using Medline, LILACS, SCIELO and Web of Science. Studies restricted to infants and those where estimates were not adjusted for stimulation or interaction at home were excluded. Fixed- and random-effects models were used to pool the effect estimates, and a random-effects regression was used to assess potential sources of heterogeneity. We included 17 studies with 18 estimates of the relationship between breastfeeding and performance in intelligence tests. In a random-effects model, breastfed subjects achieved a higher IQ [mean difference: 3.44 points (95% confidence interval: 2.30; 4.58)]. We found no evidence of publication bias. Studies that controlled for maternal IQ showed a smaller benefit from breastfeeding [mean difference 2.62 points (95% confidence interval: 1.25; 3.98)]. In the meta-regression, none of the study characteristics explained the heterogeneity among the studies. Breastfeeding is related to improved performance in intelligence tests. A positive effect of breastfeeding on cognition was also observed in a randomised trial. This suggests that the association is causal. ©2015 The Authors. Acta Paediatrica published by John Wiley & Sons Ltd on behalf of Foundation Acta Paediatrica.
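
    For readers unfamiliar with the pooling step mentioned above, the sketch below shows a standard DerSimonian-Laird random-effects calculation on inverse-variance weights; the three effect sizes and standard errors are made-up illustrations, not the 18 estimates analysed in the paper.

```python
import math

def dersimonian_laird(effects, ses):
    """Pool mean differences with a DerSimonian-Laird random-effects model."""
    w = [1 / se ** 2 for se in ses]                       # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)         # between-study variance
    w_re = [1 / (se ** 2 + tau2) for se in ses]
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    se_pooled = math.sqrt(1 / sum(w_re))
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

# Illustrative IQ mean differences and standard errors from three hypothetical studies
print(dersimonian_laird([3.0, 4.2, 2.1], [0.9, 1.2, 0.7]))
```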

  6. Artificial intelligence

    CERN Document Server

    Ennals, J R

    1987-01-01

    Artificial Intelligence: State of the Art Report is a two-part report consisting of the invited papers and the analysis. The editor first gives an introduction to the invited papers before presenting each paper and the analysis, and then concludes with the list of references related to the study. The invited papers explore the various aspects of artificial intelligence. The analysis part assesses the major advances in artificial intelligence and provides a balanced analysis of the state of the art in this field. The Bibliography compiles the most important published material on the subject of

  7. Solar Radiation Pressure Estimation and Analysis of a GEO Class of High Area-to-Mass Ratio Debris Objects

    Science.gov (United States)

    Kelecy, Tom; Payne, Tim; Thurston, Robin; Stansbery, Gene

    2007-01-01

    A population of deep space objects is thought to be high area-to-mass ratio (AMR) debris having origins from sources in the geosynchronous orbit (GEO) belt. The typical AMR values have been observed to range anywhere from 1's to 10's of m²/kg, and hence, higher than average solar radiation pressure effects result in long-term migration of eccentricity (0.1-0.6) and inclination over time. However, the nature of the orientation-dependent debris dynamics also results in time-varying solar radiation forces about the average, which complicate the short-term orbit determination processing. The orbit determination results are presented for several of these debris objects, and highlight their unique and varied dynamic attributes. Estimation of the solar pressure dynamics over time scales suitable for resolving the shorter term dynamics improves the orbit estimation, and hence, the orbit predictions needed to conduct follow-up observations.
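
    To give a feel for why a high AMR matters, the following back-of-the-envelope sketch evaluates the standard solar radiation pressure acceleration a = P⊙ Cr (A/m) at 1 AU; the reflectivity coefficient and the example AMR values are assumptions for illustration.

```python
# Back-of-the-envelope solar radiation pressure (SRP) acceleration for HAMR debris.
SOLAR_PRESSURE_1AU = 4.56e-6   # N/m^2, solar radiation pressure at 1 AU

def srp_acceleration(amr_m2_per_kg, cr=1.3):
    """SRP acceleration in m/s^2 for a given area-to-mass ratio (m^2/kg)."""
    return SOLAR_PRESSURE_1AU * cr * amr_m2_per_kg

# Ordinary satellite (~0.01 m^2/kg) versus HAMR debris (1 and 20 m^2/kg)
for amr in (0.01, 1.0, 20.0):
    print(f"AMR = {amr:6.2f} m^2/kg -> a_SRP = {srp_acceleration(amr):.2e} m/s^2")
```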

  8. Links between Bloom's Taxonomy and Gardener's Multiple Intelligences: The Issue of Textbook Analysis

    Science.gov (United States)

    Tabari, Mahmoud Abdi; Tabari, Iman Abdi

    2015-01-01

    The major thrust of this research was to investigate the cognitive aspect of the high school textbooks and interchange series, due to their extensive use, through content analysis based on Bloom's taxonomy and Gardner's Multiple Intelligences (MI). This study embraced two perspectives in a grid in order to broaden and deepen the analysis by…

  9. Property measurements and inner state estimation of simulated fuel debris

    Energy Technology Data Exchange (ETDEWEB)

    Hirooka, S.; Kato, M.; Morimoto, K.; Washiya, T. [Japan Atomic Energy Agency, Ibaraki (Japan)

    2014-07-01

    Fuel debris properties and the inner state, such as the temperature profile, were evaluated by analysis of simulated fuel debris manufactured from UO{sub 2} and oxidized zircaloy. The center of the fuel debris was expected to be in a molten state soon after the meltdown accident of LWRs because the power density was very high. On the other hand, the surface of the fuel debris was cooled in the water. This large temperature gradient may cause inner stress, and consequent cracks were expected. (author)

  10. 3rd International Workshop on Intelligent Data Analysis and Management (IDAM)

    CERN Document Server

    Wang, Leon; Hong, Tzung-Pei; Yang, Hsin-Chang; Ting, I-Hsien

    2013-01-01

    These papers on Intelligent Data Analysis and Management (IDAM) examine issues related to the research and applications of Artificial Intelligence techniques in data analysis and management across a variety of disciplines. The papers derive from the 2013 IDAM conference in Kaohsiung, Taiwan. It is an interdisciplinary research field involving academic researchers in information technologies, computer science, public policy, bioinformatics, medical informatics, and social and behavior studies, etc. The techniques studied include (but are not limited to): data visualization, data pre-processing, data engineering, database mining techniques, tools and applications, evolutionary algorithms, machine learning, neural nets, fuzzy logic, statistical pattern recognition, knowledge filtering, and post-processing, etc.

  11. Pattern analysis, intelligent security and the Internet of Things

    CERN Document Server

    Muda, Azah; Choo, Yun-Huoy

    2015-01-01

    This volume presents the selected papers from the five parallel symposiums of the 2014 Fourth World Congress on Information and Communication Technologies (WICT 2014) held in Malacca, Malaysia. The theme of WICT 2014 was 'Innovating ICT for Social Revolutions'. WICT 2014 was co-organized by Machine Intelligence Research Labs (MIR Labs), USA and Universiti Teknikal Malaysia Melaka, Malaysia. WICT 2014 was technically co-sponsored by the IEEE Systems, Man & Cybernetics Society Malaysia and Spain Chapters and technically supported by the IEEE Systems, Man and Cybernetics Society, Technical Committee on Soft Computing.

  12. Continuous quality improvement using intelligent infusion pump data analysis.

    Science.gov (United States)

    Breland, Burnis D

    2010-09-01

    The use of continuous quality-improvement (CQI) processes in the implementation of intelligent infusion pumps in a community teaching hospital is described. After the decision was made to implement intelligent i.v. infusion pumps in a 413-bed, community teaching hospital, drug libraries for use in the safety software had to be created. Before drug libraries could be created, it was necessary to determine the epidemiology of medication use in various clinical care areas. Standardization of medication administration was performed through the CQI process, using practical knowledge of clinicians at the bedside and evidence-based drug safety parameters in the scientific literature. Post-implementation, CQI allowed refinement of clinically important safety limits while minimizing inappropriate, meaningless soft limit alerts on a few select agents. Assigning individual clinical care areas (CCAs) to individual patient care units facilitated customization of drug libraries and identification of specific CCA compliance concerns. Between June 2007 and June 2008, there were seven library updates. These involved drug additions and deletions, customization of individual CCAs, and alterations of limits. Overall compliance with safety software use rose over time, from 33% in November 2006 to over 98% in December 2009. Many potentially clinically significant dosing errors were intercepted by the safety software, prompting edits by end users. Only 4-6% of soft limit alerts resulted in edits. Compliance rates for use of infusion pump safety software varied among CCAs over time. Education, auditing, and refinement of drug libraries led to improved compliance in most CCAs.

  13. Prediction for human intelligence using morphometric characteristics of cortical surface: partial least square analysis.

    Science.gov (United States)

    Yang, J-J; Yoon, U; Yun, H J; Im, K; Choi, Y Y; Lee, K H; Park, H; Hough, M G; Lee, J-M

    2013-08-29

    A number of imaging studies have reported neuroanatomical correlates of human intelligence with various morphological characteristics of the cerebral cortex. However, it is not yet clear whether these morphological properties of the cerebral cortex account for human intelligence. We assumed that the complex structure of the cerebral cortex could be explained effectively considering cortical thickness, surface area, sulcal depth and absolute mean curvature together. In 78 young healthy adults (age range: 17-27, male/female: 39/39), we used the full-scale intelligence quotient (FSIQ) and the cortical measurements calculated in native space from each subject to determine how much combining various cortical measures explained human intelligence. Since each cortical measure is thought to be not independent but highly inter-related, we applied partial least square (PLS) regression, which is one of the most promising multivariate analysis approaches, to overcome multicollinearity among cortical measures. Our results showed that 30% of FSIQ was explained by the first latent variable extracted from PLS regression analysis. Although it is difficult to relate the first derived latent variable with specific anatomy, we found that cortical thickness measures had a substantial impact on the PLS model supporting the most significant factor accounting for FSIQ. Our results presented here strongly suggest that the new predictor combining different morphometric properties of complex cortical structure is well suited for predicting human intelligence. Copyright © 2013 IBRO. Published by Elsevier Ltd. All rights reserved.
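
    A minimal sketch of the modelling approach named above (a one-component PLS regression relating many collinear cortical measures to FSIQ); the synthetic data generated here merely mimic the multicollinearity problem and are not the study's measurements.

```python
# Sketch: partial least squares regression of FSIQ on collinear cortical measures.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_subjects, n_regions = 78, 40
thickness = rng.normal(2.5, 0.2, (n_subjects, n_regions))
area = 0.5 * thickness + rng.normal(0, 0.05, (n_subjects, n_regions))   # highly collinear
X = np.hstack([thickness, area])
weights = rng.normal(0, 5, n_regions)                   # synthetic per-region effect on IQ
fsiq = 100 + (thickness - 2.5) @ weights + rng.normal(0, 5, n_subjects)

pls = PLSRegression(n_components=1)                     # keep only the first latent variable
r2 = cross_val_score(pls, X, fsiq, cv=5, scoring="r2").mean()
print(f"cross-validated R^2 of the first latent variable: {r2:.2f}")
```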

  14. Spatial Analysis of Large Woody Debris Arrangement in a Midwestern U.S. River System: Geomorphic Implications and Influences

    Science.gov (United States)

    Martin, D. J.

    2013-12-01

    Large woody debris (LWD) is universally recognized as a key component of the geomorphological and ecological function of fluvial systems and has been increasingly incorporated into stream restoration and watershed management projects. However, 'natural' processes of recruitment and the subsequent arrangement of LWD within the river network are poorly understood and are thus rarely a management consideration. Additionally, LWD research tends to be regionally biased toward mountainous regions, and scale biased toward the micro-scale. In many locations, the lack of understanding has led to the failure of restoration/rehabilitation projects that involved the use of LWD. This research uses geographic information systems and spatial analysis techniques to investigate longitudinal arrangement patterns of LWD in a low-gradient, Midwestern river. A large-scale GPS inventory of LWD was performed on the Big River, located in the eastern Missouri Ozarks, resulting in over 5,000 logged positions of LWD along seven river segments covering nearly 100 km of the 237 km river system. A time series analysis framework was used to statistically identify longitudinal spatial patterns of LWD arrangement along the main stem of the river, and correlation analyses were performed to help identify physical controls of those patterns. Results indicate that upstream segments have slightly lower densities than downstream segments, with the exception of the farthest upstream segment. Results also show a lack of an overall longitudinal trend in LWD density; however, periodogram analysis revealed an inherent periodicity in LWD arrangement. Periodicities were most evident in the downstream segments, with frequencies ranging from 3 km to 7 km. Additionally, Pearson correlation analysis, performed within the segment displaying the strongest periodic behavior, shows that LWD densities are correlated with channel sinuosity (r=0.25). Ongoing research is investigating further relationships between arrangement
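
    The periodogram step described above can be illustrated with a short, self-contained sketch: a synthetic LWD density series with a 5 km periodicity is sampled every 0.1 km along the channel and its dominant wavelength is recovered. The sampling interval and the synthetic series are assumptions for illustration only.

```python
# Sketch of a spatial periodogram analysis of LWD density along the channel.
import numpy as np
from scipy.signal import periodogram

dx_km = 0.1
x = np.arange(0, 40, dx_km)                            # 40 km of channel
rng = np.random.default_rng(1)
density = 5 + 2 * np.sin(2 * np.pi * x / 5.0) + rng.normal(0, 1, x.size)

freqs, power = periodogram(density, fs=1 / dx_km)      # frequencies in cycles per km
dominant = freqs[np.argmax(power[1:]) + 1]             # skip the zero-frequency bin
print(f"dominant wavelength ~ {1 / dominant:.1f} km")
```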

  15. Analysis of rainfall-induced shallow landslides and debris flows in the Eastern Pyrenees

    Science.gov (United States)

    Portilla Gamboa, M.; Hürlimann, M.; Corominas, J.

    2009-09-01

    The inventory of rainfall-induced mass movements, rainfall data, and slope characteristics are considered the basis of the analysis determining appropriate rainfall thresholds for mass movements in a specific region. The rainfall-induced landslide thresholds established in the literature for the Catalan Pyrenees have been formulated with reference to the rainfall events of November 1982, September 1992, December 1997, and others that occurred after 1999. It has been shown that a rainfall intensity greater than 190 mm in 24 hours without antecedent rainfall would be necessary to produce mass movements (Corominas and Moya, 1999; Corominas et al., 2002), or 51 mm in 24 h with 61 mm of accumulated rainfall (Marco, 2007). Short-duration, high-intensity rainfalls have brought about several mass movements in some Catalonian regions throughout the twenty-first century (Berga, Bonaigua, Saldes, Montserrat, Port-Ainé, Riu Runer, and Sant Nicolau). Preliminary analysis of these events shows that it is necessary to review the thresholds defined so far and redo the existing inventory of mass movements for the Catalan Pyrenees. The present work shows the usefulness of aerial photographs in the reconstruction of the inventory of historic mass movements (Molló-Queralbs, 1940; Arties-Vielha, 1963; Barruera-Senet, 1940 and 1963; and Berga-Cercs, 1982, 1997 and 2008). It also highlights the treatment given to the scarce and scattered rainfall data available within these regions of Catalonia, and the application of Geographic Information Systems (ArcGIS) in the management of the gathered information. The results acquired until now show that the historic rainfall events that occurred in the Eastern Pyrenees have yielded many more mass movements than those reported in the literature. Moreover, it can be said that the thresholds formulated for the Pyrenees are valid for long-lasting regional rainfalls, and not for local downpours. In the latter cases it should be necessary to take into account the
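
    As a simple illustration of how the quoted thresholds could be applied, the sketch below encodes the two triggering conditions cited in the abstract (more than 190 mm in 24 h with no antecedent rainfall, or 51 mm in 24 h with 61 mm of accumulated rainfall) as a screening function; the example events are invented.

```python
# Screening of rainfall events against the two quoted triggering conditions.
def exceeds_threshold(rain_24h_mm, antecedent_mm):
    """Return True if either quoted triggering condition is met."""
    no_antecedent_case = antecedent_mm == 0 and rain_24h_mm > 190
    antecedent_case = antecedent_mm >= 61 and rain_24h_mm >= 51
    return no_antecedent_case or antecedent_case

print(exceeds_threshold(200, 0))    # True: intense storm after a dry period
print(exceeds_threshold(55, 70))    # True: moderate storm on a wet catchment
print(exceeds_threshold(40, 10))    # False: below both thresholds
```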

  16. Comparison of an Inductance In-Line Oil Debris Sensor and Magnetic Plug Oil Debris Sensor

    Science.gov (United States)

    Dempsey, Paula J.; Tuck, Roger; Showalter, Stephen

    2012-01-01

    The objective of this research was to compare the performance of an inductance in-line oil debris sensor and magnetic plug oil debris sensor when detecting transmission component health in the same system under the same operating conditions. Both sensors were installed in series in the NASA Glenn Spiral Bevel Gear Fatigue Rig during tests performed on 5 gear sets (pinion/gear) when different levels of damage occurred on the gear teeth. Results of this analysis found both the inductance in-line oil debris sensor and magnetic plug oil debris sensor have benefits and limitations when detecting gearbox component damage.

  17. Data Gap Analysis and Damage Case Studies: Risk Analyses from Construction and Demolition Debris Landfills and Recycling Facilities

    Science.gov (United States)

    The report presents an evaluation of construction and demolition (C&D) debris management in the US to update and expand upon the previous set of data to include information on more recent cases of damage and potential impacts and expand the breadth of damages beyond groundwater a...

  18. SEM Analysis of particles from the 28,000 B.P. El Zaguan debris avalanche deposit, Nevado de Toluca volcano, Central Mexico: evidences of flow behavior during emplacement

    Science.gov (United States)

    Caballero, L.; Capra, L.

    2008-12-01

    The Zaguan deposit originated at 28,000 yr B.P. from the flank collapse of the Nevado de Toluca volcano, a dacitic stratovolcano of the Transmexican Volcanic Belt. A scanning electron microscope (SEM) analysis was performed on some clasts of this deposit to observe microtextures produced during transport and emplacement of the debris avalanche flow. Particles from the 2, 0 and -2 Φ granulometric classes were randomly selected and their surface textures were described. The textures observed were divided into two groups, collision and shear structures, indicating different clast interactions. Shear textures were observed predominantly in the basal part of the deposit and consisted of parallel ridges, parallel grooves, scratches and lips. Collision textures were mainly present in the upper part of the deposit and consisted of fractures, percussion marks, and broken or ground crystals. These characteristics, coupled with field observations, such as the presence of clastic dikes and deformed lacustrine megaclasts, indicate that the basal part of the debris avalanche was moving in a partially liquefied state, where particles were not able to move freely because of the confinement exerted by the upper part of the flow, so shear stresses dominated. On the contrary, the particles in the upper part were able to move freely, so the principal mechanism of interaction between particles was collision. These microscopic textures are in agreement with the previously described emplacement behavior of debris avalanches of volcanic origin, which suggests a stratified flow dominated by different transport and depositional mechanisms depending on flow depth and possible fluid content at the base.

  19. Determining the Particle Size of Debris from a Tunnel Boring Machine Through Photographic Analysis and Comparison Between Excavation Performance and Rock Mass Properties

    Science.gov (United States)

    Rispoli, A.; Ferrero, A. M.; Cardu, M.; Farinetti, A.

    2017-10-01

    This paper presents the results of a study carried out on a 6.3-m-diameter exploratory tunnel excavated in hard rock by an open tunnel boring machine (TBM). The study provides a methodology, based on photographic analysis, for the evaluation of the particle size distribution of debris produced by the TBM. A number of tests were carried out on the debris collected during the TBM advancement. In order to produce a parameter indicative of the particle size of the debris, the coarseness index (CI) was defined and compared with some parameters representative of the TBM performance [i.e. the excavation specific energy (SE) and field penetration index (FPI)] and rock mass features, such as RMR, GSI, uniaxial compression strength and joint spacing. The results obtained showed a clear trend between the CI and some TBM performance parameters, such as SE and FPI. On the contrary, due to the rock mass fracturing, a clear relationship between the CI and rock mass characteristics was not found.
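
    The abstract does not give the formula behind the coarseness index, so the sketch below assumes one convention that is common for TBM muck analysis: CI as the sum of the cumulative weight percentages retained in a stack of size classes, so that coarser debris yields a higher CI. Both the definition and the numbers are assumptions for illustration.

```python
# Hypothetical coarseness index (CI): sum of cumulative % retained, coarsest class first.
def coarseness_index(retained_percent):
    """retained_percent: weight % retained in each size class, coarsest first."""
    cumulative, ci = 0.0, 0.0
    for p in retained_percent:
        cumulative += p
        ci += cumulative
    return ci

# Coarser debris (more weight in the first, largest classes) gives a higher CI
print(coarseness_index([40, 30, 20, 10]))   # 300.0
print(coarseness_index([10, 20, 30, 40]))   # 200.0
```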

  20. Quantitative Analysis of Mixed Halogen Dioxins and Furans in Fire Debris Utilizing Atmospheric Pressure Ionization Gas Chromatography-Triple Quadrupole Mass Spectrometry.

    Science.gov (United States)

    Organtini, Kari L; Myers, Anne L; Jobst, Karl J; Reiner, Eric J; Ross, Brian; Ladak, Adam; Mullin, Lauren; Stevens, Douglas; Dorman, Frank L

    2015-10-20

    Residential and commercial fires generate a complex mixture of volatile, semivolatile, and nonvolatile compounds. This study focused on the semi/nonvolatile components of fire debris to better understand firefighter exposure risks. Using the enhanced sensitivity of gas chromatography coupled to atmospheric pressure ionization-tandem mass spectrometry (APGC-MS/MS), complex fire debris samples collected from simulation fires were analyzed for the presence of potentially toxic polyhalogenated dibenzo-p-dioxins and dibenzofurans (PXDD/Fs and PBDD/Fs). Extensive method development was performed to create multiple reaction monitoring (MRM) methods for a wide range of PXDD/Fs from dihalogenated through hexa-halogenated homologue groups. Higher halogenated compounds were not observed due to difficulty eluting them off the long column used for analysis. This methodology was able to identify both polyhalogenated (mixed bromo-/chloro- and polybromo-) dibenzo-p-dioxins and dibenzofurans in the simulated burn study samples collected, with the dibenzofuran species being the dominant compounds in the samples. Levels of these compounds were quantified as total homologue groups due to the limitations of commercial congener availability. Concentration ranges in household simulation debris were observed at 0.01-5.32 ppb (PXDFs) and 0.18-82.11 ppb (PBDFs). Concentration ranges in electronics simulation debris were observed at 0.10-175.26 ppb (PXDFs) and 0.33-9254.41 ppb (PBDFs). Samples taken from the particulate matter coating the firefighters' helmets contained some of the highest levels of dibenzofurans, ranging from 4.10 ppb to 2.35 ppm. The data suggest that firefighters and first responders at fire scenes are exposed to a complex mixture of potentially hundreds to thousands of different polyhalogenated dibenzo-p-dioxins and dibenzofurans that could negatively impact their health.

  1. Intelligence Issues for Congress

    Science.gov (United States)

    2013-04-23

    open source information, OSINT (newspapers ... by user agencies. Section 1052 of the Intelligence Reform Act expressed the sense of Congress that there should be an open source intelligence ... center to coordinate the collection, analysis, production, and dissemination of open source intelligence to other intelligence agencies. An Open Source

  2. Business Intelligence Systems Accounting Integration in Romania. a Comparative Analysis

    Directory of Open Access Journals (Sweden)

    Daniela Postolache (Males)

    2010-12-01

    Full Text Available Business Intelligence (BI) systems have penetrated the Romanian market, providing real decision support by integrating and synthesizing a large variety of information available in real time, anywhere in the world, including through mobile terminals. This study examines the BI solutions promoted in Romania through Internet sites written in Romanian, in terms of how accounting information integration is achieved. Our paper highlights the most frequently used economic and financial indicators and the tools most often selected by BI system developers to support decisions. The paper also points out the analyzed sites' lack of transparency regarding the configuration details of these economic instruments, which we consider likely to delay Romanian managers in becoming familiar with BI solutions and which represents a weakness in the promotion of these products.

  3. In the Context of Multiple Intelligences Theory, Intelligent Data Analysis of Learning Styles Was Based on Rough Set Theory

    Science.gov (United States)

    Narli, Serkan; Ozgen, Kemal; Alkan, Huseyin

    2011-01-01

    The present study aims to identify the relationship between individuals' multiple intelligence areas and their learning styles with mathematical clarity using the concept of rough sets, which is used in areas such as artificial intelligence, data reduction, discovery of dependencies, prediction of data significance, and generating decision…
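
    For readers unfamiliar with rough sets, the sketch below shows the two core constructions (lower and upper approximations of a target set under an indiscernibility relation); the student records and attributes are invented placeholders, not data from the study.

```python
# Rough-set lower/upper approximation of a target set of students grouped by attributes.
from collections import defaultdict

def approximations(objects, target):
    """objects: {name: attribute tuple}; target: set of names to approximate."""
    classes = defaultdict(set)
    for name, attrs in objects.items():
        classes[attrs].add(name)                # indiscernibility classes
    lower, upper = set(), set()
    for c in classes.values():
        if c <= target:
            lower |= c                          # certainly in the target set
        if c & target:
            upper |= c                          # possibly in the target set
    return lower, upper

students = {
    "s1": ("visual", "logical"), "s2": ("visual", "logical"),
    "s3": ("auditory", "verbal"), "s4": ("kinesthetic", "musical"),
}
print(approximations(students, target={"s1", "s3"}))
# -> ({'s3'}, {'s1', 's2', 's3'})
```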

  4. First Euro-China Conference on Intelligent Data Analysis and Applications

    CERN Document Server

    Snasel, Vaclav; Corchado, Emilio; Abraham, Ajith; Wang, Shyue-Liang

    2014-01-01

    The First Euro-China Conference on Intelligent Data Analysis and Applications (ECC 2014) was hosted by the Shenzhen Graduate School of Harbin Institute of Technology and held in Shenzhen City on June 13-15, 2014. ECC 2014 was technically co-sponsored by Shenzhen Municipal People’s Government, IEEE Signal Processing Society, Machine Intelligence Research Labs, VSB-Technical University of Ostrava (Czech Republic), National Kaohsiung University of Applied Sciences (Taiwan), and Secure E-commerce Transactions (Shenzhen) Engineering Laboratory of Shenzhen Institute of Standards and Technology.

  5. Synthesis and Analysis in Artificial Intelligence: The Role of Theory in Agent Implementation

    NARCIS (Netherlands)

    Raine, Roxanne B.; op den Akker, Hendrikus J.A.; Cai, Zhiqiang; Graesser, Arthur C.; McNamara, Danielle S.

    2009-01-01

    The domain of artificial intelligence (AI) progresses with extraordinary vicissitude. Whereas prior authors have divided AI into the two categories of analysis and synthesis, Raine and op den Akker distinguish between four types of AI: that of appearance, function, simulation and interpretation.

  6. Theoretical analysis and real time implementation of a classical controller with intelligent properties

    Directory of Open Access Journals (Sweden)

    Essam Hendawi

    2018-05-01

    Full Text Available This paper presents the theoretical analysis and experimental implementation of a classical controller with intelligent properties. The controller has constant parameters, but it performs as an intelligent controller. The controller design mimics a fuzzy logic controller in a classical form and combines the advantages of classical controllers with the properties of intelligent controllers. The designed controller parameters force the controlled variable to behave like a first-order system with a desired time constant. A practical DC motor system is used to demonstrate the effectiveness of the presented controller. Root locus and frequency response (Bode diagram) analyses are used to aid the design of the controller parameters. Simulation and experimental results verify the high performance of the presented controller. Keywords: Classical controller, DC motor, Root locus, Frequency response, Arduino microcontroller
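
    A minimal sketch of the underlying design idea, assuming a simple proportional law on a first-order DC-motor speed model: the constant gain is chosen so the closed loop exhibits a desired time constant. The plant values, the desired time constant and the pure-proportional structure are illustrative assumptions, not the controller of the paper.

```python
# Choose a constant gain so a first-order plant closes the loop with a desired time constant.
K_plant, T_plant = 2.0, 0.5                  # motor gain and open-loop time constant [s]
tau_desired = 0.1                            # desired closed-loop time constant [s]
Kp = (T_plant / tau_desired - 1) / K_plant   # closed-loop tau = T_plant / (1 + K_plant*Kp)

dt, y, r = 0.001, 0.0, 1.0                   # Euler step, initial speed, setpoint
for _ in range(int(0.5 / dt)):               # simulate 0.5 s of the step response
    u = Kp * (r - y)                         # classical proportional control law
    y += dt * (-y + K_plant * u) / T_plant
print(f"Kp = {Kp:.2f}, speed after 0.5 s: {y:.3f} (a steady-state offset remains)")
```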

  7. Intelligent data analysis: the best approach for chronic heart failure (CHF) follow up management.

    Science.gov (United States)

    Mohammadzadeh, Niloofar; Safdari, Reza; Baraani, Alireza; Mohammadzadeh, Farshid

    2014-08-01

    Intelligent data analysis has the ability to prepare and present complex relations between symptoms and diseases and between medical and treatment consequences, and it definitely has a significant role in improving the follow-up management of chronic heart failure (CHF) patients, increasing speed and accuracy in diagnosis and treatment, reducing costs, and supporting the design and implementation of clinical guidelines. The aim of this article is to describe intelligent data analysis methods to improve patient monitoring in the follow-up and treatment of chronic heart failure patients, as the best approach for CHF follow-up management. The minimum data set (MDS) requirements for monitoring and follow-up of CHF patients were designed in a checklist with six main parts. All CHF patients discharged in 2013 from the Tehran Heart Center were selected. The MDS for monitoring CHF patient status was collected over 5 months at three different follow-up times. The gathered data were imported into the RapidMiner 5 software. Modeling was based on decision tree methods such as C4.5, CHAID and ID3, and on the k-Nearest Neighbors algorithm (K-NN) with k=1. The final analysis was based on a voting method. Decision trees and K-NN were evaluated using cross-validation. Creating and using standard terminologies, and databases consistent with these terminologies, helps to meet the challenges related to data collection from various places and data application in intelligent data analysis. It should be noted that intelligent analysis of health data and intelligent systems can never replace cardiologists. They can only act as helpful tools for the cardiologist's decision making.
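
    A minimal sketch of the modelling pipeline named above: several decision trees combined with a 1-nearest-neighbour classifier by majority vote and scored with cross-validation. Scikit-learn's CART trees stand in for C4.5/CHAID/ID3, and the synthetic data set stands in for the CHF minimum data set; both are assumptions for illustration.

```python
# Voting ensemble of decision trees and 1-NN, evaluated with 5-fold cross-validation.
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, n_features=12, random_state=0)  # stand-in data
vote = VotingClassifier([
    ("tree_gini", DecisionTreeClassifier(criterion="gini", random_state=0)),
    ("tree_entropy", DecisionTreeClassifier(criterion="entropy", random_state=0)),
    ("knn_1", KNeighborsClassifier(n_neighbors=1)),
], voting="hard")
print(f"mean CV accuracy: {cross_val_score(vote, X, y, cv=5).mean():.2f}")
```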

  8. Debris flow risk mitigation by the means of rigid and flexible barriers – experimental tests and impact analysis

    Directory of Open Access Journals (Sweden)

    L. Canelli

    2012-05-01

    Full Text Available The impact of a debris flow on a structure can have disastrous effects because of the enormous destructive potential of this type of phenomenon. Although risk mitigation structures such as sabo dams, filter dams and, more recently, flexible barriers are commonly introduced, very few universally recognized methods exist for the safe design of such structures. This study presents the results of experimental tests, conducted with a specifically created flume, aimed at obtaining detailed knowledge of the mechanical aspects and at analyzing the dynamics of the impact of a debris flow on different types of structures. The analyses of the tests, together with the calculation of the thrust caused by the flow, have made it possible to analyze the dynamics of the impact, which shows differing effects depending on the type of barrier installed.
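
    A hedged sketch of one widely used first estimate of the impact thrust, the hydrodynamic formulation p = α ρ v²; the empirical coefficient α, the flow density and the velocity below are generic assumptions, not the values calibrated from the flume tests.

```python
# Hydrodynamic estimate of peak debris-flow impact pressure on a barrier: p = alpha * rho * v^2.
def impact_pressure_kpa(density_kg_m3, velocity_m_s, alpha=2.0):
    """Peak dynamic impact pressure in kPa."""
    return alpha * density_kg_m3 * velocity_m_s ** 2 / 1000.0

# Example: a 1,900 kg/m^3 flow front hitting a barrier at 6 m/s
print(f"{impact_pressure_kpa(1900, 6.0):.0f} kPa")
```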

  9. Towards a New Approach of the Economic Intelligence Process: Basic Concepts, Analysis Methods and Informational Tools

    Directory of Open Access Journals (Sweden)

    Sorin Briciu

    2009-04-01

    Full Text Available One of the obvious trends in the current business environment is increased competition. In this context, organizations are becoming more and more aware of the importance of knowledge as a key factor in obtaining competitive advantage. A possible solution in knowledge management is Economic Intelligence (EI), which involves the collection, evaluation, processing, analysis, and dissemination of economic data (about products, clients, competitors, etc.) inside organizations. The availability of massive quantities of data, correlated with advances in information and communication technology allowing for the filtering and processing of these data, provides new tools for the production of economic intelligence. The research is focused on innovative aspects of the economic intelligence process (models of analysis, activities, methods and informational tools) and provides practical guidelines for initiating this process. In this paper, we try: (a) to contribute to a coherent view of the economic intelligence process (approaches, stages, fields of application); (b) to describe the most important models of analysis related to this process; and (c) to analyze the activities, methods and tools associated with each stage of an EI process.

  10. Quality control of intelligence research

    International Nuclear Information System (INIS)

    Lu Yan; Xin Pingping; Wu Jian

    2014-01-01

    Quality control of intelligence research is the core issue of intelligence management and a key problem in the study of information science. This paper focuses on the performance of intelligence work to explain the significance of quality control in intelligence research. Summing up and analyzing the results of previous studies, it discusses quality control methods in intelligence research, introduces foreign experience with quality control of intelligence research, and proposes some recommendations to improve quality control in intelligence research. (authors)

  11. Analysis of Approaches to the Near-Earth Orbit Cleanup from Space Debris of the Size Below10 cm

    Directory of Open Access Journals (Sweden)

    V. I. Maiorova

    2016-01-01

    Full Text Available Nowadays, there are many concepts aimed at space debris removal from near-Earth orbits under way at different stages of detailed engineering and design. As opposed to large-size space debris (upper stages, rocket bodies, non-active satellites), tracking small objects of space debris (SOSD), such as picosatellites, satellite fragments, pyrotechnic devices, and other items less than 10 cm in size, using ground stations is, presently, a challenge. This SOSD feature allows the authors to propose the two most rational approaches, which use, respectively, a passive and an active (promptly maneuverable) space vehicle (SV) and appropriate schematic diagrams for their collection: 1) Passive scheme – the SV is launched into an orbit characterized by a high mathematical expectation of collision with a large amount of SOSD and, accordingly, by a high probability of capture using either active or passive tools. The SV does not execute any maneuvers, but can be equipped with a propulsion system required for orbit maintenance and correction and also for solving the tasks of long-range guidance. 2) Active scheme – the SV is launched into the target or operating orbit and executes a number of maneuvers to capture the SOSD using both active and passive tools. Thus, such an SV has to be equipped with a rather high-thrust propulsion system, which allows changes of its trajectory, and also with a guidance system to provide it with target coordinates. The guidance system can be built on either radio or optical devices; it can be installed onboard the debris-removal SV or onboard an SV which operates as a supply unit (if such SVs are foreseen). The paper describes each approach, emphasizes advantages and disadvantages, and defines the cutting-edge technologies to be implemented.

  12. Analysis of potential debris flow source areas on Mount Shasta, California, by using airborne and satellite remote sensing data

    Science.gov (United States)

    Crowley, J.K.; Hubbard, B.E.; Mars, J.C.

    2003-01-01

    Remote sensing data from NASA's Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) and the first spaceborne imaging spectrometer, Hyperion, show hydrothermally altered rocks mainly composed of natroalunite, kaolinite, cristobalite, and gypsum on both the Mount Shasta and Shastina cones. Field observations indicate that much of the visible altered rock consists of talus material derived from fractured rock zones within and adjacent to dacitic domes and nearby lava flows. Digital elevation data were utilized to distinguish steeply sloping altered bedrock from more gently sloping talus materials. Volume modeling based on the imagery and digital elevation data indicate that Mount Shasta drainage systems contain moderate volumes of altered rock, a result that is consistent with Mount Shasta's Holocene record of mostly small to moderate debris flows. Similar modeling for selected areas at Mount Rainier and Mount Adams, Washington, indicates larger altered rock volumes consistent with the occurrence of much larger Holocene debris flows at those volcanoes. The availability of digital elevation and spectral data from spaceborne sensors, such as Hyperion and the Advanced Spaceborne Thermal Emission and Reflectance Radiometer (ASTER), greatly expands opportunities for studying potential debris flow source characteristics at stratovolcanoes around the world. ?? 2003 Elsevier Inc. All rights reserved.

  13. NASA's New Orbital Debris Engineering Model, ORDEM2010

    Science.gov (United States)

    Krisko, Paula H.

    2010-01-01

    This paper describes the functionality and use of ORDEM2010, which replaces ORDEM2000, as the NASA Orbital Debris Program Office (ODPO) debris engineering model. Like its predecessor, ORDEM2010 serves the ODPO mission of providing spacecraft designers/operators and debris observers with a publicly available model to calculate orbital debris flux by current-state-of-knowledge methods. The key advance in ORDEM2010 is the input file structure of the yearly debris populations from 1995-2035 of sizes 10 micron - 1 m. These files include debris from low-Earth orbits (LEO) through geosynchronous orbits (GEO). Stable orbital elements (i.e., those that do not randomize on a sub-year timescale) are included in the files, as are debris size, debris number, material density, random error and population error. Material density is implemented from ground-test data into the NASA breakup model and assigned to debris fragments accordingly. The random and population errors are due to machine error and uncertainties in debris sizes. These high-fidelity population files call for a much higher-level model analysis than what was possible with the populations of ORDEM2000. Population analysis in the ORDEM2010 model consists of mapping matrices that convert the debris population elements to debris fluxes. One output mode results in a spacecraft-encompassing 3-D igloo of debris flux, compartmentalized by debris size, velocity, pitch, and yaw with respect to the spacecraft ram direction. The second output mode provides debris flux through an Earth-based telescope/radar beam from LEO through GEO. This paper compares the new ORDEM2010 with ORDEM2000 in terms of processes and results with examples of specific orbits.

  14. Intelligence: Real or artificial?

    OpenAIRE

    Schlinger, Henry D.

    1992-01-01

    Throughout the history of the artificial intelligence movement, researchers have strived to create computers that could simulate general human intelligence. This paper argues that workers in artificial intelligence have failed to achieve this goal because they adopted the wrong model of human behavior and intelligence, namely a cognitive essentialist model with origins in the traditional philosophies of natural intelligence. An analysis of the word “intelligence” suggests that it originally r...

  15. Nonlinear Analysis and Intelligent Control of Integrated Vehicle Dynamics

    Directory of Open Access Journals (Sweden)

    C. Huang

    2014-01-01

    Full Text Available With increasing and more stringent requirements for advanced vehicle integration, including vehicle dynamics and control, traditional control and optimization strategies may not qualify for many applications. This is because, among other factors, they do not consider the nonlinear characteristics of practical systems. Moreover, the vehicle wheel model has some inadequacies regarding the sideslip angle, road adhesion coefficient, vertical load, and velocity. In this paper, an adaptive neural wheel network is introduced, and the interaction between the lateral and vertical dynamics of the vehicle is analyzed. By means of nonlinear analyses such as the use of a bifurcation diagram and the Lyapunov exponent, the vehicle is shown to exhibit complicated motions with increasing forward speed. Furthermore, electric power steering (EPS and active suspension system (ASS, which are based on intelligent control, are used to reduce the nonlinear effect, and a negotiation algorithm is designed to manage the interdependences and conflicts among handling stability, driving smoothness, and safety. Further, a rapid control prototype was built using the hardware-in-the-loop simulation platform dSPACE and used to conduct a real vehicle test. The results of the test were consistent with those of the simulation, thereby validating the proposed control.

  16. Analysis of Intelligent Transportation Systems Using Model-Driven Simulations

    Directory of Open Access Journals (Sweden)

    Alberto Fernández-Isabel

    2015-06-01

    Full Text Available Intelligent Transportation Systems (ITSs integrate information, sensor, control, and communication technologies to provide transport related services. Their users range from everyday commuters to policy makers and urban planners. Given the complexity of these systems and their environment, their study in real settings is frequently unfeasible. Simulations help to address this problem, but present their own issues: there can be unintended mistakes in the transition from models to code; their platforms frequently bias modeling; and it is difficult to compare works that use different models and tools. In order to overcome these problems, this paper proposes a framework for a model-driven development of these simulations. It is based on a specific modeling language that supports the integrated specification of the multiple facets of an ITS: people, their vehicles, and the external environment; and a network of sensors and actuators conveniently arranged and distributed that operates over them. The framework works with a model editor to generate specifications compliant with that language, and a code generator to produce code from them using platform specifications. There are also guidelines to help researchers in the application of this infrastructure. A case study on advanced management of traffic lights with cameras illustrates its use.

  17. Do emotional intelligence and previous caring experience influence student nurse performance? A comparative analysis.

    Science.gov (United States)

    Stenhouse, Rosie; Snowden, Austyn; Young, Jenny; Carver, Fiona; Carver, Hannah; Brown, Norrie

    2016-08-01

    Reports of poor nursing care have focused attention on values-based selection of candidates onto nursing programmes. Values-based selection lacks clarity and valid measures. Previous caring experience might lead to better care. Emotional intelligence (EI) might be associated with performance, and is conceptualised and measurable. To examine the impact of 1) previous caring experience, 2) emotional intelligence, and 3) social connection scores on performance and retention in a cohort of first year nursing and midwifery students in Scotland. A longitudinal, quasi-experimental design. Adult and mental health nursing, and midwifery programmes in a Scottish University. Adult, mental health and midwifery students (n=598) completed the Trait Emotional Intelligence Questionnaire-short form and Schutte's Emotional Intelligence Scale on entry to their programmes at a Scottish University, alongside demographic and previous caring experience data. Social connection was calculated from a subset of questions identified within the TEIQue-SF in a prior factor and Rasch analysis. Student performance was calculated as the mean mark across the year. Withdrawal data were gathered. 598 students completed baseline measures. 315 students declared previous caring experience, 277 did not. An independent-samples t-test identified that those without previous caring experience scored higher on performance (57.33±11.38) than those with previous caring experience (54.87±11.19), a statistically significant difference of 2.47 (95% CI, 0.54 to 4.38), t(533)=2.52, p=.012. Emotional intelligence scores were not associated with performance. Social connection scores for those withdrawing (mean rank=249) and those remaining (mean rank=304.75) were statistically significantly different, U=15,300, z=-2.61, p<0.009. Previous caring experience led to worse performance in this cohort. Emotional intelligence was not a useful indicator of performance. Lower scores on the social connection factor were associated
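
    The two comparisons reported above can be reproduced in outline with standard SciPy tests; the sketch below runs an independent-samples t-test and a Mann-Whitney U test on synthetic placeholder data drawn to resemble the reported group statistics, not on the study's data.

```python
# Independent-samples t-test and Mann-Whitney U test on synthetic placeholder data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
perf_no_caring = rng.normal(57.3, 11.4, 277)     # performance, no caring experience
perf_caring = rng.normal(54.9, 11.2, 315)        # performance, previous caring experience
print(stats.ttest_ind(perf_no_caring, perf_caring, equal_var=True))

social_withdrawn = rng.normal(3.2, 0.8, 60)      # social connection, withdrew
social_remaining = rng.normal(3.6, 0.8, 500)     # social connection, remained
print(stats.mannwhitneyu(social_withdrawn, social_remaining, alternative="two-sided"))
```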

  18. Debris thickness patterns on debris-covered glaciers

    Science.gov (United States)

    Anderson, Leif S.; Anderson, Robert S.

    2018-06-01

    Many debris-covered glaciers have broadly similar debris thickness patterns: surface debris thickens down-glacier and tends to transition from convex-up to concave-up. We explain this pattern using theory (analytical and numerical models) paired with empirical observations. Down-glacier debris thickening results from the conveyor-belt-like nature of the glacier surface in the ablation zone (debris can typically only be added but not removed) and from the inevitable decline in ice surface velocity toward the terminus. Down-glacier thickening of debris leads to the reduction of sub-debris melt and debris emergence toward the terminus. Convex-up debris thickness patterns occur near the up-glacier end of debris covers where debris emergence dominates (ablation controlled). Concave-up debris thickness patterns occur toward glacier termini where declining surface velocities dominate (velocity controlled). A convex-concave debris thickness profile inevitably results from the down-glacier transition between ablation control and velocity control. Debris thickness patterns deviating from this longitudinal shape are most likely caused by changes in hillslope debris supply through time. By establishing this expected debris thickness pattern, the effects of climate change on debris cover can be better identified.

  19. Intelligent Techniques Using Molecular Data Analysis in Leukaemia: An Opportunity for Personalized Medicine Support System.

    Science.gov (United States)

    Banjar, Haneen; Adelson, David; Brown, Fred; Chaudhri, Naeem

    2017-01-01

    The use of intelligent techniques in medicine has brought a ray of hope in terms of treating leukaemia patients. Personalized treatment uses a patient's genetic profile to select a mode of treatment. This process makes use of molecular technology and machine learning to determine the most suitable approach to treating a leukaemia patient. Until now, no reviews have been published from a computational perspective concerning the development of personalized medicine intelligent techniques for leukaemia patients using molecular data analysis. This review studies the published empirical research on personalized medicine in leukaemia and synthesizes findings across studies related to intelligent techniques in leukaemia, with specific attention to particular categories of these studies to help identify opportunities for further research into personalized medicine support systems in chronic myeloid leukaemia. A systematic search was carried out to identify studies using intelligent techniques in leukaemia and to categorize these studies based on leukaemia type and also the task, data source, and purpose of the studies. Most studies used molecular data analysis for personalized medicine, but future advancement for leukaemia patients requires molecular models that use advanced machine-learning methods to automate decision-making in treatment management to deliver supportive medical information to the patient in clinical practice.

  20. ANALYSIS DATA SETS USING HYBRID TECHNIQUES APPLIED ARTIFICIAL INTELLIGENCE BASED PRODUCTION SYSTEMS INTEGRATED DESIGN

    OpenAIRE

    Daniel-Petru GHENCEA; Miron ZAPCIU; Claudiu-Florinel BISU; Elena-Iuliana BOTEANU; Elena-Luminiţa OLTEANU

    2017-01-01

    The paper proposes a prediction model of spindle behavior from the point of view of thermal deformations and vibration levels by highlighting and processing the characteristic equations. This model analysis for shafts with similar electro-mechanical characteristics can be achieved using a hybrid analysis based on artificial intelligence (genetic algorithms - artificial neural networks - fuzzy logic). The paper presents a prediction model obtaining a valid range of values f...

  1. Intelligent data-acquisition instrumentation for special nuclear material assay data analysis

    International Nuclear Information System (INIS)

    Ethridge, C.D.

    1980-01-01

    The Detection, Surveillance, Verification, and Recovery Group of the Los Alamos Scientific Laboratory Energy Division/Nuclear Safeguards Programs is now utilizing intelligent data-acquisition instrumentation for assay data analysis of special nuclear material. The data acquisition and analysis are enabled by the incorporation of a number-crunching microprocessor sequenced by a single component microcomputer. Microcomputer firmware establishes the capability for processing the computation of several selected functions and also the ability of instrumentation self-diagnostics

  2. A globally complete map of supraglacial debris cover and a new toolkit for debris cover research

    Science.gov (United States)

    Herreid, Sam; Pellicciotti, Francesca

    2017-04-01

    A growing canon of literature is focused on resolving the processes and implications of debris cover on glaciers. However, this work is often confined to a handful of glaciers that were likely selected based on criteria optimizing their suitability to test a specific hypothesis or on logistical ease. The role of debris cover in a glacier system is unlikely to be overlooked in forthcoming research, yet the magnitude of this role at a global scale has not yet been fully described. Here, we present a map of debris cover for all glacierized regions on Earth, including the Greenland Ice Sheet, using 30 m Landsat data. This dataset will begin to provide a wider context for the high-quality, localized findings from the debris-covered glacier research community and help inform large-scale modeling efforts. A global map of debris cover also facilitates analysis attempting to isolate first-order geomorphological and climate controls of supraglacial debris production. Furthering the objective of expanding the inclusion of debris cover in forthcoming research, we also present a suite of open-source, Python-based tools under development. Requiring minimal and often freely available input data, we have automated the mapping of: i) debris cover, ii) ice cliffs, iii) debris cover evolution over the Landsat era and iv) glacier flow instabilities from altered debris structures. At the present time, debris extent is the only globally complete quantity, but with the expanding repository of high-quality global datasets and further tool development minimizing manual tasks and computational cost, we foresee all of these tools being applied globally in the near future.
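
    The toolkit's actual algorithms are not described in the abstract; purely as an illustration of the kind of first-pass mapping involved, the sketch below applies a NIR/SWIR band-ratio threshold inside a glacier outline (clean ice is bright in this ratio, debris is not). The threshold value and the toy arrays are assumptions.

```python
# Illustrative supraglacial debris mask from a NIR/SWIR band-ratio threshold.
import numpy as np

def debris_mask(nir, swir, glacier_mask, ratio_threshold=2.0):
    """Glacier pixels whose NIR/SWIR ratio falls below the threshold (likely debris)."""
    ratio = nir / np.maximum(swir, 1e-6)     # avoid division by zero
    return glacier_mask & (ratio < ratio_threshold)

nir = np.array([[0.60, 0.50], [0.20, 0.25]])
swir = np.array([[0.10, 0.10], [0.15, 0.20]])
glacier = np.ones((2, 2), dtype=bool)        # toy glacier outline covering all pixels
print(debris_mask(nir, swir, glacier))       # True where the ratio is low (debris-like)
```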

  3. Intelligent data analysis for e-learning enhancing security and trustworthiness in online learning systems

    CERN Document Server

    Miguel, Jorge; Xhafa, Fatos

    2016-01-01

    Intelligent Data Analysis for e-Learning: Enhancing Security and Trustworthiness in Online Learning Systems addresses information security within e-Learning based on trustworthiness assessment and prediction. Over the past decade, many learning management systems have appeared in the education market. Security in these systems is essential for protecting against unfair and dishonest conduct-most notably cheating-however, e-Learning services are often designed and implemented without considering security requirements. This book provides functional approaches of trustworthiness analysis, modeling, assessment, and prediction for stronger security and support in online learning, highlighting the security deficiencies found in most online collaborative learning systems. The book explores trustworthiness methodologies based on collective intelligence than can overcome these deficiencies. It examines trustworthiness analysis that utilizes the large amounts of data-learning activities generate. In addition, as proc...

  4. Intelligence in Artificial Intelligence

    OpenAIRE

    Datta, Shoumen Palit Austin

    2016-01-01

    The elusive quest for intelligence in artificial intelligence prompts us to consider that instituting human-level intelligence in systems may be (still) in the realm of utopia. In about a quarter century, we have witnessed the winter of AI (1990) being transformed and transported to the zenith of tabloid fodder about AI (2015). The discussion at hand is about the elements that constitute the canonical idea of intelligence. The delivery of intelligence as a pay-per-use-service, popping out of ...

  5. Artificial intelligence and medical imaging. Expert systems and image analysis

    International Nuclear Information System (INIS)

    Wackenheim, A.; Zoellner, G.; Horviller, S.; Jacqmain, T.

    1987-01-01

    This paper gives an overview of existing systems for automated image analysis and interpretation in medical imaging, especially in radiology. The example of ORFEVRE, the system for the analysis of CAT-scan images of the cervical triplet (C3-C5) by image analysis and a subsequent expert system, is given and discussed in detail. Possible extensions are described [fr]

  6. Intelligence after traumatic brain injury: meta-analysis of outcomes and prognosis.

    Science.gov (United States)

    Königs, M; Engenhorst, P J; Oosterlaan, J

    2016-01-01

    Worldwide, 54-60 million individuals sustain traumatic brain injury (TBI) each year. This meta-analysis aimed to quantify intelligence impairments after TBI and to determine the value of age and injury severity in the prognosis of TBI. An electronic database search identified 81 relevant peer-reviewed articles encompassing 3890 patients. Full-scale IQ (FSIQ), performance IQ (PIQ) and verbal IQ (VIQ) impairments were quantified (Cohen's d) for patients with mild, moderate and severe TBI in the subacute phase of recovery and the chronic phase. Meta-regressions explored prognostic values of age and injury severity measures for intelligence impairments. The results showed that, in the subacute phase, FSIQ impairments were absent for patients with mild TBI, medium-sized for patients with moderate TBI (d = -0.61, P intelligence impairments, where children may have better recovery from mild TBI and poorer recovery from severe TBI than adults. Injury severity measures predict intelligence impairments and do not outperform one another. © 2015 EAN.

  7. Thermal analysis of fractures at Cerberus Fossae, Mars: Detection of air convection in the porous debris apron

    Science.gov (United States)

    Antoine, R.; Lopez, T.; Baratoux, D.; Rabinowicz, M.; Kurita, K.

    2011-08-01

    This study investigates the cause of high nighttime temperatures within Cerberus Fossae, a system of fractures affecting the Central Elysium Planitia. The inner parts (walls and floor) of the fractures are up to 40 K warmer than the surrounding plains. However, several temperature profiles exhibit a local temperature minimum in the central part of the fractures. We first examined the influence of cooling efficiency at night in the case of a strong reduction of the sky proportion induced by the fracture's geometry. However, the lack of correlation between temperature and sky proportion, calculated from extracted Mars Orbiter Laser Altimeter (MOLA) profiles, argues against this hypothesis. Albedo variations were considered but appear to be limited within the fractures, and are generally not correlated with the temperatures. Variations of the thermal properties of bedrock exposures, debris aprons and sand dunes, inferred from high-resolution images, do not correlate with temperature variations within the fractures either. As none of these factors, taken alone or combined, can satisfactorily explain the temperature variations within and near the fractures, we suggest that geothermal heat transported by air convection within the porous debris aprons may contribute to explaining the high temperatures at night and the local minima on the fracture floor. The conditions for the occurrence of the suggested phenomenon and the consequences for the surface temperature are numerically explored. A conservative geothermal heat flux of 20 mW/m² was used in the simulations, this value being consistent with either the lithosphere elastic thicknesses inferred below the shield volcanoes of the Tharsis dome or values predicted from numerical simulations of the thermal evolution of Mars. The model results indicate that temperature differences of 10-20 K between the central and upper parts of the fracture are explained in the case of high Darcy velocities, which require high permeability values

  8. Association between water fluoride and the level of children's intelligence: a dose-response meta-analysis.

    Science.gov (United States)

    Duan, Q; Jiao, J; Chen, X; Wang, X

    2018-01-01

    Higher fluoride concentrations in water have inconsistently been associated with the levels of intelligence in children. The following study summarizes the available evidence regarding the strength of association between fluoridated water and children's intelligence. Meta-analysis. PubMed, Embase, and Cochrane Library databases were systematically analyzed from November 2016. Observational studies that have reported on intelligence levels in relation to high and low water fluoride contents, with 95% confidence intervals (CIs) were included. Further, the results were pooled using inverse variance methods. The correlation between water fluoride concentration and intelligence level was assessed by a dose-response meta-analysis. Twenty-six studies reporting data on 7258 children were included. The summary results indicated that high water fluoride exposure was associated with lower intelligence levels (standardized mean difference : -0.52; 95% CI: -0.62 to -0.42; P intelligence (P intelligence levels. Greater exposure to high levels of fluoride in water was significantly associated with reduced levels of intelligence in children. Therefore, water quality and exposure to fluoride in water should be controlled in areas with high fluoride levels in water. Copyright © 2017. Published by Elsevier Ltd.
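    To make the pooling step mentioned in this record concrete, here is a minimal sketch of fixed-effect inverse-variance pooling of standardized mean differences; the study values are invented for illustration and are not the data analysed in the meta-analysis:

```python
# Minimal sketch of fixed-effect inverse-variance pooling of standardized mean
# differences (SMDs). The (SMD, standard error) pairs below are invented.
import math

studies = [(-0.45, 0.10), (-0.60, 0.08), (-0.52, 0.12)]

weights = [1.0 / se ** 2 for _, se in studies]          # inverse-variance weights
pooled = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
se_pooled = math.sqrt(1.0 / sum(weights))
ci_low, ci_high = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled

print(f"pooled SMD = {pooled:.2f}, 95% CI ({ci_low:.2f}, {ci_high:.2f})")
```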

  9. Space Transportation System Liftoff Debris Mitigation Process Overview

    Science.gov (United States)

    Mitchell, Michael; Riley, Christopher

    2011-01-01

    Liftoff debris is a top risk to the Space Shuttle Vehicle. To manage the Liftoff debris risk, the Space Shuttle Program created a team within the Propulsion Systems Engineering & Integration Office. The Shuttle Liftoff Debris Team harnesses the Systems Engineering process to identify, assess, mitigate, and communicate the Liftoff debris risk. The Liftoff Debris Team leverages off the technical knowledge and expertise of engineering groups across multiple NASA centers to integrate total system solutions. These solutions connect the hardware and analyses to identify and characterize debris sources and zones contributing to the Liftoff debris risk. The solutions incorporate analyses spanning: the definition and modeling of natural and induced environments; material characterizations; statistical trending analyses; imagery based trajectory analyses; debris transport analyses; and risk assessments. The verification and validation of these analyses are bound by conservative assumptions and anchored by testing and flight data. The Liftoff debris risk mitigation is managed through vigilant collaborative work between the Liftoff Debris Team and Launch Pad Operations personnel and through the management of requirements, interfaces, risk documentation, configurations, and technical data. Furthermore, on day of launch, decision analysis is used to apply the wealth of analyses to case specific identified risks. This presentation describes how the Liftoff Debris Team applies Systems Engineering in their processes to mitigate risk and improve the safety of the Space Shuttle Vehicle.

  10. Security analysis - from analytical methods to intelligent systems

    Energy Technology Data Exchange (ETDEWEB)

    Lambert-Torres, G; Silva, A.P. Alves da; Ferreira, C [Escola Federal de Engenharia de Itajuba, MG (Brazil); Mattos dos Reis, L O [Taubate Univ., SP (Brazil)

    1994-12-31

    This paper presents an alternative approach to Security Analysis based on Artificial Neural Network (ANN) techniques. This new technique tries to imitate the human brain and is based on neurons and synapses. A critical review of the ANNs used in Power System Operation problem solving is made, and structures to solve the Security Analysis problems are proposed. (author) 7 refs., 4 figs.

  11. Competitive intelligence and patent analysis in drug discovery.

    Science.gov (United States)

    Grandjean, Nicolas; Charpiot, Brigitte; Pena, Carlos Andres; Peitsch, Manuel C

    2005-01-01

    Patents are a major source of information in drug discovery and, when properly processed and analyzed, can yield a wealth of information on competitors' activities, R&D trends, emerging fields, and collaborations, among others. This review discusses the current state of the art in textual data analysis and exploration methods as applied to patent analysis. © 2005 Elsevier Ltd. All rights reserved.

  12. The Association Between Maternal Subclinical Hypothyroidism and Growth, Development, and Childhood Intelligence: A Meta-analysis

    Science.gov (United States)

    Liu, Yahong; Chen, Hui; Jing, Chen; Li, FuPin

    2018-06-01

    To explore the association between maternal subclinical hypothyroidism (SCH) in pregnancy and the somatic and intellectual development of their offspring. Using RevMan 5.3 software, a meta-analysis of cohort studies published from inception to May 2017, focusing on the association between maternal SCH in pregnancy and childhood growth, development and intelligence, was performed. Sources included the Cochrane Library, PubMed, Web of Science, China National Knowledge Infrastructure and Wan Fang Data. Analysis of a total of 15 cohort studies involving 1896 pregnant women with SCH revealed that SCH in pregnancy was significantly associated with the intelligence (p=0.0007) and motor development (pdevelopment, low birth weight, premature delivery, fetal distress and fetal growth restriction.

  13. Intelligent Patching of Conceptual Geometry for CFD Analysis

    Science.gov (United States)

    Li, Wu

    2010-01-01

    The iPatch computer code for intelligently patching surface grids was developed to convert conceptual geometry to computational fluid dynamics (CFD) geometry (see figure). It automatically uses bicubic B-splines to extrapolate (if necessary) each surface in a conceptual geometry so that all the independently defined geometric components (such as wing and fuselage) can be intersected to form a watertight CFD geometry. The software also computes the intersection curves of surface patches at any resolution (up to 10^-4 accuracy) specified by the user, and it writes the B-spline surface patches, and the corresponding boundary points, for the watertight CFD geometry in the format that can be directly used by the grid generation tool VGRID. iPatch requires that input geometry be in PLOT3D format, where each component surface is defined by a rectangular grid {(x(i,j), y(i,j), z(i,j)): 1 ≤ i ≤ m, 1 ≤ j ≤ n} that represents a smooth B-spline surface. All surfaces in the PLOT3D file conceptually represent a watertight geometry of components of an aircraft on the half-space y ≥ 0. Overlapping surfaces are not allowed, but can be fixed by a utility code "fixp3d". The fixp3d utility code first finds the two grid lines on the two surface grids that are closest to each other in Hausdorff distance (a metric to measure the discrepancies of two sets); then uses one of the grid lines as the transition line, extending grid lines on one grid to the other grid to form a merged grid. Any two connecting surfaces shall have a "visually" common boundary curve, or can be described by an intersection relationship defined in a geometry specification file. The intersection of two surfaces can be at a conceptual level. However, the intersection is directional (along either the i or j index direction), and each intersecting grid line (or its spline extrapolation) on the first surface should intersect
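    The fixp3d merging step described above relies on finding the pair of grid lines that are closest in Hausdorff distance. The following minimal sketch is not taken from iPatch itself (the array shapes and helper names are assumptions) and simply shows one way such a comparison could be computed:

```python
# Minimal sketch: find the pair of grid lines (rows of two surface grids) that
# are closest in symmetric Hausdorff distance. Grids are assumed shaped (m, n, 3).
import numpy as np

def hausdorff(a: np.ndarray, b: np.ndarray) -> float:
    """Symmetric Hausdorff distance between two polylines given as (N, 3) arrays."""
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)   # pairwise point distances
    return max(d.min(axis=1).max(), d.min(axis=0).max())

def closest_grid_lines(grid1: np.ndarray, grid2: np.ndarray):
    """Return (i, j, distance) for the closest pair of grid lines."""
    best = (0, 0, float("inf"))
    for i, line1 in enumerate(grid1):
        for j, line2 in enumerate(grid2):
            h = hausdorff(line1, line2)
            if h < best[2]:
                best = (i, j, h)
    return best

# Toy usage with two tiny 2 x 3 grids of 3-D points
g1 = np.arange(18, dtype=float).reshape(2, 3, 3)
g2 = g1 + 0.1
print(closest_grid_lines(g1, g2))
```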

  14. LEGACY - EOP Marine Debris

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — These data contains towed diver surveys of and weights of marine debris removed from the near shore environments of the NWHI.

  15. Pay attention to the enterprise competitive intelligence analysis research promotion enterprise scientific research production and product development

    International Nuclear Information System (INIS)

    Yang Yan

    2014-01-01

    This article covers the content and five characteristics of competitive intelligence and, drawing on the situation of outstanding American competitive intelligence companies, shows that competitive intelligence plays a significant role in constructing an enterprise's core competitive power; its contribution already holds a pivotal status in world-famous enterprises. It is an important cornerstone for enterprises in constructing core competitive power. With the rapid change of the enterprise competition environment, the importance of competitive intelligence becomes more evident day by day. As Microsoft Corporation president Bill Gates asserted, how to collect, analyse, manage and use information will decide an enterprise's victory or loss. Combined with the specific details of the enterprise's scientific research and production, the development of the SF_6 new product is introduced as an example of how the enterprise carried out competitive intelligence, and how information development and use were applied in it. (author)

  16. Emotional Intelligence and Organisational Citizenship Behaviour of Manufacturing Sector Employees: An Analysis

    Directory of Open Access Journals (Sweden)

    Susan Tee Suan Chin

    2011-06-01

    Full Text Available As with diversity, collaboration, co-operation and teamwork have become increasingly important issues for management to handle. The purpose of this study is to analyse the level of Emotional Intelligence and Organisational Citizenship Behaviour among middle management employees in the Malaysian manufacturing sector. A total of 536 employees from different organisations and industries took part in this survey. Based on the descriptive analysis, employees in some industries tended to have a lower level of emotional intelligence and organisational citizenship behaviour.

  17. Intelligence, previous convictions and interrogative suggestibility: a path analysis of alleged false-confession cases.

    Science.gov (United States)

    Sharrock, R; Gudjonsson, G H

    1993-05-01

    The main purpose of this study was to investigate the relationship between interrogative suggestibility and previous convictions among 108 defendants in criminal trials, using a path analysis technique. It was hypothesized that previous convictions, which may provide defendants with interrogative experiences, would correlate negatively with 'shift' as measured by the Gudjonsson Suggestibility Scale (Gudjonsson, 1984a), after intelligence and memory had been controlled for. The hypothesis was partially confirmed and the theoretical and practical implications of the findings are discussed.

  18. A comparative analysis of reactor lower head debris cooling models employed in the existing severe accident analysis codes

    International Nuclear Information System (INIS)

    Ahn, K.I.; Kim, D.H.; Kim, S.B.; Kim, H.D.

    1998-08-01

    MELCOR and MAAP4 are the representative severe accident analysis codes which have been developed for the integral analysis of the phenomenological reactor lower head corium cooling behavior. The main objective of the present study is to identify merits and disadvantages of each relevant model through a comparative analysis of the lower plenum corium cooling models employed in these two codes. The final results will be utilized for the development of LILAC phenomenological models and for the continuous improvement of the existing MELCOR reactor lower head models, which are currently being performed at KAERI. For these purposes, nine reference models featuring the lower head corium behavior are first selected based on the existing experimental evidence and related models. The main features of the selected models are then critically analyzed, and finally the merits and disadvantages of each corresponding model are summarized from the viewpoint of realistic corium behavior and reasonable modeling. Based on this evidence, potential improvements for developing more advanced models are summarized and presented. The present study has focused on the qualitative comparison of each model, so a more detailed quantitative analysis is strongly required to reach final conclusions on their merits and disadvantages. In addition, in order to compensate for the limitations of the current models, further studies are required that closely relate detailed mechanistic models of molten material movement and phase-change heat transfer in porous media to the existing simple models. (author). 36 refs

  19. The fast debris evolution model

    Science.gov (United States)

    Lewis, H. G.; Swinerd, G. G.; Newland, R. J.; Saunders, A.

    2009-09-01

    The 'particles-in-a-box' (PIB) model introduced by Talent [Talent, D.L. Analytic model for orbital debris environmental management. J. Spacecraft Rocket, 29 (4), 508-513, 1992.] removed the need for computer-intensive Monte Carlo simulation to predict the gross characteristics of an evolving debris environment. The PIB model was described using a differential equation that allows the stability of the low Earth orbit (LEO) environment to be tested by a straightforward analysis of the equation's coefficients. As part of an ongoing research effort to investigate more efficient approaches to evolutionary modelling and to develop a suite of educational tools, a new PIB model has been developed. The model, entitled Fast Debris Evolution (FADE), employs a first-order differential equation to describe the rate at which new objects ⩾10 cm are added and removed from the environment. Whilst Talent [Talent, D.L. Analytic model for orbital debris environmental management. J. Spacecraft Rocket, 29 (4), 508-513, 1992.] based the collision theory for the PIB approach on collisions between gas particles and adopted specific values for the parameters of the model from a number of references, the form and coefficients of the FADE model equations can be inferred from the outputs of future projections produced by high-fidelity models, such as the DAMAGE model. The FADE model has been implemented as a client-side, web-based service using JavaScript embedded within a HTML document. Due to the simple nature of the algorithm, FADE can deliver the results of future projections immediately in a graphical format, with complete user-control over key simulation parameters. Historical and future projections for the ⩾10 cm LEO debris environment under a variety of different scenarios are possible, including business as usual, no future launches, post-mission disposal and remediation. A selection of results is presented with comparisons with predictions made using the DAMAGE environment model
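    To illustrate the kind of first-order evolution equation the record describes, here is a minimal sketch of a particles-in-a-box style projection; the generic form dN/dt = A + B*N + C*N^2 and the coefficient values are illustrative assumptions, not the coefficients FADE infers from high-fidelity models such as DAMAGE:

```python
# Minimal sketch of a particles-in-a-box style evolution equation of the generic
# form dN/dt = A + B*N + C*N**2 (launch additions, drag removal, collisions).
# Coefficient values are illustrative only.
def project(n0: float, years: int, A=300.0, B=-0.01, C=1e-7, dt=0.1):
    """Forward-Euler projection of the >=10 cm object count N(t)."""
    n, history = n0, [n0]
    steps_per_year = round(1 / dt)
    for _ in range(years * steps_per_year):
        n += dt * (A + B * n + C * n * n)
        history.append(n)
    return history

counts = project(n0=15000.0, years=50)
print(f"objects after 50 years: {counts[-1]:.0f}")
```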

  20. Operational Risk Management A Practical Approach to Intelligent Data Analysis

    CERN Document Server

    Kenett, Ron

    2010-01-01

    The book will introduce modern Operational Risk (OpR) Management and illustrates the various sources of OpR assessment and OpR mitigation. This book discusses how various data sources can be integrated and analyzed and how OpR is synergetic to other risk management activities such as Financial Risk Management and Internationalization. The topics will include state of the art technology such as semantic analysis, ontology engineering, data mining and statistical analysis.

  1. Computational Intelligence Techniques for Electro-Physiological Data Analysis

    OpenAIRE

    Riera Sardà, Alexandre

    2012-01-01

    This work contains the efforts I have made in the last years in the field of Electrophysiological data analysis. Most of the work has been done at Starlab Barcelona S.L. and part of it at the Neurodynamics Laboratory of the Department of Psychiatry and Clinical Psychobiology of the University of Barcelona. The main work deals with the analysis of electroencephalography (EEG) signals, although other signals, such as electrocardiography (ECG), electroculography (EOG) and electromiography (EMG) ...

  2. Interactive Safety Analysis Framework of Autonomous Intelligent Vehicles

    Directory of Open Access Journals (Sweden)

    Cui You Xiang

    2016-01-01

    Full Text Available More than 100,000 people were killed and around 2.6 million injured in road accidents in the People’s Republic of China (PRC), that is four to eight times the rate of developed countries, equivalent to a mortality of 6.2 per 10 thousand vehicles, the highest rate in the world. There are more than 1,700 fatalities and 840,000 injuries yearly due to vehicle crashes off public highways. In this paper, we proposed an interactive safety situation and threat analysis framework based on driver behaviour and vehicle dynamics risk analysis based on ISO26262…

  3. Quantum aspects of semantic analysis and symbolic artificial intelligence

    International Nuclear Information System (INIS)

    Aerts, Diederik; Czachor, Marek

    2004-01-01

    Modern approaches to semantic analysis, if reformulated as Hilbert-space problems, reveal formal structures known from quantum mechanics. A similar situation is found in distributed representations of cognitive structures developed for the purpose of neural networks. We take a closer look at similarities and differences between the above two fields and quantum information theory. (letter to the editor)

  4. Intelligent assembly time analysis, using a digital knowledge based approach

    NARCIS (Netherlands)

    Jin, Y.; Curran, R.; Butterfield, J.; Burke, R.; Welch, B.

    2009-01-01

    The implementation of effective time analysis methods fast and accurately in the era of digital manufacturing has become a significant challenge for aerospace manufacturers hoping to build and maintain a competitive advantage. This paper proposes a structure oriented, knowledge-based approach for

  5. Quantum aspects of semantic analysis and symbolic artificial intelligence

    Energy Technology Data Exchange (ETDEWEB)

    Aerts, Diederik [Centrum Leo Apostel (CLEA) and Foundations of the Exact Sciences (FUND), Vrije Universiteit Brussel, 1050 Brussels (Belgium); Czachor, Marek [Katedra Fizyki Teoretycznej i Metod Matematycznych, Politechnika Gdanska, 80-952 Gdansk (Poland)

    2004-03-26

    Modern approaches to semantic analysis, if reformulated as Hilbert-space problems, reveal formal structures known from quantum mechanics. A similar situation is found in distributed representations of cognitive structures developed for the purpose of neural networks. We take a closer look at similarities and differences between the above two fields and quantum information theory. (letter to the editor)

  6. Advances in intelligent analysis of medical data and decision support systems

    CERN Document Server

    Iantovics, Barna

    2013-01-01

    This volume is a result of the fruitful and vivid discussions during the MedDecSup'2012 International Workshop, bringing together a relevant body of knowledge and new developments in the increasingly important field of medical informatics. This carefully edited book presents new ideas aimed at the development of intelligent processing of various kinds of medical information and the perfection of the contemporary computer systems for medical decision support. The book presents advances in medical information systems for intelligent archiving, processing, analysis and search-by-content which will improve the quality of the medical services for every patient and of the global healthcare system. The book combines in a synergistic way theoretical developments with the practicability of the approaches developed, and presents the latest developments and achievements in medical informatics to a broad range of readers: engineers, mathematicians, physicians, and PhD students.

  7. Heart failure analysis dashboard for patient's remote monitoring combining multiple artificial intelligence technologies.

    Science.gov (United States)

    Guidi, G; Pettenati, M C; Miniati, R; Iadanza, E

    2012-01-01

    In this paper we describe a Heart Failure analysis Dashboard that, combined with a handy device for the automatic acquisition of a set of patient's clinical parameters, allows telemonitoring functions to be supported. The Dashboard's intelligent core is a Computer Decision Support System designed to assist the clinical decisions of non-specialist caring personnel, and it is based on three functional parts: Diagnosis, Prognosis, and Follow-up management. Four Artificial Intelligence-based techniques are compared for providing the diagnosis function: a Neural Network, a Support Vector Machine, a Classification Tree and a Fuzzy Expert System whose rules are produced by a Genetic Algorithm. State-of-the-art algorithms are used to support a score-based prognosis function. The patient's Follow-up is used to refine the diagnosis.
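    As an illustration of the classifier comparison this record describes, the sketch below runs scikit-learn stand-ins for three of the four techniques on synthetic data (the GA-tuned fuzzy expert system is omitted); the models, parameters and data are assumptions rather than the paper's actual setup:

```python
# Minimal sketch (synthetic data): compare a neural network, an SVM and a
# classification tree with cross-validation, as stand-ins for the record's
# diagnosis-function comparison. Not the paper's configuration.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, n_features=8, random_state=0)

models = {
    "neural network": MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
    "SVM": SVC(kernel="rbf"),
    "classification tree": DecisionTreeClassifier(max_depth=4, random_state=0),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.2f}")
```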

  8. The Innovative Approaches to Packaging – Comparison Analysis of Intelligent and Active Packaging Perceptions in Slovakia

    Directory of Open Access Journals (Sweden)

    Loucanova Erika

    2017-06-01

    Full Text Available Packaging has always served a practical function - to hold goods together and protect them as they move toward the customer through the distribution channel. Today packaging is also a container for promoting the product and making it easier and safer to use. The importance of packaging functions is still growing, and consequently companies are increasingly interested in approaching packaging in an innovative and creative way. The paper deals with innovative approaches to packaging resulting in the creation of packaging with interactive and active features in the form of active and intelligent packaging. Using comparative analysis, we monitored the perception of active packaging functions in comparison to intelligent packaging functions among different age categories. We identified the age categories which are most interested in these functions.

  9. Factor analysis of Wechsler Adult Intelligence Scale-Revised in developmentally disabled persons.

    Science.gov (United States)

    Di Nuovo, Santo F; Buono, Serafino

    2006-12-01

    The results of previous studies on the factorial structure of the Wechsler Intelligence Scales are somewhat inconsistent across normal and pathological samples. To study specific clinical groups, such as developmentally disabled persons, it is useful to examine the factor structure in appropriate samples. A factor analysis was carried out using the principal component method and Varimax orthogonal rotation on the Wechsler Adult Intelligence Scale-Revised (WAIS-R) in a sample of 203 developmentally disabled persons with a mean age of 25 years 4 months. Developmental disability ranged from mild to moderate. Partially contrasting with previous studies on normal samples, the results supported a two-factor solution. Wechsler's traditional Verbal and Performance scales seem to be more appropriate for this sample than the alternative three-factor solution.
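    For readers unfamiliar with the procedure named in this record, the sketch below extracts principal components from a correlation matrix and applies a Varimax orthogonal rotation; the data are synthetic stand-ins for subtest scores, and keeping two components simply mirrors the two-factor solution reported:

```python
# Minimal sketch of principal component extraction plus Varimax rotation on
# synthetic stand-in data (random scores, not WAIS-R data).
import numpy as np

def varimax(loadings: np.ndarray, gamma: float = 1.0, max_iter: int = 100, tol: float = 1e-6):
    """Orthogonal Varimax rotation of a p x k loading matrix."""
    p, k = loadings.shape
    rotation, var = np.eye(k), 0.0
    for _ in range(max_iter):
        rotated = loadings @ rotation
        u, s, vt = np.linalg.svd(
            loadings.T @ (rotated ** 3 - (gamma / p) * rotated @ np.diag((rotated ** 2).sum(axis=0)))
        )
        rotation, new_var = u @ vt, s.sum()
        if new_var < var * (1 + tol):
            break
        var = new_var
    return loadings @ rotation

rng = np.random.default_rng(0)
scores = rng.normal(size=(203, 11))                  # synthetic stand-in for 11 subtest scores
corr = np.corrcoef(scores, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
top = np.argsort(eigvals)[::-1][:2]                  # keep the two largest components
loadings = eigvecs[:, top] * np.sqrt(eigvals[top])
print(np.round(varimax(loadings), 2))
```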

  10. An analysis of urban collisions using an artificial intelligence model.

    Science.gov (United States)

    Mussone, L; Ferrari, A; Oneta, M

    1999-11-01

    Traditional studies on road accidents estimate the effect of variables (such as vehicular flows, road geometry, and vehicle characteristics) and calculate the number of accidents. A descriptive statistical analysis of the accidents (those used in the model) over the period 1992-1995 is proposed. The paper describes an alternative method based on the use of artificial neural networks (ANN) in order to work out a model for the analysis of vehicular accidents in Milan. The degree of danger of urban intersections under different scenarios is quantified by the ANN model. The methodology itself is the first result, as it allows the modelling of urban vehicular accidents through the innovative use of ANN. Other results deal with model outputs: intersection complexity may determine a higher accident index depending on the regulation of the intersection. The highest index for pedestrians being run over occurs at non-signalised intersections at night-time.

  11. Intelligent support of X-ray qualitative analysis

    International Nuclear Information System (INIS)

    Arai, Hiroshi; Uota, Atsushi; Ishida, Hidenobu

    1995-01-01

    'X-ray Diffraction AI Qualitative Analysis System', which can support analytical operation, was developed. The purpose of this development is that even a beginner analyst can obtain results as accurate as those of specialists. This system has three new functions. The first is to determine suitable measuring conditions automatically for the sample or analytical aim. The second is to guide the method of peak search. The third is to determine search conditions in the same way as measuring conditions. These were realized by applying experts' knowledge with 'AI techniques' and were added to our qualitative analysis systems. A beginner analyst works through dialogue with this system and can get the same results as a specialist would. (author)

  12. Behavior of explosion debris clouds

    International Nuclear Information System (INIS)

    Anon.

    1986-01-01

    In the normal course of events the behavior of debris clouds created by explosions will be of little concern to the atomic energy industry. However, two situations, one of them actual and one postulated, exist where the rise and spread of explosion clouds can affect site operations. The actual occurrence would be the detonation of nuclear weapons and the resultant release and transport of radioactive debris across the various atomic energy installations. Although the activity of the diffusing cloud is not of biological concern, it may still be sufficiently above background to play havoc with the normal readings of sensitive monitoring instruments. If it were not known that these anomalous readings resulted from explosion debris, considerable time and expense might be required for on-site testing and tracing. Fortunately it is usually possible, with the use of meteorological data and forecasts, to predict when individual sites are affected by nuclear weapon debris effects. The formation, rise, and diffusion of weapon clouds will be discussed. The explosion of an atomic reactor is the postulated situation. It is common practice in reactor hazard analysis to assume a combination of circumstances which might result in a nuclear incident with a release of material to the atmosphere. It is not within the scope of this report to examine the manifold plausibilities that might lead to an explosion or the possible methods of release of gaseous and/or particulate material from such an occurrence. However, if the formation of a cloud is assumed and some idea of its energy content is obtainable, estimates of the cloud behavior in the atmosphere can be made

  13. What Friends Are For: Collaborative Intelligence Analysis and Search

    Science.gov (United States)

    2014-06-01

    preferences, then the similarity measure could be some type of vector angularity measurement. Regardless of how similarity is computed, once the...III. In addition to implementing the model, the software supports analysis of search performance. The program is written in Java and Python and...profiles within the profile database are encoded in XML format, as seen in Figure 13. Profiler is written in both Java and Python and is dependent upon
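    The "vector angularity" similarity alluded to in this fragment is commonly realized as cosine similarity between profile vectors; a minimal, purely illustrative sketch (the profile contents are invented, not taken from the thesis) follows:

```python
# Minimal sketch: cosine similarity between two analysts' preference vectors.
import math

def cosine_similarity(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

analyst_a = [3, 0, 1, 2]   # e.g. weights over shared topic categories (invented)
analyst_b = [2, 1, 0, 2]
print(f"profile similarity: {cosine_similarity(analyst_a, analyst_b):.2f}")
```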

  14. A Concept Map Knowledge Model of Intelligence Analysis

    Science.gov (United States)

    2011-05-01

    Hoffman and Lintern 2006), cognitive task analysis (Crandall et al. 2006), expert systems (e.g., Coffey et al. 2003), and knowledge visualisation (e.g...information visualisation , application of CMapping is likely to continue expanding. 2.3 Components and properties of CMaps The main components of a...and relationships, e.g., images, texts, video and audio files, and Internet links, and enables the construction and sharing of CMap KMs. CMap KMs are

  15. Emplacement mechanisms of contrasting debris avalanches at Volcán Mombacho (Nicaragua), provided by structural and facies analysis

    Science.gov (United States)

    Shea, Thomas; van Wyk de Vries, Benjamin; Pilato, Martín

    2008-07-01

    We study the lithology, structure, and emplacement of two debris-avalanche deposits (DADs) with contrasting origins and materials from the Quaternary-Holocene Mombacho Volcano, Nicaragua. A clear comparison is possible because both DADs were emplaced onto similar nearly flat (3° slope) topography with no apparent barrier to transport. This lack of confinement allows us to study, in nature, the perfect case scenario of a freely spreading avalanche. In addition, there is good evidence that no substratum was incorporated in the events during flow, so facies changes are related only to internal dynamics. Mombacho shows evidence of at least three large flank collapses, producing the two well-preserved debris avalanches of this study; one on its northern flank, “Las Isletas,” directed northeast, and the other on its southern flank, “El Crater,” directed south. Other south-eastern features indicate that the debris-avalanche corresponding to the third collapse (La Danta) occurred before Las Isletas and El Crater events. The materials involved in each event were similar, except in their alteration state and in the amount of substrata initially included in the collapse. While “El Crater” avalanche shows no signs of substratum involvement and has characteristics of a hydrothermal weakening-related collapse, the “Las Isletas” avalanche involves significant substratum and was generated by gravity spreading-related failure. The latter avalanche may have interacted with Lake Nicaragua during transport, in which case its run-out could have been modified. Through a detailed morphological and structural description of the Mombacho avalanches, we provide two contrasting examples of non-eruptive volcanic flank collapse. We show that, remarkably, even with two distinct collapse mechanisms, the debris avalanches developed the same gross stratigraphy of a coarse layer above a fine layer. This fine layer provided a low friction basal slide layer. Whereas DAD layering and

  16. An intelligent hybrid system for surface coal mine safety analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lilic, N.; Obradovic, I.; Cvjetic, A. [University of Belgrade, Belgrade (Serbia)

    2010-06-15

    Analysis of safety in surface coal mines represents a very complex process. Published studies on mine safety analysis are usually based on research related to accidents statistics and hazard identification with risk assessment within the mining industry. Discussion in this paper is focused on the application of AI methods in the analysis of safety in mining environment. Complexity of the subject matter requires a high level of expert knowledge and great experience. The solution was found in the creation of a hybrid system PROTECTOR, whose knowledge base represents a formalization of the expert knowledge in the mine safety field. The main goal of the system is the estimation of mining environment as one of the significant components of general safety state in a mine. This global goal is subdivided into a hierarchical structure of subgoals where each subgoal can be viewed as the estimation of a set of parameters (gas, dust, climate, noise, vibration, illumination, geotechnical hazard) which determine the general mine safety state and category of hazard in mining environment. Both the hybrid nature of the system and the possibilities it offers are illustrated through a case study using field data related to an existing Serbian surface coal mine.

  17. Intelligent Control in Automation Based on Wireless Traffic Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kurt Derr; Milos Manic

    2007-08-01

    Wireless technology is a central component of many factory automation infrastructures in both the commercial and government sectors, providing connectivity among various components in industrial realms (distributed sensors, machines, mobile process controllers). However wireless technologies provide more threats to computer security than wired environments. The advantageous features of Bluetooth technology resulted in Bluetooth units shipments climbing to five million per week at the end of 2005 [1, 2]. This is why the real-time interpretation and understanding of Bluetooth traffic behavior is critical in both maintaining the integrity of computer systems and increasing the efficient use of this technology in control type applications. Although neuro-fuzzy approaches have been applied to wireless 802.11 behavior analysis in the past, a significantly different Bluetooth protocol framework has not been extensively explored using this technology. This paper presents a new neurofuzzy traffic analysis algorithm of this still new territory of Bluetooth traffic. Further enhancements of this algorithm are presented along with the comparison against the traditional, numerical approach. Through test examples, interesting Bluetooth traffic behavior characteristics were captured, and the comparative elegance of this computationally inexpensive approach was demonstrated. This analysis can be used to provide directions for future development and use of this prevailing technology in various control type applications, as well as making the use of it more secure.
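    As a purely illustrative aside on the fuzzification step that a neuro-fuzzy traffic analyser typically starts from (this is not the paper's algorithm, and the breakpoints are invented), triangular membership functions over a packet-rate feature could look like this:

```python
# Minimal illustrative sketch: triangular fuzzy membership functions for a
# Bluetooth packet-rate feature. Breakpoints are made up for illustration.
def triangular(x, a, b, c):
    """Triangular membership function peaking at b on support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzify_rate(packets_per_sec):
    return {
        "low":    triangular(packets_per_sec, -1, 0, 50),
        "normal": triangular(packets_per_sec, 25, 75, 125),
        "high":   triangular(packets_per_sec, 100, 200, 300),
    }

print(fuzzify_rate(90))   # e.g. {'low': 0.0, 'normal': 0.7, 'high': 0.0}
```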

  19. Principles and tools for collaborative entity-based intelligence analysis.

    Science.gov (United States)

    Bier, Eric A; Card, Stuart K; Bodnar, John W

    2010-01-01

    Software tools that make it easier for analysts to collaborate as a natural part of their work will lead to better analysis that is informed by more perspectives. We are interested to know if software tools can be designed that support collaboration even as they allow analysts to find documents and organize information (including evidence, schemas, and hypotheses). We have modified the Entity Workspace system, described previously, to test such designs. We have evaluated the resulting design in both a laboratory study and a study where it is situated with an analysis team. In both cases, effects on collaboration appear to be positive. Key aspects of the design include an evidence notebook optimized for organizing entities (rather than text characters), information structures that can be collapsed and expanded, visualization of evidence that emphasizes events and documents (rather than emphasizing the entity graph), and a notification system that finds entities of mutual interest to multiple analysts. Long-term tests suggest that this approach can support both top-down and bottom-up styles of analysis.

  20. Space Debris & its Mitigation

    Science.gov (United States)

    Kaushal, Sourabh; Arora, Nishant

    2012-07-01

    Space debris has become a growing concern in recent years, since collisions at orbital velocities can be highly damaging to functioning satellites and can also produce even more space debris in the process. Some spacecraft, like the International Space Station, are now armored to deal with this hazard, but armor and mitigation measures can be prohibitively costly when trying to protect satellites or human spaceflight vehicles like the shuttle. This paper describes the current orbital debris environment, outlines its main sources, and identifies mitigation measures to reduce orbital debris growth by controlling these sources. We studied the literature on the topic of space debris. We propose some methods to solve the problem of space debris, highlight the shortcomings of methods already proposed by space experts, and propose modifications to those methods. Some of them can be very effective in the mitigation of space debris, but some of them need modification. Methods recently proposed by space experts include maneuvering, shielding of the space elevator with foil, vaporizing or redirecting space debris back to Earth with lasers, use of aerogel as a protective layer, construction of large junkyards around the International Space Station, use of electrodynamic tethers and, most recently, the use of nano satellites for clearing space debris. Limitations of the already proposed methods are as follows: - Maneuvering can't be the final solution to our problem as it is an act of self-defence. - Shielding can't be done on parts like solar panels and optical devices. - Vaporizing or redirecting space debris can affect human life on Earth if it is not done in a proper manner. - Aerogel has a threshold limit up to which it can bear (resist) the impact of a collision. - Large junkyards can be effective only for large-sized debris. In this paper we propose: A. The use of nano tubes by creating a mesh

  1. ANALYSIS AND CONCEPTION DEVELOPMENT OF INFORMATION DEFENSE CID AND CLOUD PLATFORM ON THE BASE OF INTELLIGENCE TECHNOLOGIES

    Directory of Open Access Journals (Sweden)

    V. A. Vishniakov

    2014-01-01

    Full Text Available Two problems of the use of intelligence technologies in information defense (ITID) are presented: creating specialized knowledge bases with threat simulation, and raising the security level in corporate networks and cloud computing. The analysis of two directions of the second ITID problem is given: the use of intelligent decision support systems and of multi-agent systems. Trends and concepts for the development of intelligence technologies include the improvement of methods, models, architectures, and hardware-software tools for ITID in corporate systems and cloud computing.

  2. Debris flows associated with the 2015 Gorkha Earthquake in Nepal

    Science.gov (United States)

    Dahlquist, M. P.; West, A. J.; Martinez, J.

    2017-12-01

    Debris flows are a primary driver of erosion and a major geologic hazard in many steep landscapes, particularly near the headwaters of rivers, and are generated in large numbers by extreme events. The 2015 Mw 7.8 Gorkha Earthquake triggered 25,000 coseismic landslides in central Nepal. During the ensuing monsoon, sediment delivered to channels by landslides was mobilized in the heavy rains, and new postseismic landslides were triggered in rock weakened by the shaking. These coseismic and postseismic landslide-generated debris flows form a useful dataset for studying the impact and behavior of debris flows on one of the most active landscapes on Earth. Debris flow-dominated channel reaches are generally understood to have a topographic signature recognizable in slope-area plots and distinct from fluvial channels, but in examining debris flows associated with the Gorkha earthquake we find they frequently extend into reaches with geometry typically associated with fluvial systems. We examine a dataset of these debris flows, considering whether they are generated by coseismic or postseismic landslides, whether they are likely to be driving active incision into bedrock, and whether their channels correspond with those typically associated with debris flows. Preliminary analysis of debris flow channels in Nepal suggests there may be systematic differences in the geometry of channels containing debris flows triggered by coseismic versus postseismic landslides, which potentially holds implications for hazard analyses and the mechanics behind the different debris flow types.
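    The slope-area analysis alluded to in this record usually amounts to fitting the power law S = k_s * A^(-theta) to channel slope and drainage-area pairs and looking for reaches that deviate from the fluvial scaling. A minimal sketch with synthetic data (the exponent, coefficient, and noise level are assumptions) follows:

```python
# Minimal sketch of slope-area regression: fit S = k_s * A**(-theta) by
# log-log linear regression. Data below are synthetic.
import numpy as np

area = np.logspace(4, 8, 50)                       # drainage area, m^2
slope = 0.2 * area ** -0.45 * np.exp(np.random.default_rng(0).normal(0, 0.1, 50))

coeffs = np.polyfit(np.log(area), np.log(slope), 1)   # log-log linear regression
theta, ks = -coeffs[0], np.exp(coeffs[1])
print(f"concavity theta ~ {theta:.2f}, steepness k_s ~ {ks:.3f}")
```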

  3. Intelligent acoustic data fusion technique for information security analysis

    Science.gov (United States)

    Jiang, Ying; Tang, Yize; Lu, Wenda; Wang, Zhongfeng; Wang, Zepeng; Zhang, Luming

    2017-08-01

    Tone is an essential component of word formation in all tonal languages, and it plays an important role in the transmission of information in speech communication. Therefore, tones characteristics study can be applied into security analysis of acoustic signal by the means of language identification, etc. In speech processing, fundamental frequency (F0) is often viewed as representing tones by researchers of speech synthesis. However, regular F0 values may lead to low naturalness in synthesized speech. Moreover, F0 and tone are not equivalent linguistically; F0 is just a representation of a tone. Therefore, the Electroglottography (EGG) signal is collected for deeper tones characteristics study. In this paper, focusing on the Northern Kam language, which has nine tonal contours and five level tone types, we first collected EGG and speech signals from six natural male speakers of the Northern Kam language, and then achieved the clustering distributions of the tone curves. After summarizing the main characteristics of tones of Northern Kam, we analyzed the relationship between EGG and speech signal parameters, and laid the foundation for further security analysis of acoustic signal.

  4. LOG FILE ANALYSIS AND CREATION OF MORE INTELLIGENT WEB SITES

    Directory of Open Access Journals (Sweden)

    Mislav Šimunić

    2012-07-01

    Full Text Available To enable successful performance of any company or business system, both in the world and in the Republic of Croatia, among many problems relating to its operations and particularly to maximum utilization and efficiency of the Internet as a media for running business (especially in terms of marketing), they should make the best possible use of the present-day global trends and advantages of sophisticated technologies and approaches to running a business. Bearing in mind the fact of daily increasing competition and a more demanding market, this paper addresses certain scientific and practical contributions to continuous analysis of the demand market and adaptation thereto by analyzing the log files and by retroactive effect on the web site. A log file is a carrier of numerous data and indicators that should be used in the best possible way to improve the entire business operations of a company. However, this is not always simple and easy. Web sites differ in size, purpose, and technology used for designing them. For this very reason, the analytic frameworks should be such that they can cover any web site and at the same time leave some space for analyzing and investigating the specific characteristics of each web site and provide for its dynamics by analyzing the log file records. Those considerations were a basis for this paper

  5. DISCRN: A Distributed Storytelling Framework for Intelligence Analysis.

    Science.gov (United States)

    Shukla, Manu; Dos Santos, Raimundo; Chen, Feng; Lu, Chang-Tien

    2017-09-01

    Storytelling connects entities (people, organizations) using their observed relationships to establish meaningful storylines. This can be extended to spatiotemporal storytelling that incorporates locations, time, and graph computations to enhance coherence and meaning. But when performed sequentially these computations become a bottleneck because the massive number of entities make space and time complexity untenable. This article presents DISCRN, or distributed spatiotemporal ConceptSearch-based storytelling, a distributed framework for performing spatiotemporal storytelling. The framework extracts entities from microblogs and event data, and links these entities using a novel ConceptSearch to derive storylines in a distributed fashion utilizing key-value pair paradigm. Performing these operations at scale allows deeper and broader analysis of storylines. The novel parallelization techniques speed up the generation and filtering of storylines on massive datasets. Experiments with microblog posts such as Twitter data and Global Database of Events, Language, and Tone events show the efficiency of the techniques in DISCRN.

  6. An artificial intelligence approach towards disturbance analysis in nuclear power plants

    International Nuclear Information System (INIS)

    Lindner, A.; Klebau, J.; Fielder, U.; Baldeweg, F.

    1987-01-01

    The scale and degree of sophistication of technological plants, e.g. nuclear power plants, have increased considerably during the last decades. Conventional disturbance analysis systems have proved to work successfully in well-known situations. But in cases of emergencies, the operator staff needs more advanced assistance in realizing diagnosis and therapy control. The significance of introducing artificial intelligence methods in nuclear power technology is emphasized. The main features of the on-line disturbance analysis system SAAP-2, which is being developed for application in nuclear power plants, are reported. 9 refs. (author)

  7. Advanced intelligent systems

    CERN Document Server

    Ryoo, Young; Jang, Moon-soo; Bae, Young-Chul

    2014-01-01

    Intelligent systems have been initiated with the attempt to imitate the human brain. People wish to let machines perform intelligent works. Many techniques of intelligent systems are based on artificial intelligence. According to changing and novel requirements, the advanced intelligent systems cover a wide spectrum: big data processing, intelligent control, advanced robotics, artificial intelligence and machine learning. This book focuses on coordinating intelligent systems with highly integrated and foundationally functional components. The book consists of 19 contributions that features social network-based recommender systems, application of fuzzy enforcement, energy visualization, ultrasonic muscular thickness measurement, regional analysis and predictive modeling, analysis of 3D polygon data, blood pressure estimation system, fuzzy human model, fuzzy ultrasonic imaging method, ultrasonic mobile smart technology, pseudo-normal image synthesis, subspace classifier, mobile object tracking, standing-up moti...

  8. Automatic Satellite Telemetry Analysis for SSA using Artificial Intelligence Techniques

    Science.gov (United States)

    Stottler, R.; Mao, J.

    In April 2016, General Hyten, commander of Air Force Space Command, announced the Space Enterprise Vision (SEV) (http://www.af.mil/News/Article-Display/Article/719941/hyten-announces-space-enterprise-vision/). The SEV addresses increasing threats to space-related systems. The vision includes an integrated approach across all mission areas (communications, positioning, navigation and timing, missile warning, and weather data) and emphasizes improved access to data across the entire enterprise and the ability to protect space-related assets and capabilities. "The future space enterprise will maintain our nation's ability to deliver critical space effects throughout all phases of conflict," Hyten said. Satellite telemetry is going to become available to a new audience. While that telemetry information should be valuable for achieving Space Situational Awareness (SSA), these new satellite telemetry data consumers will not know how to utilize it. We were tasked with applying AI techniques to build an infrastructure to process satellite telemetry into higher abstraction level symbolic space situational awareness and to initially populate that infrastructure with useful data analysis methods. We are working with two organizations, Montana State University (MSU) and the Air Force Academy, both of whom control satellites and therefore currently analyze satellite telemetry to assess the health and circumstances of their satellites. The design which has resulted from our knowledge elicitation and cognitive task analysis is a hybrid approach which combines symbolic processing techniques of Case-Based Reasoning (CBR) and Behavior Transition Networks (BTNs) with current Machine Learning approaches. BTNs are used to represent the process and associated formulas to check telemetry values against anticipated problems and issues. CBR is used to represent and retrieve BTNs that represent an investigative process that should be applied to the telemetry in certain circumstances

  9. Analysis and application of intelligence network based on FTTH

    Science.gov (United States)

    Feng, Xiancheng; Yun, Xiang

    2008-12-01

    With the continued rapid growth of the Internet, new network services emerge in an endless stream, especially network games, meeting TV, video on demand, etc., and the bandwidth requirement increases continuously. Network and optical device technology are developing swiftly. FTTH supports all present and future services with enormous bandwidth, including traditional telecommunication services, traditional data services and traditional TV services, as well as future digital TV and VOD. With the huge bandwidth of FTTH, it is the final solution for the broadband network and has become the final goal of optical access network development. This paper first introduces the main services that FTTH supports and analyses key technologies such as FTTH system composition, topological structure, multiplexing, optical cables and devices. It focuses on two kinds of realization methods - PON and P2P technology. It then proposes that an FTTH solution can support comprehensive access (services such as broadband data, voice, video and narrowband private lines). Finally, it shows the engineering application of FTTH in districts and buildings, which brings enormous economic and social benefits.

  10. Intelligent trend analysis for a solar thermal energy collector field

    Science.gov (United States)

    Juuso, E. K.

    2018-03-01

    Solar thermal power plants collect available solar energy in a usable form at a temperature range which is adapted to the irradiation levels and seasonal variations. Solar energy can be collected only when the irradiation is high enough to produce the required temperatures. During the operation, a trade-off of the temperature and the flow is needed to achieve a good level for the collected power. The scaling approach brings temporal analysis to all measurements and features: trend indices are calculated by comparing the averages in the long and short time windows, a weighted sum of the trend index and its derivative detects the trend episodes and severity of the trend is estimated by including also the variable level in the sum. The trend index, trend episodes and especially, the deviation index reveal early evolving changes in the operating conditions, including cloudiness and load disturbances. The solution is highly compact: all variables, features and indices are transformed to the range [-2, 2] and represented in natural language which is important in integrating data-driven solutions with domain expertise. The special situations detected during the test campaigns are explained well.
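    A minimal sketch of the short-window versus long-window trend index described above is given below; the window lengths, the scaling by the window standard deviation, and the clamp to [-2, 2] are illustrative assumptions rather than the exact formulation used in the paper:

```python
# Minimal sketch of a trend index: compare short- and long-window averages of a
# signal and clamp the scaled difference to [-2, 2]. Parameters are assumptions.
import numpy as np

def trend_index(x: np.ndarray, short: int = 5, long: int = 30) -> float:
    """Positive when the recent average exceeds the long-term average."""
    short_avg = x[-short:].mean()
    long_avg = x[-long:].mean()
    scale = float(x[-long:].std()) or 1.0          # guard against zero variance
    return float(np.clip((short_avg - long_avg) / scale, -2.0, 2.0))

irradiation = np.concatenate([np.full(40, 500.0), np.linspace(500, 650, 20)])
print(f"trend index: {trend_index(irradiation):.2f}")   # rising trend -> positive
```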

  11. Is crossed laterality associated with academic achievement and intelligence? A systematic review and meta-analysis.

    Science.gov (United States)

    Ferrero, Marta; West, Gillian; Vadillo, Miguel A

    2017-01-01

    Over the last century, sporadic research has suggested that people whose hand, eye, foot, or ear dominances are not consistently right- or left-sided are at special risk of suffering academic difficulties. This phenomenon is known as crossed laterality. Although the bulk of this research dates from the 1960s and 1970s, crossed laterality is becoming increasingly popular in the area of school education, driving the creation of several interventions aimed at restoring or consolidating lateral dominance. However, the available evidence is fragmentary. To determine the impact of crossed laterality on academic achievement and intelligence, we conducted a systematic review and meta-analysis of articles published since 1900. The inclusion criteria for the review required that studies used one or more lateral preference tasks for at least two specific parts of the body; that they included a valid measure of crossed laterality; that they measured the impact of crossed laterality on academic achievement or intelligence; and that they included participants between 3 and 17 years old. The final sample included 26 articles that covered a total population of 3578 children aged 5 to 12. Taken collectively, the results of these studies do not support the claim that there is a reliable association between crossed laterality and either academic achievement or intelligence. Along with this, we detected important shortcomings in the literature, such as considerable heterogeneity among the variables used to measure laterality and among the tasks utilized to measure the outcomes. The educational implications of these results are discussed.

  12. Intelligent analysis of energy consumption in school buildings

    International Nuclear Information System (INIS)

    Raatikainen, Mika; Skön, Jukka-Pekka; Leiviskä, Kauko; Kolehmainen, Mikko

    2016-01-01

    Highlights: • Electricity and heating energy consumptions of six school buildings were compared. • Complex multivariate data was analysed using modern computational methods. • Variation in electricity consumption cost is considerably low between study schools. • District heating variation is very slight in two new study schools. • District heating cost describes energy efficiency and state of building automation. - Abstract: Even though industry consumes nearly half of total energy production, the relative share of total energy consumption related to heating and operating buildings is growing constantly. The motivation for this study was to reveal the differences in electricity use and district heating consumption in school buildings of various ages during the working day and also during the night, when human-based consumption is low. The overall aim of this study is to compare the energy (electricity and heating) consumption of six school buildings in Kuopio, Eastern Finland. The selected school buildings were built in different decades, and their ventilation and building automation systems also differ. The hourly energy consumption data was received from Kuopion Energia, the local energy supply company. In this paper, the results of the data analysis on the energy consumption in these school buildings are presented. Preliminary results show that, generally speaking, new school buildings are more energy-efficient than older ones. However, concerning energy efficiency, two very new schools were exceptional because ventilation was kept on day and night in order to dry the building materials in the structures. The novelty of this study is that it makes use of hourly smart metering consumption data on electricity and district heating, using modern computational methods to analyse complex multivariate data in order to increase knowledge of the buildings' consumption profiles and energy efficiency.
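    As a sketch of the day/night comparison on hourly smart-meter data, the snippet below splits readings into working-day and night periods and averages them per school; the column names ("timestamp", "school", "kwh") and the 07-17 h working-day window are assumptions for illustration, not the fields or definitions used in the study.

    ```python
    import numpy as np
    import pandas as pd

    def day_night_profile(df: pd.DataFrame) -> pd.DataFrame:
        df = df.copy()
        hour = pd.to_datetime(df["timestamp"]).dt.hour
        df["period"] = np.where((hour >= 7) & (hour < 17), "working_day", "night")
        # Mean hourly consumption per school and period; a high night baseline
        # hints at ventilation or other systems left running around the clock.
        return df.groupby(["school", "period"])["kwh"].mean().unstack("period")

    # Tiny synthetic example: one school, two days of hourly readings.
    data = pd.DataFrame({
        "timestamp": pd.date_range("2014-01-01", periods=48, freq="H"),
        "school": ["A"] * 48,
        "kwh": np.tile(np.r_[np.full(7, 4.0), np.full(10, 9.0), np.full(7, 4.0)], 2),
    })
    print(day_night_profile(data))
    ```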

  13. Application of intelligence based uncertainty analysis for HLW disposal

    International Nuclear Information System (INIS)

    Kato, Kazuyuki

    2003-01-01

    Safety assessment for geological disposal of high level radioactive waste inevitably involves factors that cannot be specified in a deterministic manner. These are namely: (1) 'variability' that arises from the stochastic nature of the processes and features considered, e.g., the distribution of canister corrosion times and the spatial heterogeneity of a host geological formation; (2) 'ignorance' due to incomplete or imprecise knowledge of the processes and conditions expected in the future, e.g., uncertainty in the estimation of solubilities and sorption coefficients for important nuclides. In many cases, a decision in assessment, e.g., selection among model options or determination of a parameter value, is subject to both variability and ignorance in a combined form. It is clearly important to evaluate the influences of both variability and ignorance on the result of a safety assessment in a consistent manner. We developed a unified methodology to handle variability and ignorance by using probabilistic and possibilistic techniques, respectively. The methodology has been applied to safety assessment of geological disposal of high level radioactive waste. Uncertainties associated with scenarios, models and parameters were defined in terms of fuzzy membership functions derived through a series of interviews with the experts, while variability was formulated by means of probability density functions (pdfs) based on the available data sets. The exercise demonstrated the applicability of the new methodology and, in particular, its advantage in quantifying uncertainties based on expert opinion and in providing information on the dependence of the assessment result on the level of conservatism. In addition, it was also shown that sensitivity analysis could identify key parameters for reducing uncertainties associated with the overall assessment. The above information can be used to support the judgment process and guide the process of disposal system development in optimization of protection against
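    A minimal sketch of how the probabilistic (variability) and possibilistic (ignorance) descriptions can be combined for a single parameter, assuming a triangular expert-elicited membership function and a lognormal spread for an illustrative sorption coefficient; all names and numbers are hypothetical, not values from the assessment.

    ```python
    import numpy as np

    def triangular_membership(x, low, mode, high):
        """Possibility (fuzzy membership) that a parameter value x is plausible."""
        if x <= low or x >= high:
            return 0.0
        if x <= mode:
            return (x - low) / (mode - low)
        return (high - x) / (high - mode)

    rng = np.random.default_rng(0)
    # Variability: lognormal spread of a sorption coefficient across the formation.
    kd_samples = rng.lognormal(mean=np.log(0.1), sigma=0.5, size=10_000)
    # Ignorance: expert-based possibility attached to each sampled value.
    possibility = np.array([triangular_membership(k, 0.01, 0.1, 1.0) for k in kd_samples])
    # An alpha-cut keeps only values the experts consider sufficiently plausible.
    alpha = 0.5
    plausible = kd_samples[possibility >= alpha]
    print(f"{plausible.size} of {kd_samples.size} samples retained at alpha = {alpha}")
    ```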

  14. A Comparison of Vibration and Oil Debris Gear Damage Detection Methods Applied to Pitting Damage

    Science.gov (United States)

    Dempsey, Paula J.

    2000-01-01

    Helicopter Health Usage Monitoring Systems (HUMS) must provide reliable, real-time performance monitoring of helicopter operating parameters to prevent damage of flight critical components. Helicopter transmission diagnostics are an important part of a helicopter HUMS. In order to improve the reliability of transmission diagnostics, many researchers propose combining two technologies, vibration and oil monitoring, using data fusion and intelligent systems. Some benefits of combining multiple sensors to make decisions include improved detection capabilities and an increased probability that the event is detected. However, if the sensors are inaccurate, or the features extracted from the sensors are poor predictors of transmission health, integration of these sensors will decrease the accuracy of damage prediction. For this reason, one must verify the individual integrity of vibration and oil analysis methods prior to integrating the two technologies. This research focuses on comparing the capability of two vibration algorithms, FM4 and NA4, and a commercially available on-line oil debris monitor to detect pitting damage on spur gears in the NASA Glenn Research Center Spur Gear Fatigue Test Rig. Results from this research indicate that the rate of change of debris mass measured by the oil debris monitor is comparable to the vibration algorithms in detecting gear pitting damage.
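    For concreteness, the snippet below computes the FM4 metric in its standard form (the normalized fourth moment, i.e. kurtosis, of the gear difference signal); the construction of the difference signal itself is simplified here to a zero-mean residual, so treat this as a sketch rather than the rig's actual processing chain.

    ```python
    import numpy as np

    def fm4(difference_signal):
        """FM4: kurtosis of the difference signal (time-synchronous average with
        regular gear-mesh components removed). Values near 3 indicate a healthy
        gear; localized damage such as pitting drives FM4 upward."""
        d = np.asarray(difference_signal, dtype=float)
        d = d - d.mean()
        n = d.size
        return n * np.sum(d**4) / np.sum(d**2) ** 2

    # Example: a Gaussian residual gives FM4 close to 3; a single impulsive
    # fault signature raises it noticeably.
    rng = np.random.default_rng(1)
    healthy = rng.normal(size=4096)
    damaged = healthy.copy()
    damaged[2048] += 15.0
    print(fm4(healthy), fm4(damaged))
    ```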

  15. Analysis on the Dynamics of Burst Debris Flood at the Inclined Pressure-Shaft of Svandalsflona Hydropower Project, Norway

    Science.gov (United States)

    Panthi, K. K.

    2014-05-01

    Long-term stability of the waterway system of hydropower plants is crucial and should not be underestimated. Compromised stability may result in severe economic consequences related to revenue loss caused by plant closedown for needed repair, extra resources and time required for repair work, and third-party losses to industries and society at large. In addition, contractual disputes between clients and contractors may arise on some occasions. Serious accidents with loss of life may also happen during repair and construction work, since the engineering geological conditions in the rock mass change once it has been under water for a long period. This article focuses on one of the recent shaft collapses, which happened in Norway in 2008. The article discusses and analyses the dynamics of the burst debris flood that took place on 9 May 2009, while the slide rock mass deposited in the 45° inclined shaft of the Svandalsflona hydropower plant, located in southern Norway, was being removed. A careful review of the geological conditions inside the shaft, an evaluation of the course of events, an investigation of the inspections and inspection reports, and an assessment of the temperature and precipitation conditions have been carried out to reach a conclusion on what might have triggered the sudden burst flood.

  16. Disaster Debris Recovery Database - Landfills

    Data.gov (United States)

    U.S. Environmental Protection Agency — The US EPA Disaster Debris Recovery Database (DDRD) promotes the proper recovery, recycling, and disposal of disaster debris for emergency responders at the federal,...

  17. Disaster Debris Recovery Database - Recovery

    Data.gov (United States)

    U.S. Environmental Protection Agency — The US EPA Disaster Debris Recovery Database (DDRD) promotes the proper recovery, recycling, and disposal of disaster debris for emergency responders at the federal,...

  18. Application of Artificial Intelligence technology to the analysis and synthesis of reliable software systems

    Science.gov (United States)

    Wild, Christian; Eckhardt, Dave

    1987-01-01

    The development of a methodology for the production of highly reliable software is one of the greatest challenges facing the computer industry. Meeting this challenge will undoubtedly involve the integration of many technologies. This paper describes the use of Artificial Intelligence technologies in the automated analysis of the formal algebraic specifications of abstract data types. These technologies include symbolic execution of specifications using techniques of automated deduction and machine learning through the use of examples. On-going research into the role of knowledge representation and problem solving in the process of developing software is also discussed.

  19. An Intelligent System for Modelling, Design and Analysis of Chemical Processes

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    ICAS, Integrated Computer Aided System, is a software that consists of a number of intelligent tools, which are very suitable, among others, for computer aided modelling, sustainable design of chemical and biochemical processes, and design-analysis of product-process monitoring systems. Each...... the computer aided modelling tool will illustrate how to generate a desired process model, how to analyze the model equations, how to extract data and identify the model and make it ready for various types of application. In sustainable process design, the example will highlight the issue of integration...

  20. The neural determinants of age-related changes in fluid intelligence: a pre-registered, longitudinal analysis in UK Biobank.

    Science.gov (United States)

    Kievit, Rogier A; Fuhrmann, Delia; Borgeest, Gesa Sophia; Simpson-Kent, Ivan L; Henson, Richard N A

    2018-01-01

    Background:  Fluid intelligence declines with advancing age, starting in early adulthood. Within-subject declines in fluid intelligence are highly correlated with contemporaneous declines in the ability to live and function independently. To support healthy aging, the mechanisms underlying these declines need to be better understood. Methods:  In this pre-registered analysis, we applied latent growth curve modelling to investigate the neural determinants of longitudinal changes in fluid intelligence across three time points in 185,317 individuals (N=9,719 two waves, N=870 three waves) from the UK Biobank (age range: 39-73 years). Results:  We found a weak but significant effect of cross-sectional age on the mean fluid intelligence score, such that older individuals scored slightly lower. However, the mean longitudinal slope was positive, rather than negative, suggesting improvement across testing occasions. Despite the considerable sample size, the slope variance was non-significant, suggesting no reliable individual differences in change over time. This null-result is likely due to the nature of the cognitive test used. In a subset of individuals, we found that white matter microstructure (N=8839, as indexed by fractional anisotropy) and grey-matter volume (N=9931) in pre-defined regions-of-interest accounted for complementary and unique variance in mean fluid intelligence scores. The strongest effects were such that higher grey matter volume in the frontal pole and greater white matter microstructure in the posterior thalamic radiations were associated with higher fluid intelligence scores. Conclusions:  In a large preregistered analysis, we demonstrate a weak but significant negative association between age and fluid intelligence. However, we did not observe plausible longitudinal patterns, instead observing a weak increase across testing occasions, and no significant individual differences in rates of change, likely due to the suboptimal task design
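    For readers unfamiliar with latent growth curve modelling, its generic two-level form can be sketched as below; this is the textbook specification, not the authors' exact model, and the covariate term is included only to indicate where neural measures such as fractional anisotropy or grey-matter volume would enter.

    ```latex
    \begin{align}
      y_{it} &= \eta_{0i} + \eta_{1i}\,\lambda_t + \varepsilon_{it}
        && \text{score of person } i \text{ at occasion } t \\
      \eta_{0i} &= \alpha_0 + \gamma_0\,x_i + \zeta_{0i}
        && \text{latent intercept, } x_i \text{ a neural covariate} \\
      \eta_{1i} &= \alpha_1 + \gamma_1\,x_i + \zeta_{1i}
        && \text{latent slope (rate of change)}
    \end{align}
    ```

    In this notation the mean slope corresponds to the average longitudinal change reported above, and the variance of the slope residuals captures the individual differences in change that were found to be non-significant.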

  1. Loopy, Floppy and Fragmented: Debris Characteristics Matter

    Science.gov (United States)

    Parrish, J.; Burgess, H. K.

    2016-02-01

    Marine debris is a worldwide problem threatening the health and safety of marine organisms, ecosystems, and humans. Recent and ongoing research shows that the risk of harm is not associated with an object's identity, but rather with a set of specific character states, where the intersection of the character state space is defined by the organism of interest. For example, intersections of material, color, rigidity and size predict the likelihood of an object being ingested: plastic, clear-to-white, floppy objects pose risks to sea turtles, whereas yellow-to-red, rigid objects pose risks to albatrosses. A character state space approach allows prioritization of prevention and removal of marine debris, informed by risk assessments for species of interest, by comparing species ranges with spatio-temporal hotspots of all debris with characteristics known to be associated with increased risk of harm, regardless of identity. With this in mind, the Coastal Observation and Seabird Survey Team (COASST) developed and tested a 20-character data collection approach to quantifying the diversity and abundance of marine debris found on beaches. Development involved a meta-analysis of the literature and expert opinion to elicit the harmful character state space. Testing included data collection on inter-rater reliability and accuracy; the latter involved 75 participants quantifying marine debris characteristics on monthly surveys of 30 beaches along the Washington and Oregon coastlines over the past year. Pilot work indicates that characters must be simply and operationally defined, states must be listed, and examples must be provided for color states. Complex characters (e.g., windage, shape) are not replicable across multiple data collectors. Although data collection takes longer than other marine debris surveys for a given amount of debris and area surveyed, volunteer rapidity and accuracy improved within 3-5 surveys. Initial feedback indicated that volunteers were willing to continue collecting data as long as they

  2. Space Debris Mitigation Guidelines

    Science.gov (United States)

    Johnson, Nicholas L.

    2011-01-01

    The purpose of national and international space debris mitigation guidelines is to promote the preservation of near-Earth space for applications and exploration missions far into the future. To accomplish this objective, the accumulation of objects, particularly in long-lived orbits, must be eliminated or curtailed.

  3. Recent developments in spatial analysis spatial statistics, behavioural modelling, and computational intelligence

    CERN Document Server

    Getis, Arthur

    1997-01-01

    In recent years, spatial analysis has become an increasingly active field, as evidenced by the establishment of educational and research programs at many universities. Its popularity is due mainly to new technologies and the development of spatial data infrastructures. This book illustrates some recent developments in spatial analysis, behavioural modelling, and computational intelligence. World-renowned spatial analysts explain and demonstrate their new and insightful models and methods. The applications are in areas of societal interest such as the spread of infectious diseases, migration behaviour, and retail and agricultural location strategies. In addition, there is emphasis on the uses of new technologies for the analysis of spatial data through the application of neural network concepts.

  4. Research on Intelligent Avoidance Method of Shipwreck Based on Bigdata Analysis

    Directory of Open Access Journals (Sweden)

    Li Wei

    2017-11-01

    Full Text Available In order to address the low success rate of current shipwreck avoidance methods, this paper proposes an intelligent shipwreck avoidance method based on big data analysis. First, big data analysis is used to calculate the safe distance of approach between ships in the head-on, crossing and overtaking situations. On this basis, the degree of immediate danger of the ships is determined by calculating the collision risk degree. Finally, three evaluation functions for ship navigation are calculated, and a genetic algorithm is used to realize intelligent shipwreck avoidance. Experimental results show that, compared with the traditional method, when two ships meet and the distance to the closest point of approach between them is 0.13 nmile, they can evade each other effectively; the success rate of avoidance is high.
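    A minimal sketch of the closest-point-of-approach geometry that underlies such collision-risk calculations (DCPA/TCPA). This is the standard constant-velocity kinematic formula, not the paper's specific risk-degree model, and all numbers are illustrative.

    ```python
    import numpy as np

    def cpa(own_pos, own_vel, tgt_pos, tgt_vel):
        """Return (DCPA, TCPA) for two ships holding course and speed.

        Positions in nautical miles, velocities in knots; TCPA is in hours.
        """
        r = np.asarray(tgt_pos, float) - np.asarray(own_pos, float)  # relative position
        v = np.asarray(tgt_vel, float) - np.asarray(own_vel, float)  # relative velocity
        v2 = float(np.dot(v, v))
        tcpa = 0.0 if v2 == 0.0 else -float(np.dot(r, v)) / v2       # time of closest approach
        dcpa = float(np.linalg.norm(r + v * max(tcpa, 0.0)))         # miss distance at that time
        return dcpa, tcpa

    # Example: a crossing encounter; a small DCPA flags a risky situation.
    print(cpa(own_pos=(0, 0), own_vel=(0, 12), tgt_pos=(3, 4), tgt_vel=(-10, 0)))
    ```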

  5. Meta-analysis of fluid intelligence tests of children from the Chinese mainland with learning difficulties.

    Science.gov (United States)

    Tong, Fang; Fu, Tong

    2013-01-01

    To evaluate the differences in fluid intelligence tests between normal children and children with learning difficulties in China. PubMed, MD Consult, and other Chinese journal databases were searched from their establishment to November 2012. After finding comparative studies of Raven measurements of normal children and children with learning difficulties, full Intelligence Quotient (FIQ) values and the original values of the sub-measurements were extracted. The corresponding effect model was selected based on the results of heterogeneity testing, and parallel sub-group analysis was performed. Twelve documents were included in the meta-analysis, and the studies were all performed in the Chinese mainland. Among these, two studies were performed at child health clinics; the other ten sites were schools, and control children were schoolmates or classmates. FIQ was evaluated using a random effects model. The WMD was -13.18 (95% CI: -16.50 to -9.85). Children with learning difficulties showed significantly lower FIQ scores than controls (P<0.00001). In the Chinese mainland, the level of fluid intelligence of children with learning difficulties was lower than that of normal children. Delayed development in sub-items of C, D, and E was more obvious.

  6. Post-traumatic amnesia predicts intelligence impairment following traumatic brain injury: a meta-analysis

    NARCIS (Netherlands)

    Konigs, M.; de Kieviet, J.F.; Oosterlaan, J.

    2012-01-01

    Context: Worldwide, millions of patients with traumatic brain injury (TBI) suffer from persistent and disabling intelligence impairment. Post-traumatic amnesia (PTA) duration is a promising predictor of intelligence following TBI. Objectives: To determine (1) the impact of TBI on intelligence

  7. Spiritual Intelligence, Emotional Intelligence and Auditor’s Performance

    OpenAIRE

    Hanafi, Rustam

    2010-01-01

    The objective of this research was to investigate empirical evidence about the influence of auditor spiritual intelligence on performance, with emotional intelligence as a mediator variable. Linear regression models are developed to examine the hypothesis together with path analysis. The dependent variable of each model is auditor performance, whereas the independent variable of model 1 is spiritual intelligence and those of model 2 are emotional intelligence and spiritual intelligence. The parameters were estima...

  8. Naturalist Intelligence Among the Other Multiple Intelligences [In Bulgarian

    Directory of Open Access Journals (Sweden)

    R. Genkov

    2007-09-01

    Full Text Available The theory of multiple intelligences was presented by Gardner in 1983. The theory was revised later (1999), and among the other intelligences a naturalist intelligence was added. The criteria for distinguishing the different types of intelligences are considered. While Gardner restricted the analysis of naturalist intelligence to examples from living nature only, the present paper considers this problem against a wider background, including objects and persons of the natural sciences.

  9. GWAS-based pathway analysis differentiates between fluid and crystallized intelligence.

    Science.gov (United States)

    Christoforou, A; Espeseth, T; Davies, G; Fernandes, C P D; Giddaluru, S; Mattheisen, M; Tenesa, A; Harris, S E; Liewald, D C; Payton, A; Ollier, W; Horan, M; Pendleton, N; Haggarty, P; Djurovic, S; Herms, S; Hoffman, P; Cichon, S; Starr, J M; Lundervold, A; Reinvang, I; Steen, V M; Deary, I J; Le Hellard, S

    2014-09-01

    Cognitive abilities vary among people. About 40-50% of this variability is due to general intelligence (g), which reflects the positive correlation among individuals' scores on diverse cognitive ability tests. g is positively correlated with many life outcomes, such as education, occupational status and health, motivating the investigation of its underlying biology. In psychometric research, a distinction is made between general fluid intelligence (gF) - the ability to reason in novel situations - and general crystallized intelligence (gC) - the ability to apply acquired knowledge. This distinction is supported by developmental and cognitive neuroscience studies. Classical epidemiological studies and recent genome-wide association studies (GWASs) have established that these cognitive traits have a large genetic component. However, no robust genetic associations have been published thus far due largely to the known polygenic nature of these traits and insufficient sample sizes. Here, using two GWAS datasets, in which the polygenicity of gF and gC traits was previously confirmed, a gene- and pathway-based approach was undertaken with the aim of characterizing and differentiating their genetic architecture. Pathway analysis, using genes selected on the basis of relaxed criteria, revealed notable differences between these two traits. gF appeared to be characterized by genes affecting the quantity and quality of neurons and therefore neuronal efficiency, whereas long-term depression (LTD) seemed to underlie gC. Thus, this study supports the gF-gC distinction at the genetic level and identifies functional annotations and pathways worthy of further investigation. © 2014 The Authors. Genes, Brain and Behavior published by International Behavioural and Neural Genetics Society and John Wiley & Sons Ltd.

  10. Advanced intelligence and mechanism approach

    Institute of Scientific and Technical Information of China (English)

    ZHONG Yixin

    2007-01-01

    Advanced intelligence will be the focus of intelligence research over the next 50 years. An understanding of the concept of advanced intelligence, as well as its importance, is provided first, followed by a detailed analysis of an approach suitable to advanced intelligence research, the mechanism approach. The mutual relationship among the mechanism approach, the traditional approaches in artificial intelligence research, and cognitive informatics is then discussed. It is interesting to discover that the mechanism approach is well suited to advanced intelligence research and is a unified form of the existing approaches to artificial intelligence.

  11. ANALYSIS DATA SETS USING HYBRID TECHNIQUES APPLIED ARTIFICIAL INTELLIGENCE BASED PRODUCTION SYSTEMS INTEGRATED DESIGN

    Directory of Open Access Journals (Sweden)

    Daniel-Petru GHENCEA

    2017-06-01

    Full Text Available The paper proposes a prediction model of spindle behaviour, from the point of view of thermal deformations and vibration levels, by highlighting and processing the characteristic equations. A model analysis for spindles with similar electro-mechanical characteristics can be achieved using a hybrid analysis based on artificial intelligence (genetic algorithms - artificial neural networks - fuzzy logic). The paper presents a prediction mode for obtaining a valid range of values for spindles with similar characteristics, based on data sets measured on a few test spindles, without additional measurements being required. Extracting polynomial functions from the graphs resulting from simultaneous measurements and predicting the dynamics of the two features with a multi-objective criterion is the main advantage of this method.
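    A minimal sketch of the "extract a polynomial characteristic function from a measured curve" step mentioned above, using an ordinary least-squares fit; the data, the exponential warm-up shape and the polynomial degree are illustrative assumptions, not values from the cited study.

    ```python
    import numpy as np

    # Hypothetical measured curve: spindle temperature rise versus running time (minutes).
    rng = np.random.default_rng(2)
    time_min = np.linspace(0, 120, 25)
    temp_rise = 18.0 * (1.0 - np.exp(-time_min / 35.0)) + rng.normal(0.0, 0.2, time_min.size)

    # Fit a low-degree polynomial as a compact characteristic equation of the behaviour;
    # such functions can then feed a GA / neural-network / fuzzy prediction stage.
    coeffs = np.polyfit(time_min, temp_rise, deg=3)
    characteristic = np.poly1d(coeffs)

    print(characteristic(60.0))  # predicted temperature rise after 60 minutes of running
    ```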

  12. LEARNING STYLES BASED ADAPTIVE INTELLIGENT TUTORING SYSTEMS: DOCUMENT ANALYSIS OF ARTICLES PUBLISHED BETWEEN 2001. AND 2016.

    Directory of Open Access Journals (Sweden)

    Amit Kumar

    2017-12-01

    Full Text Available Implementing instructional interventions to accommodate learner differences has received considerable attention. Among these individual difference factors, the empirical evidence regarding the pedagogical benefit of learning styles has been questioned, yet research on the issue continues. Recent developments in web-based implementations have led researchers to re-examine learning styles in adaptive tutoring frameworks. Adaptivity in intelligent tutoring systems is strongly influenced by the learning style of a learner. This study involved an extensive document analysis of adaptive tutoring systems based on learning styles. Seventy-eight studies in the literature from 2001 to 2016 were collected and classified under selected parameters such as main focus, purpose, research type, methods, types and levels of participants, field/area of application, learner modelling, data gathering tools used, and research findings. The current studies reveal that the majority of the studies defined a framework or architecture of an adaptive intelligent tutoring system (AITS), while others focused on the impact of AITS on learner satisfaction and academic outcomes. Current trends, gaps in the literature and implications are discussed.

  13. Forensic intelligence applied to questioned document analysis: A model and its application against organized crime.

    Science.gov (United States)

    De Alcaraz-Fossoul, Josep; Roberts, Katherine A

    2017-07-01

    The capability of forensic sciences to fight crime, especially against organized criminal groups, becomes relevant in the recent economic downturn and the war on terrorism. In view of these societal challenges, the methods of combating crime should experience critical changes in order to improve the effectiveness and efficiency of the current resources available. It is obvious that authorities have serious difficulties combating criminal groups of transnational nature. These are characterized as well structured organizations with international connections, abundant financial resources and comprised of members with significant and diverse expertise. One common practice among organized criminal groups is the use of forged documents that allow for the commission of illegal cross-border activities. Law enforcement can target these movements to identify counterfeits and establish links between these groups. Information on document falsification can become relevant to generate forensic intelligence and to design new strategies against criminal activities of this nature and magnitude. This article discusses a methodology for improving the development of forensic intelligence in the discipline of questioned document analysis. More specifically, it focuses on document forgeries and falsification types used by criminal groups. It also describes the structure of international criminal organizations that use document counterfeits as means to conduct unlawful activities. The model presented is partially based on practical applications of the system that have resulted in satisfactory outcomes in our laboratory. Copyright © 2017 The Chartered Society of Forensic Sciences. Published by Elsevier B.V. All rights reserved.

  14. The psychology of intelligence analysis: drivers of prediction accuracy in world politics.

    Science.gov (United States)

    Mellers, Barbara; Stone, Eric; Atanasov, Pavel; Rohrbaugh, Nick; Metz, S Emlen; Ungar, Lyle; Bishop, Michael M; Horowitz, Michael; Merkle, Ed; Tetlock, Philip

    2015-03-01

    This article extends psychological methods and concepts into a domain that is as profoundly consequential as it is poorly understood: intelligence analysis. We report findings from a geopolitical forecasting tournament that assessed the accuracy of more than 150,000 forecasts of 743 participants on 199 events occurring over 2 years. Participants were above average in intelligence and political knowledge relative to the general population. Individual differences in performance emerged, and forecasting skills were surprisingly consistent over time. Key predictors were (a) dispositional variables of cognitive ability, political knowledge, and open-mindedness; (b) situational variables of training in probabilistic reasoning and participation in collaborative teams that shared information and discussed rationales (Mellers, Ungar, et al., 2014); and (c) behavioral variables of deliberation time and frequency of belief updating. We developed a profile of the best forecasters; they were better at inductive reasoning, pattern detection, cognitive flexibility, and open-mindedness. They had greater understanding of geopolitics, training in probabilistic reasoning, and opportunities to succeed in cognitively enriched team environments. Last but not least, they viewed forecasting as a skill that required deliberate practice, sustained effort, and constant monitoring of current affairs. PsycINFO Database Record (c) 2015 APA, all rights reserved.

  15. Risk and size estimation of debris flow caused by storm rainfall in mountain regions

    Institute of Scientific and Technical Information of China (English)

    CHENG; Genwei

    2003-01-01

    Debris flow is a common disaster in mountain regions. The valley slope, storm rainfall and accumulated sand-rock materials in a watershed may influence the type of debris flow. The bursting of a debris flow is not a purely random event: field investigations show the periodicity of its bursts, although no direct evidence has been found yet. A risk definition of debris flow is proposed here based upon the accumulation and the starting conditions of loose material in the channel. According to this definition, the risk of debris flow is quasi-periodic. A formula for risk estimation is derived, and analysis of the relevant factors reveals the relationship between the frequency and size of debris flows. For a debris-flow creek, the longer the time interval between two occurrences of debris flows, the bigger the bursting event will be.

  16. Laser Remediation of Threats Posed by Small Orbital Debris

    Science.gov (United States)

    Fork, Richard L.; Rogers, Jan R.; Hovater, Mary A.

    2012-01-01

    The continually increasing amount of orbital debris in near Earth space poses an increasing challenge to space situational awareness. Recent collisions of spacecraft caused abrupt increases in the density of both large and small debris in near Earth space. An especially challenging class of threats is that due to the increasing density of small (1 mm to 10 cm dimension) orbital debris. This small debris poses a serious threat since: (1) the high velocity enables even millimeter-dimension debris to cause serious damage to vulnerable areas of space assets, e.g., detector windows; (2) the small size and large number of debris elements prevent adequate detection and cataloguing. We have identified solutions to this threat in the form of novel laser systems and novel ways of using these laser systems. While implementation of the solutions we identify is challenging, we find approaches offering threat mitigation within time frames and at costs of practical interest. We base our analysis on the unique combination of coherent light specifically structured in both space and time and applied in novel ways, entirely within the vacuum of space, to deorbit small debris. We compare and contrast laser-based small debris removal strategies using ground-based laser systems with strategies using space-based laser systems. We find that laser systems located and used entirely within space offer essential and decisive advantages over ground-based laser systems.

  17. Active Debris Removal and the Challenges for Environment Remediation

    Science.gov (United States)

    Liou, J. C.

    2012-01-01

    Recent modeling studies on the instability of the debris population in the low Earth orbit (LEO) region and the collision between Iridium 33 and Cosmos 2251 have underlined the need for active debris removal. A 2009 analysis by the NASA Orbital Debris Program Office shows that, in order to maintain the LEO debris population at a constant level for the next 200 years, an active debris removal of about five objects per year is needed. The targets identified for removal are those with the highest mass and collision probability products in the environment. Many of these objects are spent upper stages with masses ranging from 1 to more than 8 metric tons, residing in several altitude regions and concentrated in about 7 inclination bands. To remove five of those objects on a yearly basis, in a cost-effective manner, represents many challenges in technology development, engineering, and operations. This paper outlines the fundamental rationale for considering active debris removal and addresses the two possible objectives of the operations -- removing large debris to stabilize the environment and removing small debris to reduce the threat to operational spacecraft. Technological and engineering challenges associated with the two different objectives are also discussed.

  18. The application and development of artificial intelligence in smart clothing

    Science.gov (United States)

    Wei, Xiong

    2018-03-01

    This paper mainly introduces the application of artificial intelligence in intelligent clothing. Starting from the development trend of artificial intelligence, it analyses the prospects for the development of smart clothing with artificial intelligence, summarizes the key design considerations for artificial intelligence in smart clothing, and analyses the feasibility of artificial intelligence in smart clothing.

  19. Meta-analysis of fluid intelligence tests of children from the Chinese mainland with learning difficulties.

    Directory of Open Access Journals (Sweden)

    Fang Tong

    Full Text Available OBJECTIVE: To evaluate the differences in fluid intelligence tests between normal children and children with learning difficulties in China. METHOD: PubMed, MD Consult, and other Chinese journal databases were searched from their establishment to November 2012. After finding comparative studies of Raven measurements of normal children and children with learning difficulties, full Intelligence Quotient (FIQ) values and the original values of the sub-measurements were extracted. The corresponding effect model was selected based on the results of heterogeneity testing, and parallel sub-group analysis was performed. RESULTS: Twelve documents were included in the meta-analysis, and the studies were all performed in the Chinese mainland. Among these, two studies were performed at child health clinics; the other ten sites were schools, and control children were schoolmates or classmates. FIQ was evaluated using a random effects model. The WMD was -13.18 (95% CI: -16.50 to -9.85). Children with learning difficulties showed significantly lower FIQ scores than controls (P<0.00001). Type of learning difficulty and gender differences were evaluated using a fixed-effects model (I² = 0%). The sites and purposes of the studies evaluated here were taken into account, but the sources of heterogeneity could not be eliminated. The sum IQ of all the subgroups showed considerable heterogeneity (I² = 76.5%). The sub-measurement score of document A showed moderate heterogeneity among all documents, and AB, B, and E showed considerable heterogeneity, so a random effects model was used. Individuals with learning difficulties showed heterogeneity as well. There was a moderate delay in the first three items (-0.5 to -0.9) and a much more pronounced delay in the latter three items (-1.4 to -1.6). CONCLUSION: In the Chinese mainland, the level of fluid intelligence of children with learning difficulties was lower than that of normal children. Delayed development in sub-items of C, D, and E was more obvious.

  20. Intelligible Artificial Intelligence

    OpenAIRE

    Weld, Daniel S.; Bansal, Gagan

    2018-01-01

    Since Artificial Intelligence (AI) software uses techniques like deep lookahead search and stochastic optimization of huge neural networks to fit mammoth datasets, it often results in complex behavior that is difficult for people to understand. Yet organizations are deploying AI algorithms in many mission-critical settings. In order to trust their behavior, we must make it intelligible --- either by using inherently interpretable models or by developing methods for explaining otherwise overwh...

  1. INFORMATION ARCHITECTURE ANALYSIS USING BUSINESS INTELLIGENCE TOOLS BASED ON THE INFORMATION NEEDS OF EXECUTIVES

    Directory of Open Access Journals (Sweden)

    Fabricio Sobrosa Affeldt

    2013-08-01

    Full Text Available Devising an information architecture system that enables an organization to centralize information regarding its operational, managerial and strategic performance is one of the challenges currently facing information technology. The present study aimed to analyze an information architecture system developed using Business Intelligence (BI technology. The analysis was performed based on a questionnaire enquiring as to whether the information needs of executives were met during the process. A theoretical framework was applied consisting of information architecture and BI technology, using a case study methodology. Results indicated that the transaction processing systems studied did not meet the information needs of company executives. Information architecture using data warehousing, online analytical processing (OLAP tools and data mining may provide a more agile means of meeting these needs. However, some items must be included and others modified, in addition to improving the culture of information use by company executives.

  2. Comparison of two solution ways of district heating control: Using analysis methods, using artificial intelligence methods

    Energy Technology Data Exchange (ETDEWEB)

    Balate, J.; Sysala, T. [Technical Univ., Zlin (Czech Republic). Dept. of Automation and Control Technology

    1997-12-31

    District Heating Systems (DHS), also called Centralized Heat Supply Systems (CHSS), are being developed in large cities in accordance with their growth. The systems are formed by enlarging the networks of heat distribution to consumers and, at the same time, by interconnecting the heat sources gradually built. The heat is distributed to the consumers through circular networks that are supplied by several cooperating heat sources, that is, by combined power and heating plants and by heating plants. The complicated process of heat production and supply requires a systems approach when designing the concept of automated control. The paper compares the solution using analysis methods with that using artificial intelligence methods. (orig.)

  3. Confirmatory Factor Analysis of the Wechsler Intelligence Scale for Children-Third Edition in an Australian Clinical Sample

    Science.gov (United States)

    Cockshott, Felicity C.; Marsh, Nigel V.; Hine, Donald W.

    2006-01-01

    A confirmatory factor analysis was conducted on the Wechsler Intelligence Scale for Children-Third Edition (WISC-III; D. Wechsler, 1991) with a sample of 579 Australian children referred for assessment because of academic difficulties in the classroom. The children were administered the WISC-III as part of the initial eligibility determination…

  4. Do Narcissism and Emotional Intelligence Win Us Friends? Modeling Dynamics of Peer Popularity Using Inferential Network Analysis.

    Science.gov (United States)

    Czarna, Anna Z; Leifeld, Philip; Śmieja, Magdalena; Dufner, Michael; Salovey, Peter

    2016-09-27

    This research investigated effects of narcissism and emotional intelligence (EI) on popularity in social networks. In a longitudinal field study, we examined the dynamics of popularity in 15 peer groups in two waves (N = 273). We measured narcissism, ability EI, and explicit and implicit self-esteem. In addition, we measured popularity at zero acquaintance and 3 months later. We analyzed the data using inferential network analysis (temporal exponential random graph modeling, TERGM) accounting for self-organizing network forces. People high in narcissism were popular, but increased less in popularity over time than people lower in narcissism. In contrast, emotionally intelligent people increased more in popularity over time than less emotionally intelligent people. The effects held when we controlled for explicit and implicit self-esteem. These results suggest that narcissism is rather disadvantageous and that EI is rather advantageous for long-term popularity. © 2016 by the Society for Personality and Social Psychology, Inc.

  5. Analysis of Computer-Aided and Artificial Intelligence Technologies and Solutions in Service Industries in Russia

    OpenAIRE

    Rezanov, Vladislav

    2013-01-01

    The primary objective of this research study was to investigate the relationship between Computer-Aided and Artificial Intelligence Technologies and customer satisfaction in the context of businesses in Russia. The research focuses on methods of Artificial Intelligence technology application in business and its effect on customer satisfaction. The researcher introduces Artificial Intelligence and studies the forecasting approaches in relation to business operations. The rese...

  6. Artificial intelligence techniques used in respiratory sound analysis--a systematic review.

    Science.gov (United States)

    Palaniappan, Rajkumar; Sundaraj, Kenneth; Sundaraj, Sebastian

    2014-02-01

    Artificial intelligence (AI) has recently been established as an alternative method to many conventional methods. The implementation of AI techniques for respiratory sound analysis can assist medical professionals in the diagnosis of lung pathologies. This article highlights the importance of AI techniques in the implementation of computer-based respiratory sound analysis. Articles on computer-based respiratory sound analysis using AI techniques were identified by searches conducted on various electronic resources, such as the IEEE, Springer, Elsevier, PubMed, and ACM digital library databases. Brief descriptions of the types of respiratory sounds and their respective characteristics are provided. We then analyzed each of the previous studies to determine the specific respiratory sounds/pathology analyzed, the number of subjects, the signal processing method used, the AI techniques used, and the performance of the AI technique used in the analysis of respiratory sounds. A detailed description of each of these studies is provided. In conclusion, this article provides recommendations for further advancements in respiratory sound analysis.

  7. Debris flow-induced topographic changes: effects of recurrent debris flow initiation.

    Science.gov (United States)

    Chen, Chien-Yuan; Wang, Qun

    2017-08-12

    Chushui Creek in Shengmu Village, Nantou County, Taiwan, was analyzed for recurrent debris flow using numerical modeling and geographic information system (GIS) spatial analysis. The two-dimensional water flood and mudflow simulation program FLO-2D was used to simulate debris flows induced by rainfall during Typhoon Herb in 1996 and Typhoon Mindulle in 2004. Changes in topographic characteristics after the debris flows were simulated with respect to the initiating hydrological characteristics, magnitude, and affected area. The topographic characteristics considered included elevation, slope, aspect, stream power index (SPI), topographic wetness index (TWI), and hypsometric curve integral (HI), all of which were analyzed using GIS spatial analysis. The results show that the SPI and the peak discharge in the basin increased after a recurrence of debris flow. The TWI was higher in 2003 than in 2004, indicating a higher potential for landslide initiation where the slope of the basin was steeper. The HI revealed that the basin was in its mature stage and was shifting toward the old stage. The numerical simulations demonstrated that the mean depth, maximum depth, affected area, mean flow rate, maximum flow rate, and peak flow discharge all increased after recurrent debris flow, and the peak discharge occurred more quickly.
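    For reference, the two terrain indices named above have standard definitions (TWI = ln(a / tan β), SPI = a · tan β, with a the specific catchment area and β the local slope); the snippet below applies them to assumed flow-accumulation and slope inputs, which in practice would come from a prior DEM analysis step.

    ```python
    import numpy as np

    def terrain_indices(flow_acc, slope_deg, cell_size=10.0):
        """Topographic wetness index (TWI) and stream power index (SPI).

        flow_acc : upslope contributing cells from a DEM flow-accumulation run
        slope_deg: local slope in degrees
        cell_size: DEM resolution in metres
        """
        beta = np.radians(np.asarray(slope_deg, dtype=float))
        tan_b = np.clip(np.tan(beta), 1e-6, None)                  # avoid division by zero on flats
        a = (np.asarray(flow_acc, dtype=float) + 1.0) * cell_size  # specific catchment area proxy
        twi = np.log(a / tan_b)
        spi = a * tan_b
        return twi, spi

    # Example on two cells: a gentle ridge cell and a steep, high-accumulation channel cell.
    print(terrain_indices(flow_acc=np.array([2, 500]), slope_deg=np.array([3, 25])))
    ```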

  8. Characterizing Longitude-Dependent Orbital Debris Congestion in the Geosynchronous Orbit Regime

    Science.gov (United States)

    Anderson, Paul V.

    The geosynchronous orbit (GEO) is a unique commodity of the satellite industry that is becoming increasingly contaminated with orbital debris, but is heavily populated with high-value assets from the civil, commercial, and defense sectors. The GEO arena is home to hundreds of communications, data transmission, and intelligence satellites collectively insured for an estimated 18.3 billion USD. As the lack of natural cleansing mechanisms at the GEO altitude renders the lifetimes of GEO debris essentially infinite, conjunction and risk assessment must be performed to safeguard operational assets from debris collisions. In this thesis, longitude-dependent debris congestion is characterized by predicting the number of near-miss events per day for every longitude slot at GEO, using custom debris propagation tools and a torus intersection metric. Near-miss events with the present-day debris population are assigned risk levels based on GEO-relative position and speed, and this risk information is used to prioritize the population for debris removal target selection. Long-term projections of debris growth under nominal launch traffic, mitigation practices, and fragmentation events are also discussed, and latitudinal synchronization of the GEO debris population is explained via node variations arising from luni-solar gravity. In addition to characterizing localized debris congestion in the GEO ring, this thesis further investigates the conjunction risk to operational satellites or debris removal systems applying low-thrust propulsion to raise orbit altitude at end-of-life to a super-synchronous disposal orbit. Conjunction risks as a function of thrust level, miss distance, longitude, and semi-major axis are evaluated, and a guidance method for evading conjuncting debris with continuous thrust by means of a thrust heading change via single-shooting is developed.

  9. Persistent marine debris

    International Nuclear Information System (INIS)

    Levy, E.M.

    1992-01-01

    In this paper the distribution of persistent marine debris, adrift on world oceans and stranded on beaches globally, is reviewed and related to the known inputs and transport by the major surface currents. Since naturally occurring processes eventually degrade petroleum in the environment, international measures to reduce the inputs have been largely successful in alleviating oil pollution on a global, if not on a local, scale. Many plastics, however, are so resistant to natural degradation that merely controlling inputs will be insufficient, and more drastic and costly measures will be needed to cope with the emerging global problem posed by these materials

  10. A probabilistic approach for debris impact risk with numerical simulations of debris behaviors

    International Nuclear Information System (INIS)

    Kihara, Naoto; Matsuyama, Masafumi; Fujii, Naoki

    2013-01-01

    We propose a probabilistic approach for evaluating the impact risk of tsunami debris through Monte Carlo simulations with a combined system comprising a depth-averaged two-dimensional shallow water model and a discrete element model customized to simulate the motions of floating objects such as vessels. In the proposed method, first, probabilistic tsunami hazard analysis is carried out, and the exceedance probability of tsunami height and numerous tsunami time series for various hazard levels on the offshore side of a target site are estimated. Second, a characteristic tsunami time series for each hazard level is created by cluster analysis. Third, using the Monte Carlo simulation model, the probability of debris impact on the buildings of interest and the exceedance probability of debris impact speed are evaluated. (author)
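    A minimal sketch of the Monte Carlo bookkeeping described above, with a toy stand-in for the single-run simulator; in the cited approach each run would couple a shallow-water solver with a discrete element model of the floating debris, which is far beyond this illustration, so every number and distribution here is an assumption.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def simulate_debris_run(tsunami_height):
        """Toy stand-in for one coupled shallow-water / discrete-element run.

        Returns (hit, impact_speed) for a target building so that the
        exceedance-probability bookkeeping can be demonstrated."""
        hit = rng.random() < min(1.0, 0.1 * tsunami_height)
        speed = rng.gamma(shape=2.0, scale=tsunami_height) if hit else 0.0
        return hit, speed

    heights = rng.lognormal(mean=1.0, sigma=0.4, size=5_000)  # sampled hazard levels
    results = [simulate_debris_run(h) for h in heights]
    speeds = np.array([s for hit, s in results if hit])

    p_impact = speeds.size / len(results)
    p_speed_gt_5 = (np.mean(speeds > 5.0) * p_impact) if speeds.size else 0.0
    print(f"P(impact) = {p_impact:.3f}, P(impact and speed > 5 m/s) = {p_speed_gt_5:.3f}")
    ```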

  11. Safe disposal and recycling of water disaster debris in pakistan

    International Nuclear Information System (INIS)

    Latif, A.

    2014-01-01

    Depending upon its nature, a disaster may produce large masses of debris, and the waste from individual disasters adds up to large quantities annually. This ultimately places an extra workload on personnel and reflects the weakness of existing debris management facilities; it also lengthens the time needed to rehabilitate the debris-affected regions. The study focuses on two main disaster cases: the 2005 earthquake and the 2010 flood in Pakistan. The complete analysis involves two stages: the first stage develops guidance on disasters and the effects of disaster debris, whereas the second stage develops a set of criteria for an efficient and environmentally positive debris management scheme. These principles were then employed to evaluate the efficiency of the debris management schemes in the detailed analysis. The discussion of the detailed analysis presents a methodology that assists disaster managers, planners and researchers in simplifying a multitude of work. Moreover, the guidance on disasters and disaster debris effects, the impact evaluation criteria and the management criteria have been established such that they can be put into service for prospective debris management schemes, planning and response. Depending on its character and severity, a calamity may generate large volumes of waste. Past disasters in the United States (US) show that, in some situations, a single event produced waste masses approximately five to fifteen times the annual waste generation rate. Similar results were revealed by the Indian Ocean tsunami. Such large masses may overwhelm the existing solid waste management system and human resources. A major disaster yields large masses of debris in a few hours or sometimes even minutes. The volume of disaster debris depends upon the number of trees uprooted and the damage to houses, businesses, services, etc. The disaster remains may be equally large in metropolitan and non

  12. The use of artificial intelligence methods for visual analysis of properties of surface layers

    Directory of Open Access Journals (Sweden)

    Tomasz Wójcicki

    2014-12-01

    Full Text Available The article presents a selected area of research on the possibility of automatically predicting material properties based on the analysis of digital images. An original, holistic model for forecasting the properties of surface layers is presented, based on a multi-step process that includes selected methods of image processing and analysis, inference using a priori knowledge bases and multi-valued fuzzy logic, and simulation using finite element methods. The characteristics of surface layers and the core technologies of their production processes, such as mechanical, thermal, thermo-mechanical, thermo-chemical, electrochemical and physical processes, are discussed. The methods developed for the classification of surface layer images within the model are shown. The objectives of the selected methods of digital image processing and analysis, including techniques for improving image quality, segmentation, morphological transformation, pattern recognition, and simulation of physical phenomena in material structures, are described. Keywords: image analysis, surface layer, artificial intelligence, fuzzy logic

  13. The neural determinants of age-related changes in fluid intelligence: a pre-registered, longitudinal analysis in UK Biobank

    Science.gov (United States)

    Kievit, Rogier A.; Fuhrmann, Delia; Henson, Richard N. A.

    2018-01-01

    Background: Fluid intelligence declines with advancing age, starting in early adulthood. Within-subject declines in fluid intelligence are highly correlated with contemporaneous declines in the ability to live and function independently. To support healthy aging, the mechanisms underlying these declines need to be better understood. Methods: In this pre-registered analysis, we applied latent growth curve modelling to investigate the neural determinants of longitudinal changes in fluid intelligence across three time points in 185,317 individuals (N=9,719 two waves, N=870 three waves) from the UK Biobank (age range: 39-73 years). Results: We found a weak but significant effect of cross-sectional age on the mean fluid intelligence score, such that older individuals scored slightly lower. However, the mean longitudinal slope was positive, rather than negative, suggesting improvement across testing occasions. Despite the considerable sample size, the slope variance was non-significant, suggesting no reliable individual differences in change over time. This null-result is likely due to the nature of the cognitive test used. In a subset of individuals, we found that white matter microstructure (N=8839, as indexed by fractional anisotropy) and grey-matter volume (N=9931) in pre-defined regions-of-interest accounted for complementary and unique variance in mean fluid intelligence scores. The strongest effects were such that higher grey matter volume in the frontal pole and greater white matter microstructure in the posterior thalamic radiations were associated with higher fluid intelligence scores. Conclusions: In a large preregistered analysis, we demonstrate a weak but significant negative association between age and fluid intelligence. However, we did not observe plausible longitudinal patterns, instead observing a weak increase across testing occasions, and no significant individual differences in rates of change, likely due to the suboptimal task design. Finally

  14. Numerical modeling of the debris flows runout

    Directory of Open Access Journals (Sweden)

    Federico Francesco

    2017-01-01

    Full Text Available Rapid debris flows are identified as being among the most dangerous of all landslides. Because of their destructive potential, the runout length has to be predicted in order to define the hazardous areas and design safeguarding measures. To this purpose, a continuum model to predict debris flow mobility is developed. It is based on the well-known depth-integrated avalanche model proposed by Savage and Hutter (the S&H model) to simulate flows of dry granular materials. The conservation of mass and momentum equations, describing the evolving geometry and the depth-averaged velocity distribution, are re-written taking into account the effects of the interstitial pressures and the possible variation of mass along the motion due to erosion/deposition processes. Furthermore, the mechanical behaviour of the debris flow is described by a recently developed rheological law, which allows the dissipative effects of inelastic grain collisions and friction, acting simultaneously within a 'shear layer' typically located at the base of the debris flow, to be taken into account. The governing PDEs are solved by applying the finite difference method. The analysis of a documented case is finally carried out.
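    For orientation, a one-dimensional sketch of the depth-averaged balance laws on which Savage-Hutter-type models are built is given below; this is the generic form with a Coulomb-type basal friction and a lumped mass source term, not the authors' extended formulation with interstitial pressures and a dedicated rheological law for the shear layer.

    ```latex
    \begin{align}
      \frac{\partial h}{\partial t} + \frac{\partial (h\bar{u})}{\partial x} &= E_s \\
      \frac{\partial (h\bar{u})}{\partial t}
        + \frac{\partial}{\partial x}\!\left(h\bar{u}^{2} + \tfrac{1}{2}\,k\,g\,h^{2}\cos\zeta\right)
        &= g\,h\left(\sin\zeta - \mu\,\operatorname{sgn}(\bar{u})\cos\zeta\right)
    \end{align}
    ```

    Here h is the flow depth, ū the depth-averaged velocity, ζ the bed inclination, k an earth-pressure coefficient, μ a basal friction coefficient, and E_s a mass source term standing in for erosion/deposition along the path.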

  15. Emotionally Intelligent Leadership: An Analysis of Targeted Interventions for Aspiring School Leaders in Texas

    Science.gov (United States)

    Kearney, W. Sean; Kelsey, Cheryl; Sinkfield, Carolin

    2014-01-01

    This study measures the impact of targeted interventions on the emotional intelligence of aspiring principals. The interventions utilized were designed by Nelson and Low (2011) to increase emotionally intelligent leadership skills in the following six areas: social awareness/active listening; anxiety management; decision making; appropriate use of…

  16. A Dynamic Analysis of the Effects of Intelligence and Socioeconomic Background on Job-Market Success

    Science.gov (United States)

    Ganzach, Yoav

    2011-01-01

    We compare the effects of socioeconomic background (SEB) and intelligence on wage trajectories in a dynamic growth modeling framework in a sample that had completed just 12 years of education. I show that the main difference between the two is that SEB affected wages solely by its effect on entry pay whereas intelligence affected wages primarily…

  17. Analysis on the Chinese Urban Sustainable Development Demands for the Management Plan of Intelligent Transportation Systems

    Institute of Scientific and Technical Information of China (English)

    赵历男

    2002-01-01

    This article analyzes the demands of the sustainable development and Chinese urban environmental protection for the management plan of intelligent transportation systems. The article also comments on how to work out the management plan of intelligent transportation systems with China's own characteristics.

  18. Humanitarian Intelligence : A Practitioner's Guide to Crisis Analysis and Project Design

    NARCIS (Netherlands)

    Zwitter, Andrej

    2016-01-01

    Humanitarian aid workers are faced with many challenges, from possible terrorist attacks to dealing with difficult stakeholders and securing operational space free from violence. To do their work properly and safely, they need effective intelligence. Humanitarian intelligence refers to the use of

  19. Emotional Intelligence: An Analysis between Implementing The Leader In Me and Fifth-Grade Achievement

    Science.gov (United States)

    Wilkens, Coral L.

    2013-01-01

    Goleman, Boyatzis, and McKee (2002) stated, "Leaders are made, not born" (p. 100). The quote is indicative of the shift in skills necessary to be a successful 21st-century learner. Instead of mere academic competencies, the 21st Century learner will need a different type of intelligence to be successful. Emotional intelligence may be…

  20. A Comparative Analysis of the Use of Competitive Intelligence Tools in a Multinational Corporation

    Science.gov (United States)

    Breese-Vitelli, Jennifer

    2011-01-01

    With the growth of the global economy, organizations large and small are increasingly recognizing that competitive intelligence (CI) is essential to compete in industry. Competitive intelligence is used to gain an advantage in commerce and is useful for analyzing a company's strategic industry position. To remain current and profitable,…

  1. Mitigation of Debris Flow Damage - A Case Study of Debris Flow Damage

    Science.gov (United States)

    Lin, J. C.; Jen, C. H.

    Typhoon Toraji caused more than 30 casualties in Central Taiwan on the 31st July 2001. It was the biggest typhoon since the Chi-Chi earthquake of 1999, bringing huge amounts of rainfall. Because of the influence of the earthquake, loose debris falls and flows became major hazards in Central Taiwan. Analysis of rainfall data and sites of slope failure shows that damage from these natural hazards was enhanced as a result of the Chi-Chi earthquake. Three main types of hazard occurred in Central Taiwan: landslides, debris flows and gully erosion. Landslides occurred mainly along hill slopes and banks of channels. Many dams and houses were destroyed by flooding. Debris flows occurred during typhoon periods and re-activated ancient debris deposits. Many new gullies therefore developed from deposits loosened and shaken by the earthquake. This paper demonstrates the geological/geomorphological background of the hazard area, and reviews methods of damage mitigation in central Taiwan. A good example is Hsi-Tou, which had experienced no gully erosion for more than 40 years. The area experienced much gully erosion as a result of the combined effects of earthquake and typhoon. Although Typhoon Toraji produced only 30% of the rainfall of Typhoon Herb of 1996, it caused more damage in the Hsi-Tou area. The mitigation of debris flow hazards in the Hsi-Tou area is discussed in this paper.

  2. NEPHRUS: model of intelligent multilayers expert system for evaluation of the renal system based on scintigraphic images analysis

    International Nuclear Information System (INIS)

    Silva, Jose W.E. da; Schirru, Roberto; Boasquevisque, Edson M.

    1997-01-01

    This work develops a prototype for a system model based on Artificial Intelligence devices able to perform functions related to scintigraphic image analysis of the urinary system. The criteria used by medical experts for the analysis of images obtained with 99m Tc+DTPA and/or 99m Tc+DMSA were modeled and a multi-resolution diagnosis technique was implemented. Special attention was given to the design of the program's user interface. Human Factors Engineering techniques were considered so as to combine friendliness and robustness. The results obtained using Artificial Neural Networks for the qualitative image analysis, together with the knowledge model constructed, show the feasibility of an Artificial Intelligence implementation that uses the 'inherent' abilities of each technique in the resolution of diagnostic image analysis problems. (author). 12 refs., 2 figs., 2 tabs

  3. Low birth weight and intelligence in adolescence and early adulthood: a meta-analysis.

    Science.gov (United States)

    Kormos, C E; Wilkinson, A J; Davey, C J; Cunningham, A J

    2014-06-01

    Research has demonstrated an association between low birth weight (LBW) and intelligence quotient (IQ) outcomes in childhood and early adolescence. We systematically evaluated whether this association persists into late adolescence and early adulthood and also assessed the influence of age of IQ assessment on effect size. During Stage 1 (meta-analysis of data on adolescents/adults), we searched for relevant articles in PsychINFO, PubMed, Ovid, CINAHL, ProQuest and ERIC until February 2011 (no lower limit). Studies which assessed full-scale IQ among LBW and normal birth weight (NBW) individuals were included; the analysis provided a pooled estimate of the difference in IQ scores between LBW and NBW individuals. Publication bias was assessed using Rosenthal's classic fail-safe N and Duval and Tweedie's Trim and Fill. During Stage 2, we added data from the Kerr-Wilson et al. meta-analysis (which included data from children; in Meta-analysis of the association between preterm delivery and intelligence. Journal of Public Health 2011;33:1-8) to our sample from Stage 1 and conducted a meta-regression to evaluate the effect of age of IQ assessment. Using a total of 15 studies in Stage 1, it was demonstrated that NBW individuals scored an average of 7.63 IQ points higher than LBW individuals, CI = 5.95-9.31. After adjusting for publication bias, NBW samples demonstrated an IQ 4.98 points higher than LBW samples, CI = 3.20-6.77. Furthermore, age at IQ assessment was a significant moderator of the association between birth weight and IQ, in that the effect size decreased from childhood into young adulthood. Cognitive impairments associated with LBW persist into adolescence and early adulthood; however, the influence of LBW on IQ decreases from childhood to young adulthood. These conclusions must be interpreted with caution due to unmeasured variables and possible influence from publication bias. © The Author 2013. Published by Oxford University Press on behalf of Faculty of Public Health. All rights reserved.
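
    To illustrate the pooling step described above, the sketch below computes an inverse-variance weighted mean IQ difference from per-study summaries (a fixed-effect simplification with made-up numbers; the published meta-analysis used its own study set, weighting scheme and publication-bias corrections).

```python
import numpy as np

# Hypothetical per-study summaries: (mean_NBW, sd_NBW, n_NBW, mean_LBW, sd_LBW, n_LBW)
studies = [
    (102.0, 14.0, 120, 95.0, 15.0, 110),
    ( 99.5, 13.0,  80, 93.0, 14.5,  75),
    (101.0, 15.0, 200, 94.5, 15.5, 190),
]

diffs, weights = [], []
for m1, s1, n1, m2, s2, n2 in studies:
    d = m1 - m2                        # raw mean difference (NBW minus LBW)
    var = s1**2 / n1 + s2**2 / n2      # variance of that difference
    diffs.append(d)
    weights.append(1.0 / var)          # inverse-variance weight

diffs, weights = np.array(diffs), np.array(weights)
pooled = np.sum(weights * diffs) / np.sum(weights)
se = np.sqrt(1.0 / np.sum(weights))
ci = (pooled - 1.96 * se, pooled + 1.96 * se)
print(f"pooled IQ difference: {pooled:.2f}, 95% CI [{ci[0]:.2f}, {ci[1]:.2f}]")
```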

  4. Superstorm Sandy marine debris wash-ups on Long Island - What happened to them?

    Science.gov (United States)

    Swanson, R Lawrence; Lwiza, Kamazima; Willig, Kaitlin; Morris, Kaitlin

    2016-07-15

    Superstorm Sandy generated huge quantities of debris in the Long Island, NY coastal zone. However, little appears to have been washed offshore to eventually be returned to Long Island's beaches as marine debris wash-ups. Information for our analysis includes debris collection statistics and very high resolution satellite images, along with wind and sea level data. Rigorous debris collection efforts, along with the meteorological conditions following the storm, appear to have reduced the likelihood of debris wash-ups. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Wholesale debris removal from LEO

    Science.gov (United States)

    Levin, Eugene; Pearson, Jerome; Carroll, Joseph

    2012-04-01

    Recent advances in electrodynamic propulsion make it possible to seriously consider wholesale removal of large debris from LEO for the first time since the beginning of the space era. Cumulative ranking of large groups of the LEO debris population and general limitations of passive drag devices and rocket-based removal systems are analyzed. A candidate electrodynamic debris removal system is discussed that can affordably remove all debris objects over 2 kg from LEO in 7 years. That means removing more than 99% of the collision-generated debris potential in LEO. Removal is performed by a dozen 100-kg propellantless vehicles that react against the Earth's magnetic field. The debris objects are dragged down and released into short-lived orbits below ISS. As an alternative to deorbit, some of them can be collected for storage and possible in-orbit recycling. The estimated cost per kilogram of debris removed is a small fraction of typical launch costs per kilogram. These rates are low enough to open commercial opportunities and create a governing framework for wholesale removal of large debris objects from LEO.

  6. Intelligent Design and Intelligent Failure

    Science.gov (United States)

    Jerman, Gregory

    2015-01-01

    Good Evening, my name is Greg Jerman and for nearly a quarter century I have been performing failure analysis on NASA's aerospace hardware. During that time I had the distinct privilege of keeping the Space Shuttle flying for two thirds of its history. I have analyzed a wide variety of failed hardware from simple electrical cables to cryogenic fuel tanks to high temperature turbine blades. During this time I have found that for all the time we spend intelligently designing things, we need to be equally intelligent about understanding why things fail. The NASA Flight Director for Apollo 13, Gene Kranz, is best known for the expression "Failure is not an option." However, NASA history is filled with failures both large and small, so it might be more accurate to say failure is inevitable. It is how we react and learn from our failures that makes the difference.

  7. Artificial intelligence

    CERN Document Server

    Hunt, Earl B

    1975-01-01

    Artificial Intelligence provides information pertinent to the fundamental aspects of artificial intelligence. This book presents the basic mathematical and computational approaches to problems in the artificial intelligence field.Organized into four parts encompassing 16 chapters, this book begins with an overview of the various fields of artificial intelligence. This text then attempts to connect artificial intelligence problems to some of the notions of computability and abstract computing devices. Other chapters consider the general notion of computability, with focus on the interaction bet

  8. Intelligent mechatronics; Intelligent mechatronics

    Energy Technology Data Exchange (ETDEWEB)

    Hashimoto, H. [The University of Tokyo, Tokyo (Japan). Institute of Industrial Science

    1995-10-01

    Intelligent mechatronics (IM) was explained as follows: research on IM ultimately targets the realization of a robot, but at the present stage the target is the creation of new value through the intellectualization of machines, that is, the combination of the information infrastructure and intelligent machine systems. IM is also considered to consist of actively used computers and micromechatronics. The paper then introduces examples of IM research, mainly those the author is involved in, as listed below: sensor gloves, robot hands, robot eyes, teleoperation, three-dimensional object recognition, mobile robots, magnetic bearings, construction of a remotely controlled unmanned dam, robot networks, sensitivity communication using Neuro Baby, etc. 27 figs.

  9. Space debris: modeling and detectability

    Science.gov (United States)

    Wiedemann, C.; Lorenz, J.; Radtke, J.; Kebschull, C.; Horstmann, A.; Stoll, E.

    2017-01-01

    High precision orbit determination is required for the detection and removal of space debris. Knowledge of the distribution of debris objects in orbit is necessary for orbit determination by active or passive sensors. The results can be used to investigate the orbits on which objects of a certain size can be found at a certain frequency. Knowledge of the orbital distribution of the objects, as well as of their properties, combined with sensor performance models, provides the basis for estimating the expected detection rates. Comprehensive modeling of the space debris environment is required for this. This paper provides an overview of the current state of knowledge about the space debris environment. In particular, non-cataloged small objects are evaluated. Furthermore, improvements concerning the update of the current space debris model are addressed. The model of the space debris environment is based on the simulation of historical events, such as fragmentations due to explosions and collisions that actually occurred in Earth orbits. The orbital distribution of debris is simulated by propagating the orbits, considering all perturbing forces, up to a reference epoch. The modeled object population is compared with measured data and validated. The model provides a statistical distribution of space objects according to their size and number. This distribution is based on the correct consideration of orbital mechanics. This allows for a realistic description of the space debris environment. Subsequently, a realistic prediction can be provided concerning the question of how many pieces of debris can be expected on certain orbits. To validate the model, a software tool has been developed which allows the simulation of the observation behavior of ground-based or space-based sensors. Thus, it is possible to compare the results of published measurement data with simulated detections. This tool can also be used for the simulation of sensor measurement campaigns. It is

  10. Laser space debris removal: now, not later

    Science.gov (United States)

    Phipps, Claude R.

    2015-02-01

    Small (1-10 cm) debris in low Earth orbit (LEO) are extremely dangerous, because they spread the breakup cascade depicted in the movie "Gravity." Laser-Debris-Removal (LDR) is the only solution that can address both large and small debris. In this paper, we briefly review ground-based LDR, and discuss how a polar location can dramatically increase its effectiveness for the important class of sun-synchronous orbit (SSO) objects. No other solutions address the whole problem of large (1000 cm, 1 ton) as well as small debris. Physical removal of small debris (by nets, tethers and so on) is impractical because of the energy cost of matching orbits. We also discuss a new proposal which uses a space-based station in low Earth orbit (LEO), and rapid, head-on interaction in 10-40 s rather than 4 minutes, with high-power bursts of 100 ps, 355 nm pulses from a 1.5 m diameter aperture. The orbiting station employs a "heat-capacity" laser mode with low duty cycle to create an adaptable, robust, dual-mode system which can lower or raise large derelict objects into less dangerous orbits, as well as clear out the small debris in a 400-km thick LEO band. Time-average laser optical power is less than 15 kW. The combination of short pulses and UV wavelength gives lower required energy density (fluence) on target as well as a higher momentum coupling coefficient. This combination leads to much smaller mirrors and lower average power than the ground-based systems we have considered previously. Our system also permits strong defense of specific assets. Analysis gives an estimated cost of about 1k each to re-enter most small debris in a few months, and about 280k each to raise or lower 1-ton objects by 40 km. We believe it can do this for 2,000 such large objects in about four years. Laser ablation is one of the few interactions in nature that propel a distant object without any significant reaction on the source.

  11. A real two-phase submarine debris flow and tsunami

    International Nuclear Information System (INIS)

    Pudasaini, Shiva P.; Miller, Stephen A.

    2012-01-01

    The general two-phase debris flow model proposed by Pudasaini is employed to study subaerial and submarine debris flows, and the tsunami generated by the debris impact at lakes and oceans. The model, which includes three fundamentally new and dominant physical aspects such as enhanced viscous stress, virtual mass, and generalized drag (in addition to buoyancy), constitutes the most generalized two-phase flow model to date. The advantage of this two-phase debris flow model over classical single-phase, or quasi-two-phase models, is that the initial mass can be divided into several parts by appropriately considering the solid volume fraction. These parts include a dry (landslide or rock slide), a fluid (water or muddy water; e.g., dams, rivers), and a general debris mixture material as needed in real flow simulations. This innovative formulation provides an opportunity, within a single framework, to simultaneously simulate the sliding debris (or landslide), the water lake or ocean, the debris impact at the lake or ocean, the tsunami generation and propagation, the mixing and separation between the solid and fluid phases, and the sediment transport and deposition process in the bathymetric surface. Applications of this model include (a) sediment transport on hill slopes, river streams, hydraulic channels (e.g., hydropower dams and plants); lakes, fjords, coastal lines, and aquatic ecology; and (b) submarine debris impact and the rupture of fiber optic, submarine cables and pipelines along the ocean floor, and damage to offshore drilling platforms. Numerical simulations reveal that the dynamics of debris impact induced tsunamis in mountain lakes or oceans are fundamentally different than the tsunami generated by pure rock avalanches and landslides. The analysis includes the generation, amplification and propagation of super tsunami waves and run-ups along coastlines, debris slide and deposition at the bottom floor, and debris shock waves. It is observed that the

  12. A real two-phase submarine debris flow and tsunami

    Energy Technology Data Exchange (ETDEWEB)

    Pudasaini, Shiva P.; Miller, Stephen A. [Department of Geodynamics and Geophysics, Steinmann Institute, University of Bonn Nussallee 8, D-53115, Bonn (Germany)

    2012-09-26

    The general two-phase debris flow model proposed by Pudasaini is employed to study subaerial and submarine debris flows, and the tsunami generated by the debris impact at lakes and oceans. The model, which includes three fundamentally new and dominant physical aspects such as enhanced viscous stress, virtual mass, and generalized drag (in addition to buoyancy), constitutes the most generalized two-phase flow model to date. The advantage of this two-phase debris flow model over classical single-phase, or quasi-two-phase models, is that the initial mass can be divided into several parts by appropriately considering the solid volume fraction. These parts include a dry (landslide or rock slide), a fluid (water or muddy water; e.g., dams, rivers), and a general debris mixture material as needed in real flow simulations. This innovative formulation provides an opportunity, within a single framework, to simultaneously simulate the sliding debris (or landslide), the water lake or ocean, the debris impact at the lake or ocean, the tsunami generation and propagation, the mixing and separation between the solid and fluid phases, and the sediment transport and deposition process in the bathymetric surface. Applications of this model include (a) sediment transport on hill slopes, river streams, hydraulic channels (e.g., hydropower dams and plants); lakes, fjords, coastal lines, and aquatic ecology; and (b) submarine debris impact and the rupture of fiber optic, submarine cables and pipelines along the ocean floor, and damage to offshore drilling platforms. Numerical simulations reveal that the dynamics of debris impact induced tsunamis in mountain lakes or oceans are fundamentally different than the tsunami generated by pure rock avalanches and landslides. The analysis includes the generation, amplification and propagation of super tsunami waves and run-ups along coastlines, debris slide and deposition at the bottom floor, and debris shock waves. It is observed that the

  13. Intelligence, Race, and Genetics

    Science.gov (United States)

    Sternberg, Robert J.; Grigorenko, Elena L.; Kidd, Kenneth K.

    2005-01-01

    In this article, the authors argue that the overwhelming portion of the literature on intelligence, race, and genetics is based on folk taxonomies rather than scientific analysis. They suggest that because theorists of intelligence disagree as to what it is, any consideration of its relationships to other constructs must be tentative at best. They…

  14. Intelligent robot action planning

    Energy Technology Data Exchange (ETDEWEB)

    Vamos, T; Siegler, A

    1982-01-01

    Action planning methods used in intelligent robot control are discussed. Planning is accomplished through environment understanding, environment representation, task understanding and planning, motion analysis and man-machine communication. These fields are analysed in detail. The frames of an intelligent motion planning system are presented. Graphic simulation of the robot's environment and motion is used to support the planning. 14 references.

  15. The Intelligence Dilemma: Proximity and Politicization–Analysis of External Influences

    Directory of Open Access Journals (Sweden)

    Beth Eisenfeld

    2017-06-01

    Full Text Available The relationship between policy-making and strategic intelligence is a source of ongoing discourse. Although there is an abundance of literature about the relationship between consumers and producers of intelligence, consensus as to the relationship between policy makers and intelligence producers is lacking. The two concepts, proximity and politicization, represent the intelligence dilemma that leads to claims of politicization, a word with many interpretations. Most observers of the democratic policy-making process are familiar with the traditional potential sources of politicization, yet those are not the only potential sources, and there is a paucity of literature about external influences on the politicization of intelligence. In democracies, governed by the people through their elected representatives, many individuals and groups interact with policymakers to influence decisions. This article provides a framework for understanding sources of politicization external to the intelligence community. It identifies an outside-in influence and uses three examples to show how this type of stimulus contributes to the politicization of intelligence.

  16. Analysis of rainfall preceding debris flows on the Smědavská hora Mt., Jizerské hory Mts., Czech Republic

    Czech Academy of Sciences Publication Activity Database

    Smolíková, J.; Blahůt, Jan; Vilímek, V.

    2016-01-01

    Roč. 13, č. 4 (2016), s. 683-696 ISSN 1612-510X Institutional support: RVO:67985891 Keywords : debris flow * rainfall pattern * rainfall thresholds * Jizerské hory Mts. * Czech Republic Subject RIV: DE - Earth Magnetism, Geodesy, Geography Impact factor: 3.657, year: 2016

  17. Sustainable Technology Analysis of Artificial Intelligence Using Bayesian and Social Network Models

    Directory of Open Access Journals (Sweden)

    Juhwan Kim

    2018-01-01

    Full Text Available Recent developments in artificial intelligence (AI) have led to a significant increase in the use of AI technologies. Many experts are researching and developing AI technologies in their respective fields, often submitting papers and patent applications as a result. In particular, owing to the characteristics of the patent system that is used to protect the exclusive rights to registered technology, patent documents contain detailed information on the developed technology. Therefore, in this study, we propose a statistical method for analyzing patent data on AI technology to improve our understanding of sustainable technology in the field of AI. We collect patent documents that are related to AI technology, and then analyze the patent data to identify sustainable AI technology. In our analysis, we develop a statistical method that combines social network analysis and Bayesian modeling. Based on the results of the proposed method, we provide a technological structure that can be applied to understand the sustainability of AI technology. To show how the proposed method can be applied to a practical problem, we apply the technological structure to a case study in order to analyze sustainable AI technology.
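
    As a rough illustration of the social-network half of the proposed method, the sketch below builds a keyword co-occurrence graph from hypothetical patent keyword lists with networkx and ranks terms by degree centrality; the Bayesian modelling layer described in the record is not reproduced here.

```python
from itertools import combinations
import networkx as nx

# Hypothetical keyword lists extracted from AI patent documents
patents = [
    ["neural network", "image recognition", "gpu"],
    ["neural network", "speech recognition"],
    ["reinforcement learning", "robotics", "neural network"],
    ["speech recognition", "embedded device"],
]

G = nx.Graph()
for keywords in patents:
    for a, b in combinations(sorted(set(keywords)), 2):
        # increment the co-occurrence weight for every keyword pair in a patent
        w = G.get_edge_data(a, b, default={"weight": 0})["weight"]
        G.add_edge(a, b, weight=w + 1)

# rank keywords by how widely they co-occur with other keywords
centrality = nx.degree_centrality(G)
for term, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{term:25s} {score:.2f}")
```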

  18. Artificial intelligence metamodel comparison and application to wind turbine airfoil uncertainty analysis

    Directory of Open Access Journals (Sweden)

    Yaping Ju

    2016-05-01

    Full Text Available The Monte Carlo simulation method for turbomachinery uncertainty analysis often requires performing a huge number of simulations, the computational cost of which can be greatly alleviated with the help of metamodeling techniques. An intensive comparative study was performed on the approximation performance of three prospective artificial intelligence metamodels, that is, artificial neural network, radial basis function, and support vector regression. The genetic algorithm was used to optimize the predetermined parameters of each metamodel for the sake of a fair comparison. Through testing on 10 nonlinear functions with different problem scales and sample sizes, the genetic algorithm–support vector regression metamodel was found more accurate and robust than the other two counterparts. Accordingly, the genetic algorithm–support vector regression metamodel was selected and combined with the Monte Carlo simulation method for the uncertainty analysis of a wind turbine airfoil under two types of surface roughness uncertainties. The results show that the genetic algorithm–support vector regression metamodel can capture well the uncertainty propagation from the surface roughness to the airfoil aerodynamic performance. This work is useful to the application of metamodeling techniques in the robust design optimization of turbomachinery.
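
    A minimal sketch of the idea, under stated assumptions: a toy roughness-to-loss dataset stands in for expensive CFD samples, a small genetic algorithm tunes SVR hyperparameters by cross-validation, and Monte Carlo samples of an assumed roughness distribution are then propagated through the cheap metamodel. None of the data, bounds or distributions are taken from the paper.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# Hypothetical training data: surface roughness -> aerodynamic loss coefficient
X = rng.uniform(0.0, 1.0, size=(60, 1))
y = 0.02 + 0.05 * X[:, 0] ** 1.5 + rng.normal(0, 0.002, size=60)

def fitness(genome):
    """Mean cross-validated R^2 of an SVR whose hyperparameters are encoded in log10 space."""
    C, gamma, eps = 10.0 ** genome
    return cross_val_score(SVR(C=C, gamma=gamma, epsilon=eps), X, y, cv=5, scoring="r2").mean()

# Minimal generational GA: tournament selection + Gaussian mutation
lo = np.array([-1.0, -2.0, -4.0])          # log10 bounds for (C, gamma, epsilon)
hi = np.array([3.0, 1.0, -1.0])
pop = rng.uniform(lo, hi, size=(20, 3))
for _ in range(15):
    scores = np.array([fitness(g) for g in pop])
    idx = [max(rng.choice(len(pop), 3, replace=False), key=lambda i: scores[i])
           for _ in range(len(pop))]
    pop = np.clip(pop[idx] + rng.normal(0.0, 0.15, (len(pop), 3)), lo, hi)

best = max(pop, key=fitness)
meta = SVR(C=10**best[0], gamma=10**best[1], epsilon=10**best[2]).fit(X, y)

# Monte Carlo propagation of an assumed roughness distribution through the metamodel
samples = np.clip(rng.normal(0.5, 0.1, size=(10000, 1)), 0.0, 1.0)
pred = meta.predict(samples)
print(f"predicted loss coefficient: mean={pred.mean():.4f}, std={pred.std():.4f}")
```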

  19. Digital image analysis in breast pathology-from image processing techniques to artificial intelligence.

    Science.gov (United States)

    Robertson, Stephanie; Azizpour, Hossein; Smith, Kevin; Hartman, Johan

    2018-04-01

    Breast cancer is the most common malignant disease in women worldwide. In recent decades, earlier diagnosis and better adjuvant therapy have substantially improved patient outcome. Diagnosis by histopathology has proven to be instrumental to guide breast cancer treatment, but new challenges have emerged as our increasing understanding of cancer over the years has revealed its complex nature. As patient demand for personalized breast cancer therapy grows, we face an urgent need for more precise biomarker assessment and more accurate histopathologic breast cancer diagnosis to make better therapy decisions. The digitization of pathology data has opened the door to faster, more reproducible, and more precise diagnoses through computerized image analysis. Software to assist diagnostic breast pathology through image processing techniques has been around for years. But recent breakthroughs in artificial intelligence (AI) promise to fundamentally change the way we detect and treat breast cancer in the near future. Machine learning, a subfield of AI that applies statistical methods to learn from data, has seen an explosion of interest in recent years because of its ability to recognize patterns in data with less need for human instruction. One technique in particular, known as deep learning, has produced groundbreaking results in many important problems including image classification and speech recognition. In this review, we will cover the use of AI and deep learning in diagnostic breast pathology, and other recent developments in digital image analysis. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. Children with unilateral hearing loss may have lower intelligence quotient scores: A meta-analysis.

    Science.gov (United States)

    Purcell, Patricia L; Shinn, Justin R; Davis, Greg E; Sie, Kathleen C Y

    2016-03-01

    In this meta-analysis, we reviewed observational studies investigating differences in intelligence quotient (IQ) scores of children with unilateral hearing loss compared to children with normal hearing. Data sources: PubMed Medline, Cumulative Index to Nursing and Allied Health Literature, Embase, PsycINFO. A query identified all English-language studies related to pediatric unilateral hearing loss published between January 1980 and December 2014. Titles, abstracts, and articles were reviewed to identify observational studies reporting IQ scores. There were 261 unique titles, with 29 articles undergoing full review. Four articles were identified, which included 173 children with unilateral hearing loss and 202 children with normal hearing. Ages ranged from 6 to 18 years. Three studies were conducted in the United States and one in Mexico. All were of high quality. All studies reported full-scale IQ results; three reported verbal IQ results; and two reported performance IQ results. Children with unilateral hearing loss scored 6.3 points lower on full-scale IQ, 95% confidence interval (CI) [-9.1, -3.5]. This analysis suggests children with unilateral hearing loss have lower full-scale and performance IQ scores than children with normal hearing. There also may be disparity in verbal IQ scores. Laryngoscope, 126:746-754, 2016. © 2015 The American Laryngological, Rhinological and Otological Society, Inc.

  1. Problems of Small Debris

    Directory of Open Access Journals (Sweden)

    V. V. Zelentsov

    2015-01-01

    Full Text Available During the exploration of outer space (as of 1 January 2011), 6853 spacecraft (SC) launches had been carried out, of which 6264 were successful, representing 95% of the total number of launches. Space has been exploited most intensively by Russia (USSR) (3701 launches, 94% successful), the USA (2774 launches, 90% successful), China (234 launches, 96% successful) and India (89 launches, 90% successful). A small part of the launched spacecraft returned to Earth (manned and transport spacecraft), while the rest remained in orbit. Some of them descended from orbit and burned up in the atmosphere; the rest remained in near-Earth space and turned into space debris (SD). The composition of the SD is diverse: spacecraft that have finished their mission; boosters and the last stages of launch vehicles left in orbit after SC injection; technological waste arising during the deployment of drop-away structures; and fragments of destroyed spacecraft. An orbital explosion forms an ellipsoidal region of SD around the orbit of the exploded object; then, as a result of precession, the objects become distributed along the orbit of the exploded spacecraft. The SD population is divided into two fractions: observed (larger than 100 mm) and unobserved (smaller than 100 mm). Observed debris is catalogued and amounts to about 0.2% of the total number of SD objects; unobserved SD makes up the bulk, 99.8%. An encounter between an operating SC and an observed SD fragment is predictable, and the spacecraft can avoid it by changing its altitude. A collision of a spacecraft with a large fragment leads to disaster (as happened in the collision of the Russian communications satellite "Cosmos-2251" with the American "Iridium" satellite). An encounter with small SD is not predictable, especially if the debris was formed by an explosion or by fragments colliding with each other; its orbit is not predictable, and relative speeds can reach up to 10 km/s. An encounter with a small SD particle is no less dangerous for the spacecraft. The impact speed of a spacecraft with space debris particles can reach 10-15 km/s; at such speeds the probability of penetration of thin

  2. ORDEM2010 and MASTER-2009 Modeled Small Debris Population Comparison

    Science.gov (United States)

    Krisko, Paula H.; Flegel, S.

    2010-01-01

    The latest versions of the two premier orbital debris engineering models, NASA's ORDEM2010 and ESA's MASTER-2009, have been publicly released. Both models have gone through significant advancements since inception, and now represent the state-of-the-art in orbital debris knowledge of their respective agencies. The purpose of these models is to provide satellite designers/operators and debris researchers with reliable estimates of the artificial debris environment in near-Earth orbit. The small debris environment within the size range of 1 mm to 1 cm is of particular interest to both human and robotic spacecraft programs. These objects are much more numerous than larger trackable debris but are still large enough to cause significant, if not catastrophic, damage to spacecraft upon impact. They are also small enough to elude routine detection by existing observation systems (radar and telescope). Without reliable detection, the modeling of these populations has always coupled theoretical origins with supporting observational data in different degrees. This paper details the 1 mm to 1 cm orbital debris populations of both ORDEM2010 and MASTER-2009; their sources (both known and presumed), current supporting data and theory, and methods of population analysis. Fluxes on spacecraft for chosen orbits are also presented and discussed within the context of each model.

  3. SCDAP/RELAP5 Modeling of Heat Transfer and Flow Losses in Lower Head Porous Debris

    International Nuclear Information System (INIS)

    Coryell, E.W.; Siefken, L.J.; Paik, S.

    1998-01-01

    Designs are described for implementing models for calculating the heat transfer and flow losses in porous debris in the lower head of a reactor vessel. The COUPLE model in SCDAP/RELAP5 represents both the porous and non-porous debris that results from core material slumping into the lower head. Currently, the COUPLE model has the capability to model convective and radiative heat transfer from the surfaces of non-porous debris in a detailed manner and to model only in a simplistic manner the heat transfer from porous debris. In order to advance beyond the simplistic modeling for porous debris, designs are developed for detailed calculations of heat transfer and flow losses in porous debris. Correlations are identified for convective heat transfer in porous debris for the following modes of heat transfer: (1) forced convection to liquid, (2) forced convection to gas, (3) nucleate boiling, (4) transition boiling, and (5) film boiling. Interphase heat transfer is modeled in an approximate manner. A design is also described for implementing a model of heat transfer by radiation from debris to the interstitial fluid. A design is described for implementation of models for flow losses and interphase drag in porous debris. Since the models for heat transfer and flow losses in porous debris in the lower head are designed for general application, a design is also described for applying these models to the analysis of porous debris in the core region. A test matrix is proposed for assessing the capability of the implemented models to calculate the heat transfer and flow losses in porous debris. The implementation of the models described in this report is expected to improve the COUPLE code calculation of the temperature distribution in porous debris and in the lower head that supports the debris. The implementation of these models is also expected to improve the calculation of the temperature and flow distribution in porous debris in the core region

  4. Emotional Intelligence, Motivational Climate and Levels of Anxiety in Athletes from Different Categories of Sports: Analysis through Structural Equations

    Science.gov (United States)

    López-Gutiérrez, Carlos Javier; Zafra-Santos, Edson

    2018-01-01

    (1) Background: Psychological factors can strongly affect athletes’ performance. Therefore, the role of the sports psychologist, who is in charge of training the athlete’s psychological factors, is currently particularly relevant. This study aims at analysing the connections between motivational climate in sport, anxiety and emotional intelligence depending on the type of sport practised (individual/team) by means of a multigroup structural equations analysis. (2) Methods: 372 semi-professional Spanish athletes took part in this investigation, analysing motivational climate (PMCSQ-2), emotional intelligence (SSRI) and levels of anxiety (STAI). A multigroup structural equation model was developed and fitted adequately (χ2 = 586.77; df = 6.37). (3) Results: The most influential indicator in the ego-oriented climate is intra-group rivalry, which exerts greater influence in individual sports. For the task-oriented climate the strongest indicator is having an important role in individual sports, while in team sports it is cooperative learning. Emotional intelligence dimensions correlate more strongly in team sports than in individual sports. In addition, there was a negative and indirect relation between task-oriented climate and trait anxiety in both categories of sports. (4) Conclusions: This study shows how a task-oriented motivational climate or certain levels of emotional intelligence can act preventively in the face of anxiety states in athletes. Therefore, the development of these psychological factors could prevent anxiety states and improve performance in athletes. PMID:29724008

  5. Cost Analysis of Spatial Data Production as Part of Business Intelligence Within the Mapping Department

    Science.gov (United States)

    Kisa, A.; Erkek, B.; Çolak, S.

    2012-07-01

    performance criteria are redefined, improvements to existing software are specified, and cost analysis is implemented as a part of business intelligence. This paper describes some of these activities, such as cost analysis and its reflection in the Mapping Department, as an example to share in the context of reorganization.

  6. Validity of transactional analysis and emotional intelligence in training nursing students.

    Science.gov (United States)

    Whitley-Hunter, Brandi L

    2014-10-01

    Emotional intelligence (EI) is considered a critical component of a nurse's characteristic trait which is known as a significant predictor of a person's job performance and life success. Transactional Analysis (TA) plays a fundamental role in nurse-patient communication and managing emotions during difficult dialogue with patients. The aim of this review is to discuss the framework of EI and TA, and how the combined theories can be utilized to further educate nurses and enhance the patient's experience. Exploring the idea of combining EI, TA, and other theories and adding these addendums to the nursing curriculum may advance the empathy and communication skills of nursing students. The method used in this review is a literature search using databases, such as Medline, EBSCO, and Google Scholar, etc. to form a critical discussion of this area. Key words such as emotional intelligence, transactional analysis, nursing curriculum, and relating theoretical models were used to identify applicable documents. Four studies involving EI and TA were sampled. A combination of data collection tools, such as lecture series and intervention programs, was used to authenticate the results. Other instruments used were ego state questionnaires, empathy, and five point Likert scales. No study design or type of literature was excluded in healthcare to substantiate the application of EI and TA into the nursing curriculum. Sixteen nurses attended a six-week psycho-education program using communication and empathy scales, and patient satisfaction surveys to improve their empathetic and communication skills. The result of the mean communication score (177.8±20) increased to (198.8±15) after training (p=0.001). The empathy score increased from 25.7±7 to 32.6±6 (p=0.001). The overall result reflects that training can improve emergency nurses' communication and empathy skills. The data suggests there are under-researched theories with futuristic topics that have value to the nursing

  7. Validity of transactional analysis and emotional intelligence in training nursing students

    Directory of Open Access Journals (Sweden)

    BRANDI L WHITLEY-HUNTER

    2014-10-01

    Full Text Available Introduction: Emotional intelligence (EI) is considered a critical component of a nurse’s characteristic trait which is known as a significant predictor of a person’s job performance and life success. Transactional Analysis (TA) plays a fundamental role in nurse-patient communication and managing emotions during difficult dialogue with patients. The aim of this review is to discuss the framework of EI and TA, and how the combined theories can be utilized to further educate nurses and enhance the patient’s experience. Exploring the idea of combining EI, TA, and other theories and adding these addendums to the nursing curriculum may advance the empathy and communication skills of nursing students. Methods: The method used in this review is a literature search using databases, such as Medline, EBSCO, and Google Scholar, etc. to form a critical discussion of this area. Key words such as emotional intelligence, transactional analysis, nursing curriculum, and relating theoretical models were used to identify applicable documents. Four studies involving EI and TA were sampled. A combination of data collection tools, such as lecture series and intervention programs, was used to authenticate the results. Other instruments used were ego state questionnaires, empathy, and five point Likert scales. No study design or type of literature was excluded in healthcare to substantiate the application of EI and TA into the nursing curriculum. Results: Sixteen nurses attended a six-week psycho-education program using communication and empathy scales, and patient satisfaction surveys to improve their empathetic and communication skills. The result of the mean communication score (177.8±20) increased to (198.8±15) after training (p=0.001). The empathy score increased from 25.7±7 to 32.6±6 (p=0.001). The overall result reflects that training can improve emergency nurses’ communication and empathy skills. Conclusion: The data suggests there are under

  8. IQARIS: a tool for the intelligent querying, analysis, and retrieval from information systems

    International Nuclear Information System (INIS)

    Hummel, J. R.; Silver, R. B.

    2002-01-01

    Information glut is one of the primary characteristics of the electronic age. Managing such large volumes of information (e.g., keeping track of the types, where they are, their relationships, who controls them, etc.) can be done efficiently with an intelligent, user-oriented information management system. The purpose of this paper is to describe a concept for managing information resources based on an intelligent information technology system developed by the Argonne National Laboratory for managing digital libraries. The Argonne system, Intelligent Query (IQ), enables users to query digital libraries and view the holdings that match the query from different perspectives

  9. Artificial Intelligence and Moral intelligence

    OpenAIRE

    Laura Pana

    2008-01-01

    We discuss the thesis that the implementation of a moral code in the behaviour of artificial intelligent systems needs a specific form of human and artificial intelligence, not just an abstract intelligence. We present intelligence as a system with an internal structure and the structural levels of the moral system, as well as certain characteristics of artificial intelligent agents which can/must be treated as 1- individual entities (with a complex, specialized, autonomous or self-determined,...

  10. Recent advances in modeling landslides and debris flows

    CERN Document Server

    2015-01-01

    Landslides and debris flows belong to the most dangerous natural hazards in many parts of the world. Despite intensive research, these events continue to result in human suffering, property losses, and environmental degradation every year. Better understanding of the mechanisms and processes of landslides and debris flows will help make reliable predictions, develop mitigation strategies and reduce vulnerability of infrastructure. This book presents contributions to the workshop on Recent Developments in the Analysis, Monitoring and Forecast of Landslides and Debris Flow, in Vienna, Austria, September 9, 2013. The contributions cover a broad spectrum of topics, from material behaviour and physical modelling, through numerical simulation, to applications and case studies. The workshop is a joint event of three research projects funded by the European Commission within the 7th Framework Program: MUMOLADE (Multiscale modelling of landslides and debris flows, www.mumolade.com), REVENUES (Numerical Analysis of Slopes with V...

  11. Numerical simulation for debris bed behavior in sodium cooled fast reactor

    International Nuclear Information System (INIS)

    Tagami, Hirotaka; Tobita, Yoshiharu

    2014-01-01

    For the safety analysis of SFRs, it is necessary to evaluate the behavior and coolability of the debris bed that forms in the lower plenum during a severe accident. In order to analyze debris behavior, a model for the behavior of dense sediment particles was proposed and installed in the SFR safety analysis code SIMMER. The SIMMER code could adequately reproduce experimental results simulating the self-leveling phenomenon with appropriate model parameters for bed stiffness. Under reactor conditions, self-leveling experiments with prototypical debris beds have not been performed. Additionally, prototypical debris beds consist of non-spherical particles, which makes it difficult to quantify the model parameters. This situation motivates the sensitivity analysis presented here, which investigates the effect of the model parameters on the self-leveling behavior of a prototypical debris bed. As the initial condition for the sensitivity analysis, a simple mound-like debris bed in the sodium-filled lower plenum of the reactor vessel is considered. The bed consists of a mixture of 3,300 kg of fuel debris and 1,570 kg of steel debris. Decay heat is applied to the fuel debris. The model parameters are chosen as the sensitivity parameters. The sensitivity analysis shows that the model parameters can affect the intensity of the self-leveling phenomenon and the eventual flatness of the bed. In all analyses, however, coolant and sodium vapor break up the debris bed, mainly at its center, and the debris is relocated to the outside of the bed. Through this process, the initial debris bed is almost planarized before re-melting of the debris. This result shows that the model parameters affect the self-leveling phenomena, but their effect on the safety analysis of SFRs is limited. (author)

  12. Active Space Debris Removal System

    Directory of Open Access Journals (Sweden)

    Gabriele GUERRA

    2017-06-01

    Full Text Available Since the start of the space era, more than 5000 launches have been carried out, each carrying satellites for many disparate uses, such as Earth observation or communication. Thus, the space environment has become congested, and the problem of space debris is now generating concern in the space community, despite our long-lived belief that “space is big”. In the last few years, solutions to this problem have been proposed; one of them is Active Space Debris Removal, a method that would reduce the growth of the debris population and permit sustainable space activities in the future. The main idea of the method proposed below is a drag augmentation system: a system capable of depositing an expanded foam on a debris object, which increases its area-to-mass ratio and thereby the natural atmospheric drag and solar pressure acting on it. The drag augmentation system proposed here requires a docking system; the debris is pushed to its release altitude and then, after un-docking, an uncontrolled re-entry takes place, ending with the burn-up of the object and the foam in the atmosphere within a given time frame. The method requires an efficient way to change orbit between two debris objects. The present paper analyses such a system in combination with an electric propulsion system, and emphasizes the choice of using two satellites to remove five effective rocket-body debris objects within a year.

  13. An Intelligent Method for Structural Reliability Analysis Based on Response Surface

    Institute of Scientific and Technical Information of China (English)

    桂劲松; 刘红; 康海贵

    2004-01-01

    As water depth increases, the structural safety and reliability of a system become more and more important and challenging. Therefore, the structural reliability method must be applied in ocean engineering design such as offshore platform design. If the performance function is known in structural reliability analysis, the first-order second-moment method is often used. If the performance function could not be definitely expressed, the response surface method is always used because it has a very clear train of thought and simple programming. However, the traditional response surface method fits the response surface of quadratic polynomials where the problem of accuracy could not be solved, because the true limit state surface can be fitted well only in the area near the checking point. In this paper, an intelligent computing method based on the whole response surface is proposed, which can be used for the situation where the performance function could not be definitely expressed in structural reliability analysis. In this method, a response surface of the fuzzy neural network for the whole area should be constructed first, and then the structural reliability can be calculated by the genetic algorithm. In the proposed method, all the sample points for the training network come from the whole area, so the true limit state surface in the whole area can be fitted. Through calculational examples and comparative analysis, it can be known that the proposed method is much better than the traditional response surface method of quadratic polynomials, because, the amount of calculation of finite element analysis is largely reduced, the accuracy of calculation is improved, and the true limit state surface can be fitted very well in the whole area. So, the method proposed in this paper is suitable for engineering application.
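
    A minimal sketch of the whole-domain surrogate idea described above, under stated assumptions: the input space is sampled broadly, a neural-network response surface is fitted to the performance function, and the failure probability is then estimated by Monte Carlo on the surrogate. An MLP stands in for the paper's fuzzy neural network, plain Monte Carlo for its genetic-algorithm reliability search, and the performance function g(x) is made up.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def g(x):
    """Hypothetical performance function: failure when g(x) < 0."""
    return 3.0 - x[:, 0] ** 2 - 0.5 * x[:, 1]

# 1) Sample the whole input domain (not just the region around a checking point)
X_train = rng.uniform(-4.0, 4.0, size=(500, 2))
y_train = g(X_train)

# 2) Fit a neural-network response surface over the whole domain
surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000,
                         random_state=0).fit(X_train, y_train)

# 3) Monte Carlo on the cheap surrogate to estimate the failure probability
X_mc = rng.normal(0.0, 1.0, size=(200_000, 2))       # basic random variables
pf_surrogate = np.mean(surrogate.predict(X_mc) < 0.0)
pf_direct = np.mean(g(X_mc) < 0.0)                    # available only for this toy g
print(f"P_f (surrogate) = {pf_surrogate:.4f},  P_f (direct) = {pf_direct:.4f}")
```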

  14. An Approach to Predict Debris Flow Average Velocity

    Directory of Open Access Journals (Sweden)

    Chen Cao

    2017-03-01

    Full Text Available Debris flow is one of the major threats for the sustainability of environmental and social development. The velocity directly determines the impact on the vulnerability. This study focuses on an approach using a radial basis function (RBF) neural network and the gravitational search algorithm (GSA) for predicting debris flow velocity. A total of 50 debris flow events were investigated in the Jiangjia gully. These data were used for building the GSA-based RBF approach (GSA-RBF). Eighty percent (40 groups) of the measured data were selected randomly as the training database. The other 20% (10 groups) of data were used as testing data. Finally, the approach was applied to predict the velocities of six debris flow gullies in the Wudongde Dam site area, where environmental conditions were similar to the Jiangjia gully. The modified Dongchuan empirical equation (MDEE) and the pulled particle analysis of debris flow (PPA) approach were used for comparison and validation. The results showed that: (i) the GSA-RBF predicted debris flow velocity values are very close to the measured values, and perform better than those obtained using the RBF neural network alone; (ii) the GSA-RBF results and the MDEE results are similar for the Jiangjia gully debris flow velocity predictions, with GSA-RBF performing better; (iii) in the study area, the GSA-RBF results are validated as reliable; and (iv) more variables could be considered in predicting the debris flow velocity by using GSA-RBF on the basis of measured data in other areas, which makes the approach more widely applicable. Because the GSA-RBF approach was more accurate, both the numerical simulation and the empirical equation can be taken into consideration for constructing debris flow mitigation works. They could be complementary and verified for each other.
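
    The core regression step can be sketched as follows: fit a radial-basis-function model to measured debris-flow records with an 80/20 split (mirroring the 40 training / 10 testing events in the study) and evaluate it on the held-out events. scipy's RBFInterpolator stands in for the paper's RBF neural network, the feature names and data are made up, and the gravitational search step that tunes the network is omitted.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(42)

# Hypothetical records: [flow depth (m), channel slope (-), density (t/m^3)] -> velocity (m/s)
X = np.column_stack([rng.uniform(0.5, 3.0, 50),
                     rng.uniform(0.05, 0.4, 50),
                     rng.uniform(1.5, 2.3, 50)])
v = 4.0 * np.sqrt(X[:, 0]) * X[:, 1] ** 0.3 + rng.normal(0, 0.2, 50)

# 80/20 split: 40 events for training, 10 held out for testing
idx = rng.permutation(50)
train, test = idx[:40], idx[40:]

# Smoothed Gaussian RBF model standing in for the GSA-tuned RBF network
model = RBFInterpolator(X[train], v[train], kernel="gaussian",
                        epsilon=1.0, smoothing=1e-3)
pred = model(X[test])
rmse = np.sqrt(np.mean((pred - v[test]) ** 2))
print(f"held-out RMSE: {rmse:.3f} m/s")
```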

  15. Development of a debris flow model in a geotechnical centrifuge

    Science.gov (United States)

    Cabrera, Miguel Angel; Wu, Wei

    2013-04-01

    Debris flows occur in three main stages. At first the initial soil mass, which rests in a rigid configuration, reaches a critical state, releasing a finite mass over a failure surface. In the second stage the released mass is transported downhill in a dynamic motion; segregation, erosion, entrainment, and variable channel geometry are among the more common characteristics of this stage. Finally, in the third stage the transported mass, plus the mass gained or lost during transport, reaches a flat and/or wide area and deposition starts, returning the material to a rigid configuration. The limited understanding and predictability of debris flows offered by traditional theoretical approaches has led to the mechanics of debris flows being analysed around the world over the last two decades. Nevertheless, the validation of recent numerical advances against experimental data is required. Centrifuge modelling is an experimental tool that allows natural processes to be tested under defined boundary conditions in a small-scale configuration, with a good level of accuracy in comparison with a full-scale test. This paper presents the development of a debris flow model in a geotechnical centrifuge, focused on the second stage of the debris flow process explained above. A small-scale model of an inclined flume will be developed, with laboratory instrumentation able to measure the pore pressure, normal stress, and velocity path developed in a scaled debris flow in motion. The model aims to reproduce the main parameters of debris flow motion in a controlled environment. This work is carried out under the EC 7th Framework Programme as part of the MUMOLADE project. The dataset and data analysis obtained from the tests will provide a qualitative description of debris flow motion and mechanics and will be valuable information for MUMOLADE co-researchers and for the debris flow research community in general.

  16. Application of artificial intelligence (AI) methods for designing and analysis of reconfigurable cellular manufacturing system (RCMS)

    CSIR Research Space (South Africa)

    Xing, B

    2009-12-01

    Full Text Available This work focuses on the design and control of a novel hybrid manufacturing system: Reconfigurable Cellular Manufacturing System (RCMS) by using Artificial Intelligence (AI) approach. It is hybrid as it combines the advantages of Cellular...

  17. Practical Approach of the PEST Analysis from the Perspective of the Territorial Intelligence

    Directory of Open Access Journals (Sweden)

    Alexandru Bîrsan

    2016-01-01

    Digging deeper into the Knowledge Economy, we propose as the subject of this paper, and as a part of our research, a theoretical approach to assessing and analyzing a region from the perspective of both territorial intelligence and smart development.

  18. Dynamic mobility applications policy analysis: policy and institutional issues for intelligent network flow optimization (INFLO).

    Science.gov (United States)

    2014-12-01

    The report documents policy considerations for the Intelligent Network Flow Optimization (INFLO) connected vehicle applications : bundle. INFLO aims to optimize network flow on freeways and arterials by informing motorists of existing and impen...

  19. Space Debris Mitigation CONOPS Development

    Science.gov (United States)

    2013-06-01

    In the literature search and review, only a lone article was found with any discussion of it. As with any net, the concept is to catch space debris objects in the net...travel along the track of the orbit and collect debris along its path. The lone article found contends that the idea “does not work”. Bonnal and...100,000 pieces of debris orbiting the planet, [as] NASA estimated -- 2,600 of them more than [four] inches across. [NASA] called the breakup of the

  20. Intelligent Approach for Analysis of Respiratory Signals and Oxygen Saturation in the Sleep Apnea/Hypopnea Syndrome

    Science.gov (United States)

    Moret-Bonillo, Vicente; Alvarez-Estévez, Diego; Fernández-Leal, Angel; Hernández-Pereira, Elena

    2014-01-01

    This work deals with the development of an intelligent approach for clinical decision making in the diagnosis of the Sleep Apnea/Hypopnea Syndrome, SAHS, from the analysis of respiratory signals and of oxygen saturation in arterial blood, SaO2. In order to accomplish this task, the proposed approach makes use of different artificial intelligence techniques and reasoning processes that are able to deal with imprecise data. These reasoning processes are based on fuzzy logic and on temporal analysis of the information. The developed approach also takes into account the possibility of artifacts in the monitored signals. Detection and characterization of signal artifacts allows detection of false positives. Identification of relevant diagnostic patterns and temporal correlation of events is performed through the implementation of temporal constraints. PMID:25035712

  1. Multidimensional Analysis and Location Intelligence Application for Spatial Data Warehouse Hotspot in Indonesia using SpagoBI

    Science.gov (United States)

    Uswatun Hasanah, Gamma; Trisminingsih, Rina

    2016-01-01

    A spatial data warehouse is a data warehouse with a spatial component that represents the geographic location of a position or object on the Earth's surface. A spatial data warehouse can be visualized in the form of crosstab tables, graphs, and maps. A spatial data warehouse of hotspots in Indonesia has been constructed by researchers from FIRM NASA data for 2006-2015. This research develops a multidimensional analysis module and a location intelligence module using SpagoBI. The multidimensional analysis module is able to visualize online analytical processing (OLAP). The location intelligence module creates dynamic map visualizations as map zones and map points. A map zone can display different colors based on the number of hotspots in each region, and a map point can display different point sizes to represent the number of hotspots in each region. This research is expected to facilitate users in the presentation of hotspot data as needed.
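
    As a rough illustration of the OLAP-style roll-up that a multidimensional analysis module performs, the sketch below aggregates hypothetical hotspot records by province and year with pandas; the column names and figures are invented for illustration and are not part of the SpagoBI implementation or the underlying NASA dataset.

```python
import pandas as pd

# Hypothetical hotspot records; the real spatial data warehouse is built from NASA hotspot data.
hotspots = pd.DataFrame({
    "province": ["Riau", "Riau", "Papua", "Papua", "Riau"],
    "year":     [2014,   2015,   2014,    2015,    2015],
    "count":    [120,    95,     40,      55,      110],
})

# OLAP-style roll-up: total hotspots per province per year,
# comparable to what a crosstab or map-zone visualization would consume.
cube = hotspots.pivot_table(index="province", columns="year",
                            values="count", aggfunc="sum", fill_value=0)
print(cube)
```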

  2. Descriptive business intelligence analysis: cutting edge strategic asset for SMEs, is it really worth it?

    OpenAIRE

    Sivave Mashingaidze

    2014-01-01

    The purpose of this article is to provide a framework for understanding and adoption of Business Intelligence by (SMEs) within the Zimbabwean economy. The article explores every facet of Business Intelligence, including internal and external BI as cutting edge strategic asset. A descriptive research methodology has been adopted. The article revealed some BI critical success factors for better BI implementation. Findings revealed that organizations which have the greatest success with BI trave...

  3. An analysis of Turkish students' perception of intelligence from primary school to university

    OpenAIRE

    Beyaztas, Dilek Ilhan; Hymer, Barry

    2016-01-01

    The aim of this descriptive study was to determine the features of intelligence perceptions according to age, gender, class level, school success level and university departments. Two different scales by Dweck (2000) for both adults and children were translated into Turkish. These scales were then applied to 1350 Turkish students ranging from 4th grade primary school to 4th year university. Results showed that student scores relating to the perception that intelligence is an unchangeable feat...

  4. An intelligent fault diagnosis method of rolling bearings based on regularized kernel Marginal Fisher analysis

    International Nuclear Information System (INIS)

    Jiang Li; Shi Tielin; Xuan Jianping

    2012-01-01

    Generally, the vibration signals of fault bearings are non-stationary and highly nonlinear under complicated operating conditions. Thus, it is a big challenge to extract optimal features for improving classification and simultaneously decreasing feature dimension. Kernel Marginal Fisher analysis (KMFA) is a novel supervised manifold learning algorithm for feature extraction and dimensionality reduction. In order to avoid the small sample size problem in KMFA, we propose regularized KMFA (RKMFA). A simple and efficient intelligent fault diagnosis method based on RKMFA is put forward and applied to fault recognition of rolling bearings. So as to directly excavate nonlinear features from the original high-dimensional vibration signals, RKMFA constructs two graphs describing the intra-class compactness and the inter-class separability, by combining a traditional manifold learning algorithm with Fisher criteria. Therefore, the optimal low-dimensional features are obtained for better classification and finally fed into the simplest K-nearest neighbor (KNN) classifier to recognize different fault categories of bearings. The experimental results demonstrate that the proposed approach improves the fault classification performance and outperforms the other conventional approaches.
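
    A minimal sketch of the overall pipeline (kernel feature extraction followed by a KNN classifier) is given below. Scikit-learn has no Marginal Fisher Analysis, so KernelPCA stands in for RKMFA, and the vibration features are synthetic; this illustrates the workflow only, not the authors' implementation.

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic stand-in for vibration-signal feature vectors (three fault classes).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc=c, scale=1.0, size=(60, 40)) for c in (0.0, 1.5, 3.0)])
y = np.repeat([0, 1, 2], 60)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Kernel dimensionality reduction (KernelPCA here, standing in for RKMFA),
# followed by the simple KNN classifier described in the abstract.
kpca = KernelPCA(n_components=5, kernel="rbf", gamma=0.01).fit(X_tr)
knn = KNeighborsClassifier(n_neighbors=5).fit(kpca.transform(X_tr), y_tr)
print("accuracy:", accuracy_score(y_te, knn.predict(kpca.transform(X_te))))
```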

  5. Process sensors characterization based on noise analysis technique and artificial intelligence

    International Nuclear Information System (INIS)

    Mesquita, Roberto N. de; Perillo, Sergio R.P.; Santos, Roberto C. dos

    2005-01-01

    The time response of pressure and temperature sensors from the Reactor Protection System (RPS) is a requirement that must be satisfied in nuclear power plants; furthermore, it is indicative of sensor degradation and remaining life. The nuclear power industry and others have been eager to implement smart sensor technologies and digital instrumentation concepts to reduce the manpower and effort currently spent on testing and calibration. Process parameter fluctuations during normal operation of a reactor are caused by random variations in neutron flux, heat transfer and other sources. The output sensor noise can be considered as the response of the system to an input representing the statistical nature of the underlying process, which can be modeled using a time series model. Since the noise signal measurements are influenced by many factors, such as location of sensors, extraneous noise interference, and randomness in temperature and pressure fluctuation, the quantitative estimate of the time response using autoregressive noise modeling is subject to error. This technique has been used as a means of sensor monitoring. In this work a set of pressure sensors installed in an experimental loop adapted from a flow calibration setup is used to test and analyze signals in a new approach using artificial intelligence techniques. A set of measurements of dynamic signals in different experimental conditions is used to distinguish and identify underlying process sources. A methodology that uses Blind Separation of Sources with a neural networks scheme is being developed to improve time response estimate reliability in noise analysis. (author)
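
    The Blind Separation of Sources step can be illustrated with a generic independent component analysis, as sketched below; the sensor mixtures are synthetic and FastICA merely stands in for whatever separation scheme the authors combine with their neural-network approach.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Synthetic mixtures standing in for pressure-sensor noise records:
# two hypothetical underlying sources observed through two sensors.
rng = np.random.default_rng(1)
t = np.linspace(0, 10, 2000)
sources = np.c_[np.sin(7.0 * t), np.sign(np.sin(3.0 * t))]   # process noise + interference
mixing = np.array([[1.0, 0.4], [0.6, 1.0]])                   # unknown in practice
observations = sources @ mixing.T + 0.05 * rng.standard_normal((t.size, 2))

# Blind separation of the underlying sources from the mixed sensor signals.
ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(observations)
print(recovered.shape)  # (2000, 2): estimated source signals
```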

  6. Process sensors characterization based on noise analysis technique and artificial intelligence

    Energy Technology Data Exchange (ETDEWEB)

    Mesquita, Roberto N. de; Perillo, Sergio R.P.; Santos, Roberto C. dos [Instituto de Pesquisas Energeticas e Nucleares (IPEN), Sao Paulo, SP (Brazil)]. E-mail: rnavarro@ipen.br; sperillo@ipen.br; rcsantos@ipen.br

    2005-07-01

    The time response of pressure and temperature sensors from the Reactor Protection System (RPS) is a requirement that must be satisfied in nuclear power plants; furthermore, it is indicative of sensor degradation and remaining life. The nuclear power industry and others have been eager to implement smart sensor technologies and digital instrumentation concepts to reduce the manpower and effort currently spent on testing and calibration. Process parameter fluctuations during normal operation of a reactor are caused by random variations in neutron flux, heat transfer and other sources. The output sensor noise can be considered as the response of the system to an input representing the statistical nature of the underlying process, which can be modeled using a time series model. Since the noise signal measurements are influenced by many factors, such as location of sensors, extraneous noise interference, and randomness in temperature and pressure fluctuation, the quantitative estimate of the time response using autoregressive noise modeling is subject to error. This technique has been used as a means of sensor monitoring. In this work a set of pressure sensors installed in an experimental loop adapted from a flow calibration setup is used to test and analyze signals in a new approach using artificial intelligence techniques. A set of measurements of dynamic signals in different experimental conditions is used to distinguish and identify underlying process sources. A methodology that uses Blind Separation of Sources with a neural networks scheme is being developed to improve time response estimate reliability in noise analysis. (author)

  7. Analysis of intelligent green building policy and developing status in Taiwan

    International Nuclear Information System (INIS)

    Kuo, Chung-Feng Jeffrey; Lin, Chieh-Hung; Hsu, Ming-Wen

    2016-01-01

    In 2010, Taiwan launched a program dubbed “four emerging intellectual industries” that lists intelligent green buildings. The aim of promoting intelligent green buildings is to stimulate the architecture technology industry. This has been combined with Information and Communication Technology (ICT) and the concept of green building to provide a safe and healthy living environment. While doing this it will also aim to reduce carbon emissions and save energy. This study investigates intelligent green building policies and the promotion of progress in Taiwan. It probes into cases from 1988 to 2014. Key success factors are derived from analyzing and summarizing intelligent green building experiences in Taiwan. This is done through Secondary Data Analyses by: 1. Establishing clear norms and standards for intelligent green building design and improvement; 2. First carrying out policies in the public sector, in order to provide field trials and safeguarded market opportunities for industries; 3. Implementing rating-based assessments, in order to raise the quality of design; 4. Introducing mandatory or incentive policies, depending on local specialties and conditions; 5. Respectively planning incentives for relevant interested parties in the industrial chain; 6. Strengthening marketing efforts and proactively promoting policies. - Highlights: •Aggregate and analyze the results of Intelligent Green Building policy in Taiwan. •Chi-square Test of Independence is used for inspecting successful factors. •Organize experiences and propose a recommended feasible scheme for the future.

  8. Do narcissism and emotional intelligence win us friends? : modeling dynamics of peer popularity using inferential network analysis

    OpenAIRE

    Czarna, Anna; Leifeld, Philip; Śmieja-Nęcka, Magdalena; Dufner, Michael; Salovey, Peter

    2016-01-01

    This research investigated effects of narcissism and emotional intelligence (EI) on popularity in social networks. In a longitudinal field study, we examined the dynamics of popularity in 15 peer groups in two waves (N = 273). We measured narcissism, ability EI, and explicit and implicit self-esteem. In addition, we measured popularity at zero acquaintance and 3 months later. We analyzed the data using inferential network analysis (temporal exponential random graph modeling, TERGM) accounting...

  9. Business Intelligence. A Presentation of the Current Lead Solutions and a Comparative Analysis of the Main Providers

    Directory of Open Access Journals (Sweden)

    Bogdan-Andrei IONESCU

    2014-09-01

    Full Text Available The aim of this paper is to synthesize the concepts behind Business Intelligence, by studying the solutions available on the market provided by the main players. We will present the software solutions already provided by them emphasizing the main advantages and benefits of each of them, but also as a comparative analysis, designed to reveal the area in which each provider is more remarkable than the others.

  10. NASA Orbital Debris Baseline Populations

    Science.gov (United States)

    Krisko, Paula H.; Vavrin, A. B.

    2013-01-01

    The NASA Orbital Debris Program Office has created high fidelity populations of the debris environment. The populations include objects of 1 cm and larger in Low Earth Orbit through Geosynchronous Transfer Orbit. They were designed for the purpose of assisting debris researchers and sensor developers in planning and testing. This environment is derived directly from the newest ORDEM model populations which include a background derived from LEGEND, as well as specific events such as the Chinese ASAT test, the Iridium 33/Cosmos 2251 accidental collision, the RORSAT sodium-potassium droplet releases, and other miscellaneous events. It is the most realistic ODPO debris population to date. In this paper we present the populations in chart form. We describe derivations of the background population and the specific populations added on. We validate our 1 cm and larger Low Earth Orbit population against SSN, Haystack, and HAX radar measurements.

  11. DebriSat Laboratory Analyses

    Science.gov (United States)

    2015-01-05

    droplets. Fluorine from Teflon wire insulation was also common in the SEM stub and witness plates deposits. Nano droplets of metallic materials...and Debris-LV debris. Aluminum was from the Al honeycomb, nadir and zenith panels, structural core and COPV liner. Aluminum oxide particles were...three pieces: Outer Nylon shell (sabot) with 2 part hollow aluminum insert. • ~600 grams, 8.6 cm diameter X 10.3 cm long – size of a soup can

  12. Backwater development by woody debris

    Science.gov (United States)

    Geertsema, Tjitske; Torfs, Paul; Teuling, Ryan; Hoitink, Ton

    2017-04-01

    Placement of woody debris is a common method for increasing ecological values in river and stream restoration, and is thus widely used in natural environments. Water managers, however, are afraid to introduce wood in channels draining agricultural and urban areas. Upstream, it may create backwater, depending on hydrodynamic characteristics including the obstruction ratio, the Froude number and the surface level gradient. Patches of wood may trigger or counter morphological activity, both laterally, through bank erosion and protection, and vertically, with pool and riffle formation. Also, a permeable construction composed of wood will weather over time. Both morphodynamic activity and weathering cause backwater effects to change in time. The purpose of this study is to quantify the time development of backwater effects caused by woody debris. Hourly water levels gauged upstream and downstream of patches and discharge are collected for five streams in the Netherlands. The water level drop over the woody debris patch relates to discharge in the streams. This relation is characterized by an increasing water level difference for an increasing discharge, up to a maximum. If the discharge increases beyond this level, the water level difference reduces to the value that may represent the situation without woody debris. This reduction depends primarily on the obstruction ratio of the woody debris in the channel cross-section. Morphologic adjustments in the stream and reorientation of the woody material reduce the water level drop over the patches in time. Our results demonstrate that backwater effects can be reduced by optimizing the location where woody debris is placed and manipulating the obstruction ratio. Current efforts are focussed on representing woody debris in a one-dimensional numerical model, aiming to obtain a generic tool to achieve a stream design with woody debris that minimizes backwater.

  13. Debris Disks: Probing Planet Formation

    OpenAIRE

    Wyatt, Mark C.

    2018-01-01

    Debris disks are the dust disks found around ~20% of nearby main sequence stars in far-IR surveys. They can be considered as descendants of protoplanetary disks or components of planetary systems, providing valuable information on circumstellar disk evolution and the outcome of planet formation. The debris disk population can be explained by the steady collisional erosion of planetesimal belts; population models constrain where (10-100au) and in what quantity (>1Mearth) planetesimals (>10km i...

  14. INTELLIGENT COMPUTING SYSTEM FOR RESERVOIR ANALYSIS AND RISK ASSESSMENT OF THE RED RIVER FORMATION

    Energy Technology Data Exchange (ETDEWEB)

    Mark A. Sippel; William C. Carrigan; Kenneth D. Luff; Lyn Canter

    2003-11-12

    Integrated software has been written that comprises the tool kit for the Intelligent Computing System (ICS). The software tools in ICS have been developed for characterization of reservoir properties and evaluation of hydrocarbon potential using a combination of inter-disciplinary data sources such as geophysical, geologic and engineering variables. The ICS tools provide a means for logical and consistent reservoir characterization and oil reserve estimates. The tools can be broadly characterized as (1) clustering tools, (2) neural solvers, (3) multiple-linear regression, (4) entrapment-potential calculator and (5) file utility tools. ICS tools are extremely flexible in their approach and use, and applicable to most geologic settings. The tools are primarily designed to correlate relationships between seismic information and engineering and geologic data obtained from wells, and to convert or translate seismic information into engineering and geologic terms or units. It is also possible to apply ICS in a simple framework that may include reservoir characterization using only engineering, seismic, or geologic data in the analysis. ICS tools were developed and tested using geophysical, geologic and engineering data obtained from an exploitation and development project involving the Red River Formation in Bowman County, North Dakota and Harding County, South Dakota. Data obtained from 3D seismic surveys, and 2D seismic lines encompassing nine prospective field areas were used in the analysis. The geologic setting of the Red River Formation in Bowman and Harding counties is that of a shallow-shelf, carbonate system. Present-day depth of the Red River formation is approximately 8000 to 10,000 ft below ground surface. This report summarizes production results from well demonstration activity, results of reservoir characterization of the Red River Formation at demonstration sites, descriptions of ICS tools and strategies for their application.
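
    As a minimal illustration of the multiple-linear regression tool used to translate seismic information into engineering or geologic terms, the sketch below fits invented seismic attributes to an invented reservoir property; none of the variables or coefficients come from the Red River study.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical data: seismic attributes (e.g., amplitude, impedance, frequency)
# versus a reservoir property (e.g., porosity) measured at wells.
rng = np.random.default_rng(42)
seismic_attrs = rng.normal(size=(30, 3))
porosity = (0.12 + 0.03 * seismic_attrs[:, 0] - 0.02 * seismic_attrs[:, 1]
            + 0.01 * rng.standard_normal(30))

# Multiple-linear regression relating seismic attributes to a reservoir property,
# in the spirit of the ICS regression tool (the coefficients here are invented).
model = LinearRegression().fit(seismic_attrs, porosity)
print("coefficients:", model.coef_, "intercept:", model.intercept_)
print("predicted porosity at a new trace:", model.predict(rng.normal(size=(1, 3))))
```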

  15. A combined morphometric, sedimentary, GIS and modelling analysis of flooding and debris flow hazard on a composite alluvial fan, Caveside, Tasmania

    Science.gov (United States)

    Kain, Claire L.; Rigby, Edward H.; Mazengarb, Colin

    2018-02-01

    Two episodes of intense flooding and sediment movement occurred in the Westmorland Stream alluvial system near Caveside, Australia in January 2011 and June 2016. The events were investigated in order to better understand the drivers and functioning of this composite alluvial system on a larger scale, so as to provide awareness of the potential hazard from future flood and debris flow events. A novel combination of methods was employed, including field surveys, catchment morphometry, GIS mapping from LiDAR and aerial imagery, and hydraulic modelling using RiverFlow-2D software. Both events were initiated by extreme rainfall events. The effects of the events on the farmland appeared similar; however, there were differences in sediment source and transport processes that have implications for understanding recurrence probabilities. A debris flow was a key driver in the 2011 event, by eroding the stream channel in the forested watershed and delivering a large volume of sediment downstream to the alluvial fan. In contrast, modelled flooding velocities suggest the impacts of the 2016 event were the result of an extended period of extreme stream flooding and consequent erosion of alluvium directly above the current fan apex. The morphometry of the catchment is better aligned with values from fluvially dominated fans found elsewhere, which suggests that flooding represents a more frequent future risk than debris flows. These findings have wider implications for the estimation of debris flow and flood hazard on alluvial fans in Tasmania and elsewhere, as well as further demonstrating the capacity of combined hydraulic modelling and geomorphologic investigation as a predictive tool to inform hazard management practices in environments affected by flooding and sediment movement.

  16. SCDAP/RELAP5 Modeling of Heat Transfer and Flow Losses in Lower Head Porous Debris

    International Nuclear Information System (INIS)

    Siefken, Larry James; Coryell, Eric Wesley; Paik, Seungho; Kuo, Han Hsiung

    1999-01-01

    Designs are described for implementing models for calculating the heat transfer and flow losses in porous debris in the lower head of a reactor vessel. The COUPLE model in SCDAP/RELAP5 represents both the porous and nonporous debris that results from core material slumping into the lower head. Currently, the COUPLE model has the capability to model convective and radiative heat transfer from the surfaces of nonporous debris in a detailed manner and to model only in a simplistic manner the heat transfer from porous debris. In order to advance beyond the simplistic modeling for porous debris, designs are developed for detailed calculations of heat transfer and flow losses in porous debris. Correlations are identified for convective heat transfer in porous debris for the following modes of heat transfer; (1) forced convection to liquid, (2) forced convection to gas, (3) nucleate boiling, (4) transition boiling, and (5) film boiling. Interphase heat transfer is modeled in an approximate manner. Designs are described for models to calculate the flow losses and interphase drag of fluid flowing through the interstices of the porous debris, and to apply these variables in the momentum equations in the RELAP5 part of the code. Since the models for heat transfer and flow losses in porous debris in the lower head are designed for general application, a design is also described for implementation of these models to the analysis of porous debris in the core region. A test matrix is proposed for assessing the capability of the implemented models to calculate the heat transfer and flow losses in porous debris. The implementation of the models described in this report is expected to improve the COUPLE code calculation of the temperature distribution in porous debris and in the lower head that supports the debris. The implementation of these models is also expected to improve the calculation of the temperature and flow distribution in porous debris in the core region
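
    For orientation, single-phase flow losses through a packed debris bed are often estimated with the Ergun correlation, sketched below; this is a generic porous-media relation offered only as an illustration and is not necessarily the formulation implemented in SCDAP/RELAP5.

```python
def ergun_pressure_gradient(superficial_velocity, particle_diameter, porosity,
                            fluid_density, fluid_viscosity):
    """Ergun correlation for the pressure gradient (Pa/m) across a packed
    particle bed: a viscous term plus an inertial term. Offered as a generic
    illustration of 'flow losses in porous debris', not as the code's model."""
    eps = porosity
    viscous = (150.0 * fluid_viscosity * (1.0 - eps) ** 2 * superficial_velocity
               / (eps ** 3 * particle_diameter ** 2))
    inertial = (1.75 * fluid_density * (1.0 - eps) * superficial_velocity ** 2
                / (eps ** 3 * particle_diameter))
    return viscous + inertial

# Example: water at roughly room conditions flowing through 3 mm debris at porosity 0.4.
print(ergun_pressure_gradient(superficial_velocity=0.05, particle_diameter=3e-3,
                              porosity=0.4, fluid_density=1000.0,
                              fluid_viscosity=1e-3))
```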

  17. An Ontological Architecture for Orbital Debris Data

    OpenAIRE

    Rovetto, Robert J.

    2017-01-01

    The orbital debris problem presents an opportunity for inter-agency and international cooperation toward the mutually beneficial goals of debris prevention, mitigation, remediation, and improved space situational awareness (SSA). Achieving these goals requires sharing orbital debris and other SSA data. Toward this, I present an ontological architecture for the orbital debris domain, taking steps in the creation of an orbital debris ontology (ODO). The purpose of this ontological system is to ...

  18. Textural analysis of particles from El Zaguán debris avalanche deposit, Nevado de Toluca volcano, Mexico: Evidence of flow behavior during emplacement

    Science.gov (United States)

    Caballero, Lizeth; Capra, Lucia

    2011-02-01

    El Zaguán deposit originated at 28,000 yrs. B.P. from the flank collapse of Nevado de Toluca, a dacitic stratovolcano of the Transmexican Volcanic Belt. Scanning Electron Microprobe analyses (SEM) were performed on some particles from this deposit to observe microtextures produced during transport and emplacement of the debris avalanche flow. Particles from 2ϕ (250 μm), 0ϕ (1 mm) and - 2ϕ (4 mm) granulometric classes were randomly selected at different outcrops, and their surface textures were described. The observed textures are divided in two groups, Basal and Upper textures, each one indicating different clast interactions. Basal textures are observed predominantly in the lower part of the deposit and consist of parallel ridges, parallel grooves, scratches and lips. Upper textures are mainly present in the upper part of the deposit and consisted of fractures, percussion marks, and broken or grinded crystals. These characteristics, coupled with field observations such as the presence of clastic dikes and deformed lacustrine mega-blocks, indicate that the basal part of the debris avalanche was moving in a partially liquefied state. By contrast, the particles in the upper part were able to move freely, interacting by collision. These microscopic textures are in agreement with previously described emplacement behaviors in debris avalanches of volcanic origin, suggesting a stratified flow dominated by different transport and depositional mechanisms depending upon flow depth and possible fluid content at their base.
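
    For readers unfamiliar with the ϕ classes quoted above, the Krumbein phi scale (a general sedimentological convention, not specific to this study) relates ϕ to grain diameter d as follows, which reproduces the 250 μm, 1 mm and 4 mm sizes for ϕ = 2, 0 and -2:

```latex
% Krumbein phi scale with reference diameter d_0 = 1 mm.
\phi = -\log_{2}\!\left(\frac{d}{d_0}\right), \qquad d = d_0\,2^{-\phi}
```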

  19. Artificial Intelligence.

    Science.gov (United States)

    Information Technology Quarterly, 1985

    1985-01-01

    This issue of "Information Technology Quarterly" is devoted to the theme of "Artificial Intelligence." It contains two major articles: (1) Artificial Intelligence and Law" (D. Peter O'Neill and George D. Wood); (2) "Artificial Intelligence: A Long and Winding Road" (John J. Simon, Jr.). In addition, it contains two sidebars: (1) "Calculating and…

  20. Competitive Intelligence.

    Science.gov (United States)

    Bergeron, Pierrette; Hiller, Christine A.

    2002-01-01

    Reviews the evolution of competitive intelligence since 1994, including terminology and definitions and analytical techniques. Addresses the issue of ethics; explores how information technology supports the competitive intelligence process; and discusses education and training opportunities for competitive intelligence, including core competencies…

  1. The comparing analysis of simulation of emergent dispatch of cars for intelligent driving autos in crossroads

    Science.gov (United States)

    Zheng, Ziao

    2018-03-01

    It is widely acknowledged that broad acceptance by car users is important for the development of intelligent cars. While most intelligent cars have a system for monitoring whether the car itself is in good condition to drive, it is also clear that studies should be performed on how emergency rescue of intelligent vehicles is dispatched. In this study, the writer focuses mainly on deriving a separate system for the car-care teams so that they arrive as soon as they receive the signal sent out by the intelligent driving autos. The simulation measures the time for the rescuing team to arrive, the cost it spends on arriving at the site where the car problem happens, and how long the queue is while the rescuing auto is waiting to cross a road. This can definitely be of great use when one car in a team of intelligent cars suddenly has a problem that stops it from moving, and it can be helpful in other situations as well. In this way, the interconnection of cars can be a safety net for drivers encountering difficulties at any time.
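
    The abstract gives no model details, so the toy Monte Carlo below only illustrates the kinds of quantities such a dispatch simulation would report (rescue-team response time and the queue at a crossroad); every distance, speed and signal timing in it is invented.

```python
import random

# Toy Monte Carlo sketch of dispatch metrics: travel time plus waiting at a signal,
# and the queue of cars accumulated while the rescue vehicle waits. All parameters
# below are illustrative placeholders, not values from the study.
def simulate_rescue(n_runs=10_000, distance_km=5.0, speed_kmh=40.0,
                    red_phase_s=60.0, arrivals_per_s=0.2):
    waits, queues = [], []
    for _ in range(n_runs):
        travel_s = distance_km / speed_kmh * 3600.0
        wait_at_signal = random.uniform(0.0, red_phase_s)   # time left in the red phase
        queue_len = int(arrivals_per_s * wait_at_signal)    # cars accumulated ahead
        waits.append(travel_s + wait_at_signal)
        queues.append(queue_len)
    return sum(waits) / n_runs, sum(queues) / n_runs

mean_response_s, mean_queue = simulate_rescue()
print(f"mean response time: {mean_response_s:.1f} s, mean queue length: {mean_queue:.1f} cars")
```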

  2. The effect of long chain polyunsaturated fatty acid supplementation on intelligence in low birth weight infant during lactation: A meta-analysis

    Science.gov (United States)

    Song, Yuan; Liu, Ya; Pan, Yun; Yuan, Xiaofeng; Chang, Pengyu; Tian, Yuan; Cui, Weiwei

    2018-01-01

    Background Low birth weight infants (LBWIs) are prone to mental and behavioural problems. As an important constituent of the brain and retina, long chain polyunsaturated fatty acids are essential for foetal and infant mental and visual development. The effect of lactation supplemented with long chain polyunsaturated fatty acids (LCPUFA) on the improvement of intelligence in low birth weight children requires further validation. Methods In this study, a comprehensive search of multiple databases was performed to identify studies focused on the association between intelligence and long chain polyunsaturated fatty acid supplementation in LBWIs. Studies that compared the Bayley Scales of Infant Development (BSID) or the Wechsler Abbreviated Scale of Intelligence for Children (WISC) scores between LBWIs who were supplemented and controls that were not supplemented with LCPUFA during lactation were selected for inclusion in the meta-analysis. Results The main outcome was the mean difference in the mental development index (MDI) and psychomotor development index (PDI) of the BSID and the full scale intelligence quotient (FSIQ), verbal intelligence quotient (VIQ) and performance intelligence quotient (PIQ) of the WISC between LBWIs and controls. Our findings indicated that the mean BSID or WISC scores in LBWIs did not differ between the supplemented groups and controls. Conclusion This meta-analysis does not reveal that LCPUFA supplementation has a significant impact on the level of intelligence in LBWIs. PMID:29634752

  3. Intelligence Issues for Congress

    National Research Council Canada - National Science Library

    Best, Jr, Richard A

    2007-01-01

    To address the challenges facing the U.S. Intelligence Community in the 21st century, congressional and executive branch initiatives have sought to improve coordination among the different agencies and to encourage better analysis...

  4. Intelligence Issues for Congress

    National Research Council Canada - National Science Library

    Best, Jr, Richard A

    2006-01-01

    To address the challenges facing the U.S. Intelligence Community in the 21st Century, congressional and executive branch initiatives have sought to improve coordination among the different agencies and to encourage better analysis...

  5. Intelligence Issues for Congress

    National Research Council Canada - National Science Library

    Best, Jr, Richard A

    2008-01-01

    To address the challenges facing the U.S. Intelligence Community in the 21st century, congressional and executive branch initiatives have sought to improve coordination among the different agencies and to encourage better analysis...

  6. Intelligence Issues for Congress

    National Research Council Canada - National Science Library

    Best, Jr, Richard A

    2006-01-01

    To address the challenges facing the U.S. Intelligence Community in the 21st century, congressional and executive branch initiatives have sought to improve coordination among the different agencies and to encourage better analysis...

  7. Emotional Intelligence and ADHD: A Comparative Analysis in Students of Lima Metropolitan Area

    Directory of Open Access Journals (Sweden)

    Luciana M. Barahona

    2016-04-01

    Full Text Available The following study aims to identify statistically significant differences between adolescent students with and without Attention Deficit Disorder and Hyperactivity (ADHD) in emotional intelligence skills. The study sample was composed of 44 students with ADHD diagnosis and 192 students without ADHD; both groups were obtained by an intentional process. The participants were evaluated with the Emotional Intelligence Inventory of BarOn ICE: NA, Peruvian adaptation and standardization (Ugarriza & Pajares, 2003). The results showed that there are statistically significant differences in intrapersonal skills (U = 3306.50, z = -2.25, p = .024, r = -.15) and positive impression (U = 3369.00, z = -2.10, p = .036, r = -.14) of emotional intelligence between students with ADHD and students without ADHD. Thus, the first group got higher scores than the second one in both aspects.

  8. My Home - analysis of the easy and intelligent way to save energy

    Energy Technology Data Exchange (ETDEWEB)

    Wilke, Goeran (The Danish Electricity Saving Trust (Denmark))

    2009-07-01

    The concept of the intelligent home is not new, but translating theory into reality presents many challenges. Launched in Denmark in October 2008, My Home is an interactive web portal offering consumers the opportunity to calculate their household energy consumption, receive advice on possible savings, and control and monitor their indoor environments by managing their energy consumption appropriately. This paper describes the software functionality of My Home and examines the energy saving benefits available to consumers who use this system to control and monitor electricity consumption in their homes. The paper further discusses the necessity of having a common infrastructure and standard for use by all producers and suppliers in order to ensure a mass market for compatible equipment and solutions. The analysis focuses on how My Home allows users to map energy consumption in their homes by using drag-and-drop icons to organise and equip a floorplan with their own electrical appliances. My Home features a calculator which automatically works out the annual electricity consumption and displays appropriate savings advice based on manually inputted readings or remote readings supplied by an electricity provider. Detailed statistical evaluation of metered values is also available. The extent to which the use of My Home has resulted in lower energy consumption in the homes of individual users is currently being determined by ongoing research. This involves examining users' motivations and needs, and testing for usability on the basis of ethnographic field work. The results showed that My Home facilitates easy configuration of home control and monitoring systems, which is normally very difficult to perform.

  9. Intelligence in youth and all-cause-mortality: systematic review with meta-analysis.

    Science.gov (United States)

    Calvin, Catherine M; Deary, Ian J; Fenton, Candida; Roberts, Beverly A; Der, Geoff; Leckenby, Nicola; Batty, G David

    2011-06-01

    A number of prospective cohort studies have examined the association between intelligence in childhood or youth and life expectancy in adulthood; however, the effect size of this association is yet to be quantified and previous reviews require updating. The systematic review included an electronic search of EMBASE, MEDLINE and PSYCHINFO databases. This yielded 16 unrelated studies that met inclusion criteria, comprising 22,453 deaths among 1,107,022 participants. Heterogeneity was assessed, and fixed effects models were applied to the aggregate data. Publication bias was evaluated, and sensitivity analyses were conducted. A 1-standard deviation (SD) advantage in cognitive test scores was associated with a 24% (95% confidence interval 23-25) lower risk of death, during a 17- to 69-year follow-up. There was little evidence of publication bias (Egger's intercept = 0.10, P = 0.81), and the intelligence-mortality association was similar for men and women. Adjustment for childhood socio-economic status (SES) in the nine studies containing these data had almost no impact on this relationship, suggesting that this is not a confounder of the intelligence-mortality association. Controlling for adult SES in five studies and for education in six studies attenuated the intelligence-mortality hazard ratios by 34 and 54%, respectively. Future investigations should address the extent to which attenuation of the intelligence-mortality link by adult SES indicators is due to mediation, over-adjustment and/or confounding. The explanation(s) for association between higher early-life intelligence and lower risk of adult mortality require further elucidation.
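
    The pooled 24% figure comes from a fixed-effects model; the sketch below shows generic inverse-variance pooling of hazard ratios per 1-SD cognitive advantage, using invented study values rather than the data from the 16 cohorts reviewed.

```python
import math

# Generic fixed-effects (inverse-variance) pooling of log hazard ratios per 1-SD
# advantage in cognitive score. The three (HR, lower CI, upper CI) entries are
# invented placeholders, not values from the reviewed cohort studies.
studies = [(0.75, 0.70, 0.80), (0.78, 0.72, 0.85), (0.73, 0.66, 0.81)]

weights, weighted_sum = 0.0, 0.0
for hr, lo, hi in studies:
    log_hr = math.log(hr)
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)   # SE recovered from the CI width
    w = 1.0 / se ** 2
    weights += w
    weighted_sum += w * log_hr

pooled_log_hr = weighted_sum / weights
pooled_se = math.sqrt(1.0 / weights)
ci = (math.exp(pooled_log_hr - 1.96 * pooled_se),
      math.exp(pooled_log_hr + 1.96 * pooled_se))
print(f"pooled HR per 1-SD: {math.exp(pooled_log_hr):.3f}, 95% CI {ci[0]:.3f}-{ci[1]:.3f}")
```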

  10. The physics of debris flows

    Science.gov (United States)

    Iverson, Richard M.

    1997-08-01

    Recent advances in theory and experimentation motivate a thorough reassessment of the physics of debris flows. Analyses of flows of dry, granular solids and solid-fluid mixtures provide a foundation for a comprehensive debris flow theory, and experiments provide data that reveal the strengths and limitations of theoretical models. Both debris flow materials and dry granular materials can sustain shear stresses while remaining static; both can deform in a slow, tranquil mode characterized by enduring, frictional grain contacts; and both can flow in a more rapid, agitated mode characterized by brief, inelastic grain collisions. In debris flows, however, pore fluid that is highly viscous and nearly incompressible, composed of water with suspended silt and clay, can strongly mediate intergranular friction and collisions. Grain friction, grain collisions, and viscous fluid flow may transfer significant momentum simultaneously. Both the vibrational kinetic energy of solid grains (measured by a quantity termed the granular temperature) and the pressure of the intervening pore fluid facilitate motion of grains past one another, thereby enhancing debris flow mobility. Granular temperature arises from conversion of flow translational energy to grain vibrational energy, a process that depends on shear rates, grain properties, boundary conditions, and the ambient fluid viscosity and pressure. Pore fluid pressures that exceed static equilibrium pressures result from local or global debris contraction. Like larger, natural debris flows, experimental debris flows of ˜10 m³ of poorly sorted, water-saturated sediment invariably move as an unsteady surge or series of surges. Measurements at the base of experimental flows show that coarse-grained surge fronts have little or no pore fluid pressure. In contrast, finer-grained, thoroughly saturated debris behind surge fronts is nearly liquefied by high pore pressure, which persists owing to the great compressibility and moderate
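
    The granular temperature invoked here has a standard kinetic-theory definition (a textbook relation quoted for orientation rather than from the paper; conventions differ on the 1/3 factor):

```latex
% Granular temperature as the mean-square fluctuation of grain velocity about the mean flow.
T_g = \tfrac{1}{3}\,\langle \delta\mathbf{v}\cdot\delta\mathbf{v} \rangle,
\qquad \delta\mathbf{v} = \mathbf{v}_{\mathrm{grain}} - \langle \mathbf{v} \rangle
```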

  11. The physics of debris flows

    Science.gov (United States)

    Iverson, R.M.

    1997-01-01

    Recent advances in theory and experimentation motivate a thorough reassessment of the physics of debris flows. Analyses of flows of dry, granular solids and solid-fluid mixtures provide a foundation for a comprehensive debris flow theory, and experiments provide data that reveal the strengths and limitations of theoretical models. Both debris flow materials and dry granular materials can sustain shear stresses while remaining static; both can deform in a slow, tranquil mode characterized by enduring, frictional grain contacts; and both can flow in a more rapid, agitated mode characterized by brief, inelastic grain collisions. In debris flows, however, pore fluid that is highly viscous and nearly incompressible, composed of water with suspended silt and clay, can strongly mediate intergranular friction and collisions. Grain friction, grain collisions, and viscous fluid flow may transfer significant momentum simultaneously. Both the vibrational kinetic energy of solid grains (measured by a quantity termed the granular temperature) and the pressure of the intervening pore fluid facilitate motion of grains past one another, thereby enhancing debris flow mobility. Granular temperature arises from conversion of flow translational energy to grain vibrational energy, a process that depends on shear rates, grain properties, boundary conditions, and the ambient fluid viscosity and pressure. Pore fluid pressures that exceed static equilibrium pressures result from local or global debris contraction. Like larger, natural debris flows, experimental debris flows of ~10 m³ of poorly sorted, water-saturated sediment invariably move as an unsteady surge or series of surges. Measurements at the base of experimental flows show that coarse-grained surge fronts have little or no pore fluid pressure. In contrast, finer-grained, thoroughly saturated debris behind surge fronts is nearly liquefied by high pore pressure, which persists owing to the great compressibility and moderate

  12. CIRCUMSTELLAR DEBRIS DISKS: DIAGNOSING THE UNSEEN PERTURBER

    Energy Technology Data Exchange (ETDEWEB)

    Nesvold, Erika R. [Department of Terrestrial Magnetism, Carnegie Institution for Science, 5241 Broad Branch Rd., Washington, DC 20015 (United States); Naoz, Smadar; Vican, Laura [Department of Physics and Astronomy, UCLA, 475 Portola Plaza, Los Angeles, CA 90095 (United States); Farr, Will M. [School of Physics and Astronomy, University of Birmingham, Birmingham, B15 2TT (United Kingdom)

    2016-07-20

    The first indication of the presence of a circumstellar debris disk is usually the detection of excess infrared emission from the population of small dust grains orbiting the star. This dust is short-lived, requiring continual replenishment, and indicating that the disk must be excited by an unseen perturber. Previous theoretical studies have demonstrated that an eccentric planet orbiting interior to the disk will stir the larger bodies in the belt and produce dust via interparticle collisions. However, motivated by recent observations, we explore another possible mechanism for heating a debris disk: a stellar-mass perturber orbiting exterior to and inclined to the disk and exciting the disk particles’ eccentricities and inclinations via the Kozai–Lidov mechanism. We explore the consequences of an exterior perturber on the evolution of a debris disk using secular analysis and collisional N -body simulations. We demonstrate that a Kozai–Lidov excited disk can generate a dust disk via collisions and we compare the results of the Kozai–Lidov excited disk with a simulated disk perturbed by an interior eccentric planet. Finally, we propose two observational tests of a dust disk that can distinguish whether the dust was produced by an exterior brown dwarf or stellar companion or an interior eccentric planet.
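
    For orientation, at quadrupole order and in the test-particle limit the Kozai–Lidov mechanism conserves the vertical component of a disk particle's angular momentum, so eccentricity e and inclination i oscillate in antiphase; this is a standard result rather than a detail taken from the paper:

```latex
% Quadrupole-order, test-particle Kozai–Lidov constant
% (i measured from the perturber's orbital plane).
\sqrt{1 - e^{2}}\,\cos i = \text{const}
```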

  13. Postdetonation nuclear debris for attribution.

    Science.gov (United States)

    Fahey, A J; Zeissler, C J; Newbury, D E; Davis, J; Lindstrom, R M

    2010-11-23

    On the morning of July 16, 1945, the first atomic bomb was exploded in New Mexico on the White Sands Proving Ground. The device was a plutonium implosion device similar to the device that destroyed Nagasaki, Japan, on August 9 of that same year. Recently, with the enactment of US public law 111-140, the "Nuclear Forensics and Attribution Act," scientists in the government and academia have been able, in earnest, to consider what type of forensic-style information may be obtained after a nuclear detonation. To conduct a robust attribution process for an exploded device placed by a nonstate actor, forensic analysis must yield information about not only the nuclear material in the device but about other materials that went into its construction. We have performed an investigation of glassed ground debris from the first nuclear test showing correlations among multiple analytical techniques. Surprisingly, there is strong evidence, obtainable only through microanalysis, that secondary materials used in the device can be identified and positively associated with the nuclear material.

  14. Preliminary investigation for the development of surrogate debris from nuclear detonations in marine-urban environments

    International Nuclear Information System (INIS)

    Seybert, A.G.; Auxier II, J.D.; University of Tennessee, Knoxville, TN; Hall, H.L.; University of Tennessee, Knoxville, TN; University of Tennessee, Knoxville, TN

    2017-01-01

    Since no nuclear weapon surface detonations have occurred in urban harbor environments, the nuclear forensic community has no actual debris from which to develop and validate analytical methods for radiochemistry analysis, making the development of surrogate debris representative of such a marine-urban detonation a vital undertaking. This work seeks to build a robust model that accounts for natural and manmade environmental variations in harbor environments and vessel compositions to statistically define the elemental composition of vaporized debris from a marine-urban nuclear detonation. This initial work is necessary for follow-on neutron-activation and debris formation analysis. (author)

  15. Quantitative assessment of apical debris extrusion and intracanal debris in the apical third, using hand instrumentation and three rotary instrumentation systems.

    Science.gov (United States)

    H K, Sowmya; T S, Subhash; Goel, Beena Rani; T N, Nandini; Bhandi, Shilpa H

    2014-02-01

    Decreased apical extrusion of debris and apical one third debris have strong implications for decreased incidence of postoperative inflammation and pain. Thus, the aim of this study was to assess quantitatively the apical extrusion of debris and intracanal debris in the apical third during root canal instrumentation using hand and three different types of rotary instruments. Sixty freshly extracted single rooted human teeth were randomly divided into four groups. Canal preparation was done using step-back with hand instrumentation, crown-down technique with respect to ProTaper and K3, and hybrid technique with LightSpeed LSX. Irrigation was done with NaOCl, EDTA, and normal saline and for final irrigation, EndoVac system was used. The apically extruded debris was collected on the pre-weighed Millipore plastic filter disk and weighed using microbalance. The teeth were submitted to the histological processing. Sections from the apical third were analyzed by a trinocular research microscope that was coupled to a computer where the images were captured and analyzed using image proplus V4.1.0.0 software. The mean weight of extruded debris for each group and intracanal debris in the root canal was statistically analyzed by a Kruskal-Wallis one-way analysis of variance and Mann-Whitney U test. The result showed that, hand instrumentation using K files showed the highest amount of debris extrusion apically when compared to ProTaper, K3 and LightSpeed LSX. The result also showed that there was no statistically significant difference between the groups in relation to presence of intracanal debris in the apical one third. Based on the results, all instrumentation techniques produced debris extrusion. The engine driven Ni-Ti systems extruded significantly less apical debris than hand instrumentation. There was no statistically significant difference between the groups in relation to presence of intracanal debris in the apical one third.
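
    The statistical tests named above are standard; a minimal scipy sketch with invented debris weights is shown below, purely to illustrate the Kruskal–Wallis and Mann–Whitney calls rather than to reproduce the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical apically extruded debris weights (mg) for four instrumentation groups;
# the numbers are illustrative only, not data from the study.
rng = np.random.default_rng(3)
hand       = rng.normal(1.2, 0.3, 15)
protaper   = rng.normal(0.8, 0.2, 15)
k3         = rng.normal(0.7, 0.2, 15)
lightspeed = rng.normal(0.6, 0.2, 15)

# Kruskal-Wallis one-way analysis of variance across the four groups.
h_stat, p_overall = stats.kruskal(hand, protaper, k3, lightspeed)
print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {p_overall:.4f}")

# Pairwise Mann-Whitney U follow-up, as used for between-group comparisons.
u_stat, p_pair = stats.mannwhitneyu(hand, protaper, alternative="two-sided")
print(f"hand vs ProTaper: U = {u_stat:.1f}, p = {p_pair:.4f}")
```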

  16. Analysis of the Impact of Emotional Intelligence Employees on Organizational Performance

    Directory of Open Access Journals (Sweden)

    Tina Vukasović

    2013-01-01

    Full Text Available A modern company needs, as the foundation of its successful operation, a clearly defined vision, employee values and high but still reachable business goals, which are based on a high level of involvement of employees in the company and on their participation. Knowledge, skills and experience that employees possess are strong arguments of a successful company, but it is not all the same with what kind of emotions those arguments are expressed in practice. Research shows that respondents attach great influence to emotional intelligence in the success of the company. Emotional intelligence should help both to achieve better work performance and to improve physical well-being and emotional stability.

  17. Analysis of optoelectronic strategic planning in Taiwan by artificial intelligence portfolio tool

    Science.gov (United States)

    Chang, Rang-Seng

    1992-05-01

    Taiwan ROC has achieved significant advances in the optoelectronic industry, with some Taiwanese products ranked high in the world market and technology. Six segments of optoelectronics were planned, each divided into several strategic items, and an artificial intelligence portfolio tool (AIPT) was designed to analyze optoelectronic strategic planning in Taiwan. The portfolio is designed to provoke strategic thinking intelligently. The computer-generated strategy should be selected and modified by the individual. Some strategies for the development of the Taiwan optoelectronic industry are also discussed in this paper.

  18. Intelligence Ethics:

    DEFF Research Database (Denmark)

    Rønn, Kira Vrist

    2016-01-01

    Questions concerning what constitutes a morally justified conduct of intelligence activities have received increased attention in recent decades. However, intelligence ethics is not yet homogeneous or embedded as a solid research field. The aim of this article is to sketch the state of the art of intelligence ethics and point out subjects for further scrutiny in future research. The review clusters the literature on intelligence ethics into two groups: respectively, contributions on external topics (i.e., the accountability of and the public trust in intelligence agencies) and internal topics (i.e., the search for an ideal ethical framework for intelligence actions). The article concludes that there are many holes to fill for future studies on intelligence ethics both in external and internal discussions. Thus, the article is an invitation – especially, to moral philosophers and political theorists...

  19. Investigation of debris bed formation, spreading and coolability

    Energy Technology Data Exchange (ETDEWEB)

    Kudinov, P.; Konovalenko, A.; Grishchenko, D.; Yakush, S.; Basso, S.; Lubchenko, N.; Karbojian, A. [Royal Institute of Technology, KTH. Div. of Nuclear Power Safety, Stockholm (Sweden)

    2013-08-15

    The work is motivated by the severe accident management strategy adopted in Nordic type BWRs. It is assumed that core melt ejected from the vessel will fragment, quench and form a coolable debris bed in a deep water pool below the vessel. In this work we consider phenomena relevant to the debris bed formation and coolability. Several DEFOR-A (Debris Bed Formation - Agglomeration) tests have been carried out with new corium melt material and a melt releasing nozzle mockup. The influence of the melt material, melt superheat, and jet free fall height on the (i) fraction of agglomerated debris, (ii) particle size distribution, and (iii) ablation/plugging of the nozzle mockup has been addressed. Results of the DECOSIM (Debris Coolability Simulator) code validation against available COOLOCE data are presented in the report. The dependence of DHF on system pressure from COOLOCE experiments can be reproduced quite accurately if either the effective particle diameter or debris bed porosity is increased. For a cylindrical debris bed, good agreement is achieved in DECOSIM simulations for the particle diameter 0.89 mm and porosity 0.4. The results obtained are consistent with MEWA simulations, where larger particle diameters and porosities were found to be necessary to reproduce the experimental data on DHF. It is instructive to note that results of DHF prediction are in better agreement with POMECO-HT data obtained for the same particles. It is concluded that further clarification of the discrepancies between different experiments and model predictions is needed. In total 13 exploratory tests were carried out in the PDS (particulate debris spreading) facility to clarify the potential influence of the COOLOCE (VTT) facility heaters and TCs on the particle self-leveling process. Results of the preliminary analysis suggest that there is no significant influence of the pins on self-leveling, at least for air superficial velocities ranging from 0.17 up to 0.52 m/s. Further confirmatory tests might be needed

  20. Investigation of debris bed formation, spreading and coolability

    International Nuclear Information System (INIS)

    Kudinov, P.; Konovalenko, A.; Grishchenko, D.; Yakush, S.; Basso, S.; Lubchenko, N.; Karbojian, A.

    2013-08-01

    The work is motivated by the severe accident management strategy adopted in Nordic type BWRs. It is assumed that core melt ejected from the vessel will fragment, quench and form a coolable debris bed in a deep water pool below the vessel. In this work we consider phenomena relevant to the debris bed formation and coolability. Several DEFOR-A (Debris Bed Formation - Agglomeration) tests have been carried out with new corium melt material and a melt releasing nozzle mockup. The influence of the melt material, melt superheat, and jet free fall height on the (i) fraction of agglomerated debris, (ii) particle size distribution, and (iii) ablation/plugging of the nozzle mockup has been addressed. Results of the DECOSIM (Debris Coolability Simulator) code validation against available COOLOCE data are presented in the report. The dependence of DHF on system pressure from COOLOCE experiments can be reproduced quite accurately if either the effective particle diameter or debris bed porosity is increased. For a cylindrical debris bed, good agreement is achieved in DECOSIM simulations for the particle diameter 0.89 mm and porosity 0.4. The results obtained are consistent with MEWA simulations, where larger particle diameters and porosities were found to be necessary to reproduce the experimental data on DHF. It is instructive to note that results of DHF prediction are in better agreement with POMECO-HT data obtained for the same particles. It is concluded that further clarification of the discrepancies between different experiments and model predictions is needed. In total 13 exploratory tests were carried out in the PDS (particulate debris spreading) facility to clarify the potential influence of the COOLOCE (VTT) facility heaters and TCs on the particle self-leveling process. Results of the preliminary analysis suggest that there is no significant influence of the pins on self-leveling, at least for air superficial velocities ranging from 0.17 up to 0.52 m/s. Further confirmatory tests might be needed

  1. Nephrus: expert system model in intelligent multilayers for evaluation of urinary system based on scintigraphic image analysis

    International Nuclear Information System (INIS)

    Silva, Jorge Wagner Esteves da; Schirru, Roberto; Boasquevisque, Edson Mendes

    1999-01-01

    Renal function can be measured noninvasively with radionuclides in an extremely safe way compared to other diagnostic techniques. Nevertheless, because radioactive materials are used in this procedure, it is necessary to maximize its benefits, and therefore all efforts are justifiable in the development of data analysis support tools for this diagnostic modality. The objective of this work is to develop a prototype for a system model based on Artificial Intelligence devices able to perform functions related to scintigraphic image analysis of the urinary system. Rules used by medical experts in the analysis of images obtained with 99mTc+DTPA and/or 99mTc+DMSA were modeled and a Neural Network diagnosis technique was implemented. Special attention was given to designing the program's user interface. Human Factors Engineering techniques were taken into account, allowing friendliness and robustness. The image segmentation adopts a model based on Ideal ROIs, which represent the normal anatomic concept for urinary system organs. Results obtained using Artificial Neural Networks for qualitative image analysis, together with the knowledge model constructed, show the feasibility of an Artificial Intelligence implementation that uses the inherent abilities of each technique in medical diagnostic image analysis. (author)
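
    As a loose sketch of the neural-network diagnosis step, the snippet below trains a small classifier on invented ROI features; the feature set, labels and architecture are placeholders and do not reflect the Nephrus design.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Invented ROI features extracted from renal scintigrams (e.g., relative uptake,
# time-to-peak, washout fraction) with a binary normal/abnormal label.
rng = np.random.default_rng(7)
X_normal   = rng.normal([0.5, 0.4, 0.6], 0.05, size=(40, 3))
X_abnormal = rng.normal([0.3, 0.7, 0.3], 0.05, size=(40, 3))
X = np.vstack([X_normal, X_abnormal])
y = np.array([0] * 40 + [1] * 40)

# Small multilayer perceptron standing in for the expert system's neural component.
clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```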

  2. Debris Flow Occurrence and Sediment Persistence, Upper Colorado River Valley, CO.

    Science.gov (United States)

    Grimsley, K J; Rathburn, S L; Friedman, J M; Mangano, J F

    2016-07-01

    Debris flow magnitudes and frequencies are compared across the Upper Colorado River valley to assess influences on debris flow occurrence and to evaluate valley geometry effects on sediment persistence. Dendrochronology, field mapping, and aerial photographic analysis are used to evaluate whether a 19th century earthen, water-conveyance ditch has altered the regime of debris flow occurrence in the Colorado River headwaters. Identifying any shifts in disturbance processes or changes in magnitudes and frequencies of occurrence is fundamental to establishing the historical range of variability (HRV) at the site. We found no substantial difference in frequency of debris flows cataloged at eleven sites of deposition between the east (8) and west (11) sides of the Colorado River valley over the last century, but four of the five largest debris flows originated on the west side of the valley in association with the earthen ditch, while the fifth is on a steep hillslope of hydrothermally altered rock on the east side. These results suggest that the ditch has altered the regime of debris flow activity in the Colorado River headwaters as compared to HRV by increasing the frequency of debris flows large enough to reach the Colorado River valley. Valley confinement is a dominant control on response to debris flows, influencing volumes of aggradation and persistence of debris flow deposits. Large, frequent debris flows, exceeding HRV, create persistent effects due to valley geometry and geomorphic setting conducive to sediment storage that are easily delineated by valley confinement ratios which are useful to land managers.

  3. Emotional Intelligence, Motivational Climate and Levels of Anxiety in Athletes from Different Categories of Sports: Analysis through Structural Equations

    Directory of Open Access Journals (Sweden)

    Manuel Castro-Sánchez

    2018-05-01

    Full Text Available (1) Background: Psychological factors can strongly affect the athletes’ performance. Therefore, currently the role of the sports psychologist is particularly relevant, being in charge of training the athlete’s psychological factors. This study aims at analysing the connections between motivational climate in sport, anxiety and emotional intelligence depending on the type of sport practised (individual/team) by means of a multigroup structural equations analysis. (2) 372 semi-professional Spanish athletes took part in this investigation, analysing motivational climate (PMCSQ-2), emotional intelligence (SSRI) and levels of anxiety (STAI). A model of multigroup structural equations was carried out which fitted accordingly (χ2 = 586.77; df = 6.37; p < 0.001; Comparative Fit Index (CFI) = 0.951; Normed Fit Index (NFI) = 0.938; Incremental Fit Index (IFI) = 0.947; Root Mean Square Error of Approximation (RMSEA) = 0.069). (3) Results: A negative and direct connection has been found between ego oriented climate and task oriented climate, which is stronger and more differentiated in team sports. The most influential indicator in ego oriented climate is intra-group rivalry, exerting greater influence in individual sports. For task-oriented climate the strongest indicator is having an important role in individual sports, while in team sports it is cooperative learning. Emotional intelligence dimensions correlate more strongly in team sports than in individual sports. In addition, there was a negative and indirect relation between task oriented climate and trait-anxiety in both categories of sports. (4) Conclusions: This study shows how the task-oriented motivational climate or certain levels of emotional intelligence can act preventively in the face of anxiety states in athletes. Therefore, the development of these psychological factors could prevent anxiety states and improve performance in athletes.

  4. Ruhsal Zeka ve Çalışma Algısı Üzerine Bir Analiz (The Analysis on Spiritual Intelligence and Working Perception)

    Directory of Open Access Journals (Sweden)

    Ümmühan YİĞİT SEYFİ

    2016-12-01

    Full Text Available In today's world, companies have a mosaic organizational structure in which employees from different cultures and generations work together. In this structure, to sustain the existence of companies and their employees as a whole, it is important to improve the spiritual intelligence of employees. Spiritual intelligence is a kind of connective thinking that provides a holistic approach; it is being aware of who you are and living life with this awareness. This research is designed to understand the nature of the relationship between spiritual intelligence and work perception. First, the related literature was reviewed in line with the research purpose. The survey instrument through which the research was conducted consists of the Spiritual Intelligence Self-Report Inventory (SISRI-24) and the Work Mentality (IGA) Questionnaire. The data were analyzed using the Statistical Package for the Social Sciences (SPSS 22). After profiles were determined from the demographic data of the sample, reliability and validity analyses of the instruments and factor analysis were performed. Correlation, regression, t-test and ANOVA techniques were used to analyse the data. Results of the study indicate that there is a statistically significant positive relationship between spiritual intelligence and work perception, and the proposed model was accepted. An increase in spiritual intelligence level positively affects work perception.

  5. Comparing individuals with learning disability and those with borderline IQ: a confirmatory factor analysis of the Wechsler Adult Intelligence Scale (3rd edition).

    OpenAIRE

    MacLean, Hannah Ng On-Nar

    2011-01-01

    Background: Support for the four-factor construct validity of the third edition of the Wechsler Adult Intelligence Scale (WAIS-III) has been found in clinical and non-clinical populations, but some studies question whether more complex models consistent with the concepts of fluid and crystallised intelligence provide a better explanation of the data. The WAIS-III is frequently used in the diagnosis of learning disability; however, previous exploratory factor analysis of data from a population ...

  6. Intelligence Naturelle et Intelligence Artificielle (Natural Intelligence and Artificial Intelligence)

    OpenAIRE

    Dubois, Daniel

    2011-01-01

    This article presents a systemic approach to the concept of natural intelligence, with the objective of creating an artificial intelligence. Natural intelligence, both human and non-human animal, is thus a function composed of faculties for knowing and understanding. Moreover, natural intelligence remains inseparable from its structure, namely the organs of the brain and the body. The temptation is great to endow computer systems with an artificial intelligence ...

  7. Energy efficiency analysis considering the use of an intelligent systems management in a smart home

    OpenAIRE

    Morais, H.; Fernandes, Filipe; Faria, Pedro; Vale, Zita

    2012-01-01

    This abstract presents an energy management system included in a SCADA system of an intelligent home. The system controls the home energy resources according to the players' definitions (electricity consumption and comfort levels), the real-time variation of electricity prices and the DR events proposed by the aggregators.

  8. Emotional Intelligence and ADHD: A Comparative Analysis in Students of Lima Metropolitan Area

    Science.gov (United States)

    Barahona, Luciana M.; Alegre, Alberto A.

    2016-01-01

    The following study aims to identify statistically significant differences in emotional intelligence skills between adolescent students with and without Attention Deficit Hyperactivity Disorder (ADHD). The study sample was composed of 44 students with an ADHD diagnosis and 192 students without ADHD; both groups were obtained by an intentional…

  9. A Comparative Analysis of the Emotional Intelligence Levels of American and Chinese Business Students

    Science.gov (United States)

    Margavio, Thomas M.; Margavio, Geanie W.; Hignite, Michael A.; Moses, Duane R.

    2012-01-01

    Emotional Intelligence (EI) is a characteristic of business students that has been the subject of significant research. This study was designed to extend that prior research by comparing the EI scores of American business students with those of Chinese business students. The study further focuses on those factors which may be related to ways in…

  10. Analysis of Students' Online Learning Readiness Based on Their Emotional Intelligence Level

    Science.gov (United States)

    Engin, Melih

    2017-01-01

    The objective of the present study is to determine whether there is a significant relationship between the students' readiness in online learning and their emotional intelligence levels. Correlational research method was used in the study. Online Learning Readiness Scale which was developed by Hung et al. (2010) has been used and Trait Emotional…

  11. Applications of artificial intelligence systems in the analysis of epidemiological data.

    Science.gov (United States)

    Flouris, Andreas D; Duffy, Jack

    2006-01-01

    A brief review of the germane literature suggests that the use of artificial intelligence (AI) statistical algorithms in epidemiology has been limited. We discuss the advantages and disadvantages of using AI systems in large-scale sets of epidemiological data to extract inherent, formerly unidentified, and potentially valuable patterns that human-driven deductive models may miss.

  12. An Analysis of Business Intelligence Maturity, Enterprise Size, and Environmental Factors

    Science.gov (United States)

    Walker, Karen M.

    2017-01-01

    Business intelligence (BI) maturity for small and medium-sized enterprises (SMEs) is significantly behind larger companies that utilize BI solutions. Successful data oriented business environments require knowledge and insight to understand organizational capabilities. This quantitative correlational study assessed the relationship between…

  13. Implementing dashboards as a business intelligence tool in the forest inventory and analysis program

    Science.gov (United States)

    Scott A. Pugh; Randall S. Morin; Barbara A. Johnson

    2015-01-01

    Today is the era of “big data” where businesses have access to enormous amounts of often complex and sometimes unwieldy data. Businesses are using business intelligence (BI) systems to transform this data into useful information for management decisions. BI systems integrate applications, processes, data, and people to deliver prompt and robust analyses. A number of...

  14. An Analysis of Turkish Students' Perception of Intelligence from Primary School to University

    Science.gov (United States)

    Beyaztas, Dilek Ilhan; Hymer, Barry

    2018-01-01

    The aim of this descriptive study was to determine the features of intelligence perceptions according to age, gender, class level, school success level and university departments. Two different scales by Dweck (2000) for both adults and children were translated into Turkish. These scales were then applied to 1350 Turkish students ranging from…

  15. Presenting the networked home: a content analysis of promotion material of Ambient Intelligence applications

    NARCIS (Netherlands)

    Ben Allouch, Soumaya; van Dijk, Johannes A.G.M.; Peters, O.

    2006-01-01

    Ambient Intelligence (AmI) for the home uses information and communication technologies to make users’ everyday life more comfortable. AmI is still in its developmental phase and is headed towards the first stages of diffusion. Characteristics of AmI design can be observed, among others, in the

  16. The Relation between Intelligence and Adaptive Behavior: A Meta-Analysis

    Science.gov (United States)

    Alexander, Ryan M.

    2017-01-01

    Intelligence tests and adaptive behavior scales measure vital aspects of the multidimensional nature of human functioning. Assessment of each is a required component in the diagnosis or identification of intellectual disability, and both are frequently used conjointly in the assessment and identification of other developmental disabilities. The…

  17. Analysis of the actual state of enterprise competitive intelligence at home and abroad

    International Nuclear Information System (INIS)

    Li Nansheng

    2010-01-01

    Based on the actual state of enterprise competitive intelligence at home and abroad, and from the point of view of practice and theory, this paper sets forth the character of competitive intelligence (CI) and its influence on the enterprise, objectively analyzes the questions and challenges in using CI in our country, and brings up the methods and channels for setting up a CI system (CIS). (author)

  18. Space debris mitigation - engineering strategies

    Science.gov (United States)

    Taylor, E.; Hammond, M.

    The problem of space debris pollution is acknowledged to be of growing concern by space agencies, leading to recent activities in the field of space debris mitigation. A review of the current (and near-future) mitigation guidelines, handbooks, standards and licensing procedures has identified a number of areas where further work is required. In order for space debris mitigation to be implemented in spacecraft manufacture and operation, the authors suggest that debris-related criteria need to become design parameters (following the same process as applied to reliability and radiation). To meet these parameters, spacecraft manufacturers and operators will need processes (supported by design tools and databases and implementation standards). A particular aspect of debris mitigation, as compared with conventional requirements (e.g. radiation and reliability) is the current and near-future national and international regulatory framework and associated liability aspects. A framework for these implementation standards is presented, in addition to results of in-house research and development on design tools and databases (including collision avoidance in GTO and SSTO and evaluation of failure criteria on composite and aluminium structures).

  19. Elementary epistemological features of machine intelligence

    OpenAIRE

    Horvat, Marko

    2008-01-01

    Theoretical analysis of machine intelligence (MI) is useful for defining a common platform in both theoretical and applied artificial intelligence (AI). The goal of this paper is to set canonical definitions that can assist pragmatic research in both strong and weak AI. Described epistemological features of machine intelligence include relationship between intelligent behavior, intelligent and unintelligent machine characteristics, observable and unobservable entities and classification of in...

  20. EDDA 1.0: integrated simulation of debris flow erosion, deposition and property changes

    Science.gov (United States)

    Chen, H. X.; Zhang, L. M.

    2015-03-01

    Debris flow material properties change during the initiation, transportation and deposition processes, which influences the runout characteristics of the debris flow. A quasi-three-dimensional depth-integrated numerical model, EDDA (Erosion-Deposition Debris flow Analysis), is presented in this paper to simulate debris flow erosion, deposition and induced material property changes. The model considers changes in debris flow density, yield stress and dynamic viscosity during the flow process. The yield stress of the debris flow mixture determined at limit equilibrium using the Mohr-Coulomb equation is applicable to clear water flow, hyper-concentrated flow and fully developed debris flow. To assure numerical stability and computational efficiency at the same time, an adaptive time stepping algorithm is developed to solve the governing differential equations. Four numerical tests are conducted to validate the model. The first two tests involve a one-dimensional debris flow with constant properties and a two-dimensional dam-break water flow. The last two tests involve erosion and deposition, and the movement of multi-directional debris flows. The changes in debris flow mass and properties due to either erosion or deposition are shown to affect the runout characteristics significantly. The model is also applied to simulate a large-scale debris flow in Xiaojiagou Ravine to test the performance of the model in catchment-scale simulations. The results suggest that the model estimates well the volume, inundated area, and runout distance of the debris flow. The model is intended for use as a module in a real-time debris flow warning system.
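
    The adaptive time stepping mentioned above is, in depth-integrated flow models of this kind, commonly implemented as a CFL-type restriction on the fastest wave celerity in the domain. The Python sketch below illustrates that general idea under simple assumptions; it is not the scheme actually coded in EDDA, and the grid spacing, velocities and CFL number are placeholders.

        import math

        def adaptive_dt(h, u, v, dx, g=9.81, cfl=0.5, dt_max=1.0):
            """Choose a time step from a CFL-type condition on the fastest wave speed.

            h, u, v are flow depths and depth-averaged velocities on a grid of
            spacing dx. Illustrative only; not the EDDA implementation."""
            speed = max(
                (math.sqrt(ui * ui + vi * vi) + math.sqrt(g * hi)
                 for hi, ui, vi in zip(h, u, v) if hi > 0.0),
                default=0.0,
            )
            return dt_max if speed == 0.0 else min(dt_max, cfl * dx / speed)

        # A shallow, fast-moving front limits the step well below dt_max.
        print(adaptive_dt(h=[0.5, 1.2, 0.1], u=[3.0, 5.0, 8.0], v=[0.0, 0.5, 0.2], dx=2.0))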

  1. Rainfall threshold calculation for debris flow early warning in areas with scarcity of data

    Science.gov (United States)

    Pan, Hua-Li; Jiang, Yuan-Jun; Wang, Jun; Ou, Guo-Qiang

    2018-05-01

    Debris flows are natural disasters that frequently occur in mountainous areas, usually accompanied by serious loss of life and property. One of the most commonly used approaches to mitigate the risk associated with debris flows is the implementation of early warning systems based on well-calibrated rainfall thresholds. However, many mountainous areas have little data regarding rainfall and hazards, especially in debris-flow-forming regions. Therefore, the traditional statistical analysis method that determines the empirical relationship between rainstorms and debris flow events cannot be effectively used to calculate reliable rainfall thresholds in these areas. After the severe Wenchuan earthquake, abundant loose deposits accumulated in the gullies, which resulted in several debris flow events, and the triggering rainfall threshold decreased markedly. To get a reliable and accurate rainfall threshold and improve the accuracy of debris flow early warning, this paper developed a quantitative method, suited to debris flow triggering mechanisms in meizoseismal areas, to identify the rainfall threshold for debris flow early warning in areas with a scarcity of data based on the initiation mechanism of hydraulic-driven debris flow. First, we studied the characteristics of the study area, including meteorology, hydrology, topography and physical characteristics of the loose solid materials. Then, the rainfall threshold was calculated from the initiation mechanism of the hydraulic debris flow. The comparison with other models and with alternate configurations demonstrates that the proposed rainfall threshold curve is a function of the antecedent precipitation index (API) and 1 h rainfall. To test the proposed method, we selected the Guojuanyan gully, a typical debris flow valley that during the 2008-2013 period experienced several debris flow events, located in the meizoseismal areas of the Wenchuan earthquake, as a case study. The comparison with other
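
    The threshold curve described above combines the antecedent precipitation index (API) with the 1 h rainfall. The sketch below shows the usual decayed-sum form of the API and a generic linear warning rule in the (API, 1 h rainfall) plane; the decay constant and the coefficients a and b are placeholders, not the calibrated values from the paper.

        def antecedent_precipitation_index(daily_rain_mm, k=0.84):
            """API as a decayed sum of earlier daily rainfall: sum_i k**i * P_i,
            where i = 1 is yesterday. The decay constant k is site-dependent."""
            return sum((k ** (i + 1)) * p for i, p in enumerate(reversed(daily_rain_mm)))

        def exceeds_threshold(api_mm, rain_1h_mm, a=30.0, b=0.25):
            """Hypothetical linear threshold: warn when R_1h >= a - b * API."""
            return rain_1h_mm >= a - b * api_mm

        api = antecedent_precipitation_index([12.0, 0.0, 35.0, 8.0])  # last four days, mm
        print(round(api, 1), exceeds_threshold(api, rain_1h_mm=22.0))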

  2. Driving the brain towards creativity and intelligence: A network control theory analysis.

    Science.gov (United States)

    Kenett, Yoed N; Medaglia, John D; Beaty, Roger E; Chen, Qunlin; Betzel, Richard F; Thompson-Schill, Sharon L; Qiu, Jiang

    2018-01-04

    High-level cognitive constructs, such as creativity and intelligence, entail complex and multiple processes, including cognitive control processes. Recent neurocognitive research on these constructs highlight the importance of dynamic interaction across neural network systems and the role of cognitive control processes in guiding such a dynamic interaction. How can we quantitatively examine the extent and ways in which cognitive control contributes to creativity and intelligence? To address this question, we apply a computational network control theory (NCT) approach to structural brain imaging data acquired via diffusion tensor imaging in a large sample of participants, to examine how NCT relates to individual differences in distinct measures of creative ability and intelligence. Recent application of this theory at the neural level is built on a model of brain dynamics, which mathematically models patterns of inter-region activity propagated along the structure of an underlying network. The strength of this approach is its ability to characterize the potential role of each brain region in regulating whole-brain network function based on its anatomical fingerprint and a simplified model of node dynamics. We find that intelligence is related to the ability to "drive" the brain system into easy to reach neural states by the right inferior parietal lobe and lower integration abilities in the left retrosplenial cortex. We also find that creativity is related to the ability to "drive" the brain system into difficult to reach states by the right dorsolateral prefrontal cortex (inferior frontal junction) and higher integration abilities in sensorimotor areas. Furthermore, we found that different facets of creativity-fluency, flexibility, and originality-relate to generally similar but not identical network controllability processes. We relate our findings to general theories on intelligence and creativity. Copyright © 2018 Elsevier Ltd. All rights reserved.
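
    The controllability measures referred to above are typically computed from the structural connectivity matrix of a linearised discrete-time model x(t+1) = A x(t) + B u(t). The sketch below follows the common recipe of normalising A for stability and taking the trace of the controllability Gramian with each region as the sole control input ("average controllability"); it illustrates the general approach, not the authors' exact pipeline.

        import numpy as np
        from scipy.linalg import solve_discrete_lyapunov

        def average_controllability(A):
            """Per-node average controllability of x(t+1) = A_norm x(t) + B u(t).

            A is a (symmetric, nonnegative) structural connectivity matrix; A_norm
            is A scaled by 1 + its largest singular value so the system is stable."""
            A = np.asarray(A, dtype=float)
            A_norm = A / (1.0 + np.linalg.svd(A, compute_uv=False)[0])
            n = A_norm.shape[0]
            scores = np.empty(n)
            for i in range(n):
                B = np.zeros((n, 1))
                B[i, 0] = 1.0
                gramian = solve_discrete_lyapunov(A_norm, B @ B.T)
                scores[i] = np.trace(gramian)
            return scores

        # Toy three-region connectivity matrix.
        A = np.array([[0.0, 1.0, 0.5],
                      [1.0, 0.0, 0.2],
                      [0.5, 0.2, 0.0]])
        print(average_controllability(A))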

  3. Drone Use in Monitoring Open Ocean Surface Debris, Including Paired Manta and Tucker Trawls for Relating Sea State to Vertical Debris Distribution

    Science.gov (United States)

    Lattin, G.

    2016-02-01

    Monitoring debris at sea presents challenges not found in beach or riverine habitats, and is typically done with trawl nets of various apertures and mesh sizes, which limits the size of debris captured and the area surveyed. To partially overcome these limitations in monitoring floating debris, a quadcopter drone with video transmitting and recording capabilities was deployed at the beginning and the end of manta trawl transects within the North Pacific Subtropical Gyre's eastern convergence zone. Subsurface tucker trawls at 10 meters were conducted at the same time as the manta trawls, in order to assess the effect of sea state on debris dispersal. Trawls were conducted on an 11-station grid used repeatedly since 1999. For drone observations, the operator and observer were stationed on the mother ship while two researchers collected observed debris using a rigid inflatable boat (RIB). The drone was flown to a distance of approximately 100 meters from the vessel in a zigzag or circular search pattern. Here we examine issues arising from drone deployment during the survey: 1) relation of area surveyed by drone to volume of water passing through the trawl; 2) retrieval of drone-spotted and associated RIB-spotted debris; 3) integration of post-flight image analysis into retrieved debris quantification; and 4) factors limiting drone effectiveness at sea. During the survey, debris too large for the manta trawl was spotted by the drone, and significant debris not observed using the drone was recovered by the RIB. The combination of drone sightings, RIB retrieval, and post-flight image analysis leads to improved monitoring of debris at sea. We also examine the distribution of floating debris during sea states varying from 0 to 5 by comparing quantities from surface manta trawls to the tucker trawls at a nominal depth of 10 meters.

  4. Disaster Debris Recovery Database - Landfills

    Science.gov (United States)

    The US EPA Region 5 Disaster Debris Recovery Database includes public datasets of over 6,000 composting facilities, demolition contractors, transfer stations, landfills and recycling facilities for construction and demolition materials, electronics, household hazardous waste, metals, tires, and vehicles in the states of Illinois, Indiana, Iowa, Kentucky, Michigan, Minnesota, Missouri, North Dakota, Ohio, Pennsylvania, South Dakota, West Virginia and Wisconsin. In this update, facilities in the 7 states that border the EPA Region 5 states were added to assist interstate disaster debris management. Also, the datasets for composters, construction and demolition recyclers, demolition contractors, and metals recyclers were verified and source information added for each record using these sources: AGC, Biocycle, BMRA, CDRA, ISRI, NDA, USCC, FEMA Debris Removal Contractor Registry, EPA Facility Registry System, and State and local listings.

  5. Disaster Debris Recovery Database - Recovery

    Science.gov (United States)

    The US EPA Region 5 Disaster Debris Recovery Database includes public datasets of over 6,000 composting facilities, demolition contractors, transfer stations, landfills and recycling facilities for construction and demolition materials, electronics, household hazardous waste, metals, tires, and vehicles in the states of Illinois, Indiana, Iowa, Kentucky, Michigan, Minnesota, Missouri, North Dakota, Ohio, Pennsylvania, South Dakota, West Virginia and Wisconsin. In this update, facilities in the 7 states that border the EPA Region 5 states were added to assist interstate disaster debris management. Also, the datasets for composters, construction and demolition recyclers, demolition contractors, and metals recyclers were verified and source information added for each record using these sources: AGC, Biocycle, BMRA, CDRA, ISRI, NDA, USCC, FEMA Debris Removal Contractor Registry, EPA Facility Registry System, and State and local listings.

  6. New advances for modelling the debris avalanches

    Science.gov (United States)

    Cuomo, Sabatino; Cascini, Leonardo; Pastor, Manuel; Castorino, Giuseppe Claudio

    2013-04-01

    Flow-like landslides are a major global hazard and they occur worldwide causing a large number of casualties, significant structural damage to property and infrastructure as well as economic losses. When involving open slopes, these landslides often occur in triangular source areas where initial slides turn into avalanches through further failures and/or eventual soil entrainment. This paper deals with the numerical modelling of the propagation stage of debris avalanches, which provides information such as the propagation pattern of the mobilized material, its velocity, thickness and run-out distance. In the paper, a "depth integrated" model is used which allows: i) adequately taking into account the irregular topography of real slopes, which greatly affects the propagation stage, and ii) using a less time-consuming model than fully 3D approaches. The model used, named "GeoFlow_SPH", was formerly applied to theoretical, experimental and real case histories (Pastor et al., 2009; Cascini et al., 2012). In this work the behavior of debris avalanches is analyzed with special emphasis on the apical angle, one of the main features of this type of landslide, in relation to soil rheology, hillslope geometry and features of the triggering area. Furthermore, the role of erosion has been investigated with reference to the uppermost parts of open slopes with different steepness. These analyses are firstly carried out for simplified benchmark slopes, using both water-like materials (with no shear strength) and debris-type materials. Then, three important case studies of the Campania region (Cervinara, Nocera Inferiore and Sarno) are analyzed, where debris avalanches involved pyroclastic soils originating from the eruptive products of Vesuvius volcano. The results achieved for both benchmark slopes and real case histories outline the key role played by erosion in the whole propagation stage of debris avalanches. The results are particularly satisfactory since they indicate the

  7. Debris Flows and Related Phenomena

    Science.gov (United States)

    Ancey, C.

    Torrential floods are a major natural hazard, claiming thousands of lives and millions of dollars in lost property each year in almost all mountain areas on the Earth. After the catastrophic eruption of Mount St. Helens in the USA in May 1980, water from melting snow, torrential rains from the eruption cloud, and water displaced from Spirit Lake mixed with deposited ash and debris to produce very large debris flows and cause extensive damage and loss of life [1]. During the 1985 eruption of Nevado del Ruiz in Colombia, more than 20,000 people perished when a large debris flow, triggered by the rapid melting of snow and ice at the volcano summit, swept through the town of Armero [2]. In 1991, the eruption of Pinatubo volcano in the Philippines dispersed more than 5 cubic kilometres of volcanic ash into surrounding valleys. Much of that sediment has subsequently been mobilised as debris flows by typhoon rains and has devastated more than 300 square kilometres of agricultural land. Even in European countries, recent events show that torrential floods may have very destructive effects (Sarno and Quindici in southern Italy in May 1998, where approximately 200 people were killed). The catastrophic character of these floods in mountainous watersheds is a consequence of significant transport of materials associated with water flows. Two limiting flow regimes can be distinguished. Bed load and suspension refer to dilute transport of sediments within water. This means that water is the main agent in the flow dynamics and that the particle concentration does not exceed a few percent. Such flows are typically two-phase flows. In contrast, debris flows are mass movements of concentrated slurries of water, fine solids, rocks and boulders. As a first approximation, debris flows can be treated as one-phase flows and their flow properties can be studied using classical rheological methods. The study of debris flows is a very exciting albeit immature science, made up of disparate elements

  8. Transient debris freezing and potential wall melting during a severe reactivity initiated accident experiment

    International Nuclear Information System (INIS)

    El-Genk, M.S.; Moore, R.L.

    1981-01-01

    It is important to light water reactor (LWR) safety analysis to understand the transient freezing of molten core debris on cold structures following a hypothetical core meltdown accident. The purpose of this paper is to (a) present the results of a severe reactivity initiated accident (RIA) in-pile experiment with regard to molten debris distribution and freezing following test fuel rod failure, (b) analyze the transient freezing of molten debris (primarily a mixture of UO/sub 2/ fuel and Zircaloy cladding) deposited on the inner surface of the test shroud wall upon rod failure, and (c) assess the potential for wall melting upon being contacted by the molten debris. 26 refs

  9. The AAL project: automated monitoring and intelligent analysis for the ATLAS data taking infrastructure

    International Nuclear Information System (INIS)

    Kazarov, A; Miotto, G Lehmann; Magnoni, L

    2012-01-01

    The Trigger and Data Acquisition (TDAQ) system of the ATLAS experiment at CERN is the infrastructure responsible for collecting and transferring ATLAS experimental data from detectors to the mass storage system. It relies on a large, distributed computing environment, including thousands of computing nodes with thousands of applications running concurrently. In such a complex environment, information analysis is fundamental for controlling application behavior, error reporting and operational monitoring. During data taking runs, streams of messages sent by applications via the message reporting system, together with data published from applications via information services, are the main sources of knowledge about the correctness of running operations. The flow of data produced (with an average rate of O(1-10 kHz)) is constantly monitored by experts to detect problems or misbehavior. This requires strong competence and experience in understanding and discovering problems and root causes, and often the meaningful information is not in a single message or update, but in the aggregated behavior over a certain time-line. The AAL project is aimed at reducing the manpower needs and at assuring a constantly high quality of problem detection by automating most of the monitoring tasks and providing real-time correlation of data-taking and system metrics. This project combines technologies coming from different disciplines; in particular it leverages an Event Driven Architecture to unify the flow of data from the ATLAS infrastructure, a Complex Event Processing (CEP) engine for correlation of events and a message oriented architecture for component integration. The project is composed of two main components: a core processing engine, responsible for correlation of events through expert-defined queries, and a web-based front-end to present real-time information and interact with the system. All components work in a loosely coupled event-based architecture, with a message broker

  10. The AAL project: automated monitoring and intelligent analysis for the ATLAS data taking infrastructure

    Science.gov (United States)

    Kazarov, A.; Lehmann Miotto, G.; Magnoni, L.

    2012-06-01

    The Trigger and Data Acquisition (TDAQ) system of the ATLAS experiment at CERN is the infrastructure responsible for collecting and transferring ATLAS experimental data from detectors to the mass storage system. It relies on a large, distributed computing environment, including thousands of computing nodes with thousands of applications running concurrently. In such a complex environment, information analysis is fundamental for controlling application behavior, error reporting and operational monitoring. During data taking runs, streams of messages sent by applications via the message reporting system, together with data published from applications via information services, are the main sources of knowledge about the correctness of running operations. The flow of data produced (with an average rate of O(1-10 kHz)) is constantly monitored by experts to detect problems or misbehavior. This requires strong competence and experience in understanding and discovering problems and root causes, and often the meaningful information is not in a single message or update, but in the aggregated behavior over a certain time-line. The AAL project is aimed at reducing the manpower needs and at assuring a constantly high quality of problem detection by automating most of the monitoring tasks and providing real-time correlation of data-taking and system metrics. This project combines technologies coming from different disciplines; in particular it leverages an Event Driven Architecture to unify the flow of data from the ATLAS infrastructure, a Complex Event Processing (CEP) engine for correlation of events and a message oriented architecture for component integration. The project is composed of two main components: a core processing engine, responsible for correlation of events through expert-defined queries, and a web-based front-end to present real-time information and interact with the system. All components work in a loosely coupled event-based architecture, with a message broker
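
    The correlation queries mentioned above aggregate events over time rather than reacting to single messages. The toy Python sketch below captures that idea with a sliding-window rule that flags an application emitting too many ERROR messages within a time window; it is only an illustration of the pattern, not the AAL engine or its query language, and the names and thresholds are hypothetical.

        from collections import defaultdict, deque

        class SlidingWindowAlert:
            """Flag an application that emits more than `limit` ERROR messages
            within `window` seconds (a toy stand-in for a CEP correlation query)."""

            def __init__(self, window=60.0, limit=10):
                self.window, self.limit = window, limit
                self.errors = defaultdict(deque)  # application name -> error timestamps

            def on_message(self, timestamp, app, severity):
                if severity != "ERROR":
                    return None
                q = self.errors[app]
                q.append(timestamp)
                while q and timestamp - q[0] > self.window:  # drop events outside the window
                    q.popleft()
                if len(q) > self.limit:
                    return f"{app}: {len(q)} errors in the last {self.window:.0f} s"
                return None

        alerter = SlidingWindowAlert(window=60.0, limit=3)
        for t in (0.0, 10.0, 20.0, 25.0):
            print(alerter.on_message(t, "example-app", "ERROR"))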

  11. Analysis and Modeling of the Galvanic Skin Response Spontaneous Component in the context of Intelligent Biofeedback Systems Development

    Science.gov (United States)

    Unakafov, A.

    2009-01-01

    The paper presents an approach to the analysis and modeling of the galvanic skin response (GSR) spontaneous component. In the study a classification of biofeedback training methods is given and the importance of developing intelligent methods is shown. The INTENS method, which is promising for intellectualization, is presented. An important problem of biofeedback training method intellectualization - estimation of the GSR spontaneous component - is solved in the main part of the work. Its main characteristics are described and results of modeling the GSR spontaneous component are shown. Results of a small study of an optimal material for GSR probes are also presented.

  12. Artificial Intelligence Study (AIS).

    Science.gov (United States)

    1987-02-01

    Artificial Intelligence Study (AIS), report CAA-RP-87-1, US Army Concepts Analysis Agency, Bethesda, MD, February 1987, prepared by R. B. Nojeski. (Only report cover and table-of-contents fragments are recoverable from the scanned source; listed sections include "Artificial Intelligence Hardware", "AI Architecture" and "AI Hardware".)

  13. Marine Debris Research, Prevention, and Reduction Act

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Marine Debris Research, Prevention, and Reduction Act legally establishes the National Oceanic and Atmospheric Administration's (NOAA) Marine Debris Program. The...

  14. The ecological impacts of marine debris

    NARCIS (Netherlands)

    Rochman, Chelsea M.; Browne, Mark Anthony; Underwood, A.J.; Franeker, Van Jan A.; Thompson, Richard C.; Amaral-Zettler, Linda A.

    2016-01-01

    Anthropogenic debris contaminates marine habitats globally, leading to several perceived ecological impacts. Here, we critically and systematically review the literature regarding impacts of debris from several scientific fields to understand the weight of evidence regarding the ecological

  15. Space Debris Elimination (SpaDE)

    Data.gov (United States)

    National Aeronautics and Space Administration — The amount of debris in low Earth orbit (LEO) has increased rapidly over the last twenty years. This prevalence of debris increases the likelihood of cascading...

  16. A Novel Biometric Identification Based on a User's Input Pattern Analysis for Intelligent Mobile Devices

    Directory of Open Access Journals (Sweden)

    Hojin Seo

    2012-07-01

    Full Text Available As intelligent mobile devices become more popular, security threats targeting them are increasing. The resource constraints of mobile devices, such as battery life and computing power, however, make it harder to handle such threats effectively. The existing physical and behavioural biometric identification methods - looked upon as good alternatives - are unsuitable for the current mobile environment. This paper proposes a specially designed biometric identification method for intelligent mobile devices that analyses the user's input patterns, such as a finger's touch duration, pressure level and the touching width of the finger on the touch screen. We collected the input pattern data of individuals to empirically test our method. Our testing results show that this method effectively identifies users with an accuracy rate of nearly 100%.
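
    As a minimal illustration of the kind of input-pattern matching described above, the sketch below enrols users from (touch duration, pressure, touch width) samples and identifies a new sample by the nearest per-user centroid. It is a deliberately simple stand-in, not the authors' classifier, and the feature values are made up.

        import numpy as np

        def enroll(samples_by_user):
            """Template per user: mean of enrolment feature vectors
            (touch duration in s, pressure level, touch width in mm)."""
            return {user: np.mean(np.asarray(s, dtype=float), axis=0)
                    for user, s in samples_by_user.items()}

        def identify(templates, sample):
            """Return the enrolled user whose template is closest to the sample."""
            x = np.asarray(sample, dtype=float)
            return min(templates, key=lambda user: np.linalg.norm(templates[user] - x))

        templates = enroll({
            "alice": [(0.12, 0.40, 6.1), (0.11, 0.38, 6.0)],
            "bob":   [(0.25, 0.70, 8.3), (0.27, 0.66, 8.0)],
        })
        print(identify(templates, (0.13, 0.41, 6.2)))  # -> alice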

  17. Descriptive business intelligence analysis: cutting edge strategic asset for SMEs, is it really worth it?

    Directory of Open Access Journals (Sweden)

    Sivave Mashingaidze

    2014-10-01

    Full Text Available The purpose of this article is to provide a framework for the understanding and adoption of Business Intelligence by small and medium-sized enterprises (SMEs) within the Zimbabwean economy. The article explores every facet of Business Intelligence, including internal and external BI, as a cutting-edge strategic asset. A descriptive research methodology has been adopted. The article revealed some BI critical success factors for better BI implementation. Findings revealed that organizations which have the greatest success with BI travel an evolutionary path, starting with basic data and analytical tools and transitioning to increasingly sophisticated capabilities until BI becomes an intrinsic part of their business culture and ROI is realized. Findings are useful for managers, policy makers, business analysts, and IT specialists in dealing with the planning and implementation of BI systems in SMEs.

  18. Difficulties in Defining Social-Emotional Intelligence, Competences and Skills - a Theoretical Analysis and Structural Suggestion

    Directory of Open Access Journals (Sweden)

    Moana Monnier

    2015-04-01

    Full Text Available Demands related to the frequency of and time required for interactional tasks in everyday occupational routines are continuously growing. When it comes to qualifying a person's ability to interact with others, two prototypical concepts are often used: social competences and emotional intelligence. In connection with discussions about curriculum standards in Germany, these are viewed as important attributes that should be taught, supported and if possible assessed in educational pathways toward an occupation (KMK, 2007). However, in looking for a generally approved and widely used definition, many problems arise on the inter-conceptual and intra-conceptual level, triggering implementation difficulties in educational curricula. This article highlights these difficulties by selecting five well-established key theories and comparing their commonalities and differences. Analyzing definitions of intelligence, competences and skills, taking an action regulation perspective and highlighting the interdependence of social and emotional aspects, a structural system to facilitate the transfer into the educational context is proposed.

  19. Application of artificial intelligence principles to the analysis of "crazy" speech.

    Science.gov (United States)

    Garfield, D A; Rapp, C

    1994-04-01

    Artificial intelligence computer simulation methods can be used to investigate psychotic or "crazy" speech. Here, symbolic reasoning algorithms establish semantic networks that schematize speech. These semantic networks consist of two main structures: case frames and object taxonomies. Node-based reasoning rules apply to object taxonomies and pathway-based reasoning rules apply to case frames. Normal listeners may recognize speech as "crazy talk" based on violations of node- and pathway-based reasoning rules. In this article, three separate segments of schizophrenic speech illustrate violations of these rules. This artificial intelligence approach is compared and contrasted with other neurolinguistic approaches and is discussed as a conceptual link between neurobiological and psychodynamic understandings of psychopathology.

  20. From curve fitting to machine learning an illustrative guide to scientific data analysis and computational intelligence

    CERN Document Server

    Zielesny, Achim

    2016-01-01

    This successful book provides in its second edition an interactive and illustrative guide from two-dimensional curve fitting to multidimensional clustering and machine learning with neural networks or support vector machines. Along the way topics like mathematical optimization or evolutionary algorithms are touched. All concepts and ideas are outlined in a clear cut manner with graphically depicted plausibility arguments and a little elementary mathematics. The major topics are extensively outlined with exploratory examples and applications. The primary goal is to be as illustrative as possible without hiding problems and pitfalls but to address them. The character of an illustrative cookbook is complemented with specific sections that address more fundamental questions like the relation between machine learning and human intelligence. All topics are completely demonstrated with the computing platform Mathematica and the Computational Intelligence Packages (CIP), a high-level function library developed with M...

  1. Online Behavior Analysis-Based Student Profile for Intelligent E-Learning

    OpenAIRE

    Liang, Kun; Zhang, Yiying; He, Yeshen; Zhou, Yilin; Tan, Wei; Li, Xiaoxia

    2017-01-01

    With the development of mobile platform, such as smart cellphone and pad, the E-Learning model has been rapidly developed. However, due to the low completion rate for E-Learning platform, it is very necessary to analyze the behavior characteristics of online learners to intelligently adjust online education strategy and enhance the quality of learning. In this paper, we analyzed the relation indicators of E-Learning to build the student profile and gave countermeasures. Adopting the similarit...

  2. The Impact of the Information Revolution on Policymakers’ Use of Intelligence Analysis

    Science.gov (United States)

    2005-01-01

    Revolution on the Market for Information: The major effect of this explosion in technology is the proliferation of information consumers and providers. Their... technology. The State Department gives its policymakers the least connectivity to any real-time or electronic information, as its officials lack Internet ...connectivity than others to the intelligence community, as well as to the Internet and other sources of information-age open sources. This will have a profound

  3. Analysis of Changes in Market Shares of Commercial Banks Operating in Turkey Using Computational Intelligence Algorithms

    OpenAIRE

    Amasyali, M. Fatih; Demırhan, Ayse; Bal, Mert

    2014-01-01

    This paper aims to model the change in market share of 30 domestic and foreign banks, which have been operating between the years 1990 and 2009 in Turkey by taking into consideration 20 financial ratios of those banks. Due to the fragile structure of the banking sector in Turkey, this study plays an important role for determining the changes in market share of banks and taking the necessary measures promptly. For this reason, computational intelligence methods have been used in the study. Acc...

  4. Mathematic Modeling and Performance Analysis of an Adaptive Congestion Control in Intelligent Transportation Systems

    OpenAIRE

    Naja, Rola; Université de Versailles

    2015-01-01

    In this paper, we develop a preventive congestion control mechanism applied at highway entrances and devised for Intelligent Transportation Systems (ITS). The proposed mechanism provides a vehicular admission control, regulates input traffic and performs vehicular traffic shaping. Our congestion control mechanism includes two classes of vehicles and is based on a specific priority ticket pool scheme with queue-length threshold scheduling policy, tailored to vehicular networks. In an attempt t...

  5. Economic intelligence of the modern state

    OpenAIRE

    Levytskyi, Valentyn

    2001-01-01

    The goal of the thesis is to explore economic intelligence. The work includes the analysis of open sources. The approach to the issue of economic intelligence is based on the analysis of the state's economic security. The research presents the views of politicians, intelligence professionals, and scientists. It proposes possible objectives and missions of economic intelligence. Additionally, the research investigates the usefulness and reliability of open sources of economic analysis. The se...

  6. DebriSat Hypervelocity Impact Test

    Science.gov (United States)

    2015-08-01

    Public release; distribution unlimited. Targets: Scaled Multishock Shield, DebrisLV, and DebriSat; 500-600 g hollow aluminum and nylon projectile... insulation. DebriSat’s internal components were structurally similar to real flight hardware but were nonfunctional. ...structures with an AL 5052 honeycomb core and M55J carbon fiber face sheets. The basic system characteristics of the DebriSat are given in Table 1

  7. A UK-wide analysis of trait emotional intelligence within the radiography profession

    International Nuclear Information System (INIS)

    Mackay, S.J.; Hogg, P.; Cooke, G.; Baker, R.D.; Dawkes, T.

    2012-01-01

    The aim of this study was to profile the Trait emotional intelligence (EI) of the radiography profession, explore any differences between subgroups, compare the profession with a normative group and investigate the relationship between EI and the leaders of the profession. An online UK-wide survey was conducted using the Trait Emotional Intelligence Questionnaire, a self-report measure. Three main analyses were undertaken to investigate any differences between the sample and population, the radiographer subgroups and the sample and a normative group. The sample had similar characteristics to the population. There were differences between types of radiographer, with nuclear medicine radiographers scoring consistently lower than other groups. There were differences between the leaders and other members of the profession particularly in the Sociability factor. Radiographers scored higher than the TEIQue normative group for Global EI and three of the four factors. The study has benchmarked the Trait EI of one healthcare profession and identified areas for future research to develop our understanding of emotional intelligence.

  8. Participatory Sensing Marine Debris: Current Trends and Future Opportunities

    Science.gov (United States)

    Jambeck, J.; Johnsen, K.

    2016-02-01

    The monitoring of litter and debris is challenging at the global scale because of spatial and temporal variability, disconnected local organizations and the use of paper and pen for documentation. The Marine Debris Tracker mobile app and citizen science program allows for the collection of global standardized data at a scale, speed and efficiency that was not previously possible. The app itself also serves as an outreach and education tool, creating an engaged participatory sensing instrument. This instrument is characterized by several aspects including range and frequency, accuracy and precision, accessibility, measurement dimensions, participant performance, and statistical analysis. Also, important to Marine Debris Tracker is open data and transparency. A web portal provides data that users have logged allowing immediate feedback to users and additional education opportunities. The engagement of users through a top tracker competition and social media keeps participants interested in the Marine Debris Tracker community. Over half a million items have been tracked globally, and maps provide both global and local distribution of data. The Marine Debris Tracker community and dataset continues to grow daily. We will present current usage and engagement, participatory sensing data distributions, choropleth maps of areas of active tracking, and discuss future technologies and platforms to expand data collection and conduct statistical analysis.

  9. Charged Coupled Device Debris Telescope Observations of the Geosynchronous Orbital Debris Environment - Observing Year: 1998

    Science.gov (United States)

    Jarvis, K. S.; Thumm, T. L.; Matney, M. J.; Jorgensen, K.; Stansbery, E. G.; Africano, J. L.; Sydney, P. F.; Mulrooney, M. K.

    2002-01-01

    NASA has been using the charged coupled device (CCD) debris telescope (CDT)--a transportable 32-cm Schmidt telescope located near Cloudcroft, New Mexico--to help characterize the debris environment in geosynchronous Earth orbit (GEO). The CDT is equipped with a SITe 512 x 512 CCD camera whose 24 µm (12.5 arc sec) pixels produce a 1.7 x 1.7-deg field of view. The CDT system can therefore detect 17th-magnitude objects in a 20-sec integration, corresponding to an approx. 0.6-m diameter, 0.20 albedo object at 36,000 km. The telescope pointing and CCD operation are computer controlled to collect data automatically for an entire night. The CDT has collected more than 1500 hrs of data since November 1997. This report describes the collection and analysis of 58 nights (approx. 420 hrs) of data acquired in 1998.
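
    The detection limit quoted above (17th magnitude corresponding to roughly a 0.6-m, 0.20-albedo object at 36,000 km) can be scaled to other magnitudes by noting that reflected flux grows with the square of the diameter, so d = d_ref * 10**(-0.2 * (m - m_ref)) at fixed range, albedo and phase angle. The sketch below applies that scaling; it is a back-of-the-envelope estimate, not the survey's photometric reduction.

        def diameter_from_magnitude(mag, ref_mag=17.0, ref_diameter_m=0.6):
            """Scale the CDT calibration point (17th magnitude ~ 0.6 m at 0.20 albedo
            and 36,000 km) to other magnitudes, assuming the same range, albedo and
            phase angle so that flux depends only on diameter squared."""
            return ref_diameter_m * 10 ** (-0.2 * (mag - ref_mag))

        for m in (15.0, 16.0, 17.0, 18.0):
            print(f"{m:.0f} mag -> {diameter_from_magnitude(m):.2f} m")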

  10. Photometric Studies of GEO Debris

    Science.gov (United States)

    Seitzer, Patrick; Cowardin, Heather M.; Barker, Edwin; Abercromby, Kira J.; Foreman, Gary; Horstman, Matt

    2009-01-01

    The photometric signature of a debris object can be useful in determining what the physical characteristics of a piece of debris are. We report on optical observations in multiple filters of debris at geosynchronous Earth orbit (GEO). Our sample is taken from GEO objects discovered in a survey with the University of Michigan's 0.6-m aperture Schmidt telescope MODEST (for Michigan Orbital DEbris Survey Telescope), and then followed up in real-time with the SMARTS (Small and Medium Aperture Research Telescope System) 0.9-m at CTIO for orbits and photometry. Our goal is to determine six-parameter orbits and measure colors for all objects fainter than R = 15th magnitude that are discovered in the MODEST survey. At this magnitude the distribution of observed angular rates changes significantly from that of brighter objects. There are two objectives: 1. Estimate the orbital distribution of objects selected on the basis of two observational criteria: brightness (magnitude) and angular rates. 2. Obtain magnitudes and colors in standard astronomical filters (BVRI) for comparison with reflectance spectra of likely spacecraft materials. What is the faint debris likely to be? In this paper we report on the photometric results. For a sample of 50 objects, more than 90 calibrated sequences of R-B-V-I-R magnitudes have been obtained with the CTIO 0.9-m. For objects that do not show large brightness variations, the colors are largely redder than solar in both B-R and R-I. The width of the color distribution may be intrinsic to the nature of the surfaces, but could also arise because we are seeing irregularly shaped objects and measuring the colors at different times with just one telescope. For a smaller sample of objects we have observed with synchronized CCD cameras on the two telescopes. The CTIO 0.9-m observes in B, and MODEST in R. The CCD cameras are electronically linked together so that the start time and duration of observations are the same to better than 50 milliseconds. Thus

  11. Phase shifting-based debris effect detection in USV-assisted AFM nanomachining

    Energy Technology Data Exchange (ETDEWEB)

    Shi, Jialin [State Key Laboratory of Robotics, Shenyang Institute of Automation, Chinese Academy of Sciences (CAS), Shenyang, Liaoning 110016 (China); University of the Chinese Academy of Sciences, Beijing 100049 (China); Beijing Advanced Innovation Center for Imaging Technology, Capital Normal University, Beijing 100049 (China); Liu, Lianqing, E-mail: lianqingliu@sia.cn [State Key Laboratory of Robotics, Shenyang Institute of Automation, Chinese Academy of Sciences (CAS), Shenyang, Liaoning 110016 (China); Yu, Peng; Cong, Yang [State Key Laboratory of Robotics, Shenyang Institute of Automation, Chinese Academy of Sciences (CAS), Shenyang, Liaoning 110016 (China); Li, Guangyong [Department of Electrical and Computer Engineering, University of Pittsburgh, Pittsburgh, PA 15213 (United States)

    2017-08-15

    Highlights: • The mechanism of the debris effect on machining depth in force control mode operation is analyzed. • The relationship between phase shifting and pile-up of debris is investigated. • The phase shifting-based method is hardly affected by the pile-up of debris. • Debris effect detection by the phase shifting-based method is achieved. - Abstract: Atomic force microscopy (AFM) mechanical-based lithography attracts much attention in nanomanufacturing due to its advantages of low cost, high precision and high resolution. However, debris effects during mechanical lithography often lead to an unstable machining process and inaccurate results, which limits further applications of AFM-based lithography. There is a lack of a real-time debris detection approach, which is the prerequisite for eventually eliminating the influence of the debris, and of a method that can solve the above problems well. The ultrasonic vibration (USV)-assisted AFM has the ability to sense the machining depth in real time by detecting the phase shift of the cantilever. However, whether the pile-up of debris affects the phase response of the cantilever had not been investigated. Therefore, we analyzed the mechanism of the debris effect in force control mode and investigated the relationship between phase shifting and the pile-up of debris. Theoretical analysis and experimental results reveal that the pile-up of debris has a negligible effect on the phase shift of the cantilever. Therefore, the phase shifting-based method can detect the debris effect on machining depth in the force control mode of AFM machining.

  12. Artificial Intelligence.

    Science.gov (United States)

    Wash, Darrel Patrick

    1989-01-01

    Making a machine seem intelligent is not easy. As a consequence, demand has been rising for computer professionals skilled in artificial intelligence and is likely to continue to go up. These workers develop expert systems and solve the mysteries of machine vision, natural language processing, and neural networks. (Editor)

  13. Intelligent Design

    DEFF Research Database (Denmark)

    Hjorth, Poul G.

    2005-01-01

    The notion that nature is designed by a divine 'intelligence' is a beautiful philosophical principle. Theories of Intelligent Design as a theory based on natural science are, on the other hand, utterly dreadful.

  14. Sediment budget analysis from Landslide debris and river channel change during the extreme event - example of Typhoon Morakot at Laonong river, Taiwan

    Science.gov (United States)

    Chang, Kuo-Jen; Huang, Yu-Ting; Huang, Mei-Jen; Chiang, Yi-Lin; Yeh, En-Chao; Chao, Yu-Jui

    2014-05-01

    Due to high seismicity and high annual rainfall, numerous landslides are triggered in Taiwan every year and severe impacts affect the island. Typhoon Morakot brought extreme and prolonged rainfall to Taiwan in August 2009, causing huge loss of life and property in central and southern Taiwan. The Laonong River is the largest tributary of the Gaoping River; its length is 137 km and its basin area is 1373 km2. Typhoon Morakot brought more than 2000 mm of rainfall to the region in August 2009, with maximum intensities exceeding 100 mm/hr. Its heavy rains triggered many landslides, and debris flowed into the river, causing accumulation and erosion along the river banks in different areas and severe disasters within the Laonong River drainage. In the past, the study of sediment blockage of river channels usually relied on field investigation, but owing to inconvenient transportation, topographical barriers, or remote locations, the survey sometimes can hardly be completed. In recent years, the rapid development of remote sensing technology has improved image resolution and quality significantly; remote sensing can provide a wide range of image data and essential, valuable information. Furthermore, although the amount of sediment transport can be estimated using data such as rainfall, river discharge, and suspended loads, large debris migration cannot be studied with those data. However, landslides, debris flows and river sediment transport in a catchment can be evaluated readily by analyzing digital terrain models (DTMs). The purpose of this study is to investigate the phenomenon of river channel migration and to evaluate the amount of migration along the Laonong River by analyzing DEMs acquired before and after Typhoon Morakot. The DEMs are built using aerial images taken by a digital mapping camera (DMC) and by the airborne digital scanner 40 (ADS 40) before and after the typhoon event. The results show that lateral
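
    Sediment budgets of the kind described above are usually obtained by differencing the pre- and post-event DEMs and summing elevation change over the raster cells. The sketch below shows that DEM-of-difference bookkeeping under simple assumptions (co-registered grids, a single noise threshold standing in for the DEM vertical accuracy); it is illustrative, not the workflow used in the study.

        import numpy as np

        def sediment_budget(dem_before, dem_after, cell_size_m, min_change_m=0.2):
            """Erosion and deposition volumes (m^3) from two co-registered DEMs.
            Elevation changes smaller than min_change_m are treated as noise."""
            dz = np.asarray(dem_after, float) - np.asarray(dem_before, float)
            dz[np.abs(dz) < min_change_m] = 0.0
            cell_area = cell_size_m ** 2
            deposition = dz[dz > 0].sum() * cell_area
            erosion = -dz[dz < 0].sum() * cell_area
            return erosion, deposition, deposition - erosion

        before = np.array([[100.0, 101.0], [102.0, 103.0]])
        after = np.array([[99.5, 101.0], [102.8, 103.1]])
        print(sediment_budget(before, after, cell_size_m=1.0))  # erosion, deposition, net (m^3)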

  15. Analysis of operator support method based on intelligent dynamic interlock in lead-cooled fast reactor simulator

    International Nuclear Information System (INIS)

    Xu, Peng; Wang, Jianye; Yang, Minghan; Wang, Weitian; Bai, Yunqing; Song, Yong

    2017-01-01

    Highlights: • We develop an operator support method based on intelligent dynamic interlock. • We offer an integrated aid system to reduce the workload of operators. • The method can help operators avoid dangerous, irreversible operations. • This method can be used in fusion research reactors in the future. - Abstract: In nuclear systems, operators have to carry out corrective actions when abnormal situations occur. However, operators might make mistakes under pressure. In order to avoid serious consequences of human errors, a new method for operator support based on intelligent dynamic interlock was proposed. The new method, based on a fully digital instrumentation and control system, contains a real-time alarm analysis process, a decision support process and an automatic safety interlock process. Once abnormal conditions occur, the necessary safety interlock parameters, based on the analysis of real-time alarms and the decision support process, can be loaded into human-machine interfaces and controllers automatically, effectively avoiding human errors. Furthermore, recommendations are made for the further use and development of this technique in nuclear power plants or fusion research reactors.

  16. A Comparative Analysis of Multiple Intelligence Theory with Relationship to Gender and Grade Level in Selected Schools in Ghana

    Science.gov (United States)

    Oteng, Ellen N.

    2012-01-01

    This dissertation examined the relationships between Howard Gardner's Multiple Intelligence Theory and students' gender, age, grade level, and enrollment in a public or private school. The research determined students' dominant intelligences and investigated whether students' intelligences may be influenced by demographic variables such as…

  17. The Professionalization of Intelligence Cooperation

    DEFF Research Database (Denmark)

    Svendsen, Adam David Morgan

    "Providing an in-depth insight into the subject of intelligence cooperation (officially known as liason), this book explores the complexities of this process. Towards facilitating a general understanding of the professionalization of intelligence cooperation, Svendsen's analysis includes risk...... management and encourages the realisation of greater resilience. Svendsen discusses the controversial, mixed and uneven characterisations of the process of the professionalization of intelligence cooperation and argues for a degree of 'fashioning method out of mayhem' through greater operational...

  18. Maximising Organisational Information Sharing and Effective Intelligence Analysis in Critical Data Sets. A case study on the information science needs of the Norwegian criminal intelligence and law enforcement community

    OpenAIRE

    Wilhelmsen, Sonja

    2009-01-01

    Organisational information sharing has become increasingly important as the amount of information grows. To accomplish the most effective and efficient sharing of information, analysis of both the information needs and the organisational needs is vital. This dissertation focuses on the information needs sourced through the critical data sets of law enforcement organisations, specifically the Norwegian criminal intelligence and law enforcement community represented by the Na...

  19. Detecting debris flows using ground vibrations

    Science.gov (United States)

    LaHusen, Richard G.

    1998-01-01

    Debris flows are rapidly flowing mixtures of rock debris, mud, and water that originate on steep slopes. During and following volcanic eruptions, debris flows are among the most destructive and persistent hazards. Debris flows threaten lives and property not only on volcanoes but far downstream in valleys that drain volcanoes where they arrive suddenly and inundate entire valley bottoms. Debris flows can destroy vegetation and structures in their path, including bridges and buildings. Their deposits can cover roads and railways, smother crops, and fill stream channels, thereby reducing their flood-carrying capacity and navigability.

  20. Gratitude mediates the effect of emotional intelligence on subjective well-being: A structural equation modeling analysis.

    Science.gov (United States)

    Geng, Yuan

    2016-11-01

    This study investigated the relationship among emotional intelligence, gratitude, and subjective well-being in a sample of university students. A total of 365 undergraduates completed the emotional intelligence scale, the gratitude questionnaire, and the subjective well-being measures. The results of the structural equation model showed that emotional intelligence is positively associated with gratitude and subjective well-being, that gratitude is positively associated with subjective well-being, and that gratitude partially mediates the positive relationship between emotional intelligence and subjective well-being. Bootstrap test results also revealed that emotional intelligence has a significant indirect effect on subjective well-being through gratitude.
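
    The mediation result reported here rests on a bootstrap test of the indirect effect (the product of the X→M and M→Y paths). The sketch below shows how such a test can be run on simulated data; the sample, effect sizes, and variable names are assumptions for illustration and do not reproduce the authors' analysis.

    ```python
    import numpy as np

    # Minimal bootstrap test of an indirect (mediated) effect with simulated data:
    # emotional intelligence (X) -> gratitude (M) -> subjective well-being (Y).
    # Sample size, effect sizes, and noise are illustrative assumptions.
    rng = np.random.default_rng(42)
    n = 365
    x = rng.normal(size=n)                        # emotional intelligence
    m = 0.4 * x + rng.normal(size=n)              # gratitude
    y = 0.3 * m + 0.2 * x + rng.normal(size=n)    # subjective well-being

    def indirect_effect(x, m, y):
        """a*b: slope of M on X times the partial slope of Y on M (controlling for X)."""
        a = np.polyfit(x, m, 1)[0]
        design = np.column_stack([m, x, np.ones_like(x)])   # Y regressed on M and X
        b = np.linalg.lstsq(design, y, rcond=None)[0][0]
        return a * b

    boot = []
    for _ in range(2000):
        idx = rng.integers(0, n, size=n)          # resample cases with replacement
        boot.append(indirect_effect(x[idx], m[idx], y[idx]))

    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(f"indirect effect = {indirect_effect(x, m, y):.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
    ```

    A percentile confidence interval that excludes zero is the usual criterion for a significant indirect effect, which is the form of evidence the abstract reports.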