WorldWideScience

Sample records for attack methodology analysis

  1. Attack Methodology Analysis: Emerging Trends in Computer-Based Attack Methodologies and Their Applicability to Control System Networks

    Energy Technology Data Exchange (ETDEWEB)

    Bri Rolston

    2005-06-01

    Threat characterization is a key component in evaluating the threat faced by control systems. Without a thorough understanding of the threat faced by critical infrastructure networks, adequate resources cannot be allocated or directed effectively to the defense of these systems. Traditional methods of threat analysis focus on identifying the capabilities and motivations of a specific attacker, assessing the value the adversary would place on targeted systems, and deploying defenses according to the threat posed by the potential adversary. However, so many effective exploits and tools are readily available to anyone with an Internet connection, minimal technical skill, and only modest motivation that the field of potential adversaries can no longer be narrowed effectively. Understanding how hackers evaluate new IT security research and incorporate significant new ideas into their own tools provides a means of anticipating how IT systems are most likely to be attacked in the future. This research, Attack Methodology Analysis (AMA), could supply pertinent information on how to detect and stop new types of attacks. Since the exploit methodologies and attack vectors developed in the general Information Technology (IT) arena can be converted for use against control system environments, assessing the areas in which cutting-edge exploit development and remediation techniques are occurring can provide significant intelligence for control system network exploitation and defense, as well as a means of assessing threat without identifying the specific capabilities of individual opponents. Attack Methodology Analysis begins with the study of the exploit technology and attack methodologies being developed in the Information Technology (IT) security research community, spanning both the black hat and white hat communities. 
Once a solid understanding of the cutting edge security research is established, emerging trends in attack methodology can be identified and the gap between

  2. Attack Methodology Analysis: SQL Injection Attacks and Their Applicability to Control Systems

    Energy Technology Data Exchange (ETDEWEB)

    Bri Rolston

    2005-09-01

    Database applications have become a core component in control systems and their associated record keeping utilities. Traditional security models attempt to secure systems by isolating core software components and concentrating security efforts against threats specific to those computers or software components. Database security within control systems follows these models by using generally independent systems that rely on one another for proper functionality. The high level of reliance between the two systems creates an expanded threat surface. To understand the scope of a threat surface, all segments of the control system, with an emphasis on entry points, must be examined. The communication link between data and decision layers is the primary attack surface for SQL injection. This paper explains what SQL injection is and why it poses a significant threat to control system environments.
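    The attack surface described above can be illustrated with a minimal sketch. The table, tag names, and payload below are hypothetical, invented purely for illustration: a query built by string concatenation lets attacker-controlled input escape the data layer, while a parameterized query binds the same input as data.

```python
import sqlite3

# Hypothetical historian table for a control system (illustrative only)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE setpoints (tag TEXT, value REAL)")
conn.execute("INSERT INTO setpoints VALUES ('pump1', 42.0)")

def vulnerable_lookup(tag):
    # UNSAFE: attacker input is concatenated directly into the SQL text
    return conn.execute(
        "SELECT value FROM setpoints WHERE tag = '%s'" % tag).fetchall()

def safe_lookup(tag):
    # Parameterized query: input is bound as data, never parsed as SQL
    return conn.execute(
        "SELECT value FROM setpoints WHERE tag = ?", (tag,)).fetchall()

payload = "' OR '1'='1"
leaked = vulnerable_lookup(payload)   # the injected clause matches every row
blocked = safe_lookup(payload)        # no tag literally equals the payload
```

    The same pattern applies at the communication link between data and decision layers: any query assembled from unvalidated field data is exposed in the way `vulnerable_lookup` is.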

  3. Bluetooth Security Attacks: Comparative Analysis, Attacks, and Countermeasures

    CERN Document Server

    Haataja, Keijo; Pasanen, Sanna; Toivanen, Pekka

    2013-01-01

    This overview of Bluetooth security examines network vulnerabilities and offers a comparative analysis of recent security attacks. It also examines related countermeasures and proposes a novel attack that works against all existing Bluetooth versions.

  4. Methodological Naturalism Under Attack | Ruse | South African ...

    African Journals Online (AJOL)

    Recently the Intelligent Design movement has been arguing against methodological naturalism, and in this project they have been joined by the Christian philosopher Alvin Plantinga. In this paper I examine Plantinga's arguments and conclude not only that they are not well taken, but that he does no good service to his ...

  5. Managing Complex Battlespace Environments Using Attack the Network Methodologies

    DEFF Research Database (Denmark)

    Mitchell, Dr. William L.

    This paper examines the last 8 years of development and application of Attack the Network (AtN) intelligence methodologies for creating shared situational understanding of complex battlespace environments and the development of deliberate targeting frameworks. It will present a short history ... including their possible application on a national security level for managing longer strategic endeavors...

  6. Defender-Attacker Decision Tree Analysis to Combat Terrorism.

    Science.gov (United States)

    Garcia, Ryan J B; von Winterfeldt, Detlof

    2016-12-01

    We propose a methodology, called defender-attacker decision tree analysis, to evaluate defensive actions against terrorist attacks in a dynamic and hostile environment. Like most game-theoretic formulations of this problem, we assume that the defenders act rationally by maximizing their expected utility or minimizing their expected costs. However, we do not assume that attackers maximize their expected utilities. Instead, we encode the defender's limited knowledge about the attacker's motivations and capabilities as a conditional probability distribution over the attacker's decisions. We apply this methodology to the problem of defending against possible terrorist attacks on commercial airplanes, using one of three weapons: infrared-guided MANPADS (man-portable air defense systems), laser-guided MANPADS, or visually targeted RPGs (rocket propelled grenades). We also evaluate three countermeasures against these weapons: DIRCMs (directional infrared countermeasures), perimeter control around the airport, and hardening airplanes. The model includes deterrence effects, the effectiveness of the countermeasures, and the substitution of weapons and targets once a specific countermeasure is selected. It also includes a second stage of defensive decisions after an attack occurs. Key findings are: (1) due to the high cost of the countermeasures, not implementing countermeasures is the preferred defensive alternative for a large range of parameters; (2) if the probability of an attack and the associated consequences are large, a combination of DIRCMs and ground perimeter control is preferred over any single countermeasure. © 2016 Society for Risk Analysis.
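    The structure of that calculation can be sketched as a one-stage expected-cost comparison. Every probability, effectiveness value, and cost below is invented for illustration and is not taken from the paper; the point is only the shape of the computation: a conditional distribution over the attacker's weapon choice, rather than an attacker utility model.

```python
# Defender's belief P(weapon | attack occurs) -- illustrative numbers only
weapon_prob = {"ir_manpads": 0.5, "laser_manpads": 0.3, "rpg": 0.2}
p_attack = 0.01            # assumed annual probability an attack is attempted
loss = 1_000_000.0         # assumed consequence of a successful attack

# Assumed P(attack succeeds | weapon, countermeasure)
success = {
    "none":      {"ir_manpads": 0.7, "laser_manpads": 0.7, "rpg": 0.5},
    "dircm":     {"ir_manpads": 0.1, "laser_manpads": 0.7, "rpg": 0.5},
    "perimeter": {"ir_manpads": 0.4, "laser_manpads": 0.4, "rpg": 0.2},
}
defense_cost = {"none": 0.0, "dircm": 5000.0, "perimeter": 2000.0}

def expected_cost(defense):
    """Defender's expected cost = countermeasure cost + expected attack loss."""
    e_loss = sum(weapon_prob[w] * success[defense][w] * loss for w in weapon_prob)
    return defense_cost[defense] + p_attack * e_loss

best = min(defense_cost, key=expected_cost)
```

    With these invented numbers the cheap perimeter option wins; shrinking `p_attack` by a factor of ten makes "none" the minimizer, mirroring the paper's first key finding that low attack probabilities favor no countermeasure.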

  7. Discovering Collaborative Cyber Attack Patterns Using Social Network Analysis

    Science.gov (United States)

    Du, Haitao; Yang, Shanchieh Jay

    This paper investigates collaborative cyber attacks based on social network analysis. An Attack Social Graph (ASG) is defined to represent cyber attacks on the Internet. Features are extracted from ASGs to analyze collaborative patterns. We use principal component analysis to reduce the feature space, and hierarchical clustering to group attack sources that exhibit similar behavior. Experiments with real world data illustrate that our framework can effectively reduce a large dataset to clusters of attack sources exhibiting critical collaborative patterns.
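    A toy version of the clustering step might look like the sketch below. The source names and feature values are invented, and the paper's PCA dimensionality-reduction step is omitted for brevity; this only shows how hierarchical (single-linkage) grouping separates sources with similar behavior.

```python
from itertools import combinations

# Hypothetical per-source feature vectors extracted from an Attack Social
# Graph, e.g. (out-degree, target overlap, burstiness) -- illustrative values
features = {
    "srcA": (0.9, 0.8, 0.7),
    "srcB": (0.85, 0.75, 0.72),
    "srcC": (0.1, 0.2, 0.15),
    "srcD": (0.12, 0.18, 0.2),
}

def dist(u, v):
    # Euclidean distance between two feature vectors
    return sum((a - b) ** 2 for a, b in zip(u, v)) ** 0.5

def single_linkage(points, threshold):
    """Greedy single-linkage clustering: repeatedly merge any two clusters
    whose closest members lie within `threshold` of each other."""
    clusters = [{name} for name in points]
    merged = True
    while merged:
        merged = False
        for c1, c2 in combinations(clusters, 2):
            if min(dist(points[a], points[b]) for a in c1 for b in c2) < threshold:
                clusters.remove(c1)
                clusters.remove(c2)
                clusters.append(c1 | c2)
                merged = True
                break
    return clusters

clusters = single_linkage(features, threshold=0.2)
```

    On this invented data the two "collaborating" sources (similar feature vectors) fall into one cluster and the two quiet sources into another.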

  8. A Study of Gaps in Attack Analysis

    Science.gov (United States)

    2016-10-12

    necessarily reflect the views of the Department of Defense. © 2016 MASSACHUSETTS INSTITUTE OF TECHNOLOGY. Delivered to the U.S. Government with... and identify cyber attacks reflects the "arms race" nature of the cyber domain. While defenders develop new and improved techniques to detect known... Trost. Digging into ShellShock Exploitation attempts using ShockPot Data. https://www.threatstream.com/blog/shockpot-exploitation-analysis, September

  9. Regional Shelter Analysis Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Dillon, Michael B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dennison, Deborah [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kane, Jave [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Walker, Hoyt [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Miller, Paul [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-08-01

    The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.

  10. Cyberprints: Identifying Cyber Attackers by Feature Analysis

    Science.gov (United States)

    Blakely, Benjamin A.

    2012-01-01

    The problem of attributing cyber attacks is one of increasing importance. Without a solid method of demonstrating the origin of a cyber attack, any attempts to deter would-be cyber attackers are wasted. Existing methods of attribution make unfounded assumptions about the environment in which they will operate: omniscience (the ability to gather,…

  11. Incident analysis methodology

    International Nuclear Information System (INIS)

    Libmann, J.

    1986-05-01

    The number of French nuclear power stations in operation, and their division into standardized plant series, very soon led to the need for a precise organization within both the nuclear safety authorities and the operator, Electricité de France. The methods of analysis have been gradually extended and diversified, and we shall discuss them here; it is evident, however, that a very precise definition is needed of the boundaries between what concerns safety and what does not. This report first deals with the criteria on which declarations are based before outlining the main guidelines of the analysis methodology [fr]

  12. Denial of Service Attack Techniques: Analysis, Implementation and Comparison

    Directory of Open Access Journals (Sweden)

    Khaled Elleithy

    2005-02-01

    Full Text Available A denial of service attack (DOS) is any type of attack on a networking structure to disable a server from servicing its clients. Attacks range from sending millions of requests to a server in an attempt to slow it down, to flooding a server with large packets of invalid data, to sending requests with an invalid or spoofed IP address. In this paper we show the implementation and analysis of three main types of attack: Ping of Death, TCP SYN Flood, and Distributed DOS. The Ping of Death attack will be simulated against a Microsoft Windows 95 computer. The TCP SYN Flood attack will be simulated against a Microsoft Windows 2000 IIS FTP Server. Distributed DOS will be demonstrated by simulating a distributed zombie program that will carry out the Ping of Death attack. This paper will demonstrate the potential damage from DOS attacks and analyze the ramifications of the damage.
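    The TCP SYN flood mechanism the paper simulates exploits half-open handshakes: the attacker sends SYNs but never completes the handshake. A minimal detection-side sketch, counting half-open connections per source, is shown below; the event format, addresses, and threshold are invented for illustration.

```python
from collections import Counter

def detect_syn_flood(events, threshold=100):
    """Count half-open handshakes per source address.

    `events` is a list of (src_ip, flag) tuples, a simplified stand-in for
    observed TCP packets. A SYN opens a handshake; a matching ACK closes it.
    Sources holding more than `threshold` half-open handshakes are flagged.
    """
    half_open = Counter()
    for src, flag in events:
        if flag == "SYN":
            half_open[src] += 1
        elif flag == "ACK" and half_open[src] > 0:
            half_open[src] -= 1
    return {src for src, n in half_open.items() if n > threshold}

# A legitimate client completes its handshake; the attacker never does
events = [("10.0.0.5", "SYN"), ("10.0.0.5", "ACK")]
events += [("203.0.113.9", "SYN")] * 500
suspects = detect_syn_flood(events)
```

    Real SYN floods spoof many source addresses, so production defenses (SYN cookies, backlog tuning) work per-connection rather than per-source; this sketch only illustrates the half-open-state exhaustion that makes the attack effective.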

  13. Denial of Service Attack Techniques: Analysis, Implementation and Comparison

    OpenAIRE

    Khaled Elleithy; Drazen Blagovic; Wang Cheng; Paul Sideleau

    2005-01-01

    A denial of service attack (DOS) is any type of attack on a networking structure to disable a server from servicing its clients. Attacks range from sending millions of requests to a server in an attempt to slow it down, flooding a server with large packets of invalid data, to sending requests with an invalid or spoofed IP address. In this paper we show the implementation and analysis of three main types of attack: Ping of Death, TCP SYN Flood, and Distributed DOS. The Ping of Death attack wil...

  14. Should we advise patients to treat migraine attacks early: methodologic issues.

    Science.gov (United States)

    Ferrari, Michel D

    2005-01-01

    In clinical trials of triptans in acute migraine, patients have traditionally been required to take their medication only when their pain reached moderate or severe intensity. This methodology better ensured that migraine attacks rather than nonmigraine headaches were treated, minimized the placebo response, and simplified comparison of improvement, as all patients started from the same baseline pain level. In clinical practice, patients do not take medication in this way, and there is some theoretical evidence that early treatment might be beneficial. There are increasing numbers of reports claiming advantages of 'early' treatment, when the pain is mild, over 'late' treatment, when pain is moderate or severe, but these studies raise significant methodologic issues. Treating 'early' may equate with treating 'mild' in slowly progressing attacks only, but this may not always be the case in rapidly progressing attacks; these two types of migraine attacks should be distinguished carefully and investigated separately. Trials should be placebo-controlled, blinded, assess the therapeutic gain versus placebo rather than the absolute rates, and use the sustained pain-free endpoint. Early treatment may also increase the risk of medication overuse headaches. At present, there is no scientific support for advising patients to treat early. Patients should be advised to take their medication as soon as they are sure they are developing a migraine headache, but not during the aura phase. Copyright 2005 S. Karger AG, Basel.

  15. RFA: R-Squared Fitting Analysis Model for Power Attack

    Directory of Open Access Journals (Sweden)

    An Wang

    2017-01-01

    Full Text Available Correlation Power Analysis (CPA), introduced by Brier et al. in 2004, is an important method in the side-channel attack field; over the last decade it has enabled attackers to derive secret or private keys efficiently and at low cost. In this paper, we propose R-squared fitting model analysis (RFA), which is more appropriate for nonlinear correlation analysis. This model can also be applied to other side-channel methods such as second-order CPA and collision-correlation power attack. Our experiments show that the RFA-based attacks bring significant advantages in both time complexity and success rate.
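    The core of a fitting-based key recovery can be sketched on simulated leakage. Everything below is a simplification for illustration, not the paper's RFA model: the traces are noiseless, the leakage is a plain linear function of Hamming weight, there is no S-box, and the key value is invented. The attacker fits each key guess's predicted leakage to the measured traces and keeps the guess with the highest R².

```python
def hw(x):
    # Hamming weight of a byte
    return bin(x).count("1")

def r_squared(observed, predicted):
    # Coefficient of determination: 1 - SS_res / SS_tot
    mean = sum(observed) / len(observed)
    ss_tot = sum((y - mean) ** 2 for y in observed)
    ss_res = sum((y - p) ** 2 for y, p in zip(observed, predicted))
    return 1.0 - ss_res / ss_tot

secret = 0x3C                    # invented key byte
plaintexts = range(256)
# Noiseless simulated traces: power ~ Hamming weight of (plaintext XOR key)
traces = [2.0 * hw(p ^ secret) + 1.0 for p in plaintexts]

def score(key_guess):
    # Least-squares fit trace = a * model + b, return the fit's R^2
    model = [hw(p ^ key_guess) for p in plaintexts]
    n = len(model)
    mx, my = sum(model) / n, sum(traces) / n
    sxx = sum((m - mx) ** 2 for m in model)
    sxy = sum((m - mx) * (t - my) for m, t in zip(model, traces))
    a = sxy / sxx
    b = my - a * mx
    return r_squared(traces, [a * m + b for m in model])

best = max(range(256), key=score)   # highest R^2 identifies the key byte
```

    With real measurements the correct key's R² would merely stand out above the noise floor rather than reach 1.0, and the nonlinear S-box is what breaks the symmetry between a key and its bitwise complement.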

  16. Attack tree based cyber security analysis of nuclear digital instrumentation and control systems

    International Nuclear Information System (INIS)

    Khand, P.A.

    2009-01-01

    To maintain the cyber security, nuclear digital Instrumentation and Control (I and C) systems must be analyzed for security risks because a single security breach due to a cyber attack can cause system failure, which can have catastrophic consequences on the environment and staff of a Nuclear Power Plant (NPP). Attack trees have been widely used to analyze the cyber security of digital systems due to their ability to capture system specific as well as attacker specific details. Therefore, a methodology based on attack trees has been proposed to analyze the cyber security of the systems. The methodology has been applied for the Cyber Security Analysis (CSA) of a Bistable Processor (BP) of a Reactor Protection System (RPS). Threats have been described according to their source. Attack scenarios have been generated using the attack tree, and possible countermeasures have been suggested according to the Security Risk Level (SRL) of each scenario. Moreover, cyber security requirements (SRs) have been elicited, and the suitability of the requirements has been checked. (author)
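    A minimal attack-tree evaluation in the spirit of this methodology is sketched below. The node labels and attacker-effort numbers are invented; the point is the recursive AND/OR structure: an OR node succeeds via its cheapest child, while an AND node requires all children.

```python
# Tiny attack tree as nested tuples: ("OR"/"AND", child, ...) or
# ("leaf", description, attacker_effort). All labels/costs are hypothetical.
tree = ("OR",
        ("leaf", "steal operator credentials", 40),
        ("AND",
         ("leaf", "breach corporate network", 30),
         ("leaf", "pivot to plant network", 50)))

def min_cost(node):
    """Cheapest total attacker effort that reaches the tree's root goal."""
    kind = node[0]
    if kind == "leaf":
        return node[2]
    costs = [min_cost(child) for child in node[1:]]
    return min(costs) if kind == "OR" else sum(costs)

cheapest = min_cost(tree)   # the attacker's least-effort path
```

    Real attack-tree tools attach richer attributes (probability, detectability, required skill) and propagate each with its own AND/OR combination rule, but the traversal has this same shape.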

  17. Cyber attack analysis on cyber-physical systems: Detectability, severity, and attenuation strategy

    Science.gov (United States)

    Kwon, Cheolhyeon

    Security of Cyber-Physical Systems (CPS) against malicious cyber attacks is an important yet challenging problem. Since most cyber attacks happen in erratic ways, it is usually intractable to describe and diagnose them systematically. Motivated by such difficulties, this thesis presents a set of theories and algorithms for a cyber-secure architecture of the CPS within the control theoretic perspective. Here, instead of identifying a specific cyber attack model, we are focused on analyzing the system's response during cyber attacks. Firstly, we investigate the detectability of the cyber attacks from the system's behavior under cyber attacks. Specifically, we conduct a study on the vulnerabilities in the CPS's monitoring system against the stealthy cyber attack that is carefully designed to avoid being detected by its detection scheme. After classifying three kinds of cyber attacks according to the attacker's ability to compromise the system, we derive the necessary and sufficient conditions under which such stealthy cyber attacks can be designed to cause the unbounded estimation error while not being detected. Then, the analytical design method of the optimal stealthy cyber attack that maximizes the estimation error is developed. The proposed stealthy cyber attack analysis is demonstrated with illustrative examples on Air Traffic Control (ATC) system and Unmanned Aerial Vehicle (UAV) navigation system applications. Secondly, in an attempt to study the CPSs' vulnerabilities in more detail, we further discuss a methodology to identify potential cyber threats inherent in the given CPSs and quantify the attack severity accordingly. We then develop an analytical algorithm to test the behavior of the CPS under various cyber attack combinations. Compared to a numerical approach, the analytical algorithm enables the prediction of the most effective cyber attack combinations without computing the severity of all possible attack combinations, thereby greatly reducing the

  18. Risk analysis methodology survey

    Science.gov (United States)

    Batson, Robert G.

    1987-01-01

    NASA regulations require that formal risk analysis be performed on a program at each of several milestones as it moves toward full-scale development. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from simple to complex network-based simulation, were surveyed. A Program Risk Analysis Handbook was prepared in order to provide both analyst and manager with a guide for selection of the most appropriate technique.

  19. A Strategic Analysis of Information Sharing Among Cyber Attackers

    Directory of Open Access Journals (Sweden)

    Kjell Hausken

    2015-10-01

    Full Text Available We build a game theory model where the market design is such that one firm invests in security to defend against cyber attacks by two hackers. The firm has an asset, which is allocated between the three market participants dependent on their contest success. Each hacker chooses an optimal attack, and they share information with each other about the firm’s vulnerabilities. Each hacker prefers to receive information, but delivering information gives competitive advantage to the other hacker. We find that each hacker’s attack and information sharing are strategic complements while one hacker’s attack and the other hacker’s information sharing are strategic substitutes. As the firm’s unit defense cost increases, the attack is inverse U-shaped and reaches zero, while the firm’s defense and profit decrease, and the hackers’ information sharing and profit increase. The firm’s profit increases in the hackers’ unit cost of attack, while the hackers’ information sharing and profit decrease. Our analysis also reveals the interesting result that the cumulative attack level of the hackers is not affected by the effectiveness of information sharing between them and moreover, is also unaffected by the intensity of joint information sharing. We also find that as the effectiveness of information sharing between hackers increases relative to the investment in attack, the firm’s investment in cyber security defense and profit are constant, the hackers’ investments in attacks decrease, and information sharing levels and hacker profits increase. In contrast, as the intensity of joint information sharing increases, while the firm’s investment in cyber security defense and profit remain constant, the hackers’ investments in attacks increase, and the hackers’ information sharing levels and profits decrease. Increasing the firm’s asset causes all the variables to increase linearly, except information sharing which is constant. We extend

  20. A video-polygraphic analysis of the cataplectic attack

    DEFF Research Database (Denmark)

    Rubboli, G; d'Orsi, G; Zaniboni, A

    2000-01-01

    OBJECTIVES AND METHODS: To perform a video-polygraphic analysis of 11 cataplectic attacks in a 39-year-old narcoleptic patient, correlating clinical manifestations with polygraphic findings. Polygraphic recordings monitored EEG, EMG activity from several cranial, trunk, upper and lower limb muscles, eye movements, EKG, and thoracic respiration. RESULTS: Eleven attacks were recorded, all of them lasting less than 1 min and ending with the fall of the patient to the ground. We identified, based on the video-polygraphic analysis of the episodes, 3 phases: initial phase, characterized essentially ... with bradycardia, that was maximal during the atonic phase. CONCLUSIONS: Analysis of the muscular phenomena that characterize cataplectic attacks in a standing patient suggests that the cataplectic fall occurs with a pattern that might result from the interaction between neuronal networks mediating muscular atonia...

  1. Automatic analysis of attack data from distributed honeypot network

    Science.gov (United States)

    Safarik, Jakub; Voznak, Miroslav; Rezac, Filip; Partila, Pavol; Tomala, Karel

    2013-05-01

    There are many ways of getting real data about malicious activity in a network. One of them relies on masquerading monitoring servers as production ones. These servers are called honeypots, and data about attacks on them gives us valuable information about actual attacks and the techniques used by hackers. The article describes a distributed topology of honeypots, which was developed with a strong orientation toward monitoring of IP telephony traffic. IP telephony servers can be easily exposed to various types of attacks, and without protection, this situation can lead to loss of money and other unpleasant consequences. Using a distributed topology with honeypots placed in different geographical locations and networks provides more valuable and independent results. With an automatic system for gathering information from all honeypots, it is possible to work with all the information at one centralized point. Communication between the honeypots and the centralized data store uses secure SSH tunnels, and the server communicates only with authorized honeypots. The centralized server also automatically analyses the data from each honeypot. The results of this analysis, together with other statistical data about malicious activity, are easily accessible through a built-in web server. All statistical and analysis reports serve as the information basis for an algorithm which classifies the different types of VoIP attacks used. The web interface then provides a tool for quick comparison and evaluation of actual attacks in all monitored networks. The article describes both the honeypot nodes in the distributed architecture, which monitor suspicious activity, and the methods and algorithms used on the server side for analysis of the gathered data.

  2. METHODOLOGICAL ELEMENTS OF SITUATIONAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Tetyana KOVALCHUK

    2016-07-01

    Full Text Available The article deals with the investigation of the theoretical and methodological principles of situational analysis. The necessity of situational analysis in modern conditions is demonstrated, and the notion of "situational analysis" is defined. We conclude that situational analysis is a continuous, systematic study whose purpose is to identify the signs of a dangerous situation, to evaluate such signs comprehensively as influenced by a system of objective and subjective factors, to search for motivated, targeted actions that eliminate the adverse effects of the situation on the system now and in the future, and to develop the managerial actions needed to bring the system back to norm. A methodological approach to situational analysis is developed, its goal is substantiated, and the expediency of diagnostic, evaluative and searching functions in the process of situational analysis is demonstrated. The basic methodological elements of situational analysis are grounded. The substantiation of the principal methodological elements of system analysis will enable the analyst to develop adaptive methods able to take into account the peculiar features of a unique object, namely a situation that has emerged in a complex system, to diagnose that situation and subject it to systematic, in-depth analysis, to identify risks and opportunities, and to make timely management decisions as required by a particular period.

  3. METHODOLOGICAL ELEMENTS OF SITUATIONAL ANALYSIS

    OpenAIRE

    Tetyana KOVALCHUK

    2016-01-01

    The article deals with the investigation of theoretical and methodological principles of situational analysis. The necessity of situational analysis is proved in modern conditions. The notion "situational analysis" is determined. We have concluded that situational analysis is a continuous, systematic study whose purpose is to identify the signs of a dangerous situation, to evaluate comprehensively such signs influenced by a system of objective and subjective factors, to search for motivated targeted action...

  4. A video-polygraphic analysis of the cataplectic attack

    DEFF Research Database (Denmark)

    Rubboli, G; d'Orsi, G; Zaniboni, A

    2000-01-01

    OBJECTIVES AND METHODS: To perform a video-polygraphic analysis of 11 cataplectic attacks in a 39-year-old narcoleptic patient, correlating clinical manifestations with polygraphic findings. Polygraphic recordings monitored EEG, EMG activity from several cranial, trunk, upper and lower limbs musc...

  5. Recent Methodology in Ginseng Analysis

    Science.gov (United States)

    Baek, Seung-Hoon; Bae, Ok-Nam; Park, Jeong Hill

    2012-01-01

    Matching its popularity in herbal prescriptions and remedies, ginseng has become the focus of research in many scientific fields. Analytical methodologies for ginseng, referred to as ginseng analysis hereafter, have been developed for bioactive component discovery, phytochemical profiling, quality control, and pharmacokinetic studies. This review summarizes the most recent advances in ginseng analysis in the past half-decade, including emerging techniques and analytical trends. Ginseng analysis includes all of the leading analytical tools and serves as a representative model for the analytical research of herbal medicines. PMID:23717112

  6. Methodology of Credit Analysis Development

    Directory of Open Access Journals (Sweden)

    Slađana Neogradi

    2017-12-01

    Full Text Available The subject of research presented in this paper refers to the definition of a methodology for the development of credit analysis in companies and its application in lending operations in the Republic of Serbia. With the developing credit market, there is a growing need for a well-developed risk and loss prevention system. In the introduction, the bank's process of analyzing the loan applicant is presented, whose aim is to minimize and manage credit risk. In examining the subject matter, the processing of the credit application is described, as is the procedure of analyzing the financial statements in order to gain insight into the borrower's creditworthiness. In the second part of the paper, the theoretical and methodological framework is presented as applied in a concrete company. In the third part, models are presented which banks should use to protect themselves against exposure to risks; their goal is to reduce losses on loan operations in our country, as well as to adjust to market conditions in an optimal way.

  7. METHODOLOGY OF MATHEMATICAL ANALYSIS IN POWER NETWORK

    OpenAIRE

    Jerzy Szkutnik; Mariusz Kawecki

    2008-01-01

    Power distribution network analysis is taken into account. Based on the correlation coefficient, the authors establish a methodology of mathematical analysis useful in identifying the substations responsible for power stoppages. A methodology for risk assessment is also presented.
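    The correlation-coefficient step behind such an analysis can be sketched with the standard Pearson formula. The per-substation figures below are invented solely to show the computation; the paper's actual data and variables are not reproduced here.

```python
def pearson(xs, ys):
    """Pearson correlation coefficient r = cov(x, y) / (sigma_x * sigma_y)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical data: per-substation load factor vs. observed stoppage counts
load = [0.6, 0.7, 0.8, 0.9, 0.95]
stoppages = [1, 2, 4, 6, 8]
r = pearson(load, stoppages)   # strong positive correlation on this toy data
```

    A high r for a given substation variable would mark that substation as a candidate "bearing responsibility" for stoppages, subject to the usual caveat that correlation alone does not establish causation.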

  8. A Proposal for a Methodology to Develop a Cyber-Attack Penetration Test Scenario Including NPPs Safety

    Energy Technology Data Exchange (ETDEWEB)

    Lee, In Hyo [KAIST, Daejeon (Korea, Republic of); Son, Han Seong [Joongbu Univ., Geumsan (Korea, Republic of); Kim, Si Won [Korea Institute of Nuclear Nonproliferation and Control, Daejeon (Korea, Republic of); Kang, Hyun Gook [Rensselaer Polytechnic Institute, Troy (United States)

    2016-10-15

    Penetration testing is a method to evaluate the cyber security of NPPs, and this approach has been taken in several studies. Because those studies focused on vulnerability finding or test bed construction, a scenario-based approach was not taken. However, to test the cyber security of NPPs, a proper test scenario is needed. Ahn et al. developed cyber-attack scenarios, but those scenarios could not be applied in penetration testing because they were developed from past incidents of NPPs induced by cyber-attack. That is, those scenarios only covered events that had happened before, so they could not cover other various scenarios and could not be reflected in a penetration test. In this study, a method to develop a cyber-attack penetration test scenario for NPPs, focused especially on the safety point of view, is suggested. To evaluate the cyber security of NPPs, penetration testing can be a possible way. In this study, a method to develop a penetration test scenario was explained. In particular, the goal of the hacker was taken to be nuclear fuel integrity deterioration. Accordingly, in the methodology, Level 1 PSA results were utilized to reflect plant safety in the security assessment. From the PSA results, basic events were post-processed and possible cyber-attacks were reviewed against the vulnerabilities of the digital control system.

  9. Stakeholder analysis methodologies resource book

    Energy Technology Data Exchange (ETDEWEB)

    Babiuch, W.M.; Farhar, B.C.

    1994-03-01

    Stakeholder analysis allows analysts to identify how parties might be affected by government projects. This process involves identifying the likely impacts of a proposed action and stakeholder groups affected by that action. Additionally, the process involves assessing how these groups might be affected and suggesting measures to mitigate any adverse effects. Evidence suggests that the efficiency and effectiveness of government actions can be increased and adverse social impacts mitigated when officials understand how a proposed action might affect stakeholders. This report discusses how to conduct useful stakeholder analyses for government officials making decisions on energy-efficiency and renewable-energy technologies and their commercialization. It discusses methodological issues that may affect the validity and reliability of findings, including sampling, generalizability, validity, "uncooperative" stakeholder groups, using social indicators, and the effect of government regulations. The Appendix contains resource directories and a list of specialists in stakeholder analysis and involvement.

  10. Cyber Attacks During the War on Terrorism: A Predictive Analysis

    National Research Council Canada - National Science Library

    Vatis, Michael

    2001-01-01

    ... responsible for the attack. This paper examines case studies of political conflicts that have led to attacks on cyber systems, such as the recent clashes between India and Pakistan, Israel and the Palestinians, and NATO...

  11. Cyber Attacks During the War on Terrorism: A Predictive Analysis

    National Research Council Canada - National Science Library

    Vatis, Michael

    2001-01-01

    .... Just as the terrorist attacks of September 11, 2001 defied what many thought possible, cyber attacks could escalate in response to United States and allied retaliatory measures against the terrorists...

  12. Quantitative Attack Tree Analysis via Priced Timed Automata

    NARCIS (Netherlands)

    Kumar, Rajesh; Ruijters, Enno Jozef Johannes; Stoelinga, Mariëlle Ida Antoinette; Sankaranarayanan, Sriram; Vicario, Enrico

    The success of a security attack crucially depends on the resources available to an attacker: time, budget, skill level, and risk appetite. Insight in these dependencies and the most vulnerable system parts is key to providing effective counter measures. This paper considers attack trees, one of the

  13. Modeling and Analysis of Information Attack in Computer Networks

    National Research Council Canada - National Science Library

    Pepyne, David

    2003-01-01

    ... (as opposed to physical and other forms of attack) . Information based attacks are attacks that can be carried out from anywhere in the world, while sipping cappuccino at an Internet cafe' or while enjoying the comfort of a living room armchair...

  14. Integrated situational awareness for cyber attack detection, analysis, and mitigation

    Science.gov (United States)

    Cheng, Yi; Sagduyu, Yalin; Deng, Julia; Li, Jason; Liu, Peng

    2012-06-01

    Real-time cyberspace situational awareness is critical for securing and protecting today's enterprise networks from various cyber threats. When a security incident occurs, network administrators and security analysts need to know what exactly has happened in the network, why it happened, and what actions or countermeasures should be taken to quickly mitigate the potential impacts. In this paper, we propose an integrated cyberspace situational awareness system for efficient cyber attack detection, analysis and mitigation in large-scale enterprise networks. Essentially, a cyberspace common operational picture will be developed, which is a multi-layer graphical model and can efficiently capture and represent the statuses, relationships, and interdependencies of various entities and elements within and among different levels of a network. Once shared among authorized users, this cyberspace common operational picture can provide an integrated view of the logical, physical, and cyber domains, and a unique visualization of disparate data sets to support decision makers. In addition, advanced analyses, such as Bayesian Network analysis, will be explored to address the information uncertainty, dynamic and complex cyber attack detection, and optimal impact mitigation issues. All the developed technologies will be further integrated into an automatic software toolkit to achieve near real-time cyberspace situational awareness and impact mitigation in large-scale computer networks.
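The Bayesian analysis mentioned above for handling information uncertainty can be illustrated with a single Bayes'-rule update: given an IDS alert, how much should belief in an ongoing attack change? The prior and likelihoods below are invented illustrative numbers, not values from the paper.

```python
# Minimal sketch of Bayesian reasoning over uncertain detection evidence.
# All probabilities are hypothetical.

def posterior_attack(prior, p_alert_given_attack, p_alert_given_benign):
    """P(attack | alert) via Bayes' rule."""
    p_alert = (p_alert_given_attack * prior
               + p_alert_given_benign * (1.0 - prior))
    return p_alert_given_attack * prior / p_alert

# An alert fires: the posterior rises well above the 1% base rate, but a
# noisy sensor (5% false-alarm rate) keeps it far from certainty.
prior = 0.01
p = posterior_attack(prior, p_alert_given_attack=0.9, p_alert_given_benign=0.05)
print(round(p, 3))
```

Chaining such updates across the layers of a common operational picture is, in spirit, what a full Bayesian-network analysis automates.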

  15. Analysis of Network Vulnerability Under Joint Node and Link Attacks

    Science.gov (United States)

    Li, Yongcheng; Liu, Shumei; Yu, Yao; Cao, Ting

    2018-03-01

    The security problem of computer network systems is becoming more and more serious. The fundamental reason is that there are security vulnerabilities in the network system. Therefore, it is very important to identify and reduce or eliminate these vulnerabilities before they are attacked. In this paper, we are interested in joint node and link attacks and propose a vulnerability evaluation method based on the overall connectivity of the network to defend against this attack. In particular, we analyze the attack cost problem from the attackers' perspective. The purpose is to find the least-cost set of joint links and nodes whose deletion will lead to serious network connection damage. The simulation results show that the vulnerable elements obtained from the proposed method better match the attacking strategy of malicious persons in a joint node and link attack. It is easy to see that the proposed method has more realistic protection significance.
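A simple stand-in for the "overall connectivity" objective above is to count the node pairs that remain connected after a joint attack removes some nodes and links. This is a toy sketch on an invented 5-node ring, not the paper's actual metric or algorithm.

```python
# Evaluate residual connectivity after a joint node-and-link attack:
# remove the attacked elements, then count ordered node pairs that can
# still reach each other (via BFS over surviving components).
from collections import deque

def connected_pairs(adj, removed_nodes=(), removed_edges=()):
    """Number of ordered node pairs still connected after the attack."""
    removed_nodes = set(removed_nodes)
    removed_edges = {frozenset(e) for e in removed_edges}
    nodes = [n for n in adj if n not in removed_nodes]
    total = 0
    visited = set()
    for start in nodes:
        if start in visited:
            continue
        # BFS over the surviving graph to collect one component.
        comp, queue = {start}, deque([start])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if (v in removed_nodes or v in comp
                        or frozenset((u, v)) in removed_edges):
                    continue
                comp.add(v)
                queue.append(v)
        visited |= comp
        total += len(comp) * (len(comp) - 1)
    return total

ring = {0: [1, 4], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 0]}
print(connected_pairs(ring))                      # intact ring
print(connected_pairs(ring, removed_nodes=[2],
                      removed_edges=[(4, 0)]))    # joint attack splits it
```

Removing one node plus one well-chosen link splits the ring into two fragments, which is exactly the kind of low-cost, high-damage element set the attacker-perspective analysis searches for.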

  16. Attack Pattern Analysis Framework for a Multiagent Intrusion Detection System

    Directory of Open Access Journals (Sweden)

    Krzysztof Juszczyszyn

    2008-08-01

    Full Text Available The paper proposes the use of an attack pattern ontology and a formal framework for network traffic anomaly detection within a distributed multi-agent Intrusion Detection System architecture. Our framework assumes ontology-based attack definitions and a distributed processing scheme with exchange of messages between agents. The role of traffic anomaly detection is presented, and it is then discussed how some specific values characterizing network communication can be used to detect network anomalies caused by security incidents (worm attacks, virus spreading). Finally, it is defined how to use the proposed techniques in a distributed IDS using the attack pattern ontology.

  17. Causal Meta-Analysis : Methodology and Applications

    NARCIS (Netherlands)

    Bax, L.J.

    2009-01-01

    Meta-analysis is a statistical method to summarize research data from multiple studies in a quantitative manner. This dissertation addresses a number of methodological topics in causal meta-analysis and reports the development and validation of meta-analysis software. In the first (methodological)

  18. k-Nearest Neighbors Algorithm in Profiling Power Analysis Attacks

    Directory of Open Access Journals (Sweden)

    Z. Martinasek

    2016-06-01

    Full Text Available Power analysis presents a typical example of successful attacks against trusted cryptographic devices such as RFID (Radio-Frequency IDentification) tags and contact smart cards. In recent years, the cryptographic community has explored new approaches in power analysis based on machine learning models such as the Support Vector Machine (SVM), Random Forest (RF) and Multi-Layer Perceptron (MLP). In this paper, we make an extensive comparison of machine learning algorithms in power analysis. For this purpose, we implemented a verification program that always chooses the optimal settings of the individual machine learning models in order to obtain the best classification accuracy. In our research, we used three datasets, the first containing the power traces of an unprotected AES (Advanced Encryption Standard) implementation. The second and third datasets were created independently from publicly available power traces corresponding to a masked AES implementation (DPA Contest v4). The obtained results revealed some interesting facts, namely that the elementary k-NN (k-Nearest Neighbors) algorithm, which has not been commonly used in power analysis yet, shows great application potential in practice.
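The k-NN profiling idea the abstract highlights is simple enough to sketch: classify a new power trace by the majority label among its k nearest profiling traces. The tiny synthetic "traces" and Hamming-weight labels below are invented stand-ins for real measurements and key-dependent leakage classes.

```python
# Illustrative k-NN profiling attack: majority vote over the k profiling
# traces closest (squared Euclidean distance) to the attack trace.
from collections import Counter

def knn_predict(profiling, query, k=3):
    """profiling: list of (trace, label); returns majority label of k nearest."""
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = sorted(profiling, key=lambda tl: dist(tl[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

# Two hypothetical leakage classes (e.g. low vs high Hamming weight).
profiling_set = [
    ([0.1, 0.2, 0.1], "hw_low"), ([0.2, 0.1, 0.2], "hw_low"), ([0.15, 0.15, 0.1], "hw_low"),
    ([0.9, 0.8, 0.9], "hw_high"), ([0.8, 0.9, 0.8], "hw_high"), ([0.85, 0.85, 0.9], "hw_high"),
]
print(knn_predict(profiling_set, [0.82, 0.88, 0.85]))
```

In a real attack the traces have thousands of samples and the labels are key-byte hypotheses; the comparison studies in the paper tune k and the distance on held-out data.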

  19. Quantitative security and safety analysis with attack-fault trees

    NARCIS (Netherlands)

    Kumar, Rajesh; Stoelinga, Mariëlle Ida Antoinette

    2017-01-01

    Cyber physical systems, like power plants, medical devices and data centers have to meet high standards, both in terms of safety (i.e. absence of unintentional failures) and security (i.e. no disruptions due to malicious attacks). This paper presents attack fault trees (AFTs), a formalism that

  20. Anti-discrimination Analysis Using Privacy Attack Strategies

    KAUST Repository

    Ruggieri, Salvatore

    2014-09-15

    Social discrimination discovery from data is an important task to identify illegal and unethical discriminatory patterns towards protected-by-law groups, e.g., ethnic minorities. We deploy privacy attack strategies as tools for discrimination discovery under hard assumptions which have rarely tackled in the literature: indirect discrimination discovery, privacy-aware discrimination discovery, and discrimination data recovery. The intuition comes from the intriguing parallel between the role of the anti-discrimination authority in the three scenarios above and the role of an attacker in private data publishing. We design strategies and algorithms inspired/based on Frèchet bounds attacks, attribute inference attacks, and minimality attacks to the purpose of unveiling hidden discriminatory practices. Experimental results show that they can be effective tools in the hands of anti-discrimination authorities.

  1. Preliminary safety analysis methodology for the SMART

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Kyoo Hwan; Chung, Y. J.; Kim, H. C.; Sim, S. K.; Lee, W. J.; Chung, B. D.; Song, J. H. [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-03-01

    This technical report was prepared for a preliminary safety analysis methodology of the 330MWt SMART (System-integrated Modular Advanced ReacTor), which has been developed by the Korea Atomic Energy Research Institute (KAERI) and funded by the Ministry of Science and Technology (MOST) since July 1996. This preliminary safety analysis methodology has been used to identify an envelope for the safety of the SMART conceptual design. As the SMART design evolves, a further validated final safety analysis methodology will be developed. The current licensing safety analysis methodologies of the Westinghouse and KSNPP PWRs operating and under development in Korea, as well as the Russian licensing safety analysis methodology for integral reactors, have been reviewed and compared to develop the preliminary SMART safety analysis methodology. SMART design characteristics and safety systems have been reviewed against the licensing practices of the PWRs operating or the KNGR (Korean Next Generation Reactor) under construction in Korea. A detailed safety analysis methodology has been developed for the potential SMART limiting events of main steam line break, main feedwater pipe break, loss of reactor coolant flow, CEA withdrawal, primary to secondary pipe break and the small break loss of coolant accident. The SMART preliminary safety analysis methodology will be further developed and validated in parallel with the safety analysis codes as the SMART design further evolves. The validated safety analysis methodology will be submitted to MOST as a Topical Report for a review of the SMART licensing safety analysis methodology. Thus, it is recommended that the nuclear regulatory authority establish regulatory guides and criteria for the integral reactor. 22 refs., 18 figs., 16 tabs. (Author)

  2. A Comprehensive Taxonomy and Analysis of IEEE 802.15.4 Attacks

    Directory of Open Access Journals (Sweden)

    Yasmin M. Amin

    2016-01-01

    Full Text Available The IEEE 802.15.4 standard has been established as the dominant enabling technology for Wireless Sensor Networks (WSNs. With the proliferation of security-sensitive applications involving WSNs, WSN security has become a topic of great significance. In comparison with traditional wired and wireless networks, WSNs possess additional vulnerabilities which present opportunities for attackers to launch novel and more complicated attacks against such networks. For this reason, a thorough investigation of attacks against WSNs is required. This paper provides a single unified survey that dissects all IEEE 802.15.4 PHY and MAC layer attacks known to date. While the majority of existing references investigate the motive and behavior of each attack separately, this survey classifies the attacks according to clear metrics within the paper and addresses the interrelationships and differences between the attacks following their classification. The authors’ opinions and comments regarding the placement of the attacks within the defined classifications are also provided. A comparative analysis between the classified attacks is then performed with respect to a set of defined evaluation criteria. The first half of this paper addresses attacks on the IEEE 802.15.4 PHY layer, whereas the second half of the paper addresses IEEE 802.15.4 MAC layer attacks.

  3. Methodology for Validating Building Energy Analysis Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Judkoff, R.; Wortman, D.; O' Doherty, B.; Burch, J.

    2008-04-01

    The objective of this report was to develop a validation methodology for building energy analysis simulations, collect high-quality, unambiguous empirical data for validation, and apply the validation methodology to the DOE-2.1, BLAST-2MRT, BLAST-3.0, DEROB-3, DEROB-4, and SUNCAT 2.4 computer programs. This report covers background information, literature survey, validation methodology, comparative studies, analytical verification, empirical validation, comparative evaluation of codes, and conclusions.

  4. METHODOLOGICAL STRATEGIES FOR TEXTUAL DATA ANALYSIS:

    Directory of Open Access Journals (Sweden)

    Juan Carlos Rincón-Vásquez

    2011-12-01

    Full Text Available This paper presents a classification of methodologies for studies of textual data. The classification is based on the two predominant methodologies in social science research: qualitative and quantitative. The basic assumption is that the research process involves three main features: (1) structuring the research, (2) collecting the information, and (3) analyzing and interpreting the data. For each, there are general guidelines for textual studies.

  5. Constructive Analysis : A Study in Epistemological Methodology

    DEFF Research Database (Denmark)

    Ahlström, Kristoffer

    The present study is concerned with the viability of the primary method in contemporary philosophy, i.e., conceptual analysis. Starting out by tracing the roots of this methodology to Platonic philosophy, the study questions whether such a methodology makes sense when divorced from Platonic philosophy...

  6. Shilling Attacks Detection in Recommender Systems Based on Target Item Analysis.

    Science.gov (United States)

    Zhou, Wei; Wen, Junhao; Koh, Yun Sing; Xiong, Qingyu; Gao, Min; Dobbie, Gillian; Alam, Shafiq

    2015-01-01

    Recommender systems are highly vulnerable to shilling attacks, both by individuals and groups. Attackers who introduce biased ratings in order to affect recommendations have been shown to negatively affect collaborative filtering (CF) algorithms. Previous research focuses only on the differences between genuine profiles and attack profiles, ignoring the group characteristics of attack profiles. In this paper, we study the use of statistical metrics to detect the rating patterns of attackers and the group characteristics of attack profiles. A further issue is that most existing detection methods are model-specific. Two metrics, Rating Deviation from Mean Agreement (RDMA) and Degree of Similarity with Top Neighbors (DegSim), are used for analyzing rating patterns between malicious profiles and genuine profiles in attack models. Building upon this, we also propose and evaluate a detection structure called RD-TIA for detecting shilling attacks in recommender systems using a statistical approach. In order to detect more complicated attack models, we propose a novel metric called DegSim' based on DegSim. The experimental results show that our detection model based on target item analysis is an effective approach for detecting shilling attacks.
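The RDMA metric named above can be sketched directly from its usual definition: average, over a user's rated items, of the rating's deviation from the item mean divided by the item's rating count. The toy rating matrix is invented; the point is only that a profile pushing against consensus scores higher.

```python
# Rating Deviation from Mean Agreement (RDMA) for one user profile:
# mean over rated items of |rating - item mean| / (# ratings of the item).

def rdma(user_ratings, all_ratings):
    """user_ratings: {item: rating}; all_ratings: {item: [all ratings]}."""
    total = 0.0
    for item, r in user_ratings.items():
        item_ratings = all_ratings[item]
        item_mean = sum(item_ratings) / len(item_ratings)
        total += abs(r - item_mean) / len(item_ratings)
    return total / len(user_ratings)

all_ratings = {"A": [5, 4, 5, 1], "B": [2, 3, 2, 5]}
genuine = {"A": 5, "B": 2}        # close to consensus on both items
shill = {"A": 1, "B": 5}          # pushes against consensus on both items
print(rdma(genuine, all_ratings), rdma(shill, all_ratings))
```

Detectors like RD-TIA combine such per-profile scores (RDMA with DegSim) rather than thresholding either one alone.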

  7. Stealthy false data injection attacks using matrix recovery and independent component analysis in smart grid

    Science.gov (United States)

    JiWei, Tian; BuHong, Wang; FuTe, Shang; Shuaiqi, Liu

    2017-05-01

    Exact state estimation is vitally important for maintaining the normal operation of smart grids. Existing research demonstrates that state estimation output can be compromised by malicious attacks. However, to construct the attack vectors, the usual presumption in most works is that the attacker has perfect information regarding the topology and related parameters, even though such information is difficult to acquire in practice. Recent research shows that Independent Component Analysis (ICA) can be used to infer topology information, which can then be used to originate undetectable attacks and even to alter the price of electricity for the attackers' profit. However, we found that the above ICA-based blind attack tactic is only feasible in an environment with Gaussian noise. If there are outliers (device malfunction and communication errors), the Bad Data Detector will easily detect the attack. Hence, we propose a robust ICA-based blind attack strategy in which matrix recovery is used to circumvent the outlier problem and construct stealthy attack vectors. The proposed attack strategies are tested with the representative IEEE 14-bus system. Simulations verify the feasibility of the proposed method.
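The stealthiness property underlying these attacks is the classic false-data-injection result: an attack vector of the form a = H·c leaves the least-squares residual, and hence the Bad Data Detector statistic, unchanged. The tiny 3-measurement, 2-state system below is an invented toy; the paper's ICA and matrix-recovery machinery for *learning* H blindly is not shown.

```python
# Why a = H·c is stealthy: the residual of least-squares state estimation
# is identical before and after the injection.

def mat_vec(M, v):
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def least_squares_2(H, z):
    """Solve min ||z - Hx|| for a 2-state system via normal equations."""
    a = sum(r[0] * r[0] for r in H); b = sum(r[0] * r[1] for r in H)
    d = sum(r[1] * r[1] for r in H)
    y0 = sum(r[0] * zi for r, zi in zip(H, z))
    y1 = sum(r[1] * zi for r, zi in zip(H, z))
    det = a * d - b * b
    return [(d * y0 - b * y1) / det, (a * y1 - b * y0) / det]

def residual_norm(H, z):
    x = least_squares_2(H, z)
    return sum((zi - hi) ** 2 for zi, hi in zip(z, mat_vec(H, x))) ** 0.5

H = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]   # toy measurement matrix
z = [1.0, 2.1, 3.0]                        # measurements with a little noise
c = [0.5, -0.3]                            # attacker's chosen state shift
attack = mat_vec(H, c)                     # stealthy vector a = H·c
z_attacked = [zi + ai for zi, ai in zip(z, attack)]
print(abs(residual_norm(H, z) - residual_norm(H, z_attacked)) < 1e-9)
```

The estimated state shifts by exactly c while the residual stays put, which is why an attacker who can recover H (here, blindly via ICA) evades residual-based detection.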

  8. Shilling Attacks Detection in Recommender Systems Based on Target Item Analysis

    Science.gov (United States)

    Zhou, Wei; Wen, Junhao; Koh, Yun Sing; Xiong, Qingyu; Gao, Min; Dobbie, Gillian; Alam, Shafiq

    2015-01-01

    Recommender systems are highly vulnerable to shilling attacks, both by individuals and groups. Attackers who introduce biased ratings in order to affect recommendations have been shown to negatively affect collaborative filtering (CF) algorithms. Previous research focuses only on the differences between genuine profiles and attack profiles, ignoring the group characteristics of attack profiles. In this paper, we study the use of statistical metrics to detect the rating patterns of attackers and the group characteristics of attack profiles. A further issue is that most existing detection methods are model-specific. Two metrics, Rating Deviation from Mean Agreement (RDMA) and Degree of Similarity with Top Neighbors (DegSim), are used for analyzing rating patterns between malicious profiles and genuine profiles in attack models. Building upon this, we also propose and evaluate a detection structure called RD-TIA for detecting shilling attacks in recommender systems using a statistical approach. In order to detect more complicated attack models, we propose a novel metric called DegSim’ based on DegSim. The experimental results show that our detection model based on target item analysis is an effective approach for detecting shilling attacks. PMID:26222882

  9. Debate on vaccines and autoimmunity: Do not attack the author, yet discuss it methodologically.

    Science.gov (United States)

    Bragazzi, Nicola Luigi; Watad, Abdulla; Amital, Howard; Shoenfeld, Yehuda

    2017-10-09

    Since Jenner, vaccines and vaccinations have stirred a hot, highly polarized debate, leading to contrasting positions and feelings, ranging from acritical enthusiasm to blind denial. On the one hand, we find anti-vaccination movements which divulge and disseminate misleading information, myths, prejudices, and even frauds, with the main aim of denying that vaccination practices represent a major public health measure, being effective in controlling infectious diseases and safeguarding the wellbeing of entire communities. Recently, the authors of many vaccine safety investigations are being personally criticized rather than the actual science being methodologically assessed and critiqued. Unfortunately, this could result in making vaccine safety science a "hazardous occupation". Critiques should focus on the science and not on the authors and scientists who publish reasonably high-quality science suggesting a problem with a given vaccine. These scientists require adequate professional protection so that there are no disincentives to publish and to carry out research in the field. The issues for vaccine safety are not dissimilar to other areas such as medical errors and drug safety. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. Rat sperm motility analysis: methodologic considerations

    Science.gov (United States)

    The objective of these studies was to optimize conditions for computer-assisted sperm analysis (CASA) of rat epididymal spermatozoa. Methodologic issues addressed include sample collection technique, sampling region within the epididymis, type of diluent medium used, and sample c...

  11. Angle-of-attack estimation for analysis of CAT encounters

    Science.gov (United States)

    Bach, R. E., Jr.; Parks, E. K.

    1985-01-01

    Recent studies of clear-air turbulence (CAT) encounters involving wide-body airliners have been based upon flight-path wind estimates made by analyzing digital flight-data-recorder (DFDR) records and radar records. Such estimates require a time history of the aircraft angle of attack, a record that is not usually included in the DFDR measurement set. This paper describes a method for reconstructing angle of attack that utilizes available flight record and aircraft-specific information associated with an aerodynamic model of the lift coefficient. Results from two wide-body incidents in which vane measurements of angle of attack were recorded show good agreement between measured and calculated time histories. This research has been performed in cooperation with the National Transportation Safety Board to provide a better understanding of the CAT phenomenon.
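The reconstruction described above rests on inverting a linear lift-coefficient model: from recorded load factor, weight, airspeed and wing area one computes CL, then solves CL = CL0 + CLα·α for α. A minimal sketch follows; all the aircraft constants are illustrative placeholders, not values from the paper's aerodynamic model.

```python
# Hedged sketch of angle-of-attack reconstruction from DFDR-style data.
# Constants (weight, density, CL0, CL_alpha, wing area) are invented.

def angle_of_attack(nz, weight_n, rho, v_ms, wing_area_m2, cl0, cl_alpha):
    """Return alpha in degrees via the linear model CL = cl0 + cl_alpha*alpha."""
    q = 0.5 * rho * v_ms ** 2                  # dynamic pressure, Pa
    cl = nz * weight_n / (q * wing_area_m2)    # lift coefficient from load factor
    alpha_rad = (cl - cl0) / cl_alpha          # invert the lift-curve model
    return alpha_rad * 57.29577951308232       # rad -> deg

# Level cruise (nz = 1) for a hypothetical wide-body at altitude.
alpha = angle_of_attack(nz=1.0, weight_n=2.0e6, rho=0.38, v_ms=240.0,
                        wing_area_m2=420.0, cl0=0.1, cl_alpha=5.5)
print(round(alpha, 2))
```

Applied sample-by-sample to the DFDR's load-factor and airspeed channels, this yields the α time history that the wind-estimation analysis needs when no vane measurement was recorded.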

  12. Ancestry analysis in the 11-M Madrid bomb attack investigation.

    Directory of Open Access Journals (Sweden)

    Christopher Phillips

    Full Text Available The 11-M Madrid commuter train bombings of 2004 constituted the second biggest terrorist attack to occur in Europe after Lockerbie, while the subsequent investigation became the most complex and wide-ranging forensic case in Spain. Standard short tandem repeat (STR profiling of 600 exhibits left certain key incriminatory samples unmatched to any of the apprehended suspects. A judicial order to perform analyses of unmatched samples to differentiate European and North African ancestry became a critical part of the investigation and was instigated to help refine the search for further suspects. Although mitochondrial DNA (mtDNA and Y-chromosome markers routinely demonstrate informative geographic differentiation, the populations compared in this analysis were known to show a proportion of shared mtDNA and Y haplotypes as a result of recent gene-flow across the western Mediterranean, while any two loci can be unrepresentative of the ancestry of an individual as a whole. We based our principal analysis on a validated 34plex autosomal ancestry-informative-marker single nucleotide polymorphism (AIM-SNP assay to make an assignment of ancestry for DNA from seven unmatched case samples including a handprint from a bag containing undetonated explosives together with personal items recovered from various locations in Madrid associated with the suspects. To assess marker informativeness before genotyping, we predicted the probable classification success for the 34plex assay with standard error estimators for a naïve Bayesian classifier using Moroccan and Spanish training sets (each n = 48. Once misclassification error was found to be sufficiently low, genotyping yielded seven near-complete profiles (33 of 34 AIM-SNPs that in four cases gave probabilities providing a clear assignment of ancestry. One of the suspects predicted to be North African by AIM-SNP analysis of DNA from a toothbrush was identified late in the investigation as Algerian in origin. The

  13. Ancestry analysis in the 11-M Madrid bomb attack investigation.

    Science.gov (United States)

    Phillips, Christopher; Prieto, Lourdes; Fondevila, Manuel; Salas, Antonio; Gómez-Tato, Antonio; Alvarez-Dios, José; Alonso, Antonio; Blanco-Verea, Alejandro; Brión, María; Montesino, Marta; Carracedo, Angel; Lareu, María Victoria

    2009-08-11

    The 11-M Madrid commuter train bombings of 2004 constituted the second biggest terrorist attack to occur in Europe after Lockerbie, while the subsequent investigation became the most complex and wide-ranging forensic case in Spain. Standard short tandem repeat (STR) profiling of 600 exhibits left certain key incriminatory samples unmatched to any of the apprehended suspects. A judicial order to perform analyses of unmatched samples to differentiate European and North African ancestry became a critical part of the investigation and was instigated to help refine the search for further suspects. Although mitochondrial DNA (mtDNA) and Y-chromosome markers routinely demonstrate informative geographic differentiation, the populations compared in this analysis were known to show a proportion of shared mtDNA and Y haplotypes as a result of recent gene-flow across the western Mediterranean, while any two loci can be unrepresentative of the ancestry of an individual as a whole. We based our principal analysis on a validated 34plex autosomal ancestry-informative-marker single nucleotide polymorphism (AIM-SNP) assay to make an assignment of ancestry for DNA from seven unmatched case samples including a handprint from a bag containing undetonated explosives together with personal items recovered from various locations in Madrid associated with the suspects. To assess marker informativeness before genotyping, we predicted the probable classification success for the 34plex assay with standard error estimators for a naïve Bayesian classifier using Moroccan and Spanish training sets (each n = 48). Once misclassification error was found to be sufficiently low, genotyping yielded seven near-complete profiles (33 of 34 AIM-SNPs) that in four cases gave probabilities providing a clear assignment of ancestry. One of the suspects predicted to be North African by AIM-SNP analysis of DNA from a toothbrush was identified late in the investigation as Algerian in origin. The results achieved
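The naïve Bayesian assignment used in this record can be sketched as a product of per-SNP genotype likelihoods under Hardy-Weinberg proportions, compared between two reference populations. The allele frequencies and four-SNP panel below are invented for illustration; the real analysis used the validated 34plex assay and Moroccan/Spanish training sets.

```python
# Hedged sketch of naive Bayes ancestry assignment from AIM-SNP genotypes.
# Frequencies are invented; genotypes are counts (0, 1, 2) of one allele.
from math import comb

def genotype_lik(g, p):
    """P(genotype with g copies of the allele | allele frequency p), HWE."""
    return comb(2, g) * p**g * (1 - p) ** (2 - g)

def posterior(genotypes, freqs_a, freqs_b, prior_a=0.5):
    """Posterior probability of population A vs B for one profile."""
    la = lb = 1.0
    for g, pa, pb in zip(genotypes, freqs_a, freqs_b):
        la *= genotype_lik(g, pa)
        lb *= genotype_lik(g, pb)
    return la * prior_a / (la * prior_a + lb * (1 - prior_a))

freq_pop_a = [0.9, 0.1, 0.8, 0.2]   # hypothetical reference population A
freq_pop_b = [0.3, 0.7, 0.2, 0.8]   # hypothetical reference population B
profile = [2, 0, 2, 1]              # allele counts leaning toward A
print(posterior(profile, freq_pop_a, freq_pop_b) > 0.95)
```

With many informative SNPs the likelihood ratio grows multiplicatively, which is why a 34-marker panel can deliver a clear assignment even when single loci are uninformative.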

  14. Performance analysis of DoS LAND attack detection

    Directory of Open Access Journals (Sweden)

    Deepak Kshirsagar

    2016-09-01

    This paper proposes an intrusion detection mechanism for DoS attacks such as Local Area Network Denial (LAND), organized into Network Traffic Analyzer, Traffic Feature Identification and Extraction, IP-spoofing-based attack detection, and Intruder Information components. The system efficiently detects DoS LAND attacks based on IP spoofing and analyzes the network resources consumed by an attacker. The system was implemented and tested using open-source tools. The experimental results show that the proposed system performs better than the state-of-the-art existing system, with low memory and CPU usage.
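The IP-spoofing check at the heart of LAND detection is that a LAND packet spoofs its source so that source and destination address (and typically port) coincide. A minimal sketch on invented packet records:

```python
# Minimal LAND-attack detector over extracted traffic features: flag
# packets whose spoofed source matches their destination address/port.
# Packet records are invented tuples, not real captures.

def is_land_packet(pkt):
    """pkt: dict with the address/port fields a traffic analyzer extracts."""
    return pkt["src_ip"] == pkt["dst_ip"] and pkt["src_port"] == pkt["dst_port"]

traffic = [
    {"src_ip": "10.0.0.7", "dst_ip": "10.0.0.1", "src_port": 51000, "dst_port": 80},
    {"src_ip": "10.0.0.1", "dst_ip": "10.0.0.1", "src_port": 80, "dst_port": 80},
]
print([is_land_packet(p) for p in traffic])
```

A full system adds the surrounding stages (capture, feature extraction, logging the intruder information); the predicate itself is this cheap, which is why the approach can keep memory and CPU usage low.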

  15. Bridging Two Worlds: Reconciling Practical Risk Assessment Methodologies with Theory of Attack Trees

    NARCIS (Netherlands)

    Gadyatskaya, Olga; Harpes, Carlo; Mauw, Sjouke; Muller, Cedric; Muller, Steve

    2016-01-01

    Security risk treatment often requires a complex cost-benefit analysis to be carried out in order to select countermeasures that optimally reduce risks while having minimal costs. According to ISO/IEC 27001, risk treatment relies on catalogues of countermeasures, and the analysts are expected to

  16. Methodology of human factor analysis

    International Nuclear Information System (INIS)

    Griffon-Fouco, M.

    1988-01-01

    The paper describes the manner in which the Heat Production Department of Electricite de France analyses the human factors in nuclear power plants. After describing the teams and structures set up to deal with this subject, the paper emphasizes two types of methods which are used, most often in complementary fashion: (1) an a posteriori analysis, which consists in studying the events which have taken place at nuclear power plants and in seeking their deep-seated causes so as to prevent their recurrence in the future; (2) an a priori analysis, which consists in analysing a work situation and in detecting all its potential failure factors so as to prevent their resulting once again in dysfunctions of the facility. To illustrate these two types of analysis, two examples are given: first, a study of the telephonic communications between operators in one plant (in which the a posteriori and a priori analyses are developed) and, second, a study of stress in a plant (in which only the a priori analysis is used). (author). 1 tab

  17. TREsPASS: Plug-and-Play Attacker Profiles for Security Risk Analysis (Poster)

    NARCIS (Netherlands)

    Pieters, Wolter; Hadziosmanovic, D.; Lenin, Aleksandr; Montoya, L.; Willemson, Jan

    Existing methods for security risk analysis typically estimate time, cost, or likelihood of success of attack steps. When the threat environment changes, such values have to be updated as well. However, the estimated values reflect both system properties and attacker properties: the time required

  18. Denial-of-service attack detection based on multivariate correlation analysis

    NARCIS (Netherlands)

    Tan, Zhiyuan; Jamdagni, Aruna; He, Xiangjian; Nanda, Priyadarsi; Liu, Ren Ping; Lu, Bao-Liang; Zhang, Liqing; Kwok, James

    2011-01-01

    The reliability and availability of network services are being threatened by the growing number of Denial-of-Service (DoS) attacks. Effective mechanisms for DoS attack detection are demanded. Therefore, we propose a multivariate correlation analysis approach to investigate and extract second-order

  19. Modeling and Analysis of Information Attack in Computer Networks

    National Research Council Canada - National Science Library

    Pepyne, David

    2003-01-01

    .... Such attacks are particularly problematic because they take place in a "virtual cyber world" that lacks the social, economic, legal, and physical barriers and protections that control and limit crime in the material world. Research outcomes include basic theory, a modeling framework for Internet worms and email viruses, a sensor for user profiling, and a simple protocol for enhancing wireless security.

  20. Securing Cooperative Spectrum Sensing Against Collusive SSDF Attack using XOR Distance Analysis in Cognitive Radio Networks.

    Science.gov (United States)

    Feng, Jingyu; Zhang, Man; Xiao, Yun; Yue, Hongzhou

    2018-01-27

    Cooperative spectrum sensing (CSS) is considered as a powerful approach to improve the utilization of scarce spectrum resources. However, if CSS assumes that all secondary users (SU) are honest, it may offer opportunities for attackers to conduct a spectrum sensing data falsification (SSDF) attack. To suppress such a threat, recent efforts have been made to develop trust mechanisms. Currently, some attackers can collude with each other to form a collusive clique, and thus not only increase the power of SSDF attack but also avoid the detection of a trust mechanism. Noting the duality of sensing data, we propose a defense scheme called XDA from the perspective of XOR distance analysis to suppress a collusive SSDF attack. In the XDA scheme, the XOR distance calculation in line with the type of "0" and "1" historical sensing data is used to measure the similarity between any two SUs. Noting that collusive SSDF attackers hold high trust value and the minimum XOR distance, the algorithm to detect collusive SSDF attackers is designed. Meanwhile, the XDA scheme can perfect the trust mechanism to correct collusive SSDF attackers' trust value. Simulation results show that the XDA scheme can enhance the accuracy of trust evaluation, and thus successfully reduce the power of collusive SSDF attack against CSS.
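The XOR-distance idea above exploits the binary nature of sensing reports: colluding SSDF attackers who falsify in lockstep have near-zero pairwise distance over their "0"/"1" histories, while honest SUs differ through independent sensing noise. A minimal sketch with invented histories:

```python
# XOR (Hamming) distance between binary sensing-report histories, the
# similarity measure the XDA scheme uses to expose collusive cliques.
# All histories below are invented.

def xor_distance(hist_a, hist_b):
    """Number of positions where two '0'/'1' report histories differ."""
    return sum(a != b for a, b in zip(hist_a, hist_b))

honest_1 = "1011001110"
honest_2 = "1010011100"    # similar but independently noisy
colluder_1 = "0100110001"
colluder_2 = "0100110001"  # identical falsified reports

print(xor_distance(colluder_1, colluder_2))
print(xor_distance(honest_1, honest_2))
```

Pairs with both high trust value and minimal XOR distance are exactly the suspicious combination the XDA detection algorithm looks for.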

  1. An Analysis of Cyber-Attack on NPP Considering Physical Impact

    International Nuclear Information System (INIS)

    Lee, In Hyo; Kang, Hyun Gook; Son, Han Seong

    2016-01-01

    Some research teams have performed related work on cyber-physical systems, i.e., systems in which a cyber-attack can lead to serious consequences including product loss, damage, injury and death. They investigated the physical impact of cyber-attacks on cyber-physical systems. But it is hard to find research on NPP cyber security that considers the physical impact or safety. In this paper, to investigate the relationship between physical impact and cyber-attack, level 1 PSA results are utilized in chapter 2 and a cyber-attack analysis is performed in chapter 3. The cyber security issue for NPPs is an unavoidable one. Unlike general cyber security, a cyber-physical system like an NPP can suffer serious consequences, such as core damage, from a cyber-attack. So in this paper, to find how a hacker could attack an NPP, (1) PSA results were utilized to find the relationship between the physical system and cyber-attacks and (2) vulnerabilities in digital control systems were investigated to find how a hacker could implement a possible attack. It is expected that these steps will be utilized when establishing penetration test plans or cyber security drill plans.

  2. An Analysis of Cyber-Attack on NPP Considering Physical Impact

    Energy Technology Data Exchange (ETDEWEB)

    Lee, In Hyo; Kang, Hyun Gook [KAIST, Daejeon (Korea, Republic of); Son, Han Seong [Joonbu University, Geumsan (Korea, Republic of)

    2016-05-15

    Several research teams have studied cyber-physical systems, i.e., systems in which a cyber-attack can lead to serious consequences including product loss, damage, injury, and death. They investigated the physical impact of cyber-attacks on such systems, but research on NPP cyber security that considers the physical impact or safety is hard to find. In this paper, to investigate the relationship between physical impact and cyber-attack, level 1 PSA results are utilized in chapter 2 and a cyber-attack analysis is performed in chapter 3. The cyber security issue is unavoidable for NPPs. Unlike general IT systems, a cyber-physical system such as an NPP can suffer serious consequences, up to core damage, from a cyber-attack. So in this paper, to find how an attacker could strike an NPP, (1) PSA results were utilized to find the relationship between the physical systems and cyber-attacks, and (2) vulnerabilities of digital control systems were investigated to find how an attacker could implement a possible attack. These steps are expected to be useful when establishing penetration test plans or cyber security drill plans.

  3. Disposal Criticality Analysis Methodology Topical Report

    International Nuclear Information System (INIS)

    D.G. Horton

    1998-01-01

    The fundamental objective of this topical report is to present the planned risk-informed disposal criticality analysis methodology to the NRC, to seek acceptance that the principles of the methodology and the planned approach to validating it are sound. The design parameters and environmental assumptions within which the waste forms will reside are currently not fully established and will vary with the detailed waste package design, engineered barrier design, repository design, and repository layout. Therefore, it is not practical to present the full validation of the methodology in this report, though a limited validation over a parameter range potentially applicable to the repository is presented for approval. If the NRC accepts the methodology as described in this section, the methodology will be fully validated for the repository design applications to which it will be applied in the License Application and its references. For certain fuel types (e.g., intact naval fuel), any processes, criteria, codes, or methods different from the ones presented in this report will be described in separate addenda. These addenda will employ the principles of the methodology described in this report as a foundation. Departures from the specifics of the methodology presented in this report will be described in the addenda.

  4. Comparative analysis of proliferation resistance assessment methodologies

    International Nuclear Information System (INIS)

    Takaki, Naoyuki; Kikuchi, Masahiro; Inoue, Naoko; Osabe, Takeshi

    2005-01-01

    A comparative analysis of the methodologies was performed based on the discussions at the international workshop on 'Assessment Methodology of Proliferation Resistance for Future Nuclear Energy Systems' held in Tokyo in March 2005. Through the workshop and succeeding considerations, it became clear that proliferation resistance assessment methodologies are affected by the broader nuclear options being pursued and also by the political situation of the state. Even the definition of proliferation resistance, despite the commonality of fundamental issues, derives from the perceived threat and the implementation circumstances inherent to the larger programs. A deeper recognition of these differences among communities would help carry the discussion forward in an essential and harmonized way. (author)

  5. Nondestructive assay methodologies in nuclear forensics analysis

    International Nuclear Information System (INIS)

    Tomar, B.S.

    2016-01-01

    In the present chapter, the nondestructive assay (NDA) methodologies used for the analysis of nuclear materials as part of a nuclear forensic investigation are described. These NDA methodologies are based on (i) measurement of the passive gamma rays and neutrons emitted by the radioisotopes present in the nuclear materials, and (ii) measurement of the gamma rays and neutrons emitted after active interrogation of the nuclear materials with a source of X-rays, gamma rays or neutrons.

  6. Update of Part 61 impacts analysis methodology

    International Nuclear Information System (INIS)

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    The US Nuclear Regulatory Commission is expanding the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of disposal of waste that exceeds Class C concentrations. The project includes updating the computer codes that comprise the methodology, reviewing and updating data assumptions on waste streams and disposal technologies, and calculating costs for small as well as large disposal facilities. This paper outlines work done to date on this project.

  7. Update of Part 61 impacts analysis methodology

    International Nuclear Information System (INIS)

    Oztunali, O.I.; Roles, G.W. (US Nuclear Regulatory Commission, Washington, DC 20555)

    1985-01-01

    The US Nuclear Regulatory Commission is expanding the impacts analysis methodology used during the development of the 10 CFR Part 61 regulation to allow improved consideration of the costs and impacts of disposal of waste that exceeds Class C concentrations. The project includes updating the computer codes that comprise the methodology, reviewing and updating data assumptions on waste streams and disposal technologies, and calculating costs for small as well as large disposal facilities. This paper outlines work done to date on this project.

  8. Use of forensic analysis to better understand shark attack behaviour.

    Science.gov (United States)

    Ritter, E; Levine, M

    2004-12-01

    Shark attacks have primarily been analyzed from wound patterns, with little knowledge of a shark's approach, behaviour and intention leading to such wounds. For the first time, during a shark-human interaction project in South Africa, a white shark, Carcharodon carcharias, was filmed biting a vertically positioned person at the water surface, exhibiting distinct approach patterns leading to the bite. This bite was compared to ten white shark attacks that occurred (i) in the same geographical area of South Africa, and (ii) where the same body parts were bitten. The close similarity of some of these wound patterns to the bite imprint of the videotaped case indicates that the observed behaviour of the white shark may represent a common pattern of approaching and biting humans.

  9. Ancestry Analysis in the 11-M Madrid Bomb Attack Investigation

    OpenAIRE

    Phillips, Christopher; Prieto, Lourdes; Fondevila, Manuel; Salas, Antonio; Gómez-Tato, Antonio; Álvarez-Dios, José; Alonso, Antonio; Blanco-Verea, Alejandro; Brión, María; Montesino, Marta; Carracedo, Ángel; Lareu, María Victoria

    2009-01-01

    The 11-M Madrid commuter train bombings of 2004 constituted the second biggest terrorist attack to occur in Europe after Lockerbie, while the subsequent investigation became the most complex and wide-ranging forensic case in Spain. Standard short tandem repeat (STR) profiling of 600 exhibits left certain key incriminatory samples unmatched to any of the apprehended suspects. A judicial order to perform analyses of unmatched samples to differentiate European and North African ancestry became a...

  10. Statistical Meta-Analysis of Presentation Attacks for Secure Multibiometric Systems.

    Science.gov (United States)

    Biggio, Battista; Fumera, Giorgio; Marcialis, Gian Luca; Roli, Fabio

    2017-03-01

    Prior work has shown that multibiometric systems are vulnerable to presentation attacks, assuming that their matching score distribution is identical to that of genuine users, without fabricating any fake trait. We have recently shown that this assumption is not representative of current fingerprint and face presentation attacks, leading one to overestimate the vulnerability of multibiometric systems, and to design less effective fusion rules. In this paper, we overcome these limitations by proposing a statistical meta-model of face and fingerprint presentation attacks that characterizes a wider family of fake score distributions, including distributions of known and, potentially, unknown attacks. This allows us to perform a thorough security evaluation of multibiometric systems against presentation attacks, quantifying how their vulnerability may vary also under attacks that are different from those considered during design, through an uncertainty analysis. We empirically show that our approach can reliably predict the performance of multibiometric systems even under never-before-seen face and fingerprint presentation attacks, and that the secure fusion rules designed using our approach can exhibit an improved trade-off between the performance in the absence and in the presence of attack. We finally argue that our method can be extended to other biometrics besides faces and fingerprints.
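
    The overestimation described above can be illustrated with a toy Monte Carlo (the score distributions, fusion rule, and threshold are all hypothetical stand-ins, not the paper's statistical meta-model): assuming fake scores are distributed like genuine scores yields a higher false-accept rate under attack than a weaker, more realistic fake distribution does.

```python
import random

random.seed(0)

def far_under_attack(fake_mean, n=20000, threshold=1.2):
    """False-accept rate of a sum-rule fusion of two matchers when the
    spoofed trait produces scores ~ Normal(fake_mean, 0.3) on matcher 1
    and the attacker presents a zero-effort (impostor) trait to matcher 2."""
    accepts = 0
    for _ in range(n):
        s1 = random.gauss(fake_mean, 0.3)   # attacked matcher
        s2 = random.gauss(0.0, 0.3)         # impostor score on the other matcher
        if s1 + s2 > threshold:
            accepts += 1
    return accepts / n

# Worst-case assumption: fakes score like genuine users (mean 1.0), versus a
# weaker, more realistic fake distribution (mean 0.6). The worst-case
# assumption overestimates the system's vulnerability.
print(far_under_attack(1.0) > far_under_attack(0.6))  # → True
```

    A meta-model in the paper's spirit would parameterize a whole family of such fake-score distributions rather than fixing one mean, and evaluate the fusion rule across that family.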

  11. Simplified methodology for Angra 1 containment analysis

    International Nuclear Information System (INIS)

    Neves Conti, T. das; Souza, A.L. de; Sabundjian, G.

    1991-08-01

    A simplified analysis methodology was developed to simulate a large-break loss-of-coolant accident at the Angra 1 Nuclear Power Station. Using the RELAP5/MOD1, RELAP4/MOD5 and CONTEMPT-LT codes, the time variation of pressure and temperature in the containment was analysed. The obtained data were compared with the Angra 1 Final Safety Analysis Report and with those calculated by a detailed model. The results obtained by this new methodology, together with its small computational simulation time, were satisfactory for a preliminary evaluation of the Angra 1 global parameters. (author)

  12. Exploring participatory methodologies in organizational discourse analysis

    DEFF Research Database (Denmark)

    Plotnikof, Mie

    2014-01-01

    Recent debates in the field of organizational discourse analysis stress contrasts between approaches such as single-level vs. multi-level, critical vs. participatory, and discursive vs. material methods. They raise methodological issues of combining these to embrace multimodality in order to enable new contributions and practices by dealing with challenges of methodological overview, responsive creativity and identity-struggle. The potentials hereof are demonstrated and discussed with cases of two both critical and co-creative practices, namely ‘organizational modelling’ and ‘fixed/unfixed positioning’ from fieldwork...

  13. Analysis for Ad Hoc Network Attack-Defense Based on Stochastic Game Model

    Directory of Open Access Journals (Sweden)

    Yuanjie LI

    2014-06-01

    Attack action analysis for Ad Hoc networks can provide a reference for the design of security mechanisms. This paper presents a security analysis method for Ad Hoc networks based on Stochastic Game Nets (SGN). The method establishes an SGN model of an Ad Hoc network and computes its Nash equilibrium strategy. After transforming the SGN model into a continuous-time Markov chain (CTMC), the security of the Ad Hoc network can be evaluated and analyzed quantitatively by calculating the stationary probability of the CTMC. Finally, Matlab simulation results show that the probability of a successful attack is related to the attack intensity and expected payoffs, but not to the attack rate.
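
    The quantitative step — computing the stationary probability of the CTMC obtained from the SGN model — can be sketched as follows (the generator matrix here is a hypothetical toy, not taken from the paper):

```python
import numpy as np

def stationary_distribution(Q):
    """Stationary distribution pi of a CTMC with generator matrix Q,
    solving pi @ Q = 0 subject to sum(pi) = 1."""
    n = Q.shape[0]
    # Replace one balance equation with the normalization constraint.
    A = np.vstack([Q.T[:-1], np.ones(n)])
    b = np.zeros(n)
    b[-1] = 1.0
    return np.linalg.solve(A, b)

# Toy 3-state generator (hypothetical attack/defense states): rows sum to zero.
Q = np.array([[-3.0,  2.0,  1.0],
              [ 1.0, -1.0,  0.0],
              [ 2.0,  0.0, -2.0]])
pi = stationary_distribution(Q)
print(np.round(pi, 3))  # → [0.286 0.571 0.143]
```

    Security metrics such as the probability of successful attack are then read off as the long-run probabilities of the corresponding CTMC states.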

  14. Cyber-Attacks on Smart Meters in Household Nanogrid: Modeling, Simulation and Analysis

    Directory of Open Access Journals (Sweden)

    Denise Tellbach

    2018-02-01

    Cyber-security, and therefore cyber-attacks on the smart grid (SG), has become the subject of many publications in recent years, emphasizing its importance in research as well as in practice. One especially vulnerable part of the SG is the smart meter (SM). The major contribution of this study, which simulates a variety of cyber-attacks on SMs that have not been treated in previous studies, is the identification and quantification of their possible impacts on the security of the SG. In this study, a simulation model of a nanogrid, including a complete household with an SM, was developed. Different cyber-attacks were injected into the SM to simulate their effects on the household nanogrid. The analysis showed that the effects of cyber-attacks fall into several categories. Integrity and confidentiality attacks cause monetary effects on the grid. Availability attacks have monetary effects on the grid as well, but they are mainly aimed at compromising SM communication by either delaying or stopping it completely.

  15. Nuclear methodology development for clinical analysis

    International Nuclear Information System (INIS)

    Oliveira, Laura Cristina de

    2003-01-01

    In the present work, the viability of using neutron activation analysis (NAA) to perform urine and blood clinical analyses was checked. The aim of this study is to investigate the biological behavior of animals that have been fed chow doped with natural uranium over a long period. Aiming at time and cost reduction, the absolute method was applied to determine element concentrations in biological samples. The quantitative results for urine sediment obtained by NAA were compared with conventional clinical analysis, and the results were compatible. This methodology was also applied to bone and body organs, such as liver and muscle, to help the interpretation of possible anomalies. (author)

  16. Robust Structural Analysis and Design of Distributed Control Systems to Prevent Zero Dynamics Attacks

    Energy Technology Data Exchange (ETDEWEB)

    Weerakkody, Sean [Carnegie Mellon Univ., Pittsburgh, PA (United States); Liu, Xiaofei [Carnegie Mellon Univ., Pittsburgh, PA (United States); Sinopoli, Bruno [Carnegie Mellon Univ., Pittsburgh, PA (United States)

    2017-12-12

    We consider the design and analysis of robust distributed control systems (DCSs) to ensure the detection of integrity attacks. DCSs are often managed by independent agents and are implemented using a diverse set of sensors and controllers. However, the heterogeneous nature of DCSs, along with their scale, leaves such systems vulnerable to adversarial behavior. To mitigate this reality, we provide tools that allow operators to prevent zero dynamics attacks when as many as p agents and sensors are corrupted. Such a design ensures attack detectability in deterministic systems while removing the threat of a class of stealthy attacks in stochastic systems. To achieve this goal, we use graph theory to obtain necessary and sufficient conditions for the presence of zero dynamics attacks in terms of the structural interactions between agents and sensors. We then formulate and solve optimization problems which minimize communication networks while also ensuring that a resource-limited adversary cannot perform a zero dynamics attack. Polynomial-time algorithms for design and analysis are provided.
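
    The structural conditions in this abstract are graph-theoretic, but the underlying system-theoretic object is the set of invariant zeros of the plant (A, B, C, D): a zero with nonnegative real part admits an input that grows while remaining invisible at the outputs. As a numerical sketch (toy matrices, not from the report; the formula is valid only when D is square and invertible), the zeros can be computed as eigenvalues of A - B D⁻¹ C:

```python
import numpy as np

def invariant_zeros(A, B, C, D):
    """Invariant zeros of (A, B, C, D) when D is square and invertible:
    the eigenvalues of A - B @ inv(D) @ C. A zero with nonnegative real
    part admits an unbounded stealthy (zero dynamics) attack input."""
    return np.linalg.eigvals(A - B @ np.linalg.solve(D, C))

# Hypothetical 2-state plant with one corrupted input/output channel.
A = np.array([[1.0,  0.0],
              [0.0, -2.0]])
B = np.array([[1.0],
              [0.0]])
C = np.array([[0.0, 1.0]])
D = np.array([[1.0]])
zeros = invariant_zeros(A, B, C, D)
# A zero at +1 means the zero dynamics are unstable: a stealthy attack exists.
print(any(z.real >= 0 for z in zeros))  # → True
```

    The paper's contribution is to decide and prevent this situation from the sparsity structure of (A, B, C, D) alone, without fixing numerical values.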

  17. Unique Approach to Threat Analysis Mapping: A Malware Centric Methodology for Better Understanding the Adversary Landscape

    Science.gov (United States)

    2016-04-05

    A Unique Approach to Threat Analysis Mapping: A Malware-Centric Methodology for Better Understanding the Adversary Landscape Deana Shick Kyle...allows attackers to execute arbitrary code via unspecified vectors [Mitre 2016]. Again the wide landscape and usage of Adobe Flash Player made this...after-free vulnerability in Microsoft Internet Explorer affecting versions 9 and 10 [Mitre 2016]. The attack landscape of these vulnerabilities was

  18. Pest Risk Analysis - a Way to Counter Attack Losses Caused

    OpenAIRE

    FLEŞERIU A.; I. OROIAN; Oana VIMAN; I. BRAŞOVEAN

    2010-01-01

    Pest risk analysis is a process of investigation, assessment and decision-making on information about a pest, which begins when it is known or determined that the organism is a quarantine pest. Pest risk analysis is meaningful only in relation to an area considered to be at risk. The annual losses are about 35% but can be greatly reduced after the application of pest risk analysis.

  19. Requirements Analysis in the Value Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Conner, Alison Marie

    2001-05-01

    The Value Methodology (VM) study brings together a multidisciplinary team of people who own the problem and have the expertise to identify and solve it. With the varied backgrounds and experiences the team brings to the study come different perspectives on the problem and the requirements of the project. A requirements analysis step can be added to the Information and Function Analysis Phases of a VM study to validate whether the functions being performed are required, whether regulatory or customer prescribed. This paper will provide insight into the level of rigor applied to a requirements analysis step and give some examples of tools and techniques utilized to ease the management of the requirements, and of the functions those requirements support, for highly complex problems.

  20. Cost analysis methodology of spent fuel storage

    International Nuclear Information System (INIS)

    1994-01-01

    The report deals with the cost analysis of interim spent fuel storage; however, it is not intended either to give a detailed cost analysis or to compare the costs of the different options. This report provides a methodology for calculating the costs of different options for interim storage of the spent fuel produced in the reactor cores. Different technical features and storage options (dry and wet, away from reactor and at reactor) are considered and the factors affecting all options defined. The major cost categories are analysed. Then the net present value of each option is calculated and the levelized cost determined. Finally, a sensitivity analysis is conducted taking into account the uncertainty in the different cost estimates. Examples of current storage practices in some countries are included in the Appendices, with description of the most relevant technical and economic aspects. 16 figs, 14 tabs
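
    The costing steps described above — discounting each option's cash flows to a net present value, then levelizing — can be sketched as follows; the discount rate, cost stream, and fuel quantities are hypothetical, not values from the report:

```python
def npv(cashflows, rate):
    """Net present value of yearly cashflows (year 0 first)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def levelized_cost(costs, quantities, rate):
    """Levelized unit cost: discounted total cost divided by the
    discounted quantity stored (e.g. tHM of spent fuel)."""
    return npv(costs, rate) / npv(quantities, rate)

# Hypothetical dry-storage option: capital outlay in year 0, flat O&M after.
costs = [100.0] + [5.0] * 10          # M$ per year
fuel  = [0.0] + [50.0] * 10           # tHM received per year
print(round(levelized_cost(costs, fuel, 0.05), 3))  # → 0.359 (M$/tHM)
```

    The sensitivity analysis mentioned in the report then amounts to re-running this calculation while perturbing the rate and the cost estimates.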

  1. RAMS (Risk Analysis - Modular System) methodology

    Energy Technology Data Exchange (ETDEWEB)

    Stenner, R.D.; Strenge, D.L.; Buck, J.W. [and others]

    1996-10-01

    The Risk Analysis - Modular System (RAMS) was developed to serve as a broad-scope risk analysis tool for the Risk Assessment of the Hanford Mission (RAHM) studies. The RAHM element provides risk analysis support for Hanford Strategic Analysis and Mission Planning activities. The RAHM also provides risk analysis support for the Hanford 10-Year Plan development activities. The RAMS tool draws from a collection of specifically designed databases and modular risk analysis methodologies and models. RAMS is a flexible modular system that can be focused on targeted risk analysis needs. It is specifically designed to address risks associated with overall strategy, technical alternatives, and 'what if' questions regarding the Hanford cleanup mission. RAMS is set up to address both near-term and long-term risk issues. Consistency is very important for any comparative risk analysis, and RAMS is designed to efficiently and consistently compare risks and produce risk reduction estimates. There is a wide range of output information that can be generated by RAMS. These outputs can be detailed by individual contaminants, waste forms, transport pathways, exposure scenarios, individuals, populations, etc. However, they can also be presented in rolled-up form to support high-level strategy decisions.

  2. Design and implementation of network attack analysis and detect system

    International Nuclear Information System (INIS)

    Lu Zhigang; Wu Huan; Liu Baoxu

    2007-01-01

    This paper first analyzes the present state of research on intrusion detection systems (IDS), then classifies and compares existing methods. To address problems of existing IDSs, such as false positives, false negatives and poor information visualization, this paper proposes a system named NAADS which supports multiple data sources. Through a series of methods such as clustering analysis, association analysis and visualization, the detection rate and usability of NAADS are increased. (authors)

  3. Simulation of Attacks for Security in Wireless Sensor Network.

    Science.gov (United States)

    Diaz, Alvaro; Sanchez, Pablo

    2016-11-18

    The increasing complexity and low-power constraints of current Wireless Sensor Networks (WSN) require efficient methodologies for network simulation and embedded software performance analysis of nodes. In addition, security is also a very important feature that has to be addressed in most WSNs, since they may work with sensitive data and operate in hostile unattended environments. In this paper, a methodology for security analysis of Wireless Sensor Networks is presented. The methodology allows designing attack-aware embedded software/firmware or attack countermeasures to provide security in WSNs. The proposed methodology includes attacker modeling and attack simulation with performance analysis (node's software execution time and power consumption estimation). After an analysis of different WSN attack types, an attacker model is proposed. This model defines three different types of attackers that can emulate most WSN attacks. In addition, this paper presents a virtual platform that is able to model the node hardware, embedded software and basic wireless channel features. This virtual simulation analyzes the embedded software behavior and node power consumption while it takes into account the network deployment and topology. Additionally, this simulator integrates the previously mentioned attacker model. Thus, the impact of attacks on power consumption and software behavior/execution-time can be analyzed. This provides developers with essential information about the effects that one or multiple attacks could have on the network, helping them to develop more secure WSN systems. This WSN attack simulator is an essential element of the attack-aware embedded software development methodology that is also introduced in this work.
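
    The power consumption side of such a simulation can be sketched with a simple per-operation energy model (all energy figures are hypothetical, not taken from the paper's virtual platform):

```python
# Hypothetical per-operation energy costs (mJ); not taken from the paper.
E_TX, E_RX, E_IDLE = 0.6, 0.3, 0.01

def node_energy(seconds, tx_per_s, rx_per_s):
    """Total energy (mJ) a node spends over a simulation window."""
    return seconds * (tx_per_s * E_TX + rx_per_s * E_RX + E_IDLE)

def flooding_attack_overhead(seconds, baseline_rx, attack_rx):
    """Extra energy drained when an attacker injects attack_rx bogus
    packets per second that the victim must still receive and parse."""
    normal = node_energy(seconds, tx_per_s=2, rx_per_s=baseline_rx)
    attacked = node_energy(seconds, tx_per_s=2, rx_per_s=baseline_rx + attack_rx)
    return attacked - normal

# A 60 s flood of 50 junk packets/s drains roughly 900 mJ extra.
print(flooding_attack_overhead(60, baseline_rx=4, attack_rx=50))
```

    A full node-level simulator would add radio states, embedded-software execution time, and the three attacker types the paper describes; the point here is only how an attack's cost shows up as an energy delta against a baseline run.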

  4. Vulnerability analysis and critical areas identification of the power systems under terrorist attacks

    Science.gov (United States)

    Wang, Shuliang; Zhang, Jianhua; Zhao, Mingwei; Min, Xu

    2017-05-01

    This paper takes the central China power grid (CCPG) as an example and analyzes the vulnerability of power systems under terrorist attacks. To simulate the intelligence of terrorist attacks, a method for identifying critical attack areas according to community structures is introduced. Meanwhile, three types of vulnerability models and the corresponding vulnerability metrics are given for comparative analysis. On this basis, the influence of terrorist attacks on different critical areas is studied and the vulnerability of each area is identified. At the same time, the vulnerabilities of critical areas under different tolerance parameters and different vulnerability models are acquired and compared. Results show that disrupting only a small number of vertices can cause some critical areas to collapse completely and generate great performance losses for the whole system. Furthermore, vulnerability values vary widely across scenarios. Critical areas that can cause greater damage under terrorist attacks should be given priority of protection to reduce vulnerability. The proposed method can be applied to analyze the vulnerability of other infrastructure systems, and it can help decision makers search for mitigation actions and optimum protection strategies.
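
    A minimal version of one such topological vulnerability metric — the fractional loss of the largest connected component after an attack removes a vertex set — can be sketched as follows (the toy graph is illustrative, not the CCPG):

```python
from collections import Counter

def largest_component(nodes, edges):
    """Size of the largest connected component (simple union-find)."""
    parent = {n: n for n in nodes}
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for u, v in edges:
        parent[find(u)] = find(v)
    return max(Counter(find(n) for n in nodes).values())

def vulnerability(nodes, edges, removed):
    """Fractional loss of the giant component after an attack removes
    the vertex set `removed` (one common topological vulnerability metric)."""
    kept = [n for n in nodes if n not in removed]
    kept_edges = [(u, v) for u, v in edges if u in kept and v in kept]
    before = largest_component(nodes, edges)
    after = largest_component(kept, kept_edges) if kept else 0
    return 1 - after / before

# Toy grid: removing the single cut vertex "hub" splits the network.
nodes = ["a", "b", "hub", "c", "d"]
edges = [("a", "b"), ("b", "hub"), ("hub", "c"), ("c", "d")]
print(vulnerability(nodes, edges, {"hub"}))  # → 0.6
```

    The paper's "intelligent" attacks amount to choosing `removed` inside a community-detected critical area rather than at random, which is what drives the large vulnerability values with few disruptions.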

  5. A Hop-Count Analysis Scheme for Avoiding Wormhole Attacks in MANET.

    Science.gov (United States)

    Jen, Shang-Ming; Laih, Chi-Sung; Kuo, Wen-Chung

    2009-01-01

    MANET, due to the nature of wireless transmission, has more security issues compared to wired environments. A specific type of attack, the Wormhole attack does not require exploiting any nodes in the network and can interfere with the route establishment process. Instead of detecting wormholes from the role of administrators as in previous methods, we implement a new protocol, MHA, using a hop-count analysis from the viewpoint of users without any special environment assumptions. We also discuss previous works which require the role of administrator and their reliance on impractical assumptions, thus showing the advantages of MHA.
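
    The underlying intuition — a wormhole tunnel collapses many physical hops into one link, so tunneled routes look abnormally short — can be sketched with a simple filter (an illustrative check, not the MHA protocol itself):

```python
from statistics import median

def suspicious_routes(route_hops, ratio=0.5):
    """Flag discovered routes whose hop count falls below `ratio` times
    the median hop count: a wormhole tunnel collapses many hops into
    one link, so tunneled routes look abnormally short."""
    med = median(route_hops.values())
    return [r for r, h in route_hops.items() if h < ratio * med]

routes = {"R1": 7, "R2": 8, "R3": 2, "R4": 7}  # R3 likely crosses a wormhole
print(suspicious_routes(routes))  # → ['R3']
```

    MHA performs this kind of hop-count analysis at the route-requesting node itself, which is what lets ordinary users, rather than administrators, avoid wormhole-contaminated routes.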

  6. A Hop-Count Analysis Scheme for Avoiding Wormhole Attacks in MANET

    Directory of Open Access Journals (Sweden)

    Chi-Sung Laih

    2009-06-01

    MANET, due to the nature of wireless transmission, has more security issues compared to wired environments. A specific type of attack, the Wormhole attack does not require exploiting any nodes in the network and can interfere with the route establishment process. Instead of detecting wormholes from the role of administrators as in previous methods, we implement a new protocol, MHA, using a hop-count analysis from the viewpoint of users without any special environment assumptions. We also discuss previous works which require the role of administrator and their reliance on impractical assumptions, thus showing the advantages of MHA.

  7. ADTool: Security Analysis with Attack-Defense Trees

    NARCIS (Netherlands)

    Kordy, Barbara; Kordy, P.T.; Mauw, Sjouke; Schweitzer, Patrick; Joshi, Kaustubh; Siegle, Markus; Stoelinga, Mariëlle Ida Antoinette; d' Argenio, P.R.

    ADTool is free, open source software assisting graphical modeling and quantitative analysis of security, using attack–defense trees. The main features of ADTool are easy creation, efficient editing, and automated bottom-up evaluation of security-relevant measures. The tool also supports the usage of

  8. Attacking Automatic Video Analysis Algorithms: A Case Study of Google Cloud Video Intelligence API

    OpenAIRE

    Hosseini, Hossein; Xiao, Baicen; Clark, Andrew; Poovendran, Radha

    2017-01-01

    Due to the growth of video data on Internet, automatic video analysis has gained a lot of attention from academia as well as companies such as Facebook, Twitter and Google. In this paper, we examine the robustness of video analysis algorithms in adversarial settings. Specifically, we propose targeted attacks on two fundamental classes of video analysis algorithms, namely video classification and shot detection. We show that an adversary can subtly manipulate a video in such a way that a human...

  9. Malware Analysis: From Large-Scale Data Triage to Targeted Attack Recognition (Dagstuhl Seminar 17281)

    OpenAIRE

    Zennou, Sarah; Debray, Saumya K.; Dullien, Thomas; Lakhothia, Arun

    2018-01-01

    This report summarizes the program and the outcomes of Dagstuhl Seminar 17281, entitled "Malware Analysis: From Large-Scale Data Triage to Targeted Attack Recognition". The seminar brought together practitioners and researchers from industry and academia to discuss the state of the art in the analysis of malware, from a large-scale data triage perspective down to fine-grained analysis. Obfuscation was also considered. The meeting created new links within this very diverse community.

  10. Methodologies for risk analysis in slope instability

    International Nuclear Information System (INIS)

    Bernabeu Garcia, M.; Diaz Torres, J. A.

    2014-01-01

    This paper is an overview of the different methodologies used in producing landslide risk maps, intended to give the reader a basic knowledge of how to proceed in their development. Landslide hazard maps are increasingly demanded by governments: due to climate change, deforestation and the pressure exerted by the growth of urban centers, the damage caused by natural phenomena increases each year, making this area of work a field of study of growing importance. To explain the mapping process, each of its phases is covered in turn: from the study of the types of slope movements and the necessary management of geographic information systems (GIS), through landslide inventories, to the analyses of susceptibility, hazard, vulnerability and risk. (Author)

  11. Studying creativity training programs: A methodological analysis

    DEFF Research Database (Denmark)

    Valgeirsdóttir, Dagný; Onarheim, Balder

    2017-01-01

    Throughout decades of creativity research, a range of creativity training programs have been developed, tested, and analyzed. In 2004 Scott and colleagues published a meta-analysis of all creativity training programs to date, and the review presented here set out to identify and analyze studies published since that seminal 2004 review. Focusing on quantitative studies of creativity training programs for adults, our systematic review resulted in 22 publications. All studies were analyzed, but comparing the reported effectiveness of training across studies proved difficult due to methodological inconsistencies, variations in the reporting of results, and the types of measures used. Thus a consensus for future studies is called for to answer the question: which elements make one creativity training program more effective than another? This is a question of equal relevance to academia and industry...

  12. Methodological considerations for improving Western blot analysis.

    Science.gov (United States)

    MacPhee, Daniel J

    2010-01-01

    The need for a technique that could allow the determination of antigen specificity of antisera led to the development of a method that allowed the production of a replica of proteins, which had been separated electrophoretically on polyacrylamide gels, on to a nitrocellulose membrane. This method was coined Western blotting and is very useful to study the presence, relative abundance, relative molecular mass, post-translational modification, and interaction of specific proteins. As a result it is utilized routinely in many fields of scientific research such as chemistry, biology and biomedical sciences. This review serves to touch on some of the methodological conditions that should be considered to improve Western blot analysis, particularly as a guide for graduate students but also scientists who wish to continue adapting this now fundamental research tool. Copyright 2009 Elsevier Inc. All rights reserved.

  13. AR.Drone: security threat analysis and exemplary attack to track persons

    Science.gov (United States)

    Samland, Fred; Fruth, Jana; Hildebrandt, Mario; Hoppe, Tobias; Dittmann, Jana

    2012-01-01

    In this article we illustrate an approach to a security threat analysis of the quadrocopter AR.Drone, a toy for augmented reality (AR) games. The technical properties of the drone can be misused for attacks that may affect security and/or privacy. Our aim is to raise awareness of possible misuse and to motivate the implementation of improved security mechanisms for the quadrocopter. We focus primarily on obvious security vulnerabilities of this quadrocopter (e.g. communication over unencrypted WLAN, usage of UDP, live video streaming via unencrypted WLAN to the control device). We practically verified in three exemplary scenarios that these can be misused by unauthorized persons for several attacks: hijacking of the drone, eavesdropping on the AR.Drone's unprotected video streams, and the tracking of persons. Amongst other aspects, our current research focuses on realizing the attack of tracking persons and objects with the drone. Besides the realization of attacks, we want to evaluate the potential of this particular drone for a "safe-landing" function, as well as potential security enhancements. Additionally, in future work we plan to investigate automatic tracking of persons or objects without the need for human interaction.

  14. Risk analysis methodology designed for small and medium enterprises

    OpenAIRE

    Ladislav Beránek; Radim Remeš

    2009-01-01

    The aim of this paper is to present risk analysis procedures successfully applied by several Czech small and medium enterprises. The paper presents in detail the individual steps we use in the risk analysis of small and medium enterprises in the Czech Republic. The suggested approach to risk analysis is based on a modification of the FRAP methodology and the BITS recommendation. Modifications of both methodologies are described in detail. We propose a modified risk analysis methodology which is quick a...

  15. Attack rates assessment of the 2009 pandemic H1N1 influenza A in children and their contacts: a systematic review and meta-analysis.

    Directory of Open Access Journals (Sweden)

    Aharona Glatman-Freedman

    Full Text Available BACKGROUND: The recent H1N1 influenza A pandemic was marked by multiple reports of illness and hospitalization in children, suggesting that children may have played a major role in the propagation of the virus. A comprehensive, detailed analysis of attack rates among children as compared with their contacts in various settings is of great importance for understanding their unique role in influenza pandemics. METHODOLOGY/PRINCIPAL FINDINGS: We searched MEDLINE (PubMed) and Embase for published studies reporting outbreak investigations with direct measurements of attack rates of the 2009 pandemic H1N1 influenza A among children, and quantified how these compare with those of their contacts. We identified 50 articles suitable for review, which reported school, household, travel and social events. The selected reports and our meta-analysis indicated that children had significantly higher attack rates than adults, and that this phenomenon was observed for both virologically confirmed and clinical cases, in various settings and locations around the world. The review also provided insight into some characteristics of transmission between children and their contacts in the various settings. CONCLUSION/SIGNIFICANCE: The consistently higher attack rates of the 2009 pandemic H1N1 influenza A among children, as compared to adults, as well as the magnitude of the difference, are important for understanding the contribution of children to disease burden, for the implementation of mitigation strategies directed towards children, and for more precise mathematical modeling and simulation of future influenza pandemics.
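
The comparisons above rest on standard epidemiological machinery; a minimal sketch of a single-study risk ratio of attack rates (children vs. adults) with a 95% confidence interval, using invented counts purely for illustration, might look like:

```python
import math

def risk_ratio(cases_c, n_c, cases_a, n_a):
    """Risk ratio of attack rates (children vs. adults) with a 95% CI
    computed on the log scale, as commonly done in meta-analysis."""
    rr = (cases_c / n_c) / (cases_a / n_a)
    # Standard error of log(RR) for two independent binomial samples
    se = math.sqrt(1 / cases_c - 1 / n_c + 1 / cases_a - 1 / n_a)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, (lo, hi)

# Invented counts: 30/100 children attacked vs. 15/100 adults
rr, ci = risk_ratio(30, 100, 15, 100)
```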

  16. Clean Energy Manufacturing Analysis Center Benchmark Report: Framework and Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Sandor, Debra [National Renewable Energy Lab. (NREL), Golden, CO (United States); Chung, Donald [National Renewable Energy Lab. (NREL), Golden, CO (United States); Keyser, David [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mann, Margaret [National Renewable Energy Lab. (NREL), Golden, CO (United States); Engel-Cox, Jill [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-05-23

    This report documents the CEMAC methodologies for developing and reporting annual global clean energy manufacturing benchmarks. The report reviews previously published manufacturing benchmark reports and foundational data, establishes a framework for benchmarking clean energy technologies, describes the CEMAC benchmark analysis methodologies, and describes the application of the methodologies to the manufacturing of four specific clean energy technologies.

  17. A Review of Citation Analysis Methodologies for Collection Management

    Science.gov (United States)

    Hoffmann, Kristin; Doucette, Lise

    2012-01-01

    While there is a considerable body of literature that presents the results of citation analysis studies, most researchers do not provide enough detail in their methodology to reproduce the study, nor do they provide rationale for methodological decisions. In this paper, we review the methodologies used in 34 recent articles that present a…

  18. Cyber Security Analysis by Attack Trees for a Reactor Protection System

    International Nuclear Information System (INIS)

    Park, Gee-Yong; Lee, Cheol Kwon; Choi, Jong Gyun; Kim, Dong Hoon; Lee, Young Jun; Kwon, Kee-Choon

    2008-01-01

    As nuclear facilities introduce digital systems, cyber security becomes an emerging topic to be analyzed and resolved. The domestic and other nations' regulatory bodies have noticed this topic and are preparing appropriate guidance. The nuclear industry, where new construction or upgrades of I and C systems are planned, is analyzing and establishing cyber security. A risk-based analysis of cyber security has been performed in the KNICS (Korea Nuclear I and C Systems) project, where the cyber security analysis has been applied to a reactor protection system (RPS). In this paper, a cyber security analysis based on attack trees is proposed for the KNICS RPS

  19. Use of Attack Graphs in Security Systems

    Directory of Open Access Journals (Sweden)

    Vivek Shandilya

    2014-01-01

    Full Text Available Attack graphs have been used to model the vulnerabilities of systems and their potential exploits. The successful exploits leading to partial or total failure of a system are of keen security interest. Considerable effort has been expended on exhaustive modeling, analysis, detection, and mitigation of attacks. One prominent methodology involves constructing attack graphs of the pertinent system for analysis and response strategies. This not only gives a simplified representation of the system, but also allows prioritizing the security properties whose violations are of greater concern, for both detection and repair. We present a survey and critical study of state-of-the-art technologies in attack graph generation and their use in security systems. Based on our research, we identify the potential, challenges, and directions of current research in using attack graphs.
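
One way to make the surveyed idea concrete is a toy attack-graph search; the host names, privilege levels, and exploit edges below are invented for illustration, not taken from the paper:

```python
from collections import deque

# Hypothetical network model: nodes are (host, privilege) states, and each
# exploit is an edge the attacker can traverse from the state it requires.
exploits = {
    ("web", "none"): [("web", "user")],                  # e.g. RCE on the web server
    ("web", "user"): [("web", "root"), ("db", "user")],  # privesc or lateral move
    ("db", "user"): [("db", "root")],                    # local privilege escalation
}

def attack_paths(start, goal):
    """Enumerate cycle-free attack paths from an initial state to a goal state."""
    paths, queue = [], deque([[start]])
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            paths.append(path)
            continue
        for nxt in exploits.get(path[-1], []):
            if nxt not in path:  # keep paths simple (no revisits)
                queue.append(path + [nxt])
    return paths

paths = attack_paths(("web", "none"), ("db", "root"))
```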

  20. Analysis Of Default Passwords In Routers Against Brute-Force Attack

    Directory of Open Access Journals (Sweden)

    Mohammed Farik

    2015-08-01

    Full Text Available Abstract Password authentication is the main means of access control on network routers, and router manufacturers provide a default password for initial login to the router. While there have been many publications regarding the minimum requirements of a good password, how widely the manufacturers themselves adhere to those minimum standards, and whether these passwords can withstand brute-force attack, are not widely known. The novelty of this research is that this is the first time default passwords have been analyzed and documented from such a large variety of router models to reveal password strengths or weaknesses against brute-force attacks. First, the individual default router password of each model was collected, tabulated, and tested for entropy using a password strength meter. Then descriptive statistical analysis was performed on the tabulated data. The analysis revealed quantitatively how strong or weak default passwords are against brute-force attacks. The results of this research give router security researchers, router manufacturers, and router administrators a useful guide on the strengths and weaknesses of passwords that follow similar patterns.

  1. Stability Analysis of Hypersonic Boundary Layer over a Cone at Small Angle of Attack

    Directory of Open Access Journals (Sweden)

    Feng Ji

    2014-04-01

    Full Text Available An investigation of the stability of the hypersonic boundary layer over a cone at a small angle of attack has been performed. After obtaining the steady base flow, linear stability theory (LST) analysis was carried out under the local parallel assumption. The growth rates of the first-mode and second-mode waves at different streamwise locations and different azimuthal angles were obtained. The results show that boundary layer stability is greatly influenced by small angles of attack. The maximum growth rate of the most unstable wave on the leeward side is larger than that on the windward side. Moreover, the dominant second-mode wave starts earlier on the leeward side than on the windward side. The LST results also show that there is a “valley” region around the 120°~150° meridian in the maximum growth rate curve.

  2. Methodological Factors in Determining Risk of Dementia After Transient Ischemic Attack and Stroke: (III) Applicability of Cognitive Tests.

    Science.gov (United States)

    Pendlebury, Sarah T; Klaus, Stephen P; Thomson, Ross J; Mehta, Ziyah; Wharton, Rose M; Rothwell, Peter M

    2015-11-01

    Cognitive assessment is recommended after stroke, but there are few data on the applicability of short cognitive tests to the full spectrum of patients. We therefore determined the rates, causes, and associates of untestability in a population-based study of all transient ischemic attack (TIA) and stroke. Patients with TIA or stroke prospectively recruited (2002-2007) into the Oxford Vascular Study had ≥1 short cognitive test (Mini-Mental State Examination, Telephone Interview of Cognitive Status, Montreal Cognitive Assessment, and Abbreviated Mental Test Score) at baseline and on follow-up to 5 years. Among 1097 consecutive assessed survivors (mean age/SD, 74.8/12.1 years; 378 TIA), the numbers testable with a short cognitive test at baseline, 1, 6, 12, and 60 months were 835/1097 (76%), 778/947 (82%), 756/857 (88%), 692/792 (87%), and 472/567 (83%). Eighty-eight percent (331/378) of assessed patients with TIA were testable at baseline compared with only 46% (133/290) of those with major stroke (P<0.0001). Untestability was mainly due to stroke effects at baseline (153/262 [58%]: dysphasia/anarthria/hemiparesis=84 [32%], drowsiness=58 [22%], and acute confusion=11 [4%]), whereas sensory deficits caused relatively more problems with testing at later time points (24/63 [38%] at 5 years). Substantial numbers of patients with TIA and stroke are untestable with short cognitive tests. Future studies should report data on untestable patients and those with problems with testing, in whom the likelihood of dementia is high. © 2015 American Heart Association, Inc.

  3. Dynamic Analysis of the Evolution of Cereus peruvianus (Cactaceae Areas Attacked by Phoma sp.

    Directory of Open Access Journals (Sweden)

    Gyorgy FESZT

    2009-12-01

    Full Text Available Cereus peruvianus (night-blooming cereus, or Peruvian apple) is one of the species sensitive to Phoma attack. Photographic images can capture a given phytopathology at a given moment, and computerized analysis of such an image quantifies the spread the phytopathological process has reached at that moment. The purpose of this study was to develop the technique of acquiring successions of digital photos of Cereus peruvianus f. monstruosa attacked by Phoma sp. In parallel with recording the images, data on the greenhouse microclimate (minimum and maximum air humidity and temperature) were recorded with a Rhythm digital temperature and humidity controller. In the first stage of the study the attack showed small fluctuations, reaching a high level on days with low temperatures. The most significant growths were recorded in the periods 10.02.2005-20.02.2005, with an increase in affected area of 10.97-8.82 = 2.15, and 11.03.2005-22.04.2005, with a growth difference of 14.67-13.32 = 1.35. Generally, the affected areas grow on days with low minimum temperatures. The great advantage of this technique is the possibility of using it in situ, in the native areas of species or on crop plants in the field. Repeated images, acquired over time and then overlaid, can provide important data on the evolution of affected areas.

  4. Performance analysis of chaotic and white watermarks in the presence of common watermark attacks

    International Nuclear Information System (INIS)

    Mooney, Aidan; Keating, John G.; Heffernan, Daniel M.

    2009-01-01

    Digital watermarking is a technique that aims to embed a piece of information permanently into some digital media, which may be used at a later stage to prove owner authentication and attempt to provide protection to documents. The most common watermark types used to date are pseudorandom number sequences, which possess a white spectrum. Chaotic watermark sequences have been receiving increasing interest recently and have been shown to be an alternative to the pseudorandom watermark types. In this paper the performance of pseudorandom watermarks and chaotic watermarks in the presence of common watermark attacks is evaluated. The chaotic watermarks are generated by iterating the skew tent map, the Bernoulli map and the logistic map. The analysis focuses on the watermarked images after they have been subjected to common image distortion attacks. The capacities of each of these images are also calculated. It is shown that watermarks generated from lowpass chaotic signals have superior performance over the other signal types analysed for the attacks studied.
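
As a rough illustration of the chaotic watermark generation discussed (the logistic map is one of the three maps the paper uses), a bipolar sequence can be produced by iterating the map and binarising around its mean; the seed and parameter values below are illustrative, not the paper's:

```python
def logistic_watermark(n, x0=0.3141, r=3.99):
    """Generate a bipolar (+1/-1) watermark sequence of length n by iterating
    the logistic map x -> r*x*(1-x) and thresholding at the sequence mean."""
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1 - x)
        xs.append(x)
    mean = sum(xs) / n
    return [1 if v > mean else -1 for v in xs]

w = logistic_watermark(64)
```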

  5. The economic impacts of the September 11 terrorist attacks: a computable general equilibrium analysis

    Energy Technology Data Exchange (ETDEWEB)

    Oladosu, Gbadebo A [ORNL; Rose, Adam [University of Southern California, Los Angeles; Bumsoo, Lee [University of Illinois; Asay, Gary [University of Southern California

    2009-01-01

    This paper develops a bottom-up approach that focuses on behavioral responses in estimating the total economic impacts of the September 11, 2001, World Trade Center (WTC) attacks. The estimation includes several new features. The first is the collection of data on the relocation of firms displaced by the attack, the major source of resilience in muting the direct impacts of the event. The second is a new estimate of the major source of off-site impacts: the ensuing decline of air travel and related tourism in the U.S. due to the social amplification of the fear of terrorism. Third, the estimation is performed for the first time using Computable General Equilibrium (CGE) analysis, including a new approach to reflecting the direct effects of external shocks. This modeling framework has many advantages in this application, such as the ability to include behavioral responses of individual businesses and households, to incorporate features of inherent and adaptive resilience at the level of the individual decision maker and the market, and to gauge quantity and price interaction effects across sectors of the regional and national economies. We find that the total business interruption losses from the WTC attacks on the U.S. economy were only slightly over $100 billion, or less than 1.0% of Gross Domestic Product. For the New York Metropolitan Area, the impact was a loss of $14 billion of Gross Regional Product.

  6. A Goal based methodology for HAZOP analysis

    DEFF Research Database (Denmark)

    Rossing, Netta Liin; Lind, Morten; Jensen, Niels

    2010-01-01

    This paper presents a goal based methodology for HAZOP studies in which a functional model of the plant is used to assist in a functional decomposition of the plant, starting from the purpose of the plant and continuing down to the function of a single node, e.g. a pipe section. This approach leads...

  7. Using a Realist Research Methodology in Policy Analysis

    Science.gov (United States)

    Lourie, Megan; Rata, Elizabeth

    2017-01-01

    The article describes the usefulness of a realist methodology in linking sociological theory to empirically obtained data through the development of a methodological device. Three layers of analysis were integrated: 1. the findings from a case study about Maori language education in New Zealand; 2. the identification and analysis of contradictions…

  8. METHODOLOGICAL APPROACH TO ANALYSIS AND EVALUATION OF INFORMATION PROTECTION IN INFORMATION SYSTEMS BASED ON VULNERABILITY DANGER

    Directory of Open Access Journals (Sweden)

    Y. M. Krotiuk

    2008-01-01

    Full Text Available The paper considers a methodological approach to the analysis and estimation of information security in information systems, based on the analysis of vulnerabilities and the extent of their danger. By vulnerability danger we mean the difficulty of exploiting the vulnerability as part of an information system. The necessary and sufficient conditions for exploiting a vulnerability are determined in the paper. The paper proposes a generalized model of attack realization, which is used as a basis for constructing attack realization models for the exploitation of particular vulnerabilities. A criterion for estimating information protection in information systems, based on the estimation of vulnerability danger, is formulated. The proposed approach allows a quantitative estimate of information system security to be obtained on the basis of the proposed schemes for the realization of typical attacks against the distinguished classes of vulnerabilities. The approach is used for choosing among options for implementing protection mechanisms in information systems, as well as for estimating information security in operating information systems.
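
The paper's exact quantitative criterion is not reproduced here; the general idea of scoring each vulnerability by how hard it is to exploit and aggregating the scores into a system-level estimate can be sketched as follows, with invented records and an illustrative independence assumption:

```python
# Hypothetical vulnerability records: (name, exploitation difficulty in [0, 1]),
# where higher difficulty means the vulnerability is harder to operate and
# therefore less hazardous. The aggregation rule below is illustrative only.
vulns = [("weak-password", 0.2), ("unpatched-service", 0.4), ("misconfig", 0.7)]

def system_security_score(vulns):
    """Probability that no vulnerability is successfully exploited, treating
    each difficulty as the chance that an exploitation attempt fails and
    assuming independent attempts against each vulnerability."""
    score = 1.0
    for _, difficulty in vulns:
        score *= difficulty  # chance this vulnerability resists exploitation
    return score
```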

  9. JFCGuard: Detecting juice filming charging attack via processor usage analysis on smartphones

    DEFF Research Database (Denmark)

    Meng, Weizhi; Jiang, Lijun; Wang, Yu

    2017-01-01

    attacks, especially charging attacks, and threaten users' privacy. As an example, the juice filming charging (JFC) attack is able to steal users' sensitive and private information from both Android OS and iOS devices by automatically recording the phone screen and monitoring users' inputs during the whole...... damage of the JFC attack, in this work we investigate the impact of the JFC attack on processor usage, including both CPU and GPU usage. It is found that the JFC attack causes a noticeable usage increase when the phone is connected to the JFC charger. We then design a security mechanism, called JFCGuard...
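
In the spirit of the processor-usage monitoring described (JFCGuard's actual detection logic is not given in this excerpt), a toy threshold detector over usage samples might look like this; the sample values and the factor are invented:

```python
def usage_alert(baseline, session, factor=1.5):
    """Flag a charging session whose mean processor usage rises well above
    the device's idle baseline; 'factor' is an invented threshold multiplier."""
    base = sum(baseline) / len(baseline)
    cur = sum(session) / len(session)
    return cur > factor * base

# Invented CPU-usage percentages sampled before and during charging
idle = [10, 12, 11]
charging = [30, 28, 35]
```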

  10. Practical In-Depth Analysis of IDS Alerts for Tracing and Identifying Potential Attackers on Darknet

    Directory of Open Access Journals (Sweden)

    Jungsuk Song

    2017-02-01

    Full Text Available The darknet (i.e., a set of unused IP addresses) is a very useful solution for observing the global trends of cyber threats and analyzing attack activities on the Internet. Since the darknet is not connected with real systems, in most cases the incoming packets on the darknet (‘the darknet traffic’) do not contain a payload. This means that we are unable to get real malware from the darknet traffic. This situation makes it difficult for security experts (e.g., academic researchers, engineers, operators, etc.) to identify whether the source hosts of the darknet traffic are infected by real malware or not. In this paper, we present the overall procedure of an in-depth analysis between the darknet traffic and IDS alerts using real data collected at the Science and Technology Cyber Security Center (S&T CSC) in Korea, and provide the detailed in-depth analysis results. The ultimate goal of this paper is to provide practical experience, insight and know-how to security experts so that they are able to identify and trace the root cause of the darknet traffic. The experimental results show that correlation analysis between the darknet traffic and IDS alerts is very useful for discovering potential attack hosts, especially internal hosts, and for finding out what kinds of malware infected them.
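
The core of the correlation step can be sketched as a simple intersection of darknet sources and IDS alert sources, ranked by scan volume; the IP addresses (from documentation ranges) and alert names below are invented:

```python
from collections import Counter

# Hypothetical observations: source IPs seen hitting the darknet, and IDS
# alerts raised for the same monitored network.
darknet_hits = ["198.51.100.7", "198.51.100.7", "192.0.2.9", "203.0.113.5"]
ids_alerts = {
    "198.51.100.7": ["ET SCAN suspicious port sweep"],
    "192.0.2.9": ["ET EXPLOIT SMB probe"],
    "203.0.113.80": ["ET POLICY cleartext login"],
}

def correlate(darknet_hits, ids_alerts):
    """Rank hosts that both scanned the darknet and triggered IDS alerts;
    these are the candidate infected/attacking hosts worth tracing."""
    counts = Counter(darknet_hits)
    return sorted((ip for ip in counts if ip in ids_alerts),
                  key=counts.get, reverse=True)
```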

  11. Methodology for risk analysis of nuclear installations

    International Nuclear Information System (INIS)

    Vasconcelos, Vanderley de; Senne Junior, Murillo; Jordao, Elizabete

    2002-01-01

    Both the general licensing standards for nuclear facilities and the specific ones require a risk assessment during the licensing process. The risk assessment is carried out by estimating both the probability of occurrence of accidents and their magnitudes. This is a complex task, because the great number of potentially hazardous events that can occur in nuclear facilities makes it difficult to define the accident scenarios. There are also many techniques available to identify the potential accidents, estimate their probabilities, and evaluate their magnitudes. This paper presents a new methodology that systematizes the risk assessment process and orders the accomplishment of its several steps. (author)

  12. A retrospective analysis of practice patterns in the management of acute asthma attack across Turkey.

    Science.gov (United States)

    Türktaş, Haluk; Bavbek, Sevim; Misirligil, Zeynep; Gemicioğlu, Bilun; Mungan, Dilşad

    2010-12-01

    To evaluate patient characteristics and practice patterns in the management of acute asthma attacks at tertiary care centers across Turkey. A total of 294 patients (mean age: 50.4 ± 15.1 years; females: 80.3%) diagnosed with persistent asthma were included in this retrospective study upon their admission to the hospital with an acute asthma attack. Patient demographics, asthma control level, asthma attack severity and the management of the attack were evaluated. There was no influence of gender on asthma control or attack severity. In 57.5% of the patients, the asthma attack was moderate. Most patients (78.9%) were hospitalized, with longer duration evident for severe attacks. Spirometry and chest X-ray were the most frequent tests (85.4%), while steroids (72.0% parenteral; 29.0% oral) and short-acting beta-agonists (SABA) + anticholinergics (45.5%) were the main drugs of choice in attack management. Attack severity and pre-attack asthma control level were significantly correlated (p<0.05); pre-attack asthma was uncontrolled in 42.6% of the patients with a severe attack. Most of the patients were on a combination of more than one controller drug (two in 38.7% and 3-4 in 31.2%) before the attack. Providing country-specific data on practice patterns in the management of acute asthma attacks in a representative cohort in Turkey, the prescription of steroids and SABA + anticholinergics as the main drugs of choice was in line with guidelines, while the significant relation of pre-attack asthma control to the risk/severity of the asthma attack and the rate/duration of hospitalization seems to be the leading result of the present study. Copyright © 2010 Elsevier Ltd. All rights reserved.

  13. A Stochastic Framework for Quantitative Analysis of Attack-Defense Trees

    NARCIS (Netherlands)

    Jhawar, Ravi; Lounis, Karim; Mauw, Sjouke

    2016-01-01

    Cyber attacks are becoming increasingly complex, sophisticated and organized. Losses due to such attacks are significant, ranging from monetary loss to damage to business reputation. Therefore, there is a great need for potential victims of cyber attacks to deploy security solutions
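
A minimal bottom-up evaluation of an attack tree, assuming independent leaf success probabilities (the paper's stochastic framework for attack-defense trees is richer than this sketch), might look like:

```python
def success_prob(node):
    """Evaluate attack success probability bottom-up: AND nodes need every
    child to succeed, OR nodes need at least one, leaves carry a probability."""
    kind = node["type"]
    if kind == "leaf":
        return node["p"]
    child_ps = [success_prob(c) for c in node["children"]]
    if kind == "AND":
        p = 1.0
        for cp in child_ps:
            p *= cp
        return p
    # OR: one minus the probability that every child fails
    p_fail = 1.0
    for cp in child_ps:
        p_fail *= 1.0 - cp
    return 1.0 - p_fail

# Invented example: succeed by phishing, or by a two-step exploit chain
tree = {"type": "OR", "children": [
    {"type": "leaf", "p": 0.1},
    {"type": "AND", "children": [
        {"type": "leaf", "p": 0.5},
        {"type": "leaf", "p": 0.4},
    ]},
]}
```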

  14. Structural differences in interictal migraine attack after epilepsy: A diffusion tensor imaging analysis.

    Science.gov (United States)

    Huang, Qi; Lv, Xin; He, Yushuang; Wei, Xing; Ma, Meigang; Liao, Yuhan; Qin, Chao; Wu, Yuan

    2017-12-01

    Patients with epilepsy (PWE) are more likely to suffer from migraine attacks, and aberrant white matter (WM) organization may be the mechanism underlying this phenomenon. This study aimed to use the diffusion tensor imaging (DTI) technique to quantify WM structural differences in PWE with interictal migraine. Diffusion tensor imaging data were acquired in 13 PWE with migraine and 12 PWE without migraine. Diffusion metrics were analyzed using tract-atlas-based spatial statistics analysis. Atlas-based and tract-based spatial statistical analyses were conducted for robustness analysis. Correlations were explored between altered DTI metrics and clinical parameters. The main results are as follows: (i) Axonal damage plays a key role in PWE with interictal migraine. (ii) Significant diffusion alterations included higher fractional anisotropy (FA) in the fornix; higher mean diffusivity (MD) in the middle cerebellar peduncle (CP), left superior CP, and right uncinate fasciculus; and higher axial diffusivity (AD) in the middle CP and right medial lemniscus. (iii) DTI metrics tended to correlate with seizure/migraine type and duration. The results indicate that characteristic structural impairments exist in PWE with interictal migraine. Epilepsy may contribute to migraine by altering WM in the brain stem. White matter tracts in the fornix and right uncinate fasciculus also mediate migraine after epilepsy. This finding may improve our understanding of the pathological mechanisms underlying migraine attack after epilepsy. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. Risk analysis of breakwater caisson under wave attack using load surface approximation

    Science.gov (United States)

    Kim, Dong Hyawn

    2014-12-01

    A new load-surface-based approach to the reliability analysis of caisson-type breakwaters is proposed. Uncertainties in the horizontal and vertical wave loads acting on the breakwater are considered by using so-called load surfaces, which can be estimated as functions of wave height, water level, and so on. The first-order reliability method (FORM) can then be applied to determine the probability of failure under wave action. In this way, reliability analysis of breakwaters with uncertainties in both wave height and water level is possible. Moreover, the uncertainty in wave breaking can be taken into account by treating as a random variable the wave height ratio relating the significant wave height to the maximum wave height. The proposed approach is applied numerically to the reliability analysis of a caisson breakwater under attack by waves that may undergo partial or full breaking.
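
The paper uses FORM on load surfaces; as a rough illustration of the same failure-probability idea, here is a crude Monte Carlo sketch with invented distributions and a simple cap on wave height to mimic depth-limited breaking:

```python
import random

def failure_probability(n=20_000, seed=1):
    """Monte Carlo estimate of P(load > resistance) for a caisson sliding
    limit state. All distributions, coefficients, and the depth are invented
    for illustration; they are not the paper's calibrated values."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        h_s = rng.lognormvariate(1.0, 0.25)   # significant wave height (m)
        h_max = min(1.8 * h_s, 0.78 * 12.0)   # breaking cap at 12 m depth
        load = 40.0 * h_max                   # simplified horizontal load (kN/m)
        resistance = rng.gauss(400.0, 40.0)   # sliding resistance (kN/m)
        if load > resistance:
            failures += 1
    return failures / n

p_f = failure_probability()
```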

  16. Development of economic consequence methodology for process risk analysis.

    Science.gov (United States)

    Zadakbar, Omid; Khan, Faisal; Imtiaz, Syed

    2015-04-01

    A comprehensive methodology for economic consequence analysis with appropriate models for risk analysis of process systems is proposed. This methodology uses loss functions to relate process deviations in a given scenario to economic losses. It consists of four steps: definition of a scenario, identification of losses, quantification of losses, and integration of losses. In this methodology, the process deviations that contribute to a given accident scenario are identified and mapped to assess potential consequences. Losses are assessed with an appropriate loss function (revised Taguchi, modified inverted normal) for each type of loss. The total loss is quantified by integrating different loss functions. The proposed methodology has been examined on two industrial case studies. Implementation of this new economic consequence methodology in quantitative risk assessment will provide better understanding and quantification of risk. This will improve design, decision making, and risk management strategies. © 2014 Society for Risk Analysis.
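
The loss functions named above (revised Taguchi, modified inverted normal) are not reproduced exactly here; generic members of the two families, mapping a process deviation from target to a monetary loss, can be sketched as:

```python
import math

def taguchi_loss(x, target, max_loss, tolerance):
    """Taguchi-style quadratic loss: zero at target, rising with the squared
    deviation and capped at max_loss once the tolerance limit is reached."""
    return min(max_loss, max_loss * ((x - target) / tolerance) ** 2)

def inverted_normal_loss(x, target, max_loss, scale):
    """Inverted normal loss: rises smoothly from zero at target toward
    max_loss as the deviation grows, with 'scale' controlling the width."""
    return max_loss * (1.0 - math.exp(-((x - target) ** 2) / (2 * scale ** 2)))

# Illustrative process deviation: target 10 units, observed 12
loss_q = taguchi_loss(12.0, 10.0, 100.0, 2.0)
loss_n = inverted_normal_loss(12.0, 10.0, 100.0, 1.0)
```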

  17. Update of Part 61 Impacts Analysis Methodology. Methodology report. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    Under contract to the US Nuclear Regulatory Commission, the Envirosphere Company has expanded and updated the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of treatment and disposal of low-level waste that is close to or exceeds Class C concentrations. The modifications described in this report principally include: (1) an update of the low-level radioactive waste source term, (2) consideration of additional alternative disposal technologies, (3) expansion of the methodology used to calculate disposal costs, (4) consideration of an additional exposure pathway involving direct human contact with disposed waste due to a hypothetical drilling scenario, and (5) use of updated health physics analysis procedures (ICRP-30). Volume 1 of this report describes the calculational algorithms of the updated analysis methodology.

  18. Update of Part 61 Impacts Analysis Methodology. Methodology report. Volume 1

    International Nuclear Information System (INIS)

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    Under contract to the US Nuclear Regulatory Commission, the Envirosphere Company has expanded and updated the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of treatment and disposal of low-level waste that is close to or exceeds Class C concentrations. The modifications described in this report principally include: (1) an update of the low-level radioactive waste source term, (2) consideration of additional alternative disposal technologies, (3) expansion of the methodology used to calculate disposal costs, (4) consideration of an additional exposure pathway involving direct human contact with disposed waste due to a hypothetical drilling scenario, and (5) use of updated health physics analysis procedures (ICRP-30). Volume 1 of this report describes the calculational algorithms of the updated analysis methodology

  19. Shuttle TPS thermal performance and analysis methodology

    Science.gov (United States)

    Neuenschwander, W. E.; Mcbride, D. U.; Armour, G. A.

    1983-01-01

    Thermal performance of the thermal protection system was approximately as predicted. The only extensive anomalies were filler bar scorching and over-predictions in the high Delta p gap heating regions of the orbiter. A technique to predict filler bar scorching has been developed that can aid in defining a solution. Improvement in high Delta p gap heating methodology is still under study. Minor anomalies were also examined for improvements in modeling techniques and prediction capabilities. These include improved definition of low Delta p gap heating, an analytical model for inner mode line convection heat transfer, better modeling of structure, and inclusion of sneak heating. The limited number of problems related to penetration items that presented themselves during orbital flight tests were resolved expeditiously, and designs were changed and proved successful within the time frame of that program.

  20. Methodological Analysis of Training Student Basketball Teams

    Directory of Open Access Journals (Sweden)

    Kozina Zh.L.

    2011-06-01

    Full Text Available The leading principles of preparing student basketball teams in higher education are considered. The system includes the following: reliance on top-quality players in the structure of preparedness; widespread use of visual aids, including teaching films and animations recording the execution of various techniques by professional basketball players; and the application of autogenic and ideomotor training methods according to our methodology. The study involved 63 students in years 1-5 from various universities of Kharkov holding sports categories 1-2: 32 in the experimental group and 31 in the control group. The developed system of training student basketball players was used for 1 year. The results confirm the efficiency of the developed system in the training process of student basketball players.

  1. An Analysis of Media’s Role: Case Study of Army Public School (APS Peshawar Attack

    Directory of Open Access Journals (Sweden)

    Qureshi Rameesha

    2016-12-01

    Full Text Available The study aimed at analyzing the role of the media during and after terrorist attacks by examining the media handling of the APS Peshawar attack. The sample consisted of males and females selected on a convenience basis from universities of Rawalpindi and Islamabad. It was hypothesized that (1) extensive media coverage of terrorist attacks leads to greater publicity/recognition of terrorist groups; (2) media coverage of the APS Peshawar attack increased fear and anxiety in the public; and (3) positive media handling/coverage of the APS Peshawar attack led to public solidarity and peace. The results indicate that (i) media coverage of terrorist attacks does help terrorist groups to gain publicity and recognition amongst the public, and (ii) media coverage of the APS Peshawar attack did not increase fear/anxiety; in fact, it directed the Pakistani nation towards public solidarity and peace.

  2. An economic analysis methodology for project evaluation and programming.

    Science.gov (United States)

    2013-08-01

    Economic analysis is a critical component of a comprehensive project or program evaluation methodology that considers all key quantitative and qualitative impacts of highway investments. It allows highway agencies to identify, quantify, and value t...

  3. Vulnerability and Risk Analysis Program: Overview of Assessment Methodology

    National Research Council Canada - National Science Library

    2001-01-01

    .... Over the last three years, a team of national laboratory experts, working in partnership with the energy industry, has successfully applied the methodology as part of OCIP's Vulnerability and Risk Analysis Program (VRAP...

  4. [Free will and neurobiology: a methodological analysis].

    Science.gov (United States)

    Brücher, K; Gonther, U

    2006-04-01

    Whether or not the neurobiological basis of mental processes is compatible with the philosophical postulate of free will is a matter of committed debate these days. What is the meaning of those frequently quoted experiments concerning voluntary action? Both convictions, that we are autonomous subjects and that we exercise a strong influence on the world by applying the sciences, have become central to modern human self-conception. Now these two views are growing apart and appear contradictory, because neurobiology tries to reveal the illusionary character of free will. In order to cope with this ostensible dichotomy, it is recommended to return to the core of scientific thinking, i.e. to reflection about truth and methods. The neurobiological standpoint referring to Libet, as well as the philosophical approaches to free will, must be analysed with their pre-conceptions and context-conditions in view. Hence Libet's experiments can be criticised on different levels: methods, methodology and epistemology. Free will is a highly complex system, not a simple fact. Taking these very complicated details into account, it is possible to define conditions of compatibility and to continue using the term free will in a meaningful way, negotiating the obstacles called pure chance and determinism.

  5. Heart Attack

    Science.gov (United States)

    Each year almost 800,000 Americans have a heart attack. A heart attack happens when blood flow to the heart suddenly ... it's important to know the symptoms of a heart attack and call 9-1-1 if you or ...

  6. Severe accident analysis methodology in support of accident management

    International Nuclear Information System (INIS)

    Boesmans, B.; Auglaire, M.; Snoeck, J.

    1997-01-01

    The author addresses the implementation at BELGATOM of a generic severe accident analysis methodology, which is intended to support strategic decisions and to provide quantitative information in support of severe accident management. The analysis methodology is based on a combination of severe accident code calculations, generic phenomenological information (experimental evidence from various test facilities regarding issues beyond present code capabilities) and detailed plant-specific technical information

  7. Social Networks Analysis: Classification, Evaluation, and Methodologies

    Science.gov (United States)

    2011-02-28

    and time performance. We also focus on large-scale network size and dynamic changes in networks and research new capabilities in performing social networks analysis utilizing parallel and distributed processing.

  8. Radiochemical Analysis Methodology for uranium Depletion Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Scatena-Wachel DE

    2007-01-09

    This report provides sufficient material for a test sponsor with little or no radiochemistry background to understand and follow physics irradiation test program execution. Most irradiation test programs employ similar techniques and the general details provided here can be applied to the analysis of other irradiated sample types. Aspects of program management directly affecting analysis quality are also provided. This report is not an in-depth treatise on the vast field of radiochemical analysis techniques and related topics such as quality control. Instrumental technology is a very fast growing field and dramatic improvements are made each year, thus the instrumentation described in this report is no longer cutting edge technology. Much of the background material is still applicable and useful for the analysis of older experiments and also for subcontractors who still retain the older instrumentation.

  9. Diversion Path Analysis Handbook. Volume 1. Methodology

    International Nuclear Information System (INIS)

    Goodwin, K.E.; Schleter, J.C.; Maltese, M.D.K.

    1978-11-01

    Diversion Path Analysis (DPA) is a safeguards evaluation tool which is used to determine the vulnerability of the Material Control and Material Accounting (MC and MA) Subsystems to the threat of theft of Special Nuclear Material (SNM) by a knowledgeable Insider. The DPA team should consist of two individuals who have technical backgrounds. The implementation of DPA is divided into five basic steps: Information and Data Gathering, Process Characterization, Analysis of Diversion Paths, Results and Findings, and Documentation

  10. Discourse analysis: making complex methodology simple

    NARCIS (Netherlands)

    Bondarouk, Tatiana; Ruel, Hubertus Johannes Maria; Leino, T.; Saarinen, T.; Klein, S.

    2004-01-01

    Discourse-based analysis of organizations is not new in the field of interpretive social studies. Only recently, however, have information systems (IS) studies also shown a keen interest in discourse (Wynn et al., 2002). The IS field has grown significantly in its multiplicity that is echoed in the

  11. Scenario aggregation and analysis via Mean-Shift Methodology

    International Nuclear Information System (INIS)

    Mandelli, D.; Yilmaz, A.; Metzroth, K.; Aldemir, T.; Denning, R.

    2010-01-01

    A new generation of dynamic methodologies is being developed for nuclear reactor probabilistic risk assessment (PRA) which explicitly account for the time element in modeling the probabilistic system evolution and use numerical simulation tools to account for possible dependencies between failure events. The dynamic event tree (DET) approach is one of these methodologies. One challenge with dynamic PRA methodologies is the large amount of data they produce which may be difficult to analyze without appropriate software tools. The concept of 'data mining' is well known in the computer science community and several methodologies have been developed in order to extract useful information from a dataset with a large number of records. Using the dataset generated by the DET analysis of the reactor vessel auxiliary cooling system (RVACS) of an ABR-1000 for an aircraft crash recovery scenario and the Mean-Shift Methodology for data mining, it is shown how clusters of transients with common characteristics can be identified and classified. (authors)
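
    As a toy illustration of the clustering idea only (not the paper's RVACS data or its actual kernel), a minimal 1-D mean-shift can be sketched as follows: each point is iteratively shifted to the mean of its neighbours within a bandwidth, and points that converge to the same mode form a cluster.

```python
# Illustrative sketch of 1-D mean-shift with a flat kernel, in the spirit of
# clustering scalar features of DET transients. Data and bandwidth are invented.

def mean_shift_1d(points, bandwidth, iters=50):
    modes = []
    for p in points:
        x = p
        for _ in range(iters):
            window = [q for q in points if abs(q - x) <= bandwidth]
            x = sum(window) / len(window)   # shift to the local mean
        modes.append(round(x, 6))
    # Merge modes that converged to (nearly) the same point.
    clusters = []
    for m in modes:
        if not any(abs(m - c) < bandwidth / 10 for c in clusters):
            clusters.append(m)
    return sorted(clusters)

data = [1.0, 1.1, 0.9, 5.0, 5.2, 4.9, 5.1]
print(mean_shift_1d(data, bandwidth=1.0))   # -> [1.0, 5.05]
```

    Transients whose features fall in the same basin of attraction end up in the same cluster, which is the mechanism that lets common accident characteristics be identified automatically.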

  12. Methodological aspects on drug receptor binding analysis

    International Nuclear Information System (INIS)

    Wahlstroem, A.

    1978-01-01

    Although drug receptors occur in relatively low concentrations, they can be visualized by the use of appropriate radioindicators. In most cases the procedure is rapid and can reach a high degree of accuracy. Specificity of the interaction is studied by competition analysis. The necessity of using several radioindicators to define a receptor population is emphasized. It may be possible to define isoreceptors and drugs with selectivity for one isoreceptor. (Author)

  13. Cyber attacks against state estimation in power systems: Vulnerability analysis and protection strategies

    Science.gov (United States)

    Liu, Xuan

    The power grid is one of the most critical infrastructures of a nation and can suffer a variety of cyber attacks. With the development of the Smart Grid, the false data injection attack has recently attracted wide research interest. This thesis proposes a false data attack model with incomplete network information and develops optimal attack strategies for attacking load measurements and the real-time topology of a power grid. The impacts of false data on the economic and reliable operation of power systems are quantitatively analyzed in this thesis. To mitigate the risk of cyber attacks, distributed protection strategies are also developed. It has been shown that an attacker can design false data to avoid being detected by the control center if the network information of a power grid is known to the attacker. In practice, however, it is very hard or even impossible for an attacker to obtain all network information of a power grid. In this thesis, we propose a local load redistribution attack model based on incomplete network information and show that an attacker only needs the network information of the local attacking region to inject false data into smart meters in that region without being detected by the state estimator. A heuristic algorithm is developed to determine a feasible attacking region from reduced network information. This thesis also investigates the impacts of false data on the operation of power systems. It has been shown that false data can be designed by an attacker to: 1) mask the real-time topology of a power grid; 2) overload a transmission line; 3) disturb line outage detection based on PMU data. To mitigate the risk of cyber attacks, this thesis proposes a new protection strategy, which intends to mitigate the damage of false data injection attacks by protecting a small set of critical measurements. To further reduce the computation complexity, a mixed integer linear programming approach is also proposed to
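
    The undetectability result summarized above has a well-known linear-algebra core: in a DC state estimator z = Hx + e, any injected vector of the form a = Hc leaves the measurement residual unchanged. The sketch below illustrates only that textbook condition on an invented 3-measurement, 2-state toy system; it is not the thesis's model or data.

```python
# Toy demonstration (invented numbers): an attack a = H*c is invisible to
# residual-based bad-data detection in a least-squares DC state estimator.

def transpose(M):
    return [list(r) for r in zip(*M)]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def solve2(A, b):
    # Solve a 2x2 system A*x = b by Cramer's rule.
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [(b[0] * A[1][1] - A[0][1] * b[1]) / det,
            (A[0][0] * b[1] - b[0] * A[1][0]) / det]

def residual_norm(H, z):
    # Least-squares estimate xhat = (H^T H)^-1 H^T z, then residual r = z - H*xhat.
    Ht = transpose(H)
    HtH = matmul(Ht, H)
    Htz = [sum(h * zi for h, zi in zip(row, z)) for row in Ht]
    xhat = solve2(HtH, Htz)
    r = [z[i] - sum(H[i][j] * xhat[j] for j in range(2)) for i in range(3)]
    return sum(ri * ri for ri in r) ** 0.5

H = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]   # measurement matrix (3 meas., 2 states)
z = [1.02, 1.98, 3.05]                     # honest measurements
c = [0.5, -0.3]                            # attacker's chosen state shift
a = [sum(H[i][j] * c[j] for j in range(2)) for i in range(3)]
z_attacked = [zi + ai for zi, ai in zip(z, a)]

print(round(residual_norm(H, z), 6))
print(round(residual_norm(H, z_attacked), 6))  # same residual: attack passes the test
```

    Because the attack lies in the column space of H, the estimator simply shifts its state estimate by c, so the residual test cannot flag it; this is why protecting a small set of measurements that breaks this condition is an effective defence.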

  14. Advanced Methodology for Containment M/E Release Analysis

    International Nuclear Information System (INIS)

    Kim, C. W.; Park, S. J.; Song, J. H.; Choi, H. R.; Seo, J. T.

    2006-01-01

    Recently, a new mass and energy (M/E) release analysis methodology for equipment environmental qualification (EEQ) under loss-of-coolant accident (LOCA) conditions has been developed and applied to the small break LOCA (SBLOCA). This new M/E release analysis methodology for EEQ is extended here to the M/E release analysis for containment design for the large break LOCA (LBLOCA) and the main steam line break (MSLB) accident. The advanced M/E release methodology for containment design uses the same engine as the M/E methodology for EEQ; however, conservative approaches for the M/E release, such as a break spillage model and a multiplier on the heat transfer coefficient (HTC), are added. The computer code systems used in this methodology are RELAP5K/CONTEMPT4 (or RELAP5-ME), like KREM (KEPRI Realistic Evaluation Model), which couples RELAP5/MOD3.1/K and CONTEMPT4/MOD5. RELAP5K is based on RELAP5/MOD3.1/K and includes conservatisms for the M/E release and a long-term analysis model. The advanced methodology, adopting recent analysis technology, is able to calculate the various transient stages of a LOCA in a single code system and can also calculate the M/E release during the long-term cooling period together with the containment response. This advanced methodology for the M/E release was developed on the basis of the LOCA and then applied to the MSLB. The results are compared with the Ulchin Nuclear Units (UCN) 3 and 4 FSAR.

  15. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Nachtigal, Noel M. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). System Analytics; Fruetel, Julia A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Gleason, Nathaniel J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Helms, Jovana [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Imbro, Dennis Raymond [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Sumner, Matthew C. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis

    2013-10-01

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  16. Towards the Development of a Methodology for the Cyber Security Analysis of Safety Related Nuclear Digital I and C Systems

    International Nuclear Information System (INIS)

    Khand, Parvaiz Ahmed; Seong, Poong Hyun

    2007-01-01

    In nuclear power plants the redundant safety related systems are designed to take automatic action to prevent and mitigate accident conditions if the operators and the non-safety systems fail to maintain the plant within normal operating conditions. In case of an event, the failure of these systems has catastrophic consequences. The tendency in the industry over the past 10 years has been to use commercial off-the-shelf (COTS) technologies in these systems. COTS software was written with attention to function and performance rather than security. COTS hardware is usually designed to fail safe, but security vulnerabilities could be exploited by an attacker to disable the fail-safe mechanisms. Moreover, the use of open protocols and operating systems in these technologies makes the plants vulnerable to a host of cyber attacks. An effective security analysis process is required during all life cycle phases of these systems in order to ensure security against cyber attacks. We are developing a methodology for the cyber security analysis of safety related nuclear digital I and C systems. This methodology will cover all phases of the development, operation and maintenance processes of the software life cycle. In this paper, we will present a security analysis process for the concept stage of the software development life cycle

  17. Advanced Power Plant Development and Analysis Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    A.D. Rao; G.S. Samuelsen; F.L. Robson; B. Washom; S.G. Berenyi

    2006-06-30

    Under the sponsorship of the U.S. Department of Energy/National Energy Technology Laboratory, a multi-disciplinary team led by the Advanced Power and Energy Program of the University of California at Irvine is defining the system engineering issues associated with the integration of key components and subsystems into advanced power plant systems with goals of achieving high efficiency and minimized environmental impact while using fossil fuels. These power plant concepts include 'Zero Emission' power plants and the 'FutureGen' H2 co-production facilities. The study is broken down into three phases. Phase 1 of this study consisted of utilizing advanced technologies that are expected to be available in the 'Vision 21' time frame such as mega scale fuel cell based hybrids. Phase 2 includes current state-of-the-art technologies and those expected to be deployed in the nearer term such as advanced gas turbines and high temperature membranes for separating gas species and advanced gasifier concepts. Phase 3 includes identification of gas turbine based cycles and engine configurations suitable to coal-based gasification applications and the conceptualization of the balance of plant technology, heat integration, and the bottoming cycle for analysis in a future study. Also included in Phase 3 is the task of acquiring/providing turbo-machinery in order to gather turbo-charger performance data that may be used to verify simulation models as well as establishing system design constraints. The results of these various investigations will serve as a guide for the U. S. Department of Energy in identifying the research areas and technologies that warrant further support.

  18. Reachable Sets of Hidden CPS Sensor Attacks : Analysis and Synthesis Tools

    NARCIS (Netherlands)

    Murguia, Carlos; van de Wouw, N.; Ruths, Justin; Dochain, Denis; Henrion, Didier; Peaucelle, Dimitri

    2017-01-01

    For given system dynamics, control structure, and fault/attack detection procedure, we provide mathematical tools–in terms of Linear Matrix Inequalities (LMIs)–for characterizing and minimizing the set of states that sensor attacks can induce in the system while keeping the alarm rate of the

  19. Physical Attacks: An Analysis of Teacher Characteristics Using the Schools and Staffing Survey

    Science.gov (United States)

    Williams, Thomas O., Jr.; Ernst, Jeremy V.

    2016-01-01

    This study investigated physical attacks as reported by public school teachers on the most recent Schools and Staffing Survey (SASS) from the National Center for Education Statistics administered by the Institute of Educational Sciences. For this study, characteristics of teachers who responded affirmatively to having been physically attacked in…

  20. Real-Time Detection of Application-Layer DDoS Attack Using Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Tongguang Ni

    2013-01-01

    Full Text Available Distributed denial of service (DDoS) attacks are one of the major threats to the current Internet, and application-layer DDoS attacks utilizing legitimate HTTP requests to overwhelm victim resources are harder to detect. Consequently, neither intrusion detection systems (IDS) nor the victim server can detect the malicious packets. In this paper, a novel approach to detect application-layer DDoS attacks is proposed based on the entropy of HTTP GET requests per source IP address (HRPI). By approximating the adaptive autoregressive (AAR) model, the HRPI time series is transformed into a multidimensional vector series. Then, a trained support vector machine (SVM) classifier is applied to identify the attacks. Experiments with several databases were performed, and the results show that this approach can detect application-layer DDoS attacks effectively.
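
    Only the first stage of the pipeline described above lends itself to a compact sketch: computing the entropy of HTTP GET requests per source IP (HRPI) over a time window. The traffic below is synthetic, and the AAR transformation and SVM classification stages are omitted.

```python
# Sketch of the HRPI feature only: Shannon entropy of the per-source-IP
# distribution of GET requests in a window. Traffic is synthetic.
import math
from collections import Counter

def hrpi_entropy(source_ips):
    """Shannon entropy (bits) of the GET-request distribution over source IPs."""
    counts = Counter(source_ips)
    total = sum(counts.values())
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# Normal window: requests spread over many clients -> high entropy.
normal = ["10.0.0.%d" % (i % 50) for i in range(1000)]
# Attack window: a few hosts flood GETs -> entropy collapses.
attack = ["10.0.0.%d" % (i % 3) for i in range(1000)]

print(round(hrpi_entropy(normal), 3))   # 5.644 (= log2 50)
print(round(hrpi_entropy(attack), 3))   # 1.585 (~ log2 3)
```

    A sustained shift of this entropy series away from its baseline is the kind of signal the subsequent AAR/SVM stages of the paper are trained to recognize.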

  1. Techniques of sample attack used in soil and mineral analysis. Phase I

    International Nuclear Information System (INIS)

    Chiu, N.W.; Dean, J.R.; Sill, C.W.

    1984-07-01

    Several techniques of sample attack for the determination of radioisotopes are reviewed. These techniques include: 1) digestion with nitric or hydrochloric acid in a Parr digestion bomb, 2) digestion with a mixture of nitric and hydrochloric acids, 3) digestion with a mixture of hydrofluoric, nitric and perchloric acids, and 4) fusion with sodium carbonate, potassium fluoride or alkali pyrosulfates. The effectiveness of these techniques in decomposing various soils and minerals containing radioisotopes such as lead-210, uranium, thorium and radium-226 is discussed. The combined procedure of potassium fluoride fusion followed by alkali pyrosulfate fusion is recommended for radium-226, uranium and thorium analysis. This technique guarantees the complete dissolution of samples containing refractory materials such as silica, silicates, carbides, oxides and sulfates. For lead-210 analysis, digestion with a mixture of hydrofluoric, nitric and perchloric acids followed by fusion with alkali pyrosulfate is recommended. These two procedures are detailed. Schemes for the sequential separation of the radioisotopes from a dissolved sample solution are outlined. Procedures for radiochemical analysis are suggested

  2. Development of a Long Term Cooling Analysis Methodology Using RELAP5

    International Nuclear Information System (INIS)

    Lee, S. I.; Jeong, J. H.; Ban, C. H.; Oh, S. J.

    2012-01-01

    Since the revision of 10CFR50.46 in 1988, which allowed the BE (Best-Estimate) method in analyzing the safety performance of a nuclear power plant, safety analysis methodologies have changed continuously from conservative EM (Evaluation Model) approaches to BE ones. In this context, LSC (Long-Term core Cooling) methodologies have been reviewed by the regulatory bodies of the USA and Korea. Some non-conservatisms and deficiencies of the old methodology were identified, and as a result, the USNRC suspended the approval of CENPD-254-P-A, the old LSC methodology for CE-designed NPPs. The regulatory bodies requested that the non-conservatisms be removed and that system transient behavior be reflected in all the LSC methodologies used. In the present study, a new LSC methodology using RELAP5 is developed. RELAP5 and a newly developed code, BACON (Boric Acid Concentration Of Nuclear power plant), are used to calculate the transient behavior of the system and the boric acid concentration, respectively. The full range of the break spectrum is considered, and the applicability is confirmed through plant demonstration calculations. The results compare well with those of the old methodology; therefore, the new methodology could be applied with no significant changes to current LSC plans

  3. CONTENT ANALYSIS, DISCOURSE ANALYSIS, AND CONVERSATION ANALYSIS: PRELIMINARY STUDY ON CONCEPTUAL AND THEORETICAL METHODOLOGICAL DIFFERENCES

    Directory of Open Access Journals (Sweden)

    Anderson Tiago Peixoto Gonçalves

    2016-08-01

    Full Text Available This theoretical essay aims to reflect on three models of text interpretation used in qualitative research, which are often confused in their concepts and methodologies (Content Analysis, Discourse Analysis, and Conversation Analysis. After the presentation of the concepts, the essay proposes a preliminary discussion on the conceptual and theoretical methodological differences perceived between them. A review of the literature was performed to support the conceptual and theoretical methodological discussion. It could be verified that the models differ in the type of strategy used in the treatment of texts, the type of approach, and the appropriate theoretical position.

  4. Methodology for flood risk analysis for nuclear power plants

    International Nuclear Information System (INIS)

    Wagner, D.P.; Casada, M.L.; Fussell, J.B.

    1984-01-01

    The methodology for flood risk analysis described here addresses the effects of a flood on nuclear power plant safety systems. Combining the results of this method with the probability of a flood allows the effects of flooding to be included in a probabilistic risk assessment. The five-step methodology includes accident sequence screening to focus the detailed analysis efforts on the accident sequences that are significantly affected by a flood event. The quantitative results include the flood's contribution to system failure probability, accident sequence occurrence frequency and consequence category occurrence frequency. The analysis can be added to existing risk assessments without a significant loss in efficiency. The results of two example applications show the usefulness of the methodology. Both examples rely on the Reactor Safety Study for the required risk assessment inputs and present changes in the Reactor Safety Study results as a function of flood probability
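
    The quantification step described above amounts to weighting flood-conditional failure probabilities by the flood's frequency. A toy calculation of this combination (all numbers invented, not from the paper or the Reactor Safety Study):

```python
# Illustrative arithmetic only (invented numbers): folding a flood's
# contribution into an accident-sequence frequency, as the methodology's
# quantification step does.

flood_freq = 1e-3            # floods per reactor-year (assumed)
p_sys_fail_no_flood = 1e-4   # system failure probability, dry plant (assumed)
p_sys_fail_flood = 0.2       # same system with flood-induced failures (assumed)

seq_freq_base = p_sys_fail_no_flood             # baseline sequence contribution
seq_freq_flood = flood_freq * p_sys_fail_flood  # flood-initiated contribution

print(seq_freq_flood)                  # flood-initiated contribution (2e-4/yr here)
print(seq_freq_flood > seq_freq_base)  # the flood path dominates in this toy case
```

    Screening sequences first, as the five-step method prescribes, keeps this detailed quantification focused on the sequences where the flood term actually changes the answer.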

  5. Incorporation of advanced accident analysis methodology into safety analysis reports

    International Nuclear Information System (INIS)

    2003-05-01

    as structural analysis codes and computational fluid dynamics codes (CFD) are applied. The initial code development took place in the sixties and seventies and resulted in a set of quite conservative codes for reactor dynamics, thermal-hydraulics and containment analysis. The most important limitations of these codes came from insufficient knowledge of the physical phenomena and from limited computer memory and speed. Very significant advances have been made in the development of the code systems during the last twenty years in all of the above areas. If the data for the physical models of the code are sufficiently well established and allow quite a realistic analysis, these newer versions are called advanced codes. The assumptions used in deterministic safety analysis vary from very pessimistic to realistic assumptions. In accident analysis terminology, it is customary to call the pessimistic assumptions 'conservative' and the realistic assumptions 'best estimate'. The assumptions can refer to the selection of physical models, the introduction of these models into the code, and the initial and boundary conditions, including the performance and failures of the equipment and human action. The advanced methodology in the present report means the application of advanced codes (or best estimate codes), which sometimes represent a combination of various advanced codes for separate stages of the analysis, and in some cases in combination with experiments. Safety Analysis Reports are required to be available before and during the operation of the plant in most countries. The contents, scope and stages of the SAR vary among the countries. The guide applied in the USA, i.e. Regulatory Guide 1.70, is representative of the way in which SARs are prepared in many countries. During the design phase, a preliminary safety analysis report (PSAR) is requested in many countries, and the final safety analysis report (FSAR) is required for the operating licence. 
There is

  6. Complexity and Vulnerability Analysis of Critical Infrastructures: A Methodological Approach

    Directory of Open Access Journals (Sweden)

    Yongliang Deng

    2017-01-01

    Full Text Available Vulnerability analysis of network models has been widely adopted to explore the potential impacts of random disturbances, deliberate attacks, and natural disasters. However, almost all these models are based on a fixed topological structure, in which the physical properties of infrastructure components and their interrelationships are not well captured. In this paper, a new research framework is put forward to quantitatively explore and assess the complexity and vulnerability of critical infrastructure systems. Then, a case study is presented to prove the feasibility and validity of the proposed framework. After constructing the metro physical network (MPN), Pajek is employed to analyze its topological properties, including degree, betweenness, average path length, network diameter, and clustering coefficient. A comprehensive understanding of the complexity of the MPN would help the metro system prevent near-misses or accidents and support decision-making in emergency situations. Moreover, through the analysis of two simulation protocols for system component failure, it is found that the MPN turns out to be vulnerable when high-degree nodes or high-betweenness edges are attacked. These findings will be conducive to offering recommendations and proposals for robust design, risk-based decision-making, and prioritization of risk reduction investment.
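
    Two of the topological metrics listed above, node degree and local clustering coefficient, can be sketched on a toy undirected graph. This is illustrative Python, not the paper's Pajek analysis of the actual metro network.

```python
# Minimal sketch (toy graph, not the MPN): degree and local clustering
# coefficient on an undirected graph given as an adjacency dict.

def degree(g, v):
    return len(g[v])

def clustering(g, v):
    """Fraction of neighbour pairs of v that are themselves connected."""
    nbrs = list(g[v])
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for i in range(k) for j in range(i + 1, k)
                if nbrs[j] in g[nbrs[i]])
    return 2.0 * links / (k * (k - 1))

g = {
    "A": {"B", "C", "D"},
    "B": {"A", "C"},
    "C": {"A", "B"},
    "D": {"A"},
}
print(degree(g, "A"))        # 3
print(clustering(g, "A"))    # 1 of 3 neighbour pairs linked -> 0.333...
```

    In a degree-targeted attack simulation like the one the paper runs, nodes would be removed in descending order of this degree value and the network's connectivity re-measured after each removal.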

  7. Frenzied attacks. A micro-sociological analysis of the emotional dynamics of extreme youth violence.

    Science.gov (United States)

    Weenink, Don

    2014-09-01

    Inspired by phenomenological and interactionist studies of youth violence, this article offers an empirical evaluation of Collins's micro-sociological theory of violence. The main question is whether situations of extreme violence have distinct situational dynamics. Based on analyses of 159 interactions taken from judicial case files, situations of extreme youth violence, here called frenzied attacks, were identified on the basis of the state of encapsulation of the attackers (absorbed in the violence, their sole focus is the destruction of the victim) and the disproportionateness of the violence (the attackers continue to hurt the victims even though they do not pose a threat or a challenge to them). Qualitative and statistical analyses revealed that this emotional state results from a social figuration in which the emotional balance shifts toward complete dominance of the attackers. Thus, the occurrence of frenzied attacks is associated with the moment victims hit the ground, paralyse and start to apologize, with the numerical dominance of the attackers' supportive group and with feelings of group membership, in the form of solidarity excitement and family ties in the attackers' group. Alcohol intoxication is of influence as well, but contrary to the expectation, this effect was independent from solidarity excitement. The article concludes that Collins's theory on the emotional dynamics of violence adds a new dimension to the phenomenological and interactionist traditions of research. © London School of Economics and Political Science 2014.

  8. Simplified methodology for analysis of Angra-1 containment

    International Nuclear Information System (INIS)

    Neves Conti, T. das; Souza, A.L. de; Sabundjian, G.

    1988-01-01

    A simplified analysis methodology was developed to simulate a Large Break Loss of Coolant Accident in the Angra 1 Nuclear Power Station. Using the RELAP5/MOD1, RELAP4/MOD5 and CONTEMPT-LT codes, the time variation of pressure and temperature in the containment was analysed. The obtained data were compared with the Angra 1 Final Safety Analysis Report and with those calculated by a detailed model. The results obtained with this new methodology, together with its small computational simulation time, were satisfactory for a preliminary evaluation of the Angra 1 global parameters. (author) [pt

  9. PIXE methodology of rare earth element analysis and its applications

    International Nuclear Information System (INIS)

    Ma Xinpei

    1992-01-01

    The Proton Induced X-ray Emission (PIXE) methodology for rare earth element (REE) analysis is discussed, including the significance of REE analysis, the principles of PIXE applied to REEs, the selection of characteristic X-rays for lanthanide-series elements, the deconvolution of highly overlapped PIXE spectra, and the minimum detection limit (MDL) of REEs. Some practical applications are presented, and the special features of PIXE analysis of high-purity REE chemicals are discussed. (Author)

  10. A Global Sensitivity Analysis Methodology for Multi-physics Applications

    Energy Technology Data Exchange (ETDEWEB)

    Tong, C H; Graziani, F R

    2007-02-02

    Experiments are conducted to draw inferences about an entire ensemble based on a selected number of observations. This applies to both physical experiments as well as computer experiments, the latter of which are performed by running the simulation models at different input configurations and analyzing the output responses. Computer experiments are instrumental in enabling model analyses such as uncertainty quantification and sensitivity analysis. This report focuses on a global sensitivity analysis methodology that relies on a divide-and-conquer strategy and uses intelligent computer experiments. The objective is to assess qualitatively and/or quantitatively how the variabilities of simulation output responses can be accounted for by input variabilities. We address global sensitivity analysis in three aspects: methodology, sampling/analysis strategies, and an implementation framework. The methodology consists of three major steps: (1) construct credible input ranges; (2) perform a parameter screening study; and (3) perform a quantitative sensitivity analysis on a reduced set of parameters. Once identified, research effort should be directed to the most sensitive parameters to reduce their uncertainty bounds. This process is repeated with tightened uncertainty bounds for the sensitive parameters until the output uncertainties become acceptable. To accommodate the needs of multi-physics application, this methodology should be recursively applied to individual physics modules. The methodology is also distinguished by an efficient technique for computing parameter interactions. Details for each step will be given using simple examples. Numerical results on large scale multi-physics applications will be available in another report. Computational techniques targeted for this methodology have been implemented in a software package called PSUADE.
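
    Step (2) of the methodology, parameter screening, can be caricatured by a simple one-at-a-time sweep that ranks parameters by the output swing across their credible ranges. The model and ranges below are invented, and the report's PSUADE implementation is far more sophisticated (it also estimates parameter interactions, for example).

```python
# Hedged sketch of a one-at-a-time screening step (invented toy model):
# rank parameters by how much the output moves across each credible range.

def screen(model, ranges, base):
    """Return (name, swing) pairs sorted by output swing, largest first."""
    swings = {}
    for name, (lo, hi) in ranges.items():
        lo_in = dict(base, **{name: lo})   # vary one parameter at a time
        hi_in = dict(base, **{name: hi})
        swings[name] = abs(model(hi_in) - model(lo_in))
    return sorted(swings.items(), key=lambda kv: kv[1], reverse=True)

# Toy model standing in for a multi-physics simulation.
def model(p):
    return 10.0 * p["a"] + 0.1 * p["b"] ** 2 + p["c"]

ranges = {"a": (0.0, 1.0), "b": (0.0, 2.0), "c": (0.0, 0.5)}
base = {"a": 0.5, "b": 1.0, "c": 0.25}
print(screen(model, ranges, base))   # 'a' dominates, then 'c', then 'b'
```

    The parameters that survive this screen are the ones carried into the quantitative sensitivity analysis of step (3), where their uncertainty bounds are progressively tightened.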

  11. Attack surfaces

    DEFF Research Database (Denmark)

    Gruschka, Nils; Jensen, Meiko

    2010-01-01

The new paradigm of cloud computing poses severe security risks to its adopters. In order to cope with these risks, appropriate taxonomies and classification criteria for attacks on cloud computing are required. In this work-in-progress paper we present one such taxonomy based on the notion of attack surfaces of the cloud computing scenario participants.

  12. Methodological Tool or Methodology? Beyond Instrumentality and Efficiency with Qualitative Data Analysis Software

    Directory of Open Access Journals (Sweden)

    Pengfei Zhao

    2016-04-01

    Full Text Available Qualitative data analysis software (QDAS has become increasingly popular among researchers. However, very few discussions have developed regarding the effect of QDAS on the validity of qualitative data analysis. It is a pressing issue, especially because the recent proliferation of conceptualizations of validity has challenged, and to some degree undermined, the taken-for-granted connection between the methodologically neutral understanding of validity and QDAS. This article suggests an alternative framework for examining the relationship between validity and the use of QDAS. Shifting the analytic focus from instrumentality and efficiency of QDAS to the research practice itself, we propose that qualitative researchers should formulate a "reflective space" at the intersection of their methodological approach, the built-in validity structure of QDAS and the specific research context, in order to make deliberative and reflective methodological decisions. We illustrate this new framework through discussion of a collaborative action research project. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1602160

  13. A methodology to incorporate organizational factors into human reliability analysis

    International Nuclear Information System (INIS)

    Li Pengcheng; Chen Guohua; Zhang Li; Xiao Dongsheng

    2010-01-01

A new holistic methodology for Human Reliability Analysis (HRA) is proposed to model the effects of organizational factors on human reliability. First, a conceptual framework is built, which is used to analyze the causal relationships between organizational factors and human reliability. Then, an inference model for HRA is built by combining the conceptual framework with Bayesian networks, which is used to execute causal and diagnostic inference on human reliability. Finally, a case example is presented to demonstrate the application of the proposed methodology. The results show that combining the conceptual model with Bayesian networks not only makes it easy to model the causal relationships between organizational factors and human reliability, but also, in a given context, allows human operational reliability to be measured quantitatively and the most likely root causes of human error to be identified and prioritized. (authors)
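The causal and diagnostic inference described above can be sketched with a three-node Bayesian network evaluated by brute-force enumeration; the structure (organization → procedures → human error) and all probabilities below are illustrative assumptions, not the paper's model:

```python
# Conditional probability tables (all values illustrative).
p_org = {True: 0.7, False: 0.3}                      # P(organization good)
p_proc = {True: {True: 0.9, False: 0.1},             # P(procedures good | org)
          False: {True: 0.4, False: 0.6}}
p_err = {True: {True: 0.01, False: 0.99},            # P(human error | procedures)
         False: {True: 0.10, False: 0.90}}

def joint(org, proc, err):
    # Chain-rule factorization of the three-node network.
    return p_org[org] * p_proc[org][proc] * p_err[proc][err]

def prob_error():
    # Causal inference: marginal probability of a human error.
    return sum(joint(o, p, True) for o in (True, False) for p in (True, False))

def prob_org_good_given_error():
    # Diagnostic inference: P(organization good | error observed).
    return sum(joint(True, p, True) for p in (True, False)) / prob_error()
```

Observing an error lowers the belief that the organization is good (from 0.7 to roughly 0.41), which is exactly the diagnostic direction of inference the abstract describes.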

  14. A methodology for radiological accidents analysis in industrial gamma radiography

    International Nuclear Information System (INIS)

    Silva, F.C.A. da.

    1990-01-01

A critical review of 34 published severe radiological accidents in industrial gamma radiography, which happened in 15 countries from 1960 to 1988, was performed. The most frequent causes, consequences and dose estimation methods were analysed, aiming to establish better procedures for radiation safety and accident analysis. The objective of this work is to elaborate a methodology for analyzing radiological accidents in industrial gamma radiography. The suggested methodology will enable professionals to determine the true causes of the event and to estimate the dose with good certainty. The technical analytical tree, recommended by the International Atomic Energy Agency for radiation protection and nuclear safety programs, was adopted in the elaboration of the suggested methodology. The viability of using the Electron Gamma Shower 4 Computer Code System to calculate the absorbed dose in radiological accidents in industrial gamma radiography, mainly in sup(192)Ir radioactive source handling situations, was also studied. (author)

  15. Disposal criticality analysis methodology's principal isotope burnup credit

    International Nuclear Information System (INIS)

    Doering, T.W.; Thomas, D.A.

    2001-01-01

    This paper presents the burnup credit aspects of the United States Department of Energy Yucca Mountain Project's methodology for performing criticality analyses for commercial light-water-reactor fuel. The disposal burnup credit methodology uses a 'principal isotope' model, which takes credit for the reduced reactivity associated with the build-up of the primary principal actinides and fission products in irradiated fuel. Burnup credit is important to the disposal criticality analysis methodology and to the design of commercial fuel waste packages. The burnup credit methodology developed for disposal of irradiated commercial nuclear fuel can also be applied to storage and transportation of irradiated commercial nuclear fuel. For all applications a series of loading curves are developed using a best estimate methodology and depending on the application, an additional administrative safety margin may be applied. The burnup credit methodology better represents the 'true' reactivity of the irradiated fuel configuration, and hence the real safety margin, than do evaluations using the 'fresh fuel' assumption. (author)
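The loading-curve idea can be sketched as a simple acceptance check: an assembly is loadable if its burnup lies above the curve of minimum required burnup versus initial enrichment, plus any administrative margin. The curve points below are purely hypothetical, not from the Yucca Mountain methodology:

```python
# Hypothetical loading curve: (initial enrichment wt% U-235,
# minimum required assembly-average burnup in GWd/MTU).
CURVE = [(2.0, 0.0), (3.0, 10.0), (4.0, 25.0), (5.0, 40.0)]

def min_required_burnup(enrichment):
    """Piecewise-linear interpolation of the loading curve."""
    if enrichment <= CURVE[0][0]:
        return CURVE[0][1]
    for (e0, b0), (e1, b1) in zip(CURVE, CURVE[1:]):
        if e0 <= enrichment <= e1:
            return b0 + (b1 - b0) * (enrichment - e0) / (e1 - e0)
    raise ValueError("enrichment above the qualified range of the curve")

def acceptable(enrichment, burnup, margin=0.0):
    """Best-estimate curve plus an optional administrative safety margin."""
    return burnup >= min_required_burnup(enrichment) + margin
```

For example, a 3.5 wt% assembly needs at least 17.5 GWd/MTU from this toy curve, and adding a 5 GWd/MTU administrative margin can reject an assembly the best-estimate curve would accept.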

  16. Design of Cyber Attack Precursor Symptom Detection Algorithm through System Base Behavior Analysis and Memory Monitoring

    Science.gov (United States)

    Jung, Sungmo; Kim, Jong Hyun; Cagalaban, Giovanni; Lim, Ji-Hoon; Kim, Seoksoo

Recently, botnet-based cyber attacks, including spam mail and DDoS attacks, have sharply increased, posing a fatal threat to Internet services. At present, antivirus businesses give top priority to detecting malicious code in the shortest time possible (Lv.2), based on the graph relating the spread of malicious code to time; that is, they detect malicious code only after it occurs. Even with early detection, however, the occurrence of the malicious code itself cannot be prevented. Thus, by analyzing system behaviors and monitoring memory, we have developed an algorithm that can detect precursor symptoms at Lv.1 and thereby prevent a cyber attack that uses the evasion method of an 'executing-environment-aware attack'.
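A precursor-symptom detector of this general kind can be sketched as a baseline-deviation test on monitored memory readings; the z-score rule and all numbers below are an illustrative assumption, not the authors' algorithm:

```python
from statistics import mean, stdev

def precursor_alert(baseline, reading, k=3.0):
    """Flag a precursor symptom when a monitored memory reading deviates
    more than k standard deviations from the learned baseline behavior."""
    mu, sigma = mean(baseline), stdev(baseline)
    return abs(reading - mu) > k * sigma

# Baseline memory footprint (MB) of a process under normal system behavior.
baseline = [100, 102, 98, 101, 99, 100, 103, 97]
```

With this baseline (mean 100, sample standard deviation 2), a reading of 120 trips the alert while a reading of 104 stays inside the learned envelope.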

  17. Analysis of attacks success in three matchup of soccer clubs Partisan and Cukaricki during the season 2014/2015: A case study

    Directory of Open Access Journals (Sweden)

    Živanović Vladimir

    2016-01-01

Full Text Available The aim of this research was to determine which differences in goal-attack variables in a soccer game make a statistically significant contribution to achieving a positive result in competition. We analyzed three matches played between two elite soccer clubs in Serbia, 'Partisan' Belgrade and 'Cukaricki' Belgrade, in the season 2014/2015. Attack types (continuous, fast attack and counter attack) and their finishes on goal or goals scored were analyzed. The results showed that interrupted and unrealized continuous team attacks (CTA) were by far the most frequent, compared to a much smaller number of efficient fast attacks (FA) and counter-attacks (CA); that is, a much slower mode of ball transition prevailed over the faster one in the observed matches. The greatest statistical differences were found in the variables ineffective attacks (IA, value of 10.5) and effective attacks (EA, value of 9), while the smallest statistical difference was found in the variable ineffective attacks with the ball kicked outside the goal (ICA(KBOTG), value of 1.5). These data correspond to the practical situation, since they determine the winner of a soccer match with the greatest certainty.

  18. Development of mass and energy release analysis methodology

    International Nuclear Information System (INIS)

    Kim, Cheol Woo; Song, Jeung Hyo; Park, Seok Jeong; Kim, Tech Mo; Han, Kee Soo; Choi, Han Rim

    2009-01-01

Recently, new approaches to accident analysis using realistic evaluation have been attempted. These new approaches provide more margin for plant safety, design, operation and maintenance. KREM (KEPRI Realistic Evaluation Methodology) for a large break loss-of-coolant accident (LOCA) uses the RELAP5/MOD3 computer code with realistic evaluation models. KOPEC has developed KIMERA (KOPEC Improved Mass and Energy Release Analysis methodology), based on realistic evaluation, to improve the analysis method for mass and energy (M/E) release and to obtain adequate margin. KIMERA uses a simplified single code system, unlike conventional M/E release analysis methodologies. This simple code system reduces the computing effort, especially for LOCA analysis. The computer code system of this methodology is RELAP5K/CONTEMPT4 (or RELAP5-ME), which, like the KREM methodology, couples RELAP5/MOD3.1/K and CONTEMPT4/MOD5. The new methodology, based on the same engine as KREM, adopts conservative approaches for the M/E release such as a break spillage model, a multiplier on the heat transfer coefficient (HTC), and a long-term cooling model. KIMERA was developed based on a LOCA, applied to a main steam line break (MSLB), and approved by the Korean Government. KIMERA can calculate the various transient stages of a LOCA in one pass within a single code system, and can perform the M/E release analysis during the long-term cooling period together with the containment pressure and temperature (P/T) response. The containment P/T analysis results are compared with those in the FSAR for Ulchin Nuclear Power Plant Units 3 and 4 (UCN 3 and 4), an OPR1000 (Optimized Power Reactor 1000) type nuclear power plant. The results for a large break LOCA and an MSLB are similar to those of the FSAR for UCN 3 and 4. However, the containment pressure during the post-blowdown period of a large break LOCA has a much lower second peak than the first peak.
The resultant containment peak

  19. HackAttack: Game-Theoretic Analysis of Realistic Cyber Conflicts

    Energy Technology Data Exchange (ETDEWEB)

    Ferragut, Erik M [ORNL; Brady, Andrew C [Jefferson Middle School, Oak Ridge, TN; Brady, Ethan J [Oak Ridge High School, Oak Ridge, TN; Ferragut, Jacob M [Oak Ridge High School, Oak Ridge, TN; Ferragut, Nathan M [Oak Ridge High School, Oak Ridge, TN; Wildgruber, Max C [ORNL

    2016-01-01

Game theory is appropriate for studying cyber conflict because it allows for an intelligent and goal-driven adversary. Applications of game theory have led to a number of results regarding optimal attack and defense strategies. However, the overwhelming majority of applications explore overly simplistic games, often ones in which each participant's actions are visible to every other participant. These simplifications strip away the fundamental properties of real cyber conflicts: probabilistic alerting, hidden actions, and unknown opponent capabilities. In this paper, we demonstrate that it is possible to analyze a more realistic game, one in which different resources have different weaknesses, players have different exploits, and moves occur in secrecy, but they can be detected. Certainly, more advanced and complex games are possible, but the game presented here is more realistic than any other game we know of in the scientific literature. While optimal strategies can be found for simpler games using calculus, case-by-case analysis, or, for stochastic games, Q-learning, our more complex game is more naturally analyzed using the same methods used to study other complex games, such as checkers and chess. We define a simple evaluation function and employ multi-step searches to create strategies. We show that such scenarios can be analyzed, and find that in cases of extreme uncertainty, it is often better to ignore one's opponent's possible moves. Furthermore, we show that a simple evaluation function in a complex game can lead to interesting and nuanced strategies.
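The paper's approach of a simple evaluation function combined with multi-step search can be sketched as a depth-limited negamax on a toy attacker/defender game; the game, state encoding, and scoring below are illustrative and far simpler than the paper's scenario:

```python
def negamax(state, depth, evaluate, moves, apply_move):
    """Depth-limited negamax: search several moves ahead and score the
    leaves with a simple evaluation function."""
    ms = moves(state)
    if depth == 0 or not ms:
        return evaluate(state), None
    best_score, best_move = float("-inf"), None
    for m in ms:
        child, _ = negamax(apply_move(state, m), depth - 1,
                           evaluate, moves, apply_move)
        if -child > best_score:
            best_score, best_move = -child, m
    return best_score, best_move

# Toy cyber game: machines with values; players alternately compromise one.
# State: (remaining machine values, mover's total, opponent's total).
def moves(state):
    return sorted(state[0])

def apply_move(state, m):
    remaining, mine, theirs = state
    # After the move the other player becomes the mover, so the totals swap.
    return (remaining - {m}, theirs, mine + m)

def evaluate(state):
    _, mine, theirs = state
    return mine - theirs  # material lead of the player to move

score, move = negamax((frozenset({1, 3, 7}), 0, 0), 3,
                      evaluate, moves, apply_move)
```

With full look-ahead the first player grabs the most valuable machine and finishes 8 to 3 up, i.e. a score of 5.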

  20. Interpersonal Dynamics in a Simulated Prison: A Methodological Analysis

    Science.gov (United States)

    Banuazizi, Ali; Movahedi, Siamak

    1975-01-01

A critical overview is presented of the Stanford Prison Experiment, conducted by Zimbardo and his coinvestigators, in which they attempted a structural analysis of the problems of imprisonment. Key assumptions are questioned, primarily on methodological grounds, casting doubt on the plausibility of the experimenters' final causal inferences.

  1. Methodology for reactor core physics analysis - part 2

    International Nuclear Information System (INIS)

    Ponzoni Filho, P.; Fernandes, V.B.; Lima Bezerra, J. de; Santos, T.I.C.

    1992-12-01

    The computer codes used for reactor core physics analysis are described. The modifications introduced in the public codes and the technical basis for the codes developed by the FURNAS utility are justified. An evaluation of the impact of these modifications on the parameter involved in qualifying the methodology is included. (F.E.). 5 ref, 7 figs, 5 tabs

  2. A Local Approach Methodology for the Analysis of Ultimate Strength ...

    African Journals Online (AJOL)

    The local approach methodology in contrast to classical fracture mechanics can be used to predict the onset of tearing fracture, and the effects of geometry in tubular joints. Finite element analysis of T-joints plate geometries, and tubular joints has been done. The parameters of constraint, equivalent stress, plastic strain and ...

  3. METHODOLOGY FOR INSTITUTIONAL ANALYSIS OF STABILITY OF REGIONAL FINANCIAL SYSTEM

    Directory of Open Access Journals (Sweden)

    A. V. Milenkov

    2016-01-01

Full Text Available The relevance of the article is due to the urgent need to develop a methodological framework for regional finance research, dictated by the substantial increase in the volume and complexity of the socio-economic problems whose solution, including its financial support, is the responsibility of the public authorities of the Russian Federation. The article presents the results of the author's research in the field of institutional analysis of the stability of the regional financial system, understood as a set of institutions and organizations interacting with the regional real sector of the economy. Methodology. The methodological basis of the article comprises economic and statistical methods of analysis, legal documents on the sustainability of regional financial systems, and publications in the field of economic and financial security. Conclusions / relevance. The practical significance of the work lies in its conclusions and recommendations, aimed at the widespread use and adaptation of institutional analysis of the sources of regional financial system stability, which can be used by the legislative and executive authorities of the Russian Federation and the Ministry of Defence in their current activity. Methodological approaches to structuring the objectives of institutional analysis are based on a hierarchical representation of the institutional environment in which the financial system of a federal subject functions.

  4. Active disturbance rejection control: methodology and theoretical analysis.

    Science.gov (United States)

    Huang, Yi; Xue, Wenchao

    2014-07-01

The methodology of ADRC and the progress of its theoretical analysis are reviewed in the paper. Several breakthroughs for control of nonlinear uncertain systems, made possible by ADRC, are discussed. The key in employing ADRC, which is to accurately determine the "total disturbance" that affects the output of the system, is illuminated. The latest results in theoretical analysis of the ADRC-based control systems are introduced.
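The core of ADRC, estimating the "total disturbance" so it can be canceled, can be sketched with a linear extended state observer (ESO) on a first-order plant; the plant, gains, and disturbance value below are illustrative choices, not from the paper:

```python
def estimate_total_disturbance(d=2.0, u=0.0, dt=0.001, steps=5000,
                               beta1=60.0, beta2=900.0):
    """Linear ESO for the plant y' = d + u with an unknown constant total
    disturbance d: state z1 tracks the output y, extended state z2 tracks d."""
    y, z1, z2 = 0.0, 0.0, 0.0
    for _ in range(steps):
        y += dt * (d + u)                # true plant (simulated)
        e = z1 - y                       # observer output error
        z1 += dt * (z2 + u - beta1 * e)  # output estimate
        z2 += dt * (-beta2 * e)          # total-disturbance estimate
    return z2

d_hat = estimate_total_disturbance()
```

The gains place both observer poles at -30, so after 5 simulated seconds `d_hat` has converged to the true disturbance 2.0; in a full ADRC loop this estimate is subtracted from the control signal to cancel the disturbance.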

  5. [Analysis on the individual-response behavior and the influence factors to violent terrorist attacks among undergraduates in Guangzhou].

    Science.gov (United States)

    Ye, Yunfeng; Rao, Jiaming; Wang, Haiqing; Zhang, Siheng; Li, Yang; Wang, Shengyong; Dong, Xiaomei

    2015-04-01

To analyze related behaviors of individual preparedness and influencing factors regarding violent terrorist attacks among undergraduates. A total of 1 800 undergraduates from 5 colleges or universities in Guangzhou were selected, using the stratified cluster method. A questionnaire on responses to violent terrorist attacks was used to assess individual preparedness behaviors among undergraduates. A self-made questionnaire was applied to collect information on demographic factors, cognition and preparedness behaviors. The mean score of individual preparedness behavior among undergraduates was 13.49 ± 5.02, while information-seeking behavior was 4.27 ± 1.64, avoidance behavior was 5.97 ± 2.16 and violent terrorist attack response behavior was 23.73 ± 7.21, with 30.0 percent of undergraduates behaving properly. Significant differences were found in the scores of behaviors in response to violent terrorist attacks by gender, major or religious belief (P undergraduates involved in this study. Results from the logistic regression analysis revealed that those who were female (OR = 1.46, 95% CI: 1.06-2.01), with bigger perceived probability (OR = 1.60, 95% CI: 1.12-2.30), with higher alertness (OR = 3.77, 95% CI: 2.15-6.61), with stronger coping confidence (OR = 0.34, 95% CI: 0.24-0.48) and bigger affective response (OR1 = 3.42, 95% CI: 2.40-4.86; OR2 = 0.23, 95% CI: 0.13-0.41) would present more prominent behavioral responses when facing a violent terrorist attack. Individual response behaviors to violent terrorist attacks among undergraduates were relatively ideal. Perceived probability, alertness, coping confidence and affective response appeared to be independent influencing factors related to response behaviors against violent terrorist attacks. In colleges and universities, awareness of violent terrorist attacks should be strengthened among undergraduates. Focus should target psychological education dealing with
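The reported odds ratios and confidence intervals follow the standard logistic-regression transformation OR = exp(β), 95% CI = exp(β ± 1.96·SE). The coefficient and standard error below are back-calculated to reproduce the first reported OR, not taken from the study's data:

```python
from math import exp

def odds_ratio(beta, se):
    """Odds ratio and 95% CI from a logistic-regression coefficient:
    OR = exp(beta), CI = exp(beta -/+ 1.96 * se)."""
    return exp(beta), (exp(beta - 1.96 * se), exp(beta + 1.96 * se))

# Hypothetical coefficient/SE chosen so the result matches the gender OR above.
or_female, (ci_lo, ci_hi) = odds_ratio(0.378, 0.163)
```

This yields OR = 1.46 with 95% CI 1.06-2.01, matching the first row of the reported results.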

  6. Disposal criticality analysis methodology for fissile waste forms

    International Nuclear Information System (INIS)

    Davis, J.W.; Gottlieb, P.

    1998-03-01

A general methodology has been developed to evaluate the criticality potential of the wide range of waste forms planned for geologic disposal. The range of waste forms includes commercial spent fuel, high level waste, DOE spent fuel (including highly enriched), MOX using weapons grade plutonium, and immobilized plutonium. The disposal of these waste forms will be in a container with sufficiently thick corrosion resistant barriers to prevent water penetration for up to 10,000 years. The criticality control for DOE spent fuel is primarily provided by neutron absorber material incorporated into the basket holding the individual assemblies. For the immobilized plutonium, the neutron absorber material is incorporated into the waste form itself. The disposal criticality analysis methodology includes the analysis of geochemical and physical processes that can breach the waste package and affect the waste forms within. The basic purpose of the methodology is to guide the criticality control features of the waste package design, and to demonstrate that the final design meets the criticality control licensing requirements. The methodology can also be extended to the analysis of criticality consequences (primarily increased radionuclide inventory), which will support the total performance assessment for the repository

  7. Methodological Variability Using Electronic Nose Technology For Headspace Analysis

    Science.gov (United States)

    Knobloch, Henri; Turner, Claire; Spooner, Andrew; Chambers, Mark

    2009-05-01

    Since the idea of electronic noses was published, numerous electronic nose (e-nose) developments and applications have been used in analyzing solid, liquid and gaseous samples in the food and automotive industry or for medical purposes. However, little is known about methodological pitfalls that might be associated with e-nose technology. Some of the methodological variation caused by changes in ambient temperature, using different filters and changes in mass flow rates are described. Reasons for a lack of stability and reproducibility are given, explaining why methodological variation influences sensor responses and why e-nose technology may not always be sufficiently robust for headspace analysis. However, the potential of e-nose technology is also discussed.

  8. Two methodologies for optical analysis of contaminated engine lubricants

    International Nuclear Information System (INIS)

    Aghayan, Hamid; Yang, Jun; Bordatchev, Evgueni

    2012-01-01

    The performance, efficiency and lifetime of modern combustion engines significantly depend on the quality of the engine lubricants. However, contaminants, such as gasoline, moisture, coolant and wear particles, reduce the life of engine mechanical components and lubricant quality. Therefore, direct and indirect measurements of engine lubricant properties, such as physical-mechanical, electro-magnetic, chemical and optical properties, are intensively utilized in engine condition monitoring systems and sensors developed within the last decade. Such sensors for the measurement of engine lubricant properties can be used to detect a functional limit of the in-use lubricant, increase drain interval and reduce the environmental impact. This paper proposes two new methodologies for the quantitative and qualitative analysis of the presence of contaminants in the engine lubricants. The methodologies are based on optical analysis of the distortion effect when an object image is obtained through a thin random optical medium (e.g. engine lubricant). The novelty of the proposed methodologies is in the introduction of an object with a known periodic shape behind a thin film of the contaminated lubricant. In this case, an acquired image represents a combined lubricant–object optical appearance, where an a priori known periodic structure of the object is distorted by a contaminated lubricant. In the object shape-based optical analysis, several parameters of an acquired optical image, such as the gray scale intensity of lubricant and object, shape width at object and lubricant levels, object relative intensity and width non-uniformity coefficient are newly proposed. Variations in the contaminant concentration and use of different contaminants lead to the changes of these parameters measured on-line. In the statistical optical analysis methodology, statistical auto- and cross-characteristics (e.g. auto- and cross-correlation functions, auto- and cross-spectrums, transfer function
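The statistical (correlation-based) methodology can be illustrated in one dimension: a known periodic object profile is compared against clean and contaminated observations via normalized cross-correlation, whose peak drops as distortion grows. The profiles below are illustrative, not measured data:

```python
def normalized_cross_correlation(a, b):
    """Pearson-style NCC between two equal-length intensity profiles."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = sum((x - ma) ** 2 for x in a) ** 0.5
    db = sum((y - mb) ** 2 for y in b) ** 0.5
    return num / (da * db)

# Known periodic object profile and two observations through the lubricant film.
reference    = [0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0]
clean        = [0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0]
contaminated = [0.1, 0.9, 0.3, 0.7, 0.2, 0.8, 0.4, 0.6]  # unevenly distorted

ncc_clean = normalized_cross_correlation(reference, clean)
ncc_cont = normalized_cross_correlation(reference, contaminated)
```

The correlation peak drops from 1.0 to about 0.91 as the contamination distorts the a priori known periodic structure, giving a quantitative on-line indicator.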

  9. Methodology for dimensional variation analysis of ITER integrated systems

    Energy Technology Data Exchange (ETDEWEB)

    Fuentes, F. Javier, E-mail: FranciscoJavier.Fuentes@iter.org [ITER Organization, Route de Vinon-sur-Verdon—CS 90046, 13067 St Paul-lez-Durance (France); Trouvé, Vincent [Assystem Engineering & Operation Services, rue J-M Jacquard CS 60117, 84120 Pertuis (France); Cordier, Jean-Jacques; Reich, Jens [ITER Organization, Route de Vinon-sur-Verdon—CS 90046, 13067 St Paul-lez-Durance (France)

    2016-11-01

    Highlights: • Tokamak dimensional management methodology, based on 3D variation analysis, is presented. • Dimensional Variation Model implementation workflow is described. • Methodology phases are described in detail. The application of this methodology to the tolerance analysis of ITER Vacuum Vessel is presented. • Dimensional studies are a valuable tool for the assessment of Tokamak PCR (Project Change Requests), DR (Deviation Requests) and NCR (Non-Conformance Reports). - Abstract: The ITER machine consists of a large number of complex systems highly integrated, with critical functional requirements and reduced design clearances to minimize the impact in cost and performances. Tolerances and assembly accuracies in critical areas could have a serious impact in the final performances, compromising the machine assembly and plasma operation. The management of tolerances allocated to part manufacture and assembly processes, as well as the control of potential deviations and early mitigation of non-compliances with the technical requirements, is a critical activity on the project life cycle. A 3D tolerance simulation analysis of ITER Tokamak machine has been developed based on 3DCS dedicated software. This integrated dimensional variation model is representative of Tokamak manufacturing functional tolerances and assembly processes, predicting accurate values for the amount of variation on critical areas. This paper describes the detailed methodology to implement and update the Tokamak Dimensional Variation Model. The model is managed at system level. The methodology phases are illustrated by its application to the Vacuum Vessel (VV), considering the status of maturity of VV dimensional variation model. The following topics are described in this paper: • Model description and constraints. • Model implementation workflow. • Management of input and output data. • Statistical analysis and risk assessment. The management of the integration studies based on


  11. Heart Attack

    Science.gov (United States)


  12. Security Analysis of 7-Round MISTY1 against Higher Order Differential Attacks

    Science.gov (United States)

    Tsunoo, Yukiyasu; Saito, Teruo; Shigeri, Maki; Kawabata, Takeshi

MISTY1 is a 64-bit block cipher that has provable security against differential and linear cryptanalysis. MISTY1 is one of the algorithms selected in the European NESSIE project, and it has been recommended for Japanese e-Government ciphers by the CRYPTREC project. This paper shows that higher order differential attacks can be successful against 7-round versions of MISTY1 with FL functions. The attack on 7-round MISTY1 can recover a partial subkey with a data complexity of 2^54.1 and a computational complexity of 2^120.8, which signifies the first successful attack on 7-round MISTY1 with no limitation such as a weak key. This paper also evaluates the complexity of this higher order differential attack on MISTY1 in which the key schedule is replaced by a pseudorandom function. It is shown that resistance to the higher order differential attack is not substantially improved even in 7-round MISTY1 in which the key schedule is replaced by a pseudorandom function.
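The principle behind a higher order differential attack can be demonstrated on a toy Boolean function: for a function of algebraic degree d, the XOR sum of its outputs over any coset of a (d+1)-dimensional subspace is zero, while some d-th order sums are a nonzero constant. The 8-bit function below is illustrative, not MISTY1:

```python
from itertools import product

def f(x):
    # Toy 8-bit function of algebraic degree 3: one cubic monomial plus
    # linear terms (this is NOT MISTY1, just an illustration).
    b = [(x >> i) & 1 for i in range(8)]
    return (b[0] & b[1] & b[2]) ^ b[3] ^ b[5]

def derivative(f, directions, offset):
    """|directions|-th order differential: XOR-sum of f over the coset
    offset + span(directions)."""
    acc = 0
    for coeffs in product((0, 1), repeat=len(directions)):
        x = offset
        for c, d in zip(coeffs, directions):
            if c:
                x ^= d
        acc ^= f(x)
    return acc

# Every 4th-order derivative of a degree-3 function vanishes...
fourth = [derivative(f, [1, 2, 4, 8], off) for off in range(256)]
# ...while the 3rd-order derivative along the cubic term's variables is 1.
third = [derivative(f, [1, 2, 4], off) for off in range(256)]
```

An attacker exploits exactly this: summing partial decryptions over a chosen structure of plaintexts must produce the predicted constant, so subkey guesses that break the property can be discarded.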

  13. Proactive Routing Mutation Against Stealthy Distributed Denial of Service Attacks – Metrics, Modeling and Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Duan, Qi; Al-Shaer, Ehab; Chatterjee, Samrat; Halappanavar, Mahantesh; Oehmen, Christopher S.

    2018-04-01

Infrastructure Distributed Denial of Service (IDDoS) attacks continue to be one of the most devastating challenges facing cyber systems. The new generation of IDDoS attacks exploits the inherent weaknesses of cyber infrastructure, including the deterministic nature of routes, skew distribution of flows, and Internet ossification, to discover the network's critical links and launch highly stealthy flooding attacks that are not observable at the victim end. In this paper, first, we propose a new metric to quantitatively measure the potential susceptibility of any arbitrary target server or domain to stealthy IDDoS attacks, and estimate the impact of such susceptibility on enterprises. Second, we develop a proactive route mutation technique to minimize the susceptibility to these attacks by dynamically changing the flow paths periodically to invalidate the adversary's knowledge about the network and avoid targeted critical links. Our proposed approach actively changes these network paths while satisfying security and quality of service requirements. We present an integrated approach to proactive route mutation that combines infrastructure-based mutation, based on reconfiguration of switches and routers, with a middle-box approach that uses an overlay of end-point proxies to construct a virtual network path free of critical links to reach a destination. We implemented the proactive path mutation technique on a Software Defined Network using the OpenDaylight controller to demonstrate a feasible deployment of this approach. Our evaluation validates the correctness, effectiveness, and scalability of the proposed approaches.
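The route mutation step can be sketched as periodically recomputing a randomized path that avoids the currently suspected critical links; the graph and critical-link set below are illustrative, and a real deployment would push the chosen path to SDN switches rather than just return it:

```python
import random
from collections import deque

def mutated_path(graph, src, dst, critical_links):
    """BFS with randomized neighbor order: returns some src->dst path that
    avoids every (undirected) critical link, or None if none exists."""
    forbidden = {frozenset(e) for e in critical_links}
    parent = {src: None}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        if u == dst:
            path = []
            while u is not None:
                path.append(u)
                u = parent[u]
            return path[::-1]
        neighbors = list(graph[u])
        random.shuffle(neighbors)  # fresh randomization each mutation epoch
        for v in neighbors:
            if v not in parent and frozenset((u, v)) not in forbidden:
                parent[v] = u
                queue.append(v)
    return None

graph = {"A": ["B", "C"], "B": ["A", "D"], "C": ["A", "D"], "D": ["B", "C"]}
# New epoch: link A-B is now considered critical, so the path must shift.
path = mutated_path(graph, "A", "D", [("A", "B")])
```

Here the only viable route is A→C→D; if every outgoing link from the source were critical, the function would report that no compliant path exists.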

  14. Experiment Analysis of Concrete’s Mechanical Property Deterioration Suffered Sulfate Attack and Drying-Wetting Cycles

    Directory of Open Access Journals (Sweden)

    Wei Tian

    2017-01-01

Full Text Available The mechanism of concrete deterioration in sodium sulfate solution is investigated. The macroperformance was characterized via apparent properties, mass loss, and compressive strength. Changes in the ions in solution at different sulfate attack periods were tested by inductively coupled plasma (ICP). The damage evolution law, as well as analysis of the concrete's meso- and microstructure, was revealed by scanning electron microscope (SEM) and computed tomography (CT) scanning equipment. The results show that the characteristics of the concrete differed in each sulfate attack period; the drying-wetting cycles generally accelerated the deterioration process. In the early sulfate attack period, the pore structure of the concrete was filled with sulfate attack products (e.g., ettringite and gypsum), and its mass and strength increased. The pore size and porosity decreased while the CT number increased. As deterioration progressed, the swelling/expansion force of the products and the salt crystallization pressure of sulfate crystals acted on the inner wall of the concrete to accumulate damage and accelerate deterioration. The mass and strength of the concrete sharply decreased. The number and volume of pores increased, and pores grew more quickly, resulting in the initiation and expansion of microcracks, while the CT number decreased.

  15. Cyber-Attacks on Smart Meters in Household Nanogrid: Modeling, Simulation and Analysis

    OpenAIRE

    Denise Tellbach; Yan-Fu Li

    2018-01-01

    Cyber-security, and therefore cyber-attacks on the smart grid (SG), has become the subject of many publications in recent years, emphasizing its importance in research as well as in practice. One especially vulnerable part of the SG is smart meters (SMs). The major contribution of this work, the simulation of a variety of cyber-attacks on SMs that have not been covered in previous studies, is the identification and quantification of their possible impacts on the security of the SG. In this study, a simulation model...

  16. Evaluation of a Multi-Agent System for Simulation and Analysis of Distributed Denial-of-Service Attacks

    National Research Council Canada - National Science Library

    Huu, Tee

    2003-01-01

    The DDoS attack is evolving at a rapid and alarming rate; an effective solution must be formulated using an adaptive approach. Most of the simulations are performed at the attack phase of the DDoS attack...

  17. Improving Attack Graph Visualization through Data Reduction and Attack Grouping

    Energy Technology Data Exchange (ETDEWEB)

    John Homer; Ashok Varikuti; Xinming Ou; Miles A. McQueen

    2008-09-01

    Various tools exist to analyze enterprise network systems and to produce attack graphs detailing how attackers might penetrate into the system. These attack graphs, however, are often complex and difficult to comprehend fully, and a human user may find it problematic to reach appropriate configuration decisions. This paper presents methodologies that can 1) automatically identify portions of an attack graph that do not help a user to understand the core security problems and so can be trimmed, and 2) automatically group similar attack steps as virtual nodes in a model of the network topology, to immediately increase the understandability of the data. We believe both methods are important steps toward improving visualization of attack graphs to make them more useful in configuration management for large enterprise networks. We implemented our methods using one of the existing attack-graph toolkits. Initial experimentation shows that the proposed approaches can 1) significantly reduce the complexity of attack graphs by trimming a large portion of the graph that is not needed for a user to understand the security problem, and 2) significantly increase the accessibility and understandability of the data presented in the attack graph by clearly showing, within a generated visualization of the network topology, the number and type of potential attacks to which each host is exposed.
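    The grouping step described above can be sketched in a few lines: attack steps that share an exploit and a target collapse into a single virtual node that lists the source hosts it aggregates. The hostnames and CVE labels below are hypothetical, and this is a simplification of the paper's method, not its implementation:

```python
from collections import defaultdict

# Hypothetical attack steps: (source host, exploit, target host).
STEPS = [
    ("web1", "CVE-A", "db1"),
    ("web2", "CVE-A", "db1"),
    ("web1", "CVE-B", "db1"),
    ("db1",  "CVE-C", "fileserver"),
]

def group_steps(steps):
    """Collapse steps sharing (exploit, target) into one virtual node,
    recording the source hosts each virtual node aggregates."""
    groups = defaultdict(list)
    for src, exploit, dst in steps:
        groups[(exploit, dst)].append(src)
    return dict(groups)

virtual_nodes = group_steps(STEPS)
# Four raw attack steps collapse into three virtual nodes, so the rendered
# graph shows per-host exposure without one edge per similar step.
```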

  18. RISK ANALYSIS IN CONSTRUCTION PROJECTS: A PRACTICAL SELECTION METHODOLOGY

    OpenAIRE

    Alberto De Marco; Muhammad Jamaluddin Thaheem

    2014-01-01

    Project Risk Management (PRM) is gaining attention from researchers and practitioners in the form of sophisticated tools and techniques to help construction managers perform risk management. However, the large variety of techniques has made selecting an appropriate solution a complex and risky task in itself. Accordingly, this study proposes a practical framework methodology to assist construction project managers and practitioners in choosing a suitable risk analysis technique based on selec...

  19. Application of human reliability analysis methodology of second generation

    International Nuclear Information System (INIS)

    Ruiz S, T. de J.; Nelson E, P. F.

    2009-10-01

    Human reliability analysis (HRA) is a very important part of probabilistic safety analysis. The main contribution of HRA in nuclear power plants is the identification and characterization of the issues that combine to produce an error in human tasks performed under normal operating conditions and in those carried out after an abnormal event. Additionally, analysis of various accidents in history has found the human component to be a contributing cause. The need to understand the forms and probability of human error led, in the 1960s, to the collection of generic data that resulted in the development of the first generation of HRA methodologies. Methods were subsequently developed to include additional performance shaping factors, and the interactions between them, in their models, and by the mid-1990s what are considered the second-generation methodologies had emerged. Among these is A Technique for Human Event Analysis (ATHEANA). The application of this method to a generic human failure event is interesting because its modeling includes errors of commission, the quantification of deviations from the nominal scenario considered in the accident sequences of the probabilistic safety analysis and, for this event, the evaluation of dependency between actions. That is, the generic human failure event first required independent evaluation of the two related human failure events. Gathering the new human error probabilities thus involves quantifying the nominal scenario and the cases of significant deviations, selected for their potential impact on the analyzed human failure events. As in probabilistic safety analysis, the analysis of the sequences identified the more specific factors with the highest contribution to the human error probabilities. (Author)

  20. Lone Actor Terrorist Attack Planning and Preparation: A Data-Driven Analysis.

    Science.gov (United States)

    Schuurman, Bart; Bakker, Edwin; Gill, Paul; Bouhana, Noémie

    2017-10-23

    This article provides an in-depth assessment of lone actor terrorists' attack planning and preparation. A codebook of 198 variables related to different aspects of pre-attack behavior is applied to a sample of 55 lone actor terrorists. Data were drawn from open-source materials and complemented where possible with primary sources. Most lone actors are not highly lethal or surreptitious attackers. They are generally poor at maintaining operational security, leak their motivations and capabilities in numerous ways, and generally do so months and even years before an attack. Moreover, the "loneness" thought to define this type of terrorism is generally absent; most lone actors uphold social ties that are crucial to their adoption and maintenance of the motivation and capability to commit terrorist violence. The results offer concrete input for those working to detect and prevent this form of terrorism and argue for a re-evaluation of the "lone actor" concept. © 2017 The Authors. Journal of Forensic Sciences published by Wiley Periodicals, Inc. on behalf of American Academy of Forensic Sciences.

  1. Crossing trend analysis methodology and application for Turkish rainfall records

    Science.gov (United States)

    Şen, Zekâi

    2018-01-01

    Trend analyses are necessary tools for depicting a possible general increase or decrease in a given time series. There are many versions of trend-identification methodologies, such as the Mann-Kendall trend test, Spearman's tau, Sen's slope, the regression line, and Şen's innovative trend analysis. The literature has many papers about the use, pros and cons, and comparisons of these methodologies. In this paper, a completely new approach is proposed based on the crossing properties of a time series. It is suggested that the suitable trend through the centroid of the given time series should have the maximum number of crossings (total number of up-crossings or down-crossings). This approach is applicable whether the time series has a dependent or independent structure, and it does not depend on the type of the probability distribution function. The validity of this method is demonstrated through an extensive Monte Carlo simulation and comparison with other existing trend-identification methodologies. The methodology is applied to a set of annual daily extreme rainfall time series from different parts of Turkey, which have a physically independent structure.
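    The crossing criterion lends itself to a compact sketch: among candidate trend slopes through the series centroid, keep the one whose line the series crosses most often. The code below is an illustrative toy with made-up data, not Şen's implementation:

```python
def crossings(series, trend):
    """Count up-crossings of the series about a trend line."""
    resid = [y - t for y, t in zip(series, trend)]
    return sum(1 for a, b in zip(resid, resid[1:]) if a < 0 <= b)

def best_crossing_slope(series, slopes):
    """Among candidate slopes through the centroid, pick the one whose
    trend line the series crosses the most times."""
    n = len(series)
    cx, cy = (n - 1) / 2, sum(series) / n   # centroid of the time series
    def trend(m):
        return [cy + m * (t - cx) for t in range(n)]
    return max(slopes, key=lambda m: crossings(series, trend(m)))

# A short upward-trending toy series; scan candidate slopes 0.00 .. 1.00.
data = [1.0, 2.2, 1.8, 3.1, 2.9, 4.2, 3.8, 5.1]
slope = best_crossing_slope(data, [i / 100 for i in range(0, 101)])
```

    With ties broken toward the first maximizer, the grid scan lands on the smallest slope whose residuals alternate in sign at every step, i.e. the line the series oscillates around most tightly.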

  2. In Their Own Words? Methodological Considerations in the Analysis of Terrorist Autobiographies

    Directory of Open Access Journals (Sweden)

    Mary Beth Altier

    2012-01-01

    Full Text Available Despite the growth of terrorism literature in the aftermath of the 9/11 attacks, there remain several methodological challenges to studying certain aspects of terrorism. This is perhaps most evident in attempts to uncover the attitudes, motivations, and intentions of individuals engaged in violent extremism and how they are sometimes expressed in problematic behavior. Such challenges invariably stem from the fact that terrorists and the organizations to which they belong represent clandestine populations engaged in illegal activity. Unsurprisingly, these qualities make it difficult for the researcher to identify and locate willing subjects of study—let alone a representative sample. In this research note, we suggest the systematic analysis of terrorist autobiographies offers a promising means of investigating difficult-to-study areas of terrorism-related phenomena. Investigation of autobiographical accounts not only offers additional data points for the study of individual psychological issues, but also provides valuable perspectives on the internal structures, processes, and dynamics of terrorist organizations more broadly. Moreover, given most autobiographies cover critical events and personal experiences across the life course, they provide a unique lens into how terrorists perceive their world and insight into their decision-making processes. We support our advocacy of this approach by highlighting its methodological strengths and shortcomings.

  3. ASSESSMENT OF SEISMIC ANALYSIS METHODOLOGIES FOR DEEPLY EMBEDDED NPP STRUCTURES

    International Nuclear Information System (INIS)

    XU, J.; MILLER, C.; COSTANTINO, C.; HOFMAYER, C.; GRAVES, H. NRC.

    2005-01-01

    Several of the new-generation nuclear power plant designs have structural configurations which are proposed to be deeply embedded. Since current seismic analysis methodologies have been applied to shallowly embedded structures (e.g., ASCE 4 suggests that simple formulations may be used to model the embedment effect when the depth of embedment is less than 30% of the foundation radius), the US Nuclear Regulatory Commission is sponsoring a program at the Brookhaven National Laboratory with the objective of investigating the extent to which procedures acceptable for shallow embedment depths are adequate for larger embedment depths. This paper presents the results of a study comparing the response spectra obtained from two of the more popular analysis methods for structural configurations varying from shallow embedment to complete embedment. A typical safety-related structure embedded in a soil profile representative of a typical nuclear power plant site was utilized in the study, and the depths of burial (DOB) considered range from 25% to 100% of the height of the structure. Included in the paper are: (1) the description of a simplified analysis and a detailed approach for the SSI analyses of a structure with various DOBs, (2) the comparison of the analysis results for the different DOBs between the two methods, and (3) the performance assessment of the analysis methodologies for SSI analyses of deeply embedded structures. The resulting assessment from this study indicates that simplified methods may be capable of capturing the seismic response for much more deeply embedded structures than is normally allowed by standard practice.

  4. Multi-resolution Convolution Methodology for ICP Waveform Morphology Analysis.

    Science.gov (United States)

    Shaw, Martin; Piper, Ian; Hawthorne, Christopher

    2016-01-01

    Intracranial pressure (ICP) monitoring is a key clinical tool in the assessment and treatment of patients in neurointensive care. ICP morphology analysis can be useful in the classification of waveform features. A methodology for the decomposition of an ICP signal into clinically relevant dimensions has been devised that allows the identification of important ICP waveform types. It has three main components. First, multi-resolution convolution analysis is used for the main signal decomposition. Then, an impulse function is created, with multiple parameters, that can represent any form in the signal under analysis. Finally, a simple, localised optimisation technique is used to find morphologies of interest in the decomposed data. A pilot application of this methodology using a simple signal has been performed. This showed that the technique works, with receiver operating characteristic area-under-the-curve values for the waveform types (plateau wave, B wave, and high and low compliance states) of 0.936, 0.694, 0.676 and 0.698, respectively. This is a novel technique that showed some promise during the pilot analysis. However, it requires further optimisation to become a usable clinical tool for the automated analysis of ICP signals.
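    A minimal sketch of the multi-resolution convolution idea: smooth the same signal with kernels of increasing width, so features surviving at coarse scales separate from fine-scale detail. Box kernels stand in for the paper's kernels (an assumption, since the kernel family is not given in the abstract), and the signal is a toy pulse:

```python
def convolve(signal, kernel):
    """Valid-mode 1-D convolution."""
    k = len(kernel)
    return [sum(signal[i + j] * kernel[j] for j in range(k))
            for i in range(len(signal) - k + 1)]

def multi_resolution(signal, widths):
    """Decompose the signal into smoothed versions at several resolutions
    using normalized box kernels of increasing width."""
    return {w: convolve(signal, [1.0 / w] * w) for w in widths}

# A toy pulse: width-1 returns the signal itself, width-3 blurs the peak.
sig = [0, 0, 1, 4, 1, 0, 0]
bands = multi_resolution(sig, widths=[1, 3])
```

    A morphology detector would then fit its parameterised impulse function against each band rather than the raw signal.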

  5. A multiple linear regression analysis of hot corrosion attack on a series of nickel base turbine alloys

    Science.gov (United States)

    Barrett, C. A.

    1985-01-01

    Multiple linear regression analysis was used to determine an equation for estimating hot corrosion attack for a series of Ni-base cast turbine alloys. The U transform, i.e., U = sin^-1[(%A/100)^(1/2)], was shown to give the best estimate of the dependent variable, y. A complete second-degree equation is described for the "centered" weight chemistries for the elements Cr, Al, Ti, Mo, W, Cb, Ta, and Co. In addition, linear terms for the minor elements C, B, and Zr were added for a basic 47-term equation. The best reduced equation was determined by the stepwise selection method with essentially 13 terms. The Cr term was found to be the most important, accounting for 60 percent of the explained variability in hot corrosion attack.
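    Reading the abstract's U transform as the arcsine square-root transform (an interpretation of the garbled original notation, not a statement from the paper), the dependent variable is computed as below before fitting the regression:

```python
import math

def u_transform(percent_attack):
    """U = arcsin(sqrt(%A / 100)): arcsine square-root transform of the
    attacked-area percentage, used as the regression's dependent variable."""
    return math.asin(math.sqrt(percent_attack / 100.0))

# The transform stretches the bounded 0-100% scale near its endpoints,
# which helps a linear model in the chemistry terms fit the data.
u_mid = u_transform(50.0)   # pi/4, up to floating point
```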

  6. Development of test methodology for dynamic mechanical analysis instrumentation

    Science.gov (United States)

    Allen, V. R.

    1982-01-01

    Dynamic mechanical analysis instrumentation was used to develop specific test methodology for determining engineering parameters of selected materials, especially plastics and elastomers, over a broad range of temperatures and in selected environments. The methodology for routine procedures was established with specific attention given to sample geometry, sample size, and mounting techniques. The basic software of the duPont 1090 thermal analyzer was used for data reduction, which simplified the theoretical interpretation. Clamps were developed which allowed 'relative' damping during the cure cycle to be measured for the fiberglass-supported resin. The correlation of fracture energy 'toughness' (or impact strength) with the low-temperature (glassy) relaxation responses for a 'rubber-modified' epoxy system yielded a negative result, because the low-temperature dispersion mode (-80 C) of the modifier coincided with that of the epoxy matrix, making quantitative comparison unrealistic.

  7. About Heart Attacks

    Science.gov (United States)

    Updated: Jan 11, 2018. A heart attack is ... coronary artery damage leads to a heart attack. Heart Attack Questions and Answers: What is a heart attack? ...

  8. Safety analysis methodologies for radioactive waste repositories in shallow ground

    International Nuclear Information System (INIS)

    1984-01-01

    The report is part of the IAEA Safety Series and is addressed to authorities and specialists responsible for or involved in planning, performing and/or reviewing safety assessments of shallow ground radioactive waste repositories. It discusses approaches that are applicable to the safety analysis of a shallow ground repository. The methodologies, analysis techniques and models described are pertinent to the task of predicting the long-term performance of a shallow ground disposal system. They may be used during the processes of selection, confirmation and licensing of new sites and disposal systems, or to evaluate the long-term consequences in the post-sealing phase of existing operating or inactive sites. The analysis may point out the need for remedial action, or provide information to be used in deciding on the duration of surveillance. Safety analyses, both general in nature and specific to a certain repository, site or design concept, are discussed, with emphasis on deterministic and probabilistic studies.

  9. Considerations on a methodological framework for the analysis of texts

    Directory of Open Access Journals (Sweden)

    David Andrés Camargo Mayorga

    2017-03-01

    Full Text Available This article reviews the literature relevant to constructing a methodological framework for the analysis of texts in the applied social sciences, such as economics, drawing on the main hermeneutical approaches from philosophy, linguistics and the social sciences. In essence, these approaches assume that every discourse carries meaning (be it truthful or not) and expresses complex social relations. Thus, any content analysis turns out to be a certain type of hermeneutics (interpretation), while trying to account for the multiple phenomena immersed in the production, application, use and reproduction of knowledge within the text. When applying discourse analysis to teaching texts in the economic sciences, we find traces of legalistic, political and ethnocentric tendencies, among other discourses hidden in the text. For this reason, analysis of the text's internal discourse allows us to probe state ideology and its underlying or latent discourses.

  10. Methodological challenges in qualitative content analysis: A discussion paper.

    Science.gov (United States)

    Graneheim, Ulla H; Lindgren, Britt-Marie; Lundman, Berit

    2017-09-01

    This discussion paper is aimed to map content analysis in the qualitative paradigm and explore common methodological challenges. We discuss phenomenological descriptions of manifest content and hermeneutical interpretations of latent content. We demonstrate inductive, deductive, and abductive approaches to qualitative content analysis, and elaborate on the level of abstraction and degree of interpretation used in constructing categories, descriptive themes, and themes of meaning. With increased abstraction and interpretation comes an increased challenge to demonstrate the credibility and authenticity of the analysis. A key issue is to show the logic in how categories and themes are abstracted, interpreted, and connected to the aim and to each other. Qualitative content analysis is an autonomous method and can be used at varying levels of abstraction and interpretation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. New Methodology of Block Cipher Analysis using Chaos Game

    Directory of Open Access Journals (Sweden)

    Budi Sulistyo

    2011-08-01

    Full Text Available Block cipher analysis covers randomness analysis and cryptanalysis. This paper proposes a new method potentially useful for both. The method uses the concept of a true random sequence as a reference for measuring the randomness level of a sequence. Using this concept, the paper defines bias, which represents a sequence's violation of true randomness. The block cipher is treated as the mapping function of a discrete-time dynamical system; this framework makes it possible to apply various analysis techniques developed in the dynamical systems field. There are three main parts to the methodology presented in this paper: the dynamical system framework for block cipher analysis, a new chaos game scheme, and an extended measure concept related to the chaos game and fractal analysis. The paper also presents the general procedures of the proposed method: symbolic dynamics analysis of the discrete dynamical system with the block cipher as its mapping function, random sequence construction, use of the random sequence as input to a chaos game scheme, measurement of the chaos game output using the extended measure concept, and analysis of the measurement results. The analysis of a specific real or sample block cipher, and its results, are beyond the scope of this paper.
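    As an illustration of the chaos-game idea (a toy sketch under our own conventions, not the paper's scheme): each 2-bit symbol of a sequence pulls a point halfway toward one of the four corners of the unit square, and a box-counting occupancy gives a crude uniformity statistic. A truly random sequence fills the square evenly; structure in a cipher's output shows up as uneven occupation:

```python
# Corner assignment for 2-bit symbols (an arbitrary choice for illustration).
CORNERS = {0: (0.0, 0.0), 1: (0.0, 1.0), 2: (1.0, 0.0), 3: (1.0, 1.0)}

def chaos_game(bits):
    """Map a bit sequence to points: each 2-bit symbol moves the current
    point halfway toward the corresponding corner of the unit square."""
    x, y = 0.5, 0.5
    points = []
    for i in range(0, len(bits) - 1, 2):
        cx, cy = CORNERS[2 * bits[i] + bits[i + 1]]
        x, y = (x + cx) / 2, (y + cy) / 2
        points.append((x, y))
    return points

def box_occupancy(points, n=4):
    """Fraction of the n×n grid boxes visited: a crude uniformity measure."""
    boxes = {(min(int(x * n), n - 1), min(int(y * n), n - 1))
             for x, y in points}
    return len(boxes) / (n * n)

pts = chaos_game([0, 1, 1, 0, 1, 1, 0, 0, 1, 0])
```

    The paper's extended measure concept plays the role that `box_occupancy` crudely approximates here.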

  12. A continuum damage analysis of hydrogen attack in 2.25 Cr-1Mo vessel

    DEFF Research Database (Denmark)

    van der Burg, M.W.D.; van der Giessen, E.; Tvergaard, Viggo

    1998-01-01

    A micromechanically based continuum damage model is presented to analyze the stress-, temperature- and hydrogen-pressure-dependent material degradation process termed hydrogen attack, inside a pressure vessel. Hydrogen attack (HA) is the damage process of grain boundary facets due to a chemical reaction of carbides with hydrogen, thus forming cavities with high-pressure methane gas. Driven by the methane gas pressure, the cavities grow, while remote tensile stresses can significantly enhance the cavitation rate. The damage model gives the strain rate and damage rate as a function of the temperature, hydrogen pressure and applied stresses. The model is applied to study HA in a vessel wall, where nonuniform distributions of hydrogen pressure, temperature and stresses result in a nonuniform damage distribution over the vessel wall. Stresses inside the vessel wall first tend to accelerate...

  13. Analysis of Forgery Attack on One-Time Proxy Signature and the Improvement

    Science.gov (United States)

    Wang, Tian-Yin; Wei, Zong-Li

    2016-02-01

    In a recent paper, Yang et al. (Quant. Inf. Process. 13(9), 2007-2016, 2014) analyzed the security of the one-time proxy signature scheme of Wang and Wei (Quant. Inf. Process. 11(2), 455-463, 2012) and pointed out that it cannot satisfy the security requirements of unforgeability and undeniability because an eavesdropper Eve can forge a valid proxy signature on a message chosen by herself. However, we find that the so-called proxy message-signature pair forged by Eve is in fact issued by the proxy signer, and anybody can obtain it as a requester, which means that the forgery cannot be considered a successful attack. Therefore, the conclusion that this scheme cannot satisfy the security requirements of a proxy signature against forging and denying is not appropriate in this sense. Finally, we study the reason for the misunderstanding and clarify the security requirements for proxy signatures.

  14. Non-Normal Impact and Penetration. Analysis for Hard Targets and Small Angles of Attack

    Science.gov (United States)

    1978-09-01

    acceleration of the projectile CG for different initial values of the attack angle. Beyond the point of nose embedment (0.2 ms), there is a marked difference...

  15. Analysis Of Default Passwords In Routers Against Brute-Force Attack

    OpenAIRE

    Mohammed Farik; ABM Shawkat Ali

    2015-01-01

    Abstract Password authentication is the main means of access control on network routers, and router manufacturers provide a default password for initial login to the router. While there have been many publications regarding the minimum requirements of a good password, how widely the manufacturers themselves adhere to those minimum standards, and whether their default passwords can withstand a brute-force attack, is not widely known. The novelty of this research is that this is the first time default...
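    The brute-force exposure the abstract alludes to can be bounded with simple arithmetic: the worst-case search time is the keyspace size divided by the guess rate. A sketch with hypothetical numbers (the character sets, lengths, and guess rate below are illustrative assumptions, not measurements from the study):

```python
def brute_force_seconds(charset_size, length, guesses_per_second):
    """Worst-case time to exhaust every password of the given length."""
    return charset_size ** length / guesses_per_second

# A 5-character lowercase default vs. a 10-character mixed-alphanumeric one,
# at a hypothetical 10,000 guesses per second.
weak   = brute_force_seconds(26, 5, 1e4)    # ≈ 20 minutes
strong = brute_force_seconds(62, 10, 1e4)   # ≈ millions of years
```

    The gap of many orders of magnitude is why weak factory defaults, if left unchanged, dominate a router's practical attack surface.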

  16. Heart Attack

    Science.gov (United States)

    ... Pressure, tightness, pain, or a squeezing or aching sensation in your chest or arms that may spread to your neck, jaw or back Nausea, indigestion, heartburn or abdominal pain Shortness of breath Cold sweat Fatigue Lightheadedness or sudden dizziness Heart attack ...

  17. Heart attack

    Science.gov (United States)

    ... part in support groups for people with heart disease. Outlook (Prognosis) After a heart attack, you have a higher ... P, Bonow RO, Braunwald E, eds. Braunwald's Heart Disease: A Textbook of Cardiovascular Medicine. 10th ed. Philadelphia, PA: Elsevier Saunders; 2014: ...

  18. Shark attack.

    Science.gov (United States)

    Guidera, K J; Ogden, J A; Highhouse, K; Pugh, L; Beatty, E

    1991-01-01

    Shark attacks are rare but devastating. This case had major injuries that included an open femoral fracture, massive hemorrhage, sciatic nerve laceration, and significant skin and muscle damage. The patient required 15 operative procedures, extensive physical therapy, and orthotic assistance. A review of the literature pertaining to shark bites is included.

  19. Structural Learning of Attack Vectors for Generating Mutated XSS Attacks

    Directory of Open Access Journals (Sweden)

    Yi-Hsun Wang

    2010-09-01

    Full Text Available Web applications suffer from cross-site scripting (XSS) attacks resulting from incomplete or incorrect input sanitization. Learning the structure of attack vectors could enrich the variety of manifestations in generated XSS attacks. In this study, we focus on generating more threatening XSS attacks for the state-of-the-art detection approaches that can find potential XSS vulnerabilities in Web applications, and propose a mechanism for structural learning of attack vectors with the aim of generating mutated XSS attacks in a fully automatic way. Mutated XSS attack generation depends on the analysis of attack vectors and the structural learning mechanism. For the kernel of the learning mechanism, we use a Hidden Markov Model (HMM) as the structure of the attack vector model to capture the implicit manner of the attack vector; this manner benefits from the syntactic meanings labeled by the proposed tokenizing mechanism. Bayes' theorem is used to determine the number of hidden states in the model for generalizing the structure model. The paper makes the following contributions: (1) it automatically learns the structure of attack vectors from practical data analysis to build a structural model of attack vectors; (2) it mimics the manners and elements of attack vectors to extend the ability of testing tools to identify XSS vulnerabilities; (3) it helps verify flaws in the blacklist sanitization procedures of Web applications. We evaluated the proposed mechanism with Burp Intruder and a dataset collected from public XSS archives. The results show that mutated XSS attack generation can identify potential vulnerabilities.
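    A minimal sketch of the tokenizing step that feeds such a structure model: map an attack vector's characters to coarse syntax classes, producing the observation sequence an HMM could be trained on. The token classes and rules below are invented for illustration; they are not the paper's actual token set:

```python
import re

# Illustrative syntax classes (first matching rule wins at each position).
TOKEN_RULES = [
    (re.compile(r"</?\w+"), "TAG"),      # start of an HTML tag
    (re.compile(r"\w+\s*="), "ATTR"),    # attribute name with '='
    (re.compile(r"['\"]"), "QUOTE"),     # quote delimiter
    (re.compile(r"\w+\("), "CALL"),      # JavaScript function call
    (re.compile(r"\s+"), None),          # whitespace: skipped
    (re.compile(r"."), "CHAR"),          # anything else
]

def tokenize(vector):
    """Turn an attack vector into a sequence of syntax-class labels."""
    tokens, i = [], 0
    while i < len(vector):
        for pattern, label in TOKEN_RULES:
            m = pattern.match(vector, i)
            if m:
                if label:
                    tokens.append(label)
                i = m.end()
                break
    return tokens

seq = tokenize('<img src="x" onerror=alert(1)>')
```

    Training an HMM on many such label sequences, then sampling from it, would yield the structurally plausible mutated vectors the paper describes.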

  20. Proficiency examination in English language: Needs analysis and methodological proposals

    Directory of Open Access Journals (Sweden)

    Maria Elizete Luz Saes

    2014-04-01

    Full Text Available The purpose of this work is to provide tools for reflections on some learning difficulties presented by students when they are submitted to English proficiency examinations, as well as to suggest some methodological proposals that can be implemented among didactic support groups, monitoring or in classrooms, by means of face-to-face or distance learning activities. The observations resulting from the performance presented by the students have motivated the preparation of this paper, whose theoretical assumptions are based on the needs analysis of the target audience, the exploration of oral and written discursive genres and the possibilities of interaction provided by technological mediation.

  1. METHODOLOGY FOR ANALYSIS OF DECISION MAKING IN AIR NAVIGATION SYSTEM

    Directory of Open Access Journals (Sweden)

    Volodymyr Kharchenko

    2011-03-01

    Full Text Available Abstract. In research on the Air Navigation System as a complex socio-technical system, a methodology for the analysis of the human operator's decision-making has been developed. The significance of individual psychological factors, as well as the impact of socio-psychological factors on the professional activities of a human operator during the development of a flight situation from normal to catastrophic, were analyzed. On the basis of the reflexive theory of bipolar choice, the expected risks of decision-making by the Air Navigation System's operator under the influence of the external environment, previous experience and intentions were identified. Methods for the analysis of decision-making by the human operator of the Air Navigation System using stochastic networks have been developed. Keywords: Air Navigation System, bipolar choice, human operator, decision-making, expected risk, individual psychological factors, methodology of analysis, reflexive model, socio-psychological factors, stochastic network.

  2. A development of containment performance analysis methodology using GOTHIC code

    International Nuclear Information System (INIS)

    Lee, B. C.; Yoon, J. I.; Byun, C. S.; Lee, J. Y.; Lee, J. Y.

    2003-01-01

    In the circumstance that the well-established containment pressure/temperature analysis code CONTEMPT-LT treats the reactor containment as a single volume, this study introduces the GOTHIC code as an alternative for multi-compartment containment performance analysis. With the developed GOTHIC methodology, its applicability is verified through a containment performance analysis for Korean Nuclear Unit 1. The GOTHIC model for this plant is simply composed of 3 compartments, including the reactor containment and the RWST. In addition, the containment spray system and containment recirculation system are simulated. As a result of the GOTHIC calculation, under the same assumptions and conditions as those in CONTEMPT-LT, the GOTHIC prediction shows very good results; the pressure and temperature transients, including their peaks, are nearly the same. It can be concluded that GOTHIC can provide reasonable containment pressure and temperature responses when considering the inherent conservatism in the CONTEMPT-LT code

  3. A development of containment performance analysis methodology using GOTHIC code

    Energy Technology Data Exchange (ETDEWEB)

    Lee, B. C.; Yoon, J. I. [Future and Challenge Company, Seoul (Korea, Republic of); Byun, C. S.; Lee, J. Y. [Korea Electric Power Research Institute, Taejon (Korea, Republic of); Lee, J. Y. [Seoul National University, Seoul (Korea, Republic of)

    2003-10-01

    In the circumstance that the well-established containment pressure/temperature analysis code CONTEMPT-LT treats the reactor containment as a single volume, this study introduces the GOTHIC code as an alternative for multi-compartment containment performance analysis. With the developed GOTHIC methodology, its applicability is verified through a containment performance analysis for Korean Nuclear Unit 1. The GOTHIC model for this plant is simply composed of 3 compartments, including the reactor containment and the RWST. In addition, the containment spray system and containment recirculation system are simulated. As a result of the GOTHIC calculation, under the same assumptions and conditions as those in CONTEMPT-LT, the GOTHIC prediction shows very good results; the pressure and temperature transients, including their peaks, are nearly the same. It can be concluded that GOTHIC can provide reasonable containment pressure and temperature responses when considering the inherent conservatism in the CONTEMPT-LT code.

  4. Comparative Analysis of Return of Serve as Counter-attack in Modern Tennis

    Directory of Open Access Journals (Sweden)

    Petru Eugen MERGHEŞ

    2017-02-01

    High-performance modern tennis is characterised by high dynamism, speed in thinking and action, precision, and high technical and tactical skill. In this study, we used direct observation and statistical recording of nine matches played over two competition years by the tennis players Roger Federer, Rafael Nadal and Andre Agassi. For these players we studied mainly the return of serve, which, together with the serve, is one of the most important shots in tennis, as these are the first shots of a point. We chose these three players because, as the recorded and interpreted matches show, they best exemplify the return of serve. The study we have carried out shows that the return of serve makes Agassi a winner in most matches. The high percentage of Federer's serves forces his adversaries into a lower return-of-serve percentage, which prevents them from winning against his serve. A high return-of-serve percentage yields more points on the adversary's serve and an opportunity to start the offensive point. After comparing the three players mentioned above, we can see that the highest percentage of points won on return of serve belongs to Agassi, which ranks him among the best returners of serve in the world. The player with the higher return-of-service percentage is the one who wins the match, which shows, once again, the importance of the return of serve. The return of serve can be a strong counter-attack weapon if used at its highest level.

  5. Analysis of Al-Qaeda Terrorist Attacks to Investigate Rational Action

    Directory of Open Access Journals (Sweden)

    Daniel P. Hepworth

    2013-04-01

    Many studies have been conducted to demonstrate the collective rationality of traditional terrorist groups; this study seeks to expand this work and apply collective rationality to Islamic terrorist groups. A series of statistical analyses was conducted on terrorist attacks carried out by Al-Qaeda and affiliated terrorist organizations; these were then compared with two more conventional terrorist groups: the Euskadi Ta Askatasuna (ETA) and the Liberation Tigers of Tamil Eelam (LTTE). When viewed in the context of the groups' various motivations and objectives, the results of these analyses demonstrate collective rationality for the terrorist groups examined.

  6. Practical In-Depth Analysis of IDS Alerts for Tracing and Identifying Potential Attackers on Darknet

    OpenAIRE

    Jungsuk Song; Younsu Lee; Jang-Won Choi; Joon-Min Gil; Jaekyung Han; Sang-Soo Choi

    2017-01-01

    The darknet (i.e., a set of unused IP addresses) is a very useful solution for observing the global trends of cyber threats and analyzing attack activities on the Internet. Since the darknet is not connected with real systems, in most cases, the incoming packets on the darknet (‘the darknet traffic’) do not contain a payload. This means that we are unable to get real malware from the darknet traffic. This situation makes it difficult for security experts (e.g., academic researchers, engineers...

  7. Statistical analysis of large passwords lists, used to optimize brute force attacks

    CSIR Research Space (South Africa)

    Van Heerden, RP

    2009-03-01

    Full Text Available in 1989. He successfully obtained 25% of passwords using a dictionary attack(Klein 1990). The study lasted 12 months, although 80% of the passwords guessed were obtained in the first week. The following trends were revealed: • The most popular password... • password1 • bink182 • (username) The PC magazine list shares some similarities with the Brown list. Exceptions are the use of unique UK area-specific passwords. 2.2.4 J Ruska Jimmy Ruska constructed a list of passwords from online students (Ruska 2008...

  8. Time series analysis of records of asthmatic attacks in New York City. [Relation to air pollution

    Energy Technology Data Exchange (ETDEWEB)

    Goldstein, I F; Rausch, L E

    1977-01-01

    To overcome certain shortcomings of the usual technique of studying the acute health effects of air pollution by multiple regression analysis, one of several methods developed to search for temporo-spatial patterns in the health variable was used in an epidemiological study of asthmatic attacks in two areas of New York City (Harlem and Brooklyn), where aerometric monitoring stations provide data on daily fluctuations in air pollution levels. Sulfur dioxide and smokeshade were used as independent variables in the studies.

  9. Development of a heat exchanger root-cause analysis methodology

    International Nuclear Information System (INIS)

    Jarrel, D.B.

    1989-01-01

    The objective of this work is to determine a generic methodology for accurately identifying the root cause of component failure. Root-cause determinations are an everyday challenge to plant personnel, but they are handled with widely differing degrees of success owing to differences in approach, level of diagnostic expertise, and documentation. The criterion for success is simple: if the root cause of the failure has truly been determined and corrected, the same causal failure relationship will not be demonstrated again in the future. The approach to root-cause analysis (RCA) element definition was first to selectively choose and constrain a functionally significant component (in this case a component cooling water to service water heat exchanger) that has demonstrated prevalent failures. A root-cause-of-failure analysis was then performed by a systems engineer on a large number of actual failure scenarios. The analytical process used by the engineer was documented and evaluated to abstract the logic model used to arrive at the root cause. For the case of the heat exchanger, the actual root-cause diagnostic approach is described. A generic methodology for determining the root cause of component failure is demonstrable for this general heat exchanger sample.

  10. A methodology for reliability analysis in health networks.

    Science.gov (United States)

    Spyrou, Stergiani; Bamidis, Panagiotis D; Maglaveras, Nicos; Pangalos, George; Pappas, Costas

    2008-05-01

    A reliability model for the health care domain, based on requirements analysis at the early design stage of a regional health network (RHN), is introduced. RHNs are considered systems supporting the services provided by health units, hospitals, and the regional authority. Reliability assessment in the health care domain constitutes a field of quality assessment for RHNs. A novel approach for predicting system reliability in the early stage of designing RHN systems is presented in this paper. The uppermost scope is to identify the critical processes of an RHN system prior to its implementation. In the methodology, Unified Modeling Language activity diagrams are used to identify megaprocesses at the regional level, and the customer behavior model graph (CBMG) is used to describe the state transitions of the processes. The CBMG is annotated with: 1) the reliability of each component state and 2) the transition probabilities between states within the scope of the life cycle of the process. A stochastic reliability model (Markov model) is applied to predict the reliability of the business process, to identify its critical states, and to compare processes to reveal the most critical ones. The ultimate benefit of the applied methodology is the design of more reliable components in an RHN system. The innovation of the approach lies in the analysis of severity classes of failures and the application of stochastic modeling using a discrete-time Markov chain in RHNs.
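    The discrete-time Markov chain idea above can be illustrated with a minimal sketch: a process with two absorbing states ("complete" and "failed"), whose reliability is the probability of absorption in "complete". The states and transition probabilities below are invented for illustration, not taken from the paper:

```python
# States: 0 = start, 1 = processing, 2 = complete (absorbing), 3 = failed (absorbing).
# Row i holds the transition probabilities out of state i (rows sum to 1).
P = [
    [0.0, 0.95, 0.00, 0.05],
    [0.0, 0.00, 0.90, 0.10],
    [0.0, 0.00, 1.00, 0.00],
    [0.0, 0.00, 0.00, 1.00],
]

def absorption_probability(P, start, target, steps=200):
    # Probability that the chain, started in `start`, is absorbed in `target`:
    # propagate the state distribution until it settles in the absorbing states.
    dist = [0.0] * len(P)
    dist[start] = 1.0
    for _ in range(steps):
        dist = [sum(dist[i] * P[i][j] for i in range(len(P)))
                for j in range(len(P))]
    return dist[target]

reliability = absorption_probability(P, start=0, target=2)
print(reliability)  # 0.95 * 0.90 = 0.855
```

    With annotated transition probabilities from a CBMG in place of this toy matrix, the same computation yields the process reliability, and lowering one transition at a time reveals which states are most critical.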

  11. APPROPRIATE ALLOCATION OF CONTINGENCY USING RISK ANALYSIS METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Andi Andi

    2004-01-01

    Many cost overruns in the world of construction are attributable either to unforeseen events or to foreseen events for which uncertainty was not appropriately accommodated. It is argued that a significant improvement in project management performance may result from greater attention to the process of analyzing project risks. The objective of this paper is to propose a risk analysis methodology for the appropriate allocation of contingency in project cost estimation. In the first step, project risks are identified. An influence diagramming technique is employed to identify and show how the risks affect the project cost elements, as well as the relationships among the risks themselves. The second step is to assess the project costs with regard to the risks under consideration. Using a linguistic approach, the degree of uncertainty of the identified project risks is assessed and quantified. The problem of dependency between risks is taken into consideration during this analysis. In the final step, as the main purpose of this paper, a method for allocating appropriate contingency is presented. Two types of contingency, i.e. project contingency and management reserve, are proposed to accommodate the risks. An illustrative example is presented at the end to show the application of the methodology.
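    A common way to realize risk-based contingency allocation (a generic Monte Carlo sketch, not the paper's linguistic approach; risk names and figures are invented) is to simulate the identified risks and take a chosen percentile of the resulting cost distribution as the contingency:

```python
import random

def simulate_contingency(base_cost, risks, trials=20000, percentile=0.8):
    # Monte Carlo estimate of the contingency needed to cover `percentile`
    # of simulated outcomes. Each risk is (probability, min_impact, max_impact),
    # with impacts drawn uniformly when the risk occurs.
    totals = []
    for _ in range(trials):
        cost = base_cost
        for p, lo, hi in risks:
            if random.random() < p:
                cost += random.uniform(lo, hi)
        totals.append(cost)
    totals.sort()
    return totals[int(percentile * trials)] - base_cost

random.seed(0)
risks = [(0.30, 10_000, 50_000),  # e.g. ground-condition risk (hypothetical)
         (0.10, 5_000, 20_000)]   # e.g. permit-delay risk (hypothetical)
contingency = simulate_contingency(1_000_000, risks)
print(round(contingency))
```

    Note this ignores dependency between risks, which the paper explicitly accounts for; correlated risks would require sampling them jointly rather than independently.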

  12. Contemporary Impact Analysis Methodology for Planetary Sample Return Missions

    Science.gov (United States)

    Perino, Scott V.; Bayandor, Javid; Samareh, Jamshid A.; Armand, Sasan C.

    2015-01-01

    The development of an Earth entry vehicle, and the methodology created to evaluate the vehicle's impact landing response when returning to Earth, is reported. NASA's future Mars Sample Return Mission requires a robust vehicle to return Martian samples back to Earth for analysis. The Earth entry vehicle is a proposed solution to this Mars mission requirement. During Earth reentry, the vehicle slows within the atmosphere and then impacts the ground at its terminal velocity. To protect the Martian samples, a spherical energy absorber called an impact sphere is under development. The impact sphere is composed of hybrid composite and crushable foam elements that endure large plastic deformations during impact and cause a highly nonlinear vehicle response. The developed analysis methodology captures a range of complex structural interactions and much of the failure physics that occurs during impact. Numerical models were created and benchmarked against experimental tests conducted at NASA Langley Research Center. The postimpact structural damage assessment showed close correlation between simulation predictions and experimental results. Acceleration, velocity, displacement, damage modes, and failure mechanisms were all effectively captured. These investigations demonstrate that the Earth entry vehicle has great potential in facilitating future sample return missions.

  13. CONTENT ANALYSIS IN PROJECT MANAGEMENT: PROPOSAL OF A METHODOLOGICAL FRAMEWORK

    Directory of Open Access Journals (Sweden)

    Alessandro Prudêncio Lukosevicius

    2016-12-01

    Content analysis (CA) is a popular approach among researchers from different areas, but it is incipient in project management (PM). The volume of usage, however, apparently does not translate into application quality: the method receives constant criticism for the scientific rigor adopted, especially when it is led by junior researchers. This article proposes a methodological framework for CA and investigates the use of CA in PM research. To accomplish this goal, a systematic literature review is combined with a CA of 23 articles from the EBSCO database covering the last 20 years (1996-2016). The findings showed that the proposed framework can help researchers apply CA better, and suggest that the use of the method, in terms of both quantity and quality, should be expanded in PM research. In addition to the framework, another contribution of this research is an analysis of the use of CA in PM over the last 20 years.

  14. PROTECTION WORKS AGAINST WAVE ATTACKS IN THE HAOR AREAS OF BANGLADESH: ANALYSIS OF SUSTAINABILITY

    Directory of Open Access Journals (Sweden)

    M.K. Alam

    2010-12-01

    Haor is the local name for the saucer-shaped, naturally depressed areas of Bangladesh. There are 414 haors in the northeast region, comprising approximately 17% of the country. These areas are submerged under deep water from July to November due to the overflow of rivers and heavy rainfall, making them appear like seas with erosive waves. Recently, wave attack has drastically increased because of de-plantation and changing cropping patterns to allow for more agricultural production. The local people, the government and Non-Government Organisations (NGOs) have tried many techniques to protect life and property against wave attacks. A cost comparison shows that Cement Concrete (CC) blocks laid over geotextile on the embankment slope are a cost-effective, environmentally friendly and socially acceptable method of preventing loss of life and property. However, the design rules employed by the engineers are faulty because there is a knowledge gap in the application of wave hydraulics among these professionals. As a result, damage frequently occurs and maintenance costs are increasing. This study explores the sustainability of the CC blocks used in the haor areas by evaluating two case studies against the available design rules.

  15. Smartphone Based Heart Attack Risk Prediction System with Statistical Analysis and Data Mining Approaches

    Directory of Open Access Journals (Sweden)

    M. Raihan

    2017-11-01

    Nowadays, Ischemic Heart Disease (IHD), or heart attack, is ubiquitous and one of the major causes of death worldwide. Early screening of people at risk of IHD may minimize morbidity and mortality. A simple approach is proposed in this paper to predict the risk of developing a heart attack using a smartphone and data mining. Clinical data from 835 patients were collected, analyzed and correlated with existing clinical symptoms that may suggest underlying undetected IHD. A user-friendly Android application was developed by incorporating clinical data obtained from patients admitted with chest pain to a cardiac hospital. Upon user input of risk factors, the application categorizes the user's IHD risk as high, medium or low. Analysis of the data showed a significant correlation between having an IHD and the application's results in the high & low, medium & low and medium & high categories, with p values of 0.0001, 0.0001 and 0.0001 respectively. The experimental results showed that the sensitivity and accuracy of the proposed technique were 89.25% and 76.05% respectively, whereas with a C4.5 decision tree the accuracy was 86% and the sensitivity 91.6%. Existing tools require mandatory input of lipid values, which leaves them underutilized by the general public, though these risk calculators bear significant academic importance. Our research is motivated to reduce that limitation and promote timely risk evaluation.
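    The sensitivity and accuracy figures quoted above follow from standard confusion-matrix definitions; a minimal sketch (the counts below are illustrative, not the study's raw data):

```python
def sensitivity_and_accuracy(tp, fp, tn, fn):
    # Sensitivity (recall): share of actual positives the classifier catches.
    # Accuracy: share of all predictions that are correct.
    sensitivity = tp / (tp + fn)
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    return sensitivity, accuracy

# Hypothetical confusion-matrix counts for a screening classifier.
sens, acc = sensitivity_and_accuracy(tp=83, fp=30, tn=70, fn=10)
print(round(sens, 4), round(acc, 4))
```

    As in the study, a classifier can trade these off: the C4.5 tree's higher sensitivity with lower accuracy corresponds to catching more true positives at the cost of more false positives.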

  16. Automated Generation of Attack Trees

    DEFF Research Database (Denmark)

    Vigo, Roberto; Nielson, Flemming; Nielson, Hanne Riis

    2014-01-01

    -prone and impracticable for large systems. Nonetheless, the automated generation of attack trees has only been explored in connection to computer networks and levering rich models, whose analysis typically leads to an exponential blow-up of the state space. We propose a static analysis approach where attack trees...

  17. Methodology to analysis of aging processes of containment spray system

    International Nuclear Information System (INIS)

    Borges, D. da Silva; Lava, D.D.; Moreira, M. de L.; Ferreira Guimarães, A.C.; Fernandes da Silva, L.

    2015-01-01

    This paper presents a contribution to the study of the aging process of components in commercial Pressurized Water Reactor (PWR) plants. The motivation for this work emerged from the current nuclear outlook: numerous nuclear power plants worldwide have been operating for a long time. This situation demands a process to ensure the reliability of the operating systems of these plants, and hence methodologies capable of estimating the failure probability of components and systems. Beyond the safety factors involved, such methodologies can be used to search for ways of extending the life cycle of nuclear plants, which will otherwise inevitably undergo decommissioning after an operating time of 40 years, a process that negatively affects power generation and demands an enormous investment. Thus, this paper presents modeling techniques and sensitivity analysis which, together, can estimate how the components most sensitive to the aging process will behave during the normal operating cycle of a nuclear power plant. (authors)

  18. Disparities in adult African American women's knowledge of heart attack and stroke symptomatology: an analysis of 2003-2005 Behavioral Risk Factor Surveillance Survey data.

    Science.gov (United States)

    Lutfiyya, May Nawal; Cumba, Marites T; McCullough, Joel Emery; Barlow, Erika Laverne; Lipsky, Martin S

    2008-06-01

    Heart disease and stroke are, respectively, the first and third leading causes of death of American women. African American women experience a disproportionate burden of these diseases compared with Caucasian women and are also more likely to delay seeking treatment for acute symptoms. As knowledge is a first step in seeking care, this study examined the knowledge of heart attack and stroke symptoms among African American women. This was a cross-sectional study analyzing 2003-2005 Behavioral Risk Factor Surveillance Survey (BRFSS) data. A composite heart attack and stroke knowledge score was computed for each respondent from the 13 heart attack and stroke symptom knowledge questions. Multivariate logistic regression was performed using low scores on the heart attack and stroke knowledge questions as the dependent variable. Twenty percent of the respondents were low scorers, and 23.8% were high scorers. Logistic regression analysis showed that adult African American women who earned low scores on the composite heart attack and stroke knowledge questions (range 0-8 points) were more likely to be aged 18-34 (OR = 1.36, CI 1.35, 1.37), to be uninsured (OR = 1.32, CI 1.31, 1.33), and to have a low annual household income. Knowledge of heart attack and stroke symptoms varied significantly among African American women depending on socioeconomic variables. Targeting interventions to African American women, particularly those in lower socioeconomic groups, may increase knowledge of heart attack and stroke symptoms, subsequently improving the preventive action taken in response to these conditions.

  19. Using of BEPU methodology in a final safety analysis report

    International Nuclear Information System (INIS)

    Menzel, Francine; Sabundjian, Gaiane; D'auria, Francesco; Madeira, Alzira A.

    2015-01-01

    Nuclear Reactor Safety (NRS) has been established since the discovery of nuclear fission, and the occurrence of accidents in nuclear power plants worldwide has contributed to its improvement. The Final Safety Analysis Report (FSAR) must contain complete information concerning the safety of the plant and the plant site, and must be seen as a compendium of NRS. The FSAR integrates both the licensing requirements and the analytical techniques. The analytical techniques can be applied using a realistic approach that addresses the uncertainties of the results. This work aims to give an overview of the main analytical techniques that can be applied with a Best Estimate Plus Uncertainty (BEPU) methodology, which is 'the best one can do', as well as the ALARA (As Low As Reasonably Achievable) principle. Moreover, the paper intends to demonstrate the background of the licensing process through the main licensing requirements. (author)

  20. Using of BEPU methodology in a final safety analysis report

    Energy Technology Data Exchange (ETDEWEB)

    Menzel, Francine; Sabundjian, Gaiane, E-mail: fmenzel@ipen.br, E-mail: gdjian@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); D' auria, Francesco, E-mail: f.dauria@ing.unipi.it [Universita degli Studi di Pisa, Gruppo di Ricerca Nucleare San Piero a Grado (GRNSPG), Pisa (Italy); Madeira, Alzira A., E-mail: alzira@cnen.gov.br [Comissao Nacional de Energia Nuclear (CNEN), Rio de Janeiro, RJ (Brazil)

    2015-07-01

    Nuclear Reactor Safety (NRS) has been established since the discovery of nuclear fission, and the occurrence of accidents in nuclear power plants worldwide has contributed to its improvement. The Final Safety Analysis Report (FSAR) must contain complete information concerning the safety of the plant and the plant site, and must be seen as a compendium of NRS. The FSAR integrates both the licensing requirements and the analytical techniques. The analytical techniques can be applied using a realistic approach that addresses the uncertainties of the results. This work aims to give an overview of the main analytical techniques that can be applied with a Best Estimate Plus Uncertainty (BEPU) methodology, which is 'the best one can do', as well as the ALARA (As Low As Reasonably Achievable) principle. Moreover, the paper intends to demonstrate the background of the licensing process through the main licensing requirements. (author)

  1. Analysis of the low-altitude proton flux asymmetry: methodology

    CERN Document Server

    Kruglanski, M

    1999-01-01

    Existing East-West asymmetry models of the trapped proton fluxes at low altitudes depend on the local magnetic dip angle and a density scale height derived from atmospheric models. We propose an alternative approach which maps the directional flux over a drift shell (Bm, L) in terms of the local pitch and azimuthal angles alpha and beta, where beta is defined in the local mirror plane as the angle between the proton arrival direction and the surface normal to the drift shell. This approach has the advantage that it depends only on drift shell parameters and does not involve an atmosphere model. A semi-empirical model based on the new methodology is able to reproduce the angular distribution of a set of SAMPEX/PET proton flux measurements. Guidelines are proposed for spacecraft missions and data analysis procedures that are intended to be used for the building of new trapped radiation environment models.

  2. Computational methodology for ChIP-seq analysis

    Science.gov (United States)

    Shin, Hyunjin; Liu, Tao; Duan, Xikun; Zhang, Yong; Liu, X. Shirley

    2015-01-01

    Chromatin immunoprecipitation coupled with massive parallel sequencing (ChIP-seq) is a powerful technology to identify the genome-wide locations of DNA binding proteins such as transcription factors or modified histones. As more and more experimental laboratories are adopting ChIP-seq to unravel the transcriptional and epigenetic regulatory mechanisms, computational analyses of ChIP-seq also become increasingly comprehensive and sophisticated. In this article, we review current computational methodology for ChIP-seq analysis, recommend useful algorithms and workflows, and introduce quality control measures at different analytical steps. We also discuss how ChIP-seq could be integrated with other types of genomic assays, such as gene expression profiling and genome-wide association studies, to provide a more comprehensive view of gene regulatory mechanisms in important physiological and pathological processes. PMID:25741452

  3. Securing public transportation systems an integrated decision analysis framework for the prevention of terrorist attacks as example

    CERN Document Server

    Brauner, Florian

    2017-01-01

    Florian Brauner addresses the risk reduction effects of security measures (SecMe) as well as their economic and social effects, using terrorist threats in public transportation as a use case. SecMe increase the level of security but cause interferences and restrictions for customers (e.g. privacy). This study identifies the interferences and analyzes their acceptance with an empirical survey of customers. A composite indicator for the acceptance of different SecMe is developed and integrated into a risk management framework for multi-criteria decision analysis, achieving the right balance of risk reduction, costs, and social acceptance. Contents: Assessment of Security Measures for Risk Management; Measurement of Objective Effectiveness of Security Measures Against Terrorist Attacks; Determination of Subjective Effects of Security Measures (Customer Acceptance Analysis); Cost Analysis of Security Measures; Multi-Criteria Decision Support Systems. Target Groups: Scientists with Interest in Civil Security Research; Providers and S...

  4. Impact modeling and prediction of attacks on cyber targets

    Science.gov (United States)

    Khalili, Aram; Michalk, Brian; Alford, Lee; Henney, Chris; Gilbert, Logan

    2010-04-01

    In most organizations, IT (information technology) infrastructure exists to support the organization's mission. The threat of cyber attacks poses risks to this mission. Current network security research focuses on the threat of cyber attacks to the organization's IT infrastructure; however, the risks to the overall mission are rarely analyzed or formalized. This connection of IT infrastructure to the organization's mission is often neglected or carried out ad-hoc. Our work bridges this gap and introduces analyses and formalisms to help organizations understand the mission risks they face from cyber attacks. Modeling an organization's mission vulnerability to cyber attacks requires a description of the IT infrastructure (network model), the organization mission (business model), and how the mission relies on IT resources (correlation model). With this information, proper analysis can show which cyber resources are of tactical importance in a cyber attack, i.e., controlling them enables a large range of cyber attacks. Such analysis also reveals which IT resources contribute most to the organization's mission, i.e., lack of control over them gravely affects the mission. These results can then be used to formulate IT security strategies and explore their trade-offs, which leads to better incident response. This paper presents our methodology for encoding IT infrastructure, organization mission and correlations, our analysis framework, as well as initial experimental results and conclusions.
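    The correlation-model idea described above — tracing which IT resources a mission transitively relies on, so that a compromised resource reveals the missions at risk — can be sketched as a simple graph reachability check. All names and the dependency structure below are invented for illustration, not taken from the paper:

```python
# Toy correlation model: depends_on[x] lists the resources x relies on
# (mission -> services -> hosts). Names are hypothetical.
depends_on = {
    "order-processing": ["web-frontend", "billing-db"],
    "web-frontend": ["host-a"],
    "billing-db": ["host-b"],
}

def dependency_closure(node, seen=None):
    # All resources `node` transitively relies on.
    seen = set() if seen is None else seen
    for dep in depends_on.get(node, []):
        if dep not in seen:
            seen.add(dep)
            dependency_closure(dep, seen)
    return seen

def affected_missions(compromised, missions):
    # A mission is at risk if any compromised resource is in its closure.
    return [m for m in missions if compromised & dependency_closure(m)]

print(affected_missions({"host-b"}, ["order-processing"]))  # ['order-processing']
```

    Read in reverse, the same closure identifies the tactically important resources: those that appear in the closures of many missions contribute most to the organization's mission and deserve defensive priority.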

  5. Methodology for Modeling and Analysis of Business Processes (MMABP

    Directory of Open Access Journals (Sweden)

    Vaclav Repa

    2015-10-01

    This paper introduces a methodology for modeling business processes. The creation of the methodology is described in terms of the Design Science Method. First, a gap in contemporary business process modeling approaches is identified, general modeling principles that can fill the gap are discussed, and the way these principles have been implemented in the main features of the created methodology is described. The most critical points of business process modeling identified are process states, process hierarchy and the granularity of the process description. The methodology has been evaluated through use in a real project. Using examples from this project, the main features of the methodology are explained, together with the significant problems met during the project. Concluding from these problems, together with the results of the methodology evaluation, the needed future development of the methodology is outlined.

  6. Methodology for the analysis and retirement of assets: Power transformers

    Directory of Open Access Journals (Sweden)

    Gustavo Adolfo Gómez-Ramírez

    2015-09-01

    This article develops a high-voltage engineering methodology for the analysis and retirement of repaired power transformers, based on engineering criteria, in order to establish a correlation between the conditions of the transformer from several points of view: electrical, mechanical, dielectric and thermal. An analysis of the state of the art reveals two situations of great significance. First, the international procedures are a "guide" for the acceptance of new transformers, so they cannot be applied to the letter to repaired transformers, owing to the degradation the transformer has suffered over the years and to all the factors that led to the repair in the first place. Second, drawing on the most recent technical literature, articles analyzing the dielectric oil and the insulating paper have been reviewed, in which correlations are established between the quality of the insulating paper and the furan concentrations in the oils. Finally, much of the research performed to date has focused on analyzing the transformer from the condition of the dielectric oil, so in most cases it has not been possible to perform forensic engineering inside an operating transformer and thereby analyze the design components that can compromise its integrity and operability.

  7. Transuranium analysis methodologies for biological and environmental samples

    International Nuclear Information System (INIS)

    Wessman, R.A.; Lee, K.D.; Curry, B.; Leventhal, L.

    1978-01-01

    Analytical procedures for the most abundant transuranium nuclides in the environment (i.e., plutonium and, to a lesser extent, americium) are available. There is a lack of procedures for sequential analysis of Np, Pu, Am, and Cm in environmental samples, primarily because of the current emphasis on Pu and Am. Reprocessing requirements and waste disposal connected with the fuel cycle indicate that neptunium and curium must be considered in environmental radioactive assessments. It was therefore necessary to develop procedures that determine all four of these radionuclides in the environment. The state of the art of transuranium analysis methodology as applied to environmental samples is discussed relative to different sample sources, such as soil, vegetation, air, water, and animals. Isotope-dilution analysis with 243 Am ( 239 Np) and 236 Pu or 242 Pu radionuclide tracers is used. Americium and curium are analyzed as a group, with 243 Am as the tracer. Sequential extraction procedures employing bis(2-ethyl-hexyl)orthophosphoric acid (HDEHP) were found to result in lower yields and higher Am--Cm fractionation than ion-exchange methods.

  8. New Methodology of Block Cipher Analysis Using Chaos Game

    Directory of Open Access Journals (Sweden)

    Budi Sulistyo

    2014-11-01

    Block cipher analysis covers randomness analysis and cryptanalysis. This paper proposes a new method potentially usable for both. The method uses the concept of a true random sequence as a reference for measuring the randomness level of a random sequence. Using this concept, the paper defines bias, which represents the violation of a random sequence from a true random sequence. In this paper, a block cipher is treated as the mapping function of a discrete-time dynamical system. The dynamical-system framework makes it possible to apply the various analysis techniques developed in the dynamical systems field. There are three main parts to the methodology presented in this paper: the dynamical-system framework for block cipher analysis, a new chaos game scheme, and an extended measure concept related to chaos games and fractal analysis. The paper also presents the general procedure of the proposed method, which includes: symbolic dynamic analysis of the discrete dynamical system whose mapping function is the block cipher; random sequence construction; use of the random sequence as input to a chaos game scheme; measurement of the chaos game scheme's output using the extended measure concept; and analysis of the measurement results. The analysis of a specific real or sample block cipher, and its results, are beyond the scope of this paper.
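    The chaos game step can be sketched minimally: each 2-bit symbol of a sequence selects a corner of the unit square and the current point moves halfway toward it, and a simple grid-occupancy measure then stands in for the paper's extended measure concept (this sketch and its 8x8 occupancy measure are illustrative assumptions, not the authors' scheme):

```python
import random

def chaos_game_points(bits):
    # Drive a chaos game on the unit square: each 2-bit symbol selects a
    # corner and the current point moves halfway toward it.
    corners = [(0.0, 0.0), (0.0, 1.0), (1.0, 0.0), (1.0, 1.0)]
    x, y = 0.5, 0.5
    points = []
    for i in range(0, len(bits) - 1, 2):
        cx, cy = corners[2 * bits[i] + bits[i + 1]]
        x, y = (x + cx) / 2, (y + cy) / 2
        points.append((x, y))
    return points

def occupancy(points, grid=8):
    # Fraction of grid cells visited: a sequence close to true random
    # fills the square, while a biased one leaves gaps.
    cells = {(min(int(x * grid), grid - 1), min(int(y * grid), grid - 1))
             for x, y in points}
    return len(cells) / grid ** 2

random.seed(1)
bits = [random.randint(0, 1) for _ in range(4000)]
print(occupancy(chaos_game_points(bits)))
```

    Feeding the chaos game with sequences produced by a block cipher and comparing the measured occupancy against that of a true random reference is the flavor of bias measurement the paper describes.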

  9. A Methodology for Loading the Advanced Test Reactor Driver Core for Experiment Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cowherd, Wilson M.; Nielsen, Joseph W.; Choe, Dong O.

    2016-11-01

    In support of experiments in the ATR, a new methodology was devised for loading the ATR Driver Core. This methodology will replace the existing methodology used by the INL Neutronic Analysis group to analyze experiments. This paper presents the as-run analysis for ATR Cycle 152B, comparing measured lobe powers with eigenvalue calculations.

  10. Supporting Space Systems Design via Systems Dependency Analysis Methodology

    Science.gov (United States)

    Guariniello, Cesare

    assess the behavior of each system based on its internal status and on the topology of its dependencies on systems connected to it. Designers and decision makers can therefore quickly analyze and explore the behavior of complex systems and evaluate different architectures under various working conditions. The methods support educated decision making both in the design and in the update process of systems architecture, reducing the need to execute extensive simulations. In particular, in the phase of concept generation and selection, the information given by the methods can be used to identify promising architectures to be further tested and improved, while discarding architectures that do not show the required level of global features. The methods, when used in conjunction with appropriate metrics, also allow for improved reliability and risk analysis, as well as for automatic scheduling and re-scheduling based on the features of the dependencies and on the accepted level of risk. This dissertation illustrates the use of the two methods in sample aerospace applications, both in the operational and in the developmental domain. The applications show how to use the developed methodology to evaluate the impact of failures, assess the criticality of systems, quantify metrics of interest, quantify the impact of delays, support informed decision making when scheduling the development of systems and evaluate the achievement of partial capabilities. A larger, well-framed case study illustrates how the Systems Operational Dependency Analysis method and the Systems Developmental Dependency Analysis method can support analysis and decision making, at the mid and high level, in the design process of architectures for the exploration of Mars. The case study also shows how the methods do not replace the classical systems engineering methodologies, but support and improve them.

  11. Failure mode effect analysis and fault tree analysis as a combined methodology in risk management

    Science.gov (United States)

    Wessiani, N. A.; Yoshio, F.

    2018-04-01

    Many studies have reported the implementation of Failure Mode Effect Analysis (FMEA) and Fault Tree Analysis (FTA) as methods in risk management. However, most of these studies choose only one of the two methods in their risk management methodology, whereas combining them reduces the drawbacks each method has when implemented separately. This paper combines the FMEA and FTA methodologies for assessing risk. A case study in a metal company illustrates how the combined methodology can be implemented: it is used to assess the internal risks that occur in the production process, which should then be mitigated based on their level of risk.
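A minimal sketch of how the two methods complement each other, with invented failure modes and probabilities (the paper's case-study data are not reproduced here): FMEA ranks failure modes by Risk Priority Number, while a fault tree combines basic-event probabilities through AND/OR gates to quantify a top event.

```python
# Hypothetical failure modes (name, severity, occurrence, detection),
# each scored 1-10 as in a standard FMEA worksheet.
modes = [
    ("seal leak",      8, 4, 3),
    ("motor overload", 6, 7, 2),
    ("sensor drift",   4, 5, 8),
]

def rpn(severity, occurrence, detection):
    """FMEA Risk Priority Number: severity x occurrence x detection."""
    return severity * occurrence * detection

ranked = sorted(modes, key=lambda m: rpn(*m[1:]), reverse=True)
print([(name, rpn(s, o, d)) for name, s, o, d in ranked])

def fault_tree(node):
    """Evaluate a fault tree of independent basic events.  A node is either
    a basic-event probability (float) or a ('AND'|'OR', children) gate."""
    if isinstance(node, float):
        return node
    gate, children = node
    probs = [fault_tree(c) for c in children]
    acc = 1.0
    if gate == "AND":
        for p in probs:
            acc *= p
        return acc
    for p in probs:            # OR of independent events
        acc *= 1.0 - p
    return 1.0 - acc

# Top event: (pump fails AND backup fails) OR control fault.
top = ("OR", [("AND", [0.01, 0.1]), 0.002])
print(fault_tree(top))   # 1 - (1 - 0.001) * (1 - 0.002)
```

In a combined workflow, the FMEA ranking selects which failure modes deserve a fault tree, and the tree then quantifies how they interact.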

  12. The analysis of RWAP(Rod Withdrawal at Power) using the KEPRI methodology

    International Nuclear Information System (INIS)

    Yang, C. K.; Kim, Y. H.

    2001-01-01

    KEPRI developed a new methodology based on RASP (Reactor Analysis Support Package). In this paper, the analysis of the RWAP (Rod Withdrawal at Power) accident, which can result in reactivity and power distribution anomalies, was performed using the KEPRI methodology. The calculation describes the RWAP transient and documents the analysis, including the computer code modeling assumptions and input parameters used. To validate the new methodology, the result of the calculation was compared with the FSAR. The result obtained with the KEPRI methodology is similar to the FSAR's, and the sensitivity results for postulated parameters are similar to those of the existing methodology.

  13. Reliability analysis for power supply system in a reprocessing facility based on GO methodology

    International Nuclear Information System (INIS)

    Wang Renze

    2014-01-01

    GO methodology was applied to analyze the reliability of the power supply system in a typical reprocessing facility. Because tie breakers are set in the system, a tie breaker operator was defined. GO methodology modeling and quantitative analysis were then performed sequentially, yielding the minimal cut sets and the average unavailability of the system. A parallel analysis with fault tree methodology was also performed. The results showed that the setup of tie breakers is rational and necessary, and that, compared with the fault tree methodology, GO modeling is easier and its chart more succinct for analyzing the reliability of the power supply system. (author)

  14. Cost analysis in health centers using 'Step Down' methodology

    Directory of Open Access Journals (Sweden)

    Matejić S.

    2015-01-01

    Full Text Available Health care reform aims to improve health system performance by achieving one of four objectives: reducing costs by increasing the efficiency of health care provision. Performance improvement implies the acceptance of innovations in all health care activities, including health care financial management. Successful implementation of health care financing reform requires prior analysis of costs and activities in health institutions. In this work we performed a comparative analysis of the costs of 27 health institutions by applying an innovative system for the analysis and control of health care service costs. Initially, a spreadsheet system was built using the internationally recognised 'Step Down' methodology for cost control and analysis in hospitals, and was adapted for primary health care institutions. Results achieved: the dominant cost, employee salaries, on average around 80%, does not depend on the size of the primary health institution (Dom zdravlja); there are significant differences in the percentage values of the costs of medicines, medical supplies, and diagnostic services; there is an obvious difference in the percentage values of technical maintenance costs, as a result of the uneven percentage of non-medical employees, differences in infrastructure organization, differences in the condition and type of equipment, differences in the type of space heating and type of fuel for heating, and patient transportation obligations, especially of home treatment services and polyvalent patronage. There is a big difference in the average cost per outpatient examination, as a consequence of the uneven number of services performed, especially in dentistry services. There is a significant difference in the number of preventive health examinations performed, which has a direct impact on their cost. The main conclusion of the analysis indicates that the actual situation of disparities, in terms of costs, can jeopardize the implementation of Primary health
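The 'Step Down' allocation itself is simple to sketch: support-department costs are closed out one at a time onto the departments below them in a fixed order, so every cost ends up on a final, service-producing department. The departments and shares below are invented for illustration.

```python
# Invented direct costs (monetary units) for two support departments and
# two service departments of a primary health care institution.
direct_costs = {"administration": 100.0, "maintenance": 50.0,
                "dentistry": 200.0, "patronage": 150.0}

# Allocation shares of each support department over the departments that
# come after it in the step-down order.
shares = {
    "administration": {"maintenance": 0.2, "dentistry": 0.5, "patronage": 0.3},
    "maintenance":    {"dentistry": 0.6, "patronage": 0.4},
}

def step_down(direct_costs, shares, order):
    """Close out each support department in turn: its accumulated cost is
    spread over the remaining departments, never back up the ladder."""
    costs = dict(direct_costs)
    for dept in order:
        pool = costs.pop(dept)
        for target, share in shares[dept].items():
            costs[target] += pool * share
    return costs

final = step_down(direct_costs, shares, ["administration", "maintenance"])
print(final)   # all 500.0 of cost now sits on dentistry and patronage
```

Note that maintenance receives a slice of administration before being closed out itself, which is what distinguishes step-down from simple direct allocation.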

  15. FRACTAL ANALYSIS OF TRABECULAR BONE: A STANDARDISED METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Ian Parkinson

    2011-05-01

    Full Text Available A standardised methodology for the fractal analysis of histological sections of trabecular bone has been established. A modified box counting method has been developed for use on a PC-based image analyser (Quantimet 500MC, Leica Cambridge). The effect of image analyser settings, magnification, image orientation and threshold levels was determined. Also, the range of scale over which trabecular bone is effectively fractal was determined, and a method was formulated to objectively calculate more than one fractal dimension from the modified Richardson plot. The results show that magnification, image orientation and threshold settings have little effect on the estimate of fractal dimension. Trabecular bone has a lower limit below which it is not fractal (λ<25 μm) and the upper limit is 4250 μm. There are three distinct fractal dimensions for trabecular bone (sectional fractals), with magnitudes greater than 1.0 and less than 2.0. It has been shown that trabecular bone is effectively fractal over a defined range of scale and that, within this range, there is more than one fractal dimension describing spatial structural entities. Fractal analysis is a model-independent method for describing a complex multifaceted structure, which can be adapted for the study of other biological systems, whether at the cell, tissue or organ level, and complements conventional histomorphometric and stereological techniques.
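The box counting idea can be illustrated with a generic sketch (not the Quantimet implementation): count the boxes of size eps needed to cover the structure at several scales, then take the slope of log N(eps) against log(1/eps) as the fractal dimension. A straight line segment, used as a sanity check, should give a dimension of 1.

```python
import math

def box_count(points, eps):
    """Number of eps-sized boxes needed to cover the point set."""
    return len({(int(x / eps), int(y / eps)) for x, y in points})

def box_dimension(points, scales):
    """Least-squares slope of log N(eps) versus log(1/eps): the
    box-counting estimate of the fractal dimension."""
    xs = [math.log(1.0 / eps) for eps in scales]
    ys = [math.log(box_count(points, eps)) for eps in scales]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# Sanity check: a straight line has dimension 1; the study's trabecular
# profiles fall between 1.0 and 2.0 over the fractal range of scale.
line = [(i / 10000, i / 10000) for i in range(10000)]
print(round(box_dimension(line, [0.1, 0.05, 0.02, 0.01]), 2))  # 1.0
```

Restricting `scales` to the 25-4250 μm window reported above is what makes the slope well defined for bone sections.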

  16. Methodological frontier in operational analysis for roundabouts: a review

    Directory of Open Access Journals (Sweden)

    Orazio Giuffre'

    2016-11-01

    Full Text Available Several studies and research projects have shown that modern roundabouts are safe and effective engineering countermeasures for traffic calming, and they are now widely used worldwide. The increasing use of roundabouts and, more recently, turbo and flower roundabouts has produced a great variety of experience in the fields of intersection design, traffic safety and capacity modelling. As for the unsignalized intersections that represent the starting point for extending operational analysis to roundabouts, the general situation in capacity estimation is still characterized by the discussion between gap acceptance models and empirical regression models. However, capacity modelling must contain both the analytical construction and solution of the model and the implementation of driver behavior. Thus, issues around realistic modelling of driver behavior through the parameters included in the models are always of interest to practitioners and analysts in transportation and road infrastructure engineering. Based on these considerations, this paper presents a literature review of the key methodological issues in the operational analysis of modern roundabouts. Focus is placed on the aspects associated with gap acceptance behavior, the derivation of the analytically based models and the calculation of the parameters included in the capacity equations, as well as steady-state and non-steady-state conditions and uncertainty in entry capacity estimation. Finally, insights on future developments of research in this field are also outlined.
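One classic analytical form from the gap-acceptance family discussed here is Siegloch's capacity model, in which entry capacity decays exponentially with the circulating (conflicting) flow. The sketch below uses placeholder behavioural parameters, not calibrated values:

```python
import math

def siegloch_capacity(q_conflict_vph, t_c, t_f):
    """Siegloch gap-acceptance entry capacity (veh/h):
    C = (3600 / tf) * exp(-q * (tc - tf / 2) / 3600),
    with critical gap tc and follow-up time tf in seconds."""
    t0 = t_c - t_f / 2
    return (3600.0 / t_f) * math.exp(-q_conflict_vph * t0 / 3600.0)

# Placeholder parameters: critical gap 4.1 s, follow-up time 2.9 s.
for q in (300, 600, 900):
    print(q, round(siegloch_capacity(q, 4.1, 2.9)))
```

Capacity falls as circulating flow rises, the qualitative behaviour all gap-acceptance models in the review share; empirical regression models fit this relationship to field data directly instead of deriving it from tc and tf.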

  17. Calculating Adversarial Risk from Attack Trees: Control Strength and Probabilistic Attackers

    NARCIS (Netherlands)

    Pieters, Wolter; Davarynejad, Mohsen

    2015-01-01

    Attack trees are a well-known formalism for quantitative analysis of cyber attacks consisting of multiple steps and alternative paths. It is possible to derive properties of the overall attacks from properties of individual steps, such as cost for the attacker and probability of success. However, in

  18. A Game Theoretic Approach to Cyber Attack Prediction

    Energy Technology Data Exchange (ETDEWEB)

    Peng Liu

    2005-11-28

    The area investigated by this project is cyber attack prediction. With a focus on correlation-based prediction, current attack prediction methodologies overlook the strategic nature of cyber attack-defense scenarios. As a result, current cyber attack prediction methodologies are very limited in predicting the strategic behaviors of attackers enforcing nontrivial cyber attacks such as DDoS attacks, and may result in low accuracy in correlation-based predictions. This project develops a game theoretic framework for cyber attack prediction, in which an automatic game-theory-based attack prediction method is proposed. Being able to quantitatively predict the likelihood of (sequences of) attack actions, our attack prediction methodology can predict fine-grained strategic behaviors of attackers and may greatly improve the accuracy of correlation-based prediction. To the best of our knowledge, this project develops the first comprehensive framework for incentive-based modeling and inference of attack intent, objectives, and strategies, and the first method that can predict fine-grained strategic behaviors of attackers. The significance of this research and its benefit to the public are demonstrated, to a certain extent, by (a) the severe threat of cyber attacks to the critical infrastructures of the nation, including many infrastructures overseen by the Department of Energy, (b) the importance of cyber security to critical infrastructure protection, and (c) the importance of cyber attack prediction to achieving cyber security.
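The core prediction step can be caricatured in a few lines. With a hypothetical attacker payoff matrix and an assumed defender posture distribution (all values invented, and a simple logit/quantal response standing in for the project's incentive-based inference), the likelihood of each attack action follows from its expected payoff:

```python
import math

# Hypothetical attacker payoffs: rows are attack actions, columns are
# defender postures.  All numbers are invented for illustration.
payoff = [
    [3.0, 0.0],   # "DDoS"
    [1.0, 2.0],   # "phishing"
]
actions = ["DDoS", "phishing"]
defender_mix = [0.6, 0.4]   # assumed probabilities of the defender postures

def expected_payoffs(payoff, mix):
    """Expected payoff of each attack action against the defender mix."""
    return [sum(p * q for p, q in zip(row, mix)) for row in payoff]

def action_likelihoods(payoff, mix, rationality=1.0):
    """Logit (quantal response) prediction: the likelihood of an attack
    action grows with its expected payoff; `rationality` sharpens the choice."""
    ev = expected_payoffs(payoff, mix)
    weights = [math.exp(rationality * v) for v in ev]
    total = sum(weights)
    return [w / total for w in weights]

for a, p in zip(actions, action_likelihoods(payoff, defender_mix)):
    print(a, round(p, 3))   # the higher-payoff action is predicted more often
```

Raising `rationality` pushes the prediction toward the pure best response; lowering it toward uniform, which is one way to model attackers of varying sophistication.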

  19. Clinical and Demographic Characteristics Associated With Suboptimal Primary Stroke and Transient Ischemic Attack Prevention: Retrospective Analysis.

    Science.gov (United States)

    Turner, Grace M; Calvert, Melanie; Feltham, Max G; Ryan, Ronan; Finnikin, Samuel; Marshall, Tom

    2018-03-01

    Primary prevention of stroke and transient ischemic attack (TIA) is important to reduce the burden of these conditions; however, prescribing of prevention drugs is suboptimal. We aimed to identify individual clinical and demographic characteristics associated with potential missed opportunities for prevention therapy with lipid-lowering, anticoagulant, or antihypertensive drugs before stroke/TIA. We analyzed anonymized electronic primary care records from a UK primary care database that covers 561 family practices. Patients with first-ever stroke/TIA, ≥18 years, with diagnosis between January 1, 2009, and December 31, 2013, were included. Missed opportunities for prevention were defined as people with clinical indications for lipid-lowering, anticoagulant, or antihypertensive drugs but not prescribed these drugs before their stroke/TIA. Mixed-effect logistic regression models evaluated the relationship between missed opportunities and individual clinical/demographic characteristics. The inclusion criteria were met by 29 043 people with stroke/TIA. Patients with coronary heart disease, chronic kidney disease, peripheral arterial disease, or diabetes mellitus were at less risk of a missed opportunity for prescription of lipid-lowering and antihypertensive drugs. However, patients with a 10-year cardiovascular disease risk ≥20% but without these diagnoses had increased risk of having a missed opportunity for prescription of lipid-lowering drugs or antihypertensive drugs. Women were less likely to be prescribed anticoagulants but more likely to be prescribed antihypertensive drugs. The elderly (≥85 years of age) were less likely to be prescribed all 3 prevention drugs, compared with people aged 75 to 79 years. Knowing the patient characteristics predictive of missed opportunities for stroke prevention may help primary care identify and appropriately manage these patients. Improving the management of these groups may reduce their risk and potentially prevent

  20. 50 Years of coastal erosion analysis: A new methodological approach.

    Science.gov (United States)

    Prieto Campos, Antonio; Diaz Cuevas, Pilar; Ojeda zujar, Jose; Guisado-Pintado, Emilia

    2017-04-01

    Coasts all over the world have been subjected to increased anthropogenic pressures which, combined with the impacts of natural hazards (storm events, rising sea levels), have led to severe erosion problems with negative impacts on the economy and the safety of coastal communities. The Andalusian coast (South Spain) is a renowned global tourist destination. In the past decades, a deep transformation of the economic model led to significant land use changes: strong regulation of rivers, urbanisation and occupation of dunes, among others. As a result, irreversible transformations of the coastline, stemming from the aggressive urbanisation undertaken, must now be faced by local authorities and suffered by locals and visitors. Moreover, the expected impacts of climate change, aggravated by anthropic activities, emphasise the need for tools that facilitate decision making for sustainable coastal management. In this contribution a homogeneous methodology (a single proxy and a single photointerpreter) is proposed for the calculation of coastal erosion rates for the exposed beaches of Andalusia (640 km), through the use of detailed series (1:2500) of open source orthophotographies for the period 1956-1977-2001-2011. The combination of the traditional DSAS software (Digital Shoreline Analysis System) with a spatial database (PostgreSQL) that integrates the resulting erosion rates with related coastal thematic information (geomorphology, presence of engineering infrastructures, dunes and ecosystems) enhances the capacity for analysis and exploitation. Further, the homogeneity of the method allows comparison of results among years on a highly diverse coast with both Mediterranean and Atlantic façades. The novel development and integration of a PostgreSQL/PostGIS database facilitates the exploitation of the results by the user (for instance by relating calculated rates with other thematic information, such as the geomorphology of the coast or the presence of a dune field on
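The basic statistic DSAS computes per transect is easy to sketch. With hypothetical shoreline positions (metres seaward of a fixed baseline) at the four survey years, the End Point Rate is the net shoreline movement divided by the elapsed time, with negative values meaning erosion:

```python
# Hypothetical shoreline positions (metres seaward of a fixed baseline)
# along one transect at the study's four survey years.
positions = {1956: 120.0, 1977: 112.5, 2001: 104.0, 2011: 101.5}

def end_point_rate(positions, start_year, end_year):
    """DSAS-style End Point Rate: net shoreline movement divided by the
    elapsed time (m/yr); negative values indicate erosion."""
    return (positions[end_year] - positions[start_year]) / (end_year - start_year)

for start, end in [(1956, 1977), (1977, 2001), (2001, 2011), (1956, 2011)]:
    print(start, end, round(end_point_rate(positions, start, end), 3))
```

Storing one such rate per transect per period in the spatial database is what lets the rates be joined against geomorphology or dune-field layers afterwards.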

  1. Compliance strategy for statistically based neutron overpower protection safety analysis methodology

    International Nuclear Information System (INIS)

    Holliday, E.; Phan, B.; Nainer, O.

    2009-01-01

    The methodology employed in the safety analysis of the slow Loss of Regulation (LOR) event in the OPG and Bruce Power CANDU reactors, referred to as Neutron Overpower Protection (NOP) analysis, is a statistically based methodology. Further enhancement to this methodology includes the use of Extreme Value Statistics (EVS) for the explicit treatment of aleatory and epistemic uncertainties, and probabilistic weighting of the initial core states. A key aspect of this enhanced NOP methodology is to demonstrate adherence, or compliance, with the analysis basis. This paper outlines a compliance strategy capable of accounting for the statistical nature of the enhanced NOP methodology. (author)

  2. Modeling methodology for supply chain synthesis and disruption analysis

    Science.gov (United States)

    Wu, Teresa; Blackhurst, Jennifer

    2004-11-01

    The concept of an integrated or synthesized supply chain is a strategy for managing today's globalized and customer driven supply chains in order to better meet customer demands. Synthesizing individual entities into an integrated supply chain can be a challenging task due to a variety of factors including conflicting objectives, mismatched incentives and constraints of the individual entities. Furthermore, understanding the effects of disruptions occurring at any point in the system is difficult when working toward synthesizing supply chain operations. Therefore, the goal of this research is to present a modeling methodology to manage the synthesis of a supply chain by linking hierarchical levels of the system and to model and analyze disruptions in the integrated supply chain. The contribution of this research is threefold: (1) supply chain systems can be modeled hierarchically; (2) the performance of a synthesized supply chain system can be evaluated quantitatively; (3) reachability analysis is used to evaluate the system performance and verify whether a specific state is reachable, allowing the user to understand the extent of the effects of a disruption.
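The reachability check at the heart of point (3) can be sketched as a breadth-first search over a toy disruption state graph (states and transitions invented for illustration):

```python
from collections import deque

# Hypothetical supply-chain states and the transitions between them;
# edges describe how a disruption can propagate downstream (or be repaired).
transitions = {
    "nominal":       ["supplier_down"],
    "supplier_down": ["plant_starved", "nominal"],
    "plant_starved": ["orders_late"],
    "orders_late":   [],
}

def reachable(start, target, transitions):
    """Breadth-first reachability: can the system evolve from `start` into
    `target`?  Verifying whether a specific (disrupted) state is reachable
    shows how far the effects of a disruption can spread."""
    seen, queue = {start}, deque([start])
    while queue:
        state = queue.popleft()
        if state == target:
            return True
        for nxt in transitions.get(state, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

print(reachable("nominal", "orders_late", transitions))  # True: failure propagates
print(reachable("orders_late", "nominal", transitions))  # False: no recovery edge
```

In a hierarchical model the same search runs at each level, with a state at one level expanding into a subgraph at the level below.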

  3. [Compressive-spectral analysis of EEG in patients with panic attacks in the context of different psychiatric diseases].

    Science.gov (United States)

    Tuter, N V; Gnezditskiĭ, V V

    2008-01-01

    Panic disorders (PD) which develop in the context of different psychiatric diseases (neurotic, personality and schizotypal disorders) have their own clinical and neurophysiological features. The results of compressive-spectral analysis of EEG (CSA EEG) in patients with panic attacks differed depending on the specifics of the initial psychiatric status. EEG parameters in patients differed from those in controls. The common feature for all PD patients was a lower spectral density of the theta-, alpha- and beta-bands, as well as of the total spectral density, without any alterations of regional distribution. A decrease in the electrical activity of activation systems was found in the groups with neurotic and schizotypal disorders, and of inhibition systems in the group with schizotypal disorders. The EEG results did not suggest any depression of activation systems in patients with specific personality disorders. The data obtained with CSA EEG mirror the integrative brain activity that determines the appearance of panic attacks as well as the nosology of the psychiatric disease.

  4. A METHODOLOGICAL APPROACH TO THE STRATEGIC ANALYSIS OF FOOD SECURITY

    Directory of Open Access Journals (Sweden)

    Anastasiia Mostova

    2017-12-01

    Full Text Available The objective of the present work is to substantiate the use of tools for strategic analysis in order to develop a strategy for the country’s food security under current conditions, and to devise the author’s original technique for performing a strategic analysis of food security using a SWOT-analysis. The methodology of the study. The article substantiates the need for strategic planning of food security. The author considers the stages of strategic planning and explains the importance of the stage of strategic analysis of the country’s food security. It is proposed to apply a SWOT-analysis when running a strategic analysis of food security. The study is based on the system of indicators and characteristics of the country’s economy, agricultural sector, market trends, and material-technical, financial, and human resources, which are essential to obtain an objective assessment of the impact of trends and factors on food security, and to further develop the procedure for conducting a strategic analysis of the country’s food security. Results of the study. The procedure for the strategic analysis of food security is developed based on the SWOT-analysis tool, which implies three stages: a strategic analysis of weaknesses and strengths, opportunities and threats; construction of the matrix of weaknesses and strengths, opportunities, and threats (the SWOT-analysis matrix); and formation of the food security strategy based on the SWOT-analysis matrix. A list of characteristics was compiled in order to conduct a strategic analysis of food security and to categorize them as strengths or weaknesses, threats, and opportunities. The characteristics are systemized into strategic groups: production, market, resources, and consumption; this is necessary for the objective establishment of strategic directions, responsible performers, allocation of resources, and effective control, for the purpose of further development and implementation of the strategy. A strategic analysis

  5. Iterative Transport-Diffusion Methodology For LWR Core Analysis

    Science.gov (United States)

    Colameco, David; Ivanov, Boyan D.; Beacon, Daniel; Ivanov, Kostadin N.

    2014-06-01

    This paper presents an update on the development of an advanced methodology for core calculations that uses local heterogeneous solutions for on-the-fly nodal cross-section generation. The Iterative Transport-Diffusion Method is an embedded transport approach that is expected to provide results with near 3D transport accuracy for a fraction of the time required by a full 3D transport method. In this methodology, the infinite environment used for homogenized nodal cross-section generation is replaced with a simulated 3D environment of the diffusion calculation. This update focuses on burnup methodology, axial leakage and 3D modeling.

  6. Using functional analysis in archival appraisal a practical and effective alternative to traditional appraisal methodologies

    CERN Document Server

    Robyns, Marcus C

    2014-01-01

    In an age of scarcity and the challenge of electronic records, can archivists and records managers continue to rely upon traditional methodology essentially unchanged since the early 1950s? Using Functional Analysis in Archival Appraisal: A Practical and Effective Alternative to Traditional Appraisal Methodologies shows how archivists in other countries are already using functional analysis, which offers a better, more effective, and imminently more practical alternative to traditional appraisal methodologies that rely upon an analysis of the records themselves.

  7. Temporal Cyber Attack Detection.

    Energy Technology Data Exchange (ETDEWEB)

    Ingram, Joey Burton [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Draelos, Timothy J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Galiardi, Meghan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Doak, Justin E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-11-01

    Rigorous characterization of the performance and generalization ability of cyber defense systems is extremely difficult, making it hard to gauge uncertainty, and thus, confidence. This difficulty largely stems from a lack of labeled attack data that fully explores the potential adversarial space. Currently, performance of cyber defense systems is typically evaluated in a qualitative manner by manually inspecting the results of the system on live data and adjusting as needed. Additionally, machine learning has shown promise in deriving models that automatically learn indicators of compromise that are more robust than analyst-derived detectors. However, to generate these models, most algorithms require large amounts of labeled data (i.e., examples of attacks). Algorithms that do not require annotated data to derive models are similarly at a disadvantage, because labeled data is still necessary when evaluating performance. In this work, we explore the use of temporal generative models to learn cyber attack graph representations and automatically generate data for experimentation and evaluation. Training and evaluating cyber systems and machine learning models requires significant, annotated data, which is typically collected and labeled by hand for one-off experiments. Automatically generating such data helps derive/evaluate detection models and ensures reproducibility of results. Experimentally, we demonstrate the efficacy of generative sequence analysis techniques on learning the structure of attack graphs, based on a realistic example. These derived models can then be used to generate more data. Additionally, we provide a roadmap for future research efforts in this area.
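A first-order Markov chain over attack steps is perhaps the simplest member of the family of temporal generative models described above. The sketch below, with invented traces rather than the report's data, fits transition probabilities from labeled attack sequences and then samples synthetic sequences for experimentation:

```python
import random
from collections import defaultdict

# Invented labeled attack traces: ordered steps of observed intrusions.
traces = [
    ["recon", "exploit", "escalate", "exfiltrate"],
    ["recon", "phish", "escalate", "exfiltrate"],
    ["recon", "exploit", "exfiltrate"],
]

def fit_markov(traces):
    """Estimate first-order transition probabilities between attack steps."""
    counts = defaultdict(lambda: defaultdict(int))
    for trace in traces:
        for a, b in zip(trace, trace[1:]):
            counts[a][b] += 1
    return {state: {nxt: c / sum(nxts.values()) for nxt, c in nxts.items()}
            for state, nxts in counts.items()}

def generate(model, start, end, rng):
    """Sample a synthetic attack sequence from the fitted model."""
    sequence, state = [start], start
    while state != end and state in model:
        nxts = model[state]
        state = rng.choices(list(nxts), weights=list(nxts.values()))[0]
        sequence.append(state)
    return sequence

model = fit_markov(traces)
print(model["recon"])    # exploit is twice as likely as phish after recon
print(generate(model, "recon", "exfiltrate", random.Random(0)))
```

Richer temporal models (higher-order chains, recurrent networks) generalize this fit-then-sample loop; the generated sequences play the role of the annotated data that is otherwise labeled by hand.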

  8. Improved Methodology of MSLB M/E Release Analysis for OPR1000

    International Nuclear Information System (INIS)

    Park, Seok Jeong; Kim, Cheol Woo; Seo, Jong Tae

    2006-01-01

    A new mass and energy (M/E) release analysis methodology for equipment environmental qualification (EEQ) under loss-of-coolant accident (LOCA) conditions has recently been developed and adopted for small break LOCA EEQ. The new M/E release analysis methodology has been extended to the M/E release analysis for containment design for large break LOCA and the main steam line break (MSLB) accident, and named the KIMERA (KOPEC Improved Mass and Energy Release Analysis) methodology. The computer code system used in this methodology is RELAP5K/CONTEMPT4 (or RELAP5-ME), which couples RELAP5/MOD3.1/K, enhanced with an M/E model and a LOCA long-term model, with CONTEMPT4/MOD5. The KIMERA methodology is applied here to the MSLB M/E release analysis to validate it for MSLB in containment design. The results are compared with the OPR1000 FSAR

  9. Radicalization patterns and modes of attack planning and preparation among lone-actor terrorists : an exploratory analysis

    NARCIS (Netherlands)

    Lindekilde, L.; O'Connor, F.; Schuurman, B.W.

    2017-01-01

    This article explores the link between radicalization patterns and modes of attack planning and preparation among lone-actor terrorists. Building on theorized patterns of lone-actor radicalization, we discuss and compare their modes of pre-attack behavior, including target and weapon choice,

  10. A continuum damage analysis of hydrogen attack in a 2.25Cr–1Mo pressure vessel

    NARCIS (Netherlands)

    Burg, M.W.D. van der; Giessen, E. van der; Tvergaard, V.

    1998-01-01

    A micromechanically based continuum damage model is presented to analyze the stress, temperature and hydrogen pressure dependent material degradation process termed hydrogen attack, inside a pressure vessel. Hydrogen attack (HA) is the damage process of grain boundary facets due to a chemical

  11. The Terrorist Attacks and the Human Live Birth Sex Ratio: a Systematic Review and Meta-Analysis.

    Science.gov (United States)

    Masukume, Gwinyai; O'Neill, Sinéad M; Khashan, Ali S; Kenny, Louise C; Grech, Victor

    2017-01-01

    The live birth sex ratio is defined as male/total births (M/F). Terrorist attacks have been associated with a transient decline in M/F 3-5 months later, with an excess of male losses in ongoing pregnancies. The early 21st century is replete with religiously/politically instigated attacks. This study estimated the pooled effect size of the association between exposure to attacks and M/F (registration number CRD42016041220). PubMed and Scopus were searched for ecological studies that evaluated the relationship between terrorist attacks from 1/1/2000 to 16/6/2016 and M/F. An overall pooled odds ratio (OR) for the main outcome was generated using the generic inverse variance method. Five studies were included: the 2011 Norway attacks; the 2012 Sandy Hook Elementary School shooting; the 2001 September 11 attacks; and the 2004 Madrid and 2005 London bombings. The pooled OR of 0.97, 95% CI 0.94-1.00 (I2 = 63%), showed a small, statistically significant 3% decline in the odds (p = 0.03) of having a male live birth 3-5 months later. For lone wolf attacks there was a 10% reduction: OR 0.90, 95% CI 0.86-0.95 (p = 0.0001). Terrorist (especially lone wolf) attacks were significantly associated with reduced odds of having a live male birth. Pregnancy loss remains an important public health challenge. Systematic reviews and meta-analyses considering other calamities are warranted.
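The generic inverse variance pooling used here is mechanical enough to sketch: convert each study's OR and 95% CI to a log-OR and standard error, weight each log-OR by its inverse variance, and exponentiate the weighted mean. The per-study numbers below are invented, not the five included studies' results:

```python
import math

# Invented per-study odds ratios with 95% confidence intervals,
# in the form (OR, lower, upper).
studies = [
    (0.95, 0.90, 1.00),
    (0.99, 0.93, 1.05),
    (0.90, 0.86, 0.95),
]

def pooled_or(studies, z=1.96):
    """Fixed-effect generic inverse variance pooling on the log-OR scale.
    Each study's SE is recovered from its CI width: CI = OR * exp(+-z*SE)."""
    num = den = 0.0
    for or_, lo, hi in studies:
        se = (math.log(hi) - math.log(lo)) / (2 * z)
        w = 1.0 / se ** 2                 # inverse-variance weight
        num += w * math.log(or_)
        den += w
    mean = num / den
    se_pooled = math.sqrt(1.0 / den)
    return (math.exp(mean),
            math.exp(mean - z * se_pooled),
            math.exp(mean + z * se_pooled))

est, lo, hi = pooled_or(studies)
print(round(est, 3), round(lo, 3), round(hi, 3))
```

A random-effects variant (as suggested by the reported I2 of 63%) would widen the weights by a between-study variance term before pooling.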

  12. EUROCONTROL-Systemic Occurrence Analysis Methodology (SOAM)-A 'Reason'-based organisational methodology for analysing incidents and accidents

    International Nuclear Information System (INIS)

    Licu, Tony; Cioran, Florin; Hayward, Brent; Lowe, Andrew

    2007-01-01

    The Safety Occurrence Analysis Methodology (SOAM) developed for EUROCONTROL is an accident investigation methodology based on the Reason model of organisational accidents. The purpose of a SOAM is to broaden the focus of an investigation from human involvement issues, also known as 'active failures of operational personnel' under Reason's original model, to include analysis of the latent conditions deeper within the organisation that set the context for the event. Such an approach is consistent with the tenets of Just Culture, in which people are encouraged to provide full and open information about how incidents occurred and are not penalised for errors. A truly systemic approach is not simply a means of transferring responsibility for a safety occurrence from front-line employees to senior managers. A consistent philosophy must be applied, where the investigation process seeks to correct deficiencies wherever they may be found, without attempting to apportion blame or liability.

  13. Methodologies for analysis of patterning in the mouse RPE sheet

    Science.gov (United States)

    Boatright, Jeffrey H.; Dalal, Nupur; Chrenek, Micah A.; Gardner, Christopher; Ziesel, Alison; Jiang, Yi; Grossniklaus, Hans E.

    2015-01-01

    Purpose: Our goal was to optimize procedures for assessing shapes, sizes, and other quantitative metrics of retinal pigment epithelium (RPE) cells and contact- and noncontact-mediated cell-to-cell interactions across a large series of flatmount RPE images. Methods: The two principal methodological advances of this study were optimization of a mouse RPE flatmount preparation and refinement of open-access software to rapidly analyze large numbers of flatmount images. Mouse eyes were harvested, and extra-orbital fat and muscles were removed. Eyes were fixed for 10 min, and dissected by puncturing the cornea with a sharp needle or a stab knife. Four radial cuts were made with iridectomy scissors from the puncture to near the optic nerve head. The lens, iris, and the neural retina were removed, leaving the RPE sheet exposed. The dissection and outcomes were monitored and evaluated by video recording. The RPE sheet was imaged under fluorescence confocal microscopy after staining for ZO-1 to identify RPE cell boundaries. Photoshop, Java, Perl, and Matlab scripts, as well as CellProfiler, were used to quantify selected parameters. Data were exported into Excel spreadsheets for further analysis. Results: A simplified dissection procedure afforded a consistent source of images that could be processed by computer. The dissection and flatmounting techniques were illustrated in a video recording. Almost all of the sheet could be routinely imaged, and substantial fractions of the RPE sheet (usually 20–50% of the sheet) could be analyzed. Several common technical problems were noted and workarounds developed. The software-based analysis merged 25 to 36 images into one and adjusted settings to record an image suitable for large-scale identification of cell-to-cell boundaries, and then obtained quantitative descriptors of the shape of each cell, its neighbors, and interactions beyond direct cell–cell contact in the sheet. To validate the software, human- and computer

  14. Methodologies for analysis of patterning in the mouse RPE sheet.

    Science.gov (United States)

    Boatright, Jeffrey H; Dalal, Nupur; Chrenek, Micah A; Gardner, Christopher; Ziesel, Alison; Jiang, Yi; Grossniklaus, Hans E; Nickerson, John M

    2015-01-01

    Our goal was to optimize procedures for assessing shapes, sizes, and other quantitative metrics of retinal pigment epithelium (RPE) cells and contact- and noncontact-mediated cell-to-cell interactions across a large series of flatmount RPE images. The two principal methodological advances of this study were optimization of a mouse RPE flatmount preparation and refinement of open-access software to rapidly analyze large numbers of flatmount images. Mouse eyes were harvested, and extra-orbital fat and muscles were removed. Eyes were fixed for 10 min, and dissected by puncturing the cornea with a sharp needle or a stab knife. Four radial cuts were made with iridectomy scissors from the puncture to near the optic nerve head. The lens, iris, and the neural retina were removed, leaving the RPE sheet exposed. The dissection and outcomes were monitored and evaluated by video recording. The RPE sheet was imaged under fluorescence confocal microscopy after staining for ZO-1 to identify RPE cell boundaries. Photoshop, Java, Perl, and Matlab scripts, as well as CellProfiler, were used to quantify selected parameters. Data were exported into Excel spreadsheets for further analysis. A simplified dissection procedure afforded a consistent source of images that could be processed by computer. The dissection and flatmounting techniques were illustrated in a video recording. Almost all of the sheet could be routinely imaged, and substantial fractions of the RPE sheet (usually 20-50% of the sheet) could be analyzed. Several common technical problems were noted and workarounds developed. The software-based analysis merged 25 to 36 images into one and adjusted settings to record an image suitable for large-scale identification of cell-to-cell boundaries, and then obtained quantitative descriptors of the shape of each cell, its neighbors, and interactions beyond direct cell-cell contact in the sheet. To validate the software, human- and computer-analyzed results were compared. 
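
    The per-cell "quantitative descriptors of shape" this pipeline extracts from traced boundaries can be illustrated with elementary polygon geometry; this is a minimal sketch, not the authors' Photoshop/Java/Perl/Matlab/CellProfiler scripts.

```python
import math

def shape_metrics(vertices):
    """Area (shoelace formula), perimeter, and circularity
    (4*pi*area/perimeter^2, equal to 1.0 for a circle) of a traced
    cell boundary given as (x, y) vertices in order."""
    n = len(vertices)
    area = 0.0
    perim = 0.0
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        area += x1 * y2 - x2 * y1        # signed shoelace term
        perim += math.hypot(x2 - x1, y2 - y1)
    area = abs(area) / 2.0
    return area, perim, 4 * math.pi * area / perim ** 2

# A unit square: area 1, perimeter 4, circularity pi/4
print(shape_metrics([(0, 0), (1, 0), (1, 1), (0, 1)]))
```

    The same boundary lists also yield neighbor counts and beyond-contact interaction metrics once cells are indexed by adjacency.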

  15. Iterative transport-diffusion methodology for LWR core analysis

    International Nuclear Information System (INIS)

    Colameco, D.; Beacon, D.; Ivanov, K.N.; Ivanov, B.D.

    2013-01-01

    This paper presents an update on the development of an advanced methodology for Light Water Reactor core calculations that uses local heterogeneous solutions for on-the-fly nodal cross-section generation. The Iterative Transport-Diffusion Method (ITDM) is an embedded transport approach that is expected to provide results with near 3D transport accuracy for a fraction of the time required by a full 3D transport method. In this methodology, the infinite environment used for homogenized nodal cross-section generation is replaced with a simulated 3D environment of the diffusion calculation. It is shown that the ITDM methodology provides very promising results when using partial currents as boundary conditions for loosely coupling a 2D lattice transport code to a 3D core nodal solver. The use of partial currents is a major improvement over the albedo concept: the solutions converged in a smoother manner
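
    The partial currents ITDM exchanges as boundary conditions follow the standard diffusion-theory split of an interface current into outgoing and incoming halves, J± = φ/4 ± J/2. A minimal sketch with toy numbers (an illustration of the coupling quantity, not the ITDM implementation):

```python
def partial_currents(phi, j_net):
    """Diffusion-theory partial currents at a node interface:
    outgoing J+ = phi/4 + J/2 and incoming J- = phi/4 - J/2,
    from the scalar flux phi and net current J."""
    return phi / 4 + j_net / 2, phi / 4 - j_net / 2

def net_current(j_plus, j_minus):
    """Net current recovered from the two partial currents."""
    return j_plus - j_minus

# Toy interface values: flux 2.0, net current 0.3
jp, jm = partial_currents(2.0, 0.3)
print(jp, jm, net_current(jp, jm))
```

    In the iterative scheme, the 3D nodal solver supplies incoming partial currents to the 2D lattice calculation, whose updated cross sections feed the next diffusion pass, repeating until convergence.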

  16. The analysis of classroom talk: methods and methodologies.

    Science.gov (United States)

    Mercer, Neil

    2010-03-01

    This article describes methods for analysing classroom talk. Both quantitative and qualitative methods are described and assessed for their strengths and weaknesses, with a discussion of the mixed use of such methods. It is acknowledged that particular methods are often embedded in particular methodologies, which are based on specific theories of social action, research paradigms, and disciplines; a comparison is therefore made of two contemporary methodologies, linguistic ethnography and sociocultural research. The article concludes with some comments on the current state of development of this field of research and on ways that it might usefully progress.

  17. An Analysis of the Impact of the AuthRF and AssRF Attacks on IEEE 802.11e Standard

    OpenAIRE

    Bogdanoski, Mitko; Latkoski, Pero; Risteski, Aleksandar

    2015-01-01

    The paper shows a detailed analysis of the effects of the AuthRF (Authentication Request Flooding) and AssRF (Association Request Flooding) MAC Layer DoS (Denial of Service) attacks on the 802.11e wireless standard based on a proposed queuing model. More specifically, the paper analyzes the Access Point (AP) behavior under AuthRF DoS attacks with different request arrival frequencies, i.e. Low Level (LL), Medium Level (ML) and High Level (HL), at the same time considering different traffic priori...
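
    The queuing-model view of an AP under request flooding can be illustrated with textbook M/M/1 formulas: as the flood's arrival rate approaches the AP's service rate, queue length and delay blow up. The LL/ML/HL rates below are hypothetical; the paper's actual model (with traffic priorities) is more elaborate.

```python
def mm1_stats(arrival_rate, service_rate):
    """Steady-state M/M/1 metrics for an AP modeled as a single-server
    queue: utilization rho = lambda/mu, mean number in system
    L = rho/(1-rho), mean sojourn time W = 1/(mu - lambda).
    For rho >= 1 the queue grows without bound."""
    rho = arrival_rate / service_rate
    if rho >= 1:
        return rho, float('inf'), float('inf')
    return rho, rho / (1 - rho), 1.0 / (service_rate - arrival_rate)

# Hypothetical request rates (req/s) for the three flooding levels,
# against an assumed AP service rate of 500 req/s
for level, lam in [("LL", 50), ("ML", 200), ("HL", 450)]:
    print(level, mm1_stats(lam, 500.0))
```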

  18. Analysis of Protection Measures for Naval Vessels Berthed at Harbor Against Terrorist Attacks

    Science.gov (United States)

    2016-06-01

    …methods to analyze the data generated from the experiment discussed in the previous chapter. For the analysis, the JMP statistical package was the

  19. Development of a method of analysis and computer program for calculating the inviscid flow about the windward surfaces of space shuttle configurations at large angles of attack

    Science.gov (United States)

    Maslen, S. H.

    1974-01-01

    A general method developed for the analysis of inviscid hypersonic shock layers is discussed for application to the case of the shuttle vehicle at high (65 deg) angle of attack. The associated extensive subsonic flow region caused convergence difficulties whose resolution is discussed. It is required that the solution be smoother than anticipated.

  20. Towards a Methodological Improvement of Narrative Inquiry: A Qualitative Analysis

    Science.gov (United States)

    Abdallah, Mahmoud Mohammad Sayed

    2009-01-01

    The article suggests that though narrative inquiry as a research methodology entails free conversations and personal stories, yet it should not be totally free and fictional as it has to conform to some recognized standards used for conducting educational research. Hence, a qualitative study conducted by Russ (1999) was explored as an exemplar…

  1. Seismic hazard analysis. A methodology for the Eastern United States

    International Nuclear Information System (INIS)

    Bernreuter, D.L.

    1980-08-01

    This report presents a probabilistic approach for estimating the seismic hazard in the Central and Eastern United States. The probabilistic model (Uniform Hazard Methodology) systematically incorporates the subjective opinion of several experts in the evaluation of seismic hazard. Subjective input, assumptions and associated hazard are kept separate for each expert so as to allow review and preserve diversity of opinion. The report is organized into five sections: Introduction, Methodology Comparison, Subjective Input, Uniform Hazard Methodology (UHM), and Uniform Hazard Spectrum. Section 2, Methodology Comparison, briefly describes the present approach and compares it with other available procedures. The remainder of the report focuses on the UHM. Specifically, Section 3 describes the elicitation of subjective input; Section 4 gives details of various mathematical models (earthquake source geometry, magnitude distribution, attenuation relationship) and how these models are combined to calculate seismic hazard. The last section, Uniform Hazard Spectrum, highlights the main features of typical results. Specific results and sensitivity analyses are not presented in this report. (author)

  2. Qualitative Analysis of Comic Strip Culture: A Methodological Inquiry.

    Science.gov (United States)

    Newman, Isadore; And Others

    The paper is a methodological inquiry into the interpretation of qualitative data. It explores a grounded-theory approach to the synthesis of data and examines, in particular, the construction of categories. It focuses on ways of organizing and attaching meaning to data, as research problems embedded in a cultural context are explored. A…

  3. Grounded Theory and Educational Ethnography: A Methodological Analysis and Critique.

    Science.gov (United States)

    Smith, Louis M.; Pohland, Paul A.

    This paper analyzes and evaluates the methodological approach developed by B. G. Glaser and A. L. Strauss in THE DISCOVERY OF GROUNDED THEORY (Chicago: Aldine, 1967). Smith and Pohland's major intent is to raise Glaser and Strauss' most significant concepts and issues, analyze them in the context of seven of their own studies, and in conclusion…

  4. Performance Analysis with Network-Enhanced Complexities: On Fading Measurements, Event-Triggered Mechanisms, and Cyber Attacks

    Directory of Open Access Journals (Sweden)

    Derui Ding

    2014-01-01

    Nowadays, real-world systems are usually subject to various complexities such as parameter uncertainties, time-delays, and nonlinear disturbances. For networked systems, especially large-scale systems such as multiagent systems and systems over sensor networks, the complexities are inevitably enhanced in terms of their degrees or intensities because of the usage of the communication networks. Therefore, it would be interesting to (1) examine how this kind of network-enhanced complexities affects the control or filtering performance; and (2) develop some suitable approaches for controller/filter design problems. In this paper, we aim to survey some recent advances on the performance analysis and synthesis with three sorts of fashionable network-enhanced complexities, namely, fading measurements, event-triggered mechanisms, and attack behaviors of adversaries. First, these three kinds of complexities are introduced in detail according to their engineering backgrounds, dynamical characteristics, and modelling techniques. Then, the developments of the performance analysis and synthesis issues for various networked systems are systematically reviewed. Furthermore, some challenges are illustrated by using a thorough literature review and some possible future research directions are highlighted.

  5. Practical Correlation Analysis between Scan and Malware Profiles against Zero-Day Attacks Based on Darknet Monitoring

    Science.gov (United States)

    Nakao, Koji; Inoue, Daisuke; Eto, Masashi; Yoshioka, Katsunari

    Considering the rapid increase of recent highly organized and sophisticated malwares, practical solutions for the countermeasures against malwares, especially related to zero-day attacks, should be effectively developed in an urgent manner. Several research activities have already been carried out focusing on statistic calculation of network events by means of global network sensors (the so-called macroscopic approach) as well as on direct malware analysis such as code analysis (the so-called microscopic approach). However, in the current research activities, it is not at all clear how to inter-correlate between network behaviors obtained from the macroscopic approach and malware behaviors obtained from the microscopic approach. In this paper, on one side, network behaviors observed from the darknet are strictly analyzed to produce scan profiles, and on the other side, malware behaviors obtained from honeypots are correctly analyzed so as to produce a set of profiles containing malware characteristics. To this end, the inter-relationship between the above two types of profiles is practically discussed and studied so that frequently observed malware behaviors can be finally identified in view of the scan-malware chain.
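
    At its core, inter-correlating a darknet scan profile with a honeypot malware profile means comparing two feature vectors. A minimal sketch using cosine similarity over hypothetical per-port probe counts (the paper's actual profile features and matching procedure differ):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors, e.g. a scan
    profile from darknet monitoring and a malware profile from a
    honeypot (here: counts of probes per destination port)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical per-port counts: darknet observation vs. a
# honeypot-captured malware sample scanning the same services
scan_profile = [120, 0, 45, 0, 3]
malware_profile = [100, 2, 50, 0, 0]
print(cosine_similarity(scan_profile, malware_profile))
```

    A high similarity between a live scan profile and a known malware profile is the kind of evidence the scan-malware chain uses to attribute darknet traffic to a malware family.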

  6. A PANIC Attack on Inflation and Unemployment in Africa: Analysis of Persistence and Convergence

    OpenAIRE

    DO ANGO, Simplicio; AMBA OYON, Claude Marius

    2016-01-01

    The purpose of the study is to analyze the nature of inflation and unemployment rates in Africa and its regions, allowing cross-sectional dependence among their countries. The paper contributes to the literature assessing the stochastic properties of unemployment and inflation using the recently developed and more powerful panel unit root tests, namely PANIC (Panel Analysis of Non-stationarity in Idiosyncratic and Common Components) from Bai and Ng (2010). To check the robustness of our findin...

  7. Neck-focused panic attacks among Cambodian refugees; a logistic and linear regression analysis.

    Science.gov (United States)

    Hinton, Devon E; Chhean, Dara; Pich, Vuth; Um, Khin; Fama, Jeanne M; Pollack, Mark H

    2006-01-01

    Consecutive Cambodian refugees attending a psychiatric clinic were assessed for the presence and severity of current--i.e., at least one episode in the last month--neck-focused panic. Among the whole sample (N=130), in a logistic regression analysis, the Anxiety Sensitivity Index (ASI; odds ratio=3.70) and the Clinician-Administered PTSD Scale (CAPS; odds ratio=2.61) significantly predicted the presence of current neck panic (NP). Among the neck panic patients (N=60), in the linear regression analysis, NP severity was significantly predicted by NP-associated flashbacks (beta=.42), NP-associated catastrophic cognitions (beta=.22), and CAPS score (beta=.28). Further analysis revealed the effect of the CAPS score to be significantly mediated (Sobel test [Baron, R. M., & Kenny, D. A. (1986). The moderator-mediator variable distinction in social psychological research: conceptual, strategic, and statistical considerations. Journal of Personality and Social Psychology, 51, 1173-1182]) by both NP-associated flashbacks and catastrophic cognitions. In the care of traumatized Cambodian refugees, NP severity, as well as NP-associated flashbacks and catastrophic cognitions, should be specifically assessed and treated.
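
    The Sobel test cited in this record has a compact closed form: z = ab / sqrt(b²·SEa² + a²·SEb²), where a is the predictor-to-mediator path and b the mediator-to-outcome path. A sketch with illustrative path coefficients (not the study's data):

```python
import math

def sobel_z(a, se_a, b, se_b):
    """Sobel test statistic for mediation (Baron & Kenny framework):
    z = a*b / sqrt(b^2*se_a^2 + a^2*se_b^2), where a is the
    predictor->mediator path and b the mediator->outcome path."""
    return (a * b) / math.sqrt(b ** 2 * se_a ** 2 + a ** 2 * se_b ** 2)

# Illustrative coefficients; |z| > 1.96 suggests significant mediation
z = sobel_z(a=0.40, se_a=0.10, b=0.35, se_b=0.12)
print(z)
```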

  8. Prototype application of best estimate and uncertainty safety analysis methodology to large LOCA analysis

    International Nuclear Information System (INIS)

    Luxat, J.C.; Huget, R.G.

    2001-01-01

    Development of a methodology to perform best estimate and uncertainty nuclear safety analysis has been underway at Ontario Power Generation for the past two and one half years. A key driver for the methodology development, and one of the major challenges faced, is the need to re-establish demonstrated safety margins that have progressively been undermined through excessive and compounding conservatism in deterministic analyses. The major focus of the prototyping applications was to quantify the safety margins that exist at the probable range of high power operating conditions, rather than the highly improbable operating states associated with Limit of the Envelope (LOE) assumptions. In LOE, all parameters of significance to the consequences of a postulated accident are assumed to simultaneously deviate to their limiting values. Another equally important objective of the prototyping was to demonstrate the feasibility of conducting safety analysis as an incremental analysis activity, as opposed to a major re-analysis activity. The prototype analysis solely employed prior analyses of Bruce B large break LOCA events - no new computer simulations were undertaken. This is a significant and novel feature of the prototyping work. This methodology framework has been applied to a postulated large break LOCA in a Bruce generating unit on a prototype basis. This paper presents results of the application. (author)

  9. Proposed methodology for completion of scenario analysis for the Basalt Waste Isolation Project

    International Nuclear Information System (INIS)

    Roberds, W.J.; Plum, R.J.; Visca, P.J.

    1984-11-01

    This report presents the methodology to complete an assessment of postclosure performance, considering all credible scenarios, including the nominal case, for a proposed repository for high-level nuclear waste at the Hanford Site, Washington State. The methodology consists of defensible techniques for identifying and screening scenarios, and for then assessing the risks associated with each. The results of the scenario analysis are used to comprehensively determine system performance and/or risk for evaluation of compliance with postclosure performance criteria (10 CFR 60 and 40 CFR 191). In addition to describing the proposed methodology, this report reviews available methodologies for scenario analysis, discusses pertinent performance assessment and uncertainty concepts, advises how to implement the methodology (including the organizational requirements and a description of tasks) and recommends how to use the methodology in guiding future site characterization, analysis, and engineered subsystem design work. 36 refs., 24 figs., 1 tab

  10. Ion beam analysis in art and archaeology: attacking the power precisions paradigm

    International Nuclear Information System (INIS)

    Abraham, Meg

    2004-01-01

    It is a post-modern axiom that the closer one looks at something the more blinkered is the view, thus the result is often a failure to see the whole picture. With this in mind, the value of a tool for art and archaeology applications is greatly enhanced if the information is scientifically precise and yet is easily integrated into the broader study regarding the objects at hand. Art and archaeological objects offer some unique challenges for researchers. First, they are almost always extraordinarily inhomogeneous across individual pieces and across types. Second, they are often valuable and delicate so sampling is discouraged. Finally, in most cases, each piece is unique, thus the data is also unique and is of greatest value when incorporated into the overall understanding of the object or of the culture of the artisan. Ion beam analysis solves many of these problems. With IBA, it is possible to avoid sampling by using an external beam setup or by manipulating small objects in a vacuum. The technique is largely non-destructive, allowing for multiple data points to be taken across an object. The X-ray yields are from deeper in the sample than those of other techniques and using RBS one can attain bulk concentrations from microns into the sample. And finally, the resulting X-ray spectra are easily interpreted and understood by many conservators and curators, while PIXE maps are a wonderful visual record of the results of the analysis. Some examples of the special role that ion beam analysis plays in the examination of cultural objects will be covered in this talk

  11. Protecting Cryptographic Memory against Tampering Attack

    DEFF Research Database (Denmark)

    Mukherjee, Pratyay

    In this dissertation we investigate the question of protecting cryptographic devices from tampering attacks. Traditional theoretical analysis of cryptographic devices is based on black-box models which do not take into account attacks on the implementations, known as physical attacks. In practice such attacks can be executed easily, e.g. by heating the device, as substantiated by numerous works in the past decade. Tampering attacks are a class of such physical attacks where the attacker can change the memory/computation, gains additional (non-black-box) knowledge by interacting with the faulty device and then tries to break the security. Prior works show that generically approaching such a problem is notoriously difficult. So, in this dissertation we attempt to solve an easier question, known as memory-tampering, where the attacker is allowed to tamper only with the memory of the device.

  12. HILIC-MS rat brain analysis, a new approach for the study of ischemic attack

    Directory of Open Access Journals (Sweden)

    Miękus Natalia

    2017-08-01

    Clinicians often rely on selected small molecular compounds from body fluids for the detection, screening or monitoring of numerous life-threatening diseases. Among others, important monoamines – biogenic amines (BAs) – and their metabolites serve as sensitive biomarkers to study the progression or even early detection of on-going brain pathologies or tumors of neuroendocrine origins. Undertaking the task to optimize a reliable method for the simultaneous analysis of the most relevant BAs in biological matrices is of utmost importance for scientists.

  13. Motor Impairments in Transient Ischemic Attack Increase the Odds of a Subsequent Stroke: A Meta-Analysis

    Directory of Open Access Journals (Sweden)

    Neha Lodha

    2017-06-01

    Background and purpose: Transient ischemic attack (TIA) increases the risk for a subsequent stroke. Typical symptoms include motor weakness, gait disturbance, and loss of coordination. The association between the presence of motor impairments during a TIA and the chances of a subsequent stroke has not been examined. In the current meta-analysis, we examine whether the odds of a stroke are greater in TIA individuals who experience motor impairments as compared with those who do not experience motor impairments. Methods: We conducted a systematic search of electronic databases as well as manual searches of the reference lists of retrieved articles. The meta-analysis included studies that reported an odds ratio relating motor impairments to a subsequent stroke, or the number of individuals with or without motor impairments who experienced a subsequent stroke. We examined these studies using rigorous meta-analysis techniques including random effects model, forest and funnel plots, I2, publication bias, and fail-safe analysis. Results: Twenty-four studies with 15,129 participants from North America, Australia, Asia, and Europe qualified for inclusion. An odds ratio of 2.11 (95% CI, 1.67–2.65, p = 0.000) suggested that the chances of a subsequent stroke are increased twofold in individuals who experience motor impairments during a TIA compared with those individuals who have no motor impairments. Conclusion: The presence of motor impairments during TIA is a significantly high-risk clinical characteristic for a subsequent stroke. The current evidence for motor impairments following TIA relies exclusively on the clinical reports of unilateral motor weakness. A comprehensive examination of motor impairments in TIA will enhance TIA prognosis and restoration of residual motor impairments.
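
    The study-level odds ratios pooled in a meta-analysis like this come from 2x2 tables, with SE(ln OR) = sqrt(1/a + 1/b + 1/c + 1/d). A sketch with hypothetical counts (not any included study's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI from a 2x2 table:
    a = exposed with event (e.g. stroke after TIA with motor
    impairment), b = exposed without event, c = unexposed with
    event, d = unexposed without event."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical single-study counts
print(odds_ratio_ci(30, 170, 15, 185))
```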

  14. Tight finite-key analysis for passive decoy-state quantum key distribution under general attacks

    Science.gov (United States)

    Zhou, Chun; Bao, Wan-Su; Li, Hong-Wei; Wang, Yang; Li, Yuan; Yin, Zhen-Qiang; Chen, Wei; Han, Zheng-Fu

    2014-05-01

    For quantum key distribution (QKD) using spontaneous parametric-down-conversion sources (SPDCSs), the passive decoy-state protocol has been proved to be efficiently close to the theoretical limit of an infinite decoy-state protocol. In this paper, we apply a tight finite-key analysis for the passive decoy-state QKD using SPDCSs. Combining the security bound based on the uncertainty principle with the passive decoy-state protocol, a concise and stringent formula for calculating the key generation rate for QKD using SPDCSs is presented. The simulation shows that the secure distance under our formula can reach up to 182 km when the number of sifted data is 10¹⁰. Our results also indicate that, under the same deviation of statistical fluctuation due to finite-size effects, the passive decoy-state QKD with SPDCSs can perform as well as the active decoy-state QKD with a weak coherent source.

  15. Application of KIMERA Methodology to Kori 3 and 4 LBLOCA M/E Release Analysis

    International Nuclear Information System (INIS)

    Song, Jeung Hyo; Hwang, Byung Heon; Kim, Cheol Woo

    2007-01-01

    A new mass and energy (M/E) release analysis methodology called KIMERA (KOPEC Improved Mass and Energy Release Analysis) has been developed. This is a realistic evaluation methodology of the M/E release analysis for the containment design and is applicable to a LOCA and a main steam line break (MSLB) accident. This KIMERA methodology has the same engine as KREM (KEPRI Realistic Evaluation Model) which is the realistic evaluation methodology for LOCA peak clad temperature analysis. This methodology also has several supplementary conservative models for the M/E release such as break spillage model and multiplier on heat transfer coefficient (HTC). For estimating the applicability of the KIMERA methodology to the licensing analysis, the large break LOCA (LBLOCA) M/E analysis was performed for UCN 3 and 4 which is the typical plant of OPR1000 type. The results showed that the peak pressure and temperature occurred earlier and had lower values than those of UCN 3 and 4 FSAR. The KIMERA methodology takes off the over-conservatism from the FSAR results during the post blowdown period for the large break LOCA and provides more margin in containment design. In this study, the LBLOCA M/E analysis using the KIMERA methodology is to be performed for Kori 3 and 4 which is the typical plant of Westinghouse type. The results are compared with those of the Kori Nuclear Unit 3 and 4 FSAR

  16. Full-Envelope Launch Abort System Performance Analysis Methodology

    Science.gov (United States)

    Aubuchon, Vanessa V.

    2014-01-01

    The implementation of a new dispersion methodology is described, which disperses abort initiation altitude or time along with all other Launch Abort System (LAS) parameters during Monte Carlo simulations. In contrast, the standard methodology assumes that an abort initiation condition is held constant (e.g., aborts initiated at altitude for Mach 1, altitude for maximum dynamic pressure, etc.) while dispersing other LAS parameters. The standard method results in large gaps in performance information due to the discrete nature of initiation conditions, while the full-envelope dispersion method provides a significantly more comprehensive assessment of LAS abort performance for the full launch vehicle ascent flight envelope and identifies performance "pinch-points" that may occur at flight conditions outside of those contained in the discrete set. The new method has significantly increased the fidelity of LAS abort simulations and confidence in the results.
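
    The contrast between the two dispersion strategies can be sketched in a few lines: the standard method samples abort initiation only at a few discrete flight conditions, while the full-envelope method disperses initiation time across the whole ascent. The abort times, ascent duration, and uniform dispersion below are assumptions for illustration only, not the actual LAS simulation inputs.

```python
import random

def sample_abort_cases(n, t_ascent=480.0, discrete_times=(60.0, 120.0, 300.0)):
    """Return two sets of Monte Carlo abort-initiation times:
    'standard' draws only from a few fixed flight conditions, leaving
    gaps in coverage; 'full envelope' disperses initiation time
    continuously (here uniformly, an assumed dispersion) over ascent."""
    random.seed(1)  # deterministic for reproducibility
    standard = [random.choice(discrete_times) for _ in range(n)]
    full_envelope = [random.uniform(0.0, t_ascent) for _ in range(n)]
    return standard, full_envelope

std, full = sample_abort_cases(1000)
print(len(set(std)), "distinct initiation times (standard)")
print(len(set(full)), "distinct initiation times (full envelope)")
```

    The continuous set is what exposes performance "pinch-points" lying between the discrete initiation conditions.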

  17. Methodology for seismic risk analysis of nuclear power plants

    International Nuclear Information System (INIS)

    Kaplan, S.; Perla, H.F.; Bley, D.C.

    1983-01-01

    This methodology begins by quantifying the fragility of all key components and structures in the plant. By means of the logic encoded in the plant event trees and fault trees, the component fragilities are combined to form fragilities for the occurrence of plant damage states or release categories. Combining these, in turn, with the seismicity curves yields the frequencies of those states or releases. Uncertainty is explicitly included at each step of the process
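
    The final step, combining a damage-state fragility curve with a seismicity (hazard) curve to get an annual frequency, can be sketched numerically. The lognormal fragility form is a common modeling choice, and the hazard-curve numbers below are illustrative assumptions, not the methodology's data.

```python
import math

def lognormal_fragility(a, median, beta):
    """P(damage | peak ground acceleration a) for a lognormal
    fragility curve with median capacity `median` and log-std `beta`."""
    return 0.5 * (1 + math.erf(math.log(a / median) / (beta * math.sqrt(2))))

def annual_damage_frequency(pga_grid, exceed_freq, median, beta):
    """Combine fragility with the seismicity curve: sum, over
    acceleration bins, P(damage | a_mid) times the annual frequency
    of motions in that bin, lambda(a_i) - lambda(a_i+1)."""
    total = 0.0
    for i in range(len(pga_grid) - 1):
        a_mid = 0.5 * (pga_grid[i] + pga_grid[i + 1])
        d_lambda = exceed_freq[i] - exceed_freq[i + 1]
        total += lognormal_fragility(a_mid, median, beta) * d_lambda
    return total

# Hypothetical hazard curve: annual exceedance frequency vs. PGA (g)
pga = [0.1, 0.2, 0.4, 0.8]
lam = [1e-2, 2e-3, 2e-4, 1e-5]
print(annual_damage_frequency(pga, lam, median=0.5, beta=0.4))
```

    In the full methodology this integration is repeated per damage state, with the fragilities themselves assembled through the event-tree/fault-tree logic.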

  18. Analysis of Multi-Antenna GNSS Receiver Performance under Jamming Attacks.

    Science.gov (United States)

    Vagle, Niranjana; Broumandan, Ali; Lachapelle, Gérard

    2016-11-17

    Although antenna array-based Global Navigation Satellite System (GNSS) receivers can be used to mitigate both narrowband and wideband electronic interference sources, measurement distortions induced by array processing methods are not suitable for high precision applications. The measurement distortions have an adverse effect on the carrier phase ambiguity resolution, affecting the navigation solution. Depending on the array attitude information availability and calibration parameters, different spatial processing methods can be implemented although they distort carrier phase measurements in some cases. This paper provides a detailed investigation of the effect of different array processing techniques on array-based GNSS receiver measurements and navigation performance. The main novelty of the paper is to provide a thorough analysis of array-based GNSS receivers employing different beamforming techniques from tracking to navigation solution. Two beamforming techniques, namely Power Minimization (PM) and Minimum Power Distortionless Response (MPDR), are being investigated. In the tracking domain, the carrier Doppler, Phase Lock Indicator (PLI), and Carrier-to-Noise Ratio (C/N₀) are analyzed. Pseudorange and carrier phase measurement distortions and carrier phase position performance are also evaluated. Performance analyses results from simulated GNSS signals and field tests are provided.
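
    The MPDR beamformer investigated in this record has the classical closed form w = R⁻¹a / (aᴴR⁻¹a): minimize output power subject to a distortionless response toward the satellite steering vector a. A minimal two-element sketch (pure Python, not the paper's implementation):

```python
def mpdr_weights_2el(r, a):
    """MPDR weights for a 2-element array: w = R^-1 a / (a^H R^-1 a),
    where r is the 2x2 spatial covariance matrix (list of lists of
    complex) and a the complex steering vector toward the satellite.
    The constraint w^H a = 1 keeps the desired signal undistorted."""
    # invert the 2x2 covariance matrix
    det = r[0][0] * r[1][1] - r[0][1] * r[1][0]
    rinv = [[r[1][1] / det, -r[0][1] / det],
            [-r[1][0] / det, r[0][0] / det]]
    # R^-1 a
    ra = [rinv[0][0] * a[0] + rinv[0][1] * a[1],
          rinv[1][0] * a[0] + rinv[1][1] * a[1]]
    # a^H R^-1 a (normalization enforcing the distortionless constraint)
    denom = a[0].conjugate() * ra[0] + a[1].conjugate() * ra[1]
    return [ra[0] / denom, ra[1] / denom]

# Identity covariance (no interference): weights reduce to a / |a|^2
w = mpdr_weights_2el([[1 + 0j, 0j], [0j, 1 + 0j]], [1 + 0j, 1j])
print(w)
```

    With jamming present, R acquires a strong interference component and the same formula steers a spatial null toward the jammer, which is also what perturbs the carrier phase measurements discussed in the record.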

  19. Critical infrastructure systems of systems assessment methodology.

    Energy Technology Data Exchange (ETDEWEB)

    Sholander, Peter E.; Darby, John L.; Phelan, James M.; Smith, Bryan; Wyss, Gregory Dane; Walter, Andrew; Varnado, G. Bruce; Depoy, Jennifer Mae

    2006-10-01

    Assessing the risk of malevolent attacks against large-scale critical infrastructures requires modifications to existing methodologies that separately consider physical security and cyber security. This research has developed a risk assessment methodology that explicitly accounts for both physical and cyber security, while preserving the traditional security paradigm of detect, delay, and respond. This methodology also accounts for the condition that a facility may be able to recover from or mitigate the impact of a successful attack before serious consequences occur. The methodology uses evidence-based techniques (which are a generalization of probability theory) to evaluate the security posture of the cyber protection systems. Cyber threats are compared against cyber security posture using a category-based approach nested within a path-based analysis to determine the most vulnerable cyber attack path. The methodology summarizes the impact of a blended cyber/physical adversary attack in a conditional risk estimate where the consequence term is scaled by a "willingness to pay" avoidance approach.

  20. INTEGRATED METHODOLOGY FOR PRODUCT PLANNING USING MULTI CRITERIA ANALYSIS

    Directory of Open Access Journals (Sweden)

    Tarun Soota

    2016-09-01

    Full Text Available An integrated approach to multi-criteria decision problems is proposed using quality function deployment and the analytical network process. The objective of the work is to rationalize and improve the method of analyzing and interpreting customer needs and technical requirements. The methodology is used to determine and prioritize engineering requirements based on customer needs for the development of the best product. The framework allows the decision maker to decompose a complex problem into a hierarchical structure showing the relationship between objective and criteria. Multi-criteria decision modeling is used to extend the hierarchy process to both dependence and feedback. A case study on bikes is presented for the proposed model.
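
    The analytic network process builds on AHP-style pairwise comparisons; a minimal sketch of the underlying priority-vector computation (principal eigenvector of a comparison matrix, found by power iteration) is shown below. The 3-criterion matrix is hypothetical, not the paper's data.

```python
# Priority-vector computation underlying AHP/ANP: the principal
# eigenvector of a pairwise-comparison matrix, found by power iteration.

def priority_vector(M, iters=200):
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]  # renormalize so the weights sum to 1
    return w

# Entry M[i][j] reads: "criterion i is M[i][j] times as important as j".
M = [[1.0,   3.0,   5.0],
     [1/3.0, 1.0,   2.0],
     [1/5.0, 1/2.0, 1.0]]
w = priority_vector(M)
```

    In ANP proper, such local priority vectors populate the columns of a supermatrix whose limit captures dependence and feedback among clusters; the power iteration above is the common computational core.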

  1. A methodological comparison of customer service analysis techniques

    Science.gov (United States)

    James Absher; Alan Graefe; Robert Burns

    2003-01-01

    Techniques used to analyze customer service data need to be studied. Two primary analysis protocols, importance-performance analysis (IP) and gap score analysis (GA), are compared side by side using data from two major customer service research projects. A central concern is what conclusions, if any, might differ due solely to the analysis...
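
    The two protocols can be illustrated on made-up ratings: gap analysis scores performance minus importance per attribute, while IP analysis places attributes in quadrants relative to the grand means. A sketch under those assumptions (attribute names and numbers are invented):

```python
# Hypothetical attribute ratings on a 1-5 scale: (mean importance, mean performance).
ratings = {
    "cleanliness": (4.6, 4.1),
    "signage":     (3.8, 4.0),
    "staff":       (4.4, 3.9),
}

# Gap score analysis (GA): performance minus importance, per attribute.
gaps = {attr: round(perf - imp, 2) for attr, (imp, perf) in ratings.items()}

# Importance-performance (IP) analysis: quadrants relative to the grand means.
mean_imp = sum(imp for imp, _ in ratings.values()) / len(ratings)
mean_perf = sum(perf for _, perf in ratings.values()) / len(ratings)

def quadrant(imp, perf):
    if imp >= mean_imp and perf >= mean_perf:
        return "keep up the good work"
    if imp >= mean_imp:
        return "concentrate here"
    if perf >= mean_perf:
        return "possible overkill"
    return "low priority"

labels = {attr: quadrant(imp, perf) for attr, (imp, perf) in ratings.items()}
```

    The example makes the paper's concern concrete: an attribute can have a negative gap yet land in different IP quadrants depending on the grand means, so the two protocols can suggest different priorities from the same data.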

  2. BiGlobal linear stability analysis on low-Re flow past an airfoil at high angle of attack

    KAUST Repository

    Zhang, Wei

    2016-04-04

    We perform BiGlobal linear stability analysis on flow past a NACA0012 airfoil at 16° angle of attack and Reynolds numbers ranging from 400 to 1000. The steady-state two-dimensional base flows are computed using a well-tested finite difference code in combination with the selective frequency damping method. The base flow is characterized by two asymmetric recirculation bubbles downstream of the airfoil whose streamwise extent and maximum reverse-flow velocity increase with the Reynolds number. The stability analysis of the flow past the airfoil is carried out under a very small spanwise wavenumber β = 10⁻⁴ to approximate the two-dimensional perturbation, and medium and large spanwise wavenumbers (β = 1–8) to account for the three-dimensional perturbation. Numerical results reveal that under the small spanwise wavenumber, there are at most two oscillatory unstable modes corresponding to the near-wake and far-wake instabilities; the growth rate and frequency of the perturbation agree well with the two-dimensional direct numerical simulation results at all Reynolds numbers. For a larger spanwise wavenumber β = 1, there is only one oscillatory unstable mode associated with the wake instability at Re = 400 and 600, while at Re = 800 and 1000 there are two oscillatory unstable modes for the near-wake and far-wake instabilities, and one stationary unstable mode for the monotonically growing perturbation within the recirculation bubble via the centrifugal instability mechanism. All the unstable modes are weakened or even suppressed as the spanwise wavenumber increases further, among which the stationary mode persists until β = 4.

  3. Visualizing Risks: Icons for Information Attack Scenarios

    National Research Council Canada - National Science Library

    Hosmer, Hilary

    2000-01-01

    .... Visual attack scenarios help defenders see system ambiguities, imprecision, vulnerabilities and omissions, thus speeding up risk analysis, requirements gathering, safeguard selection, cryptographic...

  4. Development of a methodology for analysis of the impact of modifying neutron cross sections

    International Nuclear Information System (INIS)

    Wenner, M. T.; Haghighat, A.; Adams, J. M.; Carlson, A. D.; Grimes, S. M.; Massey, T. N.

    2004-01-01

    Monte Carlo analysis of a Time-of-Flight (TOF) experiment can be utilized to examine the accuracy of nuclear cross section data. Accurate determination of these data is paramount in characterizing reactor lifetime. We have developed a methodology to examine the impact of modifying the current cross section libraries available in ENDF-6 format (1) where deficiencies may exist, and have shown that it can be effective for assessing the accuracy of nuclear cross section data. The new methodology has been applied to the iron scattering cross sections, and the use of the revised cross sections suggests that reactor pressure vessel fluence may be underestimated. (authors)

  5. Aerodynamic configuration design using response surface methodology analysis

    Science.gov (United States)

    Engelund, Walter C.; Stanley, Douglas O.; Lepsch, Roger A.; Mcmillin, Mark M.; Unal, Resit

    1993-01-01

    An investigation has been conducted to determine a set of optimal design parameters for a single-stage-to-orbit reentry vehicle. Several configuration geometry parameters which had a large impact on the entry vehicle flying characteristics were selected as design variables: the fuselage fineness ratio, the nose to body length ratio, the nose camber value, the wing planform area scale factor, and the wing location. The optimal geometry parameter values were chosen using a response surface methodology (RSM) technique which allowed for a minimum dry weight configuration design that met a set of aerodynamic performance constraints on the landing speed, and on the subsonic, supersonic, and hypersonic trim and stability levels. The RSM technique utilized, specifically the central composite design method, is presented, along with the general vehicle conceptual design process. Results are presented for an optimized configuration along with several design trade cases.
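
    The central composite design method referenced above combines full-factorial corners, axial points, and center points; a sketch generating such a design for two illustrative variables follows (the study itself used five geometry parameters, which this does not reproduce).

```python
from itertools import product

# Central composite design (CCD) layout: 2^k factorial corners (+-1),
# 2k axial points (+-alpha), and center points, in coded units.

def central_composite(k, alpha=None, n_center=1):
    if alpha is None:
        alpha = k ** 0.5  # spherical CCD: axial points on the corner radius
    corners = [list(p) for p in product([-1.0, 1.0], repeat=k)]
    axial = []
    for i in range(k):
        for s in (-alpha, alpha):
            pt = [0.0] * k
            pt[i] = s
            axial.append(pt)
    centers = [[0.0] * k for _ in range(n_center)]
    return corners + axial + centers

design = central_composite(2)  # 4 corners + 4 axial + 1 center = 9 runs
```

    Each design point is a vehicle configuration to evaluate; a quadratic response surface fitted to the resulting responses is then optimized subject to the aerodynamic constraints described above.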

  6. Attack Trees for Practical Security Assessment: Ranking of Attack Scenarios with ADTool 2.0

    NARCIS (Netherlands)

    Gadyatskaya, Olga; Jhawar, Ravi; Kordy, P.T.; Lounis, Karim; Mauw, Sjouke; Trujillo-Rasua, Rolando

    2016-01-01

    In this tool demonstration paper we present ADTool 2.0: an open-source software tool for the design, manipulation and analysis of attack trees. The tool supports ranking of attack scenarios based on quantitative attributes entered by the user; it is scriptable; and it incorporates attack trees with

  7. Risk of Stroke/Transient Ischemic Attack or Myocardial Infarction with Herpes Zoster: A Systematic Review and Meta-Analysis.

    Science.gov (United States)

    Zhang, Yanting; Luo, Ganfeng; Huang, Yuanwei; Yu, Qiuyan; Wang, Li; Li, Ke

    2017-08-01

    Accumulating evidence indicates that herpes zoster (HZ) may increase the risk of stroke/transient ischemic attack (TIA) or myocardial infarction (MI), but the results are inconsistent. We aim to explore the relationship between HZ and the risk of stroke/TIA or MI, and between herpes zoster ophthalmicus (HZO) and stroke. We estimated the relative risk (RR) and 95% confidence intervals (CIs) with the meta-analysis. Cochran's Q test and Higgins' I² statistic were used to check for heterogeneity. HZ infection was significantly associated with increased risk of stroke/TIA (RR = 1.30, 95% CI: 1.17-1.46) or MI (RR = 1.18, 95% CI: 1.07-1.30). The risk of stroke after HZO was 1.91 (95% CI: 1.32-2.76), higher than that after HZ. Subgroup analyses revealed increased risk of ischemic stroke after HZ infection but not hemorrhagic stroke. The risk of stroke was increased more within 1 month after HZ infection than at 1-3 months, with a gradually reduced risk over time. The increase in stroke risk after HZ infection was greater for ages under 40 years than for ages 40-59 or over 60 years. The risk of stroke with HZ infection was greater without treatment than with treatment, and was greater in Asia than in Europe and America, but did not differ by sex. Our study indicated that HZ infection was associated with increased risk of stroke/TIA or MI, and HZO infection was the most marked risk factor for stroke. Further studies are needed to explore whether zoster vaccination could reduce the risk of stroke/TIA or MI. Copyright © 2017 National Stroke Association. Published by Elsevier Inc. All rights reserved.
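
    Figures such as "RR = 1.30, 95% CI: 1.17-1.46" come from computations of this shape: a relative risk from a 2x2 table with a Wald interval on log(RR). The counts below are hypothetical, not the study's data.

```python
import math

# Relative risk from a 2x2 table, with a Wald 95% CI on log(RR).

def relative_risk(events_exp, n_exp, events_ctl, n_ctl, z=1.96):
    p1 = events_exp / n_exp          # risk in the exposed group
    p0 = events_ctl / n_ctl          # risk in the control group
    rr = p1 / p0
    # Standard error of log(RR) from the table counts.
    se_log = math.sqrt(1/events_exp - 1/n_exp + 1/events_ctl - 1/n_ctl)
    lo = math.exp(math.log(rr) - z * se_log)
    hi = math.exp(math.log(rr) + z * se_log)
    return rr, lo, hi

# Hypothetical counts: 130/10000 events among exposed, 100/10000 among controls.
rr, lo, hi = relative_risk(130, 10000, 100, 10000)
```

    In the meta-analysis itself, each study contributes such a log(RR) and its standard error, and the studies are then pooled across the heterogeneity checks mentioned in the abstract.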

  8. Component build-up method for engineering analysis of missiles at low-to-high angles of attack

    Science.gov (United States)

    Hemsch, Michael J.

    1992-01-01

    Methods are presented for estimating the component build-up terms, with the exception of zero-lift drag, for missile airframes in steady flow and at arbitrary angles of attack and bank. The underlying and unifying bases of all these efforts are slender-body theory and its nonlinear extensions through the equivalent angle-of-attack concept. Emphasis is placed on the forces and moments which act on each of the fins, so that control cross-coupling effects as well as longitudinal and lateral-directional effects can be determined.

  9. Integrating cyber attacks within fault trees

    International Nuclear Information System (INIS)

    Nai Fovino, Igor; Masera, Marcelo; De Cian, Alessio

    2009-01-01

    In this paper, a new method for quantitative security risk assessment of complex systems is presented, combining fault-tree analysis, traditionally used in reliability analysis, with the recently introduced Attack-tree analysis, proposed for the study of malicious attack patterns. The combined use of fault trees and attack trees helps the analyst to effectively face the security challenges posed by the introduction of modern ICT technologies in the control systems of critical infrastructures. The proposed approach allows considering the interaction of malicious deliberate acts with random failures. Formal definitions of fault tree and attack tree are provided and a mathematical model for the calculation of system fault probabilities is presented.
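
    The combined calculus can be sketched as follows: classical fault-tree gates (AND multiplies leaf probabilities; OR complements the product of complements) applied to both random-failure leaves and attack-tree subresults. The probabilities are illustrative only, not the paper's model.

```python
# Fault-tree gate arithmetic over independent events, reused for
# attack-tree subresults so malicious acts and random failures can be
# combined in one top-event estimate.

def and_gate(probs):
    out = 1.0
    for p in probs:
        out *= p
    return out

def or_gate(probs):
    out = 1.0
    for p in probs:
        out *= (1.0 - p)
    return 1.0 - out

p_random_failure = 1e-3                 # classical fault-tree leaf
p_cyber_attack = and_gate([0.5, 0.2])   # attack needs both steps to succeed
p_top = or_gate([p_random_failure, p_cyber_attack])  # either cause suffices
```

    The independence assumption behind this arithmetic is exactly what the paper's formal model must handle with care, since attack steps are deliberate and may be correlated.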

  10. On the Application of Syntactic Methodologies in Automatic Text Analysis.

    Science.gov (United States)

    Salton, Gerard; And Others

    1990-01-01

    Summarizes various linguistic approaches proposed for document analysis in information retrieval environments. Topics discussed include syntactic analysis; use of machine-readable dictionary information; knowledge base construction; the PLNLP English Grammar (PEG) system; phrase normalization; and statistical and syntactic phrase evaluation used…

  11. Adapting Job Analysis Methodology to Improve Evaluation Practice

    Science.gov (United States)

    Jenkins, Susan M.; Curtin, Patrick

    2006-01-01

    This article describes how job analysis, a method commonly used in personnel research and organizational psychology, provides a systematic method for documenting program staffing and service delivery that can improve evaluators' knowledge about program operations. Job analysis data can be used to increase evaluators' insight into how staffs…

  12. Substance precedes methodology: on cost-benefit analysis and equity

    NARCIS (Netherlands)

    Martens, C.J.C.M.

    2011-01-01

    While distributive aspects have been a topic of discussion in relation to cost–benefit analysis (CBA), little systematic thought has been given in the CBA literature to the focus of such an equity analysis in evaluating transport projects. The goal of the paper is to provide an overview of the

  13. Comprehensive Safety Analysis 2010 Safety Measurement System (SMS) Methodology, Version 2.1 Revised December 2010

    Science.gov (United States)

    2010-12-01

    This report documents the Safety Measurement System (SMS) methodology developed to support the Comprehensive Safety Analysis 2010 (CSA 2010) Initiative for the Federal Motor Carrier Safety Administration (FMCSA). The SMS is one of the major tools for...

  14. Montecarlo simulation for a new high resolution elemental analysis methodology

    International Nuclear Information System (INIS)

    Figueroa S, Rodolfo; Brusa, Daniel; Riveros, Alberto

    1996-01-01

    Full text. Spectra generated by binary, ternary and multielement matrices when irradiated by a variable-energy photon beam are simulated by means of a Monte Carlo code. Significant jumps in the counting rate appear when the photon energy passes just above the absorption edge associated with each element, because of the emission of characteristic X rays. For a given edge energy, the net height of these jumps depends mainly on the concentration and on the sample absorption coefficient. The spectra were obtained by a monochromatic energy scan considering all the radiation emitted by the sample into a 2π solid angle, associating a single multichannel spectrometer channel with each incident energy (Multichannel Scaling (MCS) mode). The simulated spectra were produced with an adaptation of the Monte Carlo simulation package PENELOPE (Penetration and Energy Loss of Positrons and Electrons in matter). The results show that it is possible to implement a new high resolution spectroscopy methodology, for which a synchrotron would be an ideal source, due to its high intensity and the ability to control the energy of the incident beam. The high energy resolution would be determined by the monochromating system rather than by the detection system, which would basically be a photon counter. (author)

  15. Montecarlo simulation for a new high resolution elemental analysis methodology

    Energy Technology Data Exchange (ETDEWEB)

    Figueroa S, Rodolfo; Brusa, Daniel; Riveros, Alberto [Universidad de La Frontera, Temuco (Chile). Facultad de Ingenieria y Administracion

    1996-12-31

    Full text. Spectra generated by binary, ternary and multielement matrices when irradiated by a variable-energy photon beam are simulated by means of a Monte Carlo code. Significant jumps in the counting rate appear when the photon energy passes just above the absorption edge associated with each element, because of the emission of characteristic X rays. For a given edge energy, the net height of these jumps depends mainly on the concentration and on the sample absorption coefficient. The spectra were obtained by a monochromatic energy scan considering all the radiation emitted by the sample into a 2π solid angle, associating a single multichannel spectrometer channel with each incident energy (Multichannel Scaling (MCS) mode). The simulated spectra were produced with an adaptation of the Monte Carlo simulation package PENELOPE (Penetration and Energy Loss of Positrons and Electrons in matter). The results show that it is possible to implement a new high resolution spectroscopy methodology, for which a synchrotron would be an ideal source, due to its high intensity and the ability to control the energy of the incident beam. The high energy resolution would be determined by the monochromating system rather than by the detection system, which would basically be a photon counter. (author)

  16. Stomatal oscillations in olive trees: analysis and methodological implications.

    Science.gov (United States)

    López-Bernal, Alvaro; García-Tejera, Omar; Testi, Luca; Orgaz, Francisco; Villalobos, Francisco J

    2017-10-13

    Stomatal oscillations have long been disregarded in the literature despite the fact that the phenomenon has been described for a variety of plant species. This study aims to characterize the occurrence of oscillations in olive trees (Olea europaea L.) under different growing conditions and its methodological implications. Three experiments with young potted olives and one with large field-grown trees were performed. Sap flow measurements were always used to monitor the occurrence of oscillations, with additional determinations of trunk diameter variations and leaf-level stomatal conductance, photosynthesis and water potential also conducted in some cases. Strong oscillations with periods of 30-60 min were generally observed for young trees, while large field trees rarely showed significant oscillations. Severe water stress led to the disappearance of oscillations, but moderate water deficits occasionally promoted them. Simultaneous oscillations were also found for leaf stomatal conductance, leaf photosynthesis and trunk diameter, with the former presenting the highest amplitudes. The strong oscillations found in young potted olive trees preclude the use of infrequent measurements of stomatal conductance and related variables to characterize differences between trees of different cultivars or subjected to different experimental treatments. Under these circumstances, our results suggest that reliable estimates could be obtained using measurement intervals below 15 min. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  17. Recent Methodologies for Creep Deformation Analysis and Its Life Prediction

    International Nuclear Information System (INIS)

    Kim, Woo-Gon; Park, Jae-Young; Iung

    2016-01-01

    To design high-temperature creeping materials, various creep data are needed for codification, as follows: i) stress vs. creep rupture time for base metals and weldments (average and minimum), ii) stress vs. time to 1% total strain (average), iii) stress vs. time to onset of tertiary creep (minimum), iv) constitutive equations for time- and temperature-dependent stress-strain behavior (average), and v) isochronous stress-strain curves (average). Also, elevated-temperature components such as those used in modern power generation plants are designed using allowable stress under creep conditions. The allowable stress is usually estimated on the basis of up to 10⁵ h creep rupture strength at the operating temperature. The master curve of the "sinh" function was found to have wider acceptance, with good flexibility in the low stress ranges beyond the experimental data. The proposed multi-C method for the LM parameter revealed better life prediction than a single-C method. These improved methodologies can be utilized to accurately predict the long-term creep life or strength of Gen-IV nuclear materials, which are designed for a life span of 60 years.
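
    The Larson-Miller (LM) parameter referenced above relates temperature and rupture time as LMP = T(C + log₁₀ t_r); a sketch with the conventional default C = 20 follows. The paper's multi-C method fits C to the data rather than fixing it, which this sketch does not attempt.

```python
import math

# Larson-Miller parameter: LMP = T * (C + log10(t_r)), with T in kelvin
# and rupture time t_r in hours. Given a master curve LMP(stress), the
# relation can be inverted to extrapolate a creep-rupture life.

def lmp(T_kelvin, t_rupture_h, C=20.0):
    return T_kelvin * (C + math.log10(t_rupture_h))

def rupture_time(T_kelvin, lmp_value, C=20.0):
    """Invert the LM relation to recover a rupture life in hours."""
    return 10.0 ** (lmp_value / T_kelvin - C)

p = lmp(873.0, 1.0e4)            # parameter value for a 10^4 h life at 600 degC
t_back = rupture_time(873.0, p)  # recovers ~10^4 h
```

    Because LMP collapses (T, t_r) pairs onto one stress-dependent curve, short-term high-temperature tests can stand in for long-term service lives; the choice of C controls how well that collapse holds, which motivates the multi-C fitting.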

  18. Meta-analysis: Its role in psychological methodology

    Directory of Open Access Journals (Sweden)

    Andrej Kastrin

    2008-11-01

    Full Text Available Meta-analysis refers to the statistical analysis of a large collection of independent observations for the purpose of integrating results. The main objectives of this article are to define meta-analysis as a method of data integration, to draw attention to some particularities of its use, and to encourage researchers to use meta-analysis in their work. The benefits of meta-analysis include more effective exploitation of existing data from independent sources and a contribution to more powerful domain knowledge. It may also serve as a support tool to generate new research hypotheses. The idea of combining results of independent studies addressing the same research question dates back to the sixteenth century. Meta-analysis was reinvented in 1976 by Glass, to refute the conclusion of an eminent colleague, Eysenck, that psychotherapy was essentially ineffective. We review some major historical landmarks of meta-analysis and its statistical background. We present the concept of the effect size measure, the problem of heterogeneity, and the two models used to combine individual effect sizes (the fixed-effect and random-effects models) in detail. Two visualization techniques, forest and funnel plots, are demonstrated. We developed RMetaWeb, a simple and fast web server application to conduct meta-analysis online. RMetaWeb is the first web meta-analysis application and is completely based on the R software environment for statistical computing and graphics.
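
    The fixed-effect model described in the article pools study effects by inverse-variance weighting, and Cochran's Q quantifies heterogeneity; a sketch on made-up effect sizes:

```python
import math

# Fixed-effect (inverse-variance) pooling and Cochran's Q heterogeneity
# statistic. Per-study effect sizes and variances below are invented.

effects = [0.30, 0.45, 0.20]
variances = [0.04, 0.09, 0.02]

weights = [1.0 / v for v in variances]          # precision weights
pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
se_pooled = math.sqrt(1.0 / sum(weights))
ci95 = (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)
# Q weights each study's squared deviation from the pooled effect by its
# precision; under homogeneity it is ~chi-squared with k-1 dof.
Q = sum(w * (e - pooled) ** 2 for w, e in zip(weights, effects))
```

    When Q indicates real between-study variation, the random-effects model discussed in the article adds an estimated between-study variance to each study's variance before reweighting.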

  19. Predicting Factors of Zone 4 Attack in Volleyball.

    Science.gov (United States)

    Costa, Gustavo C; Castro, Henrique O; Evangelista, Breno F; Malheiros, Laura M; Greco, Pablo J; Ugrinowitsch, Herbert

    2017-06-01

    This study examined 142 volleyball games of the Men's Super League 2014/2015 season in Brazil, from which we analyzed 24-26 games of each participating team, identifying 5,267 Zone 4 attacks for further analysis. Within these Zone 4 attacks, we analyzed the association between the effect of the attack carried out and the separate effects of serve reception, tempo and type of attack. We found that reception, tempo of attack, second tempo of attack, and power of the diagonal attack were predictors of the attack effect in Zone 4. Moreover, placed attacks showed a tendency not to yield a score. In conclusion, winning points in high-level men's volleyball requires excellent receptions, a fast attack tempo and powerfully executed attacks.

  20. Development of Safety Margin Analysis Methodology on Aging Effect for CANDU Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Man Woong; Lee, Sang Kyu; Kim, Hyun Koon; Yoo, Kun Joong; Ryu, Yong Ho [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of); Yoo, Jun Soo; Choi, Yong Won; Park, Chang Hwan [Seoul National Univ., Seoul (Korea, Republic of)

    2007-07-01

    Considering that the operating years of Wolsong Unit 1 are approaching its design life of 30 years, aging effects due to component degradation have become an important safety issue. However, since the thermal-hydraulic effects of aging have not been clearly identified, a safety analysis methodology has not yet been well established. Therefore, in this study, the aging effects on thermal-hydraulic characteristics were investigated and a safety margin analysis methodology considering aging effects was proposed.

  1. Development of Safety Margin Analysis Methodology on Aging Effect for CANDU Reactors (II)

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Man Woong; Lee, Sang Kyu; Kim, Hyun Koon; Yoo, Kun Joong; Ryu, Yong Ho [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of); Choi, Yong Won; Lee, Un Chul [Nuclear Engr. Seoul Nat' l Univ., Seoul (Korea, Republic of)

    2007-10-15

    Considering that the operating years of Wolsong Unit 1 are approaching its design life of 30 years, aging effects due to component degradation have become an important safety issue. However, since the thermal-hydraulic effects of aging have not been clearly identified, a safety analysis methodology has not yet been well established. Therefore, in this study, the aging effects on thermal-hydraulic characteristics were investigated and a safety margin analysis methodology considering aging effects was proposed.

  2. Uncertainty and sensitivity analysis methodology in a level-I PSA (Probabilistic Safety Assessment)

    International Nuclear Information System (INIS)

    Nunez McLeod, J.E.; Rivera, S.S.

    1997-01-01

    This work presents a methodology for sensitivity and uncertainty analysis applicable to a level-I probabilistic safety assessment. The work covers: the correct association of distributions with parameters, the importance and qualification of expert opinions, the generation of samples according to the required sample sizes, and the study of the relationships among system variables and the system response. A series of statistical-mathematical techniques are recommended along the development of the analysis methodology, as well as different graphical visualizations for controlling the study. (author)
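
    One standard rule for choosing sample sizes in PSA uncertainty analysis is the first-order, one-sided Wilks formula: the smallest n with 1 - γⁿ ≥ β guarantees, with confidence β, that the sample maximum bounds the γ-quantile of the output. The abstract does not state which rule the authors use, so this is illustrative only.

```python
# First-order, one-sided Wilks sample-size rule: smallest n such that
# 1 - gamma**n >= beta (sample maximum covers the gamma-quantile with
# confidence beta).

def wilks_sample_size(gamma=0.95, beta=0.95):
    n = 1
    while 1.0 - gamma ** n < beta:
        n += 1
    return n

n_95_95 = wilks_sample_size()   # the classical 95/95 criterion
```

    The well-known 95/95 answer is 59 runs, which is why level-I PSA uncertainty propagations are often performed with on the order of 60 Monte Carlo samples.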

  3. Vapor Pressure Data Analysis and Correlation Methodology for Data Spanning the Melting Point

    Science.gov (United States)

    2013-10-01

    Once the specimen is adequately degassed, the liquid menisci in the U-tube are brought to the same level and the pressure is read on the manometer. (Final report ECBC-CR-135, covering March-June 2013.)

  4. Numerical analysis on the effect of angle of attack on evaluating radio-frequency blackout in atmospheric reentry

    Science.gov (United States)

    Jung, Minseok; Kihara, Hisashi; Abe, Ken-ichi; Takahashi, Yusuke

    2016-06-01

    A three-dimensional numerical simulation model that considers the effect of the angle of attack was developed to evaluate plasma flows around reentry vehicles. In this simulation model, thermochemical nonequilibrium of flowfields is considered by using a four-temperature model for high-accuracy simulations. Numerical simulations were performed for the orbital reentry experiment of the Japan Aerospace Exploration Agency, and the results were compared with experimental data to validate the simulation model. A comparison of measured and predicted results showed good agreement. Moreover, to evaluate the effect of the angle of attack, we performed numerical simulations around the Atmospheric Reentry Demonstrator of the European Space Agency by using an axisymmetric model and a three-dimensional model. Although there were no differences in the flowfields in the shock layer between the results of the axisymmetric and the three-dimensional models, the formation of the electron number density, which is an important parameter in evaluating radio-frequency blackout, was greatly changed in the wake region when a non-zero angle of attack was considered. Additionally, the number of altitudes at which radio-frequency blackout was predicted in the numerical simulations declined when using the three-dimensional model for considering the angle of attack.

  5. Toward a computer-aided methodology for discourse analysis ...

    African Journals Online (AJOL)

    aided methods to discourse analysis”. This project aims to develop an e-learning environment dedicated to documenting, evaluating and teaching the use of corpus linguistic tools suitable for interpretative text analysis. Even though its roots are in ...

  6. Biofuel transportation analysis tool : description, methodology, and demonstration scenarios

    Science.gov (United States)

    2014-01-01

    This report describes a Biofuel Transportation Analysis Tool (BTAT), developed by the U.S. Department of Transportation (DOT) Volpe National Transportation Systems Center (Volpe) in support of the Department of Defense (DOD) Office of Naval Research ...

  7. Common methodology for steady state harmonic analysis of inverters

    Energy Technology Data Exchange (ETDEWEB)

    Vittek, J. [Technical Univ. of Transport and Communication, Zilina (Slovakia). Dept. of Electric Traction and Energetics; Najjar, M.Y. [Cleveland State Univ., OH (United States)

    1995-07-01

    This paper presents a computation-time analysis of m-phase symmetrical inverter systems in the complex plane. The reduction in computation time arises from the 2m-fold symmetry of such systems in the complex plane. Equations for the characteristic values of the voltage and current waveforms in the complex domain are developed. The validity of the analysis is demonstrated for single- and three-phase symmetrical systems under three different modulation techniques, using the equations presented in this paper.

  8. Methodologies and techniques for analysis of network flow data

    Energy Technology Data Exchange (ETDEWEB)

    Bobyshev, A.; Grigoriev, M.; /Fermilab

    2004-12-01

    Network flow data gathered at the border routers and core switches is used at Fermilab for statistical analysis of traffic patterns, passive network monitoring, and estimation of network performance characteristics. Flow data is also a critical tool in the investigation of computer security incidents. Development and enhancement of flow based tools is an on-going effort. This paper describes the most recent developments in flow analysis at Fermilab.

  9. Theoretical and methodological analysis of personality theories of leadership

    OpenAIRE

    Оксана Григорівна Гуменюк

    2016-01-01

    A psychological analysis of the personality theories of leadership, which form the basis for other conceptual approaches to understanding the nature of leadership, is conducted. The conceptual approach to leadership is analyzed taking into account the priority of personality theories, including the heroic, psychoanalytic, "trait", charismatic and five-factor theories. It is noted that the psychological analysis of personality theories is important in understanding the nature of leadership.

  10. Pericarditis - after heart attack

    Science.gov (United States)

    ... medlineplus.gov/ency/article/000166.htm Pericarditis - after heart attack ... occur in the days or weeks following a heart attack. Causes: Two types of pericarditis can occur after ...

  11. Heart attack first aid

    Science.gov (United States)

    First aid - heart attack; First aid - cardiopulmonary arrest; First aid - cardiac arrest ... A heart attack occurs when the blood flow that carries oxygen to the heart is blocked. The heart muscle becomes ...

  12. Social engineering attack framework

    CSIR Research Space (South Africa)

    Mouton, F

    2014-07-01

    Full Text Available link. A social engineering attack targets this weakness by using various manipulation techniques in order to elicit sensitive information. The field of social engineering is still in its infancy stages with regards to formal definitions and attack...

  13. Terrorists and Suicide Attacks

    National Research Council Canada - National Science Library

    Cronin, Audrey K

    2003-01-01

    Suicide attacks by terrorist organizations have become more prevalent globally, and assessing the threat of suicide attacks against the United States and its interests at home and abroad has therefore...

  14. Solidarity under Attack

    DEFF Research Database (Denmark)

    Meret, Susi; Goffredo, Sergio

    2017-01-01

    https://www.opendemocracy.net/can-europe-make-it/susi-meret-sergio-goffredo/solidarity-under-attack

  15. Transient Ischemic Attack

    Medline Plus

    Full Text Available ... Ischemic Attack TIA, or transient ischemic attack, is a "mini stroke" that occurs when a blood clot blocks an artery for a short time. The only difference between a stroke ...

  16. Social Sentiment Sensor in Twitter for Predicting Cyber-Attacks Using ℓ1 Regularization

    Directory of Open Access Journals (Sweden)

    Aldo Hernandez-Suarez

    2018-04-01

    Full Text Available In recent years, online social media information has been the subject of study in several data science fields due to its impact on users as a communication and expression channel. Data gathered from online platforms such as Twitter have the potential to facilitate research on social phenomena based on sentiment analysis, which usually employs Natural Language Processing and Machine Learning techniques to interpret sentimental tendencies related to users' opinions and to make predictions about real events. Cyber-attacks are not isolated from opinion subjectivity on online social networks. Various security attacks are performed by hacker activists motivated by reactions to polemic social events. In this paper, a methodology for tracking social data that can trigger cyber-attacks is developed. Our main contribution lies in the monthly prediction of tweets with content related to security attacks, and of the incidents detected, based on ℓ1 regularization.
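
    One common reading of ℓ1-regularized prediction is sparse logistic regression trained by proximal gradient descent (soft-thresholding); a sketch under that assumption follows. The toy token counts are invented, and the paper's actual Twitter features and monthly aggregation are not reproduced.

```python
import math

# l1-regularized logistic regression via proximal gradient descent:
# a plain gradient step on the logistic loss followed by a
# soft-threshold step for the l1 penalty, which drives weights to zero.

def soft_threshold(x, t):
    if x > t:
        return x - t
    if x < -t:
        return x + t
    return 0.0

def fit_l1_logistic(X, y, lam=0.1, lr=0.1, iters=2000):
    n, d = len(X), len(X[0])
    w = [0.0] * d
    for _ in range(iters):
        grad = [0.0] * d
        for xi, yi in zip(X, y):           # gradient of the mean logistic loss
            z = sum(wj * xj for wj, xj in zip(w, xi))
            p = 1.0 / (1.0 + math.exp(-z))
            for j in range(d):
                grad[j] += (p - yi) * xi[j] / n
        w = [soft_threshold(wj - lr * gj, lr * lam)
             for wj, gj in zip(w, grad)]
    return w

# Features: [bias, count("ddos"), count("weather")] -- hypothetical tokens.
X = [[1, 2, 0], [1, 1, 0], [1, 0, 2], [1, 0, 1]]
y = [1, 1, 0, 0]   # 1 = tweet preceding an attack, 0 = benign
w = fit_l1_logistic(X, y)
```

    The soft-threshold step is what makes the model sparse: tokens whose gradient never exceeds the penalty stay at exactly zero weight, which suits high-dimensional bag-of-words features from tweets.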

  17. Methodology for risk-based analysis of technical specifications

    International Nuclear Information System (INIS)

    Vesely, W.E.; Gaertner, J.P.; Wagner, D.P.

    1985-01-01

    Part of the effort by EPRI to apply probabilistic risk assessment methods and results to the solution of utility problems involves the investigation of methods for risk-based analysis of technical specifications. The culmination of this investigation is the SOCRATES computer code developed by Battelle's Columbus Laboratories to assist in the evaluation of technical specifications of nuclear power plants. The program is designed to use information found in PRAs to re-evaluate risk for changes in component allowed outage times (AOTs) and surveillance test intervals (STIs). The SOCRATES program is a unique and important tool for technical specification evaluations. The detailed component unavailability model allows a detailed analysis of AOT and STI contributions to risk. Explicit equations allow fast and inexpensive calculations. Because the code is designed to accept ranges of parameters and to save results of calculations that do not change during the analysis, sensitivity studies are efficiently performed and results are clearly displayed
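
    The AOT/STI re-evaluations SOCRATES supports rest on standby-component unavailability models; a sketch of one textbook model (mean unavailability over a surveillance test interval) is shown below. The failure rate and test intervals are illustrative, not SOCRATES inputs or outputs.

```python
import math

# Mean unavailability of a standby component with failure rate lam
# (per hour) tested every T hours:
#   q = 1 - (1 - exp(-lam*T)) / (lam*T)  ~  lam*T/2 for small lam*T.
# Lengthening the surveillance test interval (STI) raises q, which is
# the effect an STI change evaluation must weigh against test costs.

def mean_unavailability(lam, T):
    return 1.0 - (1.0 - math.exp(-lam * T)) / (lam * T)

q_monthly = mean_unavailability(1e-4, 730.0)     # ~monthly testing
q_quarterly = mean_unavailability(1e-4, 2190.0)  # lengthened STI
```

    In a full evaluation such per-component unavailabilities feed the PRA cutsets, so the risk effect of an AOT or STI change can be expressed as a change in core damage frequency rather than a single component's q.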

  18. Phoenix – A model-based Human Reliability Analysis methodology: Qualitative Analysis Procedure

    International Nuclear Information System (INIS)

    Ekanem, Nsimah J.; Mosleh, Ali; Shen, Song-Hua

    2016-01-01

The Phoenix method is an attempt to address various issues in the field of Human Reliability Analysis (HRA). Built on a cognitive human response model, Phoenix incorporates strong elements of current HRA good practices, leverages lessons learned from empirical studies, and takes advantage of the best features of existing and emerging HRA methods. Its original framework was introduced in previous publications. This paper reports on the completed methodology, summarizing the steps and techniques of its qualitative analysis phase. The methodology introduces the “Crew Response Tree” which provides a structure for capturing the context associated with Human Failure Events (HFEs), including errors of omission and commission. It also uses a team-centered version of the Information, Decision and Action cognitive model and “macro-cognitive” abstractions of crew behavior, as well as relevant findings from cognitive psychology literature and operating experience, to identify potential causes of failures and influencing factors during procedure-driven and knowledge-supported crew-plant interactions. The result is the set of identified HFEs and likely scenarios leading to each. The methodology itself is generic in the sense that it is compatible with various quantification methods, and can be adapted for use across different environments including nuclear, oil and gas, aerospace, aviation, and healthcare. - Highlights: • Produces a detailed, consistent, traceable, reproducible and properly documented HRA. • Uses “Crew Response Tree” to capture context associated with Human Failure Events. • Models dependencies between Human Failure Events and influencing factors. • Provides a human performance model for relating context to performance. • Provides a framework for relating Crew Failure Modes to their influencing factors.

  19. Development of an Expert Judgement Elicitation and Calibration Methodology for Risk Analysis in Conceptual Vehicle Design

    Science.gov (United States)

    Unal, Resit; Keating, Charles; Conway, Bruce; Chytka, Trina

    2004-01-01

    A comprehensive expert-judgment elicitation methodology to quantify input parameter uncertainty and analysis tool uncertainty in a conceptual launch vehicle design analysis has been developed. The ten-phase methodology seeks to obtain expert judgment opinion for quantifying uncertainties as a probability distribution so that multidisciplinary risk analysis studies can be performed. The calibration and aggregation techniques presented as part of the methodology are aimed at improving individual expert estimates, and provide an approach to aggregate multiple expert judgments into a single probability distribution. The purpose of this report is to document the methodology development and its validation through application to a reference aerospace vehicle. A detailed summary of the application exercise, including calibration and aggregation results is presented. A discussion of possible future steps in this research area is given.
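
One common way to aggregate several expert distributions into a single probability distribution is a weighted linear opinion pool; the sketch below assumes hypothetical expert distributions and calibration weights and does not reproduce the report's ten-phase methodology:

```python
import numpy as np

support = np.linspace(0.0, 10.0, 101)   # common support for the uncertain parameter

def normalise(p):
    """Turn non-negative weights over the support into a discrete distribution."""
    p = np.clip(p, 0.0, None)
    return p / p.sum()

# Two experts encode their uncertainty as (discretised) distributions.
expert_pdfs = [
    normalise(np.exp(-0.5 * ((support - 4.0) / 1.0) ** 2)),
    normalise(np.exp(-0.5 * ((support - 6.0) / 1.5) ** 2)),
]

# Calibration-based weights (hypothetical: expert 1 judged twice as reliable).
weights = np.array([2.0, 1.0])
weights = weights / weights.sum()

# Linear opinion pool: weighted mixture of the expert distributions.
pooled = sum(w * p for w, p in zip(weights, expert_pdfs))
pooled_mean = float((support * pooled).sum())
```

The pooled distribution is again a proper probability distribution, so it can feed directly into a downstream multidisciplinary risk analysis.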

  20. Full cost accounting in the analysis of separated waste collection efficiency: A methodological proposal.

    Science.gov (United States)

    D'Onza, Giuseppe; Greco, Giulio; Allegrini, Marco

    2016-02-01

Recycling implies additional costs for separated municipal solid waste (MSW) collection. The aim of the present study is to propose and implement a management tool - the full cost accounting (FCA) method - to calculate the full collection costs of different types of waste. Our analysis aims for a better understanding of the difficulties of putting FCA into practice in the MSW sector. We propose an FCA methodology that uses standard costs and actual quantities to calculate the collection costs of separated and undifferentiated waste. The methodology enables cost-efficiency analysis, benchmarking and variance analysis, overcoming problems related to firm-specific accounting choices, earnings management policies and purchase policies; the variances can be used to identify the causes of off-standard performance and guide managers to deploy resources more efficiently. The methodology can also be implemented by companies lacking a sophisticated management accounting system. Copyright © 2015 Elsevier Ltd. All rights reserved.
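
The "standard cost times actual quantity" calculation and the resulting variance analysis can be sketched as follows; the unit costs and tonnages are invented for illustration and are not taken from the study:

```python
# Standard unit collection costs (hypothetical, EUR per tonne).
STANDARD_COST = {"paper": 95.0, "glass": 60.0, "organic": 110.0, "residual": 70.0}

def full_collection_cost(actual_tonnes, actual_cost=None):
    """Collection cost at standard rates; if actual_cost is given, also return the variance."""
    standard = sum(STANDARD_COST[w] * t for w, t in actual_tonnes.items())
    if actual_cost is None:
        return standard
    # Positive variance = actual spending above standard (off-standard performance).
    return standard, actual_cost - standard

tonnes = {"paper": 120.0, "glass": 80.0, "organic": 150.0, "residual": 300.0}
standard, variance = full_collection_cost(tonnes, actual_cost=55000.0)
```

Because the benchmark uses standard costs rather than each firm's reported costs, comparisons are insulated from firm-specific accounting and purchasing choices.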

  1. Recent methodology in the phytochemical analysis of ginseng

    NARCIS (Netherlands)

    Angelova, N.; Kong, H.-W.; Heijden, R. van de; Yang, S.-Y.; Choi, Y.H.; Kim, H.K.; Wang, M.; Hankemeier, T.; Greef, J. van der; Xu, G.; Verpoorte, R.

    2008-01-01

    This review summarises the most recent developments in ginseng analysis, in particular the novel approaches in sample pre-treatment and the use of high-performance liquid-chromatography-mass spectrometry. The review also presents novel data on analysing ginseng extracts by nuclear magnetic resonance

  2. SAFETY ANALYSIS METHODOLOGY FOR AGED CANDU® 6 NUCLEAR REACTORS

    Directory of Open Access Journals (Sweden)

    WOLFGANG HARTMANN

    2013-10-01

Full Text Available This paper deals with the safety analysis for CANDU® 6 nuclear reactors as affected by main Heat Transport System (HTS) aging. Operational and aging-related changes of the HTS throughout its lifetime may lead to restrictions in certain safety system settings and hence some restriction in performance under certain conditions. A step in confirming safe reactor operation is the tracking of relevant data and their corresponding interpretation by the use of appropriate thermalhydraulic analytic models. Safety analyses ranging from the assessment of safety limits associated with the prevention of intermittent fuel sheath dryout for a slow Loss of Regulation (LOR) analysis to fission gas release after a fuel failure are summarized. Specifically for fission gas release, the thermalhydraulic analysis for a fresh core and an 11 Effective Full Power Years (EFPY) aged core was summarized, leading to the most severe stagnation break sizes for the inlet feeder break and the channel failure time. The associated coolant conditions provide the input data for fuel analyses. Based on the thermalhydraulic data, the fission product inventory under normal operating conditions may be calculated for both fresh and aged cores, and the fission gas release may be evaluated during the transient. This analysis plays a major role in determining possible radiation doses to the public after postulated accidents have occurred.

  3. Physical data generation methodology for return-to-power steam line break analysis

    International Nuclear Information System (INIS)

    Zee, Sung Kyun; Lee, Chung Chan; Lee, Chang Kue

    1996-02-01

The current methodology for generating physics data for the steam line break accident analysis of CE-type nuclear plants such as Yonggwang Unit 3 is valid only if the core does not return to criticality after shutdown. It therefore requires a tremendous amount of net scram worth, especially at the end of the cycle when the moderator temperature coefficient is most negative, so a new methodology is needed to obtain reasonably conservative physics data when the reactor returns to a power condition. The current methodology uses ROCS, which includes only a closed channel model, but it is well known that the closed channel model estimates the core reactivity as too negative when the core flow rate is low. Therefore, a conservative methodology is presented which utilizes an open channel 3D HERMITE model. A return-to-power reactivity credit is produced to supplement the reactivity table generated by the closed channel model. Other data include the hot channel axial power shape, peaking factor and maximum quality for DNBR analysis, as well as a pin census for radiological consequence analysis. 48 figs., 22 tabs., 18 refs. (Author)

  4. Functional Unfold Principal Component Regression Methodology for Analysis of Industrial Batch Process Data

    DEFF Research Database (Denmark)

    Mears, Lisa; Nørregaard, Rasmus; Sin, Gürkan

    2016-01-01

This work proposes a methodology utilizing functional unfold principal component regression (FUPCR) for application to industrial batch process data as a process modeling and optimization tool. The methodology is applied to an industrial fermentation dataset containing 30 batches of a production process operating at Novozymes A/S. Following the FUPCR methodology, the final product concentration could be predicted with an average prediction error of 7.4%. Multiple iterations of preprocessing were applied by implementing the methodology to identify the best data handling methods for the model. It is shown that application of functional data analysis and the choice of variance scaling method have the greatest impact on the prediction accuracy. Considering the vast amount of batch process data continuously generated in industry, this methodology can potentially contribute as a tool to identify...
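
A bare-bones numerical sketch of unfold principal component regression on synthetic batch data (omitting the functional-data smoothing step, and with invented dimensions rather than the Novozymes dataset):

```python
import numpy as np

rng = np.random.default_rng(1)
n_batches, n_vars, n_time = 30, 5, 50

# Synthetic batch data: one latent factor drives the batch trajectories.
t = rng.normal(size=n_batches)                        # latent batch-to-batch variation
loading = rng.normal(size=(n_vars, n_time))
batches = t[:, None, None] * loading + 0.1 * rng.normal(size=(n_batches, n_vars, n_time))
y = 2.0 * t + 0.05 * rng.normal(size=n_batches)       # e.g. final product concentration

# 1. Batch-wise unfolding: one row per batch, (variable, time) pairs as columns.
X = batches.reshape(n_batches, n_vars * n_time)
X = (X - X.mean(axis=0)) / X.std(axis=0)              # autoscaling (one variance-scaling choice)

# 2. PCA via SVD; keep the first k principal components.
k = 3
U, s, Vt = np.linalg.svd(X, full_matrices=False)
scores = U[:, :k] * s[:k]

# 3. Regress the response on the scores (principal component regression).
beta, *_ = np.linalg.lstsq(scores, y - y.mean(), rcond=None)
y_hat = scores @ beta + y.mean()
```

The choice of scaling in step 1 (here, autoscaling) is exactly the kind of preprocessing decision the abstract reports as having the greatest impact on prediction accuracy.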

  5. Análisis del ataque posicional de balonmano playa masculino y femenino mediante coordenadas polares. [Analysis of positional attack in beach handball male and female with polar coordinates].

    Directory of Open Access Journals (Sweden)

    Rafael E. Reigal

    2015-07-01

Full Text Available This study provides a novel perspective on understanding and differentiating game behaviour during the positional attack phase in men's and women's beach handball. Twenty-eight high-level matches were analysed with the Hoisan software, using a nomothetic, follow-up, multidimensional observational design with a methodologically validated taxonomic system. The data were subjected to polar coordinate analysis in its genuine version. Seven focal behaviours were chosen for these analyses, relating mainly to the players who finish the attack and the way the attack is executed. The results showed differences between the paired behaviours of the men's and women's categories. Notably, positional attack in the women's category is oriented toward left-side finishing zones against an open defensive system and depends more on the player taking the role of double goalkeeper (specialist) than in the men's category, where responsibilities are more evenly distributed and the attack is directed toward the right wing against a closed defensive system. The spin shot proved to be the main offensive resource in both categories.

  6. Environmental analysis applied to schools. Methodologies for data acquisition

    International Nuclear Information System (INIS)

    Andriola, L.; Ceccacci, R.

    2001-01-01

The environmental analysis is the basis of environmental management for organizations and is considered the first step in EMAS. It allows organizations to identify and deal with environmental issues and to gain a clear knowledge of their environmental performance. Schools can be included among such organizations. Nevertheless, the complexity of environmental issues and applicable regulations makes it very difficult for a school that wants to implement an environmental management system (EMAS, ISO 14001, etc.) to face this first step. An instrument has therefore been defined that is simple yet complete and coherent with the reference standards, to let schools shape their own process for elaborating the initial environmental review. This instrument consists, essentially, of cards that, when completed, facilitate the drafting of the environmental analysis report.

  7. Big and complex data analysis methodologies and applications

    CERN Document Server

    2017-01-01

    This volume conveys some of the surprises, puzzles and success stories in high-dimensional and complex data analysis and related fields. Its peer-reviewed contributions showcase recent advances in variable selection, estimation and prediction strategies for a host of useful models, as well as essential new developments in the field. The continued and rapid advancement of modern technology now allows scientists to collect data of increasingly unprecedented size and complexity. Examples include epigenomic data, genomic data, proteomic data, high-resolution image data, high-frequency financial data, functional and longitudinal data, and network data. Simultaneous variable selection and estimation is one of the key statistical problems involved in analyzing such big and complex data. The purpose of this book is to stimulate research and foster interaction between researchers in the area of high-dimensional data analysis. More concretely, its goals are to: 1) highlight and expand the breadth of existing methods in...

  8. Revised Rapid Soils Analysis Kit (RSAK) - Wet Methodology

    Science.gov (United States)

    2018-01-01

...identify characteristics of buried explosives. The existing Rapid Soils Analysis Kit (RSAK), developed at ERDC, was modified to shrink its cube volume... This work was performed under Project 354894 CALDERA; the technical monitor was Dr. John Q. Ehrgott Jr. The principal investigator for this study was... ...and depth of a buried improvised explosive device (IED) based on factors such as soil type, soil density, soil moisture, and dimensional

  9. Exploratory market structure analysis. Topology-sensitive methodology.

    OpenAIRE

    Mazanec, Josef

    1999-01-01

Given the recent abundance of brand choice data from scanner panels, market researchers have neglected the measurement and analysis of perceptions. Heterogeneity of perceptions is still a largely unexplored issue in market structure and segmentation studies. Over the last decade various parametric approaches toward modelling segmented perception-preference structures such as combined MDS and Latent Class procedures have been introduced. These methods, however, are not tailored for qualitative ...

  10. Applications of the TSUNAMI sensitivity and uncertainty analysis methodology

    International Nuclear Information System (INIS)

    Rearden, Bradley T.; Hopper, Calvin M.; Elam, Karla R.; Goluoglu, Sedat; Parks, Cecil V.

    2003-01-01

    The TSUNAMI sensitivity and uncertainty analysis tools under development for the SCALE code system have recently been applied in four criticality safety studies. TSUNAMI is used to identify applicable benchmark experiments for criticality code validation, assist in the design of new critical experiments for a particular need, reevaluate previously computed computational biases, and assess the validation coverage and propose a penalty for noncoverage for a specific application. (author)

  11. Development Risk Methodology for Whole Systems Trade Analysis

    Science.gov (United States)

    2016-08-01

Analysis (Fourth Edition)”, McGraw Hill, Boston, 2007. 7. Hogg and Tanis, “Probability and Statistical Inference (Sixth Edition)”, Prentice Hall, 2001. ...time assessment for one notional technology. This time assessment is a probability distribution [7] where the area under the curve totals 1.0.

  12. Model checking exact cost for attack scenarios

    DEFF Research Database (Denmark)

    Aslanyan, Zaruhi; Nielson, Flemming

    2017-01-01

Attack trees constitute a powerful tool for modelling security threats. Many security analyses of attack trees can be seamlessly expressed as model checking of Markov Decision Processes obtained from the attack trees, thus reaping the benefits of a coherent framework and mature tool support. However, current model checking does not encompass the exact cost analysis of an attack, which is standard for attack trees. Our first contribution is the logic erPCTL with cost-related operators. The extended logic allows us to analyse the probability of an event satisfying given cost bounds and to compute the exact cost of an event. Our second contribution is the model checking algorithm for erPCTL. Finally, we apply our framework to the analysis of attack trees.
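
For flavour, the sketch below does a plain bottom-up AND/OR evaluation of an attack tree with invented probabilities and costs; it is a deliberate simplification, not the erPCTL semantics or the model checking algorithm of the paper:

```python
# Attack tree nodes: leaves carry (probability, cost); gates combine their children.
def evaluate(node):
    """Return (success probability, cost) for a node, bottom-up."""
    kind = node[0]
    if kind == "leaf":
        _, p, c = node
        return p, c
    results = [evaluate(child) for child in node[1:]]
    if kind == "AND":
        # All sub-attacks are needed: probabilities multiply, costs add.
        p, c = 1.0, 0.0
        for pi, ci in results:
            p *= pi
            c += ci
        return p, c
    if kind == "OR":
        # Naive: best probability and cheapest cost, possibly from different children.
        return max(r[0] for r in results), min(r[1] for r in results)
    raise ValueError(kind)

tree = ("OR",
        ("AND", ("leaf", 0.8, 100.0), ("leaf", 0.5, 50.0)),
        ("leaf", 0.3, 20.0))
p, cost = evaluate(tree)
```

A logic like erPCTL goes further by letting one ask for the probability of success *subject to* a cost bound, which this independent max/min evaluation cannot express.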

  13. CANDU safety analysis system establishment; development of trip coverage and multi-dimensional hydrogen analysis methodology

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jong Ho; Ohn, M. Y.; Cho, C. H. [KOPEC, Taejon (Korea)

    2002-03-01

    The trip coverage analysis model requires the geometry network for primary and secondary circuit as well as the plant control system to simulate all the possible plant operating conditions throughout the plant life. The model was validated for the power maneuvering and the Wolsong 4 commissioning test. The trip coverage map was produced for the large break loss of coolant accident and the complete loss of class IV power event. The reliable multi-dimensional hydrogen analysis requires the high capability for thermal hydraulic modelling. To acquire such a basic capability and verify the applicability of GOTHIC code, the assessment of heat transfer model, hydrogen mixing and combustion model was performed. Also, the assessment methodology for flame acceleration and deflagration-to-detonation transition is established. 22 refs., 120 figs., 31 tabs. (Author)

  14. Methodological Approach to the Energy Analysis of Unconstrained Historical Buildings

    OpenAIRE

    Chiara Burattini; Fabio Nardecchia; Fabio Bisegna; Lucia Cellucci; Franco Gugliermetti; Andrea de Lieto Vollaro; Ferdinando Salata; Iacopo Golasi

    2015-01-01

    The goal set by the EU of quasi-zero energy buildings is not easy to reach for a country like Italy, as it holds a wide number of UNESCO sites and most of them are entire historical old towns. This paper focuses on the problem of the improvement of energy performance of historical Italian architecture through simple interventions that respect the building without changing its shape and structure. The work starts from an energy analysis of a building located in the historic center of Tivoli, a...

Composite DoS Attack Model

    Directory of Open Access Journals (Sweden)

    Simona Ramanauskaitė

    2012-04-01

Full Text Available Preparation for potential threats is one of the most important phases in ensuring system security. It allows evaluating possible losses, changes in the attack process, the effectiveness of the countermeasures used, optimal system settings, etc. In cyber-attack cases, executing real experiments can be difficult for many reasons, but mathematical or programming models can be used instead of conducting experiments in a real environment. This work proposes a composite denial of service attack model that combines bandwidth exhaustion, filtering and memory depletion models for a more realistic representation of such cyber-attacks. Different experiments were performed on the basis of the introduced model; they showed how the attacker's and the victim's properties influence the success probability of a denial of service attack. In the future, this model can be used for denial of service attack or countermeasure optimization.
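
The bandwidth-exhaustion component of such a model can be sketched deterministically; the drop-fraction formula and the success criterion below are illustrative assumptions, not the paper's model:

```python
def legit_drop_fraction(attack_rate, legit_rate, capacity):
    """Bandwidth-exhaustion sketch: traffic beyond link capacity is dropped,
    and drops are assumed to hit all flows proportionally."""
    total = attack_rate + legit_rate
    if total <= capacity:
        return 0.0
    return (total - capacity) / total

def dos_success(attack_rate, legit_rate, capacity, threshold=0.5):
    """Call the attack successful if more than `threshold` of legitimate
    traffic is lost (hypothetical criterion)."""
    return legit_drop_fraction(attack_rate, legit_rate, capacity) > threshold
```

Even this toy version exhibits the dependency the abstract describes: success hinges on the ratio between the attacker's sending rate and the victim's capacity.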

  16. An In-Depth Analysis of the Cold Boot Attack: Can It Be Used for Sound Forensic Memory Acquisition?

    Science.gov (United States)

    2011-01-01

the cold boot attack. It is very important to bear in mind that, in general, the higher the memory density of a given memory module, the less it will...very limited amounts of data. When a computer system is placed in hibernation or sleep mode, rather than the operating system copying out any...cryptographic keys to CPU cache (SRAM), the keys are instead written out to the hibernation file. However, in cases where the computer system both has an

  17. Landslide risk analysis: a multi-disciplinary methodological approach

    Science.gov (United States)

    Sterlacchini, S.; Frigerio, S.; Giacomelli, P.; Brambilla, M.

    2007-11-01

    This study describes an analysis carried out within the European community project "ALARM" (Assessment of Landslide Risk and Mitigation in Mountain Areas, 2004) on landslide risk assessment in the municipality of Corvara in Badia, Italy. This mountainous area, located in the central Dolomites (Italian Alps), poses a significant landslide hazard to several man-made and natural objects. Three parameters for determining risk were analysed as an aid to preparedness and mitigation planning: event occurrence probability, elements at risk, and the vulnerability of these elements. Initially, a landslide hazard scenario was defined; this step was followed by the identification of the potential vulnerable elements, by the estimation of the expected physical effects, due to the occurrence of a damaging phenomenon, and by the analysis of social and economic features of the area. Finally, a potential risk scenario was defined, where the relationships between the event, its physical effects, and its economic consequences were investigated. People and public administrators with training and experience in local landsliding and slope processes were involved in each step of the analysis. A "cause-effect" correlation was applied, derived from the "dose-response" equation initially used in the biological sciences and then adapted by economists for the assessment of environmental risks. The relationship was analysed from a physical point of view and the cause (the natural event) was correlated to the physical effects, i.e. the aesthetic, functional, and structural damage. An economic evaluation of direct and indirect damage was carried out considering the assets in the affected area (i.e., tourist flows, goods, transport and the effect on other social and economic activities). This study shows the importance of indirect damage, which is as significant as direct damage. 
The total amount of direct damage was estimated at €8 913 000; in contrast, indirect damage ranged considerably

  18. Landslide risk analysis: a multi-disciplinary methodological approach

    Directory of Open Access Journals (Sweden)

    S. Sterlacchini

    2007-11-01

Full Text Available This study describes an analysis carried out within the European community project "ALARM" (Assessment of Landslide Risk and Mitigation in Mountain Areas, 2004) on landslide risk assessment in the municipality of Corvara in Badia, Italy. This mountainous area, located in the central Dolomites (Italian Alps), poses a significant landslide hazard to several man-made and natural objects. Three parameters for determining risk were analysed as an aid to preparedness and mitigation planning: event occurrence probability, elements at risk, and the vulnerability of these elements. Initially, a landslide hazard scenario was defined; this step was followed by the identification of the potential vulnerable elements, by the estimation of the expected physical effects, due to the occurrence of a damaging phenomenon, and by the analysis of social and economic features of the area. Finally, a potential risk scenario was defined, where the relationships between the event, its physical effects, and its economic consequences were investigated. People and public administrators with training and experience in local landsliding and slope processes were involved in each step of the analysis.

A "cause-effect" correlation was applied, derived from the "dose-response" equation initially used in the biological sciences and then adapted by economists for the assessment of environmental risks. The relationship was analysed from a physical point of view and the cause (the natural event) was correlated to the physical effects, i.e. the aesthetic, functional, and structural damage. An economic evaluation of direct and indirect damage was carried out considering the assets in the affected area (i.e., tourist flows, goods, transport and the effect on other social and economic activities). This study shows the importance of indirect damage, which is as significant as direct damage. The total amount of direct damage was estimated at €8 913 000; in contrast, indirect

  19. Respirable crystalline silica: Analysis methodologies; Silice cristalina respirable: Metodologias de analisis

    Energy Technology Data Exchange (ETDEWEB)

    Gomez-Tena, M. P.; Zumaquero, E.; Ibanez, M. J.; Machi, C.; Escric, A.

    2012-07-01

This paper describes different analysis methodologies in occupational environments and raw materials. A review is presented of the existing methodologies, the approximations made, some of the constraints involved, as well as the best measurement options for the different raw materials. In addition, the different factors that might affect the precision and accuracy of the results are examined. With regard to the methodologies used for the quantitative analysis of any of the polymorphs, particularly quartz, the study centres on the analytical X-ray diffraction method. Simplified methods of calculation and experimental separation are evaluated for the estimation of this fraction in the raw materials, such as separation methods by centrifugation, sedimentation, and dust generation in controlled environments. In addition, a review is presented of the methodologies used for the collection of respirable crystalline silica in environmental dust. (Author)

  20. External Events Analysis for LWRS/RISMC Project: Methodology Development and Early Demonstration

    Energy Technology Data Exchange (ETDEWEB)

    Parisi, Carlo [Idaho National Laboratory; Prescott, Steven Ralph [Idaho National Laboratory; Yorg, Richard Alan [Idaho National Laboratory; Coleman, Justin Leigh [Idaho National Laboratory; Szilard, Ronaldo Henriques [Idaho National Laboratory

    2016-02-01

The ultimate scope of Industrial Application #2 (IA) of the LWRS/RISMC project is a realistic simulation of natural external hazards that pose a threat to an NPP. This scope requires the development of a methodology and of a qualified set of tools able to perform advanced risk-informed safety analysis. In particular, the methodology should be able to combine results from seismic, flooding and thermal-hydraulic (TH) deterministic calculations with dynamic PRA. This summary presents the key points of the methodology being developed and the very first sample application of it to a simple problem (spent fuel pool).

  1. Methodology for analysis and simulation of large multidisciplinary problems

    Science.gov (United States)

    Russell, William C.; Ikeda, Paul J.; Vos, Robert G.

    1989-01-01

    The Integrated Structural Modeling (ISM) program is being developed for the Air Force Weapons Laboratory and will be available for Air Force work. Its goal is to provide a design, analysis, and simulation tool intended primarily for directed energy weapons (DEW), kinetic energy weapons (KEW), and surveillance applications. The code is designed to run on DEC (VMS and UNIX), IRIS, Alliant, and Cray hosts. Several technical disciplines are included in ISM, namely structures, controls, optics, thermal, and dynamics. Four topics from the broad ISM goal are discussed. The first is project configuration management and includes two major areas: the software and database arrangement and the system model control. The second is interdisciplinary data transfer and refers to exchange of data between various disciplines such as structures and thermal. Third is a discussion of the integration of component models into one system model, i.e., multiple discipline model synthesis. Last is a presentation of work on a distributed processing computing environment.

  2. New approaches in intelligent image analysis techniques, methodologies and applications

    CERN Document Server

    Nakamatsu, Kazumi

    2016-01-01

    This book presents an Introduction and 11 independent chapters, which are devoted to various new approaches of intelligent image processing and analysis. The book also presents new methods, algorithms and applied systems for intelligent image processing, on the following basic topics: Methods for Hierarchical Image Decomposition; Intelligent Digital Signal Processing and Feature Extraction; Data Clustering and Visualization via Echo State Networks; Clustering of Natural Images in Automatic Image Annotation Systems; Control System for Remote Sensing Image Processing; Tissue Segmentation of MR Brain Images Sequence; Kidney Cysts Segmentation in CT Images; Audio Visual Attention Models in Mobile Robots Navigation; Local Adaptive Image Processing; Learning Techniques for Intelligent Access Control; Resolution Improvement in Acoustic Maps. Each chapter is self-contained with its own references. Some of the chapters are devoted to the theoretical aspects while the others are presenting the practical aspects and the...

  3. THE MURCHISON WIDEFIELD ARRAY 21 cm POWER SPECTRUM ANALYSIS METHODOLOGY

    Energy Technology Data Exchange (ETDEWEB)

    Jacobs, Daniel C.; Beardsley, A. P.; Bowman, Judd D. [Arizona State University, School of Earth and Space Exploration, Tempe, AZ 85287 (United States); Hazelton, B. J.; Sullivan, I. S.; Barry, N.; Carroll, P. [University of Washington, Department of Physics, Seattle, WA 98195 (United States); Trott, C. M.; Pindor, B.; Briggs, F.; Gaensler, B. M. [ARC Centre of Excellence for All-sky Astrophysics (CAASTRO) (Australia); Dillon, Joshua S.; Oliveira-Costa, A. de; Ewall-Wice, A.; Feng, L. [MIT Kavli Institute for Astrophysics and Space Research, Cambridge, MA 02139 (United States); Pober, J. C. [Brown University, Department of Physics, Providence, RI 02912 (United States); Bernardi, G. [Department of Physics and Electronics, Rhodes University, Grahamstown 6140 (South Africa); Cappallo, R. J.; Corey, B. E. [MIT Haystack Observatory, Westford, MA 01886 (United States); Emrich, D., E-mail: daniel.c.jacobs@asu.edu [International Centre for Radio Astronomy Research, Curtin University, Perth, WA 6845 (Australia); and others

    2016-07-10

We present the 21 cm power spectrum analysis approach of the Murchison Widefield Array Epoch of Reionization project. In this paper, we compare the outputs of multiple pipelines for the purpose of validating statistical limits on cosmological hydrogen at redshifts between 6 and 12. Multiple independent data calibration and reduction pipelines are used to make power spectrum limits on a fiducial night of data. Comparing the outputs of imaging and power spectrum stages highlights differences in calibration, foreground subtraction, and power spectrum calculation. The power spectra found using these different methods span a space defined by the various tradeoffs between speed, accuracy, and systematic control. Lessons learned from comparing the pipelines range from the algorithmic to the prosaically mundane; all demonstrate the many pitfalls of neglecting reproducibility. We briefly discuss the way these different methods attempt to handle the question of evaluating a significant detection in the presence of foregrounds.
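
For readers unfamiliar with the basic quantity being compared, the toy 1-D periodogram below shows what a power spectrum estimate is; it is far removed from the interferometric, foreground-subtracted pipelines the paper validates:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1024
dx = 1.0                                   # sample spacing (arbitrary units)
# A sinusoid at 0.1 cycles/sample buried in noise.
signal = np.sin(2 * np.pi * 0.1 * np.arange(n) * dx) + 0.5 * rng.normal(size=n)

# Periodogram-style power spectrum estimate.
fft = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(n, d=dx)
power = (np.abs(fft) ** 2) / n

peak_freq = freqs[np.argmax(power[1:]) + 1]   # skip the DC bin
```

The real analyses differ in calibration and foreground handling, but all ultimately reduce the data to a power-versus-spatial-frequency curve like this one.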

  4. Mediation analysis in nursing research: a methodological review.

    Science.gov (United States)

    Liu, Jianghong; Ulrich, Connie

    2016-12-01

    Mediation statistical models help clarify the relationship between independent predictor variables and dependent outcomes of interest by assessing the impact of third variables. This type of statistical analysis is applicable for many clinical nursing research questions, yet its use within nursing remains low. Indeed, mediational analyses may help nurse researchers develop more effective and accurate prevention and treatment programs as well as help bridge the gap between scientific knowledge and clinical practice. In addition, this statistical approach allows nurse researchers to ask - and answer - more meaningful and nuanced questions that extend beyond merely determining whether an outcome occurs. Therefore, the goal of this paper is to provide a brief tutorial on the use of mediational analyses in clinical nursing research by briefly introducing the technique and, through selected empirical examples from the nursing literature, demonstrating its applicability in advancing nursing science.
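The product-of-coefficients logic behind a simple mediation model can be sketched as follows; the data, effect sizes, and variable names are hypothetical, and a real analysis would add bootstrap confidence intervals for the indirect effect:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Hypothetical data: X (predictor), M (mediator), Y (outcome)
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)            # true path a = 0.5
y = 0.4 * m + 0.2 * x + rng.normal(size=n)  # true b = 0.4, direct c' = 0.2

def ols(X, y):
    """Least-squares coefficients with an intercept column prepended."""
    X = np.column_stack([np.ones(len(y)), X])
    return np.linalg.lstsq(X, y, rcond=None)[0]

a = ols(x, m)[1]                              # path a: X -> M
b = ols(np.column_stack([x, m]), y)[2]        # path b: M -> Y, controlling for X
c_prime = ols(np.column_stack([x, m]), y)[1]  # direct effect of X on Y
c = ols(x, y)[1]                              # total effect of X on Y

indirect = a * b
print(f"indirect (a*b) = {indirect:.3f}, direct = {c_prime:.3f}, total = {c:.3f}")
```

For linear models fit by ordinary least squares, the decomposition is exact: the total effect equals the direct effect plus the product-of-coefficients indirect effect.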

  5. Methodology for global nonlinear analysis of nuclear systems

    International Nuclear Information System (INIS)

    Cacuci, D.G.; Cacuci, G.L.

    1987-01-01

    This paper outlines a general method for globally computing the crucial features of nonlinear problems: bifurcations, limit points, saddle points, and extrema (maxima and minima); our method also yields the local sensitivities (i.e., first-order derivatives) of the system's state variables (e.g., fluxes, power, temperatures, flows) at any point in the system's phase space. We also present an application of this method to the nonlinear BWR model discussed in Refs. 8 and 11. The most significant novel feature of our method is the recasting of a general mathematical problem comprising nonlinear constrained optimization and sensitivity analysis into a fixed point problem of the form F[u(s), λ(s)] = 0, whose global zeros and singular points are related to the special features (i.e., extrema, bifurcations, etc.) of the original problem.
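A minimal illustration of the fixed-point idea, using a hypothetical scalar problem rather than the authors' BWR model: the branch of zeros of F(u, λ) = λ - u² folds back at a limit point, which a continuation procedure can approach by re-solving with Newton's method while monitoring the Jacobian:

```python
import numpy as np

# Toy fixed-point problem F(u, lam) = lam - u**2 = 0, a hypothetical stand-in
# for the paper's F[u(s), lambda(s)] = 0: the branch u = sqrt(lam) folds back
# at the limit point (u, lam) = (0, 0), where dF/du = -2u becomes singular.
def F(u, lam):
    return lam - u**2

def dF_du(u, lam):
    return -2.0 * u

def newton(u0, lam, tol=1e-12, max_iter=50):
    """Solve F(u, lam) = 0 for u at fixed lam, starting from u0."""
    u = u0
    for _ in range(max_iter):
        step = F(u, lam) / dF_du(u, lam)
        u -= step
        if abs(step) < tol:
            break
    return u

# Continuation: march lam downward, re-solving for u at each step; the
# collapse of dF/du toward zero flags the approaching limit point.
u = 1.0
for lam in np.linspace(1.0, 0.01, 100):
    u = newton(u, lam)
print(f"near the fold: u = {u:.3f}, dF/du = {dF_du(u, lam):.3f}")
```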

  6. UNDERSTANDING FLOW OF ENERGY IN BUILDINGS USING MODAL ANALYSIS METHODOLOGY

    Energy Technology Data Exchange (ETDEWEB)

    John Gardner; Kevin Heglund; Kevin Van Den Wymelenberg; Craig Rieger

    2013-07-01

    It is widely understood that energy storage is the key to integrating variable generators into the grid. It has been proposed that the thermal mass of buildings could be used as a distributed energy storage solution and several researchers are making headway in this problem. However, the inability to easily determine the magnitude of the building’s effective thermal mass, and how the heating ventilation and air conditioning (HVAC) system exchanges thermal energy with it, is a significant challenge to designing systems which utilize this storage mechanism. In this paper we adapt modal analysis methods used in mechanical structures to identify the primary modes of energy transfer among thermal masses in a building. The paper describes the technique using data from an idealized building model. The approach is successfully applied to actual temperature data from a commercial building in downtown Boise, Idaho.
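The modal-analysis idea can be sketched on a hypothetical two-mass thermal network (all capacitances and conductances below are illustrative, not measured values from the Boise building): the eigenvectors of the state matrix are the energy-transfer modes, and the negative reciprocals of its eigenvalues are the corresponding time constants.

```python
import numpy as np

# Hypothetical two-mass building model: node 1 = room air, node 2 = structure.
# C_i dT_i/dt = sum of conductive exchanges; all values are illustrative.
C1, C2 = 1.0e6, 2.0e7        # heat capacities [J/K]
U12, U1o = 500.0, 200.0      # conductances: air<->mass, air<->outdoors [W/K]

A = np.array([[-(U12 + U1o) / C1,  U12 / C1],
              [ U12 / C2,         -U12 / C2]])

# Modal analysis: eigenvectors of A are the energy-transfer modes;
# -1/eigenvalue gives each mode's time constant.
eigvals, eigvecs = np.linalg.eig(A)
tau = -1.0 / eigvals  # time constants [s]
print("time constants [h]:", np.sort(tau) / 3600.0)
```

The fast mode here corresponds to the air node equilibrating with its surroundings, the slow mode to the large structural mass charging and discharging, which is the storage mechanism the paper discusses.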

  7. Development of Methodology for Spent Fuel Pool Severe Accident Analysis Using MELCOR Program

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Won-Tae; Shin, Jae-Uk [RETech. Co. LTD., Yongin (Korea, Republic of); Ahn, Kwang-Il [KAERI, Daejeon (Korea, Republic of)

    2015-05-15

    The general reason why SFP severe accident analysis must be considered is the potentially great risk posed by the huge number of fuel assemblies and the absence of a containment around the SFP building. In most cases, the SFP building is vulnerable to external damage or attack. On the other hand, the low decay heat of the fuel assemblies and the large volume of water may make the accident processes slow compared with an accident in the reactor core. In short, the severity of the potential consequences means that SFP risk management cannot be excluded from consideration. The U.S. Nuclear Regulatory Commission has performed consequence studies of postulated spent fuel pool accidents. The Fukushima-Daiichi accident accelerated the need for such studies, causing the nuclear industry and regulatory bodies to reexamine several assumptions concerning beyond-design-basis events such as a station blackout. The tsunami brought about a loss of coolant accident, leading to the explosion of hydrogen in the SFP building. Analyses of SFP accident processes in the case of a loss of coolant with no heat removal have been performed. Few studies, however, have focused on the long-term progression of an SFP severe accident under no mitigation action such as water makeup to the SFP. The USNRC and OECD have worked together to examine the behavior of PWR fuel assemblies under severe accident conditions in a spent fuel rack. In support of the investigation, several new features have been added to the MELCOR model to simulate both a BWR fuel assembly and a PWR 17 x 17 assembly in a spent fuel pool rack undergoing severe accident conditions. The purpose of the study in this paper is to develop a methodology for the long-term analysis of a plant-level SFP severe accident by using the new-featured MELCOR program for the OPR-1000 Nuclear Power Plant. The study investigates the ability of MELCOR to predict the entire process of SFP severe accident phenomena, including the molten corium and concrete reaction.

  8. What Is a Heart Attack?

    Science.gov (United States)

    Heart attack, also known as myocardial infarction. ... or years after the procedure. Other treatments for heart attack include medicines and medical ...

  9. submitter Methodologies for the Statistical Analysis of Memory Response to Radiation

    CERN Document Server

    Bosser, Alexandre L; Tsiligiannis, Georgios; Frost, Christopher D; Zadeh, Ali; Jaatinen, Jukka; Javanainen, Arto; Puchner, Helmut; Saigne, Frederic; Virtanen, Ari; Wrobel, Frederic; Dilillo, Luigi

    2016-01-01

    Methodologies are proposed for in-depth statistical analysis of Single Event Upset data. The motivation for using these methodologies is to obtain precise information on the intrinsic defects and weaknesses of the tested devices, and to gain insight on their failure mechanisms, at no additional cost. The case study is a 65 nm SRAM irradiated with neutrons, protons and heavy ions. This publication is an extended version of a previous study [1].

  10. Methodology of the Integrated Analysis of Company's Financial Status and Its Performance Results

    OpenAIRE

    Mackevičius, Jonas; Valkauskas, Romualdas

    2010-01-01

    Information about a company's financial status and its performance results is very important for an objective evaluation of the company's position in the market and its competitive possibilities in the future. Such information is provided in the financial statements, and it is important to apply and investigate it properly. A methodology for the integrated analysis of a company's financial status and performance results is recommended in this article. This methodology consists of three elements...

  11. Fuzzy Clustering based Methodology for Multidimensional Data Analysis in Computational Forensic Domain

    OpenAIRE

    Kilian Stoffel; Paul Cotofrei; Dong Han

    2012-01-01

    As an interdisciplinary domain requiring advanced and innovative methodologies, computational forensics is characterized by data that are simultaneously large-scale and uncertain, multidimensional and approximate. Forensic domain experts, trained to discover hidden patterns in crime data, are limited in their analysis without the assistance of a computational intelligence approach. In this paper, a methodology and an automatic procedure, based on fuzzy set theory and designed to infer precis...
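A minimal sketch of the fuzzy clustering machinery such a methodology might build on (standard fuzzy c-means, not the authors' procedure; the data and parameters are invented):

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, n_iter=100, seed=0):
    """Minimal fuzzy c-means: returns membership matrix U (n x c) and centers."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)            # memberships sum to 1 per point
    for _ in range(n_iter):
        W = U ** m                               # fuzzified memberships
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)  # standard membership update
    return U, centers

# Two invented "evidence" clusters in a 2-D feature space
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(3, 0.5, (50, 2))])
U, centers = fuzzy_c_means(X)
print("cluster centers:\n", centers.round(2))
```

Unlike hard clustering, each data point carries a graded membership in every cluster, which suits the uncertain, approximate data the abstract describes.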

  12. A fast reactor transient analysis methodology for personal computers

    International Nuclear Information System (INIS)

    Ott, K.O.

    1993-01-01

    A simplified model for a liquid-metal-cooled reactor (LMR) transient analysis, in which point kinetics as well as lumped descriptions of the heat transfer equations in all components are applied, is converted from a differential into an integral formulation. All 30 differential balance equations are implicitly solved in terms of convolution integrals. The prompt jump approximation is applied as the strong negative feedback effectively keeps the net reactivity well below prompt critical. After implicit finite differencing of the convolution integrals, the kinetics equation assumes a new form, i.e., the quadratic dynamics equation. In this integral formulation, the initial value problem of typical LMR transients can be solved with large time steps (initially 1 s, later up to 256 s). This makes transient problems amenable to treatment on a personal computer. The resulting mathematical model forms the basis for the GW-BASIC LMR transient calculation (LTC) program. The LTC program has also been converted to QuickBASIC. The running time for a 10-h overpower transient is then ∼40 to 10 s, depending on the hardware version (286, 386, or 486 with math coprocessors)
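The prompt jump approximation mentioned above can be illustrated with a one-line calculation; the delayed-neutron fraction and reactivity values below are illustrative, not taken from the LTC program:

```python
# Prompt jump approximation: for a step reactivity insertion rho < beta, the
# neutron level jumps promptly by a factor beta/(beta - rho), after which the
# slow delayed-precursor evolution governs the transient. Values illustrative.
beta = 0.0035    # delayed-neutron fraction, typical order for an LMR
rho = -0.0010    # step reactivity from negative feedback [dk/k]

jump_ratio = beta / (beta - rho)
print(f"prompt jump: power falls promptly to {jump_ratio:.3f} of its initial value")
```

Because the approximation removes the fast prompt-neutron time scale from the equations, the remaining system is stiff only on delayed-neutron time scales, which is what permits the large time steps quoted in the abstract.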

  13. Notions and methodologies for uncertainty analysis in simulations of transient events at a nuclear power plant

    International Nuclear Information System (INIS)

    Alva N, J.; Ortiz V, J.; Amador G, R.; Delfin L, A.

    2007-01-01

    The objective of the present work is to gather the basic notions related to uncertainty analysis and some of the methodologies applied in studies of transient events at a nuclear power plant, in particular thermal-hydraulic phenomena. The concepts and methodologies mentioned in this work are the result of an exhaustive bibliographical investigation of the topic in the nuclear field. Methodologies for uncertainty analysis have been developed by diverse institutions and are broadly used worldwide for application to the results of best-estimate computer codes in thermal-hydraulic and safety analysis of plants and nuclear reactors. The main sources of uncertainty, the types of uncertainty, and aspects related to best-estimate models and the best-estimate method are also presented. (Author)

  14. Economy As A Phenomenon Of Culture: Theoretical And Methodological Analysis

    Directory of Open Access Journals (Sweden)

    S. N. Ivaskovsky

    2017-01-01

    Full Text Available The article redefines economy as a phenomenon of culture, a product of a historically and socially grounded set of values shared by members of a given society. The research shows that culture is not always identical to social utility, because there are multiple examples where archaic, traditionalist, irrational cultural norms hinder social and economic progress and trap nations in poverty and underdevelopment. One of the reasons for the lack of scholarly attention to the cultural dimension of economy is the triumph of positivism in economics. Mathematics has become the dominant language of economic analysis. This leads to the transformation of economics into a sort of «social physics», accompanied by the loss of its original humanitarian nature shared in the works of all the great economists of the past. The author emphasizes the importance of an interdisciplinary approach to economic research and the incorporation of the achievements of the other social disciplines – history, philosophy, sociology and cultural studies – into the subject matter of economic theory. Substantiating the main thesis of the article, the author shows that there is a profound ontological bond between economy and culture, which primarily consists in the fact that these spheres of human relations are aimed at the solution of the same problem – the competitive selection of the best ways for people to survive and to satisfy their living needs. In order to overcome the difficulties related to the inclusion of culture in the set of analytical tools used in economic theory, the author suggests using the category of «cultural capital», which restores the earlier meaning of capital, more familiar to economists.

  15. Mathematical modelling methodologies in predictive food microbiology: a SWOT analysis.

    Science.gov (United States)

    Ferrer, Jordi; Prats, Clara; López, Daniel; Vives-Rego, Josep

    2009-08-31

    Predictive microbiology is the area of food microbiology that attempts to forecast the quantitative evolution of microbial populations over time. This is achieved to a great extent through models that include the mechanisms governing population dynamics. Traditionally, the models used in predictive microbiology are whole-system continuous models that describe population dynamics by means of equations applied to extensive or averaged variables of the whole system. Many existing models can be classified by specific criteria. We can distinguish between survival and growth models by seeing whether they tackle mortality or cell duplication. We can distinguish between empirical (phenomenological) models, which mathematically describe specific behaviour, and theoretical (mechanistic) models with a biological basis, which search for the underlying mechanisms driving already observed phenomena. We can also distinguish between primary, secondary and tertiary models, by examining their treatment of the effects of external factors and constraints on the microbial community. Recently, the use of spatially explicit Individual-based Models (IbMs) has spread through predictive microbiology, due to the current technological capacity of performing measurements on single individual cells and thanks to the consolidation of computational modelling. Spatially explicit IbMs are bottom-up approaches to microbial communities that build bridges between the description of micro-organisms at the cell level and macroscopic observations at the population level. They provide greater insight into the mesoscale phenomena that link unicellular and population levels. Every model is built in response to a particular question and with different aims. Even so, in this research we conducted a SWOT (Strength, Weaknesses, Opportunities and Threats) analysis of the different approaches (population continuous modelling and Individual-based Modelling), which we hope will be helpful for current and future
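As an example of the primary, empirical (phenomenological) growth models surveyed here, a sketch of a modified-logistic (Zwietering-type) curve for log10 counts; all parameter values are invented:

```python
import math

def logistic_growth(t, y0=3.0, ymax=9.0, mu_max=0.4, lam=2.0):
    """Primary growth model for log10 counts: modified logistic with a lag
    time lam, maximum specific growth rate mu_max [1/h]. Zwietering-style
    parameterization; all parameter values here are illustrative."""
    A = ymax - y0
    return y0 + A / (1.0 + math.exp(4.0 * mu_max / A * (lam - t) + 2.0))

for t in (0, 5, 10, 20, 40):
    print(f"t = {t:5.1f} h  log10 N = {logistic_growth(t):.2f}")
```

Such a whole-system model predicts only the averaged population variable; the Individual-based Models discussed in the abstract would instead simulate each cell and recover this curve as an emergent property.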

  16. Fukushima Daiichi unit 1 uncertainty analysis--Preliminary selection of uncertain parameters and analysis methodology

    Energy Technology Data Exchange (ETDEWEB)

    Cardoni, Jeffrey N.; Kalinich, Donald A.

    2014-02-01

    Sandia National Laboratories (SNL) plans to conduct uncertainty analyses (UA) on the Fukushima Daiichi unit (1F1) plant with the MELCOR code. The model to be used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). However, that study only examined a handful of various model inputs and boundary conditions, and the predictions yielded only fair agreement with plant data and current release estimates. The goal of this uncertainty study is to perform a focused evaluation of uncertainty in core melt progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, vessel lower head failure, etc.). In preparation for the SNL Fukushima UA work, a scoping study has been completed to identify important core melt progression parameters for the uncertainty analysis. The study also lays out a preliminary UA methodology.

  17. Comparative analysis of two weight-of-evidence methodologies for integrated sediment quality assessment.

    Science.gov (United States)

    Khosrovyan, A; Rodríguez-Romero, A; Antequera Ramos, M; DelValls, T A; Riba, I

    2015-02-01

    The results of sediment quality assessment by two different weight-of-evidence methodologies were compared. Both methodologies used the same dataset but as criteria and procedures were different, the results emphasized different aspects of sediment contamination. One of the methodologies integrated the data by means of a multivariate analysis and suggested bioavailability of contaminants and their spatial distribution. The other methodology, used in the dredged material management framework recently proposed in Spain, evaluated sediment toxicity in general by assigning categories. Despite the differences in the interpretation and presentation of results, the methodologies evaluated sediment risk similarly, taking into account chemical concentrations and toxicological effects. Comparison of the results of different approaches is important to define their limitations and thereby avoid implications of potential environmental impacts from different management options, as in the case of dredged material risk assessment. Consistent results of these two methodologies emphasized validity and robustness of the integrated, weight-of-evidence, approach to sediment quality assessment. Limitations of the methodologies were discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. Reactor analysis support package (RASP). Volume 7. PWR set-point methodology. Final report

    International Nuclear Information System (INIS)

    Temple, S.M.; Robbins, T.R.

    1986-09-01

    This report provides an overview of the basis and methodology requirements for determining setpoints related to Pressurized Water Reactor (PWR) technical specifications and focuses on development of the methodology for a reload core. Additionally, the report documents the implementation and typical methods of analysis used by PWR vendors during the 1970s to develop Protection System Trip Limits (or Limiting Safety System Settings) and Limiting Conditions for Operation. Descriptions of the typical setpoint methodologies are provided for Nuclear Steam Supply Systems as designed and supplied by Babcock and Wilcox, Combustion Engineering, and Westinghouse. The description of the methods of analysis includes a discussion of the computer codes used in the setpoint methodology. Next, the report addresses the treatment of calculational and measurement uncertainties based on the extent to which such information was available for each of the three types of PWR. Finally, the major features of the setpoint methodologies are compared, and the principal effects of each particular methodology on plant operation are summarized for each of the three types of PWR

  19. One-Year Outcomes After Minor Stroke or High-Risk Transient Ischemic Attack: Korean Multicenter Stroke Registry Analysis.

    Science.gov (United States)

    Park, Hong-Kyun; Kim, Beom Joon; Han, Moon-Ku; Park, Jong-Moo; Kang, Kyusik; Lee, Soo Joo; Kim, Jae Guk; Cha, Jae-Kwan; Kim, Dae-Hyun; Nah, Hyun-Wook; Park, Tai Hwan; Park, Sang-Soon; Lee, Kyung Bok; Lee, Jun; Hong, Keun-Sik; Cho, Yong-Jin; Lee, Byung-Chul; Yu, Kyung-Ho; Oh, Mi-Sun; Kim, Joon-Tae; Choi, Kang-Ho; Kim, Dong-Eog; Ryu, Wi-Sun; Choi, Jay Chol; Johansson, Saga; Lee, Su Jin; Lee, Won Hee; Lee, Ji Sung; Lee, Juneyoung; Bae, Hee-Joon

    2017-11-01

    Patients with minor ischemic stroke or transient ischemic attack are at high risk of recurrent stroke and vascular events, which are potentially disabling or fatal. This study aimed to evaluate contemporary subsequent vascular event risk after minor ischemic stroke or transient ischemic attack in Korea. Patients with minor ischemic stroke or high-risk transient ischemic attack admitted within 7 days of symptom onset were identified from a Korean multicenter stroke registry database. We estimated 3-month and 1-year event rates of the primary outcome (composite of stroke recurrence, myocardial infarction, or all-cause death), stroke recurrence, a major vascular event (composite of stroke recurrence, myocardial infarction, or vascular death), and all-cause death and explored differences in clinical characteristics and event rates according to antithrombotic strategies at discharge. Of 9506 patients enrolled in this study, 93.8% underwent angiographic assessment and 72.7% underwent cardiac evaluations; 25.1% had symptomatic stenosis or occlusion of intracranial arteries. At discharge, 95.2% of patients received antithrombotics (antiplatelet polytherapy, 37.1%; anticoagulation, 15.3%) and 86.2% received statins. The 3-month cumulative event rate was 5.9% for the primary outcome, 4.3% for stroke recurrence, 4.6% for a major vascular event, and 2.0% for all-cause death. Corresponding values at 1 year were 9.3%, 6.1%, 6.7%, and 4.1%, respectively. Patients receiving nonaspirin antithrombotic strategies or no antithrombotic agent had higher baseline risk profiles and at least 1.5× higher event rates for clinical event outcomes than those with aspirin monotherapy. Contemporary secondary stroke prevention strategies based on thorough diagnostic evaluation may contribute to the low subsequent vascular event rates observed in real-world clinical practice in Korea. © 2017 American Heart Association, Inc.

  20. Design and analysis of sustainable computer mouse using design for disassembly methodology

    Science.gov (United States)

    Roni Sahroni, Taufik; Fitri Sukarman, Ahmad; Agung Mahardini, Karunia

    2017-12-01

    This paper presents the design and analysis of a computer mouse using the Design for Disassembly methodology. The existing computer mouse model consists of a number of unnecessary parts that increase assembly and disassembly time in production. The objective of this project is to design a new computer mouse based on the Design for Disassembly (DFD) methodology. The methodology proceeds from sketch generation through concept selection and concept scoring. Based on the design screening, design concept B was selected for further analysis. A new design of the computer mouse using a fastening system is proposed. Furthermore, three materials, ABS, polycarbonate, and high-density PE, were assessed to determine their environmental impact. Sustainability analysis was conducted using SolidWorks software. As a result, high-density PE gives the lowest environmental impact while providing the greatest maximum stress value.

  1. Battle Analysis: The Battle of Monte Altuzzo, Offensive, Deliberate Attack, Mountain 85th and 91st Infantry Divisions, September 1944.

    Science.gov (United States)

    1984-05-01

    Quartermaster Company, 85th Signal Company, 85th Military Police Platoon. Attached to the 85th Division: 752d Tank Battalion (medium), 805th Tank... rapidly approaching and no orders had been received, the three platoon leaders, in the absence of the company commander, who did not... In actuality, the two companies kicked off their attack, but Company B got disoriented and instead of attacking and linking up with

  2. Kleptographic Attacks on ECDSA

    Directory of Open Access Journals (Sweden)

    Nadezhda Anatolievna Chepick

    2014-12-01

    Full Text Available This paper presents secretly embedded trapdoor with universal protection (SETUP) attacks on the elliptic curve digital signature algorithm ECDSA. Such attacks allow a malicious manufacturer of black-box cryptosystems to gain access to the user's private key. The way ECDSA can be used for encryption and key exchange is also described.

  3. Evaluation of Crosstalk Attacks in Access Networks

    DEFF Research Database (Denmark)

    Wagner, Christoph; Eiselt, Michael; Grobe, Klaus

    2016-01-01

    WDM-PON systems regained interest as low-cost solution for metro and access networks. We present a comparative analysis of resilience of wavelength-selective and wavelength-routed architectures against crosstalk attackers. We compare the vulnerability of these architectures against attacks...

  4. Transient Ischemic Attack

    Medline Plus

    Full Text Available ... stroke symptoms. © 2018, American Heart Association, Inc.

  5. Seven deadliest USB attacks

    CERN Document Server

    Anderson, Brian

    2010-01-01

    Do you need to keep up with the latest hacks, attacks, and exploits affecting USB technology? Then you need Seven Deadliest USB Attacks. This book pinpoints the most dangerous hacks and exploits specific to USB, laying out the anatomy of these attacks, including how to make your system more secure. You will discover the best ways to defend against these vicious hacks with step-by-step instruction and learn techniques to make your computer and network impenetrable. Attacks detailed in this book include: USB Hacksaw, USB Switchblade, USB-based virus/malicious code launch, USB device overflow, RAMdum

  6. Bit-level differential power analysis attack on implementations of advanced encryption standard software running inside a PIC18F2420 microcontroller

    CSIR Research Space (South Africa)

    Mpalane, K

    2015-12-01

    Full Text Available Bit-level differential power analysis attack on implementations of Advanced Encryption Standard software running inside a PIC18F2420 microcontroller. Kealeboga Mpalane (Department of Computer Science, North West University, Mafikeng, South Africa), H.D. Tsague (Council for Scientific and Industrial Research, Pretoria, South Africa), Naison Gasela and E.M. Esiefa (Department of Computer Science, North West University, Mafikeng, South Africa)
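A toy sketch of the general idea behind a correlation-based power analysis attack, with simulated Hamming-weight leakage; the seeded random permutation standing in for the AES S-box, the trace count, and the noise level are all invented, not the authors' measured PIC18F2420 setup:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy substitution box (a seeded random byte permutation standing in for the
# AES S-box) and Hamming-weight leakage model.
SBOX = rng.permutation(256)
HW = np.array([bin(v).count("1") for v in range(256)])

true_key = 0x3A
n_traces = 2000
plaintexts = rng.integers(0, 256, n_traces)
# Simulated power samples: HW of the S-box output plus Gaussian noise
traces = HW[SBOX[plaintexts ^ true_key]] + rng.normal(0.0, 1.0, n_traces)

def corr(a, b):
    """Pearson correlation between two 1-D arrays."""
    a = a - a.mean()
    b = b - b.mean()
    return (a @ b) / np.sqrt((a @ a) * (b @ b))

# For every key guess, predict the leakage and correlate with the traces;
# the correct guess aligns the nonlinear S-box output and scores highest.
scores = np.array([corr(HW[SBOX[plaintexts ^ k]], traces) for k in range(256)])
recovered = int(scores.argmax())
print(f"recovered key byte: {recovered:#04x} (true {true_key:#04x})")
```

A bit-level variant, as in the paper's title, would target individual output bits rather than the full Hamming weight, but the statistical machinery is the same.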

  7. Development of Non-LOCA Safety Analysis Methodology with RETRAN-3D and VIPRE-01/K

    International Nuclear Information System (INIS)

    Kim, Yo-Han; Cheong, Ae-Ju; Yang, Chang-Keun

    2004-01-01

    Korea Electric Power Research Institute has launched a project to develop an in-house non-loss-of-coolant-accident analysis methodology to overcome the hardships caused by the narrow analytical scopes of existing methodologies. Prior to the development, some safety analysis codes were reviewed, and RETRAN-3D and VIPRE-01 were chosen as the base codes. The codes have been modified to improve the analytical capabilities required to analyze the nuclear power plants in Korea. The methodologies of the vendors and the Electric Power Research Institute have been reviewed, and some documents of foreign utilities have been used to compensate for the insufficiencies. For the next step, a draft methodology for pressurized water reactors has been developed and modified to apply to Westinghouse-type plants in Korea. To verify the feasibility of the methodology, some events of Yonggwang Units 1 and 2 have been analyzed from the standpoints of reactor coolant system pressure and the departure from nucleate boiling ratio. The results of the analyses show trends similar to those of the Final Safety Analysis Report

  8. The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology: Capabilities and Applications

    Science.gov (United States)

    Evers, Ken H.; Bachert, Robert F.

    1987-01-01

    The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology has been formulated and applied over a five-year period. It has proven to be a unique, integrated approach utilizing a top-down, structured technique to define and document the system of interest; a knowledge engineering technique to collect and organize system descriptive information; a rapid prototyping technique to perform preliminary system performance analysis; and a sophisticated simulation technique to perform in-depth system performance analysis.

  9. French Epistemology and its Revisions: Towards a Reconstruction of the Methodological Position of Foucaultian Discourse Analysis

    Directory of Open Access Journals (Sweden)

    Rainer Diaz-Bone

    2007-05-01

    Full Text Available This article reconstructs epistemology in the tradition of Gaston BACHELARD as one of the main foundations of the methodology of FOUCAULTian discourse analysis. Foundational concepts and the methodological approach of French epistemology are one of the continuities in the work of Michel FOUCAULT. BACHELARDian epistemology (and of his successor Georges CANGUILHEM can be used for the reconstruction of the FOUCAULTian methodology and it can also be used to instruct the practices of FOUCAULTian discourse analysis as a stand-alone form of qualitative social research. French epistemology was developed in critical opposition to the phenomenology of Edmund HUSSERL, and to phenomenological theories of science. Because the phenomenology of HUSSERL is one foundation of social phenomenology, the reconstruction of the FOUCAULTian methodology—as built on the French tradition of BACHELARDian epistemology—makes it clear that FOUCAULTian discourse analysis is incommensurable with approaches derived from social phenomenology. The epistemology of BACHELARD is portrayed as a proto-version of discourse analysis. Discourses as well as discourse analyses are conceived as forms of socio-epistemological practice. In this article, the main concepts and strategies of French epistemology are introduced and related to discourse analysis. The consequences of epistemology for a self-reflexive methodology and its practice are discussed. URN: urn:nbn:de:0114-fqs0702241

  10. Applications of a methodology for the analysis of learning trends in nuclear power plants

    International Nuclear Information System (INIS)

    Cho, Hang Youn; Choi, Sung Nam; Yun, Won Yong

    1995-01-01

    A methodology is applied to identify learning trends related to the safety and availability of U.S. commercial nuclear power plants. The application is intended to aid in reducing the likelihood of human errors. To ensure that the methodology can be easily adapted to various types of classification schemes of operational data, a data bank classified by the Transient Analysis Classification and Evaluation (TRACE) scheme was selected. Significance criteria for human-initiated events affecting the systems and for events caused by human deficiencies were used. Clustering analysis was used to identify the learning trend in multi-dimensional histograms. A computer code based on the K-Means algorithm was developed and applied to find the learning period in which error rates decrease monotonically with plant age
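The clustering step can be sketched with a toy one-dimensional K-Means on invented yearly error rates; the real analysis used multi-dimensional histograms of TRACE-classified events:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical yearly human-error event rates for a plant: rates fall with
# plant age (a "learning" trend), then flatten out at a mature baseline.
age = np.arange(1, 16)
rates = 8.0 * np.exp(-0.4 * age) + 1.0 + rng.normal(0.0, 0.1, age.size)

def kmeans_1d(x, k=2, n_iter=50):
    """Plain 1-D K-Means: quantile-initialized centers, alternating updates."""
    centers = np.quantile(x, np.linspace(0.1, 0.9, k))
    for _ in range(n_iter):
        labels = np.abs(x[:, None] - centers[None, :]).argmin(axis=1)
        centers = np.array([x[labels == j].mean() for j in range(k)])
    return labels, centers

# Cluster the rates into "learning" (high, decreasing) vs "mature" (low, flat)
labels, centers = kmeans_1d(rates)
learning_years = age[labels == centers.argmax()]
print("learning period: years", learning_years.min(), "to", learning_years.max())
```

The years assigned to the high-rate cluster form a contiguous early-life block, which is the kind of learning period the code in the abstract is designed to detect.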

  11. Is Article Methodological Quality Associated With Conflicts of Interest?: An Analysis of the Plastic Surgery Literature.

    Science.gov (United States)

    Cho, Brian H; Lopez, Joseph; Means, Jessica; Lopez, Sandra; Milton, Jacqueline; Tufaro, Anthony P; May, James W; Dorafshar, Amir H

    2017-12-01

    Conflicts of interest (COI) are an emerging area of discussion within the field of plastic surgery. Recently, several reports have found that research studies that disclose COI are associated with publication of positive outcomes. We hypothesize that this association is driven by higher-quality studies receiving industry funding. This study aimed to investigate the association between industry support and study methodological quality. We reviewed all entries in Plastic and Reconstructive Surgery, Annals of Plastic Surgery, and Journal of Plastic, Reconstructive, and Aesthetic Surgery within a 1-year period encompassing 2013. All clinical research articles were analyzed. Studies were evaluated blindly for methodology quality based on a validated scoring system. An ordinal logistic regression model was used to examine the association between methodology score and COI. A total of 1474 articles were reviewed, of which 483 met our inclusion criteria. These articles underwent methodological quality scoring. Conflicts of interest were reported in 28 (5.8%) of these articles. After adjusting for article characteristics in the ordinal logistic regression analysis, there was no significant association between articles with COI and higher methodological scores (P = 0.7636). Plastic surgery studies that disclose COI are not associated with higher methodological quality when compared with studies that do not disclose COI. These findings suggest that although the presence of COI is associated with positive findings, the association is not shown to be driven by higher-quality studies.

  12. Automatic Classification of Attacks on IP Telephony

    Directory of Open Access Journals (Sweden)

    Jakub Safarik

    2013-01-01

    Full Text Available This article proposes an algorithm for the automatic analysis of attack data in an IP telephony network using a neural network. Data for the analysis are gathered from various monitoring applications running in the network. Such monitoring systems are a typical part of today's networks, but the information they provide is usually examined only after an attack. Automatic classification of IP telephony attacks enables near real-time classification and counter-attack or mitigation of potential attacks. The classification uses the proposed neural network; the article covers the design of the neural network and its practical implementation, including methods for neural network learning and data-gathering functions from a honeypot application.
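A minimal stand-in for such a classifier is a one-hidden-layer network trained with gradient descent. The three features and three attack classes below are invented for the sketch; the article's honeypot data and actual network architecture are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical feature vectors derived from monitoring logs (e.g. request
# rate, distinct sources, failed-auth ratio) for three attack classes
# such as SIP scan, flood, and registration hijack.
n_per_class, n_feat, n_cls = 60, 3, 3
centers = np.array([[0., 0., 0.], [3., 3., 0.], [0., 3., 3.]])
X = np.vstack([centers[c] + rng.normal(0, 0.5, (n_per_class, n_feat))
               for c in range(n_cls)])
y = np.repeat(np.arange(n_cls), n_per_class)
onehot = np.eye(n_cls)[y]

# One hidden layer (tanh) with a softmax output, full-batch training.
W1 = rng.normal(0, 0.5, (n_feat, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, n_cls));  b2 = np.zeros(n_cls)

for _ in range(500):
    H = np.tanh(X @ W1 + b1)                 # hidden activations
    Z = H @ W2 + b2
    P = np.exp(Z - Z.max(1, keepdims=True))
    P /= P.sum(1, keepdims=True)             # softmax probabilities
    G = (P - onehot) / len(X)                # cross-entropy gradient wrt Z
    dW2 = H.T @ G; db2 = G.sum(0)
    dH = G @ W2.T * (1 - H ** 2)             # backprop through tanh
    dW1 = X.T @ dH; db1 = dH.sum(0)
    for param, grad in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
        param -= 0.5 * grad                  # in-place gradient step

pred = P.argmax(1)
print("training accuracy:", (pred == y).mean())
```

In practice the feature extraction from the monitoring/honeypot data matters far more than the particular network used here.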

  13. Mapping the Most Significant Computer Hacking Events to a Temporal Computer Attack Model

    OpenAIRE

    Heerden, Renier; Pieterse, Heloise; Irwin, Barry

    2012-01-01

    Part 4: Section 3: ICT for Peace and War; International audience; This paper presents eight of the most significant computer hacking events (also known as computer attacks). These events were selected for their unique impact, methodology, or other properties. A temporal computer attack model is presented that can be used to model computer-based attacks. This model consists of the following stages: Target Identification, Reconnaissance, Attack, and Post-Attack Reconnaissance. The...

  14. Attribution Of Cyber Attacks On Process Control Systems

    Science.gov (United States)

    Hunker, Jeffrey; Hutchinson, Robert; Margulies, Jonathan

    The attribution of cyber attacks is an important problem. Attribution gives critical infrastructure asset owners and operators legal recourse in the event of attacks and deters potential attacks. This paper discusses attribution techniques along with the associated legal and technical challenges. It presents a proposal for a voluntary network of attributable activity, an important first step towards a more complete attribution methodology for the control systems community.

  15. Plants under dual attack

    NARCIS (Netherlands)

    Ponzio, C.A.M.

    2016-01-01

    Though immobile, plants are members of complex environments and are under constant threat from a wide range of attackers, including organisms such as insect herbivores and plant pathogens. Plants have developed sophisticated defenses against these attackers, which include chemical responses such

  16. Heart attack - discharge

    Science.gov (United States)


  17. Identifying items to assess methodological quality in physical therapy trials: a factor analysis.

    Science.gov (United States)

    Armijo-Olivo, Susan; Cummings, Greta G; Fuentes, Jorge; Saltaji, Humam; Ha, Christine; Chisholm, Annabritt; Pasichnyk, Dion; Rogers, Todd

    2014-09-01

    Numerous tools and individual items have been proposed to assess the methodological quality of randomized controlled trials (RCTs). The frequency of use of these items varies according to health area, which suggests a lack of agreement regarding their relevance to trial quality or risk of bias. The objectives of this study were: (1) to identify the underlying component structure of items and (2) to determine relevant items to evaluate the quality and risk of bias of trials in physical therapy by using an exploratory factor analysis (EFA). A methodological research design was used, and an EFA was performed. Randomized controlled trials used for this study were randomly selected from searches of the Cochrane Database of Systematic Reviews. Two reviewers used 45 items gathered from 7 different quality tools to assess the methodological quality of the RCTs. An exploratory factor analysis was conducted using the principal axis factoring (PAF) method followed by varimax rotation. Principal axis factoring identified 34 items loaded on 9 common factors: (1) selection bias; (2) performance and detection bias; (3) eligibility, intervention details, and description of outcome measures; (4) psychometric properties of the main outcome; (5) contamination and adherence to treatment; (6) attrition bias; (7) data analysis; (8) sample size; and (9) control and placebo adequacy. Because of the exploratory nature of the results, a confirmatory factor analysis is needed to validate this model. To the authors' knowledge, this is the first factor analysis to explore the underlying component items used to evaluate the methodological quality or risk of bias of RCTs in physical therapy. The items and factors represent a starting point for evaluating the methodological quality and risk of bias in physical therapy trials. Empirical evidence of the association among these items with treatment effects and a confirmatory factor analysis of these results are needed to validate these items.
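The extraction-plus-rotation step can be illustrated in a few lines of NumPy. This sketch substitutes principal-component extraction for principal axis factoring and implements Kaiser's varimax rotation directly; the item scores are simulated, not the study's actual RCT quality ratings.

```python
import numpy as np

def varimax(L, n_iter=100, tol=1e-8):
    # Kaiser's varimax: find an orthogonal rotation R that maximizes the
    # variance of squared loadings, via iterative SVD updates.
    n, k = L.shape
    R = np.eye(k)
    total = 0.0
    for _ in range(n_iter):
        C = L @ R
        u, s, vt = np.linalg.svd(
            L.T @ (C ** 3 - C @ np.diag((C ** 2).sum(0)) / n))
        R = u @ vt
        if s.sum() - total < tol:
            break
        total = s.sum()
    return L @ R

# Simulated quality-item scores driven by two latent factors
# (think "selection bias" items vs. "attrition bias" items).
rng = np.random.default_rng(0)
f = rng.normal(size=(500, 2))
load_true = np.array([[.8, 0], [.7, 0], [.9, 0], [0, .8], [0, .7], [0, .9]])
items = f @ load_true.T + rng.normal(0, 0.4, (500, 6))

corr = np.corrcoef(items, rowvar=False)
vals, vecs = np.linalg.eigh(corr)           # eigenvalues in ascending order
top = vecs[:, -2:] * np.sqrt(vals[-2:])     # unrotated loadings, 2 factors
rotated = varimax(top)
print(np.round(rotated, 2))
```

After rotation each item should load mainly on one factor, which is the "simple structure" the study's nine-factor solution aims for.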

  18. Understanding information exchange during disaster response: Methodological insights from infocentric analysis

    Science.gov (United States)

    Toddi A. Steelman; Branda Nowell; Deena. Bayoumi; Sarah. McCaffrey

    2014-01-01

    We leverage economic theory, network theory, and social network analytical techniques to bring greater conceptual and methodological rigor to understand how information is exchanged during disasters. We ask, "How can information relationships be evaluated more systematically during a disaster response?" "Infocentric analysis"—a term and...

  19. An analysis of the methodological underpinnings of social learning research in natural resource Management

    NARCIS (Netherlands)

    Rodela, R.; Cundill, G.; Wals, A.E.J.

    2012-01-01

    This analysis is focused on research that uses a social learning approach to study natural resource issues. We map out the prevailing epistemological orientation of social learning research through the de-construction of the methodological choices reported in current social learning literature.

  20. A Methodology for the Analysis of Memory Response to Radiation through Bitmap Superposition and Slicing

    CERN Document Server

    Bosser, A.; Tsiligiannis, G.; Ferraro, R.; Frost, C.; Javanainen, A.; Puchner, H.; Rossi, M.; Saigne, F.; Virtanen, A.; Wrobel, F.; Zadeh, A.; Dilillo, L.

    2015-01-01

    A methodology is proposed for the statistical analysis of memory radiation test data, with the aim of identifying trends in the single-event upset (SEU) distribution. The treated case study is a 65 nm SRAM irradiated with neutrons, protons and heavy ions.

  1. Analysis of Introducing Active Learning Methodologies in a Basic Computer Architecture Course

    Science.gov (United States)

    Arbelaitz, Olatz; Martín, José I.; Muguerza, Javier

    2015-01-01

    This paper presents an analysis of introducing active methodologies in the Computer Architecture course taught in the second year of the Computer Engineering Bachelor's degree program at the University of the Basque Country (UPV/EHU), Spain. The paper reports the experience from three academic years, 2011-2012, 2012-2013, and 2013-2014, in which…

  2. Genome-wide expression studies of atherosclerosis: critical issues in methodology, analysis, interpretation of transcriptomics data

    NARCIS (Netherlands)

    Bijnens, A. P. J. J.; Lutgens, E.; Ayoubi, T.; Kuiper, J.; Horrevoets, A. J.; Daemen, M. J. A. P.

    2006-01-01

    During the past 6 years, gene expression profiling of atherosclerosis has been used to identify genes and pathways relevant in vascular (patho)physiology. This review discusses some critical issues in the methodology, analysis, and interpretation of the data of gene expression studies that have made

  3. Combining soft system methodology and pareto analysis in safety management performance assessment : an aviation case

    NARCIS (Netherlands)

    Karanikas, Nektarios

    2016-01-01

    Although reengineering is strategically advantageous for organisations in order to remain functional and sustainable, safety must remain a priority and respective efforts need to be maintained. This paper suggests the combination of soft system methodology (SSM) and Pareto analysis on the scope of

  4. A methodology for spacecraft technology insertion analysis balancing benefit, cost, and risk

    Science.gov (United States)

    Bearden, David Allen

    Emerging technologies are changing the way space missions are developed and implemented. Technology development programs are proceeding with the goal of enhancing spacecraft performance and reducing mass and cost. However, it is often the case that technology insertion assessment activities, in the interest of maximizing performance and/or mass reduction, do not consider synergistic system-level effects. Furthermore, even though technical risks are often identified as a large cost and schedule driver, many design processes ignore effects of cost and schedule uncertainty. This research is based on the hypothesis that technology selection is a problem of balancing interrelated (and potentially competing) objectives. Current spacecraft technology selection approaches are summarized, and a Methodology for Evaluating and Ranking Insertion of Technology (MERIT) that expands on these practices to attack otherwise unsolved problems is demonstrated. MERIT combines the modern techniques of technology maturity measures, parametric models, genetic algorithms, and risk assessment (cost and schedule) in a unique manner to resolve very difficult issues including: user-generated uncertainty, relationships between cost/schedule and complexity, and technology "portfolio" management. While the methodology is sufficiently generic that it may in theory be applied to a number of technology insertion problems, this research focuses on application to the specific case of small (engineering community are its: unique coupling of the aspects of performance, cost, and schedule; assessment of system level impacts of technology insertion; procedures for estimating uncertainties (risks) associated with advanced technology; and application of heuristics to facilitate informed system-level technology utilization decisions earlier in the conceptual design phase. 
MERIT extends the state of the art in technology insertion assessment selection practice and, if adopted, may aid designers in determining
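One ingredient of MERIT, genetic-algorithm search over candidate technology portfolios, can be sketched as below. The benefit, cost and risk scores, the budget, and the fitness weighting are all invented for illustration; they are not MERIT's actual models, which also couple cost and schedule uncertainty.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical candidate technologies with (benefit, cost, risk) scores.
benefit = np.array([5., 3., 4., 2., 6.])
cost    = np.array([4., 2., 3., 1., 5.])
risk    = np.array([.3, .1, .2, .05, .5])
BUDGET = 8.0

def fitness(pop):
    # Reward benefit, forbid over-budget portfolios, penalize risk.
    b, c, r = pop @ benefit, pop @ cost, pop @ risk
    return np.where(c <= BUDGET, b - 2.0 * r, -1.0)

pop = rng.integers(0, 2, size=(40, 5))      # binary selection vectors
for _ in range(60):
    f = fitness(pop)
    # Tournament selection, one-point crossover, bit-flip mutation.
    i, j = rng.integers(0, len(pop), (2, len(pop)))
    parents = np.where((f[i] > f[j])[:, None], pop[i], pop[j])
    cut = rng.integers(1, 5, len(pop))
    mask = np.arange(5)[None, :] < cut[:, None]
    children = np.where(mask, parents, np.roll(parents, 1, axis=0))
    flip = rng.random(children.shape) < 0.05
    pop = np.where(flip, 1 - children, children)

best = pop[np.argmax(fitness(pop))]
print("selected portfolio:", best, "fitness:", fitness(pop).max())
```

A real assessment would replace the scalar fitness with the multi-objective benefit/cost/risk balance the methodology describes.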

  5. Quantitative Verification and Synthesis of Attack-Defence Scenarios

    DEFF Research Database (Denmark)

    Aslanyan, Zaruhi; Nielson, Flemming; Parker, David

    2016-01-01

    Attack-defence trees are a powerful technique for formally evaluating attack-defence scenarios. They represent in an intuitive, graphical way the interaction between an attacker and a defender who compete in order to achieve conflicting objectives. We propose a novel framework for the formal analysis of quantitative properties of complex attack-defence scenarios, using an extension of attack-defence trees which models temporal ordering of actions and allows explicit dependencies in the strategies adopted by attackers and defenders. We adopt a game-theoretic approach, translating attack-defence trees to two-player stochastic games, and then employ probabilistic model checking techniques to formally analyse these models. This provides a means to both verify formally specified security properties of the attack-defence scenarios and, dually, to synthesise strategies for attackers or defenders...
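For intuition, success probabilities can be propagated bottom-up through a small attack-defence tree, assuming independent basic actions with fixed probabilities. The paper's framework goes much further (stochastic games, temporal ordering, model checking), which this sketch does not attempt; the scenario and probabilities are invented.

```python
# Minimal bottom-up evaluation of an attack-defence tree.

def evaluate(node):
    kind = node["kind"]
    if kind == "leaf":
        return node["prob"]
    probs = [evaluate(c) for c in node["children"]]
    if kind == "and":            # all sub-goals must succeed
        p = 1.0
        for q in probs:
            p *= q
        return p
    if kind == "or":             # any sub-goal suffices
        fail = 1.0
        for q in probs:
            fail *= 1.0 - q
        return 1.0 - fail
    if kind == "counter":        # attack succeeds only if the defence fails
        attack, defence = probs
        return attack * (1.0 - defence)
    raise ValueError(kind)

# Hypothetical scenario: steal credentials by phishing OR keylogging,
# countered by the defender deploying two-factor authentication.
tree = {"kind": "counter", "children": [
    {"kind": "or", "children": [
        {"kind": "leaf", "prob": 0.4},   # phishing succeeds
        {"kind": "leaf", "prob": 0.2},   # keylogger succeeds
    ]},
    {"kind": "leaf", "prob": 0.7},       # 2FA deployed and effective
]}
print("attack success probability:", evaluate(tree))
```

Here the OR node yields 1 − 0.6 × 0.8 = 0.52, and the counter-measure scales it by 0.3, giving 0.156.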

  6. An Adaptive Approach for Defending against DDoS Attacks

    Directory of Open Access Journals (Sweden)

    Muhai Li

    2010-01-01

    Full Text Available Among network attacks, the Distributed Denial-of-Service (DDoS) attack is a severe threat. To deal with this kind of attack in time, it is necessary to establish a special type of defense system that changes its strategy dynamically as the attack evolves. In this paper, we introduce an adaptive approach for defending against DDoS attacks, based on normal traffic analysis. The approach can detect DDoS attacks and adaptively adjust its configuration according to network conditions and attack severity. To ensure that legitimate users can still reach a victim server under attack, we provide a nonlinear traffic control formula for the system. Our simulation tests indicate that the nonlinear control approach can effectively filter malicious attack packets while allowing legitimate traffic flows to reach the victim.
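The paper's exact control formula is not reproduced here, but the shape of such a nonlinear throttle can be illustrated: admit nearly all traffic at low estimated severity and cut admission steeply as severity rises. The exponential form and the gain `k` are assumptions of this sketch.

```python
import math

def pass_fraction(severity, k=4.0):
    # Illustrative nonlinear control law (not the paper's formula): the
    # fraction of traffic admitted decays exponentially with the
    # estimated attack severity in [0, 1], so mild anomalies are barely
    # throttled while severe floods are cut sharply.
    return math.exp(-k * severity)

for s in (0.0, 0.2, 0.5, 0.9):
    print(f"severity={s:.1f} -> admit {pass_fraction(s):.1%} of traffic")
```

The severity input would come from comparing observed traffic against the learned normal-traffic profile.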

  7. Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data

    KAUST Repository

    Tekwe, C. D.

    2012-05-24

    MOTIVATION: Protein abundance in quantitative proteomics is often based on observed spectral features derived from liquid chromatography mass spectrometry (LC-MS) or LC-MS/MS experiments. Peak intensities are largely non-normal in distribution. Furthermore, LC-MS-based proteomics data frequently have large proportions of missing peak intensities due to censoring mechanisms on low-abundance spectral features. Recognizing that the observed peak intensities detected with the LC-MS method are all positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques, including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon-Mann-Whitney rank sum tests, as well as parametric survival models and accelerated failure time (AFT) models with log-normal, log-logistic and Weibull distributions, were used to detect any differentially expressed proteins. The statistical operating characteristics of each method are explored using both real and simulated datasets. RESULTS: Survival methods generally have greater statistical power than standard differential expression methods when the proportion of missing protein level data is 5% or more. In particular, the AFT models we consider consistently achieve greater statistical power than standard testing procedures, with the discrepancy widening as the proportion of missing data increases. AVAILABILITY: The testing procedures discussed in this article can all be performed using readily available software such as R. The R codes are provided as supplemental materials. CONTACT: ctekwe@stat.tamu.edu.
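The effect of left-censoring on a simple two-sample comparison can be reproduced with simulated log-normal intensities. The detection limit, effect size and sample sizes below are invented, and the Kolmogorov-Smirnov statistic is computed directly rather than via the R routines the authors provide.

```python
import numpy as np

def ks_statistic(a, b):
    # Two-sample Kolmogorov-Smirnov statistic: the maximum gap between
    # the two empirical CDFs, evaluated over the pooled sample.
    pooled = np.sort(np.concatenate([a, b]))
    Fa = np.searchsorted(np.sort(a), pooled, side="right") / len(a)
    Fb = np.searchsorted(np.sort(b), pooled, side="right") / len(b)
    return np.abs(Fa - Fb).max()

rng = np.random.default_rng(3)

# Hypothetical log-normal peak intensities for two conditions, with
# left-censoring: intensities below a detection limit are reported at
# the limit, mimicking missing low-abundance spectral features.
limit = 1.0
ctrl = np.exp(rng.normal(0.0, 1.0, 200))
case = np.exp(rng.normal(0.8, 1.0, 200))   # up-regulated protein
ctrl_obs = np.maximum(ctrl, limit)
case_obs = np.maximum(case, limit)

print("KS statistic on censored data:",
      round(ks_statistic(ctrl_obs, case_obs), 3))
```

An AFT model would instead treat the censored values as interval information rather than substituting the detection limit, which is where the power gain reported in the abstract comes from.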

  8. Systemic design methodologies for electrical energy systems analysis, synthesis and management

    CERN Document Server

    Roboam, Xavier

    2012-01-01

    This book proposes systemic design methodologies applied to electrical energy systems, in particular analysis and system management, modeling and sizing tools. It includes 8 chapters: after an introduction to the systemic approach (history, basics & fundamental issues, index terms) for designing energy systems, this book presents two different graphical formalisms especially dedicated to multidisciplinary devices modeling, synthesis and analysis: Bond Graph and COG/EMR. Other systemic analysis approaches for quality and stability of systems, as well as for safety and robustness analysis tools are also proposed. One chapter is dedicated to energy management and another is focused on Monte Carlo algorithms for electrical systems and networks sizing. The aim of this book is to summarize design methodologies based in particular on a systemic viewpoint, by considering the system as a whole. These methods and tools are proposed by the most important French research laboratories, which have many scientific partn...

  9. Critical analysis of the Bennett-Riedel attack on secure cryptographic key distributions via the Kirchhoff-Law-Johnson-noise scheme.

    Directory of Open Access Journals (Sweden)

    Laszlo B Kish

    Full Text Available Recently, Bennett and Riedel (BR) (http://arxiv.org/abs/1303.7435v1) argued that thermodynamics is not essential in the Kirchhoff-law-Johnson-noise (KLJN) classical physical cryptographic exchange method in an effort to disprove the security of the KLJN scheme. They attempted to demonstrate this by introducing a dissipation-free deterministic key exchange method with two batteries and two switches. In the present paper, we first show that BR's scheme is unphysical and that some elements of its assumptions violate basic protocols of secure communication. All our analyses are based on a technically unlimited Eve with infinitely accurate and fast measurements limited only by the laws of physics and statistics. For non-ideal situations and at active (invasive) attacks, the uncertainty principle between measurement duration and statistical errors makes it impossible for Eve to extract the key regardless of the accuracy or speed of her measurements. To show that thermodynamics and noise are essential for the security, we crack the BR system with 100% success via passive attacks, in ten different ways, and demonstrate that the same cracking methods do not function for the KLJN scheme that employs Johnson noise to provide security underpinned by the Second Law of Thermodynamics. We also present a critical analysis of some other claims by BR; for example, we prove that their equations for describing zero security do not apply to the KLJN scheme. Finally we give mathematical security proofs for each BR-attack against the KLJN scheme and conclude that the information theoretic (unconditional) security of the KLJN method has not been successfully challenged.

  10. Critical analysis of the Bennett-Riedel attack on secure cryptographic key distributions via the Kirchhoff-Law-Johnson-noise scheme.

    Science.gov (United States)

    Kish, Laszlo B; Abbott, Derek; Granqvist, Claes G

    2013-01-01

    Recently, Bennett and Riedel (BR) (http://arxiv.org/abs/1303.7435v1) argued that thermodynamics is not essential in the Kirchhoff-law-Johnson-noise (KLJN) classical physical cryptographic exchange method in an effort to disprove the security of the KLJN scheme. They attempted to demonstrate this by introducing a dissipation-free deterministic key exchange method with two batteries and two switches. In the present paper, we first show that BR's scheme is unphysical and that some elements of its assumptions violate basic protocols of secure communication. All our analyses are based on a technically unlimited Eve with infinitely accurate and fast measurements limited only by the laws of physics and statistics. For non-ideal situations and at active (invasive) attacks, the uncertainty principle between measurement duration and statistical errors makes it impossible for Eve to extract the key regardless of the accuracy or speed of her measurements. To show that thermodynamics and noise are essential for the security, we crack the BR system with 100% success via passive attacks, in ten different ways, and demonstrate that the same cracking methods do not function for the KLJN scheme that employs Johnson noise to provide security underpinned by the Second Law of Thermodynamics. We also present a critical analysis of some other claims by BR; for example, we prove that their equations for describing zero security do not apply to the KLJN scheme. Finally we give mathematical security proofs for each BR-attack against the KLJN scheme and conclude that the information theoretic (unconditional) security of the KLJN method has not been successfully challenged.

  11. Methodology for national risk analysis and prioritization of toxic industrial chemicals.

    Science.gov (United States)

    Taxell, Piia; Engström, Kerstin; Tuovila, Juha; Söderström, Martin; Kiljunen, Harri; Vanninen, Paula; Santonen, Tiina

    2013-01-01

    The identification of chemicals that pose the greatest threat to human health from incidental releases is a cornerstone in public health preparedness for chemical threats. The present study developed and applied a methodology for the risk analysis and prioritization of industrial chemicals to identify the most significant chemicals that pose a threat to public health in Finland. The prioritization criteria included acute and chronic health hazards, physicochemical and environmental hazards, national production and use quantities, the physicochemical properties of the substances, and the history of substance-related incidents. The presented methodology enabled a systematic review and prioritization of industrial chemicals for the purpose of national public health preparedness for chemical incidents.
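A prioritization of this kind reduces, at its simplest, to a weighted multi-criteria score. The chemicals, sub-scores and weights below are illustrative placeholders, not values from the Finnish study.

```python
# Illustrative priority score over (hypothetical) normalized sub-scores
# in [0, 1] for criteria similar to the study's: acute and chronic health
# hazard, physicochemical/environmental hazard, production volume,
# volatility, and incident history. Weights are invented for the sketch.

WEIGHTS = {"acute": 0.3, "chronic": 0.2, "physchem": 0.15,
           "volume": 0.15, "volatility": 0.1, "incidents": 0.1}

def priority(scores):
    # Weighted sum; missing criteria contribute zero.
    return sum(WEIGHTS[k] * scores.get(k, 0.0) for k in WEIGHTS)

chemicals = {
    "chlorine":      {"acute": 0.9, "chronic": 0.3, "physchem": 0.6,
                      "volume": 0.8, "volatility": 1.0, "incidents": 0.7},
    "ammonia":       {"acute": 0.7, "chronic": 0.2, "physchem": 0.5,
                      "volume": 0.9, "volatility": 0.9, "incidents": 0.6},
    "sulfuric acid": {"acute": 0.6, "chronic": 0.2, "physchem": 0.4,
                      "volume": 1.0, "volatility": 0.1, "incidents": 0.3},
}
ranking = sorted(chemicals, key=lambda c: priority(chemicals[c]), reverse=True)
print(ranking)
```

The study's contribution lies in how the sub-scores are derived from national production/use data and hazard classifications, not in the arithmetic itself.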

  12. THEORETICAL AND METHODOLOGICAL PRINCIPLES OF THE STRATEGIC FINANCIAL ANALYSIS OF CAPITAL

    Directory of Open Access Journals (Sweden)

    Olha KHUDYK

    2016-07-01

    Full Text Available The article is devoted to the theoretical and methodological principles of the strategic financial analysis of capital. The necessity of strategic financial analysis of capital as a methodological basis for developing strategies is proved under modern conditions of high dynamism, uncertainty and risk. The methodological elements of the strategic financial analysis of capital (the object of investigation, the indicators, the factors, the methods of study, the subjects of analysis, the sources of incoming and outgoing information) are justified in the system of financial management, allowing its theoretical foundations to be improved. It is proved that the strategic financial analysis of capital is a continuous process, carried out in an appropriate sequence at each stage of capital circulation. The system of indexes is substantiated, based on the needs of the strategic financial analysis. The classification of factors determining the size and structure of a company’s capital is grounded. The economic nature of the capital of the company is clarified. We consider that capital is a stock of economic resources in the form of cash, tangible and intangible assets accumulated by savings, which is used by its owner as a factor of production and an investment resource in the economic process in order to obtain profit, to ensure the growth of the owners’ prosperity and to achieve a social effect.

  13. An efficient methodology for the analysis of primary frequency control of electric power systems

    Energy Technology Data Exchange (ETDEWEB)

    Popovic, D.P. [Nikola Tesla Institute, Belgrade (Yugoslavia); Mijailovic, S.V. [Electricity Coordinating Center, Belgrade (Yugoslavia)

    2000-06-01

    The paper presents an efficient methodology for the analysis of primary frequency control of electric power systems. This methodology continuously monitors electromechanical transient processes lasting up to 30 s that occur after characteristic disturbances. It covers the period of short-term dynamic processes appearing immediately after the disturbance, in which the dynamics of the individual synchronous machines are dominant, as well as the subsequent period with uniform movement of all generators and restoration of their voltages. The characteristics of the developed methodology were determined using the example of the real electric power interconnection formed by the electric power systems of Yugoslavia, a part of the Republic of Srpska, Romania, Bulgaria, the former Yugoslav Republic of Macedonia, Greece and Albania (the second UCPTE synchronous zone). (author)

  14. Rotational Rebound Attacks on Reduced Skein

    DEFF Research Database (Denmark)

    Khovratovich, Dmitry; Nikolić, Ivica; Rechberger, Christian

    2014-01-01

    In this paper we combine two powerful methods of symmetric cryptanalysis: rotational cryptanalysis and the rebound attack. Rotational cryptanalysis was designed for the analysis of bit-oriented designs like ARX (Addition-Rotation-XOR) schemes. It has been applied to several hash functions and block ciphers, including the new standard SHA-3 (Keccak). The rebound attack is a start-from-the-middle approach for finding differential paths and conforming pairs in byte-oriented designs like Substitution-Permutation networks and AES. We apply our new compositional attack to a reduced version of the hash function Skein, a finalist of the SHA-3 competition. Our attack penetrates more than two thirds of the Skein core (the cipher Threefish) and led the designers to change the submission in order to prevent it. The rebound part of our attack has been significantly enhanced to deliver results on the largest...

  15. Two Improved Multiple-Differential Collision Attacks

    Directory of Open Access Journals (Sweden)

    An Wang

    2014-01-01

    Full Text Available At CHES 2008, Bogdanov proposed multiple-differential collision attacks, which can be applied to power analysis attacks on practical cryptographic systems. However, due to the effect of countermeasures on FPGA, there are some difficulties during collision detection, such as locally high noise and a lack of sampling points. In this paper, a keypoint voting test is proposed to solve these problems; it can increase the success ratio from 35% to 95% in the case of one example implementation. Furthermore, we improve Bogdanov's ternary voting test, which markedly improves experimental efficiency. Our experiments show that the number of power traces required by our attack is only a quarter of that required by the traditional attack. Finally, some alternative countermeasures against our attacks are discussed.
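The idea of voting across keypoints to stabilise collision detection under noise can be sketched as follows. The leakage model, window positions and threshold are invented, and the statistic is ordinary windowed correlation rather than the authors' exact test.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic power traces: two computations processing the same
# intermediate value (a "collision") share a leakage signal; a third
# processes a different value. All traces carry strong noise.
signal = rng.normal(size=400)
other  = rng.normal(size=400)
t1 = signal + rng.normal(0, 1.0, 400)
t2 = signal + rng.normal(0, 1.0, 400)
t3 = other  + rng.normal(0, 1.0, 400)

def vote_collision(a, b, n_keypoints=10, width=40, thresh=0.2):
    # Keypoint voting (illustrative): correlate the traces over short
    # windows around selected keypoints and declare a collision if a
    # majority of windows agree, which suppresses locally high noise.
    votes = 0
    for start in np.linspace(0, len(a) - width, n_keypoints, dtype=int):
        wa, wb = a[start:start + width], b[start:start + width]
        r = np.corrcoef(wa, wb)[0, 1]
        votes += r > thresh
    return votes > n_keypoints // 2

print("t1 vs t2 (true collision):", bool(vote_collision(t1, t2)))
print("t1 vs t3 (no collision):  ", bool(vote_collision(t1, t3)))
```

A single noisy window can easily mislead a plain correlation test; the majority vote is what buys the robustness the abstract describes.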

  16. Methodology for Risk Analysis of Dam Gates and Associated Operating Equipment Using Fault Tree Analysis

    National Research Council Canada - National Science Library

    Patev, Robert C; Putcha, Chandra; Foltz, Stuart D

    2005-01-01

    .... This report summarizes research on methodologies to assist in quantifying risks related to dam gates and associated operating equipment, and how those risks relate to overall spillway failure risk...

  17. Nocturnal panic attacks

    Directory of Open Access Journals (Sweden)

    Lopes Fabiana L.

    2002-01-01

    Full Text Available The panic-respiration connection has been presented with increasing evidence in the literature. We report three panic disorder patients with nocturnal panic attacks with prominent respiratory symptoms, the overlap of these symptoms with the sleep apnea syndrome, and a change in the diurnal panic attacks from a spontaneous to a situational pattern. The implications of these findings and awareness of the distinct core symptoms of nocturnal panic attacks may help to differentiate them from sleep disorders and guide the search for specific treatment.

  18. Methodology for the analysis of dietary data from the Mexican National Health and Nutrition Survey 2006.

    Science.gov (United States)

    Rodríguez-Ramírez, Sonia; Mundo-Rosas, Verónica; Jiménez-Aguilar, Alejandra; Shamah-Levy, Teresa

    2009-01-01

    To describe the methodology for the analysis of dietary data from the Mexican National Health and Nutrition Survey 2006 (ENSANUT 2006) carried out in Mexico. Dietary data from the population who participated in the ENSANUT 2006 were collected through a 7-day food-frequency questionnaire. Energy and nutrient intake of each food consumed and adequacy percentage by day were also estimated. Intakes and adequacy percentages > 5 SDs from the energy and nutrient general distribution and observations with energy adequacy percentages < 25% were excluded from the analysis. Valid dietary data were obtained from 3552 children aged 1 to 4 years, 8716 children aged 5 to 11 years, 8442 adolescents, 15951 adults, and 3357 older adults. It is important to detail the methodology for the analysis of dietary data to standardize data cleaning criteria and to be able to compare the results of different studies.
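The stated exclusion rules translate directly into a vectorised filter. The adequacy values below are simulated; the ENSANUT 2006 data are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical per-subject energy adequacy percentages (observed intake
# as % of the recommendation), with a few implausible records mixed in.
adequacy = np.concatenate([rng.normal(100, 25, 500), [5.0, 980.0, 12.0]])

# Cleaning rules from the survey methodology: drop observations more
# than 5 SDs from the distribution, and those below 25% adequacy.
mean, sd = adequacy.mean(), adequacy.std()
keep = (np.abs(adequacy - mean) <= 5 * sd) & (adequacy >= 25)
clean = adequacy[keep]

print(f"kept {keep.sum()} of {len(adequacy)} records")
```

In the survey the same rule is applied per nutrient as well as for energy; only the threshold values change.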

  19. THEORETICAL AND METHODOLOGICAL PRINCIPLES OF THE STRATEGIC FINANCIAL ANALYSIS OF CAPITAL

    Directory of Open Access Journals (Sweden)

    Olha KHUDYK

    2016-07-01

    Full Text Available The article is devoted to the theoretical and methodological principles of strategic financial analysis of capital. The necessity of strategic financial analysis of capital as a methodological basis for developing strategies is proved under modern conditions of high dynamism, uncertainty and risk. The methodological elements of the strategic financial analysis of capital (the object of investigation, the indicators, the factors, the methods of study, the subjects of analysis, the sources of incoming and outgoing information) are justified in the system of financial management, allowing its theoretical foundations to be improved. It is proved that the strategic financial analysis of capital is a continuous process, carried out in an appropriate sequence at each stage of capital circulation. The system of indexes is substantiated, based on the needs of the strategic financial analysis. The classification of factors determining the size and structure of a company’s capital is grounded. The economic nature of the capital of the company is clarified. We consider that capital is a stock of economic resources in the form of cash, tangible and intangible assets accumulated by savings, which is used by its owner as a factor of production and an investment resource in the economic process in order to obtain profit, to ensure the growth of the owners’ prosperity and to achieve a social effect.

  20. Methodological choices for research in Information Science: Contributions to domain analysis

    Directory of Open Access Journals (Sweden)

    Juliana Lazzarotto FREITAS

    Full Text Available Abstract The article focuses on ways of organizing studies according to their methodological choices in the Base Referencial de Artigos de Periódicos em Ciência da Informação (Reference Database of Journal Articles in Information Science). We highlight how organizing scientific production in Information Science by methodological choices contributes to identifying its production features and to domain analysis. We studied research categories and proposed five classification criteria: research purposes, approaches, focus, techniques and type of analysis. The proposal is empirically applied to a corpus in Information Science of 689 articles, 10% of the production indexed in the Base Referencial de Artigos de Periódicos em Ciência da Informação from 1972 to 2010. We adopt content analysis to interpret the methodological choices of the authors identified in the corpus. The results point out that exploratory studies predominate when considering research purpose; regarding research approach, bibliographic and documentary studies predominate; systematic observation, questionnaires and interviews were the most widely used techniques; document analysis and content analysis are the most widely used types of analysis; and theoretical, historical and bibliometric research foci predominate. We found that some studies use two methodological choices and explicit epistemological approaches, such as studies following the positivist approach in the 1970s and those influenced by the phenomenological approach in the 1980s, which increased the use of qualitative research methods.

  1. A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis.

    Science.gov (United States)

    Guardia, Gabriela D A; Pires, Luís Ferreira; Vêncio, Ricardo Z N; Malmegrim, Kelen C R; de Farias, Cléver R G

    2015-01-01

    Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNA-Seq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis.

  2. A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis.

    Directory of Open Access Journals (Sweden)

    Gabriela D A Guardia

Full Text Available Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNASeq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis.

  3. A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis

    Science.gov (United States)

    Guardia, Gabriela D. A.; Pires, Luís Ferreira; Vêncio, Ricardo Z. N.; Malmegrim, Kelen C. R.; de Farias, Cléver R. G.

    2015-01-01

    Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNASeq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis. PMID:26207740
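To make the service-oriented setup described in the abstract above concrete, here is a minimal sketch of composing a request to one such RESTful analysis service. The GEAS repository URL is taken from the abstract, but the `services/{tool}` path, the parameter names, and the payload layout are illustrative assumptions, not the published API.

```python
import json
from urllib.parse import urljoin

# Base URL from the abstract; everything below it is a hypothetical layout.
BASE = "http://dcm.ffclrp.usp.br/lssb/geas/"

def build_analysis_request(tool, dataset_uri, params):
    """Compose the URL and JSON body for invoking one RESTful analysis service."""
    url = urljoin(BASE, f"services/{tool}")
    body = {"input": dataset_uri, "parameters": params}
    return url, json.dumps(body)

url, body = build_analysis_request(
    "normalization",                             # hypothetical tool name
    "http://example.org/data/microarray-123",    # hypothetical dataset URI
    {"method": "quantile"})
```

In a real client the body would then be POSTed (e.g. with `urllib.request`) and the response polled for the analysis result.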

  4. A quantitative flood risk analysis methodology for urban areas with integration of social research data

    Science.gov (United States)

    Escuder-Bueno, I.; Castillo-Rodríguez, J. T.; Zechner, S.; Jöbstl, C.; Perales-Momparler, S.; Petaccia, G.

    2012-09-01

    Risk analysis has become a top priority for authorities and stakeholders in many European countries, with the aim of reducing flooding risk, considering the population's needs and improving risk awareness. Within this context, two methodological pieces have been developed in the period 2009-2011 within the SUFRI project (Sustainable Strategies of Urban Flood Risk Management with non-structural measures to cope with the residual risk, 2nd ERA-Net CRUE Funding Initiative). First, the "SUFRI Methodology for pluvial and river flooding risk assessment in urban areas to inform decision-making" provides a comprehensive and quantitative tool for flood risk analysis. Second, the "Methodology for investigation of risk awareness of the population concerned" presents the basis to estimate current risk from a social perspective and identify tendencies in the way floods are understood by citizens. Outcomes of both methods are integrated in this paper with the aim of informing decision making on non-structural protection measures. The results of two case studies are shown to illustrate practical applications of this developed approach. The main advantage of applying the methodology herein presented consists in providing a quantitative estimation of flooding risk before and after investing in non-structural risk mitigation measures. It can be of great interest for decision makers as it provides rational and solid information.
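The before/after comparison described in the abstract above can be reduced to a toy expected-annual-damage calculation. This is a generic risk sketch with invented numbers, not the SUFRI computation itself.

```python
def annual_flood_risk(events):
    """Expected annual damage: sum of P(event) * consequence over flood scenarios.
    `events` is a list of (annual_exceedance_probability, damage_in_currency) pairs.
    A generic sketch of the quantitative risk notion, not the SUFRI method itself."""
    return sum(p * c for p, c in events)

# Hypothetical scenarios for three return periods (10, 100, 1000 years).
before = [(0.1, 5e6), (0.01, 2e7), (0.001, 8e7)]
# Hypothetical effect of non-structural measures (early warning, risk awareness):
# exceedance probabilities unchanged, consequences reduced.
after = [(0.1, 2e6), (0.01, 1.2e7), (0.001, 6e7)]

reduction = annual_flood_risk(before) - annual_flood_risk(after)
```

The single number per alternative is what lets a decision maker weigh non-structural measures against their cost.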

  5. Data development technical support document for the aircraft crash risk analysis methodology (ACRAM) standard

    Energy Technology Data Exchange (ETDEWEB)

    Kimura, C.Y.; Glaser, R.E.; Mensing, R.W.; Lin, T.; Haley, T.A.; Barto, A.B.; Stutzke, M.A.

    1996-08-01

The Aircraft Crash Risk Analysis Methodology (ACRAM) Panel has been formed by the US Department of Energy Office of Defense Programs (DOE/DP) for the purpose of developing a standard methodology for determining the risk from aircraft crashes onto DOE ground facilities. In order to accomplish this goal, the ACRAM panel has been divided into four teams: the data development team, the model evaluation team, the structural analysis team, and the consequence team. Each team, consisting of at least one member of the ACRAM Panel plus additional DOE and DOE contractor personnel, specializes in the development of the methodology assigned to that team. This report documents the work performed by the data development team and provides the technical basis for the data used by the ACRAM Standard for determining the aircraft crash frequency. This report should be used to provide the generic data needed to calculate the aircraft crash frequency into the facility under consideration as part of the process for determining the aircraft crash risk to ground facilities as given by the DOE Standard Aircraft Crash Risk Assessment Methodology (ACRAM). Some broad guidance is presented on how to obtain the needed site-specific and facility-specific data, but this data is not provided by this document.
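As a rough illustration of how generic crash data feeds a frequency estimate, the sketch below uses a four-factor product (flights per year × crash rate per flight × local crash density × effective facility area) of the kind associated with the DOE aircraft crash standard. The traffic counts, rates, densities, and areas are invented, and the real methodology sums over aircraft categories, flight phases, and site geometry in far more detail.

```python
def crash_frequency(sources):
    """Aircraft crash impact frequency onto a facility, sketched as
    F = sum(N * P * f * A) over traffic sources:
      N : flights per year
      P : crashes per flight
      f : local crash density near the site (per km^2)
      A : effective facility target area (km^2)
    All numbers below are illustrative, not data from the report."""
    return sum(N * P * f * A for N, P, f, A in sources)

sources = [
    (50_000, 1e-7, 0.02, 0.01),   # hypothetical general aviation traffic
    (10_000, 5e-8, 0.01, 0.01),   # hypothetical commercial traffic
]
F = crash_frequency(sources)      # expected crashes onto the facility per year
```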

  6. Summary of the Supplemental Model Reports Supporting the Disposal Criticality Analysis Methodology Topical Report

    International Nuclear Information System (INIS)

    Brownson, D. A.

    2002-01-01

The Department of Energy (DOE) Office of Civilian Radioactive Waste Management (OCRWM) has committed to a series of model reports documenting the methodology to be utilized in the Disposal Criticality Analysis Methodology Topical Report (YMP 2000). These model reports detail and provide validation of the methodology to be utilized for criticality analyses related to: (1) Waste form/waste package degradation; (2) Waste package isotopic inventory; (3) Criticality potential of degraded waste form/waste package configurations (effective neutron multiplication factor); (4) Probability of criticality (for each potential critical configuration as well as total event); and (5) Criticality consequences. The purpose of this summary report is to provide a status of the model reports and a schedule for their completion. This report also provides information relative to the model report content and validation. The model reports and their revisions are being generated as a result of: (1) Commitments made in the Disposal Criticality Analysis Methodology Topical Report (YMP 2000); (2) Open Items from the Safety Evaluation Report (Reamer 2000); (3) Key Technical Issue agreements made during the DOE/U.S. Nuclear Regulatory Commission (NRC) Technical Exchange Meeting (Reamer and Williams 2000); and (4) NRC requests for additional information (Schlueter 2002)

  7. A quantitative flood risk analysis methodology for urban areas with integration of social research data

    Directory of Open Access Journals (Sweden)

    I. Escuder-Bueno

    2012-09-01

Full Text Available Risk analysis has become a top priority for authorities and stakeholders in many European countries, with the aim of reducing flooding risk, considering the population's needs and improving risk awareness. Within this context, two methodological pieces have been developed in the period 2009–2011 within the SUFRI project (Sustainable Strategies of Urban Flood Risk Management with non-structural measures to cope with the residual risk, 2nd ERA-Net CRUE Funding Initiative). First, the "SUFRI Methodology for pluvial and river flooding risk assessment in urban areas to inform decision-making" provides a comprehensive and quantitative tool for flood risk analysis. Second, the "Methodology for investigation of risk awareness of the population concerned" presents the basis to estimate current risk from a social perspective and identify tendencies in the way floods are understood by citizens. Outcomes of both methods are integrated in this paper with the aim of informing decision making on non-structural protection measures. The results of two case studies are shown to illustrate practical applications of this developed approach. The main advantage of applying the methodology herein presented consists in providing a quantitative estimation of flooding risk before and after investing in non-structural risk mitigation measures. It can be of great interest for decision makers as it provides rational and solid information.

  8. Data development technical support document for the aircraft crash risk analysis methodology (ACRAM) standard

    International Nuclear Information System (INIS)

    Kimura, C.Y.; Glaser, R.E.; Mensing, R.W.; Lin, T.; Haley, T.A.; Barto, A.B.; Stutzke, M.A.

    1996-01-01

The Aircraft Crash Risk Analysis Methodology (ACRAM) Panel has been formed by the US Department of Energy Office of Defense Programs (DOE/DP) for the purpose of developing a standard methodology for determining the risk from aircraft crashes onto DOE ground facilities. In order to accomplish this goal, the ACRAM panel has been divided into four teams: the data development team, the model evaluation team, the structural analysis team, and the consequence team. Each team, consisting of at least one member of the ACRAM Panel plus additional DOE and DOE contractor personnel, specializes in the development of the methodology assigned to that team. This report documents the work performed by the data development team and provides the technical basis for the data used by the ACRAM Standard for determining the aircraft crash frequency. This report should be used to provide the generic data needed to calculate the aircraft crash frequency into the facility under consideration as part of the process for determining the aircraft crash risk to ground facilities as given by the DOE Standard Aircraft Crash Risk Assessment Methodology (ACRAM). Some broad guidance is presented on how to obtain the needed site-specific and facility-specific data, but this data is not provided by this document

  9. Heart Attack Payment - State

    Data.gov (United States)

    U.S. Department of Health & Human Services — Payment for heart attack patients measure – state data. This data set includes state-level data for payments associated with a 30-day episode of care for heart...

  10. Heart Attack Payment - Hospital

    Data.gov (United States)

    U.S. Department of Health & Human Services — Payment for heart attack patients measure – provider data. This data set includes provider data for payments associated with a 30-day episode of care for heart...

  11. Heart Attack Payment - National

    Data.gov (United States)

    U.S. Department of Health & Human Services — Payment for heart attack patients measure – national data. This data set includes national-level data for payments associated with a 30-day episode of care for heart...

  12. Transient Ischemic Attack

    Medline Plus

    Full Text Available ... TIA , or transient ischemic attack, is a "mini stroke" that occurs when a blood clot blocks an ... a short time. The only difference between a stroke and TIA is that with TIA the blockage ...

  13. Facial Dog Attack Injuries

    OpenAIRE

    Lin, Wei; Patil, Pavan Manohar

    2013-01-01

    The exposed position of the face makes it vulnerable to dog bite injuries. This fact combined with the short stature of children makes them a high-risk group for such attacks. In contrast to wounds inflicted by assaults and accidents, dog bite wounds are deep puncture type wounds compounded by the presence of pathologic bacteria from the saliva of the attacking dog. This, combined with the presence of crushed, devitalized tissue makes these wounds highly susceptible to infection. Key to succe...

  14. Methodological Choices in Muscle Synergy Analysis Impact Differentiation of Physiological Characteristics Following Stroke

    Directory of Open Access Journals (Sweden)

    Caitlin L. Banks

    2017-08-01

Full Text Available Muscle synergy analysis (MSA) is a mathematical technique that reduces the dimensionality of electromyographic (EMG) data. Used increasingly in biomechanics research, MSA requires methodological choices at each stage of the analysis. Differences in methodological steps affect the overall outcome, making it difficult to compare results across studies. We applied MSA to EMG data collected from individuals post-stroke identified as either responders (RES) or non-responders (nRES) on the basis of a critical post-treatment increase in walking speed. Importantly, no clinical or functional indicators identified differences between the cohort of RES and nRES at baseline. For this exploratory study, we selected the five highest RES and five lowest nRES available from a larger sample. Our goal was to assess how the methodological choices made before, during, and after MSA affect the ability to differentiate two groups with intrinsic physiologic differences based on MSA results. We investigated 30 variations in MSA methodology to determine which choices allowed differentiation of RES from nRES at baseline. Trial-to-trial variability in time-independent synergy vectors (SVs) and time-varying neural commands (NCs) was measured as a function of: (1) number of synergies computed; (2) EMG normalization method before MSA; (3) whether SVs were held constant across trials or allowed to vary during MSA; and (4) synergy analysis output normalization method after MSA. MSA methodology had a strong effect on our ability to differentiate RES from nRES at baseline. Across all 10 individuals and MSA variations, two synergies were needed to reach an average of 90% variance accounted for (VAF). Based on effect sizes, differences in SV and NC variability between groups were greatest using two synergies with SVs that varied from trial-to-trial. Differences in SV variability were clearest using unit magnitude per trial EMG normalization, while NC variability was less sensitive to EMG
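The synergy extraction underlying this kind of study is typically a non-negative matrix factorization of the EMG matrix. The sketch below shows a generic multiplicative-update NMF together with the variance-accounted-for (VAF) criterion mentioned in the abstract; it runs on synthetic data and is an illustration of the technique, not the study's pipeline or any of its 30 methodological variations.

```python
import numpy as np

rng = np.random.default_rng(0)

def extract_synergies(emg, n_syn, n_iter=500):
    """Factor EMG (time x muscles) as W @ H: W holds time-varying neural
    commands, H the time-independent synergy vectors. Generic Lee-Seung
    style multiplicative updates for the squared-error objective."""
    n_t, n_m = emg.shape
    W = rng.random((n_t, n_syn)) + 1e-6
    H = rng.random((n_syn, n_m)) + 1e-6
    for _ in range(n_iter):
        H *= (W.T @ emg) / (W.T @ W @ H + 1e-12)
        W *= (emg @ H.T) / (W @ H @ H.T + 1e-12)
    return W, H

def vaf(emg, W, H):
    """Variance accounted for by the reconstruction W @ H."""
    resid = emg - W @ H
    return 1.0 - np.sum(resid**2) / np.sum(emg**2)

# Synthetic EMG with exactly two underlying synergies (200 samples, 8 muscles).
true_W = rng.random((200, 2))
true_H = rng.random((2, 8))
emg = true_W @ true_H

W, H = extract_synergies(emg, n_syn=2)
```

On real data, choices such as the EMG normalization applied before this step and whether H is shared across trials are exactly the methodological degrees of freedom the study varies.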

  15. Reporting and methodological quality of survival analysis in articles published in Chinese oncology journals.

    Science.gov (United States)

    Zhu, Xiaoyan; Zhou, Xiaobin; Zhang, Yuan; Sun, Xiao; Liu, Haihua; Zhang, Yingying

    2017-12-01

Survival analysis methods have gained widespread use in the field of oncology. For the achievement of reliable results, the methodological process and reporting quality are crucial. This review provides the first examination of the methodological characteristics and reporting quality of survival analysis in articles published in leading Chinese oncology journals. The aims were to examine the methodological and reporting quality of survival analysis, to identify common deficiencies, to suggest desirable precautions in the analysis, and to offer related advice for authors, readers, and editors. A total of 242 survival analysis articles were included for evaluation from 1492 articles published in 4 leading Chinese oncology journals in 2013. Articles were evaluated according to 16 established items for the proper use and reporting of survival analysis. The application rates of Kaplan-Meier, life table, log-rank test, Breslow test, and Cox proportional hazards model (Cox model) were 91.74%, 3.72%, 78.51%, 0.41%, and 46.28%, respectively; no article used a parametric method for survival analysis. A multivariate Cox model was conducted in 112 articles (46.28%). Follow-up rates were mentioned in 155 articles (64.05%), of which 4 articles were under 80% (the lowest was 75.25%) and 55 articles were 100%. The report rates of all types of survival endpoint were lower than 10%. Eleven of the 100 articles that reported a loss to follow-up stated how it was treated in the analysis. One hundred thirty articles (53.72%) did not perform multivariate analysis. One hundred thirty-nine articles (57.44%) did not define the survival time. Violations and omissions of methodological guidelines included no mention of pertinent checks for the proportional hazards assumption, no report of testing for interactions and collinearity between independent variables, and no report of the calculation method of sample size. Thirty-six articles (32.74%) reported the methods of independent variable selection. The above defects could make potentially inaccurate
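For reference, the Kaplan-Meier estimator, which the review found in 91.74% of the evaluated articles, can be sketched in a few lines; the follow-up times below are invented.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.
    `times`  : follow-up times
    `events` : 1 = event (e.g. death) observed, 0 = censored
    Returns the step function as a list of (event_time, S(t)) pairs.
    A textbook sketch, not tied to any of the reviewed articles."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s, out = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(e for tt, e in data if tt == t)    # events at time t
        m = sum(1 for tt, _ in data if tt == t)    # subjects leaving the risk set at t
        if d > 0:
            s *= 1.0 - d / n_at_risk
            out.append((t, s))
        n_at_risk -= m
        i += m
    return out

# Six hypothetical patients: times in months, 0 marks censoring (lost to follow-up).
curve = kaplan_meier([6, 7, 10, 15, 19, 25], [1, 0, 1, 1, 0, 1])
```

Note how censored subjects (months 7 and 19) reduce the risk set without dropping the curve, which is precisely the handling of loss to follow-up that most reviewed articles failed to report.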

  16. Application of GO methodology in reliability analysis of offsite power supply of Daya Bay NPP

    International Nuclear Information System (INIS)

    Shen Zupei; Li Xiaodong; Huang Xiangrui

    2003-01-01

    The author applies the GO methodology to reliability analysis of the offsite power supply system of Daya Bay NPP. The direct quantitative calculation formulas of the stable reliability target of the system with shared signals and the dynamic calculation formulas of the state probability for the unit with two states are derived. The method to solve the fault event sets of the system is also presented and all the fault event sets of the outer power supply system and their failure probability are obtained. The resumption reliability of the offsite power supply system after the stability failure of the power net is also calculated. The result shows that the GO methodology is very simple and useful in the stable and dynamic reliability analysis of the repairable system
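The "dynamic calculation formulas of the state probability for the unit with two states" mentioned above correspond to the standard two-state Markov availability result. Below is a sketch of that formula with illustrative failure and repair rates, not values from the Daya Bay study.

```python
import math

def availability(lam, mu, t):
    """Instantaneous availability of a repairable two-state unit that starts
    in the working state:
        A(t) = mu/(lam+mu) + lam/(lam+mu) * exp(-(lam+mu)*t)
    lam = failure rate, mu = repair rate. Standard Markov result; the rates
    below are hypothetical."""
    s = lam + mu
    return mu / s + (lam / s) * math.exp(-s * t)

lam, mu = 0.01, 0.5            # per hour, illustrative only
a0 = availability(lam, mu, 0.0)      # starts available
steady = availability(lam, mu, 1e6)  # long-run value -> mu/(lam+mu)
```

In a GO chart, results like this for each unit are propagated through the operator network to obtain the system-level reliability targets.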

  17. Methodology of a PWR containment analysis during a thermal-hydraulic accident

    International Nuclear Information System (INIS)

    Silva, Dayane F.; Sabundjian, Gaiane; Lima, Ana Cecilia S.

    2015-01-01

The aim of this work is to present the calculation methodology for the Angra 2 reactor containment during accidents of the Loss of Coolant Accident (LOCA) type. This study makes it possible to ensure the safety of the population in the surrounding area in the event of such accidents. One of the programs used to analyze the containment of a nuclear plant is CONTAIN. This computer code is an analysis tool used for predicting the physical conditions and distributions of radionuclides inside a containment building following the release of material from the primary system in a light-water reactor during an accident. The containment of a PWR plant is a concrete building internally lined with metallic material and has design pressure limits. The containment analysis methodology must estimate the pressure limits during a LOCA. The boundary conditions for the simulation are obtained from the RELAP5 code. (author)

  18. Methodology of a PWR containment analysis during a thermal-hydraulic accident

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Dayane F.; Sabundjian, Gaiane; Lima, Ana Cecilia S., E-mail: dayane.silva@usp.br, E-mail: gdjian@ipen.br, E-mail: aclima@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

The aim of this work is to present the calculation methodology for the Angra 2 reactor containment during accidents of the Loss of Coolant Accident (LOCA) type. This study makes it possible to ensure the safety of the population in the surrounding area in the event of such accidents. One of the programs used to analyze the containment of a nuclear plant is CONTAIN. This computer code is an analysis tool used for predicting the physical conditions and distributions of radionuclides inside a containment building following the release of material from the primary system in a light-water reactor during an accident. The containment of a PWR plant is a concrete building internally lined with metallic material and has design pressure limits. The containment analysis methodology must estimate the pressure limits during a LOCA. The boundary conditions for the simulation are obtained from the RELAP5 code. (author)

  19. Business analysis methodology in telecommunication industry – the research based on the grounded theory

    Directory of Open Access Journals (Sweden)

    Hana Nenickova

    2013-10-01

Full Text Available The objective of this article is to present the use of grounded theory in qualitative research as a basis for building a business analysis methodology for the implementation of information systems in telecommunication enterprises in the Czech Republic. In preparing the methodology I have used the current needs of telecommunications companies, which are characterized mainly by a high dependence on information systems. Besides that, this industry is characterized by high flexibility, strong competition and a compressed corporate strategy timeline. The grounded theory of business analysis defines the specifics of the telecommunications industry, focusing on a very specific description of the procedure for collecting business requirements and following the business strategy.

  20. Effects of methodology and analysis strategy on robustness of pestivirus phylogeny.

    Science.gov (United States)

    Liu, Lihong; Xia, Hongyan; Baule, Claudia; Belák, Sándor; Wahlberg, Niklas

    2010-01-01

    Phylogenetic analysis of pestiviruses is a useful tool for classifying novel pestiviruses and for revealing their phylogenetic relationships. In this study, robustness of pestivirus phylogenies has been compared by analyses of the 5'UTR, and complete N(pro) and E2 gene regions separately and combined, performed by four methods: neighbour-joining (NJ), maximum parsimony (MP), maximum likelihood (ML), and Bayesian inference (BI). The strategy of analysing the combined sequence dataset by BI, ML, and MP methods resulted in a single, well-supported tree topology, indicating a reliable and robust pestivirus phylogeny. By contrast, the single-gene analysis strategy resulted in 12 trees of different topologies, revealing different relationships among pestiviruses. These results indicate that the strategies and methodologies are two vital aspects affecting the robustness of the pestivirus phylogeny. The strategy and methodologies outlined in this paper may have a broader application in inferring phylogeny of other RNA viruses.

  1. Analysis of Interbrand, BrandZ and BAV brand valuation methodologies

    Directory of Open Access Journals (Sweden)

    Krstić Bojan

    2011-01-01

Full Text Available Brand valuation is considered one of the most significant challenges of not only the theory and practice of contemporary marketing, but of other disciplines as well. Namely, the complex nature of this issue implies the need for a multidisciplinary approach and the creation of a methodology which goes beyond the borders of marketing as a discipline and includes knowledge derived from accounting, finance and other areas. However, mostly one-sided approaches oriented towards determining brand value, either based on research results of consumer behavior and attitudes or based on the financial success of the brand, are dominant in the marketing and financial literature. Simultaneously with these theoretical methodologies, consultancy and marketing agencies and other subjects have been developing their own brand valuation methods and models. Some of them can be assigned to a comprehensive approach to brand valuation, which overcomes the mentioned problem of one-sided analysis of brand value. The comprehensive approach, namely, presumes brand valuation based on the benefits which a brand provides both to customers and to the enterprise that owns it, in other words based on qualitative and quantitative measures respectively reflecting the behavior and attitudes of consumers and the assumed financial value of the brand or, more precisely, brand value capitalization. According to the defined research subject, this paper is structured as follows: the importance and problem of brand value are reviewed in the Introduction, and the three most well-known brand valuation methodologies developed by consultancy agencies, the Interbrand methodology and the BrandZ and BAV models, are analyzed in the next section. In the further considerations the results of a comparative analysis of these methodologies are presented and implications for adequate brand valuation are suggested.

  2. Methodology of demand forecast by market analysis of electric power and load curves

    International Nuclear Information System (INIS)

    Barreiro, C.J.; Atmann, J.L.

    1989-01-01

A methodology for the demand forecast of consumer classes and their aggregation is presented. An analysis of the actual attended market can be done by appropriate measurements and load curve studies. The suppositions for future market behaviour by consumer classes (industrial, residential, commercial, others) are shown, and the actions to optimise this market, obtained by load curve modulations, are foreseen. The future demand is then determined by the appropriate aggregation of these segmented demands. (C.G.C.)
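The aggregation step described above, summing per-class forecasts into a total market demand, can be sketched as follows. The consumer classes match the abstract, but the figures are invented and the load-curve modulation itself is not modeled.

```python
def aggregate_demand(class_forecasts):
    """Aggregate per-class demand forecasts into total market demand per year.
    `class_forecasts` maps class name -> {year: demand}. Values are
    hypothetical GWh figures, purely for illustration."""
    years = next(iter(class_forecasts.values()))
    return {year: sum(v[year] for v in class_forecasts.values())
            for year in years}

forecasts = {
    "industrial":  {2025: 120.0, 2026: 126.0},
    "residential": {2025: 80.0,  2026: 83.2},
    "commercial":  {2025: 45.0,  2026: 47.3},
    "others":      {2025: 15.0,  2026: 15.5},
}
total = aggregate_demand(forecasts)
```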

  3. An alternative methodology for the analysis of electrical resistivity data from a soil gas study

    OpenAIRE

    Johansson, Sara; Rosqvist, Hakan; Svensson, Mats; Dahlin, Torleif; Leroux, Virginie

    2011-01-01

    The aim of this paper is to present an alternative method for the analysis of resistivity data. The methodology was developed during a study to evaluate if electrical resistivity can be used as a tool for analysing subsurface gas dynamics and gas emissions from landfills. The main assumption of this study was that variations in time of resistivity data correspond to variations in the relative amount of gas and water in the soil pores. Field measurements of electrical resistivity, static chamb...

  4. Environmental Analysis of Springs in Urban Areas–A Methodological Proposal

    OpenAIRE

    Milton Pavezzi Netto; Gustavo D'Almeida Scarpinella; Ricardo Siloto da Silva

    2013-01-01

The springs located in urban areas are the outpouring of surface water, which can serve as water supply, effluent receptors and important local macro-drainage elements. Given unplanned occupation, non-compliance with environmental legislation and the importance of these water bodies, it is vital to analyze the springs within urban areas, considering the Brazilian forest code. This paper submits a methodology proposal for the analysis and discussion of environmental compliance fun...

  5. Methodological Approach to Company Cash Flows Target-Oriented Forecasting Based on Financial Position Analysis

    OpenAIRE

    Sergey Krylov

    2012-01-01

The article treats a new methodological approach to the target-oriented forecasting of company cash flows based on analysis of its financial position. The approach is intended to be universal and presumes the application of the following techniques developed by the author: the financial ratio values correction technique and the correcting cash flows technique. The financial ratio values correction technique serves to analyze and forecast the company's financial position, while the correcting cash flows technique i...

  6. Chemical or Biological Terrorist Attacks: An Analysis of the Preparedness of Hospitals for Managing Victims Affected by Chemical or Biological Weapons of Mass Destruction

    Science.gov (United States)

    Bennett, Russell L.

    2006-01-01

The possibility of a terrorist attack employing the use of chemical or biological weapons of mass destruction (WMD) on American soil is no longer an empty threat; it has become a reality. A WMD is defined as any weapon with the capacity to inflict death and destruction on such a massive scale that its very presence in the hands of hostile forces is a grievous threat. Events of the past few years, including the bombing of the World Trade Center in 1993, the Murrah Federal Building in Oklahoma City in 1995, the use of planes as guided missiles directed into the Pentagon and New York’s Twin Towers in 2001 (9/11), and the tragic incidents involving twenty-three people who were infected and five who died as a result of contact with anthrax-laced mail in the Fall of 2001, have well established that the United States can be attacked by both domestic and international terrorists without warning or provocation. In light of these actions, hospitals have been working vigorously to ensure that they would be “ready” in the event of another terrorist attack to provide appropriate medical care to victims. However, according to a recent United States General Accounting Office (GAO) nationwide survey, our nation’s hospitals still are not prepared to manage mass casualties resulting from chemical or biological WMD. Therefore, there is a clear need for information about current hospital preparedness in order to provide a foundation for systematic planning and broader discussions about relative cost, probable effectiveness, environmental impact and overall societal priorities. Hence, the aim of this research was to examine the current preparedness of hospitals in the State of Mississippi to manage victims of terrorist attacks involving chemical or biological WMD. All acute care hospitals in the State were selected for inclusion in this study. Both quantitative and qualitative methods were utilized for data collection and analysis. Six hypotheses were tested. Using a

  7. An analysis methodology for hot leg break mass and energy release

    International Nuclear Information System (INIS)

    Song, Jin Ho; Kwon, Young Min; Kim, Taek Mo; Chung, Hae Yong; Lee, Sang Jong

    1996-07-01

An analysis methodology for the hot leg break mass and energy release is developed. For the blowdown period a modified CEFLASH-4A analysis is suggested. For the post-blowdown period a new computer model named COMET is developed. Unlike the previous post-blowdown analysis model FLOOD3, COMET is capable of analyzing both cold leg and hot leg break cases. The cold leg break model is essentially the same as that of FLOOD3, with some improvements. The analysis results of the newly proposed hot leg break model in COMET follow the same trend as that observed in a scaled-down integral experiment. The analysis results for UCN 3 and 4 by COMET are also in good qualitative and quantitative agreement with those predicted by best-estimate analysis using RELAP5/MOD3. Therefore, the COMET code is validated and can be used for the licensing analysis. 6 tabs., 82 figs., 9 refs. (Author)

  8. Methodology for the analysis of pollutant emissions from a city bus

    International Nuclear Information System (INIS)

    Armas, Octavio; Lapuerta, Magín; Mata, Carmen

    2012-01-01

    In this work a methodology is proposed for the measurement and analysis of gaseous emissions and particle size distributions emitted by a diesel city bus during typical operation under urban driving conditions. A passenger transportation line in a Spanish city was used as the test circuit. Different ways of processing and representing the data were studied and, derived from this work, a new approach is proposed. The methodology was useful for detecting the most important uncertainties arising during the registration and processing of data from a measurement campaign devoted to determining the main pollutant emissions. A HORIBA OBS-1300 gas analyzer and a TSI engine exhaust particle spectrometer were used with 1 Hz data recording. The proposed methodology allows for the comparison of results (in mean values) derived from the analysis of either complete cycles or specific categories (or sequences). The analysis by categories is demonstrated to be a robust and helpful tool for isolating the effect of the main vehicle parameters (relative fuel–air ratio and velocity) on pollutant emissions. It was shown that acceleration sequences have the highest contribution to the total emissions, whereas deceleration sequences have the least. (paper)

  9. Methodology for the analysis of pollutant emissions from a city bus

    Science.gov (United States)

    Armas, Octavio; Lapuerta, Magín; Mata, Carmen

    2012-04-01

    In this work a methodology is proposed for the measurement and analysis of gaseous emissions and particle size distributions emitted by a diesel city bus during typical operation under urban driving conditions. A passenger transportation line in a Spanish city was used as the test circuit. Different ways of processing and representing the data were studied and, derived from this work, a new approach is proposed. The methodology was useful for detecting the most important uncertainties arising during the registration and processing of data from a measurement campaign devoted to determining the main pollutant emissions. A HORIBA OBS-1300 gas analyzer and a TSI engine exhaust particle spectrometer were used with 1 Hz data recording. The proposed methodology allows for the comparison of results (in mean values) derived from the analysis of either complete cycles or specific categories (or sequences). The analysis by categories is demonstrated to be a robust and helpful tool for isolating the effect of the main vehicle parameters (relative fuel-air ratio and velocity) on pollutant emissions. It was shown that acceleration sequences have the highest contribution to the total emissions, whereas deceleration sequences have the least.
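
    The category-based analysis described in this record can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the function names, the four sequence categories, and the velocity and acceleration thresholds are assumptions.

```python
# Illustrative sketch (not the authors' code): classify 1 Hz driving data
# into sequence categories and compare mean pollutant emissions per
# category. Thresholds (0.5 m/s idle cutoff, 0.3 m/s^2 acceleration
# cutoff) are assumed for illustration.
def categorize(v_prev, v):
    """Classify one 1-second step by velocity v (m/s) and its change."""
    dv = v - v_prev  # change over 1 s, i.e. acceleration in m/s^2
    if v < 0.5:
        return "idle"
    if dv > 0.3:
        return "acceleration"
    if dv < -0.3:
        return "deceleration"
    return "cruise"

def mean_emissions_by_category(velocity, emission):
    """Mean emission rate (e.g. g/s) per driving category."""
    totals, counts = {}, {}
    for i in range(1, len(velocity)):
        cat = categorize(velocity[i - 1], velocity[i])
        totals[cat] = totals.get(cat, 0.0) + emission[i]
        counts[cat] = counts.get(cat, 0) + 1
    return {c: totals[c] / counts[c] for c in totals}
```

    Comparing the per-category means against the whole-cycle mean is what isolates the contribution of, say, acceleration sequences to total emissions.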

  10. Phenolic profiles of eight olive cultivars from Algeria: effect of Bactrocera oleae attack.

    Science.gov (United States)

    Medjkouh, Lynda; Tamendjari, Abderezak; Alves, Rita C; Laribi, Rahima; Oliveira, M Beatriz P P

    2018-02-21

    The olive fly (Bactrocera oleae R.) is the most harmful pest of olive trees (O. europaea), affecting fruit development and oil production. Olive fruits contain characteristic phenolic compounds that are important for plant defense against pathogens and insects and that, through their many biological activities, contribute to the high value of this crop. In this study, olives from 8 cultivars (Abani, Aellah, Blanquette de Guelma, Chemlal, Ferkani, Limli, Rougette de Mitidja and Souidi) with different degrees of fly infestation (0%, not attacked; 100%, all attacked; and the real attack %) and different maturation indices were sampled and analysed. Qualitative and quantitative analyses of phenolic profiles were performed by colorimetric methodologies and RP-HPLC-DAD. Verbascoside, tyrosol and hydroxytyrosol were the compounds most adversely affected by B. oleae infestation. Principal component analysis and hierarchical cluster analysis highlighted different groups, showing the different responses of olive cultivars to the attack. The results show that the phenolic compounds displayed sharp qualitative and quantitative differences among the cultivars. Fly attack was significantly correlated with fruit weight, but not with the phenolic compounds.

  11. Network meta-analysis-highly attractive but more methodological research is needed

    Directory of Open Access Journals (Sweden)

    Singh Sonal

    2011-06-01

    Network meta-analysis, in the context of a systematic review, is a meta-analysis in which multiple treatments (that is, three or more) are compared using both direct comparisons of interventions within randomized controlled trials and indirect comparisons across trials based on a common comparator. To ensure the validity of findings from network meta-analyses, the systematic review must be designed rigorously and conducted carefully. Aspects of designing and conducting a systematic review for network meta-analysis include defining the review question, specifying eligibility criteria, searching for and selecting studies, assessing risk of bias and quality of evidence, conducting the network meta-analysis, and interpreting and reporting findings. This commentary summarizes the methodologic challenges and research opportunities for network meta-analysis relevant to each aspect of the systematic review process, based on discussions at a network meta-analysis methodology meeting we hosted in May 2010 at the Johns Hopkins Bloomberg School of Public Health. Since this commentary reflects the discussion at that meeting, it is not intended to provide an overview of the field.

  12. Cyber Attacks, Information Attacks, and Postmodern Warfare

    Directory of Open Access Journals (Sweden)

    Valuch Jozef

    2017-06-01

    The aim of this paper is to evaluate and differentiate between the phenomena of cyberwarfare and information warfare, as manifestations of what we perceive as postmodern warfare. We describe and analyse current examples of the use of postmodern warfare and the reactions of states and international bodies to these phenomena. The subject matter of this paper is the relationship between new types of postmodern conflicts and the law of armed conflicts (the law of war). Based on ICJ case law, it is clear that under the current legal rules of the international law of war, cyber attacks, as well as information attacks (often also performed in cyberspace), can only be perceived as “war” if executed in addition to classical kinetic warfare, which is often not the case. In most cases perceived “only” as non-linear warfare (postmodern conflict), this practice must nevertheless be condemned as conduct contrary to the principles of international law and (possibly) a crime under national laws, unless this type of conduct comes to be recognized by the international community as “war” proper, in its new, postmodern sense.

  13. Development of a systematic methodology to select hazard analysis techniques for nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    Vasconcelos, Vanderley de; Reis, Sergio Carneiro dos; Costa, Antonio Carlos Lopes da [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)]. E-mails: vasconv@cdtn.br; reissc@cdtn.br; aclc@cdtn.br; Jordao, Elizabete [Universidade Estadual de Campinas (UNICAMP), SP (Brazil). Faculdade de Engenharia Quimica]. E-mail: bete@feq.unicamp.br

    2008-07-01

    In order to comply with the licensing requirements of regulatory bodies, risk assessments of nuclear facilities should be carried out. In Brazil, such assessments are part of the Safety Analysis Reports required by CNEN (Brazilian Nuclear Energy Commission) and of the Risk Analysis Studies required by the competent environmental bodies. A risk assessment generally includes the identification of the hazards and accident sequences that can occur, as well as the estimation of the frequencies and effects of these unwanted events on the plant, people, and environment. Hazard identification and analysis are also particularly important when implementing an Integrated Safety, Health, and Environment Management System following the ISO 14001, BS 8800 and OHSAS 18001 standards. Among the myriad of tools that support the hazard analysis process, the following can be highlighted: CCA (Cause-Consequence Analysis); CL (Checklist Analysis); ETA (Event Tree Analysis); FMEA (Failure Mode and Effects Analysis); FMECA (Failure Mode, Effects and Criticality Analysis); FTA (Fault Tree Analysis); HAZOP (Hazard and Operability Study); HRA (Human Reliability Analysis); Pareto Analysis; PHA (Preliminary Hazard Analysis); RR (Relative Ranking); SR (Safety Review); WI (What-If); and WI/CL (What-If/Checklist Analysis). The choice of a particular technique, or of a combination of techniques, depends on many factors, such as the motivation for the analysis, the available data, the complexity of the process being analyzed, the expertise available on hazard analysis, and the initial perception of the risks involved. This paper presents a systematic methodology to select the most suitable set of tools to conduct the hazard analysis, taking into account the factors involved. Considering that non-reactor nuclear facilities are, to a large extent, chemical processing plants, the developed approach can also be applied to the analysis of chemical and petrochemical plants. The selected hazard analysis techniques can support cost

  14. Development of a systematic methodology to select hazard analysis techniques for nuclear facilities

    International Nuclear Information System (INIS)

    Vasconcelos, Vanderley de; Reis, Sergio Carneiro dos; Costa, Antonio Carlos Lopes da; Jordao, Elizabete

    2008-01-01

    In order to comply with the licensing requirements of regulatory bodies, risk assessments of nuclear facilities should be carried out. In Brazil, such assessments are part of the Safety Analysis Reports required by CNEN (Brazilian Nuclear Energy Commission) and of the Risk Analysis Studies required by the competent environmental bodies. A risk assessment generally includes the identification of the hazards and accident sequences that can occur, as well as the estimation of the frequencies and effects of these unwanted events on the plant, people, and environment. Hazard identification and analysis are also particularly important when implementing an Integrated Safety, Health, and Environment Management System following the ISO 14001, BS 8800 and OHSAS 18001 standards. Among the myriad of tools that support the hazard analysis process, the following can be highlighted: CCA (Cause-Consequence Analysis); CL (Checklist Analysis); ETA (Event Tree Analysis); FMEA (Failure Mode and Effects Analysis); FMECA (Failure Mode, Effects and Criticality Analysis); FTA (Fault Tree Analysis); HAZOP (Hazard and Operability Study); HRA (Human Reliability Analysis); Pareto Analysis; PHA (Preliminary Hazard Analysis); RR (Relative Ranking); SR (Safety Review); WI (What-If); and WI/CL (What-If/Checklist Analysis). The choice of a particular technique, or of a combination of techniques, depends on many factors, such as the motivation for the analysis, the available data, the complexity of the process being analyzed, the expertise available on hazard analysis, and the initial perception of the risks involved. This paper presents a systematic methodology to select the most suitable set of tools to conduct the hazard analysis, taking into account the factors involved. Considering that non-reactor nuclear facilities are, to a large extent, chemical processing plants, the developed approach can also be applied to the analysis of chemical and petrochemical plants. The selected hazard analysis techniques can support cost

  15. Statistical trend analysis methodology for rare failures in changing technical systems

    International Nuclear Information System (INIS)

    Ott, K.O.; Hoffmann, H.J.

    1983-07-01

    A methodology for a statistical trend analysis (STA) of failure rates is presented. It applies primarily to relatively rare events in changing technologies or components. The formulation is more general and the assumptions are less restrictive than in a previously published version. The relationship between the statistical analysis and probabilistic risk assessment (PRA) is discussed in terms of the categorization of decisions for action following particular failure events. The significance of tentatively identified trends is explored. In addition to statistical tests for trend significance, a combination of STA and PRA results quantifying the trend complement is proposed. The STA approach is compared with other concepts for trend characterization. (orig.)
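
    The abstract does not reproduce the STA formulation itself. As a generic, hedged illustration of a statistical trend test for rare failure events, the classical Laplace test for a homogeneous Poisson process can be sketched; it is not necessarily the formulation developed by the authors.

```python
import math

# Classical Laplace trend test, shown as a generic example of testing a
# sequence of rare failure events for a time trend; this is an
# illustration, not the authors' STA formulation.
def laplace_trend_statistic(event_times, horizon):
    """Trend statistic for n event times observed on [0, horizon].

    Under a constant failure rate (homogeneous Poisson process) the
    statistic is approximately standard normal; clearly positive values
    suggest an increasing failure rate, negative values a decreasing one.
    """
    n = len(event_times)
    if n == 0:
        raise ValueError("need at least one event")
    mean_time = sum(event_times) / n
    return (mean_time - horizon / 2.0) / (horizon * math.sqrt(1.0 / (12.0 * n)))
```

    Events clustered late in the observation window push the statistic positive (rate increasing); events clustered early push it negative.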

  16. A parameter estimation and identifiability analysis methodology applied to a street canyon air pollution model

    DEFF Research Database (Denmark)

    Ottosen, T. B.; Ketzel, Matthias; Skov, H.

    2016-01-01

    Mathematical models are increasingly used in environmental science thus increasing the importance of uncertainty and sensitivity analyses. In the present study, an iterative parameter estimation and identifiability analysis methodology is applied to an atmospheric model – the Operational Street...... of the identifiability analysis, showed that some model parameters were significantly more sensitive than others. The application of the determined optimal parameter values was shown to successfully equilibrate the model biases among the individual streets and species. It was as well shown that the frequentist approach...

  17. Methodology for repeated load analysis of composite structures with embedded magnetic microwires

    Directory of Open Access Journals (Sweden)

    K. Semrád

    2017-01-01

    The article addresses the strength of cyclically loaded composite structures with the possibility of contactless stress measurement inside the material. For this purpose a contactless tensile stress sensor, using an improved induction principle based on magnetic microwires embedded in the composite structure, has been developed. A methodology based on the E-N approach was applied to the analysis of the repeated loading of the wing hinge connection, including a finite element method (FEM) fatigue strength analysis. The results proved that composites, in comparison with metal structures, offer a significant weight reduction for small aircraft construction, while the required strength, stability and lifetime of the components are retained.

  18. Case-Crossover Analysis of Air Pollution Health Effects: A Systematic Review of Methodology and Application

    Science.gov (United States)

    Carracedo-Martínez, Eduardo; Taracido, Margarita; Tobias, Aurelio; Saez, Marc; Figueiras, Adolfo

    2010-01-01

    Background Case-crossover is one of the most widely used designs for analyzing the health-related effects of air pollution. Nevertheless, its application and methodology in this context have not previously been reviewed. Objective We conducted a systematic review of case-crossover (CCO) designs used to study the relationship between air pollution and morbidity and mortality, from the standpoint of methodology and application. Data sources and extraction A search was made of the MEDLINE and EMBASE databases. Reports were classified as methodologic or applied. From the latter, the following information was extracted: author, study location, year, type of population (general or patients), dependent variable(s), independent variable(s), type of CCO design, and whether effect modification was analyzed for variables at the individual level. Data synthesis The review covered 105 reports that fulfilled the inclusion criteria. Of these, 24 addressed methodological aspects, and the remainder involved the design’s application. In the methodological reports, the designs that yielded the best results in simulation were symmetric bidirectional CCO and time-stratified CCO. Furthermore, we observed an increase over time in the use of certain CCO designs, mainly symmetric bidirectional and time-stratified CCO. The dependent variables most frequently analyzed were those relating to hospital morbidity; the pollutants most often studied were those linked to particulate matter. Among the CCO-application reports, 13.6% studied effect modification for variables at the individual level. Conclusions The use of CCO designs has undergone considerable growth; the most widely used designs were those that yielded better results in simulation studies: symmetric bidirectional and time-stratified CCO. However, the advantages of CCO as a method of analysis of variables at the individual level are put to little use. PMID:20356818

  19. Collaborative Attack vs. Collaborative Defense

    Science.gov (United States)

    Xu, Shouhuai

    We have witnessed many attacks in cyberspace. However, most attacks are launched by individual attackers, even though an attack may involve many compromised computers. In this paper, we envision what we believe to be the next generation of cyber attacks: collaborative attacks. Collaborative attacks can be launched by multiple attackers (i.e., human attackers or criminal organizations), each of which may have some specialized expertise. This is possible because cyber attacks can become very sophisticated, and specialization of attack expertise naturally becomes relevant. To counter collaborative attacks, we may need collaborative defense, because each “chain” in a collaborative attack may only be adequately dealt with by a different defender. In order to understand collaborative attack and collaborative defense, we present a high-level abstract framework for evaluating the effectiveness of collaborative defense against collaborative attacks. As a first step towards realizing and instantiating the framework, we explore a characterization of collaborative attacks and collaborative defense from the relevant perspectives.

  20. A SAS2H/KENO-V methodology for 3D fuel burnup analysis

    International Nuclear Information System (INIS)

    Milosevic, M.; Greenspan, E.; Vujic, J.

    2002-01-01

    An efficient methodology for the 3D fuel burnup analysis of LWR reactors is described in this paper. The methodology is founded on coupling a Monte Carlo method for the 3D calculation of the node power distribution with a transport method for the depletion calculation in a 1D Wigner-Seitz equivalent cell for each node independently. The proposed fuel burnup modeling, based on the application of the SCALE-4.4a control modules SAS2H and KENO-V.a, is verified for the case of a 2D x-y model of an IRIS 15 x 15 fuel assembly (with reflective boundary conditions) by using two well-benchmarked code systems. The first is MOCUP, a coupled MCNP-4C and ORIGEN2.1 utility code, and the second is the KENO-V.a/ORIGEN2.1 code system recently developed by the authors of this paper. The proposed SAS2H/KENO-V.a methodology was applied to the 3D burnup analysis of the IRIS-1000 benchmark.44 core. Detailed k-eff and power density evolution with burnup are reported. (author)

  1. Ruling the Commons. Introducing a new methodology for the analysis of historical commons

    Directory of Open Access Journals (Sweden)

    Tine de Moor

    2016-10-01

    Despite significant progress in recent years, the evolution of commons over the long run remains an under-explored area within commons studies. In recent years an international team of historians has worked under the umbrella of the Common Rules Project in order to design and test a new methodology aimed at advancing our knowledge of the dynamics of institutions for collective action, in particular commons. This project aims to contribute to the current debate on commons on three different fronts. Theoretically, it explicitly draws attention to issues of change and adaptation in the commons, contrasting with more static analyses. Empirically, it highlights the value of historical records as a rich source of information for the longitudinal analysis of the functioning of commons. Methodologically, it develops a systematic way of analyzing and comparing commons’ regulations across regions and time, setting a number of variables that have been defined on the basis of the “most common denominators” in commons regulation across countries and time periods. In this paper we introduce the project, describe our sources and methodology, and present the preliminary results of our analysis.

  2. Seven Deadliest Wireless Technologies Attacks

    CERN Document Server

    Haines, Brad

    2010-01-01

    How can an information security professional keep up with all of the hacks, attacks, and exploits? One way to find out what the worst of the worst are is to read the seven books in our Seven Deadliest Attacks Series. Not only do we let you in on the anatomy of these attacks but we also tell you how to get rid of them and how to defend against them in the future. Countermeasures are detailed so that you can fight against similar attacks as they evolve. Attacks featured in this book include: Bluetooth attacks; credit card, access card, and passport attacks; bad encryption

  3. A gap analysis methodology for collecting crop genepools: a case study with phaseolus beans.

    Directory of Open Access Journals (Sweden)

    Julián Ramírez-Villegas

    BACKGROUND: The wild relatives of crops represent a major source of valuable traits for crop improvement. These resources are threatened by habitat destruction, land use changes, and other factors, requiring their urgent collection and long-term availability for research and breeding from ex situ collections. We propose a method to identify gaps in ex situ collections (i.e. a gap analysis) of crop wild relatives as a means to guide efficient and effective collecting activities. METHODOLOGY/PRINCIPAL FINDINGS: The methodology prioritizes among taxa based on a combination of sampling, geographic, and environmental gaps. We apply the gap analysis methodology to wild taxa of the Phaseolus genepool. Of 85 taxa, 48 (56.5%) are assigned high priority for collecting due to their absence from, or under-representation in, genebanks; 17 taxa are given medium priority for collecting, 15 low priority, and 5 species are assessed as adequately represented in ex situ collections. Gap “hotspots”, representing priority target areas for collecting, are concentrated in central Mexico, although the narrow endemic nature of a suite of priority species adds a number of specific additional regions to the spatial collecting priorities. CONCLUSIONS/SIGNIFICANCE: The results of the gap analysis method mostly align very well with expert opinion on gaps in ex situ collections, with only a few exceptions. A more detailed prioritization of taxa and geographic areas for collection can be achieved by including predictive threat factors in the analysis, such as climate change or habitat destruction, or by adding additional prioritization filters, such as the degree of relatedness to cultivated species (i.e. ease of use in crop breeding). Furthermore, results for multiple crop genepools may be overlaid, which would allow a global analysis of gaps in ex situ collections of the world's plant genetic resources.
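
    The prioritization step can be sketched as follows. This is a hedged reconstruction: averaging a sampling, a geographic and an environmental representativeness score into a final priority score, and the class thresholds used below, are assumptions for illustration, not necessarily the authors' exact scheme.

```python
# Hypothetical sketch of the gap-analysis prioritization: three
# representativeness scores, each scaled to [0, 10] (0 = absent from ex
# situ collections, 10 = fully represented), are averaged into a final
# priority score, which maps to a collecting-priority class. Score
# names and thresholds are illustrative assumptions.
def final_priority_score(sampling, geographic, environmental):
    """Average the three gap scores into one final priority score."""
    return (sampling + geographic + environmental) / 3.0

def collecting_priority(fps):
    """Map a final priority score to a collecting-priority class."""
    if fps <= 3.0:
        return "high"
    if fps <= 5.0:
        return "medium"
    if fps <= 7.5:
        return "low"
    return "adequately represented"
```

    A taxon that is poorly sampled, geographically restricted in collections, and environmentally under-covered thus scores low and is flagged for urgent collecting.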

  4. An Analysis of Factors That Influence Logistics, Operational Availability, and Flight Hour Supply of the German Attack Helicopter Fleet

    Science.gov (United States)

    2017-06-01

    systemic behavior of an aircraft fleet has never been attempted within the German Army aviation forces before. Having an analysis tool that...computing hardware, simulation modeling paradigms, simulation software, and design-and-analysis methods. The authors claim: “When applied properly...Design and Analysis Law (2015) described outcomes of simulations as “estimates about system behavior, which if influenced by uncertainty often are

  5. Vicinity analysis: a methodology for the identification of similar protein active sites.

    Science.gov (United States)

    McGready, A; Stevens, A; Lipkin, M; Hudson, B D; Whitley, D C; Ford, M G

    2009-05-01

    Vicinity analysis (VA) is a new methodology developed to identify similarities between protein binding sites based on their three-dimensional structure and the chemical similarity of matching residues. The major objective is to enable searching of the Protein Data Bank (PDB) for similar sub-pockets, especially in proteins from different structural and biochemical series. Inspection of the ligands bound in these pockets should allow ligand functionality to be identified, thus suggesting novel monomers for use in library synthesis. VA has been developed initially using the ATP binding site in kinases, an important class of protein targets involved in cell signalling and growth regulation. This paper defines the VA procedure and describes matches to the phosphate binding sub-pocket of cyclin-dependent protein kinase 2 that were found by searching a small test database that has also been used to parameterise the methodology.

  6. Next generation iterative transport-diffusion methodology (ITDM), for LWR core analysis

    Science.gov (United States)

    Colameco, David V.

    The work in this dissertation shows that the neutronic modeling of a Pressurized Water Reactor (PWR) could be greatly improved through the use of an iterative transport-diffusion method (one-step procedure) compared to the current once-through transport-to-diffusion methodology (two-step procedure). The current methodology is efficient; however, the infinite-environment approximation of the transport lattice calculation introduces errors into the diffusion core calculation because it lacks the effect of the core environment. The iterative transport-diffusion method replaces the infinite environment with a simulated 3D environment taken from the diffusion calculation. This dissertation further develops previous 2D work on ITDM into a simulated 3D environment, with contributions being made in axial leakage treatment. Burnup steps are simulated over a cycle, and in the future simple thermal modeling can be added for full-core fuel cycle analysis. (Abstract shortened by UMI.)

  7. A new methodology to study customer electrocardiogram using RFM analysis and clustering

    Directory of Open Access Journals (Sweden)

    Mohammad Reza Gholamian

    2011-04-01

    One of the primary issues in marketing planning is to know the customer's behavioral trends. A customer's purchasing interest may fluctuate for different reasons, and it is important to find the declining or increasing trends whenever they happen. Studying these fluctuations is important for improving customer relationships. There are different methods for increasing customers' willingness to purchase, such as planning good promotions, increasing advertising, etc. This paper proposes a new methodology for measuring customers' behavioral trends, called the customer electrocardiogram. The proposed model uses the K-means clustering method with RFM analysis to study customer fluctuations over different time frames. We also apply the proposed electrocardiogram methodology to a real-world case study from the food industry, and the results are discussed in detail.
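
    A minimal sketch of the clustering step, assuming plain k-means on (recency, frequency, monetary) vectors; the feature scaling and the construction of the "electrocardiogram" trajectories in the paper are omitted, and all names below are our own.

```python
import random

def squared_distance(a, b):
    """Squared Euclidean distance between two equal-length vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def centroid(cluster):
    """Component-wise mean of a non-empty list of vectors."""
    n = len(cluster)
    return tuple(sum(p[i] for p in cluster) / n for i in range(len(cluster[0])))

def kmeans(points, k, iterations=50, seed=0):
    """Plain k-means over RFM vectors (recency, frequency, monetary).

    Tracking which cluster a customer falls into in successive time
    frames yields the behavioral trajectory the paper calls a customer
    "electrocardiogram" (our reading of the abstract, not the authors'
    exact algorithm).
    """
    centers = random.Random(seed).sample(points, k)
    for _ in range(iterations):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda j: squared_distance(p, centers[j]))
            clusters[nearest].append(p)
        # Keep a center unchanged if its cluster came up empty.
        centers = [centroid(c) if c else centers[j] for j, c in enumerate(clusters)]
    return centers, clusters
```

    In practice the three RFM components should be normalized first, since the monetary axis otherwise dominates the distance.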

  8. [Risk analysis study for patients care in radiotherapy: Learning curve and methodological evolution].

    Science.gov (United States)

    El Bakri, S; Fleury, B; Le Grévellec, M

    2015-10-01

    To describe the evolution of our risk mapping methodology over the past two years. Based on the FMEA (failure mode and effects analysis) method, some aspects have been adapted, e.g. the concepts of risk control and effort scale, and others have been introduced, e.g. the concept of residual risk management. A weekly meeting is scheduled by a multidisciplinary team in order to support the different projects. Experiment and practice have led us to upgrade our scales of gravity and detectability, identify critical points and introduce the residual risk management concept. Some difficulties with regard to the multiplicity of scenarios still remain. Risk mapping is an essential tool in the implementation of risk quality management, specifically when the methodology is progressive and takes into consideration all the members of a multidisciplinary team. Copyright © 2015 Société française de radiothérapie oncologique (SFRO). Published by Elsevier SAS. All rights reserved.

  9. Methodology for Design and Analysis of Reactive Distillation Involving Multielement Systems

    DEFF Research Database (Denmark)

    Jantharasuk, Amnart; Gani, Rafiqul; Górak, Andrzej

    2011-01-01

    A new methodology for design and analysis of reactive distillation has been developed. In this work, the element-based approach, coupled with a driving force diagram, has been extended and applied to the design of a reactive distillation column involving multielement (multicomponent) systems....... The transformation of ordinary systems to element-based ones and the aggregation of non-key elements allow the important design parameters, such as the number of stages, feed stage and minimum reflux ratio, to be determined by using simple diagrams similar to those regularly employed for non-reactive systems...... consisting of two components. Based on this methodology, an optimal design configuration is identified using the equivalent binary-element-driving force diagram. Two case studies of methyl acetate (MeOAc) synthesis and methyl-tert-butyl ether (MTBE) synthesis have been considered to demonstrate

  10. [Methodological novelties applied to the anthropology of food: agent-based models and social networks analysis].

    Science.gov (United States)

    Díaz Córdova, Diego

    2016-01-01

    The aim of this article is to introduce two methodological strategies that have not often been utilized in the anthropology of food: agent-based models and social network analysis. In order to illustrate these methods in action, two cases based on materials typical of the anthropology of food are presented. For the first strategy, fieldwork carried out in Quebrada de Humahuaca (province of Jujuy, Argentina) regarding meal recall was used, and for the second, elements of the concept of "domestic consumption strategies" applied by Aguirre were employed. The underlying idea is that, given that eating is recognized as a "total social fact" and, therefore, as a complex phenomenon, the methodological approach must also be characterized by complexity. The greater the number of methods utilized (with the appropriate rigor), the better able we will be to understand the dynamics of feeding in the social environment.

  11. The Nuclear Organization and Management Analysis Concept methodology: Four years later

    International Nuclear Information System (INIS)

    Haber, S.B.; Shurberg, D.A.; Barriere, M.T.; Hall, R.E.

    1992-01-01

    The Nuclear Organization and Management Analysis Concept was first presented at the IEEE Human Factors meeting in Monterey in 1988. In the four years since that paper, the concept and its associated methodology have been demonstrated at two commercial nuclear power plants (NPPs) and one fossil power plant. In addition, applications of some of the methods have been utilized in other types of organizations, and products are being developed from the insights obtained using the concept for various organization and management activities. This paper will focus on the insights and results obtained from the two demonstration studies at the commercial NPPs. The results emphasize the utility of the methodology and the comparability of the results from the two organizations.

  12. An Analysis of Insider Trading in the Credit Derivatives Market Using the Event Study Methodology

    Directory of Open Access Journals (Sweden)

    Ewa Wareluk

    2013-12-01

    Purpose: In this paper I investigate the information flow between the credit default swap market and the stock market, as well as insider trading in the credit default swap market. Methodology: For my analysis I use the event study methodology, with which I calculate abnormal stock returns and abnormal credit default swap premium changes. The analysis is based on 175,874 observations collected for 92 companies between the years 2001 and 2010. Findings: The results show that the information flow from the credit default swap market to the stock market is most significant in the case of negative rating outlooks. The information flow is much less significant in relation to negative surprises during announcements of annual financial results and rating upgrades. Evidence of insider trading is also most evident with reference to negative rating outlooks. Additionally, a distinctive feature of the credit default swap market and the stock market is the asymmetric response to negative and positive credit information. Research limitations: The event study methodology does not consider potentially important reasons for the information flow between markets other than the ones actually investigated. The credit events and credit risk information used in this research are just a proposal and can be extended by future researchers. Originality: This paper discusses a new research area. The main research area in terms of insider trading is still the stock market, with a special focus on the US market. I decided to explore the insider trading phenomenon in the credit default swap market, considering only contracts that are quoted with reference to European underlying assets. This part of the financial market is attractive in terms of economic research, as credit derivatives are more commonly used not only in North America but also in Europe.
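
    The core event-study machinery referred to in this record can be sketched as follows; this is a generic market-model illustration, not the author's implementation, and the estimation-window details are assumptions.

```python
# Generic event-study sketch: estimate a market model over a pre-event
# window by ordinary least squares, then compute abnormal returns in the
# event window as realized minus model-predicted returns. The analogous
# calculation applies to abnormal CDS premium changes.
def market_model(stock_returns, market_returns):
    """Return (alpha, beta) of stock_r = alpha + beta * market_r + noise."""
    n = len(stock_returns)
    mean_m = sum(market_returns) / n
    mean_s = sum(stock_returns) / n
    covariance = sum((m - mean_m) * (s - mean_s)
                     for m, s in zip(market_returns, stock_returns))
    variance = sum((m - mean_m) ** 2 for m in market_returns)
    beta = covariance / variance
    alpha = mean_s - beta * mean_m
    return alpha, beta

def abnormal_returns(stock_returns, market_returns, alpha, beta):
    """Realized minus model-predicted returns over the event window."""
    return [s - (alpha + beta * m)
            for m, s in zip(market_returns, stock_returns)]
```

    Abnormal returns concentrated just before a credit announcement, rather than after it, are the kind of pattern read as evidence of insider trading.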

  13. A methodology for automated CPA extraction using liver biopsy image analysis and machine learning techniques.

    Science.gov (United States)

    Tsipouras, Markos G; Giannakeas, Nikolaos; Tzallas, Alexandros T; Tsianou, Zoe E; Manousou, Pinelopi; Hall, Andrew; Tsoulos, Ioannis; Tsianos, Epameinondas

    2017-03-01

Collagen proportional area (CPA) extraction in liver biopsy images provides the degree of fibrosis expansion in liver tissue, which is the most characteristic histological alteration in hepatitis C virus (HCV). Assessment of the fibrotic tissue is currently based on semiquantitative staging scores such as Ishak and Metavir. Since its introduction as a fibrotic tissue assessment technique, CPA calculation based on image analysis techniques has proven to be more accurate than semiquantitative scores. However, CPA has yet to reach everyday clinical practice, since the lack of standardized and robust methods for computerized image analysis for CPA assessment has proven to be a major limitation. The current work introduces a three-stage fully automated methodology for CPA extraction based on machine learning techniques. Specifically, clustering algorithms have been employed for background-tissue separation, as well as for fibrosis detection in liver tissue regions, in the first and the third stage of the methodology, respectively. Due to the existence of several types of tissue regions in the image (such as blood clots, muscle tissue, structural collagen, etc.), classification algorithms have been employed to identify liver tissue regions and exclude all other non-liver tissue regions from CPA computation. For the evaluation of the methodology, 79 liver biopsy images have been employed, obtaining a 1.31% mean absolute CPA error with a 0.923 concordance correlation coefficient. The proposed methodology is designed to (i) avoid manual threshold-based and region selection processes, widely used in similar approaches presented in the literature, and (ii) minimize CPA calculation time. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
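The background-tissue separation stage described above relies on clustering. A toy 1-D k-means over pixel intensities (my own minimal stand-in, not the authors' implementation, with synthetic intensities) illustrates the idea:

```python
# Toy clustering for background/tissue separation: 1-D k-means on
# pixel intensities. Background pixels are bright, tissue darker;
# both distributions are invented for illustration.
import numpy as np

def kmeans_1d(values, k=2, iters=20):
    centers = np.linspace(values.min(), values.max(), k)
    for _ in range(iters):
        labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = values[labels == j].mean()
    return labels, centers

rng = np.random.default_rng(1)
background = rng.normal(240, 5, 500)    # bright background pixels
tissue = rng.normal(120, 20, 500)       # darker tissue pixels
pixels = np.concatenate([background, tissue])
labels, centers = kmeans_1d(pixels)
tissue_label = int(np.argmin(centers))  # darker cluster = tissue
print(int(np.sum(labels == tissue_label)))
```

In the actual pipeline this separation would run on image patches rather than raw intensities, followed by the classification stage that excludes non-liver regions.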

  14. Review of Recent Methodological Developments in Group-Randomized Trials: Part 2-Analysis.

    Science.gov (United States)

    Turner, Elizabeth L; Prague, Melanie; Gallis, John A; Li, Fan; Murray, David M

    2017-07-01

    In 2004, Murray et al. reviewed methodological developments in the design and analysis of group-randomized trials (GRTs). We have updated that review with developments in analysis of the past 13 years, with a companion article to focus on developments in design. We discuss developments in the topics of the earlier review (e.g., methods for parallel-arm GRTs, individually randomized group-treatment trials, and missing data) and in new topics, including methods to account for multiple-level clustering and alternative estimation methods (e.g., augmented generalized estimating equations, targeted maximum likelihood, and quadratic inference functions). In addition, we describe developments in analysis of alternative group designs (including stepped-wedge GRTs, network-randomized trials, and pseudocluster randomized trials), which require clustering to be accounted for in their design and analysis.

  15. Work Domain Analysis Methodology for Development of Operational Concepts for Advanced Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Hugo, Jacques [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-05-01

    This report describes a methodology to conduct a Work Domain Analysis in preparation for the development of operational concepts for new plants. This method has been adapted from the classical method described in the literature in order to better deal with the uncertainty and incomplete information typical of first-of-a-kind designs. The report outlines the strategy for undertaking a Work Domain Analysis of a new nuclear power plant and the methods to be used in the development of the various phases of the analysis. Basic principles are described to the extent necessary to explain why and how the classical method was adapted to make it suitable as a tool for the preparation of operational concepts for a new nuclear power plant. Practical examples are provided of the systematic application of the method and the various presentation formats in the operational analysis of advanced reactors.

  16. Shark attack in Natal.

    Science.gov (United States)

    White, J A

    1975-02-01

The injuries in 5 cases of shark attack in Natal during 1973-74 are reviewed. Experience of shark attacks in South Africa during 1965-73 is discussed, and the value of protecting heavily utilized beaches in Natal with nets is assessed. The surgical applications of elasmobranch research at the Oceanographic Research Institute (Durban) and at the headquarters of the Natal Anti-Shark Measures Board (Umhlanga Rocks) are described. Modern trends in the training of surf life-guards, the provision of basic equipment for primary resuscitation of casualties on the beaches, and the policy of general and local care of these patients in Natal are discussed.

  17. METHODOLOGICAL ANALYSIS OF STUDYING THE PROBLEM OF PERCEPTION IN FUTURE MUSIC TEACHERS’ PROFESSIONAL TRAINING

    Directory of Open Access Journals (Sweden)

    Zhang Bo

    2017-04-01

Full Text Available The article presents a methodological analysis of the problem of perception in future music teachers' professional training. The author analyses the work of leading scholars in philosophy, psychology, and art education, and reveals the hierarchical system of options in musical perception. The methodological foundation is supported by modern research in the specialty, the theory and methodology of musical study, which gives the presented material its proper shape and thoroughness. Drawing on vocal and choral research into forming future music teachers' value-based perception of musical art, the author aims to present a methodological analysis of the problem of perception in their professional training. Applying a systems approach to forming this value-based perception of musical art while future music teachers are trained for vocal and choral work with senior pupils extends their artistic awareness; it helps them distinguish works and phenomena of art, discern their properties, and orient themselves in the informative content of musical works. Particular attention is paid to revealing the methodological principles of research into the category of perception with respect to the value-based understanding of images in musical works. Analysing scientific sources on voice production, the author finds that perception is closely related to the transformation of external information, underpins the formation of images, and engages attention, memory, thinking, and emotions. The features of perception in vocal and choral studies, and their extrapolation by students, are analysed in the context of future professional activity with senior pupils, in the aspects of perception and transformation of musical and intonational information, analysis, object perception, and interpretation in accordance with future

  18. Applying rigorous decision analysis methodology to optimization of a tertiary recovery project

    International Nuclear Information System (INIS)

    Wackowski, R.K.; Stevens, C.E.; Masoner, L.O.; Attanucci, V.; Larson, J.L.; Aslesen, K.S.

    1992-01-01

The intent of this study was to rigorously examine all of the possible expansion, investment, operational, and CO2 purchase/recompression scenarios (over 2,500) to yield a strategy that would maximize the net present value of the CO2 project at the Rangely Weber Sand Unit. Traditional methods of project management, which involve analyzing large numbers of single-case economic evaluations, were found to be too cumbersome and inaccurate for an analysis of this scope. The decision analysis methodology utilized a statistical approach which resulted in a range of economic outcomes. Advantages of the decision analysis methodology included: a more organized approach to the classification of decisions and uncertainties; a clear sensitivity method to identify the key uncertainties; an application of probabilistic analysis through the decision tree; and a comprehensive display of the range of possible outcomes for communication to decision makers. This range made it possible to consider the upside and downside potential of the options and to weigh these against the Unit's strategies. Savings in time and manpower required to complete the study were also realized.
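At each scenario node, the decision-tree evaluation described above reduces to a probability-weighted expected NPV. A minimal sketch with invented probabilities and values (not figures from the Rangely study):

```python
# Decision-tree sketch: each scenario has a probability-weighted range
# of NPV outcomes; the preferred scenario maximizes expected NPV.
# All probabilities and NPVs (in $MM) below are invented.
def expected_npv(outcomes):
    """outcomes: list of (probability, npv) pairs for one scenario."""
    assert abs(sum(p for p, _ in outcomes) - 1.0) < 1e-9
    return sum(p * v for p, v in outcomes)

scenarios = {
    "expand_now":   [(0.3, 250.0), (0.5, 120.0), (0.2, -60.0)],
    "defer_2yrs":   [(0.3, 180.0), (0.5, 110.0), (0.2, 10.0)],
    "no_expansion": [(1.0, 90.0)],
}
best = max(scenarios, key=lambda s: expected_npv(scenarios[s]))
print(best, round(expected_npv(scenarios[best]), 1))
```

The advantage over single-case evaluation is that each scenario carries its full outcome range, so downside exposure is visible alongside the expected value.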

  19. Unified communications forensics anatomy of common UC attacks

    CERN Document Server

    Grant, Nicholas Mr

    2013-01-01

Unified Communications Forensics: Anatomy of Common UC Attacks is the first book to explain the issues and vulnerabilities and demonstrate the attacks, forensic artifacts, and countermeasures required to establish a secure unified communications (UC) environment. Written by leading UC experts Nicholas Grant and Joseph W. Shaw II, the book provides material never before found on the market, including: analysis of forensic artifacts in common UC attacks; an in-depth look at established UC technologies and attack exploits; and a hands-on understanding of UC attack vectors and associated countermeasures.

  20. Three years of the OCRA methodology in Brazil: critical analysis and results.

    Science.gov (United States)

    Ruddy, Facci; Eduardo, Marcatto; Edoardo, Santino

    2012-01-01

The Authors make a detailed analysis of the introduction of the OCRA Methodology in Brazil, which started in August 2008 with the launch of the "OCRA Book" translated into Portuguese. They evaluate the importance of assessing the exposure of the upper limbs to risk from repetitive movements and efforts, according to national and international legislation, demonstrating the interconnection of the OCRA Methodology with the Regulating Norms of the Ministry of Labor and Work (NRs - MTE), especially NR-17 and its Application Manual. They discuss the new paradigms of the OCRA Method in relation to the classic paradigms of ergonomic knowledge, and indicate the OCRA Method as the tool to be used to confirm or reject the New Previdentiary Epidemiologic Nexus NTEP/FAP. The Authors present conclusions based on the practical results that participants certified in the OCRA Methodology achieved in applications to different work activities across diverse economic sectors, showing risk reduction and gains in company productivity.

  1. Multiscale Entropy Analysis of Center-of-Pressure Dynamics in Human Postural Control: Methodological Considerations

    Directory of Open Access Journals (Sweden)

    Brian J. Gow

    2015-11-01

Full Text Available Multiscale entropy (MSE) is a widely used metric for characterizing the nonlinear dynamics of physiological processes. Significant variability, however, exists in the methodological approaches to MSE, which may ultimately impact results and their interpretations. Using publications focused on balance-related center of pressure (COP) dynamics, we highlight sources of methodological heterogeneity that can impact study findings. Seventeen studies were systematically identified that employed MSE for characterizing COP displacement dynamics. We identified five key methodological procedures that varied significantly between studies: (1) data length; (2) frequencies of the COP dynamics analyzed; (3) sampling rate; (4) point matching tolerance and sequence length; and (5) filtering of displacement changes from drifts, fidgets, and shifts. We discuss strengths and limitations of the various approaches employed and supply flowcharts to assist in the decision-making process regarding each of these procedures. Our guidelines are intended to more broadly inform the design and analysis of future studies employing MSE for continuous time series, such as COP.
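The point-matching tolerance and template length discussed above enter through the sample entropy computed at each coarse-graining scale. A minimal MSE sketch on synthetic white noise, assuming one common convention of fixing the tolerance r from the original series (the paper's recommended settings are not reproduced here):

```python
# Multiscale entropy sketch: coarse-grain the series at each scale,
# then compute sample entropy (template length m, Chebyshev distance,
# absolute tolerance r fixed from the original series).
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r) with an absolute tolerance r."""
    def count(mm):
        templ = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        c = 0
        for i in range(len(templ) - 1):      # exclude self-matches
            d = np.max(np.abs(templ[i + 1:] - templ[i]), axis=1)
            c += int(np.sum(d <= r))
        return c
    B, A = count(m), count(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else float("inf")

def coarse_grain(x, scale):
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

rng = np.random.default_rng(2)
noise = rng.normal(size=3000)
r = 0.2 * np.std(noise)        # tolerance fixed from the original series
mse = [sample_entropy(coarse_grain(noise, s), r=r) for s in (1, 2, 4)]
print([round(v, 2) for v in mse])  # entropy falls with scale for white noise
```

Note how data length (procedure 1) shrinks with each scale and how the tolerance convention (procedure 4) determines whether white-noise entropy falls or stays flat across scales.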

  2. Application of fault tree methodology in the risk analysis of complex systems

    International Nuclear Information System (INIS)

    Vasconcelos, V. de.

    1984-01-01

This study describes the fault tree methodology and applies it to the risk assessment of complex facilities. In the description of the methodology, an attempt has been made to provide all the pertinent basic information, pointing out its most important aspects, such as fault tree construction, evaluation techniques, and their use in the risk and reliability assessment of a system. In view of their importance, topics like common mode failures, human errors, the data bases used in the calculations, and uncertainty evaluation of the results are discussed separately, each in its own chapter. To apply the methodology, it was necessary to implement computer codes normally used for this kind of analysis. The computer codes PREP, KITT and SAMPLE, written in FORTRAN IV, were chosen due to their availability and to the fact that they have been used in important studies in the nuclear area, such as WASH-1400. With these codes, the probability of occurrence of excessive pressure in the main system of the component test loop (CTC) of CDTN was evaluated. (Author) [pt
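For independent basic events, the evaluation step of a fault tree reduces to combining probabilities through AND and OR gates. A minimal sketch with illustrative probabilities (not values from the CTC study):

```python
# Fault tree evaluation sketch: top-event probability from independent
# basic events. AND gates multiply probabilities; OR gates combine
# complements. Event names and probabilities are invented.
def and_gate(*probs):
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(*probs):
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

p_pump_fails = 1e-2
p_valve_stuck = 5e-3
p_relief_fails = 2e-4
# Top = OR(AND(pump fails, valve stuck), relief valve fails)
top = or_gate(and_gate(p_pump_fails, p_valve_stuck), p_relief_fails)
print(f"{top:.2e}")
```

Production codes such as PREP/KITT work instead from minimal cut sets and time-dependent unavailabilities, but the gate arithmetic above is the underlying building block.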

  3. Methodology, Measurement and Analysis of Flow Table Update Characteristics in Hardware OpenFlow Switches

    KAUST Repository

    Kuźniar, Maciej

    2018-02-15

Software-Defined Networking (SDN) and OpenFlow are actively being standardized and deployed. These deployments rely on switches that come from various vendors and differ in terms of performance and available features. Understanding these differences and performance characteristics is essential for ensuring successful and safe deployments. We propose a systematic methodology for SDN switch performance analysis and devise a series of experiments based on this methodology. The methodology relies on sending a stream of rule updates while observing both the control plane view as reported by the switch and the data plane state obtained by probing, determining switch characteristics by comparing these views. We measure, report, and explain the performance characteristics of flow table updates in six hardware OpenFlow switches. Our results describing rule update rates can help SDN designers make their controllers efficient. Further, we also highlight differences between the OpenFlow specification and its implementations that, if ignored, pose a serious threat to network security and correctness.

  4. A new methodology for fault detection in rolling element bearings using singular spectrum analysis

    Directory of Open Access Journals (Sweden)

    Bugharbee Hussein Al

    2018-01-01

Full Text Available This paper proposes a vibration-based methodology for fault detection in rolling element bearings, based on pure data analysis via the singular spectrum method. The method suggests building a baseline space from feature vectors made of the signals measured in the healthy/baseline bearing condition. The feature vectors are made using the Euclidean norms of the first three principal components (PCs) found for the measured signals. Then, the lagged version of any new signal corresponding to a new (possibly faulty) condition is projected onto this baseline feature space in order to assess its similarity to the baseline condition. The category of a new signal vector is determined based on the Mahalanobis distance (MD) of its feature vector to the baseline space. A validation of the methodology is suggested based on results from an experimental test rig. The results obtained confirm the effective performance of the suggested methodology. It consists of simple steps and is easy to apply, with a perspective of making it automatic and suitable for commercial applications.
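A minimal sketch of the baseline-space idea, assuming simplified SSA features (the leading singular values of the lagged trajectory matrix, which equal the Euclidean norms of the first PCs) and synthetic healthy/faulty signals of my own invention, not the test-rig data:

```python
# Novelty detection sketch: build a baseline feature space from healthy
# vibration signals, then score new signals by Mahalanobis distance.
import numpy as np

def features(signal, lag=20):
    """Simplified SSA features: the three leading singular values of the
    lagged (trajectory) matrix, equal to the norms of the first 3 PCs."""
    n = len(signal) - lag + 1
    X = np.array([signal[i:i + lag] for i in range(n)])
    return np.linalg.svd(X, compute_uv=False)[:3]

rng = np.random.default_rng(3)
t = np.linspace(0, 10, 400)
healthy = [np.sin(2 * np.pi * 3 * t) + rng.normal(0, 0.1, t.size)
           for _ in range(20)]                      # baseline recordings
base = np.array([features(h) for h in healthy])
mu = base.mean(axis=0)
inv_cov = np.linalg.pinv(np.cov(base.T))  # pseudo-inverse: cov may be ill-conditioned

def mahalanobis(f):
    d = f - mu
    return float(np.sqrt(d @ inv_cov @ d))

new_healthy = np.sin(2 * np.pi * 3 * t) + rng.normal(0, 0.1, t.size)
faulty = (np.sin(2 * np.pi * 3 * t) + 0.5 * np.sin(2 * np.pi * 11 * t)
          + rng.normal(0, 0.1, t.size))            # extra fault harmonic
print(mahalanobis(features(new_healthy)) < mahalanobis(features(faulty)))
```

The fault harmonic inflates the third singular value well beyond its baseline spread, so the faulty signal's distance to the baseline space is much larger than the healthy one's.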

  5. Agent-based Modeling Methodology for Analyzing Weapons Systems

    Science.gov (United States)

    2015-03-26

objective is to attack the boost phase of ballistic missiles using the Airborne Weapons Layer concept (AWL) (Corbett, 2013) and (Rood, Chilton, Campbell...and analysis techniques used in this research. Chapter 4 provides analysis of the simulation model to illustrate the methodology in Chapter 3 and to... techniques, and procedures. The purpose of our research is to study the use of a new missile system within an air combat environment. Therefore, the

  6. Tracking and Analyzing Individual Distress Following Terrorist Attacks Using Social Media Streams.

    Science.gov (United States)

    Lin, Yu-Ru; Margolin, Drew; Wen, Xidao

    2017-08-01

    Risk research has theorized a number of mechanisms that might trigger, prolong, or potentially alleviate individuals' distress following terrorist attacks. These mechanisms are difficult to examine in a single study, however, because the social conditions of terrorist attacks are difficult to simulate in laboratory experiments and appropriate preattack baselines are difficult to establish with surveys. To address this challenge, we propose the use of computational focus groups and a novel analysis framework to analyze a social media stream that archives user history and location. The approach uses time-stamped behavior to quantify an individual's preattack behavior after an attack has occurred, enabling the assessment of time-specific changes in the intensity and duration of an individual's distress, as well as the assessment of individual and social-level covariates. To exemplify the methodology, we collected over 18 million tweets from 15,509 users located in Paris on November 13, 2015, and measured the degree to which they expressed anxiety, anger, and sadness after the attacks. The analysis resulted in findings that would be difficult to observe through other methods, such as that news media exposure had competing, time-dependent effects on anxiety, and that gender dynamics are complicated by baseline behavior. Opportunities for integrating computational focus group analysis with traditional methods are discussed. © 2017 Society for Risk Analysis.

  7. Depression After Heart Attack

    Science.gov (United States)

... Heart Attack? Redford B. Williams. Circulation. 2011;123:e639-e640. Originally published June 27, 2011. https://doi.org/10.1161/CIRCULATIONAHA.110.017285

  8. Fatal crocodile attack.

    Science.gov (United States)

    Chattopadhyay, Saurabh; Shee, Biplab; Sukul, Biswajit

    2013-11-01

Attacks on human beings by various animals, leading to varied types of injuries and even death in some cases, are not uncommon. Crocodile attacks on humans have been reported from a number of countries across the globe. Deaths in such attacks are mostly due to mechanical injuries or drowning. Bites by crocodiles often cause the limbs to be separated from the body. The present case refers to a fatal attack by a crocodile on a 35-year-old female in which only the mutilated head of the victim was recovered. Multiple lacerated wounds over the face and scalp, along with fractures of the cranial bones, were detected on autopsy. Two distinct bite marks in the form of punched-in holes were noted over the parietal and frontal bones. The injuries on the head, with its traumatic amputation from the body, were sufficient to cause death. However, the presence of other fatal injuries on the unrecovered body parts could not be ruled out. Copyright © 2013 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  9. Quantitative studies of rhubarb using quantitative analysis of multicomponents by single marker and response surface methodology.

    Science.gov (United States)

    Sun, Jiachen; Wu, Yueting; Dong, Shengjie; Li, Xia; Gao, Wenyuan

    2017-10-01

    In this work, we developed a novel approach to evaluate the contents of bioactive components in rhubarb. The present method was based on the quantitative analysis of multicomponents by a single-marker and response surface methodology approaches. The quantitative analysis of multicomponents by a single-marker method based on high-performance liquid chromatography coupled with photodiode array detection was developed and applied to determine the contents of 12 bioactive components in rhubarb. No significant differences were found in the results from the quantitative analysis of multicomponents by a single-marker and the external standard method. In order to maximize the extraction of 12 bioactive compounds in rhubarb, the ultrasonic-assisted extraction conditions were obtained by the response surface methodology coupled with Box-Behnken design. According to the obtained results, we showed that the optimal conditions would be as follows: proportion of ethanol/water 74.39%, solvent-to-solid ratio 24.07:1 v/w, extraction time 51.13 min, and extraction temperature 63.61°C. The analytical scheme established in this research should be a reliable, convenient, and appropriate method for quantitative determination of bioactive compounds in rhubarb. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. A rigorous methodology for development and uncertainty analysis of group contribution based property models

    DEFF Research Database (Denmark)

    Frutiger, Jerome; Abildskov, Jens; Sin, Gürkan

) assessment of property model prediction errors, (iii) effect of outliers and data pre-treatment, (iv) formulation of parameter estimation problem (e.g. weighted least squares, ordinary least squares, robust regression, etc.) In this study a comprehensive methodology is developed to perform a rigorous......) weighted-least-square regression. 3) Initialization of estimation by use of linear algebra providing a first guess. 4) Sequential parameter and simultaneous GC parameter estimation by using 4 different minimization algorithms. 5) Thorough uncertainty analysis: a) based on asymptotic approximation of parameter

  11. Analysis of the link between a definition of sustainability and the life cycle methodologies

    DEFF Research Database (Denmark)

    Jørgensen, Andreas; Herrmann, Ivan Tengbjerg; Bjørn, Anders

    2013-01-01

    , is presented and detailed to a level enabling an analysis of the relation to the impact categories at midpoint level considered in life cycle (LC) methodologies.The interpretation of the definition of sustainability as outlined in Our Common Future (WCED 1987) suggests that the assessment of a product...... sustainability assessment (LCSA) if focusing on the monetary gains or losses for the poor. Yet, this is an aspect which is already considered in several SLCA approaches.The current consensus that LCSA can be performed through combining the results from an SLCA, LCA and LCC is only partially supported...

  12. Sensitivity analysis of source driven subcritical systems by the HGPT methodology

    International Nuclear Information System (INIS)

    Gandini, A.

    1997-01-01

The heuristically based generalized perturbation theory (HGPT) methodology has been extensively used in recent decades for analysis studies in the nuclear reactor field. Its use leads to fundamental reciprocity relationships from which perturbation, or sensitivity, expressions can be derived, to first and higher order, in terms of simple integration operations on quantities calculated at unperturbed system conditions. Its application to subcritical, source-driven systems, now considered with increasing interest in many laboratories for their potential use as nuclear waste burners and/or safer energy producers, is discussed here, with particular emphasis on problems involving an intensive system control variable. (author)

  13. Problem solving and data analysis using Minitab a clear and easy guide to six sigma methodology

    CERN Document Server

    Khan, Rehman M

    2013-01-01

Six Sigma statistical methodology using Minitab. Problem Solving and Data Analysis using Minitab presents example-based learning to aid readers in understanding how to use MINITAB 16 for statistical analysis and problem solving. Each example and exercise is broken down into the exact steps that must be followed in order to take the reader through key learning points and work through complex analyses. Exercises are featured at the end of each example so that readers can be assured they have understood the key learning points. Key features: provides readers with a step-by-step guide to problem solving and statistical analysis using Minitab 16, which is also compatible with version 15; includes fully worked examples with graphics showing menu selections and Minitab outputs; uses example-based learning that readers can work through at their own pace; contains hundreds of screenshots to aid the reader, along with explanations of the statistics being performed and interpretation of results. Presents the core s...

  14. Adaptation of SW-846 methodology for the organic analysis of radioactive mixed wastes

    International Nuclear Information System (INIS)

    Griest, W.H.; Schenley, R.L.; Tomkins, B.A.; Caton, J.E. Jr.; Fleming, G.S.; Harmon, S.H.; Wachter, L.J.; Garcia, M.E.; Edwards, M.D.

    1990-01-01

    Modifications to SW-846 sample preparation methodology permit the organic analysis of radioactive mixed waste with minimum personal radiation exposure and equipment contamination. This paper describes modifications to SW-846 methods 5030 and 3510-3550 for sample preparation in radiation-zoned facilities (hood, glove box, and hot cell) and GC-MS analysis of the decontaminated organic extracts in a conventional laboratory for volatile and semivolatile organics by methods 8240 and 8270 (respectively). Results will be presented from the analysis of nearly 70 nuclear waste storage tank liquids and 17 sludges. Regulatory organics do not account for the organic matter suggested to be present by total organic carbon measurements. 7 refs., 5 tabs

  15. Methodological aspects of fuel performance system analysis at raw hydrocarbon processing plants

    Science.gov (United States)

    Kulbjakina, A. V.; Dolotovskij, I. V.

    2018-01-01

The article discusses the methodological aspects of fuel performance system analysis at raw hydrocarbon (RH) processing plants. Modern RH processing facilities are major consumers of energy resources (ER) for their own needs. Reducing ER consumption, including fuel, and developing a rational fuel system structure are complex and relevant scientific tasks that can only be accomplished using system analysis and complex system synthesis. In accordance with the principles of system analysis, the hierarchical structure of the fuel system, a block scheme for synthesizing the most efficient variant of the fuel system using mathematical models, and a set of performance criteria have been developed for the main stages of the study. Results are provided from the introduction of specific engineering solutions for developing on-site energy supply sources at RH processing facilities.

  16. BIRD ATTACK OCULAR INJURIES.

    Science.gov (United States)

    Tabatabaei, Seyed Ali; Soleimani, Mohammad; Behrouz, Mahmoud Jabbarvand

    2017-03-29

To report 30 patients with bird attack-related eye injuries. This study was performed among patients presenting to Farabi Eye Hospital, Tehran, Iran, from 2010 to 2015 with a history of bird attack causing eye injury. The inclusion criteria were a history of a bird attack by pecking causing eye injury, and treatment and follow-up records for at least 6 months after treatment. The primary eye examinations included a full ophthalmic examination with evaluation of uncorrected visual acuity and best-corrected visual acuity (BCVA), anterior segment slit lamp biomicroscopy, and photography. For all patients with penetrating injury, primary repair was undertaken. Thirty patients (10 females and 20 males) with a mean age of 23.3 ± 18.5 years entered the study. The most common zone of injury was zone 1 (P < 0.001), and lensectomy was not needed in the majority of patients (P < 0.001). The most common bird causing the injury was the mynah (P < 0.001). Patients with a baseline BCVA of less than 20/200 and those with endophthalmitis had statistically worse final BCVA after treatment. Patients attacked by mynah birds had significantly better pretreatment uncorrected visual acuity and BCVA. The most common bird causing eye injury in this sample of patients from Iran was the mynah, which differs from previous studies indicating rooster attacks as the most common cause of eye injury. The authors also found that the most common zone of injury was zone 1, and that the presence of endophthalmitis and lower baseline BCVA were significant risk factors for worse visual outcomes.

  17. Risk of stroke and cardiovascular events after ischemic stroke or transient ischemic attack in patients with type 2 diabetes or metabolic syndrome: secondary analysis of the Stroke Prevention by Aggressive Reduction in Cholesterol Levels (SPARCL) trial

    DEFF Research Database (Denmark)

    Callahan, Alfred; Amarenco, Pierre; Goldstein, Larry B

    2011-01-01

    To perform a secondary analysis of the Stroke Prevention by Aggressive Reduction in Cholesterol Levels (SPARCL) trial, which tested the effect of treatment with atorvastatin in reducing stroke in subjects with a recent stroke or transient ischemic attack, to explore the effects of treatment...

  18. Performance analysis and implementation of proposed mechanism for detection and prevention of security attacks in routing protocols of vehicular ad-hoc network (VANET)

    Directory of Open Access Journals (Sweden)

    Parul Tyagi

    2017-07-01

Full Text Available Next-generation communication networks have become widely popular as ad-hoc networks, broadly categorized as mobile-node-based mobile ad-hoc networks (MANET) and vehicular-node-based vehicular ad-hoc networks (VANET). VANET is aimed at maintaining driver safety through autonomous communication with nearby vehicles. Each vehicle in the ad-hoc network performs as an intelligent mobile node characterized by high mobility and the formation of dynamic networks. Ad-hoc networks are decentralized dynamic networks that need efficient and secure communication, since the vehicles are persistently in motion. These networks are susceptible to various attacks, such as Wormhole attacks, denial-of-service attacks, and Black Hole attacks. The paper examines and investigates the security features of routing protocols in VANET and the applicability of the AODV (Ad hoc On-Demand Distance Vector) protocol to detect and tackle a particular category of network attacks known as Black Hole attacks. A new algorithm is proposed to enhance the security mechanism of the AODV protocol by detecting Black Hole attacks and preventing the network from such attacks: the source node stores all route replies in a lookup table, which holds the sequence numbers of all route replies arranged in ascending order using PUSH and POP operations. A priority is calculated based on the sequence number, and RREPs with an implausibly high destination sequence number are discarded. The results show that the proposed algorithm for detection and prevention of Black Hole attacks increases security in Intelligent Transportation Systems (ITS) and reduces the effect of malicious nodes in the VANET. The NCTUns simulator is used in this research work.
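The RREP-filtering idea described above can be sketched as follows; the field names, the median-based threshold, and the factor of 2 are my assumptions for illustration, not the paper's exact heuristic:

```python
# Black Hole mitigation sketch for AODV-style route discovery: collect
# route replies (RREPs) in an ascending lookup table keyed on destination
# sequence number and discard replies whose sequence number is
# implausibly high -- the classic black hole signature.
from dataclasses import dataclass

@dataclass
class RouteReply:
    sender: str
    dest_seq: int
    hop_count: int

def filter_black_hole(rreps, factor=2.0):
    """Sort RREPs by dest_seq (ascending) and drop any reply whose
    sequence number exceeds `factor` times the median value."""
    table = sorted(rreps, key=lambda r: r.dest_seq)  # ascending lookup table
    median = table[len(table) // 2].dest_seq
    return [r for r in table if r.dest_seq <= factor * median]

rreps = [
    RouteReply("n2", 105, 3),
    RouteReply("n4", 99, 4),
    RouteReply("n7", 9_999_999, 1),   # black hole: huge seq, 1 hop
    RouteReply("n3", 110, 2),
]
kept = filter_black_hole(rreps)
print([r.sender for r in kept])
```

A real implementation would run per destination inside the route discovery timer, and the threshold would be tuned against legitimate sequence-number growth.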

  19. Blocking of Brute Force Attack

    OpenAIRE

    M.Venkata Krishna Reddy

    2012-01-01

    A common threat Web developers face is a password-guessing attack known as a brute-force attack. A brute-force attack is an attempt to discover a password by systematically trying every possible combination of letters, numbers, and symbols until you discover the one correct combination that works. If your Web site requires user authentication, you are a good target for a brute-force attack. An attacker can always discover a password through a brute-force attack, but the downside is that it co...
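A minimal sketch of one standard countermeasure against such password guessing, a sliding-window account lockout; the thresholds, class name, and window length are illustrative choices, not a prescribed design:

```python
# Brute-force mitigation sketch: refuse further login attempts for an
# account after `max_fails` failures within a sliding time window.
import time
from collections import defaultdict, deque

class LoginGuard:
    def __init__(self, max_fails=5, window=300.0):
        self.max_fails = max_fails
        self.window = window
        self.fails = defaultdict(deque)   # username -> failure timestamps

    def allowed(self, user, now=None):
        now = time.monotonic() if now is None else now
        q = self.fails[user]
        while q and now - q[0] > self.window:
            q.popleft()                   # expire failures outside the window
        return len(q) < self.max_fails

    def record_failure(self, user, now=None):
        now = time.monotonic() if now is None else now
        self.fails[user].append(now)

guard = LoginGuard()
for i in range(5):
    guard.record_failure("alice", now=float(i))
print(guard.allowed("alice", now=5.0))    # False: locked out
print(guard.allowed("alice", now=400.0))  # True: window expired
```

Rate limiting of this kind raises the cost of each guess; it is usually combined with CAPTCHAs or exponential back-off so a brute-force attack becomes impractical rather than merely slow.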

  20. Factors and competitiveness analysis in rare earth mining, new methodology: case study from Brazil.

    Science.gov (United States)

    Silva, Gustavo A; Petter, Carlos O; Albuquerque, Nelson R

    2018-03-01

    Rare earths are increasingly being applied in high-tech industries, such as green energy (e.g. wind power), hybrid cars, electric cars, high-performance permanent magnets, superconductors, luminophores and many other industrial sectors involved in modern technologies. Given that China dominates this market and imposes restrictions on production and exports whenever opportunities arise, it is becoming more and more challenging to develop business ventures in this sector. Several initiatives have been taken to prospect new resources and develop the production chain, including the mining of these mineral assets around the world, but uncertainty factors, including currently low prices, have increased the challenge of transforming current resources into deposits or productive mines. Thus, analyzing the competitiveness of advanced projects becomes indispensable. This work introduces a new methodology of competitiveness analysis in which selected variables are treated as main factors that can render a rare earth element (REE) mining enterprise unfeasible. With this methodology, which is quite practical and reproducible, it was possible to verify some real facts, such as: the Lynas Mount Weld CLD (AUS) Project is resilient to the uncertainties of the RE sector, while the Molycorp Project is facing major financial difficulties (under judicial reorganization). It was also possible to verify that the Araxá Project of CBMM in Brazil is one of the most competitive in that country. Thus, we contribute to the existing literature by providing a new methodology for competitiveness analysis in rare earth mining.

  1. Factors and competitiveness analysis in rare earth mining, new methodology: case study from Brazil

    Directory of Open Access Journals (Sweden)

    Gustavo A. Silva

    2018-03-01

    Full Text Available Rare earths are increasingly being applied in high-tech industries, such as green energy (e.g. wind power), hybrid cars, electric cars, high-performance permanent magnets, superconductors, luminophores and many other industrial sectors involved in modern technologies. Given that China dominates this market and imposes restrictions on production and exports whenever opportunities arise, it is becoming more and more challenging to develop business ventures in this sector. Several initiatives have been taken to prospect new resources and develop the production chain, including the mining of these mineral assets around the world, but uncertainty factors, including currently low prices, have increased the challenge of transforming current resources into deposits or productive mines. Thus, analyzing the competitiveness of advanced projects becomes indispensable. This work introduces a new methodology of competitiveness analysis in which selected variables are treated as main factors that can render a rare earth element (REE) mining enterprise unfeasible. With this methodology, which is quite practical and reproducible, it was possible to verify some real facts, such as: the Lynas Mount Weld CLD (AUS) Project is resilient to the uncertainties of the RE sector, while the Molycorp Project is facing major financial difficulties (under judicial reorganization). It was also possible to verify that the Araxá Project of CBMM in Brazil is one of the most competitive in that country. Thus, we contribute to the existing literature by providing a new methodology for competitiveness analysis in rare earth mining. Keywords: Earth sciences, Business, Economics, Industry

  2. Gait analysis methodology for the measurement of biomechanical parameters in total knee arthroplasties. A literature review.

    Science.gov (United States)

    Papagiannis, Georgios I; Triantafyllou, Athanasios I; Roumpelakis, Ilias M; Papagelopoulos, Panayiotis J; Babis, George C

    2018-03-01

    Gait analysis using external skin markers provides scope for the study of the kinematic and kinetic parameters shown by different total knee arthroplasties (TKA). An appropriate methodology is therefore of great importance for the collection and correlation of valid data. Equipment must be calibrated before measurements to assure accuracy: because of the nature of gait activity, force plates should be calibrated at 1080 Hz and optoelectronic cameras should use a 120 Hz frequency. The Davis model, which accurately defines the positions of the markers, is widely accepted and cited for the gait analysis of TKAs. To ensure the reproducibility of the measurement, a static trial in the anatomical position must be captured. Subsequently, all acquisitions of dynamic data must be checked for consistency in walking speed and for abnormal gait style due to fatigue or distraction. To establish the repeatability of the measurement, this procedure must be repeated for a pre-defined number of 3-5 gait cycles. Anthropometric measurements should be combined with three-dimensional marker data from the static trial to provide the positions of the joint centers and define the anatomical axes of the total knee arthroplasty. Kinetic data should be normalized to bodyweight (BW) or to a percentage of BW and height, depending on the study. External moments should be calculated by inverse dynamics and amplitude-normalized to body mass (Nm/kg). Thus a standard gait analysis methodology for measuring TKA biomechanical parameters is necessary for the collection and correlation of accurate, adequate, valid and reproducible data. Further research should be done to clarify whether the development of a specific kinematic model is appropriate for a more accurate definition of the total knee implant joint center in measurements concerning 3D gait analysis.
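    The two normalizations described (forces to bodyweight, moments to body mass) can be illustrated with a minimal sketch; the force and moment values below are invented for the example.

    ```python
    # Sketch of the normalizations described in the abstract:
    # ground-reaction force expressed in multiples of bodyweight (BW),
    # external joint moments amplitude-normalized to body mass (Nm/kg).
    G = 9.81  # gravitational acceleration, m/s^2

    def normalize_force_to_bw(force_n, mass_kg):
        """Express a ground-reaction force as a multiple of bodyweight."""
        return force_n / (mass_kg * G)

    def normalize_moment(moment_nm, mass_kg):
        """Amplitude-normalize an external moment to body mass (Nm/kg)."""
        return moment_nm / mass_kg

    bw_ratio = normalize_force_to_bw(850.0, 75.0)   # illustrative values
    knee_moment = normalize_moment(45.0, 75.0)      # Nm/kg
    ```

    Normalizing in this way makes kinetic curves comparable across subjects of different size, which is what permits pooling data across TKA patients.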

  3. Methodology for the analysis of self-tensioned wooden structural floors

    Directory of Open Access Journals (Sweden)

    F. Suárez-Riestra

    2017-09-01

    Full Text Available A self-tensioning system is described, constituted by a force-multiplying device which, attached to the supports at the ends of the structural element, converts the vertical resultant of the gravitational actions into an effective tensioning action through the movement induced by a set of rods. The self-tensioning system offers high performance thanks to the beneficial effect of the opposite deflection generated by the tensioning, in proportion to the increase of the gravitational action. This allows long-span timber ribbed floors to be designed with reduced depths. The complexity of calculation due to the non-linearity of the system can be obviated with the methodology of analysis developed in the article. In order to illustrate the advantages of the self-tensioning system and the methodology of analysis, six cases of ribbed floors have been analysed, with spans of 9, 12 and 15 m and variable imposed loads of 3.00 kN/m² and 5.00 kN/m².

  4. Investigating DMOs through the Lens of Social Network Analysis: Theoretical Gaps, Methodological Challenges and Practitioner Perspectives

    Directory of Open Access Journals (Sweden)

    Dean HRISTOV

    2015-06-01

    Full Text Available The extant literature on networks in tourism management research has traditionally acknowledged destinations as the primary unit of analysis. This paper takes an alternative perspective and positions Destination Management Organisations (DMOs at the forefront of today’s tourism management research agenda. Whilst providing a relatively structured approach to generating enquiry, network research vis-à-vis Social Network Analysis (SNA in DMOs is often surrounded by serious impediments. Embedded in the network literature, this conceptual article aims to provide a practitioner perspective on addressing the obstacles to undertaking network studies in DMO organisations. A simple, three-step methodological framework for investigating DMOs as interorganisational networks of member organisations is proposed in response to complexities in network research. The rationale behind introducing such framework lies in the opportunity to trigger discussions and encourage further academic contributions embedded in both theory and practice. Academic and practitioner contributions are likely to yield insights into the importance of network methodologies applied to DMO organisations.
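    As a minimal illustration of the kind of SNA measure such a framework would feed on, the sketch below computes normalized degree centrality over a hypothetical DMO member network; all node names are invented.

    ```python
    # Sketch: normalized degree centrality for an interorganisational network
    # of DMO member organisations (hypothetical edges).
    edges = [("DMO", "HotelA"), ("DMO", "Airline"),
             ("HotelA", "Airline"), ("DMO", "Museum")]

    def degree_centrality(edges):
        nodes = {n for e in edges for n in e}
        deg = {n: 0 for n in nodes}
        for a, b in edges:
            deg[a] += 1
            deg[b] += 1
        n = len(nodes)
        # normalize by the maximum possible degree (n - 1)
        return {k: v / (n - 1) for k, v in deg.items()}

    centrality = degree_centrality(edges)
    ```

    In this toy network the DMO itself is maximally central, the kind of quantitative finding a three-step SNA study of a real DMO would aim to establish or refute.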

  5. Adding value in oil and gas by applying decision analysis methodologies: case history

    Energy Technology Data Exchange (ETDEWEB)

    Marot, Nicolas [Petro Andina Resources Inc., Alberta (Canada); Francese, Gaston [Tandem Decision Solutions, Buenos Aires (Argentina)

    2008-07-01

    Petro Andina Resources Ltd., together with Tandem Decision Solutions, developed a strategic long-range plan applying decision analysis methodology. The objective was to build a robust and fully integrated strategic plan that accomplishes company growth goals and sets the strategic directions for the long range. The stochastic methodology and the Integrated Decision Management (IDM™) staged approach allowed the company to visualize the value and risk associated with the different strategies while achieving organizational alignment, clarity of action and confidence in the path forward. A decision team jointly involving PAR representatives and Tandem consultants was established to carry out this four-month project. Discovery and framing sessions allowed the team to disrupt the status quo, discuss near- and far-reaching ideas and gather the building blocks from which creative strategic alternatives were developed. A comprehensive stochastic valuation model was developed to assess the potential value of each strategy, applying simulation tools, sensitivity analysis tools and contingency planning techniques. The final insights and results were used to populate the strategic plan presented to the company board, providing confidence to the team and assuring that the work embodies the best available ideas, data and expertise, and that the proposed strategy was ready to be elaborated into an optimized course of action. (author)
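    The stochastic valuation idea can be sketched as a simple Monte Carlo simulation. This is a hedged illustration only: the distributions, parameter values and variable names are invented, not taken from the project.

    ```python
    import random

    # Sketch: simulate an uncertain strategy NPV and summarize expected
    # value and downside risk (all distributions are invented).
    random.seed(42)

    def simulate_npv(n=10_000):
        npvs = []
        for _ in range(n):
            production = random.gauss(100.0, 15.0)   # assumed volume units
            price = random.gauss(60.0, 10.0)         # assumed $/unit
            cost = random.gauss(4000.0, 500.0)       # assumed $k
            npvs.append(production * price - cost)
        return npvs

    npvs = simulate_npv()
    mean_npv = sum(npvs) / len(npvs)
    p_loss = sum(1 for v in npvs if v < 0) / len(npvs)  # downside probability
    ```

    Reporting both the expected value and the probability of a negative outcome is what lets a decision team compare strategies on value and risk simultaneously.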

  6. ALARA cost/benefit analysis at Union Electric company using the ARP/AI methodology

    International Nuclear Information System (INIS)

    Williams, M.C.

    1987-01-01

    This paper describes the development of a specific method for the justification of expenditures associated with reducing occupational radiation exposure to as low as reasonably achievable (ALARA). The methodology is based on the concepts of the Apparent Reduction Potential (ARP) and Achievability Index (AI) as described in NUREG/CR-0446, Union Electric's corporate planning model, and the EPRI model for dose-rate buildup over reactor operating life. The ARP provides a screening test to determine whether there is a need for ALARA expenditures, based on actual or predicted exposure rates and/or dose experience. The AI is a means of assessing all costs and all benefits, even though they are expressed in different units of measurement such as person-rem and dollars, to determine whether ALARA expenditures are justified and what their value is. This method of cost/benefit analysis can be applied by any company or organization utilizing site-specific exposure and dose-rate data, and it incorporates consideration of administrative exposure controls, which may vary from organization to organization. Specific example cases are presented and compared to other methodologies for ALARA cost/benefit analysis.
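    A minimal sketch of the cost/benefit comparison underlying such ALARA justifications follows. The dollars-per-person-rem figure is an assumed policy value for illustration, not one taken from the paper or from NUREG/CR-0446.

    ```python
    # Sketch: compare the cost of a proposed dose-reduction measure with the
    # monetized value of the dose it averts (assumed $/person-rem value).
    VALUE_PER_PERSON_REM = 2000.0  # assumed monetary value of averted dose

    def alara_justified(cost_dollars, dose_averted_person_rem):
        """Return (justified?, monetized benefit) for a proposed measure."""
        benefit = dose_averted_person_rem * VALUE_PER_PERSON_REM
        return benefit >= cost_dollars, benefit

    justified, benefit = alara_justified(cost_dollars=150_000.0,
                                         dose_averted_person_rem=100.0)
    ```

    Expressing person-rem in dollars is exactly the device that lets costs and benefits in different units be weighed on one scale.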

  7. Causality analysis in business performance measurement system using system dynamics methodology

    Science.gov (United States)

    Yusof, Zainuridah; Yusoff, Wan Fadzilah Wan; Maarof, Faridah

    2014-07-01

    One of the main components of the Balanced Scorecard (BSC) that differentiates it from any other performance measurement system (PMS) is the Strategy Map with its unidirectional causality feature. Despite its apparent popularity, criticisms of this causality have been rigorously discussed by earlier researchers. In seeking empirical evidence of causality, propositions based on the service profit chain theory were developed and tested using econometric analysis, the Granger causality test, on 45 data points. However, well-established causality models were found to be insufficient, as only 40% of the causal linkages were supported by the data. Expert knowledge was suggested for use in situations with insufficient historical data. The Delphi method was selected and conducted to obtain consensus on the existence of causality among 15 selected experts, utilizing 3 rounds of questionnaires. The study revealed that only 20% of the propositions were not supported. The existence of bidirectional causality, which demonstrates significant dynamic environmental complexity through interaction among measures, was obtained from both methods. With that, computer modeling and simulation using the System Dynamics (SD) methodology was developed as an experimental platform to identify how policies impact business performance in such environments. Reproduction, sensitivity and extreme-condition tests were conducted on the developed SD model to ensure its capability to mimic reality and its robustness and validity as a causality analysis platform. This study applied a theoretical service management model within the BSC domain to a practical situation using the SD methodology, where very limited work has been done.
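    The Granger-causality step can be sketched as a self-contained F-test on synthetic data: fit a restricted autoregressive model of y, then an unrestricted one augmented with lags of x, and compare residual sums of squares. This illustrates the test's mechanics only; the study's actual 45-point dataset and model are not reproduced here.

    ```python
    import numpy as np

    # Synthetic data where y depends on lagged x by construction, so the
    # Granger F statistic should be large.
    rng = np.random.default_rng(0)
    n, lag = 200, 1
    x = rng.normal(size=n)
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + rng.normal(scale=0.1)

    Y = y[lag:]
    restricted = np.column_stack([np.ones(n - lag), y[:-lag]])      # own lags only
    unrestricted = np.column_stack([restricted, x[:-lag]])          # plus x lag

    def rss(X, target):
        beta, *_ = np.linalg.lstsq(X, target, rcond=None)
        resid = target - X @ beta
        return float(resid @ resid)

    rss_r, rss_u = rss(restricted, Y), rss(unrestricted, Y)
    df = (n - lag) - unrestricted.shape[1]       # residual degrees of freedom
    F = ((rss_r - rss_u) / 1) / (rss_u / df)     # one restriction: the x lag
    ```

    A large F (relative to the F(1, df) distribution) is evidence that x "Granger-causes" y; with only 45 points, as in the study, the test's power is limited, which motivated the Delphi supplement.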

  8. Core melt progression and consequence analysis methodology development in support of the Savannah River Reactor PSA

    International Nuclear Information System (INIS)

    O'Kula, K.R.; Sharp, D.A.; Amos, C.N.; Wagner, K.C.; Bradley, D.R.

    1992-01-01

    A three-level Probabilistic Safety Assessment (PSA) of production reactor operation has been underway since 1985 at the US Department of Energy's Savannah River Site (SRS). The goals of this analysis are to: Analyze existing margins of safety provided by the heavy-water reactor (HWR) design challenged by postulated severe accidents; Compare measures of risk to the general public and onsite workers to guideline values, as well as to those posed by commercial reactor operation; and Develop the methodology and database necessary to prioritize improvements to engineering safety systems and components, operator training, and engineering projects that contribute significantly to improving plant safety. PSA technical staff from the Westinghouse Savannah River Company (WSRC) and Science Applications International Corporation (SAIC) have performed the assessment despite two obstacles: A variable baseline plant configuration and power level; and a lack of technically applicable code methodology to model the SRS reactor conditions. This paper discusses the detailed effort necessary to modify the requisite codes before accident analysis insights for the risk assessment were obtained

  9. Evidential analytic hierarchy process dependence assessment methodology in human reliability analysis

    International Nuclear Information System (INIS)

    Chen, Lu Yuan; Zhou, Xinyi; Xiao, Fuyuan; Deng, Yong; Mahadevan, Sankaran

    2017-01-01

    In human reliability analysis, dependence assessment is an important issue in risky large complex systems, such as the operation of a nuclear power plant. Many existing methods depend on an expert's judgment, which contributes to the subjectivity and restrictions of the results. Recently, a computational method based on the Dempster-Shafer evidence theory and the analytic hierarchy process has been proposed to handle dependence in human reliability analysis. The model can deal with uncertainty in an analyst's judgment and reduce the subjectivity in the evaluation process. However, the computation is heavy and complicated to some degree. The most important issue is that the existing method addresses only the positive aspect, which may cause an underestimation of the risk. In this study, a new evidential analytic hierarchy process dependence assessment methodology, based on an improvement of the existing methods, is proposed, which is expected to be easier and more effective.
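    The evidence-combination operation at the core of such Dempster-Shafer models can be sketched as follows. The frame of discernment and the mass assignments are invented for illustration; they are not the paper's actual dependence levels or weights.

    ```python
    # Sketch of Dempster's rule of combination over mass functions whose
    # focal elements are frozensets of hypotheses.
    def combine(m1, m2):
        """Combine two mass functions; returns the normalized fused masses."""
        combined, conflict = {}, 0.0
        for a, w1 in m1.items():
            for b, w2 in m2.items():
                inter = a & b
                if inter:
                    combined[inter] = combined.get(inter, 0.0) + w1 * w2
                else:
                    conflict += w1 * w2          # mass assigned to the empty set
        k = 1.0 - conflict                       # normalization constant
        return {s: w / k for s, w in combined.items()}

    # Two analysts' judgments on a dependence level: low (L) or medium (M)
    m1 = {frozenset("L"): 0.6, frozenset("LM"): 0.4}
    m2 = {frozenset("L"): 0.5, frozenset("M"): 0.5}
    fused = combine(m1, m2)
    ```

    The fused masses still sum to one, and the partial belief committed to the set {L, M} is redistributed by the intersection with the second analyst's sharper judgment.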

  10. Patient's radioprotection and analysis of DPC practices and certification of health facilities - Methodological guide

    International Nuclear Information System (INIS)

    Bataillon, Remy; Lafont, Marielle; Rousse, Carole; Vuillez, Jean-Philippe; Ducou Le Pointe, Hubert; Grenier, Nicolas; Lartigau, Eric; Orcel, Philippe; Dujarric, Francis; Beaupin, Alain; Bar, Olivier; Blondet, Emmanuelle; Combe, Valerie; Pages, Frederique

    2012-11-01

    This methodological guide has been published in compliance with French and European regulatory texts to define the modalities of implementation of the assessment of clinical practices resulting in exposure to ionizing radiation in medical environment (radiotherapy, radio-surgery, interventional radiology, nuclear medicine), to promote clinical audits, and to ease the implementation of programs of continuous professional development in radiotherapy, radiology and nuclear medicine. This guide proposes an analysis of professional practices through analysis sheets which address several aspects: scope, practice data, objectives in terms of improvement of radiation protection, regulatory and institutional references, operational objectives, methods, approaches and tools, follow-up indicators, actions to improve practices, professional target, collective approach, program organisation, and program valorisation in existing arrangements. It also gives 20 program proposals which notably aim at a continuous professional development, 5 of them dealing with diagnosis-oriented imagery-based examinations, 9 with radiology and risk management, 4 with radiotherapy, and 2 with nuclear medicine

  11. Methodology of analysis sustainable development of Ukraine by using the theory fuzzy logic

    Directory of Open Access Journals (Sweden)


    2016-02-01

    Full Text Available The objective of the article is to analyse the theoretical and methodological aspects of assessing sustainable development in times of crisis. A methodical approach to the analysis of the sustainable development of a territory, taking into account an assessment of the level of economic security, is proposed. The necessity of developing a complex methodical approach that accounts for indeterminacy and multi-criteria properties in the tasks of ensuring economic security, on the basis of the fuzzy logic theory (the fuzzy sets theory), is proved. Results of using the fuzzy sets method to trace the dynamics of changes in the sustainable development of Ukraine during 2002-2012 are presented.
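    The fuzzy-set machinery referred to above can be illustrated with a triangular membership function grading an indicator's degree of membership in a fuzzy set. The set name ("sustainable") and the parameters are assumptions for the sketch, not values from the article.

    ```python
    # Sketch: triangular membership function, the basic building block of
    # fuzzy-logic assessments of indicator values.
    def triangular(x, a, b, c):
        """Membership function with support [a, c] and peak at b."""
        if x <= a or x >= c:
            return 0.0
        if x <= b:
            return (x - a) / (b - a)     # rising edge
        return (c - x) / (c - b)         # falling edge

    # Membership of a normalized indicator value in a hypothetical
    # "sustainable" fuzzy set
    mu = triangular(0.7, a=0.4, b=0.8, c=1.0)
    ```

    Aggregating such memberships across indicators is what lets the approach handle indeterminacy that crisp thresholds cannot.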

  12. [The perspective directions of development of methodology of the analysis of risk in Russia].

    Science.gov (United States)

    Avaliani, S L; Bezpal'ko, L E; Bobkova, I E; Mishina, A L

    2013-01-01

    In the article, the perspective directions of development of the risk analysis methodology in Russia are considered, taking into account the latest world achievements in this area and the requirements for harmonization of a system for the control of environmental quality. The main problems of risk analysis in relation to the regulation of environmental protection activity were emphasized to be related both to the insufficiency of legislative, normative and executive support for this direction of administrative activity and to the need to solve methodical questions concerning new tendencies in the justification and use of reference levels of chemicals, and the development of modern approaches to the specification of carcinogenic and non-carcinogenic risks, including cumulative ones.

  13. A SAS2H/KENO-V Methodology for 3D Full Core depletion analysis

    International Nuclear Information System (INIS)

    Milosevic, M.; Greenspan, E.; Vujic, J.; Petrovic, B.

    2003-04-01

    This paper describes the use of a SAS2H/KENO-V methodology for 3D full core depletion analysis and illustrates its capabilities by applying it to burnup analysis of the IRIS core benchmarks. This new SAS2H/KENO-V sequence combines a 3D Monte Carlo full core calculation of node power distribution and a 1D Wigner-Seitz equivalent cell transport method for independent depletion calculation of each of the nodes. This approach reduces by more than an order of magnitude the time required for getting comparable results using the MOCUP code system. The SAS2H/KENO-V results for the asymmetric IRIS core benchmark are in good agreement with the results of the ALPHA/PHOENIX/ANC code system. (author)

  14. Methodology for in situ gas sampling, transport and laboratory analysis of gases from stranded cetaceans.

    Science.gov (United States)

    Bernaldo de Quirós, Yara; González-Díaz, Oscar; Saavedra, Pedro; Arbelo, Manuel; Sierra, Eva; Sacchini, Simona; Jepson, Paul D; Mazzariol, Sandro; Di Guardo, Giovanni; Fernández, Antonio

    2011-01-01

    Gas-bubble lesions were described in cetaceans stranded in spatio-temporal concordance with naval exercises using high-powered sonars. A behaviourally induced decompression sickness-like disease was proposed as a plausible causal mechanism, although these findings remain scientifically controversial. Investigations into the constituents of the gas bubbles in suspected gas embolism cases are highly desirable. We have found that vacuum tubes, insulin syringes and an aspirometer are reliable tools for in situ gas sampling, storage and transportation without appreciable loss of gas and without compromising the accuracy of the analysis. Gas analysis is conducted by gas chromatography in the laboratory. This methodology was successfully applied to a mass stranding of sperm whales, to a beaked whale stranded in spatial and temporal association with military exercises and to a cetacean chronic gas embolism case. Results from the freshest animals confirmed that bubbles were relatively free of gases associated with putrefaction and consisted predominantly of nitrogen.

  15. [Evaluation on methodological problems in reports concerning quantitative analysis of syndrome differentiation of diabetes mellitus].

    Science.gov (United States)

    Chen, Bi-Cang; Wu, Qiu-Ying; Xiang, Cheng-Bin; Zhou, Yi; Guo, Ling-Xiang; Zhao, Neng-Jiang; Yang, Shu-Yu

    2006-01-01

    To evaluate the quality of reports published in the past 10 years in China on the quantitative analysis of syndrome differentiation for diabetes mellitus (DM), in order to explore the methodological problems in these reports and find possible solutions. The main medical literature databases in China were searched. Thirty-one articles were included and evaluated by the principles of clinical epidemiology. There were many mistakes and deficiencies in these articles, concerning clinical trial design, diagnostic criteria for DM, standards of syndrome differentiation of DM, case inclusion and exclusion criteria, sample size and its estimation, data comparability and statistical methods. It is necessary and important to improve the quality of reports concerning the quantitative analysis of syndrome differentiation of DM in light of the principles of clinical epidemiology.

  16. Root Source Analysis/ValuStream[Trade Mark] - A Methodology for Identifying and Managing Risks

    Science.gov (United States)

    Brown, Richard Lee

    2008-01-01

    Root Source Analysis (RoSA) is a systems engineering methodology that has been developed at NASA over the past five years. It is designed to reduce cost, schedule, and technical risks by systematically examining critical assumptions and the state of the knowledge needed to bring to fruition the products that satisfy mission-driven requirements, as defined for each element of the Work (or Product) Breakdown Structure (WBS or PBS). This methodology is sometimes referred to as the ValuStream method, as inherent in the process is the linking and prioritizing of uncertainties arising from knowledge shortfalls directly to the customer's mission-driven requirements. RoSA and ValuStream are synonymous terms. RoSA is not simply an alternate or improved method for identifying risks. It represents a paradigm shift. The emphasis is placed on identifying very specific knowledge shortfalls and assumptions that are the root sources of the risk (the why), rather than on assessing the WBS product(s) themselves (the what). In so doing, RoSA looks forward to anticipate, identify, and prioritize knowledge shortfalls and assumptions that are likely to create significant uncertainties/risks (as compared to Root Cause Analysis, which is most often used to look back to discover what was not known, or was assumed, that caused the failure). Experience indicates that RoSA, with its primary focus on assumptions and the state of the underlying knowledge needed to define, design, build, verify, and operate the products, can identify critical risks that historically have been missed by the usual approaches (i.e., the design review process and classical risk identification methods). Further, the methodology answers four critical questions for decision makers and risk managers: 1. What's been included? 2. What's been left out? 3. How has it been validated? 4. Has the real source of the uncertainty/risk been identified, i.e., is the perceived problem the real problem? Users of the RoSA methodology

  17. A methodological proposal for quantifying environmental compensation through the spatial analysis of vulnerability indicators

    Directory of Open Access Journals (Sweden)

    Fabio Enrique Torresan

    2008-06-01

    Full Text Available The aim of this work was to propose a methodology for quantifying environmental compensation through the spatial analysis of vulnerability indicators. A case study was applied to the analysis of sand extraction enterprises in the region of Descalvado and Analândia, inland São Paulo State, Brazil. Environmental vulnerability scores were attributed to the indicators related to erosion, hydrological resources and biodiversity loss. This methodological proposal allowed analyzing the local alternatives of a given enterprise with the objective of reducing impacts and, at the same time, reducing the costs of environmental compensation. The application of the methodology significantly reduced the degree of subjectivity usually associated with most impact evaluation methodologies. The term environmental compensation refers to the developer's obligation to support the implantation and maintenance of Conservation Units, applicable to enterprises with significant environmental impact, in accordance with Law 9.986/2000. This law establishes that the volume of resources to be applied by the developer must be at least 0.5% of the total costs foreseen for the implantation of the enterprise, with this percentage fixed by the competent environmental agency according to the degree of environmental impact. Accordingly, this article proposes a methodology for quantifying environmental compensation through the spatial analysis of environmental vulnerability indicators. The proposal was applied in a case study of sand mining enterprises in the Descalvado/Analândia region, inland São Paulo State. Environmental vulnerability indices were attributed to impact indicators related to erosion, water resources and biodiversity loss. This methodology represents an important instrument of environmental and economic planning and can be adapted to various

  18. Extending the input–output energy balance methodology in agriculture through cluster analysis

    International Nuclear Information System (INIS)

    Bojacá, Carlos Ricardo; Casilimas, Héctor Albeiro; Gil, Rodrigo; Schrevens, Eddie

    2012-01-01

    The input–output balance methodology has been applied to characterize the energy balance of agricultural systems. This study proposes to extend the methodology with multivariate analysis to reveal particular patterns in the energy use of a system. The objective was to demonstrate the usefulness of multivariate exploratory techniques for analyzing the variability found in a farming system and for establishing efficiency categories that can be used to improve the energy balance of the system. To this purpose, an input–output analysis was applied to the major greenhouse tomato production area in Colombia. Individual energy profiles were built and the k-means clustering method was applied to the production factors. On average, the production system in the study zone consumes 141.8 GJ ha⁻¹ to produce 96.4 GJ ha⁻¹, resulting in an energy efficiency of 0.68. With the k-means clustering analysis, three clusters of farmers were identified, with energy efficiencies of 0.54, 0.67 and 0.78. The most energy-efficient cluster grouped 56.3% of the farmers. It is possible to optimize the production system by improving the management practices of those with the lowest energy use efficiencies. Multivariate analysis techniques proved to be a complementary pathway to improve the energy efficiency of a system. -- Highlights: ► An input–output energy balance was estimated for greenhouse tomatoes in Colombia. ► We used the k-means clustering method to classify growers based on their energy use. ► Three clusters of growers were found, with energy efficiencies of 0.54, 0.67 and 0.78. ► Overall system optimization is possible by improving the energy use of the less efficient growers.
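    The clustering step can be sketched in pure Python as a one-dimensional k-means over farm-level energy efficiencies. The data below are invented, chosen only to mirror the three reported efficiency clusters; the study's actual per-farm profiles are multivariate.

    ```python
    # Sketch: k-means (k = 3) over a 1-D list of farm energy efficiencies.
    def kmeans_1d(values, centers, iters=20):
        groups = {}
        for _ in range(iters):
            # assignment step: each value joins its nearest center
            groups = {i: [] for i in range(len(centers))}
            for v in values:
                i = min(range(len(centers)), key=lambda i: abs(v - centers[i]))
                groups[i].append(v)
            # update step: each center moves to the mean of its group
            centers = [sum(g) / len(g) if g else centers[i]
                       for i, g in groups.items()]
        return centers, groups

    efficiencies = [0.52, 0.55, 0.56, 0.65, 0.66, 0.68, 0.69, 0.76, 0.78, 0.80]
    centers, groups = kmeans_1d(efficiencies, centers=[0.5, 0.65, 0.8])
    ```

    Grouping farms this way is what turns a single average efficiency (0.68 in the study) into actionable categories of management practice.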

  19. Heart Attack Coronary Artery Disease

    Science.gov (United States)

    ... Heart Attack, Coronary Artery Disease, Angina: Basic Facts & Information. What ... and oxygen supply; this is what causes a heart attack. If the damaged area is small, however, your ...

  20. Thrombolytic drugs for heart attack

    Science.gov (United States)

    ... gov/ency/article/007488.htm Thrombolytic drugs for heart attack ... supply blood and oxygen to the heart. A heart attack can occur if a blood clot stops the ...