WorldWideScience

Sample records for bfi-fr analyse convergente

  1. Redes convergentes

    Directory of Open Access Journals (Sweden)

    Ángela Marcela Mejía Fajardo

    2004-01-01

    Full Text Available Convergent networks, or multiservice networks, refer to the integration of voice, data and video services over a single network based on IP as the network-layer protocol. This article presents the integration of voice services over IP networks (VoIP) as an example of a convergent network. The architecture of this network consists essentially of the media gateway, the media gateway controller, the signalling gateway and the gatekeeper. Convergent networks have faced, and will continue to face, technical difficulties, since the different services to be offered have different characteristics and network requirements; it is therefore important to discuss traffic engineering and mechanisms that guarantee quality of service.

  2. A adaptação na obra aberta como narrativa convergente

    Directory of Open Access Journals (Sweden)

    André Campos Silva

    2017-07-01

    Full Text Available Through a narrative comparison of the book Os Sertões (1907) and its film adaptation Guerra de Canudos (1997), we propose a reading of Umberto Eco's concept of the Open Work. The adaptation process between media redefines the meaning of the characters; however, considering the open-work process between the two narratives, they become convergent, presenting to the public a multiplicity of media images of the Canudos conflict as a fictional reality.

  3. Aproximación a un modelo de aprovisionamiento de servicios convergentes

    Directory of Open Access Journals (Sweden)

    Julián Andrés Caicedo

    2014-12-01

    Full Text Available The provisioning of telecommunications services has changed markedly over the last 30 years, moving from static, rigid models controlled by a single business actor to dynamic, flexible models with multiple actors in the value chain, oriented towards end users. Different approaches have proposed provisioning models that meet operators' specific requirements; however, the main processes that must be carried out in each phase of the service life cycle to allow effective integration between different provisioning models are still unclear. This article presents an initial model for the provisioning of convergent services, which abstracts the main processes within the design, deployment and operation phases of the service. It also presents a functional pilot that automatically executes the initial deployment process in a convergent environment, based on the processes defined for that phase. Finally, the mechanism is subjected to a scalability test to evaluate its performance.

  4. Ensino da Bioética Convergente de Ricardo Maliandi nos Cursos de Medicina

    Directory of Open Access Journals (Sweden)

    Nalita Maria Hall Brum de Barros Mugayar

    Full Text Available ABSTRACT This article highlights the evident deficiency of medical curricula with regard to the humanities and argues that the study of Bioethics, a discipline that seeks to integrate the humanities with the biological sciences, can help to fill this harmful gap. We present the Convergent Bioethics of Ricardo Maliandi and Oscar Thüer as a valuable theoretical framework capable of helping physicians take the lead in resolving the ethical conflicts inherent in their professional practice without falling into one-sidedness. We compare its theoretical foundations with the well-known, also principlist, proposal of Beauchamp and Childress, pointing out the advantages of the former over the latter. We illustrate its applicability by analysing potential ethical conflicts inferred from the medical record of a patient admitted to the Intensive Care Unit of the Hospital Universitário Antônio Pedro. For this analysis we searched the medical literature for probabilistic data on the disease in question (oesophageal neoplasm with tracheo-oesophageal fistula complicated by pulmonary septic shock), stressing that such data can help in better understanding the prognosis without thereby serving as support for unilateral decisions by the medical team to limit therapy. The medical literature also offered proposals for managing ethically difficult cases such as that of the patient in question. We chose one of them (Azoulay et al.), recognising and demonstrating its compatibility with the Convergent Bioethics of Maliandi and Thüer. This is a theoretical essay on the limitation of therapy, in which we seek to join the grounding found in the literature to its applicability in a real case of a critically ill patient. We believe this article may serve as a starting point for the dissemination of Convergent Ethics, the life's work of the philosopher Ricardo Maliandi.

  5. Validez convergente y discriminante del Inventario de Cociente Emocional (EQ-i)

    Directory of Open Access Journals (Sweden)

    Evangelina Regner

    2008-01-01

    Full Text Available Since the emergence of the construct of emotional intelligence (Salovey & Mayer, 1990), the field of emotional abilities has expanded considerably, owing to the appearance of different conceptualisations, theories and measurement instruments. Current work has focused on studying the validity of the tests used to measure emotional intelligence. The aim of the research reported here was to study the convergent and discriminant validity of Bar-On's Emotional Quotient Inventory (EQ-i; 1997a, 1997b) in an Argentine sample of 100 adults. The instruments administered were the EQ-i, the Revised NEO Personality Inventory (NEO PI-R) of Costa and McCrae (1992), and the Verbal Reasoning (VR) test of Bennett, Seashore and Wesman (1992). To analyse the data, correlations were computed between the EQ-i, the NEO PI-R and the VR test, along with stepwise multiple regressions between the NEO PI-R personality factors and the EQ-i emotional intelligence scales...
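
    The validity analysis described above rests on straightforward correlational machinery. As a rough, hedged illustration (synthetic data and hypothetical variable names, not the study's scores), convergent evidence can be checked as a substantial correlation with a conceptually related measure and discriminant evidence as a near-zero correlation with an unrelated one:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n = 100  # sample size reported in the study

    # Hypothetical standardised scores, for illustration only.
    eq_i = rng.normal(size=n)                 # EQ-i total score
    neo_n = -0.5 * eq_i + rng.normal(size=n)  # a related personality factor
    verbal = rng.normal(size=n)               # Verbal Reasoning score

    # Convergent evidence: sizeable correlation with the related measure.
    r_conv, p_conv = stats.pearsonr(eq_i, neo_n)
    # Discriminant evidence: near-zero correlation with the cognitive measure.
    r_disc, p_disc = stats.pearsonr(eq_i, verbal)

    print(f"EQ-i vs personality factor: r = {r_conv:.2f} (p = {p_conv:.3f})")
    print(f"EQ-i vs Verbal Reasoning:   r = {r_disc:.2f} (p = {p_disc:.3f})")
    ```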

  6. A Lei Geral de Telecomunicações sob uma perspectiva convergente

    Directory of Open Access Journals (Sweden)

    Renata Tonicelli de M. Quelho

    2011-05-01

    Full Text Available The aim of this article is to analyse the adaptability of the Brazilian General Telecommunications Law, Law No. 9.472/97 (LGT), to a context of convergence. First, a characteristic of information and communication technology infrastructures that reveals their convergent potential and the possibility of change in market strategies and public policies is introduced: modularity. Two scenarios that limit convergence are then identified: the existence of islands in communication policy and of silos in the regulatory regime. Layered regulation is used to analyse convergence. In light of these elements, the telecommunications legal framework is tested in order to demonstrate the relative adaptability of the LGT.

  7. O MÉTODO DA PESQUISA CONVERGENTE ASSISTENCIAL E SUA APLICAÇÃO NA PRÁTICA DE ENFERMAGEM

    Directory of Open Access Journals (Sweden)

    Mercedes Trentini

    2017-01-01

    Full Text Available Objective: to reflect on the conduct of three studies that followed Convergent Care Research as their methodological framework. Results: Convergent Care Research is characterised by making improvements and introducing innovations in the context of nursing and health care practice. It is guided by its own attributes: immersion, simultaneity, expansibility and dialogue. Three studies that used the Convergent Care Research method were analysed. Study A consisted of constructing informational material (a booklet) as a technology to be developed from the participants' knowledge and experience regarding computed tomography examinations. Study B set out to develop educational practices with a group of women waste collectors in order to ease workloads and thus prevent occupational accidents. Study C developed a workplace education proposal with nurses working in palliative care, building an instrument for assessing pain in cancer patients. Conclusion: the three studies showed that the Convergent Care Research method enables a convergence between care actions and research actions, creating spaces where these two activities overlap, producing new knowledge and changing care practice. This method allows both research and care practice to be carried out in the same physical and temporal space; to that end, they need to be disentangled when the specific analysis of each is operationalised.

  8. Análisis y diseño de una Infraestructura convergente. Caso de estudio Vblock

    OpenAIRE

    Almeida Galárraga, José Rafael

    2015-01-01

    Converged infrastructure seeks to meet the needs of an increasingly demanding market that is left unsatisfied by the lack of responsiveness of IT teams, which must devote most of their time to integrating and maintaining the highly diverse infrastructure they operate, a diversity that makes its provisioning difficult. This system is framed within a homogeneous infrastructure with compute, network and storage components that...

  9. Tiempo de llegada de solutos sorbentes bajo condiciones de flujo convergente. Momentos estadísticos condicionados.

    OpenAIRE

    Castillo Cerdà, Cristina

    2003-01-01

    The analysis of solute arrival times in subsurface transport is an indispensable tool in aquifer remediation studies and in hydrogeological studies in general. A numerical study with a stochastic methodology is proposed to determine the statistical moments of the arrival time of particles of a non-conservative (sorbing) contaminant under convergent flow conditions in a heterogeneous medium. The statistical moments are considered conditioned...

  10. El futuro de la TV europea es híbrido, convergente y cada vez menos público

    Directory of Open Access Journals (Sweden)

    Francisco Campos-Freire

    2013-02-01

    Full Text Available Public television in Europe is losing weight and influence in this second decade of the twenty-first century: in communication policy, in its own means of financing, in the battle for audiences and in the technological systems of distribution, in the face of the new hybrid and convergent models championed by cable, satellite and Internet distribution (IPTV) operators. Faced with audience fragmentation and the weakening of the traditional financing models (advertising, licence fee, subsidy), broadcasters are concentrating their distribution strategies on economies of scale and of club, through duopolies or integrated market platforms.

  11. Saúde mental na atenção básica: uma abordagem convergente assistencial Salud mental en atención primaria: un abordaje convergente-asistencial Mental health in primary care: an assistant research approach

    Directory of Open Access Journals (Sweden)

    Milena Hohmann Antonacci

    2011-03-01

    Full Text Available This study aims to learn about a community's expectations and wishes regarding the creation of a mental health group in primary care. It is a qualitative study using convergent care research (CCR) as its investigative approach. Data were obtained through workshops with users of psychotropic medication followed up by a primary care unit in southern Brazil. The first workshop prompted reflection and the elaboration of strategies for confronting the asylum-based model. The second discussed the importance of spaces for social interaction that strengthen affective bonds and act as a means of preventing mental health problems. The third discussed the restriction of freedom imposed by mental suffering. It was found that spaces devoted to mental health within primary care will contribute to the consolidation of practices and the construction of new knowledge for the production of health and life in the subjects' existential territory.

  12. Motivação nas atividades de reabilitação cardiovascular: uma pesquisa convergente-assistencial

    Directory of Open Access Journals (Sweden)

    Albertina Bonetti

    2010-06-01

    Full Text Available The difficulty that people with coronary artery disease have in engaging in physical activity led us to develop a study aimed at understanding the aspects that influence their motivation to take part in a programme of playful body practices. The methodological path was grounded in convergent care research. Nineteen people belonging to an interdisciplinary research group on dyslipidaemia took part. The programme ran for eight months, with data collected through participant observation and semi-structured interviews. Sessions were held three times a week, each playful encounter lasting one hour. To analyse the interviews and field diaries we used the ATLAS.ti software. Data analysis allowed the construction of three categories: experiencing pleasure, feeling included, and improvements in physical condition. The programme provided a diversity of body movements, setting it apart from the hegemonic practices of traditional programmes; a better understanding and perception of the body; better conditioning; better interaction and integration; and greater motivation for effective participation in the programme. These results led us to understand that differentiated approaches with a playful component favour participation in, and continuity of, physical activity.

  13. Vygotsky e múltiplas representações: leituras convergentes para o ensino de ciências

    Directory of Open Access Journals (Sweden)

    Carlos Eduardo Laburú

    2013-04-01

    Full Text Available http://dx.doi.org/10.5007/2175-7941.2013v30n1p7 This paper offers a reflection centred on Vygotsky's theme of language and thought, with the aim of showing that the author's views on the subject underlie the arguments supporting the multimodal and multiple representations framework. Under development for a little over a decade, the multimodal and multiple representations research programme has proved progressive, a conclusion warranted both by the international reach of its research and by the breadth with which it addresses questions in science and mathematics education. Because semiotics is the theory underpinning this research programme, while psychology underpins Vygotskian studies, the latter are almost always absent from the former's references. Nevertheless, the multimodal and multiple representations framework makes claims that are compatible with Vygotsky's position on the inseparable interdependence between language and thought. On the basis of this interdependence, the paper argues that the use of a variety of languages, in their most diverse representations, as advocated by the multimodal and multiple representations framework, is compatible and convergent with the Vygotskian reading of the subject.

  14. Processos não conscientes de produção de memórias falsas a partir do paradigma de associados convergentes

    OpenAIRE

    Rodrigues, Eduarda Pimentel

    2009-01-01

    Doctoral thesis in Psychology (field of Experimental Psychology and Cognitive Sciences). False memories, also referred to as memory distortions or illusions, correspond to the partially or totally altered recollection of past events (Roediger & McDermott, 2000). The production of false memories has been widely studied using an experimental procedure based on converging associates, better known as the DRM (Deese/Roediger/McDermott) paradigm...

  15. Evidência de validade convergente entre instrumentos de avaliação da consciência fonológica = Convergent validity evidence between instruments for assessment of phonological awareness = La validez convergente entre los instrumentos de evaluación de la conciencia fonológica

    Directory of Open Access Journals (Sweden)

    Suehiro, Adriana Cristina Boulhoça

    2015-01-01

    Full Text Available The present study sought evidence of convergent validity between instruments for the assessment of phonological awareness. Participants were 221 children of both sexes, aged between 6 and 12 years (M = 8.53; SD = 1.40), from the second to the fifth year of elementary education at a public school in the interior of São Paulo state. Participants individually completed the Roteiro de Avaliação da Consciência Fonológica (RACF) and the Prova de Consciência Fonológica por Produção Oral (PCFO). A positive, moderate correlation (r = 0.65) was identified between the instruments, indicating that the RACF can be used to assess the same construct. As a screening instrument, it can provide a quick, low-cost assessment for use at this stage of schooling.

  16. Análise Convergente do Conceito de Grupo Estratégico no Setor da Iluminação Espanhol = Converging Analysis of the Concept of Strategic Group in the Spanish Lighting Industry = Análisis Convergente del Concepto de Grupo Estratégico en el Sector de la Iluminación Español

    Directory of Open Access Journals (Sweden)

    TABOADA, Lorenzo Revuelto

    2008-12-01

    Full Text Available ABSTRACT The literature on strategic groups does not yet seem to offer a theoretical basis solid enough to ground the very existence of strategic groups and their effects on firms' conduct and performance. Empirical validation of the existence of strategic groups has been attempted by testing their predictive validity with respect to performance, but the results obtained have been contradictory. Other studies have instead tested the convergent validity of the construct, using multiple definitions and measures. Our work falls within this second line and seeks to advance the development of a theoretical model capable of defining the characteristics of industries that affect the probability of strategic groups existing with greater or lesser predictive value. Starting from a proposal of strategic group definitions with different levels of strength, we carried out a convergence analysis in the Spanish sector of manufacturers of lamps and lighting fixtures. The input of this convergent analysis consists of the strategic group definitions offered by the three main theoretical traditions in the study of the phenomenon: the strategic positioning approach, the resource- and capability-based approach, and cognitive psychology. The results indicate that the weak definition of strategic group is satisfied, which implies that there is no group identity strong enough to turn the group into a strategic reference point capable of affecting the conduct, and therefore the performance, of the firms that compose it.

  17. Evidências de validade convergente-discriminante para a Avaliação dos Tipos Profissionais de Holland (ATPH) = Evidencias de validez convergente-discriminante para la Evaluación de los Tipos Profesionales de Holland (ATPH) = Convergent-discriminant validity evidence for the assessment of Holland's professional types (ATPH)

    Directory of Open Access Journals (Sweden)

    Ana Paula Porto Noronha

    2013-01-01

    Full Text Available This study aimed to seek convergent-discriminant validity evidence for Holland's test Avaliação dos Tipos Profissionais (ATPH) by examining its relations with the Escala de Aconselhamento Profissional (EAP). The sample comprised 42 primary and secondary school students from public and private schools. Ages ranged from 11 to 26 years; 43% of the students were male, 50% female, and 7% did not report their gender. The results suggested significant correlations between the EAP dimensions and the ATPH typologies, with magnitudes ranging from low to moderate. From the data collected, it was concluded that the ATPH can help identify students' interests in career guidance processes and that the validity evidence was favourable.

  18. Retórica clásica y redes on line: dos realidades convergentes y análogas. Perspectivas y prospectivas de 9 expertos en Comunicación

    Directory of Open Access Journals (Sweden)

    Inma Berlanga

    2013-01-01

    Full Text Available This paper seeks to demonstrate the relationship between classical rhetoric and online social networks as convergent and analogous realities, as well as the synergies established between them in the pursuit of more persuasive and effective communication. To this end it uses the qualitative technique of in-depth interviews with experts on the subject, who endorse our hypothesis about these mutual transfers. At a moment of communicative effervescence, driven by the boom in online networks, rediscovering the original rhetorical principles can be highly pertinent in order to restore to communication its most human and creative facet.

  19. Los modelos de conservación biológica divergente y convergente: Una mirada desde las perspectivas de la ecología del paisaje y la teoría de metapoblaciones

    Directory of Open Access Journals (Sweden)

    Cristian Kraker-Castañeda

    2015-11-01

    Full Text Available The loss of biodiversity in agricultural landscapes is a matter of worldwide concern and a central topic of much contemporary research. The phenomenon can be approached from two main perspectives: conservation biology and agroecology. The first emphasises the importance of preserving natural ecosystems, since other land uses are considered to have lesser legitimacy. For the second, interest is directed at agroecosystems, and biodiversity is relevant only if it is connected with their sustainability. The reality is that some agroecosystems harbour species richness of the same order of magnitude as protected areas, and that the loss of some species, apparently without value for the production of the system, can trigger cascade effects if those species are key nodes in food webs. The conceptual models of divergent and convergent biological conservation provide arguments about what the relationship between agriculture and biodiversity ought to be; however, owing to their ideological load they usually run into problems of contextualisation. Here we draw on elements from landscape ecology and metapopulation theory, supported by empirical data, to rethink this debate, with implications for conservation strategies in the region.

  20. Theoretical and experimental study of the forces induced by straight, convergent, divergent and mixed labyrinth seal systems on steam turbine, gas turbine and compressor rotors; Estudio teorico-experimental de las fuerzas inducidas por los sistemas de sellos de laberinto rectos, convergentes, divergentes y mixtos sobre los rotores de turbinas de vapor, turbinas de gas y compresores

    Energy Technology Data Exchange (ETDEWEB)

    Salazar San Andres, Octavio Ramon

    1991-12-31

    Theoretical and experimental research was conducted in order to determine the labyrinth seal forces, as well as the stiffness and damping coefficients, for straight, convergent, divergent and combined seal shapes on turbine and compressor rotors. The mathematical model is deduced on the basis of the single-volume method and its solution is obtained by a perturbation procedure. Validation is achieved against published results and against experimental work carried out on a test bench described in the text; this involved labyrinth seals with straight, convergent and divergent profiles, as the published information relating to the mixed type is sufficient to perform the evaluation. The conclusions demonstrate that the model is able to predict and determine the performance of labyrinth seals, in terms of forces and rotordynamic coefficients, for static and dynamic motions. Finally, tests on real 300 MW steam turbines are recommended; in this case the high pressures and the use of wheels with strips on the periphery, supported by the upper part of the blades, increase the susceptibility to self-excited subsynchronous vibrations.

  1. Desarrollo convergente municipal entre estados contiguos a Nayarit y Sinaloa

    Directory of Open Access Journals (Sweden)

    Eduardo Meza-Ramos

    2010-01-01

    Full Text Available Analysis of economic growth reveals sectoral disparities that appear within countries: between urban and rural areas, and between prosperous and lagging regions. In Mexico, the policy of trade liberalisation has not been reflected in a generalised way in the wealth of the population. The convergence hypothesis was evaluated at the municipal level for the states of Chihuahua, Durango, Jalisco, Nayarit, Sinaloa, Sonora and Zacatecas. From the data considered in the study, the existence of both σ and β convergence can be argued: on average, the standard deviation was 1.73 in 1989 and fell to 1.31 in 2006, and β convergence shows a negative relationship whose test statistic exceeds 2 in absolute value, at 95% confidence. It should be noted that social and sectoral policies are in place, but public policies that promote regional development are lacking.
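
    As a hedged sketch of how the two convergence measures mentioned above are usually computed (synthetic municipal incomes, not the study's data), σ-convergence tracks the dispersion of log income over time, while β-convergence regresses growth on the initial level and looks for a negative slope:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 200                                # hypothetical number of municipalities
    y_start = rng.normal(9.0, 1.73, n)     # log per-capita income, initial year
    # A converging process: poorer units grow faster (negative beta), plus noise.
    growth = -0.03 * (y_start - y_start.mean()) + rng.normal(0, 0.2, n)
    y_end = y_start + growth

    # Sigma-convergence: the cross-sectional dispersion should fall over time.
    sigma_start, sigma_end = y_start.std(ddof=1), y_end.std(ddof=1)

    # Beta-convergence: OLS of growth on initial income; a negative slope indicates convergence.
    X = np.column_stack([np.ones(n), y_start])
    coef, *_ = np.linalg.lstsq(X, growth, rcond=None)

    print(f"sigma start = {sigma_start:.2f}, sigma end = {sigma_end:.2f}")
    print(f"estimated beta = {coef[1]:.3f}")
    ```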

  2. Bibliotecarios universitarios – Profesores. ¿Caminos convergentes?

    Directory of Open Access Journals (Sweden)

    Amante, María João

    2012-06-01

    Full Text Available The changes in higher education brought about by the Bologna process not only affect teaching, but also units within the institution that play a critical role in assuring compliance of the teaching-learning process with the requirements. Among them is the university library, whose officers occupy a key position that requires them to rethink their professional responsibilities and to adopt a methodology for facilitating the acquisition of the competencies and skills demanded by today’s society. A first step for defining the new role of university librarians consists of analyzing the faculty’s perceptions of them, in order to plan for the necessary collaboration between the two collectives.

  3. Laser Beam Focus Analyser

    DEFF Research Database (Denmark)

    Nielsen, Peter Carøe; Hansen, Hans Nørgaard; Olsen, Flemming Ove

    2007-01-01

    The quantitative and qualitative description of laser beam characteristics is important for process implementation and optimisation. In particular, a need for quantitative characterisation of beam diameter was identified when using fibre lasers for micro manufacturing. Here the beam diameter limits the obtainable features in direct laser machining as well as heat-affected zones in welding processes. This paper describes the development of a measuring unit capable of analysing the beam shape and diameter of lasers to be used in manufacturing processes. The analyser is based on the principle of a rotating mechanical wire being swept through the laser beam at varying Z-heights. The reflected signal is analysed and the resulting beam profile determined. The development comprised the design of a flexible fixture capable of providing both rotation and Z-axis movement, control software including data capture...
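
    A wire swept through the beam yields a one-dimensional intensity profile from which the diameter can be extracted. The sketch below is only an illustration under that assumption, not the unit's actual control software: it fits a Gaussian to a synthetic reflected-signal profile and reports the 1/e² diameter.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def gaussian(x, amplitude, centre, w):
        """Gaussian beam profile; w is the 1/e^2 radius."""
        return amplitude * np.exp(-2.0 * (x - centre) ** 2 / w ** 2)

    # Synthetic reflected signal versus wire position (mm), for illustration.
    x = np.linspace(-1.0, 1.0, 200)
    signal = gaussian(x, 1.0, 0.05, 0.30)
    signal += np.random.default_rng(2).normal(0, 0.02, x.size)

    popt, _ = curve_fit(gaussian, x, signal, p0=(1.0, 0.0, 0.2))
    print(f"1/e^2 beam diameter ≈ {2 * abs(popt[2]):.3f} mm")
    ```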

  4. Contesting Citizenship: Comparative Analyses

    DEFF Research Database (Denmark)

    Siim, Birte; Squires, Judith

    2007-01-01

    ...importance of particularized experiences and multiple inequality agendas). These developments shape the way citizenship is both practiced and analysed. Mapping neat citizenship models onto distinct nation-states and evaluating these in relation to formal equality is no longer an adequate approach. Comparative citizenship analyses need to be considered in relation to multiple inequalities and their intersections and to multiple governance and trans-national organising. This, in turn, suggests that comparative citizenship analysis needs to consider new spaces in which struggles for equal citizenship occur...

  5. Risico-analyse brandstofpontons

    NARCIS (Netherlands)

    Uijt de Haag P; Post J; LSO

    2001-01-01

    To determine the risks posed by fuel pontoons in a marina, a generic risk analysis was carried out. A reference system was defined, consisting of a concrete fuel pontoon with a relatively large capacity and throughput. The pontoon is assumed to be located in a...

  6. Fast multichannel analyser

    Energy Technology Data Exchange (ETDEWEB)

    Berry, A; Przybylski, M M; Sumner, I [Science Research Council, Daresbury (UK). Daresbury Lab.

    1982-10-01

    A fast multichannel analyser (MCA) capable of sampling at a rate of 10⁷ s⁻¹ has been developed. The instrument is based on an 8 bit parallel encoding analogue to digital converter (ADC) reading into a fast histogramming random access memory (RAM) system, giving 256 channels of 64 k count capacity. The prototype unit is in CAMAC format.

  7. A fast multichannel analyser

    International Nuclear Information System (INIS)

    Berry, A.; Przybylski, M.M.; Sumner, I.

    1982-01-01

    A fast multichannel analyser (MCA) capable of sampling at a rate of 10⁷ s⁻¹ has been developed. The instrument is based on an 8 bit parallel encoding analogue to digital converter (ADC) reading into a fast histogramming random access memory (RAM) system, giving 256 channels of 64 k count capacity. The prototype unit is in CAMAC format. (orig.)
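
    Functionally, a multichannel analyser is a hardware histogrammer: each ADC sample selects a memory channel whose count is incremented. A minimal software analogue of the 256-channel, 64 k-capacity scheme described above (purely illustrative, not the instrument's firmware):

    ```python
    import numpy as np

    def histogram_samples(adc_samples, n_channels=256, capacity=64 * 1024):
        """Accumulate 8-bit ADC samples into channel counts, clipping at the counter capacity."""
        counts = np.zeros(n_channels, dtype=np.int64)
        for sample in adc_samples:
            channel = int(sample) & 0xFF      # 8-bit parallel-encoded value
            if counts[channel] < capacity:
                counts[channel] += 1          # at most 64 k counts per channel
        return counts

    # Example: a noisy spectral line centred on channel 120.
    rng = np.random.default_rng(3)
    samples = np.clip(rng.normal(120, 5, size=100_000).round(), 0, 255)
    spectrum = histogram_samples(samples)
    print("peak channel:", spectrum.argmax())
    ```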

  8. Possible future HERA analyses

    International Nuclear Information System (INIS)

    Geiser, Achim

    2015-12-01

    A variety of possible future analyses of HERA data in the context of the HERA data preservation programme is collected, motivated, and commented. The focus is placed on possible future analyses of the existing ep collider data and their physics scope. Comparisons to the original scope of the HERA programme are made, and cross references to topics also covered by other participants of the workshop are given. This includes topics on QCD, proton structure, diffraction, jets, hadronic final states, heavy flavours, electroweak physics, and the application of related theory and phenomenology topics like NNLO QCD calculations, low-x related models, nonperturbative QCD aspects, and electroweak radiative corrections. Synergies with other collider programmes are also addressed. In summary, the range of physics topics which can still be uniquely covered using the existing data is very broad and of considerable physics interest, often matching the interest of results from colliders currently in operation. Due to well-established data and MC sets, calibrations, and analysis procedures the manpower and expertise needed for a particular analysis is often very much smaller than that needed for an ongoing experiment. Since centrally funded manpower to carry out such analyses is not available any longer, this contribution not only targets experienced self-funded experimentalists, but also theorists and master-level students who might wish to carry out such an analysis.

  9. Biomass feedstock analyses

    Energy Technology Data Exchange (ETDEWEB)

    Wilen, C.; Moilanen, A.; Kurkela, E. [VTT Energy, Espoo (Finland). Energy Production Technologies

    1996-12-31

    The overall objectives of the project 'Feasibility of electricity production from biomass by pressurized gasification systems' within the EC Research Programme JOULE II were to evaluate the potential of advanced power production systems based on biomass gasification and to study the technical and economic feasibility of these new processes with different types of biomass feedstocks. This report was prepared as part of this R and D project. The objectives of this task were to perform fuel analyses of potential woody and herbaceous biomasses with specific regard to the gasification properties of the selected feedstocks. The analyses of 15 Scandinavian and European biomass feedstocks included density, proximate and ultimate analyses, trace compounds, ash composition and fusion behaviour in oxidizing and reducing atmospheres. The wood-derived fuels, such as whole-tree chips, forest residues, bark and to some extent willow, can be expected to have good gasification properties. Difficulties caused by ash fusion and sintering in straw combustion and gasification are generally known. The ash and alkali metal contents of the European biomasses harvested in Italy resembled those of the Nordic straws, and they are expected to behave to a great extent like straw in gasification. No direct relation between the ash fusion behaviour (determined according to the standard method) and, for instance, the alkali metal content was found in the laboratory determinations. A more profound characterisation of the fuels would require gasification experiments in a thermobalance and a PDU (Process Development Unit) rig. (orig.) (10 refs.)

  10. AMS analyses at ANSTO

    Energy Technology Data Exchange (ETDEWEB)

    Lawson, E.M. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia). Physics Division

    1998-03-01

    The major use of ANTARES is Accelerator Mass Spectrometry (AMS) with ¹⁴C being the most commonly analysed radioisotope - presently about 35 % of the available beam time on ANTARES is used for ¹⁴C measurements. The accelerator measurements are supported by, and dependent on, a strong sample preparation section. The ANTARES AMS facility supports a wide range of investigations into fields such as global climate change, ice cores, oceanography, dendrochronology, anthropology, and classical and Australian archaeology. Described here are some examples of the ways in which AMS has been applied to support research into the archaeology, prehistory and culture of this continent's indigenous Aboriginal peoples. (author)

  11. AMS analyses at ANSTO

    International Nuclear Information System (INIS)

    Lawson, E.M.

    1998-01-01

    The major use of ANTARES is Accelerator Mass Spectrometry (AMS) with ¹⁴C being the most commonly analysed radioisotope - presently about 35 % of the available beam time on ANTARES is used for ¹⁴C measurements. The accelerator measurements are supported by, and dependent on, a strong sample preparation section. The ANTARES AMS facility supports a wide range of investigations into fields such as global climate change, ice cores, oceanography, dendrochronology, anthropology, and classical and Australian archaeology. Described here are some examples of the ways in which AMS has been applied to support research into the archaeology, prehistory and culture of this continent's indigenous Aboriginal peoples. (author)

  12. Analyses of MHD instabilities

    International Nuclear Information System (INIS)

    Takeda, Tatsuoki

    1985-01-01

    In this article, analyses of the MHD stabilities which govern the global behavior of a fusion plasma are described from the viewpoint of numerical computation. First, we describe the high-accuracy calculation of the MHD equilibrium and then the analysis of the linear MHD instability. The former is the basis of the stability analysis, and the latter is closely related to the limiting beta value, which is a very important theoretical issue of tokamak research. To attain a stable tokamak plasma with good confinement properties it is necessary to control or suppress disruptive instabilities. We next describe the nonlinear MHD instabilities which relate to the disruption phenomena. Lastly, we describe vectorization of the MHD codes. The above MHD codes for fusion plasma analyses are relatively simple though very time-consuming, and the parts of the codes which need a lot of CPU time are concentrated in a small portion of the codes; moreover, the codes are usually used by their developers themselves, which makes it comparatively easy to attain a high performance ratio on the vector processor. (author)

  13. Uncertainty Analyses and Strategy

    International Nuclear Information System (INIS)

    Kevin Coppersmith

    2001-01-01

    The DOE identified a variety of uncertainties, arising from different sources, during its assessment of the performance of a potential geologic repository at the Yucca Mountain site. In general, the number and detail of process models developed for the Yucca Mountain site, and the complex coupling among those models, make the direct incorporation of all uncertainties difficult. The DOE has addressed these issues in a number of ways using an approach to uncertainties that is focused on producing a defensible evaluation of the performance of a potential repository. The treatment of uncertainties oriented toward defensible assessments has led to analyses and models with so-called 'conservative' assumptions and parameter bounds, where conservative implies lower performance than might be demonstrated with a more realistic representation. The varying maturity of the analyses and models, and uneven level of data availability, result in total system level analyses with a mix of realistic and conservative estimates (for both probabilistic representations and single values). That is, some inputs have realistically represented uncertainties, and others are conservatively estimated or bounded. However, this approach is consistent with the 'reasonable assurance' approach to compliance demonstration, which was called for in the U.S. Nuclear Regulatory Commission's (NRC) proposed 10 CFR Part 63 regulation (64 FR 8640 [DIRS 101680]). A risk analysis that includes conservatism in the inputs will result in conservative risk estimates. Therefore, the approach taken for the Total System Performance Assessment for the Site Recommendation (TSPA-SR) provides a reasonable representation of processes and conservatism for purposes of site recommendation. However, mixing unknown degrees of conservatism in models and parameter representations reduces the transparency of the analysis and makes the development of coherent and consistent probability statements about projected repository

  14. A simple beam analyser

    International Nuclear Information System (INIS)

    Lemarchand, G.

    1977-01-01

    (ee'p) experiments allow measurement of the missing-energy distribution as well as the momentum distribution of the proton extracted from the nucleus versus the missing energy. Such experiments are presently conducted on SACLAY's A.L.S. 300 Linac. Electrons and protons are analysed by two spectrometers and detected in their focal planes. Counting rates are usually low and include both true time coincidences and accidentals. Since the signal-to-noise ratio depends on the physics of the experiment and on the resolution of the coincidence, it is mandatory to obtain a beam current distribution that is as flat as possible. New technologies have made it possible to monitor the behavior of the beam pulse in real time and to determine, on a numerical basis, when the duty cycle can be considered good.

  15. EEG analyses with SOBI.

    Energy Technology Data Exchange (ETDEWEB)

    Glickman, Matthew R.; Tang, Akaysha (University of New Mexico, Albuquerque, NM)

    2009-02-01

    The motivating vision behind Sandia's MENTOR/PAL LDRD project has been that of systems which use real-time psychophysiological data to support and enhance human performance, both individually and of groups. Relevant and significant psychophysiological data being a necessary prerequisite to such systems, this LDRD has focused on identifying and refining such signals. The project has focused in particular on EEG (electroencephalogram) data as a promising candidate signal because it (potentially) provides a broad window on brain activity with relatively low cost and logistical constraints. We report here on two analyses performed on EEG data collected in this project using the SOBI (Second Order Blind Identification) algorithm to identify two independent sources of brain activity: one in the frontal lobe and one in the occipital. The first study looks at directional influences between the two components, while the second study looks at inferring gender based upon the frontal component.
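
    SOBI separates sources by jointly diagonalising several time-lagged covariance matrices of the recordings. The sketch below implements the simpler single-lag relative of SOBI (often called AMUSE) on synthetic two-channel data; it illustrates the idea only and is not the analysis code used in the project.

    ```python
    import numpy as np

    def amuse(x, lag=1):
        """Single-lag blind source separation (AMUSE), a simplified relative of SOBI.

        x: array of shape (n_channels, n_samples), e.g. multichannel EEG.
        Returns the unmixing matrix W and the estimated sources W @ x.
        """
        x = x - x.mean(axis=1, keepdims=True)
        # Whitening transform from the zero-lag covariance.
        evals, evecs = np.linalg.eigh(np.cov(x))
        whitener = evecs @ np.diag(1.0 / np.sqrt(evals)) @ evecs.T
        z = whitener @ x
        # Symmetrised time-lagged covariance of the whitened data.
        c_lag = z[:, :-lag] @ z[:, lag:].T / (z.shape[1] - lag)
        c_lag = (c_lag + c_lag.T) / 2.0
        # Its eigenvectors give the rotation that separates the sources.
        _, rot = np.linalg.eigh(c_lag)
        w = rot.T @ whitener
        return w, w @ x

    # Toy example: two sources with different temporal structure, mixed into two channels.
    rng = np.random.default_rng(4)
    t = np.arange(5000)
    sources = np.vstack([np.sin(0.05 * t), np.sign(np.sin(0.013 * t))])
    sources = sources + 0.05 * rng.normal(size=sources.shape)
    mixed = np.array([[1.0, 0.6], [0.4, 1.0]]) @ sources
    W, estimated_sources = amuse(mixed, lag=2)
    ```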

  16. Pathway-based analyses.

    Science.gov (United States)

    Kent, Jack W

    2016-02-03

    New technologies for acquisition of genomic data, while offering unprecedented opportunities for genetic discovery, also impose severe burdens of interpretation and penalties for multiple testing. The Pathway-based Analyses Group of the Genetic Analysis Workshop 19 (GAW19) sought reduction of multiple-testing burden through various approaches to aggregation of high-dimensional data in pathways informed by prior biological knowledge. Experimental methods tested included the use of "synthetic pathways" (random sets of genes) to estimate power and false-positive error rate of methods applied to simulated data; data reduction via independent components analysis, single-nucleotide polymorphism (SNP)-SNP interaction, and use of gene sets to estimate genetic similarity; and general assessment of the efficacy of prior biological knowledge to reduce the dimensionality of complex genomic data. The work of this group explored several promising approaches to managing high-dimensional data, with the caveat that these methods are necessarily constrained by the quality of external bioinformatic annotation.

  17. Analysing Access Control Specifications

    DEFF Research Database (Denmark)

    Probst, Christian W.; Hansen, René Rydhof

    2009-01-01

    When prosecuting crimes, the main question to answer is often who had a motive and the possibility to commit the crime. When investigating cyber crimes, the question of possibility is often hard to answer, as in a networked system almost any location can be accessed from almost anywhere. The most common tool to answer this question, analysis of log files, faces the problem that the amount of logged data may be overwhelming. This problem gets even worse in the case of insider attacks, where the attacker's actions usually will be logged as permissible, standard actions, if they are logged at all. Recent events have revealed intimate knowledge of surveillance and control systems on the side of the attacker, making it often impossible to deduce the identity of an inside attacker from logged data. In this work we present an approach that analyses the access control configuration to identify the set...

  18. Network class superposition analyses.

    Directory of Open Access Journals (Sweden)

    Carl A B Pearson

    Full Text Available Networks are often used to understand a whole system by modeling the interactions among its pieces. Examples include biomolecules in a cell interacting to provide some primary function, or species in an environment forming a stable community. However, these interactions are often unknown; instead, the pieces' dynamic states are known, and network structure must be inferred. Because observed function may be explained by many different networks (e.g., ≈ 10^30 for the yeast cell cycle process), considering dynamics beyond this primary function means picking a single network or suitable sample: measuring over all networks exhibiting the primary function is computationally infeasible. We circumvent that obstacle by calculating the network class ensemble. We represent the ensemble by a stochastic matrix T, which is a transition-by-transition superposition of the system dynamics for each member of the class. We present concrete results for T derived from Boolean time series dynamics on networks obeying the Strong Inhibition rule, by applying T to several traditional questions about network dynamics. We show that the distribution of the number of point attractors can be accurately estimated with T. We show how to generate Derrida plots based on T. We show that T-based Shannon entropy outperforms other methods at selecting experiments to further narrow the network structure. We also outline an experimental test of predictions based on T. We motivate all of these results in terms of a popular molecular biology Boolean network model for the yeast cell cycle, but the methods and analyses we introduce are general. We conclude with open questions for T, for example, application to other models, computational considerations when scaling up to larger systems, and other potential analyses.
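
    The ensemble matrix T described above is a stochastic (row-normalised transition) matrix, so ensemble-level quantities such as the Shannon entropy of each state's outgoing transitions are easy to compute. A small hedged sketch with a toy matrix (not one derived from the yeast cell-cycle network class):

    ```python
    import numpy as np

    def transition_entropy(T):
        """Shannon entropy (bits) of each row's outgoing transition distribution."""
        T = np.asarray(T, dtype=float)
        safe = np.where(T > 0, T, 1.0)   # avoid log(0); those entries contribute 0
        return -(T * np.log2(safe)).sum(axis=1)

    # Toy 4-state ensemble: superposing several member dynamics gives rows that are
    # mixtures rather than single 1s, reflecting uncertainty over the network class.
    T = np.array([
        [0.5, 0.5, 0.0, 0.0],
        [0.0, 0.0, 1.0, 0.0],
        [0.0, 0.5, 0.0, 0.5],
        [1.0, 0.0, 0.0, 0.0],
    ])
    print(transition_entropy(T))  # high-entropy rows mark the most informative states
    ```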

  19. Seismic fragility analyses

    International Nuclear Information System (INIS)

    Kostov, Marin

    2000-01-01

    In the last two decades an increasing number of probabilistic seismic risk assessments have been performed. The basic ideas of the procedure for performing a Probabilistic Safety Analysis (PSA) of critical structures (NUREG/CR-2300, 1983) can also be used for ordinary industrial and residential buildings, dams or other structures. The general formulation of the risk assessment procedure applied in this investigation is presented in Franzini et al., 1984. The probability of failure of a structure over an expected lifetime (for example 50 years) can be obtained from the annual frequency of failure β_E, determined by the relation β_E = ∫ [dβ(x)/dx] · P(f|x) dx, where β(x) is the annual frequency of exceedance of load level x (for example, x may be peak ground acceleration) and P(f|x) is the conditional probability of structure failure at a given seismic load level x. The problem thus reduces to the assessment of the seismic hazard β(x) and the fragility P(f|x). The seismic hazard curves are obtained by probabilistic seismic hazard analysis. The fragility curves are obtained after the response of the structure is defined probabilistically and its capacity and the associated uncertainties are assessed. Finally, the fragility curves are combined with the seismic loading to estimate the frequency of failure for each critical scenario. The frequency of failure due to a seismic event is represented by the scenario with the highest frequency. The tools usually applied for probabilistic safety analyses of critical structures can relatively easily be adapted to ordinary structures. The key problems are the seismic hazard definition and the fragility analyses. The fragility can be derived either on the basis of scaling procedures or by generation. Both approaches are presented in the paper. After the seismic risk (in terms of failure probability) is assessed, there are several approaches for risk reduction. Generally the methods can be classified in two groups. The
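
    The annual failure frequency defined by the integral above can be approximated numerically once the hazard curve β(x) and the fragility curve P(f|x) are available. The following sketch uses illustrative, made-up curves (a power-law hazard and a lognormal fragility), not data from the analyses described:

    ```python
    import numpy as np
    from scipy import stats
    from scipy.integrate import trapezoid

    # Peak ground acceleration grid (g); purely hypothetical range.
    x = np.linspace(0.01, 2.0, 500)

    # Illustrative hazard curve: annual frequency of exceedance of level x.
    hazard = 1e-3 * (x / 0.1) ** -2.0

    # Illustrative fragility: lognormal CDF with median capacity 0.6 g, beta = 0.4.
    fragility = stats.lognorm(s=0.4, scale=0.6).cdf(x)

    # beta_E = integral of |d beta(x)/dx| * P(f|x) dx, via the trapezoidal rule.
    d_hazard_dx = np.gradient(hazard, x)
    beta_E = trapezoid(-d_hazard_dx * fragility, x)
    print(f"annual frequency of failure ≈ {beta_E:.2e}")
    ```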

  20. Website-analyse

    DEFF Research Database (Denmark)

    Thorlacius, Lisbeth

    2009-01-01

    The website is increasingly the preferred medium for information searching, company presentation, e-commerce, entertainment, teaching and social contact. In step with this growing diversity of communication activities on the web, more focus has been placed on optimising the design and planning of the functional and content-related aspects of websites. There is a large body of theory and method books specialising in the technical issues of interaction and navigation, as well as in the linguistic content of websites; the Danish HCI (Human Computer Interaction) ... or dead ends when he or she visits the site. Studies in the design and analysis of the visual and aesthetic aspects of planning and using websites have, however, only to a limited extent been treated reflectively. That is the background for this chapter, which opens with a review of aesthetics ...

  1. A channel profile analyser

    International Nuclear Information System (INIS)

    Gobbur, S.G.

    1983-01-01

    It is well understood that, owing to the wide-band noise present in a nuclear analog-to-digital converter, events at the boundaries of adjacent channels are shared. Finding the exact shape of the channels at the boundaries is a difficult and laborious process. A simple scheme has been developed for the direct display of the channel shape of any type of ADC on a cathode-ray oscilloscope. This is accomplished by sequentially incrementing the reference voltage of a precision pulse generator by a fraction of a channel and storing the ADC data in alternate memory locations of a multichannel pulse-height analyser. Alternate channels are needed because of the sharing at the channel boundaries. In the flat region of the profile, alternate memory locations are channels with zero counts and channels with full-scale counts; at the boundaries, all memory locations record counts. The shape of this distribution is a direct display of the channel boundaries. (orig.)
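
    The measurement idea can be reconstructed as a small simulation (hypothetical parameters, not the original instrument software): the pulser reference is stepped by a fraction of a channel across a boundary, each pulse is digitised with additive wide-band noise, and the fraction of events falling in the upper channel traces out the channel-edge profile.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    noise_sigma = 0.08        # wide-band ADC noise, in channel widths (assumed)
    pulses_per_step = 2000
    boundary = 12.0           # nominal boundary between channels 11 and 12

    # Step the pulser reference in fractions of a channel across the boundary.
    levels = np.arange(boundary - 0.5, boundary + 0.5, 0.05)
    profile = []
    for level in levels:
        # Each pulse is digitised; noise shares events between adjacent channels.
        channels = np.floor(level + rng.normal(0.0, noise_sigma, pulses_per_step))
        profile.append(np.mean(channels >= boundary))  # fraction in the upper channel

    # The profile is ~0 well below the boundary, ~1 well above, with a transition
    # whose width is a direct display of the channel-edge sharpness.
    for level, fraction in zip(levels, profile):
        print(f"{level:6.2f}  {fraction:.3f}")
    ```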

  2. NOAA's National Snow Analyses

    Science.gov (United States)

    Carroll, T. R.; Cline, D. W.; Olheiser, C. M.; Rost, A. A.; Nilsson, A. O.; Fall, G. M.; Li, L.; Bovitz, C. T.

    2005-12-01

    NOAA's National Operational Hydrologic Remote Sensing Center (NOHRSC) routinely ingests all of the electronically available, real-time, ground-based snow data; airborne snow water equivalent data; satellite areal extent of snow cover information; and numerical weather prediction (NWP) model forcings for the coterminous U.S. The NWP model forcings are physically downscaled from their native 13 km2 spatial resolution to a 1 km2 resolution for the CONUS. The downscaled NWP forcings drive an energy-and-mass-balance snow accumulation and ablation model at a 1 km2 spatial resolution and at a 1 hour temporal resolution for the country. The ground-based, airborne, and satellite snow observations are assimilated into the snow model's simulated state variables using a Newtonian nudging technique. The principal advantages of the assimilation technique are: (1) approximate balance is maintained in the snow model, (2) physical processes are easily accommodated in the model, and (3) asynoptic data are incorporated at the appropriate times. The snow model is reinitialized with the assimilated snow observations to generate a variety of snow products that combine to form NOAA's NOHRSC National Snow Analyses (NSA). The NOHRSC NSA incorporate all of the information necessary and available to produce a "best estimate" of real-time snow cover conditions at 1 km2 spatial resolution and 1 hour temporal resolution for the country. The NOHRSC NSA consist of a variety of daily operational products that characterize real-time snowpack conditions including: snow water equivalent, snow depth, surface and internal snowpack temperatures, surface and blowing snow sublimation, and snowmelt for the CONUS. The products are generated and distributed in a variety of formats including: interactive maps, time-series, alphanumeric products (e.g., mean areal snow water equivalent on a hydrologic basin-by-basin basis), text and map discussions, map animations, and quantitative gridded products.
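
    Newtonian nudging relaxes the simulated state toward observations with a strength set by a gain and, in practice, by spatial and temporal influence weights. A minimal single-variable sketch of that update, with made-up numbers rather than NOHRSC's operational configuration:

    ```python
    import numpy as np

    def nudge(model_state, observation, gain):
        """Relax the simulated state toward the observed value; gain in [0, 1]."""
        return model_state + gain * (observation - model_state)

    # Hypothetical snow water equivalent (mm) on a tiny grid around one station.
    swe_model = np.array([[120.0, 118.0],
                          [115.0, 110.0]])
    swe_observed = 104.0
    # Influence weights decaying with distance from the station (illustrative values).
    weights = np.array([[1.0, 0.6],
                        [0.6, 0.3]])

    swe_analysis = nudge(swe_model, swe_observed, gain=0.5 * weights)
    print(swe_analysis)
    ```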

  3. Sample preparation in foodomic analyses.

    Science.gov (United States)

    Martinović, Tamara; Šrajer Gajdošik, Martina; Josić, Djuro

    2018-04-16

    Representative sampling and adequate sample preparation are key factors for the successful performance of subsequent steps in foodomic analyses, as well as for correct data interpretation. Incorrect sampling and improper sample preparation can be sources of severe bias in foodomic analyses, and it is well known that neither can be corrected afterwards. These facts, frequently neglected in the past, are now being taken into consideration, and the progress in sampling and sample preparation in foodomics is reviewed here. We report the use of highly sophisticated instruments for both high-performance and high-throughput analyses, as well as miniaturization and the use of laboratory robotics in metabolomics, proteomics, peptidomics and genomics. This article is protected by copyright. All rights reserved.

  4. Descriptive Analyses of Mechanical Systems

    DEFF Research Database (Denmark)

    Andreasen, Mogens Myrup; Hansen, Claus Thorp

    2003-01-01

    Foreword: Product analysis and technology analysis can be carried out with a broad socio-technical aim, in order to understand cultural, sociological, design-related, business-related and many other aspects. A sub-area within this is the systemic analysis and description of products and systems. The present compendium...

  5. Analysing and Comparing Encodability Criteria

    Directory of Open Access Journals (Sweden)

    Kirstin Peters

    2015-08-01

    Full Text Available Encodings or the proof of their absence are the main way to compare process calculi. To analyse the quality of encodings and to rule out trivial or meaningless encodings, they are augmented with quality criteria. There exist a number of different criteria and different variants of criteria in order to reason in different settings. This leads to incomparable results. Moreover, it is not always clear whether the criteria used to obtain a result in a particular setting do indeed fit that setting. We show how to formally reason about and compare encodability criteria by mapping them onto requirements on a relation between source and target terms that is induced by the encoding function. In particular we analyse the common criteria full abstraction, operational correspondence, divergence reflection, success sensitiveness, and respect of barbs; e.g. we analyse the exact nature of the simulation relation (coupled simulation versus bisimulation) that is induced by different variants of operational correspondence. In this way we reduce the problem of analysing or comparing encodability criteria to the better understood problem of comparing relations on processes.

  6. Analysing Children's Drawings: Applied Imagination

    Science.gov (United States)

    Bland, Derek

    2012-01-01

    This article centres on a research project in which freehand drawings provided a richly creative and colourful data source of children's imagined, ideal learning environments. Issues concerning the analysis of the visual data are discussed, in particular, how imaginative content was analysed and how the analytical process was dependent on an…

  7. Impact analyses after pipe rupture

    International Nuclear Information System (INIS)

    Chun, R.C.; Chuang, T.Y.

    1983-01-01

    Two of the French pipe whip experiments are reproduced with the computer code WIPS. The WIPS results are in good agreement with the experimental data and with the French computer code TEDEL. This justifies the use of its pipe element in conjunction with its U-bar element in a simplified method of impact analyses.

  8. Millifluidic droplet analyser for microbiology

    NARCIS (Netherlands)

    Baraban, L.; Bertholle, F.; Salverda, M.L.M.; Bremond, N.; Panizza, P.; Baudry, J.; Visser, de J.A.G.M.; Bibette, J.

    2011-01-01

    We present a novel millifluidic droplet analyser (MDA) for precisely monitoring the dynamics of microbial populations over multiple generations in numerous (≈10³) aqueous emulsion droplets (100 nL). As a first application, we measure the growth rate of a bacterial strain and determine the minimal

  9. Analyser of sweeping electron beam

    International Nuclear Information System (INIS)

    Strasser, A.

    1993-01-01

    The electron beam analyser has an array of conductors that can be positioned in the field of the sweeping beam, an electronic signal treatment system for the analysis of the signals generated in the conductors by the incident electrons, and a display for the different characteristics of the electron beam.

  10. Workload analyse of assembling process

    Science.gov (United States)

    Ghenghea, L. D.

    2015-11-01

    Workload is the most important indicator for managers responsible for industrial technological processes, whether these are automated, mechanized or simply manual; in each case, machines or workers will be the focus of the workload measurements. The paper deals with workload analyses of a largely manual assembling technology for a roller bearing assembly process carried out in a large company with integrated bearing manufacturing processes. In these analyses the work (delay) sampling technique has been used to identify and classify all of the bearing assemblers' activities, and to obtain information about the share of the 480-minute working day that workers devote to each activity. The study shows some ways to increase process productivity without supplementary investment, and also indicates that process automation could be the way to reach maximum productivity.
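    Work (delay) sampling reduces to counting random-instant observations per activity and scaling by the length of the working day. The sketch below is an illustrative calculation with invented observation counts, not data from the study.

```python
from collections import Counter

WORK_DAY_MIN = 480  # length of the observed working day, as in the study

# Hypothetical random-instant observations of one assembler (invented data):
observations = Counter({
    "assembling": 210,
    "handling parts": 90,
    "inspection": 60,
    "waiting / delays": 40,
})

total = sum(observations.values())
for activity, n in observations.items():
    share = n / total                     # fraction of observations
    minutes = share * WORK_DAY_MIN        # estimated time per working day
    print(f"{activity:18s} {share:6.1%}  ≈ {minutes:5.1f} min per day")
```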

  11. Mitogenomic analyses from ancient DNA

    DEFF Research Database (Denmark)

    Paijmans, Johanna L. A.; Gilbert, Tom; Hofreiter, Michael

    2013-01-01

    The analysis of ancient DNA is playing an increasingly important role in conservation genetic, phylogenetic and population genetic analyses, as it allows incorporating extinct species into DNA sequence trees and adds time depth to population genetics studies. For many years, these types of DNA analyses (whether using modern or ancient DNA) were largely restricted to the analysis of short fragments of the mitochondrial genome. However, due to many technological advances during the past decade, a growing number of studies have explored the power of complete mitochondrial genome sequences... yielded major progress with regard to both the phylogenetic positions of extinct species, as well as resolving population genetics questions in both extinct and extant species....

  12. Recriticality analyses for CAPRA cores

    International Nuclear Information System (INIS)

    Maschek, W.; Thiem, D.

    1995-01-01

    The first scoping calculations performed show that the energetics levels from recriticalities in CAPRA cores are in the same range as in conventional cores. However, considerable uncertainties exist and further analyses are necessary. Additional investigations are performed for the separation scenarios of fuel/steel/inert and matrix material, as a large influence of these processes on possible ramp rates and kinetics parameters was detected in the calculations. (orig./HP)

  13. Recriticality analyses for CAPRA cores

    Energy Technology Data Exchange (ETDEWEB)

    Maschek, W.; Thiem, D.

    1995-08-01

    The first scoping calculations performed show that the energetics levels from recriticalities in CAPRA cores are in the same range as in conventional cores. However, considerable uncertainties exist and further analyses are necessary. Additional investigations are performed for the separation scenarios of fuel/steel/inert and matrix material, as a large influence of these processes on possible ramp rates and kinetics parameters was detected in the calculations. (orig./HP)

  14. Technical center for transportation analyses

    International Nuclear Information System (INIS)

    Foley, J.T.

    1978-01-01

    A description is presented of an information search/retrieval/research activity of Sandia Laboratories which provides technical environmental information which may be used in transportation risk analyses, environmental impact statements, development of design and test criteria for packaging of energy materials, and transportation mode research studies. General activities described are: (1) history of center development; (2) environmental information storage/retrieval system; (3) information searches; (4) data needs identification; and (5) field data acquisition system and applications

  15. Methodology of cost benefit analyses

    International Nuclear Information System (INIS)

    Patrik, M.; Babic, P.

    2000-10-01

    The report addresses financial aspects of proposed investments and other steps which are intended to contribute to nuclear safety. The aim is to provide introductory insight into the procedures and potential of cost-benefit analyses as a routine guide when making decisions on costly provisions, i.e. as one of the tools for assessing whether a particular provision is reasonable. The topic is applied to the nuclear power sector. (P.A.)
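    At its core, a cost-benefit appraisal discounts the expected costs and benefits of a provision to present values and compares them. The snippet below is a generic, hedged illustration of that arithmetic (net present value and benefit-cost ratio) with invented cash flows; it is not taken from the report.

```python
def present_value(cash_flows, rate):
    """Discount a list of yearly cash flows (year 0 first) to present value."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

# Invented example: a safety provision costing 10 M now and 0.2 M/year to
# maintain, avoiding expected losses worth 1.5 M/year over 15 years, at 5 %.
years = 15
costs = [10.0] + [0.2] * years
benefits = [0.0] + [1.5] * years
rate = 0.05

npv = present_value(benefits, rate) - present_value(costs, rate)
bcr = present_value(benefits, rate) / present_value(costs, rate)
print(f"NPV = {npv:.2f} M, benefit-cost ratio = {bcr:.2f}")
```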

  16. Chapter No.4. Safety analyses

    International Nuclear Information System (INIS)

    2002-01-01

    In 2001, activity in the field of safety analyses focused on verification of the safety analysis reports for NPP V-2 Bohunice and NPP Mochovce concerning the new profiled fuel, and on the probabilistic safety assessment study for NPP Mochovce. Calculational safety analyses were performed and expert reviews were elaborated for internal UJD needs. An important part of the work was also devoted to scientific and technical tasks set within bilateral co-operation projects between UJD and its international partner organisations, as well as within international projects commissioned and financed by the European Commission. All these activities served as independent support for UJD in its deterministic and probabilistic safety assessment of nuclear installations. Special attention was paid to a review of the level 1 probabilistic safety assessment study for NPP Mochovce. The study elaborated the probabilistic safety analysis of the NPP at full power operation and quantified the contribution of technical and operational improvements to risk reduction. The core damage frequency of the reactor was calculated, and the dominant initiating events and accident sequences with the major contributions to the risk were determined. The aim of the review was to assess the acceptability of the sources of input information, assumptions, models, data, analyses and obtained results, so that the probabilistic model could give a realistic picture of the NPP. The review of the study was performed by UJD in co-operation with the IAEA (IPSART mission) as well as with other external organisations that were not involved in the elaboration of the reviewed document and the probabilistic model of the NPP. The review was made in accordance with IAEA guidelines and methodical documents of UJD and US NRC. In the field of calculational safety analyses, UJD activity focused on the analysis of an operational event and analyses of the selected accident scenarios

  17. Analysing the Wrongness of Killing

    DEFF Research Database (Denmark)

    Di Nucci, Ezio

    2014-01-01

    This article provides an in-depth analysis of the wrongness of killing by comparing different versions of three influential views: the traditional view that killing is always wrong; the liberal view that killing is wrong if and only if the victim does not want to be killed; and Don Marquis' future of value account of the wrongness of killing. In particular, I illustrate the advantages that a basic version of the liberal view and a basic version of the future of value account have over competing alternatives. Still, ultimately none of the views analysed here are satisfactory; but the different...

  18. Methodological challenges in carbohydrate analyses

    Directory of Open Access Journals (Sweden)

    Mary Beth Hall

    2007-07-01

    Full Text Available Carbohydrates can provide up to 80% of the dry matter in animal diets, yet their specific evaluation for research and diet formulation is only now becoming a focus in the animal sciences. Partitioning of dietary carbohydrates for nutritional purposes should reflect differences in digestion and fermentation characteristics and effects on animal performance. Key challenges to designating nutritionally important carbohydrate fractions include classifying the carbohydrates in terms of nutritional characteristics, and selecting analytical methods that describe the desired fraction. The relative lack of information on digestion characteristics of various carbohydrates and their interactions with other fractions in diets means that fractions will not soon be perfectly established. Developing a system of carbohydrate analysis that could be used across animal species could enhance the utility of analyses and the amount of data we can obtain on dietary effects of carbohydrates. Based on quantities present in diets and apparent effects on animal performance, some nutritionally important classes of carbohydrates that may be valuable to measure include sugars, starch, fructans, insoluble fiber, and soluble fiber. Essential to the selection of methods for these fractions is agreement on precisely what carbohydrates should be included in each. Each of these fractions has analyses that could potentially be used to measure them, but most of the available methods have weaknesses that must be evaluated to see if they are fatal and the assay is unusable, or if the assay still may be made workable. Factors we must consider as we seek to analyze carbohydrates to describe diets: Does the assay accurately measure the desired fraction? Is the assay for research, regulatory, or field use (this affects considerations of acceptable costs and throughput)? What are acceptable accuracy and variability of measures? Is the assay robust (this enhances accuracy of values)? For some carbohydrates, we

  19. Theorising and Analysing Academic Labour

    Directory of Open Access Journals (Sweden)

    Thomas Allmer

    2018-01-01

    Full Text Available The aim of this article is to contextualise universities historically within capitalism and to analyse academic labour and the deployment of digital media theoretically and critically. It argues that the post-war expansion of the university can be considered as medium and outcome of informational capitalism and as a dialectical development of social achievement and advanced commodification. The article strives to identify the class position of academic workers, introduces the distinction between academic work and labour, discusses the connection between academic, information and cultural work, and suggests a broad definition of university labour. It presents a theoretical model of working conditions that helps to systematically analyse the academic labour process and to provide an overview of working conditions at universities. The paper furthermore argues for the need to consider the development of education technologies as a dialectics of continuity and discontinuity, discusses the changing nature of the forces and relations of production, and the impact on the working conditions of academics in the digital university. Based on Erik Olin Wright’s inclusive approach of social transformation, the article concludes with the need to bring together anarchist, social democratic and revolutionary strategies for establishing a socialist university in a commons-based information society.

  20. CFD analyses in regulatory practice

    International Nuclear Information System (INIS)

    Bloemeling, F.; Pandazis, P.; Schaffrath, A.

    2012-01-01

    Numerical software is used in nuclear regulatory procedures for many problems in the fields of neutron physics, structural mechanics, thermal hydraulics etc. Among other things, the software is employed in dimensioning and designing systems and components and in simulating transients and accidents. In nuclear technology, analyses of this kind must meet strict requirements. Computational Fluid Dynamics (CFD) codes were developed for computing multidimensional flow processes of the type occurring in reactor cooling systems or in containments. Extensive experience has been accumulated by now in selected single-phase flow phenomena. At the present time, there is a need for development and validation with respect to the simulation of multi-phase and multi-component flows. As insufficient input by the user can lead to faulty results, the validity of the results and an assessment of uncertainties are guaranteed only through consistent application of so-called Best Practice Guidelines. The authors present the possibilities now available to CFD analyses in nuclear regulatory practice. This includes a discussion of the fundamental requirements to be met by numerical software, especially the demands upon computational analysis made by nuclear rules and regulations. In conclusion, two examples of applications of CFD analysis to nuclear problems are presented: determining deboration in the condenser reflux mode of operation, and protection of the reactor pressure vessel (RPV) against brittle failure. (orig.)

  1. Severe accident recriticality analyses (SARA)

    DEFF Research Database (Denmark)

    Frid, W.; Højerup, C.F.; Lindholm, I.

    2001-01-01

    Recriticality in a BWR during reflooding of an overheated partly degraded core, i.e. with relocated control rods, has been studied for a total loss of electric power accident scenario. In order to assess the impact of recriticality on reactor safety, including accident management strategies... with all three codes. The core initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality - both super-prompt power bursts and quasi steady-state power..., which results in large energy deposition in the fuel during power burst in some accident scenarios. The highest value, 418 cal g(-1), was obtained with SIMULATE-3K for an Oskarshamn 3 case with reflooding rate of 2000 kg s(-1). In most cases, however, the predicted energy deposition was smaller, below...

  2. Hydrogen Analyses in the EPR

    International Nuclear Information System (INIS)

    Worapittayaporn, S.; Eyink, J.; Movahed, M.

    2008-01-01

    In severe accidents with core melting large amounts of hydrogen may be released into the containment. The EPR provides a combustible gas control system to prevent hydrogen combustion modes with the potential to challenge the containment integrity due to excessive pressure and temperature loads. This paper outlines the approach for the verification of the effectiveness and efficiency of this system. Specifically, the justification is a multi-step approach. It involves the deployment of integral codes, lumped parameter containment codes and CFD codes and the use of the sigma criterion, which provides the link to the broad experimental data base for flame acceleration (FA) and deflagration to detonation transition (DDT). The procedure is illustrated with an example. The performed analyses show that hydrogen combustion at any time does not lead to pressure or temperature loads that threaten the containment integrity of the EPR. (authors)

  3. Uncertainty and Sensitivity Analyses Plan

    International Nuclear Information System (INIS)

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project
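    As a flavour of what such a plan prescribes, a minimal Monte Carlo uncertainty/sensitivity sketch is given below; the toy dose model, parameter distributions and sample size are invented for illustration and are not the HEDR models.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 10_000  # Monte Carlo sample size (assumed)

# Invented toy "dose model": dose = release * dispersion * intake_factor
release = rng.lognormal(mean=np.log(100.0), sigma=0.5, size=N)
dispersion = rng.uniform(1e-6, 5e-6, size=N)
intake = rng.normal(1.0, 0.2, size=N).clip(min=0.0)
dose = release * dispersion * intake

# Uncertainty analysis: spread of the model prediction.
print("median dose:", np.median(dose))
print("5th-95th percentile:", np.percentile(dose, [5, 95]))

# Crude sensitivity analysis: correlation of each input with the output.
for name, x in [("release", release), ("dispersion", dispersion), ("intake", intake)]:
    r = np.corrcoef(x, dose)[0, 1]
    print(f"correlation of {name} with dose: {r:.2f}")
```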

  4. The hemispherical deflector analyser revisited

    Energy Technology Data Exchange (ETDEWEB)

    Benis, E.P. [Institute of Electronic Structure and Laser, P.O. Box 1385, 71110 Heraklion, Crete (Greece)], E-mail: benis@iesl.forth.gr; Zouros, T.J.M. [Institute of Electronic Structure and Laser, P.O. Box 1385, 71110 Heraklion, Crete (Greece); Department of Physics, University of Crete, P.O. Box 2208, 71003 Heraklion, Crete (Greece)

    2008-04-15

    Using the basic spectrometer trajectory equation for motion in an ideal 1/r potential derived in Eq. (101) of part I [T.J.M. Zouros, E.P. Benis, J. Electron Spectrosc. Relat. Phenom. 125 (2002) 221], the operational characteristics of a hemispherical deflector analyser (HDA) such as dispersion, energy resolution, energy calibration, input lens magnification and energy acceptance window are investigated from first principles. These characteristics are studied as a function of the entry point R{sub 0} and the nominal value of the potential V(R{sub 0}) at entry. Electron-optics simulations and actual laboratory measurements are compared to our theoretical results for an ideal biased paracentric HDA using a four-element zoom lens and a two-dimensional position sensitive detector (2D-PSD). These results should be of particular interest to users of modern HDAs utilizing a PSD.
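    For orientation, the physical starting point can be recalled in one formula: in an ideal 1/r potential the trajectories inside the analyser are conic (Kepler-type) orbits, which is why the operational characteristics can be derived in closed form. The generic form below uses standard symbols and is not the specific parametrisation of Eq. (101) of part I.

```latex
% Trajectory of a charged particle in an attractive ideal 1/r potential,
% V(r) = -k/r: a conic section with the force centre at one focus.
% p (semi-latus rectum) and \epsilon (eccentricity) are fixed by the
% energy and angular momentum at the entry point, e.g. by R_0 and V(R_0).
\[
  r(\theta) \;=\; \frac{p}{1 + \epsilon \cos(\theta - \theta_0)}
\]
```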

  5. The hemispherical deflector analyser revisited

    International Nuclear Information System (INIS)

    Benis, E.P.; Zouros, T.J.M.

    2008-01-01

    Using the basic spectrometer trajectory equation for motion in an ideal 1/r potential derived in Eq. (101) of part I [T.J.M. Zouros, E.P. Benis, J. Electron Spectrosc. Relat. Phenom. 125 (2002) 221], the operational characteristics of a hemispherical deflector analyser (HDA) such as dispersion, energy resolution, energy calibration, input lens magnification and energy acceptance window are investigated from first principles. These characteristics are studied as a function of the entry point R0 and the nominal value of the potential V(R0) at entry. Electron-optics simulations and actual laboratory measurements are compared to our theoretical results for an ideal biased paracentric HDA using a four-element zoom lens and a two-dimensional position sensitive detector (2D-PSD). These results should be of particular interest to users of modern HDAs utilizing a PSD

  6. Analysing Protocol Stacks for Services

    DEFF Research Database (Denmark)

    Gao, Han; Nielson, Flemming; Nielson, Hanne Riis

    2011-01-01

    We show an approach, CaPiTo, to model service-oriented applications using process algebras such that, on the one hand, we can achieve a certain level of abstraction without being overwhelmed by the underlying implementation details and, on the other hand, we respect the concrete industrial standards used for implementing the service-oriented applications. By doing so, we will be able to not only reason about applications at different levels of abstraction, but also to build a bridge between the views of researchers on formal methods and developers in industry. We apply our approach to the financial case study taken from Chapter 0-3. Finally, we develop a static analysis to analyse the security properties as they emerge at the level of concrete industrial protocols.

  7. Analysing performance through value creation

    Directory of Open Access Journals (Sweden)

    Adrian TRIFAN

    2015-12-01

    Full Text Available This paper draws a parallel between measuring financial performance in two variants: the first one using data offered by accounting, which lays emphasis on maximizing profit, and the second one, which aims at creating value. The traditional approach to performance is based on some indicators derived from accounting data: ROI, ROE, EPS. Traditional management, based on analysing accounting data, has shown its limits, and a new approach is needed, based on creating value. The evaluation of value-based performance tries to avoid the errors due to accounting data by using other specific indicators: EVA, MVA, TSR, CVA. The main objective is shifted from maximizing income to maximizing the value created for shareholders. The theoretical part is accompanied by a practical analysis regarding the creation of value and an analysis of the main indicators which evaluate this concept.
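    Of the value-based indicators mentioned, EVA is the easiest to illustrate numerically: it charges the operating profit with the full cost of the capital employed. The figures below are invented; the formula EVA = NOPAT − WACC × invested capital is the standard textbook definition, not a calculation from the paper.

```python
def economic_value_added(nopat, wacc, invested_capital):
    """EVA = net operating profit after taxes minus the capital charge."""
    return nopat - wacc * invested_capital

# Invented example: 12 M NOPAT, 9 % weighted average cost of capital,
# 100 M of invested capital.
eva = economic_value_added(nopat=12.0, wacc=0.09, invested_capital=100.0)
print(f"EVA = {eva:.1f} M")  # positive EVA -> value created for shareholders
```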

  8. Proteins analysed as virtual knots

    Science.gov (United States)

    Alexander, Keith; Taylor, Alexander J.; Dennis, Mark R.

    2017-02-01

    Long, flexible physical filaments are naturally tangled and knotted, from macroscopic string down to long-chain molecules. The existence of knotting in a filament naturally affects its configuration and properties, and may be very stable or disappear rapidly under manipulation and interaction. Knotting has been previously identified in protein backbone chains, for which these mechanical constraints are of fundamental importance to their molecular functionality, despite their being open curves in which the knots are not mathematically well defined; knotting can only be identified by closing the termini of the chain somehow. We introduce a new method for resolving knotting in open curves using virtual knots, which are a wider class of topological objects that do not require a classical closure and so naturally capture the topological ambiguity inherent in open curves. We describe the results of analysing proteins in the Protein Data Bank by this new scheme, recovering and extending previous knotting results, and identifying topological interest in some new cases. The statistics of virtual knots in protein chains are compared with those of open random walks and Hamiltonian subchains on cubic lattices, identifying a regime of open curves in which the virtual knotting description is likely to be important.

  9. Digital image analyser for autoradiography

    International Nuclear Information System (INIS)

    Muth, R.A.; Plotnick, J.

    1985-01-01

    The most critical parameter in quantitative autoradiography for assay of tissue concentrations of tracers is the ability to obtain precise and accurate measurements of optical density of the images. Existing high precision systems for image analysis, rotating drum densitometers, are expensive, suffer from mechanical problems and are slow. More moderately priced and reliable video camera based systems are available, but their outputs generally do not have the uniformity and stability necessary for high resolution quantitative autoradiography. The authors have designed and constructed an image analyser optimized for quantitative single and multiple tracer autoradiography which the authors refer to as a memory-mapped charge-coupled device scanner (MM-CCD). The input is from a linear array of CCDs which is used to optically scan the autoradiograph. Images are digitized into 512 x 512 picture elements with 256 gray levels and the data are stored in buffer video memory in less than two seconds. Images can then be transferred to RAM memory by direct memory-mapping for further processing. Arterial blood curve data and optical density-calibrated standards data can be entered, and the optical density images can be converted automatically to tracer concentration or functional images. In double tracer studies, images produced from both exposures can be stored and processed in RAM to yield "pure" individual tracer concentration or functional images. Any processed image can be transmitted back to the buffer memory to be viewed on a monitor and processed for region of interest analysis.
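    The conversion step the authors describe (optical density image plus calibrated standards to a tracer-concentration image) can be sketched as a simple per-pixel interpolation. The standards and array sizes below are invented; real autoradiography calibrations are often non-linear, so this is only an illustration.

```python
import numpy as np

# Invented calibration standards: optical density of co-exposed standards
# with known tracer concentrations (e.g. nCi/g).
standard_od = np.array([0.05, 0.20, 0.45, 0.80, 1.20])
standard_conc = np.array([0.0, 10.0, 30.0, 70.0, 120.0])

def od_to_concentration(od_image):
    """Map each pixel's optical density to concentration by interpolation
    against the calibrated standards (piecewise-linear assumption)."""
    return np.interp(od_image, standard_od, standard_conc)

# Toy 4x4 "autoradiograph" of optical densities instead of a 512x512 scan.
od_image = np.array([[0.10, 0.30, 0.50, 0.90],
                     [0.20, 0.40, 0.60, 1.00],
                     [0.10, 0.20, 0.70, 1.10],
                     [0.05, 0.30, 0.80, 1.20]])
print(od_to_concentration(od_image))
```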

  10. Severe Accident Recriticality Analyses (SARA)

    Energy Technology Data Exchange (ETDEWEB)

    Frid, W. [Swedish Nuclear Power Inspectorate, Stockholm (Sweden); Hoejerup, F. [Risoe National Lab. (Denmark); Lindholm, I.; Miettinen, J.; Puska, E.K. [VTT Energy, Helsinki (Finland); Nilsson, Lars [Studsvik Eco and Safety AB, Nykoeping (Sweden); Sjoevall, H. [Teollisuuden Voima Oy (Finland)

    1999-11-01

    Recriticality in a BWR has been studied for a total loss of electric power accident scenario. In a BWR, the B{sub 4}C control rods would melt and relocate from the core before the fuel during core uncovery and heat-up. If electric power returns during this time-window unborated water from ECCS systems will start to reflood the partly control rod free core. Recriticality might take place for which the only mitigating mechanisms are the Doppler effect and void formation. In order to assess the impact of recriticality on reactor safety, including accident management measures, the following issues have been investigated in the SARA project: 1. the energy deposition in the fuel during super-prompt power burst, 2. the quasi steady-state reactor power following the initial power burst and 3. containment response to elevated quasi steady-state reactor power. The approach was to use three computer codes and to further develop and adapt them for the task. The codes were SIMULATE-3K, APROS and RECRIT. Recriticality analyses were carried out for a number of selected reflooding transients for the Oskarshamn 3 plant in Sweden with SIMULATE-3K and for the Olkiluoto 1 plant in Finland with all three codes. The core state initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality - both superprompt power bursts and quasi steady-state power generation - for the studied range of parameters, i. e. with core uncovery and heat-up to maximum core temperatures around 1800 K and water flow rates of 45 kg/s to 2000 kg/s injected into the downcomer. Since the recriticality takes place in a small fraction of the core the power densities are high which results in large energy deposition in the fuel during power burst in some accident scenarios. The highest value, 418 cal/g, was obtained with SIMULATE-3K for an Oskarshamn 3 case with reflooding

  11. Severe accident recriticality analyses (SARA)

    Energy Technology Data Exchange (ETDEWEB)

    Frid, W. E-mail: wiktor.frid@ski.se; Hoejerup, F.; Lindholm, I.; Miettinen, J.; Nilsson, L.; Puska, E.K.; Sjoevall, H

    2001-11-01

    Recriticality in a BWR during reflooding of an overheated partly degraded core, i.e. with relocated control rods, has been studied for a total loss of electric power accident scenario. In order to assess the impact of recriticality on reactor safety, including accident management strategies, the following issues have been investigated in the SARA project: (1) the energy deposition in the fuel during super-prompt power burst; (2) the quasi steady-state reactor power following the initial power burst; and (3) containment response to elevated quasi steady-state reactor power. The approach was to use three computer codes and to further develop and adapt them for the task. The codes were SIMULATE-3K, APROS and RECRIT. Recriticality analyses were carried out for a number of selected reflooding transients for the Oskarshamn 3 plant in Sweden with SIMULATE-3K and for the Olkiluoto 1 plant in Finland with all three codes. The core initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality--both super-prompt power bursts and quasi steady-state power generation--for the range of parameters studied, i.e. with core uncovering and heat-up to maximum core temperatures of approximately 1800 K, and water flow rates of 45-2000 kg s{sup -1} injected into the downcomer. Since recriticality takes place in a small fraction of the core, the power densities are high, which results in large energy deposition in the fuel during power burst in some accident scenarios. The highest value, 418 cal g{sup -1}, was obtained with SIMULATE-3K for an Oskarshamn 3 case with reflooding rate of 2000 kg s{sup -1}. In most cases, however, the predicted energy deposition was smaller, below the regulatory limits for fuel failure, but close to or above recently observed thresholds for fragmentation and dispersion of high burn-up fuel. The highest calculated

  12. Severe accident recriticality analyses (SARA)

    International Nuclear Information System (INIS)

    Frid, W.; Hoejerup, F.; Lindholm, I.; Miettinen, J.; Nilsson, L.; Puska, E.K.; Sjoevall, H.

    2001-01-01

    Recriticality in a BWR during reflooding of an overheated partly degraded core, i.e. with relocated control rods, has been studied for a total loss of electric power accident scenario. In order to assess the impact of recriticality on reactor safety, including accident management strategies, the following issues have been investigated in the SARA project: (1) the energy deposition in the fuel during super-prompt power burst; (2) the quasi steady-state reactor power following the initial power burst; and (3) containment response to elevated quasi steady-state reactor power. The approach was to use three computer codes and to further develop and adapt them for the task. The codes were SIMULATE-3K, APROS and RECRIT. Recriticality analyses were carried out for a number of selected reflooding transients for the Oskarshamn 3 plant in Sweden with SIMULATE-3K and for the Olkiluoto 1 plant in Finland with all three codes. The core initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality--both super-prompt power bursts and quasi steady-state power generation--for the range of parameters studied, i.e. with core uncovering and heat-up to maximum core temperatures of approximately 1800 K, and water flow rates of 45-2000 kg s^-1 injected into the downcomer. Since recriticality takes place in a small fraction of the core, the power densities are high, which results in large energy deposition in the fuel during power burst in some accident scenarios. The highest value, 418 cal g^-1, was obtained with SIMULATE-3K for an Oskarshamn 3 case with reflooding rate of 2000 kg s^-1. In most cases, however, the predicted energy deposition was smaller, below the regulatory limits for fuel failure, but close to or above recently observed thresholds for fragmentation and dispersion of high burn-up fuel. The highest calculated quasi steady

  13. Severe Accident Recriticality Analyses (SARA)

    International Nuclear Information System (INIS)

    Frid, W.; Hoejerup, F.; Lindholm, I.; Miettinen, J.; Puska, E.K.; Nilsson, Lars; Sjoevall, H.

    1999-11-01

    Recriticality in a BWR has been studied for a total loss of electric power accident scenario. In a BWR, the B4C control rods would melt and relocate from the core before the fuel during core uncovery and heat-up. If electric power returns during this time-window unborated water from ECCS systems will start to reflood the partly control rod free core. Recriticality might take place for which the only mitigating mechanisms are the Doppler effect and void formation. In order to assess the impact of recriticality on reactor safety, including accident management measures, the following issues have been investigated in the SARA project: 1. the energy deposition in the fuel during super-prompt power burst, 2. the quasi steady-state reactor power following the initial power burst and 3. containment response to elevated quasi steady-state reactor power. The approach was to use three computer codes and to further develop and adapt them for the task. The codes were SIMULATE-3K, APROS and RECRIT. Recriticality analyses were carried out for a number of selected reflooding transients for the Oskarshamn 3 plant in Sweden with SIMULATE-3K and for the Olkiluoto 1 plant in Finland with all three codes. The core state initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality - both superprompt power bursts and quasi steady-state power generation - for the studied range of parameters, i. e. with core uncovery and heat-up to maximum core temperatures around 1800 K and water flow rates of 45 kg/s to 2000 kg/s injected into the downcomer. Since the recriticality takes place in a small fraction of the core the power densities are high which results in large energy deposition in the fuel during power burst in some accident scenarios. The highest value, 418 cal/g, was obtained with SIMULATE-3K for an Oskarshamn 3 case with reflooding

  14. Pawnee Nation Energy Option Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Matlock, M.; Kersey, K.; Riding In, C.

    2009-07-21

    Pawnee Nation of Oklahoma Energy Option Analyses. In 2003, the Pawnee Nation leadership identified the need for the tribe to comprehensively address its energy issues. During a strategic energy planning workshop a general framework was laid out and the Pawnee Nation Energy Task Force was created to work toward further development of the tribe’s energy vision. The overarching goals of the “first steps” project were to identify the most appropriate focus for its strategic energy initiatives going forward, and to provide information necessary to take the next steps in pursuit of the “best fit” energy options. Description of Activities Performed: The research team reviewed existing data pertaining to the availability of biomass (focusing on woody biomass, agricultural biomass/bio-energy crops, and methane capture), solar, wind and hydropower resources on the Pawnee-owned lands. Using these data, combined with assumptions about costs and revenue streams, the research team performed preliminary feasibility assessments for each resource category. The research team also reviewed available funding resources and made recommendations to Pawnee Nation highlighting those resources with the greatest potential for financially-viable development, both in the near-term and over a longer time horizon. Findings and Recommendations: Due to a lack of financial incentives for renewable energy, particularly at the state level, combined with mediocre renewable energy resources, renewable energy development opportunities are limited for Pawnee Nation. However, near-term potential exists for development of solar hot water at the gym, and an exterior wood-fired boiler system at the tribe’s main administrative building. Pawnee Nation should also explore options for developing LFGTE resources in collaboration with the City of Pawnee. Significant potential may also exist for development of bio-energy resources within the next decade. Pawnee Nation representatives should closely monitor

  15. Tecnologías convergentes para la enseñanza: Realidad Aumentada, BYOD, Flipped Classroom

    Directory of Open Access Journals (Sweden)

    José Manuel Sánchez-García

    2017-01-01

    Full Text Available Knowing which technologies will emerge in the coming years is of interest in order to address the problems that will arise in classrooms. The confluence of trends such as Bring Your Own Device (BYOD), Augmented Reality (AR) and the Flipped Classroom (FC) will lead to changes that are worth investigating, so as to know in advance whether this convergence can take place and what value it may have in the teaching and learning process. To this end it is appropriate to address the challenges posed by this integration and the legislation that regulates the access of electronic devices in class, whether for their use in higher education or in upper secondary, secondary, primary and pre-school education. It is of interest to know the challenges that access to these devices in schools poses for teachers and students, as well as which materials are most suitable for this purpose in terms of hardware devices and the production of content by teachers and by companies in the education sector, and, regarding the use of devices and content, the improvement of educational quality at the different levels at which such content is introduced and the advantages, disadvantages and risks involved in its use.

  16. Ordenamiento Territorial del Buen Vivir. Paisaje, Patrimonio y Biodiversidad, ¿Conceptos Divergentes o Convergentes?

    Directory of Open Access Journals (Sweden)

    Luisa Mattioli

    2017-01-01

    Full Text Available The Buen Vivir (Good Living) paradigm, which originated in Bolivia and Ecuador, proposes a relationship of society in harmony with nature from a perspective of socio-ecological transition. The Constitution of Ecuador incorporates the rights of nature as equivalent to human rights, constituting a stance that is disruptive of the current development model. With regard to territorial planning, however, it presents a certain methodological vacancy, in which fundamental concepts such as landscape, heritage and biodiversity reveal a lack of complementarity among themselves. The objective of this work is to examine these concepts and their interrelations in order to achieve an adequate approach to this new paradigm. A critical, conceptual review of the approaches of international conventions, charters and agreements is presented, which reveals mismatches in their correct interpretation among problems that aggravate the vulnerability of territories. As a relevant contribution, a discussion and synthesis is presented for the joint understanding and advancement of knowledge. This paradigm is understood as an interdisciplinary challenge for approaching the complex system and as a viable alternative to development.

  17. Vias de síntese linear e convergente: qual é mais verde?

    Directory of Open Access Journals (Sweden)

    Adélio A. S. C Machado

    2011-01-01

    Full Text Available A comparative study of a convergent and a linear synthetic pathway with respect to their relative greenness allowed quantification of the advantages of the former with respect to atomic productivity as well as robustness. The calculations show that convergent pathways provide a decrease in costs together with a decrease of the E factor and an increase of atom economy, which means that greenness is accompanied by an economic advantage. The influence of other features of convergent pathways on the improvement of the greenness of the synthesis is discussed qualitatively.
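    One quantitative advantage of convergent routes is easy to reproduce: for a per-step yield y, an n-step linear sequence delivers an overall yield of y^n, whereas a convergent plan shortens the longest linear sequence. The figures below are generic illustrative numbers, not the ones analysed in the paper.

```python
def overall_yield(per_step_yield, steps):
    """Overall yield of a linear sequence of `steps` reactions."""
    return per_step_yield ** steps

y = 0.80  # assumed 80 % yield per step
# 8-step linear route versus a convergent route with two 4-step branches
# joined by a final coupling step (longest linear sequence = 5 steps).
linear = overall_yield(y, 8)
convergent = overall_yield(y, 5)
print(f"linear 8-step route:                 {linear:.1%} overall yield")
print(f"convergent route (5-step longest LS): {convergent:.1%} overall yield")
```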

  18. Ciudades-modelo: estrategias convergentes para su difusión internacional

    Directory of Open Access Journals (Sweden)

    Fernanda Sánchez

    2005-08-01

    Full Text Available Some cities are chosen as modelling references, and their programmes and projects are incorporated into the hegemonic urban agenda. This agenda, expressive of the contemporary stage of capitalism, disseminates a set of ideas attuned to the so-called "global impulses" and rests on the codification of actions deemed desirable for local governments seeking competitive inclusion in the new map of the world; consequently, governments that conceive of the city as a commodity treat it as a means of attracting citizen-consumers and investors. By identifying these processes, this article seeks to denaturalise certain links and strategies reiterated in the most widely disseminated discourses and images about model cities. A homogeneous pattern seems to be revealed in the confluence of current urban policies which, nevertheless, originated in profoundly different cities, such as Curitiba (Brazil) and Singapore (Singapore), taken as illustrative cases in this reflection.

  19. Medios tecnológicos e Inteligencia: bases para una interrelación convergente

    Directory of Open Access Journals (Sweden)

    Navarro Bonilla, Diego

    2005-01-01

    Full Text Available A reflection is proposed on the technological foundations that make the generation of intelligence possible. Investment in information technologies applied to the field of security and defence grows every day and demonstrates how the information explosion requires ever more complex means and techniques in order not to be saturated by the informational overflow. The evaluation and analysis of the information that will become intelligence are key aspects of the whole process. However, beyond the necessary updating of the technological means for obtaining, processing and presenting information and knowledge by intelligence organisations…

  20. Do homo sapiens ao homo convergente. É tempo de coisas e pessoas integradas.

    Directory of Open Access Journals (Sweden)

    Deisy Fernanda Feitosa

    2013-12-01

    Full Text Available The ubiquity of the digital world offers us the possibility of a transformation of lifestyle, extending to the life of consumption. We understand that this process is already consolidated, although not yet implemented, since this lifestyle will be exercised by the generation that has already incorporated ubiquitous computing as an integral part of its life. Meanwhile, a technology still under development promises to integrate and digitalise the planet and much of what is in it, building smart cities, spaces and things that dialogue continuously to exchange information. Everything indicates that this will be the post-digital era, dominated by the "Internet of Things", yet always manipulated by the abilities and intelligence inherent to human beings.

  1. SIG y análisis espacial de datos: perspectivas convergentes

    Directory of Open Access Journals (Sweden)

    Michael F. Goodchild

    2005-01-01

    Full Text Available This article identifies some of the most important developments experienced by GIS and spatial data analysis since the early 1950s. Although GIS and spatial data analysis started out as two more or less separate areas of research and application, they have grown closely together over time. The paper argues that these two disciplines meet on the ground of Geographic Information Science, each of them providing support for, or adding value to, the other. The article begins with a critical retrospective view of the developments of the last fifty years. It then reflects on current challenges and speculates about the future. Finally, the potential for convergence of the development of GIS and spatial data analysis under the rubric of Geographic Information Science (or GIScience) is discussed.

  3. Divergências convergentes: a nova cultura radiofônica

    Directory of Open Access Journals (Sweden)

    Lilian Zaremba

    2001-02-01

    Full Text Available Some twenty years ago radio began to be transformed by digitalisation and under the impact of the so-called convergence of the mass media, the computer and telecommunications, offering ground for the appearance of hybrid forms: traditional conceptions face the clash in the mutating cyberspace. Mutatis mutandis, changing what needs to be changed, radio shifts along the dial at the crossroads of options. The medium of gigantic international organisations, and public, community or pirate radio, attracted by this hitherto strange magnet of the new electronic media, fear being dissolved by it. The resistance of the old standards succumbs to commercial pressure, making it urgent to preserve some space for reflection. This work sought to act on that urgency by contributing to the debate, searching in the origins of this medium of communication for an understanding of what radio is and may perhaps continue to be.

  4. Improving word coverage using unsupervised morphological analyser

    Indian Academy of Sciences (India)

    To enable a computer to process information in human languages, ... vised morphological analyser (UMA) would learn how to analyse a language just by looking ... result for English, but they did remarkably worse for Finnish and Turkish.

  5. Techniques for Analysing Problems in Engineering Projects

    DEFF Research Database (Denmark)

    Thorsteinsson, Uffe

    1998-01-01

    Description of how a CPM network can be used for analysing complex problems in engineering projects.
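    The critical path method itself is easy to sketch: a forward pass gives earliest start/finish times, a backward pass gives latest times, and activities with zero float form the critical path. The activity network below is an invented example, not one from the referenced description.

```python
# Minimal critical path method (CPM) sketch on an invented activity network.
# Each activity: (duration, list of predecessors); dict order is topological.
activities = {
    "A": (3, []),
    "B": (2, ["A"]),
    "C": (4, ["A"]),
    "D": (2, ["B", "C"]),
    "E": (3, ["D"]),
}

# Forward pass: earliest finish of each activity.
earliest_finish = {}
for name, (dur, preds) in activities.items():
    earliest_start = max((earliest_finish[p] for p in preds), default=0)
    earliest_finish[name] = earliest_start + dur

project_duration = max(earliest_finish.values())

# Backward pass: latest finish times (iterate in reverse topological order).
latest_finish = {name: project_duration for name in activities}
for name in reversed(list(activities)):
    dur, preds = activities[name]
    latest_start = latest_finish[name] - dur
    for p in preds:
        latest_finish[p] = min(latest_finish[p], latest_start)

# Activities with zero total float lie on the critical path.
critical_path = [n for n in activities
                 if latest_finish[n] - earliest_finish[n] == 0]
print("project duration:", project_duration)   # 12
print("critical path:", critical_path)         # ['A', 'C', 'D', 'E']
```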

  6. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan

    2007-01-01

    Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic prog...

  7. Fracture analyses of WWER reactor pressure vessels

    International Nuclear Information System (INIS)

    Sievers, J.; Liu, X.

    1997-01-01

    In the paper, the methodology of fracture assessment based on finite element (FE) calculations is first described and compared with simplified methods. The FE based methodology was verified by analyses of large scale thermal shock experiments in the framework of the international comparative study FALSIRE (Fracture Analyses of Large Scale Experiments) organized by GRS and ORNL. Furthermore, selected results from fracture analyses of different WWER type RPVs with postulated cracks under different loading transients are presented. 11 refs, 13 figs, 1 tab

  8. Fracture analyses of WWER reactor pressure vessels

    Energy Technology Data Exchange (ETDEWEB)

    Sievers, J; Liu, X [Gesellschaft fuer Anlagen- und Reaktorsicherheit mbH (GRS), Koeln (Germany)

    1997-09-01

    In the paper, the methodology of fracture assessment based on finite element (FE) calculations is first described and compared with simplified methods. The FE based methodology was verified by analyses of large scale thermal shock experiments in the framework of the international comparative study FALSIRE (Fracture Analyses of Large Scale Experiments) organized by GRS and ORNL. Furthermore, selected results from fracture analyses of different WWER type RPVs with postulated cracks under different loading transients are presented. 11 refs, 13 figs, 1 tab.

  9. [Anne Arold. Kontrastive Analyse...] / Paul Alvre

    Index Scriptorium Estoniae

    Alvre, Paul, 1921-2008

    2001-01-01

    Review of: Arold, Anne. Kontrastive Analyse der Wortbildungsmuster im Deutschen und im Estnischen (am Beispiel der Aussehensadjektive). Tartu, 2000. (Dissertationes philologiae germanicae Universitatis Tartuensis)

  10. An MDE Approach for Modular Program Analyses

    NARCIS (Netherlands)

    Yildiz, Bugra Mehmet; Bockisch, Christoph; Aksit, Mehmet; Rensink, Arend

    Program analyses are an important tool to check if a system fulfills its specification. A typical implementation strategy for program analyses is to use an imperative, general-purpose language like Java, and access the program to be analyzed through libraries that offer an API for reading, writing

  11. Random error in cardiovascular meta-analyses

    DEFF Research Database (Denmark)

    Albalawi, Zaina; McAlister, Finlay A; Thorlund, Kristian

    2013-01-01

    BACKGROUND: Cochrane reviews are viewed as the gold standard in meta-analyses given their efforts to identify and limit systematic error which could cause spurious conclusions. The potential for random error to cause spurious conclusions in meta-analyses is less well appreciated. METHODS: We exam...

  12. Diversity of primary care systems analysed.

    NARCIS (Netherlands)

    Kringos, D.; Boerma, W.; Bourgueil, Y.; Cartier, T.; Dedeu, T.; Hasvold, T.; Hutchinson, A.; Lember, M.; Oleszczyk, M.; Pavlick, D.R.

    2015-01-01

    This chapter analyses differences between countries and explains why countries differ regarding the structure and process of primary care. The components of primary care strength that are used in the analyses are health policy-making, workforce development and in the care process itself (see Fig.

  13. Approximate analyses of inelastic effects in pipework

    International Nuclear Information System (INIS)

    Jobson, D.A.

    1983-01-01

    This presentation shows figures concerned with analyses of inelastic effects in pipework, as follows: comparison of experimental and calculated simplified-analysis results for free end rotation and for circumferential strain; interrupted stress relaxation; regenerated relaxation caused by reversed yield; buckling of straight pipe under combined bending and torsion; results of fatigue tests of pipe bends

  14. Level II Ergonomic Analyses, Dover AFB, DE

    Science.gov (United States)

    1999-02-01

    IERA-RS-BR-TR-1999-0002. United States Air Force IERA report: Level II Ergonomic Analyses, Dover AFB, DE. Andrew Marcotte; Marilyn Joyce (The Joyce...). Only report-documentation-form and table-of-contents fragments are recoverable from the record, including the headings: 1.0 Introduction; 1.1 Purpose of the Level II Ergonomic Analyses; 1.2 Approach; 1.2.1 Initial Shop Selection and Administration of the...

  15. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan

    2007-01-01

    Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic programs and evaluated using incremental tabled evaluation, a technique for efficiently updating memo tables in response to changes in facts and rules. The approach has been implemented and integrated into the Eclipse IDE. Our measurements show that this technique is effective for automatically...
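    The idea of incrementally maintained memo tables can be hinted at in plain Python; the paper works with tabled Prolog, so this is only a loose analogy under assumed data structures: derived results are cached per input fact, and when a fact changes only the cached entries that depended on it are recomputed.

```python
# Loose Python analogy of incrementally updated memo tables (assumed design,
# not the tabled-Prolog machinery described in the paper).
facts = {"a.java": {"calls": ["b"]}, "b.java": {"calls": []}}
memo = {}          # analysis result per file
dependents = {}    # file -> set of files whose results depend on it

def analyse(name):
    """A toy 'analysis': collect the transitive calls of a file."""
    result = set(facts[name]["calls"])
    for callee in facts[name]["calls"]:
        dependents.setdefault(callee + ".java", set()).add(name)
        result |= analyse_cached(callee + ".java")
    return result

def analyse_cached(name):
    if name not in memo:
        memo[name] = analyse(name)
    return memo[name]

def update_fact(name, new_fact):
    """On a change, invalidate only the entries that depend on `name`."""
    facts[name] = new_fact
    for stale in {name} | dependents.get(name, set()):
        memo.pop(stale, None)

print(analyse_cached("a.java"))      # {'b'}
update_fact("b.java", {"calls": ["c"]})
facts["c.java"] = {"calls": []}
print(analyse_cached("a.java"))      # recomputed only where needed: {'b', 'c'}
```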

  16. Cost-Benefit Analyses of Transportation Investments

    DEFF Research Database (Denmark)

    Næss, Petter

    2006-01-01

    This paper discusses the practice of cost-benefit analyses of transportation infrastructure investment projects from the meta-theoretical perspective of critical realism. Such analyses are based on a number of untenable ontological assumptions about social value, human nature and the natural......-to-pay investigations. Accepting the ontological and epistemological assumptions of cost-benefit analysis involves an implicit acceptance of the ethical and political values favoured by these assumptions. Cost-benefit analyses of transportation investment projects tend to neglect long-term environmental consequences...

  17. Comparison with Russian analyses of meteor impact

    Energy Technology Data Exchange (ETDEWEB)

    Canavan, G.H.

    1997-06-01

    The inversion model for meteor impacts is used to discuss Russian analyses and compare principal results. For common input parameters, the models produce consistent estimates of impactor parameters. Directions for future research are discussed and prioritized.

  18. 7 CFR 94.102 - Analyses available.

    Science.gov (United States)

    2010-01-01

    ... analyses for total ash, fat by acid hydrolysis, moisture, salt, protein, beta-carotene, catalase... glycol, SLS, and zeolex. There are also tests for starch, total sugars, sugar profile, whey, standard...

  19. Anthocyanin analyses of Vaccinium fruit dietary supplements

    Science.gov (United States)

    Vaccinium fruit ingredients within dietary supplements were identified by comparisons with anthocyanin analyses of known Vaccinium profiles (demonstration of anthocyanin fingerprinting). Available Vaccinium supplements were purchased and analyzed; their anthocyanin profiles (based on HPLC separation...
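
    The fingerprint comparison described here can be illustrated with a small, entirely hypothetical sketch: each sample is reduced to a vector of relative HPLC peak areas for a fixed set of anthocyanins, and the supplement profile is matched to the closest known Vaccinium reference by cosine similarity. The species names, peak sets and values below are invented for illustration only.

        import numpy as np

        # Hypothetical relative peak areas (fraction of total anthocyanin signal)
        # for a fixed elution order of anthocyanins; values are illustrative only.
        references = {
            "lowbush blueberry": np.array([0.22, 0.18, 0.25, 0.20, 0.15]),
            "bilberry":          np.array([0.15, 0.25, 0.20, 0.25, 0.15]),
            "cranberry":         np.array([0.40, 0.10, 0.30, 0.10, 0.10]),
        }
        supplement = np.array([0.21, 0.19, 0.24, 0.21, 0.15])   # measured supplement profile

        def cosine(a, b):
            return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

        scores = {name: cosine(supplement, ref) for name, ref in references.items()}
        print(scores)
        print("closest reference profile:", max(scores, key=scores.get))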

  20. Analyse of Maintenance Cost in ST

    CERN Document Server

    Jenssen, B W

    2001-01-01

    An analysis has been carried out in ST concerning the total costs for the division. Even though the target was the maintenance costs in ST, the global budget overall has been analysed. This has been done since there is a close relation between investments & consolidation and the required level of maintenance. The purpose of the analysis was to focus on maintenance cost in ST as a ratio of total maintenance costs over the replacement value of the equipment, and to make some comparisons with other industries and laboratories. Families of equipment have been defined and their corresponding ratios calculated. This first approach gives us some "quantitative" measurements. This analysis should be combined with performance indicators (more "qualitative" measurements) that tell us how well we are performing. This will help us in defending our budget and setting better priorities, and we will satisfy the requirements from our external auditors.
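
    The ratio described above is straightforward to compute once the equipment families are defined; the sketch below shows the maintenance-cost-to-replacement-value ratio per family, which is the kind of "quantitative" indicator the analysis produces. The family names, figures and the kCHF unit are invented for illustration and do not come from the CERN document.

        # Hypothetical equipment families: (annual maintenance cost, replacement value), in kCHF.
        families = {
            "cooling":     (1200,  40000),
            "electricity": (2500, 110000),
            "HVAC":        ( 800,  30000),
        }

        for name, (maintenance, replacement) in families.items():
            ratio = maintenance / replacement
            print(f"{name:12s} maintenance/replacement = {ratio:.1%}")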

  1. A History of Rotorcraft Comprehensive Analyses

    Science.gov (United States)

    Johnson, Wayne

    2013-01-01

    A history of the development of rotorcraft comprehensive analyses is presented. Comprehensive analyses are digital computer programs that calculate the aeromechanical behavior of the rotor and aircraft, bringing together the most advanced models of the geometry, structure, dynamics, and aerodynamics available in rotary wing technology. The development of the major codes of the last five decades from industry, government, and universities is described. A number of common themes observed in this history are discussed.

  2. Safety analyses for reprocessing and waste processing

    International Nuclear Information System (INIS)

    1983-03-01

    Presentation of an incident analysis of process steps of the RP, simplified considerations concerning safety, and safety analyses of the storage and solidification facilities of the RP. A release tree method is developed and tested. An incident analysis of process steps, the evaluation of the SRL-study and safety analyses of the storage and solidification facilities of the RP are performed in particular. (DG) [de

  3. Risk analyses of nuclear power plants

    International Nuclear Information System (INIS)

    Jehee, J.N.T.; Seebregts, A.J.

    1991-02-01

    Probabilistic risk analyses of nuclear power plants are carried out by systematically analyzing the possible consequences of a broad spectrum of causes of accidents. The risk can be expressed in the probabilities for melt down, radioactive releases, or harmful effects for the environment. Following risk policies for chemical installations as expressed in the mandatory nature of External Safety Reports (EVRs) or, e.g., the publication 'How to deal with risks', probabilistic risk analyses are required for nuclear power plants

  4. Mass separated neutral particle energy analyser

    International Nuclear Information System (INIS)

    Takeuchi, Hiroshi; Matsuda, Toshiaki; Miura, Yukitoshi; Shiho, Makoto; Maeda, Hikosuke; Hashimoto, Kiyoshi; Hayashi, Kazuo.

    1983-09-01

    A mass separated neutral particle energy analyser which could simultaneously measure hydrogen and deuterium atoms emitted from a tokamak plasma was constructed. The analyser was calibrated for energy and mass separation in the energy range from 0.4 keV to 9 keV. In order to investigate the behavior of deuterons and protons in the JFT-2 tokamak plasma heated with ion cyclotron waves and neutral beam injection, this analyser was installed in the JFT-2 tokamak. It was found that the energy spectrum could be determined with sufficient accuracy. The ion temperature and the ratio of deuteron to proton density obtained from the energy spectrum were in good agreement with the values deduced from the Doppler broadening of the TiXIV line and from the line intensities of Hα and Dα, respectively. (author)

  5. Advanced toroidal facility vacuum vessel stress analyses

    International Nuclear Information System (INIS)

    Hammonds, C.J.; Mayhall, J.A.

    1987-01-01

    The complex geometry of the Advanced Toroidal Facility (ATF) vacuum vessel required special analysis techniques in investigating the structural behavior of the design. The response of a large-scale finite element model was found for transportation and operational loading. Several computer codes and systems, including the National Magnetic Fusion Energy Computer Center Cray machines, were implemented in accomplishing these analyses. The work combined complex methods that taxed the limits of both the codes and the computer systems involved. Using MSC/NASTRAN cyclic-symmetry solutions permitted using only 1/12 of the vessel geometry to mathematically analyze the entire vessel. This allowed the greater detail and accuracy demanded by the complex geometry of the vessel. Critical buckling-pressure analyses were performed with the same model. The development, results, and problems encountered in performing these analyses are described. 5 refs., 3 figs

  6. Thermal and stress analyses with ANSYS program

    International Nuclear Information System (INIS)

    Kanoo, Iwao; Kawaguchi, Osamu; Asakura, Junichi.

    1975-03-01

    Some analyses of heat conduction and elastic/inelastic stresses, carried out at the Power Reactor and Nuclear Fuel Development Corporation (PNC) in fiscal 1973 using the ANSYS (Engineering Analysis System) program, are summarized. In chapter I, the present state of structural analysis programs available for a FBR (fast breeder reactor) in PNC is explained. Chapter II is a brief description of the current status of ANSYS. In chapter III, 8 examples of steady-state and transient thermal analyses for fast-reactor plant components are presented, and in chapter IV, 5 examples of inelastic structural analysis. With advances in the field of the finite element method, its applications in design studies should extend progressively in the future. It is hoped that the present report will serve as a reference in similar analyses and at the same time help readers understand the deformation and strain behaviors of structures. (Mori, K.)

  7. Periodic safety analyses; Les essais periodiques

    Energy Technology Data Exchange (ETDEWEB)

    Gouffon, A; Zermizoglou, R

    1990-12-01

    The IAEA Safety Guide 50-SG-S8, devoted to 'Safety Aspects of Foundations of Nuclear Power Plants', indicates that the operator of an NPP should establish a program for inspection of safe operation during construction, start-up and the service life of the plant, in order to obtain the data needed for estimating the lifetime of structures and components. At the same time, the program should ensure that the safety margins are appropriate. Periodic safety analyses are an important part of the safety inspection program. Periodic safety reporting is a method for testing the whole safety system, or a part of it, against precise criteria. Periodic safety analyses are not meant for qualification of the plant components. Separate analyses are devoted to: start-up, qualification of components and materials, and aging. All these analyses are described in this presentation. The last chapter describes the experience obtained for PWR-900 and PWR-1300 units from 1986-1989.

  8. A Simple, Reliable Precision Time Analyser

    Energy Technology Data Exchange (ETDEWEB)

    Joshi, B. V.; Nargundkar, V. R.; Subbarao, K.; Kamath, M. S.; Eligar, S. K. [Atomic Energy Establishment Trombay, Bombay (India)

    1966-06-15

    A 30-channel time analyser is described. The time analyser was designed and built for pulsed neutron research but can be applied to other uses. Most of the logic is performed by means of ferrite memory core and transistor switching circuits. This leads to great versatility, low power consumption, extreme reliability and low cost. The analyser described provides channel widths from 10 µs to 10 ms; arbitrarily wider channels are easily obtainable. It can handle counting rates up to 2000 counts/min in each channel with less than 1% dead time loss. There is a provision for an initial delay equal to 100 channel widths. An input pulse de-randomizer unit using tunnel diodes ensures exactly equal channel widths. A brief description of the principles involved in core switching circuitry is given. The core-transistor transfer loop is compared with the usual core-diode loops and is shown to be more versatile and better adapted to the making of a time analyser. The circuits derived from the basic loop are described. These include the scale of ten, the frequency dividers and the delay generator. The current drivers developed for driving the cores are described. The crystal-controlled clock which controls the width of the time channels and synchronizes the operation of the various circuits is described. The detector pulse de-randomizer unit using tunnel diodes is described. The scheme of the time analyser is then described showing how the various circuits can be integrated together to form a versatile time analyser. (author)
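
    In software terms, the analyser's core function amounts to histogramming event arrival times into fixed-width channels after an initial delay. The sketch below takes the parameters quoted in the description (30 channels, 10 µs channel width, delay of 100 channel widths) but uses synthetic arrival times; it only illustrates the binning, not the ferrite-core hardware.

        import numpy as np

        channel_width = 10e-6                 # 10 microseconds per channel
        n_channels = 30
        initial_delay = 100 * channel_width   # initial delay of 100 channel widths

        # Hypothetical detector pulse arrival times (seconds after the trigger).
        rng = np.random.default_rng(0)
        arrival_times = rng.uniform(0, 2e-3, size=5000)

        edges = initial_delay + channel_width * np.arange(n_channels + 1)
        counts, _ = np.histogram(arrival_times, bins=edges)
        print(counts)                         # counts per time channel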

  9. Fundamental data analyses for measurement control

    International Nuclear Information System (INIS)

    Campbell, K.; Barlich, G.L.; Fazal, B.; Strittmatter, R.B.

    1987-02-01

    A set of measurement control data analyses was selected for use by analysts responsible for maintaining the measurement quality of nuclear materials accounting instrumentation. The analyses consist of control charts for bias and precision and statistical tests used as analytic supplements to the control charts. They provide the desired detection sensitivity and yet can be interpreted locally, quickly, and easily. The control charts provide for visual inspection of data and enable an alert reviewer to spot problems, possibly before statistical tests detect them. The statistical tests are useful for automating the detection of departures from the controlled state or from the underlying assumptions (such as normality). 8 refs., 3 figs., 5 tabs
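
    As a rough illustration of the control-chart part of such a scheme, the sketch below sets 3-sigma limits for instrument bias from a calibration period and flags later measurements that fall outside them. The data, the 3-sigma rule and the limits are generic illustrations, not the report's actual criteria.

        import numpy as np

        rng = np.random.default_rng(1)
        calibration = rng.normal(0.0, 0.05, size=50)      # bias measurements, control period
        mean, sigma = calibration.mean(), calibration.std(ddof=1)
        ucl, lcl = mean + 3 * sigma, mean - 3 * sigma     # 3-sigma control limits

        new_data = np.array([0.02, -0.04, 0.01, 0.21, -0.03])   # later measurements
        for i, x in enumerate(new_data):
            status = "ok" if lcl <= x <= ucl else "OUT OF CONTROL"
            print(f"measurement {i}: {x:+.3f}  {status}")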

  10. A theoretical framework for analysing preschool teaching

    DEFF Research Database (Denmark)

    Chaiklin, Seth

    2014-01-01

    This article introduces a theoretical framework for analysing preschool teaching as a historically-grounded societal practice. The aim is to present a unified framework that can be used to analyse and compare both historical and contemporary examples of preschool teaching practice within and across...... national traditions. The framework has two main components, an analysis of preschool teaching as a practice, formed in relation to societal needs, and an analysis of the categorical relations which necessarily must be addressed in preschool teaching activity. The framework is introduced and illustrated...

  11. Power System Oscillatory Behaviors: Sources, Characteristics, & Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Follum, James D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Tuffner, Francis K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dosiek, Luke A. [Union College, Schenectady, NY (United States); Pierre, John W. [Univ. of Wyoming, Laramie, WY (United States)

    2017-05-17

    This document is intended to provide a broad overview of the sources, characteristics, and analyses of natural and forced oscillatory behaviors in power systems. These aspects are necessarily linked. Oscillations appear in measurements with distinguishing characteristics derived from the oscillation’s source. These characteristics determine which analysis methods can be appropriately applied, and the results from these analyses can only be interpreted correctly with an understanding of the oscillation’s origin. To describe oscillations both at their source within a physical power system and within measurements, a perspective from the boundary between power system and signal processing theory has been adopted.
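
    A minimal, hypothetical sketch of one common measurement-side analysis implied here: estimating a power spectral density from a synchrophasor-like signal and locating the dominant oscillation frequency. It assumes SciPy is available, uses a synthetic 0.7 Hz inter-area-style oscillation, and stands in for the much richer mode-estimation methods the report surveys.

        import numpy as np
        from scipy.signal import welch

        fs = 30.0                                  # phasor measurement rate, samples/s
        t = np.arange(0, 120, 1 / fs)
        rng = np.random.default_rng(2)
        # Synthetic signal: a 0.7 Hz oscillation buried in measurement noise.
        signal = 0.02 * np.sin(2 * np.pi * 0.7 * t) + 0.01 * rng.standard_normal(t.size)

        f, pxx = welch(signal, fs=fs, nperseg=1024)   # Welch PSD estimate
        print(f"dominant oscillation near {f[np.argmax(pxx)]:.2f} Hz")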

  12. 10 CFR 61.13 - Technical analyses.

    Science.gov (United States)

    2010-01-01

    ... air, soil, groundwater, surface water, plant uptake, and exhumation by burrowing animals. The analyses... processes such as erosion, mass wasting, slope failure, settlement of wastes and backfill, infiltration through covers over disposal areas and adjacent soils, and surface drainage of the disposal site. The...

  13. Analysing Simple Electric Motors in the Classroom

    Science.gov (United States)

    Yap, Jeff; MacIsaac, Dan

    2006-01-01

    Electromagnetic phenomena and devices such as motors are typically unfamiliar to both teachers and students. To better visualize and illustrate the abstract concepts (such as magnetic fields) underlying electricity and magnetism, we suggest that students construct and analyse the operation of a simply constructed Johnson electric motor. In this…

  14. En kvantitativ metode til analyse af radio

    Directory of Open Access Journals (Sweden)

    Christine Lejre

    2014-06-01

    Full Text Available In the Danish as well as the international radio literature, proposed methods for analysing the radio medium are sparse. This is presumably because the radio medium is difficult to analyse, since it is a medium that is not visualised in the form of images or supported by printed text. The purpose of this article is to describe a new quantitative method for the analysis of radio that takes particular account of the radio medium's modality – sound structured as a linear course in time. The method thus supports the radio medium both as a medium in time and as a blind medium. The method was developed in connection with a comparative analysis of cultural programmes on P1 and Radio24syv carried out for Danmarks Radio. The article points out that the method is well suited to analysing not only radio, but also other media platforms as well as various journalistic subject areas.

  15. Analysing User Lifetime in Voluntary Online Collaboration

    DEFF Research Database (Denmark)

    McHugh, Ronan; Larsen, Birger

    2010-01-01

    This paper analyses persuasion in online collaboration projects. It introduces a set of heuristics that can be applied to such projects and combines these with a quantitative analysis of user activity over time. Two example sites are studied, Open Street Map and The Pirate Bay. Results show that ...

  16. Analyses of hydraulic performance of velocity caps

    DEFF Research Database (Denmark)

    Christensen, Erik Damgaard; Degn Eskesen, Mark Chr.; Buhrkall, Jeppe

    2014-01-01

    The hydraulic performance of a velocity cap has been investigated. Velocity caps are often used in connection with offshore intakes. CFD (computational fluid dynamics) examined the flow through the cap openings and further down into the intake pipes. This was combined with dimension analyses...

  17. Quantitative analyses of shrinkage characteristics of neem ...

    African Journals Online (AJOL)

    Quantitative analyses of shrinkage characteristics of neem (Azadirachta indica A. Juss.) wood were carried out. Forty five wood specimens were prepared from the three ecological zones of north eastern Nigeria, viz: sahel savanna, sudan savanna and guinea savanna for the research. The results indicated that the wood ...

  18. UMTS signal measurements with digital spectrum analysers

    International Nuclear Information System (INIS)

    Licitra, G.; Palazzuoli, D.; Ricci, A. S.; Silvi, A. M.

    2004-01-01

    The launch of the Universal Mobile Telecommunications System (UMTS), the most recent mobile telecommunications standard, has imposed the requirement of updating measurement instrumentation and methodologies. In order to define the most reliable measurement procedure, which is aimed at assessing exposure to electromagnetic fields, modern spectrum analysers' features for correct signal characterisation have been reviewed. (authors)

  19. Hybrid Logical Analyses of the Ambient Calculus

    DEFF Research Database (Denmark)

    Bolander, Thomas; Hansen, Rene Rydhof

    2010-01-01

    In this paper, hybrid logic is used to formulate three control flow analyses for Mobile Ambients, a process calculus designed for modelling mobility. We show that hybrid logic is very well-suited to express the semantic structure of the ambient calculus and how features of hybrid logic can...

  20. Micromechanical photothermal analyser of microfluidic samples

    DEFF Research Database (Denmark)

    2014-01-01

    The present invention relates to a micromechanical photothermal analyser of microfluidic samples comprising an oblong micro-channel extending longitudinally from a support element, the micro-channel is made from at least two materials with different thermal expansion coefficients, wherein...

  1. Systematic review and meta-analyses

    DEFF Research Database (Denmark)

    Dreier, Julie Werenberg; Andersen, Anne-Marie Nybo; Berg-Beckhoff, Gabriele

    2014-01-01

    1990 were excluded. RESULTS: The available literature supported an increased risk of adverse offspring health in association with fever during pregnancy. The strongest evidence was available for neural tube defects, congenital heart defects, and oral clefts, in which meta-analyses suggested between a 1...

  2. Secundaire analyses organisatiebeleid psychosociale arbeidsbelasting (PSA)

    NARCIS (Netherlands)

    Kraan, K.O.; Houtman, I.L.D.

    2016-01-01

    What organisational policy on psychosocial workload (PSA) looks like in 2014, and how it relates to other policies and outcome measures, are the central questions in this study. The results of these in-depth analyses can benefit the ongoing campaign 'Check je

  3. Exergoeconomic and environmental analyses of CO

    NARCIS (Netherlands)

    Mosaffa, A. H.; Garousi Farshi, L; Infante Ferreira, C.A.; Rosen, M. A.

    2016-01-01

    Exergoeconomic and environmental analyses are presented for two CO2/NH3 cascade refrigeration systems equipped with (1) two flash tanks and (2) a flash tank along with a flash intercooler with indirect subcooler. A comparative study is performed for the proposed systems, and

  4. Meta-analyses on viral hepatitis

    DEFF Research Database (Denmark)

    Gluud, Lise L; Gluud, Christian

    2009-01-01

    This article summarizes the meta-analyses of interventions for viral hepatitis A, B, and C. Some of the interventions assessed are described in small trials with unclear bias control. Other interventions are supported by large, high-quality trials. Although attempts have been made to adjust...

  5. Multivariate differential analyses of adolescents' experiences of ...

    African Journals Online (AJOL)

    Aggression is reasoned to be dependent on aspects such as self-concept, moral reasoning, communication, frustration tolerance and family relationships. To analyse the data from questionnaires of 101 families (95 adolescents, 95 mothers and 91 fathers) Cronbach Alpha, various consecutive first and second order factor ...

  6. Chromosomal evolution and phylogenetic analyses in Tayassu ...

    Indian Academy of Sciences (India)

    Chromosome preparation and karyotype description. The material analysed consists of chromosome preparations of the tayassuid species T. pecari (three individuals) and. P. tajacu (four individuals) and were made from short-term lymphocyte cultures of whole blood samples using standard protocols (Chaves et al. 2002).

  7. Grey literature in meta-analyses.

    Science.gov (United States)

    Conn, Vicki S; Valentine, Jeffrey C; Cooper, Harris M; Rantz, Marilyn J

    2003-01-01

    In meta-analysis, researchers combine the results of individual studies to arrive at cumulative conclusions. Meta-analysts sometimes include "grey literature" in their evidential base, which includes unpublished studies and studies published outside widely available journals. Because grey literature is a source of data that might not employ peer review, critics have questioned the validity of its data and the results of meta-analyses that include it. To examine evidence regarding whether grey literature should be included in meta-analyses and strategies to manage grey literature in quantitative synthesis. This article reviews evidence on whether the results of studies published in peer-reviewed journals are representative of results from broader samplings of research on a topic as a rationale for inclusion of grey literature. Strategies to enhance access to grey literature are addressed. The most consistent and robust difference between published and grey literature is that published research is more likely to contain results that are statistically significant. Effect size estimates of published research are about one-third larger than those of unpublished studies. Unfunded and small sample studies are less likely to be published. Yet, importantly, methodological rigor does not differ between published and grey literature. Meta-analyses that exclude grey literature likely (a) over-represent studies with statistically significant findings, (b) inflate effect size estimates, and (c) provide less precise effect size estimates than meta-analyses including grey literature. Meta-analyses should include grey literature to fully reflect the existing evidential base and should assess the impact of methodological variations through moderator analysis.
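
    The size of the effect described here (published estimates roughly one-third larger than grey-literature estimates) can be made concrete with a small inverse-variance pooling sketch. The effect sizes and variances below are entirely hypothetical, and the fixed-effect pooling shown is only one of several models a real meta-analysis might use.

        import numpy as np

        def pool_fixed_effect(effects, variances):
            """Inverse-variance (fixed-effect) pooled estimate and its standard error."""
            w = 1.0 / np.asarray(variances)
            est = np.sum(w * np.asarray(effects)) / np.sum(w)
            return est, np.sqrt(1.0 / np.sum(w))

        # Hypothetical standardized mean differences and their variances.
        published = ([0.45, 0.50, 0.38], [0.02, 0.03, 0.025])
        grey      = ([0.30, 0.28, 0.35], [0.04, 0.05, 0.03])

        for label, (d, v) in [("published only", published), ("grey only", grey),
                              ("all studies", (published[0] + grey[0], published[1] + grey[1]))]:
            est, se = pool_fixed_effect(d, v)
            print(f"{label:15s} pooled d = {est:.2f} (SE {se:.2f})")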

  8. Thermal analyses. Information on the expected baking process; Thermische analyses. Informatie over een te verwachten bakgedrag

    Energy Technology Data Exchange (ETDEWEB)

    Van Wijck, H. [Stichting Technisch Centrum voor de Keramische Industrie TCKI, Velp (Netherlands)

    2009-09-01

    The design process and the drying process for architectural ceramics and pottery partly determine the characteristics of the final product, but the largest changes occur during the baking process. An overview is provided of the different thermal analyses and of how the information from these analyses can predict the behaviour to be expected in practice. (mk)

  9. Analyses and characterization of double shell tank

    Energy Technology Data Exchange (ETDEWEB)

    1994-10-04

    Evaporator candidate feed from tank 241-AP-108 (108-AP) was sampled under prescribed protocol. Physical, inorganic, and radiochemical analyses were performed on tank 108-AP. Characterization of evaporator feed tank waste is needed primarily for an evaluation of its suitability to be safely processed through the evaporator. Such analyses should provide sufficient information regarding the waste composition to confidently determine whether constituent concentrations are not only within safe operating limits but also relevant to the functional limits for operation of the evaporator. Characterization of tank constituent concentrations should provide data which enable a prediction of where the types and amounts of environmentally hazardous waste are likely to occur in the evaporator product streams.

  10. DCH analyses using the CONTAIN code

    International Nuclear Information System (INIS)

    Hong, Sung Wan; Kim, Hee Dong

    1996-08-01

    This report describes CONTAIN analyses performed during participation in the project 'DCH issue resolution for ice condenser plants', which is sponsored by the NRC at SNL. Even though the calculations were performed for the ice condenser plant, the CONTAIN code has been used for analyses of many phenomena in the PWR containment, and the DCH module can be applied to any plant type. The present ice condenser issue resolution effort is intended to provide guidance as to what might be needed to resolve DCH for ice condenser plants. It includes both a screening analysis and a scoping study if the screening analysis cannot provide a complete resolution. The following are the results concerning DCH loads, in descending order: 1. Availability of ignition sources prior to vessel breach; 2. Availability and effectiveness of ice in the ice condenser; 3. Loads modeling uncertainties related to co-ejected RPV water; 4. Other loads modeling uncertainties. 10 tabs., 3 figs., 14 refs. (Author)

  11. DCH analyses using the CONTAIN code

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Sung Wan; Kim, Hee Dong [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1996-08-01

    This report describes CONTAIN analyses performed during participation in the project 'DCH issue resolution for ice condenser plants', which is sponsored by the NRC at SNL. Even though the calculations were performed for the ice condenser plant, the CONTAIN code has been used for analyses of many phenomena in the PWR containment, and the DCH module can be applied to any plant type. The present ice condenser issue resolution effort is intended to provide guidance as to what might be needed to resolve DCH for ice condenser plants. It includes both a screening analysis and a scoping study if the screening analysis cannot provide a complete resolution. The following are the results concerning DCH loads, in descending order: 1. Availability of ignition sources prior to vessel breach; 2. Availability and effectiveness of ice in the ice condenser; 3. Loads modeling uncertainties related to co-ejected RPV water; 4. Other loads modeling uncertainties. 10 tabs., 3 figs., 14 refs. (Author).

  12. Analyses and characterization of double shell tank

    International Nuclear Information System (INIS)

    1994-01-01

    Evaporator candidate feed from tank 241-AP-108 (108-AP) was sampled under prescribed protocol. Physical, inorganic, and radiochemical analyses were performed on tank 108-AP. Characterization of evaporator feed tank waste is needed primarily for an evaluation of its suitability to be safely processed through the evaporator. Such analyses should provide sufficient information regarding the waste composition to confidently determine whether constituent concentrations are not only within safe operating limits but also relevant to the functional limits for operation of the evaporator. Characterization of tank constituent concentrations should provide data which enable a prediction of where the types and amounts of environmentally hazardous waste are likely to occur in the evaporator product streams

  13. Soil analyses by ICP-MS (Review)

    International Nuclear Information System (INIS)

    Yamasaki, Shin-ichi

    2000-01-01

    Soil analyses by inductively coupled plasma mass spectrometry (ICP-MS) are reviewed. The first half of the paper is devoted to the development of techniques applicable to soil analyses, where diverse analytical parameters are carefully evaluated. However, the choice of soil samples is somewhat arbitrary, and only a limited number of samples (mostly reference materials) are examined. In the second half, efforts are mostly concentrated on the introduction of reports, where a large number of samples and/or very precious samples have been analyzed. Although the analytical techniques used in these reports are not necessarily novel, valuable information concerning such topics as background levels of elements in soils, chemical forms of elements in soils and behavior of elements in soil ecosystems and the environment can be obtained. The major topics discussed are total elemental analysis, analysis of radionuclides with long half-lives, speciation, leaching techniques, and isotope ratio measurements. (author)

  14. Sorption analyses in materials science: selected oxides

    International Nuclear Information System (INIS)

    Fuller, E.L. Jr.; Condon, J.B.; Eager, M.H.; Jones, L.L.

    1981-01-01

    Physical adsorption studies have been shown to be extremely valuable in studying the chemistry and structure of dispersed materials. Many processes rely on the access to the large amount of surface made available by the high degree of dispersion. Conversely, there are many applications where consolidation of the dispersed solids is required. Several systems (silica gel, alumina catalysts, mineralogic alumino-silicates, and yttrium oxide plasters) have been studied to show the type and amount of chemical and structural information that can be obtained. Some review of current theories is given and additional concepts are developed based on statistical and thermodynamic arguments. The results are applied to sorption data to show that detailed sorption analyses are extremely useful and can provide valuable information that is difficult to obtain by any other means. Considerable emphasis has been placed on data analyses and interpretation of a nonclassical nature to show the potential of such studies, which is often neither recognized nor utilized

  15. Standardized analyses of nuclear shipping containers

    International Nuclear Information System (INIS)

    Parks, C.V.; Hermann, O.W.; Petrie, L.M.; Hoffman, T.J.; Tang, J.S.; Landers, N.F.; Turner, W.D.

    1983-01-01

    This paper describes improved capabilities for analyses of nuclear fuel shipping containers within SCALE -- a modular code system for Standardized Computer Analyses for Licensing Evaluation. Criticality analysis improvements include the new KENO V, a code which contains an enhanced geometry package, and a new control module which uses KENO V and allows a criticality search on optimum pitch (maximum k-effective) to be performed. The SAS2 sequence is a new shielding analysis module which couples fuel burnup, source term generation, and radial cask shielding. The SAS5 shielding sequence allows a multidimensional Monte Carlo analysis of a shipping cask with code-generated biasing of the particle histories. The thermal analysis sequence (HTAS1) provides an easy-to-use tool for evaluating a shipping cask response to accident conditions. These improvements extend the capability of the SCALE system to provide the cask designer or evaluator with a computational system offering automated procedures and easy-to-understand input that leads to standardization

  16. Quantitative Analyse und Visualisierung der Herzfunktionen

    Science.gov (United States)

    Sauer, Anne; Schwarz, Tobias; Engel, Nicole; Seitel, Mathias; Kenngott, Hannes; Mohrhardt, Carsten; Loßnitzer, Dirk; Giannitsis, Evangelos; Katus, Hugo A.; Meinzer, Hans-Peter

    Computer-assisted, image-based analysis of cardiac function is now standard in cardiology. The available products usually require a high degree of user interaction and therefore increased time expenditure. In this work an approach is presented that enables the cardiologist to perform a largely automatic analysis of cardiac function from MRI image data, thereby saving time. All relevant cardio-physiological parameters are computed and visualised using diagrams and graphs. These computations are evaluated by comparing the determined values with manually measured ones. The mean error computed here, 2.85 mm for wall thickness and 1.61 mm for wall thickening, is still within the range of one pixel size of the images used.

  17. Exergetic and thermoeconomic analyses of power plants

    International Nuclear Information System (INIS)

    Kwak, H.-Y.; Kim, D.-J.; Jeon, J.-S.

    2003-01-01

    Exergetic and thermoeconomic analyses were performed for a 500-MW combined cycle plant. In these analyses, mass and energy conservation laws were applied to each component of the system. Quantitative balances of the exergy and exergetic cost for each component, and for the whole system, were carefully considered. The exergoeconomic model, which represented the productive structure of the system considered, was used to visualize the cost formation process and the productive interaction between components. The computer program developed in this study can determine the production costs of power plants, such as gas- and steam-turbine plants and gas-turbine cogeneration plants. The program can also be used to study plant characteristics, namely, thermodynamic performance and sensitivity to changes in process and/or component design variables
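
    As a toy illustration of the component-level balances mentioned above, the sketch below computes exergy destruction and exergetic efficiency for a few hypothetical components from their fuel and product exergy flows. The component names and figures are invented and do not come from the 500-MW plant studied in the paper.

        # Hypothetical fuel and product exergy flows per component, in MW.
        components = {
            "combustor":   {"fuel_exergy": 800.0, "product_exergy": 560.0},
            "gas_turbine": {"fuel_exergy": 540.0, "product_exergy": 500.0},
            "HRSG":        {"fuel_exergy": 260.0, "product_exergy": 200.0},
        }

        for name, c in components.items():
            destruction = c["fuel_exergy"] - c["product_exergy"]   # E_D = E_F - E_P
            efficiency = c["product_exergy"] / c["fuel_exergy"]    # epsilon = E_P / E_F
            print(f"{name:12s} E_D = {destruction:6.1f} MW   efficiency = {efficiency:.1%}")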

  18. Pratique de l'analyse fonctionelle

    CERN Document Server

    Tassinari, Robert

    1997-01-01

    Developing a product or service that is perfectly adapted to the needs and requirements of the client is essential for the company. To leave nothing to chance, a rigorous methodology must be followed: that of functional analysis. This book precisely defines this method as well as its fields of application. It describes the most effective methods in terms of product design and the pursuit of quality, and introduces the notion of internal functional analysis. A key work for optimising product design processes in one's company. -- Key ideas, by Business Digest

  19. Kinetic stability analyses in a bumpy cylinder

    International Nuclear Information System (INIS)

    Dominguez, R.R.; Berk, H.L.

    1981-01-01

    Recent interest in the ELMO Bumpy Torus (EBT) has prompted a number of stability analyses of both the hot electron rings and the toroidal plasma. Typically these works employ the local approximation, neglecting radial eigenmode structure and ballooning effects to perform the stability analysis. In the present work we develop a fully kinetic formalism for performing nonlocal stability analyses in a bumpy cylinder. We show that the Vlasov-Maxwell integral equations (with one ignorable coordinate) are self-adjoint and hence amenable to analysis using numerical techniques developed for self-adjoint systems of equations. The representation we obtain for the kernel of the Vlasov-Maxwell equations is a differential operator of arbitrarily high order. This form leads to a manifestly self-adjoint system of differential equations for long wavelength modes

  20. Sectorial Group for Incident Analyses (GSAI)

    International Nuclear Information System (INIS)

    Galles, Q.; Gamo, J. M.; Jorda, M.; Sanchez-Garrido, P.; Lopez, F.; Asensio, L.; Reig, J.

    2013-01-01

    In 2008, the UNESA Nuclear Energy Committee (CEN) proposed the creation of a working group formed by experts from all Spanish NPPs with the purpose of jointly analyzing relevant incidents that occurred in each of the plants. This initiative was a response to a historical situation in which the exchange of information on incidents between the Spanish NPPs was below the desired level. In June 2009, UNESA's Guide CEN-29 established the performance criteria for the so-called Sectorial Group for Incident Analyses (GSAI), whose activity would be coordinated by UNESA's Group of Operating Experience, under the Operations Commission (COP). (Author)

  1. Analyses of cavitation instabilities in ductile metals

    DEFF Research Database (Denmark)

    Tvergaard, Viggo

    2007-01-01

    Cavitation instabilities have been predicted for a single void in a ductile metal stressed under high triaxiality conditions. In experiments for a ceramic reinforced by metal particles a single dominant void has been observed on the fracture surface of some of the metal particles bridging a crack......, and also tests for a thin ductile metal layer bonding two ceramic blocks have indicated rapid void growth. Analyses for these material configurations are discussed here. When the void radius is very small, a nonlocal plasticity model is needed to account for observed size-effects, and recent analyses......, while the surrounding voids are represented by a porous ductile material model in terms of a field quantity that specifies the variation of the void volume fraction in the surrounding metal....

  2. Analysing organic transistors based on interface approximation

    International Nuclear Information System (INIS)

    Akiyama, Yuto; Mori, Takehiko

    2014-01-01

    Temperature-dependent characteristics of organic transistors are analysed thoroughly using interface approximation. In contrast to amorphous silicon transistors, it is characteristic of organic transistors that the accumulation layer is concentrated on the first monolayer, and it is appropriate to consider interface charge rather than band bending. On the basis of this model, observed characteristics of hexamethylenetetrathiafulvalene (HMTTF) and dibenzotetrathiafulvalene (DBTTF) transistors with various surface treatments are analysed, and the trap distribution is extracted. In turn, starting from a simple exponential distribution, we can reproduce the temperature-dependent transistor characteristics as well as the gate voltage dependence of the activation energy, so we can investigate various aspects of organic transistors self-consistently under the interface approximation. Small deviation from such an ideal transistor operation is discussed assuming the presence of an energetically discrete trap level, which leads to a hump in the transfer characteristics. The contact resistance is estimated by measuring the transfer characteristics up to the linear region
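
    One step implied by this kind of analysis, extracting an activation energy from temperature-dependent transistor currents, can be sketched as a simple Arrhenius fit. The data below are synthetic and the straight-line fit is a generic stand-in, not the paper's interface-approximation treatment or its trap-distribution extraction.

        import numpy as np

        k_B = 8.617e-5                          # Boltzmann constant, eV/K
        T = np.array([200.0, 230.0, 260.0, 290.0, 320.0])   # temperatures, K
        Ea_true, I0 = 0.15, 1e-6                # synthetic activation energy (eV) and prefactor
        current = I0 * np.exp(-Ea_true / (k_B * T))          # idealized drain current, A

        # Arrhenius fit: ln(I) = ln(I0) - Ea / (k_B * T), slope = -Ea / k_B
        slope, intercept = np.polyfit(1.0 / T, np.log(current), 1)
        print(f"extracted activation energy: {-slope * k_B:.3f} eV")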

  3. New environmental metabarcodes for analysing soil DNA

    DEFF Research Database (Denmark)

    Epp, Laura S.; Boessenkool, Sanne; Bellemain, Eva P.

    2012-01-01

    was systematically evaluated by (i) in silico PCRs using all standard sequences in the EMBL public database as templates, (ii) in vitro PCRs of DNA extracts from surface soil samples from a site in Varanger, northern Norway and (iii) in vitro PCRs of DNA extracts from permanently frozen sediment samples of late......Metabarcoding approaches use total and typically degraded DNA from environmental samples to analyse biotic assemblages and can potentially be carried out for any kinds of organisms in an ecosystem. These analyses rely on specific markers, here called metabarcodes, which should be optimized...... for taxonomic resolution, minimal bias in amplification of the target organism group and short sequence length. Using bioinformatic tools, we developed metabarcodes for several groups of organisms: fungi, bryophytes, enchytraeids, beetles and birds. The ability of these metabarcodes to amplify the target groups...

  4. Visuelle Analyse von E-mail-Verkehr

    OpenAIRE

    Mansmann, Florian

    2003-01-01

    This thesis describes methods for the visual geographic analysis of e-mail traffic. Host addresses and IP addresses can be filtered out of an e-mail's header. Using a database, these host and IP addresses are assigned geographic coordinates. A visualisation presents several thousand e-mail routes in a clear way. In addition, interactive manipulation options were presented which allow a visual exploration of the data...

  5. BWR core melt progression phenomena: Experimental analyses

    International Nuclear Information System (INIS)

    Ott, L.J.

    1992-01-01

    In the BWR Core Melt Progression Phenomena Program, experimental results concerning severe fuel damage and core melt progression in BWR core geometry are used to evaluate existing models of the governing phenomena. These include control blade eutectic liquefaction and the subsequent relocation and attack on the channel box structure; oxidation heating and hydrogen generation; Zircaloy melting and relocation; and the continuing oxidation of zirconium with metallic blockage formation. Integral data have been obtained from the BWR DF-4 experiment in the ACRR and from BWR tests in the German CORA ex-reactor fuel-damage test facility. Additional integral data will be obtained from a new CORA BWR test, the full-length FLHT-6 BWR test in the NRU test reactor, and the new program of ex-reactor experiments at Sandia National Laboratories (SNL) on metallic melt relocation and blockage formation. An essential part of this activity is interpretation and use of the results of the BWR tests. The Oak Ridge National Laboratory (ORNL) has developed experiment-specific models for analysis of the BWR experiments; to date, these models have permitted far more precise analyses of the conditions in these experiments than has previously been available. These analyses have provided a basis for more accurate interpretation of the phenomena that the experiments are intended to investigate. The results of posttest analyses of BWR experiments are discussed and significant findings from these analyses are explained. The ORNL control blade/canister models with materials interaction, relocation and blockage models are currently being implemented in SCDAP/RELAP5 as an optional structural component

  6. En Billig GPS Data Analyse Platform

    DEFF Research Database (Denmark)

    Andersen, Ove; Christiansen, Nick; Larsen, Niels T.

    2011-01-01

    This article presents a complete software platform for the analysis of GPS data. The platform is built exclusively from open-source components. The individual components of the platform are described in detail. Advantages and disadvantages of using open source are discussed, including which IT policy measures ...... organisations with a digital road map and GPS data begin to carry out traffic analyses on these data. It is a requirement that suitable IT competencies are present in the organisation.

  7. Neuronal network analyses: premises, promises and uncertainties

    OpenAIRE

    Parker, David

    2010-01-01

    Neuronal networks assemble the cellular components needed for sensory, motor and cognitive functions. Any rational intervention in the nervous system will thus require an understanding of network function. Obtaining this understanding is widely considered to be one of the major tasks facing neuroscience today. Network analyses have been performed for some years in relatively simple systems. In addition to the direct insights these systems have provided, they also illustrate some of the diffic...

  8. Modelling and analysing oriented fibrous structures

    International Nuclear Information System (INIS)

    Rantala, M; Lassas, M; Siltanen, S; Sampo, J; Takalo, J; Timonen, J

    2014-01-01

    A mathematical model for fibrous structures using a direction dependent scaling law is presented. The orientation of fibrous nets (e.g. paper) is analysed with a method based on the curvelet transform. The curvelet-based orientation analysis has been tested successfully on real data from paper samples: the major directions of fibre orientation can apparently be recovered. Similar results are achieved in tests on data simulated by the new model, allowing a comparison with ground truth

  9. Kinematic gait analyses in healthy Golden Retrievers

    OpenAIRE

    Silva, Gabriela C.A.; Cardoso, Mariana Trés; Gaiad, Thais P.; Brolio, Marina P.; Oliveira, Vanessa C.; Assis Neto, Antonio; Martins, Daniele S.; Ambrósio, Carlos E.

    2014-01-01

    Kinematic analysis relates to the relative movement between rigid bodies and finds application in gait analysis and other body movements; interpretation of these data, when there is a change, determines the choice of treatment to be instituted. The objective of this study was to standardize the gait of healthy Golden Retriever dogs to assist in the diagnosis and treatment of musculoskeletal disorders. We used a kinematic analysis system to analyse the gait of seven female Golden Retriever dogs,...

  10. Evaluation of periodic safety status analyses

    International Nuclear Information System (INIS)

    Faber, C.; Staub, G.

    1997-01-01

    In order to carry out the evaluation of safety status analyses by the safety assessor within the periodic safety reviews of nuclear power plants, safety-goal-oriented requirements have been formulated together with complementary evaluation criteria. Their application in an interdisciplinary cooperation covering the subject areas involved facilitates a complete safety-goal-oriented assessment of the plant status. The procedure is outlined briefly by an example for the safety goal 'reactivity control' for BWRs. (orig.) [de

  11. Application of RUNTA code in flood analyses

    International Nuclear Information System (INIS)

    Perez Martin, F.; Benitez Fonzalez, F.

    1994-01-01

    Flood probability analyses carried out to date indicate the need to evaluate a large number of flood scenarios. This necessity is due to a variety of reasons, the most important of which include:
    - Large number of potential flood sources
    - Wide variety of characteristics of flood sources
    - Large possibility of flood-affected areas becoming interlinked, depending on the location of the potential flood sources
    - Diversity of flood flows from one flood source, depending on the size of the rupture and mode of operation
    - Isolation times applicable
    - Uncertainties in respect of the structural resistance of doors, penetration seals and floors
    - Applicable degrees of obstruction of the floor drainage system
    Consequently, a tool which carries out the large number of calculations usually required in flood analyses, with speed and flexibility, is considered necessary. The RUNTA code enables the range of possible scenarios to be calculated numerically, in accordance with all those parameters which, as a result of previous flood analyses, it is necessary to take into account in order to cover all the possible floods associated with each flood area
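
    The combinatorial burden described above is exactly what a scenario-enumeration tool has to manage. The sketch below is a generic stand-in for RUNTA, not its actual implementation: it simply enumerates combinations of a few hypothetical parameters and counts the scenarios a full analysis would have to evaluate.

        from itertools import product

        # Hypothetical discretisation of the parameters listed above.
        flood_sources   = ["service water", "fire main", "component cooling"]
        rupture_sizes   = ["small leak", "large break"]
        isolation_times = [5, 15, 30]            # minutes
        drain_blockage  = [0.0, 0.5, 1.0]        # fraction of floor drains blocked

        scenarios = list(product(flood_sources, rupture_sizes, isolation_times, drain_blockage))
        print(f"{len(scenarios)} flood scenarios to evaluate")
        for source, size, t_iso, blocked in scenarios[:3]:
            print(source, size, f"isolation {t_iso} min", f"drains blocked {blocked:.0%}")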

  12. An analyser for power plant operations

    International Nuclear Information System (INIS)

    Rogers, A.E.; Wulff, W.

    1990-01-01

    Safe and reliable operation of power plants is essential. Power plant operators need a forecast of what the plant will do when its current state is disturbed. The in-line plant analyser provides precisely this information at relatively low cost. The plant analyser scheme uses a mathematical model of the dynamic behaviour of the plant to establish a numerical simulation. Over a period of time, the simulation is calibrated with measurements from the particular plant in which it is used. The analyser then provides a reference against which to evaluate the plant's current behaviour. It can be used to alert the operator to any atypical excursions or combinations of readings that indicate malfunction or off-normal conditions that, as the Three Mile Island event suggests, are not easily recognised by operators. In a look-ahead mode, it can forecast the behaviour resulting from an intended change in settings or operating conditions. Then, when such changes are made, the plant's behaviour can be tracked against the forecast in order to assure that the plant is behaving as expected. It can be used to investigate malfunctions that have occurred and test possible adjustments in operating procedures. Finally, it can be used to consider how far from the limits of performance the elements of the plant are operating. Then by adjusting settings, the required power can be generated with as little stress as possible on the equipment. (6 figures) (Author)

  13. Comparison of elastic and inelastic analyses

    International Nuclear Information System (INIS)

    Ammerman, D.J.; Heinstein, M.W.; Wellman, G.W.

    1992-01-01

    The use of inelastic analysis methods instead of the traditional elastic analysis methods in the design of radioactive material (RAM) transport packagings leads to a better understanding of the response of the package to mechanical loadings. Thus, a better assessment of the containment, thermal protection, and shielding integrity of the package after a structural accident event can be made. A more accurate prediction of the package response can lead to enhanced safety and also allow for a more efficient use of materials, possibly leading to a package with higher capacity or lower weight. This paper discusses the advantages and disadvantages of using inelastic analysis in the design of RAM shipping packages. The use of inelastic analysis presents several problems to the package designer. When using inelastic analysis the entire nonlinear response of the material must be known, including the effects of temperature changes and strain rate. Another problem is that there currently is no acceptance criterion for this type of analysis that is approved by regulatory agencies. Inelastic analysis acceptance criteria based on failure stress, failure strain, or plastic energy density could be developed. For both elastic and inelastic analyses it is also important to include other sources of stress in the analyses, such as fabrication stresses, thermal stresses, stresses from bolt preloading, and contact stresses at material interfaces. Offsetting these added difficulties is the improved knowledge of the package behavior. This allows for incorporation of a more uniform margin of safety, which can result in weight savings and a higher level of confidence in the post-accident configuration of the package. In this paper, comparisons between elastic and inelastic analyses are made for a simple ring structure and for a package to transport a large quantity of RAM by rail (rail cask) with lead gamma shielding to illustrate the differences in the two analysis techniques

  14. IDEA: Interactive Display for Evolutionary Analyses.

    Science.gov (United States)

    Egan, Amy; Mahurkar, Anup; Crabtree, Jonathan; Badger, Jonathan H; Carlton, Jane M; Silva, Joana C

    2008-12-08

    The availability of complete genomic sequences for hundreds of organisms promises to make obtaining genome-wide estimates of substitution rates, selective constraints and other molecular evolution variables of interest an increasingly important approach to addressing broad evolutionary questions. Two of the programs most widely used for this purpose are codeml and baseml, parts of the PAML (Phylogenetic Analysis by Maximum Likelihood) suite. A significant drawback of these programs is their lack of a graphical user interface, which can limit their user base and considerably reduce their efficiency. We have developed IDEA (Interactive Display for Evolutionary Analyses), an intuitive graphical input and output interface which interacts with PHYLIP for phylogeny reconstruction and with codeml and baseml for molecular evolution analyses. IDEA's graphical input and visualization interfaces eliminate the need to edit and parse text input and output files, reducing the likelihood of errors and improving processing time. Further, its interactive output display gives the user immediate access to results. Finally, IDEA can process data in parallel on a local machine or computing grid, allowing genome-wide analyses to be completed quickly. IDEA provides a graphical user interface that allows the user to follow a codeml or baseml analysis from parameter input through to the exploration of results. Novel options streamline the analysis process, and post-analysis visualization of phylogenies, evolutionary rates and selective constraint along protein sequences simplifies the interpretation of results. The integration of these functions into a single tool eliminates the need for lengthy data handling and parsing, significantly expediting access to global patterns in the data.

  15. IDEA: Interactive Display for Evolutionary Analyses

    Directory of Open Access Journals (Sweden)

    Carlton Jane M

    2008-12-01

    Full Text Available Abstract Background The availability of complete genomic sequences for hundreds of organisms promises to make obtaining genome-wide estimates of substitution rates, selective constraints and other molecular evolution variables of interest an increasingly important approach to addressing broad evolutionary questions. Two of the programs most widely used for this purpose are codeml and baseml, parts of the PAML (Phylogenetic Analysis by Maximum Likelihood) suite. A significant drawback of these programs is their lack of a graphical user interface, which can limit their user base and considerably reduce their efficiency. Results We have developed IDEA (Interactive Display for Evolutionary Analyses), an intuitive graphical input and output interface which interacts with PHYLIP for phylogeny reconstruction and with codeml and baseml for molecular evolution analyses. IDEA's graphical input and visualization interfaces eliminate the need to edit and parse text input and output files, reducing the likelihood of errors and improving processing time. Further, its interactive output display gives the user immediate access to results. Finally, IDEA can process data in parallel on a local machine or computing grid, allowing genome-wide analyses to be completed quickly. Conclusion IDEA provides a graphical user interface that allows the user to follow a codeml or baseml analysis from parameter input through to the exploration of results. Novel options streamline the analysis process, and post-analysis visualization of phylogenies, evolutionary rates and selective constraint along protein sequences simplifies the interpretation of results. The integration of these functions into a single tool eliminates the need for lengthy data handling and parsing, significantly expediting access to global patterns in the data.

  16. Bayesian uncertainty analyses of probabilistic risk models

    International Nuclear Information System (INIS)

    Pulkkinen, U.

    1989-01-01

    Applications of Bayesian principles to the uncertainty analyses are discussed in the paper. A short review of the most important uncertainties and their causes is provided. An application of the principle of maximum entropy to the determination of Bayesian prior distributions is described. An approach based on so called probabilistic structures is presented in order to develop a method of quantitative evaluation of modelling uncertainties. The method is applied to a small example case. Ideas for application areas for the proposed method are discussed
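
    A minimal numerical sketch of the two ingredients named here: a maximum-entropy prior (for a positive failure rate with a known mean this is an exponential distribution, i.e. a Gamma prior with shape 1) and the resulting Bayesian update after observing event counts. The numbers are invented and the conjugate Gamma-Poisson update is an illustration under those assumptions, not the method developed in the paper.

        # Maximum-entropy prior for a positive failure rate with known prior mean m:
        # the exponential distribution, i.e. a Gamma(shape=1, rate=1/m) prior.
        prior_mean = 1e-3             # assumed prior mean failure rate, events per hour
        shape0, rate0 = 1.0, 1.0 / prior_mean

        # Observed data: k failures in T hours of operation (Poisson likelihood).
        k, T = 2, 5000.0
        shape_post, rate_post = shape0 + k, rate0 + T   # conjugate posterior parameters

        posterior_mean = shape_post / rate_post
        print(f"prior mean     = {prior_mean:.2e} /h")
        print(f"posterior mean = {posterior_mean:.2e} /h after {k} failures in {T:.0f} h")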

  17. Safety analyses for high-temperature reactors

    International Nuclear Information System (INIS)

    Mueller, A.

    1978-01-01

    The safety evaluation of HTRs may be based on the three methods presented here: the licensing procedure, the probabilistic risk analysis, and the damage extent analysis. Thereby all safety aspects of the HTR - from normal operation to the extreme (hypothetical) accidents - are covered. The analyses within the licensing procedure of the HTR-1160 have shown that for normal operation and for the design basis accidents the radiation exposures remain clearly below the maximum permissible levels as prescribed by the radiation protection ordinance, so that no real hazard for the population will arise from them. (orig./RW) [de

  18. Introduction: Analysing Emotion and Theorising Affect

    Directory of Open Access Journals (Sweden)

    Peta Tait

    2016-08-01

    Full Text Available This discussion introduces ideas of emotion and affect for a volume of articles demonstrating the scope of approaches used in their study within the humanities and creative arts. The volume offers multiple perspectives on emotion and affect within 20th-century and 21st-century texts, arts and organisations and their histories. The discussion explains how emotion encompasses the emotions, emotional feeling, sensation and mood and how these can be analysed particularly in relation to literature, art and performance. It briefly summarises concepts of affect theory within recent approaches before introducing the articles.

  19. Applications of neural network to numerical analyses

    International Nuclear Information System (INIS)

    Takeda, Tatsuoki; Fukuhara, Makoto; Ma, Xiao-Feng; Liaqat, Ali

    1999-01-01

    Applications of a multi-layer neural network to numerical analyses are described. We are mainly concerned with computed tomography and the solution of differential equations. In both cases, the residuals of the integral equation or of the differential equations are employed as the objective functions for the training process of the neural network. This differs from conventional neural network training, where the sum of the squared errors of the output values is adopted as the objective function. For model problems both methods gave satisfactory results, and the methods are considered promising for certain kinds of problems. (author)
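
    A small sketch of the residual-as-objective idea for a differential equation: instead of fitting outputs to known solution values, the objective is the squared residual of the equation itself. For simplicity it uses a polynomial trial function and SciPy's optimizer rather than a multi-layer network, so it only illustrates the principle under those assumptions; the equation u' = -u with u(0) = 1 is an invented toy problem, not one from the paper.

        import numpy as np
        from scipy.optimize import minimize

        # Solve u'(x) = -u(x), u(0) = 1 on [0, 1] by minimizing the equation residual.
        x = np.linspace(0.0, 1.0, 50)

        def trial(w, x):
            # Trial solution u(x) = 1 + x * (w0 + w1*x + w2*x^2) satisfies u(0) = 1 exactly.
            p = w[0] + w[1] * x + w[2] * x**2
            dp = w[1] + 2 * w[2] * x
            u = 1.0 + x * p
            du = p + x * dp
            return u, du

        def residual_loss(w):
            u, du = trial(w, x)
            return np.mean((du + u) ** 2)        # residual of u' + u = 0 as the objective

        result = minimize(residual_loss, x0=np.zeros(3))
        u, _ = trial(result.x, x)
        print("max error vs exp(-x):", np.max(np.abs(u - np.exp(-x))))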

  20. Komparativ analyse - Scandinavian Airlines & Norwegian Air Shuttle

    OpenAIRE

    Kallesen, Martin Nystrup; Singh, Ravi Pal; Boesen, Nana Wiaberg

    2017-01-01

    The project is based on an examination of how companies the size of Scandinavian Airlines or Norwegian Air Shuttle use their finances and how they view their external environment. This has led us to research the relationship between the companies and their finances as well as their external environment, and how they differ in both. To do this we have utilised a range of methods to analyse the companies, including PESTEL, SWOT, TOWS, DCF, risk analysis, sensitivity analysis, Porter's ...

  1. Implementing partnerships in nonreactor facility safety analyses

    International Nuclear Information System (INIS)

    Courtney, J.C.; Perry, W.H.; Phipps, R.D.

    1996-01-01

    Faculty and students from LSU have been participating in nuclear safety analyses and radiation protection projects at ANL-W at INEL since 1973. A mutually beneficial relationship has evolved that has resulted in generation of safety-related studies acceptable to Argonne and DOE, NRC, and state regulatory groups. Most of the safety projects have involved the Hot Fuel Examination Facility or the Fuel Conditioning Facility; both are hot cells that receive spent fuel from EBR-II. A table shows some of the major projects at ANL-W that involved LSU students and faculty

  2. Cost/benefit analyses of environmental impact

    International Nuclear Information System (INIS)

    Goldman, M.I.

    1974-01-01

    Various aspects of cost-benefit analyses are considered. Some topics discussed are: regulations of the National Environmental Policy Act (NEPA); statement of AEC policy and procedures for implementation of NEPA; Calvert Cliffs decision; AEC Regulatory Guide; application of risk-benefit analysis to nuclear power; application of the as low as practicable (ALAP) rule to radiation discharges; thermal discharge restrictions proposed by EPA under the 1972 Amendment to the Water Pollution Control Act; estimates of somatic and genetic insult per unit population exposure; occupational exposure; EPA Point Source Guidelines for Discharges from Steam Electric Power Plants; and costs of closed-cycle cooling using cooling towers. (U.S.)

  3. The phaco machine: analysing new technology.

    Science.gov (United States)

    Fishkind, William J

    2013-01-01

    The phaco machine is frequently overlooked as the crucial surgical instrument it is. Understanding how to set parameters begins with understanding fundamental concepts of machine function. This study analyses the critical concepts of partial occlusion phaco, occlusion phaco and pump technology. In addition, phaco energy categories as well as variations of phaco energy production are explored. Contemporary power modulations and pump controls allow for the enhancement of partial occlusion phacoemulsification. These significant changes in anterior chamber dynamics produce a balanced environment for phaco, fewer complications and improved patient outcomes.

  4. Nuclear analyses of the Pietroasa gold hoard

    International Nuclear Information System (INIS)

    Cojocaru, V.; Besliu, C.

    1999-01-01

    By means of nuclear analyses the concentrations of Au, Ag, Cu, Ir, Os, Pt, Co and Hg were measured in the 12 artifacts of the gold hoard discovered in 1837 at Pietroasa, Buzau county in Romania. The concentrations of the first four elements were used to compare different stylistic groups assumed by historians. Comparisons with gold nuggets from the old Dacian territory and gold Roman imperial coins were also made. Good agreement was found with the oldest hypothesis, which considers that the hoard is represented by three styles appropriated mainly by the Goths. (author)

  5. An evaluation of the Olympus "Quickrate" analyser.

    Science.gov (United States)

    Williams, D G; Wood, R J; Worth, H G

    1979-02-01

    The Olympus "Quickrate", a photometer built for both kinetic and end point analysis was evaluated in this laboratory. Aspartate transaminase, lactate dehydrogenase, hydroxybutyrate dehydrogenase, creatine kinase, alkaline phosphatase and gamma glutamyl transpeptidase were measured in the kinetic mode and glucose, urea, total protein, albumin, bilirubin, calcium and iron in the end point mode. Overall, good correlation was observed with routine methodologies and the precision of the methods was acceptable. An electrical evaluation was also performed. In our hands, the instrument proved to be simple to use and gave no trouble. It should prove useful for paediatric and emergency work, and as a back up for other analysers.

  6. Analyses of containment structures with corrosion damage

    International Nuclear Information System (INIS)

    Cherry, J.L.

    1997-01-01

    Corrosion damage that has been found in a number of nuclear power plant containment structures can degrade the pressure capacity of the vessel. This has prompted concerns regarding the capacity of corroded containments to withstand accident loadings. To address these concerns, finite element analyses have been performed for a typical PWR Ice Condenser containment structure. Using ABAQUS, the pressure capacity was calculated for a typical vessel with no corrosion damage. Multiple analyses were then performed with the location of the corrosion and the amount of corrosion varied in each analysis. Using a strain-based failure criterion, a "lower bound", "best estimate", and "upper bound" failure level was predicted for each case. These limits were established by: determining the amount of variability that exists in material properties of typical containments, estimating the amount of uncertainty associated with the level of modeling detail and modeling assumptions, and estimating the effect of corrosion on the material properties

  7. Analyser Framework to Verify Software Components

    Directory of Open Access Journals (Sweden)

    Rolf Andreas Rasenack

    2009-01-01

    Full Text Available Today, it is important for software companies to build software systems in a short time interval, to reduce costs and to have a good market position. Therefore well organized and systematic development approaches are required. Reusing software components, which are well tested, can be a good solution for developing software applications in an effective manner. The reuse of software components is less expensive and less time consuming than development from scratch. But it is dangerous to think that software components can be matched together without any problems. Software components themselves are well tested, of course, but when they are composed together problems may still occur. Most problems stem from interaction and communication. To avoid such errors, a framework has to be developed for analysing software components. That framework determines the compatibility of corresponding software components. The promising approach discussed here presents a novel technique for analysing software components by applying an Abstract Syntax Language Tree (ASLT). A supportive environment will be designed that checks the compatibility of black-box software components. This article addresses the question of how coupled software components can be verified using an analyser framework, and describes the usage of the ASLT. Black-box software components and the Abstract Syntax Language Tree are the basis for developing the proposed framework and are discussed here to provide the background knowledge. The practical implementation of this framework is discussed and results are shown using a test environment.
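
    The ASLT formalism itself is not spelled out in the abstract; as a loose analogue only, the sketch below uses Python's ast module to check whether the calls one component makes match the function signatures another component provides. Both component sources and all names are invented for illustration.

    ```python
    # Syntax-tree based compatibility check between two components:
    # collect the functions the provider defines and the calls the consumer
    # makes, then flag arity mismatches.
    import ast

    provider_src = """
    def read_sensor(channel, timeout):
        return 0.0
    """
    consumer_src = """
    value = read_sensor(3)          # missing the timeout argument
    """

    def defined_functions(source):
        tree = ast.parse(source)
        return {n.name: len(n.args.args)
                for n in ast.walk(tree) if isinstance(n, ast.FunctionDef)}

    def calls(source):
        tree = ast.parse(source)
        return [(n.func.id, len(n.args)) for n in ast.walk(tree)
                if isinstance(n, ast.Call) and isinstance(n.func, ast.Name)]

    provided = defined_functions(provider_src)
    for name, nargs in calls(consumer_src):
        if name in provided and provided[name] != nargs:
            print(f"incompatible call: {name} expects {provided[name]} args, got {nargs}")
    ```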

  8. Passive safety injection experiments and analyses (PAHKO)

    International Nuclear Information System (INIS)

    Tuunanen, J.

    1998-01-01

    The PAHKO project involved experiments on the PACTEL facility and computer simulations of selected experiments. The experiments focused on the performance of Passive Safety Injection Systems (PSIS) of Advanced Light Water Reactors (ALWRs) in Small Break Loss-Of-Coolant Accident (SBLOCA) conditions. The PSIS consisted of a Core Make-up Tank (CMT) and two pipelines (Pressure Balancing Line, PBL, and Injection Line, IL). The examined PSIS worked efficiently in SBLOCAs, although the flow through the PSIS stopped temporarily if the break was very small and the hot water filled the CMT. The experiments demonstrated the importance of the flow distributor in the CMT to limit rapid condensation. The project included validation of three thermal-hydraulic computer codes (APROS, CATHARE and RELAP5). The analyses showed the codes are capable of simulating the overall behaviour of the transients. The detailed analyses of the results showed some models in the codes still need improvements. Especially, further development of models for thermal stratification, condensation and natural circulation flow with small driving forces would be necessary for accurate simulation of the PSIS phenomena. (orig.)

  9. Used Fuel Management System Interface Analyses - 13578

    Energy Technology Data Exchange (ETDEWEB)

    Howard, Robert; Busch, Ingrid [Oak Ridge National Laboratory, P.O. Box 2008, Bldg. 5700, MS-6170, Oak Ridge, TN 37831 (United States); Nutt, Mark; Morris, Edgar; Puig, Francesc [Argonne National Laboratory (United States); Carter, Joe; Delley, Alexcia; Rodwell, Phillip [Savannah River National Laboratory (United States); Hardin, Ernest; Kalinina, Elena [Sandia National Laboratories (United States); Clark, Robert [U.S. Department of Energy (United States); Cotton, Thomas [Complex Systems Group (United States)

    2013-07-01

    Preliminary system-level analyses of the interfaces between at-reactor used fuel management, consolidated storage facilities, and disposal facilities, along with the development of supporting logistics simulation tools, have been initiated to provide the U.S. Department of Energy (DOE) and other stakeholders with information regarding the various alternatives for managing used nuclear fuel (UNF) generated by the current fleet of light water reactors operating in the United States. An important UNF management system interface consideration is the need for ultimate disposal of UNF assemblies contained in waste packages that are sized to be compatible with different geologic media. Thermal analyses indicate that waste package sizes for the geologic media under consideration by the Used Fuel Disposition Campaign may be significantly smaller than the canisters being used for on-site dry storage by the nuclear utilities. Therefore, at some point along the UNF disposition pathway, there could be a need to repackage fuel assemblies already loaded and being loaded into the dry storage canisters currently in use. The implications of where and when the packaging or repackaging of commercial UNF will occur are key questions being addressed in this evaluation. The analysis demonstrated that thermal considerations will have a major impact on the operation of the system and that acceptance priority, rates, and facility start dates have significant system implications. (authors)

  10. Sensitivity in risk analyses with uncertain numbers.

    Energy Technology Data Exchange (ETDEWEB)

    Tucker, W. Troy; Ferson, Scott

    2006-06-01

    Sensitivity analysis is a study of how changes in the inputs to a model influence the results of the model. Many techniques have recently been proposed for use when the model is probabilistic. This report considers the related problem of sensitivity analysis when the model includes uncertain numbers that can involve both aleatory and epistemic uncertainty and the method of calculation is Dempster-Shafer evidence theory or probability bounds analysis. Some traditional methods for sensitivity analysis generalize directly for use with uncertain numbers, but, in some respects, sensitivity analysis for these analyses differs from traditional deterministic or probabilistic sensitivity analyses. A case study of a dike reliability assessment illustrates several methods of sensitivity analysis, including traditional probabilistic assessment, local derivatives, and a "pinching" strategy that hypothetically reduces the epistemic uncertainty or aleatory uncertainty, or both, in an input variable to estimate the reduction of uncertainty in the outputs. The prospects for applying the methods to black box models are also considered.
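
    A much-simplified sketch of the "pinching" strategy described above, using an invented two-input model rather than the dike case study: the epistemic interval on one input is hypothetically collapsed to a point value and the resulting reduction of the output spread is reported.

    ```python
    # Measure how much the output spread shrinks when the epistemic
    # uncertainty in one input is hypothetically removed ("pinched").
    import numpy as np

    rng = np.random.default_rng(1)

    def output(load_factor, strength):
        return strength / load_factor          # toy performance function

    # Aleatory input: load factor ~ lognormal; epistemic input: strength known
    # only to lie in the interval [1.8, 2.6] (all values invented).
    load = rng.lognormal(mean=0.0, sigma=0.2, size=20000)

    def spread(strength_lo, strength_hi):
        lo = np.percentile(output(load, strength_lo), 5)
        hi = np.percentile(output(load, strength_hi), 95)
        return hi - lo

    base = spread(1.8, 2.6)                    # epistemic interval left intact
    pinched = spread(2.2, 2.2)                 # strength pinched to a point value
    print(f"uncertainty reduction from pinching: {100 * (1 - pinched / base):.1f}%")
    ```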

  11. Fractal and multifractal analyses of bipartite networks

    Science.gov (United States)

    Liu, Jin-Long; Wang, Jian; Yu, Zu-Guo; Xie, Xian-Hua

    2017-03-01

    Bipartite networks have attracted considerable interest in various fields. Fractality and multifractality of unipartite (classical) networks have been studied in recent years, but there has been no work studying these properties of bipartite networks. In this paper, we try to unfold the self-similarity structure of bipartite networks by performing fractal and multifractal analyses for a variety of real-world bipartite network data sets and models. First, we find fractality in some bipartite networks, including the CiteULike, Netflix, MovieLens (ml-20m) and Delicious data sets and the (u, v)-flower model. Meanwhile, we observe shifted power-law or exponential behavior in several other networks. We then focus on the multifractal properties of bipartite networks. Our results indicate that multifractality exists in those bipartite networks possessing fractality. To capture the inherent attributes of bipartite networks with two different types of nodes, we assign different weights to the nodes of each class, and show the existence of multifractality in these node-weighted bipartite networks. In addition, for the data sets with ratings, we modify the two existing algorithms for fractal and multifractal analyses of edge-weighted unipartite networks to study the self-similarity of the corresponding edge-weighted bipartite networks. The results show that our modified algorithms are feasible and can effectively uncover the self-similarity structure of these edge-weighted bipartite networks and their corresponding node-weighted versions.

  12. Special analyses reveal coke-deposit structure

    International Nuclear Information System (INIS)

    Albright, L.F.

    1988-01-01

    A scanning electron microscope (SEM) and an energy dispersive X-ray analyzer (EDAX) have been used to obtain information that clarifies the three mechanisms of coke formation in ethylene furnaces, and to analyze the metal condition at the exit of the furnace. The results can be used to examine furnace operations and develop improved ethylene plant practices. In this first of four articles on the analyses of coke and metal samples, the coking mechanisms and coke deposits in a section of tube from an actual ethylene furnace (Furnace A) from a plant on the Texas Gulf Coast are discussed. The second article in the series will analyze the condition of the tube metal in the same furnace. To show how coke deposition and metal condition depend on the operating parameters of an ethylene furnace, the third article in the series will show the coke deposition in a Texas Gulf Coast furnace tube (Furnace B) that operated at a shorter residence time. The fourth article discusses the metal condition in that furnace. Some recommendations, based on the analyses and findings, are offered in the fourth article that could help extend the life of ethylene furnace tubes, and also improve overall ethylene plant operations

  13. Overview of cooperative international piping benchmark analyses

    International Nuclear Information System (INIS)

    McAfee, W.J.

    1982-01-01

    This paper presents an overview of an effort initiated in 1976 by the International Working Group on Fast Reactors (IWGFR) of the International Atomic Energy Agency (IAEA) to evaluate detailed and simplified inelastic analysis methods for piping systems with particular emphasis on piping bends. The procedure was to collect from participating member IAEA countries descriptions of tests and test results for piping systems or bends (with emphasis on high temperature inelastic tests), to compile, evaluate, and issue a selected number of these problems for analysis, and to compile and make a preliminary evaluation of the analyses results. Of the problem descriptions submitted three were selected to be used: a 90°-elbow at 600°C with an in-plane transverse force; a 90°-elbow with an in-plane moment; and a 180°-elbow at room temperature with a reversed, cyclic, in-plane transverse force. A variety of both detailed and simplified analysis solutions were obtained. A brief comparative assessment of the analyses is contained in this paper. 15 figures

  14. Ethics of cost analyses in medical education.

    Science.gov (United States)

    Walsh, Kieran

    2013-11-01

    Cost analyses in medical education are rarely straightforward, and rarely lead to clear-cut conclusions. Occasionally they do lead to clear conclusions, but even when that happens, some stakeholders will ask difficult but valid questions about what to do following cost analyses, specifically about distributive justice in the allocation of resources. At present there are few or no debates about these issues, and rationing decisions that are taken in medical education are largely made subconsciously. Distributive justice 'concerns the nature of a socially just allocation of goods in a society'. Inevitably there is a large degree of subjectivity in the judgment as to whether an allocation is seen as socially just or ethical. There are different principles by which we can view distributive justice and which therefore affect the prism of subjectivity through which we see certain problems. For example, we might say that distributive justice at a certain institution or in a certain medical education system operates according to the principle that resources must be divided equally amongst learners. Another system may say that resources should be distributed according to the needs of learners or even of patients. No ethical system or model is inherently right or wrong; they depend on the context in which the educator is working.

  15. Pathway analyses implicate glial cells in schizophrenia.

    Directory of Open Access Journals (Sweden)

    Laramie E Duncan

    Full Text Available The quest to understand the neurobiology of schizophrenia and bipolar disorder is ongoing, with multiple lines of evidence indicating abnormalities of glia, mitochondria, and glutamate in both disorders. Despite high heritability estimates of 81% for schizophrenia and 75% for bipolar disorder, compelling links between findings from neurobiological studies and findings from large-scale genetic analyses are only beginning to emerge. Ten publicly available gene sets (pathways) related to glia, mitochondria, and glutamate were tested for association to schizophrenia and bipolar disorder using MAGENTA as the primary analysis method. To determine the robustness of associations, secondary analyses were performed with ALIGATOR, INRICH, and Set Screen. Data from the Psychiatric Genomics Consortium (PGC) were used for all analyses. There were 1,068,286 SNP-level p-values for schizophrenia (9,394 cases/12,462 controls), and 2,088,878 SNP-level p-values for bipolar disorder (7,481 cases/9,250 controls). The Glia-Oligodendrocyte pathway was associated with schizophrenia, after correction for multiple tests, according to the primary analysis (MAGENTA p = 0.0005, with a 75% requirement for individual gene significance), and also achieved nominal levels of significance with INRICH (p = 0.0057) and ALIGATOR (p = 0.022). For bipolar disorder, Set Screen yielded nominally and method-wide significant associations to all three glial pathways, with the strongest association to the Glia-Astrocyte pathway (p = 0.002). Consistent with findings of white matter abnormalities in schizophrenia by other methods of study, the Glia-Oligodendrocyte pathway was associated with schizophrenia in our genomic study. These findings suggest that the abnormalities of myelination observed in schizophrenia are at least in part due to inherited factors, contrasted with the alternative of purely environmental causes (e.g. medication effects or lifestyle). While not the primary purpose of our study

  16. DEPUTY: analysing architectural structures and checking style

    International Nuclear Information System (INIS)

    Gorshkov, D.; Kochelev, S.; Kotegov, S.; Pavlov, I.; Pravilnikov, V.; Wellisch, J.P.

    2001-01-01

    The DepUty (dependencies utility) can be classified as a project and process management tool. The main goal of DepUty is to assist, by means of source code analysis and graphical representation using UML, in understanding dependencies of sub-systems and packages in CMS Object Oriented software, to understand architectural structure, and to schedule code release in modularised integration. It also allows a newcomer to more easily understand the global structure of CMS software, and to avoid circular dependencies up-front or re-factor the code in case it is already too close to the edge of non-maintainability. The authors will discuss the various views DepUty provides to analyse package dependencies, and illustrate both the metrics and style checking facilities it provides
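
    One of the checks such a tool performs is detecting circular dependencies. A minimal sketch of that check on an invented dependency graph follows; the package names are placeholders, not actual CMS packages, and this is not DepUty's implementation.

    ```python
    # Detect one circular dependency in a package graph with a DFS colouring.
    def find_cycle(graph):
        """Return one dependency cycle as a list of packages, or None."""
        WHITE, GREY, BLACK = 0, 1, 2
        colour = {p: WHITE for p in graph}
        stack = []

        def visit(pkg):
            colour[pkg] = GREY
            stack.append(pkg)
            for dep in graph.get(pkg, ()):
                if colour.get(dep, WHITE) == GREY:          # back edge -> cycle
                    return stack[stack.index(dep):] + [dep]
                if colour.get(dep, WHITE) == WHITE:
                    cycle = visit(dep)
                    if cycle:
                        return cycle
            stack.pop()
            colour[pkg] = BLACK
            return None

        for pkg in graph:
            if colour[pkg] == WHITE:
                cycle = visit(pkg)
                if cycle:
                    return cycle
        return None

    deps = {"Reco": ["Geometry", "DataFormats"],
            "Geometry": ["DataFormats"],
            "DataFormats": ["Reco"]}            # deliberate circular dependency
    print(find_cycle(deps))   # ['Reco', 'Geometry', 'DataFormats', 'Reco']
    ```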

  17. Response surface use in safety analyses

    International Nuclear Information System (INIS)

    Prosek, A.

    1999-01-01

    When thousands of complex computer code runs related to nuclear safety are needed for statistical analysis, a response surface is used to replace the computer code. The main purpose of the study was to develop and demonstrate a tool called the optimal statistical estimator (OSE), intended for response surface generation of complex and non-linear phenomena. The performance of the optimal statistical estimator was tested against the results of 59 different RELAP5/MOD3.2 code calculations of a small-break loss-of-coolant accident in a two-loop pressurized water reactor. The results showed that the OSE adequately predicted the response surface for the peak cladding temperature. Some good characteristics of the OSE, such as monotonic behaviour between two neighbouring points and independence of the number of output parameters, suggest that the OSE can be used for response surface generation of any safety or system parameter in thermal-hydraulic safety analyses. (author)
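
    As a hedged illustration of the response-surface idea (not the OSE algorithm itself), the sketch below fits a quadratic surface to invented (input, peak cladding temperature) pairs standing in for the code runs, and then evaluates the cheap surrogate at a new operating point.

    ```python
    # Replace an expensive code with a cheap quadratic surrogate fitted by
    # least squares to a modest number of sampled runs (all data invented).
    import numpy as np

    rng = np.random.default_rng(2)
    # Hypothetical inputs: break size [%] and initial power [%], with peak
    # cladding temperature [K] from 59 imagined code runs.
    X = rng.uniform([0.5, 90.0], [2.0, 102.0], size=(59, 2))
    pct = (900 + 150 * X[:, 0] - 20 * X[:, 0] ** 2 + 3.0 * (X[:, 1] - 100)
           + rng.normal(0, 5, 59))                  # stand-in for code results

    def design(X):
        x1, x2 = X[:, 0], X[:, 1]
        return np.column_stack([np.ones(len(X)), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])

    coef, *_ = np.linalg.lstsq(design(X), pct, rcond=None)

    # Cheap surrogate evaluation at a new operating point
    new = np.array([[1.2, 98.0]])
    print("predicted PCT [K]:", (design(new) @ coef).item())
    ```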

  18. Spatial Analyses of Harappan Urban Settlements

    Directory of Open Access Journals (Sweden)

    Hirofumi Teramura

    2006-12-01

    Full Text Available The Harappan Civilization occupies a unique place among the early civilizations of the world with its well planned urban settlements, advanced handicraft and technology, and religious and trade activities. Using a Geographical Information System (GIS), this study presents spatial analyses that locate urban settlements on a digital elevation model (DEM) according to the three phases of early, mature and late. Understanding the relationship between the spatial distribution of Harappan sites and the change in some factors, such as topographic features, river passages or sea level changes, will lead to an understanding of the dynamism of this civilization. It will also afford a glimpse of the factors behind the formation, development, and decline of the Harappan Civilization.

  19. The plant design analyser and its applications

    International Nuclear Information System (INIS)

    Whitmarsh-Everiss, M.J.

    1992-01-01

    Consideration is given to the history of computational methods for the non-linear dynamic analysis of plant behaviour. This is traced from analogue to hybrid computers. When these were phased out simulation languages were used in the batch mode and the interactive computational capabilities were lost. These have subsequently been recovered using mainframe computing architecture in the context of small models using the Prototype Plant Design Analyser. Given the development of parallel processing architectures, the restriction on model size can be lifted. This capability and the use of advanced Work Stations and graphics software has enabled an advanced interactive design environment to be developed. This system is generic and can be used, with suitable graphics development, to study the dynamics and control behaviour of any plant or system for minimum cost. Examples of past and possible future uses are identified. (author)

  20. Abundance analyses of thirty cool carbon stars

    International Nuclear Information System (INIS)

    Utsumi, Kazuhiko

    1985-01-01

    The results were previously obtained by use of the absolute gf-values and the cosmic abundance as a standard. These gf-values were found to contain large systematic errors, and as a result, the solar photospheric abundances were revised. Our previous results, therefore, must be revised by using new gf-values, and the abundance analyses are extended to as many carbon stars as possible. In conclusion, in normal cool carbon stars heavy metals are overabundant by factors of 10 - 100 and rare-earth elements are overabundant by a factor of about 10; in J-type cool carbon stars, the 12C/13C ratio is smaller, the C2 and CN bands and the Li 6708 line are stronger than in normal cool carbon stars, and the abundances of s-process elements with respect to Fe are nearly normal. (Mori, K.)

  1. Analysing Medieval Urban Space; a methodology

    Directory of Open Access Journals (Sweden)

    Marlous L. Craane MA

    2007-08-01

    Full Text Available This article has been written in reaction to recent developments in medieval history and archaeology, to study not only the buildings in a town but also the spaces that hold them together. It discusses a more objective and interdisciplinary approach for analysing urban morphology and use of space. It proposes a 'new' methodology by combining town plan analysis and space syntax. This methodology was trialled on the city of Utrecht in the Netherlands. By comparing the results of this 'new' methodology with the results of previous, more conventional, research, this article shows that space syntax can be applied successfully to medieval urban contexts. It does this by demonstrating a strong correlation between medieval economic spaces and the most integrated spaces, just as is found in the study of modern urban environments. It thus provides a strong basis for the use of this technique in future research of medieval urban environments.

  2. Reliability and safety analyses under fuzziness

    International Nuclear Information System (INIS)

    Onisawa, T.; Kacprzyk, J.

    1995-01-01

    Fuzzy theory, for example possibility theory, is compatible with probability theory. What has been shown so far is that probability theory need not be replaced by fuzzy theory, but rather that the former works much better in applications if it is combined with the latter. In fact, it is said that there are two essential uncertainties in the field of reliability and safety analyses: one is probabilistic uncertainty, which is more relevant for mechanical systems and the natural environment, and the other is fuzziness (imprecision) caused by the existence of human beings in systems. Classical probability theory alone is therefore not sufficient to deal with uncertainties in humanistic systems. In such a context this collection of works marks a milestone in the debate between probability theory and fuzzy theory. This volume covers fault analysis, lifetime analysis, reliability, quality control, safety analysis and risk analysis. (orig./DG). 106 figs

  3. Precise Chemical Analyses of Planetary Surfaces

    Science.gov (United States)

    Kring, David; Schweitzer, Jeffrey; Meyer, Charles; Trombka, Jacob; Freund, Friedemann; Economou, Thanasis; Yen, Albert; Kim, Soon Sam; Treiman, Allan H.; Blake, David

    1996-01-01

    We identify the chemical elements and element ratios that should be analyzed to address many of the issues identified by the Committee on Planetary and Lunar Exploration (COMPLEX). We determined that most of these issues require two sensitive instruments to analyze the necessary complement of elements. In addition, it is useful in many cases to use one instrument to analyze the outermost planetary surface (e.g. to determine weathering effects), while a second is used to analyze a subsurface volume of material (e.g., to determine the composition of unaltered planetary surface material). This dual approach to chemical analyses will also facilitate the calibration of orbital and/or Earth-based spectral observations of the planetary body. We determined that in many cases the scientific issues defined by COMPLEX can only be fully addressed with combined packages of instruments that would supplement the chemical data with mineralogic or visual information.

  4. Seismic analyses of structures. 1st draft

    International Nuclear Information System (INIS)

    David, M.

    1995-01-01

    The dynamic analysis presented in this paper refers to the seismic analysis of the main building of Paks NPP. The aim of the analysis was to determine the floor response spectra as response to seismic input. This analysis was performed with the 3-dimensional calculation model, and the floor response spectra were determined for a number of levels from the floor response time histories; no other adjustments were applied. The following results of the seismic analysis are presented: 3-dimensional finite element model; basic assumptions of dynamic analyses; table of frequencies and included factors; modal masses for all modes; floor response spectra in all the selected nodes with figures of indicated nodes and important nodes of free vibration
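
    For illustration, a floor response spectrum of the kind described can be obtained by driving a damped single-degree-of-freedom oscillator with the floor acceleration history and recording its peak pseudo-acceleration at each frequency. The sketch below does this for a synthetic record; the input motion, damping value and frequency range are assumptions for the example, not Paks data.

    ```python
    # Floor response spectrum from an acceleration time history: solve a
    # damped SDOF oscillator for each frequency and take the peak response.
    import numpy as np
    from scipy.signal import lsim

    dt = 0.005
    t = np.arange(0, 20, dt)
    rng = np.random.default_rng(3)
    a_floor = rng.normal(0, 1, t.size) * np.exp(-((t - 5) / 3) ** 2)  # synthetic record

    zeta = 0.05                                  # 5 % damping (assumed)
    freqs = np.logspace(-0.3, 1.5, 60)           # ~0.5 to ~32 Hz
    spectrum = []
    for f in freqs:
        w = 2 * np.pi * f
        # relative displacement x: x'' + 2*zeta*w*x' + w^2*x = -a_floor(t)
        _, x, _ = lsim(([-1.0], [1.0, 2 * zeta * w, w * w]), a_floor, t)
        spectrum.append(w * w * np.max(np.abs(x)))   # pseudo-acceleration

    print("peak spectral acceleration:", max(spectrum))
    ```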

  5. Analysing Terrorism from a Systems Thinking Perspective

    Directory of Open Access Journals (Sweden)

    Lukas Schoenenberger

    2014-02-01

    Full Text Available Given the complexity of terrorism, solutions based on single factors are destined to fail. Systems thinking offers various tools for helping researchers and policy makers comprehend terrorism in its entirety. We have developed a semi-quantitative systems thinking approach for characterising relationships between variables critical to terrorism and their impact on the system as a whole. For a better understanding of the mechanisms underlying terrorism, we present a 16-variable model characterising the critical components of terrorism and perform a series of highly focused analyses. We show how to determine which variables are best suited for government intervention, describing in detail their effects on the key variable—the political influence of a terrorist network. We also offer insights into how to elicit variables that destabilise and ultimately break down these networks. Because we clarify our novel approach with fictional data, the primary importance of this paper lies in the new framework for reasoning that it provides.

  6. Seismic analyses of structures. 1st draft

    Energy Technology Data Exchange (ETDEWEB)

    David, M [David Consulting, Engineering and Design Office (Czech Republic)

    1995-07-01

    The dynamic analysis presented in this paper refers to the seismic analysis of the main building of Paks NPP. The aim of the analysis was to determine the floor response spectra as response to seismic input. This analysis was performed with the 3-dimensional calculation model, and the floor response spectra were determined for a number of levels from the floor response time histories; no other adjustments were applied. The following results of the seismic analysis are presented: 3-dimensional finite element model; basic assumptions of dynamic analyses; table of frequencies and included factors; modal masses for all modes; floor response spectra in all the selected nodes with figures of indicated nodes and important nodes of free vibration.

  7. Project analysis and integration economic analyses summary

    Science.gov (United States)

    Macomber, H. L.

    1986-01-01

    An economic-analysis summary was presented for the manufacture of crystalline-silicon modules involving silicon ingot/sheet growth, slicing, cell manufacture, and module assembly. Economic analyses provided: useful quantitative aspects for complex decision-making to the Flat-plate Solar Array (FSA) Project; yardsticks for design and performance to industry; and demonstration of how to evaluate and understand the worth of research and development both to JPL and other government agencies and programs. It was concluded that future research and development funds for photovoltaics must be provided by the Federal Government because the solar industry today does not reap enough profits from its present-day sales of photovoltaic equipment.

  8. Level 2 probabilistic event analyses and quantification

    International Nuclear Information System (INIS)

    Boneham, P.

    2003-01-01

    In this paper an example of quantification of a severe accident phenomenological event is given. The analysis performed to assess the probability that the debris released from the reactor vessel was in a coolable configuration in the lower drywell is presented, and the assessment of the type of core/concrete attack that would occur is also analysed. The evaluation of ex-vessel debris coolability by an event in the Simplified Boiling Water Reactor (SBWR) Containment Event Tree (CET), and a detailed Decomposition Event Tree (DET) developed to aid in the quantification of this CET event, are considered. The headings in the DET selected to represent plant physical states (e.g., reactor vessel pressure at the time of vessel failure) and the uncertainties associated with the occurrence of critical physical phenomena (e.g., debris configuration in the lower drywell) considered important to assessing whether the debris was coolable or not coolable ex-vessel are also discussed
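
    Schematically, quantifying such an event tree amounts to multiplying conditional branch probabilities along each path to an end state. The sketch below enumerates end states for a few invented branch questions and probabilities; these are illustrative placeholders, not the SBWR CET/DET values.

    ```python
    # Enumerate yes/no paths through ordered event-tree branch questions and
    # accumulate end-state probabilities by multiplying along each path.
    event_tree = {
        "vessel fails at low pressure": 0.7,
        "debris spreads into coolable configuration": 0.6,
        "water available in lower drywell": 0.8,
    }

    def end_state_probabilities(branches):
        states = {(): 1.0}
        for question, p_yes in branches.items():
            new = {}
            for path, prob in states.items():
                new[path + ((question, True),)] = prob * p_yes
                new[path + ((question, False),)] = prob * (1 - p_yes)
            states = new
        return states

    for path, prob in sorted(end_state_probabilities(event_tree).items(),
                             key=lambda kv: -kv[1])[:3]:
        outcome = ", ".join(f"{q}={'Y' if yes else 'N'}" for q, yes in path)
        print(f"{prob:.3f}  {outcome}")
    ```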

  9. Externalizing Behaviour for Analysing System Models

    DEFF Research Database (Denmark)

    Ivanova, Marieta Georgieva; Probst, Christian W.; Hansen, René Rydhof

    2013-01-01

    System models have recently been introduced to model organisations and evaluate their vulnerability to threats, especially insider threats. For the latter in particular these models are very suitable, since insiders can be assumed to have more knowledge about the attacked organisation than outside attackers. Therefore, many attacks are considerably easier to perform for insiders than for outsiders. However, current models do not support explicit specification of different behaviours. Instead, behaviour is deeply embedded in the analyses supported by the models, meaning that it is a complex, if not impossible, task to change behaviours. Especially when considering social engineering or the human factor in general, the ability to use different kinds of behaviours is essential. In this work we present an approach to make the behaviour a separate component in system models, and explore how to integrate...

  10. ATLAS helicity analyses in beauty hadron decays

    CERN Document Server

    Smizanska, M

    2000-01-01

    The ATLAS detector will allow a precise spatial reconstruction of the kinematics of B hadron decays. In combination with the efficient lepton identification applied already at trigger level, ATLAS is expected to provide large samples of exclusive decay channels cleanly separable from background. These data sets will allow spin-dependent analyses leading to the determination of production and decay parameters, which are not accessible if the helicity amplitudes are not separated. Measurement feasibility studies for the decays B_s^0 → J/psi phi and Lambda_b^0 → Lambda J/psi, presented in this document, show the experimental precisions that can be achieved in determination of B_s^0 and Lambda_b^0 characteristics. (19 refs).

  11. Thermal hydraulic reactor safety analyses and experiments

    International Nuclear Information System (INIS)

    Holmstroem, H.; Eerikaeinen, L.; Kervinen, T.; Kilpi, K.; Mattila, L.; Miettinen, J.; Yrjoelae, V.

    1989-04-01

    The report introduces the results of the thermal hydraulic reactor safety research performed in the Nuclear Engineering Laboratory of the Technical Research Centre of Finland (VTT) during the years 1972-1987. Practical applications, i.e. analyses for the safety authorities and power companies, are also presented. The emphasis is on a description of the state-of-the-art know-how. The report describes VTT's most important computer codes, both those of foreign origin and those developed at VTT, and their assessment work, VTT's own experimental research, as well as international experimental projects and other forms of cooperation VTT has participated in. Appendix 8 contains a comprehensive list of the most important publications and technical reports produced. They present the content and results of the research in detail. (orig.)

  12. Digital analyses of cartometric Fruska Gora guidelines

    Directory of Open Access Journals (Sweden)

    Živković Dragica

    2013-01-01

    Full Text Available Modern geomorphological research has been using quantitative statistical and cartographic methods to analyse topographic relief features and their mutual connections on the basis of good-quality numeric parameters. Topographic features are important for many natural processes. Important morphological characteristics include, among others, slope angle, hypsometry and topographic exposition. Small yet little-studied relief slants can deeply affect land configuration, hypsometry, topographic exposition, etc. Exposition modifies light and heat and, through them, interconnected phenomena: soil and air temperature, soil disintegration, the length of the vegetation period, the intensity of photosynthesis, the yield of agricultural crops, the height of the snow line, etc. [Project of the Ministry of Science of the Republic of Serbia, no. 176008 and no. III44006]

  13. Attitude stability analyses for small artificial satellites

    International Nuclear Information System (INIS)

    Silva, W R; Zanardi, M C; Formiga, J K S; Cabette, R E S; Stuchi, T J

    2013-01-01

    The objective of this paper is to analyze the stability of the rotational motion of a symmetrical spacecraft in a circular orbit. The equilibrium points and regions of stability are established when components of the gravity gradient torque acting on the spacecraft are included in the equations of rotational motion, which are described by the Andoyer variables. The nonlinear stability of the equilibrium points of the rotational motion is analysed here by the Kovalev-Savchenko theorem. With the application of the Kovalev-Savchenko theorem, it is possible to verify whether they remain stable under the influence of the higher-order terms of the normal Hamiltonian. In this paper, numerical simulations are made for a small hypothetical artificial satellite. Several stable equilibrium points were determined, and regions around these points have been established by variations in the orbital inclination and in the spacecraft principal moment of inertia. The present analysis can directly contribute to the maintenance of the spacecraft's attitude

  14. Cointegration Approach to Analysing Inflation in Croatia

    Directory of Open Access Journals (Sweden)

    Lena Malešević-Perović

    2009-06-01

    Full Text Available The aim of this paper is to analyse the determinants of inflation in Croatia in the period 1994:6-2006:6. We use a cointegration approach and find that increases in wages positively influence inflation in the long run. Furthermore, in the period from June 1994 onward, the depreciation of the currency also contributed to inflation. Money does not explain Croatian inflation. This irrelevance of the money supply is consistent with its endogeneity to exchange rate targeting, whereby the money supply is determined by developments in the foreign exchange market. The value of inflation in the previous period is also found to be significant, thus indicating some inflation inertia.
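
    A hedged sketch of the Engle-Granger style check behind such a result, run on synthetic series standing in for the Croatian data; the coint helper is from statsmodels, and the series, sample size and noise levels are assumptions for illustration only.

    ```python
    # Engle-Granger cointegration test between two synthetic monthly series
    # that are cointegrated by construction.
    import numpy as np
    from statsmodels.tsa.stattools import coint

    rng = np.random.default_rng(4)
    n = 144                                                # monthly obs., 1994:6-2006:6
    wages = np.cumsum(rng.normal(0.3, 1.0, n))             # random walk with drift
    inflation = 0.5 * wages + rng.normal(0, 0.5, n)        # cointegrated by design

    t_stat, p_value, _ = coint(inflation, wages)
    print(f"Engle-Granger t-statistic {t_stat:.2f}, p-value {p_value:.3f}")
    # A small p-value points to a long-run (cointegrating) relationship.
    ```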

  15. Comprehensive immunoproteogenomic analyses of malignant pleural mesothelioma.

    Science.gov (United States)

    Lee, Hyun-Sung; Jang, Hee-Jin; Choi, Jong Min; Zhang, Jun; de Rosen, Veronica Lenge; Wheeler, Thomas M; Lee, Ju-Seog; Tu, Thuydung; Jindra, Peter T; Kerman, Ronald H; Jung, Sung Yun; Kheradmand, Farrah; Sugarbaker, David J; Burt, Bryan M

    2018-04-05

    We generated a comprehensive atlas of the immunologic cellular networks within human malignant pleural mesothelioma (MPM) using mass cytometry. Data-driven analyses of these high-resolution single-cell data identified 2 distinct immunologic subtypes of MPM with vastly different cellular composition, activation states, and immunologic function; mass spectrometry demonstrated differential abundance of MHC-I and -II neopeptides directly identified between these subtypes. The clinical relevance of this immunologic subtyping was investigated with a discriminatory molecular signature derived through comparison of the proteomes and transcriptomes of these 2 immunologic MPM subtypes. This molecular signature, representative of a favorable intratumoral cell network, was independently associated with improved survival in MPM and predicted response to immune checkpoint inhibitors in patients with MPM and melanoma. These data additionally suggest a potentially novel mechanism of response to checkpoint blockade: requirement for high measured abundance of neopeptides in the presence of high expression of MHC proteins specific for these neopeptides.

  16. Deterministic analyses of severe accident issues

    International Nuclear Information System (INIS)

    Dua, S.S.; Moody, F.J.; Muralidharan, R.; Claassen, L.B.

    2004-01-01

    Severe accidents in light water reactors involve complex physical phenomena. In the past there has been a heavy reliance on simple assumptions regarding physical phenomena alongside of probability methods to evaluate risks associated with severe accidents. Recently GE has developed realistic methodologies that permit deterministic evaluations of severe accident progression and of some of the associated phenomena in the case of Boiling Water Reactors (BWRs). These deterministic analyses indicate that with appropriate system modifications, and operator actions, core damage can be prevented in most cases. Furthermore, in cases where core-melt is postulated, containment failure can either be prevented or significantly delayed to allow sufficient time for recovery actions to mitigate severe accidents

  17. Risques naturels en montagne et analyse spatiale

    Directory of Open Access Journals (Sweden)

    Yannick Manche

    1999-06-01

    Full Text Available The concept of risk rests on two notions: the hazard, which represents the physical phenomenon through its amplitude and return period; and the vulnerability, which represents the set of assets and people that can be affected by a natural phenomenon. Risk is then defined as the intersection of these two notions. This theoretical view makes it possible to model hazards and vulnerability independently. This work is mainly concerned with taking vulnerability into account in the management of natural risks. Its assessment necessarily involves a degree of spatial analysis that takes into account human occupation and the different scales of land use. However, the spatial assessment, whether of assets and people or of indirect effects, runs into numerous problems. The extent of land occupation must be estimated. Moreover, data processing involves constant changes of scale to move from point elements to surfaces, which geographic information systems do not handle perfectly. Risk management imposes strong town-planning constraints; taking vulnerability into account makes it possible to better understand and manage the spatial constraints implied by natural risks. Keywords: hazard, spatial analysis, natural risks, GIS, vulnerability

  18. Isotropy analyses of the Planck convergence map

    Science.gov (United States)

    Marques, G. A.; Novaes, C. P.; Bernui, A.; Ferreira, I. S.

    2018-01-01

    The presence of matter in the path of relic photons causes distortions in the angular pattern of the cosmic microwave background (CMB) temperature fluctuations, modifying their properties in a slight but measurable way. Recently, the Planck Collaboration released the estimated convergence map, an integrated measure of the large-scale matter distribution that produced the weak gravitational lensing (WL) phenomenon observed in Planck CMB data. We perform exhaustive analyses of this convergence map, calculating the variance in small and large regions of the sky (excluding the area masked due to Galactic contamination) and comparing them with the features expected in the set of simulated convergence maps, also released by the Planck Collaboration. Our goal is to search for sky directions or regions where the WL imprints anomalous signatures on the variance estimator, revealed through a χ2 analysis at a statistically significant level. In the local analysis of the Planck convergence map, we identified eight patches of the sky in disagreement, at more than 2σ, with what is observed in the average of the simulations. In contrast, in the large-regions analysis we found no statistically significant discrepancies, but, interestingly, the regions with the highest χ2 values surround the ecliptic poles. Thus, our results show a good agreement with the features expected by the Λ cold dark matter concordance model, as given by the simulations. Yet, the outlier regions found here could suggest that the data still contain residual contamination, like noise, due to over- or underestimation of systematic effects in the simulation data set.
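
    A schematic version of the local variance test: compute the variance of the field in sky patches, compare each patch with the mean and scatter of the same quantity over simulations, and flag patches deviating by more than 2σ. Plain arrays stand in for the HEALPix maps, and all values below are simulated placeholders rather than Planck data.

    ```python
    # Per-patch variance of a "data" map compared against an ensemble of
    # simulated maps via a chi-square-like deviation in units of sigma.
    import numpy as np

    rng = np.random.default_rng(5)
    n_patches, n_sims, pix_per_patch = 192, 300, 512

    data_patches = rng.normal(0, 1.0, (n_patches, pix_per_patch))
    sim_patches = rng.normal(0, 1.0, (n_sims, n_patches, pix_per_patch))

    var_data = data_patches.var(axis=1)                 # variance per sky patch
    var_sims = sim_patches.var(axis=2)                  # same for each simulation

    mean_sim = var_sims.mean(axis=0)
    std_sim = var_sims.std(axis=0)
    chi = (var_data - mean_sim) / std_sim               # per-patch deviation in sigmas

    outliers = np.where(np.abs(chi) > 2)[0]
    print(f"{outliers.size} patches deviate by more than 2 sigma")
    ```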

  19. The radiation analyses of ITER lower ports

    International Nuclear Information System (INIS)

    Petrizzi, L.; Brolatti, G.; Martin, A.; Loughlin, M.; Moro, F.; Villari, R.

    2010-01-01

    The ITER Vacuum Vessel has upper, equatorial, and lower ports used for equipment installation, diagnostics, heating and current drive systems, cryo-vacuum pumping, and access inside the vessel for maintenance. At the level of the divertor, the nine lower ports for remote handling, cryo-vacuum pumping and diagnostics are inclined downwards and toroidally located every 40°. The cryopump port additionally has a branch to accommodate a second cryopump. The ports, as openings in the Vacuum Vessel, permit radiation streaming out of the vessel, which affects the heating in the components in the outer regions of the machine inside and outside the ports. Safety concerns are also raised with respect to the dose after shutdown at the cryostat behind the ports: in such zones the radiation dose level must be kept below the regulatory limit to allow personnel access for maintenance purposes. Neutronic analyses have been required to qualify the ITER project related to the lower ports. A 3-D model was used to take into account full details of the ports and the lower machine surroundings. MCNP version 5 1.40 has been used with the FENDL 2.1 nuclear data library. The ITER 40° model distributed by the ITER Organization was developed in the lower part to include the relevant details. The results of a first analysis, focused on the cryopump system only, were recently published. In this paper more complete data on the cryopump port and analyses for the remote handling port and the diagnostic rack are presented; the results of both analyses give a complete map of the radiation loads in the outer divertor ports. Nuclear heating, dpa, tritium production, and dose rates after shutdown are provided and the implications for the design are discussed.

  20. Database-Driven Analyses of Astronomical Spectra

    Science.gov (United States)

    Cami, Jan

    2012-03-01

    Spectroscopy is one of the most powerful tools to study the physical properties and chemical composition of very diverse astrophysical environments. In principle, each nuclide has a unique set of spectral features; thus, establishing the presence of a specific material at astronomical distances requires no more than finding a laboratory spectrum of the right material that perfectly matches the astronomical observations. Once the presence of a substance is established, a careful analysis of the observational characteristics (wavelengths or frequencies, intensities, and line profiles) allows one to determine many physical parameters of the environment in which the substance resides, such as temperature, density, velocity, and so on. Because of this great diagnostic potential, ground-based and space-borne astronomical observatories often include instruments to carry out spectroscopic analyses of various celestial objects and events. Of particular interest is molecular spectroscopy at infrared wavelengths. From the spectroscopic point of view, molecules differ from atoms in their ability to vibrate and rotate, and quantum physics inevitably causes those motions to be quantized. The energies required to excite vibrations or rotations are such that vibrational transitions generally occur at infrared wavelengths, whereas pure rotational transitions typically occur at sub-mm wavelengths. Molecular vibration and rotation are coupled though, and thus at infrared wavelengths, one commonly observes a multitude of ro-vibrational transitions (see Figure 13.1). At lower spectral resolution, all transitions blend into one broad ro-vibrational molecular band whose position depends on, among other things, the isotope involved. Molecular spectroscopy thus allows us to see a difference of one neutron in an atomic nucleus that is located at astronomical distances! Since the detection of the first interstellar molecules (the CH [21] and CN [14] radicals), more than 150 species have been detected in space, ranging in size from diatomic

  1. High performance liquid chromatography in pharmaceutical analyses

    Directory of Open Access Journals (Sweden)

    Branko Nikolin

    2004-05-01

    Full Text Available In the testing, pre-sale procedures, marketing and control of drugs over the last ten years, high performance liquid chromatography has replaced numerous spectroscopic methods and gas chromatography in quantitative and qualitative analysis. In the first period of HPLC application it was thought that it would become a complementary method to gas chromatography; however, today it has nearly completely replaced gas chromatography in pharmaceutical analysis. The application of the liquid mobile phase, with the possibility of changing its polarity during chromatography and all other modifications of the mobile phase depending upon the characteristics of the substance being tested, is a great advantage in the separation process in comparison to other methods. The greater choice of stationary phases is the next factor that enables the realization of good separation. The separation line connected to specific and sensitive detector systems (spectrofluorimeter, diode detector, electrochemical detector), as well as other hyphenated systems such as HPLC-MS and HPLC-NMR, are the basic elements on which the wide and effective application of the HPLC method is based. The purpose of high performance liquid chromatography (HPLC) analysis of any drug is to confirm the identity of the drug, provide quantitative results and also monitor the progress of the therapy of a disease.1 The measurement presented in Fig. 1 is a chromatogram obtained for the plasma of depressed patients 12 h before oral administration of dexamethasone. It may also be used to further our understanding of the normal and disease processes in the human body through biomedical and therapeutic research during investigations prior to drug registration. The analysis of drugs and metabolites in biological fluids, particularly plasma, serum or urine, is one of the most demanding but also one of the most common uses of high performance liquid chromatography. Blood, plasma or

  2. High performance liquid chromatography in pharmaceutical analyses.

    Science.gov (United States)

    Nikolin, Branko; Imamović, Belma; Medanhodzić-Vuk, Saira; Sober, Miroslav

    2004-05-01

    In the testing, pre-sale procedures, marketing and control of drugs over the last ten years, high performance liquid chromatography has replaced numerous spectroscopic methods and gas chromatography in quantitative and qualitative analysis. In the first period of HPLC application it was thought that it would become a complementary method to gas chromatography; however, today it has nearly completely replaced gas chromatography in pharmaceutical analysis. The application of the liquid mobile phase, with the possibility of changing its polarity during chromatography and all other modifications of the mobile phase depending upon the characteristics of the substance being tested, is a great advantage in the separation process in comparison to other methods. The greater choice of stationary phases is the next factor that enables the realization of good separation. The separation line connected to specific and sensitive detector systems (spectrofluorimeter, diode detector, electrochemical detector), as well as other hyphenated systems such as HPLC-MS and HPLC-NMR, are the basic elements on which the wide and effective application of the HPLC method is based. The purpose of high performance liquid chromatography (HPLC) analysis of any drug is to confirm the identity of the drug, provide quantitative results and also monitor the progress of the therapy of a disease.1) The measurement presented in Fig. 1 is a chromatogram obtained for the plasma of depressed patients 12 h before oral administration of dexamethasone. It may also be used to further our understanding of the normal and disease processes in the human body through biomedical and therapeutic research during investigations prior to drug registration. The analysis of drugs and metabolites in biological fluids, particularly plasma, serum or urine, is one of the most demanding but also one of the most common uses of high performance liquid chromatography. Blood, plasma or serum contains numerous endogenous

  3. Uncertainty Analyses for Back Projection Methods

    Science.gov (United States)

    Zeng, H.; Wei, S.; Wu, W.

    2017-12-01

    So far few comprehensive error analyses for back projection methods have been conducted, although it is evident that high frequency seismic waves can be easily affected by earthquake depth, focal mechanisms and the Earth's 3D structures. Here we perform 1D and 3D synthetic tests for two back projection methods, MUltiple SIgnal Classification (MUSIC) (Meng et al., 2011) and Compressive Sensing (CS) (Yao et al., 2011). We generate synthetics for both point sources and finite rupture sources with different depths and focal mechanisms, as well as 1D and 3D structures in the source region. The 3D synthetics are generated through a hybrid scheme of the Direct Solution Method and the Spectral Element Method. Then we back project the synthetic data using MUSIC and CS. The synthetic tests show that the depth phases can be back projected as artificial sources both in space and time. For instance, for a source depth of 10 km, back projection gives a strong signal 8 km away from the true source. Such bias increases with depth, e.g., the error in horizontal location could be larger than 20 km for a depth of 40 km. If the array is located around the nodal direction of direct P-waves, the teleseismic P-waves are dominated by the depth phases. Therefore, back projections are actually imaging the reflection points of depth phases more than the rupture front. Besides depth phases, the strong and long-lasting coda waves due to 3D effects near the trench can lead to additional complexities, which are also tested here. The strength contrast of different frequency contents in the rupture models also produces some variations in the back projection results. In the synthetic tests, MUSIC and CS derive consistent results. While MUSIC is more computationally efficient, CS works better for sparse arrays. In summary, our analyses indicate that the impact of the various factors mentioned above should be taken into consideration when interpreting back projection images, before we can use them to infer the earthquake rupture physics.
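
    For orientation, the core of back projection is shift-and-stack: for each trial source position, array waveforms are aligned with the predicted travel-time differences and stacked, and the grid point with the most coherent stack is the imaged source. The sketch below does this for a homogeneous half-space with an invented array geometry and a synthetic pulse; it is not the MUSIC or CS implementation used in the study.

    ```python
    # Shift-and-stack back projection of a synthetic pulse over a 2-D grid.
    import numpy as np

    rng = np.random.default_rng(6)
    v, dt = 6.0, 0.05                           # km/s, s (assumed homogeneous medium)
    stations = rng.uniform(-50, 50, (20, 2))    # station coordinates [km]
    true_src = np.array([5.0, -3.0])

    t = np.arange(0, 30, dt)
    wavelet = lambda tau: np.exp(-((t - tau) / 0.5) ** 2)     # Gaussian pulse
    arrivals = np.linalg.norm(stations - true_src, axis=1) / v + 10.0
    traces = np.array([wavelet(tau) for tau in arrivals])
    traces += 0.2 * rng.normal(size=traces.shape)             # noise

    grid = [(x, y) for x in np.arange(-10, 11, 1.0) for y in np.arange(-10, 11, 1.0)]

    def stack_power(src):
        shifts = np.linalg.norm(stations - src, axis=1) / v
        rel = np.round((shifts - shifts.mean()) / dt).astype(int)
        aligned = [np.roll(tr, -s) for tr, s in zip(traces, rel)]
        return np.max(np.sum(aligned, axis=0) ** 2)

    best = max(grid, key=lambda g: stack_power(np.array(g)))
    print("imaged source:", best, "true source:", tuple(true_src))
    ```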

  4. Scanning electron microscopy and micro-analyses

    International Nuclear Information System (INIS)

    Brisset, F.; Repoux, L.; Ruste, J.; Grillon, F.; Robaut, F.

    2008-01-01

    Scanning electron microscopy (SEM) and the related micro-analyses are involved in extremely varied domains, from academic environments to industrial ones. The overall theoretical bases, the main technical characteristics, and some complementary information about practical usage and maintenance are developed in this book. High-vacuum and controlled-vacuum electron microscopes are thoroughly presented, as well as the latest generation of EDS (energy dispersive spectrometer) and WDS (wavelength dispersive spectrometer) micro-analysers. Beside these main topics, other analysis or observation techniques are covered, such as EBSD (electron backscattering diffraction), 3-D imaging, FIB (focussed ion beams), Monte-Carlo simulations, in-situ tests, etc. This book, in French, is the only one that treats this subject in such an exhaustive way. It represents the fully updated version of a previous edition of 1979. It gathers the lectures given in 2006 at the summer school of Saint Martin d'Heres (France). Content: 1 - electron-matter interactions; 2 - characteristic X-radiation, Bremsstrahlung; 3 - electron guns in SEM; 4 - elements of electronic optics; 5 - vacuum techniques; 6 - detectors used in SEM; 7 - image formation and optimization in SEM; 7a - SEM practical instructions for use; 8 - controlled pressure microscopy; 8a - applications; 9 - energy selection X-spectrometers (energy dispersive spectrometers - EDS); 9a - EDS analysis; 9b - X-EDS mapping; 10 - technological aspects of WDS; 11 - processing of EDS and WDS spectra; 12 - X-microanalysis quantifying methods; 12a - quantitative WDS microanalysis of very light elements; 13 - statistics: precision and detection limits in microanalysis; 14 - analysis of stratified samples; 15 - crystallography applied to EBSD; 16 - EBSD: history, principle and applications; 16a - EBSD analysis; 17 - Monte Carlo simulation; 18 - insulating samples in SEM and X-ray microanalysis; 18a - insulating

  5. Multichannel amplitude analyser for nuclear spectrometry

    International Nuclear Information System (INIS)

    Jankovic, S.; Milovanovic, B.

    2003-01-01

    A multichannel amplitude analyser with 4096 channels was designed. It is based on a fast 12-bit analog-to-digital converter. The intended purpose of the instrument is recording nuclear spectra by means of scintillation detectors. The computer link is established through an opto-isolated serial connection cable, thus reducing the instrument's sensitivity to disturbances originating from digital circuitry. The data displayed on the screen are refreshed every 2.5 seconds. Impulse peak detection is implemented through differentiation of the amplified input signal, while synchronization with the data coming from the converter output is established by taking advantage of the internal 'pipeline' structure of the converter itself. The mode of operation of the built-in microcontroller ensures that no impulses are missed, and a simple logic network prevents the initiation of the amplitude reading sequence for the next impulse if it appears shortly after the preceding one. The solution proposed here demonstrated good performance at a comparatively low manufacturing cost, and is thus suitable for educational purposes (author)
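
    A rough software analogue of the peak-detection logic described above (differentiation of the amplified signal, rejection of impulses arriving too soon after their predecessor, and 4096-channel histogramming) is sketched below in Python; the threshold, dead-time window and amplitude scaling are illustrative assumptions rather than the instrument's actual firmware behaviour.

        import numpy as np

        def pulse_height_spectrum(samples, threshold, dead_samples=8, n_channels=4096):
            # samples      : digitised detector signal, already scaled to ADC codes 0..4095
            # threshold    : minimum amplitude regarded as a real impulse
            # dead_samples : reject a peak that follows its predecessor too closely,
            #                mimicking a simple pile-up rejection rule
            spectrum = np.zeros(n_channels, dtype=np.int64)
            derivative = np.diff(samples)
            last_peak = -dead_samples
            for i in range(1, len(derivative)):
                # a peak is where the differentiated signal crosses zero from + to -
                if derivative[i - 1] > 0 and derivative[i] <= 0:
                    amplitude = int(samples[i])
                    if amplitude >= threshold and i - last_peak >= dead_samples:
                        spectrum[min(amplitude, n_channels - 1)] += 1
                        last_peak = i
            return spectrum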

  6. Scleral topography analysed by optical coherence tomography.

    Science.gov (United States)

    Bandlitz, Stefan; Bäumer, Joachim; Conrad, Uwe; Wolffsohn, James

    2017-08-01

    A detailed evaluation of the corneo-scleral-profile (CSP) is of particular relevance in soft and scleral lenses fitting. The aim of this study was to use optical coherence tomography (OCT) to analyse the profile of the limbal sclera and to evaluate the relationship between central corneal radii, corneal eccentricity and scleral radii. Using OCT (Optos OCT/SLO; Dunfermline, Scotland, UK) the limbal scleral radii (SR) of 30 subjects (11M, 19F; mean age 23.8±2.0SD years) were measured in eight meridians 45° apart. Central corneal radii (CR) and corneal eccentricity (CE) were evaluated using the Oculus Keratograph 4 (Oculus, Wetzlar, Germany). Differences between SR in the meridians and the associations between SR and corneal topography were assessed. Median SR measured along 45° (58.0; interquartile range, 46.8-84.8mm) was significantly (ptopography and may provide additional data useful in fitting soft and scleral contact lenses. Copyright © 2017 British Contact Lens Association. Published by Elsevier Ltd. All rights reserved.

  7. Bayesian analyses of seasonal runoff forecasts

    Science.gov (United States)

    Krzysztofowicz, R.; Reese, S.

    1991-12-01

    Forecasts of seasonal snowmelt runoff volume provide indispensable information for rational decision making by water project operators, irrigation district managers, and farmers in the western United States. Bayesian statistical models and communication frames have been researched in order to enhance the forecast information disseminated to the users, and to characterize forecast skill from the decision maker's point of view. Four products are presented: (i) a Bayesian Processor of Forecasts, which provides a statistical filter for calibrating the forecasts, and a procedure for estimating the posterior probability distribution of the seasonal runoff; (ii) the Bayesian Correlation Score, a new measure of forecast skill, which is related monotonically to the ex ante economic value of forecasts for decision making; (iii) a statistical predictor of monthly cumulative runoffs within the snowmelt season, conditional on the total seasonal runoff forecast; and (iv) a framing of the forecast message that conveys the uncertainty associated with the forecast estimates to the users. All analyses are illustrated with numerical examples of forecasts for six gauging stations from the period 1971-1988.
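
    A heavily simplified, purely Gaussian sketch of what a Bayesian Processor of Forecasts does (combine a climatological prior with a forecast likelihood to obtain the posterior runoff distribution) is given below; the regression coefficients and the numbers in the example are invented for illustration and are not taken from the six gauging stations.

        import numpy as np

        def posterior_runoff(prior_mean, prior_sd, forecast, like_slope, like_intercept, like_sd):
            # Prior: seasonal runoff W ~ N(prior_mean, prior_sd^2)  (climatology).
            # Likelihood: forecast F | W ~ N(like_intercept + like_slope * W, like_sd^2),
            # with the regression coefficients estimated from historical forecast/runoff pairs.
            # Returns the mean and sd of the posterior distribution of W given F = forecast.
            prior_var, like_var = prior_sd ** 2, like_sd ** 2
            post_var = 1.0 / (1.0 / prior_var + like_slope ** 2 / like_var)
            post_mean = post_var * (prior_mean / prior_var +
                                    like_slope * (forecast - like_intercept) / like_var)
            return post_mean, np.sqrt(post_var)

        # hypothetical numbers: climatological mean 500, sd 150 (thousand acre-feet),
        # an unbiased forecast with error sd 80, and an issued forecast of 650
        print(posterior_runoff(500.0, 150.0, 650.0, 1.0, 0.0, 80.0))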

  8. Analyses of demand response in Denmark

    International Nuclear Information System (INIS)

    Moeller Andersen, F.; Grenaa Jensen, S.; Larsen, Helge V.; Meibom, P.; Ravn, H.; Skytte, K.; Togeby, M.

    2006-10-01

    Due to characteristics of the power system, costs of producing electricity vary considerably over short time intervals. Yet, many consumers do not experience corresponding variations in the price they pay for consuming electricity. The topic of this report is: are consumers willing and able to respond to short-term variations in electricity prices, and if so, what is the social benefit of consumers doing so? Taking Denmark and the Nord Pool market as a case, the report focuses on what is known as short-term consumer flexibility or demand response in the electricity market. With focus on market efficiency, efficient allocation of resources and security of supply, the report describes demand response from a micro-economic perspective and provides empirical observations and case studies. The report aims at evaluating benefits from demand response. However, only elements contributing to an overall value are presented. In addition, the analyses are limited to benefits for society, and costs of obtaining demand response are not considered. (au)

  9. WIND SPEED AND ENERGY POTENTIAL ANALYSES

    Directory of Open Access Journals (Sweden)

    A. TOKGÖZLÜ

    2013-01-01

    Full Text Available This paper provides a case study on the application of wavelet techniques to analyse wind speed and energy (renewable and environmentally friendly energy). Solar and wind are the main sources of energy that allow farmers to transfer the kinetic energy captured by the wind mill for pumping water, drying crops, heating systems of greenhouses, rural electrification, or cooking. Larger wind turbines (over 1 MW) can pump enough water for small-scale irrigation. This study aimed to initiate a data gathering process for wavelet analyses and to examine different scale effects and their role in wind speed and direction variations. The wind data gathering system is mounted at latitude 37° 50′ N, longitude 30° 33′ E, at a height of 1200 m above mean sea level on a hill near the Süleyman Demirel University campus. Ten-minute average values of wind speed and direction at two levels (10 m and 30 m above ground level) were recorded by a data logger between July 2001 and February 2002. Wind speed values ranged between 0 m/s and 54 m/s. The annual mean speed value is 4.5 m/s at 10 m above ground level. Prevalent wind
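
    To illustrate the kind of scale decomposition the study refers to, the sketch below applies a discrete wavelet decomposition to a wind-speed series and reports the relative energy per detail level. The use of PyWavelets, the db4 wavelet, the decomposition depth and the synthetic series are assumptions made for illustration, not the authors' actual processing chain.

        import numpy as np
        import pywt  # PyWavelets; an assumption, the record does not name its software

        def scale_energies(wind_speed, wavelet="db4", level=5):
            # decompose a wind-speed series into wavelet detail levels and return the
            # relative energy carried by each scale, a simple way to see which time
            # scales dominate the speed variations
            coeffs = pywt.wavedec(np.asarray(wind_speed, dtype=float), wavelet, level=level)
            energies = np.array([np.sum(c ** 2) for c in coeffs[1:]])  # detail levels only
            return energies / energies.sum()

        # hypothetical example: one week of 10-minute records (1008 samples)
        rng = np.random.default_rng(0)
        series = 4.5 + rng.normal(0.0, 1.5, 1008).cumsum() * 0.01
        print(scale_energies(series))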

  10. PRECLOSURE CONSEQUENCE ANALYSES FOR LICENSE APPLICATION

    Energy Technology Data Exchange (ETDEWEB)

    S. Tsai

    2005-01-12

    Radiological consequence analyses are performed for potential releases from normal operations in surface and subsurface facilities and from Category 1 and Category 2 event sequences during the preclosure period. Surface releases from normal repository operations are primarily from radionuclides released from opening a transportation cask during dry transfer operations of spent nuclear fuel (SNF) in Dry Transfer Facility 1 (DTF 1), Dry Transfer Facility 2 (DTF 2), the Canister Handling facility (CHF), or the Fuel Handling Facility (FHF). Subsurface releases from normal repository operations are from resuspension of waste package surface contamination and neutron activation of ventilated air and silica dust from host rock in the emplacement drifts. The purpose of this calculation is to demonstrate that the preclosure performance objectives, specified in 10 CFR 63.111(a) and 10 CFR 63.111(b), have been met for the proposed design and operations in the geologic repository operations area. Preclosure performance objectives are discussed in Section 6.2.3 and are summarized in Tables 1 and 2.

  11. Soil deflation analyses from wind erosion events

    Directory of Open Access Journals (Sweden)

    Lenka Lackóová

    2015-09-01

    Full Text Available There are various methods to assess soil erodibility for wind erosion. This paper focuses on aggregate analysis by a laser particle sizer ANALYSETTE 22 (FRITSCH GmbH), made to determine the size distribution of soil particles detached by wind (deflated particles). Ten soil samples, trapped along the same length of the erosion surface (150–155 m) but at different wind speeds, were analysed. The soil was sampled from a flat, smooth area without vegetation cover or soil crust, not affected by the impact of windbreaks or other barriers, from a maximum depth of 2.5 cm. Prior to analysis the samples were prepared according to the relevant specifications. An experiment was also conducted using a device that enables characterisation of the vertical movement of the deflated material. The trapped samples showed no differences in particle size and the proportions of size fractions at different hourly average wind speeds. It was observed that most of the particles travelling in saltation mode (size 50–500 μm) – 58–70% – moved vertically up to 26 cm above the soil surface. At greater heights, particles moving in suspension mode (floating in the air; size < 100 μm) accounted for up to 90% of the samples. This result suggests that the boundary between the two modes of the vertical movement of deflated soil particles lies at about 25 cm above the soil surface.

  12. Genomic analyses of modern dog breeds.

    Science.gov (United States)

    Parker, Heidi G

    2012-02-01

    A rose may be a rose by any other name, but when you call a dog a poodle it becomes a very different animal than if you call it a bulldog. Both the poodle and the bulldog are examples of dog breeds of which there are >400 recognized worldwide. Breed creation has played a significant role in shaping the modern dog from the length of his leg to the cadence of his bark. The selection and line-breeding required to maintain a breed has also reshaped the genome of the dog, resulting in a unique genetic pattern for each breed. The breed-based population structure combined with extensive morphologic variation and shared human environments have made the dog a popular model for mapping both simple and complex traits and diseases. In order to obtain the most benefit from the dog as a genetic system, it is necessary to understand the effect structured breeding has had on the genome of the species. That is best achieved by looking at genomic analyses of the breeds, their histories, and their relationships to each other.

  13. Interim Basis for PCB Sampling and Analyses

    International Nuclear Information System (INIS)

    BANNING, D.L.

    2001-01-01

    This document was developed as an interim basis for sampling and analysis of polychlorinated biphenyls (PCBs) and will be used until a formal data quality objective (DQO) document is prepared and approved. On August 31, 2000, the Framework Agreement for Management of Polychlorinated Biphenyls (PCBs) in Hanford Tank Waste was signed by the US. Department of Energy (DOE), the Environmental Protection Agency (EPA), and the Washington State Department of Ecology (Ecology) (Ecology et al. 2000). This agreement outlines the management of double shell tank (DST) waste as Toxic Substance Control Act (TSCA) PCB remediation waste based on a risk-based disposal approval option per Title 40 of the Code of Federal Regulations 761.61 (c). The agreement calls for ''Quantification of PCBs in DSTs, single shell tanks (SSTs), and incoming waste to ensure that the vitrification plant and other ancillary facilities PCB waste acceptance limits and the requirements of the anticipated risk-based disposal approval are met.'' Waste samples will be analyzed for PCBs to satisfy this requirement. This document describes the DQO process undertaken to assure appropriate data will be collected to support management of PCBs and is presented in a DQO format. The DQO process was implemented in accordance with the U.S. Environmental Protection Agency EPA QAlG4, Guidance for the Data Quality Objectives Process (EPA 1994) and the Data Quality Objectives for Sampling and Analyses, HNF-IP-0842/Rev.1 A, Vol. IV, Section 4.16 (Banning 1999)

  14. Achieving reasonable conservatism in nuclear safety analyses

    International Nuclear Information System (INIS)

    Jamali, Kamiar

    2015-01-01

    In the absence of methods that explicitly account for uncertainties, seeking reasonable conservatism in nuclear safety analyses can quickly lead to extreme conservatism. The rate of divergence to extreme conservatism is often beyond the expert analysts’ intuitive feeling, but can be demonstrated mathematically. Too much conservatism in addressing the safety of nuclear facilities is not beneficial to society. Using certain properties of lognormal distributions for representation of input parameter uncertainties, example calculations for the risk and consequence of a fictitious facility accident scenario are presented. Results show that there are large differences between the calculated 95th percentiles and the extreme bounding values derived from using all input variables at their upper-bound estimates. Showing the relationship of the mean values to the key parameters of the output distributions, the paper concludes that the mean is the ideal candidate for representation of the value of an uncertain parameter. The mean value is proposed as the metric that is consistent with the concept of reasonable conservatism in nuclear safety analysis, because its value increases towards higher percentiles of the underlying positively skewed distribution with increasing levels of uncertainty. Insensitivity of the results to the actual underlying distributions is briefly demonstrated. - Highlights: • Multiple conservative assumptions can quickly diverge into extreme conservatism. • Mathematics and attractive properties provide basis for wide use of lognormal distribution. • Mean values are ideal candidates for representation of parameter uncertainties. • Mean values are proposed as reasonably conservative estimates of parameter uncertainties
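
    The argument can be reproduced numerically. The sketch below, with made-up lognormal parameters, compares the product of three inputs each taken at its individual 95th percentile against the 95th percentile and the mean of the resulting output distribution, illustrating both the drift toward extreme conservatism and the fact that the mean sits at a progressively higher percentile as uncertainty grows.

        import numpy as np
        from scipy import stats

        # a numerical illustration of the argument, with made-up parameters
        rng = np.random.default_rng(1)
        sigmas = [0.4, 0.4, 0.4]          # three lognormal input uncertainties
        mus = [0.0, 0.0, 0.0]
        n = 200_000

        samples = np.prod([rng.lognormal(m, s, n) for m, s in zip(mus, sigmas)], axis=0)

        bounding = np.prod([stats.lognorm(s, scale=np.exp(m)).ppf(0.95)
                            for m, s in zip(mus, sigmas)])     # all inputs at their 95th
        p95 = np.quantile(samples, 0.95)                       # 95th percentile of output
        mean = samples.mean()

        # the bounding product far exceeds the true 95th percentile, and the mean sits
        # above the median, climbing toward higher percentiles as uncertainty increases
        print(f"bounding product: {bounding:.2f}, output 95th: {p95:.2f}, mean: {mean:.2f}")
        print(f"mean sits at the {100 * (samples < mean).mean():.1f}th percentile")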

  15. CFD analyses of coolant channel flowfields

    Science.gov (United States)

    Yagley, Jennifer A.; Feng, Jinzhang; Merkle, Charles L.

    1993-01-01

    The flowfield characteristics in rocket engine coolant channels are analyzed by means of a numerical model. The channels are characterized by large length to diameter ratios, high Reynolds numbers, and asymmetrical heating. At representative flow conditions, the channel length is approximately twice the hydraulic entrance length so that fully developed conditions would be reached for a constant property fluid. For the supercritical hydrogen that is used as the coolant, the strong property variations create significant secondary flows in the cross-plane which have a major influence on the flow and the resulting heat transfer. Comparison of constant and variable property solutions show substantial differences. In addition, the property variations prevent fully developed flow. The density variation accelerates the fluid in the channels increasing the pressure drop without an accompanying increase in heat flux. Analyses of the inlet configuration suggest that side entry from a manifold can affect the development of the velocity profile because of vortices generated as the flow enters the channel. Current work is focused on studying the effects of channel bifurcation on the flow field and the heat transfer characteristics.

  16. Fast and accurate methods for phylogenomic analyses

    Directory of Open Access Journals (Sweden)

    Warnow Tandy

    2011-10-01

    Full Text Available Abstract Background Species phylogenies are not estimated directly, but rather through phylogenetic analyses of different gene datasets. However, true gene trees can differ from the true species tree (and hence from one another) due to biological processes such as horizontal gene transfer, incomplete lineage sorting, and gene duplication and loss, so that no single gene tree is a reliable estimate of the species tree. Several methods have been developed to estimate species trees from estimated gene trees, differing according to the specific algorithmic technique used and the biological model used to explain differences between species and gene trees. Relatively little is known about the relative performance of these methods. Results We report on a study evaluating several different methods for estimating species trees from sequence datasets, simulating sequence evolution under a complex model including indels (insertions and deletions), substitutions, and incomplete lineage sorting. The most important finding of our study is that some fast and simple methods are nearly as accurate as the most accurate methods, which employ sophisticated statistical methods and are computationally quite intensive. We also observe that methods that explicitly consider errors in the estimated gene trees produce more accurate trees than methods that assume the estimated gene trees are correct. Conclusions Our study shows that highly accurate estimations of species trees are achievable, even when gene trees differ from each other and from the species tree, and that these estimations can be obtained using fairly simple and computationally tractable methods.

  17. Mediation Analyses in the Real World

    DEFF Research Database (Denmark)

    Lange, Theis; Starkopf, Liis

    2016-01-01

    The paper by Nguyen et al.1 published in this issue of Epidemiology presents a comparison of the recently suggested inverse odds ratio approach for addressing mediation and a more conventional Baron and Kenny-inspired method. Interestingly, the comparison is not done through a discussion of restr... it simultaneously ensures that the comparison is based on properties which matter in actual applications, and makes the comparison accessible for a broader audience. In a wider context, the choice to stay close to real-life problems mirrors a general trend within the literature on mediation analysis, namely to put... applications using the inverse odds ratio approach, as it simply has not had enough time to move from theoretical concept to published applied paper, we do expect to be able to judge the willingness of authors and journals to employ the causal inference-based approach to mediation analyses. Our hope...

  18. Reproducibility of neuroimaging analyses across operating systems.

    Science.gov (United States)

    Glatard, Tristan; Lewis, Lindsay B; Ferreira da Silva, Rafael; Adalat, Reza; Beck, Natacha; Lepage, Claude; Rioux, Pierre; Rousseau, Marc-Etienne; Sherif, Tarek; Deelman, Ewa; Khalili-Mahani, Najmeh; Evans, Alan C

    2015-01-01

    Neuroimaging pipelines are known to generate different results depending on the computing platform where they are compiled and executed. We quantify these differences for brain tissue classification, fMRI analysis, and cortical thickness (CT) extraction, using three of the main neuroimaging packages (FSL, Freesurfer and CIVET) and different versions of GNU/Linux. We also identify some causes of these differences using library and system call interception. We find that these packages use mathematical functions based on single-precision floating-point arithmetic whose implementations in operating systems continue to evolve. While these differences have little or no impact on simple analysis pipelines such as brain extraction and cortical tissue classification, their accumulation creates important differences in longer pipelines such as subcortical tissue classification, fMRI analysis, and cortical thickness extraction. With FSL, most Dice coefficients between subcortical classifications obtained on different operating systems remain above 0.9, but values as low as 0.59 are observed. Independent component analyses (ICA) of fMRI data differ between operating systems in one third of the tested subjects, due to differences in motion correction. With Freesurfer and CIVET, in some brain regions we find an effect of build or operating system on cortical thickness. A first step to correct these reproducibility issues would be to use more precise representations of floating-point numbers in the critical sections of the pipelines. The numerical stability of pipelines should also be reviewed.
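
    The Dice coefficient used to compare classifications, and the way tiny precision-related differences can flip borderline voxels, can be illustrated with the short sketch below; the toy data and the size of the injected perturbation are assumptions, not measurements from FSL, Freesurfer or CIVET.

        import numpy as np

        def dice(a, b):
            # Dice coefficient between two binary masks, the overlap metric used in the
            # record to compare classifications produced on different operating systems
            a, b = a.astype(bool), b.astype(bool)
            return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

        # toy illustration of how small single-precision differences can flip voxels
        rng = np.random.default_rng(0)
        values = rng.normal(0.0, 1.0, (64, 64)).astype(np.float64)
        noise = rng.normal(0.0, 1e-7, values.shape)            # library/OS-level jitter
        mask64 = values > 0.0
        mask32 = (values.astype(np.float32) + noise.astype(np.float32)) > 0.0
        print(f"Dice between the two thresholded maps: {dice(mask64, mask32):.4f}")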

  19. Activation analyses for different fusion structural alloys

    International Nuclear Information System (INIS)

    Attaya, H.; Smith, D.

    1991-01-01

    The leading candidate structural materials, viz., the vanadium alloys, the nickel- or manganese-stabilized austenitic steels, and the ferritic steels, are analysed in terms of their induced activation in the TPSS fusion power reactor. The TPSS reactor has 1950 MW fusion power and inboard and outboard average neutron wall loadings of 3.75 and 5.35 MW/m², respectively. The results show that, after one year of continuous operation, the vanadium alloys have the least radioactivity at reactor shutdown. The maximum difference between the induced radioactivity in the vanadium alloys and in the other iron-based alloys occurs at about 10 years after reactor shutdown. At this time, the total reactor radioactivity, using the vanadium alloys, is about two orders of magnitude less than the total reactor radioactivity utilizing any other alloy. The difference is even larger in the first wall: the FW vanadium activation is 3 orders of magnitude less than the FW activation of the other alloys. 2 refs., 7 figs

  20. Statistical analyses of extreme food habits

    International Nuclear Information System (INIS)

    Breuninger, M.; Neuhaeuser-Berthold, M.

    2000-01-01

    This report is a summary of the results of the project ''Statistical analyses of extreme food habits'', which was ordered by the National Office for Radiation Protection as a contribution to the amendment of the ''General Administrative Regulation to paragraph 45 of the Decree on Radiation Protection: determination of the radiation exposure caused by the emission of radioactive substances from facilities of nuclear technology''. Its aim is to show whether the calculation of the radiation ingested by 95% of the population via food intake, as planned in a provisional draft, overestimates the true exposure, and if so, to determine the extent of this overestimation. It was possible to prove the existence of this overestimation, but its extent could only be estimated roughly. To identify its real extent, it is necessary to include the specific activities of the nuclides, which were not available for this investigation. In addition, the report shows how the consumption amounts of different food groups influence each other and which connections between these amounts should be taken into account in order to estimate the radiation exposure as precisely as possible. (orig.) [de

  1. Evaluation of the Olympus AU-510 analyser.

    Science.gov (United States)

    Farré, C; Velasco, J; Ramón, F

    1991-01-01

    The selective multitest Olympus AU-510 analyser was evaluated according to the recommendations of the Comision de Instrumentacion de la Sociedad Española de Quimica Clinica and the European Committee for Clinical Laboratory Standards. The evaluation was carried out in two stages: an examination of the analytical units and then an evaluation under routine work conditions. The operational characteristics of the system were also studied. The first stage included a photometric study: depending on the absorbance, the inaccuracy varies from +0.5% to -0.6% at 405 nm and from -5.6% to 10.6% at 340 nm; the imprecision ranges between -0.22% and 0.56% at 405 nm and between 0.09% and 2.74% at 340 nm. Linearity was acceptable, apart from a very low absorbance for NADH at 340 nm, and the imprecision of the serum sample pipetter was satisfactory. Twelve serum analytes were studied under routine conditions: glucose, urea, urate, cholesterol, triglycerides, total bilirubin, creatinine, phosphate, iron, aspartate aminotransferase, alanine aminotransferase and gamma-glutamyl transferase. The within-run imprecision (CV%) ranged from 0.67% for phosphate to 2.89% for iron and the between-run imprecision from 0.97% for total bilirubin to 7.06% for iron. There was no carryover in a study of the serum sample pipetter. Carry-over studies with the reagent and sample pipetters show some cross contamination in the iron assay.

  2. PRECLOSURE CONSEQUENCE ANALYSES FOR LICENSE APPLICATION

    International Nuclear Information System (INIS)

    S. Tsai

    2005-01-01

    Radiological consequence analyses are performed for potential releases from normal operations in surface and subsurface facilities and from Category 1 and Category 2 event sequences during the preclosure period. Surface releases from normal repository operations are primarily from radionuclides released from opening a transportation cask during dry transfer operations of spent nuclear fuel (SNF) in Dry Transfer Facility 1 (DTF 1), Dry Transfer Facility 2 (DTF 2), the Canister Handling facility (CHF), or the Fuel Handling Facility (FHF). Subsurface releases from normal repository operations are from resuspension of waste package surface contamination and neutron activation of ventilated air and silica dust from host rock in the emplacement drifts. The purpose of this calculation is to demonstrate that the preclosure performance objectives, specified in 10 CFR 63.111(a) and 10 CFR 63.111(b), have been met for the proposed design and operations in the geologic repository operations area. Preclosure performance objectives are discussed in Section 6.2.3 and are summarized in Tables 1 and 2

  3. Genomic analyses of the CAM plant pineapple.

    Science.gov (United States)

    Zhang, Jisen; Liu, Juan; Ming, Ray

    2014-07-01

    The innovation of crassulacean acid metabolism (CAM) photosynthesis in arid and/or low CO2 conditions is a remarkable case of adaptation in flowering plants. For pineapple, the most important crop that utilizes CAM photosynthesis, genetic and genomic resources have been developed over many years. Genetic diversity studies using various types of DNA markers led to the reclassification of the two genera Ananas and Pseudananas and nine species into one genus Ananas and two species, A. comosus and A. macrodontes, with five botanical varieties in A. comosus. Five genetic maps have been constructed using F1 or F2 populations, and high-density genetic maps generated by genotype sequencing are essential resources for sequencing and assembling the pineapple genome and for marker-assisted selection. There are abundant expressed sequence tag resources but limited genomic sequences in pineapple. Genes involved in the CAM pathway have been analysed in several CAM plants, but only a few of them are from pineapple. A reference genome of pineapple is being generated and will accelerate genetic and genomic research in this major CAM crop. This reference genome of pineapple provides the foundation for studying the origin and regulatory mechanism of CAM photosynthesis, and the opportunity to evaluate the classification of Ananas species and botanical cultivars. © The Author 2014. Published by Oxford University Press on behalf of the Society for Experimental Biology. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  4. Social Media Analyses for Social Measurement

    Science.gov (United States)

    Schober, Michael F.; Pasek, Josh; Guggenheim, Lauren; Lampe, Cliff; Conrad, Frederick G.

    2016-01-01

    Demonstrations that analyses of social media content can align with measurement from sample surveys have raised the question of whether survey research can be supplemented or even replaced with less costly and burdensome data mining of already-existing or “found” social media content. But just how trustworthy such measurement can be—say, to replace official statistics—is unknown. Survey researchers and data scientists approach key questions from starting assumptions and analytic traditions that differ on, for example, the need for representative samples drawn from frames that fully cover the population. New conversations between these scholarly communities are needed to understand the potential points of alignment and non-alignment. Across these approaches, there are major differences in (a) how participants (survey respondents and social media posters) understand the activity they are engaged in; (b) the nature of the data produced by survey responses and social media posts, and the inferences that are legitimate given the data; and (c) practical and ethical considerations surrounding the use of the data. Estimates are likely to align to differing degrees depending on the research topic and the populations under consideration, the particular features of the surveys and social media sites involved, and the analytic techniques for extracting opinions and experiences from social media. Traditional population coverage may not be required for social media content to effectively predict social phenomena to the extent that social media content distills or summarizes broader conversations that are also measured by surveys. PMID:27257310

  5. Reliability Analyses of Groundwater Pollutant Transport

    Energy Technology Data Exchange (ETDEWEB)

    Dimakis, Panagiotis

    1997-12-31

    This thesis develops a probabilistic finite element model for the analysis of groundwater pollution problems. Two computer codes were developed: (1) one using the finite element technique to solve the two-dimensional steady state equations of groundwater flow and pollution transport, and (2) a first order reliability method code that can do a probabilistic analysis of any given analytical or numerical equation. The two codes were connected into one model, PAGAP (Probability Analysis of Groundwater And Pollution). PAGAP can be used to obtain (1) the probability that the concentration at a given point at a given time will exceed a specified value, (2) the probability that the maximum concentration at a given point will exceed a specified value, and (3) the probability that the residence time at a given point will exceed a specified period. PAGAP could be used as a tool for assessment purposes and risk analyses, for instance the assessment of the efficiency of a proposed remediation technique or to study the effects of parameter distribution for a given problem (sensitivity study). The model has been applied to study the largest self-sustained, precipitation-controlled aquifer in northern Europe, which underlies Oslo's new major airport. 92 refs., 187 figs., 26 tabs.
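
    PAGAP couples a finite element solver with a first order reliability method; as a crude stand-in for the same output quantity (the probability that a concentration exceeds a limit), the sketch below uses plain Monte Carlo sampling on a deliberately simplistic, hypothetical concentration expression. The model form, the distributions and the numbers are illustrative assumptions only.

        import numpy as np

        def exceedance_probability(conc_model, dists, c_limit, n=100_000, seed=0):
            # conc_model : function mapping sampled parameters to a concentration
            # dists      : list of callables, each drawing n samples of one input parameter
            # returns a Monte Carlo estimate of P(concentration > c_limit)
            rng = np.random.default_rng(seed)
            params = [d(rng, n) for d in dists]
            conc = conc_model(*params)
            return np.mean(conc > c_limit)

        # hypothetical toy model: a simple dilution-style expression C = M / (K * 50)
        model = lambda K, M: M / (K * 50.0)
        dists = [lambda rng, n: rng.lognormal(np.log(1e-4), 0.5, n),   # conductivity
                 lambda rng, n: rng.normal(2.0, 0.3, n)]               # source mass flux
        print(exceedance_probability(model, dists, c_limit=5.0e2))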

  6. System for analysing sickness absenteeism in Poland.

    Science.gov (United States)

    Indulski, J A; Szubert, Z

    1997-01-01

    The National System of Sickness Absenteeism Statistics has been functioning in Poland since 1977 as part of the national health statistics. The system is based on a 15-percent random sample of copies of certificates of temporary incapacity for work issued by all health care units and authorised private medical practitioners. A certificate of temporary incapacity for work is received by every insured employee who is compelled to stop working due to sickness, accident, or the necessity to care for a sick member of his/her family. The certificate is required on the first day of sickness. Analyses of disease- and accident-related sickness absenteeism carried out each year in Poland within the statistical system lead to the following main conclusions: 1. Diseases of the musculoskeletal and peripheral nervous systems, accounting, when combined, for 1/3 of the total sickness absenteeism, are a major health problem of the working population in Poland. During the past five years, incapacity for work caused by these diseases in males increased 2.5 times. 2. Circulatory diseases, and arterial hypertension and ischaemic heart disease in particular (41% and 27% of sickness days, respectively), create an essential health problem among males at productive age, especially in the 40-and-older age group. Absenteeism due to these diseases has increased in males more than two times.

  7. Comparative analyses of bidirectional promoters in vertebrates

    Directory of Open Access Journals (Sweden)

    Taylor James

    2008-05-01

    Full Text Available Abstract Background Orthologous genes with deep phylogenetic histories are likely to retain similar regulatory features. In this report we utilize orthology assignments for pairs of genes co-regulated by bidirectional promoters to map the ancestral history of the promoter regions. Results Our mapping of bidirectional promoters from humans to fish shows that many such promoters emerged after the divergence of chickens and fish. Furthermore, annotations of promoters in deep phylogenies enable detection of missing data or assembly problems present in higher vertebrates. The functional importance of bidirectional promoters is indicated by selective pressure to maintain the arrangement of genes regulated by the promoter over long evolutionary time spans. Characteristics unique to bidirectional promoters are further elucidated using a technique for unsupervised classification, known as ESPERR. Conclusion Results of these analyses will aid in our understanding of the evolution of bidirectional promoters, including whether the regulation of two genes evolved as a consequence of their proximity or if function dictated their co-regulation.

  8. Thermomagnetic Analyses to Test Concrete Stability

    Science.gov (United States)

    Geiss, C. E.; Gourley, J. R.

    2017-12-01

    Over the past decades, pyrrhotite-containing aggregate has been used in concrete to build basements and foundations in central Connecticut. The sulphur in the pyrrhotite reacts to form several secondary minerals, and the associated changes in volume lead to a loss of structural integrity. As a result, hundreds of homes have been rendered worthless, as remediation costs often exceed the value of the homes, and the value of many other homes constructed during the same period is in question because concrete provenance and potential future structural issues are unknown. While minor abundances of pyrrhotite are difficult to detect or quantify by traditional means, the mineral is easily identified through its magnetic properties. All concrete samples from affected homes show a clear increase in magnetic susceptibility above 220°C due to the γ-transition of Fe9S10 [1] and a clearly defined Curie temperature near 320°C for Fe7S8. X-ray analyses confirm the presence of pyrrhotite and ettringite in these samples. Synthetic mixtures of commercially available concrete and pyrrhotite show that the method is semiquantitative but needs to be calibrated for specific pyrrhotite mineralogies. 1. Schwarz, E.J., Magnetic properties of pyrrhotite and their use in applied geology and geophysics. Geological Survey of Canada, Ottawa, ON, Canada, 1975.

  9. Social Media Analyses for Social Measurement.

    Science.gov (United States)

    Schober, Michael F; Pasek, Josh; Guggenheim, Lauren; Lampe, Cliff; Conrad, Frederick G

    2016-01-01

    Demonstrations that analyses of social media content can align with measurement from sample surveys have raised the question of whether survey research can be supplemented or even replaced with less costly and burdensome data mining of already-existing or "found" social media content. But just how trustworthy such measurement can be-say, to replace official statistics-is unknown. Survey researchers and data scientists approach key questions from starting assumptions and analytic traditions that differ on, for example, the need for representative samples drawn from frames that fully cover the population. New conversations between these scholarly communities are needed to understand the potential points of alignment and non-alignment. Across these approaches, there are major differences in (a) how participants (survey respondents and social media posters) understand the activity they are engaged in; (b) the nature of the data produced by survey responses and social media posts, and the inferences that are legitimate given the data; and (c) practical and ethical considerations surrounding the use of the data. Estimates are likely to align to differing degrees depending on the research topic and the populations under consideration, the particular features of the surveys and social media sites involved, and the analytic techniques for extracting opinions and experiences from social media. Traditional population coverage may not be required for social media content to effectively predict social phenomena to the extent that social media content distills or summarizes broader conversations that are also measured by surveys.

  10. Validating experimental and theoretical Langmuir probe analyses

    Science.gov (United States)

    Pilling, L. S.; Carnegie, D. A.

    2007-08-01

    Analysis of Langmuir probe characteristics contains a paradox in that it is unknown a priori which theory is applicable before it is applied. Often theories are assumed to be correct when certain criteria are met although they may not validate the approach used. We have analysed the Langmuir probe data from cylindrical double and single probes acquired from a dc discharge plasma over a wide variety of conditions. This discharge contains a dual-temperature distribution and hence fitting a theoretically generated curve is impractical. To determine the densities, an examination of the current theories was necessary. For the conditions where the probe radius is the same order of magnitude as the Debye length, the gradient expected for orbital-motion limited (OML) is approximately the same as the radial-motion gradients. An analysis of the 'gradients' from the radial-motion theory was able to resolve the differences from the OML gradient value of two. The method was also able to determine whether radial or OML theories applied without knowledge of the electron temperature, or separation of the ion and electron contributions. Only the value of the space potential is necessary to determine the applicable theory.

  11. Bench top and portable mineral analysers, borehole core analysers and in situ borehole logging

    International Nuclear Information System (INIS)

    Howarth, W.J.; Watt, J.S.

    1982-01-01

    Bench top and portable mineral analysers are usually based on balanced filter techniques using scintillation detectors or on low resolution proportional detectors. The application of radioisotope x-ray techniques to in situ borehole logging is increasing, and is particularly suited for logging for tin and higher atomic number elements

  12. Integrated Field Analyses of Thermal Springs

    Science.gov (United States)

    Shervais, K.; Young, B.; Ponce-Zepeda, M. M.; Rosove, S.

    2011-12-01

    A group of undergraduate researchers through the SURE internship offered by the Southern California Earthquake Center (SCEC) have examined thermal springs in southern Idaho and northern Utah, as well as mud volcanoes in the Salton Sea, California. We used an integrated approach to estimate the setting and maximum temperature, including water chemistry, iPad-based image and database management, microbiology, and gas analyses with a modified Giggenbach sampler. All springs were characterized using GISRoam (tmCogent3D). We are performing geothermometry calculations as well as comparisons with temperature gradient data on the results, while also analyzing biological samples. Analyses include water temperature, pH, electrical conductivity, and TDS measured in the field. Each sample is sealed and chilled and delivered to a water lab within 12 hours. Temperatures are continuously monitored with the use of Solinst Levelogger Juniors. Through partnership with a local community college geology club, we receive results on a monthly basis and are able to process initial data earlier in order to evaluate data over a longer time span. The springs and mudpots contained microbial organisms, which were analyzed using methods of single colony isolation, polymerase chain reaction, and DNA sequencing, showing the impact of the organisms on the springs or vice versa. Soon we will collect gas samples at sites that show signs of gas, using a hybrid of the Giggenbach method and our own methods. Drawing gas samples has proven a challenge; however, we devised a method to draw out gas samples utilizing the Giggenbach flask, transferring samples to glass blood sample tubes, replacing NaOH in the Giggenbach flask, and evacuating it in the field for multiple samples using a vacuum pump. We also use a floating platform devised to carry and lower a levelogger, and an in-line fuel filter from a tractor in order to keep mud from contaminating the equipment. The use of raster

  13. Transient Seepage for Levee Engineering Analyses

    Science.gov (United States)

    Tracy, F. T.

    2017-12-01

    Historically, steady-state seepage analyses have been a key tool for designing levees by practicing engineers. However, with the advances in computer modeling, transient seepage analysis has become a potentially viable tool. A complication is that the levees usually have partially saturated flow, and this is significantly more complicated in transient flow. This poster illustrates four elements of our research in partially saturated flow relating to the use of transient seepage for levee design: (1) a comparison of results from SEEP2D, SEEP/W, and SLIDE for a generic levee cross section common to the southeastern United States; (2) the results of a sensitivity study of varying saturated hydraulic conductivity, the volumetric water content function (as represented by van Genuchten), and volumetric compressibility; (3) a comparison of when soils do and do not exhibit hysteresis, and (4) a description of proper and improper use of transient seepage in levee design. The variables considered for the sensitivity and hysteresis studies are pore pressure beneath the confining layer at the toe, the flow rate through the levee system, and a levee saturation coefficient varying between 0 and 1. Getting results for SEEP2D, SEEP/W, and SLIDE to match proved more difficult than expected. After some effort, the results matched reasonably well. Differences in results were caused by various factors, including bugs, different finite element meshes, different numerical formulations of the system of nonlinear equations to be solved, and differences in convergence criteria. Varying volumetric compressibility affected the above test variables the most. The levee saturation coefficient was most affected by the use of hysteresis. The improper use of pore pressures from a transient finite element seepage solution imported into a slope stability computation was found to be the most grievous mistake in using transient seepage in the design of levees.
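
    For reference, the van Genuchten volumetric water content function used in the sensitivity study has the closed form theta(h) = theta_r + (theta_s - theta_r) * [1 + (alpha*|h|)^n]^(-m) with m = 1 - 1/n for h < 0. A minimal Python sketch follows, with hypothetical parameter values for a silty levee fill.

        import numpy as np

        def van_genuchten_theta(h, theta_r, theta_s, alpha, n):
            # volumetric water content theta(h) for pressure head h (negative when unsaturated)
            # theta_r, theta_s : residual and saturated volumetric water contents
            # alpha [1/m], n   : fitting parameters; m = 1 - 1/n
            h = np.asarray(h, dtype=float)
            m = 1.0 - 1.0 / n
            se = np.where(h < 0.0, (1.0 + (alpha * np.abs(h)) ** n) ** (-m), 1.0)
            return theta_r + (theta_s - theta_r) * se

        # hypothetical silty fill: theta_r=0.05, theta_s=0.40, alpha=1.0 1/m, n=1.6
        print(van_genuchten_theta([-0.1, -1.0, -5.0, 0.2], 0.05, 0.40, 1.0, 1.6))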

  14. Summary of the analyses for recovery factors

    Science.gov (United States)

    Verma, Mahendra K.

    2017-07-17

    Introduction: In order to determine the hydrocarbon potential of oil reservoirs within the U.S. sedimentary basins for which the carbon dioxide enhanced oil recovery (CO2-EOR) process has been considered suitable, the CO2 Prophet model was chosen by the U.S. Geological Survey (USGS) to be the primary source for estimating recovery-factor values for individual reservoirs. The choice was made because of the model's reliability and the ease with which it can be used to assess a large number of reservoirs. The other two approaches—the empirical decline curve analysis (DCA) method and a review of published literature on CO2-EOR projects—were deployed to verify the results of the CO2 Prophet model. This chapter discusses the results from CO2 Prophet (chapter B, by Emil D. Attanasi, this report) and compares them with results from decline curve analysis (chapter C, by Hossein Jahediesfanjani) and those reported in the literature for selected reservoirs with adequate data for analyses (chapter D, by Ricardo A. Olea). To estimate the technically recoverable hydrocarbon potential for oil reservoirs where CO2-EOR has been applied, two of the three approaches—CO2 Prophet modeling and DCA—do not include analysis of economic factors, while the third approach—review of published literature—implicitly includes economics. For selected reservoirs, DCA has provided estimates of the technically recoverable hydrocarbon volumes, which, in combination with calculated amounts of original oil in place (OOIP), helped establish incremental CO2-EOR recovery factors for individual reservoirs. The review of published technical papers and reports has provided substantial information on recovery factors for 70 CO2-EOR projects that are either commercially profitable or classified as pilot tests. When comparing the results, it is important to bear in mind the differences and limitations of these three approaches.

  15. The ABC (Analysing Biomolecular Contacts-database

    Directory of Open Access Journals (Sweden)

    Walter Peter

    2007-03-01

    Full Text Available As protein-protein interactions are one of the basic mechanisms in most cellular processes, it is desirable to understand the molecular details of protein-protein contacts and ultimately to be able to predict which proteins interact. Interface areas on a protein surface that are involved in protein interactions exhibit certain characteristics. Therefore, several attempts have been made to distinguish protein interactions from each other and to categorize them. One way of classifying them is into transient and permanent interactions. Previously, two of the authors analysed several properties of transient complexes, such as the amino acid and secondary structure element composition and pairing preferences. Certainly, interfaces can be characterized by many more possible attributes, and this is a subject of intense ongoing research. Although several freely available online databases exist that illuminate various aspects of protein-protein interactions, we decided to construct a new database collecting all desired interface features and allowing for facile selection of subsets of complexes. MySQL is used as the database server, and the program logic was written in Java. Furthermore, several class extensions and tools such as Jmol were included to visualize the interfaces, and JFreeChart for the representation of diagrams and statistics. The contact data are automatically generated from standard PDB files by a Tcl/Tk script running through the molecular visualization package VMD. Currently the database contains 536 interfaces extracted from 479 PDB files, and it can be queried by various types of parameters. Here, we describe the database design and demonstrate its usefulness with a number of selected features.
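
    The abstract does not give the actual MySQL table layout, so the following is only a hypothetical, minimal schema sketch (written against Python's built-in sqlite3 so that it is self-contained) of how complexes and their interface contacts might be stored and queried.

        import sqlite3

        # illustrative, hypothetical schema; the real ABC database runs on MySQL and
        # its actual table layout is not described in the abstract
        conn = sqlite3.connect(":memory:")
        conn.executescript("""
        CREATE TABLE complex (
            complex_id   INTEGER PRIMARY KEY,
            pdb_code     TEXT NOT NULL,
            interaction  TEXT CHECK (interaction IN ('transient', 'permanent'))
        );
        CREATE TABLE contact (
            contact_id   INTEGER PRIMARY KEY,
            complex_id   INTEGER REFERENCES complex(complex_id),
            chain_a      TEXT, residue_a TEXT, secstruct_a TEXT,
            chain_b      TEXT, residue_b TEXT, secstruct_b TEXT,
            distance_ang REAL
        );
        """)
        conn.execute("INSERT INTO complex VALUES (1, '1BRS', 'transient')")
        print(conn.execute("SELECT pdb_code, interaction FROM complex").fetchall())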

  16. Trend analyses with river sediment rating curves

    Science.gov (United States)

    Warrick, Jonathan A.

    2015-01-01

    Sediment rating curves, which are fitted relationships between river discharge (Q) and suspended-sediment concentration (C), are commonly used to assess patterns and trends in river water quality. In many of these studies it is assumed that rating curves have a power-law form (i.e., C = aQ^b, where a and b are fitted parameters). Two fundamental questions about the utility of these techniques are assessed in this paper: (i) How well do the parameters, a and b, characterize trends in the data? (ii) Are trends in rating curves diagnostic of changes to river water or sediment discharge? As noted in previous research, the offset parameter, a, is not an independent variable for most rivers, but rather strongly dependent on b and Q. Here it is shown that a is a poor metric for trends in the vertical offset of a rating curve, and a new parameter, â, as determined by the discharge-normalized power function [C = â(Q/QGM)^b], where QGM is the geometric mean of the Q values sampled, provides a better characterization of trends. However, these techniques must be applied carefully, because curvature in the relationship between log(Q) and log(C), which exists for many rivers, can produce false trends in â and b. Also, it is shown that trends in â and b are not uniquely diagnostic of river water or sediment supply conditions. For example, an increase in â can be caused by an increase in sediment supply, a decrease in water supply, or a combination of these conditions. Large changes in water and sediment supplies can occur without any change in the parameters, â and b. Thus, trend analyses using sediment rating curves must include additional assessments of the time-dependent rates and trends of river water, sediment concentrations, and sediment discharge.
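
    The discharge-normalised fit described above is straightforward to compute: estimate b and â by least squares in log-log space after dividing Q by its geometric mean QGM. The short sketch below does exactly that; the sample discharges and concentrations are invented for illustration.

        import numpy as np

        def fit_rating_curve(q, c):
            # fit C = a_hat * (Q/Q_GM)**b by least squares in log-log space
            # and return (a_hat, b, Q_GM)
            q, c = np.asarray(q, float), np.asarray(c, float)
            q_gm = np.exp(np.mean(np.log(q)))           # geometric mean of sampled Q
            x = np.log(q / q_gm)
            b, log_a_hat = np.polyfit(x, np.log(c), 1)  # slope b, intercept ln(a_hat)
            return np.exp(log_a_hat), b, q_gm

        # hypothetical samples: discharge (m^3/s) and suspended-sediment concentration (mg/L)
        q = np.array([5, 12, 30, 80, 150, 400.0])
        c = np.array([20, 55, 140, 430, 800, 2500.0])
        a_hat, b, q_gm = fit_rating_curve(q, c)
        print(f"a_hat={a_hat:.1f} mg/L at Q_GM={q_gm:.1f} m^3/s, b={b:.2f}")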

  17. BN-600 hybrid core benchmark analyses

    International Nuclear Information System (INIS)

    Kim, Y.I.; Stanculescu, A.; Finck, P.; Hill, R.N.; Grimm, K.N.

    2003-01-01

    Benchmark analyses for the hybrid BN-600 reactor, which contains three uranium enrichment zones and one plutonium zone in the core, have been performed within the framework of an IAEA-sponsored Coordinated Research Project. The results for several relevant reactivity parameters obtained by the participants with their own state-of-the-art basic data and codes were compared in terms of calculational uncertainty, and their effects on the ULOF transient behavior of the hybrid BN-600 core were evaluated. The comparison of the diffusion and transport results obtained for the homogeneous representation generally shows good agreement for most parameters between the RZ and HEX-Z models. The burnup effect and the heterogeneity effect on most reactivity parameters also show good agreement for the HEX-Z diffusion and transport theory results. A large difference noticed for the sodium and steel density coefficients is mainly due to differences in the spatial coefficient predictions for non-fuelled regions. The burnup reactivity loss was evaluated to be 0.025 (4.3 $) within ∼ 5.0% standard deviation. The heterogeneity effect on most reactivity coefficients was estimated to be small. The heterogeneity treatment reduced the control rod worth by 2.3%. The heterogeneity effect on the k-eff and control rod worth appeared to differ strongly depending on the heterogeneity treatment method. A substantial spread noticed for several reactivity coefficients did not have a significant impact on the transient behavior prediction. This result is attributable to compensating effects between several reactivity effects and the specific design of the partially MOX fuelled hybrid core. (author)

  18. Vibro-spring particle size distribution analyser

    International Nuclear Information System (INIS)

    Patel, Ketan Shantilal

    2002-01-01

    This thesis describes the design and development of an automated pre-production particle size distribution analyser for particles in the 20 - 2000 μm size range. This work is a follow-up to the vibro-spring particle sizer reported by Shaeri. In its most basic form, the instrument comprises a horizontally held closed-coil helical spring that is partly filled with the test powder and sinusoidally vibrated in the transverse direction. Particle size distribution data are obtained by stretching the spring to known lengths and measuring the mass of the powder discharged from the spring's coils. The size of the particles, on the other hand, is determined from the spring 'intercoil' distance. The instrument developed by Shaeri had limited use due to its inability to measure sample mass directly. For the device reported here, modifications are made to the original configuration to establish a means of direct sample mass measurement. The feasibility of techniques for measuring the mass of powder retained within the spring is investigated in detail. Initially, the measurement of mass is executed in-situ from the vibration characteristics based on the spring's first harmonic resonant frequency. This method is often erratic and unreliable due to particle-particle-spring wall interactions and spring bending. A much more successful alternative is found in a more complicated arrangement in which the spring forms part of a stiff cantilever system pivoted along its main axis. Here, the sample mass is determined in the 'static mode' by monitoring the cantilever beam's deflection following the termination of vibration. The system performance has been optimised through variations of the mechanical design of the key components and the operating procedure, as well as by taking into account the effect of changes in the ambient temperature on the system's response. The thesis also describes the design and development of the ancillary mechanisms. These include the pneumatic

  19. Kuosheng Mark III containment analyses using GOTHIC

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Ansheng, E-mail: samuellin1999@iner.gov.tw; Chen, Yen-Shu; Yuann, Yng-Ruey

    2013-10-15

    Highlights: • The Kuosheng Mark III containment model is established using GOTHIC. • Containment pressure and temperature responses due to LOCA are presented. • The calculated results are all below the design values and are compared with the FSAR results. • The calculated results can serve as an analysis reference for an SPU project in the future. -- Abstract: The Kuosheng nuclear power plant in Taiwan is a twin-unit BWR/6 plant, and both units utilize the Mark III containment. Currently, the plant is performing a stretch power uprate (SPU) project to increase the core thermal power to 103.7% OLTP (original licensed thermal power). However, the containment response in the Kuosheng Final Safety Analysis Report (FSAR) was completed more than twenty-five years ago. The purpose of this study is to establish a Kuosheng Mark III containment model using the containment program GOTHIC. The containment pressure and temperature responses under the design-basis accidents, which are the main steam line break (MSLB) and the recirculation line break (RCLB) accidents, are investigated. Short-term and long-term analyses are presented in this study. The short-term analysis calculates the drywell peak pressure and temperature, which occur in the early stage of the LOCAs. The long-term analysis calculates the peak pressure and temperature of the reactor building space. In the short-term analysis, the calculated peak drywell-to-wetwell differential pressure is 140.6 kPa for the MSLB, which is below the design value of 189.6 kPa. The calculated peak drywell temperature is 158 °C, which is still below the design value of 165.6 °C. In addition, in the long-term analysis, the calculated peak containment pressure is 47 kPa G, which is below the design value of 103.4 kPa G. The calculated peak containment temperature is 74.7 °C, which is lower than the design value of 93.3 °C. Therefore, the Kuosheng Mark III containment can maintain its integrity after

  20. YALINA Booster subcritical assembly modeling and analyses

    International Nuclear Information System (INIS)

    Talamo, A.; Gohar, Y.; Aliberti, G.; Cao, Y.; Zhong, Z.; Kiyavitskaya, H.; Bournos, V.; Fokov, Y.; Routkovskaya, C.; Sadovich, S.

    2010-01-01

    Full text: Accurate simulation models of the YALINA Booster assembly of the Joint Institute for Power and Nuclear Research (JIPNR)-Sosny, Belarus have been developed by Argonne National Laboratory (ANL) of the USA. YALINA-Booster has coupled zones operating with fast and thermal neutron spectra, which requires special attention in the modelling process. Three different uranium enrichments of 90%, 36% or 21% were used in the fast zone and 10% uranium enrichment was used in the thermal zone. Two of the most advanced Monte Carlo computer programs have been utilized for the ANL analyses: MCNP of the Los Alamos National Laboratory and MONK of the British Nuclear Fuel Limited and SERCO Assurance. The developed geometrical models for both computer programs modelled all the details of the YALINA Booster facility as described in the technical specifications defined in the International Atomic Energy Agency (IAEA) report without any geometrical approximation or material homogenization. Material impurities and the measured material densities have been used in the models. The obtained results for the neutron multiplication factors calculated in criticality mode (keff) and in source mode (ksrc) with an external neutron source from the two Monte Carlo programs are very similar. Different external neutron sources have been investigated including californium, deuterium-deuterium (D-D), and deuterium-tritium (D-T) neutron sources. The spatial neutron flux profiles and the neutron spectra in the experimental channels were calculated. In addition, the kinetic parameters were defined including the effective delayed neutron fraction, the prompt neutron lifetime, and the neutron generation time. A new calculation methodology has been developed at ANL to simulate the pulsed neutron source experiments. In this methodology, the MCNP code is used to simulate the detector response from a single pulse of the external neutron source and a C code is used to superimpose the pulse until the

  1. Altools: a user friendly NGS data analyser.

    Science.gov (United States)

    Camiolo, Salvatore; Sablok, Gaurav; Porceddu, Andrea

    2016-02-17

    Genotyping by re-sequencing has become a standard approach to estimate single nucleotide polymorphism (SNP) diversity, haplotype structure and the biodiversity and has been defined as an efficient approach to address geographical population genomics of several model species. To access core SNPs and insertion/deletion polymorphisms (indels), and to infer the phyletic patterns of speciation, most such approaches map short reads to the reference genome. Variant calling is important to establish patterns of genome-wide association studies (GWAS) for quantitative trait loci (QTLs), and to determine the population and haplotype structure based on SNPs, thus allowing content-dependent trait and evolutionary analysis. Several tools have been developed to investigate such polymorphisms as well as more complex genomic rearrangements such as copy number variations, presence/absence variations and large deletions. The programs available for this purpose have different strengths (e.g. accuracy, sensitivity and specificity) and weaknesses (e.g. low computation speed, complex installation procedure and absence of a user-friendly interface). Here we introduce Altools, a software package that is easy to install and use, which allows the precise detection of polymorphisms and structural variations. Altools uses the BWA/SAMtools/VarScan pipeline to call SNPs and indels, and the dnaCopy algorithm to achieve genome segmentation according to local coverage differences in order to identify copy number variations. It also uses insert size information from the alignment of paired-end reads and detects potential large deletions. A double mapping approach (BWA/BLASTn) identifies precise breakpoints while ensuring rapid elaboration. Finally, Altools implements several processes that yield deeper insight into the genes affected by the detected polymorphisms. Altools was used to analyse both simulated and real next-generation sequencing (NGS) data and performed satisfactorily in terms of
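    The BWA/SAMtools/VarScan chain mentioned above is a conventional short-read variant-calling pipeline. The sketch below strings typical invocations together with Python's subprocess module; the file names (ref.fa, reads_1.fq, VarScan.jar) are placeholders and the exact options vary between tool versions, so treat this as an orientation aid rather than Altools' actual implementation.

```python
import shlex
import subprocess

def run(cmd, stdout_path=None):
    """Run one pipeline stage, optionally redirecting stdout to a file."""
    print("+", cmd)
    if stdout_path:
        with open(stdout_path, "w") as out:
            subprocess.run(shlex.split(cmd), stdout=out, check=True)
    else:
        subprocess.run(shlex.split(cmd), check=True)

# Typical invocations; adjust paths and options to the installed tool versions.
run("bwa index ref.fa")
run("bwa mem ref.fa reads_1.fq reads_2.fq", stdout_path="aln.sam")
run("samtools sort -o aln.sorted.bam aln.sam")
run("samtools index aln.sorted.bam")
run("samtools mpileup -f ref.fa aln.sorted.bam", stdout_path="aln.pileup")
run("java -jar VarScan.jar mpileup2snp aln.pileup", stdout_path="snps.txt")
run("java -jar VarScan.jar mpileup2indel aln.pileup", stdout_path="indels.txt")
```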

  2. Minimización de una función normal-merit mediante un algoritmo convergente globalmente

    Directory of Open Access Journals (Sweden)

    Gómez Suárez, M.

    1998-01-01

    Full Text Available In this work we present two concepts related to the solution of systems of nonlinear equations and inequalities. The first concept is that of a normal merit function, which summarises the basic properties shared by various known merit functions. The second concept is that of a Newtonian operator, whose values generalise the concept of the Hessian for the normal merit function. By combining the generalised Newton method with certain first-order methods, we obtain a globally convergent algorithm for minimising normal merit functions.
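    A minimal sketch of the kind of globally convergent scheme described above: a Newton step on a merit function combined with a first-order safeguard (steepest descent plus a backtracking line search). The example system, merit function and tolerances are illustrative and are not taken from the paper.

```python
import numpy as np

def residuals(x):
    # Illustrative nonlinear system F(x) = 0.
    return np.array([x[0] ** 2 + x[1] ** 2 - 1.0, x[0] - x[1] ** 3])

def jacobian(x):
    return np.array([[2.0 * x[0], 2.0 * x[1]],
                     [1.0, -3.0 * x[1] ** 2]])

def merit(x):
    f = residuals(x)
    return 0.5 * f @ f              # least-squares merit function

def globalised_newton(x, tol=1e-10, max_iter=100):
    for _ in range(max_iter):
        f, J = residuals(x), jacobian(x)
        grad = J.T @ f              # gradient of the merit function
        if np.linalg.norm(grad) < tol:
            break
        try:
            step = np.linalg.solve(J, -f)   # Newton step on F(x) = 0
            if grad @ step > 0:             # not a descent direction for the merit
                step = -grad
        except np.linalg.LinAlgError:
            step = -grad                    # fall back to steepest descent
        t, m0 = 1.0, merit(x)
        # Backtracking (Armijo) line search guarantees global convergence.
        while merit(x + t * step) > m0 + 1e-4 * t * (grad @ step) and t > 1e-12:
            t *= 0.5
        x = x + t * step
    return x

print(globalised_newton(np.array([2.0, 2.0])))
```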

  3. Los usos sociales de la ciencia: tecnologías convergentes y democratización del conocimiento

    Directory of Open Access Journals (Sweden)

    José Manuel Rodríguez Victoriano

    2009-01-01

    Full Text Available La globalización neoliberal de las últimas décadas ha traído aparejadas inmensas transformaciones económicas, políticas y sociales. La reorganización del sistema capitalista que surge de este periodo se ha concretado en el incremento de la vulnerabilidad y las desigualdades sociales, dando lugar a la emergencia de la llamada "nueva cuestión social". El planteamiento que aquí se sigue investiga el papel que el conocimiento científico y la tecnociencia juegan en el capitalismo cognitivo como intensificadores de las formas de desigualdad. Atender esto resulta necesario para entender y completar la comprensión de las diferentes dimensiones de las desigualdades económicas, sociales y culturales de la actual globalización neoliberal. La tesis de fondo sostiene que en el actual capitalismo cognitivo no se puede habla de sociedades democráticas, con rigor científico, mientras no se hayan democratizado las decisiones sobre los usos sociales de la ciencia.

  4. First Super-Earth Atmosphere Analysed

    Science.gov (United States)

    2010-12-01

    The atmosphere around a super-Earth exoplanet has been analysed for the first time by an international team of astronomers using ESO's Very Large Telescope. The planet, which is known as GJ 1214b, was studied as it passed in front of its parent star and some of the starlight passed through the planet's atmosphere. We now know that the atmosphere is either mostly water in the form of steam or is dominated by thick clouds or hazes. The results will appear in the 2 December 2010 issue of the journal Nature. The planet GJ 1214b was confirmed in 2009 using the HARPS instrument on ESO's 3.6-metre telescope in Chile (eso0950) [1]. Initial findings suggested that this planet had an atmosphere, which has now been confirmed and studied in detail by an international team of astronomers, led by Jacob Bean (Harvard-Smithsonian Center for Astrophysics), using the FORS instrument on ESO's Very Large Telescope. "This is the first super-Earth to have its atmosphere analysed. We've reached a real milestone on the road toward characterising these worlds," said Bean. GJ 1214b has a radius of about 2.6 times that of the Earth and is about 6.5 times as massive, putting it squarely into the class of exoplanets known as super-Earths. Its host star lies about 40 light-years from Earth in the constellation of Ophiuchus (the Serpent Bearer). It is a faint star [2], but it is also small, which means that the size of the planet is large compared to the stellar disc, making it relatively easy to study [3]. The planet travels across the disc of its parent star once every 38 hours as it orbits at a distance of only two million kilometres: about seventy times closer than the Earth orbits the Sun. To study the atmosphere, the team observed the light coming from the star as the planet passed in front of it [4]. During these transits, some of the starlight passes through the planet's atmosphere and, depending on the chemical composition and weather on the planet, specific wavelengths of light are
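    The measurement rests on transmission spectroscopy during transit, where the fraction of starlight blocked is roughly (R_planet/R_star)^2. The sketch below uses the planet radius quoted above (2.6 Earth radii) together with an assumed stellar radius of about 0.21 solar radii for the M dwarf host; the stellar value is an illustrative assumption, not taken from the text.

```python
R_EARTH_KM = 6371.0
R_SUN_KM = 695_700.0

r_planet = 2.6 * R_EARTH_KM        # planet radius quoted in the abstract
r_star = 0.21 * R_SUN_KM           # assumed M-dwarf radius (illustrative)

depth = (r_planet / r_star) ** 2   # fraction of starlight blocked mid-transit
print(f"transit depth ~ {depth:.4f} ({depth * 100:.2f}% of the stellar flux)")
```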

  5. Systems reliability analyses and risk analyses for the licencing procedure under atomic law

    International Nuclear Information System (INIS)

    Berning, A.; Spindler, H.

    1983-01-01

    For the licencing procedure under atomic law in accordance with Article 7 AtG, the nuclear power plant as a whole needs to be assessed, and the reliability of systems and plant components that are essential to safety is to be determined with probabilistic methods. This requirement is a consequence of the safety criteria for nuclear power plants issued by the Home Department (BMI). Systems reliability studies and risk analyses used in licencing procedures under atomic law are identified. The stress is on licencing decisions, mainly for PWR-type reactors. Reactor Safety Commission (RSK) guidelines, examples of reasoning in legal proceedings and arguments put forth by objectors are also dealt with. Correlations between reliability analyses made by experts and licencing decisions are shown by means of examples. (orig./HP) [de

  6. Statistical power analyses using G*Power 3.1: tests for correlation and regression analyses.

    Science.gov (United States)

    Faul, Franz; Erdfelder, Edgar; Buchner, Axel; Lang, Albert-Georg

    2009-11-01

    G*Power is a free power analysis program for a variety of statistical tests. We present extensions and improvements of the version introduced by Faul, Erdfelder, Lang, and Buchner (2007) in the domain of correlation and regression analyses. In the new version, we have added procedures to analyze the power of tests based on (1) single-sample tetrachoric correlations, (2) comparisons of dependent correlations, (3) bivariate linear regression, (4) multiple linear regression based on the random predictor model, (5) logistic regression, and (6) Poisson regression. We describe these new features and provide a brief introduction to their scope and handling.
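    G*Power itself is a standalone program, but the kind of computation it performs for one of the cases listed above, the power of a two-sided test of a bivariate correlation, can be sketched with the standard Fisher z approximation. This is an illustration of the statistics involved, not G*Power code.

```python
import math
from scipy.stats import norm

def power_correlation(r, n, alpha=0.05):
    """Approximate power of a two-sided test of H0: rho = 0 against rho = r,
    using the Fisher z transform: atanh(r_hat) ~ Normal(atanh(rho), 1/(n-3))."""
    delta = math.atanh(r) * math.sqrt(n - 3)   # standardised effect under H1
    z_crit = norm.ppf(1 - alpha / 2)
    return (1 - norm.cdf(z_crit - delta)) + norm.cdf(-z_crit - delta)

# With r = 0.3 and n = 84 the approximate power is roughly 0.80.
print(power_correlation(r=0.3, n=84))
```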

  7. Forced vibration tests and simulation analyses of a nuclear reactor building. Part 2: simulation analyses

    International Nuclear Information System (INIS)

    Kuno, M.; Nakagawa, S.; Momma, T.; Naito, Y.; Niwa, M.; Motohashi, S.

    1995-01-01

    Forced vibration tests of a BWR-type reactor building, Hamaoka Unit 4, were performed. Valuable data on the dynamic characteristics of the soil-structure interaction system were obtained through the tests. Simulation analyses of the fundamental dynamic characteristics of the soil-structure system were conducted, using a basic lumped mass soil-structure model (lattice model), and strong correlation with the measured data was obtained. Furthermore, detailed simulation models were employed to investigate the effects of simultaneously induced vertical response and response of the adjacent turbine building on the lateral response of the reactor building. (author). 4 refs., 11 figs
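    In its simplest form, a lumped mass (lattice) soil-structure model reduces to a few masses and springs whose natural frequencies follow from a generalised eigenvalue problem. The two-degree-of-freedom sketch below is generic; all mass and stiffness values are placeholders, not Hamaoka Unit 4 data.

```python
import numpy as np

# Generic 2-DOF idealisation: m1 = superstructure, m2 = basemat on a soil spring.
m1, m2 = 5.0e7, 8.0e7          # kg (placeholders)
k1, k2 = 4.0e10, 9.0e10        # N/m: structural stiffness, soil spring (placeholders)

M = np.diag([m1, m2])
K = np.array([[k1, -k1],
              [-k1, k1 + k2]])

# Generalised eigenvalue problem K x = w^2 M x, solved via M^-1 K.
eigvals = np.linalg.eigvals(np.linalg.solve(M, K))
freqs_hz = np.sqrt(np.sort(eigvals.real)) / (2.0 * np.pi)
print("natural frequencies [Hz]:", freqs_hz)
```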

  8. Integrated Waste Treatment Unit (IWTU) Input Coal Analyses and Off-Gass Filter (OGF) Content Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Jantzen, Carol M. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Missimer, David M. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Guenther, Chris P. [National Energy Technology Lab. (NETL), Morgantown, WV (United States); Shekhawat, Dushyant [National Energy Technology Lab. (NETL), Morgantown, WV (United States); VanEssendelft, Dirk T. [National Energy Technology Lab. (NETL), Morgantown, WV (United States); Means, Nicholas C. [AECOM Technology Corp., Oak Ridge, TN (United States)

    2015-04-23

    in process piping and materials, in excessive off-gas absorbent loading, and in undesired process emissions. The ash content of the coal is important as the ash adds to the DMR and other vessel products which affect the final waste product mass and composition. The amount and composition of the ash also affect the reaction kinetics. Thus ash content and composition contribute to the mass balance. In addition, sodium, potassium, calcium, sulfur, and maybe silica and alumina in the ash may contribute to wall-scale formation. Sodium, potassium, and alumina in the ash will be overwhelmed by the sodium, potassium, and alumina from the feed, but the impact from the other ash components needs to be quantified. A maximum coal particle size is specified so the feed system does not plug, and a minimum particle size is specified to prevent excess elutriation from the DMR to the Process Gas Filter (PGF). A vendor specification was used to procure the calcined coal for IWTU processing. While the vendor supplied a composite analysis for the 22 tons of coal (Appendix A), this study compares independent analyses of the coal performed at the Savannah River National Laboratory (SRNL) and at the National Energy Technology Laboratory (NETL). Three supersacks were sampled at three different heights within the sack in order to determine the within-bag and between-bag variability of the coal. These analyses were also compared to the vendor's composite analyses and to the coal specification. These analyses were also compared to historic data on Bestac coal analyses that had been performed at Hazen Research Inc. (HRI) between 2004 and 2011.

  9. Integrating and scheduling an open set of static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Mezini, Mira; Kloppenburg, Sven

    2006-01-01

    to keep the set of analyses open. We propose an approach to integrating and scheduling an open set of static analyses which decouples the individual analyses and coordinates the analysis executions such that the overall time and space consumption is minimized. The approach has been implemented...... for the Eclipse IDE and has been used to integrate a wide range of analyses such as finding bug patterns, detecting violations of design guidelines, or type system extensions for Java....

  10. Angular analyses in relativistic quantum mechanics; Analyses angulaires en mecanique quantique relativiste

    Energy Technology Data Exchange (ETDEWEB)

    Moussa, P [Commissariat a l' Energie Atomique, 91 - Saclay (France). Centre d' Etudes Nucleaires

    1968-06-01

    This work describes the angular analysis of reactions between particles with spin in a fully relativistic fashion. One-particle states are introduced, following Wigner's method, as representations of the inhomogeneous Lorentz group. In order to perform the angular analyses, the reduction of the product of two representations of the inhomogeneous Lorentz group is studied. Clebsch-Gordan coefficients are computed for the following couplings: l-s coupling, helicity coupling, multipolar coupling, and symmetric coupling for more than two particles. Massless and massive particles are handled simultaneously. On the way we construct spinorial amplitudes and free fields; we recall how to establish convergence theorems for angular expansions from analyticity hypotheses. Finally we substitute these hypotheses for the idea of 'potential radius', which gives at low energy the usual 'centrifugal barrier' factors. The presence of such factors had never been deduced from hypotheses compatible with relativistic invariance. (author) [French] A formalism is described that takes relativistic invariance into account in the angular analysis of reaction amplitudes between particles of arbitrary spin. Following Wigner, one-particle states are introduced using the representations of the inhomogeneous Lorentz group. To carry out the angular analyses, the reduction of the product of two representations of the inhomogeneous Lorentz group is studied. The corresponding Clebsch-Gordan coefficients are computed for the following couplings: l-s coupling, helicity coupling, multipolar coupling, and symmetric coupling for more than two particles. Massless and massive particles are treated simultaneously. Along the way, spinorial amplitudes are introduced and the free fields are constructed; it is recalled how analyticity hypotheses make it possible to establish convergence theorems for the angular expansions. Finally, a substitute is provided for the
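    As a small numerical companion to the couplings listed above, SymPy can evaluate ordinary SU(2) Clebsch-Gordan coefficients symbolically. This covers only the rotation-group part of the machinery, not the full reduction for the inhomogeneous Lorentz group developed in the paper.

```python
from sympy import S
from sympy.physics.quantum.cg import CG

# <j1 m1; j2 m2 | j3 m3> for coupling two spin-1/2 particles to total spin 1.
coeff = CG(S(1) / 2, S(1) / 2, S(1) / 2, -S(1) / 2, 1, 0)
print(coeff.doit())   # sqrt(2)/2, i.e. 1/sqrt(2)
```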

  11. Comparative biochemical analyses of venous blood and peritoneal fluid from horses with colic using a portable analyser and an in-house analyser.

    Science.gov (United States)

    Saulez, M N; Cebra, C K; Dailey, M

    2005-08-20

    Fifty-six horses with colic were examined over a period of three months. The concentrations of glucose, lactate, sodium, potassium and chloride, and the pH of samples of blood and peritoneal fluid, were determined with a portable clinical analyser and with an in-house analyser and the results were compared. Compared with the in-house analyser, the portable analyser gave higher pH values for blood and peritoneal fluid with greater variability in the alkaline range, and lower pH values in the acidic range, lower concentrations of glucose in the range below 8.3 mmol/l, and lower concentrations of lactate in venous blood in the range below 5 mmol/l and in peritoneal fluid in the range below 2 mmol/l, with less variability. On average, the portable analyser underestimated the concentrations of lactate and glucose in peritoneal fluid in comparison with the in-house analyser. Its measurements of the concentrations of sodium and chloride in peritoneal fluid had a higher bias and were more variable than the measurements in venous blood, and its measurements of potassium in venous blood and peritoneal fluid had a smaller bias and less variability than the measurements made with the in-house analyser.
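    The bias-and-variability comparison between the portable and in-house analysers is a classic method-comparison problem; a Bland-Altman style summary (mean difference and 95% limits of agreement) is one common way to express it, although the study itself is not claimed to have used exactly this procedure. The paired readings below are invented for illustration.

```python
import numpy as np

# Paired lactate readings (mmol/l) from the two devices -- illustrative values only.
portable = np.array([1.1, 2.3, 0.9, 4.0, 1.8, 3.2, 2.7])
in_house = np.array([1.4, 2.6, 1.2, 4.1, 2.2, 3.5, 3.0])

diff = portable - in_house
bias = diff.mean()                  # systematic under/over-estimation of the portable unit
loa = 1.96 * diff.std(ddof=1)       # half-width of the 95% limits of agreement
print(f"bias = {bias:.2f} mmol/l, limits of agreement = {bias - loa:.2f} to {bias + loa:.2f}")
```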

  12. ATHENA/INTRA analyses for ITER, NSSR-2

    International Nuclear Information System (INIS)

    Shen, Kecheng; Eriksson, John; Sjoeberg, A.

    1999-02-01

    The present report is a summary report including thermal-hydraulic analyses made at Studsvik Eco and Safety AB for the ITER NSSR-2 safety documentation. The objective of the analyses was to reveal the safety characteristics of various heat transfer systems at specified operating conditions and to indicate the conditions for which there were obvious risks of jeopardising the structural integrity of the coolant systems. In the latter case also some analyses were made to indicate conceivable mitigating measures for maintaining the integrity. The analyses were primarily concerned with the First Wall and Divertor heat transfer systems. Several enveloping transients were analysed with associated specific flow and heat load boundary conditions. The analyses were performed with the ATHENA and INTRA codes

  13. ATHENA/INTRA analyses for ITER, NSSR-2

    Energy Technology Data Exchange (ETDEWEB)

    Shen, Kecheng; Eriksson, John; Sjoeberg, A

    1999-02-01

    The present report is a summary report including thermal-hydraulic analyses made at Studsvik Eco and Safety AB for the ITER NSSR-2 safety documentation. The objective of the analyses was to reveal the safety characteristics of various heat transfer systems at specified operating conditions and to indicate the conditions for which there were obvious risks of jeopardising the structural integrity of the coolant systems. In the latter case also some analyses were made to indicate conceivable mitigating measures for maintaining the integrity. The analyses were primarily concerned with the First Wall and Divertor heat transfer systems. Several enveloping transients were analysed with associated specific flow and heat load boundary conditions. The analyses were performed with the ATHENA and INTRA codes. 8 refs, 14 figs, 15 tabs

  14. Methods and procedures for shielding analyses for the SNS

    International Nuclear Information System (INIS)

    Popova, I.; Ferguson, F.; Gallmeier, F.X.; Iverson, E.; Lu, Wei

    2011-01-01

    In order to provide radiologically safe Spallation Neutron Source operation, shielding analyses are performed according to Oak Ridge National Laboratory internal regulations and to comply with the Code of Federal Regulations. An overview of on-going shielding work for the accelerator facility and neutron beam lines, methods used for the analyses, and associated procedures and regulations are presented. Methods used to perform shielding analyses are described as well. (author)

  15. SENSITIVITY ANALYSIS FOR SALTSTONE DISPOSAL UNIT COLUMN DEGRADATION ANALYSES

    Energy Technology Data Exchange (ETDEWEB)

    Flach, G.

    2014-10-28

    PORFLOW related analyses supporting a Sensitivity Analysis for Saltstone Disposal Unit (SDU) column degradation were performed. Previous analyses (Flach and Taylor 2014) used a model in which the SDU columns degraded in a piecewise manner from the top and bottom simultaneously. The current analyses employ a model in which all pieces of the column degrade at the same time. Information was extracted from the analyses which may be useful in determining the distribution of Tc-99 in the various SDUs throughout time and in determining flow balances for the SDUs.

  16. Analysing harmonic motions with an iPhone’s magnetometer

    Science.gov (United States)

    Yavuz, Ahmet; Kağan Temiz, Burak

    2016-05-01

    In this paper, we propose an experiment for analysing harmonic motion using an iPhone’s (or iPad’s) magnetometer. This experiment consists of the detection of magnetic field variations obtained from an iPhone’s magnetometer sensor. A graph of harmonic motion is directly displayed on the iPhone’s screen using the Sensor Kinetics application. Data from this application was analysed with Eureqa software to establish the equation of the harmonic motion. Analyses show that the use of an iPhone’s magnetometer to analyse harmonic motion is a practical and effective method for small oscillations and frequencies less than 15-20 Hz.
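    The analysis step can be sketched as fitting a sinusoid to the sampled magnetometer readings. Synthetic data stand in for the Sensor Kinetics export, and scipy's curve_fit stands in for the Eureqa step mentioned above; the amplitudes and frequencies are illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit

def harmonic(t, amplitude, freq_hz, phase, offset):
    return amplitude * np.sin(2 * np.pi * freq_hz * t + phase) + offset

# Synthetic "magnetometer" samples: a 2 Hz oscillation plus noise (illustrative).
t = np.linspace(0, 5, 500)
b = harmonic(t, 30.0, 2.0, 0.4, 45.0) + np.random.normal(0, 1.0, t.size)

p0 = [20.0, 1.8, 0.0, 40.0]             # rough initial guess helps convergence
params, _ = curve_fit(harmonic, t, b, p0=p0)
print("amplitude, frequency [Hz], phase, offset:", params)
```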

  17. Discrete frequency identification using the HP 5451B Fourier analyser

    International Nuclear Information System (INIS)

    Holland, L.; Barry, P.

    1977-01-01

    The frequency analysis by the HP5451B discrete frequency Fourier analyser is studied. The advantages of cross correlation analysis to identify discrete frequencies in background noise are discussed in conjunction with the elimination of aliasing and wraparound error. Discrete frequency identification is illustrated by a series of graphs giving the results of analysing 'electrical' and 'acoustical' white noise and sinusoidal signals [pt
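    A toy version of the cross-correlation idea: correlating the record against reference sinusoids over a band of candidate frequencies makes a weak discrete tone stand out from background noise. The sampling rate, tone frequency and amplitudes below are illustrative only.

```python
import numpy as np

fs, duration = 1000.0, 2.0                          # Hz, s (illustrative)
t = np.arange(0, duration, 1.0 / fs)
hidden = 0.3 * np.sin(2 * np.pi * 60.0 * t)         # weak 60 Hz tone
record = hidden + np.random.normal(0.0, 1.0, t.size)   # buried in white noise

# Correlate against reference sinusoids over a band of candidate frequencies.
candidates = np.arange(10.0, 200.0, 1.0)
scores = [abs(np.mean(record * np.exp(-2j * np.pi * f * t))) for f in candidates]
print("strongest candidate frequency:", candidates[int(np.argmax(scores))], "Hz")
```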

  18. A Java Bytecode Metamodel for Composable Program Analyses

    NARCIS (Netherlands)

    Yildiz, Bugra Mehmet; Bockisch, Christoph; Rensink, Arend; Aksit, Mehmet; Seidl, Martina; Zschaler, Steffen

    Program analyses are an important tool to check if a system fulfills its specification. A typical implementation strategy for program analyses is to use an imperative, general-purpose language like Java; and access the program to be analyzed through libraries for manipulating intermediate code, such

  19. Finite strain analyses of deformations in polymer specimens

    DEFF Research Database (Denmark)

    Tvergaard, Viggo

    2016-01-01

    Analyses of the stress and strain state in test specimens or structural components made of polymer are discussed. This includes the Izod impact test, based on full 3D transient analyses. Also a long thin polymer tube under internal pressure has been studied, where instabilities develop, such as b...

  20. Multipole analyses and photo-decay couplings at intermediate energies

    International Nuclear Information System (INIS)

    Workman, R.L.; Arndt, R.A.; Zhujun Li

    1992-01-01

    The authors describe the results of several multipole analyses of pion-photoproduction data to 2 GeV in the lab photon energy. Comparisons are made with previous analyses. The photo-decay couplings for the delta are examined in detail. Problems in the representation of photoproduction data are discussed, with an emphasis on the recent LEGS data. 16 refs., 4 tabs

  1. Houdbaarheid en conservering van grondwatermonsters voor anorganische analyses

    NARCIS (Netherlands)

    Cleven RFMJ; Gast LFL; Boshuis-Hilverdink ME; LAC

    1995-01-01

    The storage life and the possibilities for preservation of inorganic analyses of groundwater samples have been investigated. Groundwater samples, with and without preservation with acid, from four locations in the Netherlands have been analysed ten times over a period of three months on six

  2. Uranium price trends for use in strategy analyses

    International Nuclear Information System (INIS)

    James, R.A.

    1979-09-01

    Long-term price forecasts for mined uranium are quoted. These will be used in Ontario Hydro's nuclear fuel cycle strategy analyses. They are, of necessity, speculative. The accuracy of the forecasts is considered adequate for long-term strategy analyses, but not for other purposes. (auth)

  3. 46 CFR Appendix B to Part 154 - Stress Analyses Definitions

    Science.gov (United States)

    2010-10-01

    46 CFR Part 154, Appendix B - Stress Analyses Definitions. The following are the standard definitions of stresses for the analysis of an independent tank type B: Normal stress means the component of stress normal to the plane of reference...

  4. Aftaler om arbejdsmiljø - en analyse af udvalgte overenskomster

    DEFF Research Database (Denmark)

    Petersen, Jens Voxtrup; Wiegmann, Inger-Marie; Vogt-Nielsen, Karl

    An analysis of the significance of collective agreements for the working environment within industry, slaughterhouses, cleaning, the green sector, hotels and restaurants, and bus transport.

  5. The role of CFD computer analyses in hydrogen safety management

    International Nuclear Information System (INIS)

    Komen, E.M.J; Visser, D.C; Roelofs, F.; Te Lintelo, J.G.T

    2014-01-01

    The risks of hydrogen release and combustion during a severe accident in a light water reactor have attracted considerable attention after the Fukushima accident in Japan. Reliable computer analyses are needed for the optimal design of hydrogen mitigation systems, like e.g. passive autocatalytic recombiners (PARs), and for the assessment of the associated residual risk of hydrogen combustion. Traditionally, so-called Lumped Parameter (LP) computer codes are being used for these purposes. In the last decade, significant progress has been made in the development, validation, and application of more detailed, three-dimensional Computational Fluid Dynamics (CFD) simulations for hydrogen safety analyses. The objective of the current paper is to address the following questions: - When are CFD computer analyses needed complementary to the traditional LP code analyses for hydrogen safety management? - What is the validation status of the CFD computer code for hydrogen distribution, mitigation, and combustion analyses? - Can CFD computer analyses nowadays be executed in practical and reliable way for full scale containments? The validation status and reliability of CFD code simulations will be illustrated by validation analyses performed for experiments executed in the PANDA, THAI, and ENACCEF facilities. (authors)

  6. Graphite analyser upgrade for the IRIS spectrometer at ISIS

    International Nuclear Information System (INIS)

    Campbell, S.I.; Telling, M.T.F.; Carlile, C.J.

    1999-01-01

    Complete text of publication follows. The pyrolytic graphite (PG) analyser bank on the IRIS high resolution inelastic spectrometer [1] at ISIS is to be upgraded. At present the analyser consists of 1350 graphite pieces (6 rows by 225 columns) cooled to 25K [2]. The new analyser array, however, will provide a three-fold increase in area and employ 4212 crystal pieces (18 rows by 234 columns). In addition, the graphite crystals will be cooled close to liquid helium temperature to further reduce thermal diffuse scattering (TDS) and improve the sensitivity of the spectrometer [2]. For an instrument such as IRIS, with its analyser in near back-scattering geometry, optical aberration and variation in the time-of-flight of the analysed neutrons are introduced as one moves out from the horizontal scattering plane. To minimise such effects, the profile of the analyser array has been redesigned. The concept behind the design of the new analyser bank and the factors that affect the overall resolution of the instrument are discussed. Results of Monte Carlo simulations of the expected resolution and intensity of the complete instrument are presented and compared to the current instrument performance. (author) [1] C.J. Carlile et al, Physica B 182 (1992) 431-440.; [2] C.J. Carlile et al, Nuclear Instruments and Methods In Physics Research A 338 (1994) 78-82

  7. A vector matching method for analysing logic Petri nets

    Science.gov (United States)

    Du, YuYue; Qi, Liang; Zhou, MengChu

    2011-11-01

    Batch processing function and passing value indeterminacy in cooperative systems can be described and analysed by logic Petri nets (LPNs). To directly analyse the properties of LPNs, the concept of transition enabling vector sets is presented and a vector matching method used to judge the enabling transitions is proposed in this article. The incidence matrix of LPNs is defined; an equation about marking change due to a transition's firing is given; and a reachable tree is constructed. The state space explosion is mitigated to a certain extent from directly analysing LPNs. Finally, the validity and reliability of the proposed method are illustrated by an example in electronic commerce.
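    The marking-change relation referred to above is the standard Petri net state equation M' = M + C·σ, with C the incidence matrix and σ the firing count vector. The sketch below fires transitions of a small ordinary net; the logic (batch-processing and passing-value) extensions of LPNs are not modelled here.

```python
import numpy as np

# Incidence matrix C (places x transitions) and an initial marking of a toy net.
C = np.array([[-1,  0],
              [ 1, -1],
              [ 0,  1]])
pre = np.array([[1, 0],        # tokens each transition consumes per place
                [0, 1],
                [0, 0]])
marking = np.array([1, 0, 0])

def enabled(marking, t):
    return np.all(marking >= pre[:, t])

def fire(marking, t):
    if not enabled(marking, t):
        raise ValueError(f"transition {t} not enabled")
    sigma = np.zeros(C.shape[1], dtype=int)
    sigma[t] = 1
    return marking + C @ sigma          # state equation M' = M + C * sigma

m1 = fire(marking, 0)
m2 = fire(m1, 1)
print(marking, "->", m1, "->", m2)      # [1 0 0] -> [0 1 0] -> [0 0 1]
```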

  8. Systematic Derivation of Static Analyses for Software Product Lines

    DEFF Research Database (Denmark)

    Midtgaard, Jan; Brabrand, Claus; Wasowski, Andrzej

    2014-01-01

    A recent line of work lifts particular verification and analysis methods to Software Product Lines (SPL). In an effort to generalize such case-by-case approaches, we develop a systematic methodology for lifting program analyses to SPLs using abstract interpretation. Abstract interpretation...... for lifting analyses and Galois connections. We prove that for analyses developed using our method, the soundness of lifting follows by construction. Finally, we discuss approximating variability in an analysis and we derive variational data-flow equations for an example analysis, a constant propagation...

  9. Korte narrativer i analyser af beskæftigelsesindsatser

    DEFF Research Database (Denmark)

    Olesen, Søren Peter; Eskelinen, Leena

    2009-01-01

    ...employment initiatives be addressed, in terms of research method, with a relevant strategy for data collection and analysis? and 2) How can the perspective of the unemployed be related to effect studies and evaluation? We take as our point of departure two separate development trends: a turn in narrative analysis towards small stories, and a new approach to evaluation called relational evaluation. We combine these trends in a concept of short narratives about work identity as a qualitative approach to analysing the work-life perspective and the consequences of employment initiatives as seen through the eyes of the unemployed. Publication date: December...

  10. Using Inequality Measures to Incorporate Environmental Justice into Regulatory Analyses

    Science.gov (United States)

    Abstract: Formally evaluating how specific policy measures influence environmental justice is challenging, especially in the context of regulatory analyses in which quantitative comparisons are the norm. However, there is a large literature on developing and applying quantitative...

  11. Optical region elemental abundance analyses of B and A stars

    International Nuclear Information System (INIS)

    Adelman, S.J.

    1984-01-01

    Abundance analyses using optical region data and fully line blanketed model atmospheres have been performed for six moderately sharplined middle to late B-type stars. The derived abundances have values similar to those of the Sun. (author)

  12. Thermodynamic and Quantum Thermodynamic Analyses of Brownian Movement

    OpenAIRE

    Gyftopoulos, Elias P.

    2006-01-01

    Thermodynamic and quantum thermodynamic analyses are presented of the Brownian movement of a solvent and a colloid passing through neutral thermodynamic equilibrium states only. It is shown that Brownian motors and E. coli do not represent Brownian movement.

  13. Methods for analysing cardiovascular studies with repeated measures

    NARCIS (Netherlands)

    Cleophas, T. J.; Zwinderman, A. H.; van Ouwerkerk, B. M.

    2009-01-01

    Background. Repeated measurements in a single subject are generally more similar than unrepeated measurements in different subjects. Unrepeated analyses of repeated data cause underestimation of the treatment effects. Objective. To review methods adequate for the analysis of cardiovascular studies

  14. Multielement trace analyses of SINQ materials by ICP-OES

    Energy Technology Data Exchange (ETDEWEB)

    Keil, R.; Schwikowski, M. [Paul Scherrer Inst. (PSI), Villigen (Switzerland)

    1997-09-01

    Inductively Coupled Plasma Optical Emission Spectrometry was used to analyse 70 elements in various materials used for construction of the SINQ. Detection limits for individual elements depend strongly on the matrix and had to be determined separately. (author) 1 tab.

  15. The MAFLA (Mississippi, Alabama, Florida) Study, Grain Size Analyses

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The MAFLA (Mississippi, Alabama, Florida) Study was funded by NOAA as part of the Outer Continental Shelf Program. Dr. L.J. Doyle produced grain size analyses in the...

  16. Climate Prediction Center (CPC) US daily temperature analyses

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The U.S. daily temperature analyses are maps depicting various temperature quantities utilizing daily maximum and minimum temperature data across the US. Maps are...

  17. Finite element analyses for RF photoinjector gun cavities

    International Nuclear Information System (INIS)

    Marhauser, F.

    2006-01-01

    This paper details electromagnetic, thermal and structural 3D Finite Element Analyses (FEA) for normal conducting RF photoinjector gun cavities. The simulation methods are described extensively. Achieved results are presented. (orig.)

  18. Summary of Prometheus Radiation Shielding Nuclear Design Analyses , for information

    International Nuclear Information System (INIS)

    J. Stephens

    2006-01-01

    This report transmits a summary of radiation shielding nuclear design studies performed to support the Prometheus project. Together, the enclosures and references associated with this document describe NRPCT (KAPL and Bettis) shielding nuclear design analyses done for the project

  19. Book Review: Qualitative-Quantitative Analyses of Dutch and ...

    African Journals Online (AJOL)

    Abstract. Book Title: Qualitative-Quantitative Analyses of Dutch and Afrikaans Grammar and Lexicon. Book Author: Robert S. Kirsner. 2014. John Benjamins Publishing Company ISBN 9789027215772, price ZAR481.00. 239 pages ...

  20. Finite element analyses for RF photoinjector gun cavities

    Energy Technology Data Exchange (ETDEWEB)

    Marhauser, F. [Berliner Elektronenspeicherring-Gesellschaft fuer Synchrotronstrahlung mbH (BESSY), Berlin (Germany)

    2006-07-01

    This paper details electromagnetic, thermal and structural 3D Finite Element Analyses (FEA) for normal conducting RF photoinjector gun cavities. The simulation methods are described extensively. Achieved results are presented. (orig.)

  1. In service monitoring based on fatigue analyses, possibilities and limitations

    International Nuclear Information System (INIS)

    Dittmar, S.; Binder, F.

    2004-01-01

    German LWR reactors are equipped with monitoring systems which are to enable a comparison of real transients with load case catalogues and fatigue catalogues for fatigue analyses. The information accuracy depends on the accuracy of measurements, on the consideration of parameters influencing fatigue (medium, component surface, component size, etc.), and on the accuracy of the load analyses. The contribution attempts a critical evaluation, also in view of the fact that real fatigue damage is often impossible to quantify on the basis of fatigue analyses at a later stage. The effects of the consideration or non-consideration of various influencing factors are discussed, as well as the consequences of the scatter of material characteristics on which the analyses are based. Possible measures to be taken in operational monitoring are derived. (orig.) [de

  2. Selection of interest and inflation rates for infrastructure investment analyses.

    Science.gov (United States)

    2014-12-01

    The South Dakota Department of Transportation (SDDOT) uses engineering economic analyses (EEA) to : support planning, design, and construction decision-making such as project programming and planning, : pavement type selection, and the occasional val...
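    The joint choice of an interest (discount) rate and an inflation rate typically enters an engineering economic analysis through the real discount rate and the present value of future costs. A minimal sketch with illustrative rates, not SDDOT's adopted values:

```python
def real_rate(nominal_rate, inflation_rate):
    """Fisher relation: (1 + i_real) = (1 + i_nominal) / (1 + inflation)."""
    return (1.0 + nominal_rate) / (1.0 + inflation_rate) - 1.0

def present_value(future_cost, years, nominal_rate, inflation_rate):
    """Discount a future cost expressed in today's dollars using the real rate."""
    r = real_rate(nominal_rate, inflation_rate)
    return future_cost / (1.0 + r) ** years

# Illustrative numbers only.
print(f"real rate = {real_rate(0.05, 0.02):.4f}")
print(f"PV of $1M in 20 years = ${present_value(1_000_000, 20, 0.05, 0.02):,.0f}")
```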

  3. Economic Analyses of Ware Yam Production in Orlu Agricultural ...

    African Journals Online (AJOL)

    Economic Analyses of Ware Yam Production in Orlu Agricultural Zone of Imo State. ... International Journal of Agriculture and Rural Development ... statistics, gross margin analysis, marginal analysis and multiple regression analysis. Results ...

  4. Analyse des formes et modes traditionnels de communication en ...

    African Journals Online (AJOL)

    The analysis of traditional forms and modes of communication in rural areas ... concerns both the verbal and the non-verbal form of language. ... In fact, expressions of the message take place in both verbal and non-verbal language.

  5. [Binge eating disorder: Links with personality and emotionality].

    Science.gov (United States)

    Dorard, G; Khorramian-Pour, M

    2017-04-01

    Our two objectives were: (1) to investigate the relationship between binge eating disorder, dimensions of personality (according to the Big Five model of Costa and McCrae) and those of emotionality in the "tripartite" model of emotions of Watson and Clark; (2) to evaluate the correspondence between the Binge Eating Scale (BES) and the Eating Disorder Inventory (EDI-2) scores. Four self-administered questionnaires were completed on a shared doc website: the EDI-2, the BES, the BFI-Fr (Big Five Inventory-French version) and the EPN-31 (Positive and Negative Emotionality Scale). The analyses were conducted in a sample of 101 participants (36 men and 65 women), aged 20-59 years (mean age=35.28±9.76) from the general population. We found that 11% of the participants had moderate to severe binge eating disorder. Among them, nearly 4% were overweight and 4% were obese. The correlation analyses indicated that binge eating disorder was associated with two dimensions of personality, neuroticism (P=0.001) and conscientiousness (P=0.010), and with the emotions of joy (P=0.008), tenderness (P=0.036), fear (P=0.011) and shame (P…). Participants with binge eating disorder obtained higher scores on the EDI-2 subscales: search for thinness (P=0.001) and bulimia (P…). Binge eating disorder is associated with negative affectivity both as a personality dimension and as an emotional feeling. The patterns of associations observed with the EDI scale seem to confirm the good convergent validity of the Binge Eating Scale. Thus, like other eating disorders, emotional functioning should be a prime target for prevention and treatment. Copyright © 2016 L'Encéphale, Paris. Published by Elsevier Masson SAS. All rights reserved.

  6. Elemental abundance analyses with coadded DAO spectrograms: Pt. 5

    International Nuclear Information System (INIS)

    Adelman, S.J.

    1988-01-01

    Elemental abundance analyses of three mercury-manganese stars were performed in a manner consistent with previous analyses of this series. A few correlations are found between the derived abundances and with the effective temperature in accordance with the expectations of radiative diffusion explanations of the derived abundances. The helium abundances are smaller than the value required to sustain the superficial helium convection zone in the atmospheres of these stars. (author)

  7. Analysing Information Systems Security In Higher Learning Institutions Of Uganda

    OpenAIRE

    Mugyenyi Raymond

    2017-01-01

    Information communication technology has increased globalisation in higher learning institution all over the world. This has been achieved through introduction of systems that ease operations related to information handling in the institutions. The paper assessed and analysed the information systems security performance status in higher learning institutions of Uganda. The existing policies that govern the information security have also been analysed together with the current status of inform...

  8. Cooling tower wood sampling and analyses: A case study

    International Nuclear Information System (INIS)

    Haymore, J.L.

    1985-01-01

    Extensive wood sampling and analyses programs were initiated on crossflow and counterflow cooling towers that have been in service since 1951 and 1955, respectively. Wood samples were taken from all areas of the towers and were subjected to biological, chemical and physical tests. The tests and results for the analyses are discussed. The results indicate the degree of wood deterioration, and areas of the towers which experience the most advanced degree of degradation

  9. A protocol for analysing mathematics teacher educators' practices

    OpenAIRE

    Kuzle , Ana; Biehler , Rolf

    2015-01-01

    Studying practices in a teaching-learning environment, such as professional development programmes, is a complex and multi-faceted endeavour. While several frameworks exist to help researchers analyse teaching practices, none exist to analyse practices of those who organize professional development programmes, namely mathematics teacher educators. In this paper, based on theoretical as well as empirical results, we present a protocol for capturing different aspects of ...

  10. Structural analyses of ITER toroidal field coils under fault conditions

    International Nuclear Information System (INIS)

    Jong, C.T.J.

    1992-04-01

    ITER (International Thermonuclear Experimental Reactor) is intended to be an experimental thermonuclear tokamak reactor testing the basic physics performance and technologies essential to future fusion reactors. The magnet system of ITER consists essentially of 4 sub-systems, i.e. toroidal field coils (TFCs), poloidal field coils (PFCs), power supplies, and cryogenic supplies. These subsystems do not contain significant radioactivity inventories, but the large energy inventory is a potential accident initiator. The aim of the structural analyses is to prevent accidents from propagating into vacuum vessel, tritium system and cooling system, which all contain significant amounts of radioactivity. As part of design process 3 conditions are defined for PF and TF coils, at which mechanical behaviour has to be analyzed in some detail, viz: normal operating conditions, upset conditions and fault conditions. This paper describes the work carried out by ECN to create a detailed finite element model of 16 TFCs as well as results of some fault condition analyses made with the model. Due to fault conditions, either electrical or mechanical, magnetic loading of TFCs becomes abnormal and further mechanical failure of parts of the overall structure might occur (e.g. failure of coil, gravitational supports, intercoil structure). The analyses performed consist of linear elastic stress analyses and electro-magneto-structural analyses (coupled field analyses). 8 refs.; 5 figs.; 5 tabs

  11. Selection, rejection and optimisation of pyrolytic graphite (PG) crystal analysers for use on the new IRIS graphite analyser bank

    International Nuclear Information System (INIS)

    Marshall, P.J.; Sivia, D.S.; Adams, M.A.; Telling, M.T.F.

    2000-01-01

    This report discusses design problems incurred by equipping the IRIS high-resolution inelastic spectrometer at the ISIS pulsed neutron source, UK with a new 4212 piece pyrolytic graphite crystal analyser array. Of the 4212 graphite pieces required, approximately 2500 will be newly purchased PG crystals with the remainder comprising of the currently installed graphite analysers. The quality of the new analyser pieces, with respect to manufacturing specifications, is assessed, as is the optimum arrangement of new PG pieces amongst old to circumvent degradation of the spectrometer's current angular resolution. Techniques employed to achieve these criteria include accurate calliper measurements, FORTRAN programming and statistical analysis. (author)

  12. Accelerated safety analyses - structural analyses Phase I - structural sensitivity evaluation of single- and double-shell waste storage tanks

    International Nuclear Information System (INIS)

    Becker, D.L.

    1994-11-01

    Accelerated Safety Analyses - Phase I (ASA-Phase I) have been conducted to assess the appropriateness of existing tank farm operational controls and/or limits as now stipulated in the Operational Safety Requirements (OSRs) and Operating Specification Documents, and to establish a technical basis for the waste tank operating safety envelope. Structural sensitivity analyses were performed to assess the response of the different waste tank configurations to variations in loading conditions, uncertainties in loading parameters, and uncertainties in material characteristics. Extensive documentation of the sensitivity analyses conducted and results obtained are provided in the detailed ASA-Phase I report, Structural Sensitivity Evaluation of Single- and Double-Shell Waste Tanks for Accelerated Safety Analysis - Phase I. This document provides a summary of the accelerated safety analyses sensitivity evaluations and the resulting findings

  13. The SNS target station preliminary Title I shielding analyses

    International Nuclear Information System (INIS)

    Johnson, J.O.; Santoro, R.T.; Lillie, R.A.; Barnes, J.M.; McNeilly, G.S.

    2000-01-01

    The Department of Energy (DOE) has given the Spallation Neutron Source (SNS) project approval to begin Title I design of the proposed facility to be built at Oak Ridge National Laboratory (ORNL). During the conceptual design phase of the SNS project, the target station bulk-biological shield was characterized and the activation of the major targets station components was calculated. Shielding requirements were assessed with respect to weight, space, and dose-rate constraints for operating, shut-down, and accident conditions utilizing the SNS shield design criteria, DOE Order 5480.25, and requirements specified in 10 CFR 835. Since completion of the conceptual design phase, there have been major design changes to the target station as a result of the initial shielding and activation analyses, modifications brought about due to engineering concerns, and feedback from numerous external review committees. These design changes have impacted the results of the conceptual design analyses, and consequently, have required a re-investigation of the new design. Furthermore, the conceptual design shielding analysis did not address many of the details associated with the engineering design of the target station. In this paper, some of the proposed SNS target station preliminary Title I shielding design analyses will be presented. The SNS facility (with emphasis on the target station), shielding design requirements, calculational strategy, and source terms used in the analyses will be described. Preliminary results and conclusions, along with recommendations for additional analyses, will also be presented. (author)

  14. Using Inequality Measures to Incorporate Environmental Justice into Regulatory Analyses

    Science.gov (United States)

    Harper, Sam; Ruder, Eric; Roman, Henry A.; Geggel, Amelia; Nweke, Onyemaechi; Payne-Sturges, Devon; Levy, Jonathan I.

    2013-01-01

    Formally evaluating how specific policy measures influence environmental justice is challenging, especially in the context of regulatory analyses in which quantitative comparisons are the norm. However, there is a large literature on developing and applying quantitative measures of health inequality in other settings, and these measures may be applicable to environmental regulatory analyses. In this paper, we provide information to assist policy decision makers in determining the viability of using measures of health inequality in the context of environmental regulatory analyses. We conclude that quantification of the distribution of inequalities in health outcomes across social groups of concern, considering both within-group and between-group comparisons, would be consistent with both the structure of regulatory analysis and the core definition of environmental justice. Appropriate application of inequality indicators requires thorough characterization of the baseline distribution of exposures and risks, leveraging data generally available within regulatory analyses. Multiple inequality indicators may be applicable to regulatory analyses, and the choice among indicators should be based on explicit value judgments regarding the dimensions of environmental justice of greatest interest. PMID:23999551
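    As one concrete example of the kind of inequality indicator discussed above, the sketch below computes the Theil T index and its between-group component for hypothetical exposure values in two social groups of concern. The decomposition formulas are standard; the data are invented.

```python
import numpy as np

def theil_t(x):
    """Overall Theil T index: mean of (x_i/mu) * ln(x_i/mu)."""
    x = np.asarray(x, dtype=float)
    mu = x.mean()
    return np.mean((x / mu) * np.log(x / mu))

def between_group_theil(groups):
    """Between-group component: sum over groups of (n_g/N) * (mu_g/mu) * ln(mu_g/mu)."""
    all_x = np.concatenate(groups)
    mu, n_total = all_x.mean(), all_x.size
    return sum((g.size / n_total) * (g.mean() / mu) * np.log(g.mean() / mu)
               for g in map(np.asarray, groups))

# Hypothetical exposure concentrations for two social groups of concern.
group_a = np.array([4.0, 5.0, 6.0, 5.5])
group_b = np.array([9.0, 11.0, 10.0, 12.0])

print("overall Theil T:", theil_t(np.concatenate([group_a, group_b])))
print("between-group component:", between_group_theil([group_a, group_b]))
```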

  15. Methodological Quality Assessment of Meta-analyses in Endodontics.

    Science.gov (United States)

    Kattan, Sereen; Lee, Su-Min; Kohli, Meetu R; Setzer, Frank C; Karabucak, Bekir

    2018-01-01

    The objectives of this review were to assess the methodological quality of published meta-analyses related to endodontics using the assessment of multiple systematic reviews (AMSTAR) tool and to provide a follow-up to previously published reviews. Three electronic databases were searched for eligible studies according to the inclusion and exclusion criteria: Embase via Ovid, The Cochrane Library, and Scopus. The electronic search was amended by a hand search of 6 dental journals (International Endodontic Journal; Journal of Endodontics; Australian Endodontic Journal; Oral Surgery, Oral Medicine, Oral Pathology, Oral Radiology; Endodontics and Dental Traumatology; and Journal of Dental Research). The searches were conducted to include articles published after July 2009, and the deadline for inclusion of the meta-analyses was November 30, 2016. The AMSTAR assessment tool was used to evaluate the methodological quality of all included studies. A total of 36 reports of meta-analyses were included. The overall quality of the meta-analyses reports was found to be medium, with an estimated mean overall AMSTAR score of 7.25 (95% confidence interval, 6.59-7.90). The most poorly assessed areas were providing an a priori design, the assessment of the status of publication, and publication bias. In recent publications in the field of endodontics, the overall quality of the reported meta-analyses is medium according to AMSTAR. Copyright © 2017 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
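    The headline statistic above (a mean AMSTAR score with a 95% confidence interval) can be reproduced from per-study scores with an ordinary t-based interval. The scores below are placeholders rather than the 36 included studies.

```python
import numpy as np
from scipy import stats

scores = np.array([8, 7, 6, 9, 7, 8, 5, 7, 6, 8, 7, 9])   # hypothetical AMSTAR scores (0-11)

mean = scores.mean()
sem = stats.sem(scores)                                    # standard error of the mean
ci_low, ci_high = stats.t.interval(0.95, df=scores.size - 1, loc=mean, scale=sem)
print(f"mean AMSTAR score = {mean:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```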

  16. Seismic risk analyses in the German Risk Study, phase B

    International Nuclear Information System (INIS)

    Hosser, D.; Liemersdorf, H.

    1991-01-01

    The paper discusses some aspects of the seismic risk part of the German Risk Study for Nuclear Power Plants, Phase B. First simplified analyses in Phase A of the study allowed only a rough classification of structures and systems of the PWR reference plant according to their seismic risk contribution. These studies were extended in Phase B using improved models for the dynamic analyses of buildings, structures and components as well as for the probabilistic analyses of seismic loading, failure probabilities and event trees. The methodology of deriving probabilistic seismic load descriptions is explained and compared with the methods in Phase A of the study and in other studies. Some details of the linear and nonlinear dynamic analyses of structures are reported in order to demonstrate the influence of different assumptions for material behaviour and failure criteria. The probabilistic structural and event tree analyses are discussed with respect to distribution assumptions, acceptable simplifications and model uncertainties. Some results for the PWR reference plant are given. (orig.)

  17. VESPA: Very large-scale Evolutionary and Selective Pressure Analyses

    Directory of Open Access Journals (Sweden)

    Andrew E. Webb

    2017-06-01

    Full Text Available Background Large-scale molecular evolutionary analyses of protein coding sequences requires a number of preparatory inter-related steps from finding gene families, to generating alignments and phylogenetic trees and assessing selective pressure variation. Each phase of these analyses can represent significant challenges, particularly when working with entire proteomes (all protein coding sequences in a genome from a large number of species. Methods We present VESPA, software capable of automating a selective pressure analysis using codeML in addition to the preparatory analyses and summary statistics. VESPA is written in python and Perl and is designed to run within a UNIX environment. Results We have benchmarked VESPA and our results show that the method is consistent, performs well on both large scale and smaller scale datasets, and produces results in line with previously published datasets. Discussion Large-scale gene family identification, sequence alignment, and phylogeny reconstruction are all important aspects of large-scale molecular evolutionary analyses. VESPA provides flexible software for simplifying these processes along with downstream selective pressure variation analyses. The software automatically interprets results from codeML and produces simplified summary files to assist the user in better understanding the results. VESPA may be found at the following website: http://www.mol-evol.org/VESPA.

  18. [Methods, challenges and opportunities for big data analyses of microbiome].

    Science.gov (United States)

    Sheng, Hua-Fang; Zhou, Hong-Wei

    2015-07-01

    Microbiome is a novel research field related to a variety of chronic inflammatory diseases. Technically, there are two major approaches to the analysis of the microbiome: metataxonomics by sequencing the 16S rRNA variable tags, and metagenomics by shot-gun sequencing of the total microbial (mainly bacterial) genome mixture. The 16S rRNA sequencing analysis pipeline includes sequence quality control, diversity analyses, taxonomy and statistics; metagenome analyses further include gene annotation and functional analyses. With the development of sequencing techniques, the cost of sequencing will decrease, and big data analyses will become the central task. Data standardization, accumulation, modeling and disease prediction are crucial for the future exploitation of these data. Meanwhile, the information contained in these data, and the functional verification with culture-dependent and culture-independent experiments, remain the focus of future research. Studies of the human microbiome will bring a better understanding of the relations between the human body and the microbiome, especially in the context of disease diagnosis and therapy, which promise rich research opportunities.

  19. Performance and Vibration Analyses of Lift-Offset Helicopters

    Directory of Open Access Journals (Sweden)

    Jeong-In Go

    2017-01-01

    Full Text Available A validation study on the performance and vibration analyses of the XH-59A compound helicopter is conducted to establish techniques for the comprehensive analysis of lift-offset compound helicopters. This study considers the XH-59A lift-offset compound helicopter using a rigid coaxial rotor system as a verification model. CAMRAD II (Comprehensive Analytical Method of Rotorcraft Aerodynamics and Dynamics II, a comprehensive analysis code, is used as a tool for the performance, vibration, and loads analyses. A general free wake model, which is a more sophisticated wake model than other wake models, is used to obtain good results for the comprehensive analysis. Performance analyses of the XH-59A helicopter with and without auxiliary propulsion are conducted in various flight conditions. In addition, vibration analyses of the XH-59A compound helicopter configuration are conducted in the forward flight condition. The present comprehensive analysis results are in good agreement with the flight test and previous analyses. Therefore, techniques for the comprehensive analysis of lift-offset compound helicopters are appropriately established. Furthermore, the rotor lifts are calculated for the XH-59A lift-offset compound helicopter in the forward flight condition to investigate the airloads characteristics of the ABC™ (Advancing Blade Concept rotor.

  20. Utilization of Large Scale Surface Models for Detailed Visibility Analyses

    Science.gov (United States)

    Caha, J.; Kačmařík, M.

    2017-11-01

    This article demonstrates the utilization of large scale surface models with fine spatial resolution and high accuracy, acquired from Unmanned Aerial Vehicle scanning, for visibility analyses. The importance of large scale data for visibility analyses on the local scale, where the detail of the surface model is the most defining factor, is described. The focus is not only on the classic Boolean visibility that is usually determined within GIS, but also on so-called extended viewsheds that aim to provide more information about visibility. The case study with examples of visibility analyses was performed on the river Opava, near the city of Ostrava (Czech Republic). The multiple Boolean viewshed analysis and the global horizon viewshed were calculated to determine the most prominent features and visibility barriers of the surface. Besides that, an extended viewshed showing the angle difference above the local horizon, which describes the angular height of the target area above the barrier, is shown. The case study proved that large scale models are an appropriate data source for visibility analyses on the local level. The discussion summarizes possible future applications and further development directions of visibility analyses.
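    A minimal sketch of the Boolean visibility test underlying such viewshed analyses is given below: a sight line is traced from observer to target over a gridded surface model and declared blocked wherever the terrain rises above it. The toy DEM, observer height and nearest-cell sampling are simplifying assumptions; production viewshed tools interpolate the surface and sweep all target cells.

```python
# Boolean line-of-sight on a toy gridded surface model (illustrative only).
import numpy as np

def line_of_sight(dem, observer, target, observer_height=1.6):
    """Return True if cell `target` (row, col) is visible from `observer` (row, col)."""
    (r0, c0), (r1, c1) = observer, target
    z0 = dem[r0, c0] + observer_height          # eye elevation
    z1 = dem[r1, c1]                            # target elevation
    n = int(max(abs(r1 - r0), abs(c1 - c0)))
    for i in range(1, n):                       # sample points strictly between endpoints
        t = i / n
        r, c = r0 + t * (r1 - r0), c0 + t * (c1 - c0)
        terrain = dem[int(round(r)), int(round(c))]
        sight_line = z0 + t * (z1 - z0)         # elevation of the ray at this step
        if terrain > sight_line:
            return False                        # the surface blocks the ray
    return True

dem = np.array([[10, 10, 10, 10],
                [10, 12, 18, 10],
                [10, 10, 10, 10]], dtype=float)
print(line_of_sight(dem, observer=(0, 0), target=(2, 3)))   # blocked by the raised cells
```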

  1. HLA region excluded by linkage analyses of early onset periodontitis

    Energy Technology Data Exchange (ETDEWEB)

    Sun, C.; Wang, S.; Lopez, N.

    1994-09-01

    Previous studies suggested that HLA genes may influence susceptibility to early-onset periodontitis (EOP). Segregation analyses indicate that EOP may be due to a single major gene. We conducted linkage analyses to assess possible HLA effects on EOP. Fifty families with two or more close relatives affected by EOP were ascertained in Virginia and Chile. A microsatellite polymorphism within the HLA region (at the tumor necrosis factor beta locus) was typed using PCR. Linkage analyses used a dominant model most strongly supported by previous studies. Assuming locus homogeneity, our results exclude a susceptibility gene within 10 cM on either side of our marker locus. This encompasses all of the HLA region. Analyses assuming alternative models gave qualitatively similar results. Allowing for locus heterogeneity, our data still provide no support for HLA-region involvement. However, our data do not statistically exclude (LOD <-2.0) hypotheses of disease-locus heterogeneity, including models where up to half of our families could contain an EOP disease gene located in the HLA region. This is due to the limited power of even our relatively large collection of families and the inherent difficulties of mapping genes for disorders that have complex and heterogeneous etiologies. Additional statistical analyses, recruitment of families, and typing of flanking DNA markers are planned to more conclusively address these issues with respect to the HLA region and other candidate locations in the human genome. Additional results for markers covering most of the human genome will also be presented.
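    For readers unfamiliar with the LOD < -2.0 exclusion criterion used above, the small example below evaluates a two-point LOD score for phase-known, fully informative meioses; the recombinant counts and recombination fraction are invented and unrelated to the study data.

```python
# Two-point LOD score for phase-known, fully informative meioses (toy numbers).
import math

def lod(theta, recombinants, nonrecombinants):
    """LOD(theta) = log10[ L(theta) / L(theta = 0.5) ]."""
    n = recombinants + nonrecombinants
    logl_theta = (recombinants * math.log10(theta)
                  + nonrecombinants * math.log10(1 - theta))
    logl_null = n * math.log10(0.5)      # free recombination
    return logl_theta - logl_null

# e.g. 5 recombinants out of 10 informative meioses, evaluated at theta = 0.05
print(round(lod(0.05, 5, 5), 2))         # about -3.6; below -2 excludes linkage at this theta
```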

  2. Microcomputer-controlled thermoluminescent analyser IJS MR-200

    International Nuclear Information System (INIS)

    Mihelic, M.; Miklavzic, U.; Rupnik, Z.; Satalic, P.; Spreizer, F.; Zerovnik, I.

    1985-01-01

    The performance and concept of the multipurpose, microcomputer-controlled thermoluminescent analyser, designed for use in laboratory work with TL dosemeters as well as for routine dose readings in the range from ecological to accident doses, are described. The main features of the analyser are: time-linear sampling, digitalisation, storing, and subsequent displaying on the monitor of the time-scaled glow and temperature curves of the TL material; digital stabilization, control and diagnostics of the analog unit; the ability to store 7 different 8-parametric heating programs; the ability to store 15 evaluation programs defined by 2 or 4 parameters and 3 different algorithms (altogether 5 types of evaluations). The analyser has several features intended for routine work: 9 function keys and possibilities of file forming on cassette or display disc, of dose calculation and averaging, of printing reports with names, and of additional programming in Basic. (author)

  3. A computer program for multiple decrement life table analyses.

    Science.gov (United States)

    Poole, W K; Cooley, P C

    1977-06-01

    Life table analysis has traditionally been the tool of choice in analyzing the distribution of "survival" times when a parametric form for the survival curve could not be reasonably assumed. Chiang, in two papers [1,2], formalized the theory of life table analyses in a Markov chain framework and derived maximum likelihood estimates of the relevant parameters for the analyses. He also discussed how the techniques could be generalized to consider competing risks and follow-up studies. Although various computer programs exist for doing different types of life table analysis [3], to date there has not been a generally available, well-documented computer program to carry out multiple decrement analyses, either by Chiang's or any other method. This paper describes such a program developed by Research Triangle Institute. A user's manual, available at printing cost, supplements the contents of this paper with a discussion of the formulae used in the program listing.
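    As a hedged and highly simplified illustration of the multiple decrement idea (not the program described in this record), the sketch below partitions each interval's overall probability of decrement between two competing causes in proportion to the observed deaths; the cohort numbers are invented.

```python
# Toy multiple-decrement calculation: split each interval's overall probability
# of decrement between two competing causes (invented cohort data).
intervals = [
    # (alive at start of interval, deaths from cause A, deaths from cause B)
    (1000, 30, 10),
    (960,  45, 15),
    (900,  60, 40),
]

for alive, d_a, d_b in intervals:
    q_all = (d_a + d_b) / alive                # overall probability of decrement
    q_a = q_all * d_a / (d_a + d_b)            # cause-specific probabilities
    q_b = q_all * d_b / (d_a + d_b)
    print(f"l = {alive:4d}  q_all = {q_all:.3f}  q_A = {q_a:.3f}  q_B = {q_b:.3f}")
```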

  4. Conducting qualitative research in mental health: Thematic and content analyses.

    Science.gov (United States)

    Crowe, Marie; Inder, Maree; Porter, Richard

    2015-07-01

    The objective of this paper is to describe two methods of qualitative analysis - thematic analysis and content analysis - and to examine their use in a mental health context. A description of the processes of thematic analysis and content analysis is provided. These processes are then illustrated by conducting two analyses of the same qualitative data. Transcripts of qualitative interviews are analysed using each method to illustrate these processes. The illustration of the processes highlights the different outcomes from the same set of data. Thematic and content analyses are qualitative methods that serve different research purposes. Thematic analysis provides an interpretation of participants' meanings, while content analysis is a direct representation of participants' responses. These methods provide two ways of understanding meanings and experiences and provide important knowledge in a mental health context. © The Royal Australian and New Zealand College of Psychiatrists 2015.

  5. Fourier Transform Mass Spectrometry: The Transformation of Modern Environmental Analyses

    Science.gov (United States)

    Lim, Lucy; Yan, Fangzhi; Bach, Stephen; Pihakari, Katianna; Klein, David

    2016-01-01

    Unknown compounds in environmental samples are difficult to identify using standard mass spectrometric methods. Fourier transform mass spectrometry (FTMS) has revolutionized how environmental analyses are performed. With its unsurpassed mass accuracy, high resolution and sensitivity, FTMS gives researchers a tool for difficult and complex environmental analyses. Two features of FTMS are responsible for changing the face of how complex analyses are accomplished. First is the ability to determine quickly, and with high mass accuracy, the presence of unknown chemical residues in samples. For years, the field has been limited by mass spectrometric methods that required knowing in advance what the compounds of interest were. Secondly, by utilizing the high resolution capabilities coupled with the low detection limits of FTMS, analysts could also dilute the sample sufficiently to minimize the ionization changes from varied matrices. PMID:26784175

  6. PWR plant transient analyses using TRAC-PF1

    International Nuclear Information System (INIS)

    Ireland, J.R.; Boyack, B.E.

    1984-01-01

    This paper describes some of the pressurized water reactor (PWR) transient analyses performed at Los Alamos for the US Nuclear Regulatory Commission using the Transient Reactor Analysis Code (TRAC-PF1). Many of the transient analyses performed directly address current PWR safety issues. Included in this paper are examples of two safety issues addressed by TRAC-PF1. These examples are pressurized thermal shock (PTS) and feed-and-bleed cooling for Oconee-1. The calculations performed were plant specific in that details of both the primary and secondary sides were modeled, in addition to models of the plant integrated control systems. The results of these analyses show that for these two transients, the reactor cores remained covered and cooled at all times, posing no real threat to the reactor system or to the public

  7. Finite element analyses of a linear-accelerator electron gun

    Science.gov (United States)

    Iqbal, M.; Wasy, A.; Islam, G. U.; Zhou, Z.

    2014-02-01

    Thermo-structural analyses of the Beijing Electron-Positron Collider (BEPCII) linear-accelerator electron gun were performed for the gun operating with the cathode at 1000 °C. The gun was modeled in the computer aided three-dimensional interactive application (CATIA) for finite element analyses through ANSYS Workbench. This was followed by simulations using the SLAC electron beam trajectory program EGUN for beam optics analyses. The simulations were compared with experimental results of the assembly to verify its beam parameters under the same boundary conditions. Simulation and test results were found to be in good agreement and hence confirmed the design parameters under the defined operating temperature. The gun has been operating continuously since commissioning without any thermally induced failures for the BEPCII linear accelerator.

  8. Finite element analyses of a linear-accelerator electron gun

    Energy Technology Data Exchange (ETDEWEB)

    Iqbal, M., E-mail: muniqbal.chep@pu.edu.pk, E-mail: muniqbal@ihep.ac.cn [Centre for High Energy Physics, University of the Punjab, Lahore 45590 (Pakistan); Institute of High Energy Physics, Chinese Academy of Sciences, Beijing 100049 (China); Wasy, A. [Department of Mechanical Engineering, Changwon National University, Changwon 641773 (Korea, Republic of); Islam, G. U. [Centre for High Energy Physics, University of the Punjab, Lahore 45590 (Pakistan); Zhou, Z. [Institute of High Energy Physics, Chinese Academy of Sciences, Beijing 100049 (China)

    2014-02-15

    Thermo-structural analyses of the Beijing Electron-Positron Collider (BEPCII) linear-accelerator electron gun were performed for the gun operating with the cathode at 1000 °C. The gun was modeled in the computer aided three-dimensional interactive application (CATIA) for finite element analyses through ANSYS Workbench. This was followed by simulations using the SLAC electron beam trajectory program EGUN for beam optics analyses. The simulations were compared with experimental results of the assembly to verify its beam parameters under the same boundary conditions. Simulation and test results were found to be in good agreement and hence confirmed the design parameters under the defined operating temperature. The gun has been operating continuously since commissioning without any thermally induced failures for the BEPCII linear accelerator.

  9. Finite element analyses of a linear-accelerator electron gun

    International Nuclear Information System (INIS)

    Iqbal, M.; Wasy, A.; Islam, G. U.; Zhou, Z.

    2014-01-01

    Thermo-structural analyses of the Beijing Electron-Positron Collider (BEPCII) linear-accelerator electron gun were performed for the gun operating with the cathode at 1000 °C. The gun was modeled in the computer aided three-dimensional interactive application (CATIA) for finite element analyses through ANSYS Workbench. This was followed by simulations using the SLAC electron beam trajectory program EGUN for beam optics analyses. The simulations were compared with experimental results of the assembly to verify its beam parameters under the same boundary conditions. Simulation and test results were found to be in good agreement and hence confirmed the design parameters under the defined operating temperature. The gun has been operating continuously since commissioning without any thermally induced failures for the BEPCII linear accelerator

  10. Towards Reproducible Research Data Analyses in LHC Particle Physics

    CERN Document Server

    Simko, Tibor

    2017-01-01

    The reproducibility of the research data analysis requires having access not only to the original datasets, but also to the computing environment, the analysis software and the workflow used to produce the original results. We present the nascent CERN Analysis Preservation platform with a set of tools developed to support particle physics researchers in preserving the knowledge around analyses so that capturing, sharing, reusing and reinterpreting data becomes easier. The presentation will focus on three pillars: (i) capturing structured knowledge information about data analysis processes; (ii) capturing the computing environment, the software code, the datasets, the configuration and other information assets used in data analyses; (iii) the re-instantiation of preserved analyses on a containerised computing cloud for the purposes of re-validation and re-interpretation.

  11. Fourier Transform Mass Spectrometry: The Transformation of Modern Environmental Analyses

    Directory of Open Access Journals (Sweden)

    Lucy Lim

    2016-01-01

    Full Text Available Unknown compounds in environmental samples are difficult to identify using standard mass spectrometric methods. Fourier transform mass spectrometry (FTMS) has revolutionized how environmental analyses are performed. With its unsurpassed mass accuracy, high resolution and sensitivity, FTMS gives researchers a tool for difficult and complex environmental analyses. Two features of FTMS are responsible for changing the face of how complex analyses are accomplished. First is the ability to determine quickly, and with high mass accuracy, the presence of unknown chemical residues in samples. For years, the field has been limited by mass spectrometric methods that required knowing in advance what the compounds of interest were. Secondly, by utilizing the high resolution capabilities coupled with the low detection limits of FTMS, analysts could also dilute the sample sufficiently to minimize the ionization changes from varied matrices.

  12. Scenario evolution: Interaction between event tree construction and numerical analyses

    International Nuclear Information System (INIS)

    Barr, G.E.; Barnard, R.W.; Dockery, H.A.; Dunn, E.; MacIntyre, A.T.

    1990-01-01

    Construction of well-posed scenarios for the range of conditions possible at any proposed repository site is a critical first step to assessing total system performance. Event tree construction is the method that is being used to develop potential failure scenarios for the proposed nuclear waste repository at Yucca Mountain. An event tree begins with an initial event or condition. Subsequent events are listed in a sequence, leading eventually to release of radionuclides to the accessible environment. Ensuring the validity of the scenarios requires iteration between problems constructed using scenarios contained in the event tree sequence, experimental results, and numerical analyses. Details not adequately captured within the tree initially may become more apparent as a result of analyses. To illustrate this process, the authors discuss the iterations used to develop numerical analyses for PACE-90 (Performance Assessment Calculational Exercises) using basaltic igneous activity and human-intrusion event trees

  13. Scenario evolution: Interaction between event tree construction and numerical analyses

    International Nuclear Information System (INIS)

    Barr, G.E.; Barnard, R.W.; Dockery, H.A.; Dunn, E.; MacIntyre, A.T.

    1991-01-01

    Construction of well-posed scenarios for the range of conditions possible at any proposed repository site is a critical first step to assessing total system performance. Event tree construction is the method that is being used to develop potential failure scenarios for the proposed nuclear waste repository at Yucca Mountain. An event tree begins with an initial event or condition. Subsequent events are listed in a sequence, leading eventually to release of radionuclides to the accessible environment. Ensuring the validity of the scenarios requires iteration between problems constructed using scenarios contained in the event tree sequence, experimental results, and numerical analyses. Details not adequately captured within the tree initially may become more apparent as a result of analyses. To illustrate this process, we discuss the iterations used to develop numerical analyses for PACE-90 using basaltic igneous activity and human-intrusion event trees

  14. Socioeconomic issues and analyses for radioactive waste disposal facilities

    International Nuclear Information System (INIS)

    Ulland, L.

    1988-01-01

    Radioactive Waste facility siting and development can raise major social and economic issues in the host area. Initial site screening and analyses have been conducted for both potential high-level and low-level radioactive waste facilities; more detailed characterization and analyses are being planned. Results of these assessments are key to developing community plans that identify and implement measures to mitigate adverse socioeconomic impacts. Preliminary impact analyses conducted at high-level sites in Texas and Nevada, and site screening activities for low-level facilities in Illinois and California have identified a number of common socioeconomic issues and characteristics as well as issues and characteristics that differ between the sites and the type of facilities. Based on these comparisons, implications for selection of an appropriate methodology for impact assessment and elements of impact mitigation are identified

  15. Financial relationships in economic analyses of targeted therapies in oncology.

    Science.gov (United States)

    Valachis, Antonis; Polyzos, Nikolaos P; Nearchou, Andreas; Lind, Pehr; Mauri, Davide

    2012-04-20

    A potential financial relationship between investigators and pharmaceutical manufacturers has been associated with an increased likelihood of reporting favorable conclusions about a sponsor's proprietary agent in pharmacoeconomic studies. The purpose of this study is to investigate whether there is an association between financial relationships and outcome in economic analyses of new targeted therapies in oncology. We searched PubMed (last update June 2011) for economic analyses of targeted therapies (including monoclonal antibodies, tyrosine-kinase inhibitors, and mammalian target of rapamycin inhibitors) in oncology. The trials were qualitatively rated regarding the cost assessment as favorable, neutral, or unfavorable on the basis of prespecified criteria. Overall, 81 eligible studies were identified. Economic analyses that were funded by pharmaceutical companies were more likely to report favorable qualitative cost estimates (28 [82%] of 34 v 21 [45%] of 47; P = .003). The presence of an author affiliated with manufacturer was not associated with study outcome. Furthermore, if only studies including a conflict of interest statement were included (66 of 81), studies that reported any financial relationship with manufacturers (author affiliation and/or funding and/or other financial relationship) were more likely to report favorable results of targeted therapies compared with studies without financial relationship (32 [71%] of 45 v nine [43%] of 21; P = .025). Our study reveals a potential threat for industry-related bias in economic analyses of targeted therapies in oncology in favor of analyses with financial relationships between authors and manufacturers. A more balanced funding of economic analyses from other sources may allow greater confidence in the interpretation of their results.

  16. Publication bias in dermatology systematic reviews and meta-analyses.

    Science.gov (United States)

    Atakpo, Paul; Vassar, Matt

    2016-05-01

    Systematic reviews and meta-analyses in dermatology provide high-level evidence for clinicians and policy makers that influences clinical decision making and treatment guidelines. One methodological problem with systematic reviews is the under-representation of unpublished studies. This problem is due in part to publication bias. Omission of statistically non-significant data from meta-analyses may result in overestimation of treatment effect sizes, which may lead to clinical consequences. Our goal was to assess whether systematic reviewers in dermatology evaluate and report publication bias. Further, we wanted to conduct our own evaluation of publication bias on meta-analyses that failed to do so. Our study considered systematic reviews and meta-analyses from ten dermatology journals from 2006 to 2016. A PubMed search was conducted, and all full-text articles that met our inclusion criteria were retrieved and coded by the primary author. 293 articles were included in our analysis. Additionally, we formally evaluated publication bias in meta-analyses that failed to do so using the trim and fill and cumulative meta-analysis by precision methods. Publication bias was mentioned in 107 articles (36.5%) and was formally evaluated in 64 articles (21.8%). Visual inspection of a funnel plot was the most common method of evaluating publication bias. Publication bias was present in 45 articles (15.3%), not present in 57 articles (19.5%) and not determined in 191 articles (65.2%). Using the trim and fill method, 7 meta-analyses (33.33%) showed evidence of publication bias. Although the trim and fill method only found evidence of publication bias in 7 meta-analyses, the cumulative meta-analysis by precision method found evidence of publication bias in 15 meta-analyses (71.4%). Many of the reviews in our study did not mention or evaluate publication bias. Further, of the 42 articles that stated following PRISMA reporting guidelines, 19 (45.2%) evaluated for publication bias.
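    As an illustration of one formal asymmetry check of the kind discussed above (Egger's regression test, a different method from trim and fill), the sketch below regresses standardised effects on precision and inspects the intercept; the effect sizes and standard errors are invented.

```python
# Egger's regression test for funnel-plot asymmetry (invented data, illustration only).
import numpy as np
import statsmodels.api as sm

effects = np.array([0.30, 0.45, 0.15, 0.60, 0.80, 0.25])   # e.g. log odds ratios
ses     = np.array([0.10, 0.15, 0.08, 0.25, 0.35, 0.12])   # their standard errors

z = effects / ses                      # standardised effects
precision = 1.0 / ses
model = sm.OLS(z, sm.add_constant(precision)).fit()

intercept, intercept_p = model.params[0], model.pvalues[0]
print(f"Egger intercept = {intercept:.2f} (p = {intercept_p:.3f})")
# A non-zero intercept (conventionally p < 0.10) suggests funnel-plot asymmetry.
```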

  17. Experimental technique of stress analyses by neutron diffraction

    International Nuclear Information System (INIS)

    Sun, Guangai; Chen, Bo; Huang, Chaoqiang

    2009-09-01

    The structure and main components of the neutron diffraction stress analysis spectrometer SALSA, as well as the functions and parameters of each component, are presented. The technical characteristics and structural parameters of SALSA are described. Based on these aspects, the choice of the gauge volume, the method of positioning the sample, the determination of the diffraction plane and the measurement of the stress-free reference d0 are discussed. Combined with practical experiments, the basic experimental measurements and the related settings are introduced, including the adjustment of components, pattern scattering, data recording and checking, etc. The above can serve as a guide for stress analysis experiments by neutron diffraction and for the construction of neutron stress spectrometers. (authors)
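    The core arithmetic behind such measurements can be sketched as follows: a measured diffraction angle is converted to a lattice spacing via Bragg's law, strain follows from the stress-free reference d0, and a uniaxial Hooke's-law step gives an order-of-magnitude stress. All numbers below are invented, and a real analysis uses the full triaxial strain-to-stress relation measured along several directions.

```python
# From diffraction angle to lattice strain and an indicative stress (toy numbers).
import math

wavelength = 1.648e-10            # m, an assumed thermal-neutron wavelength
two_theta = math.radians(89.50)   # measured scattering angle of the strained sample
d0 = 1.1700e-10                   # m, stress-free reference spacing (the d0 above)

d = wavelength / (2 * math.sin(two_theta / 2))   # Bragg's law: lambda = 2 d sin(theta)
strain = (d - d0) / d0
E = 200e9                         # Pa, Young's modulus of a generic steel (assumed)
print(f"d = {d * 1e10:.4f} A, strain = {strain:.2e}, sigma ~ {E * strain / 1e6:.0f} MPa")
```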

  18. [The maintenance of automatic analysers and associated documentation].

    Science.gov (United States)

    Adjidé, V; Fournier, P; Vassault, A

    2010-12-01

    The maintenance of automatic analysers and the associated documentation, which form part of the requirements of the ISO 15189 standard as well as of French regulation, have to be defined in the laboratory policy. The management of periodic maintenance and documentation shall be implemented and fulfilled. The organisation of corrective maintenance has to be managed so as to avoid interruption of the work of the laboratory. The different recommendations concern the identification of materials, including automatic analysers, the environmental conditions to take into account, the documentation provided by the manufacturer, and the documents prepared by the laboratory, including procedures for maintenance.

  19. Designing and recasting LHC analyses with MadAnalysis 5

    CERN Document Server

    Conte, Eric; Fuks, Benjamin; Wymant, Chris

    2014-01-01

    We present an extension of the expert mode of the MadAnalysis 5 program dedicated to the design or reinterpretation of high-energy physics collider analyses. We detail the predefined classes, functions and methods available to the user and emphasize the most recent developments. The latter include the possible definition of multiple sub-analyses and a novel user-friendly treatment for the selection criteria. We illustrate this approach by two concrete examples: a CMS search for supersymmetric partners of the top quark and a phenomenological analysis targeting hadronically decaying monotop systems.

  20. Nuclear power plants: Results of recent safety analyses

    International Nuclear Information System (INIS)

    Steinmetz, E.

    1987-01-01

    The contributions deal with the problems posed by low radiation doses, with the information currently available from analyses of the Chernobyl reactor accident, and with risk assessments in connection with nuclear power plant accidents. Other points of interest include the latest results on fission product release from the reactor core or reactor building, advanced atmospheric dispersion models for incident and accident analyses, reliability studies on safety systems, and assessment of fire hazards in nuclear installations. The various contributions are found as separate entries in the database. (DG) [de]

  1. Economical analyses of construction of a biomass boiler house

    International Nuclear Information System (INIS)

    Normak, A.

    2002-01-01

    To reduce energy costs we can use a cheaper fuel to fire our boiler, and one of the cheapest fuels is wood biomass. How to use cheaper wood biomass in heat generation to decrease energy costs and to increase the share of biomass in our energy balance is a very topical issue. Before deciding to build a biomass boiler house, it is advisable to analyse the economic situation and work out the most profitable, efficient, reliable and ecological boiler plant design for the particular conditions. The best way to perform the analyses is to use the economic model presented here; it saves time and gives an objective evaluation of the project. (author)

  2. Certification of a uranium dioxide reference material for chemical analyses

    International Nuclear Information System (INIS)

    Le Duigou, Y.

    1984-01-01

    This report, issued by the Central Bureau for Nuclear Measurements (CBNM), describes the characterization of a uranium dioxide reference material with an accurately determined uranium mass fraction for chemical analyses. The preparation, conditioning, homogeneity tests and the analyses performed on this material are described in Annex 1. The evaluation of the individual impurity results, the total of impurities and the uranium mass fraction are given in Annex 2. Information on a direct determination of uranium by titration is given in Annex 3. The uranium mass fraction of (881.34 ± 0.13) g·kg⁻¹ calculated in Annex 2 is given on the certificate

  3. Design and manufacture of TL analyser by using the microcomputer

    International Nuclear Information System (INIS)

    Doh, Sih Hong; Woo, Chong Ho

    1986-01-01

    This paper describes the design of a thermoluminescence analyser using a microcomputer. The TL analyser is designed to perform a three-step heat treatment: pre-read heating, readout, and post-read (or pre-irradiation) anneal. We used a 12-bit A/D converter to obtain precise measurements and the phase-control method to control the heating temperature. Since the Apple II microcomputer used is cheap and popular, it is possible to design an economical system. Experimental results showed successful and flexible operation. The error of temperature control was less than ± 0.2% of the expected value. (Author)

  4. VIPRE modeling of VVER-1000 reactor core for DNB analyses

    Energy Technology Data Exchange (ETDEWEB)

    Sung, Y.; Nguyen, Q. [Westinghouse Electric Corporation, Pittsburgh, PA (United States); Cizek, J. [Nuclear Research Institute, Prague, (Czech Republic)

    1995-09-01

    Based on the one-pass modeling approach, the hot channels and the VVER-1000 reactor core can be modeled in 30 channels for DNB analyses using the VIPRE-01/MOD02 (VIPRE) code (VIPRE is owned by Electric Power Research Institute, Palo Alto, California). The VIPRE one-pass model does not compromise any accuracy in the hot channel local fluid conditions. Extensive qualifications include sensitivity studies of radial noding and crossflow parameters and comparisons with the results from THINC and CALOPEA subchannel codes. The qualifications confirm that the VIPRE code with the Westinghouse modeling method provides good computational performance and accuracy for VVER-1000 DNB analyses.

  5. RELAP5 analyses and support of Oconee-1 PTS studies

    International Nuclear Information System (INIS)

    Charlton, T.R.

    1983-01-01

    The integrity of a reactor vessel during a severe overcooling transient with primary system pressurization is a current safety concern and has been identified as Unresolved Safety Issue (USI) A-49 by the US Nuclear Regulatory Commission (NRC). Resolution of USI A-49, denoted as Pressurized Thermal Shock (PTS), is being examined by the US NRC sponsored PTS integration study. In support of this study, the Idaho National Engineering Laboratory (INEL) has performed RELAP5/MOD1.5 thermal-hydraulic analyses of selected overcooling transients. These transient analyses were performed for the Oconee-1 pressurized water reactor (PWR), which is a Babcock and Wilcox designed nuclear steam supply system

  6. Simulation-based Investigations of Electrostatic Beam Energy Analysers

    CERN Document Server

    Pahl, Hannes

    2015-01-01

    An energy analyser is needed to measure the beam energy profile behind the REX-EBIS at ISOLDE. The device should be able to operate with an accuracy of 1 V at voltages up to 30 kV. In order to find a working concept for an electrostatic energy analyser different designs were evaluated with simulations. A spherical device and its design issues are presented. The potential deformation effects of grids at high voltages and their influence on the energy resolution were investigated. First tests were made with a grid-free ring electrode device and show promising results.

  7. Cost-benefit analyses for the development of magma power

    International Nuclear Information System (INIS)

    Haraden, John

    1992-01-01

    Magma power is the potential generation of electricity from shallow magma bodies in the crust of the Earth. Considerable uncertainty still surrounds the development of magma power, but most of that uncertainty may be eliminated by drilling the first deep magma well. The uncertainty presents no serious impediments to the private drilling of the well. For reasons unrelated to the uncertainty, there may be no private drilling and there may be justification for public drilling. In this paper, we present cost-benefit analyses for private and public drilling of the well. Both analyses indicate there is incentive for drilling. (Author)

  8. The development of an on-line gold analyser

    International Nuclear Information System (INIS)

    Robert, R.V.D.; Ormrod, G.T.W.

    1982-01-01

    An on-line analyser to monitor the gold in solutions from the carbon-in-pulp process is described. The automatic system is based on the delivery of filtered samples of the solutions to a distribution valve for measurement by flameless atomic-absorption spectrophotometry. The sample is introduced by the aerosol-deposition method. Operation of the analyser on a pilot plant and on a full-scale carbon-in-pulp plant has shown that the system is economically feasible and capable of providing a continuous indication of the efficiency of the extraction process

  9. Design basis event consequence analyses for the Yucca Mountain project

    International Nuclear Information System (INIS)

    Orvis, D.D.; Haas, M.N.; Martin, J.H.

    1997-01-01

    Design basis event (DBE) definition and analysis is an ongoing and integrated activity among the design and analysis groups of the Yucca Mountain Project (YMP). DBE's are those that potentially lead to breach of the waste package and waste form (e.g., spent fuel rods) with consequent release of radionuclides to the environment. A Preliminary Hazards Analysis (PHA) provided a systematic screening of external and internal events that were candidate DBE's that will be subjected to analyses for radiological consequences. As preparation, pilot consequence analyses for the repository subsurface and surface facilities have been performed to define the methodology, data requirements, and applicable regulatory limits

  10. Detection of defects of Kenaf/Epoxy by Thermography Analyses

    International Nuclear Information System (INIS)

    Suriani, M J; Ali, Aidi; Sapuan, S M; Khalina, A; Abdullah, S

    2012-01-01

    Quite a few defects can occur during the manufacturing of composites, such as voids, resin-rich zones, pockets of undispersed cross-linker, misaligned fibres and regions where the resin has poorly wetted the fibres. Such defects can reduce the mechanical properties as well as the mechanical performance of the structure and thus must be detected. In this study, defects in kenaf/epoxy reinforced composite materials were determined by thermography analyses, and the mechanical properties of the composites were measured by tensile testing. 95% of the thermography analyses proved that the defects occurring in the composite reduced the mechanical properties of the specimens.

  11. Using Microsoft Office Excel 2007 to conduct generalized matching analyses.

    Science.gov (United States)

    Reed, Derek D

    2009-01-01

    The generalized matching equation is a robust and empirically supported means of analyzing relations between reinforcement and behavior. Unfortunately, no simple task analysis is available to behavior analysts interested in using the matching equation to evaluate data in clinical or applied settings. This technical article presents a task analysis for the use of Microsoft Excel to analyze and plot the generalized matching equation. Using a data-based case example and a step-by-step guide for completing the analysis, these instructions are intended to promote the use of quantitative analyses by researchers with little to no experience in quantitative analyses or the matching law.
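    Outside a spreadsheet, the same generalized matching analysis amounts to a least-squares fit of log(B1/B2) = a·log(R1/R2) + log b, where a is sensitivity and b is bias; the sketch below uses invented response and reinforcer counts.

```python
# Fitting the generalized matching equation by least squares (invented data).
import numpy as np

B1 = np.array([120, 80, 60, 30, 15])    # responses on alternative 1
B2 = np.array([ 20, 25, 40, 50, 60])    # responses on alternative 2
R1 = np.array([ 50, 35, 25, 12,  6])    # reinforcers obtained on alternative 1
R2 = np.array([ 10, 12, 20, 24, 30])    # reinforcers obtained on alternative 2

x = np.log10(R1 / R2)
y = np.log10(B1 / B2)
a, log_b = np.polyfit(x, y, 1)           # slope = sensitivity, intercept = log bias
print(f"sensitivity a = {a:.2f}, bias b = {10 ** log_b:.2f}")
```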

  12. USING MICROSOFT OFFICE EXCEL® 2007 TO CONDUCT GENERALIZED MATCHING ANALYSES

    Science.gov (United States)

    Reed, Derek D

    2009-01-01

    The generalized matching equation is a robust and empirically supported means of analyzing relations between reinforcement and behavior. Unfortunately, no simple task analysis is available to behavior analysts interested in using the matching equation to evaluate data in clinical or applied settings. This technical article presents a task analysis for the use of Microsoft Excel to analyze and plot the generalized matching equation. Using a data-based case example and a step-by-step guide for completing the analysis, these instructions are intended to promote the use of quantitative analyses by researchers with little to no experience in quantitative analyses or the matching law. PMID:20514196

  13. Preserving and reusing high-energy-physics data analyses

    CERN Document Server

    Simko, Tibor; Dasler, Robin; Fokianos, Pamfilos; Kuncar, Jiri; Lavasa, Artemis; Mattmann, Annemarie; Rodriguez, Diego; Trzcinska, Anna; Tsanaktsidis, Ioannis

    2017-01-01

    The revalidation, reuse and reinterpretation of data analyses require having access to the original virtual environments, datasets and software that were used to produce the original scientific result. The CERN Analysis Preservation pilot project is developing a set of tools that support particle physics researchers in preserving the knowledge around analyses so that capturing, sharing, reusing and reinterpreting data becomes easier. In this talk, we shall notably focus on the aspects of reusing a preserved analysis. We describe a system that makes it possible to instantiate the preserved analysis workflow on the computing cloud, paving the way to allowing researchers to revalidate and reinterpret research data even many years after the original publication.

  14. Race, Gender, and Researcher Positionality Analysed Through Memory Work

    DEFF Research Database (Denmark)

    Andreassen, Rikke; Myong, Lene

    2017-01-01

    Drawing upon feminist standpoint theory and memory work, the authors analyse racial privilege by investigating their own racialized and gendered subjectifications as academic researchers. By looking at their own experiences within academia, they show how authority and agency are contingent upon...

  15. Effects of GPS sampling intensity on home range analyses

    Science.gov (United States)

    Jeffrey J. Kolodzinski; Lawrence V. Tannenbaum; David A. Osborn; Mark C. Conner; W. Mark Ford; Karl V. Miller

    2010-01-01

    The two most common methods for determining home ranges, minimum convex polygon (MCP) and kernel analyses, can be affected by sampling intensity. Despite prior research, it remains unclear how high-intensity sampling regimes affect home range estimations. We used datasets from 14 GPS-collared, white-tailed deer (Odocoileus virginianus) to describe...
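    As a minimal illustration of the MCP method mentioned above, the sketch below computes a 100% minimum convex polygon home-range area from GPS fixes via a convex hull; the coordinates are invented and assumed to lie in a projected, metre-based system.

```python
# 100% minimum convex polygon (MCP) home-range area from GPS fixes (toy data).
import numpy as np
from scipy.spatial import ConvexHull

fixes = np.array([
    [500100.0, 4210050.0],
    [500420.0, 4210310.0],
    [500210.0, 4210600.0],
    [499880.0, 4210480.0],
    [499950.0, 4210120.0],
    [500230.0, 4210260.0],   # interior fix, does not change the hull
])

hull = ConvexHull(fixes)
area_ha = hull.volume / 10_000       # for 2-D input, ConvexHull.volume is the area
print(f"MCP home range: {area_ha:.2f} ha")
```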

  16. Prenominal and postnominal reduced relative clauses: arguments against unitary analyses

    NARCIS (Netherlands)

    Sleeman, P.

    2007-01-01

    These last years, several analyses have been proposed in which prenominal and postnominal reduced relatives are merged in the same position. Kayne (1994) claims that both types of reduced relative clauses are the complement of the determiner. More recently, Cinque (2005) has proposed that both types

  17. Optical region elemental abundance analyses of B and A stars

    International Nuclear Information System (INIS)

    Adelman, S.J.; Young, J.M.; Baldwin, H.E.

    1984-01-01

    Abundance analyses using optical region data and fully line blanketed model atmospheres have been performed for two sharp-lined hot Am stars o Pegasi and σ Aquarii and for the sharp-lined marginally peculiar A star v Cancri. The derived abundances exhibit definite anomalies compared with those of normal B-type stars and the Sun. (author)

  18. How to take environmental samples for stable isotope analyses

    International Nuclear Information System (INIS)

    Rogers, K.M.

    2009-01-01

    It is possible to analyse a diverse range of samples for environmental investigations. The main types are soil/sediments, vegetation, fauna, shellfish, waste and water. Each type of sample requires different storage and collection methods. Outlined here are the preferred methods of collection to ensure maximum sample integrity and reliability. (author).

  19. How to take environmental samples for stable isotope analyses

    International Nuclear Information System (INIS)

    Rogers, K.M.

    2013-01-01

    It is possible to analyse a diverse range of samples for environmental investigations. The main types are soil/sediments, vegetation, fauna, shellfish, waste and water. Each type of sample requires different storage and collection methods. Outlined here are the preferred methods of collection to ensure maximum sample integrity and reliability. (author).

  20. How to take environmental samples for stable isotope analyses

    International Nuclear Information System (INIS)

    Rogers, K.M.

    2012-01-01

    It is possible to analyse a diverse range of samples for environmental investigations. The main types are soil/sediments, vegetation, fauna, shellfish, waste and water. Each type of sample requires different storage and collection methods. Outlined here are the preferred methods of collection to ensure maximum sample integrity and reliability. (author).

  1. How to take environmental samples for stable isotope analyses

    International Nuclear Information System (INIS)

    Rogers, K.M.

    2009-01-01

    It is possible to analyse a diverse range of samples for environmental investigations. The main types are soil/sediments, vegetation, fauna, shellfish, waste and water. Each type of sample requires different storage and collection methods. Outlined here are the preferred methods of collection to ensure maximum sample integrity and reliability. (author)

  2. Laser Beam Caustic Measurement with Focal Spot Analyser

    DEFF Research Database (Denmark)

    Olsen, Flemming Ove; Gong, Hui; Bagger, Claus

    2005-01-01

    In industrial applications of high power CO2-lasers the caustic characteristics of the laser beam have great effects on the performance of the lasers. A well-defined, highly intense focused spot is essential for reliable production results. This paper presents a focal spot analyser that is developed...

  3. A bromine-based dichroic X-ray polarization analyser

    CERN Document Server

    Collins, S P; Brown, S D; Thompson, P

    2001-01-01

    We have demonstrated the advantages offered by dichroic X-ray polarization filters for linear polarization analysis, and describe such a device, based on a dibromoalkane/urea inclusion compound. The polarizer has been successfully tested by analysing the polarization of magnetic diffraction from holmium.

  4. Multitrait-Multimethod Analyses of Two Self-Concept Instruments.

    Science.gov (United States)

    Marsh, Herbert W.; Smith, Ian D.

    1982-01-01

    The multidimensionality of self-concept and the use of factor analysis in the development of self-concept instruments are supported in multitrait-multimethod analyses of the Sears and Coopersmith instruments. Convergent validity and discriminant validity of subscales in factor analysis and multitrait-multimethod analysis of longitudinal data are…

  5. Persuading Collaboration: Analysing Persuasion in Online Collaboration Projects

    DEFF Research Database (Denmark)

    McHugh, Ronan; Larsen, Birger

    2010-01-01

    In this paper we propose that online collaborative production sites can be fruitfully analysed in terms of the general theoretical framework of Persuasive Design. OpenStreetMap and The Pirate Bay are used as examples of collaborative production sites. Results of a quantitative analysis of persuas...

  6. Global post-Kyoto scenario analyses at PSI

    Energy Technology Data Exchange (ETDEWEB)

    Kypreos, S [Paul Scherrer Inst. (PSI), Villigen (Switzerland)

    1999-08-01

    Scenario analyses are described here using the Global MARKAL-Macro Trade (GMMT) model to study the economic implications of the Kyoto Protocol to the UN Convention on Climate Change. Some conclusions are derived in terms of efficient implementations of the post-Kyoto extensions of the Protocol. (author) 2 figs., 5 refs.

  7. Fundamental issues in finite element analyses of localization of deformation

    NARCIS (Netherlands)

    Borst, de R.; Sluys, L.J.; Mühlhaus, H.-B.; Pamin, J.

    1993-01-01

    Classical continuum models, i.e. continuum models that do not incorporate an internal length scale, suffer from excessive mesh dependence when strain-softening models are used in numerical analyses and cannot reproduce the size effect commonly observed in quasi-brittle failure. In this contribution

  8. Installation and performance evaluation of an indigenous surface area analyser

    International Nuclear Information System (INIS)

    Pillai, S.N.; Solapurkar, M.N.; Venkatesan, V.; Prakash, A.; Khan, K.B.; Kumar, Arun; Prasad, R.S.

    2014-01-01

    An indigenously available surface area analyser was installed inside a glove box and checked for its performance by analysing uranium oxide and thorium oxide powders at RMD. The unit has been made ready for the analysis of plutonium oxide powders after incorporating several important features. (author)

  9. Analysing the performance of dynamic multi-objective optimisation algorithms

    CSIR Research Space (South Africa)

    Helbig, M

    2013-06-01

    Full Text Available and the goal of the algorithm is to track a set of tradeoff solutions over time. Analysing the performance of a dynamic multi-objective optimisation algorithm (DMOA) is not a trivial task. For each environment (before a change occurs) the DMOA has to find a set...

  10. Application of digital image correlation method for analysing crack ...

    Indian Academy of Sciences (India)

    centrated strain by imitating the treatment of micro-cracks using the finite element ... water and moisture to penetrate the concrete leading to serious rust of the ... The correlations among various grey values of digital images are analysed for ...

  11. A Calculus for Modelling, Simulating and Analysing Compartmentalized Biological Systems

    DEFF Research Database (Denmark)

    Mardare, Radu Iulian; Ihekwaba, Adoha

    2007-01-01

    A. Ihekwaba, R. Mardare. A Calculus for Modelling, Simulating and Analysing Compartmentalized Biological Systems. Case study: NFkB system. In Proc. of International Conference of Computational Methods in Sciences and Engineering (ICCMSE), American Institute of Physics, AIP Proceedings, N 2...

  12. Nuclear Analyses of Indian LLCB Test Blanket System in ITER

    Science.gov (United States)

    Swami, H. L.; Shaw, A. K.; Danani, C.; Chaudhuri, Paritosh

    2017-04-01

    Heading towards its nuclear fusion reactor program, India is developing a Lead Lithium Ceramic Breeder (LLCB) tritium breeding blanket for its future fusion reactor. A mock-up of the LLCB blanket is proposed to be tested in ITER equatorial port no. 2, to ensure the overall performance of the blanket in a reactor-relevant nuclear fusion environment. Nuclear analyses play an important role in LLCB Test Blanket System design & development. They are required for tritium breeding estimation, thermal-hydraulic design, coolant process design, radioactive waste management, equipment maintenance & replacement strategies and nuclear safety. The nuclear behaviour of the LLCB test blanket module in ITER is predicted in terms of nuclear responses such as tritium production, nuclear heating, neutron fluxes and radiation damage. The radiation shielding capability of the LLCB TBS inside and outside the bio-shield was also assessed to fulfill ITER shielding requirements. In order to support the rad-waste and safety assessment, nuclear activation analyses were carried out and radioactivity data were generated for LLCB TBS components. Nuclear analyses of the LLCB TBS are performed using ITER-recommended nuclear analysis codes (i.e. MCNP, EASY), nuclear cross-section data libraries (i.e. FENDL 2.1, EAF) and the neutronic model (ITER C-lite v.1). The paper describes the comprehensive nuclear performance of the LLCB TBS in ITER.

  13. A review of bioinformatic methods for forensic DNA analyses.

    Science.gov (United States)

    Liu, Yao-Yuan; Harbison, SallyAnn

    2018-03-01

    Short tandem repeats, single nucleotide polymorphisms, and whole mitochondrial analyses are three classes of markers which will play an important role in the future of forensic DNA typing. The arrival of massively parallel sequencing platforms in forensic science reveals new information such as insights into the complexity and variability of the markers that were previously unseen, along with amounts of data too immense for analyses by manual means. Along with the sequencing chemistries employed, bioinformatic methods are required to process and interpret this new and extensive data. As more is learnt about the use of these new technologies for forensic applications, development and standardization of efficient, favourable tools for each stage of data processing is being carried out, and faster, more accurate methods that improve on the original approaches have been developed. As forensic laboratories search for the optimal pipeline of tools, sequencer manufacturers have incorporated pipelines into sequencer software to make analyses convenient. This review explores the current state of bioinformatic methods and tools used for the analyses of forensic markers sequenced on the massively parallel sequencing (MPS) platforms currently most widely used. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Morphometric analyses of the river basins in Goa

    Digital Repository Service at National Institute of Oceanography (India)

    Iyer, S.D.; Wagle, B.G.

    Morphometric analyses of seven river basins in Goa, India have been carried out. The linear and areal aspects of these basins are reported here. The plots of stream order versus stream numbers and stream orders versus mean stream lengths are found...

  15. Medical Isotope Production Analyses In KIPT Neutron Source Facility

    International Nuclear Information System (INIS)

    Talamo, Alberto; Gohar, Yousry

    2016-01-01

    Medical isotope production analyses in Kharkov Institute of Physics and Technology (KIPT) neutron source facility were performed to include the details of the irradiation cassette and the self-shielding effect. An updated detailed model of the facility was used for the analyses. The facility consists of an accelerator-driven system (ADS), which has a subcritical assembly using low-enriched uranium fuel elements with a beryllium-graphite reflector. The beryllium assemblies of the reflector have the same outer geometry as the fuel elements, which permits loading the subcritical assembly with different number of fuel elements without impacting the reflector performance. The subcritical assembly is driven by an external neutron source generated from the interaction of 100-kW electron beam with a tungsten target. The facility construction was completed at the end of 2015, and it is planned to start the operation during the year of 2016. It is the first ADS in the world, which has a coolant system for removing the generated fission power. Argonne National Laboratory has developed the design concept and performed extensive design analyses for the facility including its utilization for the production of different radioactive medical isotopes. 99Mo is the parent isotope of 99mTc, which is the most commonly used medical radioactive isotope. Detailed analyses were performed to define the optimal sample irradiation location and the generated activity, for several radioactive medical isotopes, as a function of the irradiation time.
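    The dependence of the generated activity on irradiation time mentioned above follows the standard activation-and-decay relation A(t) = R(1 - e^(-lambda*t)), which saturates at the production rate R; the sketch below evaluates it for a Mo-99-like half-life with an assumed production rate, purely for illustration.

```python
# Activity build-up during irradiation, A(t) = R * (1 - exp(-lambda * t)).
import math

half_life_h = 66.0                                  # approx. half-life of Mo-99 (hours)
decay_const = math.log(2) / (half_life_h * 3600.0)  # per second
production_rate = 1.0e9                             # atoms/s produced in the sample (assumed)

def activity(t_hours):
    """Activity in Bq after t_hours of steady irradiation."""
    return production_rate * (1.0 - math.exp(-decay_const * t_hours * 3600.0))

for t in (24, 66, 132, 264):
    print(f"after {t:4d} h irradiation: {activity(t) / 1e9:.2f} GBq")
```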

  16. Radiocarbon analyses along the EDML ice core in Antarctica

    NARCIS (Netherlands)

    van de Wal, R.S.W.; Meijer, H.A.J.; van Rooij, M.; van der Veen, C.

    2007-01-01

    Samples, 17 in total, from the EDML core drilled at Kohnen station Antarctica are analysed for 14CO and 14CO2 with a dry-extraction technique in combination with accelerator mass spectrometry. Results of the in situ produced 14CO fraction show a very low concentration of in situ produced 14CO.

  17. Radiocarbon analyses along the EDML ice core in Antarctica

    NARCIS (Netherlands)

    Van de Wal, R. S. W.; Meijer, H. A. J.; De Rooij, M.; Van der Veen, C.

    Samples, 17 in total, from the EDML core drilled at Kohnen station Antarctica are analysed for 14CO and 14CO2 with a dry-extraction technique in combination with accelerator mass spectrometry. Results of the in situ produced 14CO fraction show a very low concentration of in situ

  18. Weight analyses and nitrogen balance assay in rats fed extruded ...

    African Journals Online (AJOL)

    Weight analyses and nitrogen balance assays in adult rats fed raw and extruded African breadfruit (Treculia africana) based diets were carried out using response surface methodology in a central composite design. Process variables were feed composition (40 - 100 % African breadfruit, 0 - 5 % corn and 0 - 55 % soybean, ...

  19. Medical Isotope Production Analyses In KIPT Neutron Source Facility

    Energy Technology Data Exchange (ETDEWEB)

    Talamo, Alberto [Argonne National Lab. (ANL), Argonne, IL (United States); Gohar, Yousry [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-01-01

    Medical isotope production analyses in Kharkov Institute of Physics and Technology (KIPT) neutron source facility were performed to include the details of the irradiation cassette and the self-shielding effect. An updated detailed model of the facility was used for the analyses. The facility consists of an accelerator-driven system (ADS), which has a subcritical assembly using low-enriched uranium fuel elements with a beryllium-graphite reflector. The beryllium assemblies of the reflector have the same outer geometry as the fuel elements, which permits loading the subcritical assembly with different number of fuel elements without impacting the reflector performance. The subcritical assembly is driven by an external neutron source generated from the interaction of 100-kW electron beam with a tungsten target. The facility construction was completed at the end of 2015, and it is planned to start the operation during the year of 2016. It is the first ADS in the world, which has a coolant system for removing the generated fission power. Argonne National Laboratory has developed the design concept and performed extensive design analyses for the facility including its utilization for the production of different radioactive medical isotopes. 99Mo is the parent isotope of 99mTc, which is the most commonly used medical radioactive isotope. Detailed analyses were performed to define the optimal sample irradiation location and the generated activity, for several radioactive medical isotopes, as a function of the irradiation time.

  20. Karyotype analyses of the species of the genus Jurinea Cass ...

    African Journals Online (AJOL)

    In this study, karyotype analyses of 13 species belonging to the genus Jurinea Cass. (Compositae) and grown naturally in Turkey were conducted. These taxa include Jurinea alpigena C. Koch, Jurinea ancyrensis Bornm., Jurinea aucherana DC., Jurinea cadmea Boiss., Jurinea cataonica Boiss. and Hausskn., Jurinea ...

  1. Consumer Brand Choice: Individual and Group Analyses of Demand Elasticity

    Science.gov (United States)

    Oliveira-Castro, Jorge M.; Foxall, Gordon R.; Schrezenmaier, Teresa C.

    2006-01-01

    Following the behavior-analytic tradition of analyzing individual behavior, the present research investigated demand elasticity of individual consumers purchasing supermarket products, and compared individual and group analyses of elasticity. Panel data from 80 UK consumers purchasing 9 product categories (i.e., baked beans, biscuits, breakfast…
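    A common way to quantify such demand elasticity is a log-log regression of quantity purchased on unit price, with the slope read as the price elasticity; the sketch below uses invented panel-style purchase data.

```python
# Estimating price elasticity of demand from a log-log fit (invented data).
import numpy as np

price    = np.array([0.45, 0.52, 0.60, 0.68, 0.75, 0.90])   # unit price paid
quantity = np.array([9.0,  8.1,  7.0,  6.4,  5.8,  4.9])    # units bought per week

slope, intercept = np.polyfit(np.log(price), np.log(quantity), 1)
print(f"estimated price elasticity of demand: {slope:.2f}")
# A slope between 0 and -1 indicates inelastic demand for this product category.
```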

  2. Global post-Kyoto scenario analyses at PSI

    International Nuclear Information System (INIS)

    Kypreos, S.

    1999-01-01

    Scenario analyses are described here using the Global MARKAL-Macro Trade (GMMT) model to study the economic implications of the Kyoto Protocol to the UN Convention on Climate Change. Some conclusions are derived in terms of efficient implementations of the post-Kyoto extensions of the Protocol. (author) 2 figs., 5 refs

  3. Techniques for Scaling Up Analyses Based on Pre-interpretations

    DEFF Research Database (Denmark)

    Gallagher, John Patrick; Henriksen, Kim Steen; Banda, Gourinath

    2005-01-01

    a variety of analyses, both generic (such as mode analysis) and program-specific (with respect to a type describing some particular property of interest). Previous work demonstrated the approach using pre-interpretations over small domains. In this paper we present techniques that allow the method...

  4. Review of HEDL fuel pin transient analyses analytical programs

    International Nuclear Information System (INIS)

    Scott, J.H.; Baars, R.E.

    1975-05-01

    Methods for analysis of transient fuel pin performance are described, as represented by the steady-state SIEX code and the PECT series of codes used for steady-state and transient mechanical analyses. The empirical fuel failure correlation currently in use for analysis of transient overpower accidents is described. (U.S.)

  5. Geospatial analyses in support of heavy metal contamination ...

    African Journals Online (AJOL)

    This paper presents an exploratory assessment of heavy metal contamination along the main highways in Mafikeng, and illustrates how spatial analyses of the contamination for environmental management purposes can be supported by GIS and Remote Sensing. Roadside soil and grass (Stenotaphrum sp.) samples were ...

  6. Physico-Chemical and Bacteriological Analyses of Water Used for ...

    African Journals Online (AJOL)

    Samuel Olaleye

    Physicochemical and bacteriological analyses were carried out on well water, stream water and river water used for drinking and swimming purposes in Abeokuta, Nigeria. The results obtained were compared with WHO and EPA standards for drinking and recreational water. With the exception of Sokori stream and a well ...

  7. CPN Tools for Editing, Simulating, and Analysing Coloured Petri Nets

    DEFF Research Database (Denmark)

    Ratzer, Anne Vinter; Wells, Lisa Marie; Lassen, Henry Michael

    2003-01-01

    elements. The tool features incremental syntax checking and code generation which take place while a net is being constructed. A fast simulator efficiently handles both untimed and timed nets. Full and partial state spaces can be generated and analysed, and a standard state space report contains...

  8. Application of digital-image-correlation techniques in analysing ...

    Indian Academy of Sciences (India)

    Basic theory of strain analysis using the digital image correlation method ... Type 304N stainless steel (modulus of elasticity = 193 GPa, tensile yield ... also proves the accuracy of the qualitative analyses by using the DIC ... We thank the National Science Council of Taiwan for supporting this research through grant No. ...

  9. Preparation of Kepler light curves for asteroseismic analyses

    NARCIS (Netherlands)

    García, R.A.; Hekker, S.; Stello, D.; Gutiérrez-Soto, J.; Handberg, R.; Huber, D.; Karoff, C.; Uytterhoeven, K.; Appourchaux, T.; Chaplin, W.J.; Elsworth, Y.; Mathur, S.; Ballot, J.; Christensen-Dalsgaard, J.; Gilliland, R.L.; Houdek, G.; Jenkins, J.M.; Kjeldsen, H.; McCauliff, S.; Metcalfe, T.; Middour, C.K.; Molenda-Zakowicz, J.; Monteiro, M.J.P.F.G.; Smith, J.C.; Thompson, M.J.

    2011-01-01

    The Kepler mission is providing photometric data of exquisite quality for the asteroseismic study of different classes of pulsating stars. These analyses place particular demands on the pre-processing of the data, over a range of time-scales from minutes to months. Here, we describe processing

  10. Environmental analyses of land transportation systems in The Netherlands

    NARCIS (Netherlands)

    Bouwman, Mirjan E.; Moll, Henri C.

    Environmental analyses of the impact of transportation systems on the environment from the cradle to the grave are rare. This article makes a comparison of various Dutch passenger transportation systems by studying their complete life-cycle energy use. Moreover, systems are compared according to

  11. "Analysing Genre: Language Use in Professional Settings." A Review.

    Science.gov (United States)

    Drury, Helen

    1995-01-01

    "Analysing Genre," by Vijay K. Bhatia, is a timely addition to the literature on genre analysis in English for specific purposes. It is divided into three parts: the first provides theoretical background; the second explains how genre analysis works in different academic and professional settings; and the third exemplifies the…

  12. Analysing scientific workflows: Why workflows not only connect web services

    NARCIS (Netherlands)

    Wassink, I.; van der Vet, P.E.; Wolstencroft, K.; Neerincx, P.B.T.; Roos, M.; Rauwerda, H.; Breit, T.M.; Zhang, L.J.

    2009-01-01

    Life science workflow systems are developed to help life scientists to conveniently connect various programs and web services. In practice however, much time is spent on data conversion, because web services provided by different organisations use different data formats. We have analysed all the

  13. Analysing scientific workflows: why workflows not only connect web services

    NARCIS (Netherlands)

    Wassink, I.; van der Vet, P.E.; Wolstencroft, K.; Neerincx, P.B.T.; Roos, M.; Rauwerda, H.; Breit, T.M.; Zhang, LJ.

    2009-01-01

    Life science workflow systems are developed to help life scientists to conveniently connect various programs and web services. In practice however, much time is spent on data conversion, because web services provided by different organisations use different data formats. We have analysed all the

  14. Post-facta Analyses of Fukushima Accident and Lessons Learned

    Energy Technology Data Exchange (ETDEWEB)

    Tanabe, Fumiya [Sociotechnical Systems Safety Research Institute, Ichige (Japan)

    2014-08-15

    Independent analyses have been performed of the core melt behavior of the Unit 1, Unit 2 and Unit 3 reactors of Fukushima Daiichi Nuclear Power Station on 11-15 March 2011. The analyses are based on a phenomenological methodology with measured data investigation and a simple physical model calculation. Time variations of the core water level, core material temperature and hydrogen generation rate are estimated. The analyses have revealed characteristics of the accident process of each reactor. In the case of the Unit 2 reactor, the calculated result suggests little hydrogen generation, because no steam was generated in the core for the zirconium-steam reaction during the fuel damage process. This could explain why no hydrogen explosion occurred in the Unit 2 reactor building. Analyses have also been performed on the core material behavior in another chaotic period, 19-31 March 2011, and they resulted in a re-melt hypothesis that the core material in each reactor melted again owing to a shortage of cooling water. The hypothesis is consistent with many observed features of the dispersion of radioactive materials into the environment.

  15. Analyser-based phase contrast image reconstruction using geometrical optics

    International Nuclear Information System (INIS)

    Kitchen, M J; Pavlov, K M; Siu, K K W; Menk, R H; Tromba, G; Lewis, R A

    2007-01-01

    Analyser-based phase contrast imaging can provide radiographs of exceptional contrast at high resolution (<100 μm), whilst quantitative phase and attenuation information can be extracted using just two images when the approximations of geometrical optics are satisfied. Analytical phase retrieval can be performed by fitting the analyser rocking curve with a symmetric Pearson type VII function. The Pearson VII function provided at least a 10% better fit to experimentally measured rocking curves than linear or Gaussian functions. A test phantom, a hollow nylon cylinder, was imaged at 20 keV using a Si(1 1 1) analyser at the ELETTRA synchrotron radiation facility. Our phase retrieval method yielded a more accurate object reconstruction than methods based on a linear fit to the rocking curve. Where reconstructions failed to map expected values, calculations of the Takagi number permitted distinction between the violation of the geometrical optics conditions and the failure of curve fitting procedures. The need for synchronized object/detector translation stages was removed by using a large, divergent beam and imaging the object in segments. Our image acquisition and reconstruction procedure enables quantitative phase retrieval for systems with a divergent source and accounts for imperfections in the analyser
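
    As an aside on the fitting step described above, a symmetric Pearson type VII profile can be fitted to a measured rocking curve with a standard nonlinear least-squares routine. The Python sketch below is only an illustration under assumptions (a common parameterization with peak height I0, centre x0, width w and shape exponent m, and synthetic data in place of real measurements); it is not the authors' code.

        # Sketch: fitting a symmetric Pearson VII function to a rocking curve.
        # The parameterization and the synthetic data are assumptions.
        import numpy as np
        from scipy.optimize import curve_fit

        def pearson_vii(x, i0, x0, w, m):
            """Symmetric Pearson VII: Lorentzian for m = 1, near-Gaussian for large m."""
            return i0 * (1.0 + ((x - x0) / w) ** 2 * (2.0 ** (1.0 / m) - 1.0)) ** (-m)

        rng = np.random.default_rng(0)
        theta = np.linspace(-40.0, 40.0, 201)   # analyser angle (arbitrary units)
        measured = pearson_vii(theta, 1.0, 2.0, 8.0, 1.8) + rng.normal(scale=0.01, size=theta.size)

        popt, pcov = curve_fit(pearson_vii, theta, measured, p0=[1.0, 0.0, 10.0, 1.5])
        print("fitted (I0, x0, w, m):", popt)
        print("1-sigma uncertainties:", np.sqrt(np.diag(pcov)))

    The fitted profile can then be evaluated (and differentiated) analytically wherever the retrieval procedure needs it, which is the practical benefit of fitting an analytic function to the rocking curve.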

  16. Analyser-based phase contrast image reconstruction using geometrical optics.

    Science.gov (United States)

    Kitchen, M J; Pavlov, K M; Siu, K K W; Menk, R H; Tromba, G; Lewis, R A

    2007-07-21

    Analyser-based phase contrast imaging can provide radiographs of exceptional contrast at high resolution (<100 μm), whilst quantitative phase and attenuation information can be extracted using just two images when the approximations of geometrical optics are satisfied. Analytical phase retrieval can be performed by fitting the analyser rocking curve with a symmetric Pearson type VII function. The Pearson VII function provided at least a 10% better fit to experimentally measured rocking curves than linear or Gaussian functions. A test phantom, a hollow nylon cylinder, was imaged at 20 keV using a Si(1 1 1) analyser at the ELETTRA synchrotron radiation facility. Our phase retrieval method yielded a more accurate object reconstruction than methods based on a linear fit to the rocking curve. Where reconstructions failed to map expected values, calculations of the Takagi number permitted distinction between the violation of the geometrical optics conditions and the failure of curve fitting procedures. The need for synchronized object/detector translation stages was removed by using a large, divergent beam and imaging the object in segments. Our image acquisition and reconstruction procedure enables quantitative phase retrieval for systems with a divergent source and accounts for imperfections in the analyser.

  17. Safety analyses of the electrical systems on VVER NPP

    International Nuclear Information System (INIS)

    Andel, J.

    2004-01-01

    Energoprojekt Praha has been the main entity responsible for the section on 'Electrical Systems' in the safety reports of the Temelin, Dukovany and Mochovce nuclear power plants. The section comprises 2 main chapters, viz. Offsite Power System (issues of electrical energy production in main generators and the link to the offsite transmission grid) and Onsite Power Systems (AC and DC auxiliary system, both normal and safety related). In the chapter on the off-site system, attention is paid to the analysis of transmission capacity of the 400 kV lines, analysis of transient stability, multiple fault analyses, and probabilistic analyses of the grid and NPP power system reliability. In the chapter on the on-site system, attention is paid to the power balances of the electrical sources and switchboards set for various operational and accident modes, checks of loading and function of service and backup sources, short circuit current calculations, analyses of electrical protections, and analyses of the function and sizing of emergency sources (DG sets and UPS systems). (P.A.)

  18. A turbulent jet in crossflow analysed with proper orthogonal decomposition

    DEFF Research Database (Denmark)

    Meyer, Knud Erik; Pedersen, Jakob Martin; Özcan, Oktay

    2007-01-01

    and pipe diameter was 2400 and the jet to crossflow velocity ratios were R = 3.3 and R = 1.3. The experimental data have been analysed by proper orthogonal decomposition (POD). For R = 3.3, the results in several different planes indicate that the wake vortices are the dominant dynamic flow structures...

  19. Energy and exergy analyses of the diffusion absorption refrigeration system

    International Nuclear Information System (INIS)

    Yıldız, Abdullah; Ersöz, Mustafa Ali

    2013-01-01

    This paper describes the thermodynamic analyses of a DAR (diffusion absorption refrigeration) cycle. The experimental apparatus is set up as an ammonia–water DAR cycle with helium as the auxiliary inert gas. A thermodynamic model including mass, energy and exergy balance equations is presented for each component of the DAR cycle, and this model is then validated by comparison with experimental data. In the thermodynamic analyses, energy and exergy losses for each component of the system are quantified and illustrated. The system's energy and exergy losses and efficiencies are investigated. The highest energy and exergy losses occur in the solution heat exchanger. The highest energy losses in the experimental and theoretical analyses are found to be 25.7090 W and 25.4788 W respectively, whereas the corresponding exergy losses are calculated as 13.7933 W and 13.9976 W. Although the energy efficiencies obtained from both the model and the experimental study are 0.1858, the corresponding exergy efficiencies are found to be 0.0260 and 0.0356. - Highlights: • The diffusion absorption refrigeration system is designed, manufactured and tested. • The energy and exergy analyses of the system are presented theoretically and experimentally. • The energy and exergy losses are investigated for each component of the system. • The highest energy and exergy losses occur in the solution heat exchanger. • The energy and exergy performances are also calculated
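
    To illustrate the kind of component-level balance such an analysis rests on, the sketch below computes the exergy destruction of a generic heat exchanger from assumed steady-state stream data; the dead state, fluid properties and flow rates are placeholders, not the measurements reported above.

        # Sketch: exergy destruction of one component (e.g. a solution heat exchanger)
        # from steady-state inlet/outlet streams. All numbers are illustrative.
        T0 = 298.15            # dead-state temperature [K]
        h0, s0 = 105.0, 0.36   # assumed dead-state enthalpy [kJ/kg] and entropy [kJ/(kg K)]

        def flow_exergy(h, s):
            """Specific flow exergy relative to the dead state [kJ/kg]."""
            return (h - h0) - T0 * (s - s0)

        # Streams as (mass flow [kg/s], enthalpy [kJ/kg], entropy [kJ/(kg K)])
        hot_in,  hot_out  = (0.010, 450.0, 1.30), (0.010, 280.0, 0.85)
        cold_in, cold_out = (0.012, 105.0, 0.36), (0.012, 246.7, 0.74)

        def exergy_rate(stream):
            m, h, s = stream
            return m * flow_exergy(h, s)   # [kW]

        destruction = (exergy_rate(hot_in) + exergy_rate(cold_in)
                       - exergy_rate(hot_out) - exergy_rate(cold_out))
        print(f"exergy destruction = {1000 * destruction:.1f} W")

    Summing such terms over every component yields the kind of per-component loss breakdown described in the abstract.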

  20. Matrix Summaries Improve Research Reports: Secondary Analyses Using Published Literature

    Science.gov (United States)

    Zientek, Linda Reichwein; Thompson, Bruce

    2009-01-01

    Correlation matrices and standard deviations are the building blocks of many of the commonly conducted analyses in published research, and AERA and APA reporting standards recommend their inclusion when reporting research results. The authors argue that the inclusion of correlation/covariance matrices, standard deviations, and means can enhance…

  1. Genetic analyses for deciphering the status and role of ...

    Indian Academy of Sciences (India)

    Journal of Genetics, Volume 96, Issue 1. Genetic analyses for deciphering the status and role of photoperiodic and maturity genes in major Indian soybean cultivars. SANJAY GUPTA VIRENDER SINGH BHATIA GIRIRAJ KUMAWAT DEVSHREE THAKUR GOURAV SINGH RACHANA TRIPATHI GYANESH ...

  2. A new Link for Geographic analyses of Inventory Data

    Science.gov (United States)

    David Reed; Kurt Pregitzer; Scott A. Pugh; Patrick D. Miles

    2001-01-01

    The USDA Forest Service Forest Inventory and Analysis (FIA) data are widely used throughout the United States for analyses of forest status and trends, landscape-level forest composition, and other forest characteristics. A new software product, FIAMODEL, is available for analyzing FIA data within the ArcView (ESRI, Inc.) geographic information system. The software...

  3. Analyse cognitive d'une politique publique : justice ...

    African Journals Online (AJOL)

    Cognitive analysis of a public policy: environmental justice and "rural markets" for wood energy. ... energy sources to poor urban dwellers; and to reduce the poverty of rural households by promoting sustainable forest management, including income generation through producing and marketing charcoal.

  4. Analysing Harmonic Motions with an iPhone's Magnetometer

    Science.gov (United States)

    Yavuz, Ahmet; Temiz, Burak Kagan

    2016-01-01

    In this paper, we propose an experiment for analysing harmonic motion using an iPhone's (or iPad's) magnetometer. This experiment consists of the detection of magnetic field variations obtained from an iPhone's magnetometer sensor. A graph of harmonic motion is directly displayed on the iPhone's screen using the "Sensor Kinetics"…

  5. A process mining approach to analyse user behaviour

    NARCIS (Netherlands)

    Maruster, Laura; Faber, Niels R.; Jorna, Rene J.; van Haren, Rob J. F.; Cordeiro, J; Filipe, J; Hammoudi, S

    2008-01-01

    Designing and personalising systems for specific user groups encompasses a lot of effort with respect to analysing and understanding user behaviour. The goal of our paper is to provide a new methodology for determining navigational patterns of behaviour of specific user groups. We consider

  6. Diagnostic Comparison of Meteorological Analyses during the 2002 Antarctic Winter

    Science.gov (United States)

    Manney, Gloria L.; Allen, Douglas R.; Kruger, Kirstin; Naujokat, Barbara; Santee, Michelle L.; Sabutis, Joseph L.; Pawson, Steven; Swinbank, Richard; Randall, Cora E.; Simmons, Adrian J.

    2005-01-01

    Several meteorological datasets, including U.K. Met Office (MetO), European Centre for Medium-Range Weather Forecasts (ECMWF), National Centers for Environmental Prediction (NCEP), and NASA's Goddard Earth Observing System (GEOS-4) analyses, are being used in studies of the 2002 Southern Hemisphere (SH) stratospheric winter and Antarctic major warming. Diagnostics are compared to assess how these studies may be affected by the meteorological data used. While the overall structure and evolution of temperatures, winds, and wave diagnostics in the different analyses provide a consistent picture of the large-scale dynamics of the SH 2002 winter, several significant differences may affect detailed studies. The NCEP-NCAR reanalysis (REAN) and NCEP-Department of Energy (DOE) reanalysis-2 (REAN-2) datasets are not recommended for detailed studies, especially those related to polar processing, because of lower-stratospheric temperature biases that result in underestimates of polar processing potential, and because their winds and wave diagnostics show increasing differences from other analyses between ~30 and 10 hPa (their top level). Southern Hemisphere polar stratospheric temperatures in the ECMWF 40-Yr Re-analysis (ERA-40) show unrealistic vertical structure, so this long-term reanalysis is also unsuited for quantitative studies. The NCEP/Climate Prediction Center (CPC) objective analyses give an inferior representation of the upper-stratospheric vortex. Polar vortex transport barriers are similar in all analyses, but there is large variation in the amount, patterns, and timing of mixing, even among the operational assimilated datasets (ECMWF, MetO, and GEOS-4). The higher-resolution GEOS-4 and ECMWF assimilations provide significantly better representation of filamentation and small-scale structure than the other analyses, even when fields gridded at reduced resolution are studied. The choice of which analysis to use is most critical for detailed transport

  7. Radiation physics and shielding codes and analyses applied to design-assist and safety analyses of CANDU® and ACR™ reactors

    International Nuclear Information System (INIS)

    Aydogdu, K.; Boss, C. R.

    2006-01-01

    This paper discusses the radiation physics and shielding codes and analyses applied in the design of CANDU and ACR reactors. The focus is on the types of analyses undertaken rather than the inputs supplied to the engineering disciplines. Nevertheless, the discussion does show how these analyses contribute to the engineering design. Analyses in radiation physics and shielding can be categorized as either design-assist or safety and licensing (accident) analyses. Many of the analyses undertaken are designated 'design-assist' where the analyses are used to generate recommendations that directly influence plant design. These recommendations are directed at mitigating or reducing the radiation hazard of the nuclear power plant with engineered systems and components. Thus the analyses serve a primary safety function by ensuring the plant can be operated with acceptable radiation hazards to the workers and public. In addition to this role of design assist, radiation physics and shielding codes are also deployed in safety and licensing assessments of the consequences of radioactive releases of gaseous and liquid effluents during normal operation and gaseous effluents following accidents. In the latter category, the final consequences of accident sequences, expressed in terms of radiation dose to members of the public, and inputs to accident analysis, e.g., decay heat in fuel following a loss-of-coolant accident, are also calculated. Another role of the analyses is to demonstrate that the design of the plant satisfies the principle of ALARA (as low as reasonably achievable) radiation doses. This principle is applied throughout the design process to minimize worker and public doses. The principle of ALARA is an inherent part of all design-assist recommendations and safety and licensing assessments. The main focus of an ALARA exercise at the design stage is to minimize the radiation hazards at the source. This exploits material selection and impurity specifications and relies

  8. Pollen analyses of Pleistocene hyaena coprolites from Montenegro and Serbia

    Directory of Open Access Journals (Sweden)

    Argant Jacqueline

    2007-01-01

    The results of pollen analyses of hyaena coprolites from the Early Pleistocene cave of Trlica in northern Montenegro and the Late Pleistocene cave of Baranica in southeast Serbia are described. The Early Pleistocene Pachycrocuta brevirostris and the Late Pleistocene Crocuta spelaea are the coprolite-producing species. Although the pollen concentration was rather low, the presented analyses add considerably to the much-needed knowledge of the vegetation of the central Balkans during the Pleistocene. Pollen extracted from a coprolite from the Baranica cave indicates an open landscape with the presence of steppe taxa, which is in accordance with the recorded conditions and faunal remains. Pollen analysis of the Early Pleistocene samples from Trlica indicates fresh and temperate humid climatic conditions, as well as the co-existence of several biotopes which formed a mosaic landscape in the vicinity of the cave.

  9. Criticality safety analyses in SKODA JS a.s

    International Nuclear Information System (INIS)

    Mikolas, P.; Svarny, J.

    1999-01-01

    This paper describes criticality safety analyses of spent fuel systems for the storage and transport of spent fuel, performed in SKODA JS s.r.o. Analyses were performed for different systems at the NPP site, including the originally designed spent fuel pool with a large pitch between assemblies and without any special absorbing material, a high-density spent fuel pool with additional absorption by boron steel, a depository rack for fresh fuel assemblies with a very large pitch between fuel assemblies, a container for transport of fresh fuel into the reactor pool, a cask for transport and storage of spent fuel, and a container for the final storage depository. The required subcriticality has been proven taking into account all possible unfavourable conditions, uncertainties, etc. In two cases, the burnup credit methodology is expected to be used. (Authors)

  10. Iterative categorization (IC): a systematic technique for analysing qualitative data

    Science.gov (United States)

    2016-01-01

    The processes of analysing qualitative data, particularly the stage between coding and publication, are often vague and/or poorly explained within addiction science and research more broadly. A simple but rigorous and transparent technique for analysing qualitative textual data, developed within the field of addiction, is described. The technique, iterative categorization (IC), is suitable for use with inductive and deductive codes and can support a range of common analytical approaches, e.g. thematic analysis, Framework, constant comparison, analytical induction, content analysis, conversational analysis, discourse analysis, interpretative phenomenological analysis and narrative analysis. Once the data have been coded, the only software required is a standard word processing package. Worked examples are provided. PMID:26806155

  11. Multivariate analyses of crater parameters and the classification of craters

    Science.gov (United States)

    Siegal, B. S.; Griffiths, J. C.

    1974-01-01

    Multivariate analyses were performed on certain linear dimensions of six genetic types of craters. A total of 320 craters, consisting of laboratory fluidization craters, craters formed by chemical and nuclear explosives, terrestrial maars and other volcanic craters, and terrestrial meteorite impact craters, authenticated and probable, were analyzed in the first data set in terms of their mean rim crest diameter, mean interior relief, rim height, and mean exterior rim width. The second data set contained an additional 91 terrestrial craters of which 19 were of experimental percussive impact and 28 of volcanic collapse origin, and which was analyzed in terms of mean rim crest diameter, mean interior relief, and rim height. Principal component analyses were performed on the six genetic types of craters. Ninety per cent of the variation in the variables can be accounted for by two components. Ninety-nine per cent of the variation in the craters formed by chemical and nuclear explosives is explained by the first component alone.
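
    The reduction to two dominant components can be reproduced in outline with a standard principal component analysis; the sketch below uses randomly generated stand-in data with the same four variables, not the study's crater measurements.

        # Sketch: PCA of crater dimensions (mean rim crest diameter, mean interior
        # relief, rim height, mean exterior rim width). The data are synthetic.
        import numpy as np
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(1)
        size = rng.lognormal(mean=0.0, sigma=1.0, size=320)   # latent crater "size"
        X = np.column_stack([
            size * 10.0 + rng.normal(0, 0.5, 320),   # mean rim crest diameter
            size * 1.5  + rng.normal(0, 0.3, 320),   # mean interior relief
            size * 0.4  + rng.normal(0, 0.2, 320),   # rim height
            size * 3.0  + rng.normal(0, 0.4, 320),   # mean exterior rim width
        ])

        pca = PCA().fit(StandardScaler().fit_transform(X))
        print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 3))
        print("cumulative:", np.round(np.cumsum(pca.explained_variance_ratio_), 3))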

  12. Hydrogen-combustion analyses of large-scale tests

    International Nuclear Information System (INIS)

    Gido, R.G.; Koestel, A.

    1986-01-01

    This report uses results of the large-scale tests with turbulence performed by the Electric Power Research Institute at the Nevada Test Site to evaluate hydrogen burn-analysis procedures based on lumped-parameter codes like COMPARE-H2 and associated burn-parameter models. The test results: (1) confirmed, in a general way, the procedures for application to pulsed burning, (2) increased significantly our understanding of the burn phenomenon by demonstrating that continuous burning can occur, and (3) indicated that steam can terminate continuous burning. Future actions recommended include: (1) modification of the code to perform continuous-burn analyses, which is demonstrated, (2) analyses to determine the type of burning (pulsed or continuous) that will exist in nuclear containments and the stable location if the burning is continuous, and (3) changes to the models for estimating burn parameters

  13. Hydrogen-combustion analyses of large-scale tests

    International Nuclear Information System (INIS)

    Gido, R.G.; Koestel, A.

    1986-01-01

    This report uses results of the large-scale tests with turbulence performed by the Electric Power Research Institute at the Nevada Test Site to evaluate hydrogen burn-analysis procedures based on lumped-parameter codes like COMPARE-H2 and associated burn-parameter models. The test results (a) confirmed, in a general way, the procedures for application to pulsed burning, (b) increased significantly our understanding of the burn phenomenon by demonstrating that continuous burning can occur and (c) indicated that steam can terminate continuous burning. Future actions recommended include (a) modification of the code to perform continuous-burn analyses, which is demonstrated, (b) analyses to determine the type of burning (pulsed or continuous) that will exist in nuclear containments and the stable location if the burning is continuous, and (c) changes to the models for estimating burn parameters

  14. Contribution of thermo-fluid analyses to the LHC experiments

    CERN Document Server

    Gasser, G

    2003-01-01

    The large amount of electrical and electronic equipment that will be installed in the four LHC experiments will cause significant heat dissipation into the detectors' volumes. This is a major issue for the experimental groups, as temperature stability is often a fundamental requirement for the different sub-detectors to provide good measurement quality. The thermo-fluid analyses carried out in the ST/CV group are a very efficient tool to understand and predict the thermal behaviour of the detectors. These studies are undertaken according to the needs of the experimental groups; they aim to evaluate the thermal stability of a proposed design, or to compare different technical solutions in order to choose the best one for the final design. The usual approach to carrying out these studies is presented first, and then some practical examples of thermo-fluid analyses are presented, focusing on the main results in order to illustrate their contribution.

  15. Numerical analyses of an aircraft crash on containment building

    Energy Technology Data Exchange (ETDEWEB)

    Sim, Jae Min; Kim, Seung Hyun; Chang, Yoon Suk [Kyunghee University, Yongin (Korea, Republic of)

    2016-05-15

    The containment building is responsible for isolating and protecting internal equipment against external conditions such as earthquakes, hurricanes and impact loading. It also has to prevent leakage of radioactivity when severe accidents, such as a LOCA (Loss Of Coolant Accident), occur. Meanwhile, awareness of threats such as terrorism has increased globally after the international aircraft crashes at the World Trade Center and the Pentagon. In this paper, FE (Finite Element) analyses for different crash locations and speeds were carried out to examine the impact of an aircraft crash on a domestic containment building. In this paper, numerical analyses of an aircraft crash on an NPP's containment building were performed taking into account different locations and aircraft speeds. (1) The amount of concrete failure depended on the crash location, and the connector was the most vulnerable location compared to the dome and wall parts. (2) Maximum stress values generated at the liner plate and rebars did not exceed their UTS values.

  16. A database structure for radiological optimization analyses of decommissioning operations

    International Nuclear Information System (INIS)

    Zeevaert, T.; Van de Walle, B.

    1995-09-01

    The structure of a database for decommissioning experiences is described. Radiological optimization is a major radiation protection principle in practices and interventions, involving radiological protection factors, economic costs and social factors. An important lack of knowledge with respect to these factors exists in the domain of the decommissioning of nuclear power plants, due to the low number of decommissioning operations performed so far. Moreover, decommissioning takes place only once for an installation. Tasks, techniques, and procedures are in most cases rather specific, limiting the use of past experiences in the radiological optimization analyses of new decommissioning operations. Therefore, it is important that relevant data or information be acquired from decommissioning experiences. These data have to be stored in a database in such a way that they can be used efficiently in ALARA analyses of future decommissioning activities

  17. Iterative categorization (IC): a systematic technique for analysing qualitative data.

    Science.gov (United States)

    Neale, Joanne

    2016-06-01

    The processes of analysing qualitative data, particularly the stage between coding and publication, are often vague and/or poorly explained within addiction science and research more broadly. A simple but rigorous and transparent technique for analysing qualitative textual data, developed within the field of addiction, is described. The technique, iterative categorization (IC), is suitable for use with inductive and deductive codes and can support a range of common analytical approaches, e.g. thematic analysis, Framework, constant comparison, analytical induction, content analysis, conversational analysis, discourse analysis, interpretative phenomenological analysis and narrative analysis. Once the data have been coded, the only software required is a standard word processing package. Worked examples are provided. © 2016 The Authors. Addiction published by John Wiley & Sons Ltd on behalf of Society for the Study of Addiction.

  18. Quantitative metagenomic analyses based on average genome size normalization

    DEFF Research Database (Denmark)

    Frank, Jeremy Alexander; Sørensen, Søren Johannes

    2011-01-01

    provide not just a census of the community members but direct information on metabolic capabilities and potential interactions among community members. Here we introduce a method for the quantitative characterization and comparison of microbial communities based on the normalization of metagenomic data … marine sources using both conventional small-subunit (SSU) rRNA gene analyses and our quantitative method to calculate the proportion of genomes in each sample that are capable of a particular metabolic trait. With both environments, to determine what proportion of each community they make up and how … These analyses demonstrate how genome proportionality compares to SSU rRNA gene relative abundance and how factors such as average genome size and SSU rRNA gene copy number affect sampling probability and therefore both types of community analysis.
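
    The flavour of the normalization can be conveyed with a back-of-the-envelope calculation: the sequencing depth of a (single-copy) trait gene is compared with the number of "genome equivalents" implied by the average genome size. The sketch below illustrates the idea only; it is not the authors' pipeline and all numbers are invented.

        # Sketch: proportion of genomes in a metagenome that carry a trait,
        # normalized by an estimated average genome size (AGS). Numbers are invented.
        total_reads = 2_000_000       # reads in the metagenome
        read_length = 150             # bp
        avg_genome_size = 3_500_000   # bp, estimated AGS of the community

        reads_on_marker = 1_200       # reads mapping to a single-copy trait gene
        marker_length = 1_400         # bp

        # How many "average genomes" worth of sequence was obtained.
        genome_equivalents = total_reads * read_length / avg_genome_size

        # Depth of the marker gene ~ number of sequenced genome copies carrying it.
        marker_depth = reads_on_marker * read_length / marker_length

        print(f"genome equivalents: {genome_equivalents:.1f}")
        print(f"proportion of genomes with the trait: {marker_depth / genome_equivalents:.2%}")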

  19. Thermal analyses of the IF-300 shipping cask

    International Nuclear Information System (INIS)

    Meier, J.K.

    1978-07-01

    In order to supply temperature data for structural testing and analysis of shipping casks, a series of thermal analyses using the TRUMP thermal analyzer program were performed on the GE IF-300 spent fuel shipping cask. Major conclusions of the analyses are: (1) Under normal cooling conditions and a cask heat load of 262,000 BTU/h, the seal area of the cask will be roughly 100 °C (180 °F) above the ambient surroundings. (2) Under these same conditions the uranium shield at the midpoint of the cask will be between 69 °C (125 °F) and 92 °C (166 °F) above the ambient surroundings. (3) Significant thermal gradients are not likely to develop between the head studs and the surrounding metal. (4) A representative time constant for the cask as a whole is on the order of one day

  20. CPN Tools for Editing, Simulating, and Analysing Coloured Petri Nets

    DEFF Research Database (Denmark)

    Ratzer, Anne Vinter; Wells, Lisa Marie; Lassen, Henry Michael

    2003-01-01

    CPN Tools is a tool for editing, simulating and analysing Coloured Petri Nets. The GUI is based on advanced interaction techniques, such as toolglasses, marking menus, and bi-manual interaction. Feedback facilities provide contextual error messages and indicate dependency relationships between net elements. The tool features incremental syntax checking and code generation which take place while a net is being constructed. A fast simulator efficiently handles both untimed and timed nets. Full and partial state spaces can be generated and analysed, and a standard state space report contains information such as boundedness properties and liveness properties. The functionality of the simulation engine and state space facilities are similar to the corresponding components in Design/CPN, which is a widespread tool for Coloured Petri Nets.

  1. Applications of MIDAS regression in analysing trends in water quality

    Science.gov (United States)

    Penev, Spiridon; Leonte, Daniela; Lazarov, Zdravetz; Mann, Rob A.

    2014-04-01

    We discuss novel statistical methods for analysing trends in water quality. Such analysis uses complex data sets of different classes of variables, including water quality, hydrological and meteorological. We analyse the effect of rainfall and flow on trends in water quality utilising a flexible model called Mixed Data Sampling (MIDAS). This model arises because of the mixed frequencies in the data collection. Typically, water quality variables are sampled fortnightly, whereas the rainfall data are sampled daily. The advantage of using MIDAS regression lies in the flexible and parsimonious modelling of the influence of rainfall and flow on trends in water quality variables. We discuss the model and its implementation on a data set from the Shoalhaven Supply System and Catchments in the state of New South Wales, Australia. Information criteria indicate that MIDAS modelling improves upon simplistic approaches that do not utilise the mixed data sampling nature of the data.
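
    The core of the MIDAS idea can be sketched briefly: the daily rainfall lags are collapsed into a single regressor through a low-dimensional weight function (here exponential Almon weights), so only a couple of weight parameters need to be estimated alongside the regression coefficients. The Python sketch below uses simulated data and a simple grid search over one weight parameter; it illustrates the principle only, not the model fitted to the Shoalhaven data.

        # Sketch: MIDAS-style regression of a fortnightly series on daily rainfall
        # using exponential Almon lag weights. Data are simulated placeholders.
        import numpy as np

        def exp_almon_weights(n_lags, theta1, theta2=0.0):
            k = np.arange(n_lags)
            w = np.exp(theta1 * k + theta2 * k ** 2)
            return w / w.sum()

        rng = np.random.default_rng(2)
        n_obs, n_lags = 120, 14                                # 120 fortnights, 14 daily lags
        rain_lags = rng.gamma(0.8, 5.0, size=(n_obs, n_lags))  # daily rainfall lags per observation
        y = 4.0 + 0.25 * rain_lags @ exp_almon_weights(n_lags, -0.3) + rng.normal(0, 0.5, n_obs)

        best = None
        for theta1 in np.linspace(-1.0, 0.0, 21):              # profile the weight parameter
            x = rain_lags @ exp_almon_weights(n_lags, theta1)
            X = np.column_stack([np.ones(n_obs), x])
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            sse = np.sum((y - X @ beta) ** 2)
            if best is None or sse < best[0]:
                best = (sse, theta1, beta)
        print("profiled theta1:", best[1], "intercept/slope:", np.round(best[2], 3))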

  2. Elemental abundance and analyses with coadded DAO spectrograms

    International Nuclear Information System (INIS)

    Adelman, S.J.

    1987-01-01

    One can improve the quality of elemental abundance analyses by using data of higher signal-to-noise ratio than has been the practice at high resolution. The procedures developed at the Dominion Astrophysical Observatory to coadd high-dispersion coude spectrograms are used with a minimum of ten 6.5 Å mm⁻¹ IIa-O spectrograms of each of three field horizontal-branch (FHB) A stars to increase the signal-to-noise ratio of the photographic data over a considerable wavelength region. Fine analyses of the sharp-lined prototype FHB stars HD 109995 and 161817 show an internal consistency which justifies this effort. Their photospheric elemental abundances are similar to those of Population II globular cluster giants. (author)

  3. Sensitivity and uncertainty analyses for performance assessment modeling

    International Nuclear Information System (INIS)

    Doctor, P.G.

    1988-08-01

    Sensitivity and uncertainty analyses methods for computer models are being applied in performance assessment modeling in the geologic high level radioactive waste repository program. The models used in performance assessment tend to be complex physical/chemical models with large numbers of input variables. There are two basic approaches to sensitivity and uncertainty analyses: deterministic and statistical. The deterministic approach to sensitivity analysis involves numerical calculation or employs the adjoint form of a partial differential equation to compute partial derivatives; the uncertainty analysis is based on Taylor series expansions of the input variables propagated through the model to compute means and variances of the output variable. The statistical approach to sensitivity analysis involves a response surface approximation to the model with the sensitivity coefficients calculated from the response surface parameters; the uncertainty analysis is based on simulation. The methods each have strengths and weaknesses. 44 refs
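
    The deterministic route can be illustrated in a few lines: finite-difference sensitivities stand in for the partial derivatives, and a first-order Taylor expansion propagates the input variances to the output (assuming independent inputs). The toy model and numbers below are invented for illustration; they are not part of the repository programs.

        # Sketch: first-order (Taylor-series) uncertainty propagation with
        # finite-difference sensitivity coefficients. Model and data are invented.
        import numpy as np

        def model(x):
            v, R, L = x          # toy inputs: velocity, retardation factor, path length
            return L * R / v     # toy output: a travel time

        x0 = np.array([1.0e-2, 50.0, 500.0])   # input means
        sd = np.array([2.0e-3, 10.0,  50.0])   # input standard deviations (independent)

        sens = np.empty_like(x0)
        for i in range(x0.size):               # central differences for dY/dx_i
            dx = 1e-4 * x0[i]
            xp, xm = x0.copy(), x0.copy()
            xp[i] += dx
            xm[i] -= dx
            sens[i] = (model(xp) - model(xm)) / (2.0 * dx)

        mean_y = model(x0)
        std_y = np.sqrt(np.sum((sens * sd) ** 2))   # first-order variance propagation
        print(f"output mean = {mean_y:.3g}, output std = {std_y:.3g}")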

  4. Analysing spatially extended high-dimensional dynamics by recurrence plots

    Energy Technology Data Exchange (ETDEWEB)

    Marwan, Norbert, E-mail: marwan@pik-potsdam.de [Potsdam Institute for Climate Impact Research, 14412 Potsdam (Germany); Kurths, Jürgen [Potsdam Institute for Climate Impact Research, 14412 Potsdam (Germany); Humboldt Universität zu Berlin, Institut für Physik (Germany); Nizhny Novgorod State University, Department of Control Theory, Nizhny Novgorod (Russian Federation); Foerster, Saskia [GFZ German Research Centre for Geosciences, Section 1.4 Remote Sensing, Telegrafenberg, 14473 Potsdam (Germany)

    2015-05-08

    Recurrence plot based measures of complexity are capable tools for characterizing complex dynamics. In this letter we show the potential of selected recurrence plot measures for the investigation of even high-dimensional dynamics. We apply this method to spatially extended chaos, such as that derived from the Lorenz96 model, and show that the recurrence plot based measures can qualitatively characterize typical dynamical properties such as chaotic or periodic dynamics. Moreover, we demonstrate its power by analysing satellite image time series of vegetation cover with contrasting dynamics as a spatially extended and potentially high-dimensional example from the real world. - Highlights: • We use recurrence plots for analysing spatially extended dynamics. • We investigate the high-dimensional chaos of the Lorenz96 model. • The approach distinguishes different spatio-temporal dynamics. • We use the method for studying vegetation cover time series.
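
    A recurrence matrix and the simplest recurrence-based measure (the recurrence rate) take only a few lines to compute; the sketch below uses a toy one-dimensional series rather than the Lorenz96 model or the satellite image time series discussed above.

        # Sketch: recurrence matrix and recurrence rate for a toy time series.
        import numpy as np

        rng = np.random.default_rng(3)
        t = np.linspace(0, 20 * np.pi, 600)
        x = np.sin(t) + 0.1 * rng.normal(size=t.size)   # toy signal

        dist = np.abs(x[:, None] - x[None, :])          # pairwise state distances
        eps = 0.2 * np.std(x)                           # recurrence threshold
        R = (dist <= eps).astype(int)                   # recurrence matrix

        print(f"recurrence rate: {R.mean():.3f}")
        # Plotting R (e.g. with matplotlib's imshow) shows the texture that the
        # quantitative measures summarize: long diagonal lines for deterministic
        # dynamics, uniform speckle for noise.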

  5. SVM models for analysing the headstreams of mine water inrush

    Energy Technology Data Exchange (ETDEWEB)

    Yan Zhi-gang; Du Pei-jun; Guo Da-zhi [China University of Science and Technology, Xuzhou (China). School of Environmental Science and Spatial Informatics

    2007-08-15

    The support vector machine (SVM) model was introduced to analyse the headstream of water inrush in a coal mine. The SVM model, based on a hydrogeochemical method, was constructed for recognising two kinds of headstreams, and the H-SVMs model was constructed for recognising multiple headstreams. The SVM method was applied to analyse the conditions of two mixed headstreams, and the value of the SVM decision function was investigated as a means of denoting the hydrogeochemical abnormality. The experimental results show that the SVM is based on a strict mathematical theory, has a simple structure and has good overall performance. Moreover, the parameter w in the decision function can describe the weights of the discrimination indices of the headstream of water inrush. The value of the decision function can denote hydrogeochemical abnormality, which is significant in the prevention of water inrush in a coal mine. 9 refs., 1 fig., 7 tabs.
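
    For the two-headstream case, the setup maps onto a standard linear support vector machine, in which the weight vector of the decision function plays the role of the parameter w mentioned above. The sketch below uses invented hydrogeochemical indices (ion concentrations) rather than the mine data.

        # Sketch: linear SVM separating two water-inrush headstreams from
        # hydrogeochemical indices. The training data are invented.
        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(4)
        # Hypothetical indices, e.g. [Ca2+, Mg2+, Na+ + K+, HCO3-, SO4 2-, Cl-] in mg/L.
        aquifer_a = rng.normal([120, 40,  60, 300, 150, 80], 15, size=(30, 6))
        aquifer_b = rng.normal([ 60, 20, 150, 200, 350, 40], 15, size=(30, 6))
        X = np.vstack([aquifer_a, aquifer_b])
        y = np.array([0] * 30 + [1] * 30)

        clf = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0)).fit(X, y)

        w = clf.named_steps["svc"].coef_[0]   # weights of the discrimination indices
        print("decision-function weights w:", np.round(w, 3))
        sample = np.array([[90, 30, 100, 250, 260, 60]])
        print("decision value (sign indicates headstream):", clf.decision_function(sample))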

  6. Physiological and enzymatic analyses of pineapple subjected to ionizing radiation

    International Nuclear Information System (INIS)

    Silva, Josenilda Maria da; Silva, Juliana Pizarro; Spoto, Marta Helena Fillet

    2007-01-01

    The physiological and enzymatic post-harvest characteristics of the pineapple cultivar Smooth Cayenne were evaluated after the fruits were gamma-irradiated with doses of 100 and 150 Gy and stored for 10, 20 and 30 days at 12 °C (±1) and a relative humidity of 85% (±5). Physiological and enzymatic analyses were made for each storage period to evaluate the alterations resulting from the application of ionizing radiation. Control specimens showed higher values of soluble pectins, total pectins, reducing sugars, sucrose and total sugars, and lower values of polyphenol oxidase and polygalacturonase enzyme activities. All the analyses indicated that storage time is a significantly influencing factor. The 100 Gy dose and the 20-day storage period presented the best results from the standpoint of maturation and conservation of fruit quality. (author)

  7. Engineering analyses of ITER divertor diagnostic rack design

    Energy Technology Data Exchange (ETDEWEB)

    Modestov, Victor S., E-mail: modestov@compmechlab.com [St Petersburg State Polytechnical University, 195251 St Petersburg, 29 Polytechnicheskaya (Russian Federation); Nemov, Alexander S.; Borovkov, Aleksey I.; Buslakov, Igor V.; Lukin, Aleksey V. [St Petersburg State Polytechnical University, 195251 St Petersburg, 29 Polytechnicheskaya (Russian Federation); Kochergin, Mikhail M.; Mukhin, Eugene E.; Litvinov, Andrey E.; Koval, Alexandr N. [Ioffe Physico-Technical Institute, 194021 St Petersburg, 26 Polytechnicheskaya (Russian Federation); Andrew, Philip [ITER Organization, Route de Vinon sur Verdon, 13115 Saint Paul Lez Durance (France)

    2013-10-15

    Highlights: • The approach developed earlier has been used for the assessment of the new design of the DTS racks and neutron shield units. • Results of the most critical EM and seismic analyses indicate that the introduced changes significantly improved the system behaviour under these loads. • However, further research is required to finalize the design and check that it meets all structural, thermal, seismic, EM and fatigue requirements. -- Abstract: The divertor port racks used as a support structure for the divertor Thomson scattering equipment have been carefully analyzed for consistency with electromagnetic and seismic loads. The foregoing simulations show that precisely these analyses expose the critical challenges associated with the structural design. Based on the results for the reference structure [2], a modified design of the diagnostic racks is proposed and updated simulation results are given. The results represent a significant improvement over the previous reference layout, and the design work will be continued towards finalization.

  8. A MULTIVARIATE APPROACH TO ANALYSE NATIVE FOREST TREE SPECIE SEEDS

    Directory of Open Access Journals (Sweden)

    Alessandro Dal Col Lúcio

    2006-03-01

    This work grouped, by species, the most similar seed trees, using the variables observed in exotic forest species of the Brazilian flora, from seeds collected in the Forest Research and Soil Conservation Center of Santa Maria, Rio Grande do Sul, and analyzed from January 1997 to March 2003. For the cluster analysis, all the species that had four or more analyses per lot were analyzed by the hierarchical clustering method with the standardized mean Euclidean distance, and a principal component analysis technique was also used to reduce the number of variables. The species Callistemon speciosus, Cassia fistula, Eucalyptus grandis, Eucalyptus robusta, Eucalyptus saligna, Eucalyptus tereticornis, Delonix regia, Jacaranda mimosaefolia and Pinus elliottii presented more than four analyses per lot, and for these the third and fourth principal components explained 80% of the total variation. The cluster analysis was efficient in separating the groups of all tested species, as was the principal component method.
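
    The grouping step can be sketched with SciPy's hierarchical clustering on a standardized Euclidean distance, after a principal component step to reduce the variables; the per-lot variables below are placeholders, not the centre's seed-analysis records.

        # Sketch: hierarchical clustering of seed lots on a standardized Euclidean
        # distance after PCA. Variables and values are invented placeholders.
        import numpy as np
        from scipy.spatial.distance import pdist
        from scipy.cluster.hierarchy import linkage, fcluster
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(5)
        # Hypothetical per-lot variables: germination %, purity %, 1000-seed weight (g),
        # moisture %, log(seeds per kg), viability %.
        X = np.vstack([
            rng.normal([85, 98, 4.0, 10, 11.5, 90], 2.0, size=(12, 6)),
            rng.normal([60, 95, 2.5, 12, 12.5, 70], 2.0, size=(12, 6)),
            rng.normal([75, 97, 6.0,  9, 10.8, 82], 2.0, size=(12, 6)),
        ])

        scores = PCA(n_components=4).fit_transform(X)   # keep the leading components
        Z = linkage(pdist(scores, metric="seuclidean"), method="average")
        labels = fcluster(Z, t=3, criterion="maxclust")
        print("cluster labels:", labels)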

  9. Lipid analyses of fumigated vs irradiated raw and roasted almonds

    International Nuclear Information System (INIS)

    Uthman, R.S.; Toma, R.B.; Garcia, R.; Medora, N.P.; Cunningham, S.

    1998-01-01

    The purpose of this study was to compare the effects of propylene oxide (PO) and irradiation treatments on the lipid analyses of raw and roasted almonds. Eight kilograms each of raw and roasted almonds were divided into four batches (2 kg each). Three of the batches were subjected to PO treatment or to irradiation with doses of 6 or 10.5 kGy. The untreated batch served as the control. Samples were taken from all the batches at three consecutive times during storage (day 0, 8 weeks and 16 weeks) and analysed for iodine number, peroxide value and 2-thiobarbituric acid number. Overall, irradiated almonds incurred a higher variation in lipid stability than PO-treated almonds, while roasted almonds incurred a higher variation than raw almonds

  10. Neoliberalism in education: Five images of critical analyses

    Directory of Open Access Journals (Sweden)

    Branislav Pupala

    2011-03-01

    The survey study brings information about the way that educational research copes with neoliberalism as a generalized form of social government in current western culture. It shows that neoliberalism is considered a universal scope for other changes in the basic segments of education, and that theoretical and critical analyses of this phenomenon represent an important part of production in the area of educational research. It emphasizes the contribution of the formation and development of the so-called governmentality studies to the comprehension of the mechanisms and consequences of neoliberal government of society, and shows how the methodology of these studies helps to identify neoliberal strategies used in the regulation of social subjects by education. Five selected segments of critical analyses are elaborated (from the concept of lifelong learning, through preschool and university education, to the education of teachers and the PISA project) that clearly show the ideological and theoretical cohesiveness of the analysis of education through the lens of neoliberal governmentality.

  11. Energy and exergy analyses of electrolytic hydrogen production

    Energy Technology Data Exchange (ETDEWEB)

    Rosen, M A [Ryerson Polytechnic Univ., Toronto, ON (Canada). Dept. of Mechanical Engineering

    1995-07-01

    The thermodynamic performance of a water-electrolysis process for producing hydrogen, based on current-technology equipment, is investigated. Both energy and exergy analyses are used. Three cases are considered, in which the principal driving energy inputs are (i) electricity, (ii) the high-temperature heat used to generate the electricity, and (iii) the heat source used to produce the high-temperature heat. The nature of the heat source (e.g. fossil fuel, nuclear fuel, solar energy, etc.) is left as general as possible. The analyses indicate that, when the main driving input is the hypothetical heat source, the principal thermodynamic losses are associated with water splitting, electricity generation and heat production; the losses are mainly due to the irreversibilities associated with converting a heat source to heat, and heat transfer across large temperature differences. The losses associated with the waste heat in used cooling water, because of its low quality, are not as significant as energy analysis indicates. (Author)
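
    For the electricity-driven case (i), the energy and exergy efficiencies reduce to ratios of the hydrogen's heating value and chemical exergy to the electrical input. The sketch below uses commonly quoted approximate property values and an assumed specific electricity consumption, so the numbers are purely illustrative and are not those of the paper.

        # Sketch: energy and exergy efficiency of electricity-driven electrolysis.
        # Property values are common approximations; the consumption is an assumption.
        HHV_H2 = 141.8            # MJ/kg, higher heating value of hydrogen (approx.)
        EX_CH_H2 = 117.1          # MJ/kg, standard chemical exergy of hydrogen (approx.)
        ELEC_PER_KG = 53.0 * 3.6  # MJ/kg, assuming ~53 kWh of electricity per kg of H2

        eta_energy = HHV_H2 / ELEC_PER_KG
        eta_exergy = EX_CH_H2 / ELEC_PER_KG   # electricity input counts fully as exergy
        print(f"energy efficiency ~ {eta_energy:.2f}")
        print(f"exergy efficiency ~ {eta_exergy:.2f}")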

  12. Phylogenomic analyses data of the avian phylogenomics project

    DEFF Research Database (Denmark)

    Jarvis, Erich D; Mirarab, Siavash; Aberer, Andre J

    2015-01-01

    BACKGROUND: Determining the evolutionary relationships among the major lineages of extant birds has been one of the biggest challenges in systematic biology. To address this challenge, we assembled or collected the genomes of 48 avian species spanning most orders of birds, including all Neognathae … and two of the five Palaeognathae orders. We used these genomes to construct a genome-scale avian phylogenetic tree and perform comparative genomic analyses. FINDINGS: Here we present the datasets associated with the phylogenomic analyses, which include sequence alignment files consisting of nucleotides … ML algorithm or when using statistical binning with the coalescence-based MP-EST algorithm (which we refer to as MP-EST*). Other data sets, such as the coding sequence of some exons, revealed other properties of genome evolution, namely convergence. CONCLUSIONS: The Avian Phylogenomics Project is the largest...

  13. Process for carrying out analyses based on concurrent reactions

    Energy Technology Data Exchange (ETDEWEB)

    Glover, J S; Shepherd, B P

    1980-01-03

    The invention refers to a process for carrying out analyses based on concurrent reactions. A portion of the compound to be analysed, together with a standard quantity of the same compound in labelled form, is subjected to a common reaction with a standard quantity of a reagent, which must be less than the sum of the two portions of the reacting compound. The portions of the labelled reaction compound and the labelled final compound resulting from the concurrence are separated in a tube (e.g. by centrifuging) after a forced phase change (precipitation, absorption, etc.), and the radioactivity of the two phases in contact is measured separately. The shielded measuring device developed for this purpose, suitable for centrifuge tubes of known dimensions, is also included in the patent claims. As an example of the applications of the method (radioimmunoassay), the insulin concentration of a defined serum is measured.

  14. Numerical analyses of an aircraft crash on containment building

    International Nuclear Information System (INIS)

    Sim, Jae Min; Kim, Seung Hyun; Chang, Yoon Suk

    2016-01-01

    The containment building is responsible for isolating and protecting internal equipment against external conditions such as earthquakes, hurricanes and impact loading. It also has to prevent leakage of radioactivity when severe accidents, such as a LOCA (Loss Of Coolant Accident), occur. Meanwhile, awareness of threats such as terrorism has increased globally after the international aircraft crashes at the World Trade Center and the Pentagon. In this paper, FE (Finite Element) analyses for different crash locations and speeds were carried out to examine the impact of an aircraft crash on a domestic containment building. In this paper, numerical analyses of an aircraft crash on an NPP's containment building were performed taking into account different locations and aircraft speeds. (1) The amount of concrete failure depended on the crash location, and the connector was the most vulnerable location compared to the dome and wall parts. (2) Maximum stress values generated at the liner plate and rebars did not exceed their UTS values

  15. Analysing Trust Transitivity and The Effects of Unknown Dependence

    Directory of Open Access Journals (Sweden)

    Touhid Bhuiyan

    2010-03-01

    Trust can be used to improve online automated recommendation within a given domain. Trust transitivity is used to make it successful. But trust transitivity has different interpretations. Trust and trust transitivity are both human mental phenomena, and for this reason there is no such thing as objective transitivity. Trust transitivity and trust fusion are both important elements in computational trust. This paper analyses the parameter dependence problem in trust transitivity and proposes some definitions considering the effects of the base rate. In addition, it also proposes belief functions based on subjective logic to analyse trust transitivity for three specified cases with sensitive and insensitive base rates. Then it presents a quantitative analysis of the effects of the unknown dependence problem in an interconnected network environment such as the Internet.
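
    One common way to make trust transitivity concrete in subjective logic is the discounting operator, in which A's opinion about B scales B's opinion about a target x. The sketch below implements the widely used uncertainty-favouring form of that operator as an illustration; it is not the specific belief functions proposed in the paper.

        # Sketch: trust transitivity via the uncertainty-favouring discounting
        # operator of subjective logic. An opinion is (b, d, u, a) with b + d + u = 1.
        from dataclasses import dataclass

        @dataclass
        class Opinion:
            b: float   # belief
            d: float   # disbelief
            u: float   # uncertainty
            a: float   # base rate

            def expectation(self) -> float:
                return self.b + self.a * self.u

        def discount(trust_ab: Opinion, opinion_bx: Opinion) -> Opinion:
            """A's derived opinion about x, obtained through B."""
            return Opinion(
                b=trust_ab.b * opinion_bx.b,
                d=trust_ab.b * opinion_bx.d,
                u=trust_ab.d + trust_ab.u + trust_ab.b * opinion_bx.u,
                a=opinion_bx.a,
            )

        trust_a_in_b = Opinion(b=0.7, d=0.1, u=0.2, a=0.5)
        b_about_x = Opinion(b=0.8, d=0.1, u=0.1, a=0.5)
        derived = discount(trust_a_in_b, b_about_x)
        print(derived, "expectation:", round(derived.expectation(), 3))

    Note how the derived uncertainty grows with A's disbelief and uncertainty in B, which is where base-rate and dependence effects of the kind the paper analyses enter the final expectation value.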

  16. Report of analyses for light hydrocarbons in ground water

    International Nuclear Information System (INIS)

    Dromgoole, E.L.

    1982-04-01

    This report contains on microfiche the results of analyses for methane, ethane, propane, and butane in 11,659 ground water samples collected in 47 western and three eastern 1° x 2° quadrangles of the National Topographic Map Series (Figures 1 and 2), along with a brief description of the analytical technique used and some simple, descriptive statistics. The ground water samples were collected as part of the National Uranium Resource Evaluation (NURE) hydrogeochemical and stream sediment reconnaissance. Further information on the ground water samples can be obtained by consulting the NURE data reports for the individual quadrangles. This information includes (1) measurements characterizing the water samples (pH, conductivity, and alkalinity), (2) physical measurements, where applicable (water temperature, well description, and other measurements), and (3) elemental analyses

  17. ATWS analyses for Krsko Full Scope Simulator verification

    Energy Technology Data Exchange (ETDEWEB)

    Cerne, G; Tiselj, I; Parzer, I [Reactor Engineering Div., Inst. Jozef Stefan, Ljubljana (Slovenia)

    2000-07-01

    The purpose of this analysis was to simulate an Anticipated Transient Without Scram (ATWS) for the Krsko NPP. The results of these calculations were used for verification of the reactor coolant system thermal-hydraulic response predicted by the Krsko Full Scope Simulator. For the thermal-hydraulic analyses, the RELAP5/MOD2 code and the input card deck for NPP Krsko were used. The ATWS analyses were performed to assess the influence and benefit of the ATWS Mitigation System Actuation Circuitry (AMSAC). In the presented paper the most severe ATWS scenarios have been analyzed, starting with the loss of Main Feedwater at both steam generators, which led to a gradual loss of the secondary heat sink. On top of that, the control rods were assumed not to scram, leaving the chain reaction to be controlled only by the inherent physical properties of the fuel and moderator and possible actions of the BOP system. The primary system response has been studied with respect to AMSAC availability. (author)

  18. The moral economy of austerity: analysing UK welfare reform.

    Science.gov (United States)

    Morris, Lydia

    2016-03-01

    This paper notes the contemporary emergence of 'morality' in both sociological argument and political rhetoric, and analyses its significance in relation to ongoing UK welfare reforms. It revisits the idea of 'moral economy' and identifies two strands in its contemporary application: that all economies depend on an internal moral schema, and that some external moral evaluation is desirable. UK welfare reform is analysed as an example of the former, with reference to three distinct orientations advanced in the work of Freeden (1996), Laclau (2014), and Lockwood (1996). In this light, the paper then considers challenges to the reform agenda, drawn from third sector and other public sources. It outlines the forms of argument present in these challenges, based respectively on rationality, legality, and morality, which together provide a basis for evaluation of the welfare reforms and for an alternative 'moral economy'. © London School of Economics and Political Science 2016.

  19. Å speide etter spiritualitet. En analyse av spiritualitetsbegrepet i speiderbevegelsen

    OpenAIRE

    Holmefjord, Aina

    2015-01-01

    This master's thesis contains analyses of the Scout movement's use of the concept of "spirituality" in two books written by the movement's founder, "Scouting for Boys" and "Rovering to Success", and in two documents of The World Organization of the Scout Movement. Robert Baden-Powell founded the Scout movement in 1908, and his literature and books published in the early 1900s set the framework for much of the ideology and vision of today's Scout movement. The Scout movement has a ...

  20. GIS baseret analyse af landskabsændringer

    DEFF Research Database (Denmark)

    Kristensen, Søren Bech Pilgaard

    2009-01-01

    ... of topographic maps in a GIS analysis, it is possible to identify the areas that have been stable for more than 100 years and which therefore potentially hold great natural value. Over the past 150 years, major changes have taken place in the Danish landscape. Many extensive land-use types (meadows, dry grasslands, heaths, etc.) have ...

  1. Prenominal and postnominal reduced relative clauses: arguments against unitary analyses

    Directory of Open Access Journals (Sweden)

    Petra Sleeman

    2007-01-01

    In recent years, several analyses have been proposed in which prenominal and postnominal reduced relatives are merged in the same position. Kayne (1994) claims that both types of reduced relative clauses are the complement of the determiner. More recently, Cinque (2005) has proposed that both types are merged in the functional projections of the noun, at the left edge of the modifier system. In this paper, I argue against a unitary analysis of prenominal and postnominal participial reduced relatives.

  2. A Web-based Tool Combining Different Type Analyses

    DEFF Research Database (Denmark)

    Henriksen, Kim Steen; Gallagher, John Patrick

    2006-01-01

    of both, and they can be goal-dependent or goal-independent. We describe a prototype tool that can be accessed from a web browser, allowing various type analyses to be run. The first goal of the tool is to allow the analysis results to be examined conveniently by clicking on points in the original program...... the minimal "domain model" of the program with respect to the corresponding pre-interpretation, which can give more precise information than the original descriptive type....

  3. Analysing the Effectiveness of the Personality Symbols/Icons

    OpenAIRE

    Halim, İpek

    2012-01-01

    A personality symbol can cover all the identifications of a brand. It can be the face or the soul of the company, and its effect on the brand image is huge. The research focuses on assessing the roles and effectiveness of personality symbols. It aims to offer suggestions for developing successful personality symbols and lists the advantages and disadvantages of different types of personality symbols. It includes detailed copy testing. Apart from conducting focus groups to analyse how the targ...

  4. Analysing public relations education through international standards: The Portuguese case

    OpenAIRE

    Gonçalves, Gisela Marques Pereira; Spínola, Susana de Carvalho; Padamo, Celma

    2013-01-01

    Using international reports on PR education as a benchmark, we analyse the status of PR higher education in Portugal. Despite differences among the study programs, the findings reveal that the standard five-course recommendation of the Commission on Public Relations Education (CPRE) is part of the Portuguese undergraduate curriculum. This includes 12 of the 14 content field guidelines needed to achieve the ideal master's program. The data show, however, the difficulty of positioning public rel...

  5. Authenticiteit en contracteren omtrent kunst : een rechtsvergelijkende analyse.

    OpenAIRE

    Demarsin, Bert

    2008-01-01

    LEGAL-TECHNICAL NOTE. This dissertation is structured in three parts. Part I offers an interdisciplinary study of the concept of authenticity, which is analysed not only from a legal but also from an art-historical and philosophical point of view. Parts II and III then build on the conceptual framework developed there. Part II focuses on the problem of authenticity in the direct relationship between the buyer and the seller of an art object. The analysis ...

  6. Numerical analyses for efficient photoionization by nonmonochromatic fields

    International Nuclear Information System (INIS)

    Hasegawa, Shuichi; Suzuki, Atsuyuki

    2000-01-01

    Numerical analyses on excitation and ionization probabilities of atoms with hyperfine structures were performed in order to compare two different excitation methods, adiabatic excitation and broadband excitation. The lifetime of the intermediate states was considered in order to investigate the effect of the absorption line broadening. The dependences of the two excitation methods on the lifetime were found to be quite different. The ionization probability by the adiabatic excitation is higher than that by the broadband excitation for identical excitation laser intensity. (author)

  7. Quality assurance requirements for the computer software and safety analyses

    International Nuclear Information System (INIS)

    Husarecek, J.

    1992-01-01

    The requirements placed on the development, procurement, maintenance, and application of software for the creation or processing of data during the design, construction, operation, repair, maintenance and safety-related upgrading of nuclear power plants are presented. The verification and validation processes are highlighted, and the requirements placed on the software documentation are outlined. The general quality assurance principles applied to safety analyses are characterized. (J.B.). 1 ref

  8. Progressing From Initially Ambiguous Functional Analyses: Three Case Examples

    OpenAIRE

    Tiger, Jeffrey H.; Fisher, Wayne W.; Toussaint, Karen A.; Kodak, Tiffany

    2009-01-01

    Most often functional analyses are initiated using a standard set of test conditions, similar to those described by Iwata, Dorsey, Slifer, Bauman, and Richman (1982/1994). These test conditions involve the careful manipulation of motivating operations, discriminative stimuli, and reinforcement contingencies to determine the events related to the occurrence and maintenance of problem behavior. Some individuals display problem behavior that is occasioned and reinforced by idiosyncratic or other...

  9. The availability of toxicological analyses for poisoned patients in Ireland.

    LENUS (Irish Health Repository)

    Cassidy, Nicola

    2010-05-01

    The National Poisons Information Service and the Association of Clinical Biochemists in the United Kingdom published guidelines on laboratory analyses for poisoned patients in 2002. In 2003, U.S. guidelines were prepared by an expert panel of analytical toxicologists and emergency department (ED) physicians. Some professional associations in different countries quote these guidelines but there are no data to support adherence to these recommendations in the medical literature.

  10. Analyse situationnelle de la lutte antitabac au Kenya | CRDI - Centre ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Kenya is one of 12 sub-Saharan African countries participating in the Situational Analyses of Tobacco Control in Africa (ASTA) initiative, funded by IDRC and the Bill and Melinda Gates Foundation. The objective of this initiative is to foster an understanding of the opportunities and obstacles related to tobacco control and ...

  11. Analyses situationnelles sur le tabagisme en Afrique | CRDI - Centre ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Through this grant, IDRC's Research for International Tobacco Control (RITC) program will partner with the Bill and Melinda Gates Foundation to understand the determining factors of successful tobacco control in Africa. The initiative will consist of carrying out situational analyses aimed at assessing ...

  12. Analyses of hypothetical FCI's in a fast reactor

    International Nuclear Information System (INIS)

    Padilla, A. Jr.; Martin, F.J.; Niccoli, L.G.

    1981-01-01

    Parametric analyses using the SIMMER code were performed to evaluate the potential for a severe recriticality from a pressure-driven recompaction caused by an energetic FCI during the transition phase of a hypothetical accident in a fast reactor. For realistic and reasonable estimates for the assumed accident conditions, a severe recriticality was not predicted. The conditions under which a severe recriticality would be obtained or averted were identified. 10 figures, 2 tables

  13. Security and Privacy Analyses of Internet of Things Toys

    OpenAIRE

    Chu, Gordon; Apthorpe, Noah; Feamster, Nick

    2018-01-01

    This paper investigates the security and privacy of Internet-connected children's smart toys through case studies of three commercially-available products. We conduct network and application vulnerability analyses of each toy using static and dynamic analysis techniques, including application binary decompilation and network monitoring. We discover several publicly undisclosed vulnerabilities that violate the Children's Online Privacy Protection Rule (COPPA) as well as the toys' individual pr...

  14. Heavy water standards. Qualitative analyses, sample treating, stocking and manipulation

    International Nuclear Information System (INIS)

    Pavelescu, M.; Steflea, D.; Mihancea, I.; Varlam, M.; Irimescu, R.

    1995-01-01

    This paper presents methods and procedures for measuring heavy water concentration, as well as for sampling, stocking and handling of the samples to be analysed. The main concentration analysis methods are: mass spectrometry, for concentrations less than 1%; densitometry, for concentrations within the range 1%-99%; and infrared spectrometry, for concentrations above 99%. Procedures of sampling, processing and purification appropriate to these measuring methods were established. 1 Tab

  15. Ion Chromatographic Analyses of Sea Waters, Brines and Related Samples

    OpenAIRE

    Nataša Gros

    2013-01-01

    This review focuses on the ion chromatographic methods for the analyses of natural waters with high ionic strength. At the beginning a natural diversity in ionic composition of waters is highlighted and terminology clarified. In continuation a brief overview of other review articles of potential interest is given. A review of ion chromatographic methods is organized in four sections. The first section comprises articles focused on the determination of ionic composition of water samples as com...

  16. Analyse de la croissance de Gymnogongrus patens Agardh de la ...

    African Journals Online (AJOL)

    Rhodophyta, Phyllophoraceae) was analysed on algal samples collected monthly over an annual cycle, from April 2002 to March 2003, on the beach of Méhdia (northwest Atlantic coast of Morocco). The analysis of the parameters of ...

  17. Novel Space Exploration Technique for Analysing Planetary Atmospheres

    OpenAIRE

    Dekoulis, George

    2010-01-01

    The chapter presents a new reconfigurable wide-beam radio interferometer system for analysing planetary atmospheres. The system operates at frequencies where the ionisation of the planetary plasma regions induces strong attenuation. For Earth, the attenuation is indistinguishable from the CMB at frequencies over 50 MHz. The system introduces a set of advanced specifications to this field of science, previously unseen in similar suborbital experiments. The reprogrammable dynamic range of the ...

  18. PALSfit3: A software package for analysing positron lifetime spectra

    DEFF Research Database (Denmark)

    Kirkegaard, Peter; Olsen, Jens V.; Eldrup, Morten Mostgaard

    The present report describes a Windows-based computer program called PALSfit3. The purpose of the program is to carry out analyses of spectra that have been measured by positron annihilation lifetime spectroscopy (PALS). PALSfit3 is based on the well-tested PATFIT and PALSfit programs, which hav...... in a text window. PALSfit3 is verified on Windows XP and Windows 7, 8 and 10. The PALSfit3 software can be acquired from the Technical University of Denmark (http://PALSfit.dk)...

  19. Preparation of Kepler light curves for asteroseismic analyses

    DEFF Research Database (Denmark)

    García, R.A.; Hekker, Saskia; Stello, Dennis

    2011-01-01

    The Kepler mission is providing photometric data of exquisite quality for the asteroseismic study of different classes of pulsating stars. These analyses place particular demands on the pre-processing of the data, over a range of time-scales from minutes to months. Here, we describe processing...... procedures developed by the Kepler Asteroseismic Science Consortium to prepare light curves that are optimized for the asteroseismic study of solar-like oscillating stars in which outliers, jumps and drifts are corrected....

  20. Use of probabilistic safety analyses in severe accident management

    International Nuclear Information System (INIS)

    Neogy, P.; Lehner, J.

    1991-01-01

    An important consideration in the development and assessment of severe accident management strategies is that while the strategies are often built on the knowledge base of Probabilistic Safety Analyses (PSA), they must be interpretable and meaningful in terms of the control room indicators. In the following, the relationships between PSA and severe accident management are explored using ex-vessel accident management at a PWR ice-condenser plant as an example. 2 refs., 1 fig., 3 tabs

  1. APPLYING SPECTROSCOPIC METHODS ON ANALYSES OF HAZARDOUS WASTE

    OpenAIRE

    Dobrinić, Julijan; Kunić, Marija; Ciganj, Zlatko

    2000-01-01

    Abstract The paper presents results of measuring the content of heavy and other metals in waste samples from the hazardous waste disposal site of Sovjak near Rijeka. The preliminary design elaboration and the choice of the waste disposal sanification technology were preceded by the sampling and physico-chemical analyses of disposed waste, enabling its categorization. The following spectroscopic methods were applied on metal content analysis: Atomic absorption spectroscopy (AAS) and plas...

  2. Development of ITER 3D neutronics model and nuclear analyses

    International Nuclear Information System (INIS)

    Zeng, Q.; Zheng, S.; Lu, L.; Li, Y.; Ding, A.; Hu, H.; Wu, Y.

    2007-01-01

    ITER nuclear analyses rely on calculations with three-dimensional (3D) Monte Carlo codes, e.g. the widely-used MCNP. However, continuous changes in the design of the components require that the 3D neutronics model used for nuclear analyses be updated. Nevertheless, modeling a complex geometry with MCNP by hand is a very time-consuming task. An efficient way forward is to develop CAD-based interface codes for automatic conversion from CAD models to MCNP input files. Based on the latest CAD model and the available interface codes, two approaches to updating the 3D neutronics model have been discussed by the ITER IT (International Team): the first is to start with the existing MCNP model 'Brand' and update it through a combination of direct modification of the MCNP input file and generation of models for some components directly from the CAD data; the second is to start from the full CAD model, make the necessary simplifications, and generate the MCNP model with one of the interface codes. MCAM, an advanced CAD-based MCNP interface code developed by the FDS Team in China, has been successfully applied to update the ITER 3D neutronics model following the above two approaches. The Brand model has been updated by generating portions of the geometry from the newest CAD model with MCAM. MCAM has also successfully performed the conversion to an MCNP neutronics model from a full ITER CAD model which was simplified and issued by ITER IT to benchmark the above interface codes. Based on the two updated 3D neutronics models, the related nuclear analyses are performed. This paper presents the status of ITER 3D modeling using MCAM and its nuclear analyses, as well as a brief introduction to an advanced version of MCAM. (authors)

  3. Model-based Recursive Partitioning for Subgroup Analyses

    OpenAIRE

    Seibold, Heidi; Zeileis, Achim; Hothorn, Torsten

    2016-01-01

    The identification of patient subgroups with differential treatment effects is the first step towards individualised treatments. A current draft guideline by the EMA discusses potentials and problems in subgroup analyses and formulated challenges to the development of appropriate statistical procedures for the data-driven identification of patient subgroups. We introduce model-based recursive partitioning as a procedure for the automated detection of patient subgroups that are identifiable by...

  4. Analyses of karyotypes and comparative physical locations of the ...

    African Journals Online (AJOL)

    The frequencies of signal detection of the marker, RG556 and the BAC clone, 44B4, were 8.0 and 41.3% in O. sativa, while 9.0 and 42.3% in O. officinalis, respectively. Based on a comparative RFLP map of a wild rice, O. officinalis and O. sativa, comparative analyses of karyotypes of O. officinalis were demonstrated firstly ...

  5. Analysing passenger arrivals rates and waiting time at bus stops

    OpenAIRE

    Kaparias, I.; Rossetti, C.; Trozzi, V.

    2015-01-01

    The present study investigates the rather under-explored topic of passenger waiting times at public transport facilities. Using data collected from part of London’s bus network by means of physical counts, measurements and observations, and complemented by on-site passenger interviews, the waiting behaviour is analysed for a number of bus stops served by different numbers of lines. The analysis employs a wide range of statistical methods and tools, and concentrates on three aspects: passenger...

  6. Homeopathy: meta-analyses of pooled clinical data.

    Science.gov (United States)

    Hahn, Robert G

    2013-01-01

    In the first decade of the evidence-based era, which began in the mid-1990s, meta-analyses were used to scrutinize homeopathy for evidence of beneficial effects in medical conditions. In this review, meta-analyses including pooled data from placebo-controlled clinical trials of homeopathy and the aftermath in the form of debate articles were analyzed. In 1997 Klaus Linde and co-workers identified 89 clinical trials that showed an overall odds ratio of 2.45 in favor of homeopathy over placebo. There was a trend toward smaller benefit from studies of the highest quality, but the 10 trials with the highest Jadad score still showed homeopathy had a statistically significant effect. These results challenged academics to perform alternative analyses that, to demonstrate the lack of effect, relied on extensive exclusion of studies, often to the degree that conclusions were based on only 5-10% of the material, or on virtual data. The ultimate argument against homeopathy is the 'funnel plot' published by Aijing Shang's research group in 2005. However, the funnel plot is flawed when applied to a mixture of diseases, because studies with expected strong treatment effects are, for ethical reasons, powered lower than studies with expected weak or unclear treatment effects. To conclude that homeopathy lacks clinical effect, more than 90% of the available clinical trials had to be disregarded. Alternatively, flawed statistical methods had to be applied. Future meta-analyses should focus on the use of homeopathy in specific diseases or groups of diseases instead of pooling data from all clinical trials. © 2013 S. Karger GmbH, Freiburg.

  7. Finite element analyses for seismic shear wall international standard problem

    International Nuclear Information System (INIS)

    Park, Y.J.; Hofmayer, C.H.

    1998-04-01

    Two identical reinforced concrete (RC) shear walls, which consist of web, flanges and massive top and bottom slabs, were tested up to ultimate failure under earthquake motions at the Nuclear Power Engineering Corporation's (NUPEC) Tadotsu Engineering Laboratory, Japan. NUPEC provided the dynamic test results to the OECD (Organization for Economic Cooperation and Development), Nuclear Energy Agency (NEA) for use as an International Standard Problem (ISP). The shear walls were intended to be part of a typical reactor building. One of the major objectives of the Seismic Shear Wall ISP (SSWISP) was to evaluate various seismic analysis methods for concrete structures used for design and seismic margin assessment. It also offered a unique opportunity to assess the state-of-the-art in nonlinear dynamic analysis of reinforced concrete shear wall structures under severe earthquake loadings. As a participant of the SSWISP workshops, Brookhaven National Laboratory (BNL) performed finite element analyses under the sponsorship of the U.S. Nuclear Regulatory Commission (USNRC). Three types of analysis were performed, i.e., monotonic static (push-over), cyclic static and dynamic analyses. Additional monotonic static analyses were performed by two consultants, F. Vecchio of the University of Toronto (UT) and F. Filippou of the University of California at Berkeley (UCB). The analysis results by BNL and the consultants were presented during the second workshop in Yokohama, Japan in 1996. A total of 55 analyses were presented during the workshop by 30 participants from 11 different countries. The major findings on the presented analysis methods, as well as engineering insights regarding the applicability and reliability of the FEM codes are described in detail in this report. 16 refs., 60 figs., 16 tabs

  8. Monte Carlo parameter studies and uncertainty analyses with MCNP5

    International Nuclear Information System (INIS)

    Brown, F. B.; Sweezy, J. E.; Hayes, R.

    2004-01-01

    A software tool called mcnp_pstudy has been developed to automate the setup, execution, and collection of results from a series of MCNP5 Monte Carlo calculations. This tool provides a convenient means of performing parameter studies, total uncertainty analyses, parallel job execution on clusters, stochastic geometry modeling, and other types of calculations where a series of MCNP5 jobs must be performed with varying problem input specifications. (authors)
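
    The kind of automated parameter sweep described here can be illustrated with a small, generic script. The sketch below is not the tool itself: the template placeholder names, file names and the "mcnp5" command line are assumptions made only for illustration.

        # Generic parameter-study driver (illustrative only, not the actual tool):
        # generate one input deck per parameter combination from a template and run it.
        import itertools
        import pathlib
        import string
        import subprocess

        template = string.Template(pathlib.Path("base_input.tmpl").read_text())
        radii = [1.0, 1.5, 2.0]          # hypothetical sweep: radius in cm
        densities = [18.0, 18.5, 19.0]   # hypothetical sweep: density in g/cm3

        for i, (r, rho) in enumerate(itertools.product(radii, densities)):
            case = pathlib.Path(f"case_{i:03d}")
            case.mkdir(exist_ok=True)
            (case / "inp").write_text(template.substitute(RADIUS=r, DENSITY=rho))
            # Each case directory could instead be submitted to a cluster queue.
            subprocess.run(["mcnp5", "i=inp", f"o=out_{i:03d}"], cwd=case, check=False)

    Collecting results would then amount to parsing each case's output file and tabulating the tallies of interest against the swept parameters.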

  9. Freefem++ in THM analyses of KBS-3 deposition hole

    International Nuclear Information System (INIS)

    Lempinen, A.

    2006-12-01

    The applicability of Freefem++ as software for thermo-hydro-mechanical analysis of the KBS-3V deposition hole was evaluated. Freefem++ is software for multiphysical simulations with the finite element method. A set of previously performed analyses was successfully repeated with Freefem++. The only significant problem was to impose unique values for variables at the canister surface. This problem can be circumvented with an iterative method, and it can possibly be solved later, since Freefem++ is open-source software. (orig.)

  10. [Application of big data analyses for musculoskeletal cell differentiation].

    Science.gov (United States)

    Imai, Yuuki

    2016-04-01

    Next-generation sequencing has strongly advanced big data analyses in the life sciences. Among the various kinds of sequencing data sets, epigenetic platforms have become a key to clarifying questions on broad and detailed phenomena in various forms of life. This report introduces research on the identification of novel transcription factors in osteoclastogenesis using DNase-seq. Big data in musculoskeletal research will be organized by the IFMRS and is becoming increasingly crucial.

  11. ANALYSING THE PRINCIPAL ELEMENTS WHICH INFLUENCE THE BUSINESS ETHICS

    Directory of Open Access Journals (Sweden)

    Laurenţia Georgeta AVRAM

    2010-03-01

    Full Text Available As the influence of the private sector on economic and financial life increases, interest in business ethics grows constantly. It is not enough for organizations to offer new, higher-quality, more accessible and safer products to their customers, or to offer better conditions to their employees; they must also analyse the elements which influence ethics in business, eradicating poverty, sustaining a healthy system and protecting the environment.

  12. A data management program for the Electra 800 automatic analyser.

    Science.gov (United States)

    Cambus, J P; Nguyen, F; de Graeve, J; Aragon, B; Valdiguie, P

    1994-10-01

    The Electra 800 automatic coagulation analyser rapidly performs most chronometric coagulation tests with high precision. To facilitate data handling, software, adaptable to any PC running under MS-DOS, was written to manage the analyser. Data are automatically collected via the RS232 interface or can be manually input. The software can handle 64 different analyses, all entirely 'user defined'. An 'electronic worksheet' presents the results in pages of ten patients. This enables the operator to assess the data and to perform verifications or complementary tests if necessary. All results outside a predetermined range can be flagged and results can be deleted, modified or added. A patient's previous files can be recalled as the data are archived at the end of the day. A 120 Mb disk can store approximately 130,000 patient files. A daily archive function can print the day's work in alphabetical order. A communication protocol allows connection to a mainframe computer. This program and the user's manual are available on request, free of charge, from the authors.
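
    As an illustration of the kind of automatic data collection mentioned above, the sketch below reads result records over a serial link. It is only a hypothetical stand-in: the port name, baud rate, and semicolon-separated record format are assumptions, and the real Electra 800 protocol and the original MS-DOS software are not reproduced here.

        # Hypothetical RS232 result collection (pyserial); not the Electra 800 protocol.
        import serial  # pip install pyserial

        results = []
        with serial.Serial("COM1", baudrate=9600, timeout=5) as link:
            while True:
                line = link.readline().decode("ascii", errors="replace").strip()
                if not line:        # timeout with no data: stop polling
                    break
                # Assumed record layout: "sample_id;test_name;clotting_time_seconds"
                sample_id, test_name, value = line.split(";")
                results.append({"sample": sample_id, "test": test_name,
                                "seconds": float(value)})
        # 'results' could then be appended to the day's archive for later retrieval.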

  13. Automated monosegmented flow analyser. Determination of glucose, creatinine and urea.

    Science.gov (United States)

    Raimundo Júnior, I M; Pasquini, C

    1997-10-01

    An automated monosegmented flow analyser containing a sampling valve and a reagent addition module and employing a laboratory-made photodiode array spectrophotometer as detection system is described. The instrument was controlled by a 386SX IBM compatible microcomputer through an IC8255 parallel port that communicates with the interface which controls the sampling valve and reagent addition module. The spectrophotometer was controlled by the same microcomputer through an RS232 serial standard interface. The software for the instrument was written in QuickBasic 4.5. Opto-switches were employed to detect the air bubbles limiting the monosegment, allowing precise sample localisation for reagent addition and signal reading. The main characteristics of the analyser are low reagent consumption and high sensitivity which is independent of the sample volume. The instrument was designed to determine glucose, creatinine or urea in blood plasma and serum without hardware modification. The results were compared against those obtained by the Clinical Hospital of UNICAMP using commercial analysers. Correlation coefficients among the methods were 0.997, 0.982 and 0.996 for glucose, creatinine and urea, respectively.

  14. Linking material and energy flow analyses and social theory

    Energy Technology Data Exchange (ETDEWEB)

    Schiller, Frank [The Open University, Faculty of Maths, Computing and Technology, Walton Hall, Milton Keynes, MK7 6AA (United Kingdom)

    2009-04-15

    The paper explores the potential of Habermas' theory of communicative action to alter the social reflexivity of material and energy flow analysis. With his social macro theory Habermas has provided an alternative, critical justification for social theory that can be distinguished from economic libertarianism and from political liberalism. Implicitly, most flow approaches draw from these theoretical traditions rather than from discourse theory. There are several types of material and energy flow analyses. While these concepts basically share a system theoretical view, they lack a specific interdisciplinary perspective that ties the fundamental insight of flows to disciplinary scientific development. Instead of simply expanding micro-models to the social macro-dimension social theory suggests infusing the very notion of flows to the progress of disciplines. With regard to the functional integration of society, material and energy flow analyses can rely on the paradigm of ecological economics and at the same time progress the debate between strong and weak sustainability within the paradigm. However, placing economics at the centre of their functional analyses may still ignore the broader social integration of society, depending on their pre-analytic outline of research and the methods used. (author)

  15. Graphic-based musculoskeletal model for biomechanical analyses and animation.

    Science.gov (United States)

    Chao, Edmund Y S

    2003-04-01

    The ability to combine physiology and engineering analyses with computer sciences has opened the door to the possibility of creating the 'Virtual Human' reality. This paper presents a broad foundation for a full-featured biomechanical simulator for the human musculoskeletal system physiology. This simulation technology unites the expertise in biomechanical analysis and graphic modeling to investigate joint and connective tissue mechanics at the structural level and to visualize the results in both static and animated forms together with the model. Adaptable anatomical models including prosthetic implants and fracture fixation devices and a robust computational infrastructure for static, kinematic, kinetic, and stress analyses under varying boundary and loading conditions are incorporated on a common platform, the VIMS (Virtual Interactive Musculoskeletal System). Within this software system, a manageable database containing long bone dimensions, connective tissue material properties and a library of skeletal joint system functional activities and loading conditions are also available and they can easily be modified, updated and expanded. Application software is also available to allow end-users to perform biomechanical analyses interactively. This paper details the design, capabilities, and features of the VIMS development at Johns Hopkins University, an effort possible only through academic and commercial collaborations. Examples using these models and the computational algorithms in a virtual laboratory environment are used to demonstrate the utility of this unique database and simulation technology. This integrated system will impact on medical education, basic research, device development and application, and clinical patient care related to musculoskeletal diseases, trauma, and rehabilitation.

  16. Identifying null meta-analyses that are ripe for updating

    Directory of Open Access Journals (Sweden)

    Fang Manchun

    2003-07-01

    Full Text Available Abstract. Background: As an increasingly large number of meta-analyses are published, quantitative methods are needed to help clinicians and systematic review teams determine when meta-analyses are not up to date. Methods: We propose new methods for determining when non-significant meta-analytic results might be overturned, based on a prediction of the number of participants required in new studies. To guide decision making, we introduce the "new participant ratio", the ratio of the actual number of participants in new studies to the predicted number required to obtain statistical significance. A simulation study was conducted to study the performance of our methods and a real meta-analysis provides further evidence. Results: In our three simulation configurations, our diagnostic test for determining whether a meta-analysis is out of date had sensitivity of 55%, 62%, and 49% with corresponding specificity of 85%, 80%, and 90% respectively. Conclusions: Simulations suggest that our methods are able to detect out-of-date meta-analyses. These quick and approximate methods show promise for use by systematic review teams to help decide whether to commit the considerable resources required to update a meta-analysis. Further investigation and evaluation of the methods is required before they can be recommended for general use.
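
    A rough feel for the "new participant ratio" can be given with a toy calculation. The sketch below is only an approximation under an assumed 1/sqrt(N) scaling of the pooled standard error; it is not the authors' actual prediction method, and the numbers are invented.

        # Toy approximation of the new participant ratio (not the authors' method).
        def new_participant_ratio(z_current, n_current, n_new_actual, z_target=1.96):
            """Participants actually added / participants predicted to be needed,
            assuming the pooled standard error shrinks like 1/sqrt(total N)."""
            if abs(z_current) >= z_target:
                return float("inf")   # already significant: nothing more is needed
            n_total_required = n_current * (z_target / z_current) ** 2
            return n_new_actual / (n_total_required - n_current)

        # Example: a null meta-analysis with z = 1.10 from 2,000 participants, to which
        # new trials have since added 1,500 participants.
        print(round(new_participant_ratio(1.10, 2000, 1500), 2))   # ~0.34

    A ratio well below 1 suggests the new studies are unlikely to overturn the null result on their own; a ratio near or above 1 flags the meta-analysis as a candidate for updating.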

  17. Proteomic analyses of host and pathogen responses during bovine mastitis.

    Science.gov (United States)

    Boehmer, Jamie L

    2011-12-01

    The pursuit of biomarkers for use as clinical screening tools, measures for early detection, disease monitoring, and as a means for assessing therapeutic responses has steadily evolved in human and veterinary medicine over the past two decades. Concurrently, advances in mass spectrometry have markedly expanded proteomic capabilities for biomarker discovery. While initial mass spectrometric biomarker discovery endeavors focused primarily on the detection of modulated proteins in human tissues and fluids, recent efforts have shifted to include proteomic analyses of biological samples from food animal species. Mastitis continues to garner attention in veterinary research due mainly to affiliated financial losses and food safety concerns over antimicrobial use, but also because there are only a limited number of efficacious mastitis treatment options. Accordingly, comparative proteomic analyses of bovine milk have emerged in recent years. Efforts to prevent agricultural-related food-borne illness have likewise fueled an interest in the proteomic evaluation of several prominent strains of bacteria, including common mastitis pathogens. The interest in establishing biomarkers of the host and pathogen responses during bovine mastitis stems largely from the need to better characterize mechanisms of the disease, to identify reliable biomarkers for use as measures of early detection and drug efficacy, and to uncover potentially novel targets for the development of alternative therapeutics. The following review focuses primarily on comparative proteomic analyses conducted on healthy versus mastitic bovine milk. However, a comparison of the host defense proteome of human and bovine milk and the proteomic analysis of common veterinary pathogens are likewise introduced.

  18. Nuclear power plant analysers: their approach to analysis and design

    International Nuclear Information System (INIS)

    Ancarani, A.; Zanobetti, D.

    1985-01-01

    ''Analysers'' as used for nuclear power plant simulators are powerful tools whose purpose can vary from aiding the design of power plants to assisting operators in emergency situations. A fundamental problem arising from the analysers' concept and use is the definition of the simulation capability. This can be assessed either by comparison with statistically significant and suitably elaborated previous operational data, or by comparison with theoretical (computed) values obtained from engineering codes. In both cases, to take advantage of all the possibilities offered by the ''analysers'', it is mandatory that suitable terms of reference be clearly stated and agreed upon. Particular care is devoted to accuracy in the prediction of physical values, both for steady-state and transient situations. For instance, such evaluations can be met by specifying the maximum error on the values of parameters (ordinates), except for very fast transients; the maximum error on time (abscissae) for the occurrence of extreme values; the maximum error on the values of extremes (ordinates); and the maximum error on derivatives (slopes) for rapidly varying transients, except near extreme values. The paper also gives a brief account of present projects and proposals in different countries as known from various sources, and mentions a possible co-ordination at international level. (author)

  19. Analysing the Improper Pronunciation of Diphthongs by Iraqi EFL learners

    Directory of Open Access Journals (Sweden)

    Mukhalad Malik Almutalabi

    2018-04-01

    Full Text Available The current study aims at analysing the improper pronunciation of diphthongs by Iraqi EFL learners in words of various syllable structures. It describes and identifies thoroughly the mispronunciations of these important sounds in the English language. The study analyses such mispronunciations by clarifying and categorising the phonetic deviations of Iraqi EFL learners when they pronounce diphthongs. The main objective of the study is therefore to analyse the errors committed by Iraqi learners in the pronunciation of diphthongs, grouping each error into its specific category. To meet the objective of the paper, 25 Iraqi EFL learners from the Department of English at Cihan University/Slemani were chosen as the participants of the study. The test, which was conducted in the laboratory of the Department of English, contained 10 words comprising various diphthongs. The results clearly revealed that Iraqi EFL learners mostly mispronounced English diphthongs by replacing the required diphthong with another, improper one; they also tended to use simple vowels instead of the correct diphthongs.

  20. Effects of undetected data quality issues on climatological analyses

    Directory of Open Access Journals (Sweden)

    S. Hunziker

    2018-01-01

    Full Text Available Systematic data quality issues may occur at various stages of the data generation process. They may affect large fractions of observational datasets and remain largely undetected with standard data quality control. This study investigates the effects of such undetected data quality issues on the results of climatological analyses. For this purpose, we quality controlled daily observations of manned weather stations from the Central Andean area with a standard and an enhanced approach. The climate variables analysed are minimum and maximum temperature and precipitation. About 40 % of the observations are inappropriate for the calculation of monthly temperature means and precipitation sums due to data quality issues. These quality problems undetected with the standard quality control approach strongly affect climatological analyses, since they reduce the correlation coefficients of station pairs, deteriorate the performance of data homogenization methods, increase the spread of individual station trends, and significantly bias regional temperature trends. Our findings indicate that undetected data quality issues are included in important and frequently used observational datasets and hence may affect a high number of climatological studies. It is of utmost importance to apply comprehensive and adequate data quality control approaches on manned weather station records in order to avoid biased results and large uncertainties.
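
    One small, concrete piece of such an enhanced quality control chain is deciding when enough quality-controlled daily values remain to form a monthly mean at all. The sketch below illustrates this with pandas; the file name, the 'flag' column and the 80% completeness threshold are assumptions for illustration, not the study's exact criteria.

        # Monthly means only from sufficiently complete, QC-accepted daily data (sketch).
        import pandas as pd

        daily = pd.read_csv("station_tmax.csv", parse_dates=["date"], index_col="date")
        good = daily.loc[daily["flag"] == "ok", "tmax"]       # keep QC-accepted days only

        stats = good.resample("MS").agg(["mean", "count"])    # one row per month
        days = pd.Series(stats.index.days_in_month, index=stats.index)
        # Discard months where fewer than 80% of the days survived quality control.
        monthly_mean = stats["mean"].where(stats["count"] >= 0.8 * days)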

  1. Meta-analyses of HFE variants in coronary heart disease.

    Science.gov (United States)

    Lian, Jiangfang; Xu, Limin; Huang, Yi; Le, Yanping; Jiang, Danjie; Yang, Xi; Xu, Weifeng; Huang, Xiaoyan; Dong, Changzheng; Ye, Meng; Zhou, Jianqing; Duan, Shiwei

    2013-09-15

    HFE gene variants can cause hereditary hemochromatosis (HH) that often comes along with an increased risk of coronary heart disease (CHD). The goal of our study is to assess the contribution of four HFE gene variants to the risk of CHD. We conducted four meta-analyses of the studies examining the association between four HFE gene variants and the risk of CHD. A systematic search was conducted using MEDLINE, EMBASE, Web of Science and China National Knowledge Infrastructure (CNKI), Wanfang Chinese Periodical. Meta-analyses showed that HFE rs1799945-G allele was associated with a 6% increased risk of CHD (P=0.02, odds ratio (OR)=1.06, 95% confidence interval (CI)=1.01-1.11). However, no association between the other three HFE gene variants (rs1800562, rs1800730, and rs9366637) and CHD risk was observed by the meta-analyses (all P values>0.05). In addition, the results of our case-control study indicated that rs1800562 and rs1800730 were monomorphic, and that rs1799945 and rs9366637 were not associated with CHD in Han Chinese. Our meta-analysis suggested that a significant association existed between rs1799945 mutation and CHD, although this mutation was rare in Han Chinese. Copyright © 2013 Elsevier B.V. All rights reserved.

  2. Accounting for rate variation among lineages in comparative demographic analyses

    Science.gov (United States)

    Hope, Andrew G.; Ho, Simon Y. W.; Malaney, Jason L.; Cook, Joseph A.; Talbot, Sandra L.

    2014-01-01

    Genetic analyses of contemporary populations can be used to estimate the demographic histories of species within an ecological community. Comparison of these demographic histories can shed light on community responses to past climatic events. However, species experience different rates of molecular evolution, and this presents a major obstacle to comparative demographic analyses. We address this problem by using a Bayesian relaxed-clock method to estimate the relative evolutionary rates of 22 small mammal taxa distributed across northwestern North America. We found that estimates of the relative molecular substitution rate for each taxon were consistent across the range of sampling schemes that we compared. Using three different reference rates, we rescaled the relative rates so that they could be used to estimate absolute evolutionary timescales. Accounting for rate variation among taxa led to temporal shifts in our skyline-plot estimates of demographic history, highlighting both uniform and idiosyncratic evolutionary responses to directional climate trends for distinct ecological subsets of the small mammal community. Our approach can be used in evolutionary analyses of populations from multiple species, including comparative demographic studies.
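
    The rescaling step described above can be illustrated with a toy calculation: relative rates from the relaxed-clock analysis are anchored to a reference taxon whose absolute rate is taken from an external calibration. All names and numbers below are invented for illustration.

        # Toy rescaling of relative substitution rates to absolute rates (illustrative).
        relative_rate = {"taxon_A": 1.00, "taxon_B": 1.35, "taxon_C": 0.80}
        reference_taxon = "taxon_A"
        reference_absolute = 0.02      # assumed calibration, substitutions/site/Myr

        scale = reference_absolute / relative_rate[reference_taxon]
        absolute_rate = {t: r * scale for t, r in relative_rate.items()}
        print(absolute_rate)           # taxon_B evolves ~35% faster than the reference

    Because a skyline timescale expressed in substitutions per site converts to years through the rate, ignoring this step would stretch or compress the inferred demographic histories by roughly these rate ratios.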

  3. Multivariate differential analyses of adolescents' experiences of aggression in families

    Directory of Open Access Journals (Sweden)

    Chris Myburgh

    2011-01-01

    Full Text Available Aggression is part of South African society and has implications for the mental health of persons living in South Africa. If parents are aggressive, adolescents are also likely to be aggressive, and that will impact negatively on their mental health. In this article the nature and extent of adolescents' experiences of aggression and aggressive behaviour in the family are investigated. A deductive explorative quantitative approach was followed. Aggression is reasoned to be dependent on aspects such as self-concept, moral reasoning, communication, frustration tolerance and family relationships. To analyse the data from questionnaires of 101 families (95 adolescents, 95 mothers and 91 fathers), Cronbach's Alpha, various consecutive first- and second-order factor analyses, correlations, multiple regression, MANOVA, ANOVA and Scheffé/Dunnett tests were used. It was found that aggression correlated negatively with the independent variables, and the correlations between adolescents and their parents were significant. Regression analyses indicated that different predictors predicted aggression. Furthermore, differences between adolescents and their parents indicated that the experienced levels of aggression between adolescents and their parents were small. Implications for education are given.

  4. Stress analyses of ITER toroidal field coils under fault conditions

    International Nuclear Information System (INIS)

    Jong, C.T.J.

    1990-02-01

    The International Thermonuclear Experimental Reactor (ITER) is intended as an experimental thermonuclear tokamak reactor for testing the basic physics, performance and technologies essential to future fusion reactors. The ITER design will be based on extensive new design work, supported by new physical and technological results, and on the great body of experience built up over several years from previous national and international reactor studies. Conversely, the ITER design process should provide the fusion community with valuable insights into what key areas need further development or clarification as we move forward towards practical fusion power. As part of the design process of the ITER toroidal field coils the mechanical behaviour of the magnetic system under fault conditions has to be analysed in more detail. This paper describes the work carried out to create a detailed finite element model of two toroidal field coils as well as some results of linear elastic analyses with fault conditions. The analyses have been performed with the finite element code ANSYS. (author). 5 refs.; 8 figs.; 2 tabs

  5. Target gene analyses of 39 amelogenesis imperfecta kindreds

    Science.gov (United States)

    Chan, Hui-Chen; Estrella, Ninna M. R. P.; Milkovich, Rachel N.; Kim, Jung-Wook; Simmer, James P.; Hu, Jan C-C.

    2012-01-01

    Previously, mutational analyses identified six disease-causing mutations in 24 amelogenesis imperfecta (AI) kindreds. We have since expanded the number of AI kindreds to 39, and performed mutation analyses covering the coding exons and adjoining intron sequences for the six proven AI candidate genes [amelogenin (AMELX), enamelin (ENAM), family with sequence similarity 83, member H (FAM83H), WD repeat containing domain 72 (WDR72), enamelysin (MMP20), and kallikrein-related peptidase 4 (KLK4)] and for ameloblastin (AMBN) (a suspected candidate gene). All four of the X-linked AI families (100%) had disease-causing mutations in AMELX, suggesting that AMELX is the only gene involved in the aetiology of X-linked AI. Eighteen families showed an autosomal-dominant pattern of inheritance. Disease-causing mutations were identified in 12 (67%): eight in FAM83H, and four in ENAM. No FAM83H coding-region or splice-junction mutations were identified in three probands with autosomal-dominant hypocalcification AI (ADHCAI), suggesting that a second gene may contribute to the aetiology of ADHCAI. Six families showed an autosomal-recessive pattern of inheritance, and disease-causing mutations were identified in three (50%): two in MMP20, and one in WDR72. No disease-causing mutations were found in 11 families with only one affected member. We conclude that mutation analyses of the current candidate genes for AI have about a 50% chance of identifying the disease-causing mutation in a given kindred. PMID:22243262

  6. [Clinical research XXIII. From clinical judgment to meta-analyses].

    Science.gov (United States)

    Rivas-Ruiz, Rodolfo; Castelán-Martínez, Osvaldo D; Pérez-Rodríguez, Marcela; Palacios-Cruz, Lino; Noyola-Castillo, Maura E; Talavera, Juan O

    2014-01-01

    Systematic reviews (SR) are studies designed to answer clinical questions based on original articles. A meta-analysis (MTA) is the mathematical analysis of an SR. These analyses are divided into two groups: those which evaluate measured results of quantitative variables (for example, the body mass index, BMI) and those which evaluate qualitative variables (for example, whether a patient is alive or dead, or whether or not he is healing). Quantitative variables are generally analysed by the mean difference, while qualitative variables can be analysed using several measures: the odds ratio (OR), relative risk (RR), absolute risk reduction (ARR) and hazard ratio (HR). These analyses are represented through forest plots, which allow the evaluation of each individual study as well as the heterogeneity between studies and the overall effect of the intervention. These analyses are mainly based on Student's t test and chi-squared. To make appropriate decisions based on an MTA, it is important to understand the characteristics of the statistical methods in order to avoid misinterpretations.
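
    The qualitative-outcome measures named above can be made concrete with a toy 2x2 table (events and non-events in a treated and a control group). The sketch below uses invented numbers; note that the hazard ratio (HR) additionally requires time-to-event data and cannot be computed from a simple 2x2 table.

        # OR, RR and ARR from a hypothetical 2x2 table (illustration only).
        def two_by_two_measures(events_t, n_t, events_c, n_c):
            risk_t, risk_c = events_t / n_t, events_c / n_c
            rr = risk_t / risk_c                                 # relative risk
            or_ = (events_t / (n_t - events_t)) / (events_c / (n_c - events_c))  # odds ratio
            arr = risk_c - risk_t                                # absolute risk reduction
            return {"RR": rr, "OR": or_, "ARR": arr}

        # Example: 30/200 events with treatment versus 50/200 with control.
        print(two_by_two_measures(30, 200, 50, 200))
        # {'RR': 0.6, 'OR': 0.53, 'ARR': 0.1}  (OR rounded)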

  7. Linking material and energy flow analyses and social theory

    International Nuclear Information System (INIS)

    Schiller, Frank

    2009-01-01

    The paper explores the potential of Habermas' theory of communicative action to alter the social reflexivity of material and energy flow analysis. With his social macro theory Habermas has provided an alternative, critical justification for social theory that can be distinguished from economic libertarianism and from political liberalism. Implicitly, most flow approaches draw from these theoretical traditions rather than from discourse theory. There are several types of material and energy flow analyses. While these concepts basically share a system theoretical view, they lack a specific interdisciplinary perspective that ties the fundamental insight of flows to disciplinary scientific development. Instead of simply expanding micro-models to the social macro-dimension social theory suggests infusing the very notion of flows to the progress of disciplines. With regard to the functional integration of society, material and energy flow analyses can rely on the paradigm of ecological economics and at the same time progress the debate between strong and weak sustainability within the paradigm. However, placing economics at the centre of their functional analyses may still ignore the broader social integration of society, depending on their pre-analytic outline of research and the methods used. (author)

  8. Unconscious analyses of visual scenes based on feature conjunctions.

    Science.gov (United States)

    Tachibana, Ryosuke; Noguchi, Yasuki

    2015-06-01

    To efficiently process a cluttered scene, the visual system analyzes statistical properties or regularities of visual elements embedded in the scene. It is controversial, however, whether those scene analyses could also work for stimuli unconsciously perceived. Here we show that our brain performs the unconscious scene analyses not only using a single featural cue (e.g., orientation) but also based on conjunctions of multiple visual features (e.g., combinations of color and orientation information). Subjects foveally viewed a stimulus array (duration: 50 ms) where 4 types of bars (red-horizontal, red-vertical, green-horizontal, and green-vertical) were intermixed. Although a conscious perception of those bars was inhibited by a subsequent mask stimulus, the brain correctly analyzed the information about color, orientation, and color-orientation conjunctions of those invisible bars. The information of those features was then used for the unconscious configuration analysis (statistical processing) of the central bars, which induced a perceptual bias and illusory feature binding in visible stimuli at peripheral locations. While statistical analyses and feature binding are normally 2 key functions of the visual system to construct coherent percepts of visual scenes, our results show that a high-level analysis combining those 2 functions is correctly performed by unconscious computations in the brain. (c) 2015 APA, all rights reserved.

  9. ALBEDO PATTERN RECOGNITION AND TIME-SERIES ANALYSES IN MALAYSIA

    Directory of Open Access Journals (Sweden)

    S. A. Salleh

    2012-07-01

    Full Text Available Pattern recognition and time-series analyses enable one to evaluate and generate predictions of specific phenomena. Albedo pattern and time-series analyses are very useful, especially in relation to climate condition monitoring. This study was conducted to identify albedo pattern changes over Malaysia. The recognized patterns and changes will be useful for a variety of environmental and climate monitoring studies such as carbon budgeting and aerosol mapping. Ten years (2000-2009) of MODIS satellite images were used for the analyses and interpretation. These images were processed using ERDAS Imagine remote sensing software, ArcGIS 9.3, the 6S code for atmospheric calibration and several MODIS tools (MRT, HDF2GIS, Albedo tools). Several methods for time-series analyses were explored; this paper demonstrates trend and seasonal time-series analyses using the converted HDF-format MODIS MCD43A3 albedo land product. The results revealed significant changes of albedo percentages over the past 10 years and in the pattern with regard to Malaysia's nebulosity index (NI) and aerosol optical depth (AOD). A noticeable trend can be identified with regard to the maximum and minimum values of the albedo. The rises and falls of the line graph show a similar trend to the daily observations; the difference lies in the value or percentage of the rises and falls of albedo. Thus, it can be concluded that the temporal behaviour of land surface albedo in Malaysia is uniform with regard to the local monsoons. However, although the average albedo shows a linear trend with the nebulosity index, the pattern changes of albedo with respect to the nebulosity index indicate that there are external factors that affect the albedo values, as the plotted sky conditions and diffusion do not show a uniform trend over the years, especially when the trend at 5-year intervals is examined; 2000 shows high
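
    The trend and seasonal analyses referred to above can be sketched with an ordinary least-squares fit of a linear trend plus an annual harmonic to a monthly albedo series. The series below is synthetic and only stands in for the MCD43A3-derived monthly means; it illustrates the method, not the study's data.

        # Trend + annual cycle fit for a monthly albedo series (synthetic example).
        import numpy as np

        months = np.arange(120)                                  # 2000-2009, 120 months
        albedo = (0.14 - 2e-5 * months                           # weak downward trend
                  + 0.01 * np.sin(2 * np.pi * months / 12)       # annual cycle
                  + 0.002 * np.random.default_rng(0).normal(size=120))  # noise

        X = np.column_stack([np.ones(120), months,
                             np.sin(2 * np.pi * months / 12),
                             np.cos(2 * np.pi * months / 12)])
        coef, *_ = np.linalg.lstsq(X, albedo, rcond=None)
        print("fitted trend per month:", coef[1])                # long-term albedo change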

  10. Review of accident analyses of RB experimental reactor

    International Nuclear Information System (INIS)

    Pesic, M.

    2003-01-01

    The RB reactor is a uranium fuel heavy water moderated critical assembly that has been put and kept in operation by the VINCA Institute of Nuclear Sciences, Belgrade, Serbia and Montenegro, since April 1958. The first complete Safety Analysis Report of the RB reactor was prepared in 1961/62; yet, the first accident analysis had been made in late 1958 with the aim to examine a power transition and the total equivalent doses received by the staff during the reactivity accident that occurred on October 15, 1958. Since 1960, the RB reactor has been modified a few times. Beside the initial natural uranium metal fuel rods, new types of fuel (TVR-S types of Russian origin) consisting of 2% enriched uranium metal and 80% enriched UO2, dispersed in aluminum matrix, have been available since 1962 and 1976, respectively. Modifications of the control and safety systems of the reactor were made occasionally. Special reactor cores were designed and constructed using all three types of fuel elements, as well as the coupled fast-thermal ones. The Nuclear Safety Committee of the VINCA Institute, an independent regulatory body, approved for usage all these modifications of the RB reactor on the basis of the Preliminary Safety Analysis Reports, which, beside proposed technical modifications and new regulation rules, included safety analyses of various possible accidents. A special attention was given (and a new safety methodology was proposed) to thorough analyses of the design-based accidents related to the coupled fast-thermal cores that included central zones of the reactor filled by the fuel elements without any moderator. In this paper, an overview of some accidents, methodologies and computation tools used for the accident analyses of the RB reactor is given. (author)

  11. A review of multivariate analyses in imaging genetics

    Directory of Open Access Journals (Sweden)

    Jingyu Liu

    2014-03-01

    Full Text Available Recent advances in neuroimaging technology and molecular genetics provide the unique opportunity to investigate genetic influence on the variation of brain attributes. Since the year 2000, when the initial publication on brain imaging and genetics was released, imaging genetics has been a rapidly growing research approach with increasing publications every year. Several reviews have been offered to the research community focusing on various study designs. In addition to study design, analytic tools and their proper implementation are also critical to the success of a study. In this review, we survey recent publications using data from neuroimaging and genetics, focusing on methods capturing multivariate effects accommodating the large number of variables from both imaging data and genetic data. We group the analyses of genetic or genomic data into either a prior driven or data driven approach, including gene-set enrichment analysis, multifactor dimensionality reduction, principal component analysis, independent component analysis (ICA, and clustering. For the analyses of imaging data, ICA and extensions of ICA are the most widely used multivariate methods. Given detailed reviews of multivariate analyses of imaging data available elsewhere, we provide a brief summary here that includes a recently proposed method known as independent vector analysis. Finally, we review methods focused on bridging the imaging and genetic data by establishing multivariate and multiple genotype-phenotype associations, including sparse partial least squares, sparse canonical correlation analysis, sparse reduced rank regression and parallel ICA. These methods are designed to extract latent variables from both genetic and imaging data, which become new genotypes and phenotypes, and the links between the new genotype-phenotype pairs are maximized using different cost functions. The relationship between these methods along with their assumptions, advantages, and
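
    As a minimal illustration of the ICA step highlighted above, the sketch below unmixes a synthetic subject-by-voxel matrix with scikit-learn's FastICA; the data, dimensions and the downstream genotype association are assumptions, not a real imaging-genetics pipeline.

        # FastICA on a synthetic subject-by-voxel matrix (illustration only).
        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(0)
        sources = rng.laplace(size=(5, 1000))        # 5 hidden spatial components
        mixing = rng.normal(size=(100, 5))           # 100 subjects
        X = mixing @ sources                         # observed imaging data

        ica = FastICA(n_components=5, random_state=0)
        subject_loadings = ica.fit_transform(X)      # (100, 5): one loading per subject
        spatial_maps = ica.components_               # (5, 1000): voxel-wise maps
        # The subject loadings could then be tested for association with genotypes.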

  12. Review of accident analyses of RB experimental reactor

    Directory of Open Access Journals (Sweden)

    Pešić Milan P.

    2003-01-01

    Full Text Available The RB reactor is a uranium fuel heavy water moderated critical assembly that has been put and kept in operation by the VINĆA Institute of Nuclear Sciences, Belgrade, Serbia and Montenegro, since April 1958. The first complete Safety Analysis Report of the RB reactor was prepared in 1961/62; yet, the first accident analysis had been made in late 1958 with the aim to examine a power transition and the total equivalent doses received by the staff during the reactivity accident that occurred on October 15, 1958. Since 1960, the RB reactor has been modified a few times. Beside the initial natural uranium metal fuel rods, new types of fuel (TVR-S types of Russian origin) consisting of 2% enriched uranium metal and 80% enriched UO2 dispersed in aluminum matrix, have been available since 1962 and 1976 respectively. Modifications of the control and safety systems of the reactor were made occasionally. Special reactor cores were designed and constructed using all three types of fuel elements as well as the coupled fast-thermal ones. The Nuclear Safety Committee of the VINĆA Institute, an independent regulatory body, approved for usage all these modifications of the RB reactor on the basis of the Preliminary Safety Analysis Reports, which, beside proposed technical modifications and new regulation rules, included safety analyses of various possible accidents. A special attention was given (and a new safety methodology was proposed) to thorough analyses of the design-based accidents related to the coupled fast-thermal cores that included central zones of the reactor filled by the fuel elements without any moderator. In this paper, an overview of some accidents, methodologies and computation tools used for the accident analyses of the RB reactor is given.

  13. Comparison of design and probabilistic analyses of nuclear power plants

    International Nuclear Information System (INIS)

    Maslenikov, O.R.; Johnson, J.J.; Campbell, R.D.

    1995-01-01

    A study was made to evaluate the margin of conservatism introduced into design in-structure response spectra by following standard design analysis procedures according to the U.S. Nuclear Regulatory Commission (NRC) Standard Review Plan and Regulatory Guides, by comparing spectra produced by such a design analysis with responses from median-centered probabilistic analyses. Three typical nuclear plant structures were studied: a PWR reactor building, a PWR auxiliary building and a BWR reactor building. Each building was assumed to be situated on three idealized sites: a rock site, a medium soil site and a soft soil site. All buildings were assumed to have embedded foundations. The PWR reactor building was also assumed to have a surface foundation. Each design analysis was performed in accordance with the current SRP criteria. Each probabilistic analysis consisted of 30 earthquake simulations for which the free-field motions and soil and structural properties were varied; the simulated earthquakes were generated such that their mean-plus-one-standard-deviation free-field spectra approximated the Regulatory Guide (RG) 1.60 design spectra. In-structure response spectra from the design analyses were compared with the 84% non-exceedance probability (NEP) spectra from the probabilistic analyses. The comparisons showed that the design method produced conservative results for all cases. The smallest margin was about 10% for buildings on rock sites. Softer sites had larger margins of conservatism; the reactor buildings on the soft soil site had margins of as much as 100% (factor of 2). The shorter structures and lower locations in all buildings had smaller margins. The margin of conservatism for the surface-founded reactor building was about 20% more than for the embedded reactor building. (author). 3 refs., 5 figs., 1 tab

  14. Eine selbstkonsistente Carleman Linearisierung zur Analyse von Oszillatoren

    Directory of Open Access Journals (Sweden)

    H. Weber

    2017-09-01

    Full Text Available The analysis of nonlinear dynamic circuits remains a challenging task to this day, since analytical solutions can only rarely be given. A large number of methods have therefore been developed to obtain a qualitative or quantitative approximation of the solutions of the network equations. Frequently, for example, a small-signal analysis is carried out by means of a Taylor series at an operating point, truncated after the first-order terms. However, this linearization is only valid for hyperbolic systems in the vicinity of the stable operating point. For the analysis of the dynamic behaviour of oscillators in particular, non-hyperbolic systems arise, so that this method cannot be applied (Mathis, 2000). Carleman showed that nonlinear differential equations with polynomial nonlinearities can be transformed into an infinite system of linear differential equations (Carleman, 1932). If the infinite-dimensional system of equations is truncated for numerical purposes, the transition of an oscillator into a steady-state oscillation (limit cycle) cannot be reproduced. In this contribution, a self-consistent Carleman linearization for the investigation of oscillators is presented, which is applicable even when the nonlinearities are not polynomials. Instead of a linear approximation around an operating point, the Carleman linearization yields an approximation on a prescribed domain. However, since the self-consistent technique cannot describe the steady-state behaviour of oscillators, a Poincaré map is computed, with which a subsequent analysis of the oscillator is possible.

  15. Eine selbstkonsistente Carleman Linearisierung zur Analyse von Oszillatoren

    Science.gov (United States)

    Weber, Harry; Mathis, Wolfgang

    2017-09-01

    The analysis of nonlinear dynamic circuits remains a challenging task to this day, since analytical solutions can only rarely be given. A large number of methods have therefore been developed to obtain a qualitative or quantitative approximation of the solutions of the network equations. Frequently, for example, a small-signal analysis is carried out by means of a Taylor series at an operating point, truncated after the first-order terms. However, this linearization is only valid for hyperbolic systems in the vicinity of the stable operating point. For the analysis of the dynamic behaviour of oscillators in particular, non-hyperbolic systems arise, so that this method cannot be applied (Mathis, 2000). Carleman showed that nonlinear differential equations with polynomial nonlinearities can be transformed into an infinite system of linear differential equations (Carleman, 1932). If the infinite-dimensional system of equations is truncated for numerical purposes, the transition of an oscillator into a steady-state oscillation (limit cycle) cannot be reproduced. In this contribution, a self-consistent Carleman linearization for the investigation of oscillators is presented, which is applicable even when the nonlinearities are not polynomials. Instead of a linear approximation around an operating point, the Carleman linearization yields an approximation on a prescribed domain. However, since the self-consistent technique cannot describe the steady-state behaviour of oscillators, a Poincaré map is computed, with which a subsequent analysis of the oscillator is possible.
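
    The truncation step described above can be made concrete with a small, purely illustrative sketch (not the paper's self-consistent method): for the hypothetical scalar system dx/dt = -x + x^2, the substitution y_k = x^k yields the infinite linear system dy_k/dt = -k y_k + k y_{k+1}, which is truncated at order N and compared with direct numerical integration.

```python
# Minimal sketch of a truncated Carleman linearization (illustrative example,
# not the self-consistent method of the paper).
# Nonlinear test system: dx/dt = -x + x**2 with x(0) = 0.4.
# With y_k = x**k one gets dy_k/dt = -k*y_k + k*y_{k+1}, truncated at order N.
import numpy as np
from scipy.integrate import solve_ivp

N = 8                      # truncation order
x0 = 0.4
A = np.zeros((N, N))       # truncated Carleman matrix acting on y = (x, x^2, ..., x^N)
for k in range(1, N + 1):
    A[k - 1, k - 1] = -k
    if k < N:
        A[k - 1, k] = k    # coupling to the next monomial; dropped for k = N (truncation)

y0 = np.array([x0 ** k for k in range(1, N + 1)])
t_eval = np.linspace(0.0, 5.0, 200)

lin = solve_ivp(lambda t, y: A @ y, (0.0, 5.0), y0, t_eval=t_eval)
ref = solve_ivp(lambda t, x: -x + x ** 2, (0.0, 5.0), [x0], t_eval=t_eval)

print("max |x_Carleman - x_exact| =", np.max(np.abs(lin.y[0] - ref.y[0])))
```

    For this contracting example the truncated linear system tracks the nonlinear solution closely; reproducing a limit cycle, as the abstract notes, requires the additional self-consistent treatment and the Poincaré map.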

  16. Some Examples of Accident Analyses for RB Reactor

    International Nuclear Information System (INIS)

    Pesic, M.

    2002-01-01

    The RB reactor is a heavy water critical assembly that has been operated at the Vinca Institute of Nuclear Sciences, Belgrade, Yugoslavia, since April 1959. The first Safety Analysis Report of the RB critical assembly was prepared in 1961/62. However, the first accident analysis was done in late 1958, with the aim of examining the power transient and the total equivalent doses received by the staff during the reactivity accident that occurred on October 15, 1958. Since 1960, the RB reactor has been modified a few times. Besides the initial natural uranium metal fuel rods, new fuels (TVR-S types) made of 2% enriched uranium metal and 80% enriched UO2 have been available since 1962 and 1976, respectively. Also, modifications of the control and safety systems of the reactor were made occasionally. Special reactor cores were created using all three types of fuel elements, among them the coupled fast-thermal ones. The Nuclear Safety Committee of the Vinca Institute, an independent regulatory body, approved all these modifications of the RB reactor for use. For those decisions of the Committee, Preliminary Safety Analysis Reports were prepared which, besides the proposed technical modifications and new regulation rules, included analyses of various possible accidents. Special attention was given, and a new methodology was proposed, for thorough analyses of design-basis accidents related to the coupled fast-thermal cores, which include central reactor zones filled with fuel elements without moderator. In these accidents, during an assumed flooding of the fast zone by moderator, a very high reactivity could be inserted into the system at a very high rate. It was necessary to ensure that the safety system of the reactor responded quickly to such an accident and had a sufficiently high (negative) reactivity to shut down the reactor in time. In this paper, a brief overview of some accidents, the methodology and the computation tools used for the accident analyses of the RB reactor is given. (author)

  17. Study of thermal-hydraulic analyses with CIP method

    International Nuclear Information System (INIS)

    Doi, Yoshihiro

    1996-09-01

    A new type of numerical scheme, CIP, has been proposed for solving hyperbolic-type equations, and the CIP is attracting attention as a less numerically diffusive scheme. The C-CUP method with the CIP scheme has been adopted in numerical simulations that treat compressible and incompressible fluids, phase-change phenomena and mixture fluids. To evaluate the applicability of the CIP scheme and the C-CUP method to thermal-hydraulic analyses related to Fast Breeder Reactors (FBRs), the scheme and the method were reviewed. The features of the CIP scheme and the procedure of the C-CUP method are presented. The CIP scheme is used to solve linear hyperbolic-type equations for the advection terms in the basic fluid equations. The key point of the scheme is that the profile between grid points is described by a cubic polynomial constrained by the values and the spatial derivatives of the solution at the grid points. The scheme can capture steep changes in the solution and suppresses numerical error. In the C-CUP method, the basic fluid equations are divided into advection terms and the remaining terms. The advection terms are solved with the CIP scheme and the remaining terms with a difference method. The C-CUP method is robust against numerical instability, but the fluid mass may not be properly conserved because the fluid equations are treated in nonconservative form. Numerical analyses with the CIP scheme and the C-CUP method have been performed for phase change, mixtures and moving objects. These analyses rely on the fact that the scheme and the method are robust against steep changes in density and useful for interface tracking. (author)
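
    As an illustration of the cubic-polynomial idea described above, the following is a minimal, hypothetical sketch (not taken from the report) of the classical 1D CIP scheme for the linear advection equation du/dt + c du/dx = 0 with c > 0; both the value u and its spatial derivative g = du/dx are advected.

```python
# Minimal sketch of the 1D CIP scheme for du/dt + c*du/dx = 0 (c > 0),
# illustrating the cubic interpolation between grid points using both
# the values u and the spatial derivatives g = du/dx.
import numpy as np

def cip_step(u, g, c, dt, dx):
    """Advance u and g by one time step with the (upwind, c > 0) CIP scheme."""
    up = np.roll(u, 1)           # upstream values u_{i-1} (periodic boundary)
    gp = np.roll(g, 1)           # upstream derivatives g_{i-1}
    D = -dx                      # signed distance to the upstream point
    xi = -c * dt                 # departure-point offset
    a = (g + gp) / D**2 + 2.0 * (u - up) / D**3
    b = 3.0 * (up - u) / D**2 - (2.0 * g + gp) / D
    u_new = a * xi**3 + b * xi**2 + g * xi + u
    g_new = 3.0 * a * xi**2 + 2.0 * b * xi + g
    return u_new, g_new

# Advect a smooth pulse once around a periodic domain and measure the error.
nx, c = 200, 1.0
x = np.linspace(0.0, 1.0, nx, endpoint=False)
dx = x[1] - x[0]
dt = 0.4 * dx / c                       # CFL number 0.4
u = np.exp(-200.0 * (x - 0.3) ** 2)
g = np.gradient(u, dx)
u0 = u.copy()
for _ in range(int(round(1.0 / (c * dt)))):
    u, g = cip_step(u, g, c, dt, dx)
print("max error after one period:", np.abs(u - u0).max())
```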

  18. Multicentre evaluation of the new ORTHO VISION® analyser.

    Science.gov (United States)

    Lazarova, E; Scott, Y; van den Bos, A; Wantzin, P; Atugonza, R; Solkar, S; Carpio, N

    2017-10-01

    Implementation of fully automated analysers has become a crucial security step in the blood bank; it reduces human errors, allows standardisation and improves turnaround time (TAT). We aimed at evaluating the ease of use and the efficiency of the ORTHO VISION® Analyser (VISION) in comparison to the ORTHO AutoVue® Innova System (AutoVue) in six different laboratories. After initial training and system configuration, VISION was used in parallel to AutoVue following the daily workload, both instruments being based on ORTHO BioVue® System column agglutination technology. Each participating laboratory provided data and scored the training, system configuration, quality control, maintenance and system efficiency. A total of 1049 individual samples were run: 266 forward and reverse grouping and antibody screens with 10 urgent samples, 473 ABD forward grouping and antibody screens with 22 urgent samples, 160 ABD forward grouping, 42 antibody screens and a series of 108 specific case profiles. The VISION instrument was more rapid than the AutoVue, with a mean test time of 27·9 min compared to 36 min; for the various test type comparisons, the TAT data obtained from VISION were shorter than those from AutoVue. Moreover, VISION analysed urgent STAT samples faster. Regarding ease of use, VISION was intuitive and user friendly. VISION is a robust, reproducible system performing most types of analytical determinations needed for pre-transfusion testing today, thus accommodating a wide range of clinical needs. VISION brings appreciated new features that could further secure blood transfusions. © 2017 The Authors. Transfusion Medicine published by John Wiley & Sons Ltd on behalf of British Blood Transfusion Society.

  19. Improving Climate Communication through Comprehensive Linguistic Analyses Using Computational Tools

    Science.gov (United States)

    Gann, T. M.; Matlock, T.

    2014-12-01

    An important lesson from climate communication research is that there is no single way to reach out and inform the public. Different groups conceptualize climate issues in different ways, and different groups have different values and assumptions. This variability makes it extremely difficult to effectively and objectively communicate climate information. One of the main challenges is the following: How do we acquire a better understanding of how values and assumptions vary across groups, including political groups? A necessary starting point is to pay close attention to the linguistic content of messages used across current popular media sources. Careful analyses of that information, including how it is realized in language for conservative and progressive media, may ultimately help climate scientists, government agency officials, journalists and others develop more effective messages. Past research has looked at partisan media coverage of climate change, but little attention has been given to the fine-grained linguistic content of such media. And when researchers have done detailed linguistic analyses, they have relied primarily on hand-coding, an approach that is costly, labor intensive, and time-consuming. Our project, building on recent work on partisan news media (Gann & Matlock, 2014; under review), uses high-dimensional semantic analyses and other automated classification techniques from the field of natural language processing to quantify how climate issues are characterized in media sources that differ according to political orientation. In addition to discussing varied linguistic patterns, we share new methods for improving climate communication for varied stakeholders, and for developing better assessments of their effectiveness.

  20. Distinguishing Nonpareil marketing group almond cultivars through multivariate analyses.

    Science.gov (United States)

    Ledbetter, Craig A; Sisterson, Mark S

    2013-09-01

    More than 80% of the world's almonds are grown in California with several dozen almond cultivars available commercially. To facilitate promotion and sale, almond cultivars are categorized into marketing groups based on kernel shape and appearance. Several marketing groups are recognized, with the Nonpareil Marketing Group (NMG) demanding the highest prices. Placement of cultivars into the NMG is historical and no objective standards exist for deciding whether newly developed cultivars belong in the NMG. Principal component analyses (PCA) were used to identify nut and kernel characteristics best separating the 4 NMG cultivars (Nonpareil, Jeffries, Kapareil, and Milow) from a representative of the California Marketing Group (cultivar Carmel) and the Mission Marketing Group (cultivar Padre). In addition, discriminant analyses were used to determine cultivar misclassification rates between and within the marketing groups. All 19 evaluated carpological characters differed significantly among the 6 cultivars and during 2 harvest seasons. A clear distinction of NMG cultivars from representatives of the California and Mission Marketing Groups was evident from a PCA involving the 6 cultivars. Further, NMG kernels were successfully discriminated from kernels representing the California and Mission Marketing Groups with overall kernel misclassification of only 2% using 16 of the 19 evaluated characters. Pellicle luminosity was the most discriminating character, regardless of the character set used in analyses. Results provide an objective classification of NMG almond kernels, clearly distinguishing them from kernels of cultivars representing the California and Mission Marketing Groups. Journal of Food Science © 2013 Institute of Food Technologists® No claim to original US government works.
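
    The combination of principal component and discriminant analyses described above can be reproduced in outline with standard tools; the sketch below uses hypothetical morphometric measurements (not the study's data) to separate marketing groups and estimate a misclassification rate.

```python
# Sketch of the PCA + discriminant-analysis workflow described above,
# applied to hypothetical kernel measurements (not the study's data).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_per_group, n_traits = 60, 16          # e.g. kernel length, width, luminosity, ...

# Simulate three marketing groups with slightly shifted trait means.
groups, labels = [], []
for g, shift in enumerate([0.0, 0.8, 1.6]):
    groups.append(rng.normal(loc=shift, scale=1.0, size=(n_per_group, n_traits)))
    labels += [g] * n_per_group
X = np.vstack(groups)
y = np.array(labels)

# Principal components summarising the correlated carpological characters.
pca = PCA(n_components=3).fit(X)
print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 3))

# Discriminant analysis with cross-validation to estimate misclassification.
lda = LinearDiscriminantAnalysis()
accuracy = cross_val_score(lda, X, y, cv=5).mean()
print("estimated misclassification rate:", round(1.0 - accuracy, 3))
```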

  1. Pegasys: software for executing and integrating analyses of biological sequences

    Directory of Open Access Journals (Sweden)

    Lett Drew

    2004-04-01

    Full Text Available Abstract Background We present Pegasys – a flexible, modular and customizable software system that facilitates the execution of, and data integration from, heterogeneous biological sequence analysis tools. Results The Pegasys system includes numerous tools for pair-wise and multiple sequence alignment, ab initio gene prediction, RNA gene detection, masking repetitive sequences in genomic DNA, as well as filters for database formatting and processing raw output from various analysis tools. We introduce a novel data structure for creating workflows of sequence analyses and a unified data model to store its results. The software allows users to dynamically create analysis workflows at run-time by manipulating a graphical user interface. All non-serially dependent analyses are executed in parallel on a compute cluster for efficiency of data generation. The uniform data model and backend relational database management system of Pegasys allow results of the heterogeneous programs included in the workflow to be integrated and exported into General Feature Format for further analyses in GFF-dependent tools, or GAME XML for import into the Apollo genome editor. The modularity of the design allows new tools to be added to the system with little programmer overhead. The database application programming interface allows programmatic access to the data stored in the backend through SQL queries. Conclusions The Pegasys system enables biologists and bioinformaticians to create and manage sequence analysis workflows. The software is released under the Open Source GNU General Public License. All source code and documentation are available for download at http://bioinformatics.ubc.ca/pegasys/.
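
    As a rough, hypothetical illustration of the workflow idea described above (names and structure invented for the example, not Pegasys' actual API), a sequence-analysis workflow can be modelled as a small DAG whose independent nodes run in parallel once their dependencies are satisfied.

```python
# Hypothetical sketch of a sequence-analysis workflow as a DAG, in the spirit
# of the system described above (this is not the Pegasys API).
from concurrent.futures import ThreadPoolExecutor

def mask_repeats(seq):  return seq.lower()          # stand-in for repeat masking
def predict_genes(seq): return ["gene1", "gene2"]   # stand-in for ab initio prediction
def align(seq):         return {"hit": "chr1:100"}  # stand-in for pairwise alignment

# Each task maps a name to (function, list of dependencies).
workflow = {
    "mask":  (mask_repeats,  []),
    "genes": (predict_genes, ["mask"]),
    "align": (align,         ["mask"]),
}

def run(workflow, seq):
    """Run tasks level by level; independent tasks at each level run in parallel."""
    results, done = {}, set()
    while len(done) < len(workflow):
        ready = [name for name, (_, deps) in workflow.items()
                 if name not in done and all(d in done for d in deps)]
        with ThreadPoolExecutor() as pool:
            futures = {}
            for name in ready:
                func, deps = workflow[name]
                arg = results[deps[0]] if deps else seq   # feed the upstream result
                futures[name] = pool.submit(func, arg)
            for name, fut in futures.items():
                results[name] = fut.result()
        done.update(ready)
    return results

print(run(workflow, "ACGTACGTNNNACGT"))
```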

  2. Quality assurance in road traffic analyses in Switzerland.

    Science.gov (United States)

    Briellmann, Thomas A; Sigrist, Thomas; Augsburger, Marc; Favrat, Bernard; Oestreich, Andrea; Deom, André

    2010-05-20

    Swiss laboratories performing toxicological road traffic analyses have been authorized for many years by the Swiss Federal Roads Office (FEDRO). In 2003 FEDRO signed a contract with the Swiss Society of Legal Medicine (SSLM) to organize the complete quality management concerning road traffic analyses. For this purpose a multidisciplinary working group was established under the name of "road traffic commission (RTC)". RTC has to organize external quality control, to interpret the results of these controls, to perform audits in the laboratories and to report all results to FEDRO. Furthermore, the working group can be mandated for special tasks by FEDRO. As an independent organization, the Swiss Center for Quality Control (CSCQ) in Geneva has managed the external quality controls over the past years. All tested drugs and psychoactive substances are listed in a federal instruction. The so-called 'zero tolerance substances' (THC, morphine, cocaine, amphetamine, methamphetamine, MDMA and MDEA) and their metabolites have to be tested once a year, and all other substances (benzodiazepines, zolpidem, phenobarbital, etc.) periodically. Results over the last years show that all laboratories are generally within the confidence interval of +/-30% of the mean value. In cases of non-conformity, measures have to be taken immediately and reported to the working group. External audits are performed triennially, but accredited laboratories can combine this audit with the approval of the Swiss Accreditation Service (SAS). During the audits a special checklist filled in by the laboratory director is assessed. Non-conformities have to be corrected. During the process of establishing new legislation, RTC had an opportunity to advise FEDRO. In collaboration with FEDRO, RTC and hence SSLM can work actively on improving quality assurance in road traffic toxicological analyses, and has an opportunity to bring its professional requests to the federal authorities.

  3. FY01 Supplemental Science and Performance Analyses, Volume 1: Scientific Bases and Analyses, Part 1 and 2

    International Nuclear Information System (INIS)

    Dobson, David

    2001-01-01

    The U.S. Department of Energy (DOE) is considering the possible recommendation of a site at Yucca Mountain, Nevada, for development as a geologic repository for the disposal of high-level radioactive waste and spent nuclear fuel. To facilitate public review and comment, in May 2001 the DOE released the Yucca Mountain Science and Engineering Report (S and ER) (DOE 2001 [DIRS 153849]), which presents technical information supporting the consideration of the possible site recommendation. The report summarizes the results of more than 20 years of scientific and engineering studies. A decision to recommend the site has not been made: the DOE has provided the S and ER and its supporting documents as an aid to the public in formulating comments on the possible recommendation. When the S and ER (DOE 2001 [DIRS 153849]) was released, the DOE acknowledged that technical and scientific analyses of the site were ongoing. Therefore, the DOE noted in the Federal Register Notice accompanying the report (66 FR 23 013 [DIRS 155009], p. 2) that additional technical information would be released before the dates, locations, and times for public hearings on the possible recommendation were announced. This information includes: (1) the results of additional technical studies of a potential repository at Yucca Mountain, contained in this FY01 Supplemental Science and Performance Analyses: Vol. 1, Scientific Bases and Analyses; and FY01 Supplemental Science and Performance Analyses: Vol. 2, Performance Analyses (McNeish 2001 [DIRS 155023]) (collectively referred to as the SSPA) and (2) a preliminary evaluation of the Yucca Mountain site's preclosure and postclosure performance against the DOE's proposed site suitability guidelines (10 CFR Part 963 [64 FR 67054] [DIRS 124754]). By making the large amount of information developed on Yucca Mountain available in stages, the DOE intends to provide the public and interested parties with time to review the available materials and to formulate

  4. 3D analyses of cavitation instabilities accounting for plastic anisotropy

    DEFF Research Database (Denmark)

    Legarth, Brian Nyvang; Tvergaard, Viggo

    2010-01-01

    Full three dimensional cell model analyses are carried out for a solid containing a single small void, in order to determine the critical stress levels for the occurrence of cavitation instabilities. The material models applied are elastic‐viscoplastic, with a small rate‐hardening exponent...... that the quasi‐static solution is well approximated. A special procedure is used to strongly reduce the loading rate a little before the instability occurs. It is found that plastic anisotropy has a significant effect on the level of the critical stress for cavitation instabilities....

  5. Performance of neutron kinetics models for ADS transient analyses

    International Nuclear Information System (INIS)

    Rineiski, A.; Maschek, W.; Rimpault, G.

    2002-01-01

    Within the framework of the SIMMER code development, neutron kinetics models for simulating transients and hypothetical accidents in advanced reactor systems, in particular in Accelerator Driven Systems (ADSs), have been developed at FZK/IKET in cooperation with CE Cadarache. SIMMER is a fluid-dynamics/thermal-hydraulics code, coupled with a structure model and a space-, time- and energy-dependent neutronics module for analyzing transients and accidents. The advanced kinetics models have also been implemented into KIN3D, a module of the VARIANT/TGV code (stand-alone neutron kinetics), to broaden their application and for testing and benchmarking. In the paper, a short review of the SIMMER and KIN3D neutron kinetics models is given. Some typical transients related to ADS perturbations are analyzed. The general models of SIMMER and KIN3D are compared with simpler techniques developed in the context of this work to get a better understanding of the specifics of transients in subcritical systems and to estimate the performance of different kinetics options. These comparisons may also help in elaborating new kinetics models and extending existing computation tools for ADS transient analyses. The traditional point-kinetics model may give rather inaccurate transient reaction rate distributions in an ADS even if the material configuration does not change significantly. This inaccuracy is not related to the problem of choosing a 'right' weighting function: the point-kinetics model with any weighting function cannot take into account pronounced flux shape variations related to possible significant changes in the criticality level or to fast beam trips. To improve the accuracy of the point-kinetics option for slow transients, we have introduced a correction factor technique. The related analyses give a better understanding of 'long-timescale' kinetics phenomena in the subcritical domain and help to evaluate the performance of the quasi-static scheme in a particular case. One
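
    For readers unfamiliar with the point-kinetics model discussed above, the following is a minimal, self-contained sketch (illustrative parameter values, not from the paper) of source-driven point kinetics for a subcritical system with one delayed-neutron group, showing how the neutron level responds to a beam trip.

```python
# Minimal sketch of point kinetics with an external source (one delayed group),
# illustrating the response of a subcritical system to a beam trip.
# Parameter values are illustrative only, not taken from the paper.
import numpy as np
from scipy.integrate import solve_ivp

beta, lam, Lambda = 0.0065, 0.08, 1.0e-6  # delayed fraction, precursor decay [1/s], generation time [s]
rho = -0.01                                # static subcriticality
S0 = 1.0e5                                 # external (accelerator) source, arbitrary units

def rhs(t, y, S):
    n, c = y
    dn = (rho - beta) / Lambda * n + lam * c + S
    dc = beta / Lambda * n - lam * c
    return [dn, dc]

# Steady state of the source-driven subcritical core: n0 = -S0 * Lambda / rho.
n0 = -S0 * Lambda / rho
c0 = beta * n0 / (lam * Lambda)

# Phase 1: source on; Phase 2: beam trip at t = 1 s (source switched off).
sol_on = solve_ivp(rhs, (0.0, 1.0), [n0, c0], args=(S0,), method="Radau")
sol_off = solve_ivp(rhs, (1.0, 2.0), sol_on.y[:, -1], args=(0.0,), method="Radau")

print("n before beam trip:", sol_on.y[0, -1])
print("n one second after:", sol_off.y[0, -1])
```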

  6. Sensitivity analyses of fast reactor systems including thorium and uranium

    International Nuclear Information System (INIS)

    Marable, J.H.; Weisbin, C.R.

    1978-01-01

    The Cross Section Evaluation Working Group (CSEWG) has, in conjunction with the development of the fifth version of ENDF/B, assembled new evaluations for 232Th and 233U. It is the purpose of this paper to describe briefly some of the more important features of these evaluations relative to ENDF/B-4, to project the change in reactor performance based upon the newer evaluated files and sensitivity coefficients for interesting design problems, and to indicate preliminary results from ongoing uncertainty analyses
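
    The role of sensitivity coefficients mentioned above can be illustrated with a simple first-order propagation sketch (hypothetical numbers, not from the evaluations): the relative change in a response R is approximated by the sum of sensitivities times relative cross-section changes, and the response uncertainty follows from the cross-section covariance.

```python
# Illustrative first-order sensitivity propagation (hypothetical numbers):
#   dR/R ~= sum_i S_i * (dsigma_i / sigma_i),   var(R)/R^2 ~= S^T C S
# where S_i are sensitivity coefficients and C is the relative covariance matrix.
import numpy as np

S = np.array([0.30, -0.12, 0.05])          # sensitivities of a response to three cross sections
dsig_rel = np.array([0.02, -0.01, 0.03])   # relative changes between two evaluations
C = np.diag([0.03, 0.02, 0.05]) ** 2       # relative covariance (uncorrelated, for simplicity)

dR_rel = S @ dsig_rel
uR_rel = np.sqrt(S @ C @ S)
print(f"projected change in response: {dR_rel:+.4%}")
print(f"propagated relative uncertainty: {uR_rel:.4%}")
```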

  7. Robotic sample preparation for radiochemical plutonium and americium analyses

    International Nuclear Information System (INIS)

    Stalnaker, N.; Beugelsdijk, T.; Thurston, A.; Quintana, J.

    1985-01-01

    A Zymate robotic system has been assembled and programmed to prepare samples for plutonium and americium analyses by radioactivity counting. The system performs two procedures: a simple dilution procedure and a TTA (xylene) extraction of plutonium. To perform the procedures, the robotic system executes 11 unit operations such as weighing, pipetting, mixing, etc. Approximately 150 programs, which require 64 kilobytes of memory, control the system. The system is now being tested with high-purity plutonium metal and plutonium oxide samples. Our studies indicate that the system can give results that agree within 5% at the 95% confidence level with determinations performed manually. 1 ref., 1 fig., 1 tab

  8. An Apple II -based bidimensional pulse height analyser

    International Nuclear Information System (INIS)

    Bateman, J.E.; Flesher, A.C.; Honeyman, R.N.; Pritchard, T.E.; Price, W.P.R.

    1984-06-01

    The implementation of a pulse height analyser function in an Apple II microcomputer using minimal purpose built hardware is described. Except for a small interface module the system consists of two suites of software, one giving a conventional one dimensional analysis on a span of 1024 channels, and the other a two dimensional analysis on a 128 x 128 image format. Using the recently introduced ACCELERATOR coprocessor card the system performs with a dead time per event of less than 50 μs. Full software facilities are provided for display, storage and processing of the data using standard Applesoft BASIC. (author)
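
    A pulse height analyser of the kind described above essentially histograms digitised pulse amplitudes into channels; the sketch below (purely illustrative, not the Apple II implementation) shows the 1024-channel and 128 x 128 bidimensional accumulation.

```python
# Illustrative sketch of pulse-height-analyser accumulation (not the Apple II code):
# 1D: 1024-channel spectrum; 2D: 128 x 128 image from coincident pulse pairs.
import numpy as np

rng = np.random.default_rng(1)

# Simulated ADC readings (0..1023) for a peak on a flat background.
pulses = np.concatenate([
    rng.normal(512, 15, 20000).astype(int),    # photopeak
    rng.integers(0, 1024, 5000),               # flat background
]).clip(0, 1023)

spectrum = np.bincount(pulses, minlength=1024)         # 1D pulse-height spectrum
print("peak channel:", int(spectrum.argmax()))

# Bidimensional analysis: pairs of pulse heights mapped onto a 128 x 128 grid.
x = rng.normal(64, 8, 10000).astype(int).clip(0, 127)
y = rng.normal(64, 8, 10000).astype(int).clip(0, 127)
image = np.zeros((128, 128), dtype=np.int64)
np.add.at(image, (y, x), 1)                            # accumulate counts per pixel
print("counts in hottest pixel:", int(image.max()))
```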

  9. The ASSET intercomparison of stratosphere and lower mesosphere humidity analyses

    Directory of Open Access Journals (Sweden)

    H. E. Thornton

    2009-02-01

    Full Text Available This paper presents results from the first detailed intercomparison of stratosphere-lower mesosphere water vapour analyses; it builds on earlier results from the EU funded framework V "Assimilation of ENVISAT Data" (ASSET project. Stratospheric water vapour plays an important role in many key atmospheric processes and therefore an improved understanding of its daily variability is desirable. With the availability of high resolution, good quality Michelson Interferometer for Passive Atmospheric Sounding (MIPAS water vapour profiles, the ability of four different atmospheric models to assimilate these data is tested. MIPAS data have been assimilated over September 2003 into the models of the European Centre for Medium Range Weather Forecasts (ECMWF, the Belgian Institute for Space and Aeronomy (BIRA-IASB, the French Service d'Aéronomie (SA-IPSL and the UK Met Office. The resultant middle atmosphere humidity analyses are compared against independent satellite data from the Halogen Occultation Experiment (HALOE, the Polar Ozone and Aerosol Measurement (POAM III and the Stratospheric Aerosol and Gas Experiment (SAGE II. The MIPAS water vapour profiles are generally well assimilated in the ECMWF, BIRA-IASB and SA systems, producing stratosphere-mesosphere water vapour fields where the main features compare favourably with the independent observations. However, the models are less capable of assimilating the MIPAS data where water vapour values are locally extreme or in regions of strong humidity gradients, such as the southern hemisphere lower stratosphere polar vortex. Differences in the analyses can be attributed to the choice of humidity control variable, how the background error covariance matrix is generated, the model resolution and its complexity, the degree of quality control of the observations and the use of observations near the model boundaries. Due to the poor performance of the Met Office analyses the results are not included in

  10. The ASSET intercomparison of stratosphere and lower mesosphere humidity analyses

    Science.gov (United States)

    Thornton, H. E.; Jackson, D. R.; Bekki, S.; Bormann, N.; Errera, Q.; Geer, A. J.; Lahoz, W. A.; Rharmili, S.

    2009-02-01

    This paper presents results from the first detailed intercomparison of stratosphere-lower mesosphere water vapour analyses; it builds on earlier results from the EU funded framework V "Assimilation of ENVISAT Data" (ASSET) project. Stratospheric water vapour plays an important role in many key atmospheric processes and therefore an improved understanding of its daily variability is desirable. With the availability of high resolution, good quality Michelson Interferometer for Passive Atmospheric Sounding (MIPAS) water vapour profiles, the ability of four different atmospheric models to assimilate these data is tested. MIPAS data have been assimilated over September 2003 into the models of the European Centre for Medium Range Weather Forecasts (ECMWF), the Belgian Institute for Space and Aeronomy (BIRA-IASB), the French Service d'Aéronomie (SA-IPSL) and the UK Met Office. The resultant middle atmosphere humidity analyses are compared against independent satellite data from the Halogen Occultation Experiment (HALOE), the Polar Ozone and Aerosol Measurement (POAM III) and the Stratospheric Aerosol and Gas Experiment (SAGE II). The MIPAS water vapour profiles are generally well assimilated in the ECMWF, BIRA-IASB and SA systems, producing stratosphere-mesosphere water vapour fields where the main features compare favourably with the independent observations. However, the models are less capable of assimilating the MIPAS data where water vapour values are locally extreme or in regions of strong humidity gradients, such as the southern hemisphere lower stratosphere polar vortex. Differences in the analyses can be attributed to the choice of humidity control variable, how the background error covariance matrix is generated, the model resolution and its complexity, the degree of quality control of the observations and the use of observations near the model boundaries. Due to the poor performance of the Met Office analyses the results are not included in the intercomparison

  11. New ventures require accurate risk analyses and adjustments.

    Science.gov (United States)

    Eastaugh, S R

    2000-01-01

    For new business ventures to succeed, healthcare executives need to conduct robust risk analyses and develop new approaches to balance risk and return. Risk analysis involves examination of objective risks and harder-to-quantify subjective risks. Mathematical principles applied to investment portfolios also can be applied to a portfolio of departments or strategic business units within an organization. The ideal business investment would have a high expected return and a low standard deviation. Nonetheless, both conservative and speculative strategies should be considered in determining an organization's optimal service line and helping the organization manage risk.
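
    The portfolio idea mentioned above can be made concrete with a small sketch (hypothetical figures, not from the article): the expected return of a mix of service lines is the weighted average of their returns, while the portfolio's standard deviation depends on the covariances between them.

```python
# Hypothetical portfolio view of three service lines (figures invented for illustration).
import numpy as np

weights = np.array([0.5, 0.3, 0.2])          # share of capital in each service line
returns = np.array([0.06, 0.10, 0.15])       # expected annual returns
stdevs = np.array([0.04, 0.09, 0.20])        # standalone risk (standard deviation)
corr = np.array([[1.0, 0.2, 0.1],
                 [0.2, 1.0, 0.3],
                 [0.1, 0.3, 1.0]])            # assumed correlations between lines
cov = np.outer(stdevs, stdevs) * corr

expected_return = weights @ returns
portfolio_std = np.sqrt(weights @ cov @ weights)
print(f"expected portfolio return: {expected_return:.2%}")
print(f"portfolio standard deviation: {portfolio_std:.2%}")
```

    A service line with a high standalone risk can still lower total portfolio risk if it is weakly correlated with the others, which is the rationale for keeping both conservative and speculative strategies in view.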

  12. Usage of data warehouse for analysing software's bugs

    Science.gov (United States)

    Živanov, Danijel; Krstićev, Danijela Boberić; Mirković, Duško

    2017-07-01

    We analysed the database schema of the Bugzilla system and, taking into account users' requirements for reporting, we present a dimensional model for the data warehouse that will be used for reporting software defects. The idea proposed in this paper is not to throw away the Bugzilla system, because it certainly has many strengths, but to integrate Bugzilla with the proposed data warehouse. Bugzilla would continue to be used for recording bugs that occur during the development and maintenance of software, while the data warehouse would be used for storing data on bugs in an appropriate form, which is more suitable for analysis.
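
    A dimensional model of the kind proposed above is typically organised as a star schema: a fact table of bug events surrounded by dimension tables. The sketch below (table and column names invented for illustration, not the paper's actual model) creates such a schema with Python's sqlite3 and shows a typical analytical query.

```python
# Illustrative star schema for bug reporting (names invented, not the paper's model).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, day TEXT, month TEXT, year INTEGER);
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, product TEXT, component TEXT);
CREATE TABLE dim_status  (status_id INTEGER PRIMARY KEY, status TEXT, severity TEXT);

CREATE TABLE fact_bug (
    bug_id INTEGER,
    date_id INTEGER REFERENCES dim_date(date_id),
    product_id INTEGER REFERENCES dim_product(product_id),
    status_id INTEGER REFERENCES dim_status(status_id),
    days_open REAL                       -- additive measure
);
""")

# A typical analytical query: average days a bug stays open, per product and severity.
query = """
SELECT p.product, s.severity, AVG(f.days_open) AS avg_days_open, COUNT(*) AS n_bugs
FROM fact_bug f
JOIN dim_product p ON p.product_id = f.product_id
JOIN dim_status  s ON s.status_id  = f.status_id
GROUP BY p.product, s.severity;
"""
print(conn.execute(query).fetchall())   # empty until the warehouse is loaded from Bugzilla
```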

  13. Analysing Old Testament poetry: Basic issues in contemporary exegesis

    Directory of Open Access Journals (Sweden)

    G. T. M. Prinsloo

    1991-08-01

    Full Text Available The wealth of publications on matters relating to Old Testament poetry is witness to the fact that this subject has become a focal point in Old Testament studies. In this paper, an overview of contemporary publications is given. The basic issues, both on the level of poetic theory and practical application, are pointed out. A tendency towards a comprehensive literary approach is definitely present and should be encouraged. Only when a poem is analysed on all levels and by all means, will the richness of its meaning be appreciated.

  14. Analyses of anticipated transient without scram events in SMART

    International Nuclear Information System (INIS)

    Kim, Hyung Rae; Chun, Ji Han; Kim, Soo Hyoung; Yang, Soo Hyung; Bae, Kyoo Hwan

    2012-01-01

    SMART is a small integral reactor, which was developed at KAERI and acquired standard design approval in 2012. SMART works in principle like a pressurized light water reactor, though it is more compact than large loop-type commercial reactors. An ATWS (Anticipated Transient Without Scram) event is an AOO (Anticipated Operational Occurrence) during which the RPS (Reactor Protection System) fails to trip the reactor when requested. SMART incorporates a DPS (Diverse Protection System) to protect the reactor system when the RPS fails to trip the reactor. The results of the transient analyses show that the DPS in SMART effectively mitigates the consequences of ATWS events.

  15. Analyses of Current And Wave Forces on Velocity Caps

    DEFF Research Database (Denmark)

    Christensen, Erik Damgaard; Buhrkall, Jeppe; Eskesen, Mark C. D.

    2015-01-01

    Velocity caps are often used in connection with, for instance, offshore intakes of sea water used as cooling water for power plants or as a source for desalination plants. Such intakes can also be used in rivers. The velocity cap is placed on top of a vertical pipe. The vertical pipe...... This paper investigates the current and wave forces on the velocity cap and the vertical cylinder. The Morison force model was used in the analyses of the force time series extracted from the CFD model. Further, the distribution of the inlet velocities around the velocity cap was also analyzed in detail...
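
    As background to the force model mentioned above, Morison's equation expresses the in-line force per unit length on a cylinder as the sum of a drag term and an inertia term; the sketch below (illustrative coefficients and wave parameters, not the study's values) evaluates it for a regular wave.

```python
# Morison's equation for the in-line wave force per unit length on a vertical cylinder:
#   f(t) = 0.5 * rho * Cd * D * u*|u| + rho * Cm * (pi * D**2 / 4) * du/dt
# Coefficients and wave parameters below are illustrative, not the study's values.
import numpy as np

rho, D = 1025.0, 1.5            # sea water density [kg/m^3], cylinder diameter [m]
Cd, Cm = 1.0, 2.0               # drag and inertia coefficients (assumed)
H, T = 2.0, 8.0                 # wave height [m] and period [s]
omega = 2.0 * np.pi / T
U = np.pi * H / T               # velocity amplitude near the surface (deep-water estimate)

t = np.linspace(0.0, T, 500)
u = U * np.cos(omega * t)       # horizontal particle velocity
du = -U * omega * np.sin(omega * t)

f = 0.5 * rho * Cd * D * u * np.abs(u) + rho * Cm * (np.pi * D**2 / 4.0) * du
print(f"peak in-line force per unit length: {np.abs(f).max():.1f} N/m")
```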

  16. Use of flow models to analyse loss of coolant accidents

    International Nuclear Information System (INIS)

    Pinet, Bernard

    1978-01-01

    This article summarises current work on developing the use of flow models to analyse loss-of-coolant accidents in pressurized-water plants. This work is being done jointly, in the context of the LOCA Technical Committee, by the CEA, EDF and FRAMATOME. The construction of the flow model is very closely based on some theoretical studies of the two-fluid model. The laws of transfer at the interface and at the wall are tested experimentally. The representativity of the model then has to be checked in experiments involving several elementary physical phenomena [fr]

  17. A real-time transfer function analyser program for PFR

    International Nuclear Information System (INIS)

    McWilliam, D.

    1980-03-01

    A transfer function analyser software package has been produced which is believed to constitute a significant advance over others reported in the literature. The main advantages of the system are its operating speed, especially at low frequencies, which is due to its use of part-cycle integration, and its high degree of interactive operator control. The driving sine wave, the return signals and the computed vector diagrams are displayed on TV-type visual display units. Data output is by means of an incremental graph plotter or an IBM typewriter. (author)
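
    A transfer function analyser of this kind typically drives the system with a sine wave and correlates the response with in-phase and quadrature references to obtain gain and phase at each frequency. The sketch below (illustrative first-order test plant, not the PFR software) shows this basic sine-correlation principle; part-cycle integration, as used in the package above, refines the same idea to shorten measurement times at low frequencies.

```python
# Sketch of the sine-correlation principle behind a transfer function analyser
# (illustrative first-order test plant; this is not the PFR program itself).
import numpy as np

def measure_point(plant, f, fs=2000.0, n_cycles=10):
    """Drive `plant` with a sine at frequency f and extract gain and phase."""
    t = np.arange(0.0, n_cycles / f, 1.0 / fs)
    drive = np.sin(2.0 * np.pi * f * t)
    response = plant(drive, fs)
    # Discard the first cycle to let the start-up transient die out,
    # then correlate with in-phase and quadrature references.
    skip = int(fs / f)
    r, tt = response[skip:], t[skip:]
    i_comp = 2.0 * np.mean(r * np.sin(2.0 * np.pi * f * tt))
    q_comp = 2.0 * np.mean(r * np.cos(2.0 * np.pi * f * tt))
    gain = np.hypot(i_comp, q_comp)
    phase = np.degrees(np.arctan2(q_comp, i_comp))
    return gain, phase

def first_order_plant(u, fs, tau=0.05):
    """Discrete first-order lag y' = (u - y)/tau, used as the system under test."""
    y = np.zeros_like(u)
    for k in range(1, len(u)):
        y[k] = y[k - 1] + (u[k - 1] - y[k - 1]) / (tau * fs)
    return y

for f in (1.0, 3.0, 10.0):
    g, p = measure_point(first_order_plant, f)
    print(f"f = {f:5.1f} Hz  gain = {g:.3f}  phase = {p:6.1f} deg")
```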

  18. Elastodynamic fracture analyses of large crack-arrest experiments

    International Nuclear Information System (INIS)

    Bass, B.R.; Pugh, C.E.; Walker, J.K.

    1985-01-01

    Results obtained to date show that the essence of the run-arrest events, including dynamic behavior, is being modeled. Refined meshes and optimum solution algorithms are important parameters in elastodynamic analysis programs to give sufficient resolution to the geometric and time-dependent aspects of fracture analyses. Further refinements in quantitative representation of material parameters and the inclusion of rate dependence through viscoplastic modeling is expected to give an even more accurate basis for assessing the fracture behavior of reactor pressure vessels under PTS and other off-normal loading conditions

  19. Performance Analyses in an Assistive Technology Service Delivery Process

    DEFF Research Database (Denmark)

    Petersen, Anne Karin

    Performance Analyses in an Assistive Technology Service Delivery Process. Keywords: process model, occupational performance, assistive technologies. The poster is about teaching students, using models and theory in education and practice. It is related to the occupational therapy process and professional...... of top-down, client-centred and activity-based interventions, ERGO/Munksgaard. Fisher, A. & Griswold, L. A., 2014. Performance Skills. In: B. Schell, ed., 2014. Occupational Therapy: Willard & Spackman's Occupational Therapy, 12th ed., pp. 249-264. Cook, A. M. & Polgar, J. M. (2015) Assistive Technologies...

  20. Systematic realisation of control flow analyses for CML

    DEFF Research Database (Denmark)

    Gasser, K.L.S.; Nielson, Flemming; Nielson, Hanne Riis

    1997-01-01

    We present a methodology for the systematic realisation of control flow analyses and illustrate it for Concurrent ML. We start with an abstract specification of the analysis that is next proved semantically sound with respect to a traditional small-step operational semantics; this result holds......) to be defined in a syntax-directed manner, and (iii) to generate a set of constraints that subsequently can be solved by standard techniques. We prove equivalence results between the different versions of the analysis; in particular it follows that the least solution to the constraints generated...