WorldWideScience

Sample records for visual numeric scale

  1. Visualization of numerically simulated aerodynamic flow fields

    International Nuclear Information System (INIS)

    Hian, Q.L.; Damodaran, M.

    1991-01-01

    The focus of this paper is to describe the development and application of an interactive, integrated software system to visualize numerically simulated aerodynamic flow fields, so as to enable the practitioner of computational fluid dynamics to diagnose the numerical simulation and to elucidate essential flow physics from it. The input to the software is the numerical database generated by a supercomputer, typically consisting of flow variables and the computational grid geometry. This flow visualization system (FVS), written in the C language, is targeted at Personal IRIS workstations. In order to demonstrate the various visualization modules, the paper also describes the application of this software to visualize two- and three-dimensional flow fields past aerodynamic configurations which have been numerically simulated on the NEC-SXIA supercomputer. 6 refs

  2. Large-Scale Astrophysical Visualization on Smartphones

    Science.gov (United States)

    Becciani, U.; Massimino, P.; Costa, A.; Gheller, C.; Grillo, A.; Krokos, M.; Petta, C.

    2011-07-01

    Nowadays, digital sky surveys and long-duration, high-resolution numerical simulations using high-performance computing and grid systems produce multidimensional astrophysical datasets on the order of several petabytes. Sharing visualizations of such datasets within communities and collaborating research groups is of paramount importance for disseminating results and advancing astrophysical research. Moreover, educational and public outreach programs can benefit greatly from novel ways of presenting these datasets by promoting understanding of complex astrophysical processes, e.g., the formation of stars and galaxies. We have previously developed VisIVO Server, a grid-enabled platform for high-performance, large-scale astrophysical visualization. This article reviews the latest developments of VisIVO Web, a custom-designed web portal wrapped around VisIVO Server, and then introduces VisIVO Smartphone, a gateway connecting VisIVO Web and data repositories for mobile astrophysical visualization. We discuss current work and summarize future developments.

  3. Visualization techniques in plasma numerical simulations

    International Nuclear Information System (INIS)

    Kulhanek, P.; Smetana, M.

    2004-01-01

    Numerical simulations of plasma processes usually yield a huge amount of raw numerical data, typically information about electric and magnetic fields and about particle positions and velocities. There are two major ways of elaborating these data. The first is plasma diagnostics: we can calculate average values, variances, correlations of variables, etc. These results may be directly comparable with experiments and serve as the typical quantitative output of plasma simulations. The second possibility is plasma visualization. The results are qualitative only, but serve as a vivid display of the phenomena being followed in the plasma. Experience with visualizing electric and magnetic fields via the Line Integral Convolution (LIC) method is described in the first part of the paper. The LIC method visualizes vector fields in a two-dimensional section of the three-dimensional plasma, where the field values are known only at the points of a three-dimensional grid. The second part of the paper is devoted to visualization techniques for charged particle motion. Colour tint can be used to represent particle temperature, and the motion can be visualized by a trace fading away with the distance from the particle. In this manner, impressive animations of the particle motion can be achieved. (author)
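    The Line Integral Convolution step mentioned above can be illustrated with a short sketch. The Python code below is not the authors' implementation; it assumes only a 2D vector field (vx, vy) and a noise texture sampled on the same regular grid, and smears the noise along streamlines of the field so that the output image reveals the field's orientation.

```python
# Minimal Line Integral Convolution (LIC) sketch for a 2D vector field sampled
# on a regular grid (a planar section of the simulated plasma). Illustrative
# only; field components vx, vy and the noise texture share shape (ny, nx).
import numpy as np

def lic(vx, vy, noise, streamline_len=20, step=0.5):
    ny, nx = noise.shape
    out = np.zeros_like(noise)
    for j in range(ny):
        for i in range(nx):
            total, count = 0.0, 0
            # Integrate forward and backward along the local streamline.
            for direction in (+1.0, -1.0):
                x, y = float(i), float(j)
                for _ in range(streamline_len):
                    xi, yi = int(round(x)), int(round(y))
                    if not (0 <= xi < nx and 0 <= yi < ny):
                        break
                    total += noise[yi, xi]
                    count += 1
                    u, v = vx[yi, xi], vy[yi, xi]
                    norm = np.hypot(u, v)
                    if norm == 0.0:
                        break
                    x += direction * step * u / norm
                    y += direction * step * v / norm
            out[j, i] = total / max(count, 1)
    return out

# Example: a circular field convolved with white noise.
ny, nx = 128, 128
yy, xx = np.mgrid[0:ny, 0:nx]
vx, vy = -(yy - ny / 2.0), (xx - nx / 2.0)
image = lic(vx, vy, np.random.rand(ny, nx))
```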

  4. Utility of numerical and visual analog scales for evaluating the post-operative pain in rural patients.

    Science.gov (United States)

    Mudgalkar, Nikhil; Bele, Samir D; Valsangkar, Sameer; Bodhare, Trupti N; Gorre, Mahipal

    2012-11-01

    Visual analog scales (VAS) and numeric analog scales (NAS) are used to assess post-operative pain, but few studies indicate their usefulness in rural, illiterate populations in India. This study was designed to 1) compare the impact of literacy on the ability to indicate pain ratings on the VAS and NAS in post-operative rural patients and 2) assess the level of agreement between the two pain scales. Cross-sectional, hospital-based study. Informed consent was obtained from patients prior to undergoing surgical procedures in a teaching hospital. Post surgery, patients who were conscious and coherent were asked to rate pain on both the VAS and NAS. The pain ratings were obtained within 24 hours of surgery and within 5 minutes of each other. Statistical methods: percentages, chi-square test, regression analysis. A total of 105 patients participated in the study; 43 (41%) of the sample were illiterate. 82 (78.1%) were able to rate pain on the VAS, while 81 (77.1%) were able to rate pain on the NAS. There was no significant association between pain ratings and type of surgery, duration of surgery, or nature of anaesthesia. In multivariate analysis, age, sex, and literacy had no significant association with the ability to rate pain on the VAS (P = 0.652, 0.967, and 0.328, respectively). Similarly, no significant association was found between age, sex, or literacy and the ability to rate pain on the NAS (P = 0.713, 0.405, and 0.875, respectively). The correlation coefficient between the scales was 0.693. The VAS and NAS can be used interchangeably in the Indian rural population as post-operative pain assessment tools, irrespective of literacy status.
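    As a small illustration of the agreement statistic reported above (a correlation coefficient of 0.693 between the two scales), the following sketch computes a Pearson correlation between paired VAS and NAS ratings. The ratings shown are invented for demonstration and are not the study's data.

```python
# Illustration (synthetic data) of computing agreement between paired VAS and
# NAS pain ratings, analogous to the correlation coefficient reported above.
import numpy as np
from scipy import stats

vas = np.array([12, 35, 48, 60, 25, 72, 55, 40, 18, 66])  # 0-100 mm marks
nas = np.array([1, 4, 5, 6, 2, 8, 5, 4, 2, 7])            # 0-10 integers

# Rescale VAS to 0-10 so both scales share a range, then correlate.
r, p = stats.pearsonr(vas / 10.0, nas)
print(f"Pearson r = {r:.3f} (p = {p:.3f})")
```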

  5. Utility of numerical and visual analog scales for evaluating the post-operative pain in rural patients

    Directory of Open Access Journals (Sweden)

    Nikhil Mudgalkar

    2012-01-01

    Background: Visual analog scales (VAS) and numeric analog scales (NAS) are used to assess post-operative pain, but few studies indicate their usefulness in rural, illiterate populations in India. Aims: This study was designed to 1) compare the impact of literacy on the ability to indicate pain ratings on the VAS and NAS in post-operative rural patients and 2) assess the level of agreement between the two pain scales. Setting and Design: Cross-sectional, hospital-based study. Methods: Informed consent was obtained from patients prior to undergoing surgical procedures in a teaching hospital. Post surgery, patients who were conscious and coherent were asked to rate pain on both the VAS and NAS. The pain ratings were obtained within 24 hours of surgery and within 5 minutes of each other. Statistical Methods: Percentages, chi-square test, regression analysis. Results: A total of 105 patients participated in the study; 43 (41%) of the sample were illiterate. 82 (78.1%) were able to rate pain on the VAS, while 81 (77.1%) were able to rate pain on the NAS. There was no significant association between pain ratings and type of surgery, duration of surgery, or nature of anaesthesia. In multivariate analysis, age, sex, and literacy had no significant association with the ability to rate pain on the VAS (P = 0.652, 0.967, and 0.328, respectively). Similarly, no significant association was found between age, sex, or literacy and the ability to rate pain on the NAS (P = 0.713, 0.405, and 0.875, respectively). The correlation coefficient between the scales was 0.693. Conclusion: The VAS and NAS can be used interchangeably in the Indian rural population as post-operative pain assessment tools, irrespective of literacy status.

  6. Experiments at Scale with In-Situ Visualization Using ParaView/Catalyst in RAGE

    Energy Technology Data Exchange (ETDEWEB)

    Kares, Robert John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-10-31

    In this paper I describe some numerical experiments performed using the ParaView/Catalyst in-situ visualization infrastructure deployed in the Los Alamos RAGE radiation-hydrodynamics code to produce images from a running large-scale 3D ICF simulation on the Cielo supercomputer at Los Alamos. The detailed procedures for creating the visualizations with ParaView/Catalyst are discussed, and several image sequences from the ICF simulation problem produced with the in-situ method are presented. My impressions and conclusions concerning the use of the in-situ visualization method in RAGE are discussed.

  7. The Application of Visual Basic Computer Programming Language to Simulate Numerical Iterations

    Directory of Open Access Journals (Sweden)

    Abdulkadir Baba HASSAN

    2006-06-01

    This paper examines the application of the Visual Basic programming language to simulate numerical iterations, the merits of Visual Basic as a programming language, and the difficulties faced when solving numerical iterations analytically. The paper encourages the use of computer programming methods for the execution of numerical iterations and, finally, develops a reliable solution by using Visual Basic to write a program for some selected iteration problems.
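    The abstract does not reproduce the program itself, so the sketch below only illustrates the class of problem such a program automates: a Newton-Raphson root-finding iteration, written here in Python rather than Visual Basic.

```python
# Minimal Newton-Raphson iteration of the kind the paper's Visual Basic program
# automates (the original code is not given in the abstract; this sketch only
# illustrates the class of problem).
def newton(f, df, x0, tol=1e-10, max_iter=50):
    x = x0
    for i in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:          # converged
            return x, i
        x -= fx / df(x)            # Newton update
    return x, max_iter

# Example: root of x^3 - 2x - 5 = 0 near x = 2.
root, iterations = newton(lambda x: x**3 - 2*x - 5,
                          lambda x: 3*x**2 - 2, x0=2.0)
print(f"root ~= {root:.10f} after {iterations} iterations")
```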

  8. Ultra-Scale Visualization: Research and Education

    International Nuclear Information System (INIS)

    Ma, K-L; Ross, Robert; Huang Jian; Humphreys, Greg; Max, Nelson; Moreland, Kenneth; Owens, John D; Shen, H-W

    2007-01-01

    Understanding the science behind large-scale simulations and high-throughput experiments requires extracting meaning from data sets of hundreds of terabytes or more. Visualization is the most intuitive means for scientists to understand data at this scale, and the most effective way to communicate their findings with others. Even though visualization technology has matured over the past twenty years, it is still limited by the extent and scale of the data to which it can be applied, and also by functionalities that were mostly designed for single-user, single-variable, and single-space investigation. The Institute for Ultra-Scale Visualization (IUSV), funded by the DOE SciDAC-2 program, has the mission to advance visualization technologies to enable knowledge discovery and dissemination for petascale applications. By working with the SciDAC application projects, Centers for Enabling Technology, and other Institutes, IUSV aims to lead the research innovation that can create the new visualization capabilities needed for gleaning insights from data at petascale and beyond to solve forefront scientific problems. This paper outlines what we see as some of the biggest research challenges facing the visualization community, and how we can approach education and outreach to put successful research in the hands of scientists.

  9. Can responses to basic non-numerical visual features explain neural numerosity responses?

    Science.gov (United States)

    Harvey, Ben M; Dumoulin, Serge O

    2017-04-01

    Humans and many animals can distinguish between stimuli that differ in numerosity, the number of objects in a set. Human and macaque parietal lobes contain neurons that respond to changes in stimulus numerosity. However, basic non-numerical visual features can affect neural responses to and perception of numerosity, and visual features often co-vary with numerosity. Therefore, it is debated whether numerosity or co-varying low-level visual features underlie neural and behavioral responses to numerosity. To test the hypothesis that non-numerical visual features underlie neural numerosity responses in a human parietal numerosity map, we analyze responses to a group of numerosity stimulus configurations that have the same numerosity progression but vary considerably in their non-numerical visual features. Using ultra-high-field (7T) fMRI, we measure responses to these stimulus configurations in an area of posterior parietal cortex whose responses are believed to reflect numerosity-selective activity. We describe an fMRI analysis method to distinguish between alternative models of neural response functions, following a population receptive field (pRF) modeling approach. For each stimulus configuration, we first quantify the relationships between numerosity and several non-numerical visual features that have been proposed to underlie performance in numerosity discrimination tasks. We then determine how well responses to these non-numerical visual features predict the observed fMRI responses, and compare this to the predictions of responses to numerosity. We demonstrate that a numerosity response model predicts observed responses more accurately than models of responses to simple non-numerical visual features. As such, neural responses in cognitive processing need not reflect simpler properties of early sensory inputs. Copyright © 2017 Elsevier Inc. All rights reserved.
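    The model-comparison logic described above can be sketched schematically. The code below is not the authors' pRF analysis; it simulates a response driven by numerosity, builds a co-varying non-numerical feature (total contour length is assumed here purely for illustration), and compares how much variance each candidate predictor explains.

```python
# Schematic model comparison: which candidate feature best predicts a response?
import numpy as np

def variance_explained(predicted, observed):
    resid = observed - predicted
    return 1.0 - resid.var() / observed.var()

rng = np.random.default_rng(0)
n = 200
numerosity = rng.integers(1, 8, n).astype(float)
# A non-numerical feature that co-varies with numerosity (purely illustrative).
contour_length = 3.0 * numerosity + rng.normal(0.0, 2.0, n)
# Simulated voxel response driven by numerosity, for demonstration only.
observed = 0.8 * numerosity + rng.normal(0.0, 0.5, n)

for name, feature in [("numerosity", numerosity), ("contour length", contour_length)]:
    design = np.c_[feature, np.ones_like(feature)]        # linear model with intercept
    beta, *_ = np.linalg.lstsq(design, observed, rcond=None)
    predicted = design @ beta
    print(f"{name:>15}: R^2 = {variance_explained(predicted, observed):.2f}")
```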

  10. Earth History databases and visualization - the TimeScale Creator system

    Science.gov (United States)

    Ogg, James; Lugowski, Adam; Gradstein, Felix

    2010-05-01

    The "TimeScale Creator" team (www.tscreator.org) and the Subcommission on Stratigraphic Information (stratigraphy.science.purdue.edu) of the International Commission on Stratigraphy (www.stratigraphy.org) has worked with numerous geoscientists and geological surveys to prepare reference datasets for global and regional stratigraphy. All events are currently calibrated to Geologic Time Scale 2004 (Gradstein et al., 2004, Cambridge Univ. Press) and Concise Geologic Time Scale (Ogg et al., 2008, Cambridge Univ. Press); but the array of intercalibrations enable dynamic adjustment to future numerical age scales and interpolation methods. The main "global" database contains over 25,000 events/zones from paleontology, geomagnetics, sea-level and sequence stratigraphy, igneous provinces, bolide impacts, plus several stable isotope curves and image sets. Several regional datasets are provided in conjunction with geological surveys, with numerical ages interpolated using a similar flexible inter-calibration procedure. For example, a joint program with Geoscience Australia has compiled an extensive Australian regional biostratigraphy and a full array of basin lithologic columns with each formation linked to public lexicons of all Proterozoic through Phanerozoic basins - nearly 500 columns of over 9,000 data lines plus hot-curser links to oil-gas reference wells. Other datapacks include New Zealand biostratigraphy and basin transects (ca. 200 columns), Russian biostratigraphy, British Isles regional stratigraphy, Gulf of Mexico biostratigraphy and lithostratigraphy, high-resolution Neogene stable isotope curves and ice-core data, human cultural episodes, and Circum-Arctic stratigraphy sets. The growing library of datasets is designed for viewing and chart-making in the free "TimeScale Creator" JAVA package. This visualization system produces a screen display of the user-selected time-span and the selected columns of geologic time scale information. The user can change the

  11. Preferred Presentation of the Visual Analog Scale for Measurement of Postoperative Pain

    DEFF Research Database (Denmark)

    Kjeldsen, Helle Birgitte; Klausen, Tobias Wirenfeldt; Rosenberg, Jacob

    2015-01-01

    BACKGROUND: The aim of this study was to evaluate differences in pain scores with different visual analog scale (VAS) presentations and to compare those differences with a numeric rating scale. We also asked the patients for their preference among the different methods. METHODS: Prior to the trial, we ... performed power calculations to estimate a preferred sample size, and 62 postoperative patients supplied a complete set of data to the study. Inclusion criteria were newly operated patients within the first 5 days after surgery. Every patient included was, with 1-minute intervals, presented with one ... of the following 100-mm VAS lines: VAS horizontal with or without stop lines at the endings, or VAS vertical with or without stop lines. They also completed a numeric rating scale (NRS). RESULTS: We did not find differences in pain scores between the four VAS measures. The NRS had slightly higher pain scores than ...

  12. Can stroke patients use visual analogue scales?

    Science.gov (United States)

    Price, C I; Curless, R H; Rodgers, H

    1999-07-01

    Visual analogue scales (VAS) have been used for the subjective measurement of mood, pain, and health status after stroke. In this study we investigated how stroke-related impairments could alter the ability of subjects to answer accurately. Consent was obtained from 96 subjects with a clinical stroke (mean age, 72.5 years; 50 men) and 48 control subjects without cerebrovascular disease (mean age, 71.5 years; 29 men). Patients with reduced conscious level or severe dysphasia were excluded. Subjects were asked to rate the tightness that they could feel on the (unaffected) upper arm after 3 low-pressure inflations with a standard sphygmomanometer cuff, which followed a predetermined sequence (20 mm Hg, 40 mm Hg, 0 mm Hg). Immediately after each change, they rated the perceived tightness on 5 scales presented in a random order: 4-point rating scale (none, mild, moderate, severe), 0 to 10 numerical rating scale, mechanical VAS, horizontal VAS, and vertical VAS. Standard tests recorded deficits in language, cognition, and visuospatial awareness. Inability to complete scales with the correct pattern was associated with any stroke (P<0.001). There was a significant association between success using scales and milder clinical stroke subtype (P<0.01). Within the stroke group, logistic regression analysis identified significant associations (P<0.05) between impairments (cognitive and visuospatial) and inability to complete individual scales correctly. Many patients after a stroke are unable to successfully complete self-report measurement scales, including VAS.

  13. Selecting numerical scales for pairwise comparisons

    International Nuclear Information System (INIS)

    Elliott, Michael A.

    2010-01-01

    It is often desirable in decision analysis problems to elicit from an individual the rankings of a population of attributes according to the individual's preference and to understand the degree to which each attribute is preferred to the others. A common method for obtaining this information involves the use of pairwise comparisons, which allows an analyst to convert subjective expressions of preference between two attributes into numerical values indicating preferences across the entire population of attributes. Key to the use of pairwise comparisons is the underlying numerical scale that is used to convert subjective linguistic expressions of preference into numerical values. This scale represents the psychological manner in which individuals perceive increments of preference among abstract attributes and it has important implications about the distribution and consistency of an individual's preferences. Three popular scale types, the traditional integer scales, balanced scales and power scales are examined. Results of a study of 64 individuals responding to a hypothetical decision problem show that none of these scales can accurately capture the preferences of all individuals. A study of three individuals working on an actual engineering decision problem involving the design of a decay heat removal system for a nuclear fission reactor show that the choice of scale can affect the preferred decision. It is concluded that applications of pairwise comparisons would benefit from permitting participants to choose the scale that best models their own particular way of thinking about the relative preference of attributes.
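    A brief sketch may help make the scale families concrete. The mappings below are common formulations from the pairwise-comparison literature (a 1-9 integer scale, a balanced scale of the form w/(1-w), and a geometric "power" scale); the paper's exact definitions may differ, and the eigenvector-based weighting shown is the standard prioritization used with reciprocal comparison matrices rather than anything specific to this study.

```python
# Three numerical encodings of a linguistic preference grade s in 1..9, plus a
# principal-eigenvector prioritization of a reciprocal comparison matrix.
# Formulations are common ones from the literature, not necessarily the paper's.
import numpy as np

def integer_scale(s):              # traditional 1..9 scale
    return float(s)

def balanced_scale(s):             # maps grades to w/(1-w), w = 0.5 .. 0.9
    w = 0.45 + 0.05 * s
    return w / (1.0 - w)

def power_scale(s, base=9.0):      # geometric spacing from 1 to `base`
    return base ** ((s - 1) / 8.0)

def priority_weights(A):
    """Principal right eigenvector of a reciprocal comparison matrix, normalized."""
    vals, vecs = np.linalg.eig(A)
    v = np.real(vecs[:, np.argmax(np.real(vals))])
    return v / v.sum()

# Example: three attributes compared with grades 3 and 5, encoded with the
# balanced scale (swap in integer_scale or power_scale to compare outcomes).
g = balanced_scale
A = np.array([[1.0,      g(3),     g(5)],
              [1 / g(3), 1.0,      g(3)],
              [1 / g(5), 1 / g(3), 1.0]])
print(priority_weights(A))
```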

  14. Generating descriptive visual words and visual phrases for large-scale image applications.

    Science.gov (United States)

    Zhang, Shiliang; Tian, Qi; Hua, Gang; Huang, Qingming; Gao, Wen

    2011-09-01

    The bag-of-visual-words (BoWs) representation has been applied to various problems in the fields of multimedia and computer vision. The basic idea is to represent images as visual documents composed of repeatable and distinctive visual elements, which are comparable to text words. Notwithstanding its great success and wide adoption, a visual vocabulary created from single-image local descriptors is often not as effective as desired. In this paper, descriptive visual words (DVWs) and descriptive visual phrases (DVPs) are proposed as the visual correspondences to text words and phrases, where visual phrases refer to frequently co-occurring visual word pairs. Since images are the carriers of visual objects and scenes, a descriptive visual element set can be composed of the visual words and their combinations which are effective in representing certain visual objects or scenes. Based on this idea, a general framework is proposed for generating DVWs and DVPs for image applications. In a large-scale image database containing 1506 object and scene categories, the visual words and visual word pairs descriptive of certain objects or scenes are identified and collected as the DVWs and DVPs. Experiments show that the DVWs and DVPs are informative and descriptive and, thus, are more comparable with text words than the classic visual words. We apply the identified DVWs and DVPs in several applications, including large-scale near-duplicated image retrieval, image search re-ranking, and object recognition. The combination of DVW and DVP performs better than the state of the art in large-scale near-duplicated image retrieval in terms of accuracy, efficiency, and memory consumption. The proposed image search re-ranking algorithm, DWPRank, outperforms the state-of-the-art algorithm by 12.4% in mean average precision and is about 11 times faster.
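    The descriptive-visual-phrase idea, frequently co-occurring visual word pairs, can be sketched as a simple co-occurrence count. The data layout below (each image as a list of (word_id, x, y) tuples) and the distance threshold are assumptions for illustration, not the authors' pipeline.

```python
# Count visual-word pairs that co-occur within a spatial neighborhood across
# images, keeping the most frequent pairs as candidate "visual phrases".
from collections import Counter
from itertools import combinations
import math

def candidate_phrases(images, radius=30.0, top_k=100):
    pair_counts = Counter()
    for words in images:                       # words: list of (word_id, x, y)
        for (w1, x1, y1), (w2, x2, y2) in combinations(words, 2):
            if math.hypot(x1 - x2, y1 - y2) <= radius:
                pair_counts[tuple(sorted((w1, w2)))] += 1
    return pair_counts.most_common(top_k)

# Toy example with two "images" and hypothetical visual-word IDs.
images = [[(3, 10, 12), (7, 20, 18), (3, 300, 40)],
          [(3, 50, 60), (7, 55, 75), (9, 200, 10)]]
print(candidate_phrases(images, top_k=5))
```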

  15. Absence of visual experience modifies the neural basis of numerical thinking.

    Science.gov (United States)

    Kanjlia, Shipra; Lane, Connor; Feigenson, Lisa; Bedny, Marina

    2016-10-04

    In humans, the ability to reason about mathematical quantities depends on a frontoparietal network that includes the intraparietal sulcus (IPS). How do nature and nurture give rise to the neurobiology of numerical cognition? We asked how visual experience shapes the neural basis of numerical thinking by studying numerical cognition in congenitally blind individuals. Blind (n = 17) and blindfolded sighted (n = 19) participants solved math equations that varied in difficulty (e.g., 27 - 12 = x vs. 7 - 2 = x), and performed a control sentence comprehension task while undergoing fMRI. Whole-cortex analyses revealed that in both blind and sighted participants, the IPS and dorsolateral prefrontal cortices were more active during the math task than the language task, and activity in the IPS increased parametrically with equation difficulty. Thus, the classic frontoparietal number network is preserved in the total absence of visual experience. However, surprisingly, blind but not sighted individuals additionally recruited a subset of early visual areas during symbolic math calculation. The functional profile of these "visual" regions was identical to that of the IPS in blind but not sighted individuals. Furthermore, in blindness, number-responsive visual cortices exhibited increased functional connectivity with prefrontal and IPS regions that process numbers. We conclude that the frontoparietal number network develops independently of visual experience. In blindness, this number network colonizes parts of deafferented visual cortex. These results suggest that human cortex is highly functionally flexible early in life, and point to frontoparietal input as a mechanism of cross-modal plasticity in blindness.

  16. The visual communication in the optometric scales.

    Science.gov (United States)

    Dantas, Rosane Arruda; Pagliuca, Lorita Marlena Freitag

    2006-01-01

    Communication through vision involves visual learning, which demands ocular integrity; hence the importance of evaluating visual acuity. The scale of images, formed by optotypes, is a method for verifying visual acuity in kindergarten children. To identify an optotype, the child needs to know the image under analysis. Given the importance of visual communication in the process of constructing the scale of images, this bibliographic, analytical study reflects on the principles for the construction of those charts. The drawing inserted as an optotype is considered a non-verbal symbolic expression of the body and/or of the environment, constructed from the experiences captured by the individual. The indiscriminate use of images is contested, since prior knowledge of the image is required. Despite the subjectivity of the optotypes, the scales remain valid if the images are adapted to the universe of the children to be examined.

  17. Frameworks for visualization at the extreme scale

    International Nuclear Information System (INIS)

    Joy, Kenneth I; Miller, Mark; Childs, Hank; Bethel, E Wes; Clyne, John; Ostrouchov, George; Ahern, Sean

    2007-01-01

    The challenges of visualization at the extreme scale involve issues of scale, complexity, temporal exploration, and uncertainty. The Visualization and Analytics Center for Enabling Technologies (VACET) focuses on leveraging scientific visualization and analytics software technology as an enabling technology for increased scientific discovery and insight. In this paper, we introduce new uses of visualization frameworks through the introduction of Equivalence Class Functions (ECFs). These functions give a new class of derived quantities designed to greatly expand the ability of the end user to explore and visualize data. ECFs are defined over equivalence classes (i.e., groupings) of elements from an original mesh and produce summary values for the classes as output. ECFs can be used in the visualization process to directly analyze data, or can be used to synthesize new derived quantities on the original mesh. The design of ECFs enables a parallel implementation that allows the use of these techniques on massive data sets that require parallel processing.

  18. Visual coherence for large-scale line-plot visualizations

    KAUST Repository

    Muigg, Philipp

    2011-06-01

    Displaying a large number of lines within a limited amount of screen space is a task that is common to many different classes of visualization techniques such as time-series visualizations, parallel coordinates, link-node diagrams, and phase-space diagrams. This paper addresses the challenging problems of cluttering and overdraw inherent to such visualizations. We generate a 2x2 tensor field during line rasterization that encodes the distribution of line orientations through each image pixel. Anisotropic diffusion of a noise texture is then used to generate a dense, coherent visualization of line orientation. In order to represent features of different scales, we employ a multi-resolution representation of the tensor field. The resulting technique can easily be applied to a wide variety of line-based visualizations. We demonstrate this for parallel coordinates, a time-series visualization, and a phase-space diagram. Furthermore, we demonstrate how to integrate a focus+context approach by incorporating a second tensor field. Our approach achieves interactive rendering performance for large data sets containing millions of data items, due to its image-based nature and ease of implementation on GPUs. Simulation results from computational fluid dynamics are used to evaluate the performance and usefulness of the proposed method. © 2011 The Author(s).
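    The per-pixel 2x2 orientation tensor described above can be sketched on the CPU: while rasterizing each line segment, accumulate the outer product of its unit direction into every pixel the segment touches. This is a simplified illustration, not the paper's GPU implementation.

```python
# Accumulate a 2x2 orientation tensor per pixel during naive line rasterization,
# so the resulting field encodes the distribution of line orientations through
# each image pixel (simplified CPU sketch).
import numpy as np

def accumulate_segment(T, x0, y0, x1, y1):
    d = np.array([x1 - x0, y1 - y0], dtype=float)
    n = np.linalg.norm(d)
    if n == 0:
        return
    d /= n
    outer = np.outer(d, d)                      # 2x2, sign-free orientation
    steps = int(np.ceil(n))
    for t in np.linspace(0.0, 1.0, steps + 1):  # naive rasterization
        px = int(round(x0 + t * (x1 - x0)))
        py = int(round(y0 + t * (y1 - y0)))
        if 0 <= px < T.shape[1] and 0 <= py < T.shape[0]:
            T[py, px] += outer

T = np.zeros((256, 256, 2, 2))                  # per-pixel 2x2 tensor field
accumulate_segment(T, 10, 10, 200, 120)
accumulate_segment(T, 10, 240, 200, 130)
# The dominant orientation per pixel is the leading eigenvector of T[y, x],
# which would then steer the anisotropic diffusion of the noise texture.
```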

  19. Visual coherence for large-scale line-plot visualizations

    KAUST Repository

    Muigg, Philipp; Hadwiger, Markus; Doleisch, Helmut; Gröller, Eduard M.

    2011-01-01

    Displaying a large number of lines within a limited amount of screen space is a task that is common to many different classes of visualization techniques such as time-series visualizations, parallel coordinates, link-node diagrams, and phase-space diagrams. This paper addresses the challenging problems of cluttering and overdraw inherent to such visualizations. We generate a 2x2 tensor field during line rasterization that encodes the distribution of line orientations through each image pixel. Anisotropic diffusion of a noise texture is then used to generate a dense, coherent visualization of line orientation. In order to represent features of different scales, we employ a multi-resolution representation of the tensor field. The resulting technique can easily be applied to a wide variety of line-based visualizations. We demonstrate this for parallel coordinates, a time-series visualization, and a phase-space diagram. Furthermore, we demonstrate how to integrate a focus+context approach by incorporating a second tensor field. Our approach achieves interactive rendering performance for large data sets containing millions of data items, due to its image-based nature and ease of implementation on GPUs. Simulation results from computational fluid dynamics are used to evaluate the performance and usefulness of the proposed method. © 2011 The Author(s).

  20. Web-based Visual Analytics for Extreme Scale Climate Science

    Energy Technology Data Exchange (ETDEWEB)

    Steed, Chad A [ORNL; Evans, Katherine J [ORNL; Harney, John F [ORNL; Jewell, Brian C [ORNL; Shipman, Galen M [ORNL; Smith, Brian E [ORNL; Thornton, Peter E [ORNL; Williams, Dean N. [Lawrence Livermore National Laboratory (LLNL)

    2014-01-01

    In this paper, we introduce a Web-based visual analytics framework for democratizing advanced visualization and analysis capabilities pertinent to large-scale earth system simulations. We address significant limitations of present climate data analysis tools such as tightly coupled dependencies, inefficient data movements, complex user interfaces, and static visualizations. Our Web-based visual analytics framework removes critical barriers to the widespread accessibility and adoption of advanced scientific techniques. Using distributed connections to back-end diagnostics, we minimize data movements and leverage HPC platforms. We also mitigate system dependency issues by employing a RESTful interface. Our framework embraces the visual analytics paradigm via new visual navigation techniques for hierarchical parameter spaces, multi-scale representations, and interactive spatio-temporal data mining methods that retain details. Although generalizable to other science domains, the current work focuses on improving exploratory analysis of large-scale Community Land Model (CLM) and Community Atmosphere Model (CAM) simulations.

  1. 3D visualization of numeric planetary data using JMARS

    Science.gov (United States)

    Dickenshied, S.; Christensen, P. R.; Anwar, S.; Carter, S.; Hagee, W.; Noss, D.

    2013-12-01

    JMARS (Java Mission-planning and Analysis for Remote Sensing) is a free geospatial application developed by the Mars Space Flight Facility at Arizona State University. Originally written as a mission planning tool for the THEMIS instrument on board the Mars Odyssey spacecraft, it was released as an analysis tool to the general public in 2003. Since then it has expanded to be used for mission planning and scientific data analysis by additional NASA missions to Mars, the Moon, and Vesta, and it has come to be used by scientists, researchers and students of all ages from more than 40 countries around the world. The public version of JMARS now also includes remote sensing data for Mercury, Venus, Earth, the Moon, Mars, and a number of the moons of Jupiter and Saturn. Additional datasets for asteroids and other smaller bodies are being added as they become available and time permits. In addition to visualizing multiple datasets in context with one another, significant effort has been put into on-the-fly projection of georegistered data over surface topography. This functionality allows a user to easily create and modify 3D visualizations of any regional scene where elevation data is available in JMARS. This can be accomplished through the use of global topographic maps or regional numeric data such as HiRISE or HRSC DTMs. Users can also upload their own regional or global topographic dataset and use it as an elevation source for 3D rendering of their scene. The 3D Layer in JMARS allows the user to exaggerate the z-scale of any elevation source to emphasize the vertical variance throughout a scene. In addition, the user can rotate, tilt, and zoom the scene to any desired angle and then illuminate it with an artificial light source. The scene can be easily overlaid with additional JMARS datasets such as maps, images, shapefiles, contour lines, or scale bars, and can be saved as a graphic image for use in presentations or publications.

  2. Human visual system automatically represents large-scale sequential regularities.

    Science.gov (United States)

    Kimura, Motohiro; Widmann, Andreas; Schröger, Erich

    2010-03-04

    Our brain recordings reveal that large-scale sequential regularities defined across non-adjacent stimuli can be automatically represented in visual sensory memory. To show this, we adapted to the visual domain an auditory paradigm developed by Sussman, Ritter, and Vaughan (1998, NeuroReport, 9, 4167-4170) and Sussman and Gumenyuk (2005, NeuroReport, 16, 1519-1523), presenting task-irrelevant, infrequent luminance-deviant stimuli (D, 20%) inserted among task-irrelevant frequent stimuli of standard luminance (S, 80%) in randomized (randomized condition, SSSDSSSSSDSSSSD...) and fixed manners (fixed condition, SSSSDSSSSDSSSSD...). Comparing the visual mismatch negativity (visual MMN), an event-related brain potential (ERP) index of memory-mismatch processes in the human visual sensory system, revealed that the visual MMN elicited by deviant stimuli was reduced in the fixed compared to the randomized condition. Thus, the large-scale sequential regularity present in the fixed condition (SSSSD) must have been represented in visual sensory memory. Interestingly, this effect did not occur in conditions with stimulus-onset asynchronies (SOAs) of 480 and 800 ms but was confined to the 160-ms SOA condition, supporting the hypothesis that large-scale regularity extraction was based on perceptual grouping of the five successive stimuli defining the regularity. 2010 Elsevier B.V. All rights reserved.

  3. GPU-based large-scale visualization

    KAUST Repository

    Hadwiger, Markus

    2013-11-19

    Recent advances in image and volume acquisition as well as computational advances in simulation have led to an explosion of the amount of data that must be visualized and analyzed. Modern techniques combine the parallel processing power of GPUs with out-of-core methods and data streaming to enable the interactive visualization of giga- and terabytes of image and volume data. A major enabler for interactivity is making both the computational and the visualization effort proportional to the amount of data that is actually visible on screen, decoupling it from the full data size. This leads to powerful display-aware multi-resolution techniques that enable the visualization of data of almost arbitrary size. The course consists of two major parts: An introductory part that progresses from fundamentals to modern techniques, and a more advanced part that discusses details of ray-guided volume rendering, novel data structures for display-aware visualization and processing, and the remote visualization of large online data collections. You will learn how to develop efficient GPU data structures and large-scale visualizations, implement out-of-core strategies and concepts such as virtual texturing that have only been employed recently, as well as how to use modern multi-resolution representations. These approaches reduce the GPU memory requirements of extremely large data to a working set size that fits into current GPUs. You will learn how to perform ray-casting of volume data of almost arbitrary size and how to render and process gigapixel images using scalable, display-aware techniques. We will describe custom virtual texturing architectures as well as recent hardware developments in this area. We will also describe client/server systems for distributed visualization, on-demand data processing and streaming, and remote visualization. We will describe implementations using OpenGL as well as CUDA, exploiting parallelism on GPUs combined with additional asynchronous

  4. Numerical study on visualization method for material distribution using photothermal effect

    International Nuclear Information System (INIS)

    Kim, Moo Joong; Yoo, Jai Suk; Kim, Dong Kwon; Kim, Hyun Jung

    2015-01-01

    Visualization and imaging techniques have become increasingly essential in a wide range of industrial fields. A few imaging methods such as X-ray imaging, computed tomography and magnetic resonance imaging have been developed for medical applications to materials that are basically transparent or X-ray penetrable; however, reliable techniques for optically opaque materials such as semiconductors or metallic circuits have not been suggested yet. The photothermal method has been developed mainly for the measurement of thermal properties using characteristics that exhibit photothermal effects depending on the thermal properties of the materials. This study attempts to numerically investigate the feasibility of using photothermal effects to visualize or measure the material distribution of opaque substances. For this purpose, we conducted numerical analyses of various intaglio patterns with approximate sizes of 1.2-6 mm in stainless steel 0.5 mm below copper. In addition, images of the intaglio patterns in stainless steel were reconstructed by two-dimensional numerical scanning. A quantitative comparison of the reconstructed results and the original geometries showed an average difference of 0.172 mm and demonstrated the possibility of application to experimental imaging.

  5. Selective visual scaling of time-scale processes facilitates broadband learning of isometric force frequency tracking.

    Science.gov (United States)

    King, Adam C; Newell, Karl M

    2015-10-01

    The experiment investigated the effect of selectively augmenting faster time scales of visual feedback information on the learning and transfer of continuous isometric force tracking tasks, to test the generality of the self-organization of 1/f properties of force output. Three experimental groups tracked an irregular target pattern either under a standard fixed-gain condition or with selective enhancement, in the visual feedback display, of intermediate (4-8 Hz) or high (8-12 Hz) frequency components of the force output. All groups reduced tracking error over practice, with the error lowest in the intermediate scaling condition followed by the high scaling and fixed-gain conditions, respectively. Selective visual scaling induced persistent changes across the frequency spectrum, with the strongest effect in the intermediate scaling condition and positive transfer to novel feedback displays. The findings reveal an interdependence of the time scales in the learning and transfer of isometric force output frequency structures, consistent with 1/f process models of the time scales of motor output variability.
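    The selective-scaling manipulation can be sketched as band-limited amplification of the force signal before it is displayed. The gain, filter order, and sampling rate below are illustrative assumptions, not the study's parameters.

```python
# Amplify one frequency band (e.g., 4-8 Hz) of a force signal before it is drawn
# as visual feedback, leaving the rest of the spectrum unchanged (sketch only).
import numpy as np
from scipy.signal import butter, filtfilt

def scale_band(force, fs, low, high, gain=2.0, order=4):
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    band = filtfilt(b, a, force)           # zero-phase band-pass component
    return force + (gain - 1.0) * band     # boost only the selected band

fs = 100.0                                 # assumed display/sample rate in Hz
t = np.arange(0, 10, 1 / fs)
force = np.sin(2 * np.pi * 1.5 * t) + 0.2 * np.sin(2 * np.pi * 6.0 * t)
feedback = scale_band(force, fs, low=4.0, high=8.0)
```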

  6. A Computer-Based Visual Analog Scale,

    Science.gov (United States)

    1992-06-01

    ... keys on the computer keyboard or other input device. The initial position of the arrow is always in the center of the scale to prevent biasing the response.

  7. A new Snellen's visual acuity chart with 'Indian' numerals.

    OpenAIRE

    Al-Salem, M

    1987-01-01

    'Indian' numerals, which are popular among the Arab population, were used to devise a new Snellen's visual acuity chart. The new chart has the advantages of a reading chart. It keeps the patient's interest, does not miss alexic patients, and is quicker to perform. It is also devoid of the many disadvantages of a kinetic response chart (the capital E letter or Landolt's broken rings), especially that of the limited option of test objects.

  8. Validity and reliability of the Rosenberg Self-Esteem Scale-Thai version as compared to the Self-Esteem Visual Analog Scale.

    Science.gov (United States)

    Piyavhatkul, Nawanant; Aroonpongpaisal, Suwanna; Patjanasoontorn, Niramol; Rongbutsri, Somchit; Maneeganondh, Somchit; Pimpanit, Wijitra

    2011-07-01

    To compare the validity and reliability of the Thai version of the Rosenberg Self-Esteem Scale with the Self-Esteem Visual Analog Scale. The Rosenberg Self-Esteem Scale was translated into Thai and its content validity checked by back-translation. The reliability of the Rosenberg Self-Esteem Scale compared with the Self-Esteem Visual Analog Scale was then tested between February and March 2008 on 270 volunteers, including 135 patients with psychiatric illness and 135 normal volunteers. The authors analyzed the internal consistency and factor structure of the Rosenberg Self-Esteem Scale-Thai version and the correlation between it and the Visual Analog Scale. The Cronbach's alpha for the Rosenberg Self-Esteem Scale-Thai version was 0.849 and the Pearson's correlation between it and the Self-Esteem Visual Analog Scale was 0.618 (p = 0.01). Two factors, viz., the positively and negatively framed items, from the Rosenberg Self-Esteem Scale-Thai version accounted for 44.04% and 12.10% of the variance, respectively. The Rosenberg Self-Esteem Scale-Thai version has acceptable reliability. The Self-Esteem Visual Analog Scale provides an effective measure of self-esteem.
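    The two statistics reported above can be reproduced on synthetic data with a short sketch: Cronbach's alpha over the scale items and a Pearson correlation against the visual analog score. The item responses below are simulated, not the study's data.

```python
# Cronbach's alpha over a multi-item scale plus a Pearson correlation against a
# criterion measure (here a visual analog score). Data are synthetic.
import numpy as np
from scipy import stats

def cronbach_alpha(items):
    """items: array of shape (n_respondents, n_items)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(size=270)                                    # 270 respondents
items = np.clip(np.round(2.5 + latent[:, None]
                         + rng.normal(0, 0.8, (270, 10))), 1, 4)  # 10 items, 1-4 scale
vas = 50 + 12 * latent + rng.normal(0, 10, 270)                   # analog score

print("alpha =", round(cronbach_alpha(items), 3))
print("r =", round(stats.pearsonr(items.sum(axis=1), vas)[0], 3))
```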

  9. Large-scale visualization system for grid environment

    International Nuclear Information System (INIS)

    Suzuki, Yoshio

    2007-01-01

    The Center for Computational Science and E-systems of the Japan Atomic Energy Agency (CCSE/JAEA) has been conducting R&D on distributed (grid) computing environments: Seamless Thinking Aid (STA), Information Technology Based Laboratory (ITBL), and Atomic Energy Grid InfraStructure (AEGIS). In this R&D, we have developed visualization technology suitable for distributed computing environments. As one of the visualization tools, we have developed the Parallel Support Toolkit (PST), which can execute the visualization process in parallel on a computer. We have now improved PST so that it can execute simultaneously on multiple heterogeneous computers using the Seamless Thinking Aid Message Passing Interface (STAMPI). STAMPI, which we developed in this R&D, is an MPI library executable on a heterogeneous computing environment. The improvement realizes the visualization of extremely large-scale data and enables more efficient visualization processes in a distributed computing environment. (author)

  10. EXTENDED SCALING LAWS IN NUMERICAL SIMULATIONS OF MAGNETOHYDRODYNAMIC TURBULENCE

    International Nuclear Information System (INIS)

    Mason, Joanne; Cattaneo, Fausto; Perez, Jean Carlos; Boldyrev, Stanislav

    2011-01-01

    Magnetized turbulence is ubiquitous in astrophysical systems, where it notoriously spans a broad range of spatial scales. Phenomenological theories of MHD turbulence describe the self-similar dynamics of turbulent fluctuations in the inertial range of scales. Numerical simulations serve to guide and test these theories. However, the computational power that is currently available restricts the simulations to Reynolds numbers that are significantly smaller than those in astrophysical settings. In order to increase computational efficiency and, therefore, probe a larger range of scales, one often takes into account the fundamental anisotropy of field-guided MHD turbulence, with gradients being much slower in the field-parallel direction. The simulations are then optimized by employing the reduced MHD equations and relaxing the field-parallel numerical resolution. In this work we explore a different possibility. We propose that there exist certain quantities that are remarkably stable with respect to the Reynolds number. As an illustration, we study the alignment angle between the magnetic and velocity fluctuations in MHD turbulence, measured as the ratio of two specially constructed structure functions. We find that the scaling of this ratio can be extended surprisingly well into the regime of relatively low Reynolds number. However, the extended scaling easily becomes spoiled when the dissipation range in the simulations is underresolved. Thus, taking the numerical optimization methods too far can lead to spurious numerical effects and erroneous representation of the physics of MHD turbulence, which in turn can affect our ability to identify correctly the physical mechanisms that are operating in astrophysical systems.
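    The alignment-angle measurement described above can be sketched as the ratio of two structure functions built from velocity and magnetic-field increments. The specific form below (mean |δv × δb| over mean |δv||δb|, approximating sin θ at each scale) follows common usage in the MHD-turbulence literature and may differ in detail from the paper's construction; the fields are synthetic.

```python
# Alignment angle at a given separation, measured as a ratio of two structure
# functions of velocity and magnetic-field increments (illustrative sketch).
import numpy as np

def alignment_angle(v, b, shift):
    """v, b: arrays of shape (N, 3) sampled along a line; shift: lag in samples."""
    dv = v[shift:] - v[:-shift]
    db = b[shift:] - b[:-shift]
    cross = np.linalg.norm(np.cross(dv, db), axis=1)
    mags = np.linalg.norm(dv, axis=1) * np.linalg.norm(db, axis=1)
    return cross.mean() / mags.mean()      # ~ sin(theta) at this scale

# Synthetic, partially aligned random fluctuations.
rng = np.random.default_rng(1)
v = rng.normal(size=(4096, 3)).cumsum(axis=0)
b = 0.7 * v + 0.3 * rng.normal(size=(4096, 3)).cumsum(axis=0)
for lag in (4, 16, 64):
    print(lag, alignment_angle(v, b, lag))
```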

  11. Adaptive visualization for large-scale graph

    International Nuclear Information System (INIS)

    Nakamura, Hiroko; Shinano, Yuji; Ohzahata, Satoshi

    2010-01-01

    We propose an adaptive visualization technique for representing a large-scale hierarchical dataset within limited display space. A hierarchical dataset has nodes and links showing the parent-child relationships between the nodes. These nodes and links are described using graphics primitives. When the number of these primitives is large, it is difficult to recognize the structure of the hierarchical data because many primitives overlap within a limited region. To overcome this difficulty, we propose an adaptive visualization technique for hierarchical datasets. The proposed technique selects an appropriate graph style according to the nodal density in each area. (author)

  12. A novel visual facial anxiety scale for assessing preoperative anxiety.

    Directory of Open Access Journals (Sweden)

    Xuezhao Cao

    There is currently no widely accepted instrument for measuring preoperative anxiety. The objective of this study was to develop a simple visual facial anxiety scale (VFAS) for assessing acute preoperative anxiety. The initial VFAS was comprised of 11 similarly styled stick-figure faces reflecting different types of facial expressions (Fig 1). After obtaining IRB approval, a total of 265 participant healthcare providers (e.g., anesthesiologists, anesthesiology residents, and perioperative nurses) were recruited to participate in this study. The participants were asked to (1) rank the 11 faces from 0-10 (0 = no anxiety, 10 = highest anxiety) and then to (2) match one of the 11 facial expressions with a numeric verbal rating scale (NVRS; 0 = no anxiety, 10 = highest level of anxiety) and a specific categorical level of anxiety, namely no anxiety, mild, mild-moderate, moderate, moderate-high or highest anxiety. Based on these data, the Spearman correlation and frequencies of the 11 faces in relation to the 11-point numerical anxiety scale and 6 categorical anxiety levels were calculated. The highest frequency of a face assigned to a level of the numerical anxiety scale resulted in a finalized order of faces corresponding to the 11-point numeric rating scale. The highest frequency for each of the NVRS anxiety scores was as follows: A0, A1, A2, A3, A4, A5, A7, A6, A8, A9 and A10 (Fig 2). For the six categorical anxiety levels, a total of 260 (98.1%) participants chose the face A0 as representing 'no' anxiety, 250 (94.3%) participants chose the face A10 as representing 'highest' anxiety, and 147 (55.5%) participants chose the face A8 as representing 'moderate-high' anxiety. Spearman analysis showed a significant correlation between the faces A3 and A5 assigned to the mild-moderate anxiety category (r = 0.58), but A5 was ultimately chosen due to its higher frequency compared to the frequency of A3 (30.6% vs 24.9%) (Fig 3). Similarly, the correlation of the faces A7

  13. A Topology Visualization Early Warning Distribution Algorithm for Large-Scale Network Security Incidents

    Directory of Open Access Journals (Sweden)

    Hui He

    2013-01-01

    It is of great significance to research early warning systems for large-scale network security incidents. Such a system can improve the network's emergency response capabilities, alleviate the damage of cyber attacks, and strengthen the system's counterattack ability. A comprehensive early warning system is presented in this paper, which combines active measurement and anomaly detection. The key visualization algorithm and technology of the system are mainly discussed. Plane visualization of the large-scale network system is realized based on a divide-and-conquer approach, as sketched below. First, the topology of the large-scale network is divided into several small-scale networks by the MLkP/CR algorithm. Second, the subgraph plane visualization algorithm is applied to each small-scale network. Finally, the small-scale networks' topologies are combined into a single topology based on an automatic distribution algorithm using force analysis. As the algorithm transforms the large-scale network topology plane visualization problem into a series of small-scale network topology plane visualization and distribution problems, it has higher parallelism and is able to handle the display of ultra-large-scale network topologies.
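    The sketch below illustrates the divide-and-conquer layout pipeline with generic stand-ins: a modularity-based partition in place of the MLkP/CR algorithm, spring layouts for the per-subgraph step, and a force layout of the partition "super-graph" for the final distribution. It shows only the pipeline shape, not the paper's algorithms.

```python
# Divide-and-conquer graph layout: partition, lay out each part, then place the
# parts using a force layout of the partition super-graph (illustrative only).
import networkx as nx

def hierarchical_layout(G, scale=0.15):
    parts = list(nx.algorithms.community.greedy_modularity_communities(G))
    # Super-graph: one node per partition, edges where partitions connect.
    S = nx.Graph()
    S.add_nodes_from(range(len(parts)))
    index = {v: i for i, p in enumerate(parts) for v in p}
    for u, v in G.edges():
        if index[u] != index[v]:
            S.add_edge(index[u], index[v])
    centers = nx.spring_layout(S, seed=0)       # force-based partition placement
    pos = {}
    for i, p in enumerate(parts):
        sub = nx.spring_layout(G.subgraph(p), scale=scale, seed=0)
        for v, xy in sub.items():
            pos[v] = centers[i] + xy            # place sub-layout around its center
    return pos

pos = hierarchical_layout(nx.karate_club_graph())
```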

  14. Visual search for conjunctions of physical and numerical size shows that they are processed independently.

    Science.gov (United States)

    Sobel, Kenith V; Puri, Amrita M; Faulkenberry, Thomas J; Dague, Taylor D

    2017-03-01

    The size congruity effect refers to the interaction between numerical magnitude and physical digit size in a symbolic comparison task. Though this effect is well established in the typical 2-item scenario, the mechanisms at the root of the interference remain unclear. Two competing explanations have emerged in the literature: an early interaction model and a late interaction model. In the present study, we used visual conjunction search to test competing predictions from these 2 models. Participants searched for targets that were defined by a conjunction of physical and numerical size. Some distractors shared the target's physical size, and the remaining distractors shared the target's numerical size. We held the total number of search items fixed and manipulated the ratio of the 2 distractor set sizes. The results from 3 experiments converge on the conclusion that numerical magnitude is not a guiding feature for visual search, and that physical and numerical magnitude are processed independently, which supports a late interaction model of the size congruity effect. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  15. Can responses to basic non-numerical visual features explain neural numerosity responses?

    NARCIS (Netherlands)

    Harvey, Ben M; Dumoulin, Serge O

    2017-01-01

    Humans and many animals can distinguish between stimuli that differ in numerosity, the number of objects in a set. Human and macaque parietal lobes contain neurons that respond to changes in stimulus numerosity. However, basic non-numerical visual features can affect neural responses to and

  16. Development and Standardization of an Alienation Scale for Visually Impaired Students

    Science.gov (United States)

    Punia, Poonam; Berwal, Sandeep

    2017-01-01

    Introduction: The present study was undertaken to develop a valid and reliable scale for measuring a feeling of alienation in students with visual impairments (that is, those who are blind or have low vision). Methods: In this study, a pool of 60 items was generated to develop an Alienation Scale for Visually Impaired Students (AL-VI) based on a…

  17. Spatial Scaling of the Profile of Selective Attention in the Visual Field.

    Science.gov (United States)

    Gannon, Matthew A; Knapp, Ashley A; Adams, Thomas G; Long, Stephanie M; Parks, Nathan A

    2016-01-01

    Neural mechanisms of selective attention must be capable of adapting to variation in the absolute size of an attended stimulus in the ever-changing visual environment. To date, little is known regarding how attentional selection interacts with fluctuations in the spatial expanse of an attended object. Here, we use event-related potentials (ERPs) to investigate the scaling of attentional enhancement and suppression across the visual field. We measured ERPs while participants performed a task at fixation that varied in its attentional demands (attentional load) and visual angle (1.0° or 2.5°). Observers were presented with a stream of task-relevant stimuli while foveal, parafoveal, and peripheral visual locations were probed by irrelevant distractor stimuli. We found two important effects in the N1 component of visual ERPs. First, N1 modulations to task-relevant stimuli indexed attentional selection of stimuli during the load task and further correlated with task performance. Second, with increased task size, attentional modulation of the N1 to distractor stimuli showed a differential pattern that was consistent with a scaling of attentional selection. Together, these results demonstrate that the size of an attended stimulus scales the profile of attentional selection across the visual field and provides insights into the attentional mechanisms associated with such spatial scaling.

  18. Unraveling The Connectome: Visualizing and Abstracting Large-Scale Connectomics Data

    KAUST Repository

    Al-Awami, Ali K.

    2017-04-30

    We explore visualization and abstraction approaches to represent neuronal data. Neuroscientists acquire electron microscopy volumes to reconstruct a complete wiring diagram of the neurons in the brain, called the connectome. This will be crucial to understanding brains and their development. However, the resulting data is complex and large, posing a big challenge to existing visualization techniques in terms of clarity and scalability. We describe solutions to tackle the problems of scalability and cluttered presentation. We first show how a query-guided interactive approach to visual exploration can reduce the clutter and help neuroscientists explore their data dynamically. We use a knowledge-based query algebra that facilitates the interactive creation of queries. This allows neuroscientists to pose domain-specific questions related to their research. Simple queries can be combined to form complex queries to answer more sophisticated questions. We then show how visual abstractions from 3D to 2D can significantly reduce the visual clutter and add clarity to the visualization so that scientists can focus more on the analysis. We abstract the topology of 3D neurons into a multi-scale, relative distance-preserving subway map visualization that allows scientists to interactively explore the morphological and connectivity features of neuronal cells. We then focus on the process of acquisition, where neuroscientists segment electron microscopy images to reconstruct neurons. The segmentation process of such data is tedious, time-intensive, and usually performed using a diverse set of tools. We present a novel web-based visualization system for tracking the state, progress, and evolution of segmentation data in neuroscience. Our multi-user system seamlessly integrates a diverse set of tools. Our system provides support for the management, provenance, accountability, and auditing of large-scale segmentations. Finally, we present a novel architecture to render very large

  19. The Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT): Data Analysis and Visualization for Geoscience Data

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Dean [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Doutriaux, Charles [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Patchett, John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Williams, Sean [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Shipman, Galen [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Miller, Ross [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Steed, Chad [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Krishnan, Harinarayan [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Silva, Claudio [NYU Polytechnic School of Engineering, New York, NY (United States); Chaudhary, Aashish [Kitware, Inc., Clifton Park, NY (United States); Bremer, Peer-Timo [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Pugmire, David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bethel, E. Wes [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Childs, Hank [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Prabhat, Mr. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Geveci, Berk [Kitware, Inc., Clifton Park, NY (United States); Bauer, Andrew [Kitware, Inc., Clifton Park, NY (United States); Pletzer, Alexander [Tech-X Corp., Boulder, CO (United States); Poco, Jorge [NYU Polytechnic School of Engineering, New York, NY (United States); Ellqvist, Tommy [NYU Polytechnic School of Engineering, New York, NY (United States); Santos, Emanuele [Federal Univ. of Ceara, Fortaleza (Brazil); Potter, Gerald [NASA Johnson Space Center, Houston, TX (United States); Smith, Brian [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Maxwell, Thomas [NASA Johnson Space Center, Houston, TX (United States); Kindig, David [Tech-X Corp., Boulder, CO (United States); Koop, David [NYU Polytechnic School of Engineering, New York, NY (United States)

    2013-05-01

    To support interactive visualization and analysis of complex, large-scale climate data sets, UV-CDAT integrates a powerful set of scientific computing libraries and applications to foster more efficient knowledge discovery. Connected through a provenance framework, the UV-CDAT components can be loosely coupled for fast integration or tightly coupled for greater functionality and communication with other components. This framework addresses many challenges in the interactive visual analysis of distributed large-scale data for the climate community.

  20. Spatial Scaling of the Profile of Selective Attention in the Visual Field.

    Directory of Open Access Journals (Sweden)

    Matthew A Gannon

    Full Text Available Neural mechanisms of selective attention must be capable of adapting to variation in the absolute size of an attended stimulus in the ever-changing visual environment. To date, little is known regarding how attentional selection interacts with fluctuations in the spatial expanse of an attended object. Here, we use event-related potentials (ERPs) to investigate the scaling of attentional enhancement and suppression across the visual field. We measured ERPs while participants performed a task at fixation that varied in its attentional demands (attentional load) and visual angle (1.0° or 2.5°). Observers were presented with a stream of task-relevant stimuli while foveal, parafoveal, and peripheral visual locations were probed by irrelevant distractor stimuli. We found two important effects in the N1 component of visual ERPs. First, N1 modulations to task-relevant stimuli indexed attentional selection of stimuli during the load task and further correlated with task performance. Second, with increased task size, attentional modulation of the N1 to distractor stimuli showed a differential pattern that was consistent with a scaling of attentional selection. Together, these results demonstrate that the size of an attended stimulus scales the profile of attentional selection across the visual field and provide insights into the attentional mechanisms associated with such spatial scaling.

  1. Noodles: a tool for visualization of numerical weather model ensemble uncertainty.

    Science.gov (United States)

    Sanyal, Jibonananda; Zhang, Song; Dyer, Jamie; Mercer, Andrew; Amburn, Philip; Moorhead, Robert J

    2010-01-01

    Numerical weather prediction ensembles are routinely used for operational weather forecasting. The members of these ensembles are individual simulations with either slightly perturbed initial conditions or different model parameterizations, or occasionally both. Multi-member ensemble output is usually large, multivariate, and challenging to interpret interactively. Forecast meteorologists are interested in understanding the uncertainties associated with numerical weather prediction, specifically the variability among the ensemble members. Currently, visualization of ensemble members is mostly accomplished through spaghetti plots of a single mid-troposphere pressure surface height contour. In order to explore new uncertainty visualization methods, the Weather Research and Forecasting (WRF) model was used to create a 48-hour, 18-member parameterization ensemble of the 13 March 1993 "Superstorm". A tool was designed to interactively explore the ensemble uncertainty of three important weather variables: water-vapor mixing ratio, perturbation potential temperature, and perturbation pressure. Uncertainty was quantified using individual ensemble member standard deviation, inter-quartile range, and the width of the 95% confidence interval. Bootstrapping was employed to overcome the dependence on normality in the uncertainty metrics. A coordinated view of ribbon and glyph-based uncertainty visualization, spaghetti plots, iso-pressure colormaps, and data transect plots was provided to two meteorologists for expert evaluation. They found it useful in assessing uncertainty in the data, especially in finding outliers in the ensemble run and therefore avoiding the WRF parameterizations that lead to these outliers. Additionally, the meteorologists could identify spatial regions where the uncertainty was significantly high, allowing for identification of poorly simulated storm environments and physical interpretation of these model issues.
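
    The three uncertainty metrics named above (member standard deviation, inter-quartile range, and the width of a bootstrapped 95% confidence interval) are straightforward to compute per grid point. The sketch below is a minimal illustration on synthetic values standing in for one grid point of an 18-member ensemble; it is not code from the Noodles tool, and all names and values are placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for one grid point of an 18-member WRF ensemble
# (e.g., water-vapor mixing ratio in g/kg); values are illustrative only.
members = rng.normal(loc=12.0, scale=0.8, size=18)

def bootstrap_ci_width(x, n_boot=5000, level=0.95):
    """Width of the bootstrap confidence interval of the ensemble mean."""
    idx = rng.integers(0, len(x), size=(n_boot, len(x)))
    boot_means = x[idx].mean(axis=1)
    lo, hi = np.percentile(boot_means, [(1 - level) / 2 * 100,
                                        (1 + level) / 2 * 100])
    return hi - lo

std_dev = members.std(ddof=1)                         # member standard deviation
iqr = np.subtract(*np.percentile(members, [75, 25]))  # inter-quartile range
ci_width = bootstrap_ci_width(members)                # 95% CI width via bootstrap

print(f"std={std_dev:.3f}  IQR={iqr:.3f}  95% CI width={ci_width:.3f}")
```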

  2. Front-end vision and multi-scale image analysis multi-scale computer vision theory and applications, written in Mathematica

    CERN Document Server

    Romeny, Bart M Haar

    2008-01-01

    Front-End Vision and Multi-Scale Image Analysis is a tutorial in multi-scale methods for computer vision and image processing. It builds on the cross fertilization between human visual perception and multi-scale computer vision (`scale-space') theory and applications. The multi-scale strategies recognized in the first stages of the human visual system are carefully examined, and taken as inspiration for the many geometric methods discussed. All chapters are written in Mathematica, a spectacular high-level language for symbolic and numerical manipulations. The book presents a new and effective

  3. Comparison of scale analysis and numerical simulation for saturated zone convective mixing processes

    International Nuclear Information System (INIS)

    Oldenburg, C.M.

    1998-01-01

    Scale analysis can be used to predict a variety of quantities arising from natural systems where processes are described by partial differential equations. For example, scale analysis can be applied to estimate the effectiveness of convective mixing on the dilution of contaminants in groundwater. Scale analysis involves substituting simple quotients for partial derivatives and identifying and equating the dominant terms in an order-of-magnitude sense. For free convection due to sidewall heating of saturated porous media, scale analysis shows that vertical convective velocity in the thermal boundary layer region is proportional to the Rayleigh number, horizontal convective velocity is proportional to the square root of the Rayleigh number, and thermal boundary layer thickness is proportional to the inverse square root of the Rayleigh number. These scale analysis estimates are corroborated by numerical simulations of an idealized system. A scale analysis estimate of mixing time for a tracer mixing by hydrodynamic dispersion in a convection cell also agrees well with numerical simulation for two different Rayleigh numbers. Scale analysis for the heating-from-below scenario produces estimates of maximum velocity one-half as large as the sidewall case. At small values of the Rayleigh number, this estimate is confirmed by numerical simulation. For larger Rayleigh numbers, simulation results suggest maximum velocities are similar to the sidewall heating scenario. In general, agreement between scale analysis estimates and numerical simulation results serves to validate the method of scale analysis. The application is to radioactive waste repositories.
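
    As a quick illustration of the sidewall-heating scalings quoted in the abstract (vertical velocity ~ Ra, horizontal velocity ~ Ra^1/2, boundary-layer thickness ~ Ra^-1/2), the sketch below turns them into order-of-magnitude numbers. The characteristic height and thermal diffusivity used to dimensionalize the estimates are assumed, illustrative values, not those of the cited study.

```python
import numpy as np

def sidewall_convection_scales(Ra, H=1.0, alpha=1e-6):
    """
    Order-of-magnitude estimates from scale analysis of sidewall heating of a
    saturated porous layer of height H [m] with thermal diffusivity alpha [m^2/s].
    Prefactors of one follow the usual scale-analysis convention; actual
    coefficients would come from simulation or experiment.
    """
    v_scale = alpha / H                    # diffusive velocity scale
    w_vertical = v_scale * Ra              # vertical boundary-layer velocity ~ Ra
    u_horizontal = v_scale * np.sqrt(Ra)   # horizontal velocity ~ Ra^(1/2)
    delta = H / np.sqrt(Ra)                # boundary-layer thickness ~ Ra^(-1/2)
    return w_vertical, u_horizontal, delta

for Ra in (100.0, 400.0):
    w, u, d = sidewall_convection_scales(Ra)
    print(f"Ra={Ra:g}: w~{w:.2e} m/s, u~{u:.2e} m/s, delta~{d:.3f} m")
```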

  4. Dynamical properties of fractal networks: Scaling, numerical simulations, and physical realizations

    International Nuclear Information System (INIS)

    Nakayama, T.; Yakubo, K.; Orbach, R.L.

    1994-01-01

    This article describes the advances that have been made over the past ten years on the problem of fracton excitations in fractal structures. The relevant systems to this subject are so numerous that focus is limited to a specific structure, the percolating network. Recent progress has followed three directions: scaling, numerical simulations, and experiment. In a happy coincidence, large-scale computations, especially those involving array processors, have become possible in recent years. Experimental techniques such as light- and neutron-scattering experiments have also been developed. Together, they form the basis for a review article useful as a guide to understanding these developments and for charting future research directions. In addition, new numerical simulation results for the dynamical properties of diluted antiferromagnets are presented and interpreted in terms of scaling arguments. The authors hope this article will bring the major advances and future issues facing this field into clearer focus, and will stimulate further research on the dynamical properties of random systems

  5. How to ask about patient satisfaction? The visual analogue scale is less vulnerable to confounding factors and ceiling effect than a symmetric Likert scale.

    Science.gov (United States)

    Voutilainen, Ari; Pitkäaho, Taina; Kvist, Tarja; Vehviläinen-Julkunen, Katri

    2016-04-01

    To study the effects of scale type (visual analogue scale vs. Likert), item order (systematic vs. random), item non-response and patient-related characteristics (age, gender, subjective health, need for assistance with filling out the questionnaire and length of stay) on the results of patient satisfaction surveys. Although patient satisfaction is one of the most intensely studied issues in the health sciences, research information about the effects of possible instrument-related confounding factors on patient satisfaction surveys is scant. A quasi-experimental design was employed. A non-randomized sample of 150 surgical patients was gathered to minimize possible alterations in care quality. Data were collected in May-September 2014 from one tertiary hospital in Finland using the Revised Humane Caring Scale instrument. New versions of the instrument were created for the present purposes. In these versions, items were either in a visual analogue format or Likert-scaled, in systematic or random order. The data were analysed using an analysis of covariance and a paired samples t-test. The visual analogue scale items were less vulnerable to bias from confounding factors than were the Likert-scaled items. The visual analogue scale also avoided the ceiling effect better than the Likert scale, and the time needed to complete the visual analogue scale questionnaire was 28% shorter than that needed to complete the Likert-scaled questionnaire. The present results supported the use of the visual analogue scale rather than Likert scaling in patient satisfaction surveys and stressed the need to account for as many potential confounding factors as possible. © 2015 John Wiley & Sons Ltd.
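
    The paired comparison and ceiling-effect check described above can be reproduced in outline with standard statistics libraries. The sketch below uses synthetic scores (not the study's data) for 150 respondents on a 0-100 visual analogue scale and a 1-5 Likert item; the rescaling step and the cut-off for "at ceiling" are assumptions for illustration only.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 150  # sample size matching the study; the scores below are synthetic

vas = np.clip(rng.normal(78, 12, n), 0, 100)                # 0-100 visual analogue scale
likert = np.clip(np.round(rng.normal(4.3, 0.7, n)), 1, 5)   # 1-5 Likert item

# Ceiling effect: proportion of respondents at the scale maximum
ceiling_vas = np.mean(vas >= 100)
ceiling_likert = np.mean(likert == 5)

# Paired comparison after rescaling the Likert item to 0-100 for comparability
likert_rescaled = (likert - 1) / 4 * 100
t, p = stats.ttest_rel(vas, likert_rescaled)

print(f"ceiling VAS={ceiling_vas:.2%}, Likert={ceiling_likert:.2%}, "
      f"paired t={t:.2f}, p={p:.3f}")
```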

  6. HD-MTL: Hierarchical Deep Multi-Task Learning for Large-Scale Visual Recognition.

    Science.gov (United States)

    Fan, Jianping; Zhao, Tianyi; Kuang, Zhenzhong; Zheng, Yu; Zhang, Ji; Yu, Jun; Peng, Jinye

    2017-02-09

    In this paper, a hierarchical deep multi-task learning (HD-MTL) algorithm is developed to support large-scale visual recognition (e.g., recognizing thousands or even tens of thousands of atomic object classes automatically). First, multiple sets of multi-level deep features are extracted from different layers of deep convolutional neural networks (deep CNNs), and they are used to achieve more effective accomplishment of the coarse-to-fine tasks for hierarchical visual recognition. A visual tree is then learned by assigning the visually-similar atomic object classes with similar learning complexities into the same group, which can provide a good environment for determining the interrelated learning tasks automatically. By leveraging the inter-task relatedness (inter-class similarities) to learn more discriminative group-specific deep representations, our deep multi-task learning algorithm can train more discriminative node classifiers for distinguishing the visually-similar atomic object classes effectively. Our hierarchical deep multi-task learning (HD-MTL) algorithm can integrate two discriminative regularization terms to control the inter-level error propagation effectively, and it can provide an end-to-end approach for jointly learning more representative deep CNNs (for image representation) and a more discriminative tree classifier (for large-scale visual recognition) and updating them simultaneously. Our incremental deep learning algorithms can effectively adapt both the deep CNNs and the tree classifier to the new training images and the new object classes. Our experimental results have demonstrated that our HD-MTL algorithm can achieve very competitive results on improving the accuracy rates for large-scale visual recognition.

  7. A visual analytics system for optimizing the performance of large-scale networks in supercomputing systems

    Directory of Open Access Journals (Sweden)

    Takanori Fujiwara

    2018-03-01

    Full Text Available The overall efficiency of an extreme-scale supercomputer largely relies on the performance of its network interconnects. Several state-of-the-art supercomputers use networks based on the increasingly popular Dragonfly topology. It is crucial to study the behavior and performance of different parallel applications running on Dragonfly networks in order to make optimal system configurations and design choices, such as job scheduling and routing strategies. However, in order to study this temporal network behavior, we need a tool to analyze and correlate numerous sets of multivariate time-series data collected from the Dragonfly’s multi-level hierarchies. This paper presents such a tool, a visual analytics system, that uses data from the Dragonfly network to investigate the temporal behavior and optimize the communication performance of a supercomputer. We coupled interactive visualization with time-series analysis methods to help reveal hidden patterns in the network behavior with respect to different parallel applications and system configurations. Our system also provides multiple coordinated views for connecting behaviors observed at different levels of the network hierarchies, which effectively supports visual analysis tasks. We demonstrate the effectiveness of the system with a set of case studies. Our system and findings can not only help improve the communication performance of supercomputing applications, but also the network performance of next-generation supercomputers. Keywords: Supercomputing, Parallel communication network, Dragonfly networks, Time-series data, Performance analysis, Visual analytics

  8. Quantitative regional validation of the visual rating scale for posterior cortical atrophy

    International Nuclear Information System (INIS)

    Moeller, Christiane; Benedictus, Marije R.; Koedam, Esther L.G.M.; Scheltens, Philip; Flier, Wiesje M. van der; Versteeg, Adriaan; Wattjes, Mike P.; Barkhof, Frederik; Vrenken, Hugo

    2014-01-01

    Validate the four-point visual rating scale for posterior cortical atrophy (PCA) on magnetic resonance images (MRI) through quantitative grey matter (GM) volumetry and voxel-based morphometry (VBM) to justify its use in clinical practice. Two hundred twenty-nine patients with probable Alzheimer's disease and 128 with subjective memory complaints underwent 3T MRI. PCA was rated according to the visual rating scale. GM volumes of six posterior structures and the total posterior region were extracted using IBASPM and compared among PCA groups. To determine which anatomical regions contributed most to the visual scores, we used binary logistic regression. VBM compared local GM density among groups. Patients were categorised according to their PCA scores: PCA-0 (n = 122), PCA-1 (n = 143), PCA-2 (n = 79), and PCA-3 (n = 13). All structures except the posterior cingulate differed significantly among groups. The inferior parietal gyrus volume discriminated the most between rating scale levels. VBM showed that PCA-1 had a lower GM volume than PCA-0 in the parietal region and other brain regions, whereas between PCA-1 and PCA-2/3 GM atrophy was mostly restricted to posterior regions. The visual PCA rating scale is quantitatively validated and reliably reflects GM atrophy in parietal regions, making it a valuable tool for the daily radiological assessment of dementia. (orig.)
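
    The binary logistic regression used above to ask which regional grey-matter volumes discriminate between rating levels can be sketched with standard tooling. The example below uses synthetic volumes and illustrative region names, and a simplified binary outcome (e.g., PCA-0 vs. PCA greater than 0); it is not the study's code or data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
regions = ["sup_parietal", "inf_parietal", "precuneus",
           "post_cingulate", "sup_occipital", "mid_occipital"]  # illustrative names

n = 357  # 229 AD + 128 SMC patients, as in the study; the volumes below are synthetic
X = rng.normal(10.0, 1.5, size=(n, len(regions)))          # regional GM volumes (mL)
y = (X[:, 1] + rng.normal(0, 1.0, n) < 10.0).astype(int)    # 1 = visually rated atrophic

model = LogisticRegression(max_iter=1000).fit(StandardScaler().fit_transform(X), y)

# Standardised coefficients indicate which region discriminates the most
for name, coef in sorted(zip(regions, model.coef_[0]), key=lambda t: -abs(t[1])):
    print(f"{name:15s} {coef:+.2f}")
```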

  9. Quantitative regional validation of the visual rating scale for posterior cortical atrophy

    Energy Technology Data Exchange (ETDEWEB)

    Moeller, Christiane; Benedictus, Marije R.; Koedam, Esther L.G.M.; Scheltens, Philip [VU University Medical Center, Alzheimer Center and Department of Neurology, Neuroscience Campus Amsterdam, P.O. Box 7057, Amsterdam (Netherlands); Flier, Wiesje M. van der [VU University Medical Center, Alzheimer Center and Department of Neurology, Neuroscience Campus Amsterdam, P.O. Box 7057, Amsterdam (Netherlands); VU University Medical Center, Department of Epidemiology and Biostatistics, Neuroscience Campus Amsterdam, P.O. Box 7057, Amsterdam (Netherlands); Versteeg, Adriaan; Wattjes, Mike P.; Barkhof, Frederik [VU University Medical Center, Department of Radiology and Nuclear Medicine, Neuroscience Campus Amsterdam, P.O. Box 7057, Amsterdam (Netherlands); Vrenken, Hugo [VU University Medical Center, Department of Radiology and Nuclear Medicine, Neuroscience Campus Amsterdam, P.O. Box 7057, Amsterdam (Netherlands); VU University Medical Center, Department of Physics and Medical Technology, Neuroscience Campus Amsterdam, P.O. Box 7057, Amsterdam (Netherlands)

    2014-02-15

    Validate the four-point visual rating scale for posterior cortical atrophy (PCA) on magnetic resonance images (MRI) through quantitative grey matter (GM) volumetry and voxel-based morphometry (VBM) to justify its use in clinical practice. Two hundred twenty-nine patients with probable Alzheimer's disease and 128 with subjective memory complaints underwent 3T MRI. PCA was rated according to the visual rating scale. GM volumes of six posterior structures and the total posterior region were extracted using IBASPM and compared among PCA groups. To determine which anatomical regions contributed most to the visual scores, we used binary logistic regression. VBM compared local GM density among groups. Patients were categorised according to their PCA scores: PCA-0 (n = 122), PCA-1 (n = 143), PCA-2 (n = 79), and PCA-3 (n = 13). All structures except the posterior cingulate differed significantly among groups. The inferior parietal gyrus volume discriminated the most between rating scale levels. VBM showed that PCA-1 had a lower GM volume than PCA-0 in the parietal region and other brain regions, whereas between PCA-1 and PCA-2/3 GM atrophy was mostly restricted to posterior regions. The visual PCA rating scale is quantitatively validated and reliably reflects GM atrophy in parietal regions, making it a valuable tool for the daily radiological assessment of dementia. (orig.)

  10. Visualization and parallel I/O at extreme scale

    International Nuclear Information System (INIS)

    Ross, R B; Peterka, T; Shen, H-W; Hong, Y; Ma, K-L; Yu, H; Moreland, K

    2008-01-01

    In our efforts to solve ever more challenging problems through computational techniques, the scale of our compute systems continues to grow. As we approach petascale, it becomes increasingly important that all the resources in the system be used as efficiently as possible, not just the floating-point units. Because of hardware, software, and usability challenges, storage resources are often one of the most poorly used and performing components of today's compute systems. This situation can be especially true in the case of the analysis phases of scientific workflows. In this paper we discuss the impact of large-scale data on visual analysis operations and examine a collection of approaches to I/O in the visual analysis process. First we examine the performance of volume rendering on a leadership-computing platform and assess the relative cost of I/O, rendering, and compositing operations. Next we analyze the performance implications of eliminating preprocessing from this example workflow. Then we describe a technique that uses data reorganization to improve access times for data-intensive volume rendering

  11. Numerical Modeling and Experimental Analysis of Scale Horizontal Axis Marine Hydrokinetic (MHK) Turbines

    Science.gov (United States)

    Javaherchi, Teymour; Stelzenmuller, Nick; Seydel, Joseph; Aliseda, Alberto

    2013-11-01

    We investigate, through a combination of scale model experiments and numerical simulations, the evolution of the flow field around the rotor and in the wake of Marine Hydrokinetic (MHK) turbines. Understanding the dynamics of this flow field is the key to optimizing the energy conversion of single devices and the arrangement of turbines in commercially viable arrays. This work presents a comparison between numerical and experimental results from two different case studies of scaled horizontal axis MHK turbines (45:1 scale). In the first case study, we investigate the effect of Reynolds number (Re = 40,000 to 100,000) and Tip Speed Ratio (TSR = 5 to 12) variation on the performance and wake structure of a single turbine. In the second case, we study the effect of the turbine downstream spacing (5d to 14d) on the performance and wake development in a coaxial configuration of two turbines. These results provide insights into the dynamics of Horizontal Axis Hydrokinetic Turbines, and by extension to Horizontal Axis Wind Turbines in close proximity to each other, and highlight the capabilities and limitations of the numerical models. Once validated at laboratory scale, the numerical model can be used to address other aspects of MHK turbines at full scale. Supported by DOE through the National Northwest Marine Renewable Energy Center.
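
    The two non-dimensional parameters varied above, Reynolds number and tip speed ratio, are simple to compute for a scale rotor. The sketch below uses assumed, illustrative values for diameter, free-stream speed, and rotation rate rather than the experiment's actual settings.

```python
import math

def reynolds_number(U, L, nu=1.0e-6):
    """Re based on flow speed U [m/s], length scale L [m], kinematic viscosity nu [m^2/s]."""
    return U * L / nu

def tip_speed_ratio(omega, R, U):
    """TSR = blade tip speed / free-stream speed."""
    return omega * R / U

# Illustrative values for a 45:1 scale rotor in a flume (not the paper's exact setup)
D = 0.45          # rotor diameter [m]
U = 0.8           # free-stream velocity [m/s]
rpm = 170.0       # rotor speed
omega = rpm * 2 * math.pi / 60

print(f"Re_D = {reynolds_number(U, D):.3g}")
print(f"TSR  = {tip_speed_ratio(omega, D / 2, U):.2f}")
```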

  12. Visualization environment and its utilization in the ITBL building

    International Nuclear Information System (INIS)

    Yasuhara, Yuko

    2004-12-01

    In recent years, visualization techniques have become more and more important in various fields. Especially in scientific fields, a large amount of numerical output data crucially needs to be changed into visualized form, because computations have grown to larger and larger scales and have become more complicated, so that computed results must be made intuitively comprehensible by using various visualization techniques like 3D or stereo image construction. In the visualization room in the ITBL building, a 3-screen Virtual Reality system, a Portable Virtual Reality system, a Mixed Reality system, and visualization tools like alchemy etc. are installed for the above-mentioned use. These devices enable us to easily change numerical data into visualized images of a virtual reality world with the use of eye-glasses or a head-mount-display device. This article describes the visualization environment in the ITBL building, its use, and the tasks to be solved. (author)

  13. A Framework for Parallel Numerical Simulations on Multi-Scale Geometries

    KAUST Repository

    Varduhn, Vasco

    2012-06-01

    In this paper, an approach to performing numerical multi-scale simulations on finely detailed geometries is presented. In particular, the focus lies on the generation of sufficiently fine mesh representations, where a resolution of tens of millions of voxels is inevitable in order to sufficiently represent the geometry. Furthermore, the propagation of boundary conditions is investigated by using simulation results on the coarser simulation scale as input boundary conditions on the next finer scale. Finally, the applicability of our approach is shown on a two-phase simulation of flooding scenarios in urban structures, running from a city-wide scale to a finely detailed indoor scale on feature-rich building geometries. © 2012 IEEE.

  14. Perceived visual informativeness (PVI): construct and scale development to assess visual information in printed materials.

    Science.gov (United States)

    King, Andy J; Jensen, Jakob D; Davis, LaShara A; Carcioppolo, Nick

    2014-01-01

    There is a paucity of research on the visual images used in health communication messages and campaign materials. Even though many studies suggest further investigation of these visual messages and their features, few studies provide specific constructs or assessment tools for evaluating the characteristics of visual messages in health communication contexts. The authors conducted 2 studies to validate a measure of perceived visual informativeness (PVI), a message construct assessing visual messages presenting statistical or indexical information. In Study 1, a 7-item scale was created that demonstrated good internal reliability (α = .91), as well as convergent and divergent validity with related message constructs such as perceived message quality, perceived informativeness, and perceived attractiveness. PVI also converged with a preference for visual learning but was unrelated to a person's actual vision ability. In addition, PVI exhibited concurrent validity with a number of important constructs including perceived message effectiveness, decisional satisfaction, and three key public health theory behavior predictors: perceived benefits, perceived barriers, and self-efficacy. Study 2 provided more evidence that PVI is an internally reliable measure and demonstrates that PVI is a modifiable message feature that can be tested in future experimental work. PVI provides an initial step to assist in the evaluation and testing of visual messages in campaign and intervention materials promoting informed decision making and behavior change.
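
    The internal reliability reported above (α = .91 for the 7-item scale) is Cronbach's alpha. The sketch below shows how such a coefficient is computed on synthetic 5-point item responses; the data and the single-factor structure are assumptions for illustration, not the study's responses.

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) matrix of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Synthetic 7-item, 5-point responses sharing a common factor (illustrative only)
rng = np.random.default_rng(3)
latent = rng.normal(0, 1, size=(200, 1))
responses = np.clip(np.round(3 + latent + rng.normal(0, 0.6, size=(200, 7))), 1, 5)

print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")
```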

  15. Collaborative Visualization for Large-Scale Accelerator Electromagnetic Modeling (Final Report)

    International Nuclear Information System (INIS)

    Schroeder, William J.

    2011-01-01

    This report contains the comprehensive summary of the work performed on the SBIR Phase II, Collaborative Visualization for Large-Scale Accelerator Electromagnetic Modeling at Kitware Inc. in collaboration with Stanford Linear Accelerator Center (SLAC). The goal of the work was to develop collaborative visualization tools for large-scale data as illustrated in the figure below. The solutions we proposed address the typical problems faced by geographically- and organizationally-separated research and engineering teams, who produce large data (either through simulation or experimental measurement) and wish to work together to analyze and understand their data. Because the data is large, we expect that it cannot be easily transported to each team member's work site, and that the visualization server must reside near the data. Further, we also expect that each work site has heterogeneous resources: some with large computing clients, tiled (or large) displays and high bandwidth; other sites as simple as a team member on a laptop computer. Our solution is based on the open-source, widely used ParaView large-data visualization application. We extended this tool to support multiple collaborative clients who may locally visualize data, and then periodically rejoin and synchronize with the group to discuss their findings. Options for managing session control, adding annotation, and defining the visualization pipeline, among others, were incorporated. We also developed and deployed a Web visualization framework based on ParaView that enables the Web browser to act as a participating client in a collaborative session. The ParaView Web Visualization framework leverages various Web technologies including WebGL, JavaScript, Java and Flash to enable interactive 3D visualization over the web using ParaView as the visualization server. We steered the development of this technology by teaming with the SLAC National Accelerator Laboratory. SLAC has a computationally-intensive problem

  16. Collaborative Visualization for Large-Scale Accelerator Electromagnetic Modeling (Final Report)

    Energy Technology Data Exchange (ETDEWEB)

    William J. Schroeder

    2011-11-13

    This report contains the comprehensive summary of the work performed on the SBIR Phase II, Collaborative Visualization for Large-Scale Accelerator Electromagnetic Modeling at Kitware Inc. in collaboration with Stanford Linear Accelerator Center (SLAC). The goal of the work was to develop collaborative visualization tools for large-scale data as illustrated in the figure below. The solutions we proposed address the typical problems faced by geographically- and organizationally-separated research and engineering teams, who produce large data (either through simulation or experimental measurement) and wish to work together to analyze and understand their data. Because the data is large, we expect that it cannot be easily transported to each team member's work site, and that the visualization server must reside near the data. Further, we also expect that each work site has heterogeneous resources: some with large computing clients, tiled (or large) displays and high bandwidth; other sites as simple as a team member on a laptop computer. Our solution is based on the open-source, widely used ParaView large-data visualization application. We extended this tool to support multiple collaborative clients who may locally visualize data, and then periodically rejoin and synchronize with the group to discuss their findings. Options for managing session control, adding annotation, and defining the visualization pipeline, among others, were incorporated. We also developed and deployed a Web visualization framework based on ParaView that enables the Web browser to act as a participating client in a collaborative session. The ParaView Web Visualization framework leverages various Web technologies including WebGL, JavaScript, Java and Flash to enable interactive 3D visualization over the web using ParaView as the visualization server. We steered the development of this technology by teaming with the SLAC National Accelerator Laboratory. SLAC has a computationally

  17. Gait in children with cerebral palsy : observer reliability of Physician Rating Scale and Edinburgh Visual Gait Analysis Interval Testing scale

    NARCIS (Netherlands)

    Maathuis, KGB; van der Schans, CP; van Iperen, A; Rietman, HS; Geertzen, JHB

    2005-01-01

    The aim of this study was to test the inter- and intra-observer reliability of the Physician Rating Scale (PRS) and the Edinburgh Visual Gait Analysis Interval Testing (GAIT) scale for use in children with cerebral palsy (CP). Both assessment scales are quantitative observational scales, evaluating

  18. Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT)

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Dean N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Silva, Claudio [New York Univ. (NYU), NY (United States). Computer Science and Engineering Dept.

    2013-09-30

    For the past three years, a large analysis and visualization effort—funded by the Department of Energy’s Office of Biological and Environmental Research (BER), the National Aeronautics and Space Administration (NASA), and the National Oceanic and Atmospheric Administration (NOAA)—has brought together a wide variety of industry-standard scientific computing libraries and applications to create Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT) to serve the global climate simulation and observational research communities. To support interactive analysis and visualization, all components connect through a provenance application programming interface to capture meaningful history and workflow. Components can be loosely coupled into the framework for fast integration or tightly coupled for greater system functionality and communication with other components. The overarching goal of UV-CDAT is to provide a new paradigm for access to and analysis of massive, distributed scientific data collections by leveraging distributed data architectures located throughout the world. The UV-CDAT framework addresses challenges in analysis and visualization and incorporates new opportunities, including parallelism for better efficiency, higher speed, and more accurate scientific inferences. Today, it provides more than 600 users access to more analysis and visualization products than any other single source.
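
    UV-CDAT wraps the CDAT scientific libraries (cdms2 for CF-aware data access, vcs for plotting). The minimal sketch below shows the style of workflow these components support, assuming the cdms2/vcs stack is installed and that a hypothetical file tas.nc contains a CF-compliant surface air temperature variable tas; the file and variable names are placeholders, and this is not UV-CDAT's own code.

```python
# A minimal sketch of a CDAT-style analysis of the kind UV-CDAT builds on.
import cdms2
import vcs

f = cdms2.open("tas.nc")          # hypothetical input file
tas = f("tas", time=slice(0, 1))  # read the first time step of the field

canvas = vcs.init()               # create a VCS canvas
canvas.plot(tas)                  # default boxfill plot of the field
canvas.png("tas_first_timestep")  # save the rendered image

f.close()
```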

  19. Direct Numerical Simulation of Low Capillary Number Pore Scale Flows

    Science.gov (United States)

    Esmaeilzadeh, S.; Soulaine, C.; Tchelepi, H.

    2017-12-01

    The arrangement of void spaces and the granular structure of a porous medium determines multiple macroscopic properties of the rock such as porosity, capillary pressure, and relative permeability. Therefore, it is important to study the microscopic structure of the reservoir pores and understand the dynamics of fluid displacements through them. One approach for doing this is direct numerical simulation of pore-scale flow, which requires a robust numerical tool for prediction of fluid dynamics and a detailed understanding of the physical processes occurring at the pore scale. In pore-scale flows with a low capillary number, Eulerian multiphase methods are well known to produce additional vorticity close to the interface. This is mainly due to discretization errors which lead to an imbalance of capillary pressure and surface tension forces that causes unphysical spurious currents. At the pore scale, these spurious currents can become significantly stronger than the average velocity in the phases, and lead to unphysical displacement of the interface. In this work, we first investigate the capability of the algebraic Volume of Fluid (VOF) method in OpenFOAM for low capillary number pore-scale flow simulations. Afterward, we compare the VOF results with a Coupled Level-Set Volume of Fluid (CLSVOF) method and an Iso-Advector method. The former has been shown to reduce the VOF method's unphysical spurious currents in some cases, and both are known to capture interfaces more sharply than VOF. In conclusion, we investigate whether the use of CLSVOF or Iso-Advector leads to smaller spurious velocities and more accurate results for capillary-driven pore-scale multiphase flows. Keywords: Pore-scale multiphase flow, Capillary driven flows, Spurious currents, OpenFOAM
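
    The "low capillary number" regime discussed above is defined by Ca = μU/σ, the ratio of viscous to capillary forces. The sketch below evaluates it for assumed, water/oil-like property values chosen purely for illustration.

```python
def capillary_number(mu, U, sigma):
    """Ca = viscous forces / capillary forces = mu * U / sigma."""
    return mu * U / sigma

# Illustrative values for a slow pore-scale displacement (not the paper's cases)
mu = 1.0e-3      # dynamic viscosity of the displacing phase [Pa s]
U = 1.0e-4       # characteristic pore velocity [m/s]
sigma = 30e-3    # interfacial tension [N/m]

Ca = capillary_number(mu, U, sigma)
print(f"Ca = {Ca:.1e}  (Ca << 1: capillary-dominated, spurious currents matter)")
```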

  20. Numerical simulation of small scale soft impact tests

    International Nuclear Information System (INIS)

    Varpasuo, Pentti

    2008-01-01

    This paper describes the small scale soft missile impact tests. The purpose of the test program is to provide data for the calibration of the numerical simulation models for impact simulation. In the experiments, both dry and fluid filled missiles are used. The tests with fluid filled missiles investigate the release speed and the droplet size of the fluid release. This data is important in quantifying the fire hazard of flammable liquid after the release. The spray release velocity and droplet size are also input data for analytical and numerical simulation of the liquid spread in the impact. The behaviour of the impact target is the second investigative goal of the test program. The response of reinforced and pre-stressed concrete walls is studied with the aid of displacement and strain monitoring. (authors)

  1. State-of-the-Art in GPU-Based Large-Scale Volume Visualization

    KAUST Repository

    Beyer, Johanna

    2015-05-01

    This survey gives an overview of the current state of the art in GPU techniques for interactive large-scale volume visualization. Modern techniques in this field have brought about a sea change in how interactive visualization and analysis of giga-, tera- and petabytes of volume data can be enabled on GPUs. In addition to combining the parallel processing power of GPUs with out-of-core methods and data streaming, a major enabler for interactivity is making both the computational and the visualization effort proportional to the amount and resolution of data that is actually visible on screen, i.e. 'output-sensitive' algorithms and system designs. This leads to recent output-sensitive approaches that are 'ray-guided', 'visualization-driven' or 'display-aware'. In this survey, we focus on these characteristics and propose a new categorization of GPU-based large-scale volume visualization techniques based on the notions of actual output-resolution visibility and the current working set of volume bricks-the current subset of data that is minimally required to produce an output image of the desired display resolution. Furthermore, we discuss the differences and similarities of different rendering and data traversal strategies in volume rendering by putting them into a common context-the notion of address translation. For our purposes here, we view parallel (distributed) visualization using clusters as an orthogonal set of techniques that we do not discuss in detail but that can be used in conjunction with what we present in this survey. © 2015 The Eurographics Association and John Wiley & Sons Ltd.

  2. State-of-the-Art in GPU-Based Large-Scale Volume Visualization

    KAUST Repository

    Beyer, Johanna; Hadwiger, Markus; Pfister, Hanspeter

    2015-01-01

    This survey gives an overview of the current state of the art in GPU techniques for interactive large-scale volume visualization. Modern techniques in this field have brought about a sea change in how interactive visualization and analysis of giga-, tera- and petabytes of volume data can be enabled on GPUs. In addition to combining the parallel processing power of GPUs with out-of-core methods and data streaming, a major enabler for interactivity is making both the computational and the visualization effort proportional to the amount and resolution of data that is actually visible on screen, i.e. 'output-sensitive' algorithms and system designs. This leads to recent output-sensitive approaches that are 'ray-guided', 'visualization-driven' or 'display-aware'. In this survey, we focus on these characteristics and propose a new categorization of GPU-based large-scale volume visualization techniques based on the notions of actual output-resolution visibility and the current working set of volume bricks-the current subset of data that is minimally required to produce an output image of the desired display resolution. Furthermore, we discuss the differences and similarities of different rendering and data traversal strategies in volume rendering by putting them into a common context-the notion of address translation. For our purposes here, we view parallel (distributed) visualization using clusters as an orthogonal set of techniques that we do not discuss in detail but that can be used in conjunction with what we present in this survey. © 2015 The Eurographics Association and John Wiley & Sons Ltd.

  3. Multi-scale modelling and numerical simulation of electronic kinetic transport

    International Nuclear Information System (INIS)

    Duclous, R.

    2009-11-01

    This research thesis, which is at the interface between numerical analysis, plasma physics and applied mathematics, deals with the kinetic modelling and numerical simulation of electron energy transport and deposition in laser-produced plasmas, having in view the processes of fuel assembly to the temperature and density conditions necessary to ignite fusion reactions. After a brief review of the processes at play in the collisional kinetic theory of plasmas, with a focus on basic models and methods to implement, couple and validate them, the author focuses on the collective aspect related to the free-streaming electron transport equation in the non-relativistic limit as well as in the relativistic regime. He discusses the numerical development and analysis of the scheme for the Vlasov-Maxwell system, and the selection of a validation procedure and numerical tests. Then, he investigates more specific aspects of the collective transport: multi-species transport subjected to phase-space discontinuities. Dealing with the multi-scale physics of electron transport with collision source terms, he validates the accuracy of a fast Monte Carlo multi-grid solver for the Fokker-Planck-Landau electron-electron collision operator. He reports realistic simulations of kinetic electron transport in the frame of the shock ignition scheme, and the development and validation of a reduced electron transport angular model. He finally explores the relative importance of the processes involving electron-electron collisions at high energy by means of a multi-scale reduced model with relativistic Boltzmann terms

  4. Continuous modelling study of numerical volumes - Applications to the visualization of anatomical structures

    International Nuclear Information System (INIS)

    Goret, C.

    1990-12-01

    Several imaging techniques (MRI, image scanners, tomoscintigraphy, echography) give numerical information presented as a stack of parallel cross-sectional images. For many years, 3-D mathematical tools have been developed that allow the synthesis of 3-D surface images. In the first part, we present techniques for the exploitation of numerical volumes and their medical applications to diagnosis and therapy. The second part deals with a continuous modelling of the volume using a tensor product of cubic splines. We study the characteristics of this representation and its clinical validation. Finally, we treat the problem of surface visualization of objects contained in the volume. The results show the value of this model and allow us to propose specifications for the realization of a 3-D workstation [fr]
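
    A tensor product of cubic splines, as used in the continuous model above, can be evaluated in Python with scipy's B-spline machinery; map_coordinates prefilters a regular voxel grid and evaluates the order-3 tensor-product spline at arbitrary positions. The synthetic volume and sample points below are placeholders, not the thesis's data or code.

```python
import numpy as np
from scipy import ndimage

# Synthetic stack of parallel cross-sections (z, y, x), standing in for CT/MRI slices
z, y, x = np.mgrid[0:32, 0:64, 0:64]
volume = np.sin(0.2 * x) * np.cos(0.2 * y) * np.exp(-0.05 * z)

# Tensor-product cubic B-spline interpolation at arbitrary (z, y, x) positions.
points = np.array([[10.3, 15.7, 20.1],
                   [11.8, 40.2, 33.9]]).T   # shape (3, n_points)
values = ndimage.map_coordinates(volume, points, order=3, mode="nearest")

print(values)
```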

  5. Numerical integration methods and layout improvements in the context of dynamic RNA visualization.

    Science.gov (United States)

    Shabash, Boris; Wiese, Kay C

    2017-05-30

    RNA visualization software tools have traditionally presented a static visualization of RNA molecules with limited ability for users to interact with the resulting image once it is complete. Only a few tools allowed for dynamic structures. One such tool is jViz.RNA. Currently, jViz.RNA employs a unique method for the creation of the RNA molecule layout by mapping the RNA nucleotides into vertexes in a graph, which we call the detailed graph, and then utilizes a Newtonian mechanics inspired system of forces to calculate a layout for the RNA molecule. The work presented here focuses on improvements to jViz.RNA that allow the drawing of RNA secondary structures according to common drawing conventions, as well as dramatic run-time performance improvements. This is done first by presenting an alternative method for mapping the RNA molecule into a graph, which we call the compressed graph, and then employing advanced numerical integration methods for the compressed graph representation. Comparing the compressed graph and detailed graph implementations, we find that the compressed graph produces results more consistent with RNA drawing conventions. However, we also find that employing the compressed graph method requires a more sophisticated initial layout to produce visualizations that would require minimal user interference. Comparing the two numerical integration methods demonstrates the higher stability of the Backward Euler method, and its resulting ability to handle much larger time steps, a high priority feature for any software which entails user interaction. The work in this manuscript presents the preferred use of compressed graphs to detailed ones, as well as the advantages of employing the Backward Euler method over the Forward Euler method. These improvements produce more stable as well as visually aesthetic representations of the RNA secondary structures. The results presented demonstrate that both the compressed graph representation, as well as the Backward
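
    The stability argument for Backward over Forward Euler can be seen on a single stiff spring, a 1-D stand-in for one vertex of a force-directed layout (this is a generic sketch, not jViz.RNA's force system; the spring constant and time step are deliberately chosen to stress the explicit scheme, and the implicit update exploits the linearity of the spring to avoid a nonlinear solve).

```python
import numpy as np

k, m, dt, steps = 50.0, 1.0, 0.25, 40   # stiff spring, deliberately large time step

def forward_euler(x0=1.0, v0=0.0):
    x, v = x0, v0
    for _ in range(steps):
        a = -k / m * x
        x, v = x + dt * v, v + dt * a      # explicit update
    return x

def backward_euler(x0=1.0, v0=0.0):
    # Solve the 2x2 implicit system for [x_{n+1}, v_{n+1}] directly (linear spring).
    A = np.array([[1.0, -dt],
                  [dt * k / m, 1.0]])
    state = np.array([x0, v0])
    for _ in range(steps):
        state = np.linalg.solve(A, state)  # implicit update
    return state[0]

print(f"Forward Euler : x = {forward_euler():.3e}  (diverges for this dt)")
print(f"Backward Euler: x = {backward_euler():.3e}  (remains bounded)")
```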

  6. GPU-accelerated brain connectivity reconstruction and visualization in large-scale electron micrographs

    KAUST Repository

    Jeong, Wonki

    2011-01-01

    This chapter introduces a GPU-accelerated interactive, semiautomatic axon segmentation and visualization system. Two challenging problems have been addressed: the interactive 3D axon segmentation and the interactive 3D image filtering and rendering of implicit surfaces. The reconstruction of neural connections to understand the function of the brain is an emerging and active research area in neuroscience. With the advent of high-resolution scanning technologies, such as 3D light microscopy and electron microscopy (EM), reconstruction of complex 3D neural circuits from large volumes of neural tissues has become feasible. Among them, only EM data can provide sufficient resolution to identify synapses and to resolve extremely narrow neural processes. These high-resolution, large-scale datasets pose challenging problems, for example, how to process and manipulate large datasets to extract scientifically meaningful information using a compact representation in a reasonable processing time. The running time of the multiphase level set segmentation method has been measured on the CPU and GPU. The CPU version is implemented using the ITK image class and the ITK distance transform filter. The numerical part of the CPU implementation is similar to the GPU implementation for fair comparison. The main focus of this chapter is introducing the GPU algorithms and their implementation details, which are the core components of the interactive segmentation and visualization system. © 2011 Copyright © 2011 NVIDIA Corporation and Wen-mei W. Hwu Published by Elsevier Inc. All rights reserved..

  7. Equipment of visualization environment of a large-scale structural analysis system. Visualization using AVS/Express of an ADVENTURE system

    International Nuclear Information System (INIS)

    Miyazaki, Mikiya

    2004-02-01

    Data visualization work is done in many research fields, and a lot of specialized software for specific purposes exists today. However, such software typically has interfaces to only a small number of solvers. In many simulations, data conversion for the visualization software is required between analysis and visualization for practical use. This report describes the work of setting up a data visualization environment in which AVS/Express was installed, in response to many requests from users of the large-scale structural analysis system that is provided as an ITBL community software. This environment makes it possible to use the ITBL visualization server as a visualization device after computation on the ITBL computer. Moreover, extensive use within the community in the ITBL environment is expected once it is merged into the ITBL/AVS environment in the future. (author)

  8. Multi-scale image segmentation method with visual saliency constraints and its application

    Science.gov (United States)

    Chen, Yan; Yu, Jie; Sun, Kaimin

    2018-03-01

    Object-based image analysis methods have many advantages over pixel-based methods, so they are one of the current research hotspots. Obtaining image objects through multi-scale image segmentation is essential for object-based image analysis. The currently popular image segmentation methods mainly share the bottom-up segmentation principle, which is simple to realize, and the object boundaries obtained are accurate. However, the macro statistical characteristics of image regions are difficult to take into account, and fragmented segmentation (or over-segmentation) results are difficult to avoid. In addition, when it comes to information extraction, target recognition and other applications, image targets are not equally important, i.e., some specific targets or target groups with particular features deserve more attention than the others. To avoid the problem of over-segmentation and highlight the targets of interest, this paper proposes a multi-scale image segmentation method with visual saliency constraints. Visual saliency theory and a typical feature extraction method are adopted to obtain the visual saliency information, especially the macroscopic information to be analyzed. The visual saliency information is used as a distribution map of homogeneity weights, where each pixel is given a weight. This weight acts as one of the merging constraints in the multi-scale image segmentation. As a result, pixels that macroscopically belong to the same object but are locally different are more likely to be assigned to the same object. In addition, due to the constraint of the visual saliency model, the controlling effect of local versus macroscopic characteristics can be well regulated during the segmentation process for different objects. These controls improve the completeness of visually salient areas in the segmentation results while diluting the controlling effect for non-salient background areas. Experiments show that this method works

  9. A Spreadsheet-Based Visualized Mindtool for Improving Students' Learning Performance in Identifying Relationships between Numerical Variables

    Science.gov (United States)

    Lai, Chiu-Lin; Hwang, Gwo-Jen

    2015-01-01

    In this study, a spreadsheet-based visualized Mindtool was developed for improving students' learning performance when finding relationships between numerical variables by engaging them in reasoning and decision-making activities. To evaluate the effectiveness of the proposed approach, an experiment was conducted on the "phenomena of climate…

  10. Programmed Control of Optical Grating Scales for Visual Research.

    Science.gov (United States)

    1980-12-01

    Air Force Institute of Technology, Wright-Patterson AFB, OH, School of Engineering. Programmed control of optical grating scales for visual research: a custom system for AMRL. The cost in memory parts alone was $40,000, a good indication that the market is not over-priced.

  11. Vortex locking in direct numerical simulations of quantum turbulence.

    Science.gov (United States)

    Morris, Karla; Koplik, Joel; Rouson, Damian W I

    2008-07-04

    Direct numerical simulations are used to examine the locking of quantized superfluid vortices and normal fluid vorticity in evolving turbulent flows. The superfluid is driven by the normal fluid, which undergoes either a decaying Taylor-Green flow or a linearly forced homogeneous isotropic turbulent flow, although the back reaction of the superfluid on the normal fluid flow is omitted. Using correlation functions and wavelet transforms, we present numerical and visual evidence for vortex locking on length scales above the intervortex spacing.

  12. VisualRank: applying PageRank to large-scale image search.

    Science.gov (United States)

    Jing, Yushi; Baluja, Shumeet

    2008-11-01

    Because of the relative ease in understanding and processing text, commercial image-search systems often rely on techniques that are largely indistinguishable from text-search. Recently, academic studies have demonstrated the effectiveness of employing image-based features to provide alternative or additional signals. However, it remains uncertain whether such techniques will generalize to a large number of popular web queries, and whether the potential improvement to search quality warrants the additional computational cost. In this work, we cast the image-ranking problem into the task of identifying "authority" nodes on an inferred visual similarity graph and propose VisualRank to analyze the visual link structures among images. The images found to be "authorities" are chosen as those that answer the image-queries well. To understand the performance of such an approach in a real system, we conducted a series of large-scale experiments based on the task of retrieving images for 2000 of the most popular products queries. Our experimental results show significant improvement, in terms of user satisfaction and relevancy, in comparison to the most recent Google Image Search results. Maintaining modest computational cost is vital to ensuring that this procedure can be used in practice; we describe the techniques required to make this system practical for large scale deployment in commercial search engines.
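
    The core computation described above is a PageRank-style power iteration over a visual-similarity graph rather than the hyperlink graph. The sketch below illustrates the idea on a small synthetic similarity matrix; it is a simplified stand-in, not the system described in the paper, and the matrix values are arbitrary.

```python
import numpy as np

def visual_rank(S, d=0.85, iters=100):
    """
    PageRank-style ranking on a visual-similarity matrix S (n x n, non-negative).
    Columns are normalised so each image distributes its 'vote' proportionally
    to how similar it is to the others.
    """
    S = np.array(S, dtype=float)
    np.fill_diagonal(S, 0.0)
    col = S / S.sum(axis=0, keepdims=True)     # column-stochastic transition matrix
    n = S.shape[0]
    r = np.full(n, 1.0 / n)
    for _ in range(iters):
        r = (1 - d) / n + d * col @ r          # damped power iteration
    return r

# Toy similarity matrix for 4 images; image 0 resembles all the others (an "authority")
S = [[0.0, 0.9, 0.8, 0.7],
     [0.9, 0.0, 0.2, 0.1],
     [0.8, 0.2, 0.0, 0.3],
     [0.7, 0.1, 0.3, 0.0]]
print(np.argsort(-visual_rank(S)))   # image indices ordered by visual authority
```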

  13. LongLine: Visual Analytics System for Large-scale Audit Logs

    Directory of Open Access Journals (Sweden)

    Seunghoon Yoo

    2018-03-01

    Full Text Available Audit logs are different from other software logs in that they record the most primitive events (i.e., system calls) in modern operating systems. Audit logs contain a detailed trace of an operating system, and thus have received great attention from security experts and system administrators. However, the complexity and size of audit logs, which increase in real time, have hindered analysts from understanding and analyzing them. In this paper, we present a novel visual analytics system, LongLine, which enables interactive visual analyses of large-scale audit logs. LongLine lowers the interpretation barrier of audit logs by employing human-understandable representations (e.g., file paths and commands) instead of abstract indicators of operating systems (e.g., file descriptors), as well as revealing the temporal patterns of the logs in a multi-scale fashion with meaningful granularity of time in mind (e.g., hourly, daily, and weekly). LongLine also streamlines comparative analysis between interesting subsets of logs, which is essential in detecting anomalous behaviors of systems. In addition, LongLine allows analysts to monitor the system state in a streaming fashion, keeping the latency between log creation and visualization less than one minute. Finally, we evaluate our system through a case study and a scenario analysis with security experts.
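
    The multi-scale temporal summaries mentioned above (hourly, daily, weekly) amount to resampling event timestamps at several granularities. The sketch below does this with pandas on synthetic audit-event records; the column names and event mix are placeholders, and this is not LongLine's backend.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(4)

# Synthetic audit events: timestamps plus a human-readable command field
start = pd.Timestamp("2018-01-01")
events = pd.DataFrame({
    "timestamp": start + pd.to_timedelta(rng.integers(0, 14 * 24 * 3600, 20000), unit="s"),
    "command": rng.choice(["sshd", "sudo", "cron", "curl"], size=20000),
}).set_index("timestamp").sort_index()

# Multi-scale views of event volume, as a visual-analytics backend might precompute
hourly = events.resample("H").size()
daily = events.resample("D").size()
weekly = events.resample("W").size()

print(daily.head())
print("peak hour:", hourly.idxmax(), "with", hourly.max(), "events")
```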

  14. Large Scale Visual Recommendations From Street Fashion Images

    OpenAIRE

    Jagadeesh, Vignesh; Piramuthu, Robinson; Bhardwaj, Anurag; Di, Wei; Sundaresan, Neel

    2014-01-01

    We describe a completely automated large scale visual recommendation system for fashion. Our focus is to efficiently harness the availability of large quantities of online fashion images and their rich meta-data. Specifically, we propose four data driven models in the form of Complementary Nearest Neighbor Consensus, Gaussian Mixture Models, Texture Agnostic Retrieval and Markov Chain LDA for solving this problem. We analyze relative merits and pitfalls of these algorithms through extensive e...

  15. Visualization and analysis of flow structures in an open cavity

    Science.gov (United States)

    Liu, Jun; Cai, Jinsheng; Yang, Dangguo; Wu, Junqiang; Wang, Xiansheng

    2018-05-01

    A numerical study is performed on the supersonic flow over an open cavity at a Mach number of 1.5. A newly developed visualization method is employed to visualize the complicated flow structures, which provides insight into the major flow physics. Four types of shock/compressive waves observed in experimental schlieren images are also captured in the numerical visualization results. Furthermore, other flow structures such as multi-scale vortices are also obtained in the numerical results, and a new type of shocklet located beneath large vortices is found. The shocklet beneath a vortex originates from the leading edge and is then strengthened by successive interactions between feedback compressive waves and its attached vortex. Finally, it collides with the trailing surface and generates a large number of feedback compressive waves and intense pressure fluctuations. It is suggested that the shocklets beneath the vortices play an important role in the cavity's self-sustained oscillation.

  16. Exploring Multi-Scale Spatiotemporal Twitter User Mobility Patterns with a Visual-Analytics Approach

    Directory of Open Access Journals (Sweden)

    Junjun Yin

    2016-10-01

    Full Text Available Understanding human mobility patterns is of great importance for urban planning, traffic management, and even marketing campaigns. However, the capability of capturing detailed human movements with fine-grained spatial and temporal granularity is still limited. In this study, we extracted high-resolution mobility data from a collection of over 1.3 billion geo-located Twitter messages. In contrast to datasets that raise concerns of infringement on individual privacy, such as mobile phone call records with restricted access, this dataset was collected from publicly accessible Twitter data streams. In this paper, we employed a visual-analytics approach to studying multi-scale spatiotemporal Twitter user mobility patterns in the contiguous United States during the year 2014. Our approach included a scalable visual-analytics framework to deliver efficiency and scalability in filtering large volumes of geo-located tweets, modeling and extracting Twitter user movements, generating space-time user trajectories, and summarizing multi-scale spatiotemporal user mobility patterns. We performed a set of statistical analyses to understand Twitter user mobility patterns across multi-level spatial scales and temporal ranges. In particular, Twitter user mobility patterns measured by the displacements and radii of gyration of individuals revealed multi-scale or multi-modal Twitter user mobility patterns. By further studying such mobility patterns in different temporal ranges, we identified both consistency and seasonal fluctuations regarding the distance decay effects in the corresponding mobility patterns. At the same time, our approach provides a geo-visualization unit with an interactive 3D virtual globe web mapping interface for exploratory geo-visual analytics of the multi-level spatiotemporal Twitter user movements.
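
    The radius-of-gyration measure used above is computed per user from the sequence of visited locations. The sketch below uses a simple local equirectangular approximation for distances (adequate over short ranges; a production pipeline would more likely use haversine distances); the toy coordinates are placeholders, not Twitter data.

```python
import numpy as np

EARTH_R_KM = 6371.0

def radius_of_gyration(lats, lons):
    """
    Radius of gyration (km) of one user's visited locations:
    r_g = sqrt( mean( distance(point_i, centroid)^2 ) ).
    Uses a local equirectangular projection around the mean location.
    """
    lats, lons = np.radians(lats), np.radians(lons)
    lat_c, lon_c = lats.mean(), lons.mean()
    x = (lons - lon_c) * np.cos(lat_c) * EARTH_R_KM
    y = (lats - lat_c) * EARTH_R_KM
    return np.sqrt(np.mean(x**2 + y**2))

# Toy trajectory: a commuter moving between two neighbourhoods (illustrative only)
lats = np.array([40.7128, 40.7306, 40.7128, 40.7484, 40.7128])
lons = np.array([-74.0060, -73.9866, -74.0060, -73.9857, -74.0060])
print(f"radius of gyration ~ {radius_of_gyration(lats, lons):.2f} km")
```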

  17. An Experimental-Numerical Study of Small Scale Flow Interaction with Bioluminescent Plankton

    National Research Council Canada - National Science Library

    Latz, Michael

    1998-01-01

    Numerical and experimental approaches were used to investigate the effects of quantified flow stimuli on bioluminescence stimulation at the small length and time scales appropriate for individual plankton...

  18. Developing Local Scale, High Resolution, Data to Interface with Numerical Hurricane Models

    Science.gov (United States)

    Witkop, R.; Becker, A.

    2017-12-01

    In 2017, the University of Rhode Island's (URI's) Graduate School of Oceanography (GSO) developed hurricane models that specify wind speed, inundation, and erosion around Rhode Island with enough precision to incorporate impacts on individual facilities. At the same time, URI's Marine Affairs Visualization Lab (MAVL) developed a way to realistically visualize these impacts in 3-D. Since climate change visualizations and water resource simulations have been shown to promote resiliency action (Sheppard, 2015) and increase credibility (White et al., 2010) when local knowledge is incorporated, URI's hurricane models and visualizations may also more effectively enable hurricane resilience actions if they include Facility Manager (FM) and Emergency Manager (EM) perceived hurricane impacts. This study determines how FMs and EMs perceive their assets as being vulnerable to quantifiable hurricane-related forces at the individual facility scale, while exploring methods to elicit this information from FMs and EMs in a format usable for incorporation into URI GSO's hurricane models.

  19. Visual analysis of inter-process communication for large-scale parallel computing.

    Science.gov (United States)

    Muelder, Chris; Gygi, Francois; Ma, Kwan-Liu

    2009-01-01

    In serial computation, program profiling is often helpful for optimization of key sections of code. When moving to parallel computation, not only does the code execution need to be considered but also communication between the different processes which can induce delays that are detrimental to performance. As the number of processes increases, so does the impact of the communication delays on performance. For large-scale parallel applications, it is critical to understand how the communication impacts performance in order to make the code more efficient. There are several tools available for visualizing program execution and communications on parallel systems. These tools generally provide either views which statistically summarize the entire program execution or process-centric views. However, process-centric visualizations do not scale well as the number of processes gets very large. In particular, the most common representation of parallel processes is a Gantt chart with a row for each process. As the number of processes increases, these charts can become difficult to work with and can even exceed screen resolution. We propose a new visualization approach that affords more scalability and then demonstrate it on systems running with up to 16,384 processes.

  20. Numerical simulation of lubrication mechanisms at mesoscopic scale

    DEFF Research Database (Denmark)

    Hubert, C.; Bay, Niels; Christiansen, Peter

    2011-01-01

    The mechanisms of liquid lubrication in metal forming are studied at a mesoscopic scale, adopting a 2D sequential fluid-solid weak coupling approach earlier developed in the first author's laboratory. This approach involves two computation steps. The first one is a fully coupled fluid-structure F... of pyramidal indentations. The tests are performed with variable reduction and drawing speed under controlled front and back tension forces. Visual observations through a transparent die of the fluid entrapment and escape from the cavities using a CCD camera show the mechanisms of MicroPlastoHydroDynamic Lubrication (MPHDL) as well as cavity shrinkage due to lubricant compression and escape and strip deformation...

  1. Aerosol numerical modelling at local scale

    International Nuclear Information System (INIS)

    Albriet, Bastien

    2007-01-01

    At the local scale and in urban areas, an important part of particulate pollution is due to traffic, which contributes largely to the high number concentrations observed. Two aerosol sources are mainly linked to traffic: primary emission of soot particles and secondary nanoparticle formation by nucleation. The emissions and mechanisms leading to the formation of such a bimodal distribution are still poorly understood. In this thesis, we address this problem through numerical modelling. The Modal Aerosol Model MAM is used, coupled with two 3D codes: a CFD code (Mercure Saturne) and a CTM (Polair3D). A sensitivity analysis is performed, at the roadside but also in the first meters of an exhaust plume, to identify the role of each process involved and the sensitivity of the different parameters used in the modelling. (author) [fr

  2. ProteoLens: a visual analytic tool for multi-scale database-driven biological network data mining.

    Science.gov (United States)

    Huan, Tianxiao; Sivachenko, Andrey Y; Harrison, Scott H; Chen, Jake Y

    2008-08-12

    New systems biology studies require researchers to understand how interplay among myriads of biomolecular entities is orchestrated in order to achieve high-level cellular and physiological functions. Many software tools have been developed in the past decade to help researchers visually navigate large networks of biomolecular interactions with built-in template-based query capabilities. To further advance researchers' ability to interrogate global physiological states of cells through multi-scale visual network explorations, new visualization software tools still need to be developed to empower the analysis. A robust visual data analysis platform, driven by database management systems and able to perform bi-directional data processing-to-visualization with declarative querying capabilities, is needed. We developed ProteoLens as a Java-based visual analytic software tool for creating, annotating and exploring multi-scale biological networks. It supports direct database connectivity to either Oracle or PostgreSQL database tables/views, on which SQL statements using both Data Definition Language (DDL) and Data Manipulation Language (DML) may be specified. The robust query languages embedded directly within the visualization software help users to bring their network data into a visualization context for annotation and exploration. ProteoLens supports graph/network-represented data in standard Graph Modeling Language (GML) formats, which enables interoperation with a wide range of other visual layout tools. The architectural design of ProteoLens decouples complex network data visualization tasks into two distinct phases: 1) creating network data association rules, which are mapping rules between network node IDs or edge IDs and data attributes such as functional annotations, expression levels, scores, synonyms, descriptions etc.; 2) applying network data association rules to build the network and perform the visual annotation of graph nodes and edges

  3. Valuing Treatments for Parkinson Disease Incorporating Process Utility: Performance of Best-Worst Scaling, Time Trade-Off, and Visual Analogue Scales

    NARCIS (Netherlands)

    Weernink, Marieke Geertruida Maria; Groothuis-Oudshoorn, Catharina Gerarda Maria; IJzerman, Maarten Joost; van Til, Janine Astrid

    2016-01-01

    Objective The objective of this study was to compare treatment profiles including both health outcomes and process characteristics in Parkinson disease using best-worst scaling (BWS), time trade-off (TTO), and visual analogue scales (VAS). Methods From the model comprising seven attributes with

  4. Statistical inference and visualization in scale-space for spatially dependent images

    KAUST Repository

    Vaughan, Amy; Jun, Mikyoung; Park, Cheolwoo

    2012-01-01

    SiZer (SIgnificant ZERo crossing of the derivatives) is a graphical scale-space visualization tool that allows for statistical inferences. In this paper we develop a spatial SiZer for finding significant features and conducting goodness-of-fit tests

  5. AdjScales: Visualizing Differences between Adjectives for Language Learners

    Science.gov (United States)

    Sheinman, Vera; Tokunaga, Takenobu

    In this study we introduce AdjScales, a method for scaling similar adjectives by their strength. It combines existing Web-based computational linguistic techniques in order to automatically differentiate between similar adjectives that describe the same property by strength. Although this kind of information is rarely present in most lexical resources and dictionaries, it may be useful for language learners who try to distinguish between similar words. Additionally, learners might gain from a simple visualization of these differences using unidimensional scales. The method is evaluated by comparison with annotation of a subset of adjectives from WordNet by four native English speakers. It is also compared against two non-native speakers of English. The collected annotation is an interesting resource in its own right. This work is a first step toward automatic differentiation of meaning between similar words for language learners. AdjScales can be useful for lexical resource enhancement.

  6. Visual attention mitigates information loss in small- and large-scale neural codes

    Science.gov (United States)

    Sprague, Thomas C; Saproo, Sameer; Serences, John T

    2015-01-01

    Summary The visual system transforms complex inputs into robust and parsimonious neural codes that efficiently guide behavior. Because neural communication is stochastic, the amount of encoded visual information necessarily decreases with each synapse. This constraint requires processing sensory signals in a manner that protects information about relevant stimuli from degradation. Such selective processing – or selective attention – is implemented via several mechanisms, including neural gain and changes in tuning properties. However, examining each of these effects in isolation obscures their joint impact on the fidelity of stimulus feature representations by large-scale population codes. Instead, large-scale activity patterns can be used to reconstruct representations of relevant and irrelevant stimuli, providing a holistic understanding about how neuron-level modulations collectively impact stimulus encoding. PMID:25769502

  7. Correlation of MRI Visual Scales with Neuropsychological Profile in Mild Cognitive Impairment of Parkinson’s Disease

    Directory of Open Access Journals (Sweden)

    Luiz Felipe Vasconcellos

    2017-01-01

    Full Text Available Few studies have evaluated magnetic resonance imaging (MRI) visual scales in Parkinson's disease-Mild Cognitive Impairment (PD-MCI). We selected 79 PD patients and 92 controls (CO) to perform neurologic and neuropsychological evaluation. Brain MRI was performed to evaluate the following scales: Global Cortical Atrophy (GCA), Fazekas, and medial temporal atrophy (MTA). The analysis revealed that both PD groups (amnestic and nonamnestic) showed worse performance on several tests when compared to CO. Memory, executive function, and attention impairment were more severe in the amnestic PD-MCI group. Overall analysis of the frequency of MRI visual scales by MCI subtype did not reveal any statistically significant result. A statistically significant inverse correlation was observed between the GCA scale and the Mini-Mental Status Examination (MMSE), Montreal Cognitive Assessment (MoCA), semantic verbal fluency, Stroop test, figure memory test, trail making test (TMT B), and Rey Auditory Verbal Learning Test (RAVLT). The MTA scale correlated with the Stroop test, and the Fazekas scale with the figure memory test, digit span, and Stroop test, according to the subgroup evaluated. Visual scales by MRI in MCI should be evaluated by cognitive domain and might be more useful in more severely impaired MCI or dementia patients.

  8. Investigating measurement equivalence of visual analogue scales and Likert-type scales in Internet-based personality questionnaires.

    Science.gov (United States)

    Kuhlmann, Tim; Dantlgraber, Michael; Reips, Ulf-Dietrich

    2017-12-01

    Visual analogue scales (VASs) have shown superior measurement qualities in comparison to traditional Likert-type response scales in previous studies. The present study expands the comparison of response scales to properties of Internet-based personality scales in a within-subjects design. A sample of 879 participants filled out an online questionnaire measuring Conscientiousness, Excitement Seeking, and Narcissism. The questionnaire contained all instruments in both answer scale versions in a counterbalanced design. Results show comparable reliabilities, means, and SDs for the VAS versions of the original scales in comparison to Likert-type scales. To assess the validity of the measurements, age and gender were used as criteria, because all three constructs have shown non-zero correlations with age and gender in previous research. Both response scales showed a high overlap and the proposed relationships with age and gender. The associations were largely identical, with the exception of an increase in explained variance when predicting age from the VAS version of Excitement Seeking (B10 = 1318.95, ΔR² = .025). VASs showed similar properties to Likert-type response scales in most cases.

  9. Learning about the scale of the solar system using digital planetarium visualizations

    Science.gov (United States)

    Yu, Ka Chun; Sahami, Kamran; Dove, James

    2017-07-01

    We studied the use of a digital planetarium for teaching relative distances and sizes in introductory undergraduate astronomy classes. Inspired in part by the classic short film The Powers of Ten and large physical scale models of the Solar System that can be explored on foot, we created lectures using virtual versions of these two pedagogical approaches for classes that saw either an immersive treatment in the planetarium or a non-immersive version in the regular classroom (with N = 973 students participating in total). Students who visited the planetarium had not only the greatest learning gains, but their performance increased with time, whereas students who saw the same visuals projected onto a flat display in their classroom showed less retention over time. The gains seen in the students who visited the planetarium reveal that this medium is a powerful tool for visualizing scale over multiple orders of magnitude. However the modest gains for the students in the regular classroom also show the utility of these visualization approaches for the broader category of classroom physics simulations.

  10. ACTIVIS: Visual Exploration of Industry-Scale Deep Neural Network Models.

    Science.gov (United States)

    Kahng, Minsuk; Andrews, Pierre Y; Kalro, Aditya; Polo Chau, Duen Horng

    2017-08-30

    While deep learning models have achieved state-of-the-art accuracies for many prediction tasks, understanding these models remains a challenge. Despite the recent interest in developing visual tools to help users interpret deep learning models, the complexity and wide variety of models deployed in industry, and the large-scale datasets that they used, pose unique design challenges that are inadequately addressed by existing work. Through participatory design sessions with over 15 researchers and engineers at Facebook, we have developed, deployed, and iteratively improved ACTIVIS, an interactive visualization system for interpreting large-scale deep learning models and results. By tightly integrating multiple coordinated views, such as a computation graph overview of the model architecture, and a neuron activation view for pattern discovery and comparison, users can explore complex deep neural network models at both the instance- and subset-level. ACTIVIS has been deployed on Facebook's machine learning platform. We present case studies with Facebook researchers and engineers, and usage scenarios of how ACTIVIS may work with different models.

  11. Numerical Investigation of Multiple-, Interacting-Scale Variable-Density Ground Water Flow Systems

    Science.gov (United States)

    Cosler, D.; Ibaraki, M.

    2004-12-01

    The goal of our study is to elucidate the nonlinear processes that are important for multiple-, interacting-scale flow and solute transport in subsurface environments. In particular, we are focusing on the influence of small-scale instability development on variable-density ground water flow behavior in large-scale systems. Convective mixing caused by these instabilities may mix the fluids to a greater extent than would be the case with classical, Fickian dispersion. Most current numerical schemes for interpreting field-scale variable-density flow systems do not explicitly account for the complexities caused by small-scale instabilities and treat such processes as "lumped" Fickian dispersive mixing. Such approaches may greatly underestimate the mixing behavior and misrepresent the overall large-scale flow field dynamics. The specific objectives of our study are: (i) to develop an adaptive (spatial and temporal scales) three-dimensional numerical model that is fully capable of simulating field-scale variable-density flow systems with fine resolution (~1 cm); and (ii) to evaluate the importance of scale-dependent process interactions by performing a series of simulations on different problem scales ranging from laboratory experiments to field settings, including an aquifer storage and freshwater recovery (ASR) system similar to those planned for the Florida Everglades and in-situ contaminant remediation systems. We are examining (1) methods to create instabilities in field-scale systems, (2) porous media heterogeneity effects, and (3) the relation between heterogeneity characteristics (e.g., permeability variance and correlation length scales) and the mixing scales that develop for varying degrees of unstable stratification. Applications of our work include the design of new water supply and conservation measures (e.g., ASR systems), assessment of saltwater intrusion problems in coastal aquifers, and the design of in-situ remediation systems for aquifer restoration

  12. The development and validation of the Visual Analogue Self-Esteem Scale (VASES).

    Science.gov (United States)

    Brumfitt, S M; Sheeran, P

    1999-11-01

    To develop a visual analogue measure of self-esteem and test its psychometric properties. Two correlational studies involving samples of university students and aphasic speakers. Two hundred and forty-three university students completed multiple measures of self-esteem, depression and anxiety, as well as measures of transitory mood and social desirability (Study 1). Two samples of aphasic speakers (N = 14 and N = 20) completed the Visual Analogue Self-Esteem Scale (VASES), the Rosenberg (1965) self-esteem scale and measures of depression and anxiety (Study 2). Study 1 found evidence of good internal and test-retest reliability, construct validity and convergent and discriminant validity for a 10-item VASES. Study 2 demonstrated good internal reliability among aphasic speakers. The VASES is a short and easy-to-administer measure of self-esteem that possesses good psychometric properties.

  13. Visual attention mitigates information loss in small- and large-scale neural codes.

    Science.gov (United States)

    Sprague, Thomas C; Saproo, Sameer; Serences, John T

    2015-04-01

    The visual system transforms complex inputs into robust and parsimonious neural codes that efficiently guide behavior. Because neural communication is stochastic, the amount of encoded visual information necessarily decreases with each synapse. This constraint requires that sensory signals are processed in a manner that protects information about relevant stimuli from degradation. Such selective processing--or selective attention--is implemented via several mechanisms, including neural gain and changes in tuning properties. However, examining each of these effects in isolation obscures their joint impact on the fidelity of stimulus feature representations by large-scale population codes. Instead, large-scale activity patterns can be used to reconstruct representations of relevant and irrelevant stimuli, thereby providing a holistic understanding about how neuron-level modulations collectively impact stimulus encoding. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Exploiting multi-scale parallelism for large scale numerical modelling of laser wakefield accelerators

    International Nuclear Information System (INIS)

    Fonseca, R A; Vieira, J; Silva, L O; Fiuza, F; Davidson, A; Tsung, F S; Mori, W B

    2013-01-01

    A new generation of laser wakefield accelerators (LWFA), supported by the extreme accelerating fields generated in the interaction of PW-class lasers and underdense targets, promises the production of high quality electron beams in short distances for multiple applications. Achieving this goal will rely heavily on numerical modelling to further understand the underlying physics and identify optimal regimes, but large scale modelling of these scenarios is computationally heavy and requires the efficient use of state-of-the-art petascale supercomputing systems. We discuss the main difficulties involved in running these simulations and the new developments implemented in the OSIRIS framework to address these issues, ranging from multi-dimensional dynamic load balancing and hybrid distributed/shared memory parallelism to the vectorization of the PIC algorithm. We present the results of the OASCR Joule Metric program on the issue of large scale modelling of LWFA, demonstrating speedups of over 1 order of magnitude on the same hardware. Finally, scalability to over ∼10^6 cores and sustained performance over ∼2 PFlops is demonstrated, opening the way for large scale modelling of LWFA scenarios. (paper)

  15. Scale-adaptive Local Patches for Robust Visual Object Tracking

    Directory of Open Access Journals (Sweden)

    Kang Sun

    2014-04-01

    Full Text Available This paper discusses the problem of robustly tracking objects that undergo rapid and dramatic scale changes. To remove the weakness of global appearance models, we present a novel scheme that combines the object's global and local appearance features. The local feature is a set of local patches that geometrically constrain the changes in the target's appearance. In order to adapt to the object's geometric deformation, the local patches can be removed and added online. The addition of these patches is constrained by global features such as color, texture and motion. The global visual features are updated via the stable local patches during tracking. To deal with scale changes, we adapt the scale of the patches in addition to adapting the object bounding box. We evaluate our method by comparing it to several state-of-the-art trackers on publicly available datasets. The experimental results on challenging sequences confirm that, by using these scale-adaptive local patches and global properties, our tracker outperforms the related trackers in many cases, with a smaller failure rate as well as better accuracy.

  16. Multi-Scale Analysis for Characterizing Near-Field Constituent Concentrations in the Context of a Macro-Scale Semi-Lagrangian Numerical Model

    Science.gov (United States)

    Yearsley, J. R.

    2017-12-01

    The semi-Lagrangian numerical scheme employed by RBM, a model for simulating time-dependent, one-dimensional water quality constituents in advection-dominated rivers, is highly scalable both in time and space. Although the model has been used at length scales of 150 meters and time scales of three hours, the majority of applications have been at length scales of 1/16th degree latitude/longitude (about 5 km) or greater and time scales of one day. Applications of the method at these scales have proven successful for characterizing the impacts of climate change on water temperatures in global rivers and the vulnerability of thermoelectric power plants to changes in cooling water temperatures in large river systems. However, local effects can be very important in terms of ecosystem impacts, particularly in the case of developing mixing zones for wastewater discharges with pollutant loadings limited by regulations imposed by the Federal Water Pollution Control Act (FWPCA). Mixing zone analyses have usually been decoupled from large-scale watershed influences by developing scenarios that represent critical conditions for external processes associated with streamflow and weather. By taking advantage of the particle-tracking characteristics of the numerical scheme, RBM can provide results at any point in time within the model domain. We develop a proof of concept for locations in the river network where local impacts such as mixing zones may be important. Simulated results from the semi-Lagrangian numerical scheme are treated as input to a finite difference model of the two-dimensional diffusion equation for water quality constituents such as water temperature or toxic substances. Simulations will provide time-dependent, two-dimensional constituent concentrations in the near field in response to long-term basin-wide processes. These results could provide decision support to water quality managers for evaluating mixing zone characteristics.
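
    To make the near-field coupling idea concrete, the minimal sketch below advances a two-dimensional diffusion equation with an explicit finite-difference step, holding the boundary cells fixed as if they were supplied by the basin-scale semi-Lagrangian solution. It is a generic illustration under assumed grid spacing and diffusivity values, not the scheme used in RBM or in the study.

      import numpy as np

      def diffuse_step(c, D, dx, dy, dt):
          """One explicit (FTCS) update of dc/dt = D*(d2c/dx2 + d2c/dy2) on the
          interior grid; boundary cells are left untouched, i.e. held at values
          that would come from the basin-scale solution."""
          cn = c.copy()
          cn[1:-1, 1:-1] = c[1:-1, 1:-1] + D * dt * (
              (c[2:, 1:-1] - 2 * c[1:-1, 1:-1] + c[:-2, 1:-1]) / dx**2
              + (c[1:-1, 2:] - 2 * c[1:-1, 1:-1] + c[1:-1, :-2]) / dy**2
          )
          return cn

      # Explicit stability requires dt <= 1 / (2*D*(1/dx**2 + 1/dy**2)).
      D, dx, dy = 1.0e-2, 1.0, 1.0                      # assumed values
      dt = 0.9 / (2 * D * (1 / dx**2 + 1 / dy**2))
      c = np.zeros((50, 50)); c[25, 25] = 1.0           # point release in the near field
      for _ in range(100):
          c = diffuse_step(c, D, dx, dy, dt)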

  17. Historical perspective on the use of visual grading scales in evaluating skin irritation and sensitization

    DEFF Research Database (Denmark)

    Farage, Miranda A; Maibach, Howard I; Andersen, Klaus Ejner

    2011-01-01

    quality in comparison with current testing methods that rely on visual assessment. In addition, such measuring techniques can add considerably to the complexity of testing protocols. When benefits and cost are weighed in the balance, the visual assessment scales popularized by Draize and others remain...

  18. Visual Data-Analytics of Large-Scale Parallel Discrete-Event Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Ross, Caitlin; Carothers, Christopher D.; Mubarak, Misbah; Carns, Philip; Ross, Robert; Li, Jianping Kelvin; Ma, Kwan-Liu

    2016-11-13

    Parallel discrete-event simulation (PDES) is an important tool in the codesign of extreme-scale systems because PDES provides a cost-effective way to evaluate designs of high-performance computing systems. Optimistic synchronization algorithms for PDES, such as Time Warp, allow events to be processed without global synchronization among the processing elements. A rollback mechanism is provided when events are processed out of timestamp order. Although optimistic synchronization protocols enable the scalability of large-scale PDES, the performance of the simulations must be tuned to reduce the number of rollbacks and provide an improved simulation runtime. To enable efficient large-scale optimistic simulations, one has to gain insight into the factors that affect the rollback behavior and simulation performance. We developed a tool for ROSS model developers that gives them detailed metrics on the performance of their large-scale optimistic simulations at varying levels of simulation granularity. Model developers can use this information for parameter tuning of optimistic simulations in order to achieve better runtime and fewer rollbacks. In this work, we instrument the ROSS optimistic PDES framework to gather detailed statistics about the simulation engine. We have also developed an interactive visualization interface that uses the data collected by the ROSS instrumentation to understand the underlying behavior of the simulation engine. The interface connects real time to virtual time in the simulation and provides the ability to view simulation data at different granularities. We demonstrate the usefulness of our framework by performing a visual analysis of the dragonfly network topology model provided by the CODES simulation framework built on top of ROSS. The instrumentation needs to minimize overhead in order to accurately collect data about the simulation performance. To ensure that the instrumentation does not introduce unnecessary overhead, we perform a

  19. Using a familiar risk comparison within a risk ladder to improve risk understanding by low numerates: a study of visual attention.

    Science.gov (United States)

    Keller, Carmen

    2011-07-01

    Previous experimental research provides evidence that a familiar risk comparison within a risk ladder is understood by low- and high-numerate individuals. It especially helps low numerates to better evaluate risk. In the present study, an eye tracker was used to capture individuals' visual attention to a familiar risk comparison, such as the risk associated with smoking. Two parameters of information processing, efficiency and level, were derived from visual attention. A random sample of participants from the general population (N = 68) interpreted a given risk level with the help of the risk ladder. Numeracy was negatively correlated with overall visual attention on the risk ladder (r(s) = -0.28, p = 0.01), indicating that the lower the numeracy, the more time spent looking at the whole risk ladder. Numeracy was positively correlated with the efficiency of processing the relevant frequency (r(s) = 0.34, p ...) ... improving risk communication formats. © 2011 Society for Risk Analysis.

  20. Numerical assessment of the ion turbulent thermal transport scaling laws

    International Nuclear Information System (INIS)

    Ottaviani, M.; Manfredi, G.

    2001-01-01

    Numerical simulations of ion temperature gradient (ITG) driven turbulence were carried out to investigate the parametric dependence of the ion thermal transport on the reduced gyroradius and on the local safety factor. Whereas the simulations show a clear proportionality of the conductivity to the gyroradius, the dependence on the safety factor cannot be represented as a simple power law like the one exhibited by the empirical scaling laws. (author)

  1. Improving the seismic small-scale modelling by comparison with numerical methods

    Science.gov (United States)

    Pageot, Damien; Leparoux, Donatienne; Le Feuvre, Mathieu; Durand, Olivier; Côte, Philippe; Capdeville, Yann

    2017-10-01

    Experimental seismic modelling at reduced scale provides an intermediate step between numerical tests and geophysical campaigns on field sites. Recent technologies such as laser interferometers offer the opportunity to acquire data without any coupling effects. This kind of device is used in the Mesures Ultrasonores Sans Contact (MUSC) measurement bench, whose automated support system makes it possible to generate multisource and multireceiver seismic data at laboratory scale. Experimental seismic modelling would become a valuable tool providing a value-added stage in the validation of imaging processes if (1) the experimental measurement chain is perfectly mastered, so that the experimental data are perfectly reproducible with a numerical tool, and (2) the effective source is reproducible along the measurement setup. These aspects of a quantitative validation for devices with piezoelectric sources and a laser interferometer have not yet been quantitatively studied in the published literature. Thus, as a new stage in the experimental modelling approach, these two key issues are tackled in this paper in order to precisely define the quality of the experimental small-scale data provided by the MUSC bench, which are available to the scientific community. These two validation steps are treated independently of any imaging technique, so that geophysicists who want to use such data (delivered as free data) can know their quality precisely before testing any imaging technique. First, in order to overcome the 2-D/3-D correction usually applied in seismic processing when comparing 2-D numerical data with 3-D experimental measurements, we quantitatively refined the comparison between numerical and experimental data by generating accurate experimental line sources, avoiding the need for a geometrical spreading correction of 3-D point-source data. The comparison with 2-D and 3-D numerical modelling is based on

  2. Systematic study of the effects of scaling techniques in numerical simulations with application to enhanced geothermal systems

    Science.gov (United States)

    Heinze, Thomas; Jansen, Gunnar; Galvan, Boris; Miller, Stephen A.

    2016-04-01

    Numerical modeling is a well established tool in rock mechanics studies investigating a wide range of problems. In particular, a realistic rock-mechanical model is needed for estimating the seismic risk of a geothermal energy plant. To simulate a time-evolving system, two different approaches need to be distinguished: implicit methods for solving linear equations are unconditionally stable, while explicit methods are limited by the time step. However, explicit methods are often preferred because of their limited memory demand, their scalability in parallel computing, and the simple implementation of complex boundary conditions. In numerical modeling of explicit elastoplastic dynamics the time step is limited by the rock density. Mass scaling techniques, which increase the rock density artificially by several orders of magnitude, can be used to overcome this limit and significantly reduce computation time. In the context of geothermal energy this is of great interest because, in a coupled hydro-mechanical model, the time step of the mechanical part is significantly smaller than that of the fluid flow. Mass scaling can also be combined with time scaling, which increases the rate of physical processes, assuming that the processes are rate independent. While often used, the effect of mass and time scaling and how it may influence the numerical results is rarely mentioned in publications, and choosing the right scaling technique is typically done by trial and error. Scaling techniques are also often used in commercial software packages, hidden from the untrained user. To our knowledge, no systematic studies have addressed how mass scaling might affect the numerical results. In this work, we present results from an extensive and systematic study of the influence of mass and time scaling on the behavior of a variety of rock-mechanical models. We employ a finite difference scheme to model uniaxial and biaxial compression experiments using different mass and time scaling factors, and with physical models
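
    For readers unfamiliar with why mass scaling relaxes the explicit time-step limit, the short sketch below estimates the critical time step from the one-dimensional bar wave speed c = sqrt(E/rho); multiplying the density by a factor f therefore enlarges the stable step by sqrt(f). The material values and the simple wave-speed formula are illustrative assumptions, not the study's actual model.

      import math

      def critical_time_step(length_min, youngs_modulus, density, mass_scale=1.0):
          """dt ~ L_min / c with the 1-D bar wave speed c = sqrt(E / rho);
          mass scaling multiplies the density by `mass_scale`."""
          wave_speed = math.sqrt(youngs_modulus / (density * mass_scale))
          return length_min / wave_speed

      # Assumed granite-like properties: E = 50 GPa, rho = 2700 kg/m3, smallest element 1 cm.
      dt_unscaled = critical_time_step(0.01, 50e9, 2700.0)
      dt_scaled = critical_time_step(0.01, 50e9, 2700.0, mass_scale=1e4)
      print(dt_unscaled, dt_scaled, dt_scaled / dt_unscaled)  # ratio = sqrt(1e4) = 100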

  3. Formulation and numerical implementation of micro-scale boundary conditions for particle aggregates

    NARCIS (Netherlands)

    Liu, J.; Bosco, E.; Suiker, A.S.J.

    2017-01-01

    Novel numerical algorithms are presented for the implementation of micro-scale boundary conditions of particle aggregates modelled with the discrete element method. The algorithms are based on a servo-control methodology, using a feedback principle comparable to that of algorithms commonly applied

  4. Network-state modulation of power-law frequency-scaling in visual cortical neurons.

    Science.gov (United States)

    El Boustani, Sami; Marre, Olivier; Béhuret, Sébastien; Baudot, Pierre; Yger, Pierre; Bal, Thierry; Destexhe, Alain; Frégnac, Yves

    2009-09-01

    Various types of neural-based signals, such as EEG, local field potentials and intracellular synaptic potentials, integrate multiple sources of activity distributed across large assemblies. They have in common a power-law frequency-scaling structure at high frequencies, but it is still unclear whether this scaling property is dominated by intrinsic neuronal properties or by network activity. The latter case is particularly interesting because if frequency-scaling reflects the network state it could be used to characterize the functional impact of the connectivity. In intracellularly recorded neurons of cat primary visual cortex in vivo, the power spectral density of V(m) activity displays a power-law structure at high frequencies with a fractional scaling exponent. We show that this exponent is not constant, but depends on the visual statistics used to drive the network. To investigate the determinants of this frequency-scaling, we considered a generic recurrent model of cortex receiving a retinotopically organized external input. Similarly to the in vivo case, our in computo simulations show that the scaling exponent reflects the correlation level imposed in the input. This systematic dependence was also replicated at the single cell level, by controlling independently, in a parametric way, the strength and the temporal decay of the pairwise correlation between presynaptic inputs. This last model was implemented in vitro by imposing the correlation control in artificial presynaptic spike trains through dynamic-clamp techniques. These in vitro manipulations induced a modulation of the scaling exponent, similar to that observed in vivo and predicted in computo. We conclude that the frequency-scaling exponent of the V(m) reflects stimulus-driven correlations in the cortical network activity. Therefore, we propose that the scaling exponent could be used to read-out the "effective" connectivity responsible for the dynamical signature of the population signals measured
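
    A common way to obtain such a frequency-scaling exponent is to fit a straight line to the high-frequency part of the power spectral density on log-log axes, as in the sketch below using Welch's method. The frequency band, sampling rate and synthetic test signal are assumptions for illustration and do not reproduce the study's analysis.

      import numpy as np
      from scipy.signal import welch

      def scaling_exponent(vm, fs, fmin=75.0, fmax=200.0):
          """Slope of log10(PSD) versus log10(f) over an assumed high-frequency band."""
          f, pxx = welch(vm, fs=fs, nperseg=int(4 * fs))
          band = (f >= fmin) & (f <= fmax)
          slope, _ = np.polyfit(np.log10(f[band]), np.log10(pxx[band]), 1)
          return slope

      # Synthetic check: Brownian noise has a ~1/f^2 spectrum, so the exponent should be near -2.
      rng = np.random.default_rng(0)
      fs = 1000.0
      vm = np.cumsum(rng.standard_normal(int(60 * fs)))
      print(scaling_exponent(vm, fs))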

  5. Numerical and Experimental Investigation of Cavitating Characteristics in Centrifugal Pump with Gap Impeller

    Science.gov (United States)

    Zhu, Bing; Chen, Hongxun; Wei, Qun

    2014-06-01

    This paper studies the cavitating characteristics of a low-specific-speed centrifugal pump with a gap-structure impeller experimentally and numerically. A scalable DES (SDES) numerical method is proposed and developed by introducing the von Karman scale instead of the local grid scale, which allows a smooth and reasonable switch at the interface between the RANS and LES regions. The SDES method can detect and capture unsteady flow structures at different scales, as demonstrated by the flow around a triangular prism and the cavitating flow in a centrifugal pump. Through numerical and experimental research, it is shown that the simulated results match the measured cavitation performance and visualization patterns qualitatively, and we can conclude that the gap-structure impeller has a superior cavitation-suppression capability. Its mechanism may be the flow-guiding effect of the small vice blade and the pressure auto-balance effect of the gap tunnel.

  6. XVIS: Visualization for the Extreme-Scale Scientific-Computation Ecosystem Final Scientific/Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Geveci, Berk [Kitware, Inc., Clifton Park, NY (United States); Maynard, Robert [Kitware, Inc., Clifton Park, NY (United States)

    2017-10-27

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. The XVis project brought together collaborators from the predominant DOE projects for visualization on accelerators and combined their respective features into a new visualization toolkit called VTK-m.

  7. a Novel Ship Detection Method for Large-Scale Optical Satellite Images Based on Visual Lbp Feature and Visual Attention Model

    Science.gov (United States)

    Haigang, Sui; Zhina, Song

    2016-06-01

    Reliable ship detection in optical satellite images has wide application in both military and civil fields. However, this problem is very difficult in complex backgrounds, such as waves, clouds, and small islands. Aiming at these issues, this paper explores an automatic and robust model for ship detection in large-scale optical satellite images, which relies on detecting statistical signatures of ship targets in terms of biologically-inspired visual features. This model first selects salient candidate regions across large-scale images by using a mechanism based on biologically-inspired visual features, combining a visual attention model with local binary patterns (CVLBP). Different from traditional studies, the proposed algorithm is fast and helps focus on the suspected ship areas, avoiding a separate land-sea segmentation step. Large-area images are cut into small image chips and analyzed in two complementary ways: sparse saliency using a visual attention model and detailed signatures using LBP features, in accordance with the sparseness of the ship distribution in the images. These features are then employed to classify each chip as containing ship targets or not, using a support vector machine (SVM). After the suspicious areas are obtained, some false alarms such as micro-waves and small ribbon clouds remain, so simple shape and texture analyses are adopted to distinguish ships from non-ships in the suspicious areas. Experimental results show the proposed method is insensitive to waves, clouds, illumination and ship size.
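
    As a rough illustration of the LBP-plus-SVM classification stage described above (not the authors' CVLBP model), the sketch below extracts uniform local-binary-pattern histograms from candidate image chips and trains a support vector machine on them; the chips and labels are random placeholders standing in for the output of the saliency-based candidate selection.

      import numpy as np
      from skimage.feature import local_binary_pattern
      from sklearn.svm import SVC

      def lbp_histogram(chip, radius=1, n_points=8):
          """Normalized uniform-LBP histogram of one grey-scale image chip."""
          lbp = local_binary_pattern(chip, n_points, radius, method="uniform")
          n_bins = n_points + 2
          hist, _ = np.histogram(lbp, bins=n_bins, range=(0, n_bins), density=True)
          return hist

      # Placeholder chips and labels (1 = contains ship, 0 = background).
      rng = np.random.default_rng(0)
      chips = [(rng.random((64, 64)) * 255).astype(np.uint8) for _ in range(20)]
      labels = np.array([0, 1] * 10)
      X = np.vstack([lbp_histogram(c) for c in chips])
      clf = SVC(kernel="rbf").fit(X, labels)
      print(clf.predict(X[:3]))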

  8. Experimental and numerical study of MILD combustion in a lab-scale furnace

    NARCIS (Netherlands)

    Huang, X.; Tummers, M.J.; Roekaerts, D.J.E.M.; Scherer, Viktor; Fricker, Neil; Reis, Albino

    2017-01-01

    MILD combustion in a lab-scale furnace has been experimentally and numerically studied. The furnace was operated with Dutch natural gas (DNG) at 10 kW and at an equivalence ratio of 0.8. OH* chemiluminescence images were taken to characterize the reaction zone. The chemiluminescence intensity is

  9. Visualization system on ITBL

    International Nuclear Information System (INIS)

    Suzuki, Yoshio

    2004-01-01

    The visualization systems PATRAS/ITBL and AVS/ITBL, which are based on the visualization software PATRAS and AVS/Express respectively, have been developed on a global, heterogeneous computing environment, the Information Technology Based Laboratory (ITBL). PATRAS/ITBL allows for real-time visualization of the numerical results acquired from coupled multi-physics numerical simulations executed on different hosts situated in remote locations. AVS/ITBL allows for post-processing visualization. The scientific data located at remote sites may be selected and visualized in a web browser installed on a user terminal. The global structure and main functions of these systems are presented. (author)

  10. The impact of ordinate scaling on the visual analysis of single-case data.

    Science.gov (United States)

    Dart, Evan H; Radley, Keith C

    2017-08-01

    Visual analysis is the primary method for detecting the presence of treatment effects in graphically displayed single-case data and is often referred to as the "gold standard." Although researchers have developed standards for the application of visual analysis (e.g., Horner et al., 2005), over- and underestimation of effect size magnitude is not uncommon among analysts. Several characteristics have been identified as potential contributors to these errors; however, researchers have largely focused on characteristics of the data itself (e.g., autocorrelation), paying less attention to characteristics of the graphic display that are largely in the control of the analyst (e.g., ordinate scaling). The current study investigated the impact that differences in ordinate scaling, a graphic display characteristic, had on experts' accuracy in judging the magnitude of effect present in single-case percentage data. Thirty-two participants were asked to evaluate eight ABAB data sets (two each presenting null, small, moderate, and large effects) along with three iterations of each (32 graphs in total) in which only the ordinate scale was manipulated. Results suggest that raters are less accurate in their detection of treatment effects as the ordinate scale is constricted. Additionally, raters were more likely to overestimate the size of a treatment effect when the ordinate scale was constricted. Copyright © 2017 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.

  11. Initial Validation Study for a Scale Used to Determine Service Intensity for Itinerant Teachers of Students with Visual Impairments

    Science.gov (United States)

    Pogrund, Rona L.; Darst, Shannon; Munro, Michael P.

    2015-01-01

    Introduction: The purpose of this study was to begin validation of a scale that will be used by teachers of students with visual impairments to determine appropriate recommended type and frequency of services for their students based on identified student need. Methods: Validity and reliability of the Visual Impairment Scale of Service Intensity…

  12. Network-state modulation of power-law frequency-scaling in visual cortical neurons.

    Directory of Open Access Journals (Sweden)

    Sami El Boustani

    2009-09-01

    Full Text Available Various types of neural-based signals, such as EEG, local field potentials and intracellular synaptic potentials, integrate multiple sources of activity distributed across large assemblies. They have in common a power-law frequency-scaling structure at high frequencies, but it is still unclear whether this scaling property is dominated by intrinsic neuronal properties or by network activity. The latter case is particularly interesting because if frequency-scaling reflects the network state it could be used to characterize the functional impact of the connectivity. In intracellularly recorded neurons of cat primary visual cortex in vivo, the power spectral density of V(m) activity displays a power-law structure at high frequencies with a fractional scaling exponent. We show that this exponent is not constant, but depends on the visual statistics used to drive the network. To investigate the determinants of this frequency-scaling, we considered a generic recurrent model of cortex receiving a retinotopically organized external input. Similarly to the in vivo case, our in computo simulations show that the scaling exponent reflects the correlation level imposed in the input. This systematic dependence was also replicated at the single cell level, by controlling independently, in a parametric way, the strength and the temporal decay of the pairwise correlation between presynaptic inputs. This last model was implemented in vitro by imposing the correlation control in artificial presynaptic spike trains through dynamic-clamp techniques. These in vitro manipulations induced a modulation of the scaling exponent, similar to that observed in vivo and predicted in computo. We conclude that the frequency-scaling exponent of the V(m) reflects stimulus-driven correlations in the cortical network activity. Therefore, we propose that the scaling exponent could be used to read-out the "effective" connectivity responsible for the dynamical signature of the population

  13. Response shift in severity assessment of hand eczema with visual analogue scales

    DEFF Research Database (Denmark)

    Mollerup, Annette; Johansen, Jeanne D

    2015-01-01

    BACKGROUND: Hand eczema is a common and fluctuating disease. Visual analogue scales (VASs) are used to assess disease severity, both currently and when at its worst. However, such patient-reported outcomes may be at risk of being flawed owing to recall bias or response shifts. OBJECTIVE: To explore...

  14. Testing the interval-level measurement property of multi-item visual analogue scales

    NARCIS (Netherlands)

    Krabbe, Paul F M; Stalmeier, Peep F M; Lamers, Leida M; Busschbach, Jan J V

    2006-01-01

    BACKGROUND: Conditions were studied that may invalidate health-state values derived from the visual analogue scale (VAS). METHODS: Respondents were asked to place cards with descriptions of EQ-5D health states on a 20 cm EuroQol VAS and modified versions of it, positioning them such that the

  15. Numerical study on the hydrodynamic characteristics of biofouled full-scale net cage

    Science.gov (United States)

    Bi, Chun-wei; Zhao, Yun-peng; Dong, Guo-hai

    2015-06-01

    The effect of biofouling on the hydrodynamic characteristics of net cages is of particular interest, as biofouled nettings can significantly reduce the flow of well-oxygenated water reaching the stocked fish. For computational efficiency, a porous-media fluid model is proposed to simulate flow through a biofouled plane net and a full-scale net cage. The porous coefficients of the porous-media fluid model can be determined from the quadratic-function relationship between the hydrodynamic forces on a plane net and the flow velocity using the least squares method. In this study, the drag forces on, and flow fields around, five plane nets with different levels of biofouling are calculated using the proposed model. The numerical results are compared with the experimental data of Swift et al. (2006) and the effectiveness of the numerical model is demonstrated. On that basis, flow through full-scale net cages with the same levels of biofouling as the tested plane nets is modeled. The flow fields inside and around the biofouled net cages are analyzed, and the drag force acting on a net cage is estimated by a control-volume analysis method. From the numerical results, empirical formulas for the reduction in flow velocity and the load on a net cage are derived as functions of the drag coefficient of the corresponding biofouled netting.
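
    The quadratic force-velocity relationship used to obtain the porous coefficients can be written as F = a*u + b*u**2 and fitted by least squares, as in the sketch below; the plane-net measurements shown are made-up numbers, not data from the paper.

      import numpy as np

      def fit_porous_coefficients(u, F):
          """Least-squares fit of the quadratic drag law F = a*u + b*u**2."""
          u = np.asarray(u, dtype=float)
          A = np.column_stack([u, u**2])
          (a, b), *_ = np.linalg.lstsq(A, np.asarray(F, dtype=float), rcond=None)
          return a, b

      # Illustrative towing speeds (m/s) and measured drag forces (N) for one fouling level.
      u = [0.1, 0.2, 0.3, 0.4, 0.5]
      F = [0.8, 2.1, 3.9, 6.3, 9.2]
      print(fit_porous_coefficients(u, F))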

  16. Visualization system for grid environment in the nuclear field

    International Nuclear Information System (INIS)

    Suzuki, Yoshio; Matsumoto, Nobuko; Idomura, Yasuhiro; Tani, Masayuki

    2006-01-01

    An innovative scientific visualization system is needed to visualize, in an integrated manner, the large amounts of data generated at distributed remote locations as a result of large-scale numerical simulations using a grid environment. One important function in such a visualization system is parallel visualization, which makes it possible to visualize data using multiple CPUs of a supercomputer. The other is distributed visualization, which makes it possible to execute visualization processes using a local client computer and remote computers. We have developed a toolkit including these functions in cooperation with the commercial visualization software AVS/Express, called the Parallel Support Toolkit (PST). PST can execute visualization processes with three kinds of parallelism (data parallelism, task parallelism and pipeline parallelism) using local and remote computers. We have evaluated PST on a large amount of data generated by a nuclear fusion simulation. Here, two supercomputers installed at JAEA, an Altix3700Bx2 and a Prism, are used. The evaluation shows that PST has the potential to efficiently visualize large amounts of data in a grid environment. (author)

  17. FuncTree: Functional Analysis and Visualization for Large-Scale Omics Data.

    Directory of Open Access Journals (Sweden)

    Takeru Uchiyama

    Full Text Available Exponential growth of high-throughput data and the increasing complexity of omics information have made processing and interpreting biological data an extremely difficult and daunting task. Here we developed FuncTree (http://bioviz.tokyo/functree), a web-based application for analyzing and visualizing large-scale omics data, including but not limited to genomic, metagenomic, and transcriptomic data. FuncTree allows users to map their omics data onto the "Functional Tree map", a predefined circular dendrogram which represents the hierarchical relationship of all known biological functions defined in the KEGG database. This novel visualization method allows users to obtain an overview of the broad functionality of their data, thus enabling a more accurate and comprehensive understanding of the omics information. FuncTree provides extensive customization and calculation methods that not only allow users to directly map their omics data to identify its functionality, but also to compute statistically enriched functions by comparing it to other predefined omics data. We have validated FuncTree's analysis and visualization capabilities by mapping pan-genomic data of three different bacterial genera, metagenomic data of the human gut, and transcriptomic data of two different types of human cell expression. All three mappings strongly confirm FuncTree's capability to analyze and visually represent key functional features of the omics data. We believe that FuncTree's capability to conduct various functional calculations and visualize the results as a holistic overview of biological function makes it an integral analysis/visualization tool for extensive omics-based research.

  18. Numerical modelling of disintegration of basin-scale internal waves in a tank filled with stratified water

    Directory of Open Access Journals (Sweden)

    N. Stashchuk

    2005-01-01

    Full Text Available We present the results of numerical experiments performed with a fully non-linear, non-hydrostatic numerical model to study the baroclinic response of a long narrow tank filled with stratified water to an initially tilted interface. Upon release, the system starts to oscillate with an eigenfrequency corresponding to basin-scale baroclinic gravitational seiches. Field observations suggest that the disintegration of basin-scale internal waves into packets of solitary waves, shear instabilities, billows and spots of mixed water are important mechanisms for the transfer of energy within stratified lakes. Laboratory experiments performed by D. A. Horn, J. Imberger and G. N. Ivey (JFM, 2001) reproduced several regimes, which include damped linear waves and solitary waves. The generation of billows and shear instabilities induced by the basin-scale wave was, however, not sufficiently studied. The developed numerical model computes a variety of flows which were not observed with the experimental set-up. In particular, the model results showed that under conditions of low dissipation, the regimes of billows and supercritical flows may transform into a solitary wave regime. The obtained results can help in the interpretation of numerous observations of mixing processes in real lakes.

  19. Performance investigation of a lab–scale latent heat storage prototype – Numerical results

    International Nuclear Information System (INIS)

    Niyas, Hakeem; Prasad, Sunku; Muthukumar, P.

    2017-01-01

    Highlights: • Developed a numerical tool for analyzing a shell-and-tube LHS system. • Effective heat capacity method is used for incorporating the latent heat. • Number of heat transfer fluid tubes and fins are optimized. • Partial charging/discharging is more efficient than complete charging/discharging. • Numerically predicted values match well with the experimental results. - Abstract: In the current study, a numerical analysis of the charging and discharging characteristics of a lab-scale latent heat storage (LHS) prototype is presented. A mathematical model is developed to analyze the performance characteristics of the LHS prototype, which has a shell-and-tube heat exchanger configuration. The effective heat capacity (EHC) method is implemented to account for the latent heat of the phase change material (PCM), and the Boussinesq approximation is used to incorporate the buoyancy effect of the molten PCM layer in the model. For proper modeling of the velocities in the PCM, a Darcy-law source term is added. The governing equations involved in the model are solved using finite element based software, COMSOL Multiphysics 4.3a. The numbers of embedded tubes and of fins on the embedded tubes are optimized based on the discharging time of the model. Various performance parameters such as charging/discharging time, energy storage/discharge rate and melt fraction are evaluated. Numerically predicted temperature variations of the model during the charging and discharging processes were compared with experimental data from the lab-scale LHS prototype, and good agreement was found between them.
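
    A generic form of the effective heat capacity method spreads the latent heat over the melting interval and adds it to the sensible heat capacity, as sketched below. The property values are assumed, paraffin-like numbers, and the piecewise formulation is a textbook variant rather than the authors' exact implementation.

      def effective_heat_capacity(T, cp_solid, cp_liquid, latent_heat, T_solidus, T_liquidus):
          """Sensible heat capacity outside the melting range; inside it, the latent
          heat is spread over (T_solidus, T_liquidus) and added to the mean cp."""
          if T < T_solidus:
              return cp_solid
          if T > T_liquidus:
              return cp_liquid
          return 0.5 * (cp_solid + cp_liquid) + latent_heat / (T_liquidus - T_solidus)

      # Assumed paraffin-like PCM: cp ~ 2 kJ/(kg K), L ~ 200 kJ/kg, melting range 58-62 degC.
      for T in (40.0, 60.0, 80.0):
          print(T, effective_heat_capacity(T, 2.0e3, 2.2e3, 2.0e5, 58.0, 62.0))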

  20. Validity and Reliability of the Verbal Numerical Rating Scale for Children Aged 4 to 17 Years With Acute Pain.

    Science.gov (United States)

    Tsze, Daniel S; von Baeyer, Carl L; Pahalyants, Vartan; Dayan, Peter S

    2018-06-01

    The Verbal Numerical Rating Scale is the most commonly used self-report measure of pain intensity. It is unclear how the validity and reliability of the scale scores vary across children's ages. We aimed to determine the validity and reliability of the scale for children presenting to the emergency department across a comprehensive spectrum of age. This was a cross-sectional study of children aged 4 to 17 years. Children self-reported their pain intensity, using the Verbal Numerical Rating Scale and Faces Pain Scale-Revised at 2 serial assessments. We evaluated convergent validity (strong validity defined as correlation coefficient ≥0.60), agreement (difference between concurrent Verbal Numerical Rating Scale and Faces Pain Scale-Revised scores), known-groups validity (difference in score between children with painful versus nonpainful conditions), responsivity (decrease in score after analgesic administration), and reliability (test-retest at 2 serial assessments) in the total sample and subgroups based on age. We enrolled 760 children; 27 did not understand the Verbal Numerical Rating Scale and were removed. Of the remainder, Pearson correlations were strong to very strong (0.62 to 0.96) in all years of age except 4 and 5 years, and agreement was strong for children aged 8 and older. Known-groups validity and responsivity were strong in all years of age. Reliability was strong in all age subgroups, including each year of age from 4 to 7 years. Convergent validity, known-groups validity, responsivity, and reliability of the Verbal Numerical Rating Scale were strong for children aged 6 to 17 years. Convergent validity was not strong for children aged 4 and 5 years. Our findings support the use of the Verbal Numerical Rating Scale for most children aged 6 years and older, but not for those aged 4 and 5 years. Copyright © 2017 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.
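    As an illustration of the convergent-validity and agreement computations described in the abstract (Pearson correlation between concurrent Verbal Numerical Rating Scale and Faces Pain Scale-Revised scores, and their mean paired difference), a minimal sketch is given below; the variable names and example data are hypothetical, not data from the study.

      # Minimal sketch: convergent validity (Pearson r) and agreement (mean
      # paired difference) between two concurrent 0-10 pain-intensity scores.
      # The example arrays are hypothetical, not data from the study.
      import numpy as np
      from scipy.stats import pearsonr

      vnrs = np.array([2, 5, 7, 8, 4, 6, 9, 3])    # Verbal Numerical Rating Scale
      fpsr = np.array([2, 4, 8, 8, 4, 6, 10, 2])   # Faces Pain Scale-Revised

      r, p_value = pearsonr(vnrs, fpsr)            # convergent validity
      mean_diff = np.mean(vnrs - fpsr)             # agreement (systematic bias)

      print(f"Pearson r = {r:.2f} (p = {p_value:.3f}), mean difference = {mean_diff:.2f}")
      print("meets the >=0.60 criterion" if r >= 0.60 else "below the 0.60 criterion")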

  1. Applicability of visual-analogue scale in patients with orofacial pain

    Directory of Open Access Journals (Sweden)

    Lončar Jovana

    2013-01-01

    Introduction. Orofacial pain occurs in various disorders of the orofacial region. Objective. The aim of this study was to examine the applicability of the visual analogue scale (VAS) in patients with orofacial pain (a model of acute and chronic pain). Methods. The study involved 60 patients, aged 18-70 years. The first group consisted of patients with dentin hypersensitivity, and the second group of patients with chronic rhinosinusitis. All patients were asked to fill in a pain questionnaire and to rate pain intensity on the modified visual analogue scale (VAS; 0-10). An air-indexing method was performed in the patients with dentin hypersensitivity in order to provoke pain, while the patients with chronic rhinosinusitis underwent CT imaging of the paranasal sinuses. Wilcoxon's test and Pearson's correlation coefficient were used for statistical analysis. Results. In patients with dentin hypersensitivity, provocation increased the subjective feeling of pain, but without statistical significance (t=164.5; p>0.05). In patients with chronic rhinosinusitis, a statistically significant correlation (r=0.53; p<0.05) was found between the subjective pain assessment on the VAS and the CT findings. Conclusion. Applying the VAS in the evaluation of acute and chronic pain can indicate progression or regression of a pathological state under clinical conditions. This study showed that the VAS, as a method for follow-up of a pathological state, is more applicable and efficient when applied in chronic pain evaluation.

  2. Monitoring the response to rTMS in depression with visual analog scales.

    Science.gov (United States)

    Grunhaus, Leon; Dolberg, Ornah T; Polak, Dana; Dannon, Pinhas N

    2002-10-01

    Visual analog scales (VAS) administered on a daily basis provide a fast and reliable method for assessing clinical change during transcranial magnetic stimulation (TMS). We treated 40 patients with major depression with TMS and assessed their clinical condition with VAS. Response to TMS was defined with the Hamilton Rating Scale for Depression and the Global Assessment of Function scale. Nineteen of the 40 patients were responders to TMS when the whole sample was considered, whereas 17 of 29 responded when only the non-psychotic patients were considered. Patients who eventually responded to TMS demonstrated early changes in the VAS scores. We conclude that monitoring with VAS scores can detect early response to TMS. Copyright 2002 John Wiley & Sons, Ltd.

  3. Mathematical and numerical modelling of fluids at Nano-metric scales

    International Nuclear Information System (INIS)

    Joubaud, R.

    2012-01-01

    This work presents some contributions to the mathematical and numerical modelling of fluids at nanometric scales. We are interested in two levels of modelling. The first level consists in an atomic description. We consider the problem of computing the shear viscosity of a fluid from a microscopic description. More precisely, we study the mathematical properties of the nonequilibrium Langevin dynamics that allows the shear viscosity to be computed. The second level of description is a continuous description, and we consider a class of continuous models for equilibrium electrolytes, which incorporate on the one hand confinement by charged solid objects and on the other hand non-ideality effects stemming from electrostatic correlations and steric exclusion phenomena due to excluded volume effects. First, we perform the mathematical analysis of the case where the free energy is a convex function (mild non-ideality). Second, we consider numerically the case where the free energy is a non-convex function (strong non-ideality), leading in particular to phase separation. (author)
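    The nonequilibrium Langevin dynamics referred to above is not written out in the abstract; a generic form used for shear-viscosity computations adds a smooth transverse external forcing to the standard Langevin equations (the notation below is assumed, not the author's):

      \[
        dq_t = M^{-1} p_t \, dt, \qquad
        dp_t = \bigl(-\nabla V(q_t) + \eta F(q_t)\bigr)\, dt
               - \gamma M^{-1} p_t \, dt + \sqrt{\tfrac{2\gamma}{\beta}} \, dW_t,
      \]

    where gamma is the friction coefficient, beta the inverse temperature and eta the magnitude of the external shearing force; the shear viscosity is then extracted from the stationary response of the average velocity profile to eta.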

  4. VISUALIZATION METHODS OF VORTICAL FLOWS IN COMPUTATIONAL FLUID DYNAMICS AND THEIR APPLICATIONS

    Directory of Open Access Journals (Sweden)

    K. N. Volkov

    2014-05-01

    The paper deals with conceptions and methods for the visual representation of numerical research results in problems of fluid and gas mechanics. The three-dimensional nature of the unsteady flows being simulated creates significant difficulties for the visual representation of results. It complicates control and understanding of numerical data, as well as exchange and processing of the obtained information about the flow field. Approaches to the visualization of vortical flows with the usage of gradients of primary and secondary scalar and vector fields are discussed. An overview of visualization techniques for vortical flows using different definitions of the vortex and its identification criteria is given. Visualization examples for some solutions of gas dynamics problems related to calculations of jets and cavity flows are presented. Ideas of the vortical structure of the free non-isothermal jet and the formation of coherent vortex structures in the mixing layer are developed. An analysis of formation patterns for spatial flows inside large-scale vortical structures within the enclosed space of the cubic lid-driven cavity is performed. The singular points of the vortex flow in a cubic lid-driven cavity are found based on the results of numerical simulation; their type and location are identified depending on the Reynolds number. Calculations are performed with fine meshes and modern approaches to the simulation of vortical flows (direct numerical simulation and large-eddy simulation). The paradigm of graphical programming and the COVISE virtual environment are used for the visual representation of computational results. The application that implements the visualization is represented as a network that links modules, each of which is designed to solve a case-specific problem. Interaction between modules is carried out through input and output ports (data receipt and data transfer), giving the possibility to use various input and output devices.

  5. Enhanced learning through scale models and see-thru visualization

    International Nuclear Information System (INIS)

    Kelley, M.D.

    1987-01-01

    The development of PowerSafety International's See-Thru Power Plant has provided the nuclear industry with a bridge that can span the gap between the part-task simulator and the full-scope, high-fidelity plant simulator. The principle behind the See-Thru Power Plant is to bring sensory experience into nuclear training programs. The See-Thru Power Plant is a scaled-down, fully functioning model of a commercial nuclear power plant, equipped with a primary system, secondary system, and control console. The major components are constructed of glass, thus permitting visual conceptualization of a working nuclear power plant.

  6. Web-Scale Multidimensional Visualization of Big Spatial Data to Support Earth Sciences—A Case Study with Visualizing Climate Simulation Data

    Directory of Open Access Journals (Sweden)

    Sizhe Wang

    2017-06-01

    The world is undergoing rapid changes in its climate, environment, and ecosystems due to increasing population growth, urbanization, and industrialization. Numerical simulation is becoming an important vehicle to enhance the understanding of these changes and their impacts, with regional and global simulation models producing vast amounts of data. Comprehending these multidimensional data and fostering collaborative scientific discovery requires the development of new visualization techniques. In this paper, we present a cyberinfrastructure solution—PolarGlobe—that enables comprehensive analysis and collaboration. PolarGlobe is implemented upon an emerging web graphics library, WebGL, and the open-source virtual globe system Cesium, which has the ability to map spatial data onto a virtual Earth. We have also integrated volume rendering techniques, value and spatial filters, and vertical profile visualization to improve rendered images and support a comprehensive exploration of multi-dimensional spatial data. In this study, the climate simulation dataset produced by the extended polar version of the well-known Weather Research and Forecasting Model (WRF) is used to test the proposed techniques. PolarGlobe is also easily extendable to enable data visualization for other Earth Science domains, such as oceanography, weather, or geology.

  7. A Pervasive Parallel Processing Framework for Data Visualization and Analysis at Extreme Scale

    Energy Technology Data Exchange (ETDEWEB)

    Moreland, Kenneth [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Geveci, Berk [Kitware, Inc., Clifton Park, NY (United States)

    2014-11-01

    The evolution of the computing world from teraflop to petaflop has been relatively effortless, with several of the existing programming models scaling effectively to the petascale. The migration to exascale, however, poses considerable challenges. All industry trends indicate that the exascale machine will be built using processors containing hundreds to thousands of cores per chip. It can be inferred that efficient concurrency on exascale machines requires a massive amount of concurrent threads, each performing many operations on a localized piece of data. Currently, visualization libraries and applications are based on what is known as the visualization pipeline. In the pipeline model, algorithms are encapsulated as filters with inputs and outputs. These filters are connected by setting the output of one component to the input of another. Parallelism in the visualization pipeline is achieved by replicating the pipeline for each processing thread. This works well for today's distributed memory parallel computers but cannot be sustained when operating on processors with thousands of cores. Our project investigates a new visualization framework designed to exhibit the pervasive parallelism necessary for extreme scale machines. Our framework achieves this by defining algorithms in terms of worklets, which are localized stateless operations. Worklets are atomic operations that execute when invoked, unlike filters, which execute when a pipeline request occurs. The worklet design allows execution on a massive amount of lightweight threads with minimal overhead. Only with such fine-grained parallelism can we hope to fill the billions of threads we expect will be necessary for efficient computation on an exascale machine.
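    To make the worklet idea concrete, the sketch below contrasts a pipeline-style filter (invoked once per request over a whole array) with a stateless per-element worklet mapped over many lightweight threads by a dispatcher. This is an illustrative toy in Python, not code from the project, and all names are hypothetical.

      # Contrast between a pipeline "filter" (coarse-grained, owns the whole loop)
      # and a "worklet" (a stateless per-element operation scheduled by a dispatcher).
      # Toy example only; on an accelerator the dispatcher would launch one
      # lightweight thread per element instead of using a thread pool.
      from concurrent.futures import ThreadPoolExecutor
      import math

      def magnitude_filter(vectors):
          # filter: executes when the pipeline requests it, processes everything
          return [math.sqrt(x * x + y * y + z * z) for (x, y, z) in vectors]

      def magnitude_worklet(vec):
          # worklet: stateless, touches exactly one element, executes when invoked
          x, y, z = vec
          return math.sqrt(x * x + y * y + z * z)

      def dispatch(worklet, data, workers=8):
          with ThreadPoolExecutor(max_workers=workers) as pool:
              return list(pool.map(worklet, data))

      vectors = [(1.0, 2.0, 2.0), (3.0, 4.0, 0.0)]
      assert dispatch(magnitude_worklet, vectors) == magnitude_filter(vectors)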

  8. Advanced visualization technology for terascale particle accelerator simulations

    International Nuclear Information System (INIS)

    Ma, K-L; Schussman, G.; Wilson, B.; Ko, K.; Qiang, J.; Ryne, R.

    2002-01-01

    This paper presents two new hardware-assisted rendering techniques developed for interactive visualization of the terascale data generated from numerical modeling of next generation accelerator designs. The first technique, based on a hybrid rendering approach, makes possible interactive exploration of large-scale particle data from particle beam dynamics modeling. The second technique, based on a compact texture-enhanced representation, exploits the advanced features of commodity graphics cards to achieve perceptually effective visualization of the very dense and complex electromagnetic fields produced from the modeling of reflection and transmission properties of open structures in an accelerator design. Because of the collaborative nature of the overall accelerator modeling project, the visualization technology developed is for both desktop and remote visualization settings. We have tested the techniques using both time-varying particle data sets containing up to one billion particles per time step and electromagnetic field data sets with millions of mesh elements.

  9. Pain intensity among institutionalized elderly: a comparison between numerical scales and verbal descriptors

    Directory of Open Access Journals (Sweden)

    Lílian Varanda Pereira

    2015-10-01

    OBJECTIVE: To correlate two unidimensional scales for the measurement of self-reported pain intensity in the elderly and to identify a preference for one of the scales. METHOD: A study conducted with 101 elderly people living in a nursing home who reported any pain and reached the cut-off score (≥13) on the Mini-Mental State Examination. An 11-point Numeric Rating Scale (NRS) and a five-point Verbal Descriptor Scale (VDS) were compared in three evaluations: overall, at rest and during movement. RESULTS: Women were more representative (61.4%) and the average age was 77.0±9.1 years. The NRS was completed by 94.8% of the elderly, while the VDS was completed by 100%. The association between the mean scores of the NRS and the categories of the VDS was significant, indicating convergent validity and a similar metric between the scales. CONCLUSION: Pain measurements among institutionalized elderly can be made with the NRS and the VDS; however, the scale preferred by the elderly was the VDS, regardless of gender.

  10. A Real-time Generalization and Multi-scale Visualization Method for POI Data in Volunteered Geographic Information

    Directory of Open Access Journals (Sweden)

    YANG Min

    2015-02-01

    With the development of mobile and Web technologies, there has been an increasing number of map-based mashups which display different kinds of POI data in volunteered geographic information. Due to the lack of suitable mechanisms for multi-scale visualization, the display of POI data often results in an icon clustering problem, with icons touching and overlapping each other. This paper introduces a multi-scale visualization method for urban facility POI data by combining classic generalization methods with the online environment. Firstly, we organize the POI data into a hierarchical structure by preprocessing on the server side; the POI features are then retrieved on the client side according to the display scale, and a displacement operation is executed to resolve local icon conflicts. Experiments show that this approach not only meets the requirements of real-time online use, but also achieves a better multi-scale representation of POI data.
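    A minimal sketch of the client-side step described above (selecting the POIs appropriate for the current display scale and displacing icons that collide on screen) is given below; the selection rule, the displacement strategy and all names are simplified assumptions for illustration, not the algorithm from the paper.

      # Toy sketch: keep the POIs whose importance level is visible at the current
      # zoom, then nudge apart icons whose screen-space boxes overlap.
      # Thresholds and the displacement rule are illustrative assumptions.
      ICON = 16  # icon size in pixels

      def select_for_scale(pois, zoom):
          # hierarchical preprocessing is assumed to have assigned each POI a
          # minimum zoom level at which it becomes visible
          return [p for p in pois if p["min_zoom"] <= zoom]

      def overlaps(a, b):
          return abs(a["x"] - b["x"]) < ICON and abs(a["y"] - b["y"]) < ICON

      def displace(pois, step=4, rounds=20):
          # repeatedly push apart the members of each conflicting pair
          for _ in range(rounds):
              moved = False
              for i, a in enumerate(pois):
                  for b in pois[i + 1:]:
                      if overlaps(a, b):
                          dx = step if a["x"] >= b["x"] else -step
                          a["x"] += dx
                          b["x"] -= dx
                          moved = True
              if not moved:
                  break
          return pois

      pois = [{"x": 100, "y": 80, "min_zoom": 10}, {"x": 105, "y": 82, "min_zoom": 12}]
      print(displace(select_for_scale(pois, zoom=13)))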

  11. A Visual Analog Scale to assess anxiety in children during anesthesia induction (VAS-I): Results supporting its validity in a sample of day care surgery patients.

    Science.gov (United States)

    Berghmans, Johan M; Poley, Marten J; van der Ende, Jan; Weber, Frank; Van de Velde, Marc; Adriaenssens, Peter; Himpe, Dirk; Verhulst, Frank C; Utens, Elisabeth

    2017-09-01

    The modified Yale Preoperative Anxiety Scale is widely used to assess children's anxiety during induction of anesthesia, but requires training and its administration is time-consuming. A Visual Analog Scale, in contrast, requires no training, is easy to use and quickly completed. The aim of this study was to evaluate a Visual Analog Scale as a tool to assess anxiety during induction of anesthesia and to determine cut-offs to distinguish between anxious and nonanxious children. Four hundred and one children (1.5-16 years) scheduled for daytime surgery were included. Children's anxiety during induction was rated by parents and anesthesiologists on a Visual Analog Scale and by a trained observer on the modified Yale Preoperative Anxiety Scale. Psychometric properties assessed were: (i) concurrent validity (correlations between parents' and anesthesiologists' Visual Analog Scale and modified Yale Preoperative Anxiety Scale scores); (ii) construct validity (differences between subgroups according to the children's age and the parents' anxiety as assessed by the State-Trait Anxiety Inventory); (iii) cross-informant agreement using Bland-Altman analysis; (iv) cut-offs to distinguish between anxious and nonanxious children (reference: modified Yale Preoperative Anxiety Scale ≥30). Correlations between parents' and anesthesiologists' Visual Analog Scale and modified Yale Preoperative Anxiety Scale scores were strong (0.68 and 0.73, respectively). Visual Analog Scale scores were higher for children ≤5 years compared to children aged ≥6. Visual Analog Scale scores of children of high-anxious parents were higher than those of low-anxious parents. The mean difference between parents' and anesthesiologists' Visual Analog Scale scores was 3.6, with 95% limits of agreement (-56.1 to 63.3). To classify anxious children, cut-offs for parents (≥37 mm) and anesthesiologists (≥30 mm) were established. The present data provide preliminary support for the validity of the Visual Analog Scale as a measure of children's anxiety during induction of anesthesia.
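    The cut-off derivation outlined above (classifying children as anxious when the modified Yale Preoperative Anxiety Scale is ≥30 and choosing the Visual Analog Scale threshold that best reproduces that classification) can be sketched as follows; the example data and the use of Youden's index are illustrative assumptions, not the authors' exact procedure.

      # Sketch: choose a VAS cut-off (in mm) against the reference definition of
      # anxiety (mYPAS >= 30) by maximizing Youden's J = sensitivity + specificity - 1.
      # The example data are hypothetical.
      import numpy as np

      vas = np.array([5, 12, 25, 31, 40, 55, 70, 80, 15, 33])     # 0-100 mm
      mypas = np.array([23, 25, 28, 35, 42, 60, 71, 77, 24, 31])  # reference scale
      anxious = mypas >= 30

      def youden_j(cutoff):
          sens = ((vas >= cutoff) & anxious).sum() / anxious.sum()
          spec = ((vas < cutoff) & ~anxious).sum() / (~anxious).sum()
          return sens + spec - 1

      best = max(range(0, 101), key=youden_j)
      print(f"suggested VAS cut-off: >= {best} mm")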

  12. Numerical Modeling of Large-Scale Rocky Coastline Evolution

    Science.gov (United States)

    Limber, P.; Murray, A. B.; Littlewood, R.; Valvo, L.

    2008-12-01

    Seventy-five percent of the world's ocean coastline is rocky. On large scales (i.e. greater than a kilometer), many intertwined processes drive rocky coastline evolution, including coastal erosion and sediment transport, tectonics, antecedent topography, and variations in sea cliff lithology. In areas such as California, an additional aspect of rocky coastline evolution involves submarine canyons that cut across the continental shelf and extend into the nearshore zone. These types of canyons intercept alongshore sediment transport and flush sand to abyssal depths during periodic turbidity currents, thereby delineating coastal sediment transport pathways and affecting shoreline evolution over large spatial and time scales. How tectonic, sediment transport, and canyon processes interact with inherited topographic and lithologic settings to shape rocky coastlines remains an unanswered, and largely unexplored, question. We will present numerical model results of rocky coastline evolution that starts with an immature fractal coastline. The initial shape is modified by headland erosion, wave-driven alongshore sediment transport, and submarine canyon placement. Our previous model results have shown that, as expected, an initial sediment-free irregularly shaped rocky coastline with homogeneous lithology will undergo smoothing in response to wave attack; headlands erode and mobile sediment is swept into bays, forming isolated pocket beaches. As this diffusive process continues, pocket beaches coalesce, and a continuous sediment transport pathway results. However, when a randomly placed submarine canyon is introduced to the system as a sediment sink, the end results are wholly different: sediment cover is reduced, which in turn increases weathering and erosion rates and causes the entire shoreline to move landward more rapidly. The canyon's alongshore position also affects coastline morphology. When placed offshore of a headland, the submarine canyon captures local sediment

  13. Numerical studies of the g-hartree density functional in the Thomas-Fermi scaling limit

    International Nuclear Information System (INIS)

    Millack, T.; Weymans, G.

    1986-02-01

    Methods of finite temperature quantum field theory are used to construct the g-Hartree density functional for atoms. Low and high temperature expansions are discussed in detail. Numerical studies for atomic ground-state configurations are presented in the Thomas-Fermi scaling limit. (orig.)

  14. MOST-visualization: software for producing automated textbook-style maps of genome-scale metabolic networks.

    Science.gov (United States)

    Kelley, James J; Maor, Shay; Kim, Min Kyung; Lane, Anatoliy; Lun, Desmond S

    2017-08-15

    Visualization of metabolites, reactions and pathways in genome-scale metabolic networks (GEMs) can assist in understanding cellular metabolism. Three attributes are desirable in software used for visualizing GEMs: (i) automation, since GEMs can be quite large; (ii) production of understandable maps that provide ease in identification of pathways, reactions and metabolites; and (iii) visualization of the entire network to show how pathways are interconnected. No software currently exists for visualizing GEMs that satisfies all three characteristics, but MOST-Visualization, an extension of the software package MOST (Metabolic Optimization and Simulation Tool), satisfies (i) and, by using a pre-drawn overview map of metabolism based on the Roche map, satisfies (ii) and comes close to satisfying (iii). MOST is distributed for free under the GNU General Public License. The software and full documentation are available at http://most.ccib.rutgers.edu/. dslun@rutgers.edu. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  15. The impact of MRI combined with visual rating scales on the clinical diagnosis of dementia: a prospective study

    Energy Technology Data Exchange (ETDEWEB)

    Verhagen, Martijn V.; Guit, Gerard L. [Spaarne Gasthuis, Department of Radiology, Haarlem (Netherlands); Hafkamp, Gerrit Jan; Kalisvaart, Kees [Spaarne Gasthuis, Department of Geriatrics, Haarlem (Netherlands)

    2016-06-15

    Dementia is foremost a clinical diagnosis. However, in diagnosing dementia, it is advocated to perform at least one neuroimaging study. This has two purposes: to rule out potential reversible dementia (PRD), and to help determine the dementia subtype. Our first goal was to establish if MRI combined with visual rating scales changes the clinical diagnosis. The second goal was to demonstrate if MRI contributes to a geriatrician's confidence in the diagnosis. The dementia subtype was determined prior to and after MRI. Scoring scales used were: global cortical atrophy (GCA), medial temporal atrophy (MTA), and white matter hyperintensity measured according to the Fazekas scale. The confidence level of the geriatrician was determined using a visual analogue scale. One hundred and thirty-five patients were included. After MRI, the diagnosis changed in 23.7 % (CI 17.0 %-31.1 %) of patients. Change was due to vascular aetiology in 13.3 % of patients. PRD was found in 2.2 % of all patients. The confidence level in the diagnosis increased significantly after MRI (p = 0.001). MRI, combined with visual rating scales, has a significant impact on dementia subtype diagnosis and on a geriatrician's confidence in the final diagnosis. (orig.)

  16. The impact of MRI combined with visual rating scales on the clinical diagnosis of dementia: a prospective study

    International Nuclear Information System (INIS)

    Verhagen, Martijn V.; Guit, Gerard L.; Hafkamp, Gerrit Jan; Kalisvaart, Kees

    2016-01-01

    Dementia is foremost a clinical diagnosis. However, in diagnosing dementia, it is advocated to perform at least one neuroimaging study. This has two purposes: to rule out potential reversible dementia (PRD), and to help determine the dementia subtype. Our first goal was to establish if MRI combined with visual rating scales changes the clinical diagnosis. The second goal was to demonstrate if MRI contributes to a geriatrician's confidence in the diagnosis. The dementia subtype was determined prior to and after MRI. Scoring scales used were: global cortical atrophy (GCA), medial temporal atrophy (MTA), and white matter hyperintensity measured according to the Fazekas scale. The confidence level of the geriatrician was determined using a visual analogue scale. One hundred and thirty-five patients were included. After MRI, the diagnosis changed in 23.7 % (CI 17.0 %-31.1 %) of patients. Change was due to vascular aetiology in 13.3 % of patients. PRD was found in 2.2 % of all patients. The confidence level in the diagnosis increased significantly after MRI (p = 0.001). MRI, combined with visual rating scales, has a significant impact on dementia subtype diagnosis and on a geriatrician's confidence in the final diagnosis. (orig.)

  17. [Assessment of usefulness of visual analogue scale (VAS) for measuring adolescent attitude toward unhealthy behaviors].

    Science.gov (United States)

    Supranowicz, Piotr

    2003-01-01

    In the last two decades the visual analogue scale has been used more frequently for measuring the psychosocial determinants of health, its disorders and unhealthy behaviours. In 1999, the Health Promotion Department of the National Institute of Hygiene undertook multidimensional investigations on the self-assessment of health and lifestyle of adolescents, and evaluation of the usefulness of the visual analogue scale for health promotion research was one of the aims of these investigations. The data were obtained from a randomly selected sample of 682 schoolchildren aged 14-15 years attending public and private schools of Warsaw. The questionnaire contained questions about the frequency of alcohol drinking, cigarette smoking, drug use and manifestations of aggression. Simultaneously, respondents were asked how useful these behaviours are for coping with everyday events. The answers on the usefulness of unhealthy behaviours were marked on a ten-centimetre line ranging from "not at all" to "completely". The study shows that adolescents who presented unhealthy behaviours more often are more likely to assign a higher value to these behaviours in coping with their problems. Moreover, adolescents' attitude toward unhealthy behaviours varies according to gender, kind of alcohol, frequency of being drunk, offers to buy drugs, carrying a weapon and the frequency of injuries from violence. The analyses confirm the usefulness of the visual analogue scale for studies on psychosocial and lifestyle determinants of health.

  18. Validity and reliability of self-assessed physical fitness using visual analogue scales

    DEFF Research Database (Denmark)

    Strøyer, Jesper; Essendrop, Morten; Jensen, Lone Donbaek

    2007-01-01

    To test the validity and reliability of self-assessed physical fitness, samples included healthcare assistants working at a hospital (women=170, men=17), persons working with physically and mentally handicapped patients (women=530, men=123), and two separate groups of healthcare students: (a) women=91 and men=5, and (b) women=159 and men=10. Five components of physical fitness were self-assessed by Visual Analogue Scales with illustrations and verbal anchors for the extremes: aerobic fitness, muscle strength, endurance, flexibility, and balance. Convergent and divergent validity were evaluated ... except for flexibility among men. The reliability was moderate to good (ICC = .62-.80). Self-assessed aerobic fitness, muscle strength, and flexibility showed moderate construct validity and moderate to good reliability using visual analogues.

  19. Influence of visual observational conditions on tongue motor learning

    DEFF Research Database (Denmark)

    Kothari, Mohit; Liu, Xuimei; Baad-Hansen, Lene

    2016-01-01

    To investigate the impact of visual observational conditions on performance during a standardized tongue-protrusion training (TPT) task and to evaluate subject-based reports of helpfulness, disturbance, pain, and fatigue due to the observational conditions on 0-10 numerical rating scales. Forty...... regarding the level of disturbance, pain or fatigue. Self-observation of tongue-training facilitated behavioral aspects of tongue motor learning compared with model-observation but not compared with control....

  20. EDITORIAL: Focus on Visualization in Physics FOCUS ON VISUALIZATION IN PHYSICS

    Science.gov (United States)

    Sanders, Barry C.; Senden, Tim; Springel, Volker

    2008-12-01

    the following features highlighting work in this collection and other novel uses of visualization techniques: 'A feast of visualization' (Physics World, December 2008, pp 20-23); 'Seeing the quantum world' by Barry Sanders (Physics World, December 2008, pp 24-27); 'A picture of the cosmos' by Mark SubbaRao and Miguel Aragon-Calvo (Physics World, December 2008, pp 29-32); 'Thinking outside the cube' by César A Hidalgo (Physics World, December 2008, pp 34-37). Focus on Visualization in Physics, contents: Visualization of spiral and scroll waves in simulated and experimental cardiac tissue (E M Cherry and F H Fenton); Visualization of large scale structure from the Sloan Digital Sky Survey (M U SubbaRao, M A Aragón-Calvo, H W Chen, J M Quashnock, A S Szalay and D G York); How computers can help us in creating an intuitive access to relativity (Hanns Ruder, Daniel Weiskopf, Hans-Peter Nollert and Thomas Müller); Lagrangian particle tracking in three dimensions via single-camera in-line digital holography (Jiang Lu, Jacob P Fugal, Hansen Nordsiek, Ewe Wei Saw, Raymond A Shaw and Weidong Yang); Quantifying spatial heterogeneity from images (Andrew E Pomerantz and Yi-Qiao Song); Disaggregation and scientific visualization of earthscapes considering trends and spatial dependence structures (S Grunwald); Strength through structure: visualization and local assessment of the trabecular bone structure (C Räth, R Monetti, J Bauer, I Sidorenko, D Müller, M Matsuura, E-M Lochmüller, P Zysset and F Eckstein); Thermonuclear supernovae: a multi-scale astrophysical problem challenging numerical simulations and visualization (F K Röpke and R Bruckschen); Visualization needs and techniques for astrophysical simulations (W Kapferer and T Riser); Flow visualization and field line advection in computational fluid dynamics: application to magnetic fields and turbulent flows (Pablo Mininni, Ed Lee, Alan Norton and John Clyne); Splotch: visualizing cosmological simulations (K Dolag, M Reinecke, C Gheller and S Imboden); Visualizing a silicon

  1. Flow visualization and velocity measurement in a small-scale open channel using an electron microscope

    International Nuclear Information System (INIS)

    Yasuda, K; Sogo, M; Iwamoto, Y

    2013-01-01

    The present note describes a method, for use in conjunction with a scanning electron microscope (SEM), that has been developed to visualize a liquid flow under a high-level vacuum and to measure the velocity field in a small-scale flow through an open channel. In general, a liquid cannot be observed via a SEM, because the liquid evaporates under the high-vacuum environment of the SEM. Therefore, an ionic liquid, a room-temperature molten salt having a vapor pressure of nearly zero, is used in the present study. We use an ionic liquid containing Au-coated tracer particles to visualize a small-scale flow under a SEM. Furthermore, the velocity distribution in the open channel is obtained by particle tracking velocimetry measurement, and a parabolic profile is confirmed. (technical design note)

  2. Validity of the modified Berg Balance Scale in adults with intellectual and visual disabilities

    NARCIS (Netherlands)

    Dijkhuizen, Annemarie; Krijnen, Wim P; van der Schans, Cees; Waninge, Aly

    BACKGROUND: A modified version of the Berg Balance Scale (mBBS) was developed for individuals with intellectual and visual disabilities (IVD). However, the concurrent and predictive validity has not yet been determined. AIM: The purpose of the current study was to evaluate the concurrent and predictive validity of the mBBS for individuals with IVD.

  3. Validity of the modified Berg Balance Scale in adults with intellectual and visual disabilities

    NARCIS (Netherlands)

    Dijkhuizen, Annemarie; Krijnen, Wim P.; van der Schans, Cees P.; Waninge, Aly

    Background: A modified version of the Berg Balance Scale (mBBS) was developed for individuals with intellectual and visual disabilities (IVD). However, the concurrent and predictive validity has not yet been determined. Aim: The purpose of the current study was to evaluate the concurrent and predictive validity of the mBBS for individuals with IVD.

  4. The visual communication in the optometric scales

    Directory of Open Access Journals (Sweden)

    Rosane Arruda Dantas

    2006-12-01

    Communication through vision involves visual learning, which demands ocular integrity; hence the importance of the evaluation of visual acuity. The scale of images, formed by optotypes, is a method for the verification of visual acuity in kindergarten children. To identify the optotype, the child needs to know the image under analysis. Given the importance of visual communication in the process of constructing scales of images, we present a bibliographic, analytical study aimed at reflecting on the principles for the construction of these charts. The drawing inserted as an optotype is considered a non-verbal symbolic expression of the body and/or of the environment, constructed from the experiences captured by the individual. The indiscriminate use of images is questioned, since prior knowledge of them is required. Despite the subjectivity of the optotypes, the scales remain valid if the images are adapted to the universe of the children to be examined.

  5. LDV measurement, flow visualization and numerical analysis of flow distribution in a close-coupled catalytic converter

    International Nuclear Information System (INIS)

    Kim, Duk Sang; Cho, Yong Seok

    2004-01-01

    Results from an experimental study of flow distribution in a close-coupled catalytic converter (CCC) are presented. The experiments were carried out with a flow measurement system specially designed for this study under steady and transient flow conditions. A pitot tube was used to measure the flow distribution at the exit of the first monolith. The flow distribution of the CCC was also measured by an LDV system and by flow visualization. Results from numerical analysis are also presented. Experimental results showed that the flow uniformity index decreases as the flow Reynolds number increases. In steady flow conditions, the flow through each exhaust pipe produced flow concentrations on a specific region of the CCC inlet. The transient test results showed that the flows through the exhaust pipes, following the engine firing order, interacted with each other in a way that made the flow distribution more uniform. The results of the numerical analysis were in qualitative agreement with the experimental results; they supported and helped explain the flow in the entry region of the CCC.
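    The flow uniformity index referred to above is not defined in the abstract; a definition widely used for catalytic converter monoliths (and assumed here) is

      \[
        \gamma = 1 - \frac{1}{2n} \sum_{i=1}^{n} \frac{\lvert v_i - \bar{v} \rvert}{\bar{v}},
      \]

    where v_i is the local velocity at the i-th measurement point on the monolith face, v-bar is the mean face velocity and n is the number of measurement points; gamma = 1 corresponds to perfectly uniform flow.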

  6. Reliability and sensitivity of visual scales versus volumetry for evaluating white matter hyperintensity progression

    DEFF Research Database (Denmark)

    Gouw, A A; van der Flier, W M; van Straaten, E C W

    2008-01-01

    We compared the reliability and sensitivity of cross-sectional and longitudinal visual scales with volumetry for measuring WMH progression. METHODS: Twenty MRI scan pairs (interval 2 years) were included from the Amsterdam center of the LADIS study. Semi-automated volumetry of WMH was performed twice by one rater. Three

  7. A Pervasive Parallel Processing Framework for Data Visualization and Analysis at Extreme Scale

    Energy Technology Data Exchange (ETDEWEB)

    Ma, Kwan-Liu [Univ. of California, Davis, CA (United States)

    2017-02-01

    efficient computation on an exascale computer. This project concludes with a functional prototype containing pervasively parallel algorithms that perform demonstratively well on many-core processors. These algorithms are fundamental for performing data analysis and visualization at extreme scale.

  8. Numerical scalings of the decay lengths in the scrape-off layer

    DEFF Research Database (Denmark)

    Militello, F.; Naulin, V; Nielsen, Anders Henry

    2013-01-01

    Numerical simulations of L-mode turbulence in the scrape-off layer (SOL) are used to construct power scaling laws for the characteristic decay lengths of the temperature, density and heat flux at the outer mid-plane. Most of the results obtained are in qualitative agreement with experimental observations, despite the known limitations of the model. Quantitative agreement is also obtained for some exponents. In particular, an almost linear inverse dependence of the heat flux decay length on the plasma current is recovered. The relative simplicity of the theoretical model used allows one to gain
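    The scaling laws themselves are not quoted in the abstract; such laws are typically constructed as power-law regressions of the form (generic symbols, exponents not taken from the paper)

      \[
        \lambda_q = C \, n_e^{\alpha_1} \, T_e^{\alpha_2} \, I_p^{\alpha_3} \, B^{\alpha_4},
      \]

    with the exponents obtained from a log-linear least-squares fit to the simulated decay lengths; the almost linear inverse dependence on the plasma current reported above corresponds to alpha_3 close to -1.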

  9. Contrast and Strength of Visual Memory and Imagery Differentially Affect Visual Perception

    OpenAIRE

    Saad, Elyana; Silvanto, Juha

    2013-01-01

    Visual short-term memory (VSTM) and visual imagery have been shown to modulate visual perception. However, how the subjective experience of VSTM/imagery and its contrast modulate this process has not been investigated. We addressed this issue by asking participants to detect brief masked targets while they were engaged either in VSTM or visual imagery. Subjective experience of memory/imagery (strength scale) and the visual contrast of the memory/mental image (contrast scale) were assessed on...

  10. Visualization needs and techniques for astrophysical simulations

    International Nuclear Information System (INIS)

    Kapferer, W; Riser, T

    2008-01-01

    Numerical simulations have evolved continuously towards being an important field in astrophysics, equivalent to theory and observation. Due to the enormous developments in computer science, in both hardware and software architecture, state-of-the-art simulations produce huge amounts of raw data with increasing complexity. In this paper, some problems in the field of visualization in numerical astrophysics are discussed together with possible solutions. Commonly used visualization packages are presented, along with a newly developed approach to real-time visualization that incorporates shader programming to exploit the computational power of modern graphics cards. With these techniques at hand, real-time visualizations help scientists to understand the coherences in the results of their numerical simulations. Furthermore, a fundamental problem in data analysis, i.e. the coverage of metadata on how a visualization was created, is highlighted.

  11. The Visual Analogue Scale for Rating, Ranking and Paired-Comparison (VAS-RRP): A new technique for psychological measurement.

    Science.gov (United States)

    Sung, Yao-Ting; Wu, Jeng-Shin

    2018-04-17

    Traditionally, the visual analogue scale (VAS) has been proposed to overcome the limitations of ordinal measures from Likert-type scales. However, the ability of VASs to overcome the limitations of response styles in Likert-type scales has not yet been addressed. Previous research using ranking and paired comparisons to compensate for the response styles of Likert-type scales has suffered from limitations, such as the fact that the total score of ipsative measures is a constant that cannot be analyzed by means of many common statistical techniques. In this study we propose a new scale, called the Visual Analogue Scale for Rating, Ranking, and Paired-Comparison (VAS-RRP), which can be used to collect rating, ranking, and paired-comparison data simultaneously, while avoiding the limitations of each of these data collection methods. The characteristics, use, and analytic method of the VAS-RRP, as well as how it overcomes the disadvantages of Likert-type scales, ranking, and VASs, are discussed. On the basis of analyses of simulated and empirical data, this study showed that the VAS-RRP improved reliability and parameter recovery and reduced response-style bias. Finally, we have also designed a VAS-RRP Generator that researchers can use to construct and administer their own VAS-RRPs.

  12. SeeDB: Efficient Data-Driven Visualization Recommendations to Support Visual Analytics.

    Science.gov (United States)

    Vartak, Manasi; Rahman, Sajjadur; Madden, Samuel; Parameswaran, Aditya; Polyzotis, Neoklis

    2015-09-01

    Data analysts often build visualizations as the first step in their analytical workflow. However, when working with high-dimensional datasets, identifying visualizations that show relevant or desired trends in data can be laborious. We propose SeeDB, a visualization recommendation engine to facilitate fast visual analysis: given a subset of data to be studied, SeeDB intelligently explores the space of visualizations, evaluates promising visualizations for trends, and recommends those it deems most "useful" or "interesting". The two major obstacles in recommending interesting visualizations are (a) scale: evaluating a large number of candidate visualizations while responding within interactive time scales, and (b) utility: identifying an appropriate metric for assessing interestingness of visualizations. For the former, SeeDB introduces pruning optimizations to quickly identify high-utility visualizations and sharing optimizations to maximize sharing of computation across visualizations. For the latter, as a first step, we adopt a deviation-based metric for visualization utility, while indicating how we may be able to generalize it to other factors influencing utility. We implement SeeDB as a middleware layer that can run on top of any DBMS. Our experiments show that our framework can identify interesting visualizations with high accuracy. Our optimizations lead to multiple orders of magnitude speedup on relational row and column stores and provide recommendations at interactive time scales. Finally, we demonstrate via a user study the effectiveness of our deviation-based utility metric and the value of recommendations in supporting visual analytics.
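    The deviation-based utility metric described above can be sketched as follows: for a candidate view (a grouped aggregate), compare its normalized distribution on the target subset with the same distribution on the reference data set, and rank views by the distance between the two. The toy data below and the choice of Euclidean distance are illustrative assumptions, not SeeDB's exact implementation.

      # Sketch of a deviation-based utility: normalize the aggregate view on the
      # target subset and on the reference data, then score by their distance.
      # Assumes both subsets share the same group keys; toy data only.
      import numpy as np

      def view_distribution(rows, group_key, value_key):
          totals = {}
          for row in rows:
              totals[row[group_key]] = totals.get(row[group_key], 0.0) + row[value_key]
          keys = sorted(totals)
          vec = np.array([totals[k] for k in keys], dtype=float)
          return keys, vec / vec.sum()   # normalize to a probability distribution

      def utility(target_rows, reference_rows, group_key, value_key):
          _, p = view_distribution(target_rows, group_key, value_key)
          _, q = view_distribution(reference_rows, group_key, value_key)
          return float(np.linalg.norm(p - q))   # larger deviation = more "interesting"

      reference = [{"region": "N", "sales": 50}, {"region": "S", "sales": 50}]
      target = [{"region": "N", "sales": 90}, {"region": "S", "sales": 10}]
      print(utility(target, reference, "region", "sales"))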

  13. A concurrent visualization system for large-scale unsteady simulations. Parallel vector performance on an NEC SX-4

    International Nuclear Information System (INIS)

    Takei, Toshifumi; Doi, Shun; Matsumoto, Hideki; Muramatsu, Kazuhiro

    2000-01-01

    We have developed a concurrent visualization system RVSLIB (Real-time Visual Simulation Library). This paper shows the effectiveness of the system when it is applied to large-scale unsteady simulations, for which the conventional post-processing approach may no longer work, on high-performance parallel vector supercomputers. The system performs almost all of the visualization tasks on a computation server and uses compressed visualized image data for efficient communication between the server and the user terminal. We have introduced several techniques, including vectorization and parallelization, into the system to minimize the computational costs of the visualization tools. The performance of RVSLIB was evaluated by using an actual CFD code on an NEC SX-4. The computational time increase due to the concurrent visualization was at most 3% for a smaller (1.6 million) grid and less than 1% for a larger (6.2 million) one. (author)

  14. Multi-scale data visualization for computational astrophysics and climate dynamics at Oak Ridge National Laboratory

    International Nuclear Information System (INIS)

    Ahern, Sean; Daniel, Jamison R; Gao, Jinzhu; Ostrouchov, George; Toedte, Ross J; Wang, Chaoli

    2006-01-01

    Computational astrophysics and climate dynamics are two principal application foci at the Center for Computational Sciences (CCS) at Oak Ridge National Laboratory (ORNL). We identify a dataset frontier that is shared by several SciDAC computational science domains and present an exploration of traditional production visualization techniques enhanced with new enabling research technologies such as advanced parallel occlusion culling and high resolution small multiples statistical analysis. In collaboration with our research partners, these techniques will allow the visual exploration of a new generation of peta-scale datasets that cross this data frontier along all axes

  15. Power profiles and short-term visual performance of soft contact lenses.

    Science.gov (United States)

    Papas, Eric; Dahms, Anne; Carnt, Nicole; Tahhan, Nina; Ehrmann, Klaus

    2009-04-01

    To investigate the manner in which contemporary soft contact lenses differ in the distribution of optical power within their optic zones and to establish if these variations affect the vision of wearers or the prescribing procedure for back vertex power (BVP). By using a Visionix VC 2001 contact lens power analyzer, power profiles were measured across the optic zones of the following contemporary contact lenses: ACUVUE 2, ACUVUE ADVANCE, O2OPTIX, NIGHT & DAY and PureVision. Single BVP measures were obtained using a Nikon projection lensometer. Visual performance was assessed in 28 masked subjects who wore each lens type in random order. Measurements taken were high and low contrast visual acuity in normal illumination (250 cd/m²), high contrast acuity in reduced illumination (5 cd/m²), subjective visual quality using a numerical rating scale, and visual satisfaction rating using a Likert scale. Marked differences in the distribution of optical power across the optic zone were evident among the lens types. No significant differences were found for any of the visual performance variables (p > 0.05, analysis of variance with repeated measures and Friedman test). Variations in power profile between contemporary soft lens types exist but do not, in general, result in measurable visual performance differences in the short term, nor do they substantially influence the BVP required for optimal correction.

  16. Patient DF's visual brain in action: Visual feedforward control in visual form agnosia.

    Science.gov (United States)

    Whitwell, Robert L; Milner, A David; Cavina-Pratesi, Cristiana; Barat, Masihullah; Goodale, Melvyn A

    2015-05-01

    Patient DF, who developed visual form agnosia following ventral-stream damage, is unable to discriminate the width of objects, performing at chance, for example, when asked to open her thumb and forefinger a matching amount. Remarkably, however, DF adjusts her hand aperture to accommodate the width of objects when reaching out to pick them up (grip scaling). While this spared ability to grasp objects is presumed to be mediated by visuomotor modules in her relatively intact dorsal stream, it is possible that it may rely abnormally on online visual or haptic feedback. We report here that DF's grip scaling remained intact when her vision was completely suppressed during grasp movements, and it still dissociated sharply from her poor perceptual estimates of target size. We then tested whether providing trial-by-trial haptic feedback after making such perceptual estimates might improve DF's performance, but found that they remained significantly impaired. In a final experiment, we re-examined whether DF's grip scaling depends on receiving veridical haptic feedback during grasping. In one condition, the haptic feedback was identical to the visual targets. In a second condition, the haptic feedback was of a constant intermediate width while the visual target varied trial by trial. Despite this incongruent feedback, DF still scaled her grip aperture to the visual widths of the target blocks, showing only normal adaptation to the false haptically-experienced width. Taken together, these results strengthen the view that DF's spared grasping relies on a normal mode of dorsal-stream functioning, based chiefly on visual feedforward processing. Copyright © 2014 Elsevier B.V. All rights reserved.

  17. A visual analogue scale and a Likert scale are simple and responsive tools for assessing dysphagia in eosinophilic oesophagitis.

    Science.gov (United States)

    Reed, C C; Wolf, W A; Cotton, C C; Dellon, E S

    2017-06-01

    While symptom scores have been developed to evaluate dysphagia in eosinophilic oesophagitis (EoE), their complexity may limit clinical use. To evaluate a visual analogue scale (VAS) and a 10-point Likert scale (LS) for assessment of dysphagia severity before and after EoE treatment. We conducted a prospective cohort study enrolling consecutive adults undergoing out-patient endoscopy. Incident cases of EoE were diagnosed per consensus guidelines. At diagnosis and after 8 weeks of treatment, symptoms were measured using the VAS, LS and the Mayo Dysphagia Questionnaire (MDQ). The percentage changes in scores before and after treatment were compared overall and in treatment responders. Both the VAS and the LS proved to be simple and responsive tools for assessing dysphagia severity in EoE in clinical practice. © 2017 John Wiley & Sons Ltd.

  18. A Visual Analogue Scale and a Likert Scale are Simple and Responsive Tools for Assessing Dysphagia in Eosinophilic Esophagitis

    Science.gov (United States)

    Reed, Craig C.; Wolf, W. Asher; Cotton, Cary C.; Dellon, Evan S.

    2017-01-01

    Background While symptom scores have been developed to evaluate dysphagia in eosinophilic oesophagitis (EoE), their complexity may limit clinical use. Aim We aimed to evaluate a visual analogue scale (VAS) and a 10 point Likert scale (LS) for assessment of dysphagia severity before and after EoE treatment. Methods We conducted a prospective cohort study enrolling consecutive adults undergoing outpatient endoscopy. Incident cases of EoE were diagnosed per consensus guidelines. At diagnosis and after 8 weeks of treatment, symptoms were measured using the VAS, LS, and the Mayo Dysphagia Questionnaire (MDQ). The percentage changes in scores before and after treatment were compared overall and in treatment responders. Both the VAS and the LS proved to be simple and responsive tools for assessing dysphagia severity in EoE in clinical practice. PMID:28370355

  19. Psychometric properties of the Numeric Pain Rating Scale and Neck Disability Index in patients with cervicogenic headache.

    Science.gov (United States)

    Young, Ian A; Dunning, James; Butts, Raymond; Cleland, Joshua A; Fernández-de-Las-Peñas, César

    2018-01-01

    Background Self-reported disability and pain intensity are commonly used outcomes in patients with cervicogenic headaches. However, there is a paucity of psychometric evidence to support the use of these self-report outcomes for individuals treated for cervicogenic headaches; therefore, it is unknown whether these measures are reliable, responsive, or result in meaningful clinically important changes in this patient population. Methods A secondary analysis of a randomized clinical trial (n = 110) examining the effects of spinal manipulative therapy with and without exercise in patients with cervicogenic headaches. Reliability, construct validity, responsiveness and thresholds for minimal detectable change and clinically important difference values were calculated for the Neck Disability Index and Numeric Pain Rating Scale. Results The Neck Disability Index exhibited excellent reliability (ICC = 0.92; [95% CI: 0.46-0.97]), while the Numeric Pain Rating Scale exhibited moderate reliability (ICC = 0.72; [95% CI: 0.08-0.90]) in the short term. Both instruments also exhibited adequate responsiveness (area under the curve range = 0.78-0.93) and construct validity. Minimal clinically important differences were established for both instruments, including a 5.5-point reduction on the Neck Disability Index after 4 weeks of intervention.
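    Minimal detectable change values of the kind reported in studies like this one are usually derived from the test-retest reliability via the standard error of measurement; the standard formulas (assumed here, not quoted from the paper) are

      \[
        \mathrm{SEM} = \mathrm{SD} \, \sqrt{1 - \mathrm{ICC}}, \qquad
        \mathrm{MDC}_{95} = 1.96 \times \sqrt{2} \times \mathrm{SEM},
      \]

    where SD is the baseline standard deviation of the scores and ICC is the test-retest intraclass correlation coefficient.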

  20. Experimental, theoretical, and numerical studies of small scale combustion

    Science.gov (United States)

    Xu, Bo

    Recently, the demand increased for the development of microdevices such as microsatellites, microaerial vehicles, micro reactors, and micro power generators. To meet those demands the biggest challenge is obtaining stable and complete combustion at relatively small scale. To gain a fundamental understanding of small scale combustion in this thesis, thermal and kinetic coupling between the gas phase and the structure at meso and micro scales were theoretically, experimentally, and numerically studied; new stabilization and instability phenomena were identified; and new theories for the dynamic mechanisms of small scale combustion were developed. The reduction of thermal inertia at small scale significantly reduces the response time of the wall and leads to a strong flame-wall coupling and extension of burning limits. Mesoscale flame propagation and extinction in small quartz tubes were theoretically, experimentally and numerically studied. It was found that wall-flame interaction in mesoscale combustion led to two different flame regimes, a heat-loss dominant fast flame regime and a wall-flame coupling slow flame regime. The nonlinear transition between the two flame regimes was strongly dependent on the channel width and flow velocity. It is concluded that the existence of multiple flame regimes is an inherent phenomenon in mesoscale combustion. In addition, all practical combustors have variable channel width in the direction of flame propagation. Quasi-steady and unsteady propagations of methane and propane-air premixed flames in a mesoscale divergent channel were investigated experimentally and theoretically. The emphasis was the impact of variable cross-section area and the flame-wall coupling on the flame transition between different regimes and the onset of flame instability. For the first time, spinning flames were experimentally observed for both lean and rich methane and propane-air mixtures in a broad range of equivalence ratios. An effective Lewis number

  1. Experimental and numerical study on free surface behavior of windowless target

    International Nuclear Information System (INIS)

    Su Guanyu; Gu Hanyang; Cheng Xu

    2012-01-01

    The formation and control of the coolant free surface is one of the key technologies for the design of a windowless target in an accelerator driven sub-critical system (ADS). Experimental and CFD investigations of the free surface behavior were performed in a scaled windowless target model using water as the test fluid. Laser induced fluorescence was applied for flow field visualization. Free surface and flow field visualizations were obtained at Re = 30000-50000. Under high Re conditions, an unsteady vortex pair was obtained. As Re decreases, the structure of the vortex becomes more turbulent. CFD simulations were performed using the LES and k-ω SST turbulence models separately. The numerical results show that the LES model can qualitatively reproduce the characteristics of the flow field and free surface. (authors)

  2. Large Scale Visual Recognition

    Science.gov (United States)

    2012-06-01

    Figure 2.5: Visualization of the mammal hierarchy. (Only figure-label fragments, such as animal category names and a "Caltech101 lossless JPG size" axis label, are available for this record; no abstract text is recoverable.)

  3. The role of categorization and scale endpoint comparisons in numerical information processing: A two-process model.

    Science.gov (United States)

    Tao, Tao; Wyer, Robert S; Zheng, Yuhuang

    2017-03-01

    We propose a two-process conceptualization of numerical information processing to describe how people form impressions of a score that is described along a bounded scale. According to the model, people spontaneously categorize a score as high or low. Furthermore, they compare the numerical discrepancy between the score and the endpoint of the scale to which it is closer, if they are not confident of their categorization, and use implications of this comparison as a basis for judgment. As a result, their evaluation of the score is less extreme when the range of numbers along the scale is large (e.g., from 0 to 100) than when it is small (from 0 to 10). Six experiments support this two-process model and demonstrate its generalizability. Specifically, the magnitude of numbers composing the scale has less impact on judgments (a) when the score being evaluated is extreme, (b) when individuals are unmotivated to engage in endpoint comparison processes (i.e., they are low in need for cognition), and (c) when they are unable to do so (i.e., they are under cognitive load). Moreover, the endpoint to which individuals compare the score can depend on their regulatory focus. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
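
    As a rough numerical illustration of the endpoint-comparison step described in this record (not taken from the study's materials), the sketch below computes the absolute discrepancy between a score and the scale endpoint it is closer to; the same relative score yields a larger discrepancy on a 0-100 scale than on a 0-10 scale, consistent with the less extreme evaluations reported for wide scales.

      # Hypothetical illustration of the endpoint-comparison arithmetic; the
      # scores and scales below are invented for the example.
      def endpoint_discrepancy(score, low, high):
          """Distance from the score to the scale endpoint it is closer to."""
          return min(score - low, high - score)

      for score, low, high in [(8, 0, 10), (80, 0, 100)]:
          print(f"score {score} on [{low}, {high}]: "
                f"discrepancy to nearer endpoint = {endpoint_discrepancy(score, low, high)}")
      # score 8 on [0, 10]: discrepancy to nearer endpoint = 2
      # score 80 on [0, 100]: discrepancy to nearer endpoint = 20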

  4. Topologically-based visualization of large-scale volume data

    International Nuclear Information System (INIS)

    Takeshima, Y.; Tokunaga, M.; Fujishiro, I.; Takahashi, S.

    2004-01-01

    Due to recent progress in the performance of computing and measurement environments and the advent of ITBL environments, volume datasets have become larger and more complicated. Although computer visualization is one of the tools for analyzing such datasets effectively, it is almost impossible to adjust the visualization parameter values by trial and error without taking the features of a given volume dataset into consideration. In this article, we introduce a scheme of topologically-based volume visualization, which is intended to choose appropriate visualization parameter values automatically through topological volume skeletonization. (author)

  5. Cut points on 0-10 numeric rating scales for symptoms included in the Edmonton Symptom Assessment Scale in cancer patients: A systematic review

    NARCIS (Netherlands)

    W.H. Oldenmenger (Wendy); P.J. de Raaf (Pleun); C. de Klerk (Cora); C.C.D. van der Rijt (Carin)

    2013-01-01

    Context: To improve the management of cancer-related symptoms, systematic screening is necessary, often performed by using 0-10 numeric rating scales. Cut points are used to determine if scores represent clinically relevant burden. Objectives: The aim of this systematic review was to

  6. Comprehensive numerical modelling of tokamaks

    International Nuclear Information System (INIS)

    Cohen, R.H.; Cohen, B.I.; Dubois, P.F.

    1991-01-01

    We outline a plan for the development of a comprehensive numerical model of tokamaks. The model would consist of a suite of independent, communicating packages describing the various aspects of tokamak performance (core and edge transport coefficients and profiles, heating, fueling, magnetic configuration, etc.) as well as extensive diagnostics. These codes, which may run on different computers, would be flexibly linked by a user-friendly shell which would allow run-time specification of packages and generation of pre- and post-processing functions, including workstation-based visualization of output. One package in particular, the calculation of core transport coefficients via gyrokinetic particle simulation, will become practical on the scale required for comprehensive modelling only with the advent of teraFLOP computers. Incremental effort at LLNL would be focused on gyrokinetic simulation and development of the shell

  7. Visual Thing Recognition with Binary Scale-Invariant Feature Transform and Support Vector Machine Classifiers Using Color Information

    OpenAIRE

    Wei-Jong Yang; Wei-Hau Du; Pau-Choo Chang; Jar-Ferr Yang; Pi-Hsia Hung

    2017-01-01

    The demand for smart visual thing recognition in various devices has increased rapidly in recent years for daily smart production, living, and learning systems. This paper proposes a visual thing recognition system that combines the binary scale-invariant feature transform (SIFT), the bag-of-words model (BoW), and support vector machine (SVM) classifiers using color information. Since traditional SIFT features and SVM classifiers only use the gray information, color information is still an importan...
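
    A minimal sketch of the generic SIFT + bag-of-words + SVM pipeline that this record describes is given below; it is not the authors' implementation (it uses plain grayscale SIFT via OpenCV and scikit-learn, and the image paths and labels are hypothetical placeholders).

      # Sketch of a grayscale SIFT + BoW + SVM pipeline (not the paper's
      # color-enhanced system); paths and labels below are hypothetical.
      import cv2
      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.svm import SVC

      def sift_descriptors(path, sift):
          gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
          if gray is None:                      # placeholder paths may not exist
              return np.empty((0, 128), np.float32)
          _, desc = sift.detectAndCompute(gray, None)
          return desc if desc is not None else np.empty((0, 128), np.float32)

      def bow_histogram(desc, vocab):
          words = vocab.predict(desc) if len(desc) else []
          hist, _ = np.histogram(words, bins=np.arange(vocab.n_clusters + 1))
          return hist / max(hist.sum(), 1)      # normalized visual-word histogram

      train_paths = ["img/cat1.jpg", "img/dog1.jpg"]   # hypothetical files
      train_labels = [0, 1]

      sift = cv2.SIFT_create()
      all_desc = np.vstack([sift_descriptors(p, sift) for p in train_paths])
      vocab = KMeans(n_clusters=50, n_init=10).fit(all_desc)   # visual vocabulary

      X = np.array([bow_histogram(sift_descriptors(p, sift), vocab) for p in train_paths])
      clf = SVC(kernel="rbf").fit(X, train_labels)             # SVM classifier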

  8. Application of numerical environment system to regional atmospheric radioactivity transport simulations

    International Nuclear Information System (INIS)

    Yamazawa, H.; Ohkura, T.; Iida, T.; Chino, M.; Nagai, H.

    2003-01-01

    Main functions of the Numerical Environment System (NES), as a part of the Information Technology Based Laboratory (ITBL) project implemented by the Japan Atomic Energy Research Institute, became available for test use although the development of the system is still underway. This system consists of numerical models of meteorology and atmospheric dispersion, the databases necessary for model simulations, post- and pre-processors such as data conversion and visualization tools, and a suite of system software which provides the users with system functions through web page access. The system utilizes calculation servers, such as vector- and scalar-parallel processors, for numerical model execution, and an EWS which serves as the hub of the system. The system provides users in the fields of nuclear emergency preparedness and atmospheric environment with easy-to-use functions for atmospheric dispersion simulations, including input meteorological data preparation and visualization of simulation results. The performance of the numerical models in the system was examined with observation data of long-range transported radon-222. The models in the system reproduced quite well the temporal variations in the observed radon-222 concentrations in air caused by changes in the meteorological field on the synoptic scale. By applying the NES models in combination with the idea of backward-in-time atmospheric dispersion simulation, the seasonal shift of the source areas of radon-222 in the eastern Asian region affecting the concentrations in Japan was quantitatively illustrated. (authors)

  9. Visual Constructive and Visual-Motor Skills in Deaf Native Signers

    Science.gov (United States)

    Hauser, Peter C.; Cohen, Julie; Dye, Matthew W. G.; Bavelier, Daphne

    2007-01-01

    Visual constructive and visual-motor skills in the deaf population were investigated by comparing performance of deaf native signers (n = 20) to that of hearing nonsigners (n = 20) on the Beery-Buktenica Developmental Test of Visual-Motor Integration, Rey-Osterrieth Complex Figure Test, Wechsler Memory Scale Visual Reproduction subtest, and…

  10. Numerical investigation on flow behavior and energy separation in a micro-scale vortex tube

    Directory of Open Access Journals (Sweden)

    Rahbar Nader

    2015-01-01

    Full Text Available There are a few experimental and numerical studies on the behaviour of micro-scale vortex tubes. The intention of this work is to investigate the energy separation phenomenon in a micro-scale vortex tube by using computational fluid dynamics. The flow is assumed to be steady, turbulent, and a compressible ideal gas, and the shear-stress transport (SST) k-ω model is used for modeling the turbulence. The results show that a 3-D CFD simulation is more accurate than a 2-D axisymmetric one. Moreover, the optimum cold-mass ratios that maximize the refrigeration power and isentropic efficiency are evaluated. The results for the static temperature, velocity magnitude, and pressure distributions show that the temperature separation in the micro-scale vortex tube is a function of the kinetic-energy variation and air expansion in the radial direction.

  11. WebViz: A Web-based Collaborative Interactive Visualization System for Large-Scale Data Sets

    Science.gov (United States)

    Yuen, D. A.; McArthur, E.; Weiss, R. M.; Zhou, J.; Yao, B.

    2010-12-01

    WebViz is a web-based application designed to conduct collaborative, interactive visualizations of large data sets for multiple users, allowing researchers situated all over the world to utilize the visualization services offered by the University of Minnesota's Laboratory for Computational Sciences and Engineering (LCSE). This ongoing project has been built upon over the last 3 1/2 years. The motivation behind WebViz lies primarily with the need to parse through an increasing amount of data produced by the scientific community as a result of larger and faster multicore and massively parallel computers coming to the market, including the use of general purpose GPU computing. WebViz allows these large data sets to be visualized online by anyone with an account. The application allows users to save time and resources by visualizing data 'on the fly', wherever he or she may be located. By leveraging AJAX via the Google Web Toolkit (http://code.google.com/webtoolkit/), we are able to provide users with a remote, web portal to LCSE's (http://www.lcse.umn.edu) large-scale interactive visualization system already in place at the University of Minnesota. LCSE's custom hierarchical volume rendering software provides high resolution visualizations on the order of 15 million pixels and has been employed for visualizing data primarily from simulations in astrophysics to geophysical fluid dynamics. In the current version of WebViz, we have implemented a highly extensible back-end framework built around HTTP "server push" technology. The web application is accessible via a variety of devices including netbooks, iPhones, and other web and javascript-enabled cell phones. Features in the current version include the ability for users to (1) securely login, (2) launch multiple visualizations, (3) conduct collaborative visualization sessions, (4) delegate control aspects of a visualization to others, and (5) engage in collaborative chats with other users within the user interface

  12. Performance of the Visual Analogue Scale of Happiness and of the Cornell Scale for Depression in Dementia in the Tremembé Epidemiological Study, Brazil

    Directory of Open Access Journals (Sweden)

    Karolina G. César

    Full Text Available Depression is a major growing public health problem. Many population studies have found a significant relationship between depression and the presence of cognitive disorders. OBJECTIVE: To establish the correlation between the Visual Analogue Scale of Happiness and the Cornell Scale for Depression in Dementia in the population aged 60 years or over in the city of Tremembé, state of São Paulo, Brazil. METHODS: An epidemiological survey involving home visits was carried out in the city of Tremembé. The sample was randomly selected by drawing 20% of the population aged 60 years or older from each of the city's census sectors. In this single-phase study, the assessment included clinical history, physical and neurological examination, cognitive evaluation, and application of both the Cornell Scale and the Analogue Scale of Happiness for psychiatric symptoms. The presence of depressive symptoms was defined as scores greater than or equal to 8 points on the Cornell Scale. RESULTS: A total of 623 subjects were evaluated and of these 251 (40.3%) had clinically significant depressive symptoms on the Cornell Scale, with a significant association with female gender (p<0.001) and with lower education (p=0.012). One hundred and thirty-six participants (21.8%) chose the unhappiness faces, with a significant association with age (p<0.001), female gender (p=0.020) and low socioeconomic status (p=0.012). Although there was a statistically significant association on the correlation test, the correlation was not high (rho=0.47). CONCLUSION: The prevalence of depressive symptoms was high in this sample and the Visual Analogue Scale of Happiness and Cornell Scale for Depression in Dementia should not be used as similar alternatives for evaluating the presence of depressive symptoms, at least in populations with low educational level.

  13. Global-local visual biases correspond with visual-spatial orientation.

    Science.gov (United States)

    Basso, Michael R; Lowery, Natasha

    2004-02-01

    Within the past decade, numerous investigations have demonstrated reliable associations of global-local visual processing biases with right and left hemisphere function, respectively (cf. Van Kleeck, 1989). Yet the relevance of these biases to other cognitive functions is not well understood. Towards this end, the present research examined the relationship between global-local visual biases and perception of visual-spatial orientation. Twenty-six women and 23 men completed a global-local judgment task (Kimchi and Palmer, 1982) and the Judgment of Line Orientation Test (JLO; Benton, Sivan, Hamsher, Varney, and Spreen, 1994), a measure of visual-spatial orientation. As expected, men had better performance on JLO. Extending previous findings, global biases were related to better visual-spatial acuity on JLO. The findings suggest that global-local biases and visual-spatial orientation may share underlying cerebral mechanisms. Implications of these findings for other visually mediated cognitive outcomes are discussed.

  14. Theoretical and Numerical Properties of a Gyrokinetic Plasma: Issues Related to Transport Time Scale Simulation

    International Nuclear Information System (INIS)

    Lee, W.W.

    2003-01-01

    Particle simulation has played an important role in recent investigations of turbulence in magnetically confined plasmas. In this paper, theoretical and numerical properties of a gyrokinetic plasma as well as its relationship with magnetohydrodynamics (MHD) are discussed, with the ultimate aim of simulating microturbulence on the transport time scale using massively parallel computers.

  15. Auditory motion in the sighted and blind: Early visual deprivation triggers a large-scale imbalance between auditory and "visual" brain regions.

    Science.gov (United States)

    Dormal, Giulia; Rezk, Mohamed; Yakobov, Esther; Lepore, Franco; Collignon, Olivier

    2016-07-01

    How early blindness reorganizes the brain circuitry that supports auditory motion processing remains controversial. We used fMRI to characterize brain responses to in-depth, laterally moving, and static sounds in early blind and sighted individuals. Whole-brain univariate analyses revealed that the right posterior middle temporal gyrus and superior occipital gyrus selectively responded to both in-depth and laterally moving sounds only in the blind. These regions overlapped with regions selective for visual motion (hMT+/V5 and V3A) that were independently localized in the sighted. In the early blind, the right planum temporale showed enhanced functional connectivity with right occipito-temporal regions during auditory motion processing and a concomitant reduced functional connectivity with parietal and frontal regions. Whole-brain searchlight multivariate analyses demonstrated higher auditory motion decoding in the right posterior middle temporal gyrus in the blind compared to the sighted, while decoding accuracy was enhanced in the auditory cortex bilaterally in the sighted compared to the blind. Analyses targeting individually defined visual area hMT+/V5 however indicated that auditory motion information could be reliably decoded within this area even in the sighted group. Taken together, the present findings demonstrate that early visual deprivation triggers a large-scale imbalance between auditory and "visual" brain regions that typically support the processing of motion information. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Steady-state numerical modeling of size effects in micron scale wire drawing

    DEFF Research Database (Denmark)

    Juul, Kristian Jørgensen; Nielsen, Kim Lau; Niordson, Christian Frithiof

    2017-01-01

    Wire drawing processes at the micron scale have received increased interest as micro wires are increasingly required in electrical components. It is well-established that size effects due to large strain gradient effects play an important role at this scale, and the present study aims to quantify these effects for the wire drawing process. Focus will be on investigating the impact of size effects on the most favourable tool geometry (in terms of minimizing the drawing force) for various conditions at the wire/tool interface. The numerical analysis is based on a steady-state framework that enables convergence without dealing with the transient regime, but still fully accounts for the history dependence as well as the elastic unloading. Thus, it forms the basis for a comprehensive parameter study. During the deformation process in wire drawing, large plastic strain gradients evolve in the contact region...

  17. Visualization, Light Transport, and Big Data

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    Scientific data treated by current peta-scale computers and coming exa-scale computers are huge, and are known as "Big Data". For a better understanding and analysis of this massive amount of information, visualization is an important task. Large-scale, efficient, and visually compelling visualization poses many challenges. In this talk, I will introduce the most promising visualization technique for big data, namely the ray tracing method from light transport theory combined with image composition. I will also show some recent large scale visualization results obtained with this technique on the K computer, a 10 peta-flops supercomputer with up to 65,536 computing nodes.

  18. Verbalizing, Visualizing, and Navigating: The Effect of Strategies on Encoding a Large-Scale Virtual Environment

    Science.gov (United States)

    Kraemer, David J. M.; Schinazi, Victor R.; Cawkwell, Philip B.; Tekriwal, Anand; Epstein, Russell A.; Thompson-Schill, Sharon L.

    2017-01-01

    Using novel virtual cities, we investigated the influence of verbal and visual strategies on the encoding of navigation-relevant information in a large-scale virtual environment. In 2 experiments, participants watched videos of routes through 4 virtual cities and were subsequently tested on their memory for observed landmarks and their ability to…

  19. A neural basis for the visual sense of number and its development: A steady-state visual evoked potential study in children and adults

    Directory of Open Access Journals (Sweden)

    Joonkoo Park

    2018-04-01

    Full Text Available While recent studies in adults have demonstrated the existence of a neural mechanism for a visual sense of number, little is known about its development and whether such a mechanism exists at young ages. In the current study, I introduce a novel steady-state visual evoked potential (SSVEP) technique to objectively quantify early visual cortical sensitivity to numerical and non-numerical magnitudes of a dot array. I then examine this neural sensitivity to numerical magnitude in children between three and ten years of age and in college students. Children overall exhibit strong SSVEP sensitivity to numerical magnitude at the right occipital sites, with negligible SSVEP sensitivity to non-numerical magnitudes, a pattern similar to what is observed in adults. However, a closer examination of age differences reveals that this selective neural sensitivity to numerical magnitude, which is close to absent in three-year-olds, increases steadily as a function of age, while there is virtually no neural sensitivity to other non-numerical magnitudes across these ages. These results demonstrate the emergence of a neural mechanism underlying direct perception of numerosity across early and middle childhood and provide a potential neural mechanistic explanation for the development of humans' primitive, non-verbal ability to comprehend number. Keywords: Numerosity, Steady-state visual evoked potential, Child development, Visual cortex, Approximate number system

  20. Direct Numerical Simulation and Visualization of Subcooled Pool Boiling

    Directory of Open Access Journals (Sweden)

    Tomoaki Kunugi

    2014-01-01

    Full Text Available Direct numerical simulation of boiling phenomena is one of the promising approaches for clarifying their heat transfer characteristics and discussing the underlying mechanisms. Over recent decades, many DNS procedures have been developed, taking advantage of high performance computers and computational technologies. In this paper, the state of the art of direct numerical simulation of pool boiling phenomena over roughly the last two decades is briefly summarized first, and then the nonempirical boiling and condensation model proposed by the authors is introduced into the MARS (Multi-Interface Advection and Reconstruction Solver) developed by the authors. In addition, in order to clarify the boiling bubble behaviors under subcooled conditions, subcooled pool boiling experiments are performed using a high-speed, high-spatial-resolution camera with a highly magnified telescope. Based on the numerical simulations of the subcooled pool boiling phenomena, the numerical results obtained by MARS are validated by comparison with the experimental results and existing analytical solutions. The numerical results regarding the time evolution of the boiling bubble departure process under subcooled conditions show very good agreement with the experimental results. In conclusion, it can be said that the proposed nonempirical boiling and condensation model combined with MARS has been validated.

  1. Development of small scale cluster computer for numerical analysis

    Science.gov (United States)

    Zulkifli, N. H. N.; Sapit, A.; Mohammed, A. N.

    2017-09-01

    In this study, two personal computers were successfully networked together to form a small scale cluster. Each of the processors involved is a multicore processor with four cores, giving the cluster eight processing cores in total. The cluster runs the Ubuntu 14.04 LINUX environment with an MPI implementation (MPICH2). Two main tests were conducted on the cluster: a communication test and a performance test. The communication test was done to make sure that the computers are able to pass the required information without any problem, and was carried out using a simple MPI "Hello" program written in the C language. In addition, a performance test was done to show that the calculation performance of this cluster is much better than that of a single-CPU computer. In this performance test, four runs were done by executing the same code using a single node, 2 processors, 4 processors, and 8 processors. The results show that with additional processors the time required to solve the problem decreases; the calculation time is roughly halved when the number of processors is doubled. To conclude, we successfully developed a small scale cluster computer using common hardware which is capable of higher computing power when compared to a single-CPU computer, and this can be beneficial for research that requires high computing power, especially numerical analysis such as finite element analysis, computational fluid dynamics, and computational physics analysis.
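
    The record above describes a simple MPI "Hello" communication check written in C; the snippet below is an analogous check written in Python with mpi4py (an assumption of this sketch, not part of the original study), which can be launched across the cluster with mpiexec.

      # Minimal communication check analogous to the "MPI Hello" test described
      # above, written here with mpi4py rather than the C program used in the study.
      # Run with, e.g.:  mpiexec -n 8 python hello_mpi.py
      from mpi4py import MPI

      comm = MPI.COMM_WORLD
      rank = comm.Get_rank()           # id of this process
      size = comm.Get_size()           # total number of processes launched
      name = MPI.Get_processor_name()  # hostname of the node running this process

      print(f"Hello from rank {rank} of {size} on node {name}")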

  2. VisIO: enabling interactive visualization of ultra-scale, time-series data via high-bandwidth distributed I/O systems

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, Christopher J [Los Alamos National Laboratory; Ahrens, James P [Los Alamos National Laboratory; Wang, Jun [UCF

    2010-10-15

    Petascale simulations compute at resolutions ranging into billions of cells and write terabytes of data for visualization and analysis. Interactive visualization of this time series is a desired step before starting a new run. The I/O subsystem and associated network often are a significant impediment to interactive visualization of time-varying data, as they are not configured or provisioned to provide the necessary I/O read rates. In this paper, we propose a new I/O library for visualization applications: VisIO. Visualization applications commonly use N-to-N reads within their parallel-enabled readers, which provides an incentive for a shared-nothing approach to I/O, similar to other data-intensive approaches such as Hadoop. However, unlike other data-intensive applications, visualization requires: (1) interactive performance for large data volumes, (2) compatibility with MPI and POSIX file system semantics for compatibility with existing infrastructure, and (3) use of existing file formats and their stipulated data partitioning rules. VisIO provides a mechanism for using a non-POSIX distributed file system to provide linear scaling of I/O bandwidth. In addition, we introduce a novel scheduling algorithm that helps to co-locate visualization processes on nodes with the requested data. Testing of VisIO integrated into ParaView was conducted using the Hadoop Distributed File System (HDFS) on TACC's Longhorn cluster. A representative dataset, VPIC, across 128 nodes showed a 64.4% read performance improvement compared to the provided Lustre installation. Also tested was a dataset representing a global ocean salinity simulation, which showed a 51.4% improvement in read performance over Lustre when using our VisIO system. VisIO provides powerful high-performance I/O services to visualization applications, allowing for interactive performance with ultra-scale, time-series data.

  3. Landscape Aesthetics and the Scenic Drivers of Amenity Migration in the New West: Naturalness, Visual Scale, and Complexity

    Directory of Open Access Journals (Sweden)

    Jelena Vukomanovic

    2014-04-01

    Full Text Available Values associated with scenic beauty are common “pull factors” for amenity migrants; however, the specific landscape features that attract amenity migration are poorly understood. In this study we focused on three visual quality metrics of the intermountain West (USA), with the objective of exploring the relationship between the location of exurban homes and aesthetic landscape preference, as exemplified through greenness, viewshed size, and terrain ruggedness. Using viewshed analysis, we compared the viewsheds of actual exurban houses to the viewsheds of randomly distributed simulated (validation) houses. We found that the actual exurban households can see significantly more vegetation and a more rugged (complex) terrain than the simulated houses. Actual exurban homes see a more rugged terrain, but do not necessarily see the highest peaks, suggesting that visual complexity throughout the viewshed may be more important. The viewsheds visible from the actual exurban houses were significantly larger than those visible from the simulated houses, indicating that visual scale is important to the general aesthetic experiences of exurbanites. The differences in visual quality metric values between actual exurban and simulated viewsheds call into question the use of county-level scales of analysis for the study of landscape preferences, which may miss key landscape aesthetic drivers of preference.
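
    As an illustration of the kind of ruggedness metric referred to above, the sketch below computes one common terrain ruggedness index (the square root of the summed squared elevation differences between a cell and its eight neighbours) on a synthetic elevation grid; the study's exact metric, software, and data are not reproduced here.

      # One common terrain ruggedness index, shown only as an example of the
      # type of metric the record mentions; the DEM below is synthetic.
      import numpy as np

      def terrain_ruggedness_index(dem):
          """dem: 2-D array of elevations; returns TRI for the interior cells."""
          c = dem[1:-1, 1:-1]
          sq_diff = np.zeros_like(c, dtype=float)
          for di in (-1, 0, 1):
              for dj in (-1, 0, 1):
                  if di == 0 and dj == 0:
                      continue
                  neighbour = dem[1 + di:dem.shape[0] - 1 + di,
                                  1 + dj:dem.shape[1] - 1 + dj]
                  sq_diff += (neighbour - c) ** 2
          return np.sqrt(sq_diff)

      dem = np.random.default_rng(0).normal(1000.0, 50.0, (100, 100))  # synthetic DEM
      print(terrain_ruggedness_index(dem).mean())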

  4. Health-Terrain: Visualizing Large Scale Health Data

    Science.gov (United States)

    2015-04-01

    An extension of LifeLine, LifeLine2 [3], enables multiple patient comparisons and aggregation for analysis, but the visualization design limited its...and enables users to explore data using various visualization and analysis methods. Concept terms are derived from data mining and text-mining...clustering algorithms [30] were applied to normalize the lexical variants and duplications of the terms. Term correlations were computed using the

  5. Numerical simulation of strongly swirling turbulent flows through an abrupt expansion

    International Nuclear Information System (INIS)

    Paik, Joongcheol; Sotiropoulos, Fotis

    2010-01-01

    Turbulent swirling flow through an abrupt axisymmetric expansion is investigated numerically using detached-eddy simulation at Reynolds numbers of 3.0 x 10^4 and 1.0 x 10^5. The effects of swirl intensity on the coherent dynamics of the flow are systematically studied by carrying out numerical simulations over a range of swirl numbers from 0.17 to 1.23. Comparison of the computed solutions with the experimental measurements shows that the numerical simulations resolve both the axial and swirl mean velocity and turbulence intensity profiles with very good accuracy. Our simulations show that, along with moderate mesh refinement, a critical prerequisite for accurate predictions of the flow downstream of the expansion is the specification of inlet conditions at a plane sufficiently far upstream of the expansion, in order to avoid the spurious suppression of the low-frequency, large-scale precessing of the vortex core. Coherent structure visualizations with the q-criterion, friction lines, and Lagrangian particle tracking are used to elucidate the rich dynamics of the flow as a function of the swirl number, with emphasis on the onset of the spiral vortex breakdown, the onset and extent of the on-axis recirculation region, and the large-scale instabilities along the shear layers and the pipe wall.
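
    For reference, the q-criterion mentioned above identifies vortical regions where the rotation rate exceeds the strain rate, Q = 0.5(||Omega||^2 - ||S||^2), with S and Omega the symmetric and antisymmetric parts of the velocity gradient tensor. The sketch below evaluates it on a synthetic velocity field with NumPy and is not derived from the detached-eddy simulation data of the study.

      # Q-criterion on a uniform grid: Q = 0.5*(Omega_ij*Omega_ij - S_ij*S_ij),
      # evaluated here on a synthetic velocity field rather than DES data.
      import numpy as np

      def q_criterion(u, v, w, dx, dy, dz):
          # grads[i][j] = d(u_i)/d(x_j) on a uniform grid
          grads = [np.gradient(comp, dx, dy, dz) for comp in (u, v, w)]
          Q = np.zeros_like(u)
          for i in range(3):
              for j in range(3):
                  g_ij, g_ji = grads[i][j], grads[j][i]
                  S = 0.5 * (g_ij + g_ji)      # strain-rate part
                  O = 0.5 * (g_ij - g_ji)      # rotation part
                  Q += 0.5 * (O * O - S * S)
          return Q

      n = 32
      x = np.linspace(0, 2 * np.pi, n)
      X, Y, Z = np.meshgrid(x, x, x, indexing="ij")
      u, v, w = np.sin(X) * np.cos(Y), -np.cos(X) * np.sin(Y), np.zeros_like(X)
      Q = q_criterion(u, v, w, x[1] - x[0], x[1] - x[0], x[1] - x[0])
      # vortical regions are typically visualized as iso-surfaces of Q > 0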

  6. Experimental and numerical studies on free surface flow of windowless target

    International Nuclear Information System (INIS)

    Su, G.Y.; Gu, H.Y.; Cheng, X.

    2012-01-01

    Highlights: ► Experimental and CFD studies on free surface flow have been performed in a scaled windowless target. ► Flow structure inside spallation area can be divided into three typical zones. ► Under large Reynolds number, large scale vortex can be observed. ► CFD studies have been conducted by using both LES and RANS (k-ω SST) turbulence models. ► LES model provides better numerical prediction on free surface behavior and flow transient. - Abstract: The formation and control method of the coolant free surface is one of the key technologies for the design of windowless targets in the accelerator driven system (ADS). In the recent study, experimental and numerical investigations on the free surface flow have been performed in a scaled windowless target by using water as the model fluid. The planar laser induced fluorescence technique has been applied to visualize the free surface flow pattern inside the spallation area. Experiments have been carried out with the Reynolds number in the range of 30,000–50,000. The structure and features of flow vortex have been investigated. The experimental results show that the free surface is vulnerable to the vortex movement. In addition, CFD simulations have been performed under the experimental conditions, using LES and RANS (k-ω SST) turbulence models, respectively. The numerical results of LES model agree qualitatively well with the experimental data related to both flow pattern and free surface behavior.

  7. Numerical modeling of suspended sediment tansfers at the catchment scale with TELEMAC

    Science.gov (United States)

    Taccone, Florent; Antoine, Germain; Delestre, Olivier; Goutal, Nicole

    2017-04-01

    In mountainous regions, the filling of reservoirs is an important issue in terms of efficiency and environmental acceptability for producing hydro-electricity. Thus, the modelling of sediment transfers on highly erodible watersheds is a key challenge from both economic and scientific points of view. Sediment transfers at the watershed scale involve different local flow regimes, due to the complex topography of the field and the time and space variability of the meteorological conditions, as well as several physical processes, because of the heterogeneity of the soil composition and cover. A physically-based modelling approach, associated with a fine discretization of the domain, provides an explicit representation of the hydraulic and sedimentary variables, and gives river managers the opportunity to simulate the global effects of local solutions for decreasing erosion. On the other hand, this approach is time consuming, and needs both detailed data sets for validation and robust numerical schemes for simulating various hydraulic and sediment transport conditions. Since the erosion processes rely heavily on the flow characteristics, this paper focuses on a robust and accurate numerical resolution of the Shallow Water equations using TELEMAC 2D (www.opentelemac.org). One of the main difficulties is to have a numerical scheme able to represent the hydraulic transfers correctly, preserving the positivity of the water depths, dealing with the wet/dry interface, and being well-balanced. Few schemes verifying these properties exist, and their accuracy still needs to be evaluated in the case of rain-induced runoff on steep slopes. First, a straight channel test case with a variable slope (Kirstetter et al., 2015) is used to qualify the properties of several Finite Volume numerical schemes. For this test case, a steady rain applied on a dry domain has been performed experimentally in the laboratory, and this configuration gives an analytical solution of the Shallow
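
    To make the finite-volume ingredients discussed above concrete, the sketch below implements a standard HLL numerical flux for the one-dimensional shallow water equations; it is offered only as a generic example of the kind of building block whose positivity and wet/dry handling the study evaluates, and it is not one of the TELEMAC-2D schemes.

      # Standard HLL flux for the 1-D shallow water equations (generic example,
      # not a TELEMAC-2D scheme); the dam-break interface at the end is synthetic.
      import numpy as np

      G = 9.81  # gravity

      def swe_flux(h, hu):
          u = hu / h if h > 0 else 0.0
          return np.array([hu, hu * u + 0.5 * G * h * h])

      def hll_flux(hL, huL, hR, huR):
          uL = huL / hL if hL > 0 else 0.0
          uR = huR / hR if hR > 0 else 0.0
          cL, cR = np.sqrt(G * hL), np.sqrt(G * hR)
          sL = min(uL - cL, uR - cR)            # leftmost wave speed estimate
          sR = max(uL + cL, uR + cR)            # rightmost wave speed estimate
          FL, FR = swe_flux(hL, huL), swe_flux(hR, huR)
          if sL >= 0:
              return FL
          if sR <= 0:
              return FR
          UL, UR = np.array([hL, huL]), np.array([hR, huR])
          return (sR * FL - sL * FR + sL * sR * (UR - UL)) / (sR - sL)

      # still water of depth 1 m meeting a dry bed
      print(hll_flux(1.0, 0.0, 0.0, 0.0))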

  8. Multi-Scale Visualization Analysis of Bus Flow Average Travel Speed in Qingdao

    Science.gov (United States)

    Yong, HAN; Man, GAO; Xiao-Lei, ZHANG; Jie, LI; Ge, CHEN

    2016-11-01

    Public transportation is a kind of complex spatiotemporal behaviour. The traffic congestion and environmental pollution caused by the increase in private cars are becoming more and more serious in our city. Spatiotemporal data visualization is an effective tool for studying traffic, transforming non-visual data into recognizable images, which can reveal where and when congestion forms, develops, and disappears in space and time simultaneously. This paper develops a multi-scale visualization of average travel speed derived from floating bus data, to enable congestion on urban bus networks to be shown and analyzed. The techniques of the R language, Echarts, and WebGL are used to draw statistical pictures and a 3D wall map, which show the congestion in Qingdao from the perspectives of space and time. The results are as follows: (1) delays are more severe in the Shibei and Shinan areas than in the Licun and Laoshan areas; (2) high congestion usually occurs on Hong Kong Middle Road, Shandong Road, Nanjing Road, Liaoyang West Road and Taiping Road; (3) there is a similar pattern from Monday to Sunday, with congestion more severe in the morning and evening rush hours than in other hours; (4) on Monday morning the severity of congestion is higher than on Friday morning, and on Friday evening the severity is higher than on Monday evening. The research results will help to improve the public transportation of Qingdao.

  9. Reliability and validity of the visual analogue scale for disability in patients with chronic musculoskeletal pain

    NARCIS (Netherlands)

    Boonstra, Anne M.; Schiphorst Preuper, Henrica R.; Reneman, Michiel F.; Posthumus, Jitze B.; Stewart, Roy E.

    The objective of the study was to determine the reliability and concurrent validity of a visual analogue scale (VAS) for disability as a single-item instrument measuring disability in chronic pain patients. For the reliability study a test-retest design and for the validity study a cross-sectional

  10. Validation of the MASK-rhinitis visual analogue scale on smartphone screens to assess allergic rhinitis control

    NARCIS (Netherlands)

    Caimmi, D.; Baiz, N.; Tanno, L. K.; Demoly, P.; Arnavielhe, S.; Murray, R.; Bedbrook, A.; Bergmann, K. C.; de Vries, G.; Fokkens, W. J.; Fonseca, J.; Haahtela, T.; Keil, T.; Kuna, P.; Mullol, J.; Papadopoulos, N.; Passalacqua, G.; Samolinski, B.; Tomazic, P. V.; Valiulis, A.; van Eerd, M.; Wickman, M.; Annesi-Maesano, I.; Bousquet, J.; Agache, I.; Angles, R.; Anto, J. M.; Asayag, E.; Bacci, E.; Bachert, C.; Baroni, I.; Barreto, B. A.; Bedolla-Barajas, M.; Bertorello, L.; Bewick, M.; Bieber, T.; Birov, S.; Bindslev-Jensen, C.; Blua, A.; Bochenska Marciniak, M.; Bogus-Buczynska, I.; Bosnic-Ancevich, S.; Bosse, I.; Bourret, R.; Bucca, C.; Buonaiuto, R.; Caiazza, D.; Caillot, D.; Caimmi, D. P.; Camargos, P.; Canfora, G.; Cardona, V.; Carriazo, A. M.; Cartier, C.; Castellano, G.; Chavannes, N. H.; Ciaravolo, M. M.; Cingi, C.; Ciceran, A.; Colas, L.; Colgan, E.; Coll, J.; Conforti, D.; Correira de Sousa, J.; Cortés-Grimaldo, R. M.; Corti, F.; Costa, E.; Courbis, A. L.; Cruz, A.; Custovic, A.; Dario, C.; da Silva, M.; Dauvilliers, Y.; de Blay, F.; Dedeu, T.; de Feo, G.; de Martino, B.; Di Capua, S.; Di Carluccio, N.; Dray, G.; Dubakiene, R.; Eller, E.; Emuzyte, R.; Espinoza-Contreras, J. M.; Estrada-Cardona, A.; Farrell, J.; Ferrero, J.; Fontaine, J. F.; Forti, S.; Gálvez-Romero, J. L.; Garcia Cruz, M. H.; García-Cobas, C. I.; Gemicioğlu, B.; Gerth van Wijck, R.; Guidacci, M.; Gómez-Vera, J.; Guldemond, N. A.; Gutter, Z.; Hajjam, J.; Hellings, P.; Hernández-Velázquez, L.; Illario, M.; Ivancevich, J. C.; Jares, E.; Joos, G.; Just, J.; Kalayci, O.; Kalyoncu, A. F.; Karjalainen, J.; Khaltaev, N.; Klimek, L.; Kull, I.; Kuna, T. P.; Kvedariene, V.; Kolek, V.; Krzych-Fałta, E.; Kupczyk, M.; Lacwik, P.; Larenas-Linnemann, D.; Laune, D.; Lauri, D.; Lavrut, J.; Lessa, M.; Levato, G.; Lewis, L.; Lieten, I.; Lipiec, A.; Louis, R.; Luna-Pech, J. A.; Magnan, A.; Malva, J.; Maspero, J. F.; Mayora, O.; Medina-Ávalos, M. A.; Melen, E.; Menditto, E.; Millot-Keurinck, J.; Moda, G.; Morais-Almeida, M.; Mösges, R.; Mota-Pinto, A.; Muraro, A.; Noguès, M.; Nalin, M.; Napoli, L.; Neffen, H.; O'Hehir, R.; Olivé Elias, M.; Onorato, G.; Palkonen, S.; Pépin, J. L.; Pereira, A. M.; Persico, M.; Pfaar, O.; Pozzi, A. C.; Prokopakis, E. P.; Raciborski, F.; Rizzo, J. A.; Robalo-Cordeiro, C.; Rodríguez-González, M.; Rolla, G.; Roller-Wirnsberger, R. E.; Romano, A.; Romano, M.; Salimäki, J.; Serpa, F. S.; Shamai, S.; Sierra, M.; Sova, M.; Sorlini, M.; Stellato, C.; Stelmach, R.; Strandberg, T.; Stroetman, V.; Stukas, R.; Szylling, A.; Tibaldi, V.; Todo-Bom, A.; Toppila-Salmi, S.; Tomazic, P.; Trama, U.; Triggiani, M.; Valero, A.; Valovirta, E.; Vasankari, T.; Vatrella, A.; Ventura, M. T.; Verissimo, M. T.; Viart, F.; Williams, S.; Wagenmann, M.; Wanscher, C.; Westman, M.; Young, I.; Yorgancioglu, A.; Zernotti, E.; Zurbierber, T.; Zurkuhlen, A.; de Oliviera, B.; Senn, A.

    2017-01-01

    Background: The Visual Analogue Scale (VAS) is a validated tool to assess control in allergic rhinitis patients. Objective: The aim of this study was to validate the use of VAS in the MASK-rhinitis (MACVIA-ARIA Sentinel NetworK for allergic rhinitis) app (Allergy Diary) on smartphone screens to

  11. Contrast and strength of visual memory and imagery differentially affect visual perception.

    Science.gov (United States)

    Saad, Elyana; Silvanto, Juha

    2013-01-01

    Visual short-term memory (VSTM) and visual imagery have been shown to modulate visual perception. However, how the subjective experience of VSTM/imagery and its contrast modulate this process has not been investigated. We addressed this issue by asking participants to detect brief masked targets while they were engaged either in VSTM or visual imagery. Subjective experience of memory/imagery (strength scale), and the visual contrast of the memory/mental image (contrast scale) were assessed on a trial-by-trial basis. For both VSTM and imagery, contrast of the memory/mental image was positively associated with reporting target presence. Consequently, at the sensory level, both VSTM and imagery facilitated visual perception. However, subjective strength of VSTM was positively associated with visual detection whereas the opposite pattern was found for imagery. Thus the relationships between subjective strength of memory/imagery and visual detection are qualitatively different for VSTM and visual imagery, although their impact at the sensory level appears similar. Our results furthermore demonstrate that imagery and VSTM are partly dissociable processes.

  12. Contrast and strength of visual memory and imagery differentially affect visual perception.

    Directory of Open Access Journals (Sweden)

    Elyana Saad

    Full Text Available Visual short-term memory (VSTM) and visual imagery have been shown to modulate visual perception. However, how the subjective experience of VSTM/imagery and its contrast modulate this process has not been investigated. We addressed this issue by asking participants to detect brief masked targets while they were engaged either in VSTM or visual imagery. Subjective experience of memory/imagery (strength scale), and the visual contrast of the memory/mental image (contrast scale) were assessed on a trial-by-trial basis. For both VSTM and imagery, contrast of the memory/mental image was positively associated with reporting target presence. Consequently, at the sensory level, both VSTM and imagery facilitated visual perception. However, subjective strength of VSTM was positively associated with visual detection whereas the opposite pattern was found for imagery. Thus the relationships between subjective strength of memory/imagery and visual detection are qualitatively different for VSTM and visual imagery, although their impact at the sensory level appears similar. Our results furthermore demonstrate that imagery and VSTM are partly dissociable processes.

  13. Tracking and visualization of space-time activities for a micro-scale flu transmission study.

    Science.gov (United States)

    Qi, Feng; Du, Fei

    2013-02-07

    Infectious diseases pose increasing threats to public health with increasing population density and more and more sophisticated social networks. While efforts continue in studying the large scale dissemination of contagious diseases, individual-based activity and behaviour studies benefit not only disease transmission modelling but also control, containment, and prevention decision making at the local scale. The potential for using tracking technologies to capture detailed space-time trajectories and model individual behaviour is increasing rapidly, as technological advances enable the manufacture of small, lightweight, highly sensitive, and affordable receivers, and the routine use of location-aware devices has become widespread (e.g., smart cellular phones). The use of low-cost tracking devices in medical research has also been proven effective by more and more studies. This study describes the use of tracking devices to collect data on space-time trajectories and the spatiotemporal processing of such data to facilitate a micro-scale flu transmission study. We also report preliminary findings on activity patterns related to chances of influenza infection in a pilot study. Specifically, this study employed A-GPS tracking devices to collect data on a university campus. Spatiotemporal processing was conducted for data cleaning and segmentation. Processed data were validated against traditional activity diaries. The A-GPS data set was then used for visual explorations, including density surface visualization and connection analysis, to examine space-time activity patterns in relation to chances of influenza infection. When compared to diary data, the segmented tracking data proved to be an effective alternative and showed greater accuracy in time as well as in the details of routes taken by participants. A comparison of space-time activity patterns between participants who caught seasonal influenza and those who did not revealed interesting patterns. This study
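
    A deliberately simplified, hypothetical illustration of the kind of spatiotemporal segmentation described above is sketched below: a track of GPS fixes is split wherever the time gap between fixes exceeds a threshold, and low-speed segments are flagged as stays. The thresholds and data are invented and do not reproduce the study's processing pipeline.

      # Toy trajectory segmentation: split on large time gaps, label segments by
      # average speed. All values below are invented for the sketch.
      from dataclasses import dataclass
      from math import hypot

      @dataclass
      class Fix:
          t: float  # seconds
          x: float  # metres (projected coordinates)
          y: float

      def segment_track(fixes, max_gap_s=300.0, stay_speed_ms=0.5):
          segments, current = [], [fixes[0]]
          for prev, cur in zip(fixes, fixes[1:]):
              if cur.t - prev.t > max_gap_s:      # signal gap: start a new segment
                  segments.append(current)
                  current = []
              current.append(cur)
          segments.append(current)

          labelled = []
          for seg in segments:
              dist = sum(hypot(b.x - a.x, b.y - a.y) for a, b in zip(seg, seg[1:]))
              dt = seg[-1].t - seg[0].t
              speed = dist / dt if dt > 0 else 0.0
              labelled.append(("stay" if speed < stay_speed_ms else "move", seg))
          return labelled

      track = [Fix(0, 0, 0), Fix(60, 5, 0), Fix(120, 8, 2), Fix(1000, 500, 300)]
      for label, seg in segment_track(track):
          print(label, len(seg))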

  14. Feasibility and reliability of the modified berg balance scale in persons with severe intellectual and visual disabilities

    NARCIS (Netherlands)

    Waninge, Aly; van Wijck, R.; Steenbergen, B.; van der Schans, Cees

    2011-01-01

    Background: The purpose of this study was to determine the feasibility and reliability of the modified Berg Balance Scale (mBBS) in persons with severe intellectual and visual disabilities (severe multiple disabilities, SMD) assigned Gross Motor Function Classification System (GMFCS) grades I and

  15. Manipulations of attention dissociate fragile visual short-term memory from visual working memory

    NARCIS (Netherlands)

    Vandenbroucke, A.R.E.; Sligte, I.G.; Lamme, V.A.F.

    2011-01-01

    People often rely on information that is no longer in view, but maintained in visual short-term memory (VSTM). Traditionally, VSTM is thought to operate on either a short time-scale with high capacity - iconic memory - or a long time scale with small capacity - visual working memory. Recent research

  16. Numerical methods for multi-scale modeling of non-Newtonian flows

    Science.gov (United States)

    Symeonidis, Vasileios

    simulations assume bead-spring representations of polymer chains and investigate different integration schemes for the DPD equations and different intra-polymer force combinations. (1) A novel family of time-staggered integrators is presented, taking advantage of the time-scale disparity between polymer-solvent and solvent-solvent interactions. Convergence tests for relaxation parameters for the velocity-Verlet and Lowe's schemes are presented. (2) Wormlike chains simulating lambda-DNA molecules subject to constant shear are studied, and a direct comparison with Brownian Dynamics and experimental results is made. The effect of the number of beads per chain is examined through the extension autocorrelation function. (3) The Schmidt number (Sc) for each numerical scheme is investigated and the dependence on the scheme's parameters is shown. Re-visiting the wormlike chain problem under shear, we recover better agreement with the experimental data through proper adjustment of Sc.
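
    As a reference for the integrator family mentioned above, the sketch below performs plain velocity-Verlet steps for a bead-spring chain with harmonic bonds; the DPD dissipative and random forces (which are velocity dependent and motivate the modified schemes discussed in this record) are deliberately omitted, so this is not the thesis' integrator.

      # Plain velocity-Verlet for a harmonic bead-spring chain; DPD thermostat
      # forces are intentionally left out of this sketch.
      import numpy as np

      def spring_forces(pos, k=10.0, r0=1.0):
          """Harmonic forces between consecutive beads of one chain."""
          f = np.zeros_like(pos)
          bond = pos[1:] - pos[:-1]
          dist = np.linalg.norm(bond, axis=1, keepdims=True)
          fb = -k * (dist - r0) * bond / dist   # force on the right-hand bead
          f[1:] += fb
          f[:-1] -= fb
          return f

      def velocity_verlet(pos, vel, dt, mass=1.0, steps=100):
          f = spring_forces(pos)
          for _ in range(steps):
              vel += 0.5 * dt * f / mass        # half kick
              pos += dt * vel                   # drift
              f = spring_forces(pos)            # recompute forces
              vel += 0.5 * dt * f / mass        # half kick
          return pos, vel

      rng = np.random.default_rng(1)
      pos = np.cumsum(rng.normal(0.0, 0.1, (10, 3)) + [1.0, 0.0, 0.0], axis=0)
      vel = np.zeros_like(pos)
      pos, vel = velocity_verlet(pos, vel, dt=0.01)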

  17. Numerically modelling the large scale coronal magnetic field

    Science.gov (United States)

    Panja, Mayukh; Nandi, Dibyendu

    2016-07-01

    The solar corona spews out vast amounts of magnetized plasma into the heliosphere, which has a direct impact on the Earth's magnetosphere. Thus it is important that we develop an understanding of the dynamics of the solar corona. With our present technology it has not been possible to generate 3D magnetic maps of the solar corona; this warrants the use of numerical simulations to study the coronal magnetic field. A very popular method of doing this is to extrapolate the photospheric magnetic field using NLFF or PFSS codes. However, the extrapolations at different time intervals are completely independent of each other and do not capture the temporal evolution of magnetic fields. On the other hand, full MHD simulations of the global coronal field, apart from being computationally very expensive, would be physically less transparent, owing to the large number of free parameters that are typically used in such codes. This brings us to the magneto-frictional model, which is relatively simpler and computationally more economical. We have developed a magneto-frictional model in 3D spherical polar coordinates to study the large scale global coronal field. Here we present studies of changing connectivities between active regions, in response to photospheric motions.

  18. Application of the reduction of scale range in a Lorentz boosted frame to the numerical simulation of particle acceleration devices

    International Nuclear Information System (INIS)

    Vay, J.; Fawley, W.M.; Geddes, C.G.; Cormier-Michel, E.; Grote, D.P.

    2009-01-01

    It has been shown that the ratio of the longest to shortest space and time scales of a system of two or more components crossing at relativistic velocities is not invariant under Lorentz transformation. This implies the existence of a frame of reference minimizing an aggregate measure of the ratio of space and time scales. It was demonstrated that this translated into a reduction by orders of magnitude in computer simulation run times, using methods based on first principles (e.g., Particle-In-Cell), for particle acceleration devices and for problems such as: free electron lasers, laser-plasma accelerators, and particle beams interacting with electron clouds. Since then, speed-ups ranging from 75 to more than four orders of magnitude have been reported for the simulation of either scaled or reduced models of the above-cited problems. It was also shown that, to achieve the full benefits of the calculation in a boosted frame, some of the standard numerical techniques needed to be revised. The theory behind the speed-up of numerical simulation in a boosted frame, the latest developments of the numerical methods, and example applications with the new opportunities that they offer are all presented.

  19. Can a numerically stable subgrid-scale model for turbulent flow computation be ideally accurate?: a preliminary theoretical study for the Gaussian filtered Navier-Stokes equations.

    Science.gov (United States)

    Ida, Masato; Taniguchi, Nobuyuki

    2003-09-01

    This paper introduces a candidate for the origin of the numerical instabilities in large eddy simulation repeatedly observed in academic and practical industrial flow computations. Without resorting to any subgrid-scale modeling, but based on a simple assumption regarding the streamwise component of flow velocity, it is shown theoretically that in a channel-flow computation, the application of the Gaussian filtering to the incompressible Navier-Stokes equations yields a numerically unstable term, a cross-derivative term, which is similar to one appearing in the Gaussian filtered Vlasov equation derived by Klimas [J. Comput. Phys. 68, 202 (1987)] and also to one derived recently by Kobayashi and Shimomura [Phys. Fluids 15, L29 (2003)] from the tensor-diffusivity subgrid-scale term in a dynamic mixed model. The present result predicts that not only the numerical methods and the subgrid-scale models employed but also only the applied filtering process can be a seed of this numerical instability. An investigation concerning the relationship between the turbulent energy scattering and the unstable term shows that the instability of the term does not necessarily represent the backscatter of kinetic energy which has been considered a possible origin of numerical instabilities in large eddy simulation. The present findings raise the question whether a numerically stable subgrid-scale model can be ideally accurate.
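
    For reference, the standard Gaussian filter used in large eddy simulation and the well-known leading-order (tensor-diffusivity) expansion of the associated subgrid stress are written out below. These are textbook expressions rather than equations reproduced from the paper, but the cross-derivative structure discussed in this record is of this type.

      % Standard LES definitions (not taken from the paper): the 1-D Gaussian
      % filter of width \Delta and the leading-order subgrid-stress expansion.
      \[
        \bar{u}(x) = \int_{-\infty}^{\infty} G(x - x')\, u(x')\, \mathrm{d}x',
        \qquad
        G(r) = \left(\frac{6}{\pi\Delta^{2}}\right)^{1/2}
               \exp\!\left(-\frac{6 r^{2}}{\Delta^{2}}\right)
      \]
      \[
        \tau_{ij} = \overline{u_i u_j} - \bar{u}_i \bar{u}_j
        \approx \frac{\Delta^{2}}{12}\,
        \frac{\partial \bar{u}_i}{\partial x_k}\,
        \frac{\partial \bar{u}_j}{\partial x_k}
        + O(\Delta^{4})
      \]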

  20. Feasibility and reliability of the modified Berg Balance Scale in persons with severe intellectual and visual disabilities

    NARCIS (Netherlands)

    Waninge, A.; van Wijck, R.; Steenbergen, B.; van der Schans, C. P.

    Background The purpose of this study was to determine the feasibility and reliability of the modified Berg Balance Scale (mBBS) in persons with severe intellectual and visual disabilities (severe multiple disabilities, SMD) assigned Gross Motor Function Classification System (GMFCS) grades I and II.

  1. Numerical Simulation of Dispersion from Urban Greenhouse Gas Sources

    Science.gov (United States)

    Nottrott, Anders; Tan, Sze; He, Yonggang; Winkler, Renato

    2017-04-01

    Cities are characterized by complex topography, inhomogeneous turbulence, and variable pollutant source distributions. These features create a scale separation between local sources and urban scale emissions estimates known as the Grey-Zone. Modern computational fluid dynamics (CFD) techniques provide a quasi-deterministic, physically based toolset to bridge the scale separation gap between source level dynamics, local measurements, and urban scale emissions inventories. CFD has the capability to represent complex building topography and capture detailed 3D turbulence fields in the urban boundary layer. This presentation discusses the application of OpenFOAM to urban CFD simulations of natural gas leaks in cities. OpenFOAM is open-source software for advanced numerical simulation of engineering and environmental fluid flows. When combined with free or low-cost computer-aided drawing and GIS, OpenFOAM generates a detailed, 3D representation of urban wind fields. OpenFOAM was applied to model scalar emissions from various components of the natural gas distribution system, to study the impact of urban meteorology on mobile greenhouse gas measurements. The numerical experiments demonstrate that CH4 concentration profiles are highly sensitive to the relative location of emission sources and buildings. Sources separated by distances of 5-10 meters showed significant differences in vertical dispersion of plumes, due to building wake effects. The OpenFOAM flow fields were combined with an inverse, stochastic dispersion model to quantify and visualize the sensitivity of point sensors to upwind sources in various built environments. The Boussinesq approximation was applied to investigate the effects of canopy layer temperature gradients and convection on sensor footprints.

  2. A method for the assessment of the visual impact caused by the large-scale deployment of renewable-energy facilities

    International Nuclear Information System (INIS)

    Rodrigues, Marcos; Montanes, Carlos; Fueyo, Norberto

    2010-01-01

    The production of energy from renewable sources requires a significantly larger use of the territory compared with conventional (fossil and nuclear) sources. For large penetrations of renewable technologies, such as wind power, the overall visual impact at the national level can be substantial, and may prompt public reaction. This study develops a methodology for the assessment of the visual impact that can be used to measure and report the level of impact caused by several renewable technologies (wind farms, solar photovoltaic plants or solar thermal ones), both at the local and regional (e.g. national) scales. Applications are shown to several large-scale, hypothetical scenarios of wind and solar-energy penetration in Spain, and also to the vicinity of an actual, single wind farm.

  3. Perceived state of self during motion can differentially modulate numerical magnitude allocation.

    Science.gov (United States)

    Arshad, Q; Nigmatullina, Y; Roberts, R E; Goga, U; Pikovsky, M; Khan, S; Lobo, R; Flury, A-S; Pettorossi, V E; Cohen-Kadosh, R; Malhotra, P A; Bronstein, A M

    2016-09-01

    Although a direct relationship between numerical allocation and spatial attention has been proposed, recent research suggests that these processes are not directly coupled. In keeping with this, spatial attention shifts induced either via visual or vestibular motion can modulate numerical allocation in some circumstances but not in others. In addition to shifting spatial attention, visual or vestibular motion paradigms also (i) elicit compensatory eye movements, which themselves can influence numerical processing, and (ii) alter the perceptual state of 'self', inducing changes in bodily self-consciousness that impact upon cognitive mechanisms. Thus, the precise mechanism by which motion modulates numerical allocation remains unknown. We sought to investigate the influence that different perceptual experiences of motion have upon numerical magnitude allocation while controlling for both eye movements and task-related effects. We first used optokinetic visual motion stimulation (OKS) to elicit the perceptual experience of either 'visual world' or 'self'-motion, during which eye movements were identical. In a second experiment, we used a vestibular protocol examining the effects of perceived and subliminal angular rotations in darkness, which also provoked identical eye movements. We observed that during the perceptual experience of 'visual world' motion, rightward OKS biased judgments towards smaller numbers, whereas leftward OKS biased judgments towards larger numbers. During the perceptual experience of 'self-motion', judgments were biased towards larger numbers irrespective of the OKS direction. In contrast, vestibular motion perception was found not to modulate numerical magnitude allocation, nor was there any differential modulation when comparing 'perceived' vs. 'subliminal' rotations. We provide a novel demonstration that numerical magnitude allocation can be differentially modulated by the perceptual state of self during visual but not vestibular mediated motion

  4. Visual management of large scale data mining projects.

    Science.gov (United States)

    Shah, I; Hunter, L

    2000-01-01

    This paper describes a unified framework for visualizing the preparations for, and results of, hundreds of machine learning experiments. These experiments were designed to improve the accuracy of enzyme functional predictions from sequence, and in many cases were successful. Our system provides graphical user interfaces for defining and exploring training datasets and various representational alternatives, for inspecting the hypotheses induced by various types of learning algorithms, for visualizing the global results, and for inspecting in detail results for specific training sets (functions) and examples (proteins). The visualization tools serve as a navigational aid through a large amount of sequence data and induced knowledge. They provided significant help in understanding both the significance and the underlying biological explanations of our successes and failures. Using these visualizations it was possible to efficiently identify weaknesses of the modular sequence representations and induction algorithms which suggest better learning strategies. The context in which our data mining visualization toolkit was developed was the problem of accurately predicting enzyme function from protein sequence data. Previous work demonstrated that approximately 6% of enzyme protein sequences are likely to be assigned incorrect functions on the basis of sequence similarity alone. In order to test the hypothesis that more detailed sequence analysis using machine learning techniques and modular domain representations could address many of these failures, we designed a series of more than 250 experiments using information-theoretic decision tree induction and naive Bayesian learning on local sequence domain representations of problematic enzyme function classes. In more than half of these cases, our methods were able to perfectly discriminate among various possible functions of similar sequences. We developed and tested our visualization techniques on this application.
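
    A toy sketch of the two learner types named above (an information-theoretic, entropy-based decision tree and a naive Bayes classifier) is given below using scikit-learn on synthetic binary domain-presence features; the data and feature encoding are invented and do not reproduce the system's modular sequence representations.

      # Toy comparison of an entropy-based decision tree and naive Bayes on
      # synthetic binary "domain present/absent" vectors (invented data).
      import numpy as np
      from sklearn.model_selection import cross_val_score
      from sklearn.naive_bayes import BernoulliNB
      from sklearn.tree import DecisionTreeClassifier

      rng = np.random.default_rng(0)
      X = rng.integers(0, 2, size=(200, 30))          # 200 proteins x 30 domains
      X_bool = X.astype(bool)
      y = ((X_bool[:, 0] & ~X_bool[:, 3]) | X_bool[:, 7]).astype(int)  # synthetic label

      tree = DecisionTreeClassifier(criterion="entropy")   # information-theoretic splits
      bayes = BernoulliNB()

      for name, clf in [("decision tree", tree), ("naive Bayes", bayes)]:
          scores = cross_val_score(clf, X, y, cv=5)
          print(f"{name}: mean CV accuracy = {scores.mean():.2f}")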

  5. Numerical Methods for the Optimization of Nonlinear Residual-Based Subgrid-Scale Models Using the Variational Germano Identity

    NARCIS (Netherlands)

    Maher, G.D.; Hulshoff, S.J.

    2014-01-01

    The Variational Germano Identity [1, 2] is used to optimize the coefficients of residual-based subgrid-scale models that arise from the application of a Variational Multiscale Method [3, 4]. It is demonstrated that numerical iterative methods can be used to solve the Germano relations to obtain

  6. Perceived state of self during motion can differentially modulate numerical magnitude allocation.

    OpenAIRE

    Arshad, Q; Nigmatullina, Y; Roberts, RE; Goga, U; Pikovsky, M; Khan, S; Lobo, R; Flury, AS; Pettorossi, VE; Cohen-Kadosh, R; Malhotra, PA; Bronstein, AM

    2016-01-01

    Although a direct relationship between numerical-allocation and spatial-attention has been proposed, recent research suggests these processes are not directly coupled. In keeping with this, spatial attention shifts induced either via visual or vestibular motion can modulate numerical allocation in some circumstances but not in others. In addition to shifting spatial attention, visual or vestibular motion-paradigms also (i) elicit compensatory eye-movements which themselves can influence numer...

  7. Visual assessment of posterior atrophy: development of a MRI rating scale

    International Nuclear Information System (INIS)

    Koedam, Esther L.G.E.; Scheltens, Philip; Pijnenburg, Yolande A.L.; Lehmann, Manja; Fox, Nick; Flier, Wiesje M. van der; Barkhof, Frederik; Wattjes, Mike P.

    2011-01-01

    To develop a visual rating scale for posterior atrophy (PA) assessment and to analyse whether this scale aids in the discrimination between Alzheimer's disease (AD) and other dementias. Magnetic resonance images of 118 memory clinic patients were analysed for PA (range 0-3), medial temporal lobe atrophy (MTA) (range 0-4) and global cortical atrophy (range 0-3) by different raters. Weighted kappas were calculated for inter- and intra-rater agreement. Relationships of PA and MTA with the MMSE and age were estimated with linear regression analysis. Intra-rater agreement ranged between 0.93 and 0.95 and inter-rater agreement between 0.65 and 0.84. Mean PA scores were higher in AD than in controls (1.6 ± 0.9 versus 0.6 ± 0.7, p < 0.01) and in other dementias (0.8 ± 0.8, p < 0.01). In contrast to MTA, PA was not associated with age (B = 1.1 (0.8) versus B = 3.1 (0.7), p < 0.01). PA and MTA were independently negatively associated with the MMSE (B = -1.6 (0.5), p < 0.01 versus B = -1.4 (0.5), p < 0.01). This robust and reproducible scale for PA assessment conveys independent information in a clinical setting and may be useful in the discrimination of AD from other dementias. (orig.)
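
    The abstract reports weighted kappas for inter- and intra-rater agreement on ordinal atrophy scores. As a rough illustration of how such a statistic is computed (this is a generic sketch, not the authors' code; the rating vectors below are hypothetical), a quadratically weighted Cohen's kappa can be written as:

```python
import numpy as np

def weighted_kappa(rater_a, rater_b, n_categories, weighting="quadratic"):
    """Cohen's kappa with linear or quadratic disagreement weights."""
    observed = np.zeros((n_categories, n_categories))
    for a, b in zip(rater_a, rater_b):
        observed[a, b] += 1
    observed /= observed.sum()

    # Expected joint distribution under independence of the two raters.
    expected = np.outer(observed.sum(axis=1), observed.sum(axis=0))

    i, j = np.indices((n_categories, n_categories))
    if weighting == "quadratic":
        weights = ((i - j) / (n_categories - 1)) ** 2
    else:  # linear disagreement weights
        weights = np.abs(i - j) / (n_categories - 1)

    return 1.0 - (weights * observed).sum() / (weights * expected).sum()

# Hypothetical posterior-atrophy scores (0-3) from two raters.
rater_1 = [0, 1, 1, 2, 3, 2, 0, 1, 3, 2]
rater_2 = [0, 1, 2, 2, 3, 1, 0, 1, 3, 3]
print(round(weighted_kappa(rater_1, rater_2, n_categories=4), 2))
```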

  8. Visual assessment of posterior atrophy: development of a MRI rating scale

    Energy Technology Data Exchange (ETDEWEB)

    Koedam, Esther L.G.E.; Scheltens, Philip; Pijnenburg, Yolande A.L. [VU University Medical Centre, Department of Neurology and Alzheimer Centre, PO Box 7057, MB, Amsterdam (Netherlands); Lehmann, Manja; Fox, Nick [UCL Institute of Neurology, Dementia Research Centre, London (United Kingdom); Flier, Wiesje M. van der [VU University Medical Centre, Department of Neurology and Alzheimer Centre, PO Box 7057, MB, Amsterdam (Netherlands); VU University Medical Centre, Department Epidemiology and Biostatistics, PO Box 7057, MB, Amsterdam (Netherlands); Barkhof, Frederik; Wattjes, Mike P. [VU University Medical Centre, Department of Radiology, PO Box 7057, MB, Amsterdam (Netherlands)

    2011-12-15

    To develop a visual rating scale for posterior atrophy (PA) assessment and to analyse whether this scale aids in the discrimination between Alzheimer's disease (AD) and other dementias. Magnetic resonance images of 118 memory clinic patients were analysed for PA (range 0-3), medial temporal lobe atrophy (MTA) (range 0-4) and global cortical atrophy (range 0-3) by different raters. Weighted kappas were calculated for inter- and intra-rater agreement. Relationships of PA and MTA with the MMSE and age were estimated with linear regression analysis. Intra-rater agreement ranged between 0.93 and 0.95 and inter-rater agreement between 0.65 and 0.84. Mean PA scores were higher in AD than in controls (1.6 ± 0.9 versus 0.6 ± 0.7, p < 0.01) and in other dementias (0.8 ± 0.8, p < 0.01). In contrast to MTA, PA was not associated with age (B = 1.1 (0.8) versus B = 3.1 (0.7), p < 0.01). PA and MTA were independently negatively associated with the MMSE (B = -1.6 (0.5), p < 0.01 versus B = -1.4 (0.5), p < 0.01). This robust and reproducible scale for PA assessment conveys independent information in a clinical setting and may be useful in the discrimination of AD from other dementias. (orig.)

  9. Numerical simulation of small-scale mixing processes in the upper ocean and atmospheric boundary layer

    International Nuclear Information System (INIS)

    Druzhinin, O; Troitskaya, Yu; Zilitinkevich, S

    2016-01-01

    The processes of turbulent mixing and of momentum and heat exchange occur in the upper ocean at depths up to several dozens of meters, and in the atmospheric boundary layer at scales ranging from millimeters to dozens of meters, and cannot be resolved by current large-scale climate models. Thus small-scale processes need to be parameterized with respect to the large-scale fields. This parameterization involves the so-called bulk coefficients which relate turbulent fluxes to the gradients of the large-scale fields. The bulk coefficients depend on the properties of the small-scale mixing processes, which are affected by the upper-ocean stratification and the characteristics of surface and internal waves. These dependencies are not well understood at present and need to be clarified. We employ Direct Numerical Simulation (DNS) as a research tool which resolves all relevant flow scales and does not require the closure assumptions typical of Large-Eddy and Reynolds Averaged Navier-Stokes simulations (LES and RANS). Thus DNS provides a solid ground for correct parameterization of small-scale mixing processes and also can be used for improving LES and RANS closure models. In particular, we discuss the problems of the interaction between small-scale turbulence and internal gravity waves propagating in the pycnocline in the upper ocean, as well as the impact of surface waves on the properties of the atmospheric boundary layer over a wavy water surface. (paper)

  10. Representation of Numerical and Non-Numerical Order in Children

    Science.gov (United States)

    Berteletti, Ilaria; Lucangeli, Daniela; Zorzi, Marco

    2012-01-01

    The representation of numerical and non-numerical ordered sequences was investigated in children from preschool to grade 3. The child's conception of how sequence items map onto a spatial scale was tested using the Number-to-Position task (Siegler & Opfer, 2003) and new variants of the task designed to probe the representation of the alphabet…

  11. Museum activities in dementia care: Using visual analog scales to measure subjective wellbeing.

    Science.gov (United States)

    Johnson, Joana; Culverwell, Alison; Hulbert, Sabina; Robertson, Mitch; Camic, Paul M

    2017-07-01

    Introduction: Previous research has shown that people with dementia and caregivers derive wellbeing-related benefits from viewing art in a group, and that facilitated museum object handling is effective in increasing subjective wellbeing for people with a range of health conditions. The present study quantitatively compared the impact of two museum-based activities and a social activity on the subjective wellbeing of people with dementia and their caregivers. Methods: A quasi-experimental crossover design was used. People with early to middle stage dementia and caregivers (N = 66) participated in museum object handling, a refreshment break, and art viewing in small groups. Visual analog scales were used to rate subjective wellbeing before and after each activity. Results: Mixed-design analyses of variance indicated that wellbeing significantly increased during the session, irrespective of the order in which the activities were presented. Wellbeing significantly increased from object handling and art viewing for those with dementia and caregivers across pooled orders, but not during the social activity of a refreshment break. An end-of-intervention questionnaire indicated that experiences of the session were positive. Conclusion: Results provide a rationale for considering museum activities as part of a broader psychosocial, relational approach to dementia care and support the use of easy-to-administer visual analog scales as a quantitative outcome measure. Further partnership working is also supported between museums and healthcare professionals in the development of nonclinical, community-based programs for this population.

  12. Visual acuity: Measurements and notations (Acuidade visual: Medidas e notações)

    Directory of Open Access Journals (Sweden)

    Harley E. A. Bicas

    2002-06-01

    Full Text Available Evaluations of visual function are very intricate, since they depend on afferent, efferent and cognitive mechanisms, as well as on factors external to the examined subject, such as the type of stimulus and its presentation. Visual acuity testing is discussed in its formal aspects: definitions, quantifications (criteria for measuring an angle, size of optotypes), notations (decimal or fractional), scales (representing angular, linear or logarithmic relationships) and the units in which the values are expressed (reciprocal of minutes of arc, pure number, spatial frequency, decibels, octaves). As a consequence, numerical references to visual acuity and the operations performed on them (e.g., calculation of average values, determination of variations, relationships between them) may lead to very different, sometimes even opposite, interpretations of the same study, depending on the criteria employed.
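
    The abstract lists several interchangeable notations for the same acuity value. As a small illustration (standard optics relations, not taken from the article; the 30 cycles/degree equivalence for 1.0 decimal acuity is the usual approximation), conversions between a Snellen fraction, decimal acuity, logMAR and an approximate grating spatial frequency can be sketched as:

```python
import math

def snellen_to_decimal(numerator, denominator):
    """Snellen fraction (e.g. 20/40) expressed as decimal acuity."""
    return numerator / denominator

def decimal_to_logmar(decimal_acuity):
    """logMAR = log10 of the minimum angle of resolution in arcminutes."""
    mar_arcmin = 1.0 / decimal_acuity
    return math.log10(mar_arcmin)

def decimal_to_cycles_per_degree(decimal_acuity):
    """Approximate grating equivalent: 1.0 decimal acuity ~ 30 cycles/degree."""
    return 30.0 * decimal_acuity

acuity = snellen_to_decimal(20, 40)   # 0.5
print(acuity, round(decimal_to_logmar(acuity), 3), decimal_to_cycles_per_degree(acuity))
# 0.5  0.301  15.0
```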

  13. Large-scale network analysis of imagination reveals extended but limited top-down components in human visual cognition.

    Directory of Open Access Journals (Sweden)

    Verkhlyutov V.M.

    2014-12-01

    Full Text Available We investigated whole-brain functional magnetic resonance imaging (fMRI activation in a group of 21 healthy adult subjects during perception, imagination and remembering of two dynamic visual scenarios. Activation of the posterior parts of the cortex prevailed when watching videos. The cognitive tasks of imagination and remembering were accompanied by a predominant activity in the anterior parts of the cortex. An independent component analysis identified seven large-scale cortical networks with relatively invariant spatial distributions across all experimental conditions. The time course of their activation over experimental sessions was task-dependent. These detected networks can be interpreted as a recombination of resting state networks. Both central and peripheral networks were identified within the primary visual cortex. The central network around the caudal pole of BA17 and centers of other visual areas was activated only by direct visual stimulation, while the peripheral network responded to the presentation of visual information as well as to the cognitive tasks of imagination and remembering. The latter result explains the particular susceptibility of peripheral and twilight vision to cognitive top-down influences that often result in false-alarm detections.

  14. Validation of a Numerical Model for Dynamic Three-Dimensional Railway Bridge Analysis by Comparison with a Small-Scale Laboratory Model

    DEFF Research Database (Denmark)

    Bucinskas, Paulius; Sneideris, Jonas; Agapii, Liuba

    2018-01-01

    The aim of the paper is to analyse to what extent a small-scale experimental model can be applied in order to develop and validate a numerical model for dynamic analysis of a multi-span railway bridge interacting with the underlying soil. For this purpose a small-scale model of a bridge structure is

  15. A Numerical Simulation for a Deterministic Compartmental ...

    African Journals Online (AJOL)

    In this work, an earlier deterministic mathematical model of HIV/AIDS is revisited and numerical solutions obtained using Euler's numerical method. Using hypothetical values for the parameters, a program was written in the VISUAL BASIC programming language to generate series for the system of difference equations from the ...

  16. Color-Space-Based Visual-MIMO for V2X Communication.

    Science.gov (United States)

    Kim, Jai-Eun; Kim, Ji-Won; Park, Youngil; Kim, Ki-Doo

    2016-04-23

    In this paper, we analyze the applicability of color-space-based, color-independent visual-MIMO for V2X. We aim to achieve a visual-MIMO scheme that can maintain the original color and brightness while performing seamless communication. We consider two scenarios of GCM based visual-MIMO for V2X. One is a multipath transmission using visual-MIMO networking and the other is multi-node V2X communication. In the scenario of multipath transmission, we analyze the channel capacity numerically and we illustrate the significance of networking information such as distance, reference color (symbol), and multiplexing-diversity mode transitions. In addition, in the V2X scenario of multiple access, we may achieve the simultaneous multiple access communication without node interferences by dividing the communication area using image processing. Finally, through numerical simulation, we show the superior SER performance of the visual-MIMO scheme compared with LED-PD communication and show the numerical result of the GCM based visual-MIMO channel capacity versus distance.
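
    The paper evaluates the GCM-based channel capacity versus distance numerically; that specific channel model is not reproduced here. As generic background only, a Shannon-capacity-versus-distance curve under a simple free-space path-loss assumption (all parameter values hypothetical) looks like:

```python
import math

def capacity_bits_per_channel_use(distance_m, snr_at_1m_db=30.0, path_loss_exponent=2.0):
    """Shannon capacity log2(1 + SNR), with SNR decaying as distance^(-n)."""
    snr_linear = 10 ** (snr_at_1m_db / 10.0) * distance_m ** (-path_loss_exponent)
    return math.log2(1.0 + snr_linear)

for d in (1, 5, 10, 20, 50):
    print(d, round(capacity_bits_per_channel_use(d), 2))
```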

  17. Visualization data on the freezing process of micrometer-scaled aqueous citric acid drops

    Directory of Open Access Journals (Sweden)

    Anatoli Bogdan

    2017-02-01

    Full Text Available The visualization data (8 movies presented in this article are related to the research article entitled “Freezing and glass transitions upon cooling and warming and ice/freeze-concentration-solution morphology of emulsified aqueous citric acid” (A. Bogdan, M.J. Molina, H. Tenhu, 2016 [1]. The movies, recorded in situ with optical cryo-microscopy (OC-M, demonstrate for the first time the freezing processes that occur during the cooling and subsequent warming of emulsified micrometer-scaled aqueous citric acid (CA drops. The movies are made publicly available to enable critical or extended analyses.

  18. A comparison of a patient-rated visual analogue scale with the Liebowitz Social Anxiety Scale for social anxiety disorder: A cross-sectional study

    OpenAIRE

    興津, 裕美

    2014-01-01

    Doctor of Medicine (dissertation Otsu No. 2814). Authors: Hiromi Okitsu, Jitsuki Sawamura, Katsuji Nishimura, Yasuto Sato, Jun Ishigooka. Title: A comparison of a patient-rated visual analogue scale with the Liebowitz Social Anxiety Scale for social anxiety disorder: A cross-sectional study. Journal: Open Journal of Psychiatry (2161-7325), Vol. 4, No. 1, pp. 68-74 (2014). Copyright © 2014 by authors and Scientific Research Publishing Inc. DOI: 10.4236/ojpsych.2014.41010...

  19. Numerical analysis of a main crack interactions with micro-defects/inhomogeneities using two-scale generalized/extended finite element method

    Science.gov (United States)

    Malekan, Mohammad; Barros, Felício B.

    2017-12-01

    The generalized or extended finite element method (G/XFEM) models the crack by enriching partition-of-unity functions with discontinuous functions that represent well the physical behavior of the problem. However, such enrichment functions are not available for all problem types. Thus, one can use numerically built (global-local) enrichment functions to obtain a better approximation. This paper investigates the effects of micro-defects/inhomogeneities on the behavior of a main crack by modeling the micro-defects/inhomogeneities in the local problem using a two-scale G/XFEM. The global-local enrichment functions are influenced by the micro-defects/inhomogeneities in the local problem and thus change the approximate solution of the global problem containing the main crack. This approach is presented in detail by solving three different linear elastic fracture mechanics problems: two plane stress problems and a Reissner-Mindlin plate problem. The numerical results obtained with the two-scale G/XFEM are compared with reference solutions obtained analytically, with the standard G/XFEM method, with ABAQUS, and from the literature.

  20. Numerical simulations of a full-scale polymer electrolyte fuel cell with analysing systematic performance in an automotive application

    International Nuclear Information System (INIS)

    Park, Heesung

    2015-01-01

    Highlights: • The performance of a 3-D full-scale fuel cell is numerically simulated. • Power generated and consumed in the system is affected by the operating conditions. • Systematic analysis predicts the net power of a conceptual PEFC stack. - Abstract: In fuel cell powered electric vehicles, the net power efficiency is a critical factor in terms of fuel economy and commercialization. Although the fuel cell stack produces enough power to drive the vehicle, the power transferred to the power train can be significantly reduced by the power consumed to operate system components such as the air blower and the cooling module. Thus a systematic analysis of the operating conditions of the fuel cell stack is essential to predict the net power generation. In this paper a numerical simulation is conducted to characterize fuel cell performance under various operating conditions. A three-dimensional, full-scale fuel cell with an active area of 355 cm² is numerically modelled with 47.3 million grid cells to capture the complexities of the fluid dynamics, heat transfer and electrochemical reactions. The proposed numerical model requires large computational time and cost; however, it can reasonably predict fuel cell system performance at the early stage of conceptual design without requiring prototypes. Based on the model, it is shown that the net power is reduced to 90% of the gross power due to the power consumption of the air blower and the cooling module.
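
    The systemic quantity of interest here is the net power left after parasitic loads. A trivial bookkeeping sketch of that relation (the load values below are invented; the paper derives them from the 3-D simulation) is:

```python
def net_power_kw(gross_kw, blower_kw, cooling_kw):
    """Net electrical power after subtracting parasitic system loads."""
    return gross_kw - blower_kw - cooling_kw

gross = 100.0                 # hypothetical stack gross power [kW]
blower, cooling = 7.0, 3.0    # hypothetical parasitic loads [kW]
net = net_power_kw(gross, blower, cooling)
print(net, net / gross)       # 90.0 kW, i.e. 90% of gross, the order of magnitude quoted above
```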

  1. A numerical study of scale effects on performance of a tractor type podded propeller

    Directory of Open Access Journals (Sweden)

    Choi Jung-Kyu

    2014-06-01

    Full Text Available In this study, the scale effect on the performance of a podded propeller of tractor type is investigated. Turbulent flow computations are carried out for Reynolds numbers increasing progressively from model scale to full scale using CFD analysis. The result of the flow calculation for model-scale Reynolds numbers agrees well with that of an experiment in a large cavitation tunnel. The existing numerical analysis indicates that the performance of the podded propeller blades is influenced mainly by the advance coefficient and relatively little by the Reynolds number. However, the drag of the pod housing with the propeller in operation differs from that of the pod housing without the propeller, due to the acceleration and swirl of the propeller slipstream, which is altered by the propeller loading, as well as the pressure recovery and friction according to the Reynolds number. This suggests that the pod housing drag with the propeller in operation is the key factor in the scale effect on performance between model- and full-scale podded propellers. The so-called 'drag ratio', which is the ratio of pod housing drag to the total thrust of the podded propeller, increases as the advance coefficient increases due to the accelerated flow in the slipstream of the podded propeller. However, the rate of increase of the drag ratio reduces continuously as the Reynolds number increases from model to full scale. The contributions of the hydrodynamic forces acting on the parts of the pod housing, with the propeller operating in various loading conditions, to the thrust and torque of the total propeller unit are presented for a range of Reynolds numbers from model to full scale.

  2. Two scale damage model and related numerical issues for thermo-mechanical high cycle fatigue

    International Nuclear Information System (INIS)

    Desmorat, R.; Kane, A.; Seyedi, M.; Sermage, J.P.

    2007-01-01

    On the idea that fatigue damage is localized at the microscopic scale, a scale smaller than the mesoscopic scale of the Representative Volume Element (RVE), a three-dimensional two-scale damage model has been proposed for High Cycle Fatigue applications. It is extended here to aniso-thermal cases and then to thermo-mechanical fatigue. The modeling consists of the micro-mechanical analysis of a weak micro-inclusion subjected to plasticity and damage, embedded in an elastic meso-element (the RVE of continuum mechanics). The consideration of plasticity coupled with damage equations at the micro-scale, together with the Eshelby-Kroner localization law, allows the value of microscopic damage up to failure to be computed for any kind of loading, 1D or 3D, cyclic or random, isothermal or aniso-thermal, mechanical, thermal or thermo-mechanical. A robust numerical scheme is proposed in order to make the computations fast. A post-processor for damage and fatigue (DAMAGE-2005) has been developed. It applies to complex thermo-mechanical loadings. Examples are given of how the two-scale damage model represents physical phenomena related to High Cycle Fatigue, such as the mean stress effect and the non-linear accumulation of damage. Examples of thermal and thermo-mechanical fatigue, as well as complex applications to a full-size test structure subjected to thermo-mechanical fatigue, are detailed. (authors)

  3. When is best-worst best? A comparison of best-worst scaling, numeric estimation, and rating scales for collection of semantic norms.

    Science.gov (United States)

    Hollis, Geoff; Westbury, Chris

    2018-02-01

    Large-scale semantic norms have become both prevalent and influential in recent psycholinguistic research. However, little attention has been directed towards understanding the methodological best practices of such norm collection efforts. We compared the quality of semantic norms obtained through rating scales, numeric estimation, and a less commonly used judgment format called best-worst scaling. We found that best-worst scaling usually produces norms with higher predictive validities than other response formats, and does so requiring less data to be collected overall. We also found evidence that the various response formats may be producing qualitatively, rather than just quantitatively, different data. This raises the issue of potential response format bias, which has not been addressed by previous efforts to collect semantic norms, likely because of previous reliance on a single type of response format for a single type of semantic judgment. We have made available software for creating best-worst stimuli and scoring best-worst data. We also made available new norms for age of acquisition, valence, arousal, and concreteness collected using best-worst scaling. These norms include entries for 1,040 words, of which 1,034 are also contained in the ANEW norms (Bradley & Lang, Affective norms for English words (ANEW): Instruction manual and affective ratings (pp. 1-45). Technical report C-1, the center for research in psychophysiology, University of Florida, 1999).
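
    Best-worst scaling asks raters to pick the "best" and the "worst" item in each small subset of words. A common way to turn such judgments into a per-item score is to take, for each item, the number of times it was chosen best minus the number of times it was chosen worst, normalized by how often it appeared. The sketch below illustrates that count-based scoring only (it is not the authors' released software, and the trials are made up):

```python
from collections import defaultdict

def best_worst_scores(trials):
    """trials: iterable of (items_shown, best_item, worst_item) tuples."""
    best = defaultdict(int)
    worst = defaultdict(int)
    shown = defaultdict(int)
    for items, chosen_best, chosen_worst in trials:
        for item in items:
            shown[item] += 1
        best[chosen_best] += 1
        worst[chosen_worst] += 1
    # Score in [-1, 1]: +1 if always best, -1 if always worst.
    return {item: (best[item] - worst[item]) / shown[item] for item in shown}

# Hypothetical valence judgments over 4-word subsets.
trials = [
    (("party", "funeral", "table", "storm"), "party", "funeral"),
    (("table", "storm", "party", "sunset"), "sunset", "storm"),
]
print(best_worst_scores(trials))
```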

  4. Numerical Analysis on Color Preference and Visual Comfort from Eye Tracking Technique

    Directory of Open Access Journals (Sweden)

    Ming-Chung Ho

    2015-01-01

    Full Text Available Color preferences are very important in engineering, and there is a relationship between color preference and visual comfort. In this study, thirty university students participated in an experiment, supplemented by pre- and post-test questionnaires, that lasted about an hour. The main purpose of this study is to explore the visual effects of different color assignments, together with subjective color preferences, via eye-tracking technology. Nonlinear analysis of the eye-movement data detected slight differences in color preference and visual comfort, suggesting effective physiological indicators for future research, as discussed. The results show that average pupil size, as an eye-movement indicator, can effectively reflect differences in color preference and visual comfort. The study further confirmed that subjective impressions can lead people to misjudgments.

  5. Fine-scale features on bioreplicated decoys of the emerald ash borer provide necessary visual verisimilitude

    Science.gov (United States)

    Domingue, Michael J.; Pulsifer, Drew P.; Narkhede, Mahesh S.; Engel, Leland G.; Martín-Palma, Raúl J.; Kumar, Jayant; Baker, Thomas C.; Lakhtakia, Akhlesh

    2014-03-01

    The emerald ash borer (EAB), Agrilus planipennis, is an invasive tree-killing pest in North America. Like other buprestid beetles, it has an iridescent coloring, produced by a periodically layered cuticle whose reflectance peaks at 540 nm wavelength. The males perform a visually mediated ritualistic mating flight directly onto females poised on sunlit leaves. We attempted to evoke this behavior using artificial visual decoys of three types. To fabricate decoys of the first type, a polymer sheet coated with a Bragg-stack reflector was loosely stamped by a bioreplicating die. For decoys of the second type, a polymer sheet coated with a Bragg-stack reflector was heavily stamped by the same die and then painted green. Every decoy of these two types had an underlying black absorber layer. Decoys of the third type were produced by a rapid prototyping machine and painted green. Fine-scale features were absent on the third type. Experiments were performed in an American ash forest infested with EAB, and a European oak forest home to a similar pest, the two-spotted oak borer (TSOB), Agrilus biguttatus. When pinned to leaves, dead EAB females, dead TSOB females, and bioreplicated decoys of both types often evoked the complete ritualized flight behavior. Males also initiated approaches to the rapidly prototyped decoy, but would divert elsewhere without making contact. The attraction of the bioreplicated decoys was also demonstrated by providing a high dc voltage across the decoys that stunned and killed approaching beetles. Thus, true bioreplication with fine-scale features is necessary to fully evoke ritualized visual responses in insects, and provides an opportunity for developing insect-trapping technologies.

  6. A numerical comparison between the multiple-scales and finite-element solution for sound propagation in lined flow ducts

    NARCIS (Netherlands)

    Rienstra, S.W.; Eversman, W.

    2001-01-01

    An explicit, analytical, multiple-scales solution for modal sound transmission through slowly varying ducts with mean flow and acoustic lining is tested against a numerical finite-element solution solving the same potential flow equations. The test geometry taken is representative of a high-bypass

  7. Initial condition effects on large scale structure in numerical simulations of plane mixing layers

    Science.gov (United States)

    McMullan, W. A.; Garrett, S. J.

    2016-01-01

    In this paper, Large Eddy Simulations are performed on the spatially developing plane turbulent mixing layer. The simulated mixing layers originate from initially laminar conditions. The focus of this research is on the effect of the nature of the imposed fluctuations on the large-scale spanwise and streamwise structures in the flow. Two simulations are performed; one with low-level three-dimensional inflow fluctuations obtained from pseudo-random numbers, the other with physically correlated fluctuations of the same magnitude obtained from an inflow generation technique. Where white-noise fluctuations provide the inflow disturbances, no spatially stationary streamwise vortex structure is observed, and the large-scale spanwise turbulent vortical structures grow continuously and linearly. These structures are observed to have a three-dimensional internal geometry with branches and dislocations. Where physically correlated fluctuations provide the inflow disturbances, a "streaky" streamwise structure that is spatially stationary is observed, with the large-scale turbulent vortical structures growing with the square-root of time. These large-scale structures are quasi-two-dimensional, on top of which the secondary structure rides. The simulation results are discussed in the context of the varying interpretations of mixing layer growth that have been postulated. Recommendations are made concerning the data required from experiments in order to produce accurate numerical simulation recreations of real flows.

  8. Clutter-free Visualization of Large Point Symbols at Multiple Scales by Offset Quadtrees

    Directory of Open Access Journals (Sweden)

    ZHANG Xiang

    2016-08-01

    Full Text Available To address the cartographic problems in map mash-up applications in the Web 2.0 context, this paper studies a clutter-free technique for visualizing large symbols on Web maps. Basically, a quadtree is used to select one symbol in each grid cell at each zoom level. To resolve symbol overlaps between neighboring quad-grids, multiple offsets are applied to the quadtree and a voting strategy is used to compute the significance level of symbols for their selection at multiple scales. The method is able to resolve spatial conflicts without explicit conflict detection, thus enabling highly efficient processing. The resulting map also forms a visual hierarchy of semantic importance. We discuss issues such as relative importance, the symbol-to-grid size ratio, and effective offset schemes, and propose two extensions to make better use of the free space available on the map. Experiments were carried out to validate the technique, demonstrating its robustness and efficiency (a non-optimal implementation achieves sub-second processing for datasets on the order of 10^5 symbols).
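
    The selection idea described above can be illustrated at a single zoom level: symbols are binned into grid cells, only the most important symbol per cell survives, and repeating the selection for several offset copies of the grid yields a vote count that serves as a significance level. This is a much-simplified sketch (the paper's full method works across a quadtree of zoom levels; symbol names, weights and offsets here are hypothetical):

```python
from collections import defaultdict

def select_with_offsets(symbols, cell_size, offsets):
    """symbols: list of (x, y, weight, label). Returns label -> number of offset grids won."""
    votes = defaultdict(int)
    for ox, oy in offsets:
        best_in_cell = {}
        for x, y, weight, label in symbols:
            cell = (int((x + ox) // cell_size), int((y + oy) // cell_size))
            if cell not in best_in_cell or weight > best_in_cell[cell][0]:
                best_in_cell[cell] = (weight, label)
        for weight, label in best_in_cell.values():
            votes[label] += 1
    return dict(votes)

symbols = [(12, 8, 5, "A"), (14, 9, 3, "B"), (40, 35, 4, "C")]
offsets = [(0, 0), (8, 0), (0, 8), (8, 8)]   # half-cell offsets of a 16-unit grid
print(select_with_offsets(symbols, cell_size=16, offsets=offsets))
# e.g. {'A': 4, 'C': 4}: symbol B never wins its cell, so it would be dropped at this scale
```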

  9. Visual Aids Improve Diagnostic Inferences and Metacognitive Judgment Calibration

    Directory of Open Access Journals (Sweden)

    Rocio Garcia-Retamero

    2015-07-01

    Full Text Available Visual aids can improve comprehension of risks associated with medical treatments, screenings, and lifestyles. Do visual aids also help decision makers accurately assess their risk comprehension? That is, do visual aids help them become well calibrated? To address these questions, we investigated the benefits of visual aids displaying numerical information and measured the accuracy of self-assessment of diagnostic inferences (i.e., metacognitive judgment calibration), controlling for individual differences in numeracy. Participants included 108 patients who made diagnostic inferences about three medical tests on the basis of information about the sensitivity and false-positive rate of the tests and disease prevalence. Half of the patients received the information in numbers without a visual aid, while the other half received numbers along with a grid representing the numerical information. In the numerical condition, many patients, especially those with low numeracy, misinterpreted the predictive value of the tests and profoundly overestimated the accuracy of their inferences. Metacognitive judgment calibration mediated the relationship between numeracy and accuracy of diagnostic inferences. In contrast, in the visual aid condition, patients at all levels of numeracy showed high levels of inferential accuracy and metacognitive judgment calibration. The results indicate that accurate metacognitive assessment may explain the beneficial effects of visual aids and numeracy, a result that accords with theory suggesting that metacognition is an essential part of risk literacy. We conclude that well-designed risk communications can inform patients about health-relevant numerical information while helping them assess the quality of their own risk comprehension.
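
    The diagnostic inferences in this study combine a test's sensitivity, its false-positive rate, and the disease prevalence into a positive predictive value via Bayes' theorem. A worked example with hypothetical numbers (not the study's actual test parameters) shows why a purely numerical presentation is easy to misread:

```python
def positive_predictive_value(sensitivity, false_positive_rate, prevalence):
    """P(disease | positive test) from Bayes' theorem."""
    true_positives = sensitivity * prevalence
    false_positives = false_positive_rate * (1.0 - prevalence)
    return true_positives / (true_positives + false_positives)

# Hypothetical test: 90% sensitivity, 7% false-positive rate, 1% prevalence.
ppv = positive_predictive_value(0.90, 0.07, 0.01)
print(round(ppv, 3))   # ~0.115: only about 1 in 9 positive results actually indicates disease
```

    A grid visual aid of the kind used in the study essentially presents the same calculation as natural frequencies, e.g. roughly 9 true positives versus 69 false alarms in 1,000 people under these hypothetical parameters.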

  10. Numerical modelling of a bromide-polysulphide redox flow battery. Part 2: Evaluation of a utility-scale system

    International Nuclear Information System (INIS)

    Scamman, Daniel P.; Roberts, Edward P.L.; Reade, Gavin W.

    2009-01-01

    Numerical modelling of redox flow battery (RFB) systems allows the technical and commercial performance of different designs to be predicted without costly lab, pilot and full-scale testing. A numerical model of a redox flow battery was used in conjunction with a simple cost model incorporating capital and operating costs to predict the technical and commercial performance of a 120 MWh/15 MW utility-scale polysulphide-bromine (PSB) storage plant for arbitrage applications. Based on 2006 prices, the system was predicted to make a net loss of 0.45 p kWh⁻¹ at an optimum current density of 500 A m⁻² and an energy efficiency of 64%. The system was predicted to become economic for arbitrage (assuming no further costs were incurred) if the rate constants of both electrolytes could be increased to 10⁻⁵ m s⁻¹, for example by using a suitable (low cost) electrocatalyst. The economic viability was found to be strongly sensitive to the costs of the electrochemical cells and the electrical energy price differential. (author)

  11. Color-Space-Based Visual-MIMO for V2X Communication

    Directory of Open Access Journals (Sweden)

    Jai-Eun Kim

    2016-04-01

    Full Text Available In this paper, we analyze the applicability of color-space-based, color-independent visual-MIMO for V2X. We aim to achieve a visual-MIMO scheme that can maintain the original color and brightness while performing seamless communication. We consider two scenarios of GCM based visual-MIMO for V2X. One is a multipath transmission using visual-MIMO networking and the other is multi-node V2X communication. In the scenario of multipath transmission, we analyze the channel capacity numerically and we illustrate the significance of networking information such as distance, reference color (symbol), and multiplexing-diversity mode transitions. In addition, in the V2X scenario of multiple access, we may achieve the simultaneous multiple access communication without node interferences by dividing the communication area using image processing. Finally, through numerical simulation, we show the superior SER performance of the visual-MIMO scheme compared with LED-PD communication and show the numerical result of the GCM based visual-MIMO channel capacity versus distance.

  12. Large scale experiments as a tool for numerical model development

    DEFF Research Database (Denmark)

    Kirkegaard, Jens; Hansen, Erik Asp; Fuchs, Jesper

    2003-01-01

    Experimental modelling is an important tool for the study of hydrodynamic phenomena. The applicability of experiments can be expanded by the use of numerical models, and experiments are important for documenting the validity of numerical tools. In other cases numerical tools can be applied...

  13. A novel iris transillumination grading scale allowing flexible assessment with quantitative image analysis and visual matching.

    Science.gov (United States)

    Wang, Chen; Brancusi, Flavia; Valivullah, Zaheer M; Anderson, Michael G; Cunningham, Denise; Hedberg-Buenz, Adam; Power, Bradley; Simeonov, Dimitre; Gahl, William A; Zein, Wadih M; Adams, David R; Brooks, Brian

    2018-01-01

    To develop a sensitive scale of iris transillumination suitable for clinical and research use, with the capability of either quantitative analysis or visual matching of images. Iris transillumination photographic images were used from 70 study subjects with ocular or oculocutaneous albinism. Subjects represented a broad range of ocular pigmentation. A subset of images was subjected to image analysis and ranking by both expert and nonexpert reviewers. Quantitative ordering of images was compared with ordering by visual inspection. Images were binned to establish an 8-point scale. Ranking consistency was evaluated using the Kendall rank correlation coefficient (Kendall's tau). Visual ranking results were assessed using Kendall's coefficient of concordance (Kendall's W) analysis. There was a high degree of correlation among the image analysis, expert-based and non-expert-based image rankings. Pairwise comparisons of the quantitative ranking with each reviewer generated an average Kendall's tau of 0.83 ± 0.04 (SD). Inter-rater correlation was also high with Kendall's W of 0.96, 0.95, and 0.95 for nonexpert, expert, and all reviewers, respectively. The current standard for assessing iris transillumination is expert assessment of clinical exam findings. We adapted an image-analysis technique to generate quantitative transillumination values. Quantitative ranking was shown to be highly similar to a ranking produced by both expert and nonexpert reviewers. This finding suggests that the image characteristics used to quantify iris transillumination do not require expert interpretation. Inter-rater rankings were also highly similar, suggesting that varied methods of transillumination ranking are robust in terms of producing reproducible results.
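
    The agreement statistics used here are standard and easy to reproduce. As a sketch (the rankings below are invented, not the study's data), Kendall's tau for comparing the quantitative ordering with one reviewer, and Kendall's W for concordance across several reviewers, can be computed as:

```python
import numpy as np
from scipy.stats import kendalltau

def kendalls_w(ranks):
    """ranks: (n_raters, n_items) array of rankings 1..n_items per rater (no ties)."""
    ranks = np.asarray(ranks, dtype=float)
    m, n = ranks.shape
    rank_sums = ranks.sum(axis=0)
    s = ((rank_sums - rank_sums.mean()) ** 2).sum()
    return 12.0 * s / (m ** 2 * (n ** 3 - n))

quantitative = [1, 2, 3, 4, 5, 6, 7, 8]   # ordering from image analysis (hypothetical)
reviewer_a   = [1, 2, 4, 3, 5, 6, 8, 7]
reviewer_b   = [2, 1, 3, 4, 5, 7, 6, 8]

tau, _ = kendalltau(quantitative, reviewer_a)
print(round(tau, 2), round(kendalls_w([quantitative, reviewer_a, reviewer_b]), 2))
```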

  14. Reactions, accuracy and response complexity of numerical typing on touch screens.

    Science.gov (United States)

    Lin, Cheng-Jhe; Wu, Changxu

    2013-01-01

    Touch screens are popular nowadays, as seen on public kiosks, industrial control panels and personal mobile devices. Numerical typing is one task frequently performed on touch screens, but on touch screens this task is subject to human error and slow responses. This study aims to identify innate differences between touch screens and standard physical keypads in the context of numerical typing, by eliminating confounding issues. The effects of precise visual feedback and of the urgency of numerical typing were also investigated. The results showed that touch screens were as accurate as physical keyboards, but reactions were indeed executed more slowly on touch screens, as signified by both pre-motor reaction time and reaction time. Provision of precise visual feedback caused more errors, and no interaction between device and urgency was found for reaction time. To improve the usability of touch screens, designers should focus more on reducing response complexity and be cautious about the use of visual feedback. The study revealed that slower responses on touch screens involve more complex human cognition in formulating motor responses. Attention should be given to designing precise visual feedback appropriately so that distractions or competition for visual resources can be avoided, improving human performance on touch screens.

  15. Visualization Design Environment

    Energy Technology Data Exchange (ETDEWEB)

    Pomplun, A.R.; Templet, G.J.; Jortner, J.N.; Friesen, J.A.; Schwegel, J.; Hughes, K.R.

    1999-02-01

    Improvements in the performance and capabilities of computer software and hardware systems, combined with advances in Internet technologies, have spurred innovative developments in the area of modeling, simulation and visualization. These developments combine to make it possible to create an environment where engineers can design, prototype, analyze, and visualize components in virtual space, saving the time and expense incurred during numerous design and prototyping iterations. The Visualization Design Centers located at Sandia National Laboratories are facilities built specifically to promote the "design by team" concept. This report focuses on designing, developing and deploying this environment by detailing the design of the facility, software infrastructure and hardware systems that comprise this new visualization design environment, and describes case studies that document successful application of this environment.

  16. NUMERICAL SIMULATION OF SHOCK WAVE REFRACTION ON INCLINED CONTACT DISCONTINUITY

    Directory of Open Access Journals (Sweden)

    P. V. Bulat

    2016-05-01

    Full Text Available We consider numerical simulation of shock wave refraction on a plane contact discontinuity separating two gases of different density. Discretization of the Euler equations is based on the finite volume method and WENO finite difference schemes, implemented on unstructured meshes. Integration over time is performed with the use of a third-order Runge–Kutta stepping procedure. A procedure for the identification and classification of gas dynamic discontinuities, based on conditions of dynamic consistency and image processing methods, is applied to visualize and interpret the results of the numerical calculations. The flow structure and its quantitative characteristics are determined. The results of numerical and experimental visualization (shadowgraphs, schlieren images, and interferograms) are compared.
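
    The third-order Runge–Kutta stepping mentioned above is commonly the strong-stability-preserving (SSP) variant, though the abstract does not specify which one is used. A minimal sketch of that time integrator applied to a semi-discretized conservation law du/dt = L(u) is shown below; here L is a plain first-order upwind discretization of linear advection standing in for the WENO fluxes of the paper, and the grid and parameters are hypothetical:

```python
import numpy as np

def upwind_rhs(u, dx, a=1.0):
    """First-order upwind spatial operator for u_t + a u_x = 0 (a > 0), periodic domain."""
    return -a * (u - np.roll(u, 1)) / dx

def ssp_rk3_step(u, dt, rhs):
    """One third-order strong-stability-preserving Runge-Kutta step."""
    u1 = u + dt * rhs(u)
    u2 = 0.75 * u + 0.25 * (u1 + dt * rhs(u1))
    return u / 3.0 + 2.0 / 3.0 * (u2 + dt * rhs(u2))

n = 200
dx = 1.0 / n
x = np.arange(n) * dx
u = np.exp(-200.0 * (x - 0.5) ** 2)   # smooth initial pulse
dt = 0.4 * dx                         # CFL-limited time step
for _ in range(100):
    u = ssp_rk3_step(u, dt, lambda v: upwind_rhs(v, dx))
```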

  17. Exclusively visual analysis of classroom group interactions

    Science.gov (United States)

    Tucker, Laura; Scherr, Rachel E.; Zickler, Todd; Mazur, Eric

    2016-12-01

    Large-scale audiovisual data that measure group learning are time consuming to collect and analyze. As an initial step towards scaling qualitative classroom observation, we qualitatively coded classroom video using an established coding scheme with and without its audio cues. We find that interrater reliability is as high when using visual data only—without audio—as when using both visual and audio data to code. Also, interrater reliability is high when comparing use of visual and audio data to visual-only data. We see a small bias to code interactions as group discussion when visual and audio data are used compared with video-only data. This work establishes that meaningful educational observation can be made through visual information alone. Further, it suggests that after initial work to create a coding scheme and validate it in each environment, computer-automated visual coding could drastically increase the breadth of qualitative studies and allow for meaningful educational analysis on a far greater scale.

  18. Exclusively visual analysis of classroom group interactions

    Directory of Open Access Journals (Sweden)

    Laura Tucker

    2016-11-01

    Full Text Available Large-scale audiovisual data that measure group learning are time consuming to collect and analyze. As an initial step towards scaling qualitative classroom observation, we qualitatively coded classroom video using an established coding scheme with and without its audio cues. We find that interrater reliability is as high when using visual data only—without audio—as when using both visual and audio data to code. Also, interrater reliability is high when comparing use of visual and audio data to visual-only data. We see a small bias to code interactions as group discussion when visual and audio data are used compared with video-only data. This work establishes that meaningful educational observation can be made through visual information alone. Further, it suggests that after initial work to create a coding scheme and validate it in each environment, computer-automated visual coding could drastically increase the breadth of qualitative studies and allow for meaningful educational analysis on a far greater scale.

  19. The contributions of numerical acuity and non-numerical stimulus features to the development of the number sense and symbolic math achievement.

    Science.gov (United States)

    Starr, Ariel; DeWind, Nicholas K; Brannon, Elizabeth M

    2017-11-01

    Numerical acuity, frequently measured by a Weber fraction derived from nonsymbolic numerical comparison judgments, has been shown to be predictive of mathematical ability. However, recent findings suggest that stimulus controls in these tasks are often insufficiently implemented, and the proposal has been made that alternative visual features or inhibitory control capacities may actually explain this relation. Here, we use a novel mathematical algorithm to parse the relative influence of numerosity from other visual features in nonsymbolic numerical discrimination and to examine the strength of the relations between each of these variables, including inhibitory control, and mathematical ability. We examined these questions developmentally by testing 4-year-old children, 6-year-old children, and adults with a nonsymbolic numerical comparison task, a symbolic math assessment, and a test of inhibitory control. We found that the influence of non-numerical features decreased significantly over development but that numerosity was a primary determinant of decision making at all ages. In addition, numerical acuity was a stronger predictor of math achievement than either non-numerical bias or inhibitory control in children. These results suggest that the ability to selectively attend to number contributes to the maturation of the number sense and that numerical acuity, independent of inhibitory control, contributes to math achievement in early childhood. Copyright © 2017 Elsevier B.V. All rights reserved.
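
    A Weber fraction is usually estimated by fitting a psychophysical model to the comparison data. One commonly used form in the approximate-number-system literature models the probability of a correct "which is larger" response as a function of the two numerosities and the Weber fraction w; the sketch below assumes that form (it is not necessarily the algorithm used in this paper, and the trial data are fabricated):

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.special import erfc

def p_correct(n1, n2, w):
    """Probability of correctly judging which numerosity is larger, given Weber fraction w."""
    return 1.0 - 0.5 * erfc(np.abs(n1 - n2) / (np.sqrt(2.0) * w * np.sqrt(n1**2 + n2**2)))

def fit_weber_fraction(n1, n2, correct):
    """Maximum-likelihood estimate of w from binary correct/incorrect responses."""
    def negative_log_likelihood(w):
        p = np.clip(p_correct(n1, n2, w), 1e-9, 1 - 1e-9)
        return -np.sum(correct * np.log(p) + (1 - correct) * np.log(1 - p))
    return minimize_scalar(negative_log_likelihood, bounds=(0.01, 2.0), method="bounded").x

# Fabricated trials: numerosity pairs and whether the response was correct.
n1 = np.array([8, 10, 12, 9, 16, 20, 7, 14])
n2 = np.array([10, 8, 9, 12, 12, 16, 9, 10])
correct = np.array([1, 1, 0, 1, 1, 1, 0, 1])
print(round(fit_weber_fraction(n1, n2, correct), 2))
```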

  20. Efficient numerical methods for the large-scale, parallel solution of elastoplastic contact problems

    KAUST Repository

    Frohne, Jörg; Heister, Timo; Bangerth, Wolfgang

    2015-01-01

    © 2016 John Wiley & Sons, Ltd. Quasi-static elastoplastic contact problems are ubiquitous in many industrial processes and other contexts, and their numerical simulation is consequently of great interest in accurately describing and optimizing production processes. The key component in these simulations is the solution of a single load step of a time iteration. From a mathematical perspective, the problems to be solved in each time step are characterized by the difficulties of variational inequalities for both the plastic behavior and the contact problem. Computationally, they also often lead to very large problems. In this paper, we present and evaluate a complete set of methods that are (1) designed to work well together and (2) allow for the efficient solution of such problems. In particular, we use adaptive finite element meshes with linear and quadratic elements, a Newton linearization of the plasticity, active set methods for the contact problem, and multigrid-preconditioned linear solvers. Through a sequence of numerical experiments, we show the performance of these methods. This includes highly accurate solutions of a three-dimensional benchmark problem and scaling our methods in parallel to 1024 cores and more than a billion unknowns.

  1. Efficient numerical methods for the large-scale, parallel solution of elastoplastic contact problems

    KAUST Repository

    Frohne, Jörg

    2015-08-06

    © 2016 John Wiley & Sons, Ltd. Quasi-static elastoplastic contact problems are ubiquitous in many industrial processes and other contexts, and their numerical simulation is consequently of great interest in accurately describing and optimizing production processes. The key component in these simulations is the solution of a single load step of a time iteration. From a mathematical perspective, the problems to be solved in each time step are characterized by the difficulties of variational inequalities for both the plastic behavior and the contact problem. Computationally, they also often lead to very large problems. In this paper, we present and evaluate a complete set of methods that are (1) designed to work well together and (2) allow for the efficient solution of such problems. In particular, we use adaptive finite element meshes with linear and quadratic elements, a Newton linearization of the plasticity, active set methods for the contact problem, and multigrid-preconditioned linear solvers. Through a sequence of numerical experiments, we show the performance of these methods. This includes highly accurate solutions of a three-dimensional benchmark problem and scaling our methods in parallel to 1024 cores and more than a billion unknowns.
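
    The paper combines active set methods with multigrid-preconditioned solvers; reproducing that machinery is well beyond an abstract. As a toy stand-in for how a contact constraint enters the solve, a projected Gauss-Seidel iteration for a one-dimensional obstacle problem (minimize ½uᵀAu - fᵀu subject to u ≤ g, with A the usual SPD finite-difference Laplacian; all data hypothetical, and this is a far simpler method than the one in the paper) looks like:

```python
import numpy as np

def projected_gauss_seidel(A, f, g, iterations=1000):
    """Solve A u = f subject to u <= g by Gauss-Seidel sweeps projected onto the constraint."""
    u = np.zeros_like(f)
    for _ in range(iterations):
        for i in range(len(f)):
            residual = f[i] - A[i].dot(u) + A[i, i] * u[i]   # f_i - sum_{j != i} A_ij u_j
            u[i] = min(g[i], residual / A[i, i])
    return u

n = 50
h = 1.0 / (n + 1)
main = 2.0 * np.ones(n) / h**2
off = -1.0 * np.ones(n - 1) / h**2
A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)   # 1-D Laplacian, Dirichlet BCs
f = np.full(n, 10.0)     # upward load pressing the membrane against the obstacle
g = np.full(n, 0.05)     # obstacle: the solution may not exceed this height
u = projected_gauss_seidel(A, f, g)
print(round(u.max(), 3))  # ~0.05: the constraint is active over the middle of the domain
```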

  2. Coupled numerical modeling of gas hydrates bearing sediments from laboratory to field-scale conditions

    Science.gov (United States)

    Sanchez, M. J.; Santamarina, C.; Gai, X., Sr.; Teymouri, M., Sr.

    2017-12-01

    Stability and behavior of Hydrate Bearing Sediments (HBS) are governed by the metastable character of the gas hydrate structure, which strongly depends on thermo-hydro-chemo-mechanical (THCM) actions. Hydrate formation, dissociation and methane production from hydrate bearing sediments are coupled THCM processes that involve, amongst others, exothermic formation and endothermic dissociation of hydrate and ice phases, mixed fluid flow and large changes in fluid pressure. The analysis of available data from past field and laboratory experiments, and the optimization of future field production studies, require a formal and robust numerical framework able to capture the very complex behavior of this type of soil. A comprehensive fully coupled THCM formulation has been developed and implemented into a finite element code to tackle problems involving gas hydrate sediments. Special attention is paid to the geomechanical behavior of HBS, and particularly to their response upon hydrate dissociation under loading. The numerical framework has been validated against recent experiments conducted under controlled conditions in the laboratory that challenge the proposed approach and highlight the complex interaction among THCM processes in HBS. The performance of the models in these case studies is highly satisfactory. Finally, the numerical code is applied to analyze the behavior of gas hydrate soils under field-scale conditions, exploring different features of material behavior under possible reservoir conditions.

  3. Numerical aspects of drift kinetic turbulence: Ill-posedness, regularization and a priori estimates of sub-grid-scale terms

    KAUST Repository

    Samtaney, Ravi

    2012-01-01

    We present a numerical method based on an Eulerian approach to solve the Vlasov-Poisson system for 4D drift kinetic turbulence. Our numerical approach uses a conservative formulation with high-order (fourth and higher) evaluation of the numerical fluxes coupled with a fourth-order accurate Poisson solver. The fluxes are computed using a low-dissipation high-order upwind differencing method or a tuned high-resolution finite difference method with no numerical dissipation. Numerical results are presented for the case of imposed ion temperature and density gradients. Different forms of controlled regularization to achieve a well-posed system are used to obtain convergent resolved simulations. The regularization of the equations is achieved by means of a simple collisional model, by inclusion of an ad-hoc hyperviscosity or artificial viscosity term or by implicit dissipation in upwind schemes. Comparisons between the various methods and regularizations are presented. We apply a filtering formalism to the Vlasov equation and derive sub-grid-scale (SGS) terms analogous to the Reynolds stress terms in hydrodynamic turbulence. We present a priori quantifications of these SGS terms in resolved simulations of drift-kinetic turbulence by applying a sharp filter. © 2012 IOP Publishing Ltd.

  4. Numerical aspects of drift kinetic turbulence: ill-posedness, regularization and a priori estimates of sub-grid-scale terms

    International Nuclear Information System (INIS)

    Samtaney, Ravi

    2012-01-01

    We present a numerical method based on an Eulerian approach to solve the Vlasov-Poisson system for 4D drift kinetic turbulence. Our numerical approach uses a conservative formulation with high-order (fourth and higher) evaluation of the numerical fluxes coupled with a fourth-order accurate Poisson solver. The fluxes are computed using a low-dissipation high-order upwind differencing method or a tuned high-resolution finite difference method with no numerical dissipation. Numerical results are presented for the case of imposed ion temperature and density gradients. Different forms of controlled regularization to achieve a well-posed system are used to obtain convergent resolved simulations. The regularization of the equations is achieved by means of a simple collisional model, by inclusion of an ad-hoc hyperviscosity or artificial viscosity term or by implicit dissipation in upwind schemes. Comparisons between the various methods and regularizations are presented. We apply a filtering formalism to the Vlasov equation and derive sub-grid-scale (SGS) terms analogous to the Reynolds stress terms in hydrodynamic turbulence. We present a priori quantifications of these SGS terms in resolved simulations of drift-kinetic turbulence by applying a sharp filter.
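
    The a priori analysis described above amounts to applying a sharp low-pass filter to resolved fields and measuring the part of a nonlinear term the filter cannot see. For a generic 1-D periodic field and a quadratic nonlinearity (a toy analogue only, not the drift-kinetic equations themselves; the field and cutoff below are hypothetical), the sub-grid-scale residual can be sketched as:

```python
import numpy as np

def sharp_filter(u, k_cut):
    """Sharp spectral low-pass filter: zero all Fourier modes above k_cut."""
    u_hat = np.fft.rfft(u)
    u_hat[k_cut + 1:] = 0.0
    return np.fft.irfft(u_hat, n=len(u))

def sgs_residual(u, k_cut):
    """SGS term for the quadratic nonlinearity u*u: filter(u*u) - filter(u)*filter(u)."""
    return sharp_filter(u * u, k_cut) - sharp_filter(u, k_cut) ** 2

n = 256
x = 2.0 * np.pi * np.arange(n) / n
u = np.sin(x) + 0.3 * np.sin(17.0 * x) + 0.1 * np.sin(53.0 * x)   # multi-scale test field
tau = sgs_residual(u, k_cut=8)
print(np.abs(tau).max())   # nonzero: the filtered product retains unresolved interactions
```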

  5. Predictions of the spontaneous symmetry-breaking theory for visual code completeness and spatial scaling in single-cell learning rules.

    Science.gov (United States)

    Webber, C J

    2001-05-01

    This article shows analytically that single-cell learning rules that give rise to oriented and localized receptive fields, when their synaptic weights are randomly and independently initialized according to a plausible assumption of zero prior information, will generate visual codes that are invariant under two-dimensional translations, rotations, and scale magnifications, provided that the statistics of their training images are sufficiently invariant under these transformations. Such codes span different image locations, orientations, and size scales with equal economy. Thus, single-cell rules could account for the spatial scaling property of the cortical simple-cell code. This prediction is tested computationally by training with natural scenes; it is demonstrated that a single-cell learning rule can give rise to simple-cell receptive fields spanning the full range of orientations, image locations, and spatial frequencies (except at the extreme high and low frequencies at which the scale invariance of the statistics of digitally sampled images must ultimately break down, because of the image boundary and the finite pixel resolution). Thus, no constraint on completeness, or any other coupling between cells, is necessary to induce the visual code to span wide ranges of locations, orientations, and size scales. This prediction is made using the theory of spontaneous symmetry breaking, which we have previously shown can also explain the data-driven self-organization of a wide variety of transformation invariances in neurons' responses, such as the translation invariance of complex cell response.

  6. Advanced Dynamics Analytical and Numerical Calculations with MATLAB

    CERN Document Server

    Marghitu, Dan B

    2012-01-01

    Advanced Dynamics: Analytical and Numerical Calculations with MATLAB provides a thorough, rigorous presentation of kinematics and dynamics while using MATLAB as an integrated tool to solve problems. Topics presented are explained thoroughly and directly, allowing fundamental principles to emerge through applications from areas such as multibody systems, robotics, spacecraft and design of complex mechanical devices. This book differs from others in that it uses symbolic MATLAB for both theory and applications. Special attention is given to solutions that are solved analytically and numerically using MATLAB. The illustrations and figures generated with MATLAB reinforce visual learning while an abundance of examples offer additional support. This book also: Provides solutions analytically and numerically using MATLAB Illustrations and graphs generated with MATLAB reinforce visual learning for students as they study Covers modern technical advancements in areas like multibody systems, robotics, spacecraft and des...

  7. Visualization experimental investigation on long stripe coherent structure in small-scale rectangular channel

    International Nuclear Information System (INIS)

    Su Jiqiang; Sun Zhongning; Fan Guangming; Wang Shiming

    2013-01-01

    The long stripe coherent structure of the turbulent boundary layer in a small-scale vertical rectangular channel was observed by using the hydrogen bubble flow-trace visualization technique. The statistical properties of the long stripe in the experimental channel boundary layer were compared with those in the smooth flat-plate boundary layer. The pitch characteristics were explained by the formation mechanism of the long stripe. The effect of changes in y+ on the distribution of the long stripe was also analyzed. In addition, the frequency characteristics of the long stripe were investigated, and the correlation of the long stripe frequency in such a flow channel was obtained. (authors)

  8. Visualization environment of the large-scale data of JAEA's supercomputer system

    Energy Technology Data Exchange (ETDEWEB)

    Sakamoto, Kensaku [Japan Atomic Energy Agency, Center for Computational Science and e-Systems, Tokai, Ibaraki (Japan); Hoshi, Yoshiyuki [Research Organization for Information Science and Technology (RIST), Tokai, Ibaraki (Japan)

    2013-11-15

    In research and development across various fields of nuclear energy, visualization of calculated data is especially useful for understanding simulation results in an intuitive way. Many researchers who run simulations on the supercomputer at the Japan Atomic Energy Agency (JAEA) typically transfer calculated data files from the supercomputer to their local PCs for visualization. In recent years, as the size of calculated data has grown with improvements in supercomputer performance, both a reduction of visualization processing time and efficient use of the JAEA network are required. As a solution, we introduced a remote visualization system which can utilize parallel processors on the supercomputer and reduce the usage of network resources by transferring only intermediate visualization data. This paper reports a study on the performance of image processing with the remote visualization system. The visualization processing time is measured and the influence of network speed is evaluated by varying the drawing mode, the size of visualization data and the number of processors. Based on this study, a guideline for using the remote visualization system is provided to show how the system can be used effectively. An upgrade policy for the next system is also shown. (author)

  9. Color-Space-Based Visual-MIMO for V2X Communication †

    Science.gov (United States)

    Kim, Jai-Eun; Kim, Ji-Won; Park, Youngil; Kim, Ki-Doo

    2016-01-01

    In this paper, we analyze the applicability of color-space-based, color-independent visual-MIMO for V2X. We aim to achieve a visual-MIMO scheme that can maintain the original color and brightness while performing seamless communication. We consider two scenarios of GCM-based visual-MIMO for V2X: one is multipath transmission using visual-MIMO networking and the other is multi-node V2X communication. In the multipath transmission scenario, we analyze the channel capacity numerically and illustrate the significance of networking information such as distance, reference color (symbol), and multiplexing-diversity mode transitions. In the multiple-access V2X scenario, simultaneous multiple-access communication can be achieved without node interference by dividing the communication area using image processing. Finally, through numerical simulation, we show the superior SER performance of the visual-MIMO scheme compared with LED-PD communication and present the numerical result for the GCM-based visual-MIMO channel capacity versus distance. PMID:27120603

  10. Visualization periodic flows in a continuously stratified fluid.

    Science.gov (United States)

    Bardakov, R.; Vasiliev, A.

    2012-04-01

    To visualize the flow pattern of a viscous continuously stratified fluid, both experimental and computational methods were developed. Computational procedures were based on exact solutions of the set of fundamental equations. Solutions of the problems of flows produced by a periodically oscillating disk (linear and torsion oscillations) were visualized at high resolution to distinguish the small-scale singular components against the background of strong internal waves. The numerical visualization algorithm represents both scalar and vector fields, such as velocity, density, pressure, vorticity and stream function. The effects of source size, buoyancy and oscillation frequency, and kinematic viscosity of the medium were traced in 2D and 3D problem formulations. A precision schlieren instrument was used to visualize the flow pattern produced by linear and torsion oscillations of a strip and a disk in a continuously stratified fluid. Uniform stratification was created by the continuous displacement method. The buoyancy period ranged from 7.5 to 14 s. In the experiments, disks with diameters from 9 to 30 cm and thicknesses of 1 mm to 10 mm were used. Different schlieren methods, namely the conventional vertical slit - Foucault knife, vertical slit - filament (Maksoutov's method) and horizontal slit - horizontal grating (natural "rainbow" schlieren method), help to produce complementary flow patterns. Both internal wave beams and fine flow components were visualized near and far from the source. The intensity of high-gradient envelopes increased in proportion to the amplitude of the source. In domains where the envelopes converge, isolated small-scale vortices and extended mushroom-like jets were formed. Experiments have shown that in the case of torsion oscillations the pattern of currents is more complicated than in the case of forced linear oscillations. Comparison with a known theoretical model shows that nonlinear interactions between the regular and singular flow components must be taken into account.

  11. 1/12-Scale mixing interface visualization and buoyant particle release tests in support of Tank 241-SY-101 hydrogen mitigation

    Energy Technology Data Exchange (ETDEWEB)

    Eschbach, E.J.; Enderlin, C.W.

    1993-10-01

    In support of tank waste safety programs, visualization tests were performed in the 1/12-scale tank facility, using a low-viscosity simulant. The primary objective of the tests was to obtain video records of the transient jet-sludge interaction. The intent is that these videos will provide useful qualitative data for comparison with model predictions. Two tests were initially planned: mixing interface visualization (MIV) and buoyant particle release (BPR). Completion of the buoyant particle release test was set aside in order to complete additional MIV tests. Rheological measurements were made on simulant samples before testing, and the simulant was found to exhibit thixotropic behavior. Shear vane measurements were also made on an in-situ analog of the 1/12-scale tank simulant. Simulant shear strength has been observed to be time dependent. The primary objective of obtaining video records of jet-sludge interaction was satisfied, and the records yielded jet location information which may be of use in completing model comparisons. The modeling effort is not part of this task, but this report also discusses test specific instrumentation, visualization techniques, and shear vane instrumentation which would enable improved characterization of jet-sludge interaction and simulant characteristics.

  12. 1/12-Scale mixing interface visualization and buoyant particle release tests in support of Tank 241-SY-101 hydrogen mitigation

    International Nuclear Information System (INIS)

    Eschbach, E.J.; Enderlin, C.W.

    1993-10-01

    In support of tank waste safety programs, visualization tests were performed in the 1/12-scale tank facility, using a low-viscosity simulant. The primary objective of the tests was to obtain video records of the transient jet-sludge interaction. The intent is that these videos will provide useful qualitative data for comparison with model predictions. Two tests were initially planned: mixing interface visualization (MIV) and buoyant particle release (BPR). Completion of the buoyant particle release test was set aside in order to complete additional MIV tests. Rheological measurements were made on simulant samples before testing, and the simulant was found to exhibit thixotropic behavior. Shear vane measurements were also made on an in-situ analog of the 1/12-scale tank simulant. Simulant shear strength has been observed to be time dependent. The primary objective of obtaining video records of jet-sludge interaction was satisfied, and the records yielded jet location information which may be of use in completing model comparisons. The modeling effort is not part of this task, but this report also discusses test specific instrumentation, visualization techniques, and shear vane instrumentation which would enable improved characterization of jet-sludge interaction and simulant characteristics

  13. Collaborative Visualization for Large-Scale Accelerator Electromagnetic Modeling

    International Nuclear Information System (INIS)

    2010-01-01

    In the Phase I SBIR we proposed a ParaView-based solution to provide an environment for individuals to actively collaborate in the visualization process. The technical objectives of Phase I were: (1) to determine the set of features required for an effective collaborative system; (2) to implement a two-person collaborative prototype; and (3) to implement key collaborative features such as control locking and annotation. Accordingly, we implemented a ParaView-based collaboration prototype with support for collaborating with up to four simultaneous clients. We also implemented collaborative features such as control locking, chatting and annotation. Due in part to the flexibility provided by the ParaView framework and the design features implemented in the prototype, we were able to support collaboration with multiple views, instead of a single view as initially proposed in Phase I. In this section we summarize the results obtained during the Phase I project. ParaView is a complex, scalable, client-server application framework built on top of the VTK visualization engine. During the implementation of the Phase I prototype, we realized that the ParaView framework naturally supports collaboration technology; hence we were able to go beyond the proposed Phase I prototype in several ways. For example, we were able to support multiple views, enable server- as well as client-side rendering, and manage up to four heterogeneous clients. The success achieved in Phase I clearly demonstrated the technical feasibility of the ParaView-based collaborative framework we are proposing in the Phase II effort. We also investigated using the web browser as one of the means of participating in a collaborative session. This would enable non-visualization experts to participate in the collaboration process without being intimidated by a complex application such as ParaView. Hence we also developed a prototype web visualization applet that makes it possible for interactive

  14. Collaborative Visualization for Large-Scale Accelerator Electromagnetic Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Schussman, Greg; /SLAC

    2010-08-25

    In the Phase I SBIR we proposed a ParaView-based solution to provide an environment for individuals to actively collaborate in the visualization process. The technical objectives of Phase I were: (1) to determine the set of features required for an effective collaborative system; (2) to implement a two-person collaborative prototype; and (3) to implement key collaborative features such as control locking and annotation. Accordingly, we implemented a ParaView-based collaboration prototype with support for collaborating with up to four simultaneous clients. We also implemented collaborative features such as control locking, chatting and annotation. Due in part to the flexibility provided by the ParaView framework and the design features implemented in the prototype, we were able to support collaboration with multiple views, instead of a single view as initially proposed in Phase I. In this section we summarize the results obtained during the Phase I project. ParaView is a complex, scalable, client-server application framework built on top of the VTK visualization engine. During the implementation of the Phase I prototype, we realized that the ParaView framework naturally supports collaboration technology; hence we were able to go beyond the proposed Phase I prototype in several ways. For example, we were able to support multiple views, enable server- as well as client-side rendering, and manage up to four heterogeneous clients. The success achieved in Phase I clearly demonstrated the technical feasibility of the ParaView-based collaborative framework we are proposing in the Phase II effort. We also investigated using the web browser as one of the means of participating in a collaborative session. This would enable non-visualization experts to participate in the collaboration process without being intimidated by a complex application such as ParaView. Hence we also developed a prototype web visualization applet that makes it possible for interactive

  15. Exploration of High-Dimensional Scalar Function for Nuclear Reactor Safety Analysis and Visualization

    Energy Technology Data Exchange (ETDEWEB)

    Dan Maljovec; Bei Wang; Valerio Pascucci; Peer-Timo Bremer; Michael Pernice; Robert Nourgaliev

    2013-05-01

    The next generation of methodologies for nuclear reactor Probabilistic Risk Assessment (PRA) explicitly accounts for the time element in modeling the probabilistic system evolution and uses numerical simulation tools to account for possible dependencies between failure events. The Monte-Carlo (MC) and the Dynamic Event Tree (DET) approaches belong to this new class of dynamic PRA methodologies. A challenge of dynamic PRA algorithms is the large amount of data they produce, which may be difficult to visualize and analyze in order to extract useful information. We present a software tool designed to address this challenge. We model a large-scale nuclear simulation dataset as a high-dimensional scalar function defined over a discrete sample of the domain. First, we provide a structural analysis of such a function at multiple scales and give insight into the relationship between the input parameters and the output. Second, we enable exploratory analysis, helping users differentiate features from noise through multi-scale analysis on an interactive platform, based on domain knowledge and data characterization. Our analysis is performed by exploiting the topological and geometric properties of the domain, building statistical models based on its topological segmentations and providing interactive visual interfaces to facilitate such explorations. We provide a user's guide to our software tool by highlighting its analysis and visualization capabilities, along with a use case involving a dataset from a nuclear reactor safety simulation.
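
    As a rough, hedged illustration of the kind of topology-based segmentation such a tool relies on, the sketch below assigns each sample of a scattered high-dimensional scalar function to the local maximum reached by steepest ascent over a k-nearest-neighbor graph (a simplified, Morse-complex-like partition). It is not the published tool; the dataset, the value of k and the brute-force neighbor search are assumptions.

```python
import numpy as np

def ascent_segmentation(points, values, k=8):
    """Label each sample by the local maximum it reaches under steepest
    ascent on a k-nearest-neighbor graph of the sampled domain."""
    n = len(points)
    d2 = ((points[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    neighbors = np.argsort(d2, axis=1)[:, 1:k + 1]      # skip self

    parent = np.arange(n)                               # ascent pointer
    for i in range(n):
        j = neighbors[i][np.argmax(values[neighbors[i]])]
        if values[j] > values[i]:
            parent[i] = j

    labels = parent.copy()                              # follow to maxima
    for i in range(n):
        while labels[i] != parent[labels[i]]:
            labels[i] = parent[labels[i]]
    return labels

rng = np.random.default_rng(1)
pts = rng.uniform(-2.0, 2.0, size=(400, 4))             # 4D input samples
vals = np.exp(-((pts - 1) ** 2).sum(1)) + np.exp(-((pts + 1) ** 2).sum(1))
print(np.unique(ascent_segmentation(pts, vals)).size)   # number of basins found
```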

  16. Numerical and Experimental Identification of Seven-Wire Strand Tensions Using Scale Energy Entropy Spectra of Ultrasonic Guided Waves

    Directory of Open Access Journals (Sweden)

    Ji Qian

    2018-01-01

    Full Text Available Accurate identification of tension in multiwire strands is a key issue to ensure the structural safety and durability of prestressed concrete structures, cable-stayed bridges, and hoist elevators. This paper proposes a method to identify strand tensions based on scale energy entropy spectra of ultrasonic guided waves (UGWs). A numerical method was first developed to simulate UGW propagation in a seven-wire strand, employing the wavelet transform to extract UGW time-frequency energy distributions for different loadings. Mode separation and frequency band loss of L(0,1) were then found for increasing tension, and UGW scale energy entropy spectra were extracted to establish a tension identification index. A good linear relationship was found between the proposed identification index and tensile force, and the effects of propagation distance and propagation path were analyzed. Finally, UGW propagation was examined experimentally for a long seven-wire strand to investigate attenuation and long-distance propagation. Numerical and experimental results verified that the proposed method not only can effectively identify strand tensions but can also adapt to long-distance tests for practical engineering.
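
    One plausible reading of the "scale energy entropy" index, sketched below under stated assumptions, is the Shannon entropy of the energy distribution across wavelet decomposition levels. The sketch assumes the PyWavelets package and a synthetic tone-burst signal; it is not the authors' implementation.

```python
import numpy as np
import pywt  # PyWavelets, assumed to be installed

def scale_energy_entropy(signal, wavelet="db4", level=5):
    """Energy per wavelet decomposition level, normalized to a probability
    distribution, followed by its Shannon entropy."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    energies = np.array([np.sum(c ** 2) for c in coeffs])
    p = energies / energies.sum()
    return -np.sum(p * np.log(p + 1e-12)), p

# Synthetic guided-wave-like signal: a windowed tone burst plus noise.
t = np.linspace(0.0, 1e-3, 4000)
burst = np.sin(2 * np.pi * 60e3 * t) * np.exp(-((t - 2e-4) / 5e-5) ** 2)
signal = burst + 0.05 * np.random.default_rng(0).standard_normal(t.size)
entropy, spectrum = scale_energy_entropy(signal)
print(round(float(entropy), 3), spectrum.round(3))
```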

  17. A Web-based Multi-user Interactive Visualization System For Large-Scale Computing Using Google Web Toolkit Technology

    Science.gov (United States)

    Weiss, R. M.; McLane, J. C.; Yuen, D. A.; Wang, S.

    2009-12-01

    We have created a web-based, interactive system for multi-user collaborative visualization of large data sets (on the order of terabytes) that allows users in geographically disparate locations to simultaneously and collectively visualize large data sets over the Internet. By leveraging asynchronous JavaScript and XML (AJAX) web development paradigms via the Google Web Toolkit (http://code.google.com/webtoolkit/), we are able to provide remote, web-based users a web portal to LCSE's (http://www.lcse.umn.edu) large-scale interactive visualization system already in place at the University of Minnesota that provides high-resolution visualizations to the order of 15 million pixels by Megan Damon. In the current version of our software, we have implemented a new, highly extensible back-end framework built around HTTP "server push" technology to provide a rich collaborative environment and a smooth end-user experience. Furthermore, the web application is accessible via a variety of devices including netbooks, iPhones, and other web- and javascript-enabled cell phones. New features in the current version include: the ability for (1) users to launch multiple visualizations, (2) a user to invite one or more other users to view their visualization in real-time (multiple observers), (3) users to delegate control aspects of the visualization to others (multiple controllers), and (4) users to engage in collaborative chat and instant messaging with other users within the user interface of the web application. We will explain choices made regarding implementation, overall system architecture and method of operation, and the benefits of an extensible, modular design. We will also discuss future goals, features, and our plans for increasing scalability of the system, which includes a discussion of the benefits potentially afforded us by a migration of server-side components to the Google Application Engine (http://code.google.com/appengine/).

  18. Matlab programming for numerical analysis

    CERN Document Server

    Lopez, Cesar

    2014-01-01

    MATLAB is a high-level language and environment for numerical computation, visualization, and programming. Using MATLAB, you can analyze data, develop algorithms, and create models and applications. The language, tools, and built-in math functions enable you to explore multiple approaches and reach a solution faster than with spreadsheets or traditional programming languages, such as C/C++ or Java. Programming MATLAB for Numerical Analysis introduces you to the MATLAB language with practical hands-on instructions and results, allowing you to quickly achieve your goals. You will first become

  19. Savant Genome Browser 2: visualization and analysis for population-scale genomics.

    Science.gov (United States)

    Fiume, Marc; Smith, Eric J M; Brook, Andrew; Strbenac, Dario; Turner, Brian; Mezlini, Aziz M; Robinson, Mark D; Wodak, Shoshana J; Brudno, Michael

    2012-07-01

    High-throughput sequencing (HTS) technologies are providing an unprecedented capacity for data generation, and there is a corresponding need for efficient data exploration and analysis capabilities. Although most existing tools for HTS data analysis are developed for either automated (e.g. genotyping) or visualization (e.g. genome browsing) purposes, such tools are most powerful when combined. For example, integration of visualization and computation allows users to iteratively refine their analyses by updating computational parameters within the visual framework in real-time. Here we introduce the second version of the Savant Genome Browser, a standalone program for visual and computational analysis of HTS data. Savant substantially improves upon its predecessor and existing tools by introducing innovative visualization modes and navigation interfaces for several genomic datatypes, and synergizing visual and automated analyses in a way that is powerful yet easy even for non-expert users. We also present a number of plugins that were developed by the Savant Community, which demonstrate the power of integrating visual and automated analyses using Savant. The Savant Genome Browser is freely available (open source) at www.savantbrowser.com.

  20. Scientific visualization and radiology

    International Nuclear Information System (INIS)

    Lawrance, D.P.; Hoyer, C.E.; Wrestler, F.A.; Kuhn, M.J.; Moore, W.D.; Anderson, D.R.

    1989-01-01

    Scientific visualization is the visual presentation of numerical data. The National Center for Supercomputing Applications (NCSA) has developed methods for visualizing computer-based simulations of digital imaging data. The applicability of these various tools for unique and potentially beneficial medical display of MR images is investigated. Raw data are obtained from MR images of the brain, neck, spine, and brachial plexus acquired on a 1.5-T imager with multiple pulse sequences. A supercomputer and other mainframe resources run a variety of graphic and imaging programs using these data. An interdisciplinary team of imaging scientists, computer graphics programmers, and physicians works together to obtain useful information.

  1. Improved analysis and visualization of friction loop data: unraveling the energy dissipation of meso-scale stick-slip motion

    Science.gov (United States)

    Kokorian, Jaap; Merlijn van Spengen, W.

    2017-11-01

    In this paper we demonstrate a new method for analyzing and visualizing friction force measurements of meso-scale stick-slip motion, and introduce a method for extracting two separate dissipative energy components. Using a microelectromechanical system tribometer, we execute 2 million reciprocating sliding cycles, during which we measure the static friction force with a resolution of \
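
    The dissipated energy per sliding cycle that such an analysis starts from is simply the area enclosed by the friction loop (friction force versus displacement). The sketch below computes that area with the shoelace rule; the rectangular loop and its magnitudes are assumptions, not the paper's data.

```python
import numpy as np

def loop_dissipated_energy(displacement, force):
    """Energy dissipated per reciprocating cycle: area enclosed by the
    force-displacement friction loop, via the shoelace formula."""
    x = np.append(displacement, displacement[0])   # close the loop
    f = np.append(force, force[0])
    return 0.5 * np.abs(np.sum(x[:-1] * f[1:] - x[1:] * f[:-1]))

# Idealized stick-slip loop: +/-1 uN friction force over a 2 um stroke.
x_fwd = np.linspace(0.0, 2e-6, 200)
x = np.concatenate([x_fwd, x_fwd[::-1]])
f = np.concatenate([np.full(200, 1e-6), np.full(200, -1e-6)])
print(loop_dissipated_energy(x, f))   # ~4e-12 J for this rectangular loop
```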

  2. Map Learning with a 3D Printed Interactive Small-Scale Model: Improvement of Space and Text Memorization in Visually Impaired Students

    Directory of Open Access Journals (Sweden)

    Stéphanie Giraud

    2017-06-01

    Full Text Available Special education teachers for visually impaired students rely on tools such as raised-line maps (RLMs) to teach spatial knowledge. These tools do not fully and adequately meet the needs of the teachers because they are long to produce, expensive, and not versatile enough to provide rapid updating of the content. For instance, the same RLM can barely be used during different lessons. In addition, those maps do not provide any interactivity, which reduces students’ autonomy. With the emergence of 3D printing and low-cost microcontrollers, it is now easy to design affordable interactive small-scale models (SSMs) which are adapted to the needs of special education teachers. However, no study has previously been conducted to evaluate non-visual learning using interactive SSMs. In collaboration with a specialized teacher, we designed a SSM and a RLM representing the evolution of the geography and history of a fictitious kingdom. The two conditions were compared in a study with 24 visually impaired students regarding the memorization of the spatial layout and historical contents. The study showed that the interactive SSM improved both space and text memorization as compared to the RLM with braille legend. In conclusion, we argue that affordable home-made interactive small scale models can improve learning for visually impaired students. Interestingly, they are adaptable to any teaching situation including students with specific needs.

  3. Map Learning with a 3D Printed Interactive Small-Scale Model: Improvement of Space and Text Memorization in Visually Impaired Students.

    Science.gov (United States)

    Giraud, Stéphanie; Brock, Anke M; Macé, Marc J-M; Jouffrais, Christophe

    2017-01-01

    Special education teachers for visually impaired students rely on tools such as raised-line maps (RLMs) to teach spatial knowledge. These tools do not fully and adequately meet the needs of the teachers because they are long to produce, expensive, and not versatile enough to provide rapid updating of the content. For instance, the same RLM can barely be used during different lessons. In addition, those maps do not provide any interactivity, which reduces students' autonomy. With the emergence of 3D printing and low-cost microcontrollers, it is now easy to design affordable interactive small-scale models (SSMs) which are adapted to the needs of special education teachers. However, no study has previously been conducted to evaluate non-visual learning using interactive SSMs. In collaboration with a specialized teacher, we designed a SSM and a RLM representing the evolution of the geography and history of a fictitious kingdom. The two conditions were compared in a study with 24 visually impaired students regarding the memorization of the spatial layout and historical contents. The study showed that the interactive SSM improved both space and text memorization as compared to the RLM with braille legend. In conclusion, we argue that affordable home-made interactive small scale models can improve learning for visually impaired students. Interestingly, they are adaptable to any teaching situation including students with specific needs.

  4. The PedsQL™ Present Functioning Visual Analogue Scales: preliminary reliability and validity

    Directory of Open Access Journals (Sweden)

    Varni James W

    2006-10-01

    Full Text Available Abstract Background The PedsQL™ Present Functioning Visual Analogue Scales (PedsQL™ VAS) were designed as an ecological momentary assessment (EMA) instrument to rapidly measure present or at-the-moment functioning in children and adolescents. The PedsQL™ VAS assess child self-report and parent-proxy report of anxiety, sadness, anger, worry, fatigue, and pain utilizing six developmentally appropriate visual analogue scales based on the well-established Varni/Thompson Pediatric Pain Questionnaire (PPQ) Pain Intensity VAS format. Methods The six-item PedsQL™ VAS was administered to 70 pediatric patients ages 5–17 and their parents upon admittance to the hospital environment (Time 1: T1) and again two hours later (Time 2: T2). It was hypothesized that the PedsQL™ VAS Emotional Distress Summary Score (anxiety, sadness, anger, worry) and the fatigue VAS would demonstrate moderate to large effect size correlations with the PPQ Pain Intensity VAS, and that patient-parent concordance would increase over time. Results Test-retest reliability was demonstrated from T1 to T2 in the large effect size range. Internal consistency reliability was demonstrated for the PedsQL™ VAS Total Symptom Score (patient self-report: T1 alpha = .72, T2 alpha = .80; parent proxy-report: T1 alpha = .80, T2 alpha = .84) and Emotional Distress Summary Score (patient self-report: T1 alpha = .74, T2 alpha = .73; parent proxy-report: T1 alpha = .76, T2 alpha = .81). As hypothesized, the Emotional Distress Summary Score and Fatigue VAS were significantly correlated with the PPQ Pain VAS in the medium to large effect size range, and patient and parent concordance increased from T1 to T2. Conclusion The results demonstrate preliminary test-retest and internal consistency reliability and construct validity of the PedsQL™ Present Functioning VAS instrument for both pediatric patient self-report and parent proxy-report. Further field testing is required to extend these initial
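
    The internal consistency figures quoted above (Cronbach's alpha) follow a simple formula, alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). The sketch below computes it for synthetic six-item VAS data; the data are an assumption, not the study's.

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    item_scores = np.asarray(item_scores, dtype=float)
    k = item_scores.shape[1]
    item_var = item_scores.var(axis=0, ddof=1).sum()
    total_var = item_scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_var / total_var)

# Synthetic six-item VAS scores (0-100 mm) for 70 respondents; a shared
# latent component makes the items correlate, as real scale items would.
rng = np.random.default_rng(3)
latent = rng.uniform(0, 100, size=(70, 1))
items = np.clip(latent + rng.normal(0, 15, size=(70, 6)), 0, 100)
print(round(float(cronbach_alpha(items)), 2))
```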

  5. Three-dimensional visualization of ensemble weather forecasts – Part 1: The visualization tool Met.3D (version 1.0)

    Directory of Open Access Journals (Sweden)

    M. Rautenhaus

    2015-07-01

    Full Text Available We present "Met.3D", a new open-source tool for the interactive three-dimensional (3-D) visualization of numerical ensemble weather predictions. The tool has been developed to support weather forecasting during aircraft-based atmospheric field campaigns; however, it is applicable to further forecasting, research and teaching activities. Our work approaches challenging topics related to the visual analysis of numerical atmospheric model output – 3-D visualization, ensemble visualization and how both can be used in a meaningful way suited to weather forecasting. Met.3D builds a bridge from proven 2-D visualization methods commonly used in meteorology to 3-D visualization by combining both visualization types in a 3-D context. We address the issue of spatial perception in the 3-D view and present approaches to using the ensemble to allow the user to assess forecast uncertainty. Interactivity is key to our approach. Met.3D uses modern graphics technology to achieve interactive visualization on standard consumer hardware. The tool supports forecast data from the European Centre for Medium-Range Weather Forecasts (ECMWF) and can operate directly on ECMWF hybrid sigma-pressure level grids. We describe the employed visualization algorithms, and analyse the impact of the ECMWF grid topology on computing 3-D ensemble statistical quantities. Our techniques are demonstrated with examples from the T-NAWDEX-Falcon 2012 (THORPEX – North Atlantic Waveguide and Downstream Impact Experiment) campaign.
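
    The hybrid sigma-pressure grids mentioned above define half-level pressures as p = a + b * p_s, so working in pressure space first requires reconstructing the 3-D pressure field from the surface pressure. The sketch below shows that reconstruction for a toy set of coefficients; the coefficients are assumptions, not the actual ECMWF tables, and taking full levels as the mean of adjacent half levels is one common simple choice.

```python
import numpy as np

def hybrid_level_pressure(a_half, b_half, p_surface):
    """Pressure on hybrid sigma-pressure levels: p_half = a + b * p_s,
    with full-level pressure taken as the mean of adjacent half levels."""
    p_half = a_half[:, None] + b_half[:, None] * p_surface[None, :]
    return 0.5 * (p_half[:-1, :] + p_half[1:, :])

# Toy 5-interface coefficient set (not the ECMWF tables) and two columns.
a = np.array([0.0, 2000.0, 8000.0, 4000.0, 0.0])   # Pa
b = np.array([0.0, 0.0, 0.2, 0.7, 1.0])            # dimensionless
ps = np.array([101325.0, 98000.0])                 # surface pressure, Pa
print(hybrid_level_pressure(a, b, ps).round(0))    # 4 levels x 2 columns
```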

  6. Manipulations of attention dissociate fragile visual short-term memory from visual working memory.

    Science.gov (United States)

    Vandenbroucke, Annelinde R E; Sligte, Ilja G; Lamme, Victor A F

    2011-05-01

    People often rely on information that is no longer in view, but maintained in visual short-term memory (VSTM). Traditionally, VSTM is thought to operate on either a short time-scale with high capacity - iconic memory - or a long time scale with small capacity - visual working memory. Recent research suggests that in addition, an intermediate stage of memory in between iconic memory and visual working memory exists. This intermediate stage has a large capacity and a lifetime of several seconds, but is easily overwritten by new stimulation. We therefore termed it fragile VSTM. In previous studies, fragile VSTM has been dissociated from iconic memory by the characteristics of the memory trace. In the present study, we dissociated fragile VSTM from visual working memory by showing a differentiation in their dependency on attention. A decrease in attention during presentation of the stimulus array greatly reduced the capacity of visual working memory, while this had only a small effect on the capacity of fragile VSTM. We conclude that fragile VSTM is a separate memory store from visual working memory. Thus, a tripartite division of VSTM appears to be in place, comprising iconic memory, fragile VSTM and visual working memory. Copyright © 2011 Elsevier Ltd. All rights reserved.

  7. Numerical Analysis of Soil Settlement Prediction and Its Application In Large-Scale Marine Reclamation Artificial Island Project

    Directory of Open Access Journals (Sweden)

    Zhao Jie

    2017-11-01

    Full Text Available In an artificial island construction project based on large-scale marine reclamation, soil settlement is a key factor affecting the long-term safe operation of the whole field. To analyze the factors behind soil settlement in a marine reclamation project, the SEM method of soil micro-structural analysis is used to test and study six soil samples, including the representative silt, mucky silty clay, silty clay and clay in the area. The structural characteristics that affect soil settlement are obtained by observing the SEM images at different depths. By combining the numerical calculation methods of Terzaghi's one-dimensional and Biot's two-dimensional consolidation theories, one-dimensional and two-dimensional creep models are established and the numerical calculation results of the two consolidation theories are compared in order to predict the maximum settlement of the soils 100 years after completion. The analysis results indicate that the micro-structural characteristics are the essential factor affecting settlement in this area. Based on the numerical analysis of one-dimensional and two-dimensional settlement, the settlement law and trend obtained by the two numerical analysis methods are similar. The analysis in this paper can provide reference and guidance for projects related to marine reclamation land.
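
    For the one-dimensional part, Terzaghi's classical series solution gives the degree of consolidation directly, and the predicted settlement at time t is that fraction of the final settlement. The sketch below evaluates the series; all soil parameters are illustrative assumptions, not values from the paper.

```python
import numpy as np

def degree_of_consolidation(t_years, c_v, h_drain, n_terms=100):
    """Terzaghi 1D consolidation: U = 1 - sum 2/M^2 * exp(-M^2 * Tv),
    with M = pi/2 * (2m + 1) and Tv = c_v * t / h_drain**2."""
    t_sec = t_years * 365.25 * 24.0 * 3600.0
    Tv = c_v * t_sec / h_drain ** 2
    M = 0.5 * np.pi * (2 * np.arange(n_terms) + 1)
    return 1.0 - np.sum((2.0 / M ** 2) * np.exp(-(M ** 2) * Tv))

# Illustrative reclamation-fill parameters (assumptions only).
c_v = 2.0e-8      # coefficient of consolidation, m^2/s
h_dr = 10.0       # drainage path length, m
s_final = 1.5     # predicted final primary settlement, m
for t in (1, 10, 100):
    U = degree_of_consolidation(t, c_v, h_dr)
    print(f"t = {t:3d} yr: U = {U:.2f}, settlement = {U * s_final:.2f} m")
```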

  8. Self-similar radiation from numerical Rosenau-Hyman compactons

    International Nuclear Information System (INIS)

    Rus, Francisco; Villatoro, Francisco R.

    2007-01-01

    The numerical simulation of compactons, solitary waves with compact support, is characterized by the presence of spurious phenomena, such as numerically induced radiation, which is illustrated here using four numerical methods applied to the Rosenau-Hyman K(p, p) equation. Both forward and backward radiations are emitted from the compacton, presenting a self-similar shape which is illustrated graphically by proper scaling. A grid refinement study shows that the amplitude of the radiations decreases as the grid size does, confirming its numerical origin. The front velocity and the amplitude of both radiations have been studied as a function of both the compacton and the numerical parameters. The amplitude of the radiations decreases exponentially in time, being characterized by a nearly constant scaling exponent. An ansatz for both the backward and forward radiations, corresponding to a self-similar function characterized by the scaling exponent, is suggested by the present numerical results.
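
    For reference, the Rosenau-Hyman K(p, p) family referred to above is usually written as follows, with the classic K(2, 2) compacton as its compactly supported solitary-wave solution (quoted here from the standard literature, not from this record):

```latex
u_t + (u^{p})_x + (u^{p})_{xxx} = 0, \qquad p > 1,
\qquad
u_c(x,t) =
\begin{cases}
\dfrac{4c}{3}\cos^{2}\!\left(\dfrac{x - ct}{4}\right), & |x - ct| \le 2\pi,\\[4pt]
0, & \text{otherwise},
\end{cases}
\quad (p = 2).
```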

  9. Learning visual balance from large-scale datasets of aesthetically highly rated images

    Science.gov (United States)

    Jahanian, Ali; Vishwanathan, S. V. N.; Allebach, Jan P.

    2015-03-01

    The concept of visual balance is innate for humans, and influences how we perceive visual aesthetics and cognize harmony. Although visual balance is a vital principle of design and taught in schools of design, it is barely quantified. On the other hand, with the emergence of automatic/semi-automatic visual designs for self-publishing, learning visual balance and computationally modeling it may elevate the aesthetics of such designs. In this paper, we present how the quest for understanding visual balance inspired us to revisit one of the well-known theories in visual arts, the so-called theory of "visual rightness", elucidated by Arnheim. We define Arnheim's hypothesis as a design mining problem with the goal of learning visual balance from the work of professionals. We collected a dataset of 120K images that are aesthetically highly rated, from a professional photography website. We then computed factors that contribute to visual balance based on the notion of visual saliency. We fitted a mixture of Gaussians to the saliency maps of the images, and obtained the hotspots of the images. Our inferred Gaussians align with Arnheim's hotspots, and confirm his theory. Moreover, the results support the viability of the center of mass, symmetry, as well as the Rule of Thirds in our dataset.
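
    The hotspot-fitting step can be sketched compactly: sample pixel coordinates with probability proportional to saliency and fit a mixture of Gaussians to the samples. The sketch below assumes scikit-learn and a synthetic saliency map with two bright regions; the component count is also an assumption.

```python
import numpy as np
from sklearn.mixture import GaussianMixture  # scikit-learn is assumed

def saliency_hotspots(saliency, n_components=2, n_samples=5000, seed=0):
    """Fit a Gaussian mixture to a saliency map by sampling pixel positions
    with probability proportional to saliency; return the component means
    ('hotspots') in (row, col) image coordinates."""
    rng = np.random.default_rng(seed)
    h, w = saliency.shape
    p = saliency.ravel() / saliency.sum()
    idx = rng.choice(h * w, size=n_samples, p=p)
    pts = np.column_stack(np.unravel_index(idx, (h, w))).astype(float)
    gmm = GaussianMixture(n_components=n_components, random_state=seed).fit(pts)
    return gmm.means_

# Synthetic saliency map with two bright regions (an assumption).
yy, xx = np.mgrid[0:120, 0:160]
sal = (np.exp(-((yy - 40) ** 2 + (xx - 55) ** 2) / 300.0)
       + np.exp(-((yy - 80) ** 2 + (xx - 110) ** 2) / 300.0))
print(saliency_hotspots(sal).round(1))   # means near (40, 55) and (80, 110)
```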

  10. Numerical modeling of pore-scale phenomena during CO2 sequestration in oceanic sediments

    International Nuclear Information System (INIS)

    Kang, Qinjun; Tsimpanogiannis, Ioannis N.; Zhang, Dongxiao; Lichtner, Peter C.

    2005-01-01

    Direct disposal of liquid CO2 on the ocean floor is one of the approaches considered for sequestering CO2 in order to reduce its concentration in the atmosphere. At oceanic depths deeper than approximately 3000 m, liquid CO2 density is higher than the density of seawater and CO2 is expected to sink and form a pool at the ocean floor. In addition to chemical reactions between CO2 and seawater to form hydrate, fluid displacement is also expected to occur within the ocean floor sediments. In this work, we consider two different numerical models for hydrate formation at the pore scale. The first model consists of the Lattice Boltzmann (LB) method applied to a single-phase supersaturated solution in a constructed porous medium. The second model is based on the Invasion Percolation (IP) in pore networks, applied to two-phase immiscible displacement of seawater by liquid CO2. The pore-scale results are upscaled to obtain constitutive relations for porosity, both transverse and for the entire domain, and for permeability. We examine deposition and displacement patterns, and changes in porosity and permeability due to hydrate formation, and how these properties depend on various parameters including a parametric study of the effect of hydrate formation kinetics. According to the simulations, the depth of CO2 invasion in the sediments is controlled by changes in the pore-scale porosity close to the hydrate formation front. (author)
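
    The invasion percolation model lends itself to a very small sketch: the invading fluid enters from one boundary and repeatedly occupies the accessible site with the lowest entry threshold. The version below is a generic site-IP sketch on a 2D lattice with random thresholds; it ignores hydrate kinetics and trapping, and all parameters are assumptions.

```python
import heapq
import numpy as np

def invasion_percolation(thresholds, n_steps):
    """Minimal site invasion percolation: starting from the top row, always
    invade the accessible neighboring site with the smallest threshold."""
    nz, nx = thresholds.shape
    invaded = np.zeros((nz, nx), dtype=bool)
    frontier = [(thresholds[0, j], 0, j) for j in range(nx)]
    heapq.heapify(frontier)
    for _ in range(n_steps):
        while frontier:
            _, i, j = heapq.heappop(frontier)
            if not invaded[i, j]:
                break
        else:
            break                                  # nothing left to invade
        invaded[i, j] = True
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < nz and 0 <= nj < nx and not invaded[ni, nj]:
                heapq.heappush(frontier, (thresholds[ni, nj], ni, nj))
    return invaded

rng = np.random.default_rng(2)
pattern = invasion_percolation(rng.random((40, 60)), n_steps=600)
print(int(pattern.sum()), "sites invaded; deepest row reached:",
      int(pattern.any(axis=1).sum()) - 1)
```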

  11. MATH: A Scientific Tool for Numerical Methods Calculation and Visualization

    Directory of Open Access Journals (Sweden)

    Henrich Glaser-Opitz

    2016-02-01

    Full Text Available MATH is an easy-to-use application for various numerical methods calculations with a graphical user interface and an integrated plotting tool, written in Qt with extensive use of the Qwt library for plotting options and the GSL and MuParser libraries as numerical and parser helper libraries. It can be found at http://sourceforge.net/projects/nummath. MATH is a convenient tool for use in the education process because of its capability of showing every important step in the solution process, to better understand how it is done. MATH also enables fast comparison of the speed and precision of similar methods.

  12. The basis of orientation decoding in human primary visual cortex: fine- or coarse-scale biases?

    Science.gov (United States)

    Maloney, Ryan T

    2015-01-01

    Orientation signals in human primary visual cortex (V1) can be reliably decoded from the multivariate pattern of activity as measured with functional magnetic resonance imaging (fMRI). The precise underlying source of these decoded signals (whether by orientation biases at a fine or coarse scale in cortex) remains a matter of some controversy, however. Freeman and colleagues (J Neurosci 33: 19695-19703, 2013) recently showed that the accuracy of decoding of spiral patterns in V1 can be predicted by a voxel's preferred spatial position (the population receptive field) and its coarse orientation preference, suggesting that coarse-scale biases are sufficient for orientation decoding. Whether they are also necessary for decoding remains an open question, and one with implications for the broader interpretation of multivariate decoding results in fMRI studies. Copyright © 2015 the American Physiological Society.

  13. RANS-based numerical simulation and visualization of the horseshoe vortex system in the leading edge endwall region of a symmetric body

    International Nuclear Information System (INIS)

    Levchenya, A.M.; Smirnov, E.M.; Goryachev, V.D.

    2010-01-01

    This contribution is aimed at analyzing the capabilities of popular two-equation turbulence models to predict features of 3D flow fields and endwall heat transfer near the blunt edge of a symmetric body mounted on a plate. The configuration studied experimentally by Praisner and Smith is considered. Results obtained with the in-house CFD code SINF and the commercial package ANSYS-CFX are presented and compared. Prediction capabilities of the low-Re Wilcox turbulence model and two versions of the Menter SST model, the original and the modified one, are analyzed in comparison with the experimental data. Special attention is paid to grid sensitivity of the numerical solutions. Advanced visualization of the vortex structures computed is performed with author's visualization tool HDVIS. It has been established that the Wilcox model is not capable of predicting the development of a multiple-vortex system observed in the experiment upstream of the body leading edge. Both versions of the MSST model produce qualitatively correct results, with a considerable superiority of the modified version when compared with the quantitative data.

  14. A biologically plausible transform for visual recognition that is invariant to translation, scale and rotation

    Directory of Open Access Journals (Sweden)

    Pavel Sountsov

    2011-11-01

    Full Text Available Visual object recognition occurs easily despite differences in position, size, and rotation of the object, but the neural mechanisms responsible for this invariance are not known. We have found a set of transforms that achieve invariance in a neurally plausible way. We find that a transform based on local spatial frequency analysis of oriented segments and on logarithmic mapping, when applied twice in an iterative fashion, produces an output image that is unique to the object and that remains constant as the input image is shifted, scaled or rotated.

  15. A Biologically Plausible Transform for Visual Recognition that is Invariant to Translation, Scale, and Rotation.

    Science.gov (United States)

    Sountsov, Pavel; Santucci, David M; Lisman, John E

    2011-01-01

    Visual object recognition occurs easily despite differences in position, size, and rotation of the object, but the neural mechanisms responsible for this invariance are not known. We have found a set of transforms that achieve invariance in a neurally plausible way. We find that a transform based on local spatial frequency analysis of oriented segments and on logarithmic mapping, when applied twice in an iterative fashion, produces an output image that is unique to the object and that remains constant as the input image is shifted, scaled, or rotated.
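
    The authors' exact transform is not reproduced in this record; a commonly used Fourier-Mellin-style stand-in with the same flavor (magnitude spectrum, then log-polar resampling, then a second magnitude spectrum) is sketched below. SciPy's map_coordinates is assumed for the resampling, and the grid sizes are arbitrary choices.

```python
import numpy as np
from scipy.ndimage import map_coordinates  # SciPy is assumed

def invariant_signature(image, n_r=64, n_theta=64):
    """|FFT| -> log-polar resampling -> |FFT| again.  Translation, scale and
    rotation of the input become (approximately) shifts that the second
    magnitude spectrum discards."""
    mag = np.abs(np.fft.fftshift(np.fft.fft2(image)))
    cy, cx = (np.array(mag.shape) - 1) / 2.0
    log_r = np.linspace(0.0, np.log(min(cy, cx)), n_r)
    theta = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
    rr, tt = np.meshgrid(np.exp(log_r), theta, indexing="ij")
    coords = np.array([cy + rr * np.sin(tt), cx + rr * np.cos(tt)])
    logpolar = map_coordinates(mag, coords, order=1, mode="nearest")
    return np.abs(np.fft.fft2(logpolar))

# For a shifted, rescaled or rotated copy of `img`, the signature stays
# approximately the same (up to interpolation and boundary effects).
img = np.random.default_rng(4).random((128, 128))
print(invariant_signature(img).shape)
```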

  16. Evaluation of the Cardiac Depression Visual Analogue Scale in a medical and non-medical sample.

    Science.gov (United States)

    Di Benedetto, Mirella; Sheehan, Matthew

    2014-01-01

    Comorbid depression and medical illness is associated with a number of adverse health outcomes such as lower medication adherence and higher rates of subsequent mortality. Reliable and valid psychological measures capable of detecting a range of depressive symptoms found in medical settings are needed. The Cardiac Depression Visual Analogue Scale (CDVAS) is a recently developed, brief six-item measure originally designed to assess the range and severity of depressive symptoms within a cardiac population. The current study aimed to further investigate the psychometric properties of the CDVAS in a general and medical sample. The sample consisted of 117 participants, whose mean age was 40.0 years (SD = 19.0, range 18-84). Participants completed the CDVAS, the Cardiac Depression Scale (CDS), the Depression Anxiety Stress Scales (DASS) and a demographic and health questionnaire. The CDVAS was found to have adequate internal reliability (α = .76), strong concurrent validity with the CDS (r = .89) and the depression sub-scale of the DASS (r = .70), strong discriminant validity and strong predictive validity. The principal components analysis revealed that the CDVAS measured only one component, providing further support for the construct validity of the scale. Results of the current study indicate that the CDVAS is a short, simple, valid and reliable measure of depressive symptoms suitable for use in a general and medical sample.

  17. Physical Explanation of Archie's Porosity Exponent in Granular Materials: A Process-Based, Pore-Scale Numerical Study

    Science.gov (United States)

    Niu, Qifei; Zhang, Chi

    2018-02-01

    The empirical Archie's law has been widely used in geosciences and engineering to explain the measured electrical resistivity of many geological materials, but its physical basis has not been fully understood yet. In this study, we use a pore-scale numerical approach combining discrete element-finite difference methods to study Archie's porosity exponent m of granular materials over a wide porosity range. Numerical results reveal that at dilute states (e.g., porosity ϕ > 65%), m is exclusively related to the particle shape and orientation. As the porosity decreases, the electric flow in pore space concentrates progressively near particle contacts and m increases continuously in response to the intensified nonuniformity of the local electrical field. It is also found that the increase in m is universally correlated with the volume fraction of pore throats for all the samples regardless of their particle shapes, particle size range, and porosities.
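
    Archie's relation for a brine-saturated rock (with unit tortuosity factor) reads F = R_o / R_w = phi^(-m), so m is simply the negative slope of log F versus log phi. The sketch below recovers m from synthetic data; the value m = 1.8 and the noise level are assumptions.

```python
import numpy as np

def archie_m(porosity, formation_factor):
    """Estimate Archie's porosity exponent m from F = phi**(-m)
    by a log-log least-squares fit."""
    slope, _ = np.polyfit(np.log(porosity), np.log(formation_factor), 1)
    return -slope

# Synthetic formation-factor data generated with m = 1.8 plus noise.
rng = np.random.default_rng(5)
phi = np.linspace(0.1, 0.4, 30)
F = phi ** -1.8 * rng.lognormal(0.0, 0.05, phi.size)
print(round(float(archie_m(phi, F)), 2))   # recovers ~1.8
```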

  18. Gender effect on the use of modified borg and visual analog scales in the evaluation of dyspnea in chronic obstructive pulmonary disease

    International Nuclear Information System (INIS)

    Ilgin, D.; Ozalevli, S.; Karaali, H.K.; Cimrin, A.H.; Ucan, E.S.

    2010-01-01

    To investigate the gender effect on the use of Modified Borg Scale (MBS) and Visual Analog Scale (VAS) for the effort dyspnea evaluation in Chronic Obstructive Pulmonary Disease (COPD) patients. Fifty-two patients with severe COPD were included in this study. Pulmonary function (spirometry), quality of life (Chronic Respiratory Disease Questionnaire-CRDQ), exercise capacity (6-minute walking test), and dyspnea severity (Modified Borg and Visual Analog Scales) were evaluated. The dyspnea severity scores were higher and walking distance was shorter in women (p<0.05). The scores of the both scales were correlated with each other in both genders (p<0.05). In men, the dyspnea scores obtained by MBS and VAS scales were significantly correlated with 6-minute walking distance (p=0.001) and total score of CRDQ (p=0.001). On the other hand, the dyspnea severity score of the women obtained by MBS was correlated with only the total score of CRDQ (p<0.05). The results of our study show that gender has an effect on dyspnea perception obtained by MBS and VAS. We suggest that MBS and VAS should be used for men whereas MBS may be more convenient for women in the evaluation of dyspnea in severe COPD. (author)

  19. Numerical forensic model for the diagnosis of a full-scale RC floor

    Directory of Open Access Journals (Sweden)

    Ahmed B. Shuraim

    Full Text Available The paper presents the results of an investigation on the diagnosis and assessment of a full-scale reinforced concrete floor utilizing a 3-D forensic model developed in the framework of plasticity-damage approach. Despite the advancement in nonlinear finite element formulations and models, there is a need to verify models on nontrivial challenging structures. Various standards on strengthening existing structures consider numerical diagnosis as a major stage involving safety and economical aspects. Accordingly, model validity is a major issue that should preferably be examined against realistic large-scale tests. This was done in this study by investigating a one-story joist floor with wide shallow beams supported on columns. The surveyed cracking patterns on the entire top side of the floor were reproduced by the forensic model to a reasonable degree in terms of orientation and general location. Concrete principal plastic tensile strain was shown to be a good indirect indicator of cracking patterns. However, identifying the underlying reasons of major cracks in the floor required correlating with other key field parameters including deflections, and internal moments. Therefore, the ability of the forensic model to reproduce the surveyed damage state of the floor provided a positive indication on the material models, spatial representation, and parameter selection. Such models can be used as forensic tools for assessing the existing conditions as required by various standards and codes.

  20. Large scale visualization on the Cray XT3 using ParaView.

    Energy Technology Data Exchange (ETDEWEB)

    Rogers, David; Geveci, Berk (Kitware, Inc.); Eschenbert, Kent (Pittsburgh Supercomputing Center); Neundorf, Alexander (Technical University of Kaiserslautern); Marion, Patrick (Kitware, Inc.); Moreland, Kenneth D.; Greenfield, John

    2008-05-01

    Post-processing and visualization are key components to understanding any simulation. Porting ParaView, a scalable visualization tool, to the Cray XT3 allows our analysts to leverage the same supercomputer they use for simulation to perform post-processing. Visualization tools traditionally rely on a variety of rendering, scripting, and networking resources; the challenge of running ParaView on the Lightweight Kernel is to provide and use the visualization and post-processing features in the absence of many OS resources. We have successfully accomplished this at Sandia National Laboratories and the Pittsburgh Supercomputing Center.

  1. Visualization of myocardial perfusion derived from coronary anatomy

    NARCIS (Netherlands)

    Termeer, M.A.; Bescos, J.O.; Breeuwer, M.; Vilanova, A.; Gerritsen, F.A.; Gröller, M.E.; Nagel, Eike

    2008-01-01

    Visually assessing the effect of the coronary artery anatomy on the perfusion of the heart muscle in patients with coronary artery disease remains a challenging task. We explore the feasibility of visualizing this effect on perfusion using a numerical approach. We perform a computational simulation

  2. Comparison of Intravenous Tramadol-Paracetamol versus Intravenous Tramadol-Ketorolac Combinations on Numeric Rating Scale Scores and Opioid Requirements after Hysterectomy

    Directory of Open Access Journals (Sweden)

    Dendi Karmena

    2015-12-01

    Full Text Available Postoperative pain is an important problem in surgery. This study aimed to compare the combination of intravenous tramadol-paracetamol and tramadol-ketorolac with respect to numeric rating scale (NRS) scores and postoperative opioid requirements in abdominal hysterectomy. A double-blind randomized controlled trial was conducted on 42 women (18–60 years) with ASA physical status I–II who underwent abdominal hysterectomy surgery under general anesthesia in Dr. Hasan Sadikin General Hospital Bandung within the period of August–November 2014. Subjects were divided into two groups: 21 subjects received a combination of intravenous tramadol-paracetamol and 21 subjects received a combination of intravenous tramadol-ketorolac, administered at peritoneal closure. The assessment of postoperative pain was performed using a numeric rating scale both at rest and during mobilization. Comparisons were made using the Mann-Whitney test. Results show that NRS values in the tramadol-paracetamol group and the tramadol-ketorolac group were not significantly different (p>0.05). This study concludes that the combinations of intravenous tramadol-paracetamol and tramadol-ketorolac are equivalent in terms of NRS scores and postoperative opioid requirements after abdominal hysterectomy.

  3. The diversity of the human hair colour assessed by visual scales and instrumental measurements. A worldwide survey.

    Science.gov (United States)

    Lozano, I; Saunier, J B; Panhard, S; Loussouarn, G

    2017-02-01

    To study (i) the diversity of the natural colour of human hair through both visual assessment of hair tone levels and colorimetric measurements of hair strands collected from 2057 male and female volunteers from 23 regions of the world and (ii) the correlation between visual assessments and colorimetric measurements. Hair strands were analysed by a spectrocolorimeter under the L*, a*, b* referential system and scored in vivo by experts before sampling, using standardized visual reference scales based on a 1-10 range. Results show that, from a typological aspect, black or dark brown hairs largely predominate among the studied ethnic groups, whereas Caucasian or derived populations exhibit the widest palette of medium to fair shades, partly explaining some past interbreeding among populations. Instrumental measurements clearly confirm that a given colour of pigmented hair, with the exclusion of red hairs, is mostly governed by two components, L* and b*, of the L*, a*, b* reference system. The comparisons between visual assessments and instrumental data show that these appear closely linked. Darker hairs show close or subtle variations in L*, a*, b* parameters, making their individual colour differentiation difficult and calling for technical improvements in colorimetric measurements. The latter are likely governed by other physical factors such as shape, diameter and shine. © 2016 Society of Cosmetic Scientists and the Société Française de Cosmétologie.

  4. Creation and validation of a visual macroscopic hematuria scale for optimal communication and an objective hematuria index.

    Science.gov (United States)

    Wong, Lih-Ming; Chum, Jia-Min; Maddy, Peter; Chan, Steven T F; Travis, Douglas; Lawrentschuk, Nathan

    2010-07-01

    Macroscopic hematuria is a common symptom and sign that is challenging to quantify and describe. The degree of hematuria communicated varies with health worker experience combined with the lack of a reliable grading tool. We produced a reliable, standardized visual scale to describe hematuria severity. Our secondary aim was to validate a new laboratory test to quantify hemoglobin in hematuria specimens. Nurses were surveyed to ascertain current hematuria descriptions. Blood and urine were titrated at varying concentrations and digitally photographed in catheter bag tubing. Photos were processed and printed on transparency paper to create a prototype swatch or card showing light, medium, heavy and old hematuria. Using the swatch, 60 samples were rated by nurses and laymen. Interobserver variability was reported using the generalized kappa coefficient of agreement. Specimens were analyzed for hemolysis by measuring optical density at oxyhemoglobin absorption peaks. Interobserver agreement between nurses and laymen was good (kappa = 0.51). A visual scale to grade and communicate hematuria with adequate interobserver agreement is feasible. The test for optical density at oxyhemoglobin absorption peaks is a new method, validated in our study, to quantify hemoglobin in a hematuria specimen. Copyright (c) 2010 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.

  5. Subjective assessment of acute mountain sickness: investigating the relationship between the Lake Louise Self-Report, a visual analogue scale and psychological well-being scales.

    Science.gov (United States)

    Frühauf, Anika; Burtscher, Martin; Pocecco, Elena; Faulhaber, Martin; Kopp, Martin

    2016-01-01

    There is an ongoing discussion about how to assess acute mountain sickness (AMS) under real-life conditions. In addition to multi-item scales with a cut-off, such as the Lake Louise Self-Report (LLS), some authors have suggested using visual analog scales (VAS) to assess AMS. This study tried to contribute to this question using VAS items adapted from the Subjective Ratings of Drug Effects, including an additional single item for AMS. Furthermore, we investigated whether instruments developed to assess psychological well-being might predict AMS assessed via the LLS or VAS. Thirty-two adults (19 female) with known AMS susceptibility filled in questionnaires (Feeling Scale, Felt Arousal Scale, Activation Deactivation Check List, LLS, VAS) at a height of 3650 m above sea level. Correlation and regression analyses suggest a moderate to high relationship between the LLS score and the VAS items, including one VAS item asking about the severity of AMS, as well as psychological well-being. In conclusion, using VAS items to assess AMS can be a more precise alternative to questionnaires like the LLS for people familiar with AMS. Furthermore, researchers should be aware that psychological well-being might be an important parameter influencing the assessment of AMS.

  6. Large-scale genomic 2D visualization reveals extensive CG-AT skew correlation in bird genomes

    Directory of Open Access Journals (Sweden)

    Deng Xuemei

    2007-11-01

    Full Text Available Abstract Background Bird genomes have very different compositional structure compared with other warm-blooded animals. The variation in the base skew rules in the vertebrate genomes remains puzzling, but it must relate somehow to large-scale genome evolution. Current research is inclined to relate base skew with mutations and their fixation. Here we wish to explore base skew correlations in bird genomes, to develop methods for displaying and quantifying such correlations at different scales, and to discuss possible explanations for the peculiarities of the bird genomes in skew correlation. Results We have developed a method called Base Skew Double Triangle (BSDT) for exhibiting the genome-scale change of AT/CG skew as a two-dimensional square picture, showing base skews at many scales simultaneously in a single image. By this method we found that most chicken chromosomes have high AT/CG skew correlation (symmetry in the 2D picture), except for some microchromosomes. No other organisms studied (18 species) show such high skew correlations. This visualized high correlation was validated by three kinds of quantitative calculations with overlapping and non-overlapping windows, all indicating that chicken and birds in general have a special genome structure. Similar features were also found in some of the mammal genomes, but clearly much weaker than in chickens. We presume that the skew correlation feature evolved near the time that birds separated from other vertebrate lineages. When we eliminated the repeat sequences from the genomes, the AT and CG skew correlation increased for some mammal genomes, but was still clearly lower than in chickens. Conclusion Our results suggest that BSDT is an expressive visualization method for AT and CG skew and enabled the discovery of the very high skew correlation in bird genomes; this peculiarity is worth further study. Computational analysis indicated that this correlation might be a compositional characteristic
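
    The window-level quantities behind such an analysis are simple: AT skew = (A - T)/(A + T) and CG skew = (C - G)/(C + G) per window, with their correlation summarizing the symmetry a BSDT-style picture displays at many scales. The sketch below uses a random sequence as a stand-in for a chromosome; the window size is an assumption.

```python
import numpy as np

def window_skews(seq, window=1000):
    """AT and CG skews in non-overlapping windows of a DNA sequence."""
    at, cg = [], []
    for start in range(0, len(seq) - window + 1, window):
        w = seq[start:start + window]
        a, t, c, g = (w.count(b) for b in "ATCG")
        at.append((a - t) / (a + t) if a + t else 0.0)
        cg.append((c - g) / (c + g) if c + g else 0.0)
    return np.array(at), np.array(cg)

# Random sequence standing in for a chromosome (an assumption); a real
# analysis would read the sequence from FASTA and scan many window sizes.
rng = np.random.default_rng(6)
seq = "".join(rng.choice(list("ATCG"), size=200_000))
at_skew, cg_skew = window_skews(seq)
print(round(float(np.corrcoef(at_skew, cg_skew)[0, 1]), 3))
```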

  7. Robust Visual Tracking Using the Bidirectional Scale Estimation

    Directory of Open Access Journals (Sweden)

    An Zhiyong

    2017-01-01

    Object tracking with robust scale estimation is a challenging task in computer vision. This paper presents a novel tracking algorithm that learns the translation and scale filters with a complementary scheme. The translation filter is constructed using ridge regression and multidimensional features. A robust scale filter is constructed by bidirectional scale estimation, including a forward scale and a backward scale. First, we learn the scale filter using the forward tracking information; the forward and backward scales can then be estimated using the respective scale filters. Second, a conservative strategy is adopted to compromise between the forward and backward scales. Finally, the scale filter is updated based on the final scale estimate. Updating the scale filter in this way is effective, since a stable scale estimate improves the filter's performance. To reveal the effectiveness of our tracker, experiments are performed on 32 sequences with significant scale variation and on the benchmark dataset with 50 challenging videos. Our results show that the proposed tracker outperforms several state-of-the-art trackers in terms of robustness and accuracy.
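    The ridge-regression translation filter mentioned above can be realized as a standard closed-form correlation filter learned in the Fourier domain (a MOSSE/KCF-style construction); the sketch below shows that, plus a hypothetical conservative rule for compromising between forward and backward scale estimates, since the abstract does not spell out the authors' exact rule:

```python
import numpy as np

def learn_correlation_filter(patch, target_response, lam=1e-2):
    """Closed-form ridge-regression (correlation) filter in the Fourier domain."""
    F = np.fft.fft2(patch)
    G = np.fft.fft2(target_response)
    # H = (G . conj(F)) / (F . conj(F) + lambda)
    return (G * np.conj(F)) / (F * np.conj(F) + lam)

def respond(filter_hat, patch):
    """Correlation response of a candidate patch under the learned filter."""
    return np.real(np.fft.ifft2(filter_hat * np.fft.fft2(patch)))

def compromise_scale(forward_scale, backward_scale):
    # Hypothetical conservative rule: keep the estimate closer to "no change" (1.0);
    # the paper's exact compromise strategy is not specified in the abstract.
    return min(forward_scale, backward_scale, key=lambda s: abs(s - 1.0))

# Illustrative usage with a synthetic patch and a Gaussian target response.
patch = np.random.default_rng(2).normal(size=(64, 64))
yy, xx = np.mgrid[-32:32, -32:32]
target = np.exp(-(xx**2 + yy**2) / (2 * 3.0**2))
H = learn_correlation_filter(patch, target)
print(respond(H, patch).max(), compromise_scale(1.08, 0.97))
```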

  8. QuickRNASeq lifts large-scale RNA-seq data analyses to the next level of automation and interactive visualization.

    Science.gov (United States)

    Zhao, Shanrong; Xi, Li; Quan, Jie; Xi, Hualin; Zhang, Ying; von Schack, David; Vincent, Michael; Zhang, Baohong

    2016-01-08

    RNA sequencing (RNA-seq), a next-generation sequencing technique for transcriptome profiling, is being used increasingly, driven in part by the decreasing cost of sequencing. Nevertheless, the analysis of the massive amounts of data generated by large-scale RNA-seq remains a challenge. Multiple algorithms pertinent to basic analyses have been developed, and there is an increasing need to automate the use of these tools so as to obtain results in an efficient and user-friendly manner. Increased automation and improved visualization of the results will help make the findings of the analyses readily available to experimental scientists. By combining the best open-source tools developed for RNA-seq data analyses and the most advanced web 2.0 technologies, we have implemented QuickRNASeq, a pipeline for large-scale RNA-seq data analyses and visualization. The QuickRNASeq workflow consists of three main steps. In Step #1, each individual sample is processed, including mapping RNA-seq reads to a reference genome, counting the numbers of mapped reads, quality control of the aligned reads, and SNP (single nucleotide polymorphism) calling. Step #1 is computationally intensive and can be processed in parallel. In Step #2, the results from individual samples are merged, and an integrated and interactive project report is generated. All analysis results in the report are accessible via a single HTML entry webpage. Step #3 is the data interpretation and presentation step. The rich visualization features implemented here allow end users to interactively explore the results of RNA-seq data analyses and to gain more insights into RNA-seq datasets. In addition, we used a real-world dataset to demonstrate the simplicity and efficiency of QuickRNASeq in RNA-seq data analyses and interactive visualizations. The seamless integration of automated capabilities with interactive visualizations in QuickRNASeq is not available in other published RNA-seq pipelines. The high degree

  9. From the direct numerical simulation to system codes-perspective for the multi-scale analysis of LWR thermal hydraulics

    International Nuclear Information System (INIS)

    Bestion, D.

    2010-01-01

    A multi-scale analysis of water-cooled reactor thermal hydraulics can take advantage of increased computer power and improved simulation tools, including Direct Numerical Simulation (DNS), Computational Fluid Dynamics (CFD) (in both open and porous media), and system thermalhydraulic codes. This paper presents a general strategy for this procedure across the various thermalhydraulic scales. A short state of the art is given for each scale, and the role of each scale in the overall multi-scale analysis process is defined. System thermalhydraulic codes will remain a privileged tool for many investigations related to safety. CFD in porous media is already frequently used for core thermal hydraulics, either in 3D modules of system codes or in component codes. CFD in open media allows zooming in on some reactor components in specific situations, and may be coupled to the system and component scales. Various modeling approaches exist in the domain from DNS to CFD, which may be used to improve the understanding of flow processes and as a basis for developing more physically based models for macroscopic tools. A few examples are given to illustrate the multi-scale approach. Perspectives for the future are drawn from the present state of the art, and directions for future research and development are given

  10. Ultrascale Visualization of Climate Data

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Dean N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Bremer, Peer-Timo [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Doutriaux, Charles [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Patchett, John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Williams, Sean [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Shipman, Galen M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Miller, Ross G. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Pugmire, Dave [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Smith, Brian E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Steed, Chad A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bethel, E. Wes [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Childs, Hank [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Krishnan, Harinarayan [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Silva, Claudio T. [New York University, New York, NY (United States). Center for Urban Sciences; Santos, Emanuele [Universidade Federal do Ceara, Ceara (Brazil); Koop, David [New York University, New York, NY (United States); Ellqvist, Tommy [New York University, New York, NY (United States); Poco, Jorge [Polytechnic Institute of New York University, New York, NY (United States); Geveci, Berk [Kitware Inc., Clifton Park, NY (United States); Chaudhary, Aashish [Kitware Inc., Clifton Park, NY (United States); Bauer, Andy [Kitware Inc., Clifton Park, NY (United States); Pletzer, Alexander [Tech-X Corporation, Boulder, CO (United States); Kindig, Dave [Tech-X Corporation, Boulder, CO (United States); Potter, Gerald [National Aeronautics and Space Administration (NASA), Washington, DC (United States); Maxwell, Thomas P. [National Aeronautics and Space Administration (NASA), Washington, DC (United States)

    2013-09-01

    To support interactive visualization and analysis of complex, large-scale climate data sets, UV-CDAT integrates a powerful set of scientific computing libraries and applications to foster more efficient knowledge discovery. Connected through a provenance framework, the UV-CDAT components can be loosely coupled for fast integration or tightly coupled for greater functionality and communication with other components. This framework addresses many challenges in the interactive visual analysis of distributed large-scale data for the climate community.

  11. Numerical Simulation of the Time Evolution of Small-Scale Irregularities in the F-Layer Ionospheric Plasma

    Directory of Open Access Journals (Sweden)

    O. V. Mingalev

    2011-01-01

    The dynamics of magnetic field-aligned small-scale irregularities in the electron concentration of the F-layer ionospheric plasma is investigated with the help of a mathematical model. The plasma is assumed to be a rarefied mixture of electrons and positive ions in a strong external magnetic field. In the applied model, kinetic processes in the plasma are simulated using the Vlasov-Poisson system of equations, which is solved numerically with a macroparticle method. The time evolution of a plasma irregularity whose initial cross-section dimension is comparable to the Debye length is simulated over a period sufficient for the irregularity to decay completely. The results of the simulation indicate that a small-scale irregularity created initially in the F-region ionosphere decays through periodic damped oscillations, with the process being collisionless.
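    The macroparticle (particle-in-cell) treatment of the Vlasov-Poisson system can be illustrated with a minimal one-dimensional, periodic, unmagnetized electrostatic sketch (deposit macroparticle charge on a grid, solve Poisson spectrally, gather the field, push the particles); this is a generic simplification for illustration, not the authors' model:

```python
import numpy as np

# Minimal 1D electrostatic particle-in-cell (macroparticle) sketch.
# Normalized units: plasma frequency = 1, immobile neutralizing ion background.
ng, L, npart, dt, steps = 64, 2 * np.pi, 20_000, 0.1, 200
dx = L / ng
rng = np.random.default_rng(3)

x = rng.uniform(0, L, npart)            # macroparticle positions
v = rng.normal(0, 0.05, npart)          # thermal velocities
x += 0.01 * np.sin(2 * np.pi * x / L)   # small initial density perturbation
x %= L
qm, weight = -1.0, L / npart            # electron charge-to-mass ratio, particle weight

k = 2 * np.pi * np.fft.rfftfreq(ng, d=dx)

for step in range(steps):
    # Cloud-in-cell charge deposition onto the grid.
    g = x / dx
    i0 = np.floor(g).astype(int) % ng
    frac = g - np.floor(g)
    rho = np.zeros(ng)
    np.add.at(rho, i0, (1 - frac) * weight / dx)
    np.add.at(rho, (i0 + 1) % ng, frac * weight / dx)
    rho = 1.0 - rho                      # net charge: ion background minus electrons

    # Solve Poisson's equation spectrally and obtain the electric field E.
    rho_hat = np.fft.rfft(rho)
    E_hat = np.zeros_like(rho_hat)
    E_hat[1:] = -1j * rho_hat[1:] / k[1:]
    E = np.fft.irfft(E_hat, n=ng)

    # Gather the field at particle positions and push the particles.
    Ep = (1 - frac) * E[i0] + frac * E[(i0 + 1) % ng]
    v += qm * Ep * dt
    x = (x + v * dt) % L

print("final field energy:", 0.5 * np.sum(E**2) * dx)
```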

  12. Large-scale numerical simulations on two-phase flow behavior in a fuel bundle of RMWR with the earth simulator

    International Nuclear Information System (INIS)

    Kazuyuki, Takase; Hiroyuki, Yoshida; Hidesada, Tamai; Hajime, Akimoto; Yasuo, Ose

    2003-01-01

    Fluid flow characteristics in a fuel bundle of a reduced-moderation light water reactor (RMWR) with a tight-lattice core were analyzed numerically, using a newly developed two-phase flow analysis code, under the full bundle size condition. Conventional analysis methods such as sub-channel codes need constitutive equations based on experimental data; since no experimental data on the thermal-hydraulics of the tight-lattice core exist, it is difficult to obtain high prediction accuracy for the thermal design of the RMWR with such methods. Direct numerical simulations with the Earth Simulator were therefore chosen. The axial velocity distribution in a fuel bundle changes sharply around a grid spacer, and its quantitative evaluation was obtained from the present preliminary numerical study. These results give a high prospect of establishing the thermal design procedure of the RMWR by large-scale direct simulations. (authors)

  13. Temporal windows in visual processing: "prestimulus brain state" and "poststimulus phase reset" segregate visual transients on different temporal scales.

    Science.gov (United States)

    Wutz, Andreas; Weisz, Nathan; Braun, Christoph; Melcher, David

    2014-01-22

    Dynamic vision requires both stability of the current perceptual representation and sensitivity to the accumulation of sensory evidence over time. Here we study the electrophysiological signatures of this intricate balance between temporal segregation and integration in vision. Within a forward masking paradigm with short and long stimulus onset asynchronies (SOA), we manipulated the temporal overlap of the visual persistence of two successive transients. Human observers enumerated the items presented in the second target display as a measure of the informational capacity read-out from this partly temporally integrated visual percept. We observed higher β-power immediately before mask display onset in incorrect trials, in which enumeration failed due to stronger integration of mask and target visual information. This effect was timescale specific, distinguishing between segregation and integration of visual transients that were distant in time (long SOA). Conversely, for short SOA trials, mask onset evoked a stronger visual response when mask and targets were correctly segregated in time. Examination of the target-related response profile revealed the importance of an evoked α-phase reset for the segregation of those rapid visual transients. Investigating this precise mapping of the temporal relationships of visual signals onto electrophysiological responses highlights how the stream of visual information is carved up into discrete temporal windows that mediate between segregated and integrated percepts. Fragmenting the stream of visual information provides a means to stabilize perceptual events within one instant in time.

  14. BDNF Variants May Modulate Long-Term Visual Memory Performance in a Healthy Cohort.

    Science.gov (United States)

    Avgan, Nesli; Sutherland, Heidi G; Spriggens, Lauren K; Yu, Chieh; Ibrahim, Omar; Bellis, Claire; Haupt, Larisa M; Shum, David H K; Griffiths, Lyn R

    2017-03-17

    Brain-derived neurotrophic factor (BDNF) is involved in numerous cognitive functions including learning and memory. BDNF plays an important role in synaptic plasticity in humans and rats, with BDNF shown to be essential for the formation of long-term memories. We previously identified a significant association between the BDNF Val66Met polymorphism (rs6265) and long-term visual memory (p-value = 0.003) in a small cohort (n = 181) of healthy individuals who had been phenotyped for various aspects of memory function. In this study, we have extended the cohort to 597 individuals and examined multiple genetic variants across both the BDNF and BDNF-AS genes for association with visual memory performance as assessed by the Wechsler Memory Scale-Fourth Edition subtests Visual Reproduction I and II (VR I and II). VR I assesses immediate visual memory, whereas VR II assesses long-term visual memory. Genetic association analyses were performed for 34 single nucleotide polymorphisms genotyped on Illumina OmniExpress BeadChip arrays with the immediate and long-term visual memory phenotypes. While none of the BDNF and BDNF-AS variants were shown to be significant for immediate visual memory, we found 10 variants (including the Val66Met polymorphism (p-value = 0.006)) that were nominally associated, and three variants (two in BDNF and one in the BDNF-AS locus) that were significantly associated with long-term visual memory. Our data therefore suggest a potential role for BDNF, and its antisense transcript BDNF-AS, in long-term visual memory performance.

  15. Visual Iconicity Across Sign Languages: Large-Scale Automated Video Analysis of Iconic Articulators and Locations

    Science.gov (United States)

    Östling, Robert; Börstell, Carl; Courtaux, Servane

    2018-01-01

    We use automatic processing of 120,000 sign videos in 31 different sign languages to show a cross-linguistic pattern for two types of iconic form–meaning relationships in the visual modality. First, we demonstrate that the degree of inherent plurality of concepts, based on individual ratings by non-signers, strongly correlates with the number of hands used in the sign forms encoding the same concepts across sign languages. Second, we show that certain concepts are iconically articulated around specific parts of the body, as predicted by the associational intuitions of non-signers. The implications of our results are both theoretical and methodological. With regard to theoretical implications, we corroborate previous research by demonstrating and quantifying, using a much larger body of material than previously available, the iconic nature of languages in the visual modality. As for the methodological implications, we show how automatic methods are, in fact, useful for performing large-scale analysis of sign language data, to a high level of accuracy, as indicated by our manual error analysis.

  16. Robust visual tracking via multiscale deep sparse networks

    Science.gov (United States)

    Wang, Xin; Hou, Zhiqiang; Yu, Wangsheng; Xue, Yang; Jin, Zefenfen; Dai, Bo

    2017-04-01

    In visual tracking, deep learning with offline pretraining can extract more intrinsic and robust features, and it has had significant success in addressing tracking drift in complicated environments. However, offline pretraining requires numerous auxiliary training datasets and is considerably time-consuming for tracking tasks. To solve these problems, a multiscale sparse networks-based tracker (MSNT) under the particle filter framework is proposed. Based on stacked sparse autoencoders and rectified linear units, the tracker has a flexible and adjustable architecture without the offline pretraining process and exploits robust and powerful features effectively through online training on limited labeled data only. Meanwhile, the tracker builds four deep sparse networks of different scales, according to the target's profile type. During tracking, the tracker adaptively selects the matched tracking network in accordance with the initial target's profile type; this preserves the inherent structural information more efficiently than single-scale networks. Additionally, a corresponding update strategy is proposed to improve the robustness of the tracker. Extensive experimental results on a large-scale benchmark dataset show that the proposed method performs favorably against state-of-the-art methods in challenging environments.

  17. A Numerical Study of Galaxy Formation and the Large Scale Structure of the Universe : Astrophysics and Relativity

    OpenAIRE

    Kazuyuki, YAMASHITA; Department of Physics, Kyoto University

    1993-01-01

    We investigate the thermodynamical and hydrodynamical effects on structure formation on scales of 20 h^{-1} Mpc in the Einstein-de Sitter universe by three-dimensional numerical simulation. The calculations involve cosmological expansion, self-gravity, hydrodynamics, and cooling processes with 100×100×100 mesh cells and the same number of CDM particles. Galactic bursts from young galaxies are taken into account parametrically as a heat input. We find that the thermodynamics of the intergalactic ...

  18. Hybrid numerical methods for multiscale simulations of subsurface biogeochemical processes

    International Nuclear Information System (INIS)

    Scheibe, T D; Tartakovsky, A M; Tartakovsky, D M; Redden, G D; Meakin, P

    2007-01-01

    Many subsurface flow and transport problems of importance today involve coupled non-linear flow, transport, and reaction in media exhibiting complex heterogeneity. In particular, problems involving biological mediation of reactions fall into this class of problems. Recent experimental research has revealed important details about the physical, chemical, and biological mechanisms involved in these processes at a variety of scales ranging from molecular to laboratory scales. However, it has not been practical or possible to translate detailed knowledge at small scales into reliable predictions of field-scale phenomena important for environmental management applications. A large assortment of numerical simulation tools have been developed, each with its own characteristic scale. Important examples include 1. molecular simulations (e.g., molecular dynamics); 2. simulation of microbial processes at the cell level (e.g., cellular automata or particle individual-based models); 3. pore-scale simulations (e.g., lattice-Boltzmann, pore network models, and discrete particle methods such as smoothed particle hydrodynamics); and 4. macroscopic continuum-scale simulations (e.g., traditional partial differential equations solved by finite difference or finite element methods). While many problems can be effectively addressed by one of these models at a single scale, some problems may require explicit integration of models across multiple scales. We are developing a hybrid multi-scale subsurface reactive transport modeling framework that integrates models with diverse representations of physics, chemistry and biology at different scales (sub-pore, pore and continuum). The modeling framework is being designed to take advantage of advanced computational technologies including parallel code components using the Common Component Architecture, parallel solvers, gridding, data and workflow management, and visualization. This paper describes the specific methods/codes being used at each

  19. Modes of Power in Technical and Professional Visuals.

    Science.gov (United States)

    Barton, Ben F.; Barton, Marthalee S.

    1993-01-01

    Treats visuals as sites of power inscription. Advances a Foucauldian design model based on the Panopticon--Jeremy Bentham's architectural figure for empowerment based on bimodal surveillance. Notes that numerous examples demonstrate that maximum effectiveness results when visuals foster simultaneous viewing in the two panoptic modes,…

  20. Large-scale numerical simulations of plasmas

    International Nuclear Information System (INIS)

    Hamaguchi, Satoshi

    2004-01-01

    The recent trend toward large-scale simulations of fusion plasmas and processing plasmas is briefly summarized. Many advanced simulation techniques have been developed for fusion plasmas, and some of these techniques are now applied to analyses of processing plasmas. (author)

  1. Large Scale Topographic Maps Generalisation and Visualization Based on New Methodology

    OpenAIRE

    Dinar, Ilma; Ključanin, Slobodanka; Poslončec-Petrić, Vesna

    2015-01-01

    Integrating spatial data from different sources results in visualization, which is the last step in the process of creating digital basic topographic maps. The sources used for visualization are the existing real estate cadastre database, orthophoto plans, and digital terrain models. Analogue cadastre plans were scanned and georeferenced according to existing regulations and used for toponyms. Visualization of topologically inspected geometric primitives was performed based on the ''Collection of cartog...

  2. Advancing Clouds Lifecycle Representation in Numerical Models Using Innovative Analysis Methods that Bridge ARM Observations and Models Over a Breadth of Scales

    Energy Technology Data Exchange (ETDEWEB)

    Kollias, Pavlos [McGill Univ., Montreal, QC (Canada)]

    2016-09-06

    This is the final report for DE-SC0007096 - Advancing Clouds Lifecycle Representation in Numerical Models Using Innovative Analysis Methods that Bridge ARM Observations and Models Over a Breadth of Scales - PI: Pavlos Kollias. The report outlines the main findings of the research conducted under the aforementioned award in the area of cloud research, from the cloud scale (10-100 m) to the mesoscale (20-50 km).

  3. Development of Pelton turbine using numerical simulation

    International Nuclear Information System (INIS)

    Patel, K; Patel, B; Yadav, M; Foggia, T

    2010-01-01

    This paper describes recent research and development activities in the field of Pelton turbine design. The flow inside a Pelton turbine is highly complex because of its multiphase (mixture of air and water) and free-surface nature. Numerical calculation is useful for understanding the flow physics as well as the effect of geometry on the flow. The optimized design is obtained using an in-house optimization loop. Either single-phase or two-phase unsteady numerical calculations can be performed. Numerical results are used to visualize the flow pattern in the water passage and to predict the performance of the Pelton turbine at full load as well as at part load. Model tests were conducted to determine the performance of the turbine and show good agreement with the numerically predicted performance.

  4. Development of Pelton turbine using numerical simulation

    Energy Technology Data Exchange (ETDEWEB)

    Patel, K; Patel, B; Yadav, M [Hydraulic Engineer, ALSTOM Hydro R and D India Ltd., GIDC Maneja, Vadodara - 390 013, Gujarat (India); Foggia, T, E-mail: patel@power.alstom.co [Hydraulic Engineer, Alstom Hydro France, Etablissement de Grenoble, 82, avenue Leon Blum BP 75, 38041 Grenoble Cedex (France)

    2010-08-15

    This paper describes recent research and development activities in the field of Pelton turbine design. The flow inside a Pelton turbine is highly complex because of its multiphase (mixture of air and water) and free-surface nature. Numerical calculation is useful for understanding the flow physics as well as the effect of geometry on the flow. The optimized design is obtained using an in-house optimization loop. Either single-phase or two-phase unsteady numerical calculations can be performed. Numerical results are used to visualize the flow pattern in the water passage and to predict the performance of the Pelton turbine at full load as well as at part load. Model tests were conducted to determine the performance of the turbine and show good agreement with the numerically predicted performance.

  5. Development of Pelton turbine using numerical simulation

    Science.gov (United States)

    Patel, K.; Patel, B.; Yadav, M.; Foggia, T.

    2010-08-01

    This paper describes recent research and development activities in the field of Pelton turbine design. The flow inside a Pelton turbine is highly complex because of its multiphase (mixture of air and water) and free-surface nature. Numerical calculation is useful for understanding the flow physics as well as the effect of geometry on the flow. The optimized design is obtained using an in-house optimization loop. Either single-phase or two-phase unsteady numerical calculations can be performed. Numerical results are used to visualize the flow pattern in the water passage and to predict the performance of the Pelton turbine at full load as well as at part load. Model tests were conducted to determine the performance of the turbine and show good agreement with the numerically predicted performance.

  6. Visual assessment of BIPV retrofit design proposals for selected historical buildings using the saliency map method

    Directory of Open Access Journals (Sweden)

    Ran Xu

    2015-06-01

    With the increasing awareness of energy efficiency, many old buildings have to undergo massive facade energy retrofits. How to predict the visual impact that solar installations have on the aesthetic and cultural value of these buildings has been a heated debate in Switzerland (and throughout the world). The usual evaluation method for describing the visual impact of BIPV is based on semantic and qualitative descriptors and depends strongly on personal preferences; the evaluation scale is therefore relative, flexible and imprecise. This paper proposes a new method to accurately measure the visual impact that BIPV installations have on a historical building by using the saliency map method. By imitating the working principles of the human eye, it measures how much the BIPV design proposals differ from the original building facade in terms of attracting human visual attention. The result is presented directly in a quantitative manner and can be used to compare the fitness of different BIPV design proposals. The measuring process is numeric, objective and more precise.
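    A minimal sketch of a saliency-based comparison in this spirit, using the classic spectral-residual saliency algorithm rather than the authors' specific attention model; the synthetic facade images and the impact metric (mean absolute saliency difference) are illustrative assumptions:

```python
import numpy as np
from scipy import ndimage

def spectral_residual_saliency(gray: np.ndarray) -> np.ndarray:
    """Spectral-residual saliency map of a grayscale image, scaled to 0..1."""
    spectrum = np.fft.fft2(gray)
    log_amp = np.log(np.abs(spectrum) + 1e-8)
    phase = np.angle(spectrum)
    residual = log_amp - ndimage.uniform_filter(log_amp, size=3)
    sal = np.abs(np.fft.ifft2(np.exp(residual + 1j * phase))) ** 2
    sal = ndimage.gaussian_filter(sal, sigma=2.5)
    return (sal - sal.min()) / (sal.max() - sal.min() + 1e-12)

def visual_impact(original_gray, proposal_gray):
    """Quantify how much a retrofit proposal shifts visual attention (illustrative metric)."""
    return float(np.mean(np.abs(spectral_residual_saliency(proposal_gray)
                                - spectral_residual_saliency(original_gray))))

# Illustrative usage with synthetic images; real use would load facade photographs.
rng = np.random.default_rng(4)
facade = rng.random((256, 256))
proposal = facade.copy()
proposal[100:160, 80:200] = 0.1          # hypothetical dark PV-panel area
print(f"impact score: {visual_impact(facade, proposal):.4f}")
```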

  7. Efficient Feature-Driven Visualization of Large-Scale Scientific Data

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Aidong

    2012-12-12

    The very large, complex scientific data acquired in many research areas create critical challenges for scientists seeking to understand, analyze, and organize their data. The objective of this project is to expand feature extraction and analysis capabilities to develop powerful and accurate visualization tools that can assist domain scientists with their requirements in multiple phases of scientific discovery. We have recently developed several feature-driven visualization methods for extracting different data characteristics of volumetric datasets. Our results verify the hypothesis in the proposal and will be used to develop additional prototype systems.

  8. Evaluation of a visual risk communication tool: effects on knowledge and perception of blood transfusion risk.

    Science.gov (United States)

    Lee, D H; Mehta, M D

    2003-06-01

    Effective risk communication in transfusion medicine is important for health-care consumers, but understanding the numerical magnitude of risks can be difficult. The objective of this study was to determine the effect of a visual risk communication tool on the knowledge and perception of transfusion risk. Laypeople were randomly assigned to receive transfusion risk information with either a written or a visual presentation format for communicating and comparing the probabilities of transfusion risks relative to other hazards. Knowledge of transfusion risk was ascertained with a multiple-choice quiz and risk perception was ascertained by psychometric scaling and principal components analysis. Two hundred subjects were recruited and randomly assigned. Risk communication with both written and visual presentation formats increased knowledge of transfusion risk and decreased the perceived dread and severity of transfusion risk. Neither format changed the perceived knowledge and control of transfusion risk, nor the perceived benefit of transfusion. No differences in knowledge or risk perception outcomes were detected between the groups randomly assigned to written or visual presentation formats. Risk communication that incorporates risk comparisons in either written or visual presentation formats can improve knowledge and reduce the perception of transfusion risk in laypeople.

  9. Attention biases visual activity in visual short-term memory.

    Science.gov (United States)

    Kuo, Bo-Cheng; Stokes, Mark G; Murray, Alexandra M; Nobre, Anna Christina

    2014-07-01

    In the current study, we tested whether representations in visual STM (VSTM) can be biased via top-down attentional modulation of visual activity in retinotopically specific locations. We manipulated attention using retrospective cues presented during the retention interval of a VSTM task. Retrospective cues triggered activity in a large-scale network implicated in attentional control and led to retinotopically specific modulation of activity in early visual areas V1-V4. Importantly, shifts of attention during VSTM maintenance were associated with changes in functional connectivity between pFC and retinotopic regions within V4. Our findings provide new insights into top-down control mechanisms that modulate VSTM representations for flexible and goal-directed maintenance of the most relevant memoranda.

  10. Numerical simulation of tsunami-scale wave boundary layers

    DEFF Research Database (Denmark)

    Williams, Isaac A.; Fuhrman, David R.

    2016-01-01

    This paper presents a numerical study of the boundary layer flow and properties induced by tsunami-scale waves. For this purpose, an existing one-dimensional vertical (1DV) boundary layer model, based on the horizontal component of the incompressible Reynolds-averaged Navier–Stokes (RANS) equation...

  11. Experimental and numerical modelling of ductile crack propagation in large-scale shell structures

    DEFF Research Database (Denmark)

    Simonsen, Bo Cerup; Törnquist, R.

    2004-01-01

    This paper presents a combined experimental-numerical procedure for the development and calibration of macroscopic crack propagation criteria in large-scale shell structures. A novel experimental set-up is described in which a mode-I crack can be driven 400 mm through a 20(+) mm thick plate under fully plastic and controlled conditions. The test specimen can be deformed either in combined in-plane bending and extension or in pure extension. Experimental results are described for 5 and 10 mm thick aluminium and steel plates. By performing an inverse finite-element analysis of the experimental results ... crack propagation criteria are derived for steel and aluminium plates, mainly as curves showing the critical element deformation versus the shell element size. These derived crack propagation criteria are then validated against a separate set of experiments considering centre crack specimens (CCS) which have a different crack-tip constraint...

  12. Innovations in Measuring Peer Conflict Resolution Knowledge in Children with LI: Exploring the Accessibility of a Visual Analogue Rating Scale

    Science.gov (United States)

    Campbell, Wenonah N.; Skarakis-Doyle, Elizabeth

    2011-01-01

    This preliminary study explored peer conflict resolution knowledge in children with and without language impairment (LI). Specifically, it evaluated the utility of a visual analogue scale (VAS) for measuring nuances in such knowledge. Children aged 9-12 years, 26 with typically developing language (TLD) and 6 with LI, completed a training protocol…

  13. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Year-end report FY17.

    Energy Technology Data Exchange (ETDEWEB)

    Moreland, Kenneth D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pugmire, David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Rogers, David [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Childs, Hank [Univ. of Oregon, Eugene, OR (United States); Ma, Kwan-Liu [Univ. of California, Davis, CA (United States); Geveci, Berk [Kitware, Inc., Clifton Park, NY (United States)

    2017-10-01

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  14. Novel mathematical neural models for visual attention

    DEFF Research Database (Denmark)

    Li, Kang

    Visual attention has been extensively studied in psychology, but some fundamental questions remain controversial. We focus on two questions in this study. First, we investigate how a neuron in visual cortex responds to multiple stimuli inside the receptive field, described by either a response ... Statistical inference and model selection are performed, and various numerical methods are explored, for the visual attention theories and spiking neuron models for single spike trains. The designed methods also give a framework for neural coding under visual attention theories. We conduct both analysis on real ... system, supported by simulation study. Finally, we present the decoding of multiple temporal stimuli under these visual attention theories, also in a realistic biophysical situation with simulations.

  15. Numerical studies of the linear theta pinch

    International Nuclear Information System (INIS)

    Brackbill, J.U.; Menzel, M.T.; Barnes, D.C.

    1975-01-01

    Aspects of several physical problems associated with linear theta pinches were studied using recently developed numerical methods for the solution of the nonlinear equations for time-dependent magnetohydrodynamic flow in two- and three-dimensions. The problems studied include the propagation of end-loss produced rarefaction waves, the flow produced in a proposed injection experiment geometry, and the linear growth and nonlinear saturation of instabilities in rotating plasmas, all in linear geometries. The studies illustrate how numerical computations aid in flow visualization, and how the small amplitude behavior and nonlinear fate of plasmas in unstable equilibria can be connected through the numerical solution of the dynamical equations. (auth)

  16. Development of real time visual evaluation system for sodium transient thermohydraulic experiments

    International Nuclear Information System (INIS)

    Tanigawa, Shingo

    1990-01-01

    A real-time visual evaluation system, the Liquid Metal Visual Evaluation System (LIVES), has been developed for the Plant Dynamics Test Loop facility at the O-arai Engineering Center. This facility is designed to provide sodium transient thermohydraulic experimental data not only for a fuel subassembly but also for a plant-wide system simulating abnormal or accident conditions in liquid metal fast breeder reactors. Since liquid sodium is opaque, measurements are mainly conducted with numerous thermocouples installed at various locations in the test sections and the facility. The transient thermohydraulic phenomena result from complicated interactions among global- and local-scale three-dimensional phenomena and among short- and long-time-scale phenomena. It is therefore difficult to grasp the thermohydraulic behavior intuitively, and to observe both the temperature distribution and the flow condition accurately, from digital or analog data alone when evaluating the experimental results. To conduct sodium transient experiments effectively and to observe the thermohydraulic phenomena precisely, a real-time visualization technique for transient thermohydraulics has been developed using the latest engineering workstation. The system makes it possible to observe and compare the experimental and analytical results instantly while an experiment or analysis is in progress. The results are shown not only as time-trend curves but also as graphic animations. This paper presents an outline of the system and sample applications. (author)

  17. Flow visualization on a natural circulation inter-wrapper flow. Experimental and numerical results under a geometric condition of button type spacer pads

    Energy Technology Data Exchange (ETDEWEB)

    Yasuda, A.; Miyakoshi, H.; Hayashi, K.; Nishimura, M.; Kamide, H.; Hishida, K. [Japan Nuclear Cycle Development Inst., Oarai, Ibaraki (Japan). Oarai Engineering Center

    1999-04-01

    Investigations on the inter-wrapper flow (IWF) in a liquid metal cooled fast breeder reactor core have been carried out. The IWF is a natural circulation flow between wrapper tubes in the core barrel, where cold fluid comes from a direct heat exchanger (DHX) in the upper plenum. A sodium experiment using a 7-subassembly core model showed that the IWF can cool the subassemblies. To clarify the thermal-hydraulic characteristics of the IWF in the core, a water experiment was performed using a flow visualization technique. The test rig for IWF (TRIF) has a core simulating the fuel subassemblies and radial reflectors. The subassemblies are constructed with transparent heaters to enable both Joule heating and flow visualization. The transparent heater was made of glass with a thin conductive film coating of tin oxide, and the glass heater was embedded in the wall of the modeled wrapper tube made of acrylic plexiglass. In the present experiment, the influences of peripheral geometric parameters, such as the flow holes of the core formers, on the thermal-hydraulic field were investigated with button-type spacer pads on the wrapper tube. Through the water tests, flow patterns of the IWF were revealed and velocity fields were quantitatively measured with particle image velocimetry (PIV). No substantial influence of the peripheral geometry was found on the temperature field of the IWF, as far as the button-type spacer pad was applied. Numerical simulation was applied to the analysis of the IWF experiments using a multidimensional code with a porous body model. The numerical results reproduced the flow patterns within TRIF and agreed well with the experimental temperature distributions, showing the capability of predicting IWF with a porous body model. (author)

  18. Auditory and visual memory in musicians and nonmusicians

    OpenAIRE

    Cohen, Michael A.; Evans, Karla K.; Horowitz, Todd S.; Wolfe, Jeremy M.

    2011-01-01

    Numerous studies have shown that musicians outperform nonmusicians on a variety of tasks. Here we provide the first evidence that musicians have superior auditory recognition memory for both musical and nonmusical stimuli, compared to nonmusicians. However, this advantage did not generalize to the visual domain. Previously, we showed that auditory recognition memory is inferior to visual recognition memory. Would this be true even for trained musicians? We compared auditory and visual memory ...

  19. Wettability effect on capillary trapping of supercritical CO2 at pore-scale: micromodel experiment and numerical modeling

    Science.gov (United States)

    Hu, R.; Wan, J.

    2015-12-01

    Wettability of reservoir minerals along pore surfaces plays a controlling role in capillary trapping of supercritical (sc) CO2 in geologic carbon sequestration. The mechanisms controlling scCO2 residual trapping are still not fully understood. We studied the effect of pore surface wettability on CO2 residual saturation at the pore-scale using engineered high pressure and high temperature micromodel (transparent pore networks) experiments and numerical modeling. Through chemical treatment of the micromodel pore surfaces, water-wet, intermediate-wet, and CO2-wet micromodels can be obtained. Both drainage and imbibition experiments were conducted at 8.5 MPa and 45 °C with controlled flow rate. Dynamic images of fluid-fluid displacement processes were recorded using a microscope with a CCD camera. Residual saturations were determined by analysis of late stage imbibition images of flow path structures. We performed direct numerical simulations of the full Navier-Stokes equations using a volume-of-fluid based finite-volume framework for the primary drainage and the followed imbibition for the micromodel experiments with different contact angles. The numerical simulations agreed well with our experimental observations. We found that more scCO2 can be trapped within the CO2-wet micromodel whereas lower residual scCO2 saturation occurred within the water-wet micromodels in both our experiments and the numerical simulations. These results provide direct and consistent evidence of the effect of wettability, and have important implications for scCO2 trapping in geologic carbon sequestration.

  20. Interactive data visualization foundations, techniques, and applications

    CERN Document Server

    Ward, Matthew; Keim, Daniel

    2010-01-01

    Visualization is the process of representing data, information, and knowledge in a visual form to support the tasks of exploration, confirmation, presentation, and understanding. This book is designed as a textbook for students, researchers, analysts, professionals, and designers of visualization techniques, tools, and systems. It covers the full spectrum of the field, including mathematical and analytical aspects, ranging from its foundations to human visual perception; from coded algorithms for different types of data, information and tasks to the design and evaluation of new visualization techniques. Sample programs are provided as starting points for building one's own visualization tools. Numerous data sets have been made available that highlight different application areas and allow readers to evaluate the strengths and weaknesses of different visualization methods. Exercises, programming projects, and related readings are given for each chapter. The book concludes with an examination of several existin...

  1. Direct numerical simulation of cellular-scale blood flow in microvascular networks

    Science.gov (United States)

    Balogh, Peter; Bagchi, Prosenjit

    2017-11-01

    A direct numerical simulation method is developed to study cellular-scale blood flow in physiologically realistic microvascular networks that are constructed in silico following published in vivo images and data, and are comprised of bifurcating, merging, and winding vessels. The model resolves large deformation of individual red blood cells (RBC) flowing in such complex networks. The vascular walls and deformable interfaces of the RBCs are modeled using the immersed-boundary methods. Time-averaged hemodynamic quantities obtained from the simulations agree quite well with published in vivo data. Our simulations reveal that in several vessels the flow rates and pressure drops could be negatively correlated. The flow resistance and hematocrit are also found to be negatively correlated in some vessels. These observations suggest a deviation from the classical Poiseuille's law in such vessels. The cells are observed to frequently jam at vascular bifurcations resulting in reductions in hematocrit and flow rate in the daughter and mother vessels. We find that RBC jamming results in several orders of magnitude increase in hemodynamic resistance, and thus provides an additional mechanism of increased in vivo blood viscosity as compared to that determined in vitro. Funded by NSF CBET 1604308.

  2. Comment on "Cheating prevention in visual cryptography".

    Science.gov (United States)

    Chen, Yu-Chi; Horng, Gwoboa; Tsai, Du-Shiau

    2012-07-01

    Visual cryptography (VC), proposed by Naor and Shamir, has numerous applications, including visual authentication and identification, steganography, and image encryption. In 2006, Horng showed that cheating is possible in VC, where some participants can deceive the remaining participants by forged transparencies. Since then, designing cheating-prevention visual secret-sharing (CPVSS) schemes has been studied by many researchers. In this paper, we cryptanalyze the Hu-Tzeng CPVSS scheme and show that it is not cheating immune. We also outline an improvement that helps to overcome the problem.
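    For context, the basic Naor-Shamir (2,2) visual secret-sharing construction can be sketched as follows: each secret pixel is expanded into identical or complementary subpixel pairs on two shares, so that physically stacking the shares reveals the secret. This is the textbook scheme only, not the CPVSS schemes analyzed in the paper:

```python
import numpy as np

def make_shares(secret: np.ndarray, rng=None):
    """Naor-Shamir (2,2) scheme: each pixel becomes two subpixels per share."""
    rng = rng or np.random.default_rng()
    h, w = secret.shape
    share1 = np.zeros((h, 2 * w), dtype=np.uint8)
    share2 = np.zeros((h, 2 * w), dtype=np.uint8)
    patterns = np.array([[0, 1], [1, 0]], dtype=np.uint8)  # 1 = black subpixel
    for i in range(h):
        for j in range(w):
            p = patterns[rng.integers(2)]
            share1[i, 2*j:2*j+2] = p
            # White pixel: identical patterns (stack shows 1 black subpixel).
            # Black pixel: complementary patterns (stack shows 2 black subpixels).
            share2[i, 2*j:2*j+2] = p if secret[i, j] == 0 else 1 - p
    return share1, share2

# Stacking transparencies corresponds to a pixel-wise OR of black subpixels.
secret = np.array([[0, 1], [1, 0]], dtype=np.uint8)   # toy 2x2 secret image
s1, s2 = make_shares(secret)
print(np.maximum(s1, s2))   # black pixels of the secret appear fully black
```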

  3. Statistical inference and visualization in scale-space for spatially dependent images

    KAUST Repository

    Vaughan, Amy

    2012-03-01

    SiZer (SIgnificant ZERo crossing of the derivatives) is a graphical scale-space visualization tool that allows for statistical inferences. In this paper we develop a spatial SiZer for finding significant features and conducting goodness-of-fit tests for spatially dependent images. The spatial SiZer utilizes a family of kernel estimates of the image and provides not only exploratory data analysis but also statistical inference with spatial correlation taken into account. It is also capable of comparing the observed image with a specific null model being tested by adjusting the statistical inference using an assumed covariance structure. Pixel locations having statistically significant differences between the image and a given null model are highlighted by arrows. The spatial SiZer is compared with the existing independent SiZer via the analysis of simulated data with and without signal on both planar and spherical domains. We apply the spatial SiZer method to the decadal temperature change over some regions of the Earth. © 2011 The Korean Statistical Society.
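    The SiZer idea can be sketched in its simpler, independent one-dimensional form (not the spatially correlated version developed in the paper): smooth the data over a family of bandwidths and flag locations where the derivative of the smooth is significantly positive or negative. The local-linear fit and the rough standard errors below are illustrative simplifications:

```python
import numpy as np

def sizer_map(x, y, bandwidths, grid, z=1.96):
    """Crude independent SiZer: for each bandwidth and grid location, fit a
    kernel-weighted local line and flag the slope as significantly increasing
    (+1), decreasing (-1) or neither (0). Standard errors are approximate."""
    out = np.zeros((len(bandwidths), len(grid)), dtype=int)
    for bi, h in enumerate(bandwidths):
        for gi, g in enumerate(grid):
            u = x - g
            w = np.exp(-0.5 * (u / h) ** 2)          # Gaussian kernel weights
            X = np.column_stack([np.ones_like(u), u])
            XtW = X.T * w
            A = XtW @ X
            beta = np.linalg.solve(A, XtW @ y)
            resid = y - X @ beta
            sigma2 = np.sum(w * resid**2) / max(np.sum(w) - 2, 1e-9)
            se_slope = np.sqrt(sigma2 * np.linalg.inv(A)[1, 1])
            if beta[1] > z * se_slope:
                out[bi, gi] = 1
            elif beta[1] < -z * se_slope:
                out[bi, gi] = -1
    return out

# Illustrative usage on synthetic data with a single bump feature.
rng = np.random.default_rng(5)
x = np.sort(rng.uniform(0, 10, 300))
y = np.exp(-0.5 * ((x - 5) / 1.0) ** 2) + rng.normal(0, 0.2, x.size)
grid = np.linspace(0, 10, 50)
print(sizer_map(x, y, bandwidths=[0.3, 1.0, 3.0], grid=grid))
```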

  4. Cortical Double-Opponent Cells in Color Perception: Perceptual Scaling and Chromatic Visual Evoked Potentials.

    Science.gov (United States)

    Nunez, Valerie; Shapley, Robert M; Gordon, James

    2018-01-01

    In the early visual cortex V1, there are currently only two known neural substrates for color perception: single-opponent and double-opponent cells. Our aim was to explore the relative contributions of these neurons to color perception. We measured the perceptual scaling of color saturation for equiluminant color checkerboard patterns (designed to stimulate double-opponent neurons preferentially) and uniformly colored squares (designed to stimulate only single-opponent neurons) at several cone contrasts. The spatially integrative responses of single-opponent neurons would produce the same response magnitude for checkerboards as for uniform squares of the same space-averaged cone contrast. However, perceived saturation of color checkerboards was higher than for the corresponding squares. The perceptual results therefore imply that double-opponent cells are involved in color perception of patterns. We also measured the chromatic visual evoked potential (cVEP) produced by the same stimuli; checkerboard cVEPs were much larger than those for corresponding squares, implying that double-opponent cells also contribute to the cVEP response. The total Fourier power of the cVEP grew sublinearly with cone contrast. However, the 6-Hz Fourier component's power grew linearly with contrast-like saturation perception. This may also indicate that cortical coding of color depends on response dynamics.
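    A minimal sketch of extracting the 6-Hz Fourier component's power from an evoked-response epoch, assuming a synthetic signal and an arbitrary sampling rate; the authors' actual cVEP processing pipeline is not specified here:

```python
import numpy as np

fs = 512                       # assumed sampling rate in Hz
t = np.arange(0, 2.0, 1 / fs)  # a 2-second epoch
# Synthetic "cVEP": a 6 Hz component plus noise, for illustration only.
signal = 1.5 * np.sin(2 * np.pi * 6 * t) + np.random.default_rng(6).normal(0, 1, t.size)

spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(signal.size, d=1 / fs)
power = np.abs(spectrum) ** 2 / signal.size

total_power = power.sum()
power_6hz = power[np.argmin(np.abs(freqs - 6.0))]
print(f"total power: {total_power:.1f}, 6 Hz component power: {power_6hz:.1f}")
```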

  5. The Visual Analog Scale as a Comprehensible Patient-Reported Outcome Measure (PROM) in Septorhinoplasty.

    Science.gov (United States)

    Spiekermann, Christoph; Amler, Susanne; Rudack, Claudia; Stenner, Markus

    2018-06-01

    The patient's satisfaction with the esthetic result is a major criterion of success in septorhinoplasty. However, the idea of esthetic perfection varies greatly and depends primarily on subjective perception. Hence, patient-reported instruments are important and necessary to assess the outcome in septorhinoplasty. To analyze the potential of the visual analog scale (VAS) as a patient-reported outcome measure in septorhinoplasty, the perception of nasal appearance was assessed by a VAS pre- and postoperatively in 213 patients undergoing septorhinoplasty. Furthermore, in this prospective study, the patients' satisfaction with the procedure's result was analyzed using a five-point Likert scale. Females had lower preoperative VAS scores but a higher increase compared to males. Patients with lower initial VAS scores showed a higher postoperative improvement in the VAS score compared to patients with higher initial VAS scores. Satisfaction with the result depends on the increase in the VAS score. The VAS is a short and comprehensible tool for assessing patients' perception of nasal appearance preoperatively and represents an appropriate instrument for assessing the esthetic patient-reported outcome in septorhinoplasty. Level of Evidence: IV.

  6. Multi-scale approach in numerical reservoir simulation; Uma abordagem multiescala na simulacao numerica de reservatorios

    Energy Technology Data Exchange (ETDEWEB)

    Guedes, Solange da Silva

    1998-07-01

    Advances in petroleum reservoir descriptions have provided an amount of data that cannot be handled directly during numerical simulations. This detailed geological information must be incorporated into a coarser model during multiphase fluid flow simulations by means of some upscaling technique. The most common approach is pseudo relative permeabilities, and the most widely used method is that of Kyte and Berry (1975). In this work, a multi-scale computational model for multiphase flow is proposed that treats the upscaling implicitly, without using pseudo functions. By solving a sequence of local problems on subdomains of the refined scale it is possible to achieve results with a coarser grid without the expensive computations of a fine-grid model. The main advantage of this new procedure is that it treats the upscaling step implicitly in the solution process, overcoming some practical difficulties related to the use of traditional pseudo functions. Results of two-dimensional two-phase flow simulations considering homogeneous porous media are presented. Some examples compare the results of this approach with those of the commercial upscaling program PSEUDO, a module of the reservoir simulation software ECLIPSE. (author)

  7. Experimental and numerical analysis of a small-scale turbojet engine

    International Nuclear Information System (INIS)

    Badami, M.; Nuccio, P.; Signoretto, A.

    2013-01-01

    Highlights: • A theoretical and experimental activity was performed on a small-scale turbojet. • The small turbojet shows the typical CO, UHC and NOx emission trends of aero-engines. • The comparison between the CFD and experimental results shows quite good agreement. • The CFD analysis made it possible to interpret some unexpected behaviour of thermodynamic parameters. • This essential knowledge will be applied in subsequent research on the use of alternative fuels. - Abstract: Since experimental activities on real aeronautical turbines can be very complex and expensive, the use of parts of real engines or small-size turbojets can be very useful for research activities. The present paper describes the results of an experimental and numerical activity conducted on a research turbojet engine with a nominal thrust of 80 N at 80,000 rpm. The aim of the research was to obtain detailed information on the thermodynamic cycle and performance of the engine in order to use it in subsequent activities on the benefits of using alternative fuels in gas turbine engines. A specific characterization of each component of the engine has been performed by means of thermodynamic and CFD analyses, and several measured parameters have been critically analyzed and compared with theoretical ones, with the purpose of increasing the knowledge of these kinds of small turbo-engines

  8. Data and Visualization Corridors: Report on the 1998 DVC Workshop Series

    International Nuclear Information System (INIS)

    Smith, Paul H.; van Rosendale, John

    1998-01-01

    The Department of Energy and the National Science Foundation sponsored a series of workshops on data manipulation and visualization of large-scale scientific datasets. Three workshops were held in 1998, bringing together experts in high-performance computing, scientific visualization, emerging computer technologies, physics, chemistry, materials science, and engineering. These workshops were followed by two writing and review sessions, as well as numerous electronic collaborations, to synthesize the results. The results of these efforts are reported here. Across the government, mission agencies are charged with understanding scientific and engineering problems of unprecedented complexity. The DOE Accelerated Strategic Computing Initiative, for example, will soon be faced with the problem of understanding the enormous datasets created by teraops simulations, while NASA already has a severe problem in coping with the flood of data captured by earth observation satellites. Unfortunately, scientific visualization algorithms, and high-performance display hardware and software on which they depend, have not kept pace with the sheer size of emerging datasets, which threaten to overwhelm our ability to conduct research. Our capability to manipulate and explore large datasets is growing only slowly, while human cognitive and visual perception are an absolutely fixed resource. Thus, there is a pressing need for new methods of handling truly massive datasets, of exploring and visualizing them, and of communicating them over geographic distances. This report, written by representatives from academia, industry, national laboratories, and the government, is intended as a first step toward the timely creation of a comprehensive federal program in data manipulation and scientific visualization. There is, at this time, an exciting confluence of ideas on data handling, compression, telepresence, and scientific visualization. The combination of these new ideas, which we refer to as

  9. Visualization system of swirl motion

    International Nuclear Information System (INIS)

    Nakayama, K.; Umeda, K.; Ichikawa, T.; Nagano, T.; Sakata, H.

    2004-01-01

    The instrumentation of a system composed of an experimental device and numerical analysis is presented to visualize flow and identify swirling motion. The experiment is performed with transparent material and PIV (Particle Image Velocimetry) instrumentation, by which the velocity vector field is obtained. This vector field is then analyzed numerically by a 'swirling flow analysis', which estimates its velocity gradient tensor and the corresponding eigenvalue (swirling function). Since an instantaneous flow field in steady or unsteady states is captured by PIV, the flow field is analyzed, and the existence of vortices or swirling motions and their locations are identified regardless of their size. In addition, the intensity of swirling is evaluated. The analysis enables swirling motion to emerge even when it is hidden in a uniform flow and the velocity field does not indicate any swirling. This visualization system can be applied to investigate conditions for controlling or designing flow. (authors)
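    The swirl identification step can be sketched for a 2D PIV field on a regular grid: form the in-plane velocity gradient tensor at each point and take the imaginary part of its complex eigenvalues as the swirling strength (zero where the eigenvalues are real). This is a generic criterion and not necessarily identical to the authors' swirling function:

```python
import numpy as np

def swirling_strength(u, v, dx=1.0, dy=1.0):
    """Swirling strength of a 2D velocity field (u, v) sampled on a regular grid.

    Returns the imaginary part of the complex eigenvalues of the in-plane
    velocity gradient tensor; it is zero where the flow is not locally swirling.
    """
    dudy, dudx = np.gradient(u, dy, dx)      # arrays are indexed [y, x]
    dvdy, dvdx = np.gradient(v, dy, dx)
    # Eigenvalues of [[dudx, dudy], [dvdx, dvdy]] are complex when the
    # discriminant (trace^2 - 4*det) is negative.
    trace = dudx + dvdy
    det = dudx * dvdy - dudy * dvdx
    disc = trace**2 - 4.0 * det
    return np.where(disc < 0.0, 0.5 * np.sqrt(np.maximum(-disc, 0.0)), 0.0)

# Illustrative usage: a solid-body vortex, whose swirling strength equals its rotation rate.
y, x = np.mgrid[-1:1:64j, -1:1:64j]
omega = 2.0
u, v = -omega * y, omega * x
print(swirling_strength(u, v, dx=x[0, 1] - x[0, 0], dy=y[1, 0] - y[0, 0]).mean())
```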

  10. Psychometric evaluation of a visual analog scale for the assessment of anxiety

    Directory of Open Access Journals (Sweden)

    Morlock Robert J

    2010-06-01

    Abstract Background Fast-acting medications for the management of anxiety are important to patients and society. Measuring early onset, however, requires a sensitive and clinically responsive tool. This study evaluates the psychometric properties of a patient-reported Global Anxiety - Visual Analog Scale (GA-VAS). Methods Data from a double-blind, randomized, placebo-controlled study of lorazepam and paroxetine in patients with Generalized Anxiety Disorder were analyzed to assess the reliability, validity, responsiveness, and utility of the GA-VAS. The GA-VAS was completed at clinic visits and at home during the first week of treatment. Targeted psychometric analyses (test-retest reliabilities, validity correlations, responsiveness statistics, and minimum important differences) were conducted. Results The GA-VAS correlates well with other anxiety measures; at Week 4, r = 0.60 (p ...) and r = 0.74 (p ...). Conclusions The GA-VAS is capable of validly and effectively capturing a reduction in anxiety as quickly as 24 hours post-dose.
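    A minimal sketch of two of the targeted analyses named above (test-retest reliability and a distribution-based minimum important difference), using illustrative synthetic scores and common conventions such as Pearson r and the 0.5-SD rule; these are generic formulas, not the study's prespecified analysis plan:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Hypothetical GA-VAS scores (0-100 mm) at two stable time points for 60 patients.
visit1 = rng.normal(55, 18, 60).clip(0, 100)
visit2 = (visit1 + rng.normal(0, 8, 60)).clip(0, 100)

# Test-retest reliability: correlation between repeated administrations.
r, p = stats.pearsonr(visit1, visit2)

# Distribution-based minimum important difference: half a baseline standard deviation.
mid = 0.5 * visit1.std(ddof=1)

# Responsiveness: standardized response mean for a hypothetical pre/post change.
post_treatment = (visit1 - rng.normal(15, 10, 60)).clip(0, 100)
change = visit1 - post_treatment
srm = change.mean() / change.std(ddof=1)

print(f"test-retest r = {r:.2f} (p = {p:.3g}), MID ~ {mid:.1f} mm, SRM = {srm:.2f}")
```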

  11. Scaling Quelccaya: Using 3-D Animation and Satellite Data To Visualize Climate Change

    Science.gov (United States)

    Malone, A.; Leich, M.

    2017-12-01

    The near-global glacier retreat of recent decades is among the most convincing evidence for contemporary climate change. The epicenter of this action, however, is often far from population-dense centers. How can a glacier's scale, both physical and temporal, be communicated to those far away? This project, an artist-scientist collaboration, proposes an alternate system for presenting climate change data, designed to evoke a more visceral response through a visual, geospatial, poetic approach. Focusing on the Quelccaya Ice Cap, the world's largest tropical glaciated area located in the Peruvian Andes, we integrate 30 years of satellite imagery and elevation models with 3D animation and gaming software, to bring it into a virtual juxtaposition with a model of the city of Chicago. Using Chicago as a cosmopolitan North American "measuring stick," we apply glaciological models to determine, for instance, the amount of ice that has melted on Quelccaya over the last 30 years and how deep an equivalent amount of snow would lie on the city of Chicago (circa 600 feet, higher than the Willis Tower). Placing the two sites in a framework of intimate scale, we present a more imaginative and psychologically-astute manner of portraying the sober facts of climate change, by inviting viewers to learn and consider without inducing fear.

  12. COSFIRE : A Brain-Inspired Approach to Visual Pattern Recognition

    NARCIS (Netherlands)

    Azzopardi, G.; Petkov, N.

    2014-01-01

    The primate visual system has an impressive ability to generalize and to discriminate between numerous objects and it is robust to many geometrical transformations as well as lighting conditions. The study of the visual system has been an active research field in neurophysiology for more than half a

  13. COSFIRE : A brain-inspired approach to visual pattern recognition

    NARCIS (Netherlands)

    Azzopardi, George; Petkov, Nicolai; Grandinetti, Lucio; Lippert, Thomas; Petkov, Nicolai

    2014-01-01

    The primate visual system has an impressive ability to generalize and to discriminate between numerous objects and it is robust to many geometrical transformations as well as lighting conditions. The study of the visual system has been an active research field in neurophysiology for more than half a

  14. In-Situ Visualization Experiments with ParaView Cinema in RAGE

    Energy Technology Data Exchange (ETDEWEB)

    Kares, Robert John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-10-15

    A previous paper described some numerical experiments performed using the ParaView/Catalyst in-situ visualization infrastructure deployed in the Los Alamos RAGE radiation-hydrodynamics code to produce images from a running large-scale 3D ICF simulation. One challenge of the in-situ approach apparent in these experiments was the difficulty of choosing parameters, such as isosurface values, for the visualizations to be produced from the running simulation without prior knowledge of the simulation results, and the resultant cost of recomputing in-situ generated images when parameters are chosen suboptimally. A proposed method of addressing this difficulty is simply to render multiple images at runtime with a range of possible parameter values, producing a large database of images, and to provide the user with a tool for managing the resulting database of imagery. Recently, ParaView/Catalyst has been extended to include such a capability via the so-called Cinema framework. Here I describe some initial experiments with the first delivery of Cinema and make some recommendations for future extensions of Cinema's capabilities.

  15. Thinking Visually about Algebra

    Science.gov (United States)

    Baroudi, Ziad

    2015-01-01

    Many introductions to algebra in high school begin with teaching students to generalise linear numerical patterns. This article argues that this approach should be changed so that students instead encounter variables in the context of modelling visual patterns, giving the variables a meaning. The article presents sample classroom activities,…

  16. Discrimination of numerical proportions: A comparison of binomial and Gaussian models.

    Science.gov (United States)

    Raidvee, Aire; Lember, Jüri; Allik, Jüri

    2017-01-01

    Observers discriminated the numerical proportion of two sets of elements (N = 9, 13, 33, and 65) that differed either by color or orientation. According to the standard Thurstonian approach, the accuracy of proportion discrimination is determined by irreducible noise in the nervous system that stochastically transforms the number of presented visual elements onto a continuum of psychological states representing numerosity. As an alternative to this customary approach, we propose a Thurstonian-binomial model, which assumes discrete perceptual states, each of which is associated with a certain visual element. It is shown that the probability β with which each visual element can be noticed and registered by the perceptual system can explain data of numerical proportion discrimination at least as well as the continuous Thurstonian-Gaussian model, and better, if the greater parsimony of the Thurstonian-binomial model is taken into account using AIC model selection. We conclude that Gaussian and binomial models represent two different fundamental principles (internal noise vs. using only a fraction of available information), which are both plausible descriptions of visual perception.
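
The comparison hinges on fitting each model's predicted choice probabilities and penalizing by parameter count via AIC. The sketch below illustrates that logic on invented choice counts (n_A, n_moreA and n_trials are all hypothetical), with simplified one-parameter versions of the Gaussian and binomial observers rather than the authors' exact formulations.

```python
import numpy as np
from scipy.stats import binom, norm
from scipy.optimize import minimize_scalar

# Hypothetical data: N = 13 elements per display, varying number of "A" elements,
# and how often the observer judged "more A" out of 40 trials per condition.
N, n_trials = 13, 40
n_A     = np.array([3, 5, 6, 7, 8, 10])
n_moreA = np.array([2, 11, 16, 25, 31, 38])

def nll_gaussian(sigma):
    # Gaussian observer: noisy difference signal (n_A - n_B) with SD sigma.
    p = np.clip(norm.cdf((2 * n_A - N) / sigma), 1e-9, 1 - 1e-9)
    return -binom.logpmf(n_moreA, n_trials, p).sum()

def nll_binomial(beta):
    # Binomial observer: each element registered independently with probability
    # beta; respond "more A" if more A than B elements register (ties guessed).
    p = np.empty(len(n_A))
    for i, a in enumerate(n_A):
        b = N - a
        ka = np.arange(a + 1)[:, None]
        kb = np.arange(b + 1)[None, :]
        joint = binom.pmf(ka, a, beta) * binom.pmf(kb, b, beta)
        p[i] = joint[ka > kb].sum() + 0.5 * joint[ka == kb].sum()
    p = np.clip(p, 1e-9, 1 - 1e-9)
    return -binom.logpmf(n_moreA, n_trials, p).sum()

for name, nll, bounds in [("Gaussian", nll_gaussian, (0.1, 20.0)),
                          ("binomial", nll_binomial, (0.01, 0.99))]:
    fit = minimize_scalar(nll, bounds=bounds, method="bounded")
    aic = 2 * 1 + 2 * fit.fun           # one free parameter in each model
    print(f"{name:8s} parameter = {fit.x:.3f}   AIC = {aic:.1f}")
```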

  17. Application of GIS and Visualization Technology in the Regional-Scale Ground-Water Modeling of the Twentynine Palms and San Jose Areas, California

    Science.gov (United States)

    Li, Z.

    2003-12-01

    Application of GIS and visualization technology significantly contributes to the efficiency and success of developing ground-water models in the Twentynine Palms and San Jose areas, California. Visualizations from GIS and other tools can help to formulate the conceptual model by quickly revealing the basinwide geohydrologic characteristics and changes of a ground-water flow system, and by identifying the most influential components of system dynamics. In addition, 3-D visualizations and animations can help validate the conceptual formulation and the numerical calibration of the model by checking for model-input data errors, revealing cause and effect relationships, and identifying hidden design flaws in model layering and other critical flow components. Two case studies will be presented: The first is a desert basin (near the town of Twentynine Palms) characterized by a fault-controlled ground-water flow system. The second is a coastal basin (Santa Clara Valley including the city of San Jose) characterized by complex, temporally variable flow components, including artificial recharge through a large system of ponds and stream channels, dynamically changing inter-layer flow from hundreds of multi-aquifer wells, pumping-driven subsidence and recovery, and climatically variable natural recharge. For the Twentynine Palms area, more than 10,000 historical ground-water level and water-quality measurements were retrieved from the USGS databases. The combined use of GIS and visualization tools allowed these data to be swiftly organized and interpreted, and depicted by water-level and water-quality maps with a variety of themes for different uses. Overlaying and cross-correlating these maps with other hydrological, geological, geophysical, and geochemical data not only helped to quickly identify the major geohydrologic characteristics controlling the natural variation of hydraulic head in space, such as faults, basin-bottom altitude, and aquifer stratigraphies, but also

  18. High Performance Numerical Computing for High Energy Physics: A New Challenge for Big Data Science

    International Nuclear Information System (INIS)

    Pop, Florin

    2014-01-01

    Modern physics is based on both theoretical analysis and experimental validation. Complex scenarios like subatomic dimensions, high energy, and lower absolute temperature are frontiers for many theoretical models. Simulation with stable numerical methods represents an excellent instrument for high accuracy analysis, experimental validation, and visualization. High performance computing support offers the possibility of performing simulations at large scale and in parallel, but the volume of data generated by these experiments creates a new challenge for Big Data Science. This paper presents existing computational methods for high energy physics (HEP) analyzed from two perspectives: numerical methods and high performance computing. The computational methods presented are Monte Carlo methods and simulations of HEP processes, Markovian Monte Carlo, unfolding methods in particle physics, kernel estimation in HEP, and Random Matrix Theory used in the analysis of particle spectra. All of these methods produce data-intensive applications, which introduce new challenges and requirements for ICT systems architecture, programming paradigms, and storage capabilities.
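
As a toy illustration of the Monte Carlo simulation style mentioned above, the following sketch generates pseudo-events from an exponential decay smeared by Gaussian detector resolution and shows the familiar 1/sqrt(N) statistical scaling; tau, sigma and the estimator are invented for illustration and are not tied to any HEP experiment or code.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy "event generator": exponential decay times smeared by Gaussian detector
# resolution. tau, sigma and n_events are invented illustration values.
tau, sigma, n_events = 2.2, 0.3, 1_000_000

true_t = rng.exponential(tau, n_events)                   # generated decay times
meas_t = true_t + rng.normal(0.0, sigma, n_events)        # detector smearing

# Because the Gaussian smearing is unbiased, the sample mean of the measured
# times estimates tau; its statistical error shrinks like 1/sqrt(N).
tau_hat = meas_t.mean()
stat_err = meas_t.std(ddof=1) / np.sqrt(n_events)
print(f"tau_hat = {tau_hat:.4f} +/- {stat_err:.4f}  (true value used: {tau})")
```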

  19. Large Scale Functional Brain Networks Underlying Temporal Integration of Audio-Visual Speech Perception: An EEG Study.

    Science.gov (United States)

    Kumar, G Vinodh; Halder, Tamesh; Jaiswal, Amit K; Mukherjee, Abhishek; Roy, Dipanjan; Banerjee, Arpan

    2016-01-01

    Observable lip movements of the speaker influence perception of auditory speech. A classical example of this influence is reported by listeners who perceive an illusory (cross-modal) speech sound (McGurk-effect) when presented with incongruent audio-visual (AV) speech stimuli. Recent neuroimaging studies of AV speech perception accentuate the role of frontal, parietal, and the integrative brain sites in the vicinity of the superior temporal sulcus (STS) for multisensory speech perception. However, whether and how the network across the whole brain participates during multisensory perception processing remains an open question. We posit that a large-scale functional connectivity among the neural population situated in distributed brain sites may provide valuable insights into the processing and fusing of AV speech. Varying the psychophysical parameters in tandem with electroencephalogram (EEG) recordings, we exploited the trial-by-trial perceptual variability of incongruent audio-visual (AV) speech stimuli to identify the characteristics of the large-scale cortical network that facilitates multisensory perception during synchronous and asynchronous AV speech. We evaluated the spectral landscape of EEG signals during multisensory speech perception at varying AV lags. Functional connectivity dynamics for all sensor pairs was computed using the time-frequency global coherence, the vector sum of pairwise coherence changes over time. During synchronous AV speech, we observed enhanced global gamma-band coherence and decreased alpha and beta-band coherence underlying cross-modal (illusory) perception compared to unisensory perception around a temporal window of 300-600 ms following onset of stimuli. During asynchronous speech stimuli, a global broadband coherence was observed during cross-modal perception at earlier times along with pre-stimulus decreases of lower frequency power, e.g., alpha rhythms for positive AV lags and theta rhythms for negative AV lags. Thus, our

  20. Visually and memory-guided grasping: aperture shaping exhibits a time-dependent scaling to Weber's law.

    Science.gov (United States)

    Holmes, Scott A; Mulla, Ali; Binsted, Gordon; Heath, Matthew

    2011-09-01

    The 'just noticeable difference' (JND) represents the minimum amount by which a stimulus must change to produce a noticeable variation in one's perceptual experience and is related to initial stimulus magnitude (i.e., Weber's law). The goal of the present study was to determine whether aperture shaping for visually derived and memory-guided grasping elicit a temporally dependent or temporally independent adherence to Weber's law. Participants were instructed to grasp differently sized objects (20, 30, 40, 50 and 60mm) in conditions wherein vision of the grasping environment was available throughout the response (i.e., closed-loop), when occluded at movement onset (i.e., open-loop), and when occluded for a brief (i.e., 0ms) or longer (i.e., 2000ms) delay in advance of movement onset. Within-participant standard deviations of grip aperture (i.e., the JNDs) computed at decile increments of normalized grasping time were used to determine participant's sensitivity to detecting changes in object size. Results showed that JNDs increased linearly with increasing object size from 10% to 40% of grasping time; that is, the trial-to-trial stability (i.e., visuomotor certainty) of grip aperture (i.e., the comparator) decreased with increasing object size (i.e., the initial stimulus). However, a null JND/object size scaling was observed during the middle and late stages of the response (i.e., >50% of grasping time). Most notably, the temporal relationship between JNDs and object size scaling was similar across the different visual conditions used here. Thus, our results provide evidence that aperture shaping elicits a time-dependent early, but not late, adherence to the psychophysical principles of Weber's law. Copyright © 2011 Elsevier Ltd. All rights reserved.
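
The analysis pipeline described above (the JND as the within-participant SD of grip aperture at each decile of normalized grasping time, then a JND-versus-object-size slope per decile) can be sketched as follows. The apertures below are fabricated purely to show the computation, not to reproduce the study's data.

```python
import numpy as np

rng = np.random.default_rng(0)

object_sizes = np.array([20, 30, 40, 50, 60])       # mm
deciles = np.arange(10, 101, 10)                    # 10% ... 100% of grasp time

# Fabricated grip apertures whose trial-to-trial variability grows with object
# size only early in the reach, purely to show the computation.
def fake_apertures(size, n_trials=30):
    t = deciles / 100.0
    noise_sd = 0.06 * size * np.exp(-3.0 * t) + 1.0
    return size * t[None, :] + rng.normal(0.0, noise_sd, (n_trials, t.size))

# JND at each decile = within-participant SD of grip aperture.
jnd = np.array([fake_apertures(s).std(axis=0, ddof=1) for s in object_sizes])

# Weber slope: regress JND on object size separately at each decile.
slopes = np.polyfit(object_sizes, jnd, deg=1)[0]    # one slope per decile
for d, b in zip(deciles, slopes):
    print(f"{d:3d}% of grasp time: JND-vs-size slope = {b:.3f}")
```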

  1. Social Set Visualizer (SoSeVi) II

    DEFF Research Database (Denmark)

    Flesch, Benjamin; Vatrapu, Ravi

    2016-01-01

    This paper reports the second iteration of the Social Set Visualizer (SoSeVi), a set theoretical visual analytics dashboard of big social data. In order to further demonstrate its usefulness in large-scale visual analytics tasks of individual and collective behavior of actors in social networks..., the current iteration of the Social Set Visualizer (SoSeVi) in version II builds on recent advancements in visualizing set intersections. The development of the SoSeVi dashboard involved cutting-edge open source visual analytics libraries (D3.js) and creation of new visualizations such as of actor mobility...

  2. Substantial adverse association of visual and vascular comorbidities on visual disability in multiple sclerosis.

    Science.gov (United States)

    Marrie, Ruth Ann; Cutter, Gary; Tyry, Tuula

    2011-12-01

    Visual comorbidities are common in multiple sclerosis (MS) but the impact of visual comorbidities on visual disability is unknown. We assessed the impact of visual and vascular comorbidities on severity of visual disability in MS. In 2006, we queried participants of the North American Research Committee on Multiple Sclerosis (NARCOMS) about cataracts, glaucoma, uveitis, hypertension, hypercholesterolemia, heart disease, diabetes and peripheral vascular disease. We assessed visual disability using the Vision subscale of Performance Scales. Using Cox regression, we investigated whether visual or vascular comorbidities affected the time between MS symptom onset and the development of mild, moderate and severe visual disability. Of 8983 respondents, 1415 (15.9%) reported a visual comorbidity while 4745 (52.8%) reported a vascular comorbidity. The median (interquartile range) visual score was 1 (0-2). In a multivariable Cox model the risk of mild visual disability was higher among participants with vascular (hazard ratio [HR] 1.45; 95% confidence interval [CI]: 1.39-1.51) and visual comorbidities (HR 1.47; 95% CI: 1.37-1.59). Vascular and visual comorbidities were similarly associated with increased risks of moderate and severe visual disability. Visual and vascular comorbidities are associated with progression of visual disability in MS. Clinicians hearing reports of worsening visual symptoms in MS patients should consider visual comorbidities as contributing factors. Further study of these issues using objective, systematic neuro-ophthalmologic evaluations is warranted.

  3. Interaction between numbers and size during visual search

    OpenAIRE

    Krause, Florian; Bekkering, Harold; Pratt, Jay; Lindemann, Oliver

    2016-01-01

    The current study investigates an interaction between numbers and physical size (i.e. size congruity) in visual search. In three experiments, participants had to detect a physically large (or small) target item among physically small (or large) distractors in a search task comprising single-digit numbers. The relative numerical size of the digits was varied, such that the target item was either among the numerically large or small numbers in the search display and the relation between numeric...

  4. Treatment outcomes of a Numeric Rating Scale (NRS)-guided pharmacological pain management strategy in symptomatic knee and hip osteoarthritis in daily clinical practice.

    NARCIS (Netherlands)

    Snijders, G.F.; Ende, C.H.M. van den; Bemt, B.J.F van den; Riel, P.L.C.M. van; Hoogen, F.H.J. van den; Broeder, A. den

    2012-01-01

    OBJECTIVES: To describe the results of a Numeric Rating Scale (NRS)-guided pharmacological pain management strategy in symptomatic knee and hip osteoarthritis (OA) in daily clinical practice. METHODS: In this observational cohort study, standardised conservative treatment was offered to patients

  5. Finite-time and finite-size scalings in the evaluation of large-deviation functions: Numerical approach in continuous time.

    Science.gov (United States)

    Guevara Hidalgo, Esteban; Nemoto, Takahiro; Lecomte, Vivien

    2017-06-01

    Rare trajectories of stochastic systems are important to understand because of their potential impact. However, their properties are by definition difficult to sample directly. Population dynamics provides a numerical tool allowing their study, by means of simulating a large number of copies of the system, which are subjected to selection rules that favor the rare trajectories of interest. Such algorithms are plagued by finite simulation time and finite population size, effects that can render their use delicate. In this paper, we present a numerical approach which uses the finite-time and finite-size scalings of estimators of the large deviation functions associated to the distribution of rare trajectories. The method we propose allows one to extract the infinite-time and infinite-size limit of these estimators, which, as shown on the contact process, provides a significant improvement of the large deviation function estimators compared to the standard one.

  6. Finite-time and finite-size scalings in the evaluation of large-deviation functions: Numerical approach in continuous time

    Science.gov (United States)

    Guevara Hidalgo, Esteban; Nemoto, Takahiro; Lecomte, Vivien

    2017-06-01

    Rare trajectories of stochastic systems are important to understand because of their potential impact. However, their properties are by definition difficult to sample directly. Population dynamics provides a numerical tool allowing their study, by means of simulating a large number of copies of the system, which are subjected to selection rules that favor the rare trajectories of interest. Such algorithms are plagued by finite simulation time and finite population size, effects that can render their use delicate. In this paper, we present a numerical approach which uses the finite-time and finite-size scalings of estimators of the large deviation functions associated to the distribution of rare trajectories. The method we propose allows one to extract the infinite-time and infinite-size limit of these estimators, which—as shown on the contact process—provides a significant improvement of the large deviation function estimators compared to the standard one.
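
A generic sketch of the extrapolation idea: fit estimator values jointly against 1/N and 1/t and read off the intercept as the infinite-size, infinite-time limit. The scaling form and the synthetic psi values below are assumptions made for illustration; the paper's estimators and correction terms are more elaborate.

```python
import numpy as np

# Synthetic estimates psi(N, t) with 1/N and 1/t corrections added to an
# invented limit value, standing in for cloning-algorithm output.
N_vals = np.array([100.0, 200.0, 400.0, 800.0])
t_vals = np.array([50.0, 100.0, 200.0])
psi_inf, a, b = -0.125, 0.8, 2.0

NN, TT = np.meshgrid(N_vals, t_vals, indexing="ij")
rng = np.random.default_rng(1)
psi = psi_inf + a / NN + b / TT + rng.normal(0.0, 1e-3, NN.shape)

# Least-squares fit  psi ~ c0 + c1/N + c2/t ; the intercept c0 is the
# extrapolated infinite-size, infinite-time estimate.
X = np.column_stack([np.ones(psi.size), 1.0 / NN.ravel(), 1.0 / TT.ravel()])
coef, *_ = np.linalg.lstsq(X, psi.ravel(), rcond=None)
print(f"extrapolated limit = {coef[0]:.4f}  (value used to generate data: {psi_inf})")
```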

  7. Determining student teachers' perceptions on using technology via Likert scale, visual association test and metaphors: A mixed study

    Directory of Open Access Journals (Sweden)

    Mevhibe Kobak

    2013-04-01

    Full Text Available The aim of this study is to determine senior student teachers' perceptions of using technology from various points of view. The researchers collected data through a Technology Perceptions Scale, a Visual Association Activity and Technology Metaphors. The participants were 104 senior student teachers enrolled in Balıkesir University Necatibey Faculty of Education. In this descriptive study, the researchers interpreted qualitative data in conjunction with quantitative data. Although student teachers' perceptions of using technology were positive according to the Likert scale, no significant relation was found in terms of gender or enrolled undergraduate program. According to the results of the visual association test, student teachers ranked the smartboard, Internet and computer in the first three places, and the portable media player, mobile phone and video/camera in the last three. In addition, the researchers analyzed and classified student teachers' metaphors about technology under nine categories: (1) developing-changing technology, (2) rapidly progressing technology, (3) limitless-endless technology, (4) beneficial technology, (5) harmful technology, (6) both beneficial and harmful technology, (7) indispensable technology, (8) technology as a necessity, and (9) all-inclusive technology. At the end of the study, these nine categories, which were derived using the content analysis technique, are presented in a table that shows the interaction between categories in a holistic view.

  8. Planetary-Scale Inertio Gravity Waves in the Numerical Spectral Model

    Science.gov (United States)

    Mayr, H. G.; Mengel, J. R.; Talaat, E. R.; Porter, H. S.

    2004-01-01

    In the polar region of the upper mesosphere, horizontal wind oscillations have been observed with periods around 10 hours. Waves with such a period are generated in our Numerical Spectral Model (NSM), and they are identified as planetary-scale inertio gravity waves (IGW). These IGWs have periods between 9 and 11 hours and appear above 60 km in the zonal mean (m = 0), as well as in zonal wavenumbers m = 1 to 4. The waves can propagate eastward and westward and have vertical wavelengths around 25 km. The amplitudes in the wind field are typically between 10 and 20 m/s and can reach 30 m/s in the westward propagating component for m = 1 at the poles. In the temperature perturbations, the wave amplitudes above 100 km are typically 5 K and as large as 10 K for m = 0 at the poles. The IGWs are intermittent but reveal systematic seasonal variations, with the largest amplitudes occurring generally in late winter and spring. In the NSM, the IGW are generated like the planetary waves (PW). They are produced apparently by the instabilities that arise in the zonal mean circulation. Relative to the PWs, however, the IGWs propagate zonally with much larger velocities, such that they are not affected much by interactions with the background zonal winds. Since the IGWs can propagate through the mesosphere without much interaction, except for viscous dissipation, one should then expect that they reach the thermosphere with significant and measurable amplitudes.

  9. Auditory and visual memory in musicians and nonmusicians.

    Science.gov (United States)

    Cohen, Michael A; Evans, Karla K; Horowitz, Todd S; Wolfe, Jeremy M

    2011-06-01

    Numerous studies have shown that musicians outperform nonmusicians on a variety of tasks. Here we provide the first evidence that musicians have superior auditory recognition memory for both musical and nonmusical stimuli, compared to nonmusicians. However, this advantage did not generalize to the visual domain. Previously, we showed that auditory recognition memory is inferior to visual recognition memory. Would this be true even for trained musicians? We compared auditory and visual memory in musicians and nonmusicians using familiar music, spoken English, and visual objects. For both groups, memory for the auditory stimuli was inferior to memory for the visual objects. Thus, although considerable musical training is associated with better musical and nonmusical auditory memory, it does not increase the ability to remember sounds to the levels found with visual stimuli. This suggests a fundamental capacity difference between auditory and visual recognition memory, with a persistent advantage for the visual domain.

  10. Distributed system for large-scale remote research

    International Nuclear Information System (INIS)

    Ueshima, Yutaka

    2002-01-01

    In advanced photon research, large-scale simulations and high-resolution observations are powerful tools. In both numerical and real experiments, a real-time visualization and steering system is considered a promising method of data analysis. This approach works well for analyses that are performed only once, or for low-cost experiments and simulations. In research on an unknown problem, however, the output data must be analyzed many times, because a conclusive analysis is difficult to reach in a single pass. Consequently, output data should be archived so that they can be referenced and analyzed at any time. To support such research, automatic functions are needed for transporting data files from the data generator to data storage, analyzing the data, tracking the history of data handling, and so on. The supporting system will be a functionally distributed system. (author)

  11. Large Field Visualization with Demand-Driven Calculation

    Science.gov (United States)

    Moran, Patrick J.; Henze, Chris

    1999-01-01

    We present a system designed for the interactive definition and visualization of fields derived from large data sets: the Demand-Driven Visualizer (DDV). The system allows the user to write arbitrary expressions to define new fields, and then apply a variety of visualization techniques to the result. Expressions can include differential operators and numerous other built-in functions, all of which are evaluated at specific field locations completely on demand. The payoff of following a demand-driven design philosophy throughout becomes particularly evident when working with large time-series data, where the costs of eager evaluation alternatives can be prohibitive.
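
A small sketch of what "demand-driven" evaluation of a derived field can look like (this is not the DDV API; the class and field names are invented): the expression is evaluated only at the points a visualization actually queries.

```python
import numpy as np

class DerivedField:
    """A derived field: an expression over base fields, evaluated on demand."""
    def __init__(self, expr, **base_fields):
        self.expr = expr              # callable combining base-field samples
        self.base = base_fields       # name -> callable(points) -> values

    def __call__(self, points):
        # Base fields and the expression are evaluated only at the queried points.
        samples = {name: f(points) for name, f in self.base.items()}
        return self.expr(**samples)

# Base fields are analytic here; in practice they would read slices of a large
# time-series dataset only when asked.
def pressure(pts):
    return np.exp(-np.sum(pts**2, axis=1))

def density(pts):
    return 1.0 + 0.1 * pts[:, 0]

sound_speed = DerivedField(lambda pressure, density: np.sqrt(1.4 * pressure / density),
                           pressure=pressure, density=density)

query = np.array([[0.0, 0.0, 0.0], [0.5, 0.2, -0.1]])   # only these points get computed
print(sound_speed(query))
```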

  12. An experimental and numerical investigation of crossflow effects in two-phase displacements

    DEFF Research Database (Denmark)

    Cinar, Y.; Jessen, Kristian; Berenblyum, Roman

    2006-01-01

    In this paper, we present flow visualization experiments and numerical simulations that demonstrate the combined effects of viscous and capillary forces and gravity segregation on crossflow that occurs in two-phase displacements in layered porous media. We report results of a series of immiscible...... flooding experiments in 2D, two-layered glass bead models. Favorable mobility-ratio imbibition and unfavorable mobility-ratio drainage experiments were performed. We used pre-equilibrated immiscible phases from a ternary isooctane/isopropanol/water system, which allowed control of the interfacial tension....... The experiments also illustrate the complex interplay of capillary, gravity, and viscous forces that controls crossflow. The experimental results confirm that the transition ranges of scaling groups suggested by Zhou et al. (1994) are appropriate/valid. We report also results of simulations of the displacement...

  13. Assessment of a numerical model to reproduce event-scale erosion and deposition distributions in a braided river.

    Science.gov (United States)

    Williams, R D; Measures, R; Hicks, D M; Brasington, J

    2016-08-01

    Numerical morphological modeling of braided rivers, using a physics-based approach, is increasingly used as a technique to explore controls on river pattern and, from an applied perspective, to simulate the impact of channel modifications. This paper assesses a depth-averaged nonuniform sediment model (Delft3D) to predict the morphodynamics of a 2.5 km long reach of the braided Rees River, New Zealand, during a single high-flow event. Evaluation of model performance primarily focused upon using high-resolution Digital Elevation Models (DEMs) of Difference, derived from a fusion of terrestrial laser scanning and optical empirical bathymetric mapping, to compare observed and predicted patterns of erosion and deposition and reach-scale sediment budgets. For the calibrated model, this was supplemented with planform metrics (e.g., braiding intensity). Extensive sensitivity analysis of model functions and parameters was executed, including consideration of numerical scheme for bed load component calculations, hydraulics, bed composition, bed load transport and bed slope effects, bank erosion, and frequency of calculations. Total predicted volumes of erosion and deposition corresponded well to those observed. The difference between predicted and observed volumes of erosion was less than the factor of two that characterizes the accuracy of the Gaeuman et al. bed load transport formula. Grain size distributions were best represented using two φ intervals. For unsteady flows, results were sensitive to the morphological time scale factor. The approach of comparing observed and predicted morphological sediment budgets shows the value of using natural experiment data sets for model testing. Sensitivity results are transferable to guide Delft3D applications to other rivers.

  14. An investigation of the effect of pore scale flow on average geochemical reaction rates using direct numerical simulation

    Energy Technology Data Exchange (ETDEWEB)

    Molins, Sergi [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Earth Sciences Division; Trebotich, David [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Computational Research Division; Steefel, Carl I. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Earth Sciences Division; Shen, Chaopeng [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Computational Research Division

    2012-03-30

    The scale-dependence of geochemical reaction rates hinders their use in continuum scale models intended for the interpretation and prediction of chemical fate and transport in subsurface environments such as those considered for geologic sequestration of CO2. Processes that take place at the pore scale, especially those involving mass transport limitations to reactive surfaces, may contribute to the discrepancy commonly observed between laboratory-determined and continuum-scale or field rates. In this study we investigate the dependence of mineral dissolution rates on the pore structure of the porous media by means of pore scale modeling of flow and multicomponent reactive transport. The pore scale model is composed of high-performance simulation tools and algorithms for incompressible flow and conservative transport combined with a general-purpose multicomponent geochemical reaction code. The model performs direct numerical simulation of reactive transport based on an operator-splitting approach to coupling transport and reactions. The approach is validated with a Poiseuille flow single-pore experiment and verified with an equivalent 1-D continuum-scale model of a capillary tube packed with calcite spheres. Using the case of calcite dissolution as an example, the high-resolution model is used to demonstrate that nonuniformity in the flow field at the pore scale has the effect of decreasing the overall reactivity of the system, even when systems with identical reactive surface area are considered. In conclusion, the effect becomes more pronounced as the heterogeneity of the reactive grain packing increases, particularly where the flow slows sufficiently such that the solution approaches equilibrium locally and the average rate becomes transport-limited.

  15. Noninvariance of Space and Time Scale Ranges under a Lorentz Transformation and the Implications for the Numerical Study of Relativistic Systems

    International Nuclear Information System (INIS)

    Vay, J.-L.

    2007-01-01

    We present an analysis which shows that the ranges of space and time scales spanned by a system are not invariant under the Lorentz transformation. This implies the existence of a frame of reference which minimizes an aggregate measure of the range of space and time scales. Such a frame is derived for example cases: free electron laser, laser-plasma accelerator, and particle beam interacting with electron clouds. Implications for experimental, theoretical and numerical studies are discussed. The most immediate relevance is the reduction by orders of magnitude in computer simulation run times for such systems

  16. Enhancing interdisciplinary collaboration and decisionmaking with J-Earth: an open source data sharing, visualization and GIS analysis platform

    Science.gov (United States)

    Prashad, L. C.; Christensen, P. R.; Fink, J. H.; Anwar, S.; Dickenshied, S.; Engle, E.; Noss, D.

    2010-12-01

    Our society is currently facing a number of major environmental challenges, most notably the threat of climate change. A multifaceted, interdisciplinary approach involving physical and social scientists, engineers and decisionmakers is critical to adequately address these complex issues. To best facilitate this interdisciplinary approach, data and models at various scales - from local to global - must be quickly and easily shared between disciplines to effectively understand environmental phenomena and human-environmental interactions. When data are acquired and studied on different scales and within different disciplines, researchers and practitioners may not be able to easily learn from each other's results. For example, climate change models are often developed at a global scale, while strategies that address human vulnerability to climate change and mitigation/adaptation strategies are often assessed on a local level. Linkages between urban heat island phenomena and global climate change may be better understood with increased data flow amongst researchers and those making policy decisions. In these cases it would be useful to have a single platform to share, visualize, and analyze numerical model and satellite/airborne remote sensing data with social, environmental, and economic data between researchers and practitioners. The Arizona State University 100 Cities Project and Mars Space Flight Facility are developing the open source application J-Earth, with the goal of providing this single platform to facilitate data sharing, visualization, and analysis between researchers and applied practitioners around environmental and other sustainability challenges. This application is being designed for user communities including physical and social scientists, NASA researchers, non-governmental organizations, and decisionmakers to share and analyze data at multiple scales. We are initially focusing on urban heat island and urban ecology studies, with data and users from

  17. Sketchy Rendering for Information Visualization.

    Science.gov (United States)

    Wood, J; Isenberg, P; Isenberg, T; Dykes, J; Boukhelifa, N; Slingsby, A

    2012-12-01

    We present and evaluate a framework for constructing sketchy style information visualizations that mimic data graphics drawn by hand. We provide an alternative renderer for the Processing graphics environment that redefines core drawing primitives including line, polygon and ellipse rendering. These primitives allow higher-level graphical features such as bar charts, line charts, treemaps and node-link diagrams to be drawn in a sketchy style with a specified degree of sketchiness. The framework is designed to be easily integrated into existing visualization implementations with minimal programming modification or design effort. We show examples of use for statistical graphics, conveying spatial imprecision and for enhancing aesthetic and narrative qualities of visualization. We evaluate user perception of sketchiness of areal features through a series of stimulus-response tests in order to assess users' ability to place sketchiness on a ratio scale, and to estimate area. Results suggest relative area judgment is compromised by sketchy rendering and that its influence is dependent on the shape being rendered. They show that degree of sketchiness may be judged on an ordinal scale but that its judgement varies strongly between individuals. We evaluate higher-level impacts of sketchiness through user testing of scenarios that encourage user engagement with data visualization and willingness to critique visualization design. Results suggest that where a visualization is clearly sketchy, engagement may be increased and that attitudes to participating in visualization annotation are more positive. The results of our work have implications for effective information visualization design that go beyond the traditional role of sketching as a tool for prototyping or its use for an indication of general uncertainty.

  18. 4D Reconstruction and Visualization of Cultural Heritage: Analyzing Our Legacy Through Time

    Science.gov (United States)

    Rodríguez-Gonzálvez, P.; Muñoz-Nieto, A. L.; del Pozo, S.; Sanchez-Aparicio, L. J.; Gonzalez-Aguilera, D.; Micoli, L.; Gonizzi Barsanti, S.; Guidi, G.; Mills, J.; Fieber, K.; Haynes, I.; Hejmanowska, B.

    2017-02-01

    Temporal analyses and multi-temporal 3D reconstruction are fundamental for the preservation and maintenance of all forms of Cultural Heritage (CH) and are the basis for decisions related to interventions and promotion. Introducing the fourth dimension of time into three-dimensional geometric modelling of real data allows the creation of a multi-temporal representation of a site. In this way, scholars from various disciplines (surveyors, geologists, archaeologists, architects, philologists, etc.) are provided with a new set of tools and working methods to support the study of the evolution of heritage sites, both to develop hypotheses about the past and to model likely future developments. The capacity to "see" the dynamic evolution of CH assets across different spatial scales (e.g. building, site, city or territory) compressed into a diachronic model affords the possibility of better understanding the present status of CH according to its history. However, there are numerous challenges in order to carry out 4D modelling and the requisite multi-data source integration. It is necessary to identify the specifications, needs and requirements of the CH community to understand the required levels of 4D model information. In this way, it is possible to determine the optimum material and technologies to be utilised at different CH scales, as well as the data management and visualization requirements. This manuscript aims to provide a comprehensive approach for CH time-varying representations, analysis and visualization across different working scales and environments: rural landscape, urban landscape and architectural scales. Within this aim, the different available metric data sources are systemized and evaluated in terms of their suitability.

  19. Color-Space-Based Visual-MIMO for V2X Communication

    OpenAIRE

    Jai-Eun Kim; Ji-Won Kim; Youngil Park; Ki-Doo Kim

    2016-01-01

    In this paper, we analyze the applicability of color-space-based, color-independent visual-MIMO for V2X. We aim to achieve a visual-MIMO scheme that can maintain the original color and brightness while performing seamless communication. We consider two scenarios of GCM based visual-MIMO for V2X. One is a multipath transmission using visual-MIMO networking and the other is multi-node V2X communication. In the scenario of multipath transmission, we analyze the channel capacity numerically and w...

  20. Numerical modelling of the flow in the resin infusion process on the REV scale: A feasibility study

    Energy Technology Data Exchange (ETDEWEB)

    Jabbari, M.; Spangenberg, J.; Hattel, J. H. [Process Modelling Group, Department of Mechanical Engineering, Technical University of Denmark, Nils Koppels Allé, 2800 Kgs. Lyngby (Denmark); Jambhekar, V. A.; Helmig, R. [Department of Hydromechanics and Modelling of Hydrosystems, Institute for Modelling Hydraulic and Environmental Systems, Universität Stuttgart, Stuttgart (Germany); Gersborg, A. R. [SCION DTU, Diplomvej 373N, DK-2800 Lyngby (Denmark)

    2016-06-08

    The resin infusion process (RIP) has developed as a low-cost method for manufacturing large fibre reinforced plastic parts. However, the process still presents some challenges to industry with regard to reliability and repeatability, resulting in expensive and inefficient trial and error development. In this paper, we show the implementation of 2D numerical models for the RIP using the open source simulator DuMuX. The idea of this study is to present a model which accounts for the interfacial forces coming from the capillary pressure on the so-called representative elementary volume (REV) scale. The model is described in detail and three different test cases — a constant and a tensorial permeability as well as a preform/Balsa domain — are investigated. The results show that the developed model is well suited to the RIP for manufacturing of composite parts. The intention is to test the developed model for later use in a real application, in which the preform medium has numerous layers with different material properties.

  1. Self-reported pain intensity with the numeric reporting scale in adult dengue.

    Directory of Open Access Journals (Sweden)

    Joshua G X Wong

    Full Text Available BACKGROUND: Pain is a prominent feature of acute dengue as well as a clinical criterion in World Health Organization guidelines in diagnosing dengue. We conducted a prospective cohort study to compare levels of pain during acute dengue between different ethnicities and dengue severity. METHODS: Demographic, clinical and laboratory data were collected. Data on self-reported pain were collected using the 11-point Numerical Rating Scale. Generalized structural equation models were built to predict progression to severe disease. RESULTS: A total of 499 laboratory confirmed dengue patients were recruited in the Prospective Adult Dengue Study at Tan Tock Seng Hospital, Singapore. We found no statistically significant differences in pain score by age, gender, ethnicity or the presence of co-morbidity. Pain score was not predictive of dengue severity but was highly correlated with patients' day of illness. Prevalence of abdominal pain in our cohort was 19%. There was no difference in abdominal pain score between grades of dengue severity. CONCLUSION: Dengue is a painful disease. Patients suffer more pain in the earlier phase of illness. However, pain score cannot be used to predict a patient's progression to severe disease.

  2. Numerical simulation in astrophysics

    International Nuclear Information System (INIS)

    Miyama, Shoken

    1985-01-01

    There have been many numerical simulations of hydrodynamical problems in astrophysics, e.g. processes of star formation, supernova explosions and the formation of neutron stars, and the general relativistic collapse of stars to form black holes. The codes are made to be suitable for computing such problems. Astrophysical hydrodynamical problems share several characteristics: self-gravity or external gravity acts, the objects span very large or very small scales, they evolve on short or long time scales, and magnetic and/or centrifugal forces may act. In this paper, we present one class of numerical simulation methods that can satisfy these requirements, the so-called smoothed particle methods. We briefly introduce the methods and then show one application of the methods to an astrophysical problem (fragmentation and collapse of a rotating isothermal cloud). (Mori, K.)
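
For readers unfamiliar with smoothed particle methods, a minimal density estimate with the standard 3D cubic-spline kernel looks like the sketch below; this is textbook SPH, not the specific code discussed in the record, and the particle set is a toy uniform cube.

```python
import numpy as np

def cubic_spline_W(r, h):
    """Standard 3D cubic-spline SPH kernel."""
    q = np.asarray(r) / h
    w = np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
        np.where(q < 2.0, 0.25 * (2.0 - q)**3, 0.0))
    return w / (np.pi * h**3)

def sph_density(positions, masses, h):
    """Density at each particle: sum of kernel-weighted neighbour masses."""
    rho = np.empty(len(positions))
    for i, p in enumerate(positions):
        r = np.linalg.norm(positions - p, axis=1)
        rho[i] = np.sum(masses * cubic_spline_W(r, h))
    return rho

# Toy particle set: 1000 equal-mass particles filling a unit cube (total mass 1,
# so the density away from the box edges should come out near 1).
rng = np.random.default_rng(3)
pos = rng.uniform(0.0, 1.0, (1000, 3))
m = np.full(1000, 1.0 / 1000)
print(sph_density(pos, m, h=0.1).mean())
```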

  3. tran-SAS v1.0: a numerical model to compute catchment-scale hydrologic transport using StorAge Selection functions

    Directory of Open Access Journals (Sweden)

    P. Benettin

    2018-04-01

    Full Text Available This paper presents the tran-SAS package, which includes a set of codes to model solute transport and water residence times through a hydrological system. The model is based on a catchment-scale approach that aims at reproducing the integrated response of the system at one of its outlets. The codes are implemented in MATLAB and are meant to be easy to edit, so that users with minimal programming knowledge can adapt them to the desired application. The problem of large-scale solute transport has both theoretical and practical implications. On the one side, the ability to represent the ensemble of water flow trajectories through a heterogeneous system helps unraveling streamflow generation processes and allows us to make inferences on plant–water interactions. On the other side, transport models are a practical tool that can be used to estimate the persistence of solutes in the environment. The core of the package is based on the implementation of an age master equation (ME, which is solved using general StorAge Selection (SAS functions. The age ME is first converted into a set of ordinary differential equations, each addressing the transport of an individual precipitation input through the catchment, and then it is discretized using an explicit numerical scheme. Results show that the implementation is efficient and allows the model to run in short times. The numerical accuracy is critically evaluated and it is shown to be satisfactory in most cases of hydrologic interest. Additionally, a higher-order implementation is provided within the package to evaluate and, if necessary, to improve the numerical accuracy of the results. The codes can be used to model streamflow age and solute concentration, but a number of additional outputs can be obtained by editing the codes to further advance the ability to understand and model catchment transport processes.

  4. tran-SAS v1.0: a numerical model to compute catchment-scale hydrologic transport using StorAge Selection functions

    Science.gov (United States)

    Benettin, Paolo; Bertuzzo, Enrico

    2018-04-01

    This paper presents the tran-SAS package, which includes a set of codes to model solute transport and water residence times through a hydrological system. The model is based on a catchment-scale approach that aims at reproducing the integrated response of the system at one of its outlets. The codes are implemented in MATLAB and are meant to be easy to edit, so that users with minimal programming knowledge can adapt them to the desired application. The problem of large-scale solute transport has both theoretical and practical implications. On the one side, the ability to represent the ensemble of water flow trajectories through a heterogeneous system helps unraveling streamflow generation processes and allows us to make inferences on plant-water interactions. On the other side, transport models are a practical tool that can be used to estimate the persistence of solutes in the environment. The core of the package is based on the implementation of an age master equation (ME), which is solved using general StorAge Selection (SAS) functions. The age ME is first converted into a set of ordinary differential equations, each addressing the transport of an individual precipitation input through the catchment, and then it is discretized using an explicit numerical scheme. Results show that the implementation is efficient and allows the model to run in short times. The numerical accuracy is critically evaluated and it is shown to be satisfactory in most cases of hydrologic interest. Additionally, a higher-order implementation is provided within the package to evaluate and, if necessary, to improve the numerical accuracy of the results. The codes can be used to model streamflow age and solute concentration, but a number of additional outputs can be obtained by editing the codes to further advance the ability to understand and model catchment transport processes.
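
The simplest special case of the approach, a uniform ("random sampling") SAS function, can be sketched in a few lines: discharge removes water from every age class in proportion to its share of storage, and the age distribution is updated with an explicit step. tran-SAS itself is a MATLAB package with general SAS functions; nothing below reflects its actual interface, and the forcing values are illustrative.

```python
import numpy as np

dt, n_steps = 1.0, 1000          # daily steps
J, Q = 2.0, 2.0                  # precipitation and discharge [mm/d], steady
S0 = 500.0                       # initial storage [mm]

ages = np.zeros(n_steps + 1)     # ages[k] = volume of water that is k days old
ages[0] = S0                     # initial storage lumped in as age-zero water

mean_age = []
for _ in range(n_steps):
    S = ages.sum()
    ages *= max(0.0, 1.0 - Q * dt / S)   # uniform SAS: discharge samples all ages equally
    ages[1:] = ages[:-1]                 # every parcel gets one day older
    ages[0] = J * dt                     # new precipitation enters at age zero
    mean_age.append(np.average(np.arange(ages.size), weights=ages))

# For random sampling the mean storage age tends towards S/Q (250 days here).
print(f"mean storage age after {n_steps} days: {mean_age[-1]:.1f} d")
```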

  5. Investigation of Numerical Dissipation in Classical and Implicit Large Eddy Simulations

    Directory of Open Access Journals (Sweden)

    Moutassem El Rafei

    2017-12-01

    Full Text Available The quantitative measure of dissipative properties of different numerical schemes is crucial to computational methods in the field of aerospace applications. Therefore, the objective of the present study is to examine the resolving power of the Monotonic Upwind Scheme for Conservation Laws (MUSCL) scheme with three different slope limiters: one second-order and two third-order limiters, used within the framework of Implicit Large Eddy Simulations (ILES). The performance of the dynamic Smagorinsky subgrid-scale model used in the classical Large Eddy Simulation (LES) approach is examined. The assessment of these schemes is of significant importance to understand the numerical dissipation that could affect the accuracy of the numerical solution. A modified equation analysis has been applied to the convective term of the fully-compressible Navier–Stokes equations to formulate an analytical expression of the truncation error for the second-order upwind scheme. The contribution of second-order partial derivatives in the expression of the truncation error showed that the effect of this numerical error could not be neglected compared to the total kinetic energy dissipation rate. Transitions from laminar to turbulent flow are visualized considering the inviscid Taylor–Green Vortex (TGV) test-case. The evolution in time of the volumetrically-averaged kinetic energy and kinetic energy dissipation rate has been monitored for all numerical schemes and all grid levels. The dissipation mechanism has been compared to Direct Numerical Simulation (DNS) data found in the literature at different Reynolds numbers. We found that the resolving power and the symmetry breaking property are enhanced with finer grid resolutions. The production of vorticity has been observed in terms of enstrophy and effective viscosity. The instantaneous kinetic energy spectrum has been computed using a three-dimensional Fast Fourier Transform (FFT). All combinations of numerical methods produce a k^-4 spectrum
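
A minimal example of the kind of limited reconstruction being assessed: a minmod-limited, second-order (MUSCL-type) scheme for 1D linear advection, with the decay of a discrete kinetic-energy-like norm used as a crude monitor of numerical dissipation. This is a didactic 1D sketch, not the compressible ILES solver or the third-order limiters studied in the paper.

```python
import numpy as np

def minmod(a, b):
    return np.where(a * b > 0.0, np.sign(a) * np.minimum(np.abs(a), np.abs(b)), 0.0)

def step(u, c):
    """One step of a minmod-limited second-order scheme for u_t + u_x = 0 (periodic)."""
    slope = minmod(u - np.roll(u, 1), np.roll(u, -1) - u)
    flux = u + 0.5 * (1.0 - c) * slope        # limited upwind/Lax-Wendroff blend
    return u - c * (flux - np.roll(flux, 1))

n, c = 200, 0.4
x = np.linspace(0.0, 1.0, n, endpoint=False)
u = np.sin(2.0 * np.pi * x) ** 4
E0 = 0.5 * np.sum(u**2) / n                   # discrete kinetic-energy-like norm

for _ in range(int(2 * n / c)):               # advect over two full periods
    u = step(u, c)

E = 0.5 * np.sum(u**2) / n
print(f"energy norm: {E0:.5f} -> {E:.5f}  (the loss is purely numerical dissipation)")
```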

  6. Numerical investigation of the flow in axial water turbines and marine propellers with scale-resolving simulations

    Science.gov (United States)

    Morgut, Mitja; Jošt, Dragica; Nobile, Enrico; Škerlavaj, Aljaž

    2015-11-01

    The accurate prediction of the performances of axial water turbines and naval propellers is a challenging task, of great practical relevance. In this paper a numerical prediction strategy, based on the combination of a trusted CFD solver and a calibrated mass transfer model, is applied to the turbulent flow in axial turbines and around a model scale naval propeller, under non-cavitating and cavitating conditions. Some selected results for axial water turbines and a marine propeller, and in particular the advantages, in terms of accuracy and fidelity, of Scale-Resolving Simulations (SRS), like SAS (Scale Adaptive Simulation) and Zonal-LES (ZLES) compared to standard RANS approaches, are presented. Efficiency prediction for a Kaplan and a bulb turbine was significantly improved by use of the SAS SST model in combination with the ZLES in the draft tube. The size of the cavitation cavity and the sigma break curve for the Kaplan turbine were successfully predicted with the SAS model in combination with a robust high-resolution scheme, while for mass transfer the Zwart model with calibrated constants was used. The results obtained for a marine propeller in non-uniform inflow, under cavitating conditions, compare well with available experimental measurements, and proved that a mass transfer model, previously calibrated for RANS (Reynolds Averaged Navier Stokes), can be successfully applied also within the SRS approaches.

  7. Dry corrosion prediction of radioactive waste containers in long term interim storage: mechanisms of low temperature oxidation of pure iron and numerical simulation of an oxide scale growth

    International Nuclear Information System (INIS)

    Bertrand, N.

    2006-10-01

    In the framework of research on the long term behaviour of radioactive waste containers, this work consists, on the one hand, of the study of low-temperature oxidation of iron and, on the other hand, of the development of a numerical model of oxide scale growth. Isothermal oxidation experiments are performed on pure iron at 300 and 400 °C in dry and humid air at atmospheric pressure. Oxide scales formed in these conditions are characterized. They are composed of a duplex magnetite scale under a thin hematite scale. The inner layer of the duplex scale is thinner than the outer one. Both are composed of columnar grains, which are smaller in the inner part. The outer hematite layer is made of very small equiaxed grains. Marker and tracer experiments show that a part of the scale grows at the metal/oxide interface thanks to short-circuit diffusion of oxygen. A model for iron oxide scale growth at low temperature is then deduced. Besides this experimental study, the numerical model EKINOX (Estimation Kinetics Oxidation) is developed. It allows the simulation of the growth of an oxide scale controlled by mixed mechanisms, such as anionic and cationic vacancy diffusion through the scale, as well as metal transfer at the metal/oxide interface. It is based on the calculation of concentration profiles of chemical species and also point defects in the oxide scale and in the substrate. This numerical model does not use the classical quasi-steady-state approximation and calculates the fate of cationic vacancies at the metal/oxide interface. Indeed, these point defects can either be eliminated by interface motion or injected into the substrate, where they can be annihilated, considering sinks such as the climb of dislocations. Hence, the influence of substrate cold-work can be investigated. The EKINOX model is validated in the conditions of Wagner's theory and is confronted with experimental results by its application to the case of high temperature oxidation of nickel. (author)

  8. The impact of visual gaze direction on auditory object tracking

    OpenAIRE

    Pomper, U.; Chait, M.

    2017-01-01

    Subjective experience suggests that we are able to direct our auditory attention independent of our visual gaze, e.g. when shadowing a nearby conversation at a cocktail party. But what are the consequences at the behavioural and neural level? While numerous studies have investigated both auditory attention and visual gaze independently, little is known about their interaction during selective listening. In the present EEG study, we manipulated visual gaze independently of auditory attention wh...

  9. Numerical Analysis of Multiscale Computations

    CERN Document Server

    Engquist, Björn; Tsai, Yen-Hsi R

    2012-01-01

    This book is a snapshot of current research in multiscale modeling, computations and applications. It covers fundamental mathematical theory, numerical algorithms as well as practical computational advice for analysing single and multiphysics models containing a variety of scales in time and space. Complex fluids, porous media flow and oscillatory dynamical systems are treated in some extra depth, as well as tools like analytical and numerical homogenization, and fast multipole method.

  10. GAViT: Genome Assembly Visualization Tool for Short Read Data

    Energy Technology Data Exchange (ETDEWEB)

    Syed, Aijazuddin; Shapiro, Harris; Tu, Hank; Pangilinan, Jasmyn; Trong, Stephan

    2008-03-14

    It is a challenging job for genome analysts to accurately debug, troubleshoot, and validate genome assembly results. Genome analysts rely on visualization tools to help validate and troubleshoot assembly results, including such problems as mis-assemblies, low-quality regions, and repeats. Short read data adds further complexity and makes it extremely challenging for the visualization tools to scale and to view all needed assembly information. As a result, there is a need for a visualization tool that can scale to display assembly data from the new sequencing technologies. We present Genome Assembly Visualization Tool (GAViT), a highly scalable and interactive assembly visualization tool developed at the DOE Joint Genome Institute (JGI).

  11. Flow Visualization with Quantified Spatial and Temporal Errors Using Edge Maps

    KAUST Repository

    Bhatia, H.; Jadhav, S.; Bremer, P.; Guoning Chen,; Levine, J. A.; Nonato, L. G.; Pascucci, V.

    2012-01-01

    Robust analysis of vector fields has been established as an important tool for deriving insights from the complex systems these fields model. Traditional analysis and visualization techniques rely primarily on computing streamlines through numerical integration. The inherent numerical errors of such approaches are usually ignored, leading to inconsistencies that cause unreliable visualizations and can ultimately prevent in-depth analysis. We propose a new representation for vector fields on surfaces that replaces numerical integration through triangles with maps from the triangle boundaries to themselves. This representation, called edge maps, permits a concise description of flow behaviors and is equivalent to computing all possible streamlines at a user defined error threshold. Independent of this error streamlines computed using edge maps are guaranteed to be consistent up to floating point precision, enabling the stable extraction of features such as the topological skeleton. Furthermore, our representation explicitly stores spatial and temporal errors which we use to produce more informative visualizations. This work describes the construction of edge maps, the error quantification, and a refinement procedure to adhere to a user defined error bound. Finally, we introduce new visualizations using the additional information provided by edge maps to indicate the uncertainty involved in computing streamlines and topological structures. © 2012 IEEE.

  12. Flow Visualization with Quantified Spatial and Temporal Errors Using Edge Maps

    KAUST Repository

    Bhatia, H.

    2012-09-01

    Robust analysis of vector fields has been established as an important tool for deriving insights from the complex systems these fields model. Traditional analysis and visualization techniques rely primarily on computing streamlines through numerical integration. The inherent numerical errors of such approaches are usually ignored, leading to inconsistencies that cause unreliable visualizations and can ultimately prevent in-depth analysis. We propose a new representation for vector fields on surfaces that replaces numerical integration through triangles with maps from the triangle boundaries to themselves. This representation, called edge maps, permits a concise description of flow behaviors and is equivalent to computing all possible streamlines at a user-defined error threshold. Independent of this error, streamlines computed using edge maps are guaranteed to be consistent up to floating point precision, enabling the stable extraction of features such as the topological skeleton. Furthermore, our representation explicitly stores spatial and temporal errors, which we use to produce more informative visualizations. This work describes the construction of edge maps, the error quantification, and a refinement procedure to adhere to a user-defined error bound. Finally, we introduce new visualizations using the additional information provided by edge maps to indicate the uncertainty involved in computing streamlines and topological structures. © 2012 IEEE.

  13. Open Source Tools for Numerical Simulation of Urban Greenhouse Gas Emissions

    Science.gov (United States)

    Nottrott, A.; Tan, S. M.; He, Y.

    2016-12-01

    There is a global movement toward urbanization. Approximately 7% of the global population lives in just 28 megacities, occupying less than 0.1% of the total land area used by human activity worldwide. These cities contribute a significant fraction of the global budget of anthropogenic primary pollutants and greenhouse gases. The 27 largest cities consume 9.9%, 9.3%, 6.7% and 3.0% of global gasoline, electricity, energy and water use, respectively. This impact motivates novel approaches to quantify and mitigate the growing contribution of megacity emissions to global climate change. Cities are characterized by complex topography, inhomogeneous turbulence, and variable pollutant source distributions. These features create a scale separation between local sources and urban scale emissions estimates known as the Grey-Zone. Modern computational fluid dynamics (CFD) techniques provide a quasi-deterministic, physically based toolset to bridge the scale separation gap between source level dynamics, local measurements, and urban scale emissions inventories. CFD has the capability to represent complex building topography and capture detailed 3D turbulence fields in the urban boundary layer. This presentation discusses the application of OpenFOAM to urban CFD simulations of natural gas leaks in cities. OpenFOAM is open-source software for advanced numerical simulation of engineering and environmental fluid flows. When combined with free or low-cost computer aided drawing and GIS, OpenFOAM generates a detailed, 3D representation of urban wind fields. OpenFOAM was applied to model methane (CH4) emissions from various components of the natural gas distribution system, to investigate the impact of urban meteorology on mobile CH4 measurements. The numerical experiments demonstrate that CH4 concentration profiles are highly sensitive to the relative location of emission sources and buildings. Sources separated by distances of 5-10 meters showed significant differences in
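
    The reported sensitivity of CH4 concentration profiles to source placement relative to buildings can be illustrated without a full CFD solver. The toy sketch below is a coarse 2D advection-diffusion finite-difference model with a rectangular obstacle; it is not OpenFOAM and not the study's configuration, and the wind speed, diffusivity, source strength and geometry are made-up values chosen only to show how moving a point source a few grid cells relative to an obstacle changes the simulated downwind concentration.

      import numpy as np

      def plume(source_ij, nx=120, ny=60, u=2.0, d=0.5, dt=0.02, steps=3000):
          """Explicit upwind advection + diffusion of a scalar around a block obstacle."""
          c = np.zeros((ny, nx))
          obstacle = np.zeros((ny, nx), dtype=bool)
          obstacle[20:40, 40:48] = True          # a "building" in grid cells
          si, sj = source_ij
          for _ in range(steps):
              lap = (np.roll(c, 1, 0) + np.roll(c, -1, 0) +
                     np.roll(c, 1, 1) + np.roll(c, -1, 1) - 4.0 * c)
              adv = u * (c - np.roll(c, 1, 1))   # upwind difference, wind in +x
              c = c + dt * (d * lap - adv)
              c[si, sj] += dt * 10.0             # continuous point source
              c[obstacle] = 0.0                  # crude no-penetration/absorbing condition
              c[:, 0] = c[:, -1] = c[0, :] = c[-1, :] = 0.0
          return c

      aligned = plume(source_ij=(30, 36))        # source directly upwind of the building
      offset = plume(source_ij=(44, 36))         # source shifted ~8 cells crosswind
      print("peak concentration at x=80, aligned source:", aligned[:, 80].max())
      print("peak concentration at x=80, offset source :", offset[:, 80].max())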

  14. Pain point system scale (PPSS): a method for postoperative pain estimation in retrospective studies

    Directory of Open Access Journals (Sweden)

    Gkotsi A

    2012-11-01

    Full Text Available Anastasia Gkotsi,1 Dimosthenis Petsas,2 Vasilios Sakalis,3 Asterios Fotas,3 Argyrios Triantafyllidis,3 Ioannis Vouros,3 Evangelos Saridakis,2 Georgios Salpiggidis,3 Athanasios Papathanasiou3; 1Department of Experimental Physiology, Aristotle University of Thessaloniki, Thessaloniki, Greece; 2Department of Anesthesiology, 3Department of Urology, Hippokration General Hospital, Thessaloniki, Greece. Purpose: Pain rating scales are widely used for pain assessment. Nevertheless, a new tool is required for pain assessment needs in retrospective studies. Methods: The postoperative pain episodes, during the first postoperative day, of three patient groups were analyzed. Each pain episode was assessed by a visual analog scale, numerical rating scale, verbal rating scale, and a new tool – the pain point system scale (PPSS) – based on the analgesics administered. The type of analgesic was defined by an artificial neural network system based on the authors’ clinic protocol, patient comorbidities, pain assessment tool scores, and preadministered medications. At each pain episode, each patient was asked to fill in the three pain scales. Bartlett’s test and the Kaiser–Meyer–Olkin criterion were used to evaluate sample sufficiency. The proper scoring system was defined by varimax rotation. Spearman’s and Pearson’s coefficients assessed the correlation of the PPSS to the known pain scales. Results: A total of 262 pain episodes were evaluated in 124 patients. The PPSS scored one point for each dose of paracetamol, three points for each nonsteroidal antiinflammatory drug or codeine, and seven points for each dose of opioids. The correlation between the visual analog scale and PPSS was found to be strong and linear (rho: 0.715; P < 0.001 and Pearson: 0.631; P < 0.001). Conclusion: The PPSS correlated well with the known pain scales and could be used safely in the evaluation of postoperative pain in retrospective studies. Keywords: pain scale, retrospective studies, pain point system
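
    The scoring rule reported above (one point per dose of paracetamol, three per nonsteroidal anti-inflammatory drug or codeine dose, seven per opioid dose) and its comparison against a pain rating are straightforward to reproduce. The sketch below uses made-up analgesic logs and made-up 0-100 VAS ratings purely to show the computation; the correlation coefficients are those named in the abstract.

      import numpy as np
      from scipy.stats import pearsonr, spearmanr

      # Points per dose, as reported for the PPSS
      PPSS_WEIGHTS = {"paracetamol": 1, "nsaid": 3, "codeine": 3, "opioid": 7}

      def ppss_score(doses):
          """Sum of weighted analgesic doses for one pain episode."""
          return sum(PPSS_WEIGHTS[drug] * n for drug, n in doses.items())

      # Hypothetical data: analgesics given per episode and the matching VAS (0-100)
      episodes = [
          {"paracetamol": 2},
          {"paracetamol": 1, "nsaid": 1},
          {"nsaid": 2, "opioid": 1},
          {"opioid": 2},
          {"paracetamol": 1, "codeine": 1, "opioid": 1},
      ]
      vas = np.array([20, 35, 60, 75, 80])

      scores = np.array([ppss_score(e) for e in episodes])
      rho, p_rho = spearmanr(scores, vas)
      r, p_r = pearsonr(scores, vas)
      print("PPSS scores:", scores)
      print(f"Spearman rho = {rho:.3f} (p = {p_rho:.3f}), Pearson r = {r:.3f} (p = {p_r:.3f})")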

  15. Numerical models for fluid-grains interactions: opportunities and limitations

    Science.gov (United States)

    Esteghamatian, Amir; Rahmani, Mona; Wachs, Anthony

    2017-06-01

    In the framework of a multi-scale approach, we develop numerical models for suspension flows. At the micro scale level, we perform particle-resolved numerical simulations using a Distributed Lagrange Multiplier/Fictitious Domain approach. At the meso scale level, we use a two-way Euler/Lagrange approach with a Gaussian filtering kernel to model fluid-solid momentum transfer. At both the micro and meso scale levels, particles are individually tracked in a Lagrangian way and all inter-particle collisions are computed by a Discrete Element/Soft-sphere method. The previous numerical models have been extended to handle particles of arbitrary shape (non-spherical, angular and even non-convex) as well as to treat heat and mass transfer. All simulation tools are fully-MPI parallel with standard domain decomposition and run on supercomputers with a satisfactory scalability on up to a few thousands of cores. The main asset of multi scale analysis is the ability to extend our comprehension of the dynamics of suspension flows based on the knowledge acquired from the high-fidelity micro scale simulations and to use that knowledge to improve the meso scale model. We illustrate how we can benefit from this strategy for a fluidized bed, where we introduce a stochastic drag force model derived from micro-scale simulations to recover the proper level of particle fluctuations. Conversely, we discuss the limitations of such modelling tools such as their limited ability to capture lubrication forces and boundary layers in highly inertial flows. We suggest ways to overcome these limitations in order to enhance further the capabilities of the numerical models.

  16. Continuation of full-scale three-dimensional numerical experiments on high-intensity particle and laser beam-matter interactions

    Energy Technology Data Exchange (ETDEWEB)

    Mori, Warren, B.

    2012-12-01

    We present results from the grant entitled 'Continuation of full-scale three-dimensional numerical experiments on high-intensity particle and laser beam-matter interactions'. The research significantly advanced the understanding of basic high-energy density science (HEDS) of ultra-intense laser and particle-beam plasma interactions. This advancement in understanding was then used to aid in the quest to make 1 GeV to 500 GeV plasma-based accelerator stages. The work blended basic research with three-dimensional, fully nonlinear and fully kinetic simulations, including full-scale modeling of ongoing or planned experiments. The primary tool was three-dimensional particle-in-cell simulation. The simulations provided a test bed for theoretical ideas and models as well as a method to guide experiments. The research also included careful benchmarking of codes against experiment. High-fidelity full-scale modeling provided a means to extrapolate parameters into regimes that were not accessible to current or near-term experiments, thereby allowing concepts to be tested with confidence before tens to hundreds of millions of dollars were spent building facilities. The research allowed the development of a hierarchy of PIC codes and diagnostics that is one of the most advanced in the world.

  17. Assessment of a numerical model to reproduce event‐scale erosion and deposition distributions in a braided river

    Science.gov (United States)

    Measures, R.; Hicks, D. M.; Brasington, J.

    2016-01-01

    Abstract Numerical morphological modeling of braided rivers, using a physics‐based approach, is increasingly used as a technique to explore controls on river pattern and, from an applied perspective, to simulate the impact of channel modifications. This paper assesses a depth‐averaged nonuniform sediment model (Delft3D) to predict the morphodynamics of a 2.5 km long reach of the braided Rees River, New Zealand, during a single high‐flow event. Evaluation of model performance primarily focused upon using high‐resolution Digital Elevation Models (DEMs) of Difference, derived from a fusion of terrestrial laser scanning and optical empirical bathymetric mapping, to compare observed and predicted patterns of erosion and deposition and reach‐scale sediment budgets. For the calibrated model, this was supplemented with planform metrics (e.g., braiding intensity). Extensive sensitivity analysis of model functions and parameters was executed, including consideration of numerical scheme for bed load component calculations, hydraulics, bed composition, bed load transport and bed slope effects, bank erosion, and frequency of calculations. Total predicted volumes of erosion and deposition corresponded well to those observed. The difference between predicted and observed volumes of erosion was less than the factor of two that characterizes the accuracy of the Gaeuman et al. bed load transport formula. Grain size distributions were best represented using two φ intervals. For unsteady flows, results were sensitive to the morphological time scale factor. The approach of comparing observed and predicted morphological sediment budgets shows the value of using natural experiment data sets for model testing. Sensitivity results are transferable to guide Delft3D applications to other rivers. PMID:27708477

  18. Evaluating acute pain intensity relief: challenges when using an 11-point numerical rating scale.

    Science.gov (United States)

    Chauny, Jean-Marc; Paquet, Jean; Lavigne, Gilles; Marquis, Martin; Daoust, Raoul

    2016-02-01

    Percentage of pain intensity difference (PercentPID) is a recognized way of evaluating pain relief with an 11-point numerical rating scale (NRS) but is not without flaws. A new metric, the slope of relative pain intensity difference (SlopePID), which consists of dividing PercentPID by the time between 2 pain measurements, is proposed. This study aims to validate SlopePID with 3 measures of subjective pain relief: a 5-category relief scale (not, a little, moderate, very, complete), a 2-category relief question ("I'm relieved," "I'm not relieved"), and a single-item question, "Wanting other medication to treat pain?" (Yes/No). This prospective cohort study included 361 patients in the emergency department who had an initial acute pain NRS > 3 and a pain intensity assessment within 90 minutes after analgesic administration. Mean age was 50.2 years (SD = 19.3) and 59% were women. Area under the curve analyses of receiver operating characteristic curves revealed similar discriminative power for PercentPID (0.83; 95% confidence interval [CI], 0.79-0.88) and SlopePID (0.82; 95% CI, 0.77-0.86). Considering the "very" category from the 5-category relief scale as substantial relief, the average cutoff for substantial relief was a decrease of 64% (95% CI, 59-69) for PercentPID and of 49% per hour (95% CI, 44-54) for SlopePID. However, when a cutoff criterion of 50% was used as a measure of pain relief for an individual patient, PercentPID underestimated pain-relieved patients by 12.1% when pain intensity at baseline was an odd number compared with an even number (32.9% vs 45.0%, respectively). SlopePID should be used instead of PercentPID as a metric to evaluate acute pain relief on a 0 to 10 NRS.
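
    Both metrics above have simple definitions: PercentPID is the pain-intensity difference expressed as a percentage of the baseline rating, and SlopePID divides that percentage by the time elapsed between the two ratings. A minimal sketch with made-up NRS values and timing follows.

      def percent_pid(nrs_initial, nrs_followup):
          """Percentage of pain intensity difference on a 0-10 NRS."""
          return 100.0 * (nrs_initial - nrs_followup) / nrs_initial

      def slope_pid(nrs_initial, nrs_followup, minutes_between):
          """PercentPID divided by the time between the two ratings, in % per hour."""
          return percent_pid(nrs_initial, nrs_followup) / (minutes_between / 60.0)

      # Example: baseline NRS 8, NRS 4 recorded 45 minutes after analgesia
      print(f"PercentPID = {percent_pid(8, 4):.1f} %")
      print(f"SlopePID   = {slope_pid(8, 4, 45):.1f} % per hour")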

  19. Usefulness of medial temporal lobe atrophy visual rating scale in detecting Alzheimer's disease: Preliminary study

    Directory of Open Access Journals (Sweden)

    Jae-Hyeok Heo

    2013-01-01

    Full Text Available Background: The Korean version of the Mini-Mental Status Examination (K-MMSE) and the Korean version of the Addenbrooke Cognitive Examination (K-ACE) have been validated as quick neuropsychological tests for screening dementia in various clinical settings. Medial temporal atrophy (MTA) is an early pathological characteristic of Alzheimer's disease (AD). We aimed to assess the diagnostic validity of the fusion of a neuropsychological test and a visual rating scale (VRS) of MTA in AD. Materials and Methods: A total of fifty subjects (25 AD, 25 controls) were included. The neuropsychological tests used were the K-MMSE and the K-ACE. A T1 axial imaging visual rating scale (VRS) was applied to assess the grade of MTA. We calculated the fusion score as the difference between the neuropsychological test score and the VRS of MTA. The receiver operating characteristic (ROC) curve was used to determine the optimal cut-off score, sensitivity and specificity of the fusion scores in screening AD. Results: No significant differences in age, gender and education were found between the AD and control groups. The values of K-MMSE, K-ACE, CDR, VRS and cognitive function test minus VRS were significantly lower in the AD group than in the control group. The AUC (area under the curve), sensitivity and specificity for K-MMSE minus VRS were 0.857, 84% and 80%, and for K-ACE minus VRS were 0.884, 80% and 88%, respectively. Those for K-MMSE only were 0.842, 76% and 72%, and for K-ACE only were 0.868, 80% and 88%, respectively. Conclusions: The fusion of the neuropsychological test and the VRS suggested clinical usefulness, given its ease of use and potential superiority over the neuropsychological test alone. However, this study failed to find any difference. This may be because of small numbers in the study or because there is no true difference.
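
    The fusion score described above is simply the cognitive screening score minus the MTA visual rating, and its diagnostic value is summarized by the area under the ROC curve. The sketch below uses entirely synthetic scores and a generic rank-based (Mann-Whitney) AUC, not the study's data or statistics package, to show the computation.

      import numpy as np
      from scipy.stats import rankdata

      def auc(scores, is_control):
          """Rank-based (Mann-Whitney) area under the ROC curve.

          Higher scores are assumed to indicate better cognition, i.e. controls.
          """
          ranks = rankdata(scores)
          n_pos = int(np.sum(is_control))
          n_neg = len(scores) - n_pos
          u = ranks[is_control].sum() - n_pos * (n_pos + 1) / 2.0
          return u / (n_pos * n_neg)

      # Hypothetical K-MMSE scores and MTA visual ratings (0-4) for AD patients and controls
      kmmse = np.array([21, 23, 19, 24, 28, 29, 27, 30])
      mta = np.array([3, 2, 3, 2, 1, 0, 1, 0])
      control = np.array([False, False, False, False, True, True, True, True])

      fusion = kmmse - mta                      # cognitive test score minus visual rating
      print("AUC, K-MMSE alone :", auc(kmmse, control))
      print("AUC, fusion score :", auc(fusion, control))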

  20. Does Visualization Matter? The Role of Interactive Data Visualization to Make Sense of Information

    Directory of Open Access Journals (Sweden)

    Arif Perdana

    2018-05-01

    Full Text Available As part of business analytics (BA) technologies, reporting and visualization play essential roles in mitigating users’ limitations (i.e., being inexperienced, having limited knowledge, and relying on simplified information). Reporting and visualization can potentially enhance users’ sense-making, thus permitting them to focus more on the information’s message rather than on numerical analysis. To better understand the role of reporting and visualization in a contextualized environment, we investigate the use of interactive data visualization (IDV) within accounting. We aim to understand whether IDV can help enhance non-professional investors’ ability to make sense of foundational financial statement analyses. This study conducted an experiment using a sample of 324 non-professional investors. Our findings indicate that non-professional investors who use IDV are more heuristically adept than non-professional investors who use non-IDV. These findings enrich the theoretical understanding of business analytics’ use in accounting decision making. The results of this study also suggest several practical courses of action, such as promoting wider use of IDV and making affordable IDV more broadly available, particularly for non-professional investors.

  1. Parallel real-time visualization system for large-scale simulation. Application to WSPEEDI

    International Nuclear Information System (INIS)

    Muramatsu, Kazuhiro; Otani, Takayuki; Kitabata, Hideyuki; Matsumoto, Hideki; Takei, Toshifumi; Doi, Shun

    2000-01-01

    The real-time visualization system, PATRAS (PArallel TRAcking Steering system) has been developed on parallel computing servers. The system performs almost all of the visualization tasks on a parallel computing server, and uses image data compression technique for efficient communication between the server and the client terminal. Therefore, the system realizes high performance concurrent visualization in an internet computing environment. The experience in applying PATRAS to WSPEEDI (Worldwide version of System for Prediction Environmental Emergency Dose Information) is reported. The application of PATRAS to WSPEEDI enables users to understand behaviours of radioactive tracers from different release points easily and quickly. (author)

  2. The measurement of enhancement in mathematical abilities as a result of joint cognitive trainings in numerical and visual-spatial skills: A preliminary study

    International Nuclear Information System (INIS)

    Agus, M; Mascia, M L; Fastame, M C; Melis, V; Pilloni, M C; Penna, M P

    2015-01-01

    A body of literature shows the significant role played by visual-spatial skills in the improvement of mathematical skills in primary school. The main goal of the current study was to investigate the impact of a combined visuo-spatial and mathematical training on the improvement of mathematical skills in 146 second graders of several schools located in Italy. Participants were presented with single pencil-and-paper visuo-spatial or mathematical trainings, computerised versions of the above-mentioned treatments, or a combined version of computer-assisted and pencil-and-paper visuo-spatial and mathematical trainings. Experimental groups were presented with training for 3 months, once a week. All children were treated collectively, in either computer-assisted or pencil-and-paper modalities. At pre- and post-test, all our participants were presented with a battery of objective tests assessing numerical and visuo-spatial abilities. Our results suggest the positive effect of different types of training for the empowerment of visuo-spatial and numerical abilities. Specifically, the combination of computerised and pencil-and-paper versions of visuo-spatial and mathematical trainings is more effective than the single execution of the software or of the pencil-and-paper treatment.

  3. The measurement of enhancement in mathematical abilities as a result of joint cognitive trainings in numerical and visual-spatial skills: A preliminary study

    Science.gov (United States)

    Agus, M.; Mascia, M. L.; Fastame, M. C.; Melis, V.; Pilloni, M. C.; Penna, M. P.

    2015-02-01

    A body of literature shows the significant role played by visual-spatial skills in the improvement of mathematical skills in primary school. The main goal of the current study was to investigate the impact of a combined visuo-spatial and mathematical training on the improvement of mathematical skills in 146 second graders of several schools located in Italy. Participants were presented with single pencil-and-paper visuo-spatial or mathematical trainings, computerised versions of the above-mentioned treatments, or a combined version of computer-assisted and pencil-and-paper visuo-spatial and mathematical trainings. Experimental groups were presented with training for 3 months, once a week. All children were treated collectively, in either computer-assisted or pencil-and-paper modalities. At pre- and post-test, all our participants were presented with a battery of objective tests assessing numerical and visuo-spatial abilities. Our results suggest the positive effect of different types of training for the empowerment of visuo-spatial and numerical abilities. Specifically, the combination of computerised and pencil-and-paper versions of visuo-spatial and mathematical trainings is more effective than the single execution of the software or of the pencil-and-paper treatment.

  4. Designing visual appearance using a structured surface

    DEFF Research Database (Denmark)

    Johansen, Villads Egede; Thamdrup, Lasse Højlund; Smitrup, Christian

    2015-01-01

    We present an approach for designing nanostructured surfaces with prescribed visual appearances, starting at design analysis and ending with a fabricated sample. The method is applied to a silicon wafer structured using deep ultraviolet lithography and dry etching and includes preliminary design followed by numerical and experimental verification. The approach comprises verifying all design and fabrication steps required to produce a desired appearance. We expect that the procedure in the future will yield structurally colored surfaces with appealing prescribed visual appearances.

  5. Automated numerical simulation of biological pattern formation based on visual feedback simulation framework.

    Science.gov (United States)

    Sun, Mingzhu; Xu, Hui; Zeng, Xingjuan; Zhao, Xin

    2017-01-01

    There are various fantastic biological phenomena in biological pattern formation. Mathematical modeling using reaction-diffusion partial differential equation systems is employed to study the mechanism of pattern formation. However, model parameter selection is both difficult and time consuming. In this paper, a visual feedback simulation framework is proposed to calculate the parameters of a mathematical model automatically based on the basic principle of feedback control. In the simulation framework, the simulation results are visualized, and the image features are extracted as the system feedback. Then, the unknown model parameters are obtained by comparing the image features of the simulation image and the target biological pattern. Considering two typical applications, the visual feedback simulation framework is applied to fulfill pattern formation simulations for vascular mesenchymal cells and lung development. In the simulation framework, the spot, stripe, labyrinthine patterns of vascular mesenchymal cells, the normal branching pattern and the branching pattern lacking side branching for lung branching are obtained in a finite number of iterations. The simulation results indicate that it is easy to achieve the simulation targets, especially when the simulation patterns are sensitive to the model parameters. Moreover, this simulation framework can expand to other types of biological pattern formation.
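
    The specific reaction-diffusion systems and feedback loop of the paper are not reproduced here, but the flavor of "simulate, extract an image feature, compare against a target" can be sketched with a generic Gray-Scott model. The parameter values and the spot-count feature below are standard illustrative choices, not the authors'.

      import numpy as np
      from scipy.ndimage import label

      def gray_scott(f, k, n=128, du=0.16, dv=0.08, steps=5000):
          """Integrate a Gray-Scott reaction-diffusion system on a periodic grid."""
          u = np.ones((n, n))
          v = np.zeros((n, n))
          u[n//2-8:n//2+8, n//2-8:n//2+8] = 0.5       # a small perturbation seeds patterns
          v[n//2-8:n//2+8, n//2-8:n//2+8] = 0.25
          for _ in range(steps):
              lap_u = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
                       np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4 * u)
              lap_v = (np.roll(v, 1, 0) + np.roll(v, -1, 0) +
                       np.roll(v, 1, 1) + np.roll(v, -1, 1) - 4 * v)
              uvv = u * v * v
              u += du * lap_u - uvv + f * (1 - u)
              v += dv * lap_v + uvv - (f + k) * v
          return v

      def spot_count(v, threshold=0.2):
          """A simple image feature: number of connected high-v regions."""
          _, n_spots = label(v > threshold)
          return n_spots

      # "Visual feedback" in miniature: compare the feature for two parameter guesses
      for f, k in [(0.035, 0.065), (0.030, 0.060)]:
          print(f"f={f}, k={k}: {spot_count(gray_scott(f, k))} spots")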

  6. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Year-end report FY15 Q4.

    Energy Technology Data Exchange (ETDEWEB)

    Moreland, Kenneth D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Sewell, Christopher [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Childs, Hank [Univ. of Oregon, Eugene, OR (United States); Ma, Kwan-Liu [Univ. of California, Davis, CA (United States); Geveci, Berk [Kitware, Inc., Clifton Park, NY (United States); Meredith, Jeremy [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-12-01

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  7. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Mid-year report FY17 Q2

    Energy Technology Data Exchange (ETDEWEB)

    Moreland, Kenneth D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pugmire, David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Rogers, David [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Childs, Hank [Univ. of Oregon, Eugene, OR (United States); Ma, Kwan-Liu [Univ. of California, Davis, CA (United States); Geveci, Berk [Kitware Inc., Clifton Park, NY (United States)

    2017-05-01

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  8. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem. Mid-year report FY16 Q2

    Energy Technology Data Exchange (ETDEWEB)

    Moreland, Kenneth D.; Sewell, Christopher (LANL); Childs, Hank (U of Oregon); Ma, Kwan-Liu (UC Davis); Geveci, Berk (Kitware); Meredith, Jeremy (ORNL)

    2016-05-01

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  9. Visualization program development using Java

    International Nuclear Information System (INIS)

    Sasaki, Akira; Suto, Keiko

    2002-03-01

    A method for developing visualization programs in Java with a graphical user interface (GUI) on the PC is discussed and applied to the visualization and analysis of 1D and 2D data from experiments and numerical simulations. Based on an investigation of programming techniques such as drawing graphics and event-driven programming, example codes are provided in which the GUI is implemented using the Abstract Window Toolkit (AWT). The marked advantage of Java comes from the inclusion of library routines for graphics and networking in its language specification, which enables ordinary scientific programmers to make interactive visualization a part of their simulation codes. Moreover, Java programs are machine independent at the source level. Object-oriented programming (OOP) methods used in Java will be useful for developing large scientific codes that include many modules, with better maintainability. (author)

  10. Numerical implementation and oceanographic application of the Gibbs thermodynamic potential of seawater

    Directory of Open Access Journals (Sweden)

    R. Feistel

    2005-01-01

    Full Text Available The 2003 Gibbs thermodynamic potential function represents a very accurate, compact, consistent and comprehensive formulation of equilibrium properties of seawater. It is expressed in the International Temperature Scale ITS-90 and is fully consistent with the current scientific pure water standard, IAPWS-95. Source code examples in FORTRAN, C++ and Visual Basic are presented for the numerical implementation of the potential function and its partial derivatives, as well as for potential temperature. A collection of thermodynamic formulas and relations is given for possible applications in oceanography, from density and chemical potential over entropy and potential density to mixing heat and entropy production. For colligative properties like vapour pressure, freezing points, and for a Gibbs potential of sea ice, the equations relating the Gibbs function of seawater to those of vapour and ice are presented.
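
    The record mentions source-code examples in FORTRAN, C++ and Visual Basic for evaluating the potential function and its partial derivatives. The Python sketch below shows only the generic calling structure of such an implementation: a Gibbs function g(S, T, p) with placeholder coefficients (not the coefficients of the 2003 formulation), differentiated numerically so that specific volume, density and entropy follow from v = dg/dp and s = -dg/dT.

      def gibbs(S, T, p):
          """Toy Gibbs function g(S, T, p) in J/kg.

          The polynomial uses made-up coefficients purely to illustrate structure;
          a real implementation evaluates the full coefficient tables of the 2003
          formulation in salinity, temperature and pressure.
          """
          return (-9000.0 - 150.0 * T - 0.5 * T**2
                  + 9.7e-4 * p - 5.0e-13 * p**2
                  + 75.0 * S + 0.2 * S * T)

      def density(S, T, p, dp=10.0):
          """rho = 1 / (dg/dp), with dg/dp from a central difference (p in Pa)."""
          dgdp = (gibbs(S, T, p + dp) - gibbs(S, T, p - dp)) / (2.0 * dp)
          return 1.0 / dgdp

      def entropy(S, T, p, dT=1e-3):
          """s = -dg/dT, with dg/dT from a central difference (T in deg C here)."""
          dgdT = (gibbs(S, T + dT, p) - gibbs(S, T - dT, p)) / (2.0 * dT)
          return -dgdT

      S, T, p = 35.0, 10.0, 2.0e7        # salinity (g/kg), temperature, pressure
      print("density (toy): %.1f kg/m^3" % density(S, T, p))
      print("entropy (toy): %.1f J/(kg K)" % entropy(S, T, p))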

  11. Closed-form approximation and numerical validation of the influence of van der Waals force on electrostatic cantilevers at nano-scale separations

    Energy Technology Data Exchange (ETDEWEB)

    Ramezani, Asghar [School of Mechanical Engineering, Sharif University of Technology, Tehran (Iran, Islamic Republic of); Alasty, Aria [Center of Excellence in Design, Robotics, and Automation (CEDRA), School of Mechanical Engineering, Sharif University of Technology, Tehran (Iran, Islamic Republic of); Akbari, Javad [Center of Excellence in Design, Robotics, and Automation (CEDRA), School of Mechanical Engineering, Sharif University of Technology, Tehran (Iran, Islamic Republic of)

    2008-01-09

    In this paper the two-point boundary value problem (BVP) of the cantilever deflection at nano-scale separations subjected to van der Waals and electrostatic forces is investigated using analytical and numerical methods to obtain the instability point of the beam. In the analytical treatment of the BVP, the nonlinear differential equation of the model is transformed into the integral form by using the Green's function of the cantilever beam. Then, closed-form solutions are obtained by assuming an appropriate shape function for the beam deflection to evaluate the integrals. In the numerical method, the BVP is solved with the MATLAB BVP solver, which implements a collocation method for obtaining the solution of the BVP. The large deformation theory is applied in numerical simulations to study the effect of the finite kinematics on the pull-in parameters of cantilevers. The centerline of the beam under the effect of electrostatic and van der Waals forces at small deflections and at the point of instability is obtained numerically. In computing the centerline of the beam, the axial displacement due to the transverse deformation of the beam is taken into account, using the inextensibility condition. The pull-in parameters of the beam are computed analytically and numerically under the effects of electrostatic and/or van der Waals forces. The detachment length and the minimum initial gap of freestanding cantilevers, which are the basic design parameters, are determined. The results of the analytical study are compared with the numerical solutions of the BVP. The proposed methods are validated by the results published in the literature.
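
    A two-point boundary value problem of this type can also be set up with SciPy's collocation solver, analogous in spirit to the MATLAB solver mentioned above. The sketch below solves a commonly used nondimensional cantilever equation, u'''' = alpha/(1-u)^3 + beta/(1-u)^2 (van der Waals plus electrostatic load), with clamped-free boundary conditions; the coefficient values are illustrative and the paper's closed-form treatment is not reproduced.

      import numpy as np
      from scipy.integrate import solve_bvp

      ALPHA = 0.2   # nondimensional van der Waals coefficient (illustrative)
      BETA = 0.5    # nondimensional electrostatic coefficient (illustrative)

      def rhs(x, y):
          """First-order system for u'''' = ALPHA/(1-u)^3 + BETA/(1-u)^2."""
          u, du, ddu, dddu = y
          load = ALPHA / (1.0 - u) ** 3 + BETA / (1.0 - u) ** 2
          return np.vstack([du, ddu, dddu, load])

      def bc(ya, yb):
          """Clamped at x=0 (u = u' = 0), free at x=1 (u'' = u''' = 0)."""
          return np.array([ya[0], ya[1], yb[2], yb[3]])

      x = np.linspace(0.0, 1.0, 50)
      y0 = np.zeros((4, x.size))               # undeflected beam as the initial guess
      sol = solve_bvp(rhs, bc, x, y0)

      print("solver status:", sol.status, "-", sol.message)
      print("nondimensional tip deflection: %.4f" % sol.sol(1.0)[0])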

  12. [Correlation between dental pulp demyelination degree and pain visual analogue scale scores data under acute and chronic pulpitis].

    Science.gov (United States)

    Korsantiia, N B; Davarashvili, X T; Gogiashvili, L E; Mamaladze, M T; Tsagareli, Z G; Melikadze, E B

    2013-05-01

    The aim of this study is to analyse the degree of demyelination of pulp nerve fibers and its relationship with the Visual Analogue Scale (VAS) score, which may serve as an objective criterion. Material and methods: Step I: electron micrographs of dental pulp samples, with particular attention to structural changes of myelin graded on a 3-score system, were obtained from 80 patients divided into 4 groups: 1) acute and 2) chronic pulpitis, each without and with accompanying systemic diseases, 20 patients in each group. Dental care was provided at the Kutaisi N1 Dental Clinic. Step II: a self-reported VAS was used to describe dental pain. All data were processed with SPSS version 10.0, including Spearman rank and Mann-Whitney coefficients, to examine the relationship between the degree of pulp demyelination and pain intensity on verbal, numerical and box scales. The data showed that damaged myelin, seen as focal decomposition of membranes and Schwann cell hypertrophy, corresponds to acute dental pain intensity as indexed by Spearman correlation with the numerical VAS, while myelin and axoplasm degeneration, as part of chronic gangrenous pulpitis, correlates directly with the VAS on verbal, numerical and behavioral rating scales. Overall, these morphological and subjective data, including psychomotor assessment of dental pain in pulpitis, may be used in dental practice for the evaluation of pain syndrome in the context of the patient's personal history.

  13. Visual Puzzles, Figure Weights, and Cancellation: Some Preliminary Hypotheses on the Functional and Neural Substrates of These Three New WAIS-IV Subtests

    Science.gov (United States)

    McCrea, Simon M.; Robinson, Thomas P.

    2011-01-01

    In this study, five consecutive patients with focal strokes and/or cortical excisions were examined with the Wechsler Adult Intelligence Scale and Wechsler Memory Scale—Fourth Editions along with a comprehensive battery of other neuropsychological tasks. All five of the lesions were large and typically involved frontal, temporal, and/or parietal lobes and were lateralized to one hemisphere. The clinical case method was used to determine the cognitive neuropsychological correlates of mental rotation (Visual Puzzles), Piagetian balance beam (Figure Weights), and visual search (Cancellation) tasks. The pattern of results on Visual Puzzles and Figure Weights suggested that both subtests involve predominately right frontoparietal networks involved in visual working memory. It appeared that Visual Puzzles could also critically rely on the integrity of the left temporoparietal junction. The left temporoparietal junction could be involved in temporal ordering and integration of local elements into a nonverbal gestalt. In contrast, the Figure Weights task appears to critically involve the right temporoparietal junction involved in numerical magnitude estimation. Cancellation was sensitive to left frontotemporal lesions and not right posterior parietal lesions typical of other visual search tasks. In addition, the Cancellation subtest was sensitive to verbal search strategies and perhaps object-based attention demands, thereby constituting a unique task in comparison with previous visual search tasks. PMID:22389807

  14. A rare case of haboob in Tehran: Observational and numerical study

    Science.gov (United States)

    Karami, S.; Ranjbar, A.; Mohebalhojeh, A. R.; Moradi, M.

    2017-03-01

    A great dust storm occurred in Tehran on 2 June 2014 and caused severe damage to property and loss of human life. From the visual evidence available, it can be regarded as a case of a haboob. As a lower-latitude phenomenon, its occurrence in Tehran was unprecedented in the last 50 years. This paper aims to present a detailed analysis of the weather conditions, the pathways by which dust particles were ingested by the haboob, as well as the impact of the urban boundary layer on the intensity and propagation of the dust storm. Using numerical simulations carried out with the WRF-Chem model and various observational techniques, the coupling of a low-level small-scale deformation field with a lower-tropospheric cold pool produced by precipitating mid-tropospheric clouds is identified as the main process involved in shaping this rare dust storm.

  15. Data Cube Visualization with Blender

    Science.gov (United States)

    Kent, Brian R.; Gárate, Matías

    2017-06-01

    With the increasing data acquisition rates from observational and computational astrophysics, new tools are needed to study and visualize data. We present a methodology for rendering 3D data cubes using the open-source 3D software Blender. By importing processed observations and numerical simulations through the Voxel Data format, we are able to use the Blender interface and Python API to create high-resolution animated visualizations. We review the methods for data import, animation, and camera movement, and present examples of this methodology. The 3D rendering of data cubes gives scientists the ability to create appealing displays that can be used for both scientific presentations and public outreach.

  16. Impact of visual repetition rate on intrinsic properties of low frequency fluctuations in the visual network.

    Directory of Open Access Journals (Sweden)

    Yi-Chia Li

    Full Text Available BACKGROUND: The visual processing network is one of the functional networks that have been reliably and consistently identified in human resting brains. In our work, we focused on this network and investigated the intrinsic properties of low-frequency (0.01-0.08 Hz) fluctuations (LFFs) during changes of visual stimuli. There were two main questions in this study, concerning the intrinsic properties of LFFs with respect to (1) interactions between visual stimuli and the resting state and (2) the impact of the repetition rate of visual stimuli. METHODOLOGY/PRINCIPAL FINDINGS: We analyzed scanning sessions that contained rest and visual stimuli at various repetition rates with a novel method. The method included three numerical approaches, involving ICA (Independent Component Analysis), fALFF (fractional Amplitude of Low Frequency Fluctuation), and Coherence, to respectively investigate the modulation of the visual network pattern, low-frequency fluctuation power, and interregional functional connectivity during changes of visual stimuli. We discovered that when the resting state was replaced by visual stimuli, more areas were involved in visual processing, and both stronger low-frequency fluctuations and higher interregional functional connectivity occurred in the visual network. With changes of the visual repetition rate, the number of areas involved in visual processing, the low-frequency fluctuation power, and the interregional functional connectivity in this network were also modulated. CONCLUSIONS/SIGNIFICANCE: Combining the results of the prior literature and our findings, the intrinsic properties of LFFs in the visual network are altered not only by modulations of endogenous factors (eye-open or eye-closed condition; alcohol administration) and disordered conditions (early blindness), but also by exogenous sensory stimuli (visual stimuli with various repetition rates). This demonstrates that the intrinsic properties of LFFs are valuable for representing physiological states of human brains.
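
    The fALFF measure used above has a compact definition: the summed spectral amplitude inside the low-frequency band (0.01-0.08 Hz) divided by the summed amplitude over the whole detectable frequency range. A minimal sketch for a single synthetic voxel time series, assuming a repetition time of 2 s, follows.

      import numpy as np

      def falff(ts, tr, band=(0.01, 0.08)):
          """Fractional amplitude of low-frequency fluctuations for one time series."""
          ts = ts - ts.mean()
          amp = np.abs(np.fft.rfft(ts))
          freqs = np.fft.rfftfreq(ts.size, d=tr)
          low = (freqs >= band[0]) & (freqs <= band[1])
          return amp[low].sum() / amp[freqs > 0].sum()

      # Synthetic voxel time series: a slow oscillation plus noise, TR = 2 s
      rng = np.random.default_rng(0)
      t = np.arange(240) * 2.0
      signal = np.sin(2 * np.pi * 0.05 * t) + 0.5 * rng.standard_normal(t.size)
      print(f"fALFF = {falff(signal, tr=2.0):.3f}")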

  17. Numerical Simulation of Airfoil Aerodynamic Penalties and Mechanisms in Heavy Rain

    Directory of Open Access Journals (Sweden)

    Zhenlong Wu

    2013-01-01

    Full Text Available Numerical simulations conducted on a transport-type airfoil, NACA 64-210, at a Reynolds number of 2.6×10^6 and a liquid water content (LWC) of 25 g/m^3 explore the aerodynamic penalties and mechanisms that affect airfoil performance in heavy rain conditions. Our simulation results agree well with the experimental data and show significant aerodynamic penalties for the airfoil in heavy rain. The maximum percentage decrease in CL reaches 13.2% and the maximum percentage increase in CD reaches 47.6%. Performance degradation in heavy rain at low angles of attack is emulated by a boundary-layer-tripping technique near the leading edge. A numerical flow visualization technique is used to show premature boundary-layer separation at high angles of attack and the particulate trajectories at various angles of attack. A mathematical model is established to qualitatively study the effect of the water film on the airfoil geometry. All of the above efforts indicate that two primary mechanisms account for the airfoil aerodynamic penalties. One is premature boundary-layer transition at low AOA and separation at high AOA. The other occurs at time scales consistent with the water film layer, which is thought to alter the airfoil geometry and effectively increase the mass.

  18. Visualizing water

    Science.gov (United States)

    Baart, F.; van Gils, A.; Hagenaars, G.; Donchyts, G.; Eisemann, E.; van Velzen, J. W.

    2016-12-01

    A compelling visualization is captivating, beautiful and narrative. Here we show how melding the skills of computer graphics, art, statistics, and environmental modeling can generate innovative, attractive and very informative visualizations. We focus on the topic of visualizing forecasts and measurements of water (water level, waves, currents, density, and salinity). For the field of computer graphics and arts, water is an important topic because it occurs in many natural scenes. For environmental modeling and statistics, water is an important topic because water is essential for transport, a healthy environment, fruitful agriculture, and a safe environment. The different disciplines take different approaches to visualizing water. In computer graphics, one focuses on making water look as realistic as possible. The focus on realistic perception (versus the focus on the physical balance pursued by environmental scientists) has resulted in fascinating renderings, as seen in recent games and movies. Visualization techniques for statistical results have benefited from advances in design and journalism, resulting in enthralling infographics. The field of environmental modeling has absorbed advances in contemporary cartography, as seen in the latest interactive data-driven maps. We systematically review the emerging types of water visualization designs. The examples that we analyze range from dynamically animated forecasts, interactive paintings, infographics, and modern cartography to web-based photorealistic rendering. By characterizing the intended audience, the design choices, the scales (e.g. time, space), and the explorability, we provide a set of guidelines and genres. The unique contributions of the different fields show how the innovations in the current state of the art of water visualization have benefited from inter-disciplinary collaborations.

  19. Wave propagation visualization in an experimental model for a control rod drive mechanism assembly

    International Nuclear Information System (INIS)

    Lee, Jung-Ryul; Jeong, Hyomi; Kong, Churl-Won

    2011-01-01

    Highlights: → We fabricate a full-scale mock-up of the control rod drive mechanism (CRDM) assembly in the upper reactor head of the nuclear power plant. → An ultrasonic propagation imaging method using a scanning laser ultrasonic generator is proposed to visualize and simulate ultrasonic wave propagation around the CRDM assembly. → The ultrasonic source location and frequency are simulated by changing the sensor location and the band pass-filtering zone. → The ultrasonic propagation patterns before and after cracks in the weld and nozzle of the CRDM assembly are analyzed. - Abstract: Nondestructive inspection techniques such as ultrasonic testing, eddy current testing, and visual testing are being developed to detect primary water stress corrosion cracks in control rod drive mechanism (CRDM) assemblies of nuclear power plants. A unit CRDM assembly consists of a reactor upper head including cladding, a penetration nozzle, and J-groove dissimilar metal welds with buttering. In this study, we fabricated a full-scale CRDM assembly mock-up. An ultrasonic propagation imaging (UPI) method using a scanning laser ultrasonic generator is proposed to visualize and simulate ultrasonic wave propagation around the thick and complex CRDM assembly. First, the proposed laser UPI system was validated for a simple aluminium plate by comparing the ultrasonic wave propagation movie (UWPM) obtained using the system with numerical simulation results reported in the literature. Lamb wave mode identification and damage detectability, depending on the ultrasonic frequency, were also included in the UWPM analysis. A CRDM assembly mock-up was fabricated in full-size and its vertical cross section was scanned using the laser UPI system to investigate the propagation characteristics of the longitudinal and Rayleigh waves in the complex structure. The ultrasonic source location and frequency were easily simulated by changing the sensor location and the band pass filtering zone

  20. Interactive visual exploration of a trillion particles

    KAUST Repository

    Schatz, Karsten

    2017-03-10

    We present a method for the interactive exploration of tera-scale particle data sets. Such data sets arise from molecular dynamics, particle-based fluid simulation, and astrophysics. Our visualization technique provides a focus+context view of the data that runs interactively on commodity hardware. The method is based on a hybrid multi-scale rendering architecture, which renders the context as a hierarchical density volume. Fine details in the focus are visualized using direct particle rendering. In addition, clusters like dark matter halos can be visualized as semi-transparent spheres enclosing the particles. Since the detail data is too large to be stored in main memory, our approach uses an out-of-core technique that streams data on demand. Our technique is designed to take advantage of a dual-GPU configuration, in which the workload is split between the GPUs based on the type of data. Structural features in the data are visually enhanced using advanced rendering and shading techniques. To allow users to easily identify interesting locations even in overviews, both the focus and context view use color tables to show data attributes on the respective scale. We demonstrate that our technique achieves interactive performance on a one trillion-particle data set from the DarkSky simulation.
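
    The split into "context as a density volume, focus as raw particles" can be mimicked at small scale with NumPy alone: bin all particles into a coarse density grid for the overview and keep only the particles inside a focus region for detailed rendering. The sketch below is a schematic of that data-preparation step, not of the paper's out-of-core, dual-GPU renderer, and the particle set is a small random stand-in.

      import numpy as np

      rng = np.random.default_rng(1)
      particles = rng.random((1_000_000, 3))     # toy stand-in for a huge particle set

      # Context: a coarse density volume over the full domain
      density, edges = np.histogramdd(particles, bins=(64, 64, 64),
                                      range=[(0, 1), (0, 1), (0, 1)])

      # Focus: the raw particles inside a small box of interest
      lo, hi = np.array([0.4, 0.4, 0.4]), np.array([0.45, 0.45, 0.45])
      in_focus = np.all((particles >= lo) & (particles < hi), axis=1)

      print("context volume shape:", density.shape)
      print("particles kept for direct rendering:", int(in_focus.sum()))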

  1. The numerical benchmark CB2-S, final evaluation

    International Nuclear Information System (INIS)

    Chrapciak, V.

    2002-01-01

    In this paper the final results of the numerical benchmark CB2-S are compared (activity, gamma and neutron sources, concentrations of important nuclides, and decay heat). The participants are: Vladimir Chrapciak (SCALE), Ludmila Markova (SCALE), Svetlana Zabrodskaja (SCALA), Pavel Mikolas (WIMS), Eva Tinkova (HELIOS) and Maria Manolova (SCALE). (Authors)

  2. Multi-scale approximation of Vlasov equation

    International Nuclear Information System (INIS)

    Mouton, A.

    2009-09-01

    One of the most important difficulties of numerical simulation of magnetized plasmas is the existence of multiple time and space scales, which can be very different. In order to produce good simulations of these multi-scale phenomena, it is recommended to develop models and numerical methods which are adapted to these problems. Nowadays, the two-scale convergence theory introduced by G. Nguetseng and G. Allaire is one of the tools which can be used to rigorously derive multi-scale limits and to obtain new limit models which can be discretized with a usual numerical method: this procedure is called a two-scale numerical method. The purpose of this thesis is to develop a two-scale semi-Lagrangian method and to apply it to a gyrokinetic Vlasov-like model in order to simulate a plasma subjected to a large external magnetic field. However, the physical phenomena we have to simulate are quite complex and there are many unanswered questions about the behaviour of a two-scale numerical method, especially when such a method is applied to a nonlinear model. In the first part, we develop a two-scale finite volume method and apply it to the weakly compressible 1D isentropic Euler equations. Even if this mathematical context is far from a Vlasov-like model, it is a relatively simple framework in which to study the behaviour of a two-scale numerical method applied to a nonlinear model. In the second part, we develop a two-scale semi-Lagrangian method for the two-scale model developed by E. Frenod, F. Salvarani and E. Sonnendrucker in order to simulate axisymmetric charged particle beams. Even if the studied physical phenomena are quite different from magnetic fusion experiments, the mathematical context of the one-dimensional paraxial Vlasov-Poisson model provides a simple setting for establishing the basis of a two-scale semi-Lagrangian method. In the third part, we use the two-scale convergence theory in order to improve M. Bostan's weak-* convergence results about the finite
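
    The two-scale machinery itself is beyond a short example, but its basic building block, a semi-Lagrangian advection step (follow the characteristics backwards and interpolate), can be sketched in a few lines for 1D constant-speed advection on a periodic grid. The grid size, speed and linear interpolation below are illustrative choices, not those of the thesis.

      import numpy as np

      def semi_lagrangian_step(f, x, a, dt, length):
          """Advance f_t + a f_x = 0 by one step: interpolate f at the departure points."""
          departure = (x - a * dt) % length      # trace characteristics backwards in time
          return np.interp(departure, x, f, period=length)

      # Periodic domain [0, 1) with a Gaussian bump advected at speed a = 1
      n, length, a, dt = 256, 1.0, 1.0, 0.01
      x = np.linspace(0.0, length, n, endpoint=False)
      f = np.exp(-200.0 * (x - 0.3) ** 2)

      for _ in range(50):                        # advect up to t = 0.5
          f = semi_lagrangian_step(f, x, a, dt, length)

      print("bump centre is now near x =", x[np.argmax(f)])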

  3. Creatine Phosphokinase and Visual Analogue Scale as Indicators for Muscle Injury in Untrained Bodybuilders

    Directory of Open Access Journals (Sweden)

    Suresh Shanmugam

    2015-06-01

    Full Text Available Background: Skeletal muscle is a vital tissue in the human body that enables breathing, walking and performing several sports activities. However, this muscle is persistently injured throughout every sports session. Some exercises, namely bodybuilding exercise, demand the occurrence of muscle injury in order to build a stronger muscle through an adaptation process. Importantly, every muscle injury should occur within a physiological range, which can be identified by several biomarkers as well as pain scales. The aim of this study was to identify changes in the level of creatine phosphokinase (CPK) and visual analogue scale (VAS) scores between pre- and post-training sessions and the correlation between these two indicators. Methods: This was an observational analytical cross-sectional comparison study which was conducted in October 2012; the subjects were adult untrained bodybuilders at the Jatinangor fitness center. The data were obtained by measuring serum CPK and marked VAS. The data were analyzed by t-test, Wilcoxon's test and Spearman's correlation. Results: Both CPK and VAS increased significantly, by 296 U/L and 19.9 mm respectively. There was a strong positive significant correlation between VAS and CPK (p=0.01, r = 0.711). Conclusion: The healthy untrained bodybuilders chosen in this study experienced a mild (<2000 U/L) muscle injury throughout the training sessions, with generally increased CPK levels and VAS measurements.

  4. The 'utility' of the visual analog scale in medical decision making and technology assessment. Is it an alternative to the time trade-off?

    NARCIS (Netherlands)

    Stiggelbout, A. M.; Eijkemans, M. J.; Kiebert, G. M.; Kievit, J.; Leer, J. W.; de Haes, H. J.

    1996-01-01

    Methods often used for the valuation of health states are the time trade-off (TTO) and the visual analog scale (VAS). The VAS is easier than the TTO and can be self-administered; however it usually leads to lower scores. In the literature a power transformation of group mean VAS scores to TTO scores
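
    One commonly cited form of such a power transformation maps a VAS valuation v in [0, 1] to a TTO-like utility via TTO = 1 - (1 - v)^b, which raises the typically lower VAS scores toward TTO values. The exponent in the sketch below is illustrative only; the abstract does not report a fitted value.

      def vas_to_tto(vas, exponent=1.6):
          """Power transformation of a 0-1 VAS valuation to a TTO-like utility.

          The exponent is a placeholder; studies fit it to their own group data.
          """
          return 1.0 - (1.0 - vas) ** exponent

      for v in (0.3, 0.5, 0.7, 0.9):
          print(f"VAS {v:.1f} -> TTO {vas_to_tto(v):.2f}")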

  5. Spontaneous ignition characteristics of coal in a large-scale furnace: An experimental and numerical investigation

    International Nuclear Information System (INIS)

    Wen, Hu; Yu, Zhijin; Deng, Jun; Zhai, Xiaowei

    2017-01-01

    Highlights: • Three coupled coal spontaneous combustion models based on various flow equations were constructed and compared. • The airflow behavior in loose coal should be defined as a Brinkman flow. • The self-heating of coal in a large-scale reactor was numerically reproduced. • The effect of heat dissipation conditions on temperature profiles of broken coal was presented. - Abstract: A comprehensive understanding of the spontaneous combustion characteristics of coal in various surroundings is necessary for developing reliable test platforms and predictive models. In this study, the characteristics of oxidation and self-heating in loose coal, combined with various gas flow equations, were investigated separately and used to simulate the experimental procedure of spontaneous combustion. The main focus was to investigate the effect of the thermal boundary on temperature profiles as well as on the spontaneous combustion period. The results showed that the numerical approach was validated by comparison with the test data. Furthermore, the model based upon the Brinkman equation showed a higher accuracy, which indicated that airflow behavior influences the balance of coal oxidation and heat dissipation, and thus impacts the temperature profiles of loose coal. The areas of the high-temperature zones would be evidently expanded and spontaneous ignition would be significantly accelerated if the thermal exchange between the coal and its surroundings decreased. Our results, especially for the field of engineering, have substantial implications for understanding and controlling coal spontaneous combustion disasters.

  6. On Numerical Stability in Large Scale Linear Algebraic Computations

    Czech Academy of Sciences Publication Activity Database

    Strakoš, Zdeněk; Liesen, J.

    2005-01-01

    Vol. 85, No. 5 (2005), p. 307-325, ISSN 0044-2267. R&D Projects: GA AV ČR 1ET400300415. Institutional research plan: CEZ:AV0Z10300504. Keywords: linear algebraic systems * eigenvalue problems * convergence * numerical stability * backward error * accuracy * Lanczos method * conjugate gradient method * GMRES method. Subject RIV: BA - General Mathematics. Impact factor: 0.351, year: 2005

  7. VarB Plus: An Integrated Tool for Visualization of Genome Variation Datasets

    KAUST Repository

    Hidayah, Lailatul

    2012-07-01

    Research on genomic sequences has been improving significantly as more advanced technology for sequencing has been developed. This opens enormous opportunities for sequence analysis. Various analytical tools have been built for purposes such as sequence assembly, read alignments, genome browsing, comparative genomics, and visualization. From the visualization perspective, there is an increasing trend towards use of large-scale computation. However, more than power is required to produce an informative image. This is a challenge that we address by providing several ways of representing biological data in order to advance the inference endeavors of biologists. This thesis focuses on visualization of variations found in genomic sequences. We develop several visualization functions and embed them in an existing variation visualization tool as extensions. The tool we improved is named VarB, hence the nomenclature for our enhancement is VarB Plus. To the best of our knowledge, besides VarB, there is no tool that provides the capability of dynamic visualization of genome variation datasets as well as statistical analysis. Dynamic visualization allows users to toggle different parameters on and off and see the results on the fly. The statistical analysis includes Fixation Index, Relative Variant Density, and Tajima’s D. Hence we focused our efforts on this tool. The scope of our work includes plots of per-base genome coverage, Principal Coordinate Analysis (PCoA), integration with a read alignment viewer named LookSeq, and visualization of geo-biological data. In addition to description of embedded functionalities, significance, and limitations, future improvements are discussed. The result is four extensions embedded successfully in the original tool, which is built on the Qt framework in C++. Hence it is portable to numerous platforms. Our extensions have shown acceptable execution time in a beta testing with various high-volume published datasets, as well as positive
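
    Of the statistics listed above, the fixation index has the most compact definition: FST = (HT - HS) / HT, where HT is the expected heterozygosity of the pooled population and HS the average expected heterozygosity within subpopulations. The sketch below computes it for one biallelic variant from made-up allele frequencies; it is a generic textbook estimator, not necessarily the exact estimator implemented in VarB Plus.

      import numpy as np

      def fst(subpop_freqs, subpop_sizes=None):
          """Fixation index for one biallelic site from subpopulation allele frequencies."""
          p = np.asarray(subpop_freqs, dtype=float)
          if subpop_sizes is None:
              weights = np.full(p.size, 1.0 / p.size)
          else:
              weights = np.asarray(subpop_sizes, dtype=float)
              weights = weights / weights.sum()
          hs = np.sum(weights * 2.0 * p * (1.0 - p))   # mean within-subpopulation heterozygosity
          p_bar = np.sum(weights * p)                  # pooled allele frequency
          ht = 2.0 * p_bar * (1.0 - p_bar)             # total expected heterozygosity
          return (ht - hs) / ht

      # Hypothetical alternate-allele frequencies in three populations
      print(f"FST = {fst([0.1, 0.5, 0.8]):.3f}")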

  8. Applications of visual soil evaluation

    DEFF Research Database (Denmark)

    Ball, Bruce C; Munkholm, Lars Juhl; Batey, Tom

    2013-01-01

    Working Group F “Visual Soil Examination and Evaluation” (VSEE) was formed over 30 years ago within the International Soil & Tillage Research Organisation (ISTRO) on the initiative of Tom Batey. The objectives of the Working Group are to stimulate interest in field methods of visual-tactile soil...... assessment, to encourage their wider use and to foster international cooperation. The previous main meeting of the group in 2005 at Peronne, France, brought together, for the first time, a group of soil scientists who had each developed a method to evaluate soil structure directly in the field (Boizard et al...... to the re-development of the Peerlkamp numeric method of assessment of soil structure into the Visual Evaluation of Soil Structure (VESS) spade test (Ball et al., 2007 and Guimarães et al., 2011). The meeting also recommended further cooperation between members of the Working Group. The evaluation...

  9. A computational theory of visual receptive fields.

    Science.gov (United States)

    Lindeberg, Tony

    2013-12-01

    A receptive field constitutes a region in the visual field where a visual cell or a visual operator responds to visual stimuli. This paper presents a theory for what types of receptive field profiles can be regarded as natural for an idealized vision system, given a set of structural requirements on the first stages of visual processing that reflect symmetry properties of the surrounding world. These symmetry properties include (i) covariance properties under scale changes, affine image deformations, and Galilean transformations of space-time as occur for real-world image data as well as specific requirements of (ii) temporal causality implying that the future cannot be accessed and (iii) a time-recursive updating mechanism of a limited temporal buffer of the past as is necessary for a genuine real-time system. Fundamental structural requirements are also imposed to ensure (iv) mutual consistency and a proper handling of internal representations at different spatial and temporal scales. It is shown how a set of families of idealized receptive field profiles can be derived by necessity regarding spatial, spatio-chromatic, and spatio-temporal receptive fields in terms of Gaussian kernels, Gaussian derivatives, or closely related operators. Such image filters have been successfully used as a basis for expressing a large number of visual operations in computer vision, regarding feature detection, feature classification, motion estimation, object recognition, spatio-temporal recognition, and shape estimation. Hence, the associated so-called scale-space theory constitutes a both theoretically well-founded and general framework for expressing visual operations. There are very close similarities between receptive field profiles predicted from this scale-space theory and receptive field profiles found by cell recordings in biological vision. Among the family of receptive field profiles derived by necessity from the assumptions, idealized models with very good qualitative
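
    Since the idealized receptive fields described above are expressed in terms of Gaussian kernels and Gaussian derivatives, a minimal sketch of generating such spatial receptive-field responses is given below. The random test image, the chosen scale and the derivative orders are illustrative assumptions, not values from the paper.

      # Minimal sketch: spatial receptive-field responses modelled as Gaussian
      # derivatives at a chosen scale, in the spirit of scale-space theory.
      # Image, scale and derivative orders are illustrative only.
      import numpy as np
      from scipy.ndimage import gaussian_filter

      rng = np.random.default_rng(0)
      image = rng.random((128, 128))           # stand-in for a visual stimulus

      sigma = 3.0                              # spatial scale (standard deviation)
      smoothed = gaussian_filter(image, sigma)                 # zeroth-order kernel
      dx = gaussian_filter(image, sigma, order=(0, 1))         # first derivative in x
      dy = gaussian_filter(image, sigma, order=(1, 0))         # first derivative in y
      dxx = gaussian_filter(image, sigma, order=(0, 2))        # second derivative in x

      gradient_magnitude = np.hypot(dx, dy)    # simple edge-like feature map
      print(smoothed.shape, gradient_magnitude.max())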

  10. Use of a medication quantification scale for comparison of pain medication usage in patients with complex regional pain syndrome (CRPS).

    Science.gov (United States)

    Gallizzi, Michael A; Khazai, Ravand S; Gagnon, Christine M; Bruehl, Stephen; Harden, R Norman

    2015-03-01

    To correlate the amount and types of pain medications prescribed to CRPS patients, using the Medication Quantification Scale, with patients' subjective pain levels. An international, multisite, retrospective review. University medical centers in the United States, Israel, Germany, and the Netherlands. A total of 89 subjects were enrolled from four countries: 27 from the United States, 20 from Germany, 18 from the Netherlands, and 24 from Israel. The main outcome measures were the Medication Quantification Scale III and a numerical analog pain scale. There was no statistically significant correlation between the Medication Quantification Scale and the visual analog scale for any site except for a moderate positive correlation at the German sites. Mean Medication Quantification Scale scores differed between the United States and Germany, the Netherlands, and Israel (a mean difference of 9.793 for the first of these comparisons). The scale appears applicable to CRPS patients and would be useful in further prospective studies of pain medication prescription practices in the CRPS population worldwide. Wiley Periodicals, Inc.

  11. Heat release effects on mixing scales of non-premixed turbulent wall-jets: A direct numerical simulation study

    International Nuclear Information System (INIS)

    Pouransari, Zeinab; Vervisch, Luc; Johansson, Arne V.

    2013-01-01

    Highlights: ► A non-premixed turbulent flame close to a solid surface is studied using DNS. ► Heat release effects delay transition and enlarge fluctuations of density and pressure. ► Fine-scale structures are damped and surface wrinkling is diminished due to heat release. ► Using semi-local scaling improves the collapse of turbulence statistics in the inner region. ► There are regions of the flame where considerable (up to 10%) premixed burning occurs. -- Abstract: The present study concerns the role of heat release effects on the characteristic mixing scales of turbulence in reacting wall-jet flows. Direct numerical simulations of exothermic reacting turbulent wall-jets are performed and compared to the isothermal reacting case. An evaluation of the heat-release effects on the structure of turbulence is given by examining the mixture fraction surface characteristics, diagnosing vortices, exploring the dissipation rates of the fuel and passive scalar concentrations, and, moreover, by illustrating probability density functions of reacting species and scatter plots of the local temperature against the mixture fraction. Primarily, heat release effects delay the transition, enlarge the fluctuation intensities of density and pressure, and also enhance the fluctuation level of the species concentrations. However, heat release has a damping effect on all velocity fluctuation intensities and on the Reynolds shear stress. A key result is that the fine-scale structures of turbulence are damped, the surface wrinkling is diminished and the vortices become larger due to heat-release effects. Taking into account the varying density by using semi-local scaling improves the collapse of the turbulence statistics in the inner region, but does not eliminate heat-release-induced differences in the outer region. Examining the two-dimensional premultiplied spanwise spectra of the streamwise velocity fluctuations indicates a shift in the positions of the outer peaks, associated with large

  12. Numerical investigation of room-temperature deformation behavior of a duplex type γTiAl alloy using a multi-scale modeling approach

    International Nuclear Information System (INIS)

    Kabir, M.R.; Chernova, L.; Bartsch, M.

    2010-01-01

    Room-temperature deformation of a niobium-rich TiAl alloy with duplex microstructure has been numerically investigated. The model links the microstructural features at micro- and meso-scale by the two-level (FE²) multi-scale approach. The deformation mechanisms of the considered phases were described in the micro-mechanical crystal-plasticity model. Initial material parameters for the model were taken from the literature and validated using tensile experiments at macro-scale. For the niobium-rich TiAl alloy further adaptation of the crystal plasticity parameters is proposed. Based on these model parameters, the influences of the grain orientation, grain size, and texture on the global mechanical behavior have been investigated. The contributions of crystal deformation modes (slips and dislocations in the phases) to the mechanical response are also analyzed. The results enable a quantitative prediction of relationships between microstructure and mechanical behavior on global and local scale, including an assessment of possible crack initiation sites. The model can be used for microstructure optimization to obtain better material properties.

  13. Visualising magnetic fields: numerical equation solvers in action

    CERN Document Server

    Beeteson, John Stuart

    2001-01-01

    Visualizing Magnetic Fields: Numerical Equation Solvers in Action provides a complete description of the theory behind a new technique, a detailed discussion of the ways of solving the equations (including a software visualization of the solution algorithms), the application software itself, and the full source code. Most importantly, there is a succinct, easy-to-follow description of each procedure in the code.The physicist Michael Faraday said that the study of magnetic lines of force was greatly influential in leading him to formulate many of those concepts that are now so fundamental to our modern world, proving to him their "great utility as well as fertility." Michael Faraday could only visualize these lines in his mind's eye and, even with modern computers to help us, it has been very expensive and time consuming to plot lines of force in magnetic fields
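
    The lines of force discussed here can be traced numerically by integrating the local field direction. The following is a generic, hedged sketch of that idea for a point dipole evaluated in a plane; the dipole model, seed point and integration settings are illustrative choices and not the book's algorithm.

      # Hedged sketch: trace a field line by integrating dr/ds = B/|B| for a
      # point-dipole field in the x-y plane. Generic tracer, not the book's code.
      import numpy as np
      from scipy.integrate import solve_ivp

      def dipole_field(x, y, m=1.0):
          r = np.hypot(x, y) + 1e-12
          # Dipole moment along y, evaluated in the x-y plane (up to constants).
          bx = m * 3.0 * x * y / r**5
          by = m * (2.0 * y * y - x * x) / r**5
          return bx, by

      def rhs(s, p):
          bx, by = dipole_field(p[0], p[1])
          norm = np.hypot(bx, by) + 1e-12
          return [bx / norm, by / norm]     # unit tangent along the field line

      start = [0.1, 1.0]                    # arbitrary seed point
      sol = solve_ivp(rhs, (0.0, 20.0), start, max_step=0.05)
      line_x, line_y = sol.y                # coordinates along the field line
      print(line_x[-1], line_y[-1])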

  14. Literature Review of Applying Visual Method to Understand Mathematics

    Directory of Open Access Journals (Sweden)

    Yu Xiaojuan

    2015-01-01

    Full Text Available As a new method to understand mathematics, visualization offers a new way of understanding mathematical principles and phenomena via image thinking and geometric explanation. It aims to deepen the understanding of the nature of concepts or phenomena and enhance the cognitive ability of learners. This paper collates and summarizes the application of this visual method in the understanding of mathematics. It also makes a literature review of the existing research, especially with a visual demonstration of Euler’s formula, introduces the application of this method in solving relevant mathematical problems, and points out the differences and similarities between the visualization method and the numerical-graphic combination method, as well as matters needing attention for its application.
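
    As a small, hedged illustration of the kind of visual demonstration mentioned above (not taken from the reviewed paper), Euler's formula e^{iθ} = cos θ + i sin θ can be visualized by plotting the complex exponential against the parametric unit circle; the use of matplotlib here is an assumption for illustration.

      # Illustrative sketch: visualizing Euler's formula as points on the unit
      # circle in the complex plane. Not code from the reviewed paper.
      import numpy as np
      import matplotlib.pyplot as plt

      theta = np.linspace(0.0, 2.0 * np.pi, 400)
      z = np.exp(1j * theta)                       # e^{i*theta}

      fig, ax = plt.subplots(figsize=(4, 4))
      ax.plot(z.real, z.imag, label="e^{i theta}")
      ax.plot(np.cos(theta), np.sin(theta), "--", label="(cos theta, sin theta)")
      ax.set_aspect("equal")
      ax.set_xlabel("Re")
      ax.set_ylabel("Im")
      ax.legend()
      plt.show()                                   # the two curves coincide exactly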

  15. Teaching numerical methods with IPython notebooks and inquiry-based learning

    KAUST Repository

    Ketcheson, David I.

    2014-01-01

    A course in numerical methods should teach both the mathematical theory of numerical analysis and the craft of implementing numerical algorithms. The IPython notebook provides a single medium in which mathematics, explanations, executable code, and visualizations can be combined, and with which the student can interact in order to learn both the theory and the craft of numerical methods. The use of notebooks also lends itself naturally to inquiry-based learning methods. I discuss the motivation and practice of teaching a course based on the use of IPython notebooks and inquiry-based learning, including some specific practical aspects. The discussion is based on my experience teaching a Masters-level course in numerical analysis at King Abdullah University of Science and Technology (KAUST), but is intended to be useful for those who teach at other levels or in industry.
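
    A hedged sketch of the kind of notebook cell such a course might contain is given below: implement a method, run it, and plot the result against the exact solution in the same document. The specific ODE, step count and plotting choices are illustrative, not material from the course described.

      # Notebook-style sketch: forward Euler for y' = -2y, y(0) = 1, compared
      # with the exact solution. ODE and parameters are illustrative only.
      import numpy as np
      import matplotlib.pyplot as plt

      def forward_euler(f, y0, t0, t1, n_steps):
          t = np.linspace(t0, t1, n_steps + 1)
          y = np.empty(n_steps + 1)
          y[0] = y0
          h = (t1 - t0) / n_steps
          for k in range(n_steps):
              y[k + 1] = y[k] + h * f(t[k], y[k])   # Euler update
          return t, y

      t, y = forward_euler(lambda t, y: -2.0 * y, y0=1.0, t0=0.0, t1=2.0, n_steps=20)
      plt.plot(t, y, "o-", label="forward Euler")
      plt.plot(t, np.exp(-2.0 * t), label="exact")
      plt.xlabel("t"); plt.ylabel("y"); plt.legend(); plt.show()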

  16. Large-scale remapping of visual cortex is absent in adult humans with macular degeneration

    NARCIS (Netherlands)

    Baseler, Heidi A.; Gouws, Andre; Haak, Koen V.; Racey, Christopher; Crossland, Michael D.; Tufail, Adnan; Rubin, Gary S.; Cornelissen, Frans W.; Morland, Antony B.

    The occipital lobe contains retinotopic representations of the visual field. The representation of the central retina in early visual areas (V1-3) is found at the occipital pole. When the central retina is lesioned in both eyes by macular degeneration, this region of visual cortex at the occipital

  17. Correlation between the pain numeric rating scale and the 12-item WHO Disability Assessment Schedule 2.0 in patients with musculoskeletal pain.

    Science.gov (United States)

    Saltychev, Mikhail; Bärlund, Esa; Laimi, Katri

    2018-03-01

    The aim of this study was to assess the correlation between pain severity measured on a numeric rating scale and restrictions of functioning measured with the WHO Disability Assessment Schedule (WHODAS 2.0). This was a cross-sectional study of 1207 patients with musculoskeletal pain conditions. Correlation was assessed using Spearman's and Pearson's tests. Although all the Spearman's rank correlations between WHODAS 2.0 items and pain severity were statistically significant, they were mostly weak, with only a few moderate associations for 'S2 household responsibilities', 'S8 washing', 'S9 dressing', and 'S12 day-to-day work'. The correlation between the WHODAS 2.0 total score and pain severity was also moderate: 0.41 [95% confidence interval (CI): 0.36-0.45] for average pain and 0.42 (95% CI: 0.37-0.46) for worst pain. The correlation between the WHODAS 2.0 total score and pain level was also assessed using Pearson's product-moment correlation, yielding figures similar to the Spearman's correlations (0.42). Overall, the correlation between pain severity measured by the numeric rating scale and the level of functioning measured by WHODAS 2.0 was weak to moderate, with slightly stronger associations in the physical domains of functioning.
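
    For readers unfamiliar with the two correlation measures compared above, the following is a minimal sketch of how they are typically computed with SciPy; the arrays are synthetic placeholders, not data from the study.

      # Minimal sketch: Spearman and Pearson correlations between pain scores
      # and a disability total score. Synthetic placeholder data only.
      import numpy as np
      from scipy.stats import spearmanr, pearsonr

      rng = np.random.default_rng(1)
      pain_nrs = rng.integers(0, 11, size=200)                 # 0-10 numeric rating scale
      whodas_total = 2.5 * pain_nrs + rng.normal(0, 8, 200)    # noisy linear relation

      rho, p_rho = spearmanr(pain_nrs, whodas_total)
      r, p_r = pearsonr(pain_nrs, whodas_total)
      print(f"Spearman rho = {rho:.2f} (p = {p_rho:.3g})")
      print(f"Pearson r    = {r:.2f} (p = {p_r:.3g})")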

  18. A deterministic combination of numerical and physical models for coastal waves

    DEFF Research Database (Denmark)

    Zhang, Haiwen

    2006-01-01

    Numerical and physical modelling are the two main tools available for predicting the influence of water waves on coastlines and structures placed in the near-shore environment. Numerical models can cover large areas at the correct scale, but are limited in their ability to capture strong nonlinearities, wave breaking, splash, mixing, and other such complicated physics. Physical models naturally include the real physics (at the model scale), but are limited by the physical size of the facility and must contend with the fact that different physical effects scale differently. An integrated use of numerical and physical modelling hence provides an attractive alternative to the use of either tool on its own. The goal of this project has been to develop a deterministically combined numerical/physical model where the physical wave tank is enclosed in a much larger computational domain, and the two...

  19. Multi-scale experimental and numerical study of the structure and the dynamics of water confined in clay minerals

    International Nuclear Information System (INIS)

    Guillaud, Emmanuel Bertrand

    2017-01-01

    Clays are complex minerals with a multi-scale porosity and a remarkable ability to swell under a humid atmosphere. These materials have many applications in catalysis, waste management, the construction industry... However, the properties of confined water are still not fully understood, due in particular to the complexity of water itself. The aim of this work is, using mainly molecular simulations and vibrational spectroscopy, to understand the structure and the dynamics of water confined in clay minerals. To evaluate the accuracy of numerical models in describing water confined in clay minerals, and to understand the origin of its structural and dynamical properties, a large part of the work was devoted to the building blocks of clays: pure bulk water, water at the surface of a solid, and salt water. To this end, the viscoelastic properties of water from the deeply supercooled regime to the boiling temperature were investigated using classical molecular dynamics. The evolution of the friction properties of water on a prototypical solid surface was also analyzed, and the accuracy of ab initio approaches and empirical salt models was studied. In a second part, these results were confronted with the properties of water confined in clay minerals at low and room temperature, studied both experimentally and numerically. The experimental work consisted mostly in extensive far- and mid-infrared absorption spectrometry measurements, whereas the numerical work mainly consisted in empirical molecular dynamics simulations. In particular, the existence of confinement- or temperature-induced phase transitions of confined water was investigated. (author)

  20. A numerical formulation and algorithm for limit and shakedown analysis of large-scale elastoplastic structures

    Science.gov (United States)

    Peng, Heng; Liu, Yinghua; Chen, Haofeng

    2018-05-01

    In this paper, a novel direct method called the stress compensation method (SCM) is proposed for limit and shakedown analysis of large-scale elastoplastic structures. Without needing to solve a specific mathematical programming problem, the SCM is a two-level iterative procedure based on a sequence of linear elastic finite element solutions in which the global stiffness matrix is decomposed only once. In the inner loop, the statically admissible residual stress field for shakedown analysis is constructed. In the outer loop, a series of decreasing load multipliers is updated to approach the shakedown limit multiplier by using an efficient and robust iteration control technique, where the static shakedown theorem is adopted. Three numerical examples with up to about 140,000 finite element nodes confirm the applicability and efficiency of this method for two-dimensional and three-dimensional elastoplastic structures, with detailed discussions on the convergence and the accuracy of the proposed algorithm.
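
    The key computational point above is that the global stiffness matrix is factorized once and reused for every linear solve of the two-level iteration. The snippet below is only a heavily hedged structural sketch of that idea: the matrix, load vector, "compensation" update and multiplier rule are placeholders and do not reproduce the authors' SCM.

      # Heavily hedged structural sketch: one sparse factorization reused across
      # the repeated linear solves of a two-level iteration. K, f_ext and the
      # update/convergence rules are placeholders, not the SCM itself.
      import numpy as np
      import scipy.sparse as sp
      from scipy.sparse.linalg import splu

      n = 1000
      K = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc")
      f_ext = np.ones(n)

      lu = splu(K)                      # global stiffness factorized only once
      load_multiplier = 2.0             # initial (over-)estimate
      for outer in range(50):           # outer loop: update the load multiplier
          residual_force = np.zeros(n)
          for inner in range(20):       # inner loop: build a residual field
              u = lu.solve(load_multiplier * f_ext - residual_force)
              correction = 1e-3 * u     # placeholder "compensation" update
              residual_force += correction
              if np.linalg.norm(correction) < 1e-8:
                  break
          load_multiplier *= 0.98       # placeholder decreasing-multiplier rule
      print("final multiplier (illustrative only):", round(load_multiplier, 3))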

  1. Ultrascale Visualization of Climate Data

    Science.gov (United States)

    Williams, Dean N.; Bremer, Timo; Doutriaux, Charles; Patchett, John; Williams, Sean; Shipman, Galen; Miller, Ross; Pugmire, David R.; Smith, Brian; Steed, Chad

    2013-01-01

    Fueled by exponential increases in the computational and storage capabilities of high-performance computing platforms, climate simulations are evolving toward higher numerical fidelity, complexity, volume, and dimensionality. These technological breakthroughs are coming at a time of exponential growth in climate data, with estimates of hundreds of exabytes by 2020. To meet the challenges and exploit the opportunities that such explosive growth affords, a consortium of four national laboratories, two universities, a government agency, and two private companies formed to explore the next wave in climate science. Working in close collaboration with domain experts, the Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT) project aims to provide high-level solutions to a variety of climate data analysis and visualization problems.

  2. Visualizing Dynamic Data with Maps.

    Science.gov (United States)

    Mashima, Daisuke; Kobourov, Stephen G; Hu, Yifan

    2012-09-01

    Maps offer a familiar way to present geographic data (continents, countries), and additional information (topography, geology), can be displayed with the help of contours and heat-map overlays. In this paper, we consider visualizing large-scale dynamic relational data by taking advantage of the geographic map metaphor. We describe a map-based visualization system which uses animation to convey dynamics in large data sets, and which aims to preserve the viewer's mental map while also offering readable views at all times. Our system is fully functional and has been used to visualize user traffic on the Internet radio station last.fm, as well as TV-viewing patterns from an IPTV service. All map images in this paper are available in high-resolution at [1] as are several movies illustrating the dynamic visualization.

  3. A review of numerical techniques approaching microstructures of crystalline rocks

    Science.gov (United States)

    Zhang, Yahui; Wong, Louis Ngai Yuen

    2018-06-01

    The macro-mechanical behavior of crystalline rocks, including strength, deformability and failure pattern, is dominantly influenced by their grain-scale structures. Numerical techniques are commonly used to assist in understanding the complicated mechanisms from a microscopic perspective. Each numerical method has its respective strengths and limitations. This review paper elucidates how numerical techniques take geometrical aspects of the grain into consideration. Four categories of numerical methods are examined: particle-based methods, block-based methods, grain-based methods, and node-based methods. Focusing on grain-scale characteristics, specific relevant issues, including the increasing complexity of micro-structure, deformation and breakage of model elements, and the fracturing and fragmentation process, are described in more detail. The intrinsic capabilities and limitations of different numerical approaches in accounting for the micro-mechanics of crystalline rocks and their phenomenological mechanical behavior are thereby explicitly presented.

  4. Neurophysiology of visual aura in migraine

    International Nuclear Information System (INIS)

    Shibata, Koichi

    2007-01-01

    Visual processing in migraine has been targeted because the visual symptoms that are commonly associated with attacks, either in the form of aura or other more subtle symptoms, indicate that the visual pathways are involved in migrainous pathophysiology. The visual aura of the migraine attack has been explained by the cortical spreading depression (CSD) of Leao, a neuroelectric event beginning in the occipital cortex and propagating into contiguous brain regions. Clinical observations suggest that hyperexcitability occurs not only during the attack, typically in the form of photophobia, but also between attacks. Numerous human neuroimaging, neurophysiological and psychophysical studies have identified differences in cortical visual processing in migraine. The possibility of imaging the typical visual aura with BOLD functional MRI has revealed multiple neurovascular events in the occipital cortex within a single attack that closely resemble CSD. As transient synchronized neuronal excitation precedes CSD, changes in cortical excitability underlie the migraine attack. Independent evidence for altered neuronal excitability in migraineurs between attacks emerges from visual evoked potentials (VEPs), transcranial magnetic stimulation (TMS), recordings of cortical potentials and psychophysics. Recently, both TMS and psychophysical studies measuring visual performance in migraineurs have used measures that presumably probe the primary visual cortex (V1) and the visual association cortex. Our VEP and blink reflex study showed that migraine patients exhibiting allodynia might show central sensitization of brainstem trigeminal neurons and had contrast modulation dysfunction during cortical visual processing in V1 and the visual association cortex in between attacks. In the pathophysiology of migraine, these neurophysiological and psychophysical studies indicate that abnormal visual and trigeminal hyperexcitability might persist between migraine attacks. The influence of migraine on cortical

  5. Attention Gating in Short-Term Visual Memory.

    Science.gov (United States)

    Reeves, Adam; Sperling, George

    1986-01-01

    An experiment is conducted showing that an attention shift to a stream of numerals presented in rapid serial visual presentation mode produces not a total loss, but a systematic distortion of order. An attention gating model (AGM) is developed from a more general attention model. (Author/LMO)

  6. The Two Visual Systems Hypothesis: new challenges and insights from visual form agnosic patient DF

    Directory of Open Access Journals (Sweden)

    Robert Leslie Whitwell

    2014-12-01

    Full Text Available Patient DF, who developed visual form agnosia following carbon monoxide poisoning, is still able to use vision to adjust the configuration of her grasping hand to the geometry of a goal object. This striking dissociation between perception and action in DF provided a key piece of evidence for the formulation of Goodale and Milner's Two Visual Systems Hypothesis (TVSH). According to the TVSH, the ventral stream plays a critical role in constructing our visual percepts, whereas the dorsal stream mediates the visual control of action, such as visually guided grasping. In this review, we discuss recent studies of DF that provide new insights into the functional organization of the dorsal and ventral streams. We confirm recent evidence that DF has dorsal as well as ventral brain damage – and that her dorsal-stream lesions and surrounding atrophy have increased in size since her first published brain scan. We argue that the damage to DF's dorsal stream explains her deficits in directing actions at targets in the periphery. We then focus on DF's ability to accurately adjust her in-flight hand aperture to changes in the width of goal objects (grip scaling) whose dimensions she cannot explicitly report. An examination of several studies of DF's grip scaling under natural conditions reveals a modest though significant deficit. Importantly, however, she continues to show a robust dissociation between form vision for perception and form vision for action. We also review recent studies that explore the role of online visual feedback and terminal haptic feedback in the programming and control of her grasping. These studies make it clear that DF is no more reliant on visual or haptic feedback than are neurologically-intact individuals. In short, we argue that her ability to grasp objects depends on visual feedforward processing carried out by visuomotor networks in her dorsal stream that function in much the same way as they do in neurologically

  7. Innovative Visualizations Shed Light on Avian Nocturnal Migration.

    Directory of Open Access Journals (Sweden)

    Judy Shamoun-Baranes

    Full Text Available Globally, billions of flying animals undergo seasonal migrations, many of which occur at night. The temporal and spatial scales at which migrations occur and our inability to directly observe these nocturnal movements makes monitoring and characterizing this critical period in migratory animals' life cycles difficult. Remote sensing, therefore, has played an important role in our understanding of large-scale nocturnal bird migrations. Weather surveillance radar networks in Europe and North America have great potential for long-term low-cost monitoring of bird migration at scales that have previously been impossible to achieve. Such long-term monitoring, however, poses a number of challenges for the ornithological and ecological communities: how does one take advantage of this vast data resource, integrate information across multiple sensors and large spatial and temporal scales, and visually represent the data for interpretation and dissemination, considering the dynamic nature of migration? We assembled an interdisciplinary team of ecologists, meteorologists, computer scientists, and graphic designers to develop two different flow visualizations, which are interactive and open source, in order to create novel representations of broad-front nocturnal bird migration to address a primary impediment to long-term, large-scale nocturnal migration monitoring. We have applied these visualization techniques to mass bird migration events recorded by two different weather surveillance radar networks covering regions in Europe and North America. These applications show the flexibility and portability of such an approach. The visualizations provide an intuitive representation of the scale and dynamics of these complex systems, are easily accessible for a broad interest group, and are biologically insightful. Additionally, they facilitate fundamental ecological research, conservation, mitigation of human-wildlife conflicts, improvement of meteorological

  8. Numerical and experimental analysis of the flow around a two-element wingsail at Reynolds number 0.53 × 10⁶

    International Nuclear Information System (INIS)

    Fiumara, Alessandro; Gourdain, Nicolas; Chapin, Vincent; Senter, Julien; Bury, Yannick

    2016-01-01

    Highlights: • An experimental campaign including pressure measurements, oil visualizations and PIV was performed on a scale wingsail. • Unsteady RANS simulations were carried out on the wingsail scale model reproducing also the wind tunnel domain. • The geometrical slot parameters affect the circulation around the main element influencing the pressure distribution on it. - Abstract: The rigid wingsail is a propulsion system, utilized in sailing competitions in order to enhance the yacht performance in both upwind and downwind conditions. Nevertheless, this new rig is sensitive to upstream flow variations, making its steering difficult. This issue suggests the need to perform a study on wingsail aerodynamics. Thus this paper reports some investigations done to better understand the flow physics around a scaled model of an America’s Cup wingsail, based on a two-element AC72 profile. First a wind tunnel test campaign was carried out to generate a database for aerodynamic phenomena analyses and CFD validation. Unsteady RANS simulations were performed to predict and validate the flow characteristics on the wingsail, in the wind tunnel test conditions. The wind tunnel domain was fully modeled, in order to take into account the facility confinement effects. Numerical simulations in freestream and wind tunnel conditions were then compared with experimental data. This analysis shows the necessity to consider the wind tunnel walls when experimental and numerical data are compared. Numerical simulations correctly reproduce the flow field for low-to-moderate flow angles. However, discrepancies on the pressure distribution increase when the boundary layer starts to separate from the wingsail. In this regard, the flow generated by the slot between both elements of the wingsail is of paramount importance. This slot flow is analyzed in details through PIV measurements and numerical simulations. While the numerical simulation correctly predicts the jet flow itself, it only

  9. Is Fourier analysis performed by the visual system or by the visual investigator.

    Science.gov (United States)

    Ochs, A L

    1979-01-01

    A numerical Fourier transform was made of the pincushion grid illusion and the spectral components orthogonal to the illusory lines were isolated. Their inverse transform creates a picture of the illusion. The spatial-frequency response of cortical, simple receptive field neurons similarly filters the grid. A complete set of these neurons thus approximates a two-dimensional Fourier analyzer. One cannot conclude, however, that the brain actually uses frequency-domain information to interpret visual images.
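
    The analysis described above, a numerical Fourier transform of a grid image, isolation of spectral components near a chosen orientation, and inverse transformation, can be sketched as follows. The synthetic grid and the 45-degree orientation band are illustrative assumptions, not the original stimulus or filter.

      # Hedged sketch: 2D FFT of a grid-like image, keep only components within
      # a narrow orientation band, then inverse-transform. Illustrative only.
      import numpy as np

      n = 256
      y, x = np.mgrid[0:n, 0:n]
      # Crude grid-like test image (thin bright horizontal and vertical lines).
      grid = ((x % 16 == 0) | (y % 16 == 0)).astype(float)

      F = np.fft.fftshift(np.fft.fft2(grid))

      fy, fx = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
      angle = np.arctan2(fy, fx)

      target = np.deg2rad(45.0)      # orientation of interest (oblique components)
      width = np.deg2rad(10.0)
      # Angular distance modulo pi, so conjugate-symmetric components are kept too.
      d = np.abs(((angle - target) + np.pi / 2) % np.pi - np.pi / 2)
      mask = d < width

      filtered = np.fft.ifft2(np.fft.ifftshift(F * mask)).real
      print(filtered.min(), filtered.max())   # image rebuilt from ~45-degree components only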

  10. Formation Number Of Laminar Vortex Rings. Numerical Simulations

    International Nuclear Information System (INIS)

    Rosenfeld, M.; Rambod, E.; Gharib, M.

    1998-01-01

    The formation time scale of axisymmetric vortex rings is studied numerically for relatively long discharge times. Experimental findings on the existence and universality of a formation time scale, referred to as the formation number, are confirmed. The formation number is indicative of the time at which a vortex ring acquires its maximal circulation. For vortex rings generated by impulsive motion of a piston, the formation number was found experimentally to be approximately 4. Numerical extension of the experimental study to thick shear layers indicates that the scaled circulation of the pinched-off vortex is relatively insensitive to the details of the formation process, such as the velocity program, velocity profile or vortex generator geometry. In contrast, the formation number does depend on the velocity profile

  11. Towards Online Visualization and Interactive Monitoring of Real-Time CFD Simulations on Commodity Hardware

    Directory of Open Access Journals (Sweden)

    Nils Koliha

    2015-09-01

    Full Text Available Real-time rendering in the realm of computational fluid dynamics (CFD) in particular and scientific high performance computing (HPC) in general is a comparatively young field of research, as the complexity of most problems with practical relevance is too high for a real-time numerical simulation. However, recent advances in HPC and the development of very efficient numerical techniques allow running first optimized numerical simulations in or near real time, which in turn requires integrated and optimized visualization techniques that do not affect performance. In this contribution, we present concepts, implementation details and several application examples of a minimally-invasive, efficient visualization tool for the interactive monitoring of 2D and 3D turbulent flow simulations on commodity hardware. The numerical simulations are conducted with ELBE, an efficient lattice Boltzmann environment based on NVIDIA CUDA (Compute Unified Device Architecture), which provides optimized numerical kernels for 2D and 3D computational fluid dynamics with fluid-structure interactions and turbulence.
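
    The general pattern of such minimally-invasive monitoring, refreshing an on-screen view of the evolving field every few time steps instead of writing files for post-processing, can be sketched as below. The toy heat-equation kernel and the matplotlib-based display are illustrative assumptions, not the ELBE/CUDA implementation.

      # Hedged minimal sketch of in-situ monitoring: a toy 2D explicit
      # heat-equation solver whose field is redrawn on screen every few steps.
      import numpy as np
      import matplotlib.pyplot as plt

      n, alpha, steps, redraw_every = 128, 0.24, 2000, 50
      u = np.zeros((n, n))
      u[n // 4: n // 2, n // 4: n // 2] = 1.0        # initial hot patch

      plt.ion()                                       # interactive, non-blocking drawing
      im = plt.imshow(u, vmin=0.0, vmax=1.0)
      for step in range(steps):
          lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
                 np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4.0 * u)
          u += alpha * lap                            # explicit update (stable for alpha < 0.25)
          if step % redraw_every == 0:
              im.set_data(u)                          # refresh the image in place
              plt.pause(0.001)                        # let the GUI process events
      plt.ioff(); plt.show()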

  12. Large scale Direct Numerical Simulation of premixed turbulent jet flames at high Reynolds number

    Science.gov (United States)

    Attili, Antonio; Luca, Stefano; Lo Schiavo, Ermanno; Bisetti, Fabrizio; Creta, Francesco

    2016-11-01

    A set of direct numerical simulations of turbulent premixed jet flames at different Reynolds and Karlovitz numbers is presented. The simulations feature finite rate chemistry with 16 species and 73 reactions and up to 22 billion grid points. The jet consists of a methane/air mixture with equivalence ratio ϕ = 0.7 and temperature varying between 500 and 800 K. The temperature and species concentrations in the coflow correspond to the equilibrium state of the burnt mixture. All the simulations are performed at 4 atm. The flame length, normalized by the jet width, decreases significantly as the Reynolds number increases. This is consistent with an increase of the turbulent flame speed due to the increased integral scale of turbulence. This behavior is typical of flames in the thin-reaction zone regime, which are affected by turbulent transport in the preheat layer. Fractal dimension and topology of the flame surface, statistics of temperature gradients, and flame structure are investigated and the dependence of these quantities on the Reynolds number is assessed.

  13. Numerical simulation of "an American haboob"

    OpenAIRE

    Vukovic, A.; Vujadinovic, M.; Pejanovic, G.; Andric, J.; Kumjian, M. R.; Djurdjevic, V.; Dacic, M.; Prasad, A. K.; El-Askary, H. M.; Paris, B. C.; Petkovic, S.; Nickovic, S.; Sprigg, W. A.

    2014-01-01

    A dust storm of fearful proportions hit Phoenix in the early evening hours of 5 July 2011. This storm, an American haboob, was predicted hours in advance because numerical, land–atmosphere modeling, computing power and remote sensing of dust events have improved greatly over the past decade. High-resolution numerical models are required for accurate simulation of the small scales of the haboob process, with high velocity surface winds produced by strong convection and severe...

  14. Theoretical and numerical study of highly anisotropic turbulent flows

    NARCIS (Netherlands)

    Biferale, L.; Daumont, I.; Lanotte, A.; Toschi, F.

    2004-01-01

    We present a detailed numerical study of anisotropic statistical fluctuations in stationary, homogeneous turbulent flows. We address both problems of intermittency in anisotropic sectors, and the relative importance of isotropic and anisotropic fluctuations at different scales on a direct numerical

  15. Reciprocal Engagement Between a Scientist and Visual Displays

    Science.gov (United States)

    Nolasco, Michelle Maria

    In this study the focus of investigation was the reciprocal engagement between a professional scientist and the visual displays with which he interacted. Visual displays are considered inextricable from everyday scientific endeavors and their interpretation requires a "back-and-forthness" between the viewers and the objects being viewed. The query that drove this study was: How does a scientist engage with visual displays during the explanation of his understanding of extremely small biological objects? The conceptual framework was based in embodiment where the scientist's talk, gesture, and body position were observed and microanalyzed. The data consisted of open-ended interviews that positioned the scientist to interact with visual displays when he explained the structure and function of different sub-cellular features. Upon microanalyzing the scientist's talk, gesture, and body position during his interactions with two different visual displays, four themes were uncovered: Naming, Layering, Categorizing, and Scaling . Naming occurred when the scientist added markings to a pre-existing, hand-drawn visual display. The markings had meaning as stand-alone label and iconic symbols. Also, the markings transformed the pre-existing visual display, which resulted in its function as a new visual object. Layering occurred when the scientist gestured over images so that his gestures aligned with one or more of the image's features, but did not touch the actual visual display. Categorizing occurred when the scientist used contrasting categories, e.g. straight vs. not straight, to explain his understanding about different characteristics that the small biological objects held. Scaling occurred when the scientist used gesture to resize an image's features so that they fit his bodily scale. Three main points were drawn from this study. First, the scientist employed a variety of embodied strategies—coordinated talk, gesture, and body position—when he explained the structure

  16. Large-scale visualization projects for teaching software engineering.

    Science.gov (United States)

    Müller, Christoph; Reina, Guido; Burch, Michael; Weiskopf, Daniel

    2012-01-01

    The University of Stuttgart's software engineering major complements the traditional computer science major with more practice-oriented education. Two-semester software projects in various application areas offered by the university's different computer science institutes are a successful building block in the curriculum. With this realistic, complex project setting, students experience the practice of software engineering, including software development processes, technologies, and soft skills. In particular, visualization-based projects are popular with students. Such projects offer them the opportunity to gain profound knowledge that would hardly be possible with only regular lectures and homework assignments.

  17. Is orbital volume associated with eyeball and visual cortex volume in humans?

    Science.gov (United States)

    Pearce, Eiluned; Bridge, Holly

    2013-01-01

    In humans orbital volume increases linearly with absolute latitude. Scaling across mammals between visual system components suggests that these larger orbits should translate into larger eyes and visual cortices in high latitude humans. Larger eyes at high latitudes may be required to maintain adequate visual acuity and enhance visual sensitivity under lower light levels. To test the assumption that orbital volume can accurately index eyeball and visual cortex volumes specifically in humans. Structural Magnetic Resonance Imaging (MRI) techniques are employed to measure eye and orbit (n = 88) and brain and visual cortex (n = 99) volumes in living humans. Facial dimensions and foramen magnum area (a proxy for body mass) were also measured. A significant positive linear relationship was found between (i) orbital and eyeball volumes, (ii) eyeball and visual cortex grey matter volumes and (iii) different visual cortical areas, independently of overall brain volume. In humans the components of the visual system scale from orbit to eye to visual cortex volume independently of overall brain size. These findings indicate that orbit volume can index eye and visual cortex volume in humans, suggesting that larger high latitude orbits do translate into larger visual cortices.

  18. A study of the effect of a visual arts-based program on the scores of Jefferson Scale for Physician Empathy.

    Science.gov (United States)

    Yang, Kuang-Tao; Yang, Jen-Hung

    2013-10-25

    The effect of visual arts interventions on the development of empathy has not been quantitatively investigated. A study was conducted on the effect of a visual arts-based program on scores on the Jefferson Scale for Physician Empathy (JSPE). A total of 110 participants, comprising clerks (n = 92) and first-year postgraduate residents (PGY1s) (n = 18) participating in the program, were recruited into this study. The 4-hr program covered the subjects of learning to interpret paintings, interpreting paintings relating to medicine, illness and human suffering, the related topics of humanitarianism and the other humanities fields, and values and meaning. The JSPE was completed at the beginning (pretest) and the end (posttest) of the program. There was no significant difference between the pretest and posttest JSPE scores. The average pretest score was lower in the subgroup of PGY1s than in the subgroup of clerks (p = 0.0358). An increased but not statistically significant mean posttest JSPE score was noted for the subgroup of PGY1s. Neither the females nor the males had higher posttest JSPE scores than pretest scores. Although using a structured visual arts-based program as an intervention may be useful for enhancing medical students' empathy, our results failed to show a positive effect on the JSPE scores for a group of clerks and PGY1s. This suggests that further experimental studies are needed if quantitative evaluation of the effectiveness of visual arts-based programs on empathy is to be investigated.

  19. Color extended visual cryptography using error diffusion.

    Science.gov (United States)

    Kang, InKoo; Arce, Gonzalo R; Lee, Heung-Kyu

    2011-01-01

    Color visual cryptography (VC) encrypts a color secret message into n color halftone image shares. Previous methods in the literature show good results for black and white or gray scale VC schemes, however, they are not sufficient to be applied directly to color shares due to different color structures. Some methods for color visual cryptography are not satisfactory in terms of producing either meaningless shares or meaningful shares with low visual quality, leading to suspicion of encryption. This paper introduces the concept of visual information pixel (VIP) synchronization and error diffusion to attain a color visual cryptography encryption method that produces meaningful color shares with high visual quality. VIP synchronization retains the positions of pixels carrying visual information of original images throughout the color channels and error diffusion generates shares pleasant to human eyes. Comparisons with previous approaches show the superior performance of the new method.
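
    Error diffusion is the halftoning step referred to above. As a minimal sketch of the classical Floyd-Steinberg variant applied to a single grayscale channel, and not the VIP-synchronized color construction of the paper, consider the following; the test ramp image is a placeholder.

      # Minimal sketch of Floyd-Steinberg error diffusion for one grayscale
      # channel with values in [0, 1]. Generic halftoning, not the paper's method.
      import numpy as np

      def floyd_steinberg(image):
          img = image.astype(float).copy()
          h, w = img.shape
          out = np.zeros_like(img)
          for y in range(h):
              for x in range(w):
                  old = img[y, x]
                  new = 1.0 if old >= 0.5 else 0.0   # threshold to black/white
                  out[y, x] = new
                  err = old - new                    # diffuse the quantization error
                  if x + 1 < w:               img[y, x + 1]     += err * 7 / 16
                  if y + 1 < h and x > 0:     img[y + 1, x - 1] += err * 3 / 16
                  if y + 1 < h:               img[y + 1, x]     += err * 5 / 16
                  if y + 1 < h and x + 1 < w: img[y + 1, x + 1] += err * 1 / 16
          return out

      gradient = np.tile(np.linspace(0, 1, 256), (64, 1))   # smooth test ramp
      halftone = floyd_steinberg(gradient)
      print(halftone.mean())    # average intensity is roughly preserved (~0.5)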

  20. Illustrative visualization of 3D city models

    Science.gov (United States)

    Doellner, Juergen; Buchholz, Henrik; Nienhaus, Marc; Kirsch, Florian

    2005-03-01

    This paper presents an illustrative visualization technique that provides expressive representations of large-scale 3D city models, inspired by the tradition of artistic and cartographic visualizations typically found in bird"s-eye view and panoramic maps. We define a collection of city model components and a real-time multi-pass rendering algorithm that achieves comprehensible, abstract 3D city model depictions based on edge enhancement, color-based and shadow-based depth cues, and procedural facade texturing. Illustrative visualization provides an effective visual interface to urban spatial information and associated thematic information complementing visual interfaces based on the Virtual Reality paradigm, offering a huge potential for graphics design. Primary application areas include city and landscape planning, cartoon worlds in computer games, and tourist information systems.

  1. Integrating numerical computation into the undergraduate education physics curriculum using spreadsheet excel

    Science.gov (United States)

    Fauzi, Ahmad

    2017-11-01

    Numerical computation has many pedagogical advantages: it develops analytical and problem-solving skills, helps students learn through visualization, and enhances physics education. Unfortunately, numerical computation is not taught to undergraduate physics education students in Indonesia. Incorporating numerical computation into the undergraduate physics education curriculum presents many challenges. The main challenges are a dense curriculum that makes it difficult to add a new numerical computation course, and the fact that most students have no programming experience. In this research, we used a case study to examine how to integrate numerical computation into the undergraduate physics education curriculum. The participants of this research were 54 fourth-semester students of the physics education department. We conclude that numerical computation can be integrated into the undergraduate physics education curriculum using the spreadsheet Excel combined with another course. The results of this research complement studies on how to integrate numerical computation into physics learning using the spreadsheet Excel.

  2. Lagrangian numerical methods for ocean biogeochemical simulations

    Science.gov (United States)

    Paparella, Francesco; Popolizio, Marina

    2018-05-01

    We propose two closely-related Lagrangian numerical methods for the simulation of physical processes involving advection, reaction and diffusion. The methods are intended to be used in settings where the flow is nearly incompressible and the Péclet numbers are so high that resolving all the scales of motion is unfeasible. This is commonplace in ocean flows. Our methods consist in augmenting the method of characteristics, which is suitable for advection-reaction problems, with couplings among nearby particles, producing fluxes that mimic diffusion, or unresolved small-scale transport. The methods conserve mass, obey the maximum principle, and allow to tune the strength of the diffusive terms down to zero, while avoiding unwanted numerical dissipation effects.
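
    A heavily simplified one-dimensional sketch of the general idea described above, advect particles along characteristics, then let nearby particles exchange part of the quantity they carry so that the ensemble mimics diffusion, is given below. The velocity field, the exchange rule and every parameter are illustrative assumptions, not the schemes proposed in the paper.

      # Simplified 1-D sketch: advection along characteristics plus a
      # mass-conserving pairwise exchange between close neighbours that mimics
      # diffusion. Not the schemes from the paper; parameters are illustrative.
      import numpy as np

      rng = np.random.default_rng(0)
      n, dt, steps = 400, 0.01, 200
      x = rng.uniform(0.0, 1.0, n)              # particle positions (periodic domain)
      c = np.exp(-((x - 0.5) / 0.05) ** 2)      # carried concentration (a blob)

      def velocity(x, t):
          return 0.2 * np.sin(2 * np.pi * x)    # toy 1-D flow field

      for k in range(steps):
          # Advection: move each particle along its characteristic.
          x = (x + dt * velocity(x, k * dt)) % 1.0
          # "Diffusion": exchange between adjacent particles that are close enough.
          order = np.argsort(x)
          left, right = order[:-1], order[1:]
          close = (x[right] - x[left]) < 0.01
          flux = 0.1 * (c[right] - c[left]) * close
          c[left] += flux                       # quantity moves down the local gradient,
          c[right] -= flux                      # and the total sum of c is conserved

      print("total carried quantity:", c.sum())  # conserved up to round-off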

  3. Valuing Treatments for Parkinson Disease Incorporating Process Utility: Performance of Best-Worst Scaling, Time Trade-Off, and Visual Analogue Scales.

    Science.gov (United States)

    Weernink, Marieke G M; Groothuis-Oudshoorn, Catharina G M; IJzerman, Maarten J; van Til, Janine A

    2016-01-01

    The objective of this study was to compare treatment profiles including both health outcomes and process characteristics in Parkinson disease using best-worst scaling (BWS), time trade-off (TTO), and visual analogue scales (VAS). From the model comprising of seven attributes with three levels, six unique profiles were selected representing process-related factors and health outcomes in Parkinson disease. A Web-based survey (N = 613) was conducted in a general population to estimate process-related utilities using profile-based BWS (case 2), multiprofile-based BWS (case 3), TTO, and VAS. The rank order of the six profiles was compared, convergent validity among methods was assessed, and individual analysis focused on the differentiation between pairs of profiles with methods used. The aggregated health-state utilities for the six treatment profiles were highly comparable for all methods and no rank reversals were identified. On the individual level, the convergent validity between all methods was strong; however, respondents differentiated less in the utility of closely related treatment profiles with a VAS or TTO than with BWS. For TTO and VAS, this resulted in nonsignificant differences in mean utilities for closely related treatment profiles. This study suggests that all methods are equally able to measure process-related utility when the aim is to estimate the overall value of treatments. On an individual level, such as in shared decision making, BWS allows for better prioritization of treatment alternatives, especially if they are closely related. The decision-making problem and the need for explicit trade-off between attributes should determine the choice for a method. Copyright © 2016. Published by Elsevier Inc.

  4. Sensory information in local field potentials and spikes from visual and auditory cortices: time scales and frequency bands.

    Science.gov (United States)

    Belitski, Andrei; Panzeri, Stefano; Magri, Cesare; Logothetis, Nikos K; Kayser, Christoph

    2010-12-01

    Studies analyzing sensory cortical processing or trying to decode brain activity often rely on a combination of different electrophysiological signals, such as local field potentials (LFPs) and spiking activity. Understanding the relation between these signals and sensory stimuli, and between different components of these signals, is hence of great interest. We here provide an analysis of LFPs and spiking activity recorded from visual and auditory cortex during stimulation with natural stimuli. In particular, we focus on the time scales on which different components of these signals are informative about the stimulus, and on the dependencies between different components of these signals. Addressing the first question, we find that the stimulus information carried by the energy of high frequency bands (above 50 Hz) is scale dependent, and is larger when the energy is averaged over several hundreds of milliseconds. Indeed, combined analysis of signal reliability and information revealed that the energy of slow LFP fluctuations is well related to the stimulus even when considering individual or few cycles, while the energy of fast LFP oscillations carries information only when averaged over many cycles. Addressing the second question, we find that stimulus information in different LFP bands, and in different LFP bands and spiking activity, is largely independent regardless of time scale or sensory system. Taken together, these findings suggest that different LFP bands represent dynamic natural stimuli on distinct time scales and together provide a potentially rich source of information for sensory processing or decoding brain activity.

  5. Visualizing Structure and Dynamics of Disaccharide Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Matthews, J. F.; Beckham, G. T.; Himmel, M. E.; Crowley, M. F.

    2012-01-01

    We examine the effect of several solvent models on the conformational properties and dynamics of disaccharides such as cellobiose and lactose. Significant variations in the timescales of large-scale conformational transformations are observed. Molecular dynamics simulation provides enough detail to enable insight through visualization of multidimensional data sets. We present a new way to visualize conformational space for disaccharides with Ramachandran plots.
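
    A Ramachandran-style view of conformational space is, in essence, a two-dimensional histogram of the two glycosidic torsion angles sampled along a trajectory. A minimal hedged sketch follows; the angle arrays are synthetic placeholders, not output of the simulations described.

      # Minimal sketch of a Ramachandran-style plot: 2-D histogram of the two
      # glycosidic torsion angles (phi, psi). Synthetic placeholder angles only.
      import numpy as np
      import matplotlib.pyplot as plt

      rng = np.random.default_rng(2)
      phi = np.concatenate([rng.normal(-80, 15, 5000), rng.normal(60, 20, 1000)])
      psi = np.concatenate([rng.normal(-120, 20, 5000), rng.normal(100, 25, 1000)])

      plt.hist2d(phi, psi, bins=90, range=[[-180, 180], [-180, 180]], cmap="viridis")
      plt.xlabel("phi (degrees)"); plt.ylabel("psi (degrees)")
      plt.colorbar(label="counts")
      plt.show()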

  6. A novel numerical approach for workspace determination of parallel mechanisms

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Yiqun; Niu, Junchuan; Liu, Zhihui; Zhang, Fuliang [Shandong University, Shandong (China)

    2017-06-15

    In this paper, a novel numerical approach is proposed for workspace determination of parallel mechanisms. Compared with classical numerical approaches, the presented approach discretizes both the location and the orientation of the mechanism simultaneously, rather than only one of the two. This makes the presented numerical approach applicable to determining almost all types of workspaces, whereas traditional numerical approaches are only applicable to determining the constant orientation workspace and the orientation workspace. The presented approach and its steps for determining the inclusive orientation workspace and the total orientation workspace are described in detail. A lower-mobility parallel mechanism and a six-degrees-of-freedom Stewart platform are taken as examples; the workspaces of these mechanisms are estimated and visualized by the proposed numerical approach. Furthermore, the efficiency of the presented approach is discussed. The examples show that the presented approach is applicable to determining the inclusive orientation workspace and total orientation workspace of parallel mechanisms with high efficiency.
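
    A hedged planar sketch of the discretization idea, sampling position and orientation together and keeping the poses whose inverse kinematics satisfy the joint limits, is given below for a 3-RPR mechanism. The base/platform geometry and the leg-length limits are assumptions chosen for illustration, not the mechanisms studied in the paper.

      # Hedged sketch: discretize position AND orientation for a planar 3-RPR
      # parallel mechanism. A pose (x, y, phi) is in the workspace if all three
      # prismatic leg lengths stay within limits. Geometry and limits are assumed.
      import numpy as np

      base = np.array([[0.0, 0.0], [2.0, 0.0], [1.0, 1.8]])        # fixed anchors
      platform = np.array([[-0.2, -0.1], [0.2, -0.1], [0.0, 0.2]]) # platform-frame anchors
      l_min, l_max = 0.5, 1.6                                      # prismatic joint limits

      def in_workspace(x, y, phi):
          c, s = np.cos(phi), np.sin(phi)
          R = np.array([[c, -s], [s, c]])
          tips = platform @ R.T + np.array([x, y])   # platform anchors in world frame
          lengths = np.linalg.norm(tips - base, axis=1)
          return np.all((lengths >= l_min) & (lengths <= l_max))

      xs = np.linspace(-1.0, 3.0, 60)
      ys = np.linspace(-1.0, 3.0, 60)
      phis = np.linspace(-np.pi / 6, np.pi / 6, 13)  # orientation is discretized as well

      # Total orientation workspace: positions reachable for EVERY sampled orientation.
      total = [(x, y) for x in xs for y in ys
               if all(in_workspace(x, y, phi) for phi in phis)]
      print(len(total), "grid positions in the (illustrative) total orientation workspace")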

  7. Visualizing uncertainties with the North Wyke Farm Platform Data Sets

    Science.gov (United States)

    Harris, Paul; Brunsdon, Chris; Lee, Michael

    2016-04-01

    The North Wyke Farm Platform (NWFP) is a systems-based, farm-scale experiment with the aim of addressing agricultural productivity and ecosystem responses to different management practices. The 63 ha site captures the spatial and/or temporal data necessary to develop a better understanding of the dynamic processes and underlying mechanisms that can be used to model how agricultural grassland systems respond to different management inputs. Via cattle beef and sheep production, the underlying principle is to manage each of three farmlets (each consisting of five hydrologically-isolated sub-catchments) in three contrasting ways: (i) improvement of permanent pasture through use of mineral fertilizers; (ii) improvement through use of legumes; and (iii) improvement through innovation. The connectivity between the timing and intensity of the different management operations, together with the transport of nutrients and potential pollutants from the NWFP is evaluated using numerous inter-linked data collection exercises. In this paper, we introduce some of the visualization opportunities that are possible with this rich data resource, and methods of analysis that might be applied to it, in particular with respect to data and model uncertainty operating across both temporal and spatial dimensions. An important component of the NWFP experiment is the representation of trade-offs with respect to: (a) economic profits, (b) environmental concerns, and (c) societal benefits, under the umbrella of sustainable intensification. Various visualizations exist to display such trade-offs and here we demonstrate ways to tailor them to relay key uncertainties and assessments of risk; and also consider how these visualizations can be honed to suit different audiences.

  8. A Scalable Cyberinfrastructure for Interactive Visualization of Terascale Microscopy Data.

    Science.gov (United States)

    Venkat, A; Christensen, C; Gyulassy, A; Summa, B; Federer, F; Angelucci, A; Pascucci, V

    2016-08-01

    The goal of the recently emerged field of connectomics is to generate a wiring diagram of the brain at different scales. To identify brain circuitry, neuroscientists use specialized microscopes to perform multichannel imaging of labeled neurons at a very high resolution. CLARITY tissue clearing allows imaging labeled circuits through entire tissue blocks, without the need for tissue sectioning and section-to-section alignment. Imaging the large and complex non-human primate brain with sufficient resolution to identify and disambiguate between axons, in particular, produces massive data, creating great computational challenges to the study of neural circuits. Researchers require novel software capabilities for compiling, stitching, and visualizing large imagery. In this work, we detail the image acquisition process and a hierarchical streaming platform, ViSUS, that enables interactive visualization of these massive multi-volume datasets using a standard desktop computer. The ViSUS visualization framework has previously been shown to be suitable for 3D combustion simulation, climate simulation and visualization of large scale panoramic images. The platform is organized around a hierarchical cache oblivious data layout, called the IDX file format, which enables interactive visualization and exploration in ViSUS, scaling to the largest 3D images. In this paper we showcase the VISUS framework used in an interactive setting with the microscopy data.

  9. Centrifuge in space fluid flow visualization experiment

    Science.gov (United States)

    Arnold, William A.; Wilcox, William R.; Regel, Liya L.; Dunbar, Bonnie J.

    1993-01-01

    A prototype flow visualization system is constructed to examine buoyancy driven flows during centrifugation in space. An axial density gradient is formed by imposing a thermal gradient between the two ends of the test cell. Numerical computations for this geometry showed that the Prandtl number plays a limited part in determining the flow.

  10. Full-scale experimental and numerical study about structural behaviour of a thin-walled cold-formed steel building affected by ground settlements due to land subsidence

    Directory of Open Access Journals (Sweden)

    J. A. Ortiz

    2015-11-01

    Full Text Available Land subsidence due to ground water withdrawal is a problem in many places around the world (Poland, 1984). This causes differential ground settlements that affect masonry structures, because these structural materials do not exhibit an adequate performance beyond a certain level of angular distortion. This work presents the experimental and numerical results of a study on the performance of a full-scale thin-walled cold-formed steel building affected by differential ground settlements due to land subsidence. The experimental stage consisted in the construction of a test building to be subjected to differential settlements in the laboratory. The numerical stage consisted in performing a numerical non-linear static pull-down analysis simulating the differential ground settlements of the test building. The results show that the structural performance of the tested building was very suitable in terms of ductility.

  11. Numerical modelling of so-called secondary ultrasonic echoes

    International Nuclear Information System (INIS)

    Langenberg, K.J.; Fellinger, P.; Hofmann, C.

    1994-01-01

    The formation of secondary ultrasonic echoes is discussed for a particularly simple testing situation. This discussion is based upon the intuitive visualization of elastic wave propagation as obtained with the numerical EFIT code (Elastodynamic Finite Integration Technique). The resulting travel times for the secondary echoes contain well-defined limits, as they originate from the simple model of grazing-incidence plane longitudinal wave mode conversion. (orig.) [de

  12. Large Data Visualization with Open-Source Tools

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Visualization and post-processing of large data have become increasingly challenging and require more and more tools to support the diversity of data to process. In this seminar, we will present a suite of open-source tools supported and developed by Kitware to perform large-scale data visualization and analysis. In particular, we will present ParaView, an open-source tool for parallel visualization of massive datasets, the Visualization Toolkit (VTK), an open-source toolkit for scientific visualization, and Tangelohub, a suite of tools for large data analytics. About the speaker Julien Jomier is directing Kitware's European subsidiary in Lyon, France, where he focuses on European business development. Julien works on a variety of projects in the areas of parallel and distributed computing, mobile computing, image processing, and visualization. He is one of the developers of the Insight Toolkit (ITK), the Visualization Toolkit (VTK), and ParaView. Julien is also leading the CDash project, an open-source co...
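
    For readers unfamiliar with VTK, the minimal Python sketch below shows the standard source-mapper-actor-renderer pipeline that tools such as ParaView build upon; the sphere source is a stand-in for a real data reader, and the vtk Python package is assumed to be installed.

```python
import vtk

source = vtk.vtkSphereSource()                # stand-in for a real data reader
source.SetThetaResolution(32)
source.SetPhiResolution(32)

mapper = vtk.vtkPolyDataMapper()              # maps geometry to graphics primitives
mapper.SetInputConnection(source.GetOutputPort())

actor = vtk.vtkActor()                        # places the mapped data in the scene
actor.SetMapper(mapper)

renderer = vtk.vtkRenderer()
renderer.AddActor(actor)
renderer.SetBackground(0.1, 0.1, 0.2)

window = vtk.vtkRenderWindow()
window.AddRenderer(renderer)
window.SetSize(640, 480)

interactor = vtk.vtkRenderWindowInteractor()  # Start() opens an interactive window
interactor.SetRenderWindow(window)

window.Render()
interactor.Start()
```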

  13. Estimating surface fluxes using eddy covariance and numerical ogive optimization

    DEFF Research Database (Denmark)

    Sievers, J.; Papakyriakou, T.; Larsen, Søren Ejling

    2015-01-01

    Estimating representative surface fluxes using eddy covariance leads invariably to questions concerning inclusion or exclusion of low-frequency flux contributions. For studies where fluxes are linked to local physical parameters and up-scaled through numerical modelling efforts, low-frequency contributions ...

  14. Visual cues for data mining

    Science.gov (United States)

    Rogowitz, Bernice E.; Rabenhorst, David A.; Gerth, John A.; Kalin, Edward B.

    1996-04-01

    This paper describes a set of visual techniques, based on principles of human perception and cognition, which can help users analyze and develop intuitions about tabular data. Collections of tabular data are widely available, including, for example, multivariate time series data, customer satisfaction data, stock market performance data, multivariate profiles of companies and individuals, and scientific measurements. In our approach, we show how visual cues can help users perform a number of data mining tasks, including identifying correlations and interaction effects, finding clusters and understanding the semantics of cluster membership, identifying anomalies and outliers, and discovering multivariate relationships among variables. These cues are derived from psychological studies on perceptual organization, visual search, perceptual scaling, and color perception. These visual techniques are presented as a complement to the statistical and algorithmic methods more commonly associated with these tasks, and provide an interactive interface for the human analyst.

  15. Direct numerical simulation of rotating fluid flow in a closed cylinder

    DEFF Research Database (Denmark)

    Sørensen, Jens Nørkær; Christensen, Erik Adler

    1995-01-01

    , is validated against experimental visualizations of both transient and stable periodic flows. The complexity of the flow problem is illuminated numerically by injecting flow tracers into the flow domain and following their evolution in time. The vortex dynamics appears as stretching, folding and squeezing...

  16. Macular pigment and visual performance in glare: benefits for photostress recovery, disability glare, and visual discomfort.

    Science.gov (United States)

    Stringham, James M; Garcia, Paul V; Smith, Peter A; McLin, Leon N; Foutch, Brian K

    2011-09-22

    One theory of macular pigment's (MP) presence in the fovea is to improve visual performance in glare. This study sought to determine the effect of MP level on three aspects of visual performance in glare: photostress recovery, disability glare, and visual discomfort. Twenty-six subjects participated in the study. Spatial profiles of MP optical density were assessed with heterochromatic flicker photometry. Glare was delivered via high-bright-white LEDs. For the disability glare and photostress recovery portions of the experiment, the visual task consisted of correct identification of a 1° Gabor patch's orientation. Visual discomfort during the glare presentation was assessed with a visual discomfort rating scale. Pupil diameter was monitored with an infrared (IR) camera. MP level correlated significantly with all the outcome measures. Higher MP optical densities (MPODs) resulted in faster photostress recovery times, better disability glare contrast thresholds, and less visual discomfort (P = 0.002). Smaller pupil diameter during glare presentation significantly correlated with higher visual discomfort ratings (P = 0.037). MP correlates with three aspects of visual performance in glare. Unlike previous studies of MP and glare, the present study used free-viewing conditions, in which effects of iris pigmentation and pupil size could be accounted for. The effects described, therefore, can be extended more confidently to real-world, practical visual performance benefits. Greater iris constriction resulted (paradoxically) in greater visual discomfort. This finding may be attributable to the neurobiologic mechanism that mediates the pain elicited by light.

  17. The Mass-Ratio Distribution of Visual Binary Stars

    NARCIS (Netherlands)

    Hogeveen, S.J.

    1990-01-01

    The selection effects that govern the observations of Visual Binary Stars are investigated, in order to obtain a realistic statistical distribution of the mass-ratio q = M_sec/M_prim. To this end a numerical simulation programme has been developed, which `generates' binary stars and `looks' at

  18. High accuracy mantle convection simulation through modern numerical methods

    KAUST Repository

    Kronbichler, Martin

    2012-08-21

    Numerical simulation of the processes in the Earth's mantle is a key piece in understanding its dynamics, composition, history and interaction with the lithosphere and the Earth's core. However, doing so presents many practical difficulties related to the numerical methods that can accurately represent these processes at relevant scales. This paper presents an overview of the state of the art in algorithms for high-Rayleigh number flows such as those in the Earth's mantle, and discusses their implementation in the Open Source code Aspect (Advanced Solver for Problems in Earth's ConvecTion). Specifically, we show how an interconnected set of methods for adaptive mesh refinement (AMR), higher order spatial and temporal discretizations, advection stabilization and efficient linear solvers can provide high accuracy at a numerical cost unachievable with traditional methods, and how these methods can be designed in a way so that they scale to large numbers of processors on compute clusters. Aspect relies on the numerical software packages deal.II and Trilinos, enabling us to focus on high level code and keeping our implementation compact. We present results from validation tests using widely used benchmarks for our code, as well as scaling results from parallel runs. © 2012 The Authors Geophysical Journal International © 2012 RAS.

  19. Correlated and uncorrelated heart rate fluctuations during relaxing visualization

    Science.gov (United States)

    Papasimakis, N.; Pallikari, F.

    2010-05-01

    The heart rate variability (HRV) of healthy subjects practicing relaxing visualization is studied by use of three multiscale analysis techniques: the detrended fluctuation analysis (DFA), the entropy in natural time (ENT) and the average wavelet coefficient (AWC). The scaling exponent of normal interbeat interval increments exhibits characteristics of the presence of long-range correlations. During relaxing visualization the HRV dynamics change in the sense that two new features emerge independently of each other: a respiration-induced periodicity that often dominates the HRV at short scales ...
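
    A rough sketch of the detrended fluctuation analysis (DFA) mentioned above is given below in Python (numpy assumed, synthetic RR intervals; illustrative only, not the authors' exact pipeline); the scaling exponent is the log-log slope of the fluctuation function.

```python
import numpy as np

def dfa(rr, scales=(4, 8, 16, 32, 64), order=1):
    """Return (scales, F(scale)) for a 1-D series of interbeat intervals."""
    y = np.cumsum(rr - np.mean(rr))              # integrated, mean-removed profile
    flucts = []
    for s in scales:
        rms = []
        for i in range(len(y) // s):             # non-overlapping boxes of length s
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, order), t)
            rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
        flucts.append(np.mean(rms))
    return np.asarray(scales), np.asarray(flucts)

rr = 0.8 + 0.05 * np.random.randn(1500)          # synthetic RR intervals (seconds)
s, F = dfa(rr)
alpha = np.polyfit(np.log(s), np.log(F), 1)[0]   # scaling exponent (about 0.5 for white noise)
print("DFA scaling exponent:", round(alpha, 2))
```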

  20. The advanced role of computational mechanics and visualization in science and technology: analysis of the Germanwings Flight 9525 crash

    International Nuclear Information System (INIS)

    Chen, Goong; Wang, Yi-Ching; Gu, Cong; Perronnet, Alain; Yao, Pengfei; Bin-Mohsin, Bandar; Hajaiej, Hichem; Scully, Marlan O

    2017-01-01

    Computational mathematics, physics and engineering form a major constituent of modern computational science, which now stands on an equal footing with the established branches of theoretical and experimental sciences. Computational mechanics solves problems in science and engineering based upon mathematical modeling and computing, bypassing the need for expensive and time-consuming laboratory setups and experimental measurements. Furthermore, it allows the numerical simulations of large scale systems, such as the formation of galaxies that could not be done in any earth bound laboratories. This article is written as part of the 21st Century Frontiers Series to illustrate some state-of-the-art computational science. We emphasize how to do numerical modeling and visualization in the study of a contemporary event, the pulverizing crash of the Germanwings Flight 9525 on March 24, 2015, as a showcase. Such numerical modeling and the ensuing simulation of aircraft crashes into land or mountain are complex tasks as they involve both theoretical study and supercomputing of a complex physical system. The most tragic type of crash involves ‘pulverization’ such as the one suffered by this Germanwings flight. Here, we show pulverizing airliner crashes by visualization through video animations from supercomputer applications of the numerical modeling tool LS-DYNA. A sound validation process is challenging but essential for any sophisticated calculations. We achieve this by validation against the experimental data from a crash test done in 1993 of an F4 Phantom II fighter jet into a wall. We have developed a method by hybridizing two primary methods: finite element analysis and smoothed particle hydrodynamics. This hybrid method also enhances visualization by showing a ‘debris cloud’. Based on our supercomputer simulations and the visualization, we point out that prior works on this topic based on ‘hollow interior’ modeling can be quite problematic and, thus, not

  1. The advanced role of computational mechanics and visualization in science and technology: analysis of the Germanwings Flight 9525 crash

    Science.gov (United States)

    Chen, Goong; Wang, Yi-Ching; Perronnet, Alain; Gu, Cong; Yao, Pengfei; Bin-Mohsin, Bandar; Hajaiej, Hichem; Scully, Marlan O.

    2017-03-01

    Computational mathematics, physics and engineering form a major constituent of modern computational science, which now stands on an equal footing with the established branches of theoretical and experimental sciences. Computational mechanics solves problems in science and engineering based upon mathematical modeling and computing, bypassing the need for expensive and time-consuming laboratory setups and experimental measurements. Furthermore, it allows the numerical simulations of large scale systems, such as the formation of galaxies that could not be done in any earth bound laboratories. This article is written as part of the 21st Century Frontiers Series to illustrate some state-of-the-art computational science. We emphasize how to do numerical modeling and visualization in the study of a contemporary event, the pulverizing crash of the Germanwings Flight 9525 on March 24, 2015, as a showcase. Such numerical modeling and the ensuing simulation of aircraft crashes into land or mountain are complex tasks as they involve both theoretical study and supercomputing of a complex physical system. The most tragic type of crash involves ‘pulverization’ such as the one suffered by this Germanwings flight. Here, we show pulverizing airliner crashes by visualization through video animations from supercomputer applications of the numerical modeling tool LS-DYNA. A sound validation process is challenging but essential for any sophisticated calculations. We achieve this by validation against the experimental data from a crash test done in 1993 of an F4 Phantom II fighter jet into a wall. We have developed a method by hybridizing two primary methods: finite element analysis and smoothed particle hydrodynamics. This hybrid method also enhances visualization by showing a ‘debris cloud’. Based on our supercomputer simulations and the visualization, we point out that prior works on this topic based on ‘hollow interior’ modeling can be quite problematic and, thus, not

  2. Numerical and physical testing of upscaling techniques for constitutive properties

    International Nuclear Information System (INIS)

    McKenna, S.A.; Tidwell, V.C.

    1995-01-01

    This paper evaluates upscaling techniques for hydraulic conductivity measurements based on accuracy and practicality for implementation in evaluating the performance of the potential repository at Yucca Mountain. Analytical and numerical techniques are compared to one another, to the results of physical upscaling experiments, and to the results obtained on the original domain. The results from different scaling techniques are then compared to the case where unscaled point-scale statistics are used to generate realizations directly at the flow model grid-block scale. Initial results indicate that analytical techniques provide a means of upscaling constitutive properties from the point measurement scale to the flow model grid-block scale. However, no single analytic technique proves to be adequate for all situations. Numerical techniques are also accurate, but they are time intensive and their accuracy is dependent on knowledge of the local flow regime at every grid-block.

  3. Milking liquid nano-droplets by an IR laser: a new modality for the visualization of electric field lines

    International Nuclear Information System (INIS)

    Vespini, Veronica; Coppola, Sara; Grilli, Simonetta; Paturzo, Melania; Ferraro, Pietro

    2013-01-01

    Liquid handling at micron- and nano-scale is of paramount importance in many fields of application such as biotechnology and biochemistry. In fact, the microfluidics technologies play an important role in lab-on-a-chip devices and, in particular, the dispensing of liquid droplets is a required functionality. Different approaches have been developed for manipulating, dispensing and controlling nano-droplets under a wide variety of configurations. Here we demonstrate that nano-droplets can be drawn from liquid drop or film reservoirs through a sort of milking effect achieved by the absorption of IR laser radiation into a pyroelectric crystal. The generation of the pyroelectric field induced by the IR laser is calculated numerically and a specific experiment has been designed to visualize the electric field stream lines that are responsible for the liquid milking effect. The experiments performed are expected to open a new route for the visualization, measure and characterization procedures in the case of electrohydrodynamic applications. (paper)

  4. Time scales of DNAPL migration in sandy aquifers examined via numerical simulation

    Energy Technology Data Exchange (ETDEWEB)

    Gerhard, J.I.; Pang, T.; Kueper, B.H. [University of Edinburgh, Edinburgh (United Kingdom). Inst. of Infrastructure & Environmental

    2007-03-15

    The time required for dense nonaqueous phase liquid (DNAPL) to cease migrating following release to the subsurface is a valuable component of a site conceptual model. This study uses numerical simulation to investigate the migration of six different DNAPLs in sandy aquifers. The most influential parameters governing migration cessation time are the density and viscosity of the DNAPL and the mean hydraulic conductivity of the aquifer. Releases of between 1 and 40 drums of chlorinated solvent DNAPLs, characterized by relatively high density and low viscosity, require on the order of months to a few years to cease migrating in a heterogeneous medium sand aquifer having an average hydraulic conductivity of 7.4 × 10⁻³ cm/s. In contrast to this, the release of 20 drums of coal tar (ρ_D = 1061 kg/m³, μ_D = 0.161 Pa·s) requires more than 100 years to cease migrating in the same aquifer. Altering the mean hydraulic conductivity of the aquifer results in a proportional change in cessation times. Parameters that exhibit relatively little influence on migration time scales are the DNAPL-water interfacial tension, release volume, source capillary pressure, mean aquifer porosity, and ambient ground water hydraulic gradient. This study also demonstrates that low-density DNAPLs (e.g., coal tar) give rise to greater amounts of lateral spreading and greater amounts of pooling on capillary barriers than high-density DNAPLs such as trichloroethylene or tetrachloroethylene.

  5. Experimental and numerical studies in a vortex tube

    International Nuclear Information System (INIS)

    Sohn, Chang Hyun; Kim, Chang Soo; Gowda, B. H. L Lakshmana; Jung, Ui Hyun

    2006-01-01

    The present investigation deals with the internal flow phenomena of the counter-flow type vortex tube, studied using experimental testing and numerical simulation. Visualization was carried out using the surface tracing method, injecting dye onto the vortex tube wall with a needle. The vortex tube is made of acrylic to allow visualization of the surface particle traces, and the input air pressure was varied from 0.1 MPa to 0.3 MPa. The experimentally visualized results show an apparent sudden change in the trajectory on the vortex tube wall, observed in every test case. This may indicate the stagnation position of the vortex flow. The visualized stagnation position moves towards the vortex generator with increasing cold flow ratio and input pressure. A three-dimensional computational study is also conducted to obtain more detailed flow information in the vortex tube. Calculated total pressure, static pressure and total temperature distributions in the vortex tube were in good agreement with the experimental data. The computational particle trace on the vortex tube wall is very similar to that observed in experiments.

  6. Defining the minimal important difference for the visual analogue scale assessing dyspnea in patients with malignant pleural effusions.

    Directory of Open Access Journals (Sweden)

    Eleanor K Mishra

    Full Text Available The minimal important difference (MID) is essential for interpreting the results of randomised controlled trials (RCTs). Despite a number of RCTs in patients with malignant pleural effusions (MPEs) which use the visual analogue scale for dyspnea (VASD) as an outcome measure, the MID has not been established. Patients with suspected MPE undergoing a pleural procedure recorded their baseline VASD and their post-procedure VASD (24 hours after the pleural drainage), and in parallel assessed their breathlessness on a 7-point Likert scale. The mean decrease in VASD in patients with a MPE reporting a 'small but just worthwhile decrease' in their dyspnea (i.e. equivalent to the MID) was 19 mm (95% CI 14-24 mm). The mean drainage volume required to produce a change in VASD of 19 mm was 760 ml. The mean MID for the VASD in patients with a MPE undergoing a pleural procedure is 19 mm (95% CI 14-24 mm). Thus choosing an improvement of 19 mm in the VASD would be justifiable in the design and analysis of future MPE studies.

  7. Numerical Experiments Providing New Insights into Plasma Focus Fusion Devices

    Directory of Open Access Journals (Sweden)

    Sing Lee

    2010-04-01

    Full Text Available Recent extensive and systematic numerical experiments have uncovered new insights into plasma focus fusion devices including the following: (1) a plasma current limitation effect, as device static inductance is reduced towards very small values; (2) scaling laws of neutron yield and soft x-ray yield as functions of storage energies and currents; (3) a global scaling law for neutron yield as a function of storage energy combining experimental and numerical data showing that scaling deterioration has probably been interpreted as neutron ‘saturation’; and (4) a fundamental cause of neutron ‘saturation’. The ground-breaking insights thus gained may completely change the directions of plasma focus fusion research.

  8. Image-based Exploration of Large-Scale Pathline Fields

    KAUST Repository

    Nagoor, Omniah H.

    2014-05-27

    While real-time applications are nowadays routinely used in visualizing large numerical simulations and volumes, handling these large-scale datasets requires high-end graphics clusters or supercomputers to process and visualize them. However, not all users have access to powerful clusters. Therefore, it is challenging to come up with a visualization approach that provides insight to large-scale datasets on a single computer. Explorable images (EI) is one of the methods that allows users to handle large data on a single workstation. Although it is a view-dependent method, it combines both exploration and modification of visual aspects without re-accessing the original huge data. In this thesis, we propose a novel image-based method that applies the concept of EI in visualizing large flow-field pathlines data. The goal of our work is to provide an optimized image-based method, which scales well with the dataset size. Our approach is based on constructing a per-pixel linked list data structure in which each pixel contains a list of pathlines segments. With this view-dependent method it is possible to filter, color-code and explore large-scale flow data in real-time. In addition, optimization techniques such as early-ray termination and deferred shading are applied, which further improves the performance and scalability of our approach.
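
    The per-pixel linked list at the core of this approach can be pictured with the CPU-side Python toy below (names and payloads are hypothetical): each pixel stores a head index into a shared node pool, and each node stores a pathline segment together with the index of the previous head; on the GPU the same structure is typically built with atomic counters.

```python
NULL = -1

class PerPixelLists:
    """CPU toy of a per-pixel linked list: head-pointer image plus a shared node pool."""
    def __init__(self, width, height):
        self.head = [[NULL] * width for _ in range(height)]
        self.nodes = []                       # each node: (payload, index of previous head)

    def insert(self, x, y, segment):
        """Prepend a pathline segment to the list of pixel (x, y)."""
        self.nodes.append((segment, self.head[y][x]))
        self.head[y][x] = len(self.nodes) - 1

    def segments_at(self, x, y):
        """Walk one pixel's list, e.g. to filter or colour-code segments on the fly."""
        idx = self.head[y][x]
        while idx != NULL:
            segment, idx = self.nodes[idx]
            yield segment

buf = PerPixelLists(640, 480)
buf.insert(10, 20, ("pathline_7", 0.3))       # (id, depth) payload, purely illustrative
buf.insert(10, 20, ("pathline_2", 0.8))
print(list(buf.segments_at(10, 20)))          # most recently inserted segment first
```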

  9. VISUAL3D - An EIT network on visualization of geomodels

    Science.gov (United States)

    Bauer, Tobias

    2017-04-01

    When it comes to the interpretation of data and the understanding of deep geological structures and bodies at different scales, modelling tools and modelling experience are vital for deep exploration. Geomodelling provides a platform for integration of different types of data, including new kinds of information (e.g., new improved measuring methods). EIT Raw Materials, initiated by the EIT (European Institute of Innovation and Technology) and funded by the European Commission, is the largest and strongest consortium in the raw materials sector worldwide. The VISUAL3D network of infrastructure is an initiative by EIT Raw Materials and aims at bringing together partners with 3D-4D-visualisation infrastructure and 3D-4D-modelling experience. The recently formed network collaboration interlinks hardware, software and expert knowledge in modelling, visualization and output. A special focus will be the linking of research, education and industry, integrating multi-disciplinary data and visualizing the data in three and four dimensions. By aiding network collaborations we aim at improving the combination of geomodels with differing file formats and data characteristics. This will create an increased competency in modelling visualization and the ability to interchange and communicate models more easily. By combining knowledge and experience in geomodelling with expertise in Virtual Reality visualization, partners of EIT Raw Materials as well as external parties will have the possibility to visualize, analyze and validate their geomodels in immersive VR-environments. The current network combines partners from universities, research institutes, geological surveys and industry with a strong background in geological 3D-modelling and 3D visualization and comprises: Luleå University of Technology, Geological Survey of Finland, Geological Survey of Denmark and Greenland, TUBA Freiberg, Uppsala University, Geological Survey of France, RWTH Aachen, DMT, KGHM Cuprum, Boliden, Montan

  10. Wired! and Visualizing Venice: Scaling up Digital Art History

    OpenAIRE

    Lanzoni, Kristin Huffman; Olson, Mark James-Vrooman; Szabo, Victoria E.

    2015-01-01

    This article focuses on Visualizing Venice, an interdisciplinary, cross-cultural collaboration that engages in mapping, 3-D modeling, and multimedia representations of historical change in Venice, Italy. Through a “laboratory” approach that integrates students and faculty in multi-year research teams, we ask new questions and pursue emerging lines of inquiry about architectural monuments, their relation to the larger urban setting, and the role of sculptural and painted decoration in sacred s...

  11. Allen Brain Atlas-Driven Visualizations: a web-based gene expression energy visualization tool.

    Science.gov (United States)

    Zaldivar, Andrew; Krichmar, Jeffrey L

    2014-01-01

    The Allen Brain Atlas-Driven Visualizations (ABADV) is a publicly accessible web-based tool created to retrieve and visualize expression energy data from the Allen Brain Atlas (ABA) across multiple genes and brain structures. Though the ABA offers their own search engine and software for researchers to view their growing collection of online public data sets, including extensive gene expression and neuroanatomical data from human and mouse brain, many of their tools limit the amount of genes and brain structures researchers can view at once. To complement their work, ABADV generates multiple pie charts, bar charts and heat maps of expression energy values for any given set of genes and brain structures. Such a suite of free and easy-to-understand visualizations allows for easy comparison of gene expression across multiple brain areas. In addition, each visualization links back to the ABA so researchers may view a summary of the experimental detail. ABADV is currently supported on modern web browsers and is compatible with expression energy data from the Allen Mouse Brain Atlas in situ hybridization data. By creating this web application, researchers can immediately obtain and survey numerous amounts of expression energy data from the ABA, which they can then use to supplement their work or perform meta-analysis. In the future, we hope to enable ABADV across multiple data resources.

  12. Allen Brain Atlas-Driven Visualizations: A Web-Based Gene Expression Energy Visualization Tool

    Directory of Open Access Journals (Sweden)

    Andrew eZaldivar

    2014-05-01

    Full Text Available The Allen Brain Atlas-Driven Visualizations (ABADV) is a publicly accessible web-based tool created to retrieve and visualize expression energy data from the Allen Brain Atlas (ABA) across multiple genes and brain structures. Though the ABA offers their own search engine and software for researchers to view their growing collection of online public data sets, including extensive gene expression and neuroanatomical data from human and mouse brain, many of their tools limit the amount of genes and brain structures researchers can view at once. To complement their work, ABADV generates multiple pie charts, bar charts and heat maps of expression energy values for any given set of genes and brain structures. Such a suite of free and easy-to-understand visualizations allows for easy comparison of gene expression across multiple brain areas. In addition, each visualization links back to the ABA so researchers may view a summary of the experimental detail. ABADV is currently supported on modern web browsers and is compatible with expression energy data from the Allen Mouse Brain Atlas in situ hybridization data. By creating this web application, researchers can immediately obtain and survey numerous amounts of expression energy data from the ABA, which they can then use to supplement their work or perform meta-analysis. In the future, we hope to enable ABADV across multiple data resources.

  13. Multi-scale properties of large eddy simulations: correlations between resolved-scale velocity-field increments and subgrid-scale quantities

    Science.gov (United States)

    Linkmann, Moritz; Buzzicotti, Michele; Biferale, Luca

    2018-06-01

    We provide analytical and numerical results concerning multi-scale correlations between the resolved velocity field and the subgrid-scale (SGS) stress-tensor in large eddy simulations (LES). Following previous studies for Navier-Stokes equations, we derive the exact hierarchy of LES equations governing the spatio-temporal evolution of velocity structure functions of any order. The aim is to assess the influence of the subgrid model on the inertial range intermittency. We provide a series of predictions, within the multifractal theory, for the scaling of correlation involving the SGS stress and we compare them against numerical results from high-resolution Smagorinsky LES and from a-priori filtered data generated from direct numerical simulations (DNS). We find that LES data generally agree very well with filtered DNS results and with the multifractal prediction for all leading terms in the balance equations. Discrepancies are measured for some of the sub-leading terms involving cross-correlation between resolved velocity increments and the SGS tensor or the SGS energy transfer, suggesting that there must be room to improve the SGS modelling to further extend the inertial range properties for any fixed LES resolution.
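
    As a minimal illustration of the velocity structure functions discussed above, the Python sketch below (numpy assumed, synthetic 1-D data) estimates S_p(r) = <|u(x+r) - u(x)|^p> for a periodic signal; the paper itself works with full LES/DNS fields and the SGS stress, which is not reproduced here.

```python
import numpy as np

def structure_functions(u, separations, orders=(2, 3, 4)):
    """S_p(r) = <|u(x+r) - u(x)|^p> for a periodic 1-D signal."""
    return {p: np.array([np.mean(np.abs(np.roll(u, -r) - u) ** p)
                         for r in separations])
            for p in orders}

u = np.random.randn(4096)                  # stand-in for one velocity component
r = np.arange(1, 65)
S = structure_functions(u, r)
# Scaling exponents zeta_p would follow from a log-log fit of S[p] against r
# inside the inertial range; for this synthetic signal they are not meaningful.
print({p: s[:3] for p, s in S.items()})
```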

  14. Exclusively Visual Analysis of Classroom Group Interactions

    Science.gov (United States)

    Tucker, Laura; Scherr, Rachel E.; Zickler, Todd; Mazur, Eric

    2016-01-01

    Large-scale audiovisual data that measure group learning are time consuming to collect and analyze. As an initial step towards scaling qualitative classroom observation, we qualitatively coded classroom video using an established coding scheme with and without its audio cues. We find that interrater reliability is as high when using visual data…

  15. Methods of scaling threshold color difference using printed samples

    Science.gov (United States)

    Huang, Min; Cui, Guihua; Liu, Haoxue; Luo, M. Ronnier

    2012-01-01

    A series of printed samples on a substrate of semi-gloss paper, with color differences around the threshold magnitude, were prepared for scaling the visual color difference and for evaluating the performance of different methods. The probabilities of perceptibility were normalized to Z-scores, and the different color differences were scaled against the Z-scores. The visual color differences were thus obtained and checked with the STRESS factor. The results indicated that only the scale values changed, while the relative scales between pairs in the data were preserved.
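
    The probability-to-Z-score step described above can be sketched as follows (Python with scipy assumed; the proportions are made up): the proportion of observers judging a pair perceptibly different is mapped to a Z-score through the inverse normal CDF.

```python
from scipy.stats import norm

# proportion of observers judging each printed pair "perceptibly different" (made up)
perceptibility = {"pair_A": 0.62, "pair_B": 0.75, "pair_C": 0.90}

# inverse normal CDF maps a proportion to a Z-score on the visual-difference scale
z_scores = {pair: norm.ppf(p) for pair, p in perceptibility.items()}
for pair, z in z_scores.items():
    print(f"{pair}: Z = {z:+.3f}")
```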

  16. Visualization and Data Analysis for High-Performance Computing

    Energy Technology Data Exchange (ETDEWEB)

    Sewell, Christopher Meyer [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-09-27

    This is a set of slides from a guest lecture for a class at the University of Texas, El Paso on visualization and data analysis for high-performance computing. The topics covered are the following: trends in high-performance computing; scientific visualization, such as OpenGL, ray tracing and volume rendering, VTK, and ParaView; data science at scale, such as in-situ visualization, image databases, distributed memory parallelism, shared memory parallelism, VTK-m, "big data", and then an analysis example.

  17. [Acceptance and understandability of various methods of health valuations for the chronically ill: willingness to pay, visual analogue scale and rating scale].

    Science.gov (United States)

    Meder, M; Farin, E

    2009-11-01

    Health valuations are one way of measuring patient preferences with respect to the results of their treatment. The study examines three different methods of health valuations--willingness to pay (WTP), visual analogue scale (VAS), and a rating question for evaluating the subjective significance. The goal is to test the understandability and acceptance of these methods for implementation in questionnaires. In various rehabilitation centres, a total of six focus groups were conducted with 5-9 patients each with a mean age of 57.1 years. The illnesses considered were chronic-ischaemic heart disease, chronic back pain, and breast cancer. Patients filled out a questionnaire that was then discussed in the group. In addition to the quantitative evaluation of the data in the questionnaire, a qualitative analysis of the contents of the group discussion protocols was made. We have results from a total of 42 patients. 14.6% of the patients had "great difficulties" understanding the WTP or rated it as "completely incomprehensible"; this value was 7.3% for VAS and 0% for the rating scale. With respect to acceptance, 31.0% of the patients indicated that they were "not really" or "not at all" willing to answer such a WTP question in a questionnaire; this was 6.6% for the VAS, and again 0% for the rating scale. The qualitative analysis provided an indication as to why some patients view the WTP question in particular in a negative light. Many difficulties in understanding it were related to the formulation of the question and the structure of the questionnaire. However, the patients' statements also made it apparent that the hypothetical nature of the WTP questionnaire was not always recognised. The most frequent reason for the lack of acceptance of the WTP was the patients' fear of negative financial consequences of their responses. With respect to understandability and acceptance, VAS questions appear to be better suited for reflecting patient preferences than WTP questions. The

  18. Visual reproduction on the Wechsler Memory Scale-Revised as a predictor of Alzheimer's disease in Japanese patients with mild cognitive impairments.

    Science.gov (United States)

    Hori, Takumi; Sanjo, Nobuo; Tomita, Makoto; Mizusawa, Hidehiro

    2013-01-01

    The Visual Reproduction (VR) test is used to assess mild cognitive impairment (MCI), but the characteristics of visual memory in Japanese MCI patients remain unclear. VR scores of 27 MCI patients were evaluated using the Wechsler Memory Scale-Revised. Scores of MCI, no-dementia, and Alzheimer's disease (AD) groups were then compared. The annual conversion rate of MCI to AD was 18.8%. Mean VR-I and VR-II baseline scores for MCI patients were 33.3 ± 5.6 and 20.5 ± 14.0, respectively. Mean VR-II scores for converted and nonconverted MCI patients were 7.2 ± 8.7 and 29.8 ± 9.3, respectively. It is likely that VR-II and VR-II/I scores are more sensitive for predicting conversion to AD in Japanese than in American MCI patients. Our results indicate that VR is a sensitive and useful measure for predicting the conversion of Japanese MCI patients to AD within 2 years. Copyright © 2013 S. Karger AG, Basel.

  19. Costs of cervical cancer screening and treatment using visual inspection with acetic acid (VIA) and cryotherapy in Ghana: the importance of scale.

    OpenAIRE

    Quentin, W; Adu-Sarkodie, Y; Terris-Prestholt, F; Legood, R; Opoku, BK; Mayaud, P

    2011-01-01

    OBJECTIVES: To estimate the incremental costs of visual inspection with acetic acid (VIA) and cryotherapy at cervical cancer screening facilities in Ghana; to explore determinants of costs through modelling; and to estimate national scale-up and annual programme costs. METHODS: Resource-use data were collected at four out of six active VIA screening centres, and unit costs were ascertained to estimate the costs per woman of VIA and cryotherapy. Modelling and sensitivity analysis were used to ...

  20. Emergence of realism: Enhanced visual artistry and high accuracy of visual numerosity representation after left prefrontal damage.

    Science.gov (United States)

    Takahata, Keisuke; Saito, Fumie; Muramatsu, Taro; Yamada, Makiko; Shirahase, Joichiro; Tabuchi, Hajime; Suhara, Tetsuya; Mimura, Masaru; Kato, Motoichiro

    2014-05-01

    Over the last two decades, evidence of enhancement of drawing and painting skills due to focal prefrontal damage has accumulated. It is of special interest that most artworks created by such patients were highly realistic ones, but the mechanism underlying this phenomenon remains to be understood. Our hypothesis is that enhanced tendency of realism was associated with accuracy of visual numerosity representation, which has been shown to be mediated predominantly by right parietal functions. Here, we report a case of left prefrontal stroke, where the patient showed enhancement of artistic skills of realistic painting after the onset of brain damage. We investigated cognitive, functional and esthetic characteristics of the patient's visual artistry and visual numerosity representation. Neuropsychological tests revealed impaired executive function after the stroke. Despite that, the patient's visual artistry related to realism was rather promoted across the onset of brain damage as demonstrated by blind evaluation of the paintings by professional art reviewers. On visual numerical cognition tasks, the patient showed higher performance in comparison with age-matched healthy controls. These results paralleled increased perfusion in the right parietal cortex including the precuneus and intraparietal sulcus. Our data provide new insight into mechanisms underlying change in artistic style due to focal prefrontal lesion. Copyright © 2014 Elsevier Ltd. All rights reserved.

  1. At-risk and intervention thresholds of occupational stress using a visual analogue scale

    Science.gov (United States)

    Pereira, Bruno; Moustafa, Farès; Naughton, Geraldine; Lesage, François-Xavier; Lambert, Céline

    2017-01-01

    Background The visual analogue scale (VAS) is widely used in clinical practice by occupational physicians to assess perceived stress in workers. However, a single cut-off (black-or-white decision) inadequately discriminates between workers with and without stress. We explored an innovative statistical approach to distinguish an at-risk population among stressed workers, and to establish a threshold over which an action is urgently required, via the use of two cut-offs. Methods Participants were recruited during annual work medical examinations by a random sample of workers from five occupational health centres. We previously proposed a single cut-off of VAS stress in comparison with the Perceived Stress Scale (PSS14). Similar methodology was used in the current study, along with a gray zone approach. The lower limit of the gray zone supports sensitivity (“at-risk” threshold; interpreted as requiring closer surveillance) and the upper limit supports specificity (i.e. “intervention” threshold–emergency action required). Results We included 500 workers (49.6% males), aged 40±11 years, with a PSS14 score of 3.8±1.4 and a VAS score of 4.0±2.4. Using a receiver operating characteristic curve and the PSS cut-off score of 7.2, the optimal VAS threshold was 6.8 (sensitivity = 0.89, specificity = 0.87). The lower and upper thresholds of the gray zone were 5 and 8.2, respectively. Conclusions We identified two clinically relevant cut-offs on the VAS of stress: a first cut-off of 5.0 for an at-risk population, and a second cut-off of 8.2 over which an action is urgently required. Future investigations into the relationships between this upper threshold and deleterious events are required. PMID:28586383
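
    A rough sketch of the two-cut-off (grey zone) idea is given below using scikit-learn on synthetic data; the thresholds it produces are illustrative only and are not the study's values of 5.0 and 8.2.

```python
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(1)
stressed = rng.binomial(1, 0.4, 500)                     # PSS14-defined "stressed" label
vas = np.clip(3 + 4 * stressed + rng.normal(0, 2, 500), 0, 10)   # synthetic VAS scores

fpr, tpr, thr = roc_curve(stressed, vas)
optimal = thr[np.argmax(tpr - fpr)]                      # single Youden-index cut-off
at_risk = thr[tpr >= 0.95].max()                         # grey-zone lower limit (favours sensitivity)
intervention = thr[(1 - fpr) >= 0.95].min()              # grey-zone upper limit (favours specificity)
print(optimal, at_risk, intervention)
```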

  2. Aviation Model: A Fine-Scale Numerical Weather Prediction System for Aviation Applications at the Hong Kong International Airport

    Directory of Open Access Journals (Sweden)

    Wai-Kin Wong

    2013-01-01

    Full Text Available The Hong Kong Observatory (HKO) is planning to implement a fine-resolution Numerical Weather Prediction (NWP) model for supporting the aviation weather applications at the Hong Kong International Airport (HKIA). This new NWP model system, called the Aviation Model (AVM), is configured at a horizontal grid spacing of 600 m and 200 m. It is based on the WRF-ARW (Advanced Research WRF) model, which has sufficient computational efficiency to produce hourly updated forecasts up to 9 hours ahead on a future high performance computer system with a theoretical peak performance of around 10 TFLOPS. AVM will be nested inside the operational mesoscale NWP model of HKO, which has a horizontal resolution of 2 km. In this paper, initial numerical experiment results in forecasting windshear events due to sea breeze and terrain effects are discussed. The simulation of sea-breeze-related windshear is quite successful, and the headwind change observed from flight data could be reproduced in the model forecast. Some impacts of physical processes on generating the fine-scale wind circulation and development of significant convection are illustrated. The paper also discusses the limitations in the current model setup and proposes methods for the future development of AVM.

  3. Evaluation of pain incidence and pain management in a South ...

    African Journals Online (AJOL)

    Design. A prospective observational study, using the Numerical Rating Scale for pain (NRS pain), Numerical Rating Scale for anxiety (NRS anxiety), the Alder Hey Triage Pain Score (AHTPS), the COMFORT behaviour scale and the Touch Visual Pain Scale (TVPS). All patients were assessed at admission; those who were ...

  4. Wind conditions in urban layout - Numerical and experimental research

    Science.gov (United States)

    Poćwierz, Marta; Zielonko-Jung, Katarzyna

    2018-01-01

    This paper presents research comparing numerical and experimental results for different cases of airflow around a few urban layouts. The study is concerned mostly with the analysis of parameters, such as pressure and velocity fields, which are essential in the building industry. Numerical simulations have been performed with the commercial software Fluent, using several turbulence models, including the popular k-ε, realizable k-ε and k-ω models. Particular attention has been paid to accurate description of the inlet conditions and the selection of a suitable computational grid. Pressure measurements near the buildings and oil visualization were undertaken and described accordingly.

  5. GRNsight: a web application and service for visualizing models of small- to medium-scale gene regulatory networks

    Directory of Open Access Journals (Sweden)

    Kam D. Dahlquist

    2016-09-01

    Full Text Available GRNsight is a web application and service for visualizing models of gene regulatory networks (GRNs). A gene regulatory network (GRN) consists of genes, transcription factors, and the regulatory connections between them which govern the level of expression of mRNA and protein from genes. The original motivation came from our efforts to perform parameter estimation and forward simulation of the dynamics of a differential equations model of a small GRN with 21 nodes and 31 edges. We wanted a quick and easy way to visualize the weight parameters from the model which represent the direction and magnitude of the influence of a transcription factor on its target gene, so we created GRNsight. GRNsight automatically lays out either an unweighted or weighted network graph based on an Excel spreadsheet containing an adjacency matrix where regulators are named in the columns and target genes in the rows, a Simple Interaction Format (SIF) text file, or a GraphML XML file. When a user uploads an input file specifying an unweighted network, GRNsight automatically lays out the graph using black lines and pointed arrowheads. For a weighted network, GRNsight uses pointed and blunt arrowheads, and colors the edges and adjusts their thicknesses based on the sign (positive for activation or negative for repression) and magnitude of the weight parameter. GRNsight is written in JavaScript, with diagrams facilitated by D3.js, a data visualization library. Node.js and the Express framework handle server-side functions. GRNsight’s diagrams are based on D3.js’s force graph layout algorithm, which was then extensively customized to support the specific needs of GRNs. Nodes are rectangular and support gene labels of up to 12 characters. The edges are arcs, which become straight lines when the nodes are close together. Self-regulatory edges are indicated by a loop. When a user mouses over an edge, the numerical value of the weight parameter is displayed. Visualizations can
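
    To make the input convention concrete (regulators named in the columns, target genes in the rows), the Python sketch below converts such an adjacency matrix into a weighted edge list; it reads CSV rather than an Excel workbook for simplicity, and the gene names and weights are hypothetical.

```python
import csv, io

# A toy GRNsight-style matrix: regulators in columns, target genes in rows.
# Positive weights denote activation, negative weights repression (hypothetical values).
matrix_csv = """genes,CIN5,GLN3,HMO1
CIN5,0.0,0.8,0.0
GLN3,-0.4,0.0,0.6
HMO1,0.0,0.5,0.0
"""

rows = list(csv.reader(io.StringIO(matrix_csv)))
regulators = rows[0][1:]
edges = []
for row in rows[1:]:
    target = row[0]
    for regulator, cell in zip(regulators, row[1:]):
        w = float(cell)
        if w != 0.0:                              # non-zero entry = regulatory edge
            edges.append((regulator, target, w))

for reg, tgt, w in edges:
    kind = "activation" if w > 0 else "repression"
    print(f"{reg} -> {tgt}  weight={w:+.2f} ({kind})")
```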

  6. The multiple sclerosis visual pathway cohort: understanding neurodegeneration in MS.

    Science.gov (United States)

    Martínez-Lapiscina, Elena H; Fraga-Pumar, Elena; Gabilondo, Iñigo; Martínez-Heras, Eloy; Torres-Torres, Ruben; Ortiz-Pérez, Santiago; Llufriu, Sara; Tercero, Ana; Andorra, Magi; Roca, Marc Figueras; Lampert, Erika; Zubizarreta, Irati; Saiz, Albert; Sanchez-Dalmau, Bernardo; Villoslada, Pablo

    2014-12-15

    Multiple Sclerosis (MS) is an immune-mediated disease of the Central Nervous System with two major underlying etiopathogenic processes: inflammation and neurodegeneration. The latter determines the prognosis of this disease. MS is the main cause of non-traumatic disability in middle-aged populations. The MS-VisualPath Cohort was set up to study the neurodegenerative component of MS using advanced imaging techniques by focusing on analysis of the visual pathway in a middle-aged MS population in Barcelona, Spain. We started the recruitment of patients in the early phase of MS in 2010 and it remains permanently open. All patients undergo a complete neurological and ophthalmological examination including measurements of physical disability and cognition (Expanded Disability Status Scale; Multiple Sclerosis Functional Composite and neuropsychological tests), disease activity (relapses) and visual function testing (visual acuity, color vision and visual field). The MS-VisualPath protocol also assesses the presence of anxiety and depressive symptoms (Hospital Anxiety and Depression Scale), general quality of life (SF-36) and visual quality of life (25-Item National Eye Institute Visual Function Questionnaire with the 10-Item Neuro-Ophthalmic Supplement). In addition, the imaging protocol includes both retinal (Optical Coherence Tomography and Wide-Field Fundus Imaging) and brain imaging (Magnetic Resonance Imaging). Finally, multifocal Visual Evoked Potentials are used to perform neurophysiological assessment of the visual pathway. The analysis of the visual pathway with advanced imaging and electrophysiological tools in parallel with clinical information will provide significant and new knowledge regarding neurodegeneration in MS and provide new clinical and imaging biomarkers to help monitor disease progression in these patients.

  7. Does an increase in compression force really improve visual image quality in mammography? – An initial investigation

    International Nuclear Information System (INIS)

    Mercer, C.E.; Hogg, P.; Cassidy, S.; Denton, E.R.E.

    2013-01-01

    Objective: Literature speculates that visual image quality (IQ) and compression force levels may be directly related. This small study investigates whether a relationship exists between compression force levels and visual IQ. Method: To investigate how visual IQ varies with different levels of compression force, 39 clients who had received markedly different amounts of compression force on each of their three sequential screens were selected over a 6 year screening period. Images for the 3 screening episodes for all women were scored visually using 3 different IQ scales. Results: Correlation coefficients between the 3 IQ scales were positive and high (0.82, 0.9 and 0.85). For each scale, the IQ scores and their correlations do not vary significantly, even though different compression levels had been applied. Kappa IQ scale 1: 0.92, 0.89, 0.89. ANOVA IQ scale 2: p = 0.98, p = 0.55, p = 0.56. ICC IQ scale 3: 0.97, 0.93, 0.91. Conclusion: For the 39 clients there is no difference in visual IQ when different amounts of compression are applied. We believe that further work should be conducted into compression force and image quality as ‘higher levels’ of compression force may not be justified in the attainment of suitable visual image quality.

  8. Numerical analysis and scale experiment design of the hot water layer system of the Brazilian Multipurpose Reactor (RMB reactor)

    International Nuclear Information System (INIS)

    Schweizer, Fernando Lage Araújo

    2014-01-01

    The Brazilian Multipurpose Reactor (RMB) consists of a 30 MW open pool research reactor and its design is currently in development. The RMB is intended to produce a neutron flux for material irradiation, radioisotope production, and materials and nuclear fuel testing. The reactor is immersed in a deep water pool needed for radiation shielding and thermal protection. A heating and purifying system is used in research reactors with high thermal power in order to create a Hot Water Layer (HWL) on the pool top, preventing contaminated water from the vicinity of the reactor core from reaching the surface and thereby reducing the room radiation dose rate. This dissertation presents a study of the HWL behavior during the first hours of reactor operation, when perturbations due to the cooling system and pool heating induce a mixing flow in the HWL and reduce its protection. Numerical simulations using the CFD code CFX 14.0 have been performed for theoretical dose rate estimation during reactor operation and for a 1/10 scaled-down model using dimensional analysis, with mesh testing as an initial verification of the commercial code application. The equipment and sensors needed for an experimental bench project were defined from the CFD numerical simulations. (author)

  9. Medicine in words and numbers: a cross-sectional survey comparing probability assessment scales

    Directory of Open Access Journals (Sweden)

    Koele Pieter

    2007-06-01

    Full Text Available Abstract Background In the complex domain of medical decision making, reasoning under uncertainty can benefit from supporting tools. Automated decision support tools often build upon mathematical models, such as Bayesian networks. These networks require probabilities which often have to be assessed by experts in the domain of application. Probability response scales can be used to support the assessment process. We compare assessments obtained with different types of response scale. Methods General practitioners (GPs) gave assessments on and preferences for three different probability response scales: a numerical scale, a scale with only verbal labels, and a combined verbal-numerical scale we had designed ourselves. Standard analyses of variance were performed. Results No differences in assessments over the three response scales were found. Preferences for type of scale differed: the less experienced GPs preferred the verbal scale, the most experienced preferred the numerical scale, with the groups in between having a preference for the combined verbal-numerical scale. Conclusion We conclude that all three response scales are equally suitable for supporting probability assessment. The combined verbal-numerical scale is a good choice for aiding the process, since it offers numerical labels to those who prefer numbers and verbal labels to those who prefer words, and accommodates both more and less experienced professionals.

  10. Predicting SF-6D utility scores from the Oswestry disability index and numeric rating scales for back and leg pain.

    Science.gov (United States)

    Carreon, Leah Y; Glassman, Steven D; McDonough, Christine M; Rampersaud, Raja; Berven, Sigurd; Shainline, Michael

    2009-09-01

    Cross-sectional cohort. The purpose of this study is to provide a model to allow estimation of utility from the Short Form (SF)-6D using data from the Oswestry Disability Index (ODI), Back Pain Numeric Rating Scale (BPNRS), and the Leg Pain Numeric Rating Scale (LPNRS). Cost-utility analysis provides important information about the relative value of interventions and requires a measure of utility not often available from clinical trial data. The ODI and numeric rating scales for back (BPNRS) and leg pain (LPNRS), are widely used disease-specific measures for health-related quality of life in patients with lumbar degenerative disorders. The purpose of this study is to provide a model to allow estimation of utility from the SF-6D using data from the ODI, BPNRS, and the LPNRS. SF-36, ODI, BPNRS, and LPNRS were prospectively collected before surgery, at 12 and 24 months after surgery in 2640 patients undergoing lumbar fusion for degenerative disorders. Spearman correlation coefficients for paired observations from multiple time points between ODI, BPNRS, and LPNRS, and SF-6D utility scores were determined. Regression modeling was done to compute the SF-6D score from the ODI, BPNRS, and LPNRS. Using a separate, independent dataset of 2174 patients in which actual SF-6D and ODI scores were available, the SF-6D was estimated for each subject and compared to their actual SF-6D. In the development sample, the mean age was 52.5 +/- 15 years and 34% were male. In the validation sample, the mean age was 52.9 +/- 14.2 years and 44% were male. Correlations between the SF-6D and the ODI, BPNRS, and LPNRS were statistically significant (P < 0.0001) with correlation coefficients of 0.82, 0.78, and 0.72, respectively. The regression equation using ODI, BPNRS,and LPNRS to predict SF-6D had an R of 0.69 and a root mean square error of 0.076. The model using ODI alone had an R of 0.67 and a root mean square error of 0.078. The correlation coefficient between the observed and estimated
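
    The kind of multiple linear regression described above can be sketched with numpy on synthetic data as follows; the abstract does not report the published coefficients, so the generating rule and fitted values below are purely illustrative and do not reproduce the actual prediction equation.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
odi = rng.uniform(0, 100, n)                 # Oswestry Disability Index (0-100)
bpnrs = rng.uniform(0, 10, n)                # back pain numeric rating (0-10)
lpnrs = rng.uniform(0, 10, n)                # leg pain numeric rating (0-10)
# hypothetical generating rule for a synthetic SF-6D utility; NOT the published model
sf6d = 0.9 - 0.004 * odi - 0.01 * bpnrs - 0.005 * lpnrs + rng.normal(0, 0.05, n)

X = np.column_stack([np.ones(n), odi, bpnrs, lpnrs])
beta, *_ = np.linalg.lstsq(X, sf6d, rcond=None)          # ordinary least squares fit
pred = X @ beta
rmse = np.sqrt(np.mean((sf6d - pred) ** 2))
print("coefficients:", np.round(beta, 4), "RMSE:", round(rmse, 3))
```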

  11. Visualizing Dynamic Bitcoin Transaction Patterns.

    Science.gov (United States)

    McGinn, Dan; Birch, David; Akroyd, David; Molina-Solana, Miguel; Guo, Yike; Knottenbelt, William J

    2016-06-01

    This work presents a systemic top-down visualization of Bitcoin transaction activity to explore dynamically generated patterns of algorithmic behavior. Bitcoin dominates the cryptocurrency markets and presents researchers with a rich source of real-time transactional data. The pseudonymous yet public nature of the data presents opportunities for the discovery of human and algorithmic behavioral patterns of interest to many parties such as financial regulators, protocol designers, and security analysts. However, retaining visual fidelity to the underlying data to retain a fuller understanding of activity within the network remains challenging, particularly in real time. We expose an effective force-directed graph visualization employed in our large-scale data observation facility to accelerate this data exploration and derive useful insight among domain experts and the general public alike. The high-fidelity visualizations demonstrated in this article allowed for collaborative discovery of unexpected high frequency transaction patterns, including automated laundering operations, and the evolution of multiple distinct algorithmic denial of service attacks on the Bitcoin network.
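
    As a minimal illustration of force-directed graph layout, the Python sketch below (numpy assumed) alternates pairwise repulsion with attraction along edges in the spirit of Fruchterman-Reingold on a toy graph; the system described above runs a far larger, real-time variant.

```python
import numpy as np

def force_layout(n_nodes, edges, iters=200, k=0.1, dt=0.02):
    """Toy force-directed layout: pairwise repulsion plus attraction along edges."""
    pos = np.random.rand(n_nodes, 2)
    for _ in range(iters):
        disp = np.zeros_like(pos)
        for i in range(n_nodes):                     # repulsion from every other node
            d = pos[i] - pos
            dist = np.linalg.norm(d, axis=1) + 1e-9
            disp[i] += np.sum(d * (k * k / dist ** 2)[:, None], axis=0)
        for a, b in edges:                           # spring-like attraction along edges
            d = pos[a] - pos[b]
            dist = np.linalg.norm(d) + 1e-9
            f = (dist / k) * d                       # magnitude dist^2/k along the edge
            disp[a] -= f
            disp[b] += f
        pos += dt * disp
    return pos

# a toy 4-node transaction graph (nodes and edges are purely illustrative)
print(force_layout(4, [(0, 1), (1, 2), (2, 3), (3, 0)]))
```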

  12. Visualizing Dynamic Bitcoin Transaction Patterns

    Science.gov (United States)

    McGinn, Dan; Birch, David; Akroyd, David; Molina-Solana, Miguel; Guo, Yike; Knottenbelt, William J.

    2016-01-01

    Abstract This work presents a systemic top-down visualization of Bitcoin transaction activity to explore dynamically generated patterns of algorithmic behavior. Bitcoin dominates the cryptocurrency markets and presents researchers with a rich source of real-time transactional data. The pseudonymous yet public nature of the data presents opportunities for the discovery of human and algorithmic behavioral patterns of interest to many parties such as financial regulators, protocol designers, and security analysts. However, retaining visual fidelity to the underlying data to retain a fuller understanding of activity within the network remains challenging, particularly in real time. We expose an effective force-directed graph visualization employed in our large-scale data observation facility to accelerate this data exploration and derive useful insight among domain experts and the general public alike. The high-fidelity visualizations demonstrated in this article allowed for collaborative discovery of unexpected high frequency transaction patterns, including automated laundering operations, and the evolution of multiple distinct algorithmic denial of service attacks on the Bitcoin network. PMID:27441715

  13. Comparison of Wechsler Memory Scale-Fourth Edition (WMS-IV) and Third Edition (WMS-III) dimensional structures: improved ability to evaluate auditory and visual constructs.

    Science.gov (United States)

    Hoelzle, James B; Nelson, Nathaniel W; Smith, Clifford A

    2011-03-01

    Dimensional structures underlying the Wechsler Memory Scale-Fourth Edition (WMS-IV) and Wechsler Memory Scale-Third Edition (WMS-III) were compared to determine whether the revised measure has a more coherent and clinically relevant factor structure. Principal component analyses were conducted in normative samples reported in the respective technical manuals. Empirically supported procedures guided retention of dimensions. An invariant two-dimensional WMS-IV structure reflecting constructs of auditory learning/memory and visual attention/memory (C1 = .97; C2 = .96) is more theoretically coherent than the replicable, heterogeneous WMS-III dimension (C1 = .97). This research suggests that the WMS-IV may have greater utility in identifying lateralized memory dysfunction.
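
    The comparison in this record rests on principal component analysis of normative data and on congruence coefficients between component loadings (the C1/C2 values reported). A minimal sketch of both computations is shown below on synthetic stand-in data, since the WMS normative samples are not reproduced here.

        # Illustrative sketch: extract principal-component loadings from a score
        # matrix and compare two loading vectors with Tucker's congruence coefficient.
        import numpy as np

        def principal_components(scores: np.ndarray, n_components: int) -> np.ndarray:
            """Loadings on the first n_components, via eigendecomposition of the
            correlation matrix of the subtest scores."""
            corr = np.corrcoef(scores, rowvar=False)
            eigvals, eigvecs = np.linalg.eigh(corr)
            order = np.argsort(eigvals)[::-1]                       # largest first
            scale = np.sqrt(np.clip(eigvals[order], 0.0, None))     # guard tiny negatives
            return (eigvecs[:, order] * scale)[:, :n_components]

        def congruence(a: np.ndarray, b: np.ndarray) -> float:
            """Tucker's congruence coefficient between two loading vectors."""
            return float(np.sum(a * b) / np.sqrt(np.sum(a**2) * np.sum(b**2)))

        rng = np.random.default_rng(0)
        sample = rng.normal(size=(300, 10))        # 300 examinees, 10 subtests (synthetic)
        loads = principal_components(sample, n_components=2)
        print(congruence(loads[:, 0], loads[:, 0]))   # 1.0 by construction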

  14. NeuroLines: A Subway Map Metaphor for Visualizing Nanoscale Neuronal Connectivity.

    Science.gov (United States)

    Al-Awami, Ali K; Beyer, Johanna; Strobelt, Hendrik; Kasthuri, Narayanan; Lichtman, Jeff W; Pfister, Hanspeter; Hadwiger, Markus

    2014-12-01

    We present NeuroLines, a novel visualization technique designed for scalable detailed analysis of neuronal connectivity at the nanoscale level. The topology of 3D brain tissue data is abstracted into a multi-scale, relative distance-preserving subway map visualization that allows domain scientists to conduct an interactive analysis of neurons and their connectivity. Nanoscale connectomics aims at reverse-engineering the wiring of the brain. Reconstructing and analyzing the detailed connectivity of neurons and neurites (axons, dendrites) will be crucial for understanding the brain and its development and diseases. However, the enormous scale and complexity of nanoscale neuronal connectivity pose big challenges to existing visualization techniques in terms of scalability. NeuroLines offers a scalable visualization framework that can interactively render thousands of neurites, and that supports the detailed analysis of neuronal structures and their connectivity. We describe and analyze the design of NeuroLines based on two real-world use-cases of our collaborators in developmental neuroscience, and investigate its scalability to large-scale neuronal connectivity data.
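
    The core abstraction described here is to flatten a 3D neurite tree onto 1D "subway lines" while preserving relative path distances. The toy sketch below illustrates that flattening; the data structure, field names, and row-assignment rule are invented for illustration and are not the NeuroLines algorithm itself.

        # Toy sketch: flatten a neurite tree onto horizontal "subway lines" so that
        # x-positions preserve cumulative path length from the root.
        from dataclasses import dataclass, field
        from typing import List, Optional, Tuple

        @dataclass
        class Neurite:
            name: str
            length: float                                # segment path length (e.g. micrometres)
            children: List["Neurite"] = field(default_factory=list)

        def subway_layout(node: Neurite, x0: float = 0.0, row: int = 0,
                          next_row: Optional[List[int]] = None,
                          out: Optional[List[Tuple[str, float, float, int]]] = None):
            """Return (name, x_start, x_end, row) tuples; the first child continues
            its parent's line, every further branch opens a new row."""
            if out is None:
                out, next_row = [], [row + 1]
            x1 = x0 + node.length
            out.append((node.name, x0, x1, row))
            for i, child in enumerate(node.children):
                child_row = row if i == 0 else next_row[0]
                if i > 0:
                    next_row[0] += 1
                subway_layout(child, x1, child_row, next_row, out)
            return out

        axon = Neurite("axon", 40.0, [Neurite("branch_a", 25.0), Neurite("branch_b", 10.0)])
        for name, xs, xe, r in subway_layout(axon):
            print(f"{name:9s} row={r} x=[{xs:5.1f}, {xe:5.1f}]")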

  15. NeuroLines: A Subway Map Metaphor for Visualizing Nanoscale Neuronal Connectivity

    KAUST Repository

    Al-Awami, Ali K.; Beyer, Johanna; Strobelt, Hendrik; Kasthuri, Narayanan; Lichtman, Jeff W.; Pfister, Hanspeter; Hadwiger, Markus

    2014-01-01

    We present NeuroLines, a novel visualization technique designed for scalable detailed analysis of neuronal connectivity at the nanoscale level. The topology of 3D brain tissue data is abstracted into a multi-scale, relative distance-preserving subway map visualization that allows domain scientists to conduct an interactive analysis of neurons and their connectivity. Nanoscale connectomics aims at reverse-engineering the wiring of the brain. Reconstructing and analyzing the detailed connectivity of neurons and neurites (axons, dendrites) will be crucial for understanding the brain and its development and diseases. However, the enormous scale and complexity of nanoscale neuronal connectivity pose big challenges to existing visualization techniques in terms of scalability. NeuroLines offers a scalable visualization framework that can interactively render thousands of neurites, and that supports the detailed analysis of neuronal structures and their connectivity. We describe and analyze the design of NeuroLines based on two real-world use-cases of our collaborators in developmental neuroscience, and investigate its scalability to large-scale neuronal connectivity data.

  16. NeuroLines: A Subway Map Metaphor for Visualizing Nanoscale Neuronal Connectivity

    KAUST Repository

    Al-Awami, Ali K.

    2014-12-31

    We present NeuroLines, a novel visualization technique designed for scalable detailed analysis of neuronal connectivity at the nanoscale level. The topology of 3D brain tissue data is abstracted into a multi-scale, relative distance-preserving subway map visualization that allows domain scientists to conduct an interactive analysis of neurons and their connectivity. Nanoscale connectomics aims at reverse-engineering the wiring of the brain. Reconstructing and analyzing the detailed connectivity of neurons and neurites (axons, dendrites) will be crucial for understanding the brain and its development and diseases. However, the enormous scale and complexity of nanoscale neuronal connectivity pose big challenges to existing visualization techniques in terms of scalability. NeuroLines offers a scalable visualization framework that can interactively render thousands of neurites, and that supports the detailed analysis of neuronal structures and their connectivity. We describe and analyze the design of NeuroLines based on two real-world use-cases of our collaborators in developmental neuroscience, and investigate its scalability to large-scale neuronal connectivity data.

  17. Interactive Collaborative Visualization in the Geosciences

    Science.gov (United States)

    Bollig, E. F.; Kadlec, B. J.; Erlebacher, G.; Yuen, D. A.; Palchuk, Y. M.

    2004-12-01

    Datasets in the earth sciences continue to grow in size due to higher experimental resolving power and numerical simulations at higher resolutions. Over the last several years, an increasing number of scientists have turned to visualization to represent their vast datasets in a meaningful fashion. In most cases, datasets are downloaded and then visualized on a local workstation with 2D or 3D software packages. However, it becomes inconvenient to download datasets of several gigabytes unless network bandwidth is sufficiently high (10 Gbit/s). We are investigating the use of Virtual Network Computing (VNC) to provide interactive three-dimensional visualization services to the user community. Specialized software [1,2] enables OpenGL-based visualization software to capitalize on the hardware capabilities of modern graphics cards and to transfer session information to clients through the VNC protocol. The virtue of this approach is that it does not require any changes to the visualization software. Session information is displayed within Java applets, enabling the use of a wide variety of clients, including hand-held devices. The VNC protocol makes collaboration and interaction between multiple users possible. We demonstrate the collaborative VNC system with the commercial 3D visualization system Amira (http://www.tgs.com) and the open-source VTK (http://www.vtk.org) over a 100 Mbit network. We also present ongoing work on integrating VNC within the NaradaBrokering Grid environment. [1] Stegmaier, S., Magallon, M., and Ertl, T., "A Generic Solution for Hardware-Accelerated Remote Visualization," Joint Eurographics / IEEE TCVG Symposium on Visualization, 2002. [2] VirtualGL: 3D without boundaries, http://virtualgl.sourceforge.net/installation.htm
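
    On a typical Linux visualization server, the remote-rendering pattern this record describes (server-side, GPU-accelerated OpenGL forwarded to thin clients over VNC) can be approximated roughly as in the sketch below. The vncserver and vglrun commands are the standard VNC/VirtualGL entry points, but the display number and application name are placeholders and no specific product configuration is implied.

        # Rough sketch of the remote-visualization recipe: start a VNC X server, then
        # run an OpenGL application inside it through VirtualGL so the server GPU does
        # the rendering. Display number and application are placeholders.
        import os
        import subprocess

        DISPLAY = ":1"          # hypothetical VNC display number
        APP = ["amira"]         # placeholder visualization application

        # 1. Start a VNC server that clients (desktop viewers or Java applets) attach to.
        subprocess.run(["vncserver", DISPLAY], check=True)

        # 2. Run the OpenGL application under VirtualGL inside that display, so frames
        #    are rendered on the server GPU and shipped to clients via the VNC protocol.
        env = dict(os.environ, DISPLAY=DISPLAY)
        subprocess.run(["vglrun"] + APP, env=env, check=True)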

  18. The TeraShake Computational Platform for Large-Scale Earthquake Simulations

    Science.gov (United States)

    Cui, Yifeng; Olsen, Kim; Chourasia, Amit; Moore, Reagan; Maechling, Philip; Jordan, Thomas

    Geoscientific and computer science researchers with the Southern California Earthquake Center (SCEC) are conducting a large-scale, physics-based, computationally demanding earthquake system science research program with the goal of developing predictive models of earthquake processes. The computational demands of this program continue to increase rapidly as these researchers seek to perform physics-based numerical simulations of earthquake processes for ever larger and more detailed scenarios. To meet the needs of this research program, a multiple-institution team coordinated by SCEC has integrated several scientific codes into a numerical modeling-based research tool we call the TeraShake computational platform (TSCP). A central component in the TSCP is a highly scalable earthquake wave propagation simulation program called the TeraShake anelastic wave propagation (TS-AWP) code. In this chapter, we describe how we extended an existing, stand-alone, well-validated, finite-difference, anelastic wave propagation modeling code into the highly scalable and widely used TS-AWP and then integrated this code into the TeraShake computational platform, which provides end-to-end (initialization to analysis) research capabilities. We also describe the techniques used to enhance the TS-AWP parallel performance on TeraGrid supercomputers, as well as the TeraShake simulation phases, including input preparation, run time, data archive management, and visualization. As a result of our efforts to improve its parallel efficiency, the TS-AWP has now shown highly efficient strong scaling on over 40K processors on IBM's BlueGene/L Watson computer. In addition, the TSCP has developed into a computational system that is useful to many members of the SCEC community for performing large-scale earthquake simulations.
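
    For orientation, the sketch below shows the kind of explicit finite-difference stencil update that wave-propagation codes of this type are built on, reduced to a 2D scalar wave equation on a periodic grid with illustrative parameter values. The actual TS-AWP code is a 3D, parallel, anelastic solver; none of its internals are reproduced here.

        # Minimal 2D scalar-wave finite-difference sketch (second order in space and
        # time, periodic boundaries via np.roll). Parameter values are illustrative.
        import numpy as np

        nx, nz = 200, 200          # grid points
        dx = 100.0                 # grid spacing (m)
        c = 3000.0                 # wave speed (m/s)
        dt = 0.5 * dx / c          # time step satisfying the 2D CFL bound dt < dx/(c*sqrt(2))

        u_prev = np.zeros((nx, nz))
        u_curr = np.zeros((nx, nz))
        u_curr[nx // 2, nz // 2] = 1.0        # point source at the grid centre

        for _ in range(500):
            lap = (np.roll(u_curr, 1, 0) + np.roll(u_curr, -1, 0) +
                   np.roll(u_curr, 1, 1) + np.roll(u_curr, -1, 1) - 4.0 * u_curr) / dx**2
            u_next = 2.0 * u_curr - u_prev + (c * dt)**2 * lap
            u_prev, u_curr = u_curr, u_next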

  19. Numerical and experimental analysis of the impact of a nuclear spent fuel cask

    Energy Technology Data Exchange (ETDEWEB)

    Aquaro, D. [Department of Mechanical, Nuclear and Production Engineering (DIMNP), Pisa University, Via Diotisalvi, Pisa (Italy); Zaccari, N., E-mail: nicola.zaccari@enel.i [Department of Mechanical, Nuclear and Production Engineering (DIMNP), Pisa University, Via Diotisalvi, Pisa (Italy); Di Prinzio, M.; Forasassi, G. [Department of Mechanical, Nuclear and Production Engineering (DIMNP), Pisa University, Via Diotisalvi, Pisa (Italy)

    2010-04-15

    This paper deals with the numerical and experimental analyses of a shell-type shock absorber for a nuclear spent fuel cask. Nine-meter free-drop tests performed on reduced-scale models are described. The results are compared with numerical simulations performed with FEM computer codes, considering the reduced-scale models as well as the prototype. The paper presents the results of a similitude analysis, with which the data obtained from the reduced-scale models can be extrapolated to the prototype. Small discrepancies were obtained using the larger-scale models (1:2 and 1:6), while the small-scale model (1:12) did not give reliable results. A 1:9 scale model provided useful information with an error of less than 20%.
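
    The extrapolation step relies on similitude relations. The sketch below applies the standard replica-scaling rules for drop tests (same material, same impact velocity for model and prototype): lengths and times scale with the length ratio, accelerations with its inverse, forces with its square, while stresses are unchanged. The numerical values are placeholders, not data from the paper, and the paper's own similitude analysis may differ in detail.

        # Sketch of replica-scaling extrapolation from a reduced-scale drop test to the
        # full-size cask. lam = model/prototype length ratio (e.g. 1/6 for a 1:6 model).
        def extrapolate_to_prototype(lam: float, model: dict) -> dict:
            return {
                "deformation_m":  model["deformation_m"] / lam,    # x_p = x_m / lam
                "duration_s":     model["duration_s"] / lam,       # t_p = t_m / lam
                "acceleration_g": model["acceleration_g"] * lam,   # a_p = a_m * lam
                "force_N":        model["force_N"] / lam**2,       # F_p = F_m / lam^2
                "stress_Pa":      model["stress_Pa"],              # stresses unchanged
            }

        # Placeholder model-test results for a hypothetical 1:6 model:
        print(extrapolate_to_prototype(1 / 6, {
            "deformation_m": 0.02, "duration_s": 0.004,
            "acceleration_g": 600.0, "force_N": 2.0e5, "stress_Pa": 2.5e8,
        }))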

  20. A Flexible Visualization Tool for Rapid Access to EFIT Results

    International Nuclear Information System (INIS)

    Zhang Ruirui; Xiao Bingjia; Luo Zhengping

    2014-01-01

    This paper introduces the design and implementation of an interactive tool, the EASTViewer, for the visualization of plasma equilibrium reconstruction results for EAST (the Experimental Advanced Superconducting Tokamak). To keep the tool operating-system independent, it is written in Python combined with the PyGTK toolkit. Thanks to its modular design, the EASTViewer provides a unified interface with great flexibility: it can access numerous data sources, either local data files or an MDSplus tree, and with pre-defined configuration files it can be extended to other tokamaks. The EASTViewer has been used as the major tool for visualizing equilibrium data since the second EAST campaign in 2008, and it has proved to offer a user-friendly interface, easy access to numerous data sources, and cross-platform operation. (fusion engineering)
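
    A unified data-access layer of the kind described, with local files and an MDSplus tree behind one interface, might look roughly like the sketch below. The class names, node paths, and local file layout are hypothetical; only the basic MDSplus Python calls (Tree, getNode, data) are assumed, and this is not the EASTViewer implementation.

        # Sketch of a unified equilibrium-data accessor: the same read() interface
        # serves either a local NumPy file or an MDSplus tree.
        from typing import Optional
        import numpy as np

        class LocalFileSource:
            """Reads signals from a local .npy file holding a dict (hypothetical layout)."""
            def __init__(self, path: str):
                self._data = np.load(path, allow_pickle=True).item()

            def read(self, signal: str):
                return self._data[signal]

        class MDSplusSource:
            """Reads signals from an MDSplus tree via the standard Python bindings."""
            def __init__(self, tree_name: str, shot: int):
                from MDSplus import Tree          # optional dependency, imported lazily
                self._tree = Tree(tree_name, shot)

            def read(self, signal: str):
                return self._tree.getNode(signal).data()

        def open_source(spec: str, shot: Optional[int] = None):
            """spec is either a local .npy filename or an MDSplus tree name (hypothetical)."""
            if spec.endswith(".npy"):
                return LocalFileSource(spec)
            return MDSplusSource(spec, shot)

        # Usage with placeholder names:
        # src = open_source("efit_east", shot=12345)
        # psi = src.read(r"\PSIRZ")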