Novel applications of the x-ray tracing software package McXtrace
Bergbäck Knudsen, Erik; Nielsen, Martin Meedom; Haldrup, Kristoffer
2014-01-01
We will present examples of applying the X-ray tracing software package McXtrace to different kinds of X-ray scattering experiments. In particular we will be focusing on time-resolved type experiments. Simulations of full scale experiments are particularly useful for this kind, especially when...... some of the issues encountered. Generally more than one or all of these effects are present at once. Simulations can in these cases be used to identify distinct footprints of such distortions and thus give the experimenter a means of deconvoluting them from the signal. We will present a study...... of this kind along with the newest developments of the McXtrace software package....
Powerful scriptable ray tracing package xrt
Klementiev, Konstantin; Chernikov, Roman
2014-09-01
We present an open source python based ray tracing tool that offers several useful features in graphical presentation, material properties, advanced calculations of synchrotron sources, implementation of diffractive and refractive elements, complex (also closed) surfaces and multiprocessing. The package has many usage examples which are supplied together with the code and visualized on its web page. We exemplify the present version by modeling (i) a curved crystal analyzer, (ii) a quarter wave plate, (iii) Bragg-Fresnel optics and (iv) multiple reflective and non-sequential optics (polycapillary). The present version implements the use of OpenCL framework that executes calculations on both CPUs and GPUs. Currently, the calculations of an undulator source on a GPU show a gain of about two orders of magnitude in computing time. The development version is successful in modelling the wavefront propagation. Two examples of diffraction on a plane mirror and a plane blazed grating are given for a beam with a finite energy band.
RayTrace: A Simplified Ray Tracing Software for use in AutoCad
Reimann, Gregers Peter; Tang, C.K.
2005-01-01
A design aid tool for testing and development of daylighting systems was developed. A simplified ray tracing software was programmed in Lisp for AutoCad. Only fully specularly reflective, fully transparent and fully absorbant surfaces can be defined in the software. The software is therefore best...
RayTrace: A Simplified Ray Tracing Software for use in AutoCad
Reimann, Gregers Peter; Tang, C.K.
2005-01-01
A design aid tool for testing and development of daylighting systems was developed. A simplified ray tracing software was programmed in Lisp for AutoCad. Only fully specularly reflective, fully transparent and fully absorbant surfaces can be defined in the software. The software is therefore best...
McXtrace: A modern ray-tracing package for X-ray instrumentation
Bergbäck Knudsen, Erik; Prodi, A.; Willendrup, Peter Kjær
2011-01-01
we present the developments of the McXtrace project, a free, open source software package based on Monte Carlo ray tracing for simulations and optimisation of complete X-ray instruments. The methodology of building a simulation is presented through an example beamline, namely Beamline 811 at MAX-...
Ray-tracing software comparison for linear focusing solar collectors
Osório, Tiago; Horta, Pedro; Larcher, Marco; Pujol-Nadal, Ramón; Hertel, Julian; van Rooyen, De Wet; Heimsath, Anna; Schneider, Simon; Benitez, Daniel; Frein, Antoine; Denarie, Alice
2016-05-01
Ray-Tracing software tools have been widely used in the optical design of solar concentrating collectors. In spite of the ability of these tools to assess the geometrical and material aspects impacting the optical performance of concentrators, their use in combination with experimental measurements in the framework of collector testing procedures as not been implemented, to the date, in none of the current solar collector testing standards. In the latest revision of ISO9806 an effort was made to include linear focusing concentrating collectors but some practical and theoretical difficulties emerged. A Ray-Tracing analysis could provide important contributions to overcome these issues, complementing the experimental results obtained through thermal testing and allowing the achievement of more thorough testing outputs with lower experimental requirements. In order to evaluate different available software tools a comparison study was conducted. Taking as representative technologies for line-focus concentrators the Parabolic Trough Collector and the Linear Fresnel Reflector Collector, two exemplary cases with predefined conditions - geometry, sun model and material properties - were simulated with different software tools. This work was carried out within IEA/SHC Task 49 "Solar Heat Integration in Industrial Processes".
MCViNE -- An object oriented Monte Carlo neutron ray tracing simulation package
Lin, Jiao Y Y; Granroth, Garrett E; Abernathy, Douglas L; Lumsden, Mark D; Winn, Barry; Aczel, Adam A; Aivazis, Michael; Fultz, Brent
2015-01-01
MCViNE (Monte-Carlo VIrtual Neutron Experiment) is a versatile Monte Carlo (MC) neutron ray-tracing program that provides researchers with tools for performing computer modeling and simulations that mirror real neutron scattering experiments. By adopting modern software engineering practices such as using composite and visitor design patterns for representing and accessing neutron scatterers, and using recursive algorithms for multiple scattering, MCViNE is flexible enough to handle sophisticated neutron scattering problems including, for example, neutron detection by complex detector systems, and single and multiple scattering events in a variety of samples and sample environments. In addition, MCViNE can take advantage of simulation components in linear-chain-based MC ray tracing packages widely used in instrument design and optimization, as well as NumPy-based components that make prototypes useful and easy to develop. These developments have enabled us to carry out detailed simulations of neutron scatteri...
User and programmers guide to the neutron ray-tracing package McStas, version 1.2
Nielsen, K.; Lefmann, K.
2000-01-01
The software package McStas is a tool for writing Monte Carlo ray-tracing simulations of neutron scattering instruments with very high complexity and precision. The simulations can compute all aspects of the performance of instruments and can thus be usedto optimize the use of existing equipment...
MCViNE - An object oriented Monte Carlo neutron ray tracing simulation package
Lin, Jiao Y. Y.; Smith, Hillary L.; Granroth, Garrett E.; Abernathy, Douglas L.; Lumsden, Mark D.; Winn, Barry; Aczel, Adam A.; Aivazis, Michael; Fultz, Brent
2016-02-01
MCViNE (Monte-Carlo VIrtual Neutron Experiment) is an open-source Monte Carlo (MC) neutron ray-tracing software for performing computer modeling and simulations that mirror real neutron scattering experiments. We exploited the close similarity between how instrument components are designed and operated and how such components can be modeled in software. For example we used object oriented programming concepts for representing neutron scatterers and detector systems, and recursive algorithms for implementing multiple scattering. Combining these features together in MCViNE allows one to handle sophisticated neutron scattering problems in modern instruments, including, for example, neutron detection by complex detector systems, and single and multiple scattering events in a variety of samples and sample environments. In addition, MCViNE can use simulation components from linear-chain-based MC ray tracing packages which facilitates porting instrument models from those codes. Furthermore it allows for components written solely in Python, which expedites prototyping of new components. These developments have enabled detailed simulations of neutron scattering experiments, with non-trivial samples, for time-of-flight inelastic instruments at the Spallation Neutron Source. Examples of such simulations for powder and single-crystal samples with various scattering kernels, including kernels for phonon and magnon scattering, are presented. With simulations that closely reproduce experimental results, scattering mechanisms can be turned on and off to determine how they contribute to the measured scattering intensities, improving our understanding of the underlying physics.
The Gaussian Laser Angular Distribution in HYDRA's 3D Laser Ray Trace Package
Sepke, Scott M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)
2017-04-10
In this note, the angular distribution of rays launched by the 3D LZR ray trace package is derived for Gaussian beams (npower==2) with bm model=3±. Beams with bm model=+3 have a nearly at distribution, and beams with bm model=-3 have a nearly linear distribution when the spot size is large compared to the wavelength.
Modeling pyramidal sensors in ray-tracing software by a suitable user-defined surface
Antichi, Jacopo; Munari, Matteo; Magrin, Demetrio; Riccardi, Armando
2016-04-01
Following the unprecedented results in terms of performances delivered by the first light adaptive optics system at the Large Binocular Telescope, there has been a wide-spread and increasing interest on the pyramid wavefront sensor (PWFS), which is the key component, together with the adaptive secondary mirror, of the adaptive optics (AO) module. Currently, there is no straightforward way to model a PWFS in standard sequential ray-tracing software. Common modeling strategies tend to be user-specific and, in general, are unsatisfactory for general applications. To address this problem, we have developed an approach to PWFS modeling based on user-defined surface (UDS), whose properties reside in a specific code written in C language, for the ray-tracing software ZEMAX™. With our approach, the pyramid optical component is implemented as a standard surface in ZEMAX™, exploiting its dynamic link library (DLL) conversion then greatly simplifying ray tracing and analysis. We have utilized the pyramid UDS DLL surface-referred to as pyramidal acronyms may be too risky (PAM2R)-in order to design the current PWFS-based AO system for the Giant Magellan Telescope, evaluating tolerances, with particular attention to the angular sensitivities, by means of sequential ray-tracing tools only, thus verifying PAM2R reliability and robustness. This work indicates that PAM2R makes the design of PWFS as simple as that of other optical standard components. This is particularly suitable with the advent of the extremely large telescopes era for which complexity is definitely one of the main challenges.
Rybkin, G
2012-01-01
Software packaging is indispensable part of build and prerequisite for deployment processes. Full ATLAS software stack consists of TDAQ, HLT, and Offline software. These software groups depend on some 80 external software packages. We present tools, package PackDist, developed and used to package all this software except for TDAQ project. PackDist is based on and driven by CMT, ATLAS software configuration and build tool, and consists of shell and Python scripts. The packaging unit used is CMT project. Each CMT project is packaged as several packages - platform dependent (one per platform available), source code excluding header files, other platform independent files, documentation, and debug information packages (the last two being built optionally). Packaging can be done recursively to package all the dependencies. The whole set of packages for one software release, distribution kit, also includes configuration packages and contains some 120 packages for one platform. Also packaged are physics analysis pro...
The Super Gaussian Laser Intensity Profile in HYDRA's 3D Laser Ray Trace Package
Sepke, Scott M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)
2017-01-05
In this note, the laser focal plane intensity pro le for a beam modeled using the 3D ray trace package in HYDRA is determined. First, the analytical model is developed followed by a practical numerical model for evaluating the resulting computationally intensive normalization factor for all possible input parameters.
Weeratunga, S K
2008-11-06
Ares and Kull are mature code frameworks that support ALE hydrodynamics for a variety of HEDP applications at LLNL, using two widely different meshing approaches. While Ares is based on a 2-D/3-D block-structured mesh data base, Kull is designed to support unstructured, arbitrary polygonal/polyhedral meshes. In addition, both frameworks are capable of running applications on large, distributed-memory parallel machines. Currently, both these frameworks separately support assorted collections of physics packages related to HEDP, including one for the energy deposition by laser/ion-beam ray tracing. This study analyzes the options available for developing a common laser/ion-beam ray tracing package that can be easily shared between these two code frameworks and concludes with a set of recommendations for its development.
Rybkin, Grigory
2012-12-01
Software packaging is indispensable part of build and prerequisite for deployment processes. Full ATLAS software stack consists of TDAQ, HLT, and Offline software. These software groups depend on some 80 external software packages. We present tools, package PackDist, developed and used to package all this software except for TDAQ project. PackDist is based on and driven by CMT, ATLAS software configuration and build tool, and consists of shell and Python scripts. The packaging unit used is CMT project. Each CMT project is packaged as several packages—platform dependent (one per platform available), source code excluding header files, other platform independent files, documentation, and debug information packages (the last two being built optionally). Packaging can be done recursively to package all the dependencies. The whole set of packages for one software release, distribution kit, also includes configuration packages and contains some 120 packages for one platform. Also packaged are physics analysis projects (currently 6) used by particular physics groups on top of the full release. The tools provide an installation test for the full distribution kit. Packaging is done in two formats for use with the Pacman and RPM package managers. The tools are functional on the platforms supported by ATLAS—GNU/Linux and Mac OS X. The packaged software is used for software deployment on all ATLAS computing resources from the detector and trigger computing farms, collaboration laboratories computing centres, grid sites, to physicist laptops, and CERN VMFS and covers the use cases of running all applications as well as of software development.
The Use of Pro/Engineer CAD Software and Fishbowl Tool Kit in Ray-tracing Analysis
Nounu, Hatem N.; Kim, Myung-Hee Y.; Ponomarev, Artem L.; Cucinotta, Francis A.
2009-01-01
This document is designed as a manual for a user who wants to operate the Pro/ENGINEER (ProE) Wildfire 3.0 with the NASA Space Radiation Program's (SRP) custom-designed Toolkit, called 'Fishbowl', for the ray tracing of complex spacecraft geometries given by a ProE CAD model. The analysis of spacecraft geometry through ray tracing is a vital part in the calculation of health risks from space radiation. Space radiation poses severe risks of cancer, degenerative diseases and acute radiation sickness during long-term exploration missions, and shielding optimization is an important component in the application of radiation risk models. Ray tracing is a technique in which 3-dimensional (3D) vehicle geometry can be represented as the input for the space radiation transport code and subsequent risk calculations. In ray tracing a certain number of rays (on the order of 1000) are used to calculate the equivalent thickness, say of aluminum, of the spacecraft geometry seen at a point of interest called the dose point. The rays originate at the dose point and terminate at a homogenously distributed set of points lying on a sphere that circumscribes the spacecraft and that has its center at the dose point. The distance a ray traverses in each material is converted to aluminum or other user-selected equivalent thickness. Then all equivalent thicknesses are summed up for each ray. Since each ray points to a direction, the aluminum equivalent of each ray represents the shielding that the geometry provides to the dose point from that particular direction. This manual will first list for the user the contact information for help in installing ProE and Fishbowl in addition to notes on the platform support and system requirements information. Second, the document will show the user how to use the software to ray trace a Pro/E-designed 3-D assembly and will serve later as a reference for troubleshooting. The user is assumed to have previous knowledge of ProE and CAD modeling.
The Ettention software package
Dahmen, Tim, E-mail: Tim.Dahmen@dfki.de [German Research Center for Artificial Intelligence GmbH (DFKI), 66123 Saarbrücken (Germany); Saarland University, 66123 Saarbrücken (Germany); Marsalek, Lukas [Eyen SE, Na Nivách 1043/16, 141 00 Praha 4 (Czech Republic); Saarland University, 66123 Saarbrücken (Germany); Marniok, Nico [Saarland University, 66123 Saarbrücken (Germany); Turoňová, Beata [Saarland University, 66123 Saarbrücken (Germany); IMPRS-CS, Max-Planck Institute for Informatics, Campus E 1.4, 66123 Saarbrücken (Germany); Bogachev, Sviatoslav [Saarland University, 66123 Saarbrücken (Germany); Trampert, Patrick; Nickels, Stefan [German Research Center for Artificial Intelligence GmbH (DFKI), 66123 Saarbrücken (Germany); Slusallek, Philipp [German Research Center for Artificial Intelligence GmbH (DFKI), 66123 Saarbrücken (Germany); Saarland University, 66123 Saarbrücken (Germany)
2016-02-15
We present a novel software package for the problem “reconstruction from projections” in electron microscopy. The Ettention framework consists of a set of modular building-blocks for tomographic reconstruction algorithms. The well-known block iterative reconstruction method based on Kaczmarz algorithm is implemented using these building-blocks, including adaptations specific to electron tomography. Ettention simultaneously features (1) a modular, object-oriented software design, (2) optimized access to high-performance computing (HPC) platforms such as graphic processing units (GPU) or many-core architectures like Xeon Phi, and (3) accessibility to microscopy end-users via integration in the IMOD package and eTomo user interface. We also provide developers with a clean and well-structured application programming interface (API) that allows for extending the software easily and thus makes it an ideal platform for algorithmic research while hiding most of the technical details of high-performance computing. - Highlights: • Novel software package for “reconstruction from projections” in electron microscopy. • Support for high-resolution reconstructions on iterative reconstruction algorithms. • Support for CPU, GPU and Xeon Phi. • Integration in the IMOD software. • Platform for algorithm researchers: object oriented, modular design.
Christophe Lièbe
2010-01-01
Full Text Available This paper presents a new software for design of through-the-wall imaging radars. The first part describes the evolution of a ray tracing simulator, originally designed for propagation of narrowband signals, and then for ultra-wideband signals. This simulator allows to obtain temporal channel response to a wide-band emitter (3 GHz to 10 GHz. An experimental method is also described to identify the propagation paths. Simulation results are compared to propagation experiments under the same conditions. Different configurations are tested and then discussed. Finally, a configuration of through-the-wall imaging radar is proposed, with different antennas patterns and different targets. Simulated images will be helpful for understanding the experiment obtained images.
The Ettention software package.
Dahmen, Tim; Marsalek, Lukas; Marniok, Nico; Turoňová, Beata; Bogachev, Sviatoslav; Trampert, Patrick; Nickels, Stefan; Slusallek, Philipp
2016-02-01
We present a novel software package for the problem "reconstruction from projections" in electron microscopy. The Ettention framework consists of a set of modular building-blocks for tomographic reconstruction algorithms. The well-known block iterative reconstruction method based on Kaczmarz algorithm is implemented using these building-blocks, including adaptations specific to electron tomography. Ettention simultaneously features (1) a modular, object-oriented software design, (2) optimized access to high-performance computing (HPC) platforms such as graphic processing units (GPU) or many-core architectures like Xeon Phi, and (3) accessibility to microscopy end-users via integration in the IMOD package and eTomo user interface. We also provide developers with a clean and well-structured application programming interface (API) that allows for extending the software easily and thus makes it an ideal platform for algorithmic research while hiding most of the technical details of high-performance computing. Copyright © 2015 Elsevier B.V. All rights reserved.
Software package requirements and procurement
1996-01-01
This paper outlines the problems of specifying requirements and deploying these requirements in the procurement of software packages. Despite the fact that software construction de novo is the exception rather than the rule, little or no support for the task of formulating requirements to support assessment and selection among existing software packages has been developed. We analyse the problems arising in this process and review related work. We outline the key components of a programme of ...
TASC Graphics Software Package.
1982-12-01
RD-I55 861 TSC GRPHICS SOFTWRE PCKRGE(U) NLYTIC SCIENCES i/I RD -t CORP RERDING MA M R TANG DEC 82 TR-1946-6U~~cLss AFG L-TR-gi-1388 Fi9629-89-C...extensions were made to allow TGSP to use color graphics. 2.1 INTERACTIVE TGSP NCAR was designed to be a general plot package for use with many different...plotting devices. It is designed to accept high level commands and generate an intermediate set of commands called metacode and to then use device
RAY TRACING IMPLEMENTATION IN JAVA PROGRAMMING LANGUAGE
Aybars UĞUR
2002-01-01
Full Text Available In this paper realism in computer graphics and components providing realism are discussed at first. It is mentioned about illumination models, surface rendering methods and light sources for this aim. After that, ray tracing which is a technique for creating two dimensional image of a three-dimensional virtual environment is explained briefly. A simple ray tracing algorithm was given. "SahneIzle" which is a ray tracing program implemented in Java programming language which can be used on the internet is introduced. As a result, importance of network-centric ray tracing software is discussed.
Yuan, Cadmus C. A.
2015-12-01
Optical ray tracing modeling applied Beer-Lambert method in the single luminescence material system to model the white light pattern from blue LED light source. This paper extends such algorithm to a mixed multiple luminescence material system by introducing the equivalent excitation and emission spectrum of individual luminescence materials. The quantum efficiency numbers of individual material and self-absorption of the multiple luminescence material system are considered as well. By this combination, researchers are able to model the luminescence characteristics of LED chip-scaled packaging (CSP), which provides simple process steps and the freedom of the luminescence material geometrical dimension. The method will be first validated by the experimental results. Afterward, a further parametric investigation has been then conducted.
Gatland, Ian R.
2002-01-01
Proposes a ray tracing approach to thin lens analysis based on a vector form of Snell's law for paraxial rays as an alternative to the usual approach in introductory physics courses. The ray tracing approach accommodates skew rays and thus provides a complete analysis. (Author/KHR)
Software design practice using two SCADA software packages
Basse, K.P.; Christensen, Georg Kronborg; Frederiksen, P. K.
1996-01-01
Typical software development for manufacturing control is done either by specialists with consideral real-time programming experience or done by the adaptation of standard software packages for manufacturing control. After investigation and test of two commercial software packages: "InTouch" and ......Touch" and "Fix", it is argued, that a more efficient software solution can be achieved by utilising an integrated specification for SCADA and PLC-programming. Experiences gained from process control is planned investigated for descrete parts manufacturing....
Software design practice using two SCADA software packages
Basse, K.P.; Christensen, Georg Kronborg; Frederiksen, P. K.
1996-01-01
Typical software development for manufacturing control is done either by specialists with consideral real-time programming experience or done by the adaptation of standard software packages for manufacturing control. After investigation and test of two commercial software packages: "InTouch" and ......Touch" and "Fix", it is argued, that a more efficient software solution can be achieved by utilising an integrated specification for SCADA and PLC-programming. Experiences gained from process control is planned investigated for descrete parts manufacturing....
Lam, Wai Sze Tiffany
Optical components made of anisotropic materials, such as crystal polarizers and crystal waveplates, are widely used in many complex optical system, such as display systems, microlithography, biomedical imaging and many other optical systems, and induce more complex aberrations than optical components made of isotropic materials. The goal of this dissertation is to accurately simulate the performance of optical systems with anisotropic materials using polarization ray trace. This work extends the polarization ray tracing calculus to incorporate ray tracing through anisotropic materials, including uniaxial, biaxial and optically active materials. The 3D polarization ray tracing calculus is an invaluable tool for analyzing polarization properties of an optical system. The 3x3 polarization ray tracing P matrix developed for anisotropic ray trace assists tracking the 3D polarization transformations along a ray path with series of surfaces in an optical system. To better represent the anisotropic light-matter interactions, the definition of the P matrix is generalized to incorporate not only the polarization change at a refraction/reflection interface, but also the induced optical phase accumulation as light propagates through the anisotropic medium. This enables realistic modeling of crystalline polarization elements, such as crystal waveplates and crystal polarizers. The wavefront and polarization aberrations of these anisotropic components are more complex than those of isotropic optical components and can be evaluated from the resultant P matrix for each eigen-wavefront as well as for the overall image. One incident ray refracting or reflecting into an anisotropic medium produces two eigenpolarizations or eigenmodes propagating in different directions. The associated ray parameters of these modes necessary for the anisotropic ray trace are described in Chapter 2. The algorithms to calculate the P matrix from these ray parameters are described in Chapter 3 for
Software packages for food engineering needs
Abakarov, Alik
2011-01-01
The graphic user interface (GUI) software packages “ANNEKs” and “OPT-PROx” are developed to meet food engineering needs. “OPT-RROx” (OPTimal PROfile) is software developed to carry out thermal food processing optimization based on the variable retort temperature processing and global optimization technique. “ANNEKs” (Artificial Neural Network Enzyme Kinetics) is software designed for determining the kinetics of enzyme hydrolysis of protein at different initial reaction parameters based on the...
FITSH -- a software package for image processing
Pál, András
2011-01-01
In this paper we describe the main features of the software package named FITSH, intended to provide a standalone environment for analysis of data acquired by imaging astronomical detectors. The package provides utilities both for the full pipeline of subsequent related data processing steps (incl. image calibration, astrometry, source identification, photometry, differential analysis, low-level arithmetic operations, multiple image combinations, spatial transformations and interpolations, etc.) and for aiding the interpretation of the (mainly photometric and/or astrometric) results. The package also features a consistent implementation of photometry based on image subtraction, point spread function fitting and aperture photometry and provides easy-to-use interfaces for comparisons and for picking the most suitable method for a particular problem. This set of utilities found in the package are built on the top of the commonly used UNIX/POSIX shells (hence the name of the package), therefore both frequently us...
Software Package for Bio-Signal Analysis
2002-10-15
We have developed a MatlabTM based software package for bio -signal analysis. The software is based on modular design and can thus be easily adapted...to fit on analysis of various kind of time variant or event-related bio -signals. Currently analysis programs for event-related potentials (ERP) heart...rate variability (HRV), galvanic skin responses (GSR) and quantitative EEG (qEEG) are implemented. A tool for time varying spectral analysis of bio
Software Package STATISTICA and Educational Process
Demidova Liliya
2016-01-01
Full Text Available The paper describes the main aspects of application of the software package STATISTICA in the educational process. Technologies of data mining which can be useful for students researches have been considered. The main tools of these technologies have been discussed.
Package Coupling Measurement in Object-Oriented Software
Varun Gupta; Jitender Kumar Chhabra
2009-01-01
The grouping of correlated classes into a package helps in better organization of modern object-oriented software. The quality of such packages needs to be measured so as to estimate their utilization. In this paper, new package coupling metrics are proposed, which also take into consideration the hierarchical structure of packages and direction of connections among package elements. The proposed measures have been validated theoretically as well as empirically using 18 packages taken from two open source software systems. The results obtained from this study show strong correlation between package coupling and understandability of the package which suggests that proposed metrics could be further used to represent other external software quality factors.
An Overview on Wavelet Software Packages
无
2001-01-01
Wavelet analysis provides very powerful problem-solving tools foranalyzing, en coding, compressing, reconstructing, and modeling signals and images. The amount of wavelets-related software has been constantly multiplying. Many wavelet ana lysis tools are widely available. This overview represents a significant survey for many currently available packages. It will be of great benefit to engineers and researchers for using the toolkits and developing new software. The beginner to learning wavelets can also get a great help from the review. If you browse a round at some of the Internet sites listed in the reference of this paper, you m ay find more plentiful wavelet resources.
IONORT: IONOsphere Ray-Tracing - Ray-tracing program in ionospheric magnetoplasma
Bianchi, Cesidio; Settimi, Alessandro; Azzarone, Adriano
2010-01-01
The application package "IONORT" for the calculation of ray-tracing can be used by customers using the Windows operating system. It is a program whose interface with the user is created in MATLAB. In fact, the program launches an executable that integrates the system of differential equations written in Fortran and importing the output in the MATLAB program, which generates graphics and other information on the ray. This work is inspired mainly by the program of Jones and Stephenson, widespre...
ASPT software source code: ASPT signal excision software package
Parliament, Hugh
1992-08-01
The source code for the ASPT Signal Excision Software Package which is part of the Adaptive Signal Processing Testbed (ASPT) is presented. The source code covers the programs 'excision', 'ab.out', 'd0.out', 'bd1.out', 'develop', 'despread', 'sorting', and 'convert'. These programs are concerned with collecting data, filtering out interference from a spread spectrum signal, analyzing the results, and developing and testing new filtering algorithms.
Ray-tracing and physical-optics analysis of the aperture efficiency in a radio telescope.
Olmi, Luca; Bolli, Pietro
2007-07-01
The performance of telescope systems working at microwave or visible-IR wavelengths is typically described in terms of different parameters according to the wavelength range. Most commercial ray-tracing packages have been specifically designed for use with visible-IR systems and thus, though very flexible and sophisticated, do not provide the appropriate parameters to fully describe microwave antennas and to compare with specifications. We demonstrate that the Strehl ratio is equal to the phase efficiency when the apodization factor is taken into account. The phase efficiency is the most critical contribution to the aperture efficiency of an antenna and the most difficult parameter to optimize during telescope design. The equivalence between the Strehl ratio and the phase efficiency gives the designer/user of the telescope the opportunity to use the faster commercial ray-tracing software to optimize the design. We also discuss the results of several tests of the validity of this relationship, which we carried out using ray-tracing software (ZEMAX) and full physical-optics software (GRASP9.3) applied to three different telescope designs that span a factor of approximately 10 in terms of D/lambda. The maximum measured discrepancy between phase efficiency and Strehl ratio varies between approximately 0.4% and 1.9% up to an offset angle of >40 beams, depending on the optical configuration, but it is always less than 0.5% where the Strehl ratio is >0.95.
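The claimed equivalence can be checked numerically in a minimal setting. The sketch below compares the phase efficiency of a uniformly illuminated aperture (so the apodization factor is trivial) against the Maréchal-style estimate Strehl ≈ exp(−σ²) for a small random phase error σ in radians; the sample count and error level are illustrative, not taken from the paper.

```python
import cmath
import math
import random

random.seed(1)  # deterministic toy data

# Hypothetical aperture sampled at N points with uniform amplitude and
# small Gaussian phase errors (radians); a real design uses a 2-D phase map.
N = 10_000
phase = [random.gauss(0.0, 0.15) for _ in range(N)]

# Phase efficiency: on-axis coherent gain relative to a perfectly phased aperture.
field = sum(cmath.exp(1j * p) for p in phase)
eta_phase = abs(field) ** 2 / N ** 2

# Marechal-style Strehl estimate from the RMS phase error.
mean = sum(phase) / N
var = sum((p - mean) ** 2 for p in phase) / N
strehl = math.exp(-var)

# For small errors the two numbers should nearly coincide.
print(round(eta_phase, 4), round(strehl, 4))
```

The agreement degrades for large phase errors, mirroring the paper's observation that the discrepancy stays small only where the Strehl ratio is high.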
IONORT: IONOsphere Ray-Tracing
Bianchi, C.; Settimi, A; Azzarone, A.
2010-01-01
The application package "IONORT" for ray-tracing calculation can be used by users running the Windows operating system. It is a program whose graphical user interface is implemented in MATLAB. In practice, the program launches an executable, written in Fortran, that integrates the system of differential equations and imports its output into MATLAB, which generates the plots and other information about the ray. To complete this introduction, it should be said ...
Software refactoring at the package level using clustering techniques
Alkhalid, A.
2011-01-01
Enhancing, modifying or adapting the software to new requirements increases the internal software complexity. Software with high level of internal complexity is difficult to maintain. Software refactoring reduces software complexity and hence decreases the maintenance effort. However, software refactoring becomes quite challenging task as the software evolves. The authors use clustering as a pattern recognition technique to assist in software refactoring activities at the package level. The approach presents a computer aided support for identifying ill-structured packages and provides suggestions for software designer to balance between intra-package cohesion and inter-package coupling. A comparative study is conducted applying three different clustering techniques on different software systems. In addition, the application of refactoring at the package level using an adaptive k-nearest neighbour (A-KNN) algorithm is introduced. The authors compared A-KNN technique with the other clustering techniques (viz. single linkage algorithm, complete linkage algorithm and weighted pair-group method using arithmetic averages). The new technique shows competitive performance with lower computational complexity. © 2011 The Institution of Engineering and Technology.
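To make the clustering idea concrete, here is a hedged sketch of one of the compared baselines, single-linkage agglomerative clustering, applied to a toy class-similarity matrix to suggest a package split. The class names, similarity values, and stopping threshold are invented for illustration; the paper's A-KNN variant is not reproduced here.

```python
# Toy input: classes of a hypothetical module and pairwise similarities
# (e.g. derived from shared dependencies); symmetric, in [0, 1].
classes = ["Parser", "Lexer", "Ast", "Renderer", "Canvas"]
sim = {
    ("Parser", "Lexer"): 0.9, ("Parser", "Ast"): 0.8, ("Lexer", "Ast"): 0.7,
    ("Renderer", "Canvas"): 0.85, ("Ast", "Renderer"): 0.2,
}

def s(a, b):
    return sim.get((a, b), sim.get((b, a), 0.0))

def single_linkage(items, threshold):
    """Repeatedly merge the two most similar clusters until the best
    single-linkage similarity falls below the threshold."""
    clusters = [{c} for c in items]
    while len(clusters) > 1:
        best, pair = -1.0, None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                link = max(s(a, b) for a in clusters[i] for b in clusters[j])
                if link > best:
                    best, pair = link, (i, j)
        if best < threshold:
            break
        i, j = pair
        clusters[i] |= clusters.pop(j)
    return clusters

result = single_linkage(classes, 0.5)
print(result)  # two cohesive groups: parsing classes vs. rendering classes
```

Each resulting cluster is a candidate package with high intra-package cohesion and low inter-package coupling, which is exactly the balance the authors' tool tries to suggest.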
International Inventory of Software Packages in the Information Field.
Keren, Carl, Ed.; Sered, Irina, Ed.
Designed to provide guidance in selecting appropriate software for library automation, information storage and retrieval, or management of bibliographic databases, this inventory describes 188 computer software packages. The information was obtained through a questionnaire survey of 600 software suppliers and developers who were asked to describe…
RAY TRACING RENDER MENGGUNAKAN FRAGMENT ANTI ALIASING
Febriliyan Samopa
2008-07-01
Rendering is generating surface and three-dimensional effects on an object displayed on a monitor screen. Ray tracing, a rendering method that traces a ray for each image pixel, has a drawback: aliasing (the "jaggies" effect). There are several methods for performing anti-aliasing. One of them is OGSS (Ordered Grid Super Sampling). OGSS removes aliasing well; however, it requires more computation time, since the sampling of every pixel in the image is increased. Fragment Anti-Aliasing (FAA) is a newer alternative that copes with this drawback. FAA checks the image while rendering a scene: the jaggies effect occurs only at curved and gradient objects, so only those parts of an object undergo sampling magnification. After the magnified samples are computed, a downsample is performed to retrieve the original pixel values. Experimental results show that the software implements ray tracing well to form images, and implements both the FAA and OGSS techniques for anti-aliasing. In general, rendering using FAA is faster than using OGSS.
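The sample-budget argument behind FAA can be sketched without a full ray tracer. Below, a trivial `shade` function stands in for tracing one ray (a filled disk gives a curved silhouette); OGSS supersamples every pixel, while the FAA-style pass refines only pixels whose centre sample differs from a neighbour. All names and the 4x4 sample grid are illustrative, not the paper's implementation.

```python
W = H = 32
CX = CY = 16.0
R = 10.0

def shade(x, y):
    """Stand-in for tracing one ray: 1.0 inside the disk, 0.0 outside."""
    return 1.0 if (x - CX) ** 2 + (y - CY) ** 2 <= R * R else 0.0

def supersample(px, py, n=4):
    """Ordered-grid supersampling: average an n-by-n grid within the pixel."""
    total = 0.0
    for i in range(n):
        for j in range(n):
            total += shade(px + (i + 0.5) / n, py + (j + 0.5) / n)
    return total / (n * n)

# OGSS refines every pixel: n*n rays each.
ogss_samples = W * H * 16

# FAA-style pass: one centre ray per pixel, then refine only pixels whose
# centre value differs from a 4-neighbour (i.e. silhouette pixels).
centre = {(x, y): shade(x + 0.5, y + 0.5) for x in range(W) for y in range(H)}
edges = [p for p in centre
         if any(centre.get((p[0] + dx, p[1] + dy), centre[p]) != centre[p]
                for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)))]
refined = {p: supersample(*p) for p in edges}
faa_samples = W * H + len(edges) * 16

print(len(edges), faa_samples, ogss_samples)
```

Only the thin ring of silhouette pixels is refined, so the FAA-style budget stays far below the uniform OGSS budget, which is the speed-up the experiments report.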
ADAPTIVE ASYNCHRONOUS SIMULATED AND METRICAL SOFTWARE PACKAGE IN AIRCRAFT
Mr. Vladimir A. Tikhomirov
2016-09-01
The article is devoted to the peculiarities and problems of developing applied measuring software packages, at the mass-production stage, for testing and measuring aircraft avionics.
Validation of Ray Tracing Code Refraction Effects
Heath, Stephanie L.; McAninch, Gerry L.; Smith, Charles D.; Conner, David A.
2008-01-01
NASA's current predictive capabilities using the ray tracing program (RTP) are validated using helicopter noise data taken at Eglin Air Force Base in 2007. By including refractive propagation effects due to wind and temperature, the ray tracing code is able to explain large variations in the data observed during the flight test.
Library Automation Software Packages used in Academic Libraries of Nepal
Sharma (Baral), Sabitri
2007-01-01
This thesis presents a comparative assessment of the library automation software packages used in Nepalese academic libraries. It focuses on the evaluation of software on the basis of certain important checkpoints. It also highlights the importance of library automation, library activities and services.
Quantification of myocardial perfusion defects using three different software packages
Svensson, Annmarie; Aakesson, Liz [Department of Clinical Physiology, Malmoe University Hospital, 205 02, Malmoe (Sweden); Edenbrandt, Lars [Department of Clinical Physiology, Malmoe University Hospital, 205 02, Malmoe (Sweden); Department of Clinical Physiology, Sahlgrenska University Hospital, Gothenburg (Sweden)
2004-02-01
Software packages are widely used for quantification of myocardial perfusion defects. The quantification is used to assist the physician in his/her interpretation of the study. The purpose of this study was to compare the quantification of reversible perfusion defects by three different commercially available software packages. We included 50 consecutive patients who underwent myocardial perfusion single-photon emission tomography (SPET) with a 2-day technetium-99m tetrofosmin protocol. Two experienced technologists processed the studies using the following three software packages: Cedars Quantitative Perfusion SPECT, Emory Cardiac Toolbox and 4D-MSPECT. The same sets of short axis slices were used as input to all three software packages. Myocardial uptake was scored in 20 segments for both the rest and the stress studies. The summed difference score (SDS) was calculated for each patient and the SDS values were classified into: normal (<4), mildly abnormal (4-8), moderately abnormal (9-13), and severely abnormal (>13). All three software packages were in agreement that 21 patients had a normal SDS, four patients had a mildly abnormal SDS and one patient had a severely abnormal SDS. In the remaining 24 patients (48%) there was disagreement between the software packages regarding SDS classification. A difference in classification of more than one step between the highest and lowest scores, for example from normal to moderately abnormal or from mildly to severely abnormal, was found in six of these 24 patients. Widely used software packages commonly differ in their quantification of myocardial perfusion defects. The interpreting physician should be aware of these differences when using scoring systems. (orig.)
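The scoring pipeline the study compares can be written down directly from the abstract: 20 segments scored at rest and at stress, a summed difference score (SDS) computed as the summed stress score minus the summed rest score, and the stated cut-offs applied. The toy patient below is invented; the per-segment scoring scales of the three commercial packages are not reproduced here.

```python
def sds(stress_scores, rest_scores):
    """Summed difference score: summed stress score minus summed rest score."""
    assert len(stress_scores) == len(rest_scores) == 20
    return sum(stress_scores) - sum(rest_scores)

def classify(score):
    """Cut-offs as given in the abstract."""
    if score < 4:
        return "normal"
    if score <= 8:
        return "mildly abnormal"
    if score <= 13:
        return "moderately abnormal"
    return "severely abnormal"

# Hypothetical patient: a mild reversible defect in four segments.
rest = [0] * 20
stress = [0] * 16 + [1, 2, 2, 1]
print(sds(stress, rest), classify(sds(stress, rest)))  # 6 mildly abnormal
```

Because the categories are only one unit apart at the boundaries, small per-segment scoring differences between packages can push a patient across a class boundary, which is exactly the disagreement the study quantifies.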
Recent developments in the ABINIT software package
Gonze, X.; Jollet, F.; Abreu Araujo, F.; Adams, D.; Amadon, B.; Applencourt, T.; Audouze, C.; Beuken, J.-M.; Bieder, J.; Bokhanchuk, A.; Bousquet, E.; Bruneval, F.; Caliste, D.; Côté, M.; Dahm, F.; Da Pieve, F.; Delaveau, M.; Di Gennaro, M.; Dorado, B.; Espejo, C.; Geneste, G.; Genovese, L.; Gerossier, A.; Giantomassi, M.; Gillet, Y.; Hamann, D. R.; He, L.; Jomard, G.; Laflamme Janssen, J.; Le Roux, S.; Levitt, A.; Lherbier, A.; Liu, F.; Lukačević, I.; Martin, A.; Martins, C.; Oliveira, M. J. T.; Poncé, S.; Pouillon, Y.; Rangel, T.; Rignanese, G.-M.; Romero, A. H.; Rousseau, B.; Rubel, O.; Shukri, A. A.; Stankovski, M.; Torrent, M.; Van Setten, M. J.; Van Troeye, B.; Verstraete, M. J.; Waroquiers, D.; Wiktor, J.; Xu, B.; Zhou, A.; Zwanziger, J. W.
2016-08-01
ABINIT is a package whose main program allows one to find the total energy, charge density, electronic structure and many other properties of systems made of electrons and nuclei (molecules and periodic solids) within Density Functional Theory (DFT), Many-Body Perturbation Theory (the GW approximation and Bethe-Salpeter equation) and Dynamical Mean Field Theory (DMFT). ABINIT also allows one to optimize the geometry according to the DFT forces and stresses, to perform molecular dynamics simulations using these forces, and to generate dynamical matrices, Born effective charges and dielectric tensors. The present paper aims to describe the new capabilities of ABINIT that have been developed since 2009. It covers both physical and technical developments inside the ABINIT code, as well as developments provided within the ABINIT package. The developments are described with relevant references, input variables, tests and tutorials.
Reverse ray tracing for transformation optics.
Hu, Chia-Yu; Lin, Chun-Hung
2015-06-29
Ray tracing is an important technique for predicting optical system performance. In the field of transformation optics, the Hamiltonian equations of motion for ray tracing are well known. The numerical solutions to the Hamiltonian equations of motion are affected by the complexities of the inhomogeneous and anisotropic indices of the optical device. To our knowledge, no previous work has been conducted on ray tracing for transformation optics with extreme inhomogeneity and anisotropy. In this study, we present the use of 3D reverse ray tracing in transformation optics. The reverse ray tracing is derived from Fermat's principle, based on a sweeping method instead of finding the full solution to ordinary differential equations. The sweeping method is employed to obtain the eikonal function. The wave vectors are then obtained from the gradient of the eikonal function map in the transformed space to acquire the illuminance. Because only the rays at the points of interest have to be traced, reverse ray tracing provides an efficient approach to investigating the illuminance of a system. This approach is useful in any form of transformation optics where the material property tensor is a symmetric positive definite matrix. Three transformation-optics devices with inhomogeneous and anisotropic indices are analyzed; the ray trajectories and illuminances in these demonstration cases are successfully solved by the proposed reverse ray tracing method.
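The sweeping idea the abstract relies on, solving for an eikonal (travel-time) map instead of integrating ray ODEs, can be sketched in its simplest isotropic form. The code below applies Zhao-style fast-sweeping updates to |grad T| = s on a 2-D grid with a uniform index s = 1 and a single point source; the grid size and the uniform medium are illustrative simplifications, far from the anisotropic tensor media the paper handles.

```python
import math

N, h, s = 21, 1.0, 1.0
INF = float("inf")
T = [[INF] * N for _ in range(N)]
T[10][10] = 0.0                      # point source in the middle

def local_update(a, b, f):
    """Upwind eikonal update at one node from its best x/y neighbours a, b."""
    if abs(a - b) >= f:
        return min(a, b) + f
    return (a + b + math.sqrt(2 * f * f - (a - b) ** 2)) / 2

for _ in range(4):                   # a few passes suffice on this small grid
    for di in (1, -1):               # 4 sweep orderings per pass
        for dj in (1, -1):
            rows = range(N) if di > 0 else range(N - 1, -1, -1)
            for i in rows:
                cols = range(N) if dj > 0 else range(N - 1, -1, -1)
                for j in cols:
                    if T[i][j] == 0.0:
                        continue     # keep the source fixed
                    a = min(T[i - 1][j] if i > 0 else INF,
                            T[i + 1][j] if i < N - 1 else INF)
                    b = min(T[i][j - 1] if j > 0 else INF,
                            T[i][j + 1] if j < N - 1 else INF)
                    T[i][j] = min(T[i][j], local_update(a, b, s * h))

# Along a grid axis the first-order scheme is exact; on the diagonal it
# overestimates the Euclidean distance somewhat at this coarse resolution.
print(T[10][20], T[20][20])
```

Once the eikonal map T is known, wave vectors follow from its gradient, which is the step the paper uses to recover illuminance without tracing every ray.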
Parallel Ray Tracing Using the Message Passing Interface
2007-09-01
efficiency of 97.9% and a normalized ray-tracing rate of 6.95 × 10^6 rays · surfaces/(s · processor) in a system with 22 planar surfaces, two paraboloid reflectors, and one hyperboloid refractor. The need for load-balancing software was obviated by the use of a ... specified for each type of optical surface (planar, spherical, paraboloid, hyperboloid, aspheric) and whether it applies for reflection or refraction. The
An Integrated Software Package to Enable Predictive Simulation Capabilities
Chen, Yousu; Fitzhenry, Erin B.; Jin, Shuangshuang; Palmer, Bruce J.; Sharma, Poorva; Huang, Zhenyu
2016-08-11
The power grid is increasing in complexity due to the deployment of smart grid technologies. Such technologies vastly increase the size and complexity of power grid systems for simulation and modeling. This increasing complexity necessitates not only the use of high-performance computing (HPC) techniques, but also a smooth, well-integrated interplay between HPC applications. This paper presents a new integrated software package that combines HPC applications and a web-based visualization tool on top of a middleware framework that supports data communication between the applications. Case studies with a large power system demonstrate the predictive capability brought by the integrated software package, as well as the better situational awareness provided by the web-based visualization tool in live mode. Test results validate the effectiveness and usability of the integrated software package.
GPS Software Packages Deliver Positioning Solutions
2010-01-01
To determine a spacecraft's position, the Jet Propulsion Laboratory (JPL) developed an innovative software program called the GPS (global positioning system)-Inferred Positioning System and Orbit Analysis Simulation Software, abbreviated as GIPSY-OASIS, and also developed Real-Time GIPSY (RTG) for certain time-critical applications. First featured in Spinoff 1999, JPL has since released hundreds of licenses for GIPSY and RTG, including to Longmont, Colorado-based DigitalGlobe. Using the technology, DigitalGlobe produces satellite imagery with highly precise latitude and longitude coordinates and then supplies it for uses within defense and intelligence, civil agencies, mapping and analysis, environmental monitoring, oil and gas exploration, infrastructure management, Internet portals, and navigation technology.
Integrated software packages in the physical laboratory
Bok, J.; Barvík, I.; Praus, P.; Heřman, P.; Čermáková, D.
1990-11-01
The automation of a UV-VIS spectrometer and a single-photon counting apparatus by an IBM-AT is described. Software needed for computer control, data acquisition and processing was developed in the ASYST environment. This enabled us to use its very good graphics, its support of I/O cards, and its other excellent properties. We also show ways to overcome some minor shortcomings using multilanguage programming.
Development of CAD software package of intellectualized casting technology
HOU Hua; CHENG Jun; XU Hong
2005-01-01
Based on numerical simulation of solidification, a computer-aided design (CAD) software package for casting technique was developed to design the riser system intelligently. The software can calculate the sizes and locations of the isolated melt regions. According to the liquid shrinkage of each isolated melt and the standard riser parameters in the database, the riser's location and size can be identified intelligently once the riser's shape is selected. 3-D software and simulation analysis with CAST/computer-aided engineering (CAE) software show that the design of the riser and running system is feasible.
Comparison of four software packages applied to a scattering problem
Albertsen, Niels Christian; Chesneaux, Jean-Marie; Christiansen, Søren
1999-01-01
We investigate characteristic features of four different software packages by applying them to the numerical solution of a non-trivial physical problem in computer simulation, viz., scattering of waves from a sinusoidal boundary. The numerical method used is based on boundary collocation...
The Guelph PIXE software package IV
Campbell, J. L.; Boyd, N. I.; Grassi, N.; Bonnick, P.; Maxwell, J. A.
2010-10-01
Following the introduction of GUPIXWIN in 2005, a number of upgrades have been made in the interests of extending the applicability of the program. Extension of the proton upper energy limit to 5 MeV facilitates the simultaneous use of PIXE with other ion beam analysis techniques. Also, the increased penetration depth enables the complete PIXE analysis of paintings. A second database change is effected in which recently recommended values of L-subshell fluorescence and Coster-Kronig yields are adopted. A Monte Carlo code has been incorporated in the GUPIX package to provide detector efficiency values that are more accurate than those of the previous approximate analytical formula. Silicon escape peak modeling is extended to the back face of silicon drift detectors. An improved description of the attenuation in dura-coated beryllium detector windows is devised. Film thickness determination is enhanced. A new batch mode facility is designed to handle two-detector PIXE, with one detector measuring major elements and the other simultaneously measuring trace elements.
GEMBASSY: an EMBOSS associated software package for comprehensive genome analyses
Itaya, Hidetoshi; Oshita, Kazuki; Arakawa, Kazuharu; Tomita, Masaru
2013-01-01
The popular European Molecular Biology Open Software Suite (EMBOSS) currently contains over 400 tools used in various bioinformatics research, equipped with sophisticated development frameworks for interoperability and tool discoverability as well as rich documentation and various user interfaces. In order to further strengthen EMBOSS in the fields of genomics, we here present a novel EMBOSS associated software (EMBASSY) package named GEMBASSY, which adds more than 50 analysis tools from the G-language Genome Analysis Environment and its REST and SOAP web services...
A Simple Interactive Software Package for Plotting, Animating, and Calculating
Engelhardt, Larry
2012-01-01
We introduce a new open source (free) software package that provides a simple, highly interactive interface for carrying out certain mathematical tasks that are commonly encountered in physics. These tasks include plotting and animating functions, solving systems of coupled algebraic equations, and basic calculus (differentiating and integrating…
SPECTRW: A software package for nuclear and atomic spectroscopy
Kalfas, C.A., E-mail: kalfas@inp.demokritos.gr [National Centre for Scientific Research Demokritos, Institute of Nuclear & Particle Physics, 15310 Agia Paraskevi, Attiki (Greece); Axiotis, M. [National Centre for Scientific Research Demokritos, Institute of Nuclear & Particle Physics, 15310 Agia Paraskevi, Attiki (Greece); Tsabaris, C. [Hellenic Centre for Marine Research, Institute of Oceanography, 46.7 Km Athens-Sounio Ave, P.O. Box 712, Anavyssos 19013 (Greece)
2016-09-11
A software package for use in nuclear and atomic spectroscopy is presented. Apart from analyzing γ and X-ray spectra, it offers many additional features, such as deconvolution of multiple photopeaks, sample analysis and activity determination, detection-system evaluation, and an embedded code for spectrum simulation.
Integrated software package for laser diodes characterization
Sporea, Dan G.; Sporea, Radu A.
2003-10-01
The characteristics of laser diodes (wavelength of the emitted radiation, output optical power, embedded photodiode photocurrent, threshold current, serial resistance, external quantum efficiency) are strongly influenced by their driving conditions (forward current, case temperature). In order to handle such a complex investigation in an efficient and objective manner, the operation of several instruments (a laser diode driver, a temperature controller, a wavelength meter, a power meter, and a laser beam analyzer) is synchronously controlled by a PC through serial and GPIB communication. For each instrument, drivers were designed using the industry-standard graphical programming environment LabVIEW from National Instruments. All the developed virtual instruments operate under the supervision of a managing virtual instrument, which sets the driving parameters for each unit under test. The manager virtual instrument scans as appropriate the driving current and case temperature values for the selected laser diode. The software enables data saving in Excel-compatible files. In this way, sets of curves can be produced according to the needs of the testing cycle.
The Alba ray tracing code: ART
Nicolas, Josep; Barla, Alessandro; Juanhuix, Jordi
2013-09-01
The Alba ray tracing code (ART) is a suite of Matlab functions and tools for the ray tracing simulation of x-ray beamlines. The code is structured in different layers, which allow its use as part of optimization routines as well as easy control from a graphical user interface. Additional tools for slope error handling and for grating efficiency calculations are also included. Generic characteristics of ART include the accumulation of rays to improve statistics without memory limitations, while still providing normalized values of flux and resolution in physically meaningful units.
IONORT: IONOsphere Ray-Tracing - Ray-tracing program in ionospheric magnetoplasma
Bianchi, Cesidio; Azzarone, Adriano
2010-01-01
The application package "IONORT" for ray-tracing calculation runs under the Windows operating system. Its user interface is implemented in MATLAB: the program launches an executable, written in Fortran, that integrates the system of differential equations and imports the output into MATLAB, which generates graphics and other information about the ray. This work is inspired mainly by the program of Jones and Stephenson, widespread in the scientific community interested in radio propagation via the ionosphere. The original program was written in FORTRAN 77 for a CDC-3800 mainframe. The code itself, as well as being very elegant, is highly efficient, and provides the basis for many programs now in use mainly in the Coordinate Registration (CR) of Over The Horizon (OTH) radar. The input and output of this program require devices no longer in use for several decades, and there are no compilers that accept instructions written for that type of mainframe. For ...
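The core numerical task described here, integrating a system of ray differential equations step by step, can be sketched in a simplified 2-D isotropic form. For an isotropic medium the ray obeys d/ds (n dr/ds) = grad n; the sketch below integrates this with small explicit Euler steps for an invented linear index profile n(y) = 1 + 0.1y. This is purely illustrative: IONORT integrates the full Jones-Stephenson equations in an ionospheric magnetoplasma, which this toy does not attempt.

```python
import math

def n(y):
    """Hypothetical refractive index, increasing with height."""
    return 1.0 + 0.1 * y

def dn_dy(y):
    return 0.1

x, y = 0.0, 0.0
vx, vy = n(y), 0.0          # v = n * (unit tangent); launched horizontally
ds = 0.01
for _ in range(1000):       # total path length s = 10
    x += ds * vx / n(y)
    y += ds * vy / n(y)
    vy += ds * dn_dy(y)     # grad n has no x-component, so vx stays constant

# The ray bends toward increasing n; the constancy of vx is Snell's law
# (n * cos(theta) invariant) for a stratified medium, and |v| tracks n(y).
print(round(x, 2), round(y, 2))
```

A production code replaces the Euler step with a higher-order integrator and adds the magnetoionic terms, but the structure (state vector advanced through an index model) is the same.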
COMPUTATIONAL MODELLING OF BUFFETING EFFECTS USING OPENFOAM SOFTWARE PACKAGE
V. T. Kalugin
2015-01-01
In this paper, preliminary results of computational modeling of an aircraft with the airbrake deployed are presented. The calculations were performed with the OpenFOAM software package. The results outlined are part of a research project to optimise aircraft performance using a perforated airbrake. Within this stage of the project, the OpenFOAM package with a hybrid RANS-LES approach was tested on a given configuration of the aircraft and airbrake, and then compared with the test data. In the worst case, the amplitude of the peak force acting on the tail fin can be up to 6 times higher than the average value without the airbrake deployed. To reduce the unsteady loads acting on the tail fin, perforation of the airbrake was proposed.
The khmer software package: enabling efficient nucleotide sequence analysis.
Crusoe, Michael R; Alameldin, Hussien F; Awad, Sherine; Boucher, Elmar; Caldwell, Adam; Cartwright, Reed; Charbonneau, Amanda; Constantinides, Bede; Edvenson, Greg; Fay, Scott; Fenton, Jacob; Fenzl, Thomas; Fish, Jordan; Garcia-Gutierrez, Leonor; Garland, Phillip; Gluck, Jonathan; González, Iván; Guermond, Sarah; Guo, Jiarong; Gupta, Aditi; Herr, Joshua R; Howe, Adina; Hyer, Alex; Härpfer, Andreas; Irber, Luiz; Kidd, Rhys; Lin, David; Lippi, Justin; Mansour, Tamer; McA'Nulty, Pamela; McDonald, Eric; Mizzi, Jessica; Murray, Kevin D; Nahum, Joshua R; Nanlohy, Kaben; Nederbragt, Alexander Johan; Ortiz-Zuazaga, Humberto; Ory, Jeramia; Pell, Jason; Pepe-Ranney, Charles; Russ, Zachary N; Schwarz, Erich; Scott, Camille; Seaman, Josiah; Sievert, Scott; Simpson, Jared; Skennerton, Connor T; Spencer, James; Srinivasan, Ramakrishnan; Standage, Daniel; Stapleton, James A; Steinman, Susan R; Stein, Joe; Taylor, Benjamin; Trimble, Will; Wiencko, Heather L; Wright, Michael; Wyss, Brian; Zhang, Qingpeng; Zyme, En; Brown, C Titus
2015-01-01
The khmer package is a freely available software library for working efficiently with fixed length DNA words, or k-mers. khmer provides implementations of a probabilistic k-mer counting data structure, a compressible De Bruijn graph representation, De Bruijn graph partitioning, and digital normalization. khmer is implemented in C++ and Python, and is freely available under the BSD license at https://github.com/dib-lab/khmer/.
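The "probabilistic k-mer counting data structure" mentioned above can be illustrated with a tiny Count-Min-style sketch: several small hash tables whose minimum cell value upper-bounds the true count. The table sizes, hashing, and sequence below are invented for illustration and are not khmer's actual API or parameters.

```python
K = 4
SIZES = [101, 103, 107]            # small pairwise-coprime table sizes
tables = [[0] * size for size in SIZES]

def add(kmer):
    """Increment one cell per table; collisions may share cells."""
    for seed, (t, size) in enumerate(zip(tables, SIZES)):
        t[hash((seed, kmer)) % size] += 1

def count(kmer):
    """Minimum over tables: never undercounts, may slightly overcount."""
    return min(t[hash((seed, kmer)) % size]
               for seed, (t, size) in enumerate(zip(tables, SIZES)))

seq = "ACGTACGTGGTT"
for i in range(len(seq) - K + 1):
    add(seq[i:i + K])

# "ACGT" occurs twice in seq, "CGTA" once; the sketch reports at least that.
print(count("ACGT"), count("CGTA"))
```

The memory cost is fixed by the table sizes, not by the number of distinct k-mers, which is what makes such structures attractive for large sequencing datasets.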
Backward ray tracing for ultrasonic imaging
Breeuwer, R.
1990-01-01
Focused ultrasonic beams frequently pass one or more media interfaces, strongly affecting the ultrasonic beamshape and focusing. A computer program, based on backward ray tracing was developed to compute the shape of a corrected focusing mirror. This shape is verified with another program; then the
Software Package Completed for Alloy Design at the Atomic Level
Bozzolo, Guillermo H.; Noebe, Ronald D.; Abel, Phillip B.; Good, Brian S.
2001-01-01
As a result of a multidisciplinary effort involving solid-state physics, quantum mechanics, and materials and surface science, the first version of a software package dedicated to the atomistic analysis of multicomponent systems was recently completed. Based on the BFS (Bozzolo, Ferrante, and Smith) method for the calculation of alloy and surface energetics, this package includes modules devoted to the analysis of many essential features that characterize any given alloy or surface system, including (1) surface structure analysis, (2) surface segregation, (3) surface alloying, (4) bulk crystalline material properties and atomic defect structures, and (5) thermal processes that allow us to perform phase diagram calculations. All the modules of this Alloy Design Workbench 1.0 (ADW 1.0) are designed to run in PC and workstation environments, and their operation and performance are substantially linked to the needs of the user and the specific application.
An interactive software package for validating satellite data
Muraleedharan, P.M.; Pankajakshan, T.
of this relationship. This would enable one to understand the significance of such changes in view of noticeable environmental perturbations. This is essential for any validation exercise, as the satellite often retrieves the skin temperature and is quite sensitive...
IT LONG TAIL STRATEGY FOR SOFTWARE PACKAGE COMPANY
Andreas Winata
2010-10-01
The Long Tail strategy is a business strategy which holds that the total revenue from sales of non-popular products may exceed the total revenue from popular products. This can happen because there are generally only a small number of popular products, which are in great demand, while there are many non-popular products, each sold in small amounts. This research aims to better understand the role of IT behind the success of the Long Tail strategy. The results show the stages of developing an IT strategy: identification, analysis, strategy determination, and implementation. The results of this study will help software developers plan an IT strategy by implementing an accurate Long Tail strategy. Keywords: Long Tail, IT Strategy, Services, Software Package
SEDA: A software package for the Statistical Earthquake Data Analysis.
Lombardi, A M
2017-03-14
In this paper, the first version of the software SEDA (SEDAv1.0), designed to help seismologists statistically analyze earthquake data, is presented. The package consists of a user-friendly Matlab-based interface, which allows the user to easily interact with the application, and a computational core of Fortran codes, to guarantee maximum speed. The primary factor driving the development of SEDA is to guarantee research reproducibility, which is a growing movement among scientists and is highly recommended by the most important scientific journals. SEDAv1.0 is mainly devoted to producing accurate and fast outputs; less care has been taken over graphic appeal, which will be improved in the future. The main part of SEDAv1.0 is devoted to ETAS modeling. SEDAv1.0 contains a set of consistent tools on ETAS, allowing the estimation of parameters, the testing of the model on data, the simulation of catalogs, the identification of sequences, and forecast calculation. The peculiarities of the routines inside SEDAv1.0 are discussed in this paper. More specific details on the software are presented in the manual accompanying the program package.
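The ETAS modelling at the heart of the package rests on the conditional intensity lambda(t) = mu + sum over past events of K * exp(alpha*(m_i - m0)) / (t - t_i + c)^p. The sketch below evaluates this standard temporal form for a toy catalogue; the parameter values and events are invented and are not SEDA defaults.

```python
import math

# Illustrative ETAS parameters: background rate, productivity, magnitude
# sensitivity, Omori offset and exponent, and reference magnitude.
MU, K, ALPHA, C, P, M0 = 0.2, 0.05, 1.5, 0.01, 1.1, 3.0

catalog = [(0.0, 5.5), (1.2, 4.1), (3.7, 4.8)]   # (time in days, magnitude)

def etas_intensity(t, events):
    """Conditional intensity: background plus Omori-decaying aftershock terms,
    each scaled exponentially by the triggering event's magnitude."""
    rate = MU
    for t_i, m_i in events:
        if t_i < t:
            rate += K * math.exp(ALPHA * (m_i - M0)) / (t - t_i + C) ** P
    return rate

# Intensity spikes just after events and relaxes back toward the background.
print(round(etas_intensity(4.0, catalog), 3), round(etas_intensity(10.0, catalog), 3))
```

Parameter estimation in a package like SEDA then maximizes the point-process log-likelihood built from this intensity over an observed catalogue.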
SEDA: A software package for the Statistical Earthquake Data Analysis
Lombardi, A. M.
2017-03-01
In this paper, the first version of the software SEDA (SEDAv1.0), designed to help seismologists statistically analyze earthquake data, is presented. The package consists of a user-friendly Matlab-based interface, which allows the user to easily interact with the application, and a computational core of Fortran codes, to guarantee maximum speed. The primary factor driving the development of SEDA is to guarantee research reproducibility, which is a growing movement among scientists and is highly recommended by the most important scientific journals. SEDAv1.0 is mainly devoted to producing accurate and fast outputs; less care has been taken over graphic appeal, which will be improved in the future. The main part of SEDAv1.0 is devoted to ETAS modeling. SEDAv1.0 contains a set of consistent tools on ETAS, allowing the estimation of parameters, the testing of the model on data, the simulation of catalogs, the identification of sequences, and forecast calculation. The peculiarities of the routines inside SEDAv1.0 are discussed in this paper. More specific details on the software are presented in the manual accompanying the program package.
GEMBASSY: an EMBOSS associated software package for comprehensive genome analyses.
Itaya, Hidetoshi; Oshita, Kazuki; Arakawa, Kazuharu; Tomita, Masaru
2013-08-29
The popular European Molecular Biology Open Software Suite (EMBOSS) currently contains over 400 tools used in various fields of bioinformatics research, equipped with sophisticated development frameworks for interoperability and tool discoverability, as well as rich documentation and various user interfaces. In order to further strengthen EMBOSS in the field of genomics, we here present a novel EMBOSS associated software (EMBASSY) package named GEMBASSY, which adds more than 50 analysis tools from the G-language Genome Analysis Environment and its Representational State Transfer (REST) and SOAP web services. GEMBASSY essentially consists of wrapper programs for the G-language REST/SOAP web services, providing intuitive and easy access to various annotations within complete genome flatfiles, as well as tools for analyzing nucleic composition, calculating codon usage, and visualizing genomic information. For example, analysis methods such as calculating distance between sequences by genomic signatures and predicting gene expression levels from codon usage bias are effective in the interpretation of meta-genomic and meta-transcriptomic data. GEMBASSY tools can be used seamlessly with other EMBOSS tools and UNIX command line tools. The source code written in C is available from GitHub (https://github.com/celery-kotone/GEMBASSY/) and the distribution package is freely available from the GEMBASSY web site (http://www.g-language.org/gembassy/).
dMODELS: A software package for modeling volcanic deformation
Battaglia, Maurizio
2017-04-01
dMODELS is a software package that includes the most common source models used to interpret deformation measurements near active volcanic centers. The emphasis is on estimating the parameters of analytical models of deformation by inverting data from the Global Positioning System (GPS), Interferometric Synthetic Aperture Radar (InSAR), tiltmeters and strainmeters. Source models include: (a) pressurized spherical, ellipsoidal and sill-like magma chambers in an elastic, homogeneous, flat half-space; (b) pressurized spherical magma chambers with topography corrections; and (c) the solutions for a dislocation (fracture) in an elastic, homogeneous, flat half-space. All of the equations have been extended to include deformation and strain within the Earth's crust (as opposed to only at the Earth's surface) and verified against finite element models. Although actual volcanic sources are not embedded cavities of simple shape, we assume that these models may reproduce the stress field created by the actual magma intrusion or hydrothermal fluid injection. The dMODELS software employs a nonlinear inversion algorithm to determine the best-fit parameters for the deformation source by searching for the minimum of the cost function χ²ν (chi-square per degree of freedom). The nonlinear inversion algorithm is a combination of local optimization (interior-point method) and random search. This approach is more efficient for hyper-parameter optimization than trials on a grid. The software has been developed using MATLAB, but compiled versions that can be run using the free MATLAB Compiler Runtime (MCR) module are available for Windows 64-bit operating systems. The MATLAB scripts and compiled files are open source and intended for teaching and research. The software package includes both functions for forward modeling and scripts for data inversion. A software demonstration will be available during the meeting. You are welcome to contact the author at mbattaglia@usgs.gov for
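The pressurized spherical chamber in case (a) is commonly approximated by the Mogi point source. The sketch below illustrates the shape of that forward model only; the lumped source-strength constant C (which folds together pressure change, source radius, shear modulus and Poisson's ratio) and all numbers are illustrative, not dMODELS' actual interface.

```python
import numpy as np

def mogi_displacement(x, y, depth, C):
    """Surface displacement (ur, uz) at horizontal offsets x, y from a
    point pressure source at the given depth, for source strength C."""
    r = np.hypot(x, y)
    R3 = (r**2 + depth**2) ** 1.5
    ur = C * r / R3      # radial (horizontal) displacement
    uz = C * depth / R3  # vertical displacement (uplift)
    return ur, uz

# Uplift peaks directly above the source and decays with offset.
ur0, uz0 = mogi_displacement(0.0, 0.0, 1000.0, C=1e9)
ur5, uz5 = mogi_displacement(5000.0, 0.0, 1000.0, C=1e9)
```

Inverting such a model amounts to searching over (x, y, depth, C) for the minimum misfit against observed displacements, which is where the local-optimization-plus-random-search strategy described above comes in.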
Implementation of Refined Ray Tracing inside a Space Module
Balamati Choudhury
2012-08-01
Modern space modules are susceptible to EM radiation from both external and internal sources within the space module. Since the EM waves for various operations are frequently in the high-frequency domain, asymptotic ray-theoretic methods are often the most optimal choice for deterministic EM field analysis. In this work, surface modeling of a typical manned space module is done by hybridizing a finite segment of a right circular cylinder and a general paraboloid of revolution (GPOR) frustum. A transmitting source is placed inside the space module and test rays are launched from the transmitter. The rays are allowed to propagate inside the cavity. Unlike available ray-tracing packages, which use numerical search methods, a quasi-analytical ray-propagation model is developed to obtain the ray-path details inside the cavity, involving ray-launching, ray-bunching, and an adaptive cube for ray-reception.
OSPRay - A CPU Ray Tracing Framework for Scientific Visualization.
Wald, I; Johnson, G P; Amstutz, J; Brownlee, C; Knoll, A; Jeffers, J; Gunther, J; Navratil, P
2017-01-01
Scientific data is continually increasing in complexity, variety and size, making efficient visualization and specifically rendering an ongoing challenge. Traditional rasterization-based visualization approaches encounter performance and quality limitations, particularly in HPC environments without dedicated rendering hardware. In this paper, we present OSPRay, a turn-key CPU ray tracing framework oriented towards production-use scientific visualization which can utilize varying SIMD widths and multiple device backends found across diverse HPC resources. This framework provides a high-quality, efficient CPU-based solution for typical visualization workloads, which has already been integrated into several prevalent visualization packages. We show that this system delivers the performance, high-level API simplicity, and modular device support needed to provide a compelling new rendering framework for implementing efficient scientific visualization workflows.
Dynamic ray tracing and its application in triangulated media
Rueger, A.
1993-07-01
Hale and Cohen (1991) developed software to generate two-dimensional computer models of complex geology. Their method uses a triangulation technique designed to support efficient and accurate computation of seismic wavefields for models of the earth's interior. Subsequently, Hale (1991) used this triangulation approach to perform dynamic ray tracing and create synthetic seismograms based on the method of Gaussian beams. Here, I extend this methodology to allow an increased variety of ray-theoretical experiments. Specifically, the developed program GBmod (Gaussian Beam MODeling) can produce arbitrary multiple sequences and incorporate attenuation and density variations. In addition, I have added an option to perform Fresnel-volume ray tracing (Cerveny and Soares, 1992). Corrections for reflection and transmission losses at interfaces, and for two-and-one-half-dimensional (2.5-D) spreading are included. However, despite these enhancements, difficulties remain in attempts to compute accurate synthetic seismograms if strong lateral velocity inhomogeneities are present. Here, these problems are discussed and, to a certain extent, reduced. I provide example computations of high-frequency seismograms based on the method of Gaussian beams to exhibit the advantages and disadvantages of the proposed modeling method and illustrate new features for both surface and vertical seismic profiling (VSP) acquisition geometries.
RAY-TRACING IN THE IONOSPHERE
Azzarone, A.; Bianchi, C.; Settimi, A.
2010-01-01
The "IONORT" application package for ray-tracing computation can be used on the Windows operating system. Its graphical user interface is implemented in MATLAB. The program launches an executable, written in Fortran, that integrates the system of differential equations, and imports the output back into MATLAB, which generates plots and other information about the ray. To complete this introduction, it should be noted that...
Software Packages to Support Electrical Engineering Virtual Lab
Manuel Travassos Valdez
2012-03-01
The use of Virtual Reality Systems (VRS) as a learning aid encourages the creation of tools that allow users/students to simulate educational environments on a computer. This article presents a way of building a VRS with software packages to support Electrical Engineering virtual laboratories, to be used in the near future in teaching the curriculum unit of Circuit Theory. The steps required for the construction of such a project are presented in this paper. The simulation is still under construction and will use a three-dimensional virtual electrical-measurement laboratory environment, which will allow users/students to experiment with and test the modeled equipment. Therefore, there are still no links available for further examination. The result may demonstrate the future potential of Virtual Reality Systems as an efficient and cost-effective learning tool.
scoringRules - A software package for probabilistic model evaluation
Lerch, Sebastian; Jordan, Alexander; Krüger, Fabian
2016-04-01
Models in the geosciences are generally surrounded by uncertainty, and being able to quantify this uncertainty is key to good decision making. Accordingly, probabilistic forecasts in the form of predictive distributions have become popular over the last decades. With the proliferation of probabilistic models arises the need for decision-theoretically principled tools to evaluate the appropriateness of models and forecasts in a generalized way. Various scoring rules have been developed over the past decades to address this demand. Proper scoring rules are functions S(F,y) which evaluate the accuracy of a forecast distribution F, given that an outcome y was observed. As such, they allow alternative models to be compared, a crucial ability given the variety of theories, data sources and statistical specifications available in many situations. This poster presents the software package scoringRules for the statistical programming language R, which contains functions to compute popular scoring rules, such as the continuous ranked probability score, for a variety of distributions F that come up in applied work. Two main classes are parametric distributions, like normal, t, or gamma distributions, and distributions that are not known analytically but are indirectly described through a sample of simulation draws. For example, Bayesian forecasts produced via Markov Chain Monte Carlo take this form. The scoringRules package thereby provides a framework for generalized model evaluation that includes both Bayesian and classical parametric models. The scoringRules package aims to be a convenient dictionary-like reference for computing scoring rules. We offer state-of-the-art implementations of several known (but not routinely applied) formulas, and implement closed-form expressions that were previously unavailable. Whenever more than one implementation variant exists, we offer statistically principled default choices.
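For a normal predictive distribution, the continuous ranked probability score mentioned above has a well-known closed form, CRPS(N(μ,σ²), y) = σ[z(2Φ(z) − 1) + 2φ(z) − 1/√π] with z = (y − μ)/σ. The Python sketch below is for illustration only (scoringRules itself is an R package; to our understanding its corresponding function is crps_norm).

```python
import math

def crps_normal(mu, sigma, y):
    """Closed-form CRPS of a N(mu, sigma^2) forecast against outcome y."""
    z = (y - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)       # phi(z)
    cdf = 0.5 * (1 + math.erf(z / math.sqrt(2)))                # Phi(z)
    return sigma * (z * (2 * cdf - 1) + 2 * pdf - 1 / math.sqrt(math.pi))
```

Note that even a perfectly centred forecast incurs a positive score (about 0.234σ), reflecting that the CRPS penalizes predictive spread as well as bias.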
Evaluation and selection of open-source EMR software packages based on integrated AHP and TOPSIS.
Zaidan, A A; Zaidan, B B; Al-Haiqi, Ahmed; Kiah, M L M; Hussain, Muzammil; Abdulnabi, Mohamed
2015-02-01
Evaluating and selecting software packages that meet an organization's requirements are difficult aspects of the software engineering process. Selecting the wrong open-source EMR software package can be costly and may adversely affect business processes and the functioning of the organization. This study aims to evaluate and select open-source EMR software packages based on multi-criteria decision-making. A hands-on study was performed in which a set of open-source EMR software packages were implemented locally on separate virtual machines to examine the systems more closely. Several measures were specified as the evaluation basis, and the systems were ranked on a set of metric outcomes using the integrated Analytic Hierarchy Process (AHP) and TOPSIS. The experimental results showed that the GNUmed and OpenEMR software achieved better ranking scores than the other open-source EMR software packages.
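The TOPSIS step of such a ranking can be sketched as follows. The decision matrix (rows = candidate packages, columns = criteria), weights, and criterion directions below are entirely made up for illustration, not the study's actual data.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Return closeness coefficients in [0, 1]; higher ranks better.
    benefit[j] is True if criterion j is better when larger."""
    m = np.asarray(matrix, dtype=float)
    norm = m / np.sqrt((m ** 2).sum(axis=0))         # vector normalisation
    v = norm * np.asarray(weights, dtype=float)      # weighted matrix
    benefit = np.asarray(benefit)
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.sqrt(((v - ideal) ** 2).sum(axis=1))  # distance to ideal
    d_neg = np.sqrt(((v - anti) ** 2).sum(axis=1))   # distance to anti-ideal
    return d_neg / (d_pos + d_neg)

scores = topsis([[7, 9, 3], [8, 7, 4], [6, 8, 2]],
                weights=[0.5, 0.3, 0.2],
                benefit=[True, True, False])  # third criterion is a cost
```

In the integrated approach, the weights fed into TOPSIS would come from AHP pairwise comparisons rather than being set directly as here.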
gr-MRI: A software package for magnetic resonance imaging using software defined radios
Hasselwander, Christopher J.; Cao, Zhipeng; Grissom, William A.
2016-09-01
The goal of this work is to develop software that enables the rapid implementation of custom MRI spectrometers using commercially-available software defined radios (SDRs). The developed gr-MRI software package comprises a set of Python scripts, flowgraphs, and signal generation and recording blocks for GNU Radio, an open-source SDR software package that is widely used in communications research. gr-MRI implements basic event sequencing functionality, and tools for system calibrations, multi-radio synchronization, and MR signal processing and image reconstruction. It includes four pulse sequences: a single-pulse sequence to record free induction signals, a gradient-recalled echo imaging sequence, a spin echo imaging sequence, and an inversion recovery spin echo imaging sequence. The sequences were used to perform phantom imaging scans with a 0.5 Tesla tabletop MRI scanner and two commercially-available SDRs. One SDR was used for RF excitation and reception, and the other for gradient pulse generation. The total SDR hardware cost was approximately $2,000. The frequency of radio desynchronization events and the frequency with which the software recovered from those events were also measured, and the SDR's ability to generate frequency-swept RF waveforms was validated and compared to the scanner's commercial spectrometer. The spin echo images geometrically matched those acquired using the commercial spectrometer, with no unexpected distortions. Desynchronization events were more likely to occur at the very beginning of an imaging scan, but were nearly eliminated if the user invoked the sequence for a short period before beginning data recording. The SDR produced a 500 kHz bandwidth frequency-swept pulse with high fidelity, while the commercial spectrometer produced a waveform with large frequency spike errors. In conclusion, the developed gr-MRI software can be used to develop high-fidelity, low-cost custom MRI spectrometers using commercially-available SDRs.
Development of an engine system simulation software package - ESIM
Erlandsson, Olof
2000-10-01
A software package, ESIM, is developed for simulating internal combustion engine systems, including models for the engine, manifolds, turbocharger, charge-air cooler (intercooler) and inlet air heater. This study focuses on the thermodynamic treatment and methods used in the models. It also includes some examples of system simulations made with these models for validation purposes. The engine model can be classified as a zero-dimensional, single-zone model. It includes calculation of the valve flow process, models for heat release and models for in-cylinder, exhaust port and manifold heat transfer. Models are developed for handling turbocharger performance and charge-air cooler characteristics. The main purpose of the project related to this work is to use the ESIM software to study the heat balance and performance of homogeneous charge compression ignition (HCCI) engine systems. A short description of the HCCI engine is therefore included, pointing out the difficulties, or challenges, regarding the HCCI engine from a system perspective. However, the relations given here, and the code itself, are quite general, making it possible to use these models to simulate spark-ignited as well as direct-injected engines.
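Zero-dimensional single-zone models commonly represent heat release with a Wiebe function; whether ESIM uses this exact form is an assumption, and the parameters below are the textbook defaults, not ESIM's.

```python
import math

def wiebe_burned_fraction(theta, theta0, dtheta, a=6.908, m=2.0):
    """Mass fraction burned at crank angle theta (degrees), for combustion
    starting at theta0 and lasting dtheta degrees. a = 6.908 gives 99.9%
    burned mass at the end of combustion; m shapes the burn profile."""
    if theta <= theta0:
        return 0.0
    x = min((theta - theta0) / dtheta, 1.0)
    return 1.0 - math.exp(-a * x ** (m + 1))

# Burned fraction rises smoothly from 0 to ~0.999 over the burn duration.
start = wiebe_burned_fraction(-10.0, theta0=-10.0, dtheta=60.0)
mid = wiebe_burned_fraction(20.0, theta0=-10.0, dtheta=60.0)
end = wiebe_burned_fraction(50.0, theta0=-10.0, dtheta=60.0)
```

The derivative of this curve with respect to crank angle, scaled by the fuel energy, is the heat-release rate that feeds the single-zone energy balance.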
V. Krzhizhanovskaya; S. Ryaboshuk
2009-01-01
This paper presents methodological materials, interactive textbooks and software packages developed and extensively used for the education of specialists in materials science. These virtual laboratories for education and research are equipped with tutorials and a software environment for modeling complex
Interactive Ray Tracing for Virtual TV Studio Applications
Philipp Slusallek
2005-12-01
In recent years, the well-known ray tracing algorithm gained new popularity with the introduction of interactive ray tracing methods. The high modularity and the ability to produce highly realistic images make ray tracing an attractive alternative to raster graphics hardware. Interactive ray tracing has also proved its potential in the field of Mixed Reality rendering and provides novel methods for seamless integration of real and virtual content. Actor insertion methods, a subdomain of Mixed Reality closely related to virtual television studio techniques, can use ray tracing to achieve high output quality in conjunction with appropriate visual cues, such as shadows and reflections, at interactive frame rates. In this paper, we show how interactive ray tracing techniques can provide new ways of implementing future virtual studio applications.
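The hard shadows mentioned as a visual cue reduce, at each shaded point, to a shadow-ray test: the point is in shadow if the ray towards the light hits an occluder first. A minimal sketch (the scene geometry here is illustrative, not the paper's studio setup):

```python
import math

def ray_hits_sphere(origin, direction, center, radius, max_t):
    """True if the ray origin + t*direction (0 < t < max_t) intersects
    the sphere. direction must be normalised."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return False          # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return 0.0 < t < max_t    # hit must lie between point and light

def in_shadow(point, light, occluder_center, occluder_radius):
    to_light = [l - p for l, p in zip(light, point)]
    dist = math.sqrt(sum(v * v for v in to_light))
    direction = [v / dist for v in to_light]
    return ray_hits_sphere(point, direction, occluder_center,
                           occluder_radius, dist)

# A unit sphere at the origin blocks the light for a point directly below it.
shadowed = in_shadow((0, -5, 0), (0, 5, 0), (0, 0, 0), 1.0)
lit = in_shadow((4, -5, 0), (0, 5, 0), (0, 0, 0), 1.0)
```

Because the same test works whether the occluder is real or virtual scene geometry, ray tracing gives Mixed Reality compositing consistent shadows almost for free.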
New developments in the McStas neutron instrument simulation package
Willendrup, Peter Kjær; Bergbäck Knudsen, Erik; Klinkby, Esben Bryndt
2014-01-01
The McStas neutron ray-tracing software package is a versatile tool for building accurate simulators of neutron scattering instruments at reactors, short- and long-pulsed spallation sources such as the European Spallation Source. McStas is extensively used for design and optimization of instruments...
US Army Radiological Bioassay and Dosimetry: The RBD software package
Eckerman, K. F.; Ward, R. C.; Maddox, L. B.
1993-01-01
The RBD (Radiological Bioassay and Dosimetry) software package was developed for the U. S. Army Material Command, Arlington, Virginia, to demonstrate compliance with the radiation protection guidance 10 CFR Part 20 (ref. 1). Designed to be run interactively on an IBM-compatible personal computer, RBD consists of a data base module to manage bioassay data and a computational module that incorporates algorithms for estimating radionuclide intake from either acute or chronic exposures based on measurement of the worker's rate of excretion of the radionuclide or the retained activity in the body. In estimating the intake, RBD uses a separate file for each radionuclide containing parametric representations of the retention and excretion functions. These files also contain dose-per-unit-intake coefficients used to compute the committed dose equivalent. For a given nuclide, if measurements exist for more than one type of assay, an auxiliary module, REPORT, estimates the intake by applying weights assigned in the nuclide file for each assay. Bioassay data and computed results (estimates of intake and committed dose equivalent) are stored in separate data bases, and the bioassay measurements used to compute a given result can be identified. The REPORT module creates a file containing committed effective dose equivalent for each individual that can be combined with the individual's external exposure.
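The intake estimation described above can be sketched as follows: each measurement M at time t implies an intake M / r(t), where r is the nuclide's retention or excretion function, and multiple assay types are combined with per-assay weights. The exponential r(t), the weights and the numbers below are purely illustrative, not RBD's actual nuclide-file data.

```python
import math

def estimate_intake(measurements, retention, weights):
    """Weighted intake estimate from (t_days, measured_activity) pairs,
    one pair per assay type, using the retention function r(t)."""
    per_assay = [m / retention(t) for t, m in measurements]
    total_w = sum(weights)
    return sum(w * i for w, i in zip(weights, per_assay)) / total_w

r = lambda t: math.exp(-0.1 * t)   # toy single-exponential retention
intake = estimate_intake([(10.0, 3.68), (20.0, 1.35)], r, weights=[0.7, 0.3])
```

Multiplying the estimated intake by the nuclide's dose-per-unit-intake coefficient then yields the committed dose equivalent.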
Closing the loop on improvement: Packaging experience in the Software Engineering Laboratory
Waligora, Sharon R.; Landis, Linda C.; Doland, Jerry T.
1994-01-01
As part of its award-winning software process improvement program, the Software Engineering Laboratory (SEL) has developed an effective method for packaging organizational best practices based on real project experience into useful handbooks and training courses. This paper shares the SEL's experience over the past 12 years creating and updating software process handbooks and training courses. It provides cost models and guidelines for successful experience packaging derived from SEL experience.
O'Dell, B.H.
1995-11-01
A software package was developed to replace the software provided by LakeShore for their model 7300 vibrating sample magnetometer (VSM). Several problems with the original software's functionality caused this group to seek a new software package. The new software utilizes many features that were unsupported in the LakeShore software, including a more functional step mode, a point-averaging mode, vector moment measurements, and calibration for field offset. The developed software interfaces with the VSM through a menu-driven graphical user interface, and bypasses the VSM's on-board processor, leaving control of the VSM up to the software. The source code for this software is readily available to anyone. By having the source, the experimentalist has full control of data acquisition and can add routines specific to their experiment.
Sahraneshin Samani, Fazel; Moore, Jodene K; Khosravani, Pardis; Ebrahimi, Marzieh
2014-08-01
Flow cytometers designed to analyze large particles are enabling new applications in biology. Data analysis is a critical component of the flow cytometry (FCM) process. In this article we compare the features of four free software packages: WinMDI, Cyflogic, Flowing Software, and Cytobank.
Baltser, Jana; Bergbäck Knudsen, Erik; Vickery, Anette
2011-01-01
of X-ray beamline designs for particular user experiments. In this work we used the newly developed McXtrace ray-tracing package and the SRW wave-optics code to simulate the beam propagation of X-ray undulator radiation through such a "transfocator" as implemented at ID- 11 at ESRF. By applying two...
Desired characteristics of a generic 'no frills' software engineering tools package
Rhodes, J.J.
1986-07-29
Increasing numbers of vendors are developing software engineering tools to meet the demands of increasingly complex software systems, higher reliability goals for software products, higher programming labor costs, and management's desire to more closely associate software lifecycle costs with the estimated development schedule. Some vendors have chosen a dedicated workstation approach to achieve high user interactivity through windowing and mousing. Other vendors are using multi-user mainframes with low-cost terminals to economize on the costs of the hardware and the tools software. For all of the potential customers of software tools, the question remains: what are the minimum functional requirements that a software engineering tools package must have in order to be considered useful throughout the entire software lifecycle? This paper describes the desired characteristics of a non-existent but realistic 'no frills' software engineering tools package. 3 refs., 5 figs.
Floating point software package for use on LSI-11 computers at SLAC
Hendra, R.G.
1981-06-01
A floating point software package has been devised to allow full use of the floating point hardware of the LSI-11 and MODEL40 computers. The procedures are written for the most part in the PL-11 language. The package may be run under the RT-11 operating system, or in RAM or EPROM as part of the KERNEL package. The current set of procedures has been run successfully in all three modes.
Development of ray tracing visualization program by Monte Carlo method
Higuchi, Kenji; Otani, Takayuki [Japan Atomic Energy Research Inst., Tokyo (Japan); Hasegawa, Yukihiro
1997-09-01
The ray tracing algorithm is a powerful method for synthesizing three-dimensional computer graphics. In conventional ray tracing algorithms, a view point is used as the starting point of ray tracing, from which the rays are tracked up to the light sources through the center points of pixels on the view screen to calculate the intensities of the pixels. This manner, however, makes it difficult to define the configuration of the light source, as well as to strictly simulate reflections of the rays. To resolve these problems, we have developed a new ray tracing method that traces rays from a light source, not from a view point, using the Monte Carlo method, which is widely applied in nuclear fields. Moreover, we apply variance reduction techniques in the program, with use of the specialized machine (Monte-4) for particle-transport Monte Carlo, so that the computational time could be successfully reduced. (author)
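Launching rays from the light source rather than from the eye can be sketched in a few lines: sample isotropic directions at the source and score the rays where they cross a screen plane. The geometry below is illustrative; the fraction of launched rays that hit the screen estimates its solid angle divided by 4π, which for this configuration is exactly 1/6.

```python
import random, math

random.seed(1)
screen_z = 1.0                 # screen plane at z = 1, source at the origin
half = 1.0                     # screen spans [-1, 1] x [-1, 1]
n_rays, hits = 200_000, 0
for _ in range(n_rays):
    # sample an isotropic direction on the unit sphere
    z = random.uniform(-1.0, 1.0)
    phi = random.uniform(0.0, 2.0 * math.pi)
    r = math.sqrt(1.0 - z * z)
    dx, dy, dz = r * math.cos(phi), r * math.sin(phi), z
    if dz <= 0.0:
        continue               # ray travels away from the screen
    t = screen_z / dz          # parameter where the ray meets the plane
    x, y = dx * t, dy * t
    if abs(x) <= half and abs(y) <= half:
        hits += 1

fraction = hits / n_rays       # ~ solid angle of the screen / (4*pi)
```

The variance reduction techniques mentioned above address exactly the weakness visible here: most sampled rays never reach the region of interest, so naive source-first tracing wastes samples that importance sampling can recover.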
Risk Analysis and Decision-Making Software Package (1997 Version) User Manual
Chung, F.T.H.
1999-02-11
This manual provides instructions for using the U.S. Department of Energy's (DOE) risk analysis and decision making software (1997 version) developed at BDM Petroleum Technologies by BDM-Oklahoma, Inc. for DOE, under contract No. DE-AC22-94PC91008. This software provides petroleum producers with a simple, handy tool for exploration and production risk analysis and decision-making. It collects useful risk analysis tools in one package so that users do not have to use several programs separately. The software is simple to use, but still provides many functions. The 1997 version of the software package includes the following tools: (1) Investment risk (Gambler's ruin) analysis; (2) Monte Carlo simulation; (3) Best fit for distribution functions; (4) Sample and rank correlation; (5) Enhanced oil recovery method screening; and (6) artificial neural network. This software package is subject to change. Suggestions and comments from users are welcome and will be considered for future modifications and enhancements of the software. Please check the opening screen of the software for the current contact information. In the future, more tools will be added to this software package. This manual includes instructions on how to use the software but does not attempt to fully explain the theory and algorithms used to create it.
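The gambler's-ruin analysis listed as tool (1) can be illustrated with a small Monte Carlo simulation: estimate the probability of exhausting an initial stake i before reaching a target N when each round is won with probability p. The numbers below are illustrative, not the package's defaults; for a fair game (p = 0.5) the exact ruin probability is 1 − i/N.

```python
import random

def ruin_probability(i, N, p, trials=50_000, seed=42):
    """Monte Carlo estimate of the probability of ruin (capital hits 0
    before reaching N), starting from capital i with win probability p."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(trials):
        capital = i
        while 0 < capital < N:
            capital += 1 if rng.random() < p else -1
        ruined += capital == 0
    return ruined / trials

est = ruin_probability(i=3, N=10, p=0.5)   # exact answer is 0.7
```

In the exploration context, "rounds" correspond to successive wells and p to the chance of a commercial discovery, which is why this simple model is a staple of drilling-program risk analysis.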
QuantWorm: a comprehensive software package for Caenorhabditis elegans phenotypic assays.
Sang-Kyu Jung
Phenotypic assays are crucial in genetics; however, traditional methods that rely on human observation are unsuitable for quantitative, large-scale experiments. Furthermore, there is an increasing need for comprehensive analyses of multiple phenotypes to provide multidimensional information. Here we developed an automated, high-throughput computer imaging system for quantifying multiple Caenorhabditis elegans phenotypes. Our imaging system is composed of a microscope equipped with a digital camera and a motorized stage connected to a computer running the QuantWorm software package. Currently, the software package contains one data acquisition module and four image analysis programs: WormLifespan, WormLocomotion, WormLength, and WormEgg. The data acquisition module collects images and videos. The WormLifespan software counts the number of moving worms by using two time-lapse images; the WormLocomotion software computes the velocity of moving worms; the WormLength software measures worm body size; and the WormEgg software counts the number of eggs. To evaluate the performance of our software, we compared the results of our software with manual measurements. We then demonstrated the application of the QuantWorm software in a drug assay and a genetic assay. Overall, the QuantWorm software provided accurate measurements at a high speed. Software source code, executable programs, and sample images are available at www.quantworm.org. Our software package has several advantages over current imaging systems for C. elegans. It is an all-in-one package for quantifying multiple phenotypes. The QuantWorm software is written in Java and its source code is freely available, so it does not require use of commercial software or libraries. It can be run on multiple platforms and easily customized to cope with new methods and requirements.
Haas, Rudolf; Weinreich, Bernhard
2007-07-01
The software package comprises simulation, design and calculation tools: Professional configuration of photovoltaic systems; Design and optimization of PV systems and components; 3D visualization of shading situations; Economic efficiency and profit calculations; Software status report; Measuring technology for characteristics, insolation, infrared radiation, etc.; Databases for modules, inverters and supports; Insolation maps for Germany dating back to 1998; Check lists: Site, dimensioning, comparison of systems, etc.; Useful addresses, bibliography, manufacturers; Other renewable energy sources, and much more. (orig.)
A software package for the full GBTX lifecycle
Feger, S; Marin, M Barros; Leitao, P; Moreira, P; Porret, D; Wyllie, K
2015-01-01
This work presents the software environment surrounding the GBTX. The GBTX is a high speed bidirectional ASIC, implementing radiation hard optical links for high-energy physics experiments. Having more than 300 8-bit configuration registers, it poses challenges addressed by a wide variety of software components. This paper focuses on the software used for characterization as well as radiation and production testing of the GBTX. It also highlights tools made available to the designers and users, enabling them to create customized configurations. The paper shows how storing data for the full GBTX lifecycle is planned, to ensure good quality tracking of these devices.
Effective organizational solutions for implementation of DBMS software packages
Jones, D.
1984-01-01
The space telescope management information system development effort is a guideline for discussing effective organizational solutions used in implementing DBMS software. Focus is on the importance of strategic planning. The value of constructing an information system architecture to conform to the organization's managerial needs, the need for a senior decision maker, dealing with shifting user requirements, and the establishment of a reliable working relationship with the DBMS vendor are examined. Requirements for a schedule to demonstrate progress against a defined timeline and the importance of continued monitoring for production software control, production data control, and software enhancements are also discussed.
An Ada Linear-Algebra Software Package Modeled After HAL/S
Klumpp, Allan R.; Lawson, Charles L.
1990-01-01
New avionics software written more easily. Software package extends Ada programming language to include linear-algebra capabilities similar to those of HAL/S programming language. Designed for such avionics applications as Space Station flight software. In addition to built-in functions of HAL/S, package incorporates quaternion functions used in Space Shuttle and Galileo projects and routines from LINPACK for solving systems of equations involving general square matrices. Contains two generic programs: one for floating-point computations and one for integer computations. Written on IBM/AT personal computer running under PC DOS, v.3.1.
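The quaternion rotation machinery mentioned above can be illustrated with a short sketch. This is hedged Python showing the underlying mathematics (Hamilton product, rotation by conjugation), not the package's Ada code; the function names are invented for illustration.

```python
import math

def q_mul(a, b):
    # Hamilton product of quaternions given as (w, x, y, z) tuples
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def q_rotate(q, v):
    # Rotate vector v by unit quaternion q: q * (0, v) * conj(q)
    qc = (q[0], -q[1], -q[2], -q[3])
    w, x, y, z = q_mul(q_mul(q, (0.0,) + tuple(v)), qc)
    return (x, y, z)

# A 90-degree rotation about the z-axis carries the x-axis onto the y-axis
s = math.sin(math.pi / 4)
q = (math.cos(math.pi / 4), 0.0, 0.0, s)
print(q_rotate(q, (1.0, 0.0, 0.0)))  # ~ (0, 1, 0)
```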
Mehdi Ghorbaninia
2014-08-01
This paper investigates the effects of different factors on the development of open source enterprise resource planning software packages. The study designs a questionnaire on a Likert scale and distributes it among 210 experts in the field of open source software package development. Cronbach's alpha has been calculated as 0.93, well above the minimum acceptable level. Using Pearson correlation as well as stepwise regression analysis, the study identifies the most important factors in three groups: fundamental issues, and issues during and after implementation of open source software development. The study also finds a positive and strong relationship between fundamental factors and after-implementation factors (r = 0.9006, Sig. = 0.000).
Accuracy of Giovanni and Marksim Software Packages for ...
Agricultural adaptation to climate change requires accurate, unbiased, and reliable climate data. ... simulation models are important tools for generating rainfall data in areas with limited or no .... software has a global partial-temporal coverage.
BEANS - a software package for distributed Big Data analysis
Hypki, Arkadiusz
2016-01-01
BEANS software is a web-based, easy to install and maintain tool to store and analyse massive amounts of data in a distributed way. It provides a clear interface for querying, filtering, aggregating, and plotting data from an arbitrary number of datasets. Its main purpose is to simplify the process of storing, examining and finding new relations in so-called Big Data. The creation of BEANS software is an answer to the growing need of the astronomical community for a versatile tool to store, analyse and compare complex astrophysical numerical simulations with observations (e.g. simulations of the Galaxy or star clusters with the Gaia archive). However, the software was built in a general form and, as open source software, is ready for use in any other research field.
QuickDirect - Payload Control Software Template Package Project
National Aeronautics and Space Administration — To address the need to quickly, cost-effectively and reliably develop software to control science instruments deployed on spacecraft, QuickFlex proposes to create a...
The quality and testing PH-SFT infrastructure for the external LHC software packages deployment
CERN. Geneva; MENDEZ LORENZO, Patricia; MATO VILA, Pere
2015-01-01
The PH-SFT group is responsible for the build, test, and deployment of the set of external software packages used by the LHC experiments. This set includes ca. 170 packages, including Grid packages and Monte Carlo generators provided in different versions. A complete build structure has been established to guarantee the quality of the packages provided by the group. This structure includes an experimental build and three daily nightly builds, each dedicated to a specific ROOT version: v6.02, v6.04, and the master. While the former build is dedicated to the testing of new packages, versions and dependencies (mainly for SFT-internal use), the three latter are responsible for the deployment to AFS of the set of stable and well tested packages requested by the LHC experiments, so they can apply their own builds on top. In all cases, a c...
SIMODIS - a software package for simulating nuclear reactor components
Guimaraes, Lamartine; Borges, Eduardo M. [Centro Tecnico Aeroespacial (CTA-IEAv), Sao Jose dos Campos, SP (Brazil). Inst. de Estudos Avancados. E-mail: guimarae@ieav.cta.br; Oliveira Junior, Nilton S.; Santos, Glauco S.; Bueno, Mariana F. [Universidade Bras Cubas, Mogi das Cruzes, SP (Brazil)
2000-07-01
This paper presents the initial development effort in building a nuclear reactor component simulation package. The package was developed for the MATLAB simulation environment. It combines the graphical capabilities of MATLAB with the advantages of compiled languages such as FORTRAN and C++: from MATLAB it takes the facilities for displaying the calculated results, and from the compiled languages it takes processing speed. So far, models of a reactor core, a UTSG and an OTSG have been developed, along with a series of user-friendly graphical interfaces for the above models. As a by-product, a set of water and sodium thermal and physical properties has been developed; these may be used directly as functions from MATLAB, or called from a model as part of its calculation process. The whole set was named SIMODIS, which stands for SIstema MODular Integrado de Simulacao. (author)
Development of a software package for solid-angle calculations using the Monte Carlo method
Zhang, Jie; Chen, Xiulian; Zhang, Changsheng; Li, Gang; Xu, Jiayun; Sun, Guangai
2014-02-01
Solid-angle calculations, which are often complicated, play an important role in the absolute calibration of radioactivity measurement systems and in the determination of the activity of radioactive sources. In the present paper, a software package is developed to provide a convenient tool for solid-angle calculations in nuclear physics. The software calculates solid angles using the Monte Carlo method, into which a new type of variance reduction technique is integrated. The package, developed with the Microsoft Foundation Classes (MFC) in Microsoft Visual C++, has a graphical user interface in which a visualization function is integrated in conjunction with OpenGL. One advantage of the package is that it can calculate, without any difficulty, the solid angle subtended by a detector of various geometric shapes (e.g., cylinder, square prism, regular triangular prism or regular hexagonal prism) to a point, circular or cylindrical source. The results obtained from the package were compared with those from previous studies and with values calculated using Geant4. The comparison shows that the package produces accurate solid-angle values at a greater computation speed than Geant4.
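The plain Monte Carlo idea behind such a package can be sketched for the simplest configuration, a point source on the axis of a circular disk detector, where the analytic result Ω = 2π(1 − d/√(d² + R²)) is available for comparison. The sketch below illustrates only the basic sampling approach, not the authors' variance-reduction technique or GUI.

```python
import math, random

def solid_angle_disk_mc(d, R, n=200_000, seed=1):
    """Monte Carlo estimate of the solid angle subtended by a disk of
    radius R at axial distance d from a point source: sample isotropic
    directions and count rays that hit the disk."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        # Uniform direction on the unit sphere
        cz = rng.uniform(-1.0, 1.0)
        phi = rng.uniform(0.0, 2.0 * math.pi)
        sz = math.sqrt(1.0 - cz * cz)
        dx, dy = sz * math.cos(phi), sz * math.sin(phi)
        if cz <= 0.0:
            continue                 # ray travels away from the disk plane
        t = d / cz                   # ray parameter at the plane z = d
        if (t * dx) ** 2 + (t * dy) ** 2 <= R * R:
            hits += 1
    return 4.0 * math.pi * hits / n  # hit fraction of the full sphere

d, R = 10.0, 5.0
analytic = 2.0 * math.pi * (1.0 - d / math.hypot(d, R))
print(solid_angle_disk_mc(d, R), analytic)
```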
Evaluation of open source data mining software packages
Bonnie Ruefenacht; Greg Liknes; Andrew J. Lister; Haans Fisk; Dan Wendt
2009-01-01
Since 2001, the USDA Forest Service (USFS) has used classification and regression-tree technology to map USFS Forest Inventory and Analysis (FIA) biomass, forest type, forest type groups, and National Forest vegetation. This prior work used Cubist/See5 software for the analyses. The objective of this project, sponsored by the Remote Sensing Steering Committee (RSSC),...
Versatile Software Package For Near Real-Time Analysis of Experimental Data
Wieseman, Carol D.; Hoadley, Sherwood T.
1998-01-01
This paper provides an overview of a versatile software package developed for time- and frequency-domain analyses of experimental wind-tunnel data. This package, originally developed for analyzing data in the NASA Langley Transonic Dynamics Tunnel (TDT), is applicable for analyzing any time-domain data. A Matlab-based software package, TDT-analyzer, provides a compendium of commonly-required dynamic analysis functions in a user-friendly interactive and batch processing environment. TDT-analyzer has been used extensively to provide on-line near real-time and post-test examination and reduction of measured data acquired during wind tunnel tests of aeroelastically-scaled models of aircraft and rotorcraft as well as a flight test of the NASA High Alpha Research Vehicle (HARV) F-18. The package provides near real-time results in an informative and timely manner far exceeding prior methods of data reduction at the TDT.
First Release of Gauss-Legendre Sky Pixelization (GLESP) software package for CMB analysis
Doroshkevich, A G; Verkhodanov, O V; Novikov, D I; Turchaninov, V I; Novikov, I D; Christensen, P R; Chiang, L Y
2005-01-01
We report the release of the Gauss-Legendre Sky Pixelization (GLESP) software package, version 1.0. In this report we present the main features and functions for processing and manipulation of sky signals. Features for CMB polarization are under way and will be incorporated in a future release. Interested readers can visit http://www.glesp.nbi.dk (www.glesp.nbi.dk) and register to receive the package.
Ray Tracing RF Field Prediction: An Unforgiving Validation
E. M. Vitucci
2015-01-01
The prediction of RF coverage in urban environments is now commonly considered a solved problem, with tens of models proposed in the literature showing good performance against measurements. Among these, ray tracing is regarded as one of the most accurate models available. In the present work, however, we show that a great deal of work is still needed to make ray tracing really unleash its potential in practical use. A very extensive validation of a state-of-the-art 3D ray tracing model is carried out through comparison with measurements in one of the most challenging environments: the city of San Francisco. Although the comparison is based on RF cellular coverage at 850 and 1900 MHz, a widely studied territory, very relevant sources of error and inaccuracy are identified in several cases along with possible solutions.
Simplification of vector ray tracing by the groove function.
Hu, Zhongwen; Liu, Zuping; Wang, Qiuping
2005-01-01
Tracing rays through arbitrary diffraction gratings (including holographic gratings of the second generation fabricated on a curved substrate) by the vector form is somewhat complicated. Vector ray tracing utilizes the local groove density, the calculation of which highly depends on how the grooves are generated. Characterizing a grating by its groove function, available for almost arbitrary gratings, is much simpler than doing so by its groove density, essentially being a vector. Applying the concept of Riemann geometry, we give an expression of the groove density by the groove function. The groove function description of a grating can thus be incorporated into vector ray tracing, which is beneficial especially at the design stage. A unified explicit grating ray-tracing formalism is given as well.
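The central relation here, local groove density as the gradient of the groove function, can be illustrated numerically. In this sketch the groove function is a hypothetical variable-line-space example invented for illustration, not one taken from the paper.

```python
def local_groove_density(G, x, y, h=1e-6):
    """Local groove-density vector as the numerical gradient of a
    groove function G(x, y) over the substrate coordinates."""
    nx = (G(x + h, y) - G(x - h, y)) / (2 * h)
    ny = (G(x, y + h) - G(x, y - h)) / (2 * h)
    return nx, ny

# Hypothetical variable-line-space grating: groove number as a function
# of position, G = x/d0 + a*x**2, so the density along x is 1/d0 + 2*a*x.
d0, a = 1e-3, 0.05   # nominal period and chirp term (arbitrary units)
G = lambda x, y: x / d0 + a * x * x
print(local_groove_density(G, 2.0, 0.0))  # ~ (1000.2, 0.0)
```

Because the groove function is scalar, this works for almost arbitrary groove layouts, which is the simplification the abstract advocates over carrying the groove density around as a vector.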
Three-dimensional polarization ray-tracing calculus II: retardance.
Yun, Garam; McClain, Stephen C; Chipman, Russell A
2011-06-20
The concept of retardance is critically analyzed for ray paths through optical systems described by a three-by-three polarization ray-tracing matrix. Algorithms are presented to separate the effects of retardance from geometric transformations. The geometric transformation described by a "parallel transport matrix" characterizes nonpolarizing propagation through an optical system, and also provides a proper relationship between sets of local coordinates along the ray path. The proper retardance is calculated by removing this geometric transformation from the three-by-three polarization ray-tracing matrix. Two rays with different ray paths through an optical system can have the same polarization ray-tracing matrix but different retardances. The retardance and diattenuation of an aluminum-coated three fold-mirror system are analyzed as an example.
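One building block of this analysis, the parallel transport of a propagation vector between two ray segments, can be sketched as the minimal rotation carrying one unit direction onto the next (Rodrigues formula). This is an illustrative reading of the "parallel transport matrix", not the paper's full algorithm, which composes such transforms along the entire ray path and factors them out of the three-by-three polarization ray-tracing matrix.

```python
import math

def parallel_transport(k1, k2):
    """Rotation (3x3 matrix, as nested lists) taking unit propagation
    vector k1 onto k2 about their common normal, via Rodrigues' formula.
    This nonpolarizing geometric transform is what a retardance
    calculation must remove before reading off the proper retardance."""
    ax = (k1[1]*k2[2] - k1[2]*k2[1],
          k1[2]*k2[0] - k1[0]*k2[2],
          k1[0]*k2[1] - k1[1]*k2[0])            # k1 x k2
    s = math.sqrt(sum(a * a for a in ax))        # sin(theta)
    c = sum(a * b for a, b in zip(k1, k2))       # cos(theta)
    if s < 1e-12:                                # parallel: identity
        return [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
    u = [a / s for a in ax]                      # unit rotation axis
    K = [[0, -u[2], u[1]], [u[2], 0, -u[0]], [-u[1], u[0], 0]]
    KK = [[sum(K[i][m] * K[m][j] for m in range(3)) for j in range(3)]
          for i in range(3)]
    return [[(1.0 if i == j else 0.0) + s * K[i][j] + (1 - c) * KK[i][j]
             for j in range(3)] for i in range(3)]

def apply(M, v):
    return tuple(sum(M[i][j] * v[j] for j in range(3)) for i in range(3))

k1 = (0.0, 0.0, 1.0)
k2 = (1 / math.sqrt(2), 0.0, 1 / math.sqrt(2))
Q = parallel_transport(k1, k2)
print(apply(Q, k1))  # ~ k2
```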
Open Source Scanning Probe Microscopy Control Software Package Gxsm
Zahl P.; Wagner, T.; Moller, R.; Klust, A.
2009-08-10
Gxsm is a full-featured and modern scanning probe microscopy (SPM) software package. It can be used for powerful multidimensional image/data processing, analysis, and visualization. Connected to an instrument, it operates many different flavors of SPM, e.g., scanning tunneling microscopy (STM) and atomic force microscopy (AFM), or, in general, two-dimensional multi-channel data acquisition instruments. The Gxsm core can handle different data types, e.g., integer and floating point numbers. An easily extendable plug-in architecture provides many image analysis and manipulation functions. A digital signal processor (DSP) subsystem runs the feedback loop, generates the scanning signals and acquires the data during SPM measurements. The programmable Gxsm vector probe engine performs virtually any thinkable spectroscopy and manipulation task, such as scanning tunneling spectroscopy (STS) or tip formation. The Gxsm software is released under the GNU General Public License (GPL) and can be obtained via the Internet.
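The feedback loop run by the DSP subsystem can be caricatured as a PI regulator on the tip height. The sketch below is purely illustrative of that control idea, with an invented exponential "instrument" standing in for the tunneling-current response; it is not Gxsm's DSP code, and the gains are arbitrary.

```python
import math

def pi_feedback(read_signal, setpoint, kp, ki, dt, steps):
    """Minimal discrete PI loop of the kind an SPM DSP subsystem runs:
    adjust tip height z until the measured signal sits at the setpoint.
    Illustrative only -- not Gxsm's actual feedback implementation."""
    z, integral = 0.0, 0.0
    for _ in range(steps):
        error = read_signal(z) - setpoint   # signal too high -> retract tip
        integral += error * dt
        z += kp * error + ki * integral
    return z

# Toy 'instrument': signal decays exponentially with tip height z, so
# regulating it to 0.5 should settle near z = ln(2)/2 ~ 0.347
signal = lambda z: math.exp(-2.0 * z)
z_final = pi_feedback(signal, setpoint=0.5, kp=0.1, ki=0.05, dt=1.0, steps=200)
print(round(z_final, 3))
```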
PALSfit3: A software package for analysing positron lifetime spectra
Kirkegaard, Peter; Olsen, Jens V.; Eldrup, Morten Mostgaard
been used extensively by the positron annihilation community. The present document describes the mathematical foundation of the PALSfit3 model as well as a number of features of the program. The cornerstones of PALSfit3 are two least squares fitting modules: POSITRONFIT and RESOLUTIONFIT. In both...... in a text window. PALSfit3 is verified on Windows XP and Windows 7, 8 and 10. The PALSfit3 software can be acquired from the Technical University of Denmark (http://PALSfit.dk)...
A Relative Comparison of Leading Supply Chain Management Software Packages
Zhongxian Wang; Ruiliang Yan; Kimberly Hollister; Ruben Xing
2009-01-01
Supply Chain Management (SCM) has proven to be an effective tool that aids companies in the development of competitive advantages. SCM Systems are relied on to manage warehouses, transportation, trade logistics and various other issues concerning the coordinated movement of products and services from suppliers to customers. Although in today's fast paced business environment, numerous supply chain solution tools are readily available to companies, choosing the right SCM software is not an e...
Simulating water, solute, and heat transport in the subsurface with the VS2DI software package
Healy, R.W.
2008-01-01
The software package VS2DI was developed by the U.S. Geological Survey for simulating water, solute, and heat transport in variably saturated porous media. The package consists of a graphical preprocessor to facilitate construction of a simulation, a postprocessor for visualizing simulation results, and two numerical models that solve for flow and solute transport (VS2DT) and flow and heat transport (VS2DH). The finite-difference method is used to solve the Richards equation for flow and the advection-dispersion equation for solute or heat transport. This study presents a brief description of the VS2DI package, an overview of the various types of problems that have been addressed with the package, and an analysis of the advantages and limitations of the package. A review of other models and modeling approaches for studying water, solute, and heat transport also is provided. © Soil Science Society of America. All rights reserved.
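The kind of advection-dispersion transport step mentioned above can be sketched in one dimension with an explicit finite-difference scheme. This is a generic textbook illustration with made-up parameters, not VS2DI's actual (variably saturated) solver.

```python
def advect_disperse(c, v, D, dx, dt, steps):
    """Explicit 1D finite-difference step for the advection-dispersion
    equation dc/dt = D d2c/dx2 - v dc/dx, with first-order upwind
    advection (v > 0) and zero-concentration boundaries."""
    assert v * dt / dx <= 1.0 and D * dt / dx**2 <= 0.5, "unstable step"
    c = list(c)
    for _ in range(steps):
        new = c[:]
        for i in range(1, len(c) - 1):
            disp = D * (c[i + 1] - 2 * c[i] + c[i - 1]) / dx**2
            adv = -v * (c[i] - c[i - 1]) / dx
            new[i] = c[i] + dt * (disp + adv)
        c = new
    return c

# A solute pulse moving down a 1 m column (illustrative parameters):
# v = 1 cm/s, D = 1e-4 m^2/s, 2 cm cells, 20 one-second steps.
c0 = [0.0] * 50
c0[10] = 1.0
c = advect_disperse(c0, v=0.01, D=1e-4, dx=0.02, dt=1.0, steps=20)
```

The pulse advects roughly v·t/dx = 10 cells downstream while spreading dispersively, and (away from the boundaries) total mass is conserved.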
Maximize Your Investment 10 Key Strategies for Effective Packaged Software Implementations
Beaubouef, Grady Brett
2009-01-01
This is a handbook covering ten principles for packaged software implementations that project managers, business owners, and IT developers should pay attention to. The book also has practical real-world coverage including a sample agenda for conducting business solution modeling, customer case studies, and a road map to implement guiding principles. This book is aimed at enterprise architects, development leads, project managers, business systems analysts, business systems owners, and anyone who wants to implement packaged software effectively. If you are a customer looking to implement COTS s
GPU-based Ray Tracing of Dynamic Scenes
Christopher Lux
2010-08-01
Interactive ray tracing of non-trivial scenes is just becoming feasible on single graphics processing units (GPUs). Recent work in this area focuses on building effective acceleration structures, which work well under the constraints of current GPUs. Most approaches are targeted at static scenes and only allow navigation in the virtual scene. So far, support for dynamic scenes has not been considered for GPU implementations. We have developed a GPU-based ray tracing system for dynamic scenes consisting of a set of individual objects. Each object may independently move around, but its geometry and topology are static.
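The per-ray work that such a GPU system parallelizes bottoms out in primitive intersection tests. A minimal ray-sphere intersection can be sketched as below; this is illustrative Python, whereas a real GPU implementation would express the same kernel as a shader or CUDA code running over acceleration structures.

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Nearest positive hit distance of a ray with a sphere, or None.
    This is the per-ray kernel a GPU ray tracer evaluates in parallel.
    The direction is assumed to be unit length (so a = 1 in the
    quadratic |o + t*d - c|^2 = r^2)."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = 2 * (ox * direction[0] + oy * direction[1] + oz * direction[2])
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None                      # ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2       # nearer root first
    return t if t > 0 else None

# Ray from the origin along +z toward a unit sphere centered at z = 5
print(ray_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # -> 4.0
```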
Vasconcelos, Taruska Ventorini; Neves, Frederico Sampaio; Moraes, Livia Almeida Bueno; Freitas, Deborah Queiroz, E-mail: tataventorini@hotmail.com [Universidade Estadual de Campinas (UNICAMP), Piracicaba, SP (Brazil). Faculdade de Odontologia
2015-03-01
This article aimed at comparing the accuracy of linear measurement tools of different commercial software packages. Eight fully edentulous dry mandibles were selected for this study. Incisor, canine, premolar, first molar and second molar regions were selected. Cone beam computed tomography (CBCT) images were obtained with i-CAT Next Generation. Linear bone measurements were performed by one observer on the cross-sectional images using three different software packages: XoranCat®, OnDemand3D® and KDIS3D®, all able to assess DICOM images. In addition, 25% of the sample was reevaluated for the purpose of reproducibility. The mandibles were sectioned to obtain the gold standard for each region. Intraclass coefficients (ICC) were calculated to examine the agreement between the two periods of evaluation; the one-way analysis of variance performed with the post-hoc Dunnett test was used to compare each of the software-derived measurements with the gold standard. The ICC values were excellent for all software packages. The least difference between the software-derived measurements and the gold standard was obtained with the OnDemand3D and KDIS3D (‑0.11 and ‑0.14 mm, respectively), and the greatest, with the XoranCAT (+0.25 mm). However, there was no statistical significant difference between the measurements obtained with the different software packages and the gold standard (p > 0.05). In conclusion, linear bone measurements were not influenced by the software package used to reconstruct the image from CBCT DICOM data. (author)
Software package as an information center product. [Activities of Argonne Code Center
Butler, M. K.
1977-01-01
The Argonne Code Center serves as a software exchange and information center for the U.S. Energy Research and Development Administration and the Nuclear Regulatory Commission. The goal of the Center's program is to provide a means for sharing of software among agency offices and contractors, and for transferring computing applications and technology, developed within the agencies, to the information-processing community. A major activity of the Code Center is the acquisition, review, testing, and maintenance of a collection of software--computer systems, applications programs, subroutines, modules, and data compilations--prepared by agency offices and contractors to meet programmatic needs. A brief review of the history of computer program libraries and software sharing is presented to place the Code Center activity in perspective. The state-of-the-art discussion starts off with an appropriate definition of the term software package, together with descriptions of recommended package contents and the Center's package evaluation activity. An effort is made to identify the various users of the product, to enumerate their individual needs, to document the Center's efforts to meet these needs and the ongoing interaction with the user community. Desirable staff qualifications are considered, and packaging problems reviewed. The paper closes with a brief look at recent developments and a forecast of things to come. 2 tables. (RWR)
3-D TECATE/BREW: Thermal, stress, and birefringent ray-tracing codes for solid-state laser design
Gelinas, R. J.; Doss, S. K.; Nelson, R. G.
1994-07-01
This report describes the physics, code formulations, and numerics that are used in the TECATE (totally Eulerian code for anisotropic thermo-elasticity) and BREW (birefringent ray-tracing of electromagnetic waves) codes for laser design. These codes resolve thermal, stress, and birefringent optical effects in 3-D stationary solid-state systems. This suite of three constituent codes is a package referred to as LASRPAK.
(no author listed)
2002-01-01
This paper introduces software designed specifically to calculate the contribution rate of mechanization in agriculture, using economic-mathematical methods, computer technology and Visual Basic 6.0. The software package has a friendly interface, simple operation and an accurate, feasible calculation method. It greatly improves on past practice, in which the large amounts of data and the miscellaneous, cumbersome methods made answers hard to obtain, and so it has very high practical value.
Masmoudi, Nabil
2014-01-01
We present an approximate, but efficient and sufficiently accurate P-wave ray tracing and dynamic ray tracing procedure for 3D inhomogeneous, weakly orthorhombic media with varying orientation of symmetry planes. In contrast to commonly used approaches, the orthorhombic symmetry is preserved at any point of the model. The model is described by six weak-anisotropy parameters and three Euler angles, which may vary arbitrarily, but smoothly, throughout the model. We use the procedure for the calculation of rays and corresponding two-point traveltimes in a VSP experiment in a part of the BP benchmark model generalized to orthorhombic symmetry.
Zheng, Guangyong; Xu, Yaochen; Zhang, Xiujun; Liu, Zhi-Ping; Wang, Zhuo; Chen, Luonan; Zhu, Xin-Guang
2016-12-23
A gene regulatory network (GRN) represents interactions of genes inside a cell or tissue, in which vertexes and edges stand for genes and their regulatory interactions, respectively. Reconstruction of gene regulatory networks, in particular genome-scale networks, is essential for comparative exploration of different species and mechanistic investigation of biological processes. Currently, most network inference methods are computationally intensive; they are usually effective for small-scale tasks (e.g., networks with a few hundred genes) but have difficulty constructing GRNs at genome scale. Here, we present a software package for gene regulatory network reconstruction at a genomic level, in which gene interaction is measured by the conditional mutual information measurement using a parallel computing framework (hence the package is named CMIP). The package is a greatly improved implementation of our previous PCA-CMI algorithm. In CMIP, we provide not only an automatic threshold determination method but also an effective parallel computing framework for network inference. Performance tests on benchmark datasets show that the accuracy of CMIP is comparable to most current network inference methods. Moreover, running tests on synthetic datasets demonstrate that CMIP can handle large datasets, especially genome-wide datasets, within an acceptable time period. In addition, successful application on a real genomic dataset confirms the practical applicability of the package. This new software package provides a powerful tool for genomic network reconstruction to the biological community. The software can be accessed at http://www.picb.ac.cn/CMIP/ .
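The conditional mutual information measure at the heart of this approach can be illustrated with a simple discrete plug-in estimator. The sketch below is a simplified stand-in for the concept only; it is not CMIP's actual estimator, thresholding method, or parallel framework.

```python
from collections import Counter
from math import log

def cmi(xs, ys, zs):
    """Plug-in estimate of conditional mutual information I(X;Y|Z)
    for discrete samples: sum over joint cells of
    p(x,y,z) * log( p(x,y,z) p(z) / (p(x,z) p(y,z)) )."""
    n = len(xs)
    pxyz = Counter(zip(xs, ys, zs))
    pxz = Counter(zip(xs, zs))
    pyz = Counter(zip(ys, zs))
    pz = Counter(zs)
    total = 0.0
    for (x, y, z), c in pxyz.items():
        total += c / n * log((c * pz[z]) / (pxz[(x, z)] * pyz[(y, z)]))
    return total

# X determines Y exactly while Z is unrelated, so I(X;Y|Z) = log 2
xs = [0, 1, 0, 1, 0, 1, 0, 1]
ys = xs[:]                     # perfect dependence
zs = [0, 0, 1, 1, 0, 0, 1, 1]
print(cmi(xs, ys, zs))         # ~ 0.693 (nats)
```

In a network-inference setting, a nonzero value of this quantity between two genes given a third is taken as evidence for a direct regulatory edge.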
Real time ray tracing of skeletal implicit surfaces
Rouiller, Olivier; Bærentzen, Jakob Andreas
Modeling and rendering in real time is usually done via rasterization of polygonal meshes. We present a method to model with skeletal implicit surfaces and an algorithm to ray trace these surfaces in real time on the GPU. Our skeletal representation of the surfaces allows the creation of smooth models...
Ray Tracing Modelling of Reflector for Vertical Bifacial Panel
Jakobsen, Michael Linde; Thorsteinsson, Sune; Poulsen, Peter Behrensdorff
2016-01-01
Bifacial solar panels have recently become a new attractive building block for PV systems. In this work we propose a reflector system for a vertical bifacial panel, and use ray tracing modelling to model the performance. Particularly, we investigate the impact of the reflector volume being filled...
Ray tracing and refraction in the modified US1976 atmosphere
van der Werf, SY
2003-01-01
A new and flexible ray-tracing procedure for calculating astronomical refraction is outlined and applied to the US1976 standard atmosphere. This atmosphere is generalized to allow for a free choice of the temperature and pressure at sea level, and in this form it has been named the modified US1976
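The simplest possible refraction calculation, a single application of Snell's law across a plane-parallel atmosphere, gives a handy sanity check for any such ray-tracing procedure. The sketch below is this elementary approximation only, not the paper's spherically layered integration; the surface refractive index is an assumed standard-conditions value.

```python
import math

def planar_refraction(z_app_deg, n0=1.000293):
    """Astronomical refraction (arcseconds) for a plane-parallel
    atmosphere: Snell's law gives sin(z_true) = n0 * sin(z_apparent),
    and refraction is z_true - z_apparent. Valid away from the horizon."""
    za = math.radians(z_app_deg)
    zt = math.asin(min(1.0, n0 * math.sin(za)))
    return math.degrees(zt - za) * 3600.0

# At 45 degrees apparent zenith distance the classic approximation
# R ~ 60.4" * tan(z) predicts about one arcminute of refraction.
print(round(planar_refraction(45.0), 1))
```

A full ray-tracing treatment like the one in the paper departs from this tangent law close to the horizon, where the curvature and vertical structure of the atmosphere dominate.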
Comparison of four software packages for CT lung volumetry in healthy individuals
Nemec, Stefan F. [Harvard Medical School, Department of Radiology, Beth Israel Deaconess Medical Center, Boston, MA (United States); Medical University of Vienna, Department of Biomedical Imaging and Image-guided Therapy, Vienna (Austria); Molinari, Francesco [Centre Hospitalier Regional Universitaire de Lille, Department of Radiology, Lille (France); Dufresne, Valerie [CHU de Charleroi - Hopital Vesale, Pneumologie, Montigny-le-Tilleul (Belgium); Gosset, Natacha [CHU Tivoli, Service d' Imagerie Medicale, La Louviere (Belgium); Silva, Mario; Bankier, Alexander A. [Harvard Medical School, Department of Radiology, Beth Israel Deaconess Medical Center, Boston, MA (United States)
2015-06-01
To compare CT lung volumetry (CTLV) measurements provided by different software packages, and to provide normative data for lung densitometric measurements in healthy individuals. This retrospective study included 51 chest CTs of 17 volunteers (eight men and nine women; mean age, 30 ± 6 years), who underwent spirometrically monitored CT at total lung capacity (TLC), functional residual capacity (FRC), and mean inspiratory capacity (MIC). Volumetric differences assessed by four commercial software packages were compared with analysis of variance (ANOVA) for repeated measurements and benchmarked against the threshold for acceptable variability between spirometric measurements. Mean lung density (MLD) and parenchymal heterogeneity (MLD-SD) were also compared with ANOVA. Volumetric differences ranged from 12 to 213 ml (0.20 % to 6.45 %). Although 16/18 comparisons (among four software packages at TLC, MIC, and FRC) were statistically significant (P < 0.001 to P = 0.004), only 3/18 comparisons, one at MIC and two at FRC, exceeded the spirometry variability threshold. MLD and MLD-SD significantly increased with decreasing volumes, and were significantly larger in lower compared to upper lobes (P < 0.001). Lung volumetric differences provided by different software packages are small. These differences should not be interpreted based on statistical significance alone, but together with absolute volumetric differences. (orig.)
Dikkers, Riksta; Willems, Tineke P.; de Jonge, Gonda J.; Marquering, Henk A.; Greuter, Marcel J. W.; van Ooijen, Peter M. A.; van der Weide, Marijke C. Jansen; Oudkerk, Matthijs
2009-01-01
Purpose: The purpose of this study was to investigate the noninvasive quantification of coronary artery stenosis using cardiac software packages and vessel phantoms with known stenosis severity. Materials and Methods: Four different sizes of vessel phantoms were filled with contrast agent and
Manual of spIds, a software package for parameter identification in dynamic systems
Everaars, C.T.H.; Hemker, P.W.; Stortelder, W.J.H.
1995-01-01
This report contains the manual of spIds, version 1.0, a software package for parameter identification in dynamic systems. SpIds is an acronym of Simulation and Parameter Identification in Dynamic Systems. It can be applied on wide var
Application of CyboCon Advanced Adjustment and Control Software Package in Delayed Coking Unit
Guo Hua
2002-01-01
This article describes the application of the CyboCon software package, based on model-free adaptive (MFA) control, in the 800-kt/a delayed coking unit to realize an advanced adjustment and control strategy for the temperature control of the heater. Operational tests have demonstrated the system's ease of operation and simplicity of maintenance, leading to good economic benefits.
A Microcomputer-Based Software Package for Eye-Monitoring Research. Technical Report No. 434.
McConkie, George W.; And Others
A software package is described that collects and reduces eye behavior data (eye position and pupil size) using an IBM-PC compatible computer. Written in C language for speed and portability, it includes several features: (1) data can be simultaneously collected from other sources (such as electroencephalography and electromyography); (2)…
PyPedal, an open source software package for pedigree analysis
The open source software package PyPedal (http://pypedal.sourceforge.net/) was first released in 2002, and provided users with a set of simple tools for manipulating pedigrees. Its flexibility has been demonstrated by its use in a number of settings for large and small populations. After substantia...
Application of modern software packages to calculating the solidification of high-speed steels
Morozov, S. I.
2015-12-01
The solidification of high-speed steels is calculated with the Pandat and JMatPro software packages. The results of calculating equilibrium and nonequilibrium solidification are presented and discussed. The nonequilibrium solidification is simulated using the Scheil-Gulliver model. The fraction of carbides changes as a function of the carbon content in the steels.
Edwards, T.B.
2000-03-22
The purpose of this report is to provide software verification and validation for the statistical packages used by the Statistical Consulting Section (SCS) of the Savannah River Technology Center. The need for this verification and validation stems from the requirements of the Quality Assurance programs that are frequently applicable to the work conducted by SCS. The IBM Personal Computer 300PL and 300XL are both Pentium II based desktops. Therefore the software verification and validation in this report is valid interchangeably between both platforms. As new computing platforms, statistical packages, or revisions to existing packages are introduced, they are to be reevaluated, and this report is to be revised to address their verification and validation.
MATEO: a software package for the molecular design of energetic materials.
Mathieu, Didier
2010-04-15
To satisfy the need of energetic materials chemists for reliable and efficient predictive tools in order to select the most promising candidates for synthesis, a custom software package is developed. Making extensive use of publicly available software, it integrates a wide range of models and can be used for a variety of tasks, from the calculation of molecular properties to the prediction of the performance of heterogeneous materials, such as propellant compositions based on ammonium perchlorate/aluminium mixtures. The package is very easy to use through a graphical desktop environment. According to the material provided as input, suitable models and parameters are automatically selected. Therefore, chemists can apply advanced predictive models without having to learn how to use complex computer codes. To make the package more versatile, a command-line interface is also provided. It facilitates the assessment of various procedures by model developers.
A software algorithm/package for control loop configuration and eco-efficiency.
Munir, M T; Yu, W; Young, B R
2012-11-01
Software is a powerful tool to help us analyze industrial information and control processes. In this paper, we present our recently developed software algorithm/package, which can help select the more eco-efficient control configuration. Nowadays, the eco-efficiency of all industrial processes/plants has become more and more important; engineers need a way to integrate control loop configuration and measurements of eco-efficiency. The exergy eco-efficiency factor, a new measure of eco-efficiency for control loop configuration, has been developed. This software algorithm/package combines a commercial simulator, VMGSim, with Excel to calculate the exergy eco-efficiency factor.
2015-01-01
We have developed a non-sequential ray-tracing simulation library, the ROot-BAsed Simulator for ray Tracing (ROBAST), which is aimed at wide use in optical simulations of cosmic-ray (CR) and gamma-ray telescopes. The library is written in C++ and fully utilizes the geometry library of the ROOT analysis framework. Despite the importance of optics simulations in CR experiments, no widely usable open-source software for ray-tracing simulations previously existed. To reduce the unnecessary effort expended when different research groups develop multiple ray-tracing simulators, we have successfully used ROBAST for many years to perform optics simulations for the Cherenkov Telescope Array (CTA). Among the proposed telescope designs for CTA, ROBAST is currently being used for three telescopes: a Schwarzschild-Couder telescope, one of the Schwarzschild-Couder small-sized telescopes, and a large-sized telescope (LST). ROBAST is also used for the simulation and development of hexagonal light concentrators that has be...
Daniell, James; Simpson, Alanna; Gunasekara, Rashmin; Baca, Abigail; Schaefer, Andreas; Ishizawa, Oscar; Murnane, Rick; Tijssen, Annegien; Deparday, Vivien; Forni, Marc; Himmelfarb, Anne; Leder, Jan
2015-04-01
Over the past few decades, a plethora of open access software packages for the calculation of earthquake, volcanic, tsunami, storm surge, wind and flood risk have been produced globally. As part of the World Bank GFDRR Review released at the Understanding Risk 2014 Conference, over 80 such open access risk assessment software packages were examined. Commercial software was not considered in the evaluation. A preliminary analysis was used to determine whether the 80 models were currently supported and if they were open access. This process was used to select a subset of 31 models that include 8 earthquake models, 4 cyclone models, 11 flood models, and 8 storm surge/tsunami models for more detailed analysis. By using multi-criteria decision analysis (MCDA) and simple descriptions of the software uses, the review allows users to select a few relevant software packages for their own testing and development. The detailed analysis evaluated the models on the basis of over 100 criteria and provides a synopsis of available open access natural hazard risk modelling tools. In addition, volcano software packages have since been added, bringing the compendium of risk software tools to more than 100. There has been a huge increase in the quality and availability of open access/source software over the past few years. For example, private entities such as Deltares now have an open source policy regarding some flood models (NGHS). In addition, leaders in developing risk models in the public sector, such as Geoscience Australia (EQRM, TCRM, TsuDAT, AnuGA) or CAPRA (ERN-Flood, Hurricane, CRISIS2007 etc.), are launching and/or helping many other initiatives. As we achieve greater interoperability between modelling tools, we will also achieve a future wherein different open source and open access modelling tools will be increasingly connected and adapted towards unified multi-risk model platforms and highly customised solutions. It was seen that many software tools could be improved by enabling user
Odyssey: Ray tracing and radiative transfer in Kerr spacetime
Pu, Hung-Yi; Yun, Kiyun; Younsi, Ziri; Yoon, Suk-Jin
2016-01-01
Odyssey is a GPU-based General Relativistic Radiative Transfer (GRRT) code for computing images and/or spectra in the Kerr metric, which describes the spacetime around a rotating black hole. Odyssey is implemented in CUDA C/C++. For flexibility, the namespace structure in C++ is used for different tasks; the two default tasks presented in the source code are the redshift of a Keplerian disk and the image of a Keplerian rotating shell at 340 GHz. Odyssey_Edu, an educational software package for visualizing the ray trajectories in the Kerr spacetime that uses Odyssey, is also available.
3D ultrasonic ray tracing in AutoCAD®
Reilly, D.; Leggat, P.; McNab, A.
2001-04-01
To assist with the design and validation of testing procedures for NDT, add-on modules have been developed for AutoCAD® 2000. One of the modules computes and displays ultrasonic 3D ray tracing. Another determines paths between two points, for instance a probe and a target or two probes. The third module displays phased array operational modes and calculates element delays for phased array operation. The modules can be applied to simple or complex solid model components.
Improved algorithm of ray tracing in ICF cryogenic targets
Zhang, Rui; Yang, Yongying; Ling, Tong; Jiang, Jiabin
2016-10-01
High-precision ray tracing inside inertial confinement fusion (ICF) cryogenic targets plays an important role in the reconstruction of the three-dimensional density distribution by the algebraic reconstruction technique (ART) algorithm. Traditional Runge-Kutta methods, which are restricted by the precision of the grid division and the step size of the ray tracing, cannot calculate accurately where the refractive index changes abruptly. In this paper, we propose an improved ray-tracing algorithm based on the Runge-Kutta methods and Snell's law of refraction to achieve high tracing precision. On refractive-index boundaries, we apply Snell's law of refraction and a contact-point search algorithm to ensure the accuracy of the simulation. Inside the cryogenic target, a combination of the Runge-Kutta methods and a self-adaptive step algorithm is employed for the computation. The original refractive index data, which are used to mesh the target, can be obtained by experimental measurement or from an a priori refractive-index distribution function. A finite-difference method is used to calculate the refractive-index gradient at the mesh nodes, and distance-weighted average interpolation is utilized to obtain the refractive index and its gradient at each point in space. In the simulation, we take an ideal ICF target, a Luneburg lens, and a graded-index rod as models to calculate the spot diagram and wavefront map. Comparison of the simulation results with Zemax shows that the improved ray-tracing algorithm based on the fourth-order Runge-Kutta methods and Snell's law of refraction exhibits high accuracy. The relative error of the spot diagram is 0.2%, and the peak-to-valley (PV) and root-mean-square (RMS) errors of the wavefront map are less than λ/35 and λ/100, respectively.
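The two ingredients described in the abstract above, Runge-Kutta stepping through a graded-index medium and Snell's law at refractive-index boundaries, can be sketched in a few lines. This is an illustrative reconstruction under generic assumptions, not the authors' code; the function names and the test media are invented for the example.

```python
import numpy as np

def rk4_ray_step(r, T, n_func, grad_func, h):
    """One fourth-order Runge-Kutta step of the ray equations
    dr/dt = T, dT/dt = n * grad(n), where t is the path parameter
    with ds = n dt (a standard form for graded-index ray tracing)."""
    def deriv(r, T):
        return T, n_func(r) * grad_func(r)
    k1r, k1T = deriv(r, T)
    k2r, k2T = deriv(r + 0.5*h*k1r, T + 0.5*h*k1T)
    k3r, k3T = deriv(r + 0.5*h*k2r, T + 0.5*h*k2T)
    k4r, k4T = deriv(r + h*k3r, T + h*k3T)
    return (r + (h/6.0)*(k1r + 2*k2r + 2*k3r + k4r),
            T + (h/6.0)*(k1T + 2*k2T + 2*k3T + k4T))

def snell_refract(d, normal, n1, n2):
    """Vector form of Snell's law at an index discontinuity.
    d: unit ray direction; normal: unit normal pointing back into
    medium 1. Returns None on total internal reflection."""
    mu = n1 / n2
    cos_i = -np.dot(normal, d)
    sin2_t = mu**2 * (1.0 - cos_i**2)
    if sin2_t > 1.0:
        return None
    cos_t = np.sqrt(1.0 - sin2_t)
    return mu * d + (mu * cos_i - cos_t) * normal

# demo 1: homogeneous medium (n = 1) -> the stepper keeps the ray straight
r, T = np.zeros(3), np.array([1.0, 0.0, 0.0])
for _ in range(100):
    r, T = rk4_ray_step(r, T, lambda _: 1.0, lambda _: np.zeros(3), 0.01)

# demo 2: a 45-degree ray refracted from n = 1.0 into n = 1.5
t_dir = snell_refract(np.array([1.0, 0.0, -1.0]) / np.sqrt(2.0),
                      np.array([0.0, 0.0, 1.0]), 1.0, 1.5)
```

In a homogeneous medium the stepper reproduces a straight line, and at the interface `snell_refract` reproduces the expected sine ratio sin θt = (n1/n2) sin θi.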
Tropospheric Refraction Modeling Using Ray-Tracing and Parabolic Equation
P. Pechac
2005-12-01
Refraction phenomena that occur in the lower atmosphere significantly influence the performance of wireless communication systems. This paper provides an overview of the corresponding computational methods. Basic properties of the lower atmosphere are mentioned. Practical guidelines for radiowave propagation modeling in the lower atmosphere using ray-tracing and parabolic equation methods are given. In addition, a calculation of angle-of-arrival spectra is introduced for multipath propagation simulations.
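One of the basic lower-atmosphere properties such overviews rely on is the standard refractivity gradient, usually folded into an effective Earth-radius factor k. The sketch below uses standard textbook formulas (not taken from the paper); the "standard atmosphere" gradient of about -39 N-units/km is the usual assumption behind k ≈ 4/3.

```python
import math

EARTH_RADIUS_M = 6_371_000.0

def k_factor(dn_dh):
    """Effective-Earth-radius factor k = 1 / (1 + a * dn/dh),
    where dn/dh is the vertical refractive-index gradient in 1/m
    and a is the true Earth radius."""
    return 1.0 / (1.0 + EARTH_RADIUS_M * dn_dh)

def radio_horizon_m(h_ant_m, k=4.0/3.0):
    """Distance to the radio horizon over a smooth effective Earth
    of radius k*a, for an antenna at height h_ant_m."""
    return math.sqrt(2.0 * k * EARTH_RADIUS_M * h_ant_m)

k_std = k_factor(-39e-9)     # standard gradient: -39 N-units/km = -39e-9 per m
d30 = radio_horizon_m(30.0)  # horizon for a 30 m antenna
```

For the standard gradient this recovers the familiar k ≈ 4/3, and a 30 m antenna sees a radio horizon of roughly 22-23 km.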
Ray Tracing Modelling of Reflector for Vertical Bifacial Panel
Jakobsen, Michael Linde; Thorsteinsson, Sune; Poulsen, Peter Behrensdorff
2016-01-01
Bifacial solar panels have recently become a new attractive building block for PV systems. In this work we propose a reflector system for a vertical bifacial panel, and use ray-tracing modelling to model its performance. In particular, we investigate the impact of filling the reflector volume with a refractive medium, and show that the refractive medium improves the reflector performance, since it directs almost all the light incident on the entrance plane into the PV panel.
The Search for Efficiency in Arboreal Ray Tracing Applications
van Leeuwen, M.; Disney, M.; Chen, J. M.; Gomez-Dans, J.; Kelbe, D.; van Aardt, J. A.; Lewis, P.
2016-12-01
Forest structure significantly impacts a range of abiotic conditions, including humidity and the radiation regime, all of which affect the rate of net and gross primary productivity. Current forest productivity models typically consider abstract media to represent the transfer of radiation within the canopy. Examples include the representation of forest structure via a layered canopy model, where leaf area and inclination angles are stratified with canopy depth, or as turbid media where leaves are randomly distributed within space or within confined geometric solids such as blocks, spheres or cones. While these abstract models are known to produce accurate estimates of primary productivity at the stand level, their limited geometric resolution restricts applicability at fine spatial scales, such as the cell, leaf or shoot levels, thereby not addressing the full potential of assimilation of data from laboratory and field measurements with that of remote sensing technology. Recent research efforts have explored the use of laser scanning to capture detailed tree morphology at millimeter accuracy. These data can subsequently be used to combine ray tracing with primary productivity models, providing an ability to explore trade-offs among different morphological traits or assimilate data from spatial scales, spanning the leaf- to the stand level. Ray tracing has a major advantage of allowing the most accurate structural description of the canopy, and can directly exploit new 3D structural measurements, e.g., from laser scanning. However, the biggest limitation of ray tracing models is their high computational cost, which currently limits their use for large-scale applications. In this talk, we explore ways to more efficiently exploit ray tracing simulations and capture this information in a readily computable form for future evaluation, thus potentially enabling large-scale first-principles forest growth modelling applications.
Ray tracing reconstruction investigation for C-arm tomosynthesis
Malalla, Nuhad A. Y.; Chen, Ying
2016-04-01
C-arm tomosynthesis is a three-dimensional imaging technique in which both the x-ray source and the detector are mounted on a wheeled C-arm structure to provide a wide variety of movement around the object. In this paper, C-arm tomosynthesis is introduced to provide three-dimensional information over a limited view angle (less than 180°) to reduce radiation exposure and examination time. Reconstruction algorithms based on ray tracing, such as ray-tracing back projection (BP), the simultaneous algebraic reconstruction technique (SART), and maximum likelihood expectation maximization (MLEM), were developed for C-arm tomosynthesis. C-arm tomosynthesis projection images of a simulated spherical object were generated with a virtual geometric configuration with a total view angle of 40 degrees. This study demonstrates the sharpness of in-plane reconstructed structure and the effectiveness of removing out-of-plane blur for each reconstruction algorithm. The results show the ability of ray-tracing-based reconstruction algorithms to provide three-dimensional information with limited-angle C-arm tomosynthesis.
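Of the reconstruction families named in the abstract above, the algebraic one is the simplest to sketch: each measured ray defines a hyperplane, and the image estimate is repeatedly projected onto those hyperplanes. The toy system below (a hypothetical 2x2 image probed by row and column sums) illustrates Kaczmarz-style ART; it is not the paper's implementation.

```python
import numpy as np

def art_reconstruct(A, b, n_iter=200, relax=1.0):
    """Algebraic reconstruction (Kaczmarz sweeps): for each ray i,
    project the current estimate onto the hyperplane A[i] . x = b[i].
    Starting from zero, this converges to the minimum-norm solution
    of a consistent system."""
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        for i in range(A.shape[0]):
            ai = A[i]
            denom = ai @ ai
            if denom > 0.0:
                x += relax * (b[i] - ai @ x) / denom * ai
    return x

# toy 2x2 "image" [x0 x1; x2 x3] probed by 4 rays: 2 row sums, 2 column sums
A = np.array([[1., 1., 0., 0.],
              [0., 0., 1., 1.],
              [1., 0., 1., 0.],
              [0., 1., 0., 1.]])
x_true = np.array([1., 2., 3., 4.])
b = A @ x_true            # noiseless "projection data"
x_rec = art_reconstruct(A, b)
```

For this particular system the minimum-norm solution coincides with `x_true`, so the sweeps recover the image exactly; with noisy data a relaxation factor below 1 is the usual choice.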
Bill2d -- a software package for classical two-dimensional Hamiltonian systems
Solanpää, Janne; Räsänen, Esa
2016-01-01
We present Bill2d, a modern and efficient C++ package for classical simulations of two-dimensional Hamiltonian systems. Bill2d can be used for various billiard and diffusion problems with one or more charged particles with interactions, different external potentials, an external magnetic field, periodic and open boundaries, etc. The software package can also calculate many key quantities in complex systems such as Poincaré sections, survival probabilities, and diffusion coefficients. While aiming at a large class of applicable systems, the code also strives for ease-of-use, efficiency, and modularity for the implementation of additional features. The package comes along with a user guide, a developer's manual, and a documentation of the application program interface (API).
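A Poincaré section of the kind this package computes can be sketched compactly: integrate the Hamiltonian flow and record the phase-space point at each upward crossing of a section plane. The example below uses two uncoupled harmonic oscillators, an invented test case (not from the package) chosen because its section points verifiably lie on a circle.

```python
import numpy as np

def poincare_section(x0, px0, y0, py0, omega=np.sqrt(2.0),
                     dt=1e-3, t_max=100.0):
    """Integrate H = (px^2 + py^2)/2 + (x^2 + omega^2 y^2)/2 with
    velocity-Verlet (leapfrog) and record (x, px) at each upward
    crossing of the section plane y = 0 (py > 0)."""
    def force(x, y):
        return -x, -(omega**2) * y
    x, y, px, py = x0, y0, px0, py0
    fx, fy = force(x, y)
    pts = []
    for _ in range(int(t_max / dt)):
        px_h, py_h = px + 0.5*dt*fx, py + 0.5*dt*fy   # half kick
        x_new, y_new = x + dt*px_h, y + dt*py_h       # drift
        fx, fy = force(x_new, y_new)
        px_new, py_new = px_h + 0.5*dt*fx, py_h + 0.5*dt*fy  # half kick
        if y < 0.0 <= y_new and py_new > 0.0:         # upward crossing
            s = -y / (y_new - y)                      # linear interpolation
            pts.append((x + s*(x_new - x), px + s*(px_new - px)))
        x, y, px, py = x_new, y_new, px_new, py_new
    return np.array(pts)

section = poincare_section(x0=1.0, px0=0.0, y0=0.5, py0=0.0)
```

Because the x-mode energy (px^2 + x^2)/2 is separately conserved here, every recorded point lies on the circle x^2 + px^2 = 1; for a chaotic system the same routine would instead scatter points over an area.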
BILL2D - A software package for classical two-dimensional Hamiltonian systems
Solanpää, J.; Luukko, P. J. J.; Räsänen, E.
2016-02-01
We present BILL2D, a modern and efficient C++ package for classical simulations of two-dimensional Hamiltonian systems. BILL2D can be used for various billiard and diffusion problems with one or more charged particles with interactions, different external potentials, an external magnetic field, periodic and open boundaries, etc. The software package can also calculate many key quantities in complex systems such as Poincaré sections, survival probabilities, and diffusion coefficients. While aiming at a large class of applicable systems, the code also strives for ease-of-use, efficiency, and modularity for the implementation of additional features. The package comes along with a user guide, a developer's manual, and a documentation of the application program interface (API).
User's Guide for the MapImage Reprojection Software Package, Version 1.01
Finn, Michael P.; Trent, Jason R.
2004-01-01
Scientists routinely accomplish small-scale geospatial modeling in the raster domain, using high-resolution datasets (such as 30-m data) for large parts of continents and low-resolution to high-resolution datasets for the entire globe. Recently, Usery and others (2003a) expanded on the previously limited empirical work with real geographic data by compiling and tabulating the accuracy of categorical areas in projected raster datasets of global extent. Geographers and applications programmers at the U.S. Geological Survey's (USGS) Mid-Continent Mapping Center (MCMC) undertook an effort to expand and evolve an internal USGS software package, MapImage, or mapimg, for raster map projection transformation (Usery and others, 2003a). Daniel R. Steinwand of Science Applications International Corporation, Earth Resources Observation Systems Data Center in Sioux Falls, S. Dak., originally developed mapimg for the USGS, basing it on the USGS's General Cartographic Transformation Package (GCTP). It operated as a command line program on the Unix operating system. Through efforts at MCMC, and in coordination with Mr. Steinwand, this program has been transformed from an application based on a command line into a software package based on a graphical user interface for Windows, Linux, and Unix machines. Usery and others (2003b) pointed out that many commercial software packages do not use exact projection equations and that even when exact projection equations are used, the software often results in error and sometimes does not complete the transformation for specific projections, at specific resampling resolutions, and for specific singularities. Direct implementation of point-to-point transformation with appropriate functions yields the variety of projections available in these software packages, but implementation with data other than points requires specific adaptation of the equations or prior preparation of the data to allow the transformation to succeed. Additional
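Point-to-point transformation with exact projection equations, as discussed above, is straightforward for a sphere-based projection: apply the forward equations point by point, or the inverse equations when resampling a raster output grid. The sketch below uses the sinusoidal (equal-area) projection; the sphere radius is the authalic value commonly used for USGS sinusoidal grids, an assumption for the example rather than a value taken from the user's guide.

```python
import math

R = 6_371_007.181  # authalic sphere radius in metres (assumed, typical of
                   # USGS/MODIS sinusoidal grids)

def sinusoidal_forward(lon_deg, lat_deg):
    """Exact forward sinusoidal projection: geographic -> map metres.
    x = R * lon * cos(lat), y = R * lat (angles in radians)."""
    lon, lat = math.radians(lon_deg), math.radians(lat_deg)
    return R * lon * math.cos(lat), R * lat

def sinusoidal_inverse(x, y):
    """Exact inverse: map metres -> geographic degrees. For raster
    reprojection, each output cell centre would be inverse-projected
    and the input grid sampled at the resulting coordinate."""
    lat = y / R
    lon = x / (R * math.cos(lat))
    return math.degrees(lon), math.degrees(lat)

x, y = sinusoidal_forward(-93.5, 41.2)   # a point in the central U.S.
lon_rt, lat_rt = sinusoidal_inverse(x, y)
```

With exact equations the forward/inverse round trip closes to machine precision, which is precisely the property the text notes is lost in packages that approximate the projection equations.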
A welding document management software package based on a Client/Server structure
魏艳红; 杨春利; 王敏
2003-01-01
According to the specifications for welding procedure qualification of ASME Section IX and the Chinese code JB 4708-2000, a software package for managing welding documents has been rebuilt. The new software package can be used in a Local Area Network (LAN) with 4 different levels of authority for different users. Therefore, the welding documents, including DWPS (Design for Welding Procedure Specifications), PQRs (Procedure Qualification Records) and WPS (Welding Procedure Specifications), can be shared within a company. At the same time, the system provides users with various functions such as browsing, copying, editing, searching and printing records, and helps users decide whether a new PQR test is necessary according to the codes above. Furthermore, super users can also browse the history of record modification and retrieve the records when needed.
High performance computing software package for multitemporal Remote-Sensing computations
Asaad Chahboun
2010-10-01
With the huge volume of satellite data currently stored, multitemporal remote sensing study is nowadays one of the most challenging fields of computer science. Multicore hardware support and multithreading can play an important role in speeding up algorithm computations. In the present paper, a software package, the Multitemporal Software Package for Satellite Remote Sensing data (MSP.SRS), has been developed for the multitemporal treatment of satellite remote sensing images in a standard format. For portability, the interface was developed using the Qt application framework and the core was developed as C++ classes. MSP.SRS can run under different operating systems (e.g., Linux, Mac OS X, Windows, Embedded Linux, Windows CE, etc.). Final benchmark results, using multiple remote sensing biophysical indices, show a speed-up of up to 6X on a quad-core i7 personal computer.
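The parallel pattern behind such multitemporal speed-ups is simple: the same per-date band computation is mapped over the acquisition dates of the stack. The sketch below illustrates it with NDVI on synthetic arrays; the index choice and data are generic, not MSP.SRS code, and threads are used here purely for illustration.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def ndvi(bands):
    """NDVI = (NIR - red) / (NIR + red) for one acquisition date;
    the small epsilon avoids division by zero over no-data pixels."""
    red, nir = bands
    return (nir - red) / (nir + red + 1e-12)

def ndvi_time_series(dates, workers=4):
    """Map the per-date index computation over a multitemporal stack
    in parallel, one task per acquisition date."""
    with ThreadPoolExecutor(max_workers=workers) as ex:
        return list(ex.map(ndvi, dates))

# three synthetic 2x2 acquisitions: red reflectance rises over time
stack = [(np.full((2, 2), 0.1 * (t + 1)), np.full((2, 2), 0.5))
         for t in range(3)]
series = ndvi_time_series(stack)
```

For large rasters a process pool (or NumPy's internal parallelism) would be the usual choice; the mapping structure is identical.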
The GeoSteiner software package for computing Steiner trees in the plane
Juhl, Daniel; Warme, David M.; Winter, Pawel;
The GeoSteiner software package has for more than 10 years been the fastest (publicly available) program for computing exact solutions to Steiner tree problems in the plane. The computational study by Warme, Winter and Zachariasen, published in 2000, documented the performance of the GeoSteiner approach, allowing the exact solution of Steiner tree problems with more than a thousand terminals. Since then, a number of algorithmic enhancements have improved the performance of the software package significantly. In this computational study we run the current code on the largest problem instances from the 2000 study, and on a number of larger problem instances. The computational study is performed using both the publicly available GeoSteiner 3.1 code base and the commercial GeoSteiner 4.0 code base.
Determination of stress-strain state of the wooden church log walls with software package
Chulkova Anastasia
2016-01-01
The restoration of architectural monuments is under way all over the world today. The main aim of restoration is to restore the stable functioning of building structures in a normal state. In this article, we use specialized software to determine the bearing capacity of the log walls of the Church of the Transfiguration on Kizhi Island. As the research results show, determination of the stress-strain state with a software package is necessary for the bearing-capacity computation, alongside field tests.
PsyToolkit: a software package for programming psychological experiments using Linux.
Stoet, Gijsbert
2010-11-01
PsyToolkit is a set of software tools for programming psychological experiments on Linux computers. Given that PsyToolkit is freely available under the Gnu Public License, open source, and designed such that it can easily be modified and extended for individual needs, it is suitable not only for technically oriented Linux users, but also for students, researchers on small budgets, and universities in developing countries. The software includes a high-level scripting language, a library for the programming language C, and a questionnaire presenter. The software easily integrates with other open source tools, such as the statistical software package R. PsyToolkit is designed to work with external hardware (including IoLab and Cedrus response keyboards and two common digital input/output boards) and to support millisecond timing precision. Four in-depth examples explain the basic functionality of PsyToolkit. Example 1 demonstrates a stimulus-response compatibility experiment. Example 2 demonstrates a novel mouse-controlled visual search experiment. Example 3 shows how to control light emitting diodes using PsyToolkit, and Example 4 shows how to build a light-detection sensor. The last two examples explain the electronic hardware setup such that they can even be used with other software packages.
Wi, S.; Ray, P. A.; Brown, C.
2015-12-01
A software package developed to facilitate building distributed hydrologic models in a modular modeling system is presented. The software package provides a user-friendly graphical user interface that eases its practical use in water resources-related research and practice. The modular modeling system organizes the options available to users when assembling models according to the stages of hydrological cycle, such as potential evapotranspiration, soil moisture accounting, and snow/glacier melting processes. The software is intended to be a comprehensive tool that simplifies the task of developing, calibrating, validating, and using hydrologic models through the inclusion of intelligent automation to minimize user effort, and reduce opportunities for error. Processes so far automated include the definition of system boundaries (i.e., watershed delineation), climate and geographical input generation, and parameter calibration. Built-in post-processing toolkits greatly improve the functionality of the software as a decision support tool for water resources system management and planning. Example post-processing toolkits enable streamflow simulation at ungauged sites with predefined model parameters, and perform climate change risk assessment by means of the decision scaling approach. The software is validated through application to watersheds representing a variety of hydrologic regimes.
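The soil-moisture accounting stage mentioned above can be reduced to a single-bucket sketch: rain fills a storage, evapotranspiration and drainage empty it, and saturation excess spills as runoff. The structure and parameter names below are generic illustrations of such a module, not the package's actual model.

```python
def bucket_model(precip, pet, capacity=100.0, k=0.05, s0=50.0):
    """Single-bucket soil-moisture accounting (all fluxes in mm/step).
    Each step: add precipitation, remove ET (limited by storage),
    spill saturation excess above `capacity`, then drain baseflow
    linearly at rate `k`. Returns (runoff series, final storage)."""
    s, runoff = s0, []
    for p, e in zip(precip, pet):
        s += p
        s -= min(e, s)                   # actual ET cannot exceed storage
        spill = max(s - capacity, 0.0)   # saturation-excess overflow
        s -= spill
        q = k * s                        # linear baseflow drainage
        s -= q
        runoff.append(spill + q)
    return runoff, s

recession, _ = bucket_model([0.0] * 10, [0.0] * 10)  # dry-weather recession
storm, s_end = bucket_model([200.0], [0.0])          # one large storm step
```

In a modular system each such process (PET, accounting, snowmelt) would be a swappable component behind a common interface; here the dry-weather recession decays geometrically by the factor (1 - k) per step, as expected of a linear reservoir.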
Spruyt, Adriaan; Clarysse, Jeroen; Vansteenwegen, Debora; Baeyens, Frank; Hermans, Dirk
2010-01-01
We describe Affect 4.0, a user-friendly software package for implementing psychological and psychophysiological experiments. Affect 4.0 can be used to present visual, acoustic, and/or tactile stimuli in highly complex (i.e., semirandomized and response-contingent) sequences. Affect 4.0 is capable of registering response latencies and analog behavioral input with millisecond accuracy. Affect 4.0 is available free of charge.
Simulation of combustion products flow in the Laval nozzle in the software package SIFIN
Alhussan, K. A.; Teterev, A. V.
2017-07-01
We have developed the specialized multifunctional software package SIFIN (Simulation of Internal Flow In the Nozzle) for the numerical simulation of the flow of combustion products in a Laval nozzle. It allows the user to design different nozzle profiles, to simulate the flow of multicomponent media with energy release from combustion, to study the effect of swirl in the flow of combustion products on the nozzle parameters, and to investigate the behavior of the exhaust gas jet at varying pressure ratios.
Edwards, T.B.
2001-01-16
The purpose of this report is to provide software verification and validation (v and v) for the statistical packages utilized by the Statistical Consulting Section (SCS) of the Savannah River Technology Center (SRTC). The need for this v and v stems from the requirements of the Quality Assurance (QA) programs that are frequently applicable to the work conducted by SCS. This document is designed to comply with software QA requirements specified in the 1Q Manual Quality Assurance Procedure 20-1, Revision 6. Revision 1 of this QA plan adds JMP Version 4 to the family of (commercially-available) statistical tools utilized by SCS. JMP Version 3.2.2 is maintained as a support option due to features unique to this version of JMP that have not as yet been incorporated into Version 4. SCS documents that include JMP output should provide a clear indication of the version or versions of JMP that were used. The IBM Personal Computer 300PL and 300XL are both Pentium II based desktops. Therefore, the software verification and validation in this report is valid interchangeably between both platforms. As new computing platforms, statistical packages, or revisions to existing packages are introduced into the Statistical Consulting Section, the appropriate problems from this report are to be re-evaluated, and this report is to be revised to address their verification and validation.
Gizzatkulov Nail M
2010-08-01
Background: Systems biology research and applications require the creation, validation, and extensive use of mathematical models, and visualization of simulation results by end-users. Our goal is to develop a novel method for visualization of simulation results and implement it in simulation software equipped with sophisticated mathematical and computational techniques for model development, verification and parameter fitting. Results: We present the mathematical simulation workbench DBSolve Optimum, a significantly improved and extended successor of the well-known simulation software DBSolve5. A concept of "dynamic visualization" of simulation results has been developed and implemented in DBSolve Optimum. In the framework of this concept, graphical objects representing metabolite concentrations and reactions change their volume and shape in accordance with simulation results. This technique is applied to visualize both the kinetic response of the model and the dependence of its steady state on a parameter. The use of dynamic visualization is illustrated with a kinetic model of the Krebs cycle. Conclusion: DBSolve Optimum is a user-friendly simulation software package that simplifies the construction, verification, analysis and visualization of kinetic models. The dynamic visualization tool implemented in the software allows the user to animate simulation results and, thereby, present them in a more comprehensible form. DBSolve Optimum and the built-in dynamic visualization module are free for both academic and commercial use. They can be downloaded directly from http://www.insysbio.ru.
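The kinetic models such a workbench simulates and animates reduce to systems of ODEs for metabolite concentrations. A generic mass-action chain A -> B -> C integrated with fourth-order Runge-Kutta (an invented minimal example, not the Krebs-cycle model from the paper) looks like this:

```python
import numpy as np

def simulate_chain(k1, k2, a0=1.0, dt=1e-3, t_max=10.0):
    """Integrate the mass-action chain A -k1-> B -k2-> C with RK4.
    Returns the trajectory of concentrations [A, B, C], one row
    per time step; total mass is conserved by construction."""
    def rhs(c):
        a, b, _ = c
        return np.array([-k1*a, k1*a - k2*b, k2*b])
    c = np.array([a0, 0.0, 0.0])
    traj = [c.copy()]
    for _ in range(int(t_max / dt)):
        s1 = rhs(c)
        s2 = rhs(c + 0.5*dt*s1)
        s3 = rhs(c + 0.5*dt*s2)
        s4 = rhs(c + dt*s3)
        c = c + (dt/6.0)*(s1 + 2*s2 + 2*s3 + s4)
        traj.append(c.copy())
    return np.array(traj)

traj = simulate_chain(k1=1.0, k2=0.5)
```

A "dynamic visualization" layer of the kind the paper describes would simply map each row of `traj` to the size of a graphical object per metabolite; here the chain has the closed-form check A(t) = exp(-t).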
Stefanski, Philip L.
2015-01-01
Commercially available software packages today allow users to quickly perform the routine evaluations of (1) descriptive statistics to numerically and graphically summarize both sample and population data, (2) inferential statistics that draws conclusions about a given population from samples taken of it, (3) probability determinations that can be used to generate estimates of reliability allowables, and finally (4) the setup of designed experiments and analysis of their data to identify significant material and process characteristics for application in both product manufacturing and performance enhancement. This paper presents examples of analysis and experimental design work that has been conducted using Statgraphics® statistical software to obtain useful information with regard to solid rocket motor propellants and internal insulation material. Data were obtained from a number of programs (Shuttle, Constellation, and Space Launch System) and sources that include solid propellant burn rate strands, tensile specimens, sub-scale test motors, full-scale operational motors, rubber insulation specimens, and sub-scale rubber insulation analog samples. Besides facilitating the experimental design process to yield meaningful results, statistical software has demonstrated its ability to quickly perform complex data analyses and yield significant findings that might otherwise have gone unnoticed. One caveat to these successes is that useful results not only derive from the inherent power of the software package, but also from the skill and understanding of the data analyst.
A Software Package Using a Mesh-grid Method for Simulating HPGe Detector Efficiencies
Kevin Jackman
2009-10-01
Traditional ways of determining the absolute full-energy peak efficiencies of high-purity germanium (HPGe) detectors are often time consuming, cost prohibitive, or not feasible. A software package, KMESS (Kevin’s Mesh Efficiency Simulator Software), was developed to assist in predicting these efficiencies. It uses a semiempirical mesh-grid method and works for arbitrary source shapes and counting geometries. The model assumes that any gamma-ray source shape can be treated as a large enough collection of point sources. The code is readily adaptable, has a web-based graphical front-end, and could easily be coupled to a 3D scanner. As will be shown, this software can estimate absolute full-energy peak efficiencies with good accuracy in reasonable computation times. It has applications to the field of gamma-ray spectroscopy because it is a quick and accurate way to assist in performing quantitative analyses using HPGe detectors.
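The core assumption stated above, that any source shape can be treated as a large enough collection of point sources, can be sketched directly: estimate each mesh point's geometric efficiency and average over the mesh. The sketch below is an illustrative reconstruction (function names invented, intrinsic detector efficiency ignored), not KMESS code; it uses Monte Carlo ray sampling so it can handle off-axis points where no closed-form solid angle exists.

```python
import numpy as np

rng = np.random.default_rng(7)  # fixed seed for a reproducible estimate

def point_geometric_efficiency(p, det_radius, det_z=0.0, n_rays=200_000):
    """Fraction of isotropically emitted rays from point p that hit a
    circular detector face of radius det_radius in the plane z = det_z."""
    u = rng.normal(size=(n_rays, 3))            # isotropic directions
    u /= np.linalg.norm(u, axis=1, keepdims=True)
    dz = det_z - p[2]
    going = u[:, 2] * dz > 0                    # headed toward the plane
    t = dz / u[going, 2]                        # distance to plane crossing
    hit_xy = p[:2] + t[:, None] * u[going, :2]
    hits = (hit_xy**2).sum(axis=1) <= det_radius**2
    return hits.sum() / n_rays

def mesh_efficiency(mesh_points, det_radius, det_z=0.0):
    """Mesh-grid estimate: average the point-source efficiencies over
    the mesh points sampling the extended source."""
    return np.mean([point_geometric_efficiency(p, det_radius, det_z)
                    for p in mesh_points])

# sanity check: a single on-axis point has the closed-form geometric
# efficiency 0.5 * (1 - d / sqrt(d^2 + a^2))
eff = mesh_efficiency([np.array([0.0, 0.0, 10.0])], det_radius=5.0)
analytic = 0.5 * (1.0 - 10.0 / np.sqrt(10.0**2 + 5.0**2))
```

The on-axis check gives a direct handle on the Monte Carlo error; for a real source the mesh would sample the container volume and could be weighted by self-attenuation.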
A software package using a mesh-grid method for simulating HPGe detector efficiencies
Gritzo, Russell E [Los Alamos National Laboratory]; Jackman, Kevin R [REMOTE SENSING LAB]; Biegalski, Steven R [UT AUSTIN]
2009-01-01
Traditional ways of determining the absolute full-energy peak efficiencies of high-purity germanium (HPGe) detectors are often time consuming, cost prohibitive, or not feasible. A software package, KMESS (Kevin's Mesh Efficiency Simulator Software), was developed to assist in predicting these efficiencies. It uses a semiempirical mesh-grid method and works for arbitrary source shapes and counting geometries. The model assumes that any gamma-ray source shape can be treated as a large enough collection of point sources. The code is readily adaptable, has a web-based graphical front-end, and could easily be coupled to a 3D scanner. As will be shown, this software can estimate absolute full-energy peak efficiencies with good accuracy in reasonable computation times. It has applications to the field of gamma-ray spectroscopy because it is a quick and accurate way to assist in performing quantitative analyses using HPGe detectors.
Software package for the design and analysis of DNA origami structures
Andersen, Ebbe Sloth; Nielsen, Morten Muhlig; Dong, Mingdong
A software package was developed for the semi-automated design of DNA origamis and further data analysis of Atomic Force Microscopy (AFM) images. As an example, we design the shape of a bottlenose dolphin and analyze it by means of high resolution AFM imaging. A high yield of DNA dolphins was observed on the mica surface, with a fraction of the dolphin nanostructures showing extensive tail flexibility of approximately 90 degrees. The Java editor and tools are free software distributed under the GNU license. The open architecture of the editor makes it easy for the scientific community to contribute new tools and functionalities. Documentation, tutorials and software will be made available online.
SWISTRACK - AN OPEN SOURCE, SOFTWARE PACKAGE APPLICABLE TO TRACKING OF FISH LOCOMOTION AND BEHAVIOUR
Steffensen, John Fleng
2010-01-01
...including swimming speed, acceleration and directionality of movements, as well as the examination of locomotory patterns during swimming. SwisTrack, a free and downloadable software package (available from www.sourceforge.com), is widely used for tracking robots, humans and other animals. Accordingly, SwisTrack can be easily adopted for the tracking of fish. Benefits associated with the free software include: • Contrast or marker based tracking, enabling tracking of either the whole animal or tagged marks placed upon the animal • The ability to track multiple tags placed upon an individual animal • Highly effective background subtraction algorithms and filters ensuring smooth tracking of fish • Application of tags of different colour enables the software to track multiple fish without the problem of track exchange between individuals • Low processing requirements enable tracking in real-time • Further...
Anukalpana 2.0: A Performance Evaluation Software Package for Akash Surface to Air Missile System
G.S. Raju
1997-07-01
An air defence system is a complex dynamic system comprising sensors, control centres, launchers and missiles. Practical evaluation of such a complex system is almost impossible and very expensive. Further, during development of the system, there is a necessity to evaluate certain design characteristics before it is implemented. Consequently, the need arises for a comprehensive simulation package which will simulate various subsystems of the air defence weapon system, so that performance of the system can be evaluated. With the above objectives in mind, a software package, called Anukalpana 2.0, has been developed. The first version of the package was developed at the Indian Institute of Science, Bangalore. This program has been subsequently updated. The main objectives of this package are: (i) evaluation of the performance of the Akash air defence system and other similar air defence systems against any specified aerial threat, (ii) investigation of the effectiveness of the deployment tactics and operational logic employed at the firing batteries and refining them, (iii) provision of aid for refining standard operating procedures (SOPs) for multitarget defence, and (iv) exploring the possibility of using it as a user training tool at the level of Air Defence Commanders. The design specification and the simulation/modelling philosophy adopted for the development of this package are discussed at length. Since the Akash air defence system has many probabilistic events, the Monte Carlo method of simulation is used for both threat and defence. Implementation details of the package are discussed in brief. These include data flow diagrams and interface details. Analysis of results for certain input cases is also covered.
Microseismic network design assessment based on 3D ray tracing
Näsholm, Sven Peter; Wuestefeld, Andreas; Lubrano-Lavadera, Paul; Lang, Dominik; Kaschwich, Tina; Oye, Volker
2016-04-01
There is increasing demand on the versatility of microseismic monitoring networks. In early projects, being able to locate any triggers was considered a success. These early successes led to a better understanding of how to extract value from microseismic results. Today operators, regulators, and service providers work closely together in order to find the optimum network design to meet various requirements. In the current study we demonstrate an integrated and streamlined network capability assessment approach. It is intended for use during the microseismic network design process prior to installation. The assessments are derived from 3D ray tracing between a grid of event points and the sensors. Three aspects are discussed: 1) magnitude of completeness or detection limit; 2) event location accuracy; and 3) ground-motion hazard. The network capability parameters 1) and 2) are estimated at all hypothetical event locations and are presented in the form of maps for a given seismic sensor coordinate scenario. In addition, the ray tracing traveltimes permit estimation of the point-spread functions (PSFs) at the event grid points. PSFs are useful in assessing the resolution and focusing capability of the network for stacking-based event location and imaging methods. We estimate the performance for a hypothetical network case with 11 sensors. We consider the well-documented region around the San Andreas Fault Observatory at Depth (SAFOD) located north of Parkfield, California. The ray tracing is done through a detailed velocity model which covers a 26.2 by 21.2 km wide area around the SAFOD drill site with a resolution of 200 m for both the P- and S-wave velocities. Systematic network capability assessment for different sensor site scenarios prior to installation facilitates finding a final design which meets the survey objectives.
Simplifying numerical ray tracing for characterization of optical systems.
Gagnon, Yakir Luc; Speiser, Daniel I; Johnsen, Sönke
2014-07-20
Ray tracing, a computational method for tracing the trajectories of rays of light through matter, is often used to characterize mechanical or biological visual systems with aberrations that are larger than the effect of diffraction inherent in the system. For example, ray tracing may be used to calculate geometric point spread functions (PSFs), which describe the image of a point source after it passes through an optical system. Calculating a geometric PSF is useful because it gives an estimate of the detail and quality of the image formed by a given optical system. However, when using ray tracing to calculate a PSF, the accuracy of the estimated PSF directly depends on the number of discrete rays used in the calculation; higher accuracies may require more computational power. Furthermore, adding optical components to a modeled system will increase its complexity and require critical modifications so that the model will describe the system correctly, sometimes necessitating a completely new model. Here, we address these challenges by developing a method that represents rays of light as a continuous function that depends on the light's initial direction. By utilizing Chebyshev approximations (via the chebfun toolbox in MATLAB) for the implementation of this method, we greatly simplified the calculations for the location and direction of the rays. This method provides high precision and fast calculation speeds that allow the characterization of any symmetrical optical system (with a centered point source) in an analytical-like manner. Next, we demonstrate our methods by showing how they can easily calculate PSFs for complicated optical systems that contain multiple refractive and/or reflective interfaces.
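The idea of representing a ray as a continuous function of its launch angle can be imitated outside chebfun. The sketch below fits a Chebyshev series (via NumPy rather than MATLAB) to a toy paraxial thin-lens system; the focal length, distances, and fit degree are arbitrary choices for illustration, not values from the paper:

```python
import numpy as np
from numpy.polynomial import chebyshev as C

F, Z, S = 50.0, 55.0, 100.0  # focal length, image plane, source distance (mm)

def trace(theta):
    """Ray height at plane Z for a ray leaving an on-axis point source at
    distance S with launch angle theta (paraxial thin lens, vectorized)."""
    h = S * np.tan(theta)        # height where the ray meets the lens
    theta_out = theta - h / F    # thin-lens bending of the ray
    return h + Z * np.tan(theta_out)

# Sample the ray mapping once, fit a Chebyshev series over the entrance
# cone, then treat the ray as a continuous function of launch angle.
theta = np.linspace(-0.1, 0.1, 33)
ray = C.Chebyshev.fit(theta, trace(theta), deg=15, domain=[-0.1, 0.1])
print(abs(ray(0.05) - trace(0.05)) < 1e-8)
```

Because the fitted series is an ordinary polynomial object, geometric quantities such as a PSF can then be derived from it analytically rather than by casting ever more discrete rays.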
Trans-Ionospheric High Frequency Signal Ray Tracing
Wright, S.; Gillespie, R. J.
2012-09-01
All electromagnetic radiation undergoes refraction as it propagates through the atmosphere. Tropospheric refraction is largely governed by interaction of the radiation with bounded electrons; ionospheric refraction is primarily governed by free electron interactions. The latter phenomenon is important for propagation and refraction of High Frequency (HF) through Extremely High Frequency (EHF) signals. The degree to which HF to EHF signals are bent is dependent upon the integrated refractive effect of the ionosphere: a result of the signal's angle of incidence with the boundaries between adjacent ionospheric regions, the magnitude of change in electron density between two regions, as well as the frequency of the signal. In the case of HF signals, the ionosphere may bend the signal so much that it is directed back down towards the Earth, making over-the-horizon HF radio communication possible. Ionospheric refraction is a major challenge for space-based geolocation applications, where the ionosphere is typically the biggest contributor to geolocation error. Accurate geolocation requires an algorithm that accurately reflects the physical process of a signal transiting the ionosphere, and an accurate specification of the ionosphere at the time of the signal transit. Currently implemented solutions are limited by both the algorithm chosen to perform the ray trace and by the accuracy of the ionospheric data used in the calculations. This paper describes a technique for adapting a ray tracing algorithm to run on a General-Purpose Graphics Processing Unit (GPGPU or GPU), and using a physics-based model specifying the ionosphere at the time of signal transit. This technique allows simultaneous geolocation of significantly more signals than an equivalently priced Central Processing Unit (CPU) based system. Additionally, because this technique makes use of the most widely accepted numeric algorithm for ionospheric ray tracing and a timely physics-based model of the ionosphere
The libRadtran software package for radiative transfer calculations (version 2.0.1)
Emde, Claudia; Buras-Schnell, Robert; Kylling, Arve; Mayer, Bernhard; Gasteiger, Josef; Hamann, Ulrich; Kylling, Jonas; Richter, Bettina; Pause, Christian; Dowling, Timothy; Bugliaro, Luca
2016-05-01
libRadtran is a widely used software package for radiative transfer calculations. It allows one to compute (polarized) radiances, irradiance, and actinic fluxes in the solar and thermal spectral regions. libRadtran has been used for various applications, including remote sensing of clouds, aerosols and trace gases in the Earth's atmosphere, climate studies, e.g., for the calculation of radiative forcing due to different atmospheric components, for UV forecasting, the calculation of photolysis frequencies, and for remote sensing of other planets in our solar system. The package has been described in Mayer and Kylling (2005). Since then several new features have been included, for example polarization, Raman scattering, a new molecular gas absorption parameterization, and several new parameterizations of cloud and aerosol optical properties. Furthermore, a graphical user interface is now available, which greatly simplifies the usage of the model, especially for new users. This paper gives an overview of libRadtran version 2.0.1 with a focus on new features. Applications including these new features are provided as examples of use. A complete description of libRadtran and all its input options is given in the user manual included in the libRadtran software package, which is freely available at http://www.libradtran.org.
GET_HOMOLOGUES, a versatile software package for scalable and robust microbial pangenome analysis.
Contreras-Moreira, Bruno; Vinuesa, Pablo
2013-12-01
GET_HOMOLOGUES is an open-source software package that builds on popular orthology-calling approaches making highly customizable and detailed pangenome analyses of microorganisms accessible to nonbioinformaticians. It can cluster homologous gene families using the bidirectional best-hit, COGtriangles, or OrthoMCL clustering algorithms. Clustering stringency can be adjusted by scanning the domain composition of proteins using the HMMER3 package, by imposing desired pairwise alignment coverage cutoffs, or by selecting only syntenic genes. The resulting homologous gene families can be made even more robust by computing consensus clusters from those generated by any combination of the clustering algorithms and filtering criteria. Auxiliary scripts make the construction, interrogation, and graphical display of core genome and pangenome sets easy to perform. Exponential and binomial mixture models can be fitted to the data to estimate theoretical core genome and pangenome sizes, and high-quality graphics can be generated. Furthermore, pangenome trees can be easily computed and basic comparative genomics performed to identify lineage-specific genes or gene family expansions. The software is designed to take advantage of modern multiprocessor personal computers as well as computer clusters to parallelize time-consuming tasks. To demonstrate some of these capabilities, we survey a set of 50 Streptococcus genomes annotated in the Orthologous Matrix (OMA) browser as a benchmark case. The package can be downloaded at http://www.eead.csic.es/compbio/soft/gethoms.php and http://maya.ccg.unam.mx/soft/gethoms.php.
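The bidirectional best-hit (BBH) criterion mentioned above reduces to a small set intersection once per-genome best-hit tables are available. The gene names below are invented for illustration and are not from the GET_HOMOLOGUES benchmark:

```python
def bidirectional_best_hits(hits_ab, hits_ba):
    """Pair gene a (genome A) with gene b (genome B) only when each is the
    other's best hit in an all-vs-all sequence search."""
    return {(a, b) for a, b in hits_ab.items() if hits_ba.get(b) == a}

# Hypothetical best-hit tables from an all-vs-all protein search
hits_ab = {"geneA1": "geneB1", "geneA2": "geneB3", "geneA3": "geneB2"}
hits_ba = {"geneB1": "geneA1", "geneB2": "geneA3", "geneB3": "geneA9"}
print(sorted(bidirectional_best_hits(hits_ab, hits_ba)))
# geneA2 is dropped: its best hit geneB3 points back to a different gene
```

The package layers clustering algorithms and filters (synteny, alignment coverage, HMMER3 domain scans) on top of calls like this one; the mutual-best requirement itself is this simple.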
QED v 1.0: a software package for quantitative electron diffraction data treatment.
Belletti, D; Calestani, G; Gemmi, M; Migliori, A
2000-03-01
A new software package for quantitative electron diffraction data treatment of unknown structures is described. No "a priori" information is required by the package which is able to perform in successive steps the 2-D indexing of digitised diffraction patterns, the extraction of the intensity of the collected reflections and the 3-D indexing of all recorded patterns, giving as results the lattice parameters of the investigated structure and a series of data files (one for each diffraction pattern) containing the measured intensities and the relative e.s.d.s of the 3-D indexed reflections. The software package is mainly conceived for the treatment of diffraction patterns taken with a Gatan CCD Slow-Scan Camera, but it can also deal with generic digitised plates. The program is designed to extract intensity data suitable for structure solution techniques in electron crystallography. The integration routine is optimised for a correct background evaluation, a necessary condition to deal with weak spots of irregular shape and an intensity just above the background.
Rozanov, V. V.; Dinter, T.; Rozanov, A. V.; Wolanin, A.; Bracher, A.; Burrows, J. P.
2017-06-01
SCIATRAN is a comprehensive software package which is designed to model radiative transfer processes in the terrestrial atmosphere and ocean in the spectral range from the ultraviolet to the thermal infrared (0.18-40 μm). It accounts for multiple scattering processes, polarization, thermal emission and ocean-atmosphere coupling. The main goal of this paper is to present a recently developed version of SCIATRAN which accurately takes into account inelastic radiative processes in both the atmosphere and the ocean. In the scalar version of the coupled ocean-atmosphere radiative transfer solver presented by Rozanov et al. [61] we have implemented the simulation of rotational Raman scattering, vibrational Raman scattering, chlorophyll and colored dissolved organic matter fluorescence. In this paper we discuss and explain the numerical methods used in SCIATRAN to solve the scalar radiative transfer equation including trans-spectral processes, and demonstrate how some selected radiative transfer problems are solved using the SCIATRAN package. In addition we present selected comparisons of SCIATRAN simulations with published benchmark results, independent radiative transfer models, and various measurements from satellite, ground-based, and ship-borne instruments. The extended SCIATRAN software package along with a detailed User's Guide is made available for scientists and students, who are undertaking their own research typically at universities, via the web page of the Institute of Environmental Physics (IUP), University of Bremen: http://www.iup.physik.uni-bremen.de.
Photorealistic ray tracing to visualize automobile side mirror reflective scenes.
Lee, Hocheol; Kim, Kyuman; Lee, Gang; Lee, Sungkoo; Kim, Jingu
2014-10-20
We describe an interactive visualization procedure for determining the optimal surface of a special automobile side mirror, thereby removing the blind spot, without the need for feedback from the error-prone manufacturing process. If the horizontally progressive curvature distributions are set to the semi-mathematical expression for a free-form surface, the surface point set can then be derived through numerical integration. This is then converted to a NURBS surface while retaining the surface curvature. Then, reflective scenes from the driving environment can be virtually realized using photorealistic ray tracing, in order to evaluate how these reflected images would appear to drivers.
Adaptive image ray-tracing for astrophysical simulations
Parkin, E R
2010-01-01
A technique is presented for producing synthetic images from numerical simulations whereby the image resolution is adapted around prominent features. In so doing, adaptive image ray-tracing (AIR) improves the efficiency of a calculation by focusing computational effort where it is needed most. The results of test calculations show that a factor of >~ 4 speed-up, and a commensurate reduction in the number of pixels required in the final image, can be achieved compared to an equivalent calculation with a fixed resolution image.
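The adaptive idea, refining only where the coarse image shows a sharp feature, can be sketched as follows. The scene, grid size, refinement factor, and flagging rule are invented for illustration and are unrelated to the actual AIR implementation:

```python
import numpy as np

def scene(x, y):
    # Toy stand-in for a simulation image: a sharp-edged bright disc.
    return np.where(x * x + y * y < 0.25, 1.0, 0.0)

def adaptive_render(n=16, tol=0.5, sub=4):
    """Coarse pass over an n x n grid, then spend extra rays only in cells
    that sit next to a large brightness jump (a 'prominent feature')."""
    xs = np.linspace(-1.0, 1.0, n)
    coarse = scene(*np.meshgrid(xs, xs, indexing="ij"))
    jump_r = np.abs(np.diff(coarse, axis=0)) > tol  # jumps between rows
    jump_c = np.abs(np.diff(coarse, axis=1)) > tol  # jumps between columns
    flag = np.zeros((n, n), dtype=bool)
    flag[:-1, :] |= jump_r
    flag[1:, :] |= jump_r
    flag[:, :-1] |= jump_c
    flag[:, 1:] |= jump_c
    rays = n * n + int(flag.sum()) * sub * sub      # coarse + refined rays
    return flag, rays

flag, rays = adaptive_render()
uniform = (16 * 4) ** 2  # cost of uniformly refining every cell instead
print(rays, uniform, rays < uniform)
```

Only the cells touching the disc's edge get sub-rays, so the ray count stays well below that of a uniformly refined image, which is the source of the reported speed-up.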
The ray-tracing mapping operator in an asymmetric atmosphere
(no author listed)
2008-01-01
In a spherically symmetric atmosphere, the refractive index profile is retrieved from bending angle measurements through the Abel integral transform. As horizontal refractivity inhomogeneity becomes significant in the moist low atmosphere, the error in the refractivity profile obtained from Abel inversion reaches about 10%. One way to avoid this error is to directly assimilate bending angle profiles into numerical weather models. This paper discusses the 2D ray-tracing mapping operator for bending angle in an asymmetric atmosphere. Through simulated computations, the retrieval error of the refractivity under horizontal inhomogeneity is assessed. The step length of the fourth-order Runge-Kutta method is also tested.
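The fourth-order Runge-Kutta scheme used for the ray integration is standard; a minimal sketch (applied here to the test equation y' = y rather than an actual ray path) shows the expected behaviour when the step length is varied:

```python
import math

def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

def integrate(h):
    """Integrate y' = y from t = 0 to 1 (exact answer: e)."""
    t, y = 0.0, 1.0
    while t < 1.0 - 1e-12:
        y = rk4_step(lambda t, y: y, t, y, h)
        t += h
    return y

e1 = abs(integrate(0.1) - math.e)
e2 = abs(integrate(0.05) - math.e)
print(e2 < e1 / 10)  # fourth order: halving h cuts the error roughly 16-fold
```

Testing the step length, as the abstract describes, amounts to exactly this kind of comparison: the error should fall by about h⁴ until the ODE's own smoothness or rounding dominates.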
Ray tracing study for non-imaging daylight collectors
Wittkopf, Stephen [Solar Energy Research Institute of Singapore (SERIS), National University of Singapore (NUS), 7 Engineering Drive 1, Block E3A, 06-01, Singapore 117574 (Singapore); Solar Energy and Building Physics Laboratory (LESO), Swiss Federal Institute of Technology Lausanne (EPFL) (Switzerland); Oliver Grobe, Lars; Geisler-Moroder, David [Solar Energy Research Institute of Singapore (SERIS), National University of Singapore (NUS), 7 Engineering Drive 1, Block E3A, 06-01, Singapore 117574 (Singapore); Compagnon, Raphael [College of Engineering and Architecture of Fribourg (EIA-FR), University of Applied Sciences of Western Switzerland (HES-SO) (Switzerland); Kaempf, Jerome; Linhart, Friedrich; Scartezzini, Jean-Louis [Solar Energy and Building Physics Laboratory (LESO), Swiss Federal Institute of Technology Lausanne (EPFL) (Switzerland)
2010-06-15
This paper presents a novel method to study how well non-imaging daylight collectors pipe diffuse daylight into long horizontal funnels for illuminating deep buildings. Forward ray tracing is used to derive luminous intensity distribution curves (LIDCs) of such collectors centered in an arc-shaped light source representing daylight. New photometric characteristics such as 2D flux, angular spread and horizontal offset are introduced as a function of such LIDCs. They are applied for quantifying and thus comparing different collector contours. (author)
Ray-tracing optical modeling of negative dysphotopsia
Hong, Xin; Liu, Yueai; Karakelle, Mutlu; Masket, Samuel; Fram, Nicole R.
2011-12-01
Negative dysphotopsia is a relatively common photic phenomenon that may occur after implantation of an intraocular lens. The etiology of negative dysphotopsia is not fully understood. In this investigation, optical modeling was developed using nonsequential-component Zemax ray-tracing technology to simulate photic phenomena experienced by the human eye. The simulation investigated the effects of pupil size, capsulorrhexis size, and bag diffusiveness. Results demonstrated the optical basis of negative dysphotopsia. We found that photic structures were mainly influenced by critical factors such as the capsulorrhexis size and the optical diffusiveness of the capsular bag. The simulations suggested the hypothesis that the anterior capsulorrhexis interacting with the intraocular lens could induce negative dysphotopsia.
Connell, Paul
2014-05-01
In designing the MXGS coded mask imager of the ASIM mission on the ISS, to detect and locate gamma-rays from Terrestrial Gamma-ray Flashes, it was necessary to write software to simulate the expansion of gamma-ray photons from 15-20 km altitudes for an initial estimate of TGF spectra and diffuse beam structure likely to be observed at orbital altitudes. From this a new detailed LEPTRACK simulation software package has been developed to track all electron-photon scattering via Bremsstrahlung and ionization, and via any spatial electric-magnetic field geometries which will drive the Relativistic Runaway Electron Avalanche (RREA) process at the heart of TGF origin. LEPTRACK uses the standard physics of keV-MeV photon interactions, Bremsstrahlung scattering, Binary-Electron-Bethe models of electron ionization-scattering, positron Bhabha scattering and annihilation. Unlike simulation packages such as GEANT4 and EGS, the physics of these processes is transferred outside the software and controlled by a standard database of text files of total scattering cross sections, differential energy transfer and deflection angle PDFs - easy to read and plot - but which can also be changed, if the user understands the physics involved and wishes to create their own modified database. It also uses a superparticle spatial mesh system to control particle density and flux fields, electric field evolution, and exponential avalanche growth. Results will be presented of TGF simulations using macro electric field geometries expected in storm clouds and micro field geometries expected around streamer tips - and combinations of both - and will include video displays showing the evolving ionization structure of electron trajectories, the time evolution of photon-electron-positron density and flux fields, local molecular ion densities, the dielectric effect of induced local electric fields - and the important effect of the local Earth magnetic field on circular lepton feedback and TGF beam direction.
A new vector radiative transfer model as a part of SCIATRAN 3.0 software package.
Rozanov, Alexei; Rozanov, Vladimir; Burrows, John P.
The SCIATRAN 3.0 package is a result of further development of the SCIATRAN 2.x software family which, similar to previous versions, comprises a radiative transfer model and a retrieval block. A major improvement was achieved in comparison to previous software versions by adding the vector mode to the radiative transfer model. Thus, the well-established Discrete Ordinate solver can now be run in the vector mode to calculate the scattered solar radiation including polarization, i.e., to simulate all four components of the Stokes vector. Similar to the scalar version, the simulations can be performed for any viewing geometry typical for atmospheric observations in the UV-Vis-NIR spectral range (nadir, limb, off-axis, etc.) as well as for any observer position within or outside the Earth's atmosphere. Similar to the precursor version, the new model is freely available for non-commercial use via the web page of the University of Bremen. In this presentation a short description of the software package, especially of the new vector radiative transfer model, will be given, including remarks on the availability for the scientific community. Furthermore, comparisons to other vector models will be shown and some example problems will be considered where the polarization of the observed radiation must be accounted for to obtain high quality results.
Implementing a Simulation Study Using Multiple Software Packages for Structural Equation Modeling
Sunbok Lee
2015-07-01
A Monte Carlo simulation study is an essential tool for evaluating the behavior of various quantitative methods, including structural equation modeling (SEM), under various conditions. Typically, a large number of replications are recommended for a Monte Carlo simulation study, and therefore automating a Monte Carlo simulation study is important for obtaining the desired number of replications. This article is intended to provide concrete examples for automating a Monte Carlo simulation study using some standard software packages for SEM: Mplus, LISREL, SAS PROC CALIS, and the R package lavaan. Also, the equivalence between multilevel SEM and hierarchical linear modeling (HLM) is discussed, and relevant examples are provided. It is hoped that the codes in this article can provide some building blocks for researchers to write their own code to automate simulation procedures.
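The automation pattern the article describes, wrapping one replication in a function and looping it under a single seed, can be sketched independently of any SEM package. The toy model below recovers an ordinary regression slope rather than fitting an SEM, purely for illustration:

```python
import random
import statistics

def simulate_once(n=200, beta=0.5, rng=random):
    """One replication: simulate y = beta * x + noise, then re-estimate
    beta with ordinary least squares."""
    xs = [rng.gauss(0.0, 1.0) for _ in range(n)]
    ys = [beta * x + rng.gauss(0.0, 1.0) for x in xs]
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

def monte_carlo(reps=500, seed=1):
    """Automate the replications: one seeded generator drives the whole
    study, so the entire simulation is reproducible from a single call."""
    rng = random.Random(seed)
    return [simulate_once(rng=rng) for _ in range(reps)]

est = monte_carlo()
print(len(est), round(statistics.fmean(est), 3))
```

In practice the body of `simulate_once` would call out to Mplus, LISREL, or lavaan and parse the returned estimates; the seeding-and-collection scaffold around it stays the same.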
MedLinac2: a GEANT4 based software package for radiotherapy
Barbara Caccia
2010-06-01
Dose distribution evaluation in oncological radiotherapy treatments is an outstanding problem that requires sophisticated computing technologies to optimize the clinical results (i.e. increase the dose to the tumour and reduce the dose to the healthy tissues). Nowadays, dose calculation algorithms based on the Monte Carlo method are generally regarded as the most accurate tools for radiotherapy. The flexibility of the GEANT4 (GEometry ANd Tracking) Monte Carlo particle transport simulation code allows a wide range of applications, from high-energy to medical physics. In order to disseminate and encourage the use of the Monte Carlo method in oncological radiotherapy, a software package based on the GEANT4 Monte Carlo toolkit has been developed. The developed package (MedLinac2) makes it possible to simulate a linear accelerator for radiotherapy in a flexible way and to evaluate the resulting dose distributions.
Sañudo, Borja; Rueda, David; Pozo-Cruz, Borja Del; de Hoyo, Moisés; Carrasco, Luis
2016-10-01
Sañudo, B, Rueda, D, del Pozo-Cruz, B, de Hoyo, M, and Carrasco, L. Validation of a video analysis software package for quantifying movement velocity in resistance exercises. J Strength Cond Res 30(10): 2934-2941, 2016. The aim of this study was to establish the validity of a video analysis software package in measuring mean propulsive velocity (MPV) and the maximal velocity during bench press. Twenty-one healthy males (21 ± 1 years) with weight training experience were recruited, and the MPV and the maximal velocity of the concentric phase (Vmax) were compared with a linear position transducer system during a standard bench press exercise. Participants performed a 1 repetition maximum test using the supine bench press exercise. The testing procedures involved the simultaneous assessment of bench press propulsive velocity using 2 kinematic (linear position transducer and semi-automated tracking software) systems. High Pearson's correlation coefficients for MPV and Vmax between both devices (r = 0.473 to 0.993) were observed. The intraclass correlation coefficients for barbell velocity data and the kinematic data obtained from video analysis were high (>0.79). In addition, the low coefficients of variation indicate that measurements had low variability. Finally, Bland-Altman plots with the limits of agreement of the MPV and Vmax with different loads showed a negative trend, which indicated that the video analysis had higher values than the linear transducer. In conclusion, this study has demonstrated that the software used for the video analysis was an easy to use and cost-effective tool with a very high degree of concurrent validity. This software can be used to evaluate changes in velocity of training load in resistance training, which may be important for the prescription and monitoring of training programmes.
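The agreement statistics used in this kind of validation study (Pearson correlation and Bland-Altman limits of agreement) are straightforward to compute; the paired velocities below are invented for illustration, not the study's data:

```python
import statistics

def pearson_r(a, b):
    """Pearson correlation between two paired measurement series."""
    ma, mb = statistics.fmean(a), statistics.fmean(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = (sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b)) ** 0.5
    return num / den

def bland_altman(a, b):
    """Bias (mean difference) and 95% limits of agreement between devices."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.fmean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired bar velocities (m/s): video analysis vs transducer
video = [0.75, 0.82, 0.66, 0.91, 0.58, 0.70, 0.88, 0.63]
linear = [0.72, 0.80, 0.65, 0.88, 0.57, 0.69, 0.85, 0.62]
print(round(pearson_r(video, linear), 3), bland_altman(video, linear))
```

A positive bias here would correspond to the study's observation that the video analysis read slightly higher than the linear transducer.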
Software and hardware package for justification of safety of nuclear legacy facilities
P.A. Blokhin
2017-03-01
Determining the future fate of nuclear legacy facilities is becoming an extremely important near-term issue. This includes identifying decommissioning options based on detailed justification of the respective designs. No general practice has been developed in Russia to address such issues, while the initial steps to this end have been made as part of the federal target program “Ensuring Nuclear and Radiation Safety for 2008 and Up to the Year 2015”. Problems arising in the justification of decommissioning options for such facilities, in terms of radiation protection and safety assessments both for the public and personnel, differ greatly from the tasks involved in the design of new nuclear installations. The reason is a critical shortage of information both on nuclear legacy facilities as such and on the radioactive waste (RW) they contain. Extra complexities stem from regulatory requirements for facilities of this type having changed greatly since the time these facilities were built. This puts priority on the development of approaches to the justification of nuclear, radiation and environmental safety. A software and hardware package, OBOYAN, has been developed to solve the great variety of tasks to be addressed as part of this problem, based on a combination of software and hardware tools enabling analysis and justification of nuclear legacy site (NLS) safety both in the current state and in the long term. The package's key components are computational modules used to model radiation fields, radionuclide migration and the distribution of contamination in water and air, as well as to estimate human doses and risks. The purpose of the study is to describe the structure and the functional capabilities of the package and to provide examples of the package's application.
Ashraf, H.; Bach, K.S.; Hansen, H. [Copenhagen University, Department of Radiology, Gentofte Hospital, Hellerup (Denmark)]; Hoop, B. de [University Medical Centre Utrecht, Department of Radiology, Utrecht (Netherlands)]; Shaker, S.B.; Dirksen, A. [Copenhagen University, Department of Respiratory Medicine, Gentofte Hospital, Hellerup (Denmark)]; Prokop, M. [University Medical Centre Utrecht, Department of Radiology, Utrecht (Netherlands); Radboud University Nijmegen, Department of Radiology, Nijmegen (Netherlands)]; Pedersen, J.H. [Copenhagen University, Department of Cardiothoracic Surgery RT, Rigshospitalet, Copenhagen (Denmark)]
2010-08-15
We examined the reproducibility of lung nodule volumetry software that offers three different volumetry algorithms. In a lung cancer screening trial, 188 baseline nodules >5 mm were identified. Including follow-ups, these nodules formed a study set of 545 nodules. Nodules were independently double-read by two readers using commercially available volumetry software. The software offers readers three different analysing algorithms. We compared the inter-observer variability of nodule volumetry when the readers used the same and different algorithms. Both readers were able to correctly segment and measure 72% of nodules. In 80% of these cases, the readers chose the same algorithm. When readers used the same algorithm, exactly the same volume was measured in 50% of readings and a difference of >25% was observed in 4%. When the readers used different algorithms, 83% of measurements showed a difference of >25%. Modern volumetric software failed to correctly segment a high number of screen-detected nodules. While choosing a different algorithm can yield better segmentation of a lung nodule, the reproducibility of volumetric measurements deteriorates substantially when different algorithms are used. It is crucial even in the same software package to choose identical parameters for follow-up. (orig.)
Ray tracing and ECRH absorption modeling in the HSX stellarator
Weir, G. M.; Likin, K. M.; Marushchenko, N. B.; Turkin, Y.
2015-09-01
To increase flexibility in ECRH experiments on the helically symmetric experiment (HSX), a second gyrotron and transmission line have been installed. The second antenna includes a steerable mirror for off-axis heating, and the launched power may be modulated for use in heat pulse propagation experiments. The extraordinary wave at the second harmonic of the electron gyrofrequency or the ordinary wave at the fundamental resonance is used for plasma start-up and heating on HSX. The TRAVIS (tracing visualized) ray-tracing code (Marushchenko et al 2007 Plasma Fusion Res. 2 S1129) is used to estimate single-pass absorption and to model multi-pass wave damping in the three-dimensional HSX geometry. The single-pass absorption of the ordinary wave at the fundamental resonance is calculated to be as high as 30%, while measurements of the total absorption indicate that 45% of the launched power is absorbed. A multi-pass ray tracing model correctly predicts the experimental absorption and indicates that the launched power is absorbed within the plasma core (r/a ≤ 0.2).
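The relation between single-pass and multi-pass absorption described above can be illustrated with a toy geometric-series model. This is a hedged sketch, not the TRAVIS calculation: the single-pass absorption fraction a and the wall-return fraction r are hypothetical inputs.

```python
def multipass_absorbed(a, r, n_passes):
    """Total absorbed fraction of launched power after n_passes,
    given single-pass absorption a and the fraction r of unabsorbed
    power returned to the plasma for another pass.
    """
    power = 1.0      # normalized launched power
    absorbed = 0.0
    for _ in range(n_passes):
        absorbed += a * power            # deposited on this pass
        power *= (1.0 - a) * r           # survives to the next pass
    return absorbed
```

With a = 0.30 (the calculated single-pass value) and any nonzero return fraction, the total converges to a value above 0.30, qualitatively matching the observation that measured total absorption exceeds the single-pass estimate.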
Big Science, Small-Budget Space Experiment Package Aka MISSE-5: A Hardware And Software Perspective
Krasowski, Michael; Greer, Lawrence; Flatico, Joseph; Jenkins, Phillip; Spina, Dan
2007-01-01
Conducting space experiments with small budgets is a fact of life for many design groups with low-visibility science programs. One major consequence is that specialized space grade electronic components are often too costly to incorporate into the design. Radiation mitigation now becomes more complex as a result of being restricted to the use of commercial off-the-shelf (COTS) parts. Unique hardware and software design techniques are required to succeed in producing a viable instrument suited for use in space. This paper highlights some of the design challenges and associated solutions encountered in the production of a highly capable, low cost space experiment package.
MC ray-tracing optimization of lobster-eye focusing devices with RESTRAX
Saroun, Jan [Nuclear Physics Institute, ASCR, 25068 Rez (Czech Republic)]. E-mail: saroun@ujf.cas.cz; Kulda, Jiri [Institut Laue-Langevin, 6 rue Jules Horowitz, BP 156, 38042 Grenoble Cedex 9 (France)
2006-11-15
The enhanced functionalities of the latest version of the RESTRAX software, providing a high-speed Monte Carlo (MC) ray-tracing code to represent a virtual three-axis neutron spectrometer, include representation of parabolic and elliptic guide profiles and facilities for numerical optimization of parameter values, characterizing the instrument components. As examples, we present simulations of a doubly focusing monochromator in combination with cold neutron guides and lobster-eye supermirror devices, concentrating a monochromatic beam to small sample volumes. A Levenberg-Marquardt minimization algorithm is used to optimize simultaneously several parameters of the monochromator and lobster-eye guides. We compare the performance of optimized configurations in terms of monochromatic neutron flux and energy spread and demonstrate the effect of lobster-eye optics on beam transformations in real and momentum subspaces.
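The Levenberg-Marquardt algorithm used above interpolates between Gauss-Newton and gradient-descent steps via a damping factor. A minimal sketch for a two-parameter least-squares fit follows; the model y = p0·exp(p1·x) is an illustrative toy problem, not the RESTRAX instrument optimization.

```python
import math

def levenberg_marquardt(xs, ys, p, n_iter=200, lam=1e-3):
    """Minimal Levenberg-Marquardt fit of the toy model y = p[0]*exp(p[1]*x).
    The damping factor lam blends Gauss-Newton (small lam) with gradient
    descent (large lam); it shrinks on accepted steps, grows on rejected ones.
    """
    def residuals(q):
        return [y - q[0] * math.exp(q[1] * x) for x, y in zip(xs, ys)]

    def sse(q):
        return sum(r * r for r in residuals(q))

    for _ in range(n_iter):
        r = residuals(p)
        # analytic Jacobian of the model with respect to p
        J = [(math.exp(p[1] * x), p[0] * x * math.exp(p[1] * x)) for x in xs]
        a00 = sum(j0 * j0 for j0, _ in J)
        a01 = sum(j0 * j1 for j0, j1 in J)
        a11 = sum(j1 * j1 for _, j1 in J)
        g0 = sum(j0 * ri for (j0, _), ri in zip(J, r))
        g1 = sum(j1 * ri for (_, j1), ri in zip(J, r))
        # damped normal equations: (J^T J + lam * diag(J^T J)) d = J^T r
        A00, A11 = a00 * (1.0 + lam), a11 * (1.0 + lam)
        det = A00 * A11 - a01 * a01
        if det == 0.0:
            break
        d0 = (A11 * g0 - a01 * g1) / det
        d1 = (A00 * g1 - a01 * g0) / det
        trial = [p[0] + d0, p[1] + d1]
        if sse(trial) < sse(p):
            p, lam = trial, lam * 0.5   # accept step, relax damping
        else:
            lam *= 2.0                  # reject step, increase damping
    return p
```

In RESTRAX the "residuals" would instead be deviations of simulated flux and energy spread from targets as several monochromator and guide parameters vary, but the accept/reject damping logic is the same.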
KARAT-LAMBDA - frequency dependent ray-traced troposphere delays for space applications
Hobiger, Thomas; Baron, Philippe
2014-05-01
Space-geodetic microwave techniques work under the assumption that the only dispersive, i.e. frequency dependent, delay contribution is caused by the ionosphere. In general, the refractivity, even for the troposphere, is a complex quantity which can be denoted as N = N0 + (N'(f) + i N''(f)), where N0 is a frequency independent term, and N'(f) and N''(f) represent the complex frequency dependence. The imaginary part can be used to derive the loss of energy (absorption), and the real part can be assigned to the changes in propagation velocity (refraction) and thus describes the delay of an electromagnetic wave which propagates through that medium. Although the frequency dependent delay contribution appears to be small, one has to consider that signals propagate through a few kilometers of troposphere at high elevations and hundreds of kilometers at low elevations. Therefore, the Kashima Ray-Tracing package (Hobiger et al., 2008) has been modified (and named KARAT-LAMBDA) to enable the consideration of a frequency dependent refractivity. Using this tool, it was studied whether and to what extent future space geodetic instruments are affected by dispersive troposphere delays. Moreover, a semi-empirical correction model for the microwave link of the Atomic Clock Ensemble in Space (ACES) has been developed, based on ray-tracing calculations with KARAT-LAMBDA. The proposed model (Hobiger et al., 2013) has been tested with simulated ISS overflights at different potential ACES ground station sites, and it could be demonstrated that the model is capable of removing biases and elevation dependent features caused by the dispersive troposphere delay difference between the up-link and down-link. References: T. Hobiger, R. Ichikawa, T. Kondo, and Y. Koyama (2008), Fast and accurate ray-tracing algorithms for real-time space geodetic applications using numerical weather models, Journal of Geophysical Research, vol. 113, iss. D203027, pp. 1-14. T. Hobiger, D
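The delay part of the picture above follows directly from the definition of refractivity in N-units: the excess path is ΔL = 10⁻⁶ ∫ N ds along the ray. A minimal numerical version is sketched below; the sampling interval and refractivity values are hypothetical, and this is not the KARAT-LAMBDA ray tracer.

```python
def excess_path(refractivity, ds):
    """Excess path length (m) from refractivity N (in N-units, i.e. 1e-6)
    sampled every ds metres along the ray: delta_L = 1e-6 * integral N ds,
    evaluated here by trapezoidal integration.  Complex samples are
    accepted; the real part gives the delay and the imaginary part the
    absorption, matching the decomposition N = N0 + N'(f) + i N''(f).
    """
    total = 0.0
    for n0, n1 in zip(refractivity, refractivity[1:]):
        total += 0.5 * (n0 + n1) * ds    # trapezoid over one segment
    return 1e-6 * total
```

For a constant N of 300 N-units over a 10 km path this gives 3.0 m of excess path, which is the familiar order of magnitude for the zenith tropospheric delay.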
Guggenberger, Roman; Nanz, Daniel; Puippe, Gilbert; Rufibach, Kaspar; White, Lawrence M; Sussman, Marshall S; Andreisek, Gustav
2012-08-01
To assess intra-reader and inter-reader agreement, and the agreement between two software packages, for magnetic resonance diffusion tensor imaging (DTI) measurements of the median nerve. Fifteen healthy volunteers (seven men, eight women; mean age, 31.2 years) underwent DTI of both wrists at 1.5 T. Fractional anisotropy (FA) and apparent diffusion coefficient (ADC) of the median nerve were measured by three readers using two commonly used software packages. Measurements were repeated by two readers after 6 weeks. Intraclass correlation coefficients (ICC) and Bland-Altman analysis were used for statistical analysis. ICCs for intra-reader agreement ranged from 0.87 to 0.99, for inter-reader agreement from 0.62 to 0.83, and between the two software packages from 0.63 to 0.82. Bland-Altman analysis showed no differences for intra- and inter-reader agreement or agreement between software packages. Intra-reader, inter-reader, and between-software agreement for DTI measurements of the median nerve was moderate to substantial, suggesting that user- and software-dependent factors contribute little to variance in DTI measurements.
Smile Analyzer: A Software Package for Analyzing the Characteristics of the Speech and Smile
Farzin Heravi
2012-09-01
Taking into account the factors related to lip-tooth relationships in orthodontic diagnosis and treatment planning is of prime importance. Manual quantitative analysis of facial parameters on photographs during smile and speech is a difficult and time-consuming job. Since there was no comprehensive and user-friendly software package, we developed a software program called "Smile Analyzer" in the Department of Orthodontics of Mashhad Faculty of Dentistry for measuring the parameters related to lip-tooth relationships and other facial landmarks on photographs taken during various facial expressions. The software was designed using Visual Basic .NET, and ADO.NET was used for developing its Microsoft Access database. The program runs on Microsoft Windows. It is capable of analyzing many parameters or variables in many patients' photographs, although 19 more common variables are predefined as a default list of variables. When all variables are measured or calculated, a report can be generated and saved in either PDF or MS Excel format. Data are readily transferable to statistical software like SPSS for Windows.
Smile Analyzer: A Software Package for Analyzing the Characteristics of the Speech and Smile
Roozbeh Rashed
2013-01-01
Taking into account the factors related to lip-tooth relationships in orthodontic diagnosis and treatment planning is of prime importance. Manual quantitative analysis of facial parameters on photographs during smile and speech is a difficult and time-consuming job. Since there was no comprehensive and user-friendly software package, we developed a software program called "Smile Analyzer" in the Department of Orthodontics of Mashhad Faculty of Dentistry for measuring the parameters related to lip-tooth relationships and other facial landmarks on photographs taken during various facial expressions. The software was designed using Visual Basic .NET, and ADO.NET was used for developing its Microsoft Access database. The program runs on Microsoft Windows. It is capable of analyzing many parameters or variables in many patients' photographs, although 19 more common variables are predefined as a default list of variables. When all variables are measured or calculated, a report can be generated and saved in either PDF or MS Excel format. Data are readily transferable to statistical software like SPSS for Windows.
Korinek, Andreas; Beck, Florian; Baumeister, Wolfgang; Nickell, Stephan; Plitzko, Jürgen M
2011-09-01
Automated data acquisition expedites structural studies by electron microscopy and allows the collection of data sets of unprecedented size and consistent quality. In electron tomography it greatly facilitates the systematic exploration of large cellular landscapes, and in single particle analysis it allows the generation of data sets for an exhaustive classification of coexisting molecular states. Here we describe a novel software philosophy and architecture that can be used for a great variety of automated data acquisition scenarios. Based on our original software package TOM, the new TOM2 package has been designed in an object-oriented way. The whole program can be seen as a collection of self-sufficient modules with defined relationships acting in a concerted manner. It subdivides data acquisition into a set of hierarchical tasks, bonding data structure and the operations to be performed tightly together. To demonstrate its capacity for high-throughput data acquisition it has been used in conjunction with instrumentation combining the latest technological achievements in electron optics, cryogenics and robotics. Its performance is demonstrated with a single particle analysis case study and with a batch tomography application.
Maeda, Hisato; Yamaki, Noriyasu; Azuma, Makoto
2012-01-01
The objective of this study was to develop a personal computer-based nuclear medicine data processor for education and research in the field of nuclear medicine. We call this software package "Prominence Processor" (PP). Microsoft Windows was used as the operating system of the PP, which has a 1024 × 768 image resolution and 63 applications classified into 6 groups. The accuracy was examined for many of the applications of the PP. For example, in the FBP reconstruction application, there was visually no difference in image quality when comparing the two SPECT images obtained from the PP and the GMS-5500A (Toshiba). Moreover, the normalized MSE between the two images was 0.0003. Therefore the high processing accuracy of the FBP reconstruction application, as well as that of the other applications, was proven. The PP can be used in an arbitrary place if the software package is installed on a notebook PC. Therefore the PP is now widely used for lectures and practice in educational settings, and for the research of radiological technologists in clinical settings.
Perprof-py: A Python Package for Performance Profile of Mathematical Optimization Software
Abel Soares Siqueira
2016-04-01
A very important area of research in the field of Mathematical Optimization is the benchmarking of optimization packages to compare solvers. During benchmarking, one usually collects a large amount of information such as CPU time, number of function evaluations, number of iterations, and much more. This information, if presented as tables, can be difficult to analyze and compare due to the large amount of data. Therefore tools to better process and understand optimization benchmark data have been developed. One of the most widespread tools is the Performance Profile graphics proposed by Dolan and Moré [2]. In this context, this paper describes perprof-py, a free/open source software that creates 'Performance Profile' graphics. This software produces graphics in PDF using LaTeX with the PGF/TikZ [22] and PGFPLOTS [4] packages, in PNG using matplotlib [9], and in HTML using Bokeh [1]. Perprof-py can also be easily extended to be used with other plot libraries. It is implemented in Python 3 with support for internationalization, and is under the General Public License Version 3 (GPLv3).
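The underlying Performance Profile computation is small: for each problem, divide each solver's cost by the best cost on that problem, then report the fraction of problems each solver handles within a factor τ of the best. A minimal sketch (not perprof-py's implementation; the cost matrix in the test is hypothetical):

```python
def performance_profile(costs, taus):
    """Dolan-More performance profile.

    costs[p][s] is the cost (e.g. CPU time) of solver s on problem p;
    costs are assumed positive, with float('inf') marking a failure
    (inf/inf is nan, which compares False against any tau: unsolved).
    Returns rho, where rho[s][i] is the fraction of problems solver s
    solves within a factor taus[i] of the best solver on each problem.
    """
    n_problems = len(costs)
    n_solvers = len(costs[0])
    # performance ratios relative to the best solver per problem
    ratios = [[c / min(row) for c in row] for row in costs]
    rho = []
    for s in range(n_solvers):
        rho.append([sum(1 for p in range(n_problems) if ratios[p][s] <= t) / n_problems
                    for t in taus])
    return rho
```

Plotting rho[s] against taus (usually on a log scale for τ) yields the familiar step curves: the value at τ = 1 is the fraction of problems a solver wins, and the limit for large τ is its overall success rate.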
Evaluating Dense 3D Reconstruction Software Packages for Oblique Monitoring of Crop Canopy Surface
Brocks, S.; Bareth, G.
2016-06-01
Crop Surface Models (CSMs) are 2.5D raster surfaces representing absolute plant canopy height. Using multiple CSMs generated from data acquired at multiple time steps, crop surface monitoring is enabled. This makes it possible to monitor crop growth over time and can be used for monitoring in-field crop growth variability, which is useful in the context of high-throughput phenotyping. This study aims to evaluate several software packages for dense 3D reconstruction from multiple overlapping RGB images on field and plot scale. A summer barley field experiment located at Campus Klein-Altendorf of the University of Bonn was observed by acquiring stereo images from an oblique angle using consumer-grade smart cameras. Two such cameras were mounted at an elevation of 10 m and acquired images for a period of two months during the growing period of 2014. The field experiment consisted of nine barley cultivars that were cultivated in multiple repetitions and nitrogen treatments. Manual plant height measurements were carried out at four dates during the observation period. The software packages Agisoft PhotoScan, VisualSfM with CMVS/PMVS2, and SURE are investigated. The point clouds are georeferenced through a set of ground control points. Where adequate results are reached, a statistical analysis is performed.
Calculation of the relative metastabilities of proteins using the CHNOSZ software package
Dick Jeffrey M
2008-10-01
Background: Proteins of various compositions are required by organisms inhabiting different environments. The energetic demands for protein formation are a function of the compositions of proteins as well as geochemical variables including temperature, pressure, oxygen fugacity and pH. The purpose of this study was to explore the dependence of metastable equilibrium states of protein systems on changes in the geochemical variables. Results: A software package called CHNOSZ implementing the revised Helgeson-Kirkham-Flowers (HKF) equations of state and group additivity for ionized unfolded aqueous proteins was developed. The program can be used to calculate standard molal Gibbs energies and other thermodynamic properties of reactions and to make chemical speciation and predominance diagrams that represent the metastable equilibrium distributions of proteins. The approach takes account of the chemical affinities of reactions in open systems characterized by the chemical potentials of basis species. The thermodynamic database included with the package permits application of the software to mineral and other inorganic systems as well as systems of proteins or other biomolecules. Conclusion: Metastable equilibrium activity diagrams were generated for model cell-surface proteins from archaea and bacteria adapted to growth in environments that differ in temperature and chemical conditions. The predicted metastable equilibrium distributions of the proteins can be compared with the optimal growth temperatures of the organisms and with geochemical variables. The results suggest that a thermodynamic assessment of protein metastability may be useful for integrating bio- and geochemical observations.
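The chemical affinity that such diagrams are built on has a compact standard form, A = 2.303·R·T·(log K − log Q). A minimal sketch of that formula follows; the log K and log Q values in the test are illustrative numbers, not CHNOSZ output.

```python
import math

R_GAS = 8.314462618  # gas constant, J/(mol K)

def affinity(log_k, log_q, temp_k=298.15):
    """Chemical affinity A = 2.303 * R * T * (log K - log Q) in J/mol.
    Positive A means the reaction is favoured in the direction written;
    at equilibrium log Q = log K and A = 0.
    """
    return math.log(10.0) * R_GAS * temp_k * (log_k - log_q)
```

One log unit of disequilibrium at 25 °C corresponds to roughly 5.7 kJ/mol of affinity, which sets the scale of the metastable-equilibrium comparisons described above.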
The Verification and Validation of the Ray-tracing of Bag of Triangles (BoTs)
2015-02-01
Charith Ranawake; ARL-CR-0761; February 2015.
GMATA: an integrated software package for genome-scale SSR mining, marker development and viewing
Xuewen Wang
2016-09-01
Simple sequence repeats (SSRs), also referred to as microsatellites, are highly variable tandem DNAs that are widely used as genetic markers. The increasing availability of whole-genome and transcript sequences provides information resources for SSR marker development. However, efficient software is required to identify and display SSR information along with other gene features at a genome scale. We developed a novel software package, the Genome-wide Microsatellite Analyzing Tool Package (GMATA), integrating SSR mining, statistical analysis and plotting, marker design, polymorphism screening and marker transferability, and enabling the simultaneous display of SSR markers with other genome features. GMATA applies novel strategies for SSR analysis and primer design in large genomes, which allows GMATA to perform faster calculations and provide more accurate results than existing tools. Our package is also capable of processing DNA sequences of any size on a standard computer. GMATA is user friendly, requires only mouse clicks or typed inputs on the command line, and is executable on multiple computing platforms. We demonstrated the application of GMATA in plant genomes and reveal a novel distribution pattern of SSRs in 15 grass genomes. The most abundant motifs are the dimer GA/TC, the A/T monomer and the GCG/CGC trimer, rather than motifs rich in G/C content. We also revealed that SSR count is linearly proportional to chromosome length in fully assembled grass genomes. GMATA represents a powerful application tool that facilitates genomic sequence analyses. GMATA is freely available at http://sourceforge.net/projects/gmata/?source=navbar.
GMATA: An Integrated Software Package for Genome-Scale SSR Mining, Marker Development and Viewing
Wang, Xuewen; Wang, Le
2016-01-01
Simple sequence repeats (SSRs), also referred to as microsatellites, are highly variable tandem DNAs that are widely used as genetic markers. The increasing availability of whole-genome and transcript sequences provides information resources for SSR marker development. However, efficient software is required to identify and display SSR information along with other gene features at a genome scale. We developed a novel software package, the Genome-wide Microsatellite Analyzing Tool Package (GMATA), integrating SSR mining, statistical analysis and plotting, marker design, polymorphism screening and marker transferability, and enabling the simultaneous display of SSR markers with other genome features. GMATA applies novel strategies for SSR analysis and primer design in large genomes, which allows GMATA to perform faster calculations and provide more accurate results than existing tools. Our package is also capable of processing DNA sequences of any size on a standard computer. GMATA is user friendly, requires only mouse clicks or typed inputs on the command line, and is executable on multiple computing platforms. We demonstrated the application of GMATA in plant genomes and reveal a novel distribution pattern of SSRs in 15 grass genomes. The most abundant motifs are the dimer GA/TC, the A/T monomer and the GCG/CGC trimer, rather than motifs rich in G/C content. We also revealed that SSR count is linearly proportional to chromosome length in fully assembled grass genomes. GMATA represents a powerful application tool that facilitates genomic sequence analyses. GMATA is freely available at http://sourceforge.net/projects/gmata/?source=navbar. PMID:27679641
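The core SSR-mining step can be illustrated with a regular expression that finds short motifs repeated in tandem. This is a toy illustration, not GMATA's algorithm (which adds the statistics, primer design and genome-scale handling described above); the default thresholds below are arbitrary choices.

```python
import re

def find_ssrs(seq, min_unit=1, max_unit=6, min_repeats=5):
    """Locate simple sequence repeats: motifs of min_unit..max_unit bp
    tandemly repeated at least min_repeats times.  Returns a list of
    (start, motif, repeat_count) tuples.  The lazy quantifier on the
    motif group makes the shortest motif (e.g. 'A' rather than 'AA')
    win for homopolymer runs.
    """
    pattern = re.compile(
        r'([ACGT]{%d,%d}?)\1{%d,}' % (min_unit, max_unit, min_repeats - 1))
    return [(m.start(), m.group(1), len(m.group(0)) // len(m.group(1)))
            for m in pattern.finditer(seq)]
```

For example, a GA dimer repeated six times is reported once with its genomic start position and repeat count, which is the raw material for the motif-frequency statistics (GA/TC, A/T, GCG/CGC) reported in the abstract.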
MEEP: A flexible free-software package for electromagnetic simulations by the FDTD method
Oskooi, Ardavan F.; Roundy, David; Ibanescu, Mihai; Bermel, Peter; Joannopoulos, J. D.; Johnson, Steven G.
2010-03-01
This paper describes Meep, a popular free implementation of the finite-difference time-domain (FDTD) method for simulating electromagnetism. In particular, we focus on aspects of implementing a full-featured FDTD package that go beyond standard textbook descriptions of the algorithm, or ways in which Meep differs from typical FDTD implementations. These include pervasive interpolation and accurate modeling of subpixel features, advanced signal processing, support for nonlinear materials via Padé approximants, and flexible scripting capabilities. Program summary: Program title: Meep. Catalogue identifier: AEFU_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFU_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GNU GPL. No. of lines in distributed program, including test data, etc.: 151 821. No. of bytes in distributed program, including test data, etc.: 1 925 774. Distribution format: tar.gz. Programming language: C++. Computer: Any computer with a Unix-like system and a C++ compiler; optionally exploits additional free software packages: GNU Guile [1], libctl interface library [2], HDF5 [3], MPI message-passing interface [4], and Harminv filter-diagonalization [5]. Developed on 2.8 GHz Intel Core 2 Duo. Operating system: Any Unix-like system; developed under Debian GNU/Linux 5.0.2. RAM: Problem dependent (roughly 100 bytes per pixel/voxel). Classification: 10. External routines: Optionally exploits additional free software packages: GNU Guile [1], libctl interface library [2], HDF5 [3], MPI message-passing interface [4], and Harminv filter-diagonalization [5] (which requires LAPACK and BLAS linear-algebra software [6]). Nature of problem: Classical electrodynamics. Solution method: Finite-difference time-domain (FDTD) method. Running time: Problem dependent (typically about 10 ns per pixel per timestep). References: [1] GNU Guile, http://www.gnu.org/software/guile [2] Libctl, http
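The core of any FDTD code is the leapfrog (Yee) update of electric and magnetic fields on staggered grids. A bare-bones 1D sketch in normalized units follows; it is a toy illustration of the method Meep implements, omitting Meep's subpixel smoothing, material models and absorbing boundaries, and all grid sizes and source parameters below are arbitrary.

```python
import math

def fdtd_1d(n_cells=200, n_steps=300, src_pos=100):
    """Bare-bones 1D FDTD (Yee) leapfrog in normalized units at the
    'magic' time step (Courant number 1).  The untouched edge cells act
    as perfect conductors, so pulses reflect at the boundaries.
    """
    ez = [0.0] * n_cells   # electric field at integer grid points
    hy = [0.0] * n_cells   # magnetic field at half-integer grid points
    for t in range(n_steps):
        for i in range(n_cells - 1):          # magnetic-field half step
            hy[i] += ez[i + 1] - ez[i]
        for i in range(1, n_cells):           # electric-field half step
            ez[i] += hy[i] - hy[i - 1]
        # soft Gaussian source injected at one cell
        ez[src_pos] += math.exp(-((t - 30.0) / 10.0) ** 2)
    return ez
```

The staggered update order is the leapfrog structure the paper's "textbook descriptions" refer to; everything Meep adds (interpolation, PML, materials, scripting) layers on top of this loop.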
Ionospheric Plasma Drift Analysis Technique Based On Ray Tracing
Ari, Gizem; Toker, Cenk
2016-07-01
Ionospheric drift measurements provide important information about variability in the ionosphere, which can be used to quantify ionospheric disturbances caused by natural phenomena such as solar, geomagnetic, gravitational and seismic activity. One of the prominent ways to measure drift relies on instrumentation, e.g. an ionosonde. The drift estimation of an ionosonde depends on measuring the Doppler shift of the received signal, where the main cause of the Doppler shift is the change in the length of the propagation path of the signal between the transmitter and the receiver. Unfortunately, ionosondes are expensive devices and their installation and maintenance require special care. Furthermore, the ionosonde network over the world, or even over Europe, is not dense enough to obtain a global or continental drift map. In order to overcome the difficulties related to ionosondes, we propose a technique to perform ionospheric drift estimation based on ray tracing. First, a two-dimensional TEC map is constructed by using the IONOLAB-MAP tool, which spatially interpolates the VTEC estimates obtained from the EUREF CORS network. Next, a three-dimensional electron density profile is generated by inputting the TEC estimates to the IRI-2015 model. Eventually, a realistic electron density profile is obtained in which ray tracing can be performed. These profiles can be constructed periodically with a period as low as 30 seconds. By processing two consecutive snapshots together and calculating the propagation paths, we estimate the drift over any coordinate of concern. We test our technique by comparing the results to the drift measurements taken at the DPS ionosonde at Pruhonice, Czech Republic. This study is supported by TUBITAK 115E915 and joint TUBITAK 114E092 and AS CR14/001 projects.
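The Doppler shift exploited above follows from the rate of change of the propagation path length, f_d = −(f/c)·dL/dt. A minimal sketch using two ray-traced path lengths from consecutive snapshots (the path lengths, cadence and frequency in the test are hypothetical, not values from the study):

```python
def doppler_drift(path_len_1, path_len_2, dt, freq_hz):
    """Doppler shift (Hz) implied by a change in ray-traced propagation
    path length (m) over dt seconds: f_d = -(f / c) * dL/dt.
    A lengthening path produces a negative (red) shift.
    """
    c = 299792458.0                        # speed of light, m/s
    dl_dt = (path_len_2 - path_len_1) / dt
    return -(freq_hz / c) * dl_dt
```

With 30-second snapshots, a 30 m path-length increase at 10 MHz corresponds to a shift of about −0.03 Hz, which sets the measurement precision such a technique must resolve.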
GENES - a software package for analysis in experimental statistics and quantitative genetics
Cosme Damião Cruz
2013-06-01
GENES is a software package used for data analysis and processing with different biometric models and is essential in genetic studies applied to plant and animal breeding. It allows parameter estimation to analyze biological phenomena and is fundamental for the decision-making process and predictions of the success and viability of selection strategies. The program can be downloaded from the Internet (http://www.ufv.br/dbg/genes/genes.htm or http://www.ufv.br/dbg/biodata.htm) and is available in Portuguese, English and Spanish. Specific literature (http://www.livraria.ufv.br/) and a set of sample files are also provided, making GENES easy to use. The software is integrated with the programs MS Word, MS Excel and Paint, ensuring simplicity and effectiveness in data import and in the export of results, figures and data. It is also compatible with the free software R and Matlab, through the supply of useful scripts available for complementary analyses in different areas, including genome-wide selection, prediction of breeding values and the use of neural networks in genetic improvement.
Ignominy: a Tool for Software Dependency and Metric Analysis with Examples from Large HEP Packages
Lassi A. Tuura; Lucas Taylor
2001-01-01
Ignominy is a tool developed in the CMS IGUANA project to analyse the structure of software systems. Its primary component is a dependency scanner that distills information into human-usable forms. It also includes several tools to visualise the collected data in the form of graphical views and numerical metrics. Ignominy was designed to adapt to almost any reasonable structure, and it has been used to analyse several large projects. The original purpose of Ignominy was to help us better ensure the quality of our own software, and in particular to warn us about possible structural problems early on. As part of this activity it is now used as a standard part of our release procedure; we also use it to evaluate and study the quality of external packages we plan to make use of. We describe what Ignominy can find out, and how it can be used to visualise and assess a software structure. We also discuss the inherent problems of the analysis as well as the different approaches to modularity the tool makes quite evident. The focus is the illustration of these issues through the analysis results for several sizable HEP software projects.
Joubert, W. [Los Alamos National Lab., NM (United States); Carey, G.F. [Univ. of Texas, Austin, TX (United States)
1994-12-31
A great need exists for high performance numerical software libraries transportable across parallel machines. This talk concerns the PCG package, which solves systems of linear equations by iterative methods on parallel computers. The features of the package are discussed, as well as the techniques used to obtain both high performance and transportability across architectures. Representative numerical results are presented for several machines including the Connection Machine CM-5, Intel Paragon and Cray T3D parallel computers.
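For a symmetric positive definite system, the conjugate gradient method is the canonical iterative solver of the kind such packages provide. A minimal serial sketch follows, without the preconditioning or parallel machinery that PCG itself supplies; the matrix in the test is an arbitrary example.

```python
def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Plain conjugate gradient for a symmetric positive definite
    matrix A (dense list of lists).  A serial sketch of the kind of
    iterative method the PCG package provides.
    """
    n = len(b)
    x = [0.0] * n
    r = list(b)                    # residual b - A*x with x = 0
    p = list(r)                    # initial search direction
    rs_old = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        if rs_old < tol * tol:     # residual small enough: converged
            break
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rs_old / sum(pi * api for pi, api in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rs_new = sum(ri * ri for ri in r)
        p = [ri + (rs_new / rs_old) * pi for ri, pi in zip(r, p)]
        rs_old = rs_new
    return x
```

In a parallel library the matrix-vector product and the two inner products are the operations distributed across processors, which is where the transportability work discussed in the talk concentrates.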
Leaf, G K; Minkoff, M; Byrne, G D; Sorensen, D; Bleakney, T; Saltzman, J
1978-11-01
DISPL is a software package for solving some second-order nonlinear systems of partial differential equations including parabolic, elliptic, hyperbolic, and some mixed types such as parabolic--elliptic equations. Fairly general nonlinear boundary conditions are allowed as well as interface conditions for problems in an inhomogeneous medium. The spatial domain is one- or two-dimensional with Cartesian, cylindrical, or spherical (in one dimension only) geometry. The numerical method is based on the use of Galerkin's procedure combined with the use of B-splines in order to reduce the system of PDE's to a system of ODE's. The latter system is then solved with a sophisticated ODE software package. Software features include extensive dump/restart facilities, free format input, moderate printed output capability, dynamic storage allocation, and three graphics packages. 17 figures, 9 tables.
Leaf, G.K.; Minkoff, M.; Byrne, G.D.; Sorensen, D.; Bleakney, T.; Saltzman, J.
1977-05-01
DISPL is a software package for solving some second-order nonlinear systems of partial differential equations including parabolic, elliptic, hyperbolic, and some mixed types such as parabolic--elliptic equations. Fairly general nonlinear boundary conditions are allowed as well as interface conditions for problems in an inhomogeneous medium. The spatial domain is one- or two-dimensional with Cartesian, cylindrical, or spherical (in one dimension only) geometry. The numerical method is based on the use of Galerkin's procedure combined with the use of B-splines in order to reduce the system of PDE's to a system of ODE's. The latter system is then solved with a sophisticated ODE software package. Software features include extensive dump/restart facilities, free format input, moderate printed output capability, dynamic storage allocation, and three graphics packages. 16 figures, 10 tables.
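The PDE-to-ODE reduction that DISPL performs can be illustrated with the method of lines on the 1D heat equation. In this sketch, simple finite differences stand in for DISPL's Galerkin/B-spline spatial discretization, and explicit Euler stands in for its sophisticated ODE solver; the grid size and step counts are arbitrary.

```python
import math

def heat_method_of_lines(n=20, dt=5e-4, steps=2000):
    """Semi-discretize u_t = u_xx on (0, 1) with u(0) = u(1) = 0:
    finite differences in space turn the PDE into the ODE system
    du_i/dt = (u_{i-1} - 2 u_i + u_{i+1}) / dx^2, which is then
    marched with explicit Euler (stable for dt <= dx^2 / 2).
    """
    dx = 1.0 / (n + 1)
    # initial condition sin(pi x) sampled at the interior nodes
    u = [math.sin(math.pi * (i + 1) * dx) for i in range(n)]
    for _ in range(steps):
        du = []
        for i in range(n):
            left = u[i - 1] if i > 0 else 0.0      # Dirichlet boundary
            right = u[i + 1] if i < n - 1 else 0.0
            du.append((left - 2.0 * u[i] + right) / dx ** 2)
        u = [ui + dt * dui for ui, dui in zip(u, du)]
    return u
```

The sine mode decays toward zero as the exact solution exp(−π²t)·sin(πx) predicts; a production package replaces both halves of this sketch (higher-order B-spline Galerkin in space, a stiff ODE integrator in time) but the two-stage structure is the same.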
Lourderaj, Upakarasamy; Sun, Rui; De Jong, Wibe A.; Windus, Theresa L.; Hase, William L.
2014-03-01
The interface for VENUS and NWChem, and the resulting software package for direct dynamics simulations are described. The coupling of the two codes is considered to be a tight coupling. The two codes are compiled and linked together and act as one executable with data being passed between the two codes through routine calls. The advantages of this type of coupling are discussed. The interface has been designed to have as little interference as possible with the core codes of both VENUS and NWChem. VENUS is the code that propagates the direct dynamics trajectories and, therefore, is the program that drives the overall execution of VENUS/NWChem. VENUS has remained an essentially sequential code, which uses the highly parallel structure of NWChem. Subroutines of the interface which accomplish the data transmission and communication between the two computer programs are described. Recent examples of the use of VENUS/NWChem for direct dynamics simulations are summarized.
MacMath 92 a dynamical systems software package for the Macintosh
Hubbard, John H
1993-01-01
MacMath is a scientific toolkit for the Macintosh computer consisting of twelve graphics programs. It supports mathematical computation and experimentation in dynamical systems, both for differential equations and for iteration. The MacMath package was designed to accompany the textbooks Differential Equations: A Dynamical Systems Approach Part I & II. The text and software was developed for a junior-senior level course in applicable mathematics at Cornell University, in order to take advantage of excellent and easily accessible graphics. MacMath addresses differential equations and iteration such as: analyzer, diffeq, phase plane, diffeq 3D views, numerical methods, periodic differential equations, cascade, 2D iteration, eigenfinder, jacobidraw, fourier, planets. These versatile programs greatly enhance the understanding of the mathematics in these topics. Qualitative analysis of the picture leads to quantitative results and even to new mathematics. This new edition includes the latest version of the Mac...
Green, J.R.
1996-10-21
Radclac for Windows is a user-friendly, menu-driven, Windows-compatible software program with applications in the transportation of radioactive materials. It calculates the radiolytic generation of hydrogen gas in the matrix of low-level and high-level radioactive wastes. It also calculates the pressure buildup due to hydrogen and the decay heat generated in a package at seal time. It computes the quantity of a radionuclide and its associated products for a given period of time. In addition, the code categorizes shipment quantities as reportable quantity (RQ), radioactive Type A or Type B, limited quantity (LQ), low specific activity (LSA), highway route controlled quantity (HRCQ), and fissile excepted using US Department of Transportation (DOT) definitions and methodologies.
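The hydrogen pressure buildup can be roughed out from first principles: decay energy deposited in the matrix times a radiolytic G-value gives the number of H2 molecules, and the ideal gas law gives the pressure. The sketch below is a deliberately crude illustration with hypothetical inputs, not Radclac's model, which tracks decaying activity and daughter products properly.

```python
def hydrogen_pressure(activity_bq, energy_mev, g_value, years, volume_m3,
                      temp_k=298.15):
    """Rough ideal-gas estimate of hydrogen pressure buildup (Pa) in a
    sealed void volume: deposited decay energy times a radiolytic
    G-value gives H2 molecules, then P = nRT/V.  Activity is treated
    as constant, which is conservative over times short compared with
    the half-life.
    """
    AVOGADRO = 6.02214076e23       # molecules per mole
    R = 8.314462618                # J/(mol K)
    seconds = years * 365.25 * 24.0 * 3600.0
    ev_deposited = activity_bq * energy_mev * 1.0e6 * seconds  # eV absorbed
    molecules_h2 = ev_deposited * g_value / 100.0  # G-value: molecules/100 eV
    moles = molecules_h2 / AVOGADRO
    return moles * R * temp_k / volume_m3
```

Under the constant-activity assumption the estimate grows linearly with sealed time, which is why gas-generation limits translate directly into allowable shipment durations.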
ELAN: a software package for analysis and visualization of MEG, EEG, and LFP signals.
Aguera, Pierre-Emmanuel; Jerbi, Karim; Caclin, Anne; Bertrand, Olivier
2011-01-01
The recent surge in computational power has led to extensive methodological developments and advanced signal processing techniques that play a pivotal role in neuroscience. In particular, the field of brain signal analysis has witnessed a strong trend towards multidimensional analysis of large data sets, for example, single-trial time-frequency analysis of high spatiotemporal resolution recordings. Here, we describe the freely available ELAN software package which provides a wide range of signal analysis tools for electrophysiological data including scalp electroencephalography (EEG), magnetoencephalography (MEG), intracranial EEG, and local field potentials (LFPs). The ELAN toolbox is based on 25 years of methodological developments at the Brain Dynamics and Cognition Laboratory in Lyon and was used in many papers including the very first studies of time-frequency analysis of EEG data exploring evoked and induced oscillatory activities in humans. This paper provides an overview of the concepts and functionalities of ELAN, highlights its specificities, and describes its complementarity and interoperability with other toolboxes.
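The single-trial time-frequency analyses highlighted above are conventionally computed by convolving the signal with complex Morlet wavelets. The sketch below illustrates that textbook technique with NumPy; it is not ELAN's implementation, and all parameter values (sampling rate, frequencies, number of cycles) are illustrative.

```python
import numpy as np

def morlet_tf(signal, fs, freqs, n_cycles=7.0):
    """Single-trial time-frequency power via complex Morlet wavelet
    convolution (illustrative; not the ELAN implementation)."""
    n = len(signal)
    power = np.empty((len(freqs), n))
    for i, f in enumerate(freqs):
        sigma_t = n_cycles / (2 * np.pi * f)              # wavelet width in time
        t = np.arange(-4 * sigma_t, 4 * sigma_t, 1 / fs)
        wavelet = np.exp(2j * np.pi * f * t) * np.exp(-t**2 / (2 * sigma_t**2))
        wavelet /= np.sqrt(np.sum(np.abs(wavelet) ** 2))  # unit energy
        analytic = np.convolve(signal, wavelet, mode="same")
        power[i] = np.abs(analytic) ** 2
    return power

# A 10 Hz oscillatory burst should show maximal power in the 10 Hz row.
fs = 250.0
t = np.arange(0, 2, 1 / fs)
sig = np.sin(2 * np.pi * 10 * t) * (t > 0.5) * (t < 1.5)
tf = morlet_tf(sig, fs, freqs=[5.0, 10.0, 20.0])
```

The trade-off between temporal and spectral resolution is set by `n_cycles`, the usual tuning knob in this family of analyses.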
Petrova Irina Yur’evna
2016-08-01
The article describes the features of the design and production of translucent building structures made of PVC. The automation systems for this process currently available on the market are analysed, and their advantages and disadvantages identified. Based on this analysis, a set of requirements for automation systems for the design and production of translucent PVC building structures is formulated, and the basic entities involved in those business processes are identified. The necessary functions for the main application and for the dealers' application are specified. The main application is based on the 1C:Enterprise 8.2 technological platform. The dealers' module is a .NET application developed with Microsoft Visual Studio and Microsoft SQL Server, because these products have client versions that are free for end users (.NET Framework 4.0 Client Profile and Microsoft SQL Server 2008 Express). The features of the implementation of the developed software complex are described, and the relevant charts are given. The deployment scheme and the protocols of data exchange between the 1C server, the 1C client and dealers are presented, along with the functions supported by the 1C module and the .NET module. The article describes the content of the class library developed for the .NET module and specifies how the two applications are integrated into a single software package. The organisation of the GUI is described, with corresponding screenshots. Possible ways of further developing the software complex are outlined, and a conclusion is drawn about its competitiveness and the expediency of further research.
Highly-optimized TWSM software package for seismic diffraction modeling adapted for GPU-cluster
Zyatkov, Nikolay; Ayzenberg, Alena; Aizenberg, Arkady
2015-04-01
Oil producing companies seek to increase the resolution of seismic data for complex oil-and-gas bearing deposits connected with salt domes, basalt traps, reefs, lenses, etc. Known methods of seismic wave theory define the shape of hydrocarbon accumulations with insufficient resolution, since they do not account for multiple diffractions explicitly. We elaborate an alternative seismic wave theory in terms of operators of propagation in layers and reflection-transmission at curved interfaces. An approximation of this theory is realized in the seismic frequency range as the Tip-Wave Superposition Method (TWSM). TWSM, based on the operator theory, allows evaluation of the wavefield in bounded domains/layers with geometrical shadow zones (in nature: salt domes, basalt traps, reefs, lenses, etc.), accounting for so-called cascade diffraction. Cascade diffraction includes edge waves from sharp edges, creeping waves near concave parts of interfaces, whispering-gallery waves near convex parts of interfaces, etc. The basic algorithm of the TWSM package rests on multiplication of large matrices (up to hundreds of terabytes in size). We use advanced information technologies for effective realization of the numerical procedures of TWSM. In particular, we use NVIDIA CUDA technology and GPU accelerators, which significantly improve the performance of the TWSM software package; this is important when applying it to direct and inverse problems. The accuracy, stability and efficiency of the algorithm are justified by numerical examples with curved interfaces. The TWSM package and its separate components can be used in different modeling tasks, such as planning of acquisition systems, physical interpretation of laboratory modeling, and modeling of individual waves of different types, as well as in inverse tasks such as imaging in the case of a laterally inhomogeneous overburden and AVO inversion.
SMOG 2: A Versatile Software Package for Generating Structure-Based Models.
Noel, Jeffrey K; Levi, Mariana; Raghunathan, Mohit; Lammert, Heiko; Hayes, Ryan L; Onuchic, José N; Whitford, Paul C
2016-03-01
Molecular dynamics simulations with coarse-grained or simplified Hamiltonians have proven to be an effective means of capturing the functionally important long-time and large-length scale motions of proteins and RNAs. Originally developed in the context of protein folding, structure-based models (SBMs) have since been extended to probe a diverse range of biomolecular processes, spanning from protein and RNA folding to functional transitions in molecular machines. The hallmark feature of a structure-based model is that part, or all, of the potential energy function is defined by a known structure. Within this general class of models, there exist many possible variations in resolution and energetic composition. SMOG 2 is a downloadable software package that reads user-designated structural information and user-defined energy definitions, in order to produce the files necessary to use SBMs with high performance molecular dynamics packages: GROMACS and NAMD. SMOG 2 is bundled with XML-formatted template files that define commonly used SBMs, and it can process template files that are altered according to the needs of each user. This computational infrastructure also allows for experimental or bioinformatics-derived restraints or novel structural features to be included, e.g. novel ligands, prosthetic groups and post-translational/transcriptional modifications. The code and user guide can be downloaded at http://smog-server.org/smog2.
Pharmacokinetic software for the health sciences: choosing the right package for teaching purposes.
Charles, B G; Duffull, S B
2001-01-01
Computer assisted learning has an important role in the teaching of pharmacokinetics to health sciences students because it transfers the emphasis from the purely mathematical domain to an 'experiential' domain in which graphical and symbolic representations of actions and their consequences form the major focus for learning. Basic pharmacokinetic concepts can be taught by experimenting with the interplay between dose and dosage interval with drug absorption (e.g. absorption rate, bioavailability), drug distribution (e.g. volume of distribution, protein binding) and drug elimination (e.g. clearance) on drug concentrations using library ('canned') pharmacokinetic models. Such 'what if' approaches are found in calculator-simulators such as PharmaCalc, Practical Pharmacokinetics and PK Solutions. Others such as SAAM II, ModelMaker, and Stella represent the 'systems dynamics' genre, which requires the user to conceptualise a problem and formulate the model on-screen using symbols, icons, and directional arrows. The choice of software should be determined by the aims of the subject/course, the experience and background of the students in pharmacokinetics, and institutional factors including price and networking capabilities of the package(s). Enhanced learning may result if the computer teaching of pharmacokinetics is supported by tutorials, especially where the techniques are applied to solving problems in which the link with healthcare practices is clearly established.
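The "what if" experimentation described above (interplay of dose, dosage interval, absorption and elimination) can be reproduced with the standard one-compartment oral-absorption model. The Python sketch below is a generic textbook illustration, not code from any of the packages named, and all parameter values are invented for the example.

```python
import numpy as np

def conc_oral(t, dose, F, V, ka, ke):
    """One-compartment oral dose (Bateman equation):
    C(t) = F*D*ka / (V*(ka - ke)) * (exp(-ke*t) - exp(-ka*t))."""
    return F * dose * ka / (V * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

def conc_multidose(t, tau, n_doses, **kw):
    """Superposition of repeated doses given every tau hours."""
    c = np.zeros_like(t, dtype=float)
    for k in range(n_doses):
        mask = t >= k * tau
        c[mask] += conc_oral(t[mask] - k * tau, **kw)
    return c

# Illustrative regimen: 500 mg every 8 h, six doses.
t = np.linspace(0, 48, 481)            # hours, 0.1 h grid
c = conc_multidose(t, tau=8.0, n_doses=6,
                   dose=500.0, F=0.9, V=30.0, ka=1.2, ke=0.15)
```

Varying `tau` against the elimination half-life (ln 2 / ke) immediately shows the accumulation behaviour that such teaching packages let students explore graphically.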
A software package for evaluating the performance of a star sensor operation
Sarpotdar, Mayuresh; Mathew, Joice; Sreejith, A. G.; K., Nirmal; Ambily, S.; Prakash, Ajin; Safonova, Margarita; Murthy, Jayant
2017-01-01
We have developed a low-cost off-the-shelf component star sensor (StarSense) for use in minisatellites and CubeSats to determine the attitude of a satellite in orbit. StarSense is an imaging camera with a limiting magnitude of 6.5, which extracts information from star patterns it records in the images. The star sensor implements a centroiding algorithm to find centroids of the stars in the image, a Geometric Voting algorithm for star pattern identification, and a QUEST algorithm for attitude quaternion calculation. Here, we describe the software package to evaluate the performance of these algorithms as a single star sensor operating system. We simulate the ideal case where sky background and instrument errors are omitted, and a more realistic case where noise and camera parameters are added to the simulated images. We evaluate such performance parameters of the algorithms as attitude accuracy, calculation time, required memory, star catalog size, sky coverage, etc., and estimate the errors introduced by each algorithm. This software package is written for use in MATLAB. The testing is parametrized for different hardware parameters, such as the focal length of the imaging setup, the field of view (FOV) of the camera, angle measurement accuracy, distortion effects, etc., and can therefore be applied to evaluate the performance of such algorithms in any star sensor. For hardware implementation on our StarSense, we are currently porting the code in the form of functions written in C, with a view to easy implementation on any star sensor electronics hardware.
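The centroiding step mentioned above is commonly implemented as an intensity-weighted mean over thresholded pixels. The following sketch shows that generic technique in Python; it is not the StarSense/MATLAB code, and the synthetic star parameters are illustrative.

```python
import numpy as np

def centroid(image, threshold=0.0):
    """Intensity-weighted centroid (row, col) of pixels above threshold.
    Generic sketch of a star-centroiding step; not the StarSense code."""
    img = np.where(image > threshold, image, 0.0)
    total = img.sum()
    if total == 0:
        raise ValueError("no pixels above threshold")
    rows, cols = np.indices(img.shape)
    return (rows * img).sum() / total, (cols * img).sum() / total

# A symmetric Gaussian star centred at (12.5, 7.5) should centroid there,
# to sub-pixel accuracy -- the property star sensors rely on.
yy, xx = np.mgrid[0:25, 0:25]
star = np.exp(-((yy - 12.5) ** 2 + (xx - 7.5) ** 2) / (2 * 1.5 ** 2))
r, c = centroid(star, threshold=0.01)
```

Sub-pixel centroid accuracy is what ultimately limits the attitude accuracy evaluated by the package, which is why the threshold and noise model matter in the realistic simulation case.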
BaitFisher: A Software Package for Multispecies Target DNA Enrichment Probe Design.
Mayer, Christoph; Sann, Manuela; Donath, Alexander; Meixner, Martin; Podsiadlowski, Lars; Peters, Ralph S; Petersen, Malte; Meusemann, Karen; Liere, Karsten; Wägele, Johann-Wolfgang; Misof, Bernhard; Bleidorn, Christoph; Ohl, Michael; Niehuis, Oliver
2016-07-01
Target DNA enrichment combined with high-throughput sequencing technologies is a powerful approach to probing a large number of loci in genomes of interest. However, software algorithms that explicitly consider nucleotide sequence information of target loci in multiple reference species for optimizing design of target enrichment baits to be applicable across a wide range of species have not been developed. Here we present an algorithm that infers target DNA enrichment baits from multiple nucleotide sequence alignments. By applying clustering methods and the combinatorial 1-center sequence optimization to bait design, we are able to minimize the total number of baits required to efficiently probe target loci in multiple species. Consequently, more loci can be probed across species with a given number of baits. Using transcript sequences of 24 apoid wasps (Hymenoptera: Crabronidae, Sphecidae) from the 1KITE project and the gene models of Nasonia vitripennis, we inferred 57,650, 120-bp-long baits for capturing 378 coding sequence sections of 282 genes in apoid wasps. Illumina reduced-representation library sequencing confirmed successful enrichment of the target DNA when applying these baits to DNA of various apoid wasps. The designed baits furthermore enriched a major fraction of the target DNA in distantly related Hymenoptera, such as Formicidae and Chalcidoidea, highlighting the baits' broad taxonomic applicability. The availability of baits with broad taxonomic applicability is of major interest in numerous disciplines, ranging from phylogenetics to biodiversity monitoring. We implemented our new approach in a software package, called BaitFisher, which is open source and freely available at https://github.com/cmayer/BaitFisher-package.git. © The Author 2016. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
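The combinatorial 1-center optimization mentioned above seeks a bait sequence minimizing the maximum distance to the cluster members. A crude stand-in, shown below, restricts the center to the input sequences themselves, which is a classic 2-approximation of the 1-center optimum under Hamming distance; it is not the BaitFisher algorithm, and the toy sequences are invented.

```python
def hamming(a, b):
    """Hamming distance between two equal-length aligned sequences."""
    return sum(x != y for x, y in zip(a, b))

def center_sequence(seqs):
    """Pick the input sequence minimising the maximum Hamming distance to
    all others -- a simple 2-approximation of the 1-center problem that
    BaitFisher solves combinatorially over all candidate sequences."""
    return min(seqs, key=lambda s: max(hamming(s, t) for t in seqs))

# Toy aligned cluster: the first sequence is within distance 1 of all others.
seqs = ["ACGTACGT", "ACGTACGA", "ACGAACGT", "TCGTACGT"]
center = center_sequence(seqs)
radius = max(hamming(center, t) for t in seqs)
```

Minimising the covering radius of each cluster is what lets a single bait hybridise to the target locus across all reference species, and hence reduces the total bait count.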
Müller, Peter; Krause, Marita; Beck, Rainer; Schmidt, Philip
2017-10-01
Context. The venerable NOD2 data reduction software package for single-dish radio continuum observations, which was developed for use at the 100-m Effelsberg radio telescope, has been successfully applied over many decades. Modern computing facilities, however, call for a new design. Aims: We aim to develop an interactive software tool with a graphical user interface for the reduction of single-dish radio continuum maps. We make a special effort to reduce the distortions along the scanning direction (scanning effects) by combining maps scanned in orthogonal directions or dual- or multiple-horn observations that need to be processed in a restoration procedure. The package should also process polarisation data and offer the possibility to include special tasks written by the individual user. Methods: Based on the ideas of the NOD2 package we developed NOD3, which includes all necessary tasks from the raw maps to the final maps in total intensity and linear polarisation. Furthermore, plot routines and several methods for map analysis are available. The NOD3 package is written in Python, which allows the extension of the package via additional tasks. The required data format for the input maps is FITS. Results: The NOD3 package is a sophisticated tool to process and analyse maps from single-dish observations that are affected by scanning effects from clouds, receiver instabilities, or radio-frequency interference. The "basket-weaving" tool combines orthogonally scanned maps into a final map that is almost free of scanning effects. The new restoration tool for dual-beam observations reduces the noise by a factor of about two compared to the NOD2 version. Combining single-dish with interferometer data in the map plane ensures the full recovery of the total flux density. Conclusions: This software package is available under the open source license GPL for free use at other single-dish radio telescopes of the astronomical community. The NOD3 package is designed to be
The instrument control software package for the Habitable-Zone Planet Finder spectrometer
Bender, Chad F.; Robertson, Paul; Stefansson, Gudmundur Kari; Monson, Andrew; Anderson, Tyler; Halverson, Samuel; Hearty, Frederick; Levi, Eric; Mahadevan, Suvrath; Nelson, Matthew; Ramsey, Larry; Roy, Arpita; Schwab, Christian; Shetrone, Matthew; Terrien, Ryan
2016-08-01
We describe the Instrument Control Software (ICS) package that we have built for The Habitable-Zone Planet Finder (HPF) spectrometer. The ICS controls and monitors instrument subsystems, facilitates communication with the Hobby-Eberly Telescope facility, and provides user interfaces for observers and telescope operators. The backend is built around the asynchronous network software stack provided by the Python Twisted engine, and is linked to a suite of custom hardware communication protocols. This backend is accessed through Python-based command-line and PyQt graphical frontends. In this paper we describe several of the customized subsystem communication protocols that provide access to and help maintain the hardware systems that comprise HPF, and show how asynchronous communication benefits the numerous hardware components. We also discuss our Detector Control Subsystem, built as a set of custom Python wrappers around a C-library that provides native Linux access to the SIDECAR ASIC and Hawaii-2RG detector system used by HPF. HPF will be one of the first astronomical instruments on sky to utilize this native Linux capability through the SIDECAR Acquisition Module (SAM) electronics. The ICS we have created is very flexible, and we are adapting it for NEID, NASA's Extreme Precision Doppler Spectrometer for the WIYN telescope; we will describe this adaptation, and describe the potential for use in other astronomical instruments.
On the Incorrect Statistical Calculations of the Kinetica Software Package in Imbalanced Designs.
Morales-Alcelay, S; de la Torre de Alvarado, J M; García-Arieta, A
2015-07-01
This regulatory note supports the previous findings that suggest that the software package Kinetica, up to version 5.0.10, provides incorrect results for the 90% confidence intervals for the ratio test/reference where the groups are imbalanced in 2 × 2 crossover designs and parallel designs. The incorrect calculation results from using the simplified formula that is shown as an example in the Canadian guideline for a balanced dataset, but which provides an erroneous point estimate and confidence interval width in cases of imbalanced designs. Importantly, this software is rarely used for regulatory submissions in the European Union according to the search conducted in the Spanish Agency for Medicines and Health Care Products. According to our data, the error is minor if the imbalance between groups is small. However, the error may be relevant if the sample size is small and the imbalance is large. Therefore, bioequivalence studies should be reanalyzed by regulatory agencies to confirm the submitted results.
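The defect described, a balanced-design shortcut applied to imbalanced data, can be illustrated on a parallel design: the correct standard error uses 1/n1 + 1/n2, while a shortcut assuming equal group sizes substitutes 2/n with n the average group size. The sketch below (using SciPy, with invented data) contrasts the two; it illustrates the type of error, not a reconstruction of Kinetica's code.

```python
import numpy as np
from scipy import stats

def ci90(log_test, log_ref, assume_balanced=False):
    """90% CI for the test/reference geometric mean ratio in a parallel
    design with pooled variance.  With assume_balanced=True the standard
    error uses the average group size (balanced-design shortcut), which
    is wrong whenever n1 != n2."""
    n1, n2 = len(log_test), len(log_ref)
    diff = log_test.mean() - log_ref.mean()
    sp2 = ((n1 - 1) * log_test.var(ddof=1)
           + (n2 - 1) * log_ref.var(ddof=1)) / (n1 + n2 - 2)
    if assume_balanced:
        se = np.sqrt(2.0 * sp2 / ((n1 + n2) / 2.0))   # shortcut
    else:
        se = np.sqrt(sp2 * (1.0 / n1 + 1.0 / n2))     # correct
    tcrit = stats.t.ppf(0.95, n1 + n2 - 2)
    return np.exp(diff - tcrit * se), np.exp(diff + tcrit * se)

rng = np.random.default_rng(1)
log_test = rng.normal(0.05, 0.2, size=12)   # invented, imbalanced groups
log_ref = rng.normal(0.00, 0.2, size=24)
lo_ok, hi_ok = ci90(log_test, log_ref)
lo_bad, hi_bad = ci90(log_test, log_ref, assume_balanced=True)
```

By the AM-HM inequality, 1/n1 + 1/n2 >= 4/(n1 + n2) with equality only for balanced groups, so the shortcut interval is never wider than the correct one and overstates the evidence for bioequivalence in imbalanced designs.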
INSPECT: A graphical user interface software package for IDARC-2D
AlHamaydeh, Mohammad; Najib, Mohamad; Alawnah, Sameer
Modern-day Performance-Based Earthquake Engineering (PBEE) pivots on nonlinear analysis and its feasibility. IDARC-2D is a widely used and accepted software package for nonlinear analysis; it possesses many attractive features and capabilities. However, it is operated from the command prompt in DOS/Unix systems and requires the user to create elaborate text-based input files. To complement and facilitate the use of IDARC-2D, a pre-processing GUI software package (INSPECT) is introduced herein. INSPECT is created in the C# environment and utilizes the .NET libraries and an SQLite database. Extensive testing and verification demonstrated successful and high-fidelity re-creation of several existing IDARC-2D input files. Its design and built-in features aim at expediting, simplifying and assisting in the modeling process. Moreover, this practical aid enhances the reliability of the results and improves accuracy by reducing and/or eliminating many potential and common input mistakes. Such benefits would be appreciated by novice and veteran IDARC-2D users alike.
Okumura, Akira; Rulten, Cameron
2016-01-01
We have developed a non-sequential ray-tracing simulation library, the ROOT-based simulator for ray tracing (ROBAST), intended for wide use in optical simulations of cosmic-ray (CR) and gamma-ray telescopes. The library is written in C++ and fully utilizes the geometry library of the ROOT framework. Despite the importance of optics simulations in CR experiments, no open-source ray-tracing software that can be widely used in the community had existed. To spare research groups the redundant effort of developing multiple ray-tracing simulators, we have successfully used ROBAST for many years to perform optics simulations for the Cherenkov Telescope Array (CTA). Among the six proposed telescope designs for CTA, ROBAST is currently used for three: a Schwarzschild-Couder (SC) medium-sized telescope, one of the SC small-sized telescopes, and a large-sized telescope (LST). ROBAST is also used for the simulation and development of hexagonal light concentrators propose...
Testing the validity of the ray-tracing code GYOTO
Grould, Marion; Perrin, Guy
2016-01-01
In the next few years, the near-infrared interferometer GRAVITY will be able to observe the Galactic center. Astrometric data will be obtained with an anticipated accuracy of 10 $\mu$as. To analyze these future data, we have developed a code called GYOTO to compute orbits and images. We want to assess the validity and accuracy of GYOTO in a variety of contexts, in particular for stellar astrometry in the Galactic center. Furthermore, we want to tackle and complete a study made on the astrometric displacements that are due to lensing effects of a star of the central parsec with GYOTO. We first validate GYOTO in the weak-deflection limit (WDL) by studying primary caustics and primary critical curves obtained for a Kerr black hole. We compare GYOTO results to available analytical approximations and estimate GYOTO errors using an intrinsic estimator. In the strong-deflection limit (SDL), we choose to compare null geodesics computed by GYOTO and the ray-tracing code named Geokerr. Finally, we use GYOTO to estimate...
High performance dosimetry calculations using adapted ray-tracing
Perrotte, Lancelot; Saupin, Guillaume
2010-11-01
When preparing interventions on nuclear sites, it is interesting to study different scenarios in order to identify the most appropriate one for the operator(s). Using virtual reality tools is a good way to simulate the potential scenarios, and very efficient computation times help the user study different complex scenarios by immediately evaluating the impact of any change. In the field of radiation protection, computation codes are often based on the straight-line attenuation method with build-up factors. As with other approaches, the geometrical computations (finding all the intersections between radiation rays and the scene objects) remain the bottleneck of the simulation. We present in this paper several optimizations used to speed up these geometrical computations, using innovative GPU ray-tracing algorithms. For instance, we manage to compute every intersection between 600 000 rays and a huge 3D industrial scene in a fraction of a second. Moreover, our algorithm works the same way for both static and dynamic scenes, allowing easier study of complex intervention scenarios (where everything moves: the operator(s), the shielding objects, the radiation sources).
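The geometric bottleneck described, intersecting rays with scene objects to obtain shielding path lengths before applying straight-line attenuation with a build-up factor, can be sketched for a single ray and an axis-aligned box using the classic slab method. This CPU-side Python illustration only mirrors the structure of the computation; the paper's GPU algorithms and scene complexity go far beyond it, and all numbers are invented.

```python
import math

def ray_box_path_length(origin, direction, box_min, box_max):
    """Length of the segment of a ray (unit direction) inside an
    axis-aligned box, via the classic slab method."""
    tmin, tmax = 0.0, math.inf
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if abs(d) < 1e-12:                  # ray parallel to this slab pair
            if o < lo or o > hi:
                return 0.0
            continue
        t1, t2 = (lo - o) / d, (hi - o) / d
        tmin, tmax = max(tmin, min(t1, t2)), min(tmax, max(t1, t2))
    return max(0.0, tmax - tmin)

def dose_rate(strength, mu, origin, direction, box_min, box_max, distance):
    """Straight-line attenuation through one shield (build-up B = 1 here):
    D = B * S * exp(-mu * d) / (4 * pi * r^2)."""
    d = ray_box_path_length(origin, direction, box_min, box_max)
    return strength * math.exp(-mu * d) / (4.0 * math.pi * distance ** 2)

# Ray along +x crossing a 2 m thick shield (mu = 0.5 per metre).
length = ray_box_path_length((0, 0, 0), (1, 0, 0), (1, -1, -1), (3, 1, 1))
shielded = dose_rate(1e9, 0.5, (0, 0, 0), (1, 0, 0), (1, -1, -1), (3, 1, 1), 5.0)
```

One such intersection test per ray-object pair, over hundreds of thousands of rays and a full CAD scene, is exactly the workload the paper maps onto GPU ray-tracing kernels.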
Fast Ray Tracing of Lunar Digital Elevation Models
McClanahan, Timothy P.; Evans, L. G.; Starr, R. D.; Mitrofanov, I.
2009-01-01
Ray-tracing (RT) of lunar Digital Elevation Models (DEMs) is performed to virtually derive the degree of radiation incident on terrain as a function of time, orbital and ephemeris constraints [1-4]. This is an integral modeling step in lunar polar research and exploration, owing to the present paucity of terrain information at the poles and to mission planning activities for the anticipated spring 2009 launch of the Lunar Reconnaissance Orbiter (LRO). As part of the Lunar Exploration Neutron Detector (LEND) and Lunar Crater Observation and Sensing Satellite (LCROSS) preparations, RT methods are used to estimate the critical conditions presented by the combined effects of high latitude, terrain and the Moon's low obliquity [5-7]. These factors yield low incident solar illumination and consequently extreme thermal and radiation conditions. The presented research uses RT methods both for radiation transport modeling in space- and regolith-related research and to derive permanently shadowed regions (PSRs) in high-latitude topographic minima, e.g. craters. These regions are of scientific and human exploration interest due to the near-constant low temperatures in PSRs, inferred to be < 100 K. Hydrogen is thought to have accumulated in PSRs through the combined effects of periodic cometary bombardment and/or solar wind processes, and the extreme cold, which minimizes hydrogen sublimation [8-9]. RT methods are also of use in optimizing surface positions for future illumination-dependent surface resources, e.g. power and communications equipment.
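Along a single solar azimuth, the core shadowing test behind PSR derivation reduces to checking whether any up-sun terrain rises above the sun ray through each cell. The Python sketch below shows that 1-D reduction on an invented height profile; production DEM ray-tracers work in 2-D with ephemeris-driven sun positions and are far more optimized.

```python
import numpy as np

def shadowed(profile, dx, sun_elev_deg):
    """Mark cells of a 1-D terrain profile (heights sampled along the
    solar azimuth, sun shining from the start of the array) that lie in
    cast shadow.  Simplified stand-in for DEM illumination ray-tracing."""
    tan_e = np.tan(np.radians(sun_elev_deg))
    shadow = np.zeros(len(profile), dtype=bool)
    for i in range(len(profile)):
        # cell i is shadowed if any up-sun cell j blocks the sun ray
        for j in range(i):
            if profile[j] - profile[i] > (i - j) * dx * tan_e:
                shadow[i] = True
                break
    return shadow

# A single 5 m peak at 30 deg sun elevation casts a shadow behind it.
profile = np.array([0, 0, 5, 0, 0, 0, 0], dtype=float)
shade = shadowed(profile, dx=1.0, sun_elev_deg=30.0)
```

Repeating this test over a full range of sun azimuths and elevations, and intersecting the results, yields the permanently shadowed set.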
Distance measurement based on light field geometry and ray tracing.
Chen, Yanqin; Jin, Xin; Dai, Qionghai
2017-01-09
In this paper, we propose a geometric optical model to measure the distances of object planes in a light field image. The proposed geometric optical model is composed of two sub-models based on ray tracing: object space model and image space model. The two theoretic sub-models are derived on account of on-axis point light sources. In object space model, light rays propagate into the main lens and refract inside it following the refraction theorem. In image space model, light rays exit from emission positions on the main lens and subsequently impinge on the image sensor with different imaging diameters. The relationships between imaging diameters of objects and their corresponding emission positions on the main lens are investigated through utilizing refocusing and similar triangle principle. By combining the two sub-models together and tracing light rays back to the object space, the relationships between objects' imaging diameters and corresponding distances of object planes are figured out. The performance of the proposed geometric optical model is compared with existing approaches using different configurations of hand-held plenoptic 1.0 cameras and real experiments are conducted using a preliminary imaging system. Results demonstrate that the proposed model can outperform existing approaches in terms of accuracy and exhibits good performance at general imaging range.
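The model above chains refocusing and similar-triangle relations; the two textbook ingredients underneath, the thin-lens equation and the blur-spot diameter obtained by similar triangles through the aperture, can be sketched as follows (illustrative values, not the authors' calibration).

```python
def object_distance(f, image_distance):
    """Thin-lens equation 1/f = 1/d_o + 1/d_i, solved for the object distance."""
    return 1.0 / (1.0 / f - 1.0 / image_distance)

def blur_diameter(aperture, image_distance, sensor_distance):
    """Diameter of the blur spot on a sensor displaced from the image
    plane, by similar triangles through the lens aperture."""
    return aperture * abs(sensor_distance - image_distance) / image_distance

# Illustrative: an f = 50 mm lens imaging at d_i = 75 mm sees an object
# 150 mm away; a sensor at 60 mm records a blur spot scaled by the
# 25 mm aperture.
d_o = object_distance(50.0, 75.0)
blur = blur_diameter(25.0, 75.0, 60.0)
```

Inverting the second relation, measured imaging diameters constrain the image distance, and the thin-lens equation then yields the object-plane distance, which is the chain of reasoning the paper formalizes for a plenoptic camera.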
PKQuest_Java: free, interactive physiologically based pharmacokinetic software package and tutorial
Levitt David G
2009-08-01
Background: Physiologically based pharmacokinetics (PBPK) uses a realistic organ model to describe drug kinetics. The blood-tissue exchange of each organ is characterized by its volume, perfusion, metabolism, capillary permeability and blood/tissue partition coefficient. PBPK applications require both sophisticated mathematical modeling software and a reliable, complete set of physiological parameters. Currently there are no software packages available that combine ease of use with the versatility required of a general PBPK program. Findings: The program is written in Java and is available for free download at http://www.pkquest.com/. Included in the download is a detailed tutorial that discusses the pharmacokinetics of 6 solutes (D2O, amoxicillin, desflurane, propofol, ethanol and thiopental) illustrated using experimental human pharmacokinetic data. The complete PBPK description for each solute is stored in Excel spreadsheets included in the download. The main features of the program are: 1) an intuitive and versatile interactive interface; 2) absolute and semi-logarithmic graphical output; 3) a pre-programmed optimized human parameter data set (arbitrary values can also be input); 4) time-dependent changes in the PBPK parameters; 5) non-linear parameter optimization; 6) a unique approach to determine the oral "first pass metabolism" of non-linear solutes (e.g. ethanol); 7) pulmonary perfusion/ventilation heterogeneity for volatile solutes; 8) input and output of Excel spreadsheet data; 9) antecubital vein sampling. Conclusion: PKQuest_Java is a free, easy to use, interactive PBPK software routine. The user can either directly use the pre-programmed optimized human or rat data set, or enter an arbitrary data set. It is designed so that drugs classified as "extracellular" or "highly fat soluble" do not require information about tissue/blood partition coefficients and can be modeled with a minimum of user input parameters. PKQuest_Java, along with the included tutorial, could be...
PROOST, JH; MEIJER, DKF
1992-01-01
The pharmacokinetic software package MW/Pharm offers an interactive, user-friendly program which gives rapid answers in clinical practice. It comprises a database with pharmacokinetic parameters of 180 drugs, a medication history database, and procedures for an individual drug dosage regimen calculation.
Application of ray-traced tropospheric slant delays to geodetic VLBI analysis
Hofmeister, Armin; Böhm, Johannes
2017-02-01
The correction of tropospheric influences via so-called path delays is critical for the analysis of observations from space geodetic techniques such as very long baseline interferometry (VLBI). In standard VLBI analysis, the a priori slant path delays are determined using the concept of zenith delays, mapping functions and gradients. The a priori use of ray-traced delays, i.e., tropospheric slant path delays determined with the technique of ray-tracing through the meteorological data of numerical weather models (NWM), serves as an alternative way of correcting the influences of the troposphere on the VLBI observations within the analysis. In the presented research, the application of ray-traced delays to the VLBI analysis of sessions in a time span of 16.5 years is investigated. Ray-traced delays have been determined with program RADIATE (see Hofmeister in Ph.D. thesis, Department of Geodesy and Geophysics, Faculty of Mathematics and Geoinformation, Technische Universität Wien. http://resolver.obvsg.at/urn:nbn:at:at-ubtuw:1-3444, 2016) utilizing meteorological data provided by NWM of the European Centre for Medium-Range Weather Forecasts (ECMWF). In comparison with a standard VLBI analysis, which includes the tropospheric gradient estimation, the application of the ray-traced delays to an analysis, which uses the same parameterization except for the a priori slant path delay handling and the wet mapping factors used for the zenith wet delay (ZWD) estimation, improves the baseline length repeatability (BLR) at 55.9% of the baselines at sub-mm level. If no tropospheric gradients are estimated within the compared analyses, 90.6% of all baselines benefit from the application of the ray-traced delays, which leads to an average improvement of the BLR of 1 mm. The effects of the ray-traced delays on the terrestrial reference frame are also investigated. A separate assessment of the RADIATE ray-traced delays is carried out by comparison to the ray-traced delays from the
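The conventional a priori model that ray-traced delays replace combines zenith delays, mapping functions, and horizontal gradients. A minimal sketch of that combination follows; the simple 1/sin(e) mapping function and the Chen-Herring-style gradient mapping here are illustrative stand-ins for the operational mapping functions, and all constants are assumptions, not values from RADIATE.

```python
import math

def slant_delay(elev_deg, azim_deg, zhd, zwd, grad_n=0.0, grad_e=0.0):
    """Conventional a priori slant path delay (metres): hydrostatic and wet
    zenith delays mapped to the line of sight, plus an azimuth-dependent
    horizontal-gradient term.  Crude 1/sin(e) mapping functions stand in
    for the operational ones (VMF/GMF); constants are illustrative."""
    e = math.radians(elev_deg)
    a = math.radians(azim_deg)
    mf = 1.0 / math.sin(e)                                # crude mapping function
    mf_grad = 1.0 / (math.sin(e) * math.tan(e) + 0.0032)  # Chen-Herring form
    return mf * (zhd + zwd) + mf_grad * (grad_n * math.cos(a) + grad_e * math.sin(a))

# Typical magnitudes: ~2.3 m hydrostatic and ~0.15 m wet zenith delay
print(slant_delay(90.0, 0.0, 2.3, 0.15))                  # zenith: just zhd + zwd
print(slant_delay(10.0, 45.0, 2.3, 0.15, grad_n=0.001, grad_e=0.0005))
```

At low elevations the slant delay grows to many times the zenith delay, which is why the choice between mapping functions and ray-traced delays matters most for low-elevation observations.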
Tuncbag, Nurcan; Gosline, Sara J C; Kedaigle, Amanda; Soltis, Anthony R; Gitter, Anthony; Fraenkel, Ernest
2016-04-01
High-throughput, 'omic' methods provide sensitive measures of biological responses to perturbations. However, inherent biases in high-throughput assays make it difficult to interpret experiments in which more than one type of data is collected. In this work, we introduce Omics Integrator, a software package that takes a variety of 'omic' data as input and identifies putative underlying molecular pathways. The approach applies advanced network optimization algorithms to a network of thousands of molecular interactions to find high-confidence, interpretable subnetworks that best explain the data. These subnetworks connect changes observed in gene expression, protein abundance or other global assays to proteins that may not have been measured in the screens due to inherent bias or noise in measurement. This approach reveals unannotated molecular pathways that would not be detectable by searching pathway databases. Omics Integrator also provides an elegant framework to incorporate not only positive data, but also negative evidence. Incorporating negative evidence allows Omics Integrator to avoid unexpressed genes and avoid being biased toward highly-studied hub proteins, except when they are strongly implicated by the data. The software is comprised of two individual tools, Garnet and Forest, that can be run together or independently to allow a user to perform advanced integration of multiple types of high-throughput data as well as create condition-specific subnetworks of protein interactions that best connect the observed changes in various datasets. It is available at http://fraenkel.mit.edu/omicsintegrator and on GitHub at https://github.com/fraenkel-lab/OmicsIntegrator.
Montoya, R. J.; Lane, H. H., Jr.
1986-01-01
A software system that integrates an ADAGE 3000 Programmable Display Generator into a C.A.D. software package known as the Solid Modeling Program is described. The Solid Modeling Program (SMP) is an interactive program that is used to model complex solid objects through the composition of primitive geometric entities. In addition, SMP provides extensive facilities for model editing and display. The ADAGE 3000 Programmable Display Generator (PDG) is a color, raster scan, programmable display generator with a 32-bit bit-slice, bipolar microprocessor (BPS). The modularity of the system architecture and the width and speed of the system bus allow for additional co-processors in the system. These co-processors combine to provide efficient operations on and rendering of graphics entities. The resulting software system takes advantage of the graphics capabilities of the PDG in the operation of SMP by distributing its processing modules between the host and the PDG. Initially, the target host computer was a PRIME 850, which was later replaced by a VAX-11/785. Two versions of the software system were developed, a phase I and a phase II. In phase I, the ADAGE 3000 is used as a frame buffer. In phase II, SMP was functionally partitioned and some of its functions were implemented in the ADAGE 3000 by means of ADAGE's SOLID 3000 software package.
Nguyen, Ba Nghiep; Jin, Xiaoshi; Wang, Jin; Phelps, Jay; Tucker III, Charles L.; Kunc, Vlastimil; Bapanapalli, Satish K.; Smith, Mark T.
2010-02-23
This report describes the work conducted under the Cooperative Research and Development Agreement (CRADA) (Nr. 260) between the Pacific Northwest National Laboratory (PNNL) and Autodesk, Inc. to develop and implement process models for injection-molded long-fiber thermoplastics (LFTs) in processing software packages. The structure of this report is organized as follows. After the Introduction Section (Section 1), Section 2 summarizes the current fiber orientation models developed for injection-molded short-fiber thermoplastics (SFTs). Section 3 provides an assessment of these models to determine their capabilities and limitations, and the developments needed for injection-molded LFTs. Section 4 then focuses on the development of a new fiber orientation model for LFTs. This model is termed the anisotropic rotary diffusion - reduced strain closure (ARD-RSC) model as it explores the concept of anisotropic rotary diffusion to capture the fiber-fiber interaction in long-fiber suspensions and uses the reduced strain closure method of Wang et al. to slow down the orientation kinetics in concentrated suspensions. In contrast to fiber orientation modeling, before this project, no standard model was developed to predict the fiber length distribution in molded fiber composites. Section 5 is therefore devoted to the development of a fiber length attrition model in the mold. Sections 6 and 7 address the implementations of the models in AMI, and the conclusions drawn from this work are presented in Section 8.
Developing a new software package for PSF estimation and fitting of adaptive optics images
Schreiber, Laura; Diolaiti, Emiliano; Sollima, Antonio; Arcidiacono, Carmelo; Bellazzini, Michele; Ciliegi, Paolo; Falomo, Renato; Foppiani, Italo; Greggio, Laura; Lanzoni, Barbara; Lombini, Matteo; Montegriffo, Paolo; Dalessandro, Emanuele; Massari, Davide
2012-07-01
Adaptive Optics (AO) images are characterized by a structured Point Spread Function (PSF), with sharp core and extended halo, and by significant variations across the field of view. In order to enable the extraction of high-precision quantitative information and improve the scientific exploitation of AO data, efforts in PSF modeling and in the integration of suitable models in a code for image analysis are needed. We present the current status of a study on the modeling of AO PSFs based on observational data taken with present telescopes (VLT and LBT). The methods under development include parametric models and hybrid (i.e. analytical / numerical) models adapted to various types of PSFs that can show up in AO images. The specific features of AO data, such as the mainly radial variation of the PSF with respect to the guide star position in single-reference AO, are taken into account as much as possible. The final objective of this project is the development of a flexible software package, based on the Starfinder code (Diolaiti et al. 2000), specifically dedicated to the PSF estimation and to the astrometric and photometric analysis of AO images with complex and spatially variable PSF.
A software package for evaluating the performance of a star sensor operation
Sarpotdar, Mayuresh; Sreejith, A G; Nirmal, K; Ambily, S; Prakash, Ajin; Safonova, Margarita; Murthy, Jayant
2016-01-01
We have developed a low-cost off-the-shelf component star sensor (StarSense) for use in minisatellites and CubeSats to determine the attitude of a satellite in orbit. StarSense is an imaging camera with a limiting magnitude of 6.5, which extracts information from star patterns it records in the images. The star sensor implements a centroiding algorithm to find centroids of the stars in the image, a Geometric Voting algorithm for star pattern identification, and a QUEST algorithm for attitude quaternion calculation. Here, we describe the software package developed to evaluate the performance of these algorithms operating together as a single star sensor system. We simulate the ideal case where sky background and instrument errors are omitted, and a more realistic case where noise and camera parameters are added to the simulated images. We evaluate such performance parameters of the algorithms as attitude accuracy, calculation time, required memory, star catalog size, sky coverage, etc., and estimate the errors introduced by each...
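The first stage of such a pipeline, centroiding, can be sketched as an intensity-weighted centre of mass over a star's pixel patch. The function below is an illustrative toy, not the StarSense implementation; the threshold handling and the synthetic star image are assumptions.

```python
import numpy as np

def centroid(image, threshold=0.0):
    """Intensity-weighted centroid (centre of mass) of a star image patch,
    the first stage of a star-sensor pipeline.  Pixels at or below
    `threshold` are zeroed to suppress background.  Illustrative sketch."""
    img = np.where(image > threshold, image, 0.0).astype(float)
    total = img.sum()
    ys, xs = np.indices(img.shape)          # row (y) and column (x) indices
    return (xs * img).sum() / total, (ys * img).sum() / total

# A synthetic 5x5 "star": a symmetric blob centred on pixel (2, 2)
star = np.array([[0, 0, 0, 0, 0],
                 [0, 1, 2, 1, 0],
                 [0, 2, 8, 2, 0],
                 [0, 1, 2, 1, 0],
                 [0, 0, 0, 0, 0]], dtype=float)
cx, cy = centroid(star)
print(cx, cy)   # → 2.0 2.0 for this symmetric blob
```

Because the weighting uses the full intensity profile, the centroid resolves star positions to a small fraction of a pixel, which is what makes sub-arcsecond attitude solutions possible from modest cameras.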
ELAN: A Software Package for Analysis and Visualization of MEG, EEG, and LFP Signals
Pierre-Emmanuel Aguera
2011-01-01
The recent surge in computational power has led to extensive methodological developments and advanced signal processing techniques that play a pivotal role in neuroscience. In particular, the field of brain signal analysis has witnessed a strong trend towards multidimensional analysis of large data sets, for example, single-trial time-frequency analysis of high spatiotemporal resolution recordings. Here, we describe the freely available ELAN software package which provides a wide range of signal analysis tools for electrophysiological data including scalp electroencephalography (EEG), magnetoencephalography (MEG), intracranial EEG, and local field potentials (LFPs). The ELAN toolbox is based on 25 years of methodological developments at the Brain Dynamics and Cognition Laboratory in Lyon and was used in many papers including the very first studies of time-frequency analysis of EEG data exploring evoked and induced oscillatory activities in humans. This paper provides an overview of the concepts and functionalities of ELAN, highlights its specificities, and describes its complementarity and interoperability with other toolboxes.
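The single-trial time-frequency analysis mentioned above is commonly done by convolving the signal with complex Morlet wavelets and taking the squared magnitude. The numpy-only sketch below illustrates that general technique; the function name, normalisation, and parameters are illustrative assumptions and do not reproduce ELAN's actual implementation.

```python
import numpy as np

def morlet_tfr(signal, fs, freqs, n_cycles=7.0):
    """Single-trial time-frequency power map via complex Morlet wavelet
    convolution.  Returns an array of shape (len(freqs), len(signal)).
    A numpy-only illustration of the general method, not ELAN's code."""
    sig = np.asarray(signal, float)
    power = np.empty((len(freqs), sig.size))
    for i, f in enumerate(freqs):
        sigma_t = n_cycles / (2.0 * np.pi * f)        # wavelet width in time
        t = np.arange(-4 * sigma_t, 4 * sigma_t, 1.0 / fs)
        wavelet = np.exp(2j * np.pi * f * t) * np.exp(-t**2 / (2 * sigma_t**2))
        wavelet /= np.abs(wavelet).sum()              # crude amplitude normalisation
        power[i] = np.abs(np.convolve(sig, wavelet, mode="same")) ** 2
    return power

# 2 s of a 10 Hz oscillation sampled at 250 Hz: power concentrates at 10 Hz
fs = 250.0
t = np.arange(0, 2, 1 / fs)
sig = np.sin(2 * np.pi * 10 * t)
tfr = morlet_tfr(sig, fs, freqs=[5.0, 10.0, 20.0])
print(tfr.mean(axis=1))   # the middle (10 Hz) row dominates
```

The fixed number of cycles per wavelet trades temporal resolution at low frequencies for spectral resolution at high frequencies, the usual compromise in evoked/induced oscillation studies.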
SUPCRTBL: A revised and extended thermodynamic dataset and software package of SUPCRT92
Zimmer, Kurt; Zhang, Yilun; Lu, Peng; Chen, Yanyan; Zhang, Guanru; Dalkilic, Mehmet; Zhu, Chen
2016-05-01
The computer-enabled thermodynamic database associated with SUPCRT92 (Johnson et al., 1992) enables the calculation of the standard molal thermodynamic properties of minerals, gases, aqueous species, and reactions for a wide range of temperatures and pressures. However, new data on the thermodynamic properties of both aqueous species and minerals have become available since the database's initial release in 1992 and its subsequent updates. In light of these developments, we have expanded SUPCRT92's thermodynamic dataset and have modified the accompanying computer code for thermodynamic calculations by using newly available properties. The modifications in our new version include: (1) updating the standard state thermodynamic properties for mineral end-members with properties from Holland and Powell (2011) to improve the study of metamorphic petrology and economic geology; (2) adding As-acid, As-metal aqueous species, and As-bearing minerals to improve the study of environmental geology; (3) updating properties for Al-bearing species, SiO2° (aq) and HSiO3- , boehmite, gibbsite, and dawsonite for modeling geological carbon sequestration. The new thermodynamic dataset and the modified SUPCRT92 program were implemented in a software package called SUPCRTBL, which is available online at
Yang, Que; Wang, Shanshan; Wang, Kai; Zhang, Chunyu; Zhang, Lu; Meng, Qingyu; Zhu, Qiudong
2015-08-01
For normal eyes without a history of ocular surgery, traditional equations for calculating intraocular lens (IOL) power, such as SRK-T, Holladay, Haigis, and SRK-II, were all relatively accurate. However, for eyes that have undergone refractive surgeries such as LASIK, or eyes diagnosed with keratoconus, these equations may cause significant postoperative refractive error, which may cause poor satisfaction after cataract surgery. Although some methods have been carried out to solve this problem, such as the Haigis-L equation[1], or using preoperative data (data before LASIK) to estimate the K value[2], no precise equations were available for these eyes. Here, we introduced a novel intraocular lens power estimation method by accurate ray tracing with the optical design software ZEMAX. Instead of using a traditional regression formula, we adopted the exact measured corneal elevation distribution, central corneal thickness, anterior chamber depth, axial length, and estimated effective lens plane as the input parameters. The calculated intraocular lens power for a patient with keratoconus and for another LASIK postoperative patient agreed very well with their visual performance after cataract surgery.
Kjartansson, Einar; Bjarnason, Ingi Th.
2017-04-01
Tools for ray-tracing through one dimensional earth models consisting of layers of constant velocity gradients, and continuous values across layers, have been developed. They are used to investigate stability and robustness of earthquake locations and velocity determinations in the South Iceland Lowlands (SIL), a transform seismic zone. These tools will also be used to invert for velocity functions for different regions and time periods, by inverting simultaneously for micro-earthquake source parameters and P and S velocities. Increase of velocity gradient with depth will cause rays with different take-off angles to cross, which can result in focusing and triplication when velocity is plotted versus time. It is therefore important to constrain the velocity solutions to avoid this. Large changes in gradient between adjacent layers cause variability of ray density and geometrical spreading, particularly for rays that turn just below the boundaries. This may create artificial clustering in the depth distribution of micro-earthquake source solutions. Resampling of the velocity functions using cubic spline interpolation can be used to reduce these effects. The software is open source and can be accessed at https://github.com/4dseismic
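Ray tracing through a one-dimensional layered model rests on Snell's law: the ray parameter p = sin(i)/v is constant along the ray. The sketch below uses constant-velocity layers as a simplification of the constant-gradient layers described above (where rays follow circular arcs rather than straight segments); the layer values are made up for illustration.

```python
import math

def trace_ray(p, layers):
    """Trace a down-going ray with ray parameter p (s/km) through flat
    constant-velocity layers given as [(thickness_km, velocity_km_s), ...].
    Returns (horizontal_offset_km, travel_time_s) at the bottom of the stack.
    Constant-velocity layers are a simplification of the constant-gradient
    layers used in the SIL tools; Snell's law (p = sin(i)/v) is the same."""
    x = t = 0.0
    for h, v in layers:
        s = p * v                       # sin(incidence angle) in this layer
        if s >= 1.0:                    # ray turns before reaching layer bottom
            raise ValueError("ray turns within this layer")
        c = math.sqrt(1.0 - s * s)      # cos(incidence angle)
        x += h * s / c                  # horizontal advance across the layer
        t += h / (v * c)                # travel time through the layer
    return x, t

# Crude two-layer crust: 5 km at 5 km/s over 10 km at 6.5 km/s (illustrative)
offset, time = trace_ray(0.10, [(5.0, 5.0), (10.0, 6.5)])
print(offset, time)
```

Rays with p close to 1/v of a layer bend strongly and turn near its top, which is exactly the regime where the abstract notes ray density and geometrical spreading become unstable.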
Virtual Ray Tracing as a Conceptual Tool for Image Formation in Mirrors and Lenses
Heikkinen, Lasse; Savinainen, Antti; Saarelainen, Markku
2016-12-01
The ray tracing method is widely used in teaching geometrical optics at the upper secondary and university levels. However, using simple and straightforward examples may lead to a situation in which students use the model of ray tracing too narrowly. Previous studies show that students seem to use the ray tracing method too concretely instead of as a conceptual model. This suggests that introductory physics students need to understand the nature of the ray model more profoundly. In this paper, we show how a virtual ray tracing model can be used as a tool for image formation in more complex and unconventional cases. We believe that this tool has potential in helping students to better appreciate the nature of the ray model.
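The conceptual model behind the ray diagrams is the thin-lens equation, 1/f = 1/d_o + 1/d_i, where the sign of the image distance distinguishes the real from the virtual images that students find unconventional. A minimal sketch under the real-is-positive sign convention (the convention choice is an assumption):

```python
def thin_lens_image(f, d_o):
    """Image distance and magnification for a thin lens
    (real-is-positive convention): 1/f = 1/d_o + 1/d_i.
    d_i > 0 is a real image on the far side; d_i < 0 a virtual image."""
    if d_o == f:
        return None, None               # rays emerge parallel: image at infinity
    d_i = 1.0 / (1.0 / f - 1.0 / d_o)
    m = -d_i / d_o                      # negative m means an inverted image
    return d_i, m

print(thin_lens_image(10.0, 30.0))   # real, inverted, reduced: ≈ (15.0, -0.5)
print(thin_lens_image(10.0, 5.0))    # virtual, upright, magnified: ≈ (-10.0, 2.0)
```

Cases like the object inside the focal length (second call) are precisely where a virtual ray-tracing tool helps: the drawn rays diverge, and the "image" exists only where their backward extensions meet.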
Study of improved ray tracing parallel algorithm for CGH of 3D objects on GPU
Cong, Bin; Jiang, Xiaoyu; Yao, Jun; Zhao, Kai
2014-11-01
An improved parallel algorithm for computer generated holograms (CGH) of three-dimensional objects is presented. According to the physical characteristics and mathematical properties of the original ray tracing algorithm for CGH, using transform approximation and numerical analysis methods, we extract the parts of the ray tracing algorithm that are amenable to parallelization and implement them on a graphics processing unit (GPU). Meanwhile, through proper design of the parallel numerical procedure, we parallelized the processing of the two-dimensional slices of the three-dimensional object with CUDA. According to the experiments, an effective method of dealing with the occlusion problem in ray tracing is proposed, as well as a way of generating the holograms of 3D objects with the additive property. Our results indicate that the improved algorithm can effectively shorten the computing time. Due to the different sizes of spatial object points and hologram pixels, the speed increases 20 to 70 times compared with the original ray tracing algorithm.
Magnetospherically reflected chorus waves revealed by ray tracing with CLUSTER data
M. Parrot
This paper is related to the propagation characteristics of a chorus emission recorded simultaneously by the 4 satellites of the CLUSTER mission on 29 October 2001 between 01:00 and 05:00 UT. During this day, the spacecraft SC1, SC2, and SC4 are relatively close to each other, but SC3 has been delayed by half an hour. We use the data recorded aboard CLUSTER by the STAFF spectrum analyser. This instrument provides the cross spectral matrix of three magnetic and two electric field components. Dedicated software processes this spectral matrix in order to determine the wave normal directions relative to the Earth’s magnetic field. This calculation is done for the 4 satellites at different times and different frequencies and allows us to check the directions of these waves. Measurements around the magnetic equator show that the parallel component of the Poynting vector changes its sign when the satellites cross the equator region. It indicates that the chorus waves propagate away from this region, which is considered the source area of these emissions. This is valid for the most intense waves observed on the magnetic and electric power spectrograms. But it is also observed on SC1, SC2, and SC4 that lower intensity waves propagate toward the equator simultaneously with the SC3 intense chorus waves propagating away from the equator. Both waves are at the same frequency. Using the wave normal directions of these waves, a ray tracing study shows that the waves observed by SC1, SC2, and SC4 cross the equatorial plane at the same location as the waves observed by SC3. SC3, which is 30 minutes late, observes the waves that originate first from the equator; meanwhile, SC1, SC2, and SC4 observe the same waves that have suffered a Lower Hybrid Resonance (LHR) reflection at low altitudes (based on the ray tracing analysis) and now return to the equator at a different location with a lower intensity. A similar phenomenon is observed when all SC are on the other side
Validation of Three-Dimensional Ray-Tracing Algorithm for Indoor Wireless Propagations
Majdi Salem; Mahamod Ismail; Norbahiah Misran
2011-01-01
A 3D ray tracing simulator has been developed for indoor wireless networks. The simulator uses geometrical optics (GO) to propagate the electromagnetic waves inside the buildings. The prediction technique takes into account multiple reflections and transmissions of the propagated waves. An interpolation prediction method (IPM) has been proposed to predict the propagated signal and to make the ray-tracing algorithm faster, more accurate, and simpler. The measurements have been achieved by using a s...
Fox, Christopher; Romeijn, H Edwin; Dempsey, James F
2006-05-01
We present work on combining three algorithms to improve ray-tracing efficiency in radiation therapy dose computation. The three algorithms include: an improved point-in-polygon algorithm, an incremental voxel ray tracing algorithm, and stereographic projection of beamlets for voxel truncation. The point-in-polygon and incremental voxel ray-tracing algorithms have been used in computer graphics and nuclear medicine applications, while the stereographic projection algorithm was developed by our group. These algorithms demonstrate significant improvements over the current standard algorithms in the peer-reviewed literature, i.e., the polygon and voxel ray-tracing algorithms of Siddon for voxel classification (point-in-polygon testing) and dose computation, respectively, and radius testing for voxel truncation. The presented polygon ray-tracing technique was tested on 10 intensity modulated radiation therapy (IMRT) treatment planning cases that required the classification of between 0.58 and 2.0 million voxels on a 2.5 mm isotropic dose grid into 1-4 targets and 5-14 structures represented as extruded polygons (a.k.a. Siddon prisms). Incremental voxel ray tracing and voxel truncation employing virtual stereographic projection were tested on the same IMRT treatment planning cases, where voxel dose was required for 230-2400 beamlets using a finite-size pencil-beam algorithm. A 100- to 360-fold CPU time improvement over Siddon's method was observed for the polygon ray-tracing algorithm performing classification of voxels for target and structure membership. A 2.6- to 3.1-fold reduction in CPU time over current algorithms was found for the implementation of incremental ray tracing. Additionally, voxel truncation via stereographic projection was observed to be 11-25 times faster than the radial-testing beamlet extent approach and was further improved 1.7-2.0 fold through point classification using the method of translation rather than the cross-product technique.
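Incremental voxel ray tracing of this kind is, in computer graphics, typically an Amanatides-Woo-style grid traversal: instead of solving for all parametric intersections up front (Siddon's approach), the ray steps voxel by voxel, advancing whichever axis boundary is nearer. The 2D sketch below illustrates that general technique and is not the authors' code.

```python
import math

def traverse_voxels(x0, y0, x1, y1):
    """2D incremental voxel traversal (Amanatides-Woo style) on a unit grid:
    yields every voxel (ix, iy) the segment (x0,y0)->(x1,y1) passes through.
    Illustrative sketch of incremental ray tracing, contrasted in the
    abstract with Siddon's parametric method."""
    ix, iy = math.floor(x0), math.floor(y0)
    end = (math.floor(x1), math.floor(y1))
    dx, dy = x1 - x0, y1 - y0
    step_x = 1 if dx > 0 else -1
    step_y = 1 if dy > 0 else -1
    # Parametric distance to the next vertical / horizontal voxel boundary
    t_max_x = ((ix + (step_x > 0)) - x0) / dx if dx else math.inf
    t_max_y = ((iy + (step_y > 0)) - y0) / dy if dy else math.inf
    t_dx = abs(1.0 / dx) if dx else math.inf    # t-distance between x boundaries
    t_dy = abs(1.0 / dy) if dy else math.inf    # t-distance between y boundaries
    yield ix, iy
    while (ix, iy) != end:
        if t_max_x < t_max_y:                   # next crossing is a vertical wall
            ix += step_x; t_max_x += t_dx
        else:                                   # next crossing is a horizontal wall
            iy += step_y; t_max_y += t_dy
        yield ix, iy

print(list(traverse_voxels(0.5, 0.5, 3.5, 1.6)))
# → [(0, 0), (1, 0), (1, 1), (2, 1), (3, 1)]
```

Each step costs one comparison and one addition per axis, which is where the constant-factor advantage over recomputing full parametric intersection lists comes from.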
Three-dimensional polarization ray-tracing calculus I: definition and diattenuation.
Yun, Garam; Crabtree, Karlton; Chipman, Russell A
2011-06-20
A three-by-three polarization ray-tracing matrix method for polarization ray tracing in optical systems is presented for calculating the polarization transformations associated with ray paths through optical systems. The method is a three-dimensional generalization of the Jones calculus. Reflection and refraction algorithms are provided. Diattenuation of the optical system is calculated via singular value decomposition. Two numerical examples, a three fold-mirror system and a hollow corner cube, demonstrate the method.
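The three-by-three calculus can be illustrated for a single reflection: build orthonormal (s, p, k) bases before and after the surface and sandwich a Jones-like diagonal between them. The construction below is a sketch in the spirit of the abstract; the sign conventions, basis choices, and ideal-mirror coefficients are illustrative assumptions.

```python
import numpy as np

def unit(v):
    return v / np.linalg.norm(v)

def reflection_P(k_in, normal, r_s=1.0, r_p=1.0):
    """3x3 polarization ray-tracing matrix for one reflection:
    P = O_out @ diag(r_s, r_p, 1) @ O_in^{-1}, with O built from the
    orthonormal (s, p, k) triads before and after the surface.
    Returns (P, k_out).  Sign conventions here are illustrative."""
    k_in = unit(np.asarray(k_in, float))
    n = unit(np.asarray(normal, float))
    k_out = k_in - 2.0 * np.dot(k_in, n) * n        # law of reflection
    s = unit(np.cross(k_in, n))                     # s is shared by in and out
    p_in, p_out = np.cross(k_in, s), np.cross(k_out, s)
    O_in = np.column_stack([s, p_in, k_in])
    O_out = np.column_stack([s, p_out, k_out])
    return O_out @ np.diag([r_s, r_p, 1.0]) @ np.linalg.inv(O_in), k_out

# 45-degree fold mirror: beam along +z, mirror normal halfway between -z and -x
P, k_out = reflection_P([0, 0, 1], [-1, 0, -1])
print(np.round(k_out, 6))                       # reflected direction
print(np.round(P @ np.array([0, 0, 1.0]), 6))   # P also maps k_in to k_out
```

Note that P carries the propagation direction along with the polarization (the unit entry in the diagonal), which is what makes the calculus a true three-dimensional generalization of the 2x2 Jones matrix; normal incidence (k parallel to n) is degenerate for this particular basis construction and needs special handling.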
Design of indoor WLANs: Combination of a ray-tracing tool with the BPSO method
Moreno Delgado, José; Domingo Gracia, Marta; Valle López, Luis; Pérez López, Jesús Ramón; Torres Jiménez, Rafael Pedro; Basterrechea Verdeja, José
2015-01-01
This paper presents an approach that combines a ray tracing tool with a binary version of the particle swarm optimization method (BPSO) for the design of infrastructure mode indoor wireless local area networks (WLAN). The approach uses the power levels of a set of candidate access point (AP) locations obtained with the ray tracing tool at a mesh of potential receiver locations or test points to allow the BPSO optimizer to carry out the design of the WLAN. For this purpose, several restriction...
Vaziri, Sana; Lafontaine, Marisa; Olson, Beck; Crane, Jason C.; Chang, Susan; Lupo, Janine; Nelson, Sarah J.
2014-01-01
The increasing interest in enhancing the RANO criteria by using quantitative assessments of changes in lesion size and image intensities has highlighted the need for rapid, easy-to-use tools that provide DICOM compatible outputs for evaluation of patients with glioma. To evaluate the performance of the SmartBrush software (Brainlab AG), which provides computer-assisted definitions of regions of interest (ROIs), a cohort of 20 patients with glioma (equal numbers having high and low grade and treated and un-treated) were scanned using a 3T whole-body MR system prior to surgical resection. The T2-weighted FLAIR, pre- and post-contrast T1-weighted gradient echo DICOM images were pushed from the scanner to an offline workstation where analysis of lesion volumes was performed using SmartBrush. Volumes of the T2Ls ranged from 7.9 to 110.2 cm3 and volumes of the CELs were 0.1 to 28.5 cm3, with 19/20 of the subjects having CELs and all 20 having T2Ls. The computer-assisted analysis was performed rapidly and efficiently, with a mean time for defining both lesions of 5.77 (range 3.5 to 7.5) minutes per subject. Prior analysis of ROIs with the SLICER package (www.slicer.org) took approximately 30 minutes/subject. SmartBrush provides lesion volumes and cross-sectional diameter as a PDF report, which can be stored in DICOM. The ROIs were also saved as DICOM objects and transferred to other packages for performing histogram analysis from ADC or other functional parameter maps. Ongoing studies that will be reported in this presentation are performing a similar analysis with multiple users in order to compare the relative intra- and inter-operator variations in terms of both the speed of analysis and the ROIs that are identified. Acknowledgements: The authors would like to acknowledge Rowena Thomson and Natalie Wright from Brainlab for helping to set up this study.
Schumacher, Florian; Friederich, Wolfgang; Lamara, Samir; Gutt, Phillip; Paffrath, Marcel
2015-04-01
We present a seismic full waveform inversion concept for applications ranging from seismological to engineering contexts, based on sensitivity kernels for full waveforms. The kernels are derived from Born scattering theory as the Fréchet derivatives of linearized frequency-domain full waveform data functionals, quantifying the influence of elastic earth model parameters and density on the data values. For a specific source-receiver combination, the kernel is computed from the displacement and strain field spectrum originating from the source evaluated throughout the inversion domain, as well as the Green function spectrum and its strains originating from the receiver. By storing the wavefield spectra of specific sources/receivers, they can be re-used for kernel computation for different specific source-receiver combinations, optimizing the total number of required forward simulations. In the iterative inversion procedure, the solution of the forward problem, the computation of sensitivity kernels and the derivation of a model update are held completely separate. In particular, the model description for the forward problem and the description of the inverted model update are kept independent. Hence, the resolution of the inverted model as well as the complexity of solving the forward problem can be iteratively increased (with increasing frequency content of the inverted data subset). This may regularize the overall inverse problem and optimizes the computational effort of both solving the forward problem and computing the model update. The required interconnection of arbitrary unstructured volume and point grids is realized by generalized high-order integration rules and 3D-unstructured interpolation methods. The model update is inferred by solving a minimization problem in a least-squares sense, resulting in Gauss-Newton convergence of the overall inversion process. The inversion method was implemented in the modularized software package ASKI (Analysis of Sensitivity
New developments in the McStas neutron instrument simulation package
Willendrup, Peter Kjær; Bergbäck Knudsen, Erik; Klinkby, Esben Bryndt; Nielsen, T.; Farhi, E.; Filges, U.; Lefmann, K.
2014-01-01
The McStas neutron ray-tracing software package is a versatile tool for building accurate simulators of neutron scattering instruments at reactors and at short- and long-pulsed spallation sources such as the European Spallation Source. McStas is extensively used for design and optimization of instruments, virtual experiments, data analysis and user training. McStas was founded as a scientific, open-source collaborative code in 1997. This contribution presents the project at its current state and giv...
Cordiero, Tiago M.; Lindgreen, Erik Bjørn Grønning; Sorenson, Spencer C
2005-01-01
The ARTEMIS rail emissions model was implemented in a Microsoft Excel software package that includes data from the GISCO database on railway traffic. This report is the user's manual for the aforementioned software; it includes information on how to run the program and an overview of how … the program works. In addition, this manual provides a guide for the user to update the included database. The program is provided as a free, open-source program that the user can modify to fit the desired goals and update as new data is gathered. The program is comprised of a set …
Development of the CCP-200 mathematical model for Syzran CHPP using the Thermolib software package
Usov, S. V.; Kudinov, A. A.
2016-04-01
Simplified cycle diagram of the CCP-200 power generating unit of Syzran CHPP, containing two gas turbines PG6111FA with generators, two steam recovery boilers KUP-110/15-8.0/0.7-540/200, and one steam turbine Siemens SST-600 (one-cylinder, with two variable heat extraction units of 60/75 MW in heat-extraction and condensing modes, respectively) with S-GEN5-100 generators, is presented. Results of experimental guarantee tests of the CCP-200 steam-gas unit are given, along with a brief description of the Thermolib application for the MATLAB Simulink software package and the basic equations used in Thermolib for modeling thermo-technical processes. Mathematical models of the gas-turbine plant, heat-recovery steam generator, steam turbine and integrated plant for the CCP-200 power generating unit of Syzran CHPP were developed with the help of MATLAB Simulink and Thermolib. Simulations at different ambient temperatures were used to obtain the characteristics of the developed mathematical model. Graphic comparisons of selected characteristics of the CCP-200 simulation model (gas temperature behind the gas turbine, gas turbine and combined cycle plant capacity, high- and low-pressure steam consumption, and feed water consumption for the high- and low-pressure economizers) with the actual characteristics of the steam-gas unit obtained in experimental (field) guarantee tests at different ambient temperatures are shown. It is shown that the characteristics of the CCP-200 simulation model developed with Thermolib adequately correspond to the actual characteristics of the steam-gas unit obtained in the experimental (field) guarantee tests, which allows the developed mathematical model to be considered adequate and acceptable for further work.
Michael R. Crusoe
2015-09-01
The khmer package is a freely available software library for working efficiently with fixed-length DNA words, or k-mers. khmer provides implementations of a probabilistic k-mer counting data structure, a compressible De Bruijn graph representation, De Bruijn graph partitioning, and digital normalization. khmer is implemented in C++ and Python, and is freely available under the BSD license at https://github.com/dib-lab/khmer/.
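The probabilistic counting idea behind khmer can be sketched in a few lines of Python. This is an illustrative count-min-style counter, not khmer's actual API; the class name, table sizes, and hash choice are all arbitrary assumptions made for the sketch:

```python
class KmerCounter:
    """Count-min style counter: each k-mer increments one cell in each
    table; the reported count is the minimum over tables, which can
    overestimate (on collisions) but never undercount."""

    def __init__(self, k, table_sizes=(999983, 999979, 999961)):
        self.k = k
        self.tables = [[0] * n for n in table_sizes]

    def _cells(self, kmer):
        # One independent-ish cell index per table (built-in hash is
        # consistent within a single run, which is all we need here).
        return [hash((i, kmer)) % len(t) for i, t in enumerate(self.tables)]

    def consume(self, seq):
        """Count every k-mer in a DNA string."""
        for j in range(len(seq) - self.k + 1):
            kmer = seq[j:j + self.k]
            for t, c in zip(self.tables, self._cells(kmer)):
                t[c] += 1

    def get(self, kmer):
        return min(t[c] for t, c in zip(self.tables, self._cells(kmer)))

kc = KmerCounter(k=4)
kc.consume("ACGTACGTACGT")   # contains "ACGT" three times
print(kc.get("ACGT"))        # >= 3; equals 3 unless hash collisions occur
```

The min-over-tables query is what makes the structure probabilistic: memory is fixed regardless of how many distinct k-mers are seen.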
Virtual experiments: the ultimate aim of neutron ray-tracing simulations
Lefmann, Kim; Willendrup, Peter Kjær; Udby, Linda
2008-01-01
We define a virtual neutron experiment as a complete simulation of an experiment, from source over sample to detector. The virtual experiment (VE) will ideally interface with the instrument control software for the input and with standard data analysis packages for the virtual data output. Virtua...
A new version of Scilab software package for the study of dynamical systems
Bordeianu, C. C.; Felea, D.; Beşliu, C.; Jipa, Al.; Grossu, I. V.
2009-11-01
This work presents a new version of a software package for the study of chaotic flows, maps and fractals [1]. The codes were written using Scilab, a software package for numerical computations providing a powerful open computing environment for engineering and scientific applications. It was found that Scilab provides various functions for ordinary differential equation solving, Fast Fourier Transform, autocorrelation, and excellent 2D and 3D graphical capabilities. The chaotic behaviors of the nonlinear dynamical systems were analyzed using phase-space maps, autocorrelation functions, power spectra, Lyapunov exponents and Kolmogorov-Sinai entropy. Various well-known examples are implemented, with the capability for users to insert their own ODE or iterative equations.
New version program summary
Program title: Chaos v2.0
Catalogue identifier: AEAP_v2_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEAP_v2_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 1275
No. of bytes in distributed program, including test data, etc.: 7135
Distribution format: tar.gz
Programming language: Scilab 5.1.1. Scilab 5.1.1 should be installed before running the program. Information about the installation can be found at http://wiki.scilab.org/howto/install/windows.
Computer: PC-compatible running Scilab on MS Windows or Linux
Operating system: Windows XP, Linux
RAM: below 150 Megabytes
Classification: 6.2
Catalogue identifier of previous version: AEAP_v1_0
Journal reference of previous version: Comput. Phys. Comm. 178 (2008) 788
Does the new version supersede the previous version?: Yes
Nature of problem: Any physical model containing linear or nonlinear ordinary differential equations (ODE).
Solution method: Numerical solving of ordinary differential equations for the study of
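As an illustration of one diagnostic this kind of package computes, a minimal Python sketch (not Scilab, and not the package's own code) estimates the largest Lyapunov exponent of the logistic map by averaging log|f'(x)| along an orbit:

```python
import math

def lyapunov_logistic(r, x0=0.4, n_transient=1000, n_iter=100_000):
    """Largest Lyapunov exponent of x -> r*x*(1-x), estimated as the
    orbit average of log|f'(x)| with f'(x) = r*(1 - 2x)."""
    x = x0
    for _ in range(n_transient):          # discard transient behaviour
        x = r * x * (1 - x)
    acc = 0.0
    for _ in range(n_iter):
        d = abs(r * (1 - 2 * x))
        acc += math.log(max(d, 1e-300))   # guard against log(0) at x = 0.5
        x = r * x * (1 - x)
    return acc / n_iter

lam = lyapunov_logistic(4.0)
print(lam)   # for r = 4 the exact value is ln 2 ≈ 0.693 (chaotic regime)
```

A positive exponent signals sensitive dependence on initial conditions, which is what the phase-space maps and power spectra in the package diagnose graphically.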
Ashraf, Haseem; de Hoop, B; Shaker, S B;
2010-01-01
We examined the reproducibility of lung nodule volumetry software that offers three different volumetry algorithms.
UCVM: An Open Source Software Package for Querying and Visualizing 3D Velocity Models
Gill, D.; Small, P.; Maechling, P. J.; Jordan, T. H.; Shaw, J. H.; Plesch, A.; Chen, P.; Lee, E. J.; Taborda, R.; Olsen, K. B.; Callaghan, S.
2015-12-01
Three-dimensional (3D) seismic velocity models provide foundational data for ground motion simulations that calculate the propagation of earthquake waves through the Earth. The Southern California Earthquake Center (SCEC) has developed the Unified Community Velocity Model (UCVM) package for both Linux and OS X. This unique framework provides a cohesive way of querying and visualizing 3D models. UCVM v14.3.0 supports many Southern California velocity models, including CVM-S4, CVM-H 11.9.1, and CVM-S4.26. The last model was derived from 26 full-3D tomographic iterations on CVM-S4. Recently, UCVM has been used to deliver a prototype of a new 3D model of central California (CCA), also based on full-3D tomographic inversions. UCVM was used to provide initial plots of this model and will be used to deliver CCA to users when the model is publicly released. Visualizing models is also possible with UCVM. Integrated within the platform are plotting utilities that can generate 2D cross-sections, horizontal slices, and basin depth maps. UCVM can also export models in NetCDF format for easy import into IDV and ParaView. UCVM has also been prototyped to export models that are compatible with IRIS' new Earth Model Collaboration (EMC) visualization utility. This capability allows user-specified horizontal slices and cross-sections to be plotted in the same 3D Earth space. UCVM was designed to help a wide variety of researchers. It is currently being used to generate velocity meshes for many SCEC wave propagation codes, including AWP-ODC-SGT and Hercules. It is also used to provide the initial input to SCEC's CyberShake platform. For those interested in specific data points, the software framework makes it easy to extract P and S wave propagation speeds and other material properties from 3D velocity models by providing a common interface through which researchers can query earth models for a given location and depth. Also included in the last release was the ability to add small
Real-time ray tracing of implicit surfaces on the GPU.
Singh, Jag Mohan; Narayanan, P J
2010-01-01
Compact representation of geometry using a suitable procedural or mathematical model and a ray-tracing mode of rendering fit the programmable graphics processor units (GPUs) well. Several such representations including parametric and subdivision surfaces have been explored in recent research. The important and widely applicable category of the general implicit surface has received less attention. In this paper, we present a ray-tracing procedure to render general implicit surfaces efficiently on the GPU. Though only the fourth or lower order surfaces can be rendered using analytical roots, our adaptive marching points algorithm can ray trace arbitrary implicit surfaces without multiple roots, by sampling the ray at selected points till a root is found. Adapting the sampling step size based on a proximity measure and a horizon measure delivers high speed. The sign test can handle any surface without multiple roots. The Taylor test that uses ideas from interval analysis can ray trace many surfaces with complex roots. Overall, a simple algorithm that fits the SIMD architecture of the GPU results in high performance. We demonstrate the ray tracing of algebraic surfaces up to order 50 and nonalgebraic surfaces including a Blinn's blobby with 75 spheres at better than interactive frame rates.
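The sign-test marching described above can be illustrated with a short Python sketch: sample the implicit function along the ray until its sign flips, then refine the hit point by bisection. This is a simplified, fixed-step stand-in for the paper's adaptive marching points algorithm, using a unit sphere as the implicit surface:

```python
def f(p):
    """Implicit unit sphere: f < 0 inside, f > 0 outside."""
    return p[0] ** 2 + p[1] ** 2 + p[2] ** 2 - 1.0

def march_ray(origin, direction, t_max=10.0, step=0.05, n_bisect=40):
    """Sign-test marching: sample f along the ray; where the sign flips,
    a root lies in the bracketing interval and bisection refines it.
    (The paper adapts the step size; this sketch keeps it fixed.)"""
    def at(t):
        return tuple(o + t * d for o, d in zip(origin, direction))

    t0, f0 = 0.0, f(at(0.0))
    t = step
    while t <= t_max:
        f1 = f(at(t))
        if f0 * f1 <= 0.0:                 # sign change: root in [t0, t]
            lo, hi = t0, t
            for _ in range(n_bisect):
                mid = 0.5 * (lo + hi)
                if f(at(lo)) * f(at(mid)) <= 0.0:
                    hi = mid
                else:
                    lo = mid
            return 0.5 * (lo + hi)
        t0, f0 = t, f1
        t += step
    return None                            # ray misses the surface

t_hit = march_ray((0.0, 0.0, -3.0), (0.0, 0.0, 1.0))
print(t_hit)   # ≈ 2.0 (front face of the unit sphere)
```

As the abstract notes, a plain sign test misses roots of even multiplicity; that limitation is inherited by this sketch and is what the paper's Taylor test addresses.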
Application of the Metafor Package in R Software
董圣杰; 曾宪涛; 郭毅
2012-01-01
R software is a free and powerful statistical tool, including the Metafor, Meta and Rmeta packages, all of which can conduct meta-analysis. The Metafor package provides functions for meta-analyses, including analysis of continuous and categorical data, meta-regression, cumulative meta-analysis, and tests for funnel plot asymmetry. The package can also draw various plots, such as forest plots, funnel plots, radial plots and so forth. Mixed-effects models (involving single or multiple categorical and/or continuous moderators) can only be fitted with the Metafor package. Advanced methods for testing model coefficients and confidence intervals are also implemented only in this package. This article introduces the detailed operation steps of the Metafor package for meta-analysis using worked cases.
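For readers without R, the random-effects pooling that such packages perform can be sketched in plain Python with the classic DerSimonian-Laird estimator (one of the estimators Metafor's rma() offers; this sketch is not Metafor's code, and the input numbers are hypothetical):

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate using the DerSimonian-Laird
    between-study variance (tau^2) estimator."""
    w = [1.0 / v for v in variances]                 # fixed-effect weights
    sw = sum(w)
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sw
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    c = sw - sum(wi * wi for wi in w) / sw
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)    # truncated at zero
    w_re = [1.0 / (v + tau2) for v in variances]     # random-effects weights
    est = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return est, se, tau2

# Four hypothetical study effect sizes with their sampling variances.
est, se, tau2 = dersimonian_laird([0.1, 0.8, 0.3, 0.9],
                                  [0.04, 0.05, 0.03, 0.06])
print(est, se, tau2)
```

When the studies are heterogeneous, tau2 > 0 inflates the within-study variances, pulling the weights toward equality relative to the fixed-effect fit.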
Okumura, Akira; Noda, Koji; Rulten, Cameron
2016-03-01
We have developed a non-sequential ray-tracing simulation library, ROOT-based simulator for ray tracing (ROBAST), which is aimed at wide use in optical simulations of cosmic-ray (CR) and gamma-ray telescopes. The library is written in C++ and fully utilizes the geometry library of the ROOT framework. Despite the importance of optics simulations in CR experiments, no open-source software for ray-tracing simulations that can be widely used in the community has existed. To reduce the duplicated effort of developing multiple ray-tracing simulators in different research groups, we have successfully used ROBAST for many years to perform optics simulations for the Cherenkov Telescope Array (CTA). Among the six proposed telescope designs for CTA, ROBAST is currently used for three telescopes: a Schwarzschild-Couder (SC) medium-sized telescope, one of the SC small-sized telescopes, and a large-sized telescope (LST). ROBAST is also used for the simulation and development of hexagonal light concentrators proposed for the LST focal plane. Making full use of the ROOT geometry library with additional ROBAST classes, we are able to build the complex optics geometries typically used in CR experiments and ground-based gamma-ray telescopes. We introduce ROBAST and its features developed for CR experiments, and show several successful applications for CTA.
GenoMatrix: A Software Package for Pedigree-Based and Genomic Prediction Analyses on Complex Traits.
Nazarian, Alireza; Gezan, Salvador Alejandro
2016-07-01
Genomic and pedigree-based best linear unbiased prediction methodologies (G-BLUP and P-BLUP) have proven efficient for partitioning the phenotypic variance of complex traits into its components, estimating individuals' genetic merits, and predicting unobserved (or yet-to-be-observed) phenotypes in many species and fields of study. The GenoMatrix software, presented here, is a user-friendly package that facilitates the use of genome-wide marker data and parentage information for G-BLUP and P-BLUP analyses of complex traits. It provides users with a collection of applications that help with a set of tasks, from performing quality control on data to constructing and manipulating the genomic and pedigree-based relationship matrices and obtaining their inverses. Such matrices are then used in downstream analyses by other statistical packages. The package also enables users to obtain predicted values for unobserved individuals based on the genetic values of observed related individuals. GenoMatrix is available to the research community as a 64-bit Windows executable and can be downloaded free of charge at: http://compbio.ufl.edu/software/genomatrix/.
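The genomic relationship matrix at the heart of G-BLUP can be sketched with the standard VanRaden (2008, method 1) construction; GenoMatrix's exact options and coding conventions may differ, and the genotype matrix below is a made-up toy example:

```python
import numpy as np

def vanraden_G(M):
    """Genomic relationship matrix from an (individuals x markers)
    matrix of 0/1/2 genotype codes: G = Z Z' / (2 * sum(p * (1 - p))),
    where Z centres each marker by twice its allele frequency."""
    p = M.mean(axis=0) / 2.0            # allele frequency per marker
    Z = M - 2.0 * p                     # centred genotype codes
    denom = 2.0 * np.sum(p * (1.0 - p))
    return Z @ Z.T / denom

M = np.array([[0, 1, 2, 1],
              [1, 1, 2, 0],
              [2, 0, 1, 1]], dtype=float)
G = vanraden_G(M)
print(G.shape)   # (3, 3); symmetric by construction
```

The inverse of G (or of the pedigree-based A matrix) is what the downstream mixed-model packages consume.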
Sheil, Conor; Goncharov, Alexander V.
2013-05-01
A physical model eye was constructed to test the quality of ophthalmic instruments. The accuracy and precision of two commercially available instruments were analysed. For these instruments, a particular model eye was obtained which mimicked the physical properties that would usually be measured, e.g. corneal topography or optical path within the human eye. The model eye was designed using relatively simple optical components (e.g. plano-convex lenses) separated by appropriate intraocular distances taken from the literature. The dimensions of the model eye were known a priori: the lenses used in the construction of the model eye were characterised according to values given in the manufacturers' data sheets and also through measurement using an interferometer. The distances between the lens surfaces were calculated using the interferometric data with reverse ray-tracing. Optical paths were calculated as the product of refractive index and axial distance. The errors inherent in measuring these ocular parameters by different ophthalmic instruments can be considered as producing an erroneous value for the overall refractive power of the eye. The latter is a useful metric for comparing various ophthalmic devices where the direct comparison of quality is not possible or is not practical. For example, a 1% error in anterior corneal radius of curvature will have a more detrimental effect than the same error in posterior corneal radius, due to the relative differences in refractive indices at those surface boundaries. To quantify the error in ocular refractive power, a generic eye model was created in ZEMAX optical design software. The parametric errors were then used to compute the overall error in predicting ocular refractive power, thus highlighting the relative importance of individual errors. This work will help in future determination of acceptable levels of metrological errors in ocular instrumentation.
A three-dimensional sound ray tracing method by deploying regular tetrahedrons
JIANG Wei; LI Taibao
2005-01-01
A sound ray tracing algorithm is presented, which helps to rapidly find sound ray trajectories in three-dimensional (3-D) space. At each step of ray tracing, a small regular tetrahedron is made in front of a ray, so that the sound speed field inside may be approximately regarded as linear. Since a ray trajectory in a linear sound speed field always lies on a plane, it may be obtained by the two-dimensional (2-D) sound ray tracing method by deploying triangles. The theoretical derivation is given and a numerical model is discussed. It shows that the algorithm is fast and precise. It is also more concise and reliable than the traditional 3-D algorithms, and may be used to avoid the loss of precision caused by acoustic refraction in 3-D ultrasound computerized tomography.
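The key property exploited above, that a ray is a circular arc wherever the sound speed varies linearly, follows from Snell's law: cos(theta)/c(z) is constant along the ray. A minimal Python trace (an illustrative sketch, not the paper's algorithm; the profile constants are typical ocean values chosen for the example) checks this invariant numerically:

```python
import math

def trace_ray(theta0, c0=1500.0, g=0.017, ds=1.0, n_steps=2000):
    """Step a 2-D ray through the linear profile c(z) = c0 + g*z, with
    theta measured from the horizontal. The ray-curvature equation
    dtheta/ds = -(dc/dz) * cos(theta) / c conserves cos(theta)/c(z)."""
    x, z, theta = 0.0, 0.0, theta0
    snell0 = math.cos(theta) / (c0 + g * z)
    for _ in range(n_steps):
        c = c0 + g * z
        x += ds * math.cos(theta)
        z += ds * math.sin(theta)
        theta -= ds * g * math.cos(theta) / c   # ray bends toward slow c
    snell1 = math.cos(theta) / (c0 + g * z)
    return x, z, snell0, snell1

x, z, s0, s1 = trace_ray(0.1)
print(abs(s1 - s0) / s0)   # small: the Snell invariant is conserved
```

Because the invariant holds exactly in a linear field, approximating the field as linear inside each small tetrahedron reduces the 3-D problem to planar 2-D tracing, as the paper argues.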
Ray-Tracing studies in a perturbed atmosphere I- The initial value problem
Tannous, C
2001-01-01
We report the development of a new ray-tracing simulation tool with the potential for full characterization of a radio link through accurate study of the propagation path of the signal from the transmitting to the receiving antennas across a perturbed atmosphere. The ray-tracing equations are solved, with controlled accuracy, in three dimensions (3D), and the propagation characteristics are obtained using various refractive index models. The launching of the rays, the atmospheric medium and its disturbances are characterized in 3D. The novelty of the approach stems from the use of special numerical techniques dealing with so-called stiff differential equations, without which no solution of the ray-tracing equations is possible. Starting with a given launching angle, the solution consists of the ray trajectory, the propagation time information at each point of the path, the beam spreading, the transmitted (resp. received) power taking account of the radiation pattern and orientation of the antennas and ...
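The stiffness issue the authors emphasize can be seen on a toy linear ODE y' = -lambda*y with a large lambda: an explicit (forward Euler) step of moderate size amplifies the solution, while an implicit (backward Euler) step of the same size stays stable. The following minimal Python illustration is unrelated to the authors' actual solver:

```python
def explicit_euler(lmbda, h, n):
    """Forward Euler on y' = -lmbda*y, y(0) = 1: each step multiplies
    y by (1 - h*lmbda), which explodes when |1 - h*lmbda| > 1."""
    y = 1.0
    for _ in range(n):
        y = y * (1.0 - h * lmbda)
    return y

def implicit_euler(lmbda, h, n):
    """Backward Euler on the same problem: y_{k+1} = y_k / (1 + h*lmbda),
    stable for any positive step size (A-stable)."""
    y = 1.0
    for _ in range(n):
        y = y / (1.0 + h * lmbda)
    return y

# lmbda = 1000 with h = 0.01 gives step factors 1 - 10 = -9 (explodes)
# versus 1/11 (decays), although the true solution decays in both cases.
print(abs(explicit_euler(1000.0, 0.01, 100)))
print(abs(implicit_euler(1000.0, 0.01, 100)))
```

This is why an explicit integrator forces impractically small steps on stiff ray-tracing equations, whereas an implicit (stiff) solver can use steps set by accuracy rather than stability.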
Refined ray tracing inside single- and double-curvatured concave surfaces
Choudhury, Balamati
2016-01-01
This book describes the ray tracing effects inside different quadric surfaces. Analytical surface modeling is an a priori requirement for electromagnetic (EM) analysis over aerospace platforms. Although numerically specified surfaces and even non-uniform rational basis splines (NURBS) can be used for modeling such surfaces, for most practical EM applications it is sufficient to model them as quadric surface patches and hybrids thereof. A vast majority of aerospace bodies can therefore be conveniently modeled as combinations of simpler quadric surfaces, i.e. hybrids of quadric cylinders and quadric surfaces of revolution. Hence the analysis of geometric ray tracing inside such surfaces is a prerequisite to analyzing the RF build-up. The book covers ray tracing inside quadric surfaces such as the right circular cylinder, the general paraboloid of revolution (GPOR), and GPOR frustums of different shaping parameters, with the corresponding visualization of the ray-path details. Finally ray tracin...
Yu-Hua Dean Fang
2014-01-01
Background. The quantification of tumor heterogeneity with molecular images, by analyzing the local or global variation in the spatial arrangements of pixel intensity with texture analysis, possesses great clinical potential for treatment planning and prognosis. To address the lack of publicly available software for computing tumor heterogeneity, we develop a software package, namely, the Chang-Gung Image Texture Analysis (CGITA) toolbox, and provide it to the research community as a free, open-source project. Methods. With a user-friendly graphical interface, CGITA provides users with an easy way to compute more than seventy heterogeneity indices. To test and demonstrate the usefulness of CGITA, we used a small cohort of eighteen locally advanced oral cavity (ORC) cancer patients treated with definitive radiotherapies. Results. In our case study of ORC data, we found that more than ten of the currently implemented heterogeneity indices outperformed SUVmean for outcome prediction in the ROC analysis, with a higher area under the curve (AUC). Heterogeneity indices provide an area under the curve of up to 0.9, versus 0.6 and 0.52 for SUVmean and TLG, respectively. Conclusions. CGITA is a free and open-source software package to quantify tumor heterogeneity from molecular images. CGITA is available for free for academic use at http://code.google.com/p/cgita.
Fang, Yu-Hua Dean; Lin, Chien-Yu; Shih, Meng-Jung; Wang, Hung-Ming; Ho, Tsung-Ying; Liao, Chun-Ta; Yen, Tzu-Chen
2014-01-01
The quantification of tumor heterogeneity with molecular images, by analyzing the local or global variation in the spatial arrangements of pixel intensity with texture analysis, possesses a great clinical potential for treatment planning and prognosis. To address the lack of available software for computing the tumor heterogeneity on the public domain, we develop a software package, namely, Chang-Gung Image Texture Analysis (CGITA) toolbox, and provide it to the research community as a free, open-source project. With a user-friendly graphical interface, CGITA provides users with an easy way to compute more than seventy heterogeneity indices. To test and demonstrate the usefulness of CGITA, we used a small cohort of eighteen locally advanced oral cavity (ORC) cancer patients treated with definitive radiotherapies. In our case study of ORC data, we found that more than ten of the current implemented heterogeneity indices outperformed SUVmean for outcome prediction in the ROC analysis with a higher area under curve (AUC). Heterogeneity indices provide a better area under the curve up to 0.9 than the SUVmean and TLG (0.6 and 0.52, resp.). CGITA is a free and open-source software package to quantify tumor heterogeneity from molecular images. CGITA is available for free for academic use at http://code.google.com/p/cgita.
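One representative heterogeneity index of the kind CGITA computes is the entropy of a grey-level co-occurrence matrix (GLCM). The sketch below is a minimal Python/numpy illustration, not CGITA's implementation; the bin count and the horizontal-neighbour offset are arbitrary choices for the example:

```python
import numpy as np

def glcm_entropy(img, levels=4):
    """Entropy of the grey-level co-occurrence matrix of a 2-D image
    for the horizontal-neighbour offset (0, 1). Higher entropy means a
    less predictable (more heterogeneous) local texture."""
    edges = np.linspace(img.min(), img.max(), levels + 1)[1:-1]
    q = np.digitize(img, edges)                  # quantize to 0..levels-1
    left, right = q[:, :-1].ravel(), q[:, 1:].ravel()
    glcm = np.zeros((levels, levels))
    np.add.at(glcm, (left, right), 1)            # count neighbour pairs
    p = glcm / glcm.sum()
    nz = p[p > 0]
    return float(-(nz * np.log2(nz)).sum())

uniform = np.ones((8, 8))
noisy = np.random.default_rng(0).random((8, 8))
print(glcm_entropy(uniform), glcm_entropy(noisy))  # 0.0 for flat, > 0 for noisy
```

A perfectly homogeneous region concentrates all co-occurrences in one GLCM cell (entropy 0), whereas heterogeneous uptake spreads them out, which is the intuition behind using such indices alongside SUVmean.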
Mirsky, S.M.; Hayes, J.E.; Miller, L.A. [Science Applications International Corp., McLean, VA (United States)
1995-03-01
This report is the fifth volume in a series of reports describing the results of the Expert System Verification and Validation (V&V) project, which is jointly funded by the U.S. Nuclear Regulatory Commission and the Electric Power Research Institute toward the objective of formulating guidelines for the V&V of expert systems for use in nuclear power applications. This report provides the rationale for and description of those guidelines. The actual guidelines themselves are presented in Volume 7, "User's Manual." Three factors determine what V&V is needed: (1) the stage of the development life cycle (requirements, design, or implementation); (2) whether the overall system or a specialized component needs to be tested (knowledge base component, inference engine or other highly reusable element, or a component involving conventional software); and (3) the stringency of V&V that is needed (as judged from an assessment of the system's complexity and the requirement for its integrity, to form three Classes). A V&V Guideline package is provided for each of the combinations of these three variables. The package specifies the V&V methods recommended and the order in which they should be administered, the assurances each method provides, the qualifications needed by the V&V team to employ each particular method, the degree to which the methods should be applied, the performance measures that should be taken, and the decision criteria for accepting, conditionally accepting, or rejecting an evaluated system. In addition to the Guideline packages, highly detailed step-by-step procedures are provided for 11 of the more important methods, to ensure that they can be implemented correctly. The Guidelines can apply to conventional procedural software systems as well as all kinds of AI systems.
Pascoal, A; Lawinski, C P; Honey, I; Blake, P
2005-12-07
Contrast detail analysis is commonly used to assess image quality (IQ) associated with diagnostic imaging systems. Applications include routine assessment of equipment performance and optimization studies. Most frequently, the evaluation of contrast detail images involves human observers visually detecting the threshold contrast detail combinations in the image. However, the subjective nature of human perception and the variations in the decision threshold pose limits to the minimum image quality variations detectable with reliability. Objective methods of assessment of image quality such as automated scoring have the potential to overcome the above limitations. A software package (CDRAD analyser) developed for automated scoring of images produced with the CDRAD test object was evaluated. Its performance to assess absolute and relative IQ was compared with that of an average observer. Results show that the software does not mimic the absolute performance of the average observer. The software proved more sensitive and was able to detect smaller low-contrast variations. The observer's performance was superior to the software's in the detection of smaller details. Both scoring methods showed frequent agreement in the detection of image quality variations resulting from changes in kVp and KERMA(detector), which indicates the potential to use the software CDRAD analyser for assessment of relative IQ.
Reflection formulae for ray tracing in uniaxial anisotropic media using Huygens's principle.
Alemán-Castañeda, Luis A; Rosete-Aguilar, Martha
2016-11-01
Ray tracing in uniaxial anisotropic materials is important because they are widely used for instrumentation, liquid-crystal displays, laser cavities, and quantum experiments. There are previous works regarding ray tracing refraction and reflection formulae using the common electromagnetic theory approach, but only the refraction formulae have been deduced using Huygens's principle. In this paper we obtain the reflection expressions using this unconventional approach with a specific coordinate system in which both refraction and reflection formulae are simplified as well as their deduction. We compute some numerical examples to compare them with the common expressions obtained using electromagnetic theory.
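For reference, the ordinary isotropic reflection formula that the paper generalizes to uniaxial media is r = d - 2(d·n)n for an incident direction d and unit surface normal n:

```python
def reflect(d, n):
    """Mirror-reflect direction d about unit normal n: r = d - 2(d.n)n.
    This is the standard isotropic case; in uniaxial media the reflected
    ordinary and extraordinary rays require the paper's extended formulae."""
    dn = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2.0 * dn * ni for di, ni in zip(d, n))

# A ray travelling down-and-right hits a horizontal surface (normal +y):
print(reflect((1.0, -1.0, 0.0), (0.0, 1.0, 0.0)))   # (1.0, 1.0, 0.0)
```

The formula preserves the tangential component of d and flips the normal component, which is exactly the behaviour the anisotropic expressions must reproduce in the isotropic limit.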
Nelson Oly NDUBISI; Omprakash K.GUPTA; Samia MASSOUD
2003-01-01
In this paper we study how organizational learning impacts organizational behavior, and how vendor support quality enhances product adoption and usage behavior. These constructs were verified using Application Software Packages (ASP) - a prewritten, precoded, commercially available set of programs that eliminates the need for individuals or organizations to write their own software programs for certain functions. The relationships between ASP usage, usage outcomes and use processes were also investigated. Two hundred and ninety-five Chinese, Indian, and Malay entrepreneurships were studied. It was found that usage outcome strongly determines usage, while use process has only an indirect relationship (via outcome) with usage. The impact of organizational learning and vendor service quality on usage, usage outcome, and use process was robust. Theoretical and practical implications of the research are discussed.
The last developments of the airGR R-package, an open source software for rainfall-runoff modelling
Thirel, Guillaume; Delaigue, Olivier; Coron, Laurent; Perrin, Charles; Andréassian, Vazken
2017-04-01
and usability of this tool.
References:
Coron, L., Thirel, G., Perrin, C., Delaigue, O., Andréassian, V., airGR: a suite of lumped hydrological models in an R-package, Environmental Modelling and Software, 2017, submitted.
Coron, L., Perrin, C. and Michel, C. (2016). airGR: Suite of GR hydrological models for precipitation-runoff modelling. R package version 1.0.3. https://webgr.irstea.fr/airGR/?lang=en.
R Core Team (2016). R: A language and environment for statistical computing. R Foundation for Statistical Computing, Vienna, Austria. URL https://www.R-project.org/.
Veen, Berlinda J. van der; Dibbets-Schneider, Petra; Stokkel, Marcel P.M. [Leiden University Medical Center, Department of Nuclear Medicine, Leiden (Netherlands); Scholte, Arthur J. [Leiden University Medical Center, Department of Cardiology, Leiden (Netherlands)
2010-09-15
Semiquantitative analysis of myocardial perfusion scintigraphy (MPS) has reduced inter- and intraobserver variability, and enables researchers to compare parameters in the same patient over time, or between groups of patients. There are several software packages available that are designed to process MPS data and quantify parameters. In this study the performances of two systems, quantitative gated SPECT (QGS) and 4D-MSPECT, in the processing of clinical patient data and phantom data were compared. The clinical MPS data of 148 consecutive patients were analysed using QGS and 4D-MSPECT to determine the end-diastolic volume, end-systolic volume and left ventricular ejection fraction. Patients were divided into groups based on gender, body mass index, heart size, stressor type and defect type. The AGATE dynamic heart phantom was used to provide reference values for the left ventricular ejection fraction. Although the correlations were excellent (correlation coefficients 0.886 to 0.980) for all parameters, significant differences (p < 0.001) were found between the systems. Bland-Altman plots indicated that 4D-MSPECT provided overall higher values of all parameters than QGS. These differences between the systems were not significant in patients with a small heart (end-diastolic volume <70 ml). Other clinical factors had no direct influence on the relationship. Additionally, the phantom data indicated good linear responses of both systems. The discrepancies between these software packages were clinically relevant, and influenced by heart size. The possibility of such discrepancies should be taken into account when a new quantitative software system is introduced, or when multiple software systems are used in the same institution. (orig.)
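The inter-system comparison described above rests on Bland-Altman statistics: the mean of the per-patient differences gives the systematic bias between the two packages, and ±1.96 standard deviations give the 95% limits of agreement. A minimal sketch in Python (the ejection-fraction numbers below are invented for illustration, not the study's data):

```python
import numpy as np

def bland_altman(a, b):
    """Mean bias and 95% limits of agreement between two methods."""
    diff = np.asarray(a, float) - np.asarray(b, float)
    bias = diff.mean()            # systematic offset between the packages
    sd = diff.std(ddof=1)         # SD of the per-patient differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Invented ejection-fraction readings (%) from two software packages:
msp = [58, 64, 50, 66, 53]   # "4D-MSPECT-like" values
qgs = [55, 60, 48, 62, 51]   # "QGS-like" values
bias, lo, hi = bland_altman(msp, qgs)   # positive bias: first method reads higher
```

A positive bias with narrow limits, as in the study, indicates a consistent offset rather than random disagreement.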
Chikkagoudar, Satish; Wang, Kai; Li, Mingyao
2011-05-26
Gene-gene interaction in genetic association studies is computationally intensive when a large number of SNPs are involved. Most of the latest Central Processing Units (CPUs) have multiple cores, whereas Graphics Processing Units (GPUs) also have hundreds of cores and have been recently used to implement faster scientific software. However, currently there are no genetic analysis software packages that allow users to fully utilize the computing power of these multi-core devices for genetic interaction analysis for binary traits. Here we present a novel software package GENIE, which utilizes the power of multiple GPU or CPU processor cores to parallelize the interaction analysis. GENIE reads an entire genetic association study dataset into memory and partitions the dataset into fragments with non-overlapping sets of SNPs. For each fragment, GENIE analyzes: 1) the interaction of SNPs within it in parallel, and 2) the interaction between the SNPs of the current fragment and other fragments in parallel. We tested GENIE on a large-scale candidate gene study on high-density lipoprotein cholesterol. Using an NVIDIA Tesla C1060 graphics card, the GPU mode of GENIE achieves a speedup of 27 times over its single-core CPU mode run. GENIE is open-source, economical, user-friendly, and scalable. Since the computing power and memory capacity of graphics cards are increasing rapidly while their cost is going down, we anticipate that GENIE will achieve greater speedups with faster GPU cards. Documentation, source code, and precompiled binaries can be downloaded from http://www.cceb.upenn.edu/~mli/software/GENIE/.
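GENIE's work decomposition, as described above, enumerates pairs within each fragment and between the current fragment and later ones. A minimal Python sketch of that enumeration (fragment size and SNP names are hypothetical; the real package scores each pair with a statistical interaction test on GPU or CPU cores, which is omitted here):

```python
from itertools import combinations

def fragment(snps, size):
    """Partition the SNP list into non-overlapping fragments."""
    return [snps[i:i + size] for i in range(0, len(snps), size)]

def interaction_pairs(fragments):
    """Yield each unordered SNP pair exactly once: pairs within the
    current fragment first, then pairs between it and every later
    fragment. In GENIE these pairs would be tested in parallel."""
    for i, frag in enumerate(fragments):
        yield from combinations(frag, 2)        # within-fragment pairs
        for other in fragments[i + 1:]:         # between-fragment pairs
            for a in frag:
                for b in other:
                    yield (a, b)

snps = ["rs%d" % k for k in range(6)]           # hypothetical SNP names
pairs = list(interaction_pairs(fragment(snps, 2)))
# 6 SNPs -> 6*5/2 = 15 distinct pairs, each enumerated exactly once
```

Because fragments share no SNPs, the within- and between-fragment passes cover every pair without duplication, which is what makes the partitioning safe to parallelize.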
'nparACT' package for R: A free software tool for the non-parametric analysis of actigraphy data.
Blume, Christine; Santhi, Nayantara; Schabus, Manuel
2016-01-01
For many studies, participants' sleep-wake patterns are monitored and recorded prior to, during and following an experimental or clinical intervention using actigraphy, i.e. the recording of data generated by movements. Often, these data are merely inspected visually without computation of descriptive parameters, in part due to the lack of user-friendly software. To address this deficit, we developed a package for R (R Core Team [6]) that allows computing several non-parametric measures from actigraphy data. Specifically, it computes the interdaily stability (IS), intradaily variability (IV) and relative amplitude (RA) of activity and gives the start times and average activity values of M10 (i.e. the ten hours with maximal activity) and L5 (i.e. the five hours with least activity). Two functions compute these 'classical' parameters and handle either single or multiple files. Two other functions additionally allow computing an L-value (i.e. the least activity value) for a user-defined time span, termed the 'Lflex' value. A plotting option is included in all functions. The package can be downloaded from the Comprehensive R Archive Network (CRAN). • The package 'nparACT' for R serves the non-parametric analysis of actigraphy data. • Computed parameters include interdaily stability (IS), intradaily variability (IV) and relative amplitude (RA) as well as start times and average activity during the 10 h with maximal and the 5 h with minimal activity (i.e. M10 and L5).
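The classical non-parametric measures named above have standard definitions that can be sketched in a few lines. nparACT itself is an R package; the following is an illustrative Python re-implementation that assumes an hourly activity series spanning whole days and ignores the package's other sampling rates and the 'Lflex' option:

```python
import numpy as np

def nonparametric_measures(hourly):
    """IS, IV and RA from an hourly activity series covering whole days,
    following the classical non-parametric definitions (illustrative
    re-implementation, not the nparACT code)."""
    x = np.asarray(hourly, float)
    n, mean = x.size, x.mean()
    profile = x.reshape(-1, 24).mean(axis=0)    # average 24-h profile
    p = profile.size                            # p = 24 hourly means
    IS = (n * ((profile - mean) ** 2).sum()) / (p * ((x - mean) ** 2).sum())
    IV = (n * (np.diff(x) ** 2).sum()) / ((n - 1) * ((x - mean) ** 2).sum())
    # M10 / L5: best 10-h and worst 5-h circular windows of the profile
    m10 = max(profile[np.arange(s, s + 10) % 24].mean() for s in range(24))
    l5 = min(profile[np.arange(s, s + 5) % 24].mean() for s in range(24))
    return IS, IV, (m10 - l5) / (m10 + l5)
```

A sanity check on the definitions: a series that repeats the same 24-h pattern every day gives IS = 1 (perfect day-to-day stability).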
An Energy Conservative Ray-Tracing Method With a Time Interpolation of the Force Field
Yao, Jin [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)
2015-02-10
A new algorithm that constructs a continuous force field interpolated in time is proposed to resolve existing difficulties in numerical methods for ray tracing. The new method improves accuracy while retaining the same degree of algebraic complexity as Kaiser's method.
A Ray-tracing Method to Analyzing Modulated Planar Fabry-Perot Antennas
Hougs, Mikkel Dahl; Kim, Oleksiy S.; Breinbjerg, Olav
2015-01-01
A new approach for fast modelling of Fabry-Perot antennas with modulated partially reflective surfaces (PRS) using ray-tracing is proposed. For validation of the method, a configuration is introduced which consists of a cavity with a modulated PRS, fed internally by a magnetic dipole. The PRS con...
Emulating Ray-Tracing Channels in Multi-probe Anechoic Chamber Setups for Virtual Drive Testing
Fan, Wei; Llorente, Ines Carton; Kyösti, Pekka
2016-01-01
This paper discusses virtual drive testing (VDT) for multiple-input multiple-output (MIMO) capable terminals in multi-probe anechoic chamber (MPAC) setups. We propose to perform VDT, via reproducing ray tracing (RT) simulated channels with the field synthesis technique. Simulation results demonst...
Magnetospheric Whistler Mode Ray Tracing with the Inclusion of Finite Electron and Ion Temperature
Maxworth, A. S.; Golkowski, M.
2015-12-01
Ray tracing is an important technique for the study of whistler mode wave propagation in the Earth's magnetosphere. In numerical ray tracing, the trajectory of a wave packet is calculated at each point in space by solving the Haselgrove equations, assuming a smooth, lossless medium with no mode coupling. Previous work on ray tracing has assumed a cold plasma environment with negligible electron and ion temperatures. In this work we present magnetospheric whistler mode wave ray tracing results with the inclusion of finite ion and electron temperature. The inclusion of finite temperature effects makes the fourth-order dispersion relation become sixth order. We compare our results with the work done by previous researchers for cold plasma environments, using two near-Earth space models (NGO and GCPM). Inclusion of finite temperature closes the otherwise open refractive index surface near the lower hybrid resonance frequency and affects the magnetospheric reflection of whistler waves. We also assess the main changes in the ray trajectory and implications for cyclotron resonance wave-particle interactions, including energetic particle precipitation.
Investigation of propagation algorithms for ray-tracing simulation of polarized neutrons
Bergbäck Knudsen, Erik; Tranum-Rømer, A.; Willendrup, Peter Kjær
2014-01-01
Ray-tracing of polarized neutrons faces a challenge when the neutron propagates through an inhomogeneous magnetic field. This affects simulations of novel instruments using encoding of energy or angle into the neutron spin. We here present a new implementation of propagation of polarized neutrons...
GPU-based ray tracing algorithm for high-speed propagation prediction in typical indoor environments
Guo, Lixin; Guan, Xiaowei; Liu, Zhongyu
2015-10-01
A fast 3-D ray tracing propagation prediction model based on a virtual source tree is presented in this paper, whose theoretical foundations are geometrical optics (GO) and the uniform theory of diffraction (UTD). For a typical single-room indoor scene, several acceleration techniques that exploit the geometrical and electromagnetic information are adopted to raise the efficiency of the ray tracing algorithm. The simulation results indicate that the runtime of the ray tracing algorithm increases sharply as the number of objects in the room grows, so GPU acceleration technology is used to address that problem. GPUs favor arithmetic throughput over branching logic, and tens of thousands of threads in CUDA programs can compute simultaneously, achieving massively parallel acceleration. Finally, a typical single room with several objects is simulated using the serial ray tracing algorithm and the parallel one respectively. The results show that, compared with the serial algorithm, the GPU-based one achieves far greater efficiency.
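The virtual source tree at the heart of such GO-based ray tracers is built from image sources: mirroring the transmitter across each reflecting wall yields a virtual source whose straight-line distance to the receiver equals the length of the reflected path. A minimal first-order sketch (the geometry below is made up for illustration; a real tracer recurses this over all walls to build the tree):

```python
import numpy as np

def image_source(src, wall_point, wall_normal):
    """Mirror a source across a flat wall (first-order GO image)."""
    n = np.asarray(wall_normal, float)
    n = n / np.linalg.norm(n)
    d = float(np.dot(np.asarray(src, float) - wall_point, n))
    return src - 2.0 * d * n

# Hypothetical geometry: source above the wall plane y = 0
src = np.array([1.0, 2.0, 1.0])
img = image_source(src, np.zeros(3), np.array([0.0, 1.0, 0.0]))
rx = np.array([4.0, 1.0, 1.0])
# Length of the reflected path src -> wall -> rx equals |img - rx|
refl_len = float(np.linalg.norm(img - rx))
```

Higher-order reflections simply mirror the images again, which is why the set of virtual sources forms a tree whose depth is the reflection order.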
HUANG Yueqin; ZHANG Jianzhong
2008-01-01
A three-dimensional (3-D) sound ray tracing algorithm in heterogeneous media is studied. The algorithm comprises two steps: the first computes the wavefront traveltimes forward; the second traces the sound rays backward. In the first step, the wavefront traveltimes at discrete grid points from the sound source are computed from solutions of the eikonal equation, using the level-set-based group marching method (GMM). In the second step, sound rays are traced backward from the receiver, cell by cell, towards the sound source, using the wavefront traveltimes computed in the first step. Time values at arbitrary positions in each cuboid cell are expressed by linear interpolation of the wavefront traveltimes at that cell's grid points. The simulation results indicate that this method greatly improves both the accuracy and the efficiency of 3-D sound ray tracing.
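The backward-tracing step depends on evaluating traveltimes at arbitrary positions inside a cuboid cell from its eight grid-point values. The abstract says "linear interpolation"; trilinear interpolation is the natural reading for a cuboid cell, so the following sketch makes that assumption:

```python
def trilinear(t, fx, fy, fz):
    """Traveltime at fractional position (fx, fy, fz) in [0,1]^3 inside a
    cuboid cell, from the eight corner traveltimes t[i][j][k], i,j,k in {0,1}."""
    # Interpolate along x on each of the four cell edges...
    c00 = t[0][0][0] * (1 - fx) + t[1][0][0] * fx
    c10 = t[0][1][0] * (1 - fx) + t[1][1][0] * fx
    c01 = t[0][0][1] * (1 - fx) + t[1][0][1] * fx
    c11 = t[0][1][1] * (1 - fx) + t[1][1][1] * fx
    # ...then along y, then along z.
    return ((c00 * (1 - fy) + c10 * fy) * (1 - fz)
            + (c01 * (1 - fy) + c11 * fy) * fz)
```

The backward trace can then step each ray down the gradient of this interpolated traveltime field from the receiver toward the source.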
A Sub-band Divided Ray Tracing Algorithm Using the DPS Subspace in UWB Indoor Scenarios
Gan, Mingming; Xu, Zhinan; Hofer, Markus
2015-01-01
Sub-band divided ray tracing (SDRT) is one technique that has been extensively used to obtain the channel characteristics for ultra-wideband (UWB) radio wave propagation in realistic indoor environments. However, the computational complexity of SDRT scales directly with the number of sub-bands. A...
Jonestrask, L.; Tauxe, L.; Shaar, R.; Jarboe, N.; Minnett, R.; Koppers, A. A. P.
2014-12-01
There are many data types and methods of analysis in rock and paleomagnetic investigations. The MagIC database (http://earthref.org/MAGIC) was designed to accommodate the vast majority of data used in such investigations. Yet getting data from the laboratory into the database, and visualizing and re-analyzing data downloaded from the database, makes special demands on data formatting. There are several recently published programming packages that deal with single types of data: demagnetization experiments (e.g., Lurcock et al., 2012), paleointensity experiments (e.g., Leonhardt et al., 2004), and FORC diagrams (e.g., Harrison et al., 2008). However, there is a need for a unified set of open source, cross-platform software that deals with the great variety of data types in a consistent way and facilitates importing data into the MagIC format, analyzing them and uploading them to the MagIC database. The PmagPy software package (http://earthref.org/PmagPy/cookbook/) comprises such a comprehensive set of tools. It facilitates conversion of many laboratory formats into the common MagIC format and allows interpretation of demagnetization and Thellier-type experimental data. With some 175 programs and over 250 functions, it can be used to create a wide variety of plots and allows manipulation of downloaded data sets as well as preparation of new contributions for uploading to the MagIC database.
Carbillet, Marcel; Jolissaint, Laurent; Maire, Anne-Lise
We present the Software Package PAOLAC ("PAOLA within Caos") in its first distributed version. This new numerical simulation tool embeds the analytical adaptive optics simulation code PAOLA ("Performance of Adaptive Optics for Large (or Little) Apertures") within the CAOS problem-solving environment. The main goal of this new tool is to allow an easier and direct comparison between studies performed with the analytical open-loop code PAOLA and studies performed with the end-to-end closed-loop Software Package CAOS ("Code for Adaptive Optics Systems"), with the ultimate aim of better understanding how to take advantage of the two approaches: one analytical, allowing extremely quick results on a wide range of cases, and the other extremely detailed but with computational and memory costs that can be substantial. The practical implementation of this embedment is briefly described, showing that it does not affect any aspect of the original code, which is simply called directly from the CAOS global graphical interface through ad hoc modules. A comparison between end-to-end modelling and analytical modelling is also initiated, within the specific framework of wide-field adaptive optics at Dome C, Antarctica.
Lee, Woonghee; Stark, Jaime L; Markley, John L
2014-11-01
Peak-picking Of Noe Data Enabled by Restriction Of Shift Assignments-Client Server (PONDEROSA-C/S) builds on the original PONDEROSA software (Lee et al. in Bioinformatics 27:1727-1728. doi: 10.1093/bioinformatics/btr200, 2011) and includes improved features for structure calculation and refinement. PONDEROSA-C/S consists of three programs: Ponderosa Server, Ponderosa Client, and Ponderosa Analyzer. PONDEROSA-C/S takes as input the protein sequence, a list of assigned chemical shifts, and nuclear Overhauser data sets ((13)C- and/or (15)N-NOESY). The output is a set of assigned NOEs and 3D structural models for the protein. Ponderosa Analyzer supports the visualization, validation, and refinement of the results from Ponderosa Server. These tools enable semi-automated NMR-based structure determination of proteins in a rapid and robust fashion. We present examples showing the use of PONDEROSA-C/S in solving structures of four proteins: two that enable comparison with the original PONDEROSA package, and two from the Critical Assessment of automated Structure Determination by NMR (Rosato et al. in Nat Methods 6:625-626. doi: 10.1038/nmeth0909-625, 2009) competition. The software package can be downloaded freely in binary format from http://pine.nmrfam.wisc.edu/download_packages.html. Registered users of the National Magnetic Resonance Facility at Madison can submit jobs to the PONDEROSA-C/S server at http://ponderosa.nmrfam.wisc.edu, where instructions and tutorials can be found. Structures are normally returned within 1-2 days.
Yosef Meller
2016-06-01
The Particle Tracking Velocimetry (PTV) community employs several formats of particle information, such as position and velocity as a function of time, i.e. trajectory data, as a result of diverging needs unmet by existing formats, and a number of different, mostly home-grown, codes for handling the data. Flowtracks is a Python package that provides a single code base for accessing different formats as a database, i.e. storing data and programmatically manipulating them using format-agnostic data structures. Furthermore, it offers an HDF5-based format that is fast and extensible, obviating the need for other formats. The package may be obtained from https://github.com/OpenPTV/postptv and used as-is by many fluid-dynamics labs, or, with minor extensions adhering to a common interface, by researchers from other fields, such as biology and population tracking.
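A format-agnostic trajectory record of the kind described above can be sketched as a small data structure; note that the field names and the `velocity_from_positions` helper below are hypothetical illustrations, not Flowtracks' actual API:

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class Trajectory:
    """Format-agnostic trajectory record (illustrative sketch only;
    field names are assumptions, not the Flowtracks API)."""
    trajid: int
    time: np.ndarray   # frame numbers
    pos: np.ndarray    # (n, 3) particle positions
    vel: np.ndarray    # (n, 3) particle velocities

    def velocity_from_positions(self, dt):
        """Finite-difference velocities, useful when a source format
        stores positions only."""
        return np.gradient(self.pos, dt, axis=0)

# A particle moving at constant speed along x, 0.1 units per 0.1 s frame:
tr = Trajectory(trajid=1, time=np.arange(5),
                pos=np.outer(np.arange(5) * 0.1, [1.0, 0.0, 0.0]),
                vel=np.zeros((5, 3)))
v = tr.velocity_from_positions(dt=0.1)
```

Readers for each file format would populate such records, so downstream analysis code never touches the raw formats, which is the design point the abstract emphasizes.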
María Gabriela Mago Ramos
2012-08-01
A methodology was developed for analysing faults in distribution transformers using the statistical package for social sciences (SPSS); it consisted of organising and creating a database of failed equipment, incorporating such data into the processing programme and converting all the information into numerical variables to be processed, thereby obtaining descriptive statistics and enabling factor and discriminant analysis. The research was based on information provided by companies in areas served by Corpoelec (Valencia, Venezuela) and Codensa (Bogotá, Colombia).
Grasso, Chiara; Trevisan, Morena; Fiano, Valentina; Tarallo, Valentina; De Marco, Laura; Sacerdote, Carlotta; Richiardi, Lorenzo; Merletti, Franco; Gillio-Tos, Anna
2016-01-01
Background: Pyrosequencing has emerged as an alternative method of nucleic acid sequencing, well suited for many applications which aim to characterize single nucleotide polymorphisms, mutations, microbial types and CpG methylation in the target DNA. The commercially available pyrosequencing systems can harbor two different types of software which allow analysis in AQ or CpG mode, respectively, both widely employed for DNA methylation analysis. Objective: The aim of the study was to assess the performance for DNA methylation analysis at CpG sites of the two pyrosequencing software which allow analysis in AQ or CpG mode, respectively. Despite CpG mode having been specifically generated for CpG methylation quantification, many investigations on this topic have been carried out with AQ mode. As proof of equivalent performance of the two software for this type of analysis is not available, the focus of this paper was to evaluate if the two modes currently used for CpG methylation assessment by pyrosequencing may give overlapping results. Methods: We compared the performance of the two software in quantifying DNA methylation in the promoter of selected genes (GSTP1, MGMT, LINE-1) by testing two case series which include DNA from paraffin-embedded prostate cancer tissues (PC study, N = 36) and DNA from blood fractions of healthy people (DD study, N = 28), respectively. Results: We found discrepancy in the quality assignment of DNA methylation assays by the two pyrosequencing software. Compared to the software for analysis in AQ mode, less permissive criteria are supported by the Pyro Q-CpG software, which enables analysis in CpG mode. CpG mode warns the operators about potential unsatisfactory performance of the assay and ensures a more accurate quantitative evaluation of DNA methylation at CpG sites. Conclusion: The implementation of CpG mode is strongly advisable in order to improve the reliability of the methylation analysis results achievable by pyrosequencing.
Ray-tracing simulations of liquid-crystal gradient-index lenses for three-dimensional displays
Sluijter, M.; Herzog, A.; De Boer, D.K.G.; Krijn, M.P.C.M.; Urbach, P.H.
2009-01-01
For the first time, to our knowledge, we report ray-tracing simulations of an advanced liquid-crystal gradient-index lens structure for application in switchable two-dimensional/three-dimensional (3D) autostereoscopic displays. We present ray-tracing simulations of the angular-dependent lens action.
Hernandez, F. [Laboratorio de Fisica Medica y Radioactividad Ambiental, Departamento de Medicina Fisica y Farmacologia, Universidad de La Laguna, 38320 La Laguna, Tenerife (Spain)]. E-mail: fimerall@ull.es; Gonzalez-Manrique, S. [Laboratorio de Fisica Medica y Radioactividad Ambiental, Departamento de Medicina Fisica y Farmacologia, Universidad de La Laguna, 38320 La Laguna, Tenerife (Spain); Karlsson, L. [Laboratorio de Fisica Medica y Radioactividad Ambiental, Departamento de Medicina Fisica y Farmacologia, Universidad de La Laguna, 38320 La Laguna, Tenerife (Spain); Hernandez-Armas, J. [Laboratorio de Fisica Medica y Radioactividad Ambiental, Departamento de Medicina Fisica y Farmacologia, Universidad de La Laguna, 38320 La Laguna, Tenerife (Spain); Aparicio, A. [Instituto de Astrofisica de Canarias, 38200 La Laguna, Tenerife (Spain); Departamento de Astrofisica, Universidad de La Laguna. Avenida. Astrofisico Francisco Sanchez s/n, 38071 La Laguna, Tenerife (Spain)
2007-03-15
Makrofol detectors are commonly used for long-term radon (²²²Rn) measurements in houses, schools and workplaces. The use of this type of passive detector for the determination of radon concentrations requires counting the nuclear tracks produced by alpha particles on the detecting material. The 'image reduction and analysis facility' (IRAF) software package is commonly used in astronomical applications. It allows detailed counting and mapping of sky sections where stars are grouped very closely, even forming clusters. In order to count the nuclear tracks in our Makrofol radon detectors, we have developed an inter-disciplinary application that takes advantage of the similarity that exists between counting stars in a dark sky and tracks in a track-etch detector. Thus, a low-cost semi-automatic system has been set up in our laboratory which utilises a commercially available desktop scanner and the IRAF software package. The proposed semi-automatic method and its performance, in comparison to ocular counting, are described in detail here. In addition, the calibration factor for this procedure, 2.97 ± 0.07 kBq m⁻³ h track⁻¹ cm², has been calculated based on the results obtained from exposing 46 detectors to certified radon concentrations. Furthermore, the results of a preliminary radon survey carried out in 62 schools on the island of Tenerife (Spain), using Makrofol detectors counted with the mentioned procedure, are briefly presented. The results reported here indicate that the developed procedure permits a fast, accurate and unbiased determination of the radon tracks in a large number of detectors. The measurements carried out in the schools showed that the radon concentrations in at least 12 schools were above 200 Bq m⁻³ and, in two of them, above 400 Bq m⁻³. Further studies should be performed at those schools following the European Union recommendations about radon concentrations in
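Given the reported calibration factor, converting a counted track density into a mean radon concentration is a one-line computation. The relation assumed below (concentration = factor × track density / exposure time) is inferred from the factor's units, and the worked numbers are invented:

```python
def radon_concentration(track_density, exposure_hours, cf=2.97):
    """Mean radon concentration in kBq/m^3 from the net track density in
    tracks/cm^2, using the reported calibration factor
    cf = 2.97 kBq m^-3 h track^-1 cm^2 (relation inferred from the units)."""
    return cf * track_density / exposure_hours

# Invented example: 2000 tracks/cm^2 after a 90-day (2160 h) exposure
c_kbq = radon_concentration(2000.0, 90 * 24)   # 2.75 kBq/m^3 = 2750 Bq/m^3
```

The ±0.07 uncertainty on the factor would propagate proportionally (about 2.4%) onto the reported concentration.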
MATCH - A Software Package for Robust Profile Matching Using S-Plus
Douglas P. Wiens
2004-04-01
This manual details the implementation of the profile matching techniques introduced in Robust Estimation of Air-Borne Particulate Matter (Wiens, Florence and Hiltz, Environmetrics, 2001 - included as an appendix). The program consists of a collection of functions written in S. It runs in S-Plus, including the student version. A graphical user interface is supplied for easy use by a user with only a passing familiarity with S-Plus. A description of the software is given, together with an extensive example of an analysis of a data set using the software. The software is available at http://www.stat.ualberta.ca/~wiens/publist.htm where it is linked to the listing for Wiens, Florence and Hiltz (2001).
A software package for stellar and solar inverse-Compton emission: Stellarics
Orlando, Elena
2013-01-01
We present our software to compute gamma-ray emission from inverse-Compton scattering by cosmic-ray leptons in the heliosphere, as well as in the photospheres of stars. It includes a formulation of modulation in the heliosphere, but it can be used for any user-defined modulation model. Profiles and spectra are output to FITS files in a variety of forms for convenient use. Also included are general-purpose inverse-Compton routines with other features like energy loss rates and emissivity for any user-defined target photon and lepton spectra. The software is publicly available and it is under continuing development.
A software package for Stellar and solar Inverse Compton emission: StellarICs
Orlando, Elena; Strong, Andrew
2013-06-01
We present our software to compute gamma-ray emission from inverse-Compton scattering by cosmic-ray leptons in the heliosphere, as well as in the photospheres of stars. It includes a formulation of modulation in the heliosphere, but can be used for any user-defined modulation model. Profiles and spectra are output to FITS files in a variety of forms for convenient use. Also included are general-purpose inverse-Compton routines with other features like energy loss rates and emissivity for any user-defined target photon and lepton spectra. The software is publicly available and it is under continuing development.
Area of ischemia assessed by physicians and software packages from myocardial perfusion scintigrams
Edenbrandt, L.; Hoglund, P.; Frantz, S.
2014-01-01
Background: The European Society of Cardiology recommends that patients with > 10% area of ischemia should receive revascularization. We investigated inter-observer variability for the extent of ischemic defects reported by different physicians and by different software tools, and if inter-observ...
Educational Administrative Software Packages: Alternatives to In-House Developed Systems.
Harris, Edward V.
1985-01-01
Historically, educational institutions have largely relied on in-house development of administrative software. However, the costs of skilled programmers and rapidly advancing technology are making in-house development too expensive. These and other factors are addressed and changes needed for future educational administrative computing support are…
Joyce, Karen E; Hayasaka, Satoru
2012-10-01
Although there are a number of statistical software tools for voxel-based massively univariate analysis of neuroimaging data, such as fMRI (functional MRI), PET (positron emission tomography), and VBM (voxel-based morphometry), very few software tools exist for power and sample size calculation for neuroimaging studies. Unlike typical biomedical studies, outcomes from neuroimaging studies are 3D images of correlated voxels, requiring a correction for massive multiple comparisons. Thus, a specialized power calculation tool is needed for planning neuroimaging studies. To facilitate this process, we developed a software tool specifically designed for neuroimaging data. The software tool, called PowerMap, implements theoretical power calculation algorithms based on non-central random field theory. It can also calculate power for statistical analyses with FDR (false discovery rate) corrections. This GUI (graphical user interface)-based tool enables neuroimaging researchers without advanced knowledge in imaging statistics to calculate power and sample size in the form of 3D images. In this paper, we provide an overview of the statistical framework behind the PowerMap tool. Three worked examples are also provided, a regression analysis, an ANOVA (analysis of variance), and a two-sample T-test, in order to demonstrate the study planning process with PowerMap. We envision that PowerMap will be a great aid for future neuroimaging research.
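The impact of a massive multiple-comparison correction on per-voxel power can be illustrated with a crude z-test/Bonferroni sketch. PowerMap itself uses non-central random field theory, which is less conservative than Bonferroni; the function and parameter names below are invented for illustration:

```python
from statistics import NormalDist

def voxelwise_power(d, n_per_group, n_voxels, alpha=0.05):
    """Approximate per-voxel power of a two-sample z-test under a
    Bonferroni correction over n_voxels comparisons (a crude stand-in
    for PowerMap's random-field-theory calculation)."""
    nd = NormalDist()
    z_crit = nd.inv_cdf(1 - alpha / (2 * n_voxels))  # two-sided, corrected
    ncp = d * (n_per_group / 2) ** 0.5               # non-centrality parameter
    return 1 - nd.cdf(z_crit - ncp)

# Correcting over 50,000 voxels sharply reduces power versus a single test:
p_corrected = voxelwise_power(d=1.0, n_per_group=30, n_voxels=50000)
p_single = voxelwise_power(d=1.0, n_per_group=30, n_voxels=1)
```

This is exactly why a dedicated tool matters: naive Bonferroni planning would demand far larger samples than a random-field-based threshold.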
Ashraf, H.; Hoop, B. de; Shaker, S.B.; Dirksen, A.; Bach, K.S.; Hansen, H.; Prokop, M.; Pedersen, J.H.
2010-01-01
OBJECTIVE: We examined the reproducibility of lung nodule volumetry software that offers three different volumetry algorithms. METHODS: In a lung cancer screening trial, 188 baseline nodules >5 mm were identified. Including follow-ups, these nodules formed a study-set of 545 nodules. Nodules were
A software package for predicting design-flood hydrographs in small and ungauged basins
Rodolfo Piscopia
2015-06-01
In this study, software for estimating design hydrographs in small and ungauged basins is presented. The main aim is to propose a fast and user-friendly empirical tool that the practitioner can apply to hydrological studies characterised by a lack of observed data. The software implements a framework of the same name, the event-based approach for small and ungauged basins (EBA4SUB), which was recently developed and tested by the authors to estimate the design peak discharge using the same input information necessary to apply the rational formula. EBA4SUB is a classical hydrological event-based model in which each step (design hyetograph, net rainfall estimation, and rainfall-runoff transformation) is appropriately adapted for empirical applications without calibration. As a case study, the software is applied in a small watershed while varying the hyetograph shape, rainfall peak position, and return time. The results provide an overview of the software and confirm the secondary role of the design rainfall peak position.
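The "net rainfall estimation" step of an event-based model is commonly performed with the SCS Curve Number method. The sketch below assumes a CN-type approach with the usual Ia = 0.2·S initial abstraction; EBA4SUB's exact variant may differ:

```python
def scs_cn_runoff(p_mm, cn):
    """Net rainfall (direct runoff, mm) from event rainfall p_mm via the
    SCS Curve Number method with Ia = 0.2*S initial abstraction.
    Generic sketch; not necessarily EBA4SUB's exact formulation."""
    s = 25400.0 / cn - 254.0      # potential maximum retention (mm)
    ia = 0.2 * s                  # initial abstraction (mm)
    if p_mm <= ia:
        return 0.0                # all rainfall abstracted, no runoff
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

q = scs_cn_runoff(100.0, 75)      # ~41 mm of net rainfall from a 100 mm event
```

The net rainfall hyetograph produced this way would then feed the rainfall-runoff transformation (e.g. a unit hydrograph) to yield the design hydrograph.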
Lydia Hellrung
In this work we present a new open source software package offering a unified framework for the real-time adaptation of fMRI stimulation procedures. The software provides a straightforward setup and a highly flexible approach to adapting fMRI paradigms while the experiment is running. The general framework comprises the inclusion of parameters of the subject's compliance, such as directing gaze to visually presented stimuli, and physiological fluctuations, like blood pressure or pulse. Additionally, this approach opens possibilities to investigate complex scientific questions, for example the influence of EEG rhythms or of fMRI signal results themselves. To prove the concept of this approach, we used our software in a usability example for an fMRI experiment where the presentation of emotional pictures depended on the subject's gaze position, which can have a significant impact on the results. So far, if this is taken into account during fMRI data analysis, it is commonly done by the post-hoc removal of erroneous trials. Here, we propose an a priori adaptation of the paradigm during the experiment's runtime. Our fMRI findings clearly show the benefits of an adapted paradigm in terms of statistical power and higher effect sizes in emotion-related brain regions. This can be of special interest for all experiments with low statistical power due to a limited number of subjects, a limited amount of time, costs or available data to analyze, as is the case with real-time fMRI.
Anderson Gordon A
2009-03-01
Background: Data generated from liquid chromatography coupled to high-resolution mass spectrometry (LC-MS)-based studies of a biological sample can contain large amounts of biologically significant information in the form of proteins, peptides, and metabolites. Interpreting this data involves inferring the masses and abundances of biomolecules injected into the instrument. Because of the inherent complexity of mass spectral patterns produced by these biomolecules, the analysis is significantly enhanced by using visualization capabilities to inspect and confirm results. In this paper we describe Decon2LS, an open-source software package for automated processing and visualization of high-resolution MS data. Drawing extensively on algorithms developed over the last ten years for ICR2LS, Decon2LS packages the algorithms as a rich set of modular, reusable processing classes for performing diverse functions such as reading raw data, routine peak finding, theoretical isotope distribution modelling, and deisotoping. Because the source code is openly available, these functionalities can now be used to build derivative applications in a relatively fast manner. In addition, Decon2LS provides an extensive set of visualization tools, such as high-performance chart controls. Results: With a variety of options that include peak processing, deisotoping, isotope composition, etc., Decon2LS supports processing of multiple raw data formats. Deisotoping can be performed on an individual scan, an individual dataset, or on multiple datasets using batch processing. Other processing options include creating a two-dimensional view of mass and liquid chromatography (LC) elution time features, generating spectrum files for tandem MS data, creating total intensity chromatograms, and visualizing theoretical peptide profiles. Application of Decon2LS to deisotope different datasets obtained across different instruments yielded a high number of features that can be used to
Martin Nikolai Hebart
2015-01-01
The multivariate analysis of brain signals has recently sparked a great amount of interest, yet accessible and versatile tools to carry out decoding analyses are scarce. Here we introduce The Decoding Toolbox (TDT), which represents a user-friendly, powerful and flexible package for multivariate analysis of functional brain imaging data. TDT is written in Matlab and equipped with an interface to the widely used brain data analysis package SPM. The toolbox allows running fast whole-brain analyses, region-of-interest analyses and searchlight analyses, using machine learning classifiers, pattern correlation analysis, or representational similarity analysis. It offers automatic creation and visualization of diverse cross-validation schemes, feature scaling, nested parameter selection, a variety of feature selection methods, multiclass capabilities, and pattern reconstruction from classifier weights. While basic users can implement a generic analysis in one line of code, advanced users can extend the toolbox to their needs or exploit the structure to combine it with external high-performance classification toolboxes. The toolbox comes with an example data set which can be used to try out the various analysis methods. Taken together, TDT offers a promising option for researchers who want to employ multivariate analyses of brain activity patterns.
Bretas, Elisa Almeida Sathler; Torres, Ulysses S; Torres, Lucas Rios; Bekhor, Daniel; Saito Filho, Celso Fernando; Racy, Douglas Jorge; Faggioni, Lorenzo; D'Ippolito, Giuseppe
2017-10-01
To evaluate the agreement between the measurements of perfusion CT parameters in normal livers by using two different software packages. This retrospective study was based on 78 liver perfusion CT examinations acquired for detecting suspected liver metastasis. Patients with any morphological or functional hepatic abnormalities were excluded. The final analysis included 37 patients (59.7 ± 14.9 y). Two readers (1 and 2) independently measured perfusion parameters using different software packages from two major manufacturers (A and B). Arterial perfusion (AP) and portal perfusion (PP) were determined using the dual-input vascular one-compartmental model. Inter-reader agreement for each package and intrareader agreement between both packages were assessed with intraclass correlation coefficients (ICC) and Bland-Altman statistics. Inter-reader agreement was substantial for AP using software A (ICC = 0.82) and B (ICC = 0.85-0.86), fair for PP using software A (ICC = 0.44) and fair to moderate for PP using software B (ICC = 0.56-0.77). Intrareader agreement between software A and B ranged from slight to moderate (ICC = 0.32-0.62) for readers 1 and 2 considering the AP parameters, and from fair to moderate (ICC = 0.40-0.69) for readers 1 and 2 considering the PP parameters. At best there was only moderate agreement between both software packages, resulting in some uncertainty and suboptimal reproducibility. Advances in knowledge: Software-dependent factors may contribute to variance in perfusion measurements, demanding further technical improvements. AP measurements seem to be the most reproducible parameter to be adopted when evaluating liver perfusion CT.
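Agreement statistics of the kind reported above can be sketched briefly. A minimal Bland-Altman computation (mean bias and 95% limits of agreement) on hypothetical paired readings; the values and variable names are made up for illustration, not the study's data:

```python
import statistics

def bland_altman(a, b):
    """Bland-Altman statistics for paired measurements from two software
    packages: mean bias and 95% limits of agreement (bias +/- 1.96 SD)."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Toy paired perfusion readings from packages A and B.
a = [24.0, 30.0, 27.5, 22.0, 26.0]
b = [25.0, 29.0, 28.5, 23.5, 25.0]
bias, (low, high) = bland_altman(a, b)
print(round(bias, 2))
```

ICC values like those in the abstract additionally partition variance between readers and subjects; Bland-Altman shows the complementary view of systematic bias and spread.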
Riddell, A.E.; Britcher, A.R. (British Nuclear Fuels plc, Sellafield (United Kingdom))
1994-01-01
The PLUTO software package was developed at Sellafield to make optimum use of the analysis data from plutonium in urine samples in arriving at the best estimate of intake/uptake. The program prompts the assessor to enter the assessment parameters required to fit the data to the excretion function using the maximum likelihood method. A critical appraisal is given of the relative strengths and weaknesses of this assessment package. (author).
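The maximum-likelihood fitting step described above can be sketched in a few lines. This is an illustrative reconstruction, not PLUTO's actual code: the power-law excretion function and the lognormal error model are assumptions made for the example:

```python
import math

def excretion_fraction(t_days, b=1.5):
    # Hypothetical power-law excretion function per unit intake
    # (illustrative only; PLUTO's real excretion function differs).
    return t_days ** (-b)

def ml_intake(times, measurements, b=1.5):
    """Maximum-likelihood intake estimate assuming lognormal measurement
    errors: ln(m_i) = ln(I) + ln(f(t_i)) + noise, so the ML estimate of
    ln(I) is the mean of ln(m_i) - ln(f(t_i))."""
    logs = [math.log(m) - math.log(excretion_fraction(t, b))
            for t, m in zip(times, measurements)]
    return math.exp(sum(logs) / len(logs))

# Synthetic urine data for a true intake of 100 units.
times = [1.0, 2.0, 5.0, 10.0]
data = [100.0 * excretion_fraction(t) for t in times]
est = ml_intake(times, data)
print(round(est, 3))  # recovers 100.0 on noise-free data
```

With real, noisy bioassay data the same likelihood is maximised numerically, and the assessor-supplied parameters (as the abstract describes) select the excretion model and error structure.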
Tahari, Abdel K.; Lee, Andy; Rajaram, Mahadevan; Fukushima, Kenji; Lodge, Martin A.; Wahl, Richard L.; Bravo, Paco E. [Divisions of Nuclear Medicine, Johns Hopkins Medical Institutions, Department of Radiology, Baltimore, MD (United States); Lee, Benjamin C. [INVIA Medical Imaging Solutions, Ann Arbor, MI (United States); Ficaro, Edward P. [University of Michigan Health Systems, Ann Arbor, MI (United States); Nekolla, Stephan [Technical University of Munich, Munich (Germany); Klein, Ran; DeKemp, Robert A. [University of Ottawa Heart Institute, Ottawa (Canada); Bengel, Frank M. [Hannover Medical School, Department of Nuclear Medicine, Hannover (Germany)
2014-01-15
packages. Quantitative assessment of resting and stress MBF with {sup 82}Rb PET is dependent on the software and methods used, whereas CFR appears to be more comparable. Follow-up and treatment assessment should be done with the same software and method. (orig.)
2007-11-02
Luca, "A procedure for decomposing the myoelectric signal into its constituent action potentials," IEEE Trans. Biomed. Eng., vol. BME-29, pp. 149...Abstract: The analysis of intramuscular EMG signals is based on the decomposition of the signals into basic units. Existing decomposition...software only supports short registration periods or single-channel recordings of signals of constant muscle effort. In this paper, we present the
Chao-Chun Chen
2013-12-01
Hadoop MapReduce is a programming model for designing auto-scalable distributed computing applications. It provides developers with an effective environment for attaining automatic parallelization. However, most existing manufacturing systems are difficult to migrate to a MapReduce private cloud, due to platform incompatibility and the tremendous complexity of system reconstruction. To increase the efficiency of manufacturing systems with minimal modification of existing systems, we design a framework in this work, called MC-Framework: Multi-user-based Cloudizing-Application Framework. It provides a simple interface for users to fairly execute requested tasks with traditional standalone software packages in MapReduce-based private cloud environments. Moreover, this work focuses on multiuser workloads, for which the default Hadoop scheduling scheme, i.e., FIFO, would increase delay. Hence, we also propose a new scheduling mechanism, called Job-Sharing Scheduling, to distribute jobs fairly across machines in the MapReduce-based private cloud. We then prototype an experimental virtual-metrology module of a manufacturing system as a case study to verify and analyse the proposed MC-Framework. The results of our experiments indicate that the proposed framework substantially improves time performance compared with the original package.
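The map-shuffle-reduce flow that MC-Framework builds on can be sketched in-process. This toy word count is the canonical illustration of the MapReduce model, not MC-Framework's actual code; all names here are made up for the example:

```python
from collections import defaultdict

def map_reduce(records, mapper, reducer):
    """Minimal in-process sketch of the MapReduce model: map each record
    to (key, value) pairs, group by key (the shuffle), reduce each group."""
    groups = defaultdict(list)
    for rec in records:
        for key, val in mapper(rec):
            groups[key].append(val)          # shuffle/sort phase
    return {k: reducer(k, vs) for k, vs in groups.items()}

# Word count: the canonical MapReduce example.
docs = ["lot tool wafer", "wafer metrology", "wafer lot"]
counts = map_reduce(
    docs,
    mapper=lambda doc: [(w, 1) for w in doc.split()],
    reducer=lambda k, vs: sum(vs),
)
print(counts["wafer"])  # 3
```

A real Hadoop job distributes the map and reduce phases across machines; the sketch only shows the data flow that a scheduler such as the proposed Job-Sharing Scheduling would arbitrate.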
The software package AIRY 7.0: new efficient deconvolution methods for post-adaptive optics data
La Camera, Andrea; Carbillet, Marcel; Prato, Marco; Boccacci, Patrizia; Bertero, Mario
2016-07-01
The Software Package AIRY (acronym of Astronomical Image Restoration in interferometrY) is a complete tool for the simulation and the deconvolution of astronomical images. The data can be a post-adaptive-optics image of a single-dish telescope or a set of multiple images of a Fizeau interferometer. Written in IDL and freely downloadable, AIRY is a package of the CAOS Problem-Solving Environment. It is made of different modules, each one performing a specific task, e.g. simulation, deconvolution, and analysis of the data. In this paper we present the latest version of AIRY, containing a new optimized method for the deconvolution problem based on the scaled-gradient projection (SGP) algorithm, extended with different regularization functions. Moreover, a new module based on our multi-component method has been added to AIRY. Finally, we provide a few example projects describing our multi-step method recently developed for the deblurring of high dynamic range images. By using AIRY v.7.0, users have a powerful tool for simulating observations and for reconstructing their real data.
Ye. P. Krupochkin
2017-01-01
Mathematical and scientific methods are highly significant in modern geoarcheological study. They contribute to the development of new computer technologies and to their implementation in geoarcheological research, in particular the decoding and photogrammetric processing of space images. The article focuses on the “Detection Artifacts” software package designed for thematic aerospace image decoding, which aims to automate the search for various archeological sites, both natural and artificially created. The main attention is drawn to the decoding of archeological sites using methods of morphological analysis and indicative decoding. Its work is based on two groups of computer image processing methods: (1) an image enhancement method carried out by means of spatial frequency filtration, and (2) a method of morphometric analysis. Spatial frequency filtration can be used to solve two problems: minimizing information noise and enhancing edges. To achieve the best results with spatial frequency filtration, all information relevant to the objects of the search must be available. Searching for archeological sites is not only a photogrammetric task; in fact, the problem can be solved within photogrammetry with the application of aerospace and computer methods, a point the authors stress in order to avoid terminological ambiguity and confusion when describing the essence of the methods and processes. It should be noted that the work with the images must be executed in a strict sequence: first and foremost, photogrammetric processing (atmospheric correction, geometric adjustment, conversion and geotargeting) is implemented, and only after that can one proceed to decoding the information. When creating the software package, a modular structure was applied, which favorably affected the tasks being solved and corresponded to the conception of the search for archaeological objects
Herlocker, J. A.; Jiang, J.; Garcia, K. J.
2008-08-01
Common digital display systems have evolved into sophisticated optical devices. The rapid market growth in liquid crystal displays makes the simulation of full systems attractive, promoting virtual prototyping with decreased development times and improved manufacturability. Realistic simulation using commercial non-sequential ray tracing tools has been instrumental in this process, but the need to accurately model polarization devices has become critical in many designs. As display systems seek more efficient use of light and more accurate color representation, the proper simulation of polarization devices with large acceptance angles is essential. This paper examines non-uniform polarization effects in the simulation of modern display devices using realistic polarizer and retarder models in the ASAP® non-sequential ray-tracing environment.
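Polarizers and retarders of the kind simulated above are commonly modelled with Jones matrices. A minimal sketch (not ASAP's internals; the matrices are the standard textbook forms), verifying Malus's law for an ideal linear polarizer:

```python
import numpy as np

def polarizer(theta):
    # Jones matrix of an ideal linear polarizer with axis at angle theta.
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c * c, c * s], [c * s, s * s]])

def retarder(delta):
    # Jones matrix of a retarder (fast axis horizontal), retardance delta.
    return np.array([[1.0, 0.0], [0.0, np.exp(1j * delta)]])

# Malus's law: horizontal light through a polarizer at 60 degrees
# transmits cos^2(60 deg) = 0.25 of its intensity.
E_in = np.array([1.0, 0.0])
E_out = polarizer(np.pi / 3) @ E_in
intensity = float(np.vdot(E_out, E_out).real)

# A quarter-wave plate turns 45-degree linear light into circular
# polarization; intensity is preserved.
E_circ = retarder(np.pi / 2) @ (np.array([1.0, 1.0]) / np.sqrt(2))
print(round(intensity, 4))
```

Non-sequential tools extend this idea to rays at arbitrary incidence, where the effective retardance and polarizer axes vary with angle, which is exactly the non-uniformity the paper examines.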
Solar Proton Transport Within an ICRU Sphere Surrounded by a Complex Shield: Ray-trace Geometry
Slaba, Tony C.; Wilson, John W.; Badavi, Francis F.; Reddell, Brandon D.; Bahadori, Amir A.
2015-01-01
A computationally efficient 3DHZETRN code with enhanced neutron and light ion (Z is less than or equal to 2) propagation was recently developed for complex, inhomogeneous shield geometry described by combinatorial objects. Comparisons were made between 3DHZETRN results and Monte Carlo (MC) simulations at locations within the combinatorial geometry, and it was shown that 3DHZETRN agrees with the MC codes to the extent they agree with each other. In the present report, the 3DHZETRN code is extended to enable analysis in ray-trace geometry. This latest extension enables the code to be used within current engineering design practices utilizing fully detailed vehicle and habitat geometries. Through convergence testing, it is shown that fidelity in an actual shield geometry can be maintained in the discrete ray-trace description by systematically increasing the number of discrete rays used. It is also shown that this fidelity is carried into transport procedures and resulting exposure quantities without sacrificing computational efficiency.
GRay: a Massively Parallel GPU-Based Code for Ray Tracing in Relativistic Spacetimes
Chan, Chi-kwan; Ozel, Feryal
2013-01-01
We introduce GRay, a massively parallel integrator designed to trace the trajectories of billions of photons in a curved spacetime. This GPU-based integrator employs the stream processing paradigm, is implemented in CUDA C/C++, and runs on nVidia graphics cards. The peak performance of GRay using single precision floating-point arithmetic on a single GPU exceeds 300 GFLOP (or 1 nanosecond per photon per time step). For a realistic problem, where the peak performance cannot be reached, GRay is two orders of magnitude faster than existing CPU-based ray tracing codes. This performance enhancement allows more effective searches of large parameter spaces when comparing theoretical predictions of images, spectra, and lightcurves from the vicinities of compact objects to observations. GRay can also perform on-the-fly ray tracing within general relativistic magnetohydrodynamic algorithms that simulate accretion flows around compact objects. Making use of this algorithm, we calculate the properties of the shadows of K...
Robust Image Denoising using a Virtual Flash Image for Monte Carlo Ray Tracing
Moon, Bochang; Jun, Jong Yun; Lee, JongHyeob
2013-01-01
We propose an efficient and robust image-space denoising method for noisy images generated by Monte Carlo ray tracing methods. Our method is based on two new concepts: virtual flash images and homogeneous pixels. Inspired by recent developments in flash photography, virtual flash images emulate...... values. While denoising each pixel, we consider only homogeneous pixels—pixels that are statistically equivalent to each other. This makes it possible to define a stochastic error bound of our method, and this bound goes to zero as the number of ray samples goes to infinity, irrespective of denoising...... parameters. To highlight the benefits of our method, we apply our method to two Monte Carlo ray tracing methods, photon mapping and path tracing, with various input scenes. We demonstrate that using virtual flash images and homogeneous pixels with a standard denoising method outperforms state-of-the-art......
SOFTWARE PACKAGE FOR SOLVING THE PROBLEMS OF ANALYSIS AND SYNTHESIS OF NETWORKED CONTROL SYSTEMS
A. E. Emelyanov
2015-01-01
Modern control systems exchange data packets over network channels; such systems are called networked control systems. One promising direction in their development is the use of common computer networks in the control loop for the exchange of information between elements of the system. Such a construction of control systems leads to new problems, so their design and study must combine methods from different scientific fields, above all control theory and communication theory. However, not every developer has equally full knowledge of both areas. To solve the engineering problems and ensure the required quality of operation, methods were developed for the analysis and synthesis of networked control systems with data transmission over a channel with competing access methods. These techniques allow calculating the probability-time characteristics of the stochastic data-transmission process in such a channel, building the transients of the considered control systems, calculating their quality characteristics, determining the stability conditions of networked control systems, and tuning the parameters of digital controllers to optimize the respective criterion. These techniques form the basis for the software. The proposed software system allows the analysis and synthesis of the network through which the information exchange takes place, as well as the study of the networked system under a variety of control laws. The complex is structured according to the principles of modularity, hierarchy and nesting of modules. An easy-to-use interface allows a user to operate the software without special training.
PDBStat: a universal restraint converter and restraint analysis software package for protein NMR
Tejero, Roberto [Rutgers, The State University of New Jersey, Center for Advanced Biotechnology and Medicine (United States); Snyder, David [William Paterson University, Department of Chemistry (United States); Mao, Binchen; Aramini, James M.; Montelione, Gaetano T., E-mail: guy@cabm.rutgers.edu [Rutgers, The State University of New Jersey, Center for Advanced Biotechnology and Medicine (United States)
2013-08-15
The heterogeneous array of software tools used in the process of protein NMR structure determination presents organizational challenges in the structure determination and validation processes, and creates a learning curve that limits the broader use of protein NMR in biology. These challenges, including accurate use of data in different data formats required by software carrying out similar tasks, continue to confound the efforts of novices and experts alike. These important issues need to be addressed robustly in order to standardize protein NMR structure determination and validation. PDBStat is a C/C++ computer program originally developed as a universal coordinate and protein NMR restraint converter. Its primary function is to provide a user-friendly tool for interconverting between protein coordinate and protein NMR restraint data formats. It also provides an integrated set of computational methods for protein NMR restraint analysis and structure quality assessment, relabeling of prochiral atoms with correct IUPAC names, as well as multiple methods for analysis of the consistency of atomic positions indicated by their convergence across a protein NMR ensemble. In this paper we provide a detailed description of the PDBStat software, and highlight some of its valuable computational capabilities. As an example, we demonstrate the use of the PDBStat restraint converter for restrained CS-Rosetta structure generation calculations, and compare the resulting protein NMR structure models with those generated from the same NMR restraint data using more traditional structure determination methods. These results demonstrate the value of a universal restraint converter in allowing the use of multiple structure generation methods with the same restraint data for consensus analysis of protein NMR structures and the underlying restraint data.
Two-Dimensional Gel Electrophoresis Image Analysis via Dedicated Software Packages.
Maurer, Martin H
2016-01-01
Analyzing two-dimensional gel electrophoretic images is supported by a number of freely and commercially available software packages. Although each program is highly specific, all follow certain standardized algorithms. The general steps are: (1) detecting and separating individual spots, (2) subtracting background, (3) creating a reference gel and (4) matching the spots to the reference gel, (5) modifying the reference gel, (6) normalizing the gel measurements for comparison, (7) calibrating against isoelectric point and molecular weight markers, and moreover, (8) constructing a database containing the measurement results and (9) comparing data by statistical and bioinformatic methods.
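Steps (1) and (2) above can be illustrated with a toy sketch. The percentile background estimate and the 3x3 local-maximum spot detector are simplifications assumed for the example, not any particular package's actual algorithm:

```python
import numpy as np

def subtract_background(img, p=10):
    # Step (2): crude background estimate as a low percentile of the image
    # (a stand-in for the rolling-ball methods used by real gel software).
    return np.clip(img - np.percentile(img, p), 0, None)

def detect_spots(img, thresh):
    # Step (1): a pixel is a spot centre if it exceeds `thresh` and is a
    # local maximum within its 3x3 neighbourhood.
    found = []
    for i in range(1, img.shape[0] - 1):
        for j in range(1, img.shape[1] - 1):
            patch = img[i-1:i+2, j-1:j+2]
            if img[i, j] >= thresh and img[i, j] == patch.max():
                found.append((i, j))
    return found

gel = np.full((7, 7), 5.0)   # uniform background
gel[2, 2] = 50.0             # one synthetic spot
gel[4, 5] = 40.0             # a second spot
clean = subtract_background(gel)
spots = detect_spots(clean, thresh=10.0)
print(spots)
```

Real packages add spot separation, gel warping for step (4), and normalization, but the detect-then-quantify structure is the same.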
Mathematic models for a ray tracing method and its applications in wireless optical communications.
Zhang, Minglun; Zhang, Yangan; Yuan, Xueguang; Zhang, Jinnan
2010-08-16
This paper presents a new ray tracing method, which contains a whole set of mathematic models, and its validity is verified by simulations. In addition, both theoretical analysis and simulation results show that the computational complexity of the method is much lower than that of previous ones. Therefore, the method can be used to rapidly calculate the impulse response of wireless optical channels for complicated systems.
Statistical Inverse Ray Tracing for Image-Based 3D Modeling.
Liu, Shubao; Cooper, David B
2014-10-01
This paper proposes a new formulation and solution to image-based 3D modeling (aka "multi-view stereo") based on generative statistical modeling and inference. The proposed new approach, named statistical inverse ray tracing, models and estimates the occlusion relationship accurately through optimizing a physically sound image generation model based on volumetric ray tracing. Together with geometric priors, these are combined into a Bayesian formulation known as a Markov random field (MRF) model. This MRF model is different from typical MRFs used in image analysis in the sense that the ray clique, which models the ray-tracing process, consists of thousands of random variables instead of two to dozens. To handle the computational challenges associated with large clique size, an algorithm with linear computational complexity is developed by exploiting, using dynamic programming, the recursive chain structure of the ray clique. We further demonstrate the benefit of exact modeling and accurate estimation of the occlusion relationship by evaluating the proposed algorithm on several challenging data sets.
A rapid and accurate two-point ray tracing method in horizontally layered velocity model
TIAN Yue; CHEN Xiao-fei
2005-01-01
A rapid and accurate method for two-point ray tracing in horizontally layered velocity models is presented in this paper. Numerical experiments show that this method provides stable and rapid convergence with high accuracy, regardless of the 1-D velocity structure, takeoff angle and epicentral distance. This two-point ray tracing method is compared with the pseudobending technique and with the method advanced by Kim and Baag (2002). It turns out that the method in this paper is much more efficient and accurate than the pseudobending technique, but is only applicable to 1-D velocity models. Kim's method is equivalent to ours for cases without large takeoff angles, but it fails to work when the takeoff angle is close to 90°. On the other hand, the method presented in this paper is applicable to any takeoff angle, with rapid and accurate convergence. Therefore, this method is a good choice for two-point ray tracing problems in horizontally layered velocity models and is efficient enough to be applied to a wide range of seismic problems.
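A standard way to pose two-point ray tracing in a layered model is to shoot on Snell's ray parameter and iterate until the epicentral distance matches. The bisection sketch below is illustrative only; the paper's own iteration scheme may differ, and the layer model is made up:

```python
import math

def offset(p, layers):
    # Horizontal distance travelled by a down-going ray with ray
    # parameter p through layers given as (thickness, velocity) pairs;
    # Snell's law keeps p constant across the interfaces.
    return sum(h * p * v / math.sqrt(1.0 - (p * v) ** 2)
               for h, v in layers)

def two_point_ray(x_target, layers, tol=1e-10):
    """Find the ray parameter connecting source and receiver a horizontal
    distance x_target apart, by bisection (offset grows monotonically
    with p, so bisection always converges)."""
    v_max = max(v for _, v in layers)
    lo, hi = 0.0, (1.0 - 1e-12) / v_max   # p < 1/v_max for a transmitted ray
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if offset(mid, layers) < x_target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

layers = [(2.0, 3.0), (3.0, 4.5), (5.0, 6.0)]   # km, km/s (made-up model)
p = two_point_ray(4.0, layers)
print(round(offset(p, layers), 6))  # ≈ 4.0 km at the found ray parameter
```

Bisection is robust but slow compared with the Newton-type updates a production method would use; it shows the structure of the problem, not the paper's convergence rate.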
Vertex shading of the three-dimensional model based on ray-tracing algorithm
Hu, Xiaoming; Sang, Xinzhu; Xing, Shujun; Yan, Binbin; Wang, Kuiru; Dou, Wenhua; Xiao, Liquan
2016-10-01
Ray tracing is one of the research hotspots in photorealistic graphics. It is an important light and shadow technology in many industries working with three-dimensional (3D) structure, such as aerospace, games and video. Unlike the traditional method of pixel shading based on ray tracing, a novel ray tracing algorithm is presented to color and render the vertices of the 3D model directly. Rendering results depend on the degree of subdivision of the 3D model. A good light and shade effect is achieved by using a quad-tree data structure to adaptively subdivide a triangle according to the brightness difference of its vertices. The uniform grid algorithm is adopted to improve rendering efficiency. Besides, the rendering time is independent of the screen resolution. In theory, as long as the subdivision of the model is adequate, effects equal to those of pixel shading are obtained. In practice, a compromise can be struck between efficiency and visual quality.
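The adaptive subdivision driven by vertex brightness differences can be sketched as follows. The quad-tree bookkeeping is reduced to plain recursion, and the brightness field is a made-up stand-in for a per-vertex ray-traced intensity:

```python
def midpoint(p, q):
    return tuple((pi + qi) / 2 for pi, qi in zip(p, q))

def subdivide(tri, brightness, depth=0, max_depth=8, eps=0.1):
    """Adaptively subdivide a triangle (three 2-D vertices) when the
    brightness difference between its vertices exceeds `eps`.
    `brightness` maps a vertex to its shaded intensity (assumed to come
    from a per-vertex ray-tracing pass, as in the paper)."""
    a, b, c = tri
    vals = [brightness(a), brightness(b), brightness(c)]
    if depth >= max_depth or max(vals) - min(vals) <= eps:
        return [tri]  # flat enough: keep as a leaf
    # Split at edge midpoints into four sub-triangles.
    ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
    out = []
    for t in [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]:
        out += subdivide(t, brightness, depth + 1, max_depth, eps)
    return out

# Toy brightness field: a sharp gradient along x forces refinement there.
bright = lambda v: 1.0 if v[0] > 0.5 else 0.0
tris = subdivide(((0.0, 0.0), (1.0, 0.0), (0.0, 1.0)), bright, max_depth=3)
print(len(tris) > 1)  # the gradient triggers at least one subdivision
```

Triangles straddling the brightness edge refine down to `max_depth`, while flat regions stay coarse, which is the effect that makes rendering time independent of screen resolution.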
Benthin, Carsten; Wald, Ingo; Woop, Sven; Ernst, Manfred; Mark, William R
2012-09-01
Wide-SIMD hardware is power and area efficient, but it is challenging to efficiently map ray tracing algorithms to such hardware especially when the rays are incoherent. The two most commonly used schemes are either packet tracing, or relying on a separate traversal stack for each SIMD lane. Both work great for coherent rays, but suffer when rays are incoherent: The former experiences a dramatic loss of SIMD utilization once rays diverge; the latter requires a large local storage, and generates multiple incoherent streams of memory accesses that present challenges for the memory system. In this paper, we introduce a single-ray tracing scheme for incoherent rays that uses just one traversal stack on 16-wide SIMD hardware. It uses a bounding-volume hierarchy with a branching factor of four as the acceleration structure, exploits four-wide SIMD in each box and primitive intersection test, and uses 16-wide SIMD by always performing four such node or primitive tests in parallel. We then extend this scheme to a hybrid tracing scheme that automatically adapts to varying ray coherence by starting out with a 16-wide packet scheme and switching to the new single-ray scheme as soon as rays diverge. We show that on the Intel Many Integrated Core architecture this hybrid scheme consistently, and over a wide range of scenes and ray distributions, outperforms both packet and single-ray tracing.
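The single-stack traversal of a branching-factor-4 BVH can be sketched in scalar form (the paper's version performs the four child box tests in SIMD lanes). The 1-D "boxes" and the `hit_test` convention here are toy stand-ins invented for the example:

```python
def closest_hit(root, hit_test):
    """Scalar sketch of single-stack traversal of a branching-factor-4
    BVH ('BVH4'): pop a node, cull it if its bounds are missed, and push
    its (up to four) children onto the one traversal stack.
    hit_test(box_or_prim) returns a hit distance or None on a miss."""
    stack = [root]
    best = None
    while stack:
        node = stack.pop()
        if hit_test(node["bounds"]) is None:
            continue   # ray misses this subtree's bounding box
        if "prims" in node:
            for prim in node["prims"]:
                t = hit_test(prim)
                if t is not None and (best is None or t < best):
                    best = t
        else:
            stack.extend(node["children"])   # up to four children
    return best

def hit_test(obj):
    # 1-D stand-in: a "box" is a (lo, hi) interval along the ray axis,
    # a primitive is just its hit distance t (None once behind the origin).
    if isinstance(obj, tuple):
        lo, hi = obj
        return lo if hi >= 0.0 else None
    return obj if obj >= 0.0 else None

bvh = {"bounds": (0.0, 10.0), "children": [
    {"bounds": (0.0, 4.0), "prims": [3.0, 2.5]},
    {"bounds": (5.0, 9.0), "prims": [7.0]},
]}
print(closest_hit(bvh, hit_test))  # 2.5
```

The paper's hybrid scheme would run a 16-wide packet version of this loop first and fall back to the single-ray form once rays diverge; the data structure and stack discipline are the same.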
RAY-RAMSES: a code for ray tracing on the fly in N-body simulations
Barreira, Alexandre; Bose, Sownak; Li, Baojiu
2016-01-01
We present a ray tracing code to compute integrated cosmological observables on the fly in AMR N-body simulations. Unlike conventional ray tracing techniques, our code takes full advantage of the time and spatial resolution attained by the N-body simulation by computing the integrals along the line of sight on a cell-by-cell basis through the AMR simulation grid. Moreover, since it runs on the fly in the N-body run, our code can produce maps of the desired observables without storing large (or any) amounts of data for post-processing. We implemented our routines in the RAMSES N-body code and tested the implementation using an example of weak lensing simulation. We analyse basic statistics of lensing convergence maps and find good agreement with semi-analytical methods. The ray tracing methodology presented here can be used in several cosmological analyses, such as Sunyaev-Zel'dovich and integrated Sachs-Wolfe effect studies, as well as modified gravity. Our code can also be used in cross-checks of the more conv...
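Line-of-sight integration of a gridded field is the core operation above. The sketch below uses simple midpoint sampling; RAY-RAMSES instead integrates exactly segment by segment through the AMR cells, so this is only a conceptual stand-in with made-up inputs:

```python
import math

def line_integral(field, start, end, n_samples=1000):
    # Approximate the integral of field(x, y) along the segment
    # start -> end with midpoint-rule sampling at n_samples points.
    dx, dy = end[0] - start[0], end[1] - start[1]
    length = math.hypot(dx, dy)
    dl = length / n_samples
    total = 0.0
    for k in range(n_samples):
        t = (k + 0.5) / n_samples
        total += field(start[0] + t * dx, start[1] + t * dy) * dl
    return total

# Sanity check with a constant field: the integral must equal
# field value times path length (here 2.0 * sqrt(2)).
kappa = line_integral(lambda x, y: 2.0, (0.0, 0.0), (1.0, 1.0))
print(abs(kappa - 2.0 * math.sqrt(2.0)) < 1e-9)  # True
```

A cell-by-cell scheme replaces the fixed sampling with the exact lengths of the ray segments inside each grid cell, which is what lets the code exploit the full AMR resolution.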
Informed-Proteomics: Open Source Software Package for Top-down Proteomics
Park, Jung Kap; Piehowski, Paul D.; Wilkins, Christopher S.; Zhou, Mowei; Mendoza, Joshua A.; Fujimoto, Grant M.; Gibbons, Bryson C.; Shaw, Jared B.; Shen, Yufeng; Shukla, Anil K.; Moore, Ronald J.; Liu, Tao; Petyuk, Vladislav A.; Tolic, Nikola; Pasa Tolic, Ljiljana; Smith, Richard D.; Payne, Samuel H.; Kim, Sangtae
2017-08-07
Top-down proteomics involves the analysis of intact proteins. This approach is very attractive as it allows for analyzing proteins in their endogenous form without proteolysis, preserving valuable information about post-translational modifications, isoforms, proteolytic processing, or their combinations, collectively called proteoforms. Moreover, the quality of top-down LC-MS/MS datasets is rapidly increasing due to advances in liquid chromatography and mass spectrometry instrumentation and sample processing protocols. However, top-down mass spectra are substantially more complex compared to the more conventional bottom-up data. To take full advantage of the increasing quality of top-down LC-MS/MS datasets there is an urgent need to develop algorithms and software tools for confident proteoform identification and quantification. In this study we present a new open source software suite for top-down proteomics analysis consisting of an LC-MS feature finding algorithm, a database search algorithm, and an interactive results viewer. The presented tool, along with several other popular tools, was evaluated using human-in-mouse xenograft luminal and basal breast tumor samples that are known to have significant differences in protein abundance based on bottom-up analysis.
Zoran Novaković
2008-04-01
The new national Occupational Safety and Health Law introduces a number of new obligations for employers, among which the activities related to drawing up the Risk Assessment Act for all workplaces stand out in importance and complexity. A project of developing a software package for conducting a risk assessment procedure in the workplace and in the work environment was initiated, based on needs generated by significant legislation changes in the domain of occupational safety and health. This should be a basis for further upgrading to the level of an integrated software solution in the domain of occupational safety and health.
Philipp Thomas
The accepted stochastic descriptions of biochemical dynamics under well-mixed conditions are given by the Chemical Master Equation and the Stochastic Simulation Algorithm, which are equivalent. The latter is a Monte-Carlo method, which, despite enjoying broad availability in a large number of existing software packages, is computationally expensive due to the huge amounts of ensemble averaging required for obtaining accurate statistical information. The former is a set of coupled differential-difference equations for the probability of the system being in any one of the possible mesoscopic states; these equations are typically computationally intractable because of the inherently large state space. Here we introduce the software package intrinsic Noise Analyzer (iNA), which allows for systematic analysis of stochastic biochemical kinetics by means of van Kampen's system size expansion of the Chemical Master Equation. iNA is platform independent and supports the popular SBML format natively. The present implementation is the first to adopt a complementary approach that combines state-of-the-art analysis tools using the computer algebra system Ginac with traditional methods of stochastic simulation. iNA integrates two approximation methods based on the system size expansion, the Linear Noise Approximation and effective mesoscopic rate equations, which to date have not been available to non-expert users, into an easy-to-use graphical user interface. In particular, the present methods allow for quick approximate analysis of time-dependent mean concentrations, variances, covariances and correlation coefficients, which typically outperforms stochastic simulations. These analytical tools are complemented by automated multi-core stochastic simulations with direct statistical evaluation and visualization. We showcase iNA's performance by using it to explore the stochastic properties of cooperative and non-cooperative enzyme kinetics and a gene network
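The moment equations that tools like iNA derive can be written down directly for the simplest case. A sketch for the linear birth-death process, where the linear-noise approximation is exact and the stationary variance equals the mean (the rates are illustrative):

```python
def lna_birth_death(k, g, t_end=50.0, dt=1e-3):
    """Moment equations for the birth-death process 0 -> X (rate k),
    X -> 0 (rate g*n): d<n>/dt = k - g<n>, dVar/dt = -2g Var + k + g<n>.
    For this linear system the linear-noise approximation is exact;
    both mean and variance relax to k/g (Poissonian stationary state)."""
    mean, var = 0.0, 0.0
    for _ in range(int(t_end / dt)):
        dmean = k - g * mean
        dvar = -2.0 * g * var + k + g * mean
        mean += dmean * dt          # forward-Euler integration
        var += dvar * dt
    return mean, var

m, v = lna_birth_death(k=10.0, g=1.0)
print(round(m, 3), round(v, 3))  # both approach k/g = 10
```

For nonlinear kinetics (e.g. the enzyme systems in the abstract) the system size expansion supplies the analogous, but approximate, mean and covariance ODEs that iNA constructs symbolically.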
Girsault, Arik; Lukes, Tomas; Sharipov, Azat; Geissbuehler, Stefan; Leutenegger, Marcel; Vandenberg, Wim; Dedecker, Peter; Hofkens, Johan; Lasser, Theo
2016-01-01
Super-resolution optical fluctuation imaging (SOFI) allows one to perform sub-diffraction fluorescence microscopy of living cells. By analyzing the acquired image sequence with an advanced correlation method, i.e. a high-order cross-cumulant analysis, super-resolution in all three spatial dimensions can be achieved. Here we introduce a software tool for a simple qualitative comparison of SOFI images under simulated conditions considering parameters of the microscope setup and essential properties of the biological sample. This tool incorporates SOFI and STORM algorithms, displays and describes the SOFI image processing steps in a tutorial-like fashion. Fast testing of various parameters simplifies the parameter optimization prior to experimental work. The performance of the simulation tool is demonstrated by comparing simulated results with experimentally acquired data.
Koch, S. E.; Kocin, P. J.; Desjardins, M.
1983-01-01
The analysis scheme and meteorological applications of the GEMPAK data analysis and display software system developed by NASA are described. The program was devised to permit objective, versatile, and practical analysis of satellite meteorological data using a minicomputer and a display system with graphics capability. A data area can be selected within the data file for the globe, and data-sparse regions can be avoided. Distances between observations and the nearest observation points are calculated in order to avoid errors when determining synoptic weather conditions. The Barnes (1973) successive correction method is employed to restore the amplitude of small yet resolvable wavelengths suppressed in an initial filtering pass. The rms deviation is then calculated in relation to available measured data. Examples are provided of treatment of VISSR data from the GOES satellite and a study of the impact of incorrect cloud height data on synoptic weather field analysis.
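The Barnes successive-correction scheme described above can be sketched as a first Gaussian-weighted pass followed by passes that re-analyze the observation residuals with a reduced length scale, restoring suppressed small wavelengths. Station coordinates, the length-scale kappa, and the convergence parameter gamma below are illustrative; GEMPAK's actual implementation differs in detail:

```python
import math

def barnes(grid_pts, obs_pts, obs_vals, kappa=2.0, gamma=0.3, passes=2):
    """Barnes (1973) successive-correction analysis of scattered observations.
    Pass 1 uses Gaussian weights exp(-r^2/kappa); later passes re-analyze the
    observation residuals with the reduced length scale gamma*kappa."""
    def analyze(points, vals, k):
        out = []
        for x, y in points:
            w = [math.exp(-((x - xo)**2 + (y - yo)**2) / k) for xo, yo in obs_pts]
            out.append(sum(wi * v for wi, v in zip(w, vals)) / sum(w))
        return out
    grid = analyze(grid_pts, obs_vals, kappa)   # first-pass field on the grid
    back = analyze(obs_pts, obs_vals, kappa)    # first-pass field at the stations
    for _ in range(passes - 1):
        resid = [v - b for v, b in zip(obs_vals, back)]
        grid = [g + c for g, c in zip(grid, analyze(grid_pts, resid, gamma * kappa))]
        back = [b + c for b, c in zip(back, analyze(obs_pts, resid, gamma * kappa))]
    return grid
```

Because the weights are normalized, a constant observation field is reproduced exactly, and the residual passes sharpen only the resolvable structure.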
Altermann, Eric; Lu, Jingli; McCulloch, Alan
2017-01-01
Expert-curated annotation remains one of the critical steps in achieving a reliable, biologically relevant annotation. Here we announce the release of GAMOLA2, a user-friendly and comprehensive software package to process, annotate and curate draft and complete bacterial, archaeal, and viral genomes. GAMOLA2 represents a wrapping tool that combines gene model determination and functional Blast, COG, Pfam, and TIGRfam analyses with structural predictions including detection of tRNAs, rRNA genes, non-coding RNAs, signal protein cleavage sites, transmembrane helices, CRISPR repeats and vector sequence contaminations. GAMOLA2 has already been validated in a wide range of bacterial and archaeal genomes, and its modular concept allows easy addition of further functionality in future releases. A modified and adapted version of the Artemis Genome Viewer (Sanger Institute) has been developed to leverage the additional features and underlying information provided by the GAMOLA2 analysis, and is part of the software distribution. In addition to genome annotations, GAMOLA2 features, among others, supplemental modules that assist in the creation of custom Blast databases, annotation transfers between genome versions, and the preparation of Genbank files for submission via the NCBI Sequin tool. GAMOLA2 is intended to be run under a Linux environment, whereas the subsequent visualization and manual curation in Artemis is mobile and platform independent. The development of GAMOLA2 is ongoing and community driven. New functionality can easily be added upon user requests, ensuring that GAMOLA2 provides information relevant to microbiologists. The software is available free of charge for academic use.
Pfleger, Christopher; Rathi, Prakash Chandra; Klein, Doris L; Radestock, Sebastian; Gohlke, Holger
2013-04-22
For deriving maximal advantage from information on biomacromolecular flexibility and rigidity, results from rigidity analyses must be linked to biologically relevant characteristics of a structure. Here, we describe the Python-based software package Constraint Network Analysis (CNA) developed for this task. CNA functions as a front- and backend to the graph-based rigidity analysis software FIRST. CNA goes beyond the mere identification of flexible and rigid regions in a biomacromolecule in that it (I) provides a refined modeling of thermal unfolding simulations that also considers the temperature-dependence of hydrophobic tethers, (II) allows performing rigidity analyses on ensembles of network topologies, either generated from structural ensembles or by using the concept of fuzzy noncovalent constraints, and (III) computes a set of global and local indices for quantifying biomacromolecular stability. This leads to more robust results from rigidity analyses and extends the application domain of rigidity analyses in that phase transition points ("melting points") and unfolding nuclei ("structural weak spots") are determined automatically. Furthermore, CNA robustly handles small-molecule ligands in general. Such advancements are important for applying rigidity analysis to data-driven protein engineering and for estimating the influence of ligand molecules on biomacromolecular stability. CNA maintains the efficiency of FIRST such that the analysis of a single protein structure takes a few seconds for systems of several hundred residues on a single core. These features make CNA an interesting tool for linking biomacromolecular structure, flexibility, (thermo-)stability, and function. CNA is available from http://cpclab.uni-duesseldorf.de/software for nonprofit organizations.
Software package "Nesvetay-3D" for modeling three-dimensional flows of monatomic rarefied gas
V. A. Titarev
2014-01-01
Analysis of three-dimensional rarefied gas flows in microdevices (micropipes, micropumps, etc.) and over re-entry vehicles requires the development of methods of computational modelling. One such method is the direct numerical solution of the Boltzmann kinetic equation for the velocity distribution function with either an exact or an approximate (model) collision integral. At present, for flows of monatomic rarefied gas the Shakhov model kinetic equation, also called the S-model, has gained wide-spread use. The equation can be regarded as a model equation of the incomplete third-order approximation. Despite its relative simplicity, the S-model is still a complicated integro-differential equation of high dimension. The numerical solution of such an equation requires high-accuracy parallel methods. The present work is a review of recent results concerning the development and application of the three-dimensional computer package Nesvetay-3D intended for modelling of rarefied gas flows. The package solves the Boltzmann kinetic equation with the BGK (Krook) and Shakhov model collision integrals using the discrete velocity approach. Calculations are carried out in non-dimensional variables. A finite integration domain and a mesh are introduced in the molecular velocity space. Next, the kinetic equation is re-written as a system of kinetic equations, one for each of the discrete velocities. The system is solved using an implicit finite-volume method of Godunov type. The steady-state solution is computed by a time-marching method. High order of spatial accuracy is achieved by using a piece-wise linear representation of the distribution function in each spatial cell. In general, the coefficients of such an approximation are found using the least-squares method. Arbitrary unstructured meshes in the physical space can be used in the calculations, which allows considering flows over objects of general geometrical shape. Conservative property of the method with respect to the model collision
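The BGK-type relaxation expressed by such model collision integrals can be sketched in a space-homogeneous, one-dimensional, dimensionless toy setting: the distribution f relaxes toward a local Maxwellian with the same density, velocity and temperature. This is only an illustration of the collision model, not the Nesvetay-3D implementation:

```python
import math

def bgk_relax(f, v, tau=1.0, dt=0.1, steps=50):
    """Space-homogeneous BGK relaxation df/dt = (f_eq - f)/tau on a 1D
    discrete velocity grid v, integrated with explicit Euler steps."""
    dv = v[1] - v[0]
    for _ in range(steps):
        # moments of the current distribution: density, bulk velocity, temperature
        rho = sum(f) * dv
        u = sum(fi * vi for fi, vi in zip(f, v)) * dv / rho
        T = sum(fi * (vi - u)**2 for fi, vi in zip(f, v)) * dv / rho
        # local Maxwellian with the same moments
        feq = [rho / math.sqrt(2 * math.pi * T) * math.exp(-(vi - u)**2 / (2 * T))
               for vi in v]
        f = [fi + dt * (fe - fi) / tau for fi, fe in zip(f, feq)]
    return f
```

On a sufficiently wide and fine velocity grid the discrete Maxwellian carries the same mass as f, so the relaxation conserves density to quadrature accuracy, which is the conservative property the abstract refers to.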
Optomechanical design software for segmented mirrors
Marrero, Juan
2016-08-01
The software package presented in this paper, still under development, was born to help analyze the influence of the many parameters involved in the design of a large segmented-mirror telescope. In summary, it is a set of tools which were added to a common framework as they were needed. Great emphasis has been placed on the graphical presentation, as scientific visualization nowadays cannot be conceived without the use of a helpful 3D environment showing the analyzed system as close to reality as possible. Use of third-party software packages is limited to ANSYS, which needs to be available in the system only if FEM results are required. Among the various functionalities of the software, the following are worth mentioning: automatic 3D model construction of a segmented mirror from a set of parameters, geometric ray tracing, automatic 3D model construction of a telescope structure around the defined mirrors from a set of parameters, assessment of human access to the segmented mirror, analysis of integration tolerances, assessment of segment collisions, structural deformation under gravity and thermal variation, mirror support system analysis including warping harness mechanisms, etc.
Sergii FIRSOV
2014-06-01
Unmanned aerial vehicles are used to carry out dangerous tasks; the search for and detection of injured persons in rough terrain is one of them. Vertical take-off unmanned aerial vehicles are therefore of special interest. A hardware and software package for solving this task is proposed in the article.
Mazurek, P. [Instytut Automatyki Systemow Energetycznych, Wroclaw (Poland)
1995-07-01
Application of the ITSM software package is presented together with the iterative time-series modelling methodology used. The results obtained for the National Power System 24-hour load forecast were calculated for 200 days with acceptable accuracy. The time required for input data analysis and a single forecast calculation was approximately 20 minutes on a standard IBM PC. (author). 5 refs., 4 figs., 2 tabs.
FullSWOF: A free software package for the simulation of shallow water flows
Delestre, Olivier; James, Francois; Lucas, Carine; Laguerre, Christian; Cordier, Stephane
2014-01-01
Numerical simulations of flows are required for numerous applications and are usually carried out using the shallow water equations. We describe the FullSWOF software, which is based on up-to-date finite volume methods and well-balanced schemes to solve these equations. It consists of a set of open-source C++ codes, freely available to the community, easy to use, and open for further development. Several features make FullSWOF particularly suitable for applications in hydrology: small water heights and wet-dry transitions are robustly handled, rainfall and infiltration are incorporated, and data from grid-based digital topographies can be used directly. A detailed mathematical description is given here, and the capabilities of FullSWOF are illustrated based on analytic solutions and datasets of real cases. The codes, available in 1D and 2D versions, have been validated on a large set of benchmark cases, which are available together with the download information and documentation at http://www.univ-orleans....
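For illustration, a first-order finite-volume step for the 1D shallow-water equations with the Lax-Friedrichs flux can be sketched as follows. FullSWOF itself uses more sophisticated well-balanced schemes with hydrostatic reconstruction, so this flat-bottom sketch is only a minimal reference, with illustrative names throughout:

```python
def swe_step(h, q, dx, dt, g=9.81):
    """One Lax-Friedrichs finite-volume step for the 1D shallow water system
    h_t + q_x = 0,  q_t + (q^2/h + g h^2/2)_x = 0  (flat bottom),
    with the two boundary cells held fixed."""
    def flux(hi, qi):
        return qi, qi * qi / hi + 0.5 * g * hi * hi
    n = len(h)
    hn, qn = h[:], q[:]
    for i in range(1, n - 1):
        fL = flux(h[i - 1], q[i - 1])
        fR = flux(h[i + 1], q[i + 1])
        # centered Lax-Friedrichs update (conservative in the interior)
        hn[i] = 0.5 * (h[i - 1] + h[i + 1]) - 0.5 * dt / dx * (fR[0] - fL[0])
        qn[i] = 0.5 * (q[i - 1] + q[i + 1]) - 0.5 * dt / dx * (fR[1] - fL[1])
    return hn, qn
```

A dam-break initial state is the classic smoke test: water height stays positive and, while the waves remain in the interior, total mass is conserved by the telescoping fluxes.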
CALIPSO: an interactive image analysis software package for desktop PACS workstations
Ratib, Osman M.; Huang, H. K.
1990-07-01
The purpose of this project is to develop a low-cost workstation for quantitative analysis of multimodality images using a Macintosh II personal computer. In the current configuration the Macintosh operates as a stand-alone workstation where images are imported either from a central PACS server through a standard Ethernet network or recorded through a video digitizer board. The CALIPSO software developed contains a large variety of basic image display and manipulation tools. We focused our effort, however, on the design and implementation of quantitative analysis methods that can be applied to images from different imaging modalities. Analysis modules currently implemented include: geometric and densitometric volume and ejection fraction calculation from radionuclide and cine-angiograms; Fourier analysis of cardiac wall motion; vascular stenosis measurement; color-coded parametric display of regional flow distribution from dynamic coronary angiograms; and automatic analysis of myocardial distribution of radiolabelled tracers from tomoscintigraphic images. Several of these analysis tools were selected because they use similar color-coded and parametric display methods to communicate quantitative data extracted from the images. 1. Rationale and objectives of the project: Developments of Picture Archiving and Communication Systems (PACS) in the clinical environment allow physicians and radiologists to assess radiographic images directly through imaging workstations. This convenient access to the images is often limited by the number of workstations available, due in part to their high cost. There is also an increasing need for quantitative analysis of the images. During the past decade
Comparing FDTD and Ray-Tracing Models in Numerical Simulation of HgCdTe LWIR Photodetectors
Vallone, Marco; Goano, Michele; Bertazzi, Francesco; Ghione, Giovanni; Schirmacher, Wilhelm; Hanna, Stefan; Figgemeier, Heinrich
2016-09-01
We present a simulation study of HgCdTe-based long-wavelength infrared detectors, focusing on methodological comparisons between the finite-difference time-domain (FDTD) and ray-tracing optical models. We performed three-dimensional simulations to determine the absorbed photon density distributions and the corresponding photocurrent and quantum efficiency spectra of isolated n-on-p uniform-composition pixels, systematically comparing the results obtained with FDTD and ray tracing. Since ray tracing is a classical optics approach, unable to describe interference effects, its applicability has been found to be strongly wavelength dependent, especially when reflections from metallic layers are relevant. Interesting cavity effects around the material cutoff wavelength are described, and the cases where ray tracing can be considered a viable approximation are discussed.
Spin tracking simulations in AGS based on ray-tracing methods - bare lattice, no snakes -
Meot, F.; Ahrens, L.; Gleen, J.; Huang, H.; Luccio, A.; MacKay, W. W.; Roser, T.; Tsoupas, N.
2009-09-01
This Note reports on the first simulations of spin dynamics in the AGS using the ray-tracing code Zgoubi. It includes lattice analysis, comparisons with MAD, DA tracking, numerical calculation of depolarizing resonance strengths and comparisons with analytical models, etc. It also includes details on the setting-up of Zgoubi input data files and on the various numerical methods of concern in, and available from, Zgoubi. Simulations of the crossing and neighboring of spin resonances in the AGS ring, bare lattice, without snakes, have been performed in order to assess the capabilities of Zgoubi in that matter, and are reported here. This yields a rather long document. The two main reasons for that are, on the one hand, the desire for an extended investigation of the energy span, and on the other hand, a thorough comparison of Zgoubi results with analytical models such as the 'thin lens' approximation, the weak resonance approximation, and the static case. Section 2 details the working hypotheses: AGS lattice data, formulae used for deriving various resonance-related quantities from the ray-tracing based 'numerical experiments', etc. Section 3 gives inventories of the intrinsic and imperfection resonances together with, in a number of cases, the strengths derived from the ray tracing. Section 4 gives the details of the numerical simulations of resonance crossing, including the behavior of various quantities (closed orbit, synchrotron motion, etc.) aimed at verifying that the conditions of particle and spin motion are correct. In a similar manner, Section 5 gives the details of the numerical simulations of spin motion in the static case: fixed energy in the neighborhood of the resonance. In Section 6, weak resonances are explored and Zgoubi results are compared with the Fresnel integrals model. Section 7 shows the computation of the $\vec{n}$ vector in the AGS lattice and tuning considered. Many details on the numerical conditions such as data files etc. are given in the
TIM, a ray-tracing program for METATOY research and its dissemination
Lambert, Dean; Hamilton, Alasdair C.; Constable, George; Snehanshu, Harsh; Talati, Sharvil; Courtial, Johannes
2012-03-01
TIM (The Interactive METATOY) is a ray-tracing program specifically tailored towards our research in METATOYs, which are optical components that appear to be able to create wave-optically forbidden light-ray fields. For this reason, TIM possesses features not found in other ray-tracing programs. TIM can either be used interactively or by modifying the openly available source code; in both cases, it can easily be run as an applet embedded in a web page. Here we describe the basic structure of TIM's source code and how to extend it, and we give examples of how we have used TIM in our own research. Program summary: Program title: TIM. Catalogue identifier: AEKY_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEKY_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GNU General Public License. No. of lines in distributed program, including test data, etc.: 124 478. No. of bytes in distributed program, including test data, etc.: 4 120 052. Distribution format: tar.gz. Programming language: Java. Computer: Any computer capable of running the Java Virtual Machine (JVM) 1.6. Operating system: Any; developed under Mac OS X Version 10.6. RAM: Typically 145 MB (interactive version running under Mac OS X Version 10.6). Classification: 14, 18. External routines: JAMA [1] (source code included). Nature of problem: Visualisation of scenes that include scene objects that create wave-optically forbidden light-ray fields. Solution method: Ray tracing. Unusual features: Specifically designed to visualise wave-optically forbidden light-ray fields; can visualise ray trajectories; can visualise geometric optic transformations; can create anaglyphs (for viewing with coloured "3D glasses") and random-dot autostereograms of the scene; integrable into web pages. Running time: Problem-dependent; typically seconds for a simple scene.
Andreev, Yu. G.; Lundström, T.; Sorokin, N. I.
1995-02-01
An updated version of the CPSR software package for powder pattern fitting and structure refinement offers major advantages over previous versions. An optional use of the new figure-of-merit function, that takes into account a systematic behaviour of residuals, allows users to reduce the effect of local correlations at the full-profile fitting stage, thus providing more reliable estimates for integrated intensities and their deviances. The structure refinement stage in such a case yields accurate values for estimated standard deviations of structural parameters since, in addition, model errors affecting calculated integrated intensities are taken into consideration. Furthermore, the new CPSR version is customized for a variety of constant-wavelength neutron and X-ray diffraction techniques and is equipped with an enhanced menu structure. Graphical on-screen-controlled support allows users to follow the progress of a fitting procedure over any region of a powder pattern. The program performance is illustrated using the neutron diffraction data file for PbSO4 distributed during the Rietveld refinement round robin, organized by the IUCr Commission on Powder Diffraction.
Marcos-Arenal, P; De Ridder, J; Huygen, R; Aerts, C
2014-01-01
The preparation of a space-mission that carries out any kind of imaging to detect high-precision low-amplitude variability of its targets requires a robust model for the expected performance of its instruments. This model cannot be derived from simple addition of noise properties due to the complex interaction between the various noise sources. While it is not feasible to build and test a prototype of the imaging device on-ground, realistic numerical simulations in the form of an end-to-end simulator can be used to model the noise propagation in the observations. These simulations not only allow studying the performance of the instrument, its noise source response and its data quality, but also the instrument design verification for different types of configurations, the observing strategy and the scientific feasibility of an observing proposal. In this way, a complete description and assessment of the objectives to expect from the mission can be derived. We present a high-precision simulation software packag...
Qin, R.
2016-06-01
Large-scale Digital Surface Models (DSM) are very useful for many geoscience and urban applications. Recently developed dense image matching methods have popularized the use of image-based very high resolution DSM. Many commercial/public tools that implement matching methods are available for perspective images, but handy tools for satellite stereo images are rare. In this paper, a software package, the RPC (rational polynomial coefficient) stereo processor (RSP), is introduced for this purpose. RSP implements a full pipeline of DSM and orthophoto generation based on RPC-modelled satellite imagery (level 1+), including level 2 rectification, geo-referencing, point cloud generation, pan-sharpening, DSM resampling and ortho-rectification. A modified hierarchical semi-global matching method is used as the current matching strategy. Due to its high memory efficiency and optimized implementation, RSP can be used on a normal PC to produce large-format DSM and orthophotos. This tool was developed for internal use, and may be acquired by researchers for academic and non-commercial purposes to promote 3D remote sensing applications.
Damyanova, M.; Sabchevski, S.; Zhelyazkov, I.; Vasileva, E.; Balabanova, E.; Dankov, P.; Malinov, P.
2016-03-01
Gyrotrons are the most powerful sources of coherent CW (continuous wave) radiation in the frequency range situated between the long-wavelength edge of the infrared light (far-infrared region) and the microwaves, i.e., in the region of the electromagnetic spectrum which is usually called the THz-gap (or T-gap), since the output power of other devices (e.g., solid-state oscillators) operating in this interval is by several orders of magnitude lower. In the recent years, the unique capabilities of the sub-THz and THz gyrotrons have opened the road to many novel and future prospective applications in various physical studies and advanced high-power terahertz technologies. In this paper, we present the current status and functionality of the problem-oriented software packages (most notably GYROSIM and GYREOSS) used for numerical studies, computer-aided design (CAD) and optimization of gyrotrons for diverse applications. They consist of a hierarchy of codes specialized to modelling and simulation of different subsystems of the gyrotrons (EOS, resonant cavity, etc.) and are based on adequate physical models, efficient numerical methods and algorithms.
Grunze Michael
2010-06-01
Background: Soft X-ray spectromicroscopy-based absorption near-edge structure analysis is a spectroscopic technique useful for investigating sample composition at a nanoscale of resolution. While the technique holds great promise for the analysis of biological samples, current methodologies are challenged by a lack of automatic analysis software, e.g. for the selection of regions of interest and statistical comparisons of sample variability. Results: We have implemented a set of functions and scripts in Python to provide semiautomatic treatment of data obtained using scanning transmission X-ray microscopy. The toolkit includes a novel line-by-line absorption conversion and data filtering that automatically identifies image components with significant absorption. Results are provided to the user by direct graphical output to the screen and by output images and data files, including the average and standard deviation of the X-ray absorption spectrum. Using isolated mouse melanosomes as a sample biological tissue, the application of STXMPy to the analysis of biological tissues is illustrated. Conclusion: The STXMPy package allows both interactive and automated batch processing of scanning transmission X-ray microscopy data. It is open source, cross-platform, and offers rapid script development using the interpreted Python language.
Infrasonic ray tracing applied to mesoscale atmospheric structures: refraction by hurricanes.
Bedard, Alfred J; Jones, R Michael
2013-11-01
A ray-tracing program is used to estimate the refraction of infrasound by the temperature structure of the atmosphere and by hurricanes represented by a Rankine-combined vortex wind plus a temperature perturbation. Refraction by the hurricane winds is significant, giving rise to regions of focusing, defocusing, and virtual sources. The refraction of infrasound by the temperature anomaly associated with a hurricane is small, probably no larger than that from uncertainties in the wind field. The results are pertinent to interpreting ocean wave generated infrasound in the vicinities of tropical cyclones.
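The Rankine combined vortex used above to represent the hurricane wind field is simple to state: solid-body rotation inside the radius of maximum wind and a 1/r decay outside. A sketch, with illustrative parameter values (a 50 m/s maximum wind at a 40 km radius):

```python
def rankine_wind(r, v_max=50.0, r_max=40.0e3):
    """Tangential wind speed (m/s) of a Rankine combined vortex at radius r (m):
    solid-body rotation for r <= r_max, decaying as 1/r outside."""
    if r <= r_max:
        return v_max * r / r_max
    return v_max * r_max / r
```

In a ray-tracing context this wind profile is added to the local sound speed along the propagation direction, which is what produces the focusing and defocusing regions described in the abstract.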
Invisibility cloaking via non-smooth transformation optics and ray tracing
Crosskey, Miles M., E-mail: mmc31@duke.ed [Mathematics Department, Duke University, Box 90320, Durham, NC 27708-0320 (United States); Nixon, Andrew T., E-mail: andrew_nixon@brown.ed [Division of Applied Mathematics, Brown University, 182 George Street, Providence, RI 02912 (United States); Schick, Leland M., E-mail: lschick@math.arizona.ed [Department of Mathematics, University of Arizona, 617 N. Santa Rita Ave., P.O. Box 210089, Tucson, AZ 85721-0089 (United States); Kovacic, Gregor, E-mail: kovacg@rpi.ed [Mathematical Sciences Department, Rensselaer Polytechnic Institute, 110 8th Street, Troy, NY 12180 (United States)
2011-05-02
We present examples of theoretically-predicted invisibility cloaks with shapes other than spheres and cylinders, including cones and ellipsoids, as well as shapes spliced from parts of these simpler shapes. In addition, we present an example explicitly displaying the non-uniqueness of invisibility cloaks of the same shape. We depict rays propagating through these example cloaks using ray tracing for geometric optics. Highlights: theoretically-predicted conical and ellipsoidal invisibility cloaks; non-smooth cloaks spliced from parts of simpler shapes; an example displaying non-uniqueness of invisibility cloaks of the same shape; rays propagating through example cloaks depicted using geometric optics.
Shi, Guangyuan; Li, Song; Huang, Ke; Li, Zile; Zheng, Guoxing
2016-10-01
We have developed a new numerical ray-tracing approach for computing the LIDAR signal power function, in which the light's round-trip propagation is analyzed by geometrical optics and a simple experiment is employed to acquire the laser intensity distribution. It is more accurate and flexible than previous methods. We discuss in detail the relationship between the inclination angle and the dynamic range of the detector output signal in a biaxial LIDAR system. Results indicate that an appropriate negative angle can compress the signal dynamic range. This technique has been successfully validated by comparison with real measurements.
A Fast Ray-Tracing Using Bounding Spheres and Frustum Rays for Dynamic Scene Rendering
Suzuki, Ken-Ichi; Kaeriyama, Yoshiyuki; Komatsu, Kazuhiko; Egawa, Ryusuke; Ohba, Nobuyuki; Kobayashi, Hiroaki
Ray tracing is one of the most popular techniques for generating photo-realistic images. Extensive research and development work has made interactive static scene rendering realistic. This paper deals with interactive dynamic scene rendering, in which not only the eye point but also the objects in the scene change their 3D locations every frame. In order to realize interactive dynamic scene rendering, RTRPS (Ray Tracing based on Ray Plane and Bounding Sphere), which utilizes the coherency in rays, objects, and grouped rays, is introduced. RTRPS uses bounding spheres as the spatial data structure, which utilizes the coherency in objects. By using bounding spheres, RTRPS can ignore the rotation of moving objects within a sphere and shorten the update time between frames. RTRPS utilizes the coherency in rays by merging rays into a ray-plane, assuming that the secondary rays and shadow rays are shot through an aligned grid. Since a pair of ray-planes shares an original ray, the intersection for the ray can be completed using the coherency in the ray-planes. Because of these three kinds of coherency, RTRPS can significantly reduce the number of intersection tests for ray tracing. Further acceleration techniques for ray-plane-sphere and ray-triangle intersection are also presented. A parallel projection technique converts a 3D vector inner product operation into a 2D operation and reduces the number of floating point operations. Techniques based on frustum culling and binary-tree structured ray-planes optimize the order of intersection tests between ray-planes and a sphere, resulting in 50% to 90% reduction of intersection tests. Two ray-triangle intersection techniques are also introduced, which are effective when a large number of rays are packed into a ray-plane. Our performance evaluations indicate that RTRPS achieves speed-ups of 13 to 392 times in comparison with a ray-tracing algorithm without organized rays and spheres. We found that RTRPS also provides competitive
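The bounding-sphere culling described above rests on the standard ray-sphere intersection test. A sketch of that primitive (function and parameter names are illustrative; the ray direction is assumed to be unit-length, as is typical in such kernels):

```python
import math

def ray_hits_sphere(origin, direction, center, radius):
    """Return the nearest non-negative ray parameter t at which the ray
    origin + t*direction (unit-length direction) meets the sphere, else None."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    d = direction
    b = 2 * (ox * d[0] + oy * d[1] + oz * d[2])
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None                        # ray misses the bounding sphere
    t = (-b - math.sqrt(disc)) / 2
    if t < 0:
        t = (-b + math.sqrt(disc)) / 2     # origin inside the sphere
    return t if t >= 0 else None
```

A fast early-out on `disc < 0` is exactly what makes sphere bounds attractive: most rays in a frame are rejected here before any per-triangle work is done.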
Tichý, Vladimír; Hudec, René; Němcová, Šárka
2016-06-01
The algorithm presented is intended mainly for lobster-eye optics. This type of optics (and some similar types) allows for a simplification of the classical ray-tracing procedure, which requires a great many rays to be simulated. The method presented performs the simulation of only a few rays; therefore it is extremely efficient. Moreover, to simplify the equations, a specific mathematical formalism is used. Only a few simple equations are used, therefore the program code can be simple as well. The paper also outlines how to apply the method to some other reflective optical systems.
GRay: A Massively Parallel GPU-based Code for Ray Tracing in Relativistic Spacetimes
Chan, Chi-kwan; Psaltis, Dimitrios; Özel, Feryal
2013-11-01
We introduce GRay, a massively parallel integrator designed to trace the trajectories of billions of photons in a curved spacetime. This graphics-processing-unit (GPU)-based integrator employs the stream processing paradigm, is implemented in CUDA C/C++, and runs on nVidia graphics cards. The peak performance of GRay using single-precision floating-point arithmetic on a single GPU exceeds 300 GFLOPS (or 1 ns per photon per time step). For a realistic problem, where the peak performance cannot be reached, GRay is two orders of magnitude faster than existing central-processing-unit-based ray-tracing codes. This performance enhancement allows more effective searches of large parameter spaces when comparing theoretical predictions of images, spectra, and light curves from the vicinities of compact objects to observations. GRay can also perform on-the-fly ray tracing within general relativistic magnetohydrodynamic algorithms that simulate accretion flows around compact objects. Making use of this algorithm, we calculate the properties of the shadows of Kerr black holes and the photon rings that surround them. We also provide accurate fitting formulae of their dependencies on black hole spin and observer inclination, which can be used to interpret upcoming observations of the black holes at the center of the Milky Way, as well as M87, with the Event Horizon Telescope.
Integrated ray tracing simulation of spectral bio-signatures from full 3D earth model
Ryu, Dongok; Seong, Sehyun; Lee, Jae-Min; Hong, Jinsuk; Jeong, Soomin; Jeong, Yukyeong; Kim, Sug-Whan
2009-08-01
Accurate identification and understanding of spectral bio-signatures from possible extraterrestrial planets have received ever-increasing attention from both the astronomy and space science communities in recent years. In pursuance of this subject, one of the most important scientific breakthroughs would be to obtain a detailed understanding of the spectral bio-signatures of the Earth, as it serves as a reference datum for accurate interpretation of collapsed (in temporal and spatial domains) information from spectral measurements using TPF instruments. We report a new Integrated Ray Tracing (IRT) model capable of computing various spectral bio-signatures as they are observed from the Earth's surface. The model includes the Sun, the full 3-D Earth, and an optical instrument, all combined into a single ray-tracing environment in real scale. In particular, the full 3-D Earth surface is constructed from high-resolution coastal line data and defined with realistic reflectance and BSDF characteristics depending on wavelength, vegetation types and their distributions. We first examined the model validity by confirming the imaging and radiometric performance of the AmonRa visible channel camera, simulating the Earth observation from the L1 halo orbit. We then computed disk-averaged spectra, light curves and NDVI indexes, leading to the construction of the observed disk-averaged spectra at the AmonRa instrument detector plane. The model, computational procedure and simulation results are presented. The future plan for detailed spectral signature simulation runs for various input conditions, including seasonal vegetation changes and variable cloud covers, is discussed.
N. H. Abd Rahman
2014-01-01
Reflector antennas have been widely used in many areas. In the implementation of parabolic reflector antennas for broadcasting satellite applications, it is essential for the spacecraft antenna to provide a precise contoured beam to effectively serve the required region. For this purpose, combinations of more than one beam are required. Therefore, a tool utilizing the ray tracing method is developed to calculate precise off-axis beams for a multibeam antenna system. In the multibeam system, each beam is fed from a different feed position to allow the main beam to be radiated in the exact direction on the coverage area. Thus, a detailed study of the caustics of a parabolic reflector antenna is performed and presented in this paper, to investigate the behaviour of the rays and their relation to various antenna parameters. In order to produce accurate data for the analysis, the caustic behaviours are investigated in two distinctive modes: the scanning plane and the transverse plane. This paper presents detailed discussions on the derivation of the ray tracing algorithms, the establishment of the equations of the caustic loci, and the verification of the method through calculation of the radiation pattern.
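The caustic analysis rests on the basic reflection geometry of the paraboloid. As a minimal illustration (not the paper's off-axis algorithm), rays arriving parallel to the axis of the surface z = (x^2 + y^2)/(4f) all reflect through the axis at the focal distance f:

```python
import math

def reflect(d, n):
    """Reflect direction d about unit normal n: d' = d - 2 (d.n) n."""
    dot = sum(a*b for a, b in zip(d, n))
    return tuple(a - 2.0*dot*b for a, b in zip(d, n))

def trace_axial_ray(x0, y0, f):
    """Trace a ray travelling in -z, parallel to the axis, onto the
    paraboloid z = (x^2 + y^2)/(4 f) and return the z at which the
    reflected ray crosses the optical axis."""
    z0 = (x0*x0 + y0*y0) / (4.0*f)            # hit point on the surface
    # normal of F = z - (x^2+y^2)/(4f): grad F = (-x/2f, -y/2f, 1)
    nx, ny, nz = -x0/(2.0*f), -y0/(2.0*f), 1.0
    norm = math.sqrt(nx*nx + ny*ny + nz*nz)
    n = (nx/norm, ny/norm, nz/norm)
    dx, dy, dz = reflect((0.0, 0.0, -1.0), n)
    t = -x0 / dx if dx != 0 else -y0 / dy     # parameter where x (or y) = 0
    return z0 + t*dz
```

For off-axis feeds this degeneracy breaks and the focal "point" spreads into the caustic loci the paper derives; the sketch above is the on-axis baseline against which that spreading is measured.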
Monte Carlo tolerancing tool using nonsequential ray tracing on a computer cluster
Reimer, Christopher
2010-08-01
The development of a flexible tolerancing tool for illumination systems based on Matlab® and Zemax® is described in this paper. Two computationally intensive techniques are combined: Monte Carlo tolerancing and non-sequential ray tracing. Implementation of the tool on a computer cluster allows for relatively rapid tolerancing. This paper explores the tool structure, describing how the task of tolerancing is split between Zemax and Matlab. An equation is derived that determines the number of simulated ray traces needed to accurately resolve illumination uniformity. Two examples of tolerancing illuminators are given. The first is a projection system consisting of a pico-DLP, a light pipe, a TIR prism and the critical illumination relay optics. The second is a wide-band, high-performance Köhler illuminator, which includes a modified molded LED as the light source. As high-performance illumination systems evolve, the practice of applying standard workshop tolerances to these systems may need to be re-examined.
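The abstract does not reproduce the paper's derived equation for the required number of rays, but the standard binomial-counting-statistics estimate conveys the idea: if a detector bin collects a fraction p of the traced rays, its Monte Carlo estimate has relative standard error sqrt((1-p)/(pN)), which can be inverted for N. A hedged sketch under that assumption:

```python
import math, random

def rays_needed(p_bin, rel_err):
    """Rays required so the Monte Carlo estimate of the flux fraction
    p_bin in one detector bin has relative standard error below
    rel_err (binomial statistics):
        sigma_rel = sqrt((1-p)/(p*N))  =>  N = (1-p)/(p*rel_err**2)."""
    return math.ceil((1.0 - p_bin) / (p_bin * rel_err**2))

def mc_check(p_bin, n_rays, seed=1):
    """Empirical relative error of the bin estimate for n_rays rays."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n_rays) if rng.random() < p_bin)
    return abs(hits / n_rays - p_bin) / p_bin
```

The quadratic growth of N with 1/rel_err is what makes cluster parallelization attractive: halving the uniformity noise floor quadruples the ray count per Monte Carlo perturbation.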
Hoop, Bartjan de [University Medical Center, Department of Radiology, Utrecht (Netherlands); University Medical Center, Heidelberglaan 100, GA, Utrecht (Netherlands); Gietema, Hester; Prokop, Mathias [University Medical Center, Department of Radiology, Utrecht (Netherlands); Ginneken, Bram van [University Medical Center, Image Sciences Institute, Utrecht (Netherlands); Zanen, Pieter [University Medical Center, Department of Pulmonology, Utrecht (Netherlands); Groenewegen, Gerard [University Medical Center, Department of Oncology, Utrecht (Netherlands)
2009-04-15
We compared interexamination variability of CT lung nodule volumetry with six currently available semi-automated software packages to determine the minimum change needed to detect the growth of solid lung nodules. We had ethics committee approval. To simulate a follow-up examination with zero growth, we performed two low-dose unenhanced CT scans in 20 patients referred for pulmonary metastases. Between examinations, patients got off and back on the table. Volumes of all pulmonary nodules were determined on both examinations using six nodule evaluation software packages. Variability (upper limit of the 95% confidence interval of the Bland-Altman plot) was calculated for nodules for which segmentation was visually rated as adequate. We evaluated 214 nodules (mean diameter 10.9 mm, range 3.3 mm-30.0 mm). Software packages provided adequate segmentation in 71% to 86% of nodules (p < 0.001). In case of adequate segmentation, variability in volumetry between scans ranged from 16.4% to 22.3% for the various software packages. Variability with five to six software packages was significantly less for nodules ≥8 mm in diameter (range 12.9%-17.1%) than for nodules <8 mm (range 18.5%-25.6%). Segmented volumes of each package were compared to each of the other packages. Systematic volume differences were detected in 11/15 comparisons. This hampers comparison of nodule volumes between software packages.
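The variability figure quoted above is the upper 95% limit of agreement from a Bland-Altman analysis of paired repeat measurements. A minimal sketch, assuming relative differences are expressed as a percentage of the pairwise mean (the exact normalization used in the paper is not stated in the abstract):

```python
import math

def bland_altman_upper_limit(vol_a, vol_b):
    """Upper 95% limit of agreement for relative volume differences
    between two repeat measurements: mean difference + 1.96 * SD.
    A measured growth below this limit cannot be distinguished from
    interexamination noise."""
    rel = [200.0 * (b - a) / (a + b) for a, b in zip(vol_a, vol_b)]  # % of mean
    n = len(rel)
    mean = sum(rel) / n
    sd = math.sqrt(sum((r - mean)**2 for r in rel) / (n - 1))
    return mean + 1.96 * sd
```

In the study's terms, a package with a limit of 22.3% would need a nodule to grow by more than 22.3% in volume before the change could be called real rather than remeasurement noise.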
Uwano, Ikuko; Kudo, Kohsuke; Sasaki, Makoto [Iwate Medical University, Advanced Medical Research Center, Morioka (Japan); Christensen, Soren [University of Melbourne, Royal Melbourne Hospital, Departments of Neurology and Radiology, Victoria (Australia); Oestergaard, Leif [Aarhus University Hospital, Department of Neuroradiology, Center for Functionally Integrative Neuroscience, DK, Aarhus C (Denmark); Ogasawara, Kuniaki; Ogawa, Akira [Iwate Medical University, Department of Neurosurgery, Morioka (Japan)
2012-05-15
Computed tomography perfusion (CTP) and magnetic resonance perfusion (MRP) are expected to be usable for ancillary tests of brain death by detection of complete absence of cerebral perfusion; however, the detection limit of hypoperfusion has not been determined. Hence, we examined whether commercial software can visualize very low cerebral blood flow (CBF) and cerebral blood volume (CBV) by creating and using digital phantoms. Digital phantoms simulating 0-4% of normal CBF (60 mL/100 g/min) and CBV (4 mL/100 g/min) were analyzed by ten software packages of CT and MRI manufacturers. Region-of-interest measurements were performed to determine whether there was a significant difference between areas of 0% and areas of 1-4% of normal flow. The CTP software detected hypoperfusion down to 2-3% in CBF and 2% in CBV, while the MRP software detected that of 1-3% in CBF and 1-4% in CBV, although the lower limits varied among software packages. CTP and MRP can detect the difference between profound hypoperfusion of <5% from that of 0% in digital phantoms, suggesting their potential efficacy for assessing brain death. (orig.)
Marr-Lyon, Lisa R; Gupchup, Gireesh V; Anderson, Joe R
2012-01-01
The Purdue Pharmacist Directive Guidance (PPDG) Scale was developed to assess patients' perceptions of the level of pharmacist-provided (1) instruction and (2) feedback and goal-setting, two aspects of pharmaceutical care. Calculations of its psychometric properties stemming from SPSS and R were similar, but distinct differences were apparent. Using the SPSS and R software packages, researchers aimed to examine the construct validity of the PPDG using a higher-order factoring procedure; in tandem, McDonald's omega and Cronbach's alpha were calculated as means of reliability analysis. Ninety-nine patients with either type I or type II diabetes, aged 18 years or older, able to read and write English, and who could provide written informed consent participated in the study. Data were collected in 8 community pharmacies in New Mexico. Using R, (1) a principal axis factor analysis with promax (oblique) rotation was conducted, (2) a Schmid-Leiman transformation was attained, and (3) McDonald's omega and Cronbach's alpha were computed. Using SPSS, subscale findings were validated by conducting a principal axis factor analysis with promax rotation; strict parallels and Cronbach's alpha reliabilities were calculated. McDonald's omega and Cronbach's alpha were robust, with coefficients greater than 0.90; principal axis factor analysis with promax rotation revealed construct similarities, with an overall general factor emerging from R. Further subjecting the PPDG to rigorous psychometric testing revealed stronger quantitative support for the overall general factor of directive guidance and the subscales of instruction and feedback and goal-setting.
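Of the two reliability coefficients reported, Cronbach's alpha is the simpler to compute directly from item scores (McDonald's omega additionally requires factor loadings from the factor analysis). A minimal sketch of the alpha calculation, with illustrative data:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns (one list per
    item, one entry per respondent):
        alpha = k/(k-1) * (1 - sum(item variances) / var(total score))."""
    k = len(items)
    n = len(items[0])
    def var(xs):                      # sample variance (ddof = 1)
        m = sum(xs) / len(xs)
        return sum((x - m)**2 for x in xs) / (len(xs) - 1)
    totals = [sum(col[i] for col in items) for i in range(n)]
    return (k / (k - 1)) * (1.0 - sum(var(c) for c in items) / var(totals))
```

Perfectly redundant items give alpha = 1; the coefficients above 0.90 reported for the PPDG indicate very high internal consistency of its subscales.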
Schumacher, F.; Friederich, W.
2015-12-01
We present the modularized software package ASKI, a flexible and extendable toolbox for seismic full waveform inversion (FWI) as well as sensitivity or resolution analysis operating on the sensitivity matrix. It utilizes established wave propagation codes for solving the forward problem and offers an alternative to the monolithic, inflexible and hard-to-modify codes that have typically been written for solving inverse problems. It is available under the GPL at www.rub.de/aski. The Gauss-Newton FWI method for 3D-heterogeneous elastic earth models is based on waveform sensitivity kernels and can be applied to inverse problems at various spatial scales in both Cartesian and spherical geometries. The kernels are derived in the frequency domain from Born scattering theory as the Fréchet derivatives of linearized full waveform data functionals, quantifying the influence of elastic earth model parameters on the particular waveform data values. As an important innovation, we keep two independent spatial descriptions of the earth model - one for solving the forward problem and one representing the inverted model updates. Thereby we account for the independent needs of spatial model resolution of the forward and inverse problems, respectively. Due to pre-integration of the kernels over the (in general much coarser) inversion grid, storage requirements for the sensitivity kernels are dramatically reduced. ASKI can be flexibly extended to other forward codes by providing it with specific interface routines that contain knowledge about forward code-specific file formats and auxiliary information provided by the new forward code. In order to sustain flexibility, the ASKI tools must communicate via file output/input, thus large storage capacities need to be accessible in a convenient way. Storing the complete sensitivity matrix to file, however, permits the scientist full manual control over each step in a customized procedure of sensitivity/resolution analysis and full
Reich, N.H.; van Sark, W.G.J.H.M.; Turkenburg, W.C.; Sinke, W.C.
2010-01-01
In this paper, we show that photovoltaic (PV) energy yields can be simulated using standard rendering and ray-tracing features of Computer Aided Design (CAD) software. To this end, three-dimensional (3-D) sceneries are ray-traced in CAD. The PV power output is then modeled by translating irradiance
Mammarella, Ivan; Peltola, Olli; Nordbo, Annika; Järvi, Leena; Rannik, Üllar
2016-10-01
We have carried out an inter-comparison between EddyUH and EddyPro®, two public software packages for post-field processing of eddy covariance data. Datasets including carbon dioxide, methane and water vapour fluxes measured over 2 months at a wetland in southern Finland and carbon dioxide and water vapour fluxes measured over 3 months at an urban site in Helsinki were processed and analysed. The purpose was to estimate the flux uncertainty due to the use of different software packages and to evaluate the most critical processing steps, determining the largest deviations in the calculated fluxes. Turbulent fluxes calculated with a reference combination of processing steps were in good agreement, the systematic difference between the two software packages being up to 2.0 and 6.7 % for half-hour and cumulative sum values, respectively. The raw data preparation and processing steps were consistent between the software packages, and most of the deviations in the estimated fluxes were due to the flux corrections. Among the different calculation procedures analysed, the spectral correction had the biggest impact for closed-path latent heat fluxes, reaching a nocturnal median value of 15 % at the wetland site. We found up to a 43 % median value of deviation (with respect to the run with all corrections included) if the closed-path carbon dioxide flux is calculated without the dilution correction, while the methane fluxes were up to 10 % lower without both dilution and spectroscopic corrections. The Webb-Pearman-Leuning (WPL) and spectroscopic corrections were the most critical steps for open-path systems. However, we found also large spectral correction factors for the open-path methane fluxes, due to the sensor separation effect.
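Before any of the corrections compared above are applied, both packages start from the same raw quantity: the eddy covariance flux, i.e. the covariance of vertical wind speed and scalar concentration over an averaging block. A minimal sketch of that common starting point (block averaging only, no detrending or corrections):

```python
def ec_flux(w, c):
    """Uncorrected eddy-covariance flux over one averaging block:
    F = mean(w'c'), the covariance of vertical wind w and scalar
    concentration c about their block means."""
    n = len(w)
    wbar = sum(w) / n
    cbar = sum(c) / n
    return sum((wi - wbar) * (ci - cbar) for wi, ci in zip(w, c)) / n
```

The inter-comparison's finding is essentially that this step is uncontroversial; the divergence between EddyUH and EddyPro arises in the subsequent corrections (spectral, dilution, spectroscopic, WPL) applied to this covariance.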
van Aardt, J. A.; van Leeuwen, M.; Kelbe, D.; Kampe, T.; Krause, K.
2015-12-01
Remote sensing is widely accepted as a useful technology for characterizing the Earth surface in an objective, reproducible, and economically feasible manner. To date, the calibration and validation of remote sensing data sets and biophysical parameter estimates remain challenging due to the requirements to sample large areas for ground-truth data collection, and restrictions to sample these data within narrow temporal windows centered around flight campaigns or satellite overpasses. The computer graphics community have taken significant steps to ameliorate some of these challenges by providing an ability to generate synthetic images based on geometrically and optically realistic representations of complex targets and imaging instruments. These synthetic data can be used for conceptual and diagnostic tests of instrumentation prior to sensor deployment or to examine linkages between biophysical characteristics of the Earth surface and at-sensor radiance. In the last two decades, the use of image generation techniques for remote sensing of the vegetated environment has evolved from the simulation of simple homogeneous, hypothetical vegetation canopies, to advanced scenes and renderings with a high degree of photo-realism. Reported virtual scenes comprise up to 100M surface facets; however, due to the tighter coupling between hardware and software development, the full potential of image generation techniques for forestry applications yet remains to be fully explored. In this presentation, we examine the potential computer graphics techniques have for the analysis of forest structure-function relationships and demonstrate techniques that provide for the modeling of extremely high-faceted virtual forest canopies, comprising billions of scene elements. We demonstrate the use of ray tracing simulations for the analysis of gap size distributions and characterization of foliage clumping within spatial footprints that allow for a tight matching between characteristics
Kim, Jee Hoon; Lee, Joon Woo; Ahn, Tae In; Shin, Jong Hwa; Park, Kyung Sub; Son, Jung Eek
2016-01-01
Canopy photosynthesis has typically been estimated using mathematical models with the following assumptions: the light interception inside the canopy declines exponentially with canopy depth, and the photosynthetic capacity is affected by light interception as a result of acclimation. However, in actual situations, light interception in the canopy is quite heterogeneous, depending on environmental factors such as the location, microclimate, leaf area index, and canopy architecture. It is important to apply these factors in an analysis. The objective of the current study is to estimate the canopy photosynthesis of paprika (Capsicum annuum L.) by simulating the intercepted irradiation of the canopy with 3D ray-tracing and the photosynthetic capacity of each layer. By inputting the structural data of an actual plant, the 3D architecture of paprika was reconstructed using graphic software (Houdini FX, Side Effects Software, Canada). The light curves and A/Ci curve of each layer were measured to parameterize the Farquhar, von Caemmerer, and Berry (FvCB) model. A difference in photosynthetic capacity within the canopy was observed. With the intercepted irradiation data and photosynthetic parameters of each layer, the photosynthesis rate of an entire plant was estimated by integrating the calculated photosynthesis rate at each layer. The estimated photosynthesis rate of an entire plant showed good agreement with measurements made on the plant in a closed chamber for validation. From the results, this method is considered a reliable tool to predict canopy photosynthesis from light interception, and can be extended to analyze canopy photosynthesis in actual greenhouse conditions.
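The FvCB model parameterized per layer takes the net assimilation as the minimum of a Rubisco-limited and an electron-transport-limited rate, minus day respiration. A minimal sketch, with typical textbook kinetic constants as defaults (illustrative assumptions, not the paprika parameter values fitted in the study):

```python
def fvcb_assimilation(ci, vcmax, j_rate, rd,
                      gamma_star=40.0, kc=404.9, ko=278.4, o2=210.0):
    """Net leaf assimilation (umol m-2 s-1) from the Farquhar-von
    Caemmerer-Berry model: the minimum of the Rubisco-limited rate Ac
    and the electron-transport-limited rate Aj, minus day respiration.
    ci: intercellular CO2; j_rate: realized electron transport rate J."""
    ac = vcmax * (ci - gamma_star) / (ci + kc * (1.0 + o2 / ko))
    aj = j_rate * (ci - gamma_star) / (4.0 * ci + 8.0 * gamma_star)
    return min(ac, aj) - rd
```

Whole-plant photosynthesis is then the sum of this quantity over layers, each evaluated with its own fitted Vcmax and J and the ray-traced irradiance driving J.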
Smith, Richard L., Ed.
1985-01-01
Reviews software packages by providing extensive descriptions and discussions of their strengths and weaknesses. Software reviewed include (1) "VISIFROG: Vertebrate Anatomy" (grade seven-adult); (2) "Fraction Bars Computer Program" (grades three to six) and (3) four telecommunications utilities. (JN)
Ray tracing simulation of aero-optical effect using multiple gradient index layer
Yang, Seul Ki; Seong, Sehyun; Ryu, Dongok; Kim, Sug-Whan; Kwon, Hyeuknam; Jin, Sang-Hun; Jeong, Ho; Kong, Hyun Bae; Lim, Jae Wan; Choi, Jong Hwa
2016-10-01
We present a new ray tracing simulation of aero-optical effects through anisotropic inhomogeneous media such as the supersonic flow field surrounding a projectile. The new method uses multiple gradient-index (GRIN) layers for construction of the anisotropic inhomogeneous media and for the ray tracing simulation. The cone-shaped projectile studied has a 19° semi-vertical angle; a sapphire window is parallel to the cone angle; and the optical system of the projectile was modeled via paraxial optics and an infrared image detector. The steady-state solutions from computational fluid dynamics (CFD) covered speeds of Mach 4 and 6, an altitude of 25 km, and 0° angle of attack (AoA). The gridded refractive index of the flow field, obtained from the CFD analysis via the Gladstone-Dale relation, was discretized into equally spaced layers parallel to the projectile's window. Each layer was modeled as a 2D polynomial by fitting the refractive index distribution. The light source generated a set of 3,228 rays for lines of sight (LOS) varying from 10° to 40°. The ray tracing simulation adopted Snell's law in 3D to compute the paths of skew rays through the GRIN layers. The results show that the optical path difference (OPD) and boresight error (BSE) decrease exponentially as the LOS increases, and the variation of the refractive index decreases as the speed of the flow field increases; the OPD and its rate of decay take somewhat larger values at Mach 6 than at Mach 4. Compared with the ray equation method at Mach 4 and 10° LOS, the new method shows good agreement, with a relative root-mean-square (RMS) OPD difference of 0.33% and a relative BSE difference of 0.22%. Moreover, the simulation with the new method was more than 20,000 times faster than the conventional ray equation method. The technical details of the new method and simulation are presented with results and implications.
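Two ingredients of the method are compact enough to sketch: the Gladstone-Dale relation converting flow density to refractive index, and layer-by-layer application of Snell's law through the GRIN stack. A simplified plane-parallel version (the paper traces skew rays through 2D-polynomial layers; constants here are typical values for air, not the paper's):

```python
import math

def gladstone_dale_n(rho, k_gd=2.27e-4):
    """Refractive index of air from density via the Gladstone-Dale
    relation n = 1 + K*rho (K in m^3/kg, rho in kg/m^3)."""
    return 1.0 + k_gd * rho

def snell_through_layers(theta_in, n_layers):
    """Propagate a ray through a stack of plane-parallel layers using
    Snell's law; returns the propagation angle from the layer normal
    inside each layer. n_layers[0] is the incidence medium."""
    angles = [theta_in]
    for n_prev, n_next in zip(n_layers, n_layers[1:]):
        angles.append(math.asin(n_prev * math.sin(angles[-1]) / n_next))
    return angles
```

For plane-parallel layers the invariant n*sin(theta) is conserved across the whole stack regardless of the intermediate profile; the OPD arises from the different optical path lengths accumulated inside each layer, and the skew-ray generalization is where the paper's 3D Snell treatment comes in.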
A Highly-Accurate and Fast Ray Tracing System for HF and UHF Simulations
Jones, J. C.; Richards, G. P.
2016-12-01
Accurate and fast ray tracing is critical for radiowave propagation tools and applications. A ray tracer needs to be accurate to reduce accumulated errors which come from the myriad of models (ionospheric electron density, magnetic field, ion density, neutral molecule density, absorption, land surface, ocean surface, and potentially others) required for accurate simulation. A ray tracer must also be fast to make applications practical. Here we introduce NINJART (NINJART Is Not Just Another Ray Tracer), a highly accurate and fast ray tracing system. NINJART consists of an embarrassingly parallel algorithm rigorously solving the 3-D Haselgrove equations with a Runge-Kutta adaptive-step quadrature rule to accurately trace high frequency to ultra-high frequency radiowaves. It is capable of a wide range of propagation modes, from multiple ground hops to vertical and near-vertical incidence rays, chordal modes, and other esoteric paths. It is capable of using a variety of ionospheric models, including operational data-assimilative or empirical models, depending on the needs of the user. It can forward and backward ray trace, calculate time of flight, find the focus factor for signals near the skip zone, and calculate the angle of arrival from a known transmitter to a known receiver location. Additionally, NINJART uses magnetic field data from various models, including the International Geomagnetic Reference Field, to reduce the inaccuracies introduced by the simple dipole model commonly used by other ray tracers in calculating the effects of magneto-ionic splitting, thereby allowing accurate traces of both the ordinary and extraordinary mode rays. The NINJART algorithm is a heterogeneous system utilizing the CUDA programming language to take advantage of the computing power of graphical processing units. This allows tracing of thousands of rays concurrently. NINJART achieves additional processing savings, without sacrificing accuracy, by use of an adaptive
Modeling of 3D In—Building Propagation by Ray Tracing Technique
Gong, Ke; Xu, Rui
1995-01-01
The modeling of in-building propagation is of great importance for the planning of indoor wireless networks. To model the transmission system comprising a transmitter, a receiver and different kinds of obstacles, the ray tracing technique is used: the transmitter is taken as a source launching radio rays in different directions, some of which reach the receiver through different paths with different path losses and delays; adding them together gives the field strength at the receiving point. Based on this model, computer simulation is carried out to predict the propagation loss and delay spread, and it is shown that the simulation agrees well with experiments.
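The final step described, adding the traced paths together at the receiving point, is a coherent phasor sum: each path contributes an amplitude attenuated with distance and a phase set by its path length. A minimal sketch (free-space 1/d amplitude decay assumed; real indoor paths would also carry reflection and transmission coefficients):

```python
import cmath, math

def field_at_receiver(path_lengths, amplitudes, freq_hz=2.4e9, c=3.0e8):
    """Coherent sum of ray contributions at the receiving point: each
    traced path adds amplitude/distance with phase exp(-j k d),
    k = 2*pi*f/c. Returns the complex field."""
    k = 2.0 * math.pi * freq_hz / c
    return sum(a / d * cmath.exp(-1j * k * d)
               for a, d in zip(amplitudes, path_lengths))
```

Because the paths add with phase, two equal-length paths double the field (a 6 dB gain) while slightly offset lengths can cancel, which is exactly the fast-fading structure such indoor prediction models aim to capture; the spread of the path delays d/c gives the delay spread.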
Yang, Yufei; Yan, Changxiang
2016-02-20
The polarization properties of a two-axis periscopic optical scanner constituted by a pair of rotating planar mirrors have been studied by using the three-dimensional polarization ray-tracing matrix method. The separate and cumulative matrices that define the transformation of the polarization state are obtained and expressed in terms of the rotation angles of two mirrors. The variations of diattenuation and retardance are investigated and graphically shown as functions of the rotation angles. On this basis, a further investigation about the cumulative polarization aberrations of three different metal-coated periscopic scanners is accomplished. Finally, the output polarization states of the three metal-coated scanners are calculated with the input beam of the arbitrary polarization states, and the results show that aluminum film is more appropriate than gold film or silver film for the polarization-maintaining periscopic scanner.
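The diattenuation investigated above measures how unequally a polarization ray-tracing matrix transmits its two extreme polarization states; it follows from the singular values of the matrix. A minimal sketch for a real 2x2 matrix, computing the singular values analytically from A^T A (the paper works with the full 3D polarization ray-tracing matrices, which this simplifies):

```python
import math

def diattenuation_2x2(a, b, c, d):
    """Diattenuation of a real 2x2 polarization matrix [[a, b], [c, d]]:
    D = (s1^2 - s2^2)/(s1^2 + s2^2), where s1 >= s2 are the singular
    values, i.e. the maximum and minimum amplitude transmittances."""
    tr = a*a + b*b + c*c + d*d        # trace of A^T A  = s1^2 + s2^2
    det = (a*d - b*c)**2              # det   of A^T A  = s1^2 * s2^2
    return math.sqrt(max(tr*tr - 4.0*det, 0.0)) / tr
```

An ideal mirror, diag(1, -1), has zero diattenuation, while a metal coating that transmits the s and p amplitudes unequally, e.g. diag(1, 0.5), has D = 0.6; cascading the per-mirror matrices over the two rotation angles gives the cumulative behavior studied in the paper.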
Stress optical path difference analysis of off-axis lens ray trace footprint
Hsu, Ming-Ying; Chan, Chia-Yen; Lin, Wei-Cheng; Wu, Kun-Huan; Chen, Chih-Wen; Chan, Shenq-Tsong; Huang, Ting-Ming
2013-06-01
Mechanical and thermal stress on a lens changes the refractive index of the glass, making the index differ for light polarized parallel and perpendicular to the direction of stress. These refractive index changes introduce an Optical Path Difference (OPD). This study applies the Finite Element Method (FEM) and optical ray tracing to calculate the stress-induced OPD of off-axis rays. The stress distribution of the optical system is calculated by finite element simulation, and the stress coordinates are rotated to the optical path direction. The stress is then weighted along each optical ray path and the OPD summed over the path. The Z-direction stress OPD can be fitted by Zernike polynomials and separated into sag difference and rigid-body motion. The fitting results can be used to evaluate the effect of stress on the optical component.
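The summation along the ray path reduces to integrating the stress-induced index change dn = C*sigma over the path, slice by slice. A minimal sketch, with a typical stress-optic coefficient for glass as a placeholder (the FEM coordinate rotation and Zernike fitting of the full analysis are omitted):

```python
def stress_opd(stresses_pa, dz_m, c_stress=2.0e-12):
    """OPD (in metres) accumulated along a ray through a stressed glass
    element: the stress-induced index change dn = C*sigma in each slice
    times the slice path length. c_stress: stress-optic coefficient in
    1/Pa; stresses_pa: resolved stress in each slice along the ray."""
    return sum(c_stress * s * dz_m for s in stresses_pa)
```

For example, a uniform 1 MPa stress over a 10 mm path yields an OPD of 20 nm; repeating this sum for every traced ray in the footprint produces the OPD map that is then fitted with Zernike polynomials.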
Ray-tracing analysis of crosstalk in multi-core polymer optical fibers.
Berganza, Amaia; Aldabaldetreku, Gotzon; Zubia, Joseba; Durana, Gaizka
2010-10-11
The aim of this paper is to present a new ray-tracing model which describes the propagation of light in multi-core polymer optical fibers (MCPOFs), taking into account the crosstalk among their cores. The new model overcomes many of the limitations of previous approaches allowing us to simulate MCPOFs of arbitrary designs. Additionally, it provides us with the output ray distribution at the end of the fiber, making it possible to calculate useful parameters related to the fiber performance such as the Near-Field Pattern, the Far-Field Pattern or the bandwidth. We also present experimental measurements in order to validate the computational model and we analyze the importance of crosstalk in different MCPOF configurations.
Simulation of radiation damping in rings, using stepwise ray-tracing methods
Méot, F.
2015-06-01
The ray-tracing code Zgoubi computes particle trajectories in arbitrary magnetic and/or electric field maps or analytical field models. It includes a built-in fitting procedure, spin tracking, many Monte Carlo processes. The accuracy of the integration method makes it an efficient tool for multi-turn tracking in periodic machines. Energy loss by synchrotron radiation, based on Monte Carlo techniques, had been introduced in Zgoubi in the early 2000s for studies regarding the linear collider beam delivery system. However, only recently has this Monte Carlo tool been used for systematic beam dynamics and spin diffusion studies in rings, including the eRHIC electron-ion collider project at the Brookhaven National Laboratory. Some beam dynamics aspects of this recent use of Zgoubi capabilities, including considerations of accuracy as well as further benchmarking in the presence of synchrotron radiation in rings, are reported here.
Microcellular propagation prediction model based on an improved ray tracing algorithm.
Liu, Z-Y; Guo, L-X; Fan, T-Q
2013-11-01
Two-dimensional (2D)/two-and-one-half-dimensional ray tracing (RT) algorithms for the use of the uniform theory of diffraction and geometrical optics are widely used for channel prediction in urban microcellular environments because of their high efficiency and reliable prediction accuracy. In this study, an improved RT algorithm based on the "orientation face set" concept and on the improved 2D polar sweep algorithm is proposed. The goal is to accelerate point-to-point prediction, thereby making RT prediction attractive and convenient. In addition, the use of threshold control of each ray path and the handling of visible grid points for reflection and diffraction sources are adopted, resulting in an improved efficiency of coverage prediction over large areas. Measured results and computed predictions are also compared for urban scenarios. The results indicate that the proposed prediction model works well and is a useful tool for microcellular communication applications.
Heat-Flux Analysis of Solar Furnace Using the Monte Carlo Ray-Tracing Method
Lee, Hyun Jin; Kim, Jong Kyu; Lee, Sang Nam; Kang, Yong Heack [Korea Institute of Energy Research, Daejeon (Korea, Republic of)
2011-10-15
An understanding of the concentrated solar flux is critical for the analysis and design of solar-energy-utilization systems. The current work focuses on the development of an algorithm that uses the Monte Carlo ray-tracing method with excellent flexibility and expandability; this method considers both solar limb darkening and the surface slope error of reflectors, thereby analyzing the solar flux. A comparison of the modeling results with measurements at the solar furnace in Korea Institute of Energy Research (KIER) show good agreement within a measurement uncertainty of 10%. The model evaluates the concentration performance of the KIER solar furnace with a tracking accuracy of 2 mrad and a maximum attainable concentration ratio of 4400 sun. Flux variations according to measurement position and flux distributions depending on acceptance angles provide detailed information for the design of chemical reactors or secondary concentrators.
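One of the two physical effects the algorithm accounts for, solar limb darkening, amounts to weighting the ray origins on the solar disk by a darkening law rather than sampling the disk uniformly. A minimal sketch using a linear limb-darkening law with an assumed coefficient (the law and coefficient are illustrative, not necessarily those used in the KIER model):

```python
import math, random

def sample_sun_disk(n, u=0.9, seed=7):
    """Sample radial positions on a unit-radius solar disk with a
    linear limb-darkening law I(mu) = 1 - u*(1 - mu), where mu is the
    cosine of the heliocentric angle, via rejection sampling."""
    rng = random.Random(seed)
    out = []
    while len(out) < n:
        r = math.sqrt(rng.random())                # uniform over the disk
        mu = math.sqrt(1.0 - r*r)
        if rng.random() < 1.0 - u * (1.0 - mu):    # accept with relative I
            out.append(r)
    return out
```

Compared with uniform disk sampling (mean radius 2/3), the accepted rays crowd toward the disk centre, which narrows the effective angular source size and sharpens the simulated flux peak at the focus; mirror slope error is then added as a random perturbation of each reflected ray direction.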
Ray-tracing for coordinate knowledge in the JWST Integrated Science Instrument Module
Sabatke, Derek; Rohrbach, Scott; Kubalak, David
2014-01-01
Optical alignment and testing of the Integrated Science Instrument Module of the James Webb Space Telescope is underway. We describe the Optical Telescope Element Simulator used to feed the science instruments with point images of precisely known location and chief ray pointing, at appropriate wavelengths and flux levels, in vacuum and at operating temperature. The simulator's capabilities include a number of devices for in situ monitoring of source flux, wavefront error, pupil illumination, image position and chief ray angle. Taken together, these functions become a fascinating example of how the first order properties and constructs of an optical design (coordinate systems, image surface and pupil location) acquire measurable meaning in a real system. We illustrate these functions with experimental data, and describe the ray tracing system used to provide both pointing control during operation and analysis support subsequently. Prescription management takes the form of optimization and fitting. Our core too...
Betremieux, Yan
2015-01-01
Atmospheric refraction affects to various degrees exoplanet transit, lunar eclipse, and stellar occultation observations. Exoplanet retrieval algorithms often use analytical expressions for the column abundance along a ray traversing the atmosphere as well as for the deflection of that ray, which are first-order approximations valid for low densities in a spherically symmetric homogeneous isothermal atmosphere. We derive new analytical formulae for both of these quantities, which are valid for higher densities, and use them to refine and validate a new ray tracing algorithm which can be used for arbitrary atmospheric temperature-pressure profiles. We illustrate with simple isothermal atmospheric profiles the consequences of our model for different planets: temperate Earth-like and Jovian-like planets, as well as HD189733b and GJ1214b. We find that, for both hot exoplanets, our treatment of refraction does not make much of a difference at pressures as high as 10 atmospheres, but that it is important to ...
Ray tracing optical analysis of offset solar collector for Space Station solar dynamic system
Jefferies, Kent S.
1988-01-01
OFFSET, a detailed ray tracing computer code, was developed at NASA Lewis Research Center to model the offset solar collector for the Space Station solar dynamic electric power system. This model traces rays from 50 points on the face of the sun to 10 points on each of the 456 collector facets. The triangular facets are modeled with spherical, parabolic, or toroidal reflective surface contour and surface slope errors. The rays are then traced through the receiver aperture to the walls of the receiver. Images of the collector and of the sun within the receiver produced by this code provide insight into the collector receiver interface. Flux distribution on the receiver walls, plotted by this code, is improved by a combination of changes to aperture location and receiver tilt angle. Power loss by spillage at the receiver aperture is computed and is considerably reduced by using toroidal facets.
Photorealistic ray tracing of free-space invisibility cloaks made of uniaxial dielectrics
Halimeh, Jad C
2012-01-01
The design rules of transformation optics generally lead to spatially inhomogeneous and anisotropic impedance-matched magneto-dielectric material distributions for, e.g., free-space invisibility cloaks. Recently, simplified anisotropic non-magnetic free-space cloaks made of a locally uniaxial dielectric material (calcite) have been realized experimentally. In a two-dimensional setting and for in-plane polarized light propagating in this plane, the cloaking performance can still be perfect for light rays. However, for general views in three dimensions, various imperfections are expected. In this paper, we study two different purely dielectric uniaxial cylindrical free-space cloaks. For one, the optic axis is along the radial direction, for the other one it is along the azimuthal direction. The azimuthal uniaxial cloak has not been suggested previously to the best of our knowledge. We visualize the cloaking performance of both by calculating photorealistic images rendered by ray tracing. Following and complemen...
Enzo+Moray: Radiation Hydrodynamics Adaptive Mesh Refinement Simulations with Adaptive Ray Tracing
Wise, John H
2010-01-01
We describe a photon-conserving radiative transfer algorithm, using a spatially-adaptive ray tracing scheme, and its parallel implementation into the adaptive mesh refinement (AMR) cosmological hydrodynamics code, Enzo. By coupling the solver with the energy equation and non-equilibrium chemistry network, our radiation hydrodynamics framework can be utilised to study a broad range of astrophysical problems, such as stellar and black hole (BH) feedback. Inaccuracies can arise from large timesteps and poor sampling; therefore, we devised an adaptive time-stepping scheme and a fast approximation of the optically-thin radiation field with multiple sources. We test the method with several radiative transfer and radiation hydrodynamics tests that are given in Iliev et al. (2006, 2009). We further test our method with more dynamical situations, for example, the propagation of an ionisation front through a Rayleigh-Taylor instability, time-varying luminosities, and collimated radiation. The test suite also includes an...
Construction of Virtual Turning Scene Based on Local Ray Tracing Algorithm
王国锋; 王子良; 王太勇
2003-01-01
Based on an analysis of the Phong and other local illumination models, a simplified Whitted lighting model is proposed to suit the features of turning simulation. Moreover, in order to obtain natural lighting effects, a local ray tracing algorithm is given to calculate the light intensity at every position during the course of the simulation. The method computes the refresh area before calculating the intersection line, simulates the machining environment accurately, and reduces computation time. Finally, an example of a virtual cutting scene demonstrates the effects of the global illumination model: on a 1.3 GHz CPU with 128 MB of memory, the refresh time of the virtual turning scene is reduced by a factor of nine. This study enriches virtual manufacturing theory and promotes the development of advanced manufacturing technology.
Ray trace algorithm description for the study of pump power absorption in double clad fibers
Narro, R.; Rodriguez, E.; Ponce, L.; de Posada, E.; Flores, T.; Arronte, M.
2011-09-01
An algorithm for the analysis of double clad fiber designs is presented. The algorithm, developed in the MATLAB computing language, is based on a ray tracing method applied to three-dimensional figures composed of a set of planes. The algorithm can evaluate thousands of ray paths in sequence and the corresponding pump absorption in each element of the fiber according to the Lambert-Beer law. The beam path is evaluated in three dimensions, considering the losses by reflection and refraction at the faces and within the fiber. Due to its flexibility, the algorithm can be used to study ray propagation in single-mode or multimode fibers, bending effects in fibers, and variable geometries of the inner clad and the core, and could also be used to study tapers.
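The absorption bookkeeping described in the abstract reduces to Beer-Lambert attenuation accumulated over the segments of each ray path. A minimal Python sketch (absorption coefficients and segment lengths are illustrative, not the paper's values):

```python
import numpy as np

def pump_absorption(segment_lengths, in_core, alpha_core=250.0, alpha_clad=0.0):
    """Absorbed fraction of pump power along one ray path (Beer-Lambert).

    segment_lengths : per-segment path lengths in metres
    in_core         : boolean flags, True where the segment crosses the doped core
    alpha_*         : absorption coefficients in 1/m (illustrative values)
    """
    alphas = np.where(in_core, alpha_core, alpha_clad)
    transmitted = np.exp(-np.sum(alphas * segment_lengths))
    return 1.0 - transmitted

# a ray whose path crosses the doped core twice between inner-clad reflections
frac = pump_absorption(np.array([1e-3, 5e-5, 2e-3, 5e-5]),
                       np.array([False, True, False, True]))
print(f"absorbed fraction: {frac:.4f}")
```

A full simulation would generate `segment_lengths` and `in_core` from the geometric intersection of each ray with the planar facets of the fiber model, then sum the absorbed power over thousands of rays.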
Skew ray tracing in a step-index optical fiber using Geometric Algebra
Ang, Angeleene; McNamara, Daniel J
2015-01-01
We used Geometric Algebra to compute the paths of skew rays in a cylindrical, step-index multimode optical fiber. To do this, we used the vector addition form for the law of propagation, the exponential of an imaginary vector form for the law of refraction, and the juxtaposed vector product form for the law of reflection. In particular, the exponential forms of the vector rotations enable us to take advantage of the addition or subtraction of exponential arguments of two rotated vectors in the derivation of the ray tracing invariants in cylindrical and spherical coordinates. We showed that the light rays inside the optical fiber trace a polygonal helical path characterized by three invariants that relate successive reflections inside the fiber: the ray path distance, the difference in axial distances, and the difference in the azimuthal angles. We also rederived the known generalized formula for the numerical aperture for skew rays, which simplifies to the standard form for meridional rays.
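The invariants described above are easy to observe numerically. The Python sketch below uses plain vector algebra rather than Geometric Algebra, with an illustrative core radius and launch ray: it bounces a skew ray inside a cylinder and shows constant azimuthal and axial steps between reflections, i.e. the polygonal helix.

```python
import numpy as np

R = 25e-6   # core radius of a step-index multimode fiber (illustrative)

def next_reflection(p, d):
    """Advance a skew ray inside a cylinder of radius R to its next wall hit
    and reflect it there (total internal reflection assumed)."""
    a = d[0]**2 + d[1]**2
    b = 2.0 * (p[0]*d[0] + p[1]*d[1])
    c = p[0]**2 + p[1]**2 - R**2
    t = (-b + np.sqrt(b*b - 4*a*c)) / (2*a)      # forward intersection root
    hit = p + t*d
    n = np.array([hit[0], hit[1], 0.0]) / R       # outward surface normal
    d_new = d - 2.0 * np.dot(d, n) * n            # law of reflection
    return hit, d_new

# launch a skew (non-meridional) ray from inside the core
p = np.array([10e-6, 0.0, 0.0])
d = np.array([0.3, 0.4, 0.866]); d /= np.linalg.norm(d)

hits = []
for _ in range(6):
    p, d = next_reflection(p, d)
    hits.append(p)

phis = np.unwrap([np.arctan2(h[1], h[0]) for h in hits])
dphi = np.diff(phis)                 # azimuthal step between bounces
dz = np.diff([h[2] for h in hits])   # axial step between bounces
print(dphi, dz)                      # both constant: a polygonal helical path
```

Because reflection preserves the ray's perpendicular distance from the fiber axis, every wall-to-wall chord subtends the same azimuthal angle and advances the same axial distance, which is exactly the invariance the paper derives in closed form.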
ACCELERATION RENDERING METHOD ON RAY TRACING WITH ANGLE COMPARISON AND DISTANCE COMPARISON
Liliana liliana
2007-01-01
In computer graphics applications, ray tracing is often used to produce realistic images. Ray tracing models not only local illumination but also global illumination: local illumination accounts only for ambient, diffuse, and specular effects from the light source(s), whereas global illumination also accounts for mirroring and transparency, i.e., light arriving from other objects. The objects usually modeled are primitive objects and mesh objects. The advantage of mesh modeling is its varied, interesting, and realistic shapes. A mesh contains many primitive objects, usually triangles or (more rarely) quadrilaterals. A problem in mesh object modeling is long rendering time, because every ray must be checked against a large number of mesh triangles; rays spawned by other objects further increase the number of rays traced, and hence the rendering time. To solve this problem, this research develops new methods to speed up the rendering of mesh objects: angle comparison and distance comparison. These methods reduce the number of ray checks: rays predicted not to intersect the mesh are not tested against it. With angle comparison, a small comparison angle makes rendering fast; however, if individual triangles are large, some triangles will be corrupted. A larger comparison angle avoids mesh corruption but makes rendering slower than without comparison. With distance comparison, the rendering time is less than without comparison, and no triangles are corrupted.
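The angle-comparison cull can be sketched in a few lines. The following Python sketch is a simplified reconstruction, not the paper's exact criterion or data structures: it rejects triangles whose centroid direction lies more than a threshold angle off the ray before running a full Möller-Trumbore intersection test.

```python
import numpy as np

def moller_trumbore(orig, d, tri):
    """Standard ray-triangle intersection test (Moller-Trumbore)."""
    v0, v1, v2 = tri
    e1, e2 = v1 - v0, v2 - v0
    pvec = np.cross(d, e2)
    det = np.dot(e1, pvec)
    if abs(det) < 1e-12:
        return None                       # ray parallel to triangle plane
    inv = 1.0 / det
    tvec = orig - v0
    u = np.dot(tvec, pvec) * inv
    if u < 0.0 or u > 1.0:
        return None
    qvec = np.cross(tvec, e1)
    v = np.dot(d, qvec) * inv
    if v < 0.0 or u + v > 1.0:
        return None
    t = np.dot(e2, qvec) * inv
    return t if t > 1e-9 else None

def trace_with_angle_culling(orig, d, triangles, max_angle):
    """Skip full intersection tests for triangles whose centroid lies more
    than max_angle radians off the ray direction (simplified angle cull)."""
    d = d / np.linalg.norm(d)
    best = None
    for tri in triangles:
        to_c = tri.mean(axis=0) - orig
        cosang = np.dot(to_c, d) / np.linalg.norm(to_c)
        if cosang < np.cos(max_angle):    # centroid too far off-axis: cull
            continue
        t = moller_trumbore(orig, d, tri)
        if t is not None and (best is None or t < best):
            best = t
    return best

tri_hit = np.array([[-1.0, -1.0, 5.0], [1.0, -1.0, 5.0], [0.0, 1.0, 5.0]])
tri_far = tri_hit + np.array([100.0, 0.0, 0.0])   # far off the ray axis: culled
t = trace_with_angle_culling(np.zeros(3), np.array([0.0, 0.0, 1.0]),
                             [tri_far, tri_hit], max_angle=0.2)
print(t)   # the on-axis triangle is hit; the off-axis one is never tested
```

The trade-off the abstract describes is visible here: a large triangle can extend well past its centroid, so a tight `max_angle` may cull a triangle the ray actually grazes, which is the "corruption" failure mode.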
Numerical simulation and comparison of nonlinear self-focusing based on iteration and ray tracing
Li, Xiaotong; Chen, Hao; Wang, Weiwei; Ruan, Wangchao; Zhang, Luwei; Cen, Zhaofeng
2017-05-01
Self-focusing is observed in nonlinear materials owing to the interaction between laser and matter as the beam propagates. Numerical simulation strategies such as the beam propagation method (BPM), based on the nonlinear Schrödinger equation, and ray tracing, based on Fermat's principle, have been applied to simulate the self-focusing process. In this paper we present an iterative nonlinear ray tracing method in which the nonlinear material is cut into many slices, as in existing approaches, but instead of the paraxial approximation and split-step Fourier transform, a large number of sampled real rays are traced step by step through the system, with the refractive index and laser intensity updated by iteration. In this process a smoothing treatment is employed to generate the laser density distribution at each slice, decreasing the error caused by under-sampling. The characteristic of this method is that the nonlinear refractive indices at points on the current slice are calculated by iteration, solving the problem of unknown material parameters caused by the causal relationship between laser intensity and nonlinear refractive index. Compared with the beam propagation method, this algorithm is more suitable for engineering application, has lower time complexity, and can numerically simulate the self-focusing process in systems containing both linear and nonlinear optical media. If the sampled rays are traced with their complex amplitudes and optical paths or phases, it will be possible to simulate the superposition of different beams. At the end of the paper, the advantages and disadvantages of the algorithm are discussed.
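The slice-by-slice scheme described above can be sketched as a toy 2-D model. This Python sketch is not the paper's algorithm: units are arbitrary, the Kerr coefficient, power, and smoothing kernel are chosen only so the effect is visible, and the single-pass index update stands in for the paper's iteration. At each slice it rebuilds the intensity from the smoothed ray density, updates n = n0 + n2*I, and bends rays along the transverse index gradient.

```python
import numpy as np

n0, n2 = 1.5, 2e-14   # linear index; Kerr coefficient (arbitrary units)
P = 5e6               # beam power (arbitrary units)

def trace_slices(n_rays=2001, w0=0.05, L=10.0, n_slices=400, bins=200):
    """2D slice-by-slice nonlinear ray trace through a Kerr medium."""
    x = np.linspace(-w0, w0, n_rays)           # ray heights across the beam
    weight = np.exp(-2 * (x / (w0 / 2)) ** 2)  # Gaussian power weighting
    theta = np.zeros(n_rays)                   # ray slopes, initially collimated
    dz = L / n_slices
    for _ in range(n_slices):
        # smoothed intensity profile rebuilt from the weighted ray density
        hist, edges = np.histogram(x, bins=bins, range=(-2 * w0, 2 * w0),
                                   weights=weight)
        hist = np.convolve(hist, np.ones(9) / 9.0, mode="same")
        width = edges[1] - edges[0]
        intensity = P * hist / (hist.sum() * width + 1e-30)
        n = n0 + n2 * intensity                # Kerr index at this slice
        dndx = np.gradient(n, width)
        idx = np.clip(((x + 2 * w0) / width).astype(int), 0, bins - 1)
        theta += (dndx[idx] / n0) * dz         # paraxial ray bending
        x = x + theta * dz
    return x, weight

x, w = trace_slices()
rms = np.sqrt(np.average(x ** 2, weights=w))
print(f"rms beam radius after the cell: {rms:.4f} (initial is about 0.0125)")
```

Rays on either flank see an index gradient pointing toward the intensity peak, so the weighted beam radius shrinks as the beam propagates, which is the self-focusing signature the paper simulates with far more care.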
Weiland, C.M. [Univ. of California, Santa Barbara, CA (United States)]; Steck, L.K. [Los Alamos National Lab., NM (United States)]; Dawson, P.B. [Geological Survey, Menlo Park, CA (United States)]; [and others]
1995-10-10
The authors explore the impact of three-dimensional minimum travel time ray tracing on nonlinear teleseismic inversion. This problem has particular significance when trying to image strongly contrasting low-velocity bodies, such as magma chambers, because strongly refracted and/or diffracted rays may precede the direct P wave arrival traditionally used in straight-ray seismic tomography. They use a simplex-based ray tracer to compute the three-dimensional, minimum travel time ray paths and employ an iterative technique to cope with nonlinearity. Results from synthetic data show that their algorithm results in better model reconstructions compared with traditional straight-ray inversions. The authors reexamine the teleseismic data collected at Long Valley caldera by the U.S. Geological Survey. The most prominent feature of their result is a 25-30% low-velocity zone centered at 11.5 km depth beneath the northwestern quadrant of the caldera. Beneath this, at a depth of 24.5 km, is a more diffuse 15% low-velocity zone. In general, the low velocities tend to deepen to the south and east. The authors interpret the shallow feature to be the residual Long Valley caldera magma chamber, while the deeper feature may represent basaltic magmas ponded in the midcrust. The deeper position of the prominent low-velocity region in comparison to earlier tomographic images is a result of using three-dimensional rays rather than straight rays in the ray tracing. The magnitude of the low-velocity anomaly is a factor of approximately 3 times larger than in earlier models from linear arrival time inversions and is consistent with models based on observations of ray bending at sites within the caldera. These results imply the presence of anywhere from 7 to 100% partial melt beneath the caldera. 40 refs., 1 fig., 1 tab.
New developments in the McStas neutron instrument simulation package
Willendrup, P. K.; Knudsen, E. B.; Klinkby, E.; Nielsen, T.; Farhi, E.; Filges, U.; Lefmann, K.
2014-07-01
The McStas neutron ray-tracing software package is a versatile tool for building accurate simulators of neutron scattering instruments at reactors, short- and long-pulsed spallation sources such as the European Spallation Source. McStas is extensively used for design and optimization of instruments, virtual experiments, data analysis and user training. McStas was founded as a scientific, open-source collaborative code in 1997. This contribution presents the project at its current state and gives an overview of the main new developments in McStas 2.0 (December 2012) and McStas 2.1 (expected fall 2013), including many new components, component parameter uniformisation, partial loss of backward compatibility, updated source brilliance descriptions, developments toward new tools and user interfaces, web interfaces and a new method for estimating beam losses and background from neutron optics.
Jones, M. A.; Edwards, A.; Boulton, P.
2010-12-01
Helping students to develop a cognitive and intuitive feel for the different temporal and spatial scales of processes through which the rock record is assembled is a primary goal of geoscience teaching. SedWorks is a 3-D virtual geoscience world that integrates both quantitative modelling and field-based studies into one interactive package. The program aims to help students acquire scientific content, cultivate critical thinking skills, and hone their problem solving ability, while also providing them with the opportunity to practice the activities undertaken by professional earth scientists. SedWorks is built upon a game development platform used for constructing interactive 3-D applications. Initially the software has been developed for teaching the sedimentology component of a Geoscience degree and consists of a series of continents or land masses each possessing sedimentary environments which the students visit on virtual field trips. The students are able to interact with the software to collect virtual field data from both the modern environment and the stratigraphic record, and to formulate hypotheses based on their observations which they can test through virtual physical experimentation within the program. The program is modular in design in order to enhance its adaptability and to allow scientific content to be updated so that the knowledge and skills acquired are at the cutting edge. We will present an example module in which students undertake a virtual field study of a 2-km long stretch of a river to observe how sediment is transported and deposited. On entering the field area students are able to observe different bedforms in different parts of the river as they move up- and down-stream, as well as in and out of the river. As they explore, students discover ‘hot spots’ at which particular tools become available to them. This includes tools for measuring the physical parameters of the flow and sediment bed (e.g. velocity, depth, grain size, bed
Zaknun, John J; Rajabi, Hossein; Piepsz, Amy; Roca, Isabel; Dondi, Maurizio
2011-01-01
Under the auspices of the International Atomic Energy Agency, a new-generation, platform-independent, and x86-compatible software package was developed for the analysis of scintigraphic renal dynamic imaging studies. It provides nuclear medicine professionals cost-free access to the most recent developments in the field. The software package is a step forward towards harmonization and standardization. Embedded functionalities render it a suitable tool for education, research, and for receiving distant experts' opinions. Another objective of this effort is to allow introducing clinically useful parameters of drainage, including normalized residual activity and outflow efficiency. Furthermore, it provides an effective teaching tool for young professionals who are being introduced to dynamic kidney studies by selected teaching case studies. The software facilitates a better understanding through practically approaching different variables and settings and their effect on the numerical results. An effort was made to introduce instruments of quality assurance at the various levels of the program's execution, including visual inspection and automatic detection and correction of patient motion, automatic placement of regions of interest around the kidneys and cortical regions, and placement of a reproducible background region on both primary dynamic and postmicturition studies. The user can calculate the differential renal function through two independent methods, the integral or the Rutland-Patlak approaches. Standardized digital reports, storage and retrieval of regions of interest, and built-in database operations allow the generation and tracing of full image reports and of numerical outputs. The software package is undergoing quality assurance procedures to verify the accuracy and the interuser reproducibility with the final aim of launching the program for use by professionals and teaching institutions worldwide.
Zijffers, J.F.; Janssen, M.G.J.; Tramper, J.; Wijffels, R.H.; Salim, S.
2008-01-01
The Green Solar Collector (GSC), a photobioreactor designed for area-efficient outdoor cultivation of microalgae, uses Fresnel lenses and light guides to focus, transport and distribute direct light into the algae suspension. Calculating the path of rays of light, so-called ray tracing, is used to de
Snellenburg, J.J.; Braaf, B.; Hermans, E.A.; Heijde, van der R.G.L.; Sicam, V.A.
2010-01-01
A forward ray tracing (FRT) model is presented to determine the exact image projection in a general corneal topography system. Consequently, the skew ray error in Placido-based topography is demonstrated. A quantitative analysis comparing FRT-based algorithms and Placido-based algorithms in reconstr
Maliage, M
2012-05-01
The purpose of this paper is to validate SolTrace for concentrating solar investigations at the CSIR by means of a test case: the comparison of the flux distribution in the focal spot of a 1.25 m² target-aligned heliostat predicted by the ray tracing...
Terahertz/mm wave imaging simulation software
Fetterman, M. R.; Dougherty, J.; Kiser, W. L., Jr.
2006-10-01
We have developed a mm wave/terahertz imaging simulation package from COTS graphic software and custom MATLAB code. In this scheme, a commercial ray-tracing package was used to simulate the emission and reflections of radiation from scenes incorporating highly realistic imagery. Accurate material properties were assigned to objects in the scenes, with values obtained from the literature, and from our own terahertz spectroscopy measurements. The images were then post-processed with custom MATLAB code to include the blur introduced by the imaging system and noise levels arising from system electronics and detector noise. The MATLAB code was also used to simulate the effect of fog, an important aspect for mm wave imaging systems. Several types of image scenes were evaluated, including bar targets, contrast detail targets, a person in a portal screening situation, and a sailboat on the open ocean. The images produced by this simulation are currently being used as guidance for a 94 GHz passive mm wave imaging system, but have broad applicability for frequencies extending into the terahertz region.
McStas 1.1: A freeware package for neutron Monte Carlo ray-tracing simulations
Lefmann, K.; Nielsen, K.
1999-01-01
The key themes of the 12th ordinary general meeting of the Nordic Society for Radiation Protection were: RADIATION - ENVIRONMENT - INFORMATION. A number of outstanding international experts accepted to contribute on the meeting's first day with invited presentations, which focussed on these themes...
Tauxe, L.; Shaar, R.; Jonestrask, L.; Swanson-Hysell, N. L.; Minnett, R.; Koppers, A. A. P.; Constable, C. G.; Jarboe, N.; Gaastra, K.; Fairchild, L.
2016-06-01
The Magnetics Information Consortium (MagIC) database provides an archive with a flexible data model for paleomagnetic and rock magnetic data. The PmagPy software package is a cross-platform and open-source set of tools written in Python for the analysis of paleomagnetic data that serves as one interface to MagIC, accommodating various levels of user expertise. PmagPy facilitates thorough documentation of sampling, measurements, data sets, visualization, and interpretation of paleomagnetic and rock magnetic experimental data. Although not the only route into the MagIC database, PmagPy makes preparation of newly published data sets for contribution to MagIC as a byproduct of normal data analysis and allows manipulation as well as reanalysis of data sets downloaded from MagIC with a single software package. The graphical user interface (GUI), Pmag GUI enables use of much of PmagPy's functionality, but the full capabilities of PmagPy extend well beyond that. Over 400 programs and functions can be called from the command line interface mode, or from within the interactive Jupyter notebooks. Use of PmagPy within a notebook allows for documentation of the workflow from the laboratory to the production of each published figure or data table, making research results fully reproducible. The PmagPy design and its development using GitHub accommodates extensions to its capabilities through development of new tools by the user community. Here we describe the PmagPy software package and illustrate the power of data discovery and reuse through a reanalysis of published paleointensity data which illustrates how the effectiveness of selection criteria can be tested.
Kashima RAy-Tracing Service (KARATS) for high accurate GNSS positioning
Ichikawa, R.; Hobiger, T.; Hasegawa, S.; Tsutsumi, M.; Koyama, Y.; Kondo, T.
2010-12-01
Radio signal delays associated with the neutral atmosphere are one of the major error sources of space geodesy techniques such as GPS, GLONASS, GALILEO, VLBI, and InSAR measurements. We have developed a state-of-the-art tool to estimate the atmospheric path delays by ray-tracing through the JMA meso-scale analysis (MANAL) data. The tools, which we have named 'KAshima RAytracing Tools (KARAT)', are capable of calculating total slant delays and ray-bending angles considering real atmospheric phenomena. Numerical weather models such as the MANAL data have undergone a significant improvement in accuracy and spatial resolution, which makes it feasible to utilize them for the correction of atmospheric excess path delays. In previous studies evaluating KARAT performance, the KARAT solutions were slightly better than solutions using VMF1 and GMF with a linear gradient model for horizontal and height positions. Based on these results, we started the web-based online service 'KAshima RAytracing Service (KARATS)', which provides atmospheric delay corrections of RINEX files, on January 27th, 2010. KARATS receives users' RINEX data via a web site (http://vps.nict.go.jp/karats/index.html) and processes the data files using KARAT to reduce atmospheric slant delays. The reduced RINEX files are archived in a specific directory for each user on the KARATS server. Once the processing is finished, the archive information is sent privately via email to each user. Users who wish to process a large number of data files can instead prepare their own server to archive them; KARATS then retrieves the files from the user's server using GNU wget and performs the ray-traced corrections. We will present a brief status of KARATS and summarize first experiences gained after this service went operational in December 2009. In addition, we will also demonstrate the newest KARAT performance based on the 5 km MANAL data, which has been operational since April 7th, 2009, and an outlook on
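The quantity being corrected is the integral of refractivity along the ray. The toy Python sketch below uses an exponential refractivity profile and a flat-atmosphere 1/sin(e) mapping; KARAT itself traces bent rays through 3-D numerical weather fields, and the profile constants here are illustrative.

```python
import math

def zenith_delay(refractivity_surface=320.0, scale_height=8000.0,
                 top=80000.0, dh=10.0):
    """Zenith path delay in metres from an exponential refractivity profile
    N(h) = N0 * exp(-h / H); delay = 1e-6 * integral of N dh."""
    delay, h = 0.0, 0.0
    while h < top:
        delay += 1e-6 * refractivity_surface * math.exp(-h / scale_height) * dh
        h += dh
    return delay

def slant_delay(elev_deg, **kw):
    """Flat-atmosphere mapping 1/sin(e); real ray tracing also bends the ray
    and samples the actual (asymmetric) weather field along the path."""
    return zenith_delay(**kw) / math.sin(math.radians(elev_deg))

print(f"zenith: {zenith_delay():.3f} m, at 10 deg: {slant_delay(10):.3f} m")
```

With these constants the zenith delay comes out near 2.5 m, the right order of magnitude for the total tropospheric delay; the low-elevation growth is what makes accurate mapping (or full ray tracing, as in KARAT) matter for positioning.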
Identification of gravity wave sources using reverse ray tracing over Indian region
M. Pramitha
2014-07-01
The reverse ray tracing method is successfully implemented for the first time in the Indian region to identify the sources and propagation characteristics of gravity waves observed using airglow emissions from Gadanki (13.5° N, 79.2° E) and Hyderabad (17.5° N, 78.5° E). Wave amplitudes are also traced back for these wave events by including both radiative and diffusive damping. Background temperature and wind data obtained from the MSISE-90 and HWM-07 models, respectively, are used for the ray tracing. For the Gadanki region the suitability of these models is tested. Further, a climatological model of the background atmosphere for the Gadanki region has been developed using long-term observations of nearly 30 years from a variety of ground-based (MST radar, radiosonde, MF radar), rocket-, and satellite-borne measurements. For real-time atmospheric inputs, ERA-Interim products are utilized. By this reverse ray method, the source locations for nine wave events could be identified in the upper troposphere, whereas for five other events the waves seem to have been ducted in the mesosphere itself. The uncertainty in locating the terminal points in the horizontal direction is estimated to be within 50–100 km and 150–300 km for the Gadanki and Hyderabad wave events, respectively. This uncertainty arises mainly from not considering the day-to-day variability in tidal amplitudes. As no convection is noticed in or around the terminal points, convection is unlikely to be the source. Interestingly, a large (~9 m s−1 km−1) vertical shear in the horizontal wind is noted near the ray terminal points (at 10–12 km altitude) and is identified as the source generating the nine wave events. Conditions prevailing at the terminal points for each of the 14 events are also provided. These events provide leads to a greater understanding of the coupling between the tropical lower and upper atmosphere through gravity waves.
GPU-based four-dimensional general-relativistic ray tracing
Kuchelmeister, Daniel; Müller, Thomas; Ament, Marco; Wunner, Günter; Weiskopf, Daniel
2012-10-01
This paper presents a new general-relativistic ray tracer that enables image synthesis on an interactive basis by exploiting the performance of graphics processing units (GPUs). The application is capable of visualizing the distortion of the stellar background as well as trajectories of moving astronomical objects orbiting a compact mass. Its source code includes metric definitions for the Schwarzschild and Kerr spacetimes that can be easily extended to other metric definitions, relying on its object-oriented design. The basic functionality features a scene description interface based on the scripting language Lua, real-time image output, and the ability to edit almost every parameter at runtime. The ray tracing code itself is implemented for parallel execution on the GPU using NVidia's Compute Unified Device Architecture (CUDA), which leads to a performance improvement of an order of magnitude compared to a single CPU and makes the application competitive with small CPU cluster architectures.
Program summary
Program title: GpuRay4D
Catalog identifier: AEMV_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEMV_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 73649
No. of bytes in distributed program, including test data, etc.: 1334251
Distribution format: tar.gz
Programming language: C++, CUDA.
Computer: Linux platforms with a NVidia CUDA enabled GPU (Compute Capability 1.3 or higher), C++ compiler, NVCC (The CUDA Compiler Driver).
Operating system: Linux.
RAM: 2 GB
Classification: 1.5.
External routines: OpenGL Utility Toolkit development files, NVidia CUDA Toolkit 3.2, Lua5.2
Nature of problem: Ray tracing in four-dimensional Lorentzian spacetimes.
Solution method: Numerical integration of light rays, GPU-based parallel programming using CUDA, 3D
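The core numerical task such a ray tracer parallelizes per pixel is the integration of null geodesics. A minimal CPU sketch in Python (not GpuRay4D's code; parameters are illustrative) integrates the Schwarzschild photon equation d²u/dφ² = 3Mu² − u with u = 1/r (G = c = 1) and recovers the weak-field light deflection 4M/b:

```python
import math

def photon_deflection(b, M=1.0, dphi=1e-4):
    """Integrate the Schwarzschild null-geodesic equation from r -> infinity
    and return the total bending angle relative to a straight line."""
    u, up = 0.0, 1.0 / b    # du/dphi at infinity fixes the impact parameter b
    phi = 0.0

    def acc(u):
        return 3.0 * M * u * u - u

    while u >= 0.0 and phi < 2.0 * math.pi:
        # one RK4 step for the second-order ODE written as (u, u') system
        k1u, k1v = up, acc(u)
        k2u, k2v = up + 0.5 * dphi * k1v, acc(u + 0.5 * dphi * k1u)
        k3u, k3v = up + 0.5 * dphi * k2v, acc(u + 0.5 * dphi * k2u)
        k4u, k4v = up + dphi * k3v, acc(u + dphi * k3u)
        u += dphi * (k1u + 2 * k2u + 2 * k3u + k4u) / 6.0
        up += dphi * (k1v + 2 * k2v + 2 * k3v + k4v) / 6.0
        phi += dphi
    return phi - math.pi    # deflection relative to a straight line

b = 1000.0
print(photon_deflection(b), 4.0 / b)   # weak-field limit: 4M/b
```

On a GPU each pixel's ray is independent, which is why this embarrassingly parallel integration maps so well onto CUDA.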
Yamazaki, Ichitaro; Wu, Kesheng; Simon, Horst
2008-10-27
The original software package TRLan [TRLan User Guide] implements the thick restart Lanczos method [Wu and Simon 2001] for computing eigenvalues λ and their corresponding eigenvectors v of a symmetric matrix A: Av = λv. Its effectiveness in computing the exterior eigenvalues of a large matrix has been demonstrated [LBNL-42982]. However, its performance strongly depends on the user-specified dimension of the projection subspace. If the dimension is too small, TRLan suffers from slow convergence. If it is too large, the computational and memory costs become expensive. Therefore, to balance solution convergence and costs, users must select an appropriate subspace dimension for each eigenvalue problem at hand. To free users from this difficult task, nu-TRLan [LBNL-1059E] adjusts the subspace dimension at every restart such that optimal performance in solving the eigenvalue problem is automatically obtained. This document provides a user guide to the nu-TRLan software package. The original TRLan software package was implemented in Fortran 90 to solve symmetric eigenvalue problems using static projection subspace dimensions. nu-TRLan was developed in C and extended to solve Hermitian eigenvalue problems. It can be invoked using either a static or an adaptive subspace dimension. In order to simplify its use for TRLan users, nu-TRLan has interfaces and features similar to those of TRLan: (1) Solver parameters are stored in a single data structure called trl-info (Chapter 4 [trl-info structure]). (2) Most of the numerical computations are performed by BLAS and LAPACK subroutines, which allows nu-TRLan to achieve optimized performance across a wide range of platforms. (3) To solve eigenvalue problems on distributed memory systems, the message passing interface (MPI) [MPI forum] is used. The rest of this document is organized as follows. In Chapter 2 [Installation
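The underlying iteration can be illustrated without restarting at all. The Python sketch below implements a plain m-step Lanczos projection with full reorthogonalization and returns the largest Ritz value; TRLan's thick restart and nu-TRLan's adaptive subspace dimension, the actual subject of the guide, are deliberately omitted.

```python
import numpy as np

def lanczos_extreme(A, m=30, seed=0):
    """Plain (unrestarted) m-step Lanczos: build a tridiagonal projection T
    of symmetric A and return its largest Ritz value."""
    n = A.shape[0]
    rng = np.random.default_rng(seed)
    q = rng.standard_normal(n)
    q /= np.linalg.norm(q)
    Q = [q]
    alpha, beta = [], []
    q_prev = np.zeros(n)
    b = 0.0
    for _ in range(m):
        w = A @ q - b * q_prev                 # three-term recurrence
        a = q @ w
        w -= a * q
        Qmat = np.column_stack(Q)              # full reorthogonalization
        w -= Qmat @ (Qmat.T @ w)
        alpha.append(a)
        b = np.linalg.norm(w)
        beta.append(b)
        if b < 1e-12:                          # invariant subspace found
            break
        q_prev, q = q, w / b
        Q.append(q)
    T = (np.diag(alpha) + np.diag(beta[: len(alpha) - 1], 1)
         + np.diag(beta[: len(alpha) - 1], -1))
    return np.linalg.eigvalsh(T)[-1]

# exterior eigenvalues converge quickly: a diagonal test matrix
A = np.diag(np.linspace(0.0, 10.0, 100))
top = lanczos_extreme(A, m=40)
print(top)        # converges toward the largest eigenvalue, 10.0
```

The dependence on `m` seen here (too small: slow convergence; large: growing cost of the reorthogonalization against all stored vectors) is exactly the trade-off nu-TRLan automates by adapting the subspace dimension at each restart.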
A Test Set for stiff Initial Value Problem Solvers in the open source software R: Package deTestSet
Mazzia, F.; Cash, J.R.; Soetaert, K.
2012-01-01
In this paper we present the R package deTestSet, which includes challenging test problems written as ordinary differential equations (ODEs), differential algebraic equations (DAEs) of index up to 3, and implicit differential equations (IDEs). In addition it includes 6 new codes to solve initial value
A comprehensive ray tracing study on the impact of solar reflections from glass curtain walls.
Wong, Justin S J
2016-01-01
To facilitate the investigation of the impact of solar reflections from the façades of skyscrapers on the surrounding environment, a comprehensive ray tracing model has been developed using the International Commerce Centre (ICC) in Hong Kong as an example. Taking into account the actual physical dimensions of buildings and meteorological data, the model simulates and traces the paths of solar reflections from the ICC to the surrounding buildings, assessing the impact in terms of hit locations, light intensity and hit times on each day throughout the year. Our analyses show that various design and architectural features of the ICC have amplified the intensity of reflected solar rays and increased the hit rates of surrounding buildings. These factors include the high reflectivity of the glass panels, their upward tilting angles, the concave profile of the 'Dragon Tail' (glass panels near the base), the particular location and orientation of the ICC, as well as its immense height and large reflective surfaces. The simulation results allow us to accurately map the date and time when the ray projections occur on each of the target buildings, rendering important information such as the number of converging (overlapping) projections and the actual light intensity hitting each of the buildings at any given time. Comparisons with other skyscrapers such as Taipei 101 in Taiwan and 2-IFC (International Finance Centre) in Hong Kong are made. Remedial actions for the ICC and preventive measures are also discussed.
Eccentric small-zone ray tracing wavefront aberrometry for refraction in keratoconus.
Fredriksson, Anneli; Behndig, Anders
2016-11-01
To compare objective refraction using small-zone eccentric laser ray tracing (LRT) wavefront aberrometry to standard autorefraction in keratoconus (KC), and to assess whether the visual acuities achieved with these refractions differ from corresponding values in healthy eyes. Twenty-nine eyes of 29 patients with KC and 29 eyes of 29 healthy controls were included in this prospective unmasked case-control study. The uncorrected (UCVA) and spectacle-corrected (SCVA) Early Treatment Diabetic Retinopathy Study (ETDRS) visual acuities based on refractions derived from LRT in central and four eccentric zones were compared to those achieved with standard autorefraction. The spherical equivalent (M) and two astigmatic power vectors (C0 and C45) were calculated for all refractions. Pentacam HR® was used to generate keratometry readings of the corresponding zones. In KC, the refraction from the upper nasal zone rendered a higher SCVA than the standard autorefraction more often than in the controls (p refractions rendered similar SCVAs in KC. Pentacam HR® showed higher keratometry readings infero-temporally, but also lower readings supero-nasally, compared to controls. In KC, eccentric LRT measurements gave better SCVA than standard autorefraction more often than in healthy eyes. Eccentric LRT may become a valuable tool in the demanding task of subjective refraction in KC. © 2016 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.
Stevens, John Colby [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). The Joint Center for Artificial Photosynthesis; Univ. of California, Berkeley, CA (United States). Dept. of Mechanical Engineering]
2012-12-01
Ray tracing was used to perform optical optimization of arrays of photovoltaic microrods and explore the interaction between light and bubbles of oxygen gas on the surface of the microrods. The incident angle of light was varied over a wide range. The percent of incident light absorbed by the microrods and reflected by the bubbles was computed over this range. It was found that, for the 10 μm diameter, 100 μm tall SrTiO3 microrods simulated in the model, the optimal center-to-center spacing was 14 μm for a square grid. This geometry produced 75% average and 90% maximum absorbance. For a triangular grid using the same microrods, the optimal center-to-center spacing was 14 μm. This geometry produced 67% average and 85% maximum absorbance. For a randomly laid out grid of 5 μm diameter, 100 μm tall SrTiO3 microrods with an average center-to-center spacing of 20 μm, the average absorption was 23% and the maximum absorption was 43%. For a 50% areal coverage fraction of bubbles on the absorber surface, between 2%-20% of the incident light energy was reflected away from the rods by the bubbles, depending upon incident angle and bubble morphology.
ENZO+MORAY: radiation hydrodynamics adaptive mesh refinement simulations with adaptive ray tracing
Wise, John H.; Abel, Tom
2011-07-01
We describe a photon-conserving radiative transfer algorithm, using a spatially-adaptive ray-tracing scheme, and its parallel implementation into the adaptive mesh refinement cosmological hydrodynamics code ENZO. By coupling the solver with the energy equation and non-equilibrium chemistry network, our radiation hydrodynamics framework can be utilized to study a broad range of astrophysical problems, such as stellar and black hole feedback. Inaccuracies can arise from large time-steps and poor sampling; therefore, we devised an adaptive time-stepping scheme and a fast approximation of the optically-thin radiation field with multiple sources. We test the method with several radiative transfer and radiation hydrodynamics tests that are given in Iliev et al. We further test our method with more dynamical situations, for example, the propagation of an ionization front through a Rayleigh-Taylor instability, time-varying luminosities and collimated radiation. The test suite also includes an expanding H II region in a magnetized medium, utilizing the newly implemented magnetohydrodynamics module in ENZO. This method linearly scales with the number of point sources and number of grid cells. Our implementation is scalable to 512 processors on distributed memory machines and can include the radiation pressure and secondary ionizations from X-ray radiation. It is included in the newest public release of ENZO.
Liang, Yicheng; Peng, Hao
2015-02-07
Depth-of-interaction (DOI) poses a major challenge for a PET system to achieve uniform spatial resolution across the field-of-view, particularly for small animal and organ-dedicated PET systems. In this work, we implemented an analytical method to model the system matrix for resolution recovery, which was then incorporated into PET image reconstruction on a graphics processing unit (GPU) platform, due to its parallel processing capacity. The method utilizes the concepts of virtual DOI layers and multi-ray tracing to calculate the coincidence detection response function for a given line-of-response. The accuracy of the proposed method was validated for a small-bore PET insert to be used for simultaneous PET/MR breast imaging. In addition, the performance comparisons were studied among the following three cases: 1) no physical DOI and no resolution modeling; 2) two physical DOI layers and no resolution modeling; and 3) no physical DOI design but with a different number of virtual DOI layers. The image quality was quantitatively evaluated in terms of spatial resolution (full-width-half-maximum and position offset), contrast recovery coefficient and noise. The results indicate that the proposed method has the potential to be used as an alternative to other physical DOI designs and achieve comparable imaging performance, while reducing detector/system design cost and complexity.
Three-dimensional ray tracing for refractive correction of human eye ametropies
Jimenez-Hernandez, J. A.; Diaz-Gonzalez, G.; Trujillo-Romero, F.; Iturbe-Castillo, M. D.; Juarez-Salazar, R.; Santiago-Alvarado, A.
2016-09-01
Ametropies of the human eye are refractive defects that hamper correct imaging on the retina. The most common ways to correct them are spectacles, contact lenses, and modern methods such as laser surgery. In any case, it is very important to identify the grade of the ametropia in order to design the optimum corrective action. In the case of laser surgery, it is necessary to define a new shape of the cornea in order to obtain the desired refractive correction. Therefore, a computational tool to calculate the focal length of the optical system of the eye as its geometrical parameters vary is required. Additionally, a clear and understandable visualization of the evaluation process is desirable. In this work, a model of the human eye based on geometrical optics principles is presented. Simulations of light rays coming from a point source at six meters from the cornea are shown. We perform ray tracing in three dimensions in order to visualize the focusing regions and estimate the power of the optical system. The common parameters of ametropies can be easily modified and analyzed in the simulation through an intuitive graphical user interface.
Woei Leow, Shin; Corrado, Carley; Osborn, Melissa; Isaacson, Michael; Alers, Glenn; Carter, Sue A.
2013-06-01
Luminescent solar concentrators (LSC) collect ambient light from a broad range of angles and concentrate the captured light onto photovoltaic (PV) cells. LSCs with front-facing cells collect direct and indirect sunlight, ensuring a gain factor greater than one. The flexible placement and percentage coverage of PV cells on the LSC panel allow for layout adjustments to be made in order to balance re-absorption losses and the level of light concentration desired. A weighted Monte Carlo ray tracing program was developed to study the transport of photons and loss mechanisms in the LSC to aid in design optimization. The program imports measured absorption/emission spectra of an organic luminescent dye (LR305), the transmission coefficient, and refractive index of acrylic as parameters that describe the system. Simulations suggest that for LR305, 8-10 cm of luminescent material surrounding the PV cell yields the highest increase in power gain per unit area of LSC added, thereby determining the ideal spacing between PV cells in the panel. For rectangular PV cells, results indicate that for each centimeter of PV cell width, an additional increase of 0.15 mm to the waveguide thickness is required to efficiently transport photons collected by the LSC to the PV cell with minimal loss.
A model of polarized-beam AGS in the ray-tracing code Zgoubi
Meot, F. [Brookhaven National Lab. (BNL), Upton, NY (United States); Ahrens, L. [Brookhaven National Lab. (BNL), Upton, NY (United States); Brown, K. [Brookhaven National Lab. (BNL), Upton, NY (United States); Dutheil, Y. [Brookhaven National Lab. (BNL), Upton, NY (United States); Glenn, J. [Brookhaven National Lab. (BNL), Upton, NY (United States); Huang, H. [Brookhaven National Lab. (BNL), Upton, NY (United States); Roser, T. [Brookhaven National Lab. (BNL), Upton, NY (United States); Shoefer, V. [Brookhaven National Lab. (BNL), Upton, NY (United States); Tsoupas, N. [Brookhaven National Lab. (BNL), Upton, NY (United States)
2016-07-12
A model of the Alternating Gradient Synchrotron, based on the AGS snapramps, has been developed in the stepwise ray-tracing code Zgoubi. It has been used over the past 5 years in a number of accelerator studies aimed at enhancing RHIC proton beam polarization. It is also used to study and optimize proton and Helion beam polarization in view of future RHIC and eRHIC programs. The AGS model in Zgoubi is operational on-line via three different applications, ’ZgoubiFromSnaprampCmd’, ’AgsZgoubiModel’ and ’AgsModelViewer’, the latter two being essentially interfaces to the former, which is the actual model ’engine’. All three commands are available from the controls system application launcher in the AGS ’StartUp’ menu, or from eponymous commands on shell terminals. The main aspects of the model and of its operation are presented in this technical note; brief excerpts from various studies performed so far are given for illustration, and the means and methods entering ’ZgoubiFromSnaprampCmd’ are developed further in the appendix.
Seismic ray-tracing calculation based on parabolic travel-time interpolation
周竹生; 张赛民; 陈灵君
2004-01-01
A new seismic ray-tracing method is put forward based on the parabolic travel-time interpolation (PTI) method, which is more accurate than the linear travel-time interpolation (LTI) method. Both the PTI and LTI methods are used to compute seismic travel-times and ray-paths in a 2-D grid cell model. Firstly, some basic concepts are introduced. The calculations of travel-time and ray-path are carried out only at cell boundaries, so the ray-path is always straight within a cell of uniform velocity. Both PTI and LTI proceed in two steps: step 1 computes travel-times and step 2 traces the ray-path. Then, the derivation of the LTI formulas is described. Because of the presence of refracted waves in the shot cell, a formula specific to the shot cell is also derived. Finally, the PTI method is presented. The calculation of the PTI method is more complex than that of the LTI method, but the error is limited. The results of numerical models show that the PTI method can trace ray-paths more accurately and efficiently than the LTI method.
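The travel-time interpolation idea above can be sketched numerically. The function below is a simplified illustration of the LTI step for a single cell edge, assuming constant slowness inside the adjacent cell; the name and the brute-force 1-D scan (in place of the closed-form minimisation actually derived in such papers) are this sketch's own.

```python
import numpy as np

def lti_travel_time(p, a, b, t_a, t_b, slowness, n=1001):
    """Linear travel-time interpolation (LTI) sketch: travel times are known
    at the endpoints a, b of a cell edge and interpolated linearly along it;
    the time at point p inside the adjacent constant-slowness cell is the
    minimum over candidate departure points on the edge (Fermat's principle).
    A brute-force scan over the edge parameter stands in for the closed-form
    minimisation used in practice."""
    r = np.linspace(0.0, 1.0, n)[:, None]            # parameter along the edge
    x = (1 - r) * np.asarray(a) + r * np.asarray(b)  # candidate points on edge
    t_edge = (1 - r[:, 0]) * t_a + r[:, 0] * t_b     # linearly interpolated time
    t_total = t_edge + slowness * np.linalg.norm(x - np.asarray(p), axis=1)
    i = np.argmin(t_total)
    return t_total[i], x[i]                          # minimal time, ray point
```

PTI replaces the linear interpolant `t_edge` with a parabola through three edge nodes, which is the source of its higher accuracy.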
Leow, Shin Woei; Corrado, Carley; Osborn, Melissa; Carter, Sue A.
2013-09-01
Luminescent solar concentrators (LSCs) have the ability to receive light from a wide range of angles, concentrating the captured light onto small photoactive areas. This enables greater incorporation of LSCs into building designs as windows, skylights and wall claddings, in addition to rooftop installations of current solar panels. Using relatively cheap luminescent dyes and acrylic waveguides to concentrate light onto smaller photovoltaic (PV) cells, this technology has the potential to approach grid price parity. We employ a panel design in which the front-facing PV cells collect both direct and concentrated light, ensuring a gain factor greater than one. This also allows flexibility in determining the placement and percentage coverage of PV cells during the design process, to balance reabsorption losses against the power output and level of light concentration desired. To aid in design optimization, a Monte Carlo ray tracing program was developed to study the transport of photons and the loss mechanisms in LSC panels. The program imports measured absorption/emission spectra and transmission coefficients as simulation parameters, with the interactions of photons in the panel determined by comparing calculated probabilities with generated random numbers. LSC panels with multiple dyes or layers can also be simulated. Analysis of the results reveals the optimal panel dimensions and PV cell layouts for maximum power output for a given dye concentration, absorption/emission spectrum and quantum efficiency.
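The "compare calculated probabilities with random numbers" step can be sketched with a deliberately simplified, zero-dimensional toy model of photon fate in an LSC waveguide. All parameter values and names below are invented for illustration and are not the panel or dye parameters from the study.

```python
import math
import random

def lsc_photon_fate(n_photons=10000, n_refr=1.49, qe=0.95,
                    p_reabsorb=0.3, max_events=50, seed=1):
    """Toy Monte Carlo for an LSC waveguide (illustrative numbers only):
    each absorbed photon is re-emitted isotropically; emission within the
    escape cone of the faces is lost, and every re-absorption event costs
    one quantum-efficiency roll.  Returns the fraction of photons that
    survive to reach the panel edge."""
    random.seed(seed)
    # |cos(angle to face normal)| above this value means the ray escapes;
    # below it, the ray is trapped by total internal reflection
    cos_crit = math.sqrt(1.0 - 1.0 / n_refr**2)
    collected = 0
    for _ in range(n_photons):
        alive = True
        for _ in range(max_events):
            if random.random() > qe:                   # non-radiative loss in dye
                alive = False
                break
            if abs(random.uniform(-1, 1)) > cos_crit:  # emitted into escape cone
                alive = False
                break
            if random.random() > p_reabsorb:           # trapped ray reaches edge
                break
        if alive:
            collected += 1
    return collected / n_photons
```

A full panel simulation of the kind described above additionally samples the absorption/emission spectra wavelength by wavelength and tracks photon positions geometrically; the skeleton of probability rolls per event is the same.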
Using Stochastic Ray Tracing to Simulate a Dense Time Series of Gross Primary Productivity
Martin van Leeuwen
2015-12-01
Eddy-covariance carbon dioxide flux measurement is an established method to estimate primary productivity at the forest stand level (typically 10 ha). To validate eddy-covariance estimates, researchers rely on extensive time-series analysis and an assessment of flux contributions made by various ecosystem components at spatial scales much finer than the eddy-covariance footprint. Scaling these contributions to the stand level requires a consideration of the heterogeneity in the canopy radiation field. This paper presents a stochastic ray tracing approach to predict the probabilities of light absorption from over a thousand hemispherical directions by thousands of individual scene elements. Once a look-up table of absorption probabilities is computed, dynamic illumination conditions can be simulated in a computationally realistic time, from which stand-level gross primary productivity can be obtained by integrating photosynthetic assimilation over the scene. We demonstrate the method by inverting a leaf-level photosynthesis model with eddy-covariance and meteorological data. Optimized leaf photosynthesis parameters and canopy structure were able to explain 75% of variation in eddy-covariance gross primary productivity estimates, and commonly used parameters, including photosynthetic capacity and quantum yield, fell within reported ranges. Remaining challenges are discussed, including the need to address the distribution of radiation within shoots and needles.
Zhu, Yang; Zhang, Xin; Liu, Tao; Wu, Yanxiong; Shi, Guangwei; Wang, Lingjie
2015-07-01
A long wave infrared imaging system operated for space exploration of faint targets is highly sensitive to stray radiation. We present an integrated suppression process for internal and external stray radiation. A compact, re-imaging LWIR catadioptric telescope is designed as a practical example, and internal and external stray radiation is analyzed for this telescope. The detector is cryogenically cooled with 100% cold shield efficiency of the Lyot stop. A non-sequential ray tracing technique is applied to investigate how the stray radiation propagates inside the optical system. Simulation and optimization during the initial design stage are performed to avoid the subversive defect of stray radiation disturbing the target signal. The quantitative analysis of stray radiation irradiance emitted by lenses and structures inside is presented in detail. The optical elements, which operate at room temperature due to the limitation of weight and size, turn out to be the significant stray radiation sources. We propose a method combining infrared material selection and optical form optimization to reduce the internal stray radiation of the lenses. We design and optimize mechanical structures to achieve a further attenuation of internal stray radiation power. The point source transmittance (PST) is calculated to assess the external radiation which comes from sources outside the field of view. The ghost of a bright target due to residual reflection of optical coatings is simulated. The results show that the performance of stray radiation suppression is dramatically improved by iterative optimization and modification of optomechanical configurations.
Ryeznik, Yevgen; Sverdlov, Oleksandr; Wong, Weng Kee
2015-08-01
Response-adaptive randomization designs are becoming increasingly popular in clinical trial practice. In this paper, we present RARtool, a user interface software developed in MATLAB for designing response-adaptive randomized comparative clinical trials with censored time-to-event outcomes. The RARtool software can compute different types of optimal treatment allocation designs, and it can simulate response-adaptive randomization procedures targeting selected optimal allocations. Through simulations, an investigator can assess design characteristics under a variety of experimental scenarios and select the best procedure for practical implementation. We illustrate the utility of our RARtool software by redesigning a survival trial from the literature.
Johnson, Charles S.
1986-01-01
The embedded systems running real-time applications, for which Ada was designed, require their own mechanisms for the management of dynamically allocated storage. Due to the performance implications of garbage collection by the KAPSE, there is a need for packages which manage their own internal structures and control their deallocation as well. This places a requirement upon the design of generic packages which manage generically structured private types built up from application-defined input types. Such generic packages should figure greatly in the development of lower-level software such as operating systems, schedulers, controllers, and device drivers, and will manage structures such as queues, stacks, linked lists, files, and binary and multary (hierarchical) trees. A study was made of the use of limited private types, controlled to prevent the inadvertent de-designation of dynamic elements that is implicit in the assignment operation, in solving the problems of controlling the accumulation of anonymous, detached objects in running systems, along with the use of deallocator procedures for run-down of application-defined input types during deallocation operations.
Plis, M.N.; Rohrbacher, T.J.; Teeters, D.D.
1993-01-01
The U.S. Bureau of Mines report presents the documentation for COALVAL, a coal property evaluation software package developed on Lotus 1-2-3, version 2.2, spreadsheets. The software is compatible with version 3.1 as well, and may provisionally be run on the earlier 2.01 version. COALVAL is a menu-driven program that produces a prefeasibility-level cost analysis of mine-planned coal resources. The package contains cost models for each of five coal mining methods commonly employed in Appalachia: auger, contour strip, mountain top removal, continuous miner, and longwall. Other models, such as a dragline cost model, will be incorporated as the Bureau's Coal Recoverability Program matures. COALVAL allows mine operators, evaluators, consultants, and Government entities to input resource data and the various production, operating, and cost variables that pertain to their property. The program can evaluate up to 25 seams, each to be mined with up to five different mining methods, within a given area. Summary spreadsheets listing the cost per clean ton to mine the resources, f.o.b. the tipple, are produced for each property, seam, and mining method/seam combination.
Palomo, E.
1994-07-01
This document describes the structure and the separate modules of the software package METEOR for the statistical analysis of meteorological data series. It contains a systematic description of the subroutines of METEOR and also of the required format for input and output files. The original version of METEOR was developed by Ph.D. Elena Palomo, CIEMAT-IER, GIMASE. It is built by linking programs and routines written in FORTRAN 77 and adds the graphical capabilities of GNUPLOT. The toolbox was designed following criteria of modularity, flexibility and agility. All the input, output and analysis options are structured in three main menus: i) the first evaluates the quality of the data set; ii) the second is for pre-processing of the data; and iii) the third performs the statistical analyses and creates the graphical outputs. At present the documentation of METEOR consists of three documents written in Spanish: 1) METEOR v1.0: User's guide; 2) METEOR v1.0: A usage example; 3) METEOR v1.0: Design and structure of the software package.
Dondorp Arjen M
2009-10-01
Background A number of molecular tools have been developed to monitor the emergence and spread of anti-malarial drug resistance in Plasmodium falciparum. One of the major obstacles to the wider implementation of these tools is the absence of practical methods enabling high throughput analysis. Here a new Zip-code array is described, called FlexiChip, linked to a dedicated software program, which largely overcomes this problem. Methods Previously published microarray probes detecting single-nucleotide polymorphisms (SNPs) associated with parasite resistance to anti-malarial drugs (ResMalChip) were adapted for a universal microarray FlexiChip format. To evaluate the overall sensitivity of the FlexiChip package (microarray + software), the results of FlexiChip were compared to the ResMalChip microarray, using the same extension probes and the same PCR products. In both cases, sequence results were used as the gold standard to calculate sensitivity and specificity. FlexiChip results obtained with a set of field isolates were then compared to those assessed in an independent reference laboratory. Results The FlexiChip package gave results identical to the ResMalChip results in 92.7% of samples (kappa coefficient 0.8491, with a standard error 0.021) and had a sensitivity of 95.88% and a specificity of 97.68% compared to sequencing as the reference method. Moreover, the method performed well compared to the results obtained in the reference laboratories, with 99.7% of identical results (kappa coefficient 0.9923, S.E. 0.0523). Conclusion Microarrays could be employed to monitor P. falciparum drug resistance markers with greater cost effectiveness and the possibility of high throughput analysis. The FlexiChip package is a promising tool for use in poor resource settings of malaria endemic countries.
Cheng, Ruida; Jackson, Jennifer N.; McCreedy, Evan S.; Gandler, William; Eijkenboom, J. J. F. A.; van Middelkoop, M.; McAuliffe, Matthew J.; Sheehan, Frances T.
2016-03-01
The paper presents an automatic segmentation methodology for the patellar bone, based on 3D gradient recalled echo and gradient recalled echo with fat suppression magnetic resonance images. Constricted search space outlines are incorporated into recursive ray-tracing to segment the outer cortical bone. A statistical analysis based on the dependence of information in adjacent slices is used to limit the search in each image to between an outer and inner search region. A section based recursive ray-tracing mechanism is used to skip inner noise regions and detect the edge boundary. The proposed method achieves higher segmentation accuracy (0.23 mm) than the current state-of-the-art methods, with an average Dice similarity coefficient of 96.0% (SD 1.3%) agreement between the auto-segmentation and ground truth surfaces.
Identification of Gravity wave Sources over Tropical Latitudes Using Reverse Ray Tracing technique
Venkat Ratnam, Madineni; Pramitha, M.
2016-07-01
Sources and propagation characteristics of high-frequency gravity waves (GWs) observed in the mesosphere using airglow emissions from Gadanki (13.5°N, 79.2°E) and Hyderabad (17.5°N, 78.5°E) are investigated using reverse ray tracing. Wave amplitudes are also traced back, including both radiative and diffusive damping. For this a climatological model of the background atmosphere for the Gadanki region has been developed using nearly 30 years of observations available from a variety of ground based (MST radar, radiosondes, MF radar) and rocket- and satellite-borne measurements. With the reverse ray-tracing method, the source locations for wave events could be identified to be in the upper troposphere. Uncertainty in locating the terminal points of wave events in the horizontal direction is estimated to be within 50-100 km and 150-300 km for Gadanki and Hyderabad wave events, respectively. This uncertainty arises mainly due to non-consideration of the day-to-day variability in the tidal amplitudes. Interestingly, large (~9 m s^-1 km^-1) vertical shears in the horizontal wind are noticed near the ray terminal points (at 10-12 km altitude) and are thus identified to be the source for generating the observed high phase-speed, high-frequency GWs. We also tried to identify the sources for the GWs which were observed during the Indo-French campaign conducted during May 2014. The uniqueness of the present study lies in using near-real-time background atmosphere data from simultaneous radiosonde and meteor radar covering both the source and propagation/dissipation regions of the GWs. When we searched for the sources near the terminal points, deep convection was found to be a source for these events. We also tried to identify the sources of inertia-gravity waves (IGWs) that are observed in the troposphere and lower stratosphere during different seasons using long-term (2006-2014) high resolution radiosonde observations. In general, 50% of the waves observed over this location have convection as
Fokker-Planck/Ray Tracing for Electron Bernstein and Fast Wave Modeling in Support of NSTX
Harvey, R. W. [CompX, Del Mar, CA (United States)
2009-11-12
This DOE grant supported fusion energy research, a potential long-term solution to the world's energy needs. Magnetic fusion, exemplified by confinement of very hot ionized gases, i.e., plasmas, in donut-shaped tokamak vessels is a leading approach for this energy source. Thus far, a mixture of hydrogen isotopes has produced tens of megawatts of fusion power for seconds in a tokamak reactor at Princeton Plasma Physics Laboratory in New Jersey. The research grant under consideration, ER54684, uses computer models to aid in understanding and projecting the efficacy of heating and current drive sources in the National Spherical Torus Experiment, a tokamak variant, at PPPL. The NSTX experiment explores the physics of very tight aspect ratio, almost spherical tokamaks, aiming at producing steady-state fusion plasmas. Current drive is an integral part of the steady-state concept, maintaining the magnetic geometry in the steady-state tokamak. CompX further developed and applied models for radiofrequency (rf) heating and current drive for applications to NSTX. These models build on a 30 year development of rf ray tracing (the all-frequencies GENRAY code) and higher dimensional Fokker-Planck rf-collisional modeling (the 3D collisional-quasilinear CQL3D code) at CompX. Two mainline current-drive rf modes are proposed for injection into NSTX: (1) electron Bernstein wave (EBW), and (2) high harmonic fast wave (HHFW) modes. Both these current drive systems provide a means for the rf to access the especially high density plasma--termed high beta plasma--compared to the strength of the required magnetic fields. The CompX studies entailed detailed modeling of the EBW to calculate the efficiency of the current drive system, and to determine its range of flexibility for driving current at spatial locations in the plasma cross-section. The ray tracing showed penetration into the NSTX bulk plasma, relatively efficient current drive, but a limited ability to produce current over
Lo, Ch. K.; Lim, Y. S.; Tan, S. G.; Rahman, F. A. [Faculty of Engineering and Science, University Tunku Abdul Rahman, Jalan Genting Klang, 53300, Kuala Lumpur (Malaysia)
2010-12-15
A Luminescent Solar Concentrator (LSC) is a transparent plate containing luminescent material with photovoltaic (PV) cells attached to its edges. Sunlight entering the plate is absorbed by the luminescent material, which in turn emits light. The emitted light propagates through the plate and arrives at the PV cells through total internal reflection. The ratio of the area of the relatively cheap polymer plate to that of the expensive PV cells is increased, and the cost per unit of solar electricity can be reduced by 75%. To improve the emission performance of LSCs, simulation modeling of LSCs becomes essential. Ray-tracing modeling is a popular approach for simulating LSCs due to its great ability to model various LSC structures under direct and diffuse sunlight. However, this approach requires a substantial amount of measurement input data. Also, the simulation time is enormous, because it is a forward ray-tracing method that traces all the rays propagating from the light source to the concentrator. On the other hand, the thermodynamic approach requires substantially fewer input parameters and less simulation time, but it can only be used to model simple LSC designs with direct sunlight. Therefore, a new hybrid model was developed to perform various simulation studies effectively without facing the issues arising from the existing ray-tracing and thermodynamic models. The simulation results show that at least 60% of the total output irradiance of an LSC is contributed by the light trapped and channeled by the LSC. The novelty of this hybrid model is the concept of integrating the thermodynamic model with a well-developed Radiance ray-tracing model, hence making this model a fast, powerful and cost-effective tool for the design of LSCs.
Desnijder, Karel; Hanselaer, Peter; Meuret, Youri
2016-04-01
A key requirement to obtain a uniform luminance for a side-lit LED backlight is the optimised spatial pattern of structures on the light guide that extract the light. The generation of such a scatter pattern is usually performed by applying an iterative approach. In each iteration, the luminance distribution of the backlight with a particular scatter pattern is analysed. This is typically performed with a brute-force ray-tracing algorithm, although this approach results in a time-consuming optimisation process. In this study, the Adding-Doubling method is explored as an alternative way of evaluating the luminance of a backlight. Due to the similarities between light propagating in a backlight with extraction structures and light scattering in a cloud of light scatterers, the Adding-Doubling method, which is used to model the latter, can also be used to model the light distribution in a backlight. The backlight problem is translated into a form upon which the Adding-Doubling method is directly applicable. The calculated luminance for a simple uniform extraction pattern with the Adding-Doubling method matches the luminance generated by a commercial ray tracer very well. Although successful, no clear computational advantage over ray tracers is realised. However, the dynamics of light propagation in a light guide as used in the Adding-Doubling method also make it possible to enhance the efficiency of brute-force ray-tracing algorithms. The performance of this enhanced ray-tracing approach for the simulation of backlights is also evaluated against a typical brute-force ray-tracing approach.
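The core of the Adding-Doubling idea is compact in scalar form: the reflectance and transmittance of two stacked identical layers follow from summing all interreflections between them, and a thick slab is built by repeatedly doubling a thin starting layer. This is a scalar sketch only, ignoring the angular discretisation that the paper's backlight translation requires.

```python
def double(r: float, t: float) -> tuple[float, float]:
    """Combine two identical layers (reflectance r, transmittance t),
    summing the geometric series of interreflections between them."""
    denom = 1.0 - r * r
    return r + t * r * t / denom, t * t / denom

def slab_rt(r_thin: float, t_thin: float, n_doublings: int) -> tuple[float, float]:
    """Reflectance/transmittance of a slab 2**n_doublings times thicker
    than the thin starting layer."""
    r, t = r_thin, t_thin
    for _ in range(n_doublings):
        r, t = double(r, t)
    return r, t

# non-absorbing thin layer doubled 8 times: energy stays conserved (r + t = 1)
r, t = slab_rt(0.01, 0.99, 8)
print(r + t)
```

For a non-absorbing starting layer (r + t = 1) the doubling step conserves energy exactly, which makes a convenient sanity check on any implementation.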
Ray tracing based path-length calculations for polarized light tomographic imaging
Manjappa, Rakesh; Kanhirodan, Rajan
2015-09-01
A ray tracing based path length calculation is investigated for polarized light transport in a pixel space. Tomographic imaging using polarized light transport is promising for applications in optical projection tomography of small animal imaging and turbid media with low scattering. Polarized light transport through a medium can have complex effects due to interactions such as optical rotation of linearly polarized light, birefringence, diattenuation and interior refraction. Here we investigate the effects of refraction of polarized light in a non-scattering medium. This step is used to obtain the initial absorption estimate. This estimate can be used as a prior in a Monte Carlo (MC) program that simulates the transport of polarized light through a scattering medium to assist in faster convergence of the final estimate. The reflectances for p-polarized (parallel) and s-polarized (perpendicular) light are different, and hence there is a difference in the intensities that reach the detector end. The algorithm computes the length of the ray in each pixel along the refracted path, and this is used to build the weight matrix. This weight matrix with corrected ray path lengths, together with the resultant intensity reaching the detector for each ray, is used in the algebraic reconstruction technique (ART). The proposed method is tested with numerical phantoms for various noise levels. The refraction errors due to regions of different refractive index are discussed, and the difference in intensities with polarization is considered. The improvements in reconstruction using the correction so applied are presented. This is achieved by tracking the path of the ray as well as the intensity of the ray as it traverses the medium.
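The per-pixel path-length and ART steps described above can be sketched as follows. This is a hedged simplification: a straight segment stands in for one refracted leg of the ray (the Snell bending itself is omitted), the traversal is a basic Siddon-style grid crossing, and all function names are our own, not the authors'.

```python
import numpy as np

def pixel_path_lengths(p0, p1, nx, ny):
    """Length of the segment p0 -> p1 inside each unit pixel of an
    nx-by-ny grid (simplified Siddon-style traversal)."""
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    d = p1 - p0
    # parametric values where the segment crosses pixel boundaries
    ts = [0.0, 1.0]
    for axis, n in ((0, nx), (1, ny)):
        if d[axis] != 0.0:
            for k in range(n + 1):
                t = (k - p0[axis]) / d[axis]
                if 0.0 < t < 1.0:
                    ts.append(t)
    ts = np.unique(ts)
    W = np.zeros((ny, nx))
    seg = np.linalg.norm(d)
    for ta, tb in zip(ts[:-1], ts[1:]):
        mid = p0 + 0.5 * (ta + tb) * d        # midpoint identifies the pixel
        i, j = int(mid[1]), int(mid[0])
        if 0 <= i < ny and 0 <= j < nx:
            W[i, j] = (tb - ta) * seg
    return W

def art_update(x, w_row, b_i, lam=1.0):
    """One ART (Kaczmarz) update for measurement b_i with weight row w_row."""
    w = w_row.ravel()
    return x + lam * (b_i - w @ x) / (w @ w) * w
```

Stacking one such row per detected ray gives the weight matrix; cycling `art_update` over the rows is the ART iteration.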
Accounting for partiality in serial crystallography using ray-tracing principles.
Kroon-Batenburg, Loes M J; Schreurs, Antoine M M; Ravelli, Raimond B G; Gros, Piet
2015-09-01
Serial crystallography generates `still' diffraction data sets that are composed of single diffraction images obtained from a large number of crystals arbitrarily oriented in the X-ray beam. Estimation of the reflection partialities, which account for the expected observed fractions of diffraction intensities, has so far been problematic. In this paper, a method is derived for modelling the partialities by making use of the ray-tracing diffraction-integration method EVAL. The method estimates partialities based on crystal mosaicity, beam divergence, wavelength dispersion, crystal size and the interference function, accounting for crystallite size. It is shown that modelling of each reflection by a distribution of interference-function weighted rays yields a `still' Lorentz factor. Still data are compared with a conventional rotation data set collected from a single lysozyme crystal. Overall, the presented still integration method improves the data quality markedly. The R factor of the still data compared with the rotation data decreases from 26% using a Monte Carlo approach to 12% after applying the Lorentz correction, to 5.3% when estimating partialities by EVAL and finally to 4.7% after post-refinement. The merging R(int) factor of the still data improves from 105 to 56% but remains high. This suggests that the accuracy of the model parameters could be further improved. However, with a multiplicity of around 40 and an R(int) of ∼50% the merged still data approximate the quality of the rotation data. The presented integration method suitably accounts for the partiality of the observed intensities in still diffraction data, which is a critical step to improve data quality in serial crystallography.
Zhang, Dong; Zhang, Ting-Ting; Zhang, Xiao-Lei; Yang, Yan; Hu, Ying; Qin, Qian-Qing
2013-05-01
We present a new method of three-dimensional (3-D) seismic ray tracing, based on an improvement to the linear traveltime interpolation (LTI) ray tracing algorithm. This new technique involves two separate steps. The first involves a forward calculation based on the LTI method and the dynamic successive partitioning scheme, which is applied to calculate traveltimes on cell boundaries and assumes a wavefront that expands from the source to all grid nodes in the computational domain. We locate several dynamic successive partition points on a cell's surface, the traveltimes of which can be calculated by linear interpolation between the vertices of the cell's boundary. The second is a backward step that uses Fermat's principle and the fact that the ray path is always perpendicular to the wavefront and follows the negative traveltime gradient. In this process, the first-arriving ray path can be traced from the receiver to the source along the negative traveltime gradient, which can be calculated by reconstructing the continuous traveltime field with cubic B-spline interpolation. This new 3-D ray tracing method is compared with the LTI method and the shortest path method (SPM) through a number of numerical experiments. These comparisons show obvious improvements to computed traveltimes and ray paths, both in precision and computational efficiency.
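The forward LTI step above reduces, for one cell edge, to a small Fermat minimisation: the wavefront time along edge A-B is interpolated linearly between the node times, and the crossing point is the one minimising the total arrival time at the target point. The sketch below is 2D and uses brute-force sampling instead of the analytic minimisation and dynamic partitioning of the paper; the function name and signature are our own.

```python
import numpy as np

def lti_traveltime(A, B, tA, tB, C, v, nsamp=1001):
    """Linear traveltime interpolation on edge A-B (node times tA, tB):
    Fermat's principle picks the crossing point X(r) = (1-r)A + rB that
    minimises the arrival time at C for local medium velocity v."""
    A, B, C = (np.asarray(p, float) for p in (A, B, C))
    r = np.linspace(0.0, 1.0, nsamp)[:, None]
    X = (1 - r) * A + r * B                              # candidate crossing points
    t = (1 - r[:, 0]) * tA + r[:, 0] * tB + np.linalg.norm(X - C, axis=1) / v
    k = np.argmin(t)
    return t[k], X[k]
```

In a homogeneous unit-velocity medium with a source at the origin, node times equal node distances, so the minimised time at C should recover the straight-ray traveltime, which makes a simple correctness check.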
E. Achmad
2006-12-01
Gravity wave signatures were extracted from OH airglow observations using all-sky CCD imagers at four different stations: Cachoeira Paulista (CP) (22.7° S, 45° W) and São João do Cariri (7.4° S, 36.5° W), Brazil; Tanjungsari (TJS) (6.9° S, 107.9° E), Indonesia; and Shigaraki (34.9° N, 136° E), Japan. The gravity wave parameters are used as input in a reverse ray tracing model to study the gravity wave vertical propagation trajectory and to estimate the wave source region. Gravity waves observed near the equator showed a shorter period and a larger phase velocity than those observed at low-middle latitudes. The waves ray traced down into the troposphere showed the largest horizontal wavelengths and phase speeds. The ray tracing results also showed that at CP, Cariri and Shigaraki the majority of the ray paths stopped in the mesosphere due to the condition m² → ∞, which suggests the presence of ducting waves and/or waves generated in situ. In the troposphere, the possible gravity wave sources are related to meteorological front activities and cloud convection at CP, while at Cariri and TJS tropical cloud convection near the equator is the most probable gravity wave source. The tropospheric jet stream and the orography are thought to be the sources mainly responsible for the waves observed at Shigaraki.
Shi, Shengxian; Ding, Junfei; New, T. H.; Soria, Julio
2017-07-01
This paper presents a dense ray tracing reconstruction technique for a single light-field camera-based particle image velocimetry. The new approach pre-determines the location of a particle through inverse dense ray tracing and reconstructs the voxel value using multiplicative algebraic reconstruction technique (MART). Simulation studies were undertaken to identify the effects of iteration number, relaxation factor, particle density, voxel-pixel ratio and the effect of the velocity gradient on the performance of the proposed dense ray tracing-based MART method (DRT-MART). The results demonstrate that the DRT-MART method achieves higher reconstruction resolution at significantly better computational efficiency than the MART method (4-50 times faster). Both DRT-MART and MART approaches were applied to measure the velocity field of a low speed jet flow which revealed that for the same computational cost, the DRT-MART method accurately resolves the jet velocity field with improved precision, especially for the velocity component along the depth direction.
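The MART update at the heart of both methods is short to state: each voxel intensity is corrected multiplicatively by the ratio of recorded to projected pixel intensity, weighted by the ray/voxel overlap. The sketch below is a minimal generic MART without the light-field camera model; in DRT-MART, the inverse ray-tracing step would first zero the columns of the weight matrix for voxels no recorded ray intersects, shrinking the system before these updates run. All names here are our own.

```python
import numpy as np

def mart(W, I, n_iter=50, mu=1.0):
    """Multiplicative ART: voxel intensities E are updated row by row so
    that the projections W @ E approach the recorded pixel values I."""
    E = np.ones(W.shape[1])
    for _ in range(n_iter):
        for i in range(W.shape[0]):
            proj = W[i] @ E
            if proj > 0.0:
                # multiplicative correction, weighted by the ray/voxel overlap
                E *= (I[i] / proj) ** (mu * W[i])
    return E
```

On a small consistent system the iteration recovers the true voxel intensities, which is the standard smoke test for a MART implementation.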
Bisdas, Sotirios [Johann Wolfgang University Hospital, Department of Radiology, Frankfurt (Germany); Medical University of South Carolina, Department of Radiology, Charleston, SC (United States); Johann Wolfgang Goethe University Hospital, Department of Diagnostic and Interventional Radiology, Frankfurt (Germany); Rumboldt, Zoran; Deveikis, John; Spampinato, Maria Vittoria [Medical University of South Carolina, Department of Radiology, Charleston, SC (United States); Surlan, Katarina [Clinical Centre Ljubljana, Department of Clinical Radiology, Ljubljana (Slovenia); Koh, Tong San [Nanyang Technological University, School of Electrical and Electronic Engineering, Singapore (Singapore)
2008-10-15
Our purpose was to examine the feasibility and reproducibility of perfusion CT studies in the cervical spinal cord and the interchangeability of the values obtained by two post-processing methods. The perfusion CT studies of 40 patients with neck tumours were post-processed using two software packages (Software-1: deconvolution-based analysis with adiabatic tissue homogeneity approach and Software-2: maximum-slope-model with Patlak analysis). Eight patients were examined twice for assessing the reproducibility of the technique. Two neuroradiologists separately post-processed the images with two arterial input functions (AIFs): (1) the internal carotid artery (ICA) and (2) the vertebral artery (VA). Maps of blood flow (F) in ml/min/100 g, blood volume (V) in ml/100 g, mean transit time (MTT) in seconds (s) and permeability (PS) in ml/min/100 g were generated. The mean F, V, MTT and PS (Software-1) with VA-AIF and ICA-AIF were 8.93, 1.12, 16.3, 1.88 and 8.57, 1.19, 16.85 and 1.94, respectively. The reproducibility of the techniques was satisfactory, while the V and MTT values (in Software-1) and the F and V values (in Software-2) were dependent on the site of the AIF (p ≥ 0.03 and p = 0.02, respectively). The interobserver agreement was very good. The significant differences in measurements for a single patient (%) using Software-1/Software-2 were ±120%/110%, 90%/80%, 180% and 250%/130% for F, V, MTT and PS, respectively. Only F and PS values in the healthy tissue seemed to be interchangeable. Our results were in essential agreement with those derived by invasive measurements in animals. The cervical spine perfusion CT studies are feasible and reproducible. The present knowledge has to be validated with studies in spinal cord tumours in order to decide the usefulness of the perfusion CT in this field. (orig.)
Silva, F.; Maechling, P. J.; Goulet, C. A.; Somerville, P.; Jordan, T. H.
2014-12-01
The Southern California Earthquake Center (SCEC) Broadband Platform is a collaborative software development project involving geoscientists, earthquake engineers, graduate students, and the SCEC Community Modeling Environment. The SCEC Broadband Platform (BBP) is open-source scientific software that can generate broadband (0-100Hz) ground motions for earthquakes, integrating complex scientific modules that implement rupture generation, low and high-frequency seismogram synthesis, non-linear site effects calculation, and visualization into a software system that supports easy on-demand computation of seismograms. The Broadband Platform operates in two primary modes: validation simulations and scenario simulations. In validation mode, the Platform runs earthquake rupture and wave propagation modeling software to calculate seismograms for a well-observed historical earthquake. Then, the BBP calculates a number of goodness of fit measurements that quantify how well the model-based broadband seismograms match the observed seismograms for a certain event. Based on these results, the Platform can be used to tune and validate different numerical modeling techniques. In scenario mode, the Broadband Platform can run simulations for hypothetical (scenario) earthquakes. In this mode, users input an earthquake description, a list of station names and locations, and a 1D velocity model for their region of interest, and the Broadband Platform software then calculates ground motions for the specified stations. Working in close collaboration with scientists and research engineers, the SCEC software development group continues to add new capabilities to the Broadband Platform and to release new versions as open-source scientific software distributions that can be compiled and run on many Linux computer systems. Our latest release includes 5 simulation methods, 7 simulation regions covering California, Japan, and Eastern North America, the ability to compare simulation results
Frattare, L. M.; Salzer, J. J.
1996-05-01
We present an update on the KPNO International Spectroscopic Survey (KISS) project. KISS is a wide-field survey for extragalactic emission-line objects being carried out with the Burrell Schmidt at Kitt Peak. While we are utilizing the classical objective-prism technique to find strong-lined star-forming galaxies and AGNs, the use of CCD detectors and automated reduction software promise to make KISS a powerful tool for the study of activity in galaxies. We are currently completing our first survey strip (100 square degrees). The data consist of deep (to B = 20) objective-prism images, deep direct images in both B and V, and small-format photometric calibration images of each field. The KISS reduction package was designed to run under the IRAF image processing environment, and will eventually grow to be a complete IRAF package. Tasks added to the package over the past year include precise astrometry and photometry modules. The astrometry routines utilize the HST Guide Star Catalog to perform a full plate solution on the direct image of each Schmidt field, and then assign accurate equatorial coordinates to each object in the field. The photometry module performs aperture photometry on the direct images for all objects in the KISS database catalog, and provides routines to transfer the photometry calibration from the small-format images taken under photometric conditions to the large-format survey images. Extensive tests and modifications have also been carried out on the pre-existing software described by Herrero & Salzer (1995) in order to better fine-tune the reduction procedures and parameter settings. In addition to presenting a complete description of the new software, we describe the current status of the survey and present some preliminary characteristics of the sample. Other members of the KISS project include V. Lipovetsky & A. Kniazev (S.A.O.), T. Boroson (NOAO/USGP), T. Thuan (U. Virginia), J. Moody (BYU), Y. Izotov (Ukrainian Acad. Sci.), and J. Herrero
Classroom Computer Learning, 1990
1990-01-01
Reviewed are three computer software packages including "Martin Luther King, Jr.: Instant Replay of History,""Weeds to Trees," and "The New Print Shop, School Edition." Discussed are hardware requirements, costs, grade levels, availability, emphasis, strengths, and weaknesses. (CW)
Dwyer, Donna; And Others
1989-01-01
Reviewed are seven software packages for Apple and IBM computers. Included are: "Toxicology"; "Science Corner: Space Probe"; "Alcohol and Pregnancy"; "Science Tool Kit Plus"; Computer Investigations: Plant Growth"; "Climatrolls"; and "Animal Watch: Whales." (CW)
Knowlton, Robert G.; Arnold, Bill Walter; Mattie, Patrick D.
2007-03-01
Sandia National Laboratories (Sandia), a U.S. Department of Energy National Laboratory, has over 30 years of experience in the assessment of radioactive waste disposal and at the time of this publication is providing assistance internationally in a number of areas relevant to the safety assessment of radioactive waste disposal systems. In countries with small radioactive waste programs, international technology transfer program efforts are often hampered by small budgets, schedule constraints, and a lack of experienced personnel. In an effort to surmount these difficulties, Sandia has developed a system that utilizes a combination of commercially available software codes and existing legacy codes for probabilistic safety assessment modeling that facilitates the technology transfer and maximizes limited available funding. Numerous codes developed and endorsed by the United States Nuclear Regulatory Commission (NRC) and codes developed and maintained by the United States Department of Energy are generally available to foreign countries after addressing import/export control and copyright requirements. From a programmatic view, it is easier to utilize existing codes than to develop new codes. From an economic perspective, it is not possible for most countries with small radioactive waste disposal programs to maintain complex software that meets the rigors of both domestic regulatory requirements and international peer review. Therefore, revitalization of deterministic legacy codes, as well as an adaptation of contemporary deterministic codes, provides a credible and solid computational platform for constructing probabilistic safety assessment models. This document is a reference user's guide for the GoldSim/BLT-MS integrated modeling software package developed as part of a cooperative technology transfer project between Sandia National Laboratories and the Institute of Nuclear Energy Research (INER) in Taiwan for the preliminary assessment of several candidate low
Component-level test of molded freeform optics for LED beam shaping using experimental ray tracing
Gutierrez, Gustavo; Hilbig, David; Fleischmann, Friedrich; Henning, Thomas
2017-06-01
Due to the high demand for LED light sources, the need to modify their radiation pattern to meet specific application requirements has also increased. This is mostly achieved by using molded secondary optics, which are composed of a combination of several aspherical and freeform surfaces. Unfortunately, the manufacturers of these secondary optics only provide output information at system level, making it impossible to independently characterize the secondary optic in order to determine the sources of erroneous results. For this reason, it is necessary to perform a component-level verification leading to the validation of the correctness of the produced secondary optic independently of the light source. To understand why traditional inspection methods fail, it is necessary to take into account that not only do errors due to irregularities on the lens surface, such as pores, glass indentations or scratches, affect the performance of the lens, but differences in refractive index also appear after the compression during the fabrication process. These internal alterations are generally produced during the cooling stage, and their effect on the performance of the lens cannot be measured using tactile techniques. Additionally, the small size of the lens and the freeform characteristics of its surface introduce additional difficulties in performing its validation. In this work, the component-level test is done by obtaining the ray mapping function (RMF), which describes the deflection of the light beam as a function of the input angle. To obtain the RMF, firstly a collimated light source is held fixed and the lens is rotated. Thus, a virtual point source is created, and subsequently by using experimental ray tracing it is possible to determine the ray slopes, which are used to retrieve the RMF. Under the assumption that the optical system under analysis is lossless and considering the principle of energy conservation, it is possible under specific conditions to use this new
PONDEROSA-C/S: client–server based software package for automated protein 3D structure determination
Lee, Woonghee; Stark, Jaime L.; Markley, John L.
2014-01-01
Peak-picking Of Noe Data Enabled by Restriction Of Shift Assignments-Client Server (PONDEROSA-C/S) builds on the original PONDEROSA software (Lee et al. in Bioinformatics 27:1727–1728. doi:10.1093/bioinformatics/btr200, 2011) and includes improved features for structure calculation and refinement. PONDEROSA-C/S consists of three programs: Ponderosa Server, Ponderosa Client, and Ponderosa Analyzer. PONDEROSA-C/S takes as input the protein sequence, a list of assigned chemical shifts, and nucle...
Malone, R.; Wang, X.J.
1999-06-14
By writing both a custom Windows NT(TM) dynamic link library and generic companion server software, the intrinsic functions of MathSoft Mathcad(TM) have been extended with new capabilities which permit direct access to the control system databases of the Brookhaven National Laboratory Accelerator Test Facility. Under this scheme, a Mathcad worksheet executing on a personal computer becomes a client which can both import and export data to a control system server via a network stream socket connection. The result is an alternative, mathematically oriented view of controlling the accelerator interactively.
Lopez-Rendon, X. [KU Leuven, Department of Imaging and Pathology, Division of Medical Physics and Quality Assessment, Herestraat 49, box 7003, Leuven (Belgium); Bosmans, H.; Zanca, F. [KU Leuven, Department of Imaging and Pathology, Division of Medical Physics and Quality Assessment, Herestraat 49, box 7003, Leuven (Belgium); University Hospitals Leuven, Department of Radiology, Leuven (Belgium); Oyen, R. [University Hospitals Leuven, Department of Radiology, Leuven (Belgium)
2015-07-15
To evaluate the effect of including tube current modulation (TCM) versus using the average mAs in estimating organ and effective dose (E) using commercial software. Forty adult patients (24 females, 16 males) with normal BMI underwent chest/abdomen computed tomography (CT) performed with TCM at 120 kVp, reference mAs of 110 (chest) and 200 (abdomen). Doses to fully irradiated organs (breasts, lungs, stomach, liver and ovaries) and E were calculated using two versions of a dosimetry software: v.2.0, which uses the average mAs, and v.2.2, which accounts for TCM by implementing a gender-specific mAs profile. Student's t-test was used to assess statistically significant differences between organ doses calculated with the two versions. A statistically significant difference (p < 0.001) was found for E on chest and abdomen CT, with E being lower by 4.2 % when TCM is considered. Similarly, organ doses were also significantly lower (p < 0.001): 13.7 % for breasts, 7.3 % for lungs, 9.1 % for the liver and 8.5 % for the stomach. Only the dose to the ovaries was higher with TCM (11.5 %). When TCM is used, for the stylized phantom, the doses to lungs, breasts, stomach and liver decreased while the dose to the ovaries increased. (orig.)
The Adaptive Buffered Force QM/MM method in the CP2K and AMBER software packages
Mones, Letif; Götz, Andreas W; Laino, Teodoro; Walker, Ross C; Leimkuhler, Ben; Csányi, Gábor; Bernstein, Noam
2014-01-01
The implementation and validation of the adaptive buffered force QM/MM method in two popular packages, CP2K and AMBER, are presented. The implementations build on the existing QM/MM functionality in each code, extending it to allow for redefinition of the QM and MM regions during the simulation and reducing QM-MM interface errors by discarding forces near the boundary according to the buffered force-mixing approach. New adaptive thermostats, needed by force-mixing methods, are also implemented. Different variants of the method are benchmarked by simulating the structure of bulk water, water autoprotolysis in the presence of zinc and dimethyl-phosphate hydrolysis using various semiempirical Hamiltonians and density functional theory as the QM model. It is shown that with suitable parameters, based on force convergence tests, the adaptive buffered-force QM/MM scheme can provide an accurate approximation of the structure in the dynamical QM region matching the corresponding fully QM simulations, as well as reprod...
Kimihiro Noguchi
2012-09-01
Longitudinal data from factorial experiments frequently arise in various fields of study, ranging from medicine and biology to public policy and sociology. In most practical situations, the distribution of observed data is unknown and there may exist a number of atypical measurements and outliers. Hence, use of parametric and semi-parametric procedures that impose restrictive distributional assumptions on observed longitudinal samples becomes questionable. This, in turn, has led to a substantial demand for statistical procedures that enable us to accurately and reliably analyze longitudinal measurements in factorial experiments with minimal conditions on available data, and robust nonparametric methodology offering such a possibility becomes of particular practical importance. In this article, we introduce a new R package nparLD which provides statisticians and researchers from other disciplines easy and user-friendly access to the most up-to-date robust rank-based methods for the analysis of longitudinal data in factorial settings. We illustrate the implemented procedures by case studies from dentistry, biology, and medicine.
McStas 1.7 - a new version of the flexible Monte Carlo neutron scattering package
Willendrup, P.; Farhi, E.; Lefmann, K.
2004-01-01
Current neutron instrumentation is both complex and expensive, and accurate simulation has become essential both for building new instruments and for using them effectively. The McStas neutron ray-trace simulation package is a versatile tool for producing such simulations, developed in collaborat...
Meier Harald
2006-05-01
Background: Availability of high-resolution RNA crystal structures for the 30S and 50S ribosomal subunits and the subsequent validation of comparative secondary structure models have prompted biologists to use the three-dimensional structure of ribosomal RNA (rRNA) for evaluating sequence alignments of rRNA genes. Furthermore, the secondary and tertiary structural features of rRNA are highly useful and successfully employed in designing rRNA-targeted oligonucleotide probes intended for in situ hybridization experiments. RNA3D, a program to combine sequence alignment information with the three-dimensional structure of rRNA, was developed. Integration into the ARB software package, which is used extensively by the scientific community for phylogenetic analysis and molecular probe design, has substantially extended the functionality of the ARB software suite with a 3D environment. Results: The three-dimensional structure of rRNA is visualized in an OpenGL 3D environment with the ability to change the display and overlay information onto the molecule dynamically. Phylogenetic information derived from the multiple sequence alignments can be overlaid onto the molecule structure in real time. Superimposition of both statistical and non-statistical sequence-associated information onto the rRNA 3D structure can be done using a customizable color scheme, which is also applied to a textual sequence alignment for reference. Oligonucleotide probes designed by ARB probe design tools can be mapped onto the 3D structure along with the probe accessibility models for evaluation with respect to the secondary and tertiary structural conformations of rRNA. Conclusion: Visualization of the three-dimensional structure of rRNA in an intuitive display provides biologists with greater possibilities to carry out structure-based phylogenetic analysis. Coupled with secondary structure models of rRNA, the RNA3D program aids in validating the sequence alignments of rRNA genes and evaluating
Genser, Krzysztof [Fermilab; Hatcher, Robert [Fermilab; Perdue, Gabriel [Fermilab; Wenzel, Hans [Fermilab; Yarba, Julia [Fermilab; Kelsey, Michael [SLAC; Wright, Dennis H. [SLAC
2016-11-10
The Geant4 toolkit is used to model interactions between particles and matter. Geant4 employs a set of validated physics models that span a wide range of interaction energies. These models are tuned to cover a large variety of possible applications. This raises the critical question of what uncertainties are associated with the Geant4 physics model, or group of models, involved in a simulation project. To address the challenge, we have designed and implemented a comprehensive, modular, user-friendly software toolkit that allows the variation of one or more parameters of one or more Geant4 physics models involved in simulation studies. It also enables analysis of multiple variants of the resulting physics observables of interest in order to estimate the uncertainties associated with the simulation model choices. Key functionalities of the toolkit are presented in this paper and are illustrated with selected results.
Chorus wave-normal statistics in the Earth's radiation belts from ray tracing technique
H. Breuillard
2012-08-01
Discrete ELF/VLF (Extremely Low Frequency/Very Low Frequency) chorus emissions are among the most intense electromagnetic plasma waves observed in the radiation belts and in the outer terrestrial magnetosphere. These waves play a crucial role in the dynamics of the radiation belts, and are responsible for the loss and the acceleration of energetic electrons. The objective of our study is to reconstruct the realistic distribution of chorus wave-normals in the radiation belts for all magnetic latitudes. To achieve this aim, data from the electric and magnetic field measurements onboard the Cluster satellite are used to determine the wave-vector distribution of the chorus signal around the equator region. Then the propagation of such a wave packet is modeled using a three-dimensional ray tracing technique, which employs K. Rönnmark's WHAMP to solve the hot plasma dispersion relation along the wave packet trajectory. The observed chorus wave distributions close to the wave source are first fitted to form the initial conditions, which then propagate numerically through the inner magnetosphere in the frame of the WKB approximation. The ray tracing technique allows one to reconstruct wave packet properties (electric and magnetic fields, width of the wave packet in k-space, etc.) along the propagation path. The calculations show the spatial spreading of the signal energy due to propagation in the inhomogeneous and anisotropic magnetized plasma. Comparison of the wave-normal distribution obtained from the ray tracing technique with Cluster observations up to 40° latitude demonstrates the reliability of our approach and the applied numerical schemes.
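Ray tracing in the WKB approximation integrates Hamilton's ray equations dr/dt = +∂ω/∂k and dk/dt = -∂ω/∂r along the packet trajectory. The skeleton below is generic and deliberately simple: real chorus tracing needs the hot plasma dispersion relation solved by WHAMP, whereas here the dispersion relation is an arbitrary callable and the toy test case ω = |k| (a homogeneous isotropic medium) is only a placeholder assumption.

```python
import numpy as np

def trace_ray(r0, k0, omega_of, n_steps=2000, dt=1e-3, eps=1e-6):
    """Euler integration of the WKB ray equations
    dr/dt = +dω/dk, dk/dt = -dω/dr, with gradients of the dispersion
    relation omega_of(r, k) taken by central finite differences."""
    r, k = np.asarray(r0, float), np.asarray(k0, float)
    path = [r.copy()]
    for _ in range(n_steps):
        dwdk = np.array([(omega_of(r, k + e) - omega_of(r, k - e)) / (2 * eps)
                         for e in np.eye(len(k)) * eps])
        dwdr = np.array([(omega_of(r + e, k) - omega_of(r - e, k)) / (2 * eps)
                         for e in np.eye(len(r)) * eps])
        r = r + dt * dwdk   # group velocity advances the packet position
        k = k - dt * dwdr   # medium gradients refract the wave vector
        path.append(r.copy())
    return np.array(path)
```

With ω = |k| the group speed is 1 and the medium is homogeneous, so the traced ray must be a straight line; an inhomogeneous ω(r, k) supplied in its place would bend the ray, which is the effect the study exploits.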
GlycReSoft: A Software Package for Automated Recognition of Glycans from LC/MS Data
Maxwell, Evan; Tan, Yan; Tan, Yuxiang; Hu, Han; Benson, Gary; Aizikov, Konstantin; Conley, Shannon; Staples, Gregory O.; Slysz, Gordon W.; Smith, Richard D.; Zaia, Joseph
2012-09-26
Glycosylation modifies the physicochemical properties and protein binding functions of glycoconjugates. These modifications are biosynthesized in the endoplasmic reticulum and Golgi apparatus by a series of enzymatic transformations that are under complex control. As a result, mature glycans on a given site are heterogeneous mixtures of glycoforms. This gives rise to a spectrum of adhesive properties that strongly influences interactions with binding partners and resultant biological effects. In order to understand the roles glycosylation plays in normal and disease processes, efficient structural analysis tools are necessary. In the field of glycomics, liquid chromatography/mass spectrometry (LC/MS) is used to profile the glycans present in a given sample. This technology enables comparison of glycan compositions and abundances among different biological samples, e.g., normal versus disease, normal versus mutant, etc. Manual analysis of the glycan profiling LC/MS data is extremely time-consuming, and efficient software tools are needed to eliminate this bottleneck. In this work, we have developed a tool to computationally model LC/MS data to enable efficient profiling of glycans. Using LC/MS data deconvoluted by Decon2LS/DeconTools, we built a list of unique neutral masses corresponding to candidate glycan compositions summarized over their various charge states, adducts and range of elution times. Our work aims to provide confident identification of true compounds in complex data sets that are not amenable to manual interpretation. This capability is an essential part of glycomics work flows. We demonstrate this tool, GlycReSoft, using an LC/MS dataset on tissue-derived heparan sulfate oligosaccharides. The software, code and a test data set are publicly archived under an open source license.
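The charge-state summarization step can be sketched as below, assuming simple deprotonation ([M - zH]z-) as the only charge carrier; GlycReSoft's actual grouping over adducts and elution times is richer than this.

```python
PROTON = 1.007276  # proton mass, Da

def neutral_mass(mz, z, mode="negative"):
    """Neutral monoisotopic mass from an observed m/z and charge state z,
    assuming (de)protonation is the only charge carrier."""
    if mode == "negative":          # [M - zH]^z-
        return z * mz + z * PROTON
    return z * mz - z * PROTON      # [M + zH]^z+

def group_by_mass(observations, ppm_tol=10.0):
    """Collapse (m/z, z) observations of the same analyte seen in several
    charge states into a list of unique neutral masses."""
    masses = sorted(neutral_mass(mz, z) for mz, z in observations)
    groups = []
    for m in masses:
        if groups and abs(m - groups[-1][-1]) / m * 1e6 <= ppm_tol:
            groups[-1].append(m)
        else:
            groups.append([m])
    return [sum(g) / len(g) for g in groups]

# The same 1000 Da analyte observed at z = 1 and z = 2, plus one other mass.
observations = [(998.992724, 1), (498.992724, 2), (758.230000, 1)]
unique = group_by_mass(observations)
```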
Hellrung, Lydia; Hollmann, Maurice; Zscheyge, Oliver; Schlumm, Torsten; Kalberlah, Christian; Roggenhofer, Elisabeth; Okon-Singer, Hadas; Villringer, Arno; Horstmann, Annette
2015-01-01
In this work we present a new open source software package offering a unified framework for the real-time adaptation of fMRI stimulation procedures. The software provides a straightforward setup and a highly flexible approach to adapting fMRI paradigms while the experiment is running. The general framework comprises the inclusion of parameters of the subject's compliance, such as directing gaze to visually presented stimuli, and physiological fluctuations, like blood pressure or pulse. Additionally, this approach yields possibilities to investigate complex scientific questions, for example the influence of EEG rhythms or of the fMRI signal itself. To prove the concept of this approach, we used our software in a usability example for an fMRI experiment where the presentation of emotional pictures was dependent on the subject's gaze position. Gaze position can have a significant impact on the results. So far, if this is taken into account during fMRI data analysis, it is commonly done by the post-hoc removal of erroneous trials. Here, we propose an a priori adaptation of the paradigm during the experiment's runtime. Our fMRI findings clearly show the benefits of an adapted paradigm in terms of statistical power and higher effect sizes in emotion-related brain regions. This can be of special interest for all experiments with low statistical power due to a limited number of subjects, a limited amount of time, costs or available data to analyze, as is the case with real-time fMRI. PMID:25837719
Kohei Arai
2013-01-01
A Monte Carlo Ray Tracing (MCRT) based sensitivity analysis of Top of the Atmosphere (TOA) radiance to the geophysical parameters of the atmosphere and the ocean in the visible to near-infrared wavelength regions is conducted. The results confirm that the influence of the atmosphere is greater than that of the ocean. Scattering and absorption by aerosol particles and molecules in the atmosphere are the major contributions, followed by water vapor and ozone, while scattering by suspended solids is the dominant contribution among the ocean parameters.
Cervera, M. A.; Harris, T. J.
2014-01-01
The Defence Science and Technology Organisation (DSTO) has initiated an experimental program, the Spatial Ionospheric Correlation Experiment, utilizing state-of-the-art DSTO-designed high frequency digital receivers. This program seeks to understand ionospheric disturbances over a range of spatial scales. We employ a 3-D magnetoionic Hamiltonian ray tracing engine, developed by DSTO, to (1) model the various disturbance features observed on both the O and X polarization modes in our QVI data and (2) understand how they are produced. The ionospheric disturbances which produce the observed features were modeled by perturbing the ionosphere with atmospheric gravity waves.
He, Wenjun; Fu, Yuegang; Zheng, Yang; Zhang, Lei; Wang, Jiake; Liu, Zhiying; Zheng, Jianping
2013-07-01
The output polarization states of corner cubes (for both uncoated and metal-coated surfaces) with an input beam of arbitrary polarization state and arbitrary tilt angle to the cube have been analyzed using the three-dimensional polarization ray-tracing matrix method. The diattenuation and retardance of the corner-cube retroreflector (CCR) for all six different ray paths are calculated, and their dependence on the tilt angle and the tilt orientation angle is shown. When the tilt angle is large, a hollow metal-coated CCR is more appropriate than a solid metal-coated CCR for cases in which the polarization state of the output beam must be controlled.
Pujol Nadal, Ramon; Martínez Moll, Víctor
2013-10-20
Fixed-mirror solar concentrators (FMSCs) use a static reflector and a moving receiver. They are easily installable on building roofs. However, for high-concentration factors, several flat mirrors would be needed. If curved mirrors are used instead, high-concentration levels can be achieved, and such a solar concentrator is called a curved-slats fixed-mirror solar concentrator (CSFMSC), on which little information is available. Herein, a methodology is proposed to characterize the CSFMSC using 3D ray-tracing tools. The CSFMSC shows better optical characteristics than the FMSC, as it needs fewer reflector segments for achieving the same concentration and optical efficiency.
Jefferies, K.
1994-01-01
OFFSET is a ray tracing computer code for optical analysis of a solar collector. The code models the flux distributions within the receiver cavity produced by reflections from the solar collector. It was developed to model the offset solar collector of the solar dynamic electric power system being developed for Space Station Freedom. OFFSET has been used to improve the understanding of the collector-receiver interface and to guide the efforts of NASA contractors also researching the optical components of the power system. The collector for Space Station Freedom consists of 19 hexagonal panels, each containing 24 triangular, reflective facets. Current research is geared toward optimizing the flux distribution inside the receiver via changes in collector design and receiver orientation. OFFSET offers many options for experimenting with the design of the system. The offset parabolic collector model configuration is determined by an input file of facet corner coordinates. The user may choose other configurations by changing this file, but simulating collectors that have other than 19 groups of 24 triangular facets would require modification of the FORTRAN code. Each of the roughly 500 facets in the assembled collector may be independently aimed to smooth out, or tailor, the flux distribution on the receiver's wall. OFFSET simulates the effects of design changes such as in receiver aperture location, tilt angle, and collector facet contour. Unique features of OFFSET include: (1) equations developed to pseudo-randomly select ray-originating sources on the Sun which appear evenly distributed and include solar limb darkening; (2) a cone-optics technique used to add surface specular error to the ray-originating sources to determine the apparent ray sources of the reflected Sun; (3) a choice of facet reflective surface contour -- spherical, ideal parabolic, or toroidal; and (4) Gaussian distributions of radial and tangential components of surface slope error added to the surface normals at
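Feature (1), pseudo-random ray origins that include solar limb darkening, can be sketched by rejection sampling on the solar disk. The linear limb-darkening law and the coefficient u ≈ 0.56 are broadband visible-light assumptions of this sketch, not OFFSET's actual model.

```python
import math
import random

def sample_sun_ray_origin(rng, sun_angular_radius=4.65e-3, u=0.56):
    """Pick a ray direction within the solar disk, weighted by a linear
    limb-darkening law I(r) = 1 - u*(1 - sqrt(1 - r^2)), r in [0, 1].
    Returns (theta_x, theta_y) angular offsets in radians."""
    while True:
        r = math.sqrt(rng.random())            # uniform point on unit disk
        phi = 2.0 * math.pi * rng.random()
        intensity = 1.0 - u * (1.0 - math.sqrt(max(0.0, 1.0 - r * r)))
        if rng.random() <= intensity:          # rejection -> limb darkening
            a = r * sun_angular_radius
            return a * math.cos(phi), a * math.sin(phi)

rng = random.Random(42)
samples = [sample_sun_ray_origin(rng) for _ in range(20000)]
```

Because the center of the disk is brighter, the mean sampled radius falls below the 2R/3 of a uniformly sampled disk.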
de Hoop, Bartjan; Gietema, Hester; van Ginneken, Bram; Zanen, Pieter; Groenewegen, Gerard; Prokop, Mathias
2009-01-01
We compared interexamination variability of CT lung nodule volumetry with six currently available semi-automated software packages to determine the minimum change needed to detect the growth of solid lung nodules. We had ethics committee approval. To simulate a follow-up examination with zero growth
Teaching Materials “Surface Area of Geometric Figures,” Created Using the Software Package GeoGebra
Slaviša RADOVIĆ
2013-01-01
Social development, the progress of technology, and changing economic forces certainly affect the development of the current educational system. One of the main problems of today's school system is how to maintain focus, concentration, and interest among students with regard to the learning that takes place during classes. An important feature of modern teaching is multimedia. However, the use of multimedia requires a certain transformation of the teaching process. Given the fact that the focus of the teaching process has been shifting away from the curriculum and the teacher, and towards the student, multimedia will undoubtedly contribute in a significant way to the modernization of traditional teaching. It is not unrealistic to think that technology will become a regular part of the daily routines of teaching. In this work, an innovative approach to teaching mathematics in elementary and high schools through the use of the software known as GeoGebra is introduced. This approach is demonstrated using the example of ‘surface area’ as a mathematical concept, wherein the goal was to increase interactivity between teachers and students, and to improve the quality of teaching.
Daveler, S.A.; Wolery, T.J.
1992-12-17
EQPT is a data file preprocessor for the EQ3/6 software package. EQ3/6 currently contains five primary data files, called data0 files. These files comprise alternative data sets. These data files contain both standard state and activity coefficient-related data. Three (com, sup, and nea) support the use of the Davies or B-dot equations for the activity coefficients; the other two (hmw and pit) support the use of Pitzer's (1973, 1975) equations. The temperature range of the thermodynamic data on these data files varies from 25°C only to 0-300°C. The principal modeling codes in EQ3/6, EQ3NR and EQ6, do not read a data0 file, however. Instead, these codes read an unformatted equivalent called a data1 file. EQPT writes a data1 file, using the corresponding data0 file as input. In processing a data0 file, EQPT checks the data for common errors, such as unbalanced reactions. It also conducts two kinds of data transformation. Interpolating polynomials are fit to data which are input on temperature grids. The coefficients of these polynomials are then written on the data1 file in place of the original temperature grids. A second transformation pertains only to data files tied to Pitzer's equations. The commonly reported observable Pitzer coefficient parameters are mapped into a set of primitive parameters by means of a set of conventional relations. These primitive form parameters are then written onto the data1 file in place of their observable counterparts. Usage of the primitive form parameters makes it easier to evaluate Pitzer's equations in EQ3NR and EQ6. EQPT and the other codes in the EQ3/6 package are written in FORTRAN 77 and have been developed to run under the UNIX operating system on computers ranging from workstations to supercomputers.
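The polynomial transformation can be illustrated with a toy temperature grid; the grid and property values below are made up for the sketch, not taken from a data0 file.

```python
import numpy as np

# Illustrative temperature grid (deg C) and log K values on that grid; a
# real data0 file carries many such grids.
t_grid = np.array([0.0, 25.0, 60.0, 100.0, 150.0, 200.0, 250.0, 300.0])
logk_grid = np.array([-14.94, -14.00, -13.02, -12.26, -11.64, -11.28,
                      -11.11, -11.30])

# Fit one interpolating polynomial to the grid ...
coeffs = np.polyfit(t_grid, logk_grid, deg=5)

# ... so the modeling codes can evaluate the property at any temperature
# from the stored coefficients instead of interpolating the table.
def logk(temperature_c):
    return np.polyval(coeffs, temperature_c)
```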
Wolery, T.J.
1979-02-01
The newly developed EQ3/6 software package computes equilibrium models of aqueous geochemical systems. The package contains two principal programs: EQ3 performs distribution-of-species calculations for natural water compositions; EQ6 uses the results of EQ3 to predict the consequences of heating and cooling aqueous solutions and of irreversible reaction in rock-water systems. The programs are valuable for studying such phenomena as the formation of ore bodies, scaling and plugging in geothermal development, and the long-term disposal of nuclear waste. EQ3 and EQ6 are compared with such well-known geochemical codes as SOLMNEQ, WATEQ, REDEQL, MINEQL, and PATHI. The data base allows calculations in the temperature interval 0 to 350°C, at either 1 atm-steam saturation pressures or a constant 500 bars. The activity coefficient approximations for aqueous solutes limit modeling to solutions of ionic strength less than about one molal. The mathematical derivations and numerical techniques used in EQ6 are presented in detail. The program uses the Newton-Raphson method to solve the governing equations of chemical equilibrium for a system of specified elemental composition at fixed temperature and pressure. Convergence is aided by optimizing starting estimates and by under-relaxation techniques. The minerals present in the stable phase assemblage are found by several empirical methods. Reaction path models may be generated by using this approach in conjunction with finite differences. This method is analogous to applying high-order predictor-corrector methods to integrate a corresponding set of ordinary differential equations, but avoids propagation of error (drift). 8 figures, 9 tables.
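A minimal sketch of Newton-Raphson with under-relaxation, applied to a one-reaction toy equilibrium rather than EQ6's full multicomponent system; the example numbers are illustrative.

```python
def equilibrium_h(K, C, tol=1e-12, max_iter=100):
    """Newton-Raphson with under-relaxation for a toy system: find the
    dissociated amount x of a weak acid, x^2/(C - x) = K, keeping every
    iterate physical (0 < x < C). Only illustrates the convergence aids
    named above, not EQ6's actual solver."""
    x = 0.5 * min(C, K)                      # crude optimized start
    for _ in range(max_iter):
        f = x * x / (C - x) - K
        df = (2.0 * x * (C - x) + x * x) / (C - x) ** 2
        step = f / df
        while not (0.0 < x - step < C):      # under-relax: halve the step
            step *= 0.5                      # until the iterate is physical
        x -= step
        if abs(step) < tol * max(x, 1e-30):
            return x
    raise RuntimeError("failed to converge")

x = equilibrium_h(K=1.8e-5, C=0.1)           # acetic-acid-like numbers
```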
Gliss, Jonas; Stebel, Kerstin; Kylling, Arve; Solvejg Dinger, Anna; Sihler, Holger; Sudbø, Aasmund
2017-04-01
UV SO2 cameras have become a common method for monitoring SO2 emission rates from volcanoes. Scattered solar UV radiation is measured in two wavelength windows, typically around 310 nm and 330 nm (distinct / weak SO2 absorption), using interference filters. The data analysis comprises the retrieval of plume background intensities (to calculate plume optical densities), the camera calibration (to convert optical densities into SO2 column densities), the retrieval of gas velocities within the plume, and the retrieval of plume distances. SO2 emission rates are then typically retrieved along a projected plume cross section, for instance a straight line perpendicular to the plume propagation direction. Today, for most of the required analysis steps, several alternatives exist due to ongoing developments and improvements related to the measurement technique. We present piscope, a cross-platform, open source software toolbox for the analysis of UV SO2 camera data. The code is written in the Python programming language and emerged from the idea of a common analysis platform incorporating a selection of the most prevalent methods found in the literature. piscope includes several routines for plume background retrievals and routines for cell- and DOAS-based camera calibration, including two individual methods to identify the DOAS field of view (shape and position) within the camera images. Gas velocities can be retrieved either based on an optical flow analysis or using signal cross correlation. A correction for signal dilution (due to atmospheric scattering) can be performed based on topographic features in the images. The latter requires distance retrievals to the topographic features used for the correction. These distances can be retrieved automatically on a per-pixel basis using intersections of individual pixel viewing directions with the local topography. The main features of piscope are presented based on a dataset recorded at Mt. Etna, Italy, in September 2015.
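The cross-correlation velocity retrieval can be sketched as follows, on synthetic signals; piscope's own implementation differs in detail.

```python
import numpy as np

def plume_speed_cross_corr(series_a, series_b, line_distance_m, dt_s):
    """Plume speed from two column-density time series extracted at image
    lines a known distance apart: the lag maximizing their cross-correlation
    is the travel time between the lines."""
    a = (series_a - series_a.mean()) / series_a.std()
    b = (series_b - series_b.mean()) / series_b.std()
    corr = np.correlate(b, a, mode="full")
    lag = int(corr.argmax()) - (len(a) - 1)   # samples by which b trails a
    if lag <= 0:
        raise ValueError("no positive lag; check line ordering")
    return line_distance_m / (lag * dt_s)

# Synthetic check: a Gaussian puff crosses line A, then line B 5 frames later.
t = np.arange(200)
puff = np.exp(-0.5 * ((t - 80) / 6.0) ** 2)
speed = plume_speed_cross_corr(puff, np.roll(puff, 5),
                               line_distance_m=50.0, dt_s=0.5)
```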
Sardina, V.
2012-12-01
The US Tsunami Warning Centers (TWCs) have traditionally generated their tsunami message products primarily as blocks of text then tagged with headers that identify them on each particular communications' (comms) circuit. Each warning center has a primary area of responsibility (AOR) within which it has an authoritative role regarding parameters such as earthquake location and magnitude. This means that when a major tsunamigenic event occurs, the other warning centers need to quickly access the earthquake parameters issued by the authoritative warning center before issuing their message products intended for customers in their own AOR. Thus, within the operational context of the TWCs, the scientists on duty have an operational need to access the information contained in the message products issued by other warning centers as quickly as possible. As a solution to this operational problem, we designed and implemented a C++ software package that allows scanning for and parsing the entire suite of tsunami message products issued by the Pacific Tsunami Warning Center (PTWC), the West Coast and Alaska Tsunami Warning Center (WCATWC), and the Japan Meteorological Agency (JMA). The scanning and parsing classes composing the resulting C++ software package allow parsing both non-official message products (observatory messages) routinely issued by the TWCs, and all official tsunami message products such as tsunami advisories, watches, and warnings. This software package currently allows scientists on duty at the PTWC to automatically retrieve the parameters contained in tsunami messages issued by WCATWC, JMA, or PTWC itself. Extension of the capabilities of the classes composing the software package would make it possible to generate XML and CAP compliant versions of the TWCs' message products until new messaging software natively adds these capabilities. Customers who receive the TWCs' tsunami message products could also use the package to automatically retrieve information from
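The scan-and-parse idea can be sketched as below. The bulletin layout and field names are invented for illustration; the real TWC formats differ, and the package itself is written in C++, not Python.

```python
import re

# Illustrative bulletin snippet; the actual TWC formats differ.
SAMPLE = """\
TSUNAMI WARNING CENTER BULLETIN
ORIGIN TIME - 0612Z 27 FEB 2010
COORDINATES - 36.1 SOUTH 72.6 WEST
MAGNITUDE - 8.8
"""

PATTERNS = {
    "origin_time": re.compile(r"ORIGIN TIME\s*-\s*(\S+ \d{1,2} \w{3} \d{4})"),
    "latitude": re.compile(r"COORDINATES\s*-\s*([\d.]+)\s+(NORTH|SOUTH)"),
    "longitude": re.compile(r"(?:NORTH|SOUTH)\s+([\d.]+)\s+(EAST|WEST)"),
    "magnitude": re.compile(r"MAGNITUDE\s*-\s*([\d.]+)"),
}

def parse_bulletin(text):
    """Scan a text bulletin and extract earthquake parameters as numbers,
    with signs taken from the hemisphere words."""
    out = {}
    m = PATTERNS["origin_time"].search(text)
    out["origin_time"] = m.group(1) if m else None
    m = PATTERNS["latitude"].search(text)
    out["lat"] = float(m.group(1)) * (-1 if m.group(2) == "SOUTH" else 1)
    m = PATTERNS["longitude"].search(text)
    out["lon"] = float(m.group(1)) * (-1 if m.group(2) == "WEST" else 1)
    out["magnitude"] = float(PATTERNS["magnitude"].search(text).group(1))
    return out

params = parse_bulletin(SAMPLE)
```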
Cui Xiaoqi
2010-08-01
Background: Identification of transcription factors (TFs) involved in a biological process is the first step towards a better understanding of the underlying regulatory mechanisms. However, due to the involvement of a large number of genes and complicated interactions in a gene regulatory network (GRN), identification of the TFs involved in a biological process remains very challenging. In reality, the recognition of TFs for a given biological process can be further complicated by the fact that most eukaryotic genomes encode thousands of TFs, which are organized in gene families of various sizes and in many cases with poor sequence conservation except for small conserved domains. This poses a significant challenge for identifying the exact TFs involved or ranking the importance of a set of TFs to a process of interest. Therefore, new methods for recognizing novel TFs are desperately needed. Although a plethora of methods have been developed to infer regulatory genes using microarray data, it is still rare to find methods that use an existing knowledge base, in particular the validated genes known to be involved in a process, to bait/guide discovery of novel TFs. Such methods can replace the sometimes-arbitrary process of selecting candidate genes for experimental validation and significantly advance our knowledge and understanding of the regulation of a process. Results: We developed an automated software package called TF-finder for recognizing TFs involved in a biological process using microarray data and an existing knowledge base. TF-finder contains two components, adaptive sparse canonical correlation analysis (ASCCA) and an enrichment test, for TF recognition. ASCCA uses positive target genes to bait TFs from gene expression data, while the enrichment test examines the presence of positive TFs in the outcomes from ASCCA. Using microarray data from salt and water stress experiments, we showed TF-finder is very efficient in recognizing
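The enrichment-test step can be sketched with a one-sided hypergeometric test; the pool sizes below are toy numbers, and TF-finder's exact statistic may differ in detail.

```python
from math import comb

def enrichment_pvalue(M, K, n, k):
    """One-sided hypergeometric test: probability of seeing at least k known
    positive TFs among n candidates drawn from a pool of M TFs containing K
    positives."""
    total = comb(M, n)
    upper = min(K, n)
    return sum(comb(K, i) * comb(M - K, n - i)
               for i in range(k, upper + 1)) / total

# Toy numbers (assumed): 1500 TFs in the genome, 30 known stress-response
# TFs, and an ASCCA candidate list of 20 containing 5 known positives.
p = enrichment_pvalue(M=1500, K=30, n=20, k=5)
```

A small p indicates the candidate list is enriched in validated TFs well beyond chance, supporting the novel TFs it contains.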
Comparison of a 3-D GPU-Assisted Maxwell Code and Ray Tracing for Reflectometry on ITER
Gady, Sarah; Kubota, Shigeyuki; Johnson, Irena
2015-11-01
Electromagnetic wave propagation and scattering in magnetized plasmas are important diagnostics for high temperature plasmas. 1-D and 2-D full-wave codes are standard tools for measurements of the electron density profile and fluctuations; however, ray tracing results have shown that beam propagation in tokamak plasmas is inherently a 3-D problem. The GPU-Assisted Maxwell Code utilizes the FDTD (Finite-Difference Time-Domain) method for solving the Maxwell equations with the cold plasma approximation in a 3-D geometry. Parallel processing with GPGPU (General-Purpose computing on Graphics Processing Units) is used to accelerate the computation. Previously, we reported on initial comparisons of the code results to 1-D numerical and analytical solutions, where the size of the computational grid was limited by the on-board memory of the GPU. In the current study, this limitation is overcome by using domain decomposition and an additional GPU. As a practical application, this code is used to study the current design of the ITER Low Field Side Reflectometer (LFSR) for Equatorial Port Plug 11 (EPP11). A detailed examination of Gaussian beam propagation in the ITER edge plasma will be presented, as well as comparisons with ray tracing. This work was made possible by funding from the Department of Energy for the Summer Undergraduate Laboratory Internship (SULI) program. This work is supported by the US DOE Contract No. DE-AC02-09CH11466 and DE-FG02-99-ER54527.
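The FDTD core reduces, in one dimension and in vacuum, to the classic Yee update shown below; the cold-plasma current terms and 3-D geometry of the actual code are omitted from this sketch.

```python
import numpy as np

def fdtd_1d(nx=400, nt=300, src=100):
    """Minimal 1-D vacuum FDTD (Yee) update in normalized units at the
    'magic' time step c*dt = dx, with fixed-zero (PEC-like) end cells."""
    ez = np.zeros(nx)
    hy = np.zeros(nx - 1)
    for n in range(nt):
        hy += ez[1:] - ez[:-1]                # H update (c*dt/dx = 1)
        ez[1:-1] += hy[1:] - hy[:-1]          # E update; ez[0], ez[-1] stay 0
        ez[src] += np.exp(-0.5 * ((n - 30) / 8.0) ** 2)  # soft Gaussian source
    return ez

ez = fdtd_1d()  # right-going pulse ends up near cell 100 + (300 - 30) = 370
```

At the magic time step the pulse advances exactly one cell per step, which makes the scheme easy to validate before plasma terms are added.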
Kolski, Jeffrey S. [Los Alamos National Laboratory; Barlow, David B. [Los Alamos National Laboratory; Macek, Robert J. [Los Alamos National Laboratory; McCrady, Rodney C. [Los Alamos National Laboratory
2011-01-01
Particle ray tracing through simulated 3D magnetic fields was executed to investigate the effective quadrupole strength of the edge focusing of the rectangular bending magnets in the Los Alamos Proton Storage Ring (PSR). The particle rays receive a kick in the edge field of the rectangular dipole. A focal length may be calculated from the particle tracking and related to the fringe field integral (FINT) model parameter. This tech note introduces the baseline lattice model of the PSR and motivates the need for an improvement in the baseline model's vertical tune prediction, which differs from measurement by 0.05. An improved model of the PSR is created by modifying the fringe field integral parameter to the values suggested by the ray tracing investigation. This improved model is then verified against measurement at the nominal PSR operating set point and at set points far from the nominal operating conditions. Lastly, Linear Optics from Closed Orbits (LOCO) is employed in an orbit response matrix method for model improvement to verify the quadrupole strengths of the improved model.
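The relation between FINT and the edge focal length can be sketched with the Transport/MAD-style thin-lens convention. The formula below is a textbook approximation quoted from memory (full gap, second-order gap term neglected), so treat its exact form as an assumption rather than the PSR model itself.

```python
import math

def edge_focal_lengths(rho, edge_angle, gap, fint=0.5):
    """Thin-lens focal lengths of a rectangular-dipole edge. FINT weakens
    the vertical edge focusing through the correction angle psi; `gap` is
    the full magnet gap."""
    e = edge_angle
    psi = fint * gap * (1.0 / rho) * (1.0 + math.sin(e) ** 2) / math.cos(e)
    inv_fx = math.tan(e) / rho         # horizontal plane
    inv_fy = -math.tan(e - psi) / rho  # vertical plane, weakened by fringe
    return 1.0 / inv_fx, 1.0 / inv_fy

# Rectangular-dipole-like numbers (illustrative, not PSR values).
fx, fy = edge_focal_lengths(rho=5.0, edge_angle=0.2, gap=0.1)
```

With fint = 0 the planes are exactly antisymmetric (fy = -fx); a nonzero FINT lengthens the vertical focal length, which is why tuning it shifts the vertical tune.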
Klemm, Matthias; Schweitzer, Dietrich; Peters, Sven; Sauer, Lydia; Hammer, Martin; Haueisen, Jens
2015-01-01
Fluorescence lifetime imaging ophthalmoscopy (FLIO) is a new technique for measuring the in vivo autofluorescence intensity decays generated by endogenous fluorophores in the ocular fundus. Here, we present a software package called FLIM eXplorer (FLIMX) for analyzing FLIO data. Specifically, we introduce a new adaptive binning approach as an optimal tradeoff between the spatial resolution and the number of photons required per pixel. We also expand existing decay models (multi-exponential, stretched exponential, spectral global analysis, incomplete decay) to account for the layered structure of the eye and present a method to correct for the influence of the crystalline lens fluorescence on the retina fluorescence. Subsequently, the Holm-Bonferroni method is applied to FLIO measurements to allow for group comparisons between patients and controls on the basis of fluorescence lifetime parameters. The performance of the new approaches was evaluated in five experiments. Specifically, we evaluated static and adaptive binning in a diabetes mellitus patient, compared the different decay models in a healthy volunteer, and performed a group comparison between diabetes patients and controls. An overview of the visualization capabilities and a comparison of static and adaptive binning is shown for a patient with macular hole. FLIMX's applicability to fluorescence lifetime imaging microscopy is shown in the ganglion cell layer of a porcine retina sample, obtained by a laser scanning microscope using two-photon excitation. PMID:26192624
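Adaptive binning of the kind described can be sketched as a neighborhood that grows until a photon budget is met. The square region here is a simplification of our own; FLIMX grows an approximately circular region.

```python
import numpy as np

def adaptive_bin(photon_counts, x, y, target=1000, max_radius=10):
    """Grow a square neighborhood around pixel (x, y) until the summed
    photon count reaches `target`: spatial resolution is traded for photon
    statistics, pixel by pixel."""
    total = 0
    for r in range(max_radius + 1):
        x0, x1 = max(0, x - r), min(photon_counts.shape[0], x + r + 1)
        y0, y1 = max(0, y - r), min(photon_counts.shape[1], y + r + 1)
        total = int(photon_counts[x0:x1, y0:y1].sum())
        if total >= target:
            return r, total
    return max_radius, total

# Uniform 50-photon pixels: r=0 gives 50, r=1 gives 450, r=2 gives 1250.
counts = np.full((64, 64), 50)
radius, total = adaptive_bin(counts, 32, 32, target=1000)
```

In bright regions the radius stays small (high resolution); in dim regions it grows until the decay fit has enough photons.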
Ramuhalli, Pradeep [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hirt, Evelyn H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dib, Gerges [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Veeramany, Arun [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Bonebrake, Christopher A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Roy, Surajit [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)
2016-09-20
This project involved the development of enhanced risk monitors (ERMs) for active components in Advanced Reactor (AdvRx) designs by integrating real-time information about equipment condition with risk monitors. Health monitoring techniques in combination with predictive estimates of component failure based on condition and risk monitors can serve to indicate the risk posed by continued operation in the presence of detected degradation. This combination of predictive health monitoring based on equipment condition assessment and risk monitors can also enable optimization of maintenance scheduling with respect to the economics of plant operation. This report summarizes PNNL's multi-year project on the development and evaluation of an ERM concept for active components while highlighting FY2016 accomplishments. Specifically, this report provides a status summary of the integration and demonstration of the prototypic ERM framework with the plant supervisory control algorithms being developed at Oak Ridge National Laboratory (ORNL), and describes additional case studies conducted to assess sensitivity of the technology to different quantities. Supporting documentation on the software package to be provided to ORNL is incorporated in this report.
Walton, James S.; Hodgson, Peter; Hallamasek, Karen; Palmer, Jake
2003-07-01
4DVideo is creating a general purpose capability for capturing and analyzing kinematic data from video sequences in near real-time. The core element of this capability is a software package designed for the PC platform. The software ("4DCapture") is designed to capture and manipulate customized AVI files that can contain a variety of synchronized data streams -- including audio, video, centroid locations -- and signals acquired from more traditional sources (such as accelerometers and strain gauges.) The code includes simultaneous capture or playback of multiple video streams, and linear editing of the images (together with the ancillary data embedded in the files). Corresponding landmarks seen from two or more views are matched automatically, and photogrammetric algorithms permit multiple landmarks to be tracked in two and three dimensions -- with or without lens calibrations. Trajectory data can be processed within the main application or they can be exported to a spreadsheet where they can be processed or passed along to a more sophisticated, stand-alone, data analysis application. Previous attempts to develop such applications for high-speed imaging have been limited in their scope, or by the complexity of the application itself. 4DVideo has devised a friendly ("FlowStack") user interface that assists the end-user to capture and treat image sequences in a natural progression. 4DCapture employs the AVI 2.0 standard and DirectX technology, which effectively eliminates the file size limitations found in older applications. In early tests, 4DVideo has streamed three RS-170 video sources to disk for more than an hour without loss of data. At this time, the software can acquire video sequences in three ways: (1) directly, from up to three hard-wired cameras supplying RS-170 (monochrome) signals; (2) directly, from a single camera or video recorder supplying an NTSC (color) signal; and (3) by importing existing video streams in the AVI 1.0 or AVI 2.0 formats. The
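The photogrammetric step reduces, for one landmark seen in two calibrated views, to linear triangulation; the cameras below are synthetic with invented intrinsics, and this is a schematic of the technique, not 4DCapture's own algorithm.

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation: each 3x4 projection matrix contributes
    two homogeneous linear equations, and the least-squares 3-D point is
    the smallest right singular vector."""
    rows = []
    for P, (u, v) in ((P1, uv1), (P2, uv2)):
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    _, _, vt = np.linalg.svd(np.vstack(rows))
    X = vt[-1]
    return X[:3] / X[3]

# Two synthetic pinhole cameras viewing one point 5 m away.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])                  # at origin
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])  # 1 m baseline
X_true = np.array([0.2, -0.1, 5.0])
h = np.append(X_true, 1.0)
uv1 = (P1 @ h)[:2] / (P1 @ h)[2]
uv2 = (P2 @ h)[:2] / (P2 @ h)[2]
X = triangulate(P1, P2, uv1, uv2)
```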
A model of the AGS based on stepwise ray-tracing through the measured field maps of the main magnets
Dutheil Y.; Meot, F.; Tsoupas, N.
2012-05-20
Two-dimensional mid-plane magnetic field maps of two of the main AGS magnets were produced, from Hall probe measurements, for a series of different current settings. The analysis of these data yielded the excitation functions [1] and the harmonic coefficients [2] of the main magnets, which have been used so far in all models of the AGS. The steady increase in computing power makes it possible today to use stepwise ray tracing directly through these measured field maps with a reasonable computation time. We describe in detail how these field maps have allowed the generation of models of the 6 different types of AGS main magnets, and how they are handled with the Zgoubi ray-tracing code [3]. We give and discuss a number of results obtained regarding both beam and spin dynamics in the AGS, and we provide comparisons with other numerical and analytical modelling methods.
He, Wenjun; Fu, Yuegang; Liu, Zhiying; Zhang, Lei; Wang, Jiake; Zheng, Yang; Li, Yahong
2017-03-01
The polarization aberrations of a complex multi-element optical system have been investigated using a 3D polarization aberration function. The 3D polarization ray-tracing matrix is combined with the optical path difference to obtain a 3D polarization aberration function, which avoids the need for a complicated phase-unwrapping process. The polarization aberrations of a microscope objective are analyzed, including the distributions of the 3D polarization aberration functions, the diattenuation aberration, the retardance aberration, and the polarization-dependent intensity on the exit pupil. Further, the effects of the field of view and of the coating on the distributions of the 3D polarization aberration functions are discussed in detail. Finally, a field-of-view and wavelength correction is proposed for the polarization aberration function, which optimizes the image quality of a multi-element optical system.
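Extracting diattenuation from a cumulative 3x3 polarization ray-tracing matrix can be sketched via an SVD of its transverse block. This is a simplified version of the published reduction; the transverse-basis construction here is our own assumption.

```python
import numpy as np

def transverse_basis(k):
    """Two unit vectors orthogonal to the unit propagation direction k."""
    k = k / np.linalg.norm(k)
    a = np.array([1.0, 0.0, 0.0]) if abs(k[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    u = np.cross(k, a); u /= np.linalg.norm(u)
    v = np.cross(k, u)
    return u, v

def diattenuation(P, k_in, k_out):
    """Diattenuation of a cumulative 3x3 polarization ray-tracing matrix:
    reduce P to its 2x2 transverse block between entrance and exit bases,
    then apply D = (s1^2 - s2^2) / (s1^2 + s2^2) to its singular values."""
    u_in, v_in = transverse_basis(np.asarray(k_in, float))
    u_out, v_out = transverse_basis(np.asarray(k_out, float))
    Tin = np.column_stack([u_in, v_in])      # 3x2
    Tout = np.column_stack([u_out, v_out])
    M = Tout.T @ P @ Tin                     # 2x2 transverse (Jones-like) block
    s = np.linalg.svd(M, compute_uv=False)
    return (s[0] ** 2 - s[1] ** 2) / (s[0] ** 2 + s[1] ** 2)

# A weak polarizer for a ray along z: amplitude transmittance 1.0 and 0.5
# along the two transverse axes; D should be (1 - 0.25) / (1 + 0.25) = 0.6.
P = np.diag([1.0, 0.5, 1.0])
D = diattenuation(P, [0, 0, 1], [0, 0, 1])
```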
Nugent, Allen H; Bertram, Christopher D
2010-02-01
Prediction of the effects of refractive index (RI) mismatch on laser Doppler anemometer (LDA) measurements within a curvilinear cavity (an artificial ventricle) was achieved by developing a general technique for modelling the paths of the convergent beams of the LDA system using 3D vector geometry. Validated by ray tracing through CAD drawings, the predicted maximum tolerance in RI between the solid model and the working fluid was ±0.0005, equivalent to focusing errors commensurate with the geometric and alignment uncertainties associated with the flow model and the LDA arrangement. This technique supports predictions of the effects of refraction within a complex geometry. Where the RI mismatch is unavoidable but known, it is possible not only to calculate the true position of the measuring volume (using the probe location and model geometry), but also to estimate the degradation in signal quality arising from differential displacement and refraction of the laser beams.
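Beam-path modelling of this kind rests on the vector form of Snell's law, which in 3D can be written as below; this is a generic sketch, not the authors' code.

```python
import numpy as np

def refract(d, n, n1, n2):
    """Vector Snell's law: refract unit direction d at a surface with unit
    normal n, going from index n1 to n2. Returns the refracted unit vector,
    or None at total internal reflection."""
    d = d / np.linalg.norm(d)
    n = n / np.linalg.norm(n)
    cos_i = -np.dot(n, d)
    if cos_i < 0:                   # normal pointed the wrong way; flip it
        n, cos_i = -n, -cos_i
    eta = n1 / n2
    k = 1.0 - eta ** 2 * (1.0 - cos_i ** 2)
    if k < 0.0:
        return None                 # total internal reflection
    return eta * d + (eta * cos_i - np.sqrt(k)) * n
```

Applying this at each wall crossing, per beam, gives the displaced crossing point of the two beams, i.e. the true measuring-volume position under a known RI mismatch.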
The Ray Tracing Analytical Solution within the RAMOD framework. The case of a Gaia-like observer
Crosta, Mariateresa; de Felice, Fernando; Lattanzi, Mario Gilberto
2015-01-01
This paper presents the analytical solution of the inverse ray tracing problem for photons emitted by a star and collected by an observer located in the gravitational field of the Solar System. This solution has been conceived to suit the accuracy achievable by the ESA Gaia satellite (launched on December 19, 2013), consistently with the measurement protocol in general relativity adopted within the RAMOD framework. The aim of this study is to provide a general relativistic tool for the science exploitation of such a revolutionary mission, whose main goal is to trace star directions back from within our local curved space-time, thereby providing a three-dimensional map of our Galaxy. The results are useful for a thorough comparison and cross-checking validation of what already exists in the field of relativistic astrometry. Moreover, the analytical solutions presented here can be extended to model other measurements that require the same order of accuracy expected for Gaia.
Improved Seismic Model Initial Value Ray Tracing Method
贺中银; 高阳
2011-01-01
The initial-value ray tracing method is one of the major modern ray tracing methods; it avoids the time-consuming computations of two-point ray tracing. Starting from the eikonal equation, the initial-value method is improved by replacing the velocity parameters of the model with the squared slowness, so that the eikonal equation admits analytic solutions. From this, computational expressions are derived for the reflected and transmitted slowness vectors where a ray meets an interface, together with functional expressions for the reflection and transmission coefficients. Ray tracing through a simple two-layer syncline model and a complex multi-layer salt-dome model shows that, compared with a Runge-Kutta discrete numerical solution, the improved initial-value method not only raises the ray tracing efficiency substantially (by roughly a factor of 10) but also extends the range of applicability of ray methods.
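The benefit of the squared-slowness substitution can be illustrated with the standard kinematic ray equations dx/dσ = p, dp/dσ = ∇(u²)/2, dT/dσ = u²: when u² varies linearly in space, ∇(u²)/2 is constant and every step has a closed form, so no Runge-Kutta integration is needed. The following is a hedged sketch of that idea, not the authors' implementation:

```python
import numpy as np

def analytic_ray_step(x0, p0, u2_0, g, sigma):
    """Closed-form solution of the kinematic ray equations
        dx/dsigma = p,   dp/dsigma = grad(u^2)/2,   dT/dsigma = u^2,
    valid when squared slowness is linear: u^2(x) = u2_0 + g.(x - x0).
    Returns position x, slowness vector p and traveltime T at parameter sigma."""
    x0, p0, g = (np.asarray(a, float) for a in (x0, p0, g))
    x = x0 + p0 * sigma + g * sigma**2 / 4.0
    p = p0 + g * sigma / 2.0
    T = (u2_0 * sigma
         + np.dot(g, p0) * sigma**2 / 2.0
         + np.dot(g, g) * sigma**3 / 12.0)
    return x, p, T

# Homogeneous check: slowness u = 0.5 s/km (v = 2 km/s), ray along x.
x, p, T = analytic_ray_step([0.0, 0.0], [0.5, 0.0], 0.25, [0.0, 0.0], 4.0)
```

In a homogeneous medium this reduces to a straight ray with T = distance/velocity, and in a gradient medium the eikonal constraint |p|² = u² is preserved exactly along the step.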
Ray-tracing critical-angle transmission gratings for the X-ray Surveyor and Explorer-size missions
Günther, Hans M.; Bautz, Marshall W.; Heilmann, Ralf K.; Huenemoerder, David P.; Marshall, Herman L.; Nowak, Michael A.; Schulz, Norbert S.
2016-07-01
We study a critical-angle transmission (CAT) grating spectrograph that delivers a spectral resolution significantly above that of any X-ray spectrograph ever flown. This new technology will allow us to resolve kinematic components in absorption and emission lines of galactic and extragalactic matter down to unprecedented dispersion levels. We perform ray-trace simulations to characterize the performance of the spectrograph in the context of an X-ray Surveyor or Arcus-like layout (two mission concepts currently under study). Our newly developed ray-trace code is a tool suite for simulating the performance of X-ray observatories. The simulator is written in Python, because a high-level scripting language allows the simulated instrument design to be modified in very few lines of code. This is especially important in the early phase of mission development, when the performance of different configurations is compared. To reduce run time and allow simulations of a few million photons in a few minutes on a desktop computer, the simulator uses tabulated input (from theoretical models or laboratory measurements of samples) for grating efficiencies and mirror reflectivities. We find that the grating-facet alignment tolerances required to maintain at least 90% of the resolving power of a perfectly aligned spectrometer are (i) translation parallel to the optical axis below 0.5 mm, (ii) rotation around the optical axis or the groove direction below a few arcminutes, and (iii) constancy of the grating period to 1:10⁵. Translations along and rotations around the remaining axes can be significantly larger without impacting the performance.
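The quoted period tolerance of 1:10⁵ can be checked against the transmission grating equation mλ = d sin θ: a fractional change in the period d produces a comparable fractional shift of the diffraction angle, which blurs the dispersed spectrum. A minimal sketch, with wavelength and period values that are hypothetical rather than taken from the instrument design:

```python
import math

def diffraction_angle(wavelength_nm, period_nm, order=1):
    """Transmission grating equation: m * lambda = d * sin(theta)."""
    return math.asin(order * wavelength_nm / period_nm)

# Hypothetical soft-X-ray values: 2.5 nm photons on a 200 nm period grating.
theta = diffraction_angle(2.5, 200.0)
theta_perturbed = diffraction_angle(2.5, 200.0 * (1.0 + 1e-5))
rel_shift = (theta - theta_perturbed) / theta
```

The relative angular shift comes out of order 10⁻⁵, i.e. the same order as the fractional period error, which is why period constancy enters the tolerance budget directly.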
Tsujimura, T., Ii; Kubo, S.; Takahashi, H.; Makino, R.; Seki, R.; Yoshimura, Y.; Igami, H.; Shimozuma, T.; Ida, K.; Suzuki, C.; Emoto, M.; Yokoyama, M.; Kobayashi, T.; Moon, C.; Nagaoka, K.; Osakabe, M.; Kobayashi, S.; Ito, S.; Mizuno, Y.; Okada, K.; Ejiri, A.; Mutoh, T.
2015-11-01
The central electron temperature has successfully reached up to 7.5 keV in Large Helical Device (LHD) plasmas with a central ion temperature of 5 keV and a central electron density of 1.3 × 10¹⁹ m⁻³. This result was obtained by heating with a newly installed 154 GHz gyrotron and by optimising the injection geometry of the electron cyclotron heating (ECH). The optimisation was carried out using the ray-tracing code 'LHDGauss', which was upgraded to include rapid post-processing three-dimensional (3D) equilibrium mapping obtained from experiments. For ray-tracing calculations, LHDGauss can automatically read the relevant data registered in the LHD database after a discharge, such as the ECH injection settings (e.g. Gaussian beam parameters, target positions, polarisation and ECH power) and Thomson scattering diagnostic data, along with the 3D equilibrium mapping data. The mapped electron density and temperature profiles are then extrapolated into the region outside the last closed flux surface. The mode purity, i.e. the ratio between the ordinary mode and the extraordinary mode, is obtained by solving the 1D full-wave equation along the ray directions from the antenna to the absorption target point. Using the virtual magnetic flux surfaces, the effects of the modelled density profiles and the magnetic shear in the peripheral region for a given polarisation are taken into account. Power deposition profiles calculated at each Thomson scattering measurement time are registered in the LHD database. Adjusting the injection settings toward the desired deposition profile, with feedback provided on a shot-by-shot basis, resulted in an effective experimental procedure.
Sarmah, Nabin; Richards, Bryce S; Mallick, Tapas K
2011-07-01
We present a detailed design concept and optical performance evaluation of stationary dielectric asymmetric compound parabolic concentrators (DiACPCs) using ray-tracing methods. Three DiACPC designs, DiACPC-55, DiACPC-66, and DiACPC-77, of acceptance half-angles (0° and 55°), (0° and 66°), and (0° and 77°), respectively, are designed in order to optimize the concentrator for building façade photovoltaic applications in northern latitudes (>55 °N). The dielectric concentrator profiles have been realized via truncation of the complete compound parabolic concentrator profiles to achieve a geometric concentration ratio of 2.82. Ray-tracing simulation results show that all rays entering the designed concentrators within the acceptance half-angle range can be collected without escaping from the parabolic sides and aperture. The maximum optical efficiency of the designed concentrators is found to be 83%, which tends to decrease with the increase in incidence angle. The intensity is found to be distributed at the receiver (solar cell) area in an inhomogeneous pattern for a wide range of incident angles of direct solar irradiance with high-intensity peaks at certain points of the receiver. However, peaks become more intense for the irradiation incident close to the extreme acceptance angles, shifting the peaks to the edge of the receiver. Energy flux distribution at the receiver for diffuse radiation is found to be homogeneous within ±12% with an average intensity of 520 W/m².
SolTrace: A Ray-Tracing Code for Complex Solar Optical Systems
Wendelin, Tim [National Renewable Energy Lab. (NREL), Golden, CO (United States)]; Dobos, Aron [National Renewable Energy Lab. (NREL), Golden, CO (United States)]; Lewandowski, Allan [Allan Lewandowski Solar Consulting LLC, Evergreen, CO (United States)]
2013-10-01
SolTrace is an optical simulation tool designed to model optical systems used in concentrating solar power (CSP) applications. The code was first written in early 2003 but has seen significant modifications and changes since its inception, including conversion from a Pascal-based software development platform to C++. SolTrace is unique in that it can model virtually any optical system utilizing the sun as the source. It has been made available for free and as such is in use worldwide by industry, universities, and research laboratories. The fundamental design of the code is discussed, including enhancements and improvements over the earlier version. Comparisons are made with other optical modeling tools, both non-commercial and commercial in nature. Finally, modeled results are shown for some typical CSP systems and, in one case, compared to measured optical data.
Optimizing heliostat positions with local search metaheuristics using a ray tracing optical model
Reinholz, Andreas; Husenbeth, Christof; Schwarzbözl, Peter; Buck, Reiner
2017-06-01
The life-cycle costs of solar tower power plants are mainly determined by the investment costs of their construction, and significant parts of these investment costs go into the heliostat field. Therefore, an optimized placement of the heliostats that maximizes annual power production has a direct impact on the ratio of life-cycle costs to revenue. We present a two-level local search method, implemented in MATLAB, that utilizes the Monte Carlo ray-tracing software STRAL [1] to evaluate the annual power output under a specific weighted annual time scheme. The algorithm was applied to a solar tower power plant (PS10) with 624 heliostats. Compared to former work by Buck [2], we were able to improve both the runtime of the algorithm and the quality of the output solutions significantly. Using the same environment for both algorithms, we reached Buck's best solution with a speed-up factor of about 20.
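One level of such a local search can be sketched generically as hill climbing over candidate layouts, with the expensive ray-traced annual-output evaluation (STRAL in the paper) replaced by any callable scoring function. The two-level structure, the neighbourhood move, and the toy objective below are illustrative assumptions, not the authors' algorithm:

```python
import random

def local_search(positions, evaluate, neighbors, iters=5000, seed=0):
    """Hill-climbing local search: propose a neighbouring layout and keep it
    whenever the evaluated score improves. `evaluate` stands in for the
    expensive ray-traced annual-output computation."""
    rng = random.Random(seed)
    best, best_score = list(positions), evaluate(positions)
    for _ in range(iters):
        cand = neighbors(best, rng)
        score = evaluate(cand)
        if score > best_score:
            best, best_score = cand, score
    return best, best_score

# Toy stand-in objective: move 3 "heliostats" on a line toward target spots.
target = [1.0, 2.0, 3.0]

def evaluate(pos):
    return -sum((p - t) ** 2 for p, t in zip(pos, target))

def neighbors(pos, rng):
    cand = list(pos)
    i = rng.randrange(len(cand))
    cand[i] += rng.uniform(-0.2, 0.2)     # perturb one position
    return cand

best, score = local_search([0.0, 0.0, 0.0], evaluate, neighbors)
```

Because each accepted move must improve the score, the search converges to a local optimum of whatever objective the ray tracer evaluates; the quality/runtime trade-off is then governed by the neighbourhood design and iteration budget.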
辛建丽; 徐桂香
2012-01-01
Accurately measuring and controlling the doses of cement and lime in cement- or lime-stabilized soil is key to quality control in road base construction. At present, field laboratories estimate the lime dose by manually reading values off a standard curve generated in Excel, a practice criticized for its low accuracy and long measurement time. We developed a new software package for the rapid measurement of lime dose by EDTA titration. The package is written in C# and fits the standard curve with a least-squares polynomial algorithm, improving measurement accuracy and saving measurement time.
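The least-squares polynomial fit that replaces the manual standard-curve read-off can be sketched in a few lines. The package itself is in C#; this Python sketch uses entirely hypothetical calibration data:

```python
import numpy as np

# Hypothetical calibration data: EDTA titrant volume (mL) vs. known lime dose (%).
volumes = np.array([2.1, 3.4, 4.6, 5.9, 7.0, 8.3])
doses = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])

# Least-squares polynomial standard curve, replacing manual read-off
# from a curve plotted in a spreadsheet.
curve = np.poly1d(np.polyfit(volumes, doses, deg=2))

def lime_dose(volume_ml):
    """Estimate the lime dose (%) from a measured EDTA titrant volume."""
    return float(curve(volume_ml))
```

A low-order fit like this turns each titration reading directly into a dose estimate, removing the interpolation-by-eye step that limits accuracy.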
Farace, Paolo; Righetto, Roberto; Deffet, Sylvain; Meijers, Arturs; Vander Stappen, Francois
2016-01-01
Purpose: To introduce a fast ray-tracing algorithm in pencil proton radiography (PR) with a multilayer ionization chamber (MLIC) for in vivo range error mapping. Methods: Pencil beam PR was obtained by delivering spots uniformly positioned in a square (45×45 mm² field of view) of 9×9 spots capable
Greenwald, R. A.; Frissell, N. A.; de Larquier, S.
2016-12-01
In this paper, we evaluate the performance of three methods used by HF radars in the SuperDARN network for determining the ground ranges to ionospheric scattering volumes. Each method uses a somewhat different approach, but the same equivalent-path analysis. We also show that Snell's law can be added to this analysis to determine the refractive index of each scattering volume and thereby correct Doppler velocity measurements for ionospheric refraction. Two of these methods make their predictions using the group range to the scattering volume and a virtual height model, while the third uses the group range and the elevation angle of each backscattered return. The effectiveness of each method is evaluated using ray tracing through the International Reference Ionosphere. The ray tracing analysis provides determinations of the initial elevation angle, group range, ground range, and refractive index of each ionospheric volume that backscatters signals to the radar. The initial or final elevation angle and the group range are used as inputs to the geolocation methods, and the ground range and refractive index serve as reference data against which the predictions of the geolocation methods can be evaluated. We find that the methods using virtual height models actually change the initial elevation angle determined from ray tracing to a different elevation angle that is consistent with the virtual height model. Due to this change, predictions of the ground range and refractive index of scattering volumes located with virtual-height models are rarely consistent with the predictions obtained from ray tracing. In contrast, the geolocation method that uses the group range and the initial or final elevation angle yields predictions in good agreement with ray tracing. Modifications to the equivalent-path analysis are required to obtain consistent predictions of the ground range and refractive index of backscatter from the topside F-layer.
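The refraction correction follows from the Doppler relation Δf = 2 f n v / c: a velocity inferred assuming free-space propagation (n = 1) equals n times the true line-of-sight velocity, so dividing by the refractive index of the scattering volume recovers it. A minimal sketch using the unmagnetized-plasma index n = √(1 − (f_p/f)²) with plasma frequency f_p ≈ 8.98 √Nₑ Hz; the radar frequency and electron density below are illustrative numbers, not values from the study:

```python
import math

def refractive_index(f_radio_hz, ne_per_m3):
    """Refractive index of an unmagnetized, collisionless plasma:
    n = sqrt(1 - (f_p/f)^2), with plasma frequency f_p ~ 8.98*sqrt(Ne) Hz."""
    f_p = 8.98 * math.sqrt(ne_per_m3)
    return math.sqrt(1.0 - (f_p / f_radio_hz) ** 2)

def corrected_velocity(v_apparent, n):
    """The Doppler shift is 2*f*n*v/c, so a velocity inferred with n = 1
    underestimates the true velocity by the factor n."""
    return v_apparent / n

n = refractive_index(12e6, 1e11)       # 12 MHz radar, Ne = 1e11 m^-3
v_true = corrected_velocity(400.0, n)  # apparent velocity of 400 m/s
```

Since n < 1 in the ionosphere, the corrected velocity is always somewhat larger than the apparent one, and the correction grows where the ray approaches its reflection height.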
Science and Children, 1990
1990-01-01
Reviewed are seven computer software packages for IBM and/or Apple Computers. Included are "Windows on Science: Volume 1--Physical Science"; "Science Probe--Physical Science"; "Wildlife Adventures--Grizzly Bears"; "Science Skills--Development Programs"; "The Clean Machine"; "Rock Doctor"; and "Geology Search." Cost, quality, hardware, and…
McGrath, Diane, Ed.
1989-01-01
Reviewed is a computer software package entitled "Audubon Wildlife Adventures: Grizzly Bears" for Apple II and IBM microcomputers. Included are availability, hardware requirements, cost, and a description of the program. The murder-mystery flavor of the program is stressed in this program that focuses on illegal hunting and game management. (CW)
郑紫英; 林水潮
2011-01-01
The scanning probe microscope (SPM) is an important tool for real-space analysis of solid surfaces. A general-purpose software package for processing SPM images was designed in Visual Basic, calling a dynamic-link library written in C. The package provides SPM image display, drift correction, distortion correction, line-profile analysis, and Fourier-transform analysis. It effectively removes disturbances of the SPM image introduced by the external environment and by the instrument itself, enriches the information in SPM maps, and overcomes the poor generality of commercial instrument software.
周庆华; 史建魁; 肖伏良
2011-01-01
A three-dimensional ray tracing study of whistler-mode chorus is conducted for different geomagnetic activities by using a global core plasma density model. For the upper-band chorus, the initial azimuthal wave angle slightly affects the projection of ray trajectories onto the (Z, √(x² + y²)) plane, but controls the longitudinal propagation. The trajectory of the upper-band chorus is strongly associated with the plasmapause and the magnetic local time (MLT) of the chorus source region. For high geomagnetic activity, the chorus trajectory moves inward together with the plasmapause. In the bulge region, the plasmapause extends outward, and the chorus trajectory moves outward with it. For moderate or high geomagnetic activity, the lower-band chorus suffers lower hybrid resonance (LHR) reflection before it reaches the plasmapause, leading to a weak correlation with the geomagnetic activity and the magnetic local time of the chorus source region. For low geomagnetic activity, the lower-band chorus may be reflected first at the plasmapause instead of suffering LHR reflection, exhibiting propagation characteristics similar to those of the upper-band chorus. The results provide new insight into the propagation characteristics of the chorus for different geomagnetic activities and contribute to further understanding of the acceleration of energetic electrons by chorus waves.
Spin dynamics modeling in the AGS based on a stepwise ray-tracing method
Dutheil, Yann [Univ. of Grenoble (France)]
2006-08-07
The AGS provides a polarized proton beam to RHIC. The beam is accelerated in the AGS from Gγ = 4.5 to Gγ = 45.5, and the polarization transmission is critical to the RHIC spin program. In recent years, various systems were implemented to improve the AGS polarization transmission. These upgrades include the double partial-snake configuration and the tune-jump system. However, 100% polarization transmission through the AGS acceleration cycle has not yet been reached; the current efficiency of the polarization transmission is estimated to be around 85% in typical running conditions. Understanding the sources of depolarization in the AGS is critical to improving the AGS polarized proton performance. The complexity of the beam and spin dynamics, which is in part due to the specialized Siberian snake magnets, motivated a strong interest in original simulation methods. For that, the Zgoubi code, capable of direct particle and spin tracking through field maps, was used here to model the AGS. A model of the AGS using the Zgoubi code was developed and interfaced with the control system through a simple command, AgsFromSnapRampCmd. Interfacing with the machine control system allows fast modeling using actual machine parameters. These developments allowed the model to realistically reproduce the optics of the AGS along the acceleration ramp. Additional developments of the Zgoubi code, as well as of post-processing and pre-processing tools, granted long-term multiturn beam tracking capabilities: the tracking of realistic beams along the complete AGS acceleration cycle. Beam multiturn tracking simulations in the AGS, using realistic beam and machine parameters, provided a unique insight into the mechanisms behind the evolution of the beam emittance and polarization during the acceleration cycle. Post-processing software was developed to allow the representation of the relevant quantities from the Zgoubi simulation data. The Zgoubi simulations proved particularly
Comparison of VTEC from ground-based space geodetic techniques based on ray-traced mapping factors
Heinkelmann, Robert; Alizadeh, M. Mahdi; Schuh, Harald; Deng, Zhiguo; Zus, Florian; Etemadfard, M. Hossein
2016-07-01
For the derivation of vertical total electron content (VTEC) from slant total electron content (STEC), usually a standard approach is used based on mapping functions that assume a single-layer model of the ionosphere (e.g. IERS Conventions 2010). In our study we test the standard approach against a recently developed alternative which is based on station specific ray-traced mapping factors. For the evaluation of this new mapping concept, we compute VTEC at selected Very Long Baseline Interferometry (VLBI) stations using the dispersive delays and the corresponding formal errors obtained by observing extra-galactic radio sources at two radio frequencies in S- and X-bands by the permanent geodetic/astrometric program organized by the IVS (International VLBI Service for Geodesy and Astrometry). Additionally, by applying synchronous sampling and a consistent analysis configuration, we determine VTEC at Global Navigation Satellite System (GNSS) antennas using GPS (Global Positioning System) and/or GLONASS (Globalnaja nawigazionnaja sputnikowaja Sistema) observations provided by the IGS (International GNSS Service) that are operated in the vicinity of the VLBI antennas. We compare the VTEC time series obtained by the individual techniques over a period of about twenty years and describe their characteristics qualitatively and statistically. The length of the time series allows us to assess the long-term climatology of ionospheric VTEC during the last twenty years.
Fu, Lei
2017-05-11
Full-waveform inversion of land seismic data tends to get stuck in a local minimum associated with the waveform misfit function. This problem can be partly mitigated by using an initial velocity model that is close to the true velocity model. This initial starting model can be obtained by inverting traveltimes with ray-tracing traveltime tomography (RT) or wave-equation traveltime (WT) inversion. We have found that WT can provide a more accurate tomogram than RT by inverting the first-arrival traveltimes, and empirical tests suggest that RT is more sensitive to the additive noise in the input data than WT. We present two examples of applying WT and RT to land seismic data acquired in western Saudi Arabia. One of the seismic experiments investigated the water-table depth, and the other one attempted to detect the location of a buried fault. The seismic land data were inverted by WT and RT to generate the P-velocity tomograms, from which we can clearly identify the water table depth along the seismic survey line in the first example and the fault location in the second example.
A Three-Dimensional Ray-Tracing Study of R-X Mode Waves during High Geomagnetic Activity
XIAO Fu-Liang; CHEN Lun-Jin; ZHENG Hui-Nan; WANG Shui; GUO Jun
2008-01-01
We further present a three-dimensional (3D) ray-tracing study of the propagation characteristics of the superluminous R-X mode waves during high geomagnetic activity, following our recent two-dimensional results [J. Geophys. Res. 112 (2007) A10214]. We perform numerical calculations for this mode, which originates at a specific altitude r = 2.0 R_E in the source cavity along a 70° night-side geomagnetic field line. We demonstrate that the ray path of the R-X mode is essentially governed by the azimuthal angle of the wave vector k. Ray paths starting with an azimuthal angle of 180° (or in the meridian plane) can reach the lowest latitude, but stay at relatively higher latitudes for azimuthal angles other than 180° (or off the meridian plane). The results further support the previous finding that the R-X mode may be physically present in the radiation belts under appropriate conditions.
Connolly, G D; Lowe, M J S; Temple, J A G; Rokhlin, S I
2010-05-01
The use of ultrasonic arrays has increased dramatically within recent years due to their ability to perform multiple types of inspection and to produce images of the structure through post-processing of received signals. Phased arrays offer many advantages over conventional transducers in the inspection of materials that are inhomogeneous with spatially varying anisotropic properties. In this paper, the arrays are focused on austenitic steel welds as a representative inhomogeneous material. The method of ray-tracing through a previously developed model of an inhomogeneous weld is shown, with particular emphasis on the difficulties presented by material inhomogeneity. The delay laws for the structure are computed and are used to perform synthetic focusing at the post-processing stage of signal data acquired by the array. It is demonstrated for a simulated austenitic weld that by taking material inhomogeneity and anisotropy into account, superior reflector location (and hence, superior sizing) results when compared to cases where these are ignored. The image is thus said to have been corrected. Typical images are produced from both analytical data in the frequency domain and data from finite element simulations in the time domain in a variety of wave modes, including cases with mode conversion and reflections.
Stevens, John [Univ. of California, Berkeley, CA (United States)
2013-12-01
Ray tracing was used to perform optical optimization of arrays of photovoltaic microrods and to explore the interaction between light and bubbles of oxygen gas on the surface of the microrods. The incident angle of light was varied over a wide range, and the percentage of incident light absorbed by the microrods and reflected by the bubbles was computed over this range. It was found that, for the 10 μm diameter, 100 μm tall SrTiO_{3} microrods simulated in the model, the optimal center-to-center spacing was 14 μm for a square grid. This geometry produced 75% average and 90% maximum absorbance. For a triangular grid using the same microrods, the optimal center-to-center spacing was also 14 μm; this geometry produced 67% average and 85% maximum absorbance. For a randomly laid-out grid of 5 μm diameter, 100 μm tall SrTiO_{3} microrods with an average center-to-center spacing of 20 μm, the average absorption was 23% and the maximum absorption was 43%. For a 50% areal coverage fraction of bubbles on the absorber surface, between 2% and 20% of the incident light energy was reflected away from the rods by the bubbles, depending upon the incident angle and bubble morphology.
Pelzers, R S; Yu, Q L; Mangkuto, R A
2014-10-01
This article aims to understand the radiation behavior within a photo-reactor, following the ISO 22197-1:2007 standard. The RADIANCE lighting simulation tool, based on the backward ray-tracing modeling method, is employed for numerical computation of the radiation field. Reflection by the glass cover of the photo-reactor and by the test sample influences the amount of irradiance received at the test-sample surface in the photo-reactor setup. Reflection from a white sample limits the irradiance reduction by the glass cover to 1.4%, but darker samples can lead to an overestimation of up to 9.8% in the same setup. This overestimation could introduce considerable error into the interpretation of experiments. Furthermore, this method demonstrates that the kinetics of indoor photocatalytic pollutant degradation can be refined through radiation modeling of the reactor setup. In addition, RADIANCE may aid future modeling of the more complex indoor environment, where radiation significantly affects photocatalytic activity.
Sassen, Kenneth; Knight, Nancy C.; Takano, Yoshihide; Heymsfield, Andrew J.
1994-01-01
During the 1986 Project FIRE (First International Satellite Cloud Climatology Project Regional Experiment) field campaign, four 22 deg halo-producing cirrus clouds were studied jointly from a ground-based polarization lidar and an instrumented aircraft. The lidar data show the vertical cloud structure and the relative position of the aircraft, which collected a total of 84 slides by impaction, preserving the ice crystals for later microscopic examination. Although many particles were too fragile to survive impaction intact, a large fraction of the identifiable crystals were columns and radial bullet rosettes, with both displaying internal cavitations and radial plate-column combinations. Particles that were solid or displayed only a slight amount of internal structure were relatively rare, which shows that the usual model postulated by halo theorists, i.e., the randomly oriented, solid hexagonal crystal, is inappropriate for typical cirrus clouds. With the aid of new ray-tracing simulations for hexagonal hollow-ended column and bullet-rosette models, we evaluate the effects of more realistic ice-crystal structures on halo formation and lidar depolarization and consider why the common halo is not more common in cirrus clouds.
Mitsuishi, I.; Ezoe, Y.; Ogawa, T.; Sato, M.; Nakamura, K.; Numazawa, M.; Takeuchi, K.; Ohashi, T.; Ishikawa, K.; Mitsuda, K.
2016-01-01
To investigate the feasibility of the in situ X-ray imaging spectrometer JUXTA (Jupiter X-ray Telescope Array) onboard a Japanese Jupiter exploration mission, we demonstrated the ideal performance, i.e., angular resolution, effective area, and grasp, of our original conically approximated Wolter type-I MEMS-processed optics by extending our previous ray-tracing simulator. The novel simulator enables us, for the first time, to study both on- and off-axis responses of our optics in two-stage optical configurations. The on-axis angular resolution is limited to ∼13 μm on the detector plane, corresponding to ∼10 arcsec, without considering the diffraction effect, and is dominated by the diffraction effect below ∼1 keV (e.g., 13 arcsec at 1 keV). Si optics can achieve an effective area of >700 mm² and a grasp of >1600 mm² deg² at our energy of interest, 600 eV. Larger effective area and grasp can be attained by employing Ni as a substrate material or Ir as a reflecting-surface material. However, other factors introduced in the fabrication processes, such as waviness of the mirror surface and deformation error, cause significant performance degradation. Thus, we conclude that MEMS-processed optics can satisfy all the requirements of JUXTA provided the manufacturing accuracy can be controlled.