WorldWideScience

Sample records for coal-seismic desktop computer

  1. Desktop grid computing

    CERN Document Server

    Cerin, Christophe

    2012-01-01

    Desktop Grid Computing presents common techniques used in numerous models, algorithms, and tools developed during the last decade to implement desktop grid computing. These techniques enable the solution of many important sub-problems for middleware design, including scheduling, data management, security, load balancing, result certification, and fault tolerance. The book's first part covers the initial ideas and basic concepts of desktop grid computing. The second part explores challenging current and future problems. Each chapter presents the sub-problems, discusses theoretical and practical

  2. In-seam seismics for coal

    Energy Technology Data Exchange (ETDEWEB)

    Saviron Cidon, L. [OCICARBON, Madrid (Spain)]

    1989-11-01

    The project objective is to assess the degree of applicability of in-seam seismic technology in Spanish coal mines for use as a tool to predict the presence of irregularities in coal seams. By the very nature of coal mining, a large number of in-seam seismic research results are put directly to the test by the ensuing underground operations. The statistics from this continuous process of verification in other countries show this method to be extremely successful. Indeed, the use of the method has become habitual and it is recognised as an efficient instrument for aiding the location of faults and other irregularities in coal seams. 3 figs., 2 tabs.

  3. Indication to distinguish the burst region of coal gas from seismic data

    Energy Technology Data Exchange (ETDEWEB)

    Jian-yuan Cheng; Hong-wei Tang; Lin Xu; Yan-fang Li [China Coal Research Institute, Xi'an (China). Xi'an Research Institute]

    2009-09-15

    The velocity of an over-burst coal seam is about 1/3 that of a normal coal seam, based on laboratory test results. This can be used as a basis for confirming areas of coal and gas burst with the seismic exploration technique. Similarly, the simulation results of the theoretical seismic model show an obvious distinction between over-burst coal and normal coal in the coal reflection's travel time, energy and frequency. The results from actual seismic data acquired in coal and gas over-burst cases are consistent with those of the laboratory tests and seismic modeling; that is, in the coal and gas burst region, seismic reflection travel time is delayed, seismic amplitude is weakened and seismic frequency is reduced. Therefore, it can be concluded that the seismic exploration technique is promising for distinguishing coal and gas over-burst regions based on the variation of seismic reflection travel time, amplitude and frequency. 7 refs., 6 figs.
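
    A quick arithmetic sketch of why the travel-time delay arises (illustrative values; only the 1/3 velocity ratio comes from the abstract above):

    ```python
    # Two-way travel time through a coal seam, intact vs. over-burst.
    # Seam thickness and intact velocity are assumed, illustrative values;
    # the 1/3 velocity ratio is the laboratory result cited above.
    h = 5.0                    # seam thickness, m (assumed)
    v_normal = 2100.0          # P-wave velocity of intact coal, m/s (assumed)
    v_burst = v_normal / 3.0   # over-burst coal velocity, per the lab result

    t_normal = 2 * h / v_normal  # two-way time, intact seam
    t_burst = 2 * h / v_burst    # two-way time, over-burst seam
    print(f"extra delay: {(t_burst - t_normal) * 1e3:.1f} ms")  # ~9.5 ms
    ```

    The tripled transit time through the seam is what appears as the delayed reflection, and the same impedance drop weakens amplitude, consistent with the field observations.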

  4. 4D seismic data acquisition method during coal mining

    International Nuclear Information System (INIS)

    Du, Wen-Feng; Peng, Su-Ping

    2014-01-01

    In order to observe overburden media changes caused by mining, we take the fully-mechanized working face of the BLT coal mine in the Shendong mine district as an example to develop a 4D seismic data acquisition methodology for use during coal mining. The 4D seismic acquisition collects 3D seismic data four times in different periods - before mining, during the mining process and after mining - to observe the changes of the overburden layer during coal mining. The seismic data in the research area demonstrate that seismic waves are stronger in energy, higher in frequency and have more continuous reflectors before coal mining. All this is reversed after coal mining: because the overburden layer has been mined through, the seismic energy and frequency decrease and reflections show more discontinuities. Comparing records collected over newly mined areas with records acquired in the same survey, with the same geometry, over areas that had a long time to settle after mining, clearly shows that the latter reflections have stronger amplitudes and are more continuous, because the media have recovered through compaction of the overburden layer during the long settling time. By 4D seismic acquisition, the original background picture of the coal layers can be derived from the first records, and the layer structure changes can then be monitored through the records of mining action and of compaction after mining. This method has laid the foundation for further research into the variation principles of the overburden layer under modern coal-mining conditions. (paper)

  5. Seismic characterization of CO{sub 2} in coals

    Energy Technology Data Exchange (ETDEWEB)

    McCrank, J.; Lawton, D.C. [Calgary Univ., AB (Canada). Dept. of Geoscience, Consortium for Research in Elastic Wave Exploration Seismology]

    2008-07-01

    The Mynheer coal seam was targeted for an enhanced coalbed methane (CBM) experiment. During initial testing of the reservoir permeability, 180 tonnes of carbon dioxide (CO{sub 2}) was injected into the seam. The objective of the study was to characterize the coal zones and to determine if the small volume of CO{sub 2} in the thinly bedded and seismically tuned reservoir can be detected in the 3D surface seismic data. The multi-well pilot project took place in the Pembina Field of west-central Alberta. The Ardley coals were tested for CO{sub 2} injection, enhanced CBM production, and CO{sub 2} sequestration. The seismic survey captured the condition of the reservoir after formation permeability tests. It was concluded that the anomalies seen in the seismic data can be attributed to changes in the physical properties of the coal due to CO{sub 2} adsorption. 2 refs., 5 figs.

  6. Seismic modelling of coal bed methane strata, Willow Creek, Alberta

    Energy Technology Data Exchange (ETDEWEB)

    Richardson, S.E.; Mayer, R.; Lawton, D.C.; Langenberg, W. [Consortium for Research in Elastic Wave Exploration Seismology, Calgary, AB (Canada)

    2001-07-01

    The purpose is to determine the feasibility of applying high-resolution reflection seismic surveying to coalbed methane (CBM) exploration and development. Numerical reflection seismic methods are examined for mapping the continuity and coherence of coal zones. Numerical modelling of a coal zone in Upper Cretaceous sediments near Willow Creek, Alberta indicates that seismic data with a dominant frequency of 100 Hz are required to map the coal zone and lateral facies variations within the deposit. For resolution of individual coal seams, a central frequency >150 Hz would be needed. 26 refs., 17 figs., 3 tabs.
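
    The frequency requirements follow from the quarter-wavelength resolution rule; a minimal sketch, assuming a typical interval velocity for coal-bearing strata (the velocity is not given in the abstract):

    ```python
    # Quarter-wavelength (tuning) estimate of vertical seismic resolution.
    # The 100 Hz and 150 Hz figures come from the abstract; the interval
    # velocity is an assumed, typical value for coal-bearing strata.
    def tuning_thickness(velocity_ms: float, dominant_freq_hz: float) -> float:
        """Thinnest resolvable bed thickness: a quarter of the wavelength."""
        return velocity_ms / (4.0 * dominant_freq_hz)

    v = 2400.0  # m/s, assumed interval velocity
    for f in (100.0, 150.0):
        print(f"{f:.0f} Hz -> ~{tuning_thickness(v, f):.1f} m")
    # 100 Hz resolves ~6 m (a coal zone); 150 Hz resolves ~4 m (single seams)
    ```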

  7. Application of desktop computers in nuclear engineering education

    International Nuclear Information System (INIS)

    Graves, H.W. Jr.

    1990-01-01

    Utilization of desktop computers in the academic environment is based on the same objectives as in the industrial environment - increased quality and efficiency. Desktop computers can be extremely useful teaching tools in two general areas: classroom demonstrations and homework assignments. Although differences in emphasis exist, tutorial programs share many characteristics with interactive software developed for the industrial environment. In the Reactor Design and Fuel Management course at the University of Maryland, several interactive tutorial programs provided by Energy Analysis Software Service have been utilized. These programs have been designed to be sufficiently structured to permit an orderly, disciplined solution to the problem at hand, yet flexible enough to accommodate most problem solution options

  8. VRLane: a desktop virtual safety management program for underground coal mine

    Science.gov (United States)

    Li, Mei; Chen, Jingzhu; Xiong, Wei; Zhang, Pengpeng; Wu, Daozheng

    2008-10-01

    VR technologies, which generate immersive, interactive, and three-dimensional (3D) environments, are seldom applied to coal mine safety management. In this paper, a new method that combines VR technologies with an underground mine safety management system was explored. A desktop virtual safety management program for underground coal mines, called VRLane, was developed. The paper mainly concerns the current research advances in VR, the system design, key techniques, and system application. Two important techniques are introduced in the paper. First, an algorithm was designed and implemented with which the 3D laneway models and equipment models can be built automatically on the basis of the latest 2D mine drawings, whereas common VR programs establish the 3D environment using 3DS Max or other 3D modeling software packages, with which laneway models are built manually and laboriously. Second, VRLane realized system integration with underground industrial automation. VRLane not only describes a realistic 3D laneway environment, but also describes the status of coal mining, with functions for displaying the run states and related parameters of equipment, pre-alarming abnormal mining events, and animating mine cars, mine workers, and long-wall shearers. The system, with the advantages of being cheap, dynamic, and easy to maintain, provides a useful tool for safety production management in coal mines.

  9. Randomized Trial of Desktop Humidifier for Dry Eye Relief in Computer Users.

    Science.gov (United States)

    Wang, Michael T M; Chan, Evon; Ea, Linda; Kam, Clifford; Lu, Yvonne; Misra, Stuti L; Craig, Jennifer P

    2017-11-01

    Dry eye is a frequently reported problem among computer users. Low relative humidity environments are recognized to exacerbate signs and symptoms of dry eye, yet are common in offices of computer operators. Desktop USB-powered humidifiers are available commercially, but their efficacy for dry eye relief has not been established. This study aims to evaluate the potential for a desktop USB-powered humidifier to improve tear-film parameters, ocular surface characteristics, and subjective comfort of computer users. Forty-four computer users were enrolled in a prospective, masked, randomized crossover study. On separate days, participants were randomized to 1 hour of continuous computer use, with and without exposure to a desktop humidifier. Lipid-layer grade, noninvasive tear-film breakup time, and tear meniscus height were measured before and after computer use. Following the 1-hour period, participants reported whether ocular comfort was greater, equal, or lesser than that at baseline. The desktop humidifier effected a relative difference in humidity between the two environments of +5.4 ± 5.0% (P < .05), although no significant changes in lipid-layer grade or tear meniscus height were detected (P > .05). However, a relative increase in the median noninvasive tear-film breakup time of +4.0 seconds was observed in the humidified environment (P < .05), suggesting the potential for improved tear-film stability and ocular comfort during computer use. Trial registration no: ACTRN12617000326392.

  10. Desk-top publishing using IBM-compatible computers.

    Science.gov (United States)

    Grencis, P W

    1991-01-01

    This paper sets out to describe one Medical Illustration Department's experience of the introduction of computers for desk-top publishing. In this particular case, after careful consideration of all the options open, an IBM-compatible system was installed rather than the often popular choice of an Apple Macintosh.

  11. Desk Congest: Desktop Congesting Software for Desktop Clutter Congestion

    Directory of Open Access Journals (Sweden)

    Solomon A. Adepoju

    2015-06-01

    Full Text Available Abstract The computer desktop environment is a working environment which can be likened unto a user's desk in homes and offices. Oftentimes the computer desktop gets cluttered with files, either as shortcuts used for quick links, files stored temporarily to be accessed later, or files dumped there for no vivid reason. However, previous research has shown that a cluttered desktop affects users' productivity, and getting these files organized is a laborious task for most users. To conveniently alleviate the effect clutter has on users' performance and productivity, there is a need for third-party software that helps get the desktop environment organized in a logical and efficient manner. It is to this end that desktop decongesting software was designed and implemented to help curb clutter problems which existing tools have only partially addressed. The system is designed using Visual Basic .NET and proves effective in tackling the desktop congestion problem.

  12. An ergonomic evaluation comparing desktop, notebook, and subnotebook computers.

    Science.gov (United States)

    Szeto, Grace P; Lee, Raymond

    2002-04-01

    To evaluate and compare the postures and movements of the cervical and upper thoracic spine, the typing performance, and workstation ergonomic factors when using desktop, notebook, and subnotebook computers. Repeated-measures design. A motion analysis laboratory with an electromagnetic tracking device. A convenience sample of 21 university students between the ages of 20 and 24 years with no history of neck or shoulder discomfort. Each subject performed a standardized typing task using each of the 3 computers. Measurements during the typing task were taken at set intervals. The cervical and thoracic spine adopted a more flexed posture when the smaller-sized computers were used. There were significantly greater neck movements when using the desktop computer compared with the notebook and subnotebook computers. The viewing distances adopted by the subjects decreased as computer size decreased. Typing performance and subjective rating of difficulty in using the keyboards were also significantly different among the 3 types of computers. Computer users need to consider the posture of the spine and the potential risk of developing musculoskeletal discomfort when choosing computers. Copyright 2002 by the American Congress of Rehabilitation Medicine and the American Academy of Physical Medicine and Rehabilitation

  13. Density-based reflectivity in seismic exploration for coal in Alberta, Canada

    Energy Technology Data Exchange (ETDEWEB)

    Lawton, D.C.; Lyatsky, H.V. (University of Calgary, AB (Canada). Dept. of Geology and Geophysics)

    1991-01-01

    At a coal field in central Alberta, Canada, the acoustic reflectivity of shallow coal seams was found to be dominated by the density contrast between coal and host bentonitic sediments. Sonic logs and a check-shot survey showed that the compressional-wave velocity is almost constant through the coal zone and the overlying sediments, and ranges in value between 2000 m/s and 2350 m/s over different parts of the coal field. The average coal density is 1400 kg/m{sup 3}, whereas the density of the sediments is about 2200 kg/m{sup 3}. Results are illustrated using logs from a typical drillhole in the coal field. At this location, the time reflectivity sequence based on both the density and sonic logs is very similar to that obtained when the density log only is used, with a constant velocity assumed through the coal zone. At another drillhole location in the coal field, where reflection seismic data had been acquired, a synthetic seismogram generated from the density log closely matches the stacked seismic section. 6 refs., 4 figs.
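
    With the numbers quoted above, the normal-incidence reflection coefficient can be checked directly; a small sketch, assuming (as the logs indicate) a single shared velocity on both sides of the interface:

    ```python
    # Normal-incidence reflection coefficient R = (Z2 - Z1) / (Z2 + Z1),
    # with acoustic impedance Z = density * velocity. Densities are from
    # the abstract; the single shared velocity is an assumption consistent
    # with the reported near-constant sonic log (2000-2350 m/s).
    v = 2200.0        # m/s, assumed common velocity
    rho_sed = 2200.0  # kg/m^3, host bentonitic sediments
    rho_coal = 1400.0 # kg/m^3, average coal

    z_sed, z_coal = rho_sed * v, rho_coal * v
    r_top = (z_coal - z_sed) / (z_coal + z_sed)
    print(f"R at sediment-to-coal interface: {r_top:.2f}")  # about -0.22
    ```

    Because the shared velocity cancels out of the ratio, the strong coefficient of about -0.22 is carried entirely by the density contrast, which is the abstract's central observation.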

  14. Pages from the Desktop: Desktop Publishing Today.

    Science.gov (United States)

    Crawford, Walt

    1994-01-01

    Discusses changes that have made desktop publishing appealing and reasonably priced. Hardware, software, and printer options for getting started and moving on, typeface developments, and the key characteristics of desktop publishing are described. The author's notes on 33 articles from the personal computing literature from January-March 1994 are…

  15. MDA-image: an environment of networked desktop computers for teleradiology/pathology.

    Science.gov (United States)

    Moffitt, M E; Richli, W R; Carrasco, C H; Wallace, S; Zimmerman, S O; Ayala, A G; Benjamin, R S; Chee, S; Wood, P; Daniels, P

    1991-04-01

    MDA-Image, a project of The University of Texas M. D. Anderson Cancer Center, is an environment of networked desktop computers for teleradiology/pathology. Radiographic film is digitized with a film scanner and histopathologic slides are digitized using a red, green, and blue (RGB) video camera connected to a microscope. Digitized images are stored on a data server connected to the institution's computer communication network (Ethernet) and can be displayed from authorized desktop computers connected to Ethernet. Images are digitized for cases presented at the Bone Tumor Management Conference, a multidisciplinary conference in which treatment options are discussed among clinicians, surgeons, radiologists, pathologists, radiotherapists, and medical oncologists. These radiographic and histologic images are shown on a large screen computer monitor during the conference. They are available for later review for follow-up or representation.

  16. Desk-top computer assisted processing of thermoluminescent dosimeters

    International Nuclear Information System (INIS)

    Archer, B.R.; Glaze, S.A.; North, L.B.; Bushong, S.C.

    1977-01-01

    An accurate dosimetric system utilizing a desk-top computer and high sensitivity ribbon type TLDs has been developed. The system incorporates an exposure history file and procedures designed for constant spatial orientation of each dosimeter. Processing of information is performed by two computer programs. The first calculates relative response factors to ensure that the corrected response of each TLD is identical following a given dose of radiation. The second program computes a calibration factor and uses it and the relative response factor to determine the actual dose registered by each TLD. (U.K.)
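
    A minimal sketch of the two-pass logic described above, with assumed readings and variable names (nothing here is from the original programs): the first pass derives a relative response factor per TLD from a uniform calibration dose, and the second applies a batch calibration factor together with those factors to convert field readings to dose.

    ```python
    # Pass 1: relative response factors from a uniform calibration exposure.
    readings_cal = [10.2, 9.8, 10.5, 9.9]      # raw readings, equal dose (assumed)
    mean_cal = sum(readings_cal) / len(readings_cal)
    rrf = [r / mean_cal for r in readings_cal]  # corrects sensitivity spread

    # Pass 2: calibration factor converts corrected readings to dose.
    known_dose_mGy = 1.0                        # calibration dose (assumed)
    cal_factor = known_dose_mGy / mean_cal
    readings_field = [7.4, 8.1, 6.9, 7.7]       # later field readings (assumed)
    doses = [cal_factor * r / f for r, f in zip(readings_field, rrf)]
    print([f"{d:.3f} mGy" for d in doses])
    ```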

  17. What's New in Software? Mastery of the Computer through Desktop Publishing.

    Science.gov (United States)

    Hedley, Carolyn N.; Ellsworth, Nancy J.

    1993-01-01

    Offers thoughts on the phenomenon of the underuse of classroom computers. Argues that desktop publishing is one way of overcoming the computer malaise occurring in schools, using the incentive of classroom reading and writing for mastery of many aspects of computer production, including writing, illustrating, reading, and publishing. (RS)

  18. Common Sense Wordworking III: Desktop Publishing and Desktop Typesetting.

    Science.gov (United States)

    Crawford, Walt

    1987-01-01

    Describes current desktop publishing packages available for microcomputers and discusses the disadvantages, especially in cost, for most personal computer users. Also described is a less expensive alternative technology--desktop typesetting--which meets the requirements of users who do not need elaborate techniques for combining text and graphics.…

  19. Desktop Publishing Made Simple.

    Science.gov (United States)

    Wentling, Rose Mary

    1989-01-01

    The author discusses the types of computer hardware and software necessary to set up a desktop publishing system, both for use in educational administration and for instructional purposes. Classroom applications of desktop publishing are presented. The author also provides guidelines for preparing to teach desktop publishing. (CH)

  20. Reducing the Digital Divide among Children Who Received Desktop or Hybrid Computers for the Home

    Directory of Open Access Journals (Sweden)

    Gila Cohen Zilka

    2016-06-01

    Full Text Available Researchers and policy makers have been exploring ways to reduce the digital divide. Parameters commonly used to examine the digital divide worldwide, as well as in this study, are: (a) the digital divide in the accessibility and mobility of the ICT infrastructure and of the content infrastructure (e.g., sites used in school); and (b) the digital divide in literacy skills. In the present study we examined the degree of effectiveness of receiving a desktop or hybrid computer for the home in reducing the digital divide among children of low socio-economic status aged 8-12 from various localities across Israel. The sample consisted of 1,248 respondents assessed in two measurements. As part of the mixed-method study, 128 children were also interviewed. Findings indicate that after the children received desktop or hybrid computers, changes occurred in their frequency of access, mobility, and computer literacy. Differences were found between the groups: hybrid computers reduce disparities and promote work with the computer and surfing the Internet more than desktop computers do. Narrowing the digital divide for this age group has many implications for the acquisition of skills and study habits and, consequently, for the realization of individual potential. The children spoke about self-improvement as a result of exposure to the digital environment, and about a sense of empowerment and an improved position in the social fabric. Many children expressed a desire to continue their education and expand their knowledge of computer applications, software, games, and more. Therefore, if there is no computer in the home and it is necessary to decide between a desktop and a hybrid computer, a hybrid computer is preferable.

  1. GRID : unlimited computing power on your desktop Conference MT17

    CERN Multimedia

    2001-01-01

    The Computational GRID is an analogy to the electrical power grid for computing resources. It decouples the provision of computing, data, and networking from their use, and it allows large-scale pooling and sharing of resources distributed world-wide. Every computer, from a desktop to a mainframe or supercomputer, can provide computing power or data for the GRID. The final objective is to plug your computer into the wall and have direct access to huge computing resources immediately, just like plugging in a lamp to get instant light. The GRID will facilitate world-wide scientific collaborations on an unprecedented scale. It will provide transparent access to major distributed resources of computer power, data, information, and collaborations.

  2. Desktop computer graphics for RMS/payload handling flight design

    Science.gov (United States)

    Homan, D. J.

    1984-01-01

    A computer program, the Multi-Adaptive Drawings, Renderings and Similitudes (MADRAS) program, is discussed. The modeling program, developed for a desktop computer system (the Hewlett-Packard 9845/C), is written in BASIC and uses modular construction of objects while generating both wire-frame and hidden-line drawings from any viewpoint. The dimensions and placement of objects are user-definable. Once the hidden-line calculations are made for a particular viewpoint, the viewpoint may be rotated in pan, tilt, and roll without further hidden-line calculations. The use and results of this program are discussed.

  3. Evaluating virtual hosted desktops for graphics-intensive astronomy

    Science.gov (United States)

    Meade, B. F.; Fluke, C. J.

    2018-04-01

    Visualisation of data is critical to understanding astronomical phenomena. Today, many instruments produce datasets that are too big to be downloaded to a local computer, yet many of the visualisation tools used by astronomers are deployed only on desktop computers. Cloud computing is increasingly used to provide a computation and simulation platform in astronomy, but it also offers great potential as a visualisation platform. Virtual hosted desktops, with graphics processing unit (GPU) acceleration, allow interactive, graphics-intensive desktop applications to operate co-located with astronomy datasets stored in remote data centres. By combining benchmarking and user experience testing, with a cohort of 20 astronomers, we investigate the viability of replacing physical desktop computers with virtual hosted desktops. In our work, we compare two Apple MacBook computers (one old and one new, representing hardware at opposite ends of the useful lifetime) with two virtual hosted desktops: one commercial (Amazon Web Services) and one in a private research cloud (the Australian NeCTAR Research Cloud). For two-dimensional image-based tasks and graphics-intensive three-dimensional operations - typical of astronomy visualisation workflows - we found that benchmarks do not necessarily provide the best indication of performance. When compared to typical laptop computers, virtual hosted desktops can provide a better user experience, even with lower performing graphics cards. We also found that virtual hosted desktops are equally simple to use, provide greater flexibility in choice of configuration, and may actually be a more cost-effective option for typical usage profiles.

  4. Accelerating phylogenetics computing on the desktop: experiments with executing UPGMA in programmable logic.

    Science.gov (United States)

    Davis, J P; Akella, S; Waddell, P H

    2004-01-01

    Having greater computational power on the desktop for processing taxa data sets has been a dream of biologists and statisticians involved in phylogenetics data analysis. Many existing algorithms have been highly optimized - one example being Felsenstein's PHYLIP code, written in C, for UPGMA and neighbor-joining algorithms. However, the ability to process more than a few tens of taxa in a reasonable amount of time using conventional computers has not yielded a satisfactory speedup in data processing, making it difficult for phylogenetics practitioners to quickly explore data sets - such as might be done from a laptop computer. We discuss the application of custom computing techniques to phylogenetics. In particular, we apply this technology to speed up UPGMA algorithm execution by a factor of a hundred over that of PHYLIP code running on the same PC. We report on these experiments and discuss how custom computing techniques can be used not only to accelerate phylogenetics algorithm performance on the desktop, but also on larger, high-performance computing engines, thus enabling the high-speed processing of data sets involving thousands of taxa.
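
    For reference, UPGMA itself is short; a self-contained sketch (illustrative Python, not the PHYLIP C code or the programmable-logic implementation discussed above):

    ```python
    # UPGMA: repeatedly merge the two closest clusters, averaging distances
    # weighted by cluster size, and emit a Newick-style tree string.
    def upgma(names, dmat):
        """names: taxon labels; dmat: {(i, j): distance} for i < j."""
        dist = {frozenset((names[i], names[j])): v for (i, j), v in dmat.items()}
        size = {n: 1 for n in names}
        while len(size) > 1:
            pair = min(dist, key=dist.get)      # closest pair of clusters
            a, b = tuple(pair)
            merged = f"({a},{b})"
            for c in list(size):
                if c in pair:
                    continue
                dac = dist.pop(frozenset((a, c)))
                dbc = dist.pop(frozenset((b, c)))
                # Size-weighted average distance to the new cluster.
                dist[frozenset((merged, c))] = (
                    dac * size[a] + dbc * size[b]) / (size[a] + size[b])
            del dist[pair]
            size[merged] = size.pop(a) + size.pop(b)
        return next(iter(size))

    names = ["A", "B", "C", "D"]                # toy data (assumed)
    dmat = {(0, 1): 2.0, (0, 2): 6.0, (0, 3): 6.0,
            (1, 2): 6.0, (1, 3): 6.0, (2, 3): 4.0}
    print(upgma(names, dmat))                   # e.g. ((A,B),(C,D))
    ```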

  5. Desktop Publishing for Counselors.

    Science.gov (United States)

    Lucking, Robert; Mitchum, Nancy

    1990-01-01

    Discusses the fundamentals of desktop publishing for counselors, including hardware and software systems and peripherals. Notes by using desktop publishing, counselors can produce their own high-quality documents without the expense of commercial printers. Concludes computers present a way of streamlining the communications of a counseling…

  6. Efficient Sustainable Operation Mechanism of Distributed Desktop Integration Storage Based on Virtualization with Ubiquitous Computing

    Directory of Open Access Journals (Sweden)

    Hyun-Woo Kim

    2015-06-01

    Full Text Available Following the rapid growth of ubiquitous computing, many jobs that were previously manual have now been automated. This automation has increased the amount of time available for leisure, and diverse services are now being developed for this leisure time. In addition, with the development of small and portable devices like smartphones, diverse Internet services can be used regardless of time and place. Studies regarding diverse virtualization are currently in progress, aiming to determine ways to efficiently store and process the big data generated by the multitude of devices and services in use. One topic of such studies is desktop storage virtualization, which integrates distributed desktop resources and provides them to users via virtualization of distributed legacy desktops. In the case of desktop storage virtualization, high availability is necessary and important for providing reliability to users. Studies regarding hierarchical structures and resource integration are currently in progress, aiming to create efficient data distribution and storage for distributed desktops based on resource integration environments. However, studies regarding efficient responses to server faults occurring in desktop-based resource integration environments have been insufficient. This paper proposes a mechanism for the sustainable operation of desktop storage (SODS) for high operational availability. It allows for the easy addition and removal of desktops in desktop-based integration environments, and it activates alternative servers when a fault occurs within a system.

  7. The Point Lepreau Desktop Simulator

    International Nuclear Information System (INIS)

    MacLean, M.; Hogg, J.; Newman, H.

    1997-01-01

    The Point Lepreau Desktop Simulator runs plant process modeling software on a 266 MHz single CPU DEC Alpha computer. This same Alpha also runs the plant control computer software on an SSCI 125 emulator. An adjacent Pentium PC runs the simulator's Instructor Facility software, and communicates with the Alpha through an Ethernet. The Point Lepreau Desktop simulator is constructed to be as similar as possible to the Point Lepreau full scope training simulator. This minimizes total maintenance costs and enhances the benefits of the desktop simulator. Both simulators have the same modeling running on a single CPU in the same schedule of calculations. Both simulators have the same Instructor Facility capable of developing and executing the same lesson plans, doing the same monitoring and control of simulations, inserting all the same malfunctions, performing all the same overrides, capable of making and restoring all the same storepoints. Both simulators run the same plant control computer software - the same assembly language control programs as the power plant uses for reactor control, heat transport control, annunciation, etc. This is a higher degree of similarity between a desktop simulator and a full scope training simulator than previously reported for a computer controlled nuclear plant. The large quantity of control room hardware missing from the desktop simulator is replaced by software. The Instructor Facility panel override software of the training simulator provides the means by which devices (switches, controllers, windows, etc.) on the control room panels can be controlled and monitored in the desktop simulator. The CRT of the Alpha provides a mouse operated DCC keyboard mimic for controlling the plant control computer emulation. Two emulated RAMTEK display channels appear as windows for monitoring anything of interest on plant DCC displays, including one channel for annunciation. (author)

  8. Writing Essays on a Laptop or a Desktop Computer: Does It Matter?

    Science.gov (United States)

    Ling, Guangming; Bridgeman, Brent

    2013-01-01

    To explore the potential effect of computer type on the Test of English as a Foreign Language-Internet-Based Test (TOEFL iBT) Writing Test, a sample of 444 international students was used. The students were randomly assigned to either a laptop or a desktop computer to write two TOEFL iBT practice essays in a simulated testing environment, followed…

  9. Desktop Technology for Newspapers: Use of the Computer Tool.

    Science.gov (United States)

    Wilson, Howard Alan

    This work considers desktop publishing technology as a means of paginating newspapers electronically, tracing the technology's development from the beginning of desktop publishing in the mid-1980s to the 1990s. The work emphasizes how desktop publishing technology is and can be used by weekly newspapers. It reports on a Pennsylvania weekly…

  10. Cloud Computing and Virtual Desktop Infrastructures in Afloat Environments

    OpenAIRE

    Gillette, Stefan E.

    2012-01-01

    The phenomenon of “cloud computing” has become ubiquitous among users of the Internet and many commercial applications. Yet, the U.S. Navy has conducted limited research in this nascent technology. This thesis explores the application and integration of cloud computing both at the shipboard level and in a multi-ship environment. A virtual desktop infrastructure, mirroring a shipboard environment, was built and analyzed in the Cloud Lab at the Naval Postgraduate School, which offers a potentia...

  11. High resolution seismic survey (of the) Rawlins, Wyoming underground coal gasification area. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Youngberg, A.D.; Berkman, E.; Orange, A.S.

    1983-01-01

    In October 1982, a high resolution seismic survey was conducted at the Gulf Research and Development Company's underground coal gasification test site near Rawlins, Wyoming. The objectives of the survey were to utilize high resolution seismic technology to locate and characterize two underground coal burn zones. Seismic data acquisition and processing parameters were specifically designed to emphasize reflections at the shallow depths of interest. A three-dimensional grid of data was obtained over the Rawlins burn zones. Processing included time varying filters, trace composition, and two-dimensional areal stacking of the data in order to identify burn zone anomalies. An anomaly was discernable resulting from the rubble-collapse cavity associated with the burn zone which was studied in detail at the Rawlins 1 and 2 test sites. 21 refs., 20 figs.

  12. The impact of the structural features of the rock mass on seismicity in Polish coal mines

    Science.gov (United States)

    Patyńska, Renata

    2017-11-01

    The article presents seismic activity induced in the coal mines of the Upper Silesian Coal Basin (GZW) in relation to the locations of rockburst occurrence. The comparison of these measurements with the structural features of the rock mass of the coal mines indicates the possibility of estimating the so-called Unitary Energy Expenditure (UEE) in a specific time. The obtained values of UEE were compared with the distribution of seismic activity in GZW mines. The level of seismic activity in the analysed period changed and depended on the intensity of mining works and on diverse mining and geological conditions. Five regions where tremors occurred (Bytom Trough, Main Saddle, Main Trough, Kazimierz Trough, and Jejkowice and Chwałowice Trough), belonging to various structural units of Upper Silesia, were analyzed. It was found that rockbursts were recorded in only three regions: Main Saddle, Bytom Trough, and Jejkowice and Chwałowice Trough.

  13. Detecting voids in a 0.6 m coal seam, 7 m deep, using seismic reflection

    Energy Technology Data Exchange (ETDEWEB)

    Miller, R.D.; Steeples, D.W. (University of Kansas, Lawrence, KS (USA). Kansas Geological Survey)

    1991-07-01

    Surface collapse over abandoned subsurface coal mines is a problem in many parts of the world. High-resolution P-wave reflection seismology was successfully used to evaluate the risk posed by an active sinkhole to a main north-south railroad line in an undermined area of southeastern Kansas, USA. Water-filled cavities responsible for sinkholes in this area are in a 0.6 m thick coal seam, 7 m deep. Dominant reflection frequencies in excess of 200 Hz enabled reflections from the coal seam to be discerned from the direct wave, refractions, air wave, and ground roll on unprocessed field files. Repetitive void sequences within competent coal on three seismic profiles are consistent with the 'room and pillar' mining technique practiced in this area near the turn of the century. The seismic survey showed that the apparent active sinkhole was not the result of reactivated subsidence but probably the result of erosion. 14 refs., 6 figs.

  14. Preliminary report on LLNL mine seismicity deployment at the Twentymile Coal Mine

    International Nuclear Information System (INIS)

    Walter, W.R.; Hunter, S.L.; Glenn, L.A.

    1996-01-01

    This report summarizes the preliminary results of a just completed experiment at the Twentymile Coal Mine, operated by the Cyprus Amax Coal Company near Oak Creek, CO. The purpose of the experiment was to obtain local and regional seismic data from roof caves associated with long-wall mining activities and to use this data to help determine the effectiveness with which these events can be discriminated from underground nuclear explosions under a future Comprehensive Test Ban Treaty

  15. A Five-Year Hedonic Price Breakdown for Desktop Personal Computer Attributes in Brazil

    Directory of Open Access Journals (Sweden)

    Nuno Manoel Martins Dias Fouto

    2009-07-01

    Full Text Available The purpose of this article is to identify the attributes that discriminate the prices of personal desktop computers. We employ the hedonic price method in evaluating such characteristics. This approach allows market prices to be expressed as a function of a set of attributes present in the products and services offered. Prices and characteristics of up to 3,779 desktop personal computers offered in the IT pages of one of the main Brazilian newspapers were collected from January 2003 to December 2007. Several specifications for the hedonic (multivariate linear) regression were tested. In this particular study, the main attributes were found to be hard drive capacity, screen technology, main board brand, random access memory size, microprocessor brand, video board memory, digital video and compact disk recording devices, screen size and microprocessor speed. These results highlight the novel contribution of this study: the manner and means in which hedonic price indexes may be estimated in Brazil.
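
    A minimal sketch of the estimation step, with synthetic data and an assumed, reduced attribute set (the study's attribute list above is richer):

    ```python
    # Hedonic regression: log(price) on attributes; each coefficient is the
    # implicit (hedonic) price of its attribute. Data below are synthetic.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 300
    hdd_gb = rng.choice([40, 80, 160, 250], n)      # hard drive capacity
    ram_mb = rng.choice([256, 512, 1024, 2048], n)  # memory size
    lcd = rng.integers(0, 2, n)                     # 1 if LCD screen
    log_price = (6.0 + 0.0012 * hdd_gb + 0.0004 * ram_mb
                 + 0.15 * lcd + rng.normal(0, 0.1, n))

    X = np.column_stack([np.ones(n), hdd_gb, ram_mb, lcd])
    beta, *_ = np.linalg.lstsq(X, log_price, rcond=None)
    print("recovered implicit prices:", beta[1:])   # near the true values
    ```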

  16. Application of the surface reflection seismic method to shallow coal exploration in the plains of Alberta

    Energy Technology Data Exchange (ETDEWEB)

    Lyatsky, H.V.; Lawton, D.C. (University of Victoria, Victoria, BC (Canada). Dept. of Physics and Astronomy)

    1988-12-01

    A study was done to make a quantitative interpretation of reflection seismic data from the Highvale-Whitewood shallow coal deposit in central Alberta. Results showed that the data are useful in determining coal thickness and stratigraphy as well as geological structure. Reflection character is affected by the nature of the strata surrounding the coal deposit. 22 refs., 1 tab., 23 figs.

  17. Improvements in fast-response flood modeling: desktop parallel computing and domain tracking

    Energy Technology Data Exchange (ETDEWEB)

    Judi, David R [Los Alamos National Laboratory; Mcpherson, Timothy N [Los Alamos National Laboratory; Burian, Steven J [UNIV. OF UTAH

    2009-01-01

    It is becoming increasingly important to have the ability to accurately forecast flooding, as flooding accounts for the most losses due to natural disasters in the world and the United States. Flood inundation modeling has been dominated by one-dimensional approaches. These models are computationally efficient and are considered by many engineers to produce reasonably accurate water surface profiles. However, because the profiles estimated in these models must be superimposed on digital elevation data to create a two-dimensional map, the result may be sensitive to the ability of the elevation data to capture relevant features (e.g. dikes/levees, roads, walls, etc.). Moreover, one-dimensional models do not explicitly represent the complex flow processes present in floodplains and urban environments, and because two-dimensional models based on the shallow water equations have significantly greater ability to determine flow velocity and direction, the National Research Council (NRC) has recommended that two-dimensional models be used over one-dimensional models for flood inundation studies. This paper has shown that two-dimensional flood modeling computational time can be greatly reduced through the use of Java multithreading on multi-core computers, which effectively provides a means for parallel computing on a desktop computer. In addition, this paper has shown that when desktop parallel computing is coupled with a domain tracking algorithm, significant computation time can be eliminated when computations are completed only on inundated cells. The drastic reduction in computational time shown here enhances the ability of two-dimensional flood inundation models to be used as a near-real-time flood forecasting tool, engineering design tool, or planning tool. Perhaps of even greater significance, the reduction in computation time makes the incorporation of risk and uncertainty/ensemble forecasting more feasible for flood inundation modeling (NRC 2000; Sayers et al
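
    A structural sketch of the two ideas, desktop parallelism plus domain tracking, in Python rather than the paper's Java (names and the toy update rule are assumptions; Java threads give true CPU parallelism, which Python threads illustrate only structurally):

    ```python
    # Each step: (1) track the domain - only inundated cells are computed;
    # (2) farm the per-cell updates out to a worker pool.
    from concurrent.futures import ThreadPoolExecutor

    def update_cell(depths, i, ncols):
        """Toy update: relax a wet cell toward the mean of its neighbours."""
        nbrs = [i - 1, i + 1, i - ncols, i + ncols]
        vals = [depths[j] for j in nbrs if 0 <= j < len(depths)]
        return 0.5 * depths[i] + 0.5 * sum(vals) / len(vals)

    def step(depths, ncols, pool):
        wet = [i for i, d in enumerate(depths) if d > 0.0]  # domain tracking
        new_vals = list(pool.map(lambda i: update_cell(depths, i, ncols), wet))
        out = list(depths)
        for i, v in zip(wet, new_vals):
            out[i] = v
        # A real model would also add newly wetted neighbours to the domain.
        return out

    depths = [0.0] * 100
    depths[55] = 2.0  # one flooded cell on a 10 x 10 grid (assumed)
    with ThreadPoolExecutor(max_workers=4) as pool:
        for _ in range(10):
            depths = step(depths, 10, pool)
    print(f"peak depth after 10 steps: {max(depths):.3f}")
    ```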

  18. Seismic proving test of process computer systems with a seismic floor isolation system

    International Nuclear Information System (INIS)

    Fujimoto, S.; Niwa, H.; Kondo, H.

    1995-01-01

    The authors have carried out seismic proving tests for process computer systems as a Nuclear Power Engineering Corporation (NUPEC) project sponsored by the Ministry of International Trade and Industry (MITI). This paper presents the seismic test results for evaluating functional capabilities of process computer systems with a seismic floor isolation system. The seismic floor isolation system to isolate the horizontal motion was composed of a floor frame (13 m x 13 m), ball bearing units, and spring-damper units. A series of seismic excitation tests was carried out using a large-scale shaking table of NUPEC. From the test results, the functional capabilities during large earthquakes of computer systems with a seismic floor isolation system were verified

  19. CRISIS2012: An Updated Tool to Compute Seismic Hazard

    Science.gov (United States)

    Ordaz, M.; Martinelli, F.; Meletti, C.; D'Amico, V.

    2013-05-01

    may be freely developed and integrated without having to recompile the core code. Therefore, users can build new external classes implementing custom GMPM modules by adhering to the programming-interface specification, which is delivered as part of the executable program. On the other hand, generalized attenuation models are non-parametric probabilistic descriptions of the ground motions produced by individual earthquakes with known magnitude and location. In the context of CRISIS, a generalized attenuation model is a collection of probabilistic footprints, one for each of the events considered in the analysis. Each footprint gives the geographical distribution of the intensities produced by the event. CRISIS now permits the inclusion of local site effects in hazard computations. Site effects are given to CRISIS in terms of amplification factors that depend on site location, period, and ground-motion level (in order to account for soil non-linearity). Other additions include enhanced capabilities for logic-tree computations and seismic disaggregation charts, and a new presentation layer, developed for accessing the same functionalities of the desktop version via the web (CRISISWeb). Examples will be presented and the program will be made available to all interested persons.

  20. Nuclear plant analyzer desktop workstation

    International Nuclear Information System (INIS)

    Beelman, R.J.

    1990-01-01

    In 1983 the U.S. Nuclear Regulatory Commission (USNRC) commissioned the Idaho National Engineering Laboratory (INEL) to develop a Nuclear Plant Analyzer (NPA). The NPA was envisioned as a graphical aid to assist reactor safety analysts in comprehending the results of thermal-hydraulic code calculations. The development was to proceed in three distinct phases culminating in a desktop reactor safety workstation. The desktop NPA is now complete. The desktop NPA is a microcomputer based reactor transient simulation, visualization and analysis tool developed at INEL to assist an analyst in evaluating the transient behavior of nuclear power plants by means of graphic displays. The NPA desktop workstation integrates advanced reactor simulation codes with online computer graphics allowing reactor plant transient simulation and graphical presentation of results. The graphics software, written exclusively in ANSI standard C and FORTRAN 77 and implemented over the UNIX/X-windows operating environment, is modular and is designed to interface to the NRC's suite of advanced thermal-hydraulic codes to the extent allowed by that code. Currently, full, interactive, desktop NPA capabilities are realized only with RELAP5

  1. Semantic document architecture for desktop data integration and management

    OpenAIRE

    Nesic, Sasa; Jazayeri, Mehdi

    2011-01-01

    Over the last decade, personal desktops have faced the problem of information overload due to increasing computational power, easy access to the Web and cheap data storage. Moreover, an increasing number of diverse end-user desktop applications have led to the problem of information fragmentation. Each desktop application has its own data, unaware of related and relevant data in other applications. In other words, personal desktops face a lack of interoperability of data managed by differ...

  2. Efficiency using computer simulation of Reverse Threshold Model Theory on assessing a “One Laptop Per Child” computer versus desktop computer

    Directory of Open Access Journals (Sweden)

    Supat Faarungsang

    2017-04-01

    Full Text Available The Reverse Threshold Model Theory (RTMT) was introduced based on limiting factor concepts, but its efficiency compared to the Conventional Model (CM) has not been published. This investigation assessed the efficiency of RTMT compared to CM using computer simulation on a "One Laptop Per Child" computer and a desktop computer. Based on probability values, it was found that RTMT was more efficient than CM among eight treatment combinations, and an earlier study verified that RTMT gives complete elimination of random error. Furthermore, RTMT has several advantages over CM and is therefore proposed to be applied to most research data.

  3. Collection and analysis of environmental radiation data using a desktop computer

    International Nuclear Information System (INIS)

    Gogolak, C.V.

    1982-04-01

    A portable instrumentation system using a Hewlett-Packard HP-9825 desktop computer for the collection and analysis of environmental radiation data is described. Procedures for the transmission of data between the HP-9825 and various nuclear counters are given, together with a description of the necessary hardware and software. Complete programs for the analysis of Ge(Li) and NaI(Tl) gamma-ray spectra, high pressure ionization chamber monitor data, 86 Kr monitor data and air filter sample alpha particle activity measurements are presented. Some utility programs, intended to increase system flexibility, are included

  4. Multicomponent seismic applications in coalbed methane development

    Energy Technology Data Exchange (ETDEWEB)

    Lawton, D.; Trend, S. [Calgary Univ., AB (Canada). Dept. of Geology and Geophysics

    2004-07-01

    Seismic applications for coalbed methane (CBM) development are used to address the following challenges: lateral continuity of coal zones; vertical continuity of coal seams; permeability of cleats and fractures; coal quality and gas content; wet versus dry coal zones; and, monitoring storage of greenhouse gases. This paper presented a brief description of existing seismic programs, including 2-D and 3-D surface seismic surveys; multicomponent seismic surveys; vertical seismic profiles; cross-well seismic surveys; and, time-lapse seismic surveys. A comparative evaluation of their use in the Horseshoe Canyon Formation and the Ardley Formation was presented. The study showed that variations in reservoir properties resulting from gas production and dewatering can be effectively imaged using seismic surveys. Seismic surveys are useful in reservoir management, monitoring sweep efficiency during enhanced natural gas from coal (NGC) production, monitoring disposal of produced water and verifying storage of carbon dioxide for carbon credits. tabs., figs.

  5. Computing on the Desktop: From Batch to Online in Two Large Danish Service Bureaus

    DEFF Research Database (Denmark)

    Jørgensen, Anker Helms

    The advent of the personal computer is often hailed as the major step towards empowering the computer user. This step was indeed significant, but it was preceded by a similar step some 10-15 years earlier: the advent of the video terminal or "glass-TTY". The video terminal invaded the desktop of many white-collar workers and the workplace of many blue-collar workers in the 1970s and 1980s. It replaced batch processing and facilitated direct, interactive access to computing services. This had a considerable impact on working conditions. This paper addresses this transition in two large Danish...

  6. Does It Matter Whether One Takes a Test on an iPad or a Desktop Computer?

    Science.gov (United States)

    Ling, Guangming

    2016-01-01

    To investigate possible iPad related mode effect, we tested 403 8th graders in Indiana, Maryland, and New Jersey under three mode conditions through random assignment: a desktop computer, an iPad alone, and an iPad with an external keyboard. All students had used an iPad or computer for six months or longer. The 2-hour test included reading, math,…

  7. The influence of the mining operation on the mine seismicity of Vorkuta coal deposit

    Science.gov (United States)

    Zmushko, T.; Turuntaev, S. B.; Kulikov, V. I.

    2012-04-01

    The mine seismicity of the Vorkuta coal deposit was analyzed. A seismic network consisting of 24 seismic sensors (accelerometers) covers the area of the "Komsomolskaya" and "North" mines of the Vorkuta deposit. There is also a seismic station of IDG RAS with a three-component seismometer near these mines for better defining the energy of the seismic events. The catalogs of seismic events contain 9000 and 7000 events with maximum magnitude M=2.3 for the "Komsomolskaya" and "North" mines respectively, and cover the period from 01.09.2008 to 01.09.2011. The b-value of the magnitude-frequency relation was -1.0 and -1.15 respectively for the mines, while the b-value for the natural seismicity was -0.9. It was found that the number of seismic events per hour during mine combine operation is 2.5 times higher than the number of seismic events during breaks in the operation. Also, the total energy of the events per hour during operation is 3-5 times higher than during breaks. The study showed that the number and the energy of the seismic events correlate with the hours of mine combine operation. The spatial distribution of the seismic events showed that 80% of all events and 85% of strong events (M>1.6) were located in and near the longwall under development, during mine combine operations as well as during breaks. The isolines of seismic event numbers proved that the direction of motion of the boundary of seismic event extension coincides with the direction of development; the maximum number of events for any period lies within the wall under operation. The rockburst with M=2.3 occurring at the North mine on July 16, 2011 was considered. The dependences of the energy and of the number of events with different magnitudes on time showed that the number of events with M=1 and especially M=0.5 decreased before the rockburst, which corresponds to the prognostic seismic quiescence described in the literature. The spatial distribution of the events for the 6 months before the
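
    The b-values quoted above come from the Gutenberg-Richter relation log10 N(>=M) = a - bM; a hedged sketch of the standard maximum-likelihood estimate (Aki's formula), run on a synthetic catalog since the study's data are not reproduced here:

    ```python
    # b = log10(e) / (mean(M) - Mc), with Mc the completeness magnitude.
    # The catalog below is synthetic; Mc and b_true are assumed values.
    import math
    import random

    random.seed(1)
    Mc = 0.5                                  # completeness magnitude (assumed)
    b_true = 1.0
    # Exponential magnitudes above Mc are equivalent to G-R with b_true.
    mags = [Mc + random.expovariate(b_true * math.log(10)) for _ in range(7000)]

    b_est = math.log10(math.e) / (sum(mags) / len(mags) - Mc)
    print(f"estimated b-value: {b_est:.2f}")  # close to 1.0
    ```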

  8. A desktop PRA

    International Nuclear Information System (INIS)

    Dolan, B.J.; Weber, B.J.

    1989-01-01

    This paper reports that Duke Power Company has completed full-scope PRAs for each of its nuclear stations - Oconee, McGuire and Catawba. These living PRAs are being maintained using desktop personal computers. Duke's PRA group now has powerful personal computer-based tools that have both decreased direct costs (computer analysis expenses) and increased group efficiency (less time to perform analyses). The shorter turnaround time has already resulted in direct savings through analyses provided in support of justification for continued station operation. Such savings are expected to continue with similar future support

  9. Desktop Publishing: A Powerful Tool for Advanced Composition Courses.

    Science.gov (United States)

    Sullivan, Patricia

    1988-01-01

    Examines the advantages of using desktop publishing in advanced writing classes. Explains how desktop publishing can spur creativity, call attention to the interaction between words and pictures, encourage the social dimensions of computing and composing, and provide students with practical skills. (MM)

  10. Large-Strain Monitoring Above a Longwall Coal Mine With GPS and Seismic Measurements

    Science.gov (United States)

    Swanson, P. L.; Andreatta, V.; Meertens, C. M.; Krahenbuhl, T.; Kenner, B.

    2001-12-01

    As part of an effort to evaluate continuous GPS measurements for use in mine safety studies, a joint GPS-seismic experiment was conducted at an underground longwall coal mine near Paonia, Colorado in June, 2001. Seismic and deformation signals were measured using prototype low-cost monitoring systems as a longwall panel was excavated 150 m beneath the site. Data from both seismic and GPS instruments were logged onto low-power PC-104 Linux computers which were networked using a wireless LAN. The seismic system under development at NIOSH/SRL is based on multiple distributed 8-channel 24-bit A/D converters. The GPS system uses a serial single-frequency (L1) receiver and UNAVCO's "Jstream" Java data logging software. For this experiment, a continuously operating dual-frequency GPS receiver was installed 2.4 km away to serve as a reference site. In addition to the continuously operating sites, 10 benchmarks were surveyed daily with short "rapid-static" occupations in order to provide greater spatial sampling. Two single-frequency sites were located 35 meters apart on a relatively steep north-facing slope. As mining progressed from the east, net displacements of 1.2 meters to the north and 1.65 meters of subsidence were observed over a period of 6 days. The east component exhibited up to 0.45 meters of eastward displacement (toward the excavation) followed by reverse movement to the west. This cycle, observed approximately two days earlier at the eastern L1 site, is consistent with a change in surface strain from tension to compression as the excavation front passed underneath. As this strain "wave" propagated across the field site, surface deformation underwent a cycle of tension crack nucleation, crack opening (up to 15 cm normal displacements), subsequent crack closure, and production of low-angle-thrust compressional deformation features. Analysis of seismic results, surface deformation, and additional survey results are presented.

  11. Students' Beliefs about Mobile Devices vs. Desktop Computers in South Korea and the United States

    Science.gov (United States)

    Sung, Eunmo; Mayer, Richard E.

    2012-01-01

    College students in the United States and in South Korea completed a 28-item multidimensional scaling (MDS) questionnaire in which they rated the similarity of 28 pairs of multimedia learning materials on a 10-point scale (e.g., narrated animation on a mobile device vs. movie clip on a desktop computer) and a 56-item semantic differential…

  12. Fab the coming revolution on your desktop : from personal computers to personal fabrication

    CERN Document Server

    Gershenfeld, Neil

    2005-01-01

    What if you could someday put the manufacturing power of an automobile plant on your desktop? According to Neil Gershenfeld, the renowned MIT scientist and inventor, the next big thing is personal fabrication-the ability to design and produce your own products, in your own home, with a machine that combines consumer electronics and industrial tools. Personal fabricators are about to revolutionize the world just as personal computers did a generation ago, and Fab shows us how.

  13. Evaluation of geological conditions for coalbed methane occurrence based on 3D seismic information: a case study in Fowa region, Xinjing coal mine, China

    Science.gov (United States)

    Li, Juanjuan; Li, Fanjia; Hu, Mingshun; Zhang, Wei; Pan, Dongming

    2017-04-01

    The research on geological conditions of coalbed methane (CBM) occurrence is of great significance for predicting high-abundance CBM-rich regions and for pre-warning of gas outburst risk areas. The No. 3 coal seam, in the Yangquan coalfield of the Qinshui basin, is the research target, studied by the 3D seismic exploration technique. The geological factors which affect CBM occurrence are interpreted based on the 3D seismic information. First, the geological structure (faults, folds, and collapse columns) is determined by 3D seismic structural interpretation, and the buried depth and thickness of the coal seam are calculated from the seismic horizons. Second, 3D elastic impedance (EI) and natural gamma attribute volumes are generated by prestack EI inversion and multi-attribute probabilistic neural network (PNN) inversion techniques, which reflect the coal structure types and the lithology of the roof and floor. Then, information on the metamorphic degree of the seam and the hydrogeological conditions can be obtained from the geological data. Consequently, the geological conditions of CBM occurrence in the No. 3 coal seam are evaluated, which will provide a scientific reference for high-abundance CBM-rich region prediction and gas outburst risk area pre-warning.

  14. Empirical Analysis of Server Consolidation and Desktop Virtualization in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Bao Rong Chang

    2013-01-01

    The transition from physical servers to a virtual server infrastructure (VSI) and from desktop devices to a virtual desktop infrastructure (VDI) raises the crucial problems of server consolidation, virtualization performance, virtual machine density, total cost of ownership (TCO), and return on investment (ROI). Moreover, appropriately choosing a hypervisor for the desired server/desktop virtualization is really challenging, because the trade-off between virtualization performance and cost is a hard decision to make in the cloud. This paper introduces five hypervisors to establish the virtual environment and then gives a careful assessment based on a C/P ratio derived from a composite index, consolidation ratio, virtual machine density, TCO, and ROI. As a result, even though ESX Server obtains the highest ROI and lowest TCO in server virtualization and Hyper-V R2 gains the best performance of virtual machine management, both of them cost too much. Instead, the best choice is Proxmox Virtual Environment (Proxmox VE), because it not only greatly reduces the initial investment needed to own a virtual server/desktop infrastructure, but also obtains the lowest C/P ratio.

  15. Assessment of Two Desk-Top Computer Simulations Used to Train Tactical Decision Making (TDM) of Small Unit Infantry Leaders

    National Research Council Canada - National Science Library

    Beal, Scott A

    2007-01-01

    Fifty-two leaders in the Basic Non-Commissioned Officer Course (BNCOC) at Fort Benning, Georgia, participated in an assessment of two desk-top computer simulations used to train tactical decision making...

  16. On the quantitative determination of coal seam thickness by means of in-seam seismic surveys

    Czech Academy of Sciences Publication Activity Database

    Schott, W.; Waclawik, Petr

    2015-01-01

    Vol. 52, No. 10 (2015), p. 1496-1504 ISSN 0008-3674. [International Colloquium on Geomechanics and Geophysics /5./. Karolinka, 25.06.2014-27.06.2014] R&D Projects: GA MŠk ED2.1.00/03.0082; GA MŠk(CZ) LO1406 Institutional support: RVO:68145535 Keywords: in-seam seismic (ISS) * ISS wave * Love wave * coal seam thickness * dispersion Subject RIV: DH - Mining, incl. Coal Mining Impact factor: 1.877, year: 2015 http://www.nrcresearchpress.com/doi/full/10.1139/cgj-2014-0466#.VgqE1Zc70mt

  17. A Collaborative Digital Pathology System for Multi-Touch Mobile and Desktop Computing Platforms

    KAUST Repository

    Jeong, W.

    2013-06-13

    Collaborative slide image viewing systems are becoming increasingly important in pathology applications such as telepathology and E-learning. Despite rapid advances in computing and imaging technology, current digital pathology systems have limited performance with respect to remote viewing of whole slide images on desktop or mobile computing devices. In this paper we present a novel digital pathology client-server system that supports collaborative viewing of multi-plane whole slide images over standard networks using multi-touch-enabled clients. Our system is built upon a standard HTTP web server and a MySQL database to allow multiple clients to exchange image and metadata concurrently. We introduce a domain-specific image-stack compression method that leverages real-time hardware decoding on mobile devices. It adaptively encodes image stacks in a decorrelated colour space to achieve extremely low bitrates (0.8 bpp) with very low loss of image quality. We evaluate the image quality of our compression method and the performance of our system for diagnosis with an in-depth user study. © 2013 The Eurographics Association and John Wiley & Sons Ltd.

  18. A Collaborative Digital Pathology System for Multi-Touch Mobile and Desktop Computing Platforms

    KAUST Repository

    Jeong, W.; Schneider, J.; Hansen, A.; Lee, M.; Turney, S. G.; Faulkner-Jones, B. E.; Hecht, J. L.; Najarian, R.; Yee, E.; Lichtman, J. W.; Pfister, H.

    2013-01-01

    Collaborative slide image viewing systems are becoming increasingly important in pathology applications such as telepathology and E-learning. Despite rapid advances in computing and imaging technology, current digital pathology systems have limited performance with respect to remote viewing of whole slide images on desktop or mobile computing devices. In this paper we present a novel digital pathology client-server system that supports collaborative viewing of multi-plane whole slide images over standard networks using multi-touch-enabled clients. Our system is built upon a standard HTTP web server and a MySQL database to allow multiple clients to exchange image and metadata concurrently. We introduce a domain-specific image-stack compression method that leverages real-time hardware decoding on mobile devices. It adaptively encodes image stacks in a decorrelated colour space to achieve extremely low bitrates (0.8 bpp) with very low loss of image quality. We evaluate the image quality of our compression method and the performance of our system for diagnosis with an in-depth user study. © 2013 The Eurographics Association and John Wiley & Sons Ltd.

  19. Installation of a digital, wireless, strong-motion network for monitoring seismic activity in a western Colorado coal mining region

    Energy Technology Data Exchange (ETDEWEB)

    Peter Swanson; Collin Stewart; Wendell Koontz [NIOSH, Spokane, WA (USA). Spokane Research Laboratory

    2007-01-15

    A seismic monitoring network has recently been installed in the North Fork Valley coal mining region of western Colorado as part of a NIOSH mine safety technology transfer project with two longwall coal mine operators. Data recorded with this network will be used to characterize mining-related and natural seismic activity in the vicinity of the mines and to examine potential hazards due to ground shaking near critical structures such as impoundment dams, reservoirs, and steep slopes. Ten triaxial strong-motion accelerometers have been installed on the surface to form the core of a network that covers approximately 250 square kilometers (100 sq. miles) of rugged canyon-mesa terrain. Spread-spectrum radio networks are used to telemeter continuous streams of seismic waveform data to a central location, where they are converted to IP data streams and ported to the Internet for processing, archiving, and analysis. 4 refs.

  20. Semantic Desktop

    Science.gov (United States)

    Sauermann, Leo; Kiesel, Malte; Schumacher, Kinga; Bernardi, Ansgar

    This contribution shows what the workplace of the future could look like and where the Semantic Web opens up new possibilities. To this end, approaches from the areas of the Semantic Web, knowledge representation, desktop applications and visualization are presented which make it possible to reinterpret and reuse a user's existing data. The combination of the Semantic Web with desktop computers brings particular advantages, a paradigm known as the Semantic Desktop. The possibilities for application integration described here are not limited to the desktop, however, but can equally be used in web applications.

  1. Introducing Seismic Tomography with Computational Modeling

    Science.gov (United States)

    Neves, R.; Neves, M. L.; Teodoro, V.

    2011-12-01

    Learning seismic tomography principles and techniques involves advanced physical and computational knowledge. In-depth learning of such computational skills is a difficult cognitive process that requires a strong background in physics, mathematics and computer programming. The corresponding learning environments and pedagogic methodologies should then involve sets of computational modelling activities with computer software systems which allow students to improve their mathematical or programming knowledge and simultaneously focus on learning seismic wave propagation and inverse theory. To reduce the level of cognitive opacity associated with mathematical or programming knowledge, several computer modelling systems have already been developed (Neves & Teodoro, 2010). Among such systems, Modellus is particularly well suited to achieve this goal because it is a domain-general environment for explorative and expressive modelling with the following main advantages: 1) easy and intuitive creation of mathematical models using just standard mathematical notation; 2) simultaneous exploration of images, tables, graphs and object animations; 3) attribution of mathematical properties expressed in the models to animated objects; and finally 4) computation and display of mathematical quantities obtained from the analysis of images and graphs. Here we describe virtual simulations and educational exercises which give students an easy grasp of the fundamentals of seismic tomography. The simulations make the lecture more interactive and allow students to overcome their lack of advanced mathematical or programming knowledge and focus on learning seismological concepts and processes, taking advantage of basic scientific computation methods and tools.
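
    In the same spirit as those exercises, a toy straight-ray travel-time tomography problem can be set up in a few lines; this sketch is ours (not from the paper) and assumes rays of known path length through each cell:

        import numpy as np

        # Path length (m) of each ray through each of three cells.
        lengths = np.array([[1000.0, 1000.0,    0.0],   # ray 1 crosses cells 0-1
                            [   0.0, 1000.0, 1000.0],   # ray 2 crosses cells 1-2
                            [1000.0, 1000.0, 1000.0]])  # ray 3 crosses all cells
        true_slowness = np.array([1/3000, 1/1500, 1/3200])  # s/m; cell 1 is slow
        times = lengths @ true_slowness                      # forward problem

        # Inverse problem: recover cell slownesses from the observed times.
        est, *_ = np.linalg.lstsq(lengths, times, rcond=None)
        print("estimated velocities (m/s):", 1.0 / est)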

  2. Desktop Publishing: Its Impact on Community College Journalism.

    Science.gov (United States)

    Grzywacz-Gray, John; And Others

    1987-01-01

    Illustrates the kinds of copy that can be created on Apple Macintosh computers and laser printers. Shows font and type specification options. Discusses desktop publishing costs, potential problems, and computer compatibility. Considers the use of computers in college journalism in production, graphics, accounting, advertising, and promotion. (AYC)

  3. Desktop Virtualization: Applications and Considerations

    Science.gov (United States)

    Hodgman, Matthew R.

    2013-01-01

    As educational technology continues to rapidly become a vital part of a school district's infrastructure, desktop virtualization promises to provide cost-effective and education-enhancing solutions to school-based computer technology problems in school systems locally and abroad. This article outlines the history of and basic concepts behind…

  4. Computer application in coal preparation industry in China

    Energy Technology Data Exchange (ETDEWEB)

    Lu, M.; Wu, L.; Ni, Q. (China Univ. of Mining and Technology, Xuzhou (China))

    1990-01-01

    This paper describes several packages of microcomputer programs developed for designing and managing coal preparation plants. Three parts are included: the Coal Cleaning Package (CCP), the Coal Preparation Optimization program (CPO) and the Coal Preparation Computer Aided Design system (CPCAD). The functions of CCP are evaluating and predicting coal cleaning results; modelling and optimizing the coal preparation process; and designing and optimizing coal preparation flowsheets. CPO is a nonlinear optimization program that can simulate and optimize the profit of different flowsheets to find the best combination of final products. CPCAD was developed on top of AutoCAD and makes full use of AutoLISP, digitizer menus and AutoCAD commands, combining the functions provided by AutoCAD with the principles used in conventional coal preparation plant design to form a designer-oriented CPCAD system. These packages have proved to be reliable, flexible and easy to learn and use, and are a powerful tool for coal preparation plant design and management. (orig.).

  5. Los Alamos radiation transport code system on desktop computing platforms

    International Nuclear Information System (INIS)

    Briesmeister, J.F.; Brinkley, F.W.; Clark, B.A.; West, J.T.

    1990-01-01

    The Los Alamos Radiation Transport Code System (LARTCS) consists of state-of-the-art Monte Carlo and discrete ordinates transport codes and data libraries. These codes were originally developed many years ago and have undergone continual improvement. With a large initial effort and continued vigilance, the codes are easily portable from one type of hardware to another. The performance of scientific workstations (SWS) has evolved to the point that such platforms can be used routinely to perform sophisticated radiation transport calculations. As personal computer (PC) performance approaches that of the SWS, the hardware options for desktop radiation transport calculations expand considerably. The current status of the radiation transport codes within the LARTCS is described: MCNP, SABRINA, LAHET, ONEDANT, TWODANT, TWOHEX, and ONELD. Specifically, the authors discuss hardware systems on which the codes run and present code performance comparisons for various machines

  6. Analysis of helium-ion scattering with a desktop computer

    Science.gov (United States)

    Butler, J. W.

    1986-04-01

    This paper describes a program, written in an enhanced BASIC language for a desktop computer, for simulating the energy spectra of high-energy helium ions scattered into two concurrent detectors (backward and glancing). The program is designed for 512-channel spectra from samples containing up to 8 elements and 55 user-defined layers. The program is intended to meet the needs of analyses in the materials sciences, such as metallurgy, where more than a few elements may be present, where several elements may be near each other in the periodic table, and where relatively deep structure may be important. These conditions preclude the use of completely automatic procedures for obtaining the sample composition directly from the scattered-ion spectrum. Therefore, efficient methods are needed for entering and editing large amounts of composition data, with many iterations and with much feedback of information from the computer to the user. The internal video screen is used exclusively for verbal and numeric communication between user and computer. The composition matrix is edited on screen with a two-dimensional forms-fill-in text editor and with many automatic procedures, such as doubling the number of layers with appropriate interpolations and extrapolations. The control center of the program is a bank of 10 keys that initiate on-event branching of program flow. The experimental and calculated spectra, including those of individual elements if desired, are displayed on an external color monitor, with an optional inset plot of the depth concentration profiles of the elements in the sample.

  7. A VM-shared desktop virtualization system based on OpenStack

    Science.gov (United States)

    Liu, Xi; Zhu, Mingfa; Xiao, Limin; Jiang, Yuanjie

    2018-04-01

    With the increasing popularity of cloud computing, desktop virtualization has risen in recent years as a branch of virtualization technology. However, existing desktop virtualization systems are mostly designed in a one-to-one mode, in which one VM can be accessed by only one user. Meanwhile, previous desktop virtualization systems perform poorly in terms of response time and cost saving. This paper proposes a novel VM-shared desktop virtualization system based on the OpenStack platform. We modified the connection process and the display-data transmission process of the remote display protocol SPICE to support the VM-shared function. In addition, we propose a server-push display mode to improve the user's interactive experience. The experimental results show that our system performs well in response time and achieves low CPU consumption.

  8. Application of computer graphics to generate coal resources of the Cache coal bed, Recluse geologic model area, Campbell County, Wyoming

    Science.gov (United States)

    Schneider, G.B.; Crowley, S.S.; Carey, M.A.

    1982-01-01

    Low-sulfur subbituminous coal resources have been calculated, using both manual and computer methods, for the Cache coal bed in the Recluse Model Area, which covers the White Tail Butte, Pitch Draw, Recluse, and Homestead Draw SW 7 1/2 minute quadrangles, Campbell County, Wyoming. Approximately 275 coal thickness measurements obtained from drill hole data are evenly distributed throughout the area. The Cache coal and associated beds are in the Paleocene Tongue River Member of the Fort Union Formation. The depth from the surface to the Cache bed ranges from 269 to 1,257 feet. The thickness of the coal is as much as 31 feet, but in places the Cache coal bed is absent. Comparisons between hand-drawn and computer-generated isopach maps show minimal differences. Total coal resources calculated by computer show the bed to contain 2,316 million short tons or about 6.7 percent more than the hand-calculated figure of 2,160 million short tons.

  9. Basics of Desktop Publishing. Teacher Edition.

    Science.gov (United States)

    Beeby, Ellen

    This color-coded teacher's guide contains curriculum materials designed to give students an awareness of various desktop publishing techniques before they determine their computer hardware and software needs. The guide contains six units, each of which includes some or all of the following basic components: objective sheet, suggested activities…

  10. Seismic applications in CBM exploration and development

    Energy Technology Data Exchange (ETDEWEB)

    Richardson, S.E.; Lawton, D.C. [Calgary Univ., AB (Canada)

    2002-07-01

    This PowerPoint presentation reviewed seismic methods, coal seam seismology, seismology and coalbed methane (CBM) development, and time-lapse seismic imaging with reference to numerical modelling and physical testing. The issue of resolution versus detection in various seismic methods was discussed. The thinnest resolvable beds are usually about 1.0 m thick. Coal zones with thin seams can be mapped using seismic reflection, but individual seams are difficult to resolve in field data. In terms of coal seismology, it was noted that seismic surveys make it possible to identify seam thickness, field geometry, subsurface structuring and facies changes. Facies models make it possible to determine the depositional environment, coal type, coal quality and lateral continuity. Some successes in coal seismology include the Cedar Hill and Ferron fields in the San Juan Basin. Numerical modelling methods include digital dipole compressional sonic and density well logs through the Ardley Coal Zone, P-wave synthetic seismograms generated in SYNTH (MATLAB), and the alteration of density/velocity values to create new seismograms. Another numerical method is to take the difference between original and altered seismograms. It was shown that dewatering causes a decrease in velocity of about 20 per cent and a decrease in density of 15 per cent. Changes as small as 5 per cent in reservoir properties can be successfully imaged. It was concluded that the identification of dewatered zones allows for optimal positioning of development wells. Further physical testing will involve wet and dry P-wave velocities, S-wave velocities, and velocity measurements under pressure. 2 tabs., 10 figs.
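
    The time-lapse modelling idea described above (perturb the coal's velocity and density, then difference the seismograms) can be sketched with a Ricker wavelet and a two-interface reflectivity series; the layer values and wavelet frequency are assumptions, not the Ardley log values:

        import numpy as np

        def ricker(f_peak, dt, length=0.128):
            # Zero-phase Ricker wavelet with peak frequency f_peak (Hz).
            t = np.arange(-length / 2, length / 2, dt)
            a = (np.pi * f_peak * t) ** 2
            return (1.0 - 2.0 * a) * np.exp(-a)

        def coal_trace(v_coal, rho_coal, dt=0.001, n=500):
            vel = np.array([3500.0, v_coal, 3600.0])       # m/s
            rho = np.array([2.55, rho_coal, 2.60])         # g/cc
            z = vel * rho                                  # acoustic impedance
            rc = (z[1:] - z[:-1]) / (z[1:] + z[:-1])       # reflection coefficients
            trace = np.zeros(n)
            trace[[200, 215]] = rc                         # assumed two-way times
            return np.convolve(trace, ricker(60.0, dt), mode="same")

        base = coal_trace(2200.0, 1.40)
        # Dewatering-style perturbation: velocity -20 per cent, density -15 per cent.
        monitor = coal_trace(2200.0 * 0.80, 1.40 * 0.85)
        difference = monitor - base                        # time-lapse anomaly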

  11. Development of a 3-dimensional seismic isolation floor for computer systems

    International Nuclear Information System (INIS)

    Kurihara, M.; Shigeta, M.; Nino, T.; Matsuki, T.

    1991-01-01

    In this paper, we investigated the applicability of a seismic isolation floor as a method for protecting computer systems from strong earthquakes, such as computer systems in nuclear power plants. Assuming that the computer system is guaranteed for 250 cm/s² of input acceleration in the horizontal and vertical directions as the seismic performance, the basic design specification of the seismic isolation floor is as follows. Against S1-level earthquakes, the maximum acceleration response of the seismic isolation floor in the horizontal and vertical directions is kept below 250 cm/s² to maintain continuous computer operation. Against S2-level earthquakes, the isolation floor allows large horizontal movement and large displacement of the isolation devices to reduce the acceleration response, although the response is not guaranteed to stay below 250 cm/s². By reducing the acceleration response, however, serious damage to the computer systems is reduced, so that they can be restarted after an earthquake. Usually, seismic isolation floor systems permit 2-dimensional (horizontal) isolation. However, for earthquakes directly beneath the site, which have large vertical components, the vertical acceleration response of such a system is amplified by the lateral vibration of the frame of the isolation floor. Therefore, in this study a 3-dimensional seismic isolation floor, including vertical isolation, was developed. This paper describes 1) the experimental results on the response characteristics of the 3-dimensional seismic isolation floor built as a trial, using a 3-dimensional shaking table, and 2) a comparison of a 2-dimensional analytical model, for motion in one horizontal direction and the vertical direction, with experimental results. (J.P.N.)

  12. GUI program to compute probabilistic seismic hazard analysis

    International Nuclear Information System (INIS)

    Shin, Jin Soo; Chi, H. C.; Cho, J. C.; Park, J. H.; Kim, K. G.; Im, I. S.

    2006-12-01

    The development of a program to compute probabilistic seismic hazard has been completed based on a Graphic User Interface (GUI). The main program consists of three parts: the data input processes, the probabilistic seismic hazard analysis, and the result output processes. The probabilistic seismic hazard analysis needs various input data representing attenuation formulae, the seismic zoning map, and the earthquake event catalog. The input procedures of previous programs, based on a text interface, took much time to prepare the data, and in existing methods the data cannot be checked directly on screen to prevent erroneous input. The new program simplifies the input process and enables the data to be checked graphically in order to minimize artificial errors as far as possible
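
    The core hazard integration such a program performs can be illustrated with a single-source Cornell-type calculation; the source parameters and toy attenuation relation below are illustrative assumptions, not values from the program:

        import numpy as np
        from scipy.stats import norm

        nu, b = 0.05, 1.0                  # annual rate of M >= mmin; GR b-value
        mmin, mmax, r_km = 5.0, 7.5, 30.0  # magnitude range; source distance
        beta = b * np.log(10.0)

        m = np.linspace(mmin, mmax, 200)
        # Truncated-exponential (Gutenberg-Richter) magnitude density.
        fm = beta * np.exp(-beta * (m - mmin)) / (1.0 - np.exp(-beta * (mmax - mmin)))

        def ln_pga_median(m, r):
            # Toy attenuation relation: ln PGA(g) = c0 + c1*m - c2*ln(r + 10).
            return -3.5 + 0.9 * m - 1.2 * np.log(r + 10.0)

        sigma = 0.6                        # aleatory scatter of ln PGA
        for a in (0.05, 0.1, 0.2, 0.4):
            p_exc = norm.sf((np.log(a) - ln_pga_median(m, r_km)) / sigma)
            lam = nu * np.trapz(p_exc * fm, m)   # annual exceedance rate
            print(f"PGA > {a:4.2f} g : {lam:.2e} per year")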

  13. Seismic monitoring of ground caving processes associated with longwall mining of coal

    International Nuclear Information System (INIS)

    Hatherly, P.; Luo, X.; Dixon, R.; McKavanagh, B.

    1997-01-01

    At the Gordonstone Coal Mine in Central Queensland, Australia, a microseismic monitoring study was undertaken to investigate the extent of ground failure caused by longwall mining. Twenty-seven triaxial geophones were deployed in three vertical boreholes, and over a six-week period more than 1200 events were recorded. The seismicity correlated with periods of longwall production and occurred mainly within the 250 m wide mining panel. There was an arcuate zone of activity which extended from behind the face, along the sides of the panel, and up to 70 m ahead of the face in the middle. There was lesser activity to a depth of about 30 m into the floor. The focal mechanisms show that reverse faulting was dominant. The presence of activity and reverse faulting ahead of the face was an unexpected result. However, piezometer readings at the time of the study and subsequent numerical modelling have supported this finding. This was the first detailed microseismic monitoring study of caving in an Australian underground coal mine. 9 refs., 6 figs

  14. Seismic activity prediction using computational intelligence techniques in northern Pakistan

    Science.gov (United States)

    Asim, Khawaja M.; Awais, Muhammad; Martínez-Álvarez, F.; Iqbal, Talat

    2017-10-01

    An earthquake prediction study is carried out for the region of northern Pakistan. The prediction methodology includes interdisciplinary interaction of seismology and computational intelligence. Eight seismic parameters are computed based upon past earthquakes. The predictive ability of these eight seismic parameters is evaluated in terms of information gain, which leads to the selection of six parameters to be used in prediction. Multiple computationally intelligent models have been developed for earthquake prediction using the selected seismic parameters. These models include a feed-forward neural network, recurrent neural network, random forest, multilayer perceptron, radial basis neural network, and support vector machine. The performance of every prediction model is evaluated, and McNemar's statistical test is applied to assess the statistical significance of the computational methodologies. The feed-forward neural network shows statistically significant predictions along with an accuracy of 75% and a positive predictive value of 78% in the context of northern Pakistan.
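
    The selection-then-prediction workflow can be sketched with scikit-learn; the eight "seismic parameters" here are random stand-ins, not the published ones, so this only mirrors the shape of the method:

        import numpy as np
        from sklearn.feature_selection import mutual_info_classif
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 8))        # 8 candidate seismic parameters
        y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=500)) > 0

        gain = mutual_info_classif(X, y, random_state=0)   # information-gain proxy
        keep = np.argsort(gain)[-6:]                       # retain 6 parameters
        clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
        clf.fit(X[:400, keep], y[:400])                    # feed-forward network
        print("held-out accuracy:", clf.score(X[400:, keep], y[400:]))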

  15. Desktop Genetics

    OpenAIRE

    Hough, Soren H; Ajetunmobi, Ayokunmi; Brody, Leigh; Humphryes-Kirilov, Neil; Perello, Edward

    2016-01-01

    Desktop Genetics is a bioinformatics company building a gene-editing platform for personalized medicine. The company works with scientists around the world to design and execute state-of-the-art clustered regularly interspaced short palindromic repeats (CRISPR) experiments. Desktop Genetics feeds the lessons learned about experimental intent, single-guide RNA design and data from international genomics projects into a novel CRISPR artificial intelligence system. We believe that machine learning techniques can transform this information into a cognitive therapeutic development tool that will revolutionize medicine.

  16. Turbulence Visualization at the Terascale on Desktop PCs

    KAUST Repository

    Treib, M.; Burger, K.; Reichl, F.; Meneveau, C.; Szalay, A.; Westermann, R.

    2012-01-01

    Turbulence visualization at the terascale is challenging on desktop computers. This is due to the extreme resolution of such fields, requiring memory and bandwidth capacities going beyond what is currently available. To overcome these limitations, we present a GPU system for feature-based turbulence visualization.

  17. Development of an automated desktop procedure for defining macro ...

    African Journals Online (AJOL)

    2006-07-03

    …'break points' such as… An automated desktop procedure was developed for computing statistically defensible, multiple change… from source to mouth… the calculated value was less than the test statistic given in Owen.

  18. Desktop Virtualization in Action: Simplicity Is Power

    Science.gov (United States)

    Fennell, Dustin

    2010-01-01

    Discover how your institution can better manage and increase access to instructional applications and desktops while providing a blended learning environment. Receive practical insight into how academic computing virtualization can be leveraged to enhance education at your institution while lowering Total Cost of Ownership (TCO) and reducing the…

  19. Desktop Publishing: A New Frontier for Instructional Technologists.

    Science.gov (United States)

    Bell, Norman T.; Warner, James W.

    1986-01-01

    Discusses new possibilities that computers and laser printers offer instructional technologists. Includes a brief history of printed communications, a description of new technological advances referred to as "desktop publishing," and suggests the application of this technology to instructional tasks. (TW)

  20. Desktop Publishing: A Brave New World and Publishing from the Desktop.

    Science.gov (United States)

    Lormand, Robert; Rowe, Jane J.

    1988-01-01

    The first of two articles presents basic selection criteria for desktop publishing software packages, including discussion of expectations, required equipment, training costs, publication size, desired software features, additional equipment needed, and quality control. The second provides a brief description of desktop publishing using the Apple…

  1. Investigation Methodology of a Virtual Desktop Infrastructure for IoT

    Directory of Open Access Journals (Sweden)

    Doowon Jeong

    2015-01-01

    Cloud computing for IoT (Internet of Things) has exhibited the greatest growth in the IT market in the recent past, and this trend is expected to continue. Many companies are adopting a virtual desktop infrastructure (VDI) for private cloud computing to reduce costs and enhance the efficiency of their servers. As VDI becomes widely used, threats of cyber terror and intrusion also increase. To minimize the damage, response procedures for cyber intrusions on a VDI should be systematized. Therefore, we propose an investigation methodology for VDI solutions in this paper. Here we focus on virtual desktop infrastructure and introduce various desktop virtualization solutions that are widely used, such as VMware, Citrix, and Microsoft. In addition, we verify the integrity of the acquired data so that the results of our proposed methodology are acceptable as evidence in a court of law. During the experiment, we observed an error: one of the commonly used digital forensic tools failed to mount a dynamically allocated virtual disk properly.

  2. Efficient Redundancy Techniques in Cloud and Desktop Grid Systems using MAP/G/c-type Queues

    Science.gov (United States)

    Chakravarthy, Srinivas R.; Rumyantsev, Alexander

    2018-03-01

    Cloud computing is continuing to prove its flexibility and versatility in helping industries and businesses as well as academia as a way of providing needed computing capacity. As an important alternative to cloud computing, desktop grids make it possible to utilize the idle computer resources of an enterprise or community by means of a distributed computing system, providing a more secure and controllable environment with lower operational expenses. Further, both cloud computing and desktop grids are meant to optimize limited resources and at the same time to decrease the expected latency for users. The crucial parameter for optimization in both cloud computing and desktop grids is the level of redundancy (replication) for service requests/workunits. In this paper we study optimal replication policies by considering three variations of fork-join systems in the context of a multi-server queueing system with a versatile point process for the arrivals. For services we consider phase-type distributions as well as shifted exponential and Weibull distributions. We use both analytical and simulation approaches in our analysis and report some interesting qualitative results.

  3. Efficient Redundancy Techniques in Cloud and Desktop Grid Systems using MAP/G/c-type Queues

    Directory of Open Access Journals (Sweden)

    Chakravarthy Srinivas R.

    2018-03-01

    Cloud computing is continuing to prove its flexibility and versatility in helping industries and businesses as well as academia as a way of providing needed computing capacity. As an important alternative to cloud computing, desktop grids make it possible to utilize the idle computer resources of an enterprise or community by means of a distributed computing system, providing a more secure and controllable environment with lower operational expenses. Further, both cloud computing and desktop grids are meant to optimize limited resources and at the same time to decrease the expected latency for users. The crucial parameter for optimization in both cloud computing and desktop grids is the level of redundancy (replication) for service requests/workunits. In this paper we study optimal replication policies by considering three variations of fork-join systems in the context of a multi-server queueing system with a versatile point process for the arrivals. For services we consider phase-type distributions as well as shifted exponential and Weibull distributions. We use both analytical and simulation approaches in our analysis and report some interesting qualitative results.
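
    The latency effect of replication in a fork-join system can be illustrated with a small Monte Carlo sketch (ours, not the paper's analytical model); it uses shifted-exponential service, ignores queueing and cancellation overheads, and completes each work unit when its fastest replica finishes:

        import numpy as np

        rng = np.random.default_rng(1)

        def job_latency(tasks=4, r=2, shift=1.0, rate=1.0, n=100_000):
            # n jobs, each forked into `tasks` units with `r` replicas per unit.
            s = shift + rng.exponential(1.0 / rate, size=(n, tasks, r))
            return s.min(axis=2).max(axis=1)   # min over replicas, max over units

        for r in (1, 2, 3):
            print(f"replicas={r}: mean fork-join latency {job_latency(r=r).mean():.3f}")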

  4. Effects of torpedo blasting on rockburst prevention during deep coal seam mining in the Upper Silesian Coal Basin

    Directory of Open Access Journals (Sweden)

    Ł. Wojtecki

    2017-08-01

    In the Upper Silesian Coal Basin (USCB), coal seams are exploited under progressively more difficult geological and mining conditions (greater depth, higher horizontal stress, more frequent occurrence of competent rock layers, etc.). Mining depth, dislocations and mining remnants in coal seams are the most important factors responsible for the occurrence of rockburst hazards. Longwall mining next to the mining edges of neighbouring coal seams is particularly disadvantageous. The levels of rockburst hazards are minimised via the use of rockburst prevention methods. One active prevention method is torpedo blasting in roof rocks. Torpedo blasts are performed in order to decrease local stress concentrations in rock masses and to fracture the roof rocks to prevent or minimise the impact of high-energy tremors on excavations. The estimation of the effectiveness of torpedo blasting is particularly important when mining under difficult geological and mining conditions. Torpedo blasting is the main form of active rockburst prevention in the assigned colliery in the Polish part of the USCB. The effectiveness of blasting can be estimated using the seismic effect method, in which the seismic monitoring data and the mass of explosives are taken into consideration. The seismic effect method was developed in the Czech Republic and is routinely used in collieries in the Czech part of the coal basin. Now, this method has been adopted for the selected colliery in the Polish part of the coal basin. The effectiveness of torpedo blasts in the faces and galleries of the assigned longwall in coal seam 506 has been estimated. The results show that the effectiveness of torpedo blasting for this longwall was significant in light of the seismic effect method, which corresponds to the in situ observations. The seismic effect method is regularly applied to estimate blasting effectiveness in the selected colliery.
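
    A heavily hedged sketch of the seismic effect calculation: the commonly cited Czech formulation relates the registered seismic energy to the charge mass through a regional constant, SE = E / (K * Q); the constant and numbers below are placeholders, not the colliery's calibration:

        def seismic_effect(energy_j, charge_kg, k_region=2.1):
            # SE >> 1 suggests stress release well beyond the blast itself,
            # i.e. an effective destress (torpedo) blast.
            return energy_j / (k_region * charge_kg)

        print(seismic_effect(energy_j=1.0e5, charge_kg=300.0))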

  5. Improvements in seismic event locations in a deep western U.S. coal mine using tomographic velocity models and an evolutionary search algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Adam Lurka; Peter Swanson [Central Mining Institute, Katowice (Poland)

    2009-09-15

    Methods of improving seismic event locations were investigated as part of a research study aimed at reducing ground control safety hazards. Seismic event waveforms collected with a 23-station three-dimensional sensor array during longwall coal mining provide the data set used in the analyses. A spatially variable seismic velocity model is constructed using seismic event sources in a passive tomographic method. The resulting three-dimensional velocity model is used to relocate seismic event positions. An evolutionary optimization algorithm is implemented and used in both the velocity model development and in seeking improved event location solutions. Results obtained using the different velocity models are compared. The combination of the tomographic velocity model development and evolutionary search algorithm provides improvement to the event locations. 13 refs., 5 figs., 4 tabs.
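
    The location step can be sketched with SciPy's evolutionary optimizer; for brevity this example assumes a homogeneous velocity model rather than the study's 3D tomographic one, and the station geometry and picks are invented:

        import numpy as np
        from scipy.optimize import differential_evolution

        stations = np.array([[0, 0, 0], [800, 0, 10], [0, 900, 20],
                             [700, 800, 0], [400, 450, 30]], dtype=float)  # m
        v = 4000.0                                        # m/s, assumed uniform
        true_src, true_t0 = np.array([350.0, 420.0, -250.0]), 0.12
        picks = true_t0 + np.linalg.norm(stations - true_src, axis=1) / v

        def misfit(p):
            # Sum of squared travel-time residuals for trial (x, y, z, t0).
            x, y, z, t0 = p
            tt = t0 + np.linalg.norm(stations - np.array([x, y, z]), axis=1) / v
            return np.sum((tt - picks) ** 2)

        bounds = [(-100, 1000), (-100, 1000), (-600, 100), (0.0, 1.0)]
        res = differential_evolution(misfit, bounds, seed=0, tol=1e-12)
        print("located at", res.x[:3], "origin time", res.x[3])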

  6. GUI program to compute probabilistic seismic hazard analysis

    International Nuclear Information System (INIS)

    Shin, Jin Soo; Chi, H. C.; Cho, J. C.; Park, J. H.; Kim, K. G.; Im, I. S.

    2005-12-01

    The first stage of development of a program to compute probabilistic seismic hazard has been completed based on a Graphic User Interface (GUI). The main program consists of three parts: the data input processes, the probabilistic seismic hazard analysis, and the result output processes. The first part has been developed in this term; the others are still under development. The probabilistic seismic hazard analysis needs various input data representing attenuation formulae, the seismic zoning map, and the earthquake event catalog. The input procedures of previous programs, based on a text interface, took much time to prepare the data, and in existing methods the data cannot be checked directly on screen to prevent erroneous input. The new program simplifies the input process and enables the data to be checked graphically in order to minimize artificial errors as far as possible

  7. Thermoelectric cooling of microelectronic circuits and waste heat electrical power generation in a desktop personal computer

    International Nuclear Information System (INIS)

    Gould, C.A.; Shammas, N.Y.A.; Grainger, S.; Taylor, I.

    2011-01-01

    Thermoelectric cooling and micro-power generation from waste heat within a standard desktop computer have been demonstrated. A thermoelectric test system has been designed and constructed, with typical test results presented for thermoelectric cooling and micro-power generation while the computer executes a number of different applications. A thermoelectric module, operating as a heat pump, can lower the operating temperature of the computer's microprocessor and graphics processor to below ambient conditions. A small amount of electrical power, typically in the microwatt or milliwatt range, can be generated by a thermoelectric module attached to the outside of the computer's standard heat sink assembly when a secondary heat sink is attached to the other side of the thermoelectric module. Maximum electrical power can be generated by the thermoelectric module when a water-cooled heat sink is used as the secondary heat sink, as this produces the greatest temperature difference across the module.
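
    The milliwatt-scale generation reported above is consistent with the standard matched-load estimate P_max = (S * dT)^2 / (4 * R_int); the module values below are typical Bi2Te3 figures, not the paper's measurements:

        def teg_max_power(seebeck_v_per_k, delta_t_k, r_internal_ohm):
            # Open-circuit voltage S*dT; maximum power transfer at matched load.
            v_open = seebeck_v_per_k * delta_t_k
            return v_open ** 2 / (4.0 * r_internal_ohm)

        # e.g. S = 0.05 V/K, 5 K across the module, 3 ohm internal resistance:
        print(f"{teg_max_power(0.05, 5.0, 3.0) * 1e3:.1f} mW")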

  8. Time-Independent Annual Seismic Rates, Based on Faults and Smoothed Seismicity, Computed for Seismic Hazard Assessment in Italy

    Science.gov (United States)

    Murru, M.; Falcone, G.; Taroni, M.; Console, R.

    2017-12-01

    In 2015 the Italian Department of Civil Protection started a project for upgrading the official Italian seismic hazard map (MPS04), inviting the Italian scientific community to participate in a joint effort for its realization. We participated by providing spatially variable time-independent (Poisson) long-term annual occurrence rates of seismic events over the entire Italian territory, considering cells of 0.1° × 0.1° from M4.5 up to M8.1, in magnitude bins of 0.1 units. Our final model was an ensemble of two different models, each with the same weight: the first was produced by a smoothed-seismicity approach, the second using the seismogenic faults. The spatially smoothed seismicity was obtained using the smoothing method introduced by Frankel (1995) applied to the historical and instrumental seismicity. In this approach we adopted a tapered Gutenberg-Richter relation with a b-value fixed to 1 and a corner magnitude estimated from the largest events in the catalogs. For each seismogenic fault provided by the Database of Individual Seismogenic Sources (DISS), we computed the annual rate (for each cell of 0.1° × 0.1°) in magnitude bins of 0.1 units, assuming that the seismic moments of the earthquakes generated by each fault are distributed according to the same tapered Gutenberg-Richter relation as the smoothed-seismicity model. The annual rate for the final model was determined in the following way: if a cell falls within one of the seismic sources, we merge, with equal weight, the rate determined from the seismic moments of the earthquakes generated by the fault and the rate from the smoothed-seismicity model; if instead a cell falls outside any seismic source, we take the rate obtained from the spatially smoothed seismicity. Here we present the final results of our study, to be used for the new Italian seismic hazard map.
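
    The smoothed-seismicity half of the model follows Frankel (1995), whose kernel can be sketched directly; the grid spacing and correlation distance below are illustrative (0.1 degrees is roughly 11 km), not the values adopted for the Italian map:

        import numpy as np

        def frankel_smooth(counts, cell_km=11.1, c_km=50.0):
            # n_i' = sum_j n_j exp(-d_ij^2/c^2) / sum_j exp(-d_ij^2/c^2)
            ny, nx = counts.shape
            yy, xx = np.mgrid[0:ny, 0:nx]
            out = np.zeros_like(counts, dtype=float)
            for i in range(ny):
                for j in range(nx):
                    d2 = ((yy - i) ** 2 + (xx - j) ** 2) * cell_km ** 2
                    w = np.exp(-d2 / c_km ** 2)
                    out[i, j] = np.sum(w * counts) / np.sum(w)
            return out

        counts = np.zeros((20, 20))
        counts[10, 10] = 30                  # toy catalogue: one active cell
        rates = frankel_smooth(counts)       # activity spread over nearby cells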

  9. Desktop Genetics.

    Science.gov (United States)

    Hough, Soren H; Ajetunmobi, Ayokunmi; Brody, Leigh; Humphryes-Kirilov, Neil; Perello, Edward

    2016-11-01

    Desktop Genetics is a bioinformatics company building a gene-editing platform for personalized medicine. The company works with scientists around the world to design and execute state-of-the-art clustered regularly interspaced short palindromic repeats (CRISPR) experiments. Desktop Genetics feeds the lessons learned about experimental intent, single-guide RNA design and data from international genomics projects into a novel CRISPR artificial intelligence system. We believe that machine learning techniques can transform this information into a cognitive therapeutic development tool that will revolutionize medicine.

  10. Reflection seismic investigations of western Canadian coalfields. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Lawton, D.C.; Bertram, M.B.

    1983-03-01

    High-resolution reflection seismic studies using a seisgun surface source were undertaken at four sites in Alberta. The objective of the project was to test the feasibility of the seismic method for the exploration and evaluation of coal deposits in a range of environments in western Canada. At Camrose, coherent reflections from a coal zone 70-110 m below the surface were recorded along a 5 km profile. Variations in reflection amplitude and character were interpreted in terms of two main seams. Channel washouts, faults with throws of 5 m or greater, and effects of differential compaction were resolved. Studies at a foothills site showed that good data can be obtained in structurally disturbed areas with mild deformation. At this site, faults with vertical throws of up to 40 m were delineated. In the mountain region, studies indicated that the seismic method is not appropriate in areas with strong deformation. Deep weathering, variable topography and rapid lateral changes in reflector dip were the main reasons for poor data quality. The seisgun is a threshold seismic source which performs well in areas with a shallow water table and a zone of interest within 350 m of the surface. Its effectiveness decreases dramatically if the overburden is both thick and dry. Careful selection of field geometry and recording parameters is critical. In data processing, important aspects are the careful muting of first breaks and the evaluation of short- and long-wavelength weathering statics corrections. A computer program listing for static correction analysis is included. The seismic method is very appropriate for the evaluation of Plains and Foothills coal deposits in Alberta. It can provide continuous subsurface coverage between drillholes and therefore reduce the density of drillholes required to delineate a prospective area. 29 refs., 33 figs., 2 tabs.

  11. Instant Citrix XenDesktop 5 starter

    CERN Document Server

    Magdy, Mahmoud

    2013-01-01

    Get to grips with a new technology, understand what it is and what it can do for you, and then get to work with the most important features and tasks. This easy-to-follow, hands-on guide shows you how to implement desktop virtualization with real-life cases and step-by-step instructions. It is a tutorial with step-by-step instructions and adequate screenshots for the installation and administration of Citrix XenDesktop. If you are new to XenDesktop or are looking to build your skills in desktop virtualization, this is your step-by-step guide to learning Citrix XenDesktop. For those architects…

  12. Linux Desktop Pocket Guide

    CERN Document Server

    Brickner, David

    2005-01-01

    While Mac OS X garners all the praise from pundits, and Windows XP attracts all the viruses, Linux is quietly being installed on millions of desktops every year. For programmers and system administrators, business users, and educators, desktop Linux is a breath of fresh air and a needed alternative to other operating systems. The Linux Desktop Pocket Guide is your introduction to using Linux on five of the most popular distributions: Fedora, Gentoo, Mandriva, SUSE, and Ubuntu. Despite what you may have heard, using Linux is not all that hard. Firefox and Konqueror can handle all your web browsing…

  13. SERVICE HANDBOOK FOR THE DESKTOP SUPPORT CONTRACT WITH IT DIVISION

    CERN Multimedia

    2000-01-01

    A Desktop Support Contract has been running since January 1999 to offer help to all users at CERN with problems that occur with their desktop computers. The contract is run jointly by the Swedish company WM-data and the Swiss company DCS. The contract comprises the Computing Helpdesk, a General Service for all parts of CERN, and also a Local Service for those divisions and groups that want faster response times and additional help with their specific computer environment. In order to describe what services are being offered, and also to give a better understanding of the structure of the contract, a Service Handbook has been created. The intended audience for the Service Handbook is everyone that uses the contract, i.e. users, managers and also the service staff inside the contract. In the handbook you will find what help you can get from the contract, how to get in touch with the contract, and also what response times you can expect. Since the computer environment at CERN is an ever-changing entity, ...

  14. Evaluation of paranasal sinus mucosa in coal worker's pneumoconiosis - A computed tomographic study

    Energy Technology Data Exchange (ETDEWEB)

    Ozdemir, H.; Altin, R.; Mahmutyazicioglu, K.; Kart, L.; Uzun, L.; Savranlar, A.; Davcanci, H.; Gundogdu, S. [Zonguldak Karaelmas University, Zonguldak (Turkey). School of Medicine

    2004-09-01

    Objective: To evaluate by computed tomographic scanning the paranasal mucosal changes of coal workers with and without pneumoconiosis. Methods: Examination of images and scores from paranasal computed tomographic scans. The study participants were 26 coal workers with pneumoconiosis, 29 coal workers without pneumoconiosis, and 20 controls. All were men. The extent and patterns of inflammatory paranasal sinus disease were evaluated on computed tomographic scans by 2 radiologists using the terminology and definitions of Newman and associates. Results: Interobserver agreement for the presence of abnormalities was good to excellent (K, 0.63-0.89). The mucosal scores of individuals and groups were higher for coal workers than for control subjects. Both scores were significantly higher in the pneumoconiosis group than in the 2 other groups. Conclusions: This study shows that the paranasal sinuses were affected more severely in coal workers than in control subjects; in coal workers with pneumoconiosis, the involvement was most severe. The relationship between coal dust exposure and paranasal mucosal changes needs further study.

  15. Basics of Desktop Publishing. Second Edition.

    Science.gov (United States)

    Beeby, Ellen; Crummett, Jerrie

    This document contains teacher and student materials for a basic course in desktop publishing. Six units of instruction cover the following: (1) introduction to desktop publishing; (2) desktop publishing systems; (3) software; (4) type selection; (5) document design; and (6) layout. The teacher edition contains some or all of the following…

  16. DIaaS: Resource Management System for the Intra-Cloud with On-Premise Desktops

    Directory of Open Access Journals (Sweden)

    Hyun-Woo Kim

    2017-01-01

    Full Text Available Infrastructure as a service with desktops (DIaaS based on the extensible mark-up language (XML is herein proposed to utilize surplus resources. DIaaS is a traditional surplus-resource integrated management technology. It is designed to provide fast work distribution and computing services based on user service requests as well as storage services through desktop-based distributed computing and storage resource integration. DIaaS includes a nondisruptive resource service and an auto-scalable scheme to enhance the availability and scalability of intra-cloud computing resources. A performance evaluation of the proposed scheme measured the clustering performance time for surplus resource utilization. The results showed improvement in computing and storage services in a connection of at least two computers compared to the traditional method for high-availability measurement of nondisruptive services. Furthermore, an artificial server error environment was used to create a clustering delay for computing and storage services and for nondisruptive services. It was compared to the Hadoop distributed file system (HDFS.

  17. Desktop supercomputer: what can it do?

    Science.gov (United States)

    Bogdanov, A.; Degtyarev, A.; Korkhov, V.

    2017-12-01

    The paper addresses the issues of solving complex problems that require supercomputers or multiprocessor clusters, which are now available to most researchers. Efficient distribution of high-performance computing resources according to actual application needs has been a major research topic since high-performance computing (HPC) technologies became widely adopted. At the same time, comfortable and transparent access to these resources has been a key user requirement. In this paper we discuss approaches to building a virtual private supercomputer available at the user's desktop: a virtual computing environment tailored specifically for a target user with a particular target application. We describe and evaluate possibilities for creating the virtual supercomputer based on lightweight virtualization technologies, and analyze the efficiency of our approach compared to traditional methods of HPC resource management.

  18. Choosing the Right Desktop Publisher.

    Science.gov (United States)

    Eiser, Leslie

    1988-01-01

    Investigates the many different desktop publishing packages available today. Lists the steps to desktop publishing. Suggests which package to use with specific hardware available. Compares several packages for IBM, Mac, and Apple II based systems. (MVL)

  19. Multimedia architectures: from desktop systems to portable appliances

    Science.gov (United States)

    Bhaskaran, Vasudev; Konstantinides, Konstantinos; Natarajan, Balas R.

    1997-01-01

    Future desktop and portable computing systems will have as their core an integrated multimedia system. Such a system will seamlessly combine digital video, digital audio, computer animation, text, and graphics. Furthermore, such a system will allow for mixed-media creation, dissemination, and interactive access in real time. Multimedia architectures that need to support these functions have traditionally required special display and processing units for the different media types. This approach tends to be expensive and is inefficient in its use of silicon. Furthermore, such media-specific processing units are unable to cope with the fluid nature of the multimedia market wherein the needs and standards are changing and system manufacturers may demand a single component media engine across a range of products. This constraint has led to a shift towards providing a single-component multimedia specific computing engine that can be integrated easily within desktop systems, tethered consumer appliances, or portable appliances. In this paper, we review some of the recent architectural efforts in developing integrated media systems. We primarily focus on two efforts, namely the evolution of multimedia-capable general purpose processors and a more recent effort in developing single component mixed media co-processors. Design considerations that could facilitate the migration of these technologies to a portable integrated media system also are presented.

  20. A personal computer code for seismic evaluations of nuclear power plant facilities

    International Nuclear Information System (INIS)

    Xu, J.; Graves, H.

    1990-01-01

    A wide range of computer programs and modeling approaches are often used to justify the safety of nuclear power plants. It is often difficult to assess the validity and accuracy of the results submitted by various utilities without developing comparable computer solutions. Taking this into consideration, CARES is designed as an integrated computational system which can perform rapid evaluations of structural behavior and examine the capability of nuclear power plant facilities; thus CARES may be used by the NRC to determine the validity and accuracy of analysis methodologies employed for structural safety evaluations of nuclear power plants. CARES has been designed to operate on a PC, to have a user-friendly input/output interface, and to provide quick turnaround. The CARES program is structured in a modular format. Each module performs a specific type of analysis. The basic modules of the system are associated with capabilities for static, seismic and nonlinear analyses. This paper describes the various features which have been implemented into the Seismic Module of CARES version 1.0. In Section 2 a description of the Seismic Module is provided. The methodologies and computational procedures thus far implemented into the Seismic Module are described in Section 3. Finally, a complete demonstration of the computational capability of CARES in a typical soil-structure interaction analysis is given in Section 4 and conclusions are presented in Section 5. 5 refs., 4 figs
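
    As one concrete building block of such seismic evaluations, a single-degree-of-freedom response spectrum can be computed with Newmark's average-acceleration method; this sketch is generic (the input motion is synthetic noise) and is not code from CARES:

        import numpy as np

        def response_spectrum(ag, dt, periods, zeta=0.05):
            # Pseudo-acceleration spectrum of ground acceleration ag (m/s^2).
            sa = []
            for T in periods:
                wn = 2.0 * np.pi / T
                m, c, k = 1.0, 2.0 * zeta * wn, wn ** 2
                u, v, a = 0.0, 0.0, -ag[0]
                umax = 0.0
                keff = k + 2.0 * c / dt + 4.0 * m / dt ** 2
                for i in range(len(ag) - 1):
                    dp = -(ag[i + 1] - ag[i])
                    dpe = dp + m * (4.0 * v / dt + 2.0 * a) + 2.0 * c * v
                    du = dpe / keff
                    dv = 2.0 * du / dt - 2.0 * v
                    da = 4.0 * du / dt ** 2 - 4.0 * v / dt - 2.0 * a
                    u, v, a = u + du, v + dv, a + da
                    umax = max(umax, abs(u))
                sa.append(wn ** 2 * umax)     # pseudo-acceleration Sa = wn^2 * umax
            return np.array(sa)

        rng = np.random.default_rng(0)
        ag = rng.normal(scale=0.5, size=2000)            # synthetic motion, m/s^2
        spec = response_spectrum(ag, dt=0.01, periods=np.linspace(0.05, 2.0, 40))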

  1. New generation of 3D desktop computer interfaces

    Science.gov (United States)

    Skerjanc, Robert; Pastoor, Siegmund

    1997-05-01

    Today's computer interfaces use 2-D displays showing windows, icons and menus and support mouse interactions for handling programs and data files. The interface metaphor is that of a writing desk with (partly) overlapping sheets of documents placed on its top. Recent advances in the development of 3-D display technology give the opportunity to take the interface concept a radical stage further by breaking the design limits of the desktop metaphor. The major advantage of the envisioned 'application space' is that it offers an additional, immediately perceptible dimension to clearly and constantly visualize the structure and current state of interrelations between documents, videos, application programs and networked systems. In this context, we describe the development of a visual operating system (VOS). Under VOS, applications appear as objects in 3-D space. Users can graphically connect selected objects to enable communication between the respective applications. VOS includes a general concept of visual and object-oriented programming for tasks ranging from low-level programming up to high-level application configuration. In order to enable practical operation in an office or at home for many hours, the system should be very comfortable to use. Since typical 3-D equipment used, e.g., in virtual-reality applications (head-mounted displays, data gloves) is rather cumbersome and straining, we suggest using off-head displays and contact-free interaction techniques. In this article, we introduce an autostereoscopic 3-D display and associated video-based interaction techniques which allow viewpoint-dependent imaging (by head tracking) and visually controlled modification of data objects and links (by gaze tracking, e.g., to pick 3-D objects just by looking at them).

  2. Regional-scale geomechanical impact assessment of underground coal gasification by coupled 3D thermo-mechanical modeling

    Science.gov (United States)

    Otto, Christopher; Kempka, Thomas; Kapusta, Krzysztof; Stańczyk, Krzysztof

    2016-04-01

    Underground coal gasification (UCG) has the potential to increase the world-wide coal reserves by utilization of coal deposits not mineable by conventional methods. The UCG process involves combusting coal in situ to produce a high-calorific synthesis gas, which can be applied for electricity generation or chemical feedstock production. Apart from its high economic potential, UCG may induce site-specific environmental impacts such as fault reactivation, induced seismicity and ground subsidence, potentially leading to groundwater pollution. Changes in overburden hydraulic conductivity resulting from thermo-mechanical effects may introduce migration pathways for UCG contaminants. Due to the financial effort associated with UCG field trials, numerical modeling has become an important methodology for studying coupled processes relevant to UCG performance. Almost all previous UCG studies applied 1D or 2D models, which do not allow prediction of the performance of a commercial-scale UCG operation. Building on our previous findings, which demonstrated that far-field models can be run at higher computational efficiency by using temperature-independent thermo-mechanical parameters, representative coupled simulations based on complex 3D regional-scale models were employed in the present study. For that purpose, a coupled thermo-mechanical 3D model was developed to investigate the environmental impacts of UCG at the regional scale of the Polish Wieczorek mine, located in the Upper Silesian Coal Basin. The model size is 10 km × 10 km × 5 km, with ten dipping lithological layers, a double fault and 25 UCG reactors. Six different numerical simulation scenarios were investigated, considering the transpressive stress regime present in that part of the Upper Silesian Coal Basin. Our simulation results demonstrate that the minimum distance between the UCG reactors is about six times the coal seam thickness, required to avoid hydraulic communication between the single UCG reactors.

  3. Induced seismicity in Carbon and Emery counties, Utah

    Science.gov (United States)

    Brown, Megan R. M.

    Utah is one of the top producers of oil and natural gas in the United States. Over the past 18 years, more than 4.2 billion gallons of wastewater from the petroleum industry have been injected into the Navajo Sandstone, Kayenta Formation, and Wingate Sandstone in two areas in Carbon and Emery Counties, Utah, where seismicity has increased during the same period. In this study, I investigated whether or not wastewater injection is related to the increased seismicity. Previous studies have attributed all of the seismicity in central Utah to coal mining activity. I found that water injection might be a more important cause. In the coal mining area, the seismicity rate increased significantly 1-5 years after the commencement of wastewater injection. The increased seismicity consists almost entirely of earthquakes with magnitudes of less than 3, and is localized in areas that were seismically active prior to the injection. I have established the spatiotemporal correlations between the coal mining activities, the wastewater injection, and the increased seismicity. I used simple groundwater models to estimate the change in pore pressure and evaluate the observed time gap between the start of injection and the onset of the increased seismicity in the areas surrounding the injection wells. To ascertain that the increased seismicity is not a fluctuation of background seismicity, I analyzed the magnitude-frequency relation of these earthquakes and found a clear increase in the b-value following the wastewater injection. I conclude that the marked increase in the seismicity rate in central Utah is induced by both mining activity and wastewater injection, which raised pore pressure along pre-existing faults.
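    The b-value comparison above rests on the Gutenberg-Richter relation log10 N = a - b*M. As a hedged illustration of how such a shift can be estimated, the sketch below applies Aki's maximum-likelihood estimator to two synthetic catalogs split at an assumed injection start date; the catalog arrays, completeness magnitude and target b-values are invented for illustration, not data from this study.

      import numpy as np

      def b_value(mags, m_c):
          # Aki (1965) maximum-likelihood estimate for magnitudes >= completeness m_c
          # (continuous magnitudes assumed, so no magnitude-binning correction)
          m = np.asarray(mags)
          m = m[m >= m_c]
          return np.log10(np.e) / (m.mean() - m_c)

      # Synthetic catalogs: exponentially distributed magnitudes above m_c = 1.0
      # give b ~ 1.0 before injection and b ~ 1.3 after
      rng = np.random.default_rng(0)
      pre = rng.exponential(1 / (1.0 * np.log(10)), 500) + 1.0
      post = rng.exponential(1 / (1.3 * np.log(10)), 500) + 1.0
      print(b_value(pre, 1.0), b_value(post, 1.0))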

  4. CloVR: a virtual machine for automated and portable sequence analysis from the desktop using cloud computing.

    Science.gov (United States)

    Angiuoli, Samuel V; Matalka, Malcolm; Gussman, Aaron; Galens, Kevin; Vangala, Mahesh; Riley, David R; Arze, Cesar; White, James R; White, Owen; Fricke, W Florian

    2011-08-30

    Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom-built virtual machines to distribute pre-packaged, pre-configured software. We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole-genome and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition, CloVR supports the use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. The CloVR VM and associated architecture lowers the barrier of entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high-throughput data processing.

  5. Computer simulation boosts automation in the stockyard

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-04-01

    Today's desktop computers and advanced software keep pace with handling equipment, reaching new heights of sophistication with graphic simulation able to show precisely what is happening, and what could happen, in a coal terminal's stockyard. The article describes an innovative coal terminal nearing completion on the Pacific coast at Lazaro Cardenas in Mexico, called the Petracalco terminal. Here coal is unloaded, stored and fed to the nearby power plant of Pdte Plutarco Elias Calles. The R & D department of the Italian company Techint, Italimpianti, has developed MHATIS, a sophisticated software system for marine terminal management, allowing analysis of performance with the use of graphical animation. Strategies can be tested before being put into practice and likely power station demand can be predicted. The design and operation of the MHATIS system is explained. Other integrated coal handling plants described in the article are the one developed by the then PWH (renamed Krupp Foerdertechnik) of Germany for the Israel Electric Corporation and the installation by the same company of a further bucketwheel for a redesigned coal stockyard at the Port of Hamburg operated by Hansaport. 1 fig., 4 photos.

  6. Desktop Publishing Choices: Making an Appropriate Decision.

    Science.gov (United States)

    Crawford, Walt

    1991-01-01

    Discusses various choices available for desktop publishing systems. Four categories of software are described, including advanced word processing, graphics software, low-end desktop publishing, and mainstream desktop publishing; appropriate hardware is considered; and selection guidelines are offered, including current and future publishing needs,…

  7. Desktop supercomputer: what can it do?

    International Nuclear Information System (INIS)

    Bogdanov, A.; Degtyarev, A.; Korkhov, V.

    2017-01-01

    The paper addresses the issues of solving complex problems that require using supercomputers or multiprocessor clusters, which are available to most researchers nowadays. Efficient distribution of high-performance computing resources according to actual application needs has been a major research topic since high-performance computing (HPC) technologies became widely introduced. At the same time, comfortable and transparent access to these resources has been a key user requirement. In this paper we discuss approaches to building a virtual private supercomputer available at the user's desktop: a virtual computing environment tailored specifically for a target user with a particular target application. We describe and evaluate possibilities for creating the virtual supercomputer based on lightweight virtualization technologies, and analyze the efficiency of our approach compared to traditional methods of HPC resource management.

  8. Multimodal Desktop Interaction: The Face-Object-Gesture-Voice Example

    DEFF Research Database (Denmark)

    Vidakis, Nikolas; Vlasopoulos, Anastasios; Kounalakis, Tsampikos

    2013-01-01

    This paper presents a natural user interface system based on multimodal human computer interaction, which operates as an intermediate module between the user and the operating system. The aim of this work is to demonstrate a multimodal system which gives users the ability to interact with desktop...

  9. SHEAT for PC. A computer code for probabilistic seismic hazard analysis for personal computer, user's manual

    International Nuclear Information System (INIS)

    Yamada, Hiroyuki; Tsutsumi, Hideaki; Ebisawa, Katsumi; Suzuki, Masahide

    2002-03-01

    The SHEAT code developed at the Japan Atomic Energy Research Institute is for probabilistic seismic hazard analysis, one of the tasks needed for the seismic Probabilistic Safety Assessment (PSA) of a nuclear power plant. SHEAT was first developed as a large-sized computer version. A personal computer version was then provided in 2001 to improve the operating efficiency and generality of the code. It is possible to perform earthquake hazard analysis and to display and print results through a Graphical User Interface. With the SHEAT for PC code, seismic hazard, defined as the annual exceedance frequency of earthquake ground motions at various levels of intensity at a given site, is calculated in the following two steps, as with the large-sized computer version. The first is the modeling of earthquake generation around a site. Future earthquake generation (locations, magnitudes and frequencies of postulated earthquakes) is modeled based on historical earthquake records, active fault data and expert judgment. The second is the calculation of the probabilistic seismic hazard at the site. An earthquake ground motion is calculated for each postulated earthquake using an attenuation model, taking into account its standard deviation. The seismic hazard at the site is then calculated by summing the frequencies of ground motions over all the earthquakes. This document is the user's manual of the SHEAT for PC code. It includes: (1) an outline of the code, covering the overall concept, logical process, code structure, data files used and special characteristics of the code; (2) the functions of the subprograms and the analytical models in them; (3) guidance on input and output data; (4) a sample run result; and (5) an operational manual. (author)
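    The two-step computation described above (source modeling, then summation of exceedance frequencies) can be sketched compactly. In the hedged example below, the source list, the attenuation relation and the lognormal scatter are placeholders invented for illustration and are not the models shipped with SHEAT.

      import numpy as np
      from scipy.stats import norm

      # Step 1 -- hypothetical source model: (annual rate, magnitude, distance km)
      sources = [(0.05, 6.5, 30.0), (0.2, 5.5, 15.0), (0.01, 7.0, 60.0)]

      def ln_pga_median(m, r):
          # Placeholder attenuation relation (not a published GMPE): ln PGA in g
          return -3.5 + 0.9 * m - 1.2 * np.log(r + 10.0)

      SIGMA = 0.6  # assumed standard deviation of the attenuation model

      def annual_exceedance(levels):
          # Step 2 -- sum over sources: rate_i * P(ground motion > level | event i)
          lam = np.zeros_like(levels)
          for rate, m, r in sources:
              lam += rate * (1.0 - norm.cdf(np.log(levels), ln_pga_median(m, r), SIGMA))
          return lam

      levels = np.logspace(-2, 0, 20)       # 0.01 g .. 1 g
      hazard_curve = annual_exceedance(levels)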

  10. Determination of Destress Blasting Effectiveness Using Seismic Source Parameters

    Science.gov (United States)

    Wojtecki, Łukasz; Mendecki, Maciej J.; Zuberek, Wacław M.

    2017-12-01

    Underground mining of coal seams in the Upper Silesian Coal Basin is currently performed under difficult geological and mining conditions. The mining depth, dislocations (faults and folds) and mining remnants are responsible for the rockburst hazard to the highest degree. This hazard can be minimized by active rockburst prevention, in which destress blastings play an important role. Destress blastings in coal seams aim to relieve local stress concentrations. These blastings are usually performed from the longwall face to decrease the stress level ahead of the longwall. An accurate estimation of the effectiveness of active rockburst prevention is important when mining under disadvantageous geological and mining conditions, which affect the risk of rockburst. Seismic source parameters characterize the focus of a tremor and may be useful in estimating the destress blasting effects. The investigated destress blastings were performed in coal seam no. 507 during its longwall mining in one of the coal mines in the Upper Silesian Coal Basin under difficult geological and mining conditions. The seismic source parameters of the provoked tremors were calculated. The presented preliminary investigations enable a rapid estimation of destress blasting effectiveness using seismic source parameters, but further analysis under other geological and mining conditions and with other blasting parameters is required.

  11. Negotiation of Meaning in Desktop Videoconferencing-Supported Distance Language Learning

    Science.gov (United States)

    Wang, Yuping

    2006-01-01

    The aim of this research is to reveal the dynamics of focus on form in task completion via videoconferencing. This examination draws on current second language learning theories regarding effective language acquisition, research in Computer Mediated Communication (CMC) and empirical data from an evaluation of desktop videoconferencing-supported…

  12. CloVR: A virtual machine for automated and portable sequence analysis from the desktop using cloud computing

    Science.gov (United States)

    2011-01-01

    Background Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom-built virtual machines to distribute pre-packaged, pre-configured software. Results We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole-genome and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition, CloVR supports the use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. Conclusion The CloVR VM and associated architecture lowers the barrier of entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high-throughput data processing. PMID:21878105

  13. Cloud Computing Services for Seismic Networks

    Science.gov (United States)

    Olson, Michael

    This thesis describes a compositional framework for developing situation awareness applications: applications that provide ongoing information about a user's changing environment. The thesis describes how the framework is used to develop a situation awareness application for earthquakes. The applications are implemented as Cloud computing services connected to sensors and actuators. The architecture and design of the Cloud services are described and measurements of performance metrics are provided. The thesis includes results of experiments on earthquake monitoring conducted over a year. The applications developed by the framework are (1) the CSN---the Community Seismic Network---which uses relatively low-cost sensors deployed by members of the community, and (2) SAF---the Situation Awareness Framework---which integrates data from multiple sources, including the CSN, CISN---the California Integrated Seismic Network, a network consisting of high-quality seismometers deployed carefully by professionals in the CISN organization and spread across Southern California---and prototypes of multi-sensor platforms that include carbon monoxide, methane, dust and radiation sensors.

  14. GTfold: Enabling parallel RNA secondary structure prediction on multi-core desktops

    DEFF Research Database (Denmark)

    Swenson, M Shel; Anderson, Joshua; Ash, Andrew

    2012-01-01

    achieved significant improvements in runtime, but their implementations were not portable from niche high-performance computers or easily accessible to most RNA researchers. With the increasing prevalence of multi-core desktop machines, a new parallel prediction program is needed to take full advantage...

  15. Monte Carlo simulation of electrothermal atomization on a desktop personal computer

    Science.gov (United States)

    Histen, Timothy E.; Güell, Oscar A.; Chavez, Iris A.; Holcombe, James A.

    1996-07-01

    Monte Carlo simulations have been applied to electrothermal atomization (ETA) in a tubular atomizer (e.g. a graphite furnace) because of the complexity of the geometry, heating, molecular interactions, etc. The intensive computation time needed to accurately model ETA has often limited its effective implementation to supercomputers. With the advent of more powerful desktop processors, however, this is no longer the case. A C-based program has been developed that can be used under Windows or DOS. With this program, basic parameters such as furnace dimensions, sample placement and furnace heating, as well as kinetic parameters such as activation energies for desorption and adsorption, can be varied to show the dependence of the absorbance profile on these parameters. Even data such as the time-dependent spatial distribution of analyte inside the furnace can be collected. The DOS version also permits input of external temperature-time data to allow comparison of simulated profiles with experimentally obtained absorbance data. The run-time versions are provided along with the source code. This article is an electronic publication in Spectrochimica Acta Electronica (SAE), the electronic section of Spectrochimica Acta Part B (SAB). The hardcopy text is accompanied by a diskette with a program (PC format), data files and text files.
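    A stripped-down, one-dimensional version of this kind of simulation is sketched below: atoms desorb from the tube wall at a temperature-dependent first-order rate, random-walk along the furnace, and are lost once they pass the tube ends, with the signal taken as the fraction of atoms remaining in the tube. All parameter values (tube length, heating ramp, activation energy, step length) are illustrative assumptions, not the program's defaults.

      import numpy as np

      rng = np.random.default_rng(42)
      L, N, STEPS, DT = 30.0, 20000, 400, 5e-3   # tube length (mm), atoms, steps, s
      EA, NU, R = 200e3, 1e13, 8.314             # J/mol, 1/s, J/(mol K) -- assumed

      def temperature(t):                        # linear heating ramp from 300 K
          return 300.0 + 1000.0 * t

      released = np.zeros(N, dtype=bool)         # desorbed from the wall yet?
      alive = np.ones(N, dtype=bool)             # still inside the tube?
      x = np.full(N, L / 2)                      # sample deposited at tube centre
      signal = []
      for step in range(STEPS):
          k = NU * np.exp(-EA / (R * temperature(step * DT)))  # Arrhenius rate
          released |= rng.random(N) < k * DT                   # stochastic release
          x = np.where(released & alive, x + rng.normal(0, 0.5, N), x)
          alive &= (0 < x) & (x < L)             # atoms past the ends are lost
          signal.append((released & alive).sum() / N)  # absorbance-like profile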

  16. Practical Downloading to Desktop Publishing: Enhancing the Delivery of Information.

    Science.gov (United States)

    Danziger, Pamela N.

    This paper is addressed to librarians and information managers who, as one of the many activities they routinely perform, frequently publish information in such formats as newsletters, manuals, brochures, forms, presentations, or reports. It is argued that desktop publishing--a personal computer-based software package used to generate documents of…

  17. Study of Coal Burst Source Locations in the Velenje Colliery

    Directory of Open Access Journals (Sweden)

    Goran Vižintin

    2016-06-01

    Full Text Available The Velenje coal mine (VCM) is situated on the largest Slovenian coal deposit and in one of the thickest layers of coal known in the world. The thickness of the coal layer causes problems for the efficiency of extraction, since the majority of mining operations take place within the coal layer. The selected longwall coal mining method with its specific geometry, the increasing depth of excavations, changes in the stress state and the natural geomechanical properties of the rocks induce seismic events. Induced seismic events can be caused by caving processes, blasting, or bursts of coal or the surrounding rock. For 2.5D visualization, data on excavations, the ash content and calorific value of coal samples, hanging wall and footwall occurrence, surface subsidence and coal burst source locations were collected. The data and the interpolation methods available in the software package Surfer®12 were statistically analyzed and a Kriging (KRG) interpolation method was chosen. As a result, 2.5D visualizations of coal burst source locations together with the geomechanical properties of coal samples taken at different depths in the coal seam in the VCM were made with the data-visualization packages Surfer®12 and Voxler®3.

  18. Desktop Publishing: Changing Technology, Changing Occupations.

    Science.gov (United States)

    Stanton, Michael

    1991-01-01

    Describes desktop publishing (DTP) and its place in corporations. Lists job titles of those working in desktop publishing and describes DTP as it is taught at secondary and postsecondary levels and by private trainers. (JOW)

  19. Nuclear Plant Analyzer desktop workstation: An integrated interactive simulation, visualization and analysis tool

    International Nuclear Information System (INIS)

    Beelman, R.J.

    1991-01-01

    The advanced, best-estimate, reactor thermal-hydraulic codes were originally developed as mainframe computer applications because of speed, precision, memory and mass storage requirements. However, the productivity of numerical reactor safety analysts has historically been hampered by mainframe dependence due to limited mainframe CPU allocation, accessibility and availability, poor mainframe job throughput, and delays in obtaining and difficulty comprehending printed numerical results. The Nuclear Plant Analyzer (NPA) was originally developed as a mainframe computer-graphics aid for reactor safety analysts in addressing the latter consideration. Rapid advances in microcomputer technology have since enabled the installation and execution of these reactor safety codes on desktop computers thereby eliminating mainframe dependence. The need for a complementary desktop graphics display generation and presentation capability, coupled with the need for software standardization and portability, has motivated the redesign of the NPA as a UNIX/X-Windows application suitable for both mainframe and microcomputer

  20. Promises and Realities of Desktop Publishing.

    Science.gov (United States)

    Thompson, Patricia A.; Craig, Robert L.

    1991-01-01

    Examines the underlying assumptions of the rhetoric of desktop publishing promoters. Suggests four criteria to help educators provide insights into issues and challenges concerning desktop publishing technology that design students will face on the job. (MG)

  1. Designing Design into an Advanced Desktop Publishing Course (A Teaching Tip).

    Science.gov (United States)

    Guthrie, Jim

    1995-01-01

    Describes an advanced desktop publishing course that combines instruction in a few advanced techniques for using software with extensive discussion of such design principles as consistency, proportion, asymmetry, appropriateness, contrast, and color. Describes computer hardware and software, class assignments, problems, and the rationale for such…

  2. Crowd-Sourcing Seismic Data for Education and Research Opportunities with the Quake-Catcher Network

    Science.gov (United States)

    Sumy, D. F.; DeGroot, R. M.; Benthien, M. L.; Cochran, E. S.; Taber, J. J.

    2016-12-01

    The Quake Catcher Network (QCN; quakecatcher.net) uses low cost micro-electro-mechanical system (MEMS) sensors hosted by volunteers to collect seismic data. Volunteers use accelerometers internal to laptop computers, phones, tablets or small (the size of a matchbox) MEMS sensors plugged into desktop computers using a USB connector to collect scientifically useful data. Data are collected and sent to a central server using the Berkeley Open Infrastructure for Network Computing (BOINC) distributed computing software. Since 2008, sensors installed in museums, schools, offices, and residences have collected thousands of earthquake records, including the 2010 M8.8 Maule, Chile, the 2010 M7.1 Darfield, New Zealand, and 2015 M7.8 Gorkha, Nepal earthquakes. In 2016, the QCN in the United States transitioned to the Incorporated Research Institutions for Seismology (IRIS) Consortium and the Southern California Earthquake Center (SCEC), which are facilities funded through the National Science Foundation and the United States Geological Survey, respectively. The transition has allowed for an influx of new ideas and new education related efforts, which include focused installations in several school districts in southern California, on Native American reservations in North Dakota, and in the most seismically active state in the contiguous U.S. - Oklahoma. We present and describe these recent educational opportunities, and highlight how QCN has engaged a wide sector of the public in scientific data collection, particularly through the QCN-EPIcenter Network and NASA Mars InSight teacher programs. QCN provides the public with information and insight into how seismic data are collected, and how researchers use these data to better understand and characterize seismic activity. Lastly, we describe how students use data recorded by QCN sensors installed in their classrooms to explore and investigate felt earthquakes, and look towards the bright future of the network.

  3. Making the Leap to Desktop Publishing.

    Science.gov (United States)

    Schleifer, Neal

    1986-01-01

    Describes one teacher's approach to desktop publishing. Explains how the Macintosh and LaserWriter were used in the publication of a school newspaper. Guidelines are offered to teachers for the establishment of a desktop publishing lab. (ML)

  4. A Cloud-Computing Service for Environmental Geophysics and Seismic Data Processing

    Science.gov (United States)

    Heilmann, B. Z.; Maggi, P.; Piras, A.; Satta, G.; Deidda, G. P.; Bonomi, E.

    2012-04-01

    Cloud computing is establishing itself worldwide as a new high-performance computing paradigm that offers formidable possibilities to industry and science. The cloud-computing portal presented here, part of the Grida3 project, provides an innovative approach to seismic data processing by combining open-source, state-of-the-art processing software and cloud-computing technology, making possible the effective use of distributed computation and data management with administratively distant resources. We substituted the demanding user-side hardware and software requirements with remote access to high-performance grid-computing facilities. As a result, data processing can be done quasi in real time, being ubiquitously controlled via the Internet through a user-friendly web-browser interface. Besides the obvious advantages over locally installed seismic-processing packages, the presented cloud-computing solution creates completely new possibilities for scientific education, collaboration, and the presentation of reproducible results. The web-browser interface of our portal is based on the commercially supported grid portal EnginFrame, an open framework based on Java, XML, and Web Services. We selected the hosted applications with the objective of allowing the construction of typical 2D time-domain seismic-imaging workflows as used for environmental studies and, originally, for hydrocarbon exploration. For data visualization and pre-processing, we chose the free software package Seismic Un*x. We ported tools for trace balancing, amplitude gaining, muting, frequency filtering, dip filtering, deconvolution and rendering, with a customized choice of options, as services onto the cloud-computing portal. For structural imaging and velocity-model building, we developed a grid version of the Common-Reflection-Surface stack, a data-driven imaging method that requires no user interaction at run time, such as manual picking in prestack volumes or velocity spectra. Due to its high level of automation, CRS stacking
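    Of the ported tools, amplitude gaining is the simplest to illustrate. The sketch below is a generic sliding-window automatic gain control, not the Seismic Un*x implementation actually hosted on the portal; the window length and the synthetic trace are arbitrary assumptions.

      import numpy as np

      def agc(trace, window=64, eps=1e-12):
          # Normalise each sample by the local RMS amplitude in a sliding window
          power = np.convolve(trace ** 2, np.ones(window) / window, mode="same")
          return trace / (np.sqrt(power) + eps)

      # Usage: rebalance a synthetic trace whose amplitude decays with time
      t = np.linspace(0.0, 1.0, 1000)
      trace = np.exp(-3.0 * t) * np.sin(2.0 * np.pi * 30.0 * t)
      balanced = agc(trace)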

  5. Desktop publishing and validation of custom near visual acuity charts.

    Science.gov (United States)

    Marran, Lynn; Liu, Lei; Lau, George

    2008-11-01

    Customized visual acuity (VA) assessment is an important part of basic and clinical vision research. Desktop-computer-based distance VA measurements have been utilized and shown to be accurate and reliable, but computer-based near VA measurements have not been attempted, mainly due to the limited spatial resolution of computer monitors. In this paper, we demonstrate how to use desktop publishing to create printed custom near VA charts. We created a set of six near VA charts in a logarithmic progression, 20/20 through 20/63, with multiple lines of the same acuity level, different letter arrangements in each line and a random noise background. This design allowed repeated measures of subjective accommodative amplitude without the potential artifact of familiarity with the optotypes. The background maintained a constant, spatial-frequency-rich peripheral stimulus for accommodation across the six different acuity levels. The paper describes in detail how pixel-wise accurate black-and-white bitmaps of Sloan optotypes were used to create the printed custom VA charts. At all acuity levels, the physical sizes of the printed custom optotypes deviated no more than 0.034 log units from the standard, satisfying the 0.05 log unit ISO criterion we used to demonstrate physical equivalence. Also, at all acuity levels, the log unit differences in the mean target distance at which reliable recognition of letters first occurred for the printed custom optotypes, compared to the standard, were below 0.05, satisfying the 0.05 log unit ISO criterion we used to demonstrate functional equivalence. It is possible to use desktop publishing to create custom near VA charts that are physically and functionally equivalent to standard VA charts produced by a commercial printing process.
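    The geometry behind such a progression is straightforward: a 20/20 optotype subtends 5 arcmin, and each 0.1 log-unit step scales it by a factor of 10^0.1. The sketch below computes physical letter heights for six levels at an assumed 40 cm test distance; the paper's actual chart distances and its 0.034/0.05 log-unit tolerance checks are not reproduced here.

      import math

      # Six acuity levels in 0.1 log-unit steps: 20/20 .. 20/63 (logMAR 0.0 .. 0.5)
      levels = [0.1 * i for i in range(6)]

      def letter_height_mm(logmar, distance_mm=400.0):
          # A 20/20 (logMAR 0) letter subtends 5 arcmin at the test distance
          arcmin = 5.0 * 10 ** logmar
          return 2.0 * distance_mm * math.tan(math.radians(arcmin / 60.0) / 2.0)

      for lm in levels:
          print(f"logMAR {lm:.1f}: {letter_height_mm(lm):.2f} mm at 40 cm")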

  6. Global seismic tomography and modern parallel computers

    Directory of Open Access Journals (Sweden)

    A. Piersanti

    2006-06-01

    Full Text Available Rapid technological progress is providing seismic tomographers with computers of ever-increasing speed and RAM that are not always properly taken advantage of. Large computers with both shared-memory and distributed-memory architectures have made it possible to approach the tomographic inverse problem more accurately. For example, resolution can be quantified from the resolution matrix rather than from checkerboard tests; the covariance matrix can be calculated to evaluate the propagation of errors from data to model parameters; and the L-curve method can be applied to determine a range of acceptable regularization schemes. We show how these exercises can be implemented efficiently on different hardware architectures.
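    As a toy illustration of the L-curve exercise mentioned above, the sketch below solves a damped least-squares (Tikhonov) inversion over a sweep of regularization weights and collects the residual-norm/model-norm pairs whose corner, in log-log coordinates, marks an acceptable scheme. The kernel matrix and noise level are random stand-ins, not a real tomographic system.

      import numpy as np

      rng = np.random.default_rng(7)
      G = rng.normal(size=(80, 40))                # stand-in kernel (ray) matrix
      m_true = rng.normal(size=40)
      d = G @ m_true + 0.5 * rng.normal(size=80)   # noisy synthetic data

      def tikhonov(G, d, alpha):
          # Minimise ||G m - d||^2 + alpha^2 ||m||^2 (damped least squares)
          return np.linalg.solve(G.T @ G + alpha ** 2 * np.eye(G.shape[1]), G.T @ d)

      # L-curve: (residual norm, model norm) for a sweep of damping values
      curve = []
      for alpha in np.logspace(-3, 2, 30):
          m = tikhonov(G, d, alpha)
          curve.append((np.linalg.norm(G @ m - d), np.linalg.norm(m)))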

  7. Effects of torpedo blasting on rockburst prevention during deep coal seam mining in the Upper Silesian Coal Basin

    Czech Academy of Sciences Publication Activity Database

    Wojtecki, Ł.; Koníček, Petr; Schreiber, J.

    2017-01-01

    Roč. 9, č. 4 (2017), s. 694-701 ISSN 1674-7755 Institutional support: RVO:68145535 Keywords : rockburst prevention * torpedo blasting * seismic effect * Upper Silesian Coal Basin (USCB) Subject RIV: DH - Mining, incl. Coal Mining OBOR OECD: Mining and mineral processing http://www.sciencedirect.com/science/article/pii/S1674775517300896

  8. REAL TIME PULVERISED COAL FLOW SOFT SENSOR FOR THERMAL POWER PLANTS USING EVOLUTIONARY COMPUTATION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    B. Raja Singh

    2015-01-01

    Full Text Available The pulverised coal preparation system (coal mills) is the heart of a coal-fired power plant. The complex nature of the milling process, together with the complex interactions between coal quality and mill conditions, leads to immense difficulty in obtaining an effective mathematical model of the milling process. In this paper, the vertical spindle coal mills (bowl mills) that are widely used in coal-fired power plants are considered for model development, and the pulverised fuel flow rate is computed using the model. For the steady-state coal mill model development, plant measurements such as the air-flow rate, the differential pressure across the mill, etc., are considered as inputs/outputs. The mathematical model is derived from an analysis of energy, heat and mass balances. An evolutionary computation technique is adopted to identify the unknown model parameters using on-line plant data. Validation results indicate that this model is accurate enough to represent the whole process of steady-state coal mill dynamics. This coal mill model has been implemented on-line in a 210 MW thermal power plant and the results obtained are compared with plant data. The model is found to be accurate and robust enough for system monitoring in power plants. Therefore, the model can be used for on-line monitoring, fault detection and control to improve the efficiency of combustion.
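    The parameter-identification step can be hedged into a few lines: a simple (mu+lambda) evolution strategy minimising the sum-of-squares error between model output and plant measurements. The linear toy model and the synthetic plant data below merely stand in for the energy, heat and mass-balance equations of the actual mill model.

      import numpy as np

      rng = np.random.default_rng(1)

      def mill_model(params, airflow, dp):
          # Toy static model of fuel flow; k1..k3 are the unknown parameters
          k1, k2, k3 = params
          return k1 * airflow + k2 * dp + k3

      # Hypothetical on-line plant data (true parameters 0.6, 2.0, 5.0)
      airflow = rng.uniform(40, 80, 200)
      dp = rng.uniform(1, 5, 200)
      fuel = 0.6 * airflow + 2.0 * dp + 5.0 + rng.normal(0, 0.5, 200)

      def identify(pop=30, gens=200, sigma=0.5):
          # (mu+lambda) evolution strategy on the sum-of-squares output error
          parents = rng.normal(0, 1, (pop, 3))
          for _ in range(gens):
              children = parents + rng.normal(0, sigma, parents.shape)
              both = np.vstack([parents, children])
              err = [np.sum((mill_model(p, airflow, dp) - fuel) ** 2) for p in both]
              parents = both[np.argsort(err)[:pop]]
              sigma *= 0.99                        # simple step-size decay
          return parents[0]

      print(identify())   # approaches (0.6, 2.0, 5.0)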

  9. Technical Writing Teachers and the Challenges of Desktop Publishing.

    Science.gov (United States)

    Kalmbach, James

    1988-01-01

    Argues that technical writing teachers must understand desktop publishing. Discusses the strengths that technical writing teachers bring to desktop publishing, and the impact desktop publishing will have on technical writing courses and programs. (ARH)

  10. Desktop Publishing for the Gifted/Talented.

    Science.gov (United States)

    Hamilton, Wayne

    1987-01-01

    Examines the nature of desktop publishing and how it can be used in the classroom for gifted/talented students. Characteristics and special needs of such students are identified, and it is argued that desktop publishing addresses those needs, particularly with regard to creativity. Twenty-six references are provided. (MES)

  11. Seismic modelling of shallow coalfields

    Energy Technology Data Exchange (ETDEWEB)

    Lawton, D.C. (University of Calgary, Calgary, Alberta (Canada). Dept. of Geology and Geophysics.)

    1987-01-01

    This study was undertaken in order to determine whether reflection seismic surveys can be used to map stratigraphic and structural detail of shallow Plains-type coal deposits. Two coalfields in central Alberta were used to examine and determine optimum acquisition parameters for reflection seismic surveys in such settings. The study was based on 1-D and 2-D numerical seismic modelling using sonic and density well logs to formulate a layered earth model. Additional objectives were to interpret the reflection seismic data in terms of geologic features in the study area, and to investigate the relationship between vertical resolution and field acquisition geometry. 27 refs., 41 figs.

  12. Desktop Publishing.

    Science.gov (United States)

    Stanley, Milt

    1986-01-01

    Defines desktop publishing, describes microcomputer developments and software tools that make it possible, and discusses its use as an instructional tool to improve writing skills. Reasons why students' work should be published, examples of what to publish, and types of software and hardware to facilitate publishing are reviewed. (MBR)

  13. The Printout: Desktop Publishing in the Classroom.

    Science.gov (United States)

    Balajthy, Ernest; Link, Gordon

    1988-01-01

    Reviews software available to the classroom teacher for desktop publishing and describes specific classroom activities. Suggests using desktop publishing to produce large-print texts for students with limited sight or for primary students. (NH)

  14. CARES (Computer Analysis for Rapid Evaluation of Structures) Version 1.0, seismic module

    International Nuclear Information System (INIS)

    Xu, J.; Philippacopoulas, A.J.; Miller, C.A.; Costantino, C.J.

    1990-07-01

    During FY's 1988 and 1989, Brookhaven National Laboratory (BNL) developed the CARES system (Computer Analysis for Rapid Evaluation of Structures) for the US Nuclear Regulatory Commission (NRC). CARES is a PC software system which has been designed to perform structural response computations similar to those encountered in licensing reviews of nuclear power plant structures. The documentation of the Seismic Module of CARES consists of three volumes. This report is Volume 2 of the three volume documentation of the Seismic Module of CARES and represents the User's Manual. 14 refs

  15. Computer-aided planning of brown coal seam mining in regard to coal quality

    Energy Technology Data Exchange (ETDEWEB)

    Ciesielski, R.; Lehmann, A.; Rabe, H.; Richter, S.

    1988-09-01

    Discusses features of the geologic SORVER software developed at the Freiberg Fuel Institute, GDR. The program processes geologic data from exploratory wells, petrographic characteristics of a coal seam model, technological mining parameters and coal quality requirements of consumers. Brown coal reserves of coking coal, gasification coal, briquetting coal and steam coal are calculated. Vertical seam profiles and maps of seam horizon isolines can be plotted using the program. Coal quality reserves along the surface of mine benches, mining block widths and lengths for excavators, maximum possible production of individual coal qualities by selective mining, and coal quality losses due to mining procedures are determined. The program is regarded as a means of utilizing deposit reserves more efficiently. 5 refs.

  16. Cluster Computing For Real Time Seismic Array Analysis.

    Science.gov (United States)

    Martini, M.; Giudicepietro, F.

    A seismic array is an instrument composed of a dense distribution of seismic sensors that allows measurement of the directional properties of the wavefield (slowness or wavenumber vector) radiated by a seismic source. Over the last years, arrays have been widely used in different fields of seismological research. In particular, they are applied in the investigation of seismic sources on volcanoes, where they can be successfully used for studying the volcanic microtremor and long-period events which are critical for obtaining information on the evolution of volcanic systems. For this reason arrays could be usefully employed for volcano monitoring; however, the huge amount of data produced by this type of instrument and the processing techniques, which are quite time consuming, have limited their potential for this application. In order to favour a direct application of array techniques to continuous volcano monitoring, we designed and built a small PC cluster able to compute in near real time the kinematic properties of the wavefield (slowness or wavenumber vector) produced by a local seismic source. The cluster is composed of 8 dual-processor Intel Pentium-III PCs working at 550 MHz and has 4 gigabytes of RAM. It runs under the Linux operating system. The developed analysis software package is based on the Multiple SIgnal Classification (MUSIC) algorithm and is written in Fortran. The message-passing part is based upon the LAM programming environment package, an open-source implementation of the Message Passing Interface (MPI). The developed software system includes modules devoted to receiving data via the Internet and graphical applications for continuous display of the processing results. The system has been tested with a data set collected during a seismic experiment conducted on Etna in 1999, when two dense seismic arrays were deployed on the northeast and the southeast flanks of the volcano. A real-time continuous acquisition system has been simulated by
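    The core of the MUSIC step is small enough to sketch: the narrowband covariance matrix is eigendecomposed, plane-wave steering vectors are projected against the noise subspace over a grid of trial slownesses, and the pseudo-spectrum peak gives the slowness vector. This is generic textbook MUSIC in Python, not the institute's Fortran/MPI code; the array geometry and signal are synthetic assumptions.

      import numpy as np

      def music_slowness(signals, coords, freq, grid, n_sources=1):
          # signals: (n_sensors, n_samples) complex narrowband data
          # coords: (n_sensors, 2) positions (km); grid: trial slownesses (s/km)
          Rxx = signals @ signals.conj().T / signals.shape[1]  # covariance
          _, v = np.linalg.eigh(Rxx)
          En = v[:, :-n_sources]          # noise subspace (smallest eigenvalues)
          spec = np.empty((grid.size, grid.size))
          for i, sx in enumerate(grid):
              for j, sy in enumerate(grid):
                  a = np.exp(-2j * np.pi * freq * (coords @ np.array([sx, sy])))
                  spec[i, j] = 1.0 / np.linalg.norm(En.conj().T @ a) ** 2
          return spec                     # peak locates the wavefront slowness

      # Synthetic check: plane wave with slowness (0.2, -0.1) s/km at 2 Hz
      rng = np.random.default_rng(3)
      coords = rng.uniform(-1, 1, (5, 2))
      t = np.arange(256) * 0.01
      delays = coords @ np.array([0.2, -0.1])
      sig = np.exp(2j * np.pi * 2.0 * (t[None, :] - delays[:, None]))
      sig += 0.1 * (rng.normal(size=sig.shape) + 1j * rng.normal(size=sig.shape))
      spec = music_slowness(sig, coords, 2.0, np.linspace(-0.5, 0.5, 51))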

  17. Desktop Publishing: The Effects of Computerized Formats on Reading Speed and Comprehension.

    Science.gov (United States)

    Knupfer, Nancy Nelson; McIsaac, Marina Stock

    1989-01-01

    Describes study that was conducted to determine the effects of two electronic text variables used in desktop publishing on undergraduate students' reading speed and comprehension. Research on text variables, graphic design, instructional text design, and computer screen design is discussed, and further studies are suggested. (22 references) (LRW)

  18. SEISRISK II; a computer program for seismic hazard estimation

    Science.gov (United States)

    Bender, Bernice; Perkins, D.M.

    1982-01-01

    The computer program SEISRISK II calculates probabilistic ground motion values for use in seismic hazard mapping. SEISRISK II employs a model that allows earthquakes to occur as points within source zones and as finite-length ruptures along faults. It assumes that earthquake occurrences have a Poisson distribution, that occurrence rates remain constant during the time period considered, that ground motion resulting from an earthquake is a known function of magnitude and distance, that seismically homogeneous source zones are defined, that fault locations are known, that fault rupture lengths depend on magnitude, and that earthquake rates as a function of magnitude are specified for each source. SEISRISK II calculates, for each site on a grid of sites, the level of ground motion that has a specified probability of being exceeded during a given time period. The program was designed to process a large (essentially unlimited) number of sites and sources efficiently and has been used to produce regional and national maps of seismic hazard. It is a substantial revision of an earlier program, SEISRISK I, which was never documented. SEISRISK II runs considerably faster and gives more accurate results than the earlier program and in addition includes rupture length and acceleration variability, which were not contained in the original version. We describe the model and how it is implemented in the computer program and provide a flowchart and listing of the code.
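    The Poisson assumption translates directly into the familiar hazard arithmetic: if a ground-motion level is exceeded at an annual rate lambda, the probability of at least one exceedance in T years is 1 - exp(-lambda*T). A one-function sketch with illustrative numbers (not SEISRISK II output):

      import math

      def prob_exceedance(annual_rate, years):
          # Poisson occurrence: P(at least one exceedance within `years`)
          return 1.0 - math.exp(-annual_rate * years)

      # A ground motion with a 475-year return period over a 50-year exposure
      print(prob_exceedance(1 / 475.0, 50))   # ~0.10, i.e. 10% in 50 years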

  19. The Virtual Desktop: Options and Challenges in Selecting a Secure Desktop Infrastructure Based on Virtualization

    Science.gov (United States)

    2011-10-01

    the virtual desktop environment still functions for the users associated with it. Users can access the virtual desktop through the local network and... desktop virtualization technology can help meet the need for secure information sharing within DND. The... virtualization. It includes an overview of desktop virtualization, including an in-depth examination of two different architectures: the...

  20. Digital Dome versus Desktop Display: Learning Outcome Assessments by Domain Experts

    Science.gov (United States)

    Jacobson, Jeffery

    2013-01-01

    In previous publications, the author reported that students learned about Egyptian architecture and society by playing an educational game based on a virtual representation of a temple. Students played the game in a digital dome or on a standard desktop computer, and (each) then recorded a video tour of the temple. Those who had used the dome…

  1. CARES (Computer Analysis for Rapid Evaluation of Structures) Version 1.0, seismic module

    International Nuclear Information System (INIS)

    Xu, J.; Philippacopoulas, A.J.; Miller, C.A.; Costantino, C.J.

    1990-07-01

    During FY's 1988 and 1989, Brookhaven National Laboratory (BNL) developed the CARES system (Computer Analysis for Rapid Evaluation of Structures) for the US Nuclear Regulatory Commission (NRC). CARES is a PC software system which has been designed to perform structural response computations similar to those encountered in licensing reviews of nuclear power plant structures. The documentation of the Seismic Module of CARES consists of three volumes. This report represents Volume 3 of the three-volume documentation of the Seismic Module of CARES. It presents three sample problems typically encountered in Soil-Structure Interaction analyses. 14 refs., 36 figs., 2 tabs

  2. U.S. origin coking coal in the global market : a seismic shift in the global coal market

    International Nuclear Information System (INIS)

    Thrasher, E.

    2010-01-01

    This presentation discussed conditions in the global coal market and its impact on producers in the United States (U.S). The significant factors include the strong recovery in Asia, the switch from annual benchmark pricing to quarterly pricing, and the return of U.S. origin coking coal as a long-term supply source for Asia. The global recovery in manufacturing is strong in Asia and weak in more mature economies. A shift in trade patterns has occurred in that 4 of the top 10 destinations for U.S. coking coal exports are now in Asia, up from 1 in 2009, and the tonnage increases to these destinations are at unprecedented levels. Demand for U.S. origin coal will continue to increase as the economies in Western Europe improve and the emerging economies in Eastern Europe and South America grow. Looking at the U.S. coking coal supply, high volume type A coal will be used in the domestic market while high volume type B coal will be used for international demand. Government regulatory agencies create an uncertain environment for investments. Geology and the effects of regulatory actions have decreased productivity. An improvement to the supply chain is that lower cost ocean freight lowers the cost of delivered coal. The prices of coking coal have stabilized at levels that support reasonable returns on investment. The seaborne coking coal market has changed with China's shift to being a significant importer. Mine, rail, and port capacity will constrain the ability of producers in the U.S. to export coking coal to some degree. 2 tabs., 13 figs.

  3. VMware Horizon 6 desktop virtualization solutions

    CERN Document Server

    Cartwright, Ryan; Langone, Jason; Leibovici, Andre

    2014-01-01

    If you are a desktop architect, solution provider, end-user consultant, virtualization engineer, or anyone who wants to learn how to plan and design the implementation of a virtual desktop solution based on Horizon 6, then this book is for you. An understanding of VMware vSphere fundamentals coupled with experience in the installation or administration of a VMware environment would be a plus during reading.

  4. Interim report on Tanjung Enim IV coal exploration project. South Arahan area (1998/1999)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-05-01

    The exploration project in Indonesia covered geological mapping, drilling, geophysical logging, underground water pumping tests, vertical seismic profiling (VSP), and a seismic reflection survey. Ten boreholes were drilled. Moreover, coal property analysis, geotechnical rock tests, geochemical analysis, and the like were conducted on core specimens sampled from the boreholes. It was found that there are three main coal beds which extend continuously to the two ends of the synclinal structure. It was also found that there is a 6 m thick coal bed 200 m below the three main coal beds, estimated to yield approximately 6,000 kcal/kg. Coal from two of the three main beds yields about 5,000 kcal/kg and contains little ash and sulfur; coal from the third contains 1.17% sulfur. Summing the coal in all the beds, reserves in the South Arahan area are estimated at approximately 1,054 million tons. (NEDO)

  5. A web-based remote radiation treatment planning system using the remote desktop function of a computer operating system: a preliminary report.

    Science.gov (United States)

    Suzuki, Keishiro; Hirasawa, Yukinori; Yaegashi, Yuji; Miyamoto, Hideki; Shirato, Hiroki

    2009-01-01

    We developed a web-based, remote radiation treatment planning system which allowed staff at an affiliated hospital to obtain support from a fully staffed central institution. Network security was based on a firewall and a virtual private network (VPN). Client computers were installed at a cancer centre, at a university hospital and at a staff home. We remotely operated the treatment planning computer using the Remote Desktop function built in to the Windows operating system. Except for the initial setup of the VPN router, no special knowledge was needed to operate the remote radiation treatment planning system. There was a time lag that seemed to depend on the volume of data traffic on the Internet, but it did not affect smooth operation. The initial cost and running cost of the system were reasonable.

  6. Desktop Publishing in a PC-Based Environment.

    Science.gov (United States)

    Sims, Harold A.

    1987-01-01

    Identifies, considers, and interrelates the functionality of hardware, firmware, and software types; discusses the relationship of input and output devices in the PC-based desktop publishing environment; and reports some of what has been experienced in three years of working intensively in/with desktop publishing devices and solutions. (MES)

  7. CARES (Computer Analysis for Rapid Evaluation of Structures) Version 1.0, seismic module

    International Nuclear Information System (INIS)

    Xu, J.; Philippacopoulas, A.J.; Miller, C.A.; Costantino, C.J.

    1990-07-01

    During FY's 1988 and 1989, Brookhaven National Laboratory (BNL) developed the CARES system (Computer Analysis for Rapid Evaluation of Structures) for the US Nuclear Regulatory Commission (NRC). CARES is a PC software system which has been designed to perform structural response computations similar to those encountered in licensing reviews of nuclear power plant structures. The documentation of the Seismic Module of CARES consists of three volumes. This report represents Volume 1 of the three-volume documentation of the Seismic Module of CARES. It concentrates on the theoretical basis of the system and presents modeling assumptions and limitations as well as solution schemes and algorithms of CARES. 31 refs., 6 figs

  8. Turbulence Visualization at the Terascale on Desktop PCs

    KAUST Repository

    Treib, M.

    2012-12-01

    Despite the ongoing efforts in turbulence research, the universal properties of the turbulence small-scale structure and the relationships between small- and large-scale turbulent motions are not yet fully understood. The visually guided exploration of turbulence features, including the interactive selection and simultaneous visualization of multiple features, can further progress our understanding of turbulence. Accomplishing this task for flow fields in which the full turbulence spectrum is well resolved is challenging on desktop computers. This is due to the extreme resolution of such fields, requiring memory and bandwidth capacities going beyond what is currently available. To overcome these limitations, we present a GPU system for feature-based turbulence visualization that works on a compressed flow field representation. We use a wavelet-based compression scheme including run-length and entropy encoding, which can be decoded on the GPU and embedded into brick-based volume ray-casting. This enables a drastic reduction of the data to be streamed from disk to GPU memory. Our system derives turbulence properties directly from the velocity gradient tensor, and it either renders these properties in turn or generates and renders scalar feature volumes. The quality and efficiency of the system is demonstrated in the visualization of two unsteady turbulence simulations, each comprising a spatio-temporal resolution of 1024^4. On a desktop computer, the system can visualize each time step in 5 seconds, and it achieves about three times this rate for the visualization of a scalar feature volume. © 1995-2012 IEEE.
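    Of the stages in that compression pipeline, run-length encoding is the simplest to show. The sketch below encodes and decodes a run of quantised coefficients; the wavelet transform, the entropy coder and the GPU-side decoding are omitted, and the coefficient list is a made-up example.

      def rle_encode(symbols):
          # Collapse a sequence into (value, count) pairs
          runs = []
          for s in symbols:
              if runs and runs[-1][0] == s:
                  runs[-1][1] += 1
              else:
                  runs.append([s, 1])
          return runs

      def rle_decode(runs):
          return [s for s, n in runs for _ in range(n)]

      # Quantised wavelet coefficients are mostly zero, which is where RLE pays off
      coeffs = [0, 0, 0, 5, 0, 0, -2, -2, 0, 0, 0, 0]
      assert rle_decode(rle_encode(coeffs)) == coeffs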

  9. Automated Data Handling And Instrument Control Using Low-Cost Desktop Computers And An IEEE 488 Compatible Version Of The ODETA V.

    Science.gov (United States)

    van Leunen, J. A. J.; Dreessen, J.

    1984-05-01

    The result of a measurement of the modulation transfer function is only useful as long as it is accompanied by a complete description of all relevant measuring conditions involved. For this reason it is necessary to file a full description of the relevant measuring conditions together with the results. In earlier times some of our results were rendered useless because some of the relevant measuring conditions were accidentally not written down and were forgotten. This was mainly due to the lack of consensus about which measuring conditions had to be filed together with the result of a measurement. One way to secure uniform and complete archiving of measuring conditions and results is to automate the data handling. An attendant advantage of automating the data handling is that it does away with the time-consuming correction of rough measuring results. The automation of the data handling was accomplished with rather cheap desktop computers, which were nevertheless powerful enough to allow us to automate the measurement as well. After automating the data handling we started with automatic collection of rough measurement data. Step by step we extended the automation by letting the desktop computer control more and more of the measuring set-up. At present the desktop computer controls all the electrical and most of the mechanical measuring conditions. Furthermore, it controls and reads the MTF measuring instrument. Focussing and orientation optimization can be fully automatic, semi-automatic or completely manual. MTF measuring results can be collected automatically but they can also be typed in by hand. Due to the automation we are able to implement proper archiving of measuring results together with all necessary measuring conditions. The improved measuring efficiency made it possible to increase the number of routine measurements done in the same time period by an order of magnitude. To our surprise the measuring accuracy also improved by a factor of two. This was due to

  10. Mathematical model for water quality impact assessment and its computer application in coal mine water

    International Nuclear Information System (INIS)

    Sundararajan, M.; Chakraborty, M.K.; Gupta, J.P.; Saxena, N.C.; Dhar, B.B.

    1994-01-01

    This paper presents a mathematical model to assess water quality impacts in coal mines or river systems by an accurate and rational method. An algorithm, flowchart and computer programme have been developed from this model to assess the quality of coal mine water. 3 refs., 2 figs., 2 tabs

  11. Bringing the medical library to the office desktop.

    Science.gov (United States)

    Brown, S R; Decker, G; Pletzke, C J

    1991-01-01

    This demonstration illustrates LRC Remote Computer Services, a dual operating system, multi-protocol system for delivering medical library services to the medical professional's desktop. A working model draws resources from CD-ROM and magnetic media file services, Novell and AppleTalk network protocol suites and gating, LAN and asynchronous (dial-in) access strategies, commercial applications for MS-DOS and Macintosh workstations, and custom user interfaces. The demonstration includes a discussion of issues relevant to the delivery of these services, particularly with respect to maintenance, security, training/support, staffing, software licensing and costs.

  12. MICA: desktop software for comprehensive searching of DNA databases

    Directory of Open Access Journals (Sweden)

    Glick Benjamin S

    2006-10-01

    Full Text Available Background Molecular biologists work with DNA databases that often include entire genomes. A common requirement is to search a DNA database to find exact matches for a nondegenerate or partially degenerate query. The software programs available for such purposes are normally designed to run on remote servers, but an appealing alternative is to work with DNA databases stored on local computers. We describe a desktop software program termed MICA (K-Mer Indexing with Compact Arrays) that allows large DNA databases to be searched efficiently using very little memory. Results MICA rapidly indexes a DNA database. On a Macintosh G5 computer, the complete human genome could be indexed in about 5 minutes. The indexing algorithm recognizes all 15 characters of the DNA alphabet and fully captures the information in any DNA sequence, yet for a typical sequence of length L, the index occupies only about 2L bytes. The index can be searched to return a complete list of exact matches for a nondegenerate or partially degenerate query of any length. A typical search of a long DNA sequence involves reading only a small fraction of the index into memory. As a result, searches are fast even when the available RAM is limited. Conclusion MICA is suitable as a search engine for desktop DNA analysis software.
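    A minimal k-mer index conveys the idea, although MICA's compact-array layout and its 15-character degenerate alphabet are not reproduced here: every k-mer is mapped to its positions, and a query is seeded on its first k-mer and then verified against the sequence. The genome string, k = 4 and the query are toy assumptions.

      from collections import defaultdict

      K = 4  # k-mer length (illustrative; MICA's indexing granularity differs)

      def build_index(genome, k=K):
          # Map every k-mer to the list of positions where it occurs
          index = defaultdict(list)
          for i in range(len(genome) - k + 1):
              index[genome[i:i + k]].append(i)
          return index

      def search(index, genome, query, k=K):
          # Exact-match search: seed on the query's first k-mer, then verify
          return [pos for pos in index.get(query[:k], [])
                  if genome[pos:pos + len(query)] == query]

      genome = "ACGTACGTTAGCACGTACGA"
      idx = build_index(genome)
      print(search(idx, genome, "ACGTACG"))   # [0, 12]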

  13. Critical Analysis of Underground Coal Gasification Models. Part II: Kinetic and Computational Fluid Dynamics Models

    Directory of Open Access Journals (Sweden)

    Alina Żogała

    2014-01-01

    Originality/value: This paper presents the state of the art in the field of coal gasification modeling using kinetic and computational fluid dynamics approaches. The paper also presents the authors' own comparative analysis (concerned with mathematical formulation, input data and parameters, basic assumptions, obtained results, etc.) of the most important models of underground coal gasification.

  14. An efficient implementation of 3D high-resolution imaging for large-scale seismic data with GPU/CPU heterogeneous parallel computing

    Science.gov (United States)

    Xu, Jincheng; Liu, Wei; Wang, Jin; Liu, Linong; Zhang, Jianfeng

    2018-02-01

    De-absorption pre-stack time migration (QPSTM) compensates for the absorption and dispersion of seismic waves by introducing an effective Q parameter, thereby making it an effective tool for 3D, high-resolution imaging of seismic data. Although the optimal aperture obtained via stationary-phase migration reduces the computational cost of 3D QPSTM and yields 3D stationary-phase QPSTM, computational efficiency remains the main problem in producing 3D, high-resolution images from real large-scale seismic data. In the current paper, we propose a division method for large-scale 3D seismic data to optimize the performance of stationary-phase QPSTM on clusters of graphics processing units (GPUs). We then design an imaging-point parallel strategy to achieve optimal parallel computing performance, and adopt an asynchronous double-buffering scheme for multi-stream GPU/CPU parallel computing. Moreover, several key computation and storage optimization strategies based on the compute unified device architecture (CUDA) were adopted to accelerate the 3D stationary-phase QPSTM algorithm. Compared with the initial GPU code, the implementation of the key optimization steps, including thread optimization, shared memory optimization, register optimization and special function units (SFU), greatly improved the efficiency. A numerical example employing real large-scale 3D seismic data showed that our scheme is nearly 80 times faster than the CPU QPSTM algorithm. Our GPU/CPU heterogeneous parallel computing framework significantly reduces the computational cost and facilitates 3D high-resolution imaging for large-scale seismic data.
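    The asynchronous double-buffering idea, i.e. staging the next data block while the current one is being processed, can be sketched on the host side with a two-slot queue. The CUDA streams and the shared-memory/register optimizations of the paper are beyond a short example; the block contents and the migration kernel below are placeholders.

      import threading, queue
      import numpy as np

      def migrate_block(block):
          # Stand-in for the per-block stationary-phase QPSTM kernel (GPU in the paper)
          return np.sqrt(np.abs(block))

      def reader(blocks, q):
          # Producer: stage blocks while the consumer computes on the previous one
          for b in blocks:
              q.put(b)
          q.put(None)                       # sentinel: no more data

      data = [np.random.rand(512, 512) for _ in range(8)]  # hypothetical blocks
      buf = queue.Queue(maxsize=2)          # two slots emulate the double buffer
      threading.Thread(target=reader, args=(data, buf), daemon=True).start()
      images = []
      while (blk := buf.get()) is not None:
          images.append(migrate_block(blk))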

  15. Desktop Publishing as a Learning Resources Service.

    Science.gov (United States)

    Drake, David

    In late 1988, Midland College in Texas implemented a desktop publishing service to produce instructional aids and reduce and complement the workload of the campus print shop. The desktop service was placed in the Media Services Department of the Learning Resource Center (LRC) for three reasons: the LRC was already established as a campus-wide…

  16. Life cycle assessment study of a Chinese desktop personal computer.

    Science.gov (United States)

    Duan, Huabo; Eugster, Martin; Hischier, Roland; Streicher-Porte, Martin; Li, Jinhui

    2009-02-15

    Associated with the tremendous prosperity of the world electronic information and telecommunication industry, there is an increasing awareness of the environmental impacts related to the accelerating mass production, electricity use, and waste management of electronic and electric products (e-products). China's importance as both a consumer and supplier of e-products has grown at an unprecedented pace in the recent decade. Hence, this paper aims to describe the application of life cycle assessment (LCA) to investigate the environmental performance of Chinese e-products from a global level. A desktop personal computer system was selected for a detailed and modular LCA following the ISO 14040 series. The LCA is constructed with SimaPro software version 7.0 and expressed with the Eco-indicator'99 life cycle impact assessment method. For a sensitivity analysis of the overall LCA results, the so-called CML method is used to estimate the influence of the choice of assessment method on the result. Life cycle inventory information is compiled from the ecoinvent 1.3 databases, combined with literature and field investigations of the present Chinese situation. The established LCA study shows that the manufacturing and the use of such devices are of the highest environmental importance. In the manufacturing of such devices, the integrated circuits (ICs) and the Liquid Crystal Display (LCD) are the parts contributing most to the impact. As no other aspects are taken into account during the use phase, its impact is due to the way the electricity is produced. The final process steps--i.e. the end-of-life phase--lead to a clear environmental benefit if a formal and modern, up-to-date technical system is assumed, as in this study.

  17. Life cycle assessment study of a Chinese desktop personal computer

    International Nuclear Information System (INIS)

    Duan Huabo; Eugster, Martin; Hischier, Roland; Streicher-Porte, Martin; Li Jinhui

    2009-01-01

    Associated with the tremendous prosperity of the world electronic information and telecommunication industry, there is an increasing awareness of the environmental impacts related to the accelerating mass production, electricity use, and waste management of electronic and electric products (e-products). China's importance as both a consumer and supplier of e-products has grown at an unprecedented pace in the recent decade. Hence, this paper aims to describe the application of life cycle assessment (LCA) to investigate the environmental performance of Chinese e-products from a global level. A desktop personal computer system was selected for a detailed and modular LCA following the ISO 14040 series. The LCA is constructed with SimaPro software version 7.0 and expressed with the Eco-indicator'99 life cycle impact assessment method. For a sensitivity analysis of the overall LCA results, the so-called CML method is used to estimate the influence of the choice of assessment method on the result. Life cycle inventory information is compiled from the ecoinvent 1.3 databases, combined with literature and field investigations of the present Chinese situation. The established LCA study shows that the manufacturing and the use of such devices are of the highest environmental importance. In the manufacturing of such devices, the integrated circuits (ICs) and the Liquid Crystal Display (LCD) are the parts contributing most to the impact. As no other aspects are taken into account during the use phase, its impact is due to the way the electricity is produced. The final process steps - i.e. the end-of-life phase - lead to a clear environmental benefit if a formal and modern, up-to-date technical system is assumed, as in this study

  18. Computational fluid dynamic simulations of coal-fired utility boilers: An engineering tool

    Energy Technology Data Exchange (ETDEWEB)

    Efim Korytnyi; Roman Saveliev; Miron Perelman; Boris Chudnovsky; Ezra Bar-Ziv [Ben-Gurion University of the Negev, Beer-Sheva (Israel)

    2009-01-15

    The objective of this study was to develop an engineering tool by which the combustion behavior of coals in coal-fired utility boilers can be predicted. We show in this paper that computational fluid dynamic (CFD) codes can successfully predict the performance of - and emissions from - full-scale pulverized-coal utility boilers of various types, provided that the model parameters required for the simulation are properly chosen and validated. For that purpose we developed a methodology combining measurements in a 50 kW pilot-scale test facility with CFD simulations using the same CFD code configured for both test and full-scale furnaces. In this method the model parameters of the coal processes are extracted and validated. This paper presents the importance of validating the model parameters used in CFD codes. Our results show a very good fit of CFD simulations with various parameters measured in a test furnace and several types of utility boilers. The results of this study demonstrate the viability of the present methodology as an effective tool for optimizing coal burning in full-scale utility boilers. 41 refs., 9 figs., 3 tabs.

  19. System Testing of Desktop and Web Applications

    Science.gov (United States)

    Slack, James M.

    2011-01-01

    We want our students to experience system testing of both desktop and web applications, but the cost of professional system-testing tools is far too high. We evaluate several free tools and find that AutoIt makes an ideal educational system-testing tool. We show several examples of desktop and web testing with AutoIt, starting with simple…

  20. The numerical computation of seismic fragility of base-isolated Nuclear Power Plants buildings

    International Nuclear Information System (INIS)

    Perotti, Federico; Domaneschi, Marco; De Grandis, Silvia

    2013-01-01

    Highlights: • Seismic fragility of structural components in base-isolated NPPs is computed. • Dynamic integration, Response Surface, FORM and Monte Carlo Simulation are adopted. • A refined approach for modeling the non-linear behavior of isolators is proposed. • Beyond-design conditions are addressed. • The preliminary design of the isolated IRIS is the application of the procedure. -- Abstract: The research work described here is devoted to the development of a numerical procedure for the computation of seismic fragilities for equipment and structural components in Nuclear Power Plants; in particular, reference is made, in the present paper, to the case of isolated buildings. The proposed procedure for fragility computation makes use of the Response Surface Methodology to model the influence of the random variables on the dynamic response. To account for stochastic loading, the latter is computed by means of a simulation procedure. Given the Response Surface, the Monte Carlo method is used to compute the failure probability. The procedure is applied here to the preliminary design of the Nuclear Power Plant reactor building within the International Reactor Innovative and Secure international project; the building is equipped with a base isolation system based on the introduction of High Damping Rubber Bearing elements showing a markedly non-linear mechanical behavior. The fragility analysis is performed assuming that the isolation devices become the critical elements in terms of seismic risk and that, once base isolation is introduced, the dynamic behavior of the building can be captured by low-dimensional numerical models.
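
    A minimal sketch of the response-surface plus Monte Carlo procedure described above: a quadratic surface is fitted to a handful of expensive "dynamic analyses" (here a stand-in function), and the failure probability is then estimated by sampling the cheap surface. All functions, thresholds and distributions below are illustrative assumptions, not values from the IRIS design study.

```python
# Response surface + Monte Carlo sketch; all values are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def dynamic_analysis(x):
    # Stand-in for one expensive time-history run: returns a demand measure
    # (say, peak isolator displacement) for a vector of random variables x.
    return 1.0 + 0.8 * x[0] + 0.3 * x[1] ** 2

def basis(x2d):
    # Quadratic response surface without cross terms.
    return np.column_stack([np.ones(len(x2d)), x2d, x2d ** 2])

# 1. Fit the surface to a small experimental design of "dynamic analyses".
X = rng.normal(size=(30, 2))
y = np.array([dynamic_analysis(x) for x in X])
coef, *_ = np.linalg.lstsq(basis(X), y, rcond=None)

# 2. Monte Carlo on the cheap surface to estimate the failure probability.
samples = rng.normal(size=(100_000, 2))
demand = basis(samples) @ coef
capacity = 2.5                                  # illustrative capacity limit
print("P(failure) ~", np.mean(demand > capacity))
```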

  1. SMACS: a system of computer programs for probabilistic seismic analysis of structures and subsystems. Volume I. User's manual

    International Nuclear Information System (INIS)

    Maslenikov, O.R.; Johnson, J.J.; Tiong, L.W.; Mraz, M.J.; Bumpus, S.; Gerhard, M.A.

    1985-03-01

    The SMACS (Seismic Methodology Analysis Chain with Statistics) system of computer programs, one of the major computational tools of the Seismic Safety Margins Research Program (SSMRP), links the seismic input with the calculation of soil-structure interaction, major structure response, and subsystem response. The seismic input is defined by ensembles of acceleration time histories in three orthogonal directions. Soil-structure interaction and detailed structural response are then determined simultaneously, using the substructure approach to SSI as implemented in the CLASSI family of computer programs. The modus operandi of SMACS is to perform repeated deterministic analyses, each analysis simulating an earthquake occurrence. Parameter values for each simulation are sampled from assumed probability distributions according to a Latin hypercube experimental design. The user may specify values of the coefficients of variation (COV) for the distributions of the input variables. At the heart of the SMACS system is the computer program SMAX, which performs the repeated SSI response calculations for major structure and subsystem response. This report describes SMAX and the pre- and post-processor codes, used in conjunction with it, that comprise the SMACS system
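
    The Latin hypercube experimental design at the core of this approach can be sketched in a few lines: each variable's range is split into equiprobable strata, one sample is drawn per stratum, and the columns are permuted independently to pair values across variables. The mapping to a lognormal soil property below is an illustrative assumption, not an SSMRP input.

```python
# Latin hypercube sampling sketch (illustrative; not the SMACS implementation).
import numpy as np
from scipy.stats import lognorm

def latin_hypercube(n_samples, n_vars, rng):
    # One random point per equiprobable stratum in each variable...
    u = (rng.random((n_samples, n_vars)) + np.arange(n_samples)[:, None]) / n_samples
    # ...then permute each column independently to decorrelate the variables.
    for j in range(n_vars):
        u[:, j] = u[rng.permutation(n_samples), j]
    return u                                    # uniform (0, 1) samples

rng = np.random.default_rng(1)
u = latin_hypercube(8, 3, rng)

# Map one column through an inverse CDF, e.g. a lognormal soil modulus with
# median 1.0 and a COV-derived log-standard-deviation of 0.4 (assumed values).
g_soil = lognorm.ppf(u[:, 0], s=0.4, scale=1.0)
print(g_soil.round(3))
```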

  2. Computer models and simulations of IGCC power plants with Canadian coals

    Energy Technology Data Exchange (ETDEWEB)

    Zheng, L.; Furimsky, E.

    1999-07-01

    In this paper, three steady-state computer models for the simulation of IGCC power plants with Shell, Texaco and BGL (British Gas Lurgi) gasifiers are presented. All models were based on a study by Bechtel for Nova Scotia Power Corporation. They were built using the Advanced System for Process Engineering (ASPEN) steady-state simulation software together with Fortran programs developed in house. Each model is integrated from several sections which can be simulated independently, such as coal preparation, gasification, gas cooling, acid gas removal, sulfur recovery, gas turbine, heat recovery steam generation, and steam cycle. A general description of each process, each model's overall structure, capability, testing results, and background references is given. The performance of some Canadian coals on these models is discussed as well. The authors also built a computer model of an IGCC power plant with a Kellogg-Rust-Westinghouse gasifier; however, due to paper-length limitations, it is not presented here.

  3. Seismic changes industry

    International Nuclear Information System (INIS)

    Taylor, G.

    1992-01-01

    This paper discusses the growth in the seismic industry as a result of the recent increases in the foreign market. With the decline of communism and the opening of Latin America to exploration, seismic teams have moved out into these areas in support of the oil and gas industry. The paper goes on to discuss the improved technology available for seismic resolution and the subsequent use of computers to field-proof the data while the seismic team is still on-site. It also discusses the effects of new computer technology on reducing the amount of support staff that is required to both conduct and interpret seismic information

  4. Seismic Safety Margins Research Program (Phase I). Project VII. Systems analysis specification of computational approach

    International Nuclear Information System (INIS)

    Wall, I.B.; Kaul, M.K.; Post, R.I.; Tagart, S.W. Jr.; Vinson, T.J.

    1979-02-01

    An initial specification is presented of a computational approach for a probabilistic risk assessment model for use in the Seismic Safety Margins Research Program. This model encompasses the whole seismic calculational chain, from seismic input through soil-structure interaction and transfer functions to the probability of component failure, the integration of these failures into a system model, and thereby an estimate of the probability of a release of radioactive material to the environment. It is intended that the primary use of this model will be in sensitivity studies to assess the potential conservatism of different modeling elements in the chain and to provide guidance on priorities for research in the seismic design of nuclear power plants.
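
    The end of such a calculational chain, combining a seismic hazard curve with a component fragility curve into an annual failure probability, can be illustrated numerically. The curves below are invented placeholders, not SSMRP results.

```python
# Hazard-fragility convolution sketch with invented curves.
import numpy as np
from scipy.stats import lognorm

pga = np.linspace(0.01, 2.0, 400)                # peak ground acceleration, g
hazard = 1e-3 * (pga / 0.1) ** -2.0              # annual P(exceedance) vs. pga
fragility = lognorm.cdf(pga, s=0.5, scale=0.6)   # P(failure | pga)

# P(failure) = integral of fragility times the hazard occurrence density,
# where the density is minus the derivative of the exceedance curve.
density = -np.gradient(hazard, pga)
p_fail = np.sum(fragility * density) * (pga[1] - pga[0])
print(f"annual failure probability ~ {p_fail:.2e}")
```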

  5. Desktop Publishing in Libraries.

    Science.gov (United States)

    Cisler, Steve

    1987-01-01

    Describes the components, costs, and capabilities of several desktop publishing systems, and examines their possible impact on work patterns within organizations. The text and graphics of the article were created using various microcomputer software packages. (CLB)

  6. Development of a small-scale computer cluster

    Science.gov (United States)

    Wilhelm, Jay; Smith, Justin T.; Smith, James E.

    2008-04-01

    An increase in demand for computing power in academia has created a need for high-performance machines. The computing power of a single processor has been steadily increasing, but lags behind the demand for fast simulations. Since a single processor has hard limits to its performance, a cluster of computers can, with the proper software, multiply the performance of a single computer. Cluster computing has therefore become a much sought-after technology. Typical desktop computers could be used for cluster computing, but are not intended for constant full-speed operation and take up more space than rack-mount servers. Specialty computers that are designed to be used in clusters meet high availability and space requirements, but can be costly. A market segment exists where custom-built desktop computers can be arranged in a rack-mount situation, gaining the space saving of traditional rack-mount computers while remaining cost effective. To explore these possibilities, an experiment was performed to develop a computing cluster using desktop components for the purpose of decreasing the computation time of advanced simulations. This study indicates that a small-scale cluster can be built from off-the-shelf components which multiplies the performance of a single desktop machine, while minimizing occupied space and still remaining cost effective.

  7. LCCP Desktop Application v1.0 Engineering Reference

    Energy Technology Data Exchange (ETDEWEB)

    Beshr, Mohamed [Univ. of Maryland, College Park, MD (United States); Aute, Vikrant [Univ. of Maryland, College Park, MD (United States)

    2014-04-01

    This Life Cycle Climate Performance (LCCP) Desktop Application Engineering Reference is divided into three parts. The first part of the guide, consisting of the LCCP objective, literature review, and mathematical background, is presented in Sections 2-4. The second part of the guide (given in Sections 5-10) provides a description of the input data required by the LCCP desktop application, including each of the input pages (Application Information, Load Information, and Simulation Information) and details for interfacing the LCCP Desktop Application with the VapCyc and EnergyPlus simulation programs. The third part of the guide (given in Section 11) describes the various interfaces of the LCCP code.

  8. Current Induced Seismicity in the Paskov Mine Field

    Czech Academy of Sciences Publication Activity Database

    Holub, Karel; Rušajová, Jana; Holečko, J.

    2013-01-01

    Roč. 10, č. 2 (2013), s. 181-187 ISSN 1214-9705 R&D Projects: GA MŠk LM2010008 Institutional support: RVO:68145535 Keywords: Ostrava-Karviná coal mines * seismic network * induced seismicity * location plot Subject RIV: DC - Seismology, Volcanology, Earth Structure Impact factor: 0.667, year: 2013 http://www.irsm.cas.cz/materialy/acta_content/2013_02/acta_170_07_%20Holub_181-187.pdf

  9. Citrix XenApp 7.5 desktop virtualization solutions

    CERN Document Server

    Paul, Andy

    2014-01-01

    If you are a Citrix® engineer, a virtualization consultant, or an IT project manager with prior experience of using Citrix XenApp® and related technologies for desktop virtualization and want to further explore the power of XenApp® for flawless desktop virtualization, then this book is for you.

  10. Telemedicine in rural areas. Experience with medical desktop-conferencing via satellite.

    Science.gov (United States)

    Ricke, J; Kleinholz, L; Hosten, N; Zendel, W; Lemke, A; Wielgus, W; Vöge, K H; Fleck, E; Marciniak, R; Felix, R

    1995-01-01

    Cooperation between physicians in hospitals in rural areas can be assisted by desktop-conferencing using a satellite link. For six weeks, medical desktop-conferencing was tested during daily clinical conferences between the Virchow-Klinikum, Berlin, and the Medical Academy, Wroclaw. The communications link was provided by the German Telekom satellite system MCS, which allowed temporary connections to be established on demand by manual dialling. Standard hardware and software were used for videoconferencing, as well as software for medical communication developed in the BERMED project. Digital data, such as computed tomography or magnetic resonance images, were transmitted by a digital data channel in parallel to the transmission of analogue video and audio signals. For conferences involving large groups of people, hardware modifications were required. These included the installation of a video projector, adaptation of the audio system with improved echo cancellation, and installation of extra microphones. Learning to use an unfamiliar communication medium proved to be uncomplicated for the participating physicians.

  11. Detailed comparison between computed and measured FBR core seismic responses

    International Nuclear Information System (INIS)

    Forni, M.; Martelli, A.; Melloni, R.; Bonacina, G.

    1988-01-01

    This paper presents a detailed comparison between seismic calculations and measurements performed for various mock-ups consisting of groups of seven and nineteen simplified elements of the Italian PEC fast reactor core. Experimental tests had been performed on shaking tables in air and in water (simulating sodium) with excitations increasing to above the Safe Shutdown Earthquake level. The PEC core-restraint ring had been simulated in some tests. All the experimental tests have been analysed using both the one-dimensional computer program CORALIE and the two-dimensional program CLASH. Comparisons have been made for all the instrumented elements, in both the time and the frequency domains. The good agreement between calculations and measurements has confirmed the adequacy of the fluid-structure interaction model used for PEC core seismic design verification.

  12. SONATINA-1: a computer program for seismic response analysis of column in HTGR core

    International Nuclear Information System (INIS)

    Ikushima, Takeshi

    1980-11-01

    A computer program, SONATINA-1, for predicting the behavior of a prismatic high-temperature gas-cooled reactor (HTGR) core under seismic excitation has been developed. In this analytical method, blocks are treated as rigid bodies and are constrained by dowel pins which restrict relative horizontal movement but allow vertical and rocking motions. Coulomb friction between blocks and between dowel holes and pins is also considered. A spring-dashpot model is used for the collision process between adjacent blocks and between blocks and boundary walls. Analytical results are compared with experimental results and are found to be in good agreement. The computer program can be used to predict the behavior of the HTGR core under seismic excitation. (author)
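
    The spring-dashpot collision model described above can be sketched as a simple contact-force law: zero force while a gap between blocks is open, and a spring-plus-damper force during penetration. The stiffness and damping values below are illustrative, not SONATINA-1 inputs.

```python
# Spring-dashpot contact force sketch; stiffness/damping are illustrative.
def contact_force(gap, gap_rate, k=1.0e7, c=5.0e3):
    """Force (N) between two blocks; nonzero only while the gap is closed.

    gap      : clearance between block faces in m (negative = penetration)
    gap_rate : time derivative of the gap in m/s
    """
    if gap >= 0.0:
        return 0.0                       # faces apart: no contact force
    f = k * (-gap) - c * gap_rate        # spring on penetration + dashpot
    return max(f, 0.0)                   # contact can push, never pull

print(contact_force(-1.0e-4, -0.05))     # closing contact -> 1250.0 N
```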

  13. Man-caused seismicity of Kuzbass

    Science.gov (United States)

    Emanov, Alexandr; Emanov, Alexey; Leskova, Ekaterina; Fateyev, Alexandr

    2010-05-01

    The natural seismicity of the Kuznetsk Basin is confined mainly to the mountain frame of the Kuznetsk hollow. In this paper, materials from experimental work with local station networks within the sedimentary basin are presented. Two types of seismicity within the Kuznetsk hollow have been identified: first, man-caused seismic processes confined to mine workings and concentrated at depths of up to one and a half kilometres; secondly, seismic activations at depths of 2-56 km that do not coincide in plan with the coal mines. Each of the studied seismic activations consists of a large number of small earthquakes (Ms = 1-3), with from one to several tens of earthquakes recorded per day. The earthquakes near a mine working shift in space along with the working; the seismic process intensifies when the coal-plough machine is operating and slackens while preventive works are carried out. The seismic processes near three longwalls in the Kuznetsk Basin have been studied in detail. Uplift is the most typical focal mechanism. The activated zone near a mine working reaches 1-1.5 km in diameter. Seismic activations not linked with mine workings testify that the subsoil of the Kuznetsk hollow remains in a stressed state as a whole. The most probable man-caused influences on the hollow are processes connected with changes in the physical state of rocks after the loss of methane from a large volume, or with changes, caused by mine workings, in rock watering over a large volume. In this case the compacted rocks, having lost gas and water, can be pressed out upwards, realizing the reverse-fault mechanism of earthquakes. The combination of the stressed state of the hollow with man-caused influence from deep mining may account for the incipient activations in the Kuznetsk Basin. Today earthquakes happen mainly under mine workings; although damage to the workings themselves has not occurred, the intensive shaking at the surface calls for close study of such dangerous phenomena. In 2009, the experiment on the study of seismic activations in the area of the previously investigated longwalls was repeated.

  14. Perception Analysis of Desktop and Mobile Service Website

    Directory of Open Access Journals (Sweden)

    Rizqiyatul Khoiriyah

    2016-12-01

    Full Text Available This research was conducted as a qualitative study to explore and examine in depth users' perceptions of desktop and mobile website services. It reviewed user perceptions of the desktop and mobile website services in use, applying qualitative methods adapted from the WebQual and User Experience approaches, with theoretical reference to Creswell (2014). The expected outcome is an understanding of user perceptions of the services and information available on the website, together with any desktop-mobile gap arising from differences between the two services. These results can be used as a service model for the website user experience.

  15. A desktop 3D printer with dual extruders to produce customised electronic circuitry

    Science.gov (United States)

    Butt, Javaid; Onimowo, Dominic Adaoiza; Gohrabian, Mohammed; Sharma, Tinku; Shirvani, Hassan

    2018-03-01

    3D printing has opened new horizons for the manufacturing industry in general, and 3D printers have become tools for technological advancement. There is a huge divide between the pricing of industrial and desktop 3D printers, with the former being on the expensive side and capable of producing excellent quality products, and the latter being on the low-cost side with moderate quality results. However, there is more room for improvement and enhancement in the desktop systems than in the industrial ones. In this paper, a desktop 3D printer called Prusa Mendel i2 has been modified and integrated with an additional extruder so that the system can work with dual extruders and produce bespoke electronic circuits. Communication between the two extruders has been established by making use of the In-Circuit Serial Programming (ICSP) port on the Arduino Uno controlling the printer. The biggest challenge is to control the flow of electric paint (to be dispensed by the new extruder), and CFD (Computational Fluid Dynamics) analysis has been carried out to ascertain the optimal conditions for proper dispensing. The final product is a customised electronic circuit with a base of plastic (from the 3D printer's extruder) and electric paint (from the additional extruder) properly dispensed to create a live circuit on a plastic platform. This low-cost enhancement to a desktop 3D printer offers a new prospect of producing multiple-material parts, where the additional extruder can be filled with any material that can be properly dispensed from its nozzle.

  16. In-seam seismic surveys at Polio and Santiago collieries during the month of January

    Energy Technology Data Exchange (ETDEWEB)

    1987-01-01

    In-seam seismic surveys were carried out over the last two weekends in January in order to assess two coal panels, one in the Cuatro seam at Polio colliery between levels 4 and 5 of the Centella field, and the other in the Mariana seam at Santiago colliery between levels 3, 5 and 7 in the Desquite zone. In-seam seismics, which is just being introduced in Spain, is a geophysical method developed specially for the detailed investigation of coal seams.

  17. 1982 Australian coal conference papers

    Energy Technology Data Exchange (ETDEWEB)

    1982-01-01

    This third Australian coal conference included papers discussing the market for coal, finance and investment, use of computers, mining, coal research, coal preparation and waste disposal, marketing and trade, and the transport of coal. All papers have been individually abstracted.

  18. A 1.8 trillion degrees-of-freedom, 1.24 petaflops global seismic wave simulation on the K computer

    KAUST Repository

    Tsuboi, Seiji

    2016-03-01

    We present high-performance simulations of global seismic wave propagation with an unprecedented accuracy of 1.2 s seismic period for a realistic three-dimensional Earth model using the spectral element method on the K computer. Our seismic simulations use a total of 665.2 billion grid points and resolve 1.8 trillion degrees of freedom. To realize these large-scale computations, we optimize a widely used community software code to efficiently address all hardware parallelization, especially thread-level parallelization to solve the bottleneck of memory usage for coarse-grained parallelization. The new code exhibits excellent strong scaling for the time stepping loop, that is, parallel efficiency on 82,134 nodes relative to 36,504 nodes is 99.54%. Sustained performance of these computations on the K computer is 1.24 petaflops, which is 11.84% of its peak performance. The obtained seismograms with an accuracy of 1.2 s for the entire globe should help us to better understand rupture mechanisms of devastating earthquakes.
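
    The quoted figures are easy to sanity-check; the only outside assumption in the sketch below is the K computer's 128 GFLOPS peak per node (SPARC64 VIIIfx).

```python
# Sanity-checking the quoted scaling figures (128 GFLOPS/node assumed).
nodes = 82_134
peak_pflops = nodes * 128e9 / 1e15
print(f"partition peak ~ {peak_pflops:.2f} PFLOPS")          # ~10.51
print(f"fraction of peak ~ {1.24 / peak_pflops:.1%}")        # ~11.8%

# Strong scaling: speedup = efficiency * (node ratio).
n_ref, eff = 36_504, 0.9954                                  # quoted values
print(f"speedup {eff * nodes / n_ref:.2f}x on {nodes / n_ref:.2f}x the nodes")
```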

  19. A 1.8 trillion degrees-of-freedom, 1.24 petaflops global seismic wave simulation on the K computer

    KAUST Repository

    Tsuboi, Seiji; Ando, Kazuto; Miyoshi, Takayuki; Peter, Daniel; Komatitsch, Dimitri; Tromp, Jeroen

    2016-01-01

    We present high-performance simulations of global seismic wave propagation with an unprecedented accuracy of 1.2 s seismic period for a realistic three-dimensional Earth model using the spectral element method on the K computer. Our seismic simulations use a total of 665.2 billion grid points and resolve 1.8 trillion degrees of freedom. To realize these large-scale computations, we optimize a widely used community software code to efficiently address all hardware parallelization, especially thread-level parallelization to solve the bottleneck of memory usage for coarse-grained parallelization. The new code exhibits excellent strong scaling for the time stepping loop, that is, parallel efficiency on 82,134 nodes relative to 36,504 nodes is 99.54%. Sustained performance of these computations on the K computer is 1.24 petaflops, which is 11.84% of its peak performance. The obtained seismograms with an accuracy of 1.2 s for the entire globe should help us to better understand rupture mechanisms of devastating earthquakes.

  20. Coal and Open-pit surface mining impacts on American Lands (COAL)

    Science.gov (United States)

    Brown, T. A.; McGibbney, L. J.

    2017-12-01

    Mining is known to cause environmental degradation, but software tools to identify its impacts are lacking. However, remote sensing, spectral reflectance, and geographic data are readily available, and high-performance cloud computing resources exist for scientific research. Coal and Open-pit surface mining impacts on American Lands (COAL) provides a suite of algorithms and documentation to leverage these data and resources to identify evidence of mining and correlate it with environmental impacts over time. COAL was originally developed as a 2016-2017 senior capstone collaboration between scientists at the NASA Jet Propulsion Laboratory (JPL) and computer science students at Oregon State University (OSU). The COAL team implemented a free and open-source software library called "pycoal" in the Python programming language which facilitated a case study of the effects of coal mining on water resources. Evidence of acid mine drainage associated with an open-pit coal mine in New Mexico was derived by correlating imaging spectrometer data from the JPL Airborne Visible/InfraRed Imaging Spectrometer - Next Generation (AVIRIS-NG), spectral reflectance data published by the USGS Spectroscopy Laboratory in the USGS Digital Spectral Library 06, and GIS hydrography data published by the USGS National Geospatial Program in The National Map. This case study indicated that the spectral and geospatial algorithms developed by COAL can be used successfully to analyze the environmental impacts of mining activities. Continued development of COAL has been promoted by a Startup allocation award of high-performance computing resources from the Extreme Science and Engineering Discovery Environment (XSEDE). These resources allow the team to undertake further benchmarking, evaluation, and experimentation using multiple XSEDE resources. The opportunity to use computational infrastructure of this caliber will further enable the development of a science gateway to continue foundational COAL
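
    The core spectral matching such a pipeline performs, comparing each pixel spectrum against library reflectance spectra, is commonly done with a spectral angle mapper. The sketch below is a generic illustration with toy numbers; it is not pycoal's actual API.

```python
# Generic spectral angle mapper (SAM) sketch with toy spectra.
import numpy as np

def spectral_angle(pixel, reference):
    """Angle in radians between two reflectance spectra; smaller = closer match."""
    cos = pixel @ reference / (np.linalg.norm(pixel) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos, -1.0, 1.0))

library = {                                   # toy 4-band library spectra
    "schwertmannite": np.array([0.12, 0.18, 0.25, 0.22]),   # AMD indicator
    "kaolinite":      np.array([0.30, 0.45, 0.50, 0.48]),
}
pixel = np.array([0.11, 0.19, 0.24, 0.23])    # one toy image pixel
best = min(library, key=lambda name: spectral_angle(pixel, library[name]))
print(best)                                   # -> schwertmannite
```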

  1. GSAC - Generic Seismic Application Computing

    Science.gov (United States)

    Herrmann, R. B.; Ammon, C. J.; Koper, K. D.

    2004-12-01

    With the success of the IRIS data management center, the use of large data sets in seismological research has become common. Such data sets, and especially the significantly larger data sets expected from EarthScope, present challenges for analysis with existing tools developed over the last 30 years. For much of the community, the primary format for data analysis is the Seismic Analysis Code (SAC) format developed by Lawrence Livermore National Laboratory. Although somewhat restrictive in meta-data storage, the simplicity and stability of the format have established it as an important component of seismological research. Tools for working with SAC files fall into two categories: custom research-quality processing codes, and shared display and processing tools such as SAC2000, MatSeis, etc., which were developed primarily for the needs of individual seismic research groups. While the current graphics display and platform dependence of SAC2000 may be resolved if the source code is released, the code complexity and the lack of large-data-set analysis or even introductory tutorials could preclude code improvements and the development of expertise in its use. We believe that there is a place for new, especially open-source, tools. The GSAC effort is an approach that focuses on ease of use, computational speed, transportability, rapid addition of new features and openness, so that new and advanced students, researchers and instructors can quickly browse and process large data sets. We highlight several approaches toward data processing under this model. gsac, part of the Computer Programs in Seismology 3.30 distribution, has much of the functionality of SAC2000 and works on UNIX/LINUX/MacOS-X/Windows (CYGWIN). It is completely programmed in C from scratch, is small, fast, and easy to maintain and extend. It is command-line based and is easily included within shell processing scripts. PySAC is a set of Python functions that allow easy access to SAC files and enable efficient
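
    Part of the SAC format's appeal is that its fixed 632-byte binary header is simple to parse. The sketch below is a minimal reader in the spirit of the PySAC functions mentioned above (not PySAC itself); it assumes little-endian files and omits the byte-order checks production code would need.

```python
# Minimal binary SAC reader sketch (little-endian assumed; not PySAC itself).
import numpy as np

def read_sac(path):
    with open(path, "rb") as f:
        floats = np.fromfile(f, dtype="<f4", count=70)  # float header words
        ints = np.fromfile(f, dtype="<i4", count=40)    # int/logical/enum words
        f.seek(632)                # skip the 192 bytes of character fields
        npts = int(ints[9])        # NPTS: number of samples
        data = np.fromfile(f, dtype="<f4", count=npts)
    delta, b = float(floats[0]), float(floats[5])       # DELTA and B headers
    t = b + delta * np.arange(npts)
    return t, data

# t, amp = read_sac("station.sac")   # hypothetical file name
```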

  2. Tomographic imaging of rock conditions ahead of mining using the shearer as a seismic source - A feasibility study

    Energy Technology Data Exchange (ETDEWEB)

    Luo, X.; King, A.; Van de Werken, M. [CSIRO, Brisbane, Qld. (Australia)

    2009-11-15

    Roof falls due to poor rock conditions in a coal longwall panel may threaten miners' lives and cause significant interruption to mine production. There has been a requirement for technologies capable of imaging the rock conditions in longwall coal mining ahead of the working face, without any interruption to production. A feasibility study was carried out to investigate the characteristics of seismic signals generated by the continuous coal cutter (shearer) and recorded by geophone arrays deployed ahead of the working face, for the purpose of seismic tomographic imaging of the roof strata condition before mining. Two experiments were conducted at a coal mine using two arrays of geophones. The experiments demonstrated that the longwall shearer generates strong, low-frequency (about 40 Hz) seismic energy that can be adequately detected by geophones deployed in shallow boreholes along the roadways as far as 300 m from the face. Using noise filtering and signal cross-correlation techniques, the seismic arrival times associated with the shearer cutting can be reliably determined. This proved the concept that velocity variations ahead of the face can be mapped out using tomographic techniques while mining is in progress.
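
    The cross-correlation picking idea described above can be sketched as follows: a noisy trace is correlated against a reference shearer signature, and the lag of the correlation peak is taken as the relative arrival time. The signals, noise level and delay below are synthetic placeholders.

```python
# Cross-correlation arrival-time pick sketch with synthetic signals.
import numpy as np

fs = 1000.0                                    # sampling rate, Hz
t = np.arange(0.0, 1.0, 1.0 / fs)
ref = np.sin(2 * np.pi * 40.0 * t) * np.exp(-10.0 * t)   # ~40 Hz wavelet

true_delay = 0.137                             # s, synthetic arrival time
n0 = int(true_delay * fs)
trace = np.zeros_like(t)
trace[n0:] = ref[:len(t) - n0]                 # delayed copy of the wavelet
trace += 0.2 * np.random.default_rng(2).normal(size=len(t))  # added noise

xcorr = np.correlate(trace, ref, mode="full")
lag = (np.argmax(xcorr) - (len(ref) - 1)) / fs
print(f"picked delay = {lag:.3f} s")           # ~0.137
```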

  3. A personal computer code for seismic evaluations of nuclear power plant facilities

    International Nuclear Information System (INIS)

    Xu, J.; Graves, H.

    1991-01-01

    In the process of reviewing and evaluating licensing issues related to nuclear power plants, it is essential to understand the behavior of seismic loading, foundation and structural properties and their impact on the overall structural response. In most cases, such knowledge can be obtained by using simplified engineering models which, when properly implemented, can capture the essential parameters describing the physics of the problem. Such models do not require execution on large computer systems and can be implemented through a personal computer (PC) based capability. Recognizing the need for a PC software package that can perform the structural response computations required for typical licensing reviews, the US Nuclear Regulatory Commission sponsored the development of the PC-operated computer software package CARES (Computer Analysis for Rapid Evaluation of Structures). This development was undertaken by Brookhaven National Laboratory (BNL) during FY's 1988 and 1989. A wide range of computer programs and modeling approaches are often used to justify the safety of nuclear power plants. It is often difficult to assess the validity and accuracy of the results submitted by various utilities without developing comparable computer solutions. Taking this into consideration, CARES is designed as an integrated computational system which can perform rapid evaluations of structural behavior and examine the capability of nuclear power plant facilities; thus CARES may be used by the NRC to determine the validity and accuracy of analysis methodologies employed for structural safety evaluations of nuclear power plants. CARES has been designed to operate on a PC, to have a user-friendly input/output interface, and to have quick turnaround. This paper describes the various features which have been implemented into the seismic module of CARES version 1.0

  4. Seismic safety in conducting large-scale blasts

    Science.gov (United States)

    Mashukov, I. V.; Chaplygin, V. V.; Domanov, V. P.; Semin, A. A.; Klimkin, M. A.

    2017-09-01

    In mining enterprises, a drilling-and-blasting method is used to prepare hard rock for excavation. As mining operations approach settlements, the negative effects of large-scale blasts increase. To assess the level of seismic impact of large-scale blasts, the scientific staff of Siberian State Industrial University carried out expert assessments for coal mines and iron-ore enterprises. The magnitude of surface seismic vibrations caused by mass explosions was determined using seismic receivers and an analog-to-digital converter recording to a laptop. Records of surface seismic vibrations during more than 280 large-scale blasts at 17 mining enterprises in 22 settlements are presented, and the maximum velocities of the Earth's surface vibrations are determined. The safety of the seismic effect was evaluated against the permissible vibration velocity; for cases where permissible values were exceeded, recommendations were developed to reduce the level of seismic impact.
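
    A common way to relate charge size, distance and ground vibration in such assessments is the empirical scaled-distance attenuation law PPV = K(D/√Q)^(-b). The sketch below checks a hypothetical blast against a permissible vibration velocity; the site constants K and b are placeholders, not values fitted from this study's records.

```python
# Scaled-distance PPV check; K and b are placeholder site constants.
def ppv_mm_s(distance_m, charge_kg, K=700.0, b=1.6):
    scaled_distance = distance_m / charge_kg ** 0.5
    return K * scaled_distance ** -b

permissible = 20.0                     # mm/s, an assumed permissible value
ppv = ppv_mm_s(distance_m=800.0, charge_kg=500.0)
print(f"PPV = {ppv:.1f} mm/s ->", "OK" if ppv <= permissible else "exceeds limit")
```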

  5. Analytical Hierarchy Process for the selection of strategic alternatives for introduction of infrastructure virtual desktop infrastructure in the university

    Directory of Open Access Journals (Sweden)

    Katerina A. Makoviy

    2017-12-01

    Full Text Available The task of choosing a strategy for introducing virtual desktop infrastructure into the IT infrastructure of a university is considered. Virtual desktop infrastructure is a technology that centralizes the management of client workstations and increases the service life of computers in classrooms. The strengths and weaknesses of, and the threats and opportunities presented by, introducing virtualization in the university are analysed, and implementation alternatives based on the results of a pilot project are developed. To obtain quantitative estimates in the SWOT analysis of the pilot project, the analytic hierarchy process is used. Expert analysis of the pilot project implementation is carried out, and integral quantitative estimates of the various alternatives are generated. The combination of the analytic hierarchy process and SWOT analysis allows the optimal strategy for implementing desktop virtualization to be chosen.
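
    The quantitative core of the analytic hierarchy process is small enough to sketch: priority weights are the normalized principal eigenvector of a pairwise comparison matrix, and the consistency ratio flags contradictory judgments. The matrix below is a made-up example, not the paper's expert data.

```python
# AHP priority weights and consistency ratio for a made-up 3x3 judgment matrix.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])      # Saaty 1-9 scale pairwise comparisons

vals, vecs = np.linalg.eig(A)
i = np.argmax(vals.real)
w = np.abs(vecs[:, i].real)
w /= w.sum()                         # normalized principal eigenvector

n = A.shape[0]
ci = (vals.real[i] - n) / (n - 1)    # consistency index
cr = ci / 0.58                       # random index RI = 0.58 for n = 3
print("weights:", w.round(3), "CR:", round(cr, 3))   # CR < 0.1 is acceptable
```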

  6. Desktop Publishing in Education.

    Science.gov (United States)

    Hall, Wendy; Layman, J.

    1989-01-01

    Discusses the state of desktop publishing (DTP) in education today and describes the weaknesses of the systems available for use in the classroom. Highlights include document design and layout; text composition; graphics; word processing capabilities; a comparison of commercial and educational DTP packages; and skills required for DTP. (four…

  7. A personal computer code for seismic evaluations of nuclear power plants facilities

    International Nuclear Information System (INIS)

    Xu, J.; Philippacopoulos, A.J.; Graves, H.

    1990-01-01

    The program CARES (Computer Analysis for Rapid Evaluation of Structures) is an integrated computational system being developed by Brookhaven National Laboratory (BNL) for the U.S. Nuclear Regulatory Commission. It is specifically designed to be a personal computer (PC) operated package which may be used to determine the validity and accuracy of analysis methodologies used for structural safety evaluations of nuclear power plants. CARES is structured in a modular format. Each module performs a specific type of analysis i.e., static or dynamic, linear or nonlinear, etc. This paper describes the various features which have been implemented into the Seismic Module of CARES

  8. NEDO coal resources exploitation subcommittee. 18th project report meeting; NEDO sekitan shigen kaihatsu bunkakai. Dai 18 kai jigyo hokokukai

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-09-01

    In a report on a 'survey for coal transportation system optimization in southern Sumatra,' intended to contribute to the improvement of coal exploitation efficiency in the Musi river area of southern Sumatra, the coal transportation system from the coal mine to the harbor is reviewed; scenarios for funding, cost effectiveness and environmental impact are comprehensively examined; and suggestions are submitted for higher efficiency and cost effectiveness. In a report on the 'current utilization status and effectiveness of a non-destructive electromagnetic vibrator shock source,' the electromagnetic vibrator shock source, one of several new coal exploration technologies combining test boring with seismic prospecting, is taken up and compared, in terms of technical features and cost effectiveness, with the seismic reflection survey technique that uses an explosive shock source; the conclusion reported is that the electromagnetic vibrator shock source method is superior. With the new electromagnetic method, the seismic wave frequency is chosen to suit a given depth. Since the method is non-destructive and emits less noise, it is expected to serve in various fields other than coal mining. (NEDO)

  9. Firing a sub-bituminous coal in pulverized coal boilers configured for bituminous coals

    Energy Technology Data Exchange (ETDEWEB)

    N. Spitz; R. Saveliev; M. Perelman; E. Korytni; B. Chudnovsky; A. Talanker; E. Bar-Ziv [Ben-Gurion University of the Negev, Beer-Sheva (Israel)

    2008-07-15

    It is important to adapt utility boilers to sub-bituminous coals to take advantage of their environmental benefits while limiting operation risks. We discuss the performance impact that Adaro, an Indonesian sub-bituminous coal with high moisture content, has on opposite-wall and tangentially-fired utility boilers which were designed for bituminous coals. Numerical simulations were made with GLACIER, a computational-fluid-dynamic code, to depict combustion behavior. The predictions were verified with full-scale test results. For analysis of the operational parameters for firing Adaro coal in both boilers, we used EXPERT system, an on-line supervision system developed by Israel Electric Corporation. It was concluded that firing Adaro coal, compared to a typical bituminous coal, lowers NOx and SO{sub 2} emissions, lowers LOI content and improves fouling behavior but can cause load limitation which impacts flexible operation. 21 refs., 7 figs., 3 tabs.

  10. Desktop publishing with Scribus

    OpenAIRE

    Silva, Fabrício Riff; Uchôa, Kátia Cilene Amaral

    2015-01-01

    This article presents a brief tutorial on desktop publishing, with emphasis on the free software Scribus, through the creation of a practical example that explores some of its main features.

  11. Integrated system for seismic evaluations

    International Nuclear Information System (INIS)

    Xu, J.; Philippacopoulos, A.J.; Miller, C.A.; Costantino, C.J.; Graves, H.

    1989-01-01

    This paper describes the various features of the seismic module of the CARES (Computer Analysis for Rapid Evaluation of Structures) system. This system was developed to perform rapid evaluations of the structural behavior and capability of nuclear power plant facilities. CARES is structured in a modular format. Each module performs a specific type of analysis, i.e., static or dynamic, linear or nonlinear, etc. This paper describes the features of the seismic module in particular. The development of the seismic module of the CARES system is based on an approach which incorporates major aspects of seismic analysis currently employed by the industry into an integrated system that allows computations of structural response to seismic motions to be carried out interactively. The code operates on a PC system and has multiple graphics capabilities.

  12. Exploring Graphic Design. A Short Course in Desktop Publishing.

    Science.gov (United States)

    Stanley, MLG

    This course in desktop publishing contains seven illustrated modules designed to meet the following objectives: (1) use a desktop publishing program to explore advanced topics in graphic design; (2) learn about typography and how to make design decisions on the use of typestyles; (3) learn basic principles in graphic communications and apply them…

  13. Improving coal flotation recovery using computational fluid dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Peter Koh [CSIRO Minerals (Australia)

    2009-06-15

    This work involves using the latest advances in computational fluid dynamics (CFD) to increase understanding of the hydrodynamics in coal flotation and to identify any opportunities to improve design and operation of both the Microcel column and Jameson cell. The CSIRO CFD model incorporates micro-processes from cell hydrodynamics that affect particle-bubble attachments and detachments. CFD simulation results include the liquid velocities, turbulent dissipation rates, gas hold-up, particle-bubble attachment rates and detachment rates. This work has demonstrated that CFD modelling is a cost effective means of developing an understanding of particle-bubble attachments and detachments, and can be used to identify and test potential cell or process modifications.

  14. Desktop war - data suppliers competing for bigger market share

    International Nuclear Information System (INIS)

    Sword, M.

    1999-01-01

    The intense competition among suppliers of computerized data and computer software to the petroleum and natural gas industry in western Canada is discussed. It is estimated that the Canadian oil patch spends about $400 million annually on geoscience information and related costs, and industry is looking for ways to significantly reduce those costs. There is a need for integrated, desktop-driven data sets. Sensing the determination of industry to reduce information acquisition costs, data providers are responding with major consolidations of data sets. The major evolution in the industry is on-line access to increase the speed of information delivery. Data vendors continue to integrate land, well, log, production and other data sets, whether public or proprietary. The result is stronger foundations as platforms for interpretive software. Another development is the rise of the Internet and intranets and the redefinition of the role of information technology departments in the industry, as both of these are paving the way for electronic delivery of information and software tools to the desktop. Development of proprietary data sets and acquisition of competitors with complementary data sets that enhance products and services are just some of the ways data vendors are trying to get a bigger piece of the exploration and development pie.

  15. FORMED: Bringing Formal Methods to the Engineering Desktop

    Science.gov (United States)

    2016-02-01

    Final technical report by BAE Systems, February 2016; approved for public release. Contract number FA8750-14-C-0024. This report is published in the interest of scientific and technical information exchange.

  16. Application of parallel computing to seismic damage process simulation of an arch dam

    International Nuclear Information System (INIS)

    Zhong Hong; Lin Gao; Li Jianbo

    2010-01-01

    The simulation of the damage process of a high arch dam subjected to strong earthquake shocks is significant to the evaluation of its performance and seismic safety, considering the catastrophic effect of dam failure. However, such numerical simulation requires enormous computational capacity. Conventional serial computing falls short of that, and parallel computing is a promising solution to this problem. The parallel finite element code PDPAD was developed for the damage prediction of arch dams, utilizing a damage model with the heterogeneity of concrete considered. Developed in Fortran, the code uses a master/slave programming mode, the domain decomposition method for the allocation of tasks, MPI (Message Passing Interface) for communication, and solvers from the AZTEC library for the solution of large-scale equations. A speedup test showed that the performance of PDPAD was quite satisfactory. The code was employed to study the damage process of an arch dam under construction on a 4-node PC cluster, with more than one million degrees of freedom considered. The obtained damage mode was quite similar to that of a shaking table test, indicating that the proposed procedure and the parallel code PDPAD have good potential for simulating the seismic damage mode of arch dams. With the rapidly growing need for massive computation emerging from engineering problems, parallel computing will find more and more applications in pertinent areas.
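
    The master/slave organization with domain decomposition described above can be sketched with mpi4py: a master rank collects per-subdomain results from worker ranks at each pseudo time step. The subdomain "solver" below is a placeholder, not PDPAD's finite element kernel.

```python
# Master/worker sketch with mpi4py; run with: mpiexec -n 4 python sketch.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

def solve_subdomain(subdomain, step):
    return subdomain * 0.01 * step       # placeholder "damage" per subdomain

for step in range(3):                    # pseudo time-stepping loop
    if rank == 0:                        # master gathers results each step
        damage = [comm.recv(source=w, tag=step) for w in range(1, size)]
        print(f"step {step}: damage per subdomain = {damage}")
    else:                                # workers each own one subdomain
        comm.send(solve_subdomain(rank, step), dest=0, tag=step)
```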

  17. Desktop aligner for fabrication of multilayer microfluidic devices.

    Science.gov (United States)

    Li, Xiang; Yu, Zeta Tak For; Geraldo, Dalton; Weng, Shinuo; Alve, Nitesh; Dun, Wu; Kini, Akshay; Patel, Karan; Shu, Roberto; Zhang, Feng; Li, Gang; Jin, Qinghui; Fu, Jianping

    2015-07-01

    Multilayer assembly is a commonly used technique to construct multilayer polydimethylsiloxane (PDMS)-based microfluidic devices with complex 3D architecture and connectivity for large-scale microfluidic integration. Accurate alignment of structure features on different PDMS layers before their permanent bonding is critical in determining the yield and quality of assembled multilayer microfluidic devices. Herein, we report a custom-built desktop aligner capable of both local and global alignments of PDMS layers covering a broad size range. Two digital microscopes were incorporated into the aligner design to allow accurate global alignment of PDMS structures up to 4 in. in diameter. Both local and global alignment accuracies of the desktop aligner were determined to be about 20 μm cm(-1). To demonstrate its utility for fabrication of integrated multilayer PDMS microfluidic devices, we applied the desktop aligner to achieve accurate alignment of different functional PDMS layers in multilayer microfluidics including an organs-on-chips device as well as a microfluidic device integrated with vertical passages connecting channels located in different PDMS layers. Owing to its convenient operation, high accuracy, low cost, light weight, and portability, the desktop aligner is useful for microfluidic researchers to achieve rapid and accurate alignment for generating multilayer PDMS microfluidic devices.

  18. MELCOR/VISOR PWR desktop simulator

    International Nuclear Information System (INIS)

    With, Anka de; Wakker, Pieter

    2010-01-01

    Increasingly, there is a need for a learning support and training tool for nuclear engineers, utilities and students in order to broaden their understanding of advanced nuclear plant characteristics, dynamics, transients and safety features. Nuclear system analysis codes like ASTEC, RELAP5, RETRAN and MELCOR provide the calculation results, and visualization tools can be used to graphically represent these results. However, for efficient education and training a more interactive tool such as a simulator is needed. The simulator connects the graphical tool with the calculation tool in an interactive manner. A small number of desktop simulators exist [1-3]. The existing simulators are capable of representing different types of power plants and various accident conditions. However, they were found to be too general to be used as a reliable plant-specific accident analysis or training tool. A desktop simulator of the Pressurized Water Reactor (PWR) has been created under contract to the Dutch nuclear regulatory body (KFD). The desktop simulator is a software package that provides a close-to-real simulation of the Dutch nuclear power plant Borssele (KCB) and is used for training of the accident response. The simulator includes the majority of the power plant systems necessary for the successful simulation of the KCB plant during normal operation, malfunctions and accident situations, and it has been successfully validated against the results of the safety evaluations from the KCB safety report. (orig.)

  19. Strategies for Sharing Seismic Data Among Multiple Computer Platforms

    Science.gov (United States)

    Baker, L. M.; Fletcher, J. B.

    2001-12-01

    Seismic waveform data is readily available from a variety of sources, but it often comes in a distinct, instrument-specific data format. For example, data may be from portable seismographs, such as those made by Refraction Technology or Kinemetrics, from permanent seismograph arrays, such as the USGS Parkfield Dense Array, from public data centers, such as the IRIS Data Center, or from personal communication with other researchers through e-mail or ftp. A computer must be selected to import the data - usually whichever is the most suitable for reading the originating format. However, the computer best suited for a specific analysis may not be the same. When copies of the data are then made for analysis, a proliferation of copies of the same data results, in possibly incompatible, computer-specific formats. In addition, if an error is detected and corrected in one copy, or some other change is made, all the other copies must be updated to preserve their validity. Keeping track of what data is available, where it is located, and which copy is authoritative requires an effort that is easy to neglect. We solve this problem by importing waveform data to a shared network file server that is accessible to all our computers on our campus LAN. We use a Network Appliance file server running Sun's Network File System (NFS) software. Using an NFS client software package on each analysis computer, waveform data can then be read by our MatLab or Fortran applications without first copying the data. Since there is a single copy of the waveform data in a single location, the NFS file system hierarchy provides an implicit complete waveform data catalog and the single copy is inherently authoritative. Another part of our solution is to convert the original data into a blocked-binary format (known historically as USGS DR100 or VFBB format) that is interpreted by MatLab or Fortran library routines available on each computer so that the idiosyncrasies of each machine are not visible to

  20. Seismic-load-induced human errors and countermeasures using computer graphics in plant-operator communication

    International Nuclear Information System (INIS)

    Hara, Fumio

    1988-01-01

    This paper points out the importance of seismic-load-induced human errors in plant operation by delineating the characteristics of human task performance under seismic loads. It focuses on man-machine communication via multidimensional data like that conventionally displayed on large panels in a plant control room. It demonstrates a countermeasure to human errors using a computer graphics technique that conveys the global state of the plant operation to operators through cartoon-like, colored graphs in the form of faces that, with different facial expressions, show the plant safety status. (orig.)

  1. Computational fluid dynamics (CFD) analysis of the coal combustion in a boiler of a thermal power plant using different kinds of the manufactured coals

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Cristiano Vitorino da; Lazzari, Luis Carlos; Ziemniczak, Aline; Beskow, Arthur Bortolin [Universidade Regional Integrada do Alto Uruguai e das Missoes (URI), Erechim, RS (Brazil)], E-mails: cristiano@uricer.edu.br, arthur@uricer.edu.br

    2010-07-01

    The state of the art in computational fluid dynamics and the availability of commercial codes encourage numerical studies of combustion processes. In the present work the commercial software CFX (Ansys Europe Ltd.) has been used to study the combustion of pulverized coal in the boiler of a thermal power plant. The objective of this work is to obtain new information for process optimization. Different kinds of manufactured coals were numerically tested in a thermal power plant located in the southeast region of Brazil. The simulations were made using the actual burning conditions of the boiler. Results include the residence time of the fuel in the combustion chamber, temperature fields, fluid flow mechanics, heat transfer and pollutant formation, as well as the CO and NOx concentrations, with the aim of determining the best conditions for burning the investigated coals. The numerical investigation of the phenomena involved in the coal combustion process is used to complement the experimental information obtained in operational tests. Considering the characteristics of the different kinds of manufactured coals used, this study makes it possible to achieve the most efficient boiler operation parameters, with decreasing costs of electricity production and a reduction of environmentally harmful emissions. It was verified that the different kinds of manufactured coals demand different operating conditions, and that the kind of manufactured coal used in the combustion process has a significant effect on pollutant formation, mainly in relation to ash concentration. (author)

  2. Design and validation of a 3D virtual reality desktop system for sonographic length and volume measurements in early pregnancy evaluation.

    Science.gov (United States)

    Baken, Leonie; van Gruting, Isabelle M A; Steegers, Eric A P; van der Spek, Peter J; Exalto, Niek; Koning, Anton H J

    2015-03-01

    To design and validate a desktop virtual reality (VR) system for presentation and assessment of volumetric data, based on commercial off-the-shelf hardware, as an alternative to a fully immersive CAVE-like I-Space VR system. We designed a desktop VR system using a three-dimensional (3D) monitor and a six-degrees-of-freedom tracking system. A personal computer uses the V-Scope (Erasmus MC, Rotterdam, The Netherlands) volume-rendering application, developed for the I-Space, to create a hologram of volumetric data. Inter- and intraobserver reliability for crown-rump length and embryonic volume measurements were investigated using Bland-Altman plots and intraclass correlation coefficients. The time required for the measurements was recorded. Comparing the I-Space and the desktop VR system, the mean difference for crown-rump length is -0.34% (limits of agreement -2.58 to 1.89, ±2.24%) and for embryonic volume -0.92% (limits of agreement -6.97 to 5.13, ±6.05%). Intra- and interobserver intraclass correlation coefficients of the desktop VR system were all >0.99. Measurement times were longer on the desktop VR system than on the I-Space, but the differences were not statistically significant. A user-friendly desktop VR system can be put together from commercial off-the-shelf hardware at an acceptable price. This system provides a valid and reliable method for embryonic length and volume measurements and can be used in clinical practice. © 2014 Wiley Periodicals, Inc.
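
    For reference, the limits of agreement quoted above follow the usual Bland-Altman construction; a sketch, with \bar{d} the mean paired difference and s_d the standard deviation of the differences:

        \mathrm{LoA} = \bar{d} \pm 1.96\, s_d

    For crown-rump length, \bar{d} = -0.34% and 1.96 s_d ≈ 2.24%, which reproduces the quoted interval (-2.58%, 1.89%).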

  3. Seismic activity and environment protection in rock burst areas

    International Nuclear Information System (INIS)

    Travnicek, L.; Holecko, J.; Knotek, S.

    1993-01-01

    The significance of seismic activity caused by mining in the rock burst areas of the Ostrava-Karviná district is pointed out. The need for monitoring seismic activity at the Czech-Polish border, as required by the two-party international committee for exploitation of the coal supplies on the common border, is emphasized. The adverse effect of rock bursts on the surface is documented by examples provided by the Polish party. The technique of investigation at the DPB seismic polygon, which allows evaluation of the adverse impact of rock bursts on the environment, is described. (author) 1 fig., 8 refs

  4. The desktop interface in intelligent tutoring systems

    Science.gov (United States)

    Baudendistel, Stephen; Hua, Grace

    1987-01-01

    The interface between an Intelligent Tutoring System (ITS) and the person being tutored is critical to the success of the learning process. If the interface to the ITS is confusing or non-supportive of the tutored domain, the effectiveness of the instruction will be diminished or lost entirely. Consequently, the interface to an ITS should be highly integrated with the domain to provide a robust and semantically rich learning environment. In building an ITS for ZetaLISP on a LISP Machine, a Desktop Interface was designed to support a programming learning environment. Using the bitmapped display, windows, and mouse, three desktops were designed to support self-study and tutoring of ZetaLISP. Through organization, well-defined boundaries, and domain support facilities, the desktops provide substantial flexibility and power for the student and facilitate learning ZetaLISP programming while screening the student from the complex LISP Machine environment. The student can concentrate on learning ZetaLISP programming and not on how to operate the interface or a LISP Machine.

  5. 3D seismic surveys for shallow targets

    Energy Technology Data Exchange (ETDEWEB)

    Lawton, D.C.; Stewart, R.R.; Bertram, M.B. [Calgary Univ., AB (Canada). Dept. of Geoscience, Consortium for Research in Elastic Wave Exploration Seismology

    2008-07-01

    Although 3D seismic surveys are generally used to map deep hydrocarbon plays, this study demonstrated that they can be useful for characterizing shallow targets, such as oilsands deposits. A high-resolution 3D seismic survey was undertaken to map shallow stratigraphy near Calgary, Alberta. The project demonstrated the efficacy of reflection seismic surveys for shallow targets ranging from 100 to 500 metres. The purpose of the program was to map shallow stratigraphy and structure to depths of up to 500m, and to investigate shallow aquifers in the study area. The results of the survey illustrated the opportunity that 3D seismic surveys provide for mapping shallow reflectors and the acquisition geometry needed to image them. Applications include mapping the distribution of shallow aquifers, delineating shallow coals and investigating oilsands deposits. 2 refs., 5 figs.

  6. Development of methodology and computer programs for the ground response spectrum and the probabilistic seismic hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Joon Kyoung [Semyung Univ., Research Institute of Industrial Science and Technol , Jecheon (Korea, Republic of)

    1996-12-15

    The objective of this study is to investigate and develop methodologies and corresponding computer codes, compatible with the domestic seismological and geological environments, for estimating the ground response spectrum and the probabilistic seismic hazard. Using the PSHA computer program, the Cumulative Probability Density Functions (CPDF) and Probability Density Functions (PDF) of the annual exceedance have been investigated for the analysis of the uncertainty space of the annual probability at ten seismic hazard levels of interest (0.1 g to 0.99 g). The cumulative probability functions and probability functions of the annual exceedance have also been compared to the results from different input parameter spaces.

  7. Detailed geological characterisation from seismic data

    Energy Technology Data Exchange (ETDEWEB)

    Peter Hatherly; Binzhong Zhou; Troy Peters; Milovan Urosevic [CRC Mining (Australia)

    2009-02-15

    The use of seismic reflection surveying continues to grow within Australia's underground coal mining regions of the Sydney and Bowen Basins. For this project, the potential for acoustic impedance inversion to complement the information available from conventional seismic surveys was investigated. Acoustic impedance is defined by the product of seismic P-wave velocity and rock density. The methods of seismic inversion have been developed mainly for the investigation of petroleum reservoirs. Commercial software packages are available, and for this project we utilised the Hampson and Russell software available at Curtin University of Technology. For the true-amplitude processing of the seismic data, the Promax software operated at Velseis Processing was used. Inversions were undertaken for three 3D seismic surveys and two 2D surveys. The sites were at Grasstree and North Goonyella Mines in the Bowen Basin and at West Cliff and Dendrobium Collieries in the Sydney Basin. An empirical relationship was derived between acoustic impedance and the newly developed Geophysical Strata Rating (GSR). This allows impedance values to be converted into GSR values that have more meaning in geotechnical assessment. To obtain satisfactory inversions, we used the model-based approach.
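
    The two quantities involved are straightforward to compute once sonic and density logs are available. A minimal sketch follows; the linear impedance-to-GSR coefficients are hypothetical placeholders, since the abstract does not publish the empirical relationship that was derived.

        # Minimal sketch (Python): acoustic impedance and a placeholder
        # impedance-to-GSR conversion. Coefficients a and b are invented.
        import numpy as np

        def acoustic_impedance(vp, rho):
            """Acoustic impedance = P-wave velocity x density."""
            return vp * rho

        def impedance_to_gsr(ai, a=0.01, b=-20.0):
            """Hypothetical linear mapping standing in for the project's
            empirical impedance-GSR relationship."""
            return a * ai + b

        vp = np.array([3500.0, 2300.0])    # m/s; second sample is coal-like
        rho = np.array([2.60, 1.40])       # g/cc; coal is conspicuously light
        print(impedance_to_gsr(acoustic_impedance(vp, rho)))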

  8. Perception Analysis of Desktop and Mobile Service Website

    OpenAIRE

    Khoiriyah, Rizqiyatul

    2016-01-01

    The research was conducted as a qualitative study of websites to explore more deeply and examine user perceptions of desktop and mobile website services. This research examined user perceptions of the desktop and mobile website services in use, applying qualitative methods adapted to the WebQual and User Experience approaches. This qualitative research referred to the theoretical framework written by Creswell (2014). The expected outcome is to know the user perceptions of the available ser...

  9. GUIdock-VNC: using a graphical desktop sharing system to provide a browser-based interface for containerized software.

    Science.gov (United States)

    Mittal, Varun; Hung, Ling-Hong; Keswani, Jayant; Kristiyanto, Daniel; Lee, Sung Bong; Yeung, Ka Yee

    2017-04-01

    Software container technology such as Docker can be used to package and distribute bioinformatics workflows consisting of multiple software implementations and dependencies. However, Docker is a command line-based tool, and many bioinformatics pipelines consist of components that require a graphical user interface. We present a container tool called GUIdock-VNC that uses a graphical desktop sharing system to provide a browser-based interface for containerized software. GUIdock-VNC uses the Virtual Network Computing protocol to render the graphics within most commonly used browsers. We also present a minimal image builder that can add our proposed graphical desktop sharing system to any Docker packages, with the end result that any Docker packages can be run using a graphical desktop within a browser. In addition, GUIdock-VNC uses the Oauth2 authentication protocols when deployed on the cloud. As a proof-of-concept, we demonstrated the utility of GUIdock-noVNC in gene network inference. We benchmarked our container implementation on various operating systems and showed that our solution creates minimal overhead. © The Authors 2017. Published by Oxford University Press.

  10. Fabrication of low cost soft tissue prostheses with the desktop 3D printer.

    Science.gov (United States)

    He, Yong; Xue, Guang-huai; Fu, Jian-zhong

    2014-11-27

    Soft tissue prostheses such as artificial ears, eyes and noses are widely used in maxillofacial rehabilitation. In this report we demonstrate how to fabricate a soft prosthesis mold with a low-cost desktop 3D printer. The fabrication method used is referred to as Scanning Printing Polishing Casting (SPPC). First the anatomy is scanned with a 3D scanner, then a tissue casting mold is designed on a computer and printed with a desktop 3D printer. Subsequently, a chemical polishing method is used to polish the casting mold, removing the staircase effect and acquiring a smooth surface. Finally, the last step is to cast medical grade silicone into the mold. After the silicone is cured, the fine soft prosthesis can be removed from the mold. Utilizing the SPPC method, soft prostheses with a smooth surface and complicated structure can be fabricated at a low cost. Accordingly, the total cost of fabricating an ear prosthesis is about $30, which is much lower than with current soft prosthesis fabrication methods.

  11. Desktop Publishing: Organizational Considerations for Adoption and Implementation. TDC Research Report No. 6.

    Science.gov (United States)

    Lee, Paul

    This report explores the implementation of desktop publishing in the Minnesota Extension Service (MES) and provides a framework for its implementation in other organizations. The document begins with historical background on the development of desktop publishing. Criteria for deciding whether to purchase a desktop publishing system, advantages and…

  12. Using distributed processing on a local area network to increase available computing power

    International Nuclear Information System (INIS)

    Capps, K.S.; Sherry, K.J.

    1996-01-01

    The migration from central computers to desktop computers distributed the total computing horsepower of a system over many different machines. A typical engineering office may have several networked desktop computers that are sometimes idle, especially after work hours and when people are absent. Users would benefit if applications were able to use these networked computers collectively. This paper describes a method of distributing the workload of an application on one desktop system to otherwise idle systems on the network. The authors present this discussion from a developer's viewpoint, because the developer must modify an application before the user can realize any benefit of distributed computing on available systems
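
    As a concrete illustration of the idea, here is a minimal sketch using Python's multiprocessing.connection as a stand-in for whatever remote-execution mechanism the authors built (the record does not name one): each otherwise-idle desktop runs a small listener, and the application's dispatcher farms work units out to them.

        # worker.py (Python) -- run on each otherwise-idle networked desktop.
        from multiprocessing.connection import Listener

        with Listener(("0.0.0.0", 6000), authkey=b"lan-demo") as server:
            while True:
                with server.accept() as conn:
                    task = conn.recv()                   # one work unit
                    conn.send(sum(x * x for x in task))  # stand-in computation

        # dispatcher.py -- run on the machine hosting the application.
        from multiprocessing.connection import Client

        chunks = [list(range(0, 50000)), list(range(50000, 100000))]
        hosts = ["desk01", "desk02"]        # hypothetical idle desktops
        results = []
        for host, chunk in zip(hosts, chunks):
            with Client((host, 6000), authkey=b"lan-demo") as conn:
                conn.send(chunk)
                results.append(conn.recv())
        print(sum(results))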

  13. Integration of web programming into desktop programming as an alternative reporting facility in the development of application programs

    Directory of Open Access Journals (Sweden)

    Mardainis Mardainis

    2017-11-01

    Full Text Available Abstract: Desktop programming produces application programs that can operate without relying on an Internet connection. Desktop programs are typically used for applications that will be operated without an Internet connection, with the working area confined to a single location, whereas web programs depend heavily on the Internet to connect their users. The choice between a desktop program and a web-based program is determined by the requirements and the deployment. If the deployment covers only a company environment at a single site, a desktop-based program is preferable; if the company has separate locations in several regions, a web-based program is more appropriate. However, many programmers, especially beginners, are reluctant to use desktop programming because producing reports requires a dedicated report-generation application such as Crystal Report. The difficulty with such dedicated applications is that they are not provided within the system and must be procured separately, and building reports can feel somewhat complicated because the report layout must be configured manually. In web-based programming languages, by contrast, displays of information can be produced easily within the program itself, without additional applications, so producing reports with a web-based program is easier. To spare programmers these difficulties when producing reports from desktop programs, the researchers integrated web-based programs with desktop-based programming with the aim of making report creation easier. Keywords: desktop programming, implementation, integration, Crystal Report.

  14. Integrated system for seismic evaluations

    International Nuclear Information System (INIS)

    Xu, J.; Philippacopoulos, A.J.; Miller, C.A.; Costantino, C.J.; Graves, H.

    1989-01-01

    This paper describes the various features of the Seismic Module of the CARES system (Computer Analysis for Rapid Evaluation of Structures). This system was developed by Brookhaven National Laboratory (BNL) for the US Nuclear Regulatory Commission to perform rapid evaluations of the structural behavior and capability of nuclear power plant facilities. CARES is structured in a modular format. Each module performs a specific type of analysis, i.e., static or dynamic, linear or nonlinear, etc. This paper describes the features of the Seismic Module in particular. The development of the Seismic Module of the CARES system is based on an approach which incorporates all major aspects of seismic analysis currently employed by the industry into an integrated system that allows computations of structural response to seismic motions to be carried out interactively. The code operates on a PC system and has multi-graphics capabilities. It has been designed with user-friendly features and allows interactive manipulation of various analysis phases during the seismic design process. The capabilities of the Seismic Module include (a) generation of artificial time histories compatible with given design ground response spectra, (b) development of Power Spectral Density (PSD) functions associated with the seismic input, (c) deconvolution analysis using vertically propagating shear waves through a given soil profile, and (d) development of in-structure response spectra or corresponding PSDs. It should be pointed out that these types of analyses can also be performed individually by using available computer codes such as FLUSH, SAP, etc. The uniqueness of CARES, however, lies in its ability to perform all required phases of the seismic analysis in an integrated manner. 5 refs., 6 figs

  15. MedlinePlus® Everywhere: Access from Your Phone, Tablet or Desktop

    Science.gov (United States)

    MedlinePlus® Everywhere provides a consistent user experience from a desktop, tablet, or phone, for all users regardless of how they access the site.

  16. A Course in Desktop Publishing.

    Science.gov (United States)

    Somerick, Nancy M.

    1992-01-01

    Describes "Promotional Publications," a required course for public relations majors, which teaches the basics of desktop publishing. Outlines how the course covers the preparation of publications used as communication tools in public relations, advertising, and organizations, with an emphasis upon design, layout, and technology. (MM)

  17. Adobe AIR, Bringing Rich Internet Applications to the Desktop

    OpenAIRE

    Vieriu, Valentin; Tuican, Catalin

    2009-01-01

    Rich Internet Applications are the new trend in software development today. Adobe AIR offers the possibility to create cross-platform desktop applications using popular Web technologies like HTML, JavaScript, Flash and Flex. This article is focused on presenting the advantages that this new environment has to offer for the web development community and how quickly you can develop a desktop application using Adobe AIR.

  18. FY 1998 basic survey for coal resource development. Data collection of the joint research of new technology in the geophysical exploration of coal resources (land area shallow seam survey); 1998 nendo sekitan shigen kaihatsu kiso chosa shiryoshu. Shintansa gijutsu chosa kaihatsu (rikuiki senso tansa)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    This is a compilation of the data on the coal resource land area shallow seam survey conducted in FY 1998 as the basic survey for coal resource development. The trend survey was made from July 26 to August 6, 1998. The purposes of the survey are to study the image analysis method, examples of application of the reflection seismic survey to coal, and inversion technology. The data compilation includes the following: 1. Minutes of the proceedings of the FY 1998 Japan-Australia steering committee (in English). 2. Data/proceedings of the FY 1998 Japan-Australia technical study committee (in English). 3. Results of the GPS measurement of reflection seismic survey traverse lines in Caroona district. 4. List of parameters in the FY 1998 reflection seismic survey data processing. 5. Report on the work of inspection/repair of seismic pulse generator. 6. List of the data on diameter of the test boring conducted in FY 1998. 7. NEDO-DMR CAROONA DDH borehole core pictures. 8. Estimated curves. 9. Report on the trend survey of the FY 1998 coal resource development basic survey (land area shallow seam survey). 10. Pictures. 11. Data on the 1st (FY 1998) new exploration technology study committee. (NEDO)

  19. Detection of frictional heat in seismic faults by coal reflectance

    Science.gov (United States)

    Kitamura, M.; Mukoyoshi, H.; Fulton, P. M.; Hirose, T.

    2012-12-01

    Quantitative assessment of heat generation along a fault during coseismic faulting is of primary importance in understanding the dynamics of earthquakes. Evidence of substantial frictional heating along a fault is also a reliable indicator determining whether a fault has slipped at high velocity in the past, which is crucial for assessing earthquake and tsunami hazard. The reflectance measurement of vitrinite (one of the primary components of coals) has been considered a possible geothermometer of fault zones, especially in accretionary wedges where vitrinite fragments are common [e.g., Sakaguchi et al., 2011]. Under normal burial conditions, vitrinite reflectance (Ro) increases by irreversible maturation reaction as temperature is elevated and thus sensitively records the maximum temperature to which the vitrinite is subjected. However, the commonly used kinetic models of vitrinite maturation [e.g., Sweeney and Burnham, 1990] may not yield accurate estimates of the peak temperature in a fault zone resulting from fast frictional heating rates [Fulton and Harris, 2012]. Whether or not coal can mature in typical earthquake rise time (e.g., ~10 seconds) remains uncertain. Here we present the results of friction experiments aimed at revealing coal maturation by frictional heat generated at slip velocities representative of natural earthquakes of up to 1.3 m/s. All friction experiments were conducted on a mixture of 90 wt% quartz powder and 10 wt% coal grains for simulated fault gouge at three different velocities of 0.0013 m/s, 0.65 m/s and 1.3 m/s, a constant normal stress of 1.0 MPa and ~15 m displacement under anoxic, dry nitrogen atmosphere at room temperature. We also measured temperature in the gouge zone during faulting by thermocouples. The initial coal fragments consist of vitrinite, inertinite and liptinite. Although liptinite was easy to identify microscopically, it was difficult to discriminate between vitrinite and inertinite grains as their grain size

  20. Multifractal Analysis of Seismically Induced Soft-Sediment Deformation Structures Imaged by X-Ray Computed Tomography

    Science.gov (United States)

    Nakashima, Yoshito; Komatsubara, Junko

    Unconsolidated soft sediments deform and mix complexly by seismically induced fluidization. Such geological soft-sediment deformation structures (SSDSs) recorded in boring cores were imaged by X-ray computed tomography (CT), which enables visualization of the inhomogeneous spatial distribution of iron-bearing mineral grains as strong X-ray absorbers in the deformed strata. Multifractal analysis was applied to the two-dimensional (2D) CT images with various degrees of deformation and mixing. The results show that the distribution of the iron-bearing mineral grains is multifractal for less deformed/mixed strata and almost monofractal for fully mixed (i.e. almost homogenized) strata. Computer simulations of deformation of real and synthetic digital images were performed using the egg-beater flow model. The simulations successfully reproduced the transformation from the multifractal spectra into almost monofractal spectra (i.e. almost convergence on a single point) with an increase in deformation/mixing intensity. The present study demonstrates that multifractal analysis coupled with X-ray CT and the mixing flow model is useful to quantify the complexity of seismically induced SSDSs, standing as a novel method for the evaluation of cores for seismic risk assessment.
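
    A common way to extract such spectra from 2D images is the box-counting (moment) method; the sketch below estimates the generalized dimensions D(q), which vary with q for a multifractal measure and flatten out for a monofractal (well-mixed) one. It is a generic illustration, not the study's actual pipeline.

        # Minimal sketch (Python): box-counting generalized dimensions D(q)
        # of a grayscale image treated as a measure. q = 1 needs the limit
        # form and is omitted here.
        import numpy as np

        def generalized_dimensions(img, qs=(-2.0, 0.0, 2.0),
                                   sizes=(2, 4, 8, 16, 32)):
            p_total = img.sum()
            dims = {}
            for q in qs:
                ys, xs = [], []
                for s in sizes:
                    h = img.shape[0] // s * s
                    w = img.shape[1] // s * s
                    boxes = img[:h, :w].reshape(h // s, s, w // s, s)
                    boxes = boxes.sum(axis=(1, 3))
                    p = boxes[boxes > 0] / p_total
                    ys.append(np.log(np.sum(p ** q)) / (q - 1.0))
                    xs.append(np.log(s))
                dims[q] = np.polyfit(xs, ys, 1)[0]   # slope = D(q)
            return dims

        rng = np.random.default_rng(0)
        print(generalized_dimensions(rng.random((256, 256))))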

  1. Seismic shear wall ISP NUPEC's seismic ultimate dynamic response test. Comparison report

    International Nuclear Information System (INIS)

    1996-01-01

    In the seismic design of a nuclear power plant, evaluation of the ultimate strength of the nuclear reactor building is an important subject for assessment of the seismic reliability of the plant. In order to carry out the evaluation, the response characteristics of reinforced concrete seismic shear walls up to their ultimate state have to be understood. For this purpose, there is a need to develop reliable non-linear response analysis methods which enable reliable ultimate strength evaluation of nuclear reactor buildings. To meet this need, many computer codes have been developed; these computer codes are compared here. (K.A.)

  2. A NICE approach to managing large numbers of desktop PC's

    International Nuclear Information System (INIS)

    Foster, David

    1996-01-01

    The problems of managing desktop systems are far from resolved as we deploy increasing numbers of systems: PCs, Macintoshes and UN*X workstations. This paper concentrates on the solution adopted at CERN for the management of the rapidly increasing number of desktop PCs in use in all parts of the laboratory. (author)

  3. Adobe AIR, Bringing Rich Internet Applications to the Desktop

    Directory of Open Access Journals (Sweden)

    Valentin Vieriu

    2009-01-01

    Full Text Available Rich Internet Applications are the new trend in software development today. Adobe AIR offers the possibility to create cross-platform desktop applications using popular Web technologies like HTML, JavaScript, Flash and Flex. This article is focused on presenting the advantages that this new environment has to offer for the web development community and how quickly you can develop a desktop application using Adobe AIR.

  4. Scale of seismic and rock burst hazard in the Silesian companies in Poland

    Energy Technology Data Exchange (ETDEWEB)

    Renata Patynska; Jozef Kabiesz [Central Mining Institute, Katowice (Poland)

    2009-09-15

    At present, the seismic and rock burst hazard remains important in most hard coal mines in Poland. Recently there has been a significant increase in the seismic activity of the Silesian rock massif compared with previous years. In the period 1999-2008 the hard coal mines experienced 34 rock bursts. The causes of rockburst occurrence are presented based on an analysis of the rockbursts occurring in Polish hard coal mines. The scale of the rockburst hazard has been characterized with respect to the mining and geological conditions of the existing exploitation. Of the factors influencing the state of rockburst hazard, the most essential is considered to be the depth interval ranging from 600 m to 900 m. The basic factors that promote rockburst occurrence are as follows: seismogenic strata, edges and remnants, goaf, faults, pillars and excessive paneling. 5 refs., 3 figs., 2 tabs.

  5. Selected elements of rock burst state assessment in case studies from the Silesian hard coal mines

    Energy Technology Data Exchange (ETDEWEB)

    Jozef Kabiesz; Janusz Makowka [Central Mining Institute, Katowice (Poland)

    2009-09-15

    Exploitation of coal seams in the Upper Silesian Coal Basin is conducted in complex and difficult conditions. These difficulties are connected with the occurrence of many natural mining hazards and with limitations resulting from the surface infrastructure existing in this area. One of the most important problems of Polish mining is the rock burst hazard and the reliable evaluation of its condition. Over many years of mining practice in Poland, a comprehensive system for the evaluation and control of this hazard was developed. In the paper the main aspects of rock burst hazard state evaluation are presented, comprising: 1) rock mass proneness to rock bursts, i.e., investigation of rock strength properties, comprehensive parametric evaluation of rock mass proneness to rock bursts, prognosis of seismic events induced by mining operations, methods of computer-aided modelling of the distribution of stress and rock mass deformation parameters, and strategic rock mass classification under rock burst degrees; 2) immediate seismic and rock burst hazard state evaluation, i.e., the small-diameter test drilling method, the seismologic and seismoacoustic method, the comprehensive method of rock burst hazard state evaluation, and non-standard methods of evaluation; 3) legal aspects of rock burst hazard state evaluation. Selected elements of the hazard state evaluation system are illustrated with specific practical examples of their application. 11 refs., 14 figs.

  6. Development of seismic tomography software for hybrid supercomputers

    Science.gov (United States)

    Nikitin, Alexandr; Serdyukov, Alexandr; Duchkov, Anton

    2015-04-01

    Seismic tomography is a technique used for computing velocity model of geologic structure from first arrival travel times of seismic waves. The technique is used in processing of regional and global seismic data, in seismic exploration for prospecting and exploration of mineral and hydrocarbon deposits, and in seismic engineering for monitoring the condition of engineering structures and the surrounding host medium. As a consequence of development of seismic monitoring systems and increasing volume of seismic data, there is a growing need for new, more effective computational algorithms for use in seismic tomography applications with improved performance, accuracy and resolution. To achieve this goal, it is necessary to use modern high performance computing systems, such as supercomputers with hybrid architecture that use not only CPUs, but also accelerators and co-processors for computation. The goal of this research is the development of parallel seismic tomography algorithms and software package for such systems, to be used in processing of large volumes of seismic data (hundreds of gigabytes and more). These algorithms and software package will be optimized for the most common computing devices used in modern hybrid supercomputers, such as Intel Xeon CPUs, NVIDIA Tesla accelerators and Intel Xeon Phi co-processors. In this work, the following general scheme of seismic tomography is utilized. Using the eikonal equation solver, arrival times of seismic waves are computed based on assumed velocity model of geologic structure being analyzed. In order to solve the linearized inverse problem, tomographic matrix is computed that connects model adjustments with travel time residuals, and the resulting system of linear equations is regularized and solved to adjust the model. The effectiveness of parallel implementations of existing algorithms on target architectures is considered. During the first stage of this work, algorithms were developed for execution on
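
    The general scheme described above ends in a regularized linear solve; the sketch below shows that final step, with a damped least-squares (Tikhonov) regularization standing in for whatever regularization scheme the authors chose.

        # Minimal sketch (Python): one linearized tomography update.
        import numpy as np

        def slowness_update(G, residuals, damping=0.1):
            """Find slowness adjustments dm minimizing
            ||G dm - r||^2 + damping^2 ||dm||^2, where G holds ray path
            lengths per model cell and r the travel-time residuals."""
            n = G.shape[1]
            A = np.vstack([G, damping * np.eye(n)])
            b = np.concatenate([residuals, np.zeros(n)])
            dm, *_ = np.linalg.lstsq(A, b, rcond=None)
            return dm

        G = np.array([[1.0, 0.5], [0.2, 1.3], [0.9, 0.9]])  # 3 rays, 2 cells
        r = np.array([0.05, -0.02, 0.01])                   # seconds
        print(slowness_update(G, r))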

  7. Data processing of natural and induced events recorded at the seismic station Ostrava-Krásné Pole (OKC)

    Directory of Open Access Journals (Sweden)

    Novák Josef

    2001-09-01

    Full Text Available The operation of the seismic station Ostrava-Krásné Pole (OKC) (φ = 49.8352°N; λ = 18.1422°E), which is situated at present in an experimental gallery near the Ostrava planetarium, started in the year 1983, initially with analogue instrumentation. Modernization of the instrumentation at the station was aimed at the installation of a new digital data acquisition system and the respective software packages for data interpretation and transmission. The VISTEC data acquisition system is based on a PC which enables continuous recording of three-component short-period and medium-period systems with a sampling frequency of 20 Hz. The basic advantage of the adopted OS Linux is that it allows remote access (telnet) and transmission of the recorded data (ftp). Possible troubles in the seismic station operation can be quickly detected (even automatically) and all recorded data are available with minimal delay. The remote access also makes it possible to change the parameters of the measuring set-up. The standard form of the output data allows the application of standard software packages for visualisation and evaluation. The following formats are available: GSE2/CM6, GSE2/INT and MiniSEED. The output data sets can be compressed by a special procedure. For interactive interpretation of digital seismic data, the software package EVENT, developed in the Geophysical Institute AS CR, and the package WAVE, developed in the Institute of Geonics AS CR, are used. Experimental operation of digital seismographs at the station OKC confirmed the justification of its incorporation into the seismic stations of the Czech national seismological network (CNSN). Preliminary analysis of the digital data proved that the following groups of seismic events are recorded: earthquakes, induced seismic events from Polish copper and coal mines, induced seismic events from the Ostrava-Karviná Coal Basin, quarry blasts and weak regional seismic events of the

  8. Numerical Optimization Using Desktop Computers

    Science.gov (United States)

    1980-09-11

    geophysical, optical and economic analysis to compute a life-cycle cost for a design with a stated energy capacity. NISCO stands for NonImaging ... more efficiently by nonimaging optical systems than by conventional image-forming systems. The methodology of designing optimized nonimaging systems ... compound parabolic concentrators [Welford, W. T. and Winston, R., The Optics of Nonimaging Concentrators, Light and Solar Energy, p. ix, Academic

  9. Economic analysis of cloud-based desktop virtualization implementation at a hospital.

    Science.gov (United States)

    Yoo, Sooyoung; Kim, Seok; Kim, Taeki; Baek, Rong-Min; Suh, Chang Suk; Chung, Chin Youb; Hwang, Hee

    2012-10-30

    Cloud-based desktop virtualization infrastructure (VDI) is known for providing simplified management of applications and desktops, efficient management of physical resources, and rapid service deployment, as well as connection to the computer environment at anytime, anywhere, with any device. However, the economic validity of investing in the adoption of the system at a hospital has not been established. This study computed the actual investment cost of the hospital-wide VDI implementation at the 910-bed Seoul National University Bundang Hospital in Korea and the resulting effects (i.e., reductions in PC errors and difficulties, application and operating system update time, and account management time). Return on investment (ROI), net present value (NPV), and internal rate of return (IRR) indexes used for corporate investment decision-making were used for the economic analysis of VDI implementation. The results of the five-year cost-benefit analysis given for 400 Virtual Machines (VMs; i.e., 1,100 users in the case of SNUBH) showed that the break-even point was reached in the fourth year of the investment. At that point, the ROI was 122.6%, the NPV was approximately US$192,000, and the IRR showed an investment validity of 10.8%. From our sensitivity analysis of changing the number of VMs (in terms of the number of users), the greater the number of adopted VMs, the more investable the system was. This study confirms that the emerging VDI can have an economic impact on hospital information system (HIS) operation and utilization in a tertiary hospital setting.
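
    The three investment indexes are standard; a minimal sketch of computing two of them from a yearly cash-flow series follows. The figures are invented placeholders, not the hospital's data.

        # Minimal sketch (Python): NPV and IRR from yearly cash flows.
        def npv(rate, cashflows):
            """cashflows[t] is the net cash flow at end of year t."""
            return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

        def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-8):
            """Rate where NPV crosses zero, found by bisection (assumes an
            initial outlay followed by positive returns)."""
            while hi - lo > tol:
                mid = (lo + hi) / 2.0
                if npv(mid, cashflows) > 0:
                    lo = mid
                else:
                    hi = mid
            return (lo + hi) / 2.0

        flows = [-250000, 40000, 70000, 90000, 110000]  # placeholder USD
        print(round(npv(0.05, flows)), round(irr(flows), 4))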

  10. Economic analysis of cloud-based desktop virtualization implementation at a hospital

    Directory of Open Access Journals (Sweden)

    Yoo Sooyoung

    2012-10-01

    Full Text Available Abstract Background Cloud-based desktop virtualization infrastructure (VDI) is known for providing simplified management of applications and desktops, efficient management of physical resources, and rapid service deployment, as well as connection to the computer environment at anytime, anywhere, with any device. However, the economic validity of investing in the adoption of the system at a hospital has not been established. Methods This study computed the actual investment cost of the hospital-wide VDI implementation at the 910-bed Seoul National University Bundang Hospital in Korea and the resulting effects (i.e., reductions in PC errors and difficulties, application and operating system update time, and account management time). Return on investment (ROI), net present value (NPV), and internal rate of return (IRR) indexes used for corporate investment decision-making were used for the economic analysis of VDI implementation. Results The results of the five-year cost-benefit analysis given for 400 Virtual Machines (VMs; i.e., 1,100 users in the case of SNUBH) showed that the break-even point was reached in the fourth year of the investment. At that point, the ROI was 122.6%, the NPV was approximately US$192,000, and the IRR showed an investment validity of 10.8%. From our sensitivity analysis of changing the number of VMs (in terms of the number of users), the greater the number of adopted VMs, the more investable the system was. Conclusions This study confirms that the emerging VDI can have an economic impact on hospital information system (HIS) operation and utilization in a tertiary hospital setting.

  11. Full-scope nuclear training simulator -brought to the desktop

    International Nuclear Information System (INIS)

    LaPointe, D.J.; Manz, A.; Hall, G.S.

    1997-01-01

    RighTSTEP is a suite of simulation software which was initially designed to facilitate the upgrade of Ontario Hydro's full-scope simulators, but is also adaptable to a variety of other roles. It is presently being commissioned at the Bruce A Training Simulator and has seen preliminary use in desktop and classroom roles. Because of the flexibility of the system, we anticipate it will see common use in the corporation for full-scope simulation roles. A key reason for developing RighTSTEP (Real Time Simulator Technology Extensible and Portable) was the need to modernize and upgrade the full-scope training simulator while protecting the investment in modelling code. This modelling code represents the end product of 18 years of evolution from the beginning of its development in 1979. Bringing this modelling code to a modern and more useful framework - the combination of simulator host, operating system, and simulator operating system - could also provide many spin-off benefits. The development (and first implementation) of the RighTSTEP system was cited for saving the corporation $5.6M and was recognized by a corporate New Technology Award last year. The most important spin-off from this project has been the desktop version of the full-scope simulator. The desktop simulator uses essentially the same software as its full-scope counterpart, and may be used for a variety of new purposes. Classroom and individual simulator training can now be easily accommodated, since a desktop simulator is both affordable and relatively easy to use. Further, a wide group of people can be trained using the desktop simulator; by contrast, the full-scope simulators were almost exclusively devoted to front-line operating staff. The desktop simulator is finding increasing use in support of engineering applications, owing to its easy accessibility, the breadth of station systems represented, and its tools for analysis and viewing. As further plant models are made available on the new simulator platform and

  12. Desktop Publishing in the University.

    Science.gov (United States)

    Burstyn, Joan N., Ed.

    Highlighting changes in the work of people within the university, this book presents nine essays that examine the effects of desktop publishing and electronic publishing on professors and students, librarians, and those who work at university presses and in publication departments. Essays in the book are: (1) "Introduction: The Promise of Desktop…

  13. seismic-py: Reading seismic data with Python

    Directory of Open Access Journals (Sweden)

    2008-08-01

    Full Text Available The field of seismic exploration of the Earth has changed dramatically over the last half a century. The Society of Exploration Geophysicists (SEG) has worked to create standards to store the vast amounts of seismic data in a way that will be portable across computer architectures. However, it has been impossible to predict the needs of the immense range of seismic data acquisition systems. As a result, vendors have had to bend the rules to accommodate the needs of new instruments and experiment types. For low-level access to seismic data, there is need for a standard open source library to allow access to a wide range of vendor data files that can handle all of the variations. A new seismic software package, seismic-py, provides an infrastructure for creating and managing drivers for each particular format. Drivers can be derived from one of the known formats and altered to handle any slight variations. Alternatively drivers can be developed from scratch for formats that are very different from any previously defined format. Python has been the key to making driver development easy and efficient to implement. The goal of seismic-py is to be the base system that will power a wide range of experimentation with seismic data and at the same time provide clear documentation for the historical record of seismic data formats.

  14. Comparative study of computational intelligence approaches for NOx reduction of coal-fired boiler

    International Nuclear Information System (INIS)

    Wei, Zhongbao; Li, Xiaolu; Xu, Lijun; Cheng, Yanting

    2013-01-01

    This paper focuses on NOx emission prediction and operating parameter optimization for coal-fired boilers. A Support Vector Regression (SVR) model based on the CGA (Conventional Genetic Algorithm) was proposed to model the relationship between the operating parameters and the concentration of NOx emission. Then the CGA and two modified algorithms, the Quantum Genetic Algorithm (QGA) and the SAGA (Simulated Annealing Genetic Algorithm), were employed to optimize the operating parameters of the coal-fired boiler to reduce NOx emission. The results showed that the proposed SVR model was more accurate than the widely used Artificial Neural Network (ANN) model when employed to predict the concentration of NOx emission. The mean relative error and correlation coefficient calculated by the proposed SVR model were 2.08% and 0.95, respectively. Among the three optimization algorithms implemented in this paper, the SAGA showed superiority to the other two algorithms considering the quality of solution within a given computing time. The SVR plus SAGA method was preferable for predicting the concentration of NOx emission and further optimizing the operating parameters to achieve low NOx emission for coal-fired boilers. - Highlights: • The CGA-based SVR model is proposed to predict the concentration of NOx emission. • The CGA-based SVR model performs better than the widely used ANN model. • The CGA and two modified algorithms are compared to optimize the parameters. • The SAGA is preferable for its high quality of solution and low computing time. • The SVR plus SAGA is successfully employed to optimize the operating parameters

  15. Acquisition Footprint Attenuation Driven by Seismic Attributes

    Directory of Open Access Journals (Sweden)

    Cuellar-Urbano Mayra

    2014-04-01

    Full Text Available Acquisition footprint, one of the major problems that PEMEX faces in seismic imaging, is noise highly correlated with the geometric array of sources and receivers used for onshore and offshore seismic acquisition. It prevails in spite of measures taken during acquisition and data processing. This pattern, spread throughout the image, is easily confused with geological features and misguides seismic attribute computation. In this work, we use seismic data from PEMEX Exploración y Producción to show the conditioning process for removing random and coherent noise using linear filters. Geometric attributes used in a workflow were computed to obtain an acquisition footprint noise model and adaptively subtract it from the seismic data.
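
    The adaptive subtraction mentioned at the end can be done with a least-squares matching scale per trace; a minimal sketch, assuming the footprint noise model has already been derived from the geometric attributes:

        # Minimal sketch (Python): least-squares adaptive subtraction of a
        # footprint noise model, one scalar per trace.
        import numpy as np

        def adaptive_subtract(data, noise_model):
            out = np.empty_like(data)
            for i, (d, n) in enumerate(zip(data, noise_model)):
                a = np.dot(n, d) / max(np.dot(n, n), 1e-12)  # best-fit scale
                out[i] = d - a * n
            return out

        rng = np.random.default_rng(1)
        footprint = np.tile(np.sin(np.linspace(0.0, 20.0, 200)), (10, 1))
        data = rng.standard_normal((10, 200)) + footprint
        cleaned = adaptive_subtract(data, footprint)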

  16. Soft computing analysis of the possible correlation between temporal and energy release patterns in seismic activity

    Science.gov (United States)

    Konstantaras, Anthony; Katsifarakis, Emmanouil; Artzouxaltzis, Xristos; Makris, John; Vallianatos, Filippos; Varley, Martin

    2010-05-01

    This paper is a preliminary investigation of the possible correlation of temporal and energy release patterns of seismic activity involving the preparation processes of consecutive sizeable seismic events [1,2]. The background idea is that during periods of low-level seismic activity, stress processes in the crust accumulate energy at the seismogenic area, whilst larger seismic events act as a decongesting mechanism releasing considerable energy [3,4]. A dynamic algorithm is being developed aiming to identify and cluster pre- and post-seismic events relative to the main earthquake, following on from research carried out by Zubkov [5] and Dobrovolsky [6,7]. This clustering technique, along with energy release equations dependent on Richter's scale [8,9], allows an estimate to be drawn of the amount of energy being released by the seismic sequence. The above approach is being implemented as a monitoring tool to investigate the behaviour of the underlying energy management system by introducing this information to various neural [10,11] and soft computing models [1,12,13,14]. The incorporation of intelligent systems aims towards the detection and simulation of the possible relationship between energy release patterns and time-intervals among consecutive sizeable earthquakes [1,15]. Anticipated successful training of the imported intelligent systems may result in a real-time, on-line processing methodology [1,16] capable of dynamically approximating the time-interval between the latest and the next forthcoming sizeable seismic event by monitoring the energy release process in a specific seismogenic area. Indexing terms: pattern recognition, long-term earthquake precursors, neural networks, soft computing, earthquake occurrence intervals References [1] Konstantaras A., Vallianatos F., Varley M.R. and Makris J. P.: ‘Soft computing modelling of seismicity in the southern Hellenic arc', IEEE Geoscience and Remote Sensing Letters, vol. 5 (3), pp. 323-327, 2008 [2] Eneva M. and
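
    The energy release equations dependent on Richter's scale cited as [8,9] are commonly taken to be the Gutenberg-Richter energy-magnitude relation; as a sketch (E in ergs, M the event magnitude):

        \log_{10} E = 11.8 + 1.5\,M

    so the energy released by a monitored sequence can be estimated as E_{\mathrm{total}} = \sum_i 10^{11.8 + 1.5 m_i}.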

  17. A flowsheet model of a coal-fired MHD/steam combined electricity generating cycle, using the access computer model

    International Nuclear Information System (INIS)

    Davison, J.E.; Eldershaw, C.E.

    1992-01-01

    This document forms the final report on a study of a coal-fired magnetohydrodynamic (MHD)/steam electric power generation system carried out by the British Coal Corporation for the Commission of the European Communities. The study objective was to provide mass and energy balances and overall plant efficiency predictions for MHD to assist the Commission in their evaluation of advanced power generation technologies. In early 1990 the British Coal Corporation completed a study for the Commission in which a computer flowsheet modelling package was used to predict the performance of a conceptual air-blown MHD plant. Since that study was carried out, increasing emphasis has been placed on the possible need to reduce CO2 emissions to counter the so-called greenhouse effect. Air-blown MHD could greatly reduce CO2 emissions per kWh by virtue of its high thermal efficiency. However, if even greater reductions in CO2 emissions were required, the CO2 produced by coal combustion may have to be disposed of, for example into the deep ocean or underground caverns. To achieve this at minimum cost a concentrated CO2 flue gas would be required. This could be achieved in an MHD plant by using a mixture of high-purity oxygen and recycled CO2 flue gas in the combustor. To assess this plant concept the European Commission awarded British Coal a contract to produce performance predictions using the access computer program

  18. Computational Fluid Dynamics Simulation of Oxygen Seepage in Coal Mine Goaf with Gas Drainage

    Directory of Open Access Journals (Sweden)

    Guo-Qing Shi

    2015-01-01

    Full Text Available Mine fires mainly arise from spontaneous combustion of coal seams and are a global issue that has attracted increasing public attention. Particularly in China, the closure of coal workfaces because of spontaneous combustion has contributed to substantial economic loss. To reduce the occurrence of mine fires, spontaneous coal combustion underground needs to be studied. In this paper, a computational fluid dynamics (CFD) model was developed for coal spontaneous combustion under goaf gas drainage conditions. The CFD model was used to simulate the distribution of oxygen in the goaf at the workface in a fully mechanized caving mine. The goaf was treated as an anisotropic medium, and the effects of methane drainage and oxygen consumption on spontaneous combustion were considered. The simulation results matched observational data from a field study, which indicates that CFD simulation is suitable for research on the distribution of oxygen in coal mines. The results also indicated that near the workface spontaneous combustion was more likely to take place in the upper part of the goaf than near the bottom, while further from the workface the risk of spontaneous combustion was greater in the lower part of the goaf. These results can be used to develop firefighting approaches for coal mines.

  19. Investigation of Coal-biomass Catalytic Gasification using Experiments, Reaction Kinetics and Computational Fluid Dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Battaglia, Francine [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States); Agblevor, Foster [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States); Klein, Michael [Univ. of Delaware, Newark, DE (United States); Sheikhi, Reza [Northeastern Univ., Boston, MA (United States)

    2015-12-31

    A collaborative effort involving experiments, kinetic modeling, and computational fluid dynamics (CFD) was used to understand the co-gasification of coal-biomass mixtures. The overall goal of the work was to determine the key reactive properties of coal-biomass mixed fuels. Sub-bituminous coal was mixed with biomass feedstocks to determine the fluidization and gasification characteristics of hybrid poplar wood, switchgrass and corn stover. It was found that corn stover and poplar wood were the best feedstocks to use with coal. The novel approach of this project was the use of a red mud catalyst to improve gasification and lower gasification temperatures; an important result was the reduction of biomass agglomeration when using the catalyst. An outcome of this work was the characterization of the chemical kinetics and reaction mechanisms of the co-gasification fuels, and the development of a set of models that can be integrated into other modeling environments. The multiphase flow code MFIX was used to simulate and predict the hydrodynamics and co-gasification, and the results were validated with the experiments. The reaction kinetics modeling was used to develop a smaller set of reactions, tractable for CFD calculations, that represented the experiments. Finally, an efficient tool, MCHARS, was developed and coupled with MFIX to efficiently simulate the complex reaction kinetics.

  20. Efficiency Sustainability Resource Visual Simulator for Clustered Desktop Virtualization Based on Cloud Infrastructure

    Directory of Open Access Journals (Sweden)

    Jong Hyuk Park

    2014-11-01

    Full Text Available Following IT innovations, manual operations have been automated, improving the overall quality of life. This has been possible because an organic topology has been formed among many diverse smart devices grafted onto real life. To provide services to these smart devices, enterprises or users use the cloud. Cloud services are divided into infrastructure as a service (IaaS), platform as a service (PaaS) and software as a service (SaaS). SaaS is operated on PaaS, and PaaS is operated on IaaS. Since IaaS is the foundation of all services, algorithms for the efficient operation of virtualized resources are required. Among these algorithms, desktop resource virtualization is used for high resource availability when existing desktop PCs are unavailable. For this high resource availability, clustering into hierarchical structures is important. In addition, since many clustering algorithms show different percentages of the main resources depending on the desktop PC distribution rates and environments, selecting appropriate algorithms is very important. If diverse attempts are made to find algorithms suitable for an operating environment's desktop resource virtualization, huge costs are incurred for the related power, time and labor. Therefore, in the present paper, a desktop resource virtualization clustering simulator (DRV-CS), a clustering simulator for selecting clusters of desktop virtualization to be maintained sustainably, is proposed. The DRV-CS provides simulations, so that clustering algorithms can be selected and elements can be properly applied in different desktop PC environments.

  1. Design Options for a Desktop Publishing Course.

    Science.gov (United States)

    Mayer, Kenneth R.; Nelson, Sandra J.

    1992-01-01

    Offers recommendations for development of an undergraduate desktop publishing course. Discusses scholastic level and prerequisites, purpose and objectives, instructional resources and methodology, assignments and evaluation, and a general course outline. (SR)

  2. Computational Software to Fit Seismic Data Using Epidemic-Type Aftershock Sequence Models and Modeling Performance Comparisons

    Science.gov (United States)

    Chu, A.

    2016-12-01

    Modern earthquake catalogs are often analyzed using spatial-temporal point process models such as the epidemic-type aftershock sequence (ETAS) models of Ogata (1998). My work implements three of the homogeneous ETAS models described in Ogata (1998). With a model's log-likelihood function, my software finds the Maximum-Likelihood Estimates (MLEs) of the model's parameters to estimate the homogeneous background rate and the temporal and spatial parameters that govern triggering effects. The EM algorithm is employed for its advantages of stability and robustness (Veen and Schoenberg, 2008). My work also presents comparisons among the three models in robustness, convergence speed, and implementation from theory to computing practice. Up-to-date regional seismic data of seismically active areas such as Southern California and Japan are used to demonstrate the comparisons. Data analysis has been done using the computer languages Java and R. Java has the advantages of being strongly typed and of easy control of memory resources, while R has the advantage of numerous available functions in statistical computing. Comparisons are also made between the two programming languages in convergence and stability, computational speed, and ease of implementation. Issues that may affect convergence, such as spatial shapes, are discussed.
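
    For context, the space-time ETAS conditional intensity of Ogata (1998), whose log-likelihood the software maximizes, has the general form

        \lambda(t, x, y \mid H_t) = \mu + \sum_{i:\, t_i < t} \kappa(m_i)\, g(t - t_i)\, f(x - x_i,\, y - y_i; m_i)

    where \mu is the homogeneous background rate, \kappa(m) is the expected number of events triggered by a magnitude-m parent, g is a normalized Omori-type temporal kernel and f a spatial kernel; the homogeneous models in Ogata (1998) differ mainly in the choice of the spatial kernel f.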

  3. Prediction, prevention and fight automatic control in mining an outburst prone coal seam; Control Automatico de las Medidas de Prediccion, Prevencion y Lucha, para la Explotacion Mecanizada de una Capa Susceptible de desprendimientos Instantaneos

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-07-01

    Outbursts are instantaneous catastrophic failures of the coal mine structure, characterised by emissions of large quantities of finely divided coal dust and gas from a coal face. In Spain, outbursts have represented a serious problem in the zone of Aller (Asturias), particularly in the San Antonio mine. The main objective of the research work was to determine whether outburst-precursory microseismic activity could be discerned by monitoring the coal mining district of San Antonio. Microseismic activity on the underground seismometer contains distinct styles of events: natural events, stress events and outburst events. The correlation between the rate of extraction and the number of microseismic events has been analysed. Real-time software was developed to discriminate mining activity from background seismic noise. The algorithm used for event detection is based on comparing the short term average (STA) with the long term average (LTA) of the signal energy. (Author)
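
    A minimal sketch of the STA/LTA trigger just described follows; the window lengths and threshold are illustrative, not the values tuned for San Antonio.

        # Minimal sketch (Python): STA/LTA event detection on signal energy.
        import numpy as np

        def sta_lta_trigger(x, fs, sta_s=0.5, lta_s=10.0, threshold=3.0):
            energy = x.astype(float) ** 2
            nsta, nlta = int(sta_s * fs), int(lta_s * fs)
            sta = np.convolve(energy, np.ones(nsta) / nsta, mode="same")
            lta = np.convolve(energy, np.ones(nlta) / nlta, mode="same")
            return sta / np.maximum(lta, 1e-12) > threshold

        fs = 100.0                                  # Hz, illustrative
        trace = np.random.default_rng(2).standard_normal(6000)
        trace[3000:3100] += 8.0                     # synthetic "event"
        print(np.flatnonzero(sta_lta_trigger(trace, fs))[:5])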

  4. A Comparison between Model Base Hardconstrain, Bandlimited, and Sparse-Spike Seismic Inversion: New Insights for CBM Reservoir Modelling on Muara Enim Formation, South Sumatra

    Science.gov (United States)

    Mohamad Noor, Faris; Adipta, Agra

    2018-03-01

    Coal Bed Methane (CBM), a newly developed resource in Indonesia, is one of the alternatives for relieving Indonesia's dependence on conventional energy sources. The coal of the Muara Enim Formation is known as one of the prolific reservoirs in the South Sumatra Basin. Seismic inversion and well analysis are performed to determine the coal seam characteristics of the Muara Enim Formation. This research uses three inversion methods: model-based hard-constraint, band-limited, and sparse-spike inversion. Each type of seismic inversion has its own advantages in displaying the coal seam and its characteristics. Interpretation of the analysis data shows that the Muara Enim coal seam has a gamma ray value of 20 API, a density of 1.0-1.4 g/cc, and a low AI cutoff value in the range 5000-6400 (m/s)*(g/cc). The distribution of the coal seam thins laterally from northwest to southeast. The coal seam appears biased in the model-based hard-constraint inversion and discontinuous in the band-limited inversion, neither of which matches the geological model. The most appropriate AI inversion is the sparse-spike inversion, whose cross-plot correlation of 0.884757 is the best among the chosen inversion methods. Sparse-spike inversion also resolves high amplitudes well, making it a proper tool for identifying the continuity of coal seams, which commonly appear as thin layers. The cross-sectional sparse-spike inversion shows possible new borehole locations at CDP 3662-3722, CDP 3586-3622, and CDP 4004-4148, which are seen in the seismic data as a thick coal seam.

  5. The forecast of mining-induced seismicity and the consequent risk of damage to the excavation in the area of seismic event

    Directory of Open Access Journals (Sweden)

    Jan Drzewiecki

    2017-01-01

    forecast of the seismic energy of a shock with a defined source location: the value of the coefficient λ of dispersion/attenuation of seismic energy, and the flux of seismic energy at predetermined distances r from the tremor source. The proposed solution for forecasting the seismic energy of tremors and the level of risk of damage to the excavation during mining operations is helpful in the development of bump prevention. Changing the intensity of mining operations enables the level of the seismic energy induced by the operations to be “controlled”, both at the development stage and during the extraction of a seam by the longwall method. The presented solution has been produced for an area disturbed by the mining of coal seam 510 in the Jas-Mos hard coal mine. An original program developed by CMI was used for the calculations.

  6. High resolution reflection seismic mapping of shallow coal seams

    CSIR Research Space (South Africa)

    Mngadi, SB

    2013-10-01

    the extent of the mine workings. Two 94 m profiles (tied to boreholes) were surveyed using a sledgehammer source. Processing was optimized to image the shallow reflections. The refraction seismic models and stacked time sections were compared and integrated...

  7. Inkjet printing of transparent sol-gel computer generated holograms

    NARCIS (Netherlands)

    Yakovlev, A.; Pidko, E.A.; Vinogradov, A.

    2016-01-01

    In this paper we report for the first time a method for the production of transparent computer generated holograms by desktop inkjet printing. Here we demonstrate a methodology suitable for the development of a practical approach towards fabrication of diffraction patterns using a desktop inkjet

  8. A Unified Algorithm for Virtual Desktops Placement in Distributed Cloud Computing

    Directory of Open Access Journals (Sweden)

    Jiangtao Zhang

    2016-01-01

    Distributed cloud has been widely adopted to support service requests from dispersed regions, especially by large enterprises that request virtual desktops for multiple geo-distributed branch companies. The cloud service provider (CSP) aims to deliver satisfactory services at the least cost. The CSP selects proper data centers (DCs) closer to the branch companies so as to shorten the response time to user requests. At the same time, it strives to cut costs at both the DC level and the server level. At the DC level, consumption of expensive long-distance inter-DC bandwidth should be reduced and lower electricity prices are sought. Inside each tree-like DC, as few servers as possible should be used, so as to save equipment cost and power. By nature, there is a noncooperative relation between the DC-level and server-level selections. To attain these objectives and capture this noncooperative relation, multiobjective bilevel programming is used to formulate the problem. A unified genetic algorithm is then proposed to solve the problem, realizing the selection of DC and server simultaneously. Extensive simulation shows that the proposed algorithm outperforms the baseline algorithm in both guaranteeing quality of service and saving cost.

  9. Microsoft Virtualization Master Microsoft Server, Desktop, Application, and Presentation Virtualization

    CERN Document Server

    Olzak, Thomas; Boomer, Jason; Keefer, Robert M

    2010-01-01

    Microsoft Virtualization helps you understand and implement the latest virtualization strategies available with Microsoft products. This book focuses on: Server Virtualization, Desktop Virtualization, Application Virtualization, and Presentation Virtualization. Whether you are managing Hyper-V, implementing desktop virtualization, or even migrating virtual machines, this book is packed with coverage on all aspects of these processes. Written by a talented team of Microsoft MVPs, Microsoft Virtualization is the leading resource for a full installation, migration, or integration of virtual syste

  10. Working Inside The Box: An Example Of Google Desktop Search in a Forensic Examination

    Directory of Open Access Journals (Sweden)

    Timothy James LaTulippe

    2011-12-01

    The amount of information mankind generates, and the technology developed to store it, have increased tremendously over the past few decades. As the total amount of stored data rapidly increases in conjunction with the number of widely available computer-driven devices in use, solutions are being developed to better harness this data. These advancements continually assist investigators and computer forensic examiners. One such application which houses copious amounts of fruitful data is the Google Desktop Search program. Coupled with tested and verified techniques, examiners can exploit the power of this application to cater to their investigative needs. A real-world case example of these techniques and its outcome is presented within.

  11. Advanced computers and simulation

    International Nuclear Information System (INIS)

    Ryne, R.D.

    1993-01-01

    Accelerator physicists today have access to computers that are far more powerful than those available just 10 years ago. In the early 1980s, desktop workstations performed at less than one million floating-point operations per second (Mflops), and the realized performance of vector supercomputers was at best a few hundred Mflops. Today vector processing is available on the desktop, providing researchers with performance approaching 100 Mflops at a price measured in thousands of dollars. Furthermore, advances in Massively Parallel Processors (MPPs) have made performance of over 10 gigaflops a reality, and around mid-decade MPPs are expected to be capable of teraflops performance. Along with advances in MPP hardware, researchers have also made significant progress in developing algorithms and software for MPPs. These changes have had, and will continue to have, a significant impact on the work of computational accelerator physicists. Now, instead of running particle simulations with just a few thousand particles, we can perform desktop simulations with tens of thousands of simulation particles, and calculations with well over 1 million particles are being performed on MPPs. In the area of computational electromagnetics, simulations that used to be performed only on vector supercomputers now run in several hours on desktop workstations, and researchers are hoping to perform simulations with over one billion mesh points on future MPPs. In this paper we discuss the latest advances, and what can be expected in the near future, in hardware, software and applications codes for advanced simulation of particle accelerators.

  12. ADAM (Affordable Desktop Application Manager): a Unix desktop application manager

    International Nuclear Information System (INIS)

    Liebana, M.; Marquina, M.; Ramos, R.

    1996-01-01

    ADAM stands for Affordable Desktop Application Manager. It is a GUI developed at CERN with the aim of easing access to applications. The motivation to develop ADAM came from the unavailability of environments like COSE/CDE and their heavy resource consumption. ADAM has proven to be user friendly: new users are able to customize it to their needs in a few minutes. Groups of users may share a common application environment through ADAM. ADAM also integrates the Unix and PC worlds: PC users can access Unix applications in the same way as their usual Windows applications. This paper describes all the ADAM features, how they are used at CERN Public Services, and the future plans for ADAM. (author)

  13. Thermodynamic analysis and conceptual design for partial coal gasification air preheating coal-fired combined cycle

    Science.gov (United States)

    Xu, Yue; Wu, Yining; Deng, Shimin; Wei, Shirang

    2004-02-01

    The partial coal gasification air-preheating coal-fired combined cycle (PGACC) is a clean coal power system which integrates coal gasification technology, circulating fluidized bed technology, and combined cycle technology. It has high efficiency and simple construction, and is a new option among clean coal power systems. A thermodynamic analysis of the PGACC is carried out. The effects of the coal gasifying rate, air preheat temperature, and coal gas temperature on the performance of the power system are studied. A conceptual design is suggested for repowering a 100 MW power plant using the PGACC. The computational results show that the PGACC is feasible for modernizing old steam power plants and for building new clean coal power plants.

  14. Cubby : Multiscreen Desktop VR Part III

    NARCIS (Netherlands)

    Djajadiningrat, J.P.; Gribnau, M.W.

    2000-01-01

    In this month's final episode of our 'Cubby: Multiscreen Desktop VR' trilogy we explain how you read the InputSprocket driver from part II, how you use it as input for the cameras from part I and how you calibrate the input device so that it leads to the correct head position.

  15. Cubby : Multiscreen Desktop VR Part II

    NARCIS (Netherlands)

    Gribnau, M.W.; Djajadiningrat, J.P.

    2000-01-01

    In this second part of our 'Cubby: Multiscreen Desktop VR' trilogy, we will introduce you to the art of creating a driver to read an Origin Instruments Dynasight input device. With the Dynasight, the position of the head of the user is established so that Cubby can display the correct images on its

  16. Salvo: Seismic imaging software for complex geologies

    Energy Technology Data Exchange (ETDEWEB)

    Ober, Curtis C.; Gjertsen, Rob; Womble, David E.

    2000-03-01

    This report describes Salvo, a three-dimensional seismic-imaging software package for complex geologies. Regions of complex geology, such as overthrusts and salt structures, can cause difficulties for many seismic-imaging algorithms used in production today. The paraxial wave equation and finite-difference methods used within Salvo can produce high-quality seismic images in these difficult regions. However, this approach comes with higher computational costs, which have been too expensive for standard production. Salvo uses improved numerical algorithms and methods, along with parallel computing, to produce high-quality images and to reduce the computational and data input/output (I/O) costs. This report documents the numerical algorithms implemented for the paraxial wave equation, including absorbing boundary conditions, phase corrections, imaging conditions, phase encoding, and reduced-source migration. It also describes I/O algorithms for large seismic data sets and images, and the parallelization methods used to obtain high efficiencies for both the computations and the I/O of seismic data sets. Finally, the report describes the steps required to compile, port and optimize the Salvo software, and the validation data sets used to help verify a working copy of Salvo.
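
    Salvo itself marches a one-way paraxial wave equation in depth; as a hedged illustration of the finite-difference stencil work that dominates such codes, here is one explicit time step of the simpler two-way 2D acoustic wave equation in Python. This is not Salvo's actual scheme, and np.roll wraps the edges periodically, whereas production codes use absorbing boundary conditions of the kind the report documents.

    import numpy as np

    def acoustic_step(p_prev, p_curr, vel, dt, dx):
        """One explicit time step of the 2D constant-density acoustic wave equation.

        p_prev, p_curr: pressure fields at the two previous time levels;
        vel: velocity model (same shape); dt, dx: time step and grid spacing.
        Stability requires roughly vel.max() * dt / dx < 1 / sqrt(2).
        """
        lap = (np.roll(p_curr, 1, 0) + np.roll(p_curr, -1, 0) +
               np.roll(p_curr, 1, 1) + np.roll(p_curr, -1, 1) - 4.0 * p_curr) / dx ** 2
        return 2.0 * p_curr - p_prev + (vel * dt) ** 2 * lap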

  17. The File Sync Algorithm of the ownCloud Desktop Clients

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    The ownCloud desktop clients provide file syncing between desktop machines and the ownCloud server, and are available for all major desktop platforms. This presentation will give an overview of the sync algorithm used by the clients to provide a fast, reliable and robust syncing experience for users. It will describe the phases a sync run goes through and how it is triggered. It will also provide insight into the algorithm that decides whether a file is uploaded, downloaded or even deleted, either on the local machine or in the cloud. Some examples of non-obvious situations in file syncing will be described and discussed. As the ownCloud sync protocol is based on the open standard WebDAV, the resulting challenges and their solutions will be illustrated. Finally, a couple of frequently proposed enhancements will be reviewed and assessed for the future development of the ownCloud server and syncing clients.
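
    The upload/download/delete decision mentioned above can be sketched as a three-way comparison between the local state, the remote state, and the state recorded after the last successful sync run. The Python sketch below is a generic illustration of such logic, not ownCloud's actual algorithm; the use of WebDAV ETags as state identifiers is an assumption for the example.

    from enum import Enum

    class Action(Enum):
        NONE = "no-op"
        UPLOAD = "upload"
        DOWNLOAD = "download"
        DELETE_LOCAL = "delete local copy"
        DELETE_REMOTE = "delete remote copy"
        CONFLICT = "conflict"

    def decide(local_etag, remote_etag, db_etag):
        """Three-way sync decision for one file.

        Each argument is a state identifier (e.g. a WebDAV ETag), or None if
        the file is absent on that side; db_etag is the state recorded after
        the last successful sync run.
        """
        if local_etag == remote_etag:
            return Action.NONE                    # both sides agree
        if remote_etag == db_etag:                # only the local side changed
            return Action.UPLOAD if local_etag else Action.DELETE_REMOTE
        if local_etag == db_etag:                 # only the remote side changed
            return Action.DOWNLOAD if remote_etag else Action.DELETE_LOCAL
        return Action.CONFLICT                    # both sides changed independently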

  18. Cogeneration Technology Alternatives Study (CTAS). Volume 6: Computer data. Part 1: Coal-fired no-cogeneration process boiler, section B

    Science.gov (United States)

    Knightly, W. F.

    1980-01-01

    About fifty industrial processes from the largest energy-consuming sectors were used as a basis for matching a similar number of energy conversion systems considered as candidates that could be made available in the 1985 to 2000 time period. The sectors considered included food, textiles, lumber, paper, chemicals, petroleum, glass, and primary metals. The energy conversion systems included steam and gas turbines, diesels, thermionics, Stirling engines, closed-cycle and steam-injected gas turbines, and fuel cells. Fuels considered were coal; coal- and petroleum-based residual and distillate liquid fuels; and low-Btu gas obtained through on-site gasification of coal. Computer-generated reports are presented of the fuel consumption and savings, capital costs, economics and emissions of the cogeneration energy conversion systems (ECSs) heat- and power-matched to the individual industrial processes. National fuel and emissions savings are also reported for each ECS assuming it alone is implemented. Two no-cogeneration base cases are included: coal-fired and residual-fired process boilers.

  19. Wireless Cloud Computing on Guided Missile Destroyers: A Business Case Analysis

    Science.gov (United States)

    2013-06-01

    Cloud Computing Network (WCCN) onboard Guided Missile Destroyers (DDGs) utilizing tablet computers. It compares the life-cycle costs of WCCNs utilizing tablet computers against those of a mixed network of thin clients and desktop computers. Currently, the Consolidated Afloat Networks and Enterprise Services (CANES) program will install both thin clients and desktops on board new and old DDGs to implement the unclassified portion of its network. The main cost benefits of tablets will be realized through energy savings and an increase in productivity. The net present value of tablets is
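
    The net-present-value comparison at the heart of such a business case analysis reduces to discounting yearly cash flows. A minimal Python sketch follows; the figures are purely illustrative and are not the study's numbers.

    def npv(cash_flows, rate):
        """Net present value of yearly cash flows (year 0 first)."""
        return sum(cf / (1.0 + rate) ** year for year, cf in enumerate(cash_flows))

    # Illustrative only: up-front tablet purchase, then yearly energy/productivity savings.
    tablet_flows = [-120_000.0] + [35_000.0] * 5
    print(f"NPV over 5 years at 7%: {npv(tablet_flows, 0.07):,.0f} USD")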

  20. The KnowRISK project - Know your city, Reduce seISmic risK through non-structural elements

    Science.gov (United States)

    Sousa Oliveria, Carlos; Amaral Ferreira, Mónica; Lopez, Mário; Sousa Silva, Delta; Musacchio, Gemma; Rupakhety, Rajesh; Falsaperla, Susanna; Meroni, Fabrizio; Langer, Horst

    2016-04-01

    Historically, there is a tendency to focus on the seismic structural performance of buildings, neglecting the potential for damage to non-structural elements. In particular, non-structural elements of buildings are their architectural parts (i.e. partitions, ceilings, cladding), electrical and mechanical components (i.e., distribution panels, piping, plumbing), and contents (e.g., furniture, bookcases, computers and desktop equipment). Damage to these elements often contributes significantly to earthquake impacts. In the 1999 Izmit Earthquake, Turkey, 50% of the injuries and 3% of human losses were caused by non-structural failures. In the 2010-2011 Christchurch Earthquakes (New Zealand), 40% of building damage was induced by non-structural malfunctions. Around 70%-85% of construction cost goes into these elements, and their damage can strongly influence the ability of communities to cope with and recover from earthquakes. The project Know your city, Reduce seISmic risK through non-structural elements (KnowRISK) aims at facilitating local communities' access to expert knowledge on non-structural seismic protection solutions. The project will study seismic scenarios critical for non-structural damage, produce a portfolio of non-structural protection measures and investigate the level of awareness in specific communities. We will implement risk communication strategies that take into account the social and cultural background, and a participatory approach to raise awareness in local communities. The paradox between the progress of scientific knowledge and the ongoing increase of losses from natural disasters worldwide is a well-identified gap in the UN Hyogo Framework for Action 2005-2015, in which one of the main priorities is the investment in "knowledge use, innovation and education to build a culture of safety and resilience". KnowRISK is well aligned with these priorities and will contribute to participatory action aimed at: i) transferring expert knowledge

  1. Implementation Issues of Virtual Desktop Infrastructure and Its Case Study for a Physician's Round at Seoul National University Bundang Hospital

    OpenAIRE

    Yoo, Sooyoung; Kim, Seok; Kim, Taegi; Kim, Jon Soo; Baek, Rong-Min; Suh, Chang Suk; Chung, Chin Youb; Hwang, Hee

    2012-01-01

    Objectives The cloud computing-based virtual desktop infrastructure (VDI) allows access to computing environments with no limitations in terms of time or place such that it can permit the rapid establishment of a mobile hospital environment. The objective of this study was to investigate the empirical issues to be considered when establishing a virtual mobile environment using VDI technology in a hospital setting and to examine the utility of the technology with an Apple iPad during a physici...

  2. The age, palaeoclimate, palaeovegetation, coal seam architecture/mire types, paleodepositional environments and thermal maturity of syn-collision paralic coal from Mukah, Sarawak, Malaysia

    Science.gov (United States)

    Sia, Say-Gee; Abdullah, Wan Hasiah; Konjing, Zainey; Koraini, Ahmad Munif

    2014-02-01

    The Mukah coal accumulated in the Balingian Formation, whose time-stratigraphic position is poorly defined by fauna, though a probable Late Miocene age has always been assigned to it. Samples collected in the present study yielded an abundance of Casuarina pollen associated with occurrences of Dacrydium, Stenochlaena palustris, Florschuetzia levipoli and Stenochlaena areolaris spores; these compare closely to zone PR9 of the palynological zonation of the Malay Basin and can be tied to depositional sequences of the Malay Basin seismic sequences I2000/I3000, indicating an Early Miocene age for the studied coal. The Early Miocene age shows that the Mukah coal was formed during the collision of the Luconia Block-Dangerous Grounds with Borneo, which lasted from the Late Eocene to the late Early Miocene. The rapid increase of the depositional base level caused by the collision is clearly reflected in the architecture of the Mukah coal seams, which are generally thin, and also in the reverse order of the paleo-peat bodies.

  3. Development of a software concept for computer-aided technical detail planning for machines in German hard coal mining; Entwicklung eines Softwarekonzeptes fuer rechnergestuetzte maschinentechnische Datailplanung im deutschen Steinkohlenbergbau

    Energy Technology Data Exchange (ETDEWEB)

    Borstell, D

    1994-12-31

    CAD systems have long been an aid in German hard coal mining for reducing costs in all technical planning tasks. In the future, the use of computers will offer as yet unused possibilities for further cost savings. To this end, this book introduces a new software concept for machine-related technical planning in the mines of the Ruhr. Through continued and thorough use of the potential of modern computer techniques, through the application of practice-oriented planning science, and by transferring computer-aided planning applications from other branches of industry, a further contribution is to be made to reducing the costs of technical planning. The heart of the future technical planning workplace will be a graphically oriented interface with a three-dimensional representation of the pit structure on the screen. On this interface, after choosing the work area within the pit structure, the available software tools can be accessed. These support the planning engineer in information and design work (database connections, 2D/3D CAD, libraries of equipment and standard parts), give support on the method of procedure (through expert systems, sample specifications, checklists), offer help in inspection and decision-making (via simulation and calculation routines, expert systems), and support publication activities (word processing, desktop publishing). The computer-aided planning system of the future will develop from the two-dimensional design environment usual today into a comprehensive integrated 3D engineering system. (orig.)

  4. Investigation into Mechanism of Floor Dynamic Rupture by Evolution Characteristics of Stress and Mine Tremors: A Case Study in Guojiahe Coal Mine, China

    Directory of Open Access Journals (Sweden)

    Guangjian Liu

    2018-01-01

    In order to explore the mechanism of floor dynamic rupture, the current study adopts a thin plate model to further investigate the conditions of floor failure. One possible explanation is floor buckling due to high horizontal stress and dynamic disturbance, ultimately leading to a rapid and massive release of elastic energy and thus inducing dynamic rupture. Seismic computed tomography and 3D location were employed to explore the evolution characteristics of the floor stress distribution and the positions of mine tremors. In the regions of floor dynamic rupture, higher P-wave velocity was recorded prior to the dynamic rupture, depicting a high stress concentration, whereas a relatively lower velocity was observed afterwards. Meanwhile, the evolution of mine tremors revealed the accumulation and subsequent release of energy during the dynamic rupture process. It was further revealed that dynamic rupture was induced by the superposition of static and dynamic stresses: (i) the high static stress concentration due to frontal and lateral abutment stress from the coal pillar, and (ii) dynamic stress from the fracture and caving of the coal pillar, hard roof, and key stratum. In the later part of this study, the floor dynamic rupture process is reproduced through numerical simulations within a 0.6 s time frame. These findings are used to propose a feasible mechanism for early warning and prevention of floor dynamic rupture using seismic computed tomography and 3D location of mine tremors.

  5. EPA Region 8, Memo on Desktop Printer Ink Cartridges Policy & Voluntary Printer Turn-in

    Science.gov (United States)

    This memo requests that EPA Region 8 users voluntarily turn in their desktop printers, and notifies users of the Region 8 policy of not providing maintenance or ink and toner cartridges for desktop printers.

  6. Carbon burnout of pulverised coal in power station furnaces

    Energy Technology Data Exchange (ETDEWEB)

    R.I. Backreedy; L.M. Fletcher; J.M. Jones; L. Ma; M. Pourkashanian; A. Williams; K. Johnson; D.J. Waldron; P. Stephenson [University of Leeds, Leeds (United Kingdom)

    2003-07-01

    The degree of carbon burnout in pulverised fuel fired power stations is important because it is linked with power plant efficiency and coal ash suitability for construction purposes. The use of computational methods to calculate carbon burnout in such systems has been aided by the increasing availability of fast computers and improvements in computational methodologies. Despite recent advances in fluid flow, coal devolatilisation and coal combustion models, the use of CFD methods for detailed design purposes or for the selection of commercial coals is still limited. In parallel, industrial engineering codes, which combine simplified thermal models with advanced coal combustion models, are still undergoing development since they provide economic advantages over detailed CFD analysis. Although the major coal combustion processes are well established, an understanding regarding the role of coal macerals and the influence of ash on the combustion process is still lacking. A successful coal model must be able to handle all the complexities of combustion, from the details of the burner geometry through to the formation of unburnt carbon as well as NOx. The development of such a model is described here.

  7. Geodynamic methods for assessing methane distribution in bituminous coal deposits and measures to intensify methane fluxes during mine gas drainage

    Directory of Open Access Journals (Sweden)

    Е. В. Гончаров

    2016-12-01

    This paper explores the states of methane within the coal-bearing stratum and shows the heavy dependence of intrastratal gas migration on the form of the pore space and the petrographic properties of the coal. Adsorbed methane is found to be predominant in the coal of the Kuznetsk Basin. Different forms of diffusion and filtration in coal are described, revealing their dependence on geological and thermodynamic conditions. The paper provides justification for a primary focus on geodynamic processes when designing gas drainage systems, and for the applicability of morphometric methods and remote sensing data to their identification. The significance of research into the processes that activate exothermic reactions and bring methane into the free state is explained. The paper presents the results of using seismic-acoustic stimulation techniques as one of the practical approaches to addressing this issue. The results of successful industrial testing have been compared with the results of numerical modelling of the stress-strain state, which can also be managed through seismic-acoustic stimulation.

  8. Warm Hearts/Cold Type: Desktop Publishing Arrives.

    Science.gov (United States)

    Kramer, Felix

    1991-01-01

    Describes desktop publishing (DTP) that may be suitable for community, activist, and nonprofit groups and discusses how it is changing written communication. Topics discussed include costs; laser printers; time savings; hardware and software selection; and guidelines to consider when establishing DTP capability. (LRW)

  9. Coal Transition in Spain. An historical case study for the project 'Coal Transitions: Research and Dialogue on the Future of Coal'

    International Nuclear Information System (INIS)

    Del Rio, Pablo

    2017-01-01

    This is one of the six country case studies commissioned to collect experience on past coal transitions. The six countries are: Czech Republic, the Netherlands, Poland, Spain, UK, USA. Their role in the Coal Transitions project was to provide background information for a Synthesis Report for decision makers, and to provide general lessons for national project teams to take into account in developing their coal transition pathways for the future. Spain has had a long tradition of coal mining at least since the 18th century. However, it is also one of the jurisdictions that has committed to a phase-out of subsidies and implemented it in recent times. This case study discusses the main features of the coal transition in Spain, the factors influencing this transition, and the policies which both drove it and mitigated its detrimental socioeconomic effects on workers and regions. The analysis is based on desktop research of relevant documents, including official communications from the Ministry of Industry (MINETUR) and the European Commission, as well as position statements from the industry association (CARBUNION) and the labour unions (UGT and CCOO). Documents on national coal from other institutions (foundations, NGOs) have also been consulted. Finally, an analysis of articles in the mass media, which contains useful statements from different types of stakeholders, has been carried out. A strong reduction in production and employment in the coal industry has been experienced in this country at least over the last two decades. Successive government plans have aimed at reducing coal production, early retirement of workers and closing mines. Caught between the mining coalition on the one hand and EU legislation and public opinion on the other, the government has had to approve drastic measures leading to phase-out. On the other hand, it has tried to accompany the phase-out with measures intended to mitigate the negative impact on the affected zones

  10. EXPERIMENTS AND COMPUTATIONAL MODELING OF PULVERIZED-COAL IGNITION; FINAL

    International Nuclear Information System (INIS)

    Samuel Owusu-Ofori; John C. Chen

    1999-01-01

    Under typical conditions of pulverized-coal combustion, which is characterized by fine particles heated at very high rates, there is currently a lack of certainty regarding the ignition mechanism of bituminous and lower-rank coals, as well as the ignition rate of reaction. Furthermore, there have been no previous studies aimed at examining these factors under various experimental conditions, such as particle size, oxygen concentration, and heating rate. Finally, there is a need to improve current mathematical models of ignition to realistically and accurately depict the particle-to-particle variations that exist within a coal sample. Such a model is needed to extract useful reaction parameters from ignition studies, and to interpret ignition data in a more meaningful way. The authors propose to examine fundamental aspects of coal ignition through (1) experiments to determine the ignition temperature of various coals by direct measurement, and (2) modeling of the ignition process to derive rate constants and to provide a more insightful interpretation of data from ignition experiments. The authors propose a novel laser-based ignition experiment to achieve the first objective. Laser-ignition experiments offer the distinct advantage of easy optical access to the particles because of the absence of a furnace or radiating walls, and thus permit direct observation and particle temperature measurement. The ignition temperature of different coals under various experimental conditions can therefore be determined by direct measurement using two-color pyrometry. The ignition rate constants, when ignition occurs heterogeneously, and the particle heating rates will both be determined from analyses based on these measurements.
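
    Two-color pyrometry, the measurement technique named in this record, infers temperature from the ratio of radiant intensities at two wavelengths. A minimal Python sketch under Wien's approximation and a grey-particle (equal emissivity) assumption follows; the wavelengths and intensities are illustrative, and real instruments add calibration factors.

    import numpy as np

    C2 = 1.4388e-2  # second radiation constant, m*K

    def two_color_temperature(i1, i2, lam1, lam2):
        """Temperature (K) from the intensity ratio at two wavelengths (metres).

        From Wien's law, I(lam) ~ eps * lam**-5 * exp(-C2 / (lam * T)); taking
        the ratio at lam1 and lam2 with equal emissivity and solving for T.
        """
        r = np.asarray(i1, dtype=float) / np.asarray(i2, dtype=float)
        return C2 * (1.0 / lam2 - 1.0 / lam1) / (np.log(r) - 5.0 * np.log(lam2 / lam1))

    # e.g. intensities measured at 700 nm and 900 nm:
    print(two_color_temperature(0.8, 1.9, 700e-9, 900e-9))   # roughly 2150 K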

  11. Detection of induced seismicity effects on ground surface using data from Sentinel 1A/1B satellites

    Science.gov (United States)

    Milczarek, W.

    2017-12-01

    Induced seismicity is the result of human activity and manifests itself in the form of shocks and vibration of the ground surface. One of the most common factors causing induced shocks is underground mining activity. Sufficiently strong, high-energy shocks may cause displacements of the ground surface. This type of shock can have a significant impact on buildings and infrastructure. Assessing the size and influence of induced seismicity on the ground surface is one of the major problems associated with mining activity. In Poland (Central Eastern Europe), induced seismicity occurs in the hard coal mining area of the Upper Silesian Coal Basin and in the area of the Legnica-Głogów Copper Basin. The study presents an assessment of the use of satellite radar (SAR) data for detecting the influence of induced seismicity in mining regions. Selected induced shocks from the period 2015-2017 which occurred in the Upper Silesian Coal Basin and the Legnica-Głogów Copper Basin areas have been analyzed. SAR data from the Sentinel 1A and Sentinel 1B satellites have been used in the calculations. The results indicate the possibility of quick and accurate detection of ground surface displacements after an induced shock. The results of the SAR data processing were compared with results from geodetic measurements. It has been shown that SAR data can be used to detect ground surface displacements over relatively small regions.

  12. Desktop Publishing: The New Wave in Business Education.

    Science.gov (United States)

    Huprich, Violet M.

    1989-01-01

    Discusses the challenges of teaching desktop publishing (DTP); the industry is in flux with the software packages constantly being updated. Indicates that the demand for those with DTP skills is great. (JOW)

  13. US and world coal trade

    Energy Technology Data Exchange (ETDEWEB)

    Stevens, B

    1988-07-01

    This paper reviews US coal trade with other countries. Despite being pressed to support domestic coal producers, US utilities are looking towards Colombia for more of their supplies. While the amount of Colombian coal imported into the US is small, it is the combination of this with coal imported from Australia, Canada and China which is causing concern. Studies indicate that the volume of coal imported into the US may rise to 3 Mt/year within three years. Coal exports may suffer if Brazil bans the import of significant quantities of US coking coal in retaliation for American trade sanctions imposed over Brazilian computer import barriers. Romania is also expected to impose tariffs on US imports, which will have an impact on US coal exported to Romania. The US remains the top coal exporter to the European Communities, but its lead has been cut back due to a big rise in Australian exports. A portion of the EC market has also been lost to the USSR and Poland. Meanwhile, Japan is resisting buying US steam coal because it is too expensive.

  14. Research and implementation of a Web-based remote desktop image monitoring system

    International Nuclear Information System (INIS)

    Ren Weijuan; Li Luofeng; Wang Chunhong

    2010-01-01

    This paper studies and implements a Web-based ISS (Image Snapshot Server) system using Java Web technology. The ISS system consists of a client web browser and a server. The server part is divided into three modules: the screenshot software, the web server and an Oracle database. The screenshot software captures the desktop environment of the remotely monitored PC and sends the pictures to a Tomcat web server for real-time display on the web. At the same time, these pictures are also saved in the Oracle database. Through the web browser, the monitoring person can view both real-time and historical desktop pictures of the monitored PC for a given period. It is very convenient for any user to monitor the desktop image of the remotely monitored PC. (authors)
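
    The sender side of such a screenshot-based monitor is straightforward to sketch. The original system is Java-based; the Python sketch below only illustrates the capture-and-POST loop, with a hypothetical endpoint and form field, and assumes the Pillow and requests packages plus a desktop session to grab.

    import io
    import time

    import requests                    # assumption: requests package available
    from PIL import ImageGrab          # assumption: Pillow available, desktop session

    UPLOAD_URL = "http://monitor.example.org/iss/upload"   # hypothetical endpoint

    def snapshot_loop(interval_s=30):
        """Periodically grab the desktop and POST it to the monitoring server.

        The server side (in the paper: Tomcat plus an Oracle database) would
        store each picture and serve it to browsers; the URL and form field
        here are illustrative, not from the paper.
        """
        while True:
            buf = io.BytesIO()
            ImageGrab.grab().save(buf, format="PNG")
            requests.post(
                UPLOAD_URL,
                files={"shot": ("desktop.png", buf.getvalue(), "image/png")},
                timeout=10,
            )
            time.sleep(interval_s)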

  15. Seismic monitoring: a unified system for research and verifications

    International Nuclear Information System (INIS)

    Thigpen, L.

    1979-01-01

    A system for characterizing either a seismic source or geologic media from observational data was developed. This resulted from an examination of the forward and inverse problems of seismology. The system integrates many seismic monitoring research efforts into a single computational capability. Its main advantage is that it unifies computational and research efforts in seismic monitoring. 173 references, 9 figures, 3 tables

  16. The GLOBE-Consortium: The Erasmus Computing Grid – Building a Super-Computer at Erasmus MC for FREE

    NARCIS (Netherlands)

    T.A. Knoch (Tobias)

    2005-01-01

    To meet the enormous computational needs of life-science research as well as clinical diagnostics and treatment, the Hogeschool Rotterdam and the Erasmus Medical Center are currently setting up one of the largest desktop computing grids in the world – The Erasmus Computing Grid.

  17. Computer usage and national energy consumption: Results from a field-metering study

    Energy Technology Data Exchange (ETDEWEB)

    Desroches, Louis-Benoit; Fuchs, Heidi; Greenblatt, Jeffery; Pratt, Stacy; Willem, Henry; Claybaugh, Erin; Beraki, Bereket; Nagaraju, Mythri; Price, Sarah; Young, Scott [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Energy Analysis & Environmental Impacts Dept., Environmental Energy Technologies Division]

    2014-12-01

    The electricity consumption of miscellaneous electronic loads (MELs) in the home has grown in recent years, and is expected to continue rising. Consumer electronics, in particular, are characterized by swift technological innovation, with varying impacts on energy use. Desktop and laptop computers make up a significant share of MELs electricity consumption, but their national energy use is difficult to estimate, given uncertainties around shifting user behavior. This report analyzes usage data from 64 computers (45 desktop, 11 laptop, and 8 unknown) collected in 2012 as part of a larger field monitoring effort of 880 households in the San Francisco Bay Area, and compares our results to recent values from the literature. We find that desktop computers are used for an average of 7.3 hours per day (median = 4.2 h/d), while laptops are used for a mean 4.8 hours per day (median = 2.1 h/d). The results for laptops are likely underestimated since they can be charged in other, unmetered outlets. Average unit annual energy consumption (AEC) for desktops is estimated to be 194 kWh/yr (median = 125 kWh/yr), and for laptops 75 kWh/yr (median = 31 kWh/yr). We estimate national annual energy consumption for desktop computers to be 20 TWh. National annual energy use for laptops is estimated to be 11 TWh, markedly higher than previous estimates, likely reflective of laptops drawing more power in On mode in addition to greater market penetration. This result for laptops, however, carries relatively higher uncertainty compared to desktops. Different study methodologies and definitions, changing usage patterns, and uncertainty about how consumers use computers must be considered when interpreting our results with respect to existing analyses. Finally, as energy consumption in On mode is predominant, we outline several energy savings opportunities: improved power management (defaulting to low-power modes after periods of inactivity as well as power scaling), matching the rated power
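
    The unit AEC figures above follow from simple mode-weighted arithmetic: power in each mode times daily hours in that mode, summed and annualized. A minimal Python sketch, with illustrative power draws rather than the study's metered values:

    def annual_energy_kwh(p_active_w, h_active, p_low_w):
        """Unit annual energy consumption from a two-mode usage profile.

        AEC (kWh/yr) = (P_active * h_active + P_low * (24 - h_active)) * 365 / 1000,
        with power in watts and h_active in hours per day.
        """
        return (p_active_w * h_active + p_low_w * (24.0 - h_active)) * 365.0 / 1000.0

    # e.g. a desktop drawing 70 W for 7.3 h/day and 3 W otherwise: ~205 kWh/yr
    print(annual_energy_kwh(70.0, 7.3, 3.0))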

  18. Reduction of computing time for seismic applications based on the Helmholtz equation by Graphics Processing Units

    NARCIS (Netherlands)

    Knibbe, H.P.

    2015-01-01

    The oil and gas industry makes use of computationally intensive algorithms to provide an image of the subsurface. The image is obtained by sending wave energy into the subsurface and recording the signal required for a seismic wave to reflect back to the surface from the Earth interfaces that may have

  19. Desktop Publishing: Things Gutenberg Never Taught You.

    Science.gov (United States)

    Bowman, Joel P.; Renshaw, Debbie A.

    1989-01-01

    Provides a desktop publishing (DTP) overview, including: advantages and disadvantages; hardware and software requirements; and future development. Discusses cost-effectiveness, confidentiality, credibility, effects on volume of paper-based communication, and the need for training in layout and design which DTP creates. Includes a glossary of DTP…

  20. Seismic safety margins research program. Phase I final report - Overview

    International Nuclear Information System (INIS)

    Smith, P.D.; Dong, R.G.; Bernreuter, D.L.; Bohn, M.P.; Chuang, T.Y.; Cummings, G.E.; Johnson, J.J.; Mensing, R.W.; Wells, J.E.

    1981-04-01

    The Seismic Safety Margins Research Program (SSMRP) is a multiyear, multiphase program whose overall objective is to develop improved methods for seismic safety assessments of nuclear power plants, using a probabilistic computational procedure. The program is being carried out at the Lawrence Livermore National Laboratory and is sponsored by the U.S. Nuclear Regulatory Commission, Office of Nuclear Regulatory Research. Phase I of the SSMRP was successfully completed in January 1981: A probabilistic computational procedure for the seismic risk assessment of nuclear power plants has been developed and demonstrated. The methodology is implemented by three computer programs: HAZARD, which assesses the seismic hazard at a given site, SMACS, which computes in-structure and subsystem seismic responses, and SEISIM, which calculates system failure probabilities and radioactive release probabilities, given (1) the response results of SMACS, (2) a set of event trees, (3) a family of fault trees, (4) a set of structural and component fragility descriptions, and (5) a curve describing the local seismic hazard. The practicality of this methodology was demonstrated by computing preliminary release probabilities for Unit 1 of the Zion Nuclear Power Plant north of Chicago, Illinois. Studies have begun aimed at quantifying the sources of uncertainty in these computations. Numerous side studies were undertaken to examine modeling alternatives, sources of error, and available analysis techniques. Extensive sets of data were amassed and evaluated as part of projects to establish seismic input parameters and to produce the fragility curves. (author)
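
    The core computation in this kind of probabilistic seismic risk chain is convolving a site hazard curve with a component fragility curve. The Python sketch below illustrates that step with a lognormal fragility and a toy power-law hazard curve; it is a generic illustration, not the actual HAZARD/SMACS/SEISIM code, and all numbers are made up.

    import numpy as np
    from scipy.stats import lognorm

    def annual_failure_frequency(pga, exceed_freq, median_cap, beta):
        """Convolve a hazard curve with a lognormal fragility curve.

        pga: ascending grid of peak ground accelerations (g);
        exceed_freq: annual exceedance frequency at each pga (hazard curve);
        median_cap, beta: median capacity (g) and log-standard deviation.
        """
        frag = lognorm.cdf(pga, s=beta, scale=median_cap)   # P(failure | a)
        dens = -np.gradient(exceed_freq, pga)               # occurrence density, -dH/da
        y = frag * dens
        return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(pga)))  # trapezoid rule

    # Purely illustrative hazard curve and fragility:
    a = np.linspace(0.05, 2.0, 400)
    hazard = 1e-3 * (a / 0.1) ** -2.0
    print(annual_failure_frequency(a, hazard, median_cap=0.9, beta=0.4))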

  1. pySeismicFMM: Python based Travel Time Calculation in Regular 2D and 3D Grids in Cartesian and Geographic Coordinates using Fast Marching Method

    Science.gov (United States)

    Wilde-Piorko, M.; Polkowski, M.

    2016-12-01

    Seismic wave travel time calculation is the most common numerical operation in seismology. The most efficient case is travel time calculation in a 1D velocity model - for a given source depth, receiver depth and angular distance, the time is calculated within a fraction of a second. Unfortunately, in most cases 1D is not enough to account for differentiated local and regional structures. Whenever possible, the travel time through a 3D velocity model has to be calculated. This can be achieved using ray calculation or time propagation in space. While a single ray-path calculation is quick, it is complicated to find the ray path that connects the source with the receiver. Time propagation in space using the Fast Marching Method seems more efficient in most cases, especially when there are multiple receivers. In this presentation the final release of the Python module pySeismicFMM is presented - a simple and very efficient tool for calculating travel times from sources to receivers. The calculation requires a regular 2D or 3D velocity grid, either in Cartesian or geographic coordinates. On a desktop-class computer the calculation speed is 200k grid cells per second. The calculation has to be performed once for every source location and provides travel times to all receivers. pySeismicFMM is free and open source. Development of this tool is part of the author's PhD thesis. The source code of pySeismicFMM will be published before the Fall Meeting. The National Science Centre Poland provided financial support for this work via NCN grant DEC-2011/02/A/ST10/00284.
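
    As a rough illustration of grid-based travel-time calculation, the Python sketch below uses Dijkstra's algorithm over 4-neighbour links as a simple stand-in for the first-order fast marching method that pySeismicFMM implements. FMM solves the eikonal equation more accurately; this only conveys the marching idea, and the grid values are illustrative.

    import heapq

    import numpy as np

    def grid_travel_times(slowness, src, h=1.0):
        """First-arrival times from src to every cell of a regular 2D grid.

        slowness: 2D array of 1/velocity; src: (row, col); h: grid spacing.
        Edge weights average the slowness of the two cells they connect.
        """
        ny, nx = slowness.shape
        t = np.full((ny, nx), np.inf)
        t[src] = 0.0
        done = np.zeros((ny, nx), dtype=bool)
        heap = [(0.0, src)]
        while heap:
            ti, (i, j) = heapq.heappop(heap)
            if done[i, j]:
                continue
            done[i, j] = True
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < ny and 0 <= nj < nx and not done[ni, nj]:
                    cand = ti + h * 0.5 * (slowness[i, j] + slowness[ni, nj])
                    if cand < t[ni, nj]:
                        t[ni, nj] = cand
                        heapq.heappush(heap, (cand, (ni, nj)))
        return t

    # e.g. a uniform 3 km/s medium on 10 m cells:
    times = grid_travel_times(np.full((200, 300), 1.0 / 3000.0), src=(0, 0), h=10.0)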

  2. 75 FR 32803 - Notice of Issuance of Final Determination Concerning a GTX Mobile+ Hand Held Computer

    Science.gov (United States)

    2010-06-09

    ... shall be published in the Federal Register within 60 days of the date the final determination is issued..., involved various scenarios pertaining to the assembly of a desktop computer in the U.S. and the Netherlands... finished desktop computers depending on the model included an additional floppy drive, CD ROM disk, and...

  3. Desktop publishing: a useful tool for scientists.

    Science.gov (United States)

    Lindroth, J R; Cooper, G; Kent, R L

    1994-01-01

    Desktop publishing offers features that are not available in word processing programs. The process yields an impressive and professional-looking document that is legible and attractive. It is a simple but effective tool to enhance the quality and appearance of your work and perhaps also increase your productivity.

  4. Thomas Jefferson, Page Design, and Desktop Publishing.

    Science.gov (United States)

    Hartley, James

    1991-01-01

    Discussion of page design for desktop publishing focuses on the importance of functional issues as opposed to aesthetic issues, and criticizes a previous article that stressed aesthetic issues. Topics discussed include balance, consistency in text structure, and how differences in layout affect the clarity of "The Declaration of…

  5. Correlation between National Influenza Surveillance Data and Search Queries from Mobile Devices and Desktops in South Korea.

    Science.gov (United States)

    Shin, Soo-Yong; Kim, Taerim; Seo, Dong-Woo; Sohn, Chang Hwan; Kim, Sung-Hoon; Ryoo, Seung Mok; Lee, Yoon-Seon; Lee, Jae Ho; Kim, Won Young; Lim, Kyoung Soo

    2016-01-01

    Digital surveillance using internet search queries can improve both the sensitivity and the timeliness of the detection of a health event, such as an influenza outbreak. Although it has recently been estimated that mobile search volume surpasses desktop search volume, and mobile search patterns differ from desktop search patterns, previous digital surveillance systems did not distinguish mobile from desktop search queries. The purpose of this study was to compare the performance of mobile and desktop search queries in digital influenza surveillance. The study period ran from September 6, 2010 through August 30, 2014, covering four epidemiological years. Influenza-like illness (ILI) and virologic surveillance data from the Korea Centers for Disease Control and Prevention were used. A total of 210 combined queries from our previous survey work were used for this study. Mobile and desktop weekly search data were extracted from Naver, the largest search engine in Korea. Spearman's correlation analysis was used to examine the correlation of the mobile and desktop data with ILI and virologic data in Korea; a lag correlation analysis was also performed. We observed that the influenza surveillance performance of mobile search queries matched or exceeded that of desktop search queries over time. The mean correlation coefficients of mobile search queries, and the number of queries with an r-value of ≥ 0.7, equaled or exceeded those of desktop searches over the four epidemiological years. A lag correlation analysis of up to two weeks showed similar trends. Our study shows that mobile search queries for influenza surveillance have equaled or even surpassed desktop search queries over time. In the future development of influenza surveillance using search queries, recognition of the changing trend of mobile search data may be necessary.
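
    The lag correlation analysis described above can be sketched in a few lines: shift the weekly search series against the ILI series and compute Spearman's rho at each lag. A minimal Python version, assuming SciPy and two aligned weekly arrays:

    from scipy.stats import spearmanr

    def lagged_spearman(search_volume, ili_rate, max_lag_weeks=2):
        """Spearman's rho between weekly search volume and ILI rate at each lag.

        The search series is shifted forward by 0..max_lag_weeks weeks (search
        leading ILI). Returns {lag: (rho, p_value)}.
        """
        out = {}
        for lag in range(max_lag_weeks + 1):
            x = search_volume[:-lag] if lag else search_volume
            y = ili_rate[lag:]
            out[lag] = spearmanr(x, y)
        return out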

  6. Experimental and computational study and development of the bituminous coal entrained-flow air-blown gasifier for IGCC

    International Nuclear Information System (INIS)

    Abaimov, N A; Osipov, P V; Ryzhkov, A F

    2016-01-01

    This paper considers the development of an advanced bituminous coal entrained-flow air-blown gasifier for a high-power integrated gasification combined cycle. The computational fluid dynamics (CFD) technique is used as the basic development tool. An experiment on the pressurized entrained-flow gasifier was performed by “NPO CKTI” JSC to verify the thermochemical processes submodel. The kinetic constants for Kuznetsk bituminous coal (flame coal), obtained by thermogravimetric analysis, are used in the model. The calculation results obtained with the CFD model are in satisfactory agreement with the experimental data. On the basis of the verified model, an advanced gasifier structure is suggested which permits an increase in the hydrogen content of the synthesis gas and consequently an improvement in gas turbine efficiency. In order to meet the specified requirements, steam is added at the second stage of the MHI-type gasifier, and the heat necessary for air gasification is compensated by supplemental heating of the blast air. (paper)

  7. Nuclear reactor seismic safety analysis techniques

    International Nuclear Information System (INIS)

    Cummings, G.E.; Wells, J.E.; Lewis, L.C.

    1979-04-01

    In order to provide insights into the seismic safety requirements for nuclear power plants, a probabilistically based systems model and computational procedure have been developed. This model and computational procedure will be used to identify where data and modeling uncertainties need to be decreased, by studying the effect of these uncertainties on the probability of radioactive release and the probability of failure of various structures, systems, and components. From the estimates of failure and release probabilities and their uncertainties, the most sensitive steps in the seismic methodologies can be identified. In addition, the procedure will measure the uncertainty due to random occurrences, e.g. seismic event probabilities, material property variability, etc. The paper discusses the elements of this systems model and computational procedure, the event-tree/fault-tree development, and the statistical techniques to be employed.

  8. Evaluation of induced seismicity forecast models in the Induced Seismicity Test Bench

    Science.gov (United States)

    Király, Eszter; Gischig, Valentin; Zechar, Jeremy; Doetsch, Joseph; Karvounis, Dimitrios; Wiemer, Stefan

    2016-04-01

    Induced earthquakes often accompany fluid injection, and the seismic hazard they pose threatens various underground engineering projects. Models to monitor and control induced seismic hazard with traffic light systems should be probabilistic, forward-looking, and updated as new data arrive. Here, we propose an Induced Seismicity Test Bench to test and rank such models. We apply the test bench to data from the Basel 2006 and Soultz-sous-Forêts 2004 geothermal stimulation projects, and we assess forecasts from two models that incorporate a different mix of physical understanding and stochastic representation of the induced sequences: Shapiro in Space (SiS) and Hydraulics and Seismics (HySei). SiS is based on three pillars: the seismicity rate is computed with help of the seismogenic index and a simple exponential decay of the seismicity; the magnitude distribution follows the Gutenberg-Richter relation; and seismicity is distributed in space based on smoothing seismicity during the learning period with 3D Gaussian kernels. The HySei model describes seismicity triggered by pressure diffusion with irreversible permeability enhancement. Our results show that neither model is fully superior to the other. HySei forecasts the seismicity rate well, but is only mediocre at forecasting the spatial distribution. On the other hand, SiS forecasts the spatial distribution well but not the seismicity rate. The shut-in phase is a difficult moment for both models in both reservoirs: the models tend to underpredict the seismicity rate around, and shortly after, shut-in. Ensemble models that combine HySei's rate forecast with SiS's spatial forecast outperform each individual model.
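
    Two of the SiS pillars named above, the seismogenic-index rate and Gutenberg-Richter magnitudes, can be sketched compactly. The Python snippet below draws a synthetic induced catalogue from them; the parameter values are illustrative, and the 3D Gaussian spatial smoothing pillar is omitted.

    import numpy as np

    def sis_style_catalog(volume_m3, sigma, b, m_min, m_max=5.0, seed=42):
        """Draw a synthetic induced catalogue from two SiS-style pillars.

        Expected count above m_min follows the seismogenic-index relation
        N(>= m_min) = V * 10**(sigma - b * m_min); magnitudes are then sampled
        from a Gutenberg-Richter distribution truncated at m_max.
        """
        rng = np.random.default_rng(seed)
        n = rng.poisson(volume_m3 * 10.0 ** (sigma - b * m_min))
        beta = b * np.log(10.0)
        span = 1.0 - np.exp(-beta * (m_max - m_min))
        mags = m_min - np.log(1.0 - rng.random(n) * span) / beta   # inverse CDF
        return mags

    # e.g. 20,000 m3 injected, sigma = -1.2, b = 1.1, completeness magnitude 0.5:
    mags = sis_style_catalog(20_000.0, sigma=-1.2, b=1.1, m_min=0.5)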

  9. Oxy-coal Combustion Studies

    Energy Technology Data Exchange (ETDEWEB)

    Wendt, J. [Univ. of Utah, Salt Lake City, UT (United States); Eddings, E. [Univ. of Utah, Salt Lake City, UT (United States); Lighty, J. [Univ. of Utah, Salt Lake City, UT (United States); Ring, T. [Univ. of Utah, Salt Lake City, UT (United States); Smith, P. [Univ. of Utah, Salt Lake City, UT (United States); Thornock, J. [Univ. of Utah, Salt Lake City, UT (United States); Y Jia, W. Morris [Univ. of Utah, Salt Lake City, UT (United States); Pedel, J. [Univ. of Utah, Salt Lake City, UT (United States); Rezeai, D. [Univ. of Utah, Salt Lake City, UT (United States); Wang, L. [Univ. of Utah, Salt Lake City, UT (United States); Zhang, J. [Univ. of Utah, Salt Lake City, UT (United States); Kelly, K. [Univ. of Utah, Salt Lake City, UT (United States)

    2012-01-06

    The objective of this project is to move toward the development of a predictive capability with quantified uncertainty bounds for pilot-scale, single-burner, oxy-coal operation. This validation research brings together multi-scale experimental measurements and computer simulations. The combination of simulation development and validation experiments is designed to lead to predictive tools for the performance of existing air fired pulverized coal boilers that have been retrofitted to various oxy-firing configurations. In addition, this report also describes novel research results related to oxy-combustion in circulating fluidized beds. For pulverized coal combustion configurations, particular attention is focused on the effect of oxy-firing on ignition and coal-flame stability, and on the subsequent partitioning mechanisms of the ash aerosol.

  10. A Computer-Controlled SEM-EDX Routine for Characterizing Respirable Coal Mine Dust

    Directory of Open Access Journals (Sweden)

    Victoria Johann-Essex

    2017-01-01

    A recent resurgence in coal workers’ pneumoconiosis (or “black lung”) and concerns over other related respiratory illnesses have highlighted the need to elucidate the characteristics of airborne particulates in occupational environments. A better understanding of particle size, aspect ratio, or chemical composition may offer new insights regarding the causal factors of such illnesses. Scanning electron microscopy with energy-dispersive X-ray analysis (SEM-EDX) can be used to estimate these particle characteristics. If conducted manually, such work can be very time-intensive, limiting the number of particles that can be analyzed; moreover, potential exists for user bias in the interpretation of EDX spectra. A computer-controlled (CC) routine, on the other hand, allows similar analysis at a much faster rate, increasing total particle counts and the reproducibility of results. This paper describes a CCSEM-EDX routine specifically developed for the analysis of respirable dust samples from coal mines. The routine is verified based on the reliability of results obtained on samples of known materials, and the reproducibility of results obtained on a set of 10 dust samples collected in the field. The characteristics of the field samples are also discussed with respect to mine occupational environments.

  11. Stop the Presses! An Update on Desktop Publishing.

    Science.gov (United States)

    McCarthy, Robert

    1988-01-01

    Discusses educational applications of desktop publishing at the elementary, secondary, and college levels. Topics discussed include page design capabilities; hardware requirements; software; the production of school newsletters and newspapers; cost factors; writing improvement; university departmental publications; and college book publishing. A…

  12. Testing the quality of images for permanent magnet desktop MRI systems using specially designed phantoms.

    Science.gov (United States)

    Qiu, Jianfeng; Wang, Guozhu; Min, Jiao; Wang, Xiaoyan; Wang, Pengcheng

    2013-12-21

    Our aim was to measure the performance of desktop magnetic resonance imaging (MRI) systems using specially designed phantoms, by testing imaging parameters and analysing the imaging quality. We designed multifunction phantoms with diameters of 18 and 60 mm for desktop MRI scanners in accordance with the American Association of Physicists in Medicine (AAPM) report no. 28. We scanned the phantoms with three permanent magnet 0.5 T desktop MRI systems, measured the MRI image parameters, and analysed imaging quality by comparing the data with the AAPM criteria and Chinese national standards. Image parameters included: resonance frequency, high contrast spatial resolution, low contrast object detectability, slice thickness, geometrical distortion, signal-to-noise ratio (SNR), and image uniformity. The image parameters of three desktop MRI machines could be measured using our specially designed phantoms, and most parameters were in line with MRI quality control criterion, including: resonance frequency, high contrast spatial resolution, low contrast object detectability, slice thickness, geometrical distortion, image uniformity and slice position accuracy. However, SNR was significantly lower than in some references. The imaging test and quality control are necessary for desktop MRI systems, and should be performed with the applicable phantom and corresponding standards.

  13. Testing the quality of images for permanent magnet desktop MRI systems using specially designed phantoms

    International Nuclear Information System (INIS)

    Qiu, Jianfeng; Wang, Guozhu; Min, Jiao; Wang, Xiaoyan; Wang, Pengcheng

    2013-01-01

    Our aim was to measure the performance of desktop magnetic resonance imaging (MRI) systems using specially designed phantoms, by testing imaging parameters and analysing the imaging quality. We designed multifunction phantoms with diameters of 18 and 60 mm for desktop MRI scanners in accordance with the American Association of Physicists in Medicine (AAPM) report no. 28. We scanned the phantoms with three permanent magnet 0.5 T desktop MRI systems, measured the MRI image parameters, and analysed imaging quality by comparing the data with the AAPM criteria and Chinese national standards. Image parameters included: resonance frequency, high contrast spatial resolution, low contrast object detectability, slice thickness, geometrical distortion, signal-to-noise ratio (SNR), and image uniformity. The image parameters of three desktop MRI machines could be measured using our specially designed phantoms, and most parameters were in line with MRI quality control criteria, including: resonance frequency, high contrast spatial resolution, low contrast object detectability, slice thickness, geometrical distortion, image uniformity and slice position accuracy. However, SNR was significantly lower than in some references. Imaging tests and quality control are necessary for desktop MRI systems, and should be performed with the applicable phantom and corresponding standards. (paper)

  14. Comparing Web Applications with Desktop Applications: An Empirical Study

    DEFF Research Database (Denmark)

    Pop, Paul

    2002-01-01

    In recent years, many desktop applications have been ported to the world wide web in order to reduce (multiplatform) development, distribution and maintenance costs. However, there is little data concerning the usability of web applications, and the impact of their usability on the total cost of developing and using such applications. In this paper we present a comparison of web and desktop applications from the usability point of view. The comparison is based on an empirical study that investigates the performance of a group of users on two calendaring applications: Yahoo!Calendar and Microsoft Calendar. The study shows that in the case of web applications the performance of the users is significantly reduced, mainly because of the restricted interaction mechanisms provided by current web browsers.

  15. SONATINA-2V: a computer program for seismic analysis of the two-dimensional vertical slice HTGR core

    International Nuclear Information System (INIS)

    Ikushima, Takeshi

    1982-07-01

    A computer program, SONATINA-2V, has been developed for predicting the behavior of a two-dimensional vertical slice of an HTGR core under seismic excitation. SONATINA-2V is a general two-dimensional computer program capable of analyzing the vertical slice HTGR core with the permanent side reflector blocks and their restraint structures. In the analytical model, each block is treated as a rigid body and is restrained by dowel pins, which restrict relative horizontal movement but allow vertical and rocking motions between upper and lower blocks. Coulomb friction is taken into account between blocks and between dowel pin and hole. A spring-dashpot model is used for the collision process between adjacent blocks. The core support structure is represented by a single block. SONATINA-2V is capable of analyzing the core behavior for an excitation applied simultaneously in the vertical and horizontal directions. Analytical results obtained from SONATINA-2V are compared with experimental results and are found to be in good agreement. The computer program can thus be used to predict with good accuracy the behavior of the HTGR core under seismic excitation. The present report gives the theoretical formulation of the analytical model, a user's manual describing the input and output formats, and sample problems. (author)
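
    The block-interaction idea can be illustrated with a toy calculation. The sketch below (not SONATINA-2V itself) integrates two rigid blocks coupled by a one-sided spring-dashpot contact across a clearance gap; all parameter values are hypothetical, and Coulomb friction and rocking are omitted.

      import numpy as np

      # Hypothetical parameters for a two-block, one-gap toy model.
      m = 100.0          # block mass [kg]
      k_c = 1.0e7        # contact spring stiffness [N/m]
      c_c = 2.0e3        # contact dashpot coefficient [N s/m]
      gap = 1.0e-3       # initial clearance between blocks [m]
      dt = 1.0e-5        # time step [s]

      def contact_force(rel_disp, rel_vel):
          """Spring-dashpot force, active only while the gap is closed."""
          pen = rel_disp - gap               # penetration beyond the clearance
          if pen <= 0.0:
              return 0.0
          return max(0.0, k_c * pen + c_c * rel_vel)   # no tensile contact

      # Block 1 approaches block 2 at 0.5 m/s; block 2 starts at rest.
      x = np.zeros(2); v = np.array([0.5, 0.0])
      t, history = 0.0, []
      for _ in range(20000):
          f = contact_force(x[0] - x[1], v[0] - v[1])
          a = np.array([-f / m, f / m])      # equal and opposite impact forces
          v += a * dt
          x += v * dt
          t += dt
          history.append((t, x[0], x[1], f))
      print("peak contact force [N]:", max(h[3] for h in history))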

  16. Digital video for the desktop

    CERN Document Server

    Pender, Ken

    1999-01-01

    Practical introduction to creating and editing high quality video on the desktop. Using examples from a variety of video applications, benefit from a professional's experience, step-by-step, through a series of workshops demonstrating a wide variety of techniques. These include producing short films, multimedia and internet presentations, animated graphics and special effects.The opportunities for the independent videomaker have never been greater - make sure you bring your understanding fully up to date with this invaluable guide.No prior knowledge of the technology is assumed, with explanati

  17. [Teaching Desktop] Video Conferencing in a Collaborative and Problem Based Setting

    DEFF Research Database (Denmark)

    Ørngreen, Rikke; Mouritzen, Per

    2013-01-01

    … teachers and assistant teachers wanted to find ways in the design for learning that enable the learners to acquire knowledge about the theories, models and concepts of the subject, as well as hands-on competencies in a learning-by-doing manner. In particular we address the area of desktop video … shows that the students experiment with various pedagogical situations, and that during the process of design, teaching, and reflection they acquire experiences at both a concrete specific and a general abstract level. The desktop video conference system creates challenges, with technical issues…

  18. COMPUTER HARDWARE MARKING

    CERN Multimedia

    Groupe de protection des biens

    2000-01-01

    As part of the campaign to protect CERN property and for insurance reasons, all computer hardware belonging to the Organization must be marked with the words 'PROPRIETE CERN'.IT Division has recently introduced a new marking system that is both economical and easy to use. From now on all desktop hardware (PCs, Macintoshes, printers) issued by IT Division with a value equal to or exceeding 500 CHF will be marked using this new system.For equipment that is already installed but not yet marked, including UNIX workstations and X terminals, IT Division's Desktop Support Service offers the following services free of charge:Equipment-marking wherever the Service is called out to perform other work (please submit all work requests to the IT Helpdesk on 78888 or helpdesk@cern.ch; for unavoidable operational reasons, the Desktop Support Service will only respond to marking requests when these coincide with requests for other work such as repairs, system upgrades, etc.);Training of personnel designated by Division Leade...

  19. Hydropower Regulatory and Permitting Information Desktop (RAPID) Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Levine, Aaron L [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-12-19

    Hydropower Regulatory and Permitting Information Desktop (RAPID) Toolkit presentation from the WPTO FY14-FY16 Peer Review. The toolkit is aimed at regulatory agencies, consultants, project developers, the public, and any other party interested in learning more about the hydropower regulatory process.

  20. Aquatic Habitats: Exploring Desktop Ponds. Teacher's Guide.

    Science.gov (United States)

    Barrett, Katharine; Willard, Carolyn

    This book, for grades 2-6, is designed to provide students with a highly motivating and unique opportunity to investigate an aquatic habitat. Students set up, observe, study, and reflect upon their own "desktop ponds." Accessible plants and small animals used in these activities include Elodea, Tubifex worms, snails, mosquito larvae, and fish.…

  1. Seismic analysis for the ALMR

    International Nuclear Information System (INIS)

    Tajirian, F.F.

    1992-01-01

    The Advanced Liquid Metal Reactor (ALMR) design uses seismic isolation as a cost-effective approach for simplifying the seismic design of the reactor module and for enhancing margins to handle beyond-design-basis earthquakes (BDBE). A comprehensive seismic analysis plan has been developed to confirm the adequacy of the design and to support regulatory licensing activities. In this plan, state-of-the-art computer programs are used to evaluate the system response of the ALMR. Several factors that affect seismic response will be investigated, including variability in the input earthquake mechanism, soil-structure interaction effects, and nonlinear response of the isolators. This paper reviews the types of analyses that are planned and discusses the approach that will be used for validating the specific features of computer programs required in the analysis of isolated structures. To date, different linear and nonlinear seismic analyses have been completed. The results of recently completed linear analyses have been summarized elsewhere. The findings of three-dimensional nonlinear seismic analyses are presented in this paper. These analyses were performed to evaluate the effect of changes of isolator horizontal stiffness with horizontal displacement on overall response, to develop an approach for representing BDBE events with return periods exceeding 10,000 years, and to assess margins in the design for BDBEs. From the results of these analyses and bearing test data, it can be concluded that a properly designed and constructed seismic isolation system can accommodate displacements several times the design safe shutdown earthquake (SSE) displacement for the ALMR. (author)

  2. Micro-seismic waveform matching inversion based on gravitational search algorithm and parallel computation

    Science.gov (United States)

    Jiang, Y.; Xing, H. L.

    2016-12-01

    Micro-seismic events induced by water injection, mining activity or oil/gas extraction are quite informative; their interpretation can be applied to the reconstruction of underground stress and the monitoring of hydraulic fracturing progress in oil/gas reservoirs. The source characteristics and locations are crucial parameters required for these purposes, and they can be obtained through the waveform matching inversion (WMI) method. It is therefore imperative to develop a WMI algorithm with high accuracy and convergence speed. Heuristic algorithms, as a category of nonlinear methods, possess very high convergence speed and a good capacity to overcome local minima, and have been applied successfully in many areas (e.g. image processing, artificial intelligence). However, their effectiveness for micro-seismic WMI is still poorly investigated; very little literature exists addressing this subject. In this research an advanced heuristic algorithm, the gravitational search algorithm (GSA), is proposed to estimate the focal mechanism (angles of strike, dip and rake) and source locations in three dimensions. Unlike traditional inversion methods, the heuristic inversion does not require approximation of the Green's function. The method interacts directly with a CPU-parallelized finite difference forward modelling engine and updates the model parameters under GSA criteria. The effectiveness of this method is tested with synthetic data from a multi-layered elastic model; the results indicate GSA can be applied well to WMI and has its unique advantages. Keywords: Micro-seismicity, Waveform matching inversion, gravitational search algorithm, parallel computation
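
    A minimal version of the gravitational search algorithm can be sketched as follows. This simplified Python illustration follows the update rules of Rashedi et al. (2009) but lets every agent attract every other agent and uses a placeholder misfit instead of a waveform-matching residual; it is not the authors' parallelized implementation.

      import numpy as np

      rng = np.random.default_rng(1)

      def misfit(x):
          # Stand-in for a waveform-matching residual (e.g. the L2 distance
          # between observed and synthetic seismograms); a simple bowl here.
          return np.sum(x ** 2, axis=1)

      def gsa(n_agents=30, n_dim=4, iters=200, g0=100.0, alpha=20.0,
              lo=-180.0, hi=180.0):
          x = rng.uniform(lo, hi, (n_agents, n_dim))
          v = np.zeros_like(x)
          for it in range(iters):
              fit = misfit(x)
              best, worst = fit.min(), fit.max()
              m = (fit - worst) / (best - worst - 1e-12)  # better fit -> bigger mass
              m = m / (m.sum() + 1e-12)
              g = g0 * np.exp(-alpha * it / iters)        # decaying gravity constant
              acc = np.zeros_like(x)
              for i in range(n_agents):
                  diff = x - x[i]
                  dist = np.linalg.norm(diff, axis=1, keepdims=True) + 1e-12
                  acc[i] = np.sum(rng.uniform(size=(n_agents, 1)) *
                                  g * m[:, None] * diff / dist, axis=0)
              v = rng.uniform(size=x.shape) * v + acc     # stochastic velocity update
              x = np.clip(x + v, lo, hi)
          i_best = int(np.argmin(misfit(x)))
          return x[i_best], float(misfit(x)[i_best])

      params, err = gsa()
      print("best parameters:", params, "misfit:", err)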

  3. Cogeneration Technology Alternatives Study (CTAS). Volume 6: Computer data. Part 1: Coal-fired nocogeneration process boiler, section A

    Science.gov (United States)

    Knightly, W. F.

    1980-01-01

    Various advanced energy conversion systems (ECS) are compared with each other and with current technology systems for their savings in fuel energy, costs, and emissions in individual plants and on a national level. About fifty industrial processes from the largest energy consuming sectors were used as a basis for matching a similar number of energy conversion systems that are considered as candidates which can be made available by the 1985 to 2000 time period. The sectors considered included food, textiles, lumber, paper, chemicals, petroleum, glass, and primary metals. The energy conversion systems included steam and gas turbines, diesels, thermionics, stirling, closed cycle and steam injected gas turbines, and fuel cells. Fuels considered were coal, both coal and petroleum based residual and distillate liquid fuels, and low Btu gas obtained through the on-site gasification of coal. Computer generated reports of the fuel consumption and savings, capital costs, economics and emissions of the cogeneration energy conversion systems (ECS's) heat and power matched to the individual industrial processes are presented for coal fired process boilers. National fuel and emissions savings are also reported for each ECS assuming it alone is implemented.

  4. SHEAT: a computer code for probabilistic seismic hazard analysis, user's manual

    International Nuclear Information System (INIS)

    Ebisawa, Katsumi; Kondo, Masaaki; Abe, Kiyoharu; Tanaka, Toshiaki; Takani, Michio.

    1994-08-01

    The SHEAT code developed at the Japan Atomic Energy Research Institute is for probabilistic seismic hazard analysis, one of the tasks needed for the seismic Probabilistic Safety Assessment (PSA) of a nuclear power plant. Seismic hazard is defined as the annual exceedance frequency of earthquake ground motions at various levels of intensity at a given site. With the SHEAT code, seismic hazard is calculated in the following two steps: (1) Modeling of earthquake generation around a site. Future earthquake generation (locations, magnitudes and frequencies of postulated earthquakes) is modeled based on the historical earthquake records, active fault data and expert judgement. (2) Calculation of probabilistic seismic hazard at the site. An earthquake ground motion is calculated for each postulated earthquake using an attenuation model, taking into account its standard deviation. The seismic hazard at the site is then calculated by summing the frequencies of ground motions from all the earthquakes. This document is the user's manual of the SHEAT code. It includes: (1) an outline of the code, covering its overall concept, logical process, code structure, data files used and special characteristics; (2) the functions of subprograms and the analytical models within them; (3) guidance on input and output data; and (4) sample run results. The code has been widely used at JAERI to analyze seismic hazard at various nuclear power plant sites in Japan. (author)
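
    The two-step hazard computation can be illustrated with a toy example. The sketch below sums annual exceedance frequencies over a small hypothetical source model using a generic lognormal attenuation relation; none of the coefficients are those built into SHEAT.

      import numpy as np
      from scipy.stats import norm

      # Step 1: a toy source model -- postulated earthquakes with annual rates.
      # (magnitude, distance [km], annual rate); purely illustrative numbers.
      sources = [(6.0, 30.0, 0.01), (6.5, 50.0, 0.005), (7.0, 80.0, 0.002)]

      # Step 2: an attenuation (ground-motion) model with lognormal scatter.
      def ln_median_pga(m, r):
          # Generic form ln(PGA[g]) = a + b*M - c*ln(R + d); placeholder
          # coefficients, not those used with SHEAT.
          return -3.5 + 0.9 * m - 1.0 * np.log(r + 10.0)

      sigma_ln = 0.6                       # standard deviation of ln(PGA)
      pga_grid = np.logspace(-2, 0, 50)    # 0.01 g .. 1 g

      # Annual exceedance frequency: sum over sources of
      # rate * P(PGA > a | m, r), with P from the lognormal attenuation.
      hazard = np.zeros_like(pga_grid)
      for m, r, rate in sources:
          z = (np.log(pga_grid) - ln_median_pga(m, r)) / sigma_ln
          hazard += rate * (1.0 - norm.cdf(z))

      for a, h in zip(pga_grid[::10], hazard[::10]):
          print(f"PGA {a:6.3f} g : {h:.2e} /yr")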

  5. Seismic hazard assessment of Iran

    Directory of Open Access Journals (Sweden)

    M. Ghafory-Ashtiany

    1999-06-01

    The development of the new seismic hazard map of Iran is based on probabilistic seismic hazard computation using historical earthquake data, geology, tectonics, fault activity and seismic source models in Iran. These maps have been prepared to indicate the earthquake hazard of Iran in the form of iso-acceleration contour lines and seismic hazard zoning, using current probabilistic procedures. They display the probabilistic estimates of Peak Ground Acceleration (PGA) for return periods of 75 and 475 years. The maps have been divided into intervals of 0.25 degrees in both latitudinal and longitudinal directions to calculate the peak ground acceleration values at each grid point and draw the seismic hazard curves. The results presented in this study will provide the basis for the preparation of seismic risk maps, the estimation of earthquake insurance premiums, and the preliminary site evaluation of critical facilities.
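
    For orientation, the 475-year return period quoted above corresponds, under the usual Poisson (memoryless) assumption, to the common "10% probability of exceedance in 50 years" design level; the numbers below are this standard convention, not values taken from the record:

      P = 1 - e^{-t/T}
      \quad\Longrightarrow\quad
      T = \frac{-t}{\ln(1 - P)},
      \qquad
      T = \frac{-50}{\ln(1 - 0.10)} \approx 475\ \text{years}.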

  6. Realization of a Desktop Flight Simulation System for Motion-Cueing Studies

    Directory of Open Access Journals (Sweden)

    Berkay Volkaner

    2016-05-01

    Parallel robotic mechanisms are generally used in flight simulators with a motion-cueing algorithm to create an unlimited motion feeling of the simulated medium within the bounded workspace of the simulator. A major problem in flight simulators is that the simulation has an unbounded space while the manipulator has a limited one; using a washout filter in the motion-cueing algorithm overcomes this. In this study, a low-cost six-degrees-of-freedom (DoF) desktop parallel manipulator is used to test a classical motion-cueing algorithm; the algorithm's functionality is confirmed in a Simulink real-time environment. Translational accelerations and angular velocities of the simulated medium obtained from the FlightGear flight simulation software are processed through a washout filter algorithm, and the simulated medium's motion information is transmitted to the desktop parallel robotic mechanism as a set point for each leg. The main topics of this paper are the design of a desktop simulation system, control of the parallel manipulator, communication between the flight simulation and the platform, design of a motion-cueing algorithm and determination of the washout filter parameters.
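
    The washout idea can be sketched in a few lines: a high-pass filter removes the sustained part of the simulated acceleration so that the commanded platform motion stays bounded and drifts back to neutral. The following illustration uses an assumed break frequency and update rate, not the parameters of the cited study.

      import numpy as np
      from scipy import signal

      fs = 100.0                       # simulation update rate [Hz] (assumed)
      t = np.arange(0, 20, 1 / fs)

      # Simulated-vehicle longitudinal acceleration: a sustained 2 m/s^2 push.
      a_vehicle = np.where((t > 2) & (t < 12), 2.0, 0.0)

      # Classical washout: a second-order high-pass filter removes the
      # sustained component (assumed break frequency ~0.5 Hz).
      wn = 2 * np.pi * 0.5
      b, a = signal.bilinear([1.0, 0.0, 0.0],
                             [1.0, 2 * 0.7 * wn, wn ** 2], fs)
      a_platform = signal.lfilter(b, a, a_vehicle)

      # Double integration gives the platform displacement command.
      v = np.cumsum(a_platform) / fs
      x = np.cumsum(v) / fs
      print("peak platform excursion [m]:", np.abs(x).max())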

  7. Processing grounded-wire TEM signal in time-frequency-pseudo-seismic domain: A new paradigm

    Science.gov (United States)

    Khan, M. Y.; Xue, G. Q.; Chen, W.; Huasen, Z.

    2017-12-01

    Grounded-wire TEM has received great attention in mineral, hydrocarbon and hydrogeological investigations for the last several years. Conventionally, TEM soundings have been presented as apparent resistivity curves as a function of time. With the development of sophisticated computational algorithms, it became possible to extract more realistic geoelectric information by applying inversion programs to 1-D and 3-D problems. Here, we analyze grounded-wire TEM data by carrying out analysis in the time, frequency and pseudo-seismic domains, supported by borehole information. First, H-, K-, A- and Q-type geoelectric models are processed using a proven inversion program (1-D Occam inversion). Second, time-to-frequency transformation is conducted from TEM ρa(t) curves to magnetotelluric (MT) ρa(f) curves for the same models, based on all-time apparent resistivity curves. Third, the 1-D Bostick algorithm is applied to the transformed resistivity. Finally, the EM diffusion field is transformed into a propagating wave field obeying the standard wave equation using a wavelet transformation technique, and a pseudo-seismic section is constructed. The transformed seismic-like wave indicates that reflection and refraction phenomena appear when the EM wave field interacts with geoelectric interfaces at different depth intervals due to contrasts in resistivity. The resolution of the transformed TEM data is significantly improved in comparison to apparent resistivity plots. A case study illustrates the successful hydrogeophysical application of the proposed approach in recovering a water-filled mined-out area in a coal field located in Ye county, Henan province, China. The results support the introduction of pseudo-seismic imaging technology in the short-offset version of TEM, which can also be a useful aid if integrated with the seismic reflection technique to explore possibilities for high-resolution EM imaging in future.

  8. Intercomparison of liquid metal fast reactor seismic analysis codes. V. 3: Comparison of observed effects with computer simulated effects on reactor cores from seismic disturbances. Proceedings of a final research co-ordination meeting

    International Nuclear Information System (INIS)

    1996-05-01

    This publication contains the final papers summarizing the validation of the codes on the basis of comparison of observed effects with computer simulated effects on reactor cores from seismic disturbances. Refs, figs tabs

  9. Laptops vs. Desktops in a Google Groups Environment: A Study on Collaborative Learning

    Directory of Open Access Journals (Sweden)

    Steven Lopes Abrantes

    2011-01-01

    Current literature on m-learning refers to the lack of studies on the real use of m-learning applications and how they can compete with their current desktop counterparts. The study consists of an experiment involving one hundred and twelve students of higher education and a set of learning activities that they had to accomplish. Its main objective is to validate whether students using laptops or desktops reach the flow experience, and which group does so more strongly, when using Google Groups. The approach used is based on the flow experience introduced by [1]. It was possible to conclude that both laptop and desktop students experienced the flow state, with laptop students showing a more positive effect on the flow experience.

  10. Computer Security Handbook

    CERN Document Server

    Bosworth, Seymour; Whyne, Eric

    2012-01-01

    The classic and authoritative reference in the field of computer security, now completely updated and revised With the continued presence of large-scale computers; the proliferation of desktop, laptop, and handheld computers; and the vast international networks that interconnect them, the nature and extent of threats to computer security have grown enormously. Now in its fifth edition, Computer Security Handbook continues to provide authoritative guidance to identify and to eliminate these threats where possible, as well as to lessen any losses attributable to them. With seventy-seven chapter

  11. Effects of Dual Monitor Computer Work Versus Laptop Work on Cervical Muscular and Proprioceptive Characteristics of Males and Females.

    Science.gov (United States)

    Farias Zuniga, Amanda M; Côté, Julie N

    2017-06-01

    The effects of performing a 90-minute computer task with a laptop versus a dual monitor desktop workstation were investigated in healthy young male and female adults. Work-related musculoskeletal disorders are common among computer (especially female) users. Laptops have surpassed desktop computer sales, and working with multiple monitors has also become popular. However, few studies have provided objective evidence on how they affect the musculoskeletal system in both genders. Twenty-seven healthy participants (mean age = 24.6 years; 13 males) completed a 90-minute computer task while using a laptop or dual monitor (DualMon) desktop. Electromyography (EMG) from eight upper body muscles and visual strain were measured throughout the task. Neck proprioception was tested before and after the computer task using a head-repositioning test. EMG amplitude (root mean square [RMS]), variability (coefficients of variation [CV]), and normalized mutual information (NMI) were computed. Visual strain (p …) … computer workstation designs.
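
    The amplitude and variability measures named above have simple definitions. The sketch below computes windowed RMS and its coefficient of variation on a synthetic signal; the window length, sampling rate and signal are assumptions, and the NMI measure is omitted.

      import numpy as np

      def windowed_rms(emg, fs, win_s=1.0):
          """RMS amplitude of an EMG signal in consecutive non-overlapping windows."""
          n = int(win_s * fs)
          usable = len(emg) - len(emg) % n
          windows = emg[:usable].reshape(-1, n)
          return np.sqrt((windows ** 2).mean(axis=1))

      # Synthetic excerpt: band-limited noise standing in for surface EMG.
      fs = 1000.0
      rng = np.random.default_rng(2)
      emg = rng.normal(0.0, 0.1, int(60 * fs))      # 1 minute at 1 kHz

      rms = windowed_rms(emg, fs)
      cv = 100.0 * rms.std() / rms.mean()           # coefficient of variation [%]
      print(f"mean RMS = {rms.mean():.4f}, CV = {cv:.1f} %")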

  12. Developing a Process Model for the Forensic Extraction of Information from Desktop Search Applications

    Directory of Open Access Journals (Sweden)

    Timothy Pavlic

    2008-03-01

    Desktop search applications can contain cached copies of files that were deleted from the file system. Forensic investigators see this as a potential source of evidence, as documents deleted by suspects may still exist in the cache. Whilst there have been attempts at recovering data collected by desktop search applications, there is no methodology governing the process, nor discussion on the most appropriate means to do so. This article seeks to address this issue by developing a process model that can be applied when developing an information extraction application for desktop search applications, discussing preferred methods and the limitations of each. This work represents a more structured approach than other forms of current research.

  13. Seismic soil-structure interaction with consideration of spatial incoherence of seismic ground motions: A case study

    Energy Technology Data Exchange (ETDEWEB)

    Tseng, Wen S., E-mail: wen.tseng@rizzoassoc.com [Paul C. Rizzo Associates, Inc., Western Region, 2201 Broadway, Suite 400, Oakland, CA 94612 (United States); Lilhanand, Kiat; Hamasaki, Don; Garcia, Julio A. [Paul C. Rizzo Associates, Inc., Western Region, 2201 Broadway, Suite 400, Oakland, CA 94612 (United States); Srinivasan, Ram [AREVA, NP, Inc., 6399 San Ignacio Avenue, San Jose, CA 95119 (United States)

    2014-04-01

    This paper presents a case study of seismic soil-structure interaction (SSI) analysis with consideration of spatial incoherence of seismic input ground motions. The SSI analyses were performed using the SASSI computer program for the Auxiliary Control Building (ACB) structure of an existing nuclear power plant on a hard rock site located in the Central and Eastern United States (CEUS) region. The incoherent seismic input motions for the hard rock site were generated using the computer program INCOH, which works together with SASSI. The objective of the analyses was to generate maximum seismic response parameters for assessing the potential impact of newly developed site-specific (ground motion) response spectra (SSRS) on the seismic design of the ACB, and the potential benefits that could be gained by considering spatial incoherence of the seismic input motions. Maximum seismic response values for selected response parameters of interest were generated with both SSRS-compatible coherent and incoherent seismic input motions. Comparisons were made of the corresponding maximum response parameter values and of the in-structure (acceleration) response spectra (ISRS) generated for both the coherent and incoherent motion inputs. These comparisons indicate that, by incorporating incoherence of ground motions in the seismic input, the maximum response values reduce, and the ISRS peak amplitudes in the high frequency range (>10 Hz) also reduce relative to the corresponding values resulting from the coherent motion input. The amount of ISRS-amplitude reduction increases as the spectral frequency increases, as expected; such reductions can be as much as 20–50%. This case study demonstrates that, for a CEUS hard rock site where relatively high high-frequency content exists in the seismic input response spectra, consideration of spatial incoherence of input motions results in substantial benefits in reducing the high-frequency seismic responses. Such benefits are especially
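
    The spatial-incoherence input is commonly characterized by a coherency function that decays with frequency and station separation. The sketch below evaluates a simple one-parameter plane-wave model of this kind (after Luco and Wong, 1986); the parameter values are illustrative and are not those used with INCOH.

      import numpy as np

      def luco_wong_coherency(f, d, vs=2000.0, alpha=0.3):
          """Plane-wave lagged coherency, gamma = exp(-(alpha*2*pi*f*d/Vs)^2).

          f: frequency [Hz]; d: separation [m]; vs: shear-wave velocity [m/s];
          alpha: dimensionless incoherence parameter (assumed here).
          """
          return np.exp(-((alpha * 2.0 * np.pi * f * d / vs) ** 2))

      # Coherency drops with frequency, which is why incoherence mainly
      # reduces the >10 Hz in-structure response, as noted above.
      for f in (1.0, 10.0, 25.0):
          print(f"{f:5.1f} Hz :", round(luco_wong_coherency(f, d=50.0), 3))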

  14. Upside to downsizing : Acceleware's graphic processor technology propels seismic data processing revolution

    Energy Technology Data Exchange (ETDEWEB)

    Smith, M.

    2009-11-15

    Acceleware has developed graphics processor (GPU) technology that is transforming the petroleum industry. The benefits of the technology are its small footprint, low wattage, and high speed. The software brings supercomputing speed to the desktop by leveraging the massively parallel processing capacity of the very latest GPU technology. This article discussed the GPU technology and its emergence as a powerful supercomputing tool. Acceleware's partnering with California-based NVIDIA was also outlined, as were the advantages of the technology, including its smaller footprint: Acceleware's hardware takes up a fraction of the space and uses up to 70 per cent less power than a traditional central processing unit. By combining Acceleware's core knowledge in making complex algorithms run in parallel with an in-house team of seismic industry experts, the company provides software solutions for seismic data processors that access the massively parallel processing capabilities of GPUs. 1 fig.

  15. SAMP: Application Messaging for Desktop and Web Applications

    Science.gov (United States)

    Taylor, M. B.; Boch, T.; Fay, J.; Fitzpatrick, M.; Paioro, L.

    2012-09-01

    SAMP, the Simple Application Messaging Protocol, is a technology which allows tools to communicate. It is deployed in a number of desktop astronomy applications including ds9, Aladin, TOPCAT, World Wide Telescope and numerous others, and makes it straightforward for a user to treat a selection of these tools as a loosely-integrated suite, combining the most powerful features of each. It has been widely used within Virtual Observatory contexts, but is equally suitable for non-VO use. Enabling SAMP communication from web-based content has long been desirable. An obvious use case is arranging for a click on a web page link to deliver an image, table or spectrum to a desktop viewer, but more sophisticated two-way interaction with rich internet applications would also be possible. Use from the web however presents some problems related to browser sandboxing. We explain how the SAMP Web Profile, introduced in version 1.3 of the SAMP protocol, addresses these issues, and discuss the resulting security implications.
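
    A desktop-side SAMP exchange is short enough to show in full. The sketch below uses the astropy.samp implementation of the protocol to broadcast a table-load message; it assumes a SAMP hub is already running (e.g. one started by TOPCAT or Aladin), and the table URL is a placeholder.

      from astropy.samp import SAMPIntegratedClient

      # Connect to the running SAMP hub.
      client = SAMPIntegratedClient(name="demo-sender")
      client.connect()

      # Broadcast a "load this VOTable" message to every subscribed
      # application; the URL below is a hypothetical local table.
      message = {
          "samp.mtype": "table.load.votable",
          "samp.params": {
              "url": "file:///tmp/example.vot",   # placeholder location
              "name": "example table",
          },
      }
      client.notify_all(message)
      client.disconnect()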

  16. Coal Quality Expert: Status and software specifications

    International Nuclear Information System (INIS)

    Harrison, C.D.

    1992-01-01

    Under the Clean Coal Technology Program (Clean Coal Round 1), the US Department of Energy (DOE) and the Electric Power Research Institute (EPRI) are funding the development and demonstration of a computer program called the Coal Quality Expert (CQE trademark). When finished, the CQE will be a comprehensive PC-based program which can be used to evaluate several potential coal cleaning, blending, and switching options to reduce power plant emissions while minimizing generation costs. The CQE will be flexible in nature and capable of evaluating various qualities of coal, available transportation options, performance issues, and alternative emissions control strategies. This allows the CQE to determine the most cost-effective coal and the least expensive emissions control strategy for a given plant. To accomplish this, the CQE will be composed of technical models to evaluate performance issues; environmental models to evaluate environmental and regulatory issues; and cost estimating models to predict costs for installations of new and retrofit coal cleaning processes, power production equipment, and emissions control systems, as well as other production costs such as consumables (fuel, scrubber additive, etc.), waste disposal, operating and maintenance, and replacement energy costs. These technical, environmental, and economic models, as well as a graphical user interface, will be developed for the CQE. In addition, to take advantage of already existing capability, the CQE will rely on seamless integration of proven and extensively used computer programs such as the EPRI Coal Quality Information Systems, the Coal Quality Impact Model (CQIM trademark), and NOx Pert. 2 figs

  17. The 20 Teraflop Erasmus Computing Grid (ECG).

    NARCIS (Netherlands)

    T.A. Knoch (Tobias); L.V. de Zeeuw (Luc)

    2006-01-01

    The Set-Up of the 20 Teraflop Erasmus Computing Grid: To meet the enormous computational needs of life-science research as well as clinical diagnostics and treatment, the Hogeschool Rotterdam and the Erasmus Medical Center are currently setting up one of the largest desktop computing grids.

  18. The 20 Teraflop Erasmus Computing Grid (ECG)

    NARCIS (Netherlands)

    T.A. Knoch (Tobias); L.V. de Zeeuw (Luc)

    2009-01-01

    The Set-Up of the 20 Teraflop Erasmus Computing Grid: To meet the enormous computational needs of life-science research as well as clinical diagnostics and treatment, the Hogeschool Rotterdam and the Erasmus Medical Center are currently setting up one of the largest desktop computing grids.

  19. The CosmicWatch Desktop Muon Detector: a self-contained, pocket sized particle detector

    Science.gov (United States)

    Axani, S. N.; Frankiewicz, K.; Conrad, J. M.

    2018-03-01

    The CosmicWatch Desktop Muon Detector is a self-contained, hand-held cosmic ray muon detector that is valuable for astro/particle physics research applications and outreach. The material cost of each detector is under $100, and it takes a novice student approximately four hours to build their first detector. The detectors are powered via a USB connection, and the data can be recorded either directly to a computer or to a microSD card. Arduino- and Python-based software is provided to operate the detector, along with an online application to plot the data in real time. In this paper, we describe the various design features, evaluate the performance, and illustrate the detector's capabilities by providing several example measurements.

  20. Monitoring coal conversion processes by IR-spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Hobert, H.; Kempe, J.; Stephanowitz, H. (Friedrich-Schiller-Universitaet, Jena (German Democratic Republic))

    1990-01-01

    Explains the application of infrared spectroscopy combined with multivariate data analysis by an on-line computer system for assessing coal quality and the suitability of brown coal for conversion processes. Coal samples were pelletized with added KBr and analyzed using an IRF 180 Fourier transform spectrometer in the spectral range of 400 to 2,000 cm⁻¹. Components of the spectra are presented; the oil yield from coal hydrogenation is calculated by regression analysis. Covariance spectra of carbon, organic hydrogen and sulfur are shown. It is concluded that the field of application for the method includes industrial coal liquefaction and gasification as well as briquetting and coking. 8 refs.
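
    A multivariate calibration of the kind described, from IR spectra to oil yield, can be sketched with partial least squares regression. Everything below (spectra, band positions, coefficients) is synthetic, and the choice of PLS is an assumption standing in for the paper's regression analysis.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(6)

      # Synthetic stand-in for FTIR spectra: 40 coal samples x 800 channels
      # (400-2000 cm^-1), with oil yield driven by two latent bands.
      wavenumbers = np.linspace(400, 2000, 800)
      band1 = np.exp(-0.5 * ((wavenumbers - 750) / 30.0) ** 2)
      band2 = np.exp(-0.5 * ((wavenumbers - 1600) / 40.0) ** 2)
      conc = rng.uniform(0, 1, (40, 2))
      spectra = conc @ np.vstack([band1, band2]) + 0.01 * rng.normal(size=(40, 800))
      oil_yield = 10.0 + 25.0 * conc[:, 0] - 5.0 * conc[:, 1]   # synthetic target

      # Calibrate on 30 samples, predict the remaining 10.
      model = PLSRegression(n_components=2).fit(spectra[:30], oil_yield[:30])
      pred = model.predict(spectra[30:]).ravel()
      rmse = np.sqrt(np.mean((pred - oil_yield[30:]) ** 2))
      print("prediction error (RMSE):", round(float(rmse), 2))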

  1. Coal background paper. Coal demand

    International Nuclear Information System (INIS)

    1997-01-01

    Statistical data are presented on coal demand in IEA and OECD member countries and in other countries. Coal coking and coking coal consumption data are tabulated, and the IEA secretariat's coal demand projections are summarized. Coal supply and production data by country are given. Finally, coal trade data are presented, broken down into hard coal, steam coal and coking coal (imports and exports). (R.P.)

  2. Automatic Seismic-Event Classification with Convolutional Neural Networks.

    Science.gov (United States)

    Bueno Rodriguez, A.; Titos Luzón, M.; Garcia Martinez, L.; Benitez, C.; Ibáñez, J. M.

    2017-12-01

    Active volcanoes exhibit a wide range of seismic signals, providing vast amounts of unlabelled volcano-seismic data that can be analyzed through the lens of artificial intelligence. However, obtaining high-quality labelled data is time-consuming and expensive. Deep neural networks can process data in their raw form, compute high-level features and provide a better representation of the input data distribution. These systems can be deployed to classify seismic data at scale, enhance current early-warning systems and build extensive seismic catalogs. In this research, we aim to classify spectrograms from seven different seismic events registered at "Volcán de Fuego" (Colima, Mexico) during four eruptive periods. Our approach is based on convolutional neural networks (CNNs), a sub-type of deep neural networks that can exploit the grid structure of the data. Volcano-seismic signals can be mapped into a grid-like structure using the spectrogram: a representation of the temporal evolution in terms of time and frequency. Spectrograms were computed from the data using Hamming windows of 4 s length, with 2.5 s overlap and 128-point FFT resolution. Results are compared to deep neural networks, random forests and SVMs. Experiments show that CNNs can exploit temporal and frequency information, attaining a classification accuracy of 93%, similar to deep networks (91%) but outperforming SVMs and random forests. These results empirically show that CNNs are powerful models for classifying a wide range of volcano-seismic signals and achieve good generalization. Furthermore, volcano-seismic spectrograms contain useful discriminative information for the CNN, as higher layers of the network combine high-level features computed for each frequency band, helping to detect simultaneous events in time. Being at the intersection of deep learning and geophysics, this research enables future studies of how CNNs can be used in volcano monitoring to accurately determine the detection and
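
    The spectrogram front end described above is straightforward to reproduce. In the sketch below the sampling rate is assumed to be 32 Hz so that a 4 s Hamming window is exactly 128 samples, consistent with the quoted 128-point FFT; the input trace is synthetic, and the CNN itself is omitted.

      import numpy as np
      from scipy.signal import spectrogram

      fs = 32.0                          # assumed sampling rate [Hz]
      t = np.arange(0, 60, 1 / fs)
      rng = np.random.default_rng(3)
      trace = np.sin(2 * np.pi * 2.0 * t) + 0.5 * rng.normal(size=t.size)

      nperseg = int(4.0 * fs)            # 4 s Hamming window -> 128 samples
      noverlap = int(2.5 * fs)           # 2.5 s overlap -> 80 samples
      f, tt, sxx = spectrogram(trace, fs=fs, window="hamming",
                               nperseg=nperseg, noverlap=noverlap, nfft=128)

      # Log-power spectrogram as the CNN input "image" (frequency x time grid).
      x = 10.0 * np.log10(sxx + 1e-12)
      print("CNN input shape (freq bins, time frames):", x.shape)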

  3. Using Desktop Publishing To Enhance the "Writing Process."

    Science.gov (United States)

    Millman, Patricia G.; Clark, Margaret P.

    1997-01-01

    Describes the development of an instructional technology course at Fairmont State College (West Virginia) for education majors that included a teaching module combining steps of the writing process to provide for the interdisciplinary focus of writing across the curriculum. Discusses desktop publishing, the National Writing Project, and student…

  4. Geodesy in construction of the Belchatow brown coal mine. Geodezja w budowie KWB Belchatow

    Energy Technology Data Exchange (ETDEWEB)

    Poltoranos, J.

    1984-01-01

    Nine papers were delivered at the conference on geodesy in construction of the Belchatow brown coal mine held in October 1984 in Belchatow. Participants representing the Belchatow mine, Technical Institutes in Warsaw and Wroclaw, the Academy of Mining and Metallurgy im. Stanislaw Staszic in Cracow, the Central Mining Institute in Katowice, other research institutes in Poland and the Ministry of Mining and Power Generation attended the conference, sponsored by the Committee of Geodesy of the Polish Academy of Sciences. The following problems were discussed: types of geodetic measuring networks used in coal surface mining, criteria for optimization of geodetic measuring networks, kinematic problems in surveying displacements in coal mines, investigating strata movement in slopes of large and deep coal surface mines using geodetic surveying, mine surveying in the Belchatow mine, recommendations for amendment of regulations for geodetic surveying in coal surface mines in Poland, character of coal deposit in the Belchatow fault valley, its origin and geology, and causes of seismicity induced by mining in Belchatow. Eight papers have been abstracted separately.

  5. Performance of underground coal mines during the 1976 Tangshan earthquake

    Energy Technology Data Exchange (ETDEWEB)

    Lee, C.F.

    1987-01-01

    The Tangshan earthquake of 1976 cost 242 000 lives and was responsible for 164 000 serious injuries and structural damage of immense proportions. The area has eight coal mines, which together form the largest underground coal mining operation in China. Approximately 10 000 miners were working underground at the time of the earthquake. With few exceptions they survived and returned safely to the surface, only to find their families and belongings largely destroyed. Based on a comprehensive survey of the miners' observations, subsurface intensity profiles were drawn up. The profiles clearly indicated that seismic damage in the underground mines was far less severe than at the surface. 16 refs., 4 figs., 2 tabs.

  6. Fast principal component analysis for stacking seismic data

    Science.gov (United States)

    Wu, Juan; Bai, Min

    2018-04-01

    Stacking seismic data plays an indispensable role in many steps of the seismic data processing and imaging workflow. Optimal stacking of seismic data can help mitigate seismic noise and greatly enhance the principal components. Traditional average-based seismic stacking methods cannot obtain optimal performance when the ambient noise is extremely strong. We propose a principal component analysis (PCA) algorithm for stacking seismic data that is not sensitive to the noise level. Considering the computational bottleneck of the classic PCA algorithm in processing massive seismic data, we propose an efficient PCA algorithm to make the method readily applicable for industrial applications. Two numerically designed examples and one real seismic data set are used to demonstrate the performance of the presented method.
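
    The core idea, stacking as a rank-1 principal-component approximation of the gather, can be sketched briefly. The SVD-based Python below is a simple stand-in for the authors' fast PCA algorithm, with a synthetic gather and an arbitrary noise level.

      import numpy as np

      def pca_stack(gather):
          # Stack an NMO-corrected gather (n_traces x n_samples) as its rank-1
          # SVD term: the first principal component carries the coherent signal
          # shared by all traces, making the stack robust to strong noise.
          u, s, vt = np.linalg.svd(gather, full_matrices=False)
          return s[0] * u[:, 0].mean() * vt[0]   # scaled to average-trace amplitude

      # Synthetic gather: the same wavelet on every trace plus heavy noise.
      rng = np.random.default_rng(4)
      n_tr, n_t = 40, 500
      wavelet = np.exp(-0.5 * ((np.arange(n_t) - 250) / 10.0) ** 2)
      gather = wavelet + rng.normal(size=(n_tr, n_t))

      mean_stack = gather.mean(axis=0)
      pca = pca_stack(gather)
      print("event SNR, mean stack:", abs(mean_stack[250]) / mean_stack[:100].std())
      print("event SNR, PCA  stack:", abs(pca[250]) / pca[:100].std())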

  7. A synchrotron-based local computed tomography combined with data-constrained modelling approach for quantitative analysis of anthracite coal microstructure

    International Nuclear Information System (INIS)

    Chen, Wen Hao; Yang, Sam Y. S.; Xiao, Ti Qiao; Mayo, Sherry C.; Wang, Yu Dan; Wang, Hai Peng

    2014-01-01

    A quantitative local computed tomography method combined with data-constrained modelling has been developed. The method can distinctly improve the spatial resolution and the composition resolution in a sample larger than the field of view, for quantitative characterization of three-dimensional distributions of material compositions and voids. Quantifying three-dimensional spatial distributions of pores and material compositions in samples is a key materials characterization challenge, particularly in samples where compositions are distributed across a range of length scales, and where such compositions have similar X-ray absorption properties, such as in coal. Consequently, obtaining detailed information within sub-regions of a multi-length-scale sample by conventional approaches may not provide the resolution and level of detail one might desire. Herein, an approach for quantitative high-definition determination of material compositions from X-ray local computed tomography combined with a data-constrained modelling method is proposed. The approach is capable of dramatically improving the spatial resolution, revealing finer details within a region of interest of a sample larger than the field of view than conventional techniques can. A coal sample containing distributions of porosity and several mineral compositions is employed to demonstrate the approach. The optimal experimental parameters are pre-analyzed. The quantitative results demonstrate that the approach can reveal significantly finer details of compositional distributions in the sample region of interest. The elevated spatial resolution is crucial for coal-bed methane reservoir evaluation and understanding the transformation of the minerals during coal processing. The method is generic and can be applied for three-dimensional compositional characterization of other materials.

  8. Desktop Publishing on the Macintosh: A Software Perspective.

    Science.gov (United States)

    Devan, Steve

    1987-01-01

    Discussion of factors to be considered in selecting desktop publishing software for the Macintosh microcomputer focuses on the two approaches to such software, i.e., batch and interactive, and three technical considerations, i.e., document, text, and graphics capabilities. Some new developments in graphics software are also briefly described. (MES)

  9. Introducing the CUAHSI Hydrologic Information System Desktop Application (HydroDesktop) and Open Development Community

    Science.gov (United States)

    Ames, D.; Kadlec, J.; Horsburgh, J. S.; Maidment, D. R.

    2009-12-01

    The Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) Hydrologic Information System (HIS) project includes extensive development of data storage and delivery tools and standards, including WaterML (a language for sharing hydrologic data sets via web services) and HIS Server (a software tool set for delivering WaterML from a server). These and other CUAHSI HIS tools have been under development and deployment for several years and together present a relatively complete software “stack” to support the consistent storage and delivery of hydrologic and other environmental observation data. This presentation describes the development of a new HIS software tool called “HydroDesktop” and the development of an online open source software development community to update and maintain the software. HydroDesktop is a local (i.e. not server-based) client-side software tool that will ultimately run on multiple operating systems and will provide a highly usable level of access to HIS services. The software provides many key capabilities including data query, map-based visualization, data download, local data maintenance, editing, graphing, data export to selected model-specific data formats, linkage with integrated modeling systems such as OpenMI, and ultimately upload to HIS servers from the local desktop software. As the software is presently in the early stages of development, this presentation focuses on the design approach and paradigm and is viewed as an opportunity to encourage participation in the open development community. Indeed, recognizing the value of community-based code development as a means of ensuring end-user adoption, this project has adopted an “iterative” or “spiral” software development approach, which will be described in this presentation.

  10. Computational Fluid Dynamics (CFD) Modeling for High Rate Pulverized Coal Injection (PCI) to Blast Furnaces

    International Nuclear Information System (INIS)

    Zhou, Chenn

    2008-01-01

    Pulverized coal injection (PCI) into the blast furnace (BF) has been recognized as an effective way to decrease the coke and total energy consumption along with minimization of environmental impacts. However, increasing the amount of coal injected into the BF is currently limited by the lack of knowledge of some issues related to the process. It is therefore important to understand the complex physical and chemical phenomena in the PCI process. Due to the difficulty in obtaining true BF measurements, computational fluid dynamics (CFD) modeling has been identified as a useful technology to provide such knowledge. CFD simulation is powerful for providing detailed information on flow properties and performing parametric studies for process design and optimization. In this project, comprehensive 3-D CFD models have been developed to simulate the PCI process under actual furnace conditions. These models provide raceway size and flow property distributions. The results have provided guidance for optimizing the PCI process.

  11. Seismic fragility of a reinforced concrete structure

    Energy Technology Data Exchange (ETDEWEB)

    Kurmann, Davide [Axpo Power AG, Baden (Switzerland); Proske, Dirk [Axpo Power AG, Doettingen (Switzerland); Cervenka, Jan [Cervenka Consulting, Prague (Czech Republic)

    2013-05-15

    Structures can be exposed to seismic loading. For structures of major importance, extreme seismic loadings have to be considered, and the proof of safety for such loadings requires sophisticated analysis. This paper introduces an analysis method which still includes simplifications but yields a far more realistic estimate of the seismic load-bearing capacity of reinforced concrete structures than common methods. It is based on the development of pushover curves and the application of time histories to a representative harmonic oscillator as the dynamic model. Dynamic parameters of the oscillator, such as modal mass and damping, are computed using a soil-structure interaction analysis. Based on the pushover curve, nonlinear force-deformation capacities, including hysteretic behaviour, are assigned to the oscillator. The oscillator is then exposed to the time histories of several earthquakes, and from this computation the ductility is obtained. The ductility can be scaled based upon the scaling of the time histories. Since both the uncertainty of the earthquake (by using different time histories) and the uncertainty of the structure (by using characteristic and mean material values) are considered, the uncertainty of the structure under seismic loading can be explicitly represented by a fragility. (orig.)
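
    The oscillator-plus-scaled-time-histories procedure can be illustrated with a toy elastic-perfectly-plastic oscillator. The sketch below integrates its equation of motion and reports the ductility demand for several scalings of a synthetic record; all parameters and the record are hypothetical, and soil-structure interaction is omitted.

      import numpy as np

      def ductility_demand(ag, dt, fy, m=1.0, wn=2 * np.pi * 2.0, zeta=0.05):
          # Peak ductility of an elastic-perfectly-plastic SDOF oscillator under
          # ground acceleration ag [m/s^2]; semi-implicit Euler integration.
          k = m * wn ** 2
          c = 2.0 * zeta * m * wn
          uy = fy / k                       # yield displacement
          u = v = fr = 0.0                  # displacement, velocity, restoring force
          u_max = 0.0
          for a_g in ag:
              acc = (-m * a_g - c * v - fr) / m
              v += acc * dt
              u += v * dt
              fr = min(max(fr + k * v * dt, -fy), fy)   # hysteretic force update
              u_max = max(u_max, abs(u))
          return u_max / uy

      # Scale one synthetic record upward and watch the ductility demand grow,
      # mimicking the scaling of time histories used to build the fragility.
      rng = np.random.default_rng(5)
      dt = 0.01
      record = rng.normal(0.0, 1.0, 2000) * np.hanning(2000)   # toy accelerogram
      for scale in (0.5, 1.0, 2.0):
          mu = ductility_demand(scale * record, dt, fy=2.0)
          print(f"scale {scale:.1f}: ductility demand = {mu:.2f}")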

  12. No effect of ambient odor on the affective appraisal of a desktop virtual environment with signs of disorder.

    Directory of Open Access Journals (Sweden)

    Alexander Toet

    Desktop virtual environments (VEs) are increasingly deployed to study the effects of environmental qualities and interventions on human behavior and safety-related concerns in built environments. For these applications it is essential that users appraise the affective qualities of the VE similarly to those of its real-world counterpart. Previous studies have shown that factors like simulated lighting, sound and dynamic elements all contribute to the affective appraisal of a desktop VE. Since ambient odor is known to affect the affective appraisal of real environments, and has been shown to increase the sense of presence in immersive VEs, it may also be an effective tool to tune the affective appraisal of desktop VEs. This study investigated whether exposure to ambient odor can modulate the affective appraisal of a desktop VE with signs of public disorder. Participants explored a desktop VE representing a suburban neighborhood with signs of public disorder (neglect, vandalism and crime), while being exposed to either room air or subliminal levels of unpleasant (tar) or pleasant (cut grass) ambient odor. Whenever they encountered signs of disorder they reported their safety-related concerns and associated affective feelings. Signs of crime in the desktop VE were associated with negative affective feelings and concerns for personal safety and personal property. However, there was no significant difference between the reported safety-related concerns and affective connotations in the control (no-odor) condition and in each of the two ambient odor conditions. Ambient odor thus did not affect the safety-related concerns and affective connotations associated with signs of disorder in the desktop VE. Semantic congruency between an ambient odor and a desktop VE may therefore not be sufficient to influence its affective appraisal, and a more realistic simulation in which simulated objects appear to emit scents may be required to achieve this goal.

  13. Selective coal mining of intercalated lignite deposits

    Energy Technology Data Exchange (ETDEWEB)

    Zunic, R [Kolubara-Projekt, Lazarevac (Yugoslavia)

    1991-01-01

    Describes selective coal mining in the Tamnava-Istocno Polje surface coal mine (Yugoslavia), designed for an annual coal production of 11.4 Mt. Until 1991, this mine exploited one thick lignite seam, without spoil intercalations, using a bucket wheel excavator-conveyor-spreader system both for coal mining and for removal of overburden. In the future, several spoil intercalations of 1.0 m thickness and more will appear, with a total volume of 22 million m³. These intercalations have to be selectively excavated in order to guarantee the calorific value of the coal supplied to the Nikola Tesla power plant. Computer calculations were carried out to determine the decrease in excavator coal production due to selective mining of the spoil strata. The calculations found that the annual surface mine capacity will be reduced by at most 9%, depending on the thickness of the spoil intercalations, and the useful operating time of the excavators will be reduced by 98 hours per year. The planned annual coal production will nevertheless be fulfilled. 3 refs.

  14. Determination of Kinetic Parameters of Coal Pyrolysis to Simulate the Process of Underground Coal Gasification (UCG)

    Directory of Open Access Journals (Sweden)

    Beata Urych

    2014-01-01

    Originality/value: The devolatilization of a homogeneous lump of coal is a complex issue. Currently, the CFD (Computational Fluid Dynamics) technique is commonly used for modelling multi-dimensional and multiphase phenomena. The mathematical models describing the kinetics of the decomposition of coal proposed in the article can, therefore, be an integral part of models based on numerical fluid mechanics.
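
    As a simple concrete instance of such a kinetic model, the single first-order reaction (SFOR) form of devolatilization can be integrated in a few lines. The kinetic constants and heating profile below are illustrative, not the fitted values of the paper.

      import numpy as np
      from scipy.integrate import solve_ivp

      # Single first-order reaction (SFOR) devolatilization model,
      #   dV/dt = k0 * exp(-E / (R * T(t))) * (V_inf - V),
      # with illustrative kinetic constants.
      k0, E, R = 2.0e5, 7.4e4, 8.314     # 1/s, J/mol, J/(mol K)
      V_inf = 0.40                       # ultimate volatile yield (mass fraction)

      def temperature(t):
          # Assumed linear heating at 10 K/s from 300 K, capped at 1200 K.
          return min(300.0 + 10.0 * t, 1200.0)

      def rhs(t, v):
          k = k0 * np.exp(-E / (R * temperature(t)))
          return [k * (V_inf - v[0])]

      sol = solve_ivp(rhs, (0.0, 120.0), [0.0], max_step=0.1)
      print(f"volatile yield after {sol.t[-1]:.0f} s:"
            f" {sol.y[0, -1]:.3f} of {V_inf} ultimate")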

  15. Versatile Desktop Experiment Module (DEMo) on Heat Transfer

    Science.gov (United States)

    Minerick, Adrienne R.

    2010-01-01

    This paper outlines a new Desktop Experiment Module (DEMo) engineered for a chemical engineering junior-level Heat Transfer course. This new DEMo learning tool is versatile, fairly inexpensive, and portable such that it can be positioned on student desks throughout a classroom. The DEMo system can illustrate conduction of various materials,…

  16. Essential Means for Urban Computing: Specification of Web-Based Computing Platforms for Urban Planning, a Hitchhiker’s Guide

    OpenAIRE

    Pirouz Nourian; Carlos Martinez-Ortiz; Ken Arroyo Ohori

    2018-01-01

    This article provides an overview of the specifications of web-based computing platforms for urban data analytics and computational urban planning practice. There are currently a variety of tools and platforms that can be used in urban computing practices, including scientific computing languages, interactive web languages, data sharing platforms and still many desktop computing environments, e.g., GIS software applications. We have reviewed a list of technologies considering their potential ...

  17. Distributed Temperature Measurement in a Self-Burning Coal Waste Pile through a GIS Open Source Desktop Application

    Directory of Open Access Journals (Sweden)

    Lia Duarte

    2017-03-01

    Geographical Information Systems (GIS) are often used to assess and monitor the environmental impacts caused by mining activities. The aim of this work was to develop a new application to produce dynamic maps for monitoring the temperature variations in a self-burning coal waste pile, under a GIS open source environment—GIS-ECOAL (freely available). The performance of the application was evaluated with distributed temperature measurements gathered in the S. Pedro da Cova (Portugal) coal waste pile. In order to obtain the temperature data, an optical fiber cable was laid over the affected area of the pile, with 42 location stakes acting as precisely-located control points for the temperature measurement. A monthly data set from July (15-min sampling interval) was fed into the application, and a video composed of several layouts with temperature measurements was created, allowing two main areas with higher temperatures to be recognized. The field observations also allow the identification of these zones; however, the identification of an area with higher temperatures at the top of the studied area was only possible through the visualization of the images created by this application. The generated videos make possible the dynamic and continuous visualization of the combustion process in the monitored area.

  18. The desktop muon detector: A simple, physics-motivated machine- and electronics-shop project for university students

    Science.gov (United States)

    Axani, S. N.; Conrad, J. M.; Kirby, C.

    2017-12-01

    This paper describes the construction of a desktop muon detector, an undergraduate-level physics project that develops machine-shop and electronics-shop technical skills. The desktop muon detector is a self-contained apparatus that employs a plastic scintillator as the detection medium and a silicon photomultiplier for light collection. This detector can be battery powered and is used in conjunction with the provided software. The total cost per detector is approximately $100. We describe physics experiments we have performed, and then suggest several other interesting measurements that are possible with one or more desktop muon detectors.

  19. Seismic design of piping systems

    International Nuclear Information System (INIS)

    Anglaret, G.; Beguin, J.L.

    1986-01-01

    This paper deals with the method used in France for PWR nuclear plants to derive the locations and types of supports of auxiliary and secondary piping systems, taking earthquakes into account. The successive steps of the design are described, then the seismic computation method and its particular conditions of application for piping are presented. The different types of supports (and especially seismic ones) are described, along with their conditions of installation. The method used to compare functional test results and computation results in order to control the models is mentioned. Some experiments carried out on site or in the laboratory in order to validate models and methods are presented [fr]

  20. Computer modelling of the combined effects of plant conditions and coal quality on burnout in utility furnaces

    Energy Technology Data Exchange (ETDEWEB)

    P. Stephenson [RWE npower Engineering, Swindon (United Kingdom)

    2007-09-15

    The aim of this paper is to describe the latest steps in the development of a computer model to predict the combined effects of plant conditions and coal quality on burnout. The work was conducted as part of RWE's contribution to the recent ECSC project 'Development of a carbon-in-ash notification system (CARNO)'. A burnout predictor code has been developed and validated; it includes both coal and plant effects and a burnout model based closely on CBK8. The agreement between predicted C-in-ash and plant data is encouraging, but further improvements are still desirable. The predictions obtained from the burnout predictor show that the calculated sensitivities to changes in plant conditions can be very dependent on the state of the plant. 7 refs., 7 figs., 1 tab.

  1. Frozen Gaussian approximation for 3D seismic tomography

    Science.gov (United States)

    Chai, Lihui; Tong, Ping; Yang, Xu

    2018-05-01

    Three-dimensional (3D) wave-equation-based seismic tomography is computationally challenging at large scales and in the high-frequency regime. In this paper, we apply the frozen Gaussian approximation (FGA) method to compute 3D sensitivity kernels and high-frequency seismic tomography. Rather than the standard ray theory used in seismic inversion (e.g. Kirchhoff migration and Gaussian beam migration), FGA is used to compute the 3D high-frequency sensitivity kernels for travel-time or full waveform inversions. Specifically, we reformulate the equations of the forward and adjoint wavefields so that FGA can be applied conveniently; with this reformulation, one can efficiently compute the Green's functions whose convolutions with the source time function produce the wavefields needed for the construction of 3D kernels. Moreover, a fast summation method is proposed based on a local fast Fourier transform, which greatly improves the speed of reconstruction in the last step of the FGA algorithm. We apply FGA to both travel-time adjoint tomography and full waveform inversion (FWI) on synthetic crosswell seismic data with dominant frequencies as high as those of real crosswell data, and confirm again that FWI requires a more sophisticated initial velocity model for convergence than travel-time adjoint tomography. We also numerically test the accuracy of applying FGA to local earthquake tomography. This study paves the way to applying wave-equation-based seismic tomography methods directly to real data around their dominant frequencies.

  2. The Erasmus Computing Grid - Building a Super-Computer Virtually for Free at the Erasmus Medical Center and the Hogeschool Rotterdam

    NARCIS (Netherlands)

    T.A. Knoch (Tobias); L.V. de Zeeuw (Luc)

    2006-01-01

    The Set-Up of the 20 Teraflop Erasmus Computing Grid: To meet the enormous computational needs of life-science research as well as clinical diagnostics and treatment, the Hogeschool Rotterdam and the Erasmus Medical Center are currently setting up one of the largest desktop

  3. Specific issues for seismic performance of power plant equipment

    Energy Technology Data Exchange (ETDEWEB)

    Nawrotzki, Peter [GERB Vibration Control Systems, Berlin (Germany)

    2010-01-15

    Power plant machinery can be dynamically decoupled from the substructure by the effective use of helical steel springs and viscous dampers. Turbine foundations, coal mills, boiler feed pumps and other machine foundations benefit from this type of elastic support system, which mitigates the transmission of operational vibration. These devices may also be used to protect against earthquakes and other catastrophic events, e.g. airplane crash, which is of particular importance in nuclear facilities. This article illustrates the basic principles of elastic support systems and their applications to power plant equipment and buildings in medium and high seismic areas. Spring-damper combinations with special stiffness properties are used to reduce seismic acceleration levels of turbine components and other safety- or non-safety-related structures. For turbine buildings, the integration of the turbine sub-structure into the machine building can further reduce stress levels in all structural members. The application of this seismic protection strategy to a spent fuel storage tank in a high seismic area is also discussed. Safety in nuclear facilities is of particular importance, and recent seismic events and the resulting damage in these facilities have again brought up the discussion. One of the latest events is the 2007 Chuetsu earthquake in Japan. The resulting damage in the Kashiwazaki Kariwa Nuclear Power Plant is documented in several reports, e.g. in Yamashita. (orig.)

  4. Micromagnetics on high-performance workstation and mobile computational platforms

    Science.gov (United States)

    Fu, S.; Chang, R.; Couture, S.; Menarini, M.; Escobar, M. A.; Kuteifan, M.; Lubarda, M.; Gabay, D.; Lomakin, V.

    2015-05-01

    The feasibility of using high-performance desktop and embedded mobile computational platforms is presented, including multi-core Intel central processing units, Nvidia desktop graphics processing units, and the Nvidia Jetson TK1 platform. The FastMag finite-element-based micromagnetic simulator is used as a testbed, showing high efficiency on all platforms. Optimization aspects of improving the performance of the mobile systems are discussed. The high performance, low cost, low power consumption, and rapid performance increases of embedded mobile systems make them a promising candidate for micromagnetic simulations. Such architectures can be used as standalone systems or can be built into low-power computing clusters.

  5. An Exercise in Desktop Publishing: Using the "Newsroom."

    Science.gov (United States)

    Kiteka, Sebastian F.

    This guide provides a description and step-by-step instructions for the use of "Newsroom," a desktop-publishing program for the Apple II series of microcomputers produced by Springboard Software Inc. Based on the 1984 version of the program, this two-hour exercise focuses on the design and production of a newsletter with text and…

  6. Control of the extraction, transport and quality of coal in sections in actual time intervals

    Energy Technology Data Exchange (ETDEWEB)

    Prochazka, P; Sladek, J

    1981-01-01

    This paper describes the design of a system for the automatic, semiautomatic and manual control of the extraction, transport and quality of coal in two sections of the Severo-Cheshsk brown coal basin using computers. The coal from these sections is transported along a joint transport main line, consisting of three conveyor lines, to two grinding works and from there to three thermoelectric power plants. Based on information about the coal quality in the mining sections of individual excavators, about their productivity and about the throughput of the conveyor lines, the computer determines, within a short time, the maximum possible throughput of the conveyor lines that still ensures the required coal quality. The programs are written in ALGOL. Information from the excavators is transmitted to the SM-3 computer through a JPR-12 computer, using Tesla Radom wireless communications apparatus. A terminal mounted on each excavator reports to the computer the number of ledges being mined, the type of coal in them, the distance of the excavator from the coal loading point, and the required and actual productivity of the excavator.

  7. Development of seismic hazard analysis in Japan

    International Nuclear Information System (INIS)

    Itoh, T.; Ishii, K.; Ishikawa, Y.; Okumura, T.

    1987-01-01

    In recent years, seismic risk assessments of nuclear power plants have been conducted increasingly in various countries, particularly in the United States, to probabilistically evaluate the safety of existing plants under earthquake loading. The first step of a seismic risk assessment is the seismic hazard analysis, in which the relationship between the maximum earthquake ground motions at the plant site and their annual probability of exceedance, i.e. the seismic hazard curve, is estimated. In this paper, seismic hazard curves are evaluated and examined for several different sites in Japan, based on a historical earthquake record model in which seismic sources are modeled as area sources. A new evaluation method is also proposed to compute the response spectra of the earthquake ground motions in connection with estimating the probabilistic structural response. Finally, the numerical results of a probabilistic risk assessment for a base-isolated three-story RC structure, in which the frequency of seismically induced structural failure is evaluated in combination with the seismic hazard analysis, are described briefly.
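
    As general background, a seismic hazard curve of the kind estimated here relates a ground-motion level to its annual probability of exceedance by combining source occurrence rates with an attenuation model. Below is a minimal sketch under strongly simplified assumptions—a single area source with a truncated Gutenberg-Richter rate, a toy attenuation relation with no ground-motion variability, and hypothetical parameter values—not the method proposed in the paper.

      import numpy as np

      rate_m4 = 0.5                  # annual rate of events with M >= 4.0 (hypothetical)
      b = 1.0                        # Gutenberg-Richter b-value
      m_min, m_max = 4.0, 8.0
      dist_km = 30.0                 # source-to-site distance

      def pga_g(m, r):
          # Toy median attenuation relation (illustrative only)
          return np.exp(-3.5 + 0.9 * m - 1.2 * np.log(r + 10.0))

      mags = np.linspace(m_min, m_max, 201)
      # Truncated Gutenberg-Richter: annual rate of events per magnitude bin
      cdf = (1 - 10 ** (-b * (mags - m_min))) / (1 - 10 ** (-b * (m_max - m_min)))
      bin_rates = rate_m4 * np.diff(cdf)
      bin_mags = 0.5 * (mags[1:] + mags[:-1])

      for level in [0.05, 0.1, 0.2, 0.4]:
          lam = bin_rates[pga_g(bin_mags, dist_km) > level].sum()
          p_annual = 1 - np.exp(-lam)          # Poisson probability of exceedance
          print(f"PGA > {level:.2f} g: annual P(exceedance) = {p_annual:.5f}")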

  8. FY 2000 report on the survey of the overseas geological structure. Japan-China joint coal exploration - Yu Xian project; 2000 nendo kaigai chishitsu kozo nado chosa hokokusho. Nippon Chugoku sekitan kyodo tansa Yu Xian project

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-03-01

    The geological survey needed for coal mine design in the Yu Xian coal mine area, Yu Xian coal field, Hebei province, China, was carried out. The term of the survey was 5 years, from 1996 to 2000. The activities consisted mainly of a seismic survey and a boring survey. Japan was in charge of the seismic survey and China of the boring survey; both attained their goals. The results of the activities are summed up in the following 7 items: 1) outline of the survey; 2) general investigation; 3) state of the exploration related materials/machinery; 4) field survey; 5) items of survey; 6) results of the survey; 7) conclusion. In 6), the geological analysis, coal quality survey and coal amount survey were conducted. In the geological analysis, the succession of strata, the geological structure, and the occurrence of coal seams were analyzed. In 7), the following were made clear: the geological structure of the survey area, coal seams, coal quality, hydrological geology, other conditions related to drilling technology, and coal amount. The total coal amount of A/B/C class coals was 328.34 million tons. The combined coal amount of the Nos. 1 and 5 coal seams was 259.79 million tons, which is 79.1% of the total coal amount in the whole area. The average thicknesses of the Nos. 1 and 5 coal seams, the main minable coal seams, were 3.10 m and 2.66 m, respectively. (NEDO)

  9. Fast multifrequency focal beam analysis for 3D seismic acquisition geometry

    NARCIS (Netherlands)

    Wei, W.; Fu, L.; Blacquiere, G.

    2012-01-01

    A method for the efficient computation of multifrequency focal beams for 3D seismic acquisition geometry analysis has been developed. By computing them for all the frequency components of seismic data, single-frequency focal beams can be extended to multifrequency focal beams. However, this

  10. Seismic Barrier Protection of Critical Infrastructure from Earthquakes

    Science.gov (United States)

    2017-05-01

    We observe that such barrier structures reduce, by 10-40 dB, the seismic wave power that would otherwise reach the foundation location. Moreover, the... structure composed of opposing boreholes or trenches to mitigate seismic waves from diffracting and traveling in the vertical plane. Computational... seismic wave propagation models suggest that air- or fluid-filled subsurface V-shaped muffler structures are critical to the redirection and self

  11. Radiant-and-plasma technology for coal processing

    Directory of Open Access Journals (Sweden)

    Vladimir Messerle

    2012-12-01

    Full Text Available Radiant-and-plasma technology for coal processing is presented in the article. Thermodynamic computations and experiments on plasma processing of bituminous coal that had been preliminarily activated by an electron beam were carried out, in comparison with plasma processing of the untreated coal. A positive influence of the preliminary electron-beam activation of the coal on synthesis gas yield was found. Experiments were carried out in a plasma gasifier of 100 kW power. Measurements of the material and heat balance of the process gave the following integral indicators: a weight-average temperature of 2200-2300 K and a carbon gasification degree of 82.4-83.2%. The synthesis gas yield with thermochemical preparation of raw coal dust for burning was 24.5%, whereas with electron-beam activation of the coal the synthesis gas yield reached 36.4%, which is 48% higher.

  12. BDE-209 in the Australian Environment: Desktop review

    Energy Technology Data Exchange (ETDEWEB)

    English, Karin, E-mail: k.english@uq.edu.au [School of Medicine, The University of Queensland, Brisbane (Australia); Children’s Health and Environment Program, Child Health Research Centre, The University of Queensland, Brisbane (Australia); Queensland Children’s Medical Research Institute, Children’s Health Research Centre, Brisbane (Australia); Toms, Leisa-Maree L. [School of Public Health and Social Work, and Institute of Health and Biomedical Innovation, Queensland University of Technology, Brisbane (Australia); Gallen, Christie; Mueller, Jochen F. [The University of Queensland, National Research Centre for Environmental Toxicology (Entox), Brisbane (Australia)

    2016-12-15

    The commercial polybrominated diphenyl ether (PBDE) flame retardant mixture c-decaBDE is now being considered for listing on the Stockholm Convention on Persistent Organic Pollutants. The aim of our study was to review the literature regarding the use and detection of BDE-209, a major component of c-decaBDE, in consumer products and provide a best estimate of goods that are likely to contain BDE-209 in Australia. This review is part of a larger study, which will include quantitative testing of items to assess for BDE-209. The findings of this desktop review will be used to determine which items should be prioritized for quantitative testing. We identified that electronics, particularly televisions, computers, small household appliances and power boards, were the items that were most likely to contain BDE-209 in Australia. Further testing of these items should include items of various ages. Several other items were identified as high priority for future testing, including transport vehicles, building materials and textiles in non-domestic settings. The findings from this study will aid in the development of appropriate policies, should listing of c-decaBDE on the Stockholm Convention and Australia’s ratification of that listing proceed.

  13. BDE-209 in the Australian Environment: Desktop review

    International Nuclear Information System (INIS)

    English, Karin; Toms, Leisa-Maree L.; Gallen, Christie; Mueller, Jochen F.

    2016-01-01

    The commercial polybrominated diphenyl ether (PBDE) flame retardant mixture c-decaBDE is now being considered for listing on the Stockholm Convention on Persistent Organic Pollutants. The aim of our study was to review the literature regarding the use and detection of BDE-209, a major component of c-decaBDE, in consumer products and provide a best estimate of goods that are likely to contain BDE-209 in Australia. This review is part of a larger study, which will include quantitative testing of items to assess for BDE-209. The findings of this desktop review will be used to determine which items should be prioritized for quantitative testing. We identified that electronics, particularly televisions, computers, small household appliances and power boards, were the items that were most likely to contain BDE-209 in Australia. Further testing of these items should include items of various ages. Several other items were identified as high priority for future testing, including transport vehicles, building materials and textiles in non-domestic settings. The findings from this study will aid in the development of appropriate policies, should listing of c-decaBDE on the Stockholm Convention and Australia’s ratification of that listing proceed.

  14. A GIS-based multi-criteria seismic vulnerability assessment using the integration of granular computing rule extraction and artificial neural networks

    NARCIS (Netherlands)

    Sheikhian, Hossein; Delavar, Mahmoud Reza; Stein, Alfred

    2017-01-01

    This study proposes multi‐criteria group decision‐making to address seismic physical vulnerability assessment. Granular computing rule extraction is combined with a feed forward artificial neural network to form a classifier capable of training a neural network on the basis of the rules provided by

  15. Nielsen PrimeLocation Web/Desktop: Assessing and GIS Mapping Market Area

    Data.gov (United States)

    Social Security Administration — Nielsen PrimeLocation Web and Desktop Software Licensed for Internal Use only: Pop-Facts Demographics Database, Geographic Mapping Data Layers, Geo-Coding locations.

  16. Seismic isolation floor and vibration control equipment for nuclear power plant

    International Nuclear Information System (INIS)

    Niwa, H.; Fujimoto, S.; Aida, Y.; Miyano, H.

    1996-01-01

    We have developed a seismic isolation floor to improve the earthquake protection of process computer systems, and a magnetic dynamic damper to reduce the mechanical vibrations of piping systems and pumps in nuclear power plants. Seismic excitation tests of the seismic isolation floor, on which process computer systems were installed, were performed using large earthquake simulators. The test results proved that the seismic isolation floor significantly reduced seismic forces. To control mechanical vibrations, a magnetic dynamic damper was designed using permanent magnets. This magnetic dynamic damper does not require mechanical springs, dampers or supports in the floors and walls of the building. Vibration tests using a rotating machine model confirmed that the magnetic dynamic damper effectively controlled the vibrations of the model. (author)
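
    The abstract gives no tuning details for the dynamic damper. As general background, a dynamic vibration absorber of this kind is classically tuned with the Den Hartog rules, sketched below with hypothetical masses and frequency; this is textbook material, not the design procedure of the paper.

      import math

      def den_hartog_tuning(m_main, m_damper, f_main_hz):
          """Classical Den Hartog tuning for a dynamic vibration absorber."""
          mu = m_damper / m_main                                # mass ratio
          f_damper = f_main_hz / (1.0 + mu)                     # optimal absorber frequency
          zeta = math.sqrt(3.0 * mu / (8.0 * (1.0 + mu) ** 3))  # optimal damping ratio
          return f_damper, zeta

      # e.g. a 50 kg absorber on a 1000 kg rotating-machine model near 30 Hz
      print(den_hartog_tuning(1000.0, 50.0, 30.0))              # (~28.6 Hz, ~0.13)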

  17. Development of a Desktop Simulator for APR1400 Nuclear Power Plant

    International Nuclear Information System (INIS)

    Lee, J. B.

    2016-01-01

    It is essential for utilities to possess a full-scope simulator for operator training and operational testing. However, such simulators are very expensive and sometimes lack fidelity if the simulator is developed in parallel with the plant design; the simulator development stage sometimes precedes the plant design stage, and modifications may be made to the plant design during construction. In an attempt to build a low-cost and efficient simulator, a desktop simulator has been developed; this model is described herein. Using desktop simulators to train operators is an efficient way of familiarizing them with their plant's operation. A low-cost and efficient desktop simulator for the APR1400 has been developed, and its main features are introduced here. It is configured to mimic a full-scale simulator and can be used to familiarize operators with their plant's operation. Since the simulator is small enough to fit on a desk, it can be used in a classroom or in an office at any time. It can also be used to evaluate design changes or modifications of the plant before implementing them in the plant

  18. Development of a Desktop Simulator for APR1400 Nuclear Power Plant

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J. B. [KHNP Central Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    It is essential for utilities to possess a full-scope simulator for operator training and operational testing. However, such simulators are very expensive and sometimes lack fidelity if the simulator is developed in parallel with the plant design; the simulator development stage sometimes precedes the plant design stage, and modifications may be made to the plant design during construction. In an attempt to build a low-cost and efficient simulator, a desktop simulator has been developed; this model is described herein. Using desktop simulators to train operators is an efficient way of familiarizing them with their plant's operation. A low-cost and efficient desktop simulator for the APR1400 has been developed, and its main features are introduced here. It is configured to mimic a full-scale simulator and can be used to familiarize operators with their plant's operation. Since the simulator is small enough to fit on a desk, it can be used in a classroom or in an office at any time. It can also be used to evaluate design changes or modifications of the plant before implementing them in the plant.

  19. Development of Web-Based Remote Desktop to Provide Adaptive User Interfaces in Cloud Platform

    OpenAIRE

    Shuen-Tai Wang; Hsi-Ya Chang

    2014-01-01

    Cloud virtualization technologies are becoming more and more prevalent, and cloud users usually face the problem of how to access virtualized remote desktops easily over the web without having to install special clients. To resolve this issue, we took advantage of HTML5 technology and developed a web-based remote desktop. It permits users to access, from anywhere, a terminal running in our cloud platform. We implemented a sketch of web interfac...

  20. Using Seismic Interferometry to Investigate Seismic Swarms

    Science.gov (United States)

    Matzel, E.; Morency, C.; Templeton, D. C.

    2017-12-01

    Seismicity provides a direct means of measuring the physical characteristics of active tectonic features such as fault zones. Hundreds of small earthquakes often occur along a fault during a seismic swarm. This seismicity helps define the tectonically active region. When processed using novel geophysical techniques, we can isolate the energy sensitive to the fault itself. Here we focus on two methods of seismic interferometry: ambient noise correlation (ANC) and the virtual seismometer method (VSM). ANC is based on the observation that the Earth's background noise includes coherent energy, which can be recovered by observing over long time periods and allowing the incoherent energy to cancel out. The cross-correlation of ambient noise between a pair of stations results in a waveform that is identical to the seismogram that would result if an impulsive source located at one of the stations were recorded at the other: the Green's function (GF). The calculation of the GF is often stable after a few weeks of continuous data correlation; any perturbations to the GF after that point are directly related to changes in the subsurface and can be used for 4D monitoring. VSM is a style of seismic interferometry that provides fast, precise, high-frequency estimates of the GF between earthquakes. VSM illuminates the subsurface precisely where the pressures are changing and has the potential to image the evolution of seismicity over time, including changes in the style of faulting. With hundreds of earthquakes, we can calculate thousands of waveforms. At the same time, VSM collapses the computational domain, often by 2-3 orders of magnitude. This allows us to do high-frequency 3D modeling in the fault region. Using data from a swarm of earthquakes near the Salton Sea, we demonstrate the power of these techniques, illustrating our ability to scale from the far field, where sources are well separated, to the near field where their locations fall within each other
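
    The ANC workflow described above reduces, at its core, to cross-correlating simultaneous noise records from two stations and stacking over time. The following is a minimal synthetic sketch of that core step; the whitening, band-passing and normalization used in practice are omitted, and all signals and parameters are hypothetical.

      import numpy as np

      def noise_correlation(sta1, sta2, fs, max_lag_s=60.0):
          """Cross-correlate two station records; stacked over many windows,
          this converges toward the inter-station Green's function."""
          n = len(sta1)
          xc = np.correlate(sta1, sta2, mode="full") / n
          lags = np.arange(-n + 1, n) / fs
          keep = np.abs(lags) <= max_lag_s
          return lags[keep], xc[keep]

      fs = 20.0
      rng = np.random.default_rng(3)
      common = rng.normal(size=5000)                   # shared ambient noise field
      sta1 = np.roll(common, int(2.5 * fs)) + 0.5 * rng.normal(size=common.size)
      sta2 = common + 0.5 * rng.normal(size=common.size)
      lags, xc = noise_correlation(sta1, sta2, fs)
      print(f"peak at lag {lags[np.argmax(xc)]:.2f} s")  # ~ +2.5 s, the imposed travel time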

  1. Automated cost modeling for coal combustion systems

    International Nuclear Information System (INIS)

    Rowe, R.M.; Anast, K.R.

    1991-01-01

    This paper reports on cost information, developed at the AMAX R and D Center for coal-water slurry production, implemented in an automated spreadsheet (Lotus 123) for personal computer use. The spreadsheet format allows the user to evaluate the impacts of various process options, coal feedstock characteristics, fuel characteristics, plant location sites, and plant sizes on fuel cost. The model's flexibility reduces the time and labor required to determine fuel costs and provides a basis for comparing fuels manufactured by different processes. The model input includes coal characteristics, plant flowsheet definition, plant size, and market location. Based on these inputs, selected unit operations are chosen for coal processing

  2. In Situ Test Study of Characteristics of Coal Mining Dynamic Load

    Directory of Open Access Journals (Sweden)

    Jiang He

    2015-01-01

    Full Text Available The combination of coal mining dynamic load and high static stress can easily induce such dynamic disasters as rock burst, coal and gas outburst, roof fall, and water inrush. In order to obtain the characteristic parameters of mining dynamic load and the dynamic mechanism of coal and rock, stress wave theory is applied to derive the relation between the mining dynamic load strain rate and stress wave parameters. An in situ test was carried out to study the stress wave propagation law of coal mine dynamic load, using the SOS microseismic monitoring system. An evaluation method for the mining dynamic load strain rate is proposed, and a statistical evaluation is carried out for the range of strain rates. The research results show that the loading strain rate of mining dynamic load is directly proportional to the seismic frequency of the coal-rock mass and the peak particle vibration velocity, and inversely proportional to the wave velocity. The high-frequency component damps faster than the low-frequency component as the shock wave propagates, and the peak particle vibration velocity has a power-function relationship with the propagation distance. The loading strain rate of mining dynamic load is generally less than 10⁻¹/s.
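
    The stated proportionalities follow from plane-wave theory: for a harmonic stress wave, strain is of order v/c, so the peak strain rate scales as 2*pi*f*v_peak/c. A minimal sketch of this relation; the numerical values are hypothetical, not measurements from the paper.

      import math

      def loading_strain_rate(freq_hz, v_peak_ms, wave_speed_ms):
          """Peak strain rate of a plane harmonic stress wave: 2*pi*f*v/c,
          proportional to frequency and peak particle velocity and inversely
          proportional to wave speed, as stated in the abstract."""
          return 2 * math.pi * freq_hz * v_peak_ms / wave_speed_ms

      # Hypothetical mining tremor: 20 Hz, 0.05 m/s peak particle velocity,
      # 3500 m/s wave speed in the coal-rock mass
      print(loading_strain_rate(20, 0.05, 3500))   # ~1.8e-3 /s, below 1e-1/s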

  3. Virtual network computing: cross-platform remote display and collaboration software.

    Science.gov (United States)

    Konerding, D E

    1999-04-01

    VNC (Virtual Network Computing) is a computer program written to address the problem of cross-platform remote desktop/application display. VNC uses a client/server model in which an image of the desktop of the server is transmitted to the client and displayed. The client collects mouse and keyboard input from the user and transmits it back to the server. The VNC client and server can run on Windows 95/98/NT, MacOS, and Unix (including Linux) operating systems. VNC is multi-user on Unix machines (any number of servers can be run, and they are unrelated to the primary display of the computer), while it is effectively single-user on Macintosh and Windows machines (only one server can be run, displaying the contents of the primary display of the server). VNC servers can be configured to allow more than one client to connect at a time, effectively allowing collaboration through the shared desktop. I describe the function of VNC, provide details of its installation, describe how it achieves its goal, and evaluate the use of VNC for molecular modelling. VNC is an extremely useful tool for collaboration, instruction, software development, and debugging of graphical programs with remote users.

  4. Field test investigation of high sensitivity fiber optic seismic geophone

    Science.gov (United States)

    Wang, Meng; Min, Li; Zhang, Xiaolei; Zhang, Faxiang; Sun, Zhihui; Li, Shujuan; Wang, Chang; Zhao, Zhong; Hao, Guanghu

    2017-10-01

    Seismic reflection, whose measured signals are artificially generated seismic waves, is the most effective and widely used method in geophysical prospecting, and it can be used for the exploration of oil, gas and coal. When a seismic wave travelling through the Earth encounters an interface between two materials with different acoustic impedances, some of the wave energy reflects off the interface and some refracts through it. At its most basic, the seismic reflection technique consists of generating seismic waves and measuring the time taken for the waves to travel from the source, reflect off an interface and be detected by an array of geophones at the surface. Compared to traditional geophones such as electric, magnetic, mechanical and gas geophones, optical fiber geophones have many advantages: they can achieve sensing and signal transmission simultaneously. With the development of fiber grating sensor technology, the fiber Bragg grating (FBG) is being applied in seismic exploration and is drawing more and more attention for its advantages of immunity to electromagnetic interference, high sensitivity and insensitivity to meteorological conditions. In this paper, we designed a high-sensitivity geophone based on the theory of FBG sensing and tested its sensitivity. The frequency response range is from 10 Hz to 100 Hz, and the acceleration sensitivity of the fiber optic seismic geophone is over 1000 pm/g. A sixteen-element fiber optic seismic geophone array system is presented, and a field test was performed in the Shengli oilfield of China. The field test shows that: (1) the fiber optic seismic geophone has a higher sensitivity than the traditional geophone between 1-100 Hz; (2) the fiber Bragg grating geophone shows better low-frequency reflection wave continuity.
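
    With the quoted figure of more than 1000 pm/g, converting a measured Bragg wavelength shift into acceleration is a one-line scaling. A minimal sketch; the default sensitivity and the example shift are hypothetical, not calibration values from the paper.

      def fbg_acceleration(delta_lambda_pm, sensitivity_pm_per_g=1000.0):
          """Convert an FBG wavelength shift (pm) to acceleration (g),
          assuming a linear sensitivity as quoted in the abstract."""
          return delta_lambda_pm / sensitivity_pm_per_g

      print(fbg_acceleration(25.0))   # a 25 pm shift corresponds to 0.025 g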

  5. XRD and FT–IR investigations of sub-bituminous Assam coals

    Indian Academy of Sciences (India)

    TECS

    extremely difficult and therefore, research on coal structure is still a ... the basis of development of spectroscopic methods. These are FTIR ... such as natural gas or imported coals to satisfy the energy requirement .... A new computer program in C++ was developed .... tool for characterization of coal and its products as it fur-.

  6. Seismic safety margins research program. Phase I. Project VII: systems analysis specifications of computational approach

    International Nuclear Information System (INIS)

    Collins, J.D.; Hudson, J.M.; Chrostowski, J.D.

    1979-02-01

    A computational methodology is presented for the prediction of core melt probabilities in a nuclear power plant due to earthquake events. The proposed model has four modules: seismic hazard, structural dynamics (including soil-structure interaction), component failure and core melt sequence. The proposed modules would operate in series and would not have to be operated at the same time. The basic statistical approach uses a Monte Carlo simulation to treat random and systematic error, but alternative statistical approaches are permitted by the program design
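
    As general background to this kind of Monte Carlo treatment, the sketch below estimates a failure probability by sampling a hazard intensity and lognormal component fragilities in series. All distributions and parameter values are hypothetical illustrations, not the specifications of the program described in the report.

      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000

      # Hypothetical annual peak ground acceleration (g), lognormal hazard
      pga = rng.lognormal(mean=np.log(0.05), sigma=0.9, size=n)

      # Hypothetical lognormal fragilities: (median capacity in g, log-std)
      fragilities = {"piping": (1.2, 0.4), "vessel": (1.8, 0.35), "pump": (0.9, 0.5)}

      # Treat core melt as failure of any one component (series system)
      fail_any = np.zeros(n, dtype=bool)
      for median, beta in fragilities.values():
          capacity = rng.lognormal(np.log(median), beta, size=n)
          fail_any |= pga > capacity

      print(f"Estimated annual core-melt frequency: {fail_any.mean():.2e}")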

  7. Desktop Publishing in the University: Current Progress, Future Visions.

    Science.gov (United States)

    Smith, Thomas W.

    1989-01-01

    Discussion of the workflow involved in desktop publishing focuses on experiences at the College of Engineering at the University of Wisconsin at Madison. Highlights include cost savings and productivity gains in page layout and composition; editing, translation, and revision issues; printing and distribution; and benefits to the reader. (LRW)

  8. Big Memory Elegance: HyperCard Information Processing and Desktop Publishing.

    Science.gov (United States)

    Bitter, Gary G.; Gerson, Charles W., Jr.

    1991-01-01

    Discusses hardware requirements, functions, and applications of five information processing and desktop publishing software packages for the Macintosh: HyperCard, PageMaker, Cricket Presents, PowerPoint, and Adobe Illustrator. Benefits of these programs for schools are considered. (MES)

  9. Applications and a three-dimensional desktop environment for an immersive virtual reality system

    International Nuclear Information System (INIS)

    Kageyama, Akira; Masada, Youhei

    2013-01-01

    We developed an application launcher called Multiverse for scientific visualizations in a CAVE-type virtual reality (VR) system. Multiverse can be regarded as a type of three-dimensional (3D) desktop environment. In Multiverse, a user in a CAVE room can browse multiple visualization applications with 3D icons and explore movies that float in the air. Touching one of the movies causes ''teleportation'' into the application's VR space. After analyzing the simulation data using the application, the user can jump back into Multiverse's VR desktop environment in the CAVE

  10. Seismic forecast using geostatistics

    International Nuclear Information System (INIS)

    Grecu, Valeriu; Mateiciuc, Doru

    2007-01-01

    The main idea of this research direction consists in constructing a new type of mathematical function as a correlation between a computed statistical quantity and another physical quantity. This type of function, called a 'position function', was taken up by the authors of this study in the field of seismology, in the hope of solving - at least partially - the difficult problem of seismic forecasting. The geostatistical method of analysis focuses on the process of energy accumulation in a given seismic area, completing this analysis with a so-called loading function. This function - in fact a temporal function - describes the process of energy accumulation during a seismic cycle in a given seismic area. It was possible to discover a law of evolution of the seismic cycles, which was materialized in a so-called characteristic function. This special function helps forecast the magnitude and the occurrence time of the largest earthquake in the analysed area. Since 2000, the authors have moved to a new stage of testing: real-time analysis, in order to verify the quality of the method. Five forecasts of large earthquakes have been made. (authors)

  11. HydroDesktop: An Open Source GIS-Based Platform for Hydrologic Data Discovery, Visualization, and Analysis

    Science.gov (United States)

    Ames, D. P.; Kadlec, J.; Cao, Y.; Grover, D.; Horsburgh, J. S.; Whiteaker, T.; Goodall, J. L.; Valentine, D. W.

    2010-12-01

    A growing number of hydrologic information servers are being deployed by government agencies, university networks, and individual researchers using the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) Hydrologic Information System (HIS). The CUAHSI HIS Project has developed a standard software stack, called HydroServer, for publishing hydrologic observations data. It includes the Observations Data Model (ODM) database and Water Data Service web services, which together enable publication of data on the Internet in a standard format called Water Markup Language (WaterML). Metadata describing available datasets hosted on these servers is compiled within a central metadata catalog called HIS Central at the San Diego Supercomputer Center and is searchable through a set of predefined web services based queries. Together, these servers and central catalog service comprise a federated HIS of a scale and comprehensiveness never previously available. This presentation will briefly review/introduce the CUAHSI HIS system with special focus on a new HIS software tool called "HydroDesktop" and the open source software development web portal, www.HydroDesktop.org, which supports community development and maintenance of the software. HydroDesktop is a client-side, desktop software application that acts as a search and discovery tool for exploring the distributed network of HydroServers, downloading specific data series, visualizing and summarizing data series and exporting these to formats needed for analysis by external software. HydroDesktop is based on the open source DotSpatial GIS developer toolkit which provides it with map-based data interaction and visualization, and a plug-in interface that can be used by third party developers and researchers to easily extend the software using Microsoft .NET programming languages. HydroDesktop plug-ins that are presently available or currently under development within the project and by third party

  12. International Coal Report's coal year 1991

    Energy Technology Data Exchange (ETDEWEB)

    McCloskey, G [ed.

    1991-05-31

    Following introductory articles on factors affecting trade in coal and developments in the freight market, tables are given for coal exports and coal imports for major countries worldwide for 1989 and 1990. Figures are also included for coal consumption in Canada and the Eastern bloc, power station consumption in Japan, coal supply and demand in the UK, electric utility coal consumption and stocks in the USA, coal production in Australia, Canada and the USA by state, and world hard coal production. A final section gives electricity production and hard coal deliveries in the EEC, sales of imported and local coal, and world production of pig iron and steel.

  13. A simple grid implementation with Berkeley Open Infrastructure for Network Computing using BLAST as a model

    Directory of Open Access Journals (Sweden)

    Watthanai Pinthong

    2016-07-01

    Full Text Available The development of high-throughput technologies, such as next-generation sequencing, allows thousands of experiments to be performed simultaneously while reducing resource requirements. Consequently, a massive amount of experimental data is now rapidly generated. Nevertheless, the data are not readily usable or meaningful until they are further analysed and interpreted. Due to the size of the data, a high-performance computer (HPC) is required for the analysis and interpretation. However, an HPC is expensive and difficult to access. Other means have been developed to give researchers the power of an HPC without the need to purchase and maintain one, such as cloud computing services and grid computing systems. In this study, we implemented grid computing in a computer training center environment, using the Berkeley Open Infrastructure for Network Computing (BOINC) as a job distributor and data manager and combining all desktop computers to virtualize an HPC. Fifty desktop computers were used to set up a grid system during off-hours. In order to test the performance of the grid system, we adapted the Basic Local Alignment Search Tool (BLAST) to the BOINC system. Sequencing results from the Illumina platform were aligned to the human genome database by BLAST on the grid system. The results and processing times were compared to those from a single desktop computer and from an HPC. The estimated durations of BLAST analysis for 4 million sequence reads on a desktop PC, the HPC and the grid system were 568, 24 and 5 days, respectively. Thus, the grid implementation of BLAST by BOINC is an efficient alternative to the HPC for sequence alignment. The grid implementation with BOINC also helped tap unused computing resources during off-hours and could be easily modified for other available bioinformatics software.

  14. A Real-World Project for a Desktop Publishing Course.

    Science.gov (United States)

    Marsden, James D.

    1994-01-01

    Describes a project in a desktop publishing course in which students work with nonprofit and campus organizations to design brochures that fulfill important needs. Discusses specific tools students use. Describes the brochure project, project criteria, clients, text and graphics for the project, how to evaluate the project, and guidelines for…

  15. Investigation of karst collapse based on 3-D seismic technique and DDA method at Xieqiao coal mine, China

    Energy Technology Data Exchange (ETDEWEB)

    Zuo, Jian-Ping; Chen, Zhong-Hui [State Key Laboratory of Coal Resources and Safe Mining, China University of Mining and Technology, Beijing 100083 (China); Institute of Rock Mechanical and Fractals, China University of Mining and Technology, Beijing 100083 (China); Peng, Su-Ping; Li, Yong-Jun [State Key Laboratory of Coal Resources and Safe Mining, China University of Mining and Technology, Beijing 100083 (China); Xie, He-Ping [Institute of Rock Mechanical and Fractals, China University of Mining and Technology, Beijing 100083 (China)

    2009-06-01

    Karst collapse is a serious geological problem in most coal mines in the north of China, but recently it has been found in the south as well. The present study is aimed at investigating the subsidence mechanism and deformation field of a karst collapse column at Xieqiao, in the south of China. A three-dimensional (3-D) seismic technique was successfully applied to explore the spatial morphology of the karst collapse at Xieqiao, and the discontinuous deformation analysis (DDA) method was used to calculate the deformation field and analyze the subsidence mechanism. The results indicate that DDA can approximately simulate and back-analyze the subsidence process and strata deformation fields. The subsidence processes of the collapse column depend on the sizes of the karst caves. With the continuous expansion of the karst caves, a semi-elliptic stress field, local strata separation and a fracture zone will form around the karst cave, and they will gradually expand upwards along the vertical direction. The paper also indicates that the subsidence failure stage may trigger a sudden collapse of the karst column because of the sudden energy release, which will strongly impact the nearby working face and may cause a rock burst. The effects of the friction angle of the rock strata on the subsidence mechanism are reported for the first time based on DDA. (author)

  16. ACID Astronomical and Physics Cloud Interactive Desktop: A Prototype of VUI for CTA Science Gateway

    Science.gov (United States)

    Massimino, P.; Costa, A.; Becciani, U.; Vuerli, C.; Bandieramonte, M.; Petta, C.; Riggi, S.; Sciacca, E.; Vitello, F.; Pistagna, C.

    2014-05-01

    The Astronomical & Physics Cloud Interactive Desktop, developed for the prototype of the CTA Science Gateway in Catania, Italy, allows many software packages to be used without any installation on the local desktop. Users are able to exploit, where applicable, the native Graphical User Interface (GUI) of the programs available in the ACID environment. To use the remote programs interactively, ACID exploits an "ad hoc" VNC-based User Interface (VUI).

  17. Printing in Ubiquitous Computing Environments

    NARCIS (Netherlands)

    Karapantelakis, Athanasios; Delvic, Alisa; Zarifi Eslami, Mohammed; Khamit, Saltanat

    Document printing has long been considered an indispensable part of the workspace. While this process is considered trivial and simple for environments where resources are ample (e.g. desktop computers connected to printers within a corporate network), it becomes complicated when applied in a mobile

  18. A nautical study of towed marine seismic streamer cable configurations

    Energy Technology Data Exchange (ETDEWEB)

    Pedersen, Egil

    1996-12-31

    This study concerns marine seismic surveying and especially the towed in-sea hardware which is dominated by recording cables (streamers) that are extremely long compared to their diameter, neutrally buoyant and depth controlled. The present work aims to examine the operations from a nautical viewpoint, and the final objective is to propose improvements to the overall efficiency of marine seismic operations. Full-scale data were gathered from seismic vessels in order to identify which physical parameters affect the dynamic motion of the towing vessel and its in-sea hardware. Experimental test programmes have been carried out, and data bases with the hydrodynamic characteristics of the test equipment have been established at speeds comparable to those used in seismic operations. A basic analysis tool to provide dynamic simulations of a seismic streamer cable has been developed by tailoring the computer program system Riflex, and the validation and accuracy of this modified Riflex system are evaluated by performing uncertainty analyses of measurements and computations. Unexpected, low-frequency depth motions in towed seismic streamer cables occasionally take place when seismic data are being acquired. The phenomenon is analysed and discussed. 99 refs., 116 figs., 5 tabs.

  19. A nautical study of towed marine seismic streamer cable configurations

    Energy Technology Data Exchange (ETDEWEB)

    Pedersen, Egil

    1997-12-31

    This study concerns marine seismic surveying and especially the towed in-sea hardware which is dominated by recording cables (streamers) that are extremely long compared to their diameter, neutrally buoyant and depth controlled. The present work aims to examine the operations from a nautical viewpoint, and the final objective is to propose improvements to the overall efficiency of marine seismic operations. Full-scale data were gathered from seismic vessels in order to identify which physical parameters affect the dynamic motion of the towing vessel and its in-sea hardware. Experimental test programmes have been carried out, and data bases with the hydrodynamic characteristics of the test equipment have been established at speeds comparable to those used in seismic operations. A basic analysis tool to provide dynamic simulations of a seismic streamer cable has been developed by tailoring the computer program system Riflex, and the validation and accuracy of this modified Riflex system are evaluated by performing uncertainty analyses of measurements and computations. Unexpected, low-frequency depth motions in towed seismic streamer cables occasionally take place when seismic data are being acquired. The phenomenon is analysed and discussed. 99 refs., 116 figs., 5 tabs.

  20. Processing of seismic signals from a seismometer network

    International Nuclear Information System (INIS)

    Key, F.A.; Warburton, P.J.

    1983-08-01

    A description is given of the Seismometer Network Analysis Computer (SNAC), which processes short-period data from a network of seismometers (UKNET). The nine stations of the network are distributed throughout the UK, and their outputs are transmitted to a control laboratory (Blacknest) where SNAC monitors the data for seismic signals. The computer gives an estimate of the source location of the detected signals and stores the waveforms. The detection logic is designed to maintain high sensitivity without excessive ''false alarms''. It is demonstrated that the system is able to detect seismic signals at an amplitude level consistent with a network of single stations and, within the limitations of signal onset time measurements made by machine, can locate the source of the seismic disturbance. (author)
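
    The report does not spell out the detection logic; a common approach for this kind of automatic monitoring of short-period data is a short-term-average/long-term-average (STA/LTA) trigger, sketched below. The window lengths and threshold are hypothetical, not SNAC's actual parameters.

      import numpy as np

      def sta_lta_trigger(trace, fs, sta_s=1.0, lta_s=30.0, threshold=4.0):
          """Return sample indices where the STA/LTA ratio exceeds the threshold."""
          power = trace.astype(float) ** 2
          n_sta, n_lta = int(sta_s * fs), int(lta_s * fs)
          sta = np.convolve(power, np.ones(n_sta) / n_sta, mode="same")
          lta = np.convolve(power, np.ones(n_lta) / n_lta, mode="same")
          ratio = sta / np.maximum(lta, 1e-12)
          return np.flatnonzero(ratio > threshold)

      # Synthetic example: background noise with a burst injected at t = 60 s
      fs = 100.0
      trace = np.random.default_rng(1).normal(size=int(120 * fs))
      trace[int(60 * fs):int(62 * fs)] += 8.0
      print(sta_lta_trigger(trace, fs)[:5] / fs)   # first trigger times (s), near 60 s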

  1. An Investigation of Seismicity for Western Anatolia

    International Nuclear Information System (INIS)

    Sayil, N.

    2007-01-01

    In order to determine the seismicity of western Anatolia, bounded by the coordinates 36°-40°N, 26°-32°E, the Gutenberg-Richter magnitude-frequency relation, seismic risk and recurrence period have been computed. The data belonging to both the historical period before 1900 (I0 ≥ 6.0, corresponding to MS ≥ 5.0) and the instrumental period until 2005 (MS ≥ 4.0) have been used in the analysis. The study area has been divided into 13 sub-regions on the basis of seismotectonic characteristics, plate tectonic models and the geology of the region. Computations of the a and b parameters, the seismic risk and the recurrence period for each sub-region have shown that sub-regions 1 and 8 (Balikesir and Izmir-Sakiz Island), which have the lowest b values, have the highest risks and the shortest recurrence periods
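
    The Gutenberg-Richter relation fits log10 N = a - bM to the catalog; a standard way to estimate b above a completeness magnitude is Aki's maximum-likelihood formula. A minimal sketch on a synthetic (hypothetical) catalog, not the data of this study:

      import numpy as np

      def gutenberg_richter_b(mags, m_c):
          """Maximum-likelihood b-value (Aki, 1965): b = log10(e) / (mean(M) - Mc),
          for magnitudes at or above the completeness magnitude m_c."""
          m = np.asarray(mags)
          m = m[m >= m_c]
          return np.log10(np.e) / (m - m_c).mean()

      # Synthetic catalog: magnitudes above Mc = 4.0 with a true b of 1.0
      rng = np.random.default_rng(7)
      catalog = 4.0 + rng.exponential(scale=1.0 / np.log(10), size=2000)
      print(f"b = {gutenberg_richter_b(catalog, 4.0):.2f}")   # close to 1.0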

  2. Measurement and modeling of advanced coal conversion processes

    Energy Technology Data Exchange (ETDEWEB)

    Solomon, P.R.; Serio, M.A.; Hamblen, D.G. (Advanced Fuel Research, Inc., East Hartford, CT (United States)); Smoot, L.D.; Brewster, B.S. (Brigham Young Univ., Provo, UT (United States))

    1991-01-01

    The objectives of this study are to establish the mechanisms and rates of basic steps in coal conversion processes, to integrate and incorporate this information into comprehensive computer models for coal conversion processes, to evaluate these models, and to apply them to gasification, mild gasification and combustion in heat engines.

  3. Measurement and modeling of advanced coal conversion processes

    Energy Technology Data Exchange (ETDEWEB)

    Solomon, P.R.; Serio, M.A.; Hamblen, D.G. (Advanced Fuel Research, Inc., East Hartford, CT (United States)); Smoot, L.D.; Brewster, B.S. (Brigham Young Univ., Provo, UT (United States))

    1991-09-25

    The objectives of this study are to establish the mechanisms and rates of basic steps in coal conversion processes, to integrate and incorporate this information into comprehensive computer models for coal conversion processes, to evaluate these models and to apply them to gasification, mild gasification and combustion in heat engines. (VC)

  4. Designing for Communication: The Key to Successful Desktop Publishing.

    Science.gov (United States)

    McCain, Ted D. E.

    Written for those who are new to design and page layout, this book focuses on providing novice desktop publishers with an understanding of communication, graphic design, typography, page layout, and page layout techniques. The book also discusses how people read, design as a consequence of understanding, and the principles of page layout. Chapters…

  5. What Desktop Publishing Can Teach Professional Writing Students about Publishing.

    Science.gov (United States)

    Dobberstein, Michael

    1992-01-01

    Points out that desktop publishing is a metatechnology that allows professional writing students access to the production phase of publishing, giving students hands-on practice in preparing text for printing and in learning how that preparation affects the visual meaning of documents. (SR)

  6. Visual attention for a desktop virtual environment with ambient scent

    NARCIS (Netherlands)

    Toet, A.; Schaik, M.G. van

    2013-01-01

    In the current study participants explored a desktop virtual environment (VE) representing a suburban neighborhood with signs of public disorder (neglect, vandalism and crime), while being exposed to either room air (control group), or subliminal levels of tar (unpleasant; typically associated with

  7. NASA's Climate in a Box: Desktop Supercomputing for Open Scientific Model Development

    Science.gov (United States)

    Wojcik, G. S.; Seablom, M. S.; Lee, T. J.; McConaughy, G. R.; Syed, R.; Oloso, A.; Kemp, E. M.; Greenseid, J.; Smith, R.

    2009-12-01

    NASA's High Performance Computing Portfolio, in cooperation with its Modeling, Analysis, and Prediction program, intends to make its climate and earth science models more accessible to a larger community. A key goal of this effort is to open the model development and validation process to the scientific community at large, such that a natural selection process is enabled and results in a more efficient scientific process. One obstacle to others using NASA models is the complexity of the models and the difficulty in learning how to use them. This situation applies not only to scientists who regularly use these models but also to non-typical users who may want to use the models, such as scientists from different domains, policy makers, and teachers. Another obstacle to the use of these models is that access to high performance computing (HPC) accounts, on which the models are run, can be restrictive, with long wait times in job queues and delays caused by an arduous process of obtaining an account, especially for foreign nationals. This project explores the utility of desktop supercomputers in providing a complete ready-to-use toolkit of climate research products to investigators and on-demand access to an HPC system. One objective of this work is to pre-package NASA and NOAA models so that new users will not have to spend significant time porting the models. In addition, the prepackaged toolkit will include tools, such as workflow, visualization, social networking web sites, and analysis tools, to assist users in running the models and analyzing the data. The system architecture to be developed will allow for automatic code updates for each user and an effective means with which to deal with the data that are generated. We plan to investigate several desktop systems, but our work to date has focused on a Cray CX1. Currently, we are investigating the potential capabilities of several non-traditional development environments. While most NASA and NOAA models are

  8. Seismic data are rich in information about subsurface formations and fluids

    Energy Technology Data Exchange (ETDEWEB)

    Farfour, Mohammed; Yoon, Wang Jung; Kim, Dongshin [Geophysical Prospecting Lab, Energy & Resources Eng., Dept., Chonnam National University, Gwangju (Korea, Republic of); Lee, Jeong-Hwan [Petroleum Engineering Lab, Energy & Resources Eng., Dept., Chonnam National University, Gwangju (Korea, Republic of)

    2016-06-08

    Seismic attributes are defined as any measured or computed information derived from seismic data. Throughout the last decades, extensive work has been done on developing a variety of mathematical approaches to extract maximum information from seismic data. Nevertheless, geoscientists have found that seismic data remain rich in information. In this paper a new seismic attribute is introduced. The instantaneous energy seismic attribute is an amplitude-based attribute that has the potential to emphasize anomalous amplitudes associated with hydrocarbons. Promising results have been obtained from applying the attribute to a seismic section traversing hydrocarbon-filled sand from Alberta, Canada.
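
    The abstract does not give the exact formula. Two common ways to compute an instantaneous-energy-style attribute from a trace are a windowed sum of squared amplitudes and the squared envelope of the analytic signal; the sketch below shows both on a synthetic wavelet, with a hypothetical window length.

      import numpy as np
      from scipy.signal import hilbert

      def windowed_energy(trace, win=11):
          """Windowed sum of squared amplitudes along a seismic trace."""
          return np.convolve(trace.astype(float) ** 2, np.ones(win), mode="same")

      def envelope_energy(trace):
          """Squared envelope of the analytic signal (Hilbert transform)."""
          return np.abs(hilbert(trace)) ** 2

      t = np.linspace(0.0, 1.0, 500)
      trace = np.sin(2 * np.pi * 30 * t) * np.exp(-((t - 0.5) ** 2) / 0.005)
      print(windowed_energy(trace).max(), envelope_energy(trace).max())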

  9. Computer aided seismic and fire retrofitting analysis of existing high rise reinforced concrete buildings

    CERN Document Server

    Hussain, Raja Rizwan; Hasan, Saeed

    2016-01-01

    This book details the analysis and design of high-rise buildings for gravity and seismic loading. It provides the knowledge structural engineers need to retrofit existing structures in order to meet safety requirements and better prevent potential damage from such disasters as earthquakes and fires. Coverage includes actual case studies of existing buildings, reviews of current knowledge of damage and its mitigation, protective design technologies, and analytical and computational techniques. The monograph also provides an experimental investigation of the properties of fiber-reinforced concrete containing natural fibers, such as coconut coir, as well as steel fibers used for comparison, in both Normal Strength Concrete (NSC) and High Strength Concrete (HSC). In addition, the authors examine the use of various repair techniques for damaged high-rise buildings. The book will help upcoming structural design engineers learn the computer-aided analysis and design of real existing high-rise buildings ...

  10. Seismic wavefield modeling based on time-domain symplectic and Fourier finite-difference method

    Science.gov (United States)

    Fang, Gang; Ba, Jing; Liu, Xin-xin; Zhu, Kun; Liu, Guo-Chang

    2017-06-01

    Seismic wavefield modeling is important for improving seismic data processing and interpretation. Calculations of wavefield propagation are sometimes unstable when forward modeling of seismic waves uses large time steps over long times. Based on the Hamiltonian expression of the acoustic wave equation, we propose a structure-preserving method for seismic wavefield modeling that applies the symplectic finite-difference method on time grids and the Fourier finite-difference method on space grids to solve the acoustic wave equation. The proposed method, called the symplectic Fourier finite-difference (symplectic FFD) method, offers high computational accuracy and improved computational stability. Using the acoustic approximation, we extend the method to anisotropic media. We discuss the calculations in the symplectic FFD method for seismic wavefield modeling of isotropic and anisotropic media, and use the BP salt model and BP TTI model to test the proposed method. The numerical examples suggest that the proposed method can be used for seismic modeling in media with strongly variable velocities, offering high computational accuracy and low numerical dispersion. The symplectic FFD method overcomes the residual qSV wave in seismic modeling of anisotropic media and maintains the stability of wavefield propagation for large time steps.
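
    Not the authors' symplectic FFD scheme, but as a baseline illustration of time-stepping the acoustic wave equation, the sketch below advances the 1-D equation u_tt = c^2 u_xx with the second-order leapfrog (Stormer-Verlet) integrator, which is itself symplectic; the grid sizes, velocity and source are hypothetical.

      import numpy as np

      nx, nt = 400, 800
      dx, c = 5.0, 2000.0               # grid spacing (m) and velocity (m/s)
      dt = 0.9 * dx / c                 # time step within the CFL stability limit

      u_prev = np.zeros(nx)
      u = np.zeros(nx)
      u[nx // 2] = 1.0                  # impulsive source in the middle of the grid

      for _ in range(nt):
          lap = np.zeros(nx)
          lap[1:-1] = (u[2:] - 2 * u[1:-1] + u[:-2]) / dx ** 2  # spatial Laplacian
          u_next = 2 * u - u_prev + (c * dt) ** 2 * lap          # leapfrog update
          u_prev, u = u, u_next

      print(f"max |u| after {nt} steps: {np.abs(u).max():.3e}")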

  11. Feasibility of Bioprinting with a Modified Desktop 3D Printer.

    Science.gov (United States)

    Goldstein, Todd A; Epstein, Casey J; Schwartz, John; Krush, Alex; Lagalante, Dan J; Mercadante, Kevin P; Zeltsman, David; Smith, Lee P; Grande, Daniel A

    2016-12-01

    Numerous studies have shown the capabilities of three-dimensional (3D) printing for use in the medical industry. At the time of this publication, basic home desktop 3D printer kits can cost as little as $300, whereas medical-specific 3D bioprinters can cost more than $300,000. The purpose of this study is to show how a commercially available desktop 3D printer could be modified to bioprint an engineered poly-l-lactic acid scaffold containing viable chondrocytes in a bioink. Our bioprinter was used to create a living 3D functional tissue-engineered cartilage scaffold. In this article, we detail the design, production, and calibration of this bioprinter. In addition, the bioprinted cells were tested for viability, proliferation, biochemistry, and gene expression; these tests showed that the cells survived the printing process, were able to continue dividing, and produce the extracellular matrix expected of chondrocytes.

  12. Seismic response analyses for reactor facilities at Savannah River

    International Nuclear Information System (INIS)

    Miller, C.A.; Costantino, C.J.; Xu, J.

    1991-01-01

    The reactor facilities at the Savannah River Plant (SRP) were designed during the 1950's. The original seismic criterion defining the input ground motion was 0.1 G, with UBC [Uniform Building Code] provisions used to evaluate structural seismic loads. Later ground motion criteria have defined the free field seismic motion with a 0.2 G ZPA [zero period acceleration] and various spectral shapes. The spectral shapes have included the Housner spectra, a site-specific spectrum, and the US NRC [Nuclear Regulatory Commission] Reg. Guide 1.60 shape. The development of these free field seismic criteria is discussed in the paper. The more recent seismic analyses have been of the following types: fixed base response spectra, frequency-independent lumped-parameter soil/structure interaction (SSI), frequency-dependent lumped-parameter SSI, and current state-of-the-art analyses using computer codes such as SASSI. The results from these computations consist of structural loads and floor response spectra (used for piping and equipment qualification). These results are compared in the paper and the methods used to validate the results are discussed. 14 refs., 11 figs

  13. CarbonSAFE Rocky Mountain Phase I : Seismic Characterization of the Navajo Reservoir, Buzzard Bench, Utah

    Science.gov (United States)

    Haar, K. K.; Balch, R. S.; Lee, S. Y.

    2017-12-01

    The CarbonSAFE Rocky Mountain project team is in the initial phase of investigating the regulatory, financial and technical feasibility of commercial-scale CO2 capture and storage from two coal-fired power plants in the northwest region of the San Rafael Swell, Utah. The reservoir interval is the Jurassic Navajo Sandstone, an eolian dune deposit that at present serves as the salt water disposal reservoir for Ferron Sandstone coal-bed methane production in the Drunkards Wash field and Buzzard Bench area of central Utah. In the study area the Navajo Sandstone is approximately 525 feet thick and is at an average depth of about 7000 feet below the surface. If sufficient porosity and permeability exist, reservoir depth and thickness would provide storage for up to 100,000 metric tonnes of CO2 per square mile, based on preliminary estimates. This reservoir has the potential to meet the DOE's requirement of having the ability to store at least 50 million metric tons of CO2 and fulfills the DOE's initiative to develop protocols for commercially sequestering carbon sourced from coal-fired power plants. A successful carbon storage project requires thorough structural and stratigraphic characterization of the reservoir, seal and faults, thereby allowing the creation of a comprehensive geologic model with subsequent simulations to evaluate CO2/brine migration and long-term effects. Target formation lithofacies and subfacies data gathered from outcrop mapping and laboratory analysis of core samples were developed into a geologic model. Synthetic seismic was modeled from this, allowing us to seismically characterize the lithofacies of the target formation. This seismic characterization data was then employed in the interpretation of 2D legacy lines, which provided stratigraphic and structural control for more accurate model development of the northwest region of the San Rafael Swell. Developing baseline interpretations such as these is crucial for long-term carbon storage.
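
    As a rough plausibility check on the quoted per-square-mile capacity, the standard volumetric screening formula M = A * h * phi * rho(CO2) * E can be evaluated; the porosity, CO2 density and efficiency factor below are assumptions invented for illustration, not numbers from the study.

      # Volumetric screening estimate M = A * h * phi * rho_CO2 * E.
      A = 2.59e6          # area of one square mile (m^2)
      h = 160.0           # reservoir thickness (m), ~525 ft
      phi = 0.12          # assumed porosity
      rho_co2 = 700.0     # assumed CO2 density at reservoir depth (kg/m^3)
      E = 0.005           # assumed storage efficiency factor

      mass_tonnes = A * h * phi * rho_co2 * E / 1000.0
      # ~1.7e5 t per square mile: the same order of magnitude as the
      # abstract's preliminary estimate.
      print(f"{mass_tonnes:,.0f} tonnes CO2 per square mile")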

  14. Lock It Up! Computer Security.

    Science.gov (United States)

    Wodarz, Nan

    1997-01-01

    The data contained on desktop computer systems and networks pose security issues for virtually every district. Sensitive information can be protected by educating users, altering the physical layout, using password protection, designating access levels, backing up data, reformatting floppy disks, using antivirus software, and installing encryption…

  15. Deep Unsupervised Learning on a Desktop PC: A Primer for Cognitive Scientists.

    Science.gov (United States)

    Testolin, Alberto; Stoianov, Ivilin; De Filippo De Grazia, Michele; Zorzi, Marco

    2013-01-01

    Deep belief networks hold great promise for the simulation of human cognition because they show how structured and abstract representations may emerge from probabilistic unsupervised learning. These networks build a hierarchy of progressively more complex distributed representations of the sensory data by fitting a hierarchical generative model. However, learning in deep networks typically requires big datasets and it can involve millions of connection weights, which implies that simulations on standard computers are unfeasible. Developing realistic, medium-to-large-scale learning models of cognition would therefore seem to require expertise in programming parallel-computing hardware, and this might explain why the use of this promising approach is still largely confined to the machine learning community. Here we show how simulations of deep unsupervised learning can be easily performed on a desktop PC by exploiting the processors of low-cost graphic cards (graphic processor units) without any specific programming effort, thanks to the use of high-level programming routines (available in MATLAB or Python). We also show that even an entry-level graphic card can outperform a small high-performance computing cluster in terms of learning time and with no loss of learning quality. We therefore conclude that graphic card implementations pave the way for a widespread use of deep learning among cognitive scientists for modeling cognition and behavior.
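
    A minimal sketch of the approach the authors advocate (illustrative, not their released code): one contrastive-divergence (CD-1) weight update for a restricted Boltzmann machine, the building block of a deep belief network, written against the high-level numpy API. Swapping in a GPU array library such as cupy, which mirrors most of this API (an assumption about API compatibility, not something the paper states), moves the same code onto the graphic card.

      import numpy as np  # `import cupy as np` is the usual GPU swap

      rng = np.random.default_rng(0)
      n_vis, n_hid, lr = 784, 500, 0.05
      W = 0.01 * rng.standard_normal((n_vis, n_hid))  # biases omitted for brevity

      def sigmoid(x):
          return 1.0 / (1.0 + np.exp(-x))

      def cd1_update(v0, W):
          """One CD-1 step on a batch of visible vectors v0, shape (batch, n_vis)."""
          h0 = sigmoid(v0 @ W)                        # hidden activation probabilities
          h_samp = (rng.random(h0.shape) < h0) * 1.0  # stochastic hidden states
          v1 = sigmoid(h_samp @ W.T)                  # reconstruction
          h1 = sigmoid(v1 @ W)
          return W + lr * (v0.T @ h0 - v1.T @ h1) / v0.shape[0]

      batch = rng.random((64, n_vis))  # stand-in data; MNIST digits would be typical
      W = cd1_update(batch, W)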

  16. Deep unsupervised learning on a desktop PC: A primer for cognitive scientists

    Directory of Open Access Journals (Sweden)

    Alberto eTestolin

    2013-05-01

    Full Text Available Deep belief networks hold great promise for the simulation of human cognition because they show how structured and abstract representations may emerge from probabilistic unsupervised learning. These networks build a hierarchy of progressively more complex distributed representations of the sensory data by fitting a hierarchical generative model. However, learning in deep networks typically requires big datasets and it can involve millions of connection weights, which implies that simulations on standard computers are unfeasible. Developing realistic, medium-to-large-scale learning models of cognition would therefore seem to require expertise in programming parallel-computing hardware, and this might explain why the use of this promising approach is still largely confined to the machine learning community. Here we show how simulations of deep unsupervised learning can be easily performed on a desktop PC by exploiting the processors of low-cost graphic cards (GPUs) without any specific programming effort, thanks to the use of high-level programming routines (available in MATLAB or Python). We also show that even an entry-level graphic card can outperform a small high-performance computing cluster in terms of learning time and with no loss of learning quality. We therefore conclude that graphic card implementations pave the way for a widespread use of deep learning among cognitive scientists for modeling cognition and behavior.

  17. Deep Unsupervised Learning on a Desktop PC: A Primer for Cognitive Scientists

    Science.gov (United States)

    Testolin, Alberto; Stoianov, Ivilin; De Filippo De Grazia, Michele; Zorzi, Marco

    2013-01-01

    Deep belief networks hold great promise for the simulation of human cognition because they show how structured and abstract representations may emerge from probabilistic unsupervised learning. These networks build a hierarchy of progressively more complex distributed representations of the sensory data by fitting a hierarchical generative model. However, learning in deep networks typically requires big datasets and it can involve millions of connection weights, which implies that simulations on standard computers are unfeasible. Developing realistic, medium-to-large-scale learning models of cognition would therefore seem to require expertise in programming parallel-computing hardware, and this might explain why the use of this promising approach is still largely confined to the machine learning community. Here we show how simulations of deep unsupervised learning can be easily performed on a desktop PC by exploiting the processors of low-cost graphic cards (graphic processor units) without any specific programming effort, thanks to the use of high-level programming routines (available in MATLAB or Python). We also show that even an entry-level graphic card can outperform a small high-performance computing cluster in terms of learning time and with no loss of learning quality. We therefore conclude that graphic card implementations pave the way for a widespread use of deep learning among cognitive scientists for modeling cognition and behavior. PMID:23653617

  18. Addition to the Lewis Chemical Equilibrium Program to allow computation from coal composition data

    Science.gov (United States)

    Sevigny, R.

    1980-01-01

    Changes made to the Lewis Chemical Equilibrium Program for the Coal Gasification Project are reported. The program was originally developed to compute equilibrium combustion in rocket engines, and it can be applied directly to the entrained flow coal gasification process. The particular problem addressed is the reduction of the coal data into a form suitable for the program, since the manual process is involved and error-prone. A similar problem in relating the normal output of the program to parameters meaningful to the coal gasification process is also addressed.
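
    The data-reduction step described here amounts to converting a coal analysis from mass percentages into elemental abundances. A hypothetical sketch, with invented analysis values and an invented interface (not the Lewis program's actual input format): convert a dry, ash-free ultimate analysis into kilogram-atoms per kg of coal, the elemental form an equilibrium solver typically expects.

      # Ultimate analysis (mass %, dry ash-free basis); values are invented.
      ultimate = {"C": 78.0, "H": 5.2, "O": 13.5, "N": 1.7, "S": 1.6}
      atomic_wt = {"C": 12.011, "H": 1.008, "O": 15.999, "N": 14.007, "S": 32.06}

      kg_atoms = {el: pct / 100.0 / atomic_wt[el] for el, pct in ultimate.items()}
      for el, n in kg_atoms.items():
          print(f"{el}: {n:.4f} kg-atom per kg coal")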

  19. Nanometre-sized pores in coal: Variations between coal basins and coal origin

    Science.gov (United States)

    Sakurovs, Richard; Koval, Lukas; Grigore, Mihaela; Sokolava, Anna; Ruppert, Leslie F.; Melnichenko, Yuri B.

    2018-01-01

    We have used small angle neutron scattering (SANS) to investigate the differences in methane and hexane penetration in pores in bituminous coal samples from the U.S., Canada, South Africa, and China, and maceral concentrates from Australian coals. This work is an extension of previous work that showed consistent differences between the extent of penetration by methane into 10–20 nm size pores in inertinite in bituminous coals from Australia, North America and Poland. In this study we have confirmed that there are differences in the response of inertinite to methane and hexane penetration in coals sourced from different coal basins. Inertinite in Permian Australian coals generally has relatively high numbers of pores in the 2.5–250 nm size range and the pores are highly penetrable by methane and hexane; coals sourced from Western Canada had similar penetrability to these Australian coals. However, the penetrability of methane and hexane into inertinite from the Australian Illawarra Coal Measures (also Permian) is substantially less than that of the other Australian coals; there are about 80% fewer 12 nm pores in Illawarra inertinite compared to the other Australian coals examined. The inertinite in coals sourced from South Africa and China had accessibility intermediate between the Illawarra coals and the other Australian coals. The extent of hexane penetration was 10–20% less than CD4 penetration into the same coal and this difference was most pronounced in the 5–50 nm pore size range. Hexane and methane penetrability into the coals showed similar trends with inertinite content. The observed variations in inertinite porosity between coals from different coal regions and coal basins may explain why previous studies differ in their observations of the relationships between gas sorption behavior, permeability, porosity, and maceral composition. These variations are not simply a demarcation between Northern and Southern Hemisphere coals.

  20. Seismic image watermarking using optimized wavelets

    International Nuclear Information System (INIS)

    Mufti, M.

    2010-01-01

    Geotechnical processes and technologies are becoming increasingly sophisticated through the use of computer and information technology. This has made the availability, authenticity and security of geotechnical data even more important. One of the most common methods of storing and sharing seismic data images is through the standardized SEG-Y file format. The geotechnical industry is now primarily data-centric. The analytic and detection capability of a seismic processing tool is heavily dependent on the correctness of the contents of the SEG-Y data file. This paper describes a method, based on an optimized wavelet transform technique, which prevents unauthorized alteration and/or use of seismic data. (author)
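
    For orientation, a minimal sketch of generic wavelet-domain watermark embedding (not the paper's optimized transform), using the PyWavelets package; the image, payload and embedding strength are all invented for illustration.

      import numpy as np
      import pywt  # PyWavelets

      rng = np.random.default_rng(42)
      image = rng.standard_normal((256, 256))   # stand-in for a seismic section
      watermark = rng.integers(0, 2, size=64)   # 64-bit payload (invented)
      alpha = 0.05                              # embedding strength (assumed)

      coeffs = pywt.wavedec2(image, "db4", level=2)
      cH, cV, cD = coeffs[1]                    # coarsest detail sub-bands
      flat = cH.ravel().copy()
      bits = 2 * watermark - 1                  # map {0,1} -> {-1,+1}
      flat[: bits.size] += alpha * bits * np.abs(flat[: bits.size])
      coeffs[1] = (flat.reshape(cH.shape), cV, cD)

      marked = pywt.waverec2(coeffs, "db4")     # watermarked seismic image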

  1. Elk Valley Coal innovation paving the way

    Energy Technology Data Exchange (ETDEWEB)

    Chen, C.; Ednie, H.; Weldon, H.

    2006-09-15

    Elk Valley Coal maintains performance optimization across its six metallurgical coal operations. Performance, personnel issues, and training are discussed. Programmes at Fording River, Greenhills, and Coal Mountain are described. Fording River is implementing new computer systems and high-speed wireless networks. The pit control system and the equipment maintenance and remote maintenance programmes are being improved. The Glider Kit program to rebuild major equipment is described. Safety and productivity measures at Greenhills include testing and evaluation of innovations such as the Drilling and Blasting System (DABS), a payload monitor on a shovel, and two GPS-based systems. Blasting methods, a timing study that examines wall stability, fragmentation simulation, and the Six Mine structure at Coal Mountain are described. 5 photos.

  2. Seismic, magnetic, and geotechnical properties of a landslide and clinker deposits, Powder River basin, Wyoming and Montana

    Science.gov (United States)

    Miller, C.H.

    1979-01-01

    Exploitation of vast coal and other resources in the Powder River Basin has caused recent, rapid increases in population and in commercial and residential development and has prompted land utilization studies. Two aspects of land utilization were studied for this report: (1) the seismic and geotechnical properties of a landslide and (2) the seismic, magnetic, and geotechnical properties of clinker deposits. (1) The landslide seismic survey revealed two layers in the slide area. The upper (low-velocity) layer is a relatively weak mantle of colluvium and unconsolidated and weathered bedrock that ranges in thickness from 3.0 to 7.5 m and has an average seismic velocity of about 390 m/s. It overlies high-velocity, relatively strong sedimentary bedrock that has velocities greater than about 1330 m/s. The low-velocity layer is also present at the other eight seismic refraction sites in the basin; a similar layer has also been reported in the Soviet Union in a landslide area over similar bedrock. The buried contact of the low- and high-velocity layers is relatively smooth and is nearly parallel with the restored topographic surface. There is no indication that any of the high-velocity layer (bedrock) has been displaced or removed. The seismic data also show that the shear modulus of the low-velocity layer is only about one-tenth that of the high-velocity layer and the shear strength (at failure) is only about one-thirtieth. Much of the slide failure is clearly in the shear mode, and failure is, therefore, concluded to be confined to the low-velocity layer. The major immediate factor contributing to landslide failure is apparently the addition of moisture to the low-velocity layer. The study implies that the low-velocity layer can be defined over some of the basin by seismic surveys and that they can help predict or delineate potential slides. Preventative actions that could then be taken include avoidance, dewatering, prevention of saturation, buttressing the toe, and
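
    The quoted one-tenth shear-modulus ratio is roughly what the measured velocities imply, since elastic moduli scale with velocity squared when densities are comparable (an assumption here, as is a similar Vp/Vs ratio in the two layers):

      # Quick consistency check using the velocities from the abstract.
      v_low, v_high = 390.0, 1330.0   # m/s
      print((v_low / v_high) ** 2)    # ~0.086, i.e. about one-tenth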

  3. Seismic Broadband Full Waveform Inversion by shot/receiver refocusing

    NARCIS (Netherlands)

    Haffinger, P.R.

    2013-01-01

    Full waveform inversion is a tool to obtain high-resolution property models of the subsurface from seismic data. However, the technique is computationally expensive and so far no multi-dimensional implementation exists to achieve a resolution that can directly be used for seismic interpretation

  4. Summary of seismic activity and its relation to geology and mining in the Sunnyside mining district, Carbon and Emery Counties, Utah, during 1967-1970

    Science.gov (United States)

    Dunrud, C. Richard; Osterwald, Frank W.; Hernandez, Jerome

    1973-01-01

    In the Sunnyside mining district, Utah, coal is mined under thick and variable overburden which is locally weakened by faults and other structural discontinuities. Stress changes and local stress concentrations produced by mining under these conditions often cause sudden and violent ruptures in the coal and surrounding rock mass. The strain energy released by this type of failure, which can produce shock waves and may discharge coal and rock with explosive force, is often a serious threat to life and property. These releases of strain energy are called bumps or bounces by miners if they occur in the coal, and rock bursts if they occur in the surrounding rock mass. Many of these releases are so violent that they generate seismic waves that can be felt, or at least detected by seismic instruments, miles from the site of the rupture, whereas others are smaller and can be detected only by those sensitive seismic instruments within a few thousand feet of the site of the rupture. In 1969 and 1970, about 27,000 and about 15,000 earth tremors, respectively, were recorded by the five-station seismic monitoring network that is located at the surface and encompasses most of the mine workings in the district. Of these totals, 512 and 524 earth tremors, respectively, were of sufficient magnitude (greater than 1.5 on the Richter scale) so that the hypocenters could be accurately located. In 1968 about 20,000 tremors were recorded, with 281 large enough to plot, but in 1967 over 50,000 were recorded, of which 540 were plotted. In this report we discuss the way in which seismic activity, geology, and mining are related or seem to be related for the period 1967 through 1970, with emphasis on the period 1969-70. We also suggest certain mining procedures which, based on studies during the period, might increase the safety and efficiency of mining operations in the Sunnyside district. A complete tabulation of the larger magnitude earth tremors which occurred during 1969-70 and

  5. Seismic Shot Processing on GPU

    OpenAIRE

    Johansen, Owe

    2009-01-01

    Today's petroleum industry demands an ever increasing amount of computational resources. Seismic processing applications in use by such companies have generally been using large clusters of compute nodes, whose only computing resource has been the CPU. However, using Graphics Processing Units (GPU) for general purpose programming is becoming increasingly popular in the high performance computing area. In 2007, NVIDIA corporation launched their framework for develo...

  6. Computer-Aided Analysis of Flow in Water Pipe Networks after a Seismic Event

    Directory of Open Access Journals (Sweden)

    Won-Hee Kang

    2017-01-01

    Full Text Available This paper proposes a framework for a reliability-based flow analysis for a water pipe network after an earthquake. For the first part of the framework, we propose to use a modeling procedure for multiple leaks and breaks in the water pipe segments of a network that has been damaged by an earthquake. For the second part, we propose an efficient system-level probabilistic flow analysis process that integrates the matrix-based system reliability (MSR formulation and the branch-and-bound method. This process probabilistically predicts flow quantities by considering system-level damage scenarios consisting of combinations of leaks and breaks in network pipes and significantly reduces the computational cost by sequentially prioritizing the system states according to their likelihoods and by using the branch-and-bound method to select their partial sets. The proposed framework is illustrated and demonstrated by examining two example water pipe networks that have been subjected to a seismic event. These two examples consist of 11 and 20 pipe segments, respectively, and are computationally modeled considering their available topological, material, and mechanical properties. Considering different earthquake scenarios and the resulting multiple leaks and breaks in the water pipe segments, the water flows in the segments are estimated in a computationally efficient manner.
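
    The state-prioritization idea can be sketched as a best-first enumeration of damage states in decreasing likelihood, so that flow analyses stop once most of the probability mass has been examined. This is an illustrative reduction, not the paper's full MSR formulation; the per-pipe damage probabilities are invented and assumed below 0.5, which is what makes the pop order globally correct.

      import heapq

      p_damage = [0.10, 0.05, 0.20]   # per-pipe damage probabilities (invented)

      def most_likely_states(p, coverage=0.95):
          """Yield damage states in decreasing likelihood until `coverage` of
          the probability mass is covered (valid for all p < 0.5, since
          damaging one more pipe then always lowers the state probability)."""
          start = tuple([0] * len(p))

          def prob(s):
              q = 1.0
              for flag, pi in zip(s, p):
                  q *= pi if flag else (1.0 - pi)
              return q

          heap, seen, total = [(-prob(start), start)], {start}, 0.0
          while heap and total < coverage:
              negp, s = heapq.heappop(heap)
              total += -negp
              yield s, -negp
              for i in range(len(p)):          # branch: damage one more pipe
                  if not s[i]:
                      t = s[:i] + (1,) + s[i + 1:]
                      if t not in seen:
                          seen.add(t)
                          heapq.heappush(heap, (-prob(t), t))

      for state, pr in most_likely_states(p_damage):
          print(state, f"{pr:.4f}")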

  7. Rapid and Robust Cross-Correlation-Based Seismic Phase Identification Using an Approximate Nearest Neighbor Method

    Science.gov (United States)

    Tibi, R.; Young, C. J.; Gonzales, A.; Ballard, S.; Encarnacao, A. V.

    2016-12-01

    The matched filtering technique involving the cross-correlation of a waveform of interest with archived signals from a template library has proven to be a powerful tool for detecting events in regions with repeating seismicity. However, waveform correlation is computationally expensive, and therefore impractical for large template sets unless dedicated distributed computing hardware and software are used. In this study, we introduce an Approximate Nearest Neighbor (ANN) approach that enables the use of very large template libraries for waveform correlation without requiring a complex distributed computing system. Our method begins with a projection into a reduced dimensionality space based on correlation with a randomized subset of the full template archive. Searching for a specified number of nearest neighbors is accomplished by using randomized K-dimensional trees. We used the approach to search for matches to each of 2700 analyst-reviewed signal detections reported for May 2010 for the IMS station MKAR. The template library in this case consists of a dataset of more than 200,000 analyst-reviewed signal detections for the same station from 2002-2014 (excluding May 2010). Of these signal detections, 60% are teleseismic first P, and 15% regional phases (Pn, Pg, Sn, and Lg). The analyses, performed on a standard desktop computer, show that the proposed approach performs the search of the large template libraries about 20 times faster than the standard full linear search, while achieving recall rates greater than 80%, with the recall rate increasing for higher correlation values. To decide whether to confirm a match, we use a hybrid method involving a cluster approach for queries with two or more matches, and the correlation score for single matches. Of the signal detections that passed our confirmation process, 52% were teleseismic first P, and 30% were regional phases.
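
    The dimensionality-reduction-plus-tree-search idea can be sketched as follows (illustrative, not the authors' pipeline; an exact scipy k-d tree stands in for their randomized K-dimensional trees, and the waveforms are synthetic):

      import numpy as np
      from scipy.spatial import cKDTree

      rng = np.random.default_rng(1)
      n_templates, n_samples, n_ref = 5000, 400, 32
      templates = rng.standard_normal((n_templates, n_samples))
      templates /= np.linalg.norm(templates, axis=1, keepdims=True)

      # Reduced-dimensionality features: correlation with a random subset.
      ref = templates[rng.choice(n_templates, n_ref, replace=False)]
      features = templates @ ref.T
      tree = cKDTree(features)

      # A noisy repeat of template 123 stands in for a new detection.
      query = templates[123] + 0.1 * rng.standard_normal(n_samples)
      query /= np.linalg.norm(query)
      _, idx = tree.query(query @ ref.T, k=5)
      print(idx)  # candidates to confirm by full cross-correlation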

  8. SEISMIC ANALYSIS FOR PRECLOSURE SAFETY

    Energy Technology Data Exchange (ETDEWEB)

    E.N. Lindner

    2004-12-03

    evaluated and identified. This document supersedes the seismic classifications, assignments, and computations in "Seismic Analysis for Preclosure Safety" (BSC 2004a).

  9. SEISMIC ANALYSIS FOR PRECLOSURE SAFETY

    International Nuclear Information System (INIS)

    E.N. Lindner

    2004-01-01

    evaluated and identified. This document supersedes the seismic classifications, assignments, and computations in "Seismic Analysis for Preclosure Safety" (BSC 2004a).

  10. User's manual of SECOM2: a computer code for seismic system reliability analysis

    International Nuclear Information System (INIS)

    Uchiyama, Tomoaki; Oikawa, Tetsukuni; Kondo, Masaaki; Tamura, Kazuo

    2002-03-01

    This report is the user's manual of the seismic system reliability analysis code SECOM2 (Seismic Core Melt Frequency Evaluation Code Ver.2), developed at the Japan Atomic Energy Research Institute for systems reliability analysis, which is one of the tasks of seismic probabilistic safety assessment (PSA) of nuclear power plants (NPPs). The SECOM2 code has many functions, such as: calculation of component failure probabilities based on the response factor method; extraction of minimal cut sets (MCSs); calculation of conditional system failure probabilities for given seismic motion levels at the site of an NPP; calculation of accident sequence frequencies and the core damage frequency (CDF) using the seismic hazard curve; importance analysis using various indicators; uncertainty analysis; calculation of the CDF taking into account the effect of the correlations of responses and capacities of components; and efficient sensitivity analysis by changing parameters on responses and capacities of components. These analyses require as inputs the fault tree (FT) representing the occurrence conditions of the system failures and core damage, information about the response and capacity of components, and the seismic hazard curve for the NPP site. This report presents the models and methods applied in the SECOM2 code and how to use those functions. (author)
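
    The fragility evaluation at the heart of such component failure calculations is commonly a lognormal capacity model. A generic sketch of that ingredient (not SECOM2's actual implementation; the capacity parameters are invented):

      from math import log
      from statistics import NormalDist

      def fragility(a, a_median, beta):
          """P(component failure | ground acceleration a), lognormal capacity."""
          return NormalDist().cdf(log(a / a_median) / beta)

      # A component with assumed median capacity 0.9 g and log-std 0.4:
      for a in (0.2, 0.5, 0.9, 1.5):
          print(f"{a:.1f} g -> {fragility(a, 0.9, 0.4):.3f}")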

  11. Seismic attributes and advanced computer algorithm to predict formation pore pressure: Qalibah formation of Northwest Saudi Arabia

    Science.gov (United States)

    Nour, Abdoulshakour M.

    Oil and gas exploration professionals have long recognized the importance of predicting pore pressure before drilling wells. Pre-drill pore pressure estimation not only helps with drilling wells safely but also aids in the determination of formation fluids migration and seal integrity. With respect to hydrocarbon reservoirs, the appropriate drilling mud weight is directly related to the estimated pore pressure in the formation. If the mud weight is lower than the formation pressure, a blowout may occur; conversely, if it is higher than the formation pressure, the formation may suffer irreparable damage due to the invasion of drilling fluids into the formation. A simple definition of pore pressure is the pressure of the pore fluids in excess of the hydrostatic pressure. In this thesis, I investigated the utility of an advanced computer algorithm called the Support Vector Machine (SVM) to learn the pattern of high pore-pressure regimes, using seismic attributes such as Instantaneous phase, t*Attenuation, Cosine of Phase, Vp/Vs ratio, P-Impedance, Reflection Acoustic Impedance, Dominant frequency and one well attribute (Mud-Weight) as the learning dataset. I applied this technique to the overpressured Qalibah formation of Northwest Saudi Arabia. The results of my research revealed that in the Qalibah formation of Northwest Saudi Arabia, the pore pressure trend can be predicted using SVM with seismic and well attributes as the learning dataset. I was able to show the pore pressure trend at any given point within the geographical extent of the 3D seismic data from which the seismic attributes were derived. In addition, my results surprisingly showed the subtle variation of pressure within the thick succession of shale units of the Qalibah formation.
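
    A sketch of the learning setup using a scikit-learn support vector machine; the column list follows the attributes named in the thesis, but the data and target below are synthetic inventions.

      import numpy as np
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVR

      rng = np.random.default_rng(7)
      n = 500
      # Columns stand in for: instantaneous phase, t*attenuation, cos(phase),
      # Vp/Vs ratio, P-impedance, reflection acoustic impedance, dominant
      # frequency, mud weight. All values are synthetic.
      X = rng.standard_normal((n, 8))
      y = 0.8 * X[:, 3] - 0.5 * X[:, 4] + 0.1 * rng.standard_normal(n)  # toy target

      model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
      model.fit(X[:400], y[:400])
      print("held-out R^2:", model.score(X[400:], y[400:]))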

  12. Seismic behaviour of geotechnical structures

    Directory of Open Access Journals (Sweden)

    F. Vinale

    2002-06-01

    Full Text Available This paper deals with some fundamental considerations regarding the behaviour of geotechnical structures under seismic loading. First a complete definition of the earthquake disaster risk is provided, followed by the importance of performing site-specific hazard analysis. Then some suggestions are provided in regard to adequate assessment of soil parameters, a crucial point to properly analyze the seismic behaviour of geotechnical structures. The core of the paper is centered on a critical review of the analysis methods available for studying geotechnical structures under seismic loadings. All of the available methods can be classified into three main classes, including the pseudo-static, pseudo-dynamic and dynamic approaches, each of which is reviewed for applicability. A more advanced analysis procedure, suitable for a so-called performance-based design approach, is also described in the paper. Finally, the seismic behaviour of the El Infiernillo Dam was investigated. It was shown that coupled elastoplastic dynamic analyses disclose some of the important features of dam behaviour under seismic loading, confirmed by comparing analytical computation and experimental measurements on the dam body during and after a past earthquake.

  13. Probabilistic seismic hazard assessment. Gentilly 2

    International Nuclear Information System (INIS)

    1996-03-01

    Results of this probabilistic seismic hazard assessment were determined using a suite of conservative assumptions. The intent of this study was to perform a limited hazard assessment that incorporated a range of technically defensible input parameters. To best achieve this goal, input selected for the hazard assessment tended to be conservative with respect to the selection of attenuation models and seismicity parameters. Seismic hazard estimates at Gentilly 2 were most affected by selection of the attenuation model. Alternative definitions of seismic source zones had a relatively small impact on seismic hazard. A St. Lawrence Rift model including a maximum magnitude of 7.2 mb in the zone containing the site had little effect on the hazard estimate relative to other seismic source zonation models. Mean annual probabilities of exceeding the design peak ground acceleration and the design response spectrum for the Gentilly 2 site were computed to lie in the range of 0.001 to 0.0001. This hazard result falls well within the range determined to be acceptable for nuclear reactor sites located throughout the eastern United States. (author) 34 refs., 6 tabs., 28 figs

  14. Microseismic Precursory Characteristics of Rock Burst Hazard in Mining Areas Near a Large Residual Coal Pillar: A Case Study from Xuzhuang Coal Mine, Xuzhou, China

    Science.gov (United States)

    Cao, An-ye; Dou, Lin-ming; Wang, Chang-bin; Yao, Xiao-xiao; Dong, Jing-yuan; Gu, Yu

    2016-11-01

    Identification of precursory characteristics is a key issue for rock burst prevention. The aim of this research is to provide a reference for assessing rock burst risk and determining potential rock burst risk areas in coal mining. In this work, the microseismic multidimensional information for the identification of rock bursts and spatial-temporal pre-warning was investigated in a specific coalface which suffered high rock burst risk in a mining area near a large residual coal pillar. Firstly, microseismicity evolution prior to a disastrous rock burst was qualitatively analysed, and the abnormal clustering of seismic sources, abnormal variations in daily total energy release, and event counts can be regarded as precursors to rock burst. Secondly, passive tomographic imaging has been used to locate high seismic activity zones and assess rock burst hazard when the coalface passes through residual pillar areas. The results show that high-velocity or velocity anomaly regions correlated well with strong seismic activities in future mining periods and that passive tomography has the potential to describe, both quantitatively and periodically, hazardous regions and assess rock burst risk. Finally, the bursting strain energy index was further used for short-term spatial-temporal pre-warning of rock bursts. The temporal sequence curves and spatial contour nephograms indicate the status of the danger and the specific hazardous zones, so that levels of rock burst risk can be quantitatively and rapidly analysed in both time and space. The multidimensional precursory characteristic identification of rock bursts, including qualitative analysis and intermediate- and short-term quantitative predictions, can guide the choice of measures implemented to control rock bursts in the field, and provides a new approach to monitoring and forecasting rock bursts in space and time.

  15. High temporal resolution mapping of seismic noise sources using heterogeneous supercomputers

    Science.gov (United States)

    Gokhberg, Alexey; Ermert, Laura; Paitz, Patrick; Fichtner, Andreas

    2017-04-01

    Time- and space-dependent distribution of seismic noise sources is becoming a key ingredient of modern real-time monitoring of various geo-systems. Significant interest in seismic noise source maps with high temporal resolution (days) is expected to come from a number of domains, including natural resources exploration, analysis of active earthquake fault zones and volcanoes, as well as geothermal and hydrocarbon reservoir monitoring. Currently, knowledge of noise sources is insufficient for high-resolution subsurface monitoring applications. Near-real-time seismic data, as well as advanced imaging methods to constrain seismic noise sources have recently become available. These methods are based on the massive cross-correlation of seismic noise records from all available seismic stations in the region of interest and are therefore very computationally intensive. Heterogeneous massively parallel supercomputing systems introduced in recent years combine conventional multi-core CPUs with GPU accelerators and provide an opportunity for a manifold increase in computing performance. Therefore, these systems represent an efficient platform for implementation of a noise source mapping solution. We present the first results of an ongoing research project conducted in collaboration with the Swiss National Supercomputing Centre (CSCS). The project aims at building a service that provides seismic noise source maps for Central Europe with high temporal resolution (days to few weeks depending on frequency and data availability). The service is hosted on the CSCS computing infrastructure; all computationally intensive processing is performed on the massively parallel heterogeneous supercomputer "Piz Daint". The solution architecture is based on the Application-as-a-Service concept in order to provide interested external researchers with regular access to the noise source maps. The solution architecture includes the following sub-systems: (1) data acquisition responsible for

  16. Weight factors in diffraction transformation of seismic recordings

    Energy Technology Data Exchange (ETDEWEB)

    Bulatov, M.G.; Lyevyy, N.V.; Telegin, A.N.

    1980-01-01

    In the diffraction transformation of seismic recordings made using amplitude regulators, the dynamics of the resulting depth section are distorted. To eliminate these distortions, it is suggested that weight factors be used. A formula is given for computing the factors, and their effectiveness is confirmed using test and production seismic materials.

  17. Post-seismic velocity changes following the 2010 Mw 7.1 Darfield earthquake, New Zealand, revealed by ambient seismic field analysis

    Science.gov (United States)

    Heckels, R. EG; Savage, M. K.; Townend, J.

    2018-05-01

    Quantifying seismic velocity changes following large earthquakes can provide insights into fault healing and reloading processes. This study presents temporal velocity changes detected following the 2010 September Mw 7.1 Darfield event in Canterbury, New Zealand. We use continuous waveform data from several temporary seismic networks lying on and surrounding the Greendale Fault, with a maximum interstation distance of 156 km. Nine-component, day-long Green's functions were computed for frequencies between 0.1 and 1.0 Hz for continuous seismic records from immediately after the 2010 September 04 earthquake until 2011 January 10. Using the moving-window cross-spectral method, seismic velocity changes were calculated. Over the study period, an increase in seismic velocity of 0.14 ± 0.04 per cent was determined near the Greendale Fault, providing a new constraint on post-seismic relaxation rates in the region. A depth analysis further showed that velocity changes were confined to the uppermost 5 km of the subsurface. We attribute the observed changes to post-seismic relaxation via crack healing of the Greendale Fault and throughout the surrounding region.
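
    For illustration, a relative velocity change can be recovered from two correlation functions with the stretching method, a common alternative to the moving-window cross-spectral technique used in the paper, here on synthetic data:

      import numpy as np

      t = np.linspace(0.0, 20.0, 2001)                       # lag time (s)
      ref = np.exp(-0.1 * t) * np.sin(2 * np.pi * 0.5 * t)   # reference correlation
      true_eps = 0.0014                                      # imposed dv/v = 0.14 %
      cur = np.interp(t * (1 + true_eps), t, ref)            # "current" correlation

      def stretch_dvv(ref, cur, t, eps_grid):
          """Grid-search the stretch factor (= dv/v) maximizing correlation."""
          return max(eps_grid, key=lambda e: np.corrcoef(
              np.interp(t * (1 + e), t, ref), cur)[0, 1])

      eps_grid = np.linspace(-0.005, 0.005, 501)
      print(f"recovered dv/v = {stretch_dvv(ref, cur, t, eps_grid) * 100:.2f} %")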

  18. A Desktop Publishing Course: An Alternative to Internships for Rural Universities.

    Science.gov (United States)

    Flammia, Madelyn

    1992-01-01

    Suggests that a course in desktop publishing can provide students at rural schools with experience equivalent to internships. Notes that the course provided students with real-world experience and benefited the university in terms of services and public relations. (RS)

  19. Fiscal 1998 geological survey overseas. Report on Tanjung Enim project for Japan-Indonesia joint coal exploration; 1998 nendo kaigai chishitsu kozo nado chosa hokokusho. Nippon Indonesia sekitan kyodo tansa Tanjung Enim project

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    For disclosing stratigraphy and lithofacies and for clarifying the status of coal beds in the southern Arahan area (55km{sup 2}), South Sumatra, ground surface exploration, borehole drilling, borehole geophysical logging, and seismic reflection monitoring were carried out. The survey covered a rectangular area, 11km from east to west and 5km from north to south and 60m to 130m in elevation. In the ground surface exploration effort, trenching was performed at three spots in coal bed outcrops in a quasi-accurate survey with route gaps of roughly 1,200m. Ten boreholes were drilled. The deepest one was 282.20m deep and the shallowest one 117.80m deep, with the total drilled length measuring 1,643.70m. Seismic reflection monitoring was implemented using three traverse lines (two running north to south and one running east to west), with the three lines measuring 10.92km in total. VSP (vertical seismic profiling) was carried out for borehole ASN17 only. The findings are that there are four kinds of coal beds, namely A2, B, C, and E, spreading all over the area, that their thicknesses are estimated to be 12m, 18m, 7m, and 6-8m, respectively, and that coal beds A2, B, and C lie between depths of 80m and 150m. (NEDO)

  20. When probabilistic seismic hazard climbs volcanoes: the Mt. Etna case, Italy - Part 2: Computational implementation and first results

    Science.gov (United States)

    Peruzza, Laura; Azzaro, Raffaele; Gee, Robin; D'Amico, Salvatore; Langer, Horst; Lombardo, Giuseppe; Pace, Bruno; Pagani, Marco; Panzera, Francesco; Ordaz, Mario; Suarez, Miguel Leonardo; Tusa, Giuseppina

    2017-11-01

    This paper describes the model implementation and presents results of a probabilistic seismic hazard assessment (PSHA) for the Mt. Etna volcanic region in Sicily, Italy, considering local volcano-tectonic earthquakes. Working in a volcanic region presents new challenges not typically faced in standard PSHA, which are broadly due to the nature of the local volcano-tectonic earthquakes, the cone shape of the volcano and the attenuation properties of seismic waves in the volcanic region. These have been accounted for through the development of a seismic source model that integrates data from different disciplines (historical and instrumental earthquake datasets, tectonic data, etc.; presented in Part 1, by Azzaro et al., 2017) and through the development and software implementation of original tools for the computation, such as a new ground-motion prediction equation and magnitude-scaling relationship specifically derived for this volcanic area, and the capability to account for the surficial topography in the hazard calculation, which influences source-to-site distances. Hazard calculations have been carried out after updating the most recent releases of two widely used PSHA software packages (CRISIS, as in Ordaz et al., 2013; the OpenQuake engine, as in Pagani et al., 2014). Results are computed for short- to mid-term exposure times (10 % probability of exceedance in 5 and 30 years, Poisson and time dependent) and spectral amplitudes of engineering interest. A preliminary exploration of the impact of site-specific response is also presented for Etna's densely inhabited eastern flank, and the change in expected ground motion is finally commented on. These results do not account for M > 6 regional seismogenic sources, which control the hazard at long return periods. However, by focusing on the impact of the local M < 6 volcano-tectonic sources, they remain directly relevant to short-term risk reduction.

  1. The X-Files: Investigating Alien Performance in a Thin-client World

    OpenAIRE

    Gunther, Neil J.

    2000-01-01

    Many scientific applications use the X11 window environment; an open source windows GUI standard employing a client/server architecture. X11 promotes: distributed computing, thin-client functionality, cheap desktop displays, compatibility with heterogeneous servers, remote services and administration, and greater maturity than newer web technologies. This paper details the author's investigations into close encounters with alien performance in X11-based seismic applications running on a 200-n...

  2. Predicted mineral melt formation by BCURA Coal Sample Bank coals: Variation with atmosphere and comparison with reported ash fusion test data

    Energy Technology Data Exchange (ETDEWEB)

    D. Thompson [University of Sheffield (United Kingdom). Department of Engineering Materials

    2010-08-15

    The thermodynamic equilibrium phases formed under ash fusion test and excess air combustion conditions by 30 coals of the BCURA Coal Sample Bank have been predicted from 1100 to 2000 K using the MTDATA computational suite and the MTOX database for silicate melts and associated phases. Predicted speciation and degree of melting varied widely from coal to coal. Melting under an ash fusion test atmosphere of CO{sub 2}:H{sub 2} 1:1 was essentially the same as under excess air combustion conditions for some coals, and markedly different for others. For those ashes which flowed below the fusion test maximum temperature of 1773 K, flow coincided with 75-100% melting in most cases. Flow at low predicted melt formation (46%) for one coal cannot be attributed to any one cause. The difference between predicted fusion behaviours under excess air and fusion test atmospheres becomes greater with decreasing silica and alumina, and increasing iron, calcium and alkali metal content in the coal mineral. 22 refs., 7 figs., 3 tabs.

  3. DISTRIBUTED COMPUTING SUPPORT CONTRACT USER SURVEY

    CERN Multimedia

    2001-01-01

    IT Division operates a Distributed Computing Support Service, which offers support to owners and users of all variety of desktops throughout CERN as well as more dedicated services for certain groups, divisions and experiments. It also provides the staff who operate the central and satellite Computing Helpdesks, it supports printers throughout the site and it provides the installation activities of the IT Division PC Service. We have published a questionnaire which seeks to gather your feedback on how the services are seen, how they are progressing and how they can be improved. Please take a few minutes to fill in this questionnaire. Replies will be treated in confidence if desired although you may also request an opportunity to be contacted by CERN's service management directly. Please tell us if you met problems but also if you had a successful conclusion to your request for assistance. You will find the questionnaire at the web site http://wwwinfo/support/survey/desktop-contract There will also be a link ...

  4. DISTRIBUTED COMPUTING SUPPORT SERVICE USER SURVEY

    CERN Multimedia

    2001-01-01

    IT Division operates a Distributed Computing Support Service, which offers support to owners and users of all variety of desktops throughout CERN as well as more dedicated services for certain groups, divisions and experiments. It also provides the staff who operate the central and satellite Computing Helpdesks, it supports printers throughout the site and it provides the installation activities of the IT Division PC Service. We have published a questionnaire, which seeks to gather your feedback on how the services are seen, how they are progressing and how they can be improved. Please take a few minutes to fill in this questionnaire. Replies will be treated in confidence if desired although you may also request an opportunity to be contacted by CERN's service management directly. Please tell us if you met problems but also if you had a successful conclusion to your request for assistance. You will find the questionnaire at the web site http://wwwinfo/support/survey/desktop-contract There will also be a link...

  5. Seismic waveform modeling over cloud

    Science.gov (United States)

    Luo, Cong; Friederich, Wolfgang

    2016-04-01

    With fast-growing computational technologies, numerical simulation of seismic wave propagation has achieved huge success. Obtaining synthetic waveforms through numerical simulation receives an increasing amount of attention from seismologists. However, computational seismology is a data-intensive research field, and the numerical packages usually come with a steep learning curve. Users are expected to master a considerable amount of computer knowledge and data processing skills. Training users to use the numerical packages and to correctly access and utilize the computational resources is a troublesome task. In addition, access to HPC is a common difficulty for many users. To solve these problems, a cloud-based solution dedicated to shallow seismic waveform modeling has been developed with state-of-the-art web technologies. It is a web platform integrating both software and hardware in a multilayer architecture: a well-designed SQL database serves as the data layer, while the HPC resources and a dedicated pipeline form the business layer. Through this platform, users no longer need to compile and manipulate various packages on a local machine within a local network to perform a simulation. By providing professional access to the computational code through its interfaces and delivering our computational resources to users over the cloud, the platform lets users customize simulations at expert level, and submit and run jobs through it.

  6. Providing Assistive Technology Applications as a Service Through Cloud Computing.

    Science.gov (United States)

    Mulfari, Davide; Celesti, Antonio; Villari, Massimo; Puliafito, Antonio

    2015-01-01

    Users with disabilities interact with Personal Computers (PCs) using Assistive Technology (AT) software solutions. Such applications run on a PC that a person with a disability commonly uses. However, the configuration of AT applications is not trivial at all, especially whenever the user needs to work on a PC that does not allow him/her to rely on his/her AT tools (e.g., at work, at university, in an Internet point). In this paper, we discuss how cloud computing provides a valid technological solution to enhance such a scenario. With the emergence of cloud computing, many applications are executed on top of virtual machines (VMs). Virtualization allows us to achieve a software implementation of a real computer able to execute a standard operating system and any kind of application. In this paper we propose to build personalized VMs running AT programs and settings. By using remote desktop technology, our solution enables users to control their customized virtual desktop environment by means of an HTML5-based web interface running on any computer equipped with a browser, wherever they are.

  7. Seismic hazard assessment of the Hanford region, Eastern Washington State

    International Nuclear Information System (INIS)

    Youngs, R.R.; Coppersmith, K.J.; Power, M.S.; Swan, F.H. III

    1985-01-01

    A probabilistic seismic hazard assessment was made for a site within the Hanford region of eastern Washington state, which is characterized as an intraplate region having a relatively low rate of seismic activity. Probabilistic procedures, such as logic trees, were utilized to account for the uncertainties in identifying and characterizing the potential seismic sources in the region. Logic trees provide a convenient, flexible means of assessing the values and relative likelihoods of input parameters to the hazard model that may be dependent upon each other. Uncertainties accounted for in this way include the tectonic model, segmentation, capability, fault geometry, maximum earthquake magnitude, and earthquake recurrence rate. The computed hazard results are expressed as a distribution from which confidence levels are assessed. Analysis of the results show the contributions to the total hazard from various seismic sources and due to various earthquake magnitudes. In addition, the contributions of uncertainties in the various source parameters to the uncertainty in the computed hazard are assessed. For this study, the major contribution to uncertainty in the computed hazard are due to uncertainties in the applicable tectonic model and the earthquake recurrence rate. This analysis serves to illustrate some of the probabilistic tools that are available for conducting seismic hazard assessments and for analyzing the results of these studies. 5 references, 7 figures
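
    The logic-tree mechanics can be sketched as a weighted enumeration of branch combinations, each yielding a hazard estimate; together these form the distribution from which confidence levels are read. The branches, weights and scale factors below are toy values, not the study's:

      from itertools import product

      # (name, weight, multiplicative effect on a reference hazard estimate)
      attenuation = [("model_A", 0.6, 1.0), ("model_B", 0.4, 1.4)]
      recurrence = [("low", 0.3, 0.8), ("best", 0.5, 1.0), ("high", 0.2, 1.3)]
      base_hazard = 1.0e-4   # annual exceedance probability, reference branch

      for (an, aw, af), (rn, rw, rf) in product(attenuation, recurrence):
          w, h = aw * rw, base_hazard * af * rf
          print(f"{an}/{rn}: weight = {w:.2f}, hazard = {h:.2e}")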

  8. European database of nuclear-seismic profiles. Final report

    International Nuclear Information System (INIS)

    Wenzel, F.; Fuchs, K.; Tittgemeyer, M.

    1997-01-01

    The project serves the purpose of conserving the nuclear-seismic data collections of the former USSR by suitable processing of data as well as digitization, and incorporation into the databases stored at Potsdam (Germany) and Moscow (Russia). In a joint activity assisted by the Russian State Committee for Geology, a complete set of nuclear-seismic data has been integrated into the data collections of an international computer center which is one of the centers participating in the EUROPROBE project. Furthermore, computer stations are being established across Russia so that Russian geoscientists will have at their disposal the required data processing facilities. (DG) [de]

  9. Coal geopolitics

    International Nuclear Information System (INIS)

    Giraud, P.N.; Suissa, A.; Coiffard, J.; Cretin, D.

    1991-01-01

    This book, divided into seven chapters, describes the coal economic cycle. Chapter one: coal definitions; the principal characteristics and properties (origin, calorific power, international classification...). Chapter two: the international coal cycle: coal mining, exploration, coal reserves estimation, coal handling, the coal industry and environmental impacts. Chapter three: the world coal reserves. Chapter four: consumption, production and trade. Chapter five: the international coal market (exporting mining companies; importing companies; distributors and spot market operators). Chapter six: the international coal trade. Chapter seven: coal price formation. 234 refs.; 94 figs. and tabs [fr]

  10. Asymptotic Co- and Post-seismic displacements in a homogeneous Maxwell sphere

    Science.gov (United States)

    Tang, He; Sun, Wenke

    2018-05-01

    The deformations of the Earth caused by internal and external forces are usually expressed through Green's functions or the superposition of normal modes, i.e. via numerical methods, which are applicable for computing both co- and post-seismic deformations. It is difficult to express these deformations in an analytical form, even for a uniform viscoelastic sphere. In this study, we present a set of asymptotic solutions for computing co- and post-seismic displacements; these solutions can be further applied to solving co- and post-seismic geoid, gravity, and strain changes. Expressions are derived for a uniform Maxwell Earth by combining the reciprocity theorem, which links earthquake, tidal, shear and loading deformations, with the asymptotic solutions of these three external forces (tidal, shear and loading) and analytical inverse Laplace transformation formulae. Since the asymptotic solutions are given in a purely analytical form without series summations or extra convergence skills, they can be practically applied in an efficient way, especially when computing post-seismic deformations and glacial isostatic adjustments of the Earth over long timescales.

  11. Design and construction of a desktop AC susceptometer using an Arduino and a Bluetooth for serial interface

    Science.gov (United States)

    Pérez, Israel; Ángel Hernández Cuevas, José; Trinidad Elizalde Galindo, José

    2018-05-01

    We designed and developed a desktop AC susceptometer for the characterization of materials. The system consists of a lock-in amplifier, an AC function generator, a couple of coils, a sample holder, a computer system running custom software written in freeware C++ code, and an Arduino card coupled to a Bluetooth module. The Arduino/Bluetooth serial interface allows the user to connect to almost any computer and thus avoids the problem of connectivity between the computer and the peripherals, such as the lock-in amplifier and the function generator. The Bluetooth transmitter/receiver used is a commercial device which is robust and fast. These new features reduce the size and increase the versatility of the susceptometer, for it can be used with a simple laptop. To test our instrument, we performed measurements on magnetic materials and show that the system is reliable at both room temperature and cryogenic temperatures (77 K). The instrument is suitable for any physics or engineering laboratory, either for research or academic purposes.
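
    On the computer side, reading an instrument stream through a Bluetooth serial bridge of this kind reduces to a few lines with pyserial; the port name, baud rate and line format below are assumptions, not the authors' protocol:

      import serial  # pyserial

      # Assumed: the bridge appears as /dev/rfcomm0 and streams text lines.
      with serial.Serial("/dev/rfcomm0", 9600, timeout=1) as port:
          for _ in range(10):
              line = port.readline().decode(errors="ignore").strip()
              if line:
                  print("susceptibility sample:", line)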

  12. Computational fluid dynamics simulation for chemical looping combustion of coal in a dual circulation fluidized bed

    International Nuclear Information System (INIS)

    Su, Mingze; Zhao, Haibo; Ma, Jinchen

    2015-01-01

    Highlights: • CFD simulation of a 5 kWth CLC reactor of coal was conducted. • Gas leakage, flow pattern and combustion efficiency of the reactor were analyzed. • Optimal conditions were achieved based on an understanding of the operating characteristics. - Abstract: A dual circulation fluidized bed system is widely accepted for chemical looping combustion (CLC) for enriching CO2 from the utilization of fossil fuels. Due to the limitations of measurement, the details of multiphase reactive flows in the interconnected fluidized bed reactors are difficult to obtain. Computational Fluid Dynamics (CFD) simulation provides a promising method to understand the hydrodynamics, chemical reactions, and heat and mass transfer in CLC reactors, which are very important for the rational design, optimal operation, and scaling-up of the CLC system. In this work, a 5 kWth coal-fired CLC dual circulation fluidized bed system, which was developed by our research group, was first simulated for understanding gas leakage, flow pattern and combustion efficiency. The simulation results achieved good agreement with the experimental measurements, which validates the simulation model. Subsequently, to improve the combustion efficiency, a new operation condition was simulated by increasing the reactor temperature and decreasing the coal feeding. An improvement in the combustion efficiency was attained, and the simulation results for the new operation condition were also validated by the experimental measurements in the same CLC combustor. All of the above processes demonstrated the validity and usefulness of the simulation results to improve the CLC reactor operation.

  13. Low Cost Desktop Image Analysis Workstation With Enhanced Interactive User Interface

    Science.gov (United States)

    Ratib, Osman M.; Huang, H. K.

    1989-05-01

    A multimodality picture archiving and communication system (PACS) is in routine clinical use in the UCLA Radiology Department. Several types of workstations are currently implemented for this PACS. Among them, the Apple Macintosh II personal computer was recently chosen to serve as a desktop workstation for display and analysis of radiological images. This personal computer was selected mainly because of its extremely friendly user interface, its popularity among the academic and medical community, and its low cost. In comparison to other microcomputer-based systems the Macintosh II offers the following advantages: the extreme standardization of its user interface, file system and networking, and the availability of a very large variety of commercial software packages. In the current configuration the Macintosh II operates as a stand-alone workstation where images are imported from a centralized PACS server through an Ethernet network using a standard TCP-IP protocol, and stored locally on magnetic disk. The use of high resolution screens (1024x768 pixels x 8 bits) offers sufficient performance for image display and analysis. We focused our project on the design and implementation of a variety of image analysis algorithms ranging from automated structure and edge detection to sophisticated dynamic analysis of sequential images. Specific analysis programs were developed for ultrasound images, digitized angiograms, MRI and CT tomographic images and scintigraphic images.

  14. Explosion Monitoring with Machine Learning: A LSTM Approach to Seismic Event Discrimination

    Science.gov (United States)

    Magana-Zook, S. A.; Ruppert, S. D.

    2017-12-01

    The streams of seismic data that analysts examine to discriminate natural from man-made events will soon grow from gigabytes of data per day to exponentially larger rates. This is an interesting problem, as the requirement for real-time answers to questions of non-proliferation will remain the same, and the analyst pool cannot grow as fast as the data volume and velocity will. Machine learning is a tool that can solve the problem of seismic explosion monitoring at scale. Using machine learning, and Long Short-Term Memory (LSTM) models in particular, analysts can become more efficient by focusing their attention on signals of interest. From a global dataset of earthquake and explosion events, a model was trained to recognize the different classes of events, given their spectrograms. Optimal recurrent node count and training iterations were found, and cross validation was performed to evaluate model performance. A 10-fold mean accuracy of 96.92% was achieved on a balanced dataset of 30,002 instances. Given that the model is only 446.52 MB, it can be used to simultaneously characterize all incoming signals, both by researchers looking at events in isolation on desktop machines and at scale on all of the nodes of a real-time streaming platform. LLNL-ABS-735911
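
    The abstract names the ingredients (spectrogram input, LSTM recurrence, earthquake/explosion output) but no architecture details; a minimal PyTorch sketch under those assumptions (layer sizes, sequence length and class count are illustrative, not from the paper):

    ```python
    import torch
    import torch.nn as nn

    class EventLSTM(nn.Module):
        """Classify spectrograms (time frames x frequency bins) as
        earthquake vs. explosion. All sizes are illustrative assumptions."""
        def __init__(self, n_freq_bins=128, hidden=64, n_classes=2):
            super().__init__()
            self.lstm = nn.LSTM(input_size=n_freq_bins, hidden_size=hidden,
                                batch_first=True)
            self.head = nn.Linear(hidden, n_classes)

        def forward(self, spec):                # spec: (batch, time, freq)
            _, (h_last, _) = self.lstm(spec)    # h_last: (1, batch, hidden)
            return self.head(h_last[-1])        # logits: (batch, n_classes)

    model = EventLSTM()
    logits = model(torch.randn(4, 200, 128))    # 4 dummy 200-frame spectrograms
    print(logits.argmax(dim=1))                 # predicted class per event
    ```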

  15. Seismic test for safety evaluation of low level radioactive wastes containers

    International Nuclear Information System (INIS)

    Ohoka, Makoto; Horikiri, Morito

    1998-08-01

    Seismic safety of the three-piled container system used in the Tokai reprocessing center was confirmed by seismic testing and computational analysis. Two types of container were evaluated: one for low-level non-flammable radioactive solid wastes, and one for used filters wrapped in large plastic bags. Seismic integrity of the three-piled containers was confirmed by evaluating response characteristics such as acceleration and displacement under the design earthquake condition S1, the maximum earthquake expected at the storage site during the storage period. Computational dynamic analysis was also performed, and the conclusions described below were drawn. (1) The response characteristics of the bottom board and the side board were different. The number of piled containers did not affect the response characteristics of the bottom board of each container; the bottom boards behaved as rigid bodies. (2) The response of the side board was larger than that of the bottom board. (3) The response of each board, side or bottom, depended on the direction of excitation. The response acceleration was larger for seismic waves perpendicular to the plane containing the forklift entrance and the radioactive warning mark. (4) The maximum horizontal response displacement under the S1 seismic wave was approximately 10 mm, small enough not to affect seismic safety. (5) The stoppers fitted to prevent fall-down had no influence on the response acceleration. (6) No fall-down occurred under the S1 seismic wave or under twice the S1 seismic wave, the maximum input condition of the test. (7) The response of the bottom board of the containers, the main element governing fall-down, showed good agreement between the test and the computational analysis. (author)

  16. Practical advantages of evolutionary computation

    Science.gov (United States)

    Fogel, David B.

    1997-10-01

    Evolutionary computation is becoming a common technique for solving difficult, real-world problems in industry, medicine, and defense. This paper reviews some of the practical advantages of using evolutionary algorithms as compared with classic methods of optimization or artificial intelligence. Specific advantages include the flexibility of the procedures, as well as their ability to self-adapt the search for optimum solutions on the fly. As desktop computers increase in speed, the application of evolutionary algorithms will become routine.
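
    The self-adaptation the paper highlights can be made concrete with a minimal (1+1) evolution strategy whose mutation step size adapts as the search proceeds; the objective, constants and adaptation rule below are illustrative assumptions, not taken from the paper:

    ```python
    import random

    def evolve(f, x0, sigma=1.0, iters=2000):
        """(1+1) evolution strategy minimizing f, with a self-adaptive
        mutation step size (illustrative 1/5-success-rule flavour)."""
        x, fx = x0[:], f(x0)
        for _ in range(iters):
            child = [xi + random.gauss(0.0, sigma) for xi in x]
            fc = f(child)
            if fc < fx:                  # success: accept and widen the search
                x, fx, sigma = child, fc, sigma * 1.1
            else:                        # failure: narrow the search
                sigma *= 0.98
        return x, fx

    best, val = evolve(lambda v: sum(xi * xi for xi in v), [5.0, -3.0])
    print(best, val)                     # converges toward the origin
    ```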

  17. Seismic Response Prediction of Buildings with Base Isolation Using Advanced Soft Computing Approaches

    Directory of Open Access Journals (Sweden)

    Mosbeh R. Kaloop

    2017-01-01

    Modeling the response of structures under seismic loads is an important factor in civil engineering, as it crucially affects the design and management of structures, especially in high-risk areas. In this study, novel applications of advanced soft computing techniques are utilized for predicting the behavior of concentrically braced frame (CBF) buildings with a lead-rubber bearing (LRB) isolation system under ground motion effects. These techniques include the least squares support vector machine (LSSVM), wavelet neural networks (WNN), and the adaptive neuro-fuzzy inference system (ANFIS), along with wavelet denoising. The simulation of a 2D frame model and eight ground motions are considered in this study to evaluate the prediction models. The comparison results indicate that the least squares support vector machine is superior to the other techniques in estimating the behavior of smart structures.
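
    For readers unfamiliar with LSSVM: unlike a standard SVM, its training reduces to solving a single linear system. A self-contained regression sketch in the Suykens-style dual formulation (kernel width, regularization value and the toy data are illustrative assumptions):

    ```python
    import numpy as np

    def rbf(X, Z, s=1.0):
        d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * s ** 2))

    def lssvm_fit(X, y, gamma=10.0, s=1.0):
        """Least-squares SVM regression: the KKT conditions reduce to one
        linear system in (b, alpha). Hyperparameters are illustrative."""
        n = len(y)
        A = np.zeros((n + 1, n + 1))
        A[0, 1:], A[1:, 0] = 1.0, 1.0
        A[1:, 1:] = rbf(X, X, s) + np.eye(n) / gamma
        sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
        b, alpha = sol[0], sol[1:]
        return lambda Xq: rbf(Xq, X, s) @ alpha + b   # prediction function

    X = np.linspace(0, 6, 40)[:, None]
    y = np.sin(X[:, 0])
    predict = lssvm_fit(X, y)
    print(predict(np.array([[1.5]])))   # ~ sin(1.5)
    ```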

  18. Reliability of lifeline networks under seismic hazard

    International Nuclear Information System (INIS)

    Selcuk, A. Sevtap; Yuecemen, M. Semih

    1999-01-01

    Lifelines, such as pipelines, transportation, communication and power transmission systems, are networks which extend spatially over large geographical regions. The quantification of the reliability (survival probability) of a lifeline under seismic threat requires attention, as the proper functioning of these systems during or after a destructive earthquake is vital. In this study, a lifeline is idealized as an equivalent network whose element capacities are random and spatially correlated, and a comprehensive probabilistic model for assessing the reliability of lifelines under earthquake loads is developed. The seismic hazard that the network is exposed to is described by a probability distribution derived from past earthquake occurrence data. The seismic hazard analysis is based on the 'classical' seismic hazard analysis model with some modifications. An efficient algorithm developed by Yoo and Deo (Yoo YB, Deo N. A comparison of algorithms for terminal pair reliability. IEEE Transactions on Reliability 1988; 37: 210-215) is utilized for the evaluation of the network reliability. This algorithm eliminates the CPU time and memory capacity problems for large networks. A comprehensive computer program, called LIFEPACK, is coded in Fortran to carry out the numerical computations. Two detailed case studies are presented to show the implementation of the proposed model.
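
    Terminal-pair reliability asks for the probability that two designated nodes remain connected when links fail independently. The paper uses Yoo and Deo's exact algorithm; as a rough, easily checked baseline, a Monte Carlo estimator can be sketched (the network, survival probabilities and trial count below are illustrative assumptions):

    ```python
    import random
    from collections import defaultdict

    def terminal_pair_reliability(edges, s, t, p_survive, trials=20000):
        """Monte Carlo estimate of P(s and t stay connected) when each link
        independently survives the earthquake with probability p_survive[e].
        Illustrative estimator, not Yoo & Deo's exact algorithm."""
        hits = 0
        for _ in range(trials):
            adj = defaultdict(list)
            for (u, v) in edges:
                if random.random() < p_survive[(u, v)]:
                    adj[u].append(v)
                    adj[v].append(u)
            stack, seen = [s], {s}
            while stack:                       # DFS over surviving links
                for w in adj[stack.pop()]:
                    if w not in seen:
                        seen.add(w)
                        stack.append(w)
            hits += t in seen
        return hits / trials

    edges = [("A", "B"), ("B", "C"), ("A", "C"), ("C", "D")]
    p = {e: 0.9 for e in edges}
    print(terminal_pair_reliability(edges, "A", "D", p))
    ```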

  19. Introduction to Xgrid: Cluster Computing for Everyone

    OpenAIRE

    Breen, Barbara J.; Lindner, John F.

    2010-01-01

    Xgrid is the first distributed computing architecture built into a desktop operating system. It allows you to run a single job across multiple computers at once. All you need is at least one Macintosh computer running Mac OS X v10.4 or later. (Mac OS X Server is not required.) We provide explicit instructions and example code to get you started, including examples of how to distribute your computing jobs, even if your initial cluster consists of just two old laptops in your basement.

  20. A Desktop Virtual Reality Earth Motion System in Astronomy Education

    Science.gov (United States)

    Chen, Chih Hung; Yang, Jie Chi; Shen, Sarah; Jeng, Ming Chang

    2007-01-01

    In this study, a desktop virtual reality earth motion system (DVREMS) is designed and developed to be applied in the classroom. The system is implemented to assist elementary school students to clarify earth motion concepts using virtual reality principles. A study was conducted to observe the influences of the proposed system in learning.…

  1. Report on fundamental survey on developing coal resources in fiscal 1999 - summarized edition. Survey and development of new exploration technology (exploration of shallow land area beds); 1999 nendo shintansa gijutsu chosa kaihatsu (rikuiki senso tansa) hokokusho (yoyakuban)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

    Coal resource surveys, based mainly on geophysical exploration methods, were performed in New South Wales, Australia. In the Caroona area, Permian strata containing multiple coal seams are distributed. These strata are unconformably overlain by Triassic sediments, which are in turn unconformably overlain by Jurassic volcanic rocks. Faults pass through the coal seams and the adjacent softer strata above and below them. The faults were identified by two-dimensional seismic reflection surveying, assisted by geophysical logging results. Tracking of reflection events was combined with test drilling results to outline the outcrop lines of the major coal seams. Furthermore, three-dimensional seismic reflection surveying, capable of high-resolution imaging of underground structures, was applied to coal seams at depths of less than 480 m. Multiple local, small-scale depressions were detected, with a time difference of 5 ms, equivalent to about 7 m after depth conversion. The locations, strikes and dips of small-throw faults, which had remained unclear in the two-dimensional survey, were also interpreted. The three-dimensional seismic reflection method proved capable of identifying minor structural changes and fault trends that cannot be tracked by the two-dimensional method. (NEDO)

  2. Seismic data processing for domestic seismic survey over the continental shelf of Korea using the Geobit

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Jin Yong [Korea Inst. of Geology Mining and Materials, Taejon (Korea, Republic of)

    1995-12-01

    The Geobit, seismic data processing software recently introduced by the Korea Institute of Geology, Mining and Materials, marks an achievement in the development of oil exploration technology for the Korean continental shelf. In comparison with the foreign seismic data processing systems previously used in Korea, the Geobit system has some advanced facilities: it provides an interactive mode that makes seismic processing easier, and its user-friendly programs simplify the construction of a job control file. Above all, the Geobit runs on many computer hardware systems, from PCs to supercomputers. The current version of the Geobit handles two-dimensional multi-channel seismic data and is open to the public as an education and research tool. To demonstrate the ability of the Geobit, multi-channel field data acquired over the Yellow Sea portion of the domestic continental shelf in 1970 were selected and processed with standard seismic data processing techniques. In this report, the Geobit job files and the corresponding results for the construction of a stack are provided. (author). 8 refs., 14 figs., 1 tab.
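
    A central step in "the construction of a stack" mentioned above is normal-moveout (NMO) correction followed by summing the traces of a common-midpoint gather. A numpy sketch of that generic flow (not Geobit's actual implementation; all parameters and the dummy gather are illustrative):

    ```python
    import numpy as np

    def nmo_stack(gather, offsets, dt, v_nmo):
        """NMO-correct a CMP gather (traces x samples) and stack it.
        Reflection time at offset x: t(x) = sqrt(t0^2 + (x/v)^2);
        each trace is resampled onto the zero-offset t0 axis."""
        n_tr, n_s = gather.shape
        t0 = np.arange(n_s) * dt
        stacked = np.zeros(n_s)
        for trace, x in zip(gather, offsets):
            t_x = np.sqrt(t0 ** 2 + (x / v_nmo) ** 2)
            stacked += np.interp(t_x, t0, trace, right=0.0)
        return stacked / n_tr

    gather = np.random.randn(24, 1000)       # dummy 24-fold gather, 1000 samples
    offsets = np.linspace(100, 2400, 24)     # metres
    print(nmo_stack(gather, offsets, dt=0.004, v_nmo=2500.0)[:5])
    ```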

  3. ZIVIS: A City Computing Platform Based on Volunteer Computing

    International Nuclear Information System (INIS)

    Antoli, B.; Castejon, F.; Giner, A.; Losilla, G.; Reynolds, J. M.; Rivero, A.; Sangiao, S.; Serrano, F.; Tarancon, A.; Valles, R.; Velasco, J. L.

    2007-01-01

    Volunteer computing has emerged as a new form of distributed computing. Unlike other computing paradigms like Grids, which tend to be based on complex architectures, volunteer computing has demonstrated a great ability to integrate dispersed, heterogeneous computing resources with ease. This article presents ZIVIS, a project which aims to deploy a city-wide computing platform in Zaragoza (Spain). ZIVIS is based on BOINC (Berkeley Open Infrastructure for Network Computing), a popular open source framework to deploy volunteer and desktop grid computing systems. A scientific code which simulates the trajectories of particles moving inside a stellarator fusion device has been chosen as the pilot application of the project. In this paper we describe the approach followed to port the code to the BOINC framework, as well as some novel techniques, based on standard Grid protocols, that we have used to access the output data present in the BOINC server from a remote visualizer. (Author)

  4. Squaring the circle on thermal coal

    International Nuclear Information System (INIS)

    Grossman, S.

    2008-01-01

    Participants in the Pacific market have much to gain by understanding how and why coal prices fluctuate. This presentation addressed market issues pertaining to the supply of and demand for coal. It discussed commoditisation, the process by which a product moves from being a unique or differentiated product to a commodity. Price volatility is a measurement of the change in price over a given time period; it is often expressed as a percentage and computed as the annualized standard deviation of the percentage change in daily price. This price volatility and its causes were also reviewed. The driving forces of commoditisation include demand for price transparency; changes from traditional buying and selling patterns; and external factors. Price volatility occurs when logistics have not kept pace with product flow. Volatility can be attributed to supply and demand speculation, margin calls and the link between the price of coal and other fossil fuels, especially in Europe. The demand for price transparency as well as the change from traditional buying and selling patterns were discussed. It was concluded that the volatility of coal prices will increase as coal becomes increasingly affected by the global energy market. tabs., figs
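
    The volatility definition quoted above translates directly into a few lines of numpy (the 252-trading-day annualization factor and the price series are illustrative assumptions):

    ```python
    import numpy as np

    def annualized_volatility(prices, trading_days=252):
        """Volatility as defined in the text: annualized standard deviation
        of the daily percentage price changes."""
        daily_pct = np.diff(prices) / prices[:-1]
        return np.std(daily_pct, ddof=1) * np.sqrt(trading_days)

    # Hypothetical daily thermal coal prices, USD/t
    coal_usd_t = np.array([120.0, 122.5, 121.0, 125.3, 124.1, 127.9])
    print(f"{annualized_volatility(coal_usd_t):.1%}")
    ```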

  5. What about coal? Interactions between climate policies and the global steam coal market until 2030

    International Nuclear Information System (INIS)

    Haftendorn, C.; Kemfert, C.; Holz, F.

    2012-01-01

    Because of economic growth and a strong increase in global energy demand the demand for fossil fuels and therefore also greenhouse gas emissions are increasing, although climate policy should lead to the opposite effect. The coal market is of special relevance as coal is available in many countries and often the first choice to meet energy demand. In this paper we assess possible interactions between climate policies and the global steam coal market. Possible market adjustments between demand regions through market effects are investigated with a numerical model of the global steam coal market: the “COALMOD-World” model. This equilibrium model computes future trade flows, infrastructure investments and prices until 2030. We investigate three specific designs of climate policy: a unilateral European climate policy, an Indonesian export-limiting policy and a fast-roll out of carbon capture and storage (CCS) in the broader context of climate policy and market constraints. We find that market adjustment effects in the coal market can have significant positive and negative impacts on the effectiveness of climate policies. - Highlights: ► Interactions between climate policy and the global coal market until 2030 modeled. ► Analysis with the numerical model: “COALMOD-World”. ► Unilateral European climate policy partly compensated by market adjustment effects. ► A fast roll-out of CCS can lead to positive market adjustment effects. ► An export restricting supply-side policy generates virtuous market adjustments.

  6. Using M@th Desktop Notebooks and Palettes in the Classroom

    Science.gov (United States)

    Simonovits, Reinhard

    2011-01-01

    This article explains the didactical design of M@th Desktop (MD), a teaching and learning software application for high schools and universities. The use of two types of MD resources is illustrated: notebooks and palettes, focusing on the topic of exponential functions. The handling of MD in a blended learning approach and the impact on the…

  7. Report on fiscal 2000 basic survey for coal resource exploration. Survey for development of new exploration technology (Exploration of shallow layers on the land); 2000 nendo sekitan shigen kaihatsu kiso chosa hokokusho. Shintansa gijutsu chosa kaihatsu (rikuiki senso tansa)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-09-01

    Efforts were made to develop a high-precision, high-resolution seismic reflection method, a high-efficiency borehole measurement method, a coal potentiality assessment system, and so forth. For the development of the seismic reflection method, studies were conducted on how to deal properly with situations where high-velocity layers lie above the target coal bed, and it was concluded that a seismic source based on a pseudorandom binary sequence code would be best for the purpose. The system was tested for verification in a producing coal mine. As for borehole measurement, an on-site test was conducted on a geophysical logging system capable of determining total sulfur content, ash, and the like, and the system was found to supply high-quality data. In developing the coal potentiality assessment system, studies addressed the basic concept of coordinating the system with a coal GIS (geographical information system) handling spatial data, as well as the system's functions, and a conceptual design was prepared. (NEDO)

  8. HTGR core seismic analysis using an array processor

    International Nuclear Information System (INIS)

    Shatoff, H.; Charman, C.M.

    1983-01-01

    A Floating Point Systems array processor performs nonlinear dynamic analysis of the high-temperature gas-cooled reactor (HTGR) core with significant time and cost savings. The graphite HTGR core consists of approximately 8000 blocks of various shapes which are subject to motion and impact during a seismic event. Two-dimensional computer programs (CRUNCH2D, MCOCO) can perform explicit step-by-step dynamic analyses of up to 600 blocks for time-history motions. However, use of the two-dimensional codes was limited by the large cost and run times required. Three-dimensional analysis of the entire core, or even a large part of it, had been considered totally impractical. Because of the needs of the HTGR core seismic program, a Floating Point Systems array processor was used to enhance the computer performance of the two-dimensional core seismic computer programs, MCOCO and CRUNCH2D. This effort began by converting the computational algorithms used in the codes to a form which takes maximum advantage of the parallel and pipeline processors offered by the architecture of the Floating Point Systems array processor. The subsequent conversion of the vectorized FORTRAN coding to the array processor required a significant programming effort to make the system work on the General Atomic (GA) UNIVAC 1100/82 host. These efforts were quite rewarding, however, since the cost of running the codes has been reduced approximately 50-fold and the time threefold. Core seismic analysis with large two-dimensional models has now become routine, and extension to three-dimensional analysis is feasible. These codes simulate the one-fifth-scale full-array HTGR core model. This paper compares the analysis with the test results for sine-sweep motion.

  9. Underground coal mining in the Karviná region and its impact on the human environment (Czech Republic)

    Czech Academy of Sciences Publication Activity Database

    Doležalová, Hana; Holub, Karel; Kaláb, Zdeněk

    2008-01-01

    Vol. 16, No. 2 (2008), pp. 14-24 ISSN 1210-8812 R&D Projects: GA ČR(CZ) GA105/07/0878; GA ČR(CZ) GA105/07/1586 Institutional research plan: CEZ:AV0Z30860518 Keywords: Ostrava-Karviná Coal Basin * induced seismicity * surface subsidence Subject RIV: DC - Seismology, Volcanology, Earth Structure

  10. SONATINA-2H: a computer program for seismic analysis of the two-dimensional horizontal slice HTGR core

    International Nuclear Information System (INIS)

    Ikushima, Takeshi

    1990-02-01

    A computer program, SONATINA-2H, has been developed for predicting the behavior of a two-dimensional horizontal HTGR core under seismic excitation. SONATINA-2H is a general two-dimensional computer program capable of analyzing a horizontal slice of the HTGR core together with the fixed side reflector blocks, their restraint structures, and the core support structure. In the analytical model, each block is treated as a rigid body, represents one column of the reactor core, and is connected to the core support structure by means of column springs and viscous dampers. A single dashpot model is used for the collision process between adjacent blocks. The core support structure is represented by a single block. The computer program SONATINA-2H is capable of analyzing the core behavior for an excitation input applied simultaneously in two mutually perpendicular horizontal directions. The present report gives the theoretical formulation of the analytical model, a user's manual describing the input and output formats, and sample problems. (author)
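
    The analytical model described above (rigid columns on column springs and viscous dampers, with a single-dashpot gap contact between neighbors) can be illustrated with a two-column time-stepping sketch; all masses, stiffnesses, gaps and the ground motion below are illustrative assumptions, not SONATINA-2H data:

    ```python
    import numpy as np

    # Two rigid columns, each tied to the core support by a spring + viscous
    # damper; adjacent columns collide through a gap + single-dashpot contact.
    m, k, c = 500.0, 2.0e5, 400.0            # mass [kg], spring [N/m], damper [Ns/m]
    k_imp, c_imp, gap = 5.0e6, 2.0e3, 0.002  # contact stiffness/dashpot, gap [m]
    dt, steps = 1.0e-4, 20000

    x, v = np.zeros(2), np.zeros(2)          # displacements / velocities
    for i in range(steps):
        t = i * dt
        xg = 0.01 * np.sin(2 * np.pi * 2.0 * t)        # horizontal support motion
        f = -k * (x - xg) - c * v                      # column spring + damper
        pen = (x[0] - x[1]) - gap                      # >0 once the gap is closed
        if pen > 0:
            fc = k_imp * pen + c_imp * (v[0] - v[1])   # dashpot collision force
            f[0] -= fc
            f[1] += fc
        v += f / m * dt                                # semi-implicit Euler step
        x += v * dt
    print(x, v)
    ```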

  11. Coal 1992

    Energy Technology Data Exchange (ETDEWEB)

    1992-01-01

    ACR's Coal 1992, the successor to the ACR Coal Marketing Manual, contains a comprehensive set of data on many aspects of the Australian coal industry for several years leading up to 1992. Tables and text give details of coal production and consumption in New South Wales, Queensland and other states. Statistics of the Australian export industry are complemented by those of South Africa, USA, New Zealand, Canada, Indonesia, China, Colombia, Poland and ex-USSR. Also listed are prices of Australian coking and non-coking coal, Australian coal stocks (and those of other major countries), loading port capacities, freight rates and coal quality requirements (analysis of coals by brand and supplier). A listing of Australian coal exporting companies is provided. A description of the spot Coal Screen Dealing System is given. World hard coal imports are listed by country and coal imports by major Asian countries tabulated. A forecast of demand by coal type and country up to the year 2000 is included.

  12. Characterization of Coal Porosity for Naturally Tectonically Stressed Coals in Huaibei Coal Field, China

    Science.gov (United States)

    Li, Xiaoshi; Hou, Quanlin; Li, Zhuo; Wei, Mingming

    2014-01-01

    The enrichment of coalbed methane (CBM) and the outburst of gas in a coal mine are closely related to the nanopore structure of coal. The evolutionary characteristics of 12 coal nanopore structures under different natural deformational mechanisms (brittle and ductile deformation) are studied using a scanning electron microscope (SEM) and low-temperature nitrogen adsorption. The results indicate that ductile deformed coal contains mainly submicropores (2~5 nm) and supermicropores (<2 nm), while brittle deformed coal contains mainly mesopores (10~100 nm) and micropores (5~10 nm). The cumulative pore volume (V) and surface area (S) in brittle deformed coal are smaller than those in ductile deformed coal, which indicates more adsorption space for gas in the latter. Coal with smaller pores exhibits a large surface area, and coal with larger pores exhibits a large pore volume. We also found that the relationship between S and V turns from a positive correlation to a negative correlation when S > 4 m2/g, with pore sizes below 5 nm, in ductile deformed coal. The nanopore structure (<100 nm) thus plays an important role in the gas adsorption behavior of tectonically deformed coal. PMID:25126601

  13. Model of environmental life cycle assessment for coal mining operations

    Energy Technology Data Exchange (ETDEWEB)

    Burchart-Korol, Dorota, E-mail: dburchart@gig.eu; Fugiel, Agata, E-mail: afugiel@gig.eu; Czaplicka-Kolarz, Krystyna, E-mail: kczaplicka@gig.eu; Turek, Marian, E-mail: mturek@gig.eu

    2016-08-15

    This paper presents a novel approach to environmental assessment of coal mining operations, which enables assessment of the factors that are both directly and indirectly affecting the environment and are associated with the production of raw materials and energy used in processes. The primary novelty of the paper is the development of a computational environmental life cycle assessment (LCA) model for coal mining operations and the application of the model for coal mining operations in Poland. The LCA model enables the assessment of environmental indicators for all identified unit processes in hard coal mines with the life cycle approach. The proposed model enables the assessment of greenhouse gas emissions (GHGs) based on the IPCC method and the assessment of damage categories, such as human health, ecosystems and resources based on the ReCiPe method. The model enables the assessment of GHGs for hard coal mining operations in three time frames: 20, 100 and 500 years. The model was used to evaluate the coal mines in Poland. It was demonstrated that the largest environmental impacts in damage categories were associated with the use of fossil fuels, methane emissions and the use of electricity, processing of wastes, heat, and steel supports. It was concluded that an environmental assessment of coal mining operations, apart from direct influence from processing waste, methane emissions and drainage water, should include the use of electricity, heat and steel, particularly for steel supports. Because the model allows the comparison of environmental impact assessment for various unit processes, it can be used for all hard coal mines, not only in Poland but also in the world. This development is an important step forward in the study of the impacts of fossil fuels on the environment with the potential to mitigate the impact of the coal industry on the environment. - Highlights: • A computational LCA model for assessment of coal mining operations • Identification of
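
    The three reporting horizons (20, 100 and 500 years) matter mainly because methane's global warming potential falls steeply with time. A sketch of that GHG aggregation step (the GWP factors are IPCC AR4 values; the inventory numbers are purely illustrative, not the paper's data):

    ```python
    # kg CO2-equivalent per kg CH4 at three time horizons (IPCC AR4 values)
    GWP_CH4 = {20: 72.0, 100: 25.0, 500: 7.6}

    def mine_ghg(co2_kg, ch4_kg, horizon=100):
        """Aggregate a mine's direct CO2 and CH4 emissions into CO2-eq
        for a chosen time horizon (illustrative, simplified inventory)."""
        return co2_kg + GWP_CH4[horizon] * ch4_kg

    inventory = {"co2_kg": 4.2e7, "ch4_kg": 3.5e6}   # hypothetical annual totals
    for h in (20, 100, 500):
        print(f"{h} yr: {mine_ghg(**inventory, horizon=h):.3e} kg CO2-eq")
    ```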

  14. Model of environmental life cycle assessment for coal mining operations

    International Nuclear Information System (INIS)

    Burchart-Korol, Dorota; Fugiel, Agata; Czaplicka-Kolarz, Krystyna; Turek, Marian

    2016-01-01

    This paper presents a novel approach to environmental assessment of coal mining operations, which enables assessment of the factors that are both directly and indirectly affecting the environment and are associated with the production of raw materials and energy used in processes. The primary novelty of the paper is the development of a computational environmental life cycle assessment (LCA) model for coal mining operations and the application of the model for coal mining operations in Poland. The LCA model enables the assessment of environmental indicators for all identified unit processes in hard coal mines with the life cycle approach. The proposed model enables the assessment of greenhouse gas emissions (GHGs) based on the IPCC method and the assessment of damage categories, such as human health, ecosystems and resources based on the ReCiPe method. The model enables the assessment of GHGs for hard coal mining operations in three time frames: 20, 100 and 500 years. The model was used to evaluate the coal mines in Poland. It was demonstrated that the largest environmental impacts in damage categories were associated with the use of fossil fuels, methane emissions and the use of electricity, processing of wastes, heat, and steel supports. It was concluded that an environmental assessment of coal mining operations, apart from direct influence from processing waste, methane emissions and drainage water, should include the use of electricity, heat and steel, particularly for steel supports. Because the model allows the comparison of environmental impact assessment for various unit processes, it can be used for all hard coal mines, not only in Poland but also in the world. This development is an important step forward in the study of the impacts of fossil fuels on the environment with the potential to mitigate the impact of the coal industry on the environment. - Highlights: • A computational LCA model for assessment of coal mining operations • Identification of

  15. Determining the degree of break up of coal strata and clay interlayers by looseners at the Chukurovo pit

    Energy Technology Data Exchange (ETDEWEB)

    Atanasov, A B; Stoyanov, D S; Yordanov, Y M

    1983-01-01

    One of the rational technologies for selective mining of thin coal strata with a complex geological structure is one in which different types of mechanical looseners are used. Identification of the degree of break-up of the rock is accomplished most rapidly and precisely using seismoacoustic methods, the evaluation being based on the speed of propagation of elastic waves in the rock mass. SVM seismic receivers were used at the Chukurovo pit to determine the speed of propagation of an elastic wave in coals and clays. Shafts 10 meters long were drilled in three experimental fields using the SVB-2 machine. The speed of propagation of an elastic wave in the coals and in sandy clays was determined at one-meter intervals of shaft depth. By comparing the results obtained with standard tables from different companies, the feasibility of breaking up the coal and clay interlayers in the experimental sectors with looseners of different engine types and capacities was determined.

  16. EIAGRID: In-field optimization of seismic data acquisition by real-time subsurface imaging using a remote GRID computing environment.

    Science.gov (United States)

    Heilmann, B. Z.; Vallenilla Ferrara, A. M.

    2009-04-01

    The constant growth of contaminated sites, the unsustainable use of natural resources, and, last but not least, the hydrological risk related to extreme meteorological events and increased climate variability are major environmental issues of today. Finding solutions for these complex problems requires an integrated cross-disciplinary approach, providing a unified basis for environmental science and engineering. In computer science, grid computing is emerging worldwide as a formidable tool allowing distributed computation and data management with administratively-distant resources. Utilizing these modern High Performance Computing (HPC) technologies, the GRIDA3 project bundles several applications from different fields of geoscience, aiming to support decision making for reasonable and responsible land use and resource management. In this abstract we present a geophysical application called EIAGRID that uses grid computing facilities to perform real-time subsurface imaging by on-the-fly processing of seismic field data and fast optimization of the processing workflow. Even though seismic reflection profiling has a broad application range, spanning from shallow targets at a few meters depth to targets at depths of several kilometers, it is primarily used by the hydrocarbon industry and rarely for environmental purposes. The complexity of data acquisition and processing poses severe problems for environmental and geotechnical engineering: professional seismic processing software is expensive to buy and demands considerable experience from the user. In-field processing equipment needed for real-time data Quality Control (QC) and immediate optimization of the acquisition parameters is often not available for this kind of study. As a result, the data quality will be suboptimal. In the worst case, a crucial parameter such as receiver spacing, maximum offset, or recording time turns out later to be inappropriate and the complete acquisition campaign has to be repeated.

  17. Volunteered Cloud Computing for Disaster Management

    Science.gov (United States)

    Evans, J. D.; Hao, W.; Chettri, S. R.

    2014-12-01

    Disaster management relies increasingly on interpreting earth observations and running numerical models, which require significant computing capacity - usually on short notice and at irregular intervals. Peak computing demand during event detection, hazard assessment, or incident response may exceed agency budgets; however, some of it can be met through volunteered computing, which distributes subtasks to participating computers via the Internet. This approach has enabled large projects in mathematics, basic science, and climate research to harness the slack computing capacity of thousands of desktop computers. This capacity is likely to diminish as desktops give way to battery-powered mobile devices (laptops, smartphones, tablets) in the consumer market; but as cloud computing becomes commonplace, it may offer significant slack capacity -- if its users are given an easy, trustworthy mechanism for participating. Such a "volunteered cloud computing" mechanism would also offer several advantages over traditional volunteered computing: tasks distributed within a cloud have fewer bandwidth limitations; granular billing mechanisms allow small slices of "interstitial" computing at no marginal cost; and virtual storage volumes allow in-depth, reversible machine reconfiguration. Volunteered cloud computing is especially suitable for "embarrassingly parallel" tasks, including ones requiring large data volumes: examples in disaster management include near-real-time image interpretation, pattern / trend detection, or large model ensembles. In the context of a major disaster, we estimate that cloud users (if suitably informed) might volunteer hundreds to thousands of CPU cores across a large provider such as Amazon Web Services. To explore this potential, we are building a volunteered cloud computing platform and targeting it to a disaster management context. Using a lightweight, fault-tolerant network protocol, this platform helps cloud users join parallel computing projects

  18. Guidelines for drafting national and international seismic standards

    International Nuclear Information System (INIS)

    Podrouzek, J.

    1989-01-01

    The main principles of engineering reliability are discussed in relation to the formulation of seismic standards. The basic recommendations of the International Association for Earthquake Engineering in the field of inspection and earthquake resistance evaluation of engineering structures and systems are characterized. Attention is also paid to efforts aimed at the unification of standards and regulations, based on the fact that quasistatic and response spectrum methods are largely common to the standards and regulations. However, as the potential of computer techniques increases, more complex computer programs appear and the amount of uncertain input data increases, and this can affect the quality of seismic inspections. (Z.M.). 5 figs., 1 ref

  19. Multicomponent ensemble models to forecast induced seismicity

    Science.gov (United States)

    Király-Proag, E.; Gischig, V.; Zechar, J. D.; Wiemer, S.

    2018-01-01

    In recent years, human-induced seismicity has become a more and more relevant topic due to its economic and social implications. Several models and approaches have been developed to explain underlying physical processes or forecast induced seismicity. They range from simple statistical models to coupled numerical models incorporating complex physics. We advocate the need for forecast testing as currently the best method for ascertaining whether or not models are capable of reasonably accounting for the key governing physical processes. Moreover, operational forecast models are of great interest for supporting on-site decision-making in projects entailing induced earthquakes. We previously introduced a standardized framework following the guidelines of the Collaboratory for the Study of Earthquake Predictability, the Induced Seismicity Test Bench, to test, validate, and rank induced seismicity models. In this study, we describe how to construct multicomponent ensemble models based on Bayesian weightings that deliver more accurate forecasts than individual models in the case of the Basel 2006 and Soultz-sous-Forêts 2004 enhanced geothermal stimulation projects. For this, we examine five calibrated variants of two significantly different model groups: (1) Shapiro and Smoothed Seismicity, based on the seismogenic index, a simple modified Omori-law-type seismicity decay, and temporally weighted smoothed seismicity; (2) Hydraulics and Seismicity, based on numerically modelled pore pressure evolution that triggers seismicity using the Mohr-Coulomb failure criterion. We also demonstrate how the individual and ensemble models would perform as part of an operational Adaptive Traffic Light System. Investigating seismicity forecasts based on a range of potential injection scenarios, we use forecast periods of different durations to compute the occurrence probabilities of seismic events M ≥ 3. We show that in the case of the Basel 2006 geothermal stimulation the models forecast hazardous levels
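
    One simple realization of such Bayesian weighting is to score each model by its Poisson likelihood of the observed event counts per space-time bin and mix the rate forecasts accordingly; the scheme and all numbers below are an illustrative sketch, not the paper's exact method:

    ```python
    import numpy as np
    from scipy.stats import poisson

    def bayesian_ensemble(forecasts, observed):
        """Weight each model by its Poisson likelihood of the observed counts,
        then average the rate forecasts with those weights."""
        loglik = np.array([poisson.logpmf(observed, rates).sum()
                           for rates in forecasts])
        w = np.exp(loglik - loglik.max())      # stabilized posterior weights
        w /= w.sum()
        return w, np.average(forecasts, axis=0, weights=w)

    forecasts = np.array([[2.0, 0.5, 1.2],     # model A: expected events per bin
                          [1.1, 1.0, 0.7]])    # model B
    observed = np.array([1, 1, 1])             # events actually recorded
    w, mix = bayesian_ensemble(forecasts, observed)
    print(w, mix)                              # P(>=1 event) per bin: 1 - exp(-mix)
    ```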

  20. The Role of Wireless Computing Technology in the Design of Schools.

    Science.gov (United States)

    Nair, Prakash

    This document discusses integrating computers logically and affordably into a school building's infrastructure through the use of wireless technology. It begins by discussing why wireless networks using mobile computers are preferable to desktop machines in each classroom. It then explains the features of a wireless local area network (WLAN) and…

  1. Coal upgrading

    Energy Technology Data Exchange (ETDEWEB)

    Nunes, S. [IEA Clean Coal Centre, London (United Kingdom)

    2009-10-15

    This report examines current technologies and those likely to be used to produce cleaner coal and coal products, principally for use in power generation and metallurgical applications. Consideration is also given to coal production in the leading coal producing countries, with both developed and developing industries. A range of technologies are considered. These include the coal-based liquid fuel called coal water mixture (CWM) that may compete with diesel, the production of ultra-clean coal (UCC) and coal liquefaction which competes with oil and its products. Technologies for upgrading coal are considered, especially for low rank coals (LRC), since these have the potential to fill the gap generated by the increasing demand for coal that cannot be met by higher quality coals. Potential advantages and downsides of coal upgrading are outlined. Taking into account the environmental benefits of reduced pollution achieved through cleaner coal and reduced transport costs, as well as other positive aspects such as a predictable product leading to better boiler design, the advantages appear to be significant. The drying of low rank coals improves the energy usefully released during combustion and may also be used as an adjunct or as part of other coal processing procedures. Coal washing technologies vary in different countries and the implications of this are outlined. Dry separation technologies, such as dry jigging and electrostatic separation, are also described. The demonstration of new technologies is key to their further development, and demonstrations of various clean coal technologies are considered. A number of approaches to briquetting and pelletising are available and their use varies from country to country. Finally, developments in upgrading low rank coals are described in the leading coal producing countries. This is an area that is developing rapidly and in which there are significant corporate and state players. 81 refs., 32 figs., 3 tabs.

  2. Computerization of administration and operation management in Polish black coal mines

    Energy Technology Data Exchange (ETDEWEB)

    Mastej, R.; Syrkiewicz, J. (Centralny Osrodek Informatyki Gornictwa (Poland))

    1990-08-01

    Characterizes the main solutions of the computerized management model adopted in Poland for the mining industry and the technical and organizational structure of computer system application. Computer systems for black coal mines and the range of microprocessor application are shown in block diagrams. The COIG mining information center makes about 45 computer system modules with independent implementation available for black coal mines. The general concept foresees central data processing in the COIG center on the ODRA 1305 and ICL 297 computers with the G-3 operating system in the first stage and an ICL series 39 computer with the VME operating system in the second stage. For mines where no transmission lines are available, local solutions based on smaller ICL computers, minicomputers or computer networks with the NOVELL network operating system or multi-access systems with the UNIX operating system are planned. 3 refs.

  3. Environmental control implications of generating electric power from coal. 1977 technology status report. Appendix A (Part 2). Coal preparation and cleaning assessment study appendix

    Energy Technology Data Exchange (ETDEWEB)

    None

    1977-12-01

    This report presents the results of integrating coal washability and coal reserves data obtained from the U.S. Bureau of Mines. Two computer programs were developed to match the appropriate entries in each data set and then merge the data into the form presented in this report. Approximately 18% of the total demonstrated coal reserves were matched with washability data. However, about 35% of the reserves that account for 80% of current production were successfully matched. Each computer printout specifies the location and size of the reserve, and then describes the coal with data on selected physical and chemical characteristics. Washability data are presented for three crush sizes (1.5 in., 3/8 in., and 14 mesh) and several specific gravities. In each case, the percent recovery, Btu/lb, percent ash, percent sulfur, lb SO_2/10^6 Btu, and reserves available at 1.2 lb SO_2/10^6 Btu are given. The sources of the original data and the methods used in the integration are discussed briefly.

  4. Desktop Video: Multi-Media on the NeXT Computer.

    Science.gov (United States)

    Stammen, Ronald M.; Richardson, Jolene

    A new course, Independent Study Research and Writing via Telecommunications, is being developed by the Division of Independent Study (DIS) of the North Dakota Department of Public Instruction to teach telepublishing skills utilizing the NeXT telecommunicating (interpersonal computing) techniques, i.e., NeXT Mail. This multimedia electronic-mail…

  5. Seismic behaviour of gas cooled reactor components

    International Nuclear Information System (INIS)

    1990-08-01

    On the invitation of the French Government, the Specialists' Meeting on the Seismic Behaviour of Gas-Cooled Reactor Components was held at Gif-sur-Yvette, 14-16 November 1989. This was the second Specialists' Meeting on the general subject of gas-cooled reactor seismic design. The 27 participants from France, the Federal Republic of Germany, Israel, Japan, Spain, Switzerland, the United Kingdom, the Soviet Union, the United States, the CEC and the IAEA took the opportunity to present and discuss a total of 16 papers reflecting the state of the art and the experience gained in seismic qualification approaches, seismic analysis methods, and the capabilities of various facilities used to qualify components and verify analytical methods. Since the first meeting, the sophistication and expanded capabilities of both the seismic analytical methods and the test facilities have become apparent. The two main methods for seismic analysis, the impedance method and the finite element method, have been computer-programmed in several countries, with the capability of each of the codes dependent on the computer capability. The correlations between calculation and tests are dependent on input assumptions such as boundary conditions, soil parameters and various interactions between the soil, the buildings and the contained equipment. The ability to adjust these parameters and match experimental results with calculations was displayed in several of the papers. The expanded capability of some of the new test facilities was graphically displayed by the description of the SAMSON vibration test facility at Juelich, FRG, capable of dynamically testing specimens weighing up to 25 tonnes, and the TAMARIS facility at the CEA laboratories in Gif-sur-Yvette, where the largest table is capable of testing specimens weighing up to 100 tonnes. The proceedings of this meeting contain all 16 presented papers. A separate abstract was prepared for each of these papers. Refs, figs and tabs

  6. Integrating multi-objective optimization with computational fluid dynamics to optimize boiler combustion process of a coal fired power plant

    International Nuclear Information System (INIS)

    Liu, Xingrang; Bansal, R.C.

    2014-01-01

    Highlights: • A coal fired power plant boiler combustion process model based on real data. • We propose multi-objective optimization with CFD to optimize boiler combustion. • The proposed method uses software CORBA C++ and ANSYS Fluent 14.5 with AI. • It optimizes heat flux transfers and maintains temperature to avoid ash melt. - Abstract: The dominant role of electricity generation and environmental considerations have placed strong requirements on coal fired power plants, requiring them to improve boiler combustion efficiency and decrease carbon emissions. Although neural network based optimization strategies are often applied to improve coal fired power plant boiler efficiency, they are limited by some combustion related problems such as slagging. Slagging can seriously influence the heat transfer rate and decrease boiler efficiency. In addition, it is difficult to measure slag build-up; this lack of measurement can restrict conventional neural network based coal fired boiler optimization, because no data can be used to train the neural network. This paper proposes a novel method of integrating non-dominated sorting genetic algorithm (NSGA II) based multi-objective optimization with computational fluid dynamics (CFD) to decrease or even avoid slagging inside a coal fired boiler furnace and improve boiler combustion efficiency. Compared with conventional neural network based boiler optimization methods, the method developed in this work can control and optimize the fields of flue gas properties, such as the temperature field inside a boiler, by adjusting the temperature and velocity of primary and secondary air in coal fired power plant boiler control systems. The temperature in the vicinity of the water wall tubes of a boiler can be maintained within the ash melting temperature limit, so that incoming ash particles cannot melt and bond to the surfaces of the boiler's heat transfer equipment, and the tendency to slagging inside the furnace is controlled.
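
    The core of NSGA-II that the paper builds on is non-dominated (Pareto) sorting of candidate operating points; a minimal sketch for two minimized objectives (the objective pairs below are hypothetical CFD outcomes, not the paper's data):

    ```python
    def dominates(a, b):
        """a dominates b iff a is no worse in every objective and strictly
        better in at least one (minimization)."""
        return (all(x <= y for x, y in zip(a, b))
                and any(x < y for x, y in zip(a, b)))

    def pareto_front(points):
        """First non-dominated front: the core ranking step of NSGA-II."""
        return [p for p in points
                if not any(dominates(q, p) for q in points if q != p)]

    # Hypothetical (unburned-carbon loss [%], NOx emission [mg/Nm3]) pairs
    # evaluated by CFD for candidate air temperature/velocity settings:
    designs = [(4.1, 320.0), (3.2, 400.0), (5.0, 300.0),
               (4.2, 335.0), (3.9, 330.0)]
    print(pareto_front(designs))   # (4.2, 335.0) is dominated and dropped
    ```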

  7. Summarized report on fiscal 2000 basic survey for coal resource exploration. Survey for development of new exploration technology (Exploration of shallow layers on the land); 2000 nendo sekitan shigen kaihatsu kiso chosa hokokusho (yoyakuban). Shintansa gijutsu chosa kaihatsu (rikuiki senso tansa)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-09-01

    Efforts were made to develop a high-precision, high-resolution seismic reflection method, a high-efficiency borehole measurement method, a coal potentiality assessment system, and so forth. For the development of the seismic reflection method, studies were conducted on how to deal properly with situations where high-velocity layers lie above the target coal bed, and it was concluded that a seismic source based on a pseudorandom binary sequence code would be best for the purpose. The system was tested for verification in a producing coal mine. As for borehole measurement, an on-site test was conducted on a geophysical logging system capable of determining total sulfur content, ash, and the like, and the system was found to supply high-quality data. In developing the coal potentiality assessment system, studies addressed the basic concept of coordinating the system with a coal GIS (geographical information system) handling spatial data, as well as the system's functions, and a conceptual design was prepared. (NEDO)

  8. Comparison of chest radiography and high-resolution computed tomography findings in early and low-grade coal worker's pneumoconiosis

    Energy Technology Data Exchange (ETDEWEB)

    Savranlar, A.; Altin, R.; Mahmutyazicioglu, K.; Ozdemir, H.; Kart, L.; Ozer, T.; Gundogdu, S. [Zonguldak Karaelmas University, Zonguldak (Turkey). Faculty of Medicine

    2004-08-01

    High-resolution computed tomography (HRCT) is more sensitive than chest X-ray (CXR) in the depiction of parenchymal abnormalities. The paper presents and compares CXR and HRCT findings in coal workers with and without early and low-grade coal worker's pneumoconiosis (CWP). Seventy-one coal workers were enrolled in the study. HRCTs were graded according to Hosoda and Shida's Japanese classification. After grading, 67 workers with CXR profusion 0/0-2/2 were included in the study; four patients with major opacity were excluded. Cases with profusion 0/1 to 1/1 were accepted as early pneumoconiosis and cases with profusion up to 2/2 as low-grade pneumoconiosis. The discordance rate was found to be higher in CXR-negative and early pneumoconiosis cases than in low-grade pneumoconiosis (60, 36 and 8%, respectively). When coal miners with normal CXR were evaluated by HRCT, six out of 10 cases were diagnosed as positive. In the low-grade pneumoconiosis group, the number of patients with positive CXR but negative HRCT was low in comparison to CXR-negative patients with early pneumoconiosis findings. Most of the CXR category 0 patients (10/16) were diagnosed as category 1 by HRCT. Eleven cases diagnosed as CXR category 1 were diagnosed as category 0 (7/11) or category 2 (4/11) by HRCT. In CXR category 2 (eight cases), there were four cases diagnosed as category 1 by HRCT. Overall, discordance between CXR and HRCT was high, especially for CXR-negative and early pneumoconiosis cases. The role of CXR in screening coal workers to detect early pneumoconiosis findings should be questioned. The authors suggest using HRCT as a standard screening method instead of CXR to distinguish between normal and early pneumoconiosis.

  9. The GLOBE-Consortium: The Erasmus Computing Grid and The Next Generation Genome Viewer

    NARCIS (Netherlands)

    T.A. Knoch (Tobias)

    2005-01-01

    The Set-Up of the 20 Teraflop Erasmus Computing Grid: To meet the enormous computational needs of life-science research as well as clinical diagnostics and treatment, the Hogeschool Rotterdam and the Erasmus Medical Center are currently setting up one of the largest desktop computing grids.

  10. Micro Tools with Pneumatic Actuators for Desktop Factories

    Directory of Open Access Journals (Sweden)

    Björn HOXHOLD

    2009-10-01

    This paper presents the design, simulation and fabrication process of two novel pneumatically driven auxiliary micro tools that can be used to improve and speed up assembly processes in desktop factories. The described micro systems are designed to function as centrifugal feeders for small glass balls or as active clamping devices with small external dimensions. They are able to deliver more than six balls per second on demand to a gripper, and to align and clamp single chips in a fixed position.

  11. Seismic array processing and computational infrastructure for improved monitoring of Alaskan and Aleutian seismicity and volcanoes

    Science.gov (United States)

    Lindquist, Kent Gordon

    We constructed a near-real-time system, called Iceworm, to automate seismic data collection, processing, storage, and distribution at the Alaska Earthquake Information Center (AEIC). Phase-picking, phase association, and interprocess communication components come from Earthworm (U.S. Geological Survey). A new generic, internal format for digital data supports unified handling of data from diverse sources. A new infrastructure for applying processing algorithms to near-real-time data streams supports automated information extraction from seismic wavefields. Integration of Datascope (U. of Colorado) provides relational database management of all automated measurements, parametric information for located hypocenters, and waveform data from Iceworm. Data from 1997 yield 329 earthquakes located by both Iceworm and the AEIC. Of these, 203 have location residuals under 22 km, sufficient for hazard response. Regionalized inversions for local magnitude in Alaska yield M_L calibration curves (log A_0) that differ from the Californian Richter magnitude. The new curve is 0.2 M_L units more attenuative than the Californian curve at 400 km for earthquakes north of the Denali fault. South of the fault, and for a region north of Cook Inlet, the difference is 0.4 M_L. A curve for deep events differs by 0.6 M_L at 650 km. We expand geographic coverage of Alaskan regional seismic monitoring to the Aleutians, the Bering Sea, and the entire Arctic by initiating the processing of four short-period Alaskan seismic arrays. To show the array stations' sensitivity, we detect and locate two microearthquakes that were missed by the AEIC. An empirical study of the location sensitivity of the arrays predicts improvements over the Alaskan regional network that are shown as map-view contour plots. We verify these predictions by detecting an M_L 3.2 event near Unimak Island with one array. The detection and location of four representative earthquakes illustrates the expansion

  12. Forecasting of Energy Expenditure of Induced Seismicity with Use of Artificial Neural Network

    Science.gov (United States)

    Cichy, Tomasz; Banka, Piotr

    2017-12-01

    Coal mining in many Polish mines in the Upper Silesian Coal Basin is accompanied by high levels of induced seismicity. Mining plants keep improving their tremor monitoring methods, allowing more accurate localization of the occurring phenomena and determination of their seismic energy. Equally important is the development of ways of forecasting seismic hazards that may occur while implementing mine design projects. These methods, depending on the length of time for which the forecasts are made, can be divided into long-term, medium-term, short-term and so-called alarm forecasts. Long-term forecasts are particularly useful for the design of seam exploitation. The paper presents a method of predicting changes in the seismic energy expenditure of tremors using a properly trained artificial neural network. This method allows long-term forecasts to be made at the design stage of a mine's exploitation, thus enabling the mining work plans to be reviewed to minimize the potential for tremors. The inputs of the neural network are indicators of the changes in specific elastic strain energy occurring in selected thick, competent rock layers (tremor-prone layers). Energy changes taking place in one or more tremor-prone layers are considered. These indicators describe only the specific elastic strain energy changes accumulating in the rock as a consequence of the mining operation, but do not determine the amount of energy released during the destruction of a given volume of rock. In this process, the potential energy of elastic strain transforms into other, non-measurable energy types, including the seismic energy of recorded tremors. In this way, potential energy changes affect the observed induced seismicity. The parameters used characterize increases (declines) of specific energy, separated into those occurring before and after the hypothetical failure of the rock. Additional input information is an index characterizing the rate of
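
    The abstract gives the inputs (strain-energy-change indicators split around the hypothetical failure, plus a rate index) but no network details; a minimal scikit-learn sketch under assumed sizes and synthetic data (everything numeric here is illustrative, not the paper's model or data):

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    # Inputs: assumed 4 indicators of specific elastic-strain-energy change in
    # tremor-prone layers; target: assumed log10 of seismic energy expenditure.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 4))                       # dummy energy indicators
    y = 4.0 + 0.8 * X[:, 0] - 0.3 * X[:, 2] + rng.normal(0.0, 0.1, size=500)

    net = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
    net.fit(X[:400], y[:400])                           # train on 400 panels
    print("log10(E) forecast:", net.predict(X[400:403]))
    ```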

  13. Object Classification Based on Analysis of Spectral Characteristics of Seismic Signal Envelopes

    Science.gov (United States)

    Morozov, Yu. V.; Spektor, A. A.

    2017-11-01

    A method is proposed for classifying moving objects that produce a seismic effect on the ground surface, based on statistical analysis of the envelopes of the received signals. The values of the components of the amplitude spectrum of the envelopes, obtained by applying Hilbert and Fourier transforms, are used as classification criteria. Examples illustrating the statistical properties of the spectra and the operation of the seismic classifier are given for an ensemble of objects of four classes (person, group of people, large animal, vehicle). It is shown that the computational procedures for processing the seismic signals are quite simple and can therefore be used in real-time systems with modest requirements for computational resources.
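
    The feature pipeline described (Hilbert-transform envelope, then the Fourier amplitude spectrum of that envelope) is easy to sketch; the nearest-centroid back end and all data below are illustrative assumptions, since the paper's statistical classifier is not detailed here:

    ```python
    import numpy as np
    from scipy.signal import hilbert

    def envelope_spectrum(sig, n_feat=16):
        """Features per the described method: amplitude spectrum of the
        signal envelope obtained via the Hilbert transform."""
        env = np.abs(hilbert(sig))                 # envelope of seismic signal
        spec = np.abs(np.fft.rfft(env - env.mean()))
        return spec[:n_feat] / (np.linalg.norm(spec[:n_feat]) + 1e-12)

    def classify(sig, centroids):
        """Nearest-centroid decision over class-mean feature vectors
        (illustrative back end only)."""
        f = envelope_spectrum(sig)
        return min(centroids, key=lambda c: np.linalg.norm(f - centroids[c]))

    # Hypothetical class templates, e.g. averaged from labelled recordings:
    rng = np.random.default_rng(1)
    centroids = {"person": rng.random(16), "vehicle": rng.random(16)}
    print(classify(rng.standard_normal(2048), centroids))
    ```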

  14. Architectures for single-chip image computing

    Science.gov (United States)

    Gove, Robert J.

    1992-04-01

    This paper will focus on the architectures of VLSI programmable processing components for image computing applications. TI, the maker of industry-leading RISC, DSP, and graphics components, has developed an architecture for a new-generation of image processors capable of implementing a plurality of image, graphics, video, and audio computing functions. We will show that the use of a single-chip heterogeneous MIMD parallel architecture best suits this class of processors--those which will dominate the desktop multimedia, document imaging, computer graphics, and visualization systems of this decade.

  15. Bio-coal briquettes using low-grade coal

    Science.gov (United States)

    Estiaty, L. M.; Fatimah, D.; Widodo

    2018-02-01

    The technology of using briquettes for fuel has been widely adopted in many countries for both domestic and industrial purposes. Common types of briquette are coal, peat, charcoal, and biomass. Several studies have been carried out on the production and use of briquettes. Recent research shows that mixing coal and biomass yields an environmentally friendly briquette with better combustion and physical characteristics. This type of briquette is known as the bio-coal briquette. Bio-coal briquettes are made from agricultural waste and coal, which are readily available, cheap and affordable. Researchers make these bio-coal briquettes with different aims and objectives, depending on the issues to be addressed, e.g. utilizing agricultural waste as an alternative energy source to replace fossil fuels whose reserves are depleting, adding coal to biomass to raise the calorific value of the bio-coal briquette, and adding biomass to coal to improve its chemical and physical properties. In our research, bio-coal briquettes are made to utilize low-grade coal. The biomass we use, however, differs from that used in past research because it has undergone fermentation. The benefits of using such biomass are: (1) fermentation turns the hemicellulose into a simpler form, so that the combustion activation energy decreases while the calorific value increases; (2) the enzymes produced bind heavy metals from the coal as co-factors, leaving the metals in environmentally friendly forms.

  16. Physical properties of the crust along the seismic refraction profile Vrancea'99

    International Nuclear Information System (INIS)

    Bala, A.; Raileanu, V.; Popa, M.

    2002-01-01

    The seismic refraction project VRANCEA'99 is a contribution to the German-Romanian Collaborative Research Center (CRC) 461, 'Strong Earthquakes - a Challenge for Geosciences and Civil Engineering', which was launched by the University of Karlsruhe, Germany, in collaboration with various research institutes in Romania. Preparations started in 1998, and the field work was carried out in May 1999. The project was jointly organized by the Geophysical Institute of Karlsruhe University, the GeoForschungsZentrum (GFZ) Potsdam, Germany, and the National Institute for Earth Physics in Bucharest, Romania. Seismic and seismological data recorded on this profile are used to compute a reliable continuous distribution of seismic P-wave velocity with depth. Nine of the explosions from the VRANCEA'99 profile were also recorded at the telemetered stations of the national seismic network. These explosions were located as seismic events using seismological data together with data from recording sites with 3D sensors deployed along the profile. Time-distance graphs are used to derive the continuous distribution of velocity between 5 km and 45 km depth by the Wiechert-Herglotz method. The 2D crustal model obtained on the VRANCEA'99 profile was introduced as the starting model for density modeling along the refraction profile. The measured Bouguer anomaly values along the profile were introduced into the model for comparison with the computed Bouguer anomaly. After several iterations, the computed Bouguer anomaly matches the observed Bouguer anomaly along the profile well enough. This corroborates the chosen model (obtained from seismic forward and inverse modeling) with an independent method of geophysical modeling. Geologic and tectonic implications of the obtained density model are discussed. (authors)
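
    For reference, the classical Wiechert-Herglotz inversion the abstract invokes: given a travel-time curve T(x) with ray parameter p(x) = dT/dx, the ray that emerges at distance X_1 (where p(X_1) = p_1) bottoms at the depth given below.

```latex
% Wiechert-Herglotz inversion of a travel-time curve T(x), valid for
% velocity increasing monotonically with depth:
z(p_1) = \frac{1}{\pi} \int_{0}^{X_1}
         \cosh^{-1}\!\left(\frac{p(x)}{p_1}\right) dx ,
\qquad
v(z(p_1)) = \frac{1}{p_1}.
```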

  17. Computational intelligence approach for NOx emissions minimization in a coal-fired utility boiler

    International Nuclear Information System (INIS)

    Zhou Hao; Zheng Ligang; Cen Kefa

    2010-01-01

    The current work presents a computational intelligence approach to minimizing NOx emissions in a 300 MW dual-furnace coal-fired utility boiler. The fundamental idea behind this work comprises NOx emissions characteristics modeling and NOx emissions optimization. First, an objective function estimating the NOx emissions characteristics from nineteen operating parameters of the studied boiler was represented by a support vector regression (SVR) model. Second, four levels of primary air (PA) velocities and six levels of secondary air (SA) velocities were regulated using particle swarm optimization (PSO) so as to achieve low-NOx combustion. To reduce the time demand, a more flexible stopping condition was used to improve the computational efficiency without loss of quality in the optimization results. The results showed that the proposed approach provides an effective way to reduce NOx emissions from 399.7 ppm to 269.3 ppm, which was much better than a genetic algorithm (GA) based method and slightly better than an ant colony optimization (ACO) based approach reported in earlier work. The main advantage of PSO was that its computational cost, typically less than 25 s on a PC, is much less than that required by ACO. This means the proposed approach is more applicable to online and real-time applications for NOx emissions minimization in actual power plant boilers.
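
    A hedged sketch of the two-step scheme (SVR surrogate, then PSO search); the data, bounds and PSO constants below are synthetic placeholders, not the boiler's operating records:

```python
# Fit an SVR model of NOx emissions from adjustable air velocities, then
# search those velocities with a small global-best particle swarm.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(1)

# Toy history: 10 adjustable air velocities (4 PA + 6 SA) -> NOx in ppm.
X = rng.uniform(0.0, 1.0, size=(300, 10))
y = 400.0 - 80.0 * X[:, 0] + 60.0 * (X[:, 5] - 0.4) ** 2 + rng.normal(0, 5, 300)

surrogate = SVR(C=100.0, epsilon=1.0).fit(X, y)

# Minimal particle swarm minimizing the surrogate prediction.
n_particles, n_iters, w, c1, c2 = 30, 200, 0.7, 1.5, 1.5
pos = rng.uniform(0.0, 1.0, size=(n_particles, 10))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), surrogate.predict(pos)
gbest = pbest[np.argmin(pbest_val)]

for _ in range(n_iters):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0.0, 1.0)        # keep within operating bounds
    val = surrogate.predict(pos)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], val[improved]
    gbest = pbest[np.argmin(pbest_val)]

print("predicted minimum NOx (ppm):", surrogate.predict(gbest[None, :])[0])
```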

  18. Coal

    International Nuclear Information System (INIS)

    Teissie, J.; Bourgogne, D. de; Bautin, F.

    2001-12-01

    World coal production amounts to 3.5 billion tons, plus 900 million tons of lignite. 50% of coal is used for power generation, 16% by the steel-making industry, 5% by cement plants, and 29% for space heating and by other industries such as carbo-chemistry. Coal reserves are enormous, about 1000 billion tons (i.e., 250 years of consumption at the present rate), but their exploitation will be in competition with less costly and less polluting energy sources. This document treats all aspects of coal: origin, composition, calorific value, classification, resources, reserves, production, international trade, sectoral consumption, cost, retail price, safety aspects of coal mining, environmental impacts (solid and gaseous effluents), the different technologies of coal-fired power plants and their relative efficiencies, and alternative solutions for the recovery of coal energy (fuel cells, liquefaction). (J.S.)

  19. CoalVal-A coal resource valuation program

    Science.gov (United States)

    Rohrbacher, Timothy J.; McIntosh, Gary E.

    2010-01-01

    CoalVal is a menu-driven Windows program that produces cost-of-mining analyses of mine-modeled coal resources. Geological modeling of the coal beds and some degree of mine planning, from basic prefeasibility to advanced, must already have been performed before this program can be used. United States Geological Survey mine planning is done from a very basic, prefeasibility standpoint, but the accuracy of CoalVal's output is a reflection of the accuracy of the data entered, both for mine costs and mine planning. The mining cost analysis is done by using mine cost models designed for the commonly employed surface and underground mining methods utilized in the United States. CoalVal requires a Microsoft Windows 98 or Windows XP operating system and a minimum of 1 gigabyte of random access memory to perform operations. It will not operate on Microsoft Vista, Windows 7, or Macintosh operating systems. The program will summarize the evaluation of an unlimited number of coal seams, haulage zones, tax entities, or other area delineations for a given coal property, coalfield, or basin. When the reader opens the CoalVal publication from the USGS website, options are provided to download the CoalVal publication manual and the CoalVal program. The CoalVal report is divided into five specific areas relevant to the development and use of the CoalVal program: 1. Introduction to CoalVal Assumptions and Concepts. 2. Mine Model Assumption Details (appendix A). 3. CoalVal Project Tutorial (appendix B). 4. Program Description (appendix C). 5. Mine Model and Discounted Cash Flow Formulas (appendix D). The tutorial explains how to enter coal resource and quality data by mining method; program default values for production, operating, and cost variables; and one's own operating and cost variables into the program. Generated summary reports list the volume of resource in short tons available for mining, recoverable short tons by mining method; the seam or property being mined

  20. The role of coal consumption in the economic growth of the Polish economy in transition

    International Nuclear Information System (INIS)

    Gurgul, Henryk; Lach, Lukasz

    2011-01-01

    The main goal of this paper is an analysis of the causal links between quarterly coal consumption in the Polish economy and GDP. For the sake of accurate computation an additional variable - employment - was also taken into account. Computations conducted for the period Q1 2000 to Q4 2009 by means of recent causality techniques confirmed the neutrality of hard coal usage with respect to economic growth. On the other hand, calculations for the pairs lignite-GDP and total coal consumption-GDP showed the existence of a significant nonlinear causality from coal usage to economic growth. This is clear evidence for claiming that lignite plays an important role in the economic growth of the Polish economy. Furthermore, each coal-related variable was found to have a nonlinear causal impact on employment. Because of the relatively short length of available time series we additionally applied bootstrap critical values. The empirical results computed by both methods did not exhibit significant differences. These results have important policy implications. In general, our findings support the hypothesis that closing hard coal mines in Poland should have no significant repercussions on economic growth. However, this does not seem to be true for lignite mines. - Research highlights: → The reduction of hard coal consumption should not hamper economic growth in Poland. → Lignite consumption is an important factor determining economic growth in Poland. → The usage of lignite and hard coal has a causal impact on employment in Poland.
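
    For illustration, a standard linear Granger-causality check in Python; note that the paper relies on nonlinear causality tests and bootstrap critical values, which this sketch does not reproduce:

```python
# Linear Granger-causality test between (toy) coal consumption and GDP
# series with statsmodels; all data here are synthetic placeholders.
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(2)
n = 40  # quarterly observations, e.g. Q1 2000 - Q4 2009

coal = rng.normal(size=n).cumsum()                  # toy lignite consumption
gdp = 0.5 * np.roll(coal, 1) + rng.normal(size=n)   # GDP driven by lagged coal

# Drop the first sample, where np.roll wrapped around.
data = pd.DataFrame({"gdp": gdp[1:], "coal": coal[1:]})

# Tests H0: "coal does not Granger-cause gdp" for lags 1..4
# (first column is the predicted variable, second the candidate cause).
grangercausalitytests(data[["gdp", "coal"]], maxlag=4)
```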

  1. Forward and adjoint spectral-element simulations of seismic wave propagation using hardware accelerators

    Science.gov (United States)

    Peter, Daniel; Videau, Brice; Pouget, Kevin; Komatitsch, Dimitri

    2015-04-01

    Improving the resolution of tomographic images is crucial to answer important questions on the nature of Earth's subsurface structure and internal processes. Seismic tomography is the most prominent approach, in which seismic signals from ground-motion records are used to infer physical properties of internal structures such as compressional- and shear-wave speeds, anisotropy and attenuation. Recent advances in regional- and global-scale seismic inversions move towards full-waveform inversions, which require accurate simulations of seismic wave propagation in complex 3D media, providing access to the full 3D seismic wavefields. However, these numerical simulations are computationally very expensive and need high-performance computing (HPC) facilities for further improving the current state of knowledge. During recent years, many-core architectures such as graphics processing units (GPUs) have been added to available large HPC systems. Such GPU-accelerated computing, together with advances in multi-core central processing units (CPUs), can greatly accelerate scientific applications. There are mainly two choices of language support for GPU cards, the CUDA programming environment and the OpenCL language standard. CUDA software development targets NVIDIA graphics cards while OpenCL has been adopted mainly for AMD graphics cards. In order to employ such hardware accelerators for seismic wave propagation simulations, we incorporated the code generation tool BOAST into the existing spectral-element code package SPECFEM3D_GLOBE. This allows us to use meta-programming of computational kernels and to generate optimized source code in both the CUDA and OpenCL languages, running simulations on either CUDA or OpenCL hardware accelerators. We show here applications of forward and adjoint seismic wave propagation on CUDA/OpenCL GPUs, validating results and comparing performance for different simulations and hardware usages.
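
    A toy illustration of the meta-programming idea, not BOAST's actual API: keep one kernel body as a template and specialize it into CUDA or OpenCL source text, so the same computation can target either runtime.

```python
# Generate CUDA or OpenCL source from one kernel template (illustrative only).
KERNEL_TEMPLATE = """
{qualifier} void scale_wavefield({global_q} float *field, float factor, int n)
{{
    int i = {index_expr};
    if (i < n) field[i] = factor * field[i];
}}
"""

def generate(target: str) -> str:
    """Return kernel source specialized for the requested GPU language."""
    if target == "cuda":
        return KERNEL_TEMPLATE.format(
            qualifier="__global__",
            global_q="",
            index_expr="blockIdx.x * blockDim.x + threadIdx.x",
        )
    if target == "opencl":
        return KERNEL_TEMPLATE.format(
            qualifier="__kernel",
            global_q="__global",
            index_expr="get_global_id(0)",
        )
    raise ValueError(target)

print(generate("cuda"))
print(generate("opencl"))
```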

  2. Coking coal outlook from a coal producer's perspective

    International Nuclear Information System (INIS)

    Thrasher, E.

    2008-01-01

    Australian mine production is recovering from massive flooding while Canadian coal shipments are limited by mine and rail capacity. Polish, Czech, and Russian coking coal shipments have been reduced and United States coking coal shipments are reaching their maximum capacity. On the demand side, the Chinese government has increased export taxes on metallurgical coal, coking coal, and thermal coal. Customers seem to be purchasing in waves and steel prices are declining. This presentation addressed the global outlook for coal as well as the challenges ahead in terms of supply and demand. Supply challenges include regulatory uncertainty; environmental permitting; labor; and geology of remaining reserves. Demand challenges include global economic uncertainty; foreign exchange values; the effect of customers making direct investments in mining operations; and freight rates. Consolidation of the coal industry continued and several examples were provided. The presentation also discussed other topics such as coking coal production issues; delayed mining permits and environmental issues; coking coal contract negotiations; and stock values of coking coal producers in the United States. It was concluded that consolidation will continue throughout the natural resource sector. tabs., figs

  3. VLF surface-impedance modelling techniques for coal exploration

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G.; Thiel, D.; O'Keefe, S. [Central Queensland University, Rockhampton, Qld. (Australia). Faculty of Engineering and Physical Systems

    2000-10-01

    New and efficient computational techniques are required for geophysical investigations of coal. This will allow automated inverse analysis procedures to be used for interpretation of field data. In this paper, a number of methods of modelling electromagnetic surface impedance measurements are reviewed, particularly as applied to typical coal seam geology found in the Bowen Basin. At present, the Impedance method and the finite-difference time-domain (FDTD) method appear to offer viable solutions although both have problems. The Impedance method is currently slightly inaccurate, and the FDTD method has large computational demands. In this paper both methods are described and results are presented for a number of geological targets. 17 refs., 14 figs.
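
    As a toy illustration of the FDTD approach mentioned above (and of why its computational demands grow quickly in 2D and 3D), a bare-bones 1D Yee-scheme sketch under assumed grid, source and conductivity values:

```python
# 1D FDTD sketch of a plane wave entering a conductive half-space, the kind
# of kernel an electromagnetic forward model builds on; values illustrative.
import numpy as np

nz, nt = 400, 1000
c0, dz = 3e8, 1.0
dt = dz / (2 * c0)                       # Courant-stable time step
eps0, mu0 = 8.854e-12, 4e-7 * np.pi

sigma = np.zeros(nz)
sigma[200:] = 0.01                       # conductive "overburden" below node 200

Ex = np.zeros(nz)
Hy = np.zeros(nz - 1)
# Lossy-medium update coefficients for the electric field.
ce = (1 - sigma * dt / (2 * eps0)) / (1 + sigma * dt / (2 * eps0))
ch = dt / (eps0 * dz) / (1 + sigma * dt / (2 * eps0))

for n in range(nt):
    Hy += dt / (mu0 * dz) * (Ex[1:] - Ex[:-1])
    Ex[1:-1] = ce[1:-1] * Ex[1:-1] + ch[1:-1] * (Hy[1:] - Hy[:-1])
    Ex[50] += np.exp(-((n - 60) / 15.0) ** 2)    # soft Gaussian source

print("field sample inside the conductive region:", Ex[250])
```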

  4. Integrated software system for seismic evaluation of nuclear power plant structures

    International Nuclear Information System (INIS)

    Xu, J.; Graves, H.L.

    1993-01-01

    The computer software CARES (Computer Analysis for Rapid Evaluation of Structures) was developed by the Brookhaven National Laboratory for the U.S. Nuclear Regulatory Commission. It represents an effort to utilize established numerical methodologies commonly employed by industry for structural safety evaluations of nuclear power plant facilities and incorporates them into an integrated computer software package operated on personal computers. CARES was developed with the objective of including all aspects of seismic performance evaluation of nuclear power structures. It can be used to evaluate the validity and accuracy of analysis methodologies used for structural safety evaluations of nuclear power plants by various utilities. CARES has a modular format, each module performing a specific type of analysis. The seismic module integrates all the steps of a complete seismic analysis into a single package with many user-friendly features such as interactivity and quick turnaround. Linear structural theory and pseudo-linear convolution theory are utilized as the bases for the development, with special emphasis on the nuclear regulatory requirements for structural safety of nuclear plants. The seismic module is organized into eight options, each performing a specific step of the analysis, with most input/output interfacing processed by the general manager. Finally, CARES provides comprehensive post-processing capability for displaying results graphically or in tabular form so that direct comparisons can easily be made. (author)
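
    As a hedged sketch of the kind of step such a seismic module performs (not CARES's actual implementation), the classical Newmark average-acceleration integration of a linear single-degree-of-freedom oscillator under a ground-acceleration record:

```python
# Linear SDOF seismic response via the incremental Newmark method;
# the ground-motion record here is synthetic white noise.
import numpy as np

def newmark_sdof(ag, dt, freq=2.0, zeta=0.05, beta=0.25, gamma=0.5):
    """Relative displacement response of a linear SDOF oscillator."""
    w = 2 * np.pi * freq
    m, c, k = 1.0, 2 * zeta * w, w ** 2
    u = np.zeros_like(ag); v = np.zeros_like(ag); a = np.zeros_like(ag)
    a[0] = -ag[0]                                  # initial equilibrium
    keff = k + gamma * c / (beta * dt) + m / (beta * dt ** 2)
    for i in range(len(ag) - 1):
        dp = (-m * (ag[i + 1] - ag[i])
              + m * (v[i] / (beta * dt) + a[i] / (2 * beta))
              + c * (gamma * v[i] / beta + dt * a[i] * (gamma / (2 * beta) - 1)))
        du = dp / keff
        dv = (gamma * du / (beta * dt) - gamma * v[i] / beta
              + dt * a[i] * (1 - gamma / (2 * beta)))
        da = du / (beta * dt ** 2) - v[i] / (beta * dt) - a[i] / (2 * beta)
        u[i + 1], v[i + 1], a[i + 1] = u[i] + du, v[i] + dv, a[i] + da
    return u

dt = 0.01
ag = 0.1 * 9.81 * np.random.default_rng(3).normal(size=2000)  # toy record
print("peak relative displacement:", np.abs(newmark_sdof(ag, dt)).max())
```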

  5. Coal contract cost reduction through resale of coal

    International Nuclear Information System (INIS)

    Simon, R.

    1990-01-01

    The weak coal market of the 1980s has enabled utilities and other users of coal to enjoy stable or falling prices for coal supplies. Falling coal prices stimulated the renegotiation of numerous coal contracts in recent years, as buyers looked to take advantage of lower fuel prices available in the marketplace. This paper examines the use of coal resale transactions as a means of reducing fuel costs, and analyzes the benefits and risks associated with such transactions

  6. Mini-Sosie - a new concept in high-resolution seismic surveys

    Energy Technology Data Exchange (ETDEWEB)

    Wiles, C J

    1977-12-01

    Mini-Sosie is a new approach to high-resolution reflection seismics using a non-dynamite source. The basic principle is to use an ordinary earth tamper to produce a long-duration pseudo-random input pulse train. Returning signals from suitable geophone arrays are decoded in real time by crosscorrelation with the reference signal recorded from a source-sensor attached to the tamper plate. Relatively weak signals are stacked until sufficient amplitude is obtained; most noise is phased out during the decoding process while in-phase seismic events are added, resulting in good signal-to-noise ratios. The resulting output is a standard field seismogram. The source is relatively quiet and surface damage is insignificant, thereby avoiding environmental restrictions. Mini-Sosie is especially useful for shallow investigations to one second (two-way time) and has a wide range of applications from shallow oil and gas exploration, coal and hard-mineral exploration to hydrology and engineering studies.
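
    A minimal numeric sketch of the decoding principle described above, with synthetic signals and placeholder values throughout:

```python
# Crosscorrelate the received trace with the recorded source pulse train so
# in-phase reflections stack up while uncorrelated noise cancels.
import numpy as np

rng = np.random.default_rng(4)
fs, n = 500, 4000                        # Hz, samples (8 s record)

# Pseudo-random tamper impacts: a sparse random impulse train.
source = np.zeros(n)
source[rng.choice(n - 100, size=120, replace=False)] = 1.0

# Earth response: two reflectors at 0.2 s and 0.45 s two-way time.
reflectivity = np.zeros(n)
reflectivity[int(0.2 * fs)] = 1.0
reflectivity[int(0.45 * fs)] = -0.6

trace = np.convolve(source, reflectivity)[:n] + rng.normal(0, 0.5, n)

# Decoding: crosscorrelation with the reference pulse train compresses the
# long pseudo-random excitation back into an impulsive seismogram.
seismogram = np.correlate(trace, source, mode="full")[n - 1:]
print("strongest arrival at lag", np.argmax(np.abs(seismogram[10:])) + 10,
      "samples (expected", int(0.2 * fs), ")")
```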

  7. Monitoring of qualitative characteristics of coal in mines and power plants

    International Nuclear Information System (INIS)

    Cervenka, M.; Krouzek, J.

    1991-01-01

    The basic qualitative characteristic of coal is its heating value, which depends on its noncombustible content and moisture. Sensors developed for coal quality monitoring include two-channel radiometric ash meters, moisture meters and neutron sulfur analyzers. They are complemented by integrating balances, automated samplers and computer techniques. A complex quality monitoring system has been implemented at the North Bohemian localities of Vrsany and Most. The gamma ash meter is fitted with a scintillation counter; the measurement is continuous and contactless. A German ash meter equipped with a Geiger-Mueller tube is also mentioned. A continuous neutron analyzer, based on radiative capture of thermal neutrons, is used for measuring the sulfur content. Also described are the method of coal weighing, the automated samplers, the central computer system and the software used. The results obtained with the implemented systems are summarized. The poor reliability of the Czechoslovak computer hardware poses problems. (M.D.). 7 figs., 6 tabs., 5 refs

  8. Bayesian seismic AVO inversion

    Energy Technology Data Exchange (ETDEWEB)

    Buland, Arild

    2002-07-01

    A new linearized AVO inversion technique is developed in a Bayesian framework. The objective is to obtain posterior distributions for P-wave velocity, S-wave velocity and density. Distributions for other elastic parameters can also be assessed, for example acoustic impedance, shear impedance and P-wave to S-wave velocity ratio. The inversion algorithm is based on the convolutional model and a linearized weak contrast approximation of the Zoeppritz equation. The solution is represented by a Gaussian posterior distribution with explicit expressions for the posterior expectation and covariance, hence exact prediction intervals for the inverted parameters can be computed under the specified model. The explicit analytical form of the posterior distribution provides a computationally fast inversion method. Tests on synthetic data show that all inverted parameters were almost perfectly retrieved when the noise approached zero. With realistic noise levels, acoustic impedance was the best determined parameter, while the inversion provided practically no information about the density. The inversion algorithm has also been tested on a real 3-D dataset from the Sleipner Field. The results show good agreement with well logs but the uncertainty is high. The stochastic model includes uncertainties of both the elastic parameters, the wavelet and the seismic and well log data. The posterior distribution is explored by Markov chain Monte Carlo simulation using the Gibbs sampler algorithm. The inversion algorithm has been tested on a seismic line from the Heidrun Field with two wells located on the line. The uncertainty of the estimated wavelet is low. In the Heidrun examples the effect of including uncertainty of the wavelet and the noise level was marginal with respect to the AVO inversion results. We have developed a 3-D linearized AVO inversion method with spatially coupled model parameters where the objective is to obtain posterior distributions for P-wave velocity, S
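
    For context, a hedged note on the standard linear-Gaussian result such an inversion builds on: with forward model d = Gm + e, prior m ~ N(μ_m, Σ_m) and noise e ~ N(0, Σ_e), the posterior is Gaussian with

```latex
% Posterior mean and covariance of the conjugate linear-Gaussian model
% (textbook result; the paper's operators and priors are not reproduced here):
\mu_{m|d} = \mu_m + \Sigma_m G^{T}\left(G \Sigma_m G^{T} + \Sigma_e\right)^{-1}
            \left(d - G \mu_m\right),
\qquad
\Sigma_{m|d} = \Sigma_m - \Sigma_m G^{T}
               \left(G \Sigma_m G^{T} + \Sigma_e\right)^{-1} G \Sigma_m .
```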

  9. Measurement and modeling of advanced coal conversion processes

    Energy Technology Data Exchange (ETDEWEB)

    Solomon, P.R.; Serio, M.A.; Hamblen, D.G.; Smoot, L.D.; Brewster, B.S. (Advanced Fuel Research, Inc., East Hartford, CT (United States) Brigham Young Univ., Provo, UT (United States))

    1991-01-01

    The overall objective of this program is the development of predictive capability for the design, scale-up, simulation, control and feedstock evaluation in advanced coal conversion devices. This program will merge significant advances made in measuring and quantitatively describing the mechanisms of coal conversion behavior with comprehensive computer codes for mechanistic modeling of entrained-bed gasification. Additional capabilities in predicting pollutant formation will be implemented and the technology will be expanded to fixed-bed reactors.

  10. Methodology to evaluate the site standard seismic motion to a nuclear facility

    International Nuclear Information System (INIS)

    Soares, W.A.

    1983-01-01

    For the seismic design of nuclear facilities, the input motion is normally defined by the predicted maximum horizontal ground acceleration and the free-field ground response spectrum. This spectrum is computed on the basis of records of strong-motion earthquakes. The pair maximum acceleration-response spectrum is called the site standard seismic motion. An overall view of the subjects involved in determining the site standard seismic motion for a nuclear facility is presented. The main topics discussed are: basic principles of seismic instrumentation; dynamic and spectral concepts; definitions of design earthquakes; fundamentals of seismology; empirical curves developed from prior seismic data; and available methodologies and recommended procedures to evaluate the site standard seismic motion. (Author)
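
    For reference, the standard definition of the response spectrum that such design motions specify; this is textbook material, not a detail of the cited methodology. For an SDOF oscillator of period T and damping ratio ζ driven by the ground acceleration a_g(t),

```latex
\ddot{u} + 2\zeta\omega\dot{u} + \omega^{2}u = -a_g(t),
\qquad \omega = \frac{2\pi}{T},
% spectral displacement and pseudo-spectral acceleration:
\qquad
S_d(T,\zeta) = \max_t \lvert u(t)\rvert,
\qquad
S_a(T,\zeta) \approx \omega^{2} S_d(T,\zeta).
```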

  11. Use of seismic attributes for sediment classification

    Directory of Open Access Journals (Sweden)

    Fabio Radomille Santana

    2015-04-01

    A study to understand the relationships between seismic attributes extracted from 2D high-resolution seismic data and the seafloor sediments of the surveyed area. As seismic attributes are features highly influenced by the medium through which the seismic waves propagate, it can be assumed that the geological nature of the seafloor can be characterised by using these attributes. Herein, a survey was performed on the continental margin of the South Shetland Islands in Antarctica, where both 2D high-resolution seismic data and sediment gravity-core samples were simultaneously acquired. A computational script was written to extract the seismic attributes from the data, which were statistically analysed with clustering analyses such as principal components analysis, dendrograms and k-means classification. The extracted seismic attributes are the amplitude, the instantaneous phase, the instantaneous frequency, the envelope, the time derivative of the envelope, the second derivative of the envelope and the acceleration of phase. Statistical evaluation showed that geological classification of the seafloor sediments is possible by associating these attributes according to their coherence. The methodologies developed here seem to be appropriate for the glacio-marine environment and the coarse-to-medium silt sediments found in the study area and may be applied to other regions with the same geological conditions.
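
    A compact sketch of this attribute-and-clustering workflow on a synthetic trace; the script, parameter values and cluster count are illustrative assumptions, not the authors' code:

```python
# Instantaneous attributes from the analytic signal, then k-means over
# standardized attribute vectors (unsupervised, as in the described workflow).
import numpy as np
from scipy.signal import hilbert
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

fs = 1000.0
t = np.arange(0, 1.0, 1 / fs)
trace = (np.sin(2 * np.pi * 60 * t) * np.exp(-3 * t)
         + 0.05 * np.random.default_rng(5).normal(size=t.size))

analytic = hilbert(trace)
envelope = np.abs(analytic)
phase = np.unwrap(np.angle(analytic))
inst_freq = np.gradient(phase) * fs / (2 * np.pi)   # instantaneous frequency, Hz
env_rate = np.gradient(envelope) * fs               # time derivative of envelope

attributes = np.column_stack([trace, envelope, inst_freq, env_rate])
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(
    StandardScaler().fit_transform(attributes))
print("samples per cluster:", np.bincount(labels))
```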

  12. Romanian seismic network

    International Nuclear Information System (INIS)

    Ionescu, Constantin; Rizescu, Mihaela; Popa, Mihaela; Grigore, Adrian

    2000-01-01

    The research in the field of seismology in Romania is mainly carried out by the National Institute for Earth Physics (NIEP). The NIEP activities are mainly concerned with fundamental research financed by research contracts from public sources and with the maintenance and operation of the Romanian seismic network. A three-stage seismic network is now operating under NIEP, designed mainly to monitor the Vrancea seismic region over a magnitude range from microearthquakes to strong events: a network of 18 short-period seismometers (S13, Teledyne Geotech Instruments, Texas); a network of 7 stations with local digital recording (PCM-5000) on magnetic tape, made up of an S13 geophone (T = 2 s) on the vertical component and SH1 geophones (T = 5 s) on the horizontal components; and a network of 28 SMA-1 accelerometers and 30 digital accelerometers (Kinemetrics K2) installed in free-field conditions in the framework of the joint German-Romanian cooperation program (CRC); the K2 instruments cover a magnitude range from 1.4 to 8.0. Since 1994, the MLR (Muntele Rosu) station has been part of the GEOFON network and is equipped with high-performance broadband instruments. At the Bucharest and Timisoara data centers, an automated and networked seismological system performs on-line digital acquisition and processing of the telemetered data. Automatic processing includes discrimination between local and distant seismic events, earthquake location and magnitude computation, and source parameter determination for local earthquakes. The results are rapidly distributed via the Internet to several seismological services in Europe and the USA, to be used in association/confirmation procedures. Plans for new developments of the network include the upgrade from analog to digital telemetry and new stations for monitoring local seismicity. (authors)

  13. Fiscal 2000 basic survey for coal resource exploration. Survey for development of new exploration technology (Exploration of shallow layers on the land - Collection of data and materials); 2000 nendo sekitan shigen kaihatsu kiso chosa shiryoshu. Shintansa gijutsu chosa kaihatsu (rikuiki senso tansa)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-09-01

    As agreed upon between NEDO (New Energy and Industrial Technology Development Organization), Japan, and Queensland, Australia, joint research was conducted on new technology for coal exploration within Queensland, and data collected during the research and related materials are compiled into this book. The book contains the Agreement for the Joint Research of New Technology in the Geophysical Exploration of Coal Resources (Japanese and English), GPS (global positioning system) survey results along 2-dimensional seismic reflection method traverse lines, GPS survey results along 3-dimensional seismic reflection method traverse lines, seismic generator vehicle inspection and repair report, geophysical logging observer's logs and test bore dip measurement data sheets, examples of outputted shot records (2-dimensional seismic reflection method), examples of outputted shot records (3-dimensional seismic reflection method), analysis and testing report on Girrah layer samples, reference literature on PRBS (pseudorandom binary sequence), collections of photographs of cores sampled by test boring (BG001, BG002, BG003, BG004), collections of other photographs, and so forth. (NEDO)

  14. Phase 1 report: the 4D seismic market from 2000 to 2003

    International Nuclear Information System (INIS)

    Sagary, C.

    2004-01-01

    This report synthesizes the phase 1 results of the joint industrial project '4D Seismic: Technologies, Economics and Issues'. This project was conducted by IFP between November 2003 and April 2004, in collaboration with Compagnie Generale de Geophysique (CGG) and sponsored by Gaz de France and 4th Wave Imaging. Phase 1 offers an objective view of the 4D seismic market over the period 2000-2003. The market has been assessed from IFP's extensive databases, which gather 115 4D projects conducted worldwide, and from interviews with seven oil companies, together representing 90% of the activity in time-lapse seismic. This study provides sales estimations and sales/project breakdowns by in-house/subcontracted activity, geography, onshore/offshore, reservoir rocks and recovery methods, technology/methodology, oil companies and service companies. The market for 4D seismic has been split into 4 segments: acquisition; processing; reservoir studies (feasibility, interpretation and seismic history matching); and borehole seismic (acquisition and processing). In addition, the market for passive seismic monitoring, another technique of seismic reservoir monitoring, has also been estimated. The main sources used to build the IFP databases were the Worldwide Global E and P Service Reports from IHS Energy, World Geophysical News, an extensive bibliographic study of more than 200 articles, abstracts and summaries, and a collaboration with CGG. For all market estimations, numbers computed from the IFP databases and from the interviews with oil companies were extrapolated from 90% to 100% to quantify the total 4D activity. The estimations obtained were not rounded, in order to preserve trends with consistent computation from one year to another and from one market segment to another, despite uncertainties of about 10%. Quality controls were performed to validate the final estimations: volumes of 4D seismic data computed from the IFP databases were checked by comparing processed data with acquired data

  15. Use of a viscoelastic model for the seismic response of base-isolated buildings

    International Nuclear Information System (INIS)

    Uras, R.A.

    1994-01-01

    Due to recent developments in elastomer technology, seismic isolation using elastomer bearings is rapidly becoming an accepted design tool to enhance structural seismic margins and to protect people and equipment from earthquake damage. With proper design of the isolators, high-energy seismic input motions are transformed into low-frequency, low-energy harmonic motions and the accelerations acting on the isolated building are significantly reduced. Several alternatives exist for modeling the isolators. This study is concerned with the use of a viscoelastic model to predict the seismic response of base-isolated buildings. The in-house finite element computer code has been modified to incorporate a viscoelastic spring element, and several simulations have been performed. The computed results have then been compared with the corresponding data recorded at the test facility
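
    One common viscoelastic idealization of an elastomer bearing is the Kelvin-Voigt element; this is stated as an assumption for illustration, since the paper's exact element formulation is not given here:

```latex
% Kelvin-Voigt bearing force: elastic spring k in parallel with dashpot c.
F(t) = k\,u(t) + c\,\dot{u}(t),
% Under harmonic motion this is equivalent to a complex stiffness with
% loss factor eta:
\qquad
k^{*}(\omega) = k\,(1 + i\,\eta), \qquad \eta = \frac{c\,\omega}{k}.
```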

  16. Dynamic provisioning of local and remote compute resources with OpenStack

    Science.gov (United States)

    Giffels, M.; Hauth, T.; Polgart, F.; Quast, G.

    2015-12-01

    Modern high-energy physics experiments rely on the extensive usage of computing resources, both for the reconstruction of measured events and for Monte Carlo simulation. The Institut für Experimentelle Kernphysik (EKP) at KIT participates in both the CMS and Belle experiments with computing and storage resources. In the upcoming years, these requirements are expected to increase due to the growing amount of recorded data and the rising complexity of the simulated events. It is therefore essential to increase the available computing capabilities by tapping into all resource pools. At the EKP institute, powerful desktop machines are available to users. Due to the multi-core nature of modern CPUs, vast amounts of CPU time are not utilized by common desktop usage patterns. Other important providers of compute capabilities are classical HPC data centers at universities or national research centers. Due to the shared nature of these installations, the standardized software stack required by HEP applications cannot be installed. A viable way to overcome this constraint and offer a standardized software environment in a transparent manner is the usage of virtualization technologies. The OpenStack project has become a widely adopted solution for virtualizing hardware and offering additional services like storage and virtual machine management. This contribution reports on the incorporation of the institute's desktop machines into a private OpenStack cloud. The additional compute resources provisioned via the virtual machines have been used for Monte Carlo simulation and data analysis. Furthermore, a concept to integrate shared, remote HPC centers into regular HEP job workflows is presented. In this approach, local and remote resources are merged to form a uniform, virtual compute cluster with a single point of entry for the user. Evaluations of the performance and stability of this setup and operational experiences are discussed.
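
    A hedged sketch of what provisioning one such worker VM could look like with the openstacksdk Python client; the cloud, image, flavor and network names are placeholders and this is not the EKP setup's actual tooling:

```python
# Boot a virtual worker node on a private OpenStack cloud via openstacksdk.
import openstack

# "ekp-desktop-cloud" is assumed to be a named cloud in clouds.yaml.
conn = openstack.connect(cloud="ekp-desktop-cloud")

image = conn.compute.find_image("worker-node-image")   # placeholder names
flavor = conn.compute.find_flavor("m1.large")
network = conn.network.find_network("batch-net")

server = conn.compute.create_server(
    name="mc-worker-01",
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{"uuid": network.id}],
)
server = conn.compute.wait_for_server(server)   # block until ACTIVE
print("worker up:", server.name, server.status)
```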

  17. Time-dependent seismic tomography

    Science.gov (United States)

    Julian, B.R.; Foulger, G.R.

    2010-01-01

    Of methods for measuring temporal changes in seismic-wave speeds in the Earth, seismic tomography is among those that offer the highest spatial resolution. 3-D tomographic methods are commonly applied in this context by inverting seismic wave arrival time data sets from different epochs independently and assuming that differences in the derived structures represent real temporal variations. This assumption is dangerous because the results of independent inversions would differ even if the structure in the Earth did not change, due to observational errors and differences in the seismic ray distributions. The latter effect may be especially severe when data sets include earthquake swarms or aftershock sequences, and may produce the appearance of correlation between structural changes and seismicity when the wave speeds are actually temporally invariant. A better approach, which makes it possible to assess what changes are truly required by the data, is to invert multiple data sets simultaneously, minimizing the difference between models for different epochs as well as the rms arrival-time residuals. This problem leads, in the case of two epochs, to a system of normal equations whose order is twice as great as for a single epoch. The direct solution of this system would require twice as much memory and four times as much computational effort as would independent inversions. We present an algorithm, tomo4d, that takes advantage of the structure and sparseness of the system to obtain the solution with essentially no more effort than independent inversions require.
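
    A schematic form of the coupled two-epoch objective just described (our notation, not necessarily the paper's): with epoch models m_1 and m_2, travel-time operators G_1 and G_2, data d_1 and d_2, and a weight λ penalizing the model difference,

```latex
\min_{m_1,\, m_2}\;
\lVert d_1 - G_1 m_1 \rVert^{2}
+ \lVert d_2 - G_2 m_2 \rVert^{2}
+ \lambda\, \lVert m_2 - m_1 \rVert^{2}
```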

  18. Australian Coal Company Risk Factors: Coal and Oil Prices

    OpenAIRE

    M. Zahid Hasan; Ronald A. Ratti

    2014-01-01

    Examination of panel data on listed coal companies on the Australian exchange over January 1999 to February 2010 suggests that market return, interest rate premium, foreign exchange rate risk, and coal price returns are statistically significant in determining the excess return on coal companies’ stock. Coal price return and oil price return increases have statistically significant positive effects on coal company stock returns. A one per cent rise in coal price raises coal company returns ...

  19. ROCKING. A computer program for seismic response analysis of radioactive materials transport AND/OR storage casks

    International Nuclear Information System (INIS)

    Ikushima, Takeshi

    1995-11-01

    The computer program ROCKING has been developed for the seismic response analysis of radioactive materials transport and/or storage casks, including rocking and sliding behavior. The main features of ROCKING are as follows: (1) the cask is treated as a rigid body; (2) rocking and sliding behavior are considered; (3) impact forces are represented by spring-dashpot models located at the impact points; (4) the friction force is calculated at the interface between the cask and the floor; (5) wire ropes restraining tip-over carry only tensile loads. The paper presents the calculation model, the governing equations, validation calculations and a user's manual. (author)
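
    A plausible reading of features (3) and (4), stated as an assumption rather than the program's documented equations: a one-sided spring-dashpot impact force at each contact point (δ is the local interpenetration) and Coulomb friction at the cask/floor interface (v is the sliding velocity, N the normal force):

```latex
F_{impact} =
\begin{cases}
k\,\delta + c\,\dot{\delta}, & \delta > 0,\\[2pt]
0, & \delta \le 0,
\end{cases}
\qquad
F_{friction} = -\mu\, N\, \operatorname{sgn}(v).
```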

  20. Coal Tar and Coal-Tar Pitch

    Science.gov (United States)

    Learn about coal-tar products, which can raise your risk of skin cancer, lung cancer, and other types of cancer. Examples of coal-tar products include creosote, coal-tar pitch, and certain preparations used to treat skin conditions such as eczema, psoriasis, and dandruff.