Directory of Open Access Journals (Sweden)
A. Carvalho
2009-01-01
Full Text Available Vince was an unusual hurricane that developed over the North Atlantic Ocean in an unexpected area in October 2005. In this work, the authors analyze its background and genesis over the ocean, making use of satellite imagery and numerical models. The impacts on the sea state are investigated both numerically and observationally. Landfall over the Iberian Peninsula is monitored with surface observations and a radar system at Algarve (Portugal).
International Nuclear Information System (INIS)
Vanyushin, I. V.; Gergel, V. A.; Gontar', V. M.; Zimoglyad, V. A.; Tishin, Yu. I.; Kholodnov, V. A.; Shcheleva, I. M.
2007-01-01
A new discrete theoretical model of the development and relaxation of a local microbreakdown in silicon avalanche photodiodes operating in the Geiger mode is developed. It is shown that the spreading resistance in the substrate profoundly affects both the amplitude of a single-photon electrical pulse and the possibility of attaining the steady-state form of avalanche breakdown, which excludes the Geiger mode of the photodiode's operation. The model is employed to interpret the experimental data obtained using test single-photon cells of avalanche photodiodes fabricated on the basis of 0.25-μm silicon technology, with deep implantation used to form the region of avalanche multiplication for the charge carriers. Excellent functional properties of the studied type of single-photon (Geiger) cell are noted. A typical amplitude characteristic of the cell for optical radiation with the wavelength λ = 0.56 μm in the irradiance range of 10⁻³ to 10² lx is presented; this characteristic indicates that the quantum efficiency of photoconversion is extremely high.
International Nuclear Information System (INIS)
Pellion, D.
2008-12-01
The genesis of the work presented in this thesis is in the field of very-high-energy astrophysics. One century ago, scientists identified a new type of messenger coming from space: cosmic rays. This radiation consists of particles (photons or others) of very high energy which bombard the Earth permanently. The passage of cosmic radiation through the Earth's atmosphere creates brief luminous flashes (5 ns) of very low intensity (1 pW), the Cherenkov flash, which becomes visible on the ground. In the current state of the art, the best detector of light is the photomultiplier tube (PMT), thanks to its sensitivity and speed, but it has drawbacks: low quantum efficiency, cost, weight, etc. We present in this thesis an alternative technology: silicon photon counters, made of photodiodes biased in Geiger mode. This operating mode makes it possible to obtain a multiplication effect comparable to that of the PMT. A physical and electrical model was developed to reproduce the behaviour of this detector. We then present an original technological process allowing the realization of these devices in the technology centre of LAAS-CNRS, with the simulation of each step of the process. We developed a scheme for the electrical characterization of the device, from static to dynamic mode, in order to check conformity with the SILVACO simulations and with the initial model. The results are already excellent for a first prototype and are comparable with those published in the literature. These silicon devices can replace photomultipliers in all their applications; the applications are thus very numerous, and the market for these detectors is growing rapidly. We present a first astrophysical experiment, installed at the 'Pic du Midi' site, which detected Cherenkov flashes from cosmic rays with this new semiconductor technology. (author)
In hoc signo vinces. De geschiedschrijving van de Godsdienstwetenschap.
Molendijk, Arie L.
2003-01-01
Summary: In hoc signo vinces. The Historiography of the Science of Religion. How is the history of the science of religion to be written? Various recent books on the history of the scholarly study of religion – prominent among them Hans G. Kippenberg's Discovering Religious History in the Modern Age
Regioselective hydroarylations and parallel kinetic resolution of Vince lactam.
Kamlet, Adam S; Préville, Cathy; Farley, Kathleen A; Piotrowski, David W
2013-09-27
Two regioselective and complementary hydroarylation reactions of an unsymmetrical cyclic olefin have been developed. The products can be transformed in one step into constrained γ-amino acids. Regioselective arylation of Vince lactam is controlled by the choice of phosphine ligand enantiomer and the substituent on the amide nitrogen atom. The method was extended to a general regiodivergent parallel kinetic resolution of the racemic lactam. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Electronic structure effects of amide group: Vince lactam
Novak, Igor; Kovač, Branka
2005-03-01
HeI photoelectron spectrum of 2-azabicyclo[2.2.1]hept-5-en-3-one (Vince lactam) has been measured. The assignment of the spectrum was made by comparison with photoelectron spectra of related compounds and by taking into account the lactam's molecular structure. The analysis of the electronic structure of amide group, in terms of inductive and conjugative effects, is presented on the basis of photoelectron spectroscopic data.
Single hydration of the peptide bond: the case of the Vince lactam.
Écija, Patricia; Basterretxea, Francisco J; Lesarri, Alberto; Millán, Judith; Castaño, Fernando; Cocinero, Emilio J
2012-10-18
2-Azabicyclo[2.2.1]hept-5-en-3-one (ABH or Vince lactam) and its monohydrated complex (ABH···H₂O) have been observed in a supersonic jet by Fourier transform microwave spectroscopy. ABH is broadly used in the synthesis of therapeutic drugs, whereas the ABH···H₂O system offers a simple model to explain the conformational preferences of water linked to a constrained peptidic bond. A single predominant form of the Vince lactam and its singly hydrated complex have been detected, determining the rotational constants, centrifugal distortion constants, and nuclear quadrupole coupling tensor. The monohydrated complex is stabilized by two hydrogen bonds (C═O···H-O and N-H···O) closing a six-membered ring. The complexation energy has been estimated to be ∼10 kJ mol⁻¹ from experimental results. In addition, the observed structure in the gas phase has been compared with solid-phase diffraction data. The structural parameters and binding energies of ABH···H₂O have also been compared with similar molecules containing peptide bonds. Ab initio (MP2) and density functional (M06-2X and B3LYP) methods have supported the experimental work, describing the rotational parameters and conformational landscape of the title compound and its singly hydrated complex.
Erala, Siiri
2009-01-01
On the use of drugs that enhance academic performance among students at US higher-education institutions. The medical journal British Medical Journal published an article on academic doping by the Australian researcher Vince Cakic. With comments on the topic by Kristjan Port, director of the Institute of Health Sciences and Sports at Tallinn University.
An Inexpensive Coincidence Circuit for the Pasco Geiger Sensors
Fichera, F; Librizzi, F; Riggi, F
2005-01-01
A simple coincidence circuit was devised to carry out educational coincidence experiments involving the use of Geiger counters. The system was tested with commercially available Geiger sensors from PASCO and is intended to be used in collaboration with high school students and teachers.
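The main background in such a two-counter coincidence experiment is chance coincidences, which follow a standard textbook relation. A minimal sketch in Python (the rates and resolving time below are illustrative values, not numbers from the paper):

```python
def accidental_rate(r1_hz, r2_hz, tau_s):
    """Chance-coincidence rate of two independent counters:
    R_acc = 2 * tau * R1 * R2, tau being the coincidence resolving time."""
    return 2.0 * tau_s * r1_hz * r2_hz

# Two counters at 20 counts/s each with a 1 ms resolving time:
print(round(accidental_rate(20.0, 20.0, 1e-3), 3))   # prints 0.8
```

Comparing this figure to the measured coincidence rate tells students whether they are seeing genuine air-shower correlations or chance overlaps.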
Updated world map of the Köppen-Geiger climate classification
Directory of Open Access Journals (Sweden)
M. C. Peel
2007-10-01
Full Text Available Although now over 100 years old, the classification of climate originally formulated by Wladimir Köppen and modified by his collaborators and successors is still in widespread use. It is widely used in school and undergraduate teaching on climate. It is also still in regular use by researchers across a range of disciplines as a basis for climatic regionalisation of variables and for assessing the output of global climate models. Here we have produced a new global map of climate using the Köppen-Geiger system, based on a large global data set of long-term monthly precipitation and temperature station time series. Climatic variables used in the Köppen-Geiger system were calculated at each station and interpolated between stations using a two-dimensional (latitude and longitude) thin-plate spline with tension onto a 0.1°×0.1° grid for each continent. We discuss some problems in dealing with sites that are not uniquely classified into one climate type by the Köppen-Geiger system and assess the outcomes on a continent by continent basis. Globally the most common climate type by land area is BWh (14.2%, hot desert), followed by Aw (11.5%, tropical savannah). The updated world Köppen-Geiger climate map is freely available electronically in the Supplementary Material Section.
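The per-station classification step described above can be illustrated with a much-simplified sketch of the top-level Köppen-Geiger letter assignment (Python). This is not the full rule set: the dryness threshold here uses only the "no dominant precipitation season" form 2·MAT + 14, and the 0 °C cold-month boundary between C and D is one of two conventions in use (Köppen's original was −3 °C):

```python
def koppen_main_class(monthly_t_c, monthly_p_mm):
    """Top-level Koppen-Geiger letter (A/B/C/D/E) for one station.

    monthly_t_c: 12 mean monthly temperatures (deg C)
    monthly_p_mm: 12 monthly precipitation totals (mm)
    Simplified: dryness threshold 2*MAT + 14 regardless of seasonality.
    """
    t_hot, t_cold = max(monthly_t_c), min(monthly_t_c)
    mat = sum(monthly_t_c) / 12.0      # mean annual temperature
    map_mm = sum(monthly_p_mm)         # mean annual precipitation
    if map_mm < 10.0 * (2.0 * mat + 14.0):
        return "B"                     # arid (tested before the others)
    if t_hot < 10.0:
        return "E"                     # polar
    if t_cold >= 18.0:
        return "A"                     # tropical
    if t_cold > 0.0:
        return "C"                     # temperate
    return "D"                         # cold

# A hot, very dry station classifies as arid:
print(koppen_main_class([22] * 12, [5] * 12))   # prints B
```

The full scheme then appends second and third letters (precipitation regime and summer heat), which follow the same pattern of threshold tests on the monthly series.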
Stereoselective synthesis of 2'-fluoro-6'-methylene carbocyclic adenosine via Vince lactam.
Singh, Uma S; Mishra, Ram C; Shankar, Ravi; Chu, Chung K
2014-05-02
2'-Fluoro-6'-methylene carbocyclic adenosine (FMCA) is a potent and selective inhibitor of wild-type as well as drug-resistant hepatitis B virus (HBV) mutants. FMCA demonstrated excellent anti-HBV activity against both adefovir-resistant and lamivudine-resistant double (rtL180M/rtM204V) mutants as well as lamivudine/entecavir triple mutants (L180M+S202G+M204V) in vitro. Its monophosphate prodrug (FMCAP) demonstrated a greater than 12-fold increase of anti-HBV activity in comparison to that of the nucleoside, without elevation of cellular toxicity. In a preliminary in vivo study in chimeric mice harboring the lamivudine/entecavir triple mutant, FMCAP effectively reduced HBV viral load, while entecavir was not effective. Therefore, it was of great interest to develop an efficient synthetic procedure to support the preclinical investigation. In this article, a new approach for the synthesis of FMCA from a readily available starting material (Vince lactam) in 16 steps is described. An efficient and practical methodology for the stereospecific preparation of a versatile carbocyclic key intermediate, D-2'-fluoro-6'-methylene cyclopentanol 14, has been developed via a diazotization, elimination, stereoselective epoxidation, fluorination, and oxidation-reduction sequence from the Vince lactam in 14 steps. The utility of D-2'-fluoro-6'-methylene cyclopentanol 14 is demonstrated in the preparation of FMCA using the Mitsunobu coupling to introduce the adenine base to synthesize the final nucleoside.
Geiger-Mueller counters for measuring tritium
International Nuclear Information System (INIS)
Kostadinov, K.N.; Yanev, Y.I.; Todorovsky, D.S.
1978-01-01
In the course of developing a procedure for easy and inexpensive assay of low ³H activity in water samples, pure-acetylene filling of GM counters was carried out. The counters used were of the Johnston type, with a stainless steel cathode and a tungsten anode wire. When filled with pure acetylene, synthesized in a specially constructed vacuum apparatus, they showed very good characteristics in the GM region. In the range of acetylene pressures of 40-100 mm Hg, the length of the plateau varied from 150 to 250 V, and there was a clear dependence of the plateau length on the acetylene pressure in the counter. The same was true of the threshold and working voltage. Increasing acetylene pressure led to a certain increase in the background of the counter, probably due to photosensitivity. When using acetylene pressures exceeding 70 mm Hg, the increase in the background was negligible. The slope of the plateau was usually not more than 2-3%/100 V, and the dead time determined by the Stever method was 150 μs. The obtained characteristics of the counter support the conclusion that acetylene can be used as a pure filling gas of Geiger counters to measure tritium. (K.M.)
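A dead time of this magnitude matters at the count rates a GM tube can reach; the standard non-paralyzable correction is a one-liner. A minimal Python sketch (the 150 μs value is illustrative, taken on the assumption that the dead time reported above is in microseconds):

```python
def true_rate(measured_rate_hz, dead_time_s):
    """Non-paralyzable dead-time correction: n = m / (1 - m * tau)."""
    m, tau = measured_rate_hz, dead_time_s
    if m * tau >= 1.0:
        raise ValueError("measured rate saturates the counter")
    return m / (1.0 - m * tau)

# A GM counter with tau = 150 microseconds reading 1000 counts/s:
print(round(true_rate(1000.0, 150e-6), 1))   # prints 1176.5
```

At 1000 counts/s the counter is already losing about 15% of the true events, which is why low-activity tritium assays keep rates well below this regime.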
Aspects of halogen-quenched Geiger-Muller counters with a mica window
International Nuclear Information System (INIS)
Gorski, M.S.; Bruzinga, W.A.
1990-09-01
We present the development of a Geiger-Muller counter model similar to the Philips ZP 1410. The prototype has a cylindrical shape with 37 mm of effective length and a mica window of 1.5 to 2.0 mg/cm² thickness with a useful diameter of 19.8 mm. For the window preparation, a special cutting technique was developed. Basically, two types of quenching agents, bromine and chlorine, were studied. Due to the highly corrosive nature of these gases, we worked with surface treatments of the cathode: electropolishing, chemical passivation, and hard chrome and nickel coating. Our main objective was to obtain a Geiger-Muller detector with an operational plateau over 200 V, a working voltage above 600 V and a sensitivity of 320 counts/s at 10⁻¹ mGy/h. (author)
60 years Geiger-Mueller counter - 40 years scintillation counter
International Nuclear Information System (INIS)
Stolz, W.; Herforth, L.
1988-01-01
This review is devoted to two anniversaries that of the invention of the Geiger-Mueller counter sixty years ago and that of the development of the scintillation counter forty years ago. Besides the history described the importance at the present time is emphasized. The advances made in further improvement of these detectors are considered. 99 refs. (author)
General Roy S. Geiger, USMC: Marine Aviator, Joint Force Commander
2007-06-01
always harmonious. For example, when she insisted that he take violin lessons at age eleven, he resisted this notion in a dramatic manner. Running... several months before returning home. There are no records of his ever playing the violin. The Geiger children learned to survive and excel under... Lieutenant Lawson H. M. Sanderson addressed these problems in his development of CAS techniques, which he refined under Geiger's guidance.
Geiger counters of gamma rays with a bismuth cathode
International Nuclear Information System (INIS)
Meunier, R.; Legrand, J.P.
1953-01-01
Geiger Muller counters present a lack of efficiency, of a few per cent, for γ radiation. In the region 0.3-1 MeV, a substantial increase of their efficiency can be obtained by a special construction of their cathode. In accordance with previous works, we constructed counters whose cathode was formed by a pleated copper wire mesh covered with Bi by electrolysis. The successive modifications, from a conventional cylindrical cathode in copper sheet metal to this type of cathode, lead to an improvement of the efficiency. (M.B.) [fr
ASIC Readout Circuit Architecture for Large Geiger Photodiode Arrays
Vasile, Stefan; Lipson, Jerold
2012-01-01
The objective of this work was to develop a new class of readout integrated circuit (ROIC) arrays to be operated with Geiger avalanche photodiode (GPD) arrays, by integrating multiple functions at the pixel level (smart-pixel or active pixel technology) in 250-nm CMOS (complementary metal oxide semiconductor) processes. In order to pack a maximum of functions within a minimum pixel size, the ROIC array is a full, custom application-specific integrated circuit (ASIC) design using a mixed-signal CMOS process with compact primitive layout cells. The ROIC array was processed to allow assembly in bump-bonding technology with photon-counting infrared detector arrays into 3-D imaging cameras (LADAR). The ROIC architecture was designed to work with either common- anode Si GPD arrays or common-cathode InGaAs GPD arrays. The current ROIC pixel design is hardwired prior to processing one of the two GPD array configurations, and it has the provision to allow soft reconfiguration to either array (to be implemented into the next ROIC array generation). The ROIC pixel architecture implements the Geiger avalanche quenching, bias, reset, and time to digital conversion (TDC) functions in full-digital design, and uses time domain over-sampling (vernier) to allow high temporal resolution at low clock rates, increased data yield, and improved utilization of the laser beam.
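The time-domain over-sampling (vernier) mentioned above can be illustrated with the generic two-oscillator vernier principle, in which the fine resolution is the *difference* of two clock periods rather than a period itself. The sketch below (Python) is a textbook illustration under assumed clock periods, not the ASIC's actual pixel circuit:

```python
def vernier_interval(coarse_counts, vernier_counts, t_slow, t_fast):
    """Reconstruct a time interval from a vernier TDC measurement.

    coarse_counts: slow-clock periods counted between start and stop
    vernier_counts: fast-oscillator cycles until the two clock edges
                    coincide after the stop signal
    The effective LSB is the period difference t_slow - t_fast, which
    can be far smaller than either clock period.
    """
    lsb = t_slow - t_fast              # effective time resolution
    return coarse_counts * t_slow + vernier_counts * lsb

# A 10 ns slow clock and a 9.9 ns fast oscillator give a 100 ps LSB:
t = vernier_interval(5, 7, 10e-9, 9.9e-9)   # 50 ns coarse + 0.7 ns fine
print(round(t * 1e9, 2))                     # prints 50.7
```

This is why the text can claim high temporal resolution at low clock rates: the 100 MHz-class clocks set only the coarse range, while the vernier interpolation sets the bin width.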
International Nuclear Information System (INIS)
Korff, Sebastian
2014-01-01
This thesis studies the creation and establishment history of the instrument first called the electron counting tube in the years 1928 and 1929. It thereby deals with the last two years of the joint work of Hans Geiger and Walther Mueller, from which the measuring instrument later renamed the Geiger-Mueller counting tube emerged. The results of this historical case study are didactically worked out and made usable for the teaching of physics in schools.
Geiger Muller (GM) detector as online monitor: an experimental study
International Nuclear Information System (INIS)
Jayan, M.P.; Pawar, V.J.; Krishnakumar, P.; Sureshkumar, M.
2014-01-01
Monitoring the inadvertent release of radioactivity into otherwise inactive liquid streams is a common requirement in the nuclear industry. In addition to conventional off-line sampling and measurement methods, nuclear facilities usually use online methods to get real-time detection of activity content in process cooling water lines and steam condensate lines. Due to its simplicity, ruggedness and cost effectiveness, the Geiger Muller counter is obviously the first choice for online application. Though GM-based monitors for such online applications have been in industrial use for a long time, practical data on the response of the detector with respect to low-level activities in the effluents is scarce in the literature. This work was carried out to fill this information gap. The data generated in these experiments may be useful in giving a realistic interpretation of the response of the existing monitors and in setting up their alarm limits.
Application of Geiger-mode photosensors in Cherenkov detectors
Energy Technology Data Exchange (ETDEWEB)
Gamal, Ahmed, E-mail: gamal.ahmed@assoc.oeaw.ac.a [Stefan Meyer Institute for Subatomic Physics of the Austrian Academy of Sciences, Vienna (Austria); Al-Azhar University, Faculty of Science, Physics Department, Cairo (Egypt); Paul, Buehler; Michael, Cargnelli [Stefan Meyer Institute for Subatomic Physics of the Austrian Academy of Sciences, Vienna (Austria); Roland, Hohler [GSI Helmholtzzentrum fuer Schwerionenforschung GmbH, Darmstadt (Germany); Johann, Marton [Stefan Meyer Institute for Subatomic Physics of the Austrian Academy of Sciences, Vienna (Austria); Herbert, Orth [GSI Helmholtzzentrum fuer Schwerionenforschung GmbH, Darmstadt (Germany); Ken, Suzuki [Stefan Meyer Institute for Subatomic Physics of the Austrian Academy of Sciences, Vienna (Austria)
2011-05-21
Silicon-based photosensors (SiPMs) working in the Geiger mode represent an elegant solution for the readout of particle detectors working at low light levels, like Cherenkov detectors. Especially the insensitivity to magnetic fields makes this kind of sensor suitable for modern detector systems in subatomic physics, which usually employ magnets for momentum resolution. We are characterizing SiPMs of different manufacturers in order to select sensors and find optimum operating conditions for given applications. Recently we designed and built a light concentrator prototype with 8×8 cells to increase the active photon detection area of an 8×8 SiPM (Hamamatsu MPPC S10931-100P) array. Monte Carlo studies, measurements of the collection efficiency, and tests with the MPPC were carried out. The status of these developments is presented.
Wolfgang Geiger (17 July 1921 - 3 July 2000)
Directory of Open Access Journals (Sweden)
Renata Boucher-Rodoni
2000-08-01
Full Text Available Wolfgang Geiger died on the 3rd July 2000, at the age of 78. He was born on July 17th 1921 in Biel; his mother died at his birth. His childhood was spent with his father, a well-known artist, partly in Ligerz, on Lake Biel, and partly in Porto Ronco in Ticino, on Lago Maggiore. After high school in Biel, he began his University studies, first at the Swiss Federal Institute of Technology in Zürich, then in Basel, where he studied under Professor A. Portmann. During his PhD a grant from the Janggen-Pöhn foundation enabled him to work for some months at the Institut des Pêches maritimes du Maroc, in Casablanca, with Dr. J. Furnestin. In 1953 he completed his PhD on the teleost fish brain. His career as a biologist began in Bern at the Eidgenössische Inspektion für Forstwesen, Jagd und Fischerei. In 1962 he was appointed head assistant (chef des travaux) at the University of Geneva, in the comparative anatomy and physiology laboratory (Dr H. J. Huggel), where he discovered the joys and the limitations of teaching. He was highly regarded as a lecturer and taught in a relaxed atmosphere of mutual respect and trust, much appreciated by his students. Professor Geiger was also the main organiser of field trips to Sète, on the French Mediterranean coast, where he was in his element living on the water. He went out on the trawlers with the students and introduced them enthusiastically to the marvels of sea fauna. He was happy during those field trips and had the knack of communicating his happiness to the students.
Directory of Open Access Journals (Sweden)
R. S. Crosbie
2012-09-01
Full Text Available The Köppen-Geiger climate classification has been used for over a century to delineate climate types across the globe. As it was developed to mimic the distribution of vegetation, it may provide a useful surrogate for making projections of the future distribution of vegetation, and hence the resultant hydrological implications, under climate change scenarios. This paper developed projections of the Köppen-Geiger climate types covering the Australian continent for 2030 and 2050 climates relative to a 1990 historical baseline climate, using 17 Global Climate Models (GCMs) and five global warming scenarios. At the highest level of classification, for a +2.4 °C future climate (the upper limit projected for 2050 relative to the historical baseline), it was projected that the area of the continent covered by:
– tropical climate types would increase from 8.8% to 9.1%;
– arid climate types would increase from 76.5% to 81.7%;
– temperate climate types would decrease from 14.7% to 9.2%;
– cold climate types would decrease from 0.016% to 0.001%.
Previous climate change impact studies on water resources in Australia have assumed a static vegetation distribution. If the change in projected climate types is used as a surrogate for a change in vegetation, then the major transition in climate from temperate to arid in parts of Australia under a drier future climate could cause indirect effects on water resources. A transition from annual cropping to perennial grassland would have a compounding effect on the projected reduction in recharge. In contrast, a transition from forest to grassland would have a mitigating effect on the projected reduction in runoff.
Development of techniques for constructing Geiger-Mueller counters
International Nuclear Information System (INIS)
Baccarelli, A.M.
1984-01-01
A systematic study of several construction techniques for Geiger-Mueller counters was carried out in order to establish the most suitable technology for this purpose. The results obtained with counters for alpha, beta and gamma rays, which were designed and built in the laboratory of São Paulo University (USP), are described. Most of the counters were built inside a Pyrex-glass envelope, and their cathodes were made of electrolytic copper or brass foils, with a silver layer deposited by a chemical process. Some counters were made with a cylindrical brass tube. Anode wires of different materials and diameters and several quenching vapors were used, and the results obtained are described. All the procedures used in the preparation of surfaces, cleaning of materials and purification of filling mixtures, as well as the procedures for evacuating and filling the counters, are described. The results obtained with self-quenching counters using soda glass and an external colloidal graphite cathode are presented, and the influence of the filling mixtures is analysed. A technology to produce reliable counters from materials and gases easily available in the country was established. It is shown that counters with an external cathode can be used when recovery times of the order of 2 μs are required. The plateaus obtained for such counters were of the order of 1000 V, with a slope of about 0.5%. (Author) [pt
Calibration of Farmer dosimeter and Geiger-Mueller counters
International Nuclear Information System (INIS)
Yudelev, M.; Jones, D.T.L.
1988-11-01
According to the protocol adopted at NAC for neutron beam calibration, a Farmer-type dosimeter is the Secondary Standard instrument used to obtain the exposure calibration factors for tissue equivalent (TE) ion chambers in a Co-60 beam. Miniature Geiger-Mueller (GM) counters are used in conjunction with the TE ion chambers to determine the gamma dose component in the mixed neutron-gamma radiation fields produced by the neutron therapy treatment system. The calibration factors for the GM counters are somewhat lower (∼10%) than previous measurements with similar counters. The orientational changes in the sensitivity of the GM counters cause a change of about 14% in the calibration factor of the Far West Technology (GM2) counter and about 2% for the Alrad (ZP1300) counters, either with or without the ⁶LiF caps. The attenuation of the Co-60 gamma rays in the ⁶LiF cap results in an increase of the calibration factor by about 2% for all counters. 2 figs., 5 refs., 4 tabs
Wang, Jianjun; Zhu, Junge; Wu, Sheng
2015-06-01
A novel (+) γ-lactamase gene (rutB) was cloned from Escherichia coli JM109 and expressed in E. coli BL21 (DE3), and the recombinant protein was characterized. The optimal conditions for the enzyme were pH 7.0 and a temperature of 30 °C, which indicated that it is a mesophilic protein. The free purified enzyme was deactivated when incubated at 50 °C for 30 min. However, the k_cat value of RutB at its optimal temperature was about 2.5 times that of the archaeal enzyme from Sulfolobus solfataricus at its optimal temperature (85 °C). After immobilization on macroporous resin using glutaraldehyde cross-linkage, the thermostability of the crude enzyme was greatly enhanced and the deactivation temperature was raised to 70 °C. After immobilization, the minimal substrate inhibition concentration for RutB also improved, from 0.75 to 1.5 M. The optimal concentrations of immobilized enzyme and substrate were determined to be 250 mg/ml and 1.5 M, with the initial reaction velocity as the response variable in batch transformations. This immobilization of RutB on macroporous resins provides another feasible approach for the preparation of optically active Vince lactam. As a member of the isochorismatase superfamily, RutB was demonstrated to be another typical γ-lactamase showing catalytic promiscuity.
Characterization of new hexagonal large area Geiger Avalanche Photodiodes
International Nuclear Information System (INIS)
Boccone, V.; Aguilar, J.A.; Della Volpe, D.; Christov, A.; Montaruli, T.; Rameez, M.; Basili, A.
2013-06-01
Photomultipliers (PMTs) are the standard detector for construction of the current generation of imaging Atmospheric Cherenkov Telescopes (IACTs). Despite impressive improvements in QE and reliability in the last years, these devices suffer from the limitation of being unable to operate in the partially illuminated sky (during full or partial moon periods), as the excess light leads to a significant increase in the rate of ageing of the devices themselves and consequently limits the life of the camera. A viable alternative is the large area Geiger-mode avalanche photodiodes (G-APDs, also known as Silicon Photomultipliers or SiPMs) that are commercially available from different producers in various types and dimensions. The sufficient maturity of this technology for application to Cherenkov astronomy has already been demonstrated by the FACT telescope. One of the camera designs under study for the 4 m Davies Cotton Telescope foresees the utilization of large area G-APDs coupled to non-imaging light concentrators. In collaboration with Hamamatsu and deriving from their current technology, we have designed a new hexagonal-shaped large area G-APD, HEX S12516, which, when coupled to a Winston cone of 24 degrees cutting angle, allows for a pixel angular resolution of 0.25 degrees for an f/D 1.4 telescope with a diameter of 4 m. The device, available in 2 different cell size configurations (50 μm and 100 μm), is divided into 4 different channels powered in common cathode mode. A temperature sensor was included for a better temperature evaluation in the characterization phase. The first 3 prototypes were fully characterized and the results are compared to the larger area devices commercially available, such as the S10985-050C (2×2 array of 3×3 mm² G-APDs). The photo-detection efficiency is measured applying the Poisson statistics method using pulsed LEDs at 7 different wavelengths from 355 to 670 nm and for different bias over-voltages (V_ov). Optical crosstalk and
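The "Poisson statistics method" for the photo-detection efficiency mentioned above rests on the zero-class relation: if a fraction f of identical LED pulses triggers the sensor, the mean number of detected photons per pulse is μ = −ln(1 − f). A minimal Python sketch (the 0.632 trigger fraction below is an illustrative number, not a measured one; turning μ into a PDE additionally requires the mean number of incident photons from a reference measurement):

```python
import math

def mean_detected_photons(trigger_fraction):
    """Mean detected photons per LED pulse from the Poisson zero class:
    P(0 photons) = 1 - f  =>  mu = -ln(1 - f)."""
    if not 0.0 <= trigger_fraction < 1.0:
        raise ValueError("trigger fraction must be in [0, 1)")
    return -math.log(1.0 - trigger_fraction)

# If 63.2% of identical LED flashes fire the G-APD, mu is close to 1:
print(round(mean_detected_photons(0.632), 2))   # prints 1.0
```

Because the method uses only the fraction of *non-triggering* pulses, it is insensitive to optical crosstalk and afterpulsing, which add extra avalanches but never remove a trigger.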
Cosmic Rays with Portable Geiger Counters: From Sea Level to Airplane Cruise Altitudes
Blanco, Francesco; La Rocca, Paola; Riggi, Francesco
2009-01-01
Cosmic ray count rates with a set of portable Geiger counters were measured at different altitudes on the way to a mountain top and aboard an aircraft, between sea level and cruise altitude. Basic measurements may constitute an educational activity even with high school teams. For the understanding of the results obtained, simulations of extensive…
Project and construction of a system to measure with Geiger-Mueller counter
International Nuclear Information System (INIS)
Melo, F.A. de.
1984-01-01
The project and construction of a Geiger-Mueller detection and analysis system using Brazilian electronic equipment are presented. The measuring system has a high-voltage source that provides an output voltage variable in steps of 30 V, a pulse counter, and a clock that runs simultaneously with the counter. (E.G.) [pt
Geiger mode mapping: A new imaging modality for focused ion microprobes
Energy Technology Data Exchange (ETDEWEB)
Yang, Changyi; Hougaard, Christiaan R. [School of Physics, ARC Centre for Quantum Computation and Communication Technology, University of Melbourne, Parkville, VIC 3010 (Australia); Bielejec, Edward; Caroll, Malcolm S. [Sandia National Laboratories, POB 5800, Albuquerque, NM 87185 (United States); Jamieson, David N., E-mail: d.jamieson@unimelb.edu.au [School of Physics, ARC Centre for Quantum Computation and Communication Technology, University of Melbourne, Parkville, VIC 3010 (Australia)
2015-04-01
Geiger mode detectors fabricated in silicon are used to detect incident photons with high sensitivity. They are operated with large internal electric fields so that a single electron–hole pair can trigger an avalanche breakdown which generates a signal in an external circuit. We have applied a modified version of the ion beam induced charge technique in a nuclear microprobe system to investigate the application of Geiger mode detectors to detect discrete ion impacts. Our detectors are fabricated with an architecture based on the avalanche diode structure and operated with a transient bias voltage that activates the Geiger mode. In this mode avalanche breakdown is triggered by ion impact followed by diffusion of an electron–hole pair into the sensitive volume. The avalanche breakdown is quenched by removal of the transient bias voltage which is synchronized with a beam gate. An alternative operation mode is possible at lower bias voltages where the avalanche process self-quenches and the device exhibits linear charge gain as a consequence. Incorporation of such a device into a silicon substrate potentially allows the exceptional sensitivity of Geiger mode to register an electron–hole pair from sub-10 keV donor atom implants for the deterministic construction of shallow arrays of single atoms in the substrate required for emerging quantum technologies. Our characterization system incorporates a fast electrostatic ion beam switcher gated by the transient device bias, duration 800 ns, with a time delay, duration 500 ns, that allows for both the ion time of flight and the diffusion of the electron–hole pairs in the substrate into the sensitive region of the device following ion impact of a scanned 1 MeV H microbeam. We compare images at the micron scale mapping the response of the device to ion impact operated in both Geiger mode and avalanche (linear) mode for silicon devices engineered with this ultimate-sensitivity detector structure.
Gao, Shuaihua; Zhu, Shaozhou; Huang, Rong; Li, Hongxia; Wang, Hao; Zheng, Guojun
2018-01-01
To produce promising biocatalysts, natural enzymes often need to be engineered to increase their catalytic performance. In this study, the enantioselectivity and thermostability of a (+)-γ-lactamase from Microbacterium hydrocarbonoxydans, used as the catalyst in the kinetic resolution of Vince lactam (2-azabicyclo[2.2.1]hept-5-en-3-one), were improved. Enantiomerically pure (-)-Vince lactam is the key synthon in the synthesis of antiviral drugs, such as carbovir and abacavir, which are used to fight HIV and hepatitis B virus. The work was initiated by using the combinatorial active-site saturation test strategy to engineer the enantioselectivity of the enzyme. The approach resulted in two mutants, Val54Ser and Val54Leu, which catalyzed the hydrolysis of Vince lactam to give (-)-Vince lactam with 99.2% enantiomeric excess (ee) (enantiomeric ratio [E] > 200) and 99.5% ee (E > 200), respectively. To improve the thermostability of the enzyme, 11 residues with high temperature factors (B-factors) calculated by B-FITTER or high root mean square fluctuation (RMSF) values from the molecular dynamics simulation were selected. Six mutants with increased thermostability were obtained. Finally, the mutants generated with improved enantioselectivity and the mutants evolved for enhanced thermostability were combined. Several variants showing (+)-selectivity (E value > 200) and improved thermostability were observed. These engineered enzymes are good candidates to serve as enantioselective catalysts for the preparation of enantiomerically pure Vince lactam. IMPORTANCE Enzymatic kinetic resolution of racemic Vince lactam using a (+)-γ-lactamase is the most often utilized means of resolving the enantiomers for the preparation of carbocyclic nucleoside compounds. The efficiency of the native enzymes can be improved by using protein engineering methods, such as directed evolution and rational design. In our study, two properties (enantioselectivity and thermostability) of a
Real Time Coincidence Processing Algorithm for Geiger Mode LADAR using FPGAs
2017-01-09
typical UAV communication links. A physical 64x256 Geiger-mode ladar array was integrated with an FPGA processing board running a baseline processing...implementation and results including FPGA complexity studies and algorithm performance results. Detailed FPGA utilization reports are generated for...imaging for robotics rely on synchronous time of flight (TOF) focal plane arrays (FPAs), with one example of that being the Microsoft Kinect sensor
Criticality Detection Using a Mirion Technologies DRM-2NC Remote Area Monitor Geiger-Mueller Probe
Kryskow, Adam P.
The prompt fission neutron activation and subsequent response of a DRM-2NC Geiger-Mueller probe (manufactured by Mirion Technologies) was investigated for the purpose of creating a criticality accident detection algorithm with sensitivity and false positive suppression comparable to modern criticality accident detection systems. The expected decay pattern of secondary emissions arising from the neutron induced activity of the Geiger-Mueller probe was investigated experimentally in high neutron fluence environments at research reactors operated by the University of Massachusetts Lowell, Pennsylvania State University, and the White Sands Missile Range of Los Alamos National Laboratory. Monte Carlo techniques were used to both identify key probe materials responsible for the majority of the Geiger-Mueller response and investigate the effects of boron doping to increase detector sensitivity and enhance the signal to noise ratio. Subsequently, a statistical algorithm centered on a point weighted linear regression of the combined effective half-life was developed as the basis for criticality declaration. Final testing of the system indicated that the system was capable of meeting all ANSI criticality accident criteria with sufficient sensitivity to the minimum accident of concern, an adequate response time, and an extremely low likelihood of false alarm.
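The point-weighted linear regression on the combined effective half-life described above can be sketched as follows. This is a hedged illustration only: the paper's actual weighting scheme and declaration thresholds are not given here, and a synthetic decay curve stands in for real probe data.

```python
import numpy as np

def effective_half_life(times_s, net_counts):
    """Weighted linear fit of ln(counts) vs time. For Poisson data the
    uncertainty of ln(N) is 1/sqrt(N), so each point is weighted sqrt(N).
    Returns the combined effective half-life in seconds."""
    t = np.asarray(times_s, float)
    n = np.asarray(net_counts, float)
    y = np.log(n)
    w = np.sqrt(n)                       # weight ~ 1 / sigma of ln(N)
    slope, _intercept = np.polyfit(t, y, 1, w=w)
    return np.log(2) / (-slope)          # half-life = ln(2) / decay constant

# Synthetic post-burst decay with a 37.2 s combined effective half-life
t = np.arange(0, 300, 10.0)
counts = 5000.0 * np.exp(-np.log(2) / 37.2 * t)
half_life = effective_half_life(t, counts)   # ~37.2 s on noiseless data
```

A criticality declaration would then compare the fitted effective half-life against the decay signature expected from neutron-activated probe materials.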
Wang, Jianjun; Zhao, Guogang; Zhang, Zhiwei; Liang, Qiulin; Min, Cong; Wu, Sheng
2014-08-01
At present, autotransporter-mediated surface display has opened a new dimension in the development of whole-cell biocatalysts. Here, we report the identification of a novel autotransporter, Xcc_Est, from Xanthomonas campestris pv. campestris 8004 by bioinformatic analysis and the application of Xcc_Est as an anchoring motif for surface display of γ-lactamase (Gla) from the thermophilic archaeon Sulfolobus solfataricus P2 in Escherichia coli. The localization of γ-lactamase in the cell envelope was monitored by Western blot, activity assay and flow cytometry analysis. Either the full-length or the truncated Xcc_Est could efficiently transport γ-lactamase to the cell surface. Compared with the free enzyme, the displayed γ-lactamase exhibited an optimum temperature of 30 °C rather than 90 °C, a substantial decrease of 60 °C. Under the preparative system, the engineered E. coli with autodisplayed γ-lactamase converted 100 g of racemic Vince lactam to produce 49.2 g of (-)-Vince lactam at 30 °C within 4 h. By using chiral HPLC, the ee value of the produced (-)-Vince lactam was determined to be 99.5%. The whole-cell biocatalyst exhibited excellent stability under the operational conditions. Our results indicate that E. coli with surface-displayed γ-lactamase is an efficient and economical whole-cell biocatalyst for preparing the antiviral drug intermediate (-)-Vince lactam at mild temperature, eliminating the expensive energy cost of processes performed at high temperature.
Construction techniques and working principles of external cathode Geiger-Mueller counters
International Nuclear Information System (INIS)
Sevegnani, Francisco Xavier
1996-01-01
In this paper, the construction technique and working principles of the external-cathode Geiger-Mueller counter are described in detail. During the analysis of the behavior of these counters, a new phenomenon was observed: an increase in the background rate with the applied voltage. The experiments have also shown that the pulse amplitude of these counters decreases exponentially with the counting rate. The counters built with the techniques described in this paper have shown plateaus of about 1400 V with a slope of 0.8%/100 V. (author)
Development and design of geiger counter interface circuit for extended radiation intensity range
International Nuclear Information System (INIS)
Elaraby, S.M.; Kamel, S.A.
2005-01-01
This paper focuses on the development and design of a Geiger counter interface circuit for extended radiation intensity ranges, where the amplitude sensitivity and processing time of conventional interface circuits are the main problems. The proposed interface circuit is applicable for use in either analog- or digital-based instruments, in radiation dose rate measurements from the microsievert/hour (μSv/h) up to the sievert/hour (Sv/h) range. The proposed design is based on an unconventional use of the CMOS unbuffered-logic hex-inverter device 4069UB as an analog/digital processing element. The design has been investigated by simulation for sensitivity to signal amplitude, for rejection of superimposed spurious signals that may cause erroneous output, and for the effects of temperature variations. Simulation results verify that the proposed Geiger counter interface circuit can accurately cover a wider range of radiation intensity measurements. The proposed design has been implemented and verified through calibration by the National Institute for Standards (NIS-EG)
Application of a background-compensated Geiger-Mueller counter to a survey meter
International Nuclear Information System (INIS)
Mori, C.; Kumanomido, H.; Watanabe, T.
1984-01-01
A background-compensated Geiger-Mueller counter was used as a probe for a GM survey meter to obtain a net count rate of β-rays from a radioactive source in a quick survey. Although a background counting ratio between the two parts in the counter, front and rear, varied somewhat depending on the incident direction of background γ-rays, it was possible to compensate the background counts by subtracting a part of the rear counts, which were background counts, from the front counts, which contained β-ray counts and background counts. Undesirable small pulses generated during the recovering time after a full Geiger discharge were eliminated by an anticoincidence gating method. The survey meter with this counter and a differential ratemeter is useful for obtaining a net count rate of β-rays emitted from a surface radioactive-contamination or from a source being put near the window of the counter with nearly the same accuracy in half the measuring time as compared with conventional GM counters. (orig.)
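The compensation scheme described above amounts to subtracting a scaled rear-section count from the front-section count. A minimal sketch, assuming Poisson counting statistics and a background ratio k measured beforehand in a source-free field (the counts below are invented):

```python
import math

def net_beta_rate(front_counts, rear_counts, live_time_s, k=1.0):
    """Background-compensated net beta count rate and its uncertainty.
    k is the front/rear background ratio determined without a source."""
    net = (front_counts - k * rear_counts) / live_time_s
    # Poisson error propagation: var(net counts) = front + k^2 * rear
    sigma = math.sqrt(front_counts + k ** 2 * rear_counts) / live_time_s
    return net, sigma

rate, err = net_beta_rate(front_counts=1500, rear_counts=300, live_time_s=60.0)
```

Note that the subtraction enlarges the statistical uncertainty relative to a background-free measurement, which is consistent with the abstract's claim of comparable accuracy in roughly half the measuring time rather than a free improvement.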
Evaluation of single photon and Geiger mode Lidar for the 3D Elevation Program
Stoker, Jason M.; Abdullah, Qassim; Nayegandhi, Amar; Winehouse, Jayna
2016-01-01
Data acquired by Harris Corporation's (Melbourne, FL, USA) Geiger-mode IntelliEarth™ sensor and Sigma Space Corporation's (Lanham-Seabrook, MD, USA) Single Photon HRQLS sensor were evaluated and compared to accepted 3D Elevation Program (3DEP) data and surveyed ground control to assess the suitability of these new technologies for the 3DEP. While these sensors are not currently able to collect data that meets the USGS lidar base specification, this is partly because the specification was written specifically for linear-mode systems. With modest effort on the part of the manufacturers of the new lidar systems and the USGS lidar specifications team, data from these systems could soon serve the 3DEP and its users. Many of the shortcomings noted in this study are reported to have been corrected or improved upon in the next generation of sensors.
Design of a portable dose rate detector based on a double Geiger-Mueller counter
Wang, Peng; Tang, Xiao-Bin; Gong, Pin; Huang, Xi; Wen, Liang-Sheng; Han, Zhen-Yang; He, Jian-Ping
2018-01-01
A portable dose rate detector was designed to monitor radioactive pollution and radioactive environments. The portable dose detector can measure from background radiation levels (0.1 μSv/h) to nuclear accident radiation levels (>10 Sv/h). Both the automatic switching technology of a double Geiger-Mueller counter and time-to-count technology were adopted to broaden the measurement range of the instrument. A global positioning system and the 3G telecommunication protocol were incorporated so that measurements can be reported remotely, helping to prevent radiation damage to the human body. In addition, the Monte Carlo N-Particle code was used to design the thin metal layer for energy compensation, which flattens the energy response. The portable dose rate detector has been calibrated by the standard radiation field method, and it can be used alone or in combination with additional radiation detectors.
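The time-to-count technique referred to above replaces "counts per fixed time" with "time to the first count after each detector recovery": for Poisson arrivals the waiting time is exponentially distributed with mean 1/rate, so the rate estimate does not paralyze at high intensities. A minimal sketch of the idea (illustrative, not the instrument's firmware; the wait times are invented):

```python
import statistics

def time_to_count_rate(wait_times_s):
    """Time-to-count rate estimate: after each detector recovery, measure
    the waiting time until the first count. For Poisson arrivals the mean
    waiting time is 1/rate, so rate = 1 / <t>, immune to dead-time paralysis
    because the tube is always freshly recovered when timing starts."""
    return 1.0 / statistics.mean(wait_times_s)

# Invented example: four recovery cycles in a field averaging 0.5 ms waits
rate = time_to_count_rate([0.4e-3, 0.6e-3, 0.5e-3, 0.5e-3])  # -> 2000.0 Hz
```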
International Nuclear Information System (INIS)
Sevegnani, F.X.
1988-01-01
The construction techniques for external-cathode (Maze) and internal-cathode Geiger-Mueller counters are described, showing the operating principles and the nature of the materials used. More than 200 counters were evaluated by analysing their characteristics. The influence of several types of guard rings was studied in order to optimize counter operating conditions. Plateaus of the order of 700 V with a slope of 0.3%/100 V for the net counting rate, and of 1400 V with a slope of 0.8%/100 V for total counts at a total pressure of 10 cmHg, were obtained. A counter for β detection, using a blown-glass window at one end of the cylinder, was constructed. Long-lived counters using materials such as mica, adhesive glues, etc., were obtained. The results show that the best counter operation occurs when the counter is first evacuated to 10^-5 mmHg. (M.C.K.) [pt
Multipixel geiger-mode photon detectors for ultra-weak light sources
International Nuclear Information System (INIS)
Campisi, A.; Cosentino, L.; Finocchiaro, P.; Pappalardo, A.; Musumeci, F.; Privitera, S.; Scordino, A.; Tudisco, S.; Fallica, G.; Sanfilippo, D.; Mazzillo, M.; Condorelli, G.; Piazza, A.; Valvo, G.; Lombardo, S.; Sciacca, E.; Bonanno, G.; Belluso, M.
2007-01-01
Arrays of Single Photon Avalanche Detectors (SPAD) are considered today as a possible alternative to PMTs and other semiconductor devices in several applications, like physics research, bioluminescence, Positron Emission Tomography (PET) systems, etc. We have developed and characterized a first prototype array produced by STMicroelectronics in silicon planar technology and working at low voltage (30-40 V) in Geiger mode operation. The single cell structure (size down to 20 μm) and the geometrical arrangement give rise to appealing intrinsic characteristics of the device, such as photon detection efficiency, dark count map, cross-talk effects, timing and energy resolution. New prototypes are under construction with a higher number of pixels that have a common output signal to obtain a so-called SiPM (Silicon PhotoMultiplier) configuration
Geiger-Mueller haloid counter dead time dependence on counting rate
International Nuclear Information System (INIS)
Onishchenko, A.M.; Tsvetkov, A.A.
1980-01-01
The experimental dependences of the dead time of Geiger counters (SBM-19, SBM-20, SBM-21 and SGM-19) on the counting rate are presented. The two-source method was used to determine the dead time of these counters with increased stability. The counters are connected in the usual discrete-counting circuit, with a load resistance of 50 MOhm and a coupling capacitance of 10 pF. Voltage pulses are fed to a counting device with a resolution time of 100 ns, a discrimination threshold of 3 V, an input resistance of 3.6 Ω and an input capacitance of 15 pF. The time constant of the counter's RC circuit is 50 μs
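For reference, the standard first-order two-source estimate of a non-paralyzable dead time (the textbook form; the paper's exact variant may differ) can be sketched as:

```python
def dead_time_two_source(n1, n2, n12, nb=0.0):
    """Approximate two-source dead-time estimate for a non-paralyzable
    counter. n1, n2: observed rates with each source alone; n12: observed
    rate with both sources together; nb: background rate. First-order
    approximation: tau ~ (n1 + n2 - n12 - nb) / (2 * n1 * n2)."""
    return (n1 + n2 - n12 - nb) / (2.0 * n1 * n2)

# Invented example: 10 kcps and 12 kcps sources measure 21.5 kcps together
tau = dead_time_two_source(10e3, 12e3, 21.5e3)   # ~2.08 microseconds
```

The method exploits the fact that without dead time the combined rate would equal the sum of the individual rates; the observed deficit fixes τ.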
Point Spread Function (PSF) noise filter strategy for geiger mode LiDAR
Smith, O'Neil; Stark, Robert; Smith, Philip; St. Romain, Randall; Blask, Steven
2013-05-01
LiDAR is an efficient optical remote sensing technology with applications in geography, forestry, and defense. Its effectiveness is often limited by the signal-to-noise ratio (SNR). Geiger-mode avalanche photodiode (APD) detectors operate above the critical voltage, where a single photoelectron can initiate the current surge, making the device very sensitive. These advantages come at the expense of requiring computationally intensive noise-filtering techniques. Noise affects the imaging system and reduces its capability, and common noise-reduction algorithms have drawbacks such as over-aggressive filtering or decimation to improve quality and performance. In recent years, there has been growing interest in GPUs (Graphics Processing Units) for their ability to perform massive parallel processing. In this paper, we leverage this capability to reduce processing latency. The Point Spread Function (PSF) filter algorithm is a local spatial measure that has been GPGPU-accelerated. The idea is to use a kernel density estimation technique for point clustering. We associate a local likelihood measure with every point of the input data, capturing the probability that a 3D point is a true target-return photon or noise (background photons, dark current). This process suppresses noise and allows for the detection of outliers. We apply this approach to the LiDAR noise-filtering problem, for which we have achieved a speed-up factor of 30-50 times compared to a traditional sequential CPU implementation.
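The local-likelihood idea can be illustrated with a brute-force CPU sketch of a Gaussian kernel-density score per point; the paper's GPU PSF kernel and thresholds are not reproduced here, and the radius and score cut below are invented for the example:

```python
import numpy as np

def kde_noise_filter(points, radius, min_score):
    """Score each 3D point by a Gaussian-kernel density over its neighbours;
    low-score points are flagged as noise (dark counts, solar background).
    Brute-force O(N^2) version; the paper offloads this to the GPU."""
    pts = np.asarray(points, float)
    # Pairwise squared distances between all points
    d2 = ((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1)
    sigma2 = (radius / 2.0) ** 2
    w = np.exp(-d2 / (2.0 * sigma2))
    np.fill_diagonal(w, 0.0)            # exclude each point's self-score
    score = w.sum(axis=1)               # local likelihood of being a target
    return score >= min_score

# Dense cluster of target returns plus one isolated noise point
cloud = np.array([[0, 0, 0], [0.1, 0, 0], [0, 0.1, 0], [5.0, 5.0, 5.0]])
mask = kde_noise_filter(cloud, radius=1.0, min_score=0.5)
```

The isolated point accumulates almost no neighbour weight and is rejected, while the clustered target returns pass the threshold.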
Geiger-mode APD camera system for single-photon 3D LADAR imaging
Entwistle, Mark; Itzler, Mark A.; Chen, Jim; Owens, Mark; Patel, Ketan; Jiang, Xudong; Slomkowski, Krystyna; Rangwala, Sabbir
2012-06-01
The unparalleled sensitivity of 3D LADAR imaging sensors based on single photon detection provides substantial benefits for imaging at long stand-off distances and minimizing laser pulse energy requirements. To obtain 3D LADAR images with single photon sensitivity, we have demonstrated focal plane arrays (FPAs) based on InGaAsP Geiger-mode avalanche photodiodes (GmAPDs) optimized for use at either 1.06 μm or 1.55 μm. These state-of-the-art FPAs exhibit excellent pixel-level performance and the capability for 100% pixel yield on a 32 x 32 format. To realize the full potential of these FPAs, we have recently developed an integrated camera system providing turnkey operation based on FPGA control. This system implementation enables the extremely high frame-rate capability of the GmAPD FPA, and frame rates in excess of 250 kHz (for 0.4 μs range gates) can be accommodated using an industry-standard CameraLink interface in full configuration. Real-time data streaming for continuous acquisition of 2 μs range gate point cloud data with 13-bit time-stamp resolution at 186 kHz frame rates has been established using multiple solid-state storage drives. Range gate durations spanning 4 ns to 10 μs provide broad operational flexibility. The camera also provides real-time signal processing in the form of multi-frame gray-scale contrast images and single-frame time-stamp histograms, and automated bias control has been implemented to maintain a constant photon detection efficiency in the presence of ambient temperature changes. A comprehensive graphical user interface has been developed to provide complete camera control using a simple serial command set, and this command set supports highly flexible end-user customization.
Energy Technology Data Exchange (ETDEWEB)
Fontolan, Juliana A.; Biral, Antonio Renato P., E-mail: fontolanjuliana@gmail.com.br, E-mail: biral@ceb.unicamp.br [Hospital das Clinicas (CEB/UNICAMP), Campinas, SP (Brazil). Centro de Engenharia Biomedica
2013-07-01
It is known that the number of random, unrelated events occurring in fixed time intervals follows the Poisson distribution. This work aims to study the distribution in time intervals of events resulting from the radioactive decay of atoms present in UNICAMP environments where activities involving ionizing radiation are performed. The proposal is to survey the distributions of these events at different locations of the university using a Geiger-Mueller tube. In a next step, the distributions obtained will be evaluated using non-parametric statistics (chi-square and Kolmogorov-Smirnov tests). For analyses involving correlations we intend to use the ANOVA (Analysis of Variance) statistical tool. Measurements were performed at six different places within the Campinas campus using a Geiger-Mueller counter in count mode with a time window of 20 seconds. Through the chi-square and Kolmogorov-Smirnov statistical tests, using the EXCEL program, it was observed that the distributions indeed follow a Poisson distribution. Finally, the next step is to perform analyses involving correlations using the ANOVA statistical tool.
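The chi-square test of the Poisson hypothesis used in this work can be sketched as follows (illustrative only: the window counts below are synthetic, and in a careful analysis bins with small expected frequencies should be merged before applying the test):

```python
import math
from collections import Counter

def poisson_chi2(counts_per_window):
    """Chi-square statistic and degrees of freedom for a Poisson fit to
    counts observed in fixed time windows (20 s windows in the study).
    The Poisson mean is estimated from the data, costing one parameter."""
    n = len(counts_per_window)
    mu = sum(counts_per_window) / n                   # fitted Poisson mean
    chi2 = 0.0
    for k, obs in sorted(Counter(counts_per_window).items()):
        expected = n * math.exp(-mu) * mu ** k / math.factorial(k)
        chi2 += (obs - expected) ** 2 / expected
    dof = len(set(counts_per_window)) - 2             # bins - 1 - fitted params
    return chi2, dof

# Synthetic 20-second Geiger-Mueller window counts
windows = [3, 4, 2, 5, 3, 4, 3, 2, 4, 3, 6, 2, 3, 4, 5, 3]
chi2, dof = poisson_chi2(windows)
```

The statistic is then compared against the chi-square critical value at the chosen significance level for `dof` degrees of freedom.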
International Nuclear Information System (INIS)
Neumann, K.
1989-01-01
The radioactivity levels of irradiated air were measured directly using a Geiger-Mueller counting circuit and compared to the activity of irradiated tablets consisting of solid-state malonic acid and triazole, which was determined by means of a scintillation counter. During studies in various accelerator plants in North Rhine-Westphalia, parallel measurements were carried out on the basis of the two methods described above. In order to determine the percentage shares of nitrogen and oxygen in the total amount of radioactivity, the measured values were further analysed using the linear regression procedure and the method of least squares according to Gauss. The combined use of the data obtained and suitable mathematical models permitted average values to be calculated for the activity concentrations occurring during a predetermined period of time under clinical conditions in the irradiation area. Even under unfavourable circumstances, these proved to be appreciably below the threshold values mandated by the radiation protection laws for individuals at risk of occupational radiation exposure. (orig./DG) [de
Particulate matter time-series and Köppen-Geiger climate classes in North America and Europe
Pražnikar, Jure
2017-02-01
Four years of time-series data on the particulate matter (PM) concentrations from 801 monitoring stations located in Europe and 234 stations in North America were analyzed. Using k-means clustering with distance correlation as a measure for similarity, 5 distinct PM clusters in Europe and 9 clusters across the United States of America (USA) were found. This study shows that meteorology has an important role in controlling PM concentrations, as comparison between Köppen-Geiger climate zones and identified PM clusters revealed very good spatial overlapping. Moreover, the Köppen-Geiger boundaries in Europe show a high similarity to the boundaries as defined by PM clusters. The western USA is much more diverse regarding climate zones; this characteristic was confirmed by cluster analysis, as 6 clusters were identified in the west, and only 3 were identified on the eastern side of the USA. The lowest similarity between PM time-series in Europe was observed between the Iberian Peninsula and the north Europe clusters. These two regions also show considerable differences, as the cold semi-arid climate has a long and hot summer period, while the cool continental climate has a short summertime and long and cold winters. Additionally, intra-continental examination of European clusters showed meteorologically driven phenomena in autumn 2011 encompassing a large European region from Bulgaria in the south, Germany in central Europe and Finland in the north with high PM concentrations in November and a decline in December 2011. Inter-continental comparison between Europe and the USA clusters revealed a remarkable difference between the PM time-series located in humid continental zone. It seems that because of higher shortwave downwelling radiation (≈210 W m-2) over the USA's continental zone, and consequently more intense production of secondary aerosols, a summer peak in PM concentration was observed. On the other hand, Europe's humid continental climate region experiences
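Distance correlation, the similarity measure used for the clustering above, can be computed directly from Székely's definition of double-centered distance matrices; a minimal NumPy sketch (the series below are synthetic, not PM data):

```python
import numpy as np

def distance_correlation(x, y):
    """Szekely's distance correlation between two equal-length series.
    1 - dCor can serve as the dissimilarity in a k-means-style clustering
    of time series, as in the PM study above."""
    x, y = np.asarray(x, float), np.asarray(y, float)

    def centered(v):
        d = np.abs(v[:, None] - v[None, :])           # pairwise distances
        return d - d.mean(0) - d.mean(1)[:, None] + d.mean()

    A, B = centered(x), centered(y)
    dcov2 = (A * B).mean()                            # squared dist. covariance
    dvar_x, dvar_y = (A * A).mean(), (B * B).mean()
    return np.sqrt(dcov2 / np.sqrt(dvar_x * dvar_y))

x = np.linspace(0.0, 1.0, 50)
dcor_linear = distance_correlation(x, 2.0 * x + 1.0)  # linear map -> dCor = 1
dcor_noise = distance_correlation(x, np.cos(37.0 * x))
dissimilarity = 1.0 - dcor_linear
```

Unlike Pearson correlation, distance correlation is zero only for independent series, so it also captures nonlinear association between PM time series.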
International Nuclear Information System (INIS)
Matsuda, Kazunori; Sanada, Junpei
1990-01-01
Simple methods were applied to investigate the characteristics of a Geiger-Mueller counter with Q-gas flowing at 1 atm. The propagation velocity of the photon-aided avalanche along the anode wire depends linearly on the strength of the electric field in the counter. Its fluctuation (FWHM) as a function of distance between the source position and the end point is discussed. (orig.)
Saucedo Aguiar, Álvaro Edmundo
2016-01-01
This project seeks to contribute to the development of the cocoa industry in the Vinces Canton, since this is where the best cocoa in Ecuador is found, owing to its climate and soil conditions. Nevertheless, this product has always been marketed as a raw material, that is, as beans. This characteristic has long been a limitation that keeps producers with small tracts of land devoted to cocoa cultivation from obtaining greater inc...
International Nuclear Information System (INIS)
Urquizo, Rafael; Gago, Javier; Mendoza, Pablo; Cruz-Saco, Cesar; Rojas, Jorge
2014-01-01
This article presents the implementation of a measurement system using a Geiger-Mueller (GM) detector in order to adapt it to a 99mTc generator prototype. The response signal of the designed measurement system, in terms of count rate, is linearly proportional to the variation of the measured activity between 280 and 170 mCi of 99mTc, with a relative error of ± 2.8%. However, further tests are needed to evaluate the correlation at activity levels lower than 20 mCi in order to establish an adequate range of use. (authors).
International Nuclear Information System (INIS)
Anon.
1999-01-01
In order to study the impact of the Chernobyl accident on ecosystems, Ukrainian and Swiss scientists have used a plant, Arabidopsis thaliana. They introduced into its genome a gene coding for an enzyme called β-glucuronidase; when expressed, this enzyme stains plant cells blue. The introduced gene is split between two paired chromosomes. When the plant is placed on nuclear-contaminated soil, radiation-damaged chromosomes exchange fragments and the two parts of the enzyme gene may recombine, so that the enzyme can be expressed. For low and medium contamination, biologists have found a correlation between the number of blue spots on the plant and the irradiation rate. (A.C.)
PEP-4 geiger-mode hexagonal calorimeter
International Nuclear Information System (INIS)
Wenzel, W.A.
1982-01-01
The design and performance of the calorimeter are briefly described. Design aspects include illustrations of the active volume of the detector, edge connections, module assembly and analog electronics. Performance data for cosmic rays and radiation sources, including efficiency and channel sensitivity are discussed
Unexpected Retroaldol-Aldol Reaction during O-Alkylation of Hydroxylated Vince Lactam Derivatives.
Bengtsson, Christoffer; Wetzel, Alexander; Bergman, Joakim; Brånalt, Jonas
2016-01-15
The unexpected retroaldol-aldol reaction during O-alkylation of a β-hydroxy lactam was found to be highly dependent on the temperature and shows a remarkable solvent effect. In DMF, O-alkylation is faster than retroaldol-aldol rearrangement giving exclusively products with retention of configuration. In THF, O-alkylation is slower than rearrangement, giving selectively products with inversion of stereochemistry. In DMSO, a retroaldol reaction followed by fast intramolecular proton transfer occurs to give the ring-opened aldehyde.
Energy Technology Data Exchange (ETDEWEB)
Badoni, Davide [Phy. Dep. Univ. ' Tor Vergata' , Tor Vergata Sect. INFN (Italy)]. E-mail: davide.badoni@roma2.infn.it; Altamura, Francesco [Phy. Dep. Univ. ' Tor Vergata' , Tor Vergata Sect. INFN (Italy); Basili, Alessandro [Phy. Dep. Univ. ' Tor Vergata' , Tor Vergata Sect. INFN (Italy); Bencardino, Raffaele [Phy. Dep. Univ. ' Tor Vergata' , Tor Vergata Sect. INFN (Italy); Bidoli, Vittorio [Phy. Dep. Univ. ' Tor Vergata' , Tor Vergata Sect. INFN (Italy); Casolino, Marco [Phy. Dep. Univ. ' Tor Vergata' , Tor Vergata Sect. INFN (Italy); De Carli, Anna [Phy. Dep. Univ. ' Tor Vergata' , Tor Vergata Sect. INFN (Italy); Froysland, Tom [Phy. Dep. Univ. ' Tor Vergata' , Tor Vergata Sect. INFN (Italy); Marchetti, Marcello [Phy. Dep. Univ. ' Tor Vergata' , Tor Vergata Sect. INFN (Italy); Messi, Roberto [Phy. Dep. Univ. ' Tor Vergata' , Tor Vergata Sect. INFN (Italy); Minori, Mauro [Phy. Dep. Univ. ' Tor Vergata' , Tor Vergata Sect. INFN (Italy); Picozza, Piergiorgio [Phy. Dep. Univ. ' Tor Vergata' , Tor Vergata Sect. INFN (Italy); Salina, Gaetano [Phy. Dep. Univ. ' Tor Vergata' , Tor Vergata Sect. INFN (Italy); Galper, Arkady [Moscow Engineering and Physics Institute (Russian Federation); Korotkov, Mikhail [Moscow Engineering and Physics Institute (Russian Federation); Popov, Alexander [Moscow Engineering and Physics Institute (Russian Federation)
2007-03-01
Silicon Photomultipliers (Si-PMs) consist of an array of semiconductor photodiodes joined on a common substrate and operating in limited Geiger mode. A new generation of Si-PM is currently under test at the INFN Rome Tor Vergata facilities: these consist of 5625-element, 3x3 mm2 arrays with an improved light response. These elements have been characterized. Furthermore, a functional model of the Si-PM has been developed for use in the VLSI development of front-end electronics
Energy Technology Data Exchange (ETDEWEB)
Jones, K.W. [Los Alamos National Lab., NM (United States); Browman, A. [Amparo Corp., Sante Fe, NM (United States)
1997-01-01
The BPI Model 2080 Pulsed Neutron Detector has been used for over seven years as an area radiation monitor and dose limiter at the LANSCE accelerator complex. Operating experience and changing environments over this time have revealed several vulnerabilities (susceptibility to electrical noise, paralysis in high-dose-rate fields, etc.). The identified vulnerabilities have been corrected; these modifications include component replacement and circuit design changes. The data and experiments leading to these modifications are presented and discussed. Calibration of the instrument is performed in mixed static gamma and neutron source fields. The statistical characteristics of the Geiger-Muller tubes, coupled with significantly different sensitivities to gamma and neutron doses, require that careful attention be paid to acceptable fluctuations in dose rate over time during calibration. The performance of the instrument has been modeled using simple Poisson statistics and the operating characteristics of the Geiger-Muller tubes. The results are in excellent agreement with measurements. The analysis and comparison with experimental data are presented.
Directory of Open Access Journals (Sweden)
Galo Alberto Salcedo
Full Text Available ABSTRACT: The irrigation water consumption of a soilless cucumber crop under greenhouse conditions in a humid tropical climate has been evaluated in this paper in order to improve irrigation water and fertilizer management under these specific conditions. For this purpose, a field experiment was conducted. Two trials were carried out during the years 2011 and 2014 on an experimental farm located in Vinces (Ecuador). In each trial, the complete growing cycle of a cucumber crop grown under a greenhouse was evaluated. Crop development was monitored and a good fit to a sigmoidal Gompertz-type growth function was reported. The daily water uptake of the crop was measured and related to the most relevant indoor climate variables. Two different combination methods, namely the Penman-Monteith equation and the Baille equation, were applied. However, the results obtained with these combination methods were not satisfactory due to the poor correlation between the climatic variables, especially the incoming radiation, and the crop's water uptake (WU). On the contrary, a good correlation was reported between the crop's water uptake and the leaf area index (LAI), especially in the initial crop stages. However, when the crop is fully developed, the WU stabilizes and becomes independent of the LAI. A preliminary model to simulate the water uptake of the crop was adjusted using the data obtained in the first experiment and then validated with the data of the second experiment.
Optical Communications With A Geiger Mode APD Array
2016-02-09
this state trigger a chain reaction, resulting in a large sudden spike in voltage which can be read out as a digital pulse. This pulse can be timed to...detector array is backed by a custom readout integrated circuit, or ROIC, which takes care of resetting each of the detectors after they are triggered...during readout to build up a complete time- and position-stamped map of all pixel fires. 2.1 ROIC Clocking The major unit of time for the ROIC is the
Large Format Geiger Mode Avalanche Photodiode Arrays and Readout Circuits
2017-06-01
efficiency. The pixels in early CMOS designs effectively contained a "stop watch" – circuitry operating at fast clock frequencies required to...Ghioni, A. Lacaita, C. Samori, and F. Zappa, "Avalanche photodiodes and quenching circuits for single-photon detection," Appl. Opt. 35, 1956-1976
Accelerated physical modelling of radioactive waste migration in soil
International Nuclear Information System (INIS)
Zimmie, T.F.; De, A.; Mahmud, M.B.
1994-01-01
A 100 g-tonne geotechnical centrifuge was used to study the long-term migration of a contaminant and radioactive tracer through a saturated soil medium. The use of the centrifuge simulates the acceleration of travel time in the prototype, which is N times larger than the model, by N², where N is the desired g level. For a 5 h run at 60 g, the test modelled a migration time of about 2 years for a prototype 60 times larger than the small-scale model tested. Iodine-131, used as the tracer, was injected onto the surface of the soil and was allowed to migrate with a constant head of water through the saturated soil. End-window Geiger-Mueller (G-M) tubes were used to measure the count rate of the radioactive tracer flowing through the soil. The time from the peak response of one G-M tube to the other denotes the travel time between the two points in the flow domain. The results obtained using the radioactive tracer are in good agreement with the test performed on the same model setup using potassium permanganate as tracer and with numerical flow net modelling. Radioactive tracers can be useful in the study of nonradioactive contaminants as well, offering a nonintrusive (nondestructive) method of measuring contaminant migration. (author). 18 refs., 1 tab., 7 figs
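The N² time scaling quoted above can be checked directly; this short sketch (the helper name is ours, not from the paper) reproduces the "about 2 years" figure for a 5 h run at 60 g:

```python
def prototype_time_seconds(model_time_s, g_level):
    # Seepage/diffusion time scales with N^2 in centrifuge modelling:
    # t_prototype = N**2 * t_model, where N is the g level.
    return g_level ** 2 * model_time_s

model_run = 5 * 3600                      # 5 h test at 60 g
t_proto = prototype_time_seconds(model_run, 60)
years = t_proto / (365.25 * 24 * 3600)
print(round(years, 2))                    # 2.05 -> "about 2 years"
```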
An empirical model of the high-energy electron environment at Jupiter
Soria-Santacruz, M.; Garrett, H. B.; Evans, R. W.; Jun, I.; Kim, W.; Paranicas, C.; Drozdov, A.
2016-10-01
We present an empirical model of the energetic electron environment in Jupiter's magnetosphere that we have named the Galileo Interim Radiation Electron Model version-2 (GIRE2) since it is based on Galileo data from the Energetic Particle Detector (EPD). Inside 8 RJ, GIRE2 adopts the previously existing model of Divine and Garrett because this region was well sampled by the Pioneer and Voyager spacecraft but poorly covered by Galileo. Outside of 8 RJ, the model is based on 10 min averages of Galileo EPD data as well as on measurements from the Geiger Tube Telescope on board the Pioneer spacecraft. In the inner magnetosphere the field configuration is dipolar, while in the outer magnetosphere it presents a disk-like structure. The gradual transition between these two behaviors is centered at about 17 RJ. GIRE2 distinguishes between the two different regions characterized by these two magnetic field topologies. Specifically, GIRE2 consists of an inner trapped omnidirectional model between 8 and 17 RJ that smoothly joins onto the original Divine and Garrett model inside 8 RJ and onto a GIRE2 plasma sheet model at large radial distances. The model provides a complete picture of the high-energy electron environment in the Jovian magnetosphere from ~1 to 50 RJ. The present manuscript describes in great detail the data sets, formulation, and fittings used in the model and provides a discussion of the predicted high-energy electron fluxes as a function of energy and radial distance from the planet.
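A smooth join between an inner dipolar model and an outer plasma-sheet model, of the kind GIRE2 performs around 17 RJ, might be sketched as below. The logistic blend, the transition width, and the toy flux profiles are our illustrative assumptions, not the GIRE2 formulation:

```python
import math

def blended_flux(r_rj, inner_model, outer_model, r_c=17.0, width=2.0):
    """Logistic blend of an inner (dipolar) and outer (plasma-sheet)
    flux model, centred at r_c (in Jupiter radii). The centre and
    width here are illustrative placeholders, not GIRE2 fit values."""
    w = 1.0 / (1.0 + math.exp((r_rj - r_c) / width))
    return w * inner_model(r_rj) + (1.0 - w) * outer_model(r_rj)

inner = lambda r: 1e7 * r ** -3                        # toy dipolar fall-off
outer = lambda r: 2e5 * math.exp(-(r - 17.0) / 10.0)   # toy disk profile

# At the transition centre the blend weights both models equally.
print(f"{blended_flux(17.0, inner, outer):.3g}")
```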
An accurate behavioral model for single-photon avalanche diode statistical performance simulation
Xu, Yue; Zhao, Tingchen; Li, Ding
2018-01-01
An accurate behavioral model is presented to simulate important statistical performance of single-photon avalanche diodes (SPADs), such as dark count and afterpulsing noise. The derived simulation model takes into account all important generation mechanisms of the two kinds of noise. For the first time, thermal agitation, trap-assisted tunneling and band-to-band tunneling mechanisms are simultaneously incorporated in the simulation model to evaluate the dark count behavior of SPADs fabricated in deep sub-micron CMOS technology. Meanwhile, a complete carrier trapping and de-trapping process is considered in the afterpulsing model and a simple analytical expression is derived to estimate the afterpulsing probability. In particular, the key model parameters of avalanche triggering probability and the electric field dependence on excess bias voltage are extracted from Geiger-mode TCAD simulation, and the behavioral simulation model does not include any empirical parameters. The developed SPAD model is implemented in the Verilog-A behavioral hardware description language and successfully operated in the commercial Cadence Spectre simulator, showing good universality and compatibility. The model simulation results are in good accordance with the test data, validating the high simulation accuracy.
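The carrier trapping/de-trapping idea behind the afterpulsing model can be illustrated with a toy Monte Carlo. All rates and probabilities below are assumed placeholder values, and the cascade logic is a strong simplification of the paper's Verilog-A behavioral model:

```python
import random

def simulate_pulses(rate_hz, p_trap, tau_release_s, t_total_s, seed=1):
    """Toy sketch: primary dark counts arrive as a Poisson process;
    each avalanche may fill a trap (probability p_trap) whose release
    after an exponential delay triggers an afterpulse. Afterpulses can
    themselves re-trap, so the mean total pulse rate is roughly
    rate_hz / (1 - p_trap)."""
    rng = random.Random(seed)
    events = []
    t = 0.0
    while True:
        t += rng.expovariate(rate_hz)      # next primary dark count
        if t > t_total_s:
            break
        events.append(t)
    pending = list(events)
    total = len(events)
    while pending:
        t0 = pending.pop()
        if rng.random() < p_trap:          # a trap was filled
            t_ap = t0 + rng.expovariate(1.0 / tau_release_s)
            if t_ap < t_total_s:
                total += 1                 # afterpulse fires
                pending.append(t_ap)       # ...and may re-trap
    return total

n = simulate_pulses(rate_hz=1000.0, p_trap=0.2, tau_release_s=1e-6, t_total_s=1.0)
print(n)  # expected around 1000 / (1 - 0.2) = 1250
```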
Berger, Beverly; Isenberg, James
2004-02-01
It is hard to find an area of mathematical relativity that has not been profoundly influenced and advanced by the work of Vincent Moncrief. He has done groundbreaking work on everything from his studies of gravitational radiation production by astrophysical systems to his model proofs of strong cosmic censorship. His work on the manifold structure of the space of solutions of the Einstein constraint equations, and on methods for systematically constructing such solutions, has done much to define and shape that field of inquiry, while his work on long time existence for the Yang-Mills and Einstein equations has significantly advanced the study of nonlinear hyperbolic PDE theory. The underlying theme to this very diverse portfolio of work is this: throughout his career, Vince has consistently chosen to attack those problems in general relativity which are both central to understanding the physical content of the gravitational physics of the theory, and are amenable to rigorous mathematical analysis. Combining an uncanny physical insight with a deep mathematical skill, and adding a legendary degree of carefulness, Vince has remade a great many of the areas of mathematical general relativity. The best known early example of Vince's unique style is the tour de force represented by his analyses of perturbations of black holes. He combined remarkable insight and calculational ability to develop techniques based on gauge invariant variables that both simplified the calculations and clarified their physical significance. These methods have proven invaluable for subsequent analytic and numerical studies (by Vince and by many others) of black hole stability and gravitational radiation from perturbed black holes. While a significant portion of Vince's earliest work concentrated on black hole systems, from around 1980 on, Vince's focus has been on cosmological spacetimes. He has shown that large families of them have Cauchy horizons (and therefore violate causality), and has
Energy Technology Data Exchange (ETDEWEB)
Granados, C. E.
1959-07-01
Full details of the elements used in counter fabrication and the way they are obtained are described, together with complete indications useful for the assembly and filling of the counter. The appropriate materials, and the precautions that should be adopted in order to obtain counters with uniform operation and good characteristics, are described. The counters are of brass, with thin mica or aluminium windows, and operate at approximately 1100 V with a plateau slope lower than 5% per 100 V. (Author)
Range walk error correction and modeling on Pseudo-random photon counting system
Shen, Shanshan; Chen, Qian; He, Weiji
2017-08-01
The signal-to-noise ratio (SNR) and depth accuracy are modeled for a pseudo-random ranging system with two random processes. The theoretical results developed herein, which capture the effects of code length and signal energy fluctuation, are shown to agree with Monte Carlo simulation measurements. First, the SNR is developed as a function of the code length. Using Geiger-mode avalanche photodiodes (GmAPDs), a longer code length is proven to reduce the noise effect and improve the SNR. Second, the Cramer-Rao lower bound (CRLB) on range accuracy is derived to show that a longer code length yields better range accuracy. Combining the SNR model and the CRLB model, it is shown that range accuracy can be improved by increasing the code length to reduce the noise-induced error. Third, the Cramer-Rao lower bound on range accuracy is shown to converge to the previously published theories, and the Gauss range walk model is introduced into the range accuracy analysis. Experimental tests also converge to the boundary model presented in this paper. It has been proven that the depth error caused by fluctuation of the number of detected photon counts in the laser echo pulse leads to a depth drift of the Time Point Spread Function (TPSF). Finally, a numerical fitting function is used to determine the relationship between the depth error and the photon counting ratio. Depth error due to different echo energies is calibrated so that the corrected depth accuracy is improved to 1 cm.
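The claim that a longer code improves SNR can be illustrated with a toy correlation receiver, where the √(code length) scaling appears directly. This is a sketch under simplified additive-Gaussian-noise assumptions, not the paper's GmAPD detection model:

```python
import random

def correlation_snr(code_len, noise_sigma=1.0, seed=7):
    """Toy pseudo-random ranging: correlate a noisy echo of a +/-1
    code against the transmitted code. The correlation peak grows
    like code_len while the correlator noise grows like
    sqrt(code_len), so the SNR improves as sqrt(code_len)."""
    rng = random.Random(seed)
    code = [rng.choice((-1.0, 1.0)) for _ in range(code_len)]
    echo = [c + rng.gauss(0.0, noise_sigma) for c in code]
    peak = sum(c * e for c, e in zip(code, echo))
    sigma_corr = noise_sigma * code_len ** 0.5   # correlator noise level
    return peak / sigma_corr

# Quadrupling the code length roughly doubles the SNR.
print(correlation_snr(64), correlation_snr(1024))
```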
International Nuclear Information System (INIS)
Vinogradov, S.
2012-01-01
Silicon Photomultipliers (SiPM), also called Solid State Photomultipliers (SSPM), are based on Geiger-mode avalanche breakdown limited by a strong negative feedback. An SSPM can detect and resolve single photons due to the high gain and ultra-low excess noise of avalanche multiplication in this mode. Crosstalk and afterpulsing processes associated with the high gain introduce specific excess noise and deteriorate the photon number resolution of the SSPM. The probabilistic features of these processes are widely studied because of their significance for SSPM design, characterization, optimization and application, but the process modeling is mostly based on Monte Carlo simulations and numerical methods. In this study, crosstalk is considered to be a branching Poisson process, and analytical models of the probability distribution and excess noise factor (ENF) of SSPM signals, based on the Borel distribution as an advance on the geometric-distribution models, are presented and discussed. The models are found to be in good agreement with the experimental probability distributions for dark counts and few-photon spectra over a wide range of fired-pixel numbers, as well as with the observed super-linear behavior of the crosstalk ENF.
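The branching-Poisson picture can be made concrete: if each avalanche independently triggers a Poisson(λ) number of crosstalk avalanches, the total number of fired pixels per primary follows the Borel distribution, with mean 1/(1−λ). A seeded sampling sketch (our own illustration, not the paper's code):

```python
import math
import random

def borel_sample(lam, rng):
    """Total progeny of a branching process with Poisson(lam)
    offspring per avalanche: Borel-distributed, mean 1/(1-lam)."""
    total, frontier = 1, 1
    while frontier:
        offspring = 0
        for _ in range(frontier):
            # Draw Poisson(lam) via Knuth's multiplication method.
            threshold, k, p = math.exp(-lam), 0, 1.0
            while True:
                p *= rng.random()
                if p <= threshold:
                    break
                k += 1
            offspring += k
        total += offspring
        frontier = offspring
    return total

rng = random.Random(42)
lam = 0.2
samples = [borel_sample(lam, rng) for _ in range(20000)]
mean = sum(samples) / len(samples)
print(round(mean, 2), "vs theoretical", 1 / (1 - lam))  # theory: 1.25
```

The ENF of pure Borel crosstalk works out to 1/(1−λ) as well, which is the super-linear growth the abstract refers to.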
Detection and 3d Modelling of Vehicles from Terrestrial Stereo Image Pairs
Coenen, M.; Rottensteiner, F.; Heipke, C.
2017-05-01
The detection and pose estimation of vehicles plays an important role for automated and autonomous moving objects, e.g. in autonomous driving environments. We tackle that problem on the basis of street-level stereo images obtained from a moving vehicle. Processing every stereo pair individually, our approach is divided into two subsequent steps: the vehicle detection and the modelling step. For the detection, we make use of the 3D stereo information and incorporate geometric assumptions on vehicle-inherent properties in a generic 3D object detection applied first. By combining our generic detection approach with a state-of-the-art vehicle detector, we are able to achieve satisfying detection results with values for completeness and correctness of up to more than 86%. By fitting an object-specific vehicle model to the vehicle detections, we are able to reconstruct the vehicles in 3D and to derive pose estimates as well as shape parameters for each vehicle. To deal with the intra-class variability of vehicles, we make use of a deformable 3D active shape model learned from 3D CAD vehicle data in our model fitting approach. While we achieve encouraging values of up to 67.2% for correct position estimates, we face larger problems concerning the orientation estimation. The evaluation is done using the object detection and orientation estimation benchmark of the KITTI dataset (Geiger et al., 2012).
Spädtke, P
2013-01-01
Modeling of technical machines became a standard technique since computer became powerful enough to handle the amount of data relevant to the specific system. Simulation of an existing physical device requires the knowledge of all relevant quantities. Electric fields given by the surrounding boundary as well as magnetic fields caused by coils or permanent magnets have to be known. Internal sources for both fields are sometimes taken into account, such as space charge forces or the internal magnetic field of a moving bunch of charged particles. Used solver routines are briefly described and some bench-marking is shown to estimate necessary computing times for different problems. Different types of charged particle sources will be shown together with a suitable model to describe the physical model. Electron guns are covered as well as different ion sources (volume ion sources, laser ion sources, Penning ion sources, electron resonance ion sources, and H$^-$-sources) together with some remarks on beam transport.
Santhosh, K. P.; Sukumaran, Indu
2017-09-01
Half-life predictions have been performed for proton emitters with Z > 50 in the ground state and isomeric state using the Coulomb and proximity potential model for deformed nuclei (CPPMDN). The agreement of the calculated values with the experimental data made it possible to predict some proton emissions that have not yet been verified experimentally. For comparison, the calculations are also performed using other theoretical models, such as the Gamow-like model of Zdeb et al. [Eur. Phys. J. A 52, 323 (2016), 10.1140/epja/i2016-16323-7], the semiempirical relation of Hatsukawa et al. [Phys. Rev. C 42, 674 (1990), 10.1103/PhysRevC.42.674], and the CPPM of Santhosh et al. [Pramana 58, 611 (2002), 10.1007/s12043-002-0019-2]. The Geiger-Nuttall law, originally observed for α decay, is found to work well for proton radioactivity provided it is plotted for the isotopes of a given proton emitter nuclide with the same ℓ value. The universal curve is found to be valid for proton radioactivity as well, as we obtained a single straight line for all proton emissions irrespective of the parents. Through the analysis of the experimentally measured half-lives of 44 proton emitters, the study revealed that the present systematics lends support to a unified description of α decay, cluster radioactivity, and proton radioactivity.
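The Geiger-Nuttall law itself is a straight line in log₁₀(T₁/₂) versus Z/√Q for fixed ℓ. The sketch below uses placeholder coefficients (not fitted values from the paper) to generate points on the law and then recovers the slope by least squares, illustrating why isotopes of one emitter fall on a single line:

```python
import math

def geiger_nuttall_logT(q_mev, z_daughter, a=1.0, b=-20.0):
    """Geiger-Nuttall form log10(T1/2) = a * Z / sqrt(Q) + b.
    Coefficients a and b are placeholders, not fitted values."""
    return a * z_daughter / math.sqrt(q_mev) + b

# Generate half-lives from the law for one emitter (Z = 82 here is
# just an example), then recover the slope by least squares on
# x = Z / sqrt(Q): the points fall on exactly one straight line.
pts = [(82 / math.sqrt(q), geiger_nuttall_logT(q, 82)) for q in (5.0, 6.0, 7.0, 8.0)]
xs, ys = zip(*pts)
n = len(xs)
slope = (n * sum(x * y for x, y in pts) - sum(xs) * sum(ys)) / (
    n * sum(x * x for x in xs) - sum(xs) ** 2)
print(round(slope, 3))  # recovers a = 1.0
```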
Demonstration of Lasercom and Spatial Tracking with a Silicon Geiger-Mode APD Array
2016-02-26
that can be processed by higher layers in the ROIC and then transported off-chip digitally. In this way both spatial and temporal information can be...communications," Proc. SPIE 8971, pp. 89710I–89710I–7, 2014. 4. P. A. Hiskett and R. A. Lamb, "Underwater optical communications with a single photon
Description of the manufacture of a Geiger-Muller counter with window
International Nuclear Information System (INIS)
Granados, C. E.
1959-01-01
Full details of the elements used in counter fabrication and the way they are obtained are described, together with complete indications useful for the assembly and filling of the counter. The appropriate materials, and the precautions that should be adopted in order to obtain counters with uniform operation and good characteristics, are described. The counters are of brass, with thin mica or aluminium windows, and operate at approximately 1100 V with a plateau slope lower than 5% per 100 V. (Author)
Geiger-Mode SiGe Receiver for Long-Range Optical Communications, Phase I
National Aeronautics and Space Administration — The objective of this program is to develop, demonstrate and implement a photon-counting detector array sensitive in the wavelength range from 1000 nm to 1600 nm,...
Experimental evaluation of penetration capabilities of a Geiger-mode APD array laser radar system
Jonsson, Per; Tulldahl, Michael; Hedborg, Julia; Henriksson, Markus; Sjöqvist, Lars
2017-10-01
Laser radar 3D imaging has the potential to improve target recognition in many scenarios. One case that is challenging for most optical sensors is to recognize targets hidden in vegetation or behind camouflage. The range resolution of time-of-flight 3D sensors allows segmentation of obscuration and target if the surfaces are separated far enough that they can be resolved as two distances. Systems based on time-correlated single-photon counting (TCSPC) have the potential to resolve surfaces closer to each other than laser radar systems based on proportional-mode detection technologies, and are therefore especially interesting. Photon-counting detection is commonly performed with Geiger-mode Avalanche Photodiodes (GmAPD), which have the disadvantage that they can only detect one photon per laser pulse per pixel. A strong return from an obscuring object may saturate the detector and thus limit the possibility of detecting the hidden target even if photons from the target reach the detector. The operational range where good foliage penetration is observed is therefore relatively narrow for GmAPD systems. In this paper we investigate the penetration capability through semi-transparent surfaces for a laser radar with a 128×32 pixel GmAPD array and a 1542 nm wavelength laser operating at a pulse repetition frequency of 90 kHz. In the evaluation a screen was placed behind different canvases with varying transmissions, and the detected signals from the surfaces were measured for different laser intensities. The maximum return from the second surface occurs when the total detection probability is around 0.65-0.75 per pulse. At higher laser excitation power the signal from the second surface decreases. To optimize the foliage penetration capability it is thus necessary to adaptively control the laser power to keep the returned signal within this region.
In addition to the experimental results, simulations studying the influence of the pulse energy on penetration through foliage, in a scene with targets behind vegetation, are presented. The optimum detection of targets occurs here at a slightly higher total photon detection probability, because a number of pixels have no obscuration in front of the target in their field of view.
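The saturation behavior described above follows from one-shot detection statistics: the second-surface return is largest when the first surface has a moderate chance of not firing the pixel. For equal mean returns from the two surfaces, the optimum lands at a total detection probability of exactly 0.75, consistent with the 0.65-0.75 range reported. A minimal sketch (the energy scaling and grid search are our own illustration):

```python
import math

def p_second_surface(k, s1, s2):
    """Per-pulse probability that a one-shot GmAPD pixel fires on the
    second surface: no photon detected from the first surface, at
    least one from the second. k scales the laser pulse energy;
    s1, s2 are mean detected photon numbers at k = 1."""
    return math.exp(-k * s1) * (1.0 - math.exp(-k * s2))

def p_total(k, s1, s2):
    """Per-pulse probability of detecting anything at all."""
    return 1.0 - math.exp(-k * (s1 + s2))

# Equal mean returns from canvas and hidden target (s1 = s2 = 1):
best_k = max((p_second_surface(k, 1, 1), k)
             for k in [i / 1000 for i in range(1, 3000)])[1]
print(round(best_k, 3), round(p_total(best_k, 1, 1), 2))  # 0.693 0.75
```

The optimum pulse energy satisfies k·s = ln 2, where the total detection probability is 1 − e^(−2 ln 2) = 0.75.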
Geiger-Mode Avalanche Photodiode Arrays Integrated to All-Digital CMOS Circuits
2016-01-20
Digital CMOS Circuits* *This work was sponsored by the Assistant Secretary of Defense...control. The ultimate sensitivity limitation of a CCD is set by the readout noise of the output amplifier that senses the charge packets and...all-digital CMOS readout circuits. The term "photon counting" is used broadly here to mean digital recording of a photon arrival within the
Plant functional type mapping for earth system models
Directory of Open Access Journals (Sweden)
B. Poulter
2011-11-01
Full Text Available The sensitivity of global carbon and water cycling to climate variability is coupled directly to land cover and the distribution of vegetation. To investigate biogeochemistry-climate interactions, earth system models require a representation of vegetation distributions that are either prescribed from remote sensing data or simulated via biogeography models. However, the abstraction of earth system state variables in models means that data products derived from remote sensing need to be post-processed for model-data assimilation. Dynamic global vegetation models (DGVMs) rely on the concept of plant functional types (PFTs) to group shared traits of thousands of plant species into usually only 10–20 classes. Available databases of observed PFT distributions must be relevant to existing satellite sensors and their derived products, and to the present day distribution of managed lands. Here, we develop four PFT datasets based on land-cover information from three satellite sensors (EOS-MODIS 1 km and 0.5 km, SPOT4-VEGETATION 1 km, and ENVISAT-MERIS 0.3 km spatial resolution) that are merged with spatially-consistent Köppen-Geiger climate zones. Using a beta (ß) diversity metric to assess reclassification similarity, we find that the greatest uncertainty in PFT classifications occurs most frequently between cropland and grassland categories, and in dryland systems between shrubland, grassland and forest categories because of differences in the minimum threshold required for forest cover. The biogeography-biogeochemistry DGVM, LPJmL, is used in diagnostic mode with the four PFT datasets prescribed to quantify the effect of land-cover uncertainty on climatic sensitivity of gross primary productivity (GPP) and transpiration fluxes. Our results show that land-cover uncertainty has large effects in arid regions, contributing up to 30% (20%) uncertainty in the sensitivity of GPP (transpiration) to precipitation. The availability of PFT datasets that are consistent
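The reclassification step, a satellite land-cover class crossed with a Köppen-Geiger zone to yield a PFT, can be sketched as a lookup table. The class names and rules below are purely illustrative, not the mapping tables used in the paper:

```python
# Hypothetical reclassification: (land-cover class, Koeppen-Geiger zone) -> PFT.
# The names and rules are illustrative placeholders only.
PFT_RULES = {
    ("forest", "tropical"): "tropical broadleaved evergreen tree",
    ("forest", "temperate"): "temperate broadleaved summergreen tree",
    ("forest", "boreal"): "boreal needleleaved evergreen tree",
    ("grassland", "tropical"): "C4 grass",
    ("grassland", "temperate"): "C3 grass",
    ("shrubland", "arid"): "xeric shrub",
}

def to_pft(land_cover, climate_zone):
    """Map a satellite land-cover class plus climate zone to a PFT;
    unresolved combinations are flagged, which is where the abstract
    reports the largest classification uncertainty."""
    return PFT_RULES.get((land_cover, climate_zone), "unclassified")

print(to_pft("forest", "boreal"))       # boreal needleleaved evergreen tree
print(to_pft("cropland", "temperate"))  # unclassified
```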
International Nuclear Information System (INIS)
Prasad, S.K.; Sudheer, T.S.; Sahoo, L.; Vinayagam, Bhakti; Kamble, Mahesh; Khuspe, R.R.; Anilkumar, Rekha; Verma, K.K.
2009-01-01
β-γ contamination check-up of TLD cassettes was carried out and the isotopes found were ¹³⁷Cs, ¹⁰⁶Ru, ⁶⁰Co, ⁶⁴Cu, ¹⁴⁴Ce and ⁹⁵Nb, with activity per square cm varying from 0.05-4.70 Bq/cm² (median value 1.3). The assessed dose in the TLD was in the range of 2.10 mSv to 22.05 mSv for beta and 0.05 mSv to 5.25 mSv for gamma; the beta doses have a median value of 6.19 mSv. This contamination may be due to active-water contamination of the TLDs of personnel working on irradiated fuel handling or in the (underwater) fuel rod storage area. This gives a method to estimate the skin exposure of personnel due to skin contamination during work. The chances of TLDs getting contaminated for various reasons were studied. Contamination was found to be highest inside the cassette box, over an area of 16 cm². In the case of the plastic pouch of the TLD disc, contamination was detected in three cases. The contamination level on TLD cassettes measured with a GM counter was found to be in the range of 0.30-3.6 Bq/cm². On opening the window of the surveymeter, the readings from these cassettes in the closed condition were found to increase by 20% due to the measurement of the beta dose; under the same conditions the reading from a TLD cassette in the open condition was found to be five times higher, the β-contamination being five times the γ-contamination. The most prominent isotope, ¹³⁷Cs, is soluble in water in its common chemical forms and, if inhaled or ingested, is rapidly and completely absorbed in the lungs and across the gastrointestinal tract. Thus skin contamination with ¹³⁷Cs can lead to intake in addition to skin dose. Fading studies of the contamination of TLD cassettes were carried out. The contamination was found to be negligible when counted with the GM counting set-up after a period of 3 months, although one TLD cassette showed an 80% reduction of contamination after 3 months with the GM counting set-up, the contaminants being ¹⁴¹Ce, ¹⁰³Ru and ⁹⁵Nb.
The gamma peaks in the external exposure are due to specific work during the period. ⁶⁰Co is an activation product formed by activation of pipeline corrosion parts, while ⁶⁴Cu, a positron emitter, is an activation product formed from ⁶³Cu in the tubing used in the primary coolant water system. A beta-contamination dose conversion factor of 2 mSv per Bq/cm² is arrived at for a period of 1 month. (author)
The Challenge of Publication for English Non-Dominant-Language Authors in Mathematics Education
Geiger, Vince; Straesser, Rudolf
2015-01-01
This article developed from a request by a German mathematics educator (Rudolf) to an Australian colleague (Vince) for a language check of an international conference presentation. Because of a history of cooperation, Rudolf trusted that Vince could provide a faithful translation. Vince happily accepted but had two reservations: he did not wish to…
Synthesis, antiproliferative activity and molecular docking of Colchicine derivatives.
Huczyński, Adam; Majcher, Urszula; Maj, Ewa; Wietrzyk, Joanna; Janczak, Jan; Moshari, Mahshad; Tuszynski, Jack A; Bartl, Franz
2016-02-01
In order to create more potent anticancer agents, a series of five structurally different derivatives of Colchicine have been synthesised. These compounds were characterised spectroscopically and structurally and their antiproliferative activity against four human tumour cell lines (HL-60, HL-60/vinc, LoVo, LoVo/DX) was evaluated. Additionally the activity of the studied compounds was calculated using computational methods involving molecular docking of the Colchicine derivatives to β-tubulin. The experimental and computational results are in very good agreement indicating that the antimitotic activity of Colchicine derivatives can be readily predicted using computational modeling methods. Crown Copyright © 2016. Published by Elsevier Inc. All rights reserved.
DEFF Research Database (Denmark)
Carlson, Kerstin
The International Criminal Tribunal for the former Yugoslavia (ICTY) was the first and most celebrated of a wave of international criminal tribunals (ICTs) built in the 1990s designed to advance liberalism through international criminal law. Model(ing) Justice examines the case law of the ICTY...
ten Cate, J.M.
2015-01-01
Developing experimental models to understand dental caries has been the theme in our research group. Our first, the pH-cycling model, was developed to investigate the chemical reactions in enamel or dentine, which lead to dental caries. It aimed to leverage our understanding of the fluoride mode of
Annette Geiger (Hg.): Der schöne Körper. Köln u.a.: Böhlau Verlag 2008
Directory of Open Access Journals (Sweden)
Susanne Gramatzki
2009-07-01
Full Text Available The essays in this volume illuminate the most diverse forms of past and present acts of beauty from the perspective of cultural studies. Fashionable clothing, decorative cosmetics, diets, body sculpting, cosmetic operations, etc. are available to the individual as techniques to approach an ideal body image that is often communicated through the media. The instructive and appropriately illustrated volume sees these body practices as both social and aesthetic phenomena; the authors address the compulsion and pleasure of simulation as well as the relationship between (gender) identity and (gender) performance.
Energy Technology Data Exchange (ETDEWEB)
Park, Kwang Hun [Dept. of Nuclear Medicine, Kyungbuk National University Hospital, Daegu (Korea, Republic of); Kim, Kgu Hwan [Dept. of Radiological Technology, Daegu Health College, Daegu (Korea, Republic of)
2016-12-15
Radioactive iodine (131I) treatment reduces recurrence and increases survival in patients with differentiated thyroid cancer. However, because the radiation emitted from the patient may expose others, it is important for radiation safety management to measure the radiation dose rate generated from the patient. As the research method, the radiation dose rate was measured at 1 m from the patient's upper abdomen as a function of the time elapsed after intake of the radioactive iodine. Directly comparing the changes over time, at high dose rates the sensitivity and efficiency of the ionization chamber were higher than those of the GM counter, and the difference was statistically significant (p<0.05). At low dose rates the sensitivity and efficiency of the chamber were lower than those of the GM counter, but the difference was not statistically significant (p>0.05). This study confirmed the characteristics of a calibrated ionization chamber and a GM counter as a function of radiation intensity during high-dose radioactive iodine therapy; accurate and rapid measurement of the radiation dose rate from the patient can reduce concern about the radiation hazard to family members and other persons after the patient is discharged.
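The decay of the measured dose rate with elapsed time is governed by the effective half-life of 131I in the patient, which combines physical decay and biological clearance. The sketch below uses the known 8.02-day physical half-life of 131I; the biological half-life and initial dose rate are illustrative assumptions, not values from the study:

```python
import math

PHYSICAL_T_HALF_DAYS = 8.02   # physical half-life of 131I

def effective_half_life(t_bio_days, t_phys_days=PHYSICAL_T_HALF_DAYS):
    """Standard combination rule: 1/T_eff = 1/T_phys + 1/T_bio."""
    return 1.0 / (1.0 / t_phys_days + 1.0 / t_bio_days)

def dose_rate(t_days, initial_rate, t_bio_days):
    """Dose rate at 1 m decaying with the effective half-life.
    initial_rate and t_bio_days are illustrative inputs."""
    t_eff = effective_half_life(t_bio_days)
    return initial_rate * 0.5 ** (t_days / t_eff)

t_eff = effective_half_life(24.0)
print(round(t_eff, 2))  # ~6.01 days for an assumed 24-day biological half-life
```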
Zandbelt, Bram
2017-01-01
Introductory presentation on cognitive modeling for the course ‘Cognitive control’ of the MSc program Cognitive Neuroscience at Radboud University. It addresses basic questions, such as 'What is a model?', 'Why use models?', and 'How to use models?'
Anaïs Schaeffer
2012-01-01
By analysing the production of mesons in the forward region of LHC proton-proton collisions, the LHCf collaboration has provided key information needed to calibrate extremely high-energy cosmic ray models. Average transverse momentum (pT) as a function of rapidity loss ∆y. Black dots represent LHCf data and the red diamonds represent SPS experiment UA7 results. The predictions of hadronic interaction models are shown by open boxes (sibyll 2.1), open circles (qgsjet II-03) and open triangles (epos 1.99). Among these models, epos 1.99 shows the best overall agreement with the LHCf data. LHCf is dedicated to the measurement of neutral particles emitted at extremely small angles in the very forward region of LHC collisions. Two imaging calorimeters – Arm1 and Arm2 – take data 140 m either side of the ATLAS interaction point. “The physics goal of this type of analysis is to provide data for calibrating the hadron interaction models – the well-known …
DEFF Research Database (Denmark)
Cameron, Ian; Gani, Rafiqul
2011-01-01
This chapter deals with the practicalities of building, testing, deploying and maintaining models. It gives specific advice for each phase of the modelling cycle. To do this, a modelling framework is introduced which covers: problem and model definition; model conceptualization; model data requirements; model construction; model solution; model verification; model validation and finally model deployment and maintenance. Within the adopted methodology, each step is discussed through the consideration of key issues and questions relevant to the modelling activity. Practical advice, based on many years of experience, is provided to direct the reader in their activities. Traps and pitfalls are discussed and strategies are also given to improve model development towards "fit-for-purpose" models. The emphasis in this chapter is the adoption and exercise of a modelling methodology that has proven very
Directory of Open Access Journals (Sweden)
Tea Ya. Danelyan
2014-01-01
Full Text Available The article states the general principles of structural modeling from the standpoint of systems theory and relates it to other types of modeling in order to align them with the main directions of modeling. Mathematical methods of structural modeling, in particular the method of expert evaluations, are considered.
African Journals Online (AJOL)
Moatez Billah HARIDA
The use of the simulator “Hybrid Electrical Vehicle Model Balances Fidelity and Speed (HEVMBFS)” and the global control strategy make it possible to achieve encouraging results. Key words: Series parallel hybrid vehicle - nonlinear model - linear model - Diesel engine - Engine modelling - HEV simulator - Predictive ...
DEFF Research Database (Denmark)
Sales-Cruz, Mauricio; Piccolo, Chiara; Heitzig, Martina
2011-01-01
This chapter presents various types of constitutive models and their applications. There are three aspects dealt with in this chapter, namely: creation and solution of property models, the application of parameter estimation and finally application examples of constitutive models. A systematic procedure is introduced for the analysis and solution of property models. Models that capture and represent the temperature-dependent behaviour of physical properties are introduced, as well as equation of state (EOS) models such as the SRK EOS. Modelling of liquid phase activity coefficients is also...
Chang, CC
2012-01-01
Model theory deals with a branch of mathematical logic showing connections between a formal language and its interpretations or models. This is the first and most successful textbook in logical model theory. Extensively updated and corrected in 1990 to accommodate developments in model theoretic methods - including classification theory and nonstandard analysis - the third edition added entirely new sections, exercises, and references. Each chapter introduces an individual method and discusses specific applications. Basic methods of constructing models include constants, elementary chains, Sko
International Nuclear Information System (INIS)
Buchler, J.R.; Gottesman, S.T.; Hunter, J.H. Jr.
1990-01-01
Various papers on galactic models are presented. Individual topics addressed include: observations relating to galactic mass distributions; the structure of the Galaxy; mass distribution in spiral galaxies; rotation curves of spiral galaxies in clusters; grand design, multiple arm, and flocculent spiral galaxies; observations of barred spirals; ringed galaxies; elliptical galaxies; the modal approach to models of galaxies; self-consistent models of spiral galaxies; dynamical models of spiral galaxies; N-body models. Also discussed are: two-component models of galaxies; simulations of cloudy, gaseous galactic disks; numerical experiments on the stability of hot stellar systems; instabilities of slowly rotating galaxies; spiral structure as a recurrent instability; model gas flows in selected barred spiral galaxies; bar shapes and orbital stochasticity; three-dimensional models; polar ring galaxies; dynamical models of polar rings
Energy Technology Data Exchange (ETDEWEB)
Prieto, E.; Casanova, R.; Salvado, M.
2013-07-01
In this work, it is demonstrated that spectrometric equipment can be used to measure dose rates. In addition, an analysis method for spectrometric data obtained over short integration periods is proposed. The method basically consists of studying the evolution of the number of counts in certain windows or regions of interest (ROIs) in gamma spectra. The ROIs are chosen strategically for their probability of containing counts coming from characteristic gamma emissions of certain isotopes of interest. The method is useful for setting early-warning criteria.
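The ROI-counting idea can be sketched in a few lines. This is an illustration only, not the authors' code: the channel ranges, isotope label and the threshold rule (latest spectrum vs. the mean of the preceding ones) are assumptions.

```python
# Illustrative sketch (not the authors' code): monitor counts in fixed ROI
# channel windows of successive short-integration gamma spectra and raise an
# early warning when the latest window count exceeds an assumed threshold.

def roi_counts(spectrum, rois):
    """Sum counts in each region of interest (inclusive channel ranges)."""
    return {name: sum(spectrum[lo:hi + 1]) for name, (lo, hi) in rois.items()}

def early_warning(history, roi_name, roi_range, factor=3.0):
    """Flag the latest spectrum if its ROI count exceeds `factor` times the
    mean ROI count of the preceding spectra (a simple assumed criterion)."""
    counts = [roi_counts(s, {roi_name: roi_range})[roi_name] for s in history]
    baseline = sum(counts[:-1]) / len(counts[:-1])
    return counts[-1] > factor * baseline

# Example: 10-channel spectra; a hypothetical "Cs-137" ROI covers channels 4-6.
rois = {"Cs-137": (4, 6)}
quiet = [0, 1, 2, 1, 3, 2, 3, 1, 0, 0]
spike = [0, 1, 2, 1, 30, 25, 28, 1, 0, 0]
alarm = early_warning([quiet, quiet, spike], "Cs-137", (4, 6))
```

With the spike spectrum last, the ROI count jumps from 8 to 83 and the criterion fires.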
DEFF Research Database (Denmark)
Ravn, Anders P.; Staunstrup, Jørgen
1994-01-01
This paper proposes a model for specifying interfaces between concurrently executing modules of a computing system. The model does not prescribe a particular type of communication protocol and is aimed at describing interfaces between both software and hardware modules or a combination of the two. The model describes both functional and timing properties of an interface...
Hydrological models are mediating models
Babel, L. V.; Karssenberg, D.
2013-08-01
Despite the increasing role of models in hydrological research and decision-making processes, only few accounts of the nature and function of models exist in hydrology. Earlier considerations have traditionally been conducted while making a clear distinction between physically-based and conceptual models. A new philosophical account, primarily based on the fields of physics and economics, transcends classes of models and scientific disciplines by considering models as "mediators" between theory and observations. The core of this approach lies in identifying models as (1) being only partially dependent on theory and observations, (2) integrating non-deductive elements in their construction, and (3) carrying the role of instruments of scientific enquiry about both theory and the world. The applicability of this approach to hydrology is evaluated in the present article. Three widely used hydrological models, each showing a different degree of apparent physicality, are confronted to the main characteristics of the "mediating models" concept. We argue that irrespective of their kind, hydrological models depend on both theory and observations, rather than merely on one of these two domains. Their construction is additionally involving a large number of miscellaneous, external ingredients, such as past experiences, model objectives, knowledge and preferences of the modeller, as well as hardware and software resources. We show that hydrological models convey the role of instruments in scientific practice by mediating between theory and the world. It results from these considerations that the traditional distinction between physically-based and conceptual models is necessarily too simplistic and refers at best to the stage at which theory and observations are steering model construction. The large variety of ingredients involved in model construction would deserve closer attention, for being rarely explicitly presented in peer-reviewed literature. We believe that devoting
International Nuclear Information System (INIS)
Phillips, C.K.
1985-12-01
This lecture provides a survey of the methods used to model fast magnetosonic wave coupling, propagation, and absorption in tokamaks. Three distinct types of modelling codes are contrasted, and their validity and limitations discussed: discrete models which utilize ray tracing techniques, approximate continuous field models based on a parabolic approximation of the wave equation, and full field models derived using finite difference techniques. Inclusion of mode conversion effects in these models and modification of the minority distribution function will also be discussed. The lecture will conclude with a presentation of time-dependent global transport simulations of ICRF-heated tokamak discharges obtained in conjunction with the ICRF modelling codes. 52 refs., 15 figs
Modelling in Business Model design
Simonse, W.L.
2013-01-01
It appears that business model design does not always produce a design or model as its expected result. When designers are involved, however, a visual model or artefact is produced. To assist strategic managers in thinking about how they can act, the designer's challenge is to combine strategy and
International Nuclear Information System (INIS)
Yang, H.
1999-01-01
The purpose of this analysis and model report (AMR) for the Ventilation Model is to analyze the effects of pre-closure continuous ventilation in the Engineered Barrier System (EBS) emplacement drifts and provide heat removal data to support EBS design. It will also provide input data (initial conditions, and time varying boundary conditions) for the EBS post-closure performance assessment and the EBS Water Distribution and Removal Process Model. The objective of the analysis is to develop, describe, and apply calculation methods and models that can be used to predict thermal conditions within emplacement drifts under forced ventilation during the pre-closure period. The scope of this analysis includes: (1) Provide a general description of effects and heat transfer process of emplacement drift ventilation. (2) Develop a modeling approach to simulate the impacts of pre-closure ventilation on the thermal conditions in emplacement drifts. (3) Identify and document inputs to be used for modeling emplacement ventilation. (4) Perform calculations of temperatures and heat removal in the emplacement drift. (5) Address general considerations of the effect of water/moisture removal by ventilation on the repository thermal conditions. The numerical modeling in this document will be limited to heat-only modeling and calculations. Only a preliminary assessment of the heat/moisture ventilation effects and modeling method will be performed in this revision. Modeling of moisture effects on heat removal and emplacement drift temperature may be performed in the future
International Nuclear Information System (INIS)
Laurence, D.
1997-01-01
This paper is an introductory course in modelling turbulent thermohydraulics, aimed at computational fluid dynamics users. No specific knowledge other than the Navier-Stokes equations is required beforehand. Chapter I (which those who are not beginners can skip) provides basic ideas on turbulence physics and is taken up in a textbook prepared by the teaching team of the ENPC (Benque, Viollet). Chapter II describes turbulent-viscosity type modelling and the k-ε two-equation model. It provides details of the channel flow case and the boundary conditions. Chapter III describes the 'standard' (R_ij-ε) Reynolds stress transport model and introduces more recent models called 'realizable'. A second paper deals with heat transfer and the effects of gravity, and returns to the Reynolds stress transport model. (author)
2016-01-01
This book provides a thorough introduction to the challenge of applying mathematics in real-world scenarios. Modelling tasks rarely involve well-defined categories, and they often require multidisciplinary input from mathematics, physics, computer sciences, or engineering. In keeping with this spirit of modelling, the book includes a wealth of cross-references between the chapters and frequently points to the real-world context. The book combines classical approaches to modelling with novel areas such as soft computing methods, inverse problems, and model uncertainty. Attention is also paid to the interaction between models, data and the use of mathematical software. The reader will find a broad selection of theoretical tools for practicing industrial mathematics, including the analysis of continuum models, probabilistic and discrete phenomena, and asymptotic and sensitivity analysis.
DEFF Research Database (Denmark)
Larsen, Lars Bjørn; Vesterager, Johan
This report provides an overview of the existing models of global manufacturing, describes the required modelling views and associated methods and identifies tools which can provide support for this modelling activity. The model adopted for global manufacturing is that of an extended enterprise sharing many of the characteristics of a virtual enterprise. This extended enterprise will have the following characteristics: the extended enterprise is focused on satisfying the current customer requirement so that it has a limited life expectancy, but should be capable of being recreated to deal... One or more units from beyond the network may complement the extended enterprise. The common reference model for this extended enterprise will utilise GERAM (Generalised Enterprise Reference Architecture and Methodology) to provide an architectural framework for the modelling carried out within...
DEFF Research Database (Denmark)
Blomhøj, Morten
2004-01-01
Developing competences for setting up, analysing and criticising mathematical models is normally seen as relevant only from and above upper secondary level. The general belief among teachers is that modelling activities presuppose conceptual understanding of the mathematics involved. Mathematical modelling, however, can be seen as a practice of teaching that places the relation between real life and mathematics at the centre of teaching and learning mathematics, and this is relevant at all levels. Modelling activities may motivate the learning process and help the learner to establish cognitive roots for the construction of important mathematical concepts. In addition, competence in setting up, analysing and criticising modelling processes and the possible use of models is a formative aim in its own right for mathematics teaching in general education. The paper presents a theoretical...
DEFF Research Database (Denmark)
Bækgaard, Lars
2001-01-01
The purpose of this chapter is to discuss conceptual event modeling within a context of information modeling. Traditionally, information modeling has been concerned with the modeling of a universe of discourse in terms of information structures. However, most interesting universes of discourse are dynamic, and we present a modeling approach that can be used to model such dynamics. We characterize events as both information objects and change agents (Bækgaard 1997). When viewed as information objects, events are phenomena that can be observed and described. For example, borrow events in a library can be characterized by their occurrence times and the participating books and borrowers. When we characterize events as information objects we focus on concepts like information structures. When viewed as change agents, events are phenomena that trigger change. For example, when a borrow event occurs, books are moved...
Bottle, Neil
2013-01-01
The Model: making exhibition was curated by Brian Kennedy in collaboration with Allies & Morrison in September 2013. For the London Design Festival, the Model: making exhibition looked at the increased use of new technologies by both craft-makers and architectural model makers. In both practices, traditional ways of making by hand are increasingly being combined with the latest technologies of digital imaging, laser cutting, CNC machining and 3D printing. This exhibition focussed on ...
Wenninger, Magnus J
2012-01-01
Well-illustrated, practical approach to creating star-faced spherical forms that can serve as basic structures for geodesic domes. Complete instructions for making models from circular bands of paper with just a ruler and compass. Discusses tessellation, or tiling, and how to make spherical models of the semiregular solids, and concludes with a discussion of the relationship of polyhedra to geodesic domes and directions for building models of domes. "... very pleasant reading." - Science. 1979 edition.
Modeling Documents with Event Model
Directory of Open Access Journals (Sweden)
Longhui Wang
2015-08-01
Full Text Available Currently deep learning has made great breakthroughs in visual and speech processing, mainly because it draws lessons from the hierarchical mode in which the brain deals with images and speech. In the field of NLP, topic models are one of the important ways of modeling documents, but they are built on a generative model that clearly does not match the way humans write. In this paper, we propose Event Model, which is unsupervised and based on the language processing mechanism of neurolinguistics, to model documents. In Event Model, documents are descriptions of concrete or abstract events seen, heard, or sensed by people, and words are objects in the events. Event Model has two stages: word learning and dimensionality reduction. Word learning learns the semantics of words via deep learning. Dimensionality reduction represents a document as a low-dimensional vector through a linear model that is completely different from topic models. Event Model achieves state-of-the-art results on document retrieval tasks.
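The general idea of a linear document representation can be illustrated generically. This sketch is an assumption for illustration, not the paper's Event Model: it simply averages learned word embeddings (toy 2-d vectors here) to get a low-dimensional document vector.

```python
# Generic illustration (not the paper's Event Model): represent a document
# as a low-dimensional vector by a linear operation on its word vectors --
# here, a plain average of learned embeddings. Embedding values are toys.

def doc_vector(tokens, embeddings):
    """Average the embedding vectors of the known tokens in a document."""
    known = [embeddings[t] for t in tokens if t in embeddings]
    if not known:
        return None
    dim = len(known[0])
    return [sum(v[i] for v in known) / len(known) for i in range(dim)]

embeddings = {"storm": [1.0, 0.0], "rain": [0.8, 0.2], "sun": [0.0, 1.0]}
v = doc_vector(["storm", "rain"], embeddings)
```

Documents mapped this way can be compared by cosine or Euclidean distance for retrieval.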
DEFF Research Database (Denmark)
Højgaard, Tomas; Hansen, Rune
The purpose of this paper is to introduce Didactical Modelling as a research methodology in mathematics education. We compare the methodology with other approaches and argue that Didactical Modelling has its own specificity. We discuss the methodological “why” and explain why we find it useful to construct this approach in mathematics education research.
Flores, J.; Kiss, S.; Cano, P.; Nijholt, Antinus; Zwiers, Jakob
2003-01-01
We concentrate our efforts on building virtual modelling environments where the content creator uses controls (widgets) as an interactive adjustment modality for the properties of the edited objects. Besides the advantage of being an on-line modelling approach (visualised just like any other on-line
DEFF Research Database (Denmark)
Gøtze, Jens Peter; Krentz, Andrew
2014-01-01
In this issue of Cardiovascular Endocrinology, we are proud to present a broad and dedicated spectrum of reviews on animal models in cardiovascular disease. The reviews cover most aspects of animal models in science from basic differences and similarities between small animals and the human...
Poortman, Sybilla; Sloep, Peter
2006-01-01
Educational models describes a case study on a complex learning object. Possibilities are investigated for using this learning object, which is based on a particular educational model, outside of its original context. Furthermore, this study provides advice that might lead to an increase in
Oh, Phil Seok; Oh, Sung Jin
2013-01-01
Modeling in science has been studied by education researchers for decades and is now being applied broadly in school. It is among the scientific practices featured in the "Next Generation Science Standards" ("NGSS") (Achieve Inc. 2013). This article describes modeling activities in an extracurricular science club in a high…
Jongerden, M.R.; Haverkort, Boudewijn R.H.M.
2008-01-01
The use of mobile devices is often limited by the capacity of the employed batteries. The battery lifetime determines how long one can use a device. Battery modeling can help to predict, and possibly extend this lifetime. Many different battery models have been developed over the years. However,
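As one concrete example of a battery lifetime model, a sketch of Peukert's law, a classic empirical model, is given below. This is an illustration only; the abstract does not name it among the models the authors compare, and the exponent value is an assumption.

```python
# Illustrative only: Peukert's law, a classic empirical battery model.
# For a battery rated at capacity C (Ah) over H hours, the runtime at a
# constant discharge current I is t = H * (C / (I * H)) ** k, where the
# Peukert exponent k (assumed ~1.2 here) captures capacity loss at high draw.

def peukert_lifetime(capacity_ah, rated_hours, current_a, k=1.2):
    """Estimated runtime (hours) at constant discharge current."""
    return rated_hours * (capacity_ah / (current_a * rated_hours)) ** k

# A 100 Ah battery rated over 20 h: at the rated 5 A draw the law returns
# the rated 20 h, while doubling the current more than halves the runtime.
t_rated = peukert_lifetime(100, 20, 5)
t_fast = peukert_lifetime(100, 20, 10)
```

The nonlinearity (k > 1) is exactly why such models help predict, and via duty-cycle scheduling possibly extend, device lifetime.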
International Nuclear Information System (INIS)
V. Chipman
2002-01-01
The purpose of the Ventilation Model is to simulate the heat transfer processes in and around waste emplacement drifts during periods of forced ventilation. The model evaluates the effects of emplacement drift ventilation on the thermal conditions in the emplacement drifts and surrounding rock mass, and calculates the heat removal by ventilation as a measure of the viability of ventilation to delay the onset of peak repository temperature and reduce its magnitude. The heat removal by ventilation is temporally and spatially dependent, and is expressed as the fraction of heat carried away by the ventilation air compared to the fraction of heat produced by radionuclide decay. One minus the heat removal is called the wall heat fraction, or the remaining amount of heat that is transferred via conduction to the surrounding rock mass. Downstream models, such as the ''Multiscale Thermohydrologic Model'' (BSC 2001), use the wall heat fractions as outputted from the Ventilation Model to initialize their postclosure analyses
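The wall-heat-fraction relation stated above is simple arithmetic and can be sketched directly; the numbers below are invented for illustration, not taken from the report.

```python
# Sketch of the stated relation (values are made up): the wall heat fraction
# is one minus the fraction of radionuclide decay heat removed by ventilation.

def wall_heat_fraction(heat_removed_w, decay_heat_w):
    """Fraction of decay heat conducted into the surrounding rock mass."""
    return 1.0 - heat_removed_w / decay_heat_w

# If ventilation air carries away 880 W of a 1000 W decay-heat load,
# 12% of the heat is conducted into the rock.
f_wall = wall_heat_fraction(880.0, 1000.0)
```

Downstream postclosure analyses are initialized with such fractions, which vary in time and along the drift.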
DEFF Research Database (Denmark)
Kindler, Ekkart
2009-01-01
There are many different notations and formalisms for modelling business processes and workflows. These notations and formalisms have been introduced with different purposes and objectives. Later, influenced by other notations, comparisons with other tools, or by standardization efforts, these notations have been extended in order to increase expressiveness and to be more competitive. This resulted in an increasing number of notations and formalisms for modelling business processes and in an increase of the different modelling constructs provided by modelling notations, which makes it difficult to compare modelling notations and to make transformations between them. One of the reasons is that, in each notation, the new concepts are introduced in a different way by extending the already existing constructs. In this chapter, we go the opposite direction: We show that it is possible to add most...
Energy Technology Data Exchange (ETDEWEB)
Veronica J. Rutledge
2013-01-01
The absence of industrial scale nuclear fuel reprocessing in the U.S. has precluded the necessary driver for developing the advanced simulation capability now prevalent in so many other countries. Thus, it is essential to model complex series of unit operations to simulate, understand, and predict inherent transient behavior and feedback loops. A capability of accurately simulating the dynamic behavior of advanced fuel cycle separation processes will provide substantial cost savings and many technical benefits. The specific fuel cycle separation process discussed in this report is the off-gas treatment system. The off-gas separation consists of a series of scrubbers and adsorption beds to capture constituents of interest. Dynamic models are being developed to simulate each unit operation involved so each unit operation can be used as a stand-alone model and in series with multiple others. Currently, an adsorption model has been developed within Multi-physics Object Oriented Simulation Environment (MOOSE) developed at the Idaho National Laboratory (INL). Off-gas Separation and REcoverY (OSPREY) models the adsorption of off-gas constituents for dispersed plug flow in a packed bed under non-isothermal and non-isobaric conditions. Inputs to the model include gas, sorbent, and column properties, equilibrium and kinetic data, and inlet conditions. The simulation outputs component concentrations along the column length as a function of time from which breakthrough data is obtained. The breakthrough data can be used to determine bed capacity, which in turn can be used to size columns. It also outputs temperature along the column length as a function of time and pressure drop along the column length. Experimental data and parameters were input into the adsorption model to develop models specific for krypton adsorption. The same can be done for iodine, xenon, and tritium. The model will be validated with experimental breakthrough curves. Customers will be given access to
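The use of breakthrough data to size columns can be sketched with the standard mass-balance calculation; this is not OSPREY itself, and the feed rate and curve below are invented. The equilibrium bed capacity is the molar feed rate times the time-integral of (1 - C_out/C_in) over the breakthrough curve.

```python
# Not OSPREY -- a minimal sketch of how breakthrough data sizes a column:
# bed capacity = feed rate * integral of (1 - C/C0) dt (trapezoidal rule).

def bed_capacity(times_s, c_ratio, feed_mol_per_s):
    """Moles captured by the bed up to the end of the breakthrough curve."""
    integral = 0.0
    for i in range(1, len(times_s)):
        dt = times_s[i] - times_s[i - 1]
        integral += 0.5 * ((1 - c_ratio[i]) + (1 - c_ratio[i - 1])) * dt
    return feed_mol_per_s * integral

# Idealized breakthrough reaching saturation at t = 100 s, 0.01 mol/s feed.
times = [0.0, 50.0, 100.0, 150.0]
c_ratio = [0.0, 0.0, 1.0, 1.0]   # outlet/inlet concentration ratio
cap = bed_capacity(times, c_ratio, 0.01)
```

A real simulation such as OSPREY produces the concentration profiles themselves; the capacity extracted this way then feeds the column-sizing step mentioned above.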
McMEEKIN, Thomas A; Ross, Thomas
1996-12-01
The concept of predictive microbiology has developed rapidly through the initial phases of experimental design and model development and the subsequent phase of model validation. A fully validated model represents a general rule which may be brought to bear on particular cases. For some microorganism/food combinations, sufficient confidence now exists to indicate substantial benefits to the food industry from use of predictive models. Several types of devices are available to monitor and record environmental conditions (particularly temperature). These "environmental histories" can be interpreted, using predictive models, in terms of microbial proliferation. The current challenge is to provide systems for the collection and interpretation of environmental information which combine ease of use, reliability, and security, providing the industrial user with the ability to make informed and precise decisions regarding the quality and safety of foods. Many specific applications for predictive modeling can be developed from a basis of understanding the inherent qualities of a fully validated model. These include increased precision and confidence in predictions based on accumulation of quantitative data, objective and rapid assessment of the effect of environmental conditions on microbial proliferation, and flexibility in monitoring the relative contribution of component parts of processing, distribution, and storage systems for assurance of shelf life and safety.
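One classic predictive-microbiology model shows how a logged "environmental history" is interpreted in terms of microbial proliferation. The sketch below uses the Ratkowsky square-root model, sqrt(rate) = b(T - Tmin); the review covers many models, and the parameter values here are assumptions for illustration.

```python
# Illustration of interpreting a temperature history with a predictive model:
# the Ratkowsky square-root model, sqrt(rate) = b * (T - Tmin).
# Parameter values b and Tmin below are assumed, not taken from the review.

def growth_rate(temp_c, b=0.023, t_min=-2.0):
    """Specific growth rate (per hour) from the square-root model."""
    if temp_c <= t_min:
        return 0.0
    return (b * (temp_c - t_min)) ** 2

def accumulated_growth(history):
    """Sum rate*duration over (hours, temperature) segments of a history."""
    return sum(hours * growth_rate(t) for hours, t in history)

# A cold chain with a warm excursion: 10 h at 4 C, 2 h at 20 C, 10 h at 4 C.
ln_increase = accumulated_growth([(10, 4.0), (2, 20.0), (10, 4.0)])
```

Note how the 2 h excursion contributes more growth than the 20 h of proper refrigeration, which is the kind of insight temperature loggers plus predictive models provide for shelf-life decisions.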
Lin, Tony; Erfan, Sasan
2016-01-01
Mathematical modeling is an open-ended research subject where no definite answers exist for any problem. Math modeling enables thinking outside the box to connect different fields of studies together including statistics, algebra, calculus, matrices, programming and scientific writing. As an integral part of society, it is the foundation for many…
Modeling complexes of modeled proteins.
Anishchenko, Ivan; Kundrotas, Petras J; Vakser, Ilya A
2017-03-01
Structural characterization of proteins is essential for understanding life processes at the molecular level. However, only a fraction of known proteins have experimentally determined structures. This fraction is even smaller for protein-protein complexes. Thus, structural modeling of protein-protein interactions (docking) primarily has to rely on modeled structures of the individual proteins, which typically are less accurate than the experimentally determined ones. Such "double" modeling is the Grand Challenge of structural reconstruction of the interactome. Yet it remains so far largely untested in a systematic way. We present a comprehensive validation of template-based and free docking on a set of 165 complexes, where each protein model has six levels of structural accuracy, from 1 to 6 Å Cα RMSD. Many template-based docking predictions fall into acceptable quality category, according to the CAPRI criteria, even for highly inaccurate proteins (5-6 Å RMSD), although the number of such models (and, consequently, the docking success rate) drops significantly for models with RMSD > 4 Å. The results show that the existing docking methodologies can be successfully applied to protein models with a broad range of structural accuracy, and the template-based docking is much less sensitive to inaccuracies of protein models than the free docking. Proteins 2017; 85:470-478. © 2016 Wiley Periodicals, Inc.
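The accuracy measure used above can be sketched in a few lines. The coordinates below are toy values, and no superposition (optimal alignment) step is performed, which a real Cα RMSD calculation requires.

```python
import math

# Sketch of the accuracy measure: root-mean-square deviation over
# corresponding C-alpha coordinates of a model vs. the reference structure.
# Toy coordinates; real RMSD first superposes the two structures.

def ca_rmsd(coords_a, coords_b):
    """RMSD (angstroms) between two equal-length lists of (x, y, z)."""
    n = len(coords_a)
    sq = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
             for (ax, ay, az), (bx, by, bz) in zip(coords_a, coords_b))
    return math.sqrt(sq / n)

native = [(0.0, 0.0, 0.0), (3.8, 0.0, 0.0), (7.6, 0.0, 0.0)]
model = [(0.0, 0.0, 0.0), (3.8, 1.0, 0.0), (7.6, 0.0, 1.0)]
rmsd = ca_rmsd(native, model)
```

In the study's terms, models with rmsd up to roughly 4 Å would still yield acceptable template-based docking predictions.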
DEFF Research Database (Denmark)
Pedersen, Mogens Jin; Stritch, Justin Michael
2018-01-01
Recently, scholars have issued calls for more replication, but academic reflections on when replication adds substantive value to public administration and management research are needed. This concise article presents a conceptual model, RNICE, for assessing when and how a replication study contributes knowledge about a social phenomenon and advances knowledge in the public administration and management literatures. The RNICE model provides a vehicle for researchers who seek to evaluate or demonstrate the value of a replication study systematically. We illustrate the practical application of the model using two previously published replication studies as examples.
Blacher, René
2010-01-01
This report completes the two previous reports and provides a simpler explanation of the earlier results: namely, a proof that the sequences obtained are random. In previous reports, we have shown how to transform a text $y_n$ into a random sequence by using Fibonacci functions $T_q$. In this report, we obtain a clearer result by proving that $T_q(y_n)$ has the IID model as its correct model. To do so, it is necessary to define precisely what a correct model is. Then, we also study this pro...
National Oceanic and Atmospheric Administration, Department of Commerce — Computer simulations of past climate. Variables provided as model output are described by parameter keyword. In some cases the parameter keywords are a subset of all...
Regardt, Olle; Rönnbäck, Lars; Bergholtz, Maria; Johannesson, Paul; Wohed, Petia
Maintaining and evolving data warehouses is a complex, error prone, and time consuming activity. The main reason for this state of affairs is that the environment of a data warehouse is in constant change, while the warehouse itself needs to provide a stable and consistent interface to information spanning extended periods of time. In this paper, we propose a modeling technique for data warehousing, called anchor modeling, that offers non-destructive extensibility mechanisms, thereby enabling robust and flexible management of changes in source systems. A key benefit of anchor modeling is that changes in a data warehouse environment only require extensions, not modifications, to the data warehouse. This ensures that existing data warehouse applications will remain unaffected by the evolution of the data warehouse, i.e. existing views and functions will not have to be modified as a result of changes in the warehouse model.
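The extension-only mechanism described above can be sketched concretely. The table and column names below are invented for illustration; they do not reproduce the paper's notation, but they show the core idea: an anchor table holds only entity identities, each attribute lives in its own table, and a new source-system field becomes a new table rather than an ALTER of existing ones.

```python
import sqlite3

# Minimal sketch of the anchor-modeling idea (names are invented): the
# anchor stores identities only; every attribute is a separate table, so
# schema evolution is purely additive and existing views keep working.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE anchor_customer (cu_id INTEGER PRIMARY KEY)")
cur.execute("""CREATE TABLE attr_customer_name (
                   cu_id INTEGER REFERENCES anchor_customer,
                   name TEXT, valid_from TEXT)""")
cur.execute("INSERT INTO anchor_customer VALUES (1)")
cur.execute("INSERT INTO attr_customer_name VALUES (1, 'Ada', '2009-01-01')")

# Later, the source system adds an email field: extend with a new table only.
cur.execute("""CREATE TABLE attr_customer_email (
                   cu_id INTEGER REFERENCES anchor_customer,
                   email TEXT, valid_from TEXT)""")
cur.execute("INSERT INTO attr_customer_email "
            "VALUES (1, 'ada@example.org', '2012-06-01')")

row = cur.execute("""SELECT n.name, e.email
                     FROM anchor_customer a
                     JOIN attr_customer_name n ON n.cu_id = a.cu_id
                     JOIN attr_customer_email e ON e.cu_id = a.cu_id""").fetchone()
```

Because `anchor_customer` and `attr_customer_name` were never modified, any view or application built on them remains valid after the extension.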
Searle, Shayle R
2012-01-01
This 1971 classic on linear models is once again available--as a Wiley Classics Library Edition. It features material that can be understood by any statistician who understands matrix algebra and basic statistical methods.
EPA's modeling community works to gain insights into physical, biological, economic, and social systems by conducting environmental assessments that support Agency decision making on complex environmental issues.
International Nuclear Information System (INIS)
Rosner, J.L.
1981-01-01
This paper invites experimenters to consider the wide variety of tests suggested by the new aspects of quark models developed since the discovery of charm and beauty, as well as by nonrelativistic models. Colours and flavours are counted and combined into hadrons. The current quark zoo is summarized. Models and theoretical background are studied under the headings: qualitative QCD: strings and bags; potential models; relativistic effects; electromagnetic transitions; gluon emissions; and single quark transition descriptions. Hadrons containing quarks known before 1974 (i.e., that can be made of ''light'' quarks u, d, and s) are treated in Section III, while those containing charmed quarks and beauty (b) quarks are discussed in Section IV. Unfolding the properties of the sixth quark from information on its hadrons is seen as a future application of the methods used in this study
Hodges, Wilfrid
1993-01-01
An up-to-date and integrated introduction to model theory, designed to be used for graduate courses (for students who are familiar with first-order logic), and as a reference for more experienced logicians and mathematicians.
Digital Repository Service at National Institute of Oceanography (India)
Unnikrishnan, A.S.; Manoj, N.T.
developed most of the above models. This is a good approximation for simulating the horizontal distribution of active and passive variables. The future challenge lies in developing the capability to simulate the distribution in the vertical.
International Nuclear Information System (INIS)
Peccei, R.D.
If quarks and leptons are composite, it should be possible eventually to calculate their mass spectrum and understand the reasons for the observed family replications, questions which lie beyond the standard model. Alas, all experimental evidence to date points towards quark and lepton elementarity, with the typical momentum scale Λsub(comp), beyond which effects of inner structure may be seen, probably being greater than 1 TeV. One supersymmetric preon model explained here provides a new dynamical alternative for obtaining light fermions, namely that these states are quasi Goldstone fermions. This and similar models are discussed. Although quasi Goldstone fermions provide an answer to the zeroth-order question of composite models, the questions of how masses and families are generated remain unanswered. (U.K.)
Skaaret, Eimund
Calculation procedures used in the design of ventilating systems, especially suited for displacement ventilation and for linking it to mixing ventilation, are addressed. The two-zone flow model is considered, and the steady-state and transient solutions are addressed. Different methods of supplying air are discussed, and different types of air flow are considered: piston flow, plane flow and radial flow. An evaluation model for ventilation systems is presented.
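The two-zone idea can be illustrated with a deliberately simplified steady-state balance (an assumption for illustration, not the report's procedure): supply air fills a clean lower zone, contaminants collect in the upper zone, and extract air leaves from the top, so the steady upper-zone concentration is source strength over ventilation flow.

```python
# Simplified steady-state sketch of the two-zone displacement idea
# (assumed physics, not the report's calculation procedure).

def upper_zone_concentration(source_mg_per_s, flow_m3_per_s,
                             supply_mg_per_m3=0.0):
    """Steady-state contaminant concentration (mg/m3) in the upper zone."""
    return supply_mg_per_m3 + source_mg_per_s / flow_m3_per_s

# A 0.5 mg/s source with 0.25 m3/s of displacement airflow gives 2 mg/m3
# in the upper zone, while the occupied lower zone stays near supply-air
# quality. Under fully mixed ventilation the whole room would sit at the
# same 2 mg/m3, which is the advantage displacement systems exploit.
c_up = upper_zone_concentration(0.5, 0.25)
```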
DEFF Research Database (Denmark)
Lasrado, Lester Allan; Vatrapu, Ravi
2016-01-01
effects, unicausal reduction, and case specificity. Based on the developments in set theoretical thinking in social sciences and employing methods like Qualitative Comparative Analysis (QCA), Necessary Condition Analysis (NCA), and set visualization techniques, in this position paper, we propose...... and demonstrate a new approach to maturity models in the domain of Information Systems. This position paper describes the set-theoretical approach to maturity models, presents current results and outlines future research work....
Accelerated life models modeling and statistical analysis
Bagdonavicius, Vilijandas
2001-01-01
Failure Time Distributions: Introduction; Parametric Classes of Failure Time Distributions. Accelerated Life Models: Introduction; Generalized Sedyakin's Model; Accelerated Failure Time Model; Proportional Hazards Model; Generalized Proportional Hazards Models; Generalized Additive and Additive-Multiplicative Hazards Models; Changing Shape and Scale Models; Generalizations; Models Including Switch-Up and Cycling Effects; Heredity Hypothesis; Summary. Accelerated Degradation Models: Introduction; Degradation Models; Modeling the Influence of Explanatory Varia
Model uncertainty: Probabilities for models?
International Nuclear Information System (INIS)
Winkler, R.L.
1994-01-01
Like any other type of uncertainty, model uncertainty should be treated in terms of probabilities. The question is how to do this. The most commonly-used approach has a drawback related to the interpretation of the probabilities assigned to the models. If we step back and look at the big picture, asking what the appropriate focus of the model uncertainty question should be in the context of risk and decision analysis, we see that a different probabilistic approach makes more sense, although it raises some implementation questions. Current work that is underway to address these questions looks very promising.
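The probabilistic treatment argued for above can be sketched as simple Bayesian model averaging. The priors, likelihoods, and predictions below are invented for illustration; they are not from the cited work.

```python
# Hypothetical illustration of probabilities over models: two candidate
# models predict a quantity, observed data assigns each a likelihood, and
# posterior model probabilities weight the combined prediction.

def posterior_model_probs(priors, likelihoods):
    """Bayes' rule over a discrete set of candidate models."""
    joint = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(joint)
    return [j / total for j in joint]

def averaged_prediction(posteriors, predictions):
    """Model-averaged point prediction."""
    return sum(w * y for w, y in zip(posteriors, predictions))

priors = [0.5, 0.5]        # equal prior belief in each model
likelihoods = [0.8, 0.2]   # how well each model explains the data
posteriors = posterior_model_probs(priors, likelihoods)
print(posteriors)                                     # [0.8, 0.2]
print(averaged_prediction(posteriors, [10.0, 20.0]))  # 12.0
```

The averaged prediction leans towards the better-supported model without discarding the other, which is the essence of the probabilistic approach.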
Energy Technology Data Exchange (ETDEWEB)
Curtis, S.B.
1990-09-01
Several models and theories are reviewed that incorporate the idea of radiation-induced lesions (repairable and/or irreparable) that can be related to molecular lesions in the DNA molecule. Usually the DNA double-strand or chromatin break is suggested as the critical lesion. In the models, the shoulder on the low-LET survival curve is hypothesized as being due to one (or more) of the following three mechanisms: (1) "interaction" of lesions produced by statistically independent particle tracks; (2) nonlinear (i.e., linear-quadratic) increase in the yield of initial lesions; and (3) saturation of repair processes at high dose. Comparisons are made between the various approaches. Several significant advances in model development are discussed; in particular, a description of the matrix formulation of the Markov versions of the RMR and LPL models is given. The more advanced theories have incorporated statistical fluctuations in various aspects of the energy-loss and lesion-formation process. An important direction is the inclusion of physical and chemical processes into the formulations by incorporating relevant track structure theory (Monte Carlo track simulations) and chemical reactions of radiation-induced radicals. At the biological end, identification of repair genes and how they operate, as well as a better understanding of how DNA misjoinings lead to lethal chromosome aberrations, are needed for appropriate inclusion into the theories. More effort is necessary to model the complex end point of radiation-induced carcinogenesis.
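The linear-quadratic shape invoked in mechanism (2) is conventionally written S(D) = exp(−(αD + βD²)), where the βD² term produces the shoulder. A minimal sketch, with α and β values that are purely illustrative rather than fitted to any data:

```python
import math

# Linear-quadratic cell-survival model: S(D) = exp(-(alpha*D + beta*D^2)).
# The quadratic term bends the semi-log survival curve downward with dose,
# giving the low-LET "shoulder". Parameter values here are illustrative.

def surviving_fraction(dose, alpha=0.2, beta=0.02):
    return math.exp(-(alpha * dose + beta * dose ** 2))

for d in (0.0, 2.0, 4.0):
    print(f"D = {d:.0f} Gy -> S = {surviving_fraction(d):.3f}")
```

Plotting log S against D for this form shows an initial slope −α steepening at higher dose, the signature that mechanisms (1)-(3) above attempt to explain.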
Smith, J. A.; Cooper, K.; Randolph, M.
1984-01-01
A classical description of the one dimensional radiative transfer treatment of vegetation canopies was completed and the results were tested against measured prairie (blue grama) and agricultural canopies (soybean). Phase functions are calculated in terms of directly measurable biophysical characteristics of the canopy medium. While the phase functions tend to exhibit backscattering anisotropy, their exact behavior is somewhat more complex and wavelength dependent. A Monte Carlo model was developed that treats soil surfaces with large periodic variations in three dimensions. A photon-ray tracing technology is used. Currently, the rough soil surface is described by analytic functions and appropriate geometric calculations performed. A bidirectional reflectance distribution function is calculated and, hence, available for other atmospheric or canopy reflectance models as a lower boundary condition. This technique is used together with an adding model to calculate several cases where Lambertian leaves possessing anisotropic leaf angle distributions yield non-Lambertian reflectance; similar behavior is exhibited for simulated soil surfaces.
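As a toy instance of the Monte Carlo photon-tracing idea, the sketch below numerically verifies a textbook property relevant to such models: a Lambertian surface with BRDF ρ/π returns its albedo ρ when the BRDF is integrated against the exit cosine over the hemisphere. The sampling scheme is illustrative and is not the authors' soil-surface model.

```python
import math
import random

def hemispherical_reflectance(rho, samples=50_000, seed=1):
    """Monte Carlo estimate of the directional-hemispherical reflectance
    of a Lambertian surface; analytically it equals the albedo rho."""
    rng = random.Random(seed)
    brdf = rho / math.pi          # Lambertian BRDF is constant
    acc = 0.0
    for _ in range(samples):
        cos_t = rng.random()      # sample cos(theta) uniformly on [0, 1]
        # pdf over the hemisphere is 1/(2*pi), so each sample is weighted
        # by brdf * cos(theta) / pdf = brdf * cos(theta) * 2*pi
        acc += brdf * cos_t * 2.0 * math.pi
    return acc / samples

print(hemispherical_reflectance(0.3))   # close to 0.3
```

The abstract's observation that anisotropic leaf-angle distributions over Lambertian facets yield non-Lambertian canopy reflectance is exactly the kind of result such a sampled-BRDF machinery makes testable.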
Eck, Christof; Knabner, Peter
2017-01-01
Mathematical models are the decisive tool to explain and predict phenomena in the natural and engineering sciences. With this book readers will learn to derive mathematical models which help to understand real-world phenomena. At the same time, a wealth of important examples for the abstract concepts treated in the curriculum of mathematics degrees are given. An essential feature of this book is that mathematical structures are used as an ordering principle, not the fields of application. Methods from linear algebra, analysis and the theory of ordinary and partial differential equations are thoroughly introduced and applied in the modeling process. Examples of applications in the fields of electrical networks, chemical reaction dynamics, population dynamics, fluid dynamics, elasticity theory and crystal growth are treated comprehensively.
Cardey, Sylviane
2013-01-01
In response to the need for reliable results from natural language processing, this book presents an original way of decomposing a language(s) in a microscopic manner by means of intra/inter‑language norms and divergences, going progressively from languages as systems to the linguistic, mathematical and computational models, which being based on a constructive approach are inherently traceable. Languages are described with their elements aggregating or repelling each other to form viable interrelated micro‑systems. The abstract model, which contrary to the current state of the art works in int
Directory of Open Access Journals (Sweden)
Aarti Sharma
2009-01-01
Full Text Available Computational chemistry is becoming an increasingly important tool in the development of novel pharmaceuticals. In the past, drugs were simply screened for effectiveness. Recent advances in computing power and the exponential growth of knowledge of protein structures have made it possible to tailor organic compounds to decrease harmful side effects and increase potency. This article provides a detailed description of the techniques employed in molecular modeling, a rapidly developing discipline supported by dramatic improvements in computer hardware and software in recent years.
International Nuclear Information System (INIS)
Woosley, S.E.; Weaver, T.A.
1980-01-01
Recent progress in understanding the observed properties of Type I supernovae as a consequence of the thermonuclear detonation of white dwarf stars and the ensuing decay of the 56Ni produced therein is reviewed. Within the context of this model for Type I explosions and the 1978 model for Type II explosions, the expected nucleosynthesis and gamma-line spectra from both kinds of supernovae are presented. Finally, a qualitatively new approach to the problem of massive star death and Type II supernovae, based upon a combination of rotation and thermonuclear burning, is discussed.
DEFF Research Database (Denmark)
Stubkjær, Erik
2005-01-01
to the modeling of an industrial sector, as it aims at rendering the basic concepts that relate to the domain of real estate and the pertinent human activities. The palpable objects are pieces of land and buildings, documents, data stores and archives, as well as persons in their diverse roles as owners, holders...... to land. The paper advances the position that cadastral modeling has to include not only the physical objects, agents, and information sets of the domain, but also the objectives or requirements of cadastral systems....
Torres Aldave, Edison Román
2015-01-01
Violence in the world is part of human existence and coexistence; as far as history records with reference to the evolution of man, he has in one way or another been in conflict, beginning most representatively with slavery (the emergence of social classes), since under primitive communism there was group coexistence and everything was shared in common; man fought, but to survive the harshness of nature, which was his priority. The...
African Journals Online (AJOL)
Simple analytic polynomials have been proposed for estimating solar radiation in the traditional Northern, Central and Southern regions of Malawi. There is a strong agreement between the polynomials and the SSE model, with R2 values of 0.988, 0.989 and 0.989 and root mean square errors of 0.061, 0.057 and 0.062 ...
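Agreement statistics like the R2 and RMSE values quoted above are computed along these lines; the observation and estimate arrays below are made up for illustration and are not the Malawi data.

```python
import math

# Coefficient of determination (R^2) and root mean square error between
# model estimates and observations; the numbers are invented examples.

def r_squared(observed, predicted):
    mean = sum(observed) / len(observed)
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

def rmse(observed, predicted):
    n = len(observed)
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(observed, predicted)) / n)

obs = [18.1, 20.3, 22.4, 21.0, 19.5]    # e.g. measured radiation, MJ/m^2/day
pred = [18.0, 20.5, 22.1, 21.2, 19.4]   # polynomial estimates
print(round(r_squared(obs, pred), 3))
print(round(rmse(obs, pred), 3))
```

An R2 near 1 together with an RMSE small relative to the measured values is what justifies the "strong agreement" claim.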
DEFF Research Database (Denmark)
Nash, Ulrik William
2014-01-01
Firms consist of people who make decisions to achieve goals. How do these people develop the expectations which underpin the choices they make? The lens model provides one answer to this question. It was developed by cognitive psychologist Egon Brunswik (1952) to illustrate his theory...
Indian Academy of Sciences (India)
pattern of the watershed LULC, leading to an accretive linear growth of agricultural and settlement areas. The annual rate of ... thereby advocates for better agricultural practices with additional energy subsidy to arrest further forest loss and LULC ...... automaton model and GIS: Long-term urban growth prediction for San ...
DEFF Research Database (Denmark)
Arnoldi, Jakob
The article discusses the use of algorithmic models for so-called High Frequency Trading (HFT) in finance. HFT is controversial yet widespread in modern financial markets. It is a form of automated trading technology which critics among other things claim can lead to market manipulation. Drawing ...
Finger Lakes Regional Education Center for Economic Development, Mount Morris, NY.
This guide describes seven model programs that were developed by the Finger Lakes Regional Center for Economic Development (New York) to meet the training needs of female and minority entrepreneurs to help their businesses survive and grow and to assist disabled and dislocated workers and youth in beginning small businesses. The first three models…
DEFF Research Database (Denmark)
About the reconstruction of Palle Nielsen's (b. 1942) work The Model from 1968: a gigantic playground for children in the museum, where they can freely romp about, climb ropes, crawl on wooden structures, work with tools, jump in foam rubber, paint with finger paints and dress up in costumes....
Indian Academy of Sciences (India)
Home; Journals; Resonance – Journal of Science Education; Volume 14; Issue 7. Model Checking - Automated Verification of Computational Systems. Madhavan Mukund. General Article Volume 14 Issue 7 July 2009 pp 667-681. Fulltext. Click here to view fulltext PDF. Permanent link:
Lomnitz, Cinna
Tichelaar and Ruff [1989] propose to “estimate model variance in complicated geophysical problems,” including the determination of focal depth in earthquakes, by means of unconventional statistical methods such as bootstrapping. They are successful insofar as they are able to duplicate the results from more conventional procedures.
Indian Academy of Sciences (India)
Home; Journals; Resonance – Journal of Science Education; Volume 9; Issue 5. Molecular Modeling: A Powerful Tool for Drug Design and Molecular Docking. Rama Rao Nadendla. General Article Volume 9 Issue 5 May 2004 pp 51-60. Fulltext. Click here to view fulltext PDF. Permanent link:
DEFF Research Database (Denmark)
Nielsen, Mogens Peter; Shui, Wan; Johansson, Jens
2011-01-01
term with stresses depending linearly on the strain rates. This term takes into account the transfer of linear momentum from one part of the fluid to another. Besides there is another term, which takes into account the transfer of angular momentum. Thus the model implies a new definition of turbulence...
Model-reduced inverse modeling
Vermeulen, P.T.M.
2006-01-01
Although faster computers have been developed in recent years, they tend to be used to solve even more detailed problems. In many cases this will yield enormous models that can not be solved within acceptable time constraints. Therefore, there is a need for alternative methods that simulate such
Building Models and Building Modelling
DEFF Research Database (Denmark)
Jørgensen, Kaj; Skauge, Jørn
2008-01-01
The report's introductory chapter describes the primary concepts concerning building models and sets out some fundamental aspects of computer-based modelling. In addition, the difference between drawing programs and building-modelling programs is described. Important aspects of comp...
Barr, Michael
2002-01-01
Acyclic models is a method heavily used to analyze and compare various homology and cohomology theories appearing in topology and algebra. This book is the first attempt to put together in a concise form this important technique and to include all the necessary background. It presents a brief introduction to category theory and homological algebra. The author then gives the background of the theory of differential modules and chain complexes over an abelian category to state the main acyclic models theorem, generalizing and systemizing the earlier material. This is then applied to various cohomology theories in algebra and topology. The volume could be used as a text for a course that combines homological algebra and algebraic topology. Required background includes a standard course in abstract algebra and some knowledge of topology. The volume contains many exercises. It is also suitable as a reference work for researchers.
DEFF Research Database (Denmark)
Pedersen, Mogens Jin; Stritch, Justin Michael
2018-01-01
Replication studies relate to the scientific principle of replicability and serve the significant purpose of providing supporting (or contradicting) evidence regarding the existence of a phenomenon. However, replication has never been an integral part of public administration and management...... research. Recently, scholars have issued calls for more replication, but academic reflections on when replication adds substantive value to public administration and management research are needed. This concise article presents a conceptual model, RNICE, for assessing when and how a replication study...... contributes knowledge about a social phenomenon and advances knowledge in the public administration and management literatures. The RNICE model provides a vehicle for researchers who seek to evaluate or demonstrate the value of a replication study systematically. We illustrate the practical application...
DEFF Research Database (Denmark)
2012-01-01
on this subject, this book makes essential reading for anyone considering new ways of thinking about architecture. In drawing upon both historical and contemporary perspectives this book provides evidence of the ways in which relations between representation and the represented continue to be reconsidered......The relationship between representation and the represented is examined here through the notion of persistent modelling. This notion is not novel to the activity of architectural design if it is considered as describing a continued active and iterative engagement with design concerns – an evident...... characteristic of architectural practice. But the persistence in persistent modelling can also be understood to apply in other ways, reflecting and anticipating extended roles for representation. This book identifies three principle areas in which these extensions are becoming apparent within contemporary...
DEFF Research Database (Denmark)
Michael, John
others' minds. Then (2), in order to bring to light some possible justifications, as well as hazards and criticisms of the methodology of looking time tests, I will take a closer look at the concept of folk psychology and will focus on the idea that folk psychology involves using oneself as a model...... of other people in order to predict and understand their behavior. Finally (3), I will discuss the historical location and significance of the emergence of looking time tests...
1975-01-01
detailed rendered visible in his photographs by streams of photographs of spheres entering the water small bubbles from electrolysis. So far as is...of the cavity is opaque or, brined while the sphere was still in the oil. At if translucent, the contrast between the jet and about the time the...and brass, for example) should be so model velocity scale according to Equation 1.18, selected that electrolysis is not a problem. the addition of
Vincent, Julian F V
2003-01-01
Biomimetics is seen as a path from biology to engineering. The only path from engineering to biology in current use is the application of engineering concepts and models to biological systems. However, there is another pathway: the verification of biological mechanisms by manufacture, leading to an iterative process between biology and engineering in which the new understanding that the engineering implementation of a biological system can bring is fed back into biology, allowing a more compl...
DEFF Research Database (Denmark)
This book reflects and expands on the current trend in the building industry to understand, simulate and ultimately design buildings by taking into consideration the interlinked elements and forces that act on them. This approach overcomes the traditional, exclusive focus on building tasks, while....... The chapter authors were invited speakers at the 5th Symposium "Modelling Behaviour", which took place at the CITA in Copenhagen in September 2015....
1980-02-01
(Figure residue removed: experiment-versus-model comparison plots for a gray medium.) "Possibilità di valutazione dello scambio termico in focolai di caldaie per riscaldamento" [On evaluating heat transfer in heating-boiler furnaces], Atti e Rassegna Tecnica, Società Ingegneri e Architetti in Torino.
DEFF Research Database (Denmark)
practice: the duration of active influence that representation can hold in relation to the represented; the means, methods and media through which representations are constructed and used; and what it is that is being represented. Featuring contributions from some of the world’s most advanced thinkers....... It also provides critical insight into the use of contemporary modelling tools and methods, together with an examination of the implications their use has within the territories of architectural design, realisation and experience....
International Nuclear Information System (INIS)
McIllvaine, C.M.
1994-01-01
Exhaust gases from power plants that burn fossil fuels contain concentrations of sulfur dioxide (SO2), nitric oxide (NO), particulate matter, hydrocarbon compounds and trace metals. Estimated emissions from the operation of a hypothetical 500 MW coal-fired power plant are given. Ozone is considered a secondary pollutant, since it is not emitted directly into the atmosphere but is formed from other air pollutants, specifically nitrogen oxides (NOx) and non-methane organic compounds (NMOC), in the presence of sunlight. (NMOC are sometimes referred to as hydrocarbons, HC, or volatile organic compounds, VOC, and they may or may not include methane.) Additionally, ozone formation is a function of the ratio of NMOC concentrations to NOx concentrations. A typical ozone isopleth is shown, generated with the Empirical Kinetic Modeling Approach (EKMA) option of the Environmental Protection Agency's (EPA) Ozone Isopleth Plotting Mechanism (OZIPM-4) model. Ozone isopleth diagrams, originally generated with smog chamber data, are more commonly generated with photochemical reaction mechanisms and tested against smog chamber data. The shape of the isopleth curves is a function of the region (i.e., background conditions) where ozone concentrations are simulated. The location of an ozone concentration on the isopleth diagram is defined by the ratio of the NMOC and NOx coordinates of the point, known as the NMOC/NOx ratio. Results obtained by the described model are presented.
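The ratio logic can be sketched as follows. The regime threshold used here is an assumption for illustration (values near 8 are commonly quoted for EKMA-style diagrams) and is not taken from this study.

```python
# The NMOC/NOx ratio fixes where a point sits on an ozone isopleth diagram;
# ridge-line ratios separate regimes where ozone formation is limited by
# NOx from regimes limited by organics. Threshold below is illustrative.

def nmoc_nox_ratio(nmoc_ppmc, nox_ppm):
    return nmoc_ppmc / nox_ppm

def ozone_regime(ratio, threshold=8.0):
    """Rough EKMA-style classification of the ozone-formation regime."""
    return "NOx-limited" if ratio > threshold else "VOC-limited"

r = nmoc_nox_ratio(0.6, 0.12)
print(round(r, 2), ozone_regime(r))
```

Points below the ridge line (low ratio) respond mainly to organic-compound controls, points above it mainly to NOx controls, which is why the ratio, not either concentration alone, locates a scenario on the diagram.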
Energy Technology Data Exchange (ETDEWEB)
Plimpton, Steven James; Heffernan, Julieanne; Sasaki, Darryl Yoshio; Frischknecht, Amalie Lucile; Stevens, Mark Jackson; Frink, Laura J. Douglas
2005-11-01
Understanding the properties and behavior of biomembranes is fundamental to many biological processes and technologies. Microdomains in biomembranes or "lipid rafts" are now known to be an integral part of cell signaling, vesicle formation, fusion processes, protein trafficking, and viral and toxin infection processes. Understanding how microdomains form, how they depend on membrane constituents, and how they act not only has biological implications, but also will impact Sandia's effort in development of membranes that structurally adapt to their environment in a controlled manner. To provide such understanding, we created physically-based models of biomembranes. Molecular dynamics (MD) simulations and classical density functional theory (DFT) calculations using these models were applied to phenomena such as microdomain formation, membrane fusion, pattern formation, and protein insertion. Because lipid dynamics and self-organization in membranes occur on length and time scales beyond atomistic MD, we used coarse-grained models of double tail lipid molecules that spontaneously self-assemble into bilayers. DFT provided equilibrium information on membrane structure. Experimental work was performed to further help elucidate the fundamental membrane organization principles.
Characterization of the airborne activity confinement system prefilter material
Energy Technology Data Exchange (ETDEWEB)
Long, T.A.; Monson, P.R.
1992-05-01
A general concern with assessing the effects of postulated severe accidents is predicting and preventing the release of radioactive isotopes to the environment at the Savannah River Site (SRS) reactor. Unless the confinement systems are breached in an accident, the Airborne Activity Confinement System forces all of the internal air through the filter compartments. Proper modeling of the radioactivity released to the environment requires knowledge of the filtering characteristics of the demisters, the HEPAs, and the charcoal beds. An investigation of the mass loading characteristics for a range of particle sizes was performed under the direction of Vince Novick of Argonne National Laboratory (ANL) for the Savannah River Technology Center (SRTC) in connection with the restart of the K reactor. Both solid and liquid aerosols were used to challenge sample prefilter and HEPA filters. The results of the ANL investigation are reported in this document.
Object Modeling and Building Information Modeling
Auråen, Hege; Gjemdal, Hanne
2016-01-01
The main part of this thesis is an online course (Small Private Online Course) entitled "Introduction to Object Modeling and Building Information Modeling". This supplementary report clarifies the choices made in the process of developing the course. The course examines the basic concepts of object modeling, modeling techniques and a modeling language (UML). Further, building information modeling (BIM) is presented as a modeling process, and the object modeling concepts in the BIM softw...
Directory of Open Access Journals (Sweden)
PAPAJ Jan
2014-05-01
Full Text Available Traditional wireless networks use the concept of point-to-point forwarding inherited from reliable wired networks, which seems not ideal for the wireless environment. New emerging applications and networks operate mostly disconnected. So-called Delay-Tolerant Networks (DTNs) are receiving increasing attention from both academia and industry. DTNs introduced a store-carry-and-forward concept solving the problem of intermittent connectivity. The behavior of such networks is verified by real models, computer simulation, or a combination of both approaches. Computer simulation has become the primary and cost-effective tool for evaluating the performance of DTNs. OPNET Modeler is our target simulation tool, and we wanted to extend OPNET's simulation capabilities towards DTNs. We implemented the bundle protocol in OPNET Modeler, allowing us to simulate cases based on the bundle concept, such as epidemic forwarding, which relies on flooding the network with messages, and a forwarding algorithm based on the history of past encounters (PRoPHET). The implementation details are provided in the article.
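The PRoPHET scheme mentioned above maintains a per-destination "delivery predictability" at each node, updated by three standard rules (direct encounter, aging, transitivity). The sketch below uses the commonly published default constants and is not the authors' OPNET implementation.

```python
# PRoPHET delivery-predictability bookkeeping (sketch). Constants follow
# the commonly published defaults, not necessarily the OPNET model above.
P_INIT, BETA, GAMMA = 0.75, 0.25, 0.98

def on_encounter(p_old):
    """Direct update when two nodes meet: P = P_old + (1 - P_old) * P_INIT."""
    return p_old + (1.0 - p_old) * P_INIT

def age(p_old, time_units):
    """Predictability decays while nodes remain apart: P = P_old * GAMMA^k."""
    return p_old * GAMMA ** time_units

def transitive(p_ac_old, p_ab, p_bc):
    """A can reach C via B: P(a,c) grows with P(a,b) * P(b,c) * BETA."""
    return p_ac_old + (1.0 - p_ac_old) * p_ab * p_bc * BETA

p = on_encounter(0.0)   # first meeting
p = on_encounter(p)     # repeated meetings push P towards 1
print(p, age(p, 10), transitive(0.0, 0.75, 0.75))
```

A bundle is forwarded to a neighbour only if that neighbour's predictability for the destination exceeds the carrier's own, which is how encounter history replaces the missing end-to-end path.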
Model integration and a theory of models
Dolk, Daniel R.; Kottemann, Jeffrey E.
1993-01-01
Model integration extends the scope of model management to include the dimension of manipulation as well. This invariably leads to comparisons with database theory. Model integration is viewed from four perspectives: Organizational, definitional, procedural, and implementational. Strategic modeling is discussed as the organizational motivation for model integration. Schema and process integration are examined as the logical and manipulation counterparts of model integr...
Model Checking Algorithms for Markov Reward Models
Cloth, Lucia; Cloth, L.
2006-01-01
Model checking Markov reward models unites two different approaches of model-based system validation. On the one hand, Markov reward models have a long tradition in model-based performance and dependability evaluation. On the other hand, a formal method like model checking allows for the precise
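The reward side of such models can be illustrated with a discrete-time toy. The thesis above treats continuous-time Markov reward models and their logics; this sketch only shows the expected-accumulated-reward recursion that underlies reward-based properties.

```python
# Expected accumulated reward in a discrete-time Markov reward model,
# computed by the recursion v_{k+1} = r + P v_k (illustrative toy only).

def expected_reward(P, r, steps):
    """Expected reward accumulated over `steps` transitions, per start state.
    P is a row-stochastic transition matrix, r a per-state reward."""
    n = len(r)
    v = [0.0] * n
    for _ in range(steps):
        v = [r[i] + sum(P[i][j] * v[j] for j in range(n)) for i in range(n)]
    return v

P = [[0.5, 0.5],
     [0.0, 1.0]]    # state 1 is absorbing
r = [1.0, 0.0]      # reward is earned only in state 0
print(expected_reward(P, r, 3))   # [1.75, 0.0]
```

A model checker evaluates such quantities symbolically or numerically for every state at once, then compares them against the bound stated in a reward-logic formula.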
DEFF Research Database (Denmark)
Bork Petersen, Franziska
2013-01-01
focus centres on how the catwalk scenography evokes a ‘defiguration’ of the walking models and to what effect. Vibskov’s mobile catwalk draws attention to the walk, which is a key element of models’ performance but which usually functions in fashion shows merely to present clothes in the most...... catwalks. Vibskov’s catwalk induces what the dance scholar Gabriele Brandstetter has labelled a ‘defigurative choreography’: a straying from definitions, which exist in ballet as in other movement-based genres, of how a figure should move and appear (1998). The catwalk scenography in this instance...
Students' Models of Curve Fitting: A Models and Modeling Perspective
Gupta, Shweta
2010-01-01
The Models and Modeling Perspectives (MMP) has evolved out of research that began 26 years ago. MMP researchers use Model Eliciting Activities (MEAs) to elicit students' mental models. In this study MMP was used as the conceptual framework to investigate the nature of students' models of curve fitting in a problem-solving environment consisting of…
1989-01-01
A wooden model of the ALEPH experiment and its cavern. ALEPH was one of 4 experiments at CERN's 27km Large Electron Positron collider (LEP) that ran from 1989 to 2000. During 11 years of research, LEP's experiments provided a detailed study of the electroweak interaction. Measurements performed at LEP also proved that there are three – and only three – generations of particles of matter. LEP was closed down on 2 November 2000 to make way for the construction of the Large Hadron Collider in the same tunnel. The cavern and detector are in separate locations - the cavern is stored at CERN and the detector is temporarily on display in Glasgow physics department. Both are available for loan.
Energy Technology Data Exchange (ETDEWEB)
Wilhelm, Christoph [Forschungszentrum Karlsruhe GmbH, Karlsruhe (Germany). Physikalisches Messlabor
2009-07-01
The development of activity measurement techniques started together with the discovery of radioactivity in 1896. The rise of nuclear technology in the 1950s gave a great impulse to this development. The detection techniques used today were developed mainly at that time and in the following years. With the huge progress in the semiconductor industry and in computer technology, the application of the measuring processes developed then has become much simpler. Today, even commercial measuring systems for low-level measurements of radioactivity are available. (orig.)
The IMACLIM model; Le modele IMACLIM
Energy Technology Data Exchange (ETDEWEB)
NONE
2003-07-01
This document provides annexes to the IMACLIM model which offer an updated description of IMACLIM, a model designed as a tool for evaluating greenhouse gas reduction policies. The model is described in a version coupled with POLES, a techno-economic model of the energy industry. Notations, equations, sources, processing and specifications are proposed and detailed. (A.L.B.)
Building Mental Models by Dissecting Physical Models
Srivastava, Anveshna
2016-01-01
When students build physical models from prefabricated components to learn about model systems, there is an implicit trade-off between the physical degrees of freedom in building the model and the intensity of instructor supervision needed. Models that are too flexible, permitting multiple possible constructions, require greater supervision to…
Atmospheric Models/Global Atmospheric Modeling
1998-09-30
Atmospheric Models/Global Atmospheric Modeling. Timothy F. Hogan, Naval Research Laboratory, Monterey, CA 93943-5502. ...(initialization of increments, improved cloud prediction, and improved surface fluxes) have been transitioned to 6.4 (Global Atmospheric Models, PE 0603207N, X-0513).
Models in architectural design
Pauwels, Pieter
2017-01-01
Whereas architects and construction specialists used to rely mainly on sketches and physical models as representations of their own cognitive design models, they now rely more and more on computer models. Parametric models, generative models, as-built models, building information models (BIM), and so forth are used daily by practitioners in architectural design and construction. Although processes of abstraction and the actual architectural model-based reasoning itself of course rema...
International Nuclear Information System (INIS)
Tozini, A.V.
1984-01-01
A review is made of some properties of rotating Universe models. Godel's model is identified as a generalized tilted model. Some properties of new solutions of Einstein's equations, which are rotating non-stationary Universe models, are presented and analyzed. These models have Godel's model as a particular case. Non-stationary cosmological models are found which generalize Godel's metric in a way analogous to that in which Friedmann's model generalizes Einstein's. (L.C.) [pt
DEFF Research Database (Denmark)
Branlard, Emmanuel Simon Pierre
2017-01-01
Different models of wake expansion are presented in this chapter: the 1D momentum theory model, the cylinder analog model and Theodorsen’s model. Far wake models such as the ones from Frandsen or Rathmann are only briefly mentioned. The different models are compared to each other. Results from...
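As a concrete reference point, the 1D momentum theory model mentioned first can be sketched in a few lines: the thrust coefficient CT = 4a(1 − a) fixes the axial induction factor a, and mass conservation between the rotor plane (velocity (1 − a)U) and the far wake (velocity (1 − 2a)U) gives the expansion ratio. This is a sketch of the textbook momentum-theory result only, not of the cylinder analog or Theodorsen models; the function names are illustrative.

```python
import math

def induction_from_ct(ct):
    """Axial induction factor a from CT = 4a(1 - a),
    taking the physical (a < 0.5) root of the quadratic."""
    if not 0.0 <= ct < 1.0:
        raise ValueError("momentum theory is valid for 0 <= CT < 1")
    return 0.5 * (1.0 - math.sqrt(1.0 - ct))

def wake_expansion_ratio(ct):
    """Far-wake to rotor diameter ratio D_w/D from mass conservation:
    the stream tube slows from (1 - a)U at the disc to (1 - 2a)U downstream."""
    a = induction_from_ct(ct)
    return math.sqrt((1.0 - a) / (1.0 - 2.0 * a))

# e.g. a moderately loaded rotor (CT = 0.6) expands by roughly 14%
ratio = wake_expansion_ratio(0.6)
```

The ratio diverges as CT approaches 1 (a → 0.5), which is exactly where simple momentum theory breaks down and the far-wake models the chapter mentions take over.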
Model Manipulation for End-User Modelers
DEFF Research Database (Denmark)
Acretoaie, Vlad
End-user modelers are domain experts who create and use models as part of their work. They are typically not Software Engineers, and have little or no programming and meta-modeling experience. However, using model manipulation languages developed in the context of Model-Driven Engineering often...... of these proposals. To achieve its first goal, the thesis presents the findings of a Systematic Mapping Study showing that human factors topics are scarcely and relatively poorly addressed in model transformation research. Motivated by these findings, the thesis explores the requirements of end-user modelers......, and transformations using their modeling notation and editor of choice. The VM* languages are implemented via a single execution engine, the VM* Runtime, built on top of the Henshin graph-based transformation engine. This approach combines the benefits of flexibility, maturity, and formality. To simplify model editor...
Model-to-model interface for multiscale materials modeling
Energy Technology Data Exchange (ETDEWEB)
Antonelli, Perry Edward [Iowa State Univ., Ames, IA (United States)
2017-12-17
A low-level model-to-model interface is presented that will enable independent models to be linked into an integrated system of models. The interface is based on a standard set of functions that contain appropriate export and import schemas that enable models to be linked with no changes to the models themselves. These ideas are presented in the context of a specific multiscale material problem that couples atomistic-based molecular dynamics calculations to continuum calculations of fluid flow. These simulations will be used to examine the influence of interactions of the fluid with an adjacent solid on the fluid flow. The interface will also be examined by adding it to an already existing modeling code, Large-scale Atomic/Molecular Massively Parallel Simulator (LAMMPS) and comparing it with our own molecular dynamics code.
Concept Modeling vs. Data modeling in Practice
DEFF Research Database (Denmark)
Madsen, Bodil Nistrup; Erdman Thomsen, Hanne
2015-01-01
This chapter shows the usefulness of terminological concept modeling as a first step in data modeling. First, we introduce terminological concept modeling with terminological ontologies, i.e. concept systems enriched with characteristics modeled as feature specifications. This enables a formal...... account of the inheritance of characteristics and allows us to introduce a number of principles and constraints which render concept modeling more coherent than earlier approaches. Second, we explain how terminological ontologies can be used as the basis for developing conceptual and logical data models...
Cognitive models embedded in system simulation models
International Nuclear Information System (INIS)
Siegel, A.I.; Wolf, J.J.
1982-01-01
If we are to discuss and consider cognitive models, we must first come to grips with two questions: (1) What is cognition? (2) What is a model? Presumably, the answers to these questions can provide a basis for defining a cognitive model. Accordingly, this paper first places these two questions into perspective. Then, cognitive models are set within the context of computer simulation models and a number of computer simulations of cognitive processes are described. Finally, pervasive issues are discussed vis-a-vis cognitive modeling in the computer simulation context.
Dodgson, Mark; Gann, David; Phillips, Nelson; Massa, Lorenzo; Tucci, Christopher
2014-01-01
The chapter offers a broad review of the literature at the nexus between Business Models and innovation studies, and examines the notion of Business Model Innovation in three different situations: Business Model Design in newly formed organizations, Business Model Reconfiguration in incumbent firms, and Business Model Innovation in the broad context of sustainability. Tools and perspectives to make sense of Business Models and support managers and entrepreneurs in dealing with Business Model ...
Air Quality Dispersion Modeling - Alternative Models
Models, not listed in Appendix W, that can be used in regulatory applications with case-by-case justification to the Reviewing Authority as noted in Section 3.2, Use of Alternative Models, in Appendix W.
Wake modelling combining mesoscale and microscale models
DEFF Research Database (Denmark)
Badger, Jake; Volker, Patrick; Prospathospoulos, J.
2013-01-01
parameterizations are demonstrated in the Weather Research and Forecasting mesoscale model (WRF) in an idealized atmospheric flow. The model framework is the Horns Rev I wind farm experiencing a 7.97 m/s wind from 269.4°. Three of the four parameterizations use thrust output from the CRESflow-NS microscale model......In this paper the basis for introducing thrust information from microscale wake models into mesoscale model wake parameterizations will be described. A classification system for the different types of mesoscale wake parameterizations is suggested and outlined. Four different mesoscale wake.... The characteristics of the mesoscale wake that developed from the four parameterizations are examined. In addition the mesoscale model wakes are compared to measurement data from Horns Rev I. Overall it is seen as an advantage to incorporate microscale model data in mesoscale model wake parameterizations....
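To illustrate how a microscale wake description turns thrust into the quantity a mesoscale parameterization needs, the classic Jensen (Park) model converts a thrust coefficient into a velocity deficit behind a linearly expanding wake. This is a generic textbook model, not the CRESflow-NS solver used in the paper, and the decay constant k = 0.05 is merely a typical offshore value assumed here.

```python
import math

def jensen_deficit(ct, x, rotor_d, k=0.05):
    """Fractional velocity deficit a distance x downstream of a turbine
    (Jensen/Park model): the wake diameter grows linearly with slope k."""
    if x <= 0:
        raise ValueError("downstream distance must be positive")
    a = 0.5 * (1.0 - math.sqrt(1.0 - ct))   # axial induction from CT
    return 2.0 * a / (1.0 + 2.0 * k * x / rotor_d) ** 2

# deficit 7 rotor diameters behind a CT = 0.8 turbine with an 80 m rotor
deficit = jensen_deficit(ct=0.8, x=7 * 80, rotor_d=80.0)
```

The deficit decays with distance, so a mesoscale grid cell far from the farm sees only a weak momentum sink, which is the kind of information the wake parameterizations aggregate.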
A Model of Trusted Measurement Model
Ma Zhili; Wang Zhihao; Dai Liang; Zhu Xiaoqin
2017-01-01
A model of Trusted Measurement supporting behavior measurement based on trusted connection architecture (TCA) with three entities and three levels is proposed, and a frame to illustrate the model is given. The model synthesizes three trusted measurement dimensions, including trusted identity, trusted status and trusted behavior, satisfies the essential requirements of trusted measurement, and unifies the TCA with three entities and three levels.
Molecular Models: Construction of Models with Magnets
Directory of Open Access Journals (Sweden)
Kalinovčić P.
2015-07-01
Full Text Available Molecular models are indispensable tools in teaching chemistry. Besides their high price, commercially available models are generally too small for classroom demonstration. This paper suggests how to make space-filling (calotte) models from Styrofoam, with magnetic balls as connectors and disc magnets for showing molecular polarity.
Target Scattering Metrics: Model-Model and Model Data comparisons
2017-12-13
be suitable for input to classification schemes. The investigated metrics are then applied to model-data comparisons. Targets used in the TIER simulations for the metrics study included a stainless steel replica of an artillery shell (Table 7). Four potential metrics were investigated; the metric based on 2D cross-correlation is typically used in classification algorithms. Model-model comparisons...
Collett, David
2002-01-01
INTRODUCTION: Some Examples; The Scope of this Book; Use of Statistical Software. STATISTICAL INFERENCE FOR BINARY DATA: The Binomial Distribution; Inference about the Success Probability; Comparison of Two Proportions; Comparison of Two or More Proportions. MODELS FOR BINARY AND BINOMIAL DATA: Statistical Modelling; Linear Models; Methods of Estimation; Fitting Linear Models to Binomial Data; Models for Binomial Response Data; The Linear Logistic Model; Fitting the Linear Logistic Model to Binomial Data; Goodness of Fit of a Linear Logistic Model; Comparing Linear Logistic Models; Linear Trend in Proportions; Comparing Stimulus-Response Relationships; Non-Convergence and Overfitting; Some other Goodness of Fit Statistics; Strategy for Model Selection; Predicting a Binary Response Probability. BIOASSAY AND SOME OTHER APPLICATIONS: The Tolerance Distribution; Estimating an Effective Dose; Relative Potency; Natural Response; Non-Linear Logistic Regression Models; Applications of the Complementary Log-Log Model. MODEL CHECKING: Definition of Re...
DEFF Research Database (Denmark)
Madsen, Henrik; Zhou, Jianjun; Hansen, Lars Henrik
1997-01-01
This paper describes a case study of identifying the physical model (or the grey box model) of a hydraulic test robot. The obtained model is intended to provide a basis for model-based control of the robot. The physical model is formulated in continuous time and is derived by application...
Automated data model evaluation
International Nuclear Information System (INIS)
Kazi, Zoltan; Kazi, Ljubica; Radulovic, Biljana
2012-01-01
The modeling process is an essential phase within information systems development and implementation. This paper presents methods and techniques for the analysis and evaluation of data model correctness. Recent methodologies and development results regarding automation of the model correctness analysis process, and its relations with ontology tools, have been presented. Key words: Database modeling, Data model correctness, Evaluation
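One data-model correctness property that lends itself to the kind of automation the paper discusses is referential integrity: every foreign key must point at an existing table and column. The sketch below is a hypothetical illustration of such an automated check, not the authors' tool; the schema layout and names are invented.

```python
# Illustrative sketch: automatically evaluate one correctness property of a
# data model -- that every declared foreign key resolves to a real column.

def check_foreign_keys(schema):
    """schema: {table: {"columns": [...], "fks": [(col, ref_table, ref_col)]}}
    Returns a list of human-readable violations (empty list = correct)."""
    errors = []
    for table, spec in schema.items():
        for col, ref_table, ref_col in spec.get("fks", []):
            if ref_table not in schema:
                errors.append(f"{table}.{col}: unknown table {ref_table}")
            elif ref_col not in schema[ref_table]["columns"]:
                errors.append(f"{table}.{col}: no column {ref_table}.{ref_col}")
    return errors

schema = {
    "customer": {"columns": ["id", "name"], "fks": []},
    "order": {"columns": ["id", "customer_id"],
              "fks": [("customer_id", "customer", "id")]},
    "invoice": {"columns": ["id", "order_id"],
                "fks": [("order_id", "orders", "id")]},  # typo: no such table
}
violations = check_foreign_keys(schema)
```

Richer evaluations (normal-form checks, naming conventions, ontology alignment) follow the same pattern: encode the rule, walk the model, report violations.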
DEFF Research Database (Denmark)
Hansen, Mads Fogtmann; Fagertun, Jens; Larsen, Rasmus
2011-01-01
This paper presents a fusion of the active appearance model (AAM) and the Riemannian elasticity framework which yields a non-linear shape model and a linear texture model – the active elastic appearance model (EAM). The non-linear elasticity shape model is more flexible than the usual linear subs...
Haiganoush Preisler; Alan Ager
2013-01-01
For applied mathematicians, forest fire models refer mainly to a non-linear dynamic system often used to simulate the spread of fire. For forest managers, forest fire models may pertain to any of the three phases of fire management: prefire planning (fire risk models), fire suppression (fire behavior models), and postfire evaluation (fire effects and economic models). In...
Willden, Jeff
2001-01-01
"Bohr's Atomic Model" is a small interactive multimedia program that introduces the viewer to a simplified model of the atom. This interactive simulation lets students build an atom using an atomic construction set. The underlying design methodology for "Bohr's Atomic Model" is model-centered instruction, which means the central model of the…
From Numeric Models to Granular System Modeling
Directory of Open Access Journals (Sweden)
Witold Pedrycz
2015-03-01
To make this study self-contained, we briefly recall the key concepts of granular computing and demonstrate how this conceptual framework and its algorithmic fundamentals give rise to granular models. We discuss several representative formal setups used in describing and processing information granules, including fuzzy sets, rough sets, and interval calculus. Key architectures of models dwell upon relationships among information granules. We demonstrate how information granularity and its optimization can be regarded as an important design asset to be exploited in system modeling, giving rise to granular models. In this regard, an important category of rule-based models along with their granular enrichments is studied in detail.
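Of the formal setups the paper recalls (fuzzy sets, rough sets, interval calculus), interval calculus is the easiest to sketch: an interval is an information granule, and arithmetic on intervals shows how granularity propagates through a model. The class below is a minimal illustration under that reading, not code from the paper.

```python
class Interval:
    """A closed interval [lo, hi] treated as an information granule."""
    def __init__(self, lo, hi):
        if lo > hi:
            raise ValueError("empty interval")
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # the product interval is bounded by the extreme endpoint products
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Interval(min(products), max(products))

    def width(self):
        return self.hi - self.lo

# a granular linear model y = a*x + b with granular coefficients
a, b = Interval(1.8, 2.2), Interval(-0.5, 0.5)
x = Interval(3.0, 3.0)          # a crisp (degenerate) input
y = a * x + b                   # the output is itself a granule
```

The width of `y` quantifies how much uncertainty the granular coefficients inject, which is the design asset the paper proposes to optimize.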
Geologic Framework Model Analysis Model Report
International Nuclear Information System (INIS)
Clayton, R.
2000-01-01
The purpose of this report is to document the Geologic Framework Model (GFM), Version 3.1 (GFM3.1) with regard to data input, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, qualification status of the model, and the differences between Version 3.1 and previous versions. The GFM represents a three-dimensional interpretation of the stratigraphy and structural features of the location of the potential Yucca Mountain radioactive waste repository. The GFM encompasses an area of 65 square miles (170 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the GFM were chosen to encompass the most widely distributed set of exploratory boreholes (the Water Table or WT series) and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The GFM was constructed from geologic map and borehole data. Additional information from measured stratigraphy sections, gravity profiles, and seismic profiles was also considered. This interim change notice (ICN) was prepared in accordance with the Technical Work Plan for the Integrated Site Model Process Model Report Revision 01 (CRWMS M and O 2000). The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The GFM is one component of the Integrated Site Model (ISM) (Figure l), which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1) Geologic Framework Model (GFM); (2) Rock Properties Model (RPM); and (3) Mineralogic Model (MM). The ISM merges the detailed project stratigraphy into model stratigraphic units that are most useful for the primary downstream models and
Geologic Framework Model Analysis Model Report
Energy Technology Data Exchange (ETDEWEB)
R. Clayton
2000-12-19
The purpose of this report is to document the Geologic Framework Model (GFM), Version 3.1 (GFM3.1) with regard to data input, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, qualification status of the model, and the differences between Version 3.1 and previous versions. The GFM represents a three-dimensional interpretation of the stratigraphy and structural features of the location of the potential Yucca Mountain radioactive waste repository. The GFM encompasses an area of 65 square miles (170 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the GFM were chosen to encompass the most widely distributed set of exploratory boreholes (the Water Table or WT series) and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The GFM was constructed from geologic map and borehole data. Additional information from measured stratigraphy sections, gravity profiles, and seismic profiles was also considered. This interim change notice (ICN) was prepared in accordance with the Technical Work Plan for the Integrated Site Model Process Model Report Revision 01 (CRWMS M&O 2000). The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The GFM is one component of the Integrated Site Model (ISM) (Figure l), which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1) Geologic Framework Model (GFM); (2) Rock Properties Model (RPM); and (3) Mineralogic Model (MM). The ISM merges the detailed project stratigraphy into model stratigraphic units that are most useful for the primary downstream models and the
Mathematical Modeling Using MATLAB
National Research Council Canada - National Science Library
Phillips, Donovan
1998-01-01
.... Mathematical Modeling Using MA MATLAB acts as a companion resource to A First Course in Mathematical Modeling with the goal of guiding the reader to a fuller understanding of the modeling process...
CSIR Research Space (South Africa)
Osburn, L
2010-01-01
Full Text Available The construction industry has turned to energy modelling in order to assist them in reducing the amount of energy consumed by buildings. However, while the energy loads of buildings can be accurately modelled, energy models often under...
DEFF Research Database (Denmark)
Silvennoinen, Annastiina; Teräsvirta, Timo
This article contains a review of multivariate GARCH models. The most common GARCH models are presented and their properties considered. This also includes nonparametric and semiparametric models. Existing specification and misspecification tests are discussed. Finally, there is an empirical example...
Hiemstra, Djoerd; Liu, Ling; Tamer Özsu, M.
2017-01-01
In language modeling, n-gram models are probabilistic models of text that use some limited amount of history, or word dependencies, where n refers to the number of words that participate in the dependence relation.
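To make the definition concrete: for n = 2 (a bigram model), each word depends only on the one word before it, and the conditional probabilities can be estimated from counts. The sketch below assumes simple maximum-likelihood estimation without smoothing; the function names are illustrative.

```python
from collections import Counter

def bigram_model(tokens):
    """Maximum-likelihood bigram model: P(w2 | w1) = c(w1 w2) / c(w1),
    where c(.) counts occurrences in the training tokens."""
    unigrams = Counter(tokens[:-1])          # histories (last token never is one)
    bigrams = Counter(zip(tokens, tokens[1:]))
    return lambda w1, w2: bigrams[(w1, w2)] / unigrams[w1] if unigrams[w1] else 0.0

corpus = "the cat sat on the mat".split()
p = bigram_model(corpus)
# "the" occurs twice as a history, once followed by "cat": P(cat | the) = 1/2
```

Real n-gram models add smoothing so unseen word pairs do not get probability zero, but the limited-history idea is exactly this.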
D'Souza, Austin
2013-01-01
Presentation given on 13 May 2013 at the meeting "Business Model Canvas Challenge Assen". The Business Model Canvas was designed by Alex Osterwalder. The model is very clearly organized and consists of nine building blocks.
Earth Data Analysis Center, University of New Mexico — The model combines three modeled fire behavior parameters (rate of spread, flame length, crown fire potential) and one modeled ecological health measure (fire regime...
DEFF Research Database (Denmark)
De Giovanni, Domenico
prepayment models for mortgage backed securities, this paper builds a Rational Expectation (RE) model describing the policyholders' behavior in lapsing the contract. A market model with stochastic interest rates is considered, and the pricing is carried out through numerical approximation...
DEFF Research Database (Denmark)
De Giovanni, Domenico
2010-01-01
prepayment models for mortgage backed securities, this paper builds a Rational Expectation (RE) model describing the policyholders' behavior in lapsing the contract. A market model with stochastic interest rates is considered, and the pricing is carried out through numerical approximation...
Brax, P.; Martin, J.; Riazuelo, A.
2001-01-01
A short review of some of the aspects of quintessence model building is presented. We emphasize the role of tracking models and their possible supersymmetric origin.
Computational neurogenetic modeling
Benuskova, Lubica
2010-01-01
Computational Neurogenetic Modeling is a student text, introducing the scope and problems of a new scientific discipline - Computational Neurogenetic Modeling (CNGM). CNGM is concerned with the study and development of dynamic neuronal models for modeling brain functions with respect to genes and dynamic interactions between genes. These include neural network models and their integration with gene network models. This new area brings together knowledge from various scientific disciplines, such as computer and information science, neuroscience and cognitive science, genetics and molecular biol
Overuse Injury Assessment Model
National Research Council Canada - National Science Library
Stuhmiller, James H; Shen, Weixin; Sih, Bryant
2005-01-01
.... Previously, we developed a preliminary model that predicted the stress fracture rate and used biomechanical modeling, nonlinear optimization for muscle force, and bone structural analysis to estimate...
Finch, W Holmes; Kelley, Ken
2014-01-01
A powerful tool for analyzing nested designs in a variety of fields, multilevel/hierarchical modeling allows researchers to account for data collected at multiple levels. Multilevel Modeling Using R provides you with a helpful guide to conducting multilevel data modeling using the R software environment.After reviewing standard linear models, the authors present the basics of multilevel models and explain how to fit these models using R. They then show how to employ multilevel modeling with longitudinal data and demonstrate the valuable graphical options in R. The book also describes models fo
Cosmological models without singularities
International Nuclear Information System (INIS)
Petry, W.
1981-01-01
A previously studied theory of gravitation in flat space-time is applied to homogeneous and isotropic cosmological models. There exist two different classes of models without singularities: (i) ever-expanding models, (ii) oscillating models. The first class contains models with a hot big bang. For these models there exist at the beginning of the universe, in contrast to Einstein's theory, very high but finite densities of matter and radiation, with a big bang of very short duration. After a short time these models pass into the homogeneous and isotropic models of Einstein's theory with spatial curvature equal to zero and cosmological constant α ≥ 0. (author)
DEFF Research Database (Denmark)
Knudsen, Torben
2011-01-01
model structure suggested by University of Lund, the WP4 leader. This particular model structure has the advantage that it fits better into the control design framework used by WP3-4 than the model structures previously developed in WP2. The different model structures are first summarised....... Then issues dealing with optimal experimental design are considered. Finally the parameters are estimated in the chosen static and dynamic models and a validation is performed. Two of the static models, one of them the additive model, explain the data well. In case of dynamic models the suggested additive...
Federal Laboratory Consortium — The Environmental Modeling Center provides the computational tools to perform geostatistical analysis, to model ground water and atmospheric releases for comparison...
National Aeronautics and Space Administration — Claire Monteleoni, Gavin Schmidt, and Shailesh Saroha. Climate models are complex mathematical models designed by meteorologists, geophysicists, and climate...
Regularized Structural Equation Modeling
Jacobucci, Ross; Grimm, Kevin J.; McArdle, John J.
2016-01-01
A new method is proposed that extends the use of regularization in both lasso and ridge regression to structural equation models. The method is termed regularized structural equation modeling (RegSEM). RegSEM penalizes specific parameters in structural equation models, with the goal of creating simpler, easier-to-understand models. Although regularization has gained wide adoption in regression, very little has transferred to models with latent variables. By adding penalties to specific parameters in a structural equation model, researchers gain a high level of flexibility in reducing model complexity, overcoming poorly fitting models, and creating models that are more likely to generalize to new samples. The proposed method was evaluated through a simulation study, two illustrative examples involving a measurement model, and one empirical example involving the structural part of the model to demonstrate RegSEM’s utility. PMID:27398019
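RegSEM itself penalizes parameters inside a full structural equation model; the penalization mechanism is easiest to see on a plain regression, where the lasso's soft-thresholding step drives weak coefficients exactly to zero. The coordinate-descent sketch below shows that mechanism only, under invented toy data, and is not the RegSEM estimator.

```python
def soft_threshold(z, t):
    """The lasso proximal step: shrink z toward zero by t, clipping at zero."""
    if z > t:
        return z - t
    if z < -t:
        return z + t
    return 0.0

def lasso(X, y, lam, n_iter=200):
    """Coordinate-descent lasso for y ~ X @ beta with penalty lam * sum|beta|."""
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(n_iter):
        for j in range(p):
            # residual excluding feature j's current contribution
            r = [y[i] - sum(X[i][k] * beta[k] for k in range(p) if k != j)
                 for i in range(n)]
            rho = sum(X[i][j] * r[i] for i in range(n))
            z = sum(X[i][j] ** 2 for i in range(n))
            beta[j] = soft_threshold(rho, lam) / z
    return beta

# toy data: y depends on the first feature only; the second is noise
X = [[1, 1], [2, 1], [3, 0], [4, 0]]
y = [2, 4, 6, 8]
beta = lasso(X, y, lam=3.0)   # the irrelevant coefficient lands exactly at 0
```

In RegSEM the same penalty is added to the SEM fit function, so structural paths or factor loadings, rather than regression weights, are the parameters being shrunk.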
Integrated Site Model Process Model Report
International Nuclear Information System (INIS)
Booth, T.
2000-01-01
The Integrated Site Model (ISM) provides a framework for discussing the geologic features and properties of Yucca Mountain, which is being evaluated as a potential site for a geologic repository for the disposal of nuclear waste. The ISM is important to the evaluation of the site because it provides 3-D portrayals of site geologic, rock property, and mineralogic characteristics and their spatial variabilities. The ISM is not a single discrete model; rather, it is a set of static representations that provide three-dimensional (3-D), computer representations of site geology, selected hydrologic and rock properties, and mineralogic-characteristics data. These representations are manifested in three separate model components of the ISM: the Geologic Framework Model (GFM), the Rock Properties Model (RPM), and the Mineralogic Model (MM). The GFM provides a representation of the 3-D stratigraphy and geologic structure. Based on the framework provided by the GFM, the RPM and MM provide spatial simulations of the rock and hydrologic properties, and mineralogy, respectively. Functional summaries of the component models and their respective output are provided in Section 1.4. Each of the component models of the ISM considers different specific aspects of the site geologic setting. Each model was developed using unique methodologies and inputs, and the determination of the modeled units for each of the components is dependent on the requirements of that component. Therefore, while the ISM represents the integration of the rock properties and mineralogy into a geologic framework, the discussion of ISM construction and results is most appropriately presented in terms of the three separate components. This Process Model Report (PMR) summarizes the individual component models of the ISM (the GFM, RPM, and MM) and describes how the three components are constructed and combined to form the ISM
Better models are more effectively connected models
Nunes, João Pedro; Bielders, Charles; Darboux, Frederic; Fiener, Peter; Finger, David; Turnbull-Lloyd, Laura; Wainwright, John
2016-04-01
The concept of hydrologic and geomorphologic connectivity describes the processes and pathways which link sources (e.g. rainfall, snow and ice melt, springs, eroded areas and barren lands) to accumulation areas (e.g. foot slopes, streams, aquifers, reservoirs), and the spatial variations thereof. There are many examples of hydrological and sediment connectivity on a watershed scale; in consequence, a process-based understanding of connectivity is crucial to help managers understand their systems and adopt adequate measures for flood prevention, pollution mitigation and soil protection, among others. Modelling is often used as a tool to understand and predict fluxes within a catchment by complementing observations with model results. Catchment models should therefore be able to reproduce the linkages, and thus the connectivity of water and sediment fluxes within the systems under simulation. In modelling, a high level of spatial and temporal detail is desirable to ensure taking into account a maximum number of components, which then enables connectivity to emerge from the simulated structures and functions. However, computational constraints and, in many cases, lack of data prevent the representation of all relevant processes and spatial/temporal variability in most models. In most cases, therefore, the level of detail selected for modelling is too coarse to represent the system in a way in which connectivity can emerge; a problem which can be circumvented by representing fine-scale structures and processes within coarser scale models using a variety of approaches. This poster focuses on the results of ongoing discussions on modelling connectivity held during several workshops within COST Action Connecteur. It assesses the current state of the art of incorporating the concept of connectivity in hydrological and sediment models, as well as the attitudes of modellers towards this issue. The discussion will focus on the different approaches through which connectivity
Modelling bankruptcy prediction models in Slovak companies
Directory of Open Access Journals (Sweden)
Kovacova Maria
2017-01-01
Full Text Available Intensive research from academics and practitioners has been devoted to models for bankruptcy prediction and credit risk management. In spite of numerous studies focusing on forecasting bankruptcy using traditional statistical techniques (e.g. discriminant analysis and logistic regression) and early artificial intelligence models (e.g. artificial neural networks), there is a trend of transition to machine learning models (support vector machines, bagging, boosting, and random forest) to predict bankruptcy one year prior to the event. Comparing the performance of this unconventional approach with results obtained by discriminant analysis, logistic regression, and neural network application, it has been found that bagging, boosting, and random forest models outperform the other techniques, and that all prediction accuracy in the testing sample improves when the additional variables are included. On the other hand, the prediction accuracy of old and well-known bankruptcy prediction models is quite high. Therefore, we aim to analyse these older models on a dataset of Slovak companies to validate their prediction ability in specific conditions. Furthermore, these models will be modelled according to new trends by calculating the influence of the elimination of selected variables on their overall prediction ability.
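Logistic regression, the classical baseline the machine-learning models above are compared against, fits easily in a few lines. The sketch below trains it by plain stochastic gradient descent on invented toy financial ratios; the feature names, data, and labels are hypothetical, and real bankruptcy models would use many more variables and proper validation.

```python
import math

def train_logistic(X, y, lr=0.1, epochs=2000):
    """Stochastic-gradient-descent logistic regression (the classical
    statistical baseline for bankruptcy prediction)."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))
            err = p - yi                      # gradient of the log-loss
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict(w, b, xi):
    """Predicted probability of bankruptcy for one company."""
    z = sum(wj * xj for wj, xj in zip(w, xi)) + b
    return 1.0 / (1.0 + math.exp(-z))

# hypothetical ratios: (liquidity, leverage); label 1 = bankrupt
X = [(0.2, 0.9), (0.3, 0.8), (1.5, 0.2), (1.8, 0.3), (0.1, 1.1), (2.0, 0.1)]
y = [1, 1, 0, 0, 1, 0]
w, b = train_logistic(X, y)
```

Ensemble methods such as bagging or random forests replace this single linear decision boundary with many aggregated trees, which is where the accuracy gains reported above come from.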
Generalized latent variable modeling multilevel, longitudinal, and structural equation models
Skrondal, Anders; Rabe-Hesketh, Sophia
2004-01-01
This book unifies and extends latent variable models, including multilevel or generalized linear mixed models, longitudinal or panel models, item response or factor models, latent class or finite mixture models, and structural equation models.
International Nuclear Information System (INIS)
M. A. Wasiolek
2003-01-01
The purpose of this report is to document the biosphere model, the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), which describes radionuclide transport processes in the biosphere and associated human exposure that may arise as the result of radionuclide release from the geologic repository at Yucca Mountain. The biosphere model is one of the process models that support the Yucca Mountain Project (YMP) Total System Performance Assessment (TSPA) for the license application (LA), the TSPA-LA. The ERMYN model provides the capability of performing human radiation dose assessments. This report documents the biosphere model, which includes: (1) Describing the reference biosphere, human receptor, exposure scenarios, and primary radionuclides for each exposure scenario (Section 6.1); (2) Developing a biosphere conceptual model using site-specific features, events, and processes (FEPs), the reference biosphere, the human receptor, and assumptions (Section 6.2 and Section 6.3); (3) Building a mathematical model using the biosphere conceptual model and published biosphere models (Sections 6.4 and 6.5); (4) Summarizing input parameters for the mathematical model, including the uncertainty associated with input values (Section 6.6); (5) Identifying improvements in the ERMYN model compared with the model used in previous biosphere modeling (Section 6.7); (6) Constructing an ERMYN implementation tool (model) based on the biosphere mathematical model using GoldSim stochastic simulation software (Sections 6.8 and 6.9); (7) Verifying the ERMYN model by comparing output from the software with hand calculations to ensure that the GoldSim implementation is correct (Section 6.10); and (8) Validating the ERMYN model by corroborating it with published biosphere models; comparing conceptual models, mathematical models, and numerical results (Section 7)
Energy Technology Data Exchange (ETDEWEB)
D. W. Wu
2003-07-16
The purpose of this report is to document the biosphere model, the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), which describes radionuclide transport processes in the biosphere and associated human exposure that may arise as the result of radionuclide release from the geologic repository at Yucca Mountain. The biosphere model is one of the process models that support the Yucca Mountain Project (YMP) Total System Performance Assessment (TSPA) for the license application (LA), the TSPA-LA. The ERMYN model provides the capability of performing human radiation dose assessments. This report documents the biosphere model, which includes: (1) Describing the reference biosphere, human receptor, exposure scenarios, and primary radionuclides for each exposure scenario (Section 6.1); (2) Developing a biosphere conceptual model using site-specific features, events, and processes (FEPs), the reference biosphere, the human receptor, and assumptions (Section 6.2 and Section 6.3); (3) Building a mathematical model using the biosphere conceptual model and published biosphere models (Sections 6.4 and 6.5); (4) Summarizing input parameters for the mathematical model, including the uncertainty associated with input values (Section 6.6); (5) Identifying improvements in the ERMYN model compared with the model used in previous biosphere modeling (Section 6.7); (6) Constructing an ERMYN implementation tool (model) based on the biosphere mathematical model using GoldSim stochastic simulation software (Sections 6.8 and 6.9); (7) Verifying the ERMYN model by comparing output from the software with hand calculations to ensure that the GoldSim implementation is correct (Section 6.10); and (8) Validating the ERMYN model by corroborating it with published biosphere models; comparing conceptual models, mathematical models, and numerical results (Section 7).
Energy Technology Data Exchange (ETDEWEB)
M. A. Wasiolek
2003-10-27
The purpose of this report is to document the biosphere model, the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), which describes radionuclide transport processes in the biosphere and associated human exposure that may arise as the result of radionuclide release from the geologic repository at Yucca Mountain. The biosphere model is one of the process models that support the Yucca Mountain Project (YMP) Total System Performance Assessment (TSPA) for the license application (LA), the TSPA-LA. The ERMYN model provides the capability of performing human radiation dose assessments. This report documents the biosphere model, which includes: (1) Describing the reference biosphere, human receptor, exposure scenarios, and primary radionuclides for each exposure scenario (Section 6.1); (2) Developing a biosphere conceptual model using site-specific features, events, and processes (FEPs), the reference biosphere, the human receptor, and assumptions (Section 6.2 and Section 6.3); (3) Building a mathematical model using the biosphere conceptual model and published biosphere models (Sections 6.4 and 6.5); (4) Summarizing input parameters for the mathematical model, including the uncertainty associated with input values (Section 6.6); (5) Identifying improvements in the ERMYN model compared with the model used in previous biosphere modeling (Section 6.7); (6) Constructing an ERMYN implementation tool (model) based on the biosphere mathematical model using GoldSim stochastic simulation software (Sections 6.8 and 6.9); (7) Verifying the ERMYN model by comparing output from the software with hand calculations to ensure that the GoldSim implementation is correct (Section 6.10); and (8) Validating the ERMYN model by corroborating it with published biosphere models; comparing conceptual models, mathematical models, and numerical results (Section 7).
Lumped Thermal Household Model
DEFF Research Database (Denmark)
Biegel, Benjamin; Andersen, Palle; Stoustrup, Jakob
2013-01-01
a lumped model approach as an alternative to the individual models. In the lumped model, the portfolio is seen as baseline consumption superimposed with an ideal storage of limited power and energy capacity. The benefit of such a lumped model is that the computational effort of flexibility optimization...
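The ideal-storage abstraction described in the abstract can be sketched as a single update rule. The function name, limits, and the order of the clipping steps below are our own illustration, not the paper's formulation:

```python
def step_storage(energy, power_request, p_max, e_max, dt_h=1.0):
    """One time step of an ideal storage of limited power and energy
    capacity: the requested (dis)charge power is clipped first by the
    power limit, then so that the stored energy stays within [0, e_max].
    All names and limits are illustrative assumptions."""
    p = max(-p_max, min(p_max, power_request))
    p = max((0.0 - energy) / dt_h, min((e_max - energy) / dt_h, p))
    return energy + p * dt_h, p

# Charging request of 2.0 against a 1.0 power limit and 0.5 of headroom:
e, p = step_storage(energy=0.5, power_request=2.0, p_max=1.0, e_max=1.0)
# Discharge request that would empty the storage past zero:
e2, p2 = step_storage(energy=0.2, power_request=-5.0, p_max=1.0, e_max=1.0)
```

The energy clip dominating the power clip in the first call (0.5 accepted, not 1.0) is exactly the "limited power *and* energy capacity" behaviour the lumped model relies on.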
DEFF Research Database (Denmark)
Larsen, Bjarke Alexander; Andkjær, Kasper Ingdahl; Schoenau-Fog, Henrik
2015-01-01
This paper proposes a new relation model, called "The Moody Mask model", for Interactive Digital Storytelling (IDS), based on Francesco Osborne's "Mask Model" from 2011. This, mixed with some elements from Chris Crawford's Personality Models, is a system designed for dynamic interaction between ch...
DEFF Research Database (Denmark)
Hansen, Peter Reinhard; Lunde, Asger; Nason, James M.
The paper introduces the model confidence set (MCS) and applies it to the selection of models. An MCS is a set of models that is constructed such that it will contain the best model with a given level of confidence. The MCS is in this sense analogous to a confidence interval for a parameter. The MCS...
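A rough illustration of the elimination idea behind an MCS: the actual procedure of Hansen, Lunde and Nason uses bootstrap-based equivalence tests and critical values, so the fixed t-cutoff below is only a stand-in, and the loss data are invented:

```python
import math

def mcs_simplified(losses, t_cut=2.0):
    """Simplified MCS-style elimination: while the largest pairwise
    t-statistic of loss differentials exceeds a fixed cutoff (a stand-in
    for the bootstrap critical value), drop the model on the losing side."""
    models = set(losses)
    def tstat(a, b):
        d = [x - y for x, y in zip(losses[a], losses[b])]
        n = len(d)
        mean = sum(d) / n
        var = sum((x - mean) ** 2 for x in d) / (n - 1)
        return mean / math.sqrt(var / n + 1e-12)
    while len(models) > 1:
        pairs = [(tstat(a, b), a) for a in models for b in models if a != b]
        tmax, worst = max(pairs)
        if tmax < t_cut:          # no model significantly inferior: stop
            break
        models.discard(worst)
    return models

# Toy losses: "bad" is systematically worse; the two "good" models are tied.
losses = {
    "good1": [1.0, 1.1, 0.9, 1.0, 1.05, 0.95, 1.0, 1.02],
    "good2": [1.02, 1.08, 0.92, 0.98, 1.04, 0.96, 1.01, 1.0],
    "bad":   [2.0, 2.1, 1.9, 2.0, 2.05, 1.95, 2.0, 2.02],
}
surviving = mcs_simplified(losses)
```

The surviving set keeps both statistically indistinguishable models, which is the sense in which the MCS is analogous to a confidence interval rather than a single point estimate.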
Rahmani, Fouad Lazhar
2010-11-01
The aim of this paper is to present mathematical modelling of the spread of infection in the context of the transmission of the human immunodeficiency virus (HIV) and the acquired immune deficiency syndrome (AIDS). These models are based in part on the models suggested in the field of AIDS mathematical modelling as reported by ISHAM [6].
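As a toy stand-in for the class of transmission models discussed (not the equations of this paper or of Isham), a minimal S-I compartment model can be integrated by forward Euler:

```python
def simulate_si(beta, s0, i0, steps, dt=0.1):
    """Forward-Euler integration of a minimal susceptible-infected model:
    dS/dt = -beta*S*I/N, dI/dt = +beta*S*I/N. Parameter values below are
    illustrative, not epidemiological estimates."""
    s, i = float(s0), float(i0)
    n = s + i
    history = [(s, i)]
    for _ in range(steps):
        new_inf = beta * s * i / n * dt   # new infections this step
        s -= new_inf
        i += new_inf
        history.append((s, i))
    return history

hist = simulate_si(beta=0.5, s0=990, i0=10, steps=200)
final_s, final_i = hist[-1]
```

Because every infection moves one individual from S to I, the total population is conserved exactly, which is a useful sanity check on any compartment implementation.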
Numerical Modelling of Streams
DEFF Research Database (Denmark)
Vestergaard, Kristian
In recent years there has been a sharp increase in the use of numerical water quality models. Numerical water quality modelling can be divided into three steps: hydrodynamic modelling for the determination of stream flow and water levels; modelling of transport and dispersion of a conservative...
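The transport step for a conservative substance can be illustrated with a first-order upwind scheme; the scheme, grid, and tracer pulse below are our choices for illustration, not the thesis's:

```python
def advect_upwind(c, u, dx, dt, steps):
    """First-order upwind advection of a conservative tracer concentration
    profile c along a stream with constant velocity u > 0. The inflow
    boundary value c[0] is held fixed."""
    cr = u * dt / dx          # Courant number; must satisfy cr <= 1
    assert 0.0 < cr <= 1.0
    c = list(c)
    for _ in range(steps):
        c = [c[0]] + [c[j] - cr * (c[j] - c[j - 1]) for j in range(1, len(c))]
    return c

profile = [0.0] * 50
profile[5] = 1.0              # unit tracer pulse released at cell 5
out = advect_upwind(profile, u=0.5, dx=1.0, dt=1.0, steps=20)
peak_index = out.index(max(out))
```

After 20 steps at Courant number 0.5 the pulse centre has moved 10 cells downstream (to index 15), with the numerical diffusion typical of first-order upwind smearing it out; total tracer mass is conserved, matching the "conservative" substance assumption.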
R. Pietersz (Raoul); M. van Regenmortel
2005-01-01
Currently, there are two market models for valuation and risk management of interest rate derivatives, the LIBOR and swap market models. In this paper, we introduce arbitrage-free constant maturity swap (CMS) market models and generic market models featuring forward rates that span
Modeling the Accidental Deaths
Directory of Open Access Journals (Sweden)
Mariyam Hafeez
2008-01-01
Full Text Available The model for accidental deaths in the city of Lahore has been developed by using a class of Generalized Linear Models. Various link functions have been used in developing the model. Diagnostic checks have been carried out to assess the validity of the fitted model.
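A generic sketch of fitting such a GLM with a log link by iteratively reweighted least squares. The Lahore data and covariates are not reproduced here; `xs` and `ys` below are invented illustrative counts:

```python
import math

def poisson_glm_irls(x, y, iters=25):
    """Fit a Poisson GLM with log link, mu = exp(b0 + b1*x), by iteratively
    reweighted least squares (Fisher scoring with the canonical link)."""
    b0, b1 = 0.0, 0.0
    for _ in range(iters):
        mu = [math.exp(b0 + b1 * xi) for xi in x]
        w = mu                                   # working weights
        # working response z = eta + (y - mu)/mu
        z = [(b0 + b1 * xi) + (yi - mi) / mi for xi, yi, mi in zip(x, y, mu)]
        sw = sum(w)
        swx = sum(wi * xi for wi, xi in zip(w, x))
        swxx = sum(wi * xi * xi for wi, xi in zip(w, x))
        swz = sum(wi * zi for wi, zi in zip(w, z))
        swxz = sum(wi * xi * zi for wi, xi, zi in zip(w, x, z))
        det = sw * swxx - swx * swx              # 2x2 weighted normal equations
        b0 = (swxx * swz - swx * swxz) / det
        b1 = (sw * swxz - swx * swz) / det
    return b0, b1

# Counts roughly following mu = exp(0.5 + 0.3*x), for illustration only.
xs = [0, 1, 2, 3, 4, 5, 6, 7]
ys = [2, 2, 3, 4, 5, 7, 10, 13]
b0_hat, b1_hat = poisson_glm_irls(xs, ys)
```

Swapping the inverse link (here `exp`) is how the "various link functions" mentioned in the abstract enter the fit.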
Cultural Resource Predictive Modeling
2017-10-01
refining formal, inductive predictive models is the quality of the archaeological and environmental data. To build models efficiently, relevant...geomorphology, and historic information. Lessons Learned: The original model was focused on the identification of prehistoric resources. This...system but uses predictive modeling informally. For example, there is no probability for buried archaeological deposits on the Burton Mesa, but there is
Modelling Railway Interlocking Systems
DEFF Research Database (Denmark)
Lindegaard, Morten Peter; Viuf, P.; Haxthausen, Anne Elisabeth
2000-01-01
In this report we present a model of interlocking systems, and describe how the model may be validated by simulation. Station topologies are modelled by graphs in which the nodes denote track segments, and the edges denote connectivity for train traffic. Points and signals are modelled by annotatio...
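The graph view of a station described above can be sketched directly; the segment names and the single point (switch) below are invented for illustration:

```python
from collections import deque

def build_topology(point_normal):
    """Nodes are track segments, edges are traversability. A point enables
    exactly one of its two branches depending on its setting."""
    edges = [("approach", "point"), ("main", "exit_main"),
             ("siding", "exit_siding")]
    edges.append(("point", "main") if point_normal else ("point", "siding"))
    graph = {}
    for a, b in edges:
        graph.setdefault(a, set()).add(b)
        graph.setdefault(b, set()).add(a)
    return graph

def reachable(graph, src, dst):
    """Breadth-first search: can a train travel from src to dst?"""
    seen, queue = {src}, deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:
            return True
        for nxt in graph.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

g_normal = build_topology(point_normal=True)
g_reverse = build_topology(point_normal=False)
```

Route availability then reduces to graph reachability under the current point settings, which is one way such a model can be validated by simulation.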
Energy Technology Data Exchange (ETDEWEB)
Ibsen, Lars Bo; Liingaard, M.
2006-12-15
A lumped-parameter model represents the frequency-dependent soil-structure interaction of a massless foundation placed on or embedded into an unbounded soil domain. In this technical report the steps of establishing a lumped-parameter model are presented. The following sections are included in this report: Static and dynamic formulation, Simple lumped-parameter models, and Advanced lumped-parameter models. (au)
Comparing Active Vision Models
Croon, G.C.H.E. de; Sprinkhuizen-Kuyper, I.G.; Postma, E.O.
2009-01-01
Active vision models can simplify visual tasks, provided that they can select sensible actions given incoming sensory inputs. Many active vision models have been proposed, but a comparative evaluation of these models is lacking. We present a comparison of active vision models from two different
Van Bloemendaal, Karen; Dijkema, Gerard P.J.; Woerdman, Edwin; Jong, Mattheus
2015-01-01
This White Paper provides an overview of the modelling approaches adopted by the project partners in the EDGaR project 'Understanding Gas Sector Intra- and Inter- Market interactions' (UGSIIMI). The paper addresses three types of models: complementarity modelling, agent-based modelling and property
DEFF Research Database (Denmark)
Ayres, Phil
2012-01-01
This essay discusses models. It examines what models are, the roles models perform and suggests various intentions that underlie their construction and use. It discusses how models act as a conversational partner, and how they support various forms of conversation within the conversational activity...... of design. Three distinctions are drawn through which to develop this discussion of models in an architectural context. An examination of these distinctions serves to nuance particular characteristics and roles of models, the modelling activity itself and those engaged in it....
DEFF Research Database (Denmark)
Gernaey, Krist; Sin, Gürkan
2011-01-01
The state-of-the-art level reached in modeling wastewater treatment plants (WWTPs) is reported. For suspended growth systems, WWTP models have evolved from simple description of biological removal of organic carbon and nitrogen in aeration tanks (ASM1 in 1987) to more advanced levels including...... of WWTP modeling by linking the wastewater treatment line with the sludge handling line in one modeling platform. Application of WWTP models is currently rather time consuming and thus expensive due to the high model complexity, and requires a great deal of process knowledge and modeling expertise...
DEFF Research Database (Denmark)
Gernaey, Krist; Sin, Gürkan
2008-01-01
The state-of-the-art level reached in modeling wastewater treatment plants (WWTPs) is reported. For suspended growth systems, WWTP models have evolved from simple description of biological removal of organic carbon and nitrogen in aeration tanks (ASM1 in 1987) to more advanced levels including...... the practice of WWTP modeling by linking the wastewater treatment line with the sludge handling line in one modeling platform. Application of WWTP models is currently rather time consuming and thus expensive due to the high model complexity, and requires a great deal of process knowledge and modeling expertise...
Morgan, Byron JT; Tanner, Martin Abba; Carlin, Bradley P
2008-01-01
Contents: Introduction and Examples (introduction; examples of data sets); Basic Model Fitting (introduction; maximum-likelihood estimation for a geometric model; maximum-likelihood for the beta-geometric model; modelling polyspermy; which model?; what is a model for?; mechanistic models); Function Optimisation (introduction; MATLAB: graphs and finite differences; deterministic search methods; stochastic search methods; accuracy and a hybrid approach); Basic Likelihood Tools (introduction; estimating standard errors and correlations; looking at surfaces: profile log-likelihoods; confidence regions from profiles; hypothesis testing in model selection; score and Wald tests; classical goodness of fit; model selection bias); General Principles (introduction; parameterisation; parameter redundancy; boundary estimates; regression and influence; the EM algorithm; alternative methods of model fitting; non-regular problems); Simulation Techniques (introduction; simulating random variables; integral estimation; verification; Monte Carlo inference; estimating sampling distributi...
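The maximum-likelihood calculation for the geometric model listed in the contents has a closed form; the sketch below uses invented data, not an example from the book:

```python
import math

def geometric_loglik(p, data):
    """Log-likelihood of the geometric model P(X = k) = (1-p)**(k-1) * p,
    for k = 1, 2, ... and success probability p."""
    return sum((k - 1) * math.log(1.0 - p) + math.log(p) for k in data)

def geometric_mle(data):
    """Setting d(loglik)/dp = 0 gives the closed form p_hat = 1/mean."""
    return len(data) / sum(data)

data = [1, 2, 1, 4, 3, 1, 2, 2]   # invented observed counts
p_hat = geometric_mle(data)
```

Evaluating the log-likelihood on either side of `p_hat` confirms it is the maximiser, the kind of check the book's profile log-likelihood tools formalise.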
Energy Technology Data Exchange (ETDEWEB)
C. Ahlers; H. Liu
2000-03-12
The purpose of this Analysis/Model Report (AMR) is to document the Calibrated Properties Model that provides calibrated parameter sets for unsaturated zone (UZ) flow and transport process models for the Yucca Mountain Site Characterization Project (YMP). This work was performed in accordance with the ''AMR Development Plan for U0035 Calibrated Properties Model REV00''. These calibrated property sets include matrix and fracture parameters for the UZ Flow and Transport Model (UZ Model), drift seepage models, drift-scale and mountain-scale coupled-processes models, and Total System Performance Assessment (TSPA) models as well as Performance Assessment (PA) and other participating national laboratories and government agencies. These process models provide the necessary framework to test conceptual hypotheses of flow and transport at different scales and predict flow and transport behavior under a variety of climatic and thermal-loading conditions.
Energy Technology Data Exchange (ETDEWEB)
C.F. Ahlers, H.H. Liu
2001-12-18
The purpose of this Analysis/Model Report (AMR) is to document the Calibrated Properties Model that provides calibrated parameter sets for unsaturated zone (UZ) flow and transport process models for the Yucca Mountain Site Characterization Project (YMP). This work was performed in accordance with the AMR Development Plan for U0035 Calibrated Properties Model REV00 (CRWMS M&O 1999c). These calibrated property sets include matrix and fracture parameters for the UZ Flow and Transport Model (UZ Model), drift seepage models, drift-scale and mountain-scale coupled-processes models, and Total System Performance Assessment (TSPA) models as well as Performance Assessment (PA) and other participating national laboratories and government agencies. These process models provide the necessary framework to test conceptual hypotheses of flow and transport at different scales and predict flow and transport behavior under a variety of climatic and thermal-loading conditions.
DEFF Research Database (Denmark)
Justesen, Lise; Overgaard, Svend Skafte
2017-01-01
This article presents an analytical model that aims to conceptualize how meal experiences are framed when taking into account a dynamic understanding of hospitality: the meal model is named The Hospitable Meal Model. The idea behind The Hospitable Meal Model is to present a conceptual model...... that can serve as a frame for developing hospitable meal competencies among professionals working within the area of institutional foodservices as well as a conceptual model for analysing meal experiences. The Hospitable Meal Model transcends and transforms existing meal models by presenting a more open......-ended approach towards meal experiences. The underlying purpose of The Hospitable Meal Model is to provide the basis for creating value for the individuals involved in institutional meal services. The Hospitable Meal Model was developed on the basis of an empirical study on hospital meal experiences explored...
MulensModel: Microlensing light curves modeling
Poleski, Radoslaw; Yee, Jennifer
2018-03-01
MulensModel calculates light curves of microlensing events. Both single and binary lens events are modeled and various higher-order effects can be included: extended source (with limb-darkening), annual microlensing parallax, and satellite microlensing parallax. The code is object-oriented and written in Python3, and requires AstroPy (ascl:1304.002).
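The simplest light curve MulensModel produces, a point-source point-lens (Paczynski) curve, follows from a standard closed-form magnification. The sketch below implements that formula directly rather than calling MulensModel's API; the event parameters are invented:

```python
import math

def pspl_magnification(t, t0, tE, u0):
    """Point-source point-lens magnification: with impact parameter
    u(t) = sqrt(u0**2 + ((t - t0)/tE)**2) in Einstein radii,
    A(u) = (u**2 + 2) / (u * sqrt(u**2 + 4))."""
    u = math.hypot(u0, (t - t0) / tE)
    return (u * u + 2.0) / (u * math.sqrt(u * u + 4.0))

# Sample a light curve around the peak of an event with u0 = 0.1.
times = [2458000 + d for d in range(-20, 21)]
curve = [pspl_magnification(t, t0=2458000.0, tE=10.0, u0=0.1) for t in times]
peak = max(curve)
```

The peak sits at `t0` (closest approach) and the wings relax toward magnification 1; the higher-order effects listed in the abstract (limb darkening, parallax) are corrections on top of this baseline.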
Business Models and Business Model Innovation
DEFF Research Database (Denmark)
Foss, Nicolai J.; Saebi, Tina
2018-01-01
While research on business models and business model innovation continues to exhibit growth, the field is still, even after more than two decades of research, characterized by a striking lack of cumulative theorizing and an opportunistic borrowing of more or less related ideas from neighbouring
Takahashi, Takehiro; Schibuya, Noboru
EMC simulation is now widely used in the design stage of electronic equipment to reduce electromagnetic noise. As the electromagnetic behavior calculated by the EMC simulator depends on the EMC model of the equipment supplied as input, the modeling technique is important for obtaining effective results. In this paper, a brief outline of the EMC simulator and the EMC model is given. Some modeling techniques for EMC simulation are also described, with an example EMC model of a shielded box with an aperture.
Phenomenology of inflationary models
Olyaei, Abbas
2018-01-01
There are many inflationary models compatible with observational data. One can investigate inflationary models by looking at their general features, which are common to most of the models. Here we have investigated some of the single-field models, without considering their origin, in order to determine their phenomenology. We have shown how to adjust the simple harmonic oscillator model so that it is in good agreement with observational data.
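For the quadratic (simple harmonic oscillator) potential V(φ) = ½m²φ², the standard slow-roll quantities have closed forms. The expressions below are the textbook first-order slow-roll formulas in reduced Planck units, not results taken from this paper:

```python
import math

def slow_roll_params(phi):
    """For V = 0.5*m**2*phi**2 (M_pl = 1):
    epsilon = 0.5*(V'/V)**2 = 2/phi**2 and eta = V''/V = 2/phi**2."""
    eps = 0.5 * (2.0 / phi) ** 2
    eta = 2.0 / phi ** 2
    return eps, eta

def spectral_index(phi):
    """Scalar spectral index to first order: n_s = 1 - 6*eps + 2*eta."""
    eps, eta = slow_roll_params(phi)
    return 1.0 - 6.0 * eps + 2.0 * eta

def efolds(phi_start, phi_end):
    """N = integral of V/V' dphi = (phi_start**2 - phi_end**2) / 4."""
    return (phi_start ** 2 - phi_end ** 2) / 4.0

phi_end = math.sqrt(2.0)             # end of inflation: epsilon = 1
phi60 = math.sqrt(4.0 * 60 + 2.0)    # field value 60 e-folds before the end
ns = spectral_index(phi60)
```

This reproduces the familiar n_s ≈ 0.967 at 60 e-folds for the quadratic potential, the kind of general-feature comparison against observational data the abstract describes.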
Goldstein, Harvey
2011-01-01
This book provides a clear introduction to this important area of statistics. The author provides a wide of coverage of different kinds of multilevel models, and how to interpret different statistical methodologies and algorithms applied to such models. This 4th edition reflects the growth and interest in this area and is updated to include new chapters on multilevel models with mixed response types, smoothing and multilevel data, models with correlated random effects and modeling with variance.
DEFF Research Database (Denmark)
Langseth, Helge; Nielsen, Thomas Dyhre
2005-01-01
parametric family of distributions. In this paper we propose a new set of models for classification in continuous domains, termed latent classification models. The latent classification model can roughly be seen as combining the NB model with a mixture of factor analyzers, thereby relaxing the assumptions...... classification model, and we demonstrate empirically that the accuracy of the proposed model is significantly higher than the accuracy of other probabilistic classifiers....
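The Naive Bayes baseline that latent classification models generalise can be sketched as follows; this is plain Gaussian Naive Bayes on invented data, and the mixture-of-factor-analyzers extension itself is not implemented here:

```python
import math

def gaussian_nb_fit(X, y):
    """Fit per-class feature means/variances and log-priors, assuming
    conditional independence of features given the class (the assumption
    latent classification models relax)."""
    model = {}
    for c in set(y):
        rows = [x for x, yi in zip(X, y) if yi == c]
        means = [sum(col) / len(rows) for col in zip(*rows)]
        vars_ = [sum((v - m) ** 2 for v in col) / len(rows) + 1e-9
                 for col, m in zip(zip(*rows), means)]
        model[c] = (math.log(len(rows) / len(X)), means, vars_)
    return model

def gaussian_nb_predict(model, x):
    """Return the class maximising log-prior + sum of Gaussian log-densities."""
    def logpdf(v, m, s2):
        return -0.5 * (math.log(2 * math.pi * s2) + (v - m) ** 2 / s2)
    return max(model, key=lambda c: model[c][0] + sum(
        logpdf(v, m, s2) for v, m, s2 in zip(x, model[c][1], model[c][2])))

# Two well-separated invented clusters.
X = [[0.0, 0.1], [0.2, 0.0], [0.1, 0.2], [2.0, 2.1], [2.2, 1.9], [1.9, 2.0]]
y = [0, 0, 0, 1, 1, 1]
nb = gaussian_nb_fit(X, y)
```

Replacing the per-class diagonal Gaussian with a mixture of factor analyzers is what lets the latent classification model capture within-class feature correlations.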
Geochemistry Model Validation Report: External Accumulation Model
Energy Technology Data Exchange (ETDEWEB)
K. Zarrabi
2001-09-27
The purpose of this Analysis and Modeling Report (AMR) is to validate the External Accumulation Model that predicts accumulation of fissile materials in fractures and lithophysae in the rock beneath a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. (Lithophysae are voids in the rock having concentric shells of finely crystalline alkali feldspar, quartz, and other materials that were formed due to entrapped gas that later escaped, DOE 1998, p. A-25.) The intended use of this model is to estimate the quantities of external accumulation of fissile material for use in external criticality risk assessments for different types of degrading WPs: U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The scope of the model validation is to (1) describe the model and the parameters used to develop the model, (2) provide rationale for selection of the parameters by comparisons with measured values, and (3) demonstrate that the parameters chosen are the most conservative selection for external criticality risk calculations. To demonstrate the applicability of the model, a Pu-ceramic WP is used as an example. The model begins with a source term from separately documented EQ6 calculations; where the source term is defined as the composition versus time of the water flowing out of a breached waste package (WP). Next, PHREEQC, is used to simulate the transport and interaction of the source term with the resident water and fractured tuff below the repository. In these simulations the primary mechanism for accumulation is mixing of the high pH, actinide-laden source term with resident water; thus lowering the pH values sufficiently for fissile minerals to become insoluble and precipitate. In the final section of the model, the outputs from PHREEQC, are processed to produce mass of accumulation
Geochemistry Model Validation Report: External Accumulation Model
International Nuclear Information System (INIS)
Zarrabi, K.
2001-01-01
The purpose of this Analysis and Modeling Report (AMR) is to validate the External Accumulation Model that predicts accumulation of fissile materials in fractures and lithophysae in the rock beneath a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. (Lithophysae are voids in the rock having concentric shells of finely crystalline alkali feldspar, quartz, and other materials that were formed due to entrapped gas that later escaped, DOE 1998, p. A-25.) The intended use of this model is to estimate the quantities of external accumulation of fissile material for use in external criticality risk assessments for different types of degrading WPs: U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The scope of the model validation is to (1) describe the model and the parameters used to develop the model, (2) provide rationale for selection of the parameters by comparisons with measured values, and (3) demonstrate that the parameters chosen are the most conservative selection for external criticality risk calculations. To demonstrate the applicability of the model, a Pu-ceramic WP is used as an example. The model begins with a source term from separately documented EQ6 calculations; where the source term is defined as the composition versus time of the water flowing out of a breached waste package (WP). Next, PHREEQC, is used to simulate the transport and interaction of the source term with the resident water and fractured tuff below the repository. In these simulations the primary mechanism for accumulation is mixing of the high pH, actinide-laden source term with resident water; thus lowering the pH values sufficiently for fissile minerals to become insoluble and precipitate. In the final section of the model, the outputs from PHREEQC, are processed to produce mass of accumulation
Crop rotation modelling - A European model intercomparison
DEFF Research Database (Denmark)
Kollas, Chris; Kersebaum, Kurt C; Nendel, Claas
2015-01-01
crop growth simulation models to predict yields in crop rotations at five sites across Europe under minimal calibration. Crop rotations encompassed 301 seasons of ten crop types common to European agriculture and a diverse set of treatments (irrigation, fertilisation, CO2 concentration, soil types...... accurately than main crops (cereals). The majority of models performed better for the treatments of increased CO2 and nitrogen fertilisation than for irrigation and soil-related treatments. The yield simulation of the multi-model ensemble reduced the error compared to single-model simulations. The low degree...... representation of crop rotations, further research is required to synthesise existing knowledge of the physiology of intermediate crops and of carry-over effects from the preceding to the following crop, and to implement/improve the modelling of processes that condition these effects....
Modelling of an homogeneous equilibrium mixture model
International Nuclear Information System (INIS)
Bernard-Champmartin, A.; Poujade, O.; Mathiaud, J.; Mathiaud, J.; Ghidaglia, J.M.
2014-01-01
We present here a model for two-phase flows which is simpler than the 6-equation models (with two densities, two velocities, two temperatures) but more accurate than the standard mixture models with 4 equations (with two densities, one velocity and one temperature). We are interested in the case when the two phases have been interacting long enough for the drag force to be small but still not negligible. The so-called Homogeneous Equilibrium Mixture Model (HEM) that we present deals with both mixture and relative quantities, allowing one in particular to follow both a mixture velocity and a relative velocity. This relative velocity is not tracked by a conservation law but by a closure law (drift relation), whose expression is related to the drag force terms of the two-phase flow. After the derivation of the model, a stability analysis and numerical experiments are presented. (authors)
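The mixture and relative quantities such a model tracks can be illustrated directly. The drift closure for the relative velocity is the paper's contribution and is not reproduced; here the relative velocity is simply computed from given phase velocities with invented values:

```python
def mixture_state(alpha, rho1, rho2, u1, u2):
    """Mixture quantities for volume fraction alpha of phase 1:
    mixture density, mass-averaged mixture velocity, and the relative
    (drift) velocity that HEM-type models close with a drag-based law."""
    rho_m = alpha * rho1 + (1.0 - alpha) * rho2
    u_m = (alpha * rho1 * u1 + (1.0 - alpha) * rho2 * u2) / rho_m
    u_r = u1 - u2
    return rho_m, u_m, u_r

# Illustrative water/air-like values: 30% liquid by volume.
rho_m, u_m, u_r = mixture_state(alpha=0.3, rho1=1000.0, rho2=1.2,
                                u1=2.0, u2=5.0)
```

Note how the mass-averaged mixture velocity stays close to the heavy phase's velocity even though the light phase moves faster, which is why tracking the relative velocity separately adds information the 4-equation mixture models lack.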
Model Reduction in Groundwater Modeling and Management
Siade, A. J.; Kendall, D. R.; Putti, M.; Yeh, W. W.
2008-12-01
Groundwater management requires the development and implementation of mathematical models that, through simulation, evaluate the effects of anthropogenic impacts on an aquifer system. To obtain high levels of accuracy, one must incorporate high levels of complexity, resulting in computationally demanding models. This study provides a methodology for solving groundwater management problems with reduced computational effort by replacing the large, complex numerical model with a significantly smaller, simpler approximation. This is achieved via Proper Orthogonal Decomposition (POD), where the goal is to project the larger model solution space onto a smaller or reduced subspace in which the management problem will be solved, achieving reductions in computation time of up to three orders of magnitude. Once the solution is obtained in the reduced space with acceptable accuracy, it is then projected back to the full model space. A major challenge when using this method is the definition of the reduced solution subspace. In POD, this subspace is defined based on samples or snapshots taken at specific times from the solution of the full model. In this work we determine when snapshots should be taken on the basis of the exponential behavior of the governing partial differential equation. This selection strategy is then generalized for any groundwater model by obtaining and using the optimal snapshot selection for a simplified, dimensionless model. Protocols are developed to allow the snapshot selection results of the simplified, dimensionless model to be transferred to that of a complex, heterogeneous model with any geometry. The proposed methodology is finally applied to a basin in the Oristano Plain on the island of Sardinia, Italy.
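A minimal POD sketch under our own assumptions: a two-mode analytic snapshot set standing in for the full model, with log-spaced snapshot times echoing (but not reproducing) the exponential-decay-based selection idea:

```python
import numpy as np

# Snapshots of a decaying diffusion-like field, exactly spanned by two modes.
n, n_snap = 200, 30
x = np.linspace(0.0, 1.0, n)
times = np.exp(np.linspace(np.log(0.01), np.log(1.0), n_snap))  # log-spaced
snapshots = np.column_stack(
    [np.exp(-(np.pi ** 2) * t) * np.sin(np.pi * x)
     + 0.3 * np.exp(-4 * np.pi ** 2 * t) * np.sin(2 * np.pi * x)
     for t in times]
)

# POD basis from the thin SVD of the snapshot matrix.
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
basis = U[:, :2]                      # keep the two dominant modes

# Project a full state onto the reduced subspace and lift it back.
state = snapshots[:, 0]
reduced = basis.T @ state
reconstructed = basis @ reduced
err = np.linalg.norm(state - reconstructed) / np.linalg.norm(state)
```

Because the snapshots here genuinely live in a two-dimensional subspace, two modes reconstruct the state to machine precision; in a real heterogeneous aquifer model the truncation error, and hence the snapshot selection strategy, is what matters.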
Model Validation Status Review
International Nuclear Information System (INIS)
E.L. Hardin
2001-01-01
The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M and O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural and
Model Validation Status Review
Energy Technology Data Exchange (ETDEWEB)
E.L. Hardin
2001-11-28
The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M&O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural and
Modeling for Battery Prognostics
Kulkarni, Chetan S.; Goebel, Kai; Khasin, Michael; Hogge, Edward; Quach, Patrick
2017-01-01
For any battery-powered vehicle (be it an unmanned aerial vehicle, a small passenger aircraft, or an asset in exoplanetary operations) to operate at maximum efficiency and reliability, it is critical to monitor battery health as well as performance and to predict end of discharge (EOD) and end of useful life (EOL). To fulfil these needs, it is important to capture the battery's inherent characteristics as well as operational knowledge in the form of models that can be used by monitoring, diagnostic, and prognostic algorithms. Several battery modeling methodologies have been developed in the last few years as the understanding of the underlying electrochemical mechanisms has advanced. The models can generally be classified as empirical models, electrochemical engineering models, multi-physics models, and molecular/atomistic models. Empirical models are based on fitting certain functions to past experimental data, without making use of any physicochemical principles; electrical circuit equivalent models are an example of such empirical models. Electrochemical engineering models are typically continuum models that include electrochemical kinetics and transport phenomena. Each model type has its advantages and disadvantages. The former has the advantage of being computationally efficient, but has limited accuracy and robustness, due to the approximations used in the developed model, and as a result of such approximations cannot represent aging well. The latter has the advantage of being very accurate, but is often computationally inefficient, having to solve complex sets of partial differential equations, and is thus not well suited for online prognostic applications. In addition, both multi-physics and atomistic models are computationally expensive and hence even less suited to online application. An electrochemistry-based model of Li-ion batteries has been developed that captures crucial electrochemical processes, captures effects of aging, is computationally efficient
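The "electrical circuit equivalent" class mentioned in this abstract can be illustrated with a minimal sketch: constant-current discharge through an ohmic internal resistance, coulomb counting for state of charge, and a voltage threshold for EOD. The parameter values and the open-circuit-voltage curve below are illustrative assumptions, not values from the work described.

```python
# Minimal Thevenin-style equivalent-circuit battery model (an "empirical"
# model in the abstract's taxonomy). All parameter values are illustrative.

def simulate_discharge(capacity_ah=2.2, i_load=1.0, r0=0.05,
                       v_eod=3.0, dt=1.0):
    """Integrate state of charge under a constant current until the terminal
    voltage crosses the end-of-discharge (EOD) threshold. Returns EOD time (s)."""
    soc = 1.0
    t = 0.0
    while True:
        # Assumed empirical open-circuit-voltage curve as a function of SOC.
        v_oc = 3.0 + 1.2 * soc
        v_term = v_oc - i_load * r0                  # ohmic drop across R0
        if v_term <= v_eod or soc <= 0.0:
            return t
        soc -= i_load * dt / (capacity_ah * 3600.0)  # coulomb counting
        t += dt
```

Higher loads cross the EOD threshold sooner, which is the qualitative behaviour an EOD prognostic must reproduce.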
DEFF Research Database (Denmark)
Høskuldsson, Agnar
1996-01-01
Determination of the proper dimension of a given linear model is one of the most important tasks in applied modeling work. We consider here eight criteria that can be used to determine the dimension of the model, or equivalently, the number of components to use in the model. Four of these criteria ... the basic problems in determining the dimension of linear models. Then each of the eight measures is treated. The results are illustrated by examples.
DEFF Research Database (Denmark)
Cameron, Ian T.; Gani, Rafiqul
This book covers the area of product and process modelling via a case study approach. It addresses a wide range of modelling applications with emphasis on modelling methodology and the subsequent in-depth analysis of mathematical models to gain insight via structural aspects of the models. These approaches are put into the context of life cycle modelling, where multiscale and multiform modelling is increasingly prevalent in the 21st century. The book commences with a discussion of modern product and process modelling theory and practice, followed by a series of case studies drawn from a variety ... to biotechnology applications, food, polymer and human health application areas. The book highlights the important nature of modern product and process modelling in the decision making processes across the life cycle. As such it provides an important resource for students, researchers and industrial practitioners.
Modeling volatility using state space models.
Timmer, J; Weigend, A S
1997-08-01
In time series problems, noise can be divided into two categories: dynamic noise which drives the process, and observational noise which is added in the measurement process, but does not influence future values of the system. In this framework, we show that empirical volatilities (the squared relative returns of prices) exhibit a significant amount of observational noise. To model and predict their time evolution adequately, we estimate state space models that explicitly include observational noise. We obtain relaxation times for shocks in the logarithm of volatility ranging from three weeks (for foreign exchange) to three to five months (for stock indices). In most cases, a two-dimensional hidden state is required to yield residuals that are consistent with white noise. We compare these results with ordinary autoregressive models (without a hidden state) and find that autoregressive models underestimate the relaxation times by about two orders of magnitude since they do not distinguish between observational and dynamic noise. This new interpretation of the dynamics of volatility in terms of relaxators in a state space model carries over to stochastic volatility models and to GARCH models, and is useful for several problems in finance, including risk management and the pricing of derivative securities. Data sets used: Olsen & Associates high frequency DEM/USD foreign exchange rates (8 years). Nikkei 225 index (40 years). Dow Jones Industrial Average (25 years).
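The separation of observational from dynamic noise described in this abstract can be sketched with a scalar linear state space model: a hidden AR(1) log-volatility observed with additive noise, recovered by a standard Kalman filter. All parameter values are synthetic illustrations, not the paper's estimates.

```python
import numpy as np

# Hidden AR(1) state (log-volatility) plus observational noise; a Kalman
# filter separates the two. Parameter values are illustrative.
rng = np.random.default_rng(0)
n, mu, phi = 2000, -1.0, 0.98   # phi near 1 -> slow relaxation (~ -1/ln(phi) steps)
q, r = 0.02, 0.5                # dynamic vs observational noise variance

# Simulate the hidden state h and the noisy observation y.
h = np.empty(n)
h[0] = mu
for t in range(1, n):
    h[t] = mu + phi * (h[t - 1] - mu) + rng.normal(0.0, np.sqrt(q))
y = h + rng.normal(0.0, np.sqrt(r), n)

# Standard scalar Kalman filter for this model.
x, p = mu, 1.0
est = np.empty(n)
for t in range(n):
    x = mu + phi * (x - mu)           # predict state
    p = phi ** 2 * p + q              # predict variance
    k = p / (p + r)                   # Kalman gain
    x = x + k * (y[t] - x)            # update with observation
    p = (1.0 - k) * p
    est[t] = x

# Filtering should track the hidden state better than the raw observations do.
print(np.mean((est - h) ** 2) < np.mean((y - h) ** 2))
```

An AR model fit directly to y would confound the two noise sources, which is the mechanism behind the underestimated relaxation times reported in the abstract.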
Empirical Model Building Data, Models, and Reality
Thompson, James R
2011-01-01
Praise for the First Edition "This...novel and highly stimulating book, which emphasizes solving real problems...should be widely read. It will have a positive and lasting effect on the teaching of modeling and statistics in general." - Short Book Reviews This new edition features developments and real-world examples that showcase essential empirical modeling techniques Successful empirical model building is founded on the relationship between data and approximate representations of the real systems that generated that data. As a result, it is essential for researchers who construct these m
Zephyr - the prediction models
DEFF Research Database (Denmark)
Nielsen, Torben Skov; Madsen, Henrik; Nielsen, Henrik Aalborg
2001-01-01
This paper briefly describes new models and methods for predicting the wind power output from wind farms. The system is being developed in a project which has the research organization Risø and the Department of Informatics and Mathematical Modelling (IMM) as the modelling team and all the Danish utilities as partners and users. The new models are evaluated for five wind farms in Denmark as well as one wind farm in Spain. It is shown that the predictions based on conditional parametric models are superior to the predictions obtained by state-of-the-art parametric models.
International Nuclear Information System (INIS)
Harvey, M.; Khanna, F.C.
1975-01-01
The general problem of what constitutes a physical model and what is known about the free nucleon-nucleon interaction are considered. A time independent formulation of the basic equations is chosen. Construction of the average field in which particles move in a general independent particle model is developed, concentrating on problems of defining the average spherical single particle field for any given nucleus, and methods for construction of effective residual interactions and other physical operators. Deformed shell models and both spherical and deformed harmonic oscillator models are discussed in detail, and connections between spherical and deformed shell models are analyzed. A section on cluster models is included. 11 tables, 21 figures
Peabody, Hume L.
2017-01-01
This presentation is meant to be an overview of the model building process. It is based on typical techniques (Monte Carlo ray tracing for radiation exchange; lumped parameter, finite difference for the thermal solution) used by the aerospace industry. This is not intended to be a "How to Use ThermalDesktop" course; it is intended to be a "How to Build Thermal Models" course, and the techniques will be demonstrated using the capabilities of ThermalDesktop (TD). Other codes may or may not have similar capabilities. The General Model Building Process can be broken into four top-level steps: 1. Build Model; 2. Check Model; 3. Execute Model; 4. Verify Results.
Microsoft tabular modeling cookbook
Braak, Paul te
2013-01-01
This book follows a cookbook style, with recipes explaining the steps for developing analytic data using Business Intelligence Semantic Models. This book is designed for developers who wish to develop powerful and dynamic models for users, as well as those who are responsible for the administration of models in corporate environments. It is also targeted at analysts and users of Excel who wish to advance their knowledge of Excel through the development of tabular models, or who wish to analyze data through tabular modeling techniques. We assume no prior knowledge of tabular modeling.
Directory of Open Access Journals (Sweden)
Luiz Carlos Bresser-Pereira
2012-03-01
Full Text Available Besides analyzing capitalist societies historically and thinking of them in terms of phases or stages, we may compare different models or varieties of capitalism. In this paper I survey the literature on this subject, and distinguish the classifications that take a production or business approach from those that use a mainly political criterion. I identify five forms of capitalism: among the rich countries, the liberal democratic or Anglo-Saxon model, the social or European model, and the endogenous social integration or Japanese model; among developing countries, I distinguish the Asian developmental model from the liberal-dependent model that characterizes most other developing countries, including Brazil.
Geller, Michael; Telem, Ofri
2015-05-15
We present the first realization of a "twin Higgs" model as a holographic composite Higgs model. Uniquely among composite Higgs models, the Higgs potential is protected by a new standard model (SM) singlet elementary "mirror" sector at the sigma model scale f and not by the composite states at m_{KK}, naturally allowing for m_{KK} beyond the LHC reach. As a result, naturalness in our model cannot be constrained by the LHC, but may be probed by precision Higgs measurements at future lepton colliders, and by direct searches for Kaluza-Klein excitations at a 100 TeV collider.
Energy Technology Data Exchange (ETDEWEB)
D.W. Wu; A.J. Smith
2004-11-08
The purpose of this report is to document the biosphere model, the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), which describes radionuclide transport processes in the biosphere and associated human exposure that may arise as the result of radionuclide release from the geologic repository at Yucca Mountain. The biosphere model is one of the process models that support the Yucca Mountain Project (YMP) Total System Performance Assessment (TSPA) for the license application (LA), TSPA-LA. The ERMYN provides the capability of performing human radiation dose assessments. This report documents the biosphere model, which includes: (1) Describing the reference biosphere, human receptor, exposure scenarios, and primary radionuclides for each exposure scenario (Section 6.1); (2) Developing a biosphere conceptual model using site-specific features, events, and processes (FEPs) (Section 6.2), the reference biosphere (Section 6.1.1), the human receptor (Section 6.1.2), and approximations (Sections 6.3.1.4 and 6.3.2.4); (3) Building a mathematical model using the biosphere conceptual model (Section 6.3) and published biosphere models (Sections 6.4 and 6.5); (4) Summarizing input parameters for the mathematical model, including the uncertainty associated with input values (Section 6.6); (5) Identifying improvements in the ERMYN compared with the model used in previous biosphere modeling (Section 6.7); (6) Constructing an ERMYN implementation tool (model) based on the biosphere mathematical model using GoldSim stochastic simulation software (Sections 6.8 and 6.9); (7) Verifying the ERMYN by comparing output from the software with hand calculations to ensure that the GoldSim implementation is correct (Section 6.10); (8) Validating the ERMYN by corroborating it with published biosphere models; comparing conceptual models, mathematical models, and numerical results (Section 7).
Modelling of Innovation Diffusion
Directory of Open Access Journals (Sweden)
Arkadiusz Kijek
2010-01-01
Full Text Available Since the publication of the Bass model in 1969, research on the modelling of the diffusion of innovation has resulted in a vast body of scientific literature consisting of articles, books, and studies of real-world applications of this model. The main objective of the diffusion model is to describe the pattern of spread of an innovation among potential adopters in terms of a mathematical function of time. This paper assesses the state of the art in mathematical models of innovation diffusion and procedures for estimating their parameters. Moreover, theoretical issues related to the models presented are supplemented with empirical research. The purpose of the research is to explore the extent to which the diffusion of broadband Internet users in 29 OECD countries can be adequately described by three diffusion models, i.e. the Bass model, the logistic model and the dynamic model. The results of this research are ambiguous and do not indicate which model best describes the diffusion pattern of broadband Internet users; however, they do show that in most cases the dynamic model is inappropriate for describing the diffusion pattern. Issues related to the further development of innovation diffusion models are discussed and some recommendations are given. (original abstract)
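The Bass model referred to in this abstract has a closed-form adoption curve; the following sketch (with illustrative innovation and imitation coefficients, not the paper's OECD estimates) shows its characteristic S-shape.

```python
import numpy as np

# Closed-form Bass cumulative adoption F(t): fraction of eventual adopters
# by time t, with innovation coefficient p and imitation coefficient q.
def bass_cdf(t, p, q):
    e = np.exp(-(p + q) * t)
    return (1.0 - e) / (1.0 + (q / p) * e)

p, q = 0.03, 0.38                   # illustrative coefficients
t = np.linspace(0.0, 30.0, 301)
F = bass_cdf(t, p, q)               # S-shaped, monotone, F(0) = 0
f = np.gradient(F, t)               # adoption rate: rises, peaks, declines

# The adoption-rate peak occurs near ln(q/p)/(p+q), about 6.2 here.
print(t[np.argmax(f)])
```

The analytic peak time ln(q/p)/(p+q) is a convenient sanity check when fitting the model to adoption data.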
Sumner, J G; Fernández-Sánchez, J; Jarvis, P D
2012-04-07
Recent work has discussed the importance of multiplicative closure for the Markov models used in phylogenetics. For continuous-time Markov chains, a sufficient condition for multiplicative closure of a model class is ensured by demanding that the set of rate-matrices belonging to the model class form a Lie algebra. It is the case that some well-known Markov models do form Lie algebras and we refer to such models as "Lie Markov models". However, it is also the case that some other well-known Markov models unequivocally do not form Lie algebras (GTR being the most conspicuous example). In this paper, we will discuss how to generate Lie Markov models by demanding that the models have certain symmetries under nucleotide permutations. We show that the Lie Markov models include, and hence provide a unifying concept for, "group-based" and "equivariant" models. For each of two and four character states, the full list of Lie Markov models with maximal symmetry is presented and shown to include interesting examples that are neither group-based nor equivariant. We also argue that our scheme is pleasing in the context of applied phylogenetics, as, for a given symmetry of nucleotide substitution, it provides a natural hierarchy of models with an increasing number of parameters. We also note that our methods are applicable to any application of continuous-time Markov chains beyond the initial motivations we take from phylogenetics. Crown Copyright © 2011. Published by Elsevier Ltd. All rights reserved.
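The multiplicative closure property can be checked numerically for a group-based model such as Kimura's three-substitution-types (K3ST) model, whose rate matrices commute. This is a sketch consistent with the paper's framework, not the authors' code; it requires NumPy and SciPy.

```python
import numpy as np
from scipy.linalg import expm

# K3ST rate matrix: off-diagonal rates a, b, c arranged according to the
# Klein four-group structure on {A, G, C, T}; rows sum to zero.
def k3st(a, b, c):
    Q = np.array([[0, a, b, c],
                  [a, 0, c, b],
                  [b, c, 0, a],
                  [c, b, a, 0]], dtype=float)
    np.fill_diagonal(Q, -Q.sum(axis=1))
    return Q

Q1, Q2 = k3st(0.1, 0.2, 0.3), k3st(0.4, 0.1, 0.2)

# Group-based on an abelian group => the rate matrices commute ...
assert np.allclose(Q1 @ Q2, Q2 @ Q1)
# ... so the product of two K3ST substitution matrices is again a K3ST
# substitution matrix (with summed rates): multiplicative closure.
assert np.allclose(expm(Q1) @ expm(Q2), expm(Q1 + Q2))
```

For non-closed classes such as GTR, the analogous product generally leaves the class, which is the failure the paper's Lie-algebra condition rules out.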
Integrated Medical Model – Chest Injury Model
National Aeronautics and Space Administration — The Exploration Medical Capability (ExMC) Element of NASA's Human Research Program (HRP) developed the Integrated Medical Model (IMM) to forecast the resources...
Traffic & safety statewide model and GIS modeling.
2012-07-01
Several steps have been taken over the past two years to advance the Utah Department of Transportation (UDOT) safety initiative. Previous research projects began the development of a hierarchical Bayesian model to analyze crashes on Utah roadways. De...
Nonlinear Modeling by Assembling Piecewise Linear Models
Yao, Weigang; Liou, Meng-Sing
2013-01-01
To preserve the nonlinearity of a full-order system over a parameter range of interest, we propose a simple modeling approach that assembles a set of piecewise local solutions, including the first-order Taylor series terms expanded about some sampling states. The work by Rewienski and White inspired our use of piecewise linear local solutions. The assembly of these local approximations is accomplished by assigning nonlinear weights, through radial basis functions in this study. The efficacy of the proposed procedure is validated for a two-dimensional airfoil moving at different Mach numbers and pitching motions, under which the flow exhibits prominent nonlinear behaviors. All results confirm that our nonlinear model is accurate and stable for predicting not only aerodynamic forces but also detailed flowfields. Moreover, the model remains robust and accurate for inputs considerably different from the base trajectory in form and magnitude. This modeling preserves the nonlinearity of the problems considered in a rather simple and accurate manner.
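The assembly idea can be sketched in one dimension: Taylor-expand a nonlinear function about a few sampling states and blend the local linear models with normalized Gaussian RBF weights. The function sin(x) and all numerical choices below are stand-ins for the aerodynamic solutions used in the paper.

```python
import numpy as np

# Local first-order (linear) models of a nonlinear function, blended with
# normalized Gaussian RBF weights. Function and sampling states are
# illustrative stand-ins for the paper's flow solutions.
f, df = np.sin, np.cos
centers = np.linspace(0.0, np.pi, 9)       # sampling states
width = 0.3                                # RBF width (illustrative)

def model(x):
    x = np.atleast_1d(x)
    # First-order Taylor expansion about each sampling state.
    local = f(centers) + df(centers) * (x[:, None] - centers)
    w = np.exp(-((x[:, None] - centers) / width) ** 2)
    w /= w.sum(axis=1, keepdims=True)      # normalized RBF weights
    return (w * local).sum(axis=1)

xs = np.linspace(0.0, np.pi, 200)
err = np.max(np.abs(model(xs) - f(xs)))
print(err < 0.05)   # the blended local models track the nonlinear function
```

Each local model is only valid near its sampling state; the normalized weights hand off smoothly between neighbours, which is what preserves the nonlinearity over the whole range.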
Solid Waste Projection Model: Model user's guide
International Nuclear Information System (INIS)
Stiles, D.L.; Crow, V.L.
1990-08-01
The Solid Waste Projection Model (SWPM) system is an analytical tool developed by Pacific Northwest Laboratory (PNL) for Westinghouse Hanford Company (WHC) specifically to address solid waste management issues at the Hanford Central Waste Complex (HCWC). This document, one of six documents supporting the SWPM system, contains a description of the system and instructions for preparing to use SWPM and operating Version 1 of the model. 4 figs., 1 tab
National Research Council Canada - National Science Library
Feiler, Peter
2007-01-01
.... The Society of Automotive Engineers (SAE) Architecture Analysis & Design Language (AADL) is an industry-standard, architecture-modeling notation specifically designed to support a component-based approach to modeling embedded systems...
DEFF Research Database (Denmark)
Juhl, Joakim
This thesis is about mathematical modelling and technology development. While mathematical modelling has become widely deployed within a broad range of scientific practices, it has also gained a central position within technology development. The intersection of mathematical modelling ... -efficiency project, this thesis presents an analysis of the central practices that materialised representative physical modelling and implemented operational regulation models. In order to show how the project’s representative modelling and technology development connected physical theory with concrete problems ... theoretical outset, the existing literature on simulation models, and the study’s methodological and empirical approach. The purpose of this thesis is to describe the central practices that developed regulation technology for industrial production processes and to analyse how mathematical modelling...
International Nuclear Information System (INIS)
Pulkkinen, U.
2004-04-01
The report describes a simple comparison of two CCF models: the ECLM and the Beta-model. The objective of the comparison is to identify differences in the results of the models by applying them to some simple test data cases. The comparison focuses mainly on theoretical aspects of the above-mentioned CCF models. The properties of the model parameter estimates in the data cases are also discussed. The practical aspects of using and estimating CCF models in a real PSA context (e.g. data interpretation, properties of computer tools, model documentation) are not discussed in the report. Similarly, the qualitative CCF analyses needed in using the models are not discussed in the report. (au)
Modeling EERE deployment programs
Energy Technology Data Exchange (ETDEWEB)
Cort, K. A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hostick, D. J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Belzer, D. B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Livingston, O. V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)
2007-11-01
The purpose of the project was to identify and characterize the modeling of deployment programs within the EERE Technology Development (TD) programs, address possible improvements to the modeling process, and note gaps in knowledge for future research.
Controlling Modelling Artifacts
DEFF Research Database (Denmark)
Smith, Michael James Andrew; Nielson, Flemming; Nielson, Hanne Riis
2011-01-01
When analysing the performance of a complex system, we typically build abstract models that are small enough to analyse, but still capture the relevant details of the system. But it is difficult to know whether the model accurately describes the real system, or if its behaviour is due to modelling artifacts that were inadvertently introduced. In this paper, we propose a novel methodology to reason about modelling artifacts, given a detailed model and a high-level (more abstract) model of the same system. By a series of automated abstraction steps, we lift the detailed model to the same state space ... the possible configurations of the system (for example, by counting the number of components in a certain state). We motivate our methodology with a case study of the LMAC protocol for wireless sensor networks. In particular, we investigate the accuracy of a recently proposed high-level model of LMAC...
Modeling Fluid Structure Interaction
National Research Council Canada - National Science Library
Benaroya, Haym
2000-01-01
The principal goal of this program is on integrating experiments with analytical modeling to develop physics-based reduced-order analytical models of nonlinear fluid-structure interactions in articulated naval platforms...
Consistent model driven architecture
Niepostyn, Stanisław J.
2015-09-01
The goal of the MDA is to produce software systems from abstract models in a way where human interaction is restricted to a minimum. These abstract models are based on the UML language. However, the semantics of UML models is defined in a natural language, so verification of the consistency of these diagrams is needed in order to identify errors in requirements at an early stage of the development process. The verification of consistency is difficult due to the semi-formal nature of UML diagrams. We propose automatic verification of the consistency of a series of UML diagrams originating from abstract models, implemented with our consistency rules. This Consistent Model Driven Architecture approach enables us to automatically generate complete workflow applications from consistent and complete models developed from abstract models (e.g. a Business Context Diagram). Therefore, our method can be used to check the practicability (feasibility) of software architecture models.
Agena, S. M.; Pusey, M. L.; Bogle, I. D.
1999-01-01
A thermodynamic framework (UNIQUAC model with temperature dependent parameters) is applied to model the salt-induced protein crystallization equilibrium, i.e., protein solubility. The framework introduces a term for the solubility product describing protein transfer between the liquid and solid phase and a term for the solution behavior describing deviation from ideal solution. Protein solubility is modeled as a function of salt concentration and temperature for a four-component system consisting of a protein, pseudo solvent (water and buffer), cation, and anion (salt). Two different systems, lysozyme with sodium chloride and concanavalin A with ammonium sulfate, are investigated. Comparison of the modeled and experimental protein solubility data results in an average root mean square deviation of 5.8%, demonstrating that the model closely follows the experimental behavior. Model calculations and model parameters are reviewed to examine the model and protein crystallization process. Copyright 1999 John Wiley & Sons, Inc.
International Nuclear Information System (INIS)
Anon.
1977-01-01
Progress in model and code development for reactor physics calculations is summarized. The codes included CINDER-10, PHROG, RAFFLE GAPP, DCFMR, RELAP/4, PARET, and KENO. Kinetics models for the PBF were developed
National Aeronautics and Space Administration — The Galactic model is a spatial and spectral template. The model for the Galactic diffuse emission was developed using spectral line surveys of HI and CO (as a...
National Oceanic and Atmospheric Administration, Department of Commerce — The World Magnetic Model is the standard model used by the U.S. Department of Defense, the U.K. Ministry of Defence, the North Atlantic Treaty Organization (NATO)...
Amir Farbin
The ATLAS Analysis Model is a continually developing vision of how to reconcile physics analysis requirements with the ATLAS offline software and computing model constraints. In the past year this vision has influenced the evolution of the ATLAS Event Data Model, the Athena software framework, and physics analysis tools. These developments, along with the October Analysis Model Workshop and the planning for CSC analyses, have led to a rapid refinement of the ATLAS Analysis Model in the past few months. This article introduces some of the relevant issues and presents the current vision of the future ATLAS Analysis Model. Event Data Model: The ATLAS Event Data Model (EDM) consists of several levels of detail, each targeted at a specific set of tasks. For example, the Event Summary Data (ESD) stores calorimeter cells and tracking system hits, thereby permitting many calibration and alignment tasks, but will be only accessible at particular computing sites with potentially large latency. In contrast, the Analysis...
DEFF Research Database (Denmark)
Riis, Troels; Jørgensen, John Leif
1999-01-01
This document describes a test of the implementation of the ASC orbit model for the Champ satellite.
Laboratory of Biological Modeling
Federal Laboratory Consortium — The Laboratory of Biological Modeling is defined by both its methodologies and its areas of application. We use mathematical modeling in many forms and apply it to a...
Bounding species distribution models
Directory of Open Access Journals (Sweden)
Thomas J. STOHLGREN, Catherine S. JARNEVICH, Wayne E. ESAIAS,Jeffrey T. MORISETTE
2011-10-01
Full Text Available Species distribution models are increasing in popularity for mapping suitable habitat for species of management concern. Many investigators now recognize that extrapolations of these models with geographic information systems (GIS) might be sensitive to the environmental bounds of the data used in their development, yet there is no recommended best practice for “clamping” model extrapolations. We relied on two commonly used modeling approaches: classification and regression tree (CART) and maximum entropy (Maxent) models, and we tested a simple alteration of the model extrapolations, bounding extrapolations to the maximum and minimum values of primary environmental predictors, to provide a more realistic map of suitable habitat of hybridized Africanized honey bees in the southwestern United States. Findings suggest that multiple models of bounding, and the most conservative bounding of species distribution models, like those presented here, should probably replace the unbounded or loosely bounded techniques currently used [Current Zoology 57 (5): 642–647, 2011].
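The "clamping" alteration tested in this abstract amounts to bounding each environmental predictor to its training range before prediction. A minimal sketch, with a toy linear suitability score standing in for CART or Maxent, and all data values hypothetical:

```python
import numpy as np

# Training data: rows are sites, columns are predictors
# (e.g. temperature, rainfall). Values are illustrative.
train = np.array([[10.0, 200.0],
                  [15.0, 350.0],
                  [20.0, 500.0]])
lo, hi = train.min(axis=0), train.max(axis=0)   # environmental bounds

def predict(X):
    # Toy suitability score standing in for a fitted CART/Maxent model.
    return 0.02 * X[:, 0] + 0.001 * X[:, 1]

new_sites = np.array([[30.0, 800.0],   # outside the training bounds
                      [12.0, 250.0]])  # inside the training bounds
clamped = np.clip(new_sites, lo, hi)   # bounded ("clamped") extrapolation

print(predict(clamped))   # the out-of-range site is scored at the bounds
```

Sites inside the training envelope are unaffected; only extrapolations beyond the observed predictor ranges are pulled back to the bounds.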
Directory of Open Access Journals (Sweden)
Oleg Svatos
2013-01-01
Full Text Available In this paper we analyze the complexity of time limits found especially in regulated processes of public administration. First we review the most popular process modeling languages. An example scenario based on current Czech legislation is defined and then captured in the discussed process modeling languages. The analysis shows that contemporary process modeling languages support the capturing of time limits only partially, which causes trouble for analysts and unnecessary complexity in the models. Given these unsatisfying results, we analyze the complexity of time limits in greater detail and outline the lifecycles of a time limit using the multiple dynamic generalizations pattern. As an alternative to the popular process modeling languages we present the PSD process modeling language, which supports the defined lifecycles of a time limit natively and therefore allows keeping the models simple and easy to understand.
Emissions Modeling Clearinghouse
U.S. Environmental Protection Agency — The Emissions Modeling Clearinghouse (EMCH) supports and promotes emissions modeling activities both internal and external to the EPA. Through this site, the EPA...
Diaz Alva, Fredesvindo
2013-01-01
The problem addressed in this work centers on the need to develop students' mathematical capacities, expressed in reasoning and proof, mathematical communication and problem solving, as well as the appropriation of diverse facto-perceptible strategies for solving a variety of problem situations in the area of mathematics. Students develop capacities that allow them to act with criteria of comprehension, planning, execution and c...
Differential models in ecology
International Nuclear Information System (INIS)
Barco Gomez, Carlos; Barco Gomez, German
2002-01-01
Mathematical models written as differential equations are used to describe the population behavior of animal species through time. These models can be linear or nonlinear. The differential models for a single species include the exponential model of Malthus and the logistic model of Verhulst. The linear differential models describing the interaction between two species include competition, predation and symbiosis relationships
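The two single-species models named in the abstract have simple closed-form solutions; a minimal sketch (the growth rate, carrying capacity and initial population below are illustrative values, not taken from the paper):

```python
import math

def malthus(n0, r, t):
    """Exponential (Malthusian) growth: dN/dt = r*N  =>  N(t) = N0*exp(r*t)."""
    return n0 * math.exp(r * t)

def verhulst(n0, r, k, t):
    """Logistic (Verhulst) growth: dN/dt = r*N*(1 - N/K); closed-form solution."""
    return k / (1 + (k / n0 - 1) * math.exp(-r * t))

# Unbounded exponential growth vs. saturation at the carrying capacity K = 1000
times = [i * 0.5 for i in range(21)]
exp_curve = [malthus(100, 0.3, t) for t in times]
log_curve = [verhulst(100, 0.3, 1000, t) for t in times]
```

The logistic curve rises toward the carrying capacity K, while the Malthusian curve grows without bound; this is the qualitative difference the abstract alludes to.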
Tashiro, Tohru
2014-03-01
We propose a new model of the diffusion of a product which includes a memory of how many adopters or advertisements a non-adopter has met, where (non-)adopters are people (not) possessing the product. This effect is lacking in the Bass model. As an application, we use the model to fit the iPod sales data, obtaining better agreement than the Bass model.
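For context, the standard Bass model that the proposed memory model extends can be sketched in discrete time; the coefficients below are illustrative, not the paper's fitted values:

```python
def bass_adopters(p, q, m, steps):
    """Discrete-time Bass diffusion: cumulative adopters N follow
    dN/dt = (p + q*N/m) * (m - N), with p the innovation coefficient,
    q the imitation coefficient, and m the market potential."""
    n = 0.0
    path = []
    for _ in range(steps):
        n += (p + q * n / m) * (m - n)
        path.append(n)
    return path

# Typical textbook-style coefficients; the cumulative curve is S-shaped
adoption = bass_adopters(p=0.03, q=0.38, m=1000.0, steps=60)
```

The memory effect of the proposed model is not specified in this abstract, so only the baseline Bass dynamics are shown.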
GARCH Modelling of Cryptocurrencies
Directory of Open Access Journals (Sweden)
Jeffrey Chu
2017-10-01
Full Text Available With the exception of Bitcoin, there appears to be little or no literature on GARCH modelling of cryptocurrencies. This paper provides the first GARCH modelling of the seven most popular cryptocurrencies. Twelve GARCH models are fitted to each cryptocurrency, and their fits are assessed in terms of five criteria. Conclusions are drawn on the best fitting models, forecasts and acceptability of value at risk estimates.
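As an illustration of the model family, a GARCH(1,1) process (the simplest of the specifications the paper fits) can be simulated in a few lines; the parameters below are illustrative, and in practice such models are fitted with a dedicated package such as `arch`:

```python
import math
import random

def simulate_garch11(omega, alpha, beta, n, seed=1):
    """Simulate returns r_t = sigma_t * z_t, z_t ~ N(0,1), with conditional
    variance sigma_t^2 = omega + alpha*r_{t-1}^2 + beta*sigma_{t-1}^2."""
    rng = random.Random(seed)
    var = omega / (1.0 - alpha - beta)  # start at the unconditional variance
    returns = []
    for _ in range(n):
        r = math.sqrt(var) * rng.gauss(0.0, 1.0)
        returns.append(r)
        var = omega + alpha * r * r + beta * var
    return returns

# Stationary since alpha + beta < 1; unconditional variance = 0.1/(1-0.9) = 1
rets = simulate_garch11(omega=0.1, alpha=0.1, beta=0.8, n=20000)
```

Volatility clustering emerges because a large return raises the next period's conditional variance, which is the stylized fact GARCH models capture in cryptocurrency data.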
Yongquan Zhou; Jian Xie; Liangliang Li; Mingzhi Ma
2014-01-01
Bat algorithm (BA) is a novel stochastic global optimization algorithm. The cloud model is an effective tool for transforming between qualitative concepts and their quantitative representation. Based on the bat echolocation mechanism and the cloud model's excellent characteristics for representing uncertain knowledge, a new cloud model bat algorithm (CBA) is proposed. This paper focuses on remodeling the echolocation model based on the living and preying characteristics of bats, utilizing the transformati...
Optimization modeling with spreadsheets
Baker, Kenneth R
2015-01-01
An accessible introduction to optimization analysis using spreadsheets Updated and revised, Optimization Modeling with Spreadsheets, Third Edition emphasizes model building skills in optimization analysis. By emphasizing both spreadsheet modeling and optimization tools in the freely available Microsoft® Office Excel® Solver, the book illustrates how to find solutions to real-world optimization problems without needing additional specialized software. The Third Edition includes many practical applications of optimization models as well as a systematic framework that il
Artificial neural network modelling
Samarasinghe, Sandhya
2016-01-01
This book covers theoretical aspects as well as recent innovative applications of Artificial Neural Networks (ANNs) in natural, environmental, biological, social, industrial and automated systems. It presents recent results of ANNs in modelling small, large and complex systems under three categories, namely, 1) Networks, Structure Optimisation, Robustness and Stochasticity 2) Advances in Modelling Biological and Environmental Systems and 3) Advances in Modelling Social and Economic Systems. The book aims at serving undergraduates, postgraduates and researchers in ANN computational modelling.
2006-01-01
This is the version 1.1 of the TENCompetence Domain Model (version 1.0 released at 19-6-2006; version 1.1 at 9-11-2008). It contains several files: a) a pdf with the model description, b) three jpg files with class models (also in the pdf), c) a MagicDraw zip file with the model itself, d) a release
Petrone, Giovanni; Spagnuolo, Giovanni
2016-01-01
This comprehensive guide surveys all available models for simulating a photovoltaic (PV) generator at different levels of granularity, from cell to system level, in uniform as well as in mismatched conditions. Providing a thorough comparison among the models, engineers have all the elements needed to choose the right PV array model for specific applications or environmental conditions matched with the model of the electronic circuit used to maximize the PV power production.
International Nuclear Information System (INIS)
Tashiro, Tohru
2014-01-01
We propose a new model of the diffusion of a product which includes a memory of how many adopters or advertisements a non-adopter has met, where (non-)adopters are people (not) possessing the product. This effect is lacking in the Bass model. As an application, we use the model to fit the iPod sales data, obtaining better agreement than the Bass model
Energy Technology Data Exchange (ETDEWEB)
J. Wang
2003-06-24
The purpose of this Model Report is to document the Calibrated Properties Model that provides calibrated parameter sets for unsaturated zone (UZ) flow and transport process models for the Office of Repository Development (ORD). The UZ contains the unsaturated rock layers overlying the repository and host unit, which constitute a natural barrier to flow, and the unsaturated rock layers below the repository which constitute a natural barrier to flow and transport. This work followed, and was planned in, ''Technical Work Plan (TWP) for: Performance Assessment Unsaturated Zone'' (BSC 2002 [160819], Section 1.10.8 [under Work Package (WP) AUZM06, Climate Infiltration and Flow], and Section I-1-1 [in Attachment I, Model Validation Plans]). In Section 4.2, four acceptance criteria (ACs) are identified for acceptance of this Model Report; only one of these (Section 4.2.1.3.6.3, AC 3) was identified in the TWP (BSC 2002 [160819], Table 3-1). These calibrated property sets include matrix and fracture parameters for the UZ Flow and Transport Model (UZ Model), drift seepage models, and drift-scale and mountain-scale coupled-process models from the UZ Flow, Transport and Coupled Processes Department in the Natural Systems Subproject of the Performance Assessment (PA) Project. The Calibrated Properties Model output will also be used by the Engineered Barrier System Department in the Engineering Systems Subproject. The Calibrated Properties Model provides input through the UZ Model and other process models of natural and engineered systems to the Total System Performance Assessment (TSPA) models, in accord with the PA Strategy and Scope in the PA Project of the Bechtel SAIC Company, LLC (BSC). The UZ process models provide the necessary framework to test conceptual hypotheses of flow and transport at different scales and predict flow and transport behavior under a variety of climatic and thermal-loading conditions. UZ flow is a TSPA model component.
DEFF Research Database (Denmark)
Gudiksen, Sune Klok; Poulsen, Søren Bolvig; Buur, Jacob
2014-01-01
Well-established companies are currently struggling to secure profits due to the pressure from new players' business models as they take advantage of communication technology and new business-model configurations. Because of this, the business model research field flourishes currently; however, t...... illustrates how the application of participatory business model design toolsets can open up discussions on alternative scenarios through improvisation, mock-up making and design game playing, before qualitative judgment on the most promising scenario is carried out....
Model Checking Feature Interactions
DEFF Research Database (Denmark)
Le Guilly, Thibaut; Olsen, Petur; Pedersen, Thomas
2015-01-01
This paper presents an offline approach to analyzing feature interactions in embedded systems. The approach consists of a systematic process to gather the necessary information about system components and their models. The model is first specified in terms of predicates, before being refined to t...... to timed automata. The consistency of the model is verified at different development stages, and the correct linkage between the predicates and their semantic model is checked. The approach is illustrated on a use case from home automation....
DEFF Research Database (Denmark)
Thoft-Christensen, Palle
Modelling of corrosion cracking of reinforced concrete structures is complicated, as a great number of uncertain factors are involved. To obtain a reliable model, a physical and mechanical understanding of the process behind corrosion is needed.
Model description and evaluation of model performance: DOSDIM model
International Nuclear Information System (INIS)
Lewyckyj, N.; Zeevaert, T.
1996-01-01
DOSDIM was developed to assess the impact on man of routine and accidental atmospheric releases. It is a compartmental, deterministic, radiological model. For an accidental release, dynamic transfer factors are used, in contrast to a routine release, for which equilibrium transfer factors are used. Parameter values were chosen to be conservative. Transfers between compartments are described by first-order differential equations. 2 figs
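The first-order transfer structure described above can be sketched for a hypothetical two-compartment system (the rate constants below are assumptions for illustration, not DOSDIM's parameter values):

```python
def two_compartment(k12, k21, loss, c1, c2, dt, steps):
    """Euler integration of two coupled first-order transfer equations:
        dC1/dt = -k12*C1 + k21*C2 - loss*C1
        dC2/dt =  k12*C1 - k21*C2
    k12 and k21 are transfer rates between the compartments; loss removes
    activity from compartment 1 (e.g. radioactive decay or washout)."""
    for _ in range(steps):
        d1 = -k12 * c1 + k21 * c2 - loss * c1
        d2 = k12 * c1 - k21 * c2
        c1, c2 = c1 + d1 * dt, c2 + d2 * dt
    return c1, c2

# With no loss term, total activity is conserved between the compartments
a1, a2 = two_compartment(k12=0.3, k21=0.1, loss=0.0, c1=1.0, c2=0.0,
                         dt=0.01, steps=1000)
```

Equilibrium transfer factors, as used for routine releases, correspond to the long-time ratio of these compartment contents rather than to the transient solution.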
E. Gregory McPherson; Paula J. Peper
2012-01-01
This paper describes three long-term tree growth studies conducted to evaluate tree performance because repeated measurements of the same trees produce critical data for growth model calibration and validation. Several empirical and process-based approaches to modeling tree growth are reviewed. Modeling is more advanced in the fields of forestry and...
Classifying variability modeling techniques
Sinnema, Marco; Deelstra, Sybren
Variability modeling is important for managing variability in software product families, especially during product derivation. In the past few years, several variability modeling techniques have been developed, each using its own concepts to model the variability provided by a product family. The
Energy Technology Data Exchange (ETDEWEB)
Fortelius, C.; Holopainen, E.; Kaurola, J.; Ruosteenoja, K.; Raeisaenen, J. [Helsinki Univ. (Finland). Dept. of Meteorology
1996-12-31
In recent years, the modelling of interannual climate variability, the atmospheric energy and water cycles, and climate simulations with the ECHAM3 model have been studied. In addition, the climate simulations of several models have been compared, with special emphasis on the area of northern Europe
International Nuclear Information System (INIS)
Lum, C.
2004-01-01
The purpose of this model report is to document the Rock Properties Model version 3.1 with regard to input data, model methods, assumptions, uncertainties and limitations of model results, and qualification status of the model. The report also documents the differences between the current and previous versions and validation of the model. The rock properties model provides mean matrix and lithophysae porosity, and the cross-correlated mean bulk density as direct input to the ''Saturated Zone Flow and Transport Model Abstraction'', MDL-NBS-HS-000021, REV 02 (BSC 2004 [DIRS 170042]). The constraints, caveats, and limitations associated with this model are discussed in Section 6.6 and 8.2. Model validation accomplished by corroboration with data not cited as direct input is discussed in Section 7. The revision of this model report was performed as part of activities being conducted under the ''Technical Work Plan for: The Integrated Site Model, Revision 05'' (BSC 2004 [DIRS 169635]). The purpose of this revision is to bring the report up to current procedural requirements and address the Regulatory Integration Team evaluation comments. The work plan describes the scope, objectives, tasks, methodology, and procedures for this process
Vega, Solmaria Halleck; Elhorst, J. Paul
We provide a comprehensive overview of the strengths and weaknesses of different spatial econometric model specifications in terms of spillover effects. Based on this overview, we advocate taking the SLX model as point of departure in case a well-founded theory indicating which model is most
DEFF Research Database (Denmark)
Andresen, Mette
2007-01-01
-authentic modelling is also linked with the potentials of exploration of ready-made models as a forerunner for more authentic modelling processes. The discussion includes analysis of an episode of students? work in the classroom, which serves to illustrate how concept formation may be linked to explorations of a non...
Modeling EERE Deployment Programs
Energy Technology Data Exchange (ETDEWEB)
Cort, K. A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hostick, D. J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Belzer, D. B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Livingston, O. V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)
2007-11-01
This report compiles information and conclusions gathered as part of the “Modeling EERE Deployment Programs” project. The purpose of the project was to identify and characterize the modeling of deployment programs within the EERE Technology Development (TD) programs, address possible improvements to the modeling process, and note gaps in knowledge in which future research is needed.
Models of Business Internationalisation
Directory of Open Access Journals (Sweden)
Jurgita Vabinskaitė
2011-04-01
Full Text Available The study deals with the theoretical models of business internationalisation: the “Uppsala” Internationalisation Model, the modified “Uppsala” model, the Eclectic Paradigm and analysis of transactional costs, the Industrial Network approach, the Advantage Package and the Advantage Cycle. Article in Lithuanian
DEFF Research Database (Denmark)
Cameron, Ian; Gani, Rafiqul
2011-01-01
Engineering of products and processes is increasingly “model-centric”. Models in their multitudinous forms are ubiquitous, being heavily used for a range of decision making activities across all life cycle phases. This chapter gives an overview of what is a model, the principal activities in the ...
Christensen, V.; Pauly, D.
1996-01-01
A brief review of the status of the ECOPATH modeling approach and software is presented, with emphasis on the recent release of a Windows version (ECOPATH 3.0), which enables consideration of uncertainties, and sets the stage for simulation modeling using ECOSIM. Modeling of coral reefs is emphasized.
International Nuclear Information System (INIS)
Martin Llorente, F.
1990-01-01
Models of atmospheric pollutant dispersion are based on mathematical algorithms that describe the transport, diffusion, removal and chemical reactions of atmospheric pollutants. These models operate on pollutant emission data and produce an estimate of air quality in the area. Such models can be applied to several aspects of atmospheric pollution
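One classical member of this family of dispersion algorithms is the Gaussian plume formula for a continuous elevated point source; a minimal sketch (the emission rate, wind speed and dispersion coefficients below are illustrative, and the abstract does not specify which formulation the model uses):

```python
import math

def gaussian_plume(q, u, sigma_y, sigma_z, y, z, h):
    """Steady-state Gaussian plume concentration with ground reflection:
    C = Q / (2*pi*u*sy*sz) * exp(-y^2 / (2*sy^2))
        * [exp(-(z-h)^2 / (2*sz^2)) + exp(-(z+h)^2 / (2*sz^2))]
    q: emission rate, u: wind speed, h: effective release height,
    sigma_y/sigma_z: lateral/vertical dispersion coefficients at the
    downwind distance of interest."""
    lateral = math.exp(-(y * y) / (2.0 * sigma_y ** 2))
    vertical = (math.exp(-((z - h) ** 2) / (2.0 * sigma_z ** 2))
                + math.exp(-((z + h) ** 2) / (2.0 * sigma_z ** 2)))
    return q / (2.0 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Ground-level concentration on and off the plume centerline
c_center = gaussian_plume(q=100.0, u=5.0, sigma_y=50.0, sigma_z=20.0,
                          y=0.0, z=0.0, h=30.0)
c_offset = gaussian_plume(q=100.0, u=5.0, sigma_y=50.0, sigma_z=20.0,
                          y=100.0, z=0.0, h=30.0)
```

The second exponential in the vertical term models reflection of the plume at the ground, a standard device in this class of models.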
Kelderman, Hendrikus
1984-01-01
Existing statistical tests for the fit of the Rasch model have been criticized, because they are only sensitive to specific violations of its assumptions. Contingency table methods using loglinear models have been used to test various psychometric models. In this paper, the assumptions of the Rasch
Modeling Epidemic Network Failures
DEFF Research Database (Denmark)
Ruepp, Sarah Renée; Fagertun, Anna Manolova
2013-01-01
This paper presents the implementation of a failure propagation model for transport networks when multiple failures occur, resulting in an epidemic. We implement the Susceptible Infected Disabled (SID) epidemic model and validate it by comparing it to analytical solutions. Furthermore, we evaluate...... to evaluate multiple epidemic scenarios in various network types....
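A minimal Euler-step sketch of a three-compartment model in the SID spirit (the infection, disabling and repair rates below, and the repair flow back to the susceptible state, are assumptions for illustration, not the paper's calibrated model):

```python
def sid_step(s, i, d, beta, delta, gamma, dt):
    """One Euler step of a Susceptible-Infected-Disabled (SID) sketch:
    susceptible nodes are infected at rate beta*S*I, infected nodes
    become disabled at rate delta, and disabled nodes are repaired
    (returned to susceptible) at rate gamma."""
    infect = beta * s * i * dt
    disable = delta * i * dt
    repair = gamma * d * dt
    return s - infect + repair, i + infect - disable, d + disable - repair

# Fractions of network nodes in each state; the total is conserved
s, i, d = 0.99, 0.01, 0.0
for _ in range(1000):
    s, i, d = sid_step(s, i, d, beta=0.5, delta=0.1, gamma=0.05, dt=0.01)
```

Comparing such a numerical trajectory against the analytical solution of the same ODE system is the kind of validation the abstract describes.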
DEFF Research Database (Denmark)
Friis, Silje Alberthe Kamille; Gelting, Anne Katrine Gøtzsche
2014-01-01
the approaches and reach a new level of conscious action when designing? Informed by theories of design thinking, knowledge production, and learning, we have developed a model, the 5C model, accompanied by 62 method cards. Examples of how the model has been applied in an educational setting are provided...
The nontopological soliton model
International Nuclear Information System (INIS)
Wilets, L.
1988-01-01
The nontopological soliton model introduced by Friedberg and Lee, and variations of it, provide a method for modeling QCD which can effectively include the dynamics of hadronic collisions as well as spectra. Absolute color confinement is effected by the assumed dielectric properties of the medium. A recently proposed version of the model is chirally invariant. 32 refs., 5 figs., 1 tab
International Nuclear Information System (INIS)
Thomas, A.W.
1981-01-01
Recent developments in the bag model, in which the constraints of chiral symmetry are explicitly included are reviewed. The model leads to a new understanding of the Δ-resonance. The connection of the theory with current algebra is clarified and implications of the model for the structure of the nucleon are discussed
Flexible survival regression modelling
DEFF Research Database (Denmark)
Cortese, Giuliana; Scheike, Thomas H; Martinussen, Torben
2009-01-01
Regression analysis of survival data, and more generally event history data, is typically based on Cox's regression model. We here review some recent methodology, focusing on the limitations of Cox's regression model. The key limitation is that the model is not well suited to represent time-varyi...
Automated Simulation Model Generation
Huang, Y.
2013-01-01
One of today's challenges in the field of modeling and simulation is to model increasingly larger and more complex systems. Complex models take long to develop and incur high costs. With the advances in data collection technologies and more popular use of computer-aided systems, more data has become
DEFF Research Database (Denmark)
Bergdahl, Basti; Sonnenschein, Nikolaus; Machado, Daniel
2016-01-01
An introduction to genome-scale models, how to build and use them, will be given in this chapter. Genome-scale models have become an important part of systems biology and metabolic engineering, and are increasingly used in research, both in academia and in industry, both for modeling chemical pr...
Land, Kenneth C.
2001-01-01
Examines the definition, construction, and interpretation of social indicators. Shows how standard classes of formalisms used to construct models in contemporary sociology are derived from the general theory of models. Reviews recent model building and evaluation related to active life expectancy among the elderly, fertility rates, and indicators…
Intersection of Feature Models
van den Broek, P.M.
In this paper, we present an algorithm for the construction of the intersection of two feature models. The feature models are allowed to have "requires" and "excludes" constraints, and should be parent-compatible. The algorithm is applied to the problem of combining feature models from stakeholders
van den Broek, P.M.; Galvao, I.; Noppen, J.A.R.
2010-01-01
In this paper, we consider the problem of merging feature models which consist of trees with "requires" and "excludes" constraints. For any two such feature models which are parent-compatible, their merge is defined to be the smallest parent-compatible feature model which has all products of the
Bogiages, Christopher A.; Lotter, Christine
2011-01-01
In their research, scientists generate, test, and modify scientific models. These models can be shared with others and demonstrate a scientist's understanding of how the natural world works. Similarly, students can generate and modify models to gain a better understanding of the content, process, and nature of science (Kenyon, Schwarz, and Hug…
Model Breaking Points Conceptualized
Vig, Rozy; Murray, Eileen; Star, Jon R.
2014-01-01
Current curriculum initiatives (e.g., National Governors Association Center for Best Practices and Council of Chief State School Officers 2010) advocate that models be used in the mathematics classroom. However, despite their apparent promise, there comes a point when models break, a point in the mathematical problem space where the model cannot,…
Rodarius, C.; Rooij, L. van; Lange, R. de
2007-01-01
The objective of this work was to create a scalable human occupant model that allows adaptation of human models with respect to size, weight and several mechanical parameters. Therefore, for the first time two scalable facet human models were developed in MADYMO. First, a scalable human male was
DEFF Research Database (Denmark)
Andreasen, Martin Møller; Meldrum, Andrew
pricing factors using the sequential regression approach. Our findings suggest that the two models largely provide the same in-sample fit, but loadings from ordinary and risk-adjusted Campbell-Shiller regressions are generally best matched by the shadow rate models. We also find that the shadow rate...... models perform better than the QTSMs when forecasting bond yields out of sample....
Modeling agriculture in the Community Land Model
Directory of Open Access Journals (Sweden)
B. Drewniak
2013-04-01
Full Text Available The potential impact of climate change on agriculture is uncertain. In addition, agriculture could influence above- and below-ground carbon storage. Development of models that represent agriculture is necessary to address these impacts. We have developed an approach to integrate agriculture representations for three crop types – maize, soybean, and spring wheat – into the coupled carbon–nitrogen version of the Community Land Model (CLM, to help address these questions. Here we present the new model, CLM-Crop, validated against observations from two AmeriFlux sites in the United States, planted with maize and soybean. Seasonal carbon fluxes compared well with field measurements for soybean, but not as well for maize. CLM-Crop yields were comparable with observations in countries such as the United States, Argentina, and China, although the generality of the crop model and its lack of technology and irrigation made direct comparison difficult. CLM-Crop was compared against the standard CLM3.5, which simulates crops as grass. The comparison showed improvement in gross primary productivity in regions where crops are the dominant vegetation cover. Crop yields and productivity were negatively correlated with temperature and positively correlated with precipitation, in agreement with other modeling studies. In case studies with the new crop model looking at impacts of residue management and planting date on crop yield, we found that increased residue returned to the litter pool increased crop yield, while reduced residue returns resulted in yield decreases. Using climate controls to signal planting date caused different responses in different crops. Maize and soybean had opposite reactions: when low temperature threshold resulted in early planting, maize responded with a loss of yield, but soybean yields increased. Our improvements in CLM demonstrate a new capability in the model – simulating agriculture in a realistic way, complete with
Modeling agriculture in the Community Land Model
Drewniak, B.; Song, J.; Prell, J.; Kotamarthi, V. R.; Jacob, R.
2013-04-01
The potential impact of climate change on agriculture is uncertain. In addition, agriculture could influence above- and below-ground carbon storage. Development of models that represent agriculture is necessary to address these impacts. We have developed an approach to integrate agriculture representations for three crop types - maize, soybean, and spring wheat - into the coupled carbon-nitrogen version of the Community Land Model (CLM), to help address these questions. Here we present the new model, CLM-Crop, validated against observations from two AmeriFlux sites in the United States, planted with maize and soybean. Seasonal carbon fluxes compared well with field measurements for soybean, but not as well for maize. CLM-Crop yields were comparable with observations in countries such as the United States, Argentina, and China, although the generality of the crop model and its lack of technology and irrigation made direct comparison difficult. CLM-Crop was compared against the standard CLM3.5, which simulates crops as grass. The comparison showed improvement in gross primary productivity in regions where crops are the dominant vegetation cover. Crop yields and productivity were negatively correlated with temperature and positively correlated with precipitation, in agreement with other modeling studies. In case studies with the new crop model looking at impacts of residue management and planting date on crop yield, we found that increased residue returned to the litter pool increased crop yield, while reduced residue returns resulted in yield decreases. Using climate controls to signal planting date caused different responses in different crops. Maize and soybean had opposite reactions: when low temperature threshold resulted in early planting, maize responded with a loss of yield, but soybean yields increased. Our improvements in CLM demonstrate a new capability in the model - simulating agriculture in a realistic way, complete with fertilizer and residue management
Yum, Soo-Young; Yoon, Ki-Young; Lee, Choong-Il; Lee, Byeong-Chun; Jang, Goo
2016-09-30
Animal models, particularly pigs, have come to play an important role in translational biomedical research. Many pig models with genetic modifications have been produced via somatic cell nuclear transfer (SCNT). However, because most transgenic pigs have to date been produced by random integration, the need has been raised for more precise gene-mutated models using recombinase-based conditional gene expression, as in mice. Currently, advanced genome-editing technologies enable us to generate specific gene-deleted and -inserted pig models. In the future, pig models developed with gene-editing technologies could be a valuable resource for biomedical research.
Mathematical modelling techniques
Aris, Rutherford
1995-01-01
""Engaging, elegantly written."" - Applied Mathematical ModellingMathematical modelling is a highly useful methodology designed to enable mathematicians, physicists and other scientists to formulate equations from a given nonmathematical situation. In this elegantly written volume, a distinguished theoretical chemist and engineer sets down helpful rules not only for setting up models but also for solving the mathematical problems they pose and for evaluating models.The author begins with a discussion of the term ""model,"" followed by clearly presented examples of the different types of mode
Sörhammar, David; Bengtson, Anna
2006-01-01
In September 2005 SAS introduced a new business model. Where did the model come from and what influenced it? This paper focuses on the making of the model, studying the creation of a business model as a dynamic process through time. In concrete terms, traces of today's model can be found and examined from the SAS group's embryonic attempts starting in 1946, through the financially good years of the 1980s, to the market re-regulation in contemporary times. During these years several c...
International Nuclear Information System (INIS)
Iachello, F.; Arima, A.
1987-01-01
The book gives an account of some of the properties of the interacting boson model. The model was introduced in 1974 to describe in a unified way the collective properties of nuclei. The book presents the mathematical techniques used to analyse the structure of the model. The mathematical framework of the model is discussed in detail. The book also contains all the formulae that have been developed throughout the years to account for collective properties of nuclei. These formulae can be used by experimentalists to compare their data with the predictions of the model. (U.K.)
International Nuclear Information System (INIS)
Ritchie, L.T.; Alpert, D.J.; Burke, R.P.; Johnson, J.D.; Ostmeyer, R.M.; Aldrich, D.C.; Blond, R.M.
1984-03-01
The CRAC2 computer code is a revised version of CRAC (Calculation of Reactor Accident Consequences) which was developed for the Reactor Safety Study. This document provides an overview of the CRAC2 code and a description of each of the models used. Significant improvements incorporated into CRAC2 include an improved weather sequence sampling technique, a new evacuation model, and new output capabilities. In addition, refinements have been made to the atmospheric transport and deposition model. Details of the modeling differences between CRAC2 and CRAC are emphasized in the model descriptions
DEFF Research Database (Denmark)
Könemann, Patrick
2009-01-01
Computing differences (diffs) and merging different versions is well-known for text files, but for models it is a very young field - especially patches for models are still matter of research. Text-based and model-based diffs have different starting points because the semantics of their structure...... is fundamentally different. This paper reports on our ongoing work on model-independent diffs, i.e. a diff that does not directly refer to the models it was created from. Based on that, we present an idea of how the diff could be generalized, e.g. many atomic diffs are merged to a new, generalized diff. One use...
International Nuclear Information System (INIS)
Knee, H.E.; Schryver, J.C.
1991-01-01
Models of human behavior and cognition (HB and C) are necessary for understanding the total response of complex systems. Many such models have become available over the past thirty years for various applications. Unfortunately, many potential model users remain skeptical about their practicality, acceptability, and usefulness. Such hesitancy stems in part from disbelief in the ability to model complex cognitive processes, and from a belief that relevant human behavior can be adequately accounted for through the use of commonsense heuristics. This paper highlights several models of HB and C and identifies existing and potential applications in an attempt to dispel such notions. (author)
Blaha, Michael
2010-01-01
Best-selling author and database expert with more than 25 years of experience modeling application and enterprise data, Dr. Michael Blaha provides tried and tested data model patterns, to help readers avoid common modeling mistakes and unnecessary frustration on their way to building effective data models. Unlike the typical methodology book, "Patterns of Data Modeling" provides advanced techniques for those who have mastered the basics. Recognizing that database representation sets the path for software, determines its flexibility, affects its quality, and influences whether it succ
International Nuclear Information System (INIS)
McGraw, M.
2000-01-01
The UZ Colloid Transport model development plan states that the objective of this Analysis/Model Report (AMR) is to document the development of a model for simulating unsaturated colloid transport. This objective includes the following: (1) use of a process-level model to evaluate the potential mechanisms for colloid transport at Yucca Mountain; (2) provision of ranges of parameters for significant colloid transport processes to Performance Assessment (PA) for the unsaturated zone (UZ); and (3) provision of a basis for development of an abstracted model for use in PA calculations
Directory of Open Access Journals (Sweden)
Paul Walton
2014-09-01
Full Text Available This paper uses an approach drawn from the ideas of computer systems modelling to produce a model for information itself. The model integrates evolutionary, static and dynamic views of information and highlights the relationship between symbolic content and the physical world. The model includes what information technology practitioners call “non-functional” attributes, which, for information, include information quality and information friction. The concepts developed in the model enable a richer understanding of Floridi’s questions “what is information?” and “the informational circle: how can information be assessed?” (which he numbers P1 and P12).
Energy Technology Data Exchange (ETDEWEB)
Brown, T.W.
2010-11-15
The same complex matrix model calculates both tachyon scattering for the c=1 non-critical string at the self-dual radius and certain correlation functions of half-BPS operators in N=4 super-Yang-Mills. It is dual to another complex matrix model where the couplings of the first model are encoded in the Kontsevich-like variables of the second. The duality between the theories is mirrored by the duality of their Feynman diagrams. Analogously to the Hermitian Kontsevich-Penner model, the correlation functions of the second model can be written as sums over discrete points in subspaces of the moduli space of punctured Riemann surfaces. (orig.)
DEFF Research Database (Denmark)
Højsgaard, Søren; Edwards, David; Lauritzen, Steffen L.
of these software developments have taken place within the R community, either in the form of new packages or by providing an R interface to existing software. This book attempts to give the reader a gentle introduction to graphical modeling using R and the main features of some of these packages. In addition......, the book provides examples of how more advanced aspects of graphical modeling can be represented and handled within R. Topics covered in the seven chapters include graphical models for contingency tables, Gaussian and mixed graphical models, Bayesian networks and modeling high dimensional data...
Meister, Jeffrey P.
1987-01-01
The Mechanics of Materials Model (MOMM) is a three-dimensional inelastic structural analysis code for use as an early design stage tool for hot section components. MOMM is a stiffness method finite element code that uses a network of beams to characterize component behavior. The MOMM contains three material models to account for inelastic material behavior. These include the simplified material model, which assumes a bilinear stress-strain response; the state-of-the-art model, which utilizes the classical elastic-plastic-creep strain decomposition; and Walker's viscoplastic model, which accounts for the interaction between creep and plasticity that occurs under cyclic loading conditions.
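The bilinear stress-strain idealization assumed by MOMM's simplified material model can be sketched in a few lines: a linear elastic slope up to the yield point, then a reduced hardening slope. This is an illustrative reconstruction, not MOMM code; the modulus, hardening slope, and yield stress below are hypothetical values for a generic metal.

```python
def bilinear_stress(strain, E=200e9, E_t=10e9, sigma_y=250e6):
    """Bilinear stress-strain response under monotonic loading:
    elastic slope E up to the yield strain, hardening slope E_t beyond.
    All default parameter values are illustrative, not from MOMM."""
    eps_y = sigma_y / E  # yield strain
    if abs(strain) <= eps_y:
        return E * strain
    sign = 1.0 if strain > 0 else -1.0
    return sign * (sigma_y + E_t * (abs(strain) - eps_y))
```

The sketch covers monotonic loading only; the elastic-plastic-creep and viscoplastic models mentioned in the abstract handle unloading, creep, and rate effects that this idealization omits.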
Long, John
2014-01-01
Process Modeling Style focuses on other aspects of process modeling beyond notation that are very important to practitioners. Many people who model processes focus on the specific notation used to create their drawings. While that is important, there are many other aspects to modeling, such as naming, creating identifiers, descriptions, interfaces, patterns, and creating useful process documentation. Experienced author John Long focuses on those non-notational aspects of modeling, which practitioners will find invaluable. Gives solid advice for creating roles, work products
Computer Based Modelling and Simulation
Indian Academy of Sciences (India)
leaving students. It is a probabilistic model. In the next part of this article, two more models - 'input/output model' used for production systems or economic studies and a. 'discrete event simulation model' are introduced. Aircraft Performance Model.
Making ecological models adequate
Getz, Wayne M.; Marshall, Charles R.; Carlson, Colin J.; Giuggioli, Luca; Ryan, Sadie J.; Romañach, Stephanie; Boettiger, Carl; Chamberlain, Samuel D.; Larsen, Laurel; D'Odorico, Paolo; O'Sullivan, David
2018-01-01
Critical evaluation of the adequacy of ecological models is urgently needed to enhance their utility in developing theory and enabling environmental managers and policymakers to make informed decisions. Poorly supported management can have detrimental, costly or irreversible impacts on the environment and society. Here, we examine common issues in ecological modelling and suggest criteria for improving modelling frameworks. An appropriate level of process description is crucial to constructing the best possible model, given the available data and understanding of ecological structures. Model details unsupported by data typically lead to over-parameterisation and poor model performance. Conversely, a lack of mechanistic details may limit a model's ability to predict ecological systems’ responses to management. Ecological studies that employ models should follow a set of model adequacy assessment protocols that include: asking a series of critical questions regarding state and control variable selection, the determinacy of data, and the sensitivity and validity of analyses. We also need to improve model elaboration, refinement and coarse graining procedures to better understand the relevancy and adequacy of our models and the role they play in advancing theory, improving hindcasting and forecasting, and enabling problem solving and management.
Collins, Lisa M.; Part, Chérie E.
2013-01-01
Simple Summary In this review paper we discuss the different modeling techniques that have been used in animal welfare research to date. We look at what questions they have been used to answer, the advantages and pitfalls of the methods, and how future research can best use these approaches to answer some of the most important upcoming questions in farm animal welfare. Abstract The use of models in the life sciences has greatly expanded in scope and advanced in technique in recent decades. However, the range, type and complexity of models used in farm animal welfare is comparatively poor, despite the great scope for use of modeling in this field of research. In this paper, we review the different modeling approaches used in farm animal welfare science to date, discussing the types of questions they have been used to answer, the merits and problems associated with the method, and possible future applications of each technique. We find that the most frequently published types of model used in farm animal welfare are conceptual and assessment models; two types of model that are frequently (though not exclusively) based on expert opinion. Simulation, optimization, scenario, and systems modeling approaches are rarer in animal welfare, despite being commonly used in other related fields. Finally, common issues such as a lack of quantitative data to parameterize models, and model selection and validation are discussed throughout the review, with possible solutions and alternative approaches suggested. PMID:26487411
Energy Technology Data Exchange (ETDEWEB)
T. Ghezzehej
2004-10-04
The purpose of this model report is to document the calibrated properties model that provides calibrated property sets for unsaturated zone (UZ) flow and transport process models (UZ models). The calibration of the property sets is performed through inverse modeling. This work followed, and was planned in, ''Technical Work Plan (TWP) for: Unsaturated Zone Flow Analysis and Model Report Integration'' (BSC 2004 [DIRS 169654], Sections 1.2.6 and 2.1.1.6). Direct inputs to this model report were derived from the following upstream analysis and model reports: ''Analysis of Hydrologic Properties Data'' (BSC 2004 [DIRS 170038]); ''Development of Numerical Grids for UZ Flow and Transport Modeling'' (BSC 2004 [DIRS 169855]); ''Simulation of Net Infiltration for Present-Day and Potential Future Climates'' (BSC 2004 [DIRS 170007]); ''Geologic Framework Model'' (GFM2000) (BSC 2004 [DIRS 170029]). Additionally, this model report incorporates errata of the previous version and closure of the Key Technical Issue agreement TSPAI 3.26 (Section 6.2.2 and Appendix B), and it is revised for improved transparency.
Energy Technology Data Exchange (ETDEWEB)
Shipman, Galen M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2016-06-13
These are the slides for a presentation on programming models in HPC, at the Los Alamos National Laboratory's Parallel Computing Summer School. The following topics are covered: Flynn's Taxonomy of computer architectures; single instruction single data; single instruction multiple data; multiple instruction multiple data; address space organization; definition of Trinity (Intel Xeon-Phi is a MIMD architecture); single program multiple data; multiple program multiple data; ExMatEx workflow overview; definition of a programming model, programming languages, runtime systems; programming model and environments; MPI (Message Passing Interface); OpenMP; Kokkos (Performance Portable Thread-Parallel Programming Model); Kokkos abstractions, patterns, policies, and spaces; RAJA, a systematic approach to node-level portability and tuning; overview of the Legion Programming Model; mapping tasks and data to hardware resources; interoperability: supporting task-level models; Legion S3D execution and performance details; workflow, integration of external resources into the programming model.
CREDIT RISK. DETERMINATION MODELS
Directory of Open Access Journals (Sweden)
MIHAELA GRUIESCU
2012-01-01
Full Text Available The internationalization of financial and banking flows and the rapid development of markets have changed the financial sector, causing it to respond with force and imagination. Under these conditions, the concerns of financial and banking institutions and of rating institutions are increasingly turning to finding the best solutions to hedge risks and maximize profits. This paper aims to present a number of advantages, but also the limits, of the Merton model, the first structural model for modeling credit risk. It also presents some extensions of the model, some backed by empirical research and known performance, and others such as state-dependent models (SDM), which together with liquidation process models (LPM) are two recent efforts within the structural models to capture different real-life phenomena.
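As a concrete reference point, the core of the basic Merton model fits in a few lines: equity is a European call on firm assets, and default occurs when the asset value ends below the face value of debt at maturity. This is a textbook sketch with illustrative inputs, not code or calibration from the paper.

```python
import math

def _norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def merton_default_prob(V, D, mu, sigma, T):
    """Probability that firm value V, following geometric Brownian motion
    with drift mu and volatility sigma, ends below debt face value D at T."""
    d2 = (math.log(V / D) + (mu - 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    return _norm_cdf(-d2)

def merton_equity_value(V, D, r, sigma, T):
    """Equity as a Black-Scholes call on firm assets with strike D."""
    d1 = (math.log(V / D) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return V * _norm_cdf(d1) - D * math.exp(-r * T) * _norm_cdf(d2)
```

Higher leverage raises the default probability, e.g. merton_default_prob(100, 90, 0.05, 0.25, 1.0) exceeds merton_default_prob(100, 70, 0.05, 0.25, 1.0). The SDM and LPM extensions discussed in the paper relax exactly the assumptions (single debt maturity, default only at T) that keep this version so compact.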
Modelling of wastewater systems
DEFF Research Database (Denmark)
Bechmann, Henrik
In this thesis, models of pollution fluxes in the inlet to 2 Danish wastewater treatment plants (WWTPs) as well as of suspended solids (SS) concentrations in the aeration tanks of an alternating WWTP and in the effluent from the aeration tanks are developed. The latter model is furthermore used...... to analyze and quantify the effect of the Aeration Tank Settling (ATS) operating mode, which is used during rain events. Furthermore, the model is used to propose a control algorithm for the phase lengths during ATS operation. The models are mainly formulated as state space models in continuous time...... COD (Chemical Oxygen Demand) flux and SS flux in the inlet to the WWTP. COD is measured by means of a UV absorption sensor while SS is measured by a turbidity sensor. These models include a description of the deposit of COD and SS amounts, respectively, in the sewer system, and the models can thus be used to quantify......
Energy Technology Data Exchange (ETDEWEB)
Moffat, Harry K.; Noble, David R.; Baer, Thomas A. (Procter & Gamble Co., West Chester, OH); Adolf, Douglas Brian; Rao, Rekha Ranjana; Mondy, Lisa Ann
2008-09-01
In this report, we summarize our work on developing a production level foam processing computational model suitable for predicting the self-expansion of foam in complex geometries. The model is based on a finite element representation of the equations of motion, with the movement of the free surface represented using the level set method, and has been implemented in SIERRA/ARIA. An empirically based time- and temperature-dependent density model is used to encapsulate the complex physics of foam nucleation and growth in a numerically tractable model. The change in density with time is at the heart of the foam self-expansion as it creates the motion of the foam. This continuum-level model uses a homogenized description of foam, which does not include the gas explicitly. Results from the model are compared to temperature-instrumented flow visualization experiments giving the location of the foam front as a function of time for our EFAR model system.
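The role of an empirical time- and temperature-dependent density model can be illustrated with a minimal sketch: an Arrhenius-type rate drives the density from its initial value toward the fully expanded value, and that density change is what generates the foam motion. The functional form and every constant below are hypothetical stand-ins, not the calibrated model from the report.

```python
import math

R_GAS = 8.314  # J/(mol*K)

def foam_density(t, T, rho0=1200.0, rho_inf=60.0, A=5.0e6, Ea=60e3):
    """Empirical time- and temperature-dependent density (kg/m^3):
    first-order decay from rho0 toward rho_inf with an Arrhenius rate.
    All parameter values are invented for illustration."""
    k = A * math.exp(-Ea / (R_GAS * T))  # 1/s; expansion is faster when hotter
    return rho_inf + (rho0 - rho_inf) * math.exp(-k * t)
```

In the actual model the computed density change feeds the equations of motion and moves the level set front; this fragment only shows why the temperature history controls the expansion rate.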
Hydrological land surface modelling
DEFF Research Database (Denmark)
Ridler, Marc-Etienne Francois
Recent advances in integrated hydrological and soil-vegetation-atmosphere transfer (SVAT) modelling have led to improved water resource management practices, greater crop production, and better flood forecasting systems. However, uncertainty is inherent in all numerical models ultimately leading...... and disaster management. The objective of this study is to develop and investigate methods to reduce hydrological model uncertainty by using supplementary data sources. The data is used either for model calibration or for model updating using data assimilation. Satellite estimates of soil moisture and surface...... hydrological and tested by assimilating synthetic hydraulic head observations in a catchment in Denmark. Assimilation led to a substantial reduction of model prediction error, and better model forecasts. Also, a new assimilation scheme is developed to downscale and bias-correct coarse satellite derived soil......
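The model-updating step that data assimilation performs can be reduced to a scalar Kalman analysis: the model forecast and the observation are blended in proportion to their error variances. This is a generic textbook sketch, not the assimilation scheme developed in the thesis, and the numbers in the note below are invented.

```python
def kalman_update(x_forecast, var_forecast, y_obs, var_obs):
    """Scalar Kalman analysis step: returns the updated state estimate
    and its (reduced) error variance."""
    gain = var_forecast / (var_forecast + var_obs)  # weight on the observation
    x_analysis = x_forecast + gain * (y_obs - x_forecast)
    var_analysis = (1.0 - gain) * var_forecast
    return x_analysis, var_analysis
```

With an equally uncertain forecast and observation (variance 4 each), a forecast head of 10 m and an observed 12 m blend to 11 m with variance 2: the "substantial reduction of model prediction error" the abstract reports, in miniature.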
DEFF Research Database (Denmark)
Laursen, Jesper
solution of the Navier-Stokes equations in a multiphase scheme. After a general introduction to the activated sludge tank as a system, the activated sludge tank model is gradually set up in separate stages. The individual sub-processes that are often occurring in activated sludge tanks are initially......-process models, the last part of the thesis, where the integrated process tank model is tested on three examples of activated sludge systems, is initiated. The three case studies are introduced with an increasing degree of model complexity. All three cases are based on Danish municipal wastewater treatment plants. The first case study involves the modeling of an activated sludge tank undergoing a special controlling strategy with the intention of minimizing the sludge loading on the subsequent secondary settlers during storm events. The applied model is a two-phase model, where the sedimentation of sludge......
DEFF Research Database (Denmark)
Tamke, Martin
2015-01-01
Appearing almost alive, a novel set of computational design models can become an active counterpart for architects in the design process. The ability to loop, sense and query and the integration of near real-time simulation provide these models with a depth and agility that allows for instant...... and informed feedback. Introducing the term "Aware models", the paper investigates how computational models become an enabler for a better informed architectural design practice, through the embedding of knowledge about constraints, behaviour and processes of formation and making into generative design models...... The inspection of several computational design projects in architectural research highlights three different types of awareness a model can possess and devises strategies to establish and finally design with aware models. This design practice is collaborative in nature and characterized by a bidirectional flow......
Directory of Open Access Journals (Sweden)
Alexander Fedorov
2011-03-01
Full Text Available The author supposed that media education models can be divided into the following groups:
- educational-information models (the study of the theory, history, language of media culture, etc.), based on the cultural, aesthetic, semiotic, socio-cultural theories of media education;
- educational-ethical models (the study of moral, religious, philosophical problems), relying on the ethic, religious, ideological, ecological, protectionist theories of media education;
- pragmatic models (practical media technology training), based on the uses-and-gratifications and ‘practical’ theories of media education;
- aesthetical models (aimed above all at the development of the artistic taste and enriching the skills of analysis of the best media culture examples), relying on the aesthetical (art) and cultural studies theories;
- socio-cultural models (socio-cultural development of a creative personality as to the perception, imagination, visual memory, interpretation, analysis, autonomous critical thinking), relying on the cultural studies, semiotic, ethic models of media education.
Untangling RFID Privacy Models
Directory of Open Access Journals (Sweden)
Iwen Coisel
2013-01-01
Full Text Available The rise of wireless applications based on RFID has brought up major concerns on privacy. Indeed nowadays, when such an application is deployed, informed customers yearn for guarantees that their privacy will not be threatened. One formal way to perform this task is to assess the privacy level of the RFID application with a model. However, if the chosen model does not reflect the assumptions and requirements of the analyzed application, it may misevaluate its privacy level. Therefore, selecting the most appropriate model among all the existing ones is not an easy task. This paper investigates the eight most well-known RFID privacy models and thoroughly examines their advantages and drawbacks in three steps. Firstly, five RFID authentication protocols are analyzed with these models. This discloses a main worry: although these protocols intuitively ensure different privacy levels, no model is able to accurately distinguish them. Secondly, these models are grouped according to their features (e.g., tag corruption ability). This classification reveals the most appropriate candidate model(s) to be used for a privacy analysis when one of these features is especially required. Furthermore, it points out that none of the models are comprehensive. Hence, some combinations of features may not match any model. Finally, the privacy properties of the eight models are compared in order to provide an overall view of their relations. This part highlights that no model globally outclasses the other ones. Considering the required properties of an application, the thorough study provided in this paper aims to assist system designers to choose the best suited model.
Energy Technology Data Exchange (ETDEWEB)
Hammerand, Daniel Carl; Scherzinger, William Mark
2007-09-01
The Library of Advanced Materials for Engineering (LAME) provides a common repository for constitutive models that can be used in computational solid mechanics codes. A number of models including both hypoelastic (rate) and hyperelastic (total strain) constitutive forms have been implemented in LAME. The structure and testing of LAME are described in Scherzinger and Hammerand ([3] and [4]). The purpose of the present report is to describe the material models which have already been implemented into LAME. The descriptions are designed to give useful information to both analysts and code developers. Thus far, 33 non-ITAR/non-CRADA protected material models have been incorporated. These include everything from the simple isotropic linear elastic models to a number of elastic-plastic models for metals to models for honeycomb, foams, potting epoxies and rubber. A complete description of each model is outside the scope of the current report. Rather, the aim here is to delineate the properties, state variables, functions, and methods for each model. However, a brief description of some of the constitutive details is provided for a number of the material models. Where appropriate, the SAND reports available for each model have been cited. Many models have state variable aliases for some or all of their state variables. These alias names can be used for outputting desired quantities. The state variable aliases available for results output have been listed in this report. However, not all models use these aliases. For those models, no state variable names are listed. Nevertheless, the number of state variables employed by each model is always given. Currently, there are four possible functions for a material model. This report lists which of these four methods are employed in each material model. As far as analysts are concerned, this information is included only for awareness purposes. The analyst can take confidence in the fact that each model has been properly implemented.
International Nuclear Information System (INIS)
Lundberg, Jonas; Johansson, Björn JE
2015-01-01
It has been realized that resilience as a concept involves several contradictory definitions, both for instance resilience as agile adjustment and as robust resistance to situations. Our analysis of resilience concepts and models suggest that beyond simplistic definitions, it is possible to draw up a systemic resilience model (SyRes) that maintains these opposing characteristics without contradiction. We outline six functions in a systemic model, drawing primarily on resilience engineering, and disaster response: anticipation, monitoring, response, recovery, learning, and self-monitoring. The model consists of four areas: Event-based constraints, Functional Dependencies, Adaptive Capacity and Strategy. The paper describes dependencies between constraints, functions and strategies. We argue that models such as SyRes should be useful both for envisioning new resilience methods and metrics, as well as for engineering and evaluating resilient systems. - Highlights: • The SyRes model resolves contradictions between previous resilience definitions. • SyRes is a core model for envisioning and evaluating resilience metrics and models. • SyRes describes six functions in a systemic model. • They are anticipation, monitoring, response, recovery, learning, self-monitoring. • The model describes dependencies between constraints, functions and strategies
Directory of Open Access Journals (Sweden)
Dan Alexandru Anghel
2012-01-01
Full Text Available In semiconductor laser modeling, a good mathematical model gives near-reality results. Three methods of modeling solutions from the rate equations are presented and analyzed. A method based on the rate equations modeled in Simulink to describe quantum well lasers is presented. For different input signal types, such as a step function, sawtooth, and sinusoid, a good response of the equations used is obtained. A circuit model resulting from one of the rate-equation models is presented and simulated in SPICE. Results show a good modeling behavior. Numerical simulation in MathCad gives satisfactory results for the study of the transitory and dynamic operation at small levels of the injection current. The obtained numerical results show the specific limits of each model, according to theoretical analysis. Based on these results, software can be built that integrates circuit simulation and other modeling methods for quantum well lasers, providing a tool to model and analyze these devices from all points of view.
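The qualitative behaviour of such rate-equation models can be reproduced with a simple forward-Euler integration of a dimensionless two-variable system: the carrier density n and photon density s undergo damped relaxation oscillations before settling at the above-threshold steady state n = 1, s = p - 1. The normalization and all parameter values are illustrative choices, not those of the paper's Simulink, SPICE, or MathCad models.

```python
def integrate_rate_equations(p=2.0, gamma=5.0, dt=1e-3, t_end=20.0):
    """Forward-Euler integration of dimensionless laser rate equations:
        dn/dt = p - n - n*s          (pumping, carrier decay, stimulated emission)
        ds/dt = gamma * s * (n - 1)  (photon gain above threshold n = 1)
    p, gamma, and the initial conditions are illustrative values."""
    n, s = 0.0, 0.01  # start below threshold with a small photon seed
    steps = int(t_end / dt)
    for _ in range(steps):
        dn = p - n - n * s
        ds = gamma * s * (n - 1.0)
        n += dt * dn
        s += dt * ds
    return n, s
```

With pump p = 2 the trajectory converges to n close to 1 and s close to 1; below threshold (p < 1) the photon density decays away instead, which is the kind of step-input behaviour the paper's simulations examine.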
Geochemical modeling: a review
International Nuclear Information System (INIS)
Jenne, E.A.
1981-06-01
Two general families of geochemical models presently exist. The ion speciation-solubility group of geochemical models contains submodels that first calculate a distribution of aqueous species and then test the hypothesis that the water is near equilibrium with particular solid phases. These models may or may not calculate the adsorption of dissolved constituents and simulate the dissolution and precipitation (mass transfer) of solid phases. Another family of geochemical models, the reaction path models, simulates the stepwise precipitation of solid phases as a result of reacting specified amounts of water and rock. Reaction path models first perform an aqueous speciation of the dissolved constituents of the water, test solubility hypotheses, then perform the reaction path modeling. Certain improvements in the present versions of these models would enhance their value and usefulness to applications in nuclear-waste isolation, etc. Mass-transfer calculations of limited extent are certainly within the capabilities of state-of-the-art models. However, the reaction path models require an expansion of their thermodynamic data bases and systematic validation before they are generally accepted.
International Nuclear Information System (INIS)
Tanaka, G.
1998-01-01
The data presented here includes male and female models for Asian populations in the age groups: Newborn, 1 year, 5 years, 10 years, 15 years and adult. The model for adult male was presented at the 3rd Research Coordination Meeting held in Tianjin, October 1993. At that time, the CRP participants requested Dr. Tanaka to continue development of a female model. The adult female model was developed together with models for five younger age groups. It is intended to provide useful data for radiation protection, and has been submitted to ICRP for use in developing revised models for internal dosimetry. The model is based on normal organ masses as well as physical measurements obtained primarily from Chinese, Indian and Japanese populations. These are believed to be the most extensive data sets available. The data presented here also takes into account the variations found in the data reported by other CRP participants. It should be stressed that the model is, at the same time, based on the approach used by the ICRP Reference Man Task Group in development of their Reference Man. As noted above, the adult male model was presented at the RCM Meeting in Tianjin and approved by the participants as ''Tanaka Model'' that would be convenient for use in internal dosimetry studies for subjects from Asian populations. It is also the essential part of a publication which is a revised edition of the previous work
International Nuclear Information System (INIS)
Martin, W.E.; McDonald, L.A.
1997-01-01
The eight book chapters demonstrate the link between the physical models of the environment and the policy analysis in support of policy making. Each chapter addresses an environmental policy issue using a quantitative modeling approach. The volume addresses three general areas of environmental policy - non-point source pollution in the agricultural sector, pollution generated in the extractive industries, and transboundary pollutants from burning fossil fuels. The book concludes by discussing the modeling efforts and the use of mathematical models in general. Chapters are entitled: modeling environmental policy: an introduction; modeling nonpoint source pollution in an integrated system (agri-ecological); modeling environmental and trade policy linkages: the case of EU and US agriculture; modeling ecosystem constraints in the Clean Water Act: a case study in Clearwater National Forest (subject to discharge from metal mining waste); costs and benefits of coke oven emission controls; modeling equilibria and risk under global environmental constraints (discussing energy and environmental interrelations); relative contribution of the enhanced greenhouse effect on the coastal changes in Louisiana; and the use of mathematical models in policy evaluations: comments. The paper on coke area emission controls has been abstracted separately for the IEA Coal Research CD-ROM
Mishra, Ashok K.; Singh, Vijay P.
2011-06-01
Summary: In recent years droughts have been occurring frequently, and their impacts are being aggravated by the rise in water demand and the variability in hydro-meteorological variables due to climate change. As a result, drought hydrology has been receiving much attention. A variety of concepts have been applied to modeling droughts, ranging from simplistic approaches to more complex models. It is important to understand different modeling approaches as well as their advantages and limitations. This paper, supplementing the previous paper (Mishra and Singh, 2010) where different concepts of droughts were highlighted, reviews different methodologies used for drought modeling, which include drought forecasting, probability based modeling, spatio-temporal analysis, use of Global Climate Models (GCMs) for drought scenarios, land data assimilation systems for drought modeling, and drought planning. It is found that there have been significant improvements in modeling droughts over the past three decades. Hybrid models, incorporating large scale climate indices, seem to be promising for long lead-time drought forecasting. Further research is needed to understand the spatio-temporal complexity of droughts under climate change due to changes in spatio-temporal variability of precipitation. Applications of copula based models for multivariate drought characterization seem to be promising for better drought characterization. Research on decision support systems should be advanced for issuing warnings, assessing risk, and taking precautionary measures, and the effective ways for the flow of information from decision makers to users need to be developed. Finally, some remarks are made regarding the future outlook for drought research.
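Probability-based drought indices of the kind the review covers are, at their simplest, standardizations of a precipitation record. The sketch below computes a z-score index over a historical series and flags drought below a -1 threshold; real indices such as the SPI first fit a gamma distribution rather than assuming normality, so treat this as a simplified illustration, and the data in the note below is made up.

```python
import statistics

def standardized_index(precip):
    """Z-score each value against the record's mean and standard deviation
    (a normality-assuming simplification of indices like the SPI)."""
    mu = statistics.mean(precip)
    sd = statistics.stdev(precip)
    return [(x - mu) / sd for x in precip]

def drought_months(precip, threshold=-1.0):
    """Indices of months whose standardized value falls below the threshold."""
    z = standardized_index(precip)
    return [i for i, v in enumerate(z) if v < threshold]
```

For an invented record like [80, 95, 100, 20, 110, 90, 105, 15], only the two dry months (20 and 15) fall below the threshold and are flagged.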
Models as Relational Categories
Kokkonen, Tommi
2017-11-01
Model-based learning (MBL) has an established position within science education. It has been found to enhance conceptual understanding and provide a way for engaging students in authentic scientific activity. Despite ample research, few studies have examined the cognitive processes regarding learning scientific concepts within MBL. On the other hand, recent research within cognitive science has examined the learning of so-called relational categories. Relational categories are categories whose membership is determined on the basis of the common relational structure. In this theoretical paper, I argue that viewing models as relational categories provides a well-motivated cognitive basis for MBL. I discuss the different roles of models and modeling within MBL (using ready-made models, constructive modeling, and generative modeling) and discern the related cognitive aspects brought forward by the reinterpretation of models as relational categories. I will argue that relational knowledge is vital in learning novel models and in the transfer of learning. Moreover, relational knowledge underlies the coherent, hierarchical knowledge of experts. Lastly, I will examine how the format of external representations may affect the learning of models and the relevant relations. The nature of the learning mechanisms underlying students' mental representations of models is an interesting open question to be examined. Furthermore, the ways in which the expert-like knowledge develops and how to best support it is in need of more research. The discussion and conceptualization of models as relational categories allows discerning students' mental representations of models in terms of evolving relational structures in greater detail than previously done.
Arnold, Konstantin; Kiefer, Florian; Kopp, Jürgen; Battey, James N D; Podvinec, Michael; Westbrook, John D; Berman, Helen M; Bordoli, Lorenza; Schwede, Torsten
2009-03-01
Structural Genomics has been successful in determining the structures of many unique proteins in a high throughput manner. Still, the number of known protein sequences is much larger than the number of experimentally solved protein structures. Homology (or comparative) modeling methods make use of experimental protein structures to build models for evolutionary related proteins. Thereby, experimental structure determination efforts and homology modeling complement each other in the exploration of the protein structure space. One of the challenges in using model information effectively has been to access all models available for a specific protein in heterogeneous formats at different sites using various incompatible accession code systems. Often, structure models for hundreds of proteins can be derived from a given experimentally determined structure, using a variety of established methods. This has been done by all of the PSI centers, and by various independent modeling groups. The goal of the Protein Model Portal (PMP) is to provide a single portal which gives access to the various models that can be leveraged from PSI targets and other experimental protein structures. A single interface allows all existing pre-computed models across these various sites to be queried simultaneously, and provides links to interactive services for template selection, target-template alignment, model building, and quality assessment. The current release of the portal consists of 7.6 million model structures provided by different partner resources (CSMP, JCSG, MCSG, NESG, NYSGXRC, JCMM, ModBase, SWISS-MODEL Repository). The PMP is available at http://www.proteinmodelportal.org and from the PSI Structural Genomics Knowledgebase.
Modelling cointegration in the vector autoregressive model
DEFF Research Database (Denmark)
Johansen, Søren
2000-01-01
A survey is given of some results obtained for the cointegrated VAR. The Granger representation theorem is discussed and the notions of cointegration and common trends are defined. The statistical model for cointegrated I(1) variables is defined, and it is shown how hypotheses on the cointegratin...
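As a toy illustration of the cointegration concept surveyed above, the sketch below simulates two I(1) series driven by a common stochastic trend and recovers the cointegrating coefficient with a first-stage OLS regression (the simpler Engle-Granger step rather than Johansen's maximum-likelihood procedure for the full VAR). All data are simulated, and the coefficient 2.0 is an arbitrary illustrative choice.

```python
import random

random.seed(42)

# Simulate a random-walk trend x and a cointegrated partner y = 2*x + noise:
# both series are I(1), but y - 2*x is stationary.
n = 2000
x = [0.0]
for _ in range(n - 1):
    x.append(x[-1] + random.gauss(0, 1))
y = [2.0 * xi + random.gauss(0, 1) for xi in x]

# OLS estimate of the cointegrating coefficient (Engle-Granger first step)
mx = sum(x) / n
my = sum(y) / n
beta = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum(
    (xi - mx) ** 2 for xi in x
)
resid = [yi - beta * xi for xi, yi in zip(x, y)]

print(round(beta, 3))  # should be close to the true value 2
```

The key diagnostic is that `resid` stays bounded (stationary) even though `x` and `y` both wander; testing that formally is what the second Engle-Granger step, or Johansen's rank test, adds.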
Developing mathematical modelling competence
DEFF Research Database (Denmark)
Blomhøj, Morten; Jensen, Tomas Højgaard
2003-01-01
In this paper we introduce the concept of mathematical modelling competence, by which we mean being able to carry through a whole mathematical modelling process in a certain context. Analysing the structure of this process, six sub-competences are identified. Mathematical modelling competence...... cannot be reduced to these six sub-competences, but they are necessary elements in the development of mathematical modelling competence. Experience from the development of a modelling course is used to illustrate how the different nature of the sub-competences can be used as a tool for finding...... the balance between different kinds of activities in a particular educational setting. Obstacles of social, cognitive and affective nature for the students' development of mathematical modelling competence are reported and discussed in relation to the sub-competences....
Zhou, Yongquan; Xie, Jian; Li, Liangliang; Ma, Mingzhi
2014-01-01
Bat algorithm (BA) is a novel stochastic global optimization algorithm. Cloud model is an effective tool in transforming between qualitative concepts and their quantitative representation. Based on the bat echolocation mechanism and excellent characteristics of cloud model on uncertainty knowledge representation, a new cloud model bat algorithm (CBA) is proposed. This paper focuses on remodeling echolocation model based on living and preying characteristics of bats, utilizing the transformation theory of cloud model to depict the qualitative concept: "bats approach their prey." Furthermore, Lévy flight mode and population information communication mechanism of bats are introduced to balance the advantage between exploration and exploitation. The simulation results show that the cloud model bat algorithm has good performance on functions optimization.
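A minimal sketch of the underlying bat algorithm is shown below, minimizing the sphere function. For brevity the cloud-model "bats approach their prey" operator of the CBA is replaced here by a plain Gaussian local walk around the current best; this substitution is an assumption of the sketch, not the paper's operator.

```python
import random

random.seed(1)

def sphere(x):
    return sum(v * v for v in x)

dim, n_bats, iters = 5, 20, 200
lo, hi = -5.0, 5.0
pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_bats)]
vel = [[0.0] * dim for _ in range(n_bats)]
best = min(pos, key=sphere)[:]
init_fit = sphere(best)

for _ in range(iters):
    for i in range(n_bats):
        freq = random.random()  # pulse frequency drawn in [0, 1]
        for d in range(dim):
            vel[i][d] += (pos[i][d] - best[d]) * freq
            pos[i][d] += vel[i][d]
        if random.random() > 0.5:
            # local walk around the best bat (stand-in for the
            # cloud-model "approach the prey" step)
            cand = [best[d] + 0.01 * random.gauss(0, 1) for d in range(dim)]
        else:
            cand = [min(hi, max(lo, v)) for v in pos[i]]
        if sphere(cand) < sphere(pos[i]):  # greedy acceptance
            pos[i] = cand
        if sphere(pos[i]) < sphere(best):
            best = pos[i][:]

print(sphere(best))  # should improve on the initial best fitness
```

The two ingredients the paper balances, exploration (frequency-driven velocity updates) and exploitation (the local walk around the best bat), are both visible here in a few lines.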
International Nuclear Information System (INIS)
Bozoki, E.
1987-01-01
There is burgeoning interest in modeling-based accelerator control. With more and more stringent requirements on performance, the importance of knowing, controlling, and predicting the behavior of the accelerator system is growing. Modeling means two things: (1) the development of programs and data which predict the outcome of a measurement, and (2) devising and performing measurements to find the machine physics parameters and their behavior under different conditions. These two sides should be tied together in an iterative process: with knowledge gained on the real system, the model is modified, calibrated, and fine-tuned. The model of a system consists of data and the modeling program. In the on-line mode, the Modeling Based Control (MBC) programs should control, optimize, and correct the machine. In the off-line mode, the MBC is used to simulate the machine as well as explore and study its behavior and responses under a wide variety of circumstances. 15 refs., 3 figs
Inverse and Predictive Modeling
Energy Technology Data Exchange (ETDEWEB)
Syracuse, Ellen Marie [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2017-09-27
The LANL Seismo-Acoustic team has a strong capability in developing data-driven models that accurately predict a variety of observations. These models range from the simple – one-dimensional models that are constrained by a single dataset and can be used for quick and efficient predictions – to the complex – multidimensional models that are constrained by several types of data and result in more accurate predictions. While team members typically build models of geophysical characteristics of Earth and source distributions at scales of 1 to 1000s of km, the techniques used are applicable to other types of physical characteristics at an even greater range of scales. The following cases provide a snapshot of some of the modeling work done by the Seismo-Acoustic team at LANL.
Lawson, Andrew B
2002-01-01
Research has generated a number of advances in methods for spatial cluster modelling in recent years, particularly in the area of Bayesian cluster modelling. Along with these advances has come an explosion of interest in the potential applications of this work, especially in epidemiology and genome research. In one integrated volume, this book reviews the state-of-the-art in spatial clustering and spatial cluster modelling, bringing together research and applications previously scattered throughout the literature. It begins with an overview of the field, then presents a series of chapters that illuminate the nature and purpose of cluster modelling within different application areas, including astrophysics, epidemiology, ecology, and imaging. The focus then shifts to methods, with discussions on point and object process modelling, perfect sampling of cluster processes, partitioning in space and space-time, spatial and spatio-temporal process modelling, nonparametric methods for clustering, and spatio-temporal ...
DEFF Research Database (Denmark)
Borlund, Pia
2003-01-01
An alternative approach to evaluation of interactive information retrieval (IIR) systems, referred to as the IIR evaluation model, is proposed. The model provides a framework for the collection and analysis of IR interaction data. The aim of the model is two-fold: 1) to facilitate the evaluation...... of IIR systems as realistically as possible with reference to actual information searching and retrieval processes, though still in a relatively controlled evaluation environment; and 2) to calculate the IIR system performance taking into account the non-binary nature of the assigned relevance...... assessments. The IIR evaluation model is presented as an alternative to the system-driven Cranfield model (Cleverdon, Mills & Keen, 1966; Cleverdon & Keen, 1966) which still is the dominant approach to the evaluation of IR and IIR systems. Key elements of the IIR evaluation model are the use of realistic...
Essentials of econophysics modelling
Slanina, Frantisek
2014-01-01
This book is a course in methods and models rooted in physics and used in modelling economic and social phenomena. It covers the discipline of econophysics, which creates an interface between physics and economics. Besides the main theme, it touches on the theory of complex networks and simulations of social phenomena in general. After a brief historical introduction, the book starts with a list of basic empirical data and proceeds to thorough investigation of mathematical and computer models. Many of the models are based on hypotheses of the behaviour of simplified agents. These comprise strategic thinking, imitation, herding, and the gem of econophysics, the so-called minority game. At the same time, many other models view the economic processes as interactions of inanimate particles. Here, the methods of physics are especially useful. Examples of systems modelled in such a way include books of stock-market orders, and redistribution of wealth among individuals. Network effects are investigated in the inter...
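The minority game mentioned above, "the gem of econophysics", can be sketched in a few lines: an odd number of agents repeatedly choose one of two sides, the minority side wins, and each agent plays the better-scoring of two fixed random strategies keyed on the recent history of winning sides. All parameter values below are illustrative.

```python
import itertools
import random
import statistics

random.seed(7)

N, M, T = 101, 5, 500  # odd number of agents, memory bits, rounds
keys = list(itertools.product((0, 1), repeat=M))

# each agent holds two fixed random strategy tables: history -> side (0/1)
agents = [
    {"strats": [{k: random.randint(0, 1) for k in keys} for _ in range(2)],
     "scores": [0, 0]}
    for _ in range(N)
]

hist = tuple(random.randint(0, 1) for _ in range(M))
attendance = []
for _ in range(T):
    acts = [
        ag["strats"][0 if ag["scores"][0] >= ag["scores"][1] else 1][hist]
        for ag in agents
    ]
    n_ones = sum(acts)
    minority = 1 if n_ones < N - n_ones else 0  # N is odd, so no ties
    attendance.append(n_ones)
    for ag in agents:  # score both strategies against the outcome
        for s in (0, 1):
            if ag["strats"][s][hist] == minority:
                ag["scores"][s] += 1
    hist = hist[1:] + (minority,)

print(round(statistics.pstdev(attendance), 2))  # volatility of attendance
```

The quantity econophysicists study is exactly this attendance volatility as a function of the ratio between the number of histories (2^M) and the number of agents.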
DEFF Research Database (Denmark)
Andersen, Kasper Winther
Three main topics are presented in this thesis. The first and largest topic concerns network modelling of functional Magnetic Resonance Imaging (fMRI) and Diffusion Weighted Imaging (DWI). In particular nonparametric Bayesian methods are used to model brain networks derived from resting state f...... for their ability to reproduce node clustering and predict unseen data. Comparing the models on whole brain networks, BCD and IRM showed better reproducibility and predictability than IDM, suggesting that resting state networks exhibit community structure. This also points to the importance of using models, which...... allow for complex interactions between all pairs of clusters. In addition, it is demonstrated how the IRM can be used for segmenting brain structures into functionally coherent clusters. A new nonparametric Bayesian network model is presented. The model builds upon the IRM and can be used to infer...
Identification of physical models
DEFF Research Database (Denmark)
Melgaard, Henrik
1994-01-01
The problem of identification of physical models is considered within the frame of stochastic differential equations. Methods for estimation of parameters of these continuous time models based on discrete time measurements are discussed. The important algorithms of a computer program for ML or MAP...... design of experiments, which is for instance the design of an input signal that is optimal according to a criterion based on the information provided by the experiment. Also model validation is discussed. An important verification of a physical model is to compare the physical characteristics...... of the model with the available prior knowledge. The methods for identification of physical models have been applied in two different case studies. One case is the identification of thermal dynamics of building components. The work is related to a CEC research project called PASSYS (Passive Solar Components...
Diaspora Business Model Innovation
Directory of Open Access Journals (Sweden)
Aki Harima
2015-01-01
Full Text Available This paper explores how diasporans achieve business model innovation by using their unique resources. The hypothesis underlying the paper is that the unique backgrounds and resources of diaspora businesses, due to different sources of information and experiences as well as multiple networks, contribute to business model innovation in a distinctive manner. We investigate the English school market in the Philippines, which was established by East Asian diasporans who innovated the business model of conventional English schools. Two case studies were conducted with Japanese diaspora English schools. Their business is analyzed using a business model canvas (Osterwalder & Pigneur, 2010) and contrasted with the conventional business model. The empirical cases show that diaspora businesses use knowledge about their country of origin and engage with their country of residence and multiple networks in different locations and constellations to identify unique opportunities, leading to business model innovation.
Electricity market modeling trends
International Nuclear Information System (INIS)
Ventosa, Mariano; Baillo, Alvaro; Ramos, Andres; Rivier, Michel
2005-01-01
The trend towards competition in the electricity sector has led to efforts by the research community to develop decision and analysis support models adapted to the new market context. This paper focuses on electricity generation market modeling. Its aim is to help to identify, classify and characterize the somewhat confusing diversity of approaches that can be found in the technical literature on the subject. The paper presents a survey of the most relevant publications regarding electricity market modeling, identifying three major trends: optimization models, equilibrium models and simulation models. It introduces a classification according to their most relevant attributes. Finally, it identifies the most suitable approaches for conducting various types of planning studies or market analysis in this new context
Energy Technology Data Exchange (ETDEWEB)
Tan, A; Lyatskaya, I [Department of Physics, Alabama A and M University, Normal, AL 35762 (United States)], E-mail: arjun.tan@aamu.edu
2009-01-15
The interesting papers by Margaritondo (2005 Eur. J. Phys. 26 401) and by Helene and Yamashita (2006 Eur. J. Phys. 27 855) analysed the great Indian Ocean tsunami of 2004 using a simple one-dimensional canal wave model, which was appropriate for undergraduate students in physics and related disciplines. In this paper, two additional, easily understandable models, suitable for the same level of readership, are proposed: first, a two-dimensional model in flat space, and second, the same model on a spherical surface. The models are used to study the tsunami produced by the central Kuril earthquake of November 2006. It is shown that the two alternative models, especially the latter, give better representations of the wave amplitude, particularly at far-flung locations. The latter model further demonstrates the enhancing effect of the curvature of the Earth on the amplitude for far-reaching tsunami propagation.
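The geometric core of the two models can be sketched from energy conservation alone: in flat 2-D space the wavefront circumference grows like 2πr, giving an amplitude proportional to 1/√r, while on a sphere it grows like 2πR sin(r/R), so the amplitude re-grows beyond 90° of arc. The toy calculation below (illustrative reference amplitude and distance, not the paper's source parameters) reproduces this curvature enhancement at far-flung distances.

```python
import math

R = 6371.0  # Earth radius, km

def amp_flat(r, a0=1.0, r0=100.0):
    # flat 2-D spreading: energy flux through circles of circumference
    # 2*pi*r is conserved, so amplitude scales like 1/sqrt(r)
    return a0 * math.sqrt(r0 / r)

def amp_sphere(r, a0=1.0, r0=100.0):
    # on a sphere the wavefront circumference is 2*pi*R*sin(r/R), so the
    # amplitude scales like 1/sqrt(sin(r/R)) and grows again past 90 deg
    return a0 * math.sqrt(math.sin(r0 / R) / math.sin(r / R))

for r in (1000.0, 5000.0, 10010.0):  # 10010 km is near a quarter circumference
    print(r, round(amp_flat(r), 3), round(amp_sphere(r), 3))
```

At short range the two models agree, but near a quarter of the Earth's circumference the spherical model predicts a noticeably larger amplitude than the flat one, which is the enhancement effect the abstract describes.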
North, G. R.; Cahalan, R. F.; Coakley, J. A., Jr.
1981-01-01
An introductory survey of the global energy balance climate models is presented with an emphasis on analytical results. A sequence of increasingly complicated models involving ice cap and radiative feedback processes are solved, and the solutions and parameter sensitivities are studied. The model parameterizations are examined critically in light of many current uncertainties. A simple seasonal model is used to study the effects of changes in orbital elements on the temperature field. A linear stability theorem and a complete nonlinear stability analysis for the models are developed. Analytical solutions are also obtained for the linearized models driven by stochastic forcing elements. In this context the relation between natural fluctuation statistics and climate sensitivity is stressed.
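A zero-dimensional relative of these energy balance models can be written as C dT/dt = Q(1 - α) - (A + B T), with a linearized outgoing long-wave term A + B T. The sketch below relaxes it to equilibrium numerically and checks the result against the closed-form steady state; the parameter values are illustrative, in the spirit of the linearizations used in this literature, not taken from the paper.

```python
# illustrative parameters for a zero-dimensional energy balance model
Q = 342.0            # mean incoming solar flux, W m^-2
alpha = 0.30         # planetary albedo (held fixed; no ice feedback here)
A, B = 203.3, 2.09   # linearized outgoing long-wave flux A + B*T, T in deg C
C = 10.0             # effective heat capacity (sets the relaxation time)

T = -10.0            # arbitrary initial temperature, deg C
dt = 0.1
for _ in range(10000):
    T += dt / C * (Q * (1 - alpha) - (A + B * T))

T_eq = (Q * (1 - alpha) - A) / B  # closed-form equilibrium
print(round(T, 2), round(T_eq, 2))
```

The ice-cap and radiative feedbacks surveyed in the paper enter by making α a function of T, which is what produces the multiple equilibria and stability questions the abstract mentions.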
Macklin, Paul; Cristini, Vittorio
2013-01-01
Simulating cancer behavior across multiple biological scales in space and time, i.e., multiscale cancer modeling, is increasingly being recognized as a powerful tool to refine hypotheses, focus experiments, and enable more accurate predictions. A growing number of examples illustrate the value of this approach in providing quantitative insight on the initiation, progression, and treatment of cancer. In this review, we introduce the most recent and important multiscale cancer modeling works that have successfully established a mechanistic link between different biological scales. Biophysical, biochemical, and biomechanical factors are considered in these models. We also discuss innovative, cutting-edge modeling methods that are moving predictive multiscale cancer modeling toward clinical application. Furthermore, because the development of multiscale cancer models requires a new level of collaboration among scientists from a variety of fields such as biology, medicine, physics, mathematics, engineering, and computer science, an innovative Web-based infrastructure is needed to support this growing community. PMID:21529163
Energy Technology Data Exchange (ETDEWEB)
Bergen, Benjamin Karl [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2016-07-07
This is the PDF of a PowerPoint presentation from a teleconference on Los Alamos programming models. It starts by listing the assumptions for the programming models and then details a hierarchical programming model at the system level and node level. It then details how to map this to their internal nomenclature. Finally, a list is given of what they are currently doing in this regard.
Modelling Meat Quality Attributes.
Farrell, Terence C.
2001-01-01
Recent meat demand models incorporate demand functions for cuts of meat rather than whole carcasses. However, parameters for “meat quality” are seldom included in such models. Modelling difficulty arises as meat cuts are heterogeneous in their quality attributes. Meat quality may be assessed by measurement of attributes including tenderness, juiciness and flavour. Cooking method and cooking time are the two primary factors that affect meat-eating quality. The purpose of this paper is to show ...
Energy Technology Data Exchange (ETDEWEB)
Engel, D.W.; McGrail, B.P.
1993-11-01
The Office of Civilian Radioactive Waste Management and the Power Reactor and Nuclear Fuel Development Corporation of Japan (PNC) have supported the development of the Analytical Repository Source-Term (AREST) code at Pacific Northwest Laboratory. AREST is a computer model developed to evaluate radionuclide release from an underground geologic repository. The AREST code can be used to calculate/estimate the amount and rate of each radionuclide that is released from the engineered barrier system (EBS) of the repository. The EBS is the man-made or disrupted area of the repository. AREST was designed as a system-level model to simulate the behavior of the total repository by combining process-level models for the release from an individual waste package or container. AREST contains primarily analytical models for calculating the release/transport of radionuclides to the host rock that surrounds each waste package. Analytical models were used because of the small computational overhead, which allows all the input parameters to be derived from a statistical distribution. Recently, a one-dimensional numerical model was also incorporated into AREST to allow for more detailed modeling of the transport process with arbitrary-length decay chains. The next step in modeling the EBS is to develop a model that couples the probabilistic capabilities of AREST with a more detailed process model. This model will need to look at the reactive coupling of the processes that are involved in the release process. Such coupling would include: (1) the dissolution of the waste form, (2) the geochemical modeling of the groundwater, (3) the corrosion of the container overpacking, and (4) the backfill material, just to name a few. Several of these coupled processes are already incorporated in the current version of AREST.
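Decay-chain transport of the kind AREST models rests on the Bateman equations; for a two-member chain N1 → N2 → (stable) the closed form is N2(t) = N1(0) λ1/(λ2 − λ1) (e^(−λ1 t) − e^(−λ2 t)). The sketch below checks a simple Euler integration against that solution; the decay constants are hypothetical and not tied to any AREST input.

```python
import math

# two-member decay chain N1 -> N2 -> (stable)
lam1, lam2 = 0.05, 0.01  # decay constants, 1/yr (hypothetical nuclides)
N1_0 = 1000.0

def bateman_n2(t):
    # closed-form Bateman solution for the daughter inventory
    return N1_0 * lam1 / (lam2 - lam1) * (math.exp(-lam1 * t) - math.exp(-lam2 * t))

# explicit Euler integration of the same chain
dt, t_end = 0.01, 100.0
n1, n2 = N1_0, 0.0
for _ in range(int(t_end / dt)):
    dn1 = -lam1 * n1
    dn2 = lam1 * n1 - lam2 * n2
    n1 += dt * dn1
    n2 += dt * dn2

print(round(n2, 1), round(bateman_n2(t_end), 1))  # numeric vs analytic
```

This is the trade-off the abstract describes: analytical models (the Bateman form) are cheap enough to sample over statistical input distributions, while numerical integration extends to arbitrary-length chains where closed forms become unwieldy.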
Arnaoudova, Kristina; Stanchev, Peter
2015-11-01
The business processes are the key asset for every organization. The design of business process models is a foremost concern among an organization's functions. Business processes and their proper management depend intensely on the performance of software applications and technology solutions. The paper attempts to define a new conceptual model of an IT service provider; it can be examined as an IT-focused enterprise model, part of the Enterprise Architecture (EA) school.
Arnold, Konstantin; Kiefer, Florian; Kopp, Jürgen; Battey, James N. D.; Podvinec, Michael; Westbrook, John D.; Berman, Helen M.; Bordoli, Lorenza; Schwede, Torsten
2008-01-01
Structural Genomics has been successful in determining the structures of many unique proteins in a high throughput manner. Still, the number of known protein sequences is much larger than the number of experimentally solved protein structures. Homology (or comparative) modeling methods make use of experimental protein structures to build models for evolutionary related proteins. Thereby, experimental structure determination efforts and homology modeling complement each other in the exploratio...
Modeling Frequency Comb Sources
Directory of Open Access Journals (Sweden)
Li Feng
2016-06-01
Full Text Available Frequency comb sources have revolutionized metrology and spectroscopy and found applications in many fields. Stable, low-cost, high-quality frequency comb sources are important to these applications. Modeling of frequency comb sources will help the understanding of the operation mechanism and optimization of the design of such sources. In this paper, we review the theoretical models used and recent progress of the modeling of frequency comb sources.
Altun, Emrah; Tatlidil, Hüseyin
2016-01-01
In this study, a wavelet-based GARCH-Extreme Value Theory (EVT) model is proposed for modeling financial return series to forecast daily value-at-risk. Wavelet-based GARCH-EVT is a hybrid model combining wavelet analysis and EVT. The proposed model contains three stages. In the first stage, the return series is decomposed into wavelet series and an approximation series by applying the maximal overlap discrete wavelet transform. In the second stage, the detrended return series and approximation series are obtained by using wave...
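The first-stage decomposition can be illustrated with a single-level Haar version of the maximal overlap discrete wavelet transform (MODWT); the paper's actual filter and decomposition depth are not specified here, so this is a simplified sketch on simulated returns. The Haar MODWT detail and scaling outputs sum back exactly to the original series, which is what makes the "detrended series + approximation" split possible.

```python
import random

random.seed(3)

# toy daily return series (Gaussian noise standing in for real returns)
returns = [random.gauss(0, 0.01) for _ in range(256)]

# single-level Haar MODWT with circular boundary handling:
# detail = high-frequency (detrended) part, approx = smooth part
n = len(returns)
detail = [(returns[t] - returns[t - 1]) / 2.0 for t in range(n)]
approx = [(returns[t] + returns[t - 1]) / 2.0 for t in range(n)]

# the two parts reconstruct the series additively, sample by sample
recon_err = max(abs(d + a - r) for d, a, r in zip(detail, approx, returns))
print(recon_err)
```

In the hybrid model, a GARCH filter would then be fitted to the detail (detrended) component and EVT applied to the standardized residual tails; those later stages are beyond this sketch.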
Energy Technology Data Exchange (ETDEWEB)
Young, Michael F. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
2015-07-01
Aerosol particles that deposit on surfaces may be subsequently resuspended by air flowing over the surface. A review of models for this liftoff process is presented and compared to available data. Based on this review, a model that agrees with existing data and is readily computed is presented for incorporation into a system level code such as MELCOR.
International Nuclear Information System (INIS)
Wilczek, F.
1993-01-01
The standard model of particle physics is highly successful, although it is obviously not a complete or final theory. In this presentation the author argues that the structure of the standard model gives some quite concrete, compelling hints regarding what lies beyond. Essentially, this presentation is a record of the author's own judgement of what the central clues for physics beyond the standard model are, and also it is an attempt at some pedagogy. 14 refs., 6 figs
Mangano, M.L.; Aguilar-Saavedra, Juan Antonio; Alekhin, S.; Badger, S.; Bauer, C.W.; Becher, T.; Bertone, V.; Bonvini, M.; Boselli, S.; Bothmann, E.; Boughezal, R.; Cacciari, M.; Carloni Calame, C.M.; Caola, F.; Campbell, J.M.; Carrazza, S.; Chiesa, M.; Cieri, L.; Cimaglia, F.; Febres Cordero, F.; Ferrarese, P.; D'Enterria, D.; Ferrera, G.; Garcia i Tormo, X.; Garzelli, M.V.; Germann, E.; Hirschi, V.; Han, T.; Ita, H.; Jäger, B.; Kallweit, S.; Karlberg, A.; Kuttimalai, S.; Krauss, F.; Larkoski, A.J.; Lindert, J.; Luisoni, G.; Maierhöfer, P.; Mattelaer, O.; Martinez, H.; Moch, S.; Montagna, G.; Moretti, M.; Nason, P.; Nicrosini, O.; Oleari, C.; Pagani, D.; Papaefstathiou, A.; Petriello, F.; Piccinini, F.; Pierini, M.; Pierog, T.; Pozzorini, S.; Re, E.; Robens, T.; Rojo, J.; Ruiz, R.; Sakurai, K.; Salam, G.P.; Salfelder, L.; Schönherr, M.; Schulze, M.; Schumann, S.; Selvaggi, M.; Shivaji, A.; Siodmok, A.; Skands, P.; Torrielli, P.; Tramontano, F.; Tsinikos, I.; Tweedie, B.; Vicini, A.; Westhoff, S.; Zaro, M.; Zeppenfeld, D.; CERN. Geneva. ATS Department
2017-06-22
This report summarises the properties of Standard Model processes at the 100 TeV pp collider. We document the production rates and typical distributions for a number of benchmark Standard Model processes, and discuss new dynamical phenomena arising at the highest energies available at this collider. We discuss the intrinsic physics interest in the measurement of these Standard Model processes, as well as their role as backgrounds for New Physics searches.
Vacuum inhomogeneous cosmological models
International Nuclear Information System (INIS)
Hanquin, J.-L.
1984-01-01
The author presents some results concerning vacuum cosmological models which admit a 2-dimensional Abelian group of isometries: classification of these space-times based on the topological nature of their space-like hypersurfaces and on their time evolution, and analysis of the asymptotic behaviour at spatial infinity for hyperbolic models as well as in the neighbourhood of the singularity for models possessing a time singularity during their evolution. (Auth.)
International Nuclear Information System (INIS)
Pleitez, V.
1994-01-01
The search for physics laws beyond the standard model is discussed in a general way, along with some topics in supersymmetric theories. Recent possibilities arising in the leptonic sector are then addressed. Finally, models with SU(3)_C × SU(2)_L × U(1)_Y symmetry are considered as alternatives for extensions of the standard model of elementary particles. 36 refs., 1 fig., 4 tabs
Yum, Soo-Young; Yoon, Ki-Young; Lee, Choong-Il; Lee, Byeong-Chun; Jang, Goo
2016-01-01
Animal models, particularly pigs, have come to play an important role in translational biomedical research. Many pig models with genetic modifications have been produced via somatic cell nuclear transfer (SCNT). However, because most transgenic pigs have been produced by random integration to date, the need for more precise gene-mutated models using recombinase-based conditional gene expression, as in mice, has been raised. Currently, advanced genome-editing technologies enable us to generat...
Levy, R.; Mcginness, H.
1976-01-01
Investigations were performed to predict the power available from the wind at the Goldstone, California, antenna site complex. The background for power prediction was derived from a statistical evaluation of available wind speed data records at this location and at nearby locations similarly situated within the Mojave desert. In addition to a model for power prediction over relatively long periods of time, an interim simulation model that produces sample wind speeds is described. The interim model furnishes uncorrelated sample speeds at hourly intervals that reproduce the statistical wind distribution at Goldstone. A stochastic simulation model to provide speed samples representative of both the statistical speed distributions and correlations is also discussed.
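A common way to build such a sample-speed simulator is to draw hourly speeds from a fitted Weibull distribution and push them through a turbine power relation with cut-in, rated, and cut-out speeds. The sketch below uses hypothetical site and turbine parameters, not the Goldstone statistics derived in the paper.

```python
import random

random.seed(11)

# hypothetical site and turbine parameters (illustrative only)
k, c = 2.0, 7.0                       # Weibull shape and scale (m/s)
rho, area, cp = 1.225, 100.0, 0.40    # air density, rotor area, power coeff.
v_in, v_rated, v_out = 3.0, 12.0, 25.0

def power(v):
    """Power in watts from P = 0.5*rho*A*Cp*v^3 with a simple power curve."""
    if v < v_in or v > v_out:
        return 0.0
    v = min(v, v_rated)               # flat output above rated speed
    return 0.5 * rho * area * cp * v ** 3

# one year of uncorrelated hourly speed samples, as in the interim model
speeds = [random.weibullvariate(c, k) for _ in range(24 * 365)]
mean_kw = sum(power(v) for v in speeds) / len(speeds) / 1000.0
print(round(mean_kw, 1))              # mean available power, kW
```

These hourly draws are uncorrelated, matching the interim model described in the abstract; the stochastic model discussed later in the paper would additionally impose the observed hour-to-hour speed correlations.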
Gillespie, Ronald J; Robinson, Edward A
2005-05-01
Although the structure of almost any molecule can now be obtained by ab initio calculations, chemists still look for simple answers to the question "What determines the geometry of a given molecule?" For this purpose they make use of various models, such as the VSEPR model and qualitative quantum mechanical models such as those based on valence bond theory. The present state of such models, and the support for them provided by recently developed methods for analyzing calculated electron densities, are reviewed and discussed in this tutorial review.
Croatian Cadastre Database Modelling
Directory of Open Access Journals (Sweden)
Zvonko Biljecki
2013-04-01
Full Text Available The Cadastral Data Model has been developed as part of a larger programme to improve the products and production environment of the Croatian Cadastral Service of the State Geodetic Administration (SGA). The goal of the project was to create a cadastral data model conforming to the relevant standards and specifications in the field of geoinformation (GI) adopted by the international organisations for standardisation competent for GI (ISO TC211 and OpenGIS) and their implementations. The main guidelines during the project have been object-oriented conceptual modelling of the updated users' requests and a "new" cadastral data model designed by the SGA - Faculty of Geodesy - Geofoto LLC project team. The UML of the conceptual model is given for all feature categories and is described only at class level. The next step was the UML technical model, which was developed from the UML conceptual model. The technical model integrates different UML schemas in one united schema. XML (eXtensible Markup Language) was applied for the XML description of the UML models, and then the XML schema was transferred into a GML (Geography Markup Language) application schema. With this procedure we have completely described the behaviour of each cadastral feature and the rules for the transfer and storage of cadastral features into the database.
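The UML-to-GML pipeline described above ends in GML application-schema instances. A toy instance for a cadastral parcel can be assembled with the Python standard library; the element names below are illustrative placeholders, not the SGA application schema.

```python
import xml.etree.ElementTree as ET

# toy GML-like instance for one cadastral parcel
# (hypothetical element names, not the SGA application schema)
GML = "http://www.opengis.net/gml"
ET.register_namespace("gml", GML)

parcel = ET.Element("CadastralParcel", id="cp-001")
polygon = ET.SubElement(parcel, f"{{{GML}}}Polygon")
exterior = ET.SubElement(polygon, f"{{{GML}}}exterior")
ring = ET.SubElement(exterior, f"{{{GML}}}LinearRing")
pos = ET.SubElement(ring, f"{{{GML}}}posList")
pos.text = "0 0 10 0 10 10 0 10 0 0"  # closed boundary coordinates

xml_text = ET.tostring(parcel, encoding="unicode")
print(xml_text)
```

In practice such instance documents are validated against the generated GML application schema before storage, which is the "rules for transfer and storage" step the abstract describes.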
Højsgaard, Søren; Lauritzen, Steffen
2012-01-01
Graphical models in their modern form have been around since the late 1970s and appear today in many areas of the sciences. Along with the ongoing developments of graphical models, a number of different graphical modeling software programs have been written over the years. In recent years many of these software developments have taken place within the R community, either in the form of new packages or by providing an R interface to existing software. This book attempts to give the reader a gentle introduction to graphical modeling using R and the main features of some of these packages. In add
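The conditional-independence structure that graphical models encode can be illustrated outside R as well. The following Python sketch (an illustration, not taken from the book) shows the defining property of a Gaussian graphical model: a zero off-diagonal entry in the precision matrix corresponds to a missing edge, i.e. conditional independence between those two variables given the rest.

```python
import numpy as np

# Precision matrix of a 3-node Gaussian graphical model:
# a zero off-diagonal entry encodes conditional independence
# between the corresponding pair of variables given the rest.
K = np.array([[2.0, 0.6, 0.0],   # X1 -- X2 edge; no X1 -- X3 edge
              [0.6, 2.0, 0.5],
              [0.0, 0.5, 2.0]])

Sigma = np.linalg.inv(K)          # implied covariance matrix

# The *marginal* covariance between X1 and X3 is non-zero
# (dependence flows through X2)...
marginal_cov_13 = Sigma[0, 2]

# ...but inverting the covariance recovers the structural zero:
# X1 and X3 are conditionally independent given X2.
K_recovered = np.linalg.inv(Sigma)
```

The same check is what R packages for Gaussian graphical models exploit when they estimate sparse precision matrices from data.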
Energy Technology Data Exchange (ETDEWEB)
Brown-VanHoozer, S. A.
1999-06-02
Conscious awareness of our environment is based on a feedback loop comprised of sensory input transmitted to the central nervous system, leading to construction of our "model of the world" (Lewis et al., 1982). We then assimilate this neurological model at the unconscious level into information we can later consciously consider useful in identifying belief systems and behaviors for designing diverse systems. Thus, we can avoid potential problems arising from our error-prone perceived reality of the world. By understanding how our model of reality is organized, we allow ourselves to transcend content and develop insight into how effective choices and belief systems are generated through sensory-derived processes. These are the processes which provide the designer with the ability to meta-model (build a model of a model) the user; consequently, matching the mental model of the user with that of the designer and, coincidentally, forming rapport between the two participants. The information shared between the participants is neither assumed nor generalized; it is closer to unequivocal, thus minimizing error through a sharing of each other's model of reality. How to identify individual mental mechanisms or processes, how to organize the individual strategies of these mechanisms into useful patterns, and how to formulate these into models for success and knowledge-based outcomes is the subject of the discussion that follows.
Dynamic Latent Classification Model
DEFF Research Database (Denmark)
Zhong, Shengtong; Martínez, Ana M.; Nielsen, Thomas Dyhre
Monitoring a complex process often involves keeping an eye on hundreds or thousands of sensors to determine whether or not the process is under control. We have been working with dynamic data from an oil production facility in the North Sea, where unstable situations should be identified as soon as possible. Motivated by this problem setting, we propose a generative model for dynamic classification in continuous domains. At each time point the model can be seen as combining a naive Bayes model with a mixture of factor analyzers (FA). The latent variables of the FA are used to capture the dynamics in the process as well as to model dependencies between attributes.
Diffeomorphic Statistical Deformation Models
DEFF Research Database (Denmark)
Hansen, Michael Sass; Hansen, Mads Fogtman; Larsen, Rasmus
2007-01-01
In this paper we present a new method for constructing diffeomorphic statistical deformation models in arbitrary dimensional images with a nonlinear generative model and a linear parameter space. Our deformation model is a modified version of the diffeomorphic model introduced by Cootes et al. The modifications ensure that no boundary restriction has to be enforced on the parameter space to prevent folds or tears in the deformation field. For straightforward statistical analysis, principal component analysis and sparse methods, we assume that the parameters for a class of deformations lie on a linear...
Inside - Outside Model Viewing
DEFF Research Database (Denmark)
Nikolov, Ivan Adriyanov
2016-01-01
components of the model, their proportions compared to each other, and the overall design. A variety of augmented reality (AR) applications have been created for the overall visualization of large-scale models. For tours inside 3D renderings of models, many immersive virtual reality (VR) applications exist. Both types of applications have their limitations, omitting either important details in the AR case or the full picture in the case of VR. This paper presents a low-cost way to demonstrate models using a hybrid virtual environment system (HVE), combining virtual reality and augmented reality visualization...
International Nuclear Information System (INIS)
Leino-Forsman, H.; Olin, M.
1991-01-01
The first Seminar on Groundwater Modelling was arranged by VTT (Reactor Laboratory) in Espoo, Finland, in May 1991. The one-day seminar dealt both with the modelling of groundwater geochemistry and transport and with mathematical methods for modelling. The seminar concentrated on giving a broad picture of the applications of groundwater modelling, e.g. nuclear waste and groundwater resources, including artificial groundwater and pollution. The participants came from research institutes and universities as well as engineering companies. The articles are published in Finnish with English abstracts.
Zagorsek, Branislav
2013-01-01
The purpose of this paper is to present possible ways in which a company could maintain or even gain competitive advantage in a highly dynamic business environment, from the perspective of business models. After a short introduction on the evolution of innovation, the paper is divided into three parts. The first part discusses the business model itself, how to design a business model and how to deal with it. The second part discusses business model innovations: when and how to innovate or reinvent your bus...
Computer Modeling and Simulation
Energy Technology Data Exchange (ETDEWEB)
Pronskikh, V. S. [Fermilab
2014-05-09
Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model's relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target is to blame when the model fails. I argue that in order to disentangle verification and validation, a clear distinction needs to be made between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them). Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviating the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.
Sapyta, Joe; Reid, Hank; Walton, Lew
The topics are presented in viewgraph form and include the following: particle bed reactor (PBR) core cross section; PBR bleed cycle; fuel and moderator flow paths; PBR modeling requirements; characteristics of PBR and nuclear thermal propulsion (NTP) modeling; challenges for PBR and NTP modeling; thermal hydraulic computer codes; capabilities for PBR/reactor application; thermal/hydraulic codes; limitations; physical correlations; comparison of predicted friction factor and experimental data; frit pressure drop testing; cold frit mask factor; decay heat flow rate; startup transient simulation; and philosophy of systems modeling.
MARKETING MODELS APPLICATION EXPERIENCE
Directory of Open Access Journals (Sweden)
A. Yu. Rymanov
2011-01-01
Full Text Available Marketing models are used for the assessment of such marketing elements as sales volume, market share, market attractiveness, advertising costs, product promotion and selling, profit, and profitability. A classification of buying-decision models is presented. SWOT- and GAP-based models are best suited for sales assessment. Lately, there has been a tendency to move from assessment on the basis of financial indices to assessment on the basis of non-financial ones. From the marketing viewpoint, the most important are long-term company activity and consumer-drawing models, as well as operative models of market attractiveness.
Multifamily Envelope Leakage Model
Energy Technology Data Exchange (ETDEWEB)
Faakye, O. [Consortium for Advanced Residential Buildings, Norwalk, CT (United States); Griffiths, D. [Consortium for Advanced Residential Buildings, Norwalk, CT (United States)
2015-05-01
The objective of the 2013 research project was to develop a model for predicting fully guarded test (FGT) results, using unguarded test data and specific building features of apartment units. The model developed has a coefficient of determination (R²) value of 0.53 with a root mean square error (RMSE) of 0.13. Both statistical metrics indicate that the model is relatively strong. When tested against data that were not included in the development of the model, prediction accuracy was within 19%, which is reasonable given that seasonal differences in blower door measurements can vary by as much as 25%.
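The two metrics quoted for the model, R² and RMSE, can be reproduced on synthetic data. The sketch below is purely illustrative; the actual predictors and coefficients of the CARB model are not given here, so the data and variable names are hypothetical.

```python
import numpy as np

def fit_and_score(X, y):
    """Least-squares fit plus the two metrics quoted for the model:
    coefficient of determination (R^2) and root mean square error."""
    X1 = np.column_stack([np.ones(len(X)), X])     # add intercept column
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    rmse = np.sqrt(np.mean(resid ** 2))
    r2 = 1.0 - resid.var() / y.var()               # 1 - SSR/SST
    return beta, r2, rmse

# Synthetic stand-in for (unguarded leakage, building features) -> FGT
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 2))
y = 0.4 * X[:, 0] + 0.2 * X[:, 1] + rng.normal(0, 0.1, 200)
beta, r2, rmse = fit_and_score(X, y)
```

An R² of 0.53 then means the predictors explain just over half of the variance in the guarded test results, with typical prediction errors of about 0.13 in the units of the response.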
Faraway, Julian J
2014-01-01
A Hands-On Way to Learning Data Analysis. Part of the core of statistics, linear models are used to make predictions and explain the relationship between the response and the predictors. Understanding linear models is crucial to a broader competence in the practice of statistics. Linear Models with R, Second Edition explains how to use linear models in physical science, engineering, social science, and business applications. The book incorporates several improvements that reflect how the world of R has greatly expanded since the publication of the first edition. New to the Second Edition: Reorganiz...
Plasticity: modeling & computation
National Research Council Canada - National Science Library
Borja, Ronaldo Israel
2013-01-01
"Plasticity Modeling & Computation" is a textbook written specifically for students who want to learn the theoretical, mathematical, and computational aspects of inelastic deformation in solids...
Energy Technology Data Exchange (ETDEWEB)
Valgas, Helio Moreira; Pinto, Roberto del Giudice R.; Franca, Carlos [Companhia Energetica de Minas Gerais (CEMIG), Belo Horizonte, MG (Brazil); Lambert-Torres, Germano; Silva, Alexandre P. Alves da; Pires, Robson Celso; Costa Junior, Roberto Affonso [Escola Federal de Engenharia de Itajuba, MG (Brazil)
1994-12-31
Accurate dynamic load models allow more precise calculations of power system controls and stability limits, which are critical mainly in the operation planning of power systems. This paper describes the development of a computer program (software) for static and dynamic load model studies using the measurement approach for the CEMIG system. Two dynamic load model structures are developed and tested. A procedure for applying a set of measured data from an on-line transient recording system to develop load models is described. (author) 6 refs., 17 figs.
Directory of Open Access Journals (Sweden)
Ranasinghe P. K. C. Malmini
2008-09-01
Full Text Available We model price prediction in the Sri Lankan stock market using the Ising model and some recent developments in statistical physics techniques. In contrast to usual agent models, the influence does not flow inward from the surrounding neighbors to the centre, but spreads outward from the centre to the neighbors. Monte Carlo simulations were used to study this problem. The analysis was based on the All Share Price Index and the Milanka Price Index of the Colombo Stock Exchange, and on a simulated price process. The monthly and daily influences of the above indices on the Sri Lankan economy were also investigated. The model thus describes the spread of opinions among traders.
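The paper's exact update rule is not reproduced here, but the centre-outward flow of influence it describes can be sketched with a standard Metropolis acceptance step. The chain layout, coupling and temperature below are illustrative assumptions, not the authors' calibrated values.

```python
import math, random

random.seed(42)

N = 101                       # 1-D chain of traders, centre at index N//2
J, T = 1.0, 1.5               # coupling and "temperature" (noise level)
spins = [random.choice([-1, 1]) for _ in range(N)]

def sweep(spins):
    """One outward pass: influence flows from the centre to the edges,
    i.e. each trader tends to copy its inner neighbour (Metropolis step)."""
    c = len(spins) // 2
    order = [c + d * s for d in range(1, c + 1) for s in (1, -1)]
    for i in order:
        inner = i - 1 if i > c else i + 1      # neighbour closer to centre
        dE = 2 * J * spins[i] * spins[inner]   # energy change if we flip i
        if dE <= 0 or random.random() < math.exp(-dE / T):
            spins[i] = -spins[i]

for _ in range(200):
    sweep(spins)

magnetisation = sum(spins) / N   # net "opinion" of the market
```

Because misaligned pairs always flip (dE < 0), opinions propagate from the centre outward; the noise term T keeps the market from freezing into full consensus.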
Tanwir, Savera
2014-01-01
There has been a phenomenal growth in video applications over the past few years. An accurate traffic model of Variable Bit Rate (VBR) video is necessary for performance evaluation of a network design and for generating synthetic traffic that can be used for benchmarking a network. A large number of models for VBR video traffic have been proposed in the literature for different types of video in the past 20 years. Here, the authors have classified and surveyed these models and have also evaluated the models for H.264 AVC and MVC encoded video and discussed their findings.
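One classic family among the surveyed VBR models is autoregressive. A minimal AR(1) frame-size generator looks like the following; the mean, persistence and noise parameters are illustrative, not measured from any particular codec.

```python
import random

def ar1_frame_sizes(n, mean=27000.0, phi=0.9, sigma=4000.0, seed=1):
    """Generate n synthetic VBR frame sizes (bits) with an AR(1) model:
    X_t = mean + phi * (X_{t-1} - mean) + noise.  Sizes are floored at a
    small positive minimum, since a frame cannot have negative size."""
    rng = random.Random(seed)
    x = mean
    sizes = []
    for _ in range(n):
        x = mean + phi * (x - mean) + rng.gauss(0, sigma)
        sizes.append(max(x, 1000.0))
    return sizes

frames = ar1_frame_sizes(500)
```

The phi parameter controls the short-range autocorrelation that makes VBR traffic bursty; more faithful models in the survey add scene-change components and long-range dependence on top of this.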
Wiegelmann, Thomas; Petrie, Gordon J. D.; Riley, Pete
2017-09-01
Coronal magnetic field models use photospheric field measurements as a boundary condition to model the solar corona. In this paper we review the most common model assumptions, starting from MHD models, through magnetohydrostatics and force-free models, and finally potential field models. Each model in this list is somewhat less complex than the previous one and makes more restrictive assumptions by neglecting physical effects. The magnetohydrostatic approach neglects time-dependent phenomena and plasma flows; the force-free approach additionally neglects the gradient of the plasma pressure and the gravity force. This leads to a vanishing Lorentz force, and electric currents are parallel (or anti-parallel) to the magnetic field lines. Finally, the potential field approach neglects these currents as well. We outline the main assumptions, benefits and limitations of these models both from a theoretical viewpoint (how realistic are the models?) and a practical one (which computer resources do we need?). Finally, we address the important problem of noisy and inconsistent photospheric boundary conditions and the possibility of using chromospheric and coronal observations to improve the models.
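The chain of assumptions described in the abstract can be written compactly; each successive model sets further terms of the momentum balance to zero.

```latex
% Hierarchy of coronal magnetic field models, from general to restrictive.
% Stationary MHD momentum balance:
\rho (\mathbf{v}\cdot\nabla)\mathbf{v}
  = \frac{1}{\mu_0}(\nabla\times\mathbf{B})\times\mathbf{B}
    - \nabla p + \rho \mathbf{g}
% Magnetohydrostatics (neglect flows, v = 0):
\frac{1}{\mu_0}(\nabla\times\mathbf{B})\times\mathbf{B}
  - \nabla p + \rho\mathbf{g} = 0
% Force-free (neglect pressure gradient and gravity):
(\nabla\times\mathbf{B})\times\mathbf{B} = 0
  \;\Longleftrightarrow\; \nabla\times\mathbf{B} = \alpha\mathbf{B}
% Potential field (neglect currents as well):
\nabla\times\mathbf{B} = 0, \qquad \nabla\cdot\mathbf{B} = 0
```

In the force-free case the current density is parallel (or anti-parallel) to the field, with proportionality factor α; setting α = 0 recovers the potential field model.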
Schmidt-Eisenlohr, F.; Puñal, O.; Klagges, K.; Kirsche, M.
Apart from the general issue of modeling the channel, the PHY and the MAC of wireless networks, there are specific modeling assumptions that are considered for different systems. In this chapter we consider three specific wireless standards and highlight modeling options for them: IEEE 802.11 (as an example of wireless local area networks), IEEE 802.16 (as an example of wireless metropolitan networks) and IEEE 802.15 (as an example of body area networks). Each section on these three systems also ends with a discussion of the model implementations that are available today.
International Nuclear Information System (INIS)
Musakhanov, M.M.
1980-01-01
The chiral bag model is considered. It is suggested that pions interact only with the surface of a quark "bag" and do not penetrate inside. In the case of a large bag the pion field is rather weak and the theory reduces to the linearized chiral bag model. Within that model the baryon mass spectrum, the β-decay axial constant, the magnetic moments of baryons, and the pion-baryon coupling constants and their form factors are calculated. It is shown that pion corrections to the calculations of the chiral bag model are essential. The obtained results are found to be in reasonable agreement with the experimental data.
SME International Business Models
DEFF Research Database (Denmark)
Child, John; Hsieh, Linda; Elbanna, Said
2017-01-01
This paper addresses two questions through a study of 180 SMEs located in contrasting industry and home country contexts. First, which business models for international markets prevail among SMEs, and do they configure into different types? Second, which factors predict the international business models that SMEs follow? Three distinct international business models (traditional market-adaptive, technology exploiter, and ambidextrous explorer) are found among the SMEs studied. The likelihood of SMEs adopting one business model rather than another is to a high degree predictable with reference to a small set of factors: industry, level of home economy development, and decision-maker international experience.
Modeling Optical Lithography Physics
Neureuther, Andrew R.; Rubinstein, Juliet; Chin, Eric; Wang, Lynn; Miller, Marshal; Clifford, Chris; Yamazoe, Kenji
2010-06-01
Key physical phenomena associated with resists, illumination, lenses and masks are used to show the progress in models and algorithms for modeling optical projection printing as well as current simulation challenges in managing process complexity for manufacturing. The amazing current capability and challenges for projection printing are discussed using the 22 nm device generation. A fundamental foundation for modeling resist exposure, partial coherent imaging and defect printability is given. The technology innovations of resolution enhancement and chemically amplified resist systems and their modeling challenges are overviewed. Automated chip-level applications in pattern pre-compensation and design-anticipation of residual process variations require new simulation approaches.
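The coherent-imaging limit of the partial coherent imaging foundation mentioned above can be sketched in a few lines: low-pass the mask spectrum at the lens cutoff NA/λ and square the resulting field. A production simulator would instead sum such images over the illumination source (Abbe/Hopkins partial coherence); the pattern, grid and optical constants below are illustrative.

```python
import numpy as np

def coherent_aerial_image(mask, na=1.35, wavelength=193.0, pixel_nm=5.0):
    """Aerial image of a binary mask in the fully coherent limit: keep only
    the spatial frequencies the lens can pass (|f| <= NA/lambda), then take
    the squared magnitude of the resulting field."""
    F = np.fft.fft2(mask)
    fx = np.fft.fftfreq(mask.shape[0], d=pixel_nm)   # cycles per nm
    fy = np.fft.fftfreq(mask.shape[1], d=pixel_nm)
    FX, FY = np.meshgrid(fx, fy, indexing="ij")
    F[FX**2 + FY**2 > (na / wavelength) ** 2] = 0.0  # lens pupil filter
    return np.abs(np.fft.ifft2(F)) ** 2

mask = np.zeros((64, 64))
mask[:, 24:40] = 1.0        # a single 80 nm line (16 px * 5 nm)
image = coherent_aerial_image(mask)
```

The loss of high spatial frequencies at the pupil is exactly what resolution enhancement techniques (off-axis illumination, phase-shift masks) are designed to compensate for.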
International Nuclear Information System (INIS)
Kazantsev, A.A.
2009-01-01
A model of a turbine stage for real-time calculation of NPP turbine department dynamics was developed. The simulation results were compared with manufacturer calculations for NPP low-speed and fast turbines. The comparison results have shown that the model is valid for real-time simulation of all modes of turbine operation. The model allows calculating turbine stage parameters with 1% accuracy. It was shown that the developed turbine stage model meets the accuracy requirements if the blade setting angles for all turbine stages are available.
Mixed models for predictive modeling in actuarial science
Antonio, K.; Zhang, Y.
2012-01-01
We start with a general discussion of mixed (also called multilevel) models and continue with illustrating specific (actuarial) applications of this type of models. Technical details on (linear, generalized, non-linear) mixed models follow: model assumptions, specifications, estimation techniques
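The simplest actuarial instance of a mixed model is the random-intercept (credibility) model. The sketch below simulates grouped claims and recovers the two variance components with the one-way ANOVA method of moments; all numbers are illustrative, and real actuarial applications would use the (restricted) maximum likelihood estimators the paper's technical sections describe.

```python
import random, statistics

random.seed(7)

# Simulate claims for 20 policyholder groups, 30 observations each:
# y_ij = overall mean + group effect b_i + noise  (random-intercept model)
n = 30
groups = []
for _ in range(20):
    b = random.gauss(0, 2.0)   # random intercept, true variance 4
    groups.append([10.0 + b + random.gauss(0, 1.0) for _ in range(n)])

group_means = [statistics.mean(g) for g in groups]

# One-way ANOVA method of moments for the two variance components:
msw = statistics.mean(statistics.variance(g) for g in groups)  # within
msb = n * statistics.variance(group_means)                     # between
sigma2_noise = msw                          # estimate of noise variance
sigma2_group = max((msb - msw) / n, 0.0)    # estimate of intercept variance
```

The ratio sigma2_group / (sigma2_group + sigma2_noise / n) is the familiar credibility weight given to a group's own mean.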
Model Based Temporal Reasoning
Rabin, Marla J.; Spinrad, Paul R.; Fall, Thomas C.
1988-03-01
Systems that assess the real world must cope with evidence that is uncertain, ambiguous, and spread over time. Typically, the most important function of an assessment system is to identify when activities are occurring that are unusual or unanticipated. Model-based temporal reasoning addresses both of these requirements. The differences among temporal reasoning schemes lie in the methods used to avoid computational intractability. If we had n pieces of data and we wanted to examine how they were related, the worst case would be where we had to examine every subset of these points to see if that subset satisfied the relations. This would be 2^n subsets, which is intractable. Models compress this: if several data points are all compatible with a model, then that model represents all those data points. Data points are then considered related if they lie within the same model or if they lie in models that are related. Models thus address the intractability problem. They also address the problem of detecting unusual activities: if the data do not agree with the models indicated by earlier data, then something out of the norm is taking place. The models can summarize what we know up to that time, so when they are not predicting correctly, either something unusual is happening or we need to revise our models. The model-based reasoner developed at Advanced Decision Systems is thus both intuitive and powerful. It is currently being used on one operational system and several prototype systems. It has enough power to be used in domains spanning the spectrum from manufacturing engineering and project management to low-intensity conflict and strategic assessment.
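The compression idea can be made concrete: greedily assign incoming points to linear models and treat points as related when the same model covers them. This toy sketch (not the Advanced Decision Systems implementation) collapses 20 points into two models instead of examining 2^n subsets.

```python
def compress_into_models(points, tol=0.5):
    """Greedy sketch of model-based compression: each model is a line
    y = a*t + b fixed by its first two points; a new point joins the
    first model that predicts it within `tol`, otherwise it seeds a
    new model.  The handful of resulting models stands in for all the
    data points they cover."""
    models = []   # each model: dict with slope a, intercept b, members
    for (t, y) in points:
        for m in models:
            if len(m["members"]) == 1:
                t0, y0 = m["members"][0]
                if t != t0:                       # second point fixes the line
                    m["a"] = (y - y0) / (t - t0)
                    m["b"] = y0 - m["a"] * t0
                    m["members"].append((t, y))
                    break
            elif abs(m["a"] * t + m["b"] - y) <= tol:
                m["members"].append((t, y))
                break
        else:
            models.append({"a": 0.0, "b": y, "members": [(t, y)]})
    return models

# Two activity streams: a steady trend and a flat background
data = [(t, 2.0 * t + 1.0) for t in range(10)] + [(t, -3.0) for t in range(10)]
models = compress_into_models(data)
```

A point that fits no existing model would, in the assessment setting, be exactly the "out of the norm" signal the abstract describes.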
Magretta, Joan
2002-05-01
"Business model" was one of the great buzz-words of the Internet boom. A company didn't need a strategy, a special competence, or even any customers--all it needed was a Web-based business model that promised wild profits in some distant, ill-defined future. Many people--investors, entrepreneurs, and executives alike--fell for the fantasy and got burned. And as the inevitable counterreaction played out, the concept of the business model fell out of fashion nearly as quickly as the .com appendage itself. That's a shame. As Joan Magretta explains, a good business model remains essential to every successful organization, whether it's a new venture or an established player. To help managers apply the concept successfully, she defines what a business model is and how it complements a smart competitive strategy. Business models are, at heart, stories that explain how enterprises work. Like a good story, a robust business model contains precisely delineated characters, plausible motivations, and a plot that turns on an insight about value. It answers certain questions: Who is the customer? How do we make money? What underlying economic logic explains how we can deliver value to customers at an appropriate cost? Every viable organization is built on a sound business model, but a business model isn't a strategy, even though many people use the terms interchangeably. Business models describe, as a system, how the pieces of a business fit together. But they don't factor in one critical dimension of performance: competition. That's the job of strategy. Illustrated with examples from companies like American Express, EuroDisney, WalMart, and Dell Computer, this article clarifies the concepts of business models and strategy, which are fundamental to every company's performance.
Comparing root architectural models
Schnepf, Andrea; Javaux, Mathieu; Vanderborght, Jan
2017-04-01
Plant roots play an important role in several soil processes (Gregory 2006). Root architecture development determines the sites in soil where roots provide input of carbon and energy and take up water and solutes. However, root architecture is difficult to determine experimentally when grown in opaque soil. Thus, root architectural models have been widely used and further developed into functional-structural models that are able to simulate the fate of water and solutes in the soil-root system (Dunbabin et al. 2013). Still, a systematic comparison of the different root architectural models is missing. In this work, we focus on discrete root architecture models where roots are described by connected line segments. These models differ (a) in their model concepts, such as describing the distance between branches on the basis of a prescribed distance (inter-nodal distance) or of a prescribed time interval. Furthermore, these models differ (b) in the implementation of the same concept, such as the time step size, the spatial discretization along the root axes, or the way stochasticity of parameters such as root growth direction, growth rate, branch spacing and branching angles is treated. Based on the example of two such different root models, the root growth module of R-SWMS and RootBox, we show the impact of these differences on the simulated root architecture and on aggregated information computed from these detailed simulation results, taking into account the stochastic nature of those models. References: Dunbabin, V.M., Postma, J.A., Schnepf, A., Pagès, L., Javaux, M., Wu, L., Leitner, D., Chen, Y.L., Rengel, Z., Diggle, A.J. (2013) Modelling root-soil interactions using three-dimensional models of root growth, architecture and function. Plant and Soil, 372 (1-2), pp. 93-124. Gregory (2006) Roots, rhizosphere and soil: the route to a better understanding of soil science? European Journal of Soil Science 57: 2-12.
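A minimal example of the discrete, connected-line-segment representation that these models share, with the stochastic growth direction and prescribed inter-nodal branching distance mentioned above, might look like this. All parameters are illustrative; neither R-SWMS nor RootBox works exactly this way.

```python
import math, random

random.seed(3)

def grow_root(steps=40, seg_len=0.5, internodal=3.0, branch_angle=60.0):
    """Toy discrete root architecture: each axis grows one segment per
    time step; every `internodal` units of axis length a lateral is
    initiated at roughly `branch_angle` degrees, and every heading gets
    a small random deviation -- the kind of stochastic parameter the
    compared models implement differently."""
    segments = []                       # (x0, y0, x1, y1) line segments
    tips = [(0.0, 0.0, -90.0, 0.0)]     # x, y, heading (deg), length since branch
    for _ in range(steps):
        new_tips = []
        for (x, y, ang, since) in tips:
            a = ang + random.gauss(0, 10.0)          # stochastic direction
            x1 = x + seg_len * math.cos(math.radians(a))
            y1 = y + seg_len * math.sin(math.radians(a))
            segments.append((x, y, x1, y1))
            since += seg_len
            if since >= internodal and len(tips) < 8:
                side = random.choice([-1, 1])
                new_tips.append((x1, y1, a + side * branch_angle, 0.0))
                since = 0.0
            new_tips.append((x1, y1, a, since))
        tips = new_tips
    return segments

segments = grow_root()
```

Swapping the `since >= internodal` test for a prescribed time interval reproduces, in miniature, the conceptual difference (a) discussed in the abstract.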
Biosphere Process Model Report
Energy Technology Data Exchange (ETDEWEB)
J. Schmitt
2000-05-25
To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor.
Biosphere Process Model Report
International Nuclear Information System (INIS)
Schmitt, J.
2000-01-01
To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor.
International Nuclear Information System (INIS)
King, S.F.
1989-01-01
Recent work on technicolor theories with small β-functions has shown that the flavour-changing neutral current problem which besets any realistic extended technicolor model might be solved by Holdom's original suggestion of raising the extended technicolor scales. In this paper we apply these field-theoretic ideas to the problem of constructing a realistic model of the quark and lepton mass spectrum. We discuss two closely related models: (1) an extended technicolor model based on the gauge group SO(10)_ETC x SO(10)_GUT; (2) a composite/elementary extended technicolor model based on the gauge group SO(10)_MC x SO(10)_ETC x SU(5)_GUT. Model (1) is relatively simple, and contains three families of quarks and leptons plus an SO(7)_TC family of technifermions. The technicolor sector corresponds to one of the examples of walking technicolor discussed by Appelquist et al. The model is fully discussed, with particular emphasis on the resulting quark and lepton mass spectrum. Charged lepton masses are adequately described, but the quark masses are degenerate in pairs with zero mixing angles. Model (2) shares the desirable low-energy spectrum of Model (1) but in addition provides a mechanism for enhancing the mass of u-type quarks relative to d-type quarks, based on non-perturbative compositeness corrections. We discuss these compositeness corrections, as far as a perturbative treatment allows, and develop techniques for calculating quark masses and mixing angles. We apply these techniques to the first two families of quarks, and are encouraged to find that we can reproduce the observed features of u-d mass inversion for the first family, and Cabibbo mixing. Model (2) leads to the prediction of D^0-anti-D^0 mixing, K_L → e^± μ^∓, and K^+ → π^+ e^- μ^+, all at rates close to current experimental limits. The model also predicts three families and a top quark mass m_t ≅ 50 GeV. (orig.)
Existing Model Metrics and Relations to Model Quality
Mohagheghi, Parastoo; Dehlen, Vegard
2009-01-01
This paper presents quality goals for models and provides a state-of-the-art analysis regarding model metrics. While model-based software development often requires assessing the quality of models at different abstraction and precision levels and developed for multiple purposes, existing work on model metrics does not reflect this need. Model size metrics are descriptive and may be used for comparing models, but their relation to model quality is not well-defined. Code metrics are proposed to be ...
Multiple Model Approaches to Modelling and Control,
DEFF Research Database (Denmark)
appeal in building systems which operate robustly over a wide range of operating conditions by decomposing them into a number of simpler linear modelling or control problems, even for nonlinear modelling or control problems. This appeal has been a factor in the development of increasingly popular `local...' ... to problems in the process industries, biomedical applications and autonomous systems. The successful application of the ideas to demanding problems is already encouraging, but creative development of the basic framework is needed to better allow the integration of human knowledge with automated learning... The underlying question is `How should we partition the system - what is `local'?'. This book presents alternative ways of bringing submodels together, which lead to varying levels of performance and insight. Some are further developed for autonomous learning of parameters from data, while others have focused...
Spiral model pilot project information model
1991-01-01
The objective was an evaluation of the Spiral Model (SM) development approach to allow NASA Marshall to develop an experience base with that software management methodology. A discussion is presented of the Information Model (IM) that was used as part of the SM methodology. A key concept of the SM is the establishment of an IM to be used by management to track the progress of a project. The IM is the set of metrics that is to be measured and reported throughout the life of the project. These metrics measure both the product and the process to ensure the quality of the final delivery item and to ensure the project met programmatic guidelines. The beauty of the SM, along with the IM, is the ability to measure not only the correctness of the specification and implementation of the requirements but also to obtain a measure of customer satisfaction.
Climatology of salt transitions and implications for stone weathering
International Nuclear Information System (INIS)
Grossi, C.M.; Brimblecombe, P.; Menendez, B.; Benavente, D.; Harris, I.; Deque, M.
2011-01-01
This work introduces the notion of salt climatology. It shows how climate affects salt thermodynamics and the potential to relate long-term salt damage to climate types. It mainly focuses on specific sites in Western Europe, including some cities in France and Peninsular Spain. Salt damage was parameterised using the number of dissolution-crystallisation events for unhydrated (sodium chloride) and hydrated (sodium sulphate) systems. These phase transitions were calculated using daily temperature and relative humidity from observational meteorological data and from Climate Change model output (HadCM3 and ARPEGE). Comparing the number of transitions with seasonal meteorological data allowed us to develop techniques to estimate the frequency of salt transitions from the local climatology. Results show that it is possible to associate the Koeppen-Geiger climate types with potential salt weathering. Temperate fully humid climates seem to offer the highest potential for salt damage, with a possibly higher number of transitions in summer. Climates with dry summers tend to show a lower frequency of transitions in summer. The analysis of temperature, precipitation and relative humidity output from Climate Change models suggests changes in the Koeppen-Geiger climate types and in the patterns of salt damage. For instance, West European areas with a fully humid climate may shift towards a more Mediterranean-like or dry climate, and consequently the seasonality of the different salt transitions may change. The accuracy and reliability of the projections might be improved by simultaneously running multiple climate models (ensembles). - Research highlights: → We introduce the notion of salt climatology for heritage conservation. → Climate affects salt thermodynamics on building materials. → We associate Koeppen-Geiger climate types with potential salt weathering. → We offer future projections of salt damage in Western Europe due to climate change. → Humid climate areas may change to
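The dissolution-crystallisation counting described above can be sketched in a few lines. This is illustrative only, not the authors' code: the 75.3% equilibrium relative humidity for halite is a standard literature value (nearly temperature-independent), and the hydrated sodium sulphate system would additionally require the daily temperature.

```python
# Illustrative sketch, not the authors' code: count NaCl dissolution-
# crystallisation events by checking when daily relative humidity crosses
# the equilibrium RH of halite (~75.3%, a standard literature value).

RH_EQ_NACL = 75.3  # percent

def count_transitions(daily_rh, rh_eq=RH_EQ_NACL):
    """Each crossing of the equilibrium RH is one event:
    upward = dissolution, downward = crystallisation."""
    events = 0
    for prev, curr in zip(daily_rh, daily_rh[1:]):
        if (prev < rh_eq) != (curr < rh_eq):
            events += 1
    return events

rh_week = [70, 78, 74, 76, 80, 73]  # hypothetical daily RH readings
print(count_transitions(rh_week))   # four crossings in this series
```

Run over a year of daily data, and summed per season, such counts give exactly the kind of transition-frequency statistic the abstract compares across Koeppen-Geiger climate types.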
Climatology of salt transitions and implications for stone weathering
Energy Technology Data Exchange (ETDEWEB)
Grossi, C.M., E-mail: c.grossi-sampedro@uea.ac.uk [School of Environmental Sciences, University of East Anglia, Norwich NR4 7TJ (United Kingdom); Brimblecombe, P. [School of Environmental Sciences, University of East Anglia, Norwich NR4 7TJ (United Kingdom); Menendez, B. [Geosciences et Environnement Cergy, Universite de Cergy-Pontoise 95031 Cergy-Pontoise cedex (France); Benavente, D. [Lab. Petrologia Aplicada, Unidad Asociada UA-CSIC, Dpto. Ciencias de la Tierra y del Medio Ambiente, Universidad de Alicante, Alicante 03080 (Spain); Harris, I. [Climatic Research Unit, School of Environmental Sciences, University of East Anglia, Norwich NR4 7TJ (United Kingdom); Deque, M. [Meteo-France/CNRM, CNRS/GAME, 42 Avenue Coriolis, F-31057 Toulouse, Cedex 01 (France)
2011-06-01
This work introduces the notion of salt climatology. It shows how climate affects salt thermodynamics and the potential to relate long-term salt damage to climate types. It mainly focuses on specific sites in Western Europe, including some cities in France and Peninsular Spain. Salt damage was parameterised using the number of dissolution-crystallisation events for unhydrated (sodium chloride) and hydrated (sodium sulphate) systems. These phase transitions were calculated using daily temperature and relative humidity from observational meteorological data and from Climate Change model output (HadCM3 and ARPEGE). Comparing the number of transitions with seasonal meteorological data allowed us to develop techniques to estimate the frequency of salt transitions from the local climatology. Results show that it is possible to associate the Koeppen-Geiger climate types with potential salt weathering. Temperate fully humid climates seem to offer the highest potential for salt damage, with a possibly higher number of transitions in summer. Climates with dry summers tend to show a lower frequency of transitions in summer. The analysis of temperature, precipitation and relative humidity output from Climate Change models suggests changes in the Koeppen-Geiger climate types and in the patterns of salt damage. For instance, West European areas with a fully humid climate may shift towards a more Mediterranean-like or dry climate, and consequently the seasonality of the different salt transitions may change. The accuracy and reliability of the projections might be improved by simultaneously running multiple climate models (ensembles). - Research highlights: → We introduce the notion of salt climatology for heritage conservation. → Climate affects salt thermodynamics on building materials. → We associate Koeppen-Geiger climate types with potential salt weathering. → We offer future projections of salt damage in Western Europe due to climate change. → Humid
Parks, Melissa
2014-01-01
Model-eliciting activities (MEAs) are not new to those in engineering or mathematics, but they were new to Melissa Parks. Model-eliciting activities are simulated real-world problems that integrate engineering, mathematical, and scientific thinking as students find solutions for specific scenarios. During this process, students generate solutions…
New Mexico Univ., Albuquerque. American Indian Law Center.
The Model Children's Code was developed to provide a legally correct model code that American Indian tribes can use to enact children's codes that fulfill their legal, cultural and economic needs. Code sections cover the court system, jurisdiction, juvenile offender procedures, minor-in-need-of-care, and termination. Almost every Code section is…
Archaeological predictive model set.
2015-03-01
This report is the documentation for Task 7 of the Statewide Archaeological Predictive Model Set. The goal of this project is to develop a set of statewide predictive models to assist the planning of transportation projects. PennDOT is developing t...
DEFF Research Database (Denmark)
Olesen, H. R.
1998-01-01
Proceedings of the Twenty-Second NATO/CCMS International Technical Meeting on Air Pollution Modeling and Its Application, held June 6-10, 1997, in Clermont-Ferrand, France.
Modeling volcanic ash dispersal
CERN. Geneva
2010-01-01
The assessment of volcanic fallout hazard is an important scientific, economic, and political issue, especially in densely populated areas. From a scientific point of view, considerable progress has been made during the last two decades through the use of increasingly powerful computational models and capabilities. Nowadays, models are used to quantify hazard...
(Non) linear regression modelling
Cizek, P.; Gentle, J.E.; Hardle, W.K.; Mori, Y.
2012-01-01
We will study causal relationships of a known form between random variables. Given a model, we distinguish one or more dependent (endogenous) variables Y = (Y1,…,Yl), l ∈ N, which are explained by a model, and independent (exogenous, explanatory) variables X = (X1,…,Xp),p ∈ N, which explain or
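The linear special case of the setup just described, one dependent variable Y explained by an exogenous X, can be illustrated with an ordinary least-squares fit. The data below are synthetic, not from the text:

```python
# Minimal linear special case: Y explained by one exogenous X, fitted by
# ordinary least squares on synthetic data (true slope 2.0, intercept 1.0).
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.1, size=x.size)

A = np.column_stack([x, np.ones_like(x)])        # design matrix [X, 1]
slope, intercept = np.linalg.lstsq(A, y, rcond=None)[0]
print(round(float(slope), 2), round(float(intercept), 2))
```

Nonlinear regression keeps the same structure but replaces the linear map `A @ b` with a known nonlinear function of X and the parameters.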
DEFF Research Database (Denmark)
Galle, Per
2000-01-01
In preparation of an analysis of product modelling in terms of communication, this report presents a brief analysis of symbols; that is, the entities by means of which communication takes place. Symbols are defined in such a way as to admit artefacts and models (the latter including linguistic...
Generalized instrumental variable models
Andrew Chesher; Adam Rosen
2014-01-01
This paper develops characterizations of identified sets of structures and structural features for complete and incomplete models involving continuous or discrete variables. Multiple values of unobserved variables can be associated with particular combinations of observed variables. This can arise when there are multiple sources of heterogeneity, censored or discrete endogenous variables, or inequality restrictions on functions of observed and unobserved variables. The models g...
International Nuclear Information System (INIS)
Nishimura, Hiroshi.
1993-05-01
Object-Oriented Programming has been used extensively to model the LBL Advanced Light Source 1.5 GeV electron storage ring. This paper reports on the present status of the class library construction, with emphasis on dynamic modeling
2014-08-01
CENTURY is a computer model of plant-soil ecosystems that simulates the dynamics of grasslands, forests, crops, and savannas with a focus on nutri... Kamnalrut, and J. L. Kinyamario. 1993. Observations and modeling of biomass and soil organic matter dynamics for the grassland biome worldwide. Global
Validation of simulation models
DEFF Research Database (Denmark)
Rehman, Muniza; Pedersen, Stig Andur
2012-01-01
In philosophy of science, the interest for computational models and simulations has increased heavily during the past decades. Different positions regarding the validity of models have emerged but the views have not succeeded in capturing the diversity of validation methods. The wide variety...
International Nuclear Information System (INIS)
Liu Baojie
2012-01-01
In recent years, the Unified Process and Agile Modeling have attracted attention for integrating theory with practical manoeuvrability. Based on the practice of the China North Nuclear PWR production and quality MIS project, the paper first introduces the Unified Process and Agile Modeling, then sketches the application of these ideas to the software project, for reference by similar projects. (author)
A Situational Maintenance Model
DEFF Research Database (Denmark)
Luxhoj, James T.; Thorsteinsson, Uffe; Riis, Jens Ove
1997-01-01
An overview of trends in maintenance management, and presentation of a situational model and analytical tools for identification of managerial efforts in maintenance.
Business Model Innovation Leadership
DEFF Research Database (Denmark)
Lindgren, Peter
2012-01-01
When SMEs practice business model (BM) innovation (BMI), strategically leading BMs through the innovation process can be the difference between success and failure for a BM. Business Model Innovation Leadership (BMIL) is, however, extremely complex to carry out, especially for small and medium size...
Verweij, J.F.; Verweij, J.F.; Brombacher, A.C.; Brombacher, A.C.; Lunenborg, M.M.; Lunenborg, M.M.
1994-01-01
There are two approaches to component lifetime modelling. The first one uses a reliability prediction method as described in the (military) handbooks with the appropriate models and parameters. The advantages are: (a) It takes into account all possible failure mechanisms. (b) It is easy to use. The
Entrepreneurship and Role Models
N. Bosma (Niels); S.J.A. Hessels (Jolanda); V. Schutjens (Veronique); M. van Praag (Mirjam); I. Verheul (Ingrid)
2011-01-01
In the media role models are increasingly being acknowledged as an influential factor in explaining the reasons for the choice of occupation and career. Various conceptual studies have proposed links between role models and entrepreneurial intentions. However, empirical research aimed at
Taniguchi, Tadahiro; Sawaragi, Tetsuo
In this paper, a new machine-learning method, called the Dual-Schemata model, is presented. The Dual-Schemata model is a self-organizational machine-learning method for an autonomous robot interacting with an unknown dynamical environment. It is based on Piaget's Schema model, a classical psychological model explaining memory and the cognitive development of human beings. Our Dual-Schemata model is developed as a computational model of Piaget's Schema model, focusing especially on the sensori-motor developmental period. This developmental process is characterized by a pair of mutually interacting dynamics: one formed by assimilation and accommodation, the other by equilibration and differentiation. Through these dynamics, the schema system enables an agent to act well in the real world. The schema differentiation process corresponds to a symbol-formation process occurring within an autonomous agent as it interacts with an unknown, dynamically changing environment. Experimental results obtained from an autonomous facial robot in which our model is embedded are presented; the robot becomes able to chase a ball moving in various ways without any rewards or teaching signals from outside. Moreover, the emergence of concepts about the target movements within the robot is shown and discussed in terms of fuzzy logic on set-subset inclusion relationships.
DEFF Research Database (Denmark)
Hejlesen, Aske K.; Ovesen, Nis
2012-01-01
This paper presents an experimental approach to teaching 3D modelling techniques in an Industrial Design programme. The approach includes the use of tangible free form models as tools for improving the overall learning. The paper is based on lecturer and student experiences obtained through...
Modeling prosody: Different approaches
Carmichael, Lesley M.
2002-11-01
Prosody pervades all aspects of a speech signal, both in terms of raw acoustic outcomes and linguistically meaningful units, from the phoneme to the discourse unit. It is carried in the suprasegmental features of fundamental frequency, loudness, and duration. Several models have been developed to account for the way prosody organizes speech, and they vary widely in terms of their theoretical assumptions, organizational primitives, actual procedures of application to speech, and intended use (e.g., to generate speech from text vs. to model the prosodic phonology of a language). In many cases, these models overtly contradict one another with regard to their fundamental premises or their identification of the perceptible objects of linguistic prosody. These competing models are directly compared. Each model is applied to the same speech samples. This parallel analysis allows for a critical inspection of each model and its efficacy in assessing the suprasegmental behavior of the speech. The analyses illustrate how different approaches are better equipped to account for different aspects of prosody. Viewing the models and their successes from an objective perspective allows for creative possibilities in terms of combining strengths from models which might otherwise be considered fundamentally incompatible.
Structural Equation Model Trees
Brandmaier, Andreas M.; von Oertzen, Timo; McArdle, John J.; Lindenberger, Ulman
2013-01-01
In the behavioral and social sciences, structural equation models (SEMs) have become widely accepted as a modeling tool for the relation between latent and observed variables. SEMs can be seen as a unification of several multivariate analysis techniques. SEM Trees combine the strengths of SEMs and the decision tree paradigm by building tree…
SWMM is a model for urban hydrology. It has a long history and is relied upon by professional engineers in the US and around the world. SWMM provides both gray and green infrastructure modeling capabilities. As such, it is a convenient tool for understanding the tradeoff between ...
Models for Dynamic Applications
DEFF Research Database (Denmark)
Sales-Cruz, Mauricio; Morales Rodriguez, Ricardo; Heitzig, Martina
2011-01-01
This chapter covers aspects of the dynamic modelling and simulation of several complex operations that include a controlled blending tank, a direct methanol fuel cell that incorporates a multiscale model, a fluidised bed reactor, a standard chemical reactor and finally a polymerisation reactor...
Directory of Open Access Journals (Sweden)
Unnati Ahluwalia
2012-12-01
Full Text Available In an attempt to explore the understanding of the protein folding mechanism, various models have been proposed in the literature. Recent advances in experimental and computational techniques have rationalized our understanding of some of the fundamental features of protein folding pathways. The goal of this review is to revisit the various models and outline the essential aspects of the folding reaction.
International Nuclear Information System (INIS)
Kull', L.M.
1987-01-01
Papers dealing with the study of mechanisms of submicrocrack formation and propagation using dislocation representations are analyzed. Cases of brittle and ductile fracture of materials, as well as models of a dislocationless (amorphous) zone at the growing crack tip, are considered. Dislocation models of fracture may be used when studying the processes of deformation and damage accumulation in elements of nuclear facilities
Directory of Open Access Journals (Sweden)
JANUSZ K. GRABARA
2011-01-01
Full Text Available Modelling phenomena in accordance with the structural approach enables one to simplify the observed relations and to present the grounds for classification. An example is a model of organisational structure identifying the logical relations between particular units and presenting the division of authority and work.
International Nuclear Information System (INIS)
Diaz-Cruz, J. Lorenzo
1996-01-01
This article presents a review of the minimal supersymmetric extension of the Standard Model (MSSM), concentrating mainly on the steps needed to derive the lagrangian of the model within the superspace formalism. Some attention is also given to the reduction of parameters that results from incorporating the hypothesis of Grand Unification and low-energy Supergravity; the most salient phenomenological consequences are also discussed
Business model innovation paths
Chesbrough, H.; Di Minin, Alberto; Piccaluga, A.
2013-01-01
This chapter explains the business model concept and explores the reasons why “innovation” and “innovation in services” are no longer exclusively a technological issue. Rather, we highlight that business models are critical components at the centre of business innovation processes. We also attempt
Erpylev, N. P.; Smirnov, M. A.; Bagrov, A. V.
A night sky model is proposed. It includes different components of light pollution, such as solar twilight, moon-scattered light, zodiacal light, the Milky Way, airglow and artificial light pollution. The model is designed for calculating the efficiency of astronomical installations.
Indian Academy of Sciences (India)
In this talk I review studies of hadron properties in bosonized chiral quark models for the quark flavor dynamics. Mesons are constructed from Bethe–Salpeter equations and baryons emerge as chiral solitons. Such models require regularization and I show that the two-fold Pauli–Villars regularization scheme not only fully ...
Kiss, S.; Sarfraz, M.
2004-01-01
Presents a method to connect VRML (Virtual Reality Modeling Language) and Java components in a Web page using EAI (External Authoring Interface), which makes it possible to interactively generate and edit VRML meshes. The meshes used are based on regular grids, to provide an interaction and modeling
Kiss, S.; Banissi, E.; Khosrowshahi, F.; Sarfraz, M.; Ursyn, A.
2001-01-01
Presents a method to connect VRML (Virtual Reality Modeling Language) and Java components in a Web page using EAI (External Authoring Interface), which makes it possible to interactively generate and edit VRML meshes. The meshes used are based on regular grids, to provide an interaction and modeling
An efficiency correction model
Francke, M.K.; de Vos, A.F.
2009-01-01
We analyze a dataset containing costs and outputs of 67 American local exchange carriers over a period of 11 years. These data have been used to judge the efficiency of BT and KPN using static stochastic frontier models. We show that these models are dynamically misspecified. As an alternative we
Slats, P.A.; Bhola, B.; Evers, J.J.M.; Dijkhuizen, G.
1995-01-01
Logistic chain modelling is very important in improving the overall performance of the total logistic chain. Logistic models provide support for a large range of applications, such as analysing bottlenecks, improving customer service, configuring new logistic chains and adapting existing chains to
The generalized circular model
Webers, H.M.
1995-01-01
In this paper we present a generalization of the circular model. In this model there are two concentric circular markets, which enables us to study two types of markets simultaneously. There are switching costs involved for moving from one circle to the other circle, which can also be thought of as
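As a purely hypothetical illustration of the geometry described above (not code from the paper), travel cost on a unit-circumference circle plus a fixed switching cost for moving to the other circle might be computed as:

```python
# Hypothetical illustration (not from the paper): positions live on one of two
# concentric circles, each of circumference 1; travel cost is arc distance
# times a transport rate t, plus a fixed switching cost s to change circles.

def travel_cost(x, cx, same_circle, s=0.2, t=1.0):
    d = abs(x - cx)
    arc = min(d, 1 - d)                  # shorter way around the circle
    return t * arc + (0.0 if same_circle else s)

same = travel_cost(0.25, 0.75, True)     # arc distance 0.5
other = travel_cost(0.25, 0.75, False)   # 0.5 plus switching cost 0.2
print(round(same, 3), round(other, 3))
```

The switching cost `s` is what lets the two concentric markets be studied simultaneously yet remain partially segmented.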
DEFF Research Database (Denmark)
Yuksel, Ender; Nielson, Hanne Riis; Nielson, Flemming
In this document, we consider a specific Chinese Smart Grid implementation and address the verification problem for certain quantitative properties, including performance and battery consumption. We employ a stochastic model checking approach and present our modelling and analysis study using...
International Nuclear Information System (INIS)
1997-01-01
This report documents a numerical simulation model of the natural gas market in Germany, France, the Netherlands and Belgium. It is a part of a project called "Internationalization and structural change in the gas market" aiming to enhance the understanding of the factors behind the current and upcoming changes in the European gas market, especially the downstream part of the gas chain. The model takes European border prices of gas as given, adds transmission and distribution cost and profit margins as well as gas taxes to calculate gas prices. The model includes demand sub-models for households, chemical industry, other industry, the commercial sector and electricity generation. Demand responses to price changes are assumed to take time, and the long run effects are significantly larger than the short run effects. For the household sector and the electricity sector, the dynamics are modeled by distinguishing between energy use in the old and new capital stock. In addition to prices and the activity level (GDP), the model includes the extension of the gas network as a potentially important variable in explaining the development of gas demand. The properties of numerical simulation models are often described by dynamic multipliers, which describe the behaviour of important variables when key explanatory variables are changed. At the end, the report shows the results of a model experiment where the costs in transmission and distribution were reduced. 6 refs., 9 figs., 1 tab
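The report's point that long-run demand responses exceed short-run ones is commonly captured by a partial-adjustment equation. The sketch below uses invented parameters, not the model's estimates:

```python
# Hypothetical partial-adjustment demand equation (invented parameters, not
# the report's estimates): the short-run price elasticity is eps_short, and
# the long-run elasticity eps_short / (1 - theta) is larger in magnitude.

def simulate_demand(prices, eps_short=-0.2, theta=0.8, d0=100.0):
    d, out = d0, []
    for p in prices:
        # D_t = D0 * (P_t)^eps_short * (D_{t-1}/D0)^theta
        d = d0 * (p ** eps_short) * ((d / d0) ** theta)
        out.append(d)
    return out

# A permanent 10% price rise: demand falls a little at first, then keeps
# sliding toward its long-run level (long-run elasticity -1.0 here).
path = simulate_demand([1.1] * 30)
long_run = 100.0 * 1.1 ** (-0.2 / (1 - 0.8))
print(round(path[0], 2), round(long_run, 2))
```

The path traced out by such a simulation is exactly a dynamic multiplier in the report's sense: the response of demand, period by period, to a change in one explanatory variable.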
Modelling Hyperboloid Sound Scattering
DEFF Research Database (Denmark)
Burry, Jane; Davis, Daniel; Peters, Brady
2011-01-01
The Responsive Acoustic Surfaces workshop project described here sought new understandings about the interaction between geometry and sound in the arena of sound scattering. This paper reports on the challenges associated with modelling, simulating, fabricating and measuring this phenomenon using both physical and digital models at three distinct scales. The results suggest hyperboloid geometry, while difficult to fabricate, facilitates sound scattering.
Entrepreneurship and role models
Bosma, N.; Hessels, J.; Schutjens, V.; van Praag, M.; Verheul, I.
2012-01-01
In the media role models are increasingly being acknowledged as an influential factor in explaining the reasons for the choice of occupation and career. Various conceptual studies have proposed links between role models and entrepreneurial intentions. However, empirical research aimed at
Bun, M.J.G.; Sarafidis, V.
2013-01-01
This Chapter reviews the recent literature on dynamic panel data models with a short time span and a large cross-section. Throughout the discussion we consider linear models with additional endogenous covariates. First we give a broad overview of available inference methods, placing emphasis on GMM.
Realistic split fermion models
Indian Academy of Sciences (India)
wall fermions, namely, a bulk scalar field with non-trivial VEV that couples to the fermions. In addition, the … yields the flavor hierarchy. We consider a model with two scalar fields that couple to the fermions [5]. … For the model to correctly reproduce the quark flavor parameters, the following relation should hold [2]: …
Stochastic modelling of turbulence
DEFF Research Database (Denmark)
Sørensen, Emil Hedevang Lohse
previously been shown to be closely connected to the energy dissipation. The incorporation of the small scale dynamics into the spatial model opens the door to a fully fledged stochastic model of turbulence. Concerning the interaction of wind and wind turbine, a new method is proposed to extract wind turbine...
Probability matrix decomposition models
Maris, E.; DeBoeck, P.; Mechelen, I. van
1996-01-01
In this paper, we consider a class of models for two-way matrices with binary entries of 0 and 1. First, we consider Boolean matrix decomposition, conceptualize it as a latent response model (LRM) and, by making use of this conceptualization, generalize it to a larger class of matrix decomposition
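The Boolean matrix decomposition that the abstract conceptualizes as a latent response model rests on the Boolean matrix product, which is easy to state concretely:

```python
# Sketch of the Boolean matrix product underlying Boolean matrix
# decomposition: X = A ∘ B with (A∘B)[i][j] = OR over k of (A[i][k] AND B[k][j]).

def boolean_product(a, b):
    rows, inner, cols = len(a), len(b), len(b[0])
    return [[int(any(a[i][k] and b[k][j] for k in range(inner)))
             for j in range(cols)] for i in range(rows)]

# Two latent components (columns of A / rows of B) reconstruct the 0/1 matrix X
A = [[1, 0],
     [0, 1],
     [1, 1]]
B = [[1, 1, 0],
     [0, 1, 1]]
X = boolean_product(A, B)
print(X)  # [[1, 1, 0], [0, 1, 1], [1, 1, 1]]
```

Decomposition is the inverse problem: given the observed binary matrix X, find low-rank Boolean factors A and B; the latent response view treats the entries of A and B as probabilistic latent responses.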
DEFF Research Database (Denmark)
Beauquier, Maxime; Schürmann, Carsten
2011-01-01
In this paper, we present a model based on relations for bigraphical reactive systems [Milner09]. Its defining characteristic is that validity and reaction relations are captured as traces in a multi-set rewriting system. The relational model is derived from Milner's graphical definition...
Perelson, Alan; Conway, Jessica; Cao, Youfang
A large effort is being made to find a means to cure HIV infection. I will present a dynamical model of post-treatment control (PTC), or "functional cure", of HIV infection. Some patients treated with suppressive antiviral therapy have been taken off of therapy and then spontaneously control HIV infection such that the amount of virus in the circulation is maintained undetectable by clinical assays for years. The model explains PTC occurring in some patients by having a parameter regime in which the model exhibits bistability, with both a low and a high steady-state viral load being stable. The model makes a number of predictions about how to attain the low PTC steady state. Bistability in this model depends upon the immune response becoming exhausted when overstimulated. I will also present a generalization of the model in which immunotherapy can be used to reverse immune exhaustion, and compare model predictions with experiments in SIV-infected macaques given immunotherapy and then taken off of antiretroviral therapy. Lastly, if time permits, I will discuss one of the hurdles to true HIV eradication, latently infected cells, and present clinical trial data and a new model addressing pharmacological means of flushing out the latent reservoir. Supported by NIH Grants AI028433 and OD011095.
Directory of Open Access Journals (Sweden)
R.I. Parovik
2012-06-01
Full Text Available A model of the radioactive decay of radon (222Rn) in a sample is presented. The model assumes that the decay probability of radon and its half-life depend on the fractal properties of the geological environment. The dependences of the decay parameters on the fractal dimension of the medium are obtained.
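The abstract gives no formulas, so the following is only one plausible illustration of how a medium-dependent exponent can modify classical decay: a stretched exponential that reduces to the classical law when the exponent equals 1. The specific form and parameters are assumptions, not the paper's model.

```python
# Illustrative only (the abstract states no equations): a stretched
# exponential N(t) = N0 * exp(-(lambda*t)**alpha); alpha = 1 recovers
# classical decay, alpha < 1 mimics a 'fractal' slowdown of the medium.
import math

def remaining(t, n0=1.0, lam=math.log(2) / 3.82, alpha=1.0):
    """Fraction of atoms remaining at time t (days); lam is set from
    radon-222's 3.82-day half-life."""
    return n0 * math.exp(-((lam * t) ** alpha))

classical = remaining(3.82)           # one half-life, alpha = 1
slowed = remaining(3.82, alpha=0.7)   # hypothetical medium effect
print(round(classical, 3))  # 0.5
```

With alpha = 1 exactly half the sample remains after one half-life; lowering alpha leaves more, which is the qualitative behaviour a fractal-medium correction would produce.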
Energy Technology Data Exchange (ETDEWEB)
Kapetanakis, D. (Technische Univ. Muenchen, Garching (Germany). Physik Dept.); Mondragon, M. (Technische Univ. Muenchen, Garching (Germany). Physik Dept.); Zoupanos, G. (National Technical Univ., Athens (Greece). Physics Dept.)
1993-09-01
We present phenomenologically viable SU(5) unified models which are finite to all orders before the spontaneous symmetry breaking. In the case of two models with three families the top quark mass is predicted to be 178.8 GeV. (orig.)
International Nuclear Information System (INIS)
Kapetanakis, D.; Mondragon, M.; Zoupanos, G.
1993-01-01
We present phenomenologically viable SU(5) unified models which are finite to all orders before the spontaneous symmetry breaking. In the case of two models with three families the top quark mass is predicted to be 178.8 GeV. (orig.)
Response model parameter linking
Barrett, M.L.D.
2015-01-01
With a few exceptions, the problem of linking item response model parameters from different item calibrations has been conceptualized as an instance of the problem of equating observed scores on different test forms. This thesis argues, however, that the use of item response models does not require
Computational human body models
Wismans, J.S.H.M.; Happee, R.; Dommelen, J.A.W. van
2005-01-01
Computational human body models are widely used for automotive crash safety research and design and as such have significantly contributed to a reduction of traffic injuries and fatalities. Currently crash simulations are mainly performed using models based on crash dummies. However, crash dummies
Energy Technology Data Exchange (ETDEWEB)
Jacob J. Jacobson; Gretchen Matthern
2007-04-01
System Dynamics is a computer-aided approach to evaluating the interrelationships of different components and activities within complex systems. Recently, System Dynamics models have been developed in areas such as policy design, biological and medical modeling, energy and environmental analysis, and various other areas in the natural and social sciences. The real power of System Dynamics modeling is gaining insight into total system behavior as time and system parameters are adjusted and the effects are visualized in real time. System Dynamics models allow decision makers and stakeholders to explore the long-term behavior and performance of complex systems, especially in the context of dynamic processes and changing scenarios, without having to wait decades to obtain field data or risk failure if a poor management or design approach is used. The Idaho National Laboratory has recently been developing a System Dynamics model of the US nuclear fuel cycle. The model is intended to be used to identify and understand interactions throughout the entire nuclear fuel cycle and to suggest sustainable development strategies. This paper describes the basic framework of the current model and presents examples of useful insights gained from the model thus far with respect to sustainable development of nuclear power.
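The stock-and-flow bookkeeping at the heart of any System Dynamics model can be sketched generically. The fuel-cycle numbers below are invented for illustration, not taken from the INL model:

```python
# Generic stock-and-flow sketch in the System Dynamics style (illustrative
# numbers, not the INL fuel-cycle model): a spent-fuel inventory accumulates
# from reactor discharges and is drawn down by a fixed reprocessing capacity.

def simulate(years=50, discharge=2000.0, capacity=1500.0):
    stock, history = 0.0, []
    for _ in range(years):
        outflow = min(capacity, stock + discharge)  # cannot reprocess more
        stock = stock + discharge - outflow         # tonnes of heavy metal
        history.append(stock)
    return history

h = simulate()
print(round(h[-1], 1))  # inventory grows by 500 t/yr when discharge > capacity
```

Adjusting `discharge` or `capacity` and re-running shows the "explore long-term behavior as parameters are adjusted" workflow the abstract describes, on a deliberately tiny scale.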
International Nuclear Information System (INIS)
Cuypers, F.
1997-05-01
These lecture notes are intended as a pedagogical introduction to several popular extensions of the standard model of strong and electroweak interactions. The topics include the Higgs sector, the left-right symmetric model, grand unification and supersymmetry. Phenomenological consequences and search procedures are emphasized. (author) figs., tabs., 18 refs
Morgante, Enrico
2018-01-01
I review the construction of Simplified Models for Dark Matter searches. After discussing the philosophy and some simple examples, I turn the attention to the aspect of the theoretical consistency and to the implications of the necessary extensions of these models.
International Nuclear Information System (INIS)
Kimpland, R.H.
1996-01-01
A normalized form of the point kinetics equations, a prompt jump approximation, and the Nordheim-Fuchs model are used to model nuclear systems. Reactivity feedback mechanisms considered include volumetric expansion, thermal neutron temperature effect, Doppler effect and void formation. A sample problem of an excursion occurring in a plutonium solution accidentally formed in a glovebox is presented
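A minimal sketch of normalized point kinetics with one delayed-neutron group and a simple negative temperature feedback is below. All parameter values are illustrative, not those of the glovebox excursion analysed in the paper, and the single feedback coefficient stands in for the several mechanisms (expansion, Doppler, voids) listed above.

```python
# Hedged sketch: one-delayed-group point kinetics with a lumped negative
# temperature feedback, integrated with forward Euler. Parameter values
# are illustrative only.

def excursion(rho0=0.003, beta=0.0065, lam=0.08, Lam=1e-4,
              alpha=-1e-3, heat=1.0, dt=1e-4, steps=80000):
    """Peak of normalized power n(t) after a step reactivity insertion
    rho0 (kept below prompt critical: rho0 < beta)."""
    n = 1.0
    c = beta / (lam * Lam)           # precursor concentration at equilibrium
    T = 0.0                          # temperature rise above initial
    peak = n
    for _ in range(steps):
        rho = rho0 + alpha * T       # feedback gradually cancels the insertion
        dn = ((rho - beta) / Lam) * n + lam * c
        dc = (beta / Lam) * n - lam * c
        n, c, T = n + dt * dn, c + dt * dc, T + dt * heat * n
        peak = max(peak, n)
    return peak

print(excursion() > 1.0)  # power rises above its initial level
```

The Nordheim-Fuchs model mentioned above handles the opposite regime, a super-prompt-critical burst, analytically rather than by integration.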
Building information modelling (BIM)
CSIR Research Space (South Africa)
Conradie, Dirk CU
2009-02-01
Full Text Available The concept of a Building Information Model (BIM), also known as a Building Product Model (BPM), is nothing new. A short article on BIM can never cover the entire field, because it is a particularly complex field that is only recently beginning to receive...
Energy Technology Data Exchange (ETDEWEB)
Barchet, W.R. (Pacific Northwest Lab., Richland, WA (United States)); Dennis, R.L. (Environmental Protection Agency, Research Triangle Park, NC (United States)); Seilkop, S.K. (Analytical Sciences, Inc., Durham, NC (United States)); Banic, C.M.; Davies, D.; Hoff, R.M.; Macdonald, A.M.; Mickle, R.E.; Padro, J.; Puckett, K. (Atmospheric Environment Service, Downsview, ON (Canada)); Byun, D.; McHenry, J.N.
1991-12-01
The binational Eulerian Model Evaluation Field Study (EMEFS) consisted of several coordinated data gathering and model evaluation activities. In the EMEFS, data were collected by five air and precipitation monitoring networks between June 1988 and June 1990. Model evaluation is continuing. This interim report summarizes the progress made in the evaluation of the Regional Acid Deposition Model (RADM) and the Acid Deposition and Oxidant Model (ADOM) through the December 1990 completion of a State of Science and Technology report on model evaluation for the National Acid Precipitation Assessment Program (NAPAP). Because various assessment applications of RADM had to be evaluated for NAPAP, the report emphasizes the RADM component of the evaluation. A protocol for the evaluation was developed by the model evaluation team and defined the observed and predicted values to be used and the methods by which the observed and predicted values were to be compared. Scatter plots and time series of predicted and observed values were used to present the comparisons graphically. Difference statistics and correlations were used to quantify model performance. 64 refs., 34 figs., 6 tabs.
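The evaluation protocol described above quantifies model performance with difference statistics and correlations between observed and predicted values. A minimal sketch with synthetic data (the function and variable names are assumptions, not taken from the report):

```python
# Difference statistics and correlation of the kind used to compare observed
# and predicted deposition values (synthetic data for illustration).
import math

def eval_stats(obs, pred):
    n = len(obs)
    bias = sum(p - o for o, p in zip(obs, pred)) / n            # mean difference
    rmse = math.sqrt(sum((p - o) ** 2 for o, p in zip(obs, pred)) / n)
    mo, mp = sum(obs) / n, sum(pred) / n
    cov = sum((o - mo) * (p - mp) for o, p in zip(obs, pred))
    so = math.sqrt(sum((o - mo) ** 2 for o in obs))
    sp = math.sqrt(sum((p - mp) ** 2 for p in pred))
    corr = cov / (so * sp)                                      # Pearson r
    return bias, rmse, corr

obs = [1.0, 2.0, 3.0, 4.0]    # observed values at monitoring sites
pred = [1.1, 1.9, 3.2, 4.3]   # model-predicted values at the same sites
bias, rmse, corr = eval_stats(obs, pred)
print(f"bias={bias:.3f}, rmse={rmse:.3f}, r={corr:.3f}")
```

Scatter plots and time series of the same `obs`/`pred` pairs would supply the graphical half of the comparison described in the protocol.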
International Nuclear Information System (INIS)
Barchet, W.R.; Dennis, R.L.; Seilkop, S.K.; Banic, C.M.; Davies, D.; Hoff, R.M.; Macdonald, A.M.; Mickle, R.E.; Padro, J.; Puckett, K.; Byun, D.; McHenry, J.N.; Karamchandani, P.; Venkatram, A.; Fung, C.; Misra, P.K.; Hansen, D.A.; Chang, J.S.
1991-12-01
The binational Eulerian Model Evaluation Field Study (EMEFS) consisted of several coordinated data gathering and model evaluation activities. In the EMEFS, data were collected by five air and precipitation monitoring networks between June 1988 and June 1990. Model evaluation is continuing. This interim report summarizes the progress made in the evaluation of the Regional Acid Deposition Model (RADM) and the Acid Deposition and Oxidant Model (ADOM) through the December 1990 completion of a State of Science and Technology report on model evaluation for the National Acid Precipitation Assessment Program (NAPAP). Because various assessment applications of RADM had to be evaluated for NAPAP, the report emphasizes the RADM component of the evaluation. A protocol for the evaluation was developed by the model evaluation team and defined the observed and predicted values to be used and the methods by which the observed and predicted values were to be compared. Scatter plots and time series of predicted and observed values were used to present the comparisons graphically. Difference statistics and correlations were used to quantify model performance. 64 refs., 34 figs., 6 tabs
Energy Technology Data Exchange (ETDEWEB)
NONE
1997-12-31
This report documents a numerical simulation model of the natural gas market in Germany, France, the Netherlands and Belgium. It is part of a project, "Internationalization and structural change in the gas market", which aims to enhance the understanding of the factors behind the current and upcoming changes in the European gas market, especially the downstream part of the gas chain. The model takes European border prices of gas as given and adds transmission and distribution costs, profit margins, and gas taxes to calculate end-user gas prices. The model includes demand sub-models for households, the chemical industry, other industry, the commercial sector and electricity generation. Demand responses to price changes are assumed to take time, and the long-run effects are significantly larger than the short-run effects. For the household sector and the electricity sector, the dynamics are modeled by distinguishing between energy use in the old and the new capital stock. In addition to prices and the activity level (GDP), the model includes the extension of the gas network as a potentially important variable in explaining the development of gas demand. The properties of numerical simulation models are often described by dynamic multipliers, which describe the behaviour of important variables when key explanatory variables are changed. Finally, the report presents the results of a model experiment in which transmission and distribution costs are reduced. 6 refs., 9 figs., 1 tab.
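The price build-up and the lagged demand response described above can be sketched in a few lines. All cost figures, elasticities and the adjustment speed below are illustrative assumptions, not values from the report:

```python
# Sketch of the report's price build-up and partial-adjustment demand dynamics
# (illustrative numbers; elasticities and adjustment speed are assumptions).

def end_user_price(border_price, transm_cost, distr_cost, margin, tax):
    # end-user price = border price + transmission/distribution costs + margin + tax
    return border_price + transm_cost + distr_cost + margin + tax

def demand_path(p_old, p_new, d0, eps_short=-0.2, eps_long=-0.8, adj=0.3, years=30):
    """Short-run response followed by gradual adjustment toward the long-run level,
    mimicking slow turnover of the old capital stock."""
    ratio = p_new / p_old
    d_long = d0 * ratio ** eps_long   # long-run demand target
    d = d0 * ratio ** eps_short       # immediate short-run response
    path = [d]
    for _ in range(years - 1):
        d += adj * (d_long - d)       # partial adjustment each year
        path.append(d)
    return path

p0 = end_user_price(10.0, 2.0, 3.0, 1.0, 2.0)   # baseline price
p1 = end_user_price(10.0, 1.0, 2.0, 1.0, 2.0)   # reduced transmission/distribution costs
path = demand_path(p0, p1, d0=100.0)
print(f"short-run demand {path[0]:.1f}, long-run demand {path[-1]:.1f}")
```

The gap between `path[0]` and `path[-1]` is the model's signature property that long-run demand effects exceed short-run effects; the year-by-year path corresponds to the dynamic multipliers discussed in the abstract.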