Green, Yvette N. J.; Weaver, Pamela A.
This is a study of the approaches, techniques, and information technology systems used for restaurant sales forecasting in the full-service restaurant segment. Companies were examined using a qualitative research design and long interviews to gather information on the approaches, techniques, and technology systems used in the sales forecasting process. The results of the interviews are presented along with an ensuing discussion.
Electricity demand forecasting plays an important role in power generation. The two quantities that have to be forecast in a power system are peak demand, which determines the required plant capacity (MW), and annual energy demand (GWh). Methods used in electricity demand forecasting include time trend analysis and econometric methods. In forecasting, identification of manpower demand, identification of key planning factors, deciding on the planning horizon, differentiating between prediction and projection (i.e. developing different scenarios) and choosing among different forecasting techniques are important.
Roper, A. T
.... The scope of this edition has broadened to include management of technology content that is relevant now to executives in organizations, while updating and strengthening the technology forecasting...
Stuber, Eric; Prasadh, Nishant; Edwards, Stephen; Mavris, Dimitri N.
Technology Impact Forecasting (TIF) is a normative forecasting technique that allows the designer to quantify the effects of adding new technologies to a given design. This method can be used to assess and identify the technological improvements needed to close the gap between the current design and one that satisfies all constraints imposed on the design. The TIF methodology allows more design knowledge to be brought into the earlier phases of the design process, making use of tools such as Quality Function Deployment, Morphological Matrices, Response Surface Methodology, and Monte Carlo Simulation. This increased knowledge allows more informed decisions to be made earlier in the design process, resulting in shortened design cycle time. This paper investigates applying the TIF method, which has been widely used in aircraft applications, to the conceptual design of a hydrocarbon rocket engine. In order to reinstate a manned presence in space, the U.S. must develop an affordable and sustainable launch capability. Hydrocarbon-fueled rockets have drawn interest from numerous major government and commercial entities because they offer a low-cost heavy-lift option that would allow frequent launches. However, the development of effective new hydrocarbon rockets would likely require new technologies to overcome certain design constraints. The use of advanced design methods, such as the TIF method, enables the designer to identify key areas in need of improvement, allowing one to dial in a proposed technology and assess its impact on the system. Through analyses such as this one, a conceptual design for a hydrocarbon-fueled vehicle that meets all imposed requirements can be achieved.
Robinson, B. E.
The results are reported of a Delphi forecast of the utilization and social impacts of large-scale educational telecommunications technology. The focus is on both forecasting methodology and educational technology. The various methods of forecasting used by futurists are analyzed from the perspective of the most appropriate method for a prognosticator of educational technology, and review and critical analysis are presented of previous forecasts and studies. Graphic responses, summarized comments, and a scenario of education in 1990 are presented.
Mishra, D.; Goyal, P.
Urban air pollution forecasting has emerged as an acute problem in recent years because of severe environmental degradation due to the increase in harmful air pollutants in the ambient atmosphere. In this study, different statistical and artificial intelligence techniques are used for forecasting and analysis of air pollution over the Delhi urban area: principal component analysis (PCA), multiple linear regression (MLR) and artificial neural networks (ANN). The forecasts are in good agreement with the concentrations observed by the Central Pollution Control Board (CPCB) at different locations in Delhi. However, such methods suffer from disadvantages: they provide limited accuracy because they are unable to predict extreme points, i.e. the pollution maxima and minima cannot be determined using such approaches, and they are inefficient for producing better forecasts. With the advancement of technology and research, an alternative to these traditional methods has been proposed: coupling statistical techniques with artificial intelligence (AI) for forecasting purposes. The coupling of PCA, ANN and fuzzy logic is used for forecasting air pollutants over the Delhi urban area. The statistical measures, e.g., correlation coefficient (R), normalized mean square error (NMSE), fractional bias (FB) and index of agreement (IOA), of the proposed model are in better agreement with observations than those of all other models. Hence, the coupling of statistical and artificial intelligence techniques can be used for forecasting air pollutants over urban areas.
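The evaluation statistics named in this abstract have standard closed forms; a minimal sketch follows (the helper names are illustrative, not taken from the paper):

```python
# Standard air-quality model evaluation statistics (NMSE, FB, IOA) as
# commonly defined in the model-evaluation literature; a hedged sketch.

def nmse(pred, obs):
    """Normalized mean square error: mean((P-O)^2) / (mean(P) * mean(O))."""
    mp = sum(pred) / len(pred)
    mo = sum(obs) / len(obs)
    return sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(pred) / (mp * mo)

def fractional_bias(pred, obs):
    """FB = 2 * (mean(O) - mean(P)) / (mean(O) + mean(P))."""
    mp = sum(pred) / len(pred)
    mo = sum(obs) / len(obs)
    return 2 * (mo - mp) / (mo + mp)

def index_of_agreement(pred, obs):
    """Willmott's IOA: 1 - sum((P-O)^2) / sum((|P-Obar| + |O-Obar|)^2)."""
    mo = sum(obs) / len(obs)
    num = sum((p - o) ** 2 for p, o in zip(pred, obs))
    den = sum((abs(p - mo) + abs(o - mo)) ** 2 for p, o in zip(pred, obs))
    return 1.0 - num / den
```

A perfect forecast gives NMSE = 0, FB = 0 and IOA = 1, which is why these measures are used to rank competing models.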
Sridhar, M. S.
Examines the nature and limitations of demand forecasting, discusses plausible methods of forecasting demand for information, suggests some useful hints for demand forecasting, and concludes by emphasizing a unified approach to demand forecasting.
The DOE Office of ADP Management organized a group of scientists and computer professionals, mostly from their own national laboratories, to prepare an annually updated technology forecast to accompany the Department's five-year ADP Plan. The activities of the task force were originally reported in an informal presentation made at the ACM Conference in 1978. This presentation represents an update of that report. It also deals with the process of applying the results obtained at a particular computing center, Brookhaven National Laboratory. Computer technology forecasting is a difficult and hazardous endeavor, but it can reap considerable advantage. A forecast performed on an industry-wide basis can be applied to the particular needs of a given installation, and thus give installation managers considerable guidance in planning. A beneficial side effect of this process is that it forces installation managers, who might otherwise tend to preoccupy themselves with immediate problems, to focus on longer-term goals and the means to their ends.
Hoisl, Karin; Stelzer, Tobias; Biala, Stefanie
in the ICT industry. The conjoint approach allows for a simulation of the forecasting process and considers utility trade-offs. The results show that for both types of experts the perceived benefit of users most highly contributes to predicting technological discontinuities. Internal experts assign more...
Roper, A. T
... what the authors see as the innovations to technology management in the last 17 years: the Internet; the greater focus on group decision-making including process management and mechanism design; and desktop software that has transformed the analytical capabilities of technology managers"--Provided by publisher.
Kopytov, V. V.; Kharechkin, P. V.; Naumenko, V. V.; Tretyak, R. S.; Tebueva, F. B.
The article describes a real-time emergency forecasting technique that increases the accuracy and reliability of the forecasting results of any emergency computational model used for decision making in situation management systems. The computational models are improved by the improved Brown's method, which applies fractal dimension to forecast short time series data received from sensors and control systems. The reliability of the emergency forecasting results is ensured by filtering invalid sensed data according to methods of correlation analysis.
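The base method the article builds on, Brown's double exponential smoothing, can be sketched as follows (the paper's fractal-dimension adjustment of the smoothing constant is not reproduced here; this is the textbook form only):

```python
# Brown's double exponential smoothing: smooth the series twice with the
# same constant alpha, then recover level and trend for the forecast.

def brown_forecast(series, alpha=0.3, horizon=1):
    """Forecast `horizon` steps ahead with Brown's method."""
    s1 = s2 = series[0]  # initialize both smoothing levels
    for x in series:
        s1 = alpha * x + (1 - alpha) * s1      # first smoothing
        s2 = alpha * s1 + (1 - alpha) * s2     # second smoothing
    level = 2 * s1 - s2                        # estimated level
    trend = alpha / (1 - alpha) * (s1 - s2)    # estimated trend
    return level + horizon * trend
```

On a trending series the double smoothing lets the forecast extrapolate the slope, which is why the method suits the short sensor time series described above.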
Roper, A. T
"The new, revised edition of this book will build on this knowledge in the context of business organizations that now place a greater emphasis on technology to stay on the cutting edge of development...
Geostatistical spatial models are widely used in many applied fields to forecast data observed on continuous three-dimensional surfaces. We propose to extend their use to finance and, in particular, to forecasting yield curves. We present the results of an empirical application in which we apply the proposed method to forecast Euro Zero Rates (2003–2014) using the Ordinary Kriging method based on the anisotropic variogram. Furthermore, a comparison with other recent methods for forecasting yield curves is proposed. The results show that the model is characterized by good levels of prediction accuracy and is competitive with the other forecasting models considered.
This review deals with how geomagnetic storms can be predicted with the use of Artificial Intelligence (AI) techniques. Today many different AI techniques have been developed, such as symbolic systems (expert and fuzzy systems) and connectionist systems (neural networks). Integrations of AI techniques also exist, so-called Intelligent Hybrid Systems (IHS). These systems are capable of learning the mathematical functions underlying the operation of non-linear dynamic systems and also of explaining the knowledge they have learned. Very few such powerful systems exist at present. Two examples are the Magnetospheric Specification Forecast Model of Rice University and the Lund Space Weather Model of Lund University. Various attempts to predict geomagnetic storms on long to short time scales are reviewed in this article. Predictions a month to days ahead most often use solar data as input. The first SOHO data are now available. Due to their high temporal and spatial resolution, new solar physics has been revealed. These SOHO data might lead to a breakthrough in these predictions. Predictions hours ahead and shorter rely on real-time solar wind data. WIND gives us real-time data for only part of the day. However, with the launch of the ACE spacecraft in 1997, real-time data will be available 24 hours a day. That might lead to the second breakthrough for predictions of geomagnetic storms.
Brinkman, Paul T.; McIntyre, Chuck
There is no right way to forecast college enrollments; in many instances, it will be prudent to use both qualitative and quantitative methods. Methods chosen must be relevant to questions addressed, policies and decisions at stake, and time and talent required. While it is tempting to start quickly, enrollment forecasting is an area in which…
Chullen, Cinda; Westheimer, David T.
The goal of NASA's current EVA technology effort is to further develop technologies that will be used to demonstrate a robust EVA system that has application for a variety of future missions, including microgravity and surface EVA. Overall, the objectives will be to reduce system mass, reduce consumables and maintenance, increase EVA hardware robustness and life, increase crew member efficiency and autonomy, and enable rapid vehicle egress and ingress. Over the past several years, NASA realized a tremendous increase in EVA system development as part of the Exploration Technology Development Program and the Constellation Program. The evident demand for efficient and reliable EVA technologies, particularly regenerable technologies, was apparent under these former programs and will continue to be needed as future mission opportunities arise. The technological need for EVA in space has been realized over the last several decades by the Gemini, Apollo, Skylab, Space Shuttle, and International Space Station (ISS) programs. EVAs were critical to the success of these programs. Now, with the ISS extension to 2028 in conjunction with a currently forecasted need of at least eight EVAs per year, the EVA hardware life and limited availability of the Extravehicular Mobility Units (EMUs) will eventually become a critical issue. The current EMU has successfully served EVA demands by performing critical operations to assemble the ISS and provide repairs of satellites such as the Hubble Space Telescope. However, as the life of ISS and the vision for future mission opportunities are realized, a new EVA systems capability will be needed, and the current architectures and technologies under development offer significant improvements over the current flight systems. In addition to ISS, potential mission applications include EVAs for missions to Near Earth Objects (NEO), Phobos, or future surface missions. Surface missions could include either exploration of the Moon or Mars. Providing an
Exchange rate forecasting is, and has been, a challenging task in finance. Statistical and econometric models are widely used in the analysis and forecasting of foreign exchange rates. This paper investigates the behavior of daily exchange rates of the Romanian Leu against the Euro, United States Dollar, British Pound, Japanese Yen, Chinese Renminbi and Russian Ruble. Smoothing techniques are generated and compared with each other. These models include the Simple Exponential Smoothing technique, the Double Exponential Smoothing technique, the Simple Holt-Winters and the Additive Holt-Winters techniques, as well as the Autoregressive Integrated Moving Average model.
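The first of the listed techniques, simple exponential smoothing, is compact enough to sketch directly (a minimal version; initialization conventions vary across implementations):

```python
# Simple exponential smoothing: the one-step-ahead forecast is an
# exponentially weighted average of past observations.

def simple_exp_smoothing(series, alpha=0.2):
    """Return the one-step-ahead forecast f_{t+1} = a*x_t + (1-a)*f_t."""
    forecast = series[0]  # initialize with the first observation
    for x in series[1:]:
        forecast = alpha * x + (1 - alpha) * forecast
    return forecast
```

Larger `alpha` weights recent observations more heavily, which is the main tuning choice when such models are compared against each other, as in this paper.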
Gopi Naveen Chander
A technique for fabricating feldspathic porcelain pressable ingots is proposed. A 5 ml disposable syringe was used to condense the powder slurry. The condensed porcelain was sintered at 900 °C to produce porcelain ingots. The fabricated porcelain ingots were used in pressable ceramic machines. The technological advantages of the pressable system improve the properties, and the fabricated ingot enhances the application of feldspathic porcelain.
Woodard, Crystal J.; Carey, L. D.; Petersen, W. A.; Roeder, W. P.
The objective of this NASA MSFC and NOAA CSTAR funded study is to develop and test operational forecast algorithms for the prediction of lightning initiation utilizing the C-band dual-polarimetric radar, UAHuntsville's Advanced Radar for Meteorological and Operational Research (ARMOR). Although there is a rich research history of radar signatures associated with lightning initiation, few studies have utilized dual-polarimetric radar signatures (e.g., Z(sub dr) columns) and capabilities (e.g., fuzzy-logic particle identification [PID] of precipitation ice) in an operational algorithm for first flash forecasting. The specific goal of this study is to develop and test polarimetric techniques that enhance the performance of current operational radar reflectivity based first flash algorithms. Improving lightning watch and warning performance will positively impact personnel safety in both work and leisure environments. Advanced warnings can provide space shuttle launch managers time to respond appropriately to secure equipment and personnel, while they can also provide appropriate warnings for spectators and players of leisure sporting events to seek safe shelter. Through the analysis of eight case dates, consisting of 35 pulse-type thunderstorms and 20 non-thunderstorm case studies, lightning initiation forecast techniques were developed and tested. The hypothesis is that the additional dual-polarimetric information could potentially reduce false alarms while maintaining high probability of detection and increasing lead-time for the prediction of the first lightning flash relative to reflectivity-only based techniques. To test the hypothesis, various physically-based techniques using polarimetric variables and/or PID categories, which are strongly correlated to initial storm electrification (e.g., large precipitation ice production via drop freezing), were benchmarked against the operational reflectivity-only based approaches to find the best compromise between
P. Exterkate (Peter)
This thesis discusses various novel techniques for economic forecasting. The focus is on methods that exploit the information in large data sets effectively. Each of these methods is compared to established techniques for forecasting yields on U.S. Treasury Bills, housing prices,
Zied Ben Bouallègue
COSMO-DE-EPS, a convection-permitting ensemble prediction system based on the high-resolution numerical weather prediction model COSMO-DE, has been pre-operational since December 2010, providing probabilistic forecasts covering Germany. This ensemble system comprises 20 members based on variations of the lateral boundary conditions, the physics parameterizations and the initial conditions. In order to increase the sample size in a computationally inexpensive way, COSMO-DE-EPS is combined with alternative ensemble techniques: the neighborhood method and the time-lagged approach. Their impact on the quality of the resulting probabilistic forecasts is assessed. Objective verification is performed over a six-month period; scores based on the Brier score and its decomposition are shown for June 2011. The combination of the ensemble system with the alternative approaches improves probabilistic forecasts of precipitation, in particular for high precipitation thresholds. Moreover, combining COSMO-DE-EPS with only the time-lagged approach improves the skill of area probabilities for precipitation and does not deteriorate the skill of 2 m-temperature and wind gust forecasts.
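The verification score used here, the Brier score, is simply the mean squared difference between forecast probabilities and binary outcomes (0 is a perfect score); a minimal sketch:

```python
# Brier score for probabilistic forecasts of a binary event
# (e.g. "precipitation exceeds a threshold": outcome 1 if it did, else 0).

def brier_score(probs, outcomes):
    """Mean squared error of forecast probabilities against 0/1 outcomes."""
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)
```

Its decomposition into reliability, resolution and uncertainty terms, referenced in the abstract, is built on top of this same quantity.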
Cheng-Shih Su; Shu-Chen Hsu
This paper presents a fuzzy preference relations approach to forecasting the success of implementing sensors advanced manufacturing technology (AMT). In the manufacturing environment, performance measurement is based on different quantitative and qualitative factors. This study proposes an analytic hierarchical prediction model based on fuzzy preference relations to help organizations become aware of the essential factors affecting AMT implementation, forecasting the chance of successf...
Alet, Pierre-Jean; Efthymiou, Venizelos; Graditi, Giorgio
Forecasting and monitoring technologies for photovoltaics are required on different spatial and temporal scales by multiple actors, from the owners of PV systems to transmission system operators. In this paper the Grid integration working group of the European Technology and Innovation Platform – Photovoltaics (ETIP PV) reviews the different use cases for these technologies, their current status, and the need for future developments. Power system operations require a real-time view of PV production for managing power reserves and for feeding short-term forecasts. They also require forecasts on all timescales from the short (for dispatching purposes), where statistical models work best, to the very long (for infrastructure planning), where physics-based models are more accurate. Power system regulations are driving the development of these techniques. This application also provides a good basis...
Artificial intelligence is a promising futuristic concept in the field of science and technology, and is widely used in new industries. Deep-learning technology leads to performance enhancement and generalization of artificial intelligence technology. A global leader in the field of information technology has declared its intention to use deep-learning technology to solve environmental problems such as climate change, but few environmental applications have so far been developed. This study uses deep-learning technologies in the environmental field to predict the status of pro-environmental consumption. We predicted the pro-environmental consumption index based on Google search query data, using a recurrent neural network (RNN) model. To verify the accuracy of the index, we compared the prediction accuracy of the RNN model with that of the ordinary least squares and artificial neural network models. The RNN model predicts the pro-environmental consumption index better than any other model. We expect the RNN model to perform even better in a big data environment, because deep-learning technologies become increasingly sophisticated as the volume of data grows. Moreover, the framework of this study could be useful in environmental forecasting to prevent damage caused by climate change.
of directions and targets for an R&D project, monitoring of a given area by a public agency, and evaluation of the future competitive situation for a company. This paper gives a brief introduction to the field of technological forecasting, especially in relation to the strategic planning process...
The stock market is considered too uncertain to be predictable. Many individuals have developed methodologies or models to increase the probability of making a profit in their stock investments. The overall hit rates of these methodologies and models are generally too low to be practical for real-world application. One of the major reasons is the huge fluctuation of the market. Therefore, the current research focus in the stock forecasting area is to improve the accuracy of stock trading forecasts. This paper introduces a system that addresses this particular need. The system integrates various data mining techniques and supports decision-making for stock trades. The proposed system embeds the top-down trading theory, artificial neural network theory, technical analysis, dynamic time series theory, and Bayesian probability theory. To experimentally examine the trading return of the presented system, two examples are studied. The first uses the Taiwan Semiconductor Manufacturing Company (TSMC) dataset, which covers an investment horizon of 240 trading days from 16 February 2011 to 23 January 2013. Eighty-four transactions were made using the proposed approach, and the investment return of the portfolio was 54% with an 80.4% hit rate during a 12-month period in which the TSMC stock price increased by 25% (from NT$78.5 to NT$101.5). The second example examines the stock data of Evergreen Marine Corporation, an international marine shipping company. Sixty-four transactions were made and the investment return of the portfolio was 128% in 12 months. Given the remarkable investment returns in trading the example TSMC and Evergreen stocks, the proposed system demonstrates promising potential as a viable tool for stock market forecasting.
Technology forecasting (TF) is forecasting the future state of a technology. It is exciting to know the future of technologies, because technology changes the way we live and enhances the quality of our lives. In particular, TF is an important area in the management of technology (MOT) for R&D strategy and new product development. Consequently, there are many studies on TF. Patent analysis is one method of TF because patents contain substantial information regarding developed technology. Conventional methods of patent analysis are based on quantitative approaches such as statistics and machine learning. The most traditional TF methods based on patent analysis share a common problem: the sparsity of the patent keyword data structured from collected patent documents. After preprocessing with text mining techniques, most frequencies of technological keywords in patent data have values of zero. This problem degrades the performance of TF and makes patent keyword data difficult to analyze. To solve this problem, we propose an interval estimation method (IEM). Using an adjusted Wald confidence interval, called the Agresti–Coull confidence interval, we construct our IEM for efficient TF. In addition, we apply the proposed method to forecast the technology of an innovative company. To show how our work can be applied in the real domain, we conduct a case study using Apple technology.
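The Agresti–Coull interval named above has a simple closed form, which also shows why it helps with sparse keyword counts: even an observed count of zero yields a nonzero upper bound. A minimal sketch:

```python
import math

# Agresti-Coull (adjusted Wald) confidence interval for a proportion:
# add z^2/2 pseudo-successes and z^2 pseudo-trials, then apply the
# ordinary Wald formula to the adjusted estimate.

def agresti_coull(successes, trials, z=1.96):
    """Return (lower, upper) for the ~95% interval when z=1.96."""
    n_tilde = trials + z ** 2
    p_tilde = (successes + z ** 2 / 2) / n_tilde
    half = z * math.sqrt(p_tilde * (1 - p_tilde) / n_tilde)
    return max(0.0, p_tilde - half), min(1.0, p_tilde + half)
```

For a keyword that never appears in 20 patents, the interval is roughly (0, 0.19) rather than the degenerate (0, 0) the plain Wald interval would give.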
This article reports the state of the art of DEIS activities. DEIS activities are basically based on the activities of the 8-10 investigation committees under the DEIS committee. Recent DEIS activities are categorized into three functions in this article, and a remarkable activity or trend of each category is mentioned. Those are activities on insulation diagnosis (AI application and asset management), activities on new insulation technology for power transmission (high-Tc superconducting cable insulation and the all-solid insulated substation), and activities on new insulating materials (nanocomposites).
This article reports the state of the art of TC-DEI (Technical Committee of Dielectrics and Electrical Insulation of IEEJ) activities. The activities are basically based on the activities of the 8-10 investigation committees under TC-DEI. Recent activities are categorized into three functions in this article, and a remarkable activity or trend for each category is mentioned, as was done in the article of 2003. Those are activities on asset management (AI application and insulation diagnosis), activities on new insulating and functional materials (nanocomposites) and activities on new insulation technology for power transmission (high-Tc superconducting cable insulation).
This article reports the state of the art of TC-DEI (Technical Committee of Dielectrics and Electrical Insulation of IEEJ) activities. The activities are basically based on the activities of the 8-10 investigation committees under TC-DEI. Recent activities are categorized into three functions in this article, and a remarkable activity or trend for each category is mentioned, as was seen in the articles of 2005. Those are activities on asset management (AI application and insulation diagnosis), activities on new insulating and functional materials (nanocomposites) and activities on new insulation technology for power transmission (high-Tc superconducting cable insulation).
Thierion, V.; Ayral, P.-A.; Angelini, V.; Sauvagnargues-Lesage, S.; Nativi, S.; Payrastre, O.
Flash flood events in the south of France, such as those of 8-9 September 2002 in the Grand Delta territory, caused significant economic and human damage. Following this catastrophic hydrological situation, a reform of the flood warning services was initiated (set in 2006). This political reform transformed the 52 existing flood warning services (SAC) into 22 flood forecasting services (SPC), assigning them more hydrologically consistent territories and a new, effective hydrological forecasting mission. Furthermore, a national central service (SCHAPI) was created to ease this transformation and support the local services in their new objectives. New functioning requirements were identified: - the SPC and SCHAPI carry the responsibility to clearly disseminate to public organisms, civil protection actors and the population the crucial hydrological information needed to better anticipate a potentially dramatic flood event; - a new, effective hydrological forecasting mission for these flood forecasting services seems essential, particularly for the flash flood phenomenon. Thus, model improvement and optimization was one of the most critical requirements. Initially dedicated to supporting forecasters in their monitoring mission through measuring stations and rainfall radar image analysis, hydrological models have to become more efficient in their capacity to anticipate the hydrological situation. Present hydrological research is mainly driven by understanding the natural phenomena occurring during flash floods. Rather than trying to explain such complex processes, the presented research tries to address the well-known need for computational power and data storage capacity of these services. In recent years, Grid technology has appeared as a technological revolution in high performance computing (HPC), allowing large-scale resource sharing, use of computational power and collaboration across networks. Nowadays, the EGEE (Enabling Grids for E-science in Europe) project represents the most important
Lund, P D [Helsinki Univ. of Technology, Espoo (Finland). Advanced Energy Systems
An improved market penetration model with application to wind energy forecasting is presented. In the model, a technology diffusion model and a manufacturing learning curve are combined. Based on an 85% progress ratio found for European wind manufacturers and on wind market statistics, an additional wind power capacity of ca. 4 GW is needed in Europe to reach a 30% price reduction. A full breakthrough to low-cost utility bulk power markets could be achieved at a 24 GW level. (author)
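The learning-curve arithmetic behind figures like these can be sketched as follows (the ~1.1 GW baseline cumulative capacity is an illustrative assumption, not a figure from the abstract):

```python
import math

# Experience-curve sketch: a progress ratio of 0.85 means each doubling
# of cumulative capacity multiplies unit cost by 0.85. How many doublings
# (and how much added capacity) until cost falls to 70% of today's?

progress_ratio = 0.85
doublings = math.log(0.70) / math.log(progress_ratio)  # cost falls to 70%
growth_factor = 2 ** doublings                         # capacity multiplier
base_capacity_gw = 1.1                                 # assumed baseline (GW)
additional_gw = base_capacity_gw * (growth_factor - 1)
```

With these assumptions, a 30% price reduction requires about 2.2 doublings, i.e. cumulative capacity growing roughly 4.6-fold, which is consistent with the order of magnitude quoted above.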
I. V. Danilin
Purpose: this paper analyzes and forecasts the medium- to long-term dynamics of Smart Grid technology development, considering both patent activity and socio-economic (demand-side) factors, i.e. the issues and requirements of the economy and the power system. Methods: for the analysis of Smart Grid patent data (the IIP, USPTO, and WIPO patent databases were used) we apply syntactic-semantic analysis of texts in natural languages and a logistic curve-based method. We propose the Exactus Patent system for intelligent full-text search and analysis of patents (results verified with the Thomson Innovation and TotalPatent patent search systems). For the interpretation of the revealed dynamics and the forecasting of future conditions, we identify key long-term socio-economic factors and drivers for Smart Grid development. Elements of C. Christensen's (disruptive innovations) and G. Dosi's (technological trajectories) theories were applied. Results: the study reveals a fast technological transformation within the Smart Grid domain due to long-term socio-economic factors such as the rise of renewables; energy efficiency and energy security issues; environmental constraints and a shift of values; requirements for accelerated grid construction (in developing economies) and grid modernization (in developed ones); and ongoing economy-wide digitalization. Due to the limited economic effects of Smart Grid roll-outs (considering the major requirements of economic agents and society) and considering the progression of patent dynamics, the authors forecast technology stagnation (in terms of growth in the number of patents) by the end of the 2010s, as the end of Gartner's hype development stage. Conclusions and Relevance: the foreseen change in the dynamics of Smart Grid technology development is interpreted as a manifestation of sinusoidal fluctuations in technology development for disruptive technologies (supported with OECD data). A longer cycle (in comparison with other disruptive technologies) is interpreted as a consequence of technology and industry specifics.
Schettert, Plinio G.; Oliveira, Wagner S.; Aquino, Afonso R.
Introducing nuclear fusion into a viable energy system over the long term, taking economic factors into consideration, would imply investment over a long period. The objective of this project, using the Delphi technique, is a long-term technological forecast of the scientific-technological development of nuclear fusion and its impact. This research project will be carried through different stages of refinement of the variables. A questionnaire based on information and analysis of the literature, validated by specialists in nuclear fusion, makes this project a tool for the future elaboration of a database containing variables on the theme of nuclear fusion and its perspectives. The database will be composed of the answers and suggestions obtained, with exploratory and extrapolatory elements, on the theme from a great number of specialists involved in the nuclear fusion area. The database is analyzed for the configuration of variables that represent elements such as scientific-technological, economic, political, social and environmental factors, among others. As the final result of the research with the Delphi technique, different scenarios obtained with the variables will be indicated by convergent (or non-convergent) factors on the approached perspectives. The analysis of the data will be made possible through improved statistical analysis tools. This is the first analysis of the answers. The questionnaire was validated with nuclear fusion specialists from the Institute of Physics of the University of Sao Paulo in Brazil and the Center of Nuclear Fusion of the Technical University of Lisbon in Portugal. (author)
Full Text Available This paper presents a fuzzy preference relations approach to forecast the success of implementing sensors advanced manufacturing technology (AMT). In the manufacturing environment, performance measurement is based on different quantitative and qualitative factors. This study proposes an analytic hierarchical prediction model based on fuzzy preference relations to help organizations become aware of the essential factors affecting AMT implementation, forecast the chance of successfully implementing sensors AMT, and identify the actions necessary before implementing sensors AMT. Predicted success/failure values are then obtained to enable organizations to decide whether to initiate sensors AMT, inhibit adoption, or take remedial actions to increase the possibility of successful sensors AMT initiatives. The proposed approach is demonstrated with a real case study involving six influential factors assessed by nine evaluators solicited from a semiconductor engineering incorporation located in Taiwan.
Shin, Jungwoo; Lee, Chul-Yong; Kim, Hongbum
Among the various alternatives available to reduce greenhouse gas (GHG) emissions, carbon capture and storage (CCS) is considered to be a prospective technology that could both improve economic growth and meet GHG emission reduction targets. Despite the importance of CCS, however, studies of technology and demand forecasting for CCS are scarce. This study bridges this gap in the body of knowledge on this topic by forecasting CCS technology and demand based on an integrated model. For technology forecasting, a logistic model and patent network analysis are used to compare the competitiveness of CCS technology for selected countries. For demand forecasting, a competition diffusion model is adopted to consider competition among renewable energies and forecast demand. The results show that the number of patent applications for CCS technology will increase to 16,156 worldwide and to 4,790 in Korea by 2025. We also find that the United States has the most competitive CCS technology followed by Korea and France. Moreover, about 5 million tCO_2e of GHG will be reduced by 2040 if CCS technology is adopted in Korea after 2020. - Highlights: • Carbon capture and storage (CCS) can help mitigate climate change globally. • It can both improve economic growth and meet GHG emission reduction targets. • We forecast CCS technology and demand based on an integrated model. • The US has the most competitive CCS technology followed by Korea and France. • 5 million tCO_2e of GHG will be reduced by 2040 if CCS is adopted in Korea.
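The logistic-curve component of such technology forecasts can be sketched in a few lines. A minimal illustration of logistic (Pearl-curve) patent-count extrapolation, with purely hypothetical parameters (a 16,000-patent ceiling, inflection in 2018, growth rate 0.4/yr) rather than the study's fitted values:

```python
import math

def logistic(t, ceiling, rate, midpoint):
    """Cumulative patent count at time t under a logistic (Pearl) growth
    curve with saturation `ceiling`, growth `rate`, and inflection `midpoint`."""
    return ceiling / (1.0 + math.exp(-rate * (t - midpoint)))

# Hypothetical parameters, not fitted to the study's data.
ceiling, rate, midpoint = 16000.0, 0.4, 2018.0

# Yearly growth is the difference of the cumulative curve; it peaks near
# the inflection year and declines afterwards, which is how logistic-model
# studies diagnose approaching saturation of a technology.
growth = {yr: logistic(yr + 1, ceiling, rate, midpoint)
              - logistic(yr, ceiling, rate, midpoint)
          for yr in range(2010, 2026)}
peak_year = max(growth, key=growth.get)
```

In practice the three parameters are fitted to observed cumulative counts (e.g. by nonlinear least squares) before the curve is extrapolated.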
Catalão, João P S
Overview of Electric Power Generation Systems (Cláudio Monteiro); Uncertainty and Risk in Generation Scheduling (Rabih A. Jabr); Short-Term Load Forecasting (Alexandre P. Alves da Silva and Vitor H. Ferreira); Short-Term Electricity Price Forecasting (Nima Amjady); Short-Term Wind Power Forecasting (Gregor Giebel and Michael Denhard); Price-Based Scheduling for Gencos (Govinda B. Shrestha and Songbo Qiao); Optimal Self-Schedule of a Hydro Producer under Uncertainty (F. Javier Díaz and Javie
Kantha, L.; Carniel, S.; Sclavo, M.
The multi-model superensemble (SE) technique has been used with considerable success to improve meteorological forecasts and is now being applied to ocean models. Although the technique has been shown to produce deterministic forecasts that can be superior to the individual models in the ensemble or to a simple multi-model ensemble forecast, there is a clear need to understand its strengths and limitations. This paper is an attempt to do so in simple, easily understood contexts. The results demonstrate that the SE forecast is almost always better than the simple ensemble forecast, the degree of improvement depending on the properties of the models in the ensemble. However, the skill of the SE forecast with respect to the truth depends on a number of factors, principal among which is the skill of the models in the ensemble. As can be expected, if the ensemble consists of models with poor skill, the SE forecast will also be poor, although better than the ensemble forecast. On the other hand, the inclusion of even a single skillful model in the ensemble increases the forecast skill significantly.
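The superensemble's key step, regressing bias-corrected member anomalies onto observed anomalies over a training period, can be sketched for two members. The series below are synthetic, constructed so that the truth is an exact 0.7/0.3 blend of the members:

```python
import statistics

def superensemble_weights(truth, model1, model2):
    """Least-squares weights for a two-member superensemble: each series
    is reduced to anomalies (removing mean bias), then the 2x2 normal
    equations give the weights that best fit the observed anomalies."""
    a1 = [x - statistics.mean(model1) for x in model1]
    a2 = [x - statistics.mean(model2) for x in model2]
    y = [x - statistics.mean(truth) for x in truth]
    s11 = sum(u * u for u in a1)
    s22 = sum(v * v for v in a2)
    s12 = sum(u * v for u, v in zip(a1, a2))
    b1 = sum(u * t for u, t in zip(a1, y))
    b2 = sum(v * t for v, t in zip(a2, y))
    det = s11 * s22 - s12 * s12
    return (b1 * s22 - b2 * s12) / det, (b2 * s11 - b1 * s12) / det

# Synthetic training data: truth is, by construction, 0.7x model 1 plus
# 0.3x model 2 in anomaly space, so the regression recovers those weights.
truth = [8.625, 9.725, 9.925, 11.725]
w1, w2 = superensemble_weights(truth, [1, 3, 2, 5], [2, 1, 4, 3])  # ≈ 0.7, 0.3
```

With many members this becomes a general multiple regression, and the fitted weights (plus bias corrections) are then applied to independent forecasts.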
Chullen, Cinda; Westheimer, David T.
Beginning in Fiscal Year (FY) 2011, extravehicular activity (EVA) technology development became a foundational technology domain under a new program, Enabling Technology Development and Demonstration. The goal of the EVA technology effort is to further develop technologies that will be used to demonstrate a robust EVA system with application to a variety of future missions, including microgravity and surface EVA. Overall, the objectives are to reduce system mass, reduce consumables and maintenance, increase EVA hardware robustness and life, increase crew member efficiency and autonomy, and enable rapid vehicle egress and ingress. Over the past several years, NASA realized a tremendous increase in EVA system development as part of the Exploration Technology Development Program and the Constellation Program. The evident demand for efficient and reliable EVA technologies, particularly regenerable technologies, was apparent under these former programs and will continue as future mission opportunities arise. The technological need for EVA in space has been demonstrated over the last several decades by the Gemini, Apollo, Skylab, Space Shuttle, and International Space Station (ISS) programs. EVAs were critical to the success of these programs. Now, with the ISS extension to 2028 in conjunction with a current forecasted need of at least eight EVAs per year, EVA technology life and the limited availability of the EMUs will eventually become a critical issue. The current Extravehicular Mobility Unit (EMU) has ably served EVA demands by performing critical operations to assemble the ISS and to repair satellites such as the Hubble Space Telescope. However, as the life of the ISS and the vision for future mission opportunities are realized, a new EVA systems capability could be an option for future mission applications, building off the technology development of the last several years. Besides ISS, potential mission applications include EVAs for
Hamann, Hendrik F. [IBM, Yorktown Heights, NY (United States). Thomas J. Watson Research Center
The goal of the project was the development and demonstration of a significantly improved solar forecasting technology (short: Watt-sun), which leverages new big-data processing technologies and machine-learnt blending between different models and forecast systems. The technology aimed at demonstrating major advances in accuracy, as measured by existing and new metrics which themselves were developed as part of this project. Finally, the team worked with Independent System Operators (ISOs) and utilities to integrate the forecasts into their operations.
Jang, Sangmin; Yoon, Sunkwon; Rhee, Jinyoung; Park, Kyungwon
Due to recent extreme weather and climate change, the frequency and magnitude of localized heavy rainfall have increased, bringing various hazards including sediment-related disasters, flooding, and inundation. To prevent and mitigate damage from such disasters, very-short-range forecasting and nowcasting of precipitation amounts are very important. Weather radar data are very useful in monitoring and forecasting because weather radar has high spatial and temporal resolution. Generally, extrapolation based on the motion vector is the best method of precipitation forecasting using radar rainfall data for a time frame within a few hours of the present. However, there is a need for improvement because radar rainfall is less accurate than rain-gauge measurements at the surface. To improve the radar rainfall estimates and to take advantage of the COMS (Communication, Ocean and Meteorological Satellite) data, a technique to blend the different data types for very-short-range forecasting purposes was developed in the present study. The motion vector of precipitation systems is estimated using 1.5 km CAPPI (Constant Altitude Plan Position Indicator) reflectivity by a pattern matching method, which indicates the systems' direction and speed of movement, and the blended radar-COMS rain field is used as initial data. Since the original horizontal resolution of COMS is 4 km while that of radar is about 1 km, a spatial downscaling technique is used to downscale the COMS data from 4 km to 1 km pixels in order to match the radar data. The accuracy of the rainfall forecasting data was verified using AWS (Automatic Weather System) observations for an extreme rainfall event that occurred in the southern part of the Korean Peninsula on 25 August 2014. The results of this study will be used as input data for an urban stream real-time flood early warning system and a landslide prediction model. Acknowledgement This research was supported by a grant (13SCIPS04) from the Smart Civil Infrastructure Research Program funded by
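The pattern-matching step for estimating a motion vector can be illustrated on a toy reflectivity grid. This is a minimal stand-in for the CAPPI-based matching described above, not the study's implementation:

```python
def best_shift(prev, curr, max_shift=2):
    """Estimate the displacement (dy, dx) that best maps the previous
    field onto the current one by minimizing the mean squared difference
    over the overlapping region (a tiny pattern-matching search)."""
    rows, cols = len(prev), len(prev[0])
    best, best_err = (0, 0), float("inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            err, n = 0.0, 0
            for y in range(rows):
                for x in range(cols):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < rows and 0 <= xx < cols:
                        err += (prev[y][x] - curr[yy][xx]) ** 2
                        n += 1
            if n and err / n < best_err:
                best_err, best = err / n, (dy, dx)
    return best

# A synthetic rain cell that moves one grid cell eastward between scans.
prev = [[0, 0, 0, 0], [0, 9, 5, 0], [0, 4, 2, 0], [0, 0, 0, 0]]
curr = [[0, 0, 0, 0], [0, 0, 9, 5], [0, 0, 4, 2], [0, 0, 0, 0]]
motion = best_shift(prev, curr)  # (0, 1): no vertical motion, one cell east
```

Nowcasting then extrapolates the current field along this vector for the desired lead time (operational systems use cross-correlation over many sub-regions rather than one global shift).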
Sridhar, M.; Venkata Ratnam, D.; Padma Raju, K.; Sai Praharsha, D.; Saathvika, K.
The forecasting and modeling of ionospheric scintillation effects are crucial for precise satellite positioning and navigation applications. In this paper, a neural network model, trained using the Particle Swarm Optimization (PSO) algorithm, has been implemented for the prediction of amplitude scintillation index (S4) observations. The Global Positioning System (GPS) and ionosonde data available at Darwin, Australia (12.4634° S, 130.8456° E) during 2013 have been considered. A correlation analysis between GPS S4 and ionosonde-derived parameters (hmF2 and foF2) has been conducted for forecasting the S4 values. The results indicate that the forecasted S4 values closely follow the measured S4 values for both quiet and disturbed conditions. The outcome of this work will be useful for understanding ionospheric scintillation phenomena over low-latitude regions.
Yuan, Yuan; Shih, Frank Y.; Jing, Ju; Wang, Hai-Min
We present a new method for automatically forecasting the occurrence of solar flares based on photospheric magnetic measurements. The method is a cascading combination of an ordinal logistic regression model and a support vector machine classifier. The predictive variables are three photospheric magnetic parameters, i.e., the total unsigned magnetic flux, length of the strong-gradient magnetic polarity inversion line, and total magnetic energy dissipation. The output is true or false for the occurrence of a certain level of flares within 24 hours. Experimental results, from a sample of 230 active regions between 1996 and 2005, show the accuracies of a 24-hour flare forecast to be 0.86, 0.72, 0.65 and 0.84 respectively for the four different levels. Comparison shows an improvement in the accuracy of X-class flare forecasting.
Full Text Available Multistage expert surveys like the Delphi method are proven concepts for technology forecasting that enable the prediction of content-related and temporal development in fields of innovation (e.g., [1, 2]). Advantages of these qualitative multistage methods are a simple and easy-to-understand concept that still delivers valid results. Nevertheless, the literature also points out certain disadvantages, especially in large-scale technology forecasts in particularly abstract fields of innovation. The proposed approach highlights the usefulness of the repertory grid method as an alternative for technology forecasting and as a first step for preference measurement. The basic approach of Baier and Kohler is modified insofar as an online survey reduces the cognitive burden on the experts and simplifies the data collection process. Advantages over alternative approaches through its simple structure and through combining qualitative and quantitative methods are shown, and an adaptation to an actual field of innovation – civil drones in Germany – is carried out. The establishment of a common terminology for all experts minimizes misunderstandings during the interview, and an inter-individually comparable level of abstraction is enforced by the laddering technique during the interview.
Hudlow, Michael D.
The hydrologic forecasting service of the United States spans applications and scales ranging from those associated with the issuance of flood and flash warnings to those pertaining to seasonal water supply forecasts. New technological developments (underway in or planned by the National Weather Service (NWS) in support of the Hydrologic Program) are carried out as combined efforts by NWS headquarters and field personnel in cooperation with other organizations. These developments fall into two categories: hardware and software systems technology, and hydrometeorological analysis and prediction technology. Research, development, and operational implementation in progress in both of these areas are discussed. Cornerstones of an overall NWS modernization effort include implementation of state-of-the-art data acquisition systems (including the Next Generation Weather Radar) and communications and computer processing systems. The NWS Hydrologic Service will capitalize on these systems and will incorporate results from specific hydrologic projects including collection and processing of multivariate data sets, conceptual hydrologic modeling systems, integrated hydrologic modeling systems with meteorological interfaces and automatic updating of model states, and extended streamflow prediction techniques. The salient aspects of ongoing work in these areas are highlighted in this paper, providing some perspective on the future U.S. hydrologic forecasting service and its transitional period into the 1990s.
Mihaela Bratu (Simionescu)
Full Text Available In this study, transformations of SPF inflation forecasts were made in order to obtain more accurate predictions. The application of filters and the Holt-Winters technique were chosen as possible strategies for improving prediction accuracy. The quarterly inflation rate forecasts (1975 Q1-2012 Q3) for the USA made by the SPF were transformed using an exponential smoothing technique (Holt-Winters), and these new predictions are better than the initial ones for all forecasting horizons of 4 quarters. Some filters were applied to the SPF forecasts (Hodrick-Prescott, Band-Pass, and Christiano-Fitzgerald filters), but the Holt-Winters method was superior. Full-sample asymmetric (Christiano-Fitzgerald and Band-Pass) filter-smoothed values are more accurate than the SPF expectations only for some forecast horizons.
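The smoothing transformation applied to such forecast series can be sketched with Holt's linear-trend method, the non-seasonal core of Holt-Winters; the series and smoothing parameters below are illustrative, not the study's:

```python
def holt_linear(series, alpha=0.5, beta=0.3, horizon=4):
    """Holt's linear-trend exponential smoothing: maintain a smoothed
    level and trend, then extrapolate linearly for 1..horizon steps."""
    level, trend = series[0], series[1] - series[0]
    for y in series[1:]:
        prev_level = level
        level = alpha * y + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return [level + h * trend for h in range(1, horizon + 1)]

# On a perfectly linear series the method tracks level and slope exactly,
# so the 4-quarter-ahead forecasts continue the line: 4.5, 5.0, 5.5, 6.0.
forecasts = holt_linear([2.0, 2.5, 3.0, 3.5, 4.0])
```

Full Holt-Winters adds a third (seasonal) smoothing equation; for quarterly inflation that would mean a seasonal period of four.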
Full Text Available This study proposes a computationally efficient solution to stream flow forecasting for river basins where historical time series data are available. Two data-driven modeling techniques are investigated, namely support vector regression...
Jan D. Keller
Full Text Available The quantitative forecast of precipitation requires a probabilistic background, particularly with regard to forecast lead times of more than 3 days. As only ensemble simulations can provide useful information on the underlying probability density function, we built a new ensemble forecasting system (GME-EFS) based on the GME model of the German Meteorological Service (DWD). For the generation of appropriate initial ensemble perturbations we chose the breeding technique developed by Toth and Kalnay (1993, 1997), which develops perturbations by estimating the regions of largest model-error-induced uncertainty. This method is applied and tested in the framework of quasi-operational forecasts for a three-month period in 2007. The performance of the resulting ensemble forecasts is compared to the operational ensemble prediction systems ECMWF EPS and NCEP GFS by means of ensemble spread of free-atmosphere parameters (geopotential and temperature) and ensemble skill of precipitation forecasting. This comparison indicates that the GME ensemble forecasting system (GME-EFS) provides reasonable forecasts with a spread skill score comparable to that of the NCEP GFS. An analysis with the continuous ranked probability score exhibits a lack of resolution for the GME forecasts compared to the operational ensembles. However, with significant enhancements during the 3-month test period, the first results of our work with the GME-EFS indicate possibilities for further development as well as the potential for later operational usage.
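The continuous ranked probability score used in such comparisons has a simple empirical estimator for a finite ensemble; a minimal sketch:

```python
def crps(ensemble, obs):
    """Empirical CRPS of an ensemble forecast against one observation:
    mean distance of members to the observation, minus half the mean
    pairwise distance between members (lower is better)."""
    m = len(ensemble)
    term1 = sum(abs(x - obs) for x in ensemble) / m
    term2 = sum(abs(x - y) for x in ensemble for y in ensemble) / (2 * m * m)
    return term1 - term2

sharp = crps([2.0, 2.0, 2.0], 2.0)   # perfect, fully confident ensemble: 0
spread = crps([1.0, 2.0, 3.0], 2.0)  # correct mean but spread: 2/9
```

For a single-member "ensemble" the CRPS reduces to the absolute error, which is why it is a natural generalization of deterministic scores to probabilistic forecasts.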
Wu, Jie; Wang, Jianzhou; Lu, Haiyan; Dong, Yao; Lu, Xiaoxiao
Highlights: ► The seasonal and trend items of the data series are forecasted separately. ► The seasonal item in the data series is verified by Kendall τ correlation testing. ► Different regression models are applied to trend item forecasting. ► We examine the superiority of the combined models by quartile value comparison. ► Paired-sample T tests are utilized to confirm the superiority of the combined models. - Abstract: For an energy-limited economic system, it is crucial to forecast load demand accurately. This paper is devoted to a 1-week-ahead daily load forecasting approach in which the load demand series is predicted by employing information from preceding days that are similar to the forecast day. As in many nonlinear systems, a seasonal item and a trend item coexist in load demand datasets. In this paper, the existence of the seasonal item in the load demand data series is first verified according to the Kendall τ correlation testing method. Then, in the belief that forecasting the seasonal item and the trend item separately would improve forecasting accuracy, hybrid models combining the seasonal exponential adjustment method (SEAM) with regression methods are proposed, where SEAM and the regression models are employed for seasonal and trend item forecasting, respectively. Comparisons of the quartile values as well as the mean absolute percentage error values demonstrate that this forecasting technique can significantly improve accuracy, even though eleven different models are applied to trend item forecasting. The superior performance of this separate forecasting technique is further confirmed by paired-sample T tests
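A minimal stand-in for this separate treatment of seasonal and trend items (multiplicative seasonal indices plus an ordinary least-squares trend, a simplification of the paper's SEAM-plus-regression hybrids):

```python
def seasonal_indices(series, period):
    """Multiplicative seasonal index per position in the cycle: the mean
    of observations at that position divided by the overall mean."""
    overall = sum(series) / len(series)
    return [(sum(series[p::period]) / len(series[p::period])) / overall
            for p in range(period)]

def forecast_hybrid(series, period, horizon):
    """Deseasonalize with the indices, fit an OLS trend line, then
    reseasonalize the extrapolated trend to produce forecasts."""
    idx = seasonal_indices(series, period)
    deseason = [y / idx[i % period] for i, y in enumerate(series)]
    n = len(deseason)
    xbar, ybar = (n - 1) / 2, sum(deseason) / n
    slope = ((sum(i * y for i, y in enumerate(deseason)) - n * xbar * ybar)
             / (sum(i * i for i in range(n)) - n * xbar ** 2))
    intercept = ybar - slope * xbar
    return [(intercept + slope * (n + h)) * idx[(n + h) % period]
            for h in range(horizon)]

# A flat, purely seasonal demo series (period 2): the next two forecasts
# simply reproduce the seasonal pattern, 12 then 8.
preds = forecast_hybrid([12, 8, 12, 8, 12, 8], period=2, horizon=2)
```

In the paper's setting the OLS line would be replaced by each of the eleven candidate trend regressions, with the seasonal adjustment step held fixed.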
Full Text Available A nonlinear forecasting method was used to predict the behavior of a cloud coverage time series several hours in advance. The method is based on the reconstruction of a chaotic strange attractor using four years of cloud absorption data obtained from half-hourly Meteosat infrared images from Northwestern Spain. An exhaustive nonlinear analysis of the time series was carried out to reconstruct the phase space of the underlying chaotic attractor. The forecast values are used by a non-hydrostatic meteorological model ARPS for daily weather prediction and their results compared with surface temperature measurements from a meteorological station and a vertical sounding. The effect of noise in the time series is analyzed in terms of the prediction results. Key words: Meteorology and atmospheric dynamics (mesoscale meteorology); general – General (new fields)
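Attractor-based forecasting of this kind rests on delay-coordinate embedding and nearest-neighbor prediction (the method of analogs). A minimal sketch on a noiseless periodic series, which such a method predicts exactly:

```python
def embed(series, dim, tau):
    """Delay-coordinate embedding: map the scalar series into
    dim-dimensional state vectors with delay tau."""
    start = (dim - 1) * tau
    return [tuple(series[i - j * tau] for j in range(dim))
            for i in range(start, len(series))]

def analog_forecast(series, dim=3, tau=1, steps=1):
    """Predict the next value(s) by finding the past state closest to
    the current one and following its successor."""
    out = list(series)
    for _ in range(steps):
        states = embed(out, dim, tau)
        current = states[-1]
        # Squared distance to every past state (exclude the current one).
        dists = [sum((a - b) ** 2 for a, b in zip(s, current))
                 for s in states[:-1]]
        k = dists.index(min(dists))
        # Append the successor of the nearest analog.
        out.append(out[(dim - 1) * tau + k + 1])
    return out[len(series):]

# A noiseless periodic "cloud cover" stand-in: the next two values of the
# repeating pattern 0,1,2,3,2,1 are recovered exactly.
series = [0, 1, 2, 3, 2, 1] * 10
pred = analog_forecast(series, dim=3, tau=1, steps=2)  # [0, 1]
```

Real applications choose the embedding dimension and delay from the data (e.g. false-nearest-neighbor and mutual-information analyses), and noise degrades the analogs, which is the sensitivity the paper examines.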
Baker, W. E.; Atlas, R.; Halem, M.; Susskind, J.
A series of experiments have been conducted to examine the sensitivity of forecast skill to various data and data analysis techniques for the 0000 GMT case of January 21, 1979. These include the individual components of the FGGE observing system, the temperatures obtained with different satellite retrieval methods, and the method of vertical interpolation between the mandatory pressure analysis levels and the model sigma levels. It is found that NESS TIROS-N infrared retrievals seriously degrade a rawinsonde-only analysis over land, resulting in a poorer forecast over North America. Less degradation in the 72-hr forecast skill at sea level and some improvement at 500 mb is noted, relative to the control with TIROS-N retrievals produced with a physical inversion method which utilizes a 6-hr forecast first guess. NESS VTPR oceanic retrievals lead to an improved forecast over North America when added to the control.
Kock, Anders Bredahl; Teräsvirta, Timo
In this work we consider forecasting macroeconomic variables during an economic crisis. The focus is on a specific class of models, the so-called single hidden-layer feedforward autoregressive neural network models. What makes these models interesting in the present context is that they form a cla… during the economic crisis 2007–2009. Forecast accuracy is measured by the root mean square forecast error. Hypothesis testing is also used to compare the performance of the different techniques with each other.
Nataliya N. Andrienko
Full Text Available The paper reviews a methodology for forecasting inefficient capital outflow in Ukraine, as one of the new instruments for reviving investment activity under systemic crisis conditions. An analogy is drawn between foreseeable and unexpected losses in crediting, and between efficient and inefficient capital outflows, in the form of reserve fund accrual and the subsequent reverse procedure. A phenomenological approach and a generalization of experience in negative investment analysis are applied. The substantiation of the phenomenological approach in choosing one of the proposed beta distribution options is presented, with an economic interpretation of this approach's development. The maximum entropy principle and the stochastic dominance revealed therein are also considered.
Danner, Travis W.
modeling technique begins to diminish. With the introduction of multiple objectives, researchers often abandon technology growth models for scoring models and technology frontiers. While both approaches possess advantages over current growth models for the assessment of multi-objective technologies, each lacks a necessary dimension for comprehensive technology assessment. By collapsing multiple system metrics into a single, non-intuitive technology measure, scoring models provide a succinct framework for multi-objective technology assessment and forecasting. Yet, with no consideration of physical limits, scoring models provide no insight as to the feasibility of a particular combination of system capabilities. They only indicate that a given combination of system capabilities yields a particular score. Conversely, technology frontiers are constructed with the distinct objective of providing insight into the feasibility of system capability combinations. Yet again, upper limits to overall system performance are ignored. Furthermore, the data required to forecast subsequent technology frontiers is often inhibitive. In an attempt to reincorporate the fundamental nature of technology advancement as bound by physical principles, researchers have sought to normalize multi-objective systems whereby the variability of a single system objective is eliminated as a result of changes in the remaining objectives. This drastically limits the applicability of the resulting technology model because it is only applicable for a single setting of all other system attributes. Attempts to maintain the interaction between the growth curves of each technical objective of a complex system have thus far been limited to qualitative and subjective consideration. This research proposes the formulation of multidimensional growth models as an approach to simulating the advancement of multi-objective technologies towards their upper limits. 
Multidimensional growth models were formulated by noticing and
Gilshannon, S.T.; Brown, D.R.
In 1993 the DOE Office of Energy Efficiency and Renewable Energy (EE) initiated a program called Quality Metrics. Quality Metrics was developed to measure the costs and benefits of technologies being developed by EE R&D programs. The impact of any new technology is directly related to its adoption by the market, so the techniques employed to project market adoption are critical to measuring a new technology's impact. Our purpose was to review current market penetration theories and models and develop a recommended approach for evaluating the market penetration of DOE technologies. The following commonly cited innovation diffusion theories were reviewed to identify analytical approaches relevant to new energy technologies: (1) the normal noncumulative adopter distribution method, (2) the Bass Model, (3) the Mansfield-Blackman Model, (4) the Fisher-Pry Model, and (5) a meta-analysis of innovation diffusion studies. Of the theories reviewed, the Bass and Mansfield-Blackman models were found most applicable to forecasting the market penetration of electricity supply technologies. Their algorithms require input estimates which characterize the technology adoption behavior of the electricity supply industry, but inadequate work has been done to quantify the technology adoption characteristics of this industry. The following energy technology market penetration models were also reviewed: (1) DOE's Renewable Energy Penetration (REP) Model, (2) DOE's Electricity Capacity Planning Submodule of the National Energy Modeling System (NEMS), (3) the Assessment of Energy Technologies (ASSET) model by Regional Economic Research, Inc., and (4) the Market TREK model by the Electric Power Research Institute (EPRI). The two DOE models were developed for electricity generation technologies, whereas the Regional Economic Research and EPRI models were designed for demand-side energy technology markets. Therefore, the review and evaluation focused on the DOE models
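The Bass model reviewed above has a closed-form cumulative-adoption curve; a minimal sketch with hypothetical coefficients (not values from the report):

```python
import math

def bass_adopters(p, q, m, t):
    """Cumulative adopters at time t under the Bass diffusion model with
    innovation coefficient p, imitation coefficient q, and market size m."""
    e = math.exp(-(p + q) * t)
    return m * (1 - e) / (1 + (q / p) * e)

# Hypothetical technology: p=0.03, q=0.4, potential market of 1000 units.
p, q, m = 0.03, 0.4, 1000
adopted_10yr = bass_adopters(p, q, m, 10)    # ≈ 835 of 1000 adopters
peak_time = math.log(q / p) / (p + q)        # time of peak adoption rate, ≈ 6
```

In practice p and q are estimated from sales histories of analogous technologies, which is exactly the adoption-behavior data the review found lacking for the electricity supply industry.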
Qin, Jiang-Lin; Yang, Xiu-Hao; Yang, Zhong-Wu; Luo, Ji-Tong; Lei, Xiu-Feng
Near-surface air temperature and rainfall are major weather factors affecting forest insect dynamics. Recent developments in remote sensing retrieval and geographic information system spatial analysis techniques enable the utilization of weather factors to significantly enhance forest pest forecasting and warning systems. The current study focused on building forest pest digital data structures as a platform for correlation analysis between weather conditions and forest pest dynamics, to support better pest forecasting and warning systems using the new technologies. The study dataset contained 3,353,425 small polygons with 174 defined attributes covering 95 counties of Guangxi province of China, currently registering 292 forest pest species. Field data acquisition and information transfer systems were established with four software licences, providing a 15-fold improvement compared to the systems currently used in China. Nine technical specifications were established, including codes of forest districts, pest species and host tree species, and standard practices of forest pest monitoring and information management. Attributes can easily be searched using ArcGIS 9.3 and/or the free QGIS 2.16 software. Small polygons with pest-relevant attributes are a new, technologically advanced tool for precision farming and detailed forest insect pest management. © 2017 Society of Chemical Industry.
Manoja Kumar Behera
Full Text Available Prediction of photovoltaic power is a significant research area using different forecasting techniques to mitigate the effects of the uncertainty of photovoltaic generation. An increasingly high penetration level of photovoltaic (PV) generation arises in the smart grid and microgrid concepts. The solar source is irregular in nature; as a result, PV power is intermittent and highly dependent on irradiance, temperature level and other atmospheric parameters. Large-scale photovoltaic generation and penetration into the conventional power system introduces significant challenges to microgrid and smart grid energy management. Accurate forecasting of solar power/irradiance is therefore critical to secure the economic operation of the microgrid and smart grid. In this paper an extreme learning machine (ELM) technique is used for PV power forecasting of a real-time model whose location is given in Table 1. Here the model is associated with the incremental conductance (IC) maximum power point tracking (MPPT) technique based on a proportional integral (PI) controller, which is simulated in MATLAB/SIMULINK software. To train the single-layer feed-forward network (SLFN), the ELM algorithm is implemented, whose weights are updated by different particle swarm optimization (PSO) techniques, and their performance is compared with existing models such as the back propagation (BP) forecasting model. Keywords: PV array, Extreme learning machine, Maximum power point tracking, Particle swarm optimization, Craziness particle swarm optimization, Accelerate particle swarm optimization, Single layer feed-forward network
Doos, Lucy; Packer, Claire; Ward, Derek; Simpson, Sue; Stevens, Andrew
To describe and classify health technologies predicted in forecasting studies. A portrait is given of health technologies predicted in 15 forecasting studies published between 1986 and 2010 that were identified in a previous systematic review. Health technologies are classified according to their type, purpose and clinical use, relating these to the original purpose and timing of the forecasting studies. All health-related technologies predicted in the 15 forecasting studies were considered. Outcomes related to (1) each forecasting study, including country, year, intention and forecasting methods used, and (2) the predicted technologies, including technology type, purpose, targeted clinical area and forecast timeframe. Of the 896 identified health-related technologies, 685 (76.5%) were health technologies with an explicit or implied health application and were included in our study. Of these, 19.1% were diagnostic or imaging tests, 14.3% devices or biomaterials, 12.6% information technology systems, eHealth or mHealth, and 12% drugs. The majority of the technologies were intended to treat or manage disease (38.1%) or diagnose or monitor disease (26.1%). The most frequent targeted clinical areas were infectious diseases, followed by cancer, circulatory and nervous system disorders. The most frequent technology types were: for infectious diseases, prophylactic vaccines (45.8%); for cancer, drugs (40%); for circulatory disease, devices and biomaterials (26.3%); and for diseases of the nervous system, equally devices and biomaterials (25%) and regenerative medicine (25%). The mean timeframe for forecasting was 11.6 years (range 0-33 years, median=10, SD=6.6). The forecasting timeframe differed significantly by technology type (p=0.002), the intent of the forecasting group (p<0.001) and the methods used (p<0.001). While description and classification of predicted health-related technologies is crucial in preparing healthcare systems for adopting new innovations
Buden, D.; Albert, T.
A new generation of reactors for electric power will be available for space missions to satisfy military and civilian needs in the 1990s and beyond. To ensure a useful product, nuclear power plant development must be cognizant of other space power technologies. Major advances in solar and chemical technologies need to be considered in establishing the goals of future nuclear power plants. In addition, the mission needs are evolving into new regimes. Civilian and military power needs are forecasted to exceed anything used in space to date. Technology trend forecasts have been mapped as a function of time for solar, nuclear, chemical, and storage systems to illustrate areas where each technology provides minimum mass. Other system characteristics may dominate the usefulness of a technology on a given mission. This paper will discuss some of these factors, as well as forecast future military and civilian power needs and the status of technologies for the 1990s and 2000s.
Falconer, David A; Moore, Ronald L; Barghouty, Abdulnasser F; Khazanov, Igor
MAG4 is a technique of forecasting an active region's rate of production of major flares in the coming few days from a free magnetic energy proxy. We present a statistical method of measuring the difference in performance between MAG4 and comparable alternative techniques that forecast an active region's major-flare productivity from alternative observed aspects of the active region. We demonstrate the method by measuring the difference in performance between the “Present MAG4” technique and each of three alternative techniques, called “McIntosh Active-Region Class,” “Total Magnetic Flux,” and “Next MAG4.” We do this by using (1) the MAG4 database of magnetograms and major flare histories of sunspot active regions, (2) the NOAA table of the major-flare productivity of each of 60 McIntosh active-region classes of sunspot active regions, and (3) five technique performance metrics (Heidke Skill Score, True Skill Score, Percent Correct, Probability of Detection, and False Alarm Rate) evaluated from 2000 random two-by-two contingency tables obtained from the databases. We find that (1) Present MAG4 far outperforms both McIntosh Active-Region Class and Total Magnetic Flux, (2) Next MAG4 significantly outperforms Present MAG4, (3) the performance of Next MAG4 is insensitive to the forward and backward temporal windows used, in the range of one to a few days, and (4) forecasting from the free-energy proxy in combination with either any broad category of McIntosh active-region classes or any Mount Wilson active-region class gives no significant performance improvement over forecasting from the free-energy proxy alone (Present MAG4). Key points: quantitative comparison of the performance of pairs of forecasting techniques; Next MAG4 forecasts major flares more accurately than Present MAG4; Present MAG4 outperforms the McIntosh Active-Region Class and total magnetic flux forecasts.
The theory and technological implementation of stochastic cooling is described. Theoretical and technological limitations are discussed. Data from existing stochastic cooling systems are shown to illustrate some useful techniques
Xu Chang-kai; Wang Yao-cai; Wang Jun-wei [CUMT, Xuzhou (China). School of Information and Electrical Engineering
Safety in coal mine production can be further improved by forecasting gas emission quantities from the real-time and historical data saved by the gas monitoring system. Exploiting the advantages of data warehouse and data mining technology for processing large quantities of redundant data, the method of forecasting mine gas emission quantity based on FDM, and its application, were studied. The construction of a fuzzy resemblance relation and clustering analysis were proposed, through which the potential relationships inside the gas emission data may be found. A pattern-finding model and a forecast model were presented, together with a detailed approach to realizing the forecast, and these have been applied to forecast gas emission quantities efficiently.
Amjady, Nima; Keynia, Farshid
With the introduction of restructuring into the electric power industry, the price of electricity has become the focus of all activities in the power market. Electricity price forecasts are key information for electricity market managers and participants. However, the electricity price is a complex signal due to its non-linear, non-stationary, and time-variant behavior. Despite the research performed in this area, more accurate and robust price forecast methods are still required. In this paper, a new forecast strategy is proposed for day-ahead price forecasting of electricity markets. Our forecast strategy is composed of a new two-stage feature selection technique and cascaded neural networks. The proposed feature selection technique comprises a modified Relief algorithm for the first stage and correlation analysis for the second stage. The modified Relief algorithm selects candidate inputs with maximum relevancy to the target variable. Then, among the selected candidates, the correlation analysis eliminates redundant inputs. The features selected by the two-stage technique are used for the forecast engine, which is composed of 24 consecutive forecasters. Each of these 24 forecasters is a neural network allocated to predict the price of one hour of the next day. The whole proposed forecast strategy is examined on the Spanish market and on data from Australia's National Electricity Market Management Company (NEMMCO), and compared with some of the most recent price forecast methods.
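The two-stage relevance-then-redundancy pattern described above can be sketched in a few lines. This is an illustrative stand-in, not the paper's algorithm: plain absolute correlation replaces the modified Relief relevance score, and the candidate count and redundancy threshold are arbitrary.

```python
import numpy as np

def two_stage_select(X, y, n_candidates=8, redundancy_cap=0.9):
    """Stage 1: rank features by |correlation| with the target
    (a simple stand-in for a Relief-style relevance score).
    Stage 2: greedily drop candidates that are highly correlated
    with an already-selected feature (redundancy filtering)."""
    relevance = np.array([abs(np.corrcoef(X[:, j], y)[0, 1])
                          for j in range(X.shape[1])])
    candidates = np.argsort(relevance)[::-1][:n_candidates]
    selected = []
    for j in candidates:
        if all(abs(np.corrcoef(X[:, j], X[:, k])[0, 1]) < redundancy_cap
               for k in selected):
            selected.append(j)
    return selected

# toy example: feature 0 drives y; feature 1 is a redundant copy of it
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
X[:, 1] = X[:, 0] + 0.01 * rng.normal(size=200)   # redundant feature
y = 2.0 * X[:, 0] + 0.1 * rng.normal(size=200)
print(two_stage_select(X, y))
```

Only one of the two duplicated features survives the second stage, while low-relevance but non-redundant features may still be kept.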
Srivastava, Gaurav; Panda, Sudhindra N.; Mondal, Pratap; Liu, Junguo
Forecasting of rainfall is imperative for the rainfed agriculture of arid and semi-arid regions of the world, where agriculture consumes nearly 80% of the total water demand. A Fuzzy-Ranking Algorithm (FRA) is used to identify the significant input variables for the rainfall forecast. A case study is carried out to forecast monthly rainfall in India with several ocean-atmospheric predictor variables. Three different scenarios of ocean-atmospheric predictor variables are used as sets of possible input variables for the rainfall forecasting model: (1) two climate indices, i.e. the Southern Oscillation Index (SOI) and the Pacific Decadal Oscillation Index (PDOI); (2) sea surface temperature anomalies (SSTa) at the 5° × 5° grid points in the Indian Ocean; and (3) both the climate indices and SSTa. To generate a set of possible input variables for these scenarios, we use the climate indices and the SSTa data with lags of between 1 and 12 months. The nonlinear relationship between the identified inputs and rainfall is captured with an Artificial Neural Network (ANN) technique. A new approach based on fuzzy c-means clustering is proposed for dividing data into representative subsets for training, testing, and validation. The results show that this proposed approach overcomes the difficulty in determining the optimal number of clusters associated with the data division technique of the self-organized map. The ANN model developed with both the climate indices and SSTa shows the best performance for the forecast of monthly August rainfall in India. A similar approach can be applied to forecast rainfall of any period at selected climatic regions of the world where a significant relationship exists between the rainfall and climate indices.
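The fuzzy c-means data-division idea can be sketched as follows. This is a minimal illustration, not the authors' implementation: the membership update is the standard FCM one, and the 60/20/20 split fractions are assumed.

```python
import numpy as np

def fuzzy_cmeans(X, c=3, m=2.0, iters=50, seed=0):
    """Minimal fuzzy c-means: returns cluster centres and the
    fuzzy membership matrix U (n_samples x c)."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(iters):
        W = U ** m
        centres = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) + 1e-12
        p = 2.0 / (m - 1.0)
        U = 1.0 / (d ** p * np.sum(d ** (-p), axis=1, keepdims=True))
    return centres, U

def representative_split(X, c=3, frac=(0.6, 0.2, 0.2), seed=0):
    """Divide data into train/test/validation subsets that each draw
    proportionally from every fuzzy cluster (the representative-subset
    idea described in the abstract; the details here are illustrative)."""
    _, U = fuzzy_cmeans(X, c=c, seed=seed)
    labels = U.argmax(axis=1)
    rng = np.random.default_rng(seed)
    splits = [[], [], []]
    for k in range(c):
        idx = rng.permutation(np.where(labels == k)[0])
        n = len(idx)
        cut1, cut2 = int(frac[0] * n), int((frac[0] + frac[1]) * n)
        splits[0] += list(idx[:cut1])
        splits[1] += list(idx[cut1:cut2])
        splits[2] += list(idx[cut2:])
    return splits

# two synthetic clusters; all three subsets sample from both of them
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (60, 2)), rng.normal(6, 1, (60, 2))])
splits = representative_split(X, c=2)
print([len(s) for s in splits])
```

Because each cluster is split separately, every subset inherits the overall cluster proportions rather than depending on a single random shuffle.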
Gafurov, O.; Gafurov, D.; Syryamkin, V.
The paper analyses a field of computer science formed at the intersection of such areas of natural science as artificial intelligence, mathematical statistics, and database theory, which is referred to as "Data Mining" (the discovery of knowledge in data). The theory of neural networks is applied along with classical methods of mathematical analysis and numerical simulation. The paper describes the technique protected by a patent of the Russian Federation for the invention “A Method for Determining Location of Production Wells during the Development of Hydrocarbon Fields” [1–3] and implemented using the geoinformation system NeuroInformGeo; it has no analogues in domestic or international practice. The paper gives an example comparing the forecast of oil reservoir quality made by a geophysicist interpreter using standard methods with the forecast made using this technology. The technical result achieved is increased efficiency, effectiveness, and ecological compatibility in the development of mineral deposits, as well as the discovery of a new oil deposit.
This handbook equips both radiologists and radiologists in training with a thorough working knowledge of the mechanisms and processes of computed tomography (CT) image generation, the common causes of image artifacts, and useful examination protocols for each area of the body. The author explains the fundamental technological principles of CT, focusing on those concepts crucial to successful CT examinations. The first part of the book succinctly reviews the fundamentals of CT technology. It begins with a methodical introduction to key principles of X-ray physics and technology, in which topics such as the modulation transfer function, magnification, and the X-ray tube are discussed in understandable, nonmathematical terms. The author then explains the basic technology of CT scanners, the principles of scan projection radiography, and the essential rules for radiation dosage determination and radiation protection. Careful attention is given to selectable scan factors in both routine and dynamic scanning, as well as to the processes involved in image creation and refinement and the chief determinants of image quality. Basic and specialized program features and the technology of image display, recording, and storage are also thoroughly described
Martin, Sergio; Diaz, Gabriel; Sancristobal, Elio; Gil, Rosario; Castro, Manuel; Peire, Juan
Each year since 2004, a new Horizon Report has been released. Each edition attempts to forecast the most promising technologies likely to impact on education along three horizons: the short term (the year of the report), the mid-term (the next 2 years) and the long term (the next 4 years). This paper analyzes the evolution of technology trends…
N. I. Komkov
The article considers the laws of scientific and technological development, which include traditional, basic, and newly forming laws. Possibilities and ways of taking these laws into account when forecasting the prospects of scientific and technological development are shown.
Guo, Zhenhai; Chi, Dezhong; Wu, Jie; Zhang, Wenyu
Highlights: • Impact of meteorological factors on wind speed forecasting is taken into account. • Forecasted wind speed results are corrected by the association rules. • Forecasting accuracy is improved by the new wind speed forecasting strategy. • Robustness of the proposed model is validated by data sampled from different sites. - Abstract: Wind energy has been the fastest growing renewable energy resource in recent years. Because of the intermittent nature of wind, wind power is a fluctuating source of electrical energy. Therefore, to minimize the impact of wind power on the electrical grid, accurate and reliable wind power forecasting is mandatory. In this paper, a new wind speed forecasting approach based on the chaotic time series modelling technique and the Apriori algorithm has been developed. The new approach consists of four procedures: (I) clustering by using the k-means clustering approach; (II) employing the Apriori algorithm to discover the association rules; (III) forecasting the wind speed according to the chaotic time series forecasting model; and (IV) correcting the forecasted wind speed data using the association rules discovered previously. This procedure has been verified by 31-day-ahead daily average wind speed forecasting case studies, which employed the wind speed and other meteorological data collected from four meteorological stations located in the Hexi Corridor area of China. The results of these case studies reveal that the chaotic forecasting model can efficiently improve the accuracy of the wind speed forecasting, and the Apriori algorithm can effectively discover the association rules between the wind speed and other meteorological factors. In addition, the correction results demonstrate that the association rules discovered by the Apriori algorithm have powerful capacities in correcting the forecasted wind speed values when the forecasted values do not match the classification discovered by the association rules
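Step (II) of the procedure above, mining association rules with Apriori, can be illustrated with a minimal implementation restricted to 1- and 2-itemsets; the discretized weather states and the support/confidence thresholds below are invented for the example.

```python
from itertools import combinations
from collections import Counter

def apriori_rules(transactions, min_support=0.3, min_conf=0.7):
    """Minimal Apriori: find frequent 1- and 2-itemsets, then emit
    rules A -> B as (antecedent, consequent, support, confidence)."""
    n = len(transactions)
    item_counts = Counter(i for t in transactions for i in set(t))
    frequent = {i for i, c in item_counts.items() if c / n >= min_support}
    pair_counts = Counter()
    for t in transactions:
        for a, b in combinations(sorted(set(t) & frequent), 2):
            pair_counts[(a, b)] += 1
    rules = []
    for (a, b), c in pair_counts.items():
        if c / n >= min_support:
            for x, y in [(a, b), (b, a)]:
                conf = c / item_counts[x]
                if conf >= min_conf:
                    rules.append((x, y, round(c / n, 2), round(conf, 2)))
    return rules

# toy transactions: discretized (wind, temperature, pressure) states
data = [
    ["wind_high", "temp_low", "press_low"],
    ["wind_high", "temp_low"],
    ["wind_high", "temp_low", "press_high"],
    ["wind_low", "temp_high"],
    ["wind_low", "temp_high", "press_high"],
]
for rule in apriori_rules(data):
    print(rule)
```

A correction step in the spirit of step (IV) would then check whether a forecasted wind-speed class is consistent with the rules implied by the other observed meteorological states.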
Omotola Awojobi; Glenn P. Jenkins
Hydropower investments have been subject to intense criticism over environmental issues and the common experience with cost uncertainty. In this study we address the issue of uncertainty in cost projections by applying reference class forecasting (RCF) in order to improve the reliability of costs used for making decisions under uncertainty. This technique makes it possible to closely link contingency estimates to the likely incidence of uncertainty of construction costs for hydroelectric dams...
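In its simplest form, reference class forecasting reduces to reading an uplift off the empirical distribution of past cost-overrun ratios; the reference-class numbers below are hypothetical.

```python
import numpy as np

def rcf_uplift(overrun_ratios, acceptable_risk=0.2):
    """Reference class forecasting in its simplest form: given the
    empirical distribution of actual/estimated cost ratios from past
    comparable projects, return the uplift to apply to a new estimate
    so that the chance of overrunning it is at most `acceptable_risk`."""
    return np.quantile(overrun_ratios, 1.0 - acceptable_risk)

# hypothetical reference class of past dam cost-overrun ratios
past_ratios = [0.9, 1.0, 1.1, 1.2, 1.3, 1.5, 1.8, 2.2]
uplift = rcf_uplift(past_ratios, acceptable_risk=0.2)
print(f"multiply base estimate by {uplift:.2f}")
```

The decision-maker chooses the acceptable overrun risk; a lower risk tolerance pushes the uplift further into the tail of the historical distribution.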
Ashrafi, S.; Roszman, L.; Cooley, J.
This paper presents numerical techniques for constructing nonlinear predictive models to forecast solar flux directly from its time series. This approach makes it possible to extract dynamical invariants of the system without reference to any underlying solar physics. We consider the dynamical evolution of solar activity in a reconstructed phase space that captures the (strange) attractor, give a procedure for constructing a predictor of future solar activity, and discuss the extraction of dynamical invariants such as Lyapunov exponents and the attractor dimension.
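The core of such a model-free predictor, time-delay embedding of the series followed by an analogue (nearest-neighbour) forecast in the reconstructed phase space, can be sketched as follows; the embedding parameters and the synthetic "activity cycle" are illustrative, not taken from the paper.

```python
import numpy as np

def delay_embed(x, dim=3, tau=1):
    """Reconstruct a phase space from a scalar series by time-delay
    embedding: each state is (x_t, x_{t+tau}, ..., x_{t+(dim-1)tau})."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

def analogue_forecast(x, dim=3, tau=1):
    """Predict the next value by finding the nearest past state in the
    reconstructed phase space and reusing its successor (a minimal
    nonlinear predictor in the spirit of the paper's approach)."""
    E = delay_embed(x, dim, tau)
    current, history = E[-1], E[:-1]
    nearest = np.argmin(np.linalg.norm(history - current, axis=1))
    return x[nearest + (dim - 1) * tau + 1]

# noiseless periodic series: the analogue forecast tracks the cycle
t = np.arange(200)
x = np.sin(2 * np.pi * t / 27)          # 27-step "activity cycle"
print(analogue_forecast(x, dim=4, tau=2))
```

On a clean periodic signal the nearest past state sits exactly one cycle back, so the forecast reproduces the next phase of the oscillation; real solar flux data would need noise handling and a careful choice of embedding dimension and delay.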
Fourier series and statistical communication theory techniques are utilized in the estimation of river water temperature increases caused by external thermal inputs. An example estimate assuming a constant thermal input is demonstrated. A regression fit of the Fourier series approximation of temperature is then used to forecast daily average water temperatures. Also, a 60-day prediction of daily average water temperature is made with the aid of the Fourier regression fit by using significant Fourier components
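A Fourier-series regression fit of the kind described can be sketched with ordinary least squares on sine/cosine regressors; the synthetic temperature series, the annual period, and the number of harmonics are assumptions for the example.

```python
import numpy as np

def fit_fourier(day, temp, harmonics=2, period=365.25):
    """Least-squares fit of a truncated Fourier series
    T(d) = a0 + sum_k [a_k cos(2*pi*k*d/P) + b_k sin(2*pi*k*d/P)]."""
    def design(d):
        cols = [np.ones_like(d, dtype=float)]
        for k in range(1, harmonics + 1):
            w = 2 * np.pi * k * d / period
            cols += [np.cos(w), np.sin(w)]
        return np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(design(day), temp, rcond=None)
    return lambda d: design(d) @ coef

# synthetic daily water temperature with an annual cycle plus noise
rng = np.random.default_rng(1)
days = np.arange(730)
temp = 12 + 8 * np.sin(2 * np.pi * days / 365.25) + rng.normal(0, 0.5, 730)
model = fit_fourier(days, temp)
print(model(np.array([731.0, 760.0])))   # forward forecast of daily means
```

Because the fitted series is deterministic in the day index, the regression can be evaluated at future days directly, which is how a 60-day prediction of daily averages would be produced.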
Folmer, Michael; Halverson, Jeffrey; Berndt, Emily; Dunion, Jason; Goodman, Steve; Goldberg, Mitch
The Geostationary Operational Environmental Satellites R-Series (GOES-R) and Joint Polar Satellite System (JPSS) Satellite Proving Grounds have introduced multiple proxy and operational products into operations over the last few years. Some of these products have proven to be useful in current operations at various National Weather Service (NWS) offices and national centers as a first look at future satellite capabilities. Forecasters at the National Hurricane Center (NHC), Ocean Prediction Center (OPC), NESDIS Satellite Analysis Branch (SAB) and the NASA Hurricane and Severe Storms Sentinel (HS3) field campaign have had access to a few of these products to assist in monitoring extratropical transitions of hurricanes. The red, green, blue (RGB) Air Mass product provides forecasters with an enhanced view of various air masses in one complete image to help differentiate between possible stratospheric/tropospheric interactions, moist tropical air masses, and cool, continental/maritime air masses. As a complement to this product, a new Atmospheric Infrared Sounder (AIRS) and Cross-track Infrared Sounder (CrIS) Ozone product was introduced in the past year to assist in diagnosing the dry air intrusions seen in the RGB Air Mass product. Finally, a lightning density product was introduced to forecasters as a precursor to the new Geostationary Lightning Mapper (GLM) that will be housed on GOES-R, to monitor the most active regions of convection, which might indicate a disruption in the tropical environment and even signal the onset of extratropical transition. This presentation will focus on a few case studies that exhibit extratropical transition and point out the usefulness of these new satellite techniques in helping forecasters predict these challenging events.
Kock, Anders Bredahl; Teräsvirta, Timo
In this work we consider the forecasting of macroeconomic variables during an economic crisis. The focus is on a specific class of models, the so-called single hidden-layer feed-forward autoregressive neural network models. What makes these models interesting in the present context is the fact that they form a class of universal approximators and may be expected to work well during exceptional periods such as major economic crises. Neural network models are often difficult to estimate, and we follow the idea of White (2006) of transforming the specification and nonlinear estimation problem… The performances of these three model selectors are compared by looking at the accuracy of the forecasts of the estimated neural network models. We apply the neural network model and the three modelling techniques to monthly industrial production and unemployment series from the G7 countries and the four…
Sfetsos, A. [7 Pirsou Str., Athens (Greece); Coonick, A.H. [Imperial Coll. of Science Technology and Medicine, Dept. of Electrical and Electronic Engineering, London (United Kingdom)
This paper introduces a new approach for the forecasting of mean hourly global solar radiation received by a horizontal surface. In addition to the traditional linear methods, several artificial-intelligence-based techniques are studied. These include linear, feed-forward, recurrent Elman and Radial Basis neural networks alongside the adaptive neuro-fuzzy inference scheme. The problem is examined initially for the univariate case, and is extended to include additional meteorological parameters in the process of estimating the optimum model. The results indicate that the developed artificial intelligence models predict the solar radiation time series more effectively compared to the conventional procedures based on the clearness index. The forecasting ability of some models can be further enhanced with the use of additional meteorological parameters.
Banik, Shipra; Khodadad Khan, A F M; Anwer, Mohammad
Forecasting the stock market has been a difficult job for applied researchers owing to the noisy and time-varying nature of the data. Nevertheless, a number of empirical studies have shown that machine learning techniques can be applied effectively to forecast stock markets. This paper studies stock prediction for the use of investors, who often incur losses because of uncertain investment objectives and blind selection of assets. It proposes a rough set model, a neural network model, and a hybrid neural network and rough set model to find the optimal times to buy and sell a share on the Dhaka Stock Exchange. Experimental findings demonstrate that the proposed hybrid model has higher precision than the single rough set model and the neural network model. We believe these findings will help stock investors decide on optimal buy and/or sell times on the Dhaka Stock Exchange.
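The rough set side of such a hybrid can be illustrated by the two basic constructions of the theory, the lower and upper approximations of a decision class; the trading example below is hypothetical.

```python
def rough_approximations(equiv_classes, target):
    """Lower and upper approximations of a target set in rough set
    theory: the lower approximation collects equivalence classes that
    lie fully inside the target; the upper approximation collects
    those that intersect it at all."""
    target = set(target)
    lower, upper = set(), set()
    for cls in equiv_classes:
        cls = set(cls)
        if cls <= target:       # certainly in the target
            lower |= cls
        if cls & target:        # possibly in the target
            upper |= cls
    return lower, upper

# days grouped into classes that are indiscernible on the observed
# (price-trend, volume) attributes; the target is days labelled "buy"
classes = [{1, 2}, {3, 4}, {5}]
buy_days = {1, 2, 3}
low, up = rough_approximations(classes, buy_days)
print(low, up)
```

Days in the lower approximation are certain buys given the attributes, days in the boundary (upper minus lower) are the undecidable cases, and in a hybrid scheme those boundary cases are the natural candidates to pass to the neural network.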
Doos, Lucy; Packer, Claire; Ward, Derek; Simpson, Sue; Stevens, Andrew
Forecasting can support rational decision-making around the introduction and use of emerging health technologies and prevent investment in technologies that have limited long-term potential. However, forecasting methods need to be credible. We performed a systematic search to identify the methods used in forecasting studies to predict future health technologies within a 3-20-year timeframe. Identification and retrospective assessment of such methods potentially offer a route to more reliable prediction. Systematic search of the literature to identify studies reporting on methods of forecasting in healthcare. No human participants were involved in this study. The authors searched MEDLINE, EMBASE, PsychINFO and grey literature sources, and included articles published in English that reported their methods and a list of identified technologies. Studies reporting methods used to predict future health technologies within a 3-20-year timeframe with an identified list of individual healthcare technologies were included. Commercially sponsored reviews, long-term futurology studies (with over 20-year timeframes) and speculative editorials were excluded. 15 studies met our inclusion criteria. Our results showed that the majority of studies (13/15) consulted experts either alone or in combination with other methods such as literature searching. Only 2 studies used more complex forecasting tools such as scenario building. The methodological fundamentals of formal 3-20-year prediction are consistent but vary in details. Further research needs to be conducted to ascertain whether the predictions made were accurate and whether accuracy varies by the methods used or by the types of technologies identified.
Azimi, R.; Ghayekhloo, M.; Ghofrani, M.
Highlights: • A novel clustering approach is proposed based on the data transformation approach. • A novel cluster selection method based on correlation analysis is presented. • The proposed hybrid clustering approach leads to deep learning for MLPNN. • A hybrid forecasting method is developed to predict solar radiations. • The evaluation results show superior performance of the proposed forecasting model. - Abstract: Accurate forecasting of renewable energy sources plays a key role in their integration into the grid. This paper proposes a hybrid solar irradiance forecasting framework using a Transformation based K-means algorithm, named TB K-means, to increase the forecast accuracy. The proposed clustering method is a combination of a new initialization technique, K-means algorithm and a new gradual data transformation approach. Unlike the other K-means based clustering methods which are not capable of providing a fixed and definitive answer due to the selection of different cluster centroids for each run, the proposed clustering provides constant results for different runs of the algorithm. The proposed clustering is combined with a time-series analysis, a novel cluster selection algorithm and a multilayer perceptron neural network (MLPNN) to develop the hybrid solar radiation forecasting method for different time horizons (1 h ahead, 2 h ahead, …, 48 h ahead). The performance of the proposed TB K-means clustering is evaluated using several different datasets and compared with different variants of K-means algorithm. Solar datasets with different solar radiation characteristics are also used to determine the accuracy and processing speed of the developed forecasting method with the proposed TB K-means and other clustering techniques. The results of direct comparison with other well-established forecasting models demonstrate the superior performance of the proposed hybrid forecasting method. Furthermore, a comparative analysis with the benchmark solar
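The determinism property highlighted above, identical clusters on every run, comes from removing randomness in the initialisation. A minimal sketch (using a farthest-point rule rather than the paper's transformation-based initialisation, which is not reproduced here):

```python
import numpy as np

def deterministic_kmeans(X, k=3, iters=100):
    """K-means with a deterministic farthest-point initialisation:
    no random seed is involved, so repeated runs return identical
    clusters (the property TB K-means is designed to provide; this
    particular initialisation is illustrative, not the paper's)."""
    centres = [X[np.argmin(X.sum(axis=1))]]          # fixed starting point
    for _ in range(1, k):
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centres], axis=0)
        centres.append(X[np.argmax(d)])              # farthest remaining point
    centres = np.array(centres)
    for _ in range(iters):
        labels = np.argmin(
            np.linalg.norm(X[:, None] - centres[None], axis=2), axis=1)
        new = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new, centres):
            break
        centres = new
    return labels, centres

# three well-separated synthetic clusters
rng = np.random.default_rng(4)
X = np.vstack([rng.normal(c, 0.5, (50, 2)) for c in ((0, 0), (8, 0), (0, 8))])
labels, centres = deterministic_kmeans(X, k=3)
print(np.bincount(labels))
```

Standard k-means with random initialisation can assign different centroids (and therefore different clusters) on each run; a deterministic rule removes that variability, which matters when the clusters feed a downstream forecasting model.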
The Partitive Analytical Forecasting (PAF) technique is applied to the overall long-term program plans for the Division of Controlled Thermonuclear Research (DCTR) of the United States Energy Research and Development Administration (ERDA). As part of the PAF technique, the Graphical Evaluation and Review Technique (GERTS) IIIZ computer code is used to perform simulations on a logic network describing the DCTR long-term program plan. Logic networks describing the tokamak, mirror, and theta-pinch developments are simulated individually and then together to form an overall DCTR program network. The results of the simulation of the overall network using various funding schemes and strategies are presented. An economic sensitivity analysis is provided for the tokamak logic networks. An analysis is also performed of the fusion-fission hybrid concept in the context of the present DCTR goals. The results mentioned above as well as the PAF technique itself are evaluated, and recommendations for further research are discussed
An outline is given of the mission objectives and requirements, system elements, system concepts, technology requirements and forecasting, and priority analysis for LANDSAT D. User requirements and mission analysis and technological forecasting are emphasized. Mission areas considered include agriculture, range management, forestry, geology, land use, water resources, environmental quality, and disaster assessment.
Cassel, T.A.V.; Shimamoto, G.T.; Amundsen, C.B.; Blair, P.D.; Finan, W.F.; Smith, M.R.; Edeistein, R.H.
The background, structure and use of modern forecasting methods for estimating the future development of geothermal energy in the United States are documented. The forecasting instrument may be divided into two sequential submodels. The first predicts the timing and quality of future geothermal resource discoveries from an underlying resource base. This resource base represents an expansion of the widely publicized USGS Circular 790. The second submodel forecasts the rate and extent of utilization of geothermal resource discoveries. It is based on the joint investment behavior of resource developers and potential users as statistically determined from extensive industry interviews. It is concluded that geothermal resource development, especially for electric power development, will play an increasingly significant role in meeting US energy demands over the next two decades. Depending on the extent of R and D achievements in related areas of geosciences and technology, expected geothermal power development will reach between 7700 and 17300 MWe by the year 2000. This represents between 8 and 18% of the expected electric energy demand (GWh) in western and northwestern states.
Fernandes, José Antonio
The effect of different factors (spawning biomass, environmental conditions) on recruitment is a subject of great importance in the management of fisheries, recovery plans and scenario exploration. In this study, recently proposed supervised classification techniques, tested by the machine-learning community, are applied to forecast the recruitment of seven fish species of the North East Atlantic (anchovy, sardine, mackerel, horse mackerel, hake, blue whiting and albacore), using spawning, environmental and climatic data. In addition, the use of the probabilistic flexible naive Bayes classifier (FNBC) is proposed as a modelling approach in order to reduce uncertainty for fisheries management purposes. These improvements aim to improve the probability estimates of each possible outcome (low, medium and high recruitment) based on kernel density estimation, which is crucial for informed management decision-making under high uncertainty. Finally, a comparison between goodness-of-fit and generalization power is provided, in order to assess the reliability of the final forecasting models. It is found that in most cases the proposed methodology provides useful information for management, whereas the case of horse mackerel is an example of the limitations of the approach. The proposed improvements allow for a better probabilistic estimation of the different scenarios, i.e. they reduce the uncertainty in the provided forecasts.
C. W. Dawson
While engineers have been quantifying rainfall-runoff processes since the mid-19th century, it is only in the last decade that artificial neural network models have been applied to the same task. This paper evaluates two neural networks in this context: the popular multilayer perceptron (MLP) and the radial basis function network (RBF). Using six-hourly rainfall-runoff data for the River Yangtze at Yichang (upstream of the Three Gorges Dam) for the period 1991 to 1993, it is shown that both neural network types can simulate river flows beyond the range of the training set. In addition, an evaluation of alternative RBF transfer functions demonstrates that the popular Gaussian function, often used in RBF networks, is not necessarily the ‘best’ function to use for river flow forecasting. Comparisons are also made between these neural networks and conventional statistical techniques: stepwise multiple linear regression, autoregressive moving average models and a zero-order forecasting approach. Keywords: Artificial neural network, multilayer perceptron, radial basis function, flood forecasting
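An RBF network of the kind evaluated here is linear in its output weights once the centres and widths are fixed, so it can be trained by ordinary least squares; the rainfall-runoff mapping below is synthetic, and the fixed centres and width are assumptions of the sketch.

```python
import numpy as np

def train_rbf(X, y, centres, width=1.0):
    """Fit an RBF network with Gaussian basis functions: hidden
    activations are exp(-||x - c||^2 / (2*width^2)) and the output
    layer weights are obtained by linear least squares."""
    def design(A):
        d2 = ((A[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
        Phi = np.exp(-d2 / (2 * width ** 2))
        return np.column_stack([Phi, np.ones(len(A))])   # + bias term
    w, *_ = np.linalg.lstsq(design(X), y, rcond=None)
    return lambda A: design(A) @ w

# toy rainfall-runoff style mapping: flow as a smooth function of rain
rng = np.random.default_rng(2)
rain = rng.uniform(0, 10, size=(200, 1))
flow = np.tanh(rain[:, 0] - 5) + 0.05 * rng.normal(size=200)
centres = np.linspace(0, 10, 8)[:, None]     # fixed Gaussian centres
model = train_rbf(rain, flow, centres, width=1.5)
print(model(np.array([[2.0], [8.0]])))
```

The choice of transfer function the paper investigates corresponds to swapping the Gaussian in `design` for another radial function (e.g. multiquadric), leaving the least-squares output fit unchanged.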
Li, Ziwei; Liu, Yutong; Cao, Hongjie
In order to monitor regions of China where disastrous floods happen frequently, satisfy the great need of provincial governments for high-accuracy monitoring and evaluation data for disasters, and improve the efficiency of disaster relief, a method for flood forecasting and evaluation using satellite and aerial remotely sensed imagery and ground monitoring data was researched under the Ninth Five-Year National Key Technologies Programme. An effective and practicable flood forecasting and evaluation system was established, with DongTing Lake selected as the test site. Modern digital photogrammetry, remote sensing and GIS technology were used in this system; disastrous floods can be forecasted and losses evaluated based on a '4D' disaster background database (DEM: Digital Elevation Model; DOQ: Digital Orthophoto Quads; DRG: Digital Raster Graph; DTI: Digital Thematic Information). The technology for gathering and establishing the '4D' disaster environment background database, the application technology for flood forecasting and evaluation based on the '4D' background data, and experimental results for the DongTing Lake test site are introduced in detail in this paper.
Shayeghi, H.; Ghasemi, A.
Highlights: • Presenting a hybrid CGSA-LSSVM scheme for price forecasting. • Considering uncertainties in filtering the input data and in feature selection to improve efficiency. • Using a DWT-input-featured LSSVM approach to classify next-week prices. • Using three real markets to illustrate the performance of the proposed price forecasting model. - Abstract: At the present time, the day-ahead electricity market is closely associated with other commodity markets such as the fuel market and the emission market. Under such an environment, day-ahead electricity price forecasting has become necessary for power producers and consumers in the current deregulated electricity markets. Seeking more accurate price forecasting techniques, this paper proposes a new combination of a mutual information (MI) based Feature Selection (FS) technique and the Wavelet Transform (WT). Moreover, a new modified version of the Gravitational Search Algorithm (GSA) based on chaos theory, namely the Chaotic Gravitational Search Algorithm (CGSA), is developed to find the optimal parameters of a Least Squares Support Vector Machine (LSSVM) to predict electricity prices. The performance and price forecast accuracy of the proposed technique are assessed by means of real data from Iran's, Ontario's and Spain's price markets. The simulation results from numerical tables and figures in different cases show that the proposed technique increases electricity price market forecasting accuracy compared with other classical and heuristic methods reported in the literature.
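The mutual-information criterion underlying MI-based feature selection can be estimated from a joint histogram; the sketch below ranks two hypothetical inputs for a price target (the data and bin count are invented for the example).

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Histogram estimate of mutual information I(X;Y) in nats,
    the quantity an MI-based feature selection ranks inputs by."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()                      # joint probabilities
    px = pxy.sum(axis=1, keepdims=True)   # marginal of X
    py = pxy.sum(axis=0, keepdims=True)   # marginal of Y
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(3)
load = rng.normal(size=5000)
price = load + 0.3 * rng.normal(size=5000)   # informative feature
noise = rng.normal(size=5000)                # irrelevant feature
print(mutual_information(load, price), mutual_information(noise, price))
```

Unlike plain correlation, MI also captures nonlinear dependence, which is one reason it is favoured for selecting inputs to nonlinear forecasters such as the LSSVM used here.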
…occurred within the life sciences disciplines. Most notably this occurred early on in 1953 via the discovery of DNA's double helix structure by Watson and Crick. A confluence of organic chemistry, physics, genomics, and information technology further provided the ability to amplify and replicate the…
Ettrich, Rüdiger; Thanos, Dimitris; Butcher, Sarah; Kotrcova, Marcela; Stanford, Natalie; Goble, Carole; Oberthuer, Angela; Hoefer, Thomas
This second continuous technology report is the outcome of the joint effort of the WP9 members and the Technology and Science Watch committee appointed by the steering committee in April 2014. While the first report was designed to serve as a guide for building up the infrastructure, this second report takes into account the recommendations of the SAB, which sees a possible role for the future infrastructure in fostering systems biology research by using existing experimental facilit...
Lage, A.; Taboada, J. J.
Precipitation is the most obvious of the weather elements in its effects on normal life. Numerical weather prediction (NWP) is generally used to produce quantitative precipitation forecasts (QPF) beyond the 1-3 h time frame. These models often fail to predict small-scale variations of rain because of spin-up problems and their coarse spatial and temporal resolution (Antolik, 2000). Moreover, there are some uncertainties about the behaviour of the NWP models in extreme situations (de Bruijn and Brandsma, 2000). Hybrid techniques, combining the benefits of NWP and statistical approaches in a flexible way, are very useful for achieving a good QPF. In this work, a new QPF technique for Galicia (NW Spain) is presented. This region has rain on more than 50% of days per year, with quantities that may cause floods and consequent human and economic damage. The technique is composed of an NWP model (ARPS) and a statistical downscaling process based on an automated classification scheme of atmospheric circulation patterns for the Iberian Peninsula (Ribalaygua and Boren, 1995). Results show that QPF for Galicia is improved using this hybrid technique. References: Antolik, M.S. (2000). "An overview of the National Weather Service's centralized statistical quantitative precipitation forecasts". Journal of Hydrology, 239, 306-337. de Bruijn, E.I.F. and Brandsma, T. (2000). "Rainfall prediction for a flooding event in Ireland caused by the remnants of Hurricane Charley". Journal of Hydrology, 239, 148-161. Ribalaygua, J. and Boren, R. (1995). "Clasificación de patrones espaciales de precipitación diaria sobre la España Peninsular". Informes N 3 y 4 del Servicio de Análisis e Investigación del Clima. Instituto Nacional de Meteorología, Madrid. 53 pp.
Khai Tiu, Ervin Shan; Huang, Yuk Feng; Ling, Lloyd
An accurate streamflow forecasting model is important for the development of a flood mitigation plan and for ensuring sustainable development of a river basin. This study adopted the Variational Mode Decomposition (VMD) data-preprocessing technique to process and denoise the rainfall data before feeding it into the Support Vector Machine (SVM) streamflow forecasting model, in order to improve the performance of the selected model. Rainfall data and river water level data for the period 1996-2016 were used for this purpose. Homogeneity tests (the Standard Normal Homogeneity Test, the Buishand Range Test, the Pettitt Test and the Von Neumann Ratio Test) and normality tests (the Shapiro-Wilk Test, the Anderson-Darling Test, the Lilliefors Test and the Jarque-Bera Test) were carried out on the rainfall series. The rainfall series at all stations were found to be homogeneous but not normally distributed. From the recorded rainfall data, it was observed that the Dungun River Basin received higher monthly rainfall from November to February, during the Northeast Monsoon. Thus, the monthly and seasonal rainfall series of this monsoon were the main focus of this research, as floods usually happen during the Northeast Monsoon period. The water levels predicted by the SVM model were assessed against the observed water levels using non-parametric statistical tests (the Biased Method, Kendall's Tau-b Test and Spearman's Rho Test).
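Of the homogeneity tests listed, the Pettitt test has a particularly compact closed form. The sketch below is a minimal stdlib implementation, assuming the standard large-sample approximation for the p-value; the series values and names are illustrative only.

```python
import math

def pettitt_test(series):
    """Pettitt's nonparametric homogeneity (change-point) test.
    Returns (change_index, K, p) with K = max_t |U_t| and the usual
    approximation p ~= 2 * exp(-6 * K^2 / (n^3 + n^2))."""
    n = len(series)
    U = 0
    best_K, best_t = 0, 0
    for t in range(n - 1):
        # recurrence: U_t = U_{t-1} + sum_j sign(x_t - x_j)
        U += sum((series[t] > x) - (series[t] < x) for x in series)
        if abs(U) > best_K:
            best_K, best_t = abs(U), t
    p = min(1.0, 2.0 * math.exp(-6.0 * best_K ** 2 / (n ** 3 + n ** 2)))
    return best_t, best_K, p

# A clear upward shift after index 9 -> inhomogeneous (small p)
shifted = [1.0, 1.2, 0.9, 1.1, 1.0, 0.8, 1.1, 0.9, 1.0, 1.2,
           3.0, 3.2, 2.9, 3.1, 3.0, 2.8, 3.1, 2.9, 3.0, 3.2]
t_shift, K_shift, p_shift = pettitt_test(shifted)

# An alternating but level series -> homogeneous (large p)
t_alt, K_alt, p_alt = pettitt_test([1.0, 2.0] * 10)
```

A small p-value flags the most probable change point `t_shift`, which is how such tests screen rainfall series before model fitting.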
Travel time forecasting is an interesting topic for many ITS services. The increased availability of data collection sensors increases the availability of predictor variables, but also highlights the heavy processing issues related to this big data availability. In this paper we aimed to analyse the potential of big data and supervised machine learning techniques for effectively forecasting travel times. For this purpose we used fused data from three data sources (Global Positioning System vehicle tracks, road network infrastructure data and meteorological data) and four machine learning techniques (k-nearest neighbours, support vector machines, boosting trees and random forest). To evaluate the forecasts, we compared them between different road classes in terms of absolute values, measured in minutes, and the mean squared percentage error. For the road classes with high average speeds and long road segments, machine learning techniques forecasted travel times with small relative error, while for the road classes with small average speeds and segment lengths this was a more demanding task. All three data sources proved to have a high impact on travel time forecast accuracy, and the best results (taking into account all road classes) were achieved for the k-nearest neighbours and random forest techniques.
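k-nearest-neighbours regression, one of the two techniques the study found most accurate, is simple enough to sketch directly on fused features, together with the mean squared percentage error used for comparison. The feature columns and numbers below are invented toy values, not the paper's data.

```python
def knn_predict(train_X, train_y, query, k=3):
    """k-nearest-neighbours regression: average the travel times of the
    k training rows closest (squared Euclidean) to the query features."""
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(row, query)), y)
        for row, y in zip(train_X, train_y)
    )
    return sum(y for _, y in dists[:k]) / k

def mspe(actual, predicted):
    """Mean squared percentage error, as used to compare road classes."""
    return 100.0 * sum(((a - p) / a) ** 2
                       for a, p in zip(actual, predicted)) / len(actual)

# toy fused features: [segment_length_km, speed_limit_kmh, rain_mm_per_h]
X = [[2.0, 50, 0.0], [2.1, 50, 5.0], [10.0, 100, 0.0],
     [9.5, 100, 2.0], [2.0, 50, 0.2], [10.2, 100, 0.1]]
y = [3.1, 3.9, 6.2, 6.8, 3.2, 6.1]  # travel times in minutes
pred = knn_predict(X, y, [2.05, 50, 0.1], k=3)
err = mspe([4.0, 5.0], [3.8, 5.5])
```

In practice the features would be scaled before computing distances; that step is omitted here for brevity.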
The nuclear application of conventional powder metallurgy routes is centred on the fabrication of ceramic fuels. The stringent demands in terms of product performance required by the nuclear industry militate against the use of conventional powder metallurgy to produce metallic components such as the fuel cladding. However, the techniques developed in powder metallurgy find widespread application throughout nuclear technology. Illustrations of the use of these techniques are given in the fields of absorber materials, ceramic cladding materials, oxide fuels, cermet fuels, and the disposal of highly active waste. (author)
Saez Gallego, Javier
This thesis deals with the development of new mathematical models that support the decision-making processes of market players. It addresses the problems of demand-side bidding, price-responsive load forecasting and reserve determination. From a methodological point of view, we investigate a novel approach to model the response of aggregate price-responsive load as a constrained optimization model, whose parameters are estimated from data by using inverse optimization techniques. The problems tackled in this dissertation are motivated, on one hand, by the increasing penetration of renewable energy and the resulting changes in the patterns that the load traditionally exhibited. On the other hand, this thesis is motivated by the decision-making processes of market players. In response to these challenges, this thesis provides mathematical models for decision-making under uncertainty in electricity markets. Demand-side bidding refers...
Karimi, Sepideh; Kisi, Ozgur; Shiri, Jalal; Makarynskyy, Oleg
Accurate predictions of sea level with different forecast horizons are important for coastal and ocean engineering applications, as well as in land drainage and reclamation studies. The methodology of tidal harmonic analysis, which is generally used to obtain a mathematical description of the tides, is data demanding, requiring the processing of tidal observations collected over several years. In the present study, hourly sea levels for Darwin Harbor, Australia were predicted using two different data-driven techniques: the adaptive neuro-fuzzy inference system (ANFIS) and the artificial neural network (ANN). Multiple linear regression (MLR) was used to select the optimal input combinations (lag times) of hourly sea level; the optimal combination was found to comprise the current sea level together with five previous values. For the ANFIS models, five different membership functions, namely triangular, trapezoidal, generalized bell, Gaussian and two-sided Gaussian, were tested and employed for predicting sea level for the next 1 h, 24 h, 48 h and 72 h. The ANN models were trained using three different algorithms, namely Levenberg-Marquardt, conjugate gradient and gradient descent. Predictions of the optimal ANFIS and ANN models were compared with those of optimal auto-regressive moving average (ARMA) models. The coefficient of determination, root mean square error and variance account statistics were used as comparison criteria. The obtained results indicated that the triangular membership function was optimal for predictions with the ANFIS models, while an adaptive learning rate and Levenberg-Marquardt were most suitable for training the ANN models. Consequently, the ANFIS and ANN models gave similar forecasts and performed better than the ARMA models developed for the same purpose for all prediction intervals.
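The MLR-based lag selection described here can be approximated by ranking candidate lags by their correlation with the target. The sketch below uses that simpler correlation ranking as a stand-in, on a synthetic AR(1) "sea level" series; all names and parameters are assumptions, not the study's procedure.

```python
import math
import random

def pearson(a, b):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = math.sqrt(sum((x - ma) ** 2 for x in a))
    vb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (va * vb)

def rank_lags(series, max_lag=6):
    """Rank candidate lags by |correlation| between level(t) and level(t-lag)."""
    scores = []
    for lag in range(1, max_lag + 1):
        scores.append((abs(pearson(series[:-lag], series[lag:])), lag))
    return [lag for _, lag in sorted(scores, reverse=True)]

# synthetic persistent "sea level" series (AR(1) with coefficient 0.9)
random.seed(42)
level = [0.0]
for _ in range(499):
    level.append(0.9 * level[-1] + random.gauss(0, 0.1))
ranking = rank_lags(level)
```

For a persistent series like this, lag 1 should rank first, which mirrors why the current and most recent sea levels dominate the optimal input combination.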
Jamroz, Witold; Kurek, Mateusz; Lyszczarz, Ewelina; Brniak, Witold; Jachowicz, Renata
In the last few years there has been huge progress in the development of printing techniques and their application in the pharmaceutical sciences, particularly in pharmaceutical technology. The variety of printing methods makes it necessary to systematize them, explain their principles of operation, and specify the possibilities for their use in pharmaceutical technology. This paper aims to review the printing techniques used in the drug development process. The growing interest in 2D and 3D printing methods is reflected in a continuously increasing number of scientific papers. The introduction of the first printed drug, Spritam®, to the market seems to be a milestone in the development of 3D printing. Thus, a particular aim of this review is to show the latest achievements of researchers in the field of printed medicines.
One of the numerous problems experienced in supply chain management is demand, which most often involves uncertainty. The broiler meat industry inevitably encounters the same problem. In this research, a hybrid forecasting model of ARIMA and Support Vector Machines (SVM) is developed to forecast broiler meat exports. In addition, ARIMA, SVM, and Moving Average (MA) models are chosen for comparing forecasting efficiency. All the forecasting models are tested and validated using export data from Brazil, Canada, and Thailand. The hybrid model provides forecast accuracies of 98.71%, 97.50%, and 93.01%, respectively. In addition, the hybrid model presents the lowest MAE, RMSE, and MAPE of all the compared forecasting models. When the forecasts are applied to transportation planning, the mean absolute percentage error (MAPE) between the optimal values computed from forecasted and actual data is 14.53%. The hybrid forecasting model reduces the risk in total transportation cost, relative to forecasting broiler meat exports with MA(2), MA(3), ARIMA, and SVM, by 50.59%, 60.18%, 68.01%, and 46.55%, respectively. The results indicate that the developed forecasting model is recommended for supply chain decisions in the broiler meat industry.
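The general shape of an ARIMA+SVM hybrid (a linear model for the structure, plus a nonlinear learner on its residuals) can be sketched with stdlib stand-ins: AR(1) in place of ARIMA and a k-NN regressor in place of the SVM. This is only a structural illustration under those substitutions, not the paper's model.

```python
def fit_ar1(series):
    """Least-squares AR(1) fit around the mean (linear stand-in for ARIMA)."""
    m = sum(series) / len(series)
    num = sum((series[t] - m) * (series[t - 1] - m)
              for t in range(1, len(series)))
    den = sum((x - m) ** 2 for x in series[:-1])
    return m, num / den

def hybrid_forecast(series, k=2):
    """One-step hybrid forecast: AR(1) models the linear structure, and a
    k-NN regressor on lagged residuals approximates the nonlinear part
    (standing in for the SVM of the hybrid scheme)."""
    m, phi = fit_ar1(series)
    resid = [series[t] - (m + phi * (series[t - 1] - m))
             for t in range(1, len(series))]
    linear = m + phi * (series[-1] - m)
    # k-NN on (previous residual -> next residual) pairs
    pairs = [(resid[t - 1], resid[t]) for t in range(1, len(resid))]
    pairs.sort(key=lambda p: abs(p[0] - resid[-1]))
    correction = sum(nxt for _, nxt in pairs[:k]) / k
    return linear + correction

exports = [10, 12, 11, 13, 12, 14, 13, 15, 14, 16, 15, 17]  # toy export series
forecast = hybrid_forecast(exports)
```

The design point is the decomposition itself: the nonlinear learner only has to model what the linear model leaves behind.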
Kanchymalay, Kasturi; Salim, N.; Sukprasert, Anupong; Krishnan, Ramesh; Raba'ah Hashim, Ummi
The aim of this paper was to study the correlation between the crude palm oil (CPO) price, selected vegetable oil prices (soybean, coconut, olive, rapeseed and sunflower oil), the crude oil price and the monthly exchange rate. Comparative analysis was then performed on CPO price forecasting results using machine learning techniques. Monthly CPO prices, selected vegetable oil prices, crude oil prices and monthly exchange rate data from January 1987 to February 2017 were utilized. Preliminary analysis showed a positive and high correlation between the CPO price and the soybean oil price, and also between the CPO price and the crude oil price. Experiments were conducted using multi-layer perceptron, support vector regression and Holt-Winters exponential smoothing techniques. The results were assessed using the criteria of root mean square error (RMSE), mean absolute error (MAE), mean absolute percentage error (MAPE) and directional accuracy (DA). Among these three techniques, support vector regression (SVR) with the sequential minimal optimization (SMO) algorithm showed relatively better results compared to the multi-layer perceptron and Holt-Winters exponential smoothing methods.
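The four evaluation criteria quoted here (RMSE, MAE, MAPE and DA) are easy to pin down precisely. A small sketch follows, with DA computed against the previous actual value, which is one common convention and an assumption on our part.

```python
import math

def forecast_metrics(actual, predicted):
    """RMSE, MAE, MAPE (%) and directional accuracy (DA) of a forecast.
    DA counts how often the predicted move from the previous actual value
    has the same sign as the realised move."""
    n = len(actual)
    errs = [a - p for a, p in zip(actual, predicted)]
    rmse = math.sqrt(sum(e * e for e in errs) / n)
    mae = sum(abs(e) for e in errs) / n
    mape = 100.0 * sum(abs(e) / abs(a) for e, a in zip(errs, actual)) / n
    hits = sum(
        1 for t in range(1, n)
        if (actual[t] - actual[t - 1]) * (predicted[t] - actual[t - 1]) > 0
    )
    da = hits / (n - 1)
    return rmse, mae, mape, da

# toy monthly prices and forecasts
actual = [100, 102, 101, 105]
predicted = [101, 101, 103, 104]
rmse, mae, mape, da = forecast_metrics(actual, predicted)
```

DA is the criterion that rewards getting the direction of the price move right even when the magnitude is off, which RMSE/MAE/MAPE do not capture.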
Slides used in a presentation at The Power of Change Conference in Vancouver, BC, in April 1995 about the changing needs in load forecasting are presented. Technological innovations and population increase were said to be the prime driving forces behind the changing needs in load forecasting. Structural changes, marketplace changes, electricity supply planning changes, and changes in planning objectives were other factors discussed. It was concluded that load forecasting is a form of information gathering that provides important market intelligence.
Kosek, W.; Kalarus, M.; Johnson, T. J.; Wooden, W. H.; McCarthy, D. D.; Popiński, W.
Stochastic prediction techniques including autocovariance, autoregressive, autoregressive moving average, and neural network methods were applied to the UT1-UTC and Length of Day (LOD) International Earth Rotation and Reference Systems Service (IERS) EOPC04 time series to evaluate the capabilities of each method. All known effects such as leap seconds and solid Earth zonal tides were first removed from the observed values of UT1-UTC and LOD. Two combination procedures were applied to predict the resulting LODR time series: 1) the combination of least-squares (LS) extrapolation with a stochastic prediction method, and 2) the combination of discrete wavelet transform (DWT) filtering with a stochastic prediction method. The results of the combination of LS extrapolation with different stochastic prediction techniques were compared with the results of the UT1-UTC prediction method currently used by the IERS Rapid Service/Prediction Centre (RS/PC). It was found that the prediction accuracy depends on the starting prediction epochs, and that for the combined forecast methods, the mean prediction errors for 1 to about 70 days into the future are of the same order as those of the method used by the IERS RS/PC.
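The first combination procedure, LS extrapolation plus a stochastic predictor on the residuals, can be illustrated in miniature with a straight-line LS fit and an AR(1) residual model. The real method uses a richer LS model with seasonal terms; this is a deliberately simplified sketch.

```python
def fit_line(ts, ys):
    """Ordinary least-squares straight line: the LS-extrapolation part."""
    n = len(ts)
    mt, my = sum(ts) / n, sum(ys) / n
    slope = (sum((t - mt) * (y - my) for t, y in zip(ts, ys))
             / sum((t - mt) ** 2 for t in ts))
    return my - slope * mt, slope  # intercept, slope

def fit_ar1_origin(resid):
    """AR(1) coefficient fitted through the origin on the LS residuals."""
    num = sum(resid[t] * resid[t - 1] for t in range(1, len(resid)))
    den = sum(r * r for r in resid[:-1])
    return num / den if den else 0.0

def ls_plus_ar1_forecast(ts, ys, horizon):
    """Extrapolate the LS line and damp the last residual by phi**horizon."""
    a, b = fit_line(ts, ys)
    resid = [y - (a + b * t) for t, y in zip(ts, ys)]
    phi = fit_ar1_origin(resid)
    return a + b * (ts[-1] + horizon) + (phi ** horizon) * resid[-1]

# deterministic checks of the two fitting routines
a, b = fit_line(list(range(10)), [2.0 + 0.5 * t for t in range(10)])
phi = fit_ar1_origin([0.8 ** t for t in range(10)])
```

The `phi ** horizon` damping reflects why the combined methods stay competitive out to roughly 70 days: the stochastic correction decays toward the pure LS extrapolation at long horizons.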
A. E. Sklyarov
Innovation activities, as well as innovations themselves, are closely related concepts and, like many other economic terms, carry a broad range of meanings. The main characteristics of an innovation are a new or significantly improved product that has found application; innovation activity is activity focused on the realization of innovations. In this article, innovations are considered mainly in terms of high-technology production, with evidence from the Russian space industry. The lifecycle of an innovative project in this industry has five basic stages: initiation, development, realization, expansion, and consumption. In practice, the third or fourth stage, or even both, is often missing because there is no need for them. R&D activities, and even subsequent serial production based on previous developments, constitute innovation activity, because these activities are themselves stages of the innovative project lifecycle. It therefore seems legitimate to conclude that, for high-technology production, a company's primary activity equals its innovative activity. The basic characteristics of the innovative activity of high-technology companies, as an object of assessment and forecasting, are a high level of uncertainty at every stage of the project lifecycle, high dependency on the level of funding of this activity, and a high level and erratic structure of risk. All of the above means that assessing and forecasting the innovative activity of high-technology companies requires methodological tools developed specifically for each industry.
Domicio Da Silva Souza, Ivan; Juliana Pinheiro, Bárbara; Passarini Takahashi, Vania
Patents represent a free and open source of data for studying innovation and forecasting technological trends. Thus, we suggest that new discussions about the role of patent information are needed. To illustrate the relevance of this issue, we performed a survey of patents involving skin care products granted by the United States Patent and Trademark Office (USPTO) between 2006 and 2010 to identify opportunities for innovation and technological trends. We quantified the use of technologies in 333 patents. We plotted a life cycle of technologies related to natural ingredients. We also determined the cross impact of the technologies identified. We observed technologies related to processes applied to cosmetics (2.2%), functional packaging and applicators (2.9%), excipients and active compounds (21.5%), and cosmetic preparations (73.5%). Further, 21.6% of the patents were related to the use of natural ingredients. Several opportunities for innovation are discussed throughout this paper, for example, the use of peptides as active compounds or intracellular carriers (only 3.9% of the technologies in cosmetic preparations). We also observed technological cross impacts that suggest a trend toward multifunctional cosmetics, among others. Patent surveys may help researchers with product innovation because they allow them to identify available and unexplored technologies and turn them into whole new concepts.
This work aims to use sophisticated artificial intelligence and statistical techniques to forecast pollution and assess its social impact. To achieve the target of the research, this study is divided into several research sub-objectives, as follows: First research sub-objective: propose a framework for relocating and reconfiguring the existing pollution monitoring networks by using feature selection, artificial intelligence techniques, and information theory. Second research sub-objective: c...
Wang, Deyun; Wei, Shuai; Luo, Hongyuan; Yue, Chenqiang; Grunder, Olivier
The randomness, non-stationarity and irregularity of air quality index (AQI) series make AQI forecasting difficult. To enhance forecast accuracy, a novel hybrid forecasting model combining a two-phase decomposition technique and an extreme learning machine (ELM) optimized by the differential evolution (DE) algorithm is developed for AQI forecasting in this paper. In phase I, complementary ensemble empirical mode decomposition (CEEMD) is utilized to decompose the AQI series into a set of intrinsic mode functions (IMFs) with different frequencies; in phase II, in order to further handle the high-frequency IMFs, which would otherwise increase the forecast difficulty, variational mode decomposition (VMD) is employed to decompose them into a number of variational modes (VMs). Then, the ELM model optimized by the DE algorithm is applied to forecast all the IMFs and VMs. Finally, the forecast value of each high-frequency IMF is obtained by adding up the forecast results of all corresponding VMs, and the forecast series of AQI is obtained by aggregating the forecast results of all IMFs. To verify and validate the proposed model, two daily AQI series from July 1, 2014 to June 30, 2016, collected from Beijing and Shanghai, China, are taken as test cases for the empirical study. The experimental results show that the proposed hybrid model based on the two-phase decomposition technique is remarkably superior to all other considered models in forecast accuracy. Copyright © 2016 Elsevier B.V. All rights reserved.
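The ELM at the heart of this hybrid is itself a short algorithm: random hidden weights, then one linear least-squares solve for the output weights. A stdlib sketch follows, without the CEEMD/VMD decomposition or DE tuning; the `hidden` and `ridge` values are assumptions.

```python
import math
import random

def solve(A, b):
    """Gaussian elimination with partial pivoting for a dense linear system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def elm_fit(xs, ys, hidden=15, ridge=1e-6, seed=0):
    """Extreme learning machine: random input weights stay fixed; only the
    output weights are fitted, via ridge-regularised normal equations."""
    rng = random.Random(seed)
    W = [(rng.uniform(-2, 2), rng.uniform(-2, 2)) for _ in range(hidden)]
    H = [[math.tanh(w * x + b) for w, b in W] for x in xs]
    A = [[sum(H[r][i] * H[r][j] for r in range(len(xs)))
          + (ridge if i == j else 0.0) for j in range(hidden)]
         for i in range(hidden)]
    rhs = [sum(H[r][i] * ys[r] for r in range(len(xs))) for i in range(hidden)]
    beta = solve(A, rhs)
    return lambda x: sum(bb * math.tanh(w * x + b) for bb, (w, b) in zip(beta, W))

# toy smooth component, as an ELM would see after decomposition
xs = [i * 0.1 for i in range(31)]
ys = [math.sin(2 * x) for x in xs]
model = elm_fit(xs, ys)
rmse = math.sqrt(sum((model(x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs))
```

Because training is a single linear solve, ELMs are cheap enough to fit once per IMF and VM, which is what makes the decompose-then-forecast scheme practical.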
Dewmini Danushika Illeperuma, Thashika Rupasinghe
A demand forecasting methodology for a stationery company in Sri Lanka is investigated. The different forecasting methods available are reviewed, including judgemental methods, quantitative methods and artificial intelligence methods. The literature emphasises the importance of using a combination of the available methods rather than relying on a single one.
Kock, Anders Bredahl; Teräsvirta, Timo
When forecasting with neural network models one faces several problems, all of which influence the accuracy of the forecasts. First, neural networks are often hard to estimate due to their highly nonlinear structure. To alleviate the problem, White (2006) presented a solution (QuickNet) that conv...
Due to the availability of internet-based abstract services and patent databases, bibliometric analysis has become one of the key technology forecasting approaches. Recently, latent semantic analysis (LSA) has been applied to improve accuracy in document clustering. In this paper, a newer LSA method, probabilistic latent semantic analysis (PLSA), which uses probabilistic methods and algebra to search the latent space in the corpus, is further applied to document clustering. The results show that PLSA is more accurate than LSA, and the improved iteration method proposed by the authors can simplify the computing process and improve computing efficiency.
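The PLSA model referred to here is fitted with EM, alternating responsibilities p(z|d,w) with re-normalised expected counts. A compact sketch on a toy document-term matrix follows; the EM property that the log-likelihood never decreases doubles as a built-in sanity check. Topic count and iteration budget are arbitrary choices.

```python
import math
import random

def plsa(counts, n_topics=2, iters=30, seed=0):
    """Probabilistic latent semantic analysis fitted with EM.
    counts[d][w] is the term count of word w in document d."""
    rng = random.Random(seed)
    D, W = len(counts), len(counts[0])
    p_z_d = [[rng.random() for _ in range(n_topics)] for _ in range(D)]
    p_w_z = [[rng.random() for _ in range(W)] for _ in range(n_topics)]
    for row in p_z_d + p_w_z:  # normalise the random initialisation
        s = sum(row)
        row[:] = [v / s for v in row]
    lls = []
    for _ in range(iters):
        # E-step: accumulate expected counts weighted by p(z | d, w)
        nz_d = [[0.0] * n_topics for _ in range(D)]
        nw_z = [[0.0] * W for _ in range(n_topics)]
        ll = 0.0
        for d in range(D):
            for w in range(W):
                if counts[d][w] == 0:
                    continue
                probs = [p_z_d[d][z] * p_w_z[z][w] for z in range(n_topics)]
                tot = sum(probs)
                ll += counts[d][w] * math.log(tot)
                for z in range(n_topics):
                    r = counts[d][w] * probs[z] / tot
                    nz_d[d][z] += r
                    nw_z[z][w] += r
        lls.append(ll)
        # M-step: re-normalise the expected counts into probabilities
        for d in range(D):
            s = sum(nz_d[d])
            p_z_d[d] = [v / s for v in nz_d[d]]
        for z in range(n_topics):
            s = sum(nw_z[z])
            p_w_z[z] = [v / s for v in nw_z[z]]
    return p_z_d, p_w_z, lls

# toy corpus: docs 0-1 use words 0-1, docs 2-3 use words 2-3
docs = [[4, 3, 0, 0], [3, 4, 1, 0], [0, 0, 4, 3], [0, 1, 3, 4]]
p_z_d, p_w_z, lls = plsa(docs)
```

Documents are then clustered by the topic with the highest p(z|d), which is the step the abstract compares against plain LSA.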
An accurate forecast of the exploitable energy from Renewable Energy Sources is extremely important for the stability of the electric grid and the reliability of the bidding markets. This paper presents a comparison among different methods for forecasting photovoltaic output power, introducing a new method that mixes some peculiarities of the others: the Physical Hybrid Artificial Neural Network and the five-parameter model estimated by Social Network Optimization. In particular, the day-ahead forecasts, evaluated against real data measured for two years in an existing photovoltaic plant located in Milan, Italy, are compared by means of both new and the most common error indicators. Results reported in this work show the best forecasting capability of the new "mixed method", which scored the best forecast skill and Enveloped Mean Absolute Error on a yearly basis (47% and 24.67%, respectively).
Kapoor Amar S
Abstract Background Diverse modeling approaches, viz. neural networks and multiple regression, have been followed to date for disease prediction in plant populations. However, due to their inability to predict values of unknown data points and their longer training times, there is a need to explore new prediction software for a better understanding of plant-pathogen-environment relationships. Further, there is no online tool available which can help plant researchers or farmers in the timely application of control measures. This paper introduces a new prediction approach based on support vector machines for developing weather-based prediction models of plant diseases. Results Six significant weather variables were selected as predictor variables. Two series of models (cross-location and cross-year) were developed and validated using a five-fold cross validation procedure. For cross-year models, the conventional multiple regression (REG) approach achieved an average correlation coefficient (r) of 0.50, which increased to 0.60 while the percent mean absolute error (%MAE) decreased from 65.42 to 52.24 when a back-propagation neural network (BPNN) was used. With a generalized regression neural network (GRNN), r increased to 0.70 and %MAE improved to 46.30; these further improved to r = 0.77 and %MAE = 36.66 when the support vector machine (SVM) based method was used. Similarly, cross-location validation achieved r = 0.48, 0.56 and 0.66 using REG, BPNN and GRNN respectively, with corresponding %MAE of 77.54, 66.11 and 58.26. The SVM-based method outperformed all three approaches, further increasing r to 0.74 with an improvement in %MAE to 44.12. Overall, this SVM-based prediction approach will open new vistas in the area of forecasting plant diseases of various crops. Conclusion Our case study demonstrated that SVM is better than existing machine learning techniques and conventional REG approaches in forecasting plant diseases. In this direction, we have also
Sfetsos, A. [7 Pirsou Street, Athens (Greece)]
This paper presents a comparison of various forecasting approaches, using time series analysis, on mean hourly wind speed data. In addition to the traditional linear (ARMA) models and the commonly used feed forward and recurrent neural networks, other approaches are also examined including the Adaptive Neuro-Fuzzy Inference Systems (ANFIS) and Neural Logic Networks. The developed models are evaluated for their ability to produce accurate and fast forecasts. (Author)
Today, technology is developing very fast around the world. This technological development (hardware and software) affects our lives. There is a relationship among technology, society, culture, organization, machines, technical operations, and technical phenomena. Educators should know this relationship because technology begins to affect teaching…
Falconer, David A.; Moore, Ronald L.; Barghouty, Abdulnasser F.; Khazanov, Igor
MAG4 (Magnetogram Forecast), developed originally for NASA/SRAG (Space Radiation Analysis Group), is an automated program that analyzes magnetograms from the HMI (Helioseismic and Magnetic Imager) instrument on NASA's SDO (Solar Dynamics Observatory) and automatically converts them into forecasts of the rate (or probability) of major flares (M- and X-class), Coronal Mass Ejections (CMEs), and Solar Energetic Particle Events. MAG4 does not forecast that a flare will occur at a particular time in the next 24 or 48 hours; rather, it forecasts the probability of one occurring.
Sezgen, O.; Koomey, J.G.
In the United States, energy consumption is increasing most rapidly in the commercial sector. Consequently, the commercial sector is becoming an increasingly important target for state and federal energy policies and also for utility-sponsored demand side management (DSM) programs. The rapid growth in commercial-sector energy consumption also makes it important for analysts working on energy policy and DSM issues to have access to energy end-use forecasting models that include more detailed representations of energy-using technologies in the commercial sector. These new forecasting models disaggregate energy consumption not only by fuel type, end use, and building type, but also by specific technology. The disaggregation of the refrigeration end use in terms of specific technologies, however, is complicated by several factors. First, the number of configurations of refrigeration cases and systems is quite large. Also, energy use is a complex function of the refrigeration-case properties and the refrigeration-system properties. The Electric Power Research Institute's (EPRI's) Commercial End-Use Planning System (COMMEND 4.0) and the associated data development presented in this report attempt to address the above complications and create a consistent forecasting framework. Expanding end-use forecasting models so that they address individual technology options requires characterization of the present floorstock in terms of service requirements, energy technologies used, and cost-efficiency attributes of the energy technologies that consumers may choose for new buildings and retrofits. This report describes the process by which we collected refrigeration technology data. The data were generated for COMMEND 4.0 but are also generally applicable to other end-use forecasting frameworks for the commercial sector.
Hussain, A.; Oiungen, B.; Raposo, C. [Det Norske Veritas (DNV), Rio de Janeiro, RJ (Brazil)]
All the easy oil and gas is gone, and, as a result, the oil and gas industry is continuously targeting deeper and more remote fields. The exploration and development of deep water oil and gas fields is associated with enormous costs and multiple uncertainties with regard to equipment reliability and performance. Proper risk management can be used to evaluate the impact of these uncertainties, thereby helping to ensure optimal business performance of the future assets, as well as helping the decision maker target investment towards areas where the financial impact will be greatest. This paper reviews the principles of the Technology Qualification and Production Forecasting methodologies, both of which are risk management solutions with a proven track record for deep water field developments. (author)
Technology Alignment and Portfolio Prioritization (TAPP): Advanced Methods in Strategic Analysis, Technology Forecasting and Long Term Planning for Human Exploration and Operations, Advanced Exploration Systems and Advanced Concepts
Funaro, Gregory V.; Alexander, Reginald A.
The Advanced Concepts Office (ACO) at NASA Marshall Space Flight Center is expanding its current technology assessment methodologies. ACO is developing a framework called TAPP that uses a variety of methods, such as association mining and rule learning from data mining, structure development using a Technological Innovation System (TIS), and social network modeling to measure structural relationships. The role of ACO is to 1) produce a broad spectrum of ideas and alternatives for a variety of NASA's missions, 2) determine mission architecture feasibility and appropriateness to NASA's strategic plans, and 3) define a project in enough detail to establish an initial baseline capable of meeting mission objectives. ACO's role supports the decision-making process associated with the maturation of concepts for traveling through, living in, and understanding space. ACO performs concept studies and technology assessments to determine the degree of alignment between mission objectives and new technologies. The first step in technology assessment is to identify the current technology maturity in terms of a technology readiness level (TRL). The second step is to determine the difficulty associated with advancing a technology from one state to the next. NASA has used TRLs since 1970 and ACO formalized them in 1995. The DoD, ESA, Oil & Gas, and DoE have adopted TRLs as a means to assess technology maturity. However, "with the emergence of more complex systems and system of systems, it has been increasingly recognized that TRL assessments have limitations, especially when considering [the] integration of complex systems." When performing the second step in a technology assessment, NASA requires that an Advancement Degree of Difficulty (AD2) method be utilized. NASA has developed or used a variety of methods to perform this step: Expert Opinion or Delphi Approach, Value Engineering or Value Stream, Analytical Hierarchy Process (AHP), Technique for the Order of
Seo, Youngmin; Kim, Sungwon; Kisi, Ozgur; Singh, Vijay P.
Reliable water level forecasting for reservoir inflow is essential for reservoir operation. The objective of this paper is to develop and apply two hybrid models for daily water level forecasting and to investigate their accuracy. These two hybrid models are the wavelet-based artificial neural network (WANN) and the wavelet-based adaptive neuro-fuzzy inference system (WANFIS). Wavelet decomposition is employed to decompose an input time series into approximation and detail components. The decomposed time series are used as inputs to artificial neural networks (ANN) and an adaptive neuro-fuzzy inference system (ANFIS) for the WANN and WANFIS models, respectively. Based on statistical performance indexes, the WANN and WANFIS models are found to be more efficient than the ANN and ANFIS models, with WANFIS7-sym10 yielding the best performance among all models. It is found that wavelet decomposition improves the accuracy of ANN and ANFIS. This study evaluates the accuracy of the WANN and WANFIS models for different mother wavelets, including Daubechies, Symmlet and Coiflet wavelets. Model performance is found to depend on the input sets and mother wavelets, and wavelet decomposition using the mother wavelet db10 can further improve the efficiency of the ANN and ANFIS models. The results indicate that the conjunction of wavelet decomposition and artificial intelligence models can be a useful tool for accurately forecasting daily water levels and can be more efficient than conventional forecasting models.
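The wavelet decomposition step that feeds WANN/WANFIS splits a series into approximation and detail parts. A one-level Haar step is the simplest possible illustration; the paper uses Daubechies, Symmlet and Coiflet mother wavelets over multiple levels, so Haar is chosen here only for brevity.

```python
import math

def haar_step(signal):
    """One level of the Haar DWT: pairwise averages (approximation) and
    pairwise differences (detail), with orthonormal 1/sqrt(2) scaling.
    Assumes an even-length signal."""
    s = math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal), 2)]
    return approx, detail

def haar_inverse(approx, detail):
    """Exact inverse of haar_step (perfect reconstruction)."""
    s = math.sqrt(2.0)
    out = []
    for a, d in zip(approx, detail):
        out.append((a + d) / s)
        out.append((a - d) / s)
    return out

# toy daily water levels (m)
level = [4.1, 4.3, 4.0, 4.4, 5.0, 5.2, 4.9, 5.1]
approx, detail = haar_step(level)
recon = haar_inverse(approx, detail)
```

In a WANN/WANFIS setup, `approx` and `detail` (rather than the raw series) become the model inputs, which is why the choice of mother wavelet affects accuracy.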
Kock, Anders Bredahl; Teräsvirta, Timo
such as the neural network model is not appropriate if the data is generated by a linear mechanism. Hence, it might be appropriate to test the null of linearity prior to building a nonlinear model. We investigate whether this kind of pretesting improves the forecast accuracy compared to the case where...
Trexler, M.C. [Trexler and Associates, Inc., Portland, OR (United States)
The Intergovernmental Panel on Climate Change (IPCC) has identified forestry and other land-use based mitigation measures as possible sources and sinks of greenhouse gases. An overview of sequestration and biotic storage is presented, and the potential impacts of the use of carbon sequestration as a mitigation technology are briefly noted. Carbon sequestration is also compared to other mitigation technologies. Biotic mitigation technologies are concluded to be a legitimate and potentially important part of greenhouse gas mitigation due to their relatively low costs, ancillary benefits, and climate impact. However, not all biotic mitigation techniques perfectly match the idealized definition of a mitigation measure, and policies are becoming increasingly biased against biotic technologies.
Ren, Lei; Hartnett, Michael
Accurate forecasting of coastal surface currents has become of great economic importance over the past twenty years due to marine activities such as marine renewable energy and fish farms in coastal regions. Advanced oceanographic observation systems such as satellites and radars can provide many parameters of interest, such as surface currents and waves, with fine spatial resolution in near real time. To enhance modelling capability, data assimilation (DA) techniques, which combine the available measurements with hydrodynamic models, have been used since the 1990s in oceanography. Assimilating measurements into hydrodynamic models makes the original model background states follow the observation trajectory, which is then used to provide more accurate forecasting information. Galway Bay is an open, wind-dominated water body on which two coastal radars are deployed. An efficient and easy-to-implement sequential DA algorithm named Optimal Interpolation (OI) was used to blend radar surface current data into a three-dimensional Environmental Fluid Dynamics Code (EFDC) model. Two empirical parameters, horizontal correlation length and DA cycle length (CL), are inherent within OI. No guidance has previously been published regarding selection of appropriate values of these parameters or how sensitive OI DA is to variations in their values. Detailed sensitivity analysis has been performed on both of these parameters and results are presented. An appropriate value of the DA CL was determined by minimizing the Root-Mean-Square Error (RMSE) between radar data and model background states. Analysis was performed to evaluate the assimilation index (AI) of using an OI DA algorithm in the model. The AI of the half-day forecast mean vectors' directions was over 50% in the best assimilation model. The ability of OI to improve model forecasts was also assessed and is reported upon.
Sforna, M. [ENEL s.p.a., Italian Power Company (Italy)]; Lamedica, R.; Prudenzi, A. [Rome Univ. 'La Sapienza', Rome (Italy)]; Caciotta, M.; Orsolini Cencelli, V. [Rome Univ. III, Rome (Italy)]
The paper illustrates part of the research activity conducted by the authors in the field of electric Short Term Load Forecasting (STLF) based on Artificial Neural Network (ANN) architectures. Previous experience with basic ANN architectures has shown that, even though these architectures provide results comparable with those obtained by human operators for most normal days, they show some accuracy deficiencies when applied to 'anomalous' load conditions occurring during holidays and long weekends. For these periods a specific procedure based upon a combined (unsupervised/supervised) approach has been proposed. The unsupervised stage provides a preliminary classification of the historical load data by means of a Kohonen Self-Organizing Map (SOM). The supervised stage, performing the proper forecasting activity, uses a multi-layer perceptron with a back-propagation learning algorithm similar to those mentioned above. The unconventional use of information deriving from the classification stage permits the proposed procedure to achieve a relevant enhancement of forecast accuracy for anomalous load situations.
Cesaretti, Isabel Umbelina Ribeiro
The nurse's effective role, stomatherapist or not, in selecting the devices used by the ostomy patient is only possible with the support of the advanced technological improvements achieved by the specific collecting systems commercially available for stoma care. With the technological advances achieved, and the associated technical evolution, it is possible to provide better care of stomas, which is ultimately reflected in the ostomy patient's quality of life. Consideri...
Sezgen, O.; Koomey, J.G.
Commercial-sector conservation analyses have traditionally focused on lighting and space conditioning because of their relatively large shares of electricity and fuel consumption in commercial buildings. In this report we focus on water heating, which is one of the neglected end uses in the commercial sector. The share of the water-heating end use in commercial-sector electricity consumption is 3%, which corresponds to 0.3 quadrillion Btu (quads) of primary energy consumption. Water heating accounts for 15% of commercial-sector fuel use, which corresponds to 1.6 quads of primary energy consumption. Although smaller in absolute size than the savings associated with lighting and space conditioning, the potential cost-effective energy savings from water heaters are large enough in percentage terms to warrant closer attention. In addition, water heating is much more important in particular building types than in the commercial sector as a whole. Fuel consumption for water heating is highest in lodging establishments, hospitals, and restaurants (0.27, 0.22, and 0.19 quads, respectively); water heating's share of fuel consumption for these building types is 35%, 18% and 32%, respectively. At the Lawrence Berkeley National Laboratory, we have developed and refined a base-year data set characterizing water heating technologies in commercial buildings as well as a modeling framework. We present the data and modeling framework in this report. The present commercial floorstock is characterized in terms of water heating requirements and technology saturations. Cost-efficiency data for water heating technologies are also developed. These data are intended to support models used for forecasting energy use of water heating in the commercial sector.
Proponents and opponents of euthanasia have argued passionately about whether it should be legalized. In Australia in the mid-1990s, following the world's first legal euthanasia deaths, Dr. Philip Nitschke initiated a different approach: a search for do-it-yourself technological means of dying with dignity. The Australian government has opposed…
Hunter, J. Mark; Garrison, James W.
Scientific management and hierarchical accountability tend to destroy dialogue and issue ideas as orders to be obeyed. Instructional technology packages can actually enslave teachers. The emendation or feedback loop built into all instructional systems should allow educators to alter design in the context of practice and help technologists design…
Chen, Shyi-Ming; Manalu, Gandhi Maruli Tua; Pan, Jeng-Shyang; Liu, Hsiang-Chuan
In this paper, we present a new method for fuzzy forecasting based on two-factors second-order fuzzy-trend logical relationship groups and particle swarm optimization (PSO) techniques. First, we fuzzify the historical training data of the main factor and the secondary factor, respectively, to form two-factors second-order fuzzy logical relationships. Next, we group the two-factors second-order fuzzy logical relationships into two-factors second-order fuzzy-trend logical relationship groups. Finally, we obtain the optimal weighting vector for each fuzzy-trend logical relationship group by using PSO techniques to perform the forecasting. We also apply the proposed method to forecast the Taiwan Stock Exchange Capitalization Weighted Stock Index and the NTD/USD exchange rates. The experimental results show that the proposed method achieves better forecasting performance than the existing methods.
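The fuzzification and trend-labelling steps behind fuzzy-trend logical relationships can be sketched minimally: each observation is mapped to the equal-width fuzzy interval it falls in, and consecutive pairs are labelled with a trend (up, down or equal). The PSO-optimized weighting of the resulting relationship groups is not reproduced here, and the index values, interval bounds and interval count below are hypothetical choices, not the paper's.

```python
def fuzzify(value, lo, hi, n_intervals):
    """Map a value to the index of an equal-width fuzzy interval over [lo, hi]."""
    width = (hi - lo) / n_intervals
    idx = int((value - lo) / width)
    return min(max(idx, 0), n_intervals - 1)

def fuzzy_trend_relationships(series, lo, hi, n_intervals):
    """Label consecutive fuzzified pairs as 'up', 'down' or 'equal'."""
    fuzzified = [fuzzify(v, lo, hi, n_intervals) for v in series]
    rels = []
    for prev, curr in zip(fuzzified, fuzzified[1:]):
        trend = "up" if curr > prev else "down" if curr < prev else "equal"
        rels.append((prev, curr, trend))
    return rels

index = [4500, 4620, 4580, 4700, 4700]  # hypothetical daily index closes
rels = fuzzy_trend_relationships(index, lo=4400, hi=4800, n_intervals=8)
# rels pairs (previous interval, current interval, trend); grouping by trend
# yields the fuzzy-trend logical relationship groups the method weights.
```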
The management of occupational dose exposure has been transformed in recent years through the use of facilities such as computerised databases, remote instrumentation and electronic data transfer. Use of this technology has allowed increases both in the amount of data capable of being processed and in the speed at which the data is made available to operators and Health Physics personnel. The dose management system being used in support of the UK's naval nuclear plants has incorporated advances in the areas of dosimetry, data handling and data analysis. Physical dispersion of sites servicing the nuclear plants means that effective communication links have also been vital for good dose management. This paper focuses on some of the most recent dose management technology - the use of virtual reality models linked with a predictive computer code for work planning, a remote area monitoring system with diagnostic software, a personnel dosimetry telemetry system, and electronically linked computer databases. (author)
Contrast echocardiography is the only clinical imaging technique in which the imaging modality (ultrasound) can cause a change in the contrast agent (microbubbles). The change in the contrast agent can range from small oscillations of the microbubbles at a low mechanical index to their disruption at a high mechanical index. The specific mechanical index required to produce these various effects may be different for each contrast agent, depending on the bubble dimension as well as shell and gas characteristics. These alterations in bubbles result in changes in ultrasound backscatter that are specific for the bubbles themselves, rather than for tissue, and are therefore exploited for imaging their presence in tissue. These signal-processing techniques have resulted in an increased signal-to-noise ratio from bubbles vis-à-vis the tissue and have made online assessment of myocardial perfusion possible.
Bennett, M.; Morrissey, M.
The development of real-time radiation dosimetry, improved remote monitoring capability and the use of computerized databases and electronic data transfer have led to a significant improvement in effective dose management. This paper describes how these advances have improved occupational dose control for personnel supporting the United Kingdom's naval nuclear plants. Until recently health physics personnel have experienced high levels of occupational exposure compared to other groups, because of manual survey work, but new technology, such as virtual reality models linked with predictive computer codes and other computer-based innovations in remote monitoring and telemetry-based dosimetry, is reversing this situation. (UK)
The management of occupational dose exposure has been transformed in recent years through the use of facilities such as computerized databases, remote instrumentation and electronic data transfer. Use of this technology has allowed increases both in the amount of data capable of being processed and in the speed at which the data is made available to operators and Health Physics personnel. These developments have significantly improved the quality and efficiency of dose management. The dose management system being used in support of the UK's naval nuclear plants has incorporated advances in the areas of dosimetry, data handling and data analysis. Physical dispersion of sites servicing the nuclear plants means that effective communication links have also been vital for good dose management. (author)
Zhao, Junhua; Dong, Zhaoyang; Xu, Zhao
Time series forecasting is an important area in data mining research. Feature preprocessing techniques have a significant influence on forecasting accuracy and are therefore essential in a forecasting model. Although several feature preprocessing techniques have been applied in time series forecasting… performance in time series forecasting. It is demonstrated in our experiment that effective feature preprocessing can significantly enhance forecasting accuracy. This research can be a useful guidance for researchers on effectively selecting feature preprocessing techniques and integrating them with time series forecasting models.
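The abstract does not name the specific preprocessing techniques compared, so the following is only a representative sketch of one common choice: min-max normalization to [0, 1], applied before a series is fed to a forecasting model and inverted afterwards to recover forecasts in the original units. The sample values are hypothetical.

```python
def minmax_fit(series):
    """Learn the scaling bounds from the training series."""
    return min(series), max(series)

def minmax_transform(series, lo, hi):
    """Scale values into [0, 1] using the learned bounds."""
    return [(v - lo) / (hi - lo) for v in series]

def minmax_inverse(scaled, lo, hi):
    """Map scaled values (e.g. model outputs) back to original units."""
    return [v * (hi - lo) + lo for v in scaled]

raw = [120.0, 135.0, 128.0, 150.0, 142.0]   # hypothetical observations
lo, hi = minmax_fit(raw)
scaled = minmax_transform(raw, lo, hi)       # model inputs
restored = minmax_inverse(scaled, lo, hi)    # round-trip check
```

Fitting the bounds on training data only (and reusing them on test data) is what keeps this preprocessing step from leaking information into the forecast.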
Rajawat, R.K.; Desai, S.V.; Kulkarni, M.R.; Dolly Rani; Nagesh, K.V.; Sethi, R.C.
Modern-day accelerator development encompasses a myriad of technologies required for its diverse needs. Whereas RF, high voltage, vacuum, cryogenics and similar technologies meet functional requirements, high-finish lapping processes, ceramic-metal joining, oven brazing, and spark erosion or wire cutting are a must to meet fabrication requirements. The electromagnetic (EM) forming technique falls in the latter category and has been developed as a special technology. It currently caters to nuclear reactor technology development, but has the potential to meet accelerator requirements too. This paper highlights the general principle of its working and simple design guidelines, outlines its advantages, and suggests some specific areas where it could benefit accelerator technologies.
Geiger, James D; Hirschl, Ronald B
The pace of medical innovation continues to increase. The deployment of new technologies in surgery creates many ethical challenges, including how to determine the safety of a technology, the timing and process for its deployment, how patients are informed before undergoing a new technology or technique, how the outcomes of a new technology are evaluated, and how the responsibilities to individual patients and society at large are balanced. Ethical considerations relevant to the implementation of ECMO and robotic surgery are explored to further the discussion of how we can optimize the delicate balance between innovation and regulation. Copyright © 2015 Elsevier Inc. All rights reserved.
Protalinsky, O. M.; Shcherbatov, I. A.; Stepanov, P. V.
A growing number of severe accidents in the Russian Federation calls for the development of a system that could prevent emergency situations. In a number of cases the accident rate stems from careless inspections and neglect in developing repair programs. Across the country, accident rates are growing because of the so-called "human factor". In this regard, the problem of identifying the actual state of technological facilities in power engineering, using data on running engineering processes and applying artificial intelligence methods, has become urgent. The present work considers four model states of manufacturing equipment at engineering companies: defect, failure, pre-emergency situation, and accident. Defect evaluation is carried out using both data from SCADA and ASEPCR systems and qualitative information (verbal assessments by subject-matter experts, and photo and video materials of surveys processed using pattern recognition methods to satisfy the requirements). Early identification of defects makes it possible to predict the failure of manufacturing equipment using artificial neural network techniques. In turn, this helps to calculate predicted reliability characteristics of engineering facilities using methods of reliability theory. Calculation of these parameters provides a real-time estimate of the remaining service life of manufacturing equipment over the whole operation period. The neural network model allows evaluating the possibility of failure of a piece of equipment consistent with the types of actual defects and their previous causes. The article presents the grounds for the choice of training and testing samples for the developed neural network, evaluates the adequacy of the neural network model, and shows how the model can be used to forecast equipment failure. Simulation experiments have been carried out using a computer and retrospective samples of actual values for power engineering companies. The efficiency of the developed
Chmielewski, A.G.; Walis, L.
The development of radiation technologies and techniques in Poland is reviewed. In particular, thermoshrinkable olefins with shape memory, fast thermistors, and radiation sterilization are presented. The radiometric gauges produced at the Institute of Nuclear Chemistry and Technology, Warsaw, for air monitoring are also described, along with a broad group of radiotracer techniques used for environmental studies. Electron-beam radiation technologies for flue gas purification, sewage sludge hygienization and food processing are presented and their development is discussed
Chaudhuri, Sutapa; Goswami, Sayantika; Das, Debanjana; Middey, Anirban
Forecasting summer monsoon rainfall with precision is crucial for farmers planning harvests in a country like India, where the national economy is largely based on regional agriculture. The forecast of monsoon rainfall based on artificial neural networks is a well-researched problem. In the present study, the meta-heuristic ant colony optimization (ACO) technique is implemented to forecast the amount of summer monsoon rainfall for the next day over Kolkata (22.6°N, 88.4°E), India. The ACO technique belongs to swarm intelligence and simulates the decision-making processes of an ant colony, similar to other adaptive learning techniques. ACO takes inspiration from the foraging behaviour of some ant species: the ants deposit pheromone on the ground in order to mark a favourable path that should be followed by other members of the colony. A range of rainfall amounts replicating the pheromone concentration is evaluated during the summer monsoon season. The maximum amount of rainfall during the summer monsoon season (June-September) is observed to be within the range of 7.5-35 mm during the period from 1998 to 2007, which is in the range 4 category set by the India Meteorological Department (IMD). The results reveal that the accuracy in forecasting the amount of rainfall for the next day during the summer monsoon season using the ACO technique is 95%, whereas the forecast accuracy is 83% with a Markov chain model (MCM). The forecasts through ACO and MCM are compared with other existing models and validated against IMD observations from 2008 to 2012.
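The pheromone mechanism described above can be sketched with a deliberately tiny ant-colony loop: candidate rainfall-range bins carry pheromone, ants pick bins with probability proportional to pheromone, picks that match an observed day are reinforced, and pheromone evaporates each iteration. All numbers, bin counts and parameter values here are hypothetical; the paper's actual ACO formulation is not reproduced.

```python
import random

def aco_select_bin(observations, n_bins, n_ants=20, n_iter=30, rho=0.1, seed=1):
    """Toy ACO: converge pheromone onto the rainfall bin most often observed."""
    rng = random.Random(seed)
    pheromone = [1.0] * n_bins
    for _ in range(n_iter):
        obs = rng.choice(observations)            # one past day's rainfall bin
        for _ in range(n_ants):
            # roulette-wheel selection proportional to pheromone
            r, acc, choice = rng.random() * sum(pheromone), 0.0, n_bins - 1
            for b, tau in enumerate(pheromone):
                acc += tau
                if r <= acc:
                    choice = b
                    break
            if choice == obs:                     # reinforce successful trails
                pheromone[choice] += 1.0
        pheromone = [(1 - rho) * tau for tau in pheromone]  # evaporation
    return max(range(n_bins), key=lambda b: pheromone[b])

history = [2, 2, 3, 2, 1, 2, 2]  # hypothetical rainfall-range labels
best = aco_select_bin(history, n_bins=5)
```

The positive feedback (reinforcement plus evaporation) is what makes the colony's "decision" concentrate on the historically favoured range.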
The research and development of new materials such as particle-reinforced aluminum matrix composites (AMCs) will only result in a successful innovation if these materials show significant advantages not only from a technological, but also from an economic point of view. Against this background, in the Collaborative Research Center SFB 692, the concept of an integrated technology, user, and market analysis and forecast has been developed as a means for assessing the technological and commercial potential of new materials in early life cycle stages. After briefly describing this concept, it is applied to AMCs and the potential field of manufacturing aircraft components. Results show not only technological advances, but also considerable economic potential, the latter primarily resulting from the possible weight reduction enabled by the increased yield strength of the new material.
One of the most important research topics in smart grid technology is load forecasting, because the accuracy of load forecasting highly influences the reliability of smart grid systems. In the past, load forecasts were obtained by traditional analysis techniques such as time series analysis and linear regression. Since load forecasting focuses on aggregated electricity consumption patterns, researchers have recently integrated deep learning approaches with machine learning techniques. In this study, an accurate deep neural network algorithm for short-term load forecasting (STLF) is introduced. The forecasting performance of the proposed algorithm is compared with the performances of five artificial intelligence algorithms that are commonly used in load forecasting. The Mean Absolute Percentage Error (MAPE) and Cumulative Variation of Root Mean Square Error (CV-RMSE) are used as accuracy evaluation indexes. The experimental results show that the MAPE and CV-RMSE of the proposed algorithm are 9.77% and 11.66%, respectively, displaying very high forecasting accuracy.
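The two accuracy indexes named above have short closed forms, written out here as a sketch (the load and forecast values are hypothetical, not the study's data): MAPE is the mean of |actual - forecast| / actual times 100, and CV-RMSE is the RMSE divided by the mean actual value, times 100.

```python
import math

def mape(actual, forecast):
    """Mean Absolute Percentage Error, in percent."""
    return 100.0 * sum(abs(a - f) / a for a, f in zip(actual, forecast)) / len(actual)

def cv_rmse(actual, forecast):
    """Coefficient of Variation of the RMSE, in percent."""
    rmse = math.sqrt(sum((a - f) ** 2 for a, f in zip(actual, forecast)) / len(actual))
    return 100.0 * rmse / (sum(actual) / len(actual))

actual   = [100.0, 110.0, 105.0, 120.0]  # hypothetical hourly loads (MW)
forecast = [ 98.0, 112.0, 103.0, 125.0]
```

Because CV-RMSE normalizes by the mean load, it penalizes a few large misses more heavily than MAPE, which averages relative errors uniformly.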
The increase of energy consumption in the world is reflected in the consumption of natural gas. However, this increment requires additional investment. This effect leads to imbalances in terms of demand forecasting, such as the application of penalties when error rates exceed acceptable limits. As the forecasting errors increase, penalties increase exponentially. Therefore, the optimal use of natural gas as a scarce resource is important. There are various demand forecast ranges for natural gas, and the most difficult among these is day-ahead forecasting, since it is hard to implement and to make predictions with low error rates. The objective of this study is stabilizing gas transactions in day-ahead demand forecasting using low-consuming subscriber data, minimizing error using univariate artificial bee colony-based artificial neural networks (ANN-ABC). For this purpose, households' and low-consuming commercial users' four-year consumption data for the years 2011-2014 were gathered in daily periods. Previous consumption values are used to forecast day-ahead consumption values with a sliding window technique, and other independent variables are not taken into account. The dataset is divided into two parts: three years of daily consumption values are used with a seven-day window for training the networks, while the last year is used for day-ahead demand forecasting. Results show that ANN-ABC is a strong, stable, and effective method with a low error rate of 14.9% mean absolute percentage error (MAPE) for training, utilizing a univariate sliding window technique.
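The sliding-window construction described above can be sketched directly: each run of seven previous daily consumption values forms one input vector and the following day's value is the target, with no other independent variables. The ABC-trained network itself is not reproduced, and the ten-value series below is hypothetical.

```python
def sliding_window(series, window=7):
    """Build (inputs, targets) pairs for univariate day-ahead forecasting."""
    inputs, targets = [], []
    for i in range(len(series) - window):
        inputs.append(series[i:i + window])  # the previous `window` days
        targets.append(series[i + window])   # the day to forecast
    return inputs, targets

daily = list(range(1, 11))  # 10 hypothetical daily consumption values
X, y = sliding_window(daily, window=7)
```

A series of length N yields N - 7 training pairs, which is why the study can train on three years of data and still hold out the final year for day-ahead evaluation.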
Hu, Yijia; Zhong, Zhong; Zhu, Yimin; Ha, Yao
In this paper, a statistical forecast model using the time-scale decomposition method is established for seasonal prediction of the rainfall during the flood period (FPR) over the middle and lower reaches of the Yangtze River Valley (MLYRV). This method decomposes the rainfall over the MLYRV into three time-scale components, namely, the interannual component with a period of less than 8 years, the interdecadal component with a period from 8 to 30 years, and the multidecadal component with a period larger than 30 years. Then, predictors are selected for the three time-scale components of FPR through correlation analysis. Finally, a statistical forecast model is established using the multiple linear regression technique to predict the three time-scale components of the FPR, respectively. The results show that this forecast model can capture the interannual and interdecadal variation of FPR. The hindcast of FPR over the 14 years from 2001 to 2014 shows that the FPR can be predicted successfully in 11 out of the 14 years. This forecast model performs better than the model using the traditional scheme without time-scale decomposition. Therefore, the statistical forecast model using the time-scale decomposition technique has good skill and application value in the operational prediction of FPR over the MLYRV.
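The time-scale decomposition idea can be illustrated with a simplified sketch: a long running mean isolates a slow (interdecadal-like) component, and the residual keeps the fast (interannual-like) variation. The paper's actual filters, cutoff periods and regression predictors are not reproduced; the 9-point window and synthetic rainfall index below are illustrative choices only.

```python
def running_mean(series, window):
    """Centered running mean with shrinking windows at the edges."""
    half = window // 2
    out = []
    for i in range(len(series)):
        lo, hi = max(0, i - half), min(len(series), i + half + 1)
        out.append(sum(series[lo:hi]) / (hi - lo))
    return out

rain = [float(v % 7) + 10.0 for v in range(40)]  # hypothetical annual rainfall index
slow = running_mean(rain, window=9)              # slow, interdecadal-like component
fast = [r - s for r, s in zip(rain, slow)]       # fast, interannual-like residual
```

Each component can then be regressed on its own predictors, and the component forecasts summed to reconstruct the full-series forecast, which is the structure the abstract describes.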
in a stand-alone fashion. … consultations with subject matter experts. The respondents at GAO expressed that one of their most significant challenges is forecasting the global economic… supplant the role of the analyst, and it should not be operated in a stand-alone fashion. In the following chapter, the specific capabilities of
MacDougall, Pamela; Kosek, Anna Magdalena; Bindner, Henrik W.
network as well as the multivariate linear regression. It is found that it is possible to estimate the longevity of flexibility with machine learning. The linear regression algorithm is, on average, able to estimate the longevity with a 15% error. However, there was a significant improvement with the ANN… approach to investigating the longevity of aggregated response of a virtual power plant using historic bidding and aggregated behaviour with machine learning techniques. The two supervised machine learning techniques investigated and compared in this paper are multivariate linear regression and single… algorithm achieving, on average, a 5.3% error. This is lowered to 2.4% when learning for the same virtual power plant. With this information it would be possible to accurately offer residential VPP flexibility for market operations to safely avoid causing further imbalances and financial penalties.
Aspects of the recent report by the Ontario Electricity Conservation and Supply Task Force and Independent Market Operator which forecasts acute power supply shortages in Ontario, are discussed. Immediate action is recommended to avert the problem. The principal recommendation concerns the adoption of Demand Side Management as a tool to reduce the widening gap between supply and demand, citing supply shortage, imports, high prices, deregulated market and environmental concerns as the driving forces which push for the adoption of DSM. It is claimed that DSM, through its tools such as Demand/Load Response Programs and Time-of-Use rates has the capacity to create the necessary balance between supply and demand more efficiently, and in a more timely fashion than supply side management. The demand for adoption of DSM is justified on the basis of a careful examination of the magnitude and significance of each of the driving forces affecting the electricity supply in Ontario, as well as the benefits and techniques of DSM designed to manage power shortages. Energy Conservation and Efficiency and Demand/Load Response Programs are discussed as the principal DSM techniques, while Dynamic/Real Time Pricing, Time-of-Use Rates, Automated /Smart Metering, Web-based/Communication Systems, Reliability-based Programs, Market/Price-based programs, and Types of Load Control are described as the principal tools used by DSM. DSM program approaches and strategies are also reviewed, along with a brief list of successful examples of DSM applications. 3 figs
Tafur Monroy, Idelfonso; Zibar, Darko; Guerrero Gonzalez, Neil
We present the approach of cognition applied to heterogeneous optical networks developed in the framework of the EU project CHRON: Cognitive Heterogeneous Reconfigurable Optical Network. We introduce and discuss in particular the technologies and techniques that will enable a cognitive optical network to observe, act, learn and optimize its performance, taking into account its high degree of heterogeneity with respect to quality of service, transmission and switching techniques.
Panigrahi, Binay Kumar; Das, Soumya; Nath, Tushar Kumar; Senapati, Manas Ranjan
In the present study, with a view to forecasting the water flow of two rivers in eastern India, namely the river Daya and the river Bhargavi, the focus was on developing a Cascaded Functional Link Artificial Neural Network (C-FLANN) model. Parameters of the C-FLANN architecture were updated using Harmony Search (HS) and Differential Evolution (DE). As the number of samples is very low, there is a risk of overfitting. To avoid this, a MapReduce-based ANOVA technique is used to select important features. These features were provided to the architecture, which is used to predict the water flow in both rivers one day, one week and two weeks ahead. The results of both techniques were compared with a Radial Basis Function Neural Network (RBFNN) and a Multilayer Perceptron (MLP), two widely used artificial neural networks for prediction. The results confirmed that C-FLANN trained through HS gives better prediction results than when trained through DE, or than RBFNN or MLP, and can be used for predicting water flow in different rivers.
This book reports on an in-depth study of fuzzy time series (FTS) modeling. It reviews and summarizes previous research work in FTS modeling and also provides a brief introduction to other soft-computing techniques, such as artificial neural networks (ANNs), rough sets (RS) and evolutionary computing (EC), focusing on how these techniques can be integrated into different phases of the FTS modeling approach. In particular, the book describes novel methods resulting from the hybridization of FTS modeling approaches with neural networks and particle swarm optimization. It also demonstrates how a new ANN-based model can be successfully applied in the context of predicting Indian summer monsoon rainfall. Thanks to its easy-to-read style and the clear explanations of the models, the book can be used as a concise yet comprehensive reference guide to fuzzy time series modeling, and will be valuable not only for graduate students, but also for researchers and professionals working for academic, business and governmen...
Kaur, Amanpreet; Nonnenmacher, Lukas; Coimbra, Carlos F.M.
We discuss methods for net load forecasting and their significance for operation and management of power grids with high renewable energy penetration. Net load forecasting is an enabling technology for the integration of microgrid fleets with the macrogrid. Net load represents the load that is traded between the grids (microgrid and utility grid). It is important for resource allocation and electricity market participation at the point of common coupling between the interconnected grids. We compare two inherently different approaches: additive and integrated net load forecast models. The proposed methodologies are validated on a microgrid with 33% annual renewable energy (solar) penetration. A heuristics-based solar forecasting technique is proposed, achieving a skill of 24.20%. The integrated solar and load forecasting model outperforms the additive model by 10.69% and the uncertainty range for the additive model is larger than the integrated model by 2.2%. Thus, for grid applications an integrated forecast model is recommended. We find that the net load forecast errors and the solar forecasting errors are cointegrated with a common stochastic drift. This is useful for future planning and modeling because the solar energy time-series allows one to infer important features of the net load time-series, such as expected variability and uncertainty. - Highlights: • Net load forecasting methods for grids with renewable energy generation are discussed. • Integrated solar and load forecasting outperforms the additive model by 10.69%. • Net load forecasting reduces the uncertainty between the interconnected grids.
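The additive/integrated distinction can be made concrete with a toy sketch using hypothetical numbers and a naive persistence rule (the paper's forecast models are far more sophisticated). The additive path forecasts load and solar separately and subtracts; the integrated path forecasts the net load series directly. With a simple linear rule like persistence the two paths coincide exactly; differences, and the integrated model's advantage, arise with more expressive nonlinear models.

```python
def persistence(series):
    """Naive forecast: tomorrow equals today."""
    return series[-1]

load  = [50.0, 52.0, 51.0, 55.0]   # hypothetical demand (MW)
solar = [10.0, 12.0,  9.0, 11.0]   # hypothetical solar output (MW)
net   = [l - s for l, s in zip(load, solar)]   # net load = load - solar

# Additive: forecast the two components, then combine.
additive_forecast = persistence(load) - persistence(solar)

# Integrated: forecast the combined (net) series directly.
integrated_forecast = persistence(net)
```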
“Practical Applications of Evolutionary Computation to Financial Engineering” presents the state of the art techniques in Financial Engineering using recent results in Machine Learning and Evolutionary Computation. This book bridges the gap between academics in computer science and traders and explains the basic ideas of the proposed systems and the financial problems in ways that can be understood by readers without previous knowledge on either of the fields. To cement the ideas discussed in the book, software packages are offered that implement the systems described within. The book is structured so that each chapter can be read independently from the others. Chapters 1 and 2 describe evolutionary computation. The third chapter is an introduction to financial engineering problems for readers who are unfamiliar with this area. The following chapters each deal, in turn, with a different problem in the financial engineering field describing each problem in detail and focusing on solutions based on evolutio...
The present shaping of a renewed economic model in Cuba should be based on the efficient use of the productive factors available to the country, with emphasis on the substitution of imports. In Chapter VII, article 184 of the Guidelines of the Economic and Social Policy of the Party and the Revolution, one can read: "To prioritize, in the short term, the substitution of imports of those foods that can be produced efficiently in the country; the application of the results of science and technology must also be multiplied." The objective of the present investigation is to use econometric techniques to forecast the cost of sugar production, using factors in the productive process: days of harvest, use of potential, recovered capacity, and industrial yield. The results indicate that the factor with the greatest influence on the decrease of costs is the industrial yield. A cost forecast is also obtained for the province of Santiago de Cuba at different harvest stages, oscillating between $372.45 and $517.52, with extreme values of $303.21 and $777.60.
Papiu, G. A.; Suciu, N.
The relationship between art and technique has over time been inseparable and systematic, with artists turning to various technologies, tools, and practices that help stimulate their imagination. Today there is a new category of artists, coming from technical or scientific fields, who are drawn into this "game of art". The mosaic, even though it is an old technique, has responded to social requirements and evolved over time, remaining constantly related to aesthetic and artistic thinking and to the discoveries of science, permanently assimilating new techniques and technologies, diversifying its artistic forms of expression and methods of transposition. No longer bound to a religious institution, which was its birthplace, the mosaic has today migrated to all public spaces. Works of art in public space have become an active factor in reshaping the urban aesthetic landscape.
Abdul Bahari Othman; Mohd Zamri Yusoff
One of the important decisions to be made by the management of a hydroelectric power plant associated with a major storage reservoir is to determine the best turbine water release decision for the next financial year. The water release decision enables the firm energy generated for the coming financial year to be estimated. This is usually a simple and straightforward task provided that the amount of turbine water release is known. The more challenging task is to determine the best water release decision that resolves two conflicting operational objectives: minimizing the drop of turbine gross head and maximizing the upper reserve margin of the reservoir. Most techniques from the literature emphasize the statistical simulation approach. Markovian models, for example, are a class of statistical model that utilizes the past and present system states as a basis for predicting the future. This paper illustrates that a rigorous solution criterion can be mathematically proven to resolve those two conflicting operational objectives. Thus, the best water release decision that maximizes potential energy for the prevailing natural inflow is met. It is shown that the annual water release decision shall be made in such a manner that annual return inflow with a return frequency smaller than the critical return frequency (f_c) should not be considered. This criterion enables the target turbine gross head to be set to a well-defined elevation. In other words, the upper storage margin of the reservoir shall be made available to capture the magnitude of future inflow with a return frequency greater than or equal to f_c. A case study is shown to demonstrate practical application of the derived mathematical formulas.
In the mobile Internet of Things there are many challenges, including the sensing technology of sensors, how and when to join cooperative transmission, and how to select the cooperative sensors. To address these problems, we studied combination forecasting based on the multilevel sensing technology of sensors, building upon which we proposed an adaptive opportunistic cooperative control mechanism based on threshold values such as activity probability, distance, transmitting power, and number of relay sensors, in consideration of signal-to-noise ratio and outage probability. More importantly, the relay sensors perform real-time self-tests in order to judge whether to join the cooperative transmission, maintaining the optimal cooperative transmission state with high performance. The mathematical analysis results show that the proposed adaptive opportunistic cooperative control approach performs better in terms of throughput ratio, packet error rate, delay, and energy efficiency, compared with the direct transmission and opportunistic cooperative approaches.
Dang, Mau Chien; Dung Dang, Thi My; Fribourg-Blanc, Eric
Inkjet printing is an advanced technique which reliably reproduces text, images and photos on paper and some other substrates by desktop printers and is now used in the field of materials deposition. This interest in maskless materials deposition is coupled with the development of microfabrication techniques for the realization of circuits or patterns on flexible substrates for which printing techniques are of primary interest. This paper is a review of some results obtained in inkjet printing technology to develop microfabrication techniques at Laboratory for Nanotechnology (LNT). Ink development, in particular conductive ink, study of printed patterns, as well as application of these to the realization of radio-frequency identification (RFID) tags on flexible substrates, are presented. (paper)
Regression analysis seeks to explain the variations in dependent variables as functions (commonly referred to as regression functions) of variations in independent variables. With this knowledge it is then possible to perform prediction and forecasting of the values that the dependent variables take. ...G.; “A General Method for Estimating a Linear Structural Equation System,” in Structural Equation Models in the Social Sciences, eds.: A.S. Goldberger and O. D. Duncan, New York: Seminar, 1973; Steinberg, A.N. and Rogova, G.; "Situation...
Leakage power in CMOS VLSI technology is a great concern. To reduce leakage power in CMOS circuits, a Leakage Power Minimization Technique (LPMT) is implemented in this paper. Leakage currents are monitored and compared. The comparator kicks the charge pump to supply the body voltage (Vbody). Simulations of these circuits are done using TSMC 0.35 µm technology at various operating temperatures. A current-steering digital-to-analog converter (CSDAC) is used as the test core to validate the idea. The test core (an 8-bit CSDAC) had a power consumption of 347.63 mW. The LPMT circuit alone consumes 6.3405 mW. This technique reduces the leakage power of the 8-bit CSDAC by 5.51 mW and increases the reliability of the test core. Mentor Graphics ELDO and EZwave are used for simulations.
The authors’ intention is to correlate basic knowledge in using the TRIZ methodology (Theory of Inventive Problem Solving, or in Russian: Teoriya Resheniya Izobretatelskikh Zadatch) as a problem-solving tool meant to help decision makers perform more significant forecasting exercises. The idea is to identify the TRIZ features and instruments (e.g. the 40 inventive principles) for putting in evidence the noise and signal problem, for trend identification (qualitative and quantitative tendencies), and as support tools in technological forecasting, enabling decision-makers to refine and increase the level of confidence in the forecasting results. The interest in connecting TRIZ to forecasting methodology nowadays relates to the massive application of TRIZ methods and techniques for engineering system development world-wide and to the growing application of TRIZ’s concepts and paradigms for the improvement of non-engineering systems (including business and economic applications).
Williams, K. A.; Partridge, E. C., III
Originally envisioned as a means to integrate the many systems found throughout the government, the general mission of the NCS continues to be to ensure the survivability of communications during and subsequent to any national emergency. In order to accomplish this mission the NCS is an arrangement of heterogeneous telecommunications systems which are provided by their sponsoring Federal agencies. The physical components of Federal telecommunications systems and networks include telephone and digital data switching facilities and primary common-user communications centers; special-purpose local delivery message switching and exchange facilities; government-owned or leased radio systems; and technical control facilities which are under the exclusive control of a government agency. This thesis describes the logical design of a proposed decision support system for use by the National Communications System in forecasting technology, prices, and costs. It is general in nature and only includes those forecasting models which are suitable for computer implementation. Because it is a logical design it can be coded and applied in many different hardware and/or software configurations.
Zack, J. W.
Predictions from Numerical Weather Prediction (NWP) models are the foundation for wind power forecasts for day-ahead and longer forecast horizons. The NWP models directly produce three-dimensional wind forecasts on their respective computational grids. These can be interpolated to the location and time of interest. However, these direct predictions typically contain significant systematic errors ("biases"). This is due to a variety of factors including the limited space-time resolution of the NWP models and shortcomings in the models' representation of physical processes. It has become common practice to attempt to improve the raw NWP forecasts by statistically adjusting them through a procedure that is widely known as Model Output Statistics (MOS). The challenge is to identify complex patterns of systematic errors and then use this knowledge to adjust the NWP predictions. The MOS-based improvements are the basis for much of the value added by commercial wind power forecast providers. There are an enormous number of statistical approaches that can be used to generate the MOS adjustments to the raw NWP forecasts. In order to obtain insight into the potential value of some of the newer and more sophisticated statistical techniques, often referred to as "machine learning methods", a MOS-method comparison experiment has been performed for wind power generation facilities in 6 wind resource areas of California. The underlying NWP models that provided the raw forecasts were the two primary operational models of the US National Weather Service: the GFS and NAM models. The focus was on 1- and 2-day-ahead forecasts of the hourly wind-based generation. The statistical methods evaluated included: (1) screening multiple linear regression, which served as a baseline method, (2) artificial neural networks, (3) a decision-tree approach called random forests, (4) gradient boosted regression based upon a decision-tree algorithm, (5) support vector regression, and (6) an analog ensemble.
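The baseline MOS method named above, screening multiple linear regression, amounts to fitting observed generation against the raw NWP output over a training period and applying the fitted correction to new forecasts. A minimal sketch with an invented gain-and-offset systematic error (not data from the study):

```python
import numpy as np

# Hypothetical raw NWP wind-power forecasts (MW) and matching observations.
# The synthetic "systematic error" is a gain and offset: obs = 0.8*raw + 2.
raw = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
obs = 0.8 * raw + 2.0

# MOS as linear regression: fit obs ~ a*raw + b via least squares,
# then apply the correction to (future) raw forecasts.
A = np.column_stack([raw, np.ones_like(raw)])
(a, b), *_ = np.linalg.lstsq(A, obs, rcond=None)

corrected = a * raw + b
print(round(a, 3), round(b, 3))  # recovered gain and offset
```

The more sophisticated methods in the comparison (random forests, gradient boosting, support vector regression) replace this linear map with nonlinear regressors trained on the same forecast/observation pairs.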
Klimmt, C.; Roth, C.; Vermeulen, I.E.; Vorderer, P.A.; Roth, F.S.
Advances in gaming and other entertainment technologies are evolving rapidly and create new conceptual challenges for understanding and explaining the user experiences they can facilitate. The present article reports a prospective study on a particularly promising entertainment technology of the
Kucharavy , Dmitry; De Guio , Roland
The ability to foresee future technology is a key task of Innovative Design. The paper focuses on the obstacles to reliable prediction of technological evolution for the purpose of Innovative Design. First, a brief analysis of problems for existing forecasting methods is presented. The causes for the complexity of technology prediction are discussed in the context of reduction of the forecast errors. Second, using a contradiction analysis, a set of problems related to ...
Grover S. Kearns
Recent prosecutions of highly publicized white-collar crimes, combined with public outrage, have resulted in heightened regulation of financial reporting and greater emphasis on systems of internal control. Because both white-collar and cybercrimes are usually perpetrated through computers, internal and external auditors’ knowledge of information technology (IT) is now more vital than ever. However, preserving digital evidence and investigative techniques, which can be essential to fraud examinations, are not skills frequently taught in accounting programs, and instruction in the use of computer-assisted auditing tools and techniques – applications that might uncover fraudulent activity – is limited. Only a few university-level accounting classes provide instruction in IT investigative techniques. This paper explains why such a course would be beneficial to the program, the college, and the student. Additionally, it presents a proposed curriculum and suggests useful resources for the instructor and student.
Flyvbjerg, Bent; Holm, Mette K. Skamris; Buhl, Søren Ladegaard
This paper presents results from the first statistically significant study of traffic forecasts in transportation infrastructure projects. The sample used is the largest of its kind, covering 210 projects in 14 nations worth US$58 billion. The study shows with very high statistical significance … that forecasters generally do a poor job of estimating the demand for transportation infrastructure projects. The result is substantial downside financial and economic risk. Forecasts have not become more accurate over the 30-year period studied. If techniques and skills for arriving at accurate demand forecasts … forecasting. Highly inaccurate traffic forecasts combined with large standard deviations translate into large financial and economic risks. But such risks are typically ignored or downplayed by planners and decision-makers, to the detriment of social and economic welfare. The paper presents the data …
Weather prediction is performed using a numerical model of the evolution of the atmosphere. The evolution equations are derived from the Navier-Stokes equations for the adiabatic part, but they are greatly complicated by the change of phase of water, the radiation process, and the boundary layer. The technique used operationally is described. Weather prediction is an initial value problem, and accurate initial conditions need to be specified. Due to the small number of observations available (10^5) compared to the dimension of the model state variable (10^7), the problem is largely underdetermined. Techniques of optimal control and inverse problems are used and have been adapted to the large dimension of the problem. The atmosphere is a chaotic system; the implications for weather prediction are discussed. Ensemble prediction is used operationally, and the technique for generating initial conditions that lead to a numerical divergence of the subsequent forecasts is described.
Named Entity Recognition (NER) is a main task in Natural Language Processing, supporting the extraction of information from unstructured data. It can be addressed with probabilistic graphical models that represent the conditional independence assumptions of sequential labelling. In this paper, we propose a discriminative graphical model using linear-chain Conditional Random Fields (CRFs). We present experiments based on the CoNLL-2002 shared task and the AnCora corpus according to the following criteria: recall, precision, and F-score. Our contributions in this work are the following: first, we tested our baseline on the CoNLL-2002 shared task, obtaining 80% F1-measure, and 59% F1-measure on the AnCora corpus, respectively. Finally, the application Vigtech allows us to identify information and patterns in the cancer topic; we discuss the results according to the model performance and the useful information to support the forecasting process.
The contributions contain information concerning the present state and development of radiometric measurement techniques in metallurgy and foundry technology as well as their application to the solution of various problems. The development of isotope techniques is briefly described. Major applications of radiometric equipment in industrial measurement are presented together with the use of isotopes to monitor processes of industrial production. This is followed by a short description of numerous laboratory-scale applications. Another contribution deals with fundamental problems and methods of moisture measurement by neutrons. A combined moisture/density measurement device, whose practical applicability has been tested, is described. Possibilities for clay determination in used-up moulding materials are discussed in a further contribution. The clay content can be determined by real-time radiometric density measurement so that the necessary moisture or addition of fresh sand can be controlled. (orig.) With 20 figs., 9 tabs., 178 refs [de
Grisak, G.E.; Pickens, J.F.
A review of techniques and technology pertaining to the movement of ground water, solutes/radionuclides, and heat through porous and fractured media has been conducted. The theory describing each of these processes has been presented in terms of its partial differential equations. The equations serve as the basis for the identification of the governing parameters. These parameters have been discussed in detail with regard to their importance in controlling the overall transport processes. A hypothetical research program has been assembled for the purpose of illustrating the hydrogeologic methods and research techniques applicable to site characterization studies. Areas where the current state of the art is lacking have been identified and the necessary research has been recommended. 103 refs
Some techniques and technologies for proposed interplanetary missions are described. Methods for reducing the effect of zero gravity on humans during missions to Mars and the moon, and the need for launch vehicles with increased lift capability are discussed. The use of nuclear power, liquid oxygen from the moon, and helium 3 as propellants for spacecraft is examined. The development and capabilities of the Shuttle Z vehicle are considered. Attention is given to the Space Station Freedom and Energia. A launch vehicle concept which utilizes the Shuttle Z for a mission to Mars is presented.
Gabriela Victoria Anghelache
The analysis of various forecasting strategies has been conducted using data sets on a daily basis, over a time horizon of nine years, for a total of 22 companies listed on the BSE and for the BET and BET-C exchange indices; the research differentiates between the pre-crisis period and the crisis period.
El Zarwi, Feras
The transportation system is undergoing major technological and infrastructural changes, such as the introduction of autonomous vehicles, high speed rail, carsharing, ridesharing, flying cars, drones, and other app-driven on-demand services. While the changes are imminent, the impact on travel behavior is uncertain, as is the role of policy in shaping the future. Literature shows that even under the most optimistic scenarios, society’s environmental goals cannot be met by technology, operatio...
Md. Tabrez Quasim; Rupak Chattopadhyay
Any business enterprise must rely heavily on how well it can predict future events. To cope with modern global customer demand, technological challenges, market competition, etc., an organization is compelled to foresee the future with maximum impact and the least chance of error. The traditional forecasting approaches have some limitations. That is why the business world is adopting modern Artificial Intelligence based forecasting techniques. This paper has tried to presen...
Timmermann, Allan G
Forecast combinations have frequently been found in empirical studies to produce better forecasts on average than methods based on the ex-ante best individual forecasting model. Moreover, simple combinations that ignore correlations between forecast errors often dominate more refined combination schemes aimed at estimating the theoretically optimal combination weights. In this paper we analyse theoretically the factors that determine the advantages from combining forecasts (for example, the d...
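The simple equal-weighted combination discussed above is easy to state concretely: average the individual forecasts, ignoring error correlations entirely. A minimal sketch with invented forecasts given opposite biases so the averaging benefit is visible (real forecast errors are rarely this conveniently offsetting):

```python
import numpy as np

def rmse(f, y):
    """Root-mean-square forecast error."""
    return np.sqrt(np.mean((f - y) ** 2))

y  = np.array([5.0, 6.0, 7.0, 8.0])  # realized outcomes
f1 = y + 1.0                          # forecaster 1: biased high
f2 = y - 1.0                          # forecaster 2: biased low

combo = 0.5 * (f1 + f2)               # simple equal-weighted average

print(rmse(f1, y), rmse(f2, y), rmse(combo, y))
```

When individual errors are negatively correlated, as here, the equal-weighted average can beat both members; estimating "optimal" weights instead would add estimation error, which is one factor behind the paper's finding that simple combinations often dominate refined schemes.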
Aiolfi, Marco; Capistrán, Carlos; Timmermann, Allan
We consider combinations of subjective survey forecasts and model-based forecasts from linear and non-linear univariate specifications as well as multivariate factor-augmented models. Empirical results suggest that a simple equal-weighted average of survey forecasts outperform the best model-based forecasts for a majority of macroeconomic variables and forecast horizons. Additional improvements can in some cases be gained by using a simple equal-weighted average of survey and model-based fore...
Williams, J. L.; Maxwell, R. M.; Delle Monache, L.
Wind power is rapidly gaining prominence as a major source of renewable energy. Harnessing this promising energy source is challenging because of the chaotic nature of wind and its propensity to change speed and direction over short time scales. Accurate forecasting tools are critical to support the integration of wind energy into power grids and to maximize its impact on renewable energy portfolios. Numerous studies have shown that soil moisture distribution and land surface vegetative processes profoundly influence atmospheric boundary layer development and weather processes on local and regional scales. Using the PF.WRF model, a fully-coupled hydrologic and atmospheric model employing the ParFlow hydrologic model with the Weather Research and Forecasting model coupled via mass and energy fluxes across the land surface, we have explored the connections between the land surface and the atmosphere in terms of land surface energy flux partitioning and coupled variable fields including hydraulic conductivity, soil moisture and wind speed, and demonstrated that reductions in uncertainty in these coupled fields propagate through the hydrologic and atmospheric system. We have adapted the Data Assimilation Research Testbed (DART), an implementation of the robust Ensemble Kalman Filter data assimilation algorithm, to expand our capability to nudge forecasts produced with the PF.WRF model using observational data. Using a semi-idealized simulation domain, we examine the effects of assimilating observations of variables such as wind speed and temperature collected in the atmosphere, and land surface and subsurface observations such as soil moisture on the quality of forecast outputs. The sensitivities we find in this study will enable further studies to optimize observation collection to maximize the utility of the PF.WRF-DART forecasting system.
Sakti, Apurba; Azevedo, Inês M.L.; Fuchs, Erica R.H.; Michalek, Jeremy J.; Gallagher, Kevin G.; Whitacre, Jay F.
There are a large number of accounts about rapidly declining costs of batteries with potentially transformative effects, but these accounts often are not based on detailed design and technical information. Using a method ideally suited for that purpose, we find that when experts are free to assume any battery pack design, a majority of the cost estimates are consistent with the ranges reported in the literature, although the range is notably large. However, we also find that 55% of relevant experts’ component-level cost projections are inconsistent with their total pack-level projections, and 55% of relevant experts’ elicited cost projections are inconsistent with the cost projections generated by putting their design- and process-level assumptions into our process-based cost model (PBCM). These results suggest a need for better understanding of the technical assumptions driving popular consensus regarding future costs. Approaches focusing on technological details first, followed by non-aggregated and systemic cost estimates while keeping the experts aware of any discrepancies, should they arise, may result in more accurate forecasts. - Highlights: • New technology cost projections often confuse underlying technical assumptions. • We use a novel method combining expert elicitations and production cost models. • 55% of experts’ component-level cost projections contain internal inconsistencies. • Technical assumptions driving popular consensus regarding future costs are key.
Rounaghi, Mohammad Mahdi; Abbaszadeh, Mohammad Reza; Arashi, Mohammad
One of the most important topics of interest to investors is stock price changes. Investors whose goals are long term are sensitive to stock prices and their changes and react to them. In this regard, we used the multivariate adaptive regression splines (MARS) model and the semi-parametric splines technique for predicting stock prices in this study. The MARS model, as a nonparametric method, is an adaptive method for regression and is suited to problems with high dimensions and several variables. The semi-parametric splines technique is based on smoothing splines, a nonparametric regression method. In this study, we used 40 variables (30 accounting variables and 10 economic variables) for predicting stock prices using the MARS model and the semi-parametric splines technique. After investigating the models, we selected 4 accounting variables (book value per share, predicted earnings per share, P/E ratio, and risk) as influential variables for predicting stock price using the MARS model. After fitting the semi-parametric splines technique, only 4 accounting variables (dividends, net EPS, EPS forecast, and P/E ratio) were selected as variables effective in forecasting stock prices.
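MARS builds adaptive piecewise-linear fits from hinge basis functions max(0, x − t). A simplified sketch of a hinge-basis fit, assuming a single known knot (full MARS searches over knots and variables adaptively) and invented data:

```python
import numpy as np

def hinge(x, knot):
    """MARS-style hinge basis function max(0, x - knot)."""
    return np.maximum(0.0, x - knot)

# Synthetic response that is flat, then rises linearly after the knot at 3.
x = np.linspace(0.0, 6.0, 13)
y = 1.0 + 2.0 * hinge(x, 3.0)

# Fit a linear model on the hinge-expanded basis by least squares.
B = np.column_stack([np.ones_like(x), hinge(x, 3.0)])
coef, *_ = np.linalg.lstsq(B, y, rcond=None)
print(np.round(coef, 3))  # intercept and hinge slope
```

The adaptivity of real MARS lies in forward selection of knots and interaction terms followed by backward pruning; this sketch only shows why the hinge basis can capture the kind of regime change a fixed linear regression cannot.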
Daiha, Karina de Godoy; Angeli, Renata; de Oliveira, Sabrina Dias; Almeida, Rodrigo Volcan
The great potential of lipases is known since 1930 when the work of J. B. S. Haldane was published. After eighty-five years of studies and developments, are lipases still important biocatalysts? For answering this question the present work investigated the technological development of four important industrial sectors where lipases are applied: production of detergent formulations; organic synthesis, focusing on kinetic resolution; production of biodiesel; and production of food and feed products. The analysis was made based on research publications and patent applications, working as scientific and technological indicators, respectively. Their evolution, interaction, the major players of each sector and the main subject matters disclosed in patent documents were discussed. Applying the concept of technology life cycle, S-curves were built by plotting cumulative patent data over time to monitor the attractiveness of each technology for investment. The results lead to a conclusion that the use of lipases as biocatalysts is still a relevant topic for the industrial sector, but developments are still needed for lipase biocatalysis to reach its full potential, which are expected to be achieved within the third, and present, wave of biocatalysis.
Gerikh, Valentin; Kolosok, Irina; Kurbatsky, Victor; Tomin, Nikita
The paper presents the results of experimental studies concerning calculation of electricity prices in different price zones in Russia and Europe. The calculations are based on the intelligent software "ANAPRO" that implements the approaches based on the modern methods of data analysis and artificial intelligence technologies.
Gong, Bing; Ordieres-Meré, Joaquín
Since air pollution can cause serious health problems, concerns about forecasting air pollution in a timely and accurate manner have arisen among researchers, in order to alert the public to avoid high pollution levels and to help the government make decisions. In our research, we take Hong Kong (research finished) and cities in Mexico (finishing) and Mainland China (starting), especially highly polluted areas such as Mexico City and Beijing, as study cases. Meanwhile, various types of arti...
Shabri, Ani; Samsudin, Ruhaidah
Crude oil prices play a significant role in the global economy and are a key input into option pricing formulas, portfolio allocation, and risk measurement. In this paper, a hybrid model integrating wavelets and multiple linear regression (MLR) is proposed for crude oil price forecasting. In this model, the Mallat wavelet transform is first selected to decompose the original time series into several subseries at different scales. Then, principal component analysis (PCA) is used to process the subseries data in MLR for crude oil price forecasting. Particle swarm optimization (PSO) is used to select the optimal parameters of the MLR model. To assess the effectiveness of this model, the daily crude oil market, West Texas Intermediate (WTI), has been used as the case study. The time series prediction performance of the WMLR model is compared with the MLR, ARIMA, and GARCH models using various statistical measures. The experimental results show that the proposed model outperforms the individual models in forecasting the crude oil price series. PMID:24895666
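The wavelet decomposition step of such a hybrid model splits a price series into smooth and detail subseries before regression. A minimal sketch using one level of the Haar wavelet (a simplification of the Mallat multiresolution scheme used in the paper), with invented prices:

```python
import numpy as np

def haar_step(x):
    """One level of the Haar wavelet transform: split an even-length
    series into approximation (smooth) and detail coefficients."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return approx, detail

def haar_inverse(approx, detail):
    """Invert one Haar step, recovering the original series exactly."""
    x = np.empty(2 * len(approx))
    x[0::2] = (approx + detail) / np.sqrt(2.0)
    x[1::2] = (approx - detail) / np.sqrt(2.0)
    return x

prices = np.array([50.0, 52.0, 51.0, 55.0, 54.0, 53.0, 56.0, 58.0])
a, d = haar_step(prices)
print(np.allclose(haar_inverse(a, d), prices))  # lossless split
```

In the hybrid scheme each subseries (here `a` and `d`, at one scale; the paper uses several) becomes an input to the regression stage, so the forecaster works with smoothed trend and fluctuation components instead of the raw noisy series.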
Simpson, Sue; Hyde, Chris; Cook, Alison; Packer, Claire; Stevens, Andrew
Early warning systems are an integral part of many health technology assessment programs. Despite this finding, to date, there have been no quantitative evaluations of the accuracy of predictions made by these systems. We report a study evaluating the accuracy of predictions made by the main United Kingdom early warning system. As prediction of impact is analogous to diagnosis, a method normally applied to determine the accuracy of diagnostic tests was used. The sensitivity, specificity, and predictive values of the National Horizon Scanning Centre's prediction methods were estimated with reference to an (imperfect) gold standard, that is, expert opinion of impact 3 to 5 years after prediction. The sensitivity of predictions was 71 percent (95 percent confidence interval [CI], 0.36-0.92), and the specificity was 73 percent (95 percent CI, 0.64-0.8). The negative predictive value was 98 percent (95 percent CI, 0.92-0.99), and the positive predictive value was 14 percent (95 percent CI, 0.06-0.3). Forecasting is difficult, but the results suggest that this early warning system's predictions have an acceptable level of accuracy. However, there are caveats. The first is that early warning systems may themselves reduce the impact of a technology, as helping to control adoption and diffusion is their main purpose. The second is that the use of an imperfect gold standard may bias the results. As early warning systems are viewed as an increasingly important component of health technology assessment and decision making, their outcomes must be evaluated. The method used here should be investigated further and the accuracy of other early warning systems explored.
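The accuracy measures reported above follow directly from a 2×2 table of predicted versus actual impact, exactly as for a diagnostic test. A minimal sketch with hypothetical counts (not the study's actual table):

```python
def diagnostic_accuracy(tp, fp, fn, tn):
    """Sensitivity, specificity, and predictive values from a 2x2 table
    of predicted vs. actual impact (tp/fp/fn/tn are cell counts)."""
    return {
        "sensitivity": tp / (tp + fn),   # detected / all true impacts
        "specificity": tn / (tn + fp),   # dismissed / all non-impacts
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

# Hypothetical counts chosen for illustration only.
m = diagnostic_accuracy(tp=5, fp=30, fn=2, tn=80)
print({k: round(v, 3) for k, v in m.items()})
```

Note how a low PPV can coexist with a high NPV when true high-impact technologies are rare: most positive predictions are false alarms, yet a negative prediction is almost always right, which mirrors the 14% / 98% pattern the study reports.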
del Campo, Jose María; Negro, Vicente; Núñez, Miguel
The European Higher Education Area, launched in 1999 with the Bologna Declaration, has given unprecedented magnitude and agility to the transformation process undertaken by European universities. However, the change has been even more profound and drastic with regard to the use of new technologies both inside and outside the classroom. This article focuses on the study and analysis of the history of technology within university education and its impact on teachers, students and teaching methods. A...
O'Donncha, F.; Zhang, Y.; James, S. C.
Modern smart-grid networks use technologies to instantly relay information on supply and demand to support effective decision making. Integration of renewable-energy resources with these systems demands accurate forecasting of energy production (and demand) capacities. For wave-energy converters, this requires wave-condition forecasting to enable estimates of energy production. Current operational wave forecasting systems exhibit substantial errors with wave-height RMSEs of 40 to 60 cm being typical, which limits the reliability of energy-generation predictions thereby impeding integration with the distribution grid. In this study, we integrate physics-based models with statistical learning aggregation techniques that combine forecasts from multiple, independent models into a single "best-estimate" prediction of the true state. The Simulating Waves Nearshore physics-based model is used to compute wind- and currents-augmented waves in the Monterey Bay area. Ensembles are developed based on multiple simulations perturbing input data (wave characteristics supplied at the model boundaries and winds) to the model. A learning-aggregation technique uses past observations and past model forecasts to calculate a weight for each model. The aggregated forecasts are compared to observation data to quantify the performance of the model ensemble and aggregation techniques. The appropriately weighted ensemble model outperforms an individual ensemble member with regard to forecasting wave conditions.
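One simple learning-aggregation rule consistent with the description above is to weight each ensemble member by its inverse past mean squared error against observations; the sketch below applies it to synthetic wave-height data. The paper's actual aggregation technique may differ, and member error levels here are invented.

```python
import numpy as np

def aggregate_weights(past_forecasts, past_obs):
    """Weight each ensemble member by its inverse past mean squared error."""
    mse = np.mean((past_forecasts - past_obs[:, None]) ** 2, axis=0)
    w = 1.0 / mse
    return w / w.sum()

rng = np.random.default_rng(1)
truth = 1.5 + 0.3 * np.sin(np.linspace(0, 6, 50))   # "true" wave height (m)
# Three ensemble members with different (assumed) error levels.
members = truth[:, None] + rng.normal(0, [0.1, 0.3, 0.6], (50, 3))

w = aggregate_weights(members[:40], truth[:40])      # learn weights on the past
blend = members[40:] @ w                             # single best-estimate forecast
rmse_blend = np.sqrt(np.mean((blend - truth[40:]) ** 2))
print(np.round(w, 2), round(rmse_blend, 3))
```

The weights concentrate on the historically most accurate member, which is what lets the aggregated forecast track the truth more closely than a naive equal-weight mean.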
Pérez, Iñaki; Ambrosio, Cristina
Material saving is currently a must for forging companies, as material costs account for up to 50% of the cost of parts made of steel and up to 90% for other materials such as titanium. For long products, cross wedge rolling (CWR) technology can be used to obtain forging preforms with a suitable distribution of the material along their axis. However, defining the correct preform dimensions is not an easy task and may require an intensive trial-and-error campaign. To speed up preform definition, it is necessary to apply optimization techniques to Finite Element Models (FEM) able to reproduce the material behaviour during rolling. Metamodel-Assisted Evolution Strategies (MAES), which combine evolutionary algorithms with Kriging metamodels, are implemented in FORGE® software and allow optimization computation costs to be reduced considerably. The paper shows the application of these optimization techniques to the definition of the right preform for a shaft from an agricultural vehicle. First, the current forging process, in which the forging preform is obtained by means of an open die forging operation, is shown. Then, the CWR preform optimization is developed using the above-mentioned optimization techniques. The objective is to reduce the initial billet weight as much as possible, so the flash weight reduction due to the use of the proposed preform is calculated. Finally, a simulation of the CWR process for the defined preform is carried out to check that the most common failures in CWR (necking, spirals, etc.) do not appear in this case.
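A toy sketch of the metamodel-assisted evolution strategy idea: a cheap surrogate built from past evaluations pre-screens a batch of offspring so that only the most promising candidate receives an "expensive" evaluation. In FORGE® that evaluation would be a CWR finite-element run; here a cheap quadratic stands in for it, and a nearest-neighbour lookup is a deliberate simplification of a Kriging metamodel.

```python
import numpy as np

def expensive_objective(x):
    """Stand-in for a costly FEM evaluation (e.g. billet weight + defect penalty)."""
    return (x[0] - 2.0) ** 2 + (x[1] + 1.0) ** 2

def surrogate_predict(x, X_seen, y_seen):
    """Cheap nearest-neighbour metamodel built from past evaluations."""
    d = np.linalg.norm(X_seen - x, axis=1)
    return y_seen[np.argmin(d)]

rng = np.random.default_rng(2)
parent = np.array([0.0, 0.0])            # initial preform parameters (toy)
X_seen = [parent.copy()]
y_seen = [expensive_objective(parent)]

for gen in range(40):
    # Generate offspring, but "FEM-evaluate" only the surrogate's favourite.
    offspring = parent + rng.normal(0, 0.3, (8, 2))
    scores = [surrogate_predict(o, np.array(X_seen), np.array(y_seen))
              for o in offspring]
    best = offspring[int(np.argmin(scores))]
    f = expensive_objective(best)
    X_seen.append(best); y_seen.append(f)
    if f < min(y_seen[:-1]):
        parent = best                    # plus-selection: keep the best so far

print(np.round(parent, 1), round(min(y_seen), 3))
```

The saving comes from the ratio: 8 candidates are screened per generation but only 1 expensive evaluation is spent.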
... for the third and fourth day precipitation forecasts. A marked improvement was shown for the consensus 24-hour precipitation forecast, and small... Zuckerberg (1980) found a small long-term skill increase in forecasts of heavy snow events for nine eastern cities. Other National Weather Service... and maximum temperature) are each awarded marks of 2, 1, or 0 according to whether the forecast is correct...
Marcos, Raül; Llasat, Ma Carmen; Quintana-Seguí, Pere; Turco, Marco
In this paper, we have compared different bias correction methodologies to assess whether they could be advantageous for improving the performance of a seasonal prediction model for volume anomalies in the Boadella reservoir (northwestern Mediterranean). The bias correction adjustments have been applied to precipitation and temperature from the European Centre for Medium-Range Weather Forecasts System 4 (S4). We have used three bias correction strategies: two linear (mean bias correction, BC, and linear regression, LR) and one non-linear (Model Output Statistics analogs, MOS-analog). The results have been compared with climatology and persistence. The volume-anomaly model is a previously computed multiple linear regression that ingests precipitation, temperature and inflow anomaly data to simulate monthly volume anomalies. The potential utility for end-users has been assessed using economic value curve areas. We have studied the S4 hindcast period 1981-2010 for each month of the year and up to seven months ahead, considering an ensemble of 15 members. We have shown that the MOS-analog and LR bias corrections can improve on the original S4. The application to volume anomalies points towards the possibility of introducing bias correction methods as a tool to improve water resource seasonal forecasts in an end-user context of climate services. In particular, the MOS-analog approach generally gives better results than the other approaches in late autumn and early winter. Copyright © 2017 Elsevier B.V. All rights reserved.
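The two linear strategies named above can be sketched in a few lines: mean bias correction (BC) shifts the forecast by the mean model-minus-observation offset, while the linear-regression correction (LR) refits both slope and offset. The data below are synthetic, and the non-linear MOS-analog method is not shown.

```python
import numpy as np

rng = np.random.default_rng(3)
obs = 12 + 8 * np.sin(np.linspace(0, 2 * np.pi, 120))    # "observed" temperatures
model = 1.1 * obs + 2.0 + rng.normal(0, 1.0, obs.size)   # biased model output

# Mean bias correction (BC): remove the mean model-minus-obs offset.
bc = model - (model.mean() - obs.mean())

# Linear regression correction (LR): regress obs on model, apply the fit.
b, a = np.polyfit(model, obs, 1)
lr = a + b * model

rmse = lambda x: np.sqrt(np.mean((x - obs) ** 2))
print(round(rmse(model), 2), round(rmse(bc), 2), round(rmse(lr), 2))
```

Because BC is the special case of LR with the slope fixed at one, LR can never do worse than BC in-sample; with a multiplicative bias present, it does strictly better.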
Strong, Vivian E; Forde, Kenneth A; MacFadyen, Bruce V; Mellinger, John D; Crookes, Peter F; Sillin, Lelan F; Shadduck, Phillip P
Ethical considerations relevant to the implementation of new surgical technologies and techniques are explored and discussed in practical terms in this statement, including (1) How is the safety of a new technology or technique ensured?; (2) What are the timing and process by which a new technology or technique is implemented at a hospital?; (3) How are patients informed before undergoing a new technology or technique?; (4) How are surgeons trained and credentialed in a new technology or technique?; (5) How are the outcomes of a new technology or technique tracked and evaluated?; and (6) How are the responsibilities to individual patients and society at large balanced? The following discussion is presented with the intent to encourage thought and dialogue about ethical considerations relevant to the implementation of new technologies and new techniques in surgery.
The 21st mission of the National Aeronautics and Space Administration (NASA) Extreme Environment Mission Operations (NEEMO) was a highly integrated operational field test and evaluation of tools, techniques, technologies, and training for science driven exploration during extravehicular activity (EVA). The mission was conducted in July 2016 from the Aquarius habitat, an underwater laboratory, off the coast of Key Largo in the Florida Keys National Marine Sanctuary. An international crew of eight (comprised of NASA and ESA astronauts, engineers, medical personnel, and habitat technicians) lived and worked in and around Aquarius and its surrounding reef environment for 16 days. The integrated testing (both interior and exterior objectives) conducted from this unique facility continues to support current and future human space exploration endeavors. Expanding on the scientific and operational evaluations conducted during NEEMO 20, the 21st NEEMO mission further incorporated a diverse Science Team comprised of planetary geoscientists from the Astromaterials Research and Exploration Science (ARES/XI) Division from the Johnson Space Center, marine scientists from the Department of Biological Sciences at Florida International University (FIU) Integrative Marine Genomics and Symbiosis (IMaGeS) Lab, and conservationists from the Coral Restoration Foundation. The Science Team worked in close coordination with the long-standing EVA operations, planning, engineering, and research components of NEEMO in all aspects of mission planning, development, and execution.
Alphey, Luke; Baker, Pam; Condon, George C.; Condon, Kirsty C.; Dafa'alla, Tarig H.; Fu, Guoliang; Jin, Li; Labbe, Genevieve; Morrison, Neil M.; Nimmo, Derric D.; O'Connell, Sinead; Phillips, Caroline E.; Plackett, Andrew; Scaife, Sarah; Woods, Alexander; Burton, Rosemary S.; Epton, Matthew J.; Gong, Peng
The Sterile Insect Technique (SIT) has been used very successfully against a range of pest insects, including various tephritid fruit flies, several moths and a small number of livestock pests. However, modern genetics could potentially provide several improvements that would increase the cost-effectiveness of SIT and extend the range of suitable species. These include improved identification of released individuals by incorporation of a stable, heritable genetic marker; built-in sex separation (genetic sexing); reduction of the hazard posed by non-irradiated accidental releases from a mass-rearing facility (fail-safe); and elimination of the need for sterilization by irradiation (genetic sterilization). We discuss applications of these methods and the state of the art, at the time of this meeting, in developing suitable strains. We have demonstrated, in several key pest species, that the required strains can be constructed by introducing a repressible dominant lethal genetic system, a method known as RIDL (trademark). Based on field experience with Medfly, incorporation of a genetic sexing system into SIT programs for other tephritids could potentially provide a very significant improvement in cost-effectiveness. We have now been able to make efficient female-lethal strains of Medfly. One advantage of our approach is that it should be possible to extend this technology rapidly to other fruit fly species; indeed, we have recently also been able to make genetic sexing strains of the Mexican fruit fly (Anastrepha ludens). (author)
Multidisciplinary studies of the social, economic and political impact resulting from recent advances in satellite meteorology. Volume 6: Executive summary.
An assessment of the technological impact of modern satellite weather forecasting for the United States is presented. Topics discussed are: (1) television broadcasting of weather; (2) agriculture (crop production); (3) water resources; (4) urban development; (5) recreation; and (6) transportation.
Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Turcotte, D. L.; Donnellan, A.
Complex nonlinear systems are typically characterized by many degrees of freedom, as well as interactions between the elements. Interesting examples can be found in the areas of earthquakes and finance. In these two systems, fat tails play an important role in the statistical dynamics. For earthquake systems, the Gutenberg-Richter magnitude-frequency relation is applicable, whereas daily returns of securities in the financial markets are known to be characterized by leptokurtic statistics in which the tails are power law. Very large fluctuations are present in both systems. In earthquake systems, one has the example of great earthquakes such as the M9.1 Tohoku event of March 11, 2011. In financial systems, one has the example of the market crash of October 19, 1987. Both were largely unexpected events that severely impacted the earth and financial systems systemically. Other examples include the M9.3 Andaman earthquake of December 26, 2004, and the Great Recession, which began with the fall of the Lehman Brothers investment bank on September 15, 2008. Forecasting the occurrence of these damaging events has great societal importance. In recent years, national funding agencies in a variety of countries have emphasized the importance of societal relevance in research and, in particular, the goal of improved forecasting technology. Previous work has shown that both earthquakes and financial crashes can be described by a common Landau-Ginzburg-type free energy model. These metastable systems are characterized by fat-tail statistics near the classical spinodal. Correlations in these systems can grow and recede, but do not imply causation, a common source of misunderstanding. In both systems, a common set of techniques can be used to compute the probabilities of future earthquakes or crashes. In this talk, we describe the basic phenomenology of these systems and emphasize their similarities and differences. We also consider the problem of forecast validation and verification.
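For the Gutenberg-Richter statistics mentioned above, the b-value is commonly estimated from a catalog with Aki's maximum-likelihood formula, b = log10(e) / (mean(M) − Mc), where Mc is the completeness magnitude. A sketch on a synthetic catalog generated with a "true" b of 1.0 (real catalogs would add binning and completeness corrections):

```python
import numpy as np

def b_value(mags, m_c):
    """Aki's maximum-likelihood estimate of the Gutenberg-Richter b-value."""
    mags = np.asarray(mags, dtype=float)
    mags = mags[mags >= m_c]
    return np.log10(np.e) / (mags.mean() - m_c)

# Under the G-R law, magnitudes above Mc are exponentially distributed
# with rate b*ln(10); draw a synthetic catalog with true b = 1.0.
rng = np.random.default_rng(4)
m_c = 3.0
mags = m_c + rng.exponential(1 / np.log(10), 10000)

print(round(b_value(mags, m_c), 2))
```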
Kilbourn, Kyle; Bay, Marie Brøndum
In predicting areas of growth, public innovation projects may rely on optimistic visions of technology still in development as a way of ensuring novelty for funding. This paper explores what happens when forecasts of robotic technology meet the practice of sterile supply in a preliminary stage...
Tu, K N
Treatise on Materials Science and Technology, Volume 27: Analytical Techniques for Thin Films covers a set of analytical techniques developed for thin films and interfaces, all based on scattering and excitation phenomena and theories. The book discusses photon beam and X-ray techniques; electron beam techniques; and ion beam techniques. Materials scientists, materials engineers, chemical engineers, and physicists will find the book invaluable.
Bonelli, Maria Grazia; Ferrini, Mauro; Manni, Andrea
The assessment of metal and organic micropollutant contamination in agricultural soils is a difficult challenge due to the extensive area over which a very large number of samples must be collected and analyzed. For the measurement of dioxins and dioxin-like PCBs and the subsequent treatment of the data, the European Community advises developing low-cost and fast methods allowing routine analysis of a great number of samples, providing rapid measurement of these compounds in the environment, feed and food. The aim of the present work has been to find a method suitable for describing the relations occurring between organic and inorganic contaminants and to use the values of the latter in order to forecast the former. In practice, the use of a portable soil metal analyzer coupled with an efficient statistical procedure enables the required objective to be achieved. Compared to Multiple Linear Regression, the Artificial Neural Network technique has proven to be an excellent forecasting method, even though there is no linear correlation between the variables to be analyzed.
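A minimal illustration of the comparison described above: a one-hidden-layer network trained by plain gradient descent against a multiple-linear-regression baseline, on an invented nonlinear metal-to-micropollutant relationship (not the study's measurements, and far smaller than a practical network).

```python
import numpy as np

rng = np.random.default_rng(5)
metal = rng.uniform(0, 1, (200, 1))                  # "portable analyzer" reading
dioxin = np.sin(3 * metal) + 0.05 * rng.normal(size=(200, 1))  # nonlinear target

# Multiple linear regression baseline.
A = np.column_stack([np.ones(200), metal])
coef, *_ = np.linalg.lstsq(A, dioxin, rcond=None)
mlr_pred = A @ coef

# One-hidden-layer tanh network trained by full-batch gradient descent.
W1 = rng.normal(0, 1, (1, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)
step = 0.1
for _ in range(3000):
    h = np.tanh(metal @ W1 + b1)
    pred = h @ W2 + b2
    err = pred - dioxin
    # Backpropagate the mean-squared-error gradient.
    gW2 = h.T @ err / 200; gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)
    gW1 = metal.T @ dh / 200; gb1 = dh.mean(0)
    W1 -= step * gW1; b1 -= step * gb1; W2 -= step * gW2; b2 -= step * gb2

rmse = lambda p: float(np.sqrt(np.mean((p - dioxin) ** 2)))
print(round(rmse(mlr_pred), 3), round(rmse(pred), 3))
```

On this hump-shaped relationship the linear fit is structurally limited, while the small network can track the curvature, which mirrors the abstract's point about nonlinearly related variables.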
Gür Ali, Ö.; Sayin, S.; Woensel, van T.; Fransoo, J.C.
Promotions and shorter life cycles make grocery sales forecasting more difficult, requiring more complicated models. We identify methods of increasing complexity and data preparation cost yielding increasing improvements in forecasting accuracy, by varying the forecasting technique, the input
Yadgarov, Kh.T.; Pugachev, V.V.; Kim, A.A.
One of the main pathways by which plants are contaminated with radionuclides is uptake from the soil through the root system and direct absorption of radionuclides by the underground parts of plants. The arrival of radioisotopes in the rhizosphere therefore plays the main role in radionuclide accumulation in plants. For plant cultivation in regions with radioactive pollution, it is necessary to estimate the possible radionuclide accumulation in the harvest. Such forecasts are needed when planning the growing of agricultural crops for food, forage or technical purposes, depending on their degree of radionuclide pollution. We investigated the correlation between the strontium-90 content of plants in the early phases of their development (20-day-old sprouts) and in the harvest, for soil uptake of the radionuclide. Our results on the dependence of strontium-90 accumulation in the harvest on its content in 20-day-old sprouts show that as the strontium-90 content of the sprouts decreases, its quantity in the harvest of agricultural crops also decreases. Correlation analysis of the data confirmed a positive association between radionuclide accumulation in young and adult plants: the correlation coefficients for cotton, wheat and barley are 0.89, 0.91 and 0.91, respectively. Thus, a direct connection is established between the strontium-90 content of young plants and its accumulation in the harvest of adult plants, which makes it possible to predict the contamination of the harvest by strontium-90 from its content in young plants. Using these data, we calculated by the least-squares method the coefficients of a regression equation of the form y = a + bx, where y is the predicted radionuclide content of the harvest, x is the radionuclide content of 20-day-old sprouts, and a and b are empirical coefficients. Rather good agreement between theoretical calculations and
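The regression y = a + bx described above can be reproduced with the standard least-squares formulas. The sprout and harvest values below are invented for illustration, not the study's measurements:

```python
import numpy as np

def least_squares_line(x, y):
    """Coefficients (a, b) of y = a + b*x by the least-squares method."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    a = y.mean() - b * x.mean()
    return a, b

# Hypothetical Sr-90 contents (arbitrary units): 20-day-old sprouts vs harvest.
sprout = np.array([1.0, 2.0, 3.5, 5.0, 6.5, 8.0])
harvest = np.array([0.9, 1.6, 2.9, 4.2, 5.3, 6.8])

a, b = least_squares_line(sprout, harvest)
r = np.corrcoef(sprout, harvest)[0, 1]
print(f"y = {a:.2f} + {b:.2f}x, r = {r:.2f}")
```

Given the fitted a and b, the predicted harvest contamination for a new sprout measurement x is simply a + b*x.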
A new technique is developed to predict auroral activity based on a sample of over 9000 auroral sites identified in global auroral images obtained by an ultraviolet imager on the NASA Polar satellite during a 6-month period. Four attributes of auroral activity sites are utilized in forecasting, namely, the area, the power, and the rates of change in area and power. This new technique is quite accurate, as indicated by the high true skill scores for forecasting three different levels of auroral dissipation during the activity lifetime. The corresponding advance warning time ranges from 22 to 79 min from low to high dissipation levels.
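The true skill score cited above is the hit rate minus the false-alarm rate computed from a 2×2 forecast contingency table; a sketch with hypothetical counts (not the study's):

```python
def true_skill_score(hits, misses, false_alarms, correct_negatives):
    """TSS (Hanssen-Kuipers score): hit rate minus false-alarm rate."""
    hit_rate = hits / (hits + misses)
    false_alarm_rate = false_alarms / (false_alarms + correct_negatives)
    return hit_rate - false_alarm_rate

# Hypothetical contingency counts for one dissipation threshold.
tss = true_skill_score(hits=80, misses=20, false_alarms=10, correct_negatives=90)
print(round(tss, 2))  # → 0.7
```

A TSS of 1 is a perfect forecast, 0 is no better than chance, and unlike simple accuracy it is insensitive to how rare the events are.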
In the present study, we introduce to secondary education an Earth observation technique using synthetic aperture radar (SAR). The goal is to increase students' interest in and awareness of the Earth observation technique through practical activities. A curriculum was developed based on the results of questionnaire surveys of school teachers. The curriculum is composed of 16 units. Teaching materials related to the Earth observation technique were researched and developed, and we designed a visual SAR processor and a small corner reflector (CR) as new teaching tools. In teaching sessions at a secondary school, the developed teaching materials and software were used effectively. In observation experiments, students set up the CRs that they had built, and ALOS PALSAR was able to clearly observe all of them. The proposed curriculum helped all of the students to understand the usefulness of the Earth observation technique.
Mohr, Markus; Unger, Hermann
A new approach for the reassessment of modern energy technologies is discussed. This mainly addresses renewable-energy systems, like photovoltaics or wind converters. A new number called the 'Marginal Energy Risk Price (MERP) for Hedging' is introduced. (Author)
Longhi, Sauro; Freddi, Alessandro
This book covers the three main scientific and technological areas critical for improving people's quality of life - namely human monitoring, smart health and assisted living - from both the research and development points of view.
Focusing on novel materials and techniques, this pioneering volume provides you with a solid understanding of the design and fabrication of smart RF passive components. You find comprehensive details on LCP, metal materials, ferrite materials, nano materials, high aspect ratio enabled materials, green materials for RFID, and silicon micromachining techniques. Moreover, this practical book offers expert guidance on how to apply these materials and techniques to design a wide range of cutting-edge RF passive components, from MEMS switch based tunable passives and 3D passives, to metamaterial-bas
Walk, Steven Robert
Projecting technology performance evolution has been improving over the years. Reliable quantitative forecasting methods have been developed that project the growth, diffusion, and performance of technology in time, including projecting technology substitutions, saturation levels, and performance improvements. These forecasts can be applied at the early stages of space technology planning to better predict available future technology performance, assure the successful selection of technology, and improve technology systems management strategy. Often what is published as a technology forecast is simply scenario planning, usually made by extrapolating current trends into the future, with perhaps some subjective insight added. Typically, the accuracy of such predictions falls rapidly with distance in time. Quantitative technology forecasting (QTF), on the other hand, includes the study of historic data to identify one of or a combination of several recognized universal technology diffusion or substitution patterns. In the same manner that quantitative models of physical phenomena provide excellent predictions of system behavior, so do QTF models provide reliable technological performance trajectories. In practice, a quantitative technology forecast is completed to ascertain with confidence when the projected performance of a technology or system of technologies will occur. Such projections provide reliable time-referenced information when considering cost and performance trade-offs in maintaining, replacing, or migrating a technology, component, or system. This paper introduces various quantitative technology forecasting techniques and illustrates their practical application in space technology and technology systems management.
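A standard QTF tool of the kind described above is the Fisher-Pry substitution model: the adopted fraction f of a technology follows a logistic curve, so ln(f/(1−f)) is linear in time and can be fitted by least squares, then extrapolated to project saturation. A sketch on noise-free synthetic adoption data (real forecasts would fit noisy historical market-share data):

```python
import numpy as np

def fisher_pry_fit(t, f):
    """Fit the linearized logistic ln(f/(1-f)) = a + b*t; return (a, b)."""
    z = np.log(f / (1 - f))
    b, a = np.polyfit(t, z, 1)
    return a, b

def fisher_pry_project(t, a, b):
    """Projected adopted fraction at time t."""
    return 1 / (1 + np.exp(-(a + b * np.asarray(t))))

# Synthetic adoption fractions for years 0-9 of a technology mid-substitution.
t = np.arange(10)
f = 1 / (1 + np.exp(-(-3.0 + 0.5 * t)))   # "observed" history (true b = 0.5)

a, b = fisher_pry_fit(t, f)
print(round(a, 2), round(b, 2), round(float(fisher_pry_project(20, a, b)), 2))
```

The fitted b sets the substitution speed, and the projection answers the forecaster's question of *when* a given penetration level will be reached.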
Gu, Xiaoling; Jin, Yang; Dong, Fang; Cai, Yueqing; You, Zhengyi; You, Junhui; Zhang, Liying; Du, Shuhu
Conventional isolation and identification of active compounds from herbs have been extensively reported using various chromatographic and spectroscopic techniques. However, quickly discovering new bioactive ingredients from natural sources remains a challenging task due to the interference of their similar structures or matrices. Here, we present an approach for rapid analysis, forecasting and discovery of bioactive compounds from herbs based on a hyphenated strategy of thin layer chromatography and ratiometric surface-enhanced Raman spectroscopy. The performance of the hyphenated strategy is first evaluated by analyzing four protoberberine alkaloids, berberine (BER), coptisine (COP), palmatine (PAT) and jatrorrhizine (JAT), from a typical herb, Coptidis Rhizoma, as an example. It has been demonstrated that this coupled method can identify the four compounds by characteristic peaks at 728, 708, 736 and 732 cm−1, and in particular can discriminate BER and COP (which have similar migration distances) by the ratiometric Raman intensity (I708/I728). The corresponding limits of detection are 0.1, 0.05, 0.1 and 0.5 μM, respectively, which are about 1-2 orders of magnitude lower than those of direct observation under a 254 nm UV lamp. Based on these findings, the proposed method further guides the forecasting and discovery of unknown compounds from the traditional Chinese herb Typhonii Rhizoma. The results show that two trace alkaloids (BER and COP) are found in the n-butanol extract of Typhonii Rhizoma for the first time. Moreover, in vitro experiments show that BER can effectively decrease the viability of human glioma U87 cells by inducing cell cycle arrest in a concentration-dependent manner. Copyright © 2018 Elsevier B.V. All rights reserved.
... UAV Autonomy program, which includes intelligent reasoning for autonomy, technologies to enhance see-and-avoid capabilities, and object identification ... along the ship's base recovery course (BRC). The pilot then flies toward the stern of the ship, aligning his approach path with the ship's lineup line ... quiescent point identification. CONCLUSIONS: The primary goal for conducting dynamic interface analysis is to expand existing operating envelopes and
Courtois, G. [Commissariat a l'Energie Atomique, Saclay (France). Centre d'Etudes Nucleaires]
After presenting gamma radiography and X-ray radiography, the author compares both techniques, showing in particular the greater utility of gamma radiography in industrial diagnostics and more particularly in diagnostics on work sites. Problems of using radiography and safety considerations are studied. Figures show two pieces of radiography equipment designed for gamma radiography in compliance with the safety regulations required by the Radioisotope Inter-ministerial Commission. In the second part, different techniques and uses of gamma radiography are briefly described: xerography, neutron radiography, fluoroscopy and image intensifiers, tomography, betatrons and linear accelerators. A cost analysis is discussed in conclusion. (M.P.)
The ASTM 876-91 standard establishes a corrosion forecast for concrete reinforcing bar by measuring the electrochemical potential. This forecast is based on thermodynamic considerations, without taking into account the kinetics of the corrosion process. A comparison was made between the results obtained based on this standard and those from electrochemical techniques (Tafel, Rp, EIS, Electrochemical Noise). These techniques allow the corrosion rate to be obtained for samples with water/cement ratios of 0.4, 0.5 and 0.66 subjected to salt spray outdoors and to immersion in 3% saline solution over a test time of 20 months. Differences were detected between the results obtained using the ASTM standard and the electrochemical techniques. The main difference is that samples subjected to immersion show a higher probability of corrosion than samples subjected to salt spray; however, the electrochemical techniques showed the contrary concerning the corrosion kinetics. A comparison of corrosion rates was also made between the results obtained by the different electrochemical techniques. All the electrochemical techniques assume general corrosion, except electrochemical noise; using that technique, a pitting index can be calculated, which shows that localized corrosion is predominant.
The ASTM 876-91 standard establishes a corrosion forecast for the reinforcing bar of reinforced concrete by determining electrochemical potentials. This forecast is based on thermodynamic considerations, without taking into account the kinetics of the corrosion process. The results obtained by applying this standard are compared with electrochemical techniques (Tafel, Rp, EIS, Electrochemical Noise) that allow the corrosion rate to be calculated in specimens with water/cement ratios of 0.4, 0.5 and 0.66 subjected to salt spray under natural conditions and to immersion in 3% saline solution during a
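The Rp (polarization resistance) technique named in the abstracts above yields a corrosion current density via the Stern-Geary relation i_corr = B/Rp, with B = βa·βc / (2.303(βa + βc)) built from the anodic and cathodic Tafel slopes. A sketch assuming typical 120 mV/decade Tafel slopes and an illustrative Rp value (both are assumptions, not values from the study):

```python
def stern_geary_icorr(rp_ohm_cm2, beta_a=0.12, beta_c=0.12):
    """Corrosion current density (A/cm^2) from polarization resistance Rp.

    beta_a, beta_c: Tafel slopes in V/decade (0.12 V = a common assumption).
    """
    B = (beta_a * beta_c) / (2.303 * (beta_a + beta_c))  # Stern-Geary constant (V)
    return B / rp_ohm_cm2

# Example: Rp = 5e4 ohm*cm^2 measured on a reinforced-concrete specimen.
icorr = stern_geary_icorr(5.0e4)
print(f"{icorr:.2e} A/cm^2")
```

With the default slopes, B is about 26 mV, the classic textbook value; note that, as the abstract stresses, this treats corrosion as uniform and says nothing about pitting.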
U.S. Environmental Protection Agency — The Exposure Forecaster Database (ExpoCastDB) is EPA's database for aggregating chemical exposure information and can be used to help with chemical exposure...
Duus, Henrik Johannsen
Purpose: The purpose of this article is to present an overview of the area of strategic forecasting and its research directions and to put forward some ideas for improving management decisions. Design/methodology/approach: This article is conceptual but also informed by the author's long contact and collaboration with various business firms. It starts by presenting an overview of the area and argues that the area is as much a way of thinking as a toolbox of theories and methodologies. It then spells out a number of research directions and ideas for management. Findings: Strategic forecasting is seen as a rebirth of long range planning, albeit with new methods and theories. Firms should make the building of strategic forecasting capability a priority. Research limitations/implications: The article subdivides strategic forecasting into three research avenues and suggests avenues for further research efforts...
Nauta, Bram; Annema, Anne J.
CMOS evolution introduces several problems in analog design. Gate-leakage mismatch exceeds conventional matching tolerances requiring active cancellation techniques or alternative architectures. One strategy to deal with the use of lower supply voltages is to operate critical parts at higher supply
Møller Hansen, Peter; Pedersen, Mads M; Hansen, Kristoffer L
With conventional Doppler ultrasound it is not possible to estimate the direction and velocity of blood flow when the angle of insonation exceeds 60-70°. Transverse oscillation is an angle-independent vector velocity technique which is now implemented on a conventional ultrasound scanner. In this pa...
Snider, Brent R.; Eliasson, Janice B.
This teaching brief describes a 30-minute game where student groups compete in-class in an introductory time-series forecasting exercise. The students are challenged to "beat the instructor" who competes using forecasting techniques that will be subsequently taught. All forecasts are graphed prior to revealing the randomly generated…
In June 1986, a two-day workshop was convened to review mobile offshore drilling unit (MODU) evacuation research and development (R&D) requirements for programs currently sponsored by the Marine Engineering Committee for PERD (Panel on Energy Research and Development) task 6.2. The proceedings of the workshop are presented in terms of: evacuation technology needs; a review of current R&D projects for their relevance to the technological needs; recommended changes to current projects where warranted; and recommended priority R&D projects that are not presently being undertaken to meet the evacuation technology needs. Current R&D projects reviewed include: preferred orientation and displacement (PROD) lifeboat launching, personal transfer baskets, arctic escape system (AES), fast rescue craft (FRC), helicopter survival suits, diving bell emergency heaters, working immersion suits, diving bell emergency ascent system, undersea robotics systems and DCIEM decompression tables. It was concluded that gaps in offshore safety/evacuation capability cannot be adequately addressed by one or two major projects. Rather, a wider range of problems exists that requires attention through a greater financial commitment than is presently the case. Government-industry cooperation in this field is required. Overall, the projects were judged to have been well planned, managed, and relevant to offshore safety needs. The only exception was a feeling that diving-related projects consumed a disproportionate share of the available funding. Further, there was some reservation that the robotics study was outside the 6.2 PERD R&D mandate. A list of recommendations is presented. 1 tab.
Truhanov, V. N.; Sultanov, M. M.
In the present article, statistical data on the failures and malfunctions affecting the operability of heat-power installations have been analyzed. A mathematical model of the change in the output characteristics of a turbine, depending on the number of failures revealed in service, is presented. The mathematical model is based on methods of mathematical statistics, probability theory, and matrix calculus. The novelty of this model is that it allows the change of the output characteristic over time to be predicted, with the control actions presented in explicit form. As the desired dynamics of the change of the output characteristic (the reliability function), the Weibull distribution is adopted, since it is universal: at various parameter values it turns into other types of distributions (for example, exponential, normal, etc.). It should be noted that the choice of the desired control law makes it possible to determine the necessary control parameters using the accumulated change of the output characteristic as a whole. The output characteristic can be changed both through the rate of change of the control parameters and through the acceleration of their change. The article states in detail the technique for estimating the pseudo-inverse matrix by the method of least squares and standard Microsoft Excel functions. The technique for finding the control actions under constraints, both on the output characteristic and on the control parameters, is also considered. The order and sequence of finding the control parameters are stated, and a concrete example of finding the control actions in the course of long-term operation of turbines is shown.
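The least-squares estimation of model parameters via a pseudo-inverse matrix, as described in the abstract, can be sketched as follows. This is a minimal NumPy illustration on synthetic data; the linear trend model and its coefficients are assumptions for the example, not the article's turbine measurements.

```python
import numpy as np

# Synthetic "output characteristic" observations: y = 2.0 + 0.5*t + noise
rng = np.random.default_rng(0)
t = np.arange(20.0)
y = 2.0 + 0.5 * t + rng.normal(0.0, 0.01, t.size)

# Design matrix for a linear trend model
A = np.column_stack([np.ones_like(t), t])

# Least-squares solution via the Moore-Penrose pseudo-inverse: theta = A^+ y
theta = np.linalg.pinv(A) @ y
print(theta)  # approx [2.0, 0.5]
```

The same estimate can be obtained in a spreadsheet, as the article notes, by combining matrix functions (transpose, multiply, inverse) to form the pseudo-inverse explicitly.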
Nakamuta, Yasushi; Miyoshi, Toshiaki; O'shima, Eiji
For In-Operation Inspection Technology (IOI), we selected the primary loop recirculation (PLR) pump, sea water pump, small-diameter pipe branches in the steam generator (SG) room, and motor-driven valves as typical components of a nuclear power plant, and we are developing technology that can forecast the residual life of parts, under a plan running until FY2000. With respect to the PLR pump and sea water pump, technical procedures for predicting the propagation of bearing wear, under the combined effect of several degradation conditions of each pump during plant operation, are under development. With respect to the pipe branches, we are developing non-contact laser sensors and constructing a system that forecasts high-cycle fatigue at the root of a pipe branch by monitoring its vibration. With respect to the motor-driven valves, technical procedures for predicting the thermal degradation of gaskets and gland packing, for predicting wear of the stem nut and of the hanging portion of the valve disc, and for detecting the degradation of driving parts, all without disassembling the valve, are under development. (author)
Axe, J.D.; Hewat, A.W.; Maier, J.; Margaca, F.M.A.; Rauch, H.
The importance of research, development and application of advanced materials is well understood by all developed and most developing countries. Amongst advanced materials, ceramics play a prominent role due to their specific chemical and physical properties. According to performance and importance, advanced ceramics can be classified as structural ceramics (mechanical function) and the so-called functional ceramics. In the latter class of materials, special electrical, chemical, thermal, magnetic and optical properties are of interest. The most valuable materials are multifunctional, for example, when structural ceramics combine beneficial mechanical properties with thermal and chemical sensitivity. Multifunctionality is characteristic of many composite materials (organic/inorganic composites). Additionally, the properties of a material can be changed by reducing its dimensions (thin films, nanocrystalline ceramics). Nuclear techniques have found important applications in the research and development of advanced ceramics. The use of neutron techniques has increased dramatically in recent years due to the development of advanced neutron sources, instrumentation and improved data analysis. Typical neutron techniques are neutron diffraction, neutron radiography, small angle neutron scattering and very small angle neutron scattering. Neutrons can penetrate deeply into most materials, thus sampling their bulk properties. In the determination of the crystal structure of the high-temperature superconductor (HTSC) YBa2Cu3O7, XRD located the heavy metal atoms but failed to find many of the oxygen atoms, while neutron diffraction located all atoms equally well in the crystal structure. Neutron diffraction is also unique for the determination of the magnetic structure of materials, since the neutrons themselves have a magnetic moment. The application of small angle neutron scattering to the determination of the size of hydrocarbon aggregates within zeolite channels is illustrated. (author)
Takooshian, Harold; Gielen, Uwe P; Plous, Scott; Rich, Grant J; Velayo, Richard S
How can we best internationalize undergraduate psychology education in the United States and elsewhere? This question is more timely than ever, for at least 2 reasons: Within the United States, educators and students seek greater contact with psychology programs abroad, and outside the United States, psychology is growing apace, with educators and students in other nations often looking to U.S. curricula and practices as models. In this article, we outline international developments in undergraduate psychology education both in the United States and abroad, and analyze the dramatic rise of online courses and Internet-based technologies from an instructional and international point of view. Building on the recommendations of the 2005 APA Working Group on Internationalizing the Undergraduate Psychology Curriculum, we then advance 14 recommendations on internationalizing undergraduate psychology education--for students, faculty, and institutions.
Pierdzioch, C.; Rulke, J. C.; Stadtmann, G.
We analyze more than 20,000 forecasts of nine metal prices at four different forecast horizons. We document that forecasts are heterogeneous and report that anti-herding appears to be a source of this heterogeneity. Forecaster anti-herding reflects strategic interactions among forecasters...
The objective of the revised Training Manual is to help scientists to acquire the necessary knowledge needed for performing proper research and development work in the field of food irradiation. The Manual presents an up-to-date picture of the current state of food irradiation and reflects the important advances made in the technology of food irradiation, in the radiation chemistry of foods, in the microbiology of irradiated foods, in wholesomeness and standardization. It contains the following chapters: (1) Radionuclides and radiation; (2) Radiation detection and measurement; (3) Radiation protection; (4) Radiation chemistry; (5) Effects of radiation on living organisms; (6) Preservation of foods; (7) Radiation preservation of foods; (8) Packaging; (9) Combination processes; (10) Limitations of food irradiation; (11) Wholesomeness of irradiated foods; (12) Government regulation of irradiated foods; (13) Food irradiation facilities; (14) Commercial aspects of food irradiation; (15) Literature sources. The practical part of the Manual contains a revised and expanded series of detailed laboratory exercises in the use of ionizing radiation for food processing
Mannan, Haider R; Knuiman, Matthew; Hobbs, Michael
Treatments for coronary heart disease (CHD) have evolved rapidly over the last 15 years with considerable change in the number and effectiveness of both medical and surgical treatments. This period has seen the rapid development and uptake of statin drugs and coronary artery revascularization procedures (CARPs) that include Coronary Artery Bypass Graft procedures (CABGs) and Percutaneous Coronary Interventions (PCIs). It is difficult in an era of such rapid change to accurately forecast requirements for treatment services such as CARPs. In a previous paper we have described and outlined the use of a Markov Monte Carlo simulation model for analyzing and predicting the requirements for CARPs for the population of Western Australia (Mannan et al, 2007). In this paper, we expand on the use of this model for forecasting CARPs in Western Australia with a focus on the lack of adequate performance of the (standard) model for forecasting CARPs in a period during the mid 1990s when there were considerable changes to CARP technology and implementation policy and an exploration and demonstration of how the standard model may be adapted to achieve better performance. Selected key CARP event model probabilities are modified based on information relating to changes in the effectiveness of CARPs from clinical trial evidence and an awareness of trends in policy and practice of CARPs. These modified model probabilities and the ones obtained by standard methods are used as inputs in our Markov simulation model. The projected numbers of CARPs in the population of Western Australia over 1995-99 only improve marginally when modifications to model probabilities are made to incorporate an increase in effectiveness of PCI procedures. However, the projected numbers improve substantially when, in addition, further modifications are incorporated that relate to the increased probability of a PCI procedure and the reduced probability of a CABG procedure stemming from changed CARP preference
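A Markov Monte Carlo simulation of procedure requirements of the kind described above can be sketched as follows. The health states and transition probabilities below are purely illustrative assumptions, not those of the Western Australia model; adapting the model, as the paper does, amounts to modifying entries of the transition matrix (e.g., raising the CHD-to-PCI probability).

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical yearly transition probabilities between health states
# (illustrative numbers only, not those of the Western Australia model).
states = ["well", "CHD", "post-CARP", "dead"]
P = np.array([
    [0.96, 0.03, 0.00, 0.01],   # well
    [0.00, 0.80, 0.15, 0.05],   # CHD -> 15% receive a CARP each year
    [0.00, 0.05, 0.90, 0.05],   # post-CARP
    [0.00, 0.00, 0.00, 1.00],   # dead (absorbing)
])

def simulate(n_people=10_000, years=5):
    """Count CARPs performed per year in a simulated cohort."""
    state = np.zeros(n_people, dtype=int)   # everyone starts in "well"
    carps_per_year = []
    for _ in range(years):
        nxt = np.array([rng.choice(4, p=P[s]) for s in state])
        # A CARP is counted when someone moves from "CHD" to "post-CARP"
        carps_per_year.append(int(np.sum((state == 1) & (nxt == 2))))
        state = nxt
    return carps_per_year

print(simulate())
```

The first simulated year yields no procedures because the cohort starts entirely in the "well" state; real applications would initialize the state vector from observed prevalence.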
Western Political Consulting Techniques and Post-Soviet Political Technology in Political Campaigns in Latvia. Ieva Dmitričenko. Keywords: political campaigns, political consulting, political technology, parties, marketing, media. Political campaigning is an international phenomenon, because there is a free flow of information, knowledge and human resources among practitioners of political campaigning in various countries. As a result, political campaigning techniques that have proven to ...
Kuzmanovic, Deborah A.; Elashvili, Ilya; O'Connell, Catherine; Krueger, Susan
Small-angle scattering (SAS) techniques, like small-angle X-ray scattering (SAXS) and small-angle neutron scattering (SANS), were used to measure and thus to validate the accuracy of a novel technology for virus sizing and concentration determination. These studies demonstrate the utility of SAS techniques for use in quality assurance measurements and as novel technology for the physical characterization of viruses.
Forecast combination has been proven to be a very important technique to obtain accurate predictions for various applications in economics, finance, marketing and many other areas. In many applications, forecast errors exhibit heavy-tailed behaviors for various reasons. Unfortunately, to our knowledge, little has been done to obtain reliable forecast combinations for such situations. The familiar forecast combination methods, such as simple average, least squares regression or those based on the variance-covariance of the forecasts, may perform very poorly due to the fact that outliers tend to occur, and they make these methods have unstable weights, leading to un-robust forecasts. To address this problem, in this paper, we propose two nonparametric forecast combination methods. One is specially proposed for the situations in which the forecast errors are strongly believed to have heavy tails that can be modeled by a scaled Student’s t-distribution; the other is designed for relatively more general situations when there is a lack of strong or consistent evidence on the tail behaviors of the forecast errors due to a shortage of data and/or an evolving data-generating process. Adaptive risk bounds of both methods are developed. They show that the resulting combined forecasts yield near optimal mean forecast errors relative to the candidate forecasts. Simulations and a real example demonstrate their superior performance in that they indeed tend to have significantly smaller prediction errors than the previous combination methods in the presence of forecast outliers.
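The motivation for robust combination can be illustrated with a simple sketch: under outlier-prone forecast errors, a median combination (a basic nonparametric alternative, not the paper's specific methods) is far less sensitive than the simple average. All data below are synthetic assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)
T, K = 500, 5                       # time steps, candidate forecasters
truth = np.sin(np.linspace(0, 20, T))

# Gaussian errors, but forecaster 0 occasionally fails badly (outliers)
errors = rng.normal(0.0, 0.3, size=(T, K))
outlier_rows = rng.random(T) < 0.05
errors[outlier_rows, 0] += 20.0
forecasts = truth[:, None] + errors

mean_combo = forecasts.mean(axis=1)       # simple average combination
median_combo = np.median(forecasts, axis=1)  # robust median combination

def mae(f):
    return float(np.mean(np.abs(f - truth)))

print(mae(mean_combo), mae(median_combo))
```

The average inherits a share of every outlier, while the median discards it as long as fewer than half the forecasters fail at once.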
E. M. Cutrim
Responding to the call for reform in science education, changes were made in an introductory meteorology and climate course offered at a large public university. These changes were part of a larger project aimed at deepening and extending a program of science content courses that model effective teaching strategies for prospective middle school science teachers. Therefore, revisions were made to address misconceptions about meteorological phenomena, foster deeper understanding of key concepts, encourage engagement with the text, and promote inquiry-based learning. Techniques introduced include: use of flash cards, student reflection questionnaires, writing assignments, and interactive discussions on weather and forecast data using computer technology such as the Integrated Data Viewer (IDV). The revision process is described in a case study format. Preliminary results (self-reflection by the instructor, surveys of student opinion, and measurements of student achievement) suggest student learning has been positively influenced. This study is supported by three grants: NSF grant No. 0202923, the Unidata Equipment Award, and the Lucia Harrison Endowment Fund.
This article articulates the groundwork for a new understanding of the concept of technique through a critical engagement with Herbert Marcuse's critical theory of technology. To this end, it identifies and engages three expressions of technique in Marcuse's work: mimesis, reified labor, and the happy consciousness. It is argued that this mapping…
Reikard, Gordon; Pinson, Pierre; Bidlot, Jean
Recently, the technology has been developed to make wave farms commercially viable. Since electricity is perishable, utilities will be interested in forecasting ocean wave energy. The horizons involved in short-term management of power grids range from as little as a few hours to as long as several ... days. In selecting a method, the forecaster has a choice between physics-based models and statistical techniques. A further idea is to combine both types of models. This paper analyzes the forecasting properties of a well-known physics-based model, the European Centre for Medium-Range Weather Forecasts ... (ECMWF) Wave Model, and two statistical techniques, time-varying parameter regressions and neural networks. Thirteen data sets at locations in the Atlantic and Pacific Oceans and the Gulf of Mexico are tested. The quantities to be predicted are the significant wave height, the wave period, and the wave ...
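A time-varying parameter regression of the kind mentioned above is commonly estimated with recursive least squares and a forgetting factor. The sketch below uses synthetic data; the regressors and coefficients are invented for illustration, not taken from the paper's wave data sets.

```python
import numpy as np

def rls(X, y, lam=0.99):
    """Recursive least squares with forgetting factor lam (0 < lam <= 1).
    Smaller lam discounts old data faster, letting parameters drift."""
    n, p = X.shape
    theta = np.zeros(p)
    P = 1e4 * np.eye(p)              # large initial covariance = weak prior
    for x, yt in zip(X, y):
        Px = P @ x
        k = Px / (lam + x @ Px)      # gain vector
        theta = theta + k * (yt - x @ theta)
        P = (P - np.outer(k, Px)) / lam
    return theta

rng = np.random.default_rng(1)
X = np.column_stack([np.ones(300), rng.normal(size=300)])
y = X @ np.array([1.5, -0.7]) + rng.normal(0, 0.05, 300)
print(rls(X, y))  # approx [1.5, -0.7]
```

With lam below 1 the estimator keeps an effective window of roughly 1/(1-lam) observations, which is what allows the parameters to track slow changes in the wave climate.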
Giovanni Salini Calderón
We studied a monthly database of the Southern Oscillation Index (SOI) covering the years 1886 to 2006. The paper explains how this database, whose data possess nonlinear characteristics, must be manipulated before being used to produce forecasts several steps ahead. Two standard tests were applied to the database: Average Mutual Information (AMI) and False Nearest Neighbours (FNN). These yielded the optimal spacing of the data and the number of past values needed to predict values into the future. Then, several artificial neural network (ANN) models were designed, with different learning rules, transfer functions, processing elements (or neurons) in the hidden layer, etc., that allowed forecasting up to 20 steps ahead. The best networks were those with the learning rules called extDBD and Delta-Rule, and sigmoid or hyperbolic tangent transfer functions. The network type used was a feedforward multilayer perceptron trained by means of the backpropagation technique. Networks with one hidden layer, two hidden layers, and no hidden layer were tested. The best model obtained was one with a single hidden layer.
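The delay embedding that AMI and FNN are used to select can be sketched as follows; for brevity, a linear autoregression on the embedded vectors stands in for the trained ANN. The toy sinusoidal series and the embedding parameters (dim=3, tau=2) are assumptions for the example, not the values chosen in the study.

```python
import numpy as np

def delay_embed(series, dim, tau):
    """Build lagged vectors [x(t), x(t-tau), ..., x(t-(dim-1)*tau)]
    paired with the next value x(t+1) as the forecast target."""
    rows, targets = [], []
    start = (dim - 1) * tau
    for t in range(start, len(series) - 1):
        rows.append(series[t - np.arange(dim) * tau])
        targets.append(series[t + 1])
    return np.array(rows), np.array(targets)

x = np.sin(0.3 * np.arange(400))            # toy oscillatory "index"
X, y = delay_embed(x, dim=3, tau=2)

# Linear stand-in for the ANN: fit next value from the embedded vector
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
pred = X @ coef
print(np.max(np.abs(pred - y)))             # near zero for this linear toy
```

For a genuinely nonlinear series like the SOI, the linear map would be replaced by the multilayer perceptron, but the embedding step is identical.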
Wilson, S. G.; Oliver, J. D., Jr.; Kot, R. C.; Richards, C. R.
Communications theory and practice are merged with state-of-the-art technology in IC fabrication, especially monolithic GaAs technology, to examine the general feasibility of a number of advanced technology digital transmission systems. Satellite-channel models with (1) superior throughput, perhaps 2 Gbps; (2) attractive weight and cost; and (3) high RF power and spectrum efficiency are discussed. Transmission techniques possessing reasonably simple architectures capable of monolithic fabrication at high speeds were surveyed. This included a review of amplitude/phase shift keying (APSK) techniques and the continuous-phase-modulation (CPM) methods, of which MSK represents the simplest case.
Sales forecasting is one of the most important issues in managing information technology (IT) chain store sales since an IT chain store has many branches. Integrating feature extraction method and prediction tool, such as support vector regression (SVR), is a useful method for constructing an effective sales forecasting scheme. Independent component analysis (ICA) is a novel feature extraction technique and has been widely applied to deal with various forecasting problems. But, up to now, only the basic ICA method (i.e., temporal ICA model) was applied to sale forecasting problem. In this paper, we utilize three different ICA methods including spatial ICA (sICA), temporal ICA (tICA), and spatiotemporal ICA (stICA) to extract features from the sales data and compare their performance in sales forecasting of IT chain store. Experimental results from a real sales data show that the sales forecasting scheme by integrating stICA and SVR outperforms the comparison models in terms of forecasting error. The stICA is a promising tool for extracting effective features from branch sales data and the extracted features can improve the prediction performance of SVR for sales forecasting. PMID:25165740
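The ICA feature-extraction step can be illustrated with a minimal symmetric FastICA (roughly the temporal-ICA case). The two "branch sales" signals and the mixing matrix below are synthetic assumptions; in the paper's scheme, the recovered independent components would then be fed to an SVR model as features.

```python
import numpy as np

rng = np.random.default_rng(3)

def fastica(X, n_iter=200):
    """Symmetric FastICA with tanh nonlinearity.
    X: (components, samples), assumed zero-mean. Returns estimated sources."""
    n, m = X.shape
    # Whiten: decorrelate and scale to unit variance
    d, E = np.linalg.eigh(X @ X.T / m)
    Xw = (E / np.sqrt(d)).T @ X
    W = rng.normal(size=(n, n))
    for _ in range(n_iter):
        G = np.tanh(W @ Xw)
        # Fixed-point update: E[g(WX)X^T] - E[g'(WX)] W
        W_new = G @ Xw.T / m - np.diag((1 - G**2).mean(axis=1)) @ W
        # Symmetric decorrelation: W <- (W W^T)^(-1/2) W
        s, U = np.linalg.eigh(W_new @ W_new.T)
        W = U @ np.diag(1 / np.sqrt(s)) @ U.T @ W_new
    return W @ Xw

# Two independent "branch sales" patterns, linearly mixed
t = np.arange(2000)
S = np.vstack([np.sign(np.sin(0.07 * t)), ((0.11 * t) % 2) - 1])
A = np.array([[1.0, 0.6], [0.4, 1.0]])
X = A @ S
X = X - X.mean(axis=1, keepdims=True)
S_hat = fastica(X)
```

ICA recovers the sources only up to permutation and sign, which is why downstream models consume the components as features rather than interpreting them directly.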
Dai, Wensheng; Wu, Jui-Yu; Lu, Chi-Jie
Anil Kumar Kar
New hydrological insights for the region: This study establishes different possible key rain gauge (RG) networks using Hall's method, the analytical hierarchical process (AHP), self-organizing maps (SOM) and hierarchical clustering (HC), using the characteristics of each rain-gauge-occupied Thiessen polygon area. The efficiency of the key networks is tested with artificial neural network (ANN), fuzzy and NAM rainfall-runoff models. Furthermore, flood forecasting has been carried out using the three most effective RG networks, which use only 7 RGs instead of the 14 gauges established in the Kantamal sub-catchment, Mahanadi basin. The fuzzy logic applied to the key RG network derived using AHP has shown the best result for flood forecasting, with an efficiency of 82.74% for a 1-day lead period. This study demonstrates the design procedure of a key RG network for effective flood forecasting, particularly when there is difficulty in gathering information from all RGs.
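The AHP step can be sketched as follows: a pairwise-comparison matrix over the candidate criteria is reduced to a priority vector via its principal eigenvector, followed by a consistency check. The comparison values below are hypothetical, not those used for the Kantamal sub-catchment.

```python
import numpy as np

# Hypothetical pairwise-comparison matrix for 3 criteria (Saaty 1-9 scale);
# entry [i, j] says how much more important criterion i is than criterion j.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
])

# AHP priority vector = principal eigenvector, normalized to sum to 1
vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)
w = np.abs(vecs[:, k].real)
w = w / w.sum()

# Consistency ratio; random index RI = 0.58 for a 3x3 matrix
n = A.shape[0]
CI = (vals.real[k] - n) / (n - 1)
CR = CI / 0.58
print(w, CR)
```

A consistency ratio below 0.1 is the usual threshold for accepting the judgments; larger values mean the pairwise comparisons contradict each other and should be revised.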
Most of the literature on the combination of forecasts assumes unbiased individual forecasts. Here, we consider the case of biased forecasts and discuss two different combination techniques resulting in an unbiased forecast. On the one hand, we correct the individual forecasts; on the other, we calculate bias-based weights. A simulation study gives some insight into the situations where each method should be used.
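The first technique (correcting the individual forecasts before averaging) can be sketched as follows; the biases and noise levels below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)
truth = rng.normal(10.0, 1.0, size=400)

# Two biased forecasters (biases +2 and -1 are illustrative assumptions)
f1 = truth + 2.0 + rng.normal(0, 0.5, 400)
f2 = truth - 1.0 + rng.normal(0, 0.5, 400)

train = slice(0, 200)       # window used to estimate each bias
holdout = slice(200, 400)   # window used to judge the combination

# Estimate each forecaster's bias on the training window, correct, then average
b1 = np.mean(f1[train] - truth[train])
b2 = np.mean(f2[train] - truth[train])
combined = 0.5 * ((f1[holdout] - b1) + (f2[holdout] - b2))

# Naive average for comparison: inherits the average of the two biases
naive = 0.5 * (f1[holdout] + f2[holdout])
print(np.mean(combined - truth[holdout]), np.mean(naive - truth[holdout]))
```

The corrected combination is approximately unbiased, while the naive average carries the mean of the individual biases (here about +0.5).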
Sutton, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Zhao, P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)
Technologies to survey and decontaminate wide-area contamination and process the subsequent radioactive waste have been developed and implemented following the Chernobyl nuclear power plant release and the breach of a radiological source resulting in contamination in Goiania, Brazil. These civilian examples of radioactive material releases provided some of the first examples of urban radiological remediation. Many emerging technologies have recently been developed and demonstrated in Japan following the release of radioactive cesium isotopes (Cs-134 and Cs-137) from the Fukushima Dai-ichi nuclear power plant in 2011. Information on technologies reported by several Japanese government agencies, such as the Japan Atomic Energy Agency (JAEA), the Ministry of the Environment (MOE) and the National Institute for Environmental Science (NIES), together with academic institutions and industry are summarized and compared to recently developed, deployed and available technologies in the United States. The technologies and techniques presented in this report may be deployed in response to a wide area contamination event in the United States. In some cases, additional research and testing is needed to adequately validate the technology effectiveness over wide areas. Survey techniques can be deployed on the ground or from the air, allowing a range of coverage rates and sensitivities. Survey technologies also include those useful in measuring decontamination progress and mapping contamination. Decontamination technologies and techniques range from non-destructive (e.g., high pressure washing) and minimally destructive (plowing), to fully destructive (surface removal or demolition). Waste minimization techniques can greatly impact the long-term environmental consequences and cost following remediation efforts. Recommendations on technical improvements to address technology gaps are presented together with observations on remediation in Japan.
Information technology plays an important role in increasing productivity in many organizations. The primary objective of the present survey is to study the impact of information technology on productivity and to test for a positive and significant relationship between these two factors. The structural equation modeling technique and LISREL software are used for analysis of the questionnaires distributed among managers and some employees of Iran Behnoush Company. Organizations try to improve their performance by investing in information technology. However, many previous studies indicate that the impact of information technology on the productivity of organizations is insignificant. The present survey studies the impact of information technology on organizational productivity through the data collected from the above company. Results confirm the existence of a positive relationship between information technology and productivity.
Marina Gennadievna Vlasova
Today the effectiveness of the national security system depends heavily on the professional analysis of information and timely forecasts. Thus, efficient methods of forecasting in the sphere of international relations are of current importance for modern intelligence services. The Indications and Warning Technique, which was a key element of forecasting methodology in intelligence until the end of the Cold War, is evaluated in the present article. Is this method still relevant in the contemporary world, with its new international order, new security challenges, and technological revolution in data collection and processing? The main conclusion, based on an overview of current research and known intelligence practice, is that the indicators technique is still relevant for the early warning of national security threats but requires some adaptation to today's issues. The most important directions of adaptation are supposed to be the creation of the broadest possible spectrum of threat scenarios, as well as research on current strategic threats and their corresponding indicators. Appropriate software that automates the use of the indications technique by the security services is also very important. The author believes that cooperation between intelligence services and the academic community can increase the efficiency of the Indications Methodology and of strategic forecasting as well.
Electrical Load Survey and Forecast for a Decentralized Hybrid Power System at Elebu, Kwara State, Nigeria. ... Nigerian Journal of Technology ... The paper reports the results of electrical load demand and forecast for Elebu rural community ...
Between the early sixties and late 1989, the German Federal Government spent some DM 23 billion to support research and development of the entire field of nuclear technology (such as fundamental research, industrial applications, medicine, safety technology, advanced energy systems) in the Federal Republic of Germany. Of this amount, approx. DM 11 billion was spent on the technology of nuclear power plants equipped with light water reactors, on safety research, and on the nuclear fuel cycle. Comparing the expenditures of the Federal Government for the conversion of nuclear power into electricity with the savings achieved in electricity generating costs of approx. DM 58 billion by late 1989 (the cost advantage of nuclear power being approx. Pf 5/kWh), one arrives at a cost advantage to the whole economy of approx. DM 47 billion by the date shown above; by the year 2000, this advantage will have risen to some DM 150 billion. (orig.)
Elias, Russell J.; Montgomery, Douglas C.; Kulahci, Murat
An overview of statistical forecasting methodology is given, focusing on techniques appropriate to short- and medium-term forecasts. Topics include basic definitions and terminology, smoothing methods, ARIMA models, regression methods, dynamic regression models, and transfer functions. Techniques for evaluating and monitoring forecast performance are also summarized.
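As an example of the smoothing methods covered, simple exponential smoothing produces one-step-ahead forecasts from a single smoothing parameter alpha; a minimal sketch:

```python
def ses(series, alpha):
    """Simple exponential smoothing.
    Returns the one-step-ahead forecast for each point: the current level,
    updated as a weighted average of the newest observation and the old level."""
    level = series[0]           # initialize the level at the first observation
    forecasts = [level]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
        forecasts.append(level)
    return forecasts

print(ses([3.0, 5.0, 4.0, 6.0, 5.0], alpha=0.5))
# -> [3.0, 4.0, 4.0, 5.0, 5.0]
```

Alpha near 1 tracks the data closely; alpha near 0 smooths heavily. In practice alpha is chosen by minimizing the in-sample one-step forecast error.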
Williams, John L.; Maxwell, Reed M.; Monache, Luca Delle
Wind power is rapidly gaining prominence as a major source of renewable energy. Harnessing this promising energy source is challenging because of the chaotic and inherently intermittent nature of wind. Accurate forecasting tools are critical to support the integration of wind energy into power grids and to maximize its impact on renewable energy portfolios. We have adapted the Data Assimilation Research Testbed (DART), a community software facility which includes the ensemble Kalman filter (EnKF) algorithm, to expand our capability to use observational data to improve forecasts produced with a fully coupled hydrologic and atmospheric modeling system: the ParFlow (PF) hydrologic model and the Weather Research and Forecasting (WRF) mesoscale atmospheric model, coupled via mass and energy fluxes across the land surface, resulting in the PF.WRF model. Numerous studies have shown that soil moisture distribution and land surface vegetative processes profoundly influence atmospheric boundary layer development and weather processes on local and regional scales. We have used the PF.WRF model to explore the connections between the land surface and the atmosphere in terms of land surface energy flux partitioning and coupled variable fields including hydraulic conductivity, soil moisture, and wind speed, and demonstrated that reductions in uncertainty in these coupled fields realized through assimilation of soil moisture observations propagate through the hydrologic and atmospheric system. The sensitivities found in this study will enable further studies to optimize observation strategies to maximize the utility of the PF.WRF-DART forecasting system.
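The EnKF analysis step at the core of DART can be sketched for a scalar state using the perturbed-observation form; the numbers below are toy assumptions, not PF.WRF-DART output.

```python
import numpy as np

rng = np.random.default_rng(11)

# Toy scalar-state EnKF analysis step (perturbed-observation form)
truth = 5.0
obs_err = 0.5
y = truth + rng.normal(0, obs_err)       # a single noisy observation

ens = rng.normal(3.0, 2.0, size=100)     # biased, spread-out forecast ensemble

# Kalman gain estimated from the ensemble variance
v = np.var(ens, ddof=1)
K = v / (v + obs_err**2)

# Update each member against a perturbed copy of the observation,
# which keeps the analysis ensemble spread statistically consistent
y_pert = y + rng.normal(0, obs_err, size=ens.size)
analysis = ens + K * (y_pert - ens)

print(np.mean(ens), np.mean(analysis), np.var(analysis, ddof=1))
```

The analysis mean is pulled toward the observation and the ensemble spread shrinks, exactly the uncertainty reduction that the study shows propagating through the coupled hydrologic-atmospheric fields.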
Butin, V I; Trofimov, Eh N
Typical responses of bipolar digital integrated circuits (DICs) of the combinational type under the action of pulsed gamma radiation are presented. An analysis of the DIC transients is carried out. A calculation-experimental method for forecasting the temporary loss of serviceability of bipolar DICs is proposed. The reliability of the method is confirmed experimentally. 1 fig.
He, Han; Wang, Huaning; Du, Zhanle; Zhang, Liyun; Huang, Xin; Yan, Yan; Fan, Yuliang; Zhu, Xiaoshuai; Guo, Xiaobo; Dai, Xinghua
The history of solar weather forecasting services at the National Astronomical Observatories, Chinese Academy of Sciences (NAOC) can be traced back to the 1960s. Nowadays, NAOC is the headquarters of the Regional Warning Center of China (RWC-China), one of the members of the International Space Environment Service (ISES). NAOC is responsible for exchanging data, information, and space weather forecasts of RWC-China with other RWCs. The solar weather forecasting services at NAOC cover short-term prediction (within two or three days), medium-term prediction (within several weeks), and long-term prediction (on the time scale of the solar cycle) of solar activities. Most short-term prediction research is concentrated on solar eruptive phenomena, such as flares, coronal mass ejections (CMEs), and solar proton events, which are the key driving sources of strong space weather disturbances. Based on high-quality observational data from the latest space-based and ground-based solar telescopes, and with the help of artificial intelligence techniques, new numerical models with quantitative analyses and physical considerations are being developed for the prediction of solar eruptive events. 3-D computer simulation technology is being introduced into the operational solar weather service platform to visualize the monitoring of solar activities, the running of the prediction models, and the presentation of the forecasting results. A new-generation operational solar weather monitoring and forecasting system is expected to be constructed at NAOC in the near future.
Courtney, Jennifer; Lynch, Peter; Sweeney, Conor
Accurate forecasting of available energy is crucial for the efficient management and use of wind power in the national power grid. With energy output critically dependent upon wind strength, there is a need to reduce the errors associated with wind forecasting. The objective of this research is to obtain the best possible wind forecasts for the wind energy industry. To achieve this goal, three methods are being applied. First, a mesoscale numerical weather prediction (NWP) model called WRF (Weather Research and Forecasting) is being used to predict wind values over Ireland. Currently, a grid resolution of 10 km is used, and higher model resolutions are being evaluated to establish whether they are economically viable given the forecast skill improvement they produce. Second, the WRF model is being used in conjunction with ECMWF (European Centre for Medium-Range Weather Forecasts) ensemble forecasts to produce a probabilistic weather forecasting product. Due to the chaotic nature of the atmosphere, a single, deterministic weather forecast can only have limited skill. The ECMWF ensemble methods produce an ensemble of 51 global forecasts, twice a day, by perturbing the initial conditions of a 'control' forecast, which is the best estimate of the initial state of the atmosphere. This method provides an indication of the reliability of the forecast and a quantitative basis for probabilistic forecasting. The limitation of ensemble forecasting lies in the fact that the perturbed model runs behave differently under different weather patterns and each model run is equally likely to be closest to the observed weather situation. Models have biases, and involve assumptions about physical processes and forcing factors such as underlying topography. Third, Bayesian Model Averaging (BMA) is being applied to the output from the ensemble forecasts in order to statistically post-process the results and achieve a better wind forecasting system. BMA is a promising technique that will offer calibrated
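The BMA post-processing step fits a weighted mixture of member-centered kernels to past forecast-observation pairs, with weights reflecting each member's relative skill. A much-simplified EM sketch is given below; the Gaussian kernel with a fixed width is an assumption (full BMA also estimates the kernel variance), and the function name is hypothetical:

```python
import numpy as np

def bma_weights(forecasts, obs, sigma=1.0, iters=200):
    """EM estimate of BMA member weights with fixed Gaussian kernels.

    forecasts -- (n_times, n_members) ensemble member forecasts
    obs       -- (n_times,) verifying observations
    sigma     -- fixed kernel standard deviation (simplification)
    """
    t, m = forecasts.shape
    w = np.full(m, 1.0 / m)
    for _ in range(iters):
        # E-step: responsibility of each member for each observation
        dens = np.exp(-0.5 * ((obs[:, None] - forecasts) / sigma) ** 2)
        z = w * dens
        z /= z.sum(axis=1, keepdims=True)
        # M-step: new weights are the mean responsibilities
        w = z.mean(axis=0)
    return w
```

The resulting weights define a calibrated predictive mixture: a member that tracks the observations closely receives most of the weight.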
There are many didactic and technological methods and techniques that shape and develop the cognitive interest of primary school students in the modern methodology of teaching foreign languages. The use of various forms of gaming interaction, problem assignments, and information and communication technologies in teaching primary school students makes it possible to diversify the teaching of a foreign language and contributes to the development of their creative and cognitive activity. The use of health-saving technologies ensures a psychologically and emotionally supportive atmosphere in the lesson, which is an essential condition for acquiring new knowledge and maintaining stable cognitive interest among students while learning a foreign language.
To what extent do frequently cited determinants of military spending allow us to predict and forecast future levels of expenditure? The authors draw on the data and specifications of a recent model on military expenditure and assess the predictive power of its variables using in-sample predictions, out-of-sample forecasts and Bayesian model averaging. To this end, this paper provides guidelines for prediction exercises in general using these three techniques. More substantially, however, the findings emphasize that previous levels of military spending as well as a country's institutional and economic characteristics particularly improve our ability to predict future levels of investment in the military. Variables pertaining to the international security environment also matter, but seem less important. In addition, the results highlight that the updated model, which drops weak predictors, is not only more parsimonious, but also slightly more accurate than the original specification.
Automation of energy demand forecasting saves time and effort by searching automatically for an appropriate model in a candidate model space without manual intervention. This thesis introduces a search-based approach that improves the performance of the model searching process for econometrics models. Further improvements in the accuracy of the energy demand forecasting are achieved by integrating nonlinear transformations within the models. This thesis introduces machine learning techniques that are capable of modeling such nonlinearity. Algorithms for learning domain knowledge from time series data using the machine learning methods are also presented. The novel search based approach and the machine learning models are tested with synthetic data as well as with natural gas and electricity demand signals. Experimental results show that the model searching technique is capable of finding an appropriate forecasting model. Further experimental results demonstrate an improved forecasting accuracy achieved by using the novel machine learning techniques introduced in this thesis. This thesis presents an analysis of how the machine learning techniques learn domain knowledge. The learned domain knowledge is used to improve the forecast accuracy.
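As an illustration of searching a candidate model space automatically, the sketch below scores a small family of trend models by AIC and returns the winner. It is a toy stand-in for the econometric model search described above; the candidate family (polynomial trends) and the scoring details are assumptions, not the thesis's method:

```python
import numpy as np

def search_best_model(y, max_degree=4):
    """Search a candidate space of polynomial trend models, scored by AIC.

    Returns the degree of the model minimizing
    AIC = n * log(RSS / n) + 2k, where k counts the fitted coefficients.
    """
    t = np.arange(len(y))
    best = None
    for d in range(1, max_degree + 1):
        coeffs = np.polyfit(t, y, d)
        resid = y - np.polyval(coeffs, t)
        k = d + 1                                   # number of coefficients
        rss = float(resid @ resid)
        aic = len(y) * np.log(rss / len(y) + 1e-12) + 2 * k
        if best is None or aic < best[0]:
            best = (aic, d)
    return best[1]
```

The penalty term 2k keeps the search from always preferring the most flexible candidate, mirroring the trade-off any automated model search must manage.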
Araújo, Miguel B; New, Mark
Concern over implications of climate change for biodiversity has led to the use of bioclimatic models to forecast the range shifts of species under future climate-change scenarios. Recent studies have demonstrated that projections by alternative models can be so variable as to compromise their usefulness for guiding policy decisions. Here, we advocate the use of multiple models within an ensemble forecasting framework and describe alternative approaches to the analysis of bioclimatic ensembles, including bounding box, consensus and probabilistic techniques. We argue that, although improved accuracy can be delivered through the traditional tasks of trying to build better models with improved data, more robust forecasts can also be achieved if ensemble forecasts are produced and analysed appropriately.
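The three ensemble-analysis approaches named above (bounding box, consensus, probabilistic) can be sketched for binary presence/absence projections from multiple bioclimatic models; the 50% consensus threshold and the function name are illustrative assumptions:

```python
import numpy as np

def ensemble_summaries(projections):
    """Summarize an ensemble of bioclimatic range projections.

    projections -- (n_models, n_cells) boolean presence predictions
    """
    votes = projections.mean(axis=0)  # per-cell fraction of models agreeing
    return {
        # Bounding box: union of all projected ranges
        "bounding_box": projections.any(axis=0),
        # Consensus: majority rule across models (threshold assumed at 0.5)
        "consensus": votes >= 0.5,
        # Probabilistic: agreement fraction read as an occurrence probability
        "probabilistic": votes,
    }
```

The probabilistic summary retains the between-model variability that a single "best" model would hide, which is the core argument for ensemble forecasting here.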
Chantoem, Rewadee; Rattanavich, Saowalak
This research compares the English language achievements of vocational students, their reading and writing abilities, and their attitudes towards learning English taught with just-in-time teaching techniques through web technologies and conventional methods. The experimental and control groups were formed, a randomized true control group…
Nguyen Quang Liem
This report highlights the promising applications of modern analysis techniques such as scanning electron microscopy, X-ray fluorescence, X-ray diffraction, Raman scattering spectroscopy, and thermal expansion measurement in tracing ancient art-ceramics technologies.
M.Com. (Business Management). Forecasting is an important function used in a wide range of business planning and decision-making situations. The purpose of this study was to build a sales forecasting model that would be practical and cost effective, drawing on the various forecasting methods and techniques available. Various forecast models, methods, and techniques are outlined in the initial part of this study. The author outlines some of the fundamentals and limitations that unde...
Our main objective is to improve the quality of photovoltaic power forecasts derived from weather forecasts. Such forecasts are imperfect due to meteorological uncertainties and statistical modeling inaccuracies in the conversion of weather forecasts to power forecasts. First, we gather several weather forecasts; second, we generate multiple photovoltaic power forecasts; and finally, we build linear combinations of the power forecasts. The minimization of the Continuous Ranked Probability Score (CRPS) allows us to statistically calibrate the combination of these forecasts and provides probabilistic forecasts in the form of a weighted empirical distribution function. We investigate the CRPS bias in this context and several properties of scoring rules which can be seen as a sum of quantile-weighted losses or a sum of threshold-weighted losses. The minimization procedure is achieved with online learning techniques. Such techniques come with theoretical guarantees of robustness on the predictive power of the combination of the forecasts; essentially no assumptions are needed for the theoretical guarantees to hold. The proposed methods are applied to the forecasting of solar radiation using satellite data, and to the forecasting of photovoltaic power based on high-resolution weather forecasts and standard forecast ensembles.
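For a weighted empirical distribution with samples x_i and weights w_i, the CRPS at observation y has the closed form Σᵢ wᵢ|xᵢ − y| − ½ Σᵢⱼ wᵢwⱼ|xᵢ − xⱼ|. The sketch below implements that generic formula (it is not the authors' code):

```python
import numpy as np

def crps_weighted(samples, weights, y):
    """CRPS of a weighted empirical forecast distribution at observation y."""
    x = np.asarray(samples, float)
    w = np.asarray(weights, float)
    w = w / w.sum()                       # normalize weights
    # Expected distance to the observation
    term1 = np.sum(w * np.abs(x - y))
    # Half the expected distance between two independent forecast draws
    term2 = 0.5 * np.sum(w[:, None] * w[None, :] * np.abs(x[:, None] - x[None, :]))
    return term1 - term2
```

For a degenerate (single-sample) forecast the CRPS reduces to the absolute error, which is why minimizing it calibrates both the location and the spread of the combined forecast.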
National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...
The purpose of this research is to determine the effect of a technology-based learning model and assessment technique on thermodynamics achievement, controlling for student intelligence. This is an experimental study; the sample was taken through cluster random sampling, with a total of 80 student respondents. The results show that, after controlling for student intelligence, the thermodynamics achievement of students taught with the environmental-utilization learning model is higher than that of students taught with animated simulations. There is also an interaction effect between the technology-based learning model and the assessment technique on students' thermodynamics achievement, after controlling for student intelligence. Based on these findings, lectures should use the environmental-utilization thermodynamics learning model together with project assessment techniques.
Jin, Sainan; Corradi, Valentina; Swanson, Norman
Forecast accuracy is typically measured in terms of a given loss function. However, as a consequence of the use of misspecified models in multiple model comparisons, relative forecast rankings are loss function dependent. This paper addresses this issue by using a novel criterion for forecast evaluation which is based on the entire distribution of forecast errors. We introduce the concepts of general-loss (GL) forecast superiority and convex-loss (CL) forecast superiority, and we establish a ...
Bon, A. T.; Ng, T. K.
The healthcare industry has become an important field nowadays, as it concerns people's health. Forecasting demand for health services is therefore an important step in managerial decision making for all healthcare organizations. Hence, a case study was conducted in a University Health Centre to collect historical demand data for Panadol 650 mg over 68 months, from January 2009 until August 2014. The aim of the research is to optimize overall inventory demand through forecasting techniques. Quantitative, time series forecasting models were used in the case study to forecast future data as a function of past data. The data pattern must be identified before applying the forecasting techniques; here the data exhibit a trend. Ten forecasting techniques were then applied using the Risk Simulator software, and the best technique was identified as the one with the smallest forecasting error. The ten techniques comprise single moving average, single exponential smoothing, double moving average, double exponential smoothing, regression, Holt-Winters additive, seasonal additive, Holt-Winters multiplicative, seasonal multiplicative, and Autoregressive Integrated Moving Average (ARIMA). According to the forecasting accuracy measurement, the best forecasting technique is regression analysis.
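As a small illustration of ranking techniques by forecast error, the sketch below compares single exponential smoothing against a linear time-trend regression by RMSE; on trending data like this study's, the regression fit wins because smoothing lags the trend. Parameter values are illustrative, not those of the study:

```python
import numpy as np

def ses_forecast(y, alpha=0.3):
    """One-step-ahead single exponential smoothing forecasts."""
    f = np.empty(len(y), dtype=float)
    f[0] = y[0]
    for t in range(1, len(y)):
        f[t] = alpha * y[t - 1] + (1 - alpha) * f[t - 1]
    return f

def trend_forecast(y):
    """In-sample fits of a linear time-trend regression."""
    t = np.arange(len(y))
    slope, intercept = np.polyfit(t, y, 1)
    return intercept + slope * t

def rmse(actual, forecast):
    """Root mean square forecast error."""
    return float(np.sqrt(np.mean((np.asarray(actual) - np.asarray(forecast)) ** 2)))
```

Ranking candidate techniques by such an error metric over the historical window is exactly the selection procedure the case study applies across its ten candidates.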
This paper documents the presence of systematic bias in the real GDP and inflation forecasts of private sector forecasters in the G7 economies in the years 1990–2005. The data come from the monthly Consensus Economics forecasting service, and bias is measured and tested for significance using parametric fixed effect panel regressions and nonparametric tests on accuracy ranks. We examine patterns across countries and forecasters to establish whether the bias reflects the inefficient use of i...
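In its simplest single-series form, a test for systematic bias reduces to a t-test that the mean forecast error is zero. The sketch below is that simplification only, not the fixed-effect panel specification used in the paper:

```python
import math

def bias_t_stat(errors):
    """t-statistic for H0: mean forecast error is zero (unbiasedness).

    errors -- list of forecast errors (forecast minus outcome)
    """
    n = len(errors)
    mean = sum(errors) / n
    # Sample variance with Bessel's correction
    var = sum((e - mean) ** 2 for e in errors) / (n - 1)
    return mean / math.sqrt(var / n)
```

A large positive statistic indicates persistent over-prediction; the panel versions in the paper additionally pool information across countries and forecasters.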
Hong, Seokpyo; Ahn, Kilsoo; Kim, Sungjune; Gong, Sungyong
This study presents a methodology that enables a quantitative assessment of green chemistry technologies. The study carries out a quantitative evaluation of a particular case of material reutilization by calculating the level of "greenness" i.e., the level of compliance with the principles of green chemistry that was achieved by implementing a green chemistry technology. The results indicate that the greenness level was enhanced by 42% compared to the pre-improvement level, thus demonstrating the economic feasibility of green chemistry. The assessment technique established in this study will serve as a useful reference for setting the direction of industry-level and government-level technological R&D and for evaluating newly developed technologies, which can greatly contribute toward gaining a competitive advantage in the global market.
Marcati, Alberto; Prete, M. Irene; Mileti, Antonio; Cortese, Mario; Zodiatis, George; Karaolia, Andria; Gauci, Adam; Drago, Aldo
This paper presents a case study on the management of users' engagement in the development of a new technology. Based on the experience of MEDESS-4MS, an integrated operational model for oil spill Decision Support System covering the whole Mediterranean Sea, the case study is aimed at the development of a framework for user engagement and for the management of its dual logic. Indeed, users may play a dual role in the innovation process, contributing to both the design of the innovation and its promotion. Users contribute to shaping the innovation, by aggregating and integrating knowledge, and they facilitate its diffusion, by adopting the innovation and fostering its adoption within the socio-economic system.
Rogers, R.; Marres, N.
New World Wide Web (web) mapping techniques may inform and ultimately facilitate meaningful participation in current science and technology debates. The technique described here "landscapes" a debate by displaying key "webby" relationships between organizations. "Debate-scaping" plots two
Mohamad, Nur Royhaila; Marzuki, Nur Haziqah Che; Buang, Nor Aziah; Huyop, Fahrul; Wahab, Roswanira Abdul
The current demands of sustainable green methodologies have increased the use of enzymatic technology in industrial processes. Employment of enzymes as biocatalysts offers the benefits of mild reaction conditions, biodegradability, and catalytic efficiency. The harsh conditions of industrial processes, however, increase the propensity for enzyme destabilization, shortening their industrial lifespan. Consequently, enzyme immobilization technology provides an effective means to circumvent these concerns by enhancing enzyme catalytic properties, simplifying downstream processing, and improving operational stability. There are several techniques used to immobilize enzymes onto supports, ranging from reversible physical adsorption and ionic linkages to irreversible, stable covalent bonds. Such techniques produce immobilized enzymes of varying stability due to changes in the surface microenvironment and the degree of multipoint attachment. Hence, it is essential to obtain information about the structure of the enzyme protein following interaction with the support surface, as well as interactions of the enzymes with other proteins. Characterization technologies at the nanoscale level to study enzymes immobilized on surfaces are crucial to obtain valuable qualitative and quantitative information, including morphological visualization of the immobilized enzymes. These technologies are pertinent to assessing the efficacy of an immobilization technique and developing future enzyme immobilization strategies.
Watanabe, Masaaki; Miyasaka, Yasuhiko; Miyao, Hidehiko; Ooki, Arahiko; Ninomiya, Toshiaki; Koiwai, Masami
In the decommissioning of nuclear facilities, thermal cutting techniques such as oxygen-acetylene gas cutting and plasma arc cutting are generally used for cutting massive and thick steel structures, in consideration of cutting speed and control performance. These techniques generate dust, smoke, aerosols, and a large quantity of secondary waste. Mechanical cutting has the advantage of producing little secondary waste, and the metal chips from the kerf are recovered easily compared with thermal cutting techniques. A remote mechanical cutting system for highly activated RPVs has been developed in a manner that achieves safety and cost effectiveness. The development has been performed on consignment to RANDEC from the Science and Technology Agency of Japan.
Song, Qiang; Chissom, Brad S.
The concept of fuzzy time series is introduced and used to forecast the enrollment of a university. Fuzzy time series, an aspect of fuzzy set theory, forecasts enrollment using a first-order time-invariant model. To evaluate the model, the conventional linear regression technique is applied and the predicted values obtained are compared to the…
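A first-order fuzzy time series forecast in the spirit of this model can be sketched as follows. The interval partition, the simplified membership (each value assigned to its single nearest set), and centroid defuzzification are simplifying assumptions relative to the full Song-Chissom formulation:

```python
import numpy as np

def fuzzy_ts_forecast(series, n_sets=7):
    """Simplified first-order fuzzy time series forecast.

    Fuzzify each value into one of n_sets intervals over the universe of
    discourse, learn which fuzzy set follows which (the fuzzy logical
    relationships), and forecast the centroid of the successor sets.
    """
    lo, hi = min(series) - 1, max(series) + 1      # universe of discourse
    edges = np.linspace(lo, hi, n_sets + 1)
    mids = (edges[:-1] + edges[1:]) / 2
    states = np.clip(np.digitize(series, edges) - 1, 0, n_sets - 1)
    # First-order fuzzy logical relationships: state -> successor states
    succ = {s: set() for s in range(n_sets)}
    for a, b in zip(states[:-1], states[1:]):
        succ[a].add(b)
    nxt = succ[states[-1]] or {states[-1]}
    # Defuzzify: average the midpoints of the predicted fuzzy sets
    return float(np.mean([mids[s] for s in sorted(nxt)]))
```

As in the paper, such a forecast can then be benchmarked against a conventional linear regression fitted to the same enrollment series.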
Montgomery, Douglas C; Kulahci, Murat
An accessible introduction to the most current thinking in and practicality of forecasting techniques in the context of time-oriented data. Analyzing time-oriented data and forecasting are among the most important problems that analysts face across many fields, ranging from finance and economics to production operations and the natural sciences. As a result, there is a widespread need for large groups of people in a variety of fields to understand the basic concepts of time series analysis and forecasting. Introduction to Time Series Analysis and Forecasting presents the time series analysis branch of applied statistics as the underlying methodology for developing practical forecasts, and it also bridges the gap between theory and practice by equipping readers with the tools needed to analyze time-oriented data and construct useful, short- to medium-term, statistically based forecasts.
C-L. Chang (Chia-Lin); Ph.H.B.F. Franses (Philip Hans); M.J. McAleer (Michael)
Macro-economic forecasts typically involve both a model component, which is replicable, as well as intuition, which is non-replicable. Intuition is expert knowledge possessed by a forecaster. If forecast updates are progressive, forecast updates should become more accurate, on average,
Sille, A K [Gosudarstvennyj Komitet po Ispol'zovaniyu Atomnoj Ehnergii SSSR, Moscow]
The activities of the Scientific and Technical Coordination Council for Radiation Technique and Technology (STCC-RTT) of the CMEA Permanent Commission for the Peaceful Uses of Atomic Energy under the 1971-1975 programme are reported. The STCC-RTT is concerned with technical applications such as radiation sterilization, food irradiation, and radiation-induced chemical processes. The main tasks to be solved in the period from 1976 to 1980 are outlined.
Ochoa-Vásquez, Miguel A.; Ramírez-Montoya, María S.
Improving reading comprehension skills is fundamental to those students willing to enroll in undergraduate studies. This sequential-explanatory mixed methods research design attempted to measure the impact that English reading comprehension assessment had on 96 college students’ school performance, after receiving a 15-hour instruction on reading evaluating techniques in technological-enriched environments. The data was collected through reading comprehension pre/post-tests and a semi-structu...
Turco, M.; Milelli, M.
skill scores of two competitive forecasts. It is important to underline that the conclusions refer to the analysis of the Piemonte operational alert system, so they cannot be directly taken as universally true. But we think that some of the main lessons that can be derived from this study could be useful for the meteorological community. In detail, the main conclusions are the following: - despite the overall improvement at the global scale and the fact that the resolution of limited area models has increased considerably over recent years, the QPF produced by the meteorological models involved in this study has not improved enough to allow its direct use; that is, the subjective HQPF continues to offer the best performance; - in the forecast process, the step where humans have the largest added value with respect to mathematical models is communication. In fact, the human characterisation and communication of forecast uncertainty to end users cannot be replaced by any computer code; - finally, although there is no novelty in this study, we would like to show that the correct application of appropriate statistical techniques permits a better definition and quantification of the errors and, most importantly, allows correct (unbiased) communication between forecasters and decision makers.
Trade patterns and transport markets are changing as a result of the growth and globalization of international trade, and forecasting future freight flow has to rely on trade forecasts. Forecasting freight flows is critical for matching infrastructure supply to demand and for assessing investment... Such trade forecasts constitute a valuable input to freight models for forecasting future capacity problems.
Borkum, Mark I; Frey, Jeremy G
The drug discovery process is now highly dependent on the management, curation and integration of large amounts of potentially useful data. Semantics are necessary in order to interpret the information and derive knowledge. Advances in recent years have mitigated concerns that the lack of robust, usable tools has inhibited the adoption of methodologies based on semantics. This paper presents three examples of how Semantic Web techniques and technologies can be used in order to support chemistry research: a controlled vocabulary for quantities, units and symbols in physical chemistry; a controlled vocabulary for the classification and labelling of chemical substances and mixtures; and a database of chemical identifiers. This paper also presents a Web-based service that uses the datasets in order to assist with the completion of risk assessment forms, along with a discussion of the legal implications and value-proposition for the use of such a service. We have introduced the Semantic Web concepts, technologies, and methodologies that can be used to support chemistry research, and have demonstrated the application of those techniques in three areas very relevant to modern chemistry research, generating three new datasets that we offer as exemplars of an extensible portfolio of advanced data integration facilities. We have thereby established the importance of Semantic Web techniques and technologies for meeting Wild's fourth "grand challenge".
Mikami, M; Tanaka, T Y; Maki, T
JMA's dust forecasting information, which is based on a GCM dust model, is presented through the JMA website together with nowcast information. The website was updated recently, and a joint JMA-MOE 'KOSA' website was opened in April 2008. Data assimilation techniques will be introduced to improve the 'KOSA' information.
Zhao, Zhizhen; Giannakis, Dimitrios
Analog forecasting is a nonparametric technique introduced by Lorenz in 1969 which predicts the evolution of states of a dynamical system (or observables defined on the states) by following the evolution of the sample in a historical record of observations which most closely resembles the current initial data. Here, we introduce a suite of forecasting methods which improve traditional analog forecasting by combining ideas from kernel methods developed in harmonic analysis and machine learning and state-space reconstruction for dynamical systems. A key ingredient of our approach is to replace single-analog forecasting with weighted ensembles of analogs constructed using local similarity kernels. The kernels used here employ a number of dynamics-dependent features designed to improve forecast skill, including Takens’ delay-coordinate maps (to recover information in the initial data lost through partial observations) and a directional dependence on the dynamical vector field generating the data. Mathematically, our approach is closely related to kernel methods for out-of-sample extension of functions, and we discuss alternative strategies based on the Nyström method and the multiscale Laplacian pyramids technique. We illustrate these techniques in applications to forecasting in a low-order deterministic model for atmospheric dynamics with chaotic metastability, and interannual-scale forecasting in the North Pacific sector of a comprehensive climate model. We find that forecasts based on kernel-weighted ensembles have significantly higher skill than the conventional approach following a single analog.
The technological tools and work methodologies most used in the geological sciences are reviewed and described. Electronic devices such as laptops, palmtops or PDAs (personal digital assistants), tablets, and smartphones have made it possible to collect field geological data and store them efficiently. Tablets and smartphones are convenient for collecting scientific data because of the diversity of sensors they contain, their portability and autonomy, and the possibility of installing specific applications. High-precision GPS, in conjunction with LIDAR and sonar technology, has become more accessible and is used for geological research, generating high-resolution three-dimensional models that complement geological studies. Remote sensing techniques such as high-penetration radar are used to model ice thickness and topography in Antarctica. Modern three-dimensional scanning and printing techniques are used in geological science research and teaching. Advances in computer technology now allow three-dimensional models to be handled efficiently on personal computers with different display options. Some new areas of geology that have emerged recently are mentioned to give a broad panorama of where geological research may be directed in the coming years.
Lusis, Peter; Khalilpour, Kaveh Rajab; Andrew, Lachlan
...forecasting for a single customer or even down at an appliance level. Access to high-resolution data from smart meters has enabled the research community to assess conventional load forecasting techniques and develop new forecasting strategies suitable for demand-side disaggregated loads. This paper studies how calendar effects, forecasting granularity and the length of the training set affect the accuracy of a day-ahead load forecast for residential customers. Root mean square error (RMSE) and normalized RMSE were used as forecast error metrics. Regression trees, neural networks, and support vector regression yielded similar average RMSE results, but statistical analysis showed that the regression trees technique is significantly better. The use of historical load profiles with daily and weekly seasonality, combined with weather data, leaves the explicit calendar effects a very low predictive power...
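The two error metrics used in the study can be stated compactly. Normalizing RMSE by the range of the actual load is one common convention and is assumed here (the study may normalize differently, e.g. by the mean):

```python
import numpy as np

def rmse(actual, forecast):
    """Root mean square error between actual and forecast load."""
    a, f = np.asarray(actual, float), np.asarray(forecast, float)
    return float(np.sqrt(np.mean((a - f) ** 2)))

def nrmse(actual, forecast):
    """RMSE normalized by the range of the actual load (assumed convention),
    making errors comparable across customers with different load levels."""
    a = np.asarray(actual, float)
    return rmse(a, forecast) / (a.max() - a.min())
```

Normalization matters precisely because disaggregated residential loads vary widely in magnitude, so raw RMSE alone would favor small customers.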
Among the many beneficial applications of radiation and radioisotopes in industry which are now well established in advanced countries, the applications of nuclear techniques and nucleonic control systems in the mineral industry have great potential for developing Member States. The use of nucleonic on-stream analyzers in the coal industry has resulted in enormous technical and economic benefits in addition to minimization of environmental pollution. Large savings have also resulted from the use of such analyzers in the processing of other minerals. Nuclear borehole logging techniques have demonstrated great potential in oil and gas evaluation. Radiotracer investigations have led to process optimisation and trouble shooting in various stages in ore processing and metallurgy. Though the technical and economic benefits of applications of nuclear techniques in the mineral industry are well recognised, technology transfer in these areas has been hampered by a variety of factors. In order to review the status and trends in nuclear techniques and nucleonic control systems in the mineral industry and the problems and considerations in their technology transfer to developing Member States, the IAEA convened an Advisory Group Meeting in Bombay, India, 15-19 January 1990. The present publication is based on the 7 contributions presented at this meeting. A separate abstract was prepared for each of these papers.
Henley, E.; Murray, S.; Pope, E.; Stephenson, D.; Sharpe, M.; Bingham, S.; Jackson, D.
The Met Office Space Weather Operations Centre (MOSWOC) provides a range of 24/7 operational space weather forecasts, alerts, and warnings, which provide valuable information on space weather that can degrade electricity grids, radio communications, and satellite electronics. Forecasts issued include arrival times of coronal mass ejections (CMEs), and probabilistic forecasts for flares, geomagnetic storm indices, and energetic particle fluxes and fluences. These forecasts are produced twice daily using a combination of output from models such as Enlil, near-real-time observations, and forecaster experience. Verification of forecasts is crucial for users, researchers, and forecasters to understand the strengths and limitations of forecasts, and to assess forecaster added value. To this end, the Met Office (in collaboration with Exeter University) has been adapting verification techniques from terrestrial weather, and has been working closely with the International Space Environment Service (ISES) to standardise verification procedures. We will present the results of part of this work, analysing forecast and observed CME arrival times and assessing skill using 2x2 contingency tables. These MOSWOC forecasts can be objectively compared to those produced by the NASA Community Coordinated Modelling Center - a useful benchmark. This approach cannot be taken for the other forecasts, as they are probabilistic and categorical (e.g., geomagnetic storm forecasts give probabilities of exceeding levels from minor to extreme). We will present appropriate verification techniques being developed to address these forecasts, such as the rank probability skill score, and comparing forecasts against climatology and persistence benchmarks. As part of this, we will outline the use of discrete time Markov chains to assess and improve the performance of our geomagnetic storm forecasts. We will also discuss work to adapt a terrestrial verification visualisation system to space weather, to help
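Scores from a 2x2 contingency table of forecast vs. observed events (e.g. CME arrival within a time window) can be computed as below. POD and FAR are standard; the Heidke skill score is one common summary included here for illustration, not a claim about which scores MOSWOC reports:

```python
def contingency_scores(hits, misses, false_alarms, correct_negatives):
    """Verification scores from a 2x2 contingency table of event forecasts."""
    n = hits + misses + false_alarms + correct_negatives
    pod = hits / (hits + misses)                # probability of detection
    far = false_alarms / (hits + false_alarms)  # false alarm ratio
    # Heidke skill score: accuracy relative to the random-chance expectation
    expected = ((hits + misses) * (hits + false_alarms)
                + (correct_negatives + misses)
                * (correct_negatives + false_alarms)) / n
    hss = (hits + correct_negatives - expected) / (n - expected)
    return {"POD": pod, "FAR": far, "HSS": hss}
```

A perfect forecaster scores POD = 1, FAR = 0, HSS = 1; comparing such scores across centres is what makes the NASA CCMC benchmark meaningful.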
Nilsson, P.; Uvo, C.B.; Landman, W.A.
A seasonal forecasting technique to produce probabilistic and deterministic streamflow forecasts for 23 basins in Norway and northern Sweden is developed in this work. Large scale circulation and moisture fields, forecasted by the ECHAM4.5 model 4 months in advance, are used to forecast spring flows. The technique includes model output statistics (MOS) based on a non-linear Neural Network (NN) approach. Results show that streamflow forecasts from Global Circulation Model (GCM) predictions for the Scandinavia region are viable, and the highest skill values were found for basins located in the south.
Templeton, K.J.; Dirks, L.L.
Guidance for forecasting solid low-level waste (LLW) on a site-wide basis is described in this document. Forecasting is defined as an approach for collecting information about future waste receipts. The forecasting approach discussed in this document is based solely on Hanford's experience within the last six years. Hanford's forecasting technique is not a statistical forecast based upon past receipts. Due to waste generator mission changes, startup of new facilities, and waste generator uncertainties, statistical methods have proven to be inadequate for the site. It is recommended that an approach similar to Hanford's annual forecasting strategy be implemented at each US Department of Energy (DOE) installation to ensure that forecast data are collected in a consistent manner across the DOE complex. Hanford's forecasting strategy consists of a forecast cycle that can take 12 to 30 months to complete. The duration of the cycle depends on the number of LLW generators and staff experience; however, the duration has been reduced with each new cycle. Several uncertainties are associated with collecting data about future waste receipts. Volume, shipping schedule, and characterization data are often reported as estimates with some level of uncertainty. At Hanford, several methods have been implemented to capture the level of uncertainty. Collection of a maximum and minimum volume range has been implemented, as well as questionnaires to assess the relative certainty of the requested data.
Kobold, Mira; Suselj, Kay
Timely and accurate flood forecasting is essential for reliable flood warning. The effectiveness of flood warning depends on the forecast accuracy of certain physical parameters, such as the peak magnitude of the flood, its timing, location, and duration. Conceptual rainfall-runoff models enable the estimation of these parameters and lead to useful operational forecasts. Accurate rainfall is the most important input into hydrological models. The rainfall input can be real-time rain-gauge data, weather radar data, or meteorologically forecasted precipitation. Torrential streams and fast runoff are characteristic of most Slovenian rivers, and extensive damage is caused almost every year by rainstorms affecting different regions of Slovenia. The lag time between rainfall and runoff is very short on Slovenian territory, and on-line data are used only for nowcasting. Forecasted precipitation is necessary for hydrological forecasts some days ahead. ECMWF (European Centre for Medium-Range Weather Forecasts) gives a general forecast for several days ahead, while more detailed precipitation data from the limited-area ALADIN/SI model are available for two days ahead. There is a certain degree of uncertainty in using such precipitation forecasts based on meteorological models. The variability of precipitation is very high in Slovenia, and the uncertainty of ECMWF-predicted precipitation is very large for Slovenian territory. The ECMWF model can predict precipitation events correctly, but generally underestimates the amount of precipitation; the average underestimation is about 60% for the Slovenian region. The predictions of the limited-area ALADIN/SI model up to 48 hours ahead show greater applicability in hydrological forecasting. Hydrological models are sensitive to precipitation input: the deviation of runoff is much bigger than the rainfall deviation, with a runoff-to-rainfall error fraction of about 1.6. If spatial and time distribution
N. B. ROMLI
Total power dissipation in CMOS circuits has become a huge challenge for the current semiconductor industry due to leakage current and leakage power. The exponential growth of both static and dynamic power dissipation in any CMOS process technology option has increased the cost and reduced the efficiency of systems. Technology options are used for the execution of specifications, and usually this depends on the optimisation and performance constraints over the chip. This article reviews the relevant research on the sources of power dissipation, the mechanisms to reduce both dynamic and static power dissipation, and gives an overview of various circuit techniques to control them. The impact of important device parameters, including threshold voltage and switching capacitance, on circuit performance in lowering both dynamic and static power dissipation is presented. The demand for the reduction of power dissipation in CMOS technology shall remain a challenging and active area of research for years to come. Thus, this review can serve as a guideline for researchers who wish to work on power dissipation and control techniques.
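The first-order relationships behind the review's discussion can be illustrated numerically: dynamic (switching) power scales as alpha * C * V^2 * f, while static power is set by the leakage current. All parameter values below are made-up examples, not figures from the article.

```python
# Illustrative first-order CMOS power model (invented parameter values).
# P_dynamic = alpha * C * V^2 * f (switching), P_static = V * I_leak (leakage).

def cmos_power(alpha, c_switch, vdd, freq, i_leak):
    p_dynamic = alpha * c_switch * vdd ** 2 * freq   # switching power (W)
    p_static = vdd * i_leak                          # leakage power (W)
    return p_dynamic, p_static

# Example: lowering Vdd from 1.2 V to 0.9 V cuts dynamic power quadratically.
p_dyn_12, _ = cmos_power(alpha=0.2, c_switch=1e-9, vdd=1.2, freq=1e9, i_leak=1e-3)
p_dyn_09, _ = cmos_power(alpha=0.2, c_switch=1e-9, vdd=0.9, freq=1e9, i_leak=1e-3)
```

The quadratic dependence on supply voltage is why threshold-voltage and voltage-scaling techniques dominate the low-power literature the review surveys.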
Belaifa, M.; Morimune, K.
The petroleum sector plays a central role in the foundation of world economies, and market actors (producers, intermediaries, as well as consumers) are continuously subjected to the dynamics of an unstable oil market. Huge amounts are being invested along the production chain to make one barrel of crude oil available to the end user. Added to that are the effects of geopolitical dynamics as well as geological risks, expressed in terms of low chances of successful discoveries. In addition, fiscal regimes and regulations, technology and environmental concerns are among the major factors that contribute to the substantial risk in the oil industry and render the market structure vulnerable to crises. The management of these vulnerabilities requires modern tools to reduce risk to a certain level, which unfortunately is a non-zero value. The aim of this paper is, therefore, to provide a modern technique to capture the stochastic volatility of oil prices that can be implemented to value the exposure of an investor, a company, a corporate or a Government. The paper first analyses the regional dependence on oil prices through a historical perspective and then looks at the evolution of the pricing environment since the large price jumps of the 1970s. The main causes of oil price volatility are treated in the third part of the paper. The rest of the article deals with volatility models and forecasts used in risk management, with implications for pricing derivatives. (author)
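One standard building block for the kind of time-varying volatility estimate the paper discusses is an exponentially weighted moving average (EWMA, RiskMetrics-style) of squared returns. This is a minimal sketch, not the paper's own model, and the price series is invented.

```python
import math

# EWMA estimate of return volatility (RiskMetrics-style, lambda = 0.94).
# Prices below are invented example values, not market data.

def ewma_volatility(prices, lam=0.94):
    """Exponentially weighted moving estimate of log-return volatility."""
    returns = [math.log(p1 / p0) for p0, p1 in zip(prices, prices[1:])]
    var = returns[0] ** 2                 # seed with first squared return
    for r in returns[1:]:
        var = lam * var + (1 - lam) * r ** 2
    return math.sqrt(var)

vol = ewma_volatility([60.0, 62.0, 59.0, 65.0, 64.0, 70.0])
```

The decay factor `lam` controls how quickly old observations are forgotten, which is what lets the estimate track volatility clustering in oil prices.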
Qin Ling; Leung, Kwok Sui; Griffith, J.F.
This book provides a perspective on the current status of bioimaging technologies developed to assess the quality of musculoskeletal tissue with an emphasis on bone and cartilage. It offers evaluations of scaffold biomaterials developed for enhancing the repair of musculoskeletal tissues. These bioimaging techniques include micro-CT, nano-CT, pQCT/QCT, MRI, and ultrasound, which provide not only 2-D and 3-D images of the related organs or tissues, but also quantifications of the relevant parameters. The advanced bioimaging technologies developed for the above applications are also extended by incorporating imaging contrast-enhancement materials. Thus, this book will provide a unique platform for multidisciplinary collaborations in education and joint R and D among various professions, including biomedical engineering, biomaterials, and basic and clinical medicine. (orig.)
Brahmbhatt, Akshaar; Misra, Sanjay
Peripheral Arterial Disease (PAD) affects over 8 million people in the United States alone. While great strides have been made in reducing the burden of cardiovascular disease, the prevalence of PAD is expected to rise as the global population ages. PAD, characterized by narrowing of the arteries, can be asymptomatic or cause claudication and acute limb-threatening ischemia. It has classically been treated with bypass, but these techniques have been supplanted by endovascular therapy. Plain Old Balloon Angioplasty (POBA) has been successful in helping revascularize lesions, but its effect has not been durable due to restenosis. This prompted the creation of several technologies aimed at reducing restenosis. These advances have slowly improved outcomes and the durability of endovascular management. Among the main tools used in current endovascular practice are drug delivery devices aimed at inhibiting the inflammatory and proliferative pathways that lead to restenosis. This review examines the current drug delivery technologies used in the SFA. PMID:27423996
Seyyed Mohammad Zargar
Cloud computing is a new method to provide computing resources and increase computing power in organizations. Despite the many benefits this method offers, it has not been universally adopted because of some obstacles, including security issues, and has become a concern for IT managers in organizations. In this paper, the general definition of cloud computing is presented. In addition, having reviewed previous studies, the researchers identified variables affecting technology acceptance and, especially, cloud computing technology. Then, using the DEMATEL technique, the influence and susceptibility of the variables were determined. The researchers also designed a model to show the existing dynamics in cloud computing technology using a system dynamics approach. The validity of the model was confirmed through evaluation methods for dynamic models using VENSIM software. Finally, based on different conditions of the proposed model, a variety of scenarios were designed. Then, the implementation of these scenarios was simulated within the proposed model. The results showed that any increase in data security, government support and user training can lead to an increase in the adoption and use of cloud computing technology.
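The core DEMATEL computation the study relies on can be sketched as follows: normalize a direct-influence matrix, compute the total-relation matrix T = N (I - N)^-1, and read each factor's exerted influence (row sum) and received influence (column sum). The 3x3 influence scores below are invented placeholders, not the study's data.

```python
# Rough DEMATEL sketch in pure Python (invented 0-4 influence scores).

def mat_inverse(m):
    """Gauss-Jordan inverse of a small square matrix (lists of lists)."""
    n = len(m)
    aug = [row[:] + [1.0 if i == j else 0.0 for j in range(n)]
           for i, row in enumerate(m)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(aug[r][col]))
        aug[col], aug[pivot] = aug[pivot], aug[col]
        p = aug[col][col]
        aug[col] = [x / p for x in aug[col]]
        for r in range(n):
            if r != col:
                f = aug[r][col]
                aug[r] = [a - f * b for a, b in zip(aug[r], aug[col])]
    return [row[n:] for row in aug]

def mat_mul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def dematel(direct):
    n = len(direct)
    s = max(max(sum(row) for row in direct),
            max(sum(direct[i][j] for i in range(n)) for j in range(n)))
    norm = [[x / s for x in row] for row in direct]        # normalized matrix N
    diff = [[(1.0 if i == j else 0.0) - norm[i][j] for j in range(n)]
            for i in range(n)]                              # I - N
    total = mat_mul(norm, mat_inverse(diff))                # T = N (I - N)^-1
    influence = [sum(row) for row in total]                 # row sums (R)
    dependence = [sum(total[i][j] for i in range(n)) for j in range(n)]  # col sums (C)
    return influence, dependence

# Hypothetical scores among three factors: security, support, training.
r, c = dematel([[0, 3, 2], [1, 0, 3], [2, 1, 0]])
```

R + C then ranks each factor's overall prominence and R - C separates causes from effects, which is how "effectiveness" variables are identified.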
Mwangi, C.; Ndemwa, P.
This concept is new in Kenya and needs demand-driven technology. It represents a paradigm shift from the conventional method of measuring breast milk intake by weighing infants before and after feeding. It serves as a validation tool against anthropometric measures of body fat through body density and skin-fold measurements, and as a reliable, accurate, and non-invasive tool for monitoring lean body mass changes in clinical assessments. Isotope techniques in body composition assessment proceed as follows: an isotope (deuterium) dose is given orally to the subject (about 30 grams); saliva (or urine) samples are collected after 3-4 hours; the concentration of isotope in saliva is measured using a Fourier Transform Infrared Spectrophotometer (FTIR); this concentration gives the Total Body Water (TBW) component of the body; TBW = 73% of Fat-Free Mass (FFM); FFM (kg) is calculated from this equation and subtracted from total body weight (kg) to obtain Fat Mass (kg).
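The final arithmetic in the steps above can be sketched directly. TBW itself would come from the measured deuterium enrichment; here it is taken as a given input, and the example body-weight and TBW values are invented.

```python
# Fat-free mass and fat mass from total body water, using TBW = 73% of FFM.
# Input values are invented examples.

def body_composition(total_body_weight_kg, total_body_water_kg):
    """Return (fat-free mass, fat mass) in kg from total body water."""
    fat_free_mass = total_body_water_kg / 0.73   # FFM (kg), since TBW = 0.73 * FFM
    fat_mass = total_body_weight_kg - fat_free_mass
    return fat_free_mass, fat_mass

ffm, fm = body_composition(total_body_weight_kg=70.0, total_body_water_kg=40.0)
```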
Raghuwanshi, Sanjeev Kumar; Srivastav, Akash
Microwave photonics systems provide the high bandwidth capabilities of fiber optic systems together with interconnect transmission properties that are virtually independent of length. The low-loss, wide-bandwidth capability of optoelectronic systems makes them attractive for the transmission and processing of microwave signals, while the development of high-capacity optical communication systems has required the use of microwave techniques in optical transmitters and receivers. These two strands have led to the development of the research area of microwave photonics. We can therefore consider microwave photonics as the field that studies the interaction between microwave and optical waves for applications such as communications, radar, sensors and instrumentation. In this paper we thoroughly review microwave generation techniques using photonics technology.
Horngren, C. T., Datar, S. M., & Foster, G. (2006). Cost Accounting: A Managerial Emphasis. 12th ed. Saddle River, NJ: Pearson.
Master's Thesis: A Cost Estimation Analysis of U.S. Navy Ship Fuel-Savings Techniques and Technologies.
[Figure: cumulative NPV savings ($/yr/SD ships) over time, FY12-FY18]
The wafer-level packaging process is an important technology used in semiconductor manufacturing, and how to effectively control this manufacturing system is thus an important issue for packaging firms. One way to aid in this process is to use a forecasting tool. However, the number of observations collected in the early stages of this process is usually too few to use with traditional forecasting techniques, and thus inaccurate results are obtained. One potential solution to this problem is the use of grey system theory, with its feature of small-dataset modeling. This study thus uses the AGM(1,1) grey model to solve the problem of forecasting in the pilot run stage of the packaging process. The experimental results show that the grey approach is an appropriate and effective forecasting tool for use with small datasets and that it can be applied to improve the wafer-level packaging process.
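The classical GM(1,1) grey model underlying the study's adaptive AGM(1,1) variant can be sketched as follows (the adaptive refinements are not reproduced here, and the 5-point series is invented). GM(1,1) accumulates the series, fits an exponential to the accumulated sequence by least squares, and differences the fit back to obtain forecasts — which is why it works from very small samples.

```python
import math

# Classical GM(1,1) grey forecasting sketch (invented example series).

def gm11_forecast(x0, steps=1):
    """Fit GM(1,1) to series x0 and forecast `steps` values ahead."""
    n = len(x0)
    x1 = [sum(x0[:i + 1]) for i in range(n)]                 # accumulated series (1-AGO)
    z1 = [0.5 * (x1[i] + x1[i - 1]) for i in range(1, n)]    # background values
    # Least squares for a, b in the grey equation x0[k] = -a * z1[k] + b
    m = n - 1
    szz = sum(z * z for z in z1)
    sz = sum(z1)
    sy = sum(x0[1:])
    szy = sum(z * y for z, y in zip(z1, x0[1:]))
    a = (sz * sy - m * szy) / (m * szz - sz * sz)
    b = (szz * sy - sz * szy) / (m * szz - sz * sz)

    def x1_hat(k):                                           # fitted accumulated value
        return (x0[0] - b / a) * math.exp(-a * k) + b / a

    # Forecast by differencing the fitted accumulated sequence.
    return [x1_hat(n + i) - x1_hat(n + i - 1) for i in range(steps)]

pred = gm11_forecast([100.0, 108.0, 116.0, 125.0, 135.0], steps=2)
```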
Batukaev, Abdulmalik; Kalinitchenko, Valery; Minkina, Tatiana; Mandzhieva, Saglara; Sushkova, Svetlana
The uncertainty and degradation of the biosphere is a result of outdated industrial technologies. The incorrect principles of the nature-resources-use paradigm are to be radically changed in line with the principles of Geoethics. The technological dead-end is linked to the Philosophy of Technology. The organic protection and imitation of natural patterns have until now been the theoretical base of technology. Technological and social determinism are proposed as "inevitable" for humankind. One is forced to believe that the only way for humanity is to agree that the outdated way of technical development is the only possibility for humankind to survive. But rough imitation as a method of the outdated technological platform is fruitless now. Survival under the practice of the industrial technology platform has become extremely dangerous. The challenge for humanity is to overcome the chain of environmental hazards of agronomy, irrigation, industry, and other human activities in the biosphere, which awkwardly imitate natural processes: plowing leads to degradation of soil and greenhouse gas emission; irrigation leads to excessive moistening and degradation of soil and landscape, greenhouse gas emission, and loss of freshwater, a global deficit; waste utilization leads to greenhouse gas emission, loss of oxygen and other ecological hazards. Fundamentally new technologies are to be generated for the development of the biosphere and the renewal of food and resources. Aristotle said that technique can go beyond nature and implement "what nature can't bring to a finish." To overcome the fundamental shortcomings of industrial technologies and incorrect land use, we propose the Biogeosystem Technique (BGT*) for biosphere sustainability. The BGT* key point is a transcendent approach (not imitating natural processes): new technical solutions for the biosphere, including soil construction, control of the fluxes of energy, matter, and water, and the biological productivity of terrestrial systems. Intra-soil milling which provides the
Papadiochou, Sofia; Pissiotis, Argirios L
The comparative assessment of computer-aided design and computer-aided manufacturing (CAD-CAM) technology and other fabrication techniques pertaining to marginal adaptation should be documented. Limited evidence exists on the effect of restorative material on the performance of a CAD-CAM system relative to marginal adaptation. The purpose of this systematic review was to investigate whether the marginal adaptation of CAD-CAM single crowns, fixed dental prostheses, and implant-retained fixed dental prostheses or their infrastructures differs from that obtained by other fabrication techniques using a similar restorative material and whether it depends on the type of restorative material. An electronic search of English-language literature published between January 1, 2000, and June 30, 2016, was conducted of the Medline/PubMed database. Of the 55 included comparative studies, 28 compared CAD-CAM technology with conventional fabrication techniques, 12 contrasted CAD-CAM technology and copy milling, 4 compared CAD-CAM milling with direct metal laser sintering (DMLS), and 22 investigated the performance of a CAD-CAM system regarding marginal adaptation in restorations/infrastructures produced with different restorative materials. Most of the CAD-CAM restorations/infrastructures were within the clinically acceptable marginal discrepancy (MD) range. The performance of a CAD-CAM system relative to marginal adaptation is influenced by the restorative material. Compared with CAD-CAM, most of the heat-pressed lithium disilicate crowns displayed equal or smaller MD values. Slip-casting crowns exhibited similar or better marginal accuracy than those fabricated with CAD-CAM. Cobalt-chromium and titanium implant infrastructures produced using a CAD-CAM system elicited smaller MD values than zirconia. The majority of cobalt-chromium restorations/infrastructures produced by DMLS displayed better marginal accuracy than those fabricated with the casting technique. Compared with copy
Ullah, Nazrin; Choudhury, P.
The aim of the present study is to investigate the applicability of artificial intelligence techniques such as ANFIS (Adaptive Neuro-Fuzzy Inference System) in forecasting flood flow in a river system. The proposed technique combines the learning ability of a neural network with the transparent linguistic representation of a fuzzy system. The technique is applied to forecast discharge at a downstream station using flow information at various upstream stations. A total of three years of data has been selected for the implementation of this model. ANFIS models with various input structures and membership functions are constructed, trained, and tested to evaluate the efficiency of the models. Statistical indices such as Root Mean Square Error (RMSE), Correlation Coefficient (CORR) and Coefficient of Efficiency (CE) are used to evaluate the performance of the ANFIS models in forecasting river floods. The values of the indices show that the ANFIS model can accurately and reliably be used to forecast floods in a river system.
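The three evaluation indices named above can be sketched directly; CE here is taken to be the Nash-Sutcliffe coefficient of efficiency, its usual meaning in hydrology. The observed/forecast discharge values are invented examples.

```python
import math

# RMSE, correlation coefficient, and Nash-Sutcliffe coefficient of efficiency
# for comparing observed vs forecast discharge (invented example values).

def rmse(obs, sim):
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs))

def corr(obs, sim):
    n = len(obs)
    mo, ms = sum(obs) / n, sum(sim) / n
    cov = sum((o - mo) * (s - ms) for o, s in zip(obs, sim))
    so = math.sqrt(sum((o - mo) ** 2 for o in obs))
    ss = math.sqrt(sum((s - ms) ** 2 for s in sim))
    return cov / (so * ss)

def nash_sutcliffe(obs, sim):
    """CE = 1 - SSE / variance of observations; 1.0 is a perfect forecast."""
    mo = sum(obs) / len(obs)
    return 1 - (sum((o - s) ** 2 for o, s in zip(obs, sim))
                / sum((o - mo) ** 2 for o in obs))

observed = [120.0, 340.0, 510.0, 430.0, 250.0]
forecast = [130.0, 320.0, 490.0, 450.0, 240.0]
```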
Cai, Yuancui; Chen, Lichao
As the customer-centred management philosophy becomes more deeply internalized, forecasting customer demand becomes more and more important. In demand forecasting for customer relationship management, traditional forecasting methods have serious limitations because of the large uncertainty in demand, and this calls for new models to meet the demands of development. In this paper, the idea is to forecast demand according to the characteristics of potential customers and to build the model accordingly. The model first describes customers using uniform multiple indexes. Second, it acquires characteristic customers on the basis of a data warehouse and data mining technology. Finally, it finds the most similar characteristic customer by comparison and forecasts the demand of a new customer from that most similar characteristic customer.
ENGLISH ABSTRACT: This paper tests the accuracy of using linear regression, logistic regression, and Bass curves in selected new product rollouts, based on sales data. The selected new products come from the electronics and electrical engineering and information and communications technology industries. The eight selected products are: electronic switchgear, electric motors, supervisory control and data acquisition systems, programmable logic controllers, cell phones, wireless modules, routers, and antennas. We compare the linear regression, logistic regression, and Bass curves with respect to forecasting using analysis of variance. The accuracy of these three curves is studied and conclusions are drawn. We use an expert panel to compare the different curves and provide lessons for managers to improve forecasting of new product sales. In addition, a comparison between the two industries is drawn, and areas for further research are indicated.
AFRIKAANSE OPSOMMING (translated): This article tests the accuracy of using linear regression, logistic regression, and Bass curves in the launch of new products, based on sales data. The selected new products are from the electrical and electronics as well as the information technology and communications industries. Linear regression, logistic regression, and Bass curves are compared with respect to forecasting by analysing variance. The accuracy is analysed and conclusions are drawn. The aim is to improve the forecasting of new product sales.
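The Bass curve, one of the three models compared above, can be sketched as follows. The innovation coefficient p, imitation coefficient q, and market size M below are invented; in practice they would be fitted to the product's sales data.

```python
import math

# Bass diffusion model sketch (invented p, q, and market size).

def bass_cumulative(t, p, q, m):
    """Cumulative adopters at time t under the Bass model."""
    e = math.exp(-(p + q) * t)
    return m * (1 - e) / (1 + (q / p) * e)

def bass_sales(t, p, q, m, dt=1.0):
    """Per-period sales: increase in cumulative adoption over one period."""
    return bass_cumulative(t + dt, p, q, m) - bass_cumulative(t, p, q, m)

# Hypothetical rollout in a 100,000-unit market.
sales = [bass_sales(t, p=0.03, q=0.4, m=100_000) for t in range(10)]
peak_period = max(range(10), key=lambda t: sales[t])
```

The characteristic rise to a sales peak and subsequent decline is what distinguishes the Bass curve from the linear and logistic fits in such comparisons.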
Pizzo, V. J.
I will present my view of the current status of space weather forecasting abilities related to CMEs. This talk will address the large-scale aspects, but specifically not energetic particle phenomena. A key point is that all models, whether sophisticated numerical contraptions or quasi-empirical ones, are only as good as the data you feed them. Hence the emphasis will be on observations and analysis methods. First I will review where we stand with regard to the near-Sun quantitative data needed to drive any model, no matter how complex or simple-minded, and I will discuss technological roadblocks that suggest it may be some time before we see any meaningful improvements beyond what we have today. Then I cover issues related to characterizing CME propagation out through the corona and into interplanetary space, as well as to observational limitations in the vicinity of 1 AU. Since none of these observational constraints are likely to be resolved anytime soon, the real challenge is to make more informed use of what is available. Thus, this talk will focus on how we may identify and pursue the most profitable approaches, for both forecast and research applications. The discussion will highlight a number of promising leads, including those related to inclusion of solar backside information, joint magnetograph observations from L5 and Earth, how to use (not just run) ensembles, more rational use of HI observations, and suggestions for using cube-sats for deep space observations of CMEs and MCs.
Puechl, K H
The FORECAST-NUCLEAR computer program described recognizes that forecasts are made to answer a variety of questions and, therefore, that no single forecast is universally appropriate. Also, it recognizes that no two individuals will completely agree as to the input data that are appropriate for obtaining an answer to even a single simple question. Accordingly, the program was written from a utilitarian standpoint: it allows working with multiple projections; data inputting is simple to allow game-playing; computation time is short to minimize the cost of 'what if' assessments; and detail is internally carried to allow meaningful analysis.
Jennifer Castle; David Hendry; Michael P. Clements
We investigate alternative robust approaches to forecasting, using a new class of robust devices, contrasted with equilibrium correction models. Their forecasting properties are derived in the face of a range of likely empirical problems at the forecast origin, including measurement errors, impulses, omitted variables, unanticipated location shifts and incorrectly included variables that experience a shift. We derive the resulting forecast biases and error variances, and indicate when the methods ar...
Gersbach, Hans; Hahn, Volker
We introduce a new type of incentive contract for central bankers: inflation forecast contracts, which make central bankers’ remunerations contingent on the precision of their inflation forecasts. We show that such contracts enable central bankers to influence inflation expectations more effectively, thus facilitating more successful stabilization of current inflation. Inflation forecast contracts improve the accuracy of inflation forecasts, but have adverse consequences for output. On balanc...
Many historians today prefer to speak of knowledge and practice rather than science and technology. Here I argue for the value of reinstating the terms science, techniques and technology as tools for a more precise analysis of governmentality and the workings of power. My tactic is to use these three categories and their articulations to highlight flows between matter and ideas in the production and reproduction of knowledge. In any society, agriculture offers a wonderfully rich case of how ideas, material goods and social relations interweave. In China agronomy was a science of state, the basis of legitimate rule. I compare different genres of agronomic treatise to highlight what officials, landowners and peasants respectively contributed to, and expected from, this charged natural knowledge. I ask how new forms of textual and graphic inscription for encoding agronomic knowledge facilitated its dissemination and ask how successful this knowledge proved when rematerialized and tested as concrete artefacts or techniques. I highlight forms of innovation in response to crisis, and outline the overlapping interpretative frameworks within which the material applications of Chinese agricultural science confirmed and extended its truth across space and time.
Ayhan, Teoman; Al Madani, Hussain [Mechanical Engineering Department, College of Engineering, University of Bahrain, P.O. box 32038, Isatown 32036 (Bahrain)
With an ever-increasing population and rapid growth of industrialization, there is great demand for fresh water. Desalination has been a key proponent to meet future challenges due to the decreasing availability of fresh water. However, desalination uses a significant amount of energy, today mostly from fossil fuels. It is, therefore, reasonable to rely on renewable energy sources such as solar energy, wind energy, ocean thermal energy, waste heat from industry and other renewable sources. The present study deals with an energy-efficient seawater desalination system utilizing renewable energy sources and a natural vacuum technique. A new desalination technology named Natural Vacuum Desalination is proposed. The novel desalination technique achieves remarkable energy efficiency through the evaporation of seawater under vacuum and is described in sufficient detail to demonstrate that it requires much less electric energy than any conventional desalination plant of similar fresh water production capacity. The discussion highlights the main operative and maintenance features of the proposed natural vacuum seawater desalination technology, which appears to have promising techno-economic potential, providing also advantageous coupling with renewable energy sources. (author)
Illig, Kurt R
Undergraduate neuroscience courses typically involve highly interdisciplinary material, and it is often necessary to use class time to review how principles of chemistry, math and biology apply to neuroscience. Lecturing and Socratic discussion can work well to deliver information to students, but these techniques can lead students to feel more like spectators than participants in a class, and do not actively engage students in the critical analysis and application of experimental evidence. If one goal of undergraduate neuroscience education is to foster critical thinking skills, then the classroom should be a place where students and instructors can work together to develop them. Students learn how to think critically by directly engaging with course material, and by discussing evidence with their peers, but taking classroom time for these activities requires that an instructor find a way to provide course materials outside of class. Using technology as an on-demand provider of course materials can give instructors the freedom to restructure classroom time, allowing students to work together in small groups and to have discussions that foster critical thinking, and allowing the instructor to model these skills. In this paper, I provide a rationale for reducing the use of traditional lectures in favor of more student-centered activities, I present several methods that can be used to deliver course materials outside of class and discuss their use, and I provide a few examples of how these techniques and technologies can help improve learning outcomes.
Jun, Jong Sun; Lee, Byung Sun; Han, Sang Joon; Shin, Yong Chul; Kim, Yung Baek; Kim, Dong Hoon; Oh, Yang Kyoon; Suh, Yung; Choi, Chan Duk; Kang, Byung Hun; Hong, Hyung Pyo; Shin, Jee Tae; Moon, Kwon Kee; Lee, Soon Sung; Kim, Sung Hoh; Koo, In Soo; Kim, Dong Wan; Huh, Sub [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)
A study has been performed on advanced DSP technology for digital nuclear I and C systems and its prototype, and on monitoring and diagnosing techniques for the highly pressurized components in the NSSS. In the DSP part, the DSP requirements for NPPs have been derived for the performance of the DSP systems, and the functional analysis for the Reactor Coolant System (RCS) has been performed as the embodied target system. Total quantities of the I and C signals, signal types, and signal functions were also investigated in Ulchin NPP units 3 and 4. On this basis, the prototype facility was configured for performance validation and algorithm implementation. In order to develop DSP techniques and algorithms, the current signal validation methods have been studied and analyzed. In the analysis of the communication networks in NPPs, the basic techniques for the configuration of communication networks and the important considerations for applying them to NPPs have been reviewed. Test and experimental facilities have been set up in order to carry out the required tests during research activities on monitoring techniques for abnormal conditions. Studies concentrated on methods of acquiring vibration signals from mechanical structures and equipment, including rotating machinery and the reactor, and on analyses of the characteristics of these signals. Fuzzy logic was evaluated as a good technique to improve the reliability of the monitoring and diagnosing algorithm through the application of theory such as the automatic pattern recognition algorithm for the vibration spectrum, and alarm detection and diagnosis for collisions of loose parts. 71 figs, 32 tabs, 64 refs. (Author).
Eppich, R.; Almagro Vidal, A.
, Jordan, Argentina, United Arab Emirates, United States of America and around 20 other countries. These strategies deal with establishing methodologies and guiding principles for the selection of technologies, highlighting successful illustrated examples, levelling uneven educational bases and gaining access to expertise. The authors have developed these strategies and techniques to appeal to, engage, and succeed with such diverse groups: to encourage participants to cooperate on a common goal and overcome specific challenges while embracing the technology and thinking critically about its appropriate application for the conservation of cultural heritage in their home countries. Other strategies include setting norms that respect the various cultures and differing levels of technology education, offering voluntary sessions for more advanced and ambitious participants, finding and then adopting natural leaders as co-instructors, and offering a mix of sessions that combine standard lectures with field and laboratory exercises and distance learning. These methods and strategies have proven successful: participants have provided positive evaluations months or years after the courses, implemented their own courses using the materials and methods, and established a strong, sustainable network on this topic.
Fan, Shuangrui; Ji, Tingyun; Bergqvist, Rickard
modeling techniques used in freight rate forecasting. At the same time, research on shipping index forecasting (e.g. the BDTI) applying artificial intelligence techniques is scarce. This paper analyses the possibility of forecasting the BDTI by applying Wavelet Neural Networks (WNN). Firstly, the characteristics ... of traditional and artificial intelligence forecasting techniques are discussed and the rationale for choosing WNN is explained. Secondly, the components and features of the BDTI are explicated. After that, the authors delve into the determinants and influencing factors behind fluctuations of the BDTI in order to set ...
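The abstract names Wavelet Neural Networks but does not show their structure. The sketch below illustrates the forward pass of a simple WNN, in which each hidden unit applies a dilated and translated Morlet mother wavelet to a weighted sum of inputs. The architecture, weights, and lagged-input interpretation are hypothetical illustrations, not taken from the paper, and no training procedure is shown.

```python
# Hedged sketch of a wavelet neural network forward pass.
# All weights and hyperparameters below are illustrative placeholders.
import math

def morlet(t):
    """Morlet mother wavelet: a cosine modulated by a Gaussian envelope."""
    return math.cos(1.75 * t) * math.exp(-t * t / 2)

def wnn_forward(x, hidden, out_weights, bias):
    """Single-output WNN: hidden unit j applies a dilated/translated wavelet
    to its weighted input sum, and the output is a linear combination."""
    activations = []
    for weights, dilation, translation in hidden:
        s = sum(w * xi for w, xi in zip(weights, x))
        activations.append(morlet((s - translation) / dilation))
    return bias + sum(w * a for w, a in zip(out_weights, activations))

# Toy network: 3 inputs (e.g. lagged index values), 2 wavelet hidden units.
hidden = [([0.4, 0.3, 0.2], 1.5, 0.1), ([0.1, -0.2, 0.5], 2.0, -0.3)]
y = wnn_forward([1.0, 0.9, 1.1], hidden, [0.8, -0.4], bias=0.05)
```

In practice the dilations, translations, and weights would be fitted to historical index data; this fragment only shows why the wavelet activation gives the network its localised, multi-scale response.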
An increased number of intermittent renewables poses a threat to the system balance. As a result, new tools and concepts, such as advanced demand-side management and smart grid technologies, are required for demand to meet supply. There is a need for higher consumer awareness and automatic response to a shortage or surplus of electricity. The distributed water heater can be considered one of the most energy-intensive devices whose energy demand is shiftable in time without affecting the comfort level. Tailored hot water usage predictions and advanced control techniques could enable these devices to supply ancillary energy balancing services. The paper analyses a set of hot water consumption data from residential dwellings. This work is an important foundation for the development of a demand-side management strategy based on hot water consumption forecasting at the level of individual residential houses. Various forecasting models, such as exponential smoothing, seasonal autoregressive integrated moving average, seasonal decomposition, and combinations of them, are fitted to test different prediction techniques. These models outperform the chosen benchmark models (mean, naive, and seasonal naive) and show better performance measure values. The results suggest that seasonal decomposition of the time series plays the most significant part in the accuracy of forecasting.
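The evaluation pattern described above, fitting candidate models and comparing them against naive benchmarks on an error measure, can be sketched in a few lines. The example below implements a seasonal-naive benchmark and simple exponential smoothing on a synthetic hourly hot-water profile; the data and smoothing parameter are illustrative assumptions, not the paper's dataset or fitted models.

```python
# Hedged sketch: comparing a seasonal-naive benchmark with simple
# exponential smoothing by mean absolute error (MAE) on synthetic data.
import math

def seasonal_naive(series, season=24):
    """Forecast each point as the value one season (24 h) earlier."""
    return [series[t - season] for t in range(season, len(series))]

def exp_smoothing(series, alpha=0.3):
    """One-step-ahead simple exponential smoothing forecasts."""
    level = series[0]
    forecasts = []
    for y in series[1:]:
        forecasts.append(level)              # forecast issued before seeing y
        level = alpha * y + (1 - alpha) * level
    return forecasts

def mae(actual, forecast):
    return sum(abs(a - f) for a, f in zip(actual, forecast)) / len(forecast)

# Synthetic hourly usage: daily cycle plus a slow upward trend over 14 days.
series = [10 + 0.05 * h + 8 * math.sin(2 * math.pi * h / 24) ** 2
          for h in range(24 * 14)]

sn_err = mae(series[24:], seasonal_naive(series, season=24))
es_err = mae(series[1:], exp_smoothing(series, alpha=0.3))
```

A real study would fit SARIMA and seasonal-decomposition models the same way and report whichever beats the benchmarks; here only the comparison mechanics are shown.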
Bamford, Samuel; Kregsamer, Peter; Fazinic, Stjepko; Jaksic, Milko; Wegrzynek, Dariusz; Chinea-Cano, Ernesto; Markowicz, Andrzej
Two thin film technological materials (A/B) from the aerospace industry have been characterized for their elemental composition, for the purpose of determining their purity and trace element distribution. The results contribute to the assessment of the materials' suitability as part of a spacecraft's thermal hardware. Analysis was done using a combination of PIXE/RBS and energy dispersive X-ray fluorescence (EDXRF) analytical techniques. Samples of the materials were analyzed with the PIXE/RBS system using a 2 MeV proton beam from a 1 MV Tandetron accelerator, and also with separate EDXRF systems employing Am-241 and Mo-secondary target excitation sources. PIXE/RBS measurements enabled identification of the elemental composition and elucidation of the layer structure of the materials. From the PIXE/RBS results, the Am-241-excited EDXRF technique was selected for quantitative determination of indium (In) and tin (Sn) by their K X-rays, after reasonable absorption corrections. A comparison has been made of the results obtained from EDXRF and PIXE/RBS. Material A has been found to be a thin film with three layers, while material B is a thin film comprising four layers. Thicknesses and compositions (including trace elements) of all layers have been determined. The limitation of EDXRF in the analysis of inhomogeneously distributed elements was overcome by using PIXE/RBS as an appropriate complementary technique.
King, Rachel C; Villeneuve, Emma; White, Ruth J; Sherratt, R Simon; Holderbaum, William; Harwin, William S
Technological advances in sensors and communications have enabled discrete integration into everyday objects, both in the home and about the person. Information gathered by monitoring physiological, behavioural, and social aspects of our lives can be used to achieve a positive impact on quality of life, health, and well-being. Wearable sensors are at the cusp of becoming truly pervasive, and could be woven into the clothes and accessories that we wear such that they become ubiquitous and transparent. To interpret the complex multidimensional information provided by these sensors, data fusion techniques are employed to provide a meaningful representation of the sensor outputs. This paper is intended to provide a short overview of data fusion techniques and algorithms that can be used to interpret wearable sensor data in the context of health monitoring applications. The application of these techniques is then described in the context of healthcare, including activity and ambulatory monitoring, gait analysis, fall detection, and biometric monitoring. A snapshot of currently commercially available sensors is also provided, focusing on their sensing capability, together with a commentary on the gaps that need to be bridged to bring research to market. Copyright © 2017. Published by Elsevier Ltd.
Gallorini, M.; Pietra, R.; Sabbioni, E.
Neutron activation analysis and radioanalytical techniques have been employed to investigate problems related to trace elements and high purity technology materials. Applications of these techniques are overviewed: semiconductor technology, as in the case of As and In ion implantation in high purity silicon; problems related to trace element impurities in thermometric measurements; coating materials to prevent trace element contamination in biological sampling; and metal release from human prostheses. (author) 8 refs.; 2 figs.; 8 tabs
Willis, H Lee
Containing 12 new chapters, this second edition offers increased coverage of weather correction and normalization of forecasts, anticipation of redevelopment, determination of the validity of announced developments, and minimization of risk from over- or under-planning. It provides specific examples and detailed explanations of key points to consider for both standard and unusual utility forecasting situations, information on new algorithms and concepts in forecasting, a review of forecasting pitfalls and mistakes, case studies depicting challenging forecast environments, and load models illustrating various types of demand.
Hopson, T. M.
One potential benefit of an ensemble prediction system (EPS) is its capacity to forecast its own forecast error through the ensemble spread-error relationship. In practice, an EPS is often quite limited in its ability to represent the variable expectation of forecast error through the variable dispersion of the ensemble, and perhaps more fundamentally, in its ability to provide enough variability in the ensemble's dispersion to make the skill-spread relationship even potentially useful (irrespective of whether the EPS is well calibrated or not). In this paper we examine the ensemble skill-spread relationship of an ensemble constructed from the TIGGE (THORPEX Interactive Grand Global Ensemble) dataset of global forecasts and a combination of multi-model and post-processing approaches. Both the multi-model and post-processing techniques are based on quantile regression (QR) under a step-wise forward selection framework, leading to ensemble forecasts with both good reliability and sharpness. The methodology utilizes the ensemble's ability to self-diagnose forecast instability to produce calibrated forecasts with informative skill-spread relationships. A context for these concepts is provided by assessing the constructed ensemble in forecasting district-level humidity impacting the incidence of meningitis in the meningitis belt of Africa, and in forecasting flooding events in the Brahmaputra and Ganges basins of South Asia.
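The quantile regression at the core of the post-processing above minimises the pinball (quantile) loss. As a minimal illustration of that loss, fitting even a constant forecast by grid search recovers the empirical quantile of the data; the step-wise forward selection framework and the TIGGE forecasts themselves are not reproduced here, and all values below are illustrative.

```python
# Hedged sketch of the pinball (quantile) loss used in quantile regression.
# Fitting a constant "model" by grid search recovers an empirical quantile.

def pinball_loss(tau, y, q):
    """Mean quantile loss of constant forecast q at level tau."""
    return sum(tau * (v - q) if v >= q else (tau - 1) * (v - q)
               for v in y) / len(y)

def fit_constant_quantile(y, tau, grid):
    """Pick the grid value minimising the pinball loss (a crude QR fit)."""
    return min(grid, key=lambda q: pinball_loss(tau, y, q))

y = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0]   # toy observations
grid = [v / 10 for v in range(0, 110)]                     # candidate forecasts
q90 = fit_constant_quantile(y, 0.9, grid)                  # ~90th percentile
```

In the paper's setting the constant is replaced by a regression on ensemble statistics, and predictors are added step-wise; the loss being minimised is the same.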
Lehner, F.; Wood, A.; Llewellyn, D.; Blatchford, D. B.; Goodbody, A. G.; Pappenberger, F.
Recent studies have documented the influence of increasing temperature on streamflow across the American West, including snow-melt driven rivers such as the Colorado or Rio Grande. At the same time, some basins are reporting decreasing skill in seasonal streamflow forecasts, termed water supply forecasts (WSFs), over the recent decade. While the skill of seasonal precipitation forecasts from dynamical models remains low, their skill in predicting seasonal temperature variations could potentially be harvested for WSFs to account for non-stationarity in regional temperatures. Here, we investigate whether WSF skill can be improved by incorporating seasonal temperature forecasts from dynamical forecasting models (from the North American Multi-Model Ensemble and the European Centre for Medium-Range Weather Forecasts System 4) into traditional statistical forecast models. We find improved streamflow forecast skill relative to traditional WSF approaches in a majority of headwater locations in the Colorado and Rio Grande basins. Incorporation of temperature into WSFs thus provides a promising avenue to increase the robustness of current forecasting techniques in the face of continued regional warming.
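The idea of augmenting a traditional statistical WSF with a temperature predictor can be sketched as a simple stagewise regression: fit the snow-only model, then regress its residuals on the temperature forecast. The data, variable names, and two-stage formulation below are synthetic illustrations under stated assumptions, not the paper's actual models or basins.

```python
# Hedged sketch: adding a seasonal temperature predictor to a snow-only
# statistical streamflow forecast via stagewise ordinary least squares.

def ols(x, y):
    """Ordinary least squares fit y ~ a + b*x, returning (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

def mse(y, yhat):
    return sum((a - b) ** 2 for a, b in zip(y, yhat)) / len(y)

snow = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]    # snowpack index by year
temp = [0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0]    # temperature anomaly
flow = [2 * s - 3 * t for s, t in zip(snow, temp)]  # warmer -> less flow

# Stage 1: the traditional snow-only water supply forecast.
a1, b1 = ols(snow, flow)
pred1 = [a1 + b1 * s for s in snow]

# Stage 2: regress the residuals on temperature and add the correction.
resid = [f - p for f, p in zip(flow, pred1)]
a2, b2 = ols(temp, resid)
pred2 = [p + a2 + b2 * t for p, t in zip(pred1, temp)]

improved = mse(flow, pred2) < mse(flow, pred1)
```

Operational WSFs would fit both predictors jointly and cross-validate the skill gain; the fragment only shows why a skilful temperature forecast can absorb variance the snow predictor misses.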
Malila, W. A.; Crane, R. B.; Richardson, W.
Recent improvements in remote sensor technology carry implications for data processing. Multispectral line scanners now exist that can collect data simultaneously and in registration in multiple channels at both reflective and thermal (emissive) wavelengths. Progress in dealing with two resultant recognition processing problems is discussed: (1) more channels mean higher processing costs; to combat these costs, a new and faster procedure for selecting subsets of channels has been developed; (2) differences between thermal and reflective characteristics influence recognition processing; to illustrate the magnitude of these differences, some explanatory calculations are presented. Also introduced is a different way to process multispectral scanner data, namely radiation balance mapping and related procedures. Techniques and potentials are discussed and examples are presented.
Vacik, J.; Hnatowicz, V.; Cervena, J.; Perina, V.; Mach, R.
Accelerator driven transmutation technology (ADTT) is a promising way toward the liquidation of spent nuclear fuel, nuclear wastes, and weapons-grade Pu. The ADTT facility comprises a high-current (proton) accelerator supplying a sub-critical reactor assembly with spallation neutrons. The reactor part is supposed to be cooled by molten fluorides or metals which serve, at the same time, as a carrier of nuclear fuel. The assumed high working temperature (400-600 C) and high radiation load in the subcritical reactor and spallation neutron source raise the problem of the optimal choice of ADTT construction materials, especially with regard to their radiation and corrosion resistance when in contact with the liquid working media. The use of prompt nuclear analytical techniques in ADTT-related material research is considered, and examples of preliminary analytical results obtained using the neutron depth profiling method are shown for illustration. (orig.)
In this review of the accuracy required and achievable in radiotherapy dosimetry, older approaches and evidence-based estimates for 3DCRT have been reprised, summarising and drawing together the author's earlier evaluations where still relevant. Available evidence for IMRT uncertainties has been reviewed, selecting information from tolerances, QA, verification measurements, in vivo dosimetry and dose delivery audits, to consider whether achievable uncertainties increase or decrease for current advanced treatments and practice. Overall there is some evidence that they tend to increase, but that similar levels should be achievable. Thus it is concluded that those earlier estimates of achievable dosimetric accuracy are still applicable, despite the changes and advances in technology and techniques. The one exception is where there is significant lung involvement, where it is likely that uncertainties have now improved due to widespread use of more accurate heterogeneity models. Geometric uncertainties have improved with the wide availability of IGRT.
A CFD and RTD Education Package was developed, which includes lecture notes, tutorials, and computer software for both CFD and RTD. A user-friendly web-based interface has been prepared to allow lecturers to conduct their training courses or workshops more effectively, and to help students and users learn CFD and RTD concepts and practise with the software. This report gives an overview of advances in the development and use of CFD models and codes for industrial, particularly multiphase, processing applications. Experimental needs for the validation and improvement of CFD models and software are highlighted. The integration of advanced CFD modelling with radiotracer techniques as a complementary technology for future research and industrial applications is discussed. The features of the developed CFD and RTD Education Package are presented with examples. (author)
Bridge, Pete; Carmichael, Mary-Ann; Brady, Carole; Dry, Allison
Undergraduate students studying the Bachelor of Radiation Therapy at Queensland University of Technology (QUT) attend clinical placements at a number of department sites across Queensland. To ensure that the curriculum prepares students for the most common treatments and current techniques in use in these departments, a curriculum matching exercise was performed. A cross-sectional census was performed on a pre-determined "Snapshot" date in 2012. This was undertaken by the clinical education staff in each department, who used a standardized proforma to count the number of patients as well as prescription, equipment, and technique data for a list of tumour site categories. This information was combined into aggregate anonymized data. All 12 Queensland radiation therapy clinical sites participated in the Snapshot data collection exercise to produce a comprehensive overview of clinical practice on the chosen day. A total of 59 different tumour sites were treated on the chosen day and, as expected, the most common treatment sites were prostate and breast, comprising 46% of patients treated. Data analysis also indicated that intensity-modulated radiotherapy (IMRT) use is relatively high, with 19.6% of patients receiving IMRT treatment on the chosen day. Both IMRT and image-guided radiotherapy (IGRT) indications matched recommendations from the evidence. The Snapshot method proved to be a feasible and efficient method of gathering useful data to inform curriculum matching. Frequency of IMRT use in Queensland matches or possibly exceeds that indicated in the literature. It is recommended that the study be repeated in future in order to monitor trends in referral patterns and new technology implementation.
Bridge, Pete; Carmichael, Mary-Ann [School of Clinical Sciences, Queensland University of Technology, Brisbane, Queensland, 4001 (Australia)]; Brady, Carole [Radiation Oncology Mater Centre, Raymond Terrace, South Brisbane, Queensland, 4101 (Australia)]; Dry, Allison [Cancer Care Services, Royal Brisbane Women's Hospital, Herston, Brisbane, Queensland, 4029 (Australia); School of Clinical Sciences, Queensland University of Technology, Brisbane, Queensland, 4001 (Australia)]
Grimit, E. [3TIER Environmental Forecast Group, Seattle, WA (United States)
Recent advances in the combination of weather forecast ensembles with Bayesian statistical techniques have helped to address uncertainties in wind forecasting. Weather forecast ensembles are a collection of numerical weather predictions. The combination of several equally-skilled forecasts typically results in a consensus forecast with greater accuracy. The distribution of forecasts also provides an estimate of forecast inaccuracy. However, weather forecast ensembles tend to be under-dispersive, and not all forecast uncertainties can be taken into account. In order to address these issues, a multi-variate linear regression approach was used to correct the forecast bias for each ensemble member separately. Bayesian model averaging was used to provide a predictive probability density function to allow for multi-modal probability distributions. A test location in eastern Canada was used to demonstrate the approach. Results of the test showed that the method improved wind forecasts and generated reliable prediction intervals. Prediction intervals were much shorter than comparable intervals based on a single forecast or on historical observations alone. It was concluded that the approach will provide economic benefits to both wind energy developers and investors. refs., tabs., figs.
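The post-processing chain described above, per-member bias correction followed by a Bayesian model averaging (BMA) mixture density, can be sketched compactly. In the sketch below the regression coefficients, BMA weights, and spread are fixed illustrative placeholders rather than values fitted by maximum likelihood or EM as in an operational system, and Gaussian kernels are assumed for simplicity (wind speed work often uses gamma kernels).

```python
# Hedged sketch: linear bias correction of each ensemble member, then a
# BMA predictive density as a weighted mixture of Gaussians.
import math

def bias_correct(forecast, a, b):
    """Linear bias correction f -> a + b*f, fitted per ensemble member."""
    return a + b * forecast

def bma_pdf(x, members, weights, sigma):
    """Predictive density: weighted Gaussian mixture centred on members."""
    def normal(x, mu, s):
        return math.exp(-0.5 * ((x - mu) / s) ** 2) / (s * math.sqrt(2 * math.pi))
    return sum(w * normal(x, m, sigma) for w, m in zip(weights, members))

raw = [6.1, 7.4, 5.8]                            # raw member wind speeds (m/s)
members = [bias_correct(f, 0.2, 0.95) for f in raw]
weights = [0.5, 0.3, 0.2]                        # placeholder BMA weights
density = bma_pdf(6.0, members, weights, sigma=1.2)
```

Prediction intervals of the kind reported in the paper would then be read off as quantiles of this mixture density.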