295 result(s) for "Computer networks Econometric models."
Statistical and Machine Learning forecasting methods: Concerns and ways forward
Machine Learning (ML) methods have been proposed in the academic literature as alternatives to statistical ones for time series forecasting. Yet, scant evidence is available about their relative performance in terms of accuracy and computational requirements. The purpose of this paper is to evaluate such performance across multiple forecasting horizons using a large subset of 1045 monthly time series used in the M3 Competition. After comparing the post-sample accuracy of popular ML methods with that of eight traditional statistical ones, we found that the former are dominated across both accuracy measures used and for all forecasting horizons examined. Moreover, we observed that their computational requirements are considerably greater than those of statistical methods. The paper discusses the results, explains why the accuracy of ML models is below that of statistical ones and proposes some possible ways forward. The empirical results found in our research stress the need for objective and unbiased ways to test the performance of forecasting methods that can be achieved through sizable and open competitions allowing meaningful comparisons and definite conclusions.
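One of the accuracy measures reported in the M3 Competition literature is the symmetric mean absolute percentage error (sMAPE). As a minimal sketch of the kind of post-sample comparison the paper describes (the series and forecast values below are made up for illustration, not taken from the M3 data):

```python
def smape(actual, forecast):
    """Symmetric mean absolute percentage error, in percent."""
    return 100.0 * sum(
        2.0 * abs(f - a) / (abs(a) + abs(f))
        for a, f in zip(actual, forecast)
    ) / len(actual)

# Toy comparison of two forecasting methods over a 3-step horizon
actual = [100.0, 110.0, 120.0]
stat_forecast = [102.0, 111.0, 118.0]   # e.g., an exponential smoothing method
ml_forecast = [95.0, 120.0, 130.0]      # e.g., a neural network

print(smape(actual, stat_forecast))
print(smape(actual, ml_forecast))
```

With these made-up numbers the statistical forecast scores the lower (better) sMAPE, mirroring the paper's headline finding.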
Which model is more efficient in carbon emission prediction research? A comparative study of deep learning models, machine learning models, and econometric models
Accurately predicting future carbon emissions is of great significance for governments seeking to promote carbon emission reduction policies scientifically. Among current technologies for forecasting carbon emissions, the most prominent are econometric models and deep learning, but few works have systematically compared the forecasting performance of these methods. This paper therefore compares a deep learning model, a machine learning model, and an econometric model to establish whether deep learning is an efficient method for carbon emission prediction. Mechanistically, a deep learning neural network is an information processing model built by simulating a biological neural system, and it can be further extended through bionic characteristics. The paper therefore optimizes the model from the perspective of bionics and proposes an innovative deep learning model based on the memory behavior mechanism of group creatures. Comparison results show that the prediction accuracy of the heuristic neural network is higher than that of the econometric model. In-depth analysis shows that the heuristic neural network is better suited to predicting future carbon emissions, while the econometric model is better suited to clarifying the impact of influencing factors on carbon emissions.
Renewable estimation and incremental inference in generalized linear models with streaming data sets
The paper presents an incremental updating algorithm for analysing streaming data sets using generalized linear models. The proposed method is formulated within a new framework of renewable estimation and incremental inference, in which the maximum likelihood estimator is renewed with current data and summary statistics of historical data. Our framework can be implemented within a popular distributed computing environment, Apache Spark, to scale up computation. Consisting of two data-processing layers, the Rho architecture enables us to accommodate inference-related statistics and to facilitate sequential updating of the statistics used in both estimation and inference. We establish estimation consistency and asymptotic normality of the proposed renewable estimator, in which the Wald test is utilized for incremental inference. Our methods are examined and illustrated by various numerical examples from both simulation experiments and a real-world data analysis.
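A toy illustration of the renewable-estimation idea, using ordinary least squares (the Gaussian special case of a GLM) with a single hypothetical predictor: each incoming batch updates only low-dimensional summary statistics of the normal equations, so the raw historical data can be discarded. This is a minimal sketch of the streaming-update pattern, not the paper's algorithm.

```python
class RenewableLS:
    """Toy renewable estimator for simple linear regression.

    Each data batch only updates summary statistics (the entries of
    X'X and X'y); raw historical data need not be retained."""

    def __init__(self):
        self.n = self.sx = self.sxx = self.sy = self.sxy = 0.0

    def update(self, xs, ys):
        # Renew the summary statistics with the current batch.
        for x, y in zip(xs, ys):
            self.n += 1
            self.sx += x
            self.sxx += x * x
            self.sy += y
            self.sxy += x * y

    def estimate(self):
        # Solve the 2x2 normal equations for (intercept, slope).
        det = self.n * self.sxx - self.sx ** 2
        intercept = (self.sxx * self.sy - self.sx * self.sxy) / det
        slope = (self.n * self.sxy - self.sx * self.sy) / det
        return intercept, slope

model = RenewableLS()
model.update([0.0, 1.0], [1.0, 3.0])   # first batch of the stream
model.update([2.0, 3.0], [5.0, 7.0])   # second batch renews the estimate
print(model.estimate())                # data lie exactly on y = 1 + 2x
```

The estimate after both batches is identical to what a full-data fit would give, which is the point of maintaining sufficient summary statistics.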
Study on the effect of digital economy on high-quality economic development in China
At present, the digital economy, which takes information technology and data as its key elements, is booming and has become an important force in promoting economic growth across countries. To explore the current trend of China's digital economy and its impact on high-quality economic development, this paper measures the digital economic development index of 30 cities in China along three dimensions (digital infrastructure, digital industry, and digital integration), constructs an econometric model for empirical analysis using panel data on the 30 cities from 2015 to 2019, and tests the mediating effect of technological progress between the digital economy and high-quality economic development. The results show that (1) the development level of China's digital economy is increasing year by year, with obvious growth in digital infrastructure and relatively slow development of the digital industry; (2) digital infrastructure, digital industry, and digital integration all have significant positive effects on regional total factor productivity, with influence coefficients of 0.2452, 0.0773, and 0.3458, respectively; (3) regarding the transmission mechanism from the digital economy to high-quality economic development, the mediating effect of technological progress is 0.1527, of which the mediating effects in the eastern, northeastern, central, and western regions are 1.70%, 9.25%, 28.89%, and 21.22%, respectively; and (4) in terms of spatial distribution, the development level of the digital economy in the eastern region is much higher than that in other regions, and digital economy development in the eastern region makes a higher marginal contribution to improving total factor productivity. This study can provide a theoretical basis and practical support for the government in formulating policies for the development of the digital economy.
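Mediating effects of the kind tested above are commonly estimated with the product-of-coefficients method: the indirect effect is a*b, where a is the coefficient of the treatment on the mediator and b is the coefficient of the mediator on the outcome, controlling for the treatment. The sketch below uses made-up data, not the paper's panel data, and plain OLS rather than the paper's model.

```python
def center(v):
    m = sum(v) / len(v)
    return [x - m for x in v]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def mediation_effect(x, m, y):
    """Product-of-coefficients indirect effect: a from M ~ X,
    b from Y ~ X + M (two-predictor OLS via the normal equations)."""
    xc, mc, yc = center(x), center(m), center(y)
    sxx, smm, sxm = dot(xc, xc), dot(mc, mc), dot(xc, mc)
    sxy, smy = dot(xc, yc), dot(mc, yc)
    a = sxm / sxx                                          # X -> mediator
    b = (sxx * smy - sxm * sxy) / (sxx * smm - sxm ** 2)   # mediator -> Y given X
    return a * b                                           # indirect effect

# Synthetic pathway: m is roughly 2x, and y = x + 3m exactly,
# so the indirect effect should be close to 2 * 3 = 6.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
m = [2.1, 4.0, 6.2, 7.9, 10.1]
y = [xi + 3.0 * mi for xi, mi in zip(x, m)]
print(mediation_effect(x, m, y))
```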
Wealthy individual investors and stock markets’ tail risk
This paper employs a unique data set to analyze the trading behavior of wealthy individual investors across Mainland China and their impact on Chinese stock markets’ tail risk. Results show that the wealthy individual investors’ trading behavior can explain Chinese stock markets’ tail risk, and the daily investment portfolios based on the network density of wealthy individual investors have significant excess returns. This paper also investigates the determinants of wealthy individual investors’ trading behavior with the social network method and the spatial econometric model, and reveals that wealthy individuals benefit from the spillover effect of their trading behavior through the investor networks. The results of this paper not only reveal micro evidence for the formation mechanism of asset prices, but also provide insight into the behavior of wealthy individual investors.
Forecasting influenza in Hong Kong with Google search queries and statistical model fusion
The objective of this study is to investigate the predictive utility of online social media and web search queries, particularly Google search data, for forecasting new cases of influenza-like illness (ILI) in general outpatient clinics (GOPC) in Hong Kong. To mitigate sensitivity to self-excitement (i.e., fickle media interest) and other artifacts of online social media data, our approach fuses multiple offline and online data sources. Four individual models are employed to forecast ILI-GOPC both one week and two weeks in advance: a generalized linear model (GLM), the least absolute shrinkage and selection operator (LASSO), an autoregressive integrated moving average (ARIMA), and deep learning (DL) with feedforward neural networks (FNN). The covariates include Google search queries, meteorological data, and previously recorded offline ILI. To our knowledge, this is the first study that introduces deep learning methodology into the surveillance of infectious diseases and investigates its predictive utility. Furthermore, to exploit the strengths of each individual forecasting model, we use statistical model fusion via Bayesian model averaging (BMA), which allows a systematic integration of multiple forecast scenarios. For each model, an adaptive approach is used to capture the recent relationship between ILI and the covariates. DL with FNN delivers the most competitive predictive performance among the four individual models. Combining all four models in a comprehensive BMA framework further improves predictive evaluation metrics such as root mean squared error (RMSE) and mean absolute predictive error (MAPE). Nevertheless, DL with FNN remains the preferred method for predicting the locations of influenza peaks. The proposed approach is a feasible alternative for forecasting ILI in Hong Kong or other countries where ILI has no constant seasonal trend and influenza data resources are limited. The proposed methodology is easily tractable and computationally efficient.
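A minimal sketch of BMA-style forecast fusion: point forecasts are combined with weights proportional to approximate posterior model probabilities. Here the weights use the common BIC approximation exp(-BIC/2); the forecast and BIC values are hypothetical, not from the study.

```python
import math

def bma_combine(forecasts, bics):
    """Combine point forecasts with BMA-style weights.

    Weights follow the BIC approximation to posterior model
    probabilities: w_k proportional to exp(-BIC_k / 2)."""
    base = min(bics)                                       # shift for numerical stability
    scores = [math.exp(-0.5 * (b - base)) for b in bics]
    total = sum(scores)
    weights = [s / total for s in scores]
    combined = sum(w * f for w, f in zip(weights, forecasts))
    return combined, weights

# Hypothetical one-week-ahead ILI forecasts from GLM, LASSO, ARIMA, and DL
forecasts = [120.0, 130.0, 110.0, 125.0]
bics = [210.0, 208.0, 215.0, 207.0]   # lower BIC -> larger weight
print(bma_combine(forecasts, bics))
```

The combined forecast is a convex combination, so it always lies between the smallest and largest individual forecasts.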
Electrical circuit model of spatiotemporal trade dynamics: Foundations and derivation of the gravity model
A model of time-dependent trade of goods between spatial locations is formulated via an electric circuit analogy, in which goods are analogous to charge and price to voltage, while producers and consumers are represented by sources and sinks of goods flow, which is represented by current, located at the nodes of a trade network. The core ansatz is that the flow of goods along each network link is driven by the voltage difference across that link, opposed by resistance that represents trade friction. Market prices are then determined indirectly by internal balances of flows, subject to external constraints on supply and demand. The model yields multiple outcomes that support its validity and applicability, including price setting via emergent balance of supply and demand, price fluctuations, traditional and generalized elasticities, network structure-flow relations, competition between producers, and substitution between suppliers, between consumers, and/or between trade links. All these results prove to be consistent with observed features of trade dynamics, thereby supporting the validity of the model. The new model is then used to derive the widely used gravity model of international trade from a mechanistic basis, yielding exponents consistent with published data and leading naturally to core-periphery structure, as observed in real trade networks. The analysis also implies that trade flows self-organize to minimize trade friction in the system as a whole, an emergent global outcome from the purely local dynamics of the populations of producers, consumers, and traders. Possible generalizations and further applications are outlined, including incorporation of asymmetry and capacity limits of trade links, constraints on supply and demand, behavioral responses, and coupling to models of investment strategies.
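The gravity model that the paper derives from its circuit analogy takes the familiar form: trade flow between i and j proportional to the product of the two economies' sizes divided by a power of the distance between them. A minimal sketch with made-up sizes and distances (the exponent and units are illustrative only):

```python
def gravity_flow(g, y_i, y_j, d_ij, beta=1.0):
    """Gravity model of trade: flow = G * Y_i * Y_j / D_ij**beta."""
    return g * y_i * y_j / d_ij ** beta

# Hypothetical economy sizes and distances in arbitrary units
print(gravity_flow(1.0, 10.0, 5.0, 2.0))   # baseline flow
print(gravity_flow(1.0, 10.0, 5.0, 4.0))   # doubling distance halves flow when beta = 1
```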
Evaluating the Performance of Metaheuristic Based Artificial Neural Networks for Cryptocurrency Forecasting
The irregular movement of the cryptocurrency market makes effective price forecasting a challenging task. Price fluctuations in cryptocurrencies often appear arbitrary, which has made them a hot research topic. Though various statistical and econometric forecasting models exist, there is still a lack of advanced artificial intelligence models to explain the behaviour of such price fluctuations. Artificial neural networks (ANNs) are data-driven models and can effectively handle complex nonlinear functions when abundant data are available. However, optimal parameter tuning of such models with conventional back-propagation-based learning entails domain expertise and higher computational cost, and yields inferior accuracy, which makes it difficult to use. In contrast, metaheuristic-based ANN training has been emerging as an efficient learning paradigm. This article constructs optimal ANNs using three efficient metaheuristics with few control parameters, namely the fireworks algorithm (FWA), chemical reaction optimization (CRO), and teaching-learning based optimization (TLBO), each applied separately. The role of a metaheuristic is to find near-optimal weights and thresholds of an ANN with a single hidden layer, thereby ensuring a higher degree of accuracy. The hybrid models are then used to simulate and predict the behaviour of four fast-growing cryptocurrencies: Bitcoin, Litecoin, Ethereum, and Ripple. Experiments are carried out with real-time cryptocurrency data and the hybrid ANNs using four performance measures. We undertake a comparative performance analysis of the forecasting models and Friedman tests to demonstrate superiority and statistical significance. In particular, the ANNs trained with CRO, TLBO, and FWA obtained average ranks of 1, 2, and 2.75, respectively.
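The core idea of metaheuristic ANN training is to search the weight space directly, without gradients. The sketch below uses a simple population-based perturbation search as a stand-in for FWA, CRO, or TLBO (the network size, the toy target function, and all hyperparameters are assumptions for illustration, not the authors' setup).

```python
import math, random

random.seed(0)

def ann(weights, x):
    """One-hidden-layer network with two tanh hidden units."""
    w1, b1, w2, b2, v1, v2, c = weights
    return v1 * math.tanh(w1 * x + b1) + v2 * math.tanh(w2 * x + b2) + c

def mse(weights, data):
    return sum((ann(weights, x) - y) ** 2 for x, y in data) / len(data)

def metaheuristic_train(data, pop_size=30, iters=200):
    """Derivative-free training: keep the best candidate found so far and
    explore Gaussian perturbations of it, a generic stand-in for
    FWA/CRO/TLBO-style population search."""
    best = [random.uniform(-1, 1) for _ in range(7)]
    best_err = mse(best, data)
    for _ in range(iters):
        for _ in range(pop_size):
            cand = [w + random.gauss(0, 0.2) for w in best]
            err = mse(cand, data)
            if err < best_err:
                best, best_err = cand, err
    return best, best_err

# Toy regression target: learn y = x^2 on a few points in [-1, 1]
data = [(x / 4.0, (x / 4.0) ** 2) for x in range(-4, 5)]
weights, err = metaheuristic_train(data)
print(err)   # training error after the search
```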
International carbon financial market prediction using particle swarm optimization and support vector machine
Carbon financial futures have the characteristics of both commodity futures and environmental protection instruments, and their prices are affected by many factors, which makes it hard for traditional analysis methods to produce precise predictions. How to effectively predict the price trend of carbon financial futures has therefore drawn attention from both academia and traders. This study addresses the high prediction error of European allowance (EUA) futures prices by constructing a novel approach that combines the support vector machine (SVM) with the particle swarm optimization (PSO) algorithm. The article introduces a parameter optimization method that provides the best parameters for the SVM, improving the prediction performance on EUA futures prices. Furthermore, the research uses a realistic trading dataset of 30,762 EUA futures closing prices to verify the effectiveness and efficiency of the PSO-SVM prediction model. The empirical results show that the prediction performance of the model, especially with the radial kernel function, is significantly improved. The approach can determine the parameters according to the characteristics of the dataset and supply them for training and prediction automatically. The PSO-SVM algorithm can effectively predict extreme price fluctuations and overcomes the problem of high prediction error caused by parameter constraints.
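The PSO side of such a hybrid can be sketched compactly: particles move through the hyperparameter space (for an SVM, typically C and gamma), attracted toward their own best positions and the swarm's best. In the sketch below, `cv_error` is a hypothetical smooth stand-in for the SVM cross-validation error surface; in real use it would train and validate an SVM at each candidate point.

```python
import random

random.seed(42)

def pso_minimize(objective, bounds, n_particles=20, iters=100,
                 w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimization over a box-bounded space."""
    dim = len(bounds)
    pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Clamp positions to the search bounds.
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Hypothetical stand-in for SVM cross-validation error over (C, gamma):
# a smooth bowl with its minimum at C = 10, gamma = 0.1.
def cv_error(params):
    c, gamma = params
    return (c - 10.0) ** 2 / 100.0 + (gamma - 0.1) ** 2 * 100.0

best, err = pso_minimize(cv_error, bounds=[(0.1, 100.0), (0.001, 1.0)])
print(best, err)
```

The swarm should settle near the bowl's minimum; with a real SVM the same loop would instead return the (C, gamma) pair with the lowest validation error.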