Catalogue Search | MBRL
Search Results
Explore the vast range of titles available.
20,003 result(s) for "Graph Neural Networks"
Concepts and techniques of graph neural network
\"This book will aim to provide stepwise discussion; exhaustive literature review; detailed analysis and discussion; rigorous experimentation results, application-oriented approach that will be demonstrated with respect to applications of Graph Neural Network (GNN). It will be written to develop the understanding of concepts and techniques on GNN and to establish the familiarity of different real applications in various domains for GNN. Moreover, it will also cover the prevailing challenges and opportunities\"-- Provided by publisher.
Spatial-temporal graph neural network for traffic forecasting: An overview and open research issues
by Bui, Khac-Hoai Nam; Yi Hongsuk; Cho Jiho
in Deep learning, Forecasting, Graph neural networks
2022
Traffic forecasting plays an important role in modern Intelligent Transportation Systems (ITS). With the recent rapid advancement of deep learning, graph neural networks (GNNs) have emerged as a promising direction for improving traffic forecasting. Specifically, one of the main types of GNN is the spatial-temporal GNN (ST-GNN), which has been applied to various time-series forecasting applications. This study provides an overview of recent ST-GNN models for traffic forecasting. In particular, we propose a new taxonomy of ST-GNNs that divides existing models into four approaches: graph convolutional recurrent neural networks, fully graph convolutional networks, graph multi-attention networks, and self-learning graph structures. We then present experimental results based on reconstructions of representative models on selected benchmark datasets to evaluate the main contributions of the key components of each type of ST-GNN. Finally, we discuss several open research issues for further investigation.
Journal Article
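The record above groups ST-GNN models into graph-convolutional recurrent, fully graph-convolutional, attention-based, and self-learned-structure families. As a rough illustration of the first family only, the sketch below stacks a simple graph convolution over each time step and a GRU over the resulting sequence; the adjacency normalization, layer sizes, and all names are assumptions, not the surveyed models' implementations.

```python
# Minimal sketch of a graph-convolutional recurrent forecaster (assumed design,
# not the implementation of any model surveyed above).
import torch
import torch.nn as nn

class GraphConv(nn.Module):
    """One-hop graph convolution: H' = relu(A_norm @ H @ W)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x, a_norm):           # x: (nodes, in_dim), a_norm: (nodes, nodes)
        return torch.relu(a_norm @ self.lin(x))

class STGNNForecaster(nn.Module):
    """Spatial aggregation per time step, then a GRU over the time axis."""
    def __init__(self, in_dim, hidden_dim, horizon):
        super().__init__()
        self.gcn = GraphConv(in_dim, hidden_dim)
        self.gru = nn.GRU(hidden_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, horizon)

    def forward(self, x_seq, a_norm):        # x_seq: (time, nodes, in_dim)
        spatial = torch.stack([self.gcn(x, a_norm) for x in x_seq])  # (time, nodes, hidden)
        out, _ = self.gru(spatial.permute(1, 0, 2))                  # (nodes, time, hidden)
        return self.head(out[:, -1])                                 # (nodes, horizon)

# Toy usage: 12 past readings on a 5-node road graph, predict 3 steps ahead.
adj = torch.eye(5) + torch.rand(5, 5).round()
a_norm = adj / adj.sum(dim=1, keepdim=True)   # simple row normalization (assumption)
model = STGNNForecaster(in_dim=1, hidden_dim=16, horizon=3)
pred = model(torch.rand(12, 5, 1), a_norm)    # -> shape (5, 3)
```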
HGNN-BRFE: Heterogeneous Graph Neural Network Model Based on Region Feature Extraction
2024
With their strong capability to accurately model various types of nodes and their interactions, heterogeneous graphs have gradually become a research hotspot, driving rapid development in the field of heterogeneous graph neural networks (HGNNs). However, most existing HGNN models rely on meta-paths for feature extraction, which use only part of the graph's data for training and learning. This not only limits the data generalization ability of deep learning models but also affects the effectiveness of data-driven adaptive techniques. In response to this challenge, this study proposes a new model, the heterogeneous graph neural network based on region feature extraction (HGNN-BRFE). The model improves performance through an "extraction-fusion" strategy in three key respects: first, it efficiently extracts features of same-type neighboring nodes within specific regions; second, it fuses information from different regions and hierarchical neighbors using attention mechanisms; third, it designs a dedicated feature extraction and fusion process for heterogeneous-type nodes, ensuring that the rich semantic and heterogeneity information in the graph is retained while each node's own characteristics are preserved during embedding, preventing the loss of those characteristics and potential over-smoothing. Experimental results show that HGNN-BRFE improves classification performance by 1-3% over existing methods on multiple real-world datasets.
Journal Article
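The record above describes fusing per-region, per-type neighbor features with attention. The following is a generic sketch of attention-weighted fusion over per-type neighbor summaries; the shapes, scoring network, and names are assumptions, and it is not the HGNN-BRFE architecture itself.

```python
# Generic sketch of attention-based fusion of per-type neighbor summaries
# (assumed shapes and names; not the HGNN-BRFE implementation).
import torch
import torch.nn as nn

class TypewiseAttentionFusion(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.score = nn.Sequential(nn.Linear(dim, dim), nn.Tanh(), nn.Linear(dim, 1, bias=False))

    def forward(self, type_summaries):
        # type_summaries: (num_types, num_nodes, dim), one aggregated vector per node per neighbor type
        weights = torch.softmax(self.score(type_summaries), dim=0)  # attention over neighbor types
        return (weights * type_summaries).sum(dim=0)                # fused embedding: (num_nodes, dim)

# Toy usage: 3 neighbor types (e.g. author/paper/venue regions), 10 nodes, 8-dim summaries.
fusion = TypewiseAttentionFusion(dim=8)
fused = fusion(torch.rand(3, 10, 8))   # -> (10, 8)
```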
Physics-informed graph neural network for predicting fluid flow in porous media
2025
With the rapid development of deep learning neural networks, new solutions have emerged for addressing fluid flow problems in porous media. Combining data-driven approaches with physical constraints has become a hot research direction, with physics-informed neural networks (PINNs) being the most popular hybrid model. PINNs have gained widespread attention in subsurface fluid flow simulations due to their low computational resource requirements, fast training speeds, strong generalization capabilities, and broad applicability. Despite success in homogeneous settings, standard PINNs face challenges in accurately calculating flux between irregular Eulerian cells with disparate properties and in capturing global field influences on local cells. This limits their suitability for heterogeneous reservoirs and the irregular Eulerian grids frequently used in reservoir simulation. To address these challenges, this study proposes a physics-informed graph neural network (PIGNN) model. The PIGNN model treats the entire field as a whole, integrating information from neighboring grids and physical laws into the solution for the target grid, thereby improving the accuracy of solving partial differential equations on heterogeneous, irregular Eulerian grids. The optimized model was applied to pressure field prediction in a spatially heterogeneous reservoir, achieving an average L2 error of 6.710 × 10⁻⁴ and an average R² score of 0.998, which confirms the effectiveness of the model. Compared to the conventional PINN model, the average L2 error was reduced by 76.93% and the average R² score increased by 3.56%. Moreover, in robustness evaluations, training the PIGNN model with only 54% and 76% of the original data yielded average relative L2 error reductions of 58.63% and 56.22%, respectively, compared to the PINN model. These results confirm the superior performance of this approach compared to PINN.
Journal Article
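The core idea in the record above is to couple a data-fit term with a physics term evaluated on the graph of grid cells. As a minimal sketch of that kind of loss, the code below combines an observed-pressure mismatch with a discrete steady-state mass-balance residual (net transmissibility-weighted flux into each cell); the transmissibilities, loss weighting, and all names are assumptions for illustration, not the PIGNN formulation from the paper.

```python
# Sketch of a physics-informed loss for graph-based pressure prediction:
# data mismatch plus a discrete mass-balance residual per cell (assumed formulation).
import torch

def physics_residual(pressure, edge_index, trans):
    """Net flux into each cell: sum_j T_ij * (p_j - p_i); ~0 for interior cells at steady state."""
    src, dst = edge_index                              # directed edge list, shape (2, num_edges)
    flux = trans * (pressure[src] - pressure[dst])     # flux along each edge
    net = torch.zeros(pressure.shape[0], dtype=pressure.dtype)
    return net.index_add(0, dst, flux)                 # accumulate incoming flux per cell

def pignn_loss(pressure_pred, pressure_obs, obs_mask, edge_index, trans, lam=1.0):
    data_loss = ((pressure_pred[obs_mask] - pressure_obs[obs_mask]) ** 2).mean()
    phys_loss = physics_residual(pressure_pred, edge_index, trans).pow(2).mean()
    return data_loss + lam * phys_loss

# Toy usage on a 4-cell chain graph with unit transmissibility and two observed cells.
edge_index = torch.tensor([[0, 1, 1, 2, 2, 3], [1, 0, 2, 1, 3, 2]])
trans = torch.ones(edge_index.shape[1])
p_pred = torch.rand(4, requires_grad=True)             # stands in for a GNN's output
obs = torch.tensor([1.0, 0.8, 0.6, 0.4])
mask = torch.tensor([True, False, False, True])
loss = pignn_loss(p_pred, obs, mask, edge_index, trans)
loss.backward()
```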
Multichannel Adaptive Data Mixture Augmentation for Graph Neural Networks
2024
Graph neural networks (GNNs) have demonstrated significant potential in analyzing complex graph-structured data. However, conventional GNNs encounter challenges in effectively incorporating global and local features. This paper therefore introduces a novel GNN approach called multichannel adaptive data mixture augmentation (MAME-GNN). It enhances the GNN by adopting a multi-channel architecture and interactive learning to effectively capture and coordinate the interrelationships between local and global graph structures. Additionally, this paper introduces a polynomial-Gaussian mixture graph interpolation method to address the problem of single and sparse graph data; it generates diverse, nonlinearly transformed samples, improving the model's generalization ability. The proposed MAME-GNN is validated through extensive experiments on publicly available datasets, showcasing its effectiveness. Compared to existing GNN models, MAME-GNN exhibits superior performance, significantly enhancing robustness and generalization ability.
Journal Article
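The record above augments sparse graph data by interpolating samples. The sketch below shows only the generic mixup-style idea of blending node features and labels of two aligned graphs with a randomly drawn coefficient; the paper's polynomial-Gaussian mixture sampling is not reproduced, and the mixing distribution and names are assumptions.

```python
# Generic mixup-style interpolation between two graphs over the same node set
# (illustrative only; not the paper's polynomial-Gaussian mixture method).
import torch
import torch.nn.functional as F

def graph_mixup(x_a, y_a, x_b, y_b, lam=None):
    """Interpolate node features and soft labels of two aligned graphs."""
    if lam is None:
        lam = torch.distributions.Beta(0.4, 0.4).sample().item()  # assumed mixing distribution
    x_mix = lam * x_a + (1.0 - lam) * x_b
    y_mix = lam * y_a + (1.0 - lam) * y_b
    return x_mix, y_mix, lam

# Toy usage: two graphs with 10 nodes, 16-dim features, one-hot labels over 3 classes.
x1, x2 = torch.rand(10, 16), torch.rand(10, 16)
y1 = F.one_hot(torch.randint(0, 3, (10,)), 3).float()
y2 = F.one_hot(torch.randint(0, 3, (10,)), 3).float()
x_mix, y_mix, lam = graph_mixup(x1, y1, x2, y2)
```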
A Data-centric graph neural network for node classification of heterophilic networks
by Gao, Wenlian; Xue, Yanfeng; Jin, Zhen
in Artificial Intelligence, Classification, Complex Systems
2024
In the real world, numerous heterophilic networks effectively model the tendency of similar entities to repel each other and dissimilar entities to be attracted to each other within complex systems. Concerning the node classification problem in heterophilic networks, a plethora of heterophilic Graph Neural Networks (GNNs) have emerged. However, these GNNs demand extensive hyperparameter tuning, activation function selection, parameter initialization, and other configuration settings, particularly when dealing with diverse heterophilic networks and resource constraints. This situation raises a fundamental question: Can a method be designed to directly preprocess heterophilic networks and then leverage the trained models in network representation learning systems? In this paper, we propose a novel approach to transform heterophilic network structures. Specifically, we train an edge classifier and subsequently employ this edge classifier to transform a heterophilic network into its corresponding homophilic counterpart. Finally, we conduct experiments on heterophilic network datasets with variable sizes, demonstrating the effectiveness of our approach. The code and datasets are publicly available at https://github.com/xueyanfeng/D_c_GNNs.
Journal Article
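The record above preprocesses a heterophilic graph with a trained edge classifier before applying a standard GNN. A minimal sketch of that "homophilize-then-train" idea is shown below, with an assumed MLP edge scorer and decision threshold; for the authors' actual method, see the repository linked in the record.

```python
# Sketch of rewriting a heterophilic graph: score each edge with an MLP on the
# endpoint features and keep only edges predicted to join same-class nodes.
# The scorer, threshold, and training details are assumptions.
import torch
import torch.nn as nn

class EdgeClassifier(nn.Module):
    def __init__(self, feat_dim, hidden=32):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(2 * feat_dim, hidden), nn.ReLU(), nn.Linear(hidden, 1))

    def forward(self, x, edge_index):
        src, dst = edge_index
        pair = torch.cat([x[src], x[dst]], dim=-1)        # concatenate endpoint features
        return torch.sigmoid(self.mlp(pair)).squeeze(-1)  # P(edge is homophilic)

def homophilize(x, edge_index, classifier, threshold=0.5):
    """Drop edges the classifier predicts to connect nodes of different classes."""
    with torch.no_grad():
        keep = classifier(x, edge_index) >= threshold
    return edge_index[:, keep]

# Toy usage: 6 nodes with 8-dim features and 5 directed edges.
x = torch.rand(6, 8)
edge_index = torch.tensor([[0, 1, 2, 3, 4], [1, 2, 3, 4, 5]])
clf = EdgeClassifier(feat_dim=8)           # would first be trained on edges with known labels
clean_edges = homophilize(x, edge_index, clf)
```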
Online social network user performance prediction by graph neural networks
by Gafarov, Fail; Ustin, Pavel; Berdnikov, Andrey
in online social network, graph neural networks, professional performance, graph convolutional neural networks, social graph
2022
Online social networks provide rich information that characterizes a user's personality, interests, and hobbies and reflects their current state. Users of social networks publish photos, posts, videos, audio, and more every day, so online social networks (OSNs) open up a wide range of research opportunities for scientists. Much research conducted in recent years using graph neural networks (GNNs) has shown their advantages over conventional deep learning, and GNNs appear particularly well suited to online social network analysis. In this article we studied the use of graph convolutional neural networks with different convolution layers (GCNConv, SAGEConv, GraphConv, GATConv, TransformerConv, GINConv) for predicting a user's professional success in the VKontakte online social network, based on data obtained from user profiles. We used various parameters from users' personal pages on VKontakte (the number of friends, subscribers, interesting pages, etc.) as features for determining professional success, as well as networks (graphs) reflecting connections between users (followers/friends). We performed graph classification using graph convolutional neural networks with different types of convolution layers. The best accuracy (0.88) was achieved using the graph isomorphism network (GIN) layer. The results obtained in this work will support further studies of social success based on metrics of personal profiles of OSN users and social graphs using neural network methods.
Journal Article
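The study above compares several PyTorch Geometric convolution layers for classifying user social graphs, with the GIN layer performing best. As a minimal sketch of that kind of graph-level classifier, the code below builds a two-layer GIN model with mean pooling in PyTorch Geometric; the layer sizes, pooling choice, and feature dimensionality are assumptions, not the study's configuration.

```python
# Minimal GIN-style graph classifier in PyTorch Geometric (assumed sizes and pooling;
# not the configuration used in the study above).
import torch
import torch.nn as nn
from torch_geometric.nn import GINConv, global_mean_pool

class GINGraphClassifier(nn.Module):
    def __init__(self, in_dim, hidden, num_classes):
        super().__init__()
        self.conv1 = GINConv(nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(), nn.Linear(hidden, hidden)))
        self.conv2 = GINConv(nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU(), nn.Linear(hidden, hidden)))
        self.head = nn.Linear(hidden, num_classes)

    def forward(self, x, edge_index, batch):
        x = self.conv1(x, edge_index).relu()
        x = self.conv2(x, edge_index).relu()
        return self.head(global_mean_pool(x, batch))   # one prediction per graph

# Toy usage: one social graph with 4 users and 5 profile features each.
x = torch.rand(4, 5)
edge_index = torch.tensor([[0, 1, 2, 3], [1, 2, 3, 0]])
batch = torch.zeros(4, dtype=torch.long)                # all nodes belong to graph 0
logits = GINGraphClassifier(in_dim=5, hidden=16, num_classes=2)(x, edge_index, batch)
```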
A review of graph neural networks: concepts, architectures, techniques, challenges, datasets, applications, and future directions
2024
Deep learning has seen significant growth recently and is now applied to a wide range of conventional use cases, including graphs. Graph data provides relational information between elements and is a standard data format for various machine learning and deep learning tasks. Models that can learn from such inputs are essential for working with graph data effectively. This paper identifies nodes and edges within specific applications, such as text, entities, and relations, to create graph structures. Different applications may require various graph neural network (GNN) models. GNNs facilitate the exchange of information between nodes in a graph, enabling them to understand dependencies within the nodes and edges. The paper delves into specific GNN models like graph convolution networks (GCNs), GraphSAGE, and graph attention networks (GATs), which are widely used in various applications today. It also discusses the message-passing mechanism employed by GNN models and examines the strengths and limitations of these models in different domains. Furthermore, the paper explores the diverse applications of GNNs, the datasets commonly used with them, and the Python libraries that support GNN models. It offers an extensive overview of the landscape of GNN research and its practical implementations.
Journal Article
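The review above centers on the message-passing mechanism shared by GCN, GraphSAGE, and GAT: each node aggregates transformed messages from its neighbors and then updates its own representation. To make that mechanism concrete, the sketch below writes out one mean-aggregation message-passing layer in plain PyTorch; it is a teaching sketch, not any specific model from the review.

```python
# One mean-aggregation message-passing layer, written out explicitly
# (a teaching sketch of the mechanism, not a specific model from the review).
import torch
import torch.nn as nn

class MeanMessagePassing(nn.Module):
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.msg = nn.Linear(in_dim, out_dim)            # transform neighbor messages
        self.upd = nn.Linear(in_dim + out_dim, out_dim)  # combine self state with the aggregate

    def forward(self, x, edge_index):
        src, dst = edge_index                            # messages flow src -> dst
        messages = self.msg(x[src])                      # (num_edges, out_dim)
        agg = torch.zeros(x.shape[0], messages.shape[1]).index_add(0, dst, messages)
        deg = torch.zeros(x.shape[0]).index_add(0, dst, torch.ones(dst.shape[0]))
        agg = agg / deg.clamp(min=1).unsqueeze(-1)       # mean over incoming neighbors
        return torch.relu(self.upd(torch.cat([x, agg], dim=-1)))

# Toy usage: 4 nodes, 3-dim features, a small directed edge list.
x = torch.rand(4, 3)
edge_index = torch.tensor([[0, 1, 2, 3], [1, 2, 3, 0]])
h = MeanMessagePassing(3, 8)(x, edge_index)   # -> (4, 8)
```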
A survey of graph neural networks in various learning paradigms: methods, applications, and challenges
2023
In the last decade, deep learning has reinvigorated the machine learning field. It has solved many problems in computer vision, speech recognition, natural language processing, and other domains with state-of-the-art performances. In these domains, the data is generally represented in the Euclidean space. Various other domains conform to non-Euclidean space, for which a graph is an ideal representation. Graphs are suitable for representing the dependencies and inter-relationships between various entities. Traditionally, handcrafted features for graphs are incapable of providing the necessary inference for various tasks from this complex data representation. Recently, there has been an emergence of employing various advances in deep learning for graph-based tasks (called Graph Neural Networks (GNNs)). This article introduces preliminary knowledge regarding GNNs and comprehensively surveys GNNs in different learning paradigms—supervised, unsupervised, semi-supervised, self-supervised, and few-shot or meta-learning. The taxonomy of each graph-based learning setting is provided with logical divisions of methods falling in the given learning setting. The approaches for each learning task are analyzed from theoretical and empirical standpoints. Further, we provide general architecture design guidelines for building GNN models. Various applications and benchmark datasets are also provided, along with open challenges still plaguing the general applicability of GNNs.
Journal Article
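Among the learning paradigms surveyed above, the most common GNN setting is semi-supervised node classification: labels exist for a small subset of nodes and the loss is computed only on that subset. The sketch below shows that masking pattern with a simple dense-adjacency two-layer GCN; the model, sizes, optimizer, and random toy data are all assumptions for illustration.

```python
# Semi-supervised node classification: the loss is computed only on the labeled mask.
# Model, sizes, optimizer, and data are assumptions for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TwoLayerGCN(nn.Module):
    """Dense-adjacency GCN: H = relu(A_norm X W1), logits = A_norm H W2."""
    def __init__(self, in_dim, hidden, num_classes):
        super().__init__()
        self.w1 = nn.Linear(in_dim, hidden)
        self.w2 = nn.Linear(hidden, num_classes)

    def forward(self, x, a_norm):
        h = torch.relu(a_norm @ self.w1(x))
        return a_norm @ self.w2(h)

# Toy data: 20 nodes, 5 features, 3 classes, labels known for only 4 nodes.
x = torch.rand(20, 5)
adj = (torch.rand(20, 20) > 0.8).float() + torch.eye(20)
a_norm = adj / adj.sum(dim=1, keepdim=True)
y = torch.randint(0, 3, (20,))
train_mask = torch.zeros(20, dtype=torch.bool)
train_mask[:4] = True

model = TwoLayerGCN(5, 16, 3)
opt = torch.optim.Adam(model.parameters(), lr=0.01)
for _ in range(50):
    opt.zero_grad()
    logits = model(x, a_norm)
    loss = F.cross_entropy(logits[train_mask], y[train_mask])  # loss only on labeled nodes
    loss.backward()
    opt.step()
```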
An End-To-End Hyperbolic Deep Graph Convolutional Neural Network Framework
by Mao, Jingyi; Lv, Xiaojun; Zhou, Yuchen
in Artificial neural networks, Euclidean geometry, Euclidean space
2024
Graph Convolutional Neural Networks (GCNs) have been widely used in various fields due to their powerful capabilities in processing graph-structured data. However, GCNs encounter significant challenges when applied to scale-free graphs with power-law distributions, resulting in substantial distortions. Moreover, most of the existing GCN models are shallow structures, which restricts their ability to capture dependencies among distant nodes and more refined high-order node features in scale-free graphs with hierarchical structures. To more broadly and precisely apply GCNs to real-world graphs exhibiting scale-free or hierarchical structures and utilize multi-level aggregation of GCNs for capturing high-level information in local representations, we propose the Hyperbolic Deep Graph Convolutional Neural Network (HDGCNN), an end-to-end deep graph representation learning framework that can map scale-free graphs from Euclidean space to hyperbolic space. In HDGCNN, we define the fundamental operations of deep graph convolutional neural networks in hyperbolic space. Additionally, we introduce a hyperbolic feature transformation method based on identity mapping and a dense connection scheme based on a novel non-local message passing framework. In addition, we present a neighborhood aggregation method that combines initial structural features with hyperbolic attention coefficients. Through the above methods, HDGCNN effectively leverages both the structural features and node features of graph data, enabling enhanced exploration of non-local structural features and more refined node features in scale-free or hierarchical graphs. Experimental results demonstrate that HDGCNN achieves remarkable performance improvements over state-of-the-art GCNs in node classification and link prediction tasks, even when utilizing low-dimensional embedding representations. Furthermore, when compared to shallow hyperbolic graph convolutional neural network models, HDGCNN exhibits notable advantages and performance enhancements.
Journal Article
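HDGCNN maps Euclidean graph features into hyperbolic space before aggregation. A standard way to move between the two spaces (a general formula, not necessarily the exact variant HDGCNN uses) is the exponential map at the origin of the Poincare ball with curvature -c, exp_0(v) = tanh(sqrt(c)*||v||) * v / (sqrt(c)*||v||), together with its inverse logarithmic map. The sketch below implements both in plain PyTorch; the names and the round-trip usage are assumptions for illustration.

```python
# Exponential and logarithmic maps at the origin of the Poincare ball with curvature -c.
# Standard formulas for moving features between Euclidean tangent space and hyperbolic
# space; shown for illustration, not taken from the HDGCNN code.
import torch

def expmap0(v, c=1.0, eps=1e-7):
    """Map a tangent vector at the origin into the Poincare ball."""
    sqrt_c = c ** 0.5
    norm = v.norm(dim=-1, keepdim=True).clamp(min=eps)
    return torch.tanh(sqrt_c * norm) * v / (sqrt_c * norm)

def logmap0(y, c=1.0, eps=1e-7):
    """Map a point of the Poincare ball back to the tangent space at the origin."""
    sqrt_c = c ** 0.5
    norm = y.norm(dim=-1, keepdim=True).clamp(min=eps, max=1.0 - eps)
    return torch.atanh(sqrt_c * norm) * y / (sqrt_c * norm)

# Round trip on Euclidean node embeddings (e.g. the output of a GCN layer).
h_euclidean = 0.3 * torch.rand(10, 16)
h_hyperbolic = expmap0(h_euclidean)
h_back = logmap0(h_hyperbolic)   # approximately equal to h_euclidean
```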