MBRL Search Results
1,498 result(s) for "next generation networks"
Backhaul in 5G systems for developing countries: A literature review
Fifth‐generation (5G) technology is seen as a way for developing countries to improve their quality of services in transportation, education, health, agriculture, and other fields, holding the promise of economic transformation through improved productivity and quality of life. The paper considers some of the challenges that operators, investors, and providers face when planning to introduce 5G technology in developing countries. Its main focus is the 5G transport network, in particular the backhaul network. The architectural options and scenarios for the 5G transport network are discussed on the basis of several standards and previous studies, the problems of the existing infrastructures of developing countries are investigated, and newly proposed technologies that may help address those challenges are considered. The paper highlights the obstacles and challenges that providers and operators may face when planning to invest in, deploy, or develop 5G technology in developing countries.
Scope of machine learning applications for addressing the challenges in next‐generation wireless networks
The convenience of obtaining quality services at affordable cost, anytime and anywhere, makes mobile technology very popular among users. This popularity has driven a huge rise in mobile data volume, applications, types of services, and number of customers. The worldwide lockdown during the COVID‐19 pandemic added fuel to this increase, as most professional and commercial activities moved online and were carried out from home. This massive increase in demand for multi‐class services has posed numerous challenges to wireless network frameworks. The services offered through wireless networks must support this huge volume of data and multiple types of traffic, such as real‐time streaming of video, audio, text, and images, at very high bit rates with negligible transmission delay, even at the permissible vehicular speeds of the customers. Next‐generation wireless networks (NGWNs, i.e. 5G networks and beyond) are being developed to accommodate these service qualities and many more. However, incorporating all the desired service qualities into the design of the 5G network infrastructure poses major challenges for designers and engineers. It requires analysing a huge volume of network data (structured and unstructured) collected from heterogeneous devices, applications, services, and customers, and managing network parameters effectively and dynamically in real time on the basis of this analysis. Given the ever‐increasing heterogeneity and complexity of networks, machine learning (ML) techniques may become an efficient tool for managing these issues. In recent years, progress in artificial intelligence and ML has spurred interest in their application to the networking domain. This study discusses current wireless network research, briefly reviews ML methods that can be effectively applied to the wireless networking domain, surveys tools available to support and customise efficient mobile system design, and outlines unresolved issues as directions for future research.
Wideband planar antenna with reconfigurable omnidirectional and directional radiation patterns
Proposed is a reconfigurable planar antenna that can generate omnidirectional and directional radiation patterns. The design has been tuned to resonate at 3.8 GHz and covers a bandwidth of several hundred MHz. Using RF switches, the antenna can generate five different radiation patterns: a single omnidirectional mode and four directional modes. Simulations and measurements were carried out and are in good agreement. The compact planar design, large operating bandwidth, and ability to create both omnidirectional and directional radiation patterns make this antenna suitable for small form factor wireless devices in next generation cognitive networks.
Dynamic backhaul resource allocation in wireless networks using artificial neural networks
The increasing bandwidth demand of end-users makes efficient resource management ever more compelling in next generation wireless networks. In the present work, a novel scheme is proposed that deploys an intelligent agent capable of monitoring, storing, and predicting the forthcoming resource needs of a base station (BS). In this way, the BS can commit the necessary resources for its backhaul connection in advance, guaranteeing the end-user's quality of service. The prediction process is performed using machine learning techniques.
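The monitor–store–predict loop described in this abstract can be sketched as follows. This is a toy stand-in under stated assumptions: the recency-weighted average, window size, margin, and demand figures are all illustrative, where the paper itself trains an artificial neural network as the predictor.

```python
from collections import deque

class BackhaulPredictor:
    """Toy stand-in for the paper's intelligent agent: it monitors
    per-slot backhaul demand of a base station (BS), stores a sliding
    window of observations, and predicts the next slot's demand so
    that backhaul resources can be committed ahead of time."""

    def __init__(self, window=8, margin=1.10):
        self.history = deque(maxlen=window)  # "store" step
        self.margin = margin                 # over-provisioning to protect QoS

    def observe(self, demand_mbps):
        # "Monitor" step: record the demand seen in the latest slot.
        self.history.append(demand_mbps)

    def predict_next(self):
        # "Predict" step: recency-weighted average as a crude trend model
        # (the paper uses an artificial neural network instead).
        if not self.history:
            return 0.0
        weights = range(1, len(self.history) + 1)
        return sum(w * d for w, d in zip(weights, self.history)) / sum(weights)

    def reserve(self):
        # Capacity to commit in advance, with a safety margin.
        return self.predict_next() * self.margin

agent = BackhaulPredictor()
for demand in [100, 110, 120, 130]:  # steadily growing load (made-up)
    agent.observe(demand)
reserved = agent.reserve()           # capacity to commit for the next slot
```

Weighting recent slots more heavily lets the reservation track a rising trend instead of lagging behind it, which is the point of committing resources in advance.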
Development of Low-Cost Air Quality Stations for Next Generation Monitoring Networks: Calibration and Validation of PM2.5 and PM10 Sensors
A low-cost air quality station has been developed for real-time monitoring of the main atmospheric pollutants. Sensors for CO, CO2, NO2, O3, VOC, PM2.5 and PM10 were integrated on an Arduino Shield compatible board. The PM2.5 and PM10 sensors underwent laboratory calibration followed by field validation. Laboratory calibration was carried out at the headquarters of CNR-IBIMET in Florence (Italy) against a TSI DustTrak reference instrument. A MATLAB procedure, implementing advanced mathematical techniques to detect possible complex non-linear relationships between sensor signals and reference data, was developed to accomplish the laboratory calibration. Field validation was performed across a full “heating season” (1 November 2016 to 15 April 2017) by co-locating the station at a road site in Florence where an official fixed air quality station was in operation. Both the calibration and validation processes returned good scores, in most cases better than those achieved for similar systems in the literature. During field validation, in particular, PM2.5 and PM10 achieved mean biases of 0.036 and 0.598 µg/m3, RMSE of 4.056 and 6.084 µg/m3, and R2 of 0.909 and 0.957, respectively. The robustness of the developed station, deployed seamlessly through a five-and-a-half-month outdoor campaign without sensor failures or drifts, is a further key point.
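The validation metrics quoted above (mean bias, RMSE, R²) can be computed as a sketch like this; the PM2.5 readings in the example are made-up numbers, not the paper's data.

```python
import math

def calibration_scores(sensor, reference):
    """Validation metrics of the kind reported for the PM sensors:
    mean bias, RMSE, and R^2 of sensor readings against a reference
    instrument co-located with the station."""
    n = len(reference)
    bias = sum(s - r for s, r in zip(sensor, reference)) / n
    rmse = math.sqrt(sum((s - r) ** 2 for s, r in zip(sensor, reference)) / n)
    mean_ref = sum(reference) / n
    ss_res = sum((r - s) ** 2 for s, r in zip(sensor, reference))  # residuals
    ss_tot = sum((r - mean_ref) ** 2 for r in reference)           # variance
    return bias, rmse, 1 - ss_res / ss_tot

# Made-up PM2.5 readings (ug/m3): low-cost sensor vs. reference instrument.
sensor = [10.1, 12.3, 9.8, 15.2]
reference = [10.0, 12.0, 10.0, 15.0]
bias, rmse, r2 = calibration_scores(sensor, reference)
```

Bias near zero with high R² is what the field campaign reports; a large bias with high R² would instead suggest a systematic offset that calibration can correct.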
Energy and quality of service aware FUZZY-technique for order preference by similarity to ideal solution based vertical handover decision algorithm for heterogeneous wireless networks
Next generation networks will use multiple transport technologies that offer unrestricted user access to different service providers and support generalised mobility to allow consistent and ubiquitous service provisioning to users. Different networks have widely varying characteristics, which makes it difficult to maintain the requested quality of service (QoS) after executing handoff from one transport technology to another. Maintaining the QoS during handoff in heterogeneous networks needs an intelligent decision mechanism. Moreover, the developed algorithms should be energy aware as the devices involved rely on battery power. This study proposes a novel vertical handover decision mechanism namely FUZZY-technique for order preference by similarity to ideal solution (FUZZY-TOP), obtained by combining a fuzzy rule-based mechanism with the technique for order preference by similarity to ideal solution approach. Simulation results based on a generalised Markov chain show that compared with existing vertical handover decision algorithms, the proposed FUZZY-TOP decision algorithm performs better for different traffic classes.
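The TOPSIS half of the FUZZY-TOP decision can be sketched as a plain ranking step; the candidate networks, criteria, and weights below are illustrative assumptions, and the paper's fuzzy rule-based front end is omitted.

```python
import math

def topsis(matrix, weights, benefit):
    """Rank alternatives (rows) against criteria (columns) by closeness
    to the ideal solution. benefit[j] is True if larger is better for
    criterion j (e.g. bandwidth), False if smaller is better (e.g. delay)."""
    ncols = len(weights)
    # Vector-normalise each column, then apply the criterion weights.
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(ncols)]
    v = [[weights[j] * row[j] / norms[j] for j in range(ncols)] for row in matrix]
    ideal = [max(col) if benefit[j] else min(col) for j, col in enumerate(zip(*v))]
    anti = [min(col) if benefit[j] else max(col) for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_pos = math.sqrt(sum((x - p) ** 2 for x, p in zip(row, ideal)))
        d_neg = math.sqrt(sum((x - m) ** 2 for x, m in zip(row, anti)))
        scores.append(d_neg / (d_pos + d_neg))  # relative closeness in [0, 1]
    return scores

# Hypothetical candidate networks: [bandwidth Mbps, delay ms, energy cost].
nets = [[100, 40, 3], [50, 10, 1], [80, 20, 2]]
scores = topsis(nets, weights=[0.5, 0.3, 0.2], benefit=[True, False, False])
best = scores.index(max(scores))  # handover target with highest closeness
```

With these (assumed) weights the third candidate wins: it trades a little bandwidth for much lower delay and energy cost, which is exactly the kind of multi-criteria compromise a vertical handover decision has to make.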
Optimal User Scheduling in Multi Antenna System Using Multi Agent Reinforcement Learning
Multiple Input Multiple Output (MIMO) systems have been gaining significant attention from the research community due to their potential to improve data rates. However, a suitable scheduling mechanism is required to distribute the available spectrum resources efficiently and enhance system capacity. This paper investigates the user selection problem in a Multi-User MIMO (MU-MIMO) environment using a multi-agent Reinforcement Learning (RL) methodology. By exploiting the spatial degrees of freedom of multiple antennas, several devices can be served simultaneously in every time slot. We aim to develop an optimal scheduling policy by selecting the group of users to be scheduled for transmission, given the channel conditions and resource blocks at the beginning of each time slot. We first formulate the MU-MIMO scheduling problem as a single-state Markov Decision Process (MDP) and obtain the optimal policy by solving it using RL. We use the aggregated sum-rate of the group of users selected for transmission as the reward, and a 20% higher sum-rate performance over conventional methods is reported.
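A single-state MDP with a sum-rate reward reduces to a bandit-style learning problem, which can be sketched as below. This is a single-agent ε-greedy stand-in for the paper's multi-agent formulation, and the user rates, noise model, and hyperparameters are all illustrative assumptions.

```python
import itertools
import random

random.seed(0)

# Hypothetical mean achievable rates (bits/s/Hz) for four users.
rates = {0: 5.0, 1: 3.0, 2: 4.0, 3: 1.0}

# Action = which pair of users to schedule in a slot (two spatial streams).
actions = list(itertools.combinations(rates, 2))
q = {a: 0.0 for a in actions}  # value estimate per candidate user group
eps, alpha = 0.1, 0.2          # exploration rate, learning rate

for _ in range(2000):
    # Epsilon-greedy action selection over the single-state MDP.
    a = random.choice(actions) if random.random() < eps else max(q, key=q.get)
    # Reward: realised aggregated sum-rate of the scheduled group (noisy).
    r = sum(random.gauss(rates[u], 0.5) for u in a)
    q[a] += alpha * (r - q[a])  # incremental value update

best_group = max(q, key=q.get)  # learned scheduling decision
```

After enough slots the value estimates concentrate on the group with the highest mean sum-rate, so the weakest user is reliably left out of the learned schedule.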
Large Language Models Meet Next-Generation Networking Technologies: A Review
The evolution of network technologies has significantly transformed global communication, information sharing, and connectivity. Traditional networks, relying on static configurations and manual interventions, face substantial challenges such as complex management, inefficiency, and susceptibility to human error. The rise of artificial intelligence (AI) has begun to address these issues by automating tasks like network configuration, traffic optimization, and security enhancements. Despite their potential, integrating AI models in network engineering encounters practical obstacles including complex configurations, heterogeneous infrastructure, unstructured data, and dynamic environments. Generative AI, particularly large language models (LLMs), represents a promising advancement in AI, with capabilities extending to natural language processing tasks like translation, summarization, and sentiment analysis. This paper aims to provide a comprehensive review exploring the transformative role of LLMs in modern network engineering. In particular, it addresses gaps in the existing literature by focusing on LLM applications in network design and planning, implementation, analytics, and management. It also discusses current research efforts, challenges, and future opportunities, aiming to provide a comprehensive guide for networking professionals and researchers. The main goal is to facilitate the adoption and advancement of AI and LLMs in networking, promoting more efficient, resilient, and intelligent network systems.
Network Function Virtualization and Service Function Chaining Frameworks: A Comprehensive Review of Requirements, Objectives, Implementations, and Open Research Challenges
Network slicing has become a fundamental property for next-generation networks, especially because an inherent part of 5G standardisation is the ability for service providers to migrate some or all of their network services to a virtual network infrastructure, thereby reducing both capital and operational costs. With network function virtualisation (NFV), network functions (NFs) such as firewalls, traffic load balancers, content filters, and intrusion detection systems (IDS) are either instantiated on virtual machines (VMs) or lightweight containers, often chained together to create a service function chain (SFC). In this work, we review the state-of-the-art NFV and SFC implementation frameworks and present a taxonomy of the current proposals. Our taxonomy comprises three major categories based on the primary objectives of each of the surveyed frameworks: (1) resource allocation and service orchestration, (2) performance tuning, and (3) resilience and fault recovery. We also identify some key open research challenges that require further exploration by the research community to achieve scalable, resilient, and high-performance NFV/SFC deployments in next-generation networks.
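The chaining idea at the heart of SFC can be sketched as a minimal pipeline of network functions; the packet fields, the blocked-port policy, and the two-replica backend pool are hypothetical, not drawn from any surveyed framework.

```python
def firewall(pkt):
    # Drop traffic to a blocked port (hypothetical policy: Telnet, port 23).
    return None if pkt["dport"] == 23 else pkt

def load_balancer(pkt):
    # Assign a backend by hashing the flow identity across two replicas.
    pkt["backend"] = hash((pkt["src"], pkt["dport"])) % 2
    return pkt

def apply_chain(pkt, chain):
    """Steer a packet through an ordered service function chain;
    any network function may drop the packet by returning None."""
    for nf in chain:
        pkt = nf(pkt)
        if pkt is None:
            return None
    return pkt

sfc = [firewall, load_balancer]  # NFs applied in chain order
passed = apply_chain({"src": "10.0.0.1", "dport": 80}, sfc)
dropped = apply_chain({"src": "10.0.0.1", "dport": 23}, sfc)
```

The frameworks surveyed differ mainly in where these NFs run (VMs vs. containers), how the chain order is enforced by the orchestrator, and how a failed NF is bypassed or restarted, which is the resilience category of the taxonomy.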
Innovative Channel Estimation Methods for Massive MIMO Using GAN Architectures
Channel estimation is a critical component of modern wireless communication systems, especially in massive multiple‐input multiple‐output (MIMO) architectures, where the accuracy of received signal decoding heavily depends on the quality of channel state information. As wireless networks evolve into fifth‐generation (5G) and beyond, they face increasingly complex propagation environments with rapid mobility, dense connectivity, and hardware constraints. Accurate and timely channel estimation is therefore essential for maintaining system performance, enabling reliable data transmission, and supporting techniques such as beamforming and interference management. Traditional estimation methods like least squares and minimum mean square error offer baseline performance but are often limited by their computational complexity, sensitivity to noise, and inefficiency in quantised systems—particularly those employing one‐bit analogue‐to‐digital converters. These limitations hinder their applicability in real‐time, low‐power, and bandwidth‐constrained scenarios. To address these challenges, this paper proposes a novel channel estimation framework based on conditional generative adversarial networks. The approach incorporates a U‐Net‐based generator and a sequential convolutional neural network discriminator to learn complex channel mappings from highly quantised received signals. Unlike existing methods, the proposed architecture dynamically adapts to various noise levels and system configurations, offering improved robustness and generalisation. Comprehensive experiments conducted on realistic indoor massive MIMO datasets demonstrate that the proposed method achieves substantial performance gains. The model improves estimation accuracy from 93% to 95.5% and significantly enhances normalised mean square error, consistently outperforming conventional and deep learning‐based techniques across diverse training conditions. 
These results confirm the effectiveness of the proposed scheme in delivering high‐accuracy channel estimation under extreme quantisation conditions, making it suitable for next‐generation wireless systems.
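The normalised mean square error used to compare estimators above can be computed as follows; the channel taps and the perturbed estimate are made-up values for illustration, and the GAN-based estimator itself is not reproduced here.

```python
import math

def nmse_db(h_true, h_est):
    """Normalised mean square error in dB between a true and an
    estimated channel vector: error energy over channel energy,
    so more negative values mean a better estimate."""
    err = sum(abs(t - e) ** 2 for t, e in zip(h_true, h_est))
    pwr = sum(abs(t) ** 2 for t in h_true)
    return 10 * math.log10(err / pwr)

# Made-up complex channel taps and a slightly perturbed estimate.
h = [1 + 1j, 0.5 - 0.2j, -0.3 + 0.8j]
h_hat = [0.9 + 1.1j, 0.45 - 0.25j, -0.35 + 0.75j]
score = nmse_db(h, h_hat)
```

Normalising by the channel energy makes the metric comparable across channels of different strength, which is why NMSE rather than raw MSE is the standard figure of merit for channel estimators.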