Search Results
5 results for "Selvi G, Chemmalar"
Rating Prediction Method for Item-based Collaborative Filtering Recommender Systems Using Formal Concept Analysis
Recommender systems suggest items to online users by exploiting the preferences recorded during item purchases. Even so, the quality of the resulting recommendations is often far from satisfactory. In this paper, a new approach based on a mathematical model, Formal Concept Analysis (FCA), is used to improve rating prediction for unknown users, addressing shortcomings of existing approaches such as data sparsity, high dimensionality of data, and the quality of top-n recommendation. The FCA method is applied via Boolean matrix factorization (i.e., optimal formal concepts) to predict unknown ratings in the available user-item interaction matrix, which proves more efficient in tackling the computational complexity of high-dimensional data. The proposed method is applied with an item-based collaborative filtering technique, and experiments on the MovieLens dataset show satisfactory results. The experimental results are evaluated using related error and performance metrics, and are compared with existing item-based collaborative filtering techniques, demonstrating that recommendation quality improves over the existing state-of-the-art techniques.
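The FCA-based predictor itself is not spelled out in the abstract, but the item-based collaborative filtering baseline it builds on is standard: score an unknown cell as a similarity-weighted average of the user's ratings on the other items. A minimal sketch follows; the toy matrix and plain cosine similarity are illustrative assumptions, not the paper's data or exact method:

```python
from math import sqrt

# Toy user-item rating matrix (0 = unknown); rows are users, columns are items.
R = [
    [5, 3, 0, 1],
    [4, 0, 4, 1],
    [1, 1, 2, 5],
    [1, 0, 3, 4],
]

def item_cosine(R, i, j):
    """Cosine similarity between item columns i and j over co-rating users."""
    num = den_i = den_j = 0.0
    for row in R:
        if row[i] and row[j]:
            num += row[i] * row[j]
            den_i += row[i] ** 2
            den_j += row[j] ** 2
    if den_i == 0 or den_j == 0:
        return 0.0
    return num / (sqrt(den_i) * sqrt(den_j))

def predict(R, u, i):
    """Predict user u's rating for item i as a similarity-weighted
    average of u's known ratings on the other items."""
    num = den = 0.0
    for j, r in enumerate(R[u]):
        if j != i and r:
            s = item_cosine(R, i, j)
            num += s * r
            den += abs(s)
    return num / den if den else 0.0
```

For example, `predict(R, 0, 2)` estimates user 0's missing rating for item 2 from that user's ratings on items 0, 1, and 3, weighted by how similarly each item was rated by the other users.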
Three-way formal concept clustering technique for matrix completion in recommender system
Purpose: In today's world, recommender systems are very valuable to online users, as the World Wide Web is loaded with so much information that users spend more time and money finding what they need. Recommender systems suggest relevant items by applying recommendation filtering techniques to the available information. These techniques take their input as a matrix that is typically very sparse and high-dimensional, so the sparse data matrix is completed by filling in the unknown or missing entries using matrix completion techniques. One of the most popular is matrix factorization (MF), which decomposes the sparse matrix into two smaller matrices whose dot product completes the matrix by filling in the missing values. However, MF fails to retain the original information when it decomposes the matrix, and its relatively high error rate reflects that loss.
Design/methodology/approach: To alleviate the problems of data loss and data sparsity, a new algorithm based on formal concept analysis (FCA), a mathematical model, is proposed for matrix completion, aiming to fill the unknown or missing entries largely without loss of valuable information. The proposed algorithm uses a clustering technique that captures, in two classes, the users who have commonly rated items and those who have not. It then fills each unknown entry with the mean value of its cluster, completing the matrix without actually decomposing it.
Findings: Experiments were conducted on the public MovieLens dataset; the results show a minimal prediction error rate, and a comparison with existing algorithms is also reported. The application of FCA in recommender systems thus yields minimal or no data loss and improved accuracy in predicting rating scores.
Social implications: The proposed FCA-based matrix completion algorithm produces good recommendations, helping today's online users make decisions about online purchases.
Originality/value: This paper presents a new matrix completion technique that adopts key properties of FCA and applies it to recommender systems; the proposed algorithm outperforms existing algorithms in prediction accuracy.
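The completion idea described above, grouping users who co-rated items and filling each missing cell with the cluster's mean, can be roughly illustrated as follows. The actual algorithm derives its clusters from formal concepts; the Jaccard-overlap grouping, the 0.5 threshold, and the toy matrix below are stand-in assumptions:

```python
# Toy user-item matrix (0 = unknown).
R = [
    [5, 4, 0, 0, 1],
    [4, 5, 0, 0, 0],
    [0, 0, 5, 4, 0],
    [0, 0, 4, 5, 2],
]

def rated(row):
    """Set of item indices the user has rated."""
    return frozenset(j for j, r in enumerate(row) if r)

def jaccard(a, b):
    return len(a & b) / len(a | b) if a | b else 0.0

def cluster_users(R, threshold=0.5):
    """Greedy grouping: a user joins a cluster if their rated-item set
    overlaps (Jaccard) with any member's set at or above the threshold."""
    clusters = []
    for u, row in enumerate(R):
        ru = rated(row)
        for c in clusters:
            if any(jaccard(ru, rated(R[v])) >= threshold for v in c):
                c.append(u)
                break
        else:
            clusters.append([u])
    return clusters

def complete(R, threshold=0.5):
    """Fill unknown (0) cells with the cluster mean for that item,
    without factorizing the matrix; cells with no cluster data stay 0."""
    filled = [row[:] for row in R]
    for c in cluster_users(R, threshold):
        for j in range(len(R[0])):
            vals = [R[u][j] for u in c if R[u][j]]
            if vals:
                mean = sum(vals) / len(vals)
                for u in c:
                    if filled[u][j] == 0:
                        filled[u][j] = mean
    return filled
```

On this toy matrix the users split into {0, 1} and {2, 3}, and each missing rating is imputed from its own cluster's observed values, which is the sense in which the matrix is "completed without decomposing it".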
Spatial big data for disaster management
Big data refers to data sets so large and complex that conventional data-processing applications cannot handle them. It is now a widely known domain in research, academia, and industry, used to consolidate very large amounts of data in a single centralized store. Its challenges include capture, storage, analysis, data accuracy, visualization, sharing, transfer, querying, updating, and data privacy. In this digital world, storing and retrieving data is an enormous task for large organizations, and data can be lost when storage is distributed; to address this, organizations consolidate all of their data into one large database. Remote sensing is the science of acquiring data to detect objects or analyze an area from a distance; sensors make objects easy to locate and generate geographic data from satellite and sensor feeds. This paper therefore analyzes which architectures are used for remote sensing in big data, how those architectures differ from one another, and how they relate to our study. It describes how disasters occur and computes results over a data set, applying a seismic data set to assess earthquake disasters using classification and clustering strategies. The classical data mining algorithms used are k-nearest neighbor, naive Bayes, and decision table for classification, and hierarchical, density-based, and simple k-means for clustering, run with the XLMiner and WEKA tools.
The paper also shows how to predict over the spatial dataset using XLMiner and WEKA, demonstrating that big spatial data is well suited to this approach.
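The clustering side of the pipeline can be illustrated with plain k-means (Lloyd's algorithm). The paper runs its experiments in WEKA and XLMiner; the toy (magnitude, depth) events below are an illustrative stand-in, not the seismic dataset:

```python
import random

# Toy stand-in for a seismic feature table: (magnitude, depth_km) pairs.
events = [(2.1, 5), (2.4, 8), (2.0, 6), (6.5, 30), (6.8, 35), (7.1, 28)]

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means: alternate between assigning each point to its
    nearest center and moving each center to its group's mean."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)  # initialize from the data
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            idx = min(range(k),
                      key=lambda i: sum((a - b) ** 2
                                        for a, b in zip(p, centers[i])))
            groups[idx].append(p)
        centers = [
            tuple(sum(c) / len(g) for c in zip(*g)) if g else centers[i]
            for i, g in enumerate(groups)
        ]
    return centers, groups
```

With k=2, the small-magnitude and large-magnitude events separate into two clusters, which is the kind of grouping the abstract applies to seismic records before classification.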
RETRACTED: Spatial big data for disaster management
Generative Pre-trained Transformer: A Comprehensive Review on Enabling Technologies, Potential Applications, Emerging Challenges, and Future Directions
The Generative Pre-trained Transformer (GPT) represents a notable breakthrough in natural language processing, propelling us toward machines that can understand and communicate using language in a manner that closely resembles that of humans. GPT is based on the transformer architecture, a deep neural network designed for natural language processing tasks. Due to their impressive performance on these tasks and their ability to converse effectively, GPT models have gained significant popularity among researchers and industrial communities, making them among the most widely used and effective models in natural language processing and related fields, which motivated this review. This review provides a detailed overview of GPT, including its architecture, working process, training procedures, enabling technologies, and impact on various applications. We also explore the potential challenges and limitations of GPT, and discuss potential solutions and future directions. Overall, this paper aims to provide a comprehensive understanding of GPT, its enabling technologies, its impact on various applications, emerging challenges, and potential solutions.