Asset Details
Code comment generation based on graph neural network enhanced transformer model for code understanding in open-source software ecosystems
by
Yang, Xiaoxian; Kuang, Li; Zhou, Cong
in
Artificial Intelligence
/ Coders
/ Computer Science
/ Graph neural networks
/ Machine learning
/ Neural networks
/ Open source software
/ Public domain
/ Representations
/ Software development
/ Software Engineering/Programming and Operating Systems
/ Source code
/ Special Issue on Deep Learning in Open-Source Software Ecosystems
/ Transformers
2022
Journal Article
Overview
In open-source software ecosystems, the scale of source code keeps growing, and developers rely on aids such as good code comments and descriptive method names to make code easier to read and understand. However, high-quality comments and method names are often unavailable due to tight project schedules or other constraints in ecosystems such as GitHub. In this work, we therefore use deep learning models to generate appropriate code comments and method names to support software development and maintenance, a task that requires a non-trivial understanding of the code. We propose a Graph neural network enhanced Transformer model (GTrans for short) that learns code representations from both code sequences and code graphs. A Transformer encoder captures the global representation from the code sequence, a graph neural network (GNN) encoder focuses on the local details in the code graph, and a decoder combines the global and local representations through an attention mechanism. We evaluate the model on three public datasets collected from GitHub. In an extensive evaluation, GTrans outperforms state-of-the-art models by up to a 3.8% increase in METEOR on code comment generation, and by margins of 5.8%–9.4% in ROUGE on method name generation after some adjustments to the structure. Empirically, we find that method name generation depends more on local information, whereas code comment generation depends more on global information. Our data and code are available at
https://github.com/zc-work/GTrans.
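The dual-encoder design described above can be sketched in plain NumPy. This is a minimal illustration of the idea only, not the paper's actual architecture: the single-head attention, mean-neighbour message passing, and the simple averaging fusion in `fuse` are all assumptions chosen for brevity (GTrans itself uses full Transformer and GNN stacks with learned parameters).

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X):
    # "global" encoder: one head of scaled dot-product self-attention
    # over the token-embedding sequence X (n_tokens x dim)
    scores = X @ X.T / np.sqrt(X.shape[1])
    return softmax(scores) @ X

def gnn_layer(X, A):
    # "local" encoder: one round of mean-neighbour message passing
    # over the code graph with adjacency matrix A (n_nodes x n_nodes)
    deg = A.sum(axis=1, keepdims=True).clip(min=1)
    return (A @ X) / deg

def fuse(query, H_seq, H_graph):
    # decoder step: cross-attend from the decoder query to each
    # encoder's output, then combine (here: a simple average)
    def cross(H):
        w = softmax(query @ H.T / np.sqrt(H.shape[1]))
        return w @ H
    return 0.5 * (cross(H_seq) + cross(H_graph))

# toy example: 4 tokens/nodes with 8-dimensional embeddings
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
out = fuse(np.zeros((1, 8)), self_attention(X), gnn_layer(X, A))
```

Here `out` is a single fused context vector of shape `(1, 8)`; in the real model such fused representations would feed the decoder that emits comment or method-name tokens.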
Publisher
Springer US; Springer Nature B.V.