Asset Details
Molecular representation learning with language models and domain-relevant auxiliary tasks
Paper, 2020
by Edlich, Thomas; Meyers, Joshua; Fabian, Benedek; Segler, Marwin; Ahmed, Mohamed; Gaspar, Héléna; Fiscato, Marco
in Benchmarks / Domains / Representation learning / Screening / Training
Overview
We apply a Transformer architecture, specifically BERT, to learn flexible and high-quality molecular representations for drug discovery problems. We study the impact of using different combinations of self-supervised tasks for pre-training, and present our results for the established Virtual Screening and QSAR benchmarks. We show that: i) The selection of appropriate self-supervised task(s) for pre-training has a significant impact on performance in subsequent downstream tasks such as Virtual Screening. ii) Using auxiliary tasks with more domain relevance for Chemistry, such as learning to predict calculated molecular properties, increases the fidelity of our learnt representations. iii) Finally, we show that molecular representations learnt by our model "MolBert" improve upon the current state of the art on the benchmark datasets.
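The abstract outlines the recipe: a BERT encoder pre-trained on SMILES strings with a masked-language-model objective, plus a domain-relevant auxiliary head that regresses calculated molecular properties. The following is a minimal sketch of that multi-task setup, not the authors' MolBert code: it assumes the Hugging Face transformers and RDKit packages, SMILES tokenisation is taken as given, and the model size, descriptor choice (MolWt, MolLogP, TPSA), and loss weighting are illustrative assumptions.

import torch
import torch.nn as nn
from transformers import BertConfig, BertModel
from rdkit import Chem
from rdkit.Chem import Descriptors

def calc_properties(smiles: str) -> torch.Tensor:
    """Calculated molecular properties used as auxiliary regression targets
    (an assumed descriptor set; the paper uses a larger set of calculated properties)."""
    mol = Chem.MolFromSmiles(smiles)
    return torch.tensor([
        Descriptors.MolWt(mol),    # molecular weight
        Descriptors.MolLogP(mol),  # octanol-water partition coefficient
        Descriptors.TPSA(mol),     # topological polar surface area
    ])

class MolBertSketch(nn.Module):
    """BERT encoder with a masked-LM head plus an auxiliary property head."""

    def __init__(self, vocab_size: int = 64, n_props: int = 3):
        super().__init__()
        # Small illustrative configuration, not the paper's hyperparameters.
        config = BertConfig(vocab_size=vocab_size, hidden_size=256,
                            num_hidden_layers=4, num_attention_heads=4)
        self.encoder = BertModel(config)
        self.mlm_head = nn.Linear(config.hidden_size, vocab_size)
        self.prop_head = nn.Linear(config.hidden_size, n_props)

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        token_logits = self.mlm_head(out.last_hidden_state)  # per-token vocab logits
        prop_pred = self.prop_head(out.pooler_output)        # whole-molecule properties
        return token_logits, prop_pred

def pretraining_loss(token_logits, mlm_labels, prop_pred, prop_targets,
                     aux_weight: float = 1.0):
    """Joint objective: masked-token recovery plus property regression.
    The equal weighting is an assumption, not the paper's setting."""
    mlm_loss = nn.functional.cross_entropy(
        token_logits.transpose(1, 2), mlm_labels, ignore_index=-100)
    prop_loss = nn.functional.mse_loss(prop_pred, prop_targets)
    return mlm_loss + aux_weight * prop_loss

In the paper's terms, the property head supplies the "domain-relevant auxiliary task"; the choice of masking scheme, tokeniser, and descriptor set is where implementations differ.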
Publisher
Cornell University Library, arXiv.org