Asset Details
Self-Supervised Time Series Representation Learning by Inter-Intra Relational Reasoning
by Gao, Yue; Zhang, Fengbin; Fan, Haoyi
in Anchors / Backbone / Feature extraction / Reasoning / Representations / Sampling / Supervised learning / Time series
2020
Paper
Overview
Self-supervised learning achieves superior performance in many domains by extracting useful representations from unlabeled data. However, most traditional self-supervised methods focus on exploring the inter-sample structure, while less effort has been devoted to the underlying intra-temporal structure, which is important for time series data. In this paper, we present SelfTime, a general self-supervised time series representation learning framework that explores the inter-sample and intra-temporal relations of time series to learn the underlying structural features of unlabeled time series. Specifically, we first generate the inter-sample relation by sampling positive and negative samples of a given anchor sample, and the intra-temporal relation by sampling time pieces from this anchor. Then, based on the sampled relations, a shared feature extraction backbone combined with two separate relation reasoning heads is employed to quantify the relationships of the sample pairs for inter-sample relation reasoning and of the time-piece pairs for intra-temporal relation reasoning, respectively. Finally, useful representations of the time series are extracted from the backbone under the supervision of the relation reasoning heads. Experimental results on multiple real-world time series datasets for the time series classification task demonstrate the effectiveness of the proposed method. Code and data are publicly available at https://haoyfan.github.io/.
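The overview above describes a shared backbone supervised by two relation reasoning heads. The following is a minimal, illustrative PyTorch sketch of that idea, not the authors' released implementation (linked above): the backbone choice, head sizes, the stand-in augmentation, and the intra-temporal label scheme are all assumptions made for this example.

```python
# Minimal sketch of the SelfTime idea: a shared backbone plus two relation
# reasoning heads (inter-sample and intra-temporal). All architectural details
# here are illustrative assumptions, not values from the paper.
import torch
import torch.nn as nn

class Backbone(nn.Module):
    """Shared feature extractor applied to full series and to time pieces."""
    def __init__(self, in_channels: int = 1, feat_dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(in_channels, 32, kernel_size=7, padding=3), nn.ReLU(),
            nn.Conv1d(32, feat_dim, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),          # pool over the time axis
        )

    def forward(self, x):                      # x: (batch, channels, length)
        return self.net(x).squeeze(-1)         # (batch, feat_dim)

class RelationHead(nn.Module):
    """Classifies the relation of a pair of representations (concatenated)."""
    def __init__(self, feat_dim: int = 64, n_classes: int = 2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * feat_dim, 64), nn.ReLU(),
            nn.Linear(64, n_classes),
        )

    def forward(self, z_a, z_b):
        return self.net(torch.cat([z_a, z_b], dim=-1))

# Inter-sample relation: positive pair = anchor vs. its augmented view,
# negative pair = anchor vs. a different series. Intra-temporal relation:
# pairs of time pieces cut from the same anchor, labeled by an assumed
# temporal-relation class (3 classes here, purely for illustration).
backbone   = Backbone()
inter_head = RelationHead(n_classes=2)
intra_head = RelationHead(n_classes=3)

anchor   = torch.randn(8, 1, 128)                        # toy univariate batch
positive = anchor + 0.05 * torch.randn_like(anchor)      # stand-in augmentation
negative = torch.randn(8, 1, 128)
piece_a, piece_b = anchor[..., :64], anchor[..., 64:]    # two time pieces

z_anchor, z_pos, z_neg = backbone(anchor), backbone(positive), backbone(negative)
z_pa, z_pb = backbone(piece_a), backbone(piece_b)

ce = nn.CrossEntropyLoss()
inter_loss = ce(inter_head(z_anchor, z_pos), torch.ones(8, dtype=torch.long)) + \
             ce(inter_head(z_anchor, z_neg), torch.zeros(8, dtype=torch.long))
intra_loss = ce(intra_head(z_pa, z_pb), torch.full((8,), 1, dtype=torch.long))  # dummy label
loss = inter_loss + intra_loss            # joint self-supervised objective
loss.backward()                           # gradients flow into the shared backbone
```

In this sketch both heads backpropagate into the same backbone, which is the mechanism by which the representations are "extracted from the backbone under the supervision of the relation reasoning heads"; a downstream classifier would then be trained on the frozen backbone features.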
Publisher
Cornell University Library, arXiv.org