Asset Details
A Bidirectional Long Short-Term Memory Autoencoder Transformer for Remaining Useful Life Estimation
by Fan, Zhengyang; Chang, Kuo-Chu; Li, Wanru (2023)
Subjects: Adaptation / Aircraft / Aircraft engines / Analysis / Artificial intelligence / autoencoder / bidirectional LSTM / Coders / Computational linguistics / Deep learning / Electric transformers / Engines / Estimation / Feature extraction / Language processing / Measuring instruments / Methods / Natural language interfaces / Neural networks / Preventive maintenance / remaining useful life prediction / self-supervised learning / Sensors / Support vector machines / Time series / Training / Transformer / turbofan engine / Useful life
Journal Article
Overview
Estimating the remaining useful life (RUL) of aircraft engines plays a pivotal role in enhancing safety, optimizing operations, and promoting sustainability, making it a crucial component of modern aviation management. Precise RUL predictions offer valuable insight into an engine’s condition, enabling informed maintenance and crew-scheduling decisions. In this paper, we propose a novel RUL prediction approach that harnesses bidirectional LSTM and Transformer architectures, both known for their success in sequence modeling tasks such as natural language processing. We adopt the encoder of the full Transformer as the backbone of our framework and integrate it with a self-supervised denoising autoencoder that uses a bidirectional LSTM for improved feature extraction. Within our framework, a sequence of multivariate time-series sensor measurements serves as the input; it is first processed by the bidirectional LSTM autoencoder to extract essential features, which are then fed into the Transformer encoder backbone for RUL prediction. Notably, our approach trains the autoencoder and the Transformer encoder simultaneously, in contrast to naive sequential training. Through a series of numerical experiments on the C-MAPSS datasets, we demonstrate that our proposed models match or surpass existing methods.
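The abstract's distinguishing training detail is that the denoising autoencoder and the Transformer encoder are optimized simultaneously rather than one after the other. A minimal NumPy sketch of such a joint objective is shown below; the weighted-sum form, the weight `alpha`, and the function name `joint_loss` are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def joint_loss(x, x_recon, rul_true, rul_pred, alpha=0.5):
    """Combined objective for simultaneous training: a weighted sum of the
    autoencoder's reconstruction error and the RUL regression error.
    (Illustrative sketch; the paper's actual loss weighting is not specified
    in the abstract.)"""
    recon_mse = np.mean((x - x_recon) ** 2)        # denoising-autoencoder term
    rul_mse = np.mean((rul_true - rul_pred) ** 2)  # RUL prediction term
    return alpha * recon_mse + (1.0 - alpha) * rul_mse
```

Minimizing this single scalar with respect to all parameters at once is what makes the training joint: gradients from the RUL term flow back into the feature extractor, so the learned features serve prediction as well as reconstruction.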