Asset Details
Extensive studies of the neutron star equation of state from the deep learning inference with the observational data augmentation
by Murase, Koichi; Fukushima, Kenji; Fujimoto, Yuki
in Classical and Quantum Gravitation / Computer architecture / Data augmentation / Deep learning / Elementary Particles / Equations of state / High energy physics / Histograms / Inference / Neural networks / Neutron stars / Parameterization / Phase transitions / Physics / Physics and Astronomy / Polynomials / QCD Phenomenology / Quantum Field Theories / Quantum Field Theory / Quantum Physics / Regular Article - Theoretical Physics / Relativity Theory / String Theory / Uncertainty
2021
Journal Article
Overview
Abstract
We discuss deep learning inference of the neutron star equation of state (EoS) using real observational data for the mass and the radius. We make a quantitative comparison between conventional polynomial regression and the neural network approach to the EoS parametrization. To let our deep learning method incorporate observational uncertainties, we augment the training data with noise fluctuations corresponding to those uncertainties. The deduced EoSs can accommodate a weak first-order phase transition, and we make a histogram of likely first-order regions. We also find that our observational data augmentation has the byproduct of taming overfitting. To check the performance improvement from the data augmentation, we set up a toy model, the simplest inference problem of recovering a double-peaked function, and monitor the validation loss. We conclude that data augmentation can be a useful technique for evading overfitting without tuning the neural network architecture, e.g., by inserting dropout.
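The augmentation step the abstract describes can be illustrated with a minimal NumPy sketch. All names, shapes, and numbers below are this editor's illustration under stated assumptions, not the authors' code: each (mass, radius) observation is replicated many times with Gaussian noise scaled by its own observational uncertainty, so the network trains on samples drawn from the measurement distribution rather than on point values.

```python
import numpy as np

def augment_observations(mr_data, sigma, n_copies=100, rng=None):
    """Replicate each (mass, radius) observation n_copies times,
    adding Gaussian noise with per-observation standard deviations.

    mr_data : array of shape (n_obs, 2) -- (M, R) central values
    sigma   : array of shape (n_obs, 2) -- observational uncertainties
    returns : array of shape (n_copies, n_obs, 2)
    """
    rng = np.random.default_rng(rng)
    noise = rng.normal(0.0, 1.0, size=(n_copies,) + mr_data.shape)
    # broadcast: each copy of the data set gets independent noise
    return mr_data[None, :, :] + noise * sigma[None, :, :]

# Toy example (hypothetical values): two stars with (M, R) and errors.
data = np.array([[1.4, 12.0], [2.0, 11.5]])
err = np.array([[0.1, 0.5], [0.05, 0.4]])
augmented = augment_observations(data, err, n_copies=1000, rng=0)
print(augmented.shape)  # (1000, 2, 2)
```

Because every training epoch then sees a different noisy realization of the same observations, the network cannot memorize individual data points, which is the overfitting-taming byproduct noted above.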
Publisher
Springer Berlin Heidelberg, Springer Nature B.V., SpringerOpen