Asset Details
On Normalized Mutual Information: Measure Derivations and Properties
by Kvålseth, Tarald
in association measures / Continuity (mathematics) / mutual information / normalized mutual information / Random variables / similarity measures / Upper bounds / value validity
2017
Journal Article
Overview
Starting with a new formulation for the mutual information (MI) between a pair of events, this paper derives alternative upper bounds and extends those to the case of two discrete random variables. Normalized mutual information (NMI) measures are then obtained from those bounds, emphasizing the use of least upper bounds. Conditional NMI measures are also derived for three different events and three different random variables. Since the MI formulation for a pair of events is always nonnegative, it can properly be extended to include weighted MI and NMI measures for pairs of events or for random variables that are analogous to the well-known weighted entropy. This weighted MI is generalized to the case of continuous random variables. Such weighted measures have the advantage over previously proposed measures of always being nonnegative. A simple transformation is derived for the NMI, such that the transformed measures have the value-validity property necessary for making various appropriate comparisons between values of those measures. A numerical example is provided.
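The abstract describes normalizing mutual information by an upper bound, with least upper bounds preferred. As an illustration only, the sketch below computes MI for two paired discrete samples and normalizes by min(H(X), H(Y)), one conventional upper bound on MI; the paper derives its own alternative bounds and transformed measures, which this sketch does not reproduce.

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy (in bits) of a discrete sample."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def mutual_information(xs, ys):
    """Mutual information (in bits) between two paired discrete samples."""
    n = len(xs)
    joint = Counter(zip(xs, ys))
    px, py = Counter(xs), Counter(ys)
    # MI = sum over (x, y) of p(x, y) * log2( p(x, y) / (p(x) p(y)) )
    return sum((c / n) * log2(c * n / (px[x] * py[y]))
               for (x, y), c in joint.items())

def nmi(xs, ys):
    """NMI using min(H(X), H(Y)) as the normalizing upper bound.

    Note: this is one common normalization, not necessarily the
    least-upper-bound measure derived in the paper.
    """
    bound = min(entropy(xs), entropy(ys))
    return mutual_information(xs, ys) / bound if bound > 0 else 0.0
```

For perfectly dependent samples (e.g. `xs == ys`) this NMI equals 1, and for independent samples it equals 0, matching the usual interpretation of a normalized association measure.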
Publisher
MDPI AG