Article: Fourier-Analysis-Based Form of Normalized Maximum Likelihood: Exact Formula and Relation to Complex Bayesian Prior
| Title | Fourier-Analysis-Based Form of Normalized Maximum Likelihood: Exact Formula and Relation to Complex Bayesian Prior |
|---|---|
| Authors | Suzuki, Atsushi; Yamanishi, Kenji |
| Keywords | Bayesian; complex prior; minimax regret; Normalized maximum likelihood; online learning |
| Issue Date | 2021 |
| Citation | IEEE Transactions on Information Theory, 2021, v. 67, n. 9, p. 6164-6178 |
| Abstract | The normalized maximum likelihood (NML) distribution of a probabilistic model gives the optimal code-length function in the sense of minimax regret. Despite this optimality, the NML distribution is not easy to calculate, and existing efficient methods have focused on its asymptotic behavior or on specific models. This paper gives an efficient way to calculate the NML distribution via an integral over the parameter domain rather than the data domain, showing that the NML distribution is a Bayesian predictive distribution with a complex prior, based on our novel Fourier expansion approach. Our results provide a unified way to calculate the NML distribution for the exponential family and also include a non-asymptotic version of previous work on its asymptotic behavior for general cases. The applications of our methodology include, but are not limited to, the normal, Gamma, Weibull, and von Mises distributions. |
| Persistent Identifier | http://hdl.handle.net/10722/354199 |
| ISSN | 0018-9448 (2023 Impact Factor: 2.2; 2023 SCImago Journal Rankings: 1.607) |
| ISI Accession Number ID | WOS:000690440100030 |
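
For reference, the NML distribution discussed in the abstract is the standard Shtarkov construction; this reminder is not taken from the paper itself. For a parametric model $p(x^n; \theta)$ with maximum likelihood estimator $\hat{\theta}(x^n)$,

$$
p_{\mathrm{NML}}(x^n) \;=\; \frac{p\bigl(x^n; \hat{\theta}(x^n)\bigr)}{\int p\bigl(y^n; \hat{\theta}(y^n)\bigr)\,\mathrm{d}y^n},
$$

and the code length $-\log p_{\mathrm{NML}}(x^n)$ attains the minimax regret

$$
\min_{q}\,\max_{x^n}\Bigl[-\log q(x^n) + \log p\bigl(x^n; \hat{\theta}(x^n)\bigr)\Bigr],
$$

whose optimal value equals the logarithm of the normalizing integral above. The paper's contribution is to evaluate this normalizer exactly by integrating over the parameter domain instead of the data domain, relating the result to a Bayesian predictive distribution with a complex prior.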
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Suzuki, Atsushi | - |
| dc.contributor.author | Yamanishi, Kenji | - |
| dc.date.accessioned | 2025-02-07T08:47:07Z | - |
| dc.date.available | 2025-02-07T08:47:07Z | - |
| dc.date.issued | 2021 | - |
| dc.identifier.citation | IEEE Transactions on Information Theory, 2021, v. 67, n. 9, p. 6164-6178 | - |
| dc.identifier.issn | 0018-9448 | - |
| dc.identifier.uri | http://hdl.handle.net/10722/354199 | - |
| dc.description.abstract | The normalized maximum likelihood (NML) distribution of a probabilistic model gives the optimal code-length function in the sense of minimax regret. Despite this optimality, the NML distribution is not easy to calculate, and existing efficient methods have focused on its asymptotic behavior or on specific models. This paper gives an efficient way to calculate the NML distribution via an integral over the parameter domain rather than the data domain, showing that the NML distribution is a Bayesian predictive distribution with a complex prior, based on our novel Fourier expansion approach. Our results provide a unified way to calculate the NML distribution for the exponential family and also include a non-asymptotic version of previous work on its asymptotic behavior for general cases. The applications of our methodology include, but are not limited to, the normal, Gamma, Weibull, and von Mises distributions. | - |
| dc.language | eng | - |
| dc.relation.ispartof | IEEE Transactions on Information Theory | - |
| dc.subject | Bayesian | - |
| dc.subject | complex prior | - |
| dc.subject | minimax regret | - |
| dc.subject | Normalized maximum likelihood | - |
| dc.subject | online learning | - |
| dc.title | Fourier-Analysis-Based Form of Normalized Maximum Likelihood: Exact Formula and Relation to Complex Bayesian Prior | - |
| dc.type | Article | - |
| dc.description.nature | link_to_subscribed_fulltext | - |
| dc.identifier.doi | 10.1109/TIT.2021.3088304 | - |
| dc.identifier.scopus | eid_2-s2.0-85111005731 | - |
| dc.identifier.volume | 67 | - |
| dc.identifier.issue | 9 | - |
| dc.identifier.spage | 6164 | - |
| dc.identifier.epage | 6178 | - |
| dc.identifier.eissn | 1557-9654 | - |
| dc.identifier.isi | WOS:000690440100030 | - |
