Bayesian learning with multiple priors and nonvanishing ambiguity

Author(s)
Ma, Wei
Contributor(s)
alexander.zimper@up.ac.za
Zimper, Alexander
Keywords
Ambiguity
Bayesian learning
Kullback-Leibler divergence
Ellsberg paradox
Misspecified priors
Berk’s theorem

URI
http://hdl.handle.net/20.500.12424/2454806
Online Access
http://hdl.handle.net/2263/58312
Abstract
The existing models of Bayesian learning with multiple priors by Marinacci (Stat Pap 43:145–151, 2002) and by Epstein and Schneider (Rev Econ Stud 74:1275–1303, 2007) formalize the intuitive notion that ambiguity should vanish through statistical learning in a one-urn environment. Moreover, the multiple priors decision maker of these models will eventually learn the “truth.” To accommodate nonvanishing violations of Savage’s (The foundations of statistics, Wiley, New York, 1954) sure-thing principle, as reported in Nicholls et al. (J Risk Uncertain 50:97–115, 2015), we construct and analyze a model of Bayesian learning with multiple priors for which ambiguity does not necessarily vanish in a one-urn environment. Our decision maker only forms posteriors from priors that survive a prior selection rule, which discriminates, with probability one, against priors whose expected Kullback–Leibler divergence from the “truth” is too far from the minimal expected Kullback–Leibler divergence over all priors. The “stubbornness” parameter of our prior selection rule thereby governs how much ambiguity will remain in the limit of our learning model.
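
As a rough illustration of the selection rule described in the abstract, the following Python snippet sketches the idea under simplifying assumptions; it is not the paper's construction. The paper works with the expected Kullback–Leibler divergence under the data-generating process, while the sketch substitutes the empirical frequency of past draws for the “truth,” and a simple tolerance threshold stands in for the “stubbornness” parameter. All names in the snippet are hypothetical.

import numpy as np

def kl_divergence(p, q):
    # Kullback-Leibler divergence D(p || q) for discrete distributions
    # with matching support; terms with p[i] == 0 contribute nothing.
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def select_priors(priors, empirical, tolerance):
    # Keep only priors whose divergence from the empirical distribution
    # lies within `tolerance` of the minimal divergence over all candidate
    # priors; posteriors are then formed from the surviving set only.
    divergences = [kl_divergence(empirical, prior) for prior in priors]
    d_min = min(divergences)
    return [prior for prior, d in zip(priors, divergences)
            if d <= d_min + tolerance]

# Two-color urn example: the empirical frequency of past draws stands
# in for the "truth". With tolerance = 0 only the closest prior survives
# (ambiguity vanishes); a larger tolerance lets several priors survive,
# so ambiguity can persist in the limit.
priors = [np.array([0.5, 0.5]), np.array([0.6, 0.4]), np.array([0.9, 0.1])]
empirical = np.array([0.55, 0.45])
print(select_priors(priors, empirical, tolerance=0.05))

With these illustrative numbers the first two priors survive while the third is screened out, mirroring how a larger stubbornness parameter leaves more priors, and hence more ambiguity, in the limit.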
Journal
http://link.springer.com/journal/199
2017-10-31
hb2016
Economics
Date
2016-11-30
Type
Postprint Article
Identifier
oai:repository.up.ac.za:2263/58312
Citation
Zimper, A. & Ma, W. Bayesian learning with multiple priors and nonvanishing ambiguity. Economic Theory (2016). doi:10.1007/s00199-016-1007-y. NYP.
ISSN
0938-2259 (print)
1432-0479 (online)
DOI
10.1007/s00199-016-1007-y
http://hdl.handle.net/2263/58312
Copyright/License
© Springer-Verlag Berlin Heidelberg 2016. The original publication is available at: http://link.springer.com/journal/199.
Collections
OAI Harvested Content
