Chapter 20 - Value-Based Decision Making
Cited by (23)
Value-free reinforcement learning: policy optimization as a minimal model of operant behavior
2021, Current Opinion in Behavioral Sciences
Citation excerpt: "Value — as typically defined — is therefore a latent construct that scaffolds choice behavior by providing a common currency for comparison of different actions (often loosely identified with the similar economic concept of utility [8,9]). In support of the hypothesis that choice behavior is supported by the computation of value, proponents frequently note (e.g. [10]) that reward-related dopaminergic neural activity is consistent with a class of value-learning algorithms from the field of reinforcement learning (RL) [11,12]. This argument implicitly assumes that different theoretical frameworks each use the term 'value' in the same way."
An amygdala-cingulate network underpins changes in effort-based decision making after a fitness program
2019, NeuroImage
Citation excerpt: "The scatterplot (Fig. 3C) revealed a possible non-linear monotonic relationship between the variables, which was confirmed by Spearman's test: ρ = −0.6491, p = 0.0026; N = 19. According to the theory of value-based decision making, agents subjectively assign values to the options presented, and select the one with the highest utility (Glimcher, 2013). Subjective valuation depends on the rewarding features of the object, but also on the intrinsic costs that the subject has to overcome to obtain it (Wallis and Rushworth, 2013)."
Feedback trading: a review of theory and empirical evidence
2023, Review of Behavioral Finance

Prosocial learning: Model-based or model-free?
2023, PLoS ONE

Pathways to the persistence of drug use despite its adverse consequences
2023, Molecular Psychiatry

Neurobiological substrates of the dread of future losses
2023, Cerebral Cortex