• How do you filter signal from noise and dynamically modify your filters?

  • Sine-wave speech priming perception - an example of generative models playing a role in top-down processing.

  • Cognitive machinery continually estimates and re-estimates the uncertainty in its predictions to decide how much to rely on the senses vs. top-down knowledge.

  • On a foggy day, visual prediction error is given less weight than on a bright and sunny day: the impact of a prediction error is attenuated according to the precision we expect from the sensory input.
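  • A minimal numeric sketch of that weighting (all numbers hypothetical): the same visual report shifts the belief a lot when vision is expected to be precise, and only slightly in fog.

```python
def posterior(prior_mu, prior_prec, obs, obs_prec):
    """Fuse a prior belief with an observation, weighting by expected precision."""
    w = obs_prec / (prior_prec + obs_prec)  # share of trust given to the senses
    return prior_mu + w * (obs - prior_mu)  # belief nudged by weighted prediction error

prior_mu, prior_prec = 10.0, 1.0  # predicted distance to an object (arbitrary units)
obs = 14.0                        # what vision reports

sunny = posterior(prior_mu, prior_prec, obs, obs_prec=9.0)   # clear day: vision precise
foggy = posterior(prior_mu, prior_prec, obs, obs_prec=0.25)  # fog: error down-weighted
print(sunny, foggy)
```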

  • Attention means putting a higher value on the prediction error for the thing we’re attending to. There can be both an error-enhancement and an error-suppression effect, depending on what we’re attending to.
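  • One way to sketch this (gain values are made up for illustration): attention acts as a multiplicative gain on precision-weighted error, so the same raw error is amplified when attended and suppressed when ignored.

```python
def weighted_error(obs, pred, precision, gain):
    """Prediction error scaled by expected precision and by attentional gain."""
    return gain * precision * (obs - pred)

# Identical raw error (obs - pred = 1.0); only the attentional gain differs.
attended = weighted_error(obs=2.0, pred=1.0, precision=0.5, gain=4.0)   # enhanced
ignored = weighted_error(obs=2.0, pred=1.0, precision=0.5, gain=0.1)   # suppressed
print(attended, ignored)
```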

  • It seems hard to attend to something for a long period when the attention generates no new information.

  • Precision estimates on sensory data help us balance and react to the data better (e.g. in an environment where sight seems precise but audio does not, we discount some of the audio and rely more heavily on sight).
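  • This balancing can be sketched as inverse-variance (precision-weighted) cue combination, the standard Bayes-optimal fusion rule; the estimates and precisions below are hypothetical:

```python
def fuse(cues):
    """Precision-weighted fusion of (estimate, precision) pairs from several senses."""
    total_prec = sum(prec for _, prec in cues)
    return sum(mu * prec for mu, prec in cues) / total_prec

sight = (3.0, 10.0)  # precise visual estimate of an event's location
sound = (8.0, 0.5)   # noisy auditory estimate, largely discounted
combined = fuse([sight, sound])
print(combined)  # lands much closer to the visual estimate
```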

  • The idea is introduced that generative predictive models come with a game plan for how best to reduce uncertainty using your senses.

  • Low-level salience maps don’t seem to describe our behavior in real situations; combining them with top-down predictive models does better (e.g. looking at where the ball will be in baseball).

  • PP treats action, perception, and attention as forming a single mechanism for integrating bottom-up sensory data with top-down predictions.

  • Side note: thinking about how magicians manipulate our expectations and our uncertainty-reducing routines to perform their tricks.

  • Active agents are driven to sample the world to confirm their own perceptual hypotheses.

  • Confirmation bias falls directly out of this model.

  • Schizophrenic hallucinations and delusions might represent a breakdown of the machinery that determines how reliable sensory data is versus how reliable our predictions are. False perceptions and bizarre beliefs can become self-reinforcing as hyperpriors are reshaped.

  • From the wiki on Predictive coding:

    • A comparison between predictions (priors) and sensory input (likelihood) yields a difference measure (e.g. prediction error, free energy, or surprise) which, if it is sufficiently large beyond the levels of expected statistical noise, will cause the generative model to update so that it better predicts sensory input in the future.

    • If, instead, the model accurately predicts driving sensory signals, activity at higher levels cancels out activity at lower levels, and the posterior probability of the model is increased. Thus, predictive coding inverts the conventional view of perception as a mostly bottom-up process, suggesting that it is largely constrained by prior predictions, where signals from the external world only shape perception to the extent that they are propagated up the cortical hierarchy in the form of prediction error.
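    • A toy version of that update loop (a sketch under simple assumptions, not any specific published model): a linear generative model predicts the input top-down, and the residual prediction error drives both fast inference (revising the hypothesis) and slow learning (revising the model), so the error propagated upward shrinks as predictions improve.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.array([1.0, 0.5, -0.3, 0.2])  # sensory input to be explained
W = 0.1 * rng.normal(size=(4, 2))    # generative weights: top-down prediction is W @ v
v = np.zeros(2)                      # latent cause (the model's current hypothesis)

lr_infer, lr_learn = 0.1, 0.05
for _ in range(300):
    error = x - W @ v                   # prediction error: input minus prediction
    v += lr_infer * W.T @ error         # inference: revise hypothesis to cancel error
    W += lr_learn * np.outer(error, v)  # learning: slowly reshape the model itself

# When the model predicts well, the residual error (the signal that would be
# passed up the hierarchy) shrinks toward zero.
print(np.linalg.norm(x - W @ v))
```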

  • Useful exercise: try to draw the Feynman-machine-like boxes proposed by predictive processing.

  • TODO: find papers on implementation of predictive coding