How does in-context learning in models like GPT-3 relate to implicit Bayesian inference?
A) In-context learning involves explicitly training the model on each task before it can perform it, similar to Bayesian updating.
B) In-context learning allows the model to perform tasks by treating the in-context examples as evidence for inferring the underlying task, akin to how Bayesian inference updates probabilities based on new evidence.
C) In-context learning requires the model to store all possible outcomes and select the correct one, unlike Bayesian inference.
D) In-context learning ignores prior examples and only focuses on the current input, which is opposite to Bayesian inference.
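The analogy in option B can be made concrete with a toy sketch (an assumed setup for illustration, not GPT-3's actual mechanism): maintain a prior over candidate tasks and update it with each in-context example, exactly as Bayes' rule prescribes.

```python
# Toy illustration of implicit Bayesian task inference.
# Two hypothetical candidate "tasks": copy the label, or negate it.
tasks = {
    "copy":   lambda x: x,
    "negate": lambda x: {"pos": "neg", "neg": "pos"}[x],
}
prior = {name: 0.5 for name in tasks}  # uniform prior over tasks

# In-context examples (input, output), all consistent with the "copy" task.
examples = [("pos", "pos"), ("neg", "neg"), ("pos", "pos")]

posterior = dict(prior)
for x, y in examples:
    # Likelihood: 1 if the task explains the example, small epsilon otherwise.
    likelihoods = {name: (1.0 if f(x) == y else 0.01) for name, f in tasks.items()}
    unnormalized = {name: posterior[name] * likelihoods[name] for name in tasks}
    z = sum(unnormalized.values())
    posterior = {name: v / z for name, v in unnormalized.items()}

print(posterior)  # posterior mass concentrates on "copy"
```

After three consistent examples the posterior concentrates almost entirely on the "copy" task, mirroring how a model's in-context predictions sharpen as more demonstrations are provided.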