Pre-Flight Briefing

In-Context Learning

While large language models demonstrate remarkable zero-shot capabilities, they still fall short on more complex tasks. Few-shot prompting enables 'in-context learning' by providing demonstrations in the prompt to steer the model.

These demonstrations act as conditioning: the model picks up the task pattern from the context itself, with no weight updates. You can even teach a model a completely new concept or a made-up word by providing just a single example (1-shot prompting).
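A minimal sketch of assembling a few-shot prompt as a plain string; `make_few_shot_prompt` and the `Input:`/`Output:` labels are illustrative choices, not part of any particular API, and the actual model call is omitted:

```python
def make_few_shot_prompt(demonstrations, query):
    """Join (input, output) demonstrations, then leave the final
    Output: blank so the model completes it in-context."""
    lines = [f"Input: {x}\nOutput: {y}" for x, y in demonstrations]
    lines.append(f"Input: {query}\nOutput:")
    return "\n\n".join(lines)

# A 1-shot prompt: a single demonstration conditions the model.
demos = [("The movie was great!", "positive")]
prompt = make_few_shot_prompt(demos, "I hated every minute.")
print(prompt)
```

The trailing `Output:` is the cue for the completion; whatever text you send afterward is generated in that same format.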

Research on in-context learning shows that the format of your demonstrations plays a key role in performance. Surprisingly, keeping the format and label space consistent across examples often matters more than whether each individual label is actually correct!

Reference Examples

1-Shot Example (Teaching a New Word)

A 'whatpu' is a furry animal. Example: We saw very cute whatpus.
To do a 'farduddle' means to jump fast. Example:
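The 1-shot example above can be assembled programmatically. A sketch using plain strings; the model call itself is omitted since no particular API is specified:

```python
# One demonstration teaches how to use a made-up word in a sentence;
# the prompt then ends mid-pattern so the model completes "Example:"
# for the second made-up word.
demonstration = (
    "A 'whatpu' is a furry animal. "
    "Example: We saw very cute whatpus."
)
query = "To do a 'farduddle' means to jump fast. Example:"
prompt = demonstration + "\n" + query
print(prompt)
```

Sending this prompt to a model should yield a sentence using 'farduddle' in the same style as the whatpu demonstration.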