Bay.Area.AI: DSPy: Prompt Optimization for LM Programs, Michael Ryan

  • Published 20 Oct 2024
  • DSPy: Prompt Optimization for LM Programs
    Michael Ryan, Stanford
    It has never been easier to build amazing LLM-powered applications. Unfortunately, engineering reliable and trustworthy LLM systems remains challenging. Instead, practitioners should build LM Programs composed of several composable calls to LLMs, which can be rigorously tested, audited, and optimized like other software systems. In this talk I will introduce the idea of LM Programs in DSPy: the library for programming - not prompting - LMs. I will demonstrate how the LM Program abstraction enables automatic optimizers that can tune both the prompts and the weights in an LM Program. I will conclude with an introduction to MIPROv2: our latest and highest-performing prompt optimization algorithm for LM Programs.
    Michael Ryan is a master's student at Stanford University working on optimization for Language Model Programs in DSPy and on personalizing language models. His work has been recognized with a Best Social Impact award at ACL 2024 and an honorable mention for outstanding paper at ACL 2023. Michael co-led the creation of the MIPRO and MIPROv2 optimizers, DSPy's most performant optimizers for Language Model Programs. His prior work has showcased unintended cultural and global biases expressed in popular LLMs. He is currently a research intern at Snowflake.
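
    The abstract's core idea - an LM Program as composable LM calls whose prompts can be optimized automatically against a metric - can be sketched in plain Python. This is a hypothetical, library-free illustration, not DSPy's actual API: `StubLM`, `Module`, and `optimize_prompt` are invented names, and the stub model simply rewards instructions containing "step" so the optimization loop has something to find.

    ```python
    class StubLM:
        """Stands in for a real LLM: in this toy, it answers 'correctly'
        (uppercases the input) only when the instruction mentions 'step'."""
        def __call__(self, instruction: str, question: str) -> str:
            if "step" in instruction:
                return question.upper()
            return question

    class Module:
        """One composable LM call with a swappable instruction (prompt),
        analogous to a module in an LM Program."""
        def __init__(self, lm, instruction: str):
            self.lm = lm
            self.instruction = instruction

        def __call__(self, question: str) -> str:
            return self.lm(self.instruction, question)

    def optimize_prompt(module, candidates, devset):
        """MIPROv2-style idea in miniature: score each candidate
        instruction on a labeled dev set and keep the best one."""
        def score(instr):
            module.instruction = instr
            return sum(module(q) == gold for q, gold in devset)
        best = max(candidates, key=score)
        module.instruction = best
        return best

    lm = StubLM()
    program = Module(lm, instruction="Answer the question.")
    devset = [("abc", "ABC"), ("xy", "XY")]
    candidates = ["Answer the question.",
                  "Think step by step, then answer."]
    best = optimize_prompt(program, candidates, devset)
    print(best)             # the step-by-step instruction scores higher
    print(program("hello")) # HELLO
    ```

    The real MIPROv2 works over multi-module programs and proposes candidate instructions and demonstrations with an LM rather than from a fixed list, but the structure - propose, score on a dev set, keep the best - is the same loop.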

COMMENTS • 1

  • @jeyakumarc3628
    An hour ago

    It is very interesting to see how DSPy is working towards prompt programming. I feel that prompt engineering is a painful task across different AI models, and hard to maintain.