
Getting started with Prompty

An implementation with LangChain

Valentina Alto
8 min read · Sep 27, 2024


When it comes to GenAI-powered applications, it is increasingly evident that the system message, or metaprompt, is a core component to design and develop.

We can define the metaprompt as the set of instructions we provide to the LLM. It can be seen as a kind of config.json file, but written in natural language. The metaprompt differs from what we simply call the prompt: while the latter refers to the user's query, the metaprompt is something the end user doesn't see; it is a backend configuration set before the application is deployed.
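To make the distinction concrete, here is a minimal sketch in plain Python (no framework) of how a metaprompt and a prompt are combined; the `METAPROMPT` text and the `build_messages` helper are illustrative names, not part of any library:

```python
# The metaprompt: a backend configuration fixed before deployment,
# invisible to the end user (hypothetical example text).
METAPROMPT = "You are a polite assistant. Answer only questions about cooking."


def build_messages(user_prompt: str) -> list[dict]:
    """Combine the hidden metaprompt with the user's visible prompt
    into the message list that chat-style LLM APIs expect."""
    return [
        {"role": "system", "content": METAPROMPT},  # set by the developer
        {"role": "user", "content": user_prompt},   # supplied at runtime
    ]


messages = build_messages("How do I poach an egg?")
```

The key point is the separation of concerns: the system entry is versioned and deployed with the application, while the user entry changes on every request.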

As a core component of an AI app, the metaprompt should have the following features:

  • it should be secured;
  • it should follow a standard specification across different AI apps so that it is portable;
  • it should be observable and easily manageable;
  • it should be updated frequently to adapt to new model versions or types.

While we covered the security aspect in a previous article, here we are going to focus on how to make it easier to standardize the way we…




Written by Valentina Alto

Data&AI Specialist at @Microsoft | MSc in Data Science | AI, Machine Learning and Running enthusiast
