LangChain for LLM Application Development
https://learn.deeplearning.ai/langchain
Models, Prompts, and Parsers
LangChain provides convenience wrappers over OpenAI models. Prompt templates let you wrap your prompts so that even complex prompts can be reused. Output parsers let you define the format of the output you want to extract from the OpenAI response.
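A rough sketch of how the three pieces fit together, using the classic `langchain` API covered in the course (the template text and the `sentiment` schema here are invented for illustration):

```python
from langchain.chat_models import ChatOpenAI
from langchain.prompts import ChatPromptTemplate
from langchain.output_parsers import ResponseSchema, StructuredOutputParser

chat = ChatOpenAI(temperature=0.0)

# A reusable prompt template with placeholder variables.
template = ChatPromptTemplate.from_template(
    "Extract the following from the review and reply in the requested format.\n"
    "{format_instructions}\n"
    "Review: {review}"
)

# Describe the fields you want back; the parser turns this into format instructions.
schemas = [ResponseSchema(name="sentiment", description="positive or negative")]
parser = StructuredOutputParser.from_response_schemas(schemas)

messages = template.format_messages(
    review="The blender arrived quickly and works great.",
    format_instructions=parser.get_format_instructions(),
)
response = chat(messages)
result = parser.parse(response.content)  # e.g. {"sentiment": "positive"}
```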
Memory
LLMs are stateless.
Each transaction is independent.
The entire conversation has to be provided as context every single time.
ConversationBufferMemory stores the entire conversation in a buffer. Every new message is added to the buffer.
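A minimal sketch of wiring this into a ConversationChain (the example exchange is made up):

```python
from langchain.chat_models import ChatOpenAI
from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory

llm = ChatOpenAI(temperature=0.0)
memory = ConversationBufferMemory()
conversation = ConversationChain(llm=llm, memory=memory, verbose=True)

conversation.predict(input="Hi, my name is Satyajeet.")
conversation.predict(input="What is my name?")  # the full buffer is sent, so the model can answer

print(memory.buffer)  # the complete transcript stored so far
```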
It is possible to use ConversationBufferWindowMemory instead. This way only the last k exchanges are kept in the context memory.
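For example, continuing with the same llm as above:

```python
from langchain.memory import ConversationBufferWindowMemory

# Keep only the most recent exchange (k=1); older turns are dropped from the context.
window_memory = ConversationBufferWindowMemory(k=1)
conversation = ConversationChain(llm=llm, memory=window_memory)
```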
There is also ConversationTokenBufferMemory, which limits the buffer by token count. This can help control spending directly, since pricing is based on tokens.
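Roughly (the token limit here is arbitrary):

```python
from langchain.memory import ConversationTokenBufferMemory

# Cap the buffer at 100 tokens; the memory needs the llm to count tokens.
token_memory = ConversationTokenBufferMemory(llm=llm, max_token_limit=100)
conversation = ConversationChain(llm=llm, memory=token_memory)
```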
There is also ConversationSummaryBufferMemory, which keeps recent messages verbatim and uses the LLM to maintain a summary of the older parts of the conversation over time.
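A sketch, again with an arbitrary token limit:

```python
from langchain.memory import ConversationSummaryBufferMemory

# Turns beyond the token limit are compressed into a running summary by the LLM.
summary_memory = ConversationSummaryBufferMemory(llm=llm, max_token_limit=100)
conversation = ConversationChain(llm=llm, memory=summary_memory)
```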
Chains