Cold Email Automation with LangChain, Lemlist & Apify
- Published 1 Jun 2024
- In this video, I’m going to show you how you can use the language-modeling framework LangChain to feed personalized lines, paragraphs, or entire emails to the cold email platform Lemlist. This lets you boost both the level of automation and the level of personalization in your B2B cold email marketing campaigns.
Link to the code:
github.com/rabbitmetrics/cold...
▬▬▬▬▬▬ V I D E O C H A P T E R S & T I M E S T A M P S ▬▬▬▬▬▬
0:00 Introduction and overview
1:25 Apify and OpenAI API keys
2:25 The code
5:19 Lemlist API integration
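The idea described above is to generate a personalized line with an LLM and hand it to Lemlist as a custom variable on each lead. As a minimal sketch of that hand-off, the payload below shows how a generated icebreaker could be attached to a lead; the field names and the `icebreaker` variable are assumptions for illustration, not taken from the video or from Lemlist's documented schema.

```python
import json

def build_lead_payload(first_name: str, company: str, icebreaker: str) -> dict:
    """Build the JSON body for adding one Lemlist lead with a personalized line.

    The "icebreaker" key is a hypothetical custom variable that the email
    template would reference as {{icebreaker}}.
    """
    return {
        "firstName": first_name,
        "companyName": company,
        "icebreaker": icebreaker,
    }

payload = build_lead_payload("Ada", "Acme", "Loved your post on B2B outreach.")
print(json.dumps(payload))
```

In the actual pipeline, the icebreaker string would come from a LangChain chain summarizing the prospect's website, and the payload would be POSTed to the Lemlist campaign-leads endpoint.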
i recently found your channel and have really been impressed. you have such great production quality and really informative videos! keep it up!!
Excellent content! Thank you!
You're welcome! Thanks for watching
Great stuff!
I think that the Vicuna 14B model may be a good replacement for GPT-4
Great job! 😎
Could you please create video tutorials on how to create a custom agent in LangChain? I noticed that the previous tutorial you provided on customizing LangChain was a bit challenging for me as a beginner. It involved the use of Python classes, which made it difficult for me to comprehend. I greatly appreciate all your other videos, as they are highly comprehensive and easy to understand, except for the custom agent tutorial. Therefore, I kindly request you to consider making a new video on this topic.
Thank you.
You're welcome!
Cool! I wonder how hard it will be to get rid of GPT-4 and use a local alternative on a GPU-powered server.
Same, someone has to break this chain.
Thanks for watching - planning a video on this
How can you do this for 50 businesses at once?
You would need to use the Lemlist API. Depending on volume you might want to start off with a lower level of personalization. Keep an eye on OpenAI costs and look into going open source with a model from Hugging Face. LangChain makes switching out the language model easy.
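The answer above suggests looping over businesses via the Lemlist API. A hedged sketch of what that batch step could look like: `generate_line` is a stand-in for the LangChain/LLM call, and the lead fields are illustrative assumptions rather than the video's exact schema.

```python
def generate_line(company: str) -> str:
    # Placeholder for the LangChain chain that would summarize the
    # company's website and write a personalized opener.
    return f"Saw what {company} is doing - impressive work."

def build_leads(companies: list[dict]) -> list[dict]:
    """Build one lead payload per company, each with its own icebreaker."""
    leads = []
    for c in companies:
        leads.append({
            "email": c["email"],
            "companyName": c["name"],
            "icebreaker": generate_line(c["name"]),
        })
    return leads

leads = build_leads([
    {"email": "a@acme.com", "name": "Acme"},
    {"email": "b@globex.com", "name": "Globex"},
])
print(len(leads))  # prints 2
```

In practice each lead would then be POSTed to the Lemlist campaign endpoint one at a time, with rate limiting and a running tally of OpenAI token costs.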
1. The source code provided on GitHub is missing the loader shown in the video.
Here is the code for the missing loader, which should come after the crawl_input cell:
loader = apify.call_actor(
    actor_id="apify/website-content-crawler",
    run_input=crawl_input,
    dataset_mapping_function=lambda item: Document(
        page_content=item["text"] or "", metadata={"source": item["url"]}
    ),
)
2. It would be easier to set the environment variables IN THE CODE. That approach is operating-system independent and easier to explain to the public in a video.
This is done by adding a cell after the first import of libraries:
os.environ["OPENAI_API_KEY"] = "..."
os.environ["APIFY_API_TOKEN"] = "..."
where the "..." should be replaced by the actual keys from Apify and OpenAI.
Even when all the corrections are made, the code still returns an empty "docs".
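Two common causes of an empty docs list here: loader.load() was never called on the loader that call_actor returns, or every crawled item had an empty "text" field, which the mapping lambda turns into an empty page_content. A minimal way to check for the second case; the Document class below is a stand-in mirroring LangChain's two-field Document so the snippet runs without LangChain installed.

```python
from dataclasses import dataclass, field

@dataclass
class Document:
    """Stand-in for LangChain's Document (page_content + metadata)."""
    page_content: str
    metadata: dict = field(default_factory=dict)

def non_empty(docs: list[Document]) -> list[Document]:
    """Keep only documents whose page_content has real text."""
    return [d for d in docs if d.page_content.strip()]

docs = [
    Document(""),  # a crawled item with no "text" maps to this
    Document("About Acme", {"source": "https://acme.com"}),
]
kept = non_empty(docs)
print(len(kept))  # prints 1
```

If non_empty on the real loader output drops everything, the crawl itself returned no text, and the run_input (start URLs, crawl depth) is the place to look.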