Free course “LangChain: Chat with Your Data”

“LangChain: Chat with Your Data” is a new free short course by Harrison Chase, CEO of LangChain, on how to use an LLM to chat with your own data. The course, which is about an hour long, is hosted on Andrew Ng’s DeepLearning.AI platform.

After playing with ChatGPT or any other LLM (large language model), what’s the next logical step? Getting it to work with your own data. LLMs are pre-trained on the vast amounts of text available online, so making one answer questions based on your documents or emails, for example, unlocks much more of its potential.

Note that this is not the same as training or fine-tuning the LLM on your own domain-specific data, such as medical diagnostics or financial records. With the chat-with-your-data approach, the model still relies on its pre-trained knowledge, so it may not give the best answers to questions that are very different from what it already knows.

However, for day-to-day use this approach works well: it can handle your personal data, your company’s proprietary documents, and data or articles written after the LLM’s training cutoff with a high rate of success.

In fact, for such specialized cases a dedicated LLM fine-tuning course was recently released on the same platform.

LangChain is a library that abstracts away the details of specific LLM providers, so with a single library you can communicate with different LLMs, such as those from OpenAI or Hugging Face.
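To illustrate, here is a minimal sketch of what that abstraction looks like in practice. It assumes the 0.0.x-era `langchain` package used in the course, with the relevant API keys set in your environment; the model names are just examples, not prescribed by the course:

```python
from langchain.chat_models import ChatOpenAI
from langchain.llms import HuggingFaceHub

# Both wrappers expose the same LangChain interface, so the rest of
# your chain does not care which provider sits behind it.
openai_llm = ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0)
hf_llm = HuggingFaceHub(repo_id="google/flan-t5-large")  # example model

for llm in (openai_llm, hf_llm):
    # predict() is part of the shared interface in the 0.0.x releases
    print(llm.predict("Summarize LangChain in one sentence."))
```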

Throughout the course, video tutorials are accompanied by a Jupyter notebook containing relevant Python code that you can work with in real time.

From the course, you will learn about:

  • Document Loading: Explore the basics of data loading and the more than 80 loaders LangChain provides for accessing a variety of data sources, including audio and video.
  • Document Splitting: Discover best practices and considerations for splitting documents into chunks.
  • Vector Stores and Embeddings: Dive into the concept of embeddings and explore vector store integrations in LangChain.
  • Retrieval: Learn best practices for accessing and indexing data in a vector store so you can retrieve the most relevant information for a semantic query.
  • Question Answering: Build a one-pass solution to answer questions over your documents.
  • Chatbots: Learn how to track and select relevant information from conversations and data sources, and build your own chatbot with LangChain; a minimal end-to-end sketch follows this list.

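As a taste of what the course notebooks walk through, here is a hedged end-to-end sketch of that pipeline. It assumes the 0.0.x-era `langchain` package plus `chromadb` and `pypdf` installed and an OpenAI API key in the environment; the file name and question are placeholders, not taken from the course material:

```python
from langchain.document_loaders import PyPDFLoader
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import Chroma
from langchain.chat_models import ChatOpenAI
from langchain.chains import ConversationalRetrievalChain
from langchain.memory import ConversationBufferMemory

# 1. Document loading: read a local PDF (placeholder path)
docs = PyPDFLoader("my_notes.pdf").load()

# 2. Document splitting: break it into overlapping chunks
splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=150)
splits = splitter.split_documents(docs)

# 3. Embeddings + vector store: index the chunks in Chroma
vectordb = Chroma.from_documents(splits, embedding=OpenAIEmbeddings())

# 4-6. Retrieval, question answering, and chat with conversation memory
memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)
qa = ConversationalRetrievalChain.from_llm(
    ChatOpenAI(temperature=0),
    retriever=vectordb.as_retriever(),
    memory=memory,
)

print(qa({"question": "What are the main topics of these notes?"})["answer"])
```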
In total, the course takes approximately one hour to complete: fast, to the point, and covering a very useful LLM use case.
