“LangChain: Chat with Your Data” is a new free short course by Harrison Chase, CEO of LangChain, on how to use an LLM to chat with your own data. The course, which is about an hour long, is hosted on Andrew Ng’s DeepLearning.AI platform.
After playing with ChatGPT or another LLM (large language model), the next logical step is to customize it to work with your own data. LLMs are pre-trained on vast amounts of text from the internet, so getting one to answer questions based on your own documents or emails, for example, unlocks far more of its potential.
Note that this is not the same as fine-tuning an LLM on your own domain-specific data, such as medical diagnostics or financial records. With the chat-with-your-data approach, the model still relies on its pre-trained knowledge, so it may not answer well when your questions fall far outside what it already knows.
However, for day-to-day use this approach works well: it can handle your personal data, your company’s proprietary documents, and data or articles written after your LLM’s training, with a high rate of success.
In fact, for such specialized cases, a dedicated LLM fine-tuning course was recently released on the same platform.
LangChain is a library that abstracts away the details of specific LLM providers, so with a single library we can communicate with different LLMs, such as models from OpenAI or Hugging Face.
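The provider-abstraction idea can be sketched in plain Python. This is a conceptual illustration only, not LangChain’s actual classes: application code depends on one interface, and the provider backends (both fakes here) are interchangeable behind it.

```python
from abc import ABC, abstractmethod

# Conceptual sketch of provider abstraction (NOT LangChain's real API):
# the application talks to one interface; backends are swappable.
class LLM(ABC):
    @abstractmethod
    def generate(self, prompt: str) -> str: ...

class FakeOpenAI(LLM):
    # Stand-in for an OpenAI-backed model.
    def generate(self, prompt: str) -> str:
        return f"[openai-style answer to: {prompt}]"

class FakeHuggingFace(LLM):
    # Stand-in for a Hugging Face-backed model.
    def generate(self, prompt: str) -> str:
        return f"[hf-style answer to: {prompt}]"

def answer(llm: LLM, question: str) -> str:
    # Application code is written once against the interface,
    # regardless of which provider is plugged in.
    return llm.generate(question)

print(answer(FakeOpenAI(), "What is LangChain?"))
print(answer(FakeHuggingFace(), "What is LangChain?"))
```

Swapping providers then means changing one constructor call, not the application logic.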
Throughout the course, video tutorials are accompanied by a Jupyter notebook containing relevant Python code that you can work with in real time.
From the course, you will learn about:
- Document Loading: Explore the basics of data loading and learn about the more than 80 unique loaders LangChain provides to access a variety of data sources, including audio and video.
- Document Splitting: Discover best practices and considerations for splitting documents into chunks.
- Vector Stores and Embeddings: Dive into the concept of embeddings and explore the integration of vector stores in LangChain.
- Retrieval: Learn best practices for accessing and indexing data in a vector store, allowing you to retrieve the information most relevant to a semantic query.
- Answering Questions: Build a one-pass solution to answer questions.
- Chatbots: Learn how to track and select relevant information from conversations and data sources, and build your own chatbot using LangChain.
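The modules above map onto a single retrieval pipeline: load a document, split it into chunks, embed the chunks into a vector store, then retrieve the chunks most similar to a query. Below is a toy, self-contained sketch of those steps in plain Python; everything is hand-rolled for illustration (it is not LangChain’s API, and the bag-of-words “embedding” stands in for a real embedding model).

```python
import math
import re
from collections import Counter

def split_document(text: str) -> list[str]:
    # Naive sentence splitter standing in for LangChain's text splitters.
    return [s.strip() + "." for s in text.split(".") if s.strip()]

def embed(text: str) -> Counter:
    # Bag-of-words counts as a stand-in for a dense embedding vector.
    return Counter(re.findall(r"\w+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse "embeddings".
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(store: list[str], query: str, k: int = 1) -> list[str]:
    # Rank stored chunks by similarity to the query, like a vector store.
    q = embed(query)
    return sorted(store, key=lambda c: cosine(embed(c), q), reverse=True)[:k]

doc = "LangChain loads documents. It splits them into chunks. Chunks are embedded and stored."
store = split_document(doc)  # our tiny "vector store"
top = retrieve(store, "which step splits documents into chunks")[0]
# top == "It splits them into chunks."
```

A real setup replaces each stand-in with a LangChain component (a loader, a text splitter, an embedding model, a vector store) and passes the retrieved chunks to an LLM to compose the final answer, which is exactly what the question-answering and chatbot modules cover.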
In total, the course takes approximately one hour to complete. It is fast, to the point, and covers a very useful LLM use case.