
Train an LLM on your own content

Return to LLM PC

Train it on 500 Sutras, Tantras, and Shastras

https://www.manning.com/liveproject/qa-using-vector-databases

Prepare your data, create a vector store to embed your documents, and then use LangChain to combine that store with an LLM.
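
A minimal sketch of that workflow, assuming LangChain with OpenAI embeddings and a local FAISS index; the file name, model, and chunking parameters are illustrative placeholders, not the liveproject's actual setup.

# Sketch: chunk your own documents, embed them into a vector store,
# and wire the retriever to an LLM with LangChain.
from langchain_community.document_loaders import TextLoader
from langchain_text_splitters import RecursiveCharacterTextSplitter
from langchain_community.vectorstores import FAISS
from langchain_openai import OpenAIEmbeddings, ChatOpenAI
from langchain.chains import RetrievalQA

# 1. Prepare the data: load and split your own content (path is hypothetical).
docs = TextLoader("my_sutras.txt").load()
chunks = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100).split_documents(docs)

# 2. Create the vector store: embed each chunk and index it.
store = FAISS.from_documents(chunks, OpenAIEmbeddings())

# 3. Combine the retriever with an LLM for question answering.
qa = RetrievalQA.from_chain_type(llm=ChatOpenAI(model="gpt-4o-mini"),
                                 retriever=store.as_retriever())
print(qa.invoke({"query": "What do my documents say about the Heart Sutra?"}))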

LM Studio: discover, download, and run local LLMs

https://lmstudio.ai
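
LM Studio can also serve the loaded model over an OpenAI-compatible local HTTP API (default port 1234), so a few lines of Python are enough to query it; the model name below is a placeholder for whatever model you have loaded.

# Sketch: query a model served locally by LM Studio's OpenAI-compatible server.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")  # key is ignored locally
resp = client.chat.completions.create(
    model="local-model",  # placeholder; use the identifier of the model loaded in LM Studio
    messages=[{"role": "user", "content": "Summarize the Heart Sutra in one sentence."}],
)
print(resp.choices[0].message.content)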

https://simonwillison.net/2023/Nov/29/llamafile

https://www.reddit.com/r/LocalLLaMA/about/

https://github.com/Mozilla-Ocho/llamafile

Local 1M Context Inference at 15 tokens/s and ~100% “Needle In a Haystack”: InternLM2.5-1M on KTransformers, Using Only 24GB VRAM and 130GB DRAM. Windows/Pip/Multi-GPU Support and More.

Hi! Last month, we rolled out our KTransformers project (https://github.com/kvcache-ai/ktransformers), which brought local inference to the 236B-parameter DeepSeek-V2 model. The community's response was fantastic, filled with valuable feedback and suggestions. Building on that momentum, we're excited to introduce our next big thing: local 1M context inference!

https://www.reddit.com/r/LocalLLaMA/comments/1f3xfnk/local_1m_context_inference_at_15_tokenss_and_100/

https://medium.com/@younesh.kc/rag-vs-fine-tuning-in-large-language-models-a-comparison-c765b9e21328

Fine-tuning an LLM is time-consuming and expensive! That's why your local school district has tasked you with using RAG (Retrieval Augmented Generation) to help improve the capabilities of a chemistry chatbot based on Meta AI's Llama. RAG allows an LLM to search a database to help answer questions, avoiding any unfortunate hallucinations. You'll create a vector database of external knowledge for your chatbot's RAG, establish a RAG API server for it to use, and then deploy your new bot to both the web and Discord.

https://www.manning.com/liveproject/add-knowledge-to-the-chatbot
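
A rough sketch of that flow, assuming Chroma as the vector database and a locally served Llama model behind an OpenAI-compatible endpoint; the documents, endpoint URL, and model name are placeholder assumptions rather than the liveproject's actual setup.

# Sketch: retrieve the closest chunks from a vector DB and prepend them
# to the prompt sent to a locally served Llama model.
import chromadb
from openai import OpenAI

collection = chromadb.Client().create_collection("chemistry")
collection.add(
    ids=["1", "2"],
    documents=[
        "A mole contains 6.022e23 particles of a substance.",
        "Ionic bonds form between atoms with large electronegativity differences.",
    ],
)

question = "How many particles are in a mole?"
hits = collection.query(query_texts=[question], n_results=2)
context = "\n".join(hits["documents"][0])

llm = OpenAI(base_url="http://localhost:8000/v1", api_key="none")  # hypothetical local Llama server
answer = llm.chat.completions.create(
    model="llama-3-8b-instruct",  # placeholder model identifier
    messages=[{"role": "user",
               "content": f"Answer using only this context:\n{context}\n\nQuestion: {question}"}],
)
print(answer.choices[0].message.content)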

https://www.reddit.com/r/LLM/comments/135c32d/how_would_i_train_an_llm_on_books

https://www.datacamp.com/tutorial/how-to-train-a-llm-with-pytorch

Build a Large Language Model (From Scratch)

https://www.manning.com/books/build-a-large-language-model-from-scratch
