Free Download: Code with the Author of Build an LLM (From Scratch) by Sebastian Raschka
Released: 5/2025
By: Sebastian Raschka
MP4 | Video: h264, 1280x720 | Audio: AAC, 44.1 kHz, 2 Ch
Genre: eLearning | Language: English | Duration: 13h 35m | Size: 2.72 GB

Master the inner workings of large language models like GPT with hands-on coding sessions led by bestselling author Sebastian Raschka. These companion videos to Build a Large Language Model (From Scratch) walk you through a real-world implementation, with each session ending in a "test yourself" challenge to solidify your skills and deepen your understanding. Two short illustrative code sketches follow the table of contents below.

Table of contents

Chapter 1. Python Environment Setup
Chapter 2. Tokenizing text
Chapter 2. Converting tokens into token IDs
Chapter 2. Adding special context tokens
Chapter 2. Byte pair encoding
Chapter 2. Data sampling with a sliding window
Chapter 2. Creating token embeddings
Chapter 2. Encoding word positions
Chapter 3. A simple self-attention mechanism without trainable weights | Part 1
Chapter 3. A simple self-attention mechanism without trainable weights | Part 2
Chapter 3. Computing the attention weights step by step
Chapter 3. Implementing a compact self-attention Python class
Chapter 3. Applying a causal attention mask
Chapter 3. Masking additional attention weights with dropout
Chapter 3. Implementing a compact causal self-attention class
Chapter 3. Stacking multiple single-head attention layers
Chapter 3. Implementing multi-head attention with weight splits
Chapter 4. Coding an LLM architecture
Chapter 4. Normalizing activations with layer normalization
Chapter 4. Implementing a feed forward network with GELU activations
Chapter 4. Adding shortcut connections
Chapter 4. Connecting attention and linear layers in a transformer block
Chapter 4. Coding the GPT model
Chapter 4. Generating text
Chapter 5. Using GPT to generate text
Chapter 5. Calculating the text generation loss: cross entropy and perplexity
Chapter 5. Calculating the training and validation set losses
Chapter 5. Training an LLM
Chapter 5. Decoding strategies to control randomness
Chapter 5. Temperature scaling
Chapter 5. Top-k sampling
Chapter 5. Modifying the text generation function
Chapter 5. Loading and saving model weights in PyTorch
Chapter 5. Loading pretrained weights from OpenAI
Chapter 6. Preparing the dataset
Chapter 6. Creating data loaders
Chapter 6. Initializing a model with pretrained weights
Chapter 6. Adding a classification head
Chapter 6. Calculating the classification loss and accuracy
Chapter 6. Fine-tuning the model on supervised data
Chapter 6. Using the LLM as a spam classifier
Chapter 7. Preparing a dataset for supervised instruction fine-tuning
Chapter 7. Organizing data into training batches
Chapter 7. Creating data loaders for an instruction dataset
Chapter 7. Loading a pretrained LLM
Chapter 7. Fine-tuning the LLM on instruction data
Chapter 7. Extracting and saving responses
Chapter 7. Evaluating the fine-tuned LLM
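As a taste of the techniques the Chapter 3 sessions cover, here is a minimal sketch of a compact causal self-attention class in PyTorch. It is written for this post as an illustration, not taken from the course or the book's companion repository; the class and parameter names (CausalSelfAttention, d_in, d_out, context_length) are this post's own choices.

```python
# Illustrative sketch only -- not the course's actual code.
import torch
import torch.nn as nn

class CausalSelfAttention(nn.Module):
    def __init__(self, d_in, d_out, context_length, dropout=0.1):
        super().__init__()
        self.W_query = nn.Linear(d_in, d_out, bias=False)
        self.W_key = nn.Linear(d_in, d_out, bias=False)
        self.W_value = nn.Linear(d_in, d_out, bias=False)
        self.dropout = nn.Dropout(dropout)
        # Upper-triangular mask hides "future" tokens from each position
        self.register_buffer(
            "mask",
            torch.triu(torch.ones(context_length, context_length), diagonal=1).bool(),
        )

    def forward(self, x):
        b, num_tokens, _ = x.shape
        queries = self.W_query(x)
        keys = self.W_key(x)
        values = self.W_value(x)

        # Scaled dot-product attention with the causal mask applied
        attn_scores = queries @ keys.transpose(1, 2)
        attn_scores.masked_fill_(self.mask[:num_tokens, :num_tokens], float("-inf"))
        attn_weights = torch.softmax(attn_scores / keys.shape[-1] ** 0.5, dim=-1)
        attn_weights = self.dropout(attn_weights)
        return attn_weights @ values

# Quick smoke test on random inputs
torch.manual_seed(123)
x = torch.randn(2, 6, 32)  # (batch, tokens, embedding dim)
attn = CausalSelfAttention(d_in=32, d_out=32, context_length=6)
print(attn(x).shape)  # torch.Size([2, 6, 32])
```

Stacking several such heads side by side, or splitting one set of weight matrices across heads, is what the later Chapter 3 sessions on multi-head attention build up to.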
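The Chapter 5 sessions on decoding strategies cover temperature scaling and top-k sampling. The sketch below, again this post's own illustration rather than the course's text generation function, shows how the two techniques modify the next-token distribution before sampling.

```python
# Illustrative sketch only -- not the course's actual code.
import torch

def sample_next_token(logits, temperature=1.0, top_k=None):
    """Pick a next-token ID from a 1-D tensor of vocabulary logits."""
    if top_k is not None:
        top_logits, _ = torch.topk(logits, top_k)
        # Anything below the k-th largest logit gets zero probability
        logits = torch.where(
            logits < top_logits[-1], torch.tensor(float("-inf")), logits
        )
    if temperature > 0:
        # Higher temperature flattens the distribution (more randomness),
        # lower temperature sharpens it (closer to deterministic)
        probs = torch.softmax(logits / temperature, dim=-1)
        return torch.multinomial(probs, num_samples=1)
    # temperature == 0 falls back to greedy decoding
    return torch.argmax(logits, dim=-1, keepdim=True)

torch.manual_seed(123)
vocab_logits = torch.randn(50257)  # GPT-2-sized vocabulary
print(sample_next_token(vocab_logits, temperature=0.8, top_k=50))
```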
Download links:

AusFile
https://ausfile.com/knu39so6drzn/ncxow.Code.with.the.Author.of.Build.an.LLM.From.Scratch.part1.rar.html
https://ausfile.com/y1yguee5dk37/ncxow.Code.with.the.Author.of.Build.an.LLM.From.Scratch.part2.rar.html
https://ausfile.com/yiz5phj6ivfi/ncxow.Code.with.the.Author.of.Build.an.LLM.From.Scratch.part3.rar.html

Rapidgator
https://rg.to/folder/8101547/CodewiththeAuthorofBuildanLLMFromScratch.html

Fikper
https://fikper.com/12tZY9EAjx/ncxow.Code.with.the.Author.of.Build.an.LLM.From.Scratch.part2.rar.html
https://fikper.com/WtdUELo1q9/ncxow.Code.with.the.Author.of.Build.an.LLM.From.Scratch.part3.rar.html
https://fikper.com/p9Rg9XH7Pq/ncxow.Code.with.the.Author.of.Build.an.LLM.From.Scratch.part1.rar.html

No Password - Links are Interchangeable