LlamaCast

Shahriar Shariati

A daily podcast covering newly published papers in the LLM field.

Categories: Technology

Listen to the latest episode:

⚖️ Scaling Laws for Precision

This research paper investigates the impact of precision in training and inference on the performance of large language models. The authors explore how precision affects the effective parameter count and propose scaling laws that predict performance degradation due to low-precision training and post-training quantization. They find that overtrained models are more sensitive to post-training quantization, and that training larger models in lower precision might be computationally optimal. Their unified scaling law accounts for both training and post-training effects and predicts loss in varied precision settings, ultimately suggesting that the standard practice of training models in 16-bit might be suboptimal.

📎 Link to paper
🌐 Read their Tweet
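
To make the idea concrete, here is a minimal Python sketch of a precision-aware scaling law in this spirit. It assumes a Chinchilla-style loss L(N, D) = A·N^(−α) + B·D^(−β) + E and one plausible discount of the effective parameter count, N_eff = N·(1 − e^(−P/γ)). The discount form, the constant gamma, and the reuse of Chinchilla's fitted constants below are illustrative assumptions, not the paper's fitted law.

import math

def effective_params(n_params: float, precision_bits: float,
                     gamma: float = 4.0) -> float:
    # Assumed discount: low-precision training shrinks the effective
    # parameter count; gamma is an illustrative placeholder, not fitted.
    return n_params * (1.0 - math.exp(-precision_bits / gamma))

def predicted_loss(n_params: float, n_tokens: float, precision_bits: float,
                   a: float = 406.4, b: float = 410.7, e: float = 1.69,
                   alpha: float = 0.34, beta: float = 0.28) -> float:
    # Chinchilla-style loss with precision folded into the parameter term.
    # The constants are Chinchilla's published fits, used here as stand-ins.
    n_eff = effective_params(n_params, precision_bits)
    return a * n_eff ** -alpha + b * n_tokens ** -beta + e

# Example: a 1B-parameter model trained on 20B tokens at 16, 8, and 4 bits.
for bits in (16, 8, 4):
    print(bits, round(predicted_loss(1e9, 2e10, bits), 4))

Running the sketch shows predicted loss rising as training precision drops, which is the trade-off between precision, parameters, and data that the unified scaling law is meant to quantify.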

Previous episodes

  • 48 - Scaling Laws for Precision 
    Mon, 18 Nov 2024
  • 47 - Test-Time Training 
    Thu, 14 Nov 2024
  • 46 - Qwen2.5-Coder 
    Tue, 12 Nov 2024
  • 45 - Attacking Vision-Language Computer Agents via Pop-ups 
    Sat, 09 Nov 2024
  • 44 - Number Cookbook 
    Fri, 08 Nov 2024
