

How to Run LLMs Larger than RAM
Machine-Learning Large-Language-Models Linux
A short experiment on running LLMs larger than available RAM on low-end consumer hardware, with comments on performance trade-offs and practicality.