Something to handle code, text and math.

  • wraekscadu@vargar.org
    1 day ago

    Depends on how big your local model is. You can easily run 2-3B parameter models on a potato. BUT, the model is shit.
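
    A rough sketch of why a potato suffices (my own back-of-envelope math, not from the comment): weight memory is roughly parameter count times bits per weight, so a 2B model quantized to 4 bits fits in about 1 GB of RAM, ignoring KV cache and runtime overhead.

    ```python
    def weight_memory_gb(params_billion: float, bits_per_weight: int) -> float:
        """Approximate RAM (GB) needed to hold the model weights alone."""
        bytes_total = params_billion * 1e9 * bits_per_weight / 8
        return bytes_total / 1e9

    # A 2B model at 4-bit quantization: ~1 GB of RAM for weights.
    print(round(weight_memory_gb(2, 4), 1))   # 1.0
    # The same model at fp16: ~4 GB, which is where low-end machines struggle.
    print(round(weight_memory_gb(2, 16), 1))  # 4.0
    ```

    Actual usage is higher once you add the KV cache and runtime buffers, but this is why 2-3B models run on nearly anything while 70B-class models do not.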