Show HN: I ran a language model on a PS2

(github.com)

5 points | by xaskasdf 3 hours ago

2 comments

  • SilentEditor 3 hours ago
    Love this project. The CD streaming trick is such a smart constraint hack, and honestly the best part is that you trained the model for the hardware instead of forcing a desktop recipe onto the PS2.

    Curious about 2 things if you can share:

    1. What's your per-token latency on real hardware?
    2. How much quality loss came from PSNT quantization vs. an fp16 baseline?

    Either way, this is peak hacker energy; shipping on actual hardware makes it 10x cooler.

    • xaskasdf 2 hours ago
      There wasn't any quality loss: PSNT quantization is mainly there to convert a model to fit the console's constraints (you can convert any model you want, even though I trained one specifically for this hardware). It's q8 quantization, so quality loss is negligible at these sizes. As for speed, I'll fix the tok/sec counter, since right now it always shows 0.

      PS: Thank you! And I forgot to mention that PSNT also supports BitNet models, though they work like crap.
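For readers curious what "q8" means in the reply above, here is a minimal sketch of symmetric int8 quantization. This illustrates the general technique only; the function names are hypothetical and the exact PSNT scheme may differ.

```python
# Rough sketch of symmetric q8 (int8) quantization — the general scheme
# the author refers to. Names here are illustrative, not PSNT's API.

def quantize_q8(weights):
    """Map float weights to int8 values plus one per-tensor scale."""
    scale = max(abs(w) for w in weights) / 127.0  # +/-max maps to +/-127
    q = [max(-127, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize_q8(q, scale):
    """Recover approximate float weights from the int8 values."""
    return [v * scale for v in q]

weights = [0.8, -1.5, 0.02, 0.31, -0.77]
q, scale = quantize_q8(weights)
restored = dequantize_q8(q, scale)
# Rounding error is bounded by half a quantization step (scale / 2),
# which is why q8 loss is typically negligible at these model sizes.
assert all(abs(a - b) <= scale / 2 for a, b in zip(weights, restored))
```

The per-tensor scale is what keeps storage at one byte per weight while preserving the dynamic range of the original floats.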