It is possible to load and run 14-billion-parameter LLMs on a Raspberry Pi 5 with 16 GB of memory ($120). However, they are slow, at about 0.6 tokens per second. A 13-billion-parameter model ...
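A back-of-the-envelope check makes the numbers above plausible. This is a sketch, not a benchmark: the 4-bit quantization and the ~17 GB/s Pi 5 memory bandwidth figure are assumptions, and token generation is treated as purely memory-bandwidth bound (every weight read once per token).

```python
# Rough feasibility check for a 14B model on a 16 GB Pi 5.
# Assumptions (hypothetical figures): 4-bit quantized weights,
# ~17 GB/s LPDDR4X memory bandwidth on the Pi 5.
params = 14e9
bytes_per_param_q4 = 0.5                              # 4-bit weights
model_size_gb = params * bytes_per_param_q4 / 1e9     # quantized model size
print(f"Quantized model size: {model_size_gb:.1f} GB (fits in 16 GB RAM)")

bandwidth_gb_s = 17.0                                 # assumed memory bandwidth
upper_bound_tps = bandwidth_gb_s / model_size_gb      # weights read once per token
print(f"Bandwidth-bound upper limit: {upper_bound_tps:.1f} tokens/s")
```

Under these assumptions the quantized weights come to about 7 GB, which fits comfortably in 16 GB, and the bandwidth-bound ceiling is a couple of tokens per second; an observed 0.6 tokens/s sits believably below that limit once real-world overheads are counted.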
Not sure I'm going out and buying one. I have a Pi 2, 3, 4, and 5 doing various things. A number of them are paired with software-defined radios, a Pi 5 is a desktop machine out in my machine shop, Home Assistant ...