Starfighter@discuss.tchncs.de 5 points 5 days ago (last edited 5 days ago)

There are some experimental models made specifically for use with Home Assistant, for example home-llm.

Even though they're tiny (1–3B parameters), I've found them to work much better for this than even 14B general-purpose models. Obviously they suck at general-purpose questions, simply because of their size.

That being said, they're still LLMs, so I like to keep the "prefer handling commands locally" option turned on and only use the LLM as a fallback.
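
To make that concrete, here's a rough sketch in plain Python of what "local first, LLM as fallback" amounts to. This is not Home Assistant's actual code or API; `match_local_intent` and `ask_llm` are made-up placeholders standing in for the built-in sentence matcher and whatever local model backend you run.

```python
# Illustrative sketch only: "prefer handling commands locally" as a pattern.
# match_local_intent() and ask_llm() are hypothetical placeholders, not
# Home Assistant internals.

def match_local_intent(utterance: str) -> str | None:
    """Rough stand-in for the deterministic, built-in sentence matcher."""
    simple_commands = {
        "turn on the kitchen light": "light.turn_on -> light.kitchen",
        "turn off the kitchen light": "light.turn_off -> light.kitchen",
    }
    return simple_commands.get(utterance.strip().lower())


def ask_llm(utterance: str) -> str:
    """Hypothetical call to a small local model (e.g. a home-llm-style
    1-3B model served locally); swap in your own backend here."""
    return f"LLM handled: {utterance}"


def handle(utterance: str) -> str:
    # Try the fast, deterministic local matcher first...
    result = match_local_intent(utterance)
    if result is not None:
        return result
    # ...and only fall back to the LLM when nothing matched.
    return ask_llm(utterance)


if __name__ == "__main__":
    print(handle("Turn on the kitchen light"))        # handled locally
    print(handle("Make it cozy in the living room"))  # falls back to the LLM
```

The nice part of this setup is that the common, well-defined commands stay fast and predictable, and the model only ever sees the weird free-form requests it's actually useful for.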