Considering how some of the right wing media completely ignores fact checking, it would be hilarious to just fill the internet with crazy shit like this and see if they run with some of it
MTK
Jeez, I wonder if it has anything to do with the president being a pedo and also giving out pardons to anyone who supports him like it's the candy from a white van.
Who is designing drugs for cancer? What about drugs for the patients?
Either a bat or a hyper-evolved lesbian
You can sniff the network and see if the TV is connecting anywhere.
It's very, very unlikely that your TV and the device connected to it both support and enable Ethernet over HDMI by default. But if you're unsure, you can test it by connecting the cable and checking whether the TV reports a network connection.
Personally I also opened my TV and disconnected the wifi card since in theory the TV could also just try to connect to any open wifi in the area without me knowing, but to each their own threat model.
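The sniffing check can be scripted. A minimal sketch, assuming you run something like `sudo tcpdump -e -l -i eth0` from another machine on the same LAN and feed its output lines to this filter (the MAC address below is a made-up placeholder; get the real one from your TV's network menu):

```python
# Sketch: flag any captured Ethernet frame that involves the TV's MAC.
# TV_MAC is a hypothetical placeholder; replace it with your TV's actual MAC.
TV_MAC = "aa:bb:cc:dd:ee:ff"

def tv_is_talking(capture_lines, tv_mac=TV_MAC):
    """True if any `tcpdump -e` output line mentions the TV's MAC address."""
    return any(tv_mac.lower() in line.lower() for line in capture_lines)

# Example lines in the shape of `tcpdump -e` output:
quiet = ["12:00:00.0 11:22:33:44:55:66 > ff:ff:ff:ff:ff:ff, ethertype ARP, ..."]
noisy = ["12:00:01.0 aa:bb:cc:dd:ee:ff > ff:ff:ff:ff:ff:ff, ethertype ARP, ..."]
print(tv_is_talking(quiet))  # False: the TV never showed up on the wire
print(tv_is_talking(noisy))  # True: the TV found a way onto the network
```

If the TV's MAC never appears while the TV has no network configured, it hasn't found a path out.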
There are a few places in the world where it is cheap and good, just not a lot of them.
It's not a bad idea, but there are plenty of countries where it would be abused to Xbox Live chat levels.
Science and Logic day! 23rd of January. Or 12th of March
Celebrations include home science projects, appreciation of scientists and professors, local interactive educational scientific events, etc
The effects would be an appreciation of truth and logic, which would help fight the misinformation issues we currently have, and it would also serve to educate people and inspire them to seek further education and understanding.
Adoptability
Basically become one of those cats that walk into a house and no one makes them leave
To reduce gas with beans:
- soak with baking soda (1 tsp per cup of beans)
- before cooking, boil some water and cover the beans with it in a bowl; after 5 minutes, drain and rinse them, then throw them into whatever you are cooking
- ferment the beans (best results, but more work)
Also remember that as your body gets used to it, the gas is reduced.
Buying new: basically any of the integrated-memory units like Macs and AMD's new AI chips; after that, any modern (last 5 years) GPU, focusing mainly on VRAM (currently Nvidia is more properly supported in SOME tools).
Buying second hand: you're not likely to find any of the integrated-memory stuff, so go for any GPU from the last decade that is still officially supported, again focusing on VRAM.
8 GB is enough to run basic small models, 20+ GB handles pretty capable 20-30B models, 50+ GB the 70B ones, and 100-200+ GB full-sized models.
These are rough estimates, do your own research as well.
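Those rules of thumb roughly follow from parameter count times bytes per weight. A back-of-envelope sketch, assuming a common ~4-5 bit quantization (~0.6 bytes/parameter) and ~20% overhead for KV cache and runtime buffers (both figures are my assumptions, not hard numbers):

```python
# Rough VRAM estimate for running a local LLM.
# bytes_per_param=0.6 approximates a ~4-5 bit quantization (an assumption);
# overhead=1.2 adds ~20% for KV cache and buffers (also an assumption).
def vram_needed_gb(params_billions, bytes_per_param=0.6, overhead=1.2):
    return params_billions * bytes_per_param * overhead

for size in (7, 30, 70):
    print(f"{size}B model: ~{vram_needed_gb(size):.0f} GB VRAM")
```

This lands close to the ranges above (~5 GB for a 7B, ~22 GB for a 30B, ~50 GB for a 70B), but context length and quantization choice move these numbers a lot.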
For the most part, with LLMs for a single user you really only care about VRAM and storage speed (SSD). Any GPU will generate faster than you can read for any model that fully fits in its VRAM, so the GPU itself only matters if you intend to run large models at extreme speeds (for automation tasks, etc.). Storage is only a bottleneck at model load, so depending on your needs it might not be that big of an issue; for example, with a 30 GB model you can expect to wait 2-10 minutes for it to load into VRAM from an HDD, about 1 minute from a SATA SSD, and about 4-30 seconds from an NVMe drive.
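Those load times are roughly model size divided by sustained read bandwidth. A quick sketch (the bandwidth figures are typical ballparks I'm assuming, not measurements of any specific drive):

```python
# Rough model load-time estimate: size / sustained sequential read speed.
# Bandwidths in GB/s are assumed ballparks for each storage class.
BANDWIDTH_GBPS = {
    "hdd": 0.15,
    "sata_ssd": 0.5,
    "nvme": 3.5,
}

def load_time_seconds(model_gb, storage):
    return model_gb / BANDWIDTH_GBPS[storage]

for storage in BANDWIDTH_GBPS:
    print(f"30 GB model from {storage}: ~{load_time_seconds(30, storage):.0f} s")
```

For a 30 GB model this gives ~200 s from an HDD, ~60 s from a SATA SSD, and under 10 s from NVMe, which lines up with the ranges above; caching and the runtime's loader can shift these either way.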