JohnEdwa
Without the lens, exactly.
Realistically, cameras can be put into two categories - they either effortlessly fit in your pocket, or they don't, and any that don't tend to get left at home unless you specifically intend to go take photos. It doesn't really matter how much bigger it is at that point.
And if you have a high end smartphone, you probably can't get a camera that fits in your pocket that would be significantly better.
As the saying goes, the best camera is the one you have with you.
I simply wouldn't. A dumbphone mostly does the things I don't use a phone for.
And I don't mean fortnite and tickytocks, I've grown up through (most of) the history of mobile phones. I started with my mother's old Nokia 2110 back in like... 1998? I remember how awesome it was to finally have a phone, then to be able to get the bus schedules over the painfully slow WAP connection so I didn't have to call home, then to have navigation, replace the mp3 player, camera, and eventually even mostly my laptop.
I want to have a datapad with access to all the devices and information in my pocket at all times. If I need it to do something, there's probably an app for it. It's awesome.
I'd really prefer that the datapad wouldn't then leech all of my information in return, though.
Oh, and bring back physical keyboards. I'd give my left nut for an HTC Desire Z with 2025 hardware.
It's not really that different; the exact temperatures are slightly higher, but most Intel processors will boost up to 105C, then start throttling to maintain that 105C as a maximum, and if that's not possible they'll halt at 110C.
AMD does the same, just with slightly different temps (for the one specific CPU I remember them for): 80-85C where it starts dialing down the boost, 90C where it throttles below the normal frequency, and 95C for TjMax, which either halts the system or drops the power usage so low it doesn't matter - I'm not about to take a heat gun to my CPU to see what it does, as it wasn't capable of hitting that on its own.
But it shouldn't be possible to break your CPU from overheating, no matter what those exact temps are, because CPUs should be capable of protecting themselves, even if that means dropping to 386 speeds when you're running them in Death Valley with no cooler whatsoever.
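The protection behavior described above - boost until the limit, throttle to hold it, halt as a last resort - can be sketched as a simple control loop. This is purely illustrative: the thresholds are the Intel-style numbers from the comment, and the function and constant names are made up (real CPUs do this in hardware/firmware, not in a polling loop).

```python
# Illustrative sketch of CPU thermal protection as a per-tick state machine.
# Thresholds follow the Intel-style numbers discussed above; everything here
# is hypothetical, not any vendor's actual algorithm.

BOOST_LIMIT_C = 105   # boost freely up to this temperature, then throttle
HALT_LIMIT_C = 110    # hard shutdown if throttling can't hold the line

def thermal_step(temp_c: float, freq_mhz: int,
                 min_mhz: int = 400, max_mhz: int = 5000,
                 step_mhz: int = 100) -> tuple[int, bool]:
    """Return (new_frequency_mhz, halted) for one control tick."""
    if temp_c >= HALT_LIMIT_C:
        return 0, True                                  # emergency halt
    if temp_c >= BOOST_LIMIT_C:
        # Too hot: step the clock down (never below the floor).
        return max(min_mhz, freq_mhz - step_mhz), False
    # Below the limit: boost as high as the frequency cap allows.
    return min(max_mhz, freq_mhz + step_mhz), False
```

The point is that the halt branch is unconditional - no matter how fast the chip was running, past the hard limit it stops, which is why ambient conditions alone shouldn't be able to kill a modern CPU.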
"It's not AI, it's just <multiple things that all fall under the category of Artificial Intelligence>".
AI is a huge field of computer science. It's not the one tiny narrow definition of artificial general intelligence like HAL 9000 or Skynet or Detroit Become Human.
There are plenty of people developing apps that require root, and users who run those are already jumping through a million hoops of cat and mouse to keep their fucking McDonald's app from detecting it so they can get cheaper coffees and free fries.
Like seriously, wtf McDonald's, your app is like the ultimate root/SafetyNet/device ID detection tool, I don't think there's even a banking app that is as hard to fool.
It is, because it's actually the term for the process of transferring files not from an external networked device - downloading - or to an external networked device - uploading - but between two local devices - sideloading.
It's over two decades old - you downloaded an mp3 from Kazaa, and then sideloaded it to your player.
For Android apps, I believe the term originates from the method of using ADB to directly write the app to the phone memory, the command being "adb sideload filename".
It's a launcher, it can do a heck of a lot of things, and needs permissions to be able to do so.
But you don't actually have to allow any of them on modern Android versions if you don't need those features. Nova Launcher also has quite a list, but I haven't actually enabled any of them, it has never asked me to, and everything works fine.
We do, but in the last decade YouTube has doubled its yearly user count to something like 2.5 billion. That's a lot more people-hours to spend as well.
A simple line of code that goes "if moisture < 0.25 then loaddone" or "water = weight * 0.43" isn't AI, true.
But when you start stacking enough of them, with the goal and result being "we could get a chef to check how the pizza is doing every few seconds and control all of the different temperatures of this oven until it's perfectly done, but we have made a computer algorithm that manages to do that instead", then it's quite hard to argue it isn't software that is "performing a task typically associated with human intelligence, such as ... perception, and decision-making."
Especially if that algorithm was (I have no idea if it was in this case, btw) built not by just stacking those if clauses and testing things manually until it works, but by using machine learning on a mountain of baking data to train a neural network that does it itself. Because at that point, it definitely is artificial intelligence - it's not an artificial general intelligence, which many people think is the only type of "true AI", but it is an AI.
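Stacking simple rules like that into a controller might look something like this - a made-up sketch, not the actual oven's code; all the sensor names and thresholds are invented for illustration:

```python
# Hypothetical "stacked if-clauses" pizza-oven controller, the kind of
# rule-based decision-making discussed above. Every threshold and sensor
# name here is invented; no real oven firmware is being described.

def oven_step(moisture: float, crust_temp_c: float, top_browning: float) -> dict:
    """Map one set of sensor readings to heater settings and a doneness flag."""
    actions = {"top_heat": 0.5, "bottom_heat": 0.5, "done": False}
    if moisture < 0.25:
        # Load is done: shut both heaters off and flag it.
        actions["done"] = True
        actions["top_heat"] = actions["bottom_heat"] = 0.0
        return actions
    if crust_temp_c < 180:
        actions["bottom_heat"] = 0.9    # crust lagging, push bottom heat
    if top_browning > 0.8:
        actions["top_heat"] = 0.2       # back off before the top burns
    return actions
```

Each rule on its own is trivial, but run every few seconds against live sensors, the stack as a whole is doing the "check and adjust until it's perfectly done" job the chef would otherwise do - which is exactly the perception-and-decision-making framing quoted above.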