I can currently only see them used as accelerators of some type. Could see them used for GPUs eventually, but generally I suspect some form of compute first. GenAI anyone? SkyNET? But that is only if they can be made portable for laptops or phones, which is still a major issue that needs to be addressed.
I don't expect them to replace traditional chips in my lifetime, if ever.
Just how jealous can he be of Obama? Running for president because of a joke at the Correspondents' Dinner and willing to start WWIII because Obama won the Nobel Peace Prize. FFS
Yeah, but mostly going to be older games. Newer games usually have accessibility settings now, probably due to laws in various parts of the world. Anything since HiDPI became common should be fine.
You are more likely to run into text being too large in that scenario. I have an ultrawide 40" 5k2k and a 32" 4K computer monitor and run Linux with Steam. Text is fine; it is just going to get larger on a bigger screen. Steam has scaling for its interface, and Plasma, the desktop for SteamOS, is feature rich for most people's usage. I am certain people are already using Linux and Steam with TVs. If there is an issue, it would also be an issue on Windows. It's mostly older games that weren't updated for HiDPI screens.
Apparently you need a Ryzen 3500 with a GTX 1660 to get 30fps and a 7600 with an RTX 3060 Ti to get 60, which seems taxing, specifically on the CPU side.
When did it get acceptable to have 30fps as a target on PC?
MAJOR version when you make incompatible API changes
MINOR version when you add functionality in a backward compatible manner
PATCH version when you make backward compatible bug fixes
then I think it would be on like 3.77.0 or something right now. Not terrible, but honestly I'd prefer the major to get bumped every new year. The kernel is about 35 years old (first release in 1991), so 35.x in 2026. It would be easier to know how old a kernel release is without looking it up.
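To make those three rules concrete, here's a rough sketch in Python (the `bump` helper and the version numbers are just made up for illustration; the kernel obviously doesn't follow semver today):

```python
# Illustrative only: the semver bump rules quoted above, applied mechanically.
def bump(version: str, change: str) -> str:
    """Bump a MAJOR.MINOR.PATCH string according to the kind of change."""
    major, minor, patch = (int(p) for p in version.split("."))
    if change == "breaking":   # incompatible API change -> MAJOR
        return f"{major + 1}.0.0"
    if change == "feature":    # backward compatible functionality -> MINOR
        return f"{major}.{minor + 1}.0"
    if change == "fix":        # backward compatible bug fix -> PATCH
        return f"{major}.{minor}.{patch + 1}"
    raise ValueError(f"unknown change type: {change}")

print(bump("3.76.4", "feature"))   # 3.77.0
print(bump("3.77.0", "breaking"))  # 4.0.0
```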
I'd rather have screws than those clip-in covers that break, or than having to pry the device open like some brands of devices, i.e. most of the tech industry. Somewhere in the middle. Being able to replace a battery quickly is a plus, don't get me wrong, but I don't want it getting torn up in the process at either extreme. I am OK with it taking several minutes, but not with "can I buff this out" or "where is the tape/glue".
There is a good reason to move away from image-based subtitles toward text-based ones. Image-based CAN look better, but not always. The problem is that the player or your web browser needs to allow changing the font sizing and typeface. On Apple's devices this can be done in system settings, but it will be different on other platforms, if it's possible at all.
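Rough example of why that matters: a text-based cue carries no baked-in rendering, so whatever draws it decides the typeface and size (the SRT snippet and the tiny parser below are just made up for illustration, not any real player's code):

```python
# Illustrative only: a text-based subtitle cue is just timestamps plus a string,
# so the player picks the font and size at draw time.
import re

cue = """1
00:00:01,000 --> 00:00:04,000
Hello there."""

lines = cue.strip().splitlines()
start, end = re.match(r"(\S+) --> (\S+)", lines[1]).groups()
text = "\n".join(lines[2:])

# With an image-based subtitle the pixels are already baked in; here the player
# is free to render `text` at whatever size the user's accessibility settings ask for.
print(start, end, repr(text))
```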
Doesn't surprise me to hear this. If Nvidia was really holding back, then AMD would have passed them. I feel like they are starting to experience what the CPU side started seeing when it hit 4GHz and had to start chipping away at more clocks. It took longer since GPUs are doing easily parallelized operations, but it was bound to happen. I really wonder how both AMD and Nvidia will compare to their prior architectures next iteration. Will my 4080 still be faster than a 6070 (Ti)?
Yeah, this is weird. Her being one of the most "reasonable" Republicans makes it even more concerning to me. I know they are screwed up, but when they go so far that even she isn't willing to go along....