Well, the ones based on Chromium aren’t, anyway. I’ve heard some major criticisms of Safari in the last few years, for what that’s worth.
The NHS’ virtual appointment service in the UK doesn’t support Firefox either, only Chrome, Safari and Edge. The dark days of “please view this website in Internet Explorer 6” are creeping closer to the present again. I hate the modern internet.
Funnily enough, one of the few legitimately impactful non-enterprise uses of AVX-512 I’m aware of is that it does a really good job of accelerating emulation of the Cell SPUs in RPCS3. But you’re absolutely right, those things are very funky, and emulating them is by far the most difficult part of PS3 emulation.
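To give a flavour of why (a hand-rolled sketch, not RPCS3’s actual code, and the function names are mine): the SPU is a 128-bit SIMD machine, and AVX-512’s real gift to emulators isn’t the 512-bit width so much as the extra instructions it adds at 128-bit width. Take the SPU’s selb (select bits) instruction:

```c
#include <immintrin.h>

/* SPU "selb rt, ra, rb, rc": for each bit, take rb where rc is 1, else ra. */

/* Baseline SSE2 translation: three logic instructions per selb. */
static inline __m128i selb_sse2(__m128i ra, __m128i rb, __m128i rc)
{
    return _mm_or_si128(_mm_and_si128(rc, rb), _mm_andnot_si128(rc, ra));
}

/* With AVX-512VL, vpternlogd evaluates ANY three-input boolean function
 * in a single instruction; immediate 0xD8 is the truth table for
 * (c ? b : a), i.e. exactly selb. */
static inline __m128i selb_avx512vl(__m128i ra, __m128i rb, __m128i rc)
{
    return _mm_ternarylogic_epi32(ra, rb, rc, 0xD8);
}
```

Multiply that kind of three-to-one reduction across millions of recompiled SPU instructions and you can see where the performance jumps on AVX-512-capable CPUs come from.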
Luckily, I think most games either didn’t do much with them or left programming for them to middleware, so it would mostly be first- and second-party games that would need super-extensive customisation and testing. Sony could probably figure it out, if they were convinced there was sufficient demand and potential profit on the other side.
There’s even rumours that the next version of Windows is going to inject a bunch of AI buzzword stuff into the operating system. Like, how is that going to make the user experience any more intuitive? Sounds like you’re just going to have to fight an overconfident ChatGPT wannabe that thinks it knows what you want to do better than you do, every time you try opening a program or saving a document.
The Xbox 360 was built around the same weird, in-order PowerPC core as the PS3’s Cell (the PPE), it just had three of them stuck together instead of one of them tied to seven weird SPE vector units. The TL;DR of how Xbox backwards compatibility has been achieved is that Microsoft’s whole approach with the Xbox has always been to create a PC-like environment which makes porting games to or from the Xbox simpler.
The real star of the show here is the Windows NT kernel and DirectX. Microsoft’s core APIs have been designed to be portable and platform-agnostic since the beginning of the NT days (of course, that isn’t necessarily true of the rest of the Windows operating system we use on our PCs). Developers could still program their games mostly as though they were targeting a Windows PC using DirectX since all the same high-level APIs worked in basically the same way, just with less memory and some platform-specific optimisations to keep in mind (stuff like the 10MB of eDRAM, or that you could always assume three 3.2GHz in-order CPU cores with 2-way SMT).
Xbox 360 games on the Xbox One seem to be run through something akin to Dolphin’s “Übershaders” - in this case, per-game optimised builds of the entire Xenos GPU stack implemented in software, running alongside the whole Xbox 360 operating environment under a hypervisor. This is aided by the integration of hardware-level support for certain texture and audio formats common in Xbox 360 games into the Xbox One’s CPU design, similarly to how Apple’s M-series SoCs integrate support for x86-style memory ordering to greatly accelerate Rosetta 2.
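The memory-ordering bit deserves a quick illustration (a toy example of the general problem, not anything from Microsoft’s or Apple’s actual translators). x86 promises that stores become visible in program order; ARM makes no such promise, so translated code either eats barrier overhead on every memory operation or the hardware has to offer an x86-style (TSO) mode:

```c
#include <stdatomic.h>

atomic_int data, ready;

void producer(void)
{
    /* An x86 binary does this with two plain MOVs, and x86's TSO model
     * guarantees no other core ever sees ready == 1 before data == 42. */
    atomic_store_explicit(&data, 42, memory_order_relaxed);
    atomic_store_explicit(&ready, 1, memory_order_relaxed);

    /* On ARM, relaxed stores CAN be observed out of order. A faithful
     * translator must either promote every store to a release store
     * (extra cost on each memory op) or flip the core into a hardware
     * TSO mode so plain stores behave like x86 again - which is what
     * Apple's M-series does for Rosetta 2. */
}
```

Same philosophy as the texture and audio format support above: when emulating a guarantee in software is too slow, bake the guarantee into the silicon.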
Microsoft’s APIs for developers to target tend to be fairly platform-agnostic - see Windows CE, which could run on anything from ARM handhelds to the Hitachi SH-4-powered Sega Dreamcast. That lets developers who are mostly experienced in coding for x86 Windows PCs start writing programs (or games) for other platforms relatively easily. It also has a beneficial side-effect: armed with first-hand knowledge of those APIs, Microsoft can build compatibility layers that let an x86 system run code targeted at a different platform.
Yeah, federated network things.
Did you read the article? Excerpts include:
Generally, in business, it is sensible to provide your customers with what they want. With Twitter, the meme-makers’ favourite billionaire is doing the opposite. The cyber-trucker is trying his best to cull his customer base.
Threads is what would happen if Twitter and Instagram made out in a bowling alley. It’s all their worst parts combined - but it may well succeed. Rocket-man Musk’s changes to Twitter have not exactly made it ‘brand friendly’. Threads, meanwhile, is shaping up to be a paradise for in-your-face brands - and the AdTech industry would love for you to join them
and
Threads’ naffness won’t stop its success. It’s data-scraping fluffily dressed up as substandard corporate twaddle. It’s a cringe-inducing privacy invasion. It’s not meant for users, but that doesn’t really matter: you’re not a user, you’re a product.
It’s describing Threads as a product not for users but for advertisers - the perfect brand-friendly non-place for companies to stick their marketing crap. That doesn’t really come across as a ringing endorsement to me.
This is a total affront to the ethos of the web and everyone involved in drafting this awful proposal should be publicly shamed. Stick sandwich boards on each of them saying “I tried to build the Torment Nexus”, chain them together and march them through the streets while ringing a bell and chanting “shame”.
Well, you’re paying for all that performance, might as well get as much out of it as possible. God knows Snaps or Windows 11 can sometimes drag even the best hardware down to a crawl.
If you want one for your phone, Feedly is pretty good. On desktop, I use Liferea.
Seconding Liferea.
I would have to assume that credits are a largely bureaucratic unit of account that most Federation citizens will never work with or even hear about, but are used by internal departments of the Federation as a means of budgeting the capacity of things like transport ships, industrial replication facilities and shipyards.
This also allows them to function as a de facto currency for trade with outside powers who have achieved warp travel but aren’t yet in a post-scarcity state, or as a way of managing resource usage at the edges of Federation space, where infrastructure is still developing and resources need to be prioritised for that development.
Basically, the Federation doesn’t have “money” as an everyday societal phenomenon, but it does retain the economic capacity to issue something usable as currency when the situation calls for it, such as during periods of scarcity (whether localised or across the Federation) or when conducting trade with non-Federation entities such as the Ferengi.
I mean, gold is pretty abundant in space and presumably not that difficult to replicate.
I’d be fine paying Google for YouTube Premium if I could use it without being logged in. I’d take an access key for anonymous ad-free viewing for $20 a month. But Google is never going to offer that because the data-harvesting is the whole point of YouTube to them. Google is a data-slurping company with an advertising division that dabbles in video, search and phones as side hustles.
In any case, if they really do crack down on adblockers, there are always other methods of watching their videos ad-free, and if I really like a creator, I’ll subscribe to their Patreon or watch them on Nebula.
Possibly, now that we have much tighter integration between different chips using die-to-die interconnects like Apple’s “UltraFusion” and AMD’s “Infinity Fabric” to avoid the latency and microstutter issues that came with old-fashioned multi-GPU cards like the GTX 690 and Radeon HD 7990.
As long as software can make proper use of the multiple processing units, I think multi-GPU cards have a chance to make a comeback… at least if anyone can actually afford the bloody things. Frankly, GPU pricing is a bit fucked at the moment even before we consider the idea of cards with multiple dies.
To be fair, a lot of these are accurate, or at least were at the time.
Multi-GPU just never caught on. There’s a reason you don’t see even the most hardcore gaming machines running SLI today.
The Wii’s novelty wore off fairly quickly (about the time Kinect happened), and it didn’t have much of a lasting impact on the gaming industry once mobile gaming slurped up the casual market.
Spore is largely forgotten, despite the enormous hype it had before release. It’s kind of the Avatar of video games.
It took years for 64-bit to become relevant to the average user (and hell, there are still devices being sold with only 4GB of memory even today!). Plenty of Core 2 Duo machines still shipped with 32-bit versions of Windows, and people didn’t notice or care: basically no apps average people used were 64-bit native back then, and you were lucky to have more than 4GB in your entire machine, let alone need more than that for one program.
Battlestar Galactica (2003) fell off sharply after season 2 and its ending was some of the most insulting back-to-nature religious tripe that has ever had the gall to label itself as science-fiction.
Downloading movies over the internet ultimately fell through the cracks outside of piracy. Most people stream films and TV now, and people who want the extra quality tend to buy a Blu-ray disc rather than download from iTunes (can you even still do that with modern shows?).
I definitely know people who didn’t get an HDTV until 4K screens hit the market, and people still buy standard-def DVDs. Hell, they’re still outselling Blu-rays close to 20 years later. Calling HD a dud is questionable, but it was definitely not seen as a must-have by the general public, partly because that shit was expensive back in 2008.
The Eee PC and the other netbooks were only good when they were running a lightweight operating system like Linux or Windows XP. Once Windows 7 Starter became the operating system of choice for netbooks, the user experience fell off a cliff and people tired of them. Which is a shame, because I love little devices like UMPCs.
The original iPhone was really limited for 2007. No third-party applications, no 3G support, no voice memos, you could only get it on a single carrier… the iPhone family did make a huge impact in the long run, but it wasn’t until the 3GS that it was a true competitor to something like a Symbian device.
The only entry on this list that’s really off the mark is Facebook, which even at the time was quickly reshaping the world. And I say that as someone who hates Zuck’s guts and has proudly never had a Facebook account.
I have mixed feelings about the iPhone. On one hand, the device itself was very sleek for the time and its touch-driven, easy-to-use interface was a revelation for 2007. On the other hand, it was the harbinger of the locked-down, walled-garden hellscape that is the modern tech industry, and its success paved the way for horrors like Windows 8/10/11 and modern macOS, which gets very testy if you try running an app that hasn’t been notarised by Apple.
Looks like they’re hoping July 1st will be the platform’s big resurgence, and that everything will calm down once they throw the switch on API access. Sure, let us know how that works out for you, Digg 5.0.
That’s incredible. Certified “Directive #4” moment.