Someone is trying to re-create the virus from Snow Crash
While it was kinda lame for Mozilla to add it already opted-in the way they did
That’s really the rub here. Reading the technical explainer on the project, it’s a pretty good idea. The problem is that they came down on the side of “more data” versus respecting their users:
Having this enabled for more people ensures that there are more people contributing to aggregates, which in turn improves utility. Having this on by default both demands stronger privacy protections — primarily smaller epsilon values and more noise — but it also enables those stronger protections, because there are more people participating. In effect, people are hiding in a larger crowd.
In short, they pulled a “trust us, bro” and turned an experimental tracking system on by default. They fully deserve to be taken to task over this.
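For the curious, the epsilon trade-off in that quote is standard differential privacy: with the Laplace mechanism, noise is scaled by sensitivity divided by epsilon, so a smaller epsilon means more noise per report. A rough Python sketch (function names are mine; this is not Mozilla's actual implementation):

```python
import math
import random

def noise_scale(sensitivity: float, epsilon: float) -> float:
    # Laplace mechanism: scale = sensitivity / epsilon,
    # so a smaller epsilon means a noisier (more private) result.
    return sensitivity / epsilon

def laplace_sample(scale: float) -> float:
    # Inverse-CDF sampling from Laplace(0, scale).
    u = random.random() - 0.5   # uniform in [-0.5, 0.5)
    u = max(u, -0.499999)       # avoid log(0) at the edge
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(true_count: int, epsilon: float) -> float:
    # A counting query has sensitivity 1.
    return true_count + laplace_sample(noise_scale(1.0, epsilon))
```

The "hiding in a larger crowd" argument is that with more participants, aggregate counts stay useful even at the small epsilon values (heavy noise) needed to protect individuals.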
Have you considered just beige boxing a server yourself? My home server is a mini-ITX board from Asus running a Core i5, 32GB of RAM and a stack of SATA HDDs all stuffed in a smaller case. Nothing fancy, just hardware picked to fulfill my needs.
Limiting yourself to bespoke systems means limiting yourself to what someone else wanted to build. The main downside to building it yourself is ensuring hardware compatibility with the OS/software you want to run. If you are willing to take that on, you can tailor your server to just what you want.
Switched to full time Arch because I didn’t want to run Windows Privacy Invasion Goes to 11. And it’s been pretty good. Valve gets a big “thank you” for their contributions to WINE and making gaming on Linux nearly as seamless as Windows.
It’s probably still true that “Next year” will be the year of Linux on the desktop, and it will be for several more years to come. But, it’s starting to feel like cracks are forming in the Microsoft wall.
It’s getting Fark’d
I do agree with what you are saying, but for a complete beginner, and a very general overview, I didn’t want to complicate things too much. I personally run my own stuff in containers and am behind CG-NAT (it’s why I gave it a mention).
That said, if you really wanted to give the new user that advice, go for it. Rather than just nit pick and do the “but actshuly” bit, start adding that info and point out how the person should do it and what to consider. Build, instead of just tearing down.
No, but you are the target of bots scanning for known exploits. The time between an exploit being announced and threat actors adding it to commodity bot kits is incredibly short these days. I work in Incident Response and seeing wp-content in the URL of an attack is nearly a daily occurrence. Sure, for whatever random software you have running on your normal PC, it’s probably less of an issue. Once you open a system up to the internet and constant scanning and attack by commodity malware, falling out of date quickly opens your system to exploit.
Short answer: yes, you can self-host on any computer connected to your network.
Longer answer:
You can, but this is probably not the best way to go about things. The first thing to consider is what you are actually hosting. If you are talking about a website, this means that you are running some sort of web server software 24x7 on your main PC. This will be eating up resources (CPU cycles, RAM) which you may want to dedicate to other processes (e.g. gaming). Also, anything you do on that PC may have a negative impact on the server software you are hosting. Reboot and your server software is now offline. Install something new and you might have a conflict bringing your server software down. Lastly, if your website ever gets hacked, then your main PC also just got hacked, and your life may really suck.

This is why you often see things like Raspberry Pis being used for self-hosting. It moves the server software onto separate hardware which can be updated/maintained outside a PC which is used for other purposes. And it gives any attacker on that box one more step to cross before owning your main PC. Granted, it’s a small step, but the goal there is to slow them down as much as possible.
That said, the process is generally straightforward, though there will be some variations depending on what you are hosting (e.g. web server, NextCloud, Plex). And your ISP can throw a massive monkey wrench in the whole thing if they use CG-NAT. I would also warn you that, once you have a presence on the internet, you will need to consider the security implications of whatever it is you are hosting. The most important security recommendation is “install your updates”. And not just OS updates, but keeping all software up to date. If you host WordPress, you need to stay on top of plugin and theme updates as well. In short, if it’s running on your system, it needs to stay up to date.
The process generally looks something like:

1. Install and configure the server software on the machine which will host it.
2. Give that machine a static IP address on your local network.
3. Set up port forwarding on your router to send the relevant traffic (e.g. 443/tcp for a website) to that machine.
4. Point a DNS record at your network’s public IP address.
Optionally, you may want to consider using a Dynamic DNS service (DDNS) (e.g. noip.com) to make reaching your server easier. But, this is technically optional, if you’re willing to just use an IP address and manually update things on the fly.
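One more tip: at each step it helps to verify that something is actually listening where you think it is (is the service up? did the port forward take?). A quick TCP check, sketched in Python since it’s everywhere (the host/port values are placeholders):

```python
import socket

def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    # Returns True if a TCP connection to host:port succeeds.
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# e.g. port_open("192.168.1.50", 443) from inside your LAN,
# then the same check against your public IP from outside it.
```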
Good luck, and in case I didn’t mention it, install your updates.
Holy Misleading Headline, Batman…
The actual first sentence of the article:
Since 2019, the U.S. Department of Defense has been asking for a waiver from legislation barring it from doing business with companies reliant on telecommunications equipment manufactured by Huawei.
Emphasis added. This isn’t the DoD saying “we need to use Huawei hardware”, it’s the DoD saying “a fuck-ton of companies we do business with use Huawei hardware.” And that’s because Huawei hardware is cheap and businesses like cheap. I do think the DoD has some leverage in contracts to say, “welcome to the Defense Industrial Base (DIB), you cannot use anything manufactured by Huawei in infrastructure which is within scope.” But if the text of the law says that the DoD can’t do business with companies who use Huawei hardware at all, then that’s going to be very limiting.
By combining Mozilla’s scale and trusted reputation with Anonym’s cutting-edge technology…
Ya, that reputation is taking a big hit right now.
Congratulations, you have now arrived at the Trough of Disillusionment:
It remains to be seen if we can ever climb the Slope of Enlightenment and arrive at reasonable expectations and uses for LLMs. I personally believe it’s possible, but we need to get vendors and managers to stop trying to sprinkle “AI” in everything like some goddamn Good Idea Fairy. LLMs are good for providing answers to well defined problems which can be answered with existing documentation. When the problem is poorly defined and/or the answer isn’t as well documented or has a lot of nuance, they then do a spectacular job of generating bullshit.
That is a possibility. To be honest, I haven’t tried very hard yet. I’m currently working on spinning up a Win10 VM in KVM and I’ll see how that works. An Android emulator is another good idea; I’ll have to give that a go.
And once you have found your specific collection of plugins that happen not to put the exact features you need behind a paywall (unlike all the others), you ain’t touching those either.
And this is why, when I’m investigating phishing links, I’ve gotten used to mumbling, “fucking WordPress”. WordPress itself is pretty secure. Many WordPress plugins, if kept up to date, are reasonably secure. But, for some godforsaken reason, people seem to be allergic to updating their WordPress plugins and end up getting pwned and turned into malware-serving zombies. Please folks, if it’s going to be on the open internet, install your fucking updates!
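If you have shell access to the box, wp-cli can take the human out of the loop entirely. Something like this in a crontab would do it (the path and schedule are illustrative; test it before trusting it unattended):

```shell
# Hypothetical crontab entry: update WordPress core, plugins,
# and themes nightly at 03:00.
0 3 * * * cd /var/www/html && wp core update && wp plugin update --all && wp theme update --all >> /var/log/wp-update.log 2>&1
```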
Did the same. The writing has been on the wall for a long time, Microsoft’s anti-user behavior is only set to get worse. I made the jump to Linux (Arch) and things have been reasonably smooth. I did have a few issues with Enshrouded, but was able to get past those with Proton-GE. The only issue I haven’t worked around yet is Roblox with the kids. But, I may just have to pick up a cheap tablet for that.
I think AI is good with giving answers to well defined problems. The issue is that companies keep trying to throw it at poorly defined problems and the results are less useful. I work in the cybersecurity space and you can’t swing a dead cat without hitting a vendor talking about AI in their products. It’s the new, big marketing buzzword. The problem is that finding the bad stuff on a network is not a well defined problem. So instead, you get the unsupervised models faffing about, generating tons and tons of false positives. The only useful implementations of AI I’ve seen in these tools actually mirror your own: they can be scary good at generating data queries from natural language prompts. Which is, once again, a well defined problem.
Overall, AI is a tool and used in the right way, it’s useful. It gets a bad rap because companies keep using it in bad ways and the end result can be worse than not having it at all.
what is the advantage in using Windows 11 over 10?
Many years ago, I was at a Windows XP launch event and the Microsoft Rep had a really honest line:
“Why should you start using Windows XP? Because we’re going to stop supporting Windows 98!”
And ya, that’s pretty much been the cattle prod Microsoft uses to push new versions: eventually you stop getting security updates for the older OS, and at some point enough security vulnerabilities pile up that it’s no longer safe for daily use. That said, with Windows becoming more and more user hostile, other options start to make more sense.
The answer to that will be everyone’s favorite “it depends”. Specifically, it depends on everything you are trying to do. I have a fairly minimal setup: I host a WordPress site for my personal blog and a NextCloud instance for syncing my photos/documents/etc. I also have to admit that my backup situation is not good (I don’t have a remote backup). So, my costs are pretty minimal: a domain registration fee plus a couple of cheap cloud containers.
The Domain fee is obvious, I pay for my own domain. For the containers, I have 2 containers hosted by the bought up husk of Linode. The first is just a Kali container I use for remote scanning and testing (of my own stuff and for work). So, not a necessary cost, but one I like to have. The other is a Wireguard container connecting back to my home network.

The Wireguard container is necessary because my ISP makes use of CG-NAT. The short version of that is, I don’t actually have a public IP address on my home network and so have to work around that limitation. I do this by hosting NGinx on the Wireguard container and routing all traffic over a Wireguard VPN back to my home router. The VPN terminates on the outside interface and then traffic on 443/tcp is NAT’d through the firewall to my “server”. I have an NGinx container listening on 443 and, based on host headers, traffic goes to either the WordPress or NextCloud container, which do their magic respectively.

I also have a number of services, running in containers, on that server. But, none of those are hosted on the internet. Things like PiHole and Octoprint.
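The host-header routing bit looks roughly like the following pair of NGinx server blocks (the names, IPs, and cert paths are made up for illustration, not my actual config):

```nginx
# Route inbound 443 traffic by Host header to the right container.
server {
    listen 443 ssl;
    server_name blog.example.com;           # hypothetical
    ssl_certificate     /etc/ssl/blog.pem;
    ssl_certificate_key /etc/ssl/blog.key;
    location / {
        proxy_pass http://10.0.0.10:8080;   # WordPress container
        proxy_set_header Host $host;
    }
}
server {
    listen 443 ssl;
    server_name cloud.example.com;          # hypothetical
    ssl_certificate     /etc/ssl/cloud.pem;
    ssl_certificate_key /etc/ssl/cloud.key;
    location / {
        proxy_pass http://10.0.0.11:8081;   # NextCloud container
        proxy_set_header Host $host;
    }
}
```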
I don’t track costs for electricity, but that should be minimal for my server. The rest of the network equipment is a wash, as I would be using that anyway for home internet. So overall, I pay $11/month in fixed costs and then any upgrades/changes to my server have a one-time capital cost. For example, I just upgraded the CPU in it as it was struggling under the Enshrouded server I was running for my wife and me.
Attempt at serious answer (warning: may be slightly offensive)
Wow, you are a fucking moron. But, there is an interesting question buried in there, you just managed to ask it in a monumentally stupid way. So, let’s pick this apart a bit. Assuming Trump gets re-elected and speed-runs the US into global irrelevancy, what happens to the various standards and standards bodies? tl;dr: Not much.
For this reason, and a lot of other reasons, I am in favor of liberterianism because then, it would not be a government ran by octogenarians deciding standards for communication,
It’s ok, I was young and stupid once too. The fact is that, while many telecommunications standards started off in the US, and some even in the USG, most of them have long since been handed off to industry groups. The Internet Engineering Task Force is responsible for most of the standards we follow today. They were spun off from the USG in 1993 and are mostly a consensus driven organization with input from all over the world. In a less US centric world, the makeup of the body might change some. But, I suspect things would keep humming along much as they have for the last few decades.
Will we live in a post-standard world?
This depends on the level of fracturing of networks. Over time, there has been a move towards standardization because it makes sense. Sure, companies resist and all of them try to own the standard, but there has been a lot of pushback against that and often from outside the US. For example, the EU’s law to require common charging ports. In many ways, the EU is now doing more for standardization than the US.
Worse, cryptography. Well, for ‘serious shit’, people roll their own crypto because…
Tell me you know fuck all about security without saying you know fuck all about security. There is a well accepted maxim, called “Schneier’s law” based on this classic essay. It’s often shortened to “Don’t roll your own crypto”. And this goes back to that FIPS standard mentioned earlier. FIPS is useful mostly because it keeps various bits of the USG from picking bad crypto. The algorithms listed in FIPS are all bog-standard stuff, from things like the Advanced Encryption Standard (AES) process. The primitives and standards are the primitives and standards because they fucking work and have been heavily tested and shown to be secure over a lot of years of really smart people trying to break them. Ironically, it was that same sort of open testing that resulted in the NSA being caught trying to create a crypto backdoor.
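To make “use the standards” concrete: every mainstream language already ships the vetted primitives, so using them is a one-liner. A Python sketch (the helper name is mine):

```python
import hashlib

def fingerprint(data: bytes) -> str:
    # Hash with a current, standardized primitive (SHA-256).
    return hashlib.sha256(data).hexdigest()

# SHA-1 and MD5 still exist in hashlib, but only for legacy
# interop; practical collisions have been demonstrated for both.
```

The point being: the standardized primitive is one line away, and a home-rolled replacement buys you nothing but risk.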
So no, for ‘serious shit’ no one rolls their own crypto, because that would be fucking dumb.
But what about primitives? For every suite, for every protocol, people use the same primitives, which are standardized.
And ya, they would continue to be; as said above, they have been demonstrated over and over again to work. If they are found not to work, people stop using them (see: SHA-1, MD5, DES). It’s funny that, for someone who is “in favor of liberterianism”, you seem to be very poorly informed of examples where private groups and industry are actually doing a very good job of things without government oversight.
Overall, you seem to have a very poor understanding of how these standards get created in the modern world. Yes, the US was behind a lot of them. But, as they have been handed over to private (and often international) organizations, they have moved further and further away from US Government control. Now, that isn’t to say that US-based companies don’t have a lot of clout in those organizations. Let’s face it, we are all at the mercy of Microsoft and Google way too often. But, even if those companies fall to irrelevance, the organizations they are part of will likely continue to do what they already do. It’s possible that we’d see a faster balkanization of the internet, something we already see a bit of. Countries like China, Iran or Russia may do more to wall their people off from US/EU influence, if they don’t have an economic interest in some communications. Though, it’s just as likely that trade will continue to keep those barriers to the flow of information as open as possible.
The major change could really be in language. Without the US propping it up, English may lose its standing as the lingua franca of the world. As it stands right now, it’s not uncommon for two people, neither of whom speaks English as their native language, to end up conversing in English as that is the language the two of them share. If a new superpower rises, perhaps the lingua franca shifts and the majority of sites on the internet shift with it. Though, that’s likely to be a multi-generational change. And it could be a good thing. English is a terrible language; it’s less a language and more three languages dressed up in a trench coat pretending to be one.
So yes, there would likely be changes over time. But, it’s likely more around the edges than some wholesale abandoning of standards. And who knows, maybe we’ll end up with people learning to write well researched and thought out questions on the internet, and not whatever drivel you just shat out. Na, that’s too much to hope for.
Microsoft: “We’re sorry, this is all just a misunderstanding. We thought you were too stupid to notice.”
Can we follow this up by murdering most of the generic Top Level Domains (gTLDs)? I have yet to see anything except spam and malware coming out of the .top domain.
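For anyone running Pi-hole or similar, a single regex filter kills the whole gTLD (the pattern is illustrative; check your blocker’s regex dialect). Demonstrated in Python since the syntax carries over:

```python
import re

# Match any hostname that falls under the .top gTLD.
BLOCK_TOP = re.compile(r"(\.|^)top$")

def blocked(hostname: str) -> bool:
    return BLOCK_TOP.search(hostname) is not None
```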