EDIT: There’s a fix. https://unpackerr.zip automatically extracts these rar containers into usable files for importing via sonarr/radarr. I suppose you can do this manually with unrar if you’re brave.
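For reference, unpackerr mostly just needs to know where sonarr/radarr live; a minimal sketch of its config (the urls and api keys below are placeholders, check the unpackerr docs for the full option list):

```toml
## unpackerr.conf sketch - placeholder values, not a complete config
[[sonarr]]
 url     = "http://127.0.0.1:8989"
 api_key = "your-sonarr-api-key"

[[radarr]]
 url     = "http://127.0.0.1:7878"
 api_key = "your-radarr-api-key"
```

It watches the apps’ download paths and extracts archives as imports come in, so there’s nothing else to wire up in sonarr/radarr themselves.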
amateur. I just manifest the correct IP address for my desired resource and fetch it with curl
The trick to writing a JavaScript web app is that first you consider literally any other technology to solve your problem, and only then consider using JavaScript.
Rsync over SSH. i use it for a weekly nextcloud backup to a hetzner storage box
Shouldn’t be that bad. my raspberry pi 4b can do jellyfin and nextcloud without pushing 15W at full load.
x86 is inefficient, especially older models, but you’ll likely only push anything over 10W when actually streaming something that requires transcoding. Most of the time your home server is gonna sit idle or run some tiny cron job that won’t really stress the CPU at all.
idk what resolution you use for streaming but my raspberry pi 4B runs plex at 1080p just fine as long as it isn’t using x265/AV1 (though on jellyfin you might be able to use the Pi’s GPU for transcoding).
I use nextcloud too and it’s a tiny bit slower than I’d like, but that’s likely a wifi issue.
Literally any PC on Amazon for $200 CAD, then add your own SSD. I’d say 8GB of RAM but that’s just for cache, you’ll rarely go over 4 in general use.
That, or a raspberry pi 4B/5, which runs you about $150 once you add a case, power supply, and a powered USB dock for your SSDs (just for safety, since the pi’s USB ports technically can’t meet some SSDs’ power requirements).
Use dietpi (dietpi.com) to set up your services and it’ll run nice and smooth for anything that isn’t H265. That might be annoying, but Plex and possibly jellyfin let you transcode stuff in the background, which is nice.
I moved from a 1080p monitor to a 1440p one for my main display and it’s actually really worthwhile. Not only is your daily computing sharper, but multitasking becomes easier because smaller windows are still legible.
IMO it’s a lot easier on the eyes when things are sharper, too.
1080p is still more than enough, but I think 1440p is worth it for a screen you’re using for hours every day :)
I’d keep the physical library around and just digitize as and when she asks for specific stuff. You’ll probably never back up half the library. That, or stick it on a HDD out of the way, transfer the few she wants, then tuck the drive in a drawer forever in case she wants something else.
Jellyfin must have a feature like Plex where certain user accounts only see certain libraries? You could use that to avoid having to look at those crappy movies in your library.
I don’t really have much of an issue with family recommendations, but I do tell them that the space isn’t unlimited, so if they don’t watch something they asked for I’m likely to remove it for something we WILL watch. In your case, it would at least give you leverage to get her to narrow down what needs hosting and what doesn’t.
If the modern internet teaches us anything, it’s that everything is ephemeral even when you stringently catalogue every last byte of data. People just don’t need access to 90% of YouTube’s library, yet YouTube has to pay big money to make 100% of that library available 24/7, 365.
There are already rips at the seams of these systems. Time is not on YouTube’s side.
appflowy is foss, self-hostable via docker, and supports notes, tables, etc. but also kanban boards which i find useful for self management.
I’m using notion atm (the software appflowy has cloned to bring it to FOSS) as I’ve not set up docker yet :'(
LoongArch is a few years ahead of RISC-V atm. The fastest RV cores are comparable to ARM A53 (raspberry pi 3-ish) whilst LoongArch is comparable to 10th-gen Intel Core.
I think it’s a sunk cost thing. LoongArch was set in motion before RISC-V International was established in a neutral country, and now they’d be giving up a faster proprietary ISA for one that’s much slower atm.
The fact Alibaba is putting their money on RISC-V is probably a sign that LoongArch’s time at the top of the Chinese DIY processor stack will be short-lived.
Nothing about RISC-V disallows hardware-level surveillance. Most if not all surveillance hardware on our devices is really just a super-low-power ARM CPU. In theory you can make a RISC-V chip capable of doing the same work.
I do think you’re probably right that it’s more about having exclusive control over the intellectual property and the ISA specification. RISC-V does allow you to close-source your chip designs, but the foundation behind it was only moved to a relatively-neutral country (Switzerland) in 2019, which is some years after Loongson moved to proprietary CPU designs.
They’re the only ones going proprietary as far as I know; most are going for RISC-V
reminds me of the infamous NSA backdoor patch blog for Notepad++
I like Dietpi. It’s a stripped-down debian image with a few homelab scripts on top, designed to reduce resource usage while offering its own “software repo”: custom scripts that install and configure popular homelab projects so they’re a lot easier to get running than normal.
I run mine from a raspberry pi 4b but you can use x86 or other SBCs if you like.
Seeing all the hetzner mentions made me finally look into it and
yep, they seem to be cheaper than alternatives without getting into shady territory and
pretty easy to set up! I finally have an offsite backup of my home server and it only took me like an hour to do
Depends on a few things in your setup: the age of your GPU and your monitor’s resolution/refresh rate. I think even the choice of DP vs HDMI can have an impact too
I tried their experimental Wayland session and it’s still super buggy on high refresh rate/high DPI screens (loads of graphical errors and artifacts) so still a ways to go imo
Here’s my based af workflow:
git checkout -b feature-branch
rebase on top of dev whilst working locally:
git fetch origin && git rebase origin/dev && git push -f
if i need to fix conflicts with dev during a PR:
git merge origin/dev
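The rebase-then-force-push part, end to end, in a throwaway repo (all the repo scaffolding below is just to make it self-contained; the branch names match the ones above):

```shell
set -e
tmp=$(mktemp -d); cd "$tmp"

# Fake remote with a dev branch
git init -q --bare origin.git
git clone -q origin.git work
cd work
git config user.email demo@example.com
git config user.name demo
git checkout -q -b dev
echo base > file.txt
git add file.txt && git commit -qm "base"
git push -q origin dev

# Feature branch off dev
git checkout -q -b feature-branch
echo feature > feature.txt
git add feature.txt && git commit -qm "feature work"

# Meanwhile dev moves on (simulating someone else's push)
git checkout -q dev
echo more >> file.txt
git commit -qam "dev moves on"
git push -q origin dev
git checkout -q feature-branch

# Keep the branch current: fetch, rebase, force-push
git fetch -q origin
git rebase -q origin/dev
git push -qf origin feature-branch
```

After the rebase, feature-branch sits on top of the latest dev, so the eventual PR only contains your own commits.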
Yeah, I didn’t realise they were rar archives from how they show up on disk - usually people name.their.torrents.like.this, so it fucks up typical file name conventions.
I’ll keep that in mind too, thanks! Not using qbitmanage yet, though; I’ll have to look into that 👀