Really? I thought the last version for those was Monterey and that it went EOL in 2024.
That reminds me: the other day a client walked up to the help desk I work at with a 2015 MBP still running El Capitan.


“Life forms. You precious little lifeforms. You tiny little lifeforms. Where are you?”
- Lt. Cmdr. Data, Star Trek: Generations


I’m confused. How did you save your parent’s Masters in Business Administration? /s
(Sorry. I can’t help but think that every time someone acronyms MacBook Air.)


OCLP?


I will point out that the card support for ROCm has improved; AMD explicitly supports most RDNA3 or later consumer GPUs, and I think support might go back to RDNA2, maybe RDNA1.
Supposedly it’s technically possible to use Polaris, but it’s very broken at this point.
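If you want to see how a particular card shows up, a quick sanity check (assuming the ROCm userspace tools are installed; package names vary by distro) is something like:

```bash
# List the GPU agents ROCm can see and their gfx ISA targets
rocminfo | grep -E "Marketing Name|gfx"

# Confirm the runtime can actually talk to the card
rocm-smi

# On some officially unsupported RDNA1/2 cards, people reportedly force a supported ISA:
# export HSA_OVERRIDE_GFX_VERSION=10.3.0
```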


The Arch Wiki is probably the single most useful documentation for any Linux user; I don’t even use Arch and it’s still extremely helpful.
I could see the benefit of using Arch just so that almost every function of my system is near-perfectly documented in the Arch Wiki.
As for the distro itself, it has the newest packages, and often good repos with interesting packages that Debian and others may lack. It also expects you to choose and install the components you want, whereas the Debian installer will usually just install defaults; you can use debootstrap for a minimal Debian install, but that’s not as well supported due to the way the tools are set up on the install medium.
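For reference, a minimal debootstrap run looks roughly like this (the target mount point and mirror below are just placeholders, and you still have to handle fstab, kernel, and bootloader yourself):

```bash
# Install a minimal Debian stable system into an already-mounted target
sudo debootstrap --arch=amd64 stable /mnt/debian http://deb.debian.org/debian/

# Chroot in to finish setup (users, fstab, kernel, bootloader, ...)
sudo chroot /mnt/debian /bin/bash
```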
The reason I choose Debian over Arch is that if I don’t use a device for several months and then have to install updates (like my school laptop over the summer), Debian Stable is more likely to survive that than Arch; I’ve destroyed several Arch VMs by trying to update them after not using them for months. I’m sure I could have salvaged them if I tried, but I’d rather just make a new VM.


It looks like NetBSD and OpenBSD might be good OSs for 32-bit; the next FreeBSD version is dropping support. I don’t use any BSDs, but I think a BSD is probably the best-supported modern Unix operating system for this kind of hardware as the last of the major distros drop i386.
Linux distro support for x86_32 is really thinning out for this use case; I’m sure the distros still exist, but they’re often niche projects. Gentoo may do the trick if you want to go that route; I can’t tell if they compile their newfangled prebuilt binary packages for i386, though, so if they don’t, you’ll probably have to set up cross-compiling from a more powerful x86_64 machine, which you’d need to use every time you update.
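If anyone does go the Gentoo route, my understanding (untested, so treat this as a sketch) is that crossdev on the x86_64 box can build an i686 toolchain and then cross-emerge packages for the old machine:

```bash
# On the x86_64 build host: create an i686 cross toolchain
sudo crossdev --stable --target i686-pc-linux-gnu

# Reportedly crossdev installs a cross-emerge wrapper named after the target,
# which builds packages against the 32-bit sysroot under /usr/i686-pc-linux-gnu
sudo i686-pc-linux-gnu-emerge --ask app-editors/nano
```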


Wasn’t necessarily suggesting 11 LTSC; just my personal choice.


Not really.
Ampere’s for servers; if you have the cash to blow, you can get a fancy workstation, but not a laptop. It’s really a shame; I think Ampere might be able to do well in the consumer CPU market if they wanted to face Qualcomm (and assuming they can get their single core performance up). A lot of their hardware seems to follow standards pretty well.
Graviton is only used internally inside Amazon and not sold to customers.
The only semi-decent ARM laptops you can get right now are Snapdragon ones, some of which kind of support Linux, but with a lot of caveats and obnoxious quirks.


I have to have Windows for my university’s test-taking spyware, so I just have a barebones 11 LTSC installed on a secondary drive.


I just looked it up, and it seems a lot of pre-Apple Silicon MacBooks had swappable AirPort cards that used a completely standard mini PCIe slot. From a cursory Google search, it looks completely possible to swap in something like an Intel Wi-Fi card that is supported natively by the kernel.
A mini-PCIe Wi-Fi card can be had for around $30; in fact, if you have a good stack of old Wintel laptops, one of those might have a card that works well. I did that with my sister’s laptop (although she was using Windows) – her Realtek Wi-Fi card was causing endless misery, so I ripped the Intel card out of an ultrabook from circa 2016 and put it in her laptop. No more issues.
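If you go that route, it’s easy to confirm which kernel driver actually claims the card after the swap; this is just standard PCI tooling, nothing Mac-specific:

```bash
# Show network controllers and which kernel driver/module is bound to each
lspci -nnk | grep -iA3 "network controller"
```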


Exactly. Luckily, back in high school, my IB History class spent a good couple months just learning about authoritarian rulers and their tactics.
I especially like pulling out Pinochet because he’s a clear and relatively recent example of right-wing authoritarianism, manipulation of existing religious structures, and US government support of authoritarian regimes, all of which help contextualize the US’s own trend towards authoritarianism.


I mean, I had an uncle showing me HTML at 7 (not a programming language, but still). I learned basic JS on Khan Academy at 11, and if I’d known it existed sooner, I would have started earlier.


That’s pretty normal for most UEFI x86_64 things up to 2020 or so.


UEFI first became common on new computers in 2011-2012, so I don’t think a lot of 2014 computers were BIOS-only.
I have a cheapo laptop from 2012 (one of the last Gateways) and it’s a UEFI machine.
At this point, I think 15 years ago is a more realistic estimate for the last legacy BIOS machines - my Win7 box with a 1st gen i5 is legacy BIOS.
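If you want to check a running Linux box, the kernel only exposes /sys/firmware/efi when it was booted via UEFI, so a quick test is:

```bash
# If this directory exists, the system booted via UEFI; otherwise it's legacy BIOS/CSM
[ -d /sys/firmware/efi ] && echo "UEFI boot" || echo "Legacy BIOS boot"
```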


Mostly, he uses Photoshop for printing, though, and I don’t know if Krita has as powerful a printing dialog.


My grandfather asked me about Linux, but unfortunately, he’s still using Photoshop for now.


+1 for Clevis. I’ve been using it on my laptop for a year and it works like a charm. Sometimes, you need to update bindings after kernel updates, but it’s overall quite smooth.
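For anyone curious, both the initial binding and the occasional refresh are one-liners; the device path and PCR choice below are just examples, so adjust them for your own LUKS partition and threat model:

```bash
# Bind an existing LUKS volume to the TPM2, sealed against Secure Boot state (PCR 7)
sudo clevis luks bind -d /dev/nvme0n1p3 tpm2 '{"pcr_bank":"sha256","pcr_ids":"7"}'

# After an update changes the measured boot state, refresh the binding in that key slot
sudo clevis luks regen -d /dev/nvme0n1p3 -s 1
```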


From what I’ve heard, ROCm may finally be getting out of its infancy; at the very least, I think by the time we get something useful, local, and ethical, it will be pretty well-developed.
Honestly, though, I’m in the same boat as you and actively try to avoid most AI stuff on my laptop. The only “AI” thing I use is the occasional image upscale. I find it kind of useless on photos, but it’s sometimes helpful when doing vector traces of bitmap graphics with flat colors; Inkscape’s results aren’t always good with lower-resolution images, so putting that specific kind of graphic through “cartoon mode” upscales sometimes improves results dramatically for me.
Of course, I don’t have GPU ML acceleration, so it just runs on the CPU; it’s a bit slow, but still less than 10 minutes.


I feel like most people who use Nvidia on Linux just got their machine before they were Linux users, with a small subset for ML stuff.
Honestly, I hear ROCm may finally be getting less horrible, is getting wider distro support, and supports more GPUs than it used to, so I really hope AMD will become as livable an ML dev platform as it already is a desktop GPU choice.
I’ve never even owned a Mac; I know about it because I’ve Hackintoshed a few times, so I’m familiar with OpenCore.