This thread is for general discussion of technology: AI, computers, software, hardware, automation, computation, you name it. If it involves technology of any sort, it goes here.
For internet privacy, there is a thread for that: >>>/b/381/
Due to the recent trade tariffs in the United States, I panic-bought a Framework 16 laptop, even though the laptop in question is made in Taiwan (which doesn't really exempt it from US tariffs; the idea of these, along with many other computer and computer component brands, ever being made in the USA is a far-fetched pipe dream).
I bought the DIY edition to save some money, since I got hit with nearly 200 USD in sales tax in my state, but at least I got free FedEx Express shipping.
The specifications of this base model, a Ryzen 7 7840HS (no RAM, no NVMe SSDs, since I can get those much cheaper on sale on Amazon), are a bit old for a brand-new laptop; Framework is definitely planning to put the Ryzen AI chips on a brand-new Framework 16 motherboard, but unlike the rest of their lineup, that board isn't ready yet. Still, the person I have in mind for this laptop likely doesn't care about having the latest and greatest, or about overkill specs.
Battery life, screen size, and portability are the major concerns, along with an easy-to-repair design for when it ends up at some sketchy repair shop in the middle of nowhere that likely can't fix most of the anti-repair laptops on sale today.
Holding it in my hand, the laptop is about as thick as the 2015 MacBook Pro, which is not bad for a computer of this size, and the enclosure is all metal, so expect dents rather than cracks if you drop it.
I also bought quite a few adapters: two USB-C (at least one is required to power the system; the default charger is 100W, but if you get the discrete GPU add-on for the back you will need 180W, and Framework does sell their own official charger for 100 USD), two USB-A, an SD card slot, and an HDMI port. All of these modular plugs interface over USB-C and lock into the chassis when you flip the locking switch after inserting them; they are very sturdy and won't go anywhere. Even better, if one of these plugs goes bad, you can just chuck it and get a 15+ USD replacement, without having to open the whole laptop to de-solder and re-solder a new port onto the motherboard.
Battery life is average: roughly the same as, if not worse than, the 2015 MacBook Pro it is replacing. Apple is usually king when it comes to laptop battery life, so it is a difficult thing to get right, especially with Windows 11's Modern Standby feature; it is noticeably worse on Windows 10, since that operating system's standby is far less optimized and has not received major feature updates since 2022.
I'll report back when time has passed using this laptop.
>>4508
Another thing I want to add: if you plan on using GNU/Linux with this laptop, the experience is noticeably better, especially since Framework provides official support for several popular GNU/Linux distros. The catch, however, is that you need to be on newer releases like Fedora 41 or Ubuntu 25.04 (soon to be released) for compatibility's sake, since the needed patches didn't make it into the kernel. If you plan on using a relatively "stable" but very outdated out-of-the-box distro like Debian, you are not going to have a good time, unless you get a used older Framework laptop model, which I strongly do not recommend, given that Framework is constantly listening to customer feedback about design oversights in the DIY experience and the general design of these computers.
In short, bleeding-edge distros (e.g. Fedora) or rolling release distros (e.g. Gentoo, Arch, Void, openSUSE Tumbleweed, etc.) are highly recommended for Framework computers, especially when bought new.
>>4509
Correction: the patches didn't make it into the more "stable" kernel releases. So you pretty much have to be on a distro that ships a recent kernel, like Fedora or any rolling release distro like Arch.
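If you want a quick way to sanity-check whether your install is on a new enough kernel before blaming the distro, here's a rough Python sketch; the 6.5 cutoff is just my own assumption for illustration, not an official number from Framework or the kernel folks.

# Rough sketch: check whether the running kernel meets an assumed minimum
# version for the Ryzen 7040-series hardware. The (6, 5) cutoff below is a
# placeholder guess, not an official Framework requirement.
import platform

ASSUMED_MINIMUM = (6, 5)

def kernel_version():
    # platform.release() returns something like "6.8.9-300.fc40.x86_64"
    major, minor = platform.release().split(".")[:2]
    return int(major), int(minor)

if kernel_version() >= ASSUMED_MINIMUM:
    print("Kernel looks new enough:", platform.release())
else:
    print("Kernel is probably too old for this hardware:", platform.release())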
Is there any reason right now to upgrade from a 3080 if it's not AI related?
>>4516
Depends on what games you want to play. If you're not planning on playing games that are slated to release in 2027 or later, there's no reason to.
Side note: I would mainly advise against buying GPUs older than the 30 series if you want continued support; the 30 series itself is already halfway through its official support lifespan.
Just recently, it was announced that the GTX 10 and 16 series GPUs are going to have CUDA SDK support dropped in releases after 12.8; still, I'm grateful that FSR 3 support is literally extending the lifespan of the legendary GTX 1080 Ti.
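If anyone wants to see where their own card sits before worrying about that deprecation notice, here's a small Python sketch (it assumes PyTorch with CUDA is installed; this is not NVIDIA's official tooling). For reference, Pascal (GTX 10 series) reports compute capability 6.x, Turing (GTX 16/20 series) 7.5, and Ampere (RTX 30 series) 8.6.

# Print the GPU's name and compute capability so you can compare it
# against NVIDIA's CUDA deprecation notes yourself.
import torch

if torch.cuda.is_available():
    major, minor = torch.cuda.get_device_capability(0)
    print(f"{torch.cuda.get_device_name(0)}: compute capability {major}.{minor}")
else:
    print("No CUDA-capable GPU detected.")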
>>4516
I also want to make this point clear before it escapes me once again: you're extremely lucky to be on the 30 series of GPUs.
Despite the 30 series being halfway through its support lifecycle, so far every new driver update has yet to negatively affect it in any major way. The 40 series has some negative effects, and the 50 series is a complete garbage fire.
Not to mention, 32-bit CUDA is still supported on the 30 series, making it a very valuable line of cards for playing new and older games alike in the current tech landscape.
I might just bite the bullet and get a used EVGA 3090 later on down the line, even though it has at most 5-6 more years of official NVIDIA support left for CUDA and Game Ready/Studio driver updates.
Another update to the NVIDIA driver fiasco:
https://www.nvidia.com/en-us/geforce/forums/game-ready-drivers/13/563719/announcing-geforce-hotfix-driver-57615-released-42/
The amount of missteps and insane mismanagement around these drivers, the GPU hardware designs, and so on makes me wonder why I bother with NVIDIA for anything other than compute/AI. If you're just gaming, AMD seems to be the only real option, since I've heard Intel is ramping down their dGPU department due to how much financial debt they're in right now.