fancyfredbot 9 months ago

I suspect relatively few Blackwell DGX GB200 servers will become e-waste in the near future. Look at how much an A100 goes for on eBay today. Even if/when the AI bubble bursts, there will be buyers for this stuff for years to come. I do suspect people will break these systems down into smaller components at some point, since relatively few buyers are happy about running a 200 kW rack. If they do end up as waste in the next decade, I'll be diving into the skip to retrieve them.

  • pornel 9 months ago

    You're name-dropping the hot new card, but every card before it was also the hot new card when it was released. It reminds me of the NEVER OBSOLETE sticker eMachines put on their unbelievably fast 500 MHz Pentium PCs.

    These cards will inevitably become worse than worthless when the increased running costs of older-generation hardware exceed the cost of buying next-generation hardware. At some point it won't make sense to spend more electricity, more cooling, and more rack space running a hospice for old cards when the same workload can be done more easily, more quickly, and more cheaply on newer hardware, even after adding the cost of buying the new hardware (see the back-of-envelope at the end of this comment).

    The Xeons that used to cost $4000 can now be found on eBay for 1% of their original sticker price, because they're so unprofitable to run.
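
    To make the break-even logic above concrete, here's a rough back-of-envelope in Python. Every number in it is a made-up placeholder (card prices, power draw, electricity and hosting costs, and the assumed throughput ratio), so treat it as a sketch of the calculation rather than real pricing:

        # Back-of-envelope: when does replacing a shelf of old cards with one
        # new card pay for itself? All numbers are illustrative placeholders.
        OLD_CARDS_REPLACED = 10      # assumed: one new card matches 10 old cards
        OLD_POWER_KW = 0.3           # assumed draw per old card
        NEW_POWER_KW = 0.7           # assumed draw of the new card
        KWH_PRICE = 0.15             # assumed $/kWh, including cooling overhead
        SLOT_COST_PER_MONTH = 150    # assumed rack space / hosting per card slot
        NEW_CARD_PRICE = 25_000      # assumed purchase price of the new card
        HOURS_PER_MONTH = 730

        old_fleet_monthly = OLD_CARDS_REPLACED * (
            OLD_POWER_KW * HOURS_PER_MONTH * KWH_PRICE + SLOT_COST_PER_MONTH)
        new_card_monthly = (NEW_POWER_KW * HOURS_PER_MONTH * KWH_PRICE
                            + SLOT_COST_PER_MONTH)

        monthly_saving = old_fleet_monthly - new_card_monthly
        print(f"old fleet: ${old_fleet_monthly:,.0f}/mo, "
              f"new card: ${new_card_monthly:,.0f}/mo")
        print(f"break-even after ~{NEW_CARD_PRICE / monthly_saving:.0f} months")

    With these particular made-up numbers the new card pays for itself in under two years; with different assumptions it may never pay off, which is exactly the point - the old cards stay worth something only until the numbers flip.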

    • fancyfredbot 9 months ago

      The GB200 is specifically called out by the linked article - I didn't pick it at random.

      "The researchers point out that the weight of Nvidia's latest Blackwell platform in a rack system — designed for intensive LLM inference, training and data processing tasks — tips the scales at 1.36 tons, demonstrating how material-intensive GenAI can be"

      While I certainly agree that old Xeons are selling for 1% of their original MSRP, that doesn't really disagree with what I'm saying - if someone's buying one for $40, it's not e-waste (yet). I do agree that things can eventually become e-waste once the initial savings are significantly offset by running costs. However, it's not clear how much longer we'll continue to see such large generational improvements in power efficiency, or whether these GB200s will be entirely obsolete once those improvements eventually stop happening.

    • Incipient 9 months ago

      Those would be CPUs that are roughly 10 years old. It's not only that they're unprofitable to run - they're also at end of life. I know CPUs can and do run longer, but realistically they're near the end of their useful lives.

  • MrHamburger 9 months ago

    I think you are right. We can already see this happening on AliExpress, where one can buy cheap X99 ATX motherboards built around LGA2011-3 to take Intel Xeon E5 chips - server processors from 10 years ago. Elegant repurposing of old e-waste.

SubiculumCode 9 months ago

We have a whole lot of problems in the world, and yes, we can walk and chew gum (mostly), but trash piles are the least concerning to me. Properly disposed-of e-waste is not a problem, and our dumps are not threatening to overrun the countryside. The biggest problem with generative AI from an environmental perspective is carbon emissions, and it seems that nuclear is being revived (maybe not fast enough for climate change). Existential threats from AI are also very concerning (we will see if it's a jobs threat), and include malicious propaganda and biological and chemical weapons. But there is also hope that the speed of science will increase with AI, which perhaps could at least help solve issues with climate change, even if the cleaner environment is only enjoyed by rogue terminators.

123yawaworht456 9 months ago

A6000/A100/H100/etc. will sell like hotcakes to hobbyists and small companies when data centers start to get rid of them (if they're allowed to). The market is starved for affordable hardware. The P40 was under $100 1.5 years ago; now it's $250+.

  • kkielhofner 9 months ago

    As one example, take a peek at /r/LocalLLaMA[0] (I suspect you already know it). These people are snapping up anything and everything they can get their hands on at a reasonable price.

    To your point on the P40, it's an eight-year-old card, but fortunately Nvidia has a history of long-term support (especially for "datacenter" GPUs). The Pascal series is still fully supported by the latest Nvidia driver and CUDA releases, and projects like llama.cpp are still fairly regularly adding performance optimizations even for Maxwell-series GPUs! (A quick way to check what a current stack sees on an old card is sketched at the end of this comment.)

    Current V100/A100/H100/etc. hardware families are not going to end up as e-waste anytime soon. In fact, compare used pricing (and demand) of GPUs to CPUs, RAM, disks, motherboards, etc. from eight years ago... That hardware ends up in the trash/at e-waste recyclers much, much sooner (even with /r/homelab).

    [0] - https://old.reddit.com/r/LocalLLaMA/
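
    As a concrete (and hedged) illustration of that driver/CUDA longevity, here's a minimal Python sketch that just enumerates whatever a current CUDA-enabled PyTorch build can see and prints each card's compute capability - the P40 reports 6.1 (Pascal). It assumes a torch wheel whose bundled kernels still cover that capability; it isn't specific to any one card:

        # Enumerate visible CUDA devices and their compute capability.
        # Assumes a CUDA-enabled PyTorch install; purely a diagnostic sketch.
        import torch

        if not torch.cuda.is_available():
            print("No CUDA device visible - check the driver and the torch build")
        else:
            for i in range(torch.cuda.device_count()):
                name = torch.cuda.get_device_name(i)
                major, minor = torch.cuda.get_device_capability(i)
                # Pascal cards such as the P40 report compute capability 6.1
                print(f"GPU {i}: {name}, compute capability {major}.{minor}")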

human_llm 9 months ago

What is a good place for buying used servers/GPUs when the AI bubble bursts?

foota 9 months ago

Seems stupid to assume such strong exponential growth.

Havoc 9 months ago

I don’t think it’ll be all that different from existing CPU farms. I doubt there are many 15-year-old Xeons still powering AWS.

  • jeffbee 9 months ago

    You can rent a Harpertown Xeon on AWS right now, if you want it. Those are 17 years old.

kurisufag 9 months ago

Good news for homelabbers and bootstrappers.