Quote in September last year for a high-memory compute server.

Uncategorized · 31 Posts · 22 Posters · 74 Views
This topic has been deleted. Only users with topic management privileges can see it.
  • fuzzygroup@ruby.social wrote:

    @digitalraven That's a 330% increase. That's way, way, way beyond anything I've seen which makes me wonder if this particular vendor is price gouging. Out of curiosity, have you shopped around at all?

    digitalraven@retro.pizza
    #5

    @fuzzygroup I have, and that increase is across the board for all vendors on the UK academic purchasing contracts. This server needs a reasonable amount of RAM — 4TB+, a lot by consumer standards, but mid-range for the compute we're doing.

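As a sanity check on the increase being debated here (using the £28,000 and £90,500 figures quoted in this thread), the arithmetic can be sketched in a few lines of Python — note the jump works out to roughly 3.2× the original price, i.e. about a 223% increase:

```python
# Sanity check on the two prices quoted in this thread.
old_price = 28_000   # GBP, quote from September last year
new_price = 90_500   # GBP, quote today

multiple = new_price / old_price                        # how many times the old price
pct_increase = (new_price - old_price) / old_price * 100  # percentage increase

print(f"{multiple:.2f}x the original price")  # -> 3.23x the original price
print(f"{pct_increase:.0f}% increase")        # -> 223% increase
```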
    • mikebabcock@floss.social (replying to fuzzygroup@ruby.social, quoted above)
      #6

      @fuzzygroup @digitalraven this is not one vendor. Everything in RAM and solid state storage is basically sold out.

      • digitalraven@retro.pizza wrote:

        To say that everyone involved in Anthropic, OpenAI, nVidia, Oracle, and the rest of the spicy-autocorrect bubble that surrounds these scum should be burned alive is AN INSULT TO FIRE

        craignicol@glasgow.social
        #7

        @digitalraven hospitals need fire. It's the only thing that destroys pathogens and parasites.

        • digitalraven@retro.pizza wrote:

          Quote in September last year for a high-memory compute server. £28,000.

          Quote today for the _exact same machine_. £90,500

          This is for medical research. Saving lives. When I say LLMs are killing people by killing research computing, this is what I mean.

          @davidgerard.co.uk @edzitron.com

          burnoutqueen@todon.nl
          #8

          @digitalraven that could be used for serious calculations in physics and chemistry.

          Imagine the amount of DFT iterations you could do on that thing

          • fluffykittycat@furry.engineer (replying to the opening post, quoted above)
            #9

            @digitalraven

            What a mess. They shouldn't be allowed to buy up the whole global supply of RAM like that.

            Out of curiosity, what research are you doing?

            • anachronistjohn@zia.io (replying to the opening post, quoted above)
              #10

              @digitalraven@retro.pizza This is why I can't understand the idea that groups that need this kind of computing wouldn't just buy the hardware and learn to or find someone to manage it. A few months at September's prices would pay for some ridiculously powerful hardware. A few months at today's prices would pay for some ridiculously powerful and ridiculously overpriced hardware now.

              Too many people believed the marketing bull telling everyone to move everything to "the cloud". People who don't understand that they're being led by salespeople also don't have the foresight to understand how badly things can go, at least not until they go badly, like they have now.

              • digitalraven@retro.pizza (replying to anachronistjohn@zia.io, quoted above)
                #11

                @AnachronistJohn I do not see the relevance of what you are saying, since it appears at best tangential to what I posted.

                My research group _is_ buying the hardware and I am managing it, there is no "cloud" involved (except for the AI-boosting scum). It's a matter of timings, research grants, and budget approvals that we have to buy now rather than back in September.

              • anachronistjohn@zia.io (replying to digitalraven@retro.pizza, quoted above)
                  #12

                  @digitalraven@retro.pizza Oops. I misread. I thought this was cloud spending. My apologies

                  Yes, getting approval in appropriate amounts of time is difficult and timing always has Murphy's Law working against us.

                • digitalraven@retro.pizza (replying to anachronistjohn@zia.io, quoted above)
                    #13

                    @AnachronistJohn It's all good. Apologies if I was a bit sharp there; I've been arguing with techbros on various platforms.

                    And yeah, this is in some ways just the latest way that research funding has found to make things as inconvenient as possible.

                • woody@pleroma.pch.net (replying to the opening post, quoted above)
                      #14
                      @digitalraven

                      Yep. We have to keep up with queries to the root and TLD nameservers, and the machines we need to do that have tripled in price in the last three months.

                      I'm _really_ looking forward to this bubble bursting. I cannot express in words how much I yearn for that day to arrive.
                  • andthisismrspeacock@mas.to (replying to the opening post, quoted above)
                        #15

                        @digitalraven Our compute vendor warned us that after March 30 their prices are going up 250% and to order everything we need before then, so we scrambled to put in a $30M order by this week.

                        Yesterday they told us they not only wouldn't be filling that order (we can try to resubmit it at the higher prices), but they won't guarantee they can fill anything at all for the rest of the year.

                        Things is gettin interesting.

                        #IT #SysAdmin #DatacenterLife

                  • dunwichtype@typo.social (replying to the opening post, quoted above)
                          #16

                          @digitalraven this is how AI solves climate change. If enough people die carbon output will drop.

                    • mort@floss.social (replying to the opening post, quoted above)
                            #17

                            @digitalraven I bought a used EPYC CPU + Supermicro motherboard + 512GB RAM combo a bit over a year ago for ~$1500.

                            The same combo today? Capped to 128GB RAM, at $2500. 1/4 the RAM at 1.7x the price for one year older hardware.

                            Oh and the hard drives we put in the server? Over 2x the price for the same capacity. The SSDs too.

                            It's so bad.

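The figures in the post above imply an even steeper jump on a per-gigabyte basis. A quick check, assuming the post's rough numbers ($1500 for 512 GB then, $2500 for 128 GB now):

```python
# Price-per-gigabyte comparison for the used EPYC RAM combo described above.
then_price, then_gb = 1500, 512   # ~$1500 for 512 GB, a year ago
now_price, now_gb = 2500, 128     # ~$2500 for 128 GB, today

then_per_gb = then_price / then_gb   # ~$2.93/GB
now_per_gb = now_price / now_gb      # ~$19.53/GB

print(f"then: ${then_per_gb:.2f}/GB, now: ${now_per_gb:.2f}/GB")
print(f"effective per-GB cost is {now_per_gb / then_per_gb:.1f}x higher")  # -> 6.7x higher
```

So "1/4 the RAM at 1.7x the price" works out to roughly 6.7× the cost per gigabyte.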
                    • digitalraven@retro.pizza wrote:

                              If there were any justice, Sam Altman's impact on the world would be measured in mass graves.

                      Guest
                              #18

                              @digitalraven If there were any justice, accused child rapist Sam Altman's impact on the world would be measured in meters of ground penetration

                      • naturepunk@ecoevo.social (replying to digitalraven@retro.pizza, quoted above)
                                #19

                                @digitalraven how many mass graves do you get to a drowned child attempting to claim asylum in a country that has distanced itself from their plight?

                                I'm sure somebody, in some government agency, has that calculation on hand 😞

                        • digitalraven@retro.pizza (replying to fluffykittycat@furry.engineer, quoted above)
                                  #20

                                  @fluffykittycat We study cognitive and neurological factors over a long-term population (the first lot recruited in the 1920s). A lot of the results concern markers of Alzheimer's and various kinds of dementia, both potential ways of reducing effects and early detection.

                          • quoidian@mastodon.online (replying to the opening post, quoted above)
                                    #21

                                    @digitalraven
                                    I can see BOINC (https://en.wikipedia.org/wiki/Berkeley_Open_Infrastructure_for_Network_Computing), or similar, becoming more popular.

                            • _eike@chaos.social (replying to fuzzygroup@ruby.social, quoted above)
                                      #22

                                      @fuzzygroup @digitalraven unfortunately not just a single vendor. We have seen similar increases in similar hardware with multiple distributors and OEMs. Anything that has decent amounts of RAM, flash, or HDD has been going up ludicrously fast with availability going down and being forecast to be near zero come Q3/Q4.

                                      Likewise not doing "AI" BS — just regular old on-premise compute.

                                      If you want to follow along, open a price history graph for any available DDR5 RDIMM 64 GB module and set a 6-month timescale.

                              • kimsj@mastodon.social (replying to the opening post, quoted above)
                                        #23

                                        @digitalraven
                                        Don’t worry. When ChatGPT goes broke there’ll be loads of cheap RAM as the insolvency firm tries to monetise the assets.

                                • rbairwell@mastodon.org.uk (replying to kimsj@mastodon.social, quoted above)
                                          #24

                                          @KimSJ @digitalraven Alas, I've heard that whilst the memory chips themselves are the same, the boards they're mounted on differ from those in conventional PCs, so the chips would need desoldering and then manual resoldering onto new boards (normally done by pick-and-place machines, which require the chips to be supplied on sheets). I suspect even if the AI boards are "sold" at 1/4 price, the resulting sticks will still cost more than they did a few months ago.
