CIRCLE WITH A DOT
Quote in September last year for a high-memory compute server.

Uncategorized · 31 Posts · 22 Posters · 74 Views
This topic has been deleted. Only users with topic management privileges can see it.
  • fluffykittycat@furry.engineer

    @digitalraven

    What a mess. They shouldn't be allowed to buy up the whole global supply of RAM like that.

    Out of curiosity, what research are you doing?

    digitalraven@retro.pizza (This user is from outside of this forum)
    #20

    @fluffykittycat We study cognitive and neurological factors over a long-term population (the first cohort was recruited in the 1920s). A lot of the results concern markers of Alzheimer's and various kinds of dementia, both potential ways of reducing their effects and early detection.

    • relay@relay.infosec.exchange shared this topic
    • digitalraven@retro.pizza

      Quote in September last year for a high-memory compute server. £28,000.

      Quote today for the _exact same machine_. £90,500.

      This is for medical research. Saving lives. When I say LLMs are killing people by killing research computing, this is what I mean.

      @davidgerard.co.uk @edzitron.com

      quoidian@mastodon.online
      #21

      @digitalraven
      I can see BOINC (https://en.wikipedia.org/wiki/Berkeley_Open_Infrastructure_for_Network_Computing), or similar, becoming more popular.

      • fuzzygroup@ruby.social

        @digitalraven That's a 330% increase. That's way, way, way beyond anything I've seen which makes me wonder if this particular vendor is price gouging. Out of curiosity, have you shopped around at all?

        _eike@chaos.social
        #22

        @fuzzygroup @digitalraven unfortunately not just a single vendor. We have seen similar increases in similar hardware with multiple distributors and OEMs. Anything that has decent amounts of RAM, flash, or HDD has been going up ludicrously fast with availability going down and being forecast to be near zero come Q3/Q4.

        Likewise, we're not doing "AI" BS, just regular old on-premise compute.

        If you want to follow along, open a price history graph for an available DDR5 RDIMM 64 GB module and set it to a six-month timescale.


          kimsj@mastodon.social
          #23

          @digitalraven
          Don’t worry. When ChatGPT goes broke there’ll be loads of cheap RAM as the insolvency firm tries to monetise the assets.


            rbairwell@mastodon.org.uk
            #24

            @KimSJ @digitalraven Alas, I've heard that whilst the memory chips themselves are the same, the boards they're surface-mounted on are different to conventional PCs', so they would need reflowing/desoldering and then manual resoldering onto new boards (which are normally done by pick'n'place machines that require the chips to be on sheets). I suspect even if the AI boards are "sold" at 1/4 price, the resulting sticks will still be priced higher than they were a few months ago.

            • andthisismrspeacock@mas.to

              @digitalraven Our compute vendor warned us that after March 30 their prices are going up 250% and to order everything we need before then, so we scrambled to put in a $30M order by this week.

              Yesterday they told us they not only wouldn't be filling that order (we can try to resubmit it at the higher prices), but they won't guarantee they can fill anything at all for the rest of the year.

              Things is gettin interesting.

              #IT #SysAdmin #DatacenterLife

              rootwyrm@weird.autos
              #25

              @andthisismrspeacock @digitalraven @stroz and it's been a lot worse than that for people who can't make $30M orders, for some time now.

              Equipment they'd already paid for has been sold out from under them, with a "maybe it'll get filled in 2027, tough luck."
              Orders that were allegedly already being built suddenly cancelled.
              Even systems the hyperscalers are too stupid to use or admin getting shoved back quarters at a time for lack of RAM or storage.


                iamdoon@mspsocial.net
                #26

                @digitalraven yep. 😞😡

                For our medical imaging researchers, I ordered a ~$60k server in October. The same config quoted yesterday is ~$127k.


                  manux@mastodon.opencloud.lu
                  #27

                  @digitalraven LLM or American oligarchs?


                    digitalraven@retro.pizza
                    #28

                    @manux LLM bros and their bubble in particular. Blaming all oligarchs (and there is a lot to blame them for in general) absolves the LLM shitheads of specific responsibility.

                    • digitalraven@retro.pizza

                      @fuzzygroup I have, and that increase is across the board for all vendors on the UK academic purchasing contracts. This server needs a reasonable amount of RAM — 4TB+, a lot by consumer standards, but mid-range for the compute we're doing.

                      w_b@mastodon.world
                      #29

                      @digitalraven @fuzzygroup

                      I'm in the market for a new personal machine at home. Prices in this segment are about triple what they should be too.

                      • drajt@fosstodon.org shared this topic

                        adx@infosec.exchange
                        #30

                        @digitalraven We are getting quotes that are now only good for 4 days. Not even 4 business days. So if they send the quote on Friday afternoon you have to purchase by Monday.


                          digitalraven@retro.pizza
                          #31

                          @adx We are seeing similar: quote lifetimes are down as well as prices being up.
