• BlackLaZoR@kbin.run · 2 months ago

    This constant shuttling of information back and forth is responsible for consuming as much as 200 times the energy used in the computation, according to this research.

    Press x to doubt. I know moving data costs more energy than the computation itself, but that sounds like pure BS.
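
    For what it’s worth, the claim is in the right ballpark. A hedged back-of-envelope using the rough per-operation energies from Mark Horowitz’s ISSCC 2014 keynote (45 nm figures; absolute numbers vary a lot by process node and design):

    ```python
    # Rough per-operation energies (picojoules), Horowitz ISSCC 2014, 45 nm.
    # Absolute values vary widely by process node and design.
    ENERGY_PJ = {
        "fp32_add":  0.9,    # 32-bit floating-point add
        "fp32_mul":  3.7,    # 32-bit floating-point multiply
        "sram_read": 10.0,   # 32-bit read from a small on-chip SRAM
        "dram_read": 640.0,  # 32-bit read from off-chip DRAM
    }

    ratio = ENERGY_PJ["dram_read"] / ENERGY_PJ["fp32_add"]
    print(f"DRAM read vs. fp32 add: ~{ratio:.0f}x")  # ~700x
    ```

    By those numbers, “200 times” is actually on the conservative side for off-chip traffic.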

    • oakey66@lemmy.world · 2 months ago

      Everyone is trying to cash in before it all collapses. Big tech has turned into what I’d dub hype-hopping, since they have no good ideas.

      • Sanctus@lemmy.world · 2 months ago

        Where the fuck are the automatic dishwashers that put the dishes away? Dryers that fold your clothes for you? We’ve got a long fucken way to go and we haven’t even half pillaged the Jetsons. Let’s get on with the progress already! Fuck chasing profit!

          • TomSelleck@lemm.ee · 2 months ago

            Let me know when A.I. can make me a decent sandwich, and then I’ll start to care.

          • Aniki 🌱🌿@lemmy.zip · 2 months ago

            I’ve been toying with the idea that giving rocks logic or moving FTL is just not something we’re going to see cracked in our lifetimes. Certainly not while it’s capitalism or bust.

            We’ll need breakthroughs that involve the whole of humanity.

  • A_A@lemmy.world · 2 months ago

    Experimental demonstration of magnetic tunnel junction-based computational random-access memory
    “In this work, a CRAM array based on magnetic tunnel junctions (MTJs) is experimentally demonstrated. First, basic memory operations, as well as 2-, 3-, and 5-input logic operations, are studied. Then, a 1-bit full adder with two different designs is demonstrated.”
    https://www.nature.com/articles/s44335-024-00003-3
    So this has been experimentally demonstrated, but only at a small scale.
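
    For anyone wondering what the “1-bit full adder” part means: it’s the basic building block of binary addition. A plain-software sketch of the same truth table the CRAM array computes in place (the MTJ implementation details are in the paper; this is just ordinary Python):

    ```python
    def full_adder(a: int, b: int, cin: int) -> tuple[int, int]:
        """One bit position of binary addition: returns (sum, carry_out)."""
        s = a ^ b ^ cin
        cout = (a & b) | (cin & (a ^ b))
        return s, cout

    # Exhaustive truth table: the eight cases the MTJ array has to realize.
    for a in (0, 1):
        for b in (0, 1):
            for cin in (0, 1):
                print(a, b, cin, "->", full_adder(a, b, cin))
    ```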

  • FaceDeer@fedia.io · 2 months ago

    It probably doesn’t matter from a popular-perception standpoint. The talking point that AI burns massive amounts of coal for each deepfake generated is now deeply ingrained; it’ll be brought up regularly for years after it’s no longer true.

    • GBU_28@lemm.ee · 2 months ago

      Casual consumers don’t care one bit about that. Companies would, because this would save them money.

    • palordrolap@kbin.run · 2 months ago

      To stick with the analogy, this is like putting a small CPU inside the bottle, so the main CPU<->RAM bottleneck isn’t used as often. That said, any CPU, within the RAM silicon or not, is still going to have to shift data around, so there will still be choke points; they’ll just be quicker. Theoretically.

      Thinking about it, this is kind of the counterpart to CPUs having an on-chip cache of memory.

      Edit: counterpoint to counterpart
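
      A toy cost model of the bottle analogy, with assumed (not measured) per-word energies, just to show why computing where the data lives can win by orders of magnitude:

      ```python
      # Toy model: process WORDS words once. Energy constants are assumptions.
      WORDS = 1_000_000
      E_MOVE_PJ = 640.0  # assumed cost to move one word across CPU<->RAM
      E_OP_PJ = 1.0      # assumed cost of one logic op on one word

      # Conventional: fetch each word to the CPU, compute, write it back.
      conventional_pj = WORDS * (2 * E_MOVE_PJ + E_OP_PJ)

      # In-memory: compute next to the data; only one result word moves.
      in_memory_pj = WORDS * E_OP_PJ + 2 * E_MOVE_PJ

      print(f"conventional: {conventional_pj / 1e6:,.0f} uJ")  # ~1,281 uJ
      print(f"in-memory:    {in_memory_pj / 1e6:,.3f} uJ")     # ~1.001 uJ
      ```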

  • HubertManne@moist.catsweat.com · 2 months ago

    I hope this is true. AI has its uses, but it can’t keep being this much more inefficient. It would be great if answering a query used no more energy than a standard web query.
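
    For reference, the commonly cited per-query figures, both rough estimates rather than measurements:

    ```python
    SEARCH_WH = 0.3  # oft-quoted Google figure for one web search (2009)
    LLM_WH = 3.0     # order-of-magnitude estimate for one LLM chat query
    print(f"LLM query ~ {LLM_WH / SEARCH_WH:.0f}x one web search")  # ~10x
    ```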

  • Ilovethebomb@lemm.ee · 2 months ago

    Arm’s CEO recently suggested that by 2030, AI may consume a quarter of all energy produced in the U.S.

    No way does AI produce enough value that they could afford this.
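
    A quick scale check on that quote (4,000 TWh/yr is an approximate EIA figure for annual US electricity generation, and the price is a rough assumption; note the quote says “energy,” which is a bigger pool than electricity):

    ```python
    US_ELECTRICITY_TWH = 4_000   # approximate annual US generation (EIA)
    PRICE_USD_PER_KWH = 0.10     # assumed blended electricity price

    ai_twh = 0.25 * US_ELECTRICITY_TWH
    cost_usd = ai_twh * 1e9 * PRICE_USD_PER_KWH  # 1 TWh = 1e9 kWh

    print(f"A quarter of US electricity: ~{ai_twh:,.0f} TWh/yr")
    print(f"Electricity cost alone: ~${cost_usd / 1e9:,.0f}B per year")
    ```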