• chirospasm@lemmy.ml

    “We did the back-of-napkin math on what ramping up this experiment to the entire brain would cost, and the scale is impossibly large — 1.6 zettabytes of storage costing $50 billion and spanning 140 acres, making it the largest data center on the planet.”

    Look at what they need to mimic just a fraction of our power.
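
    For reference, a minimal sketch of that napkin math in Python, assuming the reported ~1.4 PB for the 1 mm³ sample and a ballpark ~1.2 million mm³ adult brain volume (both figures come from coverage of the study, not from this thread):

    ```python
    # Back-of-napkin scaling of the 1 mm^3 scan to a whole brain.
    # Assumptions: ~1.4 PB for the scanned cubic millimeter (as reported)
    # and ~1.2e6 mm^3 total brain volume (rough adult estimate).
    SAMPLE_BYTES = 1.4e15        # ~1.4 petabytes for 1 mm^3
    BRAIN_VOLUME_MM3 = 1.2e6     # ~1,200,000 mm^3

    total_bytes = SAMPLE_BYTES * BRAIN_VOLUME_MM3
    print(f"{total_bytes / 1e21:.2f} ZB")  # ~1.68 ZB, close to the quoted 1.6 ZB
    ```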

      • utopiah@lemmy.ml

        I’d be curious about the access speed comparison, because I’d assume for the brain it’d be RAM equivalent, not SSD.
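
        For rough context, typical order-of-magnitude access latencies look something like this (ballpark commodity-hardware figures, not numbers from the article):

        ```python
        # Rough, commonly cited order-of-magnitude access latencies (ns).
        # Ballpark figures for commodity hardware, not measurements.
        latencies_ns = {
            "CPU L1 cache": 1,
            "DRAM": 100,
            "NVMe SSD": 100_000,
            "Spinning disk": 10_000_000,
        }
        for medium, ns in latencies_ns.items():
            print(f"{medium:>13}: {ns:>12,} ns ({ns / 100:.0e}x DRAM)")
        ```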

        • TheRealKuni@lemmy.world

          Just gotta lower the clock speed enough for us not to notice. As long as we don’t interact with the outside world, just with other stored human brains, it can be slow as molasses and we’ll never know.
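
          A toy sketch of that idea (the slowdown factor is made up): the simulation’s internal clock advances a fixed step per update no matter how long each update takes in wall-clock time, so an observer defined purely in terms of the internal clock can’t tell.

          ```python
          import time

          # Toy illustration: simulated time advances by a fixed step per
          # tick, regardless of wall-clock cost. Anything defined purely in
          # terms of sim_time cannot detect the slowdown.
          SLOWDOWN = 2.0   # hypothetical wall-seconds of compute per sim-second
          sim_time = 0.0
          start = time.monotonic()

          for _ in range(3):
              time.sleep(0.001 * SLOWDOWN)  # stand-in for one update's compute cost
              sim_time += 0.001             # internal clock: exactly one step

          print(f"simulated: {sim_time:.3f}s, wall: {time.monotonic() - start:.3f}s")
          ```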

        • Vivendi@lemmy.zip

          The brain is a tightly coupled biological computer; its access speed is practically instantaneous.

          Also, how the brain stores and processes data is still a largely unexplored field of science.

    • Korkki@lemmy.world

      It’s not like human brain memory or consciousness is that information-dense. They just scanned it at that high a resolution.

    • SirEDCaLot@lemmy.today

      In fairness, the scan required such astronomical resources because of how they were scanning it. They took the cubic-millimeter chunk, cut it into 5,000 super-thin flat slices, and then made extremely high-detail scans of each slice. That’s why they needed AI: to piece those flat layers back together into some sort of 3D structure.

      Once they have the 3D structure, the scans are useless and can be deleted.

      In time, it should be possible to scan the tissue and get the 3D structure without such extreme data requirements.
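
      A minimal sketch of that reconstruction step: stacking aligned 2D slice scans into one 3D volume. The slice count matches the ~5,000 slices described; the image size and loader are placeholders, and real pipelines must first register neighboring slices to each other, which is where the ML effort goes.

      ```python
      import numpy as np

      N_SLICES = 5000  # matches the roughly 5,000 slices described above

      def load_slice(i: int) -> np.ndarray:
          """Placeholder: load one pre-aligned slice scan as a 2D array."""
          return np.zeros((256, 256), dtype=np.uint8)

      # Stack the 2D slices along a new axis to recover a 3D volume.
      volume = np.stack([load_slice(i) for i in range(N_SLICES)], axis=0)
      print(volume.shape)  # (5000, 256, 256): once you have this, raw scans can go
      ```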

      • redcalcium@lemmy.institute

        Imagine donating your body to science, and the scientists slice up your brain and scan the slices; then, decades later, you suddenly wake up in a virtual space because they are finally able to emulate a copy of your brain on a supercomputer.

        • SirEDCaLot@lemmy.today

          Sounds good to me. You should also look into cryonics. Basically, you sign up with a company and donate your body to them; when you die, they pump you full of antifreeze and vitrify you in liquid nitrogen. Right now there’s no way to recover from it: the antifreeze is toxic, and we don’t yet know how to undo the cell damage from freezing. But the idea is that someday in the future we will figure those things out, and then hopefully be able to thaw the frozen dead person, fix the damage caused by the freezing process, fix whatever problem killed them in the first place, and reanimate them.

          For a lower fee, they will cut off your head and freeze just that, the idea being that someday in the future they will be able to transplant your brain into an artificially created body.

          • TheRealKuni@lemmy.world

            But they really only unfreeze people who knew Richard Dawkins and Mrs. Garrison. Then laugh at you when you want to play Nintendo Wii.

    • henfredemars@infosec.pub

      It’s not even complete. You might have the physical brain tissue, but that tissue is stateful. The tissue contains potentials and electrical charges that must be included in a complete model.
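
      A minimal sketch of what “stateful” means here, using a standard textbook leaky integrate-and-fire neuron (not the study’s model): two neurons with identical wiring and input but different stored membrane potentials behave differently, so structure alone underdetermines behavior.

      ```python
      # Leaky integrate-and-fire step: output depends on stored electrical
      # state (membrane potential v), not just on the wiring a structural
      # scan captures.
      def step(v: float, input_current: float, leak: float = 0.1,
               threshold: float = 1.0) -> tuple[float, bool]:
          v = v * (1.0 - leak) + input_current
          if v >= threshold:
              return 0.0, True   # spike, then reset the membrane potential
          return v, False

      # Same structure, same input; different stored potential -> different output.
      print(step(0.05, 0.5))  # (0.545, False): stays quiet
      print(step(0.95, 0.5))  # (0.0, True):    fires a spike
      ```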

  • Evil_Shrubbery@lemm.ee

    Yes, humans kinda brute-forced intelligence with current assets - made it bigger (with some birthing issues) & more power-hungry (with some cooling issues), but it mostly works.

  • riplin@lemm.ee

    That’s capturing everything. Ultimately you need only a tiny fraction of that data to emulate the human brain.

    Numenta is working on a brain model to create functional sections of the brain. Their approach is different, though: they are trying to understand the components and how they work together, rather than just aggregating vast amounts of data.

    • remotelove@lemmy.ca

      > Ultimately you need only a tiny fraction of that data to emulate the human brain.

      I am curious how that conclusion was formed, as we have only recently discovered many new types of functional brain cells.

      While I am not saying this is the case, that statement sounds like it was based on the “we only use 10% of our brain” myth, which is why I am asking for clarification.

      • biscuitswalrus@aussie.zone

        They took imaging scans. It’s like if I took a picture of a 1 MB memory chip and omg, my picture is 4 GB in RAW. At that rate, imaging the whole RAM stick the chip was on could take dozens of GB!

      • MajorSauce@sh.itjust.works

        Not taking a position on this, but I could see a comparison with doing an electron scan of a painting. The scan would take an insane amount of storage, while the (albeit ultra-high-definition) picture would fit on a Blu-ray.

    • kakes@sh.itjust.works

      Of course, that’s not to say the data isn’t also important. It’s very possible that we’re missing something crucial about how the brain functions, despite everything we know so far. The more data we have, the better we can build and test these more streamlined models.

    • henfredemars@infosec.pub

      No, it does not. It captures only the physical structures. There’s also chemical and electrical state that’s missing.

      • biscuitswalrus@aussie.zone

        Think of this:

        You find a computer from 1990. You take a picture of a 1 KB memory chip on a RAM stick using a DSLR camera; there are 8 chips per stick and 4 sticks. Your image in RAW comes out at 1 GB, so you project it will take 32 GB to image your 32 KB of RAM.

        You’ve described nothing about the RAM itself. That measurement is meaningless other than telling you how detailed the imaging process is.
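
        Putting numbers on that analogy (all figures are the hypothetical ones from the comment above):

        ```python
        # Imaging overhead per chip says nothing about how much information
        # the chip itself stores. All values are hypothetical, from the
        # analogy above.
        CHIP_CONTENT_BYTES = 1 * 1024     # 1 KB chip
        RAW_IMAGE_BYTES = 1 * 1024**3     # 1 GB RAW photo per chip
        CHIPS = 8 * 4                     # 8 chips per stick, 4 sticks

        total_content = CHIPS * CHIP_CONTENT_BYTES  # 32 KB of actual RAM
        total_images = CHIPS * RAW_IMAGE_BYTES      # 32 GB of photos
        print(f"overhead: {total_images / total_content:,.0f}x")  # 1,048,576x
        ```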

  • SouthFresh@lemmy.ml

    “Google to shutter human brains”

    Why anyone teams up with a company whose greatest achievement is a high score on the “I wish they hadn’t shut that down” list is beyond my understanding.

    • fruitycoder@sh.itjust.works

      I mean Kubernetes, Android, TensorFlow, and the only open-source PDK for silicon that I know of.

      They have a lot of bedrock contributions in the tech space.

    • GolfNovemberUniform@lemmy.ml

      Because almost nobody else has enough money for such research, and governments won’t pay for it because it’s not very useful.