HeyListenWatchOut

  • 0 Posts
  • 21 Comments
Joined 1 year ago
Cake day: June 16th, 2023

  • Because “AI” isn’t actually “artificial intelligence.” It’s the marketing term that seems to have been adopted by every corporation to describe “LLMs”… which are more like extra-fancy, power-guzzling parrots.

    It’s why the best use cases for them are mimicking things brainlessly, like voice cloning for celebrity impressions… but that doesn’t mean the thing can act, comprehend emotion, or know how many fingers a hand should have. It’s also why they constantly hallucinate contextless bullshit… because just like a parrot doesn’t actually know the meaning of what it’s saying when it goes “POLLY WANT A CRACKER,” it just knows the tall thing will give it a treat if it makes that specific squawk with its beak.



  • I hate to break this to everyone who thinks that “AI” (LLM) is some sort of actual approximation of intelligence, but in reality, it’s just a fucking fancy-ass parrot.

    Our current “AI” doesn’t understand anything or have any context; it’s just really good at guessing how to say what we want it to say… essentially in the same way that a parrot says “Polly wanna cracker.”

    A parrot “talking to you” doesn’t know that Polly refers to itself, or that a cracker is a specific type of food you’re describing to it. If you were to ask it, “Which hand was holding the cracker…?” it wouldn’t be able to answer the question… because it doesn’t fucking know what a hand is… or even grasp the concept of playing a game, or what a “question” even is.

    It just knows that if it makes its mouth go “blah blah blah” in a very specific way, a human is more likely to give it a tasty treat… so it mushes its mouth parts around until its squawk becomes a sound that elicits such a reward from the human in front of it… which is similar to how LLM “training” works.

    Oversimplification, but that’s basically it… a trillion-dollar power-grid-straining parrot.

    And just like a parrot, the concept of “I don’t know” isn’t a thing it comprehends… because it’s a dumb fucking parrot.

    The only thing the tech is good at… is mimicking.

    It can “trace the lines” of any existing artist in history, and even blend their works, which is indeed how artists learn initially… but an LLM has nothing that can “inspire” it to create the art… because it’s just tracing the lines like a child would their favorite comic book character. That’s not art. It’s mimicry.

    It can be used to transform your own voice to make you sound like most celebrities almost perfectly… it can make the mouth noises, but has no idea what it’s actually saying… like the parrot.

    You get it?


  • Yep. LG OLED disconnected from all web connections (it literally has its MAC address blacklisted on my router), the stock Google launcher on my NVIDIA Shield swapped for a non-ad-riddled launcher called WOLF, SmartTube side-loaded for ad-free YT, and Plex = non-dystopian media experience.

    My next experiment will be to install a gluetun-bound Invidious instance via Docker on my NAS, and then point Yattee on my iPhone at it for a mobile version of YT that’s not filled with non-stop ads… now if I can just figure out how to launch it correctly… 😓
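
    For anyone poking at the same idea, this is roughly the compose layout I have in mind: a sketch pieced together from the gluetun and Invidious docs, not a tested setup. The VPN provider, keys, passwords, subnet, and published port are placeholder assumptions you’d swap for your own.

    ```yaml
    services:
      gluetun:
        image: qmcgaw/gluetun
        cap_add:
          - NET_ADMIN
        environment:
          # Placeholder VPN settings; use whatever provider/credentials you actually have.
          - VPN_SERVICE_PROVIDER=mullvad
          - VPN_TYPE=wireguard
          - WIREGUARD_PRIVATE_KEY=changeme
          # Assumed Docker bridge range, so the Invidious container (which shares this
          # network stack) can still reach the Postgres container past gluetun's firewall.
          - FIREWALL_OUTBOUND_SUBNETS=172.16.0.0/12
        ports:
          # Invidious gets published through gluetun, since it shares gluetun's network.
          - "3000:3000"
        restart: unless-stopped

      invidious:
        image: quay.io/invidious/invidious:latest
        # Bind Invidious to gluetun's network namespace so all its traffic exits via the VPN.
        network_mode: "service:gluetun"
        depends_on:
          - invidious-db
        environment:
          INVIDIOUS_CONFIG: |
            db:
              dbname: invidious
              user: kemal
              password: kemal
              # May need the DB container's IP instead, since gluetun replaces DNS
              # inside its namespace and Docker service names might not resolve.
              host: invidious-db
              port: 5432
            check_tables: true
            hmac_key: "changeme"
        restart: unless-stopped

      invidious-db:
        image: docker.io/library/postgres:14
        environment:
          POSTGRES_DB: invidious
          POSTGRES_USER: kemal
          POSTGRES_PASSWORD: kemal
        volumes:
          - postgresdata:/var/lib/postgresql/data
        restart: unless-stopped

    volumes:
      postgresdata:
    ```

    If that actually comes up, Yattee should only need the NAS IP and port 3000 added as a custom instance; the fiddly part will probably be gluetun’s firewall and DNS getting between Invidious and its database.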