• 0 Posts
  • 34 Comments
Joined 3 months ago
Cake day: July 12th, 2024

  • The actual point was that bomb-making instructions have been floating around in search engine results since the days of dial-up. That particular manuscript has existed since before the Internet. There’s nothing ChatGPT could give you that you couldn’t have found by typing the same query into Google. Getting the instructions is literally the easiest, lowest-effort, lowest-risk part of building a bomb.
  • The measure, aimed at reducing potential risks created by AI, would have required companies to test their models and publicly disclose their safety protocols to prevent the models from being manipulated to, for example, wipe out the state’s electric grid or help build chemical weapons.

    How exactly would an LLM do that? If you’ve given an LLM’s pseudorandom output control over your electric grid, no regulation will mitigate your stupidity.