this post was submitted on 22 Oct 2025
335 points (97.5% liked)

Technology

[–] Brkdncr@lemmy.world 27 points 3 days ago (2 children)

Can’t. It’s an arms race.

[–] tal@lemmy.today 19 points 3 days ago* (last edited 3 days ago) (1 children)

That's one issue.

Another is that even if you want to do so, it's a staggeringly difficult enforcement problem.

What they're calling for is basically an arms control treaty.

For those to work, you have to have monitoring and enforcement.

We have had serious problems even with major arms control treaties in the past.

https://en.wikipedia.org/wiki/Chemical_Weapons_Convention

The Chemical Weapons Convention (CWC), officially the Convention on the Prohibition of the Development, Production, Stockpiling and Use of Chemical Weapons and on their Destruction, is an arms control treaty administered by the Organisation for the Prohibition of Chemical Weapons (OPCW), an intergovernmental organization based in The Hague, Netherlands. The treaty entered into force on 29 April 1997. It prohibits the use of chemical weapons, and the large-scale development, production, stockpiling, or transfer of chemical weapons or their precursors, except for very limited purposes (research, medical, pharmaceutical or protective). The main obligation of member states under the convention is to effect this prohibition, as well as the destruction of all current chemical weapons. All destruction activities must take place under OPCW verification.

And then Russia started Novichoking people with the chemical weapons that they theoretically didn't have.

Or the Washington Naval Treaty:

https://en.wikipedia.org/wiki/Washington_Naval_Treaty

That had plenty of violations.

And it's very, very difficult to hide construction of warships, which can only be done by large specialized organizations in specific, geographically-constrained, highly-visible locations.

But to develop superintelligence, probably all you need is some computer science researchers and some fairly ordinary computers. How can you monitor those, verify that parties involved are actually following the rules?

You can maybe tamp down on the deployment in datacenters to some degree, especially specialized ones designed to handle high-power parallel compute. But the long pole here is the R&D time. Develop the software, and it's just a matter of deploying it at scale, and that can be done very quickly, with little time to respond.

[–] danzabia@infosec.pub 3 points 3 days ago

But to develop superintelligence, probably all you need is some computer science researchers and some fairly ordinary computers. How can you monitor those, verify that parties involved are actually following the rules?

I do not think this statement is accurate. It requires many very expensive, highly specialized computers that are completely spoken for. Monitoring can be done with hardware geolocation and verification of the user. We are probably 1-2 years away from this already, because a) the US wants to win the AI race vs. China but b) the White House is filled with traitors long NVDA.
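A rough back-of-envelope estimate makes this point concrete. All of the numbers below (total training compute, per-accelerator throughput, utilization, run length) are illustrative assumptions, not figures about any specific lab or model:

```python
# Back-of-envelope: how many accelerators a frontier-scale training run needs.
# Every constant here is an illustrative assumption.

TRAIN_FLOP = 1e25    # assumed total training compute, in FLOP
GPU_FLOPS = 1e15     # assumed peak throughput per accelerator, FLOP/s
UTILIZATION = 0.4    # assumed fraction of peak actually sustained
DAYS = 90            # assumed length of the training run

seconds = DAYS * 24 * 3600
gpus_needed = TRAIN_FLOP / (GPU_FLOPS * UTILIZATION * seconds)
print(f"~{gpus_needed:,.0f} accelerators running for {DAYS} days")
# -> ~3,215 accelerators running for 90 days
```

Under these assumptions you need thousands of high-end accelerators running flat out for months, which is exactly the kind of concentrated, power-hungry footprint that hardware geolocation and datacenter monitoring could plausibly track.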

[–] FaceDeer@fedia.io 12 points 3 days ago (1 children)

Yup. We're in a situation where everyone is thinking "if we don't, then they will." Bans are counterproductive. Instead we should be throwing our effort into "if we're going to do it then we need to do it right."

[–] stealth_cookies@lemmy.ca 6 points 3 days ago (1 children)

This is actually an interesting point I hadn't thought about or seen people consider with regard to the high investment cost of AI LLMs. Who blinks first when it comes to stopping investment in these systems if they don't prove to be commercially viable (or viable quickly enough)? What happens to the West if China holds out longer and succeeds?

[–] ToastedRavioli@midwest.social 4 points 3 days ago

The thing is that there is a snake-eating-its-tail logic behind why so many investors are dumping money into it. The more it is interacted with, the more it is trained, and the better it allegedly becomes. So these companies push shoehorning it into everything possible, even where it is borderline useless, on the assumption that it will become significantly more useful as a result, and then more valuable for further implementation, making it worth more.

So no one wants to blink, and they've practically put every egg in that basket.