The Oath

Gary Marcus

Rank 19 of 47 | Score 64
Gary Marcus
@GaryMarcus
·
@boyphysiker @freediebird lol
6/16/2025, 11:25:31 PM · X
In reply to:
Michele Reilly
@boyphysiker
·
27d
@GaryMarcus @freediebird gary's right but deeper issue is shannon's noisy channel theorem - llms are lossy compression of training data

can't recover intelligence from noisy channel without knowing original signal. boring info theory doesn't pump valuations
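The capacity bound the tweet alludes to can be made concrete in the simplest setting Shannon analyzed: the binary symmetric channel, where each bit is flipped with probability p. A minimal sketch (function names are my own, not from the thread) of the capacity formula C = 1 − H₂(p):

```python
import math

def binary_entropy(p: float) -> float:
    """H2(p) in bits: the uncertainty of a biased coin flip."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Capacity of a binary symmetric channel with crossover probability p.

    Shannon's noisy-channel coding theorem: rates below C are achievable
    with arbitrarily small error probability; rates above C are not.
    """
    return 1.0 - binary_entropy(p)

# A noiseless channel carries 1 bit per use; a maximally noisy one carries 0.
print(bsc_capacity(0.0))            # 1.0
print(bsc_capacity(0.5))            # 0.0
print(round(bsc_capacity(0.1), 3))  # 0.531
```

The point being invoked: information destroyed by noise (or lossy compression) cannot be recovered on the receiving end without side knowledge of the original signal, only protected in advance by coding below capacity.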
Gary Marcus
@GaryMarcus
·
28d
@freediebird some kind of architecture will lead us to AGI.

I don’t think architecture is immediately adjacent to where we are now.

I do think we are overinvested in small tweaks, when we need a more radical overhaul.
David Freed
@freediebird
·
29d
@GaryMarcus One consideration here: while existing models may have upper limits that we see as laughably far from AGI, discovery of a new model structure could be the catalyst that prompts a step-change in capabilities.
Gary Marcus
@GaryMarcus
·
29d
If we were 2 years from some kind of singularity-like AGI, I would expect current AI to be able to

• (minimally) do essentially anything cognitive that a bright 10 year child could do, such as understand movies, acquire the basics of new skills quickly, learn complex,

The reply 'lol' is a lighthearted, conversational response to a technical discussion of AI and AGI. It does not engage substantively with the public issues under discussion, such as AI architecture or information theory, and therefore does not constitute public discourse.


© 2023-2024 The Oath, All rights reserved.