Discussion about this post

Andrew Keenan Richardson

You didn't mention what I consider to be the core difference! The transformer model does quadratic work, comparing each token to each other token, but humans appear to be doing something like nlogn work, at least at the sentence level, because we're so attentive to the syntax tree and do major computational work only at the opening and closing of subtrees.
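As a rough illustration of the scaling gap this comment describes (the counts below are back-of-the-envelope assumptions about full self-attention versus a hypothetical balanced-parse-tree reader, not measurements of any model):

```python
import math

def attention_comparisons(n: int) -> int:
    # Full self-attention: every token is compared to every token,
    # so the work grows quadratically in sequence length n.
    return n * n

def tree_comparisons(n: int) -> int:
    # Hypothetical syntax-tree reader: if parsing work happens mainly
    # at subtree openings and closings of a roughly balanced tree,
    # each of the n tokens touches O(log n) subtree boundaries,
    # for about n * log2(n) operations in total.
    return round(n * math.log2(n))

for n in (16, 256, 4096):
    print(n, attention_comparisons(n), tree_comparisons(n))
```

At n = 4096 the quadratic count is already a few hundred times the n log n count, which is the gap the comment is pointing at.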

Nicholas N. Eberstadt

So how long will it take AI to develop embodied form and the capacity for intent--months, years, decades, never? Will you be sharing your assessment in the final section?
