The problem with the AGI concept
It says that the most important AI system capabilities to create are
exactly those things that people can do well. But is this what we want? Is
this what...
5 comments:
Didn't Frank Lloyd Wright have the same problem with Fallingwater?
http://en.wikipedia.org/wiki/Fallingwater - hmm..., perhaps the same problem as the unbounded pie-in-the-sky optimists, but I'm not quite fathoming the analogy to The Architect of The Matrix.
The creator is flawed, or at the very least he failed to incorporate and compensate for the effect of irrational factors on his construction...
lol, the subtitles actually helped. I was not a fan of the sequels, but maybe I should give them another chance.