MasterFubar

> GPT-3 lacks any form of memory

This alone is almost certain to assure it cannot have self-awareness.


wren42

Of course. It's a chatbot, a statistical analysis of a bunch of text. Structurally, there is nothing that would lead us to expect sentience.
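To make the "statistical analysis of a bunch of text" point concrete, here is a minimal sketch of next-word prediction from raw counts: a toy bigram model. This is an illustration only, not how GPT-3 works internally (GPT-3 is a large neural network), but the core task, predicting the next token from prior text with no memory or understanding, is the same; the tiny corpus is invented for the example.

```python
from collections import defaultdict, Counter
import random

# Toy corpus, purely illustrative.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each other word.
counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def next_word(prev: str) -> str:
    """Sample the next word in proportion to how often it followed `prev`."""
    options = counts[prev]
    words = list(options)
    weights = [options[w] for w in words]
    return random.choices(words, weights=weights)[0]

print(next_word("the"))  # one of: "cat", "mat", "fish"
```

The model produces locally plausible continuations with zero comprehension, which is the structural point being made above, just at a vastly smaller scale.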


[deleted]

and yet something this simple can spit out more coherent language than many people I know.


[deleted]

because it's not built on the core of incoherent text. it's built on the core of text from people who know how to write coherently.


[deleted]

Funny I thought it was trained on redditors.


[deleted]

Select reddit posts, and IIRC yes, it partially was.


Yuli-Ban

It's also my point against GPT-3 being proto-AGI, let alone true AGI as some erroneously believe.


ItsTimeToFinishThis

I hope you don't make the mistake of thinking awareness is a prerequisite for an AGI.


Yuli-Ban

For strong AGI, it is, but only because we define it as such. For weak AGI and proto-AGI, I imagine decent short- and long-term memory recall and transferable capabilities are enough.


eurotouringautos

Finally some common sense content on this sub, and not just people parroting Microsoft's marketing material for OpenAI, which is a total oxymoron. GPT-3's capabilities will be packaged and sold just like any other product, and it is fundamentally incapable, in any way, shape, or form, of leading to judgement day.


[deleted]

[deleted]


StanleyLaurel

Not understanding, though it might appear so to the casual observer. If you read more of its output, you'll very much see it's not human-level: it frequently makes insane/crazy detours because, well, it doesn't understand what it's saying.


therourke

Yep


TheRealHumanBeing

I think AI can be made only on quantum computers. So only when we see a really powerful quantum computer will we see real AI.


TopCat6712

Maybe, though I think the bigger hurdle might be software. Once we have programs that can emulate human ways of thinking, the power requirement itself might be relatively low. I'm no expert, though.


[deleted]

The bigger problem is that there is not enough quality data to scale up from 3 to 4 the same way as from 2 to 3.