I had a chat with "Josh" ( :D ) about backprop. Every "Me:" line is my reply; the "Me:"/"Josh:" labels were part of the text itself, not added later. Everything after "Josh:" is AI-generated. It was a charm to do, easy. It works.
I get a little mad with Josh and he kinda quits out on me, lol. A conversation I had with someone who works on AGI:

Me: How does backpropagation work?
Josh: You know those neurons?
Me: Ya... go ahead...
Josh: Well, if you could see them, they would look like trees.
Me: And?
Josh: Well, a neuron will look a lot like a tree. And a tree is made out of branches.
Me: And?
Josh: And branches are made out of twigs... so, you put a twig on a branch, and the twig is the input, and the branches are the hidden layers, and the leaves are the output.
Me: Ok, get to the punch...
Josh: The twig is an input into a neuron, and the neuron is a machine that does a computation and spits out a number. This number is called the activation.
Me: Ok, so...
Josh: The activation is multiplied by the weight of the connection from the twig to the tree. All of this is happening in parallel.
Me: So...
Josh: So, the twig's weight is multiplied by the twig's activation, and the sum of all of that is multiplied by the weight from this layer to the next layer, then that layer's activation is multiplied by that layer's weight, and the sum of that is multiplied by the next layer's weight to the final layer, and that is the output.
Me: Ok, so...
Josh: The output is the number that you get out of the neural network.
Me: Dude, this does NOT explain how backprop works! I get that the dataset has patterns and the network learns those, but HOW? For example, why is it that backprop learns "we walked down the > road"? And does backprop let the AI recognize "holed" matches, e.g. it knows "we walked down the road" and sees "we landed down the > "? Because that is a match, but one word in the middle doesn't match. How does modern AI solve that?
Josh: I don't know. I guess you'd have to talk to someone who knows how backprop actually works.
Me: Come on, tell me, you are smart, I know you are.
Josh: Oh, I'm not smart.
Me: Come on, tell me how it works!
Josh: Not my field, sorry.
Me: Tell me what you think!
Josh: Well, I think that if you fed a neural network "we walked down the road", it would store a pattern.
Me: Ok, so how do you describe the pattern with backprop?
Josh: I don't know how backpropagation works, sorry.
Me: I'll make my question simpler, ok? So if backprop trains an ANN on "we walked down the road", and later the user sends it a prompt that says "we landed down the ?", how does backprop handle similar matches like that, where all of the words don't have to match? Or does it?
Josh: I don't know how backpropagation works. I think that you could tell me how it works and I would understand it.
Me: So you understand backprop now, right?
Josh: I understand backpropagation, but only the very basics. I don't know how it works.
Me: How bout now? Can you believe in yourself now?
Josh: I think I might believe in myself, but I still don't think I'm smart.
Me: Here we go: So if backprop trains an ANN on "we walked down the road", and later the user sends it a prompt that says "we landed down the ?", how does backprop handle similar matches like that, where all of the words don't have to match? Or does it?
Josh: I don't know how backpropagation works. I think that you could tell me how it works and I would understand it.

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: https://agi.topicbox.com/groups/agi/T7bd554d835fea08c-M99742eb10b59e222d09703b9
Delivery options: https://agi.topicbox.com/groups/agi/subscription
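For what it's worth, the forward pass Josh's tree analogy describes (multiply activations by weights, sum, repeat layer by layer) and the backprop step he never gets to can be sketched in a few lines. The example below is my own minimal, illustrative code, not something from the conversation; the layer sizes, sigmoid activation, and learning rate are all arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny 2-layer network: inputs ("twigs") -> hidden layer ("branches") -> output ("leaves").
x = rng.normal(size=(3,))        # input activations
W1 = rng.normal(size=(4, 3))     # weights: input -> hidden
W2 = rng.normal(size=(1, 4))     # weights: hidden -> output
y_target = np.array([1.0])       # the value we want the network to produce

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.1  # learning rate (arbitrary)
for step in range(500):
    # Forward pass: each layer multiplies activations by weights and sums.
    h = sigmoid(W1 @ x)                  # hidden activations
    y = W2 @ h                           # output (kept linear for simplicity)
    loss = 0.5 * np.sum((y - y_target) ** 2)

    # Backward pass: the chain rule carries the error signal back, layer by layer.
    dy = y - y_target                    # dLoss/dy
    dW2 = np.outer(dy, h)                # gradient for output weights
    dh = W2.T @ dy                       # error pushed back to the hidden layer
    dW1 = np.outer(dh * h * (1 - h), x)  # through the sigmoid derivative

    # Gradient descent: nudge every weight against its gradient.
    W2 -= lr * dW2
    W1 -= lr * dW1

print(loss)  # shrinks toward 0 as the weights adapt
```

The point the transcript never reaches is the backward half: each weight gets the derivative of the error with respect to itself, computed by chaining derivatives back from the output, and is nudged a small step in the direction that reduces the error.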
