Nick, Re: your question about stochastic processes...


Yes, your specific description "AND its last value" is what most uses of "stochastic process" imply. But technically, all that is required to be a "stochastic process" is that each next step in the process is unpredictable, whether or not the outcome of one step influences the outcome of the next. An example of this is the process of flipping a coin several times in a row. Generally, we assume that the outcomes of two adjacent flips are stochastically (or statistically) independent, and that there is no influence between the steps. So the steps of an independent stochastic process do not depend on their previous steps.
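A minimal sketch of such a memoryless process, assuming nothing beyond the coin-flip example (function names are mine, not from the thread):

```python
import random

def flip_coin(n, seed=None):
    """Simulate n independent fair-coin flips.

    Each outcome is drawn without reference to any earlier outcome,
    so the process has no memory at all -- the 'independent' case.
    """
    rng = random.Random(seed)
    return [rng.choice(["H", "T"]) for _ in range(n)]

flips = flip_coin(10, seed=42)
print(flips)
```

Fixing the seed makes a run repeatable, but within a run no flip tells you anything about the next.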

On the other hand, selecting dinner tonight probably depends on what you had last night, because you would get bored with posole too many nights in a row. And maybe your memory goes back more than just one night, and your selection of dinner tonight is affected by what you had for two or more nights before. If your memory goes back only one night, then your "dinner selection process" is a kind of stochastic process called a "Markov process". Markov processes limit their "memory" to just one step. (That keeps the math simpler.)
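The one-step memory can be sketched as a transition table. Here is a toy "dinner selection" Markov chain; the dishes and probabilities are made up for illustration, with boredom modeled by giving the just-eaten dish a low repeat probability:

```python
import random

# Hypothetical P(tomorrow's dinner | tonight's dinner).
TRANSITIONS = {
    "posole": {"posole": 0.1, "tacos": 0.5, "pasta": 0.4},
    "tacos":  {"posole": 0.4, "tacos": 0.1, "pasta": 0.5},
    "pasta":  {"posole": 0.5, "tacos": 0.4, "pasta": 0.1},
}

def next_dinner(current, rng):
    """Draw tomorrow's dinner: it depends ONLY on tonight's (the Markov property)."""
    dishes, probs = zip(*TRANSITIONS[current].items())
    return rng.choices(dishes, weights=probs)[0]

def simulate(start, nights, seed=0):
    rng = random.Random(seed)
    menu = [start]
    for _ in range(nights - 1):
        menu.append(next_dinner(menu[-1], rng))
    return menu

print(simulate("posole", 7))
```

Extending the memory to two or more nights is possible, but the state space (and the math) grows accordingly, which is exactly why the one-step Markov assumption is so popular.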

In any event, stochastic processes whose steps depend on the outcomes of previous steps are "less random" than those that don't, because the earlier steps "give you extra information" that helps you narrow down the options and better predict the future steps, some more than others. So, LEARNING can occur inside of these dependent stochastic processes.

In fact, the mathematics of information theory is all about taking advantage of these dependent (or "conditional") stochastic processes to predict, one hopes, the outcomes of future steps. The whole thing is based on conditional probability. Information theory uses formulas with names such as joint entropy, conditional entropy, mutual information, and entropy rate. These formulas can measure *how much* stochastic dependency is at work in a particular process, i.e., how predictable it is. Entropy rate in particular works with conditional stochastic processes and tries to use that "extra information" provided by stochastic dependencies to predict future outcomes.
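All of those quantities are built from the Shannon entropy, H = -Σ p·log₂(p). A small sketch, with toy numbers of my own (tied back to the dinner example, not taken from the thread), of how a dependency lowers entropy and so buys predictability:

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable: exactly 1 bit per flip.
print(entropy([0.5, 0.5]))  # 1.0

# Three dinners with no memory: uniform, log2(3) ~ 1.585 bits of surprise.
unconditional = entropy([1/3, 1/3, 1/3])

# Given tonight's dish, tomorrow is drawn from skewed odds (toy numbers):
# the dependency narrows the options, so the conditional entropy is lower.
conditional = entropy([0.1, 0.5, 0.4])

print(conditional < unconditional)  # True
```

The gap between the two numbers is the "extra information" the previous step gives you; for a stationary Markov chain, averaging that conditional entropy over the states is exactly the entropy rate.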

Re: your "evolution" question... I have been speaking of biological evolution.

HTH,
Grant

On 8/9/17 8:47 AM, Nick Thompson wrote:

Hi everybody,

Thanks for your patience as I emerge (hopefully) from post-surgical fog. I figured I best start my own thread rather than gum up yours.

First: I had always supposed that a stochastic process was one whose value was determined by two factors, a random factor AND its last value. So the next step in a random walk is "random," but the current value (its present position on a surface, say) is "the result of a stochastic process." From your responses, and from a short rummage in Wikipedia, I still can't tell if I am correct or not.

Now remember, you guys, my standard critique of your discourse is that you confuse your models with the facts of nature. What is this "evolution" of which you speak? Unless you tell me otherwise, I will assume you are speaking of the messy biological process of which we are all a result: *the alteration of the design of taxa over time*. Hard to see any way in which that actual process is evidently random. We have to dig deep into the theory that EXPLAINS evolution to find anything that corresponds to the vernacular notion of randomness. There is constraint and predictability all over the place in the evolution I know. Even mutations are predictable. In other words, the randomness of evolution is a creation of your imaginations concerning the phenomenon, not an essential feature of the phenomenon, itself.

So what kind of "evolution" are you guys talking about?

Yes, and forgive me for trolling, a bit. I am trying to wake myself up, here.

nick

Nicholas S. Thompson
Emeritus Professor of Psychology and Biology
Clark University
http://home.earthlink.net/~nickthompson/naturaldesigns/

============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
to unsubscribe http://redfish.com/mailman/listinfo/friam_redfish.com
FRIAM-COMIC http://friam-comic.blogspot.com/ by Dr. Strangelove
