[FRIAM] truth, reality, & narrative

Frank Wimberly wimberly3 at gmail.com
Wed Jan 6 13:22:23 EST 2021


I haven't tried to prove it but I suspect those two definitions are
equivalent.
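
One quick way to probe that suspected equivalence: simulate a chain and
check that conditioning on extra past values changes nothing. This is
only a sketch in Python; the 3-state chain, its transition matrix, and
the sample size are my own toy choices, not from either textbook.

import numpy as np

rng = np.random.default_rng(0)
# Hypothetical transition matrix: row s gives p(next state | current = s).
P = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.6, 0.3],
              [0.2, 0.3, 0.5]])

# Simulate a long realization x(0), x(1), ..., x(N-1).
N = 200_000
x = np.empty(N, dtype=int)
x[0] = 0
for k in range(1, N):
    x[k] = rng.choice(3, p=P[x[k - 1]])

# Both definitions say p(x(k+1) | x(k), x(k-1), ...) = p(x(k+1) | x(k)),
# so the estimate below should match row P[s] no matter what s_prev is.
for s_prev in range(3):
    for s in range(3):
        idx = (x[1:-1] == s) & (x[:-2] == s_prev)
        est = np.bincount(x[2:][idx], minlength=3) / idx.sum()
        print(s_prev, s, est.round(3), "vs", P[s])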

In our statistical causal reasoning work we used the causal Markov
condition:

every node in a Bayesian network
<https://en.m.wikipedia.org/wiki/Bayesian_network> is conditionally
independent <https://en.m.wikipedia.org/wiki/Conditionally_independent> of
its non-descendants, given its parents.

This was an essential assumption in our causal reasoning algorithms.
The philosopher Nancy Cartwright denies that it is universally true.
But according to my equally esteemed boss, she is wrong.
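
To make the condition concrete, here is a minimal sketch on a
hypothetical three-node chain A -> B -> C (the names and parameters are
invented for illustration): given its parent B, node C should be
independent of its non-descendant A.

import numpy as np

rng = np.random.default_rng(1)
n = 200_000
a = rng.random(n) < 0.5                    # A ~ Bernoulli(0.5), no parents
b = rng.random(n) < np.where(a, 0.9, 0.2)  # p(B=1 | A) depends only on A
c = rng.random(n) < np.where(b, 0.8, 0.1)  # p(C=1 | B) depends only on B

# Causal Markov condition: p(C=1 | B, A) should not vary with A.
for b_val in (False, True):
    for a_val in (False, True):
        sel = (b == b_val) & (a == a_val)
        print(f"p(C=1 | B={b_val}, A={a_val}) ~= {c[sel].mean():.3f}")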

Frank
---
Frank C. Wimberly
140 Calle Ojo Feliz,
Santa Fe, NM 87505

505 670-9918
Santa Fe, NM

On Wed, Jan 6, 2021, 10:40 AM uǝlƃ ↙↙↙ <gepropella at gmail.com> wrote:

> Back when we were talking about the adequacy of deposition systems as an
> analog to the lost opportunity updating mechanism for free will, I
> cross-checked definitions for the Markov property in 2 of my (old) control
> textbooks. Here they are:
>
> from [Estimation Theory and Applications, NE Nahi]
> > "A stochastic sequence x(k) is Markov of first order, or simply
> > Markov, if for every i the following relationship holds
> >
> >   p{x(i) | x(i-1), …, x(1)} = p{x(i) | x(i-1)}   (1.189)
> >
> > In other words, the conditional probability density function of x(i)
> > conditioned on all its past (given) values is the same as using the
> > value of the most recent past only. The relationship in Equation
> > (1.189) is to be satisfied for all i."
>
> from [Applied Optimal Control, Bryson & Ho]
> > "A random sequence, x(k), k=0,1,…,N is said to be markovian if
> >
> >   p[x(k+1)/x(k), x(k-1), …, x(0)] = p[x(k+1)/x(k)]   (11.1.1)
> >
> > for all k; that is, the probability density function of x(k+1) depends
> > only on knowledge of x(k) and not on x(k-l), l=1,2,…. The knowledge of
> > x(k) that is required can be either deterministic [exact value of x(k)
> > known] or probabilistic (p[x(k)] known). In words, the markov property
> > implies that a knowledge of the present separates the past and the
> > future."
>
> I think the difference is interesting, particularly the "in other words"
> and "in words" parts. The choice of "i-1" vs. "k+1" is also interesting,
> but much less than "[future] conditioned on its past" versus "separates
> past and future". If we read like a modernist, through the presentations to
> some Platonic object behind them, we get the same damned thing, as
> transformed through fairly standard transforms (from what you read to what
> you think). But if we read it as a postmodernist, we can ask *why* Nahi
> chose i and i-1 where B&H chose k+1,k,k-1? And how *might* that choice be
> related to the more nuanced phrase "separates past from future"? And there
> are other differences, like Nahi's choice of "Markov of first order, or
> simply Markov" vs. B&H taking license to avoid allusion to higher order
> memory.
>
> A skeptical reader simply has to ask: what does this *actually* mean? How
> are these authors intending to use and reuse this definition later? How
> will it compose with other concepts? Etc. Maybe it's all accidental and
> merely a function of the authoring/editing processes in each case. Or maybe
> not.
>
>
> On 1/5/21 2:28 PM, uǝlƃ ↙↙↙ wrote:
> > ... my contrarian nature (and my laziness) forces me to cross-check a
> proposition from one narrative to another,
>
> --
> ↙↙↙ uǝlƃ
>
> - .... . -..-. . -. -.. -..-. .. ... -..-. .... . .-. .
> FRIAM Applied Complexity Group listserv
> Zoom Fridays 9:30a-12p Mtn GMT-6  bit.ly/virtualfriam
> un/subscribe http://redfish.com/mailman/listinfo/friam_redfish.com
> archives: http://friam.471366.n2.nabble.com/
> FRIAM-COMIC http://friam-comic.blogspot.com/
>