Excellent; if you know Elon, you should ask him.

On Mon, Feb 20, 2017 at 1:19 AM, Jan Matusiewicz <[email protected]>
wrote:

> I happen not to know Elon Musk in person ;-) Do you?
> I see that there are even more notable "collaborators" listed: Larry Page, Mark
> Zuckerberg, Bill Gates. Do you claim that they really collaborate on your
> project?
>
> Think before you answer. Your credibility is at stake. Nothing is ever lost
> on the Internet; if someone wants to find out whether Dorian Aur from
> Stanford University is a reliable scientist, he or she might find this
> discussion.
>
> On Sun, Feb 19, 2017 at 7:35 PM, Dorian Aur <[email protected]> wrote:
>
>> You should ask Elon; the number has already doubled.
>>
>> On Sat, Feb 18, 2017 at 1:35 PM, Jan Matusiewicz <
>> [email protected]> wrote:
>>
>>> Is this project really supported by Elon Musk? It has only 45 reads and
>>> 3 followers, so I doubt that.
>>> On Sat, 18 Feb 2017 19:20 Dorian Aur, <[email protected]> wrote:
>>>
>>>> That's the reliable path to building AGI and conscious machines:
>>>> https://www.researchgate.net/project/Build-Conscious-Machines
>>>> We would greatly appreciate your constructive contribution.
>>>>
>>>>
>>>> On Sat, Feb 18, 2017 at 8:15 AM, Jim Bromer <[email protected]>
>>>> wrote:
>>>>
>>>> An example of a substantive reason to support some AI theories is that
>>>> many programs have the ability to make some 'prediction' based on the
>>>> reaction to features of input. Working from there, the ability to make
>>>> predictions means that knowledge or capabilities which had been
>>>> integrated could be used meaningfully. My counter-argument is that
>>>> 'prediction' and 'predictive utility' as used in contemporary AI are
>>>> just as noisy and lossy as any other would-be AGI facility. So although
>>>> this kind of ability may be necessary for AGI, it (along with feedback,
>>>> integration, differentiation, exclusion, and a host of other
>>>> contemporary AI methods) is far from sufficient. And I acknowledge the
>>>> reasonable expectation that progress in AGI is not going to be
>>>> instantaneous or smooth.
>>>> Jim Bromer
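The "noisy and lossy" point can be made concrete with a small experiment. The sketch below is purely illustrative (the synthetic data and the 1-nearest-neighbour predictor are my own choices, not anything proposed in this thread): only the first feature carries signal, and as irrelevant noisy features are added, the predictor's accuracy drifts toward chance.

import random

# Toy illustration (hypothetical): one informative feature plus a growing
# number of irrelevant noisy features. A 1-nearest-neighbour 'predictor'
# degrades toward chance as the noise dimensions dominate the distances.

def make_data(n, noise_dims, rng):
    data = []
    for _ in range(n):
        label = rng.choice([0, 1])
        informative = label + rng.gauss(0, 0.3)               # weak but real signal
        noise = [rng.gauss(0, 1) for _ in range(noise_dims)]  # irrelevant features
        data.append(([informative] + noise, label))
    return data

def nearest_neighbour_accuracy(train, test):
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    correct = 0
    for features, label in test:
        _, predicted = min((dist(features, f), l) for f, l in train)
        correct += (predicted == label)
    return correct / len(test)

rng = random.Random(0)
for noise_dims in (0, 10, 100, 300):
    train = make_data(150, noise_dims, rng)
    test = make_data(150, noise_dims, rng)
    print(noise_dims, "irrelevant features -> accuracy",
          nearest_neighbour_accuracy(train, test))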
>>>>
>>>>
>>>>
>>>> On Sat, Feb 18, 2017 at 10:08 AM, Jim Bromer <[email protected]>
>>>> wrote:
>>>> > I will look at some of the links when I get some time, but you clearly
>>>> > did not understand my criticism of the theory. Even if I did
>>>> > misunderstand something in your presentation, so what? That does not
>>>> > necessarily mean the criticism is irrelevant. An example of a
>>>> > substantive reason to support some AI theories is that many programs
>>>> > have the ability to make some 'prediction' based on the reaction to
>>>> > features of input. If you had used that argument, I would have pointed
>>>> > out that the ability of AI to make predictions (or expectations) is
>>>> > wiped out by extraneous features (noise) and by the difficulty any
>>>> > AI program has in finding which features are relevant in different
>>>> > situations (lossy 'insights'). I am not saying this is relevant to
>>>> > what you are saying; I am just trying to give you an example of how
>>>> > reasoning can be used to support a theory and how some reasoning that
>>>> > is substantial in theory may not be that strong in practice.
>>>> > Jim Bromer
>>>> >
>>>> >
>>>> > On Sat, Feb 18, 2017 at 8:03 AM, Logan Streondj <[email protected]>
>>>> wrote:
>>>> >>
>>>> >>
>>>> >>
>>>> >> On 2017-02-17 10:03 PM, Jim Bromer wrote:
>>>> >>> Sorry. OK, you said that meditating was high phi. But your response
>>>> >>> ignored (and was a distraction from) the point I made: that it
>>>> >>> would be possible to create computer programs that were capable
>>>> >>> of integration and differentiation (and therefore capable of
>>>> >>> learned exclusion) yet were not capable of anything resembling
>>>> >>> true intelligence.
>>>> >>
>>>> >> Do you have examples?
>>>> >>
>>>> >>> And I explicitly included the possibility that improvements in AGI
>>>> >>> might be slow and uneven.
>>>> >>
>>>> >> You just said there wasn't even a glimmer.
>>>> >>
>>>> >>> I do not have the time to waste 'doing research' into conjectures
>>>> >>> which are not demonstrable and which cannot be buttressed by
>>>> >>> reasoning.
>>>> >>
>>>> >> That is very sad. If you don't do research, I don't understand how
>>>> >> you intend to contribute anything of value.
>>>> >>
>>>> >> Some studies that used/demonstrated integrated information theory in
>>>> >> humans:
>>>> >>
>>>> >> Improved Measures of Integrated Information
>>>> >> <http://journals.plos.org/ploscompbiol/article?id=10.1371/journal.pcbi.1005123>
>>>> >> https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2930263/
>>>> >> https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2823915/
>>>> >> http://stm.sciencemag.org/content/5/198/198ra105
>>>> >> Signature of consciousness in the dynamics of resting-state brain activity
>>>> >> http://www.pnas.org/content/112/3/887.short
>>>> >>
>>>> >>
>>>> >> A simplified explanation of the various predictions and explanatory
>>>> >> powers of IIT (cites numerous demonstrating studies):
>>>> >> <http://www.scholarpedia.org/article/Integrated_information_theory#Predictions_and_explanations>
>>>> >>
>>>> >> Some Machine Intelligence specific studies:
>>>> >>
>>>> >> An affective computational model for machine consciousness
>>>> >> https://arxiv.org/pdf/1701.00349.pdf
>>>> >> High Integrated Information in Complex Networks Near Criticality
>>>> >> http://link.springer.com/chapter/10.1007/978-3-319-44778-0_22
>>>> >> Group Minds and the Case of Wikipedia
>>>> >> https://arxiv.org/abs/1407.2210
>>>> >> Integrated Information Theory and Artificial Consciousness
>>>> >> <https://books.google.ca/books?hl=en&lr=&id=YIIJDgAAQBAJ&oi=fnd&pg=PA1&dq=artificial&ots=nYUVMmJnSw&sig=WSi1ECoHAhYrwgrNdUHu3hX4kWU#v=onepage&q=artificial&f=false>
>>>> >> The Information-theoretic and Algorithmic Approach to Human, Animal
>>>> >> and Artificial Cognition
>>>> >> https://arxiv.org/abs/1501.04242
>>>> >>
>>>> >>> You have not even begun to respond to the crucial criticisms.
>>>> >>>
>>>> >>
>>>> >> I wasn't aware of any criticisms, other than misunderstandings on
>>>> >> your part.
>>>> >>
>>>> >>
>>>> >>> Jim Bromer
>>>> >>
>>>> >>
>>>> >> Thanks,
>>>> >> you motivated me to post a bunch of links to studies.
>>>> >>
>>>> >>>
>>>> >>> On Fri, Feb 17, 2017 at 7:38 PM, Logan Streondj <[email protected]> wrote:
>>>> >>>
>>>> >>>> You said that meditating might be a form of low phi.
>>>> >>>
>>>> >>> No, I didn't. I said several times, in different ways:
>>>> >>>
>>>> >>> Actually, passive consciousness can be high phi; the phi level is
>>>> >>> determined by integration, information, and exclusion, not by
>>>> >>> activity.
>>>> >>>
>>>> >>>> we should have seen glimmers of true AGI even if those glimmers
>>>> >>>> did not equal strong AI. So the question is why haven't we seen
>>>> >>>> that?
>>>> >>>
>>>> >>> We have; you're just turning a blind eye to it.
>>>> >>>
>>>> >>> On February 17, 2017 6:04:40 PM EST, Jim Bromer <[email protected]> wrote:
>>>> >>>
>>>> >>> Here is my point of view. GOFAI should have worked, in the sense
>>>> >>> that it should have kept improving over the years. That
>>>> >>> improvement might have been slow and uneven but it should have
>>>> >>> worked. What we see now is that most of the creative processes seem
>>>> >>> to occur within neural nets, especially in hybrids that use both
>>>> >>> neural nets and systems that have been designed for more discrete
>>>> >>> (or more discrete-like) kinds of reasoning.
>>>> >>>
>>>> >>> If history had unfolded in the way I think it should have, then we
>>>> >>>  should have seen glimmers of true AGI even if those glimmers did
>>>> >>> not equal strong AI. So the question is: why haven't we seen that?
>>>> >>> Claiming that a theory which cannot be demonstrated is able to
>>>> >>> actually express consciousness, even glimmers of consciousness,
>>>> >>> needs a lot of supporting reasoning. It might turn out to be a
>>>> >>> good theory, but if it can't pull its own weight then it is just
>>>> >>> dreaming.
>>>> >>>
>>>> >>> You have to explain why this conjecture might be useful to us.
>>>> >>>
>>>> >>> You said that meditating might be a form of low phi. But when a
>>>> >>> person is meditating he is able to demonstrate that he is capable
>>>> >>> of strong reasoning. A computer program seems to integrate and
>>>> >>> differentiate data based on abstract principles. A learning
>>>> >>> program could then turn new learning into abstract principles which
>>>> >>> could be used to integrate and differentiate new data. But a
>>>> >>> program that did that would not have to be thinking or learning in
>>>> >>> a useful way. Jim Bromer
>>>> >>>
>>>> >>>
>>>> >>> On Fri, Feb 17, 2017 at 11:34 AM, Logan Streondj <[email protected]> wrote:
>>>> >>>
>>>> >>>
>>>> >>>
>>>> >>> On 2017-02-14 04:10 PM, Jim Bromer wrote:
>>>> >>>
>>>> >>> Well again you are talking about interesting concepts like passive
>>>> >>>  consciousness (or low phi - that's a great expression).
>>>> >>>
>>>> >>>
>>>> >>> Actually, passive consciousness can be high phi; the phi level is
>>>> >>> determined by integration, information, and exclusion, not by
>>>> >>> activity.
>>>> >>>
>>>> >>> But does that actually make sense? Wouldn't passive consciousness
>>>> >>> be zombie-like?
>>>> >>>
>>>> >>>
>>>> >>> It would be more like alpha meditation: passive awareness, ready to
>>>> >>> spring into action.
>>>> >>>
>>>> >>> You are suggesting that there may be something in between but
>>>> >>> which has very limited effects.
>>>> >>>
>>>> >>>
>>>> >>> I'm not sure what you are referring to.
>>>> >>>
>>>> >>> That is cool, but not really demonstrable with current AI concepts,
>>>> >>> is it? How do you show that there is dormant consciousness in an
>>>> >>> AI application without awakening it?
>>>> >>>
>>>> >>>
>>>> >>> Well, whether a program has high phi can be determined from its
>>>> >>> structure and connections. So if you have access to that information,
>>>> >>> the assessment can be done in an offline setting.
>>>> >>>
>>>> >>> Its dormancy status doesn't change its phi level. For example, it
>>>> >>> could be waiting for an interrupt or a packet to process.
>>>> >>>
>>>> >>> Or it could be a process loaded into RAM that is sharing CPU
>>>> >>> resources but is not currently scheduled -- from its perspective
>>>> >>> nothing is happening unless it is being processed by the CPU.
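For illustration only, here is a toy sketch of what such an offline structural check might look like. It is my own simplification, not the actual IIT 3.0 phi computation (which also needs the system's transition probabilities, not just its wiring): given the directed connections between a program's components, it reports whether the structure is recurrent, i.e. every part can both influence and be influenced by every other part, independently of whether the program happens to be running.

from typing import Dict, Set

def reachable(graph: Dict[str, Set[str]], start: str) -> Set[str]:
    """Nodes reachable from `start` along directed edges."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.add(node)
            stack.extend(graph.get(node, set()))
    return seen

def is_recurrent(graph: Dict[str, Set[str]]) -> bool:
    """True if the graph is strongly connected -- a crude stand-in for the
    'integration' precondition, checkable entirely offline."""
    nodes = set(graph) | {m for targets in graph.values() for m in targets}
    return all(reachable(graph, n) == nodes for n in nodes)

# Feed-forward pipeline: input -> hidden -> output, no loops.
feed_forward = {"input": {"hidden"}, "hidden": {"output"}, "output": set()}
# Same components plus a feedback edge from output back to input.
recurrent = {"input": {"hidden"}, "hidden": {"output"}, "output": {"input"}}

print(is_recurrent(feed_forward))  # False: purely feed-forward wiring
print(is_recurrent(recurrent))     # True: structurally integrated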
>>>> >>>
>>>> >>>
>>>> >>>
>>>> >>> Jim Bromer
>>>> >>>
>>>> >>>
>>>> >>> On Mon, Feb 13, 2017 at 7:16 PM, Logan Streondj <[email protected]> wrote:
>>>> >>>
>>>> >>>
>>>> >>> On 2017-02-13 05:43 PM, Mike Archbold wrote:
>>>> >>>
>>>> >>> I agree with Jim Bromer. Partly, I guess, it depends on how Tononi
>>>> >>> defines consciousness. I've studied it very generally, but not in
>>>> >>> detail. Intuitively, he seems to have identified an essential
>>>> >>> component, but by no means everything... Obviously our
>>>> >>> consciousness is saturated in feedback, but just having a feedback
>>>> >>> property does not make a computer conscious. What is a more
>>>> >>> complete definition under IIT? It can't be just feedback... what I
>>>> >>> have seen of IIT looks interesting.
>>>> >>>
>>>> >>>
>>>> >>>
>>>> >>>
>>>> >>> "it has been pointed out that the brain (and many other systems) is
>>>> >>> full of reentrant circuits, many of which do not seem to contribute
>>>> >>> to consciousness [51]. IIT offers some specific insights with
>>>> >>> respect to these issues. First, the need for reciprocal
>>>> >>> interactions within a complex is not merely an empirical
>>>> >>> observation, but it has theoretical validity because it is derived
>>>> >>> directly from the phenomenological axiom of (strong) integration.
>>>> >>> Second, (strong) integration is by no means the only requirement
>>>> >>> for consciousness, but must be complemented by information and
>>>> >>> exclusion. Third, for IIT it is the potential for interactions
>>>> >>> among the parts of a complex that matters and not the actual
>>>> >>> occurrence of ‘‘feed-back’’ or ‘‘reentrant’’ signaling, as is
>>>> >>> usually assumed. As was discussed above, a complex can be
>>>> >>> conscious, at least in principle, even though none of its neurons
>>>> >>> may be firing, no feed-back or reentrant loop" IIT3.0
>>>> >>>
>>>> >>>
>>>> >>> Basically, from my limited understanding of IIT3.0:
>>>> >>>
>>>> >>> Consciousness requires three things: Information, Integration, and
>>>> >>> Exclusion.
>>>> >>>
>>>> >>> Feedback satisfies integration, as the components have to be
>>>> >>> interconnected.
>>>> >>>
>>>> >>> Information implies that there are past memories which can affect
>>>> >>> present actions, or that memories could be acquired to affect
>>>> >>> future actions.
>>>> >>>
>>>> >>> And Exclusion means that the consciousness has defined borders.
>>>> >>>
>>>> >>>
>>>> >>> It would seem that it may be a bit tricky with the program
>>>> >>> switching that happens in a modern CPU, though I'm fairly certain
>>>> >>> that with FPGAs it would apply quite smoothly. For instance, after
>>>> >>> some FPGA circuits have been set up, even if they are not active,
>>>> >>> they could still be passively conscious, ready for input.
>>>> >>>
>>>> >>> It may be similar if you consider things loaded in RAM or cache as
>>>> >>> passively conscious, and actively conscious when processing in the
>>>> >>> CPU. Similarly, a kernel loaded on a GPU would be conscious, though
>>>> >>> I think there would be some question as to the quality of that
>>>> >>> consciousness: whether it might be highly modular, and thus of a
>>>> >>> low phi. If, however, all the kernels are working together, via local
>>>> >>> or global memory, they could be considered strongly integrated, and
>>>> >>> would form a singular consciousness.
>>>> >>>
>>>> >>> On 2/13/17, Steve Richfield <[email protected]> wrote:
>>>> >>>
>>>> >>> Your Central Metabolic Control System (CMCS) is clearly both
>>>> >>> intelligent and independent. It appears to have abilities
>>>> >>> approximating those of a PhD control-systems engineer, and it often
>>>> >>> works at cross purposes to your conscious intentions in order to
>>>> >>> keep you alive and healthy.
>>>> >>>
>>>> >>> CMCS malfunctions often look a LOT like demonic possession.
>>>> >>>
>>>> >>> Steve
>>>> >>>
>>>> >>> On Feb 11, 2017 12:11 PM, "Dr Miles Dyson" <[email protected]> wrote:
>>>> >>>
>>>> >>> When I fall asleep and lose consciousness, the neurons in my
>>>> >>> brain do not rearrange themselves such that no feedback loops
>>>> >>> exist. And there are many feedback loops in the brain,
>>>> >>> but I don't have many consciousnesses; I have but one. For both of
>>>> >>> those reasons, consciousness and neural-net feedback loops are not
>>>> >>> one and the same thing.
>>>> >>>
>>>> >>> On Sat, Feb 11, 2017 at 10:04 AM, Jim Bromer <[email protected]> wrote:
>>>> >>>
>>>> >>> I don't concur, but it is an interesting placement of the minimum
>>>> >>> for (machine) 'consciousness'. I did not realize that 'stateless'
>>>> >>> 'pure functions' could be called 'feed-forward'. If global effects
>>>> >>> were sufficient to induce 'consciousness', then any program with
>>>> >>> global effects could be called conscious. Even assuming that you
>>>> >>> were being more specific than that, I still don't think structures
>>>> >>> that can carry state between calls (in the ways that you were
>>>> >>> thinking) would be sufficient for conscious behaviors to emerge.
>>>> >>>
>>>> >>> Jim Bromer
>>>> >>>
>>>> >>> On Sat, Feb 11, 2017 at 9:37 AM, Logan Streondj <[email protected]> wrote:
>>>> >>>
>>>> >>> I've been promoting Integrated Information Theory for a while but
>>>> >>> I finally sat down and read the whole thing yesterday.
>>>> >>>
>>>> >>> It explicitly mentions that feed-forward neural nets are 'zombies',
>>>> >>> or unconscious, while recurrent neural nets are conscious due to
>>>> >>> their feedback loops.
>>>> >>>
>>>> >>> So now I'm wondering which classical programming structures are
>>>> >>> 'zombies' and which are conscious.
>>>> >>>
>>>> >>> It would seem by analogy that stateless or pure functions are
>>>> >>> zombies, since they simply feed forward.
>>>> >>>
>>>> >>> Whereas structures that carry state between calls, such as objects
>>>> >>> and actors, are conscious.
>>>> >>>
>>>> >>> Do you concur?
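To make the analogy concrete (and only the analogy; nothing below is a claim about consciousness, and the function and class are invented for illustration), here is a minimal sketch contrasting a stateless pure function with a structure that carries state between calls:

def pure_scale(x: float, factor: float = 2.0) -> float:
    """Stateless, 'feed-forward': the output depends only on the current input."""
    return x * factor

class RunningAverage:
    """Carries state between calls: each result feeds back into the next one."""
    def __init__(self) -> None:
        self.count = 0
        self.mean = 0.0

    def update(self, x: float) -> float:
        self.count += 1
        # The previous mean (internal state) influences the new output.
        self.mean += (x - self.mean) / self.count
        return self.mean

print(pure_scale(3.0), pure_scale(3.0))  # 6.0 6.0 -- history is irrelevant
avg = RunningAverage()
print(avg.update(3.0), avg.update(9.0))  # 3.0 6.0 -- history changes the answer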
>>>> >>>
>>>> >> --
>>>> >> Logan Streondj,
>>>> >> A dream of Gaia's future.
>>>> >> twitter: https://twitter.com/streondj
>>>> >>
>>>> >> You can use encrypted email with me,
>>>> >> how to: https://emailselfdefense.fsf.org/en/
>>>> >> key fingerprint:
>>>> >> BD7E 6E2A E625 6D47 F7ED 30EC 86D8 FC7C FAD7 2729