The falling tree makes sound, the wind makes sound, the . makes sound
regardless of your presence (or the presence of others) to hear that sound.
To argue anything else is utter nonsense.
wrb
On Tuesday, March 5, 2013 5:52:32 PM UTC-5, William R. Buckley wrote:
>
> I do not hold that the acceptor must exist, for then I
>
> am making a value judgment, and I have already scolded
>
> Craig for the same thing.
>
>
>
> Think of it this way. A volume of gas has a measure of
>
> entrop
On 3/5/2013 3:03 PM, William R. Buckley wrote:
Craig,
You build an automaton, place it and turn it on, and from that point
in time forward
the automaton reacts to acceptable information all on its own.
You contradict yourself -- I don't think it has to be human --
machines only help
no
I do not hold that the acceptor must exist, for then I
am making a value judgment, and I have already scolded
Craig for the same thing.
Think of it this way. A volume of gas has a measure of
entropy. This means that the molecules are found in
a specific sequence of microstates, and th
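For reference (an illustrative addition, not part of the thread): the entropy/microstate relation alluded to in the truncated passage above is Boltzmann's S = k_B ln W. A minimal sketch in Python, with a toy system chosen purely for illustration:

```python
import math

# Boltzmann's constant in joules per kelvin
K_B = 1.380649e-23

def boltzmann_entropy(microstates: int) -> float:
    """Entropy S = k_B * ln(W) of a system with W equally likely microstates."""
    return K_B * math.log(microstates)

# Toy "volume of gas": 4 distinguishable molecules, each in one of two
# half-volumes, so W = 2**4 = 16 microstates.
S = boltzmann_entropy(2 ** 4)
```

Doubling the number of molecules squares W and therefore doubles S, which is the sense in which entropy counts arrangements rather than anything an observer must judge.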
Craig:
The mistake you make is clearly stated in your words:
“…doesn’t mean that they communicated with judgment.”
You are anthropomorphizing. The value is no more and no
less than the action taken upon signal acceptance.
wrb
On Tuesday, March 5, 2013 4:19:31 PM UTC-5, William R. Buckley wrote:
>
> The machine is informed.
>
Trivially, yes, but information is all about multiple levels. My mailbox
could be said to be informed when it receives mail - but that's just a figure
of speech. No machine is ever literally or richly i
On 3/5/2013 6:53 AM, Bruno Marchal wrote:
Why would anyone want to make decisions that were not determined by their learning and
memories and values?
Indeed. But even more so when they feel such values are universal, or close to
universal.
But based on your experience with salvia, Bruno,
On Tuesday, March 5, 2013 3:07:00 PM UTC-5, William R. Buckley wrote:
>
> The fact that a machine can act in a discriminatory manner based
>
> upon some signal (sign, information) input is a demonstration
>
> of value judgment.
>
Only in our eyes, not in its own eyes. It's like telling a kid to say som
The machine is informed. Acceptance demonstrates the act of becoming
informed. The yield of such acceptance is called meaning.
Easily, trivially, this language can be applied to machine and organism
without
concomitant observation of the slightest distinction between them.
The definit
Dear Bill B., you probably have thought along these lines for similarly long
periods as I did. It was ~2 decades ago when I defined
i n f o r m a t i o n as something with (at least) 2 ends:
1. the notion (in whatever format it shows up) - and
2. the acceptor (adjusting the notion in whatever context
On Tuesday, March 5, 2013 3:03:31 PM UTC-5, William R. Buckley wrote:
>
> Craig,
>
>
>
> You build an automaton, place it and turn it on, and from that point in
> time forward
>
> the automaton reacts to acceptable information all on its own.
>
Reacts, yes, but it isn't informed by the react
On Tuesday, March 5, 2013 1:16:59 PM UTC-5, John Clark wrote:
>
> On Mon, Mar 4, 2013 Craig Weinberg wrote:
>
>
>> > No software can be run without being grounded in physical hardware,
>>
>
> And no human mind can exist without a physical brain.
>
I wasn't trying to differentiate machines
Euler's Equation Crackpottery
Feb 18 2013 Published by MarkCC under Bad Math, Bad Physics
One of my twitter followers sent me an interesting piece of crackpottery.
I debated whether to do anything with it. The thing about crackpottery
is that it really needs to have some content.
Total incoheren
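For context (an added note, not in the blog excerpt): the "Euler's Equation" at issue is presumably Euler's identity,

```latex
e^{i\pi} + 1 = 0
```

which is the special case of Euler's formula e^{i\theta} = \cos\theta + i\sin\theta at \theta = \pi.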
The fact that a machine can act in a discriminatory manner based
upon some signal (sign, information) input is a demonstration
of value judgment.
Just as there is no *in* in a machine, so too there is no *in*
in a biological organism; they both, machine and organism,
are forms that treat other fo
Craig,
You build an automaton, place it and turn it on, and from that point in time
forward
the automaton reacts to acceptable information all on its own.
You contradict yourself -- I don't think it has to be human -- machines only
help
non-machines to interpret - - and if the human poi
On Tuesday, March 5, 2013 12:45:11 PM UTC-5, Bruno Marchal wrote:
>
>
> On 05 Mar 2013, at 08:43, Jesse Mazer wrote:
>
>
>
> On Mon, Mar 4, 2013 at 11:27 PM, Pierz wrote:
>
>> Really Craig? It invalidates mechanistic assumptions about eyes? I'm sure
>> the researchers would be astonished at suc
On Tuesday, March 5, 2013 3:53:31 AM UTC-5, Alberto G.Corona wrote:
>
> Let´s say that what we call "information" is an extended form of sensory
> input. What makes this input "information" is the usability of this input
> for reducing the internal entropy of the receiver or increasing the intern
On Mon, Mar 4, 2013 Craig Weinberg wrote:
> > No software can be run without being grounded in physical hardware,
>
And no human mind can exist without a physical brain.
> and no software can be completely sequestered from any other software
>
And human ideas cannot... no, that's not right
On Tuesday, March 5, 2013 12:03:28 PM UTC-5, William R. Buckley wrote:
>
> Craig:
>
>
>
> Your statement of the need for a human to observe the
>
> pattern is the smoking gun to indicate a misunderstanding
>
> of semiotic theory on your part.
>
I don't think that it has to be humans doing the obs
On 05 Mar 2013, at 08:43, Jesse Mazer wrote:
On Mon, Mar 4, 2013 at 11:27 PM, Pierz wrote:
Really Craig? It invalidates mechanistic assumptions about eyes? I'm
sure the researchers would be astonished at such a wild conclusion.
All the research shows is brain plasticity in interpreting s
On Monday, March 4, 2013 7:23:32 AM UTC-5, Bruno Marchal wrote:
>
>
> On 03 Mar 2013, at 20:35, meekerdb wrote:
>
> > On 3/2/2013 11:56 PM, Stathis Papaioannou wrote:
> >>> So you admit that what you say contradicts the fact that you are
> >>> intentionally saying it?
> >> "Intentional", as
Craig:
Your statement of the need for a human to observe the
pattern is the smoking gun to indicate a misunderstanding
of semiotic theory on your part.
Specifically, you don't need a human; a machine will do.
Not all machines are man-made.
wrb
On Tuesday, March 5, 2013 8:39:37 AM UTC-5, telmo_menezes wrote:
>
> Hi Craig,
>
> On Tue, Mar 5, 2013 at 1:44 PM, Craig Weinberg wrote:
> > On Monday, March 4, 2013 11:27:21 PM UTC-5, Pierz wrote:
> >>
> >> Really Craig? It invalidates mechanistic assumptions about eyes? I'm
> sure
On 04 Mar 2013, at 20:16, meekerdb wrote:
On 3/4/2013 4:23 AM, Bruno Marchal wrote:
On 03 Mar 2013, at 20:35, meekerdb wrote:
On 3/2/2013 11:56 PM, Stathis Papaioannou wrote:
So you admit that what you say contradicts the fact that you are
intentionally saying it?
"Intentional", as far a
On Tuesday, March 5, 2013 4:48:10 PM UTC+2, advanced...@list.ru wrote:
>
>
> On Tuesday, March 5, 2013 3:33:28 PM UTC+2, Stephen Paul King wrote:
>>
>> On 3/5/2013 6:23 AM, advanced...@list.ru wrote:
>>
>>
>> On Tuesday, March 5, 2013 1:16:15 PM UTC+2, advanced...@list.ru wrote:
>>>
>>>
>>> On
On Tuesday, March 5, 2013 3:33:28 PM UTC+2, Stephen Paul King wrote:
>
> On 3/5/2013 6:23 AM, advanced...@list.ru wrote:
>
>
> On Tuesday, March 5, 2013 1:16:15 PM UTC+2, advanced...@list.ru wrote:
>>
>>
>> On Sunday, January 27, 2013 2:53:12 PM UTC+2, Bruno Marchal wrote:
>>>
>>> Hi Stephen,
On 04 Mar 2013, at 17:06, John Clark wrote:
On Fri, Mar 1, 2013 Craig Weinberg wrote:
>> As I've said before, it's important not to confuse levels: a
simulated flame won't burn your computer but it will burn a
simulated object.
> No, that argument is bogus. There is only one physical le
On Tuesday, March 5, 2013 8:27:29 AM UTC-5, stathisp wrote:
>
>
>
> On Tue, Mar 5, 2013 at 1:54 PM, Craig Weinberg wrote:
> >
> >
> > On Monday, March 4, 2013 8:11:12 PM UTC-5, stathisp wrote:
> >>
> >> On Tue, Mar 5, 2013 at 6:02 AM, Craig Weinberg wrote:
> >>
> >> >> I am responsible fo
Hi Craig,
On Tue, Mar 5, 2013 at 1:44 PM, Craig Weinberg wrote:
> On Monday, March 4, 2013 11:27:21 PM UTC-5, Pierz wrote:
>>
>> Really Craig? It invalidates mechanistic assumptions about eyes? I'm sure
>> the researchers would be astonished at such a wild conclusion. All the
>> research shows is
On 3/5/2013 6:23 AM, advancedguida...@list.ru wrote:
On Tuesday, March 5, 2013 1:16:15 PM UTC+2, advanced...@list.ru wrote:
On Sunday, January 27, 2013 2:53:12 PM UTC+2, Bruno Marchal wrote:
Hi Stephen,
On 25 Jan 2013, at 18:06, Stephen P. King wrote:
>
>
On Tue, Mar 5, 2013 at 1:54 PM, Craig Weinberg wrote:
>
>
> On Monday, March 4, 2013 8:11:12 PM UTC-5, stathisp wrote:
>>
>> On Tue, Mar 5, 2013 at 6:02 AM, Craig Weinberg wrote:
>>
>> >> I am responsible for my
>> >> actions because I know what I am doing and I choose to do it. If I
>> >> break
On Tuesday, March 5, 2013 2:06:20 AM UTC-5, William R. Buckley wrote:
>
> There is information (I take information to be a
> manifestation of entropy) and it is always represented
> in the form of a pattern (a distribution) of the units
> of mass/energy of which the Universe is composed.
I
On Tuesday, March 5, 2013 2:43:26 AM UTC-5, jessem wrote:
>
>
>
> On Mon, Mar 4, 2013 at 11:27 PM, Pierz wrote:
>
>> Really Craig? It invalidates mechanistic assumptions about eyes? I'm sure
>> the researchers would be astonished at such a wild conclusion. All the
>> research shows is brain pl
On Monday, March 4, 2013 11:27:21 PM UTC-5, Pierz wrote:
>
> Really Craig? It invalidates mechanistic assumptions about eyes? I'm sure
> the researchers would be astonished at such a wild conclusion. All the
> research shows is brain plasticity in interpreting signals from unusual
> neural pathw
On Tuesday, March 5, 2013 1:16:15 PM UTC+2, advanced...@list.ru wrote:
>
>
> On Sunday, January 27, 2013 2:53:12 PM UTC+2, Bruno Marchal wrote:
>>
>> Hi Stephen,
>>
>> On 25 Jan 2013, at 18:06, Stephen P. King wrote:
>>
>> >
>> >Have you seen this? What implications does it have?
>> >
>
On Sunday, January 27, 2013 2:53:12 PM UTC+2, Bruno Marchal wrote:
>
> Hi Stephen,
>
> On 25 Jan 2013, at 18:06, Stephen P. King wrote:
>
> >
> >Have you seen this? What implications does it have?
> >
> > http://arxiv.org/ftp/arxiv/papers/1301/1301.5340.pdf
>
> If the result is correct
Let´s say that what we call "information" is an extended form of sensory
input. What makes this input "information" is its usability for reducing the
internal entropy of the receiver or increasing its internal
order. The receiver can be a machine, a cell, a person or a society, for
examp
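Alberto's criterion (an input counts as information insofar as it reduces the receiver's internal entropy) can be illustrated with Shannon entropy. A minimal sketch, added here for illustration only; the receiver's "internal states" and the particular probabilities are assumptions of the example, not claims from the thread:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Receiver uncertain among 4 equally likely internal states: 2 bits.
before = shannon_entropy([0.25, 0.25, 0.25, 0.25])

# An accepted signal rules out two of the states: 1 bit remains.
after = shannon_entropy([0.5, 0.5])

# The entropy reduction is the information the input conveyed, in bits.
gained = before - after
```

On this reading a machine, a cell, or a person all qualify as receivers: whatever shrinks the set of states the receiver must entertain has informed it, in the purely quantitative sense.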