On Wed, Feb 13, 2013 at 4:45 PM, Craig Weinberg whatsons...@gmail.com wrote:
What is to stop duplication of, say, the simplest possible conscious
being made up of only a few atoms?
Because I suspect that conscious beings are not made of atoms; rather, atoms
exist in the experience of beings.
What's it like? - It's not possible to describe what it's like. Except
maybe DEATH.
What did you see? - ALL
What did you do? - My body did nothing, but lay down. I was no more, just ALL
http://insanebraintrain.blogspot.fr/2011/07/massive-dosing-lsd-thumbprint.html
On Tue, Feb 12, 2013 at 11:49 PM, meekerdb meeke...@verizon.net wrote:
On 2/12/2013 2:40 PM, Telmo Menezes wrote:
I don't know what sort of computer you typed your post on but by 1997
standards it is almost certainly a supercomputer, probably the most powerful
supercomputer in the world. I'll
Almost the same sensations provoked by a stroke:
http://www.youtube.com/watch?v=QTrJqmKoveU
There is nothing in LSD or any other psychedelic drugs, except the
impairment of the pre-conscious control of what arrives to the conscious,
produced by the (different modules of the) brain. That is a
On 13 Feb 2013, at 04:09, Stathis Papaioannou wrote:
On Wed, Feb 13, 2013 at 11:58 AM, Jason Resch jasonre...@gmail.com
wrote:
Consider the following thought experiment, called The Duplicators:
At 1:00 PM tomorrow, you will be abducted by aliens. The aliens
will tell
you not to worry,
On 13 Feb 2013, at 06:45, Craig Weinberg wrote:
On Tuesday, February 12, 2013 10:09:40 PM UTC-5, stathisp wrote:
On Wed, Feb 13, 2013 at 12:24 PM, Craig Weinberg
whats...@gmail.com wrote:
1. Do you consider yourself to have experienced the torture in
the case of
the Restorers, even
On 13 Feb 2013, at 02:28, Russell Standish wrote:
On Tue, Feb 12, 2013 at 11:05:37AM -0800, Craig Weinberg wrote:
When we talk about a Bp relating to consciousness, we are making an
assumption about what a proposition is. In fact, if we look closely, a
proposition can only be
On 2/13/2013 10:26 AM, Bruno Marchal wrote:
On 13 Feb 2013, at 06:45, Craig Weinberg wrote:
On Tuesday, February 12, 2013 10:09:40 PM UTC-5, stathisp wrote:
On Wed, Feb 13, 2013 at 12:24 PM, Craig Weinberg
whats...@gmail.com wrote:
1. Do you consider yourself to
On 12 Feb 2013, at 20:05, Craig Weinberg wrote:
When we talk about a Bp relating to consciousness, we are making an
assumption about what a proposition is. In fact, if we
look closely, a proposition can only be another level of B. p is
really nothing but a group of sub-personal
*Wouldn’t Simulated Intelligence be a more appropriate term than Artificial
Intelligence?*
Thinking of it objectively, if we have a program which can model a
hurricane, we would call that hurricane a simulation, not an ‘artificial
hurricane’. If we modeled any physical substance, force, or
On 2/13/2013 3:10 AM, Telmo Menezes wrote:
The main reason Watson and similar programs fail to have human like
intelligence is that they lack human like values and motivations
True, but they could have generic intelligence -- the ability to learn
something new in a new domain, just by being
On 11 Feb 2013, at 21:43, Stephen P. King wrote:
On 2/11/2013 10:23 AM, Bruno Marchal wrote:
On 10 Feb 2013, at 20:36, Craig Weinberg wrote:
On Sunday, February 10, 2013 11:16:31 AM UTC-5, Bruno Marchal wrote:
On 09 Feb 2013, at 22:07, Craig Weinberg wrote:
On Saturday, February 9,
On 13 Feb 2013, at 16:25, Jason Resch wrote:
On Wed, Feb 13, 2013 at 9:18 AM, Bruno Marchal marc...@ulb.ac.be
wrote:
On 13 Feb 2013, at 04:09, Stathis Papaioannou wrote:
On Wed, Feb 13, 2013 at 11:58 AM, Jason Resch jasonre...@gmail.com
wrote:
Consider the following thought
Bruno,
Thanks for your response. I think I understand now.
Jason
On Wed, Feb 13, 2013 at 11:13 AM, Bruno Marchal marc...@ulb.ac.be wrote:
On 13 Feb 2013, at 16:25, Jason Resch wrote:
On Wed, Feb 13, 2013 at 9:18 AM, Bruno Marchal marc...@ulb.ac.be wrote:
On 13 Feb 2013, at 04:09,
Hi Stephen,
On 13 Feb 2013, at 16:53, Stephen P. King wrote:
On 2/13/2013 10:26 AM, Bruno Marchal wrote:
On 13 Feb 2013, at 06:45, Craig Weinberg wrote:
On Tuesday, February 12, 2013 10:09:40 PM UTC-5, stathisp wrote:
On Wed, Feb 13, 2013 at 12:24 PM, Craig Weinberg
whats...@gmail.com
On Tue, Feb 12, 2013 Telmo Menezes te...@telmomenezes.com wrote:
So far, nobody has been able to figure out a learning algorithm as
generic as the one our brains contain.
The developers of Watson have come very close to doing exactly that.
There is definitely room for generalists.
Then
On 13 Feb 2013, at 17:35, Craig Weinberg wrote:
Wouldn’t Simulated Intelligence be a more appropriate term than
Artificial Intelligence?
A better term would be natural imagination. But terms are not
important.
Thinking of it objectively, if we have a program which can model a
On Wednesday, February 13, 2013 3:58:31 AM UTC-5, stathisp wrote:
On Wed, Feb 13, 2013 at 4:45 PM, Craig Weinberg
whats...@gmail.com
wrote:
What is to stop duplication of, say, the simplest possible conscious
being made up of only a few atoms?
Because I suspect
On Mon, Feb 11, 2013 Craig Weinberg whatsons...@gmail.com wrote:
Then why can't a one dimensional Turing machine do geometry,
It can solve geometry problems,
Yes.
but it can't generate geometric forms.
Can you generate geometric forms? Your fingers can draw a triangle but are
you
On 2/13/2013 7:26 AM, Bruno Marchal wrote:
Experiences cannot be duplicated literally, because I suspect that unique is the only
thing that experiences can literally be.
I agree with this, in the sense that this follows also from computationalism, and thus
3p-duplicability at some level.
An
On Wednesday, February 13, 2013 12:46:23 PM UTC-5, Bruno Marchal wrote:
On 13 Feb 2013, at 17:35, Craig Weinberg wrote:
*Wouldn’t Simulated Intelligence be a more appropriate term than
Artificial Intelligence?*
A better term would be natural imagination. But terms are not important.
On 2/13/2013 8:04 AM, Bruno Marchal wrote:
On 13 Feb 2013, at 03:03, meekerdb wrote:
On 2/12/2013 5:28 PM, Russell Standish wrote:
On Tue, Feb 12, 2013 at 11:05:37AM -0800, Craig Weinberg wrote:
When we talk about a Bp relating to consciousness, we are making an
assumption about
On 2/13/2013 8:35 AM, Craig Weinberg wrote:
*Wouldn’t Simulated Intelligence be a more appropriate term than Artificial
Intelligence?*
Thinking of it objectively, if we have a program which can model a hurricane, we would
call that hurricane a simulation, not an ‘artificial hurricane’. If we
On Wednesday, February 13, 2013 1:23:14 PM UTC-5, John Clark wrote:
On Mon, Feb 11, 2013 Craig Weinberg whats...@gmail.com wrote:
Then why can't a one dimensional Turing machine do geometry,
It can solve geometry problems,
Yes.
but it can't generate geometric forms.
On Wednesday, February 13, 2013 10:56:05 AM UTC-5, Bruno Marchal wrote:
On 12 Feb 2013, at 20:05, Craig Weinberg wrote:
When we talk about a Bp relating to consciousness, we are making an
assumption about what a proposition is. In fact, if we look closely, a
proposition can only
On 2/13/2013 2:36 PM, meekerdb wrote:
On 2/13/2013 7:26 AM, Bruno Marchal wrote:
Experiences cannot be duplicated literally, because I suspect that
unique is the only thing that experiences can literally be.
I agree with this, in the sense that this follows also from
computationalism, and
On 2/13/2013 2:46 PM, meekerdb wrote:
On 2/13/2013 8:04 AM, Bruno Marchal wrote:
On 13 Feb 2013, at 03:03, meekerdb wrote:
On 2/12/2013 5:28 PM, Russell Standish wrote:
On Tue, Feb 12, 2013 at 11:05:37AM -0800, Craig Weinberg wrote:
When we talk about a Bp relating to consciousness, we are
On 2/13/2013 2:58 PM, meekerdb wrote:
On 2/13/2013 8:35 AM, Craig Weinberg wrote:
*Wouldn’t Simulated Intelligence be a more appropriate term than
Artificial Intelligence?*
Thinking of it objectively, if we have a program which can model a
hurricane, we would call that hurricane a
On Wednesday, February 13, 2013 2:58:28 PM UTC-5, Brent wrote:
On 2/13/2013 8:35 AM, Craig Weinberg wrote:
*Wouldn't Simulated Intelligence be a more appropriate term than
Artificial Intelligence?*
Thinking of it objectively, if we have a program which can model a
hurricane, we
On Wednesday, February 13, 2013 5:11:32 PM UTC-5, Stephen Paul King wrote:
On 2/13/2013 2:58 PM, meekerdb wrote:
On 2/13/2013 8:35 AM, Craig Weinberg wrote:
*Wouldn't Simulated Intelligence be a more appropriate term than
Artificial Intelligence?*
Thinking of it objectively, if
On 2/13/2013 5:40 PM, Craig Weinberg wrote:
[SPK wrote:] What difference that makes a difference does that make
in the grand scheme of things? The point is that we cannot 'prove'
that we are not in a gigantic simulation. Yeah, we cannot prove a
negative, but we can extract a lot
On 2/13/2013 5:21 PM, Craig Weinberg wrote:
On Wednesday, February 13, 2013 2:58:28 PM UTC-5, Brent wrote:
On 2/13/2013 8:35 AM, Craig Weinberg wrote:
*Wouldn't Simulated Intelligence be a more appropriate term
than Artificial Intelligence?*
Thinking of it objectively, if we
On 2/13/2013 5:40 PM, Craig Weinberg wrote:
[SPK wrote:] 'reality = best possible simulation'.
I just realized how to translate that into my view: Reality = making
the most sense possible. Same thing really. That's why I talk about
multisense Realism, with Realism being the quality of
Hi Craig,
Thank you for your very well considered point of view on my original post.
I have some interjections that I would enjoy hearing a response to:
On Sunday, January 27, 2013 9:37:03 PM UTC-5, Craig Weinberg wrote:
On Sunday, January 27, 2013 5:35:22 PM UTC-5, freqflyer07281972
On Wednesday, February 13, 2013 7:05:38 PM UTC-5, Stephen Paul King wrote:
On 2/13/2013 5:40 PM, Craig Weinberg wrote:
[SPK wrote:] 'reality = best possible simulation'.
I just realized how to translate that into my view: Reality = making the
most sense possible. Same thing really.
On Wednesday, February 13, 2013 5:51:27 PM UTC-5, Stephen Paul King wrote:
On 2/13/2013 5:40 PM, Craig Weinberg wrote:
[SPK wrote:] What difference that makes a difference does that make in
the grand scheme of things? The point is that we cannot 'prove' that we are
not in a gigantic
On Wednesday, February 13, 2013 5:37:08 PM UTC-5, Stephen Paul King wrote:
On 2/13/2013 5:21 PM, Craig Weinberg wrote:
On Wednesday, February 13, 2013 2:58:28 PM UTC-5, Brent wrote:
On 2/13/2013 8:35 AM, Craig Weinberg wrote:
*Wouldn't Simulated Intelligence be a more
On Thu, Feb 14, 2013 at 3:35 AM, Craig Weinberg whatsons...@gmail.com wrote:
Wouldn’t Simulated Intelligence be a more appropriate term than Artificial
Intelligence?
Thinking of it objectively, if we have a program which can model a
hurricane, we would call that hurricane a simulation, not an
On Wednesday, February 13, 2013 7:05:39 PM UTC-5, freqflyer07281972 wrote:
Hi Craig,
Thank you for your very well considered point of view on my original post.
I have some interjections that I would enjoy hearing a response to:
Thanks Dan, I'll try my best.
On Sunday, January 27,
On Wednesday, February 13, 2013 9:45:43 PM UTC-5, stathisp wrote:
On Thu, Feb 14, 2013 at 3:35 AM, Craig Weinberg
whats...@gmail.com
wrote:
Wouldn’t Simulated Intelligence be a more appropriate term than
Artificial
Intelligence?
Thinking of it objectively, if we
On Thu, Feb 14, 2013 at 2:27 PM, Craig Weinberg whatsons...@gmail.com wrote:
Whether the
intelligence has the same associated consciousness or not is a matter
for debate, but not the intelligence itself.
I disagree. There is no internal intelligence there at all. Zero. There is a
recording
On 2/13/2013 8:09 PM, Craig Weinberg wrote:
[SPK wrote:] I like the idea of a Matrix universe exactly for that
reason; it takes resources to 'run' it. No free lunch, even for
universes!!!
You can still have the idea of resources if the universe isn't a
simulation though. No
On 2/13/2013 9:41 PM, Craig Weinberg wrote:
On Wednesday, February 13, 2013 5:37:08 PM UTC-5, Stephen Paul King
wrote:
On 2/13/2013 5:21 PM, Craig Weinberg wrote:
On Wednesday, February 13, 2013 2:58:28 PM UTC-5, Brent wrote:
On 2/13/2013 8:35 AM, Craig Weinberg wrote: