Re: [singularity] Quantum Mechanics and Consciousness

2008-05-21 Thread Matt Mahoney
light without darkening me.' Thomas Jefferson, letter to Isaac McPherson, 13 August 1813 -- Matt Mahoney, [EMAIL PROTECTED]

Re: [singularity] future of mankind blueprint and relevance of AGI

2008-05-20 Thread Matt Mahoney
was a year ago and nothing has been released yet. -- Matt Mahoney, [EMAIL PROTECTED]

Re: [singularity] An Open Letter to AGI Investors

2008-04-16 Thread Matt Mahoney
own development project. So if the value of AGI is all the human labor it replaces (about US $1 quadrillion), how much will it cost to build? Keep in mind there is a tradeoff between waiting for the cost of technology to drop vs. having it now. How much should we expect to spend? -- Matt Mahoney

Re: [singularity] Stronger than Turing?

2008-04-15 Thread Matt Mahoney
and Entropy of Printed English”, Bell Sys. Tech. J. 30(1), pp. 50-64, 1951. 2. Cover, T. M., and R. C. King, “A Convergent Gambling Estimate of the Entropy of English”, IEEE Transactions on Information Theory 24(4) (July), pp. 413-421, 1978. -- Matt Mahoney, [EMAIL PROTECTED]
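The papers cited above bound the entropy of English at roughly one bit per character once context is exploited. For comparison, a context-free (order-0) letter model gives a much higher figure; here is a minimal sketch of that baseline (the sample text is illustrative, not from the thread):

```python
# Order-0 (context-free) character entropy of a text sample, in bits/char.
from collections import Counter
from math import log2

def char_entropy(text: str) -> float:
    """Shannon entropy of the character distribution."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * log2(c / n) for c in counts.values())

sample = "the quick brown fox jumps over the lazy dog " * 20
print(round(char_entropy(sample), 2))  # ~4.3 bits/char with no context;
# Shannon's and Cover-King's context-sensitive estimates are nearer 1 bit/char.
```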

Testing AGI (was RE: [singularity] Vista/AGI)

2008-04-13 Thread Matt Mahoney
reasoning. Including these capabilities would not improve compression. Tests on small data sets could be used to gauge early progress. But ultimately, I think you are going to need hardware that supports AGI to test it. -- Matt Mahoney, [EMAIL PROTECTED]

Re: About the Nine Misunderstandings post [WAS Re: [singularity] I'm just not sure how well...]

2008-04-11 Thread Matt Mahoney
--- Richard Loosemore [EMAIL PROTECTED] wrote: Matt Mahoney wrote: We already have examples of reproducing agents: Code Red, SQL Slammer, Storm, etc. A worm that can write and debug code and discover new vulnerabilities will be unstoppable. Do you really think your AI will win the race

Re: About the Nine Misunderstandings post [WAS Re: [singularity] I'm just not sure how well...]

2008-04-11 Thread Matt Mahoney
biggest applications. Unfortunately, the knowledge needed to secure computers is almost exactly the same kind of knowledge needed to attack them. -- Matt Mahoney, [EMAIL PROTECTED]

Re: About the Nine Misunderstandings post [WAS Re: [singularity] I'm just not sure how well...]

2008-04-11 Thread Matt Mahoney
--- Vladimir Nesov [EMAIL PROTECTED] wrote: On Fri, Apr 11, 2008 at 10:50 PM, Matt Mahoney [EMAIL PROTECTED] wrote: If the problem is so simple, why don't you just solve it? http://www.securitystats.com/ http://en.wikipedia.org/wiki/Storm_botnet There is a trend toward using

Re: [singularity] I'm just not sure how well this plan was thought through [WAS Re: Promoting an A.S.P.C,A.G.I.]

2008-04-10 Thread Matt Mahoney
) was released in 1990, you probably imagined that all search engines would require you to know the name of the file you were looking for. If you have a better plan for AGI, please let me know. -- Matt Mahoney, [EMAIL PROTECTED]

RE: Promoting AGI (RE: [singularity] Vista/AGI)

2008-04-09 Thread Matt Mahoney
--- John G. Rose [EMAIL PROTECTED] wrote: From: Matt Mahoney [mailto:[EMAIL PROTECTED] The simulations can't loop because the simulator needs at least as much memory as the machine being simulated. You're making assumptions when you say that. Outside of a particular simulation we

Re: Promoting AGI (RE: [singularity] Vista/AGI)

2008-04-09 Thread Matt Mahoney
spam and malicious messages at risk of having their own reputations lowered if they fail. -- Matt Mahoney, [EMAIL PROTECTED]

Re: Promoting AGI (RE: [singularity] Vista/AGI)

2008-04-09 Thread Matt Mahoney
impact on its outcome. -- Matt Mahoney, [EMAIL PROTECTED]

Re: [singularity] Re: Promoting an A.S.P.C,A.G.I.

2008-04-09 Thread Matt Mahoney
--- Richard Loosemore [EMAIL PROTECTED] wrote: Matt Mahoney wrote: Perhaps you have not read my proposal at http://www.mattmahoney.net/agi.html or don't understand it. Some of us have read it, and it has nothing whatsoever to do with Artificial Intelligence. It is a labor-intensive

Re: [singularity] Re: Promoting an A.S.P.C,A.G.I.

2008-04-09 Thread Matt Mahoney
--- Richard Loosemore [EMAIL PROTECTED] wrote: Matt Mahoney wrote: Just what do you want out of AGI? Something that thinks like a person or something that does what you ask it to? Either will do: your suggestion achieves neither. If I ask your non-AGI the following question: How

Promoting AGI (RE: [singularity] Vista/AGI)

2008-04-08 Thread Matt Mahoney
a human mind. I don't believe that one person or a small group can solve the AGI problem faster than the billions of people on the Internet are already doing. -- Matt Mahoney, [EMAIL PROTECTED]

RE: Promoting AGI (RE: [singularity] Vista/AGI)

2008-04-08 Thread Matt Mahoney
--- Derek Zahn [EMAIL PROTECTED] wrote: Matt Mahoney writes: As for AGI research, I believe the most viable path is a distributed architecture that uses the billions of human brains and computers already on the Internet. What is needed is an infrastructure that routes information

RE: Promoting AGI (RE: [singularity] Vista/AGI)

2008-04-08 Thread Matt Mahoney
as the machine being simulated. -- Matt Mahoney, [EMAIL PROTECTED]

Re: Promoting AGI (RE: [singularity] Vista/AGI)

2008-04-08 Thread Matt Mahoney
and are not in deep conflict? I don't expect the experts to agree. It is better that they don't. There are hard problems remaining to be solved in language modeling, vision, and robotics. We need to try many approaches with powerful hardware. The network will decide who the winners are. -- Matt Mahoney

Re: Promoting AGI (RE: [singularity] Vista/AGI)

2008-04-08 Thread Matt Mahoney
emerging from the Internet bears little resemblance to Novamente. It is simply too big to invest in directly, but it will present many opportunities. -- Matt Mahoney, [EMAIL PROTECTED]

Re: [singularity] future search

2008-04-02 Thread Matt Mahoney
--- David Hart [EMAIL PROTECTED] wrote: Hi All, I'm quite worried about Google's new *Machine Automated Temporal Extrapolation* technology going FOOM! http://www.google.com.au/intl/en/gday/ More on the technology http://en.wikipedia.org/wiki/Google's_hoaxes :-) -- Matt Mahoney

Re: [singularity] Re: Revised version of Jaron Lanier's thought experiment.

2008-03-05 Thread Matt Mahoney
--- Richard Loosemore [EMAIL PROTECTED] wrote: Matt Mahoney wrote: --- John G. Rose [EMAIL PROTECTED] wrote: Is there really a bit per synapse? Is representing a synapse with a bit an accurate enough simulation? One synapse is a very complicated system. A typical neural network

Re: [singularity] Re: Revised version of Jaron Lanier's thought experiment.

2008-03-05 Thread Matt Mahoney
--- Richard Loosemore [EMAIL PROTECTED] wrote: Matt Mahoney wrote: I was referring to Landauer's estimate of long term memory learning rate of about 2 bits per second. http://www.merkle.com/humanMemory.html This does not include procedural memory, things like visual perception

Re: [singularity] Re: Revised version of Jaron Lanier's thought experiment.

2008-03-05 Thread Matt Mahoney
--- Eric B. Ramsay [EMAIL PROTECTED] wrote: Matt Mahoney [EMAIL PROTECTED] wrote: [For those not familiar with Richard's style: once he disagrees with something he will dispute it to the bitter end in long, drawn out arguments, because nothing is more important than being right

RE: [singularity] Re: Revised version of Jaron Lanier's thought experiment.

2008-03-04 Thread Matt Mahoney
of an associative memory stores 0.15 bits per synapse. But cognitive models suggest the human brain stores about 10^-6 bits per synapse (there are 10^15 synapses, but human long term memory capacity is 10^9 bits). -- Matt Mahoney, [EMAIL PROTECTED]
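A back-of-envelope check of the figures quoted here and in the Landauer learning-rate entry above (the 70-year waking lifetime is an assumption, not a number from the thread):

```python
# Check the quoted memory figures against each other.
SYNAPSES = 1e15           # rough synapse count in the human brain
LTM_BITS = 1e9            # Landauer's long-term memory estimate, bits
LEARN_RATE = 2.0          # Landauer's learning rate, bits per second
WAKING_SECONDS = 70 * 365.25 * 24 * 3600 * (2 / 3)  # ~70 years, 16 h/day awake

print(LTM_BITS / SYNAPSES)          # 1e-06 bits per synapse
print(LEARN_RATE * WAKING_SECONDS)  # ~2.9e9 bits, the same order as LTM_BITS
```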

Re: [singularity] Re: Revised version of Jaron Lanier's thought experiment.

2008-02-28 Thread Matt Mahoney
that is a product of evolution, and therefore biased toward beliefs that favor survival of the species. -- Matt Mahoney, [EMAIL PROTECTED]

Re: [singularity] Re: Revised version of Jaron Lanier's thought experiment.

2008-02-28 Thread Matt Mahoney
--- Stathis Papaioannou [EMAIL PROTECTED] wrote: On 29/02/2008, Matt Mahoney [EMAIL PROTECTED] wrote: By equivalent computation I mean one whose behavior is indistinguishable from the brain, not an approximation. I don't believe that an exact simulation requires copying

RE: [singularity] Re: Revised version of Jaron Lanier's thought experiment.

2008-02-28 Thread Matt Mahoney
--- John G. Rose [EMAIL PROTECTED] wrote: From: Matt Mahoney [mailto:[EMAIL PROTECTED] By equivalent computation I mean one whose behavior is indistinguishable from the brain, not an approximation. I don't believe that an exact simulation requires copying the implementation down

Re: [singularity] Re: Revised version of Jaron Lanier's thought experiment.

2008-02-28 Thread Matt Mahoney
. And removing a 0.1 micron chunk out of a CPU chip can cause it to fail, yet I can run the same programs on a chip with half as many transistors. Nobody knows how to make an artificial brain, but I am pretty confident that it is not necessary to preserve its structure to preserve its function. -- Matt Mahoney

Re: [singularity] Definitions

2008-02-19 Thread Matt Mahoney
--- Charles D Hixson [EMAIL PROTECTED] wrote: John K Clark wrote: Matt Mahoney [EMAIL PROTECTED] It seems to me the problem is defining consciousness, not testing for it. And it seems to me that beliefs of this sort are exactly the reason philosophy is in such a muddle

Re: Infinitely Unlikely Coincidences [WAS Re: [singularity] AI critique by Jaron Lanier]

2008-02-17 Thread Matt Mahoney
to use Turing machines in proofs, even though we can't actually build one. Hutter is not proposing a universal solution to AI. He is proving that it is not computable. Lanier is not suggesting implementing consciousness as a rainstorm. He is refuting its existence. -- Matt Mahoney, [EMAIL PROTECTED]

Re: [singularity] AI critique by Jaron Lanier

2008-02-17 Thread Matt Mahoney
--- John Ku [EMAIL PROTECTED] wrote: On 2/16/08, Matt Mahoney [EMAIL PROTECTED] wrote: I would prefer to leave behind these counterfactuals altogether and try to use information theory and control theory to achieve a precise understanding of what it is for something

Re: [singularity] AI critique by Jaron Lanier

2008-02-17 Thread Matt Mahoney
--- John Ku [EMAIL PROTECTED] wrote: On 2/17/08, Matt Mahoney [EMAIL PROTECTED] wrote: Nevertheless we can make similar reductions to absurdity with respect to qualia, that which distinguishes you from a philosophical zombie. There is no experiment to distinguish whether you actually

Re: [singularity] AI critique by Jaron Lanier

2008-02-16 Thread Matt Mahoney
is that it doesn't matter. The pleasure of a thousand permanent orgasms is just a matter of changing a few lines of code, and you go into a degenerate state where learning ceases. -- Matt Mahoney, [EMAIL PROTECTED]

Re: [singularity] AI critique by Jaron Lanier

2008-02-15 Thread Matt Mahoney
(knowing what we cannot know) to conclude that human brains are just computers and our existence doesn't matter. It is ironic that our programmed beliefs lead us to advance technology to the point where the question can no longer be ignored. -- Matt Mahoney, [EMAIL PROTECTED]

Re: [singularity] Quantum resonance btw DNA strands?

2008-02-07 Thread Matt Mahoney
is that the long-distance van der Waals bonding strengths between A-T pairs or C-G pairs in double stranded DNA are slightly greater than the bonding strengths between A-T and C-G (although much weaker than the hydrogen bonds between A and T or C and G). -- Matt Mahoney, [EMAIL PROTECTED]

Re: [singularity] Replication/Emulation and human brain, definition of models

2008-01-18 Thread Matt Mahoney
replicating nanobots. -- Matt Mahoney, [EMAIL PROTECTED]

Re: [singularity] World as Simulation

2008-01-13 Thread Matt Mahoney
on your choice of mathematical model. -- Matt Mahoney, [EMAIL PROTECTED]

Re: [singularity] World as Simulation

2008-01-13 Thread Matt Mahoney
--- Gifting [EMAIL PROTECTED] wrote: There is plenty of physical evidence that the universe is simulated by a finite state machine or a Turing machine. 1. The universe has finite size, mass, age, and resolution, etc. -- Matt Mahoney, [EMAIL PROTECTED] I assume

RE: [singularity] World as Simulation

2008-01-12 Thread Matt Mahoney
environment that correctly believe that the world is a simulation would be less likely to pass on their genes than agents that falsely believe the world is real. Perhaps you suspect that the food you eat is not real, but you continue to eat anyway. -- Matt Mahoney, [EMAIL PROTECTED]

RE: [singularity] World as Simulation

2008-01-12 Thread Matt Mahoney
your DNA. -- Matt Mahoney, [EMAIL PROTECTED]

Re: [singularity] World as Simulation

2008-01-12 Thread Matt Mahoney
is not healthy. It is what motivates kamikaze pilots and suicide bombers. Religion has thrived because it teaches rules that maximize reproduction, such as prohibiting sexual activity for any other purpose. -- Matt Mahoney, [EMAIL PROTECTED]

Re: [singularity] World as Simulation

2008-01-12 Thread Matt Mahoney
that the universe is a simulation, nor are any of my other points. I don't believe that a proof is possible. Eric B. Ramsay Matt Mahoney [EMAIL PROTECTED] wrote: --- Eric B. Ramsay wrote: Apart from all this philosophy (non-ending as it seems), Table 1. of the paper referred

Re: [singularity] Requested: objections to SIAI, AGI, the Singularity and Friendliness

2007-12-27 Thread Matt Mahoney
-- Matt Mahoney, [EMAIL PROTECTED]

Re: [singularity] Wrong question?

2007-12-01 Thread Matt Mahoney
--- Bryan Bishop [EMAIL PROTECTED] wrote: On Friday 30 November 2007, Matt Mahoney wrote: How can we design AI so that it won't wipe out all DNA based life, possibly this century? That is the wrong question. How can we preserve DNA-based life? Perhaps by throwing it out

Re: Bright Green Tomorrow [WAS Re: [singularity] QUESTION]

2007-10-28 Thread Matt Mahoney
, now dropping the assumption of CEV. The question remains whether this AGI would preserve the lives of the original humans or their memories. Not what it should do, but what it would do. We have a few decades left to think about this. -- Matt Mahoney, [EMAIL PROTECTED]

Re: Bright Green Tomorrow [WAS Re: [singularity] QUESTION]

2007-10-27 Thread Matt Mahoney
--- Richard Loosemore [EMAIL PROTECTED] wrote: Matt Mahoney wrote: Suppose that the collective memories of all the humans make up only one billionth of your total memory, like one second of memory out of your human lifetime. Would it make much difference if it was erased to make room

Re: Bright Green Tomorrow [WAS Re: [singularity] QUESTION]

2007-10-25 Thread Matt Mahoney
to the relative probability of different outcomes, but just sneers at the whole idea with a "Yeah, but what if everything goes wrong, huh? What if Frankenstein turns up? Huh? Huh?" comment. Happens every time. Richard Loosemore Matt Mahoney wrote: --- Richard Loosemore

Re: [singularity] John Searle...

2007-10-25 Thread Matt Mahoney
the issue of consciousness from the possibility of AI. -- Matt Mahoney, [EMAIL PROTECTED]

Re: Bright Green Tomorrow [WAS Re: [singularity] QUESTION]

2007-10-25 Thread Matt Mahoney
more important? I am not saying that the extinction of humans and their replacement with godlike intelligence is necessarily a bad thing, but it is something to be aware of. -- Matt Mahoney, [EMAIL PROTECTED]

Re: [singularity] QUESTION

2007-10-22 Thread Matt Mahoney
, and goal-directed agents seem to be necessary for RSI. It raises hard questions about what role humans will play in this, if any. -- Matt Mahoney, [EMAIL PROTECTED]

Re: [singularity] Towards the Singularity

2007-09-12 Thread Matt Mahoney
--- Stathis Papaioannou [EMAIL PROTECTED] wrote: On 11/09/2007, Matt Mahoney [EMAIL PROTECTED] wrote: No, you are thinking in the present, where there can be only one copy of a brain. When technology for uploading exists, you have a 100% chance of becoming the original

Re: [singularity] Towards the Singularity

2007-09-10 Thread Matt Mahoney
--- Stathis Papaioannou [EMAIL PROTECTED] wrote: On 10/09/07, Matt Mahoney [EMAIL PROTECTED] wrote: No, it is not necessary to destroy the original. If you do destroy the original you have a 100% chance of ending up as the copy, while if you don't you have a 50% chance of ending up

Re: [singularity] Towards the Singularity

2007-09-10 Thread Matt Mahoney
--- Panu Horsmalahti [EMAIL PROTECTED] wrote: 2007/9/10, Matt Mahoney [EMAIL PROTECTED]: - Human belief in consciousness and subjective experience is universal and accepted without question. It isn't. I am glad you spotted the flaw in these statements. Any belief programmed

Re: [singularity] Towards the Singularity

2007-09-09 Thread Matt Mahoney
knowledge to create a reasonable facsimile. For example, given just my home address, you could guess I speak English, make reasonable guesses about what places I might have visited, and make up some plausible memories. Even if they are wrong, my copy wouldn't know the difference. -- Matt Mahoney

Re: [singularity] Towards the Singularity

2007-09-09 Thread Matt Mahoney
--- Stathis Papaioannou [EMAIL PROTECTED] wrote: On 09/09/07, Matt Mahoney [EMAIL PROTECTED] wrote: Your dilemma: after you upload, does the original human them become a p-zombie, or are there two copies of your consciousness? Is it necessary to kill the human body for your

Re: [singularity] Uploaded p-zombies

2007-09-09 Thread Matt Mahoney
? If it does exist, then is it a property of the computation, or does it depend on the physical implementation of the computer? How do you test for it? Do you claim that the human brain cannot be emulated by a Turing machine? -- Matt Mahoney, [EMAIL PROTECTED]

Re: [singularity] Towards the Singularity

2007-09-08 Thread Matt Mahoney
--- Stathis Papaioannou [EMAIL PROTECTED] wrote: On 08/09/07, Matt Mahoney [EMAIL PROTECTED] wrote: I agree this is a great risk. The motivation to upload is driven by fear of death and our incorrect but biologically programmed belief in consciousness. The result

[singularity] Chip implants linked to animal tumors

2007-09-08 Thread Matt Mahoney
There has been a minor setback in the plan to implant RFID tags in all humans. http://news.yahoo.com/s/ap/20070908/ap_on_re_us/chipping_america_ii;_ylt=AiZyFu9ywOpQA0T6nXkEAcFH2ocA Perhaps it would be safer to have our social security numbers tattooed on our foreheads? -- Matt Mahoney, [EMAIL PROTECTED]

Re: [singularity] Towards the Singularity

2007-09-07 Thread Matt Mahoney
in extinction with no replacement. -- Matt Mahoney, [EMAIL PROTECTED]

Re: [singularity] Good Singularity intro in mass media

2007-08-24 Thread Matt Mahoney
editorializing, but rather for a clear, short, popular mass-media explanation of the Singularity. I think the classic paper by Vernor Vinge expresses it pretty well. http://mindstalk.net/vinge/vinge-sing.html -- Matt Mahoney, [EMAIL PROTECTED]

Re: [singularity] Reduced activism

2007-08-19 Thread Matt Mahoney
--- Samantha Atkins [EMAIL PROTECTED] wrote: On Aug 19, 2007, at 12:26 PM, Matt Mahoney wrote: 3. Studying the singularity raises issues (e.g. does consciousness exist?) that conflict with hardcoded beliefs that are essential for survival. Huh? Are you conscious? I believe that I am

Re: [singularity] critiques of Eliezer's views on AI

2007-06-29 Thread Matt Mahoney
--- Randall Randall [EMAIL PROTECTED] wrote: On Jun 28, 2007, at 7:51 PM, Matt Mahoney wrote: --- Stathis Papaioannou [EMAIL PROTECTED] wrote: How does this answer questions like, if I am destructively teleported to two different locations, what can I expect to experience? That's what

Re: [singularity] critiques of Eliezer's views on AI

2007-06-28 Thread Matt Mahoney
--- Stathis Papaioannou [EMAIL PROTECTED] wrote: On 28/06/07, Matt Mahoney [EMAIL PROTECTED] wrote: So how do we approach the question of uploading without leading to a contradiction? I suggest we approach it in the context of outside observers simulating competing agents. How

Re: [singularity] critiques of Eliezer's views on AI (was: Re: Personal attacks)

2007-06-25 Thread Matt Mahoney
? And then the original friend walks in... -- Matt Mahoney, [EMAIL PROTECTED]

Re: [singularity] critiques of Eliezer's views on AI

2007-06-25 Thread Matt Mahoney
new ways to not invent AI. =((( -- Opera: Sing it loud! :o( )- -- Matt Mahoney, [EMAIL PROTECTED]

Re: [singularity] critiques of Eliezer's views on AI

2007-06-25 Thread Matt Mahoney
--- Jey Kottalam [EMAIL PROTECTED] wrote: On 6/25/07, Matt Mahoney [EMAIL PROTECTED] wrote: You can only transfer consciousness if you kill the original. What is the justification for this claim? There is none, which is what I was trying to argue. Consciousness does not actually

Re: [singularity] critiques of Eliezer's views on AI (was: Re: Personal attacks)

2007-06-24 Thread Matt Mahoney
as building a 747, and then figuring out what to program with regards to volition, death, human suffering, etc. as learning how to fly the 747 and finding a good destination. - Tom --- Matt Mahoney [EMAIL PROTECTED] wrote: I think I am missing something on this discussion

Re: [singularity] critiques of Eliezer's views on AI (was: Re: Personal attacks)

2007-06-23 Thread Matt Mahoney
? Suppose they build a single AGI, all the agents upload, and the AGI reprograms its goals and goes into a degenerate state or turns itself off. Would you care? -- Matt Mahoney, [EMAIL PROTECTED]

Re: [singularity] What form will superAGI take?

2007-06-16 Thread Matt Mahoney
in your simulated universe, which bear no resemblance to the universe in which the simulation is being run. This will all be clear after you die and wake up. -- Matt Mahoney, [EMAIL PROTECTED]

Re: [singularity] What form will superAGI take?

2007-06-16 Thread Matt Mahoney
--- Tom McCabe [EMAIL PROTECTED] wrote: --- Matt Mahoney [EMAIL PROTECTED] wrote: Or if your intellect advanced to the point where you could, you would not be able to describe what you observed to other humans. To use an analogy, a Singularity level intelligence would be as advanced

Re: [singularity] Getting ready for takeoff

2007-06-15 Thread Matt Mahoney
a world where resources are plentiful. For all you know, the latter has already happened. -- Matt Mahoney, [EMAIL PROTECTED]

[singularity] Will AGI make us stupid?

2007-05-20 Thread Matt Mahoney
directions, to decide which email we want to read, to do ever more of our work. When machines can do all of our thinking for us, what will happen to us? -- Matt Mahoney, [EMAIL PROTECTED]

Re: [singularity] Will AGI make us stupid?

2007-05-20 Thread Matt Mahoney
bad, or equivalently, create good music? How could human artists compete with machines that can customize their work for each individual in real time? I guess that leaves sex, but I would not be surprised to see some technical innovation here as well. -- Matt Mahoney, [EMAIL PROTECTED]

Re: [singularity] Will AGI make us stupid?

2007-05-20 Thread Matt Mahoney
... I plan to upload myself and become transhuman anyway, but maybe the Ben-version who stays a mostly-unimproved human will become a full-time musician ;-) ... Hell, with a few thousand years practice, he may even become a good one!!! -- Ben G -- Matt Mahoney, [EMAIL PROTECTED]

Re: [singularity] Will AGI make us stupid?

2007-05-20 Thread Matt Mahoney
--- Nathan Cook [EMAIL PROTECTED] wrote: On 5/21/07, Matt Mahoney [EMAIL PROTECTED] wrote: Now there really is no difference between being able to judge the quality of a movie (relative to a particular viewer or audience), and being able to generate high quality movies. So

Re: [singularity] Will AGI make us stupid?

2007-05-20 Thread Matt Mahoney
machines that obey our commands. But this is controversial. Should a machine obey a command to destroy itself or harm others? Do you want a gun that fires when you squeeze the trigger, or a gun that makes moral judgments and refuses to fire when aimed at another person? -- Matt Mahoney, [EMAIL PROTECTED]

Re: Neural language models (was Re: [singularity] Help get the 400k SIAI matching challenge on DIGG's front page)

2007-05-17 Thread Matt Mahoney
--- Richard Loosemore [EMAIL PROTECTED] wrote: Matt Mahoney wrote: What did your simulation actually accomplish? What were the results? What do you think you could achieve on a modern computer? Oh, I hope there's no misunderstanding: I did not build networks to do any kind

Re: Neural language models (was Re: [singularity] Help get the 400k SIAI matching challenge on DIGG's front page)

2007-05-17 Thread Matt Mahoney
. If we can estimate the complexity of language modeling in a similar way, I see no reason not to. -- Matt Mahoney, [EMAIL PROTECTED]

Re: Neural language models (was Re: [singularity] Help get the 400k SIAI matching challenge on DIGG's front page)

2007-05-17 Thread Matt Mahoney
--- Richard Loosemore [EMAIL PROTECTED] wrote: Matt Mahoney wrote: One problem with some connectionist models is trying to assign a 1-1 mapping between words and neurons. The brain might have 10^8 neurons devoted to language, enough to represent many copies of the different senses

Re: Neural language models (was Re: [singularity] Help get the 400k SIAI matching challenge on DIGG's front page)

2007-05-16 Thread Matt Mahoney
--- Richard Loosemore [EMAIL PROTECTED] wrote: Matt Mahoney wrote: I doubt you could model sentence structure usefully with a neural network capable of only a 200 word vocabulary. By the time children learn to use complete sentences they already know thousands of words after exposure

Re: Machine Motivation Gets Distorted Again [WAS Re: [singularity] Help get the 400k SIAI matching challenge on DIGG's front page]

2007-05-15 Thread Matt Mahoney
of Shane's work. After all, he is the one who proved the correctness of your assertion. -- Matt Mahoney, [EMAIL PROTECTED]

Re: Neural language models (was Re: [singularity] Help get the 400k SIAI matching challenge on DIGG's front page)

2007-05-15 Thread Matt Mahoney
--- Richard Loosemore [EMAIL PROTECTED] wrote: Matt Mahoney wrote: --- Tom McCabe [EMAIL PROTECTED] wrote: --- Matt Mahoney [EMAIL PROTECTED] wrote: Personally, I would experiment with neural language models that I can't currently implement because I lack the computing power

Re: [singularity] Help get the 400k SIAI matching challenge on DIGG's front page

2007-05-14 Thread Matt Mahoney
--- Eugen Leitl [EMAIL PROTECTED] wrote: On Sun, May 13, 2007 at 05:23:53PM -0700, Matt Mahoney wrote: It is not that hard, really. Each of the 10^5 PCs simulates about 10 mm^3 of brain tissue. You know, repeating assertions doesn't make them any more true. Axon diameter varies
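A quick volume check of the partitioning claim quoted above (the ~1.4e6 mm^3 brain-volume figure is a standard anatomical estimate, not a number from the thread):

```python
# How many 10 mm^3 chunks does a whole brain partition into?
BRAIN_MM3 = 1.4e6   # adult human brain volume, a standard ~1.4 L estimate
CHUNK_MM3 = 10      # tissue volume simulated per PC, from the post
print(BRAIN_MM3 / CHUNK_MM3)  # 140000.0, i.e. ~10^5 PCs as claimed
```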

Re: [singularity] Help get the 400k SIAI matching challenge on DIGG's front page

2007-05-14 Thread Matt Mahoney
doesn't know how many fingers I am holding up. -- Matt Mahoney, [EMAIL PROTECTED]

Re: [singularity] Help get the 400k SIAI matching challenge on DIGG's front page

2007-05-13 Thread Matt Mahoney
--- Tom McCabe [EMAIL PROTECTED] wrote: --- Matt Mahoney [EMAIL PROTECTED] wrote: --- Tom McCabe [EMAIL PROTECTED] wrote: You cannot get large amounts of computing power simply by hooking up a hundred thousand PCs for problems that are not easily parallelized, because

Re: [singularity] Help get the 400k SIAI matching challenge on DIGG's front page

2007-05-13 Thread Matt Mahoney
--- Tom McCabe [EMAIL PROTECTED] wrote: --- Matt Mahoney [EMAIL PROTECTED] wrote: Language and vision are prerequisites to AGI. No, they aren't, unless you care to suggest that someone with a defect who can't see and can't form sentences (eg, Helen Keller) is unintelligent. Helen

Re: [singularity] Help get the 400k SIAI matching challenge on DIGG's front page

2007-05-10 Thread Matt Mahoney
--- Tom McCabe [EMAIL PROTECTED] wrote: --- Matt Mahoney [EMAIL PROTECTED] wrote: I posted some comments on DIGG and looked at the videos by Thiel and Yudkowsky. I'm not sure I understand the push to build AGI with private donations when companies like Google are already

Re: [singularity] Why do you think your AGI design will work?

2007-04-24 Thread Matt Mahoney
to dumb down a machine just to duplicate human limitations. If AGI is not the Turing test, then what is it? What test do you propose? Without a definition, we should stop calling it AGI and focus on the problems for which machines are still inferior to humans, such as language or vision. -- Matt Mahoney

Re: [singularity] Why do you think your AGI design will work?

2007-04-24 Thread Matt Mahoney
--- Eugen Leitl [EMAIL PROTECTED] wrote: On Tue, Apr 24, 2007 at 01:35:31PM -0700, Matt Mahoney wrote: None, because we have not defined what AGI is. AGI is like porn. I'll know it when I see it. Not really. You recognize porn because you have seen examples of porn and not-porn

Re: [singularity] Why do you think your AGI design will work?

2007-04-24 Thread Matt Mahoney
from just following the way set by programmed rules. There is an algorithm. We just don't know what it is. -- Matt Mahoney, [EMAIL PROTECTED]

Re: [singularity] Implications of an already existing singularity.

2007-03-30 Thread Matt Mahoney
approaching a black hole in free fall observes nearby objects accelerating away in all directions. -- Matt Mahoney, [EMAIL PROTECTED]

Re: [singularity] Implications of an already existing singularity.

2007-03-30 Thread Matt Mahoney
--- Charles D Hixson [EMAIL PROTECTED] wrote: Matt Mahoney wrote: --- Eugen Leitl [EMAIL PROTECTED] wrote: ... A proton is a damn complex system. Don't see how you could equal it with one mere bit. I don't. I am equating one bit with a volume of space about

Re: [singularity] Implications of an already existing singularity.

2007-03-29 Thread Matt Mahoney
a wholesale rearrangement of a large majority of the matter in the solar system. A technology this advanced could also reprogram your neurons to make you believe whatever it wanted. There is no way you could detect this. -- Matt Mahoney, [EMAIL PROTECTED]

Re: Entropy of the universe [WAS Re: [singularity] Implications of an already existing singularity.]

2007-03-28 Thread Matt Mahoney
--- Richard Loosemore [EMAIL PROTECTED] wrote: Matt Mahoney wrote: *The entropy of the universe is of the order T^2 c^5/hG ~ 10^122 bits, where T is the age of the universe, c is the speed of light, h is Planck's constant and G is the gravitational constant. By coincidence
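Plugging standard constants into the quoted order-of-magnitude formula reproduces the figure (a quick check, not part of the original message):

```python
# Evaluate S ~ T^2 * c^5 / (h * G) with standard physical constants.
T = 4.35e17    # age of the universe, seconds (~13.8 billion years)
c = 2.998e8    # speed of light, m/s
h = 6.626e-34  # Planck's constant, J*s
G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2
print(T**2 * c**5 / (h * G))  # ~1.0e121, i.e. 10^122 to within an order of magnitude
```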

Re: [singularity] Scenarios for a simulated universe

2007-03-05 Thread Matt Mahoney
--- Richard Loosemore [EMAIL PROTECTED] wrote: Matt Mahoney wrote: --- Richard Loosemore [EMAIL PROTECTED] wrote: Matt Mahoney wrote: --- Richard Loosemore [EMAIL PROTECTED] wrote: What I wanted was a set of non-circular definitions of such terms as intelligence and learning, so

Re: [singularity] Scenarios for a simulated universe

2007-03-05 Thread Matt Mahoney
had a Turing machine, you still could not compute a solution to AIXI. It is not computable, like the halting problem. -- Matt Mahoney, [EMAIL PROTECTED]

Re: [singularity] Scenarios for a simulated universe

2007-03-04 Thread Matt Mahoney
--- Richard Loosemore [EMAIL PROTECTED] wrote: Matt Mahoney wrote: --- Richard Loosemore [EMAIL PROTECTED] wrote: What I wanted was a set of non-circular definitions of such terms as intelligence and learning, so that you could somehow *demonstrate* that your mathematical

Re: [singularity] Scenarios for a simulated universe

2007-03-03 Thread Matt Mahoney
does not necessarily imply learning. There are other approaches. -- Matt Mahoney, [EMAIL PROTECTED]

Re: [singularity] Why We are Almost Certainly not in a Simulation

2007-03-03 Thread Matt Mahoney
would one test for this belief? -- Matt Mahoney, [EMAIL PROTECTED]

Re: [singularity] Scenarios for a simulated universe

2007-03-02 Thread Matt Mahoney
to at least be reassuring. -- Matt Mahoney, [EMAIL PROTECTED]
