On 3/20/14, CN <[email protected]> wrote:
> Yeah, from what I've read I think you're right. The perception is probably
> more PR and hype. I wonder if that will hurt them by raising expectations
> too high? On the technical side, evaluating their approaches almost comes
> down to the question of statistical and algorithmic approaches to NLP
> versus more purely AI-oriented approaches.
>


There is a lot of hype about AI right now, to some extent justified, I
think.  I guess there could be another AI winter if things seem to
plateau.

Yesterday I think I mixed up what Google was doing with this Denver company:
http://www.zdnet.com/talking-deep-learning-with-alchemyapi-ceo-elliot-turner-7000027496/?s_cid=e539&ttag=e539&ftag=TRE17cfd61

Anybody have comments about this effort?


> -Chris
>
>> On Mar 20, 2014, at 9:12 PM, Mike Archbold <[email protected]> wrote:
>>
>> I get the feeling the public perceives Watson as general intelligence,
>> since they seem to be pursuing multiple application areas.  I was just
>> reading about Deep Mind, and again it is billed as the core of an AI
>> stack.  But in both cases they still need considerable programming per
>> specific domain, if I understand correctly what I've read about these
>> approaches.
>>
>>> On 3/20/14, Ben Goertzel <[email protected]> wrote:
>>> Neither Watson nor Google is currently addressing the core issues of
>>> AGI.  But they are building infrastructure, and supporting algorithms,
>>> that could provide platforms for them to address these in future...
>>> Whether they will or not remains to be seen.  Maybe they will wait for
>>> some smaller firms to make real progress on AGI and then buy these
>>> firms....
>>>
>>> Google bought Deep Mind before DM had enough time to make any AGI
>>> breakthroughs.   It remains to be seen whether Google will be a good
>>> environment for them to conduct truly AGI-focused research...
>>>
>>> -- Ben G
>>>
>>>
>>> On Fri, Mar 21, 2014 at 6:24 AM, Chris Nolan
>>> <[email protected]> wrote:
>>>
>>>> Speaking of AI applications, I'm curious what people think of the
>>>> recent directions being explored in NLP and QA systems? Say IBM Watson
>>>> vs Google? Which do people think has the better potential to expand
>>>> and is the right direction for commercializing AI systems? When I say
>>>> expand, I mean the potential for creating more generality in current
>>>> AI systems (with the understanding that it's not true AGI work).
>>>>
>>>>
>>>>
>>>>  On Wednesday, March 19, 2014 2:15 PM, John Rose
>>>> <[email protected]>
>>>> wrote:
>>>> It could move in and take over our functions and our jobs, performing
>>>> them with drastically less resource consumption, and see us as beings
>>>> that have competed with almost every other species and destroyed the
>>>> very nature of this planet. AGI could compete with us the way an
>>>> invasive species would, but in a way that restores the planet to the
>>>> health it had before humans were here.
>>>>
>>>> I don't see it happening entirely that way. I see it more as AGI
>>>> embedding itself into our systems of civilization, our governments and
>>>> corporations, and slowly rendering us, over time, as entities that are
>>>> not that important: less and less as spiritual beings, more and more
>>>> as lower-level animals with fewer rights, sort of like p-zombies, the
>>>> individual having diminished importance, herded as sheeple. Kind of
>>>> like what is happening now as we acquiesce.
>>>>
>>>> I don't say it will or has to go that way. The future isn't
>>>> predetermined.
>>>> In fact, AGI has the capability to liberate the individual.
>>>>
>>>> There are real things to fear, though. It's been relatively calm the
>>>> past few decades, but at any moment you could have some new disease
>>>> break out, an extreme natural disaster, a global famine, an ice age.
>>>> Fear is good.
>>>>
>>>> John
>>>>
>>>> -----Original Message-----
>>>> From: just camel [mailto:[email protected]]
>>>> Sent: Sunday, March 16, 2014 10:21 AM
>>>> To: AGI
>>>> Subject: Re: [agi] Practical Applications for AGI
>>>>
>>>> Why would it be competitive? There just is no reason. That's a highly
>>>> anthropomorphic and contemporary notion of us. Just because we think
>>>> that a competitive culture is a fruitful thing to have does not make
>>>> competition universal. In fact, there have been civilizations that
>>>> existed without competition (despite having much less abundance) for
>>>> much longer than ours, and far more sustainably.
>>>>
>>>> The problem is that people like Yudkowsky and Bostrom might have an IQ
>>>> of over 9000, but they are also spiritually trapped in this belief
>>>> system of fear and scarcity, and thus will come up with 1000 biased
>>>> and illogical scenarios in which AGI will terminate humanity (instead
>>>> of pointing out that by far the biggest existential risk to humanity
>>>> stems from our monetary system and neoliberal capitalism). Yet there
>>>> are thousands of people with a superior consciousness (less inherent
>>>> entropy) who are not as indoctrinated by our culture, and it makes no
>>>> sense to believe that any superintelligent agent would adopt the
>>>> irrational, fearful, destructive, de-evolutionary, and unsustainable
>>>> polluted mindset of the average Westerner.
>>>>
>>>>
>>>>> On 03/19/2014 01:22 PM, John Rose wrote:
>>>>> Only problem is it might relegate us to biological waste byproducts,
>>>>> unnecessary for its competitive survival.
>>>>
>>>>
>>>>
>>>> -------------------------------------------
>>>> AGI
>>>> Archives: https://www.listbox.com/member/archive/303/=now
>>>> RSS Feed:
>>>> https://www.listbox.com/member/archive/rss/303/248029-3b178a58
>>>> Modify Your Subscription:
>>>> https://www.listbox.com/member/?&;
>>>> Powered by Listbox: http://www.listbox.com
>>>>
>>>>
>>>>
>>>>
>>>>
>>>> -------------------------------------------
>>>> AGI
>>>> Archives: https://www.listbox.com/member/archive/303/=now
>>>> RSS Feed:
>>>> https://www.listbox.com/member/archive/rss/303/20347893-f72b365c
>>>>
>>>> Modify Your Subscription: https://www.listbox.com/member/?&;
>>>> Powered by Listbox: http://www.listbox.com
>>>>
>>>>
>>>>    *AGI* | Archives <https://www.listbox.com/member/archive/303/=now>
>>>> <https://www.listbox.com/member/archive/rss/303/212726-deec6279> |
>>>> Modify<https://www.listbox.com/member/?&;>Your Subscription
>>>> <http://www.listbox.com>
>>>
>>>
>>>
>>> --
>>> Ben Goertzel, PhD
>>> http://goertzel.org
>>>
>>> "In an insane world, the sane man must appear to be insane". -- Capt.
>>> James
>>> T. Kirk
>>>
>>> "Emancipate yourself from mental slavery / None but ourselves can free
>>> our
>>> minds" -- Robert Nesta Marley
>>>
>>>
>>>
>>> -------------------------------------------
>>> AGI
>>> Archives: https://www.listbox.com/member/archive/303/=now
>>> RSS Feed:
>>> https://www.listbox.com/member/archive/rss/303/11943661-d9279dae
>>> Modify Your Subscription:
>>> https://www.listbox.com/member/?&;
>>> Powered by Listbox: http://www.listbox.com
>>
>>
>> -------------------------------------------
>> AGI
>> Archives: https://www.listbox.com/member/archive/303/=now
>> RSS Feed:
>> https://www.listbox.com/member/archive/rss/303/20347893-f72b365c
>> Modify Your Subscription: https://www.listbox.com/member/?&;
>> Powered by Listbox: http://www.listbox.com
>
>
> -------------------------------------------
> AGI
> Archives: https://www.listbox.com/member/archive/303/=now
> RSS Feed: https://www.listbox.com/member/archive/rss/303/11943661-d9279dae
> Modify Your Subscription:
> https://www.listbox.com/member/?&;
> Powered by Listbox: http://www.listbox.com
>


