A job title isn't the same thing as being a #realdeal engineer.

Notice that a P.E. license isn't really required for manufacturing, but it is
required if you are building a bridge. QED: that is the nature of my
argument.

How often does the SEC prosecute someone for an ethics violation?

Kinda important.

-GJS

On Wed, May 6, 2015 at 11:20 AM, Piaget Modeler <[email protected]>
wrote:

> Actually, there are Financial Engineers.  That is a job title.
>
> ~PM
>
> ------------------------------
> Date: Wed, 6 May 2015 09:37:33 -0500
>
> Subject: Re: [agi] AGI as adaptive control
> From: [email protected]
> To: [email protected]
>
> Yo Piaget,
>
> I'm paranoid. So sue me. Maybe it is time to get a little more paranoid.
>
> I remain unconvinced that robotic microtransactions are a good thing,
> mkay?
>
> These finance guys are not engineers; not a man jack or jill of them has a
> P.E. license while working for JP Morgan Chase or Wells Fargo or whoever.
> This is like saying, "Sure, go ahead and practice taking random bolts and
> bricks out of the Brooklyn Bridge; if anything goes wrong, we're sure it
> won't be a big deal."
>
> WTF? What first world civilization freaking thinks like this? STOOOOPID!
>
> *Greed is bad*. Do we really need to burn down our entire house in order
> to figure this out and keep a bunch of Ivy League nitwits in stock
> dividends and Mercedes?
>
> *drops the mic* I'm out.
>
> -GJS
>
> On Mon, May 4, 2015 at 9:55 PM, Piaget Modeler <[email protected]>
> wrote:
>
> Your funds are already being managed by software programs.
>
> AI programs are just a few deployments down the road; that is, if they're
> not already trading securities with neural nets, case-based reasoning,
> fuzzy logic, or rule-based systems.
>
> Nothing new there.
>
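> Just to make that concrete, here is a rough sketch (purely illustrative,
> nobody's production system, names and parameters made up) of the kind of
> rule-based signal that has been running on desks for years:
>
>     import statistics
>
>     def moving_average_signal(prices, short_window=5, long_window=20):
>         """Toy rule-based signal: buy while the short-term average price
>         sits above the long-term average, sell while it sits below."""
>         if len(prices) < long_window:
>             return "hold"  # not enough history to decide yet
>         short_avg = statistics.mean(prices[-short_window:])
>         long_avg = statistics.mean(prices[-long_window:])
>         if short_avg > long_avg:
>             return "buy"
>         if short_avg < long_avg:
>             return "sell"
>         return "hold"
>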
> ~PM
>
> ------------------------------
> Date: Mon, 4 May 2015 21:18:57 -0500
> Subject: Re: [agi] AGI as adaptive control
> From: [email protected]
> To: [email protected]
>
>
> Also, I think the idea of turning funds over to AI-managed software is a
> little scary. In the event of some major market-shaking event, what ethics
> of opportunity topology shuts down the bots, rather than letting them see
> it as just another arbitrage opportunity? I don't see many good ways for
> the SEC to punish a rogue algorithm.
>
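> To be concrete about the bare minimum I'd want, here is a crude kill-switch
> sketch, a check the bot has to pass before every order goes out (the
> function names and the 7% threshold are mine, not any real exchange rule):
>
>     def circuit_breaker_tripped(last_price, reference_price, max_drop=0.07):
>         """Crude kill switch: stand down if the price has fallen more than
>         max_drop (here 7%) from a reference such as the previous close."""
>         return last_price <= reference_price * (1.0 - max_drop)
>
>     def place_order(order, last_price, reference_price, send):
>         """Refuse to trade during a crash instead of 'arbitraging' it."""
>         if circuit_breaker_tripped(last_price, reference_price):
>             return None  # stand down; let a human decide what happens next
>         return send(order)
>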
> At the least, the Enron traders and their bosses eventually got their
> asses kicked by the legal profession, even if it was a day late and a
> dollar short. We had similar nonsense go off when the mortgage meltdown
> turned a bunch of sophisticated mathematical candy floss into a smoking
> crater in the ground (thank you, Congress and a bunch of over-educated Ivy
> League kids).
>
> Hey, what the hell do I know? I'm just a metallurgist and inventor and
> occasional consultant and... never mind.
>
> I just worry... failure analysts are good at anxiety and paranoia.
>
> -GJS
>
> On Mon, May 4, 2015 at 9:07 PM, Greg Staskowski <[email protected]>
> wrote:
>
> All right! #HAJIME! #ITSON! I like seeing the hardware vs. software
> argument presented very rationally and point by point in terms of the
> adaptive control systems paradigm (ecch, I know, I couldn't think of a
> less overused word).
>
> I come down on the side of hardware for this stuff, and what I really want
> to understand is the quantum biophysics of the brain. If we are really
> going for "the whole enchilada" that is "embodied or unembodied reasoning,
> Turing or non-Turing, consciousness" rather than "just another algorithm,
> JA^2", I think this thing has to start from the hardware or wetware up, and
> that means we need to really understand the quantum biophysics of a living,
> functioning human or whale brain. Hey, good luck getting the funding for
> whale brains, of course.
>
> This is a "non trivial" problem. Now Wolfram et al seems to think this
> whole deal starts down at the level of cellular automata but don't quote me
> on that because d00d is kinda a Genius at this stuff. I tend to come out
> more on the side of ethics and applications especially in terms of
> diversified system operators, the IOT (Internet of things) and agents for
> selection of what I call "opportunity topology." If we want to start
> experimenting with silicon though, Northwestern has a NuFab facility where
> you can work out new silicon at what are basically bargain rates. Check it
> out. I think this stuff is way too complex to simulate and the only way to
> get there is Wright Brothers style. Iterate, re-design, Iterate, re-design
> till you run out of money of course.
>
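> For anyone who has never played with one, here is a toy elementary cellular
> automaton (Rule 110, one of the rules Wolfram catalogued) just to show what
> "starting at that level" looks like; a throwaway illustration, nothing more:
>
>     def step(cells, rule=110):
>         """One update of an elementary cellular automaton: each cell's next
>         state is looked up from its three-cell neighborhood in the rule."""
>         n = len(cells)
>         return [
>             (rule >> (4 * cells[(i - 1) % n] + 2 * cells[i] + cells[(i + 1) % n])) & 1
>             for i in range(n)
>         ]
>
>     # Start from a single live cell and watch structure emerge.
>     row = [0] * 31
>     row[15] = 1
>     for _ in range(15):
>         print("".join("#" if c else "." for c in row))
>         row = step(row)
>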
> That is my opinion; I could be wrong.
>
> -GJS
>
> On Mon, May 4, 2015 at 6:06 PM, Stefan Pernar <[email protected]>
> wrote:
>
> The page limit apparently applies to publication in the proceedings.
>
> A fair number of Bible references, for sure. Have a read. I really
> appreciate every pair of eyes that goes over it.
> On 05/05/2015 9:04 AM, "Anastasios Tsiolakidis" <[email protected]>
> wrote:
>
>
> On Tue, May 5, 2015 at 12:30 AM, Stefan Pernar <[email protected]>
> wrote:
>
> My paper building on the subject has been accepted for presentation at the
> AGI Conference Ben organizes in July in Berlin:
> http://rationalmorality.info/wp-content/uploads/2015/04/TranshumanPhilosophy_formatted.pdf
>
>
>
> Isn't this a bit too long, considering the 10-page limit? Do I detect a
> certain Judaism in the text? Surely it will go down with Ben better than my
> Christian references, lol.
>
> AT


