On Sun, Dec 31, 2017 at 11:52:45AM +1300, Douglas Bagnall wrote:
Just by looking at the Athalye et al. turtle, you can see that the
system associates rifles with the texture of polished wood. And indeed
when you look in the ImageNet rifle category you see a lot of wood,
not only in the guns
It will become progressively harder to keep a big secret database: it's
just one planet with only so many possible frames, and anything on a
disk eventually becomes (more or less) public. The race for data then
ends as everyone has everything.
Ultimately, the situation becomes analogous to
On 29/12/17 20:05, Morlock Elloi wrote:
> Longer version and remarks: current ML systems appear to be linear,
> so it's possible to synthesize diversions even without knowing how a
> particular system works. Systems can be fooled into miscategorizing
> visual images (turtle gets recognized as a
I have only a casual understanding of ML, but it was always
counter-intuitive to me that simple polynomial units can somehow produce
a magical macro effect which no one understands but which "just works".
If it turns out that it's just a roundabout way of conditioning a linear
system, the magic goes
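For what it's worth, the "roundabout way of conditioning a linear system" intuition can be made concrete for ReLU networks: around any given input, the whole network collapses to one plain linear map. A minimal numpy sketch with toy random weights (a stand-in for illustration, not any real trained model):

```python
import numpy as np

# Toy 2-layer ReLU net with random weights (an illustrative stand-in,
# not any particular trained model).
rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(8, 4)), rng.normal(size=8)
W2, b2 = rng.normal(size=(1, 8)), rng.normal(size=1)

def net(x):
    h = np.maximum(W1 @ x + b1, 0.0)   # ReLU hidden layer
    return W2 @ h + b2

x = rng.normal(size=4)

# The ReLU pattern at x decides which hidden units are active; holding
# that pattern fixed, the network is exactly the linear map J @ x + c.
mask = (W1 @ x + b1 > 0).astype(float)
J = W2 @ (mask[:, None] * W1)          # zero out inactive rows of W1
c = W2 @ (mask * b1) + b2

print(np.allclose(net(x), J @ x + c))  # the local linear map matches
```

So the nonlinearity only picks which linear system you're in; within each region the "magic" is ordinary linear algebra, which is one reading of why attacks crafted against one model transfer to another.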
On Thu, Dec 28, 2017 at 11:05:22PM -0800, Morlock Elloi wrote:
Longer version and remarks: current ML systems appear to be linear, so
it's possible to synthesize diversions even without knowing how a
particular system works. Systems can be fooled into miscategorizing visual
images (turtle gets
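The linearity point is also what makes such diversions cheap to synthesize: if the score is (locally) a dot product w.x, nudging every input coordinate by a tiny eps against sign(w) shifts the score by eps * sum(|w_i|), which grows with input dimension while each coordinate barely moves. This is the idea behind Goodfellow et al.'s fast gradient sign method; a hedged sketch on a toy linear classifier (random weights standing in for a trained model):

```python
import numpy as np

# Toy linear classifier: score = w . x, class = sign(score).
# Random weights are an assumption for illustration only.
rng = np.random.default_rng(0)
d = 10_000
w = rng.normal(size=d)
x = rng.normal(size=d)

score = w @ x

# Fast-gradient-sign-style perturbation: the gradient of a linear
# score w.r.t. the input is just w, so stepping each coordinate by
# eps toward the other class shifts the score by eps * sum(|w_i|).
# That sum scales with d, while no coordinate moves by more than eps.
eps = 0.05
x_adv = x - eps * np.sign(w) * np.sign(score)

adv_score = w @ x_adv
print(np.sign(score), np.sign(adv_score))  # the two signs differ
```

Nothing here needs the target's internals beyond (approximate) linearity, which is why perturbations crafted on one model so often fool another.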
There was one remarkable presentation at 34C3 - "Deep Learning
Blindspots" by Katharine Jarmul, available here
https://media.ccc.de/v/34c3-8860-deep_learning_blindspots and here
https://www.youtube.com/watch?v=BVJT-sE0WWQ .
TL;DR: the arms race in machine learning ("AI" for consumers) has