Flick Harrison wrote:

> I can understand the temptation to reduce "digital" to "numbers."

There may be such a temptation, but at the end of the day, "digital"
and certain classes of "numbers" (namely the discrete ones), as
technical terms, are isomorphic: any digital representation can be
mapped one-to-one onto the natural numbers. There's no reduction
going on.

> But I think it borders on tautology to define digital as "computable  
> numbers... computable only by a computer."

Who proffered such a definition? (The conversation you're referring to
was a while ago...) In any case, the "computable only by a computer"
clause is redundant, if not nonsensical: a computable number is, by
definition, one whose digits can be produced by a mechanical
procedure, and that procedure can be carried out by any universal
machine, a patient human with pencil and paper included.

> As a filmmaker, I like to draw the line between analogue vs digital at  
> the binary code. And binary code is only "numbers" if you choose to  
> call it that.

Any sufficiently reasonable and useful definition of "numbers" would
include any binary code. This isn't simply a matter of nomenclature:
every finite binary string can be read directly as a natural number,
so the concept of countable numbers covers binary encoding exactly.
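To make the point concrete, here is a minimal sketch (in Python, my
choice, not anything from the thread) showing that a binary code and a
natural number are two spellings of the same object:

```python
# A bit string is already a number: int() reads it in base 2.
code = "1011"
value = int(code, 2)      # 1*8 + 0*4 + 1*2 + 1*1
print(value)              # 11

# The mapping runs the other way just as mechanically.
back = format(value, "b")
print(back)               # 1011
assert back == code
```

Nothing about the bit string changes when we call it a number; the two
notations are interconvertible without loss.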

> (Maybe I'm missing some basic computer tech -  
> are there non-binary computers?)

There are non-binary digital computers (the Soviet Setun, for one, ran
on balanced ternary), and there are non-digital (analog) computers.
There are computers of metal and computers of flesh and bone and
computers of the mind.
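For the curious: a balanced-ternary machine represents numbers with
the digits {-1, 0, 1} rather than {0, 1}. A sketch of that encoding
(mine, purely illustrative, nothing to do with any real machine's
circuitry):

```python
# Encode a non-negative integer in balanced ternary, digits in {-1, 0, 1},
# most significant digit first.
def to_balanced_ternary(n):
    digits = []
    while n:
        r = n % 3
        if r == 2:            # remainder 2 becomes digit -1 with a carry
            digits.append(-1)
            n = (n + 1) // 3
        else:
            digits.append(r)
            n = n // 3
    return digits[::-1] or [0]

# Decode: read the digits back as powers of 3.
def from_balanced_ternary(digits):
    value = 0
    for d in digits:
        value = value * 3 + d
    return value

print(to_balanced_ternary(5))                        # [1, -1, -1] = 9 - 3 - 1
assert from_balanced_ternary(to_balanced_ternary(5)) == 5
```

The representation is non-binary yet perfectly digital: still a finite
string of discrete symbols, still a number.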

> "Digital" is the smooth information curve converted to binary code.

That's "digitization". There are also entities which are discrete ab
initio, hence digital but never digitized: a text file typed at a
keyboard, for example, was never a smooth curve to begin with.
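The distinction is easy to see in code. This sketch (Python, my own
illustration) digitizes a smooth curve by sampling it at discrete
times and quantizing each sample to an integer level; the resulting
list of integers is digital, but a list of integers produced directly,
with no curve behind it, is equally digital without ever having been
digitized:

```python
import math

# Digitization: sample a smooth curve at n discrete times, then quantize
# each sample to one of `levels` integer values (here 256, i.e. 8 bits).
def digitize(f, n_samples=8, levels=256):
    samples = [f(t / n_samples) for t in range(n_samples)]
    lo, hi = min(samples), max(samples)
    scale = (levels - 1) / (hi - lo)
    return [round((s - lo) * scale) for s in samples]

wave = digitize(lambda t: math.sin(2 * math.pi * t))
print(wave)  # eight integers in 0..255: a digital signal with an analog past

# Digital ab initio: these integers never passed through a smooth curve.
born_digital = [0, 255, 0, 255]
```

Both lists are digital in exactly the same sense; only the first is
the product of digitization.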

-- 
Michael Wojcik
Micro Focus
Rhetoric & Writing, Michigan State University


#  distributed via <nettime>: no commercial use without permission
#  <nettime>  is a moderated mailing list for net criticism,
#  collaborative text filtering and cultural politics of the nets
#  more info: http://mail.kein.org/mailman/listinfo/nettime-l
#  archive: http://www.nettime.org contact: [email protected]
