I am still too stupid to operate Evolution :(

(and the comp.sci. department has trouble with its internet connection
so this will get even more delayed until they fix their machines)


-------- Forwarded Message --------
From: Peter Lund <[EMAIL PROTECTED]>
To: [EMAIL PROTECTED]
Subject: Re: [Open-graphics] Re: Jumper count to disconnect DDC from DVI
connector
Date: Tue, 12 Sep 2006 01:01:37 +0200

On Mon, 2006-09-11 at 22:28 +0000, [EMAIL PROTECTED] wrote:
>  -------------- Original message ----------------------
> From: Peter Lund <[EMAIL PROTECTED]>
> > 
> > > I _think_ 12 bits is long enough for each integer, but should we plan
> > > for 13 or 14 to future-proof it?
> > 
> > No.
> > 
> > Future proofing is way overrated.  It will be very easy to change when
> > (if!) necessary.
> 
>    Not if there aren't enough DIP switches on the board.

In that unlikely event there will be a new board with more DIP switches
and a new FPGA image.

Or some existing DIP switches (or combinations thereof) will be
reinterpreted (again with a new FPGA image).
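To illustrate the reinterpretation idea (every encoding here is made up for the example, not anything the actual board does): the same physical switch bits can be read as a raw value by one FPGA image and as a table index by the next, so the bit budget is never really fixed.

```python
# Hypothetical illustration: the same 12 physical DIP switch bits,
# decoded two different ways by two different FPGA images.

def decode_raw(bits):
    # Image A: the 12 bits are a literal timing value.
    return bits & 0xFFF

# Image B's built-in mode table (illustrative resolutions only).
MODE_TABLE = [(640, 480), (800, 600), (1024, 768), (1280, 1024)]

def decode_indexed(bits):
    # Image B: the low 2 bits index the mode table; the other
    # 10 bits are freed up for something else entirely.
    return MODE_TABLE[bits & 0x3]

switches = 0b100110101110   # whatever the user set on the board
print(decode_raw(switches))      # image A's reading
print(decode_indexed(switches))  # image B's reading
```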

http://xp.c2.com/YouArentGonnaNeedIt.html
http://en.wikipedia.org/wiki/You_Ain't_Gonna_Need_It

"The time spent is taken from necessary functionality.
      * The new features must be debugged, documented, and supported.
      * Any new feature imposes constraints on what can be done in the
        future, so an unnecessary feature now may prevent implementing a
        necessary feature later.
      * Until the feature is actually needed it is not possible to
        define what it should do, or to test it. This often results in
        such features not working right even if they eventually are
        needed.
      * It leads to code bloat; the software becomes larger and more
        complicated while providing no more functionality.
      * Unless there are specifications and some kind of revision
        control, the feature will never be known to programmers who
        could make use of it."

I think all the above apply to this discussion.  The last bullet point
from the above link probably even more so:

"Adding the new feature will inevitably suggest other new features. The
result is a snowball effect which can consume unlimited time and
resources for no benefit."

> > >    A tool with a microcontroller in it would be able to interpret a
> > > literal X modeline statement, and generate a video program for direct
> > > loading as an EEPROM image.
> > 
> > Or a program on a PC could do it.
> 
> The tool will interface to the lab PC (or a terminal) through an
> RS-232 port.  Microcontrollers typically have an internal UART.
> Interfacing to a printer port is a lot more bother, on both ends.

Hmm...  I'll try to spell it out more clearly.


People who need boutique resolutions and who do not have a BIOS or X
driver that is good enough will still need this DIP tool.  Or a custom
FPGA image.  Or a tool with an EEPROM with a custom image.

For bringup purposes, a PC will be able to simulate such an EEPROM
nicely, as long as the FPGA image does not try to speak too fast to it.
This has got about as much to do with microcontrollers as Saddam Hussein
had to do with Al-Qaeda.  A parallel port is quite versatile -- and you
only need very few of its pins.  If the FPGA uses voltages that are not
too far off from the 5V of the parallel port then it is easy to
interface them.  I have not ever suggested 1) putting a full parallel
port on a microcontroller and 2) communicating with a PC over said full
parallel port.
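The modeline-to-timing conversion such a host-side tool would do is trivial on a PC.  A minimal sketch, assuming the standard X modeline field order (pixel clock in MHz, then the four horizontal and four vertical values); the dictionary keys are illustrative names, not any agreed-on EEPROM layout:

```python
# Sketch: parse a standard X modeline and derive the timing values
# (porches, sync widths, refresh rate) that a host-side tool would
# then pack into an EEPROM image for the FPGA.

def parse_modeline(line):
    fields = line.split()
    # Modeline "name" pclk  hdisp hsyncstart hsyncend htotal
    #                       vdisp vsyncstart vsyncend vtotal [flags]
    name = fields[1].strip('"')
    pclk_mhz = float(fields[2])
    hdisp, hss, hse, htot = (int(x) for x in fields[3:7])
    vdisp, vss, vse, vtot = (int(x) for x in fields[7:11])
    return {
        "name": name,
        "pixel_clock_mhz": pclk_mhz,
        "h_front_porch": hss - hdisp,
        "h_sync_width": hse - hss,
        "h_back_porch": htot - hse,
        "v_front_porch": vss - vdisp,
        "v_sync_width": vse - vss,
        "v_back_porch": vtot - vse,
        "refresh_hz": pclk_mhz * 1e6 / (htot * vtot),
    }

t = parse_modeline(
    'Modeline "1024x768" 65.0 1024 1048 1184 1344 768 771 777 806')
print(t["h_sync_width"], round(t["refresh_hz"], 1))  # prints: 136 60.0
```

Note that every timing value in that standard 1024x768 mode fits comfortably in 12 bits, which is the point made above.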

In any case, as long as we are only working on the prototype card the
problem can be solved just as easily by setting a `define in the Verilog
and generating a custom FPGA image.

As far as I can tell, there is NO serious need to worry about boutique
frequencies until we get to the detailed planning stage of the ASIC
version.

-Peter

_______________________________________________
Open-graphics mailing list
[email protected]
http://lists.duskglow.com/mailman/listinfo/open-graphics
List service provided by Duskglow Consulting, LLC (www.duskglow.com)
