On Aug 9, 2016, at 7:00 PM, ghdl-discuss-requ...@gna.org wrote:
From: Thomas Dejanovic <thomas.dejano...@gmail.com>
Subject: [Ghdl-discuss] Elapsed time increasing more than exponentially with linear increase in simulation time

I have seen this behaviour, without Xilinx primitives.  Unfortunately we were 
not able to find time to investigate the cause in much depth.  It makes our DSP 
very difficult to simulate for meaningful simulation times.
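For what it's worth, the factor-of-6 growth per doubling quoted below corresponds to elapsed time scaling roughly as n^log2(6) ≈ n^2.58 -- severely super-linear, though polynomial rather than literally exponential. Assuming a power-law model T(n) = c * n^k fitted to the reported timings (an assumption, not a measurement), a quick awk check reproduces them:

```shell
awk 'BEGIN {
  # If doubling n multiplies elapsed time by 6, then 2^k = 6, so k = log2(6).
  k = log(6) / log(2)
  printf "growth exponent k = %.3f\n", k
  # Predicted minutes from the reported 128-packet baseline (~2 min elapsed):
  for (n = 128; n <= 1024; n *= 2)
    printf "%d packets -> ~%.0f min\n", n, 2 * (n / 128) ^ k
}'
```

The predictions (2, 12, 72, 432 minutes) match the quoted figures, and the 432-minute extrapolation for 1024 packets is consistent with jobs still running when killed at 350 minutes.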

Are you able to determine if it is a specific Xilinx primitive that causes this?
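On the profiling question: a minimal sketch, assuming a Linux host, a GHDL build with the GCC or LLVM backend (which produce a standalone simulator executable, unlike mcode), and a hypothetical testbench entity `tb_top`:

```shell
# Analyze and elaborate the testbench (unit name tb_top is hypothetical)
ghdl -a tb_top.vhd
ghdl -e tb_top

# Sample the running simulation with Linux perf, recording call graphs
perf record -g ./tb_top --stop-time=10ms

# Interactive report of where the simulator spends its time
perf report
```

The report should indicate whether time goes into the processes of the Xilinx primitive models themselves or into GHDL's simulation kernel (signal updates and process scheduling), which would help narrow down whether a specific primitive is responsible.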

J.

> Hi all,
> 
> I have a simulation sending packets through a system.  It works very
> well until we start to use some Xilinx primitives. If I break the
> system down and leave out the module that has Xilinx primitives, the
> simulation runs at least an order of magnitude faster and elapsed time
> increases linearly with increasing simulation time. Once we
> instantiate code that contains Xilinx primitives, doubling the
> simulation time (i.e. doubling the number of packets sent) increases
> the elapsed time by about a factor of 6.
> 
> i.e. sending 128 packets takes about 2 minutes of elapsed time, 256 =>
> 12 minutes, 512 packets => ~72 minutes, and 1024 packets takes longer
> than I was willing to wait (jobs were killed after running for more
> than 350 minutes).
> 
> So my question is, has anyone else observed this behavior?
> 
> Is there any way for me to profile the program and find where it is
> spending its time?
> 
> 
> Best Regards, Thomas D
> -- 
> Sent from Tom's Fortress of Solitude!


_______________________________________________
Ghdl-discuss mailing list
Ghdl-discuss@gna.org
https://mail.gna.org/listinfo/ghdl-discuss
