The performance doesn't scale perfectly linearly, but the scaling isn't
bad.
But this is where you have to do your own math, because it's not only
about the number of cards you have, but also the number of machines and
licenses you have to maintain. It may also depend on whether you have
other apps that use the GPU, like AE/Nuke/Fusion, and whether they can
even use multiple GPUs at all.
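The "do your own math" above can be sketched as a small script. All the prices, the license model, and the ~90% multi-GPU scaling factor below are illustrative assumptions, not real benchmarks or actual Redshift pricing:

```python
# Hypothetical cost/throughput comparison for GPU render setups.
# Every number here is an assumption for illustration only.

def total_cost(machines, gpus_per_machine,
               machine_price=2000, gpu_price=1000, license_price=500):
    """Up-front cost: one chassis and one render license per machine,
    plus the GPUs inside it."""
    return machines * (machine_price
                       + gpus_per_machine * gpu_price
                       + license_price)

def relative_throughput(machines, gpus_per_machine, scaling=0.9):
    """Assume each additional GPU in a box adds ~90% of the previous
    one's speed (imperfect in-box scaling), while separate machines
    scale linearly."""
    per_machine = sum(scaling ** i for i in range(gpus_per_machine))
    return machines * per_machine

for machines, gpus in [(1, 2), (2, 1)]:
    cost = total_cost(machines, gpus)
    speed = relative_throughput(machines, gpus)
    print(f"{machines} machine(s) x {gpus} GPU(s): "
          f"${cost}, ~{speed:.2f}x speed, ${cost / speed:.0f} per x")
```

With these made-up numbers, two cards in one box costs less per unit of speed than two single-card machines, because the second license and chassis outweigh the imperfect in-box scaling; with your own prices the answer could flip.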
Once you move into mass rendering with the GPU, things get complicated:
it's such a new field that there just isn't a large pool of common
knowledge to draw from. You really have to get your hands dirty and
figure out what works best for you.
-Tim
On 3/23/2014 11:40 AM, Leoung O'Young wrote:
We are interested in Redshift too; just wondering what the performance
difference is between having 2 Titans in one machine vs. 2 machines
with 1 Titan each?
On 23/03/2014 12:05 AM, Ed Manning wrote:
On the economic advantages of Redshift or other GPU renderers:
My current workstations are Mac Pro 3.1s left over from the company I
shut down in 2009 (Boot Camped into Windows). Essentially worthless
from a CPU standpoint. Putting a single $1,000 Titan GPU into one of
them makes it more efficient at rendering than any modern 16-core
$8,000 workstation running any CPU ray tracer. Putting 2 Titans in
them is like having my old 162-core blade-server render farm without
the $5,000/month electric bill, not to mention all the IT overhead and
license costs.
I have never seen a single piece of software (in concert with the
astonishing graphics hardware that is now so cheap and still getting
cheaper) have such a cost-reducing impact.
Plus the Redshift developers are fanatically hard workers and great communicators.