OnSugarMountain
Level 1
Message 1 of 7

Z820 graphics card upgrade

Z820 v1 w/ 1125 watt PSU
Microsoft Windows 7 (64-bit)

After following the recent @Drawn thread regarding graphics card upgrading concerns in our workstations, it got me thinking about upgrading my original Quadro K6000 to an NVIDIA RTX to take advantage of Redshift RTX Acceleration.

 

[Attached image: passmark2.JPG]

I can't upgrade my dual E5-2670 v0 2.60 GHz processors (Z820 v1 motherboard), but I'm thinking a new RTX card might still improve rendering speed with an optimized RTX renderer. Here are my questions:

 

  1. Given my relatively slow dual 2.60 GHz CPUs, I'm leaning toward the RTX 2070 Super. Does anyone think there's a benefit to a higher-end or lower-end RTX?
  2. My Quadro K6000, like (I believe) all Quadros, is a "blower" style card, meaning hot air is blown out the rear of the card. The NVIDIA GeForce RTX cards come in both blower and axial-fan versions. The dual- and triple-axial-fan cards are a lot quieter than the blower versions, but instead of blowing the hot air outside the case, they blow it into the case. The Z820 comes with a PCI Retainer panel that covers the I/O slots, and I'm only going to use one GPU inside the Z820. Do you think the air circulation under the PCI Retainer panel is sufficient to support an axial-fan RTX, or should I stick with a blower-style RTX card?
6 REPLIES
DGroves
Level 11
Message 2 of 7

Before spending 500-plus dollars on a video card, I would check with the makers of the software I use to see whether it will actually support and use the card in the way I use the software... but hey, that's just me. Your ready-fire-aim approach is used by lots of people, and said people are usually disappointed in the achieved results in a lot of cases.

 

And one more thing: the 1125-watt supply in the Z820 has two GPU power rails (gpu1/gpu2). Make sure to attach each GPU rail to the video card's connectors; don't use the two GPU connectors on gpu1 only! Split the load across both rails.
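The rail split above reduces to simple arithmetic. The helper below is a hypothetical sketch, not HP documentation: the 240 W per-rail limit is an assumption made for illustration, and the connector draws (150 W for an 8-pin PCIe plug, 75 W for a 6-pin) are the PCIe spec maxima.

```python
def rail_loads(connections, rail_limit_w=240):
    """Sum the draw on each PSU rail and flag any rail over an assumed limit.

    connections maps a rail name to the list of connector draws, in watts,
    hanging off that rail (150 W for an 8-pin PCIe plug, 75 W for a 6-pin,
    per the PCIe spec maxima). The 240 W default limit is an assumption
    for illustration, not an HP specification.
    """
    loads = {rail: sum(draws) for rail, draws in connections.items()}
    ok = all(load <= rail_limit_w for load in loads.values())
    return ok, loads

# Both 8-pin plugs daisy-chained off gpu1: 300 W lands on one rail.
print(rail_loads({"gpu1": [150, 150]}))
# Split across gpu1 and gpu2 as advised above: 150 W each.
print(rail_loads({"gpu1": [150], "gpu2": [150]}))
```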

OnSugarMountain
Level 1
Message 3 of 7


@DGroves wrote:

Before spending 500-plus dollars on a video card, I would check with the makers of the software I use to see whether it will actually support and use the card in the way I use the software... but hey, that's just me. Your ready-fire-aim approach is used by lots of people, and said people are usually disappointed in the achieved results in a lot of cases.


There are so many variables, right? First, like you suggest, there's the software: modeling and rendering software vendors are busy optimizing new code to support the NVIDIA RTX accelerations (mainly for ray tracing). NVIDIA initially over-promised large rendering speed gains from "optimized" RTX code. So two variables are (1) how well written the RTX software is, and (2) the type and complexity of the 3D scene being rendered. A third variable is the impact a slow-clocked CPU has on the GPU. That's what I'm most interested in hearing back on: has anyone tested an NVIDIA RTX card in a first-gen Z820 running CPUs under 3.0 GHz?
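One way to reason about that third variable is Amdahl's law: if some fraction of each frame is CPU-side work (scene extraction, texture prep) that the GPU can't touch, a faster GPU only accelerates the remainder. A minimal sketch; the 80%/4x figures are made-up illustrations, not measurements of any renderer:

```python
def overall_speedup(gpu_fraction, gpu_speedup):
    """Amdahl's law: overall speedup when only `gpu_fraction` of the frame
    time benefits from a GPU that is `gpu_speedup` times faster, while the
    CPU-bound remainder is unchanged."""
    return 1.0 / ((1.0 - gpu_fraction) + gpu_fraction / gpu_speedup)

# If 80% of each frame is GPU work and the RTX path is 4x faster at it,
# the whole frame still only renders 2.5x faster; the CPU caps the gain.
print(overall_speedup(0.8, 4.0))
```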

 

Prior to the RTX generation of NVIDIA graphics cards, all Quadro and all GeForce Founders Edition graphics cards were "blower" style. The new NVIDIA RTX Founders Edition cards use axial fans, meaning hot air is exhausted back into the computer case. I suspect the HP engineers designed the Z820 with blower-style graphics cards in mind. But those blower cards are always loudly blowing, even when one is just reading emails or writing forum posts! Is anyone reading this using a 200-watt-or-higher axial-fan graphics card inside an HP workstation with that PCI Retainer panel in place? The test would be whether the fans on the graphics card turn off during idle time; that would suggest the Z820's internal cooling is effective. Otherwise the graphics card's thermal sensors would keep the axial fans running and you would hear them.
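For what it's worth, that idle-fan test doesn't have to be done by ear: `nvidia-smi` can report fan speed and temperature directly (assuming an NVIDIA driver is installed). The helper below is a hypothetical sketch; the sample string it parses is illustrative, not captured from a real Z820:

```python
import subprocess

SMI_CMD = ["nvidia-smi", "--query-gpu=fan.speed,temperature.gpu",
           "--format=csv,noheader,nounits"]

def fan_and_temp(sample=None):
    """Return (fan_percent, temp_c) for the first GPU.

    With sample=None the real driver is queried via nvidia-smi; passing a
    string instead lets the parser be exercised without NVIDIA hardware.
    """
    if sample is None:
        sample = subprocess.check_output(SMI_CMD, text=True)
    fan, temp = (field.strip() for field in sample.split(","))
    # Some cards report "[N/A]" for fan speed when the fans are stopped
    # or the reading is unavailable; treat that as 0%.
    fan_pct = 0 if fan.startswith("[") else int(fan)
    return fan_pct, int(temp)

# Parsing an illustrative reading of a card idling with its fans stopped:
print(fan_and_temp("0, 41"))
```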

 


@DGroves wrote:

And one more thing: the 1125-watt supply in the Z820 has two GPU power rails (gpu1/gpu2). Make sure to attach each GPU rail to the video card's connectors; don't use the two GPU connectors on gpu1 only! Split the load across both rails.


Thanks, @DGroves, for the good reminder that there's less than 250 watts available per rail! Edit: but note that there are three auxiliary dongles, labeled G1, G2, and G3, each on its own 12-volt rail.

Brian1965
Level 6
Message 4 of 7

Hi OnSugarMountain,

 

I realise I'm a little late in commenting on this thread, but I only just stumbled on this post and thought sharing my own experience might help with your situation. DGroves hit the nail on the head: what software you are using should dictate how to get the biggest bang for your buck (the biggest boost in performance for the least cost). First check whether the render engine you're planning to use is CPU- or GPU-based. For example, the default render engine in Maya only supports CPU rendering, but installing the (expensive) Redshift plug-in gives you GPU rendering capability. Similarly, the built-in SolidWorks render engine only supports CPU rendering, whereas the SolidWorks Visualize add-on (the more expensive SW Premium Edition) gives you full GPU or hybrid (CPU+GPU) rendering capability. The same applies to the choice of card, e.g. SolidWorks will only run with a Quadro card, not a GTX. Check what your software supports and is optimised for first!

 

Regardless of whether your software supports CPU or GPU rendering, consider upgrading to E5-2690 v1 CPU(s). The E5-2690 v1 has some of the best single-threaded performance of the entire E5-26xx v1 range. Look carefully at the first three columns (memory read/write) in the table below (my own benchmarks):

 

[Attached image: Benchmarking.JPG]

NOTE: The 2690's single-threaded performance is reduced significantly by adding a second 2690 CPU.

 

I use SolidWorks Visualize and my son uses the Redshift render plug-in in Maya; both are GPU/hybrid render engines, so single-threaded performance has a major influence. The rendering time is inversely proportional to the CPU's single-threaded performance, i.e. the better (higher) the single-threaded performance, the shorter the rendering time per frame.

If your software uses a CPU-based render engine, then more CPU cores would be better, e.g. 2x E5-2690 v1 CPUs. If your software has a GPU render engine, then higher CPU single-threaded performance and more CUDA cores would be better. In terms of pricing, GTX cards generally have more CUDA cores than similarly priced Quadro cards. In my HP Z620 set-up, I have both a Quadro P2000 and a GTX1080Ti to give me the maximum number of CUDA cores within the Z620's GPU/power limitations. Both SolidWorks Visualize and the Maya Redshift plug-in support multiple GPUs, but be wary: not all GPU render engines can take advantage of multiple GPU cards.
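That proportionality claim can be turned into a rough estimator: scale a measured per-frame render time by the ratio of single-threaded benchmark scores. A sketch under that assumption only; the scores and times below are placeholders, not benchmarks from this thread:

```python
def scaled_frame_time(measured_s, measured_st_score, target_st_score):
    """If per-frame render time is inversely proportional to the CPU's
    single-threaded benchmark score, estimate the time on another CPU."""
    return measured_s * measured_st_score / target_st_score

# A 120 s frame on a CPU scoring 1500 single-threaded would take about
# 90 s on one scoring 2000, under this (idealised) model.
print(scaled_frame_time(120, 1500, 2000))
```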

The bottom line is: do some investigation before spending loads of money. There's no point in adding a second GPU if your software uses a CPU render engine!

 

 

HP Z620 - Liquid Cooled E5-1680v2 @4.7GHz / 64GB Hynix PC3-14900R 1866MHz / GTX1080Ti FE 11GB / Quadro P2000 5GB / Samsung 256GB PCIe M.2 256GB AHCI / Passmark 9.0 Rating = 7147 / CPU 17461 / 2D 1019 / 3D 14464 / Mem 3153 / Disk 15451 / Single Threaded 2551
DGroves
Level 11
Message 5 of 7

Just a few notes on the previous two posts:

 

1. The Z820 has two PCIe slots that are white; these slots are not active unless a second CPU is installed. This is documented in the HP service and user manuals, but many people seem to miss it.

 

2. The Z820 has been approved for a video card plus coprocessor card(s) as a factory configuration; see the Avid or Tesla configurations below for details. Those setups can put out a lot of heat, and yet the system still runs within the factory thermal envelope without any issues. A current (or older) single GPU card, no matter how it's cooled, will not be able to exceed the system's cooling capability. However, a blower type that exhausts the heat directly is always preferred, and can be a requirement if using multiple high-wattage cards in every slot on a dual-CPU system with all memory banks filled.

 

http://resources.avid.com/supportfiles/config_guides/AVIDHPZ820ConfigguideRevF.pdf

 

https://www.nvidia.com/docs/IO/43395/NV-DS-Tesla-C2075.pdf

OnSugarMountain
Level 1
Message 6 of 7


@Brian1965 wrote:

Regardless of whether your software supports CPU or GPU rendering, consider upgrading to E5-2690 v1 CPU(s). The E5-2690 v1 has some of the best single-threaded performance of the entire E5-26xx v1 range. Look carefully at the first three columns (memory read/write) in the table below (my own benchmarks):


I wish I could upgrade to the E5-2690 v1 CPU(s). Unfortunately, my BIOS's boot block date is stuck at 12/28/11 no matter how many times I update the BIOS. Based on what I've read on this forum, I think I'm stuck with my dual E5-2670 v0 Xeons running at a relatively slow 2.6 GHz. At least I have 32 threads in total!

 


@Brian1965 wrote:

The rendering time is inversely proportional to the CPU's single-threaded performance, i.e. the better (higher) the single-threaded performance, the shorter the rendering time per frame.

If your software uses a CPU-based render engine, then more CPU cores would be better, e.g. 2x E5-2690 v1 CPUs. If your software has a GPU render engine, then higher CPU single-threaded performance and more CUDA cores would be better.


Very interesting, and thanks for sharing this insight. Since my rendering has always been CPU-based and multi-threaded, it will be interesting to see how my rendering speeds compare when I install my soon-to-be-delivered NVIDIA RTX 2080 Super GPU with 3072 CUDA cores. I will also be investing in Redshift to see whether Redshift RTX acceleration improves my rendering performance by using GPU rendering. From reading your post, I expect my 2.6 GHz processor speed will be a bottleneck.


Also, thanks, @DGroves, for your insight that

 

"a current (or older) single GPU card no matter how it's cooled will not be able to exceed the system cooling capability [of the z820]."

 

That's reassuring news, since the RTX 2080 Super GPU I am expecting is a dual-internal-fan EVGA card, which will blow its exhaust heat into the Z820 case. And I intend to run just this one GPU in my Z820.

Brian1965
Level 6
Message 7 of 7

Hi OnSugarMountain,

 

Sorry for the confusion with regard to my CPU terminology. When I say E5-2690 v1, I mean the CPU will work in the v1 motherboard (2011 boot block date). Looking on the ark.intel site, there is no mention of either a v0 or v1 version; the CPU simply has no version number, e.g.

 

[Attached image: intel 2690.JPG]

 

You are correct that it is not possible to update the motherboard boot block date; it is fixed during the motherboard's manufacture. It is, of course, possible to replace or upgrade a v1 motherboard (2011 boot block date) with a newer v2 motherboard (2013 boot block date).

 

If your current render engine is CPU-based, then you will not see any real reduction in rendering time until you install a GPU (CUDA) render plug-in, even with the newer RTX card.

 

Finally, here are some test results comparing CPU and GPU rendering times:

 

[Attached image: render times.JPG]

HP Z620 - Liquid Cooled E5-1680v2 @4.7GHz / 64GB Hynix PC3-14900R 1866MHz / GTX1080Ti FE 11GB / Quadro P2000 5GB / Samsung 256GB PCIe M.2 256GB AHCI / Passmark 9.0 Rating = 7147 / CPU 17461 / 2D 1019 / 3D 14464 / Mem 3153 / Disk 15451 / Single Threaded 2551