Yet Another Z800 Graphics Card Question
09-04-2018 04:22 PM
Z800 version 2
3.04GHz x1 CPU
1150 Watt PSU
8GB RAM
Latest BIOS Update
I bought it with a PNY GeForce GTX 285 which is...lacking.
Came into a 1080 Ti and think that I've got the headroom needed to use it, given that I'm using the 1150 Watt PSU.
The machine tells me otherwise. Any time I've pushed it, even with benchmarks, it goes to emergency shutdown/restart. I've checked all of the drivers, HP, Microsoft, Windows stuff, and run all of the self-checks (HP & Win), but nothing seems to be awry.
Any ideas, or is this squarely an incompatibility issue?
Thanks.
FWIW - I used to have 2 CPUs clocked at 3.04GHz, but pulled one when I began to think that I needed to "free up" a few watts. I could also pop two (or one!) of the stock CPUs back in, clocked at 2.66GHz each, for still more watts.
09-04-2018 08:12 PM - edited 09-04-2018 08:20 PM
MikeArkus,
I'm wondering, though, if the system is shutting down due to thermal load. Have you run HWMonitor or similar to see what the CPU temperatures are doing?
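If you want more than a spot check, something like the sketch below will log the card's temperature and power draw every few seconds, so the last readings before an emergency shutdown are on record. This is just a rough sketch, assuming the NVIDIA driver's nvidia-smi tool is on the PATH; HWMonitor will tell you much the same thing interactively.

```python
# Rough logging sketch (not HP- or thread-specific): poll nvidia-smi every few
# seconds and write timestamped temperature/power/load readings to a file, so
# the last values before a shutdown survive the restart.
# Assumes nvidia-smi is on the PATH, which the standard NVIDIA driver install provides.
import subprocess
import time
from datetime import datetime

QUERY = "temperature.gpu,power.draw,utilization.gpu"

while True:
    reading = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    line = f"{datetime.now().isoformat(timespec='seconds')}  {reading}"
    print(line)
    with open("gpu_log.txt", "a") as log:   # append so the data survives a crash
        log.write(line + "\n")
    time.sleep(5)
```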
Another possibility is a driver conflict; running a driver cleaner and reinstalling the driver might be a good first action.
In the list of GPU details:
https://www.studio1productions.com/Articles/NVidia-GPU-Chart.htm
The GTX 285 needs a 550W PSU and the GTX 1080 Ti needs 600W. Since GPUs needed more power when the z800 was designed than is usual today, I would be surprised if it were a power supply problem in terms of total capacity. However, if you determine the PCIe power is not sufficient, and the GTX 1080 Ti uses the typical 2X 8-pin, consider trying a 2-into-1 power adapter: the two standard PCIe power connectors into one 8-pin, with the other 8-pin run off the Molex 4-pin. I'm currently test driving a GTX 1070 Ti in a z620 using a 2X 6-pin to 1X 8-pin connector rather than trying a 1X 6-pin to 1X 8-pin.
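To put rough numbers on the adapter suggestion, here is a back-of-the-envelope sketch of what the card can draw against what the cabling can supply. The per-connector limits are the PCIe-spec figures (75W from the slot, 75W per 6-pin, 150W per 8-pin), 250W is the GTX 1080 Ti's reference board power, and the idea that the z800 harness has two native 6-pin leads is an assumption; check your own unit's cabling.

```python
# Back-of-the-envelope power-delivery check, not a measurement.
SLOT_W = 75                 # PCIe x16 slot, per spec
SIX_PIN_W = 75              # per 6-pin PCIe aux connector, per spec
EIGHT_PIN_W = 150           # per 8-pin PCIe aux connector, per spec
CARD_BOARD_POWER_W = 250    # GTX 1080 Ti reference board power

native = SLOT_W + 2 * SIX_PIN_W        # assumed: two native 6-pin leads
adapted = SLOT_W + 2 * EIGHT_PIN_W     # two 8-pin inputs fed via adapters

print(f"native 6-pin wiring : {native} W available vs {CARD_BOARD_POWER_W} W card")
print(f"adapted to 2x 8-pin : {adapted} W available vs {CARD_BOARD_POWER_W} W card")
```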
I would note that the 3D performance of the GTX 1080 Ti will be reduced with a 3GHz CPU, but compute power should be strong. In Passmark benchmarks, there are 865 z800s tested and the highest 3D score for a GTX 1080 Ti is 9202, where the average for the GTX 1080 Ti is 14054. The highest score for a z800 with an X5675 is 8353, using a GTX 1080 (average = 12309). Even my overclocked Xeon E5-1680 v2 (8C @ 4.3GHz) with the GTX 1070 Ti makes a slightly below-average 3D mark of 12132 (average = 12168), whereas the i7-8700 5GHz hot rods can make 15000+. If you keep the GTX 1080 Ti, consider running one or two X5687s (4C @ 3.6/3.86GHz, under $50 these days) to improve single-thread performance. If the current CPU is an X5675, the single-thread score is 1402, whereas the X5687 is 1570. If more cores are needed, the X5690 (6C @ 3.47/3.73GHz, under $100) is a good one as well.
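For a feel of how those Passmark numbers sit relative to each other, here is the same arithmetic spelled out; the figures are simply the ones quoted above, not new results.

```python
# Relative standing of the quoted 3D marks against each GPU's all-systems average.
scores = {
    "z800 + GTX 1080 Ti (best result)":    (9202, 14054),
    "z800/X5675 + GTX 1080 (best result)": (8353, 12309),
    "E5-1680 v2 @ 4.3GHz + GTX 1070 Ti":   (12132, 12168),
}
for system, (score, gpu_average) in scores.items():
    print(f"{system}: {score} -> {score / gpu_average:.0%} of that GPU's average")

# Single-thread uplift from X5675 to X5687, per the quoted Passmark values.
print(f"X5687 vs X5675 single-thread: {1570 / 1402 - 1:.0%} faster")
```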
BambiBoomZ
09-05-2018 06:15 AM
The HP Z-series workstations are not like clone consumer-based systems, so you cannot use those methods of determining the needed power supply wattage.
Unlike most newer consumer systems, the HP workstations use a power supply with a multi-rail design, not a single rail.
As such, the listed wattage for each rail on the supply is the MAX it can deliver; if you exceed that, the supply will shut down.
A single-rail supply, on the other hand, is capable of delivering the unit's max rated wattage on any voltage, which can be bad if the unit develops a short, as all the amps will be available!
Pull the power supply out and look at the 12V GPU wattage listed on its label; that is its max.
The 1100 watt supply is able to power a 1070/1080, as these are 180 watt cards; the 1070 Ti is around 220 watts, which is within the 1100 watt supply's ability.
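To illustrate the multi-rail point with a toy example (the rail limits below are made-up numbers, not z800 label values; substitute whatever is printed on your supply): the total can look generous while a single rail is still the binding limit.

```python
# Toy multi-rail check; the per-rail maximums here are hypothetical placeholders.
RAIL_MAX_W = {
    "12V_cpu": 500,
    "12V_gpu": 300,          # the number that actually matters for the card
    "12V_peripheral": 200,
}
gpu_peak_draw_w = 250        # approximate peak board power of the card

print(f"total capacity: {sum(RAIL_MAX_W.values())} W (looks like plenty)")
if gpu_peak_draw_w > RAIL_MAX_W["12V_gpu"]:
    print("GPU rail maximum exceeded -> the supply shuts down despite the headroom")
else:
    print(f"{gpu_peak_draw_w} W fits within the {RAIL_MAX_W['12V_gpu']} W GPU rail")
```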
Your card is already overclocked; there is no guarantee that it will reliably overclock even more.
Any overclock risks damaging the card, and the gains from an overclock will only be 1-3 frames, which is a poor return when you risk the card for that small gain.