How to use Nvidia Quadro K5000 graphics card as a processor on ...

04-13-2020 02:57 AM
Hi all,
I am using an HP Z820 with 64 GB of RAM and an Intel Xeon E5-2650 v2 processor (16 cores).
I want to use my graphics card as a processor too.
Has anybody done this kind of thing before?
04-13-2020 04:13 AM - edited 04-18-2020 07:56 AM
Hi Terekeme,
Your graphics card IS already working as a processor, by taking the entire graphics workload away from the main CPU on the motherboard. There are cases where the GPU performs the bulk of the computation, but these are very application specific, e.g.
1. GPU rendering - There are several render plug-ins, such as Redshift, V-Ray, Octane Render, etc., that let you take full advantage of your GPU's CUDA cores. Some of these, e.g. Redshift, can use multiple GPUs installed in your machine, and some render engines also support a 'hybrid' mode that uses the main CPU together with any available GPUs to render the video output.
2. Engineering/scientific simulations - Again, very application specific, e.g. Mathcad, ANSYS (fluid dynamics), SolidWorks (FEA), etc. Systems used extensively for engineering or scientific simulation very often have multiple GPUs installed, or a main GPU for the graphics output plus additional compute cards, e.g. Tesla compute cards. (NOTE: a compute card is effectively a standard GPU without the graphics output ports, i.e. there are no DVI, DP or HDMI ports on the card.)
Is there a specific application in which you want to take advantage of your GPU's computing power?
04-13-2020 11:44 PM
Yes, there are some programs I want to use with this specification,
but my programs are not related to graphics; they are forensic programs. When I run them on a heavy job, my processor's usage goes up, and I want my graphics card to share that load.
Via "on my desktop - right click - NVIDIA Control Panel - Manage 3D Settings - Program Settings - Add"
I tried to add my programs, but it was denied.
I also did a small search on the internet
and found the "NVIDIA CUDA Toolkit". It is described as a tool that lets the graphics card be used for general computing, but it still did not do anything. I want to be sure before I choose a way.
Thanks for the comments.
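For context, the CUDA Toolkit does not convert anything on its own: it is a development kit (compiler and libraries) that a program must be written against before it can use the GPU. As a rough illustration, here is a minimal sketch, using only standard CUDA runtime calls, of how a CUDA-aware program sees the installed GPU; the file name is just an example.

// query_gpu.cu - compile with: nvcc query_gpu.cu -o query_gpu
// Minimal sketch: lists the CUDA-capable GPUs visible to the runtime.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        printf("No CUDA-capable GPU detected.\n");
        return 1;
    }
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        printf("GPU %d: %s, %d SMs, %.1f GB, compute capability %d.%d\n",
               i, prop.name, prop.multiProcessorCount,
               prop.totalGlobalMem / (1024.0 * 1024.0 * 1024.0),
               prop.major, prop.minor);
    }
    return 0;
}

Installing the toolkit makes this kind of program possible to build and run; it does not change how existing, CPU-only programs behave.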
04-18-2020 07:55 AM
Generally, when you purchase any video software you are limited to CPU rendering by default, e.g. Maya, Premiere Pro, etc. To use the GPU's CUDA cores to render the output video you need to purchase a compatible 3rd-party GPU render plug-in, and these are usually quite expensive.
You cannot simply transfer your CPU's workload to the GPU unless your software supports GPU compute tasks, or supports 3rd-party plug-ins that utilize the GPU.
The CUDA Toolkit is mainly for software developers who create these plug-ins, or software packages that use the GPU's CUDA cores.
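As a rough illustration of what that developer-side work looks like, here is a minimal CUDA sketch. The kernel and launch syntax are standard CUDA; the array-addition workload is just a hypothetical stand-in for a real task. Note how the program must explicitly copy data to the GPU and launch code written for it; nothing moves there automatically.

// vector_add.cu - compile with: nvcc vector_add.cu -o vector_add
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// The __global__ qualifier marks code that runs on the GPU's CUDA cores.
__global__ void add(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    // Host (CPU) data.
    float* a = (float*)malloc(bytes);
    float* b = (float*)malloc(bytes);
    float* c = (float*)malloc(bytes);
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    // The program must explicitly allocate GPU memory, copy data over,
    // and launch the GPU kernel.
    float *da, *db, *dc;
    cudaMalloc(&da, bytes); cudaMalloc(&db, bytes); cudaMalloc(&dc, bytes);
    cudaMemcpy(da, a, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, b, bytes, cudaMemcpyHostToDevice);

    add<<<(n + 255) / 256, 256>>>(da, db, dc, n);
    cudaMemcpy(c, dc, bytes, cudaMemcpyDeviceToHost);

    printf("c[0] = %.1f\n", c[0]);  // expect 3.0

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(a); free(b); free(c);
    return 0;
}

This is why a forensic program that was never written (or plugged into) this way will only ever use the CPU, no matter what is set in the NVIDIA Control Panel.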