Risk Analytics on GPU using ISO C++
The clever people at NVIDIA show how to run financial risk analytics on GPU using ISO C++
This is awesome. We have previously worked with banks to run financial risk analytics on GPU, and this invariably involved rewriting the quantitative analytics libraries in CUDA.
Well, NVIDIA have just shared a different approach that uses ISO C++, so in principle the same code can be used across CPU and GPU.
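For anyone wondering what that looks like in practice, here is a minimal sketch in the ISO C++ parallel-algorithms style the approach relies on. To be clear, this is my own illustrative example, not NVIDIA's code: the option parameters, RNG setup, and Monte Carlo pricer are all assumed for the sake of demonstration.

```cpp
// Sketch of the ISO C++ "stdpar" style: standard parallel algorithms that a
// compiler such as nvc++ (with -stdpar=gpu) can offload to the GPU, while
// g++ or clang build the identical source for the CPU. The European call
// parameters and Monte Carlo setup below are illustrative assumptions.
#include <algorithm>
#include <cmath>
#include <execution>
#include <iostream>
#include <numeric>
#include <random>
#include <vector>

int main() {
    // Illustrative European call parameters (assumed for this example).
    const double S0 = 100.0, K = 105.0, r = 0.03, sigma = 0.2, T = 1.0;
    const std::size_t paths = 1'000'000;

    // Draw standard normals on the host; the pricing reduction below is
    // the part a stdpar-capable compiler can offload.
    std::vector<double> z(paths);
    std::mt19937_64 gen(42);
    std::normal_distribution<double> normal(0.0, 1.0);
    std::generate(z.begin(), z.end(), [&] { return normal(gen); });

    // ISO C++ parallel reduction: no CUDA in the source, same code
    // compiles for CPU and GPU.
    const double drift = (r - 0.5 * sigma * sigma) * T;
    const double vol   = sigma * std::sqrt(T);
    const double sum = std::transform_reduce(
        std::execution::par_unseq, z.begin(), z.end(), 0.0, std::plus<>{},
        [=](double zi) {
            const double ST = S0 * std::exp(drift + vol * zi);
            return std::max(ST - K, 0.0);  // call payoff per path
        });

    std::cout << "price ~ " << std::exp(-r * T) * sum / paths << '\n';
}
```

With NVIDIA's nvc++ compiler, `-stdpar=gpu` offloads the `transform_reduce` to the GPU; the same file builds for CPU with `g++ -std=c++17` (linking TBB for the parallel execution policies). There is no CUDA anywhere in the source.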
This could keep quants in work for a while longer, provided the approach is viable and the same code genuinely runs across CPU and GPU. It wouldn't be the first time a theoretically sound approach was scuppered by numerical differences in practice.
It also appears to require far less rewriting of the quant code than the CUDA-based rewrites we have seen in the past.
Anyone at QuantLib or ORE fancy adopting this? We'd be happy to help benchmark it across the cloud. I wonder how portable the code is across different CPU and GPU architectures...