Basically, imagine a pixel-streamed experience where the user can move around a 3D scene and see frame updates in real time. Then, when the user stops moving, a photorealistic panorama render should be delivered so that the user can pan around the 360° photo locally without requiring additional frames from the server.
I would also want the ability to generate multiple panoramas at different points in the scene in the background while the user is static, so that I can cache those additional panoramas.
The generated panoramas should have minimal seams.
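For context on feasibility, here is a minimal sketch of the server-side capture step, assuming an Unreal Engine C++ setup: a USceneCaptureComponentCube renders all six faces into a UTextureRenderTargetCube on demand when the user stops moving (a single cube capture keeps the face boundaries consistent, which helps with the minimal-seams requirement). The class and function names (APanoramaCaptureActor, CapturePanoramaAt) are placeholders for illustration, not part of the posting.

// PanoramaCaptureActor.h — illustrative sketch only; names are placeholders.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/SceneCaptureComponentCube.h"
#include "Engine/TextureRenderTargetCube.h"
#include "PanoramaCaptureActor.generated.h"

UCLASS()
class APanoramaCaptureActor : public AActor
{
    GENERATED_BODY()

public:
    APanoramaCaptureActor()
    {
        PrimaryActorTick.bCanEverTick = false;

        CubeCapture = CreateDefaultSubobject<USceneCaptureComponentCube>(TEXT("CubeCapture"));
        RootComponent = CubeCapture;

        // Capture on demand only; the game triggers it explicitly when the user stops moving.
        CubeCapture->bCaptureEveryFrame = false;
        CubeCapture->bCaptureOnMovement = false;
    }

    // Capture a 360° cubemap at the given location. The cube render target can then be
    // converted to an equirectangular image and streamed to the client as the static panorama.
    UFUNCTION(BlueprintCallable, Category = "Panorama")
    void CapturePanoramaAt(const FVector& Location)
    {
        if (!CubeTarget)
        {
            CubeTarget = NewObject<UTextureRenderTargetCube>(this);
            CubeTarget->Init(2048, PF_FloatRGBA); // per-face resolution; tune for the GPU budget
        }

        SetActorLocation(Location);
        CubeCapture->TextureTarget = CubeTarget;
        CubeCapture->CaptureScene(); // single on-demand capture of all six faces
    }

private:
    UPROPERTY()
    USceneCaptureComponentCube* CubeCapture = nullptr;

    UPROPERTY()
    UTextureRenderTargetCube* CubeTarget = nullptr;
};

The cube-to-equirectangular conversion (e.g. a material or compute pass over the render target) and the encode/transport to the Pixel Streaming client are left out here; they would be part of the actual proof of concept.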
—————————
Deliverable:
A proof-of-concept demonstration of the feature described above. The solution should be reasonably performant on a mid-tier GPU (RTX 3070 or A4000). It doesn't have to be implemented end to end, but it should be enough to prove the approach. I would also like a write-up detailing how Lumen handles SceneCapture2D components from multiple locations and camera angles.
This solution should be implemented in C++ as much as possible, with Blueprints used only when absolutely necessary.
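As a rough illustration of the background-caching requirement above, one capture could be issued per interval while the user is idle so the extra GPU cost stays amortized on a mid-tier card. The names here (UPanoramaCacheComponent, SetUserIdle, CaptureInterval) are hypothetical, and the sketch assumes the APanoramaCaptureActor sketch above.

// PanoramaCacheComponent.h — illustrative sketch only; assumes PanoramaCaptureActor.h above.
#pragma once

#include "CoreMinimal.h"
#include "Components/ActorComponent.h"
#include "PanoramaCaptureActor.h"
#include "PanoramaCacheComponent.generated.h"

UCLASS(ClassGroup = (Custom), meta = (BlueprintSpawnableComponent))
class UPanoramaCacheComponent : public UActorComponent
{
    GENERATED_BODY()

public:
    UPanoramaCacheComponent()
    {
        PrimaryComponentTick.bCanEverTick = true;
    }

    // Candidate locations for pre-cached panoramas, filled by game code.
    UPROPERTY(EditAnywhere, Category = "Panorama")
    TArray<FVector> PendingLocations;

    // Minimum seconds between background captures to avoid frame-time spikes.
    UPROPERTY(EditAnywhere, Category = "Panorama")
    float CaptureInterval = 1.0f;

    UPROPERTY(EditAnywhere, Category = "Panorama")
    APanoramaCaptureActor* CaptureActor = nullptr;

    // Game code flips this when it detects the user has stopped / resumed moving.
    void SetUserIdle(bool bIdle) { bUserIsIdle = bIdle; }

    virtual void TickComponent(float DeltaTime, ELevelTick TickType,
                               FActorComponentTickFunction* ThisTickFunction) override
    {
        Super::TickComponent(DeltaTime, TickType, ThisTickFunction);

        if (!bUserIsIdle || PendingLocations.Num() == 0 || !CaptureActor)
        {
            return;
        }

        TimeSinceLastCapture += DeltaTime;
        if (TimeSinceLastCapture >= CaptureInterval)
        {
            TimeSinceLastCapture = 0.0f;
            const FVector Next = PendingLocations.Pop();
            CaptureActor->CapturePanoramaAt(Next); // result would then be cached for the client
        }
    }

private:
    bool bUserIsIdle = false;
    float TimeSinceLastCapture = 0.0f;
};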
Budget: $1,500
Posted On: July 15, 2024 15:23 UTC
Category: Video Game Development
Skills: Unreal Engine, C++, Game Engine, Panoramic Stitching
Country: United States
