The requirement is to create an API-, SDK-, plugin-, or app-based virtual camera (which of these, we can decide later) that:
1. Renders a real-time 3D model from local storage.
2. Gets the phone's front-camera feed and overlays the render on it.
3. Lets the user move the render around the screen, increase/decrease its size, and rotate it through a full 360 degrees.
The scenario is the same as what happens in OBS. In OBS, you can choose multiple sources and combine them into one scene; OBS then creates a virtual camera. This virtual camera becomes available in Zoom, Google Meet, or any other video-conferencing software as an additional camera. When you select "OBS Virtual Camera", the attendees see the combined frame generated by OBS.
We are looking to build something similar. Our target platform is Android. The output will resemble an AR view, but you do not need to set up any floor scanning or image tracking, so there is no need for those frameworks. Just keep the render overlaid on the screen, under the user's control.
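As a rough illustration of the per-frame work the virtual camera would do when combining the camera feed with the render, the sketch below alpha-blends one rendered overlay pixel onto a camera pixel. This is plain Java with straight-alpha ARGB; the class and method names are our illustrative assumptions, and a real implementation would do this on the GPU for every frame:

```java
// Minimal sketch of the per-frame compositing step: blending a rendered
// overlay pixel onto a camera-feed pixel. Names are illustrative only.
public class Composite {
    // Blend a straight-alpha ARGB overlay pixel over an opaque camera pixel.
    static int blend(int cameraRgb, int overlayArgb) {
        int a = (overlayArgb >>> 24) & 0xFF;
        int r = blendChannel((cameraRgb >> 16) & 0xFF, (overlayArgb >> 16) & 0xFF, a);
        int g = blendChannel((cameraRgb >> 8) & 0xFF, (overlayArgb >> 8) & 0xFF, a);
        int b = blendChannel(cameraRgb & 0xFF, overlayArgb & 0xFF, a);
        return (0xFF << 24) | (r << 16) | (g << 8) | b;
    }

    static int blendChannel(int cam, int over, int alpha) {
        // Standard "over" operator on one 8-bit channel.
        return (over * alpha + cam * (255 - alpha)) / 255;
    }

    public static void main(String[] args) {
        // A fully opaque overlay pixel replaces the camera pixel...
        System.out.printf("%08X%n", blend(0xFF0000, 0xFF00FF00)); // prints FF00FF00
        // ...while a fully transparent one leaves it unchanged.
        System.out.printf("%08X%n", blend(0xFF0000, 0x0000FF00)); // prints FFFF0000
    }
}
```

The composited frame is what the virtual-camera device would then hand to Zoom/Meet in place of the raw camera feed.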
We also need a design for how the controls for changing the size/orientation should look and where they should be placed on screen.
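Whatever the controls look like, they ultimately just update a transform that is applied to the render each frame. A minimal sketch of that state, assuming drag-to-move, pinch-to-scale, and twist-to-rotate gestures (the class name, fields, and clamp limits are our assumptions, not part of the brief):

```java
// Hypothetical transform state for the user-controlled overlay render.
// Gesture handlers (or on-screen buttons/sliders) would call these methods.
public class OverlayTransform {
    double x = 0, y = 0;       // screen-space position, pixels
    double scale = 1.0;        // uniform scale factor
    double rotationDeg = 0.0;  // rotation about the view axis, degrees

    // Drag gesture: translate by the finger delta.
    void moveBy(double dx, double dy) { x += dx; y += dy; }

    // Pinch gesture: multiply the scale, clamped to an assumed sane range.
    void scaleBy(double factor) {
        scale = Math.min(10.0, Math.max(0.1, scale * factor));
    }

    // Twist (two-finger rotate) gesture: wrap the angle into [0, 360).
    void rotateBy(double deltaDeg) {
        rotationDeg = ((rotationDeg + deltaDeg) % 360.0 + 360.0) % 360.0;
    }

    public static void main(String[] args) {
        OverlayTransform t = new OverlayTransform();
        t.moveBy(50, -20);
        t.scaleBy(2.0);
        t.rotateBy(-90);
        System.out.println(t.x + " " + t.y + " " + t.scale + " " + t.rotationDeg);
        // prints: 50.0 -20.0 2.0 270.0
    }
}
```

Keeping the transform as plain data like this makes it easy to drive from either touch gestures or explicit on-screen controls, whichever design is chosen.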
This must be done in 2 parts.
1st module - An independent virtual-camera (VC) module that the user can install/set up directly and use with any video-conferencing software.
2nd module - A web API/SDK that can be embedded in custom-built video-conferencing software, letting the providers bundle our product so their users do not need to go through installation/setup themselves.
More details can be shared later. A sample output of this PoC is attached.
Before committing, make sure that:
1. The developer must be ready to work in Indian Standard Time (IST).
2. Must be flexible about changes/iterations made during development to get the best output.
3. Must be available for standups to check and analyse the progress.
4. Must give a tentative timeline and a detailed execution plan before committing.