We developed a Unity environment for testing communication between an automated vehicle and a pedestrian by means of a visual external human-machine interface (eHMI) [1–3]. Your task will be to extend this environment to support multiple pedestrian and driver agents (i.e., to create a multiplayer environment). The environment shall be modular: it should provide an API for adding multiple pedestrians and multiple drivers in various vehicles. The environment shall come with models of typical vehicles (e.g., small car, medium-size car, SUV), which should ideally be built from freely distributed assets.
The agents shall support control via an Oculus Rift head-mounted display ([login to view URL]) and an Xsens motion suit ([login to view URL]). The environment shall also support a Tobii Pro VR eye tracker ([login to view URL]). It should be possible to add both static and dynamic (moving) visual eHMIs, as well as auditory eHMIs, to all vehicles.
We are looking for an experienced Unity developer with a background in multiplayer game development. The final product will be tested in a full setup consisting of two PCs connected over a LAN, with a Tobii Pro VR eye tracker and an Oculus Rift connected for user input. We will help you test the system with these devices; we do not expect you to own them.
We will provide the single-agent environment described in our previous work, along with a repository containing the first steps toward the multi-agent implementation. You will work closely with researchers from Delft University of Technology, The Netherlands.
The finished product shall come with extensive documentation aimed at researchers with little Unity experience. It will be used in scientific experiments, so consistently low latency and stability are crucial. Please review the requirements in the attached PDF file and agree to them before submitting your bid.
1. [login to view URL]
2. [login to view URL]
3. [login to view URL]