Augmented Reality platform for fast motion video applications

Edgybees technology

The Edgybees platform bridges the gap between fast-moving real-world platforms and high-level application engines.

Our engine uses information gathered from real-world sensors and devices, such as cameras, GPS, an IMU (inertial measurement unit), and altitude meters, to create a reliable reconstruction of the device's motion in 3D. It further provides lighting that corresponds to the scene and, in conjunction with cloud-based sources, a representation of obstacles and other objects.
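As a rough illustration of the kind of sensor fusion involved (a minimal sketch, not the actual Edgybees engine), the code below blends a high-rate but drift-prone IMU estimate with absolute GPS fixes using a simple complementary filter. The names and the blending factor are hypothetical.

```python
# Minimal sketch: fuse GPS and IMU-derived position estimates with a
# complementary filter. Not the Edgybees implementation.
from dataclasses import dataclass

@dataclass
class Pose:
    x: float  # metres, local tangent frame
    y: float
    z: float

def fuse(gps: Pose, imu_dead_reckoned: Pose, alpha: float = 0.98) -> Pose:
    """Blend a smooth but drifting IMU estimate with absolute GPS.

    alpha close to 1.0 trusts the high-rate IMU between GPS fixes;
    the GPS term pulls the estimate back to an absolute reference.
    """
    return Pose(
        x=alpha * imu_dead_reckoned.x + (1 - alpha) * gps.x,
        y=alpha * imu_dead_reckoned.y + (1 - alpha) * gps.y,
        z=alpha * imu_dead_reckoned.z + (1 - alpha) * gps.z,
    )
```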

The physical device is abstracted into a 3D point in the application engine space, allowing rapid development of games and other real-time applications.
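A hypothetical sketch of what such an abstraction might look like to application code: the device reduces to an identifier, a pose, and a velocity, regardless of which physical hardware produced them. The type and field names below are illustrative.

```python
# Illustrative abstraction of a physical device as a tracked point in
# application-engine space. Names are hypothetical.
from dataclasses import dataclass

@dataclass
class TrackedDevice:
    device_id: str
    position: tuple[float, float, float]            # metres in engine space
    orientation: tuple[float, float, float, float]  # quaternion (w, x, y, z)
    velocity: tuple[float, float, float]            # metres per second

def on_engine_tick(device: TrackedDevice) -> None:
    # Game or application logic only ever touches the abstract point,
    # never the raw sensors.
    print(f"{device.device_id} is at {device.position}")
```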

The video generated by the application engine is time-synced and applied over the actual video transmitted from the device, creating an immersive augmented reality experience in which real and virtual objects interact in a natural, convincing fashion.
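One way to picture the time-sync step, assuming both streams carry capture timestamps (an assumption for illustration, not the documented mechanism): for each incoming video frame, pick the rendered overlay frame closest in time before compositing.

```python
# Illustrative only: match a video frame to the closest-in-time rendered
# overlay frame before compositing them into one output frame.
def pick_overlay(video_ts_ms: int, rendered_frames: dict[int, bytes]) -> bytes:
    """rendered_frames maps render timestamp (ms) -> rendered RGBA frame."""
    nearest_ts = min(rendered_frames, key=lambda ts: abs(ts - video_ts_ms))
    return rendered_frames[nearest_ts]
```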

Multiple devices communicate in real time over a local network

To achieve an optimal user experience in multi-player cooperative applications, the participating devices need to share a coherent, synchronized reproduction of reality. The Edgybees platform establishes a low-latency network between the devices involved in an application and coordinates a millisecond-accurate map of their respective locations.
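A toy sketch of the idea, not the Edgybees protocol: each device broadcasts timestamped position updates over the local network, and every device keeps only the freshest fix per peer. The port number and message fields are placeholders.

```python
# Illustrative sketch: broadcast timestamped position updates over UDP on a
# local network and keep the newest entry per device.
import json
import socket
import time

PORT = 47800  # hypothetical port, not an Edgybees constant

def broadcast_position(device_id: str, position: tuple[float, float, float]) -> None:
    msg = json.dumps({"id": device_id, "pos": position, "ts": time.time()})
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(msg.encode(), ("255.255.255.255", PORT))

def update_map(shared_map: dict, raw: bytes) -> None:
    update = json.loads(raw)
    current = shared_map.get(update["id"])
    if current is None or update["ts"] > current["ts"]:
        shared_map[update["id"]] = update  # keep only the newest fix per peer
```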

Flight data is streamed for further processing in the cloud

Information gathered by the Edgybees software, including terrain altitude, locations of obstacles, and users' flight patterns, is streamed by the Edgybees platform to our back-end servers. Using big-data algorithms, we analyze this data to generate accurate, constantly improving maps of the terrain and obstacles.
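The record below sketches what a single streamed flight sample could contain; the endpoint URL and field names are placeholders, not the real Edgybees back-end API.

```python
# Hypothetical telemetry record streamed to a back-end for map building.
# The URL and field names are illustrative placeholders.
import json
from urllib import request

def send_flight_sample(lat: float, lon: float, terrain_alt_m: float,
                       obstacles: list[dict]) -> None:
    payload = {
        "lat": lat,
        "lon": lon,
        "terrain_altitude_m": terrain_alt_m,
        "obstacles": obstacles,  # e.g. [{"lat": ..., "lon": ..., "height_m": ...}]
    }
    req = request.Request(
        "https://backend.example.com/v1/flight-samples",  # placeholder URL
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    request.urlopen(req)
```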

Case study: First Response suite

Situational awareness and decision-making tools for drone pilots via augmented reality

Built on Edgybees technology, the First Response app supports a wide range of drones out of the box.

Drone video is overlaid with virtual layers in real time. The layers are rendered from an accurate point of view and perspective, so they align with the underlying video in a natural and seamless fashion.
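The alignment comes down to projecting each virtual element through the camera model recovered from the drone's pose. A simplified pinhole-projection sketch, assuming the marker's position has already been transformed into the camera's coordinate frame (x right, y down, z forward); the function and parameter names are illustrative.

```python
# Simplified pinhole projection of a point in camera coordinates to pixel
# coordinates. Illustrative only.
def project(point_cam: tuple[float, float, float],
            fx: float, fy: float, cx: float, cy: float) -> tuple[float, float] | None:
    x, y, z = point_cam
    if z <= 0:           # behind the camera: nothing to draw
        return None
    u = fx * x / z + cx  # pixel column
    v = fy * y / z + cy  # pixel row
    return (u, v)
```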

Data layers let pilots collaborate by adding new geo-markers in real time while flying and streaming them to other viewers.
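A hypothetical shape for such a collaborative geo-marker as it might be serialized and streamed to other viewers; all field names here are illustrative.

```python
# Illustrative geo-marker structure and serialization for streaming.
from dataclasses import dataclass, asdict
import json
import time

@dataclass
class GeoMarker:
    marker_id: str
    lat: float
    lon: float
    label: str        # e.g. "staging area", "road blocked"
    author: str       # pilot or viewer who placed the marker
    created_ts: float

def serialize(marker: GeoMarker) -> str:
    return json.dumps(asdict(marker))

marker = GeoMarker("m-001", 32.0853, 34.7818, "staging area", "pilot-1", time.time())
stream_payload = serialize(marker)  # sent to all connected viewers
```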

Cloud-connected, locally connected, and stand-alone operation are all supported, accommodating different operating scenarios and levels of network availability.
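As a sketch of how an app might fall back between these modes based on connectivity (the mode names mirror the text above; the selection logic is illustrative, not the actual First Response behavior):

```python
# Illustrative mode selection based on network availability.
from enum import Enum

class Mode(Enum):
    CLOUD_CONNECTED = "cloud"        # full back-end sync
    LOCALLY_CONNECTED = "local"      # peer sync on the local network only
    STAND_ALONE = "standalone"       # single device, no network

def choose_mode(has_internet: bool, has_local_network: bool) -> Mode:
    if has_internet:
        return Mode.CLOUD_CONNECTED
    if has_local_network:
        return Mode.LOCALLY_CONNECTED
    return Mode.STAND_ALONE
```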

Thanks to its layered structure, abstract interfaces, and use of an off-the-shelf application engine, the Edgybees platform allowed First Response to be realized in a matter of months – not years.

Case study: Automotive

Edgybees technology was used to develop a backseat automotive game.

The platform fuses the car's camera video streams with geo-information from the car's sensors, creating a game world that responds to the car's actual environment and movement.
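A minimal sketch of the fusion idea, assuming the car exposes speed and heading: the game-world camera is advanced by the same motion the car reports, so virtual objects stay registered with the camera video. The function and its parameters are hypothetical.

```python
# Illustrative only: advance the game-world camera by the motion reported
# by the car's sensors (speed and heading).
import math

def advance_game_camera(pos: tuple[float, float], heading_deg: float,
                        speed_mps: float, dt_s: float) -> tuple[float, float]:
    """Move the game-world camera the same distance the car travelled."""
    heading = math.radians(heading_deg)
    dx = speed_mps * dt_s * math.sin(heading)
    dy = speed_mps * dt_s * math.cos(heading)
    return (pos[0] + dx, pos[1] + dy)
```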

Case study: Games

The Edgybees platform provides developers with real-time data and statistics about the drone flight. Developers can use these statistics for training, scoring, and level performance. In DronePrix AR, they drive players' scores and leaderboards.
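A hypothetical scoring function built on flight statistics of the kind the platform exposes (lap time, gates passed, collisions); the formula is illustrative, not the actual DronePrix AR scoring rule.

```python
# Illustrative lap-score calculation from flight statistics.
def lap_score(lap_time_s: float, gates_passed: int, collisions: int) -> int:
    base = gates_passed * 100
    time_bonus = max(0, int((120 - lap_time_s) * 10))  # faster laps score more
    penalty = collisions * 50
    return max(0, base + time_bonus - penalty)
```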

The Edgybees platform exposes a complete SDK for the Unity (TM) game development engine. The use of Unity allowed DronePrix AR to be completed in months – not years.