By Andrew Clark

A new approach is making it easier to create lifelike 3D visualizations of environments from everyday photos already shared online, opening new possibilities in industries such as gaming, virtual tourism and cultural preservation.

Hadar Averbuch-Elor, assistant professor at Cornell Tech, is part of the research team behind “WildCAT3D,” a new framework that significantly expands the possibilities of novel view synthesis (NVS), a technique that generates realistic new views of a scene from a single existing photo.

The work, presented Dec. 4 at the Conference on Neural Information Processing Systems (NeurIPS), addresses a key limitation of current 3D image-generation technology: Most systems can only learn from small, carefully curated datasets that look nothing like the messy, inconsistent images people actually take and share online.

Read more in the Cornell Chronicle.

Andrew Clark is a freelance writer for Cornell Tech.