GEOSPATIAL IMAGE SURFACING AND SELECTION

Invented by David Meisenholder, Celia Mourkogiannis, and Donald Giovannini
Imagine seeing hidden moments from the past, right where you are standing, just by looking through your smart glasses or phone. This is the promise of a new patent application that introduces a way for mobile devices—especially smart eyewear—to display images over real-world scenes based on your exact location and the direction you are facing. In this article, we will break down the journey that led to this innovation, the science and technology behind it, and what sets this invention apart from what came before.
Background and Market Context
People use mobile devices every day to take photos, record videos, and share moments instantly. Smartphones and wearable devices, like smart glasses, are now common tools for capturing and consuming digital content. As the world becomes more connected, users expect more from their devices: they want quick access to memories, easy ways to share experiences, and new ways to interact with their surroundings.
One emerging trend is augmented reality (AR), which overlays digital objects onto the real world. Apps like Pokémon Go and Google Maps’ AR walking directions have shown how fun and useful AR can be. But so far, AR has mostly focused on adding new, computer-generated graphics or information on top of what you see. What if, instead, you could see real photos or videos taken by other people at your exact location, right in front of your eyes?
This idea has big appeal for many groups. Tourists could view old photos of a landmark from the same spot they’re standing. Friends could see each other’s pictures and videos from a concert, park, or sports event, just by looking in a certain direction. Real estate agents could show clients what a room looked like before a renovation. The possibilities are endless, and the demand is growing fast. As devices get smarter, more wearable, and always connected, the need for richer, place-based experiences is only increasing.
But there are challenges. Today, finding old photos or videos from a specific place is hard. You have to scroll through endless galleries or rely on social media tags and check-ins, which are often missing or inaccurate. The process is clunky and breaks the sense of being in the moment. There are also challenges around privacy, battery life, and keeping the experience smooth rather than overwhelming.
In this context, the new patent application steps in with a clear vision: let users see and interact with past images and videos, exactly where they were captured, directly through their mobile device—especially smart eyewear. This is not just a new way to use gadgets; it’s a new way to experience the world.
Scientific Rationale and Prior Art
To understand how this invention works, it helps to look at the science and technology that make it possible, and what has been tried before.
First, the technical building blocks. Modern mobile devices are packed with sensors: cameras for capturing images, GPS and other location sensors for knowing where you are, compasses and gyroscopes for detecting which way you’re facing, and screens for displaying information. Smart eyewear adds see-through displays and often includes eye-tracking, touch input, and even microphones. These sensors can work together to create a powerful real-time map of where the device is and what the user is looking at.
Augmented reality uses these sensors to mix digital things with the real world. Some apps let users see floating labels, arrows, or 3D models placed on top of what they see. Others, like Snapchat, use facial recognition to put dog ears or funny hats on people’s faces. These are fun, but they mostly use computer-generated graphics, not real photos or videos from the past.
Some earlier inventions tried to mix digital content with the real world based on location. For example, there are patents for showing reviews, ads, or information about nearby stores or landmarks when you point your phone at them. Others let users leave digital “notes” or “graffiti” at certain locations for others to find. A few apps let you see the weather, traffic, or historical facts about a place as you walk by.
However, these systems have limits. Most show text or simple graphics, not real photos or videos. Few let you overlay actual images taken from the same spot and direction you are facing. Rarely do they let you interact with these images in a fluid, natural way—like picking an image just by looking at it or swiping your finger. And almost none are built for smart eyewear, which is the most natural way to blend the digital and real worlds.
Prior art also struggles with privacy and data management. Pulling up random photos from the cloud can be slow or drain your battery. Matching images to precise locations and directions is tricky, especially indoors or in crowded places. And showing too much information can clutter the view and confuse users. Most importantly, existing solutions don’t give users a simple, smooth way to find, explore, and interact with images from the past, right where they stand.
This is where the new patent application stands out. It brings together many pieces—location tracking, direction sensing, image capture, server storage, and smart display overlays—into a single, easy-to-use system. And it does this with privacy and user control in mind, making sure users can pick which images to see, how to interact with them, and when to turn the feature off.
Invention Description and Key Innovations
This patent application describes a mobile device, such as smart glasses or a phone, that can show images or videos from the past, overlaid on the real world based on where the user is and where they are looking. Here is how the invention works, step by step, and what makes it special.
At the heart of the system is a mobile device—often smart eyewear—that includes:
- A see-through display, so users can view the real world with digital images laid on top.
- Sensors to track the device’s location (like GPS) and orientation (which way it’s pointing).
- A processor and memory to run smart software.
- Connectivity to a server that stores millions of photos and videos, each tagged with where and when they were taken.
When a user looks at a scene, the device figures out its current position and direction. It asks the server for any images or videos that were taken nearby, especially those that match the direction the user is facing. The server sends back these images, and the device creates small icons or thumbnails, placing them as overlays in the user’s view. These icons appear right where the original photos or videos were taken, so it’s like seeing hidden windows into the past.
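The retrieval step described above can be sketched as a simple geospatial filter: keep only the stored photos that are close enough to the user and whose bearing falls inside the direction the user is facing. The sketch below is illustrative only, not the patent's actual implementation; the `Photo` dictionary shape, the 100-meter radius, and the 30-degree half-field-of-view are all assumptions for the example.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6_371_000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def bearing_deg(lat1, lon1, lat2, lon2):
    """Compass bearing in degrees from point 1 to point 2 (0 = north)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    return (math.degrees(math.atan2(y, x)) + 360) % 360

def nearby_photos(photos, lat, lon, heading, radius_m=100, half_fov_deg=30):
    """Keep photos taken within radius_m whose direction from the user
    lies inside the user's field of view around `heading`."""
    hits = []
    for p in photos:
        if haversine_m(lat, lon, p["lat"], p["lon"]) > radius_m:
            continue  # too far away to surface an icon
        diff = abs((bearing_deg(lat, lon, p["lat"], p["lon"]) - heading + 180) % 360 - 180)
        if diff <= half_fov_deg:
            hits.append(p)  # in front of the user: show a thumbnail here
    return hits
```

In a real system the coarse distance filter would run server-side against a spatial index, with only the field-of-view check done on the device as the user turns their head.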
Interaction is simple and natural. If the device is eyewear with eye-tracking, the user can move a small cursor just by looking at an icon. Staring at an icon for a short time selects it, and the full image or video appears, floating in the scene as if it were really there. Users can move to different icons, cycle through older images, or close the overlay with a gesture, blink, or button press. On phones or tablets, users can tap on icons or swipe through images on the touchscreen.
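The "stare to select" behavior is a classic dwell-time pattern. Here is a minimal sketch of how a device might implement it; the 0.8-second threshold and the idea of identifying icons by an id string are illustrative assumptions, not details taken from the patent application.

```python
import time

class DwellSelector:
    """Select the icon the user has gazed at continuously for dwell_s seconds."""

    def __init__(self, dwell_s=0.8, clock=time.monotonic):
        self.dwell_s = dwell_s
        self.clock = clock
        self._target = None   # icon id currently under the gaze cursor
        self._since = None    # when the gaze first landed on it

    def update(self, icon_id):
        """Feed the icon under the gaze cursor (or None if gazing at empty
        space); return the selected icon id once dwell completes, else None."""
        now = self.clock()
        if icon_id != self._target:
            # Gaze moved: restart the dwell timer on the new target.
            self._target, self._since = icon_id, now
            return None
        if icon_id is not None and now - self._since >= self.dwell_s:
            self._since = now  # avoid immediately re-selecting the same icon
            return icon_id
        return None
```

The same state machine works for any pointer source: on a phone, a tap event would simply bypass the dwell timer and select the icon directly.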
This system is very flexible. It can work on see-through eyewear, regular phones, or tablets. It supports different ways to interact: gaze, touch, gestures, or even voice. The images can be grouped by place, time, or even type of event—so users can see the latest concert at a stadium, or all the photos ever taken at a famous landmark. The device can filter images by distance or field of view, so users don’t get overwhelmed with too much information.
Some unique features set this invention apart:
- Real photo and video overlays: Instead of just adding graphics, users see real moments from the past, placed in the real world, right where they happened.
- Smart filtering: The system picks which images to show based on where you are, which way you’re looking, and what’s in your view.
- Natural interaction: Users can select, cycle through, or close images just by looking, blinking, or tapping—no need for complex menus.
- Server-based content: Images and videos from many users are stored in a central place, making it easy to access a huge library of content from anywhere.
- Privacy and control: Users decide what content to see and can turn the feature on or off at any time.
- Works on many devices: The invention can be used on smart glasses, phones, tablets, or any device with a camera and display.
On the technical side, the invention includes clever ways to match images to the user’s exact view. It uses sensors to figure out not just location, but also which way the device is pointing and what’s in the field of view. It can recognize landmarks or objects to place icons more accurately. The overlays are designed to be clear and not block the real world, so users always know what they’re looking at.
The system also handles content capture and sharing. When a user takes a new photo or video, the device tags it with the exact location and direction, then sends it to the server for storage. Later, when someone else stands in the same place, they can see and access that content, creating a living, shared history of every place. The device manages storage, downloads, and battery use to keep the experience smooth.
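The capture side of the flow amounts to bundling each new image with the sensor readings that will later let the server surface it to someone standing in the same spot. The record fields below are illustrative; the actual format used by the patented system is not specified at this level of detail.

```python
import json
import time

def tag_capture(image_path, lat, lon, heading_deg, clock=time.time):
    """Attach location, direction, and time metadata to a new capture,
    producing the record a content server would index for later retrieval."""
    return {
        "image": image_path,
        "lat": lat,                  # where the capture happened
        "lon": lon,
        "heading_deg": heading_deg,  # compass direction the camera faced
        "captured_at": clock(),      # when, so overlays can be ordered by time
    }

def upload_payload(record):
    """Serialize the tagged capture for transmission to the content server."""
    return json.dumps(record, sort_keys=True)
```

With a stable timestamp in every record, the "cycle through older images" behavior described earlier becomes a simple sort over the records returned for a location.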
In summary, this invention opens up a new way to experience places and memories. Instead of searching through albums or scrolling social media, users can just look around and see moments from the past, layered on top of the world as they move through it. It’s a natural, immersive, and powerful use of technology that brings together the best of sensors, smart software, and wearable displays.
Conclusion
The patent application we explored today is not just about showing pictures on a screen. It’s about changing how we connect with places, memories, and each other. By overlaying real images and videos on the real world, based on where we are and what we see, this invention brings the promise of augmented reality to life in a way that is simple, powerful, and deeply human.
For users, this means new ways to share and relive moments, discover hidden stories, and connect with the history of every place they visit. For businesses and creators, it opens up a world of possibilities for tourism, education, events, and more. And for the future of mobile devices and smart eyewear, it sets a clear path toward richer, more meaningful experiences that blend the digital and physical worlds seamlessly.
Whether you are a tech enthusiast, a developer, or just someone who loves sharing and exploring, this new approach to displaying overlay images on mobile devices is a glimpse into the next chapter of visual storytelling. As the technology matures, expect to see it change how we see the world—one moment, and one image, at a time.
To read the full patent application, visit the USPTO Patent Public Search at https://ppubs.uspto.gov/pubwebapp/ and search for publication number 20250218138.