Do you think those apps that let you take a selfie with a fake beard do a good job? I think not: even when the app positions the beard correctly, it just doesn’t look “real”. A patent filed by Apple explains why.
Augmented Reality works by embedding virtual objects into a live camera feed. To feel credible, the virtual objects must look and act like real objects.
In today’s AR apps, they do not. One reason is that these objects are not transformed to match their context. The context, in this case, is the camera feed.
You see, cameras are imperfect — the video feed they capture has some noise in it. This noise originates from imperfections in the camera sensor and from the lighting conditions.
The virtual objects, by contrast, have no noise in them; they look crystal clear. Nor do they reflect the lighting conditions of the video feed. This is one of the main reasons why virtual objects seem to “float” on top of the video feed instead of being a part of it.
The patent shows this effect in one of its drawings. Looking at the drawing, you can immediately tell which object is real and which is fake. Our eyes are very good at picking up these subtle cues.
To overcome this problem, Apple has suggested a method of transforming the virtual object to match the video feed. It essentially consists of identifying the level and type of noise in the video feed and applying similar noise to the virtual object.
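To make the idea concrete, here is a minimal sketch of that two-step approach in Python with NumPy. This is not Apple’s actual implementation — the patent does not disclose code — and the noise estimator here is a deliberate simplification: it treats adjacent-pixel differences as mostly noise and uses their median absolute deviation as a robust proxy for the noise level, then adds matched Gaussian noise to the clean render.

```python
import numpy as np

def estimate_noise_std(frame):
    """Roughly estimate the noise level (std dev) of a grayscale frame.

    Differences between horizontally adjacent pixels cancel most of the
    scene structure, leaving mostly noise. For Gaussian noise, the median
    absolute deviation (MAD) of those differences is about
    0.6745 * sqrt(2) * sigma, so we invert that relation.
    """
    diffs = np.diff(frame.astype(np.float64), axis=1)
    mad = np.median(np.abs(diffs))
    return mad / (0.6745 * np.sqrt(2))

def match_noise(render, frame, rng=None):
    """Add Gaussian noise to a clean render, matched to the camera feed."""
    rng = rng or np.random.default_rng()
    sigma = estimate_noise_std(frame)
    noisy = render.astype(np.float64) + rng.normal(0.0, sigma, render.shape)
    return np.clip(noisy, 0, 255).astype(np.uint8)
```

A real pipeline would estimate noise per channel and per intensity level (sensor noise is signal-dependent), and would also account for compression artifacts, but even this crude matching makes a composited object blend noticeably better.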
This enhancement will improve how credible the virtual object looks — at least that is what another drawing from the patent suggests:
I think this enhancement is a great example of the innovation that characterises Apple: a focus on the nuances of user experience. Next time you find something that doesn’t “feel” right, identify the nuance!
Liked this article? Subscribe to our newsletter to get more such stuff delivered to you.