
I finally tried a 3D video call, and it didn't completely blow me away

After Android XR smart glasses, I was very excited to try Google Beam, the shrunken-down, commercialized version of Project Starline, the 3D video calling booth Google has been plugging away at for the past few years. Seemingly everyone who has tried Project Starline has told me how shocking it is to have a video call with someone on what is essentially a glasses-free 3D TV and feel like they're actually sitting in front of you. I finally got the chance to try the technology at Google I/O 2025, and it made a big impression on me, but it was far from a perfect copy of the person I was talking to.

Let me be clear so there's no confusion: the way Google can capture a person from a bunch of 2D video feeds and then render them in 3D using a custom AI neural network is nothing short of wizardry. The 3D person inside the screen really does feel like they're sitting across the table from you. In my demo, which actually used an older Project Starline setup rather than the more compact unit HP is making, a friendly guy named Jerome said he was beaming in from Seattle, Washington, to my screen in Mountain View, California. He reached toward me with an apple in his hand, and I instinctively tried to grab it. A few beats later, when he told me the demo was over, we went for a high five and I did it again. The whole time, during our one-to-two-minute call, we made eye contact and smiled as if we were together IRL. It all felt very… normal.

The limitations of the current version of the 3D video calling technology were immediately obvious once I sat down in front of the TV "booth." When Jerome appeared on screen, I could see his 3D rendering jitter slightly. As he moved around, I noticed faint horizontal artifacts throughout; the closest comparison I can make is slightly jittery TV scan lines, but it's something I spotted right away and then couldn't stop fixating on.

Another limitation is camera tracking and viewing angles: it really only looks right from dead center. Whenever I shifted my chair to the left or right, Jerome's image became dark and distorted. Even at 8K resolution, the light field display still looked grainy. I also noticed that if you try to "look around" the other person's body, there's nothing there. Just… empty, particle-like space. That makes sense, because the Beam/Starline cameras capture only a person's front and sides, not their back. If you've ever seen the back of a portrait mode photo (see below), you'll know there's no captured data there.

I also wonder how well Beam holds up in less-than-optimal lighting. The lighting in my demo room was very good, and I suspect the image quality degrades considerably in dim lighting, with some really obvious image noise.

I should also note that my chat with Jerome was actually my second demo. My first was with a guy named Ryan. That experience was just as brief, but Starline crashed, his image froze, and I had to be moved over to Jerome. Prototypes! Of course, Zoom calls can freeze too, but you know what doesn't freeze? Talking in person.

Since the units I tried were Project Starline units, with the camera and speaker modules attached to the sides of the screen rather than built in, there's no way to know yet whether Google Beam will be a more polished product.

I had really hoped to have my mind blown like everyone else, but precisely because it felt so natural, the whole experience didn't wow me. I get wowed when a new technology feels spectacular. Maybe that's a blessing in disguise: the lack of a shock factor (at least for me) means the Beam/Starline technology has (mostly) done its job of getting out of the way of real communication.


