
What the iPhone 15 Pro's Spatial Video Feature Means for the Vision Pro


Speaker 1: Apple’s biggest new product in years, the Vision Pro, a spatial computing device for your head, isn’t coming out until early 2024. But at Apple’s iPhone and watch event, I was curious if there were going to be any hints or new bits of information about that product here. And it turns out yes, there were: one small one, and one that seemed kind of like a hint. So here’s what happened. First of all, the iPhone 15 Pro is going to have something that’s related to Vision Pro. They announced that the [00:00:30] cameras on the 15 Pro are going to be able to record spatial videos that will be viewable on the Vision Pro next year. So what is a spatial video? Basically, it’s a 3D video clip, and this type of technology has existed before in a lot of ways. Google experimented with VR180, and they even made separate cameras for it.

Speaker 1: I remember reviewing that; you could post those videos to YouTube and also see them in VR. Now, I’m not sure how Apple’s version is going to work, but I did see a clip of a spatial video [00:01:00] during my Vision Pro demo back at WWDC, and they look really nice, very immersive and 3D, kind of like these ghostly snippets of memories. So Apple’s original pitch was that you were going to wear the Vision Pro on your head to record these special moments in your family’s life and then be able to play them back later on the headset. I don’t want to wear a Vision Pro on my head at my kid’s birthday party, but the idea of using a phone to do it, that’s a lot more normal. That’s what you would do anyhow. What we don’t know is how that’s actually going to work.

Speaker 1: Now, is this going [00:01:30] to be a separate video format that you toggle on the iPhone 15 Pro to record spatial video as opposed to regular video? If that’s the case, then I’m not sure which one I would toggle. I mean, I would usually default to the one I want to share with everyone else. It’s going to be frustrating to think about recording a video clip that is not compatible with the Vision Pro, or one that is only viewable on the Vision Pro and that I can’t share with my family, so I’m really curious how Apple works that out. I mean, they already have Live Photos, and they have a lot of photos with depth-sensing information in them that are compatible [00:02:00] with a lot of other apps already. So I guess we’ll see. I think it’s going to be a way for them to test out how that relationship works.

Speaker 1: And then once the Vision Pro launches, at some point that technology is going to trickle down to the other phones too. Hopefully it’ll feel nice and integrated and like an optional thing. So that was the big one. A small one that I noticed involved the Apple Watch. Now, the Apple Watch is not Vision Pro compatible; however, there was one feature they announced that really made me pay attention: double tap. Now, this feature, which allows you to tap to open up [00:02:30] things or do little actions on your watch, actually already existed as an accessibility feature. You can turn it on right now on your Apple Watch, but on the new Apple Watch it’s supposed to work better, more reliably and without draining battery life, and it’s being launched as an everyday feature. And what’s interesting is that tap looks a lot like the type of taps that I was doing with the gestural interface on the Vision Pro. Now, at some point the watch is going to work with the Vision Pro.
I’m just going to say that, because all the plans for AR and VR that I’ve seen [00:03:00] talk about watches being an essential part. Meta is looking at neural input wristbands and watches as a way of connecting with future AR glasses. I mean, once these things get smaller and more mobile, it makes sense that you’d have something like this on your wrist, especially with haptics to give you a sense of feedback. So right now you’re just looking at taps on an Apple Watch, but what’s going to happen next? I mean, we’re supposed to have some sort of new Apple Watch next year, this Apple Watch X according to reports, and that could be a whole new redesign. Is that the moment they introduce [00:03:30] some way for it to interact with the Vision Pro? I mean, it seems likely, but I’m not going to skip ahead too much. I do think that Apple’s going to try to lay out a little more gestural interface familiarity across its products.

Speaker 1: I mean, how are you going to get used to using the Vision Pro unless you start having some experiences that feel like, oh, I’ve done that before? It may start with the Apple Watch, and it may start bleeding over into some other places. Watch that space, so to speak, because I feel like there are going to be some interesting things that happen. So we still don’t know the specific date for the Vision Pro [00:04:00], and I didn’t get a chance to demo it again, but there were definitely hints and talk about it, starting with spatial memories on the iPhone 15 Pro. So anyway, those are some of my thoughts from here at the Apple event in Cupertino. If you have any questions or comments, or things you’re curious about with mixed reality or the Vision Pro, let me know in the comments, and make sure to like and subscribe. Thanks.