But the simulator is designed to help you test your app, just like iOS apps that require touch/gestures/low battery/etc. can be tested on a macOS machine. If I were Apple, I wouldn’t see them building tools that run on Quest in order to test your Vision app. What would be the point? The effort seems like an opportunity cost better spent on other things.
You think testing VR apps in a simulator on a laptop is more indicative of the final experience than another VR headset that almost has feature parity?
I imagine you'd probably still want to use the simulator to ensure your code runs on actual Apple hardware, but for verifying real UX/behaviour I'd take the Quest Pro over it any day.
I think Apple will disagree with that “almost has feature parity” claim.
I have not used or even seen either product, so I wouldn’t know whether they’re right, but I wouldn’t rule it out, either. For example, the video resolution on Apple’s product is so much higher that it may cross a threshold with respect to user experience.
Also, if they did come out with a Quest-based simulator, I think it would be very bad for their marketing.
I would expect ‘the internet’ to say “it’s a Quest, but with a much higher price tag”. How would they go from there to selling these devices?
> Neither does a Mac, yet it's the only device that's allowed to develop for the headset.
I'm willing to bet that you can develop a prototype for an app in Unreal Engine right now, and I'm willing to bet that Unreal Engine will be ported to Vision OS and that you can get that code running on there pretty quickly after release.
I don't see how this is very different from developing Windows or Xbox applications. I might be able to develop some core code with .NET Core, or even a full game using cross-platform tools like Unreal or Unity, but if I'm actually shipping a product I can't expect to get far without using Microsoft's officially supported toolchain.
I'd say it's reasonable to be annoyed that you're not allowed to run macOS on non-Apple hardware. But it's not reasonable to be annoyed that Apple isn't spending millions on officially supporting an SDK for their devices on other OSes, just for a very small niche of users.
"RUMOR: Why Meta removed the Depth Sensor at the last minute
It allowed you to see people without clothes. It was not it's purpose, but during testing: someone noticed that by using the sensor someone could develop “creeper apps”
Ah yeah, you’re right. Apologies for the mixup. However, that link isn’t very good at explaining it either, since it doesn’t describe Apple’s TrueDepth tech: “depth sensor” is a generic term covering many technologies, whereas TrueDepth is a specific one.
Why would that be the right thing?
Quest Pro doesn’t have the same capabilities as Vision.