Just to bring a nicer voice into this incel mess...
I've been waiting for the OpenXR 1.0 release for a long time now, and I'm pleased that it's finally here. I'll definitely start work on supporting VR via OpenXR in my engine and begin building applications with it.
I own a Vive and I love it. It looks very dev'y with all of those Lighthouse sensors pointing everywhere, works well with glasses, and has a decent screen. It's good for gaming, exercise (Beat Saber gets you really sweaty) and media consumption (watching movies in Bigscreen with a friend is really great).
Khronos Officially Releases OpenXR 1.0
Originally posted by starshipeleven: I'm (also) saying that to get that level of contextual awareness you and others cite, you would need stupendous or even outright magic levels of sensor equipment and data-processing working in the background, which I don't see coming in the near future (decades).
Originally posted by starshipeleven: I don't want to know the amount of false positives this system gives. In a natural environment there are tons of "subtle changes" every day due to random events and animals.
Originally posted by starshipeleven: This is a very dangerous double-edged sword. You need to be relaying information about airstrikes and friendly unit positions all over the damn place.
Originally posted by starshipeleven: Their commander (for any number of people, even a single squad) cannot handle so much data alone.
Originally posted by starshipeleven: This can be at best sent to analysts for post-processing, a large team of people looking at different angles of each person's 3D feed in realtime, or you need some serious AI program to deal with it and provide useful info in real time.
Originally posted by starshipeleven: The prototype with the best capabilities, used in the first preview, was the one tethered to a PC. Later ones were more limited, but portable.
Self-contained ones have glitches https://www.theverge.com/2016/4/1/11...adset-hands-on
You're deliberately trying to derail the conversation into a pointless digression. Classic trolling.
Originally posted by starshipeleven: Did I cite Pokemon Go? Yes I did. I know. That's basic stuff though.
I don't mean to oversell what these phones can do. IMO, they're still lacking in the sensors department, but the processing power is there to find planes, perform occlusion, estimate and match lighting, and store persistent anchor points.
If you want to see just what it's capable of, here's a video from a year ago:
The Lego AR stuff is particularly cool. I'll try to find a better video of that.
Originally posted by starshipeleven: To actually have a decent shot at working reliably in a realistic non-controlled environment I expect NVIDIA Jetson boards
Second, the distinction is meaningless. Jetson is pretty old by this point, and equaled or surpassed by modern high-end SoCs.
Originally posted by starshipeleven: To have a system that does not make mistakes all over the place in a cluttered environment (which is a requirement for its use by military, rescue and all the other use cases you advocate) you need more processing power than is currently employed.
Originally posted by starshipeleven: As it is, it's little more than a toy with limited capabilities.
Originally posted by starshipeleven: Still stuck in a religious mindset?
Anyway, as a bonus, here's a Hololens 2 demo, mostly showcasing the precision of its hand-tracking, as people interact with a virtual keyboard and virtual controls.
Last edited by coder; 01 August 2019, 12:44 AM.
Originally posted by starshipeleven: Lying about what it can actually do.
Originally posted by starshipeleven: And I'm shutting down lies so you can focus on something that could make it really become real one day.
Please leave us alone.
Originally posted by log0: Funny that you are talking about minimal latency. The signal roundtrip time to your remote-operated fighter jet will be what, 200-500 milliseconds? Then there would also be the issue of jamming or spoofing. (Semi-)autonomous unmanned vehicles are where the future is heading.
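The round-trip figure in the quote above is in the right ballpark for a satellite link; here's a minimal back-of-envelope sketch in Python, assuming a single geostationary relay hop each way and counting only light-speed propagation (no processing, queuing, or ground-network delay):

```python
# Back-of-envelope control latency for a remotely operated aircraft,
# assuming the command uplink and the video downlink each bounce off
# one geostationary satellite (altitude ~35,786 km).
C_KM_PER_S = 299_792.458   # speed of light in vacuum, km/s
GEO_ALT_KM = 35_786        # geostationary orbit altitude, km

# One-way trip: ground station -> satellite -> aircraft
one_way_s = 2 * GEO_ALT_KM / C_KM_PER_S

# The operator only sees the result after the video returns the same way
round_trip_ms = 2 * one_way_s * 1000
print(f"{round_trip_ms:.0f} ms")  # ~477 ms, before any processing delay
```

Pure propagation through a geostationary relay already lands near the upper end of the quoted 200-500 ms range, so any processing or network overhead only pushes it higher.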
Originally posted by coder: The benefit that AR can provide is contextual awareness. It can provide you with relevant and important information for your specific context.
AR is just a display technology. Developing it now for those uses is decades too soon.
Again, using the patrol example, it could also highlight subtle changes that occurred since previous patrols (possibly even by different soldiers), so you can see where potential IEDs might be hidden or where foes might be lying in ambush.
And imagine being able to see where an incoming airstrike or artillery is going to hit, so you can see if it's on-target or if you're too close & need to take cover.
Imagine being able to see the location of "friendlies", so you can better coordinate with them, as well as avoiding friendly-fire.
Plus, with soldiers kitted out with multiple cameras and full 3D tracking, they can relay the data back to a command post, so their commander can see exactly where they are, what they see, and what's around them. All in an integrated, multi-asset battlefield display that he might even be viewing in VR, to see it in 3D.
This can be at best sent to analysts for post-processing, a large team of people looking at different angles of each person's 3D feed in realtime, or you need some serious AI program to deal with it and provide useful info in real time.
Last edited by starshipeleven; 31 July 2019, 06:56 AM.
Originally posted by coder: Here's the current Wikipedia definition:
Originally posted by coder: You don't know what you're talking about. Hololens was self-contained
Self-contained ones have glitches https://www.theverge.com/2016/4/1/11...adset-hands-on
One of us browsed ESPN, the other checked out The Verge, and both of us stood there, staring at the same wall, looking at different virtual web pages. Irritatingly, when you move your head just a little bit, the browser window in front of you moves with it. Score one for virtual reality headsets.
and are otherwise immature https://www.geekwire.com/2019/holole...ality-headset/
The demos I got to see were pretty basic, as the device is not yet a finished product. I was able to read GeekWire on a web browser pinned to the wall. I inspected a windmill and turbine, both of which I could rotate, re-size and pin to me, so that wherever I walked around the room the image would follow me.
Plus there was a hands-on where I remember reviewers were complaining about a bridge hologram that was glitching through people and other complex objects.
Also note the very basic environment, with straight, clean lines, in every demo of it you can find. In a more natural (and cluttered) environment it would glitch.
Today, you can download and run AR apps on regular iPhones and many Android phones.
Guess what? It didn't even ship in 2015 (nor was it planned to), so pre-release software issues are understandable. But the hardware of the original Hololens was finalized back then.
Current Hololens 2 is using a Snapdragon 850, which is better but still inadequate.
To actually have a decent shot at working reliably in a realistic non-controlled environment I expect NVIDIA Jetson boards or similar car-grade hardware as a bare minimum, a high-end mobile SoC won't cut it.
That was my point - you seemed to imply there was a gap in hardware capabilities needed for AR,
As it is, it's little more than a toy with limited capabilities. Which is kind of bad for AR that is supposed to be used out in the field, not in a controlled environment.
By what authority?
Originally posted by coder: Because that's how you seem to treat it - like you're Richard Dawkins, here to show us the light and cure us of it.
Lying about what? What problem are you solving?