Originally posted by skeevy420 View Post
Khronos Officially Releases OpenXR 1.0
Originally posted by starshipeleven View Post
You are NOT going to evade a missile that is faster than you, has orders of magnitude lower inertia and does not even need to care about flying properly because it's moving on thrust force alone. The procedures to counter a missile lock are to deploy countermeasures (flares or chaff) and, if those fail, to eject.
Missiles cannot turn tighter than a fighter jet, simply because they fly much faster, at Mach 3-4. They also typically have a very limited burn time (rocket motors), so they have to lead their target to minimize flight time. By changing your heading relative to the missile you force it to maneuver and bleed energy. A typical tactic is to turn hard and fly at roughly a 90-degree angle to the attacking aircraft, forcing the missile to lead, followed by another hard turn a few seconds later to disengage. Air density can also be exploited by combining the turns with diving; thicker air reduces missile range considerably.
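To put some very rough numbers on the turn-radius point: for a sustained level turn the radius is roughly v² / (n·g). The figures below (a fighter at ~250 m/s pulling 9 g, a missile at ~Mach 3 pulling 40 g) are illustrative assumptions, not data for any real airframe, but they show why the slower aircraft can have the much tighter turn circle despite pulling far fewer g:

```python
# Rough turn-radius comparison for a sustained level turn: r = v^2 / (n * g).
# All numbers are illustrative ballpark figures, not data for any specific
# missile or aircraft.

G = 9.81  # m/s^2

def turn_radius(speed_ms: float, load_factor_g: float) -> float:
    """Radius (in metres) of a sustained turn at the given speed and g-load."""
    return speed_ms ** 2 / (load_factor_g * G)

fighter = turn_radius(speed_ms=250.0, load_factor_g=9.0)    # ~Mach 0.75, 9 g
missile = turn_radius(speed_ms=1000.0, load_factor_g=40.0)  # ~Mach 3, 40 g

print(f"fighter: ~{fighter:.0f} m")   # ~710 m
print(f"missile: ~{missile:.0f} m")   # ~2550 m
```

With those assumptions the missile's turn circle comes out around 2.5 km versus roughly 700 m for the fighter, which is why the late hard turn forces it to bleed so much energy.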
The actual issue with missiles is to know that one was launched at you.
Originally posted by coder View Post
Because that's how you seem to treat it - like you're Richard Dawkins, here to show us the light and cure us of it.
Lying about what? What problem are you solving?
Originally posted by coder View Post
You don't know what you're talking about. Hololens was self-contained
Self-contained ones have glitches https://www.theverge.com/2016/4/1/11...adset-hands-on
One of us browsed ESPN, the other checked out The Verge, and both of us stood there, staring at the same wall, looking at different virtual web pages. Irritatingly, when you move your head just a little bit, the browser window in front of you moves with it. Score one for virtual reality headsets.
and are otherwise immature https://www.geekwire.com/2019/holole...ality-headset/
The demos I got to see were pretty basic, as the device is not yet a finished product. I was able to read GeekWire on a web browser pinned to the wall. I inspected a windmill and turbine, both of which I could rotate, re-size and pin to me, so that wherever I walked around the room the image would follow me.
Plus there was a hands-on where, as I remember, reviewers complained about a bridge hologram glitching through people and other complex objects.
Also note the very basic environments, with straight and clean lines, in every demo of it you can find. In a more natural (and cluttered) environment it would glitch.
Today, you can download and run AR apps on regular iPhones and many Android phones.
Guess what? It didn't even ship in 2015 (nor was it planned to), so pre-release software issues are understandable. But the hardware of the original Hololens was finalized back then.
The current HoloLens 2 uses a Snapdragon 850, which is better but still inadequate.
To actually have a decent shot at working reliably in a realistic, non-controlled environment I expect NVIDIA Jetson boards or similar car-grade hardware as a bare minimum; a high-end mobile SoC won't cut it.
That was my point - you seemed to imply there was a gap in hardware capabilities needed for AR,
As it is, it's little more than a toy with limited capabilities. Which is kind of bad for AR that is supposed to be used out in the field, not in a controlled environment.
By what authority?
Originally posted by coder View Post
The benefit that AR can provide is contextual awareness. It can provide you with relevant and important information for your specific context.
AR is just a display technology. Developing it now for those uses is decades too soon.
Originally posted by coder View Post
Again, using the patrol example, it could also highlight subtle changes that occurred since previous patrols (possibly even by different soldiers), so you can see where potential IEDs might be hidden or where foes might be lying in ambush.
And imagine being able to see where an incoming airstrike or artillery is going to hit, so you can see if it's on-target or if you're too close & need to take cover.
Imagine being able to see the location of "friendlies", so you can better coordinate with them, as well as avoiding friendly-fire.
Plus, with soldiers kitted out with multiple cameras and full 3D tracking, they can relay the data back to a command post, so their commander can see exactly where they are, what they see, and what's around them. All in an integrated, multi-asset battlefield display, that he might even be viewing in VR to see in 3D.
At best this can be sent to analysts for post-processing; otherwise you need either a large team of people watching different angles of each person's 3D feed, or some serious AI to digest it and provide useful information in real time.
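To gauge the scale of that data problem, here is a back-of-the-envelope estimate; every parameter (squad size, cameras per soldier, resolution, frame rate, compression ratio) is an assumption chosen purely for illustration:

```python
# Back-of-the-envelope uplink estimate for streaming multi-camera feeds from
# a squad back to a command post. All parameters are illustrative assumptions,
# not real-world requirements.

soldiers = 8
cameras_per_soldier = 4
width, height, fps = 1280, 720, 30
bits_per_pixel_raw = 12          # e.g. YUV 4:2:0 before compression
compression_ratio = 100          # optimistic H.264/H.265-class compression

raw_bps = soldiers * cameras_per_soldier * width * height * fps * bits_per_pixel_raw
compressed_bps = raw_bps / compression_ratio

print(f"raw:        {raw_bps / 1e9:.1f} Gbit/s")   # ~10.6 Gbit/s
print(f"compressed: {compressed_bps / 1e6:.0f} Mbit/s")  # ~106 Mbit/s
```

Even heavily compressed, a single squad ends up producing on the order of 100 Mbit/s of video that either people or software has to watch and interpret.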
Originally posted by log0 View Post
Funny that you are talking about minimal latency. The signal round-trip time to your remotely operated fighter jet will be what, 200-500 milliseconds? Then there would also be the issue of jamming or spoofing. (Semi)autonomous unmanned vehicles are where the future is heading.
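For reference on where a number like that comes from: if the control link is relayed through a geostationary satellite, the speed of light alone puts a floor of several hundred milliseconds under the round trip, before any processing or queueing delay. A quick sketch, assuming a GEO relay (the relay path itself is an assumption for illustration):

```python
# Light-speed floor on the control-loop round trip via a geostationary relay.
# Assumes ground -> GEO satellite -> aircraft and back; real links add
# processing, queueing and coding delay on top of this.

C = 299_792_458              # speed of light, m/s
GEO_ALTITUDE_M = 35_786_000  # altitude of a geostationary orbit

one_way_m = 2 * GEO_ALTITUDE_M          # ground -> satellite -> aircraft
round_trip_s = 2 * one_way_m / C        # and back again

print(f"~{round_trip_s * 1000:.0f} ms minimum round trip")  # ~477 ms
```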
Originally posted by starshipeleven View Post
Lying about what it can actually do.
Originally posted by starshipeleven View Post
And I'm shutting down lies so you can focus on something that could make it really become real one day.
Please leave us alone.
Originally posted by starshipeleven View Post
The prototype with the best capabilities, used in the first preview, was the one tethered to a PC. Later ones were more limited, but portable.
Originally posted by starshipeleven View Post
Self-contained ones have glitches https://www.theverge.com/2016/4/1/11...adset-hands-on
You're deliberately trying to derail the conversation into a pointless digression. Classic trolling.
Originally posted by starshipeleven View Post
Did I cite Pokemon Go? Yes I did. I know. That's basic stuff though.
I don't mean to over-sell what these phones can do. IMO, they're still lacking in the sensors department, but the processing power is there to find planes, perform occlusion, estimate and match lighting, and store persistent anchor points.
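"Find planes" is doing a lot of work in that sentence, so here is a minimal sketch of the class of computation involved: a RANSAC plane fit over a synthetic point cloud. It is only meant to illustrate the per-frame workload; it is not how ARCore or ARKit are actually implemented (they fuse camera and IMU data and do far more).

```python
# Minimal sketch of plane-finding: RANSAC over a synthetic 3D point cloud.
# Illustrative only; not the actual ARCore/ARKit algorithm.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic cloud: a noisy floor plane (z ~ 0) plus random clutter.
floor = np.column_stack([rng.uniform(-2, 2, 500),
                         rng.uniform(-2, 2, 500),
                         rng.normal(0, 0.01, 500)])
clutter = rng.uniform(-2, 2, (200, 3))
points = np.vstack([floor, clutter])

def ransac_plane(pts, iters=200, threshold=0.02):
    """Return (normal, d, inlier_mask) of the best plane n.x + d = 0 found."""
    best_inliers = np.zeros(len(pts), dtype=bool)
    best_plane = None
    for _ in range(iters):
        sample = pts[rng.choice(len(pts), 3, replace=False)]
        normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(normal)
        if norm < 1e-9:          # degenerate (collinear) sample, skip it
            continue
        normal /= norm
        d = -normal.dot(sample[0])
        inliers = np.abs(pts @ normal + d) < threshold
        if inliers.sum() > best_inliers.sum():
            best_inliers, best_plane = inliers, (normal, d)
    return best_plane[0], best_plane[1], best_inliers

normal, d, inliers = ransac_plane(points)
print(f"plane normal ~ {np.round(normal, 2)}, inliers: {inliers.sum()}/{len(points)}")
```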
If you want to see just what it's capable of, here's a video from a year ago:
The Lego AR stuff is particularly cool. I'll try to find a better video of that.
Originally posted by starshipeleven View Post
To actually have a decent shot at working reliably in a realistic non-controlled environment I expect NVIDIA Jetson boards
Second, the distinction is meaningless. Jetson is pretty old, by this point, and equaled or surpassed by modern, high-end SoCs.
Originally posted by starshipeleven View Post
To have a system that does not make mistakes all over the place in a cluttered environment (which is a requirement for its use by military, rescue and all the other use cases you advocate) you need more processing power than is currently employed.
Originally posted by starshipeleven View Post
As it is, it's little more than a toy with limited capabilities.
Originally posted by starshipeleven View Post
Still stuck in a religious mindset?
Anyway, as a bonus, here's a Hololens 2 demo, mostly showcasing the precision of its hand-tracking, as people interact with a virtual keyboard and virtual controls.