Yeah, something is wrong. I don't know how the gaming industry will convince people to put on those headsets... I think I will never use that VR
AMD Launches Antigua (Tonga) Powered Radeon R9 380X
What about people who already wear glasses? There was a study somewhere on the net estimating that 60% of people in the world have more or less impaired vision, sad but true
I guess in 10 years all Mesa developers will wear that while bisecting regressions
But yeah, of those 60% with imperfect eyes only 22% wear corrective glasses (all the time or sometimes); the other 38% refuse to wear them but should. Maybe that is the convincing way: mannequins will do the job, as always
Last edited by dungeon; 22 November 2015, 07:54 PM.
micheal I've got an idea for how you might get better access to AMD cards: how about reviewing cards separately and including awards? Like "best proprietary Linux GPU", "best OSS driver GPU" or "Editor's Economic OSS Choice". If not AMD, maybe a manufacturer like Sapphire would at least send the ones likely to get the awards.
Originally posted by bridgman:
Um... we're not jumping around as far as I know. We are not "messing with CUDA" as you put it, we are running the HSA stack on dGPUs with a C++ compiler and providing tools that make it easy to *port* CUDA code to run on that C++ compiler. The ported code can still run through NVidia tools, so basically we're helping to move from a proprietary standard to an open standard.
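For context on what that porting path looks like, here is a minimal sketch assuming AMD's HIP tooling (the vec_add kernel is a made-up example; the hip* calls are HIP's public runtime API, which mirrors CUDA's almost name-for-name, so a port is largely a mechanical rename that can still be compiled back against NVidia's toolchain):

```cpp
// Hypothetical example: a vector-add written against HIP's portable API
// instead of CUDA's. hipcc builds it for AMD GPUs; the same source can
// also be built with NVidia's toolchain.
#include <hip/hip_runtime.h>
#include <vector>

__global__ void vec_add(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // same index math as CUDA
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    std::vector<float> a(n, 1.0f), b(n, 2.0f), c(n);
    float *da, *db, *dc;

    // cudaMalloc/cudaMemcpy become hipMalloc/hipMemcpy -- a mechanical rename
    hipMalloc((void**)&da, n * sizeof(float));
    hipMalloc((void**)&db, n * sizeof(float));
    hipMalloc((void**)&dc, n * sizeof(float));
    hipMemcpy(da, a.data(), n * sizeof(float), hipMemcpyHostToDevice);
    hipMemcpy(db, b.data(), n * sizeof(float), hipMemcpyHostToDevice);

    // kernel<<<blocks, threads>>>(...) becomes an explicit launch macro
    hipLaunchKernelGGL(vec_add, dim3(n / 256), dim3(256), 0, 0, da, db, dc, n);

    hipMemcpy(c.data(), dc, n * sizeof(float), hipMemcpyDeviceToHost);
    hipFree(da); hipFree(db); hipFree(dc);
    return 0;
}
```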
And hmm, which dGPUs does HSA support, exactly? Would it run on one of the HD 57xx cards I have, like OpenCL does? Or at least on my primary R9 270X? I do not give a fuck about uber-brand-new mega-cool hardware super-features, dammit. But I do care about reuse of existing hardware and getting the most out of it. Not to mention it is hard to demand that all users buy the newest GPUs. Even gamedevs can't afford this, despite the existence of gaming maniacs who buy each and every newest GPU. Sure, I can understand hardware improves over time. But it's not like devs run like mad to buy each and every brand-new hardware part; they're not gamers. After all, raw performance, performance per watt, etc. do improve over time, but clearly not at a speed where buying a new GPU each year really justifies it. HBM could theoretically be a major breakthrough... but, erm, somehow Catalyst fails to put on the show it deserves, the open-source driver is half-baked at best, and do you honestly think we're going to waste $500+ to get such crappy results? Hmmph, sorry, but...
AFAIK, Nvidia gradually developed CUDA and it mostly works across many different GPUs, probably everything from Fermi onward, or so (correct me if I'm wrong). Is that the case with HSA? Not to mention CUDA is a single strong brand. AMD on the other hand created like a dozen confusing names. OpenCL? HSA? KFD? AMDIL? HSAIL? Uh, what else have I forgotten? Uhm, what are all of these and why exactly am I supposed to give a fuck?
And what would prevent devs from getting the idea: "Aha, let's target CUDA, use Nvidia as the primary development platform, optimize for Nvidia, and AMD... would do something... we do not care anyway, because we use Nvidia cards to develop!"? Right now most gamedevs develop using Nvidia tools and optimize for Nvidia. Wouldn't this move cause the very same thing in the GPGPU world, so devs would just target CUDA all the time? Because it is a single strong brand. Because it seems to be gradually developed. And because it seems Nvidia cares about hardware compatibility, at least across their own GPUs.
This in turn means less optimization and less performance on everything that is not Nvidia. Even if the hardware is good on its own, it can be different enough that code tuned for Nvidia runs worse on other GPUs. And so far, all this mess does not look really appealing, the branding looks rather confusing and fucked, and I'm not really fond of aggressive reliance on features of brand-new hardware. I'm really fine with, say, my R9 270X and would only buy something newer when I can see something that can do it like 3x better in the same TDP, etc. No, in fact I'm rather positive about getting more AMD GPUs, but only if older GPUs remain usable as well. Otherwise it looks like a poor investment. Tomorrow you would do something better, invent yet another brand, and call the previous ICs and brands obsolete. And... what? I have to go rewrite code and buy a new GPU? Hmm, somehow I'm not very optimistic about it.
Originally posted by bridgman:
SystemCrasher was the one who brought up HSA and suggested we had dropped it in favor of CUDA. I was just doing my part
...but it's much worse when it comes to global orchestration, etc.
How it's supposed to work:
- HW team delivers an epic advantage. Sure, that works any day for AMD.
- SW team is ready and helps to unleash the fury of the Fury. Erm... it seems it rather unleashed the fury of Phoronix readers instead. Catalyst epically fails in benchmarks. No open driver in time either. Uh-huh.
- Marketing does its best to ensure all competitors tremble, since they have nothing to counter HBM at this point, and it's not like they can come up with it in one night either. But really, I guess it's AMD's marketing dep't that is going to tremble after reading the Phoronix benchmarks...
Originally posted by dungeon:
Originally posted by profoundWHALE:
Save the money for Zen and whatever their next GPUs are
AMD FX-9370 8-core CPU
AMD Radeon R9 290 GPU (bought 3 of them, sold one, gave the other to a family member)
ASUS Crosshair V Formula-Z (AM3+ ATX)
4 x AMD 4GB 2400MHz DDR3 (Dual Channel)
240GB Crucial BX100 SSD (Running Windows 7)
120GB Kingston HyperX SSD (Running Ubuntu 15.10)
4TB WD Red HDD @ 5400RPM (Windows Steam + Origin Games)
3TB Toshiba HDD @ 7200RPM (Media)
Fractal Design 1000W Newton R3 Platinum rated PSU
Fractal Design Define R4 Case (Windowless)
So... no, I doubt I will lose a year of gaming when I already have all the performance I'll need for quite a long while. I could do without the noise of the R9 290's reference cooler, though... The heat I like, because Canada's winters are cold!