AMD Open-Sources VCE Video Encode Engine Code
-
Originally posted by curaga View Post
- the mentioned flexibility doesn't include slice output?
-
Originally posted by madbiologist View Post
Probably because they want to sell more of their hardware and not yours.
-
My 50c:
Obviously hardware companies will design software in a way that favours their own product (hardware). Nobody should expect anything different.
It should be up to the distribution maintainers to patch and maintain the upstream projects, since the upstream maintainers themselves are not interested in doing so due to their own bias. After all, the distributions have the most to gain or lose from having a sane, well-documented and well-supported set of APIs that developers can target and users can enjoy.
Microsoft and Google do this and are very successful at it. Valve is doing the same thing in the form of SDL, but distros are probably too understaffed, too lazy, or too preoccupied with cosmetic changes and petty political bickering to actually do what has to be done: in this case, unify VAAPI, VDPAU and whatever other APIs are available into a single API that app developers can target and be done with it.
-
Originally posted by Deathsimple View Post
Currently we only expose the "normal" 4:2:0 YUV to H264 encoding process. But you can, for example, aid encoding by calculating the best motion vectors with shaders (or the CPU, or get them from the source video while transcoding, etc.). In general it's quite flexible regarding which part of encoding it should do, and could even do only things like bitstream encoding with the rest handled elsewhere.
(Sorry if this was actually answered in what I quoted, I'm just not very knowledgeable about this kind of stuff.)
-
Originally posted by agd5f View Post
The vaapi encode interface was designed before we started the open source vce project. Why didn't Intel use omx or vdpau or some other existing APIs to begin with? omx is a lot more flexible in being able to support different types of hw. vaapi is very much tied to the way Intel's hw works (on both the encode and decode sides), which makes it a poor fit for other hw.
Mentioning such high-level details would hardly have given away competitive advantage.
/me ends public flogging for failing to predict the future
-
Originally posted by curaga View Post
This is addressed more to the managers, twriter and bridgman. I would have expected some forward-looking here. Even though the VCE open-sourcing project was yet to be started, you likely knew such a unit would be included in the generations under planning at the time, and so would've been expected to send a few emails as vaapi 0.1 first started making waves.
Mentioning such high-level details would hardly have given away competitive advantage.
/me ends public flogging for failing to predict the future
I'm not saying we couldn't have looked sufficiently far ahead to anticipate open source VCE support back in early 2009, but I guess I am saying that I couldn't.
-
Hello,
I am working on an embedded AMD solution (Kabini) and need to use hardware-accelerated video encoding. Until now I have been unable to make it work, but I was using ffmpeg for my tests, and I read here that it does not support h264 HW acceleration. So I am about to test it with GStreamer as suggested here, but before I start searching everywhere, I thought I could ask for some guidance! :-)
So the question is simple: can someone point me to where to start if I want to use HW-accelerated h264 encoding on an AMD G-series?
For information, the system is Linux (for the tests, we run a plain Ubuntu on the SoC) and we need to grab the X11 screen to an h264 video file. For now, we have just installed the latest AMD drivers (13.25?).
I already saw that there is a GStreamer plugin to grab video from X11, but I don't know how to make it use hardware acceleration.
Any experience with that? Maybe a link to a tutorial or whatever?
Best regards,
Meigetsu
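As a hedged starting point (not from the thread itself): GStreamer can capture the X11 screen with its stock ximagesrc element and hand the frames to whatever H.264 encoder element is installed. Whether AMD's VCE is actually used depends on the local setup — a gst-omx build wired to AMD's OpenMAX IL typically exposes omxh264enc, while a working VA-API driver exposes vaapih264enc; which of those (if either) is present on a given Kabini system is an assumption, so check first:

```shell
# See which H.264 encoder elements this GStreamer install actually has;
# look for omxh264enc (OpenMAX, what AMD's open VCE stack targets)
# or vaapih264enc (VA-API). x264enc is the software fallback.
gst-inspect-1.0 | grep -i 'h264.*enc'

# Capture the X11 screen and encode it to an MP4 file.
# Swap omxh264enc for whichever encoder element the command above reported.
gst-launch-1.0 -e ximagesrc use-damage=0 \
    ! videoconvert ! video/x-raw,format=NV12 \
    ! omxh264enc \
    ! h264parse ! mp4mux ! filesink location=screen.mp4
```

Stop the capture with Ctrl+C; the -e flag makes gst-launch-1.0 send EOS down the pipeline so mp4mux can finalize a playable file.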