AMD Announces The Athlon 200GE With Vega 3 Graphics, 2nd Gen Ryzen/Athlon PRO


  • starshipeleven
    replied
    Originally posted by torsionbar28:
    Lol, what? Did you forget the part where a total remote takeover of the machine was possible (and easy) due to a flaw in the ME? Most security outfits rightly rated that one as critical while Intel scrambled for a fix.
    1. That particular case was a flaw in the AMT component of the ME (the remote control module), which is for vPro-enabled devices and servers; the overwhelming majority of consumer devices are unaffected.

    2. At no point did I see Intel sales plummet, or PC review sites stop recommending Intel hardware over security issues, or ordinary PC users, most of whom don't even know what the ME is, decide to go for AMD instead.

  • torsionbar28
    replied
    Originally posted by GrayShade:
    One thing I don't get is why the more expensive processors don't have integrated graphics. If I buy a Ryzen 7 or a Threadripper, I'd rather not pay $100 more for a GPU, assuming I'm not interested in gaming.
    Why $100? There are low-end discrete GPUs available for as low as $35. You can buy a decent e-sports gaming card for $100.

  • torsionbar28
    replied
    Originally posted by starshipeleven:
    Did it? Because apart from some vocal minority none gives a shit.
    Lol, what? Did you forget the part where a total remote takeover of the machine was possible (and easy) due to a flaw in the ME? Most security outfits rightly rated that one as critical while Intel scrambled for a fix.

  • grok
    replied
    Well, here's something I never knew or paid attention to: PCIe is full duplex (from what I think I'm seeing on Wikipedia).
    So PCIe x8 should be fine for the CPU sending commands, vertices, texture data, etc. while video output travels the other way, without hurting much.

    Thunderbolt is full duplex too, which must be why an external GPU driving the internal laptop display is usable at all (on e.g. the infamous MacBook Pro).
    There's still a performance penalty doing that, but Thunderbolt is only PCIe 3.0 x4, i.e. half the bandwidth of the Ryzen APU scenario.

    The question is why you would pair a Ryzen 2400G, which is not a really great CPU, with an NVIDIA GPU. It's far from bad, though; perhaps like an old i7 2600K (old in years, not in performance, obviously).
    I wonder if Zen 2 brings PCIe 4.0 to the AM4 socket. I think it definitely should: you'd get PCIe 3.0 in an old motherboard and PCIe 4.0 in a new one.

    The Ryzen 2400G, 2200G and the new Athlon perform like the Ryzen 1000 series (but with the finer monitoring/power management of the Ryzen 2000 series, and half the L3), so a Zen 2 APU (Ryzen 3000 series) has significant performance to gain. A next-gen Ryzen APU + NVIDIA GPU rig would thus be a rather weird thing to do, but not completely nuts.
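    The x8 vs. x4 comparison above can be sketched with quick arithmetic. This is a rough sketch assuming standard PCIe 3.0 figures (8 GT/s per lane with 128b/130b encoding, full duplex, ignoring protocol overhead) and the x8 APU link mentioned elsewhere in the thread:

    ```python
    # Rough per-direction bandwidth: PCIe 3.0 x8 (the Ryzen APU's GPU link)
    # vs PCIe 3.0 x4 (Thunderbolt 3's PCIe tunnel).
    # PCIe 3.0 runs 8 GT/s per lane with 128b/130b encoding, and being
    # full duplex, EACH direction gets the full per-lane rate.

    PCIE3_GBPS_PER_LANE = 8 * (128 / 130) / 8  # ~0.985 GB/s per lane, per direction

    apu_x8 = 8 * PCIE3_GBPS_PER_LANE  # APU's reduced x8 link
    tb3_x4 = 4 * PCIE3_GBPS_PER_LANE  # Thunderbolt 3 eGPU link

    print(f"PCIe 3.0 x8: {apu_x8:.2f} GB/s each way")
    print(f"PCIe 3.0 x4: {tb3_x4:.2f} GB/s each way")
    print(f"ratio: {apu_x8 / tb3_x4:.1f}x")  # the x8 link has double the bandwidth
    ```

    So even the APU's halved x16-to-x8 link still has twice the raw bandwidth of a Thunderbolt 3 eGPU setup, which is why routing the display back over the same link is workable.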

  • chithanh
    replied
    Originally posted by starshipeleven:
    The HAP bit was mandated by the US gov, which can be a minority in the worldwide market but wields tremendous power. Even a large lobby of consumers can't just phone Intel and ask them nicely to go fuck themselves like the US gov did.
    What I wanted to point out is that even if it is a minority, it can't just be labelled "vocal". It holds considerable purchasing or other power.

    Originally posted by grok:
    Another little bit: AMD CPUs with integrated graphics have x8 PCIe instead of x16. Traces are reused for the display outputs.
    I expected that (the same was done for DisplayPort on the RS780), but I hadn't seen confirmation of it until now. If true, that would put AMD in a bad position, as Windows (since 1803) allows FreeSync/Adaptive-Sync on NVIDIA cards if output is routed through a FreeSync-capable iGPU. And Intel has reconfirmed their plans to support Adaptive-Sync in the future. This could lead to the following paradoxical situation:

    Ryzen CPU + NVIDIA GPU -> no FreeSync, only G-Sync
    Intel CPU + NVIDIA GPU -> FreeSync

  • grok
    replied
    Originally posted by Tomin:

    That's probably because they are currently making dies with two CCXs and no graphics, and dies with one CCX and graphics, so it's either 8 cores, or 4 cores plus graphics. Intel seems to bundle graphics on (almost) all of their processors. Maybe some future processor will have two CCXs and some (small) graphics. It's also power constrained: Ryzens have slightly higher TDP values than Intel processors.
    Another little bit: AMD CPUs with integrated graphics have x8 PCIe instead of x16. Traces are reused for the display outputs.

    Die area would be a problem as well, and even memory bandwidth.
    We'll have to see what they do with Zen 2 on 7nm; in theory they could make an APU die with two CCXs plus graphics, and a CPU die with three CCXs and no graphics.

  • grok
    replied
    I've checked on AnandTech; there will also be Athlon 220GE and 240GE models.

    This Athlon 200GE is good to have; AMD has been too weak on the low end for too long. It's been so long, though, that few users will buy a low-end desktop + monitor anymore when a laptop costs the same or less. At least it won't be slower than a 2C/4T Intel laptop.

    Still a bad time to get one (RAM prices).
    An Athlon 200GE + 16GB RAM + low-end NVMe SSD + hard drives would fly.

  • starshipeleven
    replied
    Originally posted by chithanh:
    I think the existence of the HAP bit (aka "ask the ME nicely to please disable itself"), plus the fact that OEMs like Dell offer "ME disabled" computers (presumably through a similar mechanism), tells us that it is not just a vocal minority.
    The HAP bit was mandated by the US gov, which can be a minority in the worldwide market but wields tremendous power. Even a large lobby of consumers can't just phone Intel and ask them nicely to go fuck themselves like the US gov did.

    Dell is just one of the OEMs, and one of the ones that also sells laptops with Linux, so they are obviously trying to cater to this minority and see if they can get into a new niche. Same for Purism and other small OEMs that do the same: they are making it for a market niche.
    FYI: Apple has always catered to a market niche; they created their very own religion for the sake of creating the niche where they can sell products. That does not mean that suddenly every buyer wants the notch in smartphones, or that everyone wants ultraslim laptops with only a couple of USB-C ports.

  • chithanh
    replied
    Originally posted by starshipeleven:
    Did it? Because apart from some vocal minority none gives a shit.
    I think the existence of the HAP bit (aka "ask the ME nicely to please disable itself"), plus the fact that OEMs like Dell offer "ME disabled" computers (presumably through a similar mechanism), tells us that it is not just a vocal minority.

  • Tomin
    replied
    Originally posted by GrayShade:
    One thing I don't get is why the more expensive processors don't have integrated graphics. If I buy a Ryzen 7 or a Threadripper, I'd rather not pay $100 more for a GPU, assuming I'm not interested in gaming.
    That's probably because they are currently making dies with two CCXs and no graphics, and dies with one CCX and graphics, so it's either 8 cores, or 4 cores plus graphics. Intel seems to bundle graphics on (almost) all of their processors. Maybe some future processor will have two CCXs and some (small) graphics. It's also power constrained: Ryzens have slightly higher TDP values than Intel processors.
