NVIDIA May Be Trying To Prevent GeForce GPUs From Being Used In Data Centers


  • starshipeleven
    replied
    Originally posted by coder View Post
    Religion, nationalism, feudalism, charity, conscription, and slavery are all proven methods of doing things at scale. It's just that capitalism seems the most flexible, robust, and potentially fair. But having a really nice hammer doesn't automatically turn every problem into a nail.
    None of those scales anywhere near well enough to base a modern society on. There is a reason all the historical examples in richer nations died off centuries ago; any surviving examples are relegated to less-developed countries or have become little more than scams (religion and nationalism, I'm looking at you).

    Back in the day, when they were still effective and trendy, the situation was different:
    - the world population was much smaller overall
    - poor communication and transportation meant the number of people who could actually interact with each other on a daily basis was tiny

    So while whole nations used slaves or were feudal kingdoms, that still doesn't mean those systems were good enough at the scales required now. Their total populations were a few million individuals, and most commoners never interacted with anyone from outside their village.
    In that situation, the method to "get the job done" only needs to work at the scale of a village or a small modern city.

  • starshipeleven
    replied
    Originally posted by aht0 View Post
    If a particular university lacks some sort of data center that a particular researcher/scientist could make use of, it's pretty darn unlikely she/he would have the resources to set up her/his own.
    In those cases you can (and usually do) rent computing power from other universities, third-party data centers, computing nodes, or whatever. Many supercomputers are rented out by the hour to run someone's calculations. It costs quite a bit.

    And that NVIDIA licence applies to data centers in particular, not to universities.
    "Data center" is a generic term; it can also mean the physical place (room or building) you put the computing hardware in. And yes, NVIDIA's lawyers aren't stupid. They stayed generic on purpose.

    And the whole university has a budget big enough that it could afford professional-grade cards instead of consumer-grade ones.
    That's nonsense on two levels.

    1. Budgeting isn't done by some god-like figure who decides where every department's money goes, down to the white A4 paper and printer supplies, and who is all-knowing enough to make wise choices on everything. The budget is split among department heads and then, depending on how things are organized, it may be split further (into sub-departments or something else). Those who decide to buy hardware for their department or sub-department have a relatively small amount of money, which usually covers expenses and salaries and little more.
    To get more funding they need to escalate the request and convince those above them (who may in turn need to do the same with their own superiors). This process can be interrupted very easily at any point by a superior who decides it isn't worth it. Note that managers and superiors don't usually know the details of every field, so they can't really tell whether the request for "new and cool instrument X" is actually a good idea, and convincing them is not straightforward. The higher the price of the "new cool instrument", the greater the difficulty.
    So yes, the total budget is relatively high, but that doesn't mean each branch can easily get access to large sums of money beyond its immediate needs, because of the chain of command and bureaucracy (and ignorance).

    2. GPUs are a tool, a means to an end. When you've got to get a job done, you don't care about the sticker. For quite a few computing loads you don't need ECC VRAM or bigger buffers or whatever; you only need raw throughput, low power consumption, and other more mundane things (whether ECC is even in play is something a job can check in software, as in the sketch below). And if that's the case, then it is simply stupid to buy more expensive cards just because you are big and wealthy (and stupid).
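
    A minimal sketch of that check, assuming CUDA's runtime API and a single-GPU machine (device index 0); the field names come from the standard cudaDeviceProp struct:

    #include <cstdio>
    #include <cuda_runtime.h>

    int main() {
        cudaDeviceProp prop;
        // Query the properties of device 0 (assumption: a one-GPU box).
        cudaError_t err = cudaGetDeviceProperties(&prop, 0);
        if (err != cudaSuccess) {
            std::fprintf(stderr, "cudaGetDeviceProperties failed: %s\n",
                         cudaGetErrorString(err));
            return 1;
        }
        // ECCEnabled is 1 on boards with ECC memory switched on
        // (Tesla/Quadro-class parts); consumer GeForce cards report 0.
        // Plenty of workloads run fine either way, which is the point above.
        std::printf("%s: ECC %s, %zu MiB of VRAM\n", prop.name,
                    prop.ECCEnabled ? "on" : "off",
                    prop.totalGlobalMem >> 20);
        return 0;
    }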

    The local university here is not the richest (former USSR), but at least its "clinic component" runs Oracle DB on the Solaris platform. Which ain't exactly a cheapskate option.
    That just says they waste money. Because budget decisions involve social skills (as explained above), every academic institution (or any human organization, for that matter) burns money on bullshit just because someone knew how to sell his idea better than the others did (or bribed his way around).

  • coder
    replied
    Originally posted by anarki2 View Post
    Meh. There's no "may be trying". They 100% achieved this years ago. Not de jure, but de facto.

    You can’t use consumer GPUs in datacenters
    Lol. He thinks the only reason they switched from end-mounted power connectors to top-mounted ones is to screw people trying to put the cards in server cases?

    He's completely ignoring that many desktop cases are length-constrained but have extra room along the top edge of the card. It's simply the most pragmatic arrangement for desktop cases. And I seriously doubt the board partners are prevented from making boards with end-mounted power connectors; he needs to provide some evidence of that rather than pure speculation.

    He's also wrong about dimensions. You could have a 4U case with vertically oriented desktop GPU cards; only 3U is too short to accommodate a top-mounted power connector.

  • xorbe
    replied
    And you wonder why nvidia wanted to move to email logins on consumer hardware.

  • coder
    replied
    Originally posted by starshipeleven View Post
    There is no system other than "greed" that works at decent scale for getting people to throw resources at problems, so... whatever. There are no other choices.
    Religion, nationalism, feudalism, charity, conscription, and slavery are all proven methods of doing things at scale. It's just that capitalism seems the most flexible, robust, and potentially fair. But having a really nice hammer doesn't automatically turn every problem into a nail.

  • coder
    replied
    Originally posted by aht0 View Post
    If a particular university lacks some sort of data center that a particular researcher/scientist could make use of, it's pretty darn unlikely she/he would have the resources to set up her/his own. And that NVIDIA licence applies to data centers in particular, not to universities. And the whole university has a budget big enough that it could afford professional-grade cards instead of consumer-grade ones.
    Unless they actually defined "Data Center", you should be aware that creative lawyers can and will bend that term to cover things like server rooms or even, potentially, computer labs. You need to dig into the meat of the specific language they used before deciding whether it applies to a given university or department.

  • coder
    replied
    Originally posted by pmorph View Post
    Nonsense.
    In capitalism you stay in business by providing products that offer better value than your competition. People won't give you their money just because you are greedy.
    What we have here is a market failure. Part of the problem is that the CUDA API is potentially subject to copyright protection, should a ruling similar to this one hold:

    https://en.wikipedia.org/wiki/Oracle...v._Google,_Inc.

    This is obviously an abuse/misinterpretation of copyright, but now that we know it can happen, it's probably why AMD was extra cautious and implemented HIP as a porting toolchain (https://github.com/ROCm-Developer-Tools/HIP) rather than providing API/ABI-level CUDA compatibility (see the sketch below for what source-level porting looks like). So the market will take longer to adapt, and in the meantime Nvidia can milk its customers. Once the marketplace becomes more competitive, they can adapt by adjusting their product offerings, pricing structure, and/or license terms.
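
    To make the HIP point concrete, here is a minimal sketch, assuming the standard HIP runtime header; the kernel and variable names are mine. HIP deliberately mirrors CUDA's API names (hipMalloc for cudaMalloc, and so on), so the porting tools can translate CUDA source mechanically: compatibility at the source level rather than the API/ABI level.

    #include <cstdio>
    #include <hip/hip_runtime.h>

    // A trivial SAXPY kernel. In CUDA source this would be identical apart
    // from the header include, which is what makes source-level porting
    // (hipify) mostly mechanical.
    __global__ void saxpy(int n, float a, const float* x, float* y) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) y[i] = a * x[i] + y[i];
    }

    int main() {
        const int n = 1 << 20;
        float *x = nullptr, *y = nullptr;
        // hipMalloc/hipMemset are 1:1 renames of cudaMalloc/cudaMemset.
        hipMalloc(&x, n * sizeof(float));
        hipMalloc(&y, n * sizeof(float));
        hipMemset(x, 0, n * sizeof(float));
        hipMemset(y, 0, n * sizeof(float));

        // hipLaunchKernelGGL is HIP's portable spelling of CUDA's <<<...>>>
        // launch: grid, block, shared-memory bytes, stream, kernel arguments.
        hipLaunchKernelGGL(saxpy, dim3((n + 255) / 256), dim3(256), 0, 0,
                           n, 2.0f, x, y);
        hipDeviceSynchronize();

        hipFree(x);
        hipFree(y);
        std::puts("saxpy done");
        return 0;
    }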

  • aht0
    replied
    Originally posted by onicsis View Post
    This restriction sounds illegal to me. It's like Intel or AMD forbidding games from being played on systems with low-end CPUs.
    Unless you can point out a particular law being violated, be it US or EU law, it's legal. Something can't be "a bit legal" or "a bit illegal", just as a woman can't be a bit pregnant.

    Since they have a perfect right to define rules covering the use of their own products as they wish, you as a consumer and customer have a perfect right to refrain from buying. Both sides happy.

  • onicsis
    replied
    This restriction sounds illegal to me. It's like Intel or AMD forbidding games from being played on systems with low-end CPUs.

  • aht0
    replied
    Originally posted by starshipeleven View Post
    I like how you assume that all academics/universities in the world must be crawling with cash to throw at problems just because you think your local one is.

    In general it's not true. They do have a lot of money, because they don't run on love and prayers, but it's never enough to pay for all the researchers. And any researcher will do all he can to maximize the cost-effectiveness of his equipment: if he can get double the processing power at the same price, you can bet your ass he will do that instead of paying more for a more "certified" GPU.

    I'd like to point out that 300k euro can be OK for a smallish research project; using research machinery is expensive (especially when you have to rent it from someone else), and test kits (chemical consumables for running tests and such) are expensive too. Unless the prices in your country are dramatically different from those in the richer countries where I live, but I doubt it for high-end things like this.
    If a particular university lacks some sort of data center that a particular researcher/scientist could make use of, it's pretty darn unlikely she/he would have the resources to set up her/his own. And that NVIDIA licence applies to data centers in particular, not to universities. And the whole university has a budget big enough that it could afford professional-grade cards instead of consumer-grade ones.

    The local university here is not the richest (former USSR), but at least its "clinic component" runs Oracle DB on the Solaris platform. Which ain't exactly a cheapskate option.
