That's a terrible idea. A GPU is a chip made for certain tasks. It is really good at those tasks, but really terrible at others it isn't made for. It would not only be very inefficient, but also slow. Besides that, I'm not even sure current GPUs have all the features needed to do everything CPUs do.
Linux kernel running on GPU without CPU
Originally posted by A1B2C3:
"I'm sorry but you are weak in this matter. Please look at the architecture of the CPU and GPU. [..] The CPU is primitive."
Originally posted by A1B2C3:
"They are like jobs, they just use algorithms for solving that are more suitable for execution on the CPU"
Originally posted by A1B2C3:
"It's like if you pulled a racing car in a cart with horses and said that a racing car is not faster than horses."
I won't write more because you don't seem to understand the basics.
Originally posted by A1B2C3:
"[T]here are no algorithms for solving standard problems using GPU. everything [sic] is CPU-oriented.
...
deep [sic] parallelization will help solve the stupid killing of processes in the Linux kernel, solve the problem with queues and other diseases that seem incurable when using the CPU."
I think what you want is a CPU with thousands of cores, rather than a GPU. GPUs are designed from the ground up to be co-processors: they have none of the general-purpose machinery needed to operate by themselves. They are built to process many small equations at once, and they lack the hardware orchestration abilities CPUs provide. You could probably design a CPU with thousands of cores, but at that point it wouldn't look anything like x86/ARM/RISC-V, and the cores themselves would probably be slower than dedicated GPU cores. On the flip side, standard GPU architectures can't run by themselves. You can't simply program an OS to run without the CPU; the computer's physical architecture doesn't support it, and the machine literally will not turn on.
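To make that concrete, here is a minimal sketch (assuming CUDA; all names and sizes are made up for illustration) of what even the most trivial GPU program looks like. Every line except the kernel body runs on the CPU; the GPU never allocates memory, schedules work, or starts anything on its own.

Code:
#include <cuda_runtime.h>
#include <stdio.h>

// The only code that executes on the GPU: one tiny math op per thread.
__global__ void scale(float *data, float factor, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= factor;
}

int main(void) {
    const int n = 1 << 20;
    size_t bytes = n * sizeof(float);

    // CPU prepares the input in system RAM.
    float *host = (float *)malloc(bytes);
    for (int i = 0; i < n; i++) host[i] = 1.0f;

    // CPU allocates GPU memory and copies the data over.
    float *dev;
    cudaMalloc((void **)&dev, bytes);
    cudaMemcpy(dev, host, bytes, cudaMemcpyHostToDevice);

    // CPU launches the kernel and waits for the GPU to finish.
    scale<<<(n + 255) / 256, 256>>>(dev, 2.0f, n);
    cudaDeviceSynchronize();

    // CPU copies the result back and cleans up.
    cudaMemcpy(host, dev, bytes, cudaMemcpyDeviceToHost);
    printf("first element: %f\n", host[0]);
    cudaFree(dev);
    free(host);
    return 0;
}

Strip out the host side and nothing is left that could boot, schedule, or even start the kernel, which is the whole point.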
Also, why would you even want this? It might be interesting from a manufacturing perspective, but there's nothing really wrong with having a CPU with a GPU as the co-processor. Even if the GPU is doing most of the work, you can still have the CPU perform general system management, which is pretty much how it works now anyway. In fact, modern GPU stacks can already bypass the CPU and system RAM for certain transfers to perform work more efficiently, but they're not going to replace the CPU, because that would be a huge waste of time, both to invent it and in the GPU's own processing time, when it's easier to simply split serial tasks and parallel tasks across two dedicated processors.
Originally posted by A1B2C3:
"Guys, you are wrong. I know that it is possible. It is difficult but possible. Let's in this case each of us stick to our own opinion."
If you *know* a standard GPU can run by itself, then how? I'm genuinely asking.
Otherwise, if you don't know, why don't you accept our answers? As far as I understand how GPUs work, they can't run by themselves, because they literally weren't built to do that; they are designed from the ground up to be co-processors. Aside from being extremely proprietary, with essentially entirely different architectures between generations, they don't have hardware capabilities like networking or disk control, and probably don't even have an MMU; I'd assume the CPU or southbridge handles that. Again, it's totally fair to ask for a CPU with thousands of cores, but you are literally asking for an existing co-processor to perform all system work. That's like saying you can rip the 8086 out of an IBM PC and use only the 8087, when that's physically impossible.
Also, I'm still not sure why you hate CPUs so much. I just explained that they're not only perfectly fine at what they do, they actually offload work from the GPU: the GPU performs better because it doesn't have to do all of the serial, non-parallel work the CPU is doing. You could just as easily turn your (frankly insane) argument around and say we should all stop using GPUs because technically you can render on the CPU. Both arguments fail for the same reason: the current design of pairing a serial workhorse with a highly parallel processor optimized for tiny mathematical operations fits our modern computing paradigm perfectly. Computers need to run both expensive non-parallel tasks and highly parallel micro-tasks, which CPUs and GPUs are respectively perfect for. What you propose is either to bog down the GPU with expensive non-parallel workloads it isn't optimized for, or to build a CPU with thousands of multi-purpose compute cores that are a jack of all trades and a master of none.
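To illustrate the split (a toy sketch, not from any real benchmark; both functions are invented for this example): the first loop is a dependency chain where every iteration needs the previous result, so throwing a million GPU threads at it buys nothing, while the second is an independent per-element operation, which is exactly what a GPU kernel maps onto.

Code:
// Inherently serial: each iteration depends on the one before it.
// No number of GPU threads can compute step i before step i-1 exists,
// so this belongs on a fast single CPU core.
float serial_recurrence(const float *input, int n) {
    float x = 0.0f;
    for (int i = 0; i < n; i++)
        x = 0.5f * x + input[i];   // needs the previous x
    return x;
}

// Embarrassingly parallel: every element is independent, so each GPU
// thread can handle one element with no coordination at all.
__global__ void parallel_map(float *data, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] = data[i] * data[i] + 1.0f;
}

A real system runs both kinds of code all day, which is exactly why the CPU/GPU split exists.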
Originally posted by A1B2C3:
"that it is time to make the transition. CPU time is up."
So far you just sound like a ~2008 AMD marketing executive demanding more cores without any regard for how well they perform.
And, no, nobody wants to kill CPUs. To be frank, I'm not convinced you're even aware of what a CPU or GPU actually is.
Okay, I think I at least understand you a little better. So this is political and not technical.
Originally posted by A1B2C3:
"Adapt the Linux system for implementation in the core of neural networks, no NPU can compare with the GPU for these purposes."
Again, this seems to tie in with your misunderstanding of what a compute core is. CPU cores are entirely general-purpose. GPU cores are meant for simpler mathematical operations, but perform them faster than CPU cores. NPU cores are built specifically for the matrix operations used in ML, and are even better at those than GPU cores. This is why we have the different processor types: the more general-purpose a core is, the worse it is at specific tasks, and conversely, the better a core is at a specific task, the worse it becomes at general-purpose ones. That is exactly why we don't throw everything at the GPU: a lot of computational tasks would be horrible to perform on a GPU compared to a CPU, for a variety of reasons. The CPU is the safe default for computation because it's simply the best component for general-purpose computing; people only go to the GPU ("GPGPU") when they have a workload that can specifically exploit the capabilities of GPU cores without suffering their downsides.
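For a concrete picture of the kind of workload GPGPU is for, here's the textbook example, SAXPY (y = a*x + y): millions of identical, independent multiply-adds. This is a generic sketch, not any particular library's implementation.

Code:
#include <cuda_runtime.h>

// SAXPY: y[i] = a * x[i] + y[i]. Pure data parallelism -- the ideal
// GPGPU workload. The grid-stride loop lets any launch size cover any
// problem size.
__global__ void saxpy(int n, float a, const float *x, float *y) {
    for (int i = blockIdx.x * blockDim.x + threadIdx.x;
         i < n;
         i += blockDim.x * gridDim.x) {
        y[i] = a * x[i] + y[i];
    }
}

// The host would launch it along the lines of:
//   saxpy<<<256, 256>>>(n, 2.0f, d_x, d_y);

Now try to imagine a filesystem or a process scheduler written in that programming model, and you see why nobody does it.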
Originally posted by A1B2C3:
"NVIDIA has made a move into the AI sector.. AMD and Intel will never be able to compete with them in this industry. The only way out is if People start buying GPUs from Intel and AMD instead of CPUs. Then they will be saved from collapse and people will in turn receive a base for realizing their needs and requirements. Everyone will be fine. Shitty GPUs from Intel and AMD that are not even good for games will be able to satisfy the demand instead of the CPU. Together, we will all enter the future without losing anyone. Linux systems can save AMD and Intel and become the only first system suitable for scientific work and entertainment. I think that Linux systems will become as popular as Windows. This is a chance for everyone. It is stupid to lose it."
I fully agree with your sentiment that we absolutely need competition for nvidia, but this solution has nothing to do with that. AMD and Intel's GPUs don't suck because they're under-utilized; they suck because those companies simply don't care. AMD neglects its GPU division in favor of its CPU division (a very stupid move, but not one that will be solved by somehow magically making GPUs work autonomously), and Intel is outright incompetent and losing market value by the day because of it (people are comparing them to Boeing). As much as I dislike nvidia, they are quite literally the only company competently making advanced GPUs.
Originally posted by A1B2C3:
"Everything has already reached a dead end with the CPU. People are tired of lies. Nothing is being done to improve the lives of the masses."
I find this particularly ironic, because I vividly remember watching a JayzTwoCents video a few months ago where he said, "It's funny CPUs are exciting now while GPUs are boring, because in the past it was the GPUs that were exciting while the CPUs were stagnant," referring to how both Intel and nvidia don't innovate much when their competition isn't threatening. CPUs were boring because AMD didn't have the resources to compete with Intel; now they're exciting because AMD's CPUs are being massively innovated. Meanwhile, GPUs are boring because AMD is as slack with its GPUs as it was with its CPUs before it finally invented the Zen architecture.
Originally posted by A1B2C3:
"the most important thing is that it will become easy to develop software. you will not need to write code for the CPU. there will not be such a mess as with a huge number of aarch64 processors. This is a huge plus for the Linux kernel. The Linux kernel will be able to move from chaos to order."
This really shows that you have no idea what a CPU or GPU even is. I don't mean that insultingly; you're genuinely making statements about things you know nothing about.
Originally posted by A1B2C3:
"You're the one who doesn't understand my messages. I have already talked many times about algorithms for solving problems on the GPU. Today, a programmer needs to solve a problem by thinking simultaneously on GPU algorithms and CPU algorithms. they will switch to only one algorithm, and they will improve it well. as for changes in GPU architectures, that's up to the compiler. The programming language will remain the same for all architectures and algorithms as well."
I really don't see any more point in this conversation; you don't even seem to understand what computers are. You sound like a politician trying to pass an IT law. I really recommend actually learning what each component in a computer is and how to perform GPU workloads in code, because I'm 100% sure you know nothing about either.
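And to put a point on the "that's up to the compiler" claim: even CUDA, the most mature GPU toolchain around, doesn't fully hide architecture changes. Here is a sketch adapted from the well-known pattern in NVIDIA's own programming guide: double-precision atomicAdd only gained hardware support with compute capability 6.0, so portable device code still has to branch on the GPU generation it's compiled for.

Code:
// Device code still branches on the GPU generation at compile time.
__device__ double atomic_add_double(double *addr, double val) {
#if __CUDA_ARCH__ >= 600
    return atomicAdd(addr, val);   // native on Pascal and newer
#else
    // Older GPUs: emulate with a compare-and-swap loop.
    unsigned long long int *p = (unsigned long long int *)addr;
    unsigned long long int old = *p, assumed;
    do {
        assumed = old;
        old = atomicCAS(p, assumed,
                        __double_as_longlong(val + __longlong_as_double(assumed)));
    } while (assumed != old);
    return __longlong_as_double(old);
#endif
}

And that's still only the device half; all the scheduling, memory management, and I/O around it remains CPU code.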
I haven't read this much ignorant nonsense in a very long time. The facts don't change because you hold a different opinion. GPUs are simply not suited for that task, no matter how much you want them to be. That's why nobody is wasting time or money trying to make Linux run on a GPU.