Red Hat Is Looking To Hire Another Experienced Open-Source Graphics Driver Developer
-
Originally posted by Farmer:
Would it have been easier to understand if it was rewritten to say: "five years of C++ experience - preferably with C++17 included?"
-
Originally posted by stan:
A better comparison would be to Mac OS X 10.1 Puma, which had way prettier 3D eye candy (i.e. "genie" window minimization into the dock, realistic drop shadows, glowing bevels, etc.) than GNOME 3 does today. Keep in mind that Mac OS X 10.1 Puma was perfectly snappy on a Power Mac G3/233 made in 1997 with 128 MB of RAM (yes, megabytes) and an ATI 3D Rage II+ graphics card.
Now realize that GNOME eats up 3 Gigs of RAM after half an hour of use, and none of the core developers have a clue how to debug the garbage collection in gnome-shell's dependencies and scripting language.
When the garbage collection does work, it takes CPU usage through the roof. How embarrassing.
-
Originally posted by starshipeleven:
KDE 5 Plasma and GNOME 3 are perfectly responsive and work perfectly fine on a workstation running shitty ATI Xpress 200 integrated graphics from 2004. I just replaced the ancient HDD with a more modern HDD (still mechanical, no SSD) for unrelated reasons (the stupid thing was noisy).
Now realize that GNOME eats up 3 Gigs of RAM after half an hour of use, and none of the core developers have a clue how to debug the garbage collection in gnome-shell's dependencies and scripting language.
When the garbage collection does work, it takes CPU usage through the roof. How embarrassing.
-
Guest replied
Originally posted by Britoid:
I was never a huge fan of the early versions of GNOME 3, but in the last few years it has become preferable to me over everything else. I don't put much care into customizing my desktop anymore anyway.
-
Originally posted by DoMiNeLa10:
I'm pretty sure it's impossible to have that much C++17 experience, considering how recently 2017 was. It's almost like that Swift job offer people joked about a while ago. These requirements are mostly bullshit, and I don't know why they even bother. Do they expect people to just ignore what they post in their offers and apply anyway?
1. When you write a posting for hire you include everything that the perfect candidate would have. "Ask for the sky." None will have it.
2. When the candidates apply, with none having the "ask for the sky" level, you review the level they do have. Then you get the choice of hiring the best of the lot or going back to the well. Going back to the well means dealing with HR again, which is quite a pain. So, as a manager, you're asking yourself whether the best candidate has sufficient skills to spare you from going through it all again. Which boils down to: "can this person be developed?"
Here we get to unsolicited advice #1: do not overstate your qualifications. You'll come across poorly. Rather than trying to stretch your skills in an interview, focus on your ability to adapt and learn: "I don't have that level, but I'd love the opportunity to develop it."
3. Posting a fixed salary isn't advisable, as then your decision in #2 gets tough. "While the best of the lot is probably good enough to hire, we screwed ourselves by posting a salary they don't command." "Pay based on skills and experience." See how that works?
I'm retired now, but I spent a couple of decades as a technical manager in a large organization. Hiring is, for a manager, a pain best avoided. Whether you see it or not, that is good motivation to retain and advance internal candidates.
My wife once reviewed a job posting and wondered if she should apply. "Apply. If you get the offer you can decline but if you don't apply you've passed a possible opportunity."
Toss your hat in the ring. The worst thing that can happen is they say no.
What I had happen a few times was a candidate wasn't a good fit for that job but was a very good candidate for another job we had open they weren't even aware of.
FWIW
-
-
Originally posted by DoMiNeLa10:
I'm not saying that KDE is good, I'm just noticing that people around me flock to it because of how universally despised GNOME is.
-
-
Guest replied
Originally posted by Britoid:
My hardware runs GNOME 3.32 perfectly fine with zero stutter. It's still not perfect, but I've found KDE isn't exactly much better either.
And if you think GNOME is for touch screens you're using it wrong. It's not trying to be a MacOSX or Windows clone which makes it unique among the larger DEs.
-
Originally posted by DoMiNeLa10:
Show me hardware that can run GNOME 3 well. I've seen i5-8400 boxes stutter with nothing besides GNOME running. People can't stand this crap, and they switch to KDE. The desktop is just awful in pretty much every single aspect; it feels like it was designed for touch screens, and it's so bad that it makes Windows 8 and its Metro interface look good.
Why do they even bother with providing an official desktop? People who spend time working with GNU/Linux workstations will tend to have their preferences and dotfiles at hand to make themselves at home in a couple of minutes. I assume plenty of developers who put in the effort to pick something decent picked a tiling WM and called it a day.
And if you think GNOME is for touch screens you're using it wrong. It's not trying to be a MacOSX or Windows clone which makes it unique among the larger DEs.
-
Originally posted by DoMiNeLa10:
Show me hardware that can run GNOME 3 well. I've seen i5-8400 boxes stutter with nothing besides GNOME running. People can't stand this crap, and they switch to KDE. The desktop is just awful in pretty much every single aspect; it feels like it was designed for touch screens, and it's so bad that it makes Windows 8 and its Metro interface look good.
Why do they even bother with providing an official desktop? People who spend time working with GNU/Linux workstations will tend to have their preferences and dotfiles at hand to make themselves at home in a couple of minutes. I assume plenty of developers who put in the effort to pick something decent picked a tiling WM and called it a day.
And the Ryzen 1700X with its Vega is really smooth; it never glitches. The Raptor II with its Radeon WX7100 is also ridiculously good at running GNOME 3.