
Demo Of The Lima Driver On The KDE Spark Tablet


  • #11
    Originally posted by Qaridarium
    the ARM company doesn't think this is nonsense, because they are working on 64-bit ARM CPUs.

    you can do the same with a 32-bit chroot on a 64-bit system without using PAE.
    and outdated + bad software from bad companies is not an argument for 32-bit.

    and again, i accept the argument: "it's cheaper"

    amd+intel are only a duopolist monopoly mafia, and because of this the price is so high.

    BOM?

    sorry, "SoC" is just a buzzword; in the end it doesn't matter.

    sure it's cheaper (but in my point of view that's the only argument)

    sure, but what you buy is: no 64-bit, less RAM, much slower.
    for me "be slower with fewer features" is not a feature.

    in the end the driver support wins; the radeon driver, for example, has bad power management, but i don't think you have the manpower to beat the radeon driver in features and power management.

    but maybe you are the master of the universe?

    no, sorry; for me cheap is only cheap, and not crap.

    512mb of ram is really the bottom... i would prefer 1gb of ram or more.

    i'll just wait for the version with a minimum of 1gb of ram and a dual-core cpu.

    and if it's not only about being cheap: a 64-bit cpu, more than 4gb of ram, and an openGL 3.3 gpu.

    Ultrabooks are already pushing what is possible with x86 today. x86 is having trouble scaling down from its normal envelope; ARM is having far less trouble scaling up. The money each side has to spend to be able to compete on an equal footing differs by a few orders of magnitude. In a space where quantum physics is already limiting what is possible, scaling down is nigh impossible. Scaling up, however...

    BOM is Bill Of Materials.

    About me being the master of the universe: I am not, but you are overlooking one key fact there. Without me, and the massive amount of modesetting work I put in to push ATI over the line where they could not go back, there wouldn't be a free driver for ATI Radeon today. If I and my two SuSE colleagues had not been obstructed by ATI (and Red Hat) as much, and had been allowed to continue our stellar work, things might have looked pretty different today.

    We were pushing hard for any information on power management in 2007/2008, and we were hearing all the time that such things are board-specific and therefore depend fully and utterly on AtomBIOS. We were seeing the bugs and crap in AtomBIOS all the time, and fixing things in our C code. If anyone could have figured the AtomBIOS-based stuff out, it was the three of us. I personally do not understand the hold-up, and especially not why it is still being held up 5 years on. And that in itself only strengthens the fact that if it wasn't for me, Egbert Eich and Matthias Hopf (who pushed the first free-software triangle out of the R600), we wouldn't have a free ATI driver today.

    And do you really want a 64-bit CPU, 4GB of RAM, and the latest openGL standard in a mobile device today? Get real, or wait another year or two, or maybe get a phone the size of an 80s cellphone.



    • #12
      Go KDE

      The biggest problem with KDE on any tablet is that it looks terrible, and its behavior is no better: no design, just a giant hack. KDE looks OK on a desktop because the desktop paradigm it follows is a rather old standard, with title bars consisting of close, maximize, and minimize buttons, etc. These don't translate to the tablet.

      KDE needs a lead UI designer to control the whole user experience. And as much as I don't like the Unity desktop built on GNOME, I think in the end it is going to win out on mobile devices (and the desktop). The reason is that you have one guy at the top paying the bills (Mark Shuttleworth), which gives him the last word on how things should look and behave. Granted, Mark may be no Steve Jobs, but one-person control is a model that really can't lose, unless the guy at the top is really incompetent.

      Maybe I am wrong, and KDE does have a UI plan with a committee or a single person controlling things. If they do, it has really failed in my opinion, as they keep working on bringing forth newer technologies (now QML) and the semantic desktop, which the average user just doesn't care about. Right or wrong, users want an iPhone/iPad experience. Those who can afford to choose have spoken with their wallets, and when given a choice they buy Apple products.

      I'd like to see KDE put less emphasis on "Be Free" and focus on competing with the other UIs that are out there.

      Viva KDE



      • #13
        Anybody in favor of sending a request to the forum admins to ban idiots like Qaridarium on the grounds of:

        - being a general all-round idiot who cannot phrase a simple sentence properly;
        - having proven himself/herself incapable of listening to logic, and always wanting to troll such discussions;
        - being a certified idiot and numbskull? Oh wait, I said that one already.

        Anyway, with that out of the way, I'm not sure I want 64-bit ARM SoCs any time soon, if only because my current Windows Phone 7 smartphone (an old LG Optimus 7) is already mighty fast as it is, and the last thing we need is for mobile app developers to start jumping on the 64-bit bandwagon by deliberately writing inefficient code that consumes more memory and power than it really needs.

        I remember how my computer science lecturer described the pros and cons of 32-bit versus 64-bit computing: the 32-bit processor is a cheap Toyota Corolla and the 64-bit processor is like a Hummer. Both will get you from point A to point B, but if you are only going to transport one passenger most of the time, the Toyota gets the job done faster, more efficiently and more cheaply than the Hummer, which drinks gasoline like there is no tomorrow, starts up more slowly and drains more power to get you from point A to point B.

        The same analogy applies to smartphones and tablets: they are not going to replace the traditional desktop PC and notebook, seeing as their primary use is on-demand communication while on the move, so why have them waste precious battery uptime on redundant technology like 64-bit and, ugh, 4GB of RAM? I sure as hell don't want to pay $1000 for a "64-bit smartphone with 8GB of RAM", yuck, but I'll readily pick the smartphone that has a SoC with a 32-bit CPU core clocked at 1.6GHz (yes, I'm looking at THAT well-known SoC), 1GB of RAM and a price tag that does not burn a hole in my wallet.

        Unfortunately, numbskulls like Qaridarium won't be able to see that logic.



        • #14
          Originally posted by Qaridarium

          short and true answer: YES! i sold my last android phone (192mb of ram) because of this.
          i'm sick of having less ram than i need for my apps.
          i can handle a slow cpu, but less ram than i need is a nightmare for me.
          maybe i don't need the latest openGL standard; i only need openGL 3.2 at minimum, because of WINE. (sure, also for an ARM device emulating x86 + directX)
          Which ****ing moron runs WINE on a smartphone? Oh wait, you ARE a ****ing moron.

          OpenGL 3.2? Then please get your arse out of Linux: Mesa 8 only supports OpenGL 3.0. Go back to Windows, moron.



          • #15
            Originally posted by DarkCloud View Post
            The biggest problem with KDE on any tablet is that it looks terrible, and its behavior is no better: no design, just a giant hack. KDE looks OK on a desktop because the desktop paradigm it follows is a rather old standard, with title bars consisting of close, maximize, and minimize buttons, etc. These don't translate to the tablet.
            I personally don't really care about the user interface. This is the first modern tablet computer where you can use a proper GUI toolkit (Qt + QML) and the other parts of a full Linux stack. As long as the GUI lets me launch my Qt + Python application, I'm fine.



            • #16
              Originally posted by Qaridarium

              you are stupid as hell, because right now you can buy 1000-dollar smartphones without 8gb of ram and without 64-bit.
              1gb of ram only costs 4€ right now; this means only 32€ for 8gb of ram, and the OEMs get it even cheaper, around 20€.
              do you really think 20€ of ram is a big part of a 1000-dollar smartphone?
              and 64-bit: if they used MIPS they would have 64-bit already, and in 2014 64-bit ARM chips will be the standard.

              also, "GHz" burns more heat than 64-bit; in fact you can do the same work with fewer MHz on 64-bit in the same time, which means 64-bit saves you energy* (*= if you need more than 4gb of ram, compared to PAE)

              and hey, i vote for this version: '''Anybody in favor of sending a request to the forum admins for banning idiots like Sonadow on the grounds of:'''

              32 euros to manufacture, but the added PCB routing, complexity, soldering, and quality assurance surely push it over 100 euros if you want to make some $$ off that. By any chance, could you explain to us the need for 64 bits in the mobile landscape? If such a need existed, why aren't we using MIPS as of now? What are the advantages of 64 bits over 32 bits? Does it make sense to trade off a feature that isn't a feature in a mobile context for added power consumption?

              And just listing, off the top of my mind, workloads which could benefit from 64-bit memory addressing: how important is it for scientists working on large simulations, let's say in quantum chemistry, to compute them on a mobile tablet? Would a 3D artist be rendering something on his/her tablet? Heck, and I ran out of examples.

              Btw, I vote for you being banned. The other guy, I like him much more than you. He has rhetoric.

              And I'm a computer science student; I know what I'm talking about.
              Last edited by WillyThePimp; 13 February 2012, 09:59 PM. Reason: Added " the"



              • #17
                Originally posted by WillyThePimp View Post
                32 euros to manufacture, but the added PCB routing, complexity, soldering, and quality assurance surely push it over 100 euros if you want to make some $$ off that. By any chance, could you explain to us the need for 64 bits in the mobile landscape? If such a need existed, why aren't we using MIPS as of now? What are the advantages of 64 bits over 32 bits? Does it make sense to trade off a feature that isn't a feature in a mobile context for added power consumption?

                And just listing, off the top of my mind, workloads which could benefit from 64-bit memory addressing: how important is it for scientists working on large simulations, let's say in quantum chemistry, to compute them on a mobile tablet? Would a 3D artist be rendering something on his/her tablet? Heck, and I ran out of examples.

                Btw, I vote for you being banned. The other guy, I like him much more than you. He has rhetoric.

                And I'm a computer science student; I know what I'm talking about.
                There are some advantages to the x86_64 architecture over x86. That cannot be said in general for all architectures, though; stating that a generic ARM CPU must be 64-bit or it sucks is just stupid.



                • #18
                  Originally posted by smitty3268 View Post
                  There are some advantages to the x86_64 architecture over x86. That cannot be said in general for all architectures, though; stating that a generic ARM CPU must be 64-bit or it sucks is just stupid.
                  Neither am I implying that there is a need for it to be so. I develop software, and my 3GB laptop suits me more than well enough. If I ever step up from my current work to a more complex and demanding environment, so will my tools. Tablets and mobiles are used mainly by socialites, businessmen, public communicators and people who don't do interesting things. I have a hard time programming, or even writing anything long, on a 7" touchscreen. There is no such thing as a mobile proton collider to take along with your tablet for your scientific needs, touch panels aren't reliable enough for any (serious) artist (I can't paint shit on one the way I do on paper), etc.



                  • #19
                    Originally posted by Qaridarium
                    you are just comparing apples with bananas. you can not compare a 32-bit cpu with a 64-bit cpu!
                    i'll wait; in 2014 arm will release their 64-bit version, and then we can compare without the 32-bit fake!
                    for me 32-bit is a joke; i like systems with 32gb of ram and more, not 4gb of ram.
                    and no, PAE is just fake! if you compare arm+PAE vs native 64-bit, then ARM loses all benchmarks, only because of PAE.

                    this means i don't believe your performance/watt argument, because i only count native 64-bit apps.
                    Q, you neglect some "details" of the x86 architecture, namely its huge legacy baggage from the late 70s and early 80s. The most obvious part of this baggage is the 16-bit mode of operation that your latest-gen 64-bit CPU still possesses. This makes for some pretty complex logic for stuff that is never used by modern applications, like BCD arithmetic.

                    The good news is that with the BIOS fading away and being replaced by EFI (and hopefully coreboot), future generations of CPUs will no longer need to implement the 16-bit mode. But if, by the time ARM comes out with 64-bit CPUs, x86 still has 16-bit real mode, you can bet the 64-bit ARM CPUs will be massively faster per watt than 64-bit x86 CPUs.

                    And even if x86 does drop the obvious baggage, ARM CPUs will probably still be significantly faster per watt than x86 because of the bad influence that the early 16-bit design has had on the later x86 ISAs, mostly the 32-bit one, but to a lesser degree also the 64-bit one.

                    Probably what both AMD and Intel are betting on is that GPGPU will become so widespread in the future that the actual CPU will become mostly irrelevant for any computation-intensive task, so the inefficiency of the x86 architecture will go unnoticed.



                    • #20
                      Originally posted by Ansla View Post
                      Q, you neglect some "details" of the x86 architecture, namely its huge legacy baggage from the late 70s and early 80s. The most obvious part of this baggage is the 16-bit mode of operation that your latest-gen 64-bit CPU still possesses. This makes for some pretty complex logic for stuff that is never used by modern applications, like BCD arithmetic.

                      The good news is that with the BIOS fading away and being replaced by EFI (and hopefully coreboot), future generations of CPUs will no longer need to implement the 16-bit mode. But if, by the time ARM comes out with 64-bit CPUs, x86 still has 16-bit real mode, you can bet the 64-bit ARM CPUs will be massively faster per watt than 64-bit x86 CPUs.
                      BTW, ARM has a 16-bit mode too: it's called Thumb. They just occasionally break backwards compatibility, introducing Thumb-2 and so on, which x86 can't really do.

