That Open, Upgradeable ARM Dev Board Is Trying To Make A Comeback


  • Originally posted by duby229 View Post

    I think you misunderstand.
    quite likely...

    EDIT: And the smacking-Intel-silly thing - yeah, a fly buzzing around an elephant seems like an appropriate metaphor. Essentially exactly what I said: I don't have the power or standing to do anything about it, although I wish I could. I said one metaphor, you said a different one that means exactly the same thing.
    doh well, the thing is - my point is - it was some time about 5 years ago that i suddenly went, "y'know what? i see a lot of people - including myself - being totally resigned and feeling powerless. i don't *accept* that i am powerless. so i *am* going to do something about this, starting now" and that's when i began to think of a strategy to bootstrap up to an independent ethical computing business. i don't consider *anything* to be beyond the realm of possibility: the pieces of the puzzle are out there, it's just a matter of assembling them.

    for example, there have been a number of attempts to put together an open hardware GPU design. the ones i can currently find are the ORSOC graphics accelerator (2D), the GPLGPU, and the MIAOW. the MIAOW is an OpenCL-compliant engine, not a full GPU. i did find one that was based around integers and the guy even ported OpenGL to integer arithmetic for it, but that was 5 years back.

    so, the pieces are coming together... now all that's needed is to back-to-back that into a business deal with a pre-existing successful fabless semiconductor company, justifying it by explaining to them that they could save money by *not* having to license proprietary hard macros, gain better long-term stability and success in their product line, and so on.

    my point is: once you take the blinkers and the brakes off (fear, resignation, sarcasm) it's like, "oh... errr... is that all that needs to happen?"



    • I beg Intel to make something equal to or better than Iris Pro its minimum standard. It would be so good for everyone, including your fan base.



      • Originally posted by lkcl View Post

        quite likely... [...] my point is: once you take the blinkers and the brakes off (fear, resignation, sarcasm) it's like, "oh... errr... is that all that needs to happen?"
        I'm picturing the examples in my mind while searching Google, and it seems fantastic. Your mindset is awesome.

        EDIT: I'm still educating myself on the technology you mentioned there, but I want to say something. I'm very impressed. I think you can go very far. Don't lose that state of mind, because it really is awesome.
        Last edited by duby229; 11 July 2016, 04:53 PM.



        • Originally posted by duby229 View Post

          EDIT: I'm still educating myself on the technology you mentioned there, but I want to say something. I'm very impressed. I think you can go very far. Don't lose that state of mind, because it really is awesome.
          the only risk is i get accused of being hugely egoistic. that's happened before (years ago, many times)... and yet i'm still here. i believe, now, that my 20-year track record of always working in the public eye - i actually get nervous when talking *privately* with people about software libre and hardware development - speaks for itself.



          • OK, it might be possible to put up with the limitation that it's not the fastest ARM out there, in exchange for running mostly free software. But it really chafes me that you design the board such that perfectly good Ethernet and SATA controllers can't be used. I wouldn't be satisfied with that 1366 screen either - why not full HD? Those are not so hard to find. There's at least no actual limitation to running a full HD screen from your digital video pins, I hope? But I don't know what sort of interface chip is needed to go from your video bus to whatever the LCD needs (LVDS?). And yet the A20 already has LVDS output too.

            What about multiple cards on a bus? A tablet or laptop could have space for maybe 4 of these cards. If you want to multitask heavily (or just to compile big software packages faster) you could put ARM cards in all the slots. Or install more specialized hardware in some of them.

            I think that one way for free software to take back the GPU is to design an open GPU on an FPGA. Granted, most FPGAs don't have open tools available - that's a big limitation now. But it might change. Whether non-free tools need to be used or not, a successful GPU design could be turned into an ASIC pretty easily, and then we'd have our fully open GPU without needing to ask some other company to design it. So I keep thinking I want to work on that, I just don't get around to climbing the learning curve for it, so far. I'm not sure if it's the best use of my time, and it would sure take a lot of it.

            Anyway, it's kind of a problem if your pinout doesn't have a high-speed data bus for connecting to such things. You're spending a lot of pins on parallel digital video. I guess we'd have to get by with SPI or I2C or USB for interfacing between cards? I wouldn't want the bus to be a master-slave architecture like USB would have it be.

            I don't care about gaming much, but I care about the GPU, because as a Qt Quick developer, I've seen that a scene graph is a useful way to render even mundane 2D UIs. If nothing else, you need a way to do smooth scrolling. Even if all you have is a fullscreen text terminal or a web page, you never want to expect the CPU to re-send every pixel to the display at its full refresh rate (usually 60 FPS) while scrolling - but if you don't, it doesn't feel smooth. It's just not the optimal use of CPU power: that's what GPUs are made for. And there are uses for smooth scaling of fonts, other kinds of animations besides smooth scrolling, etc. So I think I'd rather run a blob for a while than have no GPU, and just hope that since the GPU only receives and displays data rather than doing two-way communication, it's not a very apt spying platform. And if whoever is still interested in opening up the Mali wants to run a crowdfunding campaign, I'd contribute to that.
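
            To illustrate the point (just a minimal sketch, assuming Qt 5.12 or later with the QtQuick module installed; nothing here is from any actual project): the ListView below scrolls by having the scene graph translate retained GPU-side nodes, rather than the CPU repainting every pixel each frame.

            Code:
            // minimal Qt Quick sketch: a flickable list rendered through the scene graph,
            // so scrolling is a GPU-composited transform instead of a per-frame CPU repaint
            #include <QGuiApplication>
            #include <QQmlApplicationEngine>

            int main(int argc, char *argv[])
            {
                QGuiApplication app(argc, argv);
                QQmlApplicationEngine engine;
                engine.loadData(R"(
                    import QtQuick 2.12
                    import QtQuick.Window 2.12
                    Window {
                        visible: true; width: 480; height: 640
                        ListView {                 // scene-graph item: flicking just
                            anchors.fill: parent   // translates retained GPU nodes
                            model: 1000
                            delegate: Text { text: "line " + index; font.pixelSize: 24 }
                        }
                    }
                )");
                return app.exec();
            }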

            Not sure about the heat management either. You want to use the metal case as a heat spreader, I guess, but will it have good enough thermal contact with the chip? What about further dissipation of the heat after it gets spread across that card case? The card is fully inside an enclosure which insulates more than it conducts heat (plywood and/or plastic). So you're afraid of any chip that needs a heatsink, which rules out all the powerful ones, and that becomes a permanent design limitation.

            What I would try to do is design the computer to have the CPU card(s) mounted on an aluminum heatsink plate, with some sort of flat (comfortable on your lap) fins on the outside, and an array of tapped holes on the inside, like an optical table (but not too thick). That beats plastic and bamboo: it does heat dissipation and makes the case really sturdy at the same time. (An FPGA needs the heatsink even more than the CPU does.) But maybe you'd need shims between the SoCs and the back plate, because probably something on the board is taller than the chip, and you want to be able to replace the shims when upgrading to some future board which has the chips in different places. So the board should bolt down to the back plate, with replaceable spacers at the bolt holes, and replaceable shims.

            There would be high-density surface-mount connectors on the backs of the boards, or gold-finger edge connectors. Interconnects could be flex circuits or ribbon cables. So you try to have a fixed bus design for a generation or several, but with the expectation that any bus becomes obsolete eventually; then the interconnects could be replaced during some future upgrade, while still being able to reuse the back plate itself (and the rest of the tablet or laptop that's built around it). This design could already be executed with the Dragon boards, for example.

            Are you really sure what the lifetime of that A20 chip is going to be?



            • G2D, which is supported by the xf86-video-driver-fbturbo ... the Libre Tea Computer Card will go out with 2D accelerated X11 by default
              OK, but X11 is nearing obsolescence. Shoehorning 2D drawing through those antiquated APIs is being done less and less over time. I'd be wanting to run OpenGL-based apps on Wayland... there's another use for that GPU, to do window compositing. And does the A20 also have compositing hardware over and above the hardware necessary for OpenGL? Some ARMs do.

              The efforts towards open drivers for GPUs are really important, IMO.

              I have an AIO Zeus 3D printer, which runs on a quad-core ODROID board. They also don't make use of the GPU for some reason, even though they are running Ubuntu, and there is a package for the proprietary drivers that they could just install and start using. I don't know whether being proprietary was what stopped them or not. So it has kind of a lousy UI: all tapping, no dragging or flicking. Another big waste of perfectly good silicon.



              • ouaaaa, ecloud - this is an enormous number of issues and questions, so let me go through them as quickly as i can, as i have a lot to do.

                Originally posted by ecloud View Post
                OK, it might be possible to put up with the limitations that it's not the fastest ARM out there, in exchange for running mostly free software.
                entirely free software, as it turns out - but that misses the point, which i deliberately emphasised by mentioning the passthrough card, the FPGA card and the debian-based distro, as well as the fact that there are nearly a dozen different distros for the A20, including l4android, l4linux and FreeBSD. sorry, i mixed the points there, but you get it - please look up the answers on other forums

                But it really chafes me that you design the board such that perfectly good ethernet and sata controllers can't be used.
                i cover this in the questions section - look on the crowdfunding page and search for "SATA". a more detailed response is at http://rhombus-tech.net/crowdsupply/

                I wouldn't be satisfied with that 1366 screen either - why not full HD? Those are not so hard to find.
                the screens *themselves* aren't hard to find, but $2.50 and $3-to-$7 SoCs that can do 1080p definitely are. the jz4775 will just about do 1080p RGB/TTL, but the memory bandwidth available to its GPU on the internal memory bus is limited enough that it can only manage 720p video playback. there are many more examples like this.

                i've answered this question already: there is a 3.3mm variant of the standard which will require costly tooling (plus 0.8mm PCBs instead of 1.2mm), so i am leaving that for later.

                There's at least no actual limitation to run a full HD screen from your digital video pins I hope? But I don't know what sort of interface chip is needed to go from your video bus to whatever the LCD needs (lvds?). And yet the A20 already has LVDS output too.
                the standard has to be able to honour low-cost SoCs in low-cost devices. this is covered in the ecocomputing whitepaper. summary: if you forced the standard to be LVDS, a 320x240 LCD would require a $1.50 converter IC to be added to the cost of a $2 LCD - that's MASSIVELY disproportionate. however, if you have a 1366x768 LCD at around $25, adding $1.50 is not a big deal.

                remember i've been designing this standard for 5 *years* to have a lifetime of at least *10* years.

                What about multiple cards on a bus? A tablet or laptop could have space for maybe 4 of these cards. If you want to multitask heavily (or just to compile big software packages faster) you could put ARM cards in all the slots.
                great idea - it's better suited to a build farm / low-power server farm etc. the thing about putting them in a laptop is, you have a 5W budget per card, so now you're looking at 40-50W: that's "fans and metal cases" territory, with an estimated budget somewhere around $50k to $70k, possibly even more.

                i've done what i have done on a reasonable budget that one medium-to-small-sized business could sponsor.


                Or, install more specialized hardware in some of them. I think that one way for free software to take back the GPU is to design an open GPU on an FPGA.
                several projects like this have sprung up, with varying success. google them - the names are mentioned above (2-3 posts up).

                Granted, most FPGAs don't have open tools available - that's a big limitation now. But it might change. Whether non-free tools need to be used or not, a successful GPU design could be turned into an ASIC pretty easily,
                it's a lot of money. and you need special tools. or a university to help. it's a lot of work.

                and then we'd have our fully open GPU without needing to ask some other company to design it. So I keep thinking I want to work on that, I just don't get around to climbing the learning curve for it, so far. I'm not sure if it's the best use of my time, and would sure take a lot of it. Anyway, it's kindof a problem if your pinout doesn't have a high-speed data bus for connecting to such things.
                it's a *great* thing that it doesn't have ultra-high-speed data buses. this is so that an average "maker" engineer can consider getting a 2-layer PCB designed up; the only thing they have to be concerned about is the USB differential pairs.

                You're spending a lot of pins on parallel digital video. I guess we'd have to get by with SPI or I2C or USB for interfacing between cards? Wouldn't want the bus to be a master-slave architecture like USB would have it be.
                correct.

                I don't care about gaming much, but I care about the GPU, because as a Qt Quick developer, I've seen that a scene graph is a useful way to render even mundane 2D UIs. [...] And if whoever is still interested in opening up the Mali wants to run a crowdfunding campaign, I'd contribute to that.
                surprisingly you're not the only person to mention this.

                Not sure about the heat management either. You want to use the metal case as a heat spreader I guess,
                just did an update where i cover the laptop's internal design in some depth - i'll leave it with you to read. in short: the A20 at a max of 2.5 watts is fine; a 3.5 watt CPU Card is fine; 4 watts needs graphite paper; 4.5 watts would need the case to be sealed and flooded with thermal gel. the metal case contacts the PCB and the metal of the keyboard. heat dissipation has been considered. an aluminium plate has already been considered and will be investigated if needed for PCB3 (the power board).

                [edit sorry forgot link https://www.crowdsupply.com/eoma68/m...printing-parts ]
                Are you really sure what the lifetime of that A20 chip is going to be?
                the A20 is extremely popular because of the overwhelming amount of OS support. allwinner have already told me it's not going away any time soon. despite the fact that it wasn't their work which made it popular, it's strategically important to them.

                ok, sorry to have to be so brief, there was a lot to cover.
                Last edited by lkcl; 11 July 2016, 09:13 PM.



                • $2.50 and $3 to $7 SoCs that can do 1080p definitely are
                  But the A20 can, right? At least the datasheet says so.

                  3.3mm variant of the standard which will require costly tooling
                  Sorry, I missed what that's related to? What is 3.3mm thick?

                  i've been designing this standard for 5 *years* to have a lifetime of at least *10* years
                  Well, the last time I bought a laptop with such a low resolution (1280x768) was about 10 years ago, and only because I was in too much of a hurry: I needed a fast machine that had a PCMCIA slot (not ExpressCard, which most had switched to), and grabbed one at Best Buy. (Dell's high end had already been 1920x1200 for a few years by then, and I had one at work.) I regretted the low resolution immediately when I tried to do any work on it. And 10 years from now? I'd be surprised if 1366x768 still exists. I'm surprised it still does today.



                  • Originally posted by ecloud View Post

                    But the A20 can, right? At least the datasheet says so.
                    yes, the A20 can, but the ingenic jz4775 ($3), the M150 ($2.50), the IC1T ($2) and many others which qualify for inclusion in the EOMA68 "Type II" 5.0mm variant of the standard cannot.

                    Sorry I missed what is that related to? What is 3.3mm thick?
                    read the EOMA68 standard and/or the whitepaper. there's a "Type I" variant which is 3.3mm thick that goes up to 1920x1080. best explained in the white paper.

                    Well the last time I bought a laptop with such low res (1280x768) was about 10 years ago [...] And 10 years from now? I'd be surprised if 1366-resolution still exists. I'm surprised it still does today.
                    i have a macbook pro with a 2560x1600 13in screen. i immediately deleted the OS and replaced it with debian. i got it *NOT* because of the OS, nor because it was "apple", but because i expect to still be running this machine in another 3-5 years' time (i've owned it for 3 already). i can fit TEN xterms on one screen. i run TWENTY FOUR virtual screens under fvwm2. other people freak out at how small the text is, but it only took me 4 hours to get used to it.

                    basically: we're developers. we can't cope if the information (source code) isn't available to read on as many screens as we can fit around us. i actually had FOUR monitors on one machine i was working on - i got a UD160A USB-to-VGA adapter so i could have that extra screen space.

                    but for everyone else - those who are doing just some email, a bit of internet browsing, and editing word documents? no. 1366x768 is not only tolerable but actually desirable at large screen sizes, because it gives them larger and clearer text.

                    1366x768 will persist because the cost will come down significantly. it's to do with the way that LCDs are made. not many people realise that a 1366x768 LCD contains SIX MILLION transistors. it's a HUGE Integrated Circuit. there are about 1 million pixels (RGB squares), which means around 3 million individual R/G/B cells, and there are 2 transistors per cell so that the liquid crystal's opacity can be altered.
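
                    the back-of-the-envelope arithmetic (just a sketch re-deriving the figure above, assuming the 2-transistors-per-cell count from the paragraph):

                    Code:
                    // rough check of the "six million transistors" figure for a 1366x768 panel
                    #include <cstdio>

                    int main()
                    {
                        const long pixels      = 1366L * 768L;   // ~1.05 million RGB squares
                        const long cells       = pixels * 3;     // ~3.1 million R/G/B cells
                        const long transistors = cells * 2;      // 2 per cell: ~6.3 million
                        std::printf("pixels=%ld cells=%ld transistors=%ld\n",
                                    pixels, cells, transistors);
                        return 0;
                    }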

                    LCDs are made by forcing liquid crystal in at the top under EEENNNORRRMOUS pressure. it flows down between the glass layers, but sometimes doesn't make it all the way through. if that happens, the glass has to be smashed and recycled. so the cost is *NOT* down to the actual cost of the glass, it's down to the number of times that the glass has to be RECYCLED before you get a success.

                    the amount of space that the transistors take up in each pixel therefore has a dramatic effect on how easily the liquid crystal will flow down and into the next cell. so the lower the geometry (65nm, 28nm), the smaller the transistor. the smaller the transistor, the more likely that the glass won't have to be smashed and recycled.

                    also, the larger the pixels, the more likely the chance of success....

                    ... see where this is going?


                    so we're currently down to a sale price of around $25 in volume for 1366x768 15in LCDs, which is peanuts! and if it's low cost, then it's popular.

                    so if you look up 1366x768 15.6in LCDs on http://panelook.com you'll find that the quantities being sold are ENORMOUS.

                    basically these LCDs allow manufacturers to sell laptops for around the $300 mark, which puts them into the "student" and "low income" affordability range.

                    make sense? you and i aren't the target market for this EOMA68 15.6in Eco-Laptop. we're developers. we have completely differently-wired brains. i can LITERALLY see the individual pixels on my old 1920 x 1200 24in LCD, now, thanks to using this retina screen. by "individual pixels" i mean i can see and clearly distinguish the INDIVIDUAL RED GREEN AND BLUE pixels from a distance of 2 feet.

                    i can also tell you what the refresh rate of CRT monitors is, up to around 75 hz, being able to distinguish between 43, 50, 60, 72 and 75 hz. i can also see the INDIVIDUAL FLICKERING of LED-based car headlights and tail lights, even up to around 200 hz. LED christmas lights are a bloody nightmare

                    developers *literally* have hyper-sensitive visual cortices that are waaay more developed than the average person's, and, fascinatingly, many developers are not even aware that their visual abilities drastically differ from the average.



                    • p.s. forgot to remind you, there's a MicroHDMI output on the A20 Computer Card. max resolution: 1920x1080. dual independent screen (xinerama) is perfectly possible.

