Intel Ramping Up Their Investment In Blender Open-Source 3D Modeling Software


  • #11
    Tying together related posts:

    Originally posted by kpedersen View Post

    However I am getting a little worried that Blender is getting too big and professional for my uses. I am not in any way a decent 3D modeler but I do need to use it occasionally to make test data for my projects. If it gets too complex and requires a professional workstation; this is obviously not going to work.

    ----

    I guess I am also worried that they will drop OpenGL 3.x support simply because "Professional users are using professional workstations capable of at least OpenGL 4.x". Kind of the same culture as to why UE4 or Maya are not useful to me.
    I can’t speak for UE4, but this isn’t accurate with Maya. There are three viewport backends, sans Legacy: DirectX on Windows; OpenGL Core Profile (Compatibility), covering OpenGL 2.1 and 3.2+, on Windows and Linux; and OpenGL Core Profile (Strict), OpenGL 3.2+ only, on Windows, Linux, and macOS. Nowhere is OpenGL 4 a requirement. Running Maya on Linux in the first place is a different story given how licensing works right now. And Blender's dropping of OpenGL 2 was done specifically because it was holding back the major overhaul of the viewports the developers were doing.

    My perspective on the matter from the Anim/VFX industry side:

    As someone who works in the animation industry (at animation studios), everything that’s been happening with Blender has been a marked improvement for usability and adoption, but it still has a ways to go. As for professionalism: all of the commonly used core DCCs, e.g. Maya, Max, Houdini, Cinema 4D, and even Blender, host a ton of functionality in order to support a broad range of workloads and workflows. Every tool with these kinds of use cases will grow in size, code, and complexity no matter what; you can't pick one feature set and just refine it unless you want to be a specialist tool. Other than requiring actual GPUs, because Intel UHD graphics... well... suck for this kind of work, these apps will usually work on most hardware that isn't a potato. The official requirements exist from a support perspective: they describe where you will find the most optimal results, and how much time the vendors are willing to spend with you on solving a problem, since that actively costs them money.

    When it comes to those other DCCs, they also have different target audiences. Maya and Houdini (originally Prism) are geared primarily towards VFX/anim studios, C4D towards smaller groups and motion designers, and Max towards VFX studios (to a lesser degree) and archviz groups. And, at least in the case of Maya and Houdini, the versions of the apps used in the larger studios that can sway Autodesk's and SideFX's priorities are not the off-the-shelf ones that others buy. They are heavily modified and customized (not talking about plugins), e.g. Disney has Autodesk engineers onsite, and 90% of the time no one outside that studio will ever see what's there unless they gift it to a vendor.

    Blender, from the start, was targeting solo end users for the most part, and for a long time Ton made a lot of anti-studio/general-industry decisions that went against what would have helped earlier adoption by studios. Outside of Blender Institute, Tangent Animation, and Barnstorm VFX, there really aren't studios using Blender, and if there were, they would be doing their own work on it and then considering making it available upstream once their devs got through legal. It would then be up to the Blender team to decide whether to accept that addition/request, weighing whether it benefits the most users possible rather than a small select group the feature would then have to be supported for; studios are more than capable of doing that on their own. That being said, the Blender devs have been doing some things that make adoption by studios much easier and cleaner, but Ton and co. are not losing sight of who their main audience is.

    Having the support of all these vendors on the hardware side, software features that help significant numbers of users, and tighter, more professional first-class support from third-party tools is a huge bonus for Blender that, for all intents and purposes, shouldn't affect what you do with the app. Just due to the nature of Blender, I don't see it following the same path as other DCC tools.

    Summation: I really wouldn't worry.

    Cheers,
    Mike
    Last edited by mroche; 03-27-2020, 08:36 PM.



    • #12
      Originally posted by kpedersen View Post
      This is obviously great news for Blender.

      However I am getting a little worried that Blender is getting too big and professional for my uses. I am not in any way a decent 3D modeler but I do need to use it occasionally to make test data for my projects. If it gets too complex and requires a professional workstation; this is obviously not going to work.

      Alternatives I am looking into are Wings3D for the models and a slightly hacked up GtkRadiant to export and bake lighting to .obj.

      I would just look into maintaining an old version but because they have Python interpreters and things like that, it becomes too difficult to keep fixing breakages.

      Also interested in AC3D but it is proprietary and only works on a limited number of operating systems. I suppose Wings3D is a bit annoying because it drags in a JVM required to run.
      Blender is not going to require OpenGL 4.x or similar anytime soon; it will stay on OpenGL 3.3 for the foreseeable future. What developers have stated quite a few times on the forums and the dev forum is that they will support OpenGL and Vulkan (in the 2.9 or 3.x series; nothing is written in stone at this time), since Apple has deprecated OpenGL and OpenCL and is not going to fix any bugs in their OS. There are no plans for Metal anytime soon, though there were some external efforts that didn't work out in the end. It's not an easy problem to solve. But fear not: if you're on Linux, an Intel HD 3000 will work just fine, albeit slowly (since OpenGL 3.3 on pre-Haswell Intel is mainly software rendering anyway).
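      If you want to check whether a given machine clears that OpenGL 3.3 bar, you can parse the relevant line of `glxinfo` output. A small sketch (assuming `glxinfo` from mesa-utils; the exact label can vary slightly between drivers):

```python
import re

def gl_core_version(glxinfo_output: str):
    """Extract (major, minor) from the 'OpenGL core profile version string' line."""
    m = re.search(r"OpenGL core profile version string:\s*(\d+)\.(\d+)", glxinfo_output)
    return (int(m.group(1)), int(m.group(2))) if m else None

def meets_blender_requirement(glxinfo_output: str) -> bool:
    """Blender 2.8x wants OpenGL 3.3 or newer."""
    ver = gl_core_version(glxinfo_output)
    return ver is not None and ver >= (3, 3)

# On a live system, feed it the real output, e.g.:
#   meets_blender_requirement(subprocess.check_output(["glxinfo"], text=True))
```

      Tuple comparison does the right thing here: (3, 2) fails against (3, 3), while (4, 6) passes.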

      Also, "Blender is getting more complex to use" is not exactly true, but not false either. Devs are just adding features and trying to make Blender more "industry standard" compatible, and that requires learning certain things and relearning others. YMMV.



      • #13
        Originally posted by Calinou View Post

        What about Dust3D and TrenchBroom?
        Hmm, TrenchBroom currently looks a little primitive compared to the older GtkRadiant, however Dust3D looks pretty good. Thanks!

        Originally posted by stargeizer View Post
        Also, "Blender is getting more complex to use" is not exactly true, but not false either. Devs are just adding features and trying to make Blender more "industry standard" compatible, and that requires learning certain things and relearning others. YMMV.
        I did mean more complex to maintain / port (i.e. for the operating systems I use). Complexity of use does not really faze me because I generally do very basic things. For complex things, such as generating light maps / baking, I will always use the (slightly bizarre) Python API, because then I can reuse the script rather than remembering the process. So long as they don't break that (which they did with Eevee; I used to use the Blender Render engine to make lightmaps), I don't really need to open Blender to use it.

        (Simple example script here: https://github.com/osen/raptorbakery)

        Up until recently we used a hacked-up version of qrad, the Q3 lighting tool. So we are happy to update in time, haha.
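        For anyone curious what that scripted-baking workflow looks like, here is a minimal sketch. The object and image names ("Room", "lightmap") are placeholders, it assumes Cycles as the bake engine, and it only runs inside Blender (e.g. `blender --background scene.blend --python bake_lightmap.py`):

```python
# bake_lightmap.py -- only runs inside Blender:
#   blender --background scene.blend --python bake_lightmap.py
import bpy

# Placeholder name; substitute the object you want to bake.
obj = bpy.data.objects["Room"]
bpy.context.view_layer.objects.active = obj
obj.select_set(True)

# In 2.8x+ only Cycles can bake (Blender Render is gone, Eevee can't bake).
bpy.context.scene.render.engine = 'CYCLES'

# Bakes land in the active Image Texture node of the object's material.
img = bpy.data.images.new("lightmap", 1024, 1024)
nodes = obj.active_material.node_tree.nodes
tex = nodes.new('ShaderNodeTexImage')
tex.image = img
nodes.active = tex

# Bake direct + indirect diffuse lighting, then write the result to disk.
bpy.ops.object.bake(type='DIFFUSE', pass_filter={'DIRECT', 'INDIRECT'})
img.filepath_raw = "//lightmap.png"
img.file_format = 'PNG'
img.save()
```

        The exact bake settings (margins, UV layers, sample counts) shift between releases, so treat it as a starting point rather than a drop-in script.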

        Originally posted by mroche View Post
        I can’t speak for UE4, but this isn’t accurate with Maya. There are three viewport backends, sans Legacy.
        That's interesting. I wish open-source projects like Blender would architect their solutions to allow for "legacy" viewports; that might be a limited-manpower issue, however.
        That said, we simply couldn't use Maya on a number of our systems. The 32-bit ones were out instantly, obviously, but it also refused to open, citing an OpenGL version error, on our Intel GMAs. I am pretty sure the laptops in question were the GMA 3000 (ThinkPad X220).
        Last edited by kpedersen; 03-28-2020, 06:58 AM.



        • #14
          Hilarious, Blender gets a fortune per year and doesn't even have a competitive game engine, LOL!

          Also, whoever said "complexity and size aren't tied together" is like saying the engine of an oil tanker isn't "complex" or that a 747 isn't "big". Those things are big AND complex!
          So the two are obviously tied together, just not always necessarily so.



          • #15
            My personal opinion is that Intel is better off investing in making crap-free HW that isn't full of bugs, backdoors, and quirks. It's quite odd when a chipmaker faces a ton of woeful bugs in its main field of expertise, yet takes on something completely unrelated (alas, Intel has never been any good with software). Next time try to improve hamburgers or beehives maybe, dear Intel engineers?



            • #16
              Originally posted by SystemCrasher View Post
              My personal opinion is that Intel is better off investing in making crap-free HW that isn't full of bugs, backdoors, and quirks. It's quite odd when a chipmaker faces a ton of woeful bugs in its main field of expertise, yet takes on something completely unrelated (alas, Intel has never been any good with software). Next time try to improve hamburgers or beehives maybe, dear Intel engineers?
              Intel turns over 72 BILLION dollars per year; I don't think 30k euros would put them better, worse, or any way off at all!
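              Taking those figures at face value (and ignoring the currency difference), the back-of-the-envelope ratio:

```python
donation = 30_000            # EUR, the figure quoted above
revenue = 72_000_000_000     # USD, Intel's annual revenue as quoted above

share = donation / revenue   # fraction of one year's revenue
print(f"{share:.10f}")       # about 0.0000004167, i.e. roughly 0.00004% of revenue
```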



              • #17
                Well, my point is that they take on some weird side activity while there are clearly more pressing matters, like their CPUs being goddamn full of bugs. The attempts to work around those bugs in software look very odd and screw performance to hell. I don't get why totally different people have to fix crap after Intel while Intel throws itself at such odd activities instead of chewing on the more pressing matters. And if they really want to get sidetracked, I guess they can manufacture some ventilators or so. Well, hopefully without backdoors and remote management; that sounds goddamn scary.



                • #18
                  Originally posted by kpedersen View Post

                  That said, we simply couldn't use Maya on a number of our systems. The 32-bit ones were out instantly obviously but it also did not open citing a OpenGL version error for our Intel GMAs. I am pretty sure the laptops in question were the GMA 3000 (Thinkpad X220).
                  Maya has some very specific hardware and driver requirements in order to appropriately display and manage its color management pipeline (if that's the error you were seeing). It requires the NVIDIA or AMD drivers (for AMD, I'm not sure if AMDGPU-Pro is required or if standard AMDGPU works; I don't think Mesa does, but I haven't ever been on an AMD Linux workstation). Intel graphics are not supported, with the possible exception of the Intel Iris Pro found on many MacBook Pros, and even then it would be dog slow. The app itself should still work, but any color-managed functionality (e.g. VP2) will be disabled. If Maya was flat-out refusing to launch at all, then I don't know.

                  Cheers,
                  Mike
