NVIDIA Proposes The Linux Hardware Timestamping Engine


  • #11
    Originally posted by timofonic View Post
    Is it just me, or is this just a disguised plan for stronger DRM?
    This has nothing to do with DRM - I'm not sure what exactly they need such accurate timestamps for, but I guess we'll find out.

    Comment


    • #12
      Originally posted by sandy8925 View Post
      I'm not sure what exactly they need such accurate timestamps for, but I guess we'll find out.
      Sensor data. Accurate timestamps are very important for realtime control applications, such as robotics.

      Comment


      • #13
        Originally posted by coder View Post
        Sensor data. Accurate timestamps are very important for realtime control applications, such as robotics.
        Yeah, I thought it might be for some realtime use case, but with this much accuracy...

        Maybe it's for peripheral hardware validation? Or driver debugging/profiling?

        Comment


        • #14
          Originally posted by sandy8925 View Post
          Yeah, I thought it might be for some realtime use case, but with this much accuracy...

          Maybe it's for peripheral hardware validation? Or driver debugging/profiling?
          I don't know why you seem to feel it's unnecessary.

          I can tell you from experience that software timestamping sucks. Especially if you're not using a real RTOS with bounded latencies.
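
          The point above can be demonstrated directly. Here's a minimal sketch (my own illustration, not from the proposal) that reads the monotonic clock in a tight loop and measures how much the interval between consecutive software timestamps varies. On a general-purpose kernel, scheduling, interrupts, and cache effects make the deltas jitter; a hardware timestamping engine latches the time at the pin instead, so the reading doesn't depend on when the CPU gets around to it:

          ```python
          import time

          # Take a burst of purely software-side timestamps.
          samples = [time.monotonic_ns() for _ in range(10_000)]

          # Delta between consecutive readings; ideally near-constant,
          # in practice scattered by kernel scheduling and interrupts.
          deltas = [b - a for a, b in zip(samples, samples[1:])]

          jitter_ns = max(deltas) - min(deltas)
          print(f"min: {min(deltas)} ns  max: {max(deltas)} ns  jitter: {jitter_ns} ns")
          ```

          Run it on a busy desktop and the worst-case delta is typically orders of magnitude above the best case, which is exactly the unbounded latency you can't tolerate without an RTOS or hardware assistance.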

          Comment


          • #15
            Originally posted by coder View Post
            I don't know why you seem to feel it's unnecessary.

            I can tell you from experience that software timestamping sucks. Especially if you're not using a real RTOS with bounded latencies.
            Yeah, but this is like nanosecond or microsecond level accuracy and precision. Does robotics need that much accuracy and precision?

            Comment


            • #16
              Originally posted by sandy8925 View Post
              Yeah, but this is like nanosecond or microsecond level accuracy and precision. Does robotics need that much accuracy and precision?
              Just because the units are small doesn't tell you anything about Nvidia's purpose.

              If I were promoting such a standard for general purpose usage, I'd probably suggest the highest precision I thought was practical, so that it could also be used for things like scientific instruments and other applications I hadn't even thought of.

              Comment


              • #17
                Originally posted by coder View Post
                Just because the units are small doesn't tell you anything about Nvidia's purpose.

                If I were promoting such a standard for general purpose usage, I'd probably suggest the highest precision I thought was practical, so that it could also be used for things like scientific instruments and other applications I hadn't even thought of.
                Hm, good point. That makes sense.

                Comment
