Originally posted by frign
uint32_t days, hours, minutes, seconds;      -> ARM Cortex: 384 cycles
uint8_t hours, minutes, seconds;             -> ARM Cortex: 434 cycles
uint_fast8_t hours, minutes, seconds;        -> ARM Cortex: 384 cycles
So I assume that on the ARM Cortex, uint_fast8_t simply maps to uint32_t, because that is the fastest type holding at least 8 bits on this platform.
About the compiler stuff, I just thought the compiler could do the same thing: when you declare an int, it would be free to use 16, 32, or 64 bits for it, depending on whatever is fastest for the target. It couldn't do that across any interface, because of binary compatibility, but it could do it for internal variables, I guess. After all, uint16_t is guaranteed to be exactly 16 bits, but uint_fast16_t, short, and int are not. But those are only random thoughts.