If you meant that as an additional question, then the answers to my
questions probably imply a reasonable answer to yours.
If you meant that rhetorically (to answer my question about endianness),
I would expect that a program processing binary traces of the complexity
we are contemplating generating (varying record types and lengths) would
either
- require metadata to describe records, which then might as well include
an identification of native endianness
- be smart enough to deduce it, perhaps based on a single magic number
similar to the ELF header word
- be sadly inflexible in these ways, in which case the trace data is likely
to consist of trivial fixed-size records, produced from a small number of
script probes/functions, which then might as well hard-code endianness
in the formatting directives.
I know, "not necessarily" is a comeback to all of the above, but I
think the generalities hold.
- FChE