


Re: Leading character for BINARY format ?


Hi Ian,

:      A customer reported a problem when using OBJCOPY to convert from
:      BINARY to ELF formats.  The problem was that the target ELF format
:      did not use an underscore prefix for user symbols, but the symbols
:      generated by the BINARY code did.  As a result they ended up with an
:      ELF executable with user symbols that had an underscore prefix.
:      Using the '--remove-leading-char' option to objcopy did not work
:      because the binary format does not define a leading character.
: 
:      The patch below solves this problem for the customer, but I am not
:      entirely sure that it might not have some unexpected side effects.  All
:      that the patch does is to change the specification of the leading 
:      character for the BINARY format from nothing to an underscore, so
:      that converting from BINARY to (ARM-) ELF with --remove-leading-char
:      will now strip the first character.
: 
: I don't wholly understand.  What does it mean to say that the symbols
: generated by the binary code use an underscore prefix?

It means that since the binary format does not have any symbols of its
own, the BFD code has to generate some symbols for it, and it chooses
to generate these symbols with a leading underscore.
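
For what it is worth, the names BFD invents follow a
_binary_<file>_start / _binary_<file>_end / _binary_<file>_size
pattern, so on an ELF target (which adds no prefix of its own) a C
file ends up referring to them roughly like this -- a sketch only,
with foo.bin as a made-up input file:

    /* Illustration only -- assumes the input file was named foo.bin
       and that the program is linked against the object objcopy
       produced.  foo.bin is just an example name.  */
    #include <stdio.h>

    extern char _binary_foo_bin_start[];   /* first byte of the data  */
    extern char _binary_foo_bin_end[];     /* one past the last byte  */

    int
    main (void)
    {
      printf ("embedded blob is %ld bytes\n",
              (long) (_binary_foo_bin_end - _binary_foo_bin_start));
      return 0;
    }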

: No matter what the symbols look like, the user should be able to
: refer to them directly, using an appropriate number of underscores,
: whatever that number is.

True.

: I don't think it is wrong for BFD to use leading underscores for
: these symbols, since they are system generated.

I agree.

: I don't understand why anything has to change in BFD.

The problem is that BFD generates these symbols with a leading
underscore, but in the BFD target vector it sets the
'symbol_leading_char' field to 0.  This seems to me to be incorrect.
I think that it should be set to '_'.
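
To spell out the relationship I mean, here is a self-contained model
only -- the real bfd_target structure has far more members and is
initialised positionally in bfd/binary.c, and (if I remember right)
the accessor consumers use is bfd_get_symbol_leading_char in bfd.h:

    /* Minimal model of the field under discussion, not the real BFD
       declarations.  */
    #include <stdio.h>

    struct model_target
    {
      const char *name;
      char symbol_leading_char;   /* 0 means "no leading character"  */
    };

    static const struct model_target binary_now     = { "binary", 0 };
    static const struct model_target binary_patched = { "binary", '_' };

    int
    main (void)
    {
      printf ("today: %d   with the patch: '%c'\n",
              binary_now.symbol_leading_char,
              binary_patched.symbol_leading_char);
      return 0;
    }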

: There is a known problem that the symbols appear differently when
: viewed from a.out C files and ELF C files, in that a.out C files
: expect a leading underscore and ELF C files do not.  That means that
: the same code doesn't work in both a.out and ELF, and the user must
: use some sort of #ifdef.  Is that the issue here?

No.  The problem is that it is not (currently) possible to strip the
leading underscores from the symbols generated by BFD for a binary
format input using the --remove-leading-char switch to objcopy.
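
Here is a stand-alone sketch (not the actual objcopy source, just the
shape of the decision) of why the switch is a no-op today and what
the patch would change:

    /* Sketch only: --remove-leading-char can only strip a symbol's
       first character when the input target defines one, so a
       symbol_leading_char of 0 makes the option do nothing.  */
    #include <stdio.h>

    static const char *
    maybe_strip (const char *name, char symbol_leading_char)
    {
      if (symbol_leading_char != '\0' && name[0] == symbol_leading_char)
        return name + 1;        /* strip the target-defined prefix   */
      return name;              /* nothing defined, nothing stripped */
    }

    int
    main (void)
    {
      const char *sym = "_binary_foo_bin_start";  /* made-up example */

      printf ("with 0   (today): %s\n", maybe_strip (sym, 0));
      printf ("with '_' (patch): %s\n", maybe_strip (sym, '_'));
      return 0;
    }

With the binary target reporting 0 the stripping branch can never
trigger, which is exactly what the customer saw.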

My concern is that there may be a good reason for setting the
'symbol_leading_char' field in the binary bfd target vector to 0, and
I have just not thought of it.  Of course it just may be that no-one
has ever converted from binary to some other format, or if they have,
they have not been worried about the prefix to the symbols.

Cheers
	Nick
