This is the mail archive of the gdb-prs@sourceware.org mailing list for the GDB project.


[Bug gdb/19973] gdb prints sizeof a char literal as 1 when the language is C


https://sourceware.org/bugzilla/show_bug.cgi?id=19973
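
For background: in ISO C a character constant such as 'a' has type int, so
sizeof 'a' equals sizeof (int), while in C++ it has type char and the result
is 1.  PR 19973 is that gdb's C expression parser follows the C++ rule even
when the language is C.  A minimal C program (illustrative, not taken from
the report) showing the expected C behavior:

    #include <stdio.h>

    int main (void)
    {
      /* In C, 'x' has type int, so this prints sizeof (int), commonly 4.
         Compiled as C++, the same line prints 1, because 'x' has type
         char there.  */
      printf ("%zu\n", sizeof 'x');
      return 0;
    }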

--- Comment #1 from cvs-commit at gcc dot gnu.org <cvs-commit at gcc dot gnu.org> ---
The master branch has been updated by Tom Tromey <tromey@sourceware.org>:

https://sourceware.org/git/gitweb.cgi?p=binutils-gdb.git;h=5aec60eb2f6f0905bfc76f5949fd5d55c6a10f10

commit 5aec60eb2f6f0905bfc76f5949fd5d55c6a10f10
Author: Tom Tromey <tom@tromey.com>
Date:   Wed Aug 30 16:12:56 2017 -0600

    Cast char constant to int in sizeof.exp

    PR gdb/22010 concerns a regression I introduced with the scalar
    printing changes.  The bug is that this code in sizeof.exp:

        set signof_byte [get_integer_valueof "'\\377'" -1]

    can incorrectly compute signof_byte.  One underlying problem here is
    that gdb's C parser doesn't treat a char constant as an int (this is
    PR 19973).

    However, it seems good to have an immediate fix for the regression.
    The simplest is to cast to an int here.

    testsuite/ChangeLog
    2017-09-05  Tom Tromey  <tom@tromey.com>

        PR gdb/22010:
        * gdb.base/sizeof.exp (check_valueof): Cast char constant to int.

-- 
You are receiving this mail because:
You are on the CC list for the bug.
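
To illustrate the workaround described in the commit (an illustration of the
idea, not the actual patch): casting the character constant to int before
applying sizeof gives a result that does not depend on whether the expression
is evaluated under C or C++ rules, which is what the cast added to sizeof.exp
relies on.

    #include <stdio.h>

    int main (void)
    {
      /* With the cast, the operand of sizeof has type int in both C and
         C++, so the result is sizeof (int) either way.  */
      printf ("%zu\n", sizeof ((int) '\377'));
      return 0;
    }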
