This is the mail archive of the gdb-patches@sourceware.org mailing list for the GDB project.



Re: [PATCH 3/7] [python] API for macros: Add gdb.Macro class.


On Tue, Aug 30, 2011 at 5:45 AM, Phil Muldoon <pmuldoon@redhat.com> wrote:
> matt rice <ratmice@gmail.com> writes:

>> +macropy__definition_to_py (const struct macro_definition *macro)
>
> A small nit for me personally, but all of the other Python API classes
> use one "_" to separate the function library name from the function
> name.  This and others.

Hmm... it seems as though all the Python API classes use 'foopy_'
prefixes _only_ for the public methods available through Python; all of
these 2x"_" functions are private.  I understand that static functions
such as these do not pollute the global namespace, but when debugging
gdb itself I find that having prefixes on them helps the tab-completion
behavior considerably.

The other Python API files seem to use assorted function names for
these (symtab_, set_, sal_, etc.); I use the 2x"_" mainly because of
the 'beginning with _' reservation for the compiler/libc.

Anyhow, removing one of the '_'s doesn't really seem like the thing to
do; maybe macropy_pvt_* (I'll see if that works, but some of the
function names are getting fairly long).

>> +macropy__validate_self (PyObject *self)
>> +{
>> +  macro_object *me;
>> +
>> +  if (! PyObject_TypeCheck (self, &macro_object_type))
>> +    {
>> +      PyErr_SetString (PyExc_RuntimeError,
>> +                       _("Typecheck failure converting macro."));
>> +      return NULL;
>> +    }
>> +
>> +  me = (macro_object *) self;
>> +
>> +  if (me->objfile_validator->is_valid == 0)
>> +    {
>> +      PyErr_SetString (PyExc_RuntimeError, _("Macro is invalid."));
>> +      return NULL;
>> +    }
>> +
>> +  return me;
>> +}
>
> As you are pointing to an existing copy of self, and returning it, you
> *may* need to incref.  It depends if any of the functions downstream
> decref it.  Please check.

This is basically the same as the FOOPY_REQUIRE_VALID macros, so it
shouldn't need to.  For macros we don't really have a
symtab_object_to_symtab or sal_object_to_sal type of function to do
the type check in; putting that into e.g. MACROPY_REQUIRE_VALID
requires using the PyObject * argument twice, and I also found the
assignment in the macro to be weird :/

I should probably rename it macropy__require_valid().

>> +static int
>> +macropy_compare (PyObject *self, PyObject *o2)
>> +{
>> +  PyObject *my_str = macropy_str (self);
>> +  PyObject *other_str = NULL;
>> +  PyObject *an_error;
>> +  int err_code = -1;
>> +  int comparison_result = -1;
>> +
>> +  if (!my_str)
>> +    goto check_for_errors_and_return;
>> +
>> +  if (PyObject_TypeCheck (o2, &macro_object_type))
>> +    {
>> +      other_str = macropy_str (o2);
>> +      if (!other_str)
>> +        goto check_for_errors_and_return;
>> +
>> +      err_code = PyObject_Cmp (my_str, other_str, &comparison_result);
>> +    }
>
> Use PyObject_Compare

Hmm, I'm very skeptical of PyObject_Compare() because it gives us no
way to identify an error that may have been cleared by something,
e.g. gdb_py_print_stack()...  That may not be possible today, but it
could become possible if we figure out a way to make Macro's
comparison method 'monkey patchable'.

>> +  else
>> +    {
>> +      err_code = PyObject_Cmp (my_str, o2, &comparison_result);
>> +    }
>
> Ditto.
>
>> +  check_for_errors_and_return:
>> +    Py_XDECREF (my_str);
>> +    Py_XDECREF (other_str);
>> +
>> +    /* Because -1 is also Less Than we cannot use gdbpy_print_stack ()
>> +       which clears the error if set python print-stack is off.
>> +       Which would lead to (gdb) python print x == x
>> +       returning False with no error message displayed.
>> +
>> +       alternately if an error is set and the return value is not 1,
>> +       python will complain.
>> +
>> +       Thus if there is an error, we must normalize it
>> +       making sure that both an error and the return
>> +       value are -1.  Should we encounter one.  */
>> +    an_error = PyErr_Occurred ();
>> +    if (an_error == NULL && err_code != -1)
>> +      {
>> +        /* a valid comparison */
>> +        return comparison_result;
>> +      }
>> +    else if (an_error == NULL && err_code == -1)
>> +      {
>> +        /* an unknown error */
>> +        PyErr_SetString (PyExc_RuntimeError,
>> +                         _("mystery error during macro comparison."));
>
>
> Don't overwrite the exception here.  And if PyObject_Compare has
> returned -1, there will be an exception so this can't happen.  You could
> probably just remove the condition entirely and just fold over to the
> else below.

That comment should also read 'and the return value is not -1' instead
of 'not 1'...

As per the above, we aren't overwriting the exception: in this case
'an_error' is the current exception and it is NULL, while PyObject_Cmp
returned -1, signalling to us that there was an error.

If we return -1 without an error set, the objects compare as 'less
than', and the user may have no idea that an error was even raised.

Unless we can prove this can never ever happen with any version of
Python, I prefer PyObject_Cmp, which returns an explicit error code,
over PyObject_Compare, which relies on a PyErr_Occurred check.

E.g. if I figure out a way to add monkey patching of '__cmp__' on
gdb.Macro objects, it's plausible that a user-defined comparison
function could end up in a call to gdb_py_print_stack(), and a silent
failure.
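To make the ambiguity concrete, here is a plain-Python sketch of the
tp_compare-style convention being discussed (hypothetical helper name,
not gdb or CPython code): -1 doubles as both the "less than" result
and the error indicator, so once a pending exception has been cleared
the two cases are indistinguishable.

```python
# Sketch of the tp_compare convention: -1 means both "less than" and
# "error"; only a pending exception disambiguates the two.

def interpret_tp_compare(result, exception_pending):
    """Interpret a tp_compare-style return value (hypothetical helper)."""
    if result == -1 and exception_pending:
        return "error"
    if result == -1:
        return "less than -- or a silently cleared error"
    return {0: "equal", 1: "greater than"}[result]

# A genuine "less than" and a cleared error produce the same answer:
print(interpret_tp_compare(-1, False))
# With the exception still pending, the error is detectable:
print(interpret_tp_compare(-1, True))
print(interpret_tp_compare(0, False))
```

This is why the patch normalizes to "exception set AND return value
-1" before returning: anything else risks a silent wrong answer.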

>> +        return -1;
>> +      }
>> +    else /* a known error */
>> +      return -1;
>> +}
>> +
>
>
>> +static long
>> +macropy_hash (PyObject *o)
>> +{
>> +  long result = -1;
>> +  PyObject *my_str = macropy_str (o);
>> +
>> +  if (my_str)
>> +    result = PyObject_Hash (my_str);
>> +
>> +  /* The docs say PyObject_Hash should return -1 on error,
>> +     Don't believe it because it is not always true,
>> +     The exception gets raised regardless of return value,
>> +     But we should follow the documented return strategy of -1 on error.
>
> Even if you are right, are you right for all versions of Python?  I
> guess the PyErr_Occurred check is harmless.

No idea about all versions; I checked with 2.7.  I concur it's
harmless.  I'll try to make the comment less definitive and mention
the version I checked.


>> +static PyMethodDef macro_object_methods[] = {
>> +  { "argv", macropy_argv_method, METH_NOARGS,
>> +    "argv () -> List.\n\
>> +Return a list containing the names of the arguments for a function like \
>> +macro." },
>
> "function-like", imo.
>
>> + ?{ "definition", macropy_definition_method, METH_NOARGS,
>> + ? ?"definition () -> String.\n\
>> +Return the macro's definition." },
>> + ?{ "include_trail", macropy_include_trail_method, METH_NOARGS,
>> + ? ?"include_trail () -> List.\n\
>> +Return a list containing tuples with the filename and line number describing \
>> +how and where this macro came to be defined." },
>
> Your help and your return explanation don't match.  It should return a
> Tuple (if it doesn't, please fix it).  And please change -> List to ->
> Tuple.

They do match, but the wording is awkward: it returns a list OF
tuples, so [tuple, tuple, tuple].
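That is, under that reading a result would be shaped something like
this (file names and line numbers invented purely for illustration):

```python
# Hypothetical shape of an include_trail () result: a list OF tuples,
# each tuple pairing a filename with a line number, describing how the
# macro's defining file came to be included.
trail = [("main.c", 3), ("config.h", 12), ("limits.h", 40)]

for filename, line in trail:
    print("%s:%d" % (filename, line))
```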

>> +  struct objfile *objfile;
>> +};
>> +
>> +/* A macro_callback_fn for use with macro_foreach and friends that adds
>> +   a gdb.Macro object to the python list USER_DATA->LIST.
>> +   USER_DATA should be of type struct macropy_user_data.
>> +
>> +   A null USER_DATA->OBJFILE is currently be considered an error,
>> +   this is subject to change.
>> +
>> +   Aborts the iteration on error, and forces the macro iterator function
>> +   to return macro_walk_aborted check the iterators macro_walk_result.
>> +   On error it is the callers responsibility to dispose of the USER_DATA
>> +   structure and its contents.  The appropriate python exception
>> +   that caused the error has already been set.
>> +*/
>> +enum macro_walk_status
>> +pymacro_add_macro_to_list (const char *name,
>> +                           const struct macro_definition *definition,
>> +                           struct macro_source_file *source,
>> +                           int line,
>> +                           void *user_data);
>> +
>> +/* Create a new macro (gdb.Macro) object that encapsulates
>> +   the gdb macro representation.  OBJFILE is the objfile that contains
>> +   the macro table.  NAME is the name of the macro.  DEF the macros definition
>> +   SRC the source file containing the macro.  LINE definitions location.  */
>> +PyObject *
>> +macropy_object_create (struct objfile *objfile,
>> +                       const char *name,
>> +                       const struct macro_definition *def,
>> +                       struct macro_source_file *src,
>> +                       int line);
>> +
>> +#endif /* GDB_PY_MACRO_H */
>
> Any reason why these need their own include?  (over python-internal.h).

Nope, I just didn't see these all collected there.

> If these are called from anywhere other than python/*

They are not.

