This is the mail archive of the gsl-discuss@sourceware.org mailing list for the GSL project.



Re: test release gsl-1.8.90.tar.gz


Gerard Jungman wrote:
On Sat, 2007-02-17 at 10:08 -0500, Ed Smith-Rowland wrote:

The relevant portion of the C++ standard (actually a technical report, TR1) is here: <http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2005/n1836.pdf>
The relevant sections are 5 (special functions and a random number generator) and 8 (C99 compatibility). They deliberately adopted a C99-style calling convention to leave open the possibility that C99 could adopt this too.
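
For reference, the C99-style convention in question looks roughly like this: plain functions on double, with f- and l-suffixed float and long double variants rather than C++ templates (signatures paraphrased from section 5.2 of the draft; details may differ from the final text):

    // TR1-style special function declarations, C99 naming convention
    double cyl_bessel_j(double nu, double x);
    float cyl_bessel_jf(float nu, float x);
    long double cyl_bessel_jl(long double nu, long double x);

    double conf_hyperg(double a, double c, double x);  // confluent hypergeometric
    double ellint_1(double k, double phi);             // elliptic integral, 1st kind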

Hi. Thanks for the link. I hadn't seen this document until now. I heard about this push for special functions from Walter Brown (at FNAL) and we talked briefly about it once or twice in the last few years.

I admire the effort going into this, but I have my doubts that it
is a good idea. For example, I was strongly against the inclusion
of the confluent hypergeometric function. The implementation in
GSL is a failed experiment, in my view (I wrote the implementation,
so I take full responsibility for that). The function itself is
too complicated. Even if it can be made to work reliably
over the whole domain, the resulting code would probably have
highly variable performance and/or be so heterogeneous that it
would just feel wrong as a standard component.
I am struggling over the hypergeometric functions as we speak. :-p
I was just wondering what, if anything, I could learn from the GSL. I did detect a hint of desperation in the comments ;-).
The branching everywhere is also painful.
This is one example, but there is a general problem here,
which is the illusion of simplicity that this type of
function provides. An interface of the form "given x and y,
return f(x,y)" can never express the many choices that have
to be made in the implementation. A good library needs to
expose these choices in some way; it need not be easy, but
it should at least be possible for a knowledgeable client
to access these choices.
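
GSL is one data point for how such choices can be exposed. A minimal sketch using GSL's actual gsl_sf_* interface (compiles as C or C++; link with -lgsl -lgslcblas):

    #include <stdio.h>
    #include <gsl/gsl_sf_hyperg.h>
    #include <gsl/gsl_sf_ellint.h>
    #include <gsl/gsl_mode.h>

    int main(void) {
        /* the _e variants return a status code and an error estimate,
           instead of pretending f(a,b,x) is a single exact number */
        gsl_sf_result r;
        int status = gsl_sf_hyperg_1F1_e(1.5, 2.5, 10.0, &r);
        printf("1F1 = %.17g +/- %.3g (status %d)\n", r.val, r.err, status);

        /* some functions also take an explicit precision mode, trading
           accuracy for speed at the client's request */
        double f = gsl_sf_ellint_F(0.5, 0.3, GSL_PREC_APPROX);
        printf("F(0.5, 0.3) ~= %g\n", f);
        return 0;
    }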

As another example of the inherent problems in this business,
consider the elliptic integrals. TR1 specifies the usual
first, second, and third kind functions, the way they
appear in textbooks. GSL includes these, but prefers
the Carlson forms. Carlson's book made it clear to me
that his forms are the most logical and are the best
for numerical implementation. Experts agree.

The general point is that the idea of "elliptic integrals"
is not just a couple of functions, it is really a family
of functions, joined together by a single algorithmic
idea, the arithmetic-geometric mean. A good library
should express this.
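
To make that concrete, here is a bare-bones sketch (not GSL's code) of Carlson's R_F computed by the duplication theorem, with Legendre's F(phi,k) as a thin wrapper over it; production code would terminate with a truncated series rather than iterating to full convergence:

    #include <cmath>
    #include <cstdio>

    /* Carlson R_F(x,y,z) via the duplication theorem, for x,y,z >= 0
       with at most one of them zero; converges because the three
       arguments approach each other by a factor of ~4 per step */
    double rf(double x, double y, double z) {
        for (;;) {
            double lam = std::sqrt(x*y) + std::sqrt(y*z) + std::sqrt(z*x);
            x = 0.25 * (x + lam);
            y = 0.25 * (y + lam);
            z = 0.25 * (z + lam);
            double mu = (x + y + z) / 3.0;
            if (std::fabs(x - mu) < 1e-12 * mu &&
                std::fabs(y - mu) < 1e-12 * mu &&
                std::fabs(z - mu) < 1e-12 * mu)
                return 1.0 / std::sqrt(mu);   /* R_F(m,m,m) = 1/sqrt(m) */
        }
    }

    int main() {
        /* Legendre's first-kind integral is just a wrapper:
           F(phi,k) = sin(phi) * R_F(cos^2 phi, 1 - k^2 sin^2 phi, 1) */
        double phi = 1.0, k = 0.5;
        double s = std::sin(phi), c = std::cos(phi);
        std::printf("F(1.0, 0.5) = %.12f\n",
                    s * rf(c*c, 1.0 - k*k*s*s, 1.0));
        return 0;
    }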

I could go on for a while about this. But here is a
simple summary of my feeling about the TR1 special
functions:

functions from TR1 that are simple enough
that they could work as standard components
-------------------------------------------
beta function
exponential integral
riemann zeta


functions that are doubtful as standard components
--------------------------------------------------
all the Bessel variants
Laguerre
Legendre
Hermite
elliptic integrals


functions that will almost certainly fail as standard components
----------------------------------------------------------------
hypergeometric functions, confluent and otherwise
I've reached the same conclusions, basically.
The elliptic integrals don't seem too bad, though, if you use the underlying Carlson implementation.
The Carlson forms seem almost universal from what I've seen. I would almost put them in the first group.

Here is a final philosophical rant.
It is easy to see why elementary functions should be standard
components, regardless of the level of complexity in their
implementation. Thankfully, they are simple enough. It is
possible to argue that the gamma function should be a standard
component, but I think the only reason this holds up is that
we have the Lanczos algorithm. Without Lanczos, the gamma
function might be some hairy beast, requiring trade-offs that
would push its complexity beyond the realm of standard
components. By the time we get to the confluent hypergeometric
function, we are way out in a specialized problem domain,
where trade-offs and details become key to a successful
library implementation. Such functions have no place
as standard components.
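
To illustrate why Lanczos makes the difference: all of ln Gamma fits in a few lines once the precomputed Lanczos coefficients are in hand. This is the widely circulated six-coefficient fit (g = 5), good to roughly 1e-10 relative accuracy for x > 0; the coefficients are standard, not mine:

    #include <cmath>
    #include <cstdio>

    /* ln Gamma(x), x > 0, via the classic 6-term Lanczos approximation */
    double gammln(double x) {
        static const double cof[6] = {
            76.18009172947146,     -86.50532032941677,
            24.01409824083091,      -1.231739572450155,
             0.1208650973866179e-2, -0.5395239384953e-5 };
        double y = x;
        double tmp = x + 5.5;
        tmp -= (x + 0.5) * std::log(tmp);
        double ser = 1.000000000190015;
        for (int j = 0; j < 6; ++j)
            ser += cof[j] / ++y;
        /* 2.50662... = sqrt(2 pi) */
        return -tmp + std::log(2.5066282746310005 * ser / x);
    }

    int main() {
        std::printf("Gamma(5) = %.10f\n", std::exp(gammln(5.0)));  /* expect 24 */
        return 0;
    }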


Does anybody have use cases for things like confluent
hypergeometric? Who needs these functions, why do they need
them, and how do they expect to use them?


Sorry to be so negative. Maybe you can tell me why I'm crazy.


The only functions I have not personally used or seen used are precisely the hypergeometric functions. I think there is a long-standing allure of "one function to rule them all". If you *had* a stable, efficient hypergeometric function, you could compute everything else with it. Plus the name sounds cool ;-). I think you are right about the insanity of having the hypergeometric functions.
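
The "rule them all" reductions are real enough on paper; for instance erf(x) = (2x/sqrt(pi)) * 1F1(1/2; 3/2; -x^2). A quick sanity check of that identity, using GSL's gsl_sf_hyperg_1F1 against std::erf (needs C++11; link with -lgsl -lgslcblas):

    #include <cmath>
    #include <cstdio>
    #include <gsl/gsl_sf_hyperg.h>

    int main() {
        double x = 0.8;
        double pi = std::acos(-1.0);
        double via_1f1 = 2.0 * x / std::sqrt(pi)
                         * gsl_sf_hyperg_1F1(0.5, 1.5, -x * x);
        std::printf("%.15f (via 1F1) vs %.15f (std::erf)\n",
                    via_1f1, std::erf(x));
        return 0;
    }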

I think my recommendation as far as the C++ standards go would be to drop the hypergeometric functions entirely - even from TR1. I have mixed feelings about whether these functions should migrate from TR1 to the main standard. In practical terms I think they will not make it, precisely because the implementations are not really set in stone. Standards are supposed to *ratify* widely used practice - not to push an agenda. Also, as has been alluded to in these standards discussions - if you implement these functions, where do you stop? What about Airy functions? Or all the functions that statisticians use? People have noted that the choice of functions is physicist-centric.

I think implementations that are fairly useful can be provided for most of these functions. I'm going to keep plugging away on gcc. I think it is worth exposing these if only to show the difficulties. Maybe I won't kill myself on the hypergeometrics, though!

I think ultimately the way to go in C++ (or maybe even C) would be to have dumb algorithms and smart numbers. Basically one would crank sums with multi-precision numbers that kept track of errors or kept a fixed precision - speed be damned. I think there are libraries that do this.
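
A minimal sketch of the "smart numbers" idea in plain double precision: crank the Kummer series for 1F1(a;b;x) and carry a cancellation estimate alongside the value, by recording the largest intermediate partial sum. (A real implementation would push this bookkeeping into the number type itself, or just use an arbitrary-precision library.)

    #include <algorithm>
    #include <cmath>
    #include <cstdio>
    #include <limits>

    struct tracked { double val, err; };

    /* 1F1(a;b;x) = sum_n (a)_n/(b)_n x^n/n!, summed naively */
    tracked hyperg_1F1_series(double a, double b, double x) {
        double term = 1.0, sum = 1.0, peak = 1.0;
        for (int n = 0; n < 5000; ++n) {
            term *= (a + n) / (b + n) * x / (n + 1);
            sum += term;
            peak = std::max(peak, std::fabs(sum));
            if (std::fabs(term) < 1e-17 * std::fabs(sum))
                break;
        }
        /* rounding error scales with the largest partial sum, not the
           result; at x = -25 the estimate reveals roughly nine digits
           lost to cancellation */
        tracked r = { sum, std::numeric_limits<double>::epsilon() * peak };
        return r;
    }

    int main() {
        tracked r = hyperg_1F1_series(0.5, 1.5, -25.0);
        std::printf("1F1(0.5;1.5;-25) = %g +/- %g\n", r.val, r.err);
        return 0;
    }
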
Concerning complex, I could have sworn that conforming C++ was supposed to store the real and imaginary parts in such a way that you could interoperate with C99. I *think* you should be able to cast back and forth and have it work. Also, a large part of this TR1 deals with syncing C99 with C++ (section 8).

I thought there was a way you could grab the raw implementation and use it in C99 and conversely put a C99 _Complex in a constructor for C++ complex.
I'll dig around and see.
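
For the record, the cast in question would look like the following. The layout guarantee (a std::complex<double> behaves as double[2], real part first) was only codified later, in C++11; in the TR1 era it was de facto behavior on gcc rather than something portable code could rely on:

    #include <cassert>
    #include <complex>
    #include <cstdio>

    int main() {
        std::complex<double> z(3.0, 4.0);
        /* view z as the two-double representation a C99 double _Complex
           uses, so a pointer to it can be handed to a C99 routine */
        double* p = reinterpret_cast<double*>(&z);
        assert(p[0] == z.real() && p[1] == z.imag());
        std::printf("re = %g, im = %g\n", p[0], p[1]);
        return 0;
    }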

As it does for Linas, this has bothered me for years. At this point I have no idea what the standards (or anybody else for that matter) can guarantee about compatibility between std::complex and C99 complex. As far as I could tell previously, there was no way to make it work reliably, although it probably works by default with gcc on most platforms. Please do dig around and let us know what you find.

Thanks.
--
Gerard Jungman


Thank you for weighing in on these issues.

Ed Smith-Rowland

