This is the mail archive of the libc-alpha@sourceware.org mailing list for the glibc project.



Re: memcpy performance regressions 2.19 -> 2.24(5)


On 05/10/2017 01:33 PM, H.J. Lu wrote:
> On Tue, May 9, 2017 at 4:48 PM, Erich Elsen <eriche@google.com> wrote:
>> store is a net win even though it causes a 2-3x decrease in single
>> threaded performance for some processors?  Or how else is the decision
>> about the threshold made?
> 
> There is no perfect number to make everyone happy.  I am open
> to suggestion to improve the compromise.
> 
> H.J.

I agree with H.J.: there is a compromise to be made here. Having a single
process thrash the box by taking all of the memory bandwidth might be
sensible for a microservice, but glibc has to default to something that
works well on average.

With the new tunables infrastructure we can start talking about ways in
which a tunable could influence IFUNC selection, giving users some choice
in tuning for single-threaded versus multi-threaded workloads, single-user
versus multi-user systems, and so on.
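
To make that concrete, here is a minimal, self-contained sketch of the
mechanism, using GCC's ifunc attribute outside of glibc (GCC on x86 only,
because of __builtin_cpu_supports).  The prefer_non_temporal knob is purely
hypothetical and stands in for whatever tunable we might end up defining;
it is not an existing glibc tunable, and glibc's own resolvers are wired up
differently internally.

/* Sketch of an IFUNC resolver: the choice of implementation is made
   once, when the symbol is resolved.  A tunable would feed into the
   same decision point.  */

#include <stddef.h>

static void *memcpy_temporal (void *dst, const void *src, size_t n)
{
  char *d = dst;
  const char *s = src;
  while (n-- > 0)
    *d++ = *s++;
  return dst;
}

static void *memcpy_non_temporal (void *dst, const void *src, size_t n)
{
  /* Stand-in for a variant that uses non-temporal stores.  */
  return memcpy_temporal (dst, src, n);
}

/* Hypothetical knob.  In glibc this would come from a tunable
   (e.g. via GLIBC_TUNABLES), not a compile-time constant.  */
static const int prefer_non_temporal = 0;

static void *(*resolve_my_memcpy (void)) (void *, const void *, size_t)
{
  __builtin_cpu_init ();

  /* CPU-feature-based default.  */
  int use_nt = __builtin_cpu_supports ("avx");

  /* A tunable could override the hardware default here.  */
  if (!prefer_non_temporal)
    use_nt = 0;

  return use_nt ? memcpy_non_temporal : memcpy_temporal;
}

void *my_memcpy (void *dst, const void *src, size_t n)
  __attribute__ ((ifunc ("resolve_my_memcpy")));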

What I would like to see as the output of any discussion is a set of
microbenchmarks (benchtests/) added to glibc that are the distillation
of whatever workload we're talking about here. This is crucial to the
community having a way to test from release-to-release that we don't
regress performance.

Unless you want to sign up to test your workload at every release, we
need this kind of microbenchmark addition. And microbenchmarks are dead
easy to integrate with glibc, so most people should have no excuse.
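
For anyone who has not looked at benchtests/, here is a rough standalone
sketch of how small such a distillation can be.  It is not in the actual
benchtests harness format, and the sizes are arbitrary placeholders; it
just times memcpy at a few sizes and prints bandwidth.  A real addition
for this thread would also want a multi-threaded variant so the
bandwidth-contention case under discussion is captured.

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <time.h>

static double now_sec (void)
{
  struct timespec ts;
  clock_gettime (CLOCK_MONOTONIC, &ts);
  return ts.tv_sec + ts.tv_nsec * 1e-9;
}

int main (void)
{
  /* Sizes straddling typical non-temporal-store thresholds; adjust to
     match the workload being distilled.  */
  static const size_t sizes[] = { 4096, 262144, 4 * 1024 * 1024, 64 * 1024 * 1024 };
  const int iters = 64;

  for (size_t i = 0; i < sizeof sizes / sizeof sizes[0]; i++)
    {
      size_t n = sizes[i];
      char *src = malloc (n), *dst = malloc (n);
      if (!src || !dst)
        return 1;
      memset (src, 1, n);

      double t0 = now_sec ();
      for (int j = 0; j < iters; j++)
        memcpy (dst, src, n);
      double dt = now_sec () - t0;

      /* Keep the copies from being optimized away.  */
      volatile char sink = dst[n - 1];
      (void) sink;

      printf ("size %zu: %.2f GB/s\n", n, (double) n * iters / dt / 1e9);
      free (src);
      free (dst);
    }
  return 0;
}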

The hardware vendors and distros who want particular performance
characteristics are putting such tests in place (representative of their
users), and end users who care about specific workloads are adding tests
of their own.

-- 
Cheers,
Carlos.

