Profile TW performance with large task DBs #3618

Open
djmitche opened this issue Sep 7, 2024 · 4 comments

Labels
topic:performance Issues related to performance

Comments

@djmitche
Collaborator

djmitche commented Sep 7, 2024

...Probably running a profiler is the action item from this issue...

Originally posted by @djmitche in #3329 (comment)


I think this would be best done from Taskwarrior, treating the Rust code as a "black box". It would be great to learn which calls into Rust are taking the longest -- and what non-Rust things are taking a long time, too. I can think of a few possibilities we might learn, but the truth is probably something different:
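
As a first pass, even hand-timing individual calls into the Rust layer might narrow things down. Here is a self-contained sketch of that pattern using std::chrono; the "load_tasks" label and the sleep are placeholders standing in for a real call into the taskchampion bindings, not actual Taskwarrior code:

    #include <chrono>
    #include <cstdio>
    #include <thread>

    // Time a single call and report the elapsed microseconds on stderr.
    template <typename F>
    auto timed (const char* label, F&& call)
    {
      auto start = std::chrono::steady_clock::now ();
      auto result = call ();
      auto elapsed = std::chrono::duration_cast<std::chrono::microseconds> (
          std::chrono::steady_clock::now () - start);
      std::fprintf (stderr, "%s: %lld us\n", label, static_cast<long long> (elapsed.count ()));
      return result;
    }

    int main ()
    {
      // Stand-in for a call into the Rust layer (e.g. loading all tasks); a real
      // call site would wrap the taskchampion binding instead of this sleep.
      int count = timed ("load_tasks", [] {
        std::this_thread::sleep_for (std::chrono::milliseconds (50));
        return 8510;
      });
      std::printf ("loaded %d tasks\n", count);
      return 0;
    }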

I don't know much about profiling in C++, but perhaps you, dear reader, do?

@djmitche added the topic:performance label Sep 7, 2024
@felixschurk
Collaborator

felixschurk commented Sep 13, 2024

I think that in order to have comparable results we should use the existing load and run_perf scripts in the performance folder, though probably (at least for a first run) at a reduced size. Currently they would create 8510 tasks, which is fairly large :D

I am planning to take a look into this, but I will probably not have time before the end of September :/

@felixschurk
Collaborator

felixschurk commented Sep 27, 2024

I ran it on my local machine, which runs Fedora with an i5-1145G7 processor and 16 GiB of RAM. The machine was not doing anything else intensive at profiling time, and it produced a call graph like this:
[call graph image]

Based on that, most of the time is spent in rendering, startup, and rebuilding the working set of Taskwarrior.

I am not really sure what the best way is to share the results of performance profiling.

I also ran the same performance suite (the one shipped with Taskwarrior) against the 2.6.2 release, and there the total runtime was only 2.7 s, vs. 4.25 s on the current develop branch. The measurements are not meant as precise absolute times (I have only run them once so far, so for consistent numbers we would need multiple runs and an average), but one can still see that Taskwarrior 3 is noticeably slower. From the profiling alone, though, I cannot pin it down to one method where the problem might be coming from.
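
To make the 2.6.2 vs. develop comparison more trustworthy, repeating each measurement several times and averaging would help. A rough, self-contained harness along these lines could work; the report ("task next") and the run count are arbitrary choices here, not part of the existing performance suite:

    #include <chrono>
    #include <cstdio>
    #include <cstdlib>

    // Rough harness: run the same report several times and average the
    // wall-clock time, so a single noisy run does not dominate the comparison.
    int main ()
    {
      const int runs = 10;
      const char* cmd = "task next > /dev/null 2>&1";  // any report of interest

      double total_ms = 0.0;
      for (int i = 0; i < runs; ++i)
      {
        auto start = std::chrono::steady_clock::now ();
        if (std::system (cmd) != 0)
          std::fprintf (stderr, "run %d failed\n", i);
        auto elapsed = std::chrono::duration<double, std::milli> (
            std::chrono::steady_clock::now () - start);
        total_ms += elapsed.count ();
      }
      std::printf ("%s: %.1f ms average over %d runs\n", cmd, total_ms / runs, runs);
      return 0;
    }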

The performance script also reports some counters, for garbage collection etc., and those are an order of magnitude higher for Taskwarrior 3. However, I am not aware of how they are computed, so they might also be reporting something wrong.

@felixschurk
Collaborator

felixschurk commented Sep 27, 2024

Okay, I just noticed that one "counter" (timer) was lost in the transition to the TDB2 backend: specifically, the timer which tracked how long it takes to load a task set into Taskwarrior.

The counter was in Line 419 of TDB2.cpp.
5bb9857#diff-d6d65d1f59ba33ef5268d5bc8a49ed26f397b899b902332404aee34e800cbc0bL419

Context::getContext ().time_load_us += timer.total_us ();

This explains why the performance report always states load:0 for the latest Taskwarrior.
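
For illustration, a minimal, self-contained sketch of the pattern the old code used; only time_load_us and Timer::total_us () come from the removed line, the rest (the Timer stand-in, the local variable, the call site) is assumed for the example and is not the real TDB2 code:

    #include <chrono>
    #include <cstdio>

    // Minimal stand-in for Taskwarrior's Timer: starts on construction,
    // total_us () reports the elapsed microseconds, mirroring the old TDB2.cpp usage.
    class Timer
    {
    public:
      Timer () : start_ (std::chrono::steady_clock::now ()) {}
      long long total_us () const
      {
        return std::chrono::duration_cast<std::chrono::microseconds> (
            std::chrono::steady_clock::now () - start_).count ();
      }
    private:
      std::chrono::steady_clock::time_point start_;
    };

    int main ()
    {
      long long time_load_us = 0;   // stands in for Context::getContext ().time_load_us

      Timer timer;
      // ... the call into the backend that loads the task set would go here ...
      time_load_us += timer.total_us ();

      std::printf ("load: %lld us\n", time_load_us);
      return 0;
    }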

@djmitche
Collaborator Author

That flame graph seems to have multiple commands in it -- is it a concatenation from multiple runs? It's a little difficult to interpret the graphic, since things are cut off. Is there a way to share the source file from which that was drawn?

#3635 should restore the load timer.
