Log Message: |
Fix a big scalability problem in the implementation of svnpredumpfilter.py.
The script kept re-computing the set of additional include paths while
mining the log history for copied paths. Each re-computation involved
a full iteration over the set of copies accumulated so far, making the
total work quadratic in the number of copies and causing the run time
to explode on large repositories. Instead, we can gather all copies
first and then iterate over them once, after the log has been fully
scanned (see the sketch below).
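As a rough, self-contained illustration of the gather-then-process
pattern (not the script's actual code; the class layout, function
names, and data shapes below are hypothetical stand-ins, though the
dt.handle_changes() call mirrors the one named in the change list):

class DependencyTracker:
    def __init__(self):
        self.include_paths = set()

    def handle_changes(self, copies):
        # One full pass over the accumulated copy records.
        for src, dst in copies:
            self.include_paths.add(src)

def get_dependencies(log_revisions):
    # Old behaviour: calling dt.handle_changes(copies) inside this loop
    # re-scanned every copy seen so far on each revision (quadratic).
    copies = []
    for rev_copies in log_revisions:
        copies.extend(rev_copies)     # collect (src, dst) copy records
    dt = DependencyTracker()
    dt.handle_changes(copies)         # called once, after the full scan
    return dt.include_paths

# Example: two revisions, each containing one copied path.
print(get_dependencies([[("trunk/a", "branches/x/a")],
                        [("trunk/b", "branches/y/b")]]))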
In my testing this change reduces the runtime of svnpredumpfilter.py on
'svn log -qv' output from the FreeBSD repository (up to r271458) from
several days(!) to 1.5 minutes.
* tools/server-side/svnpredumpfilter.py
(svn_log_stream_get_dependencies): Run dt.handle_changes() once, after
the log history has been fully scanned, rather than for each revision.
|