
10% performance regression after rollup #49051 #49168

Closed
michaelwoerister opened this issue Mar 19, 2018 · 8 comments
Labels
I-compiletime Issue: Problems and improvements with respect to compile times. WG-compiler-performance Working group: Compiler Performance

Comments

@michaelwoerister
Member

http://perf.rust-lang.org/compare.html?start=39264539448e7ec5e98067859db71685393a4464&end=36b66873187e37a9d79adad89563088a9cb86028&stat=instructions:u

It's not quite clear what caused this regression; it was something in the #49051 rollup. Pinging all committers from the range:
@jcowgill, @alexcrichton, @Eijebong, @mark-i-m, @kennytm, @Centril, @estebank, @snf, @ehuss, @Songbird0, @draganmladjenovic, @ExpHP, @petrochenkov

Any ideas? Did the rand crate get slower?

cc @rust-lang/wg-compiler-performance

@michaelwoerister michaelwoerister added I-compiletime Issue: Problems and improvements with respect to compile times. WG-compiler-performance Working group: Compiler Performance labels Mar 19, 2018
@michaelwoerister
Member Author

@alexcrichton oh, I see this regression is more or less expected due to ff227c4. Pity.

This shouldn't affect stable and beta binaries, right?

@alexcrichton
Member

Yeah, this is almost certainly due to re-enabling multiple codegen units on dist builds. I do believe this affects stable/beta binaries, but the change has already hit stable (i.e., today's stable release).
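(For readers unfamiliar with the setting being discussed: `codegen-units` controls how many parallel LLVM modules rustc splits a crate into. More units mean faster, more parallel compilation but less cross-module optimization. A sketch of how a project could opt back into a single CGU via its standard Cargo profile, purely illustrative and unrelated to how the dist builds themselves are configured:)

```toml
# Cargo.toml -- force a single codegen unit for release builds.
# One CGU lets LLVM optimize across the whole crate (faster binaries),
# at the cost of losing parallel codegen (slower compiles).
[profile.release]
codegen-units = 1
```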

@michaelwoerister
Member Author

Can we switch to one CGU for stable releases?

@alexcrichton
Member

Unfortunately not really: we're seriously pressed for time on CI as it is, and we're looking for any wins we can get. We can leave this open in case we ever get enough CI time, but otherwise I don't think we can afford to do this today.

@michaelwoerister
Member Author

OK, thanks for the info. I'll close this issue since the regression is accounted for.

@sophiajt
Contributor

sophiajt commented Mar 19, 2018

@michaelwoerister - sorry for jumping in, but is there a follow-up issue to track reclaiming this regressed time? One of our main deliverables this year is faster compile times, so each compile time regression is pretty serious.

Apologies if I'm misunderstanding the thread, just making sure we're keeping track of where we slip and where we need to recover.

@michaelwoerister
Member Author

@jonathandturner I have an item on my todo list for setting up a tracking issue about building our releases as optimized as possible (including PGO, once that's available, and maybe full LTO). Should be up later today.
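(For context, the optimizations mentioned above map to standard rustc/Cargo knobs. An illustrative Cargo profile sketch, not the actual release configuration:)

```toml
# Illustrative only: profile options corresponding to "as optimized as
# possible". Full LTO plus a single CGU maximizes LLVM's view of the code.
[profile.release]
lto = true         # full cross-crate link-time optimization
codegen-units = 1  # maximize within-crate optimization
# PGO is not a Cargo profile key; it is driven via rustc flags
# (-C profile-generate / -C profile-use), once available on the channel used.
```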

@michaelwoerister
Member Author

@jonathandturner: #49180
