In most cases, you should only need to recompile the particular file you’re working on because interfaces should be changing a lot less frequently than implementations.
No single file should be so large that it takes a long time to compile by itself.
If other files are getting recompiled anyway even though nothing about them actually changed, the dependency resolution in your Makefile (or whatever) is screwed up and you need to fix it.
Point is, routine long compilation times after a small change are a code smell. There’s something wrong that a faster CPU will only mask, not fix.
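As a rough sketch of what that kind of dependency tracking looks like in a plain Makefile, here's a minimal example using GCC/Clang's -MMD/-MP flags to auto-generate header dependencies (the src/ layout and target names are just assumptions for illustration, not anything from this thread):

```make
# Minimal sketch of automatic header dependency tracking,
# assuming GCC or Clang and a flat src/ directory of .c files.
CC     := cc
CFLAGS := -O2 -MMD -MP   # -MMD writes a .d file per object; -MP adds phony targets for headers

SRCS := $(wildcard src/*.c)
OBJS := $(SRCS:.c=.o)
DEPS := $(OBJS:.o=.d)

app: $(OBJS)
	$(CC) $(CFLAGS) -o $@ $^

# Pull in the generated .d files: touching a header now rebuilds only
# the objects that actually include it, nothing else.
-include $(DEPS)

clean:
	rm -f app $(OBJS) $(DEPS)

.PHONY: clean
```

With something like that in place, editing one .c file recompiles one object, and editing a header recompiles only the files that include it.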
You can often just slap a compiler cache on a project and get a 20-150x speedup, but when the original compile time was 45 minutes, it's still slow enough to disrupt your workflow. (Though I suspect you may be talking about some manual method that may be even faster. But are those really common enough that you'd call the lack of one a code smell?)
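For what it's worth, "slapping a compiler cache on" usually amounts to just prefixing the compiler. A sketch, assuming ccache is installed and on PATH:

```make
# Sketch: put ccache in front of the compiler in the Makefile.
CC  := ccache cc
CXX := ccache c++

# For CMake-based projects the rough equivalent is (adjust to your setup):
#   cmake -DCMAKE_C_COMPILER_LAUNCHER=ccache -DCMAKE_CXX_COMPILER_LAUNCHER=ccache .
```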
I am thinking of development.