As I see it, the tutorial - "Show how to blink without using delay" - is pretty clear as is, so a rewrite is not really needed. When I read "serious problems" I expect my board to catch fire, or worse.
What caught my interest is more academic (e.g. can I learn something, or is there some other aspect of this tutorial that may be useful?).
The program has a small "drift" in the time interval; the blink rate is slightly longer than 1000ms.
This is true: millis() is called twice, and quite possibly there will be an interval between the two calls that accumulates. For the sketch as such it is (subjectively) irrelevant, and the change does not add value unless you also point out the difference. The fix you proposed and the alternative I suggested will both correct this issue.
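For reference, the pattern under discussion looks roughly like this (a minimal sketch using the tutorial's usual variable names, not the exact code from your post). The drift comes from the ticks that elapse between the first millis() call in the comparison and the second one that resets previousMillis:

```cpp
const int ledPin = LED_BUILTIN;
const unsigned long interval = 1000;  // ms between toggles
unsigned long previousMillis = 0;

void setup() {
  pinMode(ledPin, OUTPUT);
}

void loop() {
  if (millis() - previousMillis >= interval) {   // first millis() call
    previousMillis = millis();                   // second call: any ticks that
                                                 // passed between the two calls
                                                 // are lost, so each period runs
                                                 // slightly longer than interval
    digitalWrite(ledPin, !digitalRead(ledPin));  // toggle the LED
  }
  // "the rest of the application" would run here
}
```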
Then you post the additional requirement as follows:
... when the rest of application takes more than interval to execute.
If this is the case, your proposed change will accumulate a "drift" every time "the rest of the application" runs longer than interval - which conflicts with what you set out to fix in your first post. What I proposed will not drift, but the time between toggles may occasionally be shorter whenever "the rest of the application" runs longer than interval. And if the loop always takes longer than interval to run, either variant will drift.
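Side by side, the two fixes look roughly like this (again a sketch with assumed variable names, reusing the globals from the sketch above; which variant is which follows my reading of the thread):

```cpp
void loop() {
  unsigned long currentMillis = millis();  // single read, reused below
  if (currentMillis - previousMillis >= interval) {
    // Variant A (your fix, as I read it): re-anchor on the current time.
    // If the check fires late because loop() was busy, the lateness
    // becomes the new baseline, so a drift accumulates on every slow pass.
    previousMillis = currentMillis;

    // Variant B (my suggestion): advance by exactly one interval instead.
    //   previousMillis += interval;
    // No long-term drift, but after a slow pass the next toggle fires
    // sooner than interval (possibly immediately) to catch up.

    digitalWrite(ledPin, !digitalRead(ledPin));  // toggle the LED
  }
  // "the rest of the application" runs here; if it always takes longer
  // than interval, neither variant keeps the nominal blink rate
}
```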
In my opinion the "better" code is the sample that best illustrates the text - so unless you change both, nothing is gained.
Or... is there something I missed?