If you view Go as a "rewrite" of C, then perhaps, but I wouldn't. It's a new language altogether. C can be embedded in Go easily, and even if it couldn't, there's no reason that one has to "win out" over the other. There's less direct competition between two languages than there is with a family of OSes. And Go was never intended to replace C completely - it just provides a better alternative for some subset of what C/C++/Java are used for. On the other hand, any given system is going to run only one OS[0].
In any case, as I mentioned, it's a tradeoff. There's no absolute answer, though I think he's right here in that the benefits don't outweigh the costs for this example.
Put another way, much as I might like to run a microkernel, I'd have a hard time concluding that it'd be the right move for the Linux project to spend time refactoring the entire codebase into a microkernel!
On the other hand, to see an example of a rewrite that was successful, look at Reddit, which rewrote the entire codebase in its early days. GCC could also be considered an example, depending on how you look at it.
[0] You can run more than one via virtualization, sure, but a number of the benefits of Plan 9 come from having an entire set of computers running the same OS.
Go is certainly not a rewrite of C. It looks like most or all of the Go compiler has been, or is being, rewritten from C to Go, as described at https://docs.google.com/document/d/1P3BLR31VA8cvLJLfMibSuTdw... . That translation looks to be machine-assisted, though, so I'm not sure how good an example it makes.
This "Go is a rewrite of C" misinformation that's been floating around lately is getting almost as bad as the "JavaScript is Scheme-like" nonsense that has gotten uncomfortably common.
Although incorrect and annoying, these claims are at least becoming a useful way to identify people who likely don't know what they're talking about.