
I'm in graduate school now, and I took some CS courses in undergrad ~20 years ago, and fundamentals haven't changed as much as the superficial garbage that is all the rage in industry. There are new approaches and capabilities, especially around deep learning (largely because we've replaced the push toward higher clock speeds with a push for parallelism, and almost everything is close enough to a parallel problem that a neural net can do the job), but the fundamental ideas don't change that much because real computer science (as opposed to industry wankery which pays $500,000+ but rots your brain) only accepts ideas when there's at least some evidence they actually work.

So, if you studied computer science in 1990 and kept the knowledge up, you'd still know a fair amount of what's relevant today. The problem is that what's employable isn't actually computer science, but trendy industry nonsense... the real CS is just in place to be an IQ test, a first-line hiring barrier.



> The problem is that what's employable isn't actually computer science, but trendy industry nonsense...

You may be going a bit hard on the industry. I see where you're coming from, having a long academic past myself and seeing that there is a lot of BS in industry. But there are also a lot of meaningful things in industry, and academia is sometimes a little too arrogant about that.

Given that you went back to graduate school and are likely a bit frustrated with industry, I can see why you hold that stance, but neither extreme is good. Not everything in industry is trendy nonsense, nor is everything in it the holy grail. There are real problems being solved in industry. That's what produces systems people actually use, be it mobile phones or server infra or network gear or embedded systems, which after all is the whole point of all this. There is also a lot of trendiness, and lots of people who just hack things together without a clue. It's not black-and-white.

Similar with academia. There is real benefit to the fundamentals being taught and researched there. And there is a lot of nonsense there too: tons of papers produced just to produce papers, get the next grant, and stay employed. (If anything, the gatekeeping in academia is way worse than in industry.)

Both worlds have their place and it's sad to see disparaging comments from either side about the other one.


Spoken like someone who has never worked in industry. There's a lot of BS, but in general academia and industry have vastly different goals. It's so fucking different getting something to work vs. building something that a team of 30+ engineers can understand and maintain. There's a reason it's called "software engineering" and not "computer scientist-ing"; they are entirely different roles. You have to worry about things like reliability, logging, and failure modes that, quite frankly, nobody in a lab thinks about beyond fixing their one-off script to get the result they need one time for their paper.



