I am not in a CS program myself, but I guest lecture for CS students at CMU about 2x/year, and I'm in a regular happy hour that includes CS professors from other high-tier CS schools.
Two points of anecdata from that experience:
- The students believe that the path to a role in big tech has evaporated. They do not see Google, Meta, Amazon, etc., recruiting on campus. Jane Street and Two Sigma are sucking up all the talent.
- The professors do not know how to adapt their capstone / project-level courses. Core CS is obviously still the same, but for courses where the goal is to build a 'complex system', no one knows what qualifies as 'complex' anymore. The professors use AI themselves and expect their students to use it, but do not have a gauge for what kinds of problems make for an appropriately difficult assignment in the modern era. The capabilities are also advancing so quickly that any answer they arrive at today could be stale in a month.
FWIW.
When I was in college in the early 2000s, it was the same. Most professors were at least a decade behind current technology.
> a decade behind current technology.
And how about computer science?
CS is not a degree in the web programming framework or DNN modeling framework du jour. Algorithms, data structures, linear algebra, and programming fundamentals do evolve, but gradually.
None of the languages I use at work existed when I was an undergraduate. Very nearly all the data structures and algorithms I use at work did.
Something tells me it was always like that. My university professors were teaching things nobody wanted to learn, and people were practically begging to be taught more up-to-date hireable skills.
Every time there was project work, professors would recommend Swing or similar because that is what they knew, but everyone used React because nobody hires Swing developers.
Someone once said "Our SQL professor's SQL knowledge is 10 years out of date. Probably because he has been a professor for around 10 years at this point" and that kind of stuck with me.
Someone told me that once a good idea came about, it took about 5 years for it to make its way into a book, and then another 5 years to be accepted by people teaching outside of consultancies.
Of course, by then, it was antiquated.
I wish it was only a decade for me; in the early 2010s they were still teaching the 90s approach to handling complex projects (upfront design, with a custom DSL for each project, fully modelled by BAs without any contact with actual users, with domain experts siloed away - and all of that connected to XML codegen tools from the 90s).
It can be worse! I went back to school for some graduate work in the early 00s after having been in the industry for a handful of years. There was a required class that was one of those "here's what life is like in the real world instead of academia".
The instructor was a PhD student who'd never been in industry.
He kept correcting me about industry practices, telling me that I had no idea what the real world was like.
The Rodney Dangerfield film Back to School covers this.
I had to deal with Java codegens from UML specs in 2021. So, nothing has changed! :')
Still there; CS bachelor's student here.
Back when SOAP wasn’t just for hygiene.
I still see software sold as SOA compliant, whatever that means. I think we have just started recycling and mixing software memes at this point. Like you see someone wearing bell-bottoms with an 80s dayglo jacket. We do agile SOAP waterfall kanban model-driven design here.
This is why I have always said that a degree in CS is useless without some degree of passion towards it.
No professor can enable you for tomorrow, and a CS career is one of constant education.
I'm glad I learned some STM32 assembly, but with the resources available today, I wouldn't get anywhere near as deep as I did in the early 2000s.
I am building a local low-power RAG system for the programming languages I like, but I'll still include STM32 asm.
> This is why I have always said that a degree in CS is useless without some degree of passion towards it.
I would add that I don't know how anyone can do any degree or career without some sort of passion for it.
For me personally, not only do I need passion but I have to have some sort of belief in the product and/or company I'm working for. In the early 00s I worked at a company (not software related, and not as a developer) where I didn't like what I was doing, nor did I believe in the product; it was lacking in so many areas where they were trying to make it fit the market. I left after 3 years and did something completely different.
Are the Knuth books, for instance, "behind current technology"?
No.
A CS degree is not about the JavaScript library du jour; it is about the fundamentals of computation, which don't really change.
Having taken a graduate-level CS course as a non-CS major: yes, the software is about a decade behind what is actually being used. But the algorithms don't just magically go bad.
In the UK I did comp-sci from 2000 and took a couple of extra modules. One was from engineering and covered communication theory -- Nyquist etc. Another was from the English Department, of all places, and covered XML and data.
Very little coverage of TCP/IP in any of the courses. The language of choice in CompSci was Java at the time, which was reasonable as OOP was all the rage.
Some compsci lecturers were very much of the opinion that computers got in the way of teaching Computer Science.
I did my CS undergrad in China but was already in the UK by the early 2000s. I was also a bit surprised there was so little mention of TCP/IP, which is kind of considered a classic if anything taught in CS is. Java was definitely the new dominating force in industry and academia at that time.
However, it depends on the resources the university has. In some places there were other, less comp-sci / software-engineering-focused degrees with a little content overlap (I guess for the financial benefit of enrolling more students), such as e-commerce / digital degrees. They shared some courses with CS, but not all.
It's difficult to remember clearly from 25 years ago. The OSI model was certainly covered, and I clearly remember datagram programming, but nothing in terms of, say, network routing protocols.
The engineering course covered Token Ring. Remember that in 2000, and certainly a few years before (when I suspect half the courses were created, as lecturers often go years between updating them), Ethernet and IP were not the only kids on the block. NetBIOS/IPX was still in widespread use, Token Ring (which I do remember being covered, as I'd encountered IPX and IP over serial and Ethernet, but never Token Ring) was still being developed, and HTTP was only 9 years old.
"Most professors were at least a decade behind current technology"
Surely there are some core concepts.
I hear that schools today aren't teaching how to build a compiler. But to me this seems like a task that exercises so many useful skills that can be applied everywhere.
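Even a toy compiler packs lexing, parsing, tree walking, and code generation into a single assignment. Here's a minimal sketch in Python of the kind of thing I mean (the grammar and every name are invented for illustration, not taken from any particular course):

    # Toy pipeline: lex -> parse -> codegen for a stack machine -> run.
    import re

    def lex(src):
        # Tokenize into numbers and the few operators we support.
        return re.findall(r"\d+|[+*()]", src)

    def parse(tokens):
        # Recursive descent: expr := term ('+' term)*; term := factor ('*' factor)*
        def expr(i):
            node, i = term(i)
            while i < len(tokens) and tokens[i] == "+":
                rhs, i = term(i + 1)
                node = ("+", node, rhs)
            return node, i

        def term(i):
            node, i = factor(i)
            while i < len(tokens) and tokens[i] == "*":
                rhs, i = factor(i + 1)
                node = ("*", node, rhs)
            return node, i

        def factor(i):
            if tokens[i] == "(":
                node, i = expr(i + 1)
                return node, i + 1  # skip the closing ')'
            return ("num", int(tokens[i])), i + 1

        return expr(0)[0]

    def codegen(node, out):
        # Postorder walk emits instructions for a tiny stack machine.
        if node[0] == "num":
            out.append(("PUSH", node[1]))
        else:
            codegen(node[1], out)
            codegen(node[2], out)
            out.append(("ADD" if node[0] == "+" else "MUL", None))
        return out

    def run(program):
        # Execute the emitted instructions on a value stack.
        stack = []
        for op, arg in program:
            if op == "PUSH":
                stack.append(arg)
            else:
                b, a = stack.pop(), stack.pop()
                stack.append(a + b if op == "ADD" else a * b)
        return stack[0]

    print(run(codegen(parse(lex("2*(3+4)")), [])))  # -> 14

One tiny grammar, and regexes, recursion, trees, and instruction sets all show up in the same project.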
To be fair, college CS programs have always been decades behind in my experience. Maybe schools like Stanford and MIT are different but the majority of CS programs are not teaching tech that is actually used in the business world.
Maybe I’m an oddball, but I’d rather hire a new grad with sound fundamentals, but learned on an older tech stack, than somebody with all the buzzwords but no fundamentals.
And I’ve always found summer internships to be a good way to find out. Even better if the candidate is willing to work part-time through their senior year.
Yeah. I see a phrase like “hirable skills” and… it feels like “skills” that are probably going to be outdated every couple of months.
100%.
For me, "hireable skills" (for a new grad) are things like "can do a basic whiteboard exercise". I'll ask them to sketch out a program to solve a business problem. I do higher ed software, so usually start with "build a class registration system from scratch" - they're recent grads, so the problem domain is known; there's plenty of space to discussion to move in several different directions; fits nicely in 20-30 minutes.
Bare minimum, I'd expect them to ask clarifying questions (particularly around system constraints, performance, etc). And then sketch out a very basic system diagram (I don't expect them to know AWS or Azure, but do want to see things like "ID provider", "course catalog", "waitlist service", etc. Then I'll pick a service and have them pseudocode some of it.
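For concreteness, a rough sketch at the level I'm hoping for on the "waitlist service" piece (all class names and fields invented for the exercise, not from any real system):

    # Hypothetical sketch of the "waitlist service" from the exercise.
    from collections import deque

    class Section:
        def __init__(self, course_id, capacity):
            self.course_id = course_id
            self.capacity = capacity
            self.enrolled = set()    # student IDs with a confirmed seat
            self.waitlist = deque()  # FIFO waitlist of student IDs

        def register(self, student_id):
            # Seat available: enroll directly.
            if len(self.enrolled) < self.capacity:
                self.enrolled.add(student_id)
                return "enrolled"
            # Otherwise queue the student, avoiding duplicates.
            if student_id not in self.waitlist:
                self.waitlist.append(student_id)
            return "waitlisted"

        def drop(self, student_id):
            # Freeing a seat promotes the first waitlisted student.
            self.enrolled.discard(student_id)
            if self.waitlist and len(self.enrolled) < self.capacity:
                self.enrolled.add(self.waitlist.popleft())

    sec = Section("CS101", capacity=2)
    print(sec.register("alice"))  # enrolled
    print(sec.register("bob"))    # enrolled
    print(sec.register("carol"))  # waitlisted
    sec.drop("alice")             # carol gets the freed seat
    print(sec.enrolled)           # {'bob', 'carol'} (order may vary)

Anything in that neighborhood - plus a sensible answer about what happens when two students race for the last seat - is a pass.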
Sadly, somewhere around 50% of grads CANNOT do the above. I'm not sure how, but I've left interviews thinking "I hope they get a refund" more than a few times.
The Pythagoras theorem doesn’t change even if you use an LLM. Fundamentals shouldn’t either. Don’t see why schools should see this any differently.
> The Pythagoras theorem doesn’t change even if you use an LLM.
Indeed. But it does change if you want an answer on a non-Euclidian surface, e.g. big scale things on the surface of Earth where questions like "what's a square?" don't get the common-sense answer you may expect them to have.
I bring this up because one of my earlier tests of AI models is how well they can deal with this, and it took a few years before I got even one correct answer to my non-Euclidian problem, and even then the model only got it correct by importing a python library into a code interpreter that did this part of the work on behalf of the model.
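For the curious, a toy version of that test looks something like this in Python (the coordinates and the 111 km/degree constant are approximate; haversine is one standard way to get the great-circle answer):

    # Planar "Pythagoras" vs great-circle distance on a sphere.
    from math import radians, sin, cos, asin, sqrt

    R = 6371.0  # mean Earth radius in km

    def haversine_km(lat1, lon1, lat2, lon2):
        # Great-circle distance between two (lat, lon) points in degrees.
        p1, p2 = radians(lat1), radians(lat2)
        dp, dl = radians(lat2 - lat1), radians(lon2 - lon1)
        a = sin(dp / 2) ** 2 + cos(p1) * cos(p2) * sin(dl / 2) ** 2
        return 2 * R * asin(sqrt(a))

    def planar_km(lat1, lon1, lat2, lon2):
        # Treat degrees as flat x/y coordinates and apply Pythagoras.
        # ~111 km/degree is roughly true for latitude, but wrong for
        # longitude anywhere away from the equator.
        km_per_degree = 111.0
        return km_per_degree * sqrt((lat2 - lat1) ** 2 + (lon2 - lon1) ** 2)

    # New York to Madrid, nearly the same latitude:
    print(round(haversine_km(40.7, -74.0, 40.4, -3.7)))  # ~5768 km
    print(round(planar_km(40.7, -74.0, 40.4, -3.7)))     # ~7803 km, far off

A model that reaches for (or re-derives) something like the haversine formula, instead of just squaring degree differences, is the behavior I was testing for.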
I agree. That's why universities should never teach any practical real world programming languages. They should stick to Scheme and MMIX.
Not sure if that's sarcasm or not, but when I was in uni (late 90s), it was C++, which was very much a practical real-world language. There was a bit of JavaScript and web stuff, but not much (but JavaScript was only 4 years old when I was a senior, so...).
Yes, it is just you. Every job posting gets hundreds of applications. A company is not going to hire someone with no experience or knowledge over someone who has both.
> A company is not going to hire someone with no experience or knowledge over someone who does.
“Solid fundamentals” are literally knowledge.
That said, you’re probably right. At least in data, hundreds of mediocre-to-awful hiring managers have convinced themselves that their stack is special and there’s no way someone without experience in Snowflake (or whatever) could possibly figure it out based on experience in other stacks.
On the plus side, it’s meant that anyone who’s not intentionally shooting themselves in the foot can find a ton of high-end talent, because they recognize that knowing a specific language is valueless compared to understanding how to code in the first place.
And yet juniors can’t find a job to save their lives now…
And I guarantee you that someone without experience in Snowflake will try to treat it like the OLTP database they are familiar with, and have horrible results. If you think you don’t need that specific experience, you are kind of proving the hiring managers’ point.
I don’t have experience with Snowflake myself. But I know enough about OLAP columnar databases (Redshift) to know how the schemas should be designed (i.e., it’s in the name).
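The transferable part is the mental model, not the vendor. A toy illustration in plain Python (no Snowflake or Redshift specifics; all data invented) of why columnar engines reward wide, denormalized tables and column-pruning queries:

    # Row-oriented layout: each record is one tuple (OLTP-friendly:
    # fetching or updating one order touches one contiguous record).
    rows = [
        ("2024-01-01", "US", 120.0),
        ("2024-01-01", "DE", 80.0),
        ("2024-01-02", "US", 95.0),
    ]

    # Column-oriented layout: one array per column (OLAP-friendly:
    # an aggregate reads only the columns it needs).
    columns = {
        "date":    ["2024-01-01", "2024-01-01", "2024-01-02"],
        "country": ["US", "DE", "US"],
        "amount":  [120.0, 80.0, 95.0],
    }

    # Row store: SUM(amount) must walk every full record.
    total_row_store = sum(r[2] for r in rows)

    # Column store: SUM(amount) scans one contiguous array, which is
    # why wide, denormalized fact tables stay cheap to query.
    total_col_store = sum(columns["amount"])

    assert total_row_store == total_col_store == 295.0

The same instinct - touch few columns, avoid row-at-a-time lookups - carries over to any columnar store.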
I mean yeah, I agree, but is it that hard to keep relevant technology in the mix? I'm not saying everything has to be cutting edge!
Sure, but are C++ or Java really that outdated? AFAIK that’s what most schools teach, maybe with some JavaScript as well. It’s not like they’re teaching Fortran or COBOL.
And with the advent of AI coding, I’d hope they can spend more time on system design, as that’s where I’ve found new grads are generally lacking.
> Sure, but are C++ or Java really that outdated?
In what sense is either "outdated" at all?? Especially Java. Anybody who's paying attention to Java since about Java 11 would know that Java is very much a modern language at this point. I don't write much C++ myself these days so I haven't kept up with that as much, but my subjective perception is that C++ is also modernizing quickly over the last decade or so.
That was my point! Unless I missed it, those are the two most common core languages at colleges, and they're both very much alive and in use.
The irony is that if they taught COBOL today, those grads could likely get a good job working on legacy code.
I took a COBOL course during undergrad in 1998. Glad I was exposed to it, but I never did anything with it.
Many professors view teaching as a secondary obligation. Even if they don't, it takes more time to learn to teach something than just to learn it. Our field is moving so fast that, outside of the major innovations, it would be quite difficult to keep up as a good teacher on everything while also doing research and the actual teaching. In addition, most new tech isn't very interesting or useful. Like every couple of months I'm getting another peek at SOTA Python or JS, and the "innovation" is just another layer of duct tape that doesn't really improve much.
Cool tech usually also sees faster adoption in academia. Rust courses were offered at the uni I went to back in 2017, for example. According to my friends still involved with the uni, there has also been a strong shift towards data science/engineering and HCD since then, both fields that saw major practical improvements.
When I was in CS, we were taught theory. If you wanted to be caught up with the current tech, you'd teach yourself.
That was my experience in the 80's - we were taught theory, we had to apply the theory in projects so we spent lots of time programming and getting stuff working - but we were pretty much expected to pick up particular languages, operating systems or libraries by ourselves.
The CS theory (i.e. maths-based) side of it has really stuck with me - the only other thing being vi controls hardwired into my brain, even though I went on to become more of an emacs fan...
Which is a good thing. They should be teaching the cornerstone principles, not offering vocational courses.
I think having one or two "software engineering" courses where it's project-based really helps. You get to actually learn how to use Git, work in a team, and architect and finish a project on time - which is going to be valuable no matter if you're seeking a software engineering job afterwards or stay in academia.
You think most people spend tens of thousands of dollars on college and expect not to be employable?
In my experience the places worth working for care about cornerstone principles far more than the hot buzzwords on a resume.
Well, if you don’t have a job, the place worth working for is the one that will pay you in exchange for your labor so you aren’t homeless, hungry, and naked.
A junior developer can’t say “you know what, I don’t want to work for your company because you don’t value cornerstone principles. I would rather sleep on the street.”
My old CS prof at my uni used to say, when this question came up: "Do you sign up for an astronomy course and expect them to teach you how to build a telescope?"
It's always puzzled me why people sign up for an academic education that has 'science' literally in the name and then complain when they get a theoretical education. It's not a tool workshop
Because they haven’t overcome their addiction to food and shelter and they need to make money to support their addictions?
universities aren't job centers either. They don't supply you with food and shelter, you pay them money and they give you the education you want. Which can mean making a lot of money if you happen to pick something private businesses value but it can also mean reading Ulysses or The Old Testament or number theory all day.
Higher education is entirely up to you, it's not a company pre-training. If you want that there are literal vocational programs that are not computer science.
And I told both of my stepsons I wouldn’t pay for a degree that wouldn’t lead to a job, and I had them research the expected income and types of jobs they could get based on the degree they were pursuing.
The best CS programs teach a lot of tech that is not used in the business world. If anything, they're often too theoretical or too experimental.
This is CMU, so they would be at the bleeding edge, just like MIT/Stanford. But I think all the schools are behind today.
Interesting that the algorithmic finance firms are still recruiting. Perhaps they still need a pipeline of rigorous thinkers, or are unwilling to cede significant influence over P&L to LLMs.
In the future it will be considered one of the most unusual cultural/social arrangements ever, that large financial services firms in the Western world operate the way they do.
I have never seen a group of people so frantically doing nothing of any value.
Because the market is eternal competition. If one firm does something that works, others have to figure it out, and nobody puts their ideas in open source.
How much more drastic would things be if these corporations did open source it? I like to think that markets are fairly efficient, so they are fighting tooth and nail for micro-percentage points, which, granted, can be worth billions. But what these companies really do is at times just short of fraud, which finance can apparently celebrate (Jane Street defrauding Indian investors).
My opinion is that they aren't worried about their competitors so much as about governments patching the loopholes they exploit, because the only way they are a net-positive game (in my opinion) is that they make money from the losses of the average person, at times in fraudulent ways.
Jane Street's $5 Billion Derivatives Scam Rocks SEBI: https://frontline.thehindu.com/columns/jane-street-sebi-scan...
Typing code has never been the difficult part of quant finance.
> Jane Street and Two Sigma are sucking up all the talent.
This is the most made-up thing I've ever seen on HN. Those firms probably hire 10 new grads a year (maybe combined!). Unless you're saying the collective talent graduating from "high-tier CS programs" numbers in the 10s, this is literally impossible.
Yeah, and Two Sigma has not been doing too hot for a few years now. Jane Street I buy - they tend to recruit a lot of CMU students. But definitely fewer than 15 of the new grads they hire each year are from CMU. They maybe hire on the order of 50-100 new-grad SWEs a year.
Way, way more than 10, but I agree with you that they are not taking even 1% of tech talent per year.
> but do not have a gauge for what kinds of problems make for an appropriately difficult assignment in the modern era.
I have no idea what is complicated anymore. You can build a 3D game engine in a weekend or two with AI.
> They do not see Google, Meta, Amazon, etc, recruiting on campus
Really? As in FAANG has stopped recruiting graduates?
I am not a graduate, but Apple has reached out to me twice in the past month. Many others too, so I wouldn’t say it’s absolutely dead, but it’s tightened a bit.
They still probably do, but mainly in India.
FAANG employees here are cheap to hire. They work very hard to stay rich or to become rich from nothing (50-60 LPA, i.e. lakhs per annum, will basically make you rich in 5-6 years if you save and invest well). The Leetcode grind and competitive problem solving are Indian childhood bread and butter these days. And given how much househelp exists in India, this kind of model is perfectly suited to be outsourced to young and middle-aged Indians who have virtually no life beyond CTC (cost to company, i.e. total comp) anymore.
I’m just surprised it took them this long to outsource.
The risk, of course, is that people start their own companies, learning from big tech, and Indians get more UPI-like tech.
If the Democrats were smart (they are not) they could landslide the next election (and 5 more) by running a simple campaign, “Americans First,” the core of which would be to slap a 1000% tax on any job that gets outsourced: your company wants to hire someone from country X, then for every dollar paid in salary you have to pay $10 to the IRS.
The funny thing is that this same 60 LPA person will happily take a $300k job (assuming parity), even with 50% of it going to taxes, because the standard of living is still quite high in, say, Atlanta or Pennsylvania compared to Hyderabad or Bangalore.
In the outsourcing scenario, the federal government collects zero taxes from those employees and the shareholders get richer.
By cutting H1Bs, Americans are actually losing money to outsourced jobs and creating a larger divide between the rich and the poor. That's something the rich actually don’t have a problem with, and something people just seem to miss.
> …something people just seem to miss
this is because “people” have stopped thinking for themselves and are overwhelmed by “social” and all the rest of the “media” pushing whatever narrative the ruling party wants them to hear. they see “oh look, we have a problem with H1B which will be solved by a $100k payment” - boom - “America First /s”
I work on a project where 40+% of the staff is offshore; surely it is much worse in many other places.
So are you saying that no multinational company can have employees overseas?
This is really a poorly thought out proposal.
there are tens of thousands of national companies - hence the term “out”sourcing.
the “multinational” issue can be solved as well (if anyone cared to solve it)
So exactly how do you solve it? Don’t let any multinational companies hire from outside of the US?
Are you going to also ban “national” companies from setting up overseas departments?
we can start with 37,574 national companies which will take care of tens / hundreds of thousands of jobs and expand out.
don’t be a “can’t be done” person looking for excuses, be a solution guy - it will serve you well in life. this is an actual real problem that needs a solution - be a part of that solution
And you are going to tell all of them they can’t hire cheaper developers but the multinationals can? You do realize that puts them at a competitive disadvantage?
Are you going to also tell them they are not allowed to expand overseas?
you are completely missing the point…
No, you are making a completely illogical proposal without thinking about the very obvious knock-on effects or how unworkable it really is.