COMMENTS:
Disclaimer: I am not a CS degree holder. But I did attempt a masters in software engineering, and it was really eye-opening to see how far behind the "traditional" curriculum was compared to real-world opportunities. My university completely overlooked things like front-end web development and a host of other modern needs at the time; when I spoke to the dean, he suggested the school might just not be a fit for me.
So if we're leaving students behind for AI outputs and shutting the door in favor of the old guard, what happens to the new wave? Will schools train them to get to the point of working with systems enough to call bullshit on AI? Are schools even teaching students AI right now?
I ask sincerely, because when I pushed for changes in the curriculum, I was basically shown the door, which led to me dropping out and (thankfully) landing a gig at a startup that opened another (better) door for me.
There must be plenty of opportunities to learn that in the USA as well.
It, in fact, ended up mattering a lot.
It's a shame since once you have the foundation, the degree is irrelevant imho. You continue learning on the job and off the job.
It's not that learning AI isn't important, it's just that school isn't necessarily the place for it: AI is always changing, and can be learned relatively easily online, while properly learning something like linear algebra or operating systems online is pretty unlikely except for a very motivated student.
Some programs have had to supplement their CS curriculum with courses on software engineering practices (e.g. version control). The "AI" being sold today isn't going to replace these practices.
This isn't really limited to CS programs either; something is deeply wrong with the system when college grads are only marginally more prepared to enter their chosen field than HS grads. One of my friends builds cutting-edge sensors for scientific equipment, literally the ideal case for an academic curriculum. Not two months ago, they rejected an Ivy ECE major in favor of a guy with no degree who builds custom toy drones as a hobby.
Trouble is I work in a lab building GenAI products. You can't play a player. Remember kids, your fingers move when they type and your eyeballs move left to right when you're reading.
CV was also full of exaggerations that were generated.
Candidate rejected without feedback. We were lucky; I suspect someone else has been stuck with the GPT copy-and-paster.
I’ve had interviews where people have their friend bridging calls and doing the lookups. Some were so incredibly stupid that we could hear the typing from the next room lol.
Anyway remote interviews are a nightmare already.
How long until this is also fixed by the software? :-) This is within the state of the art now.
>> CV was also full of exaggerations that were generated.
+1. I see companies offering AI-generated CVs which barely even ask about the work a candidate has done. One of them stated that the customer should refine the AI-generated CV to match their actual skills and experience, but did not say anything about how their product/service helps with this. It's just left at AI-autogenerated CV. (Shame on such companies.)
There are many companies (startups) in AI these days that are just creating prompts/templates. Their only value-add is having experimented with models and prompts more than the average user. I agree this wouldn't be much, but it could be enough to lure non-savvy users into paying.
But it seems like a bad sign for a technology when think pieces need to be written portraying skeptics as Cro-Magnons on the evolutionary scale of adoption; there's something very "emperor's new clothes" about it. The value of revolutionary tech typically speaks for itself and doesn't need extensive apologetics.
LLMs might replace the bootcamp boys and dilettantes but there will always be room for people who know shit. If that isn't you, make it you.
The inputs were traditionally source code, but sure, we could in principle use prompts for an LLM as the primary inputs, which then produces source code that gets fed to the rest of the build chain.
But editing the source code produced by the LLM is a non-starter because then you're editing build artifacts.
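The "build artifact" framing above can be made concrete with a minimal sketch, in the spirit of make(1): treat the prompt file as the source and the generated code as a derived artifact that is regenerated whenever the prompt changes. The `llm_generate` function here is a hypothetical placeholder standing in for whatever LLM API you'd actually call; the point is the dependency direction, not the generation itself.

```python
from pathlib import Path


def llm_generate(prompt: str) -> str:
    """Placeholder for an LLM call (hypothetical; any real API would do)."""
    return (
        "# AUTOGENERATED -- edit the .prompt file, not this file\n"
        f"# prompt was: {prompt!r}\n"
    )


def build(prompt_file: Path, out_file: Path) -> bool:
    """Regenerate out_file only when the prompt is newer, like a make rule.

    Returns True if a rebuild happened, False if the artifact was up to date.
    """
    if out_file.exists() and out_file.stat().st_mtime >= prompt_file.stat().st_mtime:
        return False  # artifact is up to date
    # Any hand edits to out_file are clobbered here -- which is exactly why
    # editing LLM output directly means editing a build artifact.
    out_file.write_text(llm_generate(prompt_file.read_text()))
    return True
```

Under this model, changing behavior means editing the `.prompt` file and rebuilding; touching the generated file directly only lasts until the next rebuild overwrites it.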
I VOTED FOR KAMALA AND YES I SEE THIS AS AN ISSUE