October 3rd, 2024

Employers Say Students Need AI Skills. What If Students Don't Want Them?

Employers demand AI skills for job readiness, but many college students resist using generative AI due to ethical concerns. Attitudes vary by field, prompting calls for balanced AI training in education.


Employers increasingly emphasize the necessity of artificial intelligence (AI) skills for future job readiness, yet many college students express reluctance to engage with generative AI technologies. A recent survey revealed that a significant portion of students are either unfamiliar with AI tools or outright refuse to use them, citing concerns over cheating, misinformation, and data privacy. While some students are open to AI, a majority believe its use should be limited in academic settings. The divide in student attitudes towards AI often correlates with their field of study, with humanities students generally more resistant than those in STEM fields. Experts argue that higher education institutions must adapt to this technological shift, as AI literacy is becoming essential for workforce preparedness. However, there is a challenge in changing student perceptions and institutional policies that have historically been cautious or restrictive regarding AI use. Many educators advocate for a balanced approach that includes AI training while also fostering critical thinking and communication skills. As generative AI continues to evolve, the role of higher education in equipping students with relevant skills remains a topic of debate, with some experts cautioning against overemphasizing AI at the expense of broader educational goals.

- Employers are increasingly seeking AI skills in job candidates.

- Many students are resistant to using generative AI, citing ethical and practical concerns.

- Attitudes towards AI vary significantly between different fields of study.

- Higher education institutions face challenges in integrating AI training into curricula.

- Experts advocate for a balanced approach that includes AI literacy alongside critical thinking skills.

14 comments
By @phkahler - about 2 months
Maybe employers are just believing the hype from companies like OpenAI. The student responses seemed spot on to me. They need to learn the material and how to do their job, not chat with a bot.
By @DiscourseFan - about 2 months
It's not really useful unless you're a high school student or something. And even that's bad enough. There are definitely use cases, but it will probably take a decade or so to integrate it properly into industrial processes, and even then it will probably be limited and used in very strange ways you couldn't possibly imagine today. Someone is going to come up with something really weird that saves a lot of time on some ultra-specific process and you won't even notice it. Meanwhile, people will expect work emails to be written with more personal expression, or else they'll think an AI wrote it (I already know some people who got in trouble for using AI to write important emails and were then threatened with lawsuits over language in those emails that they themselves, of course, had not read).
By @Onavo - about 2 months
For STEM students, what they need is statistics (with a focus on high-level Bayesian and statistical learning theory, not just frequentist regression tricks) and differential equations. Then they can build AI. AI, or specifically deep-neural-network-enabled machine learning, isn't some sort of magical black-box solution with no weaknesses (I will, however, admit that it is the best universal function approximator we currently know of). Otherwise you end up with an uneducated public whose main education comes from Hollywood and ChatGPT.

You need to start from high school, the AP classes need to be revamped. Currently they are focused on purely frequentist statistics. Frequentist statistics is great for most empirical sciences like biology. The formulas are mostly plug and play and even pure life science people with no mathematical talent can wield them without trouble. The problem is that they are very far from statistical learning.

Here's the current AP Stats curriculum; it is meant to be equivalent to Stat 1:

https://library.fiveable.me/ap-stats

If you want to develop a strong foundation for ML, Units 6, 7, and 8 ought to be thrown out entirely. At the level they are taught, they don't teach anything more than plugging in formulas. Unit 4.5 (Conditional Probability) and Unit 5 (Sampling) need to be developed further to cover Bayesian theory, perhaps with a segue into graphical models and Markov chains.

Generative ML, for example, interprets the likelihood as an information generator (since in Bayes' formula it is roughly the "inverse" of the conditional probability); unfortunately, most stats classes outside of physics and high-level ML theory will never mention this. Most classically trained statisticians won't ever encounter the idea. But it is the bread and butter of generative AI. Having a vague idea of KL divergence and of what Metropolis-Hastings is coming out of high school is infinitely more useful for a career in ML than knowing how to fiddle with a p-value.

You can teach most of these concepts without calculus if you simplify some things and replace integrals with their discrete summation versions. Rejection sampling, for example, is very easy to teach. The Common Core needs a revamp, and perhaps it's time to shift away from the historical focus on calculus/pre-calc as the centerpiece of pre-college mathematics teaching.
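To make the commenter's point concrete: both rejection sampling and discrete KL divergence really do fit in a few lines of calculus-free code. This is a minimal sketch, not taken from any curriculum; the target distribution and variable names are invented for illustration.

```python
import math
import random

# Unnormalized target distribution over a small discrete support.
# Rejection sampling only needs the weights up to a constant factor.
target_weights = {0: 1.0, 1: 3.0, 2: 2.0, 3: 0.5}
support = sorted(target_weights)
M = max(target_weights.values())  # envelope: M >= w(x) for all x

def rejection_sample(rng=random):
    """Draw one sample: propose uniformly over the support,
    accept with probability w(x) / M."""
    while True:
        x = rng.choice(support)  # proposal q(x) = 1 / len(support)
        if rng.random() < target_weights[x] / M:
            return x

def kl_divergence(p, q):
    """Discrete KL(p || q) = sum over x of p(x) * log(p(x) / q(x)),
    with the convention that terms where p(x) = 0 contribute nothing."""
    return sum(px * math.log(px / qx) for px, qx in zip(p, q) if px > 0)
```

Drawing a few tens of thousands of samples and comparing the empirical frequencies against the normalized weights (1/6.5, 3/6.5, 2/6.5, 0.5/6.5) makes the acceptance rule tangible; computing the KL divergence between the empirical and true distributions then gives a single number that shrinks as the sample grows.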

By @TrackerFF - about 2 months
Also, the way I see it:

"AI skills" is comparable to what using a search engine was before.

You'd be absolutely amazed how many people still can't use search engines beyond the absolute bare basics: typing something into Google and giving up if the result isn't at the top of page 1.

I've worked with plenty of (non-tech) people who are like fish out of water when trying to find information. Just learning things like Boolean operators, searching for exact phrases in quotation marks, or specifying which sites and date ranges to search is way beyond what most people know or do.

Same goes for LLMs. There's a difference between just typing a prompt and knowing what to ask for and how to structure your questions.

By @Almondsetat - about 2 months
Perhaps AI will achieve what Excel, scripting or programming failed at?

Education is slow; people still don't know how to use spreadsheet software or a scripting language to enhance their lives or work. Students don't use Excel or Python to cheat on their homework or exams, so they don't really have a reason to learn those tools.

Meanwhile, user-facing AI tools are often extremely intuitive and feel quite natural to interact with, so by the time young people reach working age they already have the familiarity they need.

By @TimGlowa - about 2 months
It’s true—you can't lead a horse to water, and AI is no different. It's becoming like computers once were: you can choose to embrace it or not, but eventually, it will be a necessity. The bigger question isn’t just about students accepting AI, but about understanding the upskilling needs of both current and prospective employees. How do we plan to strengthen these skills to keep everyone competitive in a rapidly changing job market? The real focus should be on practical, future-proof skill development—whether students are eager for it yet or not, the workplace will demand it. Career coaches (real ones are always best) can help with this, providing personalized guidance and support. Alternatively, tools like JobMatch Pro can offer similar benefits at a fraction of the cost, helping users navigate career paths and develop the skills needed for the future. You can learn more at https://hrbrain.ai/jobmatchpro/.
By @TrackerFF - about 2 months
So a nephew of mine started studying CS this fall, and I've been helping him out with his intro to CS class.

They recently got a larger mid-term assignment which involves implementing some well-known, basic data structures, and the standard functionality associated with them.

In one of the problems, they were given skeleton code for some more complex functionality, and their task is to explicitly use an LLM of their choice to fill in the code and then check that it works against a set of tests.
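The comment doesn't include the actual assignment, but the pattern it describes (skeleton code the LLM completes, judged by a fixed test set) might look something like this. The class, method names, and tests here are invented for illustration, not the real coursework.

```python
class Stack:
    """Array-backed stack. In the assignment pattern described above,
    the method bodies would be blank and filled in by an LLM; shown
    here completed so the tests below pass."""

    def __init__(self):
        self._items = []

    def push(self, value):
        self._items.append(value)

    def pop(self):
        if not self._items:
            raise IndexError("pop from empty stack")
        return self._items.pop()

    def peek(self):
        if not self._items:
            raise IndexError("peek at empty stack")
        return self._items[-1]

def run_tests():
    """The fixed test set: the LLM-completed code is graded by
    whether these checks pass."""
    s = Stack()
    s.push(1)
    s.push(2)
    assert s.peek() == 2
    assert s.pop() == 2
    assert s.pop() == 1
    try:
        s.pop()
    except IndexError:
        pass
    else:
        raise AssertionError("expected IndexError on empty pop")
    return "all tests passed"
```

The interesting pedagogical twist is that the tests, not the code, are the deliverable the student must understand: whatever the LLM produces, the student has to verify it against the specification the tests encode.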

I think the class in general has been updated to assume that students are using LLMs more and more, as the problem sets this year are longer and more complex, compared to those of past years (which were made available to all current students).

By @tkgally - about 2 months
Ever since ChatGPT came out, I’ve been discussing it and other AI tools with the students in the university classes I teach. My impression matches the results of this survey. Some of the students have started following AI developments closely and using the tools, but many of them don’t seem interested and they wonder why I talk about it so much. Even when I told the students that they could use AI when doing some of their assignments, it was clear from their distinctive writing styles and grammatical mistakes that most of them had used it only sparingly or not at all.
By @drivingmenuts - about 2 months
Why do they need AI skills when they're going to be replaced by the same AI that is learning from their work?
By @rsynnott - about 2 months
Oh, ffs. What are “ai skills”?

I’m reminded that the big thing when I was in college was XML databases. XML databases, we were assured (though not particularly convincingly), were the future. I didn’t opt for the course covering XML databases, and have somehow survived 20 years later; meanwhile, no-one really remembers what an XML database even was.

(It was, all told, a rather boring time for tech fads. The 90s AI bubble had imploded with such finality that people barely dared utter the term 'AI', the dot-com crash had just happened, and the gloss was off CORBA, so weird XML-y stuff got pressed into service for a few years until the next thing came along.)

By @zingababba - about 2 months
My company fired devs and replaced with AI... It's coming.
By @bamboozled - about 2 months
Who cares what people want \s
By @cheema33 - about 2 months
AI is just another tool. A pretty good one. And there is a skill to using it efficiently. If you ask it dumb questions, you will get dumb answers. Garbage in, garbage out.

I am a software developer and I hire devs as well. If somebody is ignorant of AI or refuses to use it, that is a hard pass for me. It is not all that different from an accountant not wanting to use computers. Sure, you could still do some work, but you will not be competitive.

By @atleastoptimal - about 2 months
Students will only need AI skills for the next 2-3 years, after which point AGI will render the "need" for any skills as meaningless as expecting students to have woodworking or metalsmithing skills.