
Should We Teach Generative AI to Junior Developers and Analysts: NSS Alumni Perspectives

Written by John Wark | Jul 17, 2024

This is the second post in a series that reports on the results of our second annual survey of NSS alumni’s experiences using - or not using - generative AI tools on the job. For the first post in this series, see here.

In the first blog post in this series, we reviewed the survey questions related to whether and how working developers and data analysts are using generative AI tools in their work. Such snapshots are highly valuable to us as we try to assess whether and when to alter the curriculum of our bootcamps to reflect changing workforce needs, especially when we detect changes in the knowledge required of junior talent.

The survey questions covered in this blog post focus even more directly on the question of whether, and possibly how, to integrate teaching of generative AI tools into our bootcamp programs. Just as we found last year, there are very real differences of opinion among our alumni regarding the extent to which we should be teaching the use of generative AI tools - or even whether we should be teaching anything about these tools. While we have already decided to incorporate some teaching on the safe use of these tools, there are still real questions about how best to do that and how deep to go.

Do you feel that learning to use generative AI tools is important for preparing future developers or data analysts for real-world coding scenarios?

This question is interesting because it’s the only one that became more negative in 2024. Not hugely so, but there was a 15% increase in No votes. The negative sentiment tended to run along the lines of these comments:

"I think it is a terrible idea for students, or even junior engineers to use these tools. In order to use these tools effectively, you need the experience to read code very effectively, and I feel that these tools would hinder that learning process."

"The hard part of using these tools is that it can feel like you know more than you actually do. I'm glad that using them wasn't an option when I was going through school as it would have been tempting to use it to solve problems without understanding why something was the case. Further, these tools are not always correct, and you need to have some base knowledge to know what's true or false."

"I answered no for two reasons. First, you might end up at a company who blocks all access to it. Second, it’s pretty easy to use and getting easier by the day."

"I think it is another tool. However, just like it might be helpful to learn vanilla JavaScript before picking up a modern framework, understanding fundamentals and patterns is still paramount."

"It depends. Will they know enough on their own to debug and complete a technical interview without the use of AI?"

"Yes, but also no. They must understand the basics themselves or any usage of AI is a crutch. They should learn how to extract value from genAI solutions without relying on pasting their own code into one."

"No, with a caveat, I think it is a great tool for learning. I think students should be encouraged to enhance their understanding of concepts but having been part of an instructional staff I have noticed a stark decline in serious engagement and drive to understand what it takes to be a data analyst and how to write SQL queries or python code. I’m torn, I don’t think it should be a part of the data analytics curriculum. I think it could be a useful tool to enhance students learning and understanding. Unfortunately, it seems to be more used to just produce code for the student leading to a lack of understanding of key and absolutely crucial troubleshooting skills that have to be honed by allowing yourself to make mistakes."

On the flip side of the coin, there were many comments along these lines:

"This is really tricky... I'm glad I learned the basics before AI tools came around. It probably would have made me a lot lazier and reliant on AI. But on the other hand people should know how to write good AI prompts and use the tools because they are only going to get better and become more used in the industry."

"It makes you a better employee if you use it right and also, if you don't use these tools, someone else will and be faster and better than you."

"Why not? Especially in this field, people should be quick to grow and learn and accept new technologies. AI is super helpful and it isn't 'writing the code for you' so let's teach people how to use it to our advantage."

"That's the way things are moving. Get on board or get left behind."

"My Dad told me to think of it like digging a ditch. We don't dig ditches with shovels anymore. We use tractors."

"I think it's a tool that's here to stay and future devs should be taught how to use it responsibly. More important is learning how to define the problem and formulate an educated question. This involves an established understanding of the topic."

"It’s not going away. Being able to write good prompts will be a necessary skill."

"I think AI will continue to be ingrained in our careers, and it's important to understand it from the beginning. Things like GitHub Copilot will continue to grow and become faster/better/stronger, so it's best to know how to properly use these tools going forward."

There are definitely strong opinions on all sides of this question. We’re working to navigate the maze as all of this matures and continues to rapidly evolve. Our current opinion is pretty close to the comment directly above, blended with a strong belief that developers must know the fundamentals themselves, given that genAI tools cannot be fully trusted - we must still have a human in the loop, and that human needs to know how to do the work without using genAI.

Are there specific skills related to the use of generative AI tools that you would recommend be introduced to beginners during the bootcamp experience?

We didn’t ask this question last year - we asked a narrower question about integrating the use of genAI tools into the curriculum. That felt too similar to the prior question, so we made this year’s version a bit broader, with a focus on skills related to genAI rather than on the tools themselves.

It’s interesting how negative the responses are compared to the sentiment in the answers to the preceding survey questions, which we reviewed in the first blog post in this series. The answers to this question are also more negative than those to the question immediately above, which seems a bit contradictory but may reflect the mixed feelings about genAI that come across in many of the comments above and below.

Here’s a sample of the affirmative comments alumni left in response to this question:

"Understanding how to write good prompts. AI is like communicating with a person, if you can't get your idea across to the AI, it's going to give you bad or incomplete responses. The concept of having good communication skills still holds true when interacting with AI."

"Learning these tools as a supplement and not a crutch. They are wrong very often, so copy and pasting their output without understanding it will do much more harm than good."

"The skill to know that AI has no idea what it's talking about and you'll need to test and thoroughly verify everything it puts out."

"How to write questions. The idea of the "context window" and the basics of how genAI works. Knowing these things allow you to craft better questions to get better responses."

"The two main concepts I would focus on is effective ways to ask it questions, and understanding that it can hallucinate incorrect results very convincingly."

"Ask the AI to explain every vocabulary word you don’t know. Don’t take any new concepts for granted and dive deeper into them when they aren’t understood. For my experience, I felt like chatGPT really helped me have a strong technical vocabulary that set me apart from other junior developers during the interview process. (At least that’s what my now employer tells me.)"

"Yes and no. The biggest thing is DO NOT USE AI for code generation when starting out. You have to learn the basics before using code generation. Because the vast majority of generated code is subpar. I said though that having a chat open and asking for explanation on what code is doing and why can be very helpful, using AI chat bots as a personal tutor is a powerful pattern to practice."

"I would just teach folks how to use it for questions and debugging initially. And also teach them that if you give it code or ask it to write code, it won't work as smoothly as you think."

And here is a sample of the more skeptical or negative responses to the question:

"I think its unnecessary, and can be used as a crutch to get assignments done instead of learning."

"No - if you know how to learn you can figure out how to use these tools. If you don't know how to learn, you will use them in ways that short circuit what should be a learning process. I find learning how to learn to be the essential idea, then how to leverage AI to learn. But it's like: how would you leverage Gabe C always in your pocket? What kinds of questions would you ask? I would emphasize: DO NOT USE AI TO GENERATE CODE very much. Use it as a thought partner."

"AI tools are not useful so there are no specific skills. At most, one could say debugging bad code outputted by LLMs is a useful skill, but that’s not different than debugging code generally which is one of the important skills."

We gave alumni a chance to let us know if they saw any need for professional development or continuing education opportunities that we might offer for graduates or others in the community. However, I don’t think I worded the question clearly enough, so we got a lot of answers that appeared to relate more to teaching students during bootcamps than to continuing education for working professionals. As a result, there isn’t much to share from the answers to that question, other than that folks did suggest some ideas for workshops or classes, and we’re considering them.

Finally, we gave alumni a last opportunity to share any other thoughts regarding their experience using AI tools. Here are a few of their thoughts:

"I did not have to go through a coding interview to get my job, but I would imagine that the over reliance on tools like ChatGPT or Copilot could potentially cause issues in interviews where you don't have access to these tools and need to solve a problem without them."

"My cohort used Stack Overflow exhaustively. I would say that ChatGPT is simply the evolution of that. Just like it is the evolution of a text book for my grad school work. I am using it to teach me algorithms at the Masters Degree level. It's like a private tutor. Maybe that is one way to use. 'Teach me a for loop.' 'Teach me recursion.' 'Teach me the difference between binary and hexadecimal'. Etc."

"To summarize: I think for more experienced developers, generative AI CAN be an extremely useful production tool. As a learning tool, I don't recommend it to people who are new to development, as I think it takes away a lot of the critical thinking if they become too reliant on it."

"My general thoughts on using AI tools is that I have concerns that it may stunt some folks ability to grow their skills and understanding if they rely on it too much to write code for them. However, when used as a knowledge resource it is quite useful for expanding knowledge and expertise."

"I didn't start using these tools until after doing your last survey. Since then, my view has changed and I think they're a really efficient way to improve your programming skills, as long as you know how to use them effectively."

"Be careful with company policy and AI. It can cost you your job if they find out you're doing it irresponsibly. Don't copy/paste code. Change it so as to protect the code base as intellectual property of the company."

"I have heard talk of AI replacing the software developer. From what I have seen, this falls short of being true. Instead, AI is a great tool for enhancing productivity."

"For Data Analytics, I recommend not teaching anything about AI’s until right before capstones. Personally, I enjoyed the challenges of each assessment and having to sort through class materials."

"I just got this f***ing money and Sam Altman [ED: Sam Altman is CEO at OpenAI, the ChatGPT developer] is f***ing it up for me."

Relative to that last comment - watch for the next blog post in this series. We’ll update our projections from last year regarding the impact of genAI on the job market for software engineers, data analysts, data scientists, and other tech careers. We don’t actually think that genAI is going to have the impact implied by that comment!

This was the second and final post outlining the responses from our survey of graduates regarding their use of AI tools on the job. You can read the first post here. The next post in our series on AI tools will address what else we have learned in the past year about the impact of these tools on the demand for tech talent - will AI tools reduce or eliminate demand, or will they drive additional demand for professional problem-solvers? Watch for it.