How Should We Integrate AI Tools into Learning?

Jul 19, 2023
John Wark

And the survey says…

This is the second in a series of blog posts on AI tools and their impact on software development, both from a learning perspective and from a job demand perspective. You can see the first post here. 

The survey questions we reviewed in the prior blog post related to whether and how working developers and analysts were using AI tools in their work. Such snapshots are highly valuable to us as we assess if and when to alter the curriculum of our bootcamps to reflect changing workforce needs, especially when we detect changes in the knowledge required of junior talent. The survey questions covered in this blog post focus even more directly on whether, and possibly how, to integrate the teaching of AI tools into our bootcamp programs. 

 

The first of the three quantitative questions related to understanding and learning asked: Does the use of AI tools enhance your understanding of coding concepts? Respondents answered:

  • Yes - 55%
  • No - 45%

When we asked for examples of how AI tools enhanced their understanding of coding concepts, we received answers such as:

  • “I wanted to learn about how to create an object of ClassB from an object of ClassA in C#. It explained the common ways that this can be done, such as using custom constructors and explicit instantiation. I'd learned that stuff before by googling, but then I was able to ask it about other methods, such as automapper, casting, and interfaces, and it really deepened my understanding.”
  • “I can ask ChatGPT how certain things work together and it more often than not gives a better concise and correct answer than me just googling and looking at documentation. They do help slightly. But often the code is not correct due to ChatGPT being not up to date and not being able to share data with it.”
  • “AI, ChatGPT specifically, does a really good job at providing documentation/explanations of any code it creates, but also can tell you what any code you provide it does as well. And not just the actual function of the code, but also explaining concepts …”
  • “It’s more personalized and comprehensible than trawling through Stack Overflow.”

 

The second question we asked was: Do you feel that learning how to use artificial intelligence tools is important for preparing future developers for real-world coding scenarios?

  • Yes - 82.5%
  • No - 17.5%

The follow-up question was: Why is learning these tools important for preparing future developers? Compared to all prior questions, this one drew quite a bit more response. One that particularly struck me, both for the evident thought behind it and for the dilemma it captured, was:

  • “I'm uneasy about answering this question. Considering this tool wasn't really around yet while I was in class, we were forced to debug, problem solve our own code and ask peers for help. I think this is a wonderful way to learn as every classmate had different strengths and you learned to work as a team as you would in the real environment of the career. I think it boils down to personality here. If a young developer thinks they can jump into a career prompting ChatGPT for every solution and copy / pasting the response, that's going to end very badly for them. However, it's a useful tool when utilized as listed above. It shouldn't be the first thing a developer does which could be very difficult to enforce, oversee in the bootcamp environment at NSS. Some folks might just want the ‘easy win’ and abuse ChatGPT to somewhat bluff their way through the program unfortunately.”

Additional representative responses to this question included:

  • “It's pretty clear that these tools are here to stay in one shape or another, and I think learning them now will help folks stay ahead of the curve.”
  • “With how accessible AI is becoming and how useful it is for developers, there's no reason not to take advantage of it, so long as future devs still understand the code being written both by them and the AI. To me it's much like frameworks and libraries, they're tools that make coding easier to do, but it's still necessary to understand programming concepts regardless of how abstract the code becomes.”
  • “I think that it doesn't hurt to introduce new developers to these tools now, 1) because they can often give quicker and more accurate feedback than an instructor when it comes to things like typos, versioning issues, etc. and 2) I can only imagine these tools becoming more and more integrated into the workplace, so knowing how to ask AI the right questions seems to be crucial for job security in the next few years.”
  • “The hard part is thinking of the right questions to ask the AI to get what you are looking for.”
  • “I believe this will be yet another tool for developers to use. We have Google. We have Stack Overflow. We have several forums and community sites. This is an efficient tool to help a developer answer their own question.”
  • “I'm iffy on this. I think a developer from NSS has enough knowledge to figure this out on their own if they can Google for things.”
  • “While I don't think the tools themselves are necessary for building integral developer skills, exposure to these resources in the workplace is all but inevitable at this point, so early familiarity with their capabilities and shortcomings would be useful for a new dev.”

 

As a follow-up to the preceding topic on the importance of learning these tools, we asked an even more specific question about integrating AI tools into our bootcamps: Would you recommend the integration of artificial intelligence tools. . . in coding bootcamp curriculum to future students?

  • Yes - 76.7%
  • No - 23.3%

The follow-up open-ended question about integrating AI tools into our curriculum was: Why or why not? This question generated an even higher volume of responses than the prior question. There were so many that we can’t share them all here. To summarize, the consistent themes were: a) yes, but with caution; b) maybe, but near the end of the class; c) beginners still need to learn to code without the crutch; and d) be aware that students are going to try the tools on their own anyway, so assessment will be key. A sample of the responses includes:

  • “Maybe, fundamentals are still important but you need to know how to use new tools.”
  • “Students will be exploring these tools on their own anyway, and setting early guidelines on the best ways to utilize them would be useful. It would also be useful for the instructors so that these tools are a known and examined aspect of the class, which makes it less likely that students just sloppily throw all their work into it and submit whatever comes out.”
  • “This one's hard, but I think it's something everyone is going to be using. You have to understand how to prompt the AI and not abuse it. Team Leads openly post about their responses from ChatGPT at ◼️◼️◼️◼️◼️◼️ so it's no secret everyone is using it. You can't avoid it but the curriculum needs to stress it's a TOOL not a SOLUTION.”
  • “Maybe just exposure and some do’s and don’ts. THEY ARE GOING TO USE IT ANYWAY, so prepare them for the real world and help them understand the advantages, and dangers, of using these tools.”
  • “Yes! But not early on. Maybe towards the end?!”
  • “At the end of the course. We still need to know how to code to be able to debug.”
  • “I answered "No" because the widespread use of AI-based tools for coding will make understanding how software works and fits together one of the most important values that developers bring to the process. In my limited experience, AI-based tools aren't good at helping you learn or understand new things; they're good at helping you do things you already know how to do faster. Learning to use AI tools for productivity is something that could easily be taken on in a first job where you can learn it in the context of the team you're working with.”
  • “I feel like it should be so brief if at all. Googling is still the standard IMO because I don't think ChatGPT is able to keep up with the rapidly changing web (cough JavaScript). I think the focus should be on googling and reading documentation. So maybe starting without showing these tools, and then demonstrating how these tools can enhance your workflow later in the class, because that is what they are essentially. Not a replacement yet. I actually recently worked with someone going through NSS who at the time was early on in the program, and I feel like ChatGPT distracted them from trying to figure out the problem. I don't have a good reason, but I feel like them knowing about ChatGPT was an inhibitor to their learning at that point of the program, probably because it is so easy to lean on it like a crutch as an aspiring developer. We all know why that is a problem.”
  • “Yes, but only at the end as an extension. Bootcamp is so short and you can get some really bad code from Copilot. It might work. You may get your capstone done, and you could have no idea why it's not a good idea or even what it is doing. This would be a serious pitfall for a beginner because as I mentioned before, the skill is not just using AI. It's discerning what is the appropriate use of the AI tool and whether that code/pattern fits the problem context.”
  • “Given how little time a bootcamp student has, I would still recommend covering the topic but only briefly. Students will most definitely benefit more from hands-on troubleshooting.”

It’s very interesting to see comments from our graduates speaking to many of the same questions that our Learning Leaders team has been wrestling with as we consider if and how to respond to the rise of AI tools for coding and analytics. I’d say that the experiences of our graduates mirror in many respects what our instructors (and in some cases our students) have found in our experimentation with AI tools such as ChatGPT and GitHub Copilot. And their opinions regarding the importance of students first learning the fundamentals of how to do the work themselves definitely parallel our overall beliefs. 

So, how do we balance the competing demands of exposing our students to assistive AI tools while not undercutting or diminishing their learning of the fundamentals of delivering quality, working solutions? We’ll speak to some of our answers to that question in the fourth post in this series, but first we need to directly address one other fundamental issue: are software developer and/or data analyst jobs going to be eliminated by generative AI technologies? After all, why worry about how to train people if the jobs aren’t going to be there? That’s the topic of the next post in our series. 

 

This was the second and final post outlining the responses from a survey of our graduates regarding their use of AI tools on the job. You can read the first post here. The next post in our series on AI tools will address the impact of these tools on the demand for such talent: will AI tools reduce or eliminate demand for talent, or will they drive additional demand for professional problem-solvers? Watch for it. 

Topics: Learning, Analytics + Data Science, Web Development, Software Engineering