When ChatGPT, a large language model–based Artificial Intelligence (AI) chatbot, was first released to the public in November 2022, it provoked a range of reactions. Optimistic tech enthusiasts hailed it as the beginning of a brave new world, while skeptics worried it marked the beginning of the end. But everyone seems to agree the technology will have a significant impact on society.
Many corporations have already greeted generative AI with open arms, using it to make workflows more efficient and products more potent. In the realm of education, however, this paradigm shift has met with far more worry and hand-wringing. We’ll take a look at some of the top concerns about AI in education and the adaptations teachers are making in an effort to keep up.
For many educators at the high school and college level, the most pressing concerns around AI involve academic honesty. ChatGPT can provide original responses to a vast range of complex questions, and it can even produce entire research papers with works cited lists. Conventional approaches for identifying cheating, such as plagiarism detection software or searching for matches online, are no longer sufficient in these cases. While some programs exist for detecting AI authorship, they can rarely prove anything conclusively. Moreover, these programs are reactive, meaning developers will constantly have to produce new detection programs to keep pace with rapid advances in the field of AI.
While generative AI can seem all-knowing, it's only as smart as the information it's trained on. Large language models such as ChatGPT or Google's Bard don't have the ability to critically evaluate accuracy or bias, which means their responses can occasionally contain errors. If you're an expert in a specific topic who chooses to test an AI's knowledge, you might notice these errors. However, students who use AI as a virtual tutor may not realize when the AI makes mistakes, and they don't always understand the importance of vetting information. This could lead to students underperforming on assignments, but it also has larger implications as these students graduate and struggle to critically evaluate source materials in an information landscape riddled with half-truths and partisan bias.
Because AI is capable of completing so many routine tasks such as drafting emails or writing papers, some question if there is even a point in continuing to teach students to do many things for themselves. The problem is that even though AI can be helpful in many ways, it’s always best for users to review AI-generated writing to ensure it is of sufficient quality and appropriate for the task at hand. If students no longer study topics like composition or rhetoric, they won’t have any way of knowing if the work AI has done for them is adequate or appropriate. In other words, if the autopilot ever fails, the person in the cockpit should know how to steer the plane.
It’s difficult to predict the next steps in AI development, or how quickly change will come. In the near term, many educators have gone back to relying on oral examinations and handwritten assignments as workarounds for AI interference. In all likelihood, the field will have to evolve dramatically over time in response to technological advances.
If you have a child, teen, or young adult in school right now, here are some simple steps you can take to ensure success in this time of change:
- Talk with your student about AI: see what they think and what they’re hearing from peers and teachers.
- Emphasize to your student the importance of being accountable for the integrity of their work, creating original work, and learning new skills.
- If your student is allowed to use AI for certain assignments, ask them questions about how they use the technology, what strategies they use to maximize their outcomes, and how they can critically evaluate their AI-assisted work.
While no one knows for sure what’s coming next, it’s usually wise to err on the side of caution. There’s no disadvantage to individuals being knowledgeable in their own right, and this can only enhance their ability to use AI productively. Pushing students to learn skills and concepts for themselves seems like a safe bet for the foreseeable future.
The content is developed from sources believed to be providing accurate information. The information in this material is not intended as tax or legal advice. It may not be used for the purpose of avoiding any federal tax penalties. Please consult legal or tax professionals for specific information regarding your individual situation. This material was developed and produced by GW Financial, Inc. to provide information on a topic that may be of interest. The opinions expressed and material provided are for general information, and should not be considered a solicitation for the purchase or sale of any security. Copyright 2023 GW Financial, Inc.