The rise of AI presents a series of new challenges for scholars and for society as a whole. With generative language models and other forms of automation becoming more prevalent, we must begin to examine our core values and reevaluate the quality of our work at both an individual and an interpersonal scale. Generative tools such as ChatGPT, Copilot, and Bard can certainly help us write, brainstorm, proofread, and handle many other productive tasks; however, it's important to know where to draw the line. Personally, I believe AI tools (in their current state) are best used to augment one's knowledge and abilities, not as a wholesale replacement for them.
Note: I wrote most of this essay by hand and used AI to help me outline paragraph specifics.
I have used AI in class this semester in the following areas:
I used AI to tutor me when I didn't understand Bootstrap tags or React components. It also helped me fix bugs quickly before our 11:00 p.m. homework deadlines.
Practice WODs were all done by hand so I could learn and practice everything I'd need to know for the real WODs.
AI was a little difficult to use on some of the specific problems presented in the real WODs, and I often felt like I was wasting more time correcting the language model than actually improving (or even understanding) my own code. If I could go back, I think I'd restrict my use of AI to:
This way, I could spend more time thinking about my program and less time on code I'd already written before.
I tend to suck at outlining and structuring essays on my own, so I asked AI to write me a mock-up outline pretty much every time, which was very helpful.
I used AI for almost everything. It was very helpful for coming up with ideas, surfacing common, well-known solutions to my problems from the web, and, of course, fixing bugs.
For the most part, I still believe that learning and practicing something on your own is the only way to retain any knowledge of it long-term, but at times it can be nice to have a friend (ChatGPT) explain the harder bits to you like you're a five-year-old.
I can answer questions on my own!
I try to come up with my own questions for the most part.
Yes, it's like googling the definition of a word you've just learned. Very helpful!
If I don't understand something the first time, I will usually look up more information on the web rather than ask an AI. Often, people post very detailed explanations on sites like StackOverflow.
Hell yes. Easy code, hard code, components, and pages.
I did most of the documenting (very little, whoops) by hand.
I used to ask ChatGPT questions like, "can you improve this code? [insert code snippet]" or "can you make this code more concise? [snippet]," but I got tired of doing that because of the lengthy, hard-to-understand solutions it would often provide. Sometimes the language model would give obscure answers I didn't fully understand, or it would import functions from libraries I'd never seen before. I eventually settled on asking a single, well-scoped outlining question whose answer I hoped the AI would find trivial to write, then copying most of that solution.
Task planning/management, romantic interest?
Overall, I think that having access to AI throughout this semester has boosted my coding confidence and capabilities, but it may have reduced my overall comprehension of the material. That said, I think what I learned about HOW to use AI to gather knowledge and actually deepen understanding is just as valuable.
Beyond its use in this class, AI definitely has the potential to change the way we approach problems in real life. AI can present solutions to problems in many fields, and it is yet another convenient method of accessing all of the internet's (accessible) knowledge at the push of a button.
One challenge we all face is learning to incorporate this new tool into our mental toolbelt while keeping it exactly that, and nothing more. If people begin to rely on AI without putting critical thought into their decisions, shit will get messy real quick. Another challenge arises from the fact that a language model's output is nothing more than a prediction. Just like humans, AI doesn't always provide accurate or adequate answers to the prompts it receives. Properly vetting these answers to reduce misinformation is another important step in integrating this new tool into our lives.
Tutoring people is difficult. Everyone learns at a different pace, and any teaching method will always be more effective with some students than with others. Teachers often don't have the time (or all the answers) for students' questions. With AI, learning can be custom-fit much more closely to an individual's learning style.
This era is only the first of many in which society will incorporate artificial intelligence into daily life. As our technology advances and we continue to adapt to the new capabilities of AI-derived tools, we must take care to utilize, rather than completely depend on, the tools we create. Eventually, society may reach the point where our use of AI becomes collaborative, or even cooperative. In that event, it will be essential to understand our ethical boundaries and to conduct ourselves thoughtfully as a society.
In conclusion, the integration of AI into modern society has reshaped the landscape of many fields, offering new avenues for learning and problem-solving. From interactive dialogue to automated code generation, AI tools have transformed the learning experience, empowering students to explore and comprehend complex concepts more effectively. Moving forward, continued innovation and refinement of AI-driven educational technologies will be paramount in improving learning outcomes and preparing the next generation for success in a rapidly evolving digital landscape.