Feb. 2, 2026

Human-AI Collaboration: Outsourcing vs Offloading and the Rise of Co-Produced Cognition

Recording from the Deep Freeze: Craig broadcasts from snow-covered north Louisiana (running on generator and Starlink!), where AI helped him MacGyver a propane tank solution involving ratchet straps, a plastic bucket, and a shop light. Welcome to the wild world of practical AI applications.

Featured Topics

Oboe.com: The Future of Self-Directed Learning?

Craig and Rob explore Oboe (oboe.com), a free AI-powered platform that creates customized courses on virtually any topic in minutes. Craig demonstrates by building a course on AI agents, and Rob becomes his first student. The hosts discuss:

  1. How the platform auto-generates quizzes with reasonable multiple-choice options and helpful feedback
  2. The potential to revolutionize textbook accessibility with low-cost or no-cost alternatives
  3. Using Oboe to supplement existing textbooks (like adding blockchain content to their own textbook)
  4. The limitations: shallow sourcing and need for instructor vetting
  5. Credit to the AI and I podcast from Every.to (makers of Lex.page) for the discovery

Security First: The Moltbot Warning

Not all that glitters is AI gold. Rob raises important concerns about new tools like Moltbot that can automate processes but may introduce security vulnerabilities. Key takeaway: Educators must apply the same critical thinking they expect from students when evaluating new AI tools for classroom use.

Craig's Three-Stage Hierarchy: A Framework for Human-AI Interaction

The centerpiece discussion introduces Craig's developmental model for understanding how we work with AI:

  1. Cognitive Outsourcing - AI does the task for you (the "easy" but often problematic approach)
  2. Cognitive Offloading - AI handles specific components while you maintain control
  3. Co-Produced Cognition - True collaborative thinking that produces outcomes neither human nor AI could achieve alone

Craig shares his experience co-writing with Claude, comparing it to the collaborative process of updating their textbook with co-author France Bélanger. The magic: AI enables 24/7 expert-level collaboration that would be impossible with humans alone.

The Big Idea: This hierarchy should guide our teaching. Rather than telling students to "think critically" (a vague catchall), educators should actively move students from outsourcing toward co-produced cognition, where AI's power truly unlocks.

Geeking Out on Affordances

Craig unpacks how AI is fundamentally "a bundle of affordances" - potential uses that only matter when actualized. Using the metaphor of a rock (hammer? erosion control? weapon? stepladder?), he explains:

  1. The same AI tool can be used to cheat on an assignment or to write a meaningless email nobody will read
  2. What matters isn't just what AI can do, but which affordances we choose to actualize
  3. Understanding affordances helps us guide students toward productive uses

Rob adds that affordances can be actualized poorly (like dropping a rock on your toe), emphasizing the need for purposeful, intentional use.

The Balanced Path Forward

The hosts reject both reflexive AI rejection and uncritical AI evangelism, calling for nuanced, intentional engagement. Whether it's Oboe.com or ChatGPT, tools can be used for good or ill - context and purpose matter.

The Challenge: You can't understand AI's affordances without using it. Even if your conclusion is not to use AI in your classroom, that decision should come from informed experimentation, not avoidance.

Key Quotes

"What we need to do as educators is we need to push students from that outsourcing to the offloading to the co-produced cognition. I see that as our main job with generative AI." - Craig

"The whole idea of think critically I think is a catch all phrase that we use very often that's very hard to quantify... I do really like that example of pushing students towards that co-produced cognition." - Rob

"If you don't use them, you're not going to know what they're capable of either harm or benefit. So it's really, I think anybody in higher ed, it's your responsibility to start using these tools." - Craig

Episode Resources

  1. Oboe.com - Free AI course creation platform (for now)
  2. Craig's AI agents course on Oboe.com
  3. AI and I Podcast from Every.to
  4. Watch out for: Moltbot security concerns

Bottom Line

Don't be blindly pro-AI or anti-AI. Be intentionally informed. Understanding the affordances of AI tools - and helping students actualize them purposefully - may be one of higher education's most important responsibilities.

AI Goes to College is your guide to navigating generative AI in higher education. Hosted by Dr. Craig Van Slyke (Louisiana Tech University) and Dr. Rob Crossler (Washington State University).

Takeaways:

  1. In the podcast, we discussed the emergence of Oboe.com, an innovative platform that facilitates self-directed learning through AI.
  2. We emphasized the importance of critically evaluating new AI tools before implementing them in educational settings.
  3. Our conversation highlighted the significance of distinguishing between cognitive outsourcing and cognitive offloading in the context of AI use.
  4. The hosts expressed their belief that AI can democratize learning, but it must be used responsibly and with proper oversight from educators.
  5. We reflected on the collaborative potential of AI, stressing that true innovation arises from synergistic human-AI interactions.
  6. The episode concluded with a call to action for educators to engage with AI tools to better understand their affordances and implications.

Companies mentioned in this episode:

  1. Google
  2. Oboe
  3. Every (Every.to)
  4. Lex.page
  5. Moltbot
  6. Claude

Mentioned in this episode:

AI Goes to College Newsletter

Chapters

00:42 - Introduction to Generative AI in Higher Education

02:07 - The Future of Education with AI

12:03 - Using AI to Offload or Outsource Tasks

13:00 - Cognitive Offloading vs. Outsourcing

20:25 - Human-AI Collaboration and Co-Produced Cognition

29:23 - Understanding Affordances in Technology

Transcript
Craig

Welcome to another episode of AI Goes to College, the podcast that helps you figure out just what in the world's going on with generative AI if you're in higher ed. And I am joined, as always, by my friend, colleague, and co-host, Dr. Robert E. Crossler from Washington State University. Rob, I'm talking to you from really chilly Eris, Louisiana, which is where I always am. But we just had about 4 inches of mixed snow and freezing rain and sleet, and we do not do that well, so people up north are just laughing at us right now.

Rob

Well, we do it a whole lot better than you do here in Pullman, Washington. But I would say your 4 inches may be more than we have had all winter this year. So it's like you stole our winter weather.

Craig

So we're running on a generator and Starlink, and so hopefully that'll keep going.

Rob

Isn't technology great, Craig?

Craig

Yeah, AI has been invaluable throughout this process. I had some technical difficulties with a propane tank and was able to figure out a solution that was surprisingly low tech considering I was using AI. Let's just say it involved ratchet straps, a plastic bucket, and a shop light, but it worked, thanks to Gemini.

Rob

It sounds like you just described Duck Dynasty to me, Craig.

Craig

Yeah, yeah, kind of. Kind of. You do what you have to. All right, enough of my whining. Let's get going. First thing I wanted to talk to you about, Rob, is a new AI-enabled self-directed learning platform. I think we might have briefly talked about Google's Learn About at one point. This is another one. It's free right now: Oboe, like the instrument, at oboe.com. You can create a course on virtually anything. You go in, it's got a little chatbot interface, says, what do you want to learn about? I put in, I want to learn about AI agents. Give it a little bit more of a description, maybe two sentences, and then, I don't know, three or four minutes later I had a pretty decent course on AI agents, one that I would feel comfortable giving to our students, saying, look, if you want to get a quick handle on what's going on here, this is enough to get you started. It wasn't in great depth. The sources were a little dicey and a little bit shallow. So there was Wikipedia and some company-based sources that I'm not quite sure about. It's not that the information was wrong, but it was shallow.

Rob

Yeah, Craig, what I really liked about it, because I went and took your course. I always wanted to take a course from Craig, and this gave me the ability to do it. As I went through and read it, it had quizzes along the way that I think were auto-created. The options for the multiple-choice questions on the quizzes seemed to be reasonable as far as various different ways to say things, and when I got one wrong, it wasn't graded or anything; it just told me it was wrong and what the correct option was. And I thought it was a pretty straightforward way to have a nice learning tool, and it makes me wonder about what textbooks are going to look like two, three, four, five years from now.

Craig

Yeah, that's where the shallowness comes in. A textbook can provide a lot more depth. It can provide multiple opinions, multiple perspectives on things. And I think as the courses get more complex, the problems increase just because of context rot and that sort of thing. But hey, I encourage people to check it out. They've got a bunch of existing courses. I don't know if they're pre-done by the company or whether they're just what other people have created. Just go in and check it out.

Rob

Yeah, and here's one of the worlds I imagine, Craig. Whether the price is $20 a month or $50 a month, something reasonable, if that's access to oboe.com, you could create a series of, let's say, business classes where students just get a license to something like that, and that's all of their text for the course, and the instructors are able to put the right guardrails on the material that feeds into it. That's world-changing for what the textbook industry looks like right now. So it's gonna be interesting how creative faculty get at creating these low-cost, no-cost solutions to education to make it more accessible.

Craig

Yeah. Rob, textbooks are archaic. I remember 10 or 15 years ago attending a dinner put on by Wiley where the message was, look, you know, we need to rethink the whole textbook thing given current technologies. And it's kind of happened, but not really. But where I see something like Oboe being really, really valuable is in supplementing a textbook. Rob, you and I have a textbook with France Bélanger of Virginia Tech, and a lot of people use it, and it's a good textbook, but it's never going to be perfect. It's never going to be exactly what any particular instructor wants. And so let's say somebody wants more content on blockchain. That's their thing. We've got some, but not a lot. Well, you go in and you create a course on blockchain and send them there, talk about it in class. As long as you vet it. You have to make sure that it's correct and it follows what you want. But I'm sure you could do that, maybe through careful prompting. I kind of did a one-shot. I think you can only do one shot. But I didn't spend any time at all on the prompt in my course. But I would send my students to that course.

Rob

Tools are only going to get better, right? So what you took was one of their early releases of Oboe and started playing with it. I would not be surprised if they are still iterating on adding those features. They went to market with something that worked well enough to start to get interest. Three, four, six months from now, we're going to see a different product and...

Craig

I'm going to give a shout out to the AI and I podcast, which is from a company called Every (every.to). So if you go to every.to, you'll find the podcast. And that's the same company that makes Lex.page, which I like so much. And so that's how I learned about this. They interviewed the co-founder. So I would encourage our listeners to check out oboe.com. To quote Animal House once again, it don't cost nothing right now, so just go out and try it. If you're going to give it to students, you need to go through the course and make sure that it's the content you want, or use that as a learning opportunity and have a discussion with the students about what the course didn't get quite right or what some other perspectives might be. But you just don't want to go in blind, like with any of this stuff with AI.

Rob

Yeah. And I think that's a great statement, and it kind of leads into what I want to mention about paying attention to these new tools as they come out. There was a new one, Moltbot (called Clawdbot when it was first released), that does some really cool things to automate processes. People use it. You're authorizing it to take over your machine and do whatever you want. And I've seen a lot of security vulnerabilities that have come up for people who are early adopters of this really great new technology that does really cool things. And as I bring this back to even Oboe and other sorts of things that we may be using in the classroom, as the instructors who are finding these new technologies, who are trying out these new technologies, we really do want to pay attention to the level of risk, lost student information, poor information quality, that we're exposing our students to, and be those critical thinkers. Don't just say, oh, wow, great tool, I'm going to use it. It's the same critical thinking we want to see from our students. It's important that we, as administrators, as classroom teachers, whoever it is making decisions about these new tools, don't lead our students unwittingly into poor learning or security violations or whatever those things may be.

Craig

Yeah, especially with this Moltbot. Moltbot. There's gotta be some sci-fi thing.

Rob

Well, no, here's what's funny. I'll tell you why it's called Moltbot. They went with Clawdbot and got sued because it heavily uses Claude and it's not a Claude product. And then they liked crabs or something. So the molting of crabs is what ultimately led to this name, which makes no sense, but people all have their reasons.

Craig

There had to be alcohol or weed involved in that decision.

Rob

Or both.

Craig

Or both. It could be both. That's true. But when I was reading about this, after you sent me the article, which we can link to, it's a TechCrunch article, it occurred to me: okay, so this is some company that you've never heard of. I think it may even be open source. Wasn't it on GitHub? I think it was.

Rob

I think it was open source on GitHub with a whole lot of stars. But what I heard and read about is they manipulated the number of stars it had so that people would trust it more, and did some very bad things in getting popular.

Craig

Sounds like the perfect job for Moltbot is just to go in and boost the number of stars. I started thinking about this. Okay, this is kind of like you're sitting next to some stranger in a bar and you want to book a flight to somewhere and you slide your laptop over and say, I'm going to go get another beer, or I'm going to go to the restroom or whatever, and can you book this flight? And here's my credit card. That's just insane. But that's kind of what this is doing. Now, you can sandbox it, where you give it things that aren't really going to have a lot of downside, and there are ways you can test these out. But I'd be really careful. And I think I sent you the Reddit thread where somebody found malware in the system instructions or something like that. It was like, yeah, that ain't good.

Rob

Well, I was reading some things about how to properly deploy this. You had to be a level of technological nerd in order to even understand what they're talking about with their directions to deploy it. So again, I would encourage our listeners: we're going to see a lot of technologies come about. Higher ed is going to be a popular target to market to. And if it seems too good to be true, or if the hype is so high, let somebody else be the person who goes first. Or work with your technology people at your institution so they can help you vet it to make sure that you're not going to inadvertently create a serious security breach for your organization.

Craig

Yeah. Or yourself. Yeah. The word I would put out there is chill. Just chill. When these new things come out, they're really cool. Google announced some new stuff today, and I was getting ready to change my Google subscription to get early access to it. Or you can just wait a couple of weeks, see if it's worth checking out, and then try it. Although I did try Claude Cowork, which is pretty interesting, and it's sort of Claude Code for the rest of us. And it did a pretty good job of organizing my downloads folder. I don't know about you, but my downloads folder has like 9 million files with no organization other than the date. And so if I lost all of those, it's no big deal. It was a pretty easy test, but it went through and organized them and gave me a suggested organization. We tweaked it, I hit a button, and next thing you know, I've got an organized downloads folder.

Rob

Well, that's great. And that leads us into our next topic, Craig, which is the whole idea of using AI to either offload or outsource what it is that we're doing.

Craig

Nice transition. You like that?

Rob

You set that up nicely. Softball. But no, it sounds like, you know, offloading, in the very simplest terms, is taking something and writing down a list so you remember what to do. The time and the effort required to organize your downloads folder is not something that you probably needed to put cognitive effort into. But having a tool that automatically did it for you was super nice and probably helps a lot, and you can focus on other things. Versus the whole idea of outsourcing. And I think we're seeing this a lot with students writing essays: they will just take the prompt, give the prompt to AI, and completely outsource the writing of that essay. So it's not offloading something I need to remember or to do; that learning of writing never occurred because I completely outsourced it. And so it's trying to figure out when's the right time to offload, maybe when is the right time to outsource, and being purposeful about what our goal is, what we're using AI for, and what we lose if it truly is outsourcing the complete completion of tasks.

Craig

This comes from an article from Natalie Wexler. And we'll link to the article in the show notes. It references another article, and she links to that, so I'll let the listeners follow that trail. As I read both of the articles, I was not clear on exactly what cognitive offloading is. Outsourcing, I get: if you say do this, you could probably argue that what I did was outsourcing, just do this thing. But there's not much value to me doing a task like that, which is kind of the point, right? I'm not learning how to design file systems, and I'm not trying to work through anything that I really want to remember. It's like, you know, this is a mess. Clean it up, right? So that was really complete outsourcing. But offloading is different. And the articles used things like a calculator and then writing stuff down, which, back when I studied cognitive psych, we called an external memory aid. So I didn't quite get exactly what offloading is. So what's your take on that?

Rob

So my take, and where I anchored on this as I read it, was the whole idea of the things that cause cognitive debt. When I really do need to be learning something or knowing something or actively involved in the process of something, and I completely outsource it, I completely give it to the AI, I'm creating that space of cognitive debt where I ultimately am not going to learn the things that I would need to learn in the process. Whereas I saw the whole idea of offloading being those places where I was not going to be worse off because I found a way to utilize the tool. You know, making an outline of a paper, that actually is a fairly trivial process that I could do myself. But if I can offload the creation of an outline and then start doing the writing and the thinking, I've made myself more efficient by getting some of the busier work out of the way.

Craig

See, I guess that's where I got off track, because to me, creating an outline is an important cognitive process for ultimately creating some document. You know, as I create the outline, I'm thinking, okay, should this come first or should that come first? And I think through the sequencing and the story I'm trying to tell and the flow and all of that kind of thing. Maybe it's different if you're writing some kind of a report where those things don't come into play. But the point here, I think the larger point, is to think about whether or not the friction involved is useful. When we talked about cognitive debt, we talked about cognitive friction in learning. So you want friction when that friction is going to pay some benefit. You don't want that extra work when it's not going to do anything for you, when it doesn't have any benefit, like organizing a downloads folder or creating an outline for a letter of recommendation or something like that. The other thing, and this didn't really bother me, but I think it's important to understand, is that this is not a clean dichotomy. It's not a clean set of separate buckets. There's some gray area in between, and it's probably on a case-by-case basis.

Rob

Yeah. And I think it's important that we're paying attention to our role in knowledge creation, in development, whatever the tasks are, and being purposeful about where it is that we're adding value, where we need that friction, where we need to be engaged in that thinking process. If there are ways that we can be purposefully making ourselves more efficient, it makes a lot of sense. One example I've used before, and this is because I'm a heavy Copilot user: it's not unusual that I will make PowerPoint slide decks from a Word document I have or some outline that I've created, utilizing the way that Copilot plugs in between Word and PowerPoint. I don't get perfect PowerPoint slide decks, but I will tell you I have drastically cut down the amount of time it takes to go from one communication medium to another in ways that aren't removing my thoughts from the process at all. It really is removing the busy work of just changing the format of what I'm going to be doing with that material.

Craig

Oh, absolutely. I use Gamma and Beautiful.ai to do the same sorts of things. And it just saves so much time. I can take a presentation that I would have spent a couple of hours on and have it done in 15 minutes, and it's still my presentation. So, and I'm making up numbers here, Gamma or Beautiful.ai or Copilot does 80% of the work, and it's the grunt work. It's the stuff that, if we had unlimited funds and unlimited GAs, I'd give to a GA to do, and then I'd spend my time where I can really add value. So yeah, that's a good example of the idea of cognitive offloading.

Rob

Well, I would say that's not much different than the concept with math where I would have a piece of paper and do long division or whatever. I can do it; it would take a lot of time. Enter the calculator, enter Microsoft Excel: I punch those numbers in and it does that so quickly. The math is still the same, but now I can get done with whatever it is I was trying to do in a much quicker fashion. And so I think that may have been some of the examples they were trying to use of how it's not really outsourcing. You're offloading; you're still doing the math, and the math is still getting done, but we're able to do it quicker. And we've jumped through these things before and become more efficient. And then we've spent our time on, like, what we teach at Washington State University in our College of Business: we want every student to be able to look at data and make decisions. And ultimately that's why we do this math, that's why we do these calculations using various different tools, so we can get to the point where the human can evaluate those numbers and say, based on what I'm seeing, this is the direction we should go. This is what we should do.

Craig

This is really nuanced. So let me see if I can lay out an example. One of the things that students sometimes struggle with is what happens when denominators get larger. And so if you don't know the formula for some statistic, you're not going to have an innate feel for when this number goes up, this other number goes down. If the numerator goes up, they figure the number goes up, but for some reason they get hung up on the opposite. And so if you've never at least built a formula, like in Excel, step by step, you don't get that. You get that, I get that, because we did this stuff by hand. But I don't do it by hand anymore. I don't do it step by step in Excel anymore. I put it into whatever and just say, give me this number, because I know if one number goes up, this other number goes down. So it's really nuanced. And I think that line, if we're going to draw a strict line between outsourcing and offloading, and where that appropriate line is, is different for everybody. I want to add one more thing to this. I think there's a part that's missing, and that's human-AI co-produced cognition, where it's not you doing something or AI doing something; you're doing something truly together. I've been working on this, and I define this as a collaborative, synergistic, iterative process in which human and AI work together to produce something that neither could produce on their own. By the way, I co-produced that with Claude. Of course, that was my definition. Here's the co-produced version: a synergistic process in which humans exercise agency through iterative collaboration with AI to produce cognitive outcomes neither could achieve independently. I think that definition is better than my definition because it brings in human agency. But Claude never could have come up with that on its own, and I didn't either.

Rob

And I think what I like about that, Craig, is it's consistent with a lot of what I'm hearing and seeing. I think there's even a science article suggesting that if generative AI is left on its own to train itself on more knowledge beyond what it has, it'll train itself to mediocrity, right to somewhere in the middle where it's not great and it's not profound. And what this suggests is that through human-AI co-produced cognition and the synergistic nature of the two working together, it's in those instances and use cases that we are truly going to see the magic of what this technology can do become more and more common. It's not just going to be the machine doing magic and all of a sudden humans don't matter. No, humans' brilliance is going to be able to shine even better because of the synergies that the technologies allow us to produce.

Craig

Yeah, I agree 100%. And I think this parallels human collaboration, like on the textbook. The three of us working together come up with things that no one of us would have come up with on our own. We're releasing an update with an AI chapter, a generative AI chapter. And one of us drafted the bulk of the chapter, and then somebody else came in and said, wait, this isn't quite right, what about this? And it wasn't exactly going in and editing it. We had a back-and-forth, and we ended up with something better than any one of us would have on our own. And what AI does is it lets you have that ability 24/7, virtually any place, on virtually any topic. And that's just not feasible for most of us with human-to-human collaboration. So it's not that it's better. If you gave me a whole ton of experts that knew a lot of different things, maybe we would have come up with something better than what Claude and I came up with, but I don't have that. What we need to do as educators is we need to push students from that outsourcing to the offloading to the co-produced cognition. I see that as our main job with generative AI. The outsourcing is easy, right? Do this thing and it does it. You know, the offloading is a little bit harder because you have to figure out exactly what it is that you need and where it's going to be worthwhile, and you need to know more about what AI is good at and not good at. And then that sets the stage for this co-produced cognition, where you can start to really unlock the power of this. I mean, it's a borderline magical technology. And so I think that's really what we need to be doing as educators. It finally kind of clicked with me as I was trying to work through this whole thing. It's like, dang, this is what we need to be doing. We can't just say think critically with generative AI. You know, they're not going to know how to do that, but we can move them towards that.

Rob

Yeah, well, the whole idea of think critically, I think, is a catch-all phrase that we use very often that's very hard to quantify and to know what we really mean by saying it. So I do really like that example of pushing students towards that co-produced cognition. Because what it really is doing is defining what we mean by critically thinking with AI. It's putting out something that's a little bit more measurable, something that you can pay attention to a little bit better and actually speak to, that results in that higher-level phrase of critical thinking.

Craig

Yeah, absolutely. I'm going to geek out a little bit, because this is another thing I've been thinking about that fits here. So AI is a bundle of affordances. An affordance is simply the potential of something to do something. If I'm out in the pasture and there's a rock, it's a rock. But what it really is, is a hammer if I need to drive a nail, an erosion control device if I need to stop some water flowing somewhere, a weapon if a coyote comes in. It's got an almost endless array of things that rock can be used for. So sure, it's a rock. Yeah, sure, it's AI, but that doesn't really get at what's going on. And so what we need to do is start to understand what those affordances are and how to direct students to the right affordance at the right time. Because what matters is not just what those affordances are, it's which one we actually put into play. And if the affordance to write a document is cognitive outsourcing to cheat, so you're bypassing the work of learning, that's bad. If it's to write a document that's a response to an email that nobody's ever going to read anyway, okay. I mean, what's wrong with that?

Rob

So, yeah, I think that whole idea of actualizing affordances is good, because there are two ways that affordances can be actualized. Going to your rock example, I may actualize an affordance of that rock by breaking my toe because I picked it up and accidentally dropped it, which would be a poor actualization of the affordance of that rock. It hurt me. I think there are ways, as we use technologies, that if we use them incorrectly, they can ultimately hurt us. Being purposeful about what it is that we're trying to accomplish and why, and how we're actualizing what it can do, helps us to make good decisions with that human input.

Craig

Yeah, don't actualize the affordance of a group of rocks to be a stepladder. That's a bad one.

Rob

So is that from experience?

Craig

Yeah, I can neither confirm nor deny. So it's an interesting time. And so I want to kind of bring all of this together. We've got this tool, this learning tool, Oboe (oboe.com), that can be used in a lot of different ways. And some of those could be good, some of those could be bad. I think it's largely positive because it democratizes learning, if nothing else. But it needs proper oversight by a faculty member, or the user having the skills to double-check anything that's really critical. I mean, there was a course on options trading, which, you know, how could that go wrong? But if you know enough, you can say, okay, this is going to get me up to speed, but here's where the risk factors are, and so here's where I need to dig a little bit deeper. It depends, right? So it can be used for good, and it can be used for, not necessarily evil, but we'll call it evil. We see the same thing with generative AI through this framework or hierarchy or whatever, with outsourcing, offloading, and co-produced cognition. And I think we need to keep that in mind. One of the things I get really frustrated with is people that are just completely anti-AI, where AI is bad, AI is bad, AI is bad. And there's environmental concerns and that sort of thing that I don't want to be dismissive of, but I also get pretty frustrated with people that are all AI, AI, AI, without thinking about what we've just discussed. I'll get off my soapbox now, which is another bundle of affordances. But hopefully we've gotten the point across.

Rob

Yeah, no, I think that's good, Craig. And I think it all comes back to not having blind trust in any direction, right? So it truly is saying, what do I want to do and how do I want to accomplish it, and then looking for the tools that can help us. Just because you have a hammer, not everything is a nail, so that may not always be the right tool for the job.

Craig

Yes. Like my pair of pliers that became ice breakers so I could get into gates this weekend.

Rob

Well, how about ice scrapers? Craig, Are you using an old driver's license or an old credit card to get the ice off your windshield?

Craig

I'm just not driving. We have not moved in several days. But I want to return to one of our larger messages for all of our listeners out there. You can't really understand these affordances until you start using these tools. You don't want to use them uncritically; like with Moltbot, you know, you gotta be careful. But if you don't use them, you're not going to know what they're capable of, either harm or benefit. So it's really, I think, anybody in higher ed, it's your responsibility to start using these tools in some way, even if it just helps you make an informed decision that you don't want to use these tools in the classroom.

Rob

No, that's great, Craig. And I think intentionality is the key in all of that.

Craig

All right, so we made it through without the generator quitting, which is good. Anything else, Rob?

Rob

No, I think we covered a lot today.

Craig

We did cover a lot today. All right, well, if that's it, then we will see you next time on AI Goes to College. Thank you.