Another week brings another Great Ed Tech Debate topic to discuss! This week, the debate was on the following: Schools should not focus on teaching things that can be googled: Agree or disagree? Once again, I found this statement interesting, as it’s something that I’ve never really thought about. The obvious answer (to me at least) is to disagree – literally everything can be Googled. Google is THE place to go for any information you could desire… So how would you teach something that could not be Googled? It appeared most of my classmates had similar thoughts.
This time, the majority were on the disagree side rather than the agree side. It was not quite as lopsided as last week’s debate (where the vote was 94.7% in agreement with the statement), but Sydney still had her work cut out for her.
Unlike last week, however, very few people were swayed by the debate this time, though a few were. Was that because the debate didn’t ignite anything? Or was it simply due to the content of the debate question?
Not Teaching What You Can Google: Arguments
Sydney’s side of the argument is that teaching subjects in school that rely mainly on the memorization of facts or patterns – things one could simply Google instead – is redundant. The article ‘Advent of Google means we must rethink our approach to education’ gives examples of the types of things we memorize that could simply be Googled, such as spelling, grammar, and multiplication tables. The article also states that the way school examinations are run (solo, without using any educational resources outside of one’s memory) is contrary to the way the workforce operates, where you will often work in groups and use every resource in your arsenal. In my opinion, this is perhaps the strongest argument for not teaching what you can Google. If school truly is to prepare you for the ‘real world’, shouldn’t the way we do our examinations and work be similar to that of the ‘real world’?
Another argument I think is quite effective is that memorization is somewhat of a waste of time. A couple of articles go over this point, and the message remains the same between them – memorization is a waste of time and brain power that you could be spending on something more worthwhile. The biggest supporting piece of this argument is a quote often attributed to Albert Einstein:
“Never memorize something that you can look up.”
If we have access to millions of resources at our fingertips, why should we not be expected to use them? Again, people in the ‘real world’ use them practically all the time, so why should we expect students to memorize tons of information that, frankly, most of them will forget within a few months’ time (or less)?
Instead, we should focus on the tasks that are expected of you on a day-to-day basis as an adult: things like effective researching and conversational skills (something that was brought up in a previous blog post!). We could also change the type of questions we ask entirely, into questions that provoke discussion or debate – questions that cannot simply be memorized and that require personal opinion and thought. It would be a very different system from the one we are used to now, but is that necessarily a bad thing?
We Should Still Teach What We Can Google: Arguments
Aurora’s side of the argument, to sum it up, wants the way school treats Google to remain the way it is right now. The article “Will technology make teachers obsolete?” argues that being told to ‘just Google it’ is not enough. It describes how many schools have appointed a staff member to watch the children as they use a more media-based program. This raises the question of what makes a teacher. Is a staff member who watches students as they Google answers to problems really doing any teaching? When students are left to use the internet to do all their learning, they lose the ‘human factor’ of teaching. As someone who learns best when taught in person, where personalized adjustments can be made at any time, this ‘human factor’ is incredibly important, and losing it to a more internet-based curriculum screams trouble in my eyes.
I think a decent argument can be made for still teaching what you can Google based on exactly what I just mentioned: the ‘human factor’. Again, I learn best when taught by a person who is physically in front of me. I retain information I hear in person far better than something I hear or read on the internet. That may just be how I learn, but if you cut that out of schools, you are making information retention much harder for people who learn similarly to me. As someone who has taken online courses with no face-to-face component, it can be very NOT fun trying to learn a bunch of information in a way that you do not excel at.
So Which Side Wins?
For this debate, I am more apt to lean towards continuing to teach what you can Google. The biggest argument in my eyes is the ‘human factor’. Without that, what’s the point of going to school at all? We could always just look up everything we would ever need to know, and for things like conversational skills, we could just make friends at work, through hobbies, or even over the Internet. But by keeping the information taught in schools, I believe school remains worth going to. Not everything may interest you, true, and some may find the memorization aspect intolerable. However, I think it can be valuable – while you may not always retain the information, building up the skill of memorization can be useful in the future, when you have small tasks that are worth memorizing rather than looking up or asking for help with every single time.
So this time, with the information I have on hand, I am on the side of disagreeing.