By Philip Baczewski, senior director, University IT
Every once in a while, I come across a commentary that seems to lodge in my brain – well, not exactly lodge, but rather get stuck, because it doesn't quite fit my observation of the online world. The latest of these is a post on the Educause web site titled, "Can Higher Education Save the Web?" At first glance, this is interesting, because higher education kind of created the web, so saving it would seem right in higher education's wheelhouse (if higher education were a batter in baseball or a nautical helmsman). But on further reading, a few things about the article rang a dissonant note in my mind. To summarize: we're all distracted click junkies who don't read complete articles and unwittingly post fake news and conspiracy theories, even though higher education taught us to use libraries, and digital literacy instruction will allow higher education to save the web and provide the hoi polloi with better-quality information. (At least, that's how I read it.)
The web's the thing...
Let's start with "the web." The article offers Facebook and Twitter as examples of the "decline" of the web. As someone who used the internet before there was a web, I find it annoying when "the web" is used synonymously with the internet. Further, Facebook and Twitter are not the web. They are both applications available via web browser, but they fall far short of a comprehensive representation of online resources. (I actually think of Facebook and Twitter more as smartphone apps than web sites.) Painting the web with such a narrow brush ignores a universe of information resources: some good, some bad, and some quite excellent.
The article further asserts that the Web 2.0 business model has yielded "pull-to-refresh addictions, clickbait conspiracy sites, and mob-like behavior" on "the web." The publisher O'Reilly first popularized the concept of Web 2.0 in 2005. By 2009, discussion had turned from Web 2.0 as simply a more user-centric experience to the idea of "harnessing collective intelligence." Given recent events, perhaps "harnessing collective stupidity" is a better characterization. However, the narrow brush of Facebook and Twitter ignores a large swath of online technology that has advanced yet another eight years. You could argue that Web 2.0 gave us big data (literally) and is responsible for pushing developments in data research that are impacting areas from commerce to physics.
We're going down...
The Educause article depicts "the web" as "careening, Hindenburg-like, to the ground below." This is reminiscent of many previous laments about how internet technology is somehow diminishing our intellectual capabilities. Nicholas Carr's 2010 book "The Shallows" comes to mind; it contends that the internet is sapping our ability, and our children's, to apply sustained concentration to intellectual tasks. I think the book is well refuted by a New York Times review from the same year, which cites research indicating, "Google . . . isn't making us stupid -- it's exercising the very mental muscles that make us smarter."
Another statement that caught my eye in the Educause piece was, "We spend countless hours teaching our students to navigate the world of research and published books. And yet we graduate them into a world where the vast majority of the information they consume professionally and personally will come through the Internet." It would seem obvious to me that the world of research and published books is the exact familiarity needed in order to counter the fake news and conspiracy theories bound up in that internet information.
There seems to be an undercurrent suggesting that the internet has somehow fundamentally changed the context for higher education. Another recent article quoted an industry source as saying, "Today's college students have spent their entire K-12 academic careers immersed in interactive learning environments -- both in the real world and digitally. We can't expect to drop them into college classes and strip that level of interactivity and engagement away and expect them to be successful." For years, though, we have acknowledged that higher education requires preparation. We even have a term for it -- college prep. It usually involves a greater amount of reading, writing, and sustained attention than educational programs that assume a vocational outcome. Done well, it helps college students to be successful. In this case, the most important learning environment would seem to be the individual student's brain, whether enhanced by virtual environments or just by the availability of a good traditional library.
The Medium is the Medium
One thing the internet has changed is that the amount of accessible information has increased tremendously, and the modes for gaining access have multiplied. Information that was once localized to specific libraries is now available from most places in the world. I might be reading a hard copy of a book at home, but reading an electronic copy of the same book on my phone when I'm out and about. The result of internet technology may actually, on the whole, be qualitatively neutral: Facebook and Twitter promote the spread of fake news, but the open access movement leads to freely available research and scholarly output.
So, sorry – I don't think that higher education can save the web, because there is nothing that needs to be saved. I think the harder question is whether higher education can save our culture. I agree with the article that, "The literate culture of books and published articles is one of the great achievements of our culture." Online books and articles are no different from their analog counterparts – they are just potentially more accessible. What technology does for us is provide new ways to express information and allow for more efficient and rapid exchange of information. But technology can't on its own make up for an unprepared learner, and digital literacy, while useful, does not take the place of scholarship.