Writer Kevin Kelly talks about what technology wants and the future of education.
Kevin Kelly is Senior Maverick at Wired magazine. He co-founded Wired in 1993, and served as its Executive Editor from its inception until 1999. He is the author of several books, including What Technology Wants, New Rules for the New Economy, Out of Control, and most recently Cool Tools: A Catalog of Possibilities, released in December.
Q: You often compare technology to an organism, the Technium, a seventh kingdom of life, which has its own goals and agenda. How do we avoid a Matrix or Terminator-type scenario where technology truly takes control?
A: I like to hang out with the Amish quite a bit. They have made some collective decisions to limit the amount of technology they use and, I tell you, it’s very exhilarating to hang out with the Amish and live the life they do, because they are living in the past. I mention that because one answer to your question is: we always have a choice to go back and live with less.
Any of us could buy a bus ticket and within 10 hours be somewhere in the world where there is a whole lot less. Or we can go to Pennsylvania and live like the Amish. It’s not hard to just get rid of the stuff. It’s not hard to unplug your TV. I know because we don’t have TV in our house and our kids grew up without it. It’s not hard to do. The reason we don’t is not because it’s hard but because we don’t want to. We complain about machines taking over but we want them to. This is who we are. We are wholly dependent on our technologies. We, as human beings, can no longer live without technologies. We passed that point thousands of years ago.
My argument is that we are part technological ourselves, and we cannot unravel this. We are not going to go back because that means fewer choices. We like to have more choices, we are happier. We may worry or complain about it, but we are not going back. We are going to go forward.
I make long arguments in my books that this is a system and it does have its own agenda. The question is: how do we know what its agenda is? My only argument is that this system, the technium, the stuff we made, is propelled by the very same forces that made life. We might as well ask: how do we know life isn’t going to take over? Well, life is trying to take over, if you haven’t noticed. Life is trying to take over your house and it’s ruthless. It is always there, but we deal with it. It is the same thing with the technium. It is another kind of life, a kind of dry life.
I think that we should treat it more like a second nature, literally like another wilderness that we are dealing with. That means that you don’t let it come into your house. You erect barriers and that’s what we were just talking about. But at the same time, you want to work with it. You want to use its forces. That’s what we did with agriculture, for example. We domesticated it. I see our lives as an effort to domesticate technology. That’s what we are trying to do with it. We are not trying to eradicate it, we are not trying to stop it, we are just trying to domesticate it.
Q: Looking at technologies like nuclear energy, genetic engineering, or other controversial technologies, how do we mold their development in the proper direction without any restrictions on the use of the technology?
A: That is a great question, and that is the question. Prohibition does not work, but we do want to domesticate it. Is technology going to be more like a dog, or is it going to be more like a cat? How do we house-train the technium?
First, I think that we want to be engaged with it and, again, not try to prohibit it. But we do maintain eternal, constant vigilance; we do constant testing; we use science as much as possible, reevaluating and retesting all the time. We should constantly look at how it affects our behavior.
We should use money, instead of building aircraft carriers, to do science and testing and evaluation of technologies that we do use. We should spend money on that kind of science and on science in general.
This evidence-based, data-driven attitude toward our use of technology is really going to help us, and that’s one of the reasons why I am a big proponent of the quantified self and self-tracking: in addition to all of this health stuff, we should be tracking the effects of our behavior as we use things. It will help us understand what’s happening, what’s good and what’s not good.
Q: Who should be responsible for that constant reevaluation and tracking? Engineers, consumers, policy makers?
A: All of the above. If you are using it, you have a responsibility to evaluate it. I like to think that whenever there is a right, there should be a corresponding duty. If we talk about human rights, there should be human duties. If you have the right to use certain technologies, you should have a duty in using them as well. One of those duties is to be involved in evaluating their effect on you and your community and your society. You are participating in the greater constant vigilance of what we are doing.
This is scary talk for a lot of people. Self-tracking is one thing, but when there is a kind of mandate to self-track, it begins to sound like Big Brother. But to me, it’s like voting. I think you have a duty to vote. It is not mandatory in this country, as it is in Brazil and other places, but we still have a duty. Participating in gathering evidence about the effects of technology is something that we have a duty to do.
Q: What does technology want from the education industry?
A: I am a lifelong learner and always think of myself as a student. Being a college dropout, I am still in college, you know. I often get paid to learn, that’s what writing an article is. I write articles on things I want to understand. It’s like a homework report.
I do acknowledge, though, that third graders don’t always learn in the same way that I do. A lot of elementary education is not just education. It’s also about training, about child-rearing. The dynamics of child rearing are not the same as those of education, and that’s why there are differences.
But in terms of the educational aspects of what technology wants, it is more choices, more options. Ideas like Khan Academy, where you can go at your own speed and use scientific data both to guide your pace and to guide what you learn next, that sort of auto-correction. That is one of many choices.
There is also the choice to sit next to another human and have that person engaging you as a student. That is also a choice, an option, and a good thing that should not go away. Technology wants to have even more possibilities in that space.
When I think about education in the future, there will still be teacher-student human relationships. There will of course be more online remote learning, and there will be all kinds of things in between, like devices that are augmented for the teacher or magic windows that you can hold up, and you will be able to work either alone or with someone else or with a group, all these things.
I don’t see any kind of uniform, major killer thing that everybody is doing. A hundred years from now, I think you will see almost every variety and way of learning. Of course, some kids and some people are more visual or auditory or whatever, and they are at different times in their lives and want to do different kinds of things, and that’s what we want. We want a million different tools, a million different ways to go about it, and some knowledge about what’s best for what and what’s best for you.