Automation and the Future of History
This article was originally created by Stephen Andes.

Imagine a dystopian landscape, scarred by war, drought, and famine. The skies are darkened. The robot overlords rule humanity as cruel taskmasters. Humans are resource gatherers, or simply resources themselves: biological meat-sack batteries harvested to power the mechanical doom empire.
In this dark fantasy, will robots take my job as a historian?
Yes. Certainly.
(And, just to be safe, if you’re listening, O, Magnificent Robot Lords, please accept my loyalty…)
But the question at hand (whether robots will take my job) actually has little to do with science fiction nightmares.
It’s a real question. In 20, 50, or even 100 years, will computers and artificial intelligence have progressed to such an extent that human practitioners of the humanities (linguists, writers, poets, artists, even historians) will have become obsolete?
The answer has the potential to change how we view the job of doing the humanities. How, in other words, we make meaning out of our experience and how we tell those stories. And the answer challenges us to explain how and why we do what we do. As historians, we’re actually pretty bad at marketing ourselves. We’ve been able to ride a comfortable cultural wave. Of course, we’ve said, history and the humanities are essential to society!
But how are they essential? Why are they essential? What do you get, and what skills do you learn, from studying history and the other humanities?
The robots are forcing us to ask these questions. And to answer them.
In the last 10 years, for instance, IT specialists and coders have started publishing works of literature generated by computer programs. None of them has completely passed the Turing Test, the AI trial developed by Alan Turing: essentially a litmus test of whether humans can tell that something (a voice, a product, a song, a book) was produced by a human or by a machine. A bestselling novel in Russia was written using computer software. An MIT professor published a work of fiction with Harvard Press created using computer code. (Dang! Foiled again. Harvard Press passed on my last book manuscript!)
But, like everything done by computers, it’s getting better. Facial recognition. Driverless cars. The list goes on. One researcher has predicted that “as many as 800 million workers worldwide could lose their jobs to robots and automation by 2030 — equivalent to more than a fifth of today’s global labor force.” Where once we feared manufacturing jobs being lost to robotic processes, we can now see a future where many administrative and bureaucratic jobs will be done by computers. Computers are improving on skills once thought to be wholly possessed by humans. That means problem solving, but also discretion; efficiency, but also the recognition of context; analysis, but also cooperative management.
All this compels us to think harder on what humans do best. And, for those who work in the humanities, how they can take advantage of technology to do the job of humanities better. It even means thinking about using the skills learned in history — empathy, interpretation of evidence, storytelling — to work in tech fields.
So, what do historians do? According to the website Will Robots Take My Job?, historians do the following:
“Research, analyze, and interpret the past as recorded in sources, such as government and institutional records, newspapers and other periodicals, photographs, interviews, films, electronic media, and unpublished manuscripts, such as personal diaries and letters.”
Yes, that’s a pretty good list. But what this leaves out is that the whole “interpret” thing is actually really, really hard. And something that only humans can really do — at least, at this point.
Interpretation is about telling a story. It requires the historian to take what has been written on that subject before, by other historians, and to craft an original story with new sources. In other words, it requires imagination. It requires projective thinking. Imagining possible outcomes.
(Side note: Philip K. Dick’s classic novel Do Androids Dream of Electric Sheep? (1968) plays with the idea of robot consciousness and robot imagination. Ultimately, though, the story, which of course was the inspiration for Ridley Scott’s Blade Runner (1982), gives robots the fundamental flaw of a lack of human empathy. That’s how the human bounty hunter, Deckard, distinguishes humans from androids. Until, that is, a newer model learns even human imagination, empathy, and an awareness of loss. “All those moments will be lost in time…like tears…in…rain.”)
So, in terms of diverse forms of writing, computers are certainly getting better at it. But what about intuition? Artistry? Human experience? Empathy? These too are part of the stories historians tell. And, for the immediate future, we can be sure that humans alone will have a monopoly on these skills.
The historian of the future will have to be able to use technology to tell better stories. Digitization of sources has already begun. Computers will enable historians to draw from, and analyze, data that was too large to handle in traditional ways. Computer literacy will be essential. But all this is a positive. I mean, taking digital photos of my documents from archives is a huge benefit. The technology is already there to convert digital photos into searchable documents. The robots just might make historians better.
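To make the idea of “searchable documents” concrete, here is a minimal sketch of how a historian might index a body of digitized, transcribed sources so that any word can be searched across the whole collection. The filenames and transcription snippets are invented for illustration; a real workflow would first run OCR over the archival photos to produce the text.

```python
from collections import defaultdict

# Hypothetical sample: short transcriptions of archival documents,
# e.g. the text produced by OCR from digital photos taken in an archive.
documents = {
    "diary_1898.txt": "The drought ruined the harvest and the town despaired",
    "letter_1901.txt": "Famine followed the drought across the region",
    "report_1910.txt": "The new railroad changed the town forever",
}

def build_index(docs):
    """Map each lowercased word to the set of documents containing it."""
    index = defaultdict(set)
    for name, text in docs.items():
        for word in text.lower().split():
            index[word].add(name)
    return index

def search(index, *terms):
    """Return the documents containing every search term (a simple AND query)."""
    matches = [index.get(term.lower(), set()) for term in terms]
    return sorted(set.intersection(*matches)) if matches else []

index = build_index(documents)
print(search(index, "drought"))            # finds both drought documents
print(search(index, "drought", "famine"))  # narrows to the 1901 letter
```

This is the simplest possible inverted index; real tools add stemming, phrase queries, and fuzzy matching for OCR errors, but the principle (letting the computer read thousands of pages so the historian can ask better questions of them) is the same.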
The American Historical Association, the largest professional association for the historical discipline, has recently released a list of the skills historians need. And these skills, fortunately, will be transferable to other fields. The crisis of the history and humanities major, where STEM has positioned itself as the field of the future, is partially addressed when we decide to give our students marketable skills.
According to the AHA, these are the four areas history education should focus on:
Communication: Students must be required to practice communicating their knowledge and research to a wide range of audiences through a variety of media.
Collaboration: Curricula must provide intellectually relevant opportunities for students to work collaboratively toward common goals with others, both within and beyond their discipline — including disciplines beyond the humanities.
Numerical Literacy: Programs cannot neglect quantitative literacy. Graduates who lack a basic threshold of quantitative literacy are disadvantaged in careers both within and beyond the academy, with the possible exception of that small number of historians who can pursue a successful career without undertaking their share of administrative work.
Entrepreneurialism: Graduate education should instill in students the intellectual confidence to venture beyond their comfort zones, whether intellectual, cultural, or institutional.
In other words, history education is not just to train professors. It’s to train individuals who know how to research, how to use technology, and how to use intuition, empathy, and artistry to tell true stories about the past, present, and future.
Yes, we will need professors writing books and teaching classes. But we also need businesses that know how to understand continuity and change over time within their particular industries. We need engineering and tech firms that are able to explain how and why they have made the ethical decisions they’ve made in developing driverless cars, facial recognition, and bio-engineering. Historians, philosophers, linguists, anthropologists — all these fields are essential for the future of a robotic world made by, and for, other humans.
The dystopian future where the Robot Overlords have taken control is one where the historians of today failed to realize that they — the practitioners of the humanities — are vital to the health of the future of technology.
Imagine an undergraduate history major where we offer classes in the history and ethics of cloning. Or classes for business entrepreneurs who want to understand how to research and write coherently. Or engineers of all types learning the tools needed to discover the effects of new technologies on human users.
The answer, then, is that robots won’t take our jobs if we decide to better pinpoint the very human, and very necessary, skills needed in an increasingly automated world.
But what are your thoughts? Will robots take your job? How do we get better at defining what is essential in the human element of work?
And, yes, O, Sovereign Robotic Royalty, we do not deign to imagine humans as better than Robot-kind…!