I first used a computer in 1970, just over 50 years ago. For my undergraduate thesis in psychology, I had asked 300 students to complete a long questionnaire, and when it came time to analyze the data, I realized it would take me many months using a calculator. A professor suggested I use a computer instead and offered to give me a couple of lessons in a programming language called BASIC. A couple of weeks later, I loaded a big stack of punch cards into a mainframe card reader, and hundreds of analyses were completed in 90 seconds. I decided computers could be a big thing.
A decade later, working at Bell Northern Research (Canada’s Bell Labs), I was managing a group researching a radical idea: that everyone would use a computer connected to a vast network of networks, and that computers would evolve beyond merely processing data to become a communications medium. A couple of years later, I published a book presenting our research and developing that idea. The book did not sell well; it was a study in bad timing. The biggest objection came from the media, who insisted that managers and professionals would never learn to type.
Reflecting on that half-century, I have concluded that the digital age has now entered a second era. By understanding this second era, businesses and other organizations have a chance of making sense of the bewildering onslaught of technologies and responding effectively.
The first era of the digital age spanned the rise of mainframes, minicomputers, the personal computer, fax, the internet, mobility, the World Wide Web,…