How I Learned to Stop Worrying and Love Change
On Thursday, June 28th, I took a trip up the 401 to the Kitchener/Waterloo Google office for their I/O Extended event, which ran concurrently with the main I/O event in San Francisco. This local event featured a handful of talks by local employees, along with streaming of the development-focused talks happening at the main event.
The event was pretty laid back and well-attended, though for me it turned out to be a more observational than immersive experience. With that in mind, I enjoyed one of the local talks even more: a discussion of what computer scientists can learn from science fiction. It may have been introductory stuff for anyone who took computer science; I don't know. What I do know is that it struck a nerve.
In 1972, Alan Kay, then working at the Xerox Palo Alto Research Center, published a paper putting forth the idea of a personal computer akin to modern-day laptops and tablets, which he called the Dynabook. It would be simple to use, and used mainly for simple purposes, like showing children images and vocabulary at school. Written in the same year that the compact disc was invented and Pong was released, the paper, as Kay notes in its abstract, "should be read as science fiction".
However, he didn't wait for the technology to exist before thinking realistically about it. He cites trends in miniaturization and price reduction, and lays out his ideas about the implications of such a device for a child's education, even going so far as to offer possible hardware specifications and to say, quite literally, that one would need to wait for plasma or LCD screens to save space. His prediction certainly wasn't as futuristic or daring as some other science fiction of the time, but his insights were rare in the technology world, largely because he understood the changes happening around him.
The ambitious projections for computers made from the '40s through the '70s by science fiction authors and a few select academics certainly seemed outlandish, but in hindsight they were vastly more accurate than the statements coming from the business side of the industry. Businesspeople didn't account for the exponential growth of our technological capabilities; instead, they looked at the short term, through the lens of supply and demand.
Traditional engineering accomplishes things by building on fundamental constants learned over our history. The less you need to innovate, the better: you'll be using parts and specifications that have already been thoroughly tested and subjected to strict quality control, you'll find labour experienced with established practices, and you'll be able to work within a definable budget. In computer science, however, the constants underlying the development of new ideas are always changing: screens are becoming more detailed, more memory and computing power fit into smaller spaces, internet bandwidth keeps improving, and all of this growth is happening exponentially.
Now that we’re aware of this growth, shown through various laws (Moore’s, Nielsen’s, etc.), computer scientists need to think more like science fiction authors than like engineers in order to innovate. One can begin discussing the implications of using certain technologies before they arrive. One can plan ahead, and despite people shaking their heads about certain assumptions, one can still proceed with some certainty that those assumptions will eventually be realized.
Developers need humility to succeed these days, because new ideas and applications appear every day, and new frameworks and techniques are constantly being improved upon. You may feel, as I sometimes do, that the pace of change is too great to keep up with, but I argue that's an opportunity, not a problem. Developers need to accept that we will feel perpetually inexperienced, precisely because we are now aware of the accelerating changes in computer hardware and software. We each have a personal responsibility to shrink the gap between our own knowledge and the collective knowledge of the community, but at the same time we need to understand that more opportunity lies in thinking about and planning for the future ahead of time than in waiting for it to arrive. The unsuccessful developers are those who are full of experience that is no longer valid.
Luckily for developers, the amount we can share with one another is also growing exponentially, and we do not have to approach this challenge alone. We can learn from other cultures, artists, authors, and time periods, and adapt their ideas to our own environments. Similarly, developers now have a healthy ecosystem with a community that’s more open to sharing ideas and software, and demonstrating new skills that are on the leading edge of what our web browsers and devices are capable of.
We've seen smartphone usage grow exponentially in both developed and developing countries, along with the speed and reliability of mobile networks and devices. On these devices, developers can now harness the gyroscope, camera, microphone, geolocation, and multi-touch gestures; but when developing practically, one still needs to operate within constraints like screen size, poor network connectivity, and the quirks of various mobile browsers. New capabilities emerge from the work of competing companies, but for the web community to harness them effectively, workable standards need to be drafted and implemented, and that can take time. In the meantime, hacks and shims are introduced because developers are justifiably impatient and need these new capabilities to be available now.
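In practice, working within those constraints usually means feature detection: check whether a capability actually exists before relying on it, and fall back gracefully when it doesn't. Here's a minimal sketch of that idea; the capability flags and feature names are my own illustrative inventions, not any particular library's API:

```typescript
// A progressive-enhancement sketch: decide which features to enable
// based on what the runtime actually supports, rather than assuming them.
// The Capabilities shape and feature names below are hypothetical.
interface Capabilities {
  geolocation: boolean;        // e.g. "geolocation" in navigator
  deviceOrientation: boolean;  // gyroscope access
  touch: boolean;              // e.g. "ontouchstart" in window
}

function enabledFeatures(caps: Capabilities): string[] {
  const features: string[] = ["static-map"]; // baseline that always works
  if (caps.geolocation) features.push("nearby-results");
  if (caps.deviceOrientation) features.push("compass-view");
  if (caps.touch) features.push("swipe-gallery");
  return features;
}

// A capable phone gets the enhancements; an old desktop browser
// still gets the working baseline.
const phone = enabledFeatures({ geolocation: true, deviceOrientation: true, touch: true });
// → ["static-map", "nearby-results", "compass-view", "swipe-gallery"]
```

The design choice worth noting is that the baseline comes first: the enhancements are additive, so a missing capability degrades the experience rather than breaking it.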
The current focus for web developers is to make content on the web accessible to everyone: whether they're sitting comfortably at home at a 27” desktop computer, browsing the web on their phone on the bus, or, in two years, browsing the web through their glasses. Responsive web design was brought to the mainstream only a little over two years ago, and to most designers it seemed strictly a way to liberate old designs from the clutches of fixed-size desktop layouts; it is now turning out to be a new paradigm for web developers to embrace.
Standards have always been important to a healthy development community, and they always take some time to be finalized. Think of “responsiveness” as a new thought standard — admitting that we may not really know where the consumer is, what they’re doing, what they’re thinking, or what device they’re using, but knowing with certainty that we have the functional ability to give them what they need, even if the form isn’t perfect. The same paradigm, I argue, should be embraced by any company that has a presence on the web.
While marketing budgets used to account simply for some initial creative work followed by media buys in magazines and on television, new budgets need to take into account the new ways in which audiences are trying to connect with a company. Companies must also realize that content production is increasingly important: consumers no longer want just the product or service a company sells, but access to great customer support across a variety of media, a view into the company's culture, and its ambitions and commentary. More methods exist today than ever before to get this content to a company's audience, to encourage them to help spread it, and to gather instant feedback and real reactions. The same certainly couldn't have been said in budget meetings just ten years ago.
Digital marketing companies need to learn to love experimentation, so that when innovation goes mainstream, the ideas have already been flowing around the offices, and clients can be ready and confident to make use of that innovation.
The current changes in consumer behaviour are just as significant as the changes in the technology consumers are using, and companies that want to gain an advantage should be ready to invest heavily in the web, even if the opportunities seem like science fiction, because change is all around us and it shows no signs of slowing down.