Computer programming is the means of getting machines to assist with or perform cognitive tasks -- ranging from purely mechanistic ones to tasks requiring high levels of (semi-)autonomy. As such, programming is fast becoming the language, currency and mechanism of advancement in almost every area of human endeavor and enterprise.
Steady technological progress in both hardware and software has expanded the scope and difficulty of tasks a computer can take on. However, the tools available to a programmer (e.g. editors, IDEs, debuggers, profilers) still rely on one primary modality -- typing. This modality may increasingly prove to be an impediment to rapidly prototyping solutions to the ever-wider array of problems that are now amenable to a computer. In other words, one cannot usually type as fast as one can think! (Notwithstanding, of course, the legions of emacs and vim ninjas who would beg to disagree!)
This leads us to consider other modalities for programming -- in particular, voice, gesture and thought, with the latter being the holy grail. One can only imagine a sort of engineering singularity, where thoughts are directly compiled to machine code!
The current state of voice recognition, gesture recognition and thought-controlled prosthetics, together with early prototypes of voice-based programming, makes this vision of the future a question of "when?" rather than "if".