The field of computing has come a very long way. From obscure academic theory in the 1940s to industry in the 1970s and homes in the 1990s, computers are used everywhere today – you probably have one in your pocket right now. This is massive growth, on par with the industrial revolution in scale, scope, and social influence.
Computing started out ugly and difficult to use. Of course it did – the early computer pioneers were mathematicians and engineers, not designers. The early command-line operating systems used old teletype standards, measured in lines and characters. Computers were thought of as machines to calculate, tabulate, and sort, so not much more was needed.
When the typesetting and printing field started using computers to speed up the process of creating press-ready output, it had to create many things from scratch. From those efforts we get modern high-resolution display adapters, PostScript, and the first true digital fonts – three pieces of software that together can display any style of lettering at any size. Most importantly, this work not only served desktop publishing but had ramifications that changed computing forever.
The big change was that computer software no longer had to look boring. There were more tools available to fix software's usability flaws, and once that started happening, artists and other non-specialists began using computers too.
Perhaps the main reason computers are so ubiquitous today is that, through this knowledge acquired from other fields AND the new audience it brought, the industry was both forced and able to evolve. The computer field had to learn to work with the computer-illiterate rather than punish them. Software was made usable through what we today call Human Factors design – working with people’s expectations, using familiar or obvious terms, adopting visual symbols and metaphors (windows, icons, opening, closing) and sticking with them, keeping them consistent.
This is a powerful concept that is often overlooked, but the evidence is easy to see: tablets such as Apple’s iPad do human factors so well that infants can and do use them – the metaphor and action of “point and touch” is so universal in the human experience that it requires zero training.
Perhaps just as importantly, software is also designed for safety. Many operating systems and browsers pop up prominent warnings when you download a file carrying malware or visit a site with forged security credentials. Most modern systems prevent unauthorized people from even seeing crucial system files, never mind being able to cause damage.
This is what healthcare can learn: if we think of healthcare as a system, then that system needs to be designed so that a completely health-illiterate person, even a child, can use it – and use it safely – without additional training.
The biggest risk in today’s health sector is infection, and the first line of defense is hand hygiene. As in computing, people need a good, hard-to-avoid prompt when hand hygiene should be performed – one that uses universal symbols and metaphors and is well designed for all users, since visual acuity, literacy, and physical factors are all possible barriers. This is something we’ve written about copiously, and it does take real time, effort, and artistry to do correctly, but when it works, it really works.
However, the people who work in health are not human factors specialists. Like the mathematicians who built the first computers, they know their own field well but rarely see the viewpoints of other fields that might help, simply because they are never exposed to them.
We would suggest that healthcare be open to bringing in industries that know how to wield human factors design, just as the print industry did for computing – signage, graphic design, ergonomics, and the arts. This has ramifications across all of healthcare, helping not only infection control protocols such as hand hygiene and patient-room isolation but also leading to safer biomedical device design and harder-to-misread pharmaceutical labeling.
Healthcare is able to evolve, and there is an audience out there that would benefit greatly if it did. Infection control communication is still stuck in the age of the command line. By now we should be in the age of the smartphone, and we have the tools to get there. We just need to learn to work together.
Images from Wikimedia are used under a Creative Commons Attribution 2.0 license
The Infection Control Symbol Package and its contents are available under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported license