Karen Sandler was 31 years old, working at a non-profit organization providing free legal help to computer programmers, when she was diagnosed with an enlarged heart and informed that she’d need a machine to help keep her alive.
Her mother accompanied her the day a doctor recommended that Sandler undergo surgery to implant a medical device into her chest. He handed Sandler a pager-sized machine called a cardioverter-defibrillator – a miniature, implantable equivalent of having EMTs follow her around all day with defibrillator paddles should her heart stop.
The device was a round, metal compartment housing a tiny computer, an electrical pulse generator and a battery. Connected to her heart with metal wires, the device would monitor her heart rate and deliver an electrical pulse to shock it back to a normal rhythm should even a mild burst of activity – hurrying across a street, running to catch a bus – push it dangerously off-beat. Even as a self-professed "technology warrior," the prospect of becoming part machine caught Sandler off guard. Computers crash, run out of power and succumb to hackers. Would becoming a "cyborg" ultimately count as an affliction or an upgrade? And could she really trust a machine with her life?
Sandler grew up around machines and the programs that run them. Her father was a computer programmer; she taught her first basic computer class at summer camp when she was 16. She received a bachelor’s degree in engineering from the Cooper Union before pursuing a law degree from Columbia University, where she co-founded the Columbia Science & Technology Law Review. It was while working for the Software Freedom Law Center, an organization offering legal help to computer programmers working on open-source software projects, that she learned of her condition.
Sandler was scared but skeptical – not of the diagnosis, but of the machine. The diagnosis was serious and heart surgery is a complicated and dangerous procedure, but with the device in her hand and her worried mother sitting nearby, the first words out of Sandler’s mouth were, "What does it run?" While framed as a software question, her concern was much more personal: What exactly was the doctor proposing to weave into her heart? She had the physical device before her, but she was concerned about the imperceptible workings inside the machine to which she was to entrust her life.
Sandler had worked with computers long enough to know that all programs have bugs – that’s why computers need frequent updates and anti-virus software is a must. Undiscovered bugs can cause a machine to behave erratically or leave it open to infiltration by "crackers," the techie term for hackers with malicious intent who penetrate closed systems.
Sandler wasn’t ready to trust her heart to a program she hadn’t seen. Her work with open-source computer software had taught her that the best way to detect bugs and fix them is to tap the wisdom of the crowd through open-source programming. Open-source projects allow the world to view a copy of a machine’s source code, the underlying instructions that tell the device what to do. For an implantable defibrillator, that would mean making public a copy of the code that tells the device when to provide a shock and how much shock to provide, as well as how to monitor the heart rate and log unusual events. Modern heart devices can communicate wirelessly, so the software is additionally responsible for prescribing how a machine sends and receives signals and how it determines whether a signal is authorized to access the machine. While an individual person’s device needn’t be open to the world, a circulated copy can gather comments and suggestions that the device manufacturer can choose to adopt or ignore.
While it seems counter-intuitive, open-source software is often more reliable because it has had the benefit of being tested, checked and patched by a larger team of people. The most famous software programs are closed-source, such as Microsoft’s Word and Adobe’s Photoshop, but open-source software projects are silently ubiquitous. The U.S. Defense Dept., massive corporations like Merrill Lynch and the entire London Stock Exchange rely on an open-source project called Linux.
"It’s not a guarantee that bugs will be found if you make software free and open, but it makes it much more likely over time," Sandler says.
Sandler knew that the software protecting her heart was inevitably fallible, but the stakes were much higher than usual. Software flaws could not only mean errant shocks due to bugs in the code; coupled with wireless accessibility, they might mean someone could crack the code inside her heart. Sandler searched for new sources of information, having gotten nowhere with her doctor or the medical device sales reps he referred her to. The first specialist she talked to told her that she was paranoid – who would bother to crack a medical implant’s programming in the first place? No one had done it before, and the implants were designed only to communicate with special computers sold to doctors. Sandler called St. Jude Medical (NYSE:STJ), Medtronic (NYSE:MDT) and Boston Scientific (NYSE:BSX), three of the biggest heart device makers, and found herself at a dead end each time. No one would tell her about the source code that would end up inside her body.
Device makers have good reasons for keeping their software a secret, a tactic sometimes referred to as "security through obscurity." Each manufacturer designs its own software to run its own devices, meaning that publishing the inner workings of a machine would expose its weaknesses. If the programming has vulnerable points, making them public could give competitors a leg up or give crackers the blueprints for bringing down the device.
Another motivating factor may be the way the FDA reviews the machines and the software inside them. While the agency never directly reviews software unless something has already gone wrong, the FDA treats a patch to the programming the same way it would treat a physical change to the product. A medical device with altered software is often considered a new device, which requires a new round of expensive and time-consuming evaluation. Furthermore, patients with the original device wouldn’t be allowed to simply download an updated version of the software – they would have to undergo surgery to implant a new device after the original product had been recalled. The danger in relying on obscurity as a security measure, however, is that weaknesses remain hidden from the community at large, but not from the crafty crackers who sneak their way in.
"Keeping the code closed doesn’t keep sophisticated people from hacking it," Sandler says.
And once the secret is out – once a single person has discovered and leaked a copy of the program – that device is exposed forever.