The second part of my research asks: what are the differences between the electronic/physical security community and the computer software/hardware security community? If these communities hold different ethical opinions regarding disclosure, why do they differ?
At Pumpcon, a conference attendee from one of the
Another attendee, who works for a computer security firm, said, “Do it anonymously!” That’s harder to do with physical/electronic security vulnerabilities; yet disclosures in both communities are taken more seriously when there is proof of the vulnerability, right? It’s easy to post anonymously about a computer vulnerability to an online bug-reporting site with your IP address obscured, versus sending photos of yourself clad in burglar attire (or not — see the picture above of one of the guys from MIT [from their Defcon presentation]) breaking into something. Although the possibility of arrest is high, some are willing to take the risk. As John Benson (jur1st) calls it, “taking a hit for the team.”
An example of the contrast between these two communities that I used in my Pumpcon presentation was “Anatomy of a Subway Hack,” a presentation made at Defcon by three MIT undergraduate students: Russell Ryan, Zach Anderson, and Alessandro Chiesa. These guys really took a hit for the team, with the feds, present in large numbers at Defcon, ready to arrest these guys in
Does one group have much more to risk than the other? Is it much more risky (to the discloser and the vendor) to disclose how to beat electronic/physical security measures as opposed to computer security measures? If so, how do proponents of full or responsible disclosure go about it?
They do it at HOPE, Defcon, BlackHat, Pumpcon, Shmoocon, and other computer security hacker conferences. The risks disclosers face are evidenced in the all too frequent arrests at these conferences.
One HOPE, BlackHat, and Pumpcon presenter had some important information to impart, which I learned directly affects a company I know. Travis Goodspeed disclosed vulnerabilities in a Texas Instruments chip that is commonly used in biomedical devices and small consumer electronics. There are two debugging ports on this chip. If accessed, one could delete and replace the software on the chip. If your company were using this chip, would you want to know about this design flaw?
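To illustrate the general class of flaw Goodspeed described, here is a minimal sketch of a debug port that honors erase and write commands without any access control. This is a hypothetical model for illustration only — the class names, lock-bit behavior, and commands are my own assumptions, not Texas Instruments’ actual debug protocol.

```python
# Hypothetical model of a chip whose debug port accepts erase/write
# commands with no access control. Illustrative only -- this is NOT
# Texas Instruments' actual debug interface.

class DebugPort:
    def __init__(self, flash_size=0x8000, locked=False):
        self.flash = bytearray(flash_size)  # on-chip program flash
        self.locked = locked                # lock bit (unset on the flawed part)

    def erase(self):
        if self.locked:
            raise PermissionError("debug access disabled")
        self.flash[:] = bytes(len(self.flash))

    def write(self, addr, data):
        if self.locked:
            raise PermissionError("debug access disabled")
        self.flash[addr:addr + len(data)] = data

# An attacker with physical access to the debug pins of a vulnerable part:
chip = DebugPort()                       # no lock bit set
chip.erase()                             # wipe the vendor firmware
chip.write(0x0000, b"\xde\xad\xbe\xef")  # install attacker-controlled code
assert chip.flash[:4] == b"\xde\xad\xbe\xef"

# A part with debug access locked refuses the same commands:
secured = DebugPort(locked=True)
try:
    secured.erase()
except PermissionError as e:
    print(e)  # prints "debug access disabled"
```

The point of the sketch is simply that physical access plus an unauthenticated debug interface equals full control of the device’s firmware — which is why a medical-device maker using such a chip would want to know.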
Travis presented on this at other conferences before Pumpcon, but at this conference I was able to ask him how he addressed the issue with Texas Instruments. He said TI did talk with him about the security vulnerability and about the fact that he has written viruses able to take advantage of it. TI was receptive to discussing it with him, but at present the flaws still exist on the chip.
When asked what TI could have done better to facilitate more bug reports, he said it would have helped to at least give him a contact person to e-mail, so he wouldn’t have to go through the general-information route each time he wants to report a vulnerability he discovers. And, of course, they should fix the flaws — but as of the Pumpcon conference, they still exist.