
    Tuesday, December 9, 2008

    Discussion with Nick Farr, HacDC

    I was in the D.C. area in late November and couldn't resist stopping in to meet the HacDC guys and see the space. The evening started with Nick Farr inviting me to a DorkBot presentation at George Washington University. Alden Hart, CTO of Ten Mile Square, gave a great presentation about his LED projects. It was one of the most comprehensive technical presentations I've seen, covering everything from where to buy the parts and where to ship the PCBs for fabrication to the details of the software and hardware designs. I'm thrilled he's going to release his hardware designs as open source.

    From there, we went to Froggy Bottom for sub-par pub food, but as with most hacker group outings, the company was outstanding. Late, sometime around 11 PM, we wrapped up dinner, and a group of us went to Nick Farr's apartment (he has CASES of Club-Mate!) and then to HacDC. While we were there tapping into his Club-Mate stock (an entire fridge full of it, too), he pulled a really old computer with an acoustic coupler out of his closet. I'd never seen one that old; back in the 80s, when we had 30+ phone lines going into our suburban D.C. house, we had racks of slow modems, but none had couplers. He'd salvaged it for HacDC.

    On the ride over to HacDC, I was able to ask Nick specific questions about the organizational structure, management, and financing of the space. Because I was driving, I wasn't able to take notes as I did when I talked to Far of The Hacktory, so don't take this verbatim, especially the costs.

    I first asked Nick about the name. It includes "hac(k)," which, in my experience with some hacker spaces, is a turn-off for some participants. Maine's hacker space is struggling with this, too. His response: if they don't like "hack," then they don't really understand what we do here, and this might not be the best organization for them to join. He said that "hack" in the name clearly separates the organization from other group work spaces, like co-working. However, he also said that some members are identified only by assigned numbers because they need to remain anonymous; some businesses still shun any association with hackers.

    The space was amazing! So far, this is the most complete hacker space I've seen. Their location is also interesting: a church rents out space to non-profits, and HacDC has a loft space there. One side of the space is all shelving for storage, and it's packed. I saw some old payphones, an old PC being used as a ballast for a huge rotating white board, five Geiger counters (which I relished being able to play with), table saws, old modems, and tons of computers. Tables in the middle of the room are used for projects, and Tim proudly told me, "We even have our own bathroom!" as he gingerly took some drinks out of the fridge.

    They have a single fee structure, about $40/month, and around 40 active members. With a large city and an even larger technology-minded suburban community, I understand how they can draw so many members. They have also started hacker-themed movie nights and will be offering educational classes. It seems they hold weekly events, which is very cool. I cannot wait to go back to HacDC during Shmoocon, the next time I'll be in the D.C. area. That seems like an awesome place to be during the conference evenings. Love it, love it!

    Monday, December 8, 2008

    We Won Venture Capital Pitch Contest!

    The venture capital pitch competition was held last Thursday night at Pace University's business school in NYC. What a fun event! I started the pitch with something most people like, fast cars and computers, using Knight Rider as the theme. I then briefly outlined the technical capabilities: what it can do now and what it will do with some VC money once the prototype is built out. Slides with more technical information were shown behind me as I described how the team did it and what we'd like to do with it in the future. During the Q&A, I addressed how much money we're looking for ($30K just to build out the prototype).

    The majority of the judges were VCs, and one liked it. I met with him the following evening along with the president of a car computer company that has related, but not similar, products. They liked the idea and said the market is huge, but they didn't like the reverse engineering and protocol brute forcing we've done. Although that has been a valid and legal business model in the past (Compaq did it to IBM), the VCs want it done with licenses and defensive patenting. We might be able to do it that way as long as we don't lose the open source/free software platform. We're talking.

    Monday, December 1, 2008

    Finalist for Pace University Venture Capital Pitch Contest

    We made it! OpenOtto is a finalist in a competition for venture capital financing of a new product. I'm off to NYC for the Thursday night presentation. I've been busy working on the presentation, but here is the winning pitch that got OpenOtto into the finals:

    "You don't have to be David Hasselhoff in Knight Rider to have your car talk to you. OpenOtto is a platform for developing vehicle-aware products for the consumer and industrial markets. While it will not ask you how you're doing this evening, most people don't realize how much information your car's computer can tell you. OpenOtto consists of a hardware interface to your car's OBD-II connector as well as an extensible software platform for communicating with all networked electronic devices in the car. Designed for flexibility and scalability, it is easily expandable to future vehicle capabilities.

    OpenOtto consists of two products targeted to different markets. The first is a car computer that acts as an interface between your car's computer and a 4" x 8" touch screen display that attaches to your dashboard. The interface shows easy-to-understand graphical output from your car's computer including, but not limited to, standard OBD-II output: coolant temperature, engine speed, oxygen sensor readings, and emission-related trouble codes. Advanced features include displaying suspension, anti-lock/traction control, and air bag status.

    Additional safety and security features include remote start and kill (for anti-theft or convenience), warnings when the transmission begins to fail, individual wheel speed readings that indicate wheel slippage, and real-time engine performance monitoring.

    The second product is priced lower for the general consumer. It lets you attach any cell phone with GPS to OpenOtto. Once attached, the car's computer will text someone (e.g., a parent) with GPS coordinates if the car exceeds a certain speed, and will call 911 if the airbags deploy (no proprietary subscription necessary).

    Safety and security are important and built into the engineering designs. Some features will be access controlled, and all sensitive data transmitted by OpenOtto will be encrypted using industry-standard best practices to ensure the safety, security, and privacy of the user.

    The software and hardware designs will be released as free and open source designs to encourage adoption and adaptation of the features.

    For consumers, a complete dashboard-mounted display with computer will cost between $300 and $500. The closest product currently on the market costs between $1,000 and $5,000 and does not include open software and hardware platforms, graphical dashboard-mounted displays, or customizable features. The low-cost consumer device will target a retail price of $100-$200.

    Try getting KITT for that price."
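    The standard OBD-II readings named in the pitch (coolant temperature, engine speed) follow fixed decoding formulas from the OBD-II standard (SAE J1979). Here is a minimal sketch of what a decoder in an OpenOtto-style platform might look like; the function name and simplified framing are illustrative assumptions, not OpenOtto's actual code:

    ```python
    # Hypothetical sketch: decode the data bytes of OBD-II Mode 01 responses.
    # The per-PID formulas below are from the OBD-II standard (SAE J1979);
    # decode_pid() and the framing are simplified for illustration.

    def decode_pid(pid, data):
        """Decode the data bytes of a Mode 01 response for a given PID."""
        if pid == 0x05:                          # engine coolant temperature
            return data[0] - 40                  # degrees Celsius: A - 40
        if pid == 0x0C:                          # engine RPM
            return (256 * data[0] + data[1]) / 4 # (256*A + B) / 4
        if pid == 0x0D:                          # vehicle speed
            return float(data[0])                # km/h: A
        raise ValueError("PID 0x%02X not handled in this sketch" % pid)

    # Example raw responses (0x41 is the reply header for a Mode 01 request):
    # 41 05 7B    -> coolant temperature
    # 41 0C 1A F8 -> engine RPM
    print(decode_pid(0x05, bytes([0x7B])))        # 83
    print(decode_pid(0x0C, bytes([0x1A, 0xF8])))  # 1726.0
    ```

    A real implementation would also have to handle the transport (e.g., an OBD-II adapter on a serial port) and the many manufacturer-specific PIDs, but the arithmetic above is the core of turning raw bytes into the gauges a dashboard display would show.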

    Sunday, November 16, 2008

    Discussion with Far McKon, The Hacktory

    In the car on the way home from The Last HOPE in NYC in July 2008, three of us (Nothingface, Professor Rad, and Infochown) decided we wanted a hacker space in Maine. We were inspired by a presentation at HOPE about other hacker spaces.

    Since then, we’ve attended four 2600 meetings where we’ve spent a bit of time discussing the hacker space idea. Funding and a name (“hack” or not “hack” is the question) are the sticking points for establishing the space. With all of the companies going under in Southern Maine, there is a plethora of rental space (both business and industrial), so we have plenty of space options, but just not the funds yet.

    To find a solution to these issues and learn from others’ experiences, I’ve summoned wise hacker space organizers and asked if they could share their ideas, organizational structure, and “don’t make these mistakes like we did” stories with me. (The “hippy problem” stories are always the funniest.)

    The following is a discussion I had today with Far McKon, organizer of Hacktory in Philadelphia. (I took notes while we were talking, so forgive me, Far, if I don’t have everything exactly as you told me.) Far also said that the numbers are a general ballpark, so please don’t take this as a price quote.

    If Maine's hacker space can pair with the art community and, perhaps, also attract the entrepreneur/writer community like IndiHall in Philadelphia, I think we could have a pretty awesome co-work place!

    Discussion with Far McKon, November 16, 2008

    I spoke with Far, organizer of The Hacktory in Philadelphia, about how they set up and manage their space. They are moving out of a free space loaned to them by a company: it is too small, and they cannot use heavy machinery and big tools there because there is no way to get them up to the 3rd floor (the staircase is a crazy, narrow, winding one) and zoning would not allow it anyway.

    To solve their space problem, they grouped together with the art community in Philadelphia. (Brilliant! This is something that Maine’s Hacker Space surely could do—we have a HUGE artist community in Portland.) They will soon have a tech incubator space in the basement of a building that is zoned for heavy machinery/industry.

    Hacktory’s affiliation with the art community is crucial for what they are doing. The hacker space in Somerville, Massachusetts also paired with the art community. Many artists are using heavy machinery and tools for their art and either they don’t have the space in their homes/apartments, cannot use fire and big machinery without violating zoning or fire codes, or cannot afford their own large studio.

    The way the deal will work with the two other organizations grouping together with Hacktory is that one of the groups (not Hacktory) is always there (9-5) and controls access, signs for package deliveries, answers phones, provides secretarial services, etc.

    Physical Space:

    They have a large industrial space with a big, open work space in the middle. Locking studios (cubicles?) are grouped around the central open space. The studios are small (about 50-100 sq. ft.). Some of the locked studios contain the more dangerous or expensive equipment; you get a key by taking 1-2 hour classes on "how not to break the stuff." Some of the studios are rented to hobbyists who want a locked space and others to for-profit businesses; there is a different fee structure for the two groups. All of the studio renters get the benefit of an address, secretarial services, shipping, etc.

    Another idea, although not what Hacktory is doing, is that some hacker spaces rent you the use of your own Craftsman rolling case for about $50-$75/month. You use the shared work benches with the tools from your rolling case. When you're done, you put your stuff away in the case and roll it into a locked area.

    Fee Structure for Studios:

    $1.25 per sq. ft. for hobbyists

    $2.25 per sq. ft. for for-profit businesses (includes secretarial and shipping services)

    Organizational Structure:

    Hacktory is a 501(c)(3) non-profit. They have also registered with the state as an educational organization. With that status, they pay a lot less for insurance, which is about $180-200/month.

    They have about 5 active paying members and 7 occasional ones.

    How Does Hacktory Raise Funds?

    They offer classes and they have paying members. Classes run about $15.00 per hour of class time. The class a member needs to take to operate the locked machinery costs about $30.00 for 2 hours of instruction.

    Membership Rates for Hobbyists:

    • $15.00/month for open hours ("open" means when there is an organizer/manager there);
    • A few students pay no fees in exchange for watching the space, which could allow for more open hours;
    • $65-$85/month for Saturday and Sunday access only;
    • $125.00/month for open access with your own key card.

    Saturday, November 15, 2008

    Maine's Biggest Cyber Crimes Case--James Wieland

    All day Thursday of this week, I got e-mails from friends in different computer law, hacker, cyber crimes, and computer forensics communities. They'd ask, “Did you see the hacker case? What do you think? Did he do it?”

    The case to which they are referring is, to date, Maine's biggest cyber crimes case. This is bigger than the warez sales going on in Portland back in 2005.

    But there is something different about this case that’s intriguing to me. In the warez case, there was a group out to profit from copying software; it was an unsophisticated, but high volume, operation. The warez trade isn’t too difficult to do and it’s not terribly interesting.

    This case is different. Although I've spoken to James, we knew better than to discuss the case. The facts about which I write are only those available to the public via news organizations. The news articles, from here down to the D.C. area, all have the same "evil hacker breaks in and steals stuff" theme, but what's really going on behind these headlines only time will tell, as the facts (especially the technical aspects of the case) are presented in court. Was this curiosity and experimentation that went beyond rational bounds or, on the other hand, a well-designed and calculated attack with fraudulent intent?

    The news reports state that James Wieland, a student at the University of Maine, spread a Trojan horse program (it isn't known whether it was a worm or a virus, but this will matter in the case) by adding it as an attachment to an e-mail that contained a video game. When the recipients opened the attachment, the malicious program was executed. The Trojan reportedly contained a keystroke logger, and James had reportedly been collecting and storing that data since August 2007. But the intriguing part, and I'm sure an aspect of the case that will be highly debated in court, is why James allegedly collected this information and why he didn't do anything with it. There were no reports that he used anything he allegedly collected.

    There are some hypotheses in the case: 1) James was the victim of a botnet attack on his computer, meaning he didn't know about the Trojan or what it would do; 2) he released the Trojan as a curious experiment (didn't write the code but, in script kiddie fashion, applied and released it) and didn't quite know what it was doing or how to stop it (the classic Morris Worm case); 3) he wrote the code and released the Trojan with malicious intent, wanting to collect private data from the victims and either sell it or use it for nefarious or fraudulent purposes.

    What's striking to me about this case is that James has a lot to lose. He just got engaged in Italy last month, worked for a Christian school, has his own consulting and web design business, and seems to be just starting his professional and family life. He just doesn't fit the typical profile of a malicious cracker. If convicted, these felony charges could carry more than 5 years in prison. The District Attorney states that the 5-year sentencing estimate may be just a start and that more incriminating data might be found now that they've cleaned his place out of all electronic equipment.

    I wonder if the University of Maine's policy of attaching students' and faculty's first and last names to our host names (HEY, DEFENSE COUNSEL, YOU LISTENING?) had anything to do with tracing an IP address to James Wieland. This can be spoofed, and if James was clever enough to write the program and orchestrate this attack, wouldn't he also have been able to obscure his IP address?

    What I can do is bring as much as I can from Wieland's case to our computer law and ethics class, COS 499, at the University of Southern Maine in spring 2009. As with everything in my class, we analyze computer law and cyber crimes cases from an unbiased perspective. I have hackers as guest speakers, as well as a few influential FBI and CIA agents, discussing different perspectives on electronic crimes and how to prevent them with better computer security built from the ground up into the software, hardware, and network design practices we teach at U. Maine.

    I will be going to as many of Wieland’s hearings as I can; most certainly, I’ll be at the arraignment in January 2009. No matter the outcome of the case, there is a lot that security researchers, information technologists, and computer scientists can learn from this case. A lot of lessons will be learned both by James Wieland and by the University of Maine.

    I urge readers to please hold judgment until the facts—especially the technical aspects—are presented. I want to read more articles or hear in court that this is more than tracing an IP address to Wieland. Right now, there are no details beyond that.

    When we hear about the Trojan’s code (such as how it worked—I want the functions of the key algorithms discussed in court!), how and where the data was obtained and stored, 4th amendment practices (forensic hashing of the hard drive), access to the data files, and the Internet access to Wieland’s computer network, then we’ll discuss what went wrong.

    I’ll keep you updated. If you see news articles or find information online, please e-mail them to me.

    Wednesday, November 12, 2008

    Presentation for the Maine Association for Law and Innovation

    Yesterday I presented at Maine Law about my security vulnerability disclosure research. It was great to be back at Maine Law, but this time as a lecturer. I spent so many hours in the moot court room as a student that, just for a split second, I had a fleeting feeling of "OMG, am I going to get cold called on?" when I entered the room. Some things from law school never leave you.

    I have made some changes to the disclosure presentation I gave at Pumpcon a few weeks ago. For instance, Juliet (aka Victoria) said that I had too much stuff on my slides, so I tried to pare that down. She was totally right about that.

    Also, I changed the title from "How to Responsibly Disclose" to one that didn't reflect the ethical ramifications of the word "responsible." The industry practice called "responsible disclosure" is not always the most responsible way to disclose vulnerabilities. The more research I do, and the more well-known security researchers I discuss this topic with, the more I find that other types of disclosure (full or partial) are sometimes what's needed for better security. My legalese peers may not agree with me, but from the computer researcher's perspective, the name "responsible disclosure" does not always live up to what the word suggests.

    Last, but not least, I added a lot more about disclosing physical/electronic security vulnerabilities. I did a lot of research into lock picking laws and industry practices. Really fascinating and it makes me want to do more lock picking should the opportunity present itself.

    However, are electronic biometric locks with cryptographic keys the future of the lock industry? I think they may be. That's good and bad, like all technology, isn't it? I've always been interested in the idea of faking biometric scans. For example, I know that a retina scan is really hard to fake, but mirrored or cloudy contact lenses mess up the scan. So, if you get your initial scan (such as going through the international terminal in Frankfurt, Germany) while wearing these contact lenses, will the computer reject the scan, or will that baseline scan, albeit "fake," be "yours"? I'll have to do a post about biometric scans because I did a lot of related research when I was working on RFID technologies and legislation.

    Saturday, November 8, 2008

    Altering Airline Boarding Passes—Schneier and Soghoian

    One of the conversations we had at the November 2600 meeting was about Bruce Schneier’s alteration of airline boarding passes and using one to get through a TSA checkpoint. Schneier admits that it is illegal, and if done, there is a possibility of arrest. (Note: If you’re reading this and considering doing it, remember that you are not Bruce Schneier. I don’t truly think that the Feds would arrest him, but they would arrest you.)

    At the meeting, we were discussing what those illegalities might be. To do so, we considered how fraud is different from a hoax or forgery. In short, fraud is where deception is used to unlawfully take property (usually money) or services from another. What about those theories applied to altering a boarding pass? Go to the link to see an altered boarding pass used by Jeffrey Goldberg—he even upgraded himself to 1st class for priority boarding. New York Senator Schumer was nervous about this exact scenario when he offered a bill that would treat these “federal criminals” named “Joe Terror” like a “…19 year old who makes a fake ID to buy a 6 pack of beer.” (Hhmm...Joe Terror sounds a lot like Joe Six Pack.)

    Not a "Joe Terror" or "Joe Six Pack," PhD student Chris Soghoian wrote a program, accessible through his website, that would generate a fake boarding pass. What happened next is discussed in his blog: in short, the glass on his front door was smashed by the FBI, his computer equipment was taken, and a search warrant (issued at 2 AM) was taped to his kitchen table. But how does the law address altering boarding passes? Consider this section of federal law addressing the falsification of airline tickets or boarding documents:

    From the Code of Federal Regulations, Title 49, Volume 8; October 1, 2004 rev. [Page 302]:

    Part 1540.5 -- Terms used in this subchapter.
    §1540.5 Sterile area means a portion of an airport defined in the airport security program that provides passengers access to boarding aircraft and to which the access generally is controlled by TSA, or by an aircraft operator under part 1544 of this chapter or a foreign air carrier under part 1546 of this chapter, through the screening of persons and property.

    Subpart B -- Responsibilities of Passengers and Other Individuals and Persons

    Sec. 1540.103 Fraud and intentional falsification of records.

    No person may make, or cause to be made, any of the following:

    (a) Any fraudulent or intentionally false statement in any
    application for any security program, access medium, or identification
    medium, or any amendment thereto, under this subchapter.

    (b) Any fraudulent or intentionally false entry in any record or
    report that is kept, made, or used to show compliance with this
    subchapter, or exercise any privileges under this subchapter.

    (c) Any reproduction or alteration, for fraudulent purpose, of any
    report, record, security program, access medium, or identification
    medium issued under this subchapter.

    Below is something under the USC that is applicable to altering a document regarding a “matter within the jurisdiction of executive, legislative, or judicial branch of the Government":

    United States Code
    Title 18. Crimes and Criminal Procedure
    Part I.
    Chapter 47. Fraud and False Statements

    18 U.S.C. § 1001
    (a) Except as otherwise provided in this section, whoever, in any matter within the jurisdiction of the executive, legislative, or judicial branch of the Government of the United States, knowingly and willfully--
    (1) falsifies, conceals, or covers up by any trick, scheme, or device a material fact;
    (2) makes any materially false, fictitious, or fraudulent statement or representation; or
    (3) makes or uses any false writing or document knowing the same to contain any materially false, fictitious, or fraudulent statement or entry;

    shall be fined under this title or imprisoned not more than 5 years, or both.

    Although these codes would answer our question about Bruce Schneier's experiment with altered boarding passes, they do not exactly cover Chris Soghoian and his website that created boarding passes. Most people who saw it when it was up (I did) thought it was a parody. Here's what Chris recently said about that experience: "In 2006, the FBI investigated me for some of my research into boarding pass security. While no charges were ever filed, it's reasonable to state that I have little affection for the DOJ computer crimes section."

    In summary, altering boarding passes, for fraudulent purposes or not, is covered under these statutes. Beware if you're not Bruce Schneier. And if you are Bruce Schneier or Chris Soghoian, thank you for your security research and for potentially "taking a hit for the team."

    Wednesday, November 5, 2008

    Pumpcon, Philadelphia, PA, October 25, 2008—PART TWO, Computer Software and Hardware Security Vulnerabilities

    The second part of my research asks: what are the differences between the electronic/physical security and computer software/hardware communities? If these communities have different ethical opinions regarding disclosure, why do they differ?

    At Pumpcon, a conference attendee from one of the U.S.’s largest electronic/physical security companies answered: “It’s because I’ll get my ‘arse’ fired if I talk about vulnerabilities and I’ll probably never work again.” He also made some comment about worrying about his personal safety post-disclosure.

    Another attendee, who works for a computer security firm, said, "Do it anonymously!" That's harder to do with physical/electronic security vulnerabilities; however, disclosures in both communities are taken more seriously when there is proof of the vulnerability, right? It's easy to post anonymously about a computer vulnerability to an online bug report site with your IP address obscured, versus sending photos of yourself clad in burglar attire (or not; see the picture above of one of the guys from MIT [picture from their Defcon presentation]) breaking into something. Although the possibility of arrest is high, some are willing to take the risk. As John Benson, jur1st, calls it, "taking a hit for the team."

    An example of this contrast that I used in my Pumpcon presentation was Anatomy of a Subway Hack, presented at Defcon by three MIT undergraduates: Russell Ryan, Zach Anderson, and Alessandro Chiesa. They really took a hit for the team, with the feds, present in large numbers at Defcon, ready to arrest them in Las Vegas before their presentation.

    Does one group have much more to risk than the other? Is it much riskier (to the discloser and the vendor) to disclose how to beat electronic/physical security measures than computer security measures? If so, how do proponents of full or responsible disclosure do it?

    They do it at HOPE, Defcon, BlackHat, Pumpcon, Shmoocon, and other computer security hacker conferences. The risks disclosers face are evidenced in the all-too-frequent arrests at these conferences.

    One HOPE, BlackHat, and Pumpcon presenter had some important information to impart, which I learned directly affected a company I know. Travis Goodspeed disclosed vulnerabilities in a Texas Instruments chip that is commonly used in biomedical devices and small consumer electronics. The chip has two debugging ports; if accessed, they can be used to delete and replace the software on the chip. If your company were using this chip, would you want to know about this design flaw?

    Travis had presented this at other conferences before Pumpcon, but at this conference I was able to ask him how he addressed it with Texas Instruments. He said that TI did talk with him about the security vulnerability and about the fact that he has written viruses able to take advantage of it. TI was receptive to discussing it with him, but presently, these flaws still exist on the chip.

    When asked what TI could have done better to facilitate more bug reports, he said it would have been good for them to at least give him a contact person to e-mail, so he wouldn't have to repeatedly go through the general information route when he wants to report the vulnerabilities he discovers. And they should fix the flaws; as of the Pumpcon conference, they still exist.

    Tuesday, November 4, 2008

    Pumpcon, Philadelphia, PA, October 25, 2008—PART ONE, Physical and Electronic Security Vulnerabilities

    When I was growing up, one of the benefits of having a CIA dad was that I got to play with the cool stuff he brought home. As a result, I was trained to beat lie detectors, to mentally isolate physical stimuli using a biofeedback machine, to make instant keys with a magical metal that melts within seconds in a spoon held over a lighter flame, and to break into the doors and windows of the house in which I grew up. He also had a very cool lock picking kit; once, when I locked myself out of my car in a dark parking lot, my dad showed up with the kit and had my car open in seconds. Growing up, I thought that was a skill most dads had and that every kid should be taught. These useful skills ended up helping me in many situations.

    So why are the best lock picking techniques and tools kept secret, or even illegal to possess in D.C.? Is this another example of "security through obscurity"? I've spent some time in the lock picking villages at the HOPE and Defcon conferences. Why aren't there more of these opportunities available to the general public, i.e., not only to locksmiths or computer security conference attendees?

    This was one of the issues I addressed in my presentation at Pumpcon, and I had a great time talking about it there. Soon after Defcon, inspired by the events surrounding the three MIT undergrads, I began researching how to disclose security vulnerabilities. I was curious about the divide between electronic/physical security breaches and those involving computer software and hardware: why aren't both groups interested in truthful discussions of flaws? Shouldn't we, as consumers, know if something isn't quite as secure as advertised, whether from the software designers who protect our home computers or the mechanical engineers who designed the locks on our houses?

    It seems that hardware and physical/electronic security hacks take a long time to be exposed. For example, Marc Tobias presented at HOPE and Defcon in 2008 about how to pick Medeco key locks. However, according to my federal government sources, Medeco lock vulnerabilities have been known for more than a decade. What's the reason they weren't publicly discussed? I've been told that it takes a highly skilled lock picker, or a locksmith, to successfully pick Medeco locks.

    But surely, there were skilled lock pickers out there, so why the silence? I found the following on a site that discusses Tobias’ book:

    “A detailed analysis is available together with a video demonstration that clearly shows the method of bypass. This publication has been restricted to locksmiths and the professional security community because of the simplicity of the technique and the potential security ramifications that could result from a public disclosure of the exact method. If you have security responsibility, you may contact the author for access to the restricted document. The password has been posted on ClearStar for security professionals.”

I know that the locksmith community has a lot of power, but do they have the power to silence discussion about how to pick locks? Perhaps so, especially considering that it’s illegal to own a lock picking kit in some states unless you’re a licensed locksmith. Is this a good way to make things more secure or, on the other hand, an example of legislation protecting jobs in an industry? This is arguably how the Freemasons started: stonemasons’ guilds guarding their trade secrets, coveted knowledge protected by those sworn into a brotherhood, which in turn kept the brothers guaranteed work.

For a modern example, someone in the crowd at Pumpcon told me that Joe Grand, aka Kingpin, gets death threats over his lock picking research. I’d like to find more information regarding this statement. (Denied by Kingpin--see comments.) If true, that’s a sobering example of the power of a brotherhood protecting its own. Is lock picking worth protecting through such extreme measures? Does it make us more secure?

    Will this attitude eventually spill over into the computer hardware and software security industry? I fear it might.

Note: The picture above is of the back of Kevin Mitnick’s business card. Obviously, it contains lock picking tools. They will work. Funny, but I think his thumb print is still on the other side of the card! ; )

    Wednesday, October 29, 2008

    Live on the air in NYC on Off the Hook radio show

I was live on the air on Off the Hook out of NYC on October 22, 2008. What a fantastic experience! Although voting machine fraud is not one of my strong suits (I didn’t know the show’s topic before I went on), I loved discussing tech law and policy with the guys. Talking with Emmanuel Goldstein and bernieS was amazing: I read their cases in law school, and there they were on the air with me! Facing Emmanuel across the sound board, with bernieS on the phone, it felt like a coffee shop discussion because they made me feel very much at ease. It wasn’t as difficult as my first TV interview, during which I had to constantly remind myself not to look at the camera; that took such concentration that I seemed distracted. Radio was easier, but it was more than just the medium. Emmanuel, bernieS, Not Kevin, Rob T. Firefly, and Voltaire, who were on the air with me, were awesome. You guys rock!

    After the show, we went for Mexican food in what I think was the Greenwich Village area of NYC. I miss talking about “geek” stuff in Maine. Among other things, we were talking about Second Life: What happens when your character is assaulted online? Legal recourse?

    For dinner, I had a very good chicken enchilada with mole sauce (bitter chocolate). I cannot find Mexican dishes with mole here in Maine, so it was a treat. After dinner, we went to a coffee shop to upload the show and then to a bar (Mars Bar) that is dark because it’s lit with a single bulb hanging from the ceiling. The graffiti is worth checking out if you can read it in the cavernous atmosphere.

And sometime after midnight, I decided not to take the subway because I didn’t know where the heck I was going once I got out of the station in Brooklyn Heights, so I took a cab. The cab driver asked if I was from the Midwest. Could I have been given away by my slight southern accent that rears its head after a few drinks (or when I’m nervous)? Wait…Midwest? That has never happened before. (But I’ve been mistaken for Michelle Madigan at Defcon. That one was the best.)

    Wednesday, October 15, 2008

Disclosure of Security Vulnerabilities and a Geocentric Universe

The title of the presentation is:

    Disclosing Security Vulnerabilities: How to Do It Responsibly

    "Disclosure of security vulnerabilities is done for many reasons. Some of these reasons include: an interest in improving security; warning the public before those with nefarious interests exploit the vulnerability; or for public recognition of skills. There are also different ways to do it including in print or presentations at conferences. Considering both the reasons for disclosure and how it is done affects how security vulnerability research is accepted by the general public, the security community, law enforcement and by the designer of the product being critiqued. This presentation includes how disclosure has historically been done and the differences between the computer and electronic security communities as compared to physical security (locks, alarms, etc.) communities. Relevant legislation, intellectual property considerations and applicable criminal law will be discussed."

    I'm currently engaged in case law research for a presentation on this topic and using Westlaw in addition to my usual journalism searches. Not surprisingly, most of the cases I've read are more about violations of non-disclosure agreements and disclosure of trade secrets by disgruntled employees. However, my friend, Brenno de Winter, recently published an article titled, "Researchers Show How to Crack Popular Smart Cards." I was interested to read that researchers at universities in The Netherlands and in Germany have broadened the research done by the MIT undergrads who were not permitted to discuss or release their source code.

What I'm discovering from my research about computer security disclosure is that a lot of the heat is focused primarily on academia. Remember Professor Ed Felten of Princeton's computer science department and the SDMI challenge? His team won the challenge, but they faced prosecution if they talked about it or tried to publish their academic research. The challenge explicitly stated: "So here's the invitation: Attack the proposed technologies. Crack them."

However, what if the vendor producing an insecure product does not issue an outright challenge, but simply puts the product into the marketplace? A good example is NXP Semiconductors' Mifare Classic. They fought the battle against the MIT students and, for the most part, won, because the students' source code was not distributed.

However, a group from the Dutch Radboud University Nijmegen recently published complete research that would allow someone to build a cloned card. The Dutch courts said that "...researchers shouldn't fall victim to mistakes made by suppliers," and allowed publication. I was also amazed to read that researchers at a university in Germany sliced open an actual chip and, by viewing the IC layers under a microscope, figured out how the chip works and derived the algorithm.

Two different ways of figuring out security vulnerabilities, but with the same result: it's now out there and readily available to a determined attacker. On the other hand, some might say that it's also readily available to a security researcher who can assess the vulnerabilities and make a better design the next time around.
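Why does recovering the algorithm matter so much, whether from software or from photographs of the silicon? The proprietary Crypto1 cipher in the Mifare Classic is built around a 48-bit linear feedback shift register (LFSR) with a filter function; once its structure is known, the keystream is fully reproducible. Below is a toy sketch of an LFSR, NOT the real Crypto1: the register size, taps, and absence of a filter function are simplifications I've chosen for illustration.

```python
# Toy Fibonacci LFSR sketch (hypothetical; not the real Crypto1 cipher).
# The point: once feedback taps and register structure are public, the
# entire keystream follows deterministically from the internal state --
# the security rested on keeping the design secret.

def lfsr_stream(state, taps, nbits, count):
    """Generate `count` keystream bits from an nbits-wide LFSR."""
    out = []
    for _ in range(count):
        out.append(state & 1)          # output the low bit
        fb = 0
        for t in taps:                 # XOR the tapped bits
            fb ^= (state >> t) & 1
        # shift right, insert feedback bit at the top
        state = (state >> 1) | (fb << (nbits - 1))
    return out

if __name__ == "__main__":
    # Same recovered state + known taps => identical keystream every time.
    ks1 = lfsr_stream(0b1011, taps=(0, 1), nbits=4, count=8)
    ks2 = lfsr_stream(0b1011, taps=(0, 1), nbits=4, count=8)
    print(ks1)
    print(ks1 == ks2)  # prints True: fully deterministic
```

This determinism is exactly why "security through obscurity" fails once the chip is imaged: there is no secret left beyond the design itself.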

This debate is nothing new; consider Copernicus's and Kepler's revolutionary teachings and publications. What surprises me is precisely that it is nothing new: prosecutions, including criminal ones, for teaching, discussing, and publishing are still a reality. Where would we be now if Galileo's Dialogue Concerning the Two Chief World Systems hadn't been published or discussed because he feared being burned at the stake? We'd still be in a Ptolemaic system where the planets revolved around us, a pretty ego-centric way to view life (picture of a geocentric universe by Portuguese cosmographer and cartographer Bartolomeu Velho, 1568 [Bibliothèque Nationale, Paris] above).

From a modern perspective on security vulnerability disclosures, the world would be less secure without discussions of design flaws. Professors like Ed Felten, although perhaps not (yet!) as influential as Galileo, are to be lauded, not threatened with criminal prosecution.

    Saturday, September 13, 2008

    The Need for Judicial and Legal Technical Training

    Thank you, Neon Samurai! Someone is reading my stuff or watched my presentation at The Last HOPE. Neon Samurai said, "Tiffany Strauchs Rad had it dead on when she said that legislators and judges need only ask the experts what implications making such laws blindly will lead too ("Hackers" in her words; she's a professor of law and proud Hacker)."

The related project I'm working on is a volunteer-based non-profit that will bring together professionals and students with backgrounds in computer science, engineering, and information technology alongside those with backgrounds in law, public policy, and politics. Our objective is to create a judicial and legal education program covering cyber crime, digital forensics, intellectual property, and electronic discovery, providing a basic technical background for judges deciding these cases, in the hope that reducing technical misunderstandings will produce fairer judicial decisions.

    Here's another idea I recently considered: Why not work to recruit and fund more people with technical backgrounds to run for political office? If we work to educate the judges and lawyers on these subjects and EFF is working to change legislation through their grassroots efforts and through the court system, let's try to get more tech-savvy people in office! Then we can hit it from all angles.

    I don't think that the future of politics, wars, and the economy is going to be about equality of the sexes, racial equality, and currency-based economics as we know it today. It's going to be about technology and how it affects these concepts: Online anonymity will blur concepts of race and sex, wars are going to be electronic over the Internet, and economics is going to be about intellectual property (or lack thereof) and new energy generated and enhanced by technology (as opposed to crude oil).

The future of politics is going to be about the technology. But politics, law, and legislation still typically lag far behind reality. I think a large part of that relates to the people who are our lawmakers. Let's get more people into those positions who understand the technology and who will make responsible choices while understanding the ramifications. No more "series of tubes" legislators, and no more legislators pushing for stronger intellectual property protection to prop up weak companies that fear competition and innovation. Also, let's get someone in office who recognizes that our civil rights also apply online.

    Tuesday, September 9, 2008

    What went on at the September 2600 Meeting?

    Who was there: Nothingface, Infochown, Export, C@t6, 4774x312, Charlye, and Prof. Rad.

    What we discussed: Particle physics, plasma cutters, patents, recent DOJ cyber crime prosecutions, packet sniffing, downtown Portland warehouse real estate for Hacker Space, Defcon 16, and the 3 undergrads from MIT with their Mifare hack. Anything else? Comment about what I may have missed while at Starbucks.

    Friday, September 5, 2008

    2600 and Hacker Space Meeting tonight

    Hello to all of the Portland, Maine hackers! Tonight is the 2600 meeting at the Maine Mall. It starts at 5 PM on the benches outside of the food court. If you cannot make it until 6 PM, we'll still be around sitting at the tables closest to the outside doors. Bring some money for dinner (or your dinner) and we'll chat about the progress of Hacker Space. There are some interesting new cyber crimes prosecutions I'd like to share with you, too. Nothingface will discuss some ideas he has about designing home monitored security systems (I think that Charlye also has some expertise about this topic).

    This should be our last meeting at the 5 PM time. After this meeting, we will have met 2600's requirements to change venue and the time. So if you want a say in where and when we meet in the future, please attend. No one who works likes the 5 PM meeting time, but we'll discuss the Maine Mall venue, too.

    I hope to see some new people there, too. Everyone's welcome.

    Thursday, September 4, 2008

    RFID and Mythbusters

    Did Mythbusters scrap their RFID episode because of legal pressure from the large credit card companies and Texas Instruments or did co-host of the show, Adam Savage, “...get some of his facts wrong?” A spokesperson for TI said that things went differently than Adam described during a presentation at The Last HOPE (Hackers on Planet Earth).

Adam has retracted his statement made at HOPE. However, how much of it was retracted? It seems to me that he admitted he may have gotten the facts wrong about who was on the phone call, and that the retraction applies to Discovery Channel, and their advertisers, being associated with the decision not to do an RFID security episode. All this means to me is that the parties involved in the call were corrected and Discovery was exonerated from the decision. But what was actually discussed, and the rationale behind it, as Adam says, “If I went into the detail of exactly why this story didn't get filmed, it's so bizarre and convoluted that no one would believe me...”, is left for us to speculate about.

    How much can be or should be disclosed about security vulnerabilities? It's a topic that everyone is discussing now.

    Last Day of Defcon 16

Brenno did it! He presented “Ticket to Trouble” in place of the 3 MIT undergraduate students who, under a last-minute Massachusetts court injunction, were not able to present. Before his talk he was nervous he might get arrested, but the legal contingent at the conference told him not to worry: the scope of the injunction was narrow, referencing only the “MIT undergrads” and their research on the crack of the Massachusetts transit cards. Brenno's presentation, however, was exclusively about an extremely similar crack of the Netherlands' mass transit cards last year. The parallels to the US Mifare card were clear, but he made no reference to the MIT guys' research.

He began his presentation with a quote from the Dutch Constitution about freedom of speech. It is because of these rights that he could present freely on the Dutch system; his government cannot prevent him from presenting academic research and, in fact, specifically said it would not. He then followed with the corresponding words from the U.S. Constitution, showing why U.S. citizens SHOULD have the same freedoms under U.S. law. Unfortunately, the US courts didn't share the Dutch courts' opinion.

Brenno (wisely) made no particular reference to the MIT students, but their planned presentation was close to his, minus any technical specifications, code, or photos of people breaking into places. Those three elements were included in the MIT guys' presentation, but a former Fed told me on Friday that the FBI had asked the MIT guys to cut some slides from their presentation. I suspect those were the ones in contention. What I find interesting is that, as I understood the former Fed's comment, the FBI wasn't going to preclude the MIT guys from presenting; it only asked that their presentation be edited due to an ongoing investigation. The Massachusetts federal district court, however, went as far as to chill their speech completely. It's incredible to me because not only was the material (minus the executable code) already distributed to the public on the Defcon CD that all conference attendees received at registration as early as Thursday, but these guys were talking about exploits that were already out there and well known.

I think that Brenno's valiant presentation, albeit about the Dutch and British systems, may have weakened the case against the MIT guys. The MA judges who made the decisions will be hearing about this; it was even coming up on Twitter on Brenno's laptop screen during his talk. Thank you, Brenno. It took you, from the Netherlands, to get up in front of a standing-room-only crowd of over 700 cheering people, present academic security research, and uphold our U.S. First Amendment rights. You may well have affected precedent in US courts regarding this case and (hopefully) improved security for an insecure technology. And, as a member of academia, a special THANK YOU in the spirit of academic freedom.

    And then, during Brenno's Q&A, I made a mad dash to the Vegas airport and barely made my flight, but seeing Brenno's presentation was worth the risk. This time I breezed through TSA security—no dumb questions regarding whether my unpeeled orange on a domestic flight could have been injected with bomb-making poisons (I'm not joking—this has actually happened).

    Thursday, August 28, 2008

    Night 2 Parties at Defcon 16

My party source told me that there was a girl hacker party next door at the Peppermill lounge at 10 PM. Cool! A party of women in computing! I went to Carnegie Mellon Univ. and read Unlocking the Clubhouse by Jane Margolis and Allan Fisher. I don't really know any other techie hacker girls who attend Defcon, so it would be interesting. He gave me a pass with lipstick kiss marks, a skull and crossbones, and a Bond-girl-ish crosshairs gun sight on it. I figured Edgeos must be a hacking group like the Ninjas or Hacker Pimps.

The lounge at the Peppermill was retro lovely, with circular seating around a fire pit rising out of a pool of water. On the TV screens above the bar and the fire pit, a soft-porn video featuring the Edgeos girls was playing. They weren't hacking, but they were sitting at computers doing their soft-porn stuff. I think Edgeos is a computer security company, but their marketing was confusing. The girls were also there serving drinks, and I graciously ordered some VSOP and talked to a couple who do computer forensics for IBM. It was a bit of a disappointment that this wasn't the women-in-computing clubhouse, but the long line of guys standing outside to get into the party didn't mind at all.

I left with Dallas and we went to the Freak Show Party in the Penthouse, put on by Dan Kaminsky's company, IOActive. This party was awesome! They really went all out with the carnie theme: a contortionist, a freakishly tall guy, a bearded lady, and a Twister game set up. Out on the dance floor, I saw my friends from Seattle, met up with Brenno, and we all got to dance with Cap'n Crunch. Brenno joined a former Fed and me in the back of the room to smoke cigars. I had one Dominican left, Brenno had a small box of Dutch cigars, and the former Fed had a Cuban locked in his car in the hotel's parking garage. When he mentioned the Cuban, it seemed as if the techno music slowed down for a moment, my thoughts became fuzzy, and the cognac tasted a bit sweeter. I almost grabbed him by the collar, assertively asked for his keys, and told him I'd happily go get it if he'd share. I distracted myself by busying my hands lighting my Dominican and puffing hard until the impulse passed. A locked car in a parking garage is no place for a Cuban cigar! It's to be treasured and shared....Actually, it was an angry security guard who broke our smoking bliss when she demanded we extinguish our cigars. We had forgotten that smoking is only permitted in bars, or something like that; I don't know for sure. We, regrettably, acquiesced.

Brenno and I danced until the party shut down, then went across the hall to another Penthouse party where anyone who felt like it went behind the bar and served drinks. I met one of the Agent Orange guys, Obphusc8 (I think), who was amazed that I was “22 or something and teaching college classes.” Even though I'm sure it was the beer talking, he still got major points for that one. Seriously. You rock.

In the Penthouse, after watching limos and expensive sports cars trolling the streets below and the neon-lit Vegas skyscrapers, the morning sun seemed far from Vegas. Vegas is a city made for the night: the lights, the theme-park-like hotels, and “what happens in Vegas stays in Vegas” are made only for the cover of darkness. Before spoiling this intoxicating nighttime view with the rising sun, I went back to my cavernously dark room.

    Wednesday, August 27, 2008

    Day 2 of Defcon 16

I began today with Don Blumenthal's talk about working with law enforcement. He's a really good speaker: he's accurate with his tech and legal info, and he approaches the issues from a direct perspective. He was right when he recommended that if a warrant is served, you shouldn't screw with law enforcement. Know your rights, but don't try to mislead them if they have properly requested materials.

Scott Moulton talked about how, in a few states, one needs a private investigator's license to do computer forensics. I had never heard of these laws before, and it's shocking. Being a licensed PI in itself doesn't qualify one to work with electronic evidence, do computer forensics, or do audits for clients. Beyond the long apprenticeship required, the PI exam is mostly composed of questions about guns and guard dogs.

After I returned to Maine, I mentioned Moulton's talk at a TechMaine meeting of information security, network, and sysadmin professionals. Only one person had heard of this scary legislation, but we all agreed that before it could be proposed in Maine, we should let our legislators know that we won't accept it. It seems that in the other states with these laws, the legislation was passed quickly, before the info-security groups knew what was going on. Thanks to Moulton's talk, we'll be on top of this before the PI lobbying group gets to our state. That law would put a lot of good people out of work. As if tech jobs were easy to come by in this state!

After Moulton's presentation, I went to get lunch in the contest room and was delighted when Mycurial sat down next to me. I saw him present at The Last HOPE. We discussed how he won't let his employees at a large bank take their business laptops across the US border because of the laptop searches and seizures being done by US Customs. The policies allow officers to take laptops for a “reasonable period of time” to “review and analyze information.” There are (shockingly!) no requirements for reasonable suspicion. I learn about stuff like that and wonder where our civil liberties are going, and who's making and passing this legislation.

    In an e-mail Mycurial sent me, he said, “There has not yet been a National response from the Privacy Commissioner of Canada, but I'm not sure how long that might or might not take. In the interim, we just don't outsource data to the states.” He has bank employees take wiped hard drives through Customs and then download the data they need through an encrypted network after they've cleared Customs.
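The routine Mycurial described (carry a wiped drive through Customs, pull the data down over an encrypted link afterward) can be sketched in part. This is a hypothetical illustration, not his bank's actual tooling: the file names and functions below are my own, and a real workflow would use full-disk wiping tools and SFTP/VPN transfer rather than local files.

```python
# Hypothetical sketch of the "cross the border clean" idea:
# (1) verify the drive image really is blank before travel,
# (2) after clearing Customs, confirm the downloaded data is intact.
import hashlib

def is_wiped(image_path, chunk_size=1 << 20):
    """Return True if every byte of the image file is zero."""
    with open(image_path, "rb") as f:
        while chunk := f.read(chunk_size):
            if chunk.count(0) != len(chunk):
                return False   # found a non-zero byte: not wiped
    return True

def checksum(path):
    """SHA-256 of a file, to verify the post-Customs download."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

if __name__ == "__main__":
    # Demo with a small stand-in "disk image".
    with open("demo.img", "wb") as f:
        f.write(b"\x00" * 4096)
    print(is_wiped("demo.img"))    # True: fully zeroed
    with open("demo.img", "r+b") as f:
        f.seek(100)
        f.write(b"secret")
    print(is_wiped("demo.img"))    # False: data present
```

The transfer itself would happen only after clearing Customs, with the checksum compared against one kept server-side; nothing sensitive ever crosses the border on the device.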

After finishing lunch and discussing data storage laws (if you store your short-term memory on your hard drive because of a medical condition, should that stored data have a higher level of protection? Think Johnny Mnemonic. What about search and seizure protocols?), I slipped into the “Ask the EFF Panel.” However, the panel was canceled. The MIT students had been slapped with a temporary restraining order prohibiting them from talking about their security research. Massachusetts Judge Woodlock really misjudged this one. Read Bruce Schneier's article (link above); it's a good opinion piece on why full disclosure of computer security issues is good for the computer industry. When the norm was to quietly tell the vendor, many vendors relied on the fallible “security through obscurity” routine and did nothing.

    Last, but not least, I went to “The Commission on Cyber Security for the 44th President.” The Center for Strategic and International Studies has a policy group composed of a myriad of professionals who wrote a security plan to be given to the next US President. Ed Felten is among a long list of impressive contributors. Someday I hope to be a part of a policy group such as CSIS.

I know I let down my Hacker Space group: they requested a lot of pictures of the Grendel mobile hacker space van. I went outside a few times, but it was locked every time. However, I was able to get one picture of it (above, top of blog).

The highlight of my day was being asked by a very nervous teen if I was Michelle Madigan. Huh? She was the Dateline NBC reporter who was run out of Defcon 15 last year; she had refused to get a press pass but was trying to secretly film evil hackers breaking stuff. Rumor was that it was a Fed who outed Madigan; some of the Feds work undercover, and they didn't want to show up on a hidden camera on Dateline, either. I wonder what would have happened if I'd said, “Yes”? It would have been fun.