Iowa Student Gets Internship from Google for Reporting Security Flaw: More Proof Vendors Need Stronger Security Checking For Their Products

Last night, while my sons and I were watching the news, it was reported that David Bloom, a St. Ambrose University student in Davenport, Iowa, discovered a security flaw in early December while he was using the Google Docs and Spreadsheets program.

“…he discovered a way to run JavaScript code in the Web-based program. This vulnerability, referred to as cross-site scripting, or XSS, could allow a hacker to fool a user into divulging a password and other account information, which could result in malicious uses, such as sending spam or viruses from that user’s account.”
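For readers who have not seen XSS in action, here is a minimal, purely hypothetical sketch (not Google's actual code; the function names and the attacker's payload are made up) of how user-supplied text dropped straight into a page's HTML can run as script in someone else's browser, and how plain HTML escaping shuts it down:

```typescript
// Hypothetical illustration of an XSS pattern: user-supplied text is dropped
// straight into HTML, so a crafted "document title" runs as script in the
// victim's browser.

// Naive rendering: the attacker's input becomes live markup.
function renderTitleUnsafe(userTitle: string): string {
  return `<h1>${userTitle}</h1>`;
}

// Safer rendering: escape the characters HTML treats as markup.
function escapeHtml(text: string): string {
  return text
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;")
    .replace(/'/g, "&#39;");
}

function renderTitleSafe(userTitle: string): string {
  return `<h1>${escapeHtml(userTitle)}</h1>`;
}

const attackerInput = `<img src=x onerror="stealAccountInfo()">`;

console.log(renderTitleUnsafe(attackerInput)); // script-bearing markup reaches the victim's page
console.log(renderTitleSafe(attackerInput));   // inert, escaped text
```

The actual flaw Bloom found was no doubt more involved than this, but the principle is the same: anywhere user input is echoed back into a page without escaping, an attacker can plant script that runs with the victim's session.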

A pretty significant find!
He alerted Google on December 3 about the flaw. The co-creator of the flawed program, Sam Schillace, got back to Bloom the very next day and asked him if he wanted to do a summer internship with Google.
Bloom accepted and will be in Palo Alto, CA for his internship from May 22 to August 14.
My 7-year-old Heath thought he should have gotten a big cash reward, “like, a thousand dollars.” My 10-year-old Noah said, “Gee, if that problem would have messed up lots of computers with viruses, and caused a lot of identity thefts, it would have cost Google millions! He should get a lot more money along with a job.” Noah has always preferred Ask.com to Google. 🙂
He has a good point. If the Google application flaw had been maliciously exploited, it could have cost Google millions to clean up the resulting mess. They are very lucky Bloom contacted them; Bloom discovered what the Google developers should have found and fixed before the application was put into production.
Of course, it is good ethical practice to notify vendors of their security problems, and Bloom is to be commended. It is great to see a security-conscious college student, and great that Bloom is getting an internship as a result. This is certainly motivation for folks to report security problems. But I wonder, is it enough? Being notified about a significant security flaw such as this one is certainly worth quite a bit to Google.
And what will Google do to strengthen its application testing procedures to ensure such a big security flaw does not make it into production again?
This points out the need to incorporate information security and privacy checks into the entire SDLC process, as I've blogged about many times before, such as here.
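To make that concrete, here is one small example (a sketch with made-up payloads and function names, not any real vendor's test suite) of the kind of automated check that can be wired into the build pipeline as an SDLC security gate: feed known XSS payloads through the escaping routine and fail the build if any markup survives.

```typescript
// Sketch of an automated XSS regression check that could run as part of a
// build pipeline. The payload list and escapeHtml helper are illustrative.
import { strict as assert } from "assert";

const xssPayloads = [
  `<script>alert(1)</script>`,
  `<img src=x onerror=alert(1)>`,
  `"><svg onload=alert(1)>`,
];

// Same kind of escaping helper the application's rendering code would use.
function escapeHtml(text: string): string {
  return text
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;")
    .replace(/'/g, "&#39;");
}

for (const payload of xssPayloads) {
  const escaped = escapeHtml(payload);
  // No raw angle brackets should survive escaping; if any do, the payload
  // could still be parsed as markup, so the check (and the build) fails.
  assert.ok(
    !escaped.includes("<") && !escaped.includes(">"),
    `payload survived escaping: ${escaped}`
  );
}

console.log("XSS escaping checks passed");
```

Checks like this will never catch every flaw, but running even a small battery of known attack strings on every build is a lot cheaper than cleaning up after an exploit.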
So many vendors either just shrug off the flaws reported to them or try to make excuses for them. Perhaps other vendors will also start rewarding good Samaritans for reporting security problems, as well as strengthening the incorporation of security checks throughout the SDLC process.
What do you software vendors think? What would you do if someone reported a significant security flaw in your product? And are your application testing procedures rigorous enough to catch this type of security flaw before your application is approved for production?
