I might be wrong on this but I will say it anyway.
The original idea for this came out of the Google App contest for Android. This was the idea that won the competition. So the idea was available for anyone to execute after that.
The RedLaser team might have come up with the idea independently, or they might have learned about it from the Google Android contest. I'm not sure whether the contest or the app came first. Someone here might know.
The latest version of RedLaser is much more impressive than what the Android guys were able to execute. It identifies barcodes before I've even stabilized my hand over the image - upside down, sideways, almost as if I were just swiping past it. The vibration feedback and the rest of the interface are great UX touches. The app was brilliantly executed, both in UX and in hardcore vision algorithms. This wasn't an easy task - they deserve every penny of that $1m.
RedLaser was released on May 15, 2009. It works with the original, 3G, and 3GS iPhones (although it is faster on the 3GS). It was released because scanning barcodes on the iPhone was very subpar at the time. They were the first (and I think still the only - most others license their tech) app on the iPhone to do barcode recognition straight on the phone, with no need to take a picture or send it to a server to be 'read'. Until very recently that wasn't possible with the horribly slow Android SDK.