Aug. 7, 2013 at 5:38 PM ET
If you've ever had to squint to discern those squiggly little letters when logging into a website, you know all too well how annoying these security systems can be. But no matter how frustrating it can be to try to decode a series of distorted word fragments, these systems serve a valuable purpose. Without them, the Internet would be so overwhelmed with spam that we couldn't check our email or buy stuff online.
But are these so-called CAPTCHA systems (short for "Completely Automated Public Turing test to tell Computers and Humans Apart") all that effective? Regardless of their failure or success rate in stopping spambots, the fact that CAPTCHAs themselves are becoming more and more frustrating for actual humans to solve may render the entire security apparatus meaningless. That gave the Australian startup SwipeAds an idea: What if, instead of making CAPTCHAs increasingly complex and difficult, you tried to make them fun?
The company, which was founded by game developers Matthew Ford and Kevin Gosschalk, currently offers a CAPTCHA-like service it calls "FunCaptcha." As the name suggests, the security system is similar to its predecessors, the key difference being that it asks users to play a brief game when they log in.
The company currently offers two games. In one, users watch a rotating circle of portraits and have to pick out the one face that's the opposite gender from the rest of the group. In the other, they just have to turn an upside-down image right side up. Ford said that since its launch in February, more than 4,200 sites have adopted FunCaptcha, and the service stops more than 225,000 spam attacks on a typical day.
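To make the idea concrete, here is a hypothetical sketch of how a server might verify the second game, the upside-down image. Everything in it (function names, the tolerance value, the scoring logic) is an assumption for illustration, not FunCaptcha's actual implementation: the server applies a secret random rotation to a picture, then accepts the user as human if their counter-rotation lands the image close to upright.

```python
import secrets

# Hypothetical sketch, not FunCaptcha's real code: a rotation-style
# challenge like the one described above. The server picks a secret
# random rotation, and the user passes if their answer undoes it
# to within a small tolerance.

TOLERANCE_DEGREES = 15  # assumed slack for imprecise human input

def new_challenge():
    """Pick a secret random rotation (in degrees) to apply to the image."""
    return secrets.randbelow(360)

def is_human_answer(applied_rotation, user_rotation):
    """True if the user's rotation undoes the applied one, within tolerance."""
    # Applied and user rotations should sum to a full turn (mod 360).
    error = (applied_rotation + user_rotation) % 360
    error = min(error, 360 - error)  # shortest angular distance from upright
    return error <= TOLERANCE_DEGREES

challenge = new_challenge()
# A human who turns the picture upright answers close to 360 - challenge.
print(is_human_answer(challenge, (360 - challenge) % 360))  # True
```

The point of the design, as Ford's comments below suggest, is that judging "which way is up" in a photo is trivial for people but requires genuine image understanding from a bot, while the tolerance window forgives the imprecision of a human swipe.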
As far as video games go, picking out a single picture may sound even less intriguing than a round of "FarmVille." But that's by design: part of Ford's goal was to come up with a system simple enough for anyone to use, even non-gamers.
Still, if deciphering distorted text is already easy enough for spambots, will rudimentary video games fare any better? Anyone who's ever tried to play a game on its hardest difficulty can attest that artificial intelligence is just as good as humans, if not better, at some kinds of gameplay.
Ford doesn't shy away from the possibility that game-based CAPTCHAs could ultimately trigger their own "arms race" with spammers once the service gets big enough. But drawing from his experience as a game designer, Ford said that developers just have to "stay away from the things that are easier to automate" — say, timing or accuracy challenges that a computer can easily best many humans at. For other kinds of gameplay, however, humans are still pretty tough to beat.
"The human brain and computers are very different from one another," Ford said. He cites ideas like the uncanny valley — people's natural revulsion to things that appear to be almost-but-not-quite-human — to point out how humans still have innate abilities that are either too difficult or too time-consuming for spammers to bother with.
"The ability to recognize differences and nuances in human faces, that's just part of our evolution as a species," Ford said. "If a computer found that easy to do, we'd already have fantastic-looking computer graphics!"
Just as spam attacks are an inevitable risk of going online, constructing defenses against them is an unfortunate necessity, Ford said. What really matters now, however, is what sort of defenses websites choose to prioritize.
"The bad news is that we are not going to be free of 'humanity tests' on the web anytime soon, since spammers have a huge financial incentive to imitate humans and give us spam," Ford said. "So who should be in charge of making these tests easy, fast, frustration-free, and fun? Of course, game developers. Not Google. We know how to make digital activities instantly understandable, pleasant, and enjoyable."
Yannick LeJacq is a contributing writer for NBC News who has also covered technology and games for Kill Screen, The Wall Street Journal and The Atlantic. You can follow him on Twitter at @YannickLeJacq and reach him by email at: Yannick.LeJacq@nbcuni.com.