If you've ever wondered what mysteries lurk behind the friendly but stern visage of the U.S. Army's official chatbot, Sgt. Star, today is your lucky day. The Electronic Frontier Foundation has learned everything there is to know about this semi-intelligent living FAQ.
Sgt. Star (short for "Strong, Trained And Ready") is your virtual host when you have questions about the Army but don't know exactly where to look them up: what scholarships are available, how long basic training lasts, things like that.
He was created to help with the enormous increase in interest experienced by the Army after the 9/11 attacks, and has undergone expansion and revision over time — his answers, all 835 of them, are updated regularly.
Sgt. Star as he appears on the Army website.
Over five years (and at least $5 million), Sgt. Star has answered more than 10 million questions across 2.8 million sessions — an average of roughly 1,550 sessions per day. That's a lot less work for human recruiters.
But there's a vaguely sinister aspect to the sergeant as well. The EFF found an "inadequately redacted" government document explaining that the chatbot technology was originally used by the government to "engage pedophiles and terrorists online."
The precursor to Sgt. Star, then, roved services like IRC (Internet Relay Chat) looking for suspicious behavior and reporting it. Next IT, the company behind those original bots, also created commercial versions — if you've ever chatted with a bot at Amtrak or United Airlines, it was probably part of the same extended family.
A few details worried the privacy-minded folks at the EFF: Is Sgt. Star on the lookout for suspicious behaviors as well? If so, what kind? And why doesn't the Army keep records of its old responses? Answers may be forthcoming, but if you're curious in the meantime, you can read through the nearly 300 pages of possible answers (PDF) the sergeant can give you.
First published April 18, 2014, 4:41 PM