The super-rare exceptions were those individuals (both execs and their assistants) who seemed to know what the other person was thinking. Of course, telepathic secretaries weren't common, but a few folks were attuned to every nuance of communication. They learned to discern need and intent via cues and habits far beyond the wording of declarative statements. Facial expressions. Tone of voice. Past experience. The multitude of variables that impact any given moment. And trust.
Is this the "intelligence" that Siri promises to deliver?
It grew out of DARPA-funded AI research at SRI, and will pair machine learning with natural language processing. That means it'll go beyond recognizing words and build a track record of how a user combines them (and to what purpose). It'll also chain actions together, so asking it to "find a cheap designer suit within walking distance" will surpass the link lists of Internet search, and tee up store locations, a map of how to walk there, etc.
People have trouble articulating what they want, not to mention knowing it themselves, so it'll be interesting to see how Siri overcomes this human trait. So much of need is based on context and the immediacy of the moment; can a computer app discern these contextual factors?
And why should anybody trust the results? How will users know that the answers weren't sponsored, or otherwise manipulated by interests other than their own? Every system can be gamed, monetized by commercial interests, or simply wrong.
Ultimately, understanding real needs and earning trust are probably the attributes that'll qualify Siri as truly "intelligent."