To what degree is Siri listening?

Siri must "listen" for commands, but if it does not receive the command prompt "Hey Siri," can users rely on the algorithm not to record or send conversations or other private audio?

If device specificity is required for the question, assume an iPhone X running iOS 13.x.


You asked about the trustworthiness of an algorithm. The answer to your question is a firm and absolute "No": users cannot rely on the algorithm not to record or send conversations or other private audio. Some algorithms are definitely better than others, but no algorithm is perfect; what is invented by the human mind can be defeated by the human mind.

Let's muse over another word in your question: rely. Reliance is not a binary state of mind; it's much more subtle than that. Consider that we may rely on the mail carrier to deliver our mail each day, but as the consequence of failure is not catastrophic, we do not rely upon it heavily. OTOH, if you are a diabetic living in a remote location, your reliance on the mail carrier's delivery of your insulin has far greater implications.

The "threat" is also a factor in our reliance. Security-wise, perhaps the thing we have going for us is that nobody really cares what we say or write or do. But if we happen to attract attention to ourselves - perhaps as a suspect, or the target of a criminal enterprise, or have become involved in "matters of national security" as this chap recently did, then our level of reliance on algorithms - and computer security in general - becomes a different matter. And so it is with your reliance on Apple's algorithm: does the mail carrier bring you junk, or does he/she deliver insulin?

Finally, let's consider the other variable in reliance: the quality of the algorithm. No algorithm is perfect, but we must still consider how good this one is when deciding how heavily to rely upon it.

Unfortunately it is a practical impossibility for us to know how secure Apple's algorithm is:

  • it's not open source; people outside Apple cannot read the source and would have to disassemble the code to inspect it for flaws

  • even if it were open source, it wouldn't be perfect

  • even if it were perfect, Apple still makes mistakes

  • Apple could even abuse this privilege, or fail to catch an employee who attempts such abuse

Instead, as a substitute for a quantitative analysis of the algorithm itself, perhaps the best we can manage is an objective assessment of some facts. Consider the following:

  1. A complete and massive failure of the algorithm would be bad for Apple's business. Consequently, it seems likely that Apple would put serious effort into the algorithm and into safeguards for the data.

  2. Apple has weighed up Siri's business value against the risks of failures of their algorithm, and has made the decision to retain Siri. "Impose" may be a better word than "retain" in this case, as Apple has not provided an option to remove or uninstall the Siri software.

These facts do not shed much light on the actual quality of Apple's algorithm, but they do speak to motives and motivation. We can safely conclude that Apple has reasonably strong financial incentives to design a reliable algorithm. However, it also seems safe to conclude that Apple considers our private data to be within their purview. Why else would they not provide an option for removal of Siri to accommodate their customers' preferences?

None of this is intended as "Apple-bashing". I am an Apple customer - I have spent more money than I'll ever admit on their products. But I feel it's also important to be realistic. Apple's interests do not coincide perfectly with ours. Should we rely on Apple's judgments and algorithms to protect and safeguard our security and privacy? I hope this answer has provided a little food for thought.


Epilogue:

I have just read @Bradley's answer to this question. I have to say that if the allegations leveled by Apple's contractor in Ireland over the "Siri Incident" are true, then Apple deserves far more than the "Apple-bashing" I tried to avoid in my answer. They deserve substantial fines - perhaps even criminal proceedings. It is simply unacceptable that anyone or any organization be allowed to use their wealth and influence to trample the rights of others. More to the point of answering this question, this episode (if true) speaks volumes on how we should consider relying on Apple's software.


The answer is that it is always listening. The recent article in the Guardian about the former Apple contractor demonstrates precisely why the best option, if you want to mitigate the risk of being recorded, is to disable Siri altogether. Alternatively, buy the cheapest non-smartphone you can find.

This is not just an Apple-specific issue. Unfortunately, when you accept the terms and conditions, you give Apple, and other companies, the right to use your data however they want. That should not be the case when you are not actively using an application, but all too often it is.


We can only tell you what Apple publicly states. Can you rely on it? That's entirely up to you. Personally, I trust Apple and Microsoft orders of magnitude more than I do Google, Facebook, Twitter, et al. That doesn't mean I trust them implicitly, either.

This question is addressed specifically in this press release from Apple: Improving Siri’s privacy protections. Below is a quick summary of what's in the press release.

How Apple Protects your Privacy

  • They try to do as much of the Siri processing on the device as possible
  • Any data stored on Apple's servers is not used to build a marketing profile, nor is it sold to anyone
  • Siri tries to use as little private data as possible. The example given was that a general query would use only a general location, while a more specific and granular request would use more detail (a short sketch of that idea follows this list)
  • Dictating messages with Siri is done on the device, and the messages are not stored on the server
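
To make that data-minimization bullet concrete, here is a minimal sketch of the idea. It is not Apple's actual code: the query categories, the coordinate rounding, and the function names are my own assumptions for illustration. The point is simply that a request carries only as much location detail as it needs.

    // Hypothetical request categories, for illustration only.
    enum QueryKind {
        case general   // e.g. "What's the weather like?"
        case granular  // e.g. "Find coffee shops within walking distance"
    }

    struct Coordinate {
        var latitude: Double
        var longitude: Double
    }

    // Share a location only as precise as the query actually needs:
    // a coarse, roughly city-level coordinate for general queries,
    // and the full-precision coordinate only when it is required.
    func locationForQuery(_ kind: QueryKind, precise: Coordinate) -> Coordinate {
        switch kind {
        case .general:
            return Coordinate(
                latitude: (precise.latitude * 10).rounded() / 10,
                longitude: (precise.longitude * 10).rounded() / 10
            )
        case .granular:
            return precise
        }
    }

    // The same device location, shared at two different precisions.
    let device = Coordinate(latitude: 37.33467, longitude: -122.00898)
    print(locationForQuery(.general, precise: device))   // coarse location
    print(locationForQuery(.granular, precise: device))  // full precision

The design choice the sketch illustrates is that precision is reduced before the data leaves the device, not after.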

Changes they are making to improve your privacy

  • Audio recordings used to improve Siri are no longer kept; only computer-generated transcripts will be used to improve Siri
  • Siri's improvement program is opt-in rather than opt-out, meaning you have to affirmatively give Apple permission to use your recordings
  • Only Apple employees will be able to review customers' recordings

So, based on this, is Siri listening?

Yes.

Am I worried about it?

Not worried, exactly. I trust it, for now, but I remain skeptical. Siri is listening, but if most of what Siri does happens on the device, there is no need to record or process your conversation until it hears "Hey Siri," and that initial trigger detection does not need to be sent to any server. A minimal sketch of that gating idea follows.
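
Here is a small Swift sketch of the gating idea. This is my own illustration under stated assumptions, not Apple's implementation: the ring-buffer size, the stand-in detectsWakeWord check, and the sendToServerForProcessing function are all hypothetical. The shape of the logic is what matters: audio sits only in a small on-device buffer, and nothing is sent anywhere unless a local detector reports the wake phrase.

    // A tiny fixed-size ring buffer standing in for the short on-device audio window.
    struct AudioRingBuffer {
        private var samples: [Float]
        private var index = 0
        init(capacity: Int) { samples = [Float](repeating: 0, count: capacity) }

        mutating func append(_ chunk: [Float]) {
            for s in chunk {
                samples[index] = s
                index = (index + 1) % samples.count
            }
        }

        // Snapshot of the buffered audio, oldest sample first.
        func snapshot() -> [Float] {
            return Array(samples[index...] + samples[..<index])
        }
    }

    // Stand-in for an on-device wake-word detector (a real one would run a local model).
    func detectsWakeWord(in chunk: [Float]) -> Bool {
        return chunk.contains { $0 > 0.9 }
    }

    // The only code path in this sketch that would ever leave the device.
    func sendToServerForProcessing(_ audio: [Float]) {
        print("Uploading \(audio.count) samples for full speech recognition")
    }

    var buffer = AudioRingBuffer(capacity: 16_000)   // roughly 1 second at 16 kHz

    func handleIncomingAudio(_ chunk: [Float]) {
        buffer.append(chunk)                         // stays on the device
        if detectsWakeWord(in: chunk) {              // local, on-device check
            sendToServerForProcessing(buffer.snapshot())
        }
        // If the wake phrase is never detected, nothing is ever sent anywhere.
    }

    // Example: quiet audio stays local; a "loud" chunk triggers the upload path.
    handleIncomingAudio([Float](repeating: 0.0, count: 1_600))
    handleIncomingAudio([0.1, 0.95, 0.2])

In this picture, the only thing a server ever sees is the short buffer captured around a detected wake phrase, which is why keeping the trigger detection on the device matters so much.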