I have explored privacy and identity in a previous post, taken from the individual's perspective.
This post is from the perspective of the state/nation/law. I commented on Michael Becker's LinkedIn post about his list of words and definitions for the identity/privacy space, observing that everyone remains fixated on making their particular solution work for a problem the user does not have: "#privacy." Whilst every language and country has an ideal notion of privacy, the wide variety of cultures means there is no unified "concept of privacy", even where privacy is explicitly named or defined in a specific language, law or culture.
I asked #ChatGPT, the "AI" bot from OpenAI, the question, "How would a capitalist and a socialist view privacy?"
“A capitalist would see privacy as an important aspect of individual liberty and autonomy. They view privacy as a property right that can be traded or sold, and they may support policies that allow companies to collect and use personal data as long as they have obtained the necessary consent.
A socialist would define privacy as the right to personal autonomy and control over one's own thoughts, feelings, and personal information. It is the ability to make decisions about how and with whom one shares personal information, without interference or coercion from external forces such as the government or capitalist institutions. In a socialist society, privacy would be protected as a fundamental human right and would be respected and upheld by the community as a whole.”
StJohn Deakins added to the comments by suggesting we move on from the ideals of the political spectrum to examples:
North Korea: a dictatorship with a command economy. Kim owns you and owns your data, including your thoughts (rights to brain data will become a seriously scary "thing" in the next decade).
China: a one-party state, ruling on behalf of 'the people' with a state-controlled market economy. The party owns you and your data (on behalf of the Chinese people).
USA: an individualist democracy with a free market (where companies have the same legal rights as individuals). You own your data, and others own it too (lots of them).
EU: a social democracy with a free market (where markets serve people, and people choose how). You own your data (as a human right) and, in theory, participate in choosing how it is used.
It is remarkably common for concepts to be well defined in language and law while wide variance remains in their understanding and use. Words and language are dynamic and never stationary, and we should not try to own or control them; defining them for a context is critical. Just consider love, truth, safety, honesty, work, trust, beauty and risk.
Plotting privacy on the Peak-Paradox framework makes me consider how the different interpretations are framed by the purposes presented in the mapping. It is evident that the interpretation of privacy must vary as we move about our understanding of what we are optimising for. Within peaks, such as Peak Individual Purpose, dilemmas are created; consider Mark Zuckerberg, who wants privacy for himself but not for any of his #meta users.
To add another layer to the complexity onion of privacy, it should be noted that the concept of privacy probably exists in each of us only until there is an external demand or constraint. I unpacked this in a post picking up on the work of two behavioural scientists, John Darley and Daniel Batson, who in 1973 published "From Jerusalem to Jericho: A Study of Situational and Dispositional Variables in Helping Behavior", an investigation into the psychology of prosocial behaviour. The research demonstrates that when time pressure is introduced, our dispositional factors ("I like privacy") count for nothing, and we act differently because of the constraints.
When you design for privacy, you need to understand what the user is optimising for, and be aware that your design and/or policy will change what privacy means to the users, because it is itself a constraint.