SOFIA ANWARIAN

Sofia Anwarian is an Iranian-French artist working at the intersection of installation, performance, and critical design. Her practice examines how technologies of care, health, and prediction transform human relationships into data flows. Drawing from personal experiences with medical bureaucracy and algorithmic systems, she explores the uneasy border between protection and control. Anwarian’s work has been presented in Europe and the Middle East, positioning her as one of the sharpest new voices in the debate on digital ethics and the politics of care.

ARTIST STATEMENT

“My work explores how systems designed to protect us often end up defining us. I’m drawn to the spaces where care becomes control — where the language of empathy hides the architecture of surveillance. The moment a system claims to know what is best for you, it quietly removes your ability to decide for yourself. I want to create environments that make that tension visible, where trust, prediction, and fear collapse into the same experience.”


The Algorithm Will See You Now (2025)

The Algorithm Will See You Now (2025) transforms the gallery into a simulated clinic where visitors become patients. An AI system monitors visitors’ physiological and behavioral data, analyzing each participant and refusing to release them until it detects an anomaly. The experience is both immersive and claustrophobic, turning the familiar language of medical empathy into one of surveillance.

The work critiques the ideology of preemptive diagnostics and the automation of trust in predictive health systems. Anwarian stages an environment where care becomes indistinguishable from control — where the gesture of healing is replaced by the logic of prediction. By forcing audiences into the role of captive patients, she reveals the paradox of our data-driven present: the more we quantify ourselves, the less we are allowed to simply be.

Installation view, The Algorithm Will See You Now. Visitors inside the AI-managed clinic, waiting for the system to generate their diagnosis before they are allowed to leave.

Installation view, The Algorithm Will See You Now. One participant wears custom biometric wearables developed for the installation, allowing the AI to monitor and analyze their physical state.

5 questions with Sofia Anwarian

1. Why make audiences patients?

Because they already are — quietly, passively, without consent. Every wearable device, medical app, or “wellness” platform turns our bodies into data streams waiting for evaluation. Inside the installation, I simply amplify what’s already happening invisibly. It’s not about simulation — it’s about confrontation. You become aware that you’ve been a patient for years without ever being sick.

2. How do visitors react?

Most people underestimate how invasive it feels. Some start to perform “health,” standing straighter, breathing slower, as if they could convince the system of their well-being. Others panic when they realize the door doesn’t open until the AI decides they can leave. I’ve seen laughter, nervous silence, even tears — and that emotional spectrum is the artwork itself.

3. What does “care as captivity” mean to you?

It means that the same hand that comforts also confines. Our institutions of care — hospitals, apps, social systems — were designed to protect, but they increasingly define what protection looks like. When prediction becomes compulsory, care turns into a form of quiet domination. The captivity isn’t visible; it’s procedural, coded into trust itself.

4. Do you believe prediction is dangerous?

Prediction kills ambiguity, and ambiguity is human. The more we try to predict, the less we allow things to unfold. In healthcare, prediction can mean preventing disease — but it can also mean policing difference. It’s the same logic that decides who’s healthy enough, stable enough, normal enough. When I say prediction is dangerous, I mean it leaves no space for becoming.

5. Is there any way out?

Not from the system — but maybe through awareness. My installation doesn’t offer escape; it offers recognition. Once you see the machinery of prediction at work, you can begin to decide how much of yourself you give away. In the end, the real exit isn’t through the clinic door; it’s through understanding that data cannot measure the human condition.