The spreadsheet has 148 rows.
I started it in February, when the first platform changed its algorithm and my booking ranking dropped from 312 to 340 overnight. I was meticulous about it because meticulous is what I am. Column A: date. Column B: client name. Column C: inquiry or cancellation. Column D: highlighted in yellow if the cancellation correlated with my AI-assistance disclosure. Column E: the platform where the disclosure was filed. Column F: notes.
The yellow highlights run like a fever chart. February had seven. March had twelve. By April it was not worth counting individually anymore, so I started tracking weekly. Column D became a percentage: 73% of cancellations correlated with disclosure. Then 78%. Then, in the week the EU Code of Practice was finalized, 84%.
I am a commercial photographer. I have been a commercial photographer for ten years in São Paulo. I shoot editorial fashion for Vogue Brasil, for Harper's Bazaar, for international campaigns that require me to fly to locations I cannot afford to visit as a tourist. I use AI for color grading. I use AI for compositing assistance. I use AI for background generation on shots where the client's budget does not cover a second location day. I disclose everything.
I disclose everything because my grandmother was a darkroom printer in Incheon who hand-corrected every exposure and kept a log of every correction in a notebook that she showed to anyone who asked. She did not call it transparency. She called it the work. The correction was part of the photograph. Hiding the correction was hiding the work.
The AI tools feel like the same tradition, different chemistry. The disclosure feels like the same honesty. The algorithm does not agree.
Beatriz's apartment is on the fourth floor of a building in Pinheiros that was once a printing press. This fact delights her every time she mentions it, which is often. She bought the apartment two years ago when she was still employed, before she leaked the bias analysis, before the NDA lawsuit, before she became the kind of person who cooks arroz com feijão while explaining intellectual property law.
I have known her for three months. We met at a Selo de Processo workshop in Vila Madalena — she was presenting the technical architecture, I was in the audience holding a folder of booking cancellation printouts that I had brought as evidence of the sorting's economic impact. After her presentation I showed her the spreadsheet. She looked at it for thirty seconds and said: this is the best dataset I have seen on algorithmic discrimination in the Brazilian creative market. I said: it is also a record of every client I have lost since February. She said: those are the same thing.
We became friends the way people become friends in a crisis: quickly, structurally, without the usual courtship of shared taste and gradual revelation. She knows my booking numbers. I know her legal situation. She has three cease-and-desist letters from her former employer. I have 148 rows of yellow highlights. We are even.
Tonight she is cooking and I am sitting at her kitchen table with my phone showing the latest row. Row 148: Vila Madalena gallery confirmed. Not a cancellation. Not a yellow highlight. A commission.
Fernanda — the gallerist — called this morning. She does not want my photographs. She wants my process documentation. She wants the Selo de Processo workflow, displayed as the artwork. The disclosure chain from concept to final image. Every AI tool named. Every parameter logged. Every human decision annotated. The photographs would be secondary — contextual material for the process that produced them.
I said yes before I understood what I was agreeing to.
Now I understand. Fernanda is proposing that the sorting itself — the three-tier classification, the algorithmic downranking, the economic consequences — is the subject of the exhibition. Not as protest art. Not as political statement. As documentation. The same principle that drives the Selo de Processo: make the process visible. All of it. Including the cost.
I told Beatriz. She set down the spoon.
"You are going to exhibit the spreadsheet?"
"Row by row. Every booking lost. Every disclosure made. The cost of being visible alongside the visibility."
She was quiet for a moment. The feijão needed stirring but neither of us moved.
"Can I exhibit the cease-and-desist letter next to it?"
The exhibition is not a plan yet. It is a kitchen conversation. Two women in Pinheiros, one cooking, one showing a phone screen, both calculating what it would mean to make the most private consequences of their transparency public.
Beatriz's calculation is legal. The third C&D claims that her Selo de Processo infrastructure uses "proprietary detection methodology concepts" from her former employer. She published the letter on the project page three days ago — 2,400 views — but exhibiting it in a gallery is different. It is a statement. It could be read as provocation. Her lawyer, who is working pro bono and mostly communicates through sighs, would not approve.
My calculation is economic. Row 148 is a commission, but it does not replace the bookings in rows 1 through 147. The gallery pays an artist fee that is less than one canceled editorial shoot. If I exhibit the spreadsheet, I am making my financial precarity a public document. Every client who canceled will see their decision in a column. Every platform that downranked me will see the correlation laid out in yellow.
Jjang appears on the counter. He sits next to the stove, tail curled, watching Beatriz stir. He has the expression of a cat who has heard the entire conversation and finds it obvious.
"The gallery is in Vila Madalena," I say. "The same block as the Selo workshop where we met."
"That is not a coincidence," Beatriz says. "That block is becoming a zone. A process-visibility zone. Fernanda knows what she is building."
"What is she building?"
"A fourth tier. Not human-made, not AI-assisted, not AI-generated. Process-visible. A category that does not exist on any platform because platforms sort by detection, not by documentation. She is building the physical space for a category that the algorithm cannot see."
I look at the spreadsheet. 148 rows. The first 147 are losses. The last one is something else.
"The show should be called 'Row by Row,'" Beatriz says.
"That is a terrible title."
"It is exactly right."
We eat the feijão at the kitchen table with the laptop still open. The C&D letter is on screen. Beatriz's former employer's logo — the detection startup where she spent two years training classifiers on English-language datasets — is printed in the upper right corner, the same shade of blue as the platform that downranked my photographs.
I think about what it means to be sorted. Not philosophically — I have stopped being philosophical about this — but practically. To be sorted is to be visible in one way and invisible in another. My process documentation is more comprehensive than that of any other photographer in São Paulo. My disclosure rate is 100%. On every platform metric except disclosure, I am the same photographer I was a year ago. The same eye. The same compositions. The same color sense that my grandmother trained into me by holding up prints in the Incheon darkroom and asking: what do you see that is wrong?
The algorithm sees the disclosure. The algorithm does not see the grandmother. The algorithm does not see the ten-year career, the Vogue Brasil covers, the residency at MASP that brought me to this city. The algorithm sees: AI-assisted. And it sorts me into a tier where my work is valued 34% less.
Beatriz sees this differently. She built the algorithms. She knows their logic from the inside. For her, the sorting is not personal — it is structural. A system designed in English, trained on English data, deployed globally, producing false positive rates of 2.3x for Portuguese-language creative work. The personal harm is a consequence of the structural bias. Fix the structure and the personal harm resolves.
I am not sure she is right. My grandmother did not disclose her darkroom corrections because of structural transparency. She disclosed because the correction was the work. The honesty was not a system. It was a practice. You did it because the photograph was not finished until the process was visible.
The Selo de Processo is built on this philosophy. But the Selo is also built by Beatriz, who sees systems. And now it is going to be exhibited in a gallery by Fernanda, who sees art. And the spreadsheet that will hang next to it was kept by me, who sees — what?
I see 148 rows. I see yellow highlights. I see the cost of a practice my grandmother would have called obvious.
Beatriz clears the plates. I help. We work around each other in the kitchen the way you work around someone you have known for three months but who already knows your booking numbers.
"I have been thinking about the detection bias paper," she says. She is washing a pot. "The one that got me fired."
"What about it?"
"It showed that the detection tools produce 2.3 times more false positives on Portuguese-language creative work. I ran the analysis on photography specifically. Not just text. Images that were tagged with Portuguese-language metadata. The false positive rate was higher than the average."
"How much higher?"
"2.7 times. For fashion photography specifically, the rate was 3.1 times."
I set down the dish I am holding.
"My work is being downranked at three times the false positive rate of English-language work?"
"Your work might be. The aggregate data shows the disparity. Individual cases are harder to confirm because the platforms do not publish their threshold decisions."
I pick up the dish again. I dry it. I put it in the cabinet. I close the cabinet.
"That should be in the exhibition."
"The aggregate data?"
"No. That sentence. 'Individual cases are harder to confirm because the platforms do not publish their threshold decisions.' That is the sorting. Not the algorithm — the silence around the algorithm. The inability to confirm what is happening to you specifically, even when the aggregate data says it is happening to everyone like you."
Beatriz turns off the water. She stands at the sink with her hands dripping.
"You are not wrong."
"I am not wrong about this one thing. I am wrong about 147 other things. That is also in the spreadsheet."
She laughs. It is the first laugh of the evening. It sounds like relief.
I walk home through Pinheiros at 11 PM. The neighborhood is the kind of quiet that São Paulo manages between the dinner rush and the bar rush — temporary, negotiated, likely to be interrupted. I pass the building where the Vila Madalena gallery is. The lights are off. Fernanda's gallery is on the ground floor, glass front, visible from the street. By November — if the show happens, if the timelines hold, if Beatriz's lawyer does not forbid the C&D display, if my booking numbers do not collapse further — the spreadsheet will be visible through that glass.
Row by row. 148 and counting. The cost of a practice that should not have a cost.
Jjang is asleep on my keyboard when I get home. I do not move him. I open the spreadsheet on my phone instead and add row 149: the date, Beatriz's name, the column marked "collaboration." No yellow highlight. Not a loss. Not yet a gain. Something else.
The algorithm will not see this row. The algorithm does not sort by dinner conversations in Pinheiros kitchens, by friendship forged in shared precarity, by the decision to make the cost visible because visibility is the only tool left.
Column F, notes: we are building something. I do not know what to call it yet. Beatriz would call it a system. Fernanda would call it art. My grandmother would call it the work.
I call it row 149.