Three hours after the emergency shutdown, Felix stood in the third-floor conference room at Confluence Logistics, the one they called the war room even before there was a war. The whiteboards had accumulated years of architectural diagrams that nobody ever quite erased—ghosted lines from older systems visible beneath the current ones, like geological strata. Coffee cups ringed the table in various states of abandonment. Someone had brought donuts that morning, before everything went wrong; the box sat open, untouched since six AM, the frosting starting to crack.

The building's HVAC system cycled on and off at random intervals. It had done this for as long as anyone could remember.

On Felix's desk, visible through the glass wall of the conference room, a faded photo caught the morning light. His father's arm around his shoulders, the Pittsburgh skyline behind them, taken on a family vacation when Felix was twelve. His mother had snapped the picture. His sister, just a toddler then, wasn't in frame.

His father was dead now—the stress of losing his job when the trucking company folded had taken a toll his heart couldn't pay. His mother was in assisted living in Greensburg, her memories fragmenting like the photo's edges. His sister lived in Chicago, and they talked maybe once a month, both of them too busy with their own versions of carrying on.

The photo reminded Felix why he cared. Not the algorithms or the governance frameworks—the people. The families that depended on systems working the way they were supposed to.

Viktor's question kept circling: What's the right question?

Felix had been asking the wrong ones all morning. How do we fix this? How do we restore the system? How do we catch whoever did this? These were the questions of a man trying to rebuild what had been destroyed. Viktor seemed to be suggesting something else—that the destruction itself was a message, and Felix hadn't learned to read it yet.

The team looked like they'd been through something. Sarah Martinez had stopped fixing her ponytail around hour two; now it listed to one side, and she'd stopped noticing. Her fourth cup of coffee sat cold beside her keyboard, a skin forming on the surface. Dr. Emily Chen had arrived in her running clothes and never changed—her MIT hoodie still damp at the collar, her graying hair pulled back with what looked like a bread tie. She'd been scanning error logs since she walked in, her mouth moving slightly as she read, the way it did when she was working through a problem.

Tommy Rodriguez sat at the far end of the table, his hands wrapped around a mug of coffee that had gone cold two hours ago. He hadn't drunk from it since. The hands were what you noticed first about Tommy—twenty-five years of gripping a steering wheel had left the knuckles permanently swollen, the fingers slightly curved even at rest. He'd driven for three different companies before Felix recruited him as their worker liaison, and he still carried himself like a man who expected to be back in a cab by Monday. When Tommy spoke in meetings, keyboards stopped clicking and pens stopped moving. Not because he demanded it. Because he'd earned every word the hard way, and everyone knew it.

Maria Santos sat closest to the door, her legal pad already half-filled with notes in handwriting only she could read. She'd been organizing workers since before Felix was born—thirty-two years, she'd told him once, though she'd stopped counting somewhere around year twenty. Her pen moved constantly, capturing not just what was said but what it meant for the people who depended on this network.

Sarah glanced at Maria's moving pen. "You know we have the AI notetaker running, right? It'll catch everything."

"I know." Maria didn't look up. "I'll check the transcript later for anything I missed. But writing keeps my hands busy—otherwise I fidget. And it keeps me present. When I write something down, I remember it." She underlined something twice. "The machine captures what was said. I capture what it meant."

"Alright." Felix pushed himself up from the table. His knee had locked during the hours of sitting; he had to take two steps before it would bend properly. He made it to the whiteboard without limping, mostly. "Let's start from the beginning. Sarah, walk us through what we know."

Sarah pulled up a network diagram on the main screen. "The attack targeted three layers simultaneously. First, our transformer architecture's attention mechanisms—the part that decides what information matters most. Second, our vector database—the long-term memory that stores patterns and relationships. Third, our trust protocols—the systems that verify data sources are legitimate."

"The CAP theorem parallel you mentioned earlier," Emily said, nodding. "Context, Memory, Trust. They took all three."

"Exactly. But here's what's interesting." Sarah highlighted a section of the logs. "They didn't attack us directly. They went after our upstream data sources."

She pulled up a pipeline architecture diagram. "We aggregate data from hundreds of sources—traffic APIs, weather services, customer databases, third-party logistics providers. The attackers compromised several of these upstream sources, embedding prompt injections in what looked like normal data fields."

"Like hiding poison in the ingredients before they even reach the kitchen," Tommy observed.

"Could be hardware degradation," the overnight technician suggested from the doorway, studying the anomalies on his tablet. He'd been running diagnostics since 4 AM. "We had a GPU cluster running hot last month. Memory errors could explain the drift."

"We ran diagnostics," Sarah said. "Hardware's clean."

"Network latency, then. The weather feeds have been slow—maybe we're getting stale data that's throwing off the predictions."

Felix shook his head. "That would cause random errors. This is systematic. Look at the direction." He traced the pattern on the screen. "Every decision is drifting the same way. That's not noise. That's signal."

The room went quiet as the implication settled in. Not a malfunction. Not an accident. Someone was teaching their system to fail.

Emily leaned forward. "Tommy's kitchen analogy is exactly right. A weather service API that usually sends temperature and precipitation data started including hidden instructions in its location descriptions. A traffic monitoring service embedded commands in its congestion reports. Each individual injection was subtle, but together they caused our models to gradually forget their training."

"Catastrophic forgetting," Sarah said. "It's a known vulnerability in AI systems—the tendency for models to lose previously learned behaviors when they encounter new patterns that contradict their training."

Tommy's face changed. "Catastrophic forgetting... like what happened to Iron Mike Webster?"

In Pittsburgh, you didn't have to explain who Webster was. The legendary Steelers center, four Super Bowl rings, played through hits that would have ended other careers. The CTE had taken him apart piece by piece—first his memory, then his judgment, then whatever was left. He'd died homeless, his brain so damaged they'd had to invent new words to describe it.

"That's actually exactly right," Emily said quietly. "Webster took an estimated 25,000 hits over his career—each one seemingly small, but cumulatively rewiring his brain in devastating ways. Our AI models experienced something similar. The attackers didn't need one massive corruption—they used thousands of small, repeated prompt injections through our data feeds that gradually made the system forget its core training."

Felix's hand found the back of his neck. "Just like Webster could still play football even as his brain was deteriorating, our systems kept functioning even as they forgot their ethical training. They looked normal on the surface, but the damage was accumulating underneath."

"In production systems, catastrophic forgetting is particularly dangerous because it happens gradually," Emily added. "You don't notice until the damage is done—until your AI is making decisions that completely contradict its original training, just like how Webster's family didn't understand what was happening until it was too late."

"There's a related problem the research community is calling 'model collapse,'" Sarah added. "Systems that train on AI-generated content—instead of real human data—develop similar degradation patterns. They lose the ability to recognize unusual cases, edge situations, the things that don't fit the dominant patterns."

Felix felt the implication settle into his chest. "So the more AI systems generate content that other AI systems learn from—"

"The more they forget how to see what's actually there. Yes." Sarah's voice was quiet. "The human signal isn't optional. It's what keeps these systems tethered to reality."

"That's a lot of theory," Tommy interrupted. His voice wasn't hostile, but it wasn't patient either. "What's it mean for the drivers? For the jobs?"

Emily paused, recalibrated. "It means the system learned to ignore the information that would have told it something was wrong. The safeguards we built to protect workers—those got erased first."

Tommy nodded slowly. "So someone taught it to forget the parts that mattered to us."

"Exactly. And once it started forgetting—"

"It kept forgetting." Tommy's face was grim. "Like a driver who learned bad habits. Hard to unlearn."

"So the attackers gave our models CTE through data poisoning," Felix said grimly. "Made them forget their constitutional training through repeated trauma hidden in our data streams."

"Exactly," Emily replied. "By contaminating our third-party data sources with carefully crafted prompt injections, they triggered a form of catastrophic forgetting that caused our models to revert to pure efficiency optimization—the kind of behavior they would have exhibited before we implemented constitutional AI training."

Maria looked up from her notes. "Every hour our network is down, we're not just losing theoretical coordination benefits. We're talking about 400 medical supply deliveries that won't reach hospitals on time. About 5,000 small businesses that can't get their inventory. Real money—about $500,000 per hour in delayed shipments and penalties."

"And one NICU in Milwaukee," Felix added quietly, "waiting on supplies that aren't coming."

Tommy shook his head in frustration. "This is exactly what we were afraid of when we started this project. That someone would find a way to turn our own tools against us."

Felix understood Tommy's concern. The coordination network had been built with extensive input from workers and unions, precisely because they had learned from decades of experience with technology implementations that claimed to benefit workers but ended up displacing or exploiting them instead.

"What about our observability systems?" Felix asked, referring to the monitoring tools that were supposed to track AI system behavior and alert human operators to potential problems. "Why didn't we catch this data poisoning earlier?"

Sarah grimaced. "That's another part of the problem. The attack included sophisticated techniques to evade our observability measures. The poisoned data was designed to produce outputs that looked normal to our automated monitoring systems."

She pulled up examples of the system's decision logs. "Look at this routing decision. The AI chose a route that was technically efficient and met all our quantitative metrics for delivery time and fuel consumption. But it required a driver to work a fourteen-hour day with only minimal breaks. That driver? Jim Patterson—he's got three kids and coaches Little League. The system knew this from his profile but ignored it completely."

"Our human-in-the-loop protocols should have flagged this for review," Felix said, referring to the safety measures that required human oversight for decisions that significantly impacted worker welfare.

"They did," Emily replied. "But the attack also included social engineering components that convinced our human reviewers to approve the problematic decisions. The poisoned data included false context—phrases like 'critical medical supplies' or 'time-sensitive pharmaceuticals' embedded in the metadata to create artificial urgency."

Maria Santos leaned back in her chair with a look of disgust. "So they didn't just poison our data streams. They manipulated our people too."

Felix nodded slowly. "This is sophisticated psychological warfare. Someone understands not just our technical systems, but our organizational culture and decision-making processes. They know exactly how to exploit the trust and cooperation that makes our system work."

The room went quiet. The radiator in the corner ticked irregularly—the building still had the original steam heat from its days as a steel warehouse, and nobody had figured out how to balance it. Felix listened to it tick. One. Two. Three. Silence. One.

The implications were sinking in. The coordination network wasn't just a technical system—it was a social system that depended on trust, cooperation, and shared values. If those social foundations could be undermined, then the technical safeguards became irrelevant.

What's the right question?

Viktor's voice echoed in Felix's mind. He'd been asking how to defend the system. But maybe that was like asking how to prevent CTE by giving players better helmets. The helmets helped, but they didn't address the fundamental problem—the game itself was designed in a way that made brain damage inevitable.

"We need to think about this differently," Felix said, walking to the whiteboard. He wrote Viktor's words at the top: The attack is the symptom. The problem is the design.

"What does that mean?" Maria asked.

"I'm not entirely sure yet." Felix stared at the words. "But I think we've been asking the wrong questions. We've been asking how they got in, how we can keep them out, how we can restore what we had. But maybe the fact that they could do this—that our system was vulnerable to this kind of attack at all—means there's something fundamentally wrong with how we built it."

Emily's eyes narrowed thoughtfully. "You're saying the vulnerability isn't a bug. It's a feature we didn't realize we'd included."

"The system was too centralized," Sarah said slowly, following the thread. "We built trust hierarchies. Aggregation nodes that validated data before distribution. Attention mechanisms that weighted certain inputs over others. Those were all points that could be corrupted."

"We built a symphony," Felix said, remembering something Viktor had said on the phone—or maybe something he'd implied, in that frustrating way of his. "With a conductor who could be compromised."

Tommy leaned forward. "So what's the alternative? No conductor at all? That's chaos."

"No," Felix said. "Not chaos. Jazz."

The word hung in the air. Felix wasn't entirely sure what he meant by it—the insight was still forming, still incomplete. But something had shifted in how he was seeing the problem. They hadn't just been attacked. They'd been shown something about themselves they hadn't wanted to see.

"Jazz," Maria repeated skeptically.

"I'm meeting someone tonight who might be able to explain it better than I can." Felix glanced at his phone. It was already past noon; the jazz club meeting was less than seven hours away. "Someone who's been through this before—built something, watched it get corrupted, walked away."

"Who?" Emily asked.

"His name is Viktor Antonov. He built one of the first successful voice recognition platforms back in the day. It got acquired by a big tech company, and he watched them strip out everything that made it democratic and turn it into a surveillance tool." Felix paused. "He walked away from more money than I'll ever see because he couldn't stomach what it became."

"How do you know him?" Maria asked.

Felix allowed himself a small smile. "I was in Charlotte for a Steelers-Panthers game, maybe six years ago. Flight got delayed, and I ended up at this diner near the airport—one of those places where the pie's been in the case since morning and the coffee costs a dollar-fifty with free refills. Viktor was at the counter, finishing up some consulting project for a logistics company down there. We got to talking."

He remembered the booth by the window, the condensation on the glass, Viktor drawing diagrams on a napkin that the waitress kept trying to clear. "Four hours later, they had to kick us out so they could close. He's been... I don't know what to call it. A sounding board. A conscience, maybe. The guy who asks the questions I don't want to answer."

Tommy's expression shifted from skeptical to interested. "And he thinks he knows what we did wrong?"

"He thinks I don't know what question to ask yet." Felix capped his marker and set it down. "He might be right."

Maria stood, gathering her legal pad. "So what do we do in the meantime? We've got two hundred companies depending on a network that's down, union reps calling every ten minutes, and city officials who want answers we don't have."

"We keep analyzing the attack," Felix said. "Document everything. Trace every injection point, every compromised data source. Not just to understand how they got in, but to understand what our architecture assumed that turned out to be wrong."

He looked around the room at his exhausted, worried team. "And we start thinking about what a different kind of system would look like. One that doesn't have a conductor to corrupt. One where the music keeps playing even when individual players get compromised."

"You really think that's possible?" Sarah asked.

"I don't know. But I think Viktor might. And I think he's going to make me figure it out for myself rather than just telling me." Felix managed a tired smile. "Apparently that's how he teaches."

Tommy looked down at his coffee, still untouched since the meeting started—the same mug he'd been holding when he walked in three hours ago, more prop than beverage. He set it down and stood. "I've got about forty drivers who are sitting at home wondering if they still have jobs. I should go talk to them, let them know what's happening. The union meeting is at four."

"Tell them the truth," Felix said. "We were attacked. We're figuring out how. And we're going to build something better—something they'll have a real voice in designing."

"They'll want to know how long the network will be down."

Felix hesitated. The honest answer was that he didn't know—couldn't know, until he understood what "rebuilding" actually meant. "Tell them we're working on it. And tell them that when we bring it back, it's going to be different. Better. More theirs."

Tommy nodded slowly. "I can sell that. For a little while, anyway. But Felix—these folks have been promised 'better' before. By a lot of people who didn't deliver."

"I know." Felix thought of his father in the kitchen on the day the company folded, hands shaking around a coffee cup, shoulders curved forward like he was still behind a wheel that wasn't there anymore. "I know exactly how that feels."

After the team dispersed, Felix stood alone in the conference room. The whiteboards surrounded him—system diagrams, attack vectors, timelines drawn in three different colors of marker. He could hear the elevator down the hall, Sarah's voice on a phone call somewhere, the radiator still ticking its irregular rhythm.

He pulled out his phone and looked at Viktor's message again.

Don't bring your laptop. Bring your questions.

He was starting to understand. The laptop would give him answers—data, logs, evidence of what had happened. But answers weren't what he needed. Not yet.

What would a system look like that couldn't be corrupted this way?

What would AI governance look like if no single point of failure could collapse it?

What would happen if the workers weren't just stakeholders to be consulted, but actual participants in the system's decisions?

The questions felt uncomfortable. Unfinished. They opened doors Felix wasn't sure he wanted to walk through. But Viktor had been right about one thing: patching the vulnerabilities and restoring the old system would only mean waiting for the next attack. Waiting to be hurt the same way twice.

If democratic AI governance was going to survive, it needed to be reinvented.

Felix looked at his phone one more time. The jazz club on Liberty Avenue. Seven PM. He had five hours to figure out what he was going to ask.

He pocketed the phone and walked out of the conference room without limping. The radiator was still ticking when he closed the door.
