We're constantly told that AI is changing everything. The last couple of years have been described as a watershed.
That's the same period our colleagues in Gaza have been mostly unable to work, because of the genocide.
When the world talks about Gaza, it rightly focuses on the deprivation of food, healthcare, shelter, and life itself. But when the same world is talking about how profoundly AI is reshaping work, we must confront the long-term inequity being added on top of everything else our colleagues are suffering. Industries are being remade, and entire communities are being locked out of that remaking.
The conditions of returning to work
Our colleagues have tried to return to work at different points over the last two and a half years. They get displaced again. They try again. The pattern is not occasional disruption. It's continuous instability shaped by ongoing siege, bombing, and forced movement.
Even when work is possible, the conditions are the story.
You can only code when there's sun, because power comes from solar panels. The window for productive work is determined by weather and panel availability, not by what's on your task list. You rely on patchy internet, dropping in and out, with no clear idea when it will drop out and when it might return.
This matters more now than it would have five years ago. Coding used to be the most offline-capable digital work. You had a local server, a local environment, offline documentation. You could be productive without connectivity. AI has changed that completely. The tools that have become central to how developers write, debug, learn, and review are almost all online, and almost all paid.
This financial dimension is rarely acknowledged. AI coding tools cost money. Pro and Max plans, API usage, token consumption. For developers in companies, these costs are absorbed. For displaced technologists trying to maintain their skills or rebuild a career, the costs come out of pockets that are also paying for food, water, and shelter, in a context where the local economy has collapsed and basic supplies cost more than you can honestly imagine. Ever-rising subscription pricing in dollars, when you have no dollars.
LinkedIn posts on "Building an app in a weekend with AI" and how everyone now just needs to seize on this liberating opportunity assume conditions that aren't universal. Stable electricity. Reliable internet. Money for token usage. Safety. Vibe coding is a luxury, one of many.
A skills gap that compounds
Two and a half years is an enormous gap in any technical field. In AI, it's catastrophic.
Consider what's changed in this window. The agentic shift, with Claude Code, Cursor, and other tools moving from simple autocomplete to autonomous task execution. The widespread adoption of RAG systems for knowledge-grounded AI. The mainstreaming of MCP and other protocols. New design paradigms around prompt engineering, model selection, and evaluation. Whole new categories of tools (Claude Design, Figma Make, Lovable, v0) that didn't exist 18 months ago.
For skilled developers we've worked with since the early days, coming back means more than finding your coding skills again. The whole ground has shifted. The tools they used to know are not the tools that exist now. The skills they had are not the skills the industry currently rewards.
The market they're returning to has changed too. Junior hiring has slowed across the industry, because AI is increasingly doing entry-level work. So Palestinian developers returning to work are facing two compounding problems at once: the technology has shifted, and the entry points into the industry have narrowed. They're competing against people who've spent the last two and a half years building AI-native workflows, while also facing an industry that's hiring fewer of those people too.
This is not a training problem. It's an emotional one before it's anything else. You're not just learning new syntax. You're absorbing the fact that your hard-won expertise has been partially obsoleted while you were locked out of the process of obsoleting it.
The inequity is not just about individual skills. When people are playing catch-up without the tools to do so, their voices aren't shaping how the industry develops next. Whose products, whose perspectives, whose priorities define the AI tools we'll all be using in five years? Right now, not theirs. The window in which the foundations of agentic AI, model selection norms, and developer workflows are being established is passing without them.
What gets built reflects whose hands shaped it. We are deciding the next decade of the industry now, and Palestinian technologists are being prevented from contributing to that decision.
The same technology, used against them
This last point becomes particularly relevant because the same companies reshaping the industry, the ones racing to ship more powerful systems and acquire more enterprise customers, are also providers of and partners to the systems being used to target Palestinian civilians.
Israel's use of AI targeting systems in Gaza is documented. Systems such as Lavender, The Gospel, and Where's Daddy? use machine learning to identify targets at scale and to optimise strike decisions, in ways that human analysts could not. The technology that has been pitched to the world as the next great productivity revolution is also being deployed at a level of automation that journalists and lawyers describe as enabling unprecedented rates of civilian harm.
This is not an external coincidence. It is structurally connected to the skills gap argument. The people being excluded from shaping AI are the same people AI is being used against. The Palestinian voices that aren't in the rooms where AI futures are being designed are the voices on the receiving end of decisions those rooms have made.
Whose data, whose voices, whose constraints get built into AI systems determines what those systems do, who they're built to serve, and who they're built to surveil and target. And when companies that profit from the AI revolution don't address the use of their own underlying technology against civilian populations, they are not being neutral. They are making a choice.
Why this conversation isn’t happening
The AI industry has spent the last two and a half years in an extraordinary outpouring of public commentary. Every release is analysed. Every benchmark is debated. Every leader's social post is screenshotted. The discourse is loud and overwhelming.
And yet, almost no one is discussing the issue I'm raising here. I won't name all the reasons why, but here are a few.
First, the political sensitivity. Many companies operating in the AI industry have commercial relationships with the Israeli state, Israeli companies, or the broader military-tech infrastructure.
Second, the way "digital divide" framing absorbs specific injustices into general issues. Talking about underrepresentation in AI is acceptable. Naming the specific reason a specific community is being excluded right now is harder and causes more commercial and political risk for those invested in AI. General framing allows everyone to feel they're engaging with the issue without committing to anything concrete.
Third, Palestinian lives are routinely flattened. Even before the latest genocide, Gaza has been perceived for years through the prism of a wall, a conflict, a backdrop. The lives going on inside it have rarely been visible to the wider world as full lives, with the same texture as any other. And even now, in the middle of a genocide, it remains easier for many to hold Palestinians as one-dimensional victims than as people who, like anyone else, want to build things, learn, develop careers, and shape the world they live in.
That flattening is part of how this is allowed to continue. Making people three-dimensional makes what's happening to them harder to accept. So the dehumanisation is not incidental. It is structurally useful to the systems that permit the violence.
Recognising a Palestinian technologist as a full technologist (with ongoing expertise, ambition, and contribution) disrupts that flattening. Which is part of why this particular argument is hard to make, and harder for the industry to engage with. Once you accept that the people being locked out of the AI moment are people you would otherwise call colleagues, the silence becomes more difficult to maintain.
What real response looks like
At Yalla we're trying to do something about this through Yalla Labs. We offer structured, paid programme placements for Palestinian technologists who have the skills to succeed in the global market but need a supported bridge back to it.
This is not a training course; it is a real working environment, built around genuine project delivery, designed to give participants everything they need to become independent, well-paid professionals in the international tech economy.
It's real and it's growing. It's also small, and the scale of need vastly exceeds what one organisation can provide.
What real response looks like is bigger. I don't claim to have the solution, but I also don't want us to simply accept that the problem is too big to address. There are concrete things that organisations can do, and my hope is that naming them is how the conversation continues.
- Infrastructure support. Power and connectivity are foundational. Organisations with relevant supply chains or relationships can provide hardware, solar capacity, and connectivity solutions to displaced technologists. This is not abstract. It's specific equipment, in specific places, sustained over time.
- Token sponsorship. AI coding tools have become essential. Funding access to them, at meaningful volume, removes a direct financial barrier. A company could underwrite the AI tool subscriptions for fifty displaced technologists for a year, at relatively modest cost compared to most enterprise software budgets. Few are doing this.
- Funding for learning. The skills gap requires more than access. It requires structured opportunity to catch up: mentorship, defined pathways, recognition of prior expertise. Organisations that run bootcamps, training programmes, or apprenticeships can dedicate cohorts to displaced technologists with the contextual flexibility their situation requires.
- Hiring that genuinely understands the context people are working in. The standard hiring playbook does not work for displaced technologists. Stable hours assume stable lives. Equipment requirements assume infrastructure. Time-to-productivity assumptions assume environments people aren't in. Hiring well means designing roles around what's actually possible: async-first workflows, flexible hours, outcome-based deliverables, a safe environment that empowers ongoing conversation.
- Commercial pressure on governments. This is the one most companies will resist, and it is the one that matters most. Companies have leverage. They can use it. The choice to remain commercially neutral while their employees, partners, and supply chains live under siege is not neutrality. It is a position. Other positions are available.
None of these responses are sufficient on their own. None of them substitute for the political action that would end the genocide and restore the conditions in which Palestinian technologists, like any others, can simply do their work. But they are real things that can be done now, by organisations that have the means to do them.
A question for the industry
If you're posting happily about how AI is going to revolutionise everything, it's worth asking who's being left behind.
Not in the abstract "digital divide" sense. Concretely. Communities currently experiencing genocide and displacement, whose technologists are being locked out of the most significant shift in this industry's history.
The conversation about AI is going to continue. The decisions about what gets built, by whom, on what foundations, are being made right now. The voices in that conversation, and the voices missing from it, will define the technology that defines the next decade. Some of the people who should be shaping these conversations can't be, because they don't have power, internet, or safety. That's not their failure. That's ours, collectively, if we keep talking about AI as if everyone has the same starting line.
If you're confident AI is going to reshape the world, ask who's in the room while it's being shaped.
Then ask who isn't, and why.
Then ask what your organisation, with its specific position and its specific leverage, is willing to do about it.
