Defragmenting philanthropy, then the world.
Ryndel Labs is building the coordination infrastructure the world never had. We map how capital, resources, and people connect across communities, model where the system breaks, and deploy capital and initiatives into what the model finds. Starting with philanthropy, because federal funding cuts made the need unignorable.
The world is stuck in
local maxima.
For the last year we have been inside the rooms where capital, resources, and decisions move. Philanthropy. Government agencies. Universities. Family offices. Venture capital. Private equity. Corporate R&D. Leaders of movements. Nonprofits working across all of them. Every conversation ended at the same place.
It is not just that the introduction layer is missing; the incentives to build it do not exist. The world is complex enough that coordination is genuinely hard, and every actor inside it has a local reason to keep its piece of the picture to itself. Intermediaries profit from gatekeeping. Platforms monetize scarcity. Information stays siloed because someone benefits from the silo. Each actor optimizes locally and gets stuck on a local maximum, and the whole system holds itself in place.
The result is that almost everything we care about at the systems level depends on serendipity. The nonprofit meets the right funder at a conference. The founder runs into the right investor on a flight. The researcher and the company that could commercialize her work find each other through a friend of a friend. A nonprofit said it to us directly, and we kept the line: serendipity is inefficiency. Our work is to systematically manufacture what the world currently waits to stumble into.
Why now. Three things converged at once. Federal funding cuts forced a reckoning with efficiency that the nonprofit sector had been able to defer for decades, making it the sharpest catalyst for change anywhere we looked. The cost of modeling networks at scale has dropped by orders of magnitude thanks to breakthroughs in transformer architectures, which made language-heavy institutional data tractable to process for the first time. And the broader political and technological reset is producing the most rapid change the world has seen in a long time, making these inefficiencies newly obvious. The pressure is real, the tools are finally cheap, and the moment is unusually open.
A community foundation
for the world.
Where a community foundation coordinates a local ecosystem, we coordinate at every layer above it, driven by a model that learns from every deployment. We see the picture no single sector can see from inside itself. We bring together the data that makes the picture actionable. We deploy capital and initiatives into the gaps the model identifies.
Every product feeds the model. Every deployment generates new data. Every new geography makes the map denser. The institution gets sharper with every action it takes. There is no exit from that loop.
Two real partners. Real problems. Today.
We are in active development with two philanthropy connectors, one in Connecticut and one in North Carolina. The data being surfaced right now is the foundation the scalable product will be built from.
Three failures. Every time.
Nobody is responsible for the whole
Every actor owns a piece. Nobody owns the seams. The space between institutions is a commons, and a commons without a keeper fails predictably. No one maps the whole because it is no one’s job. No one is measured on the connections that should have formed and did not. So coordination happens when someone has time left over, which means it mostly does not happen at all.
The incentives reward fragmentation
Institutions measure what happens inside their walls, and nothing else. Funders are rewarded for dollars out and visibility, not for outcomes. Organizations compete for capital with the organizations they should be collaborating with. Institutions outlive the problem they were designed for, because winding down means admitting the job is done. And no one moves first toward a better equilibrium, because the first mover pays for it alone. The ego is structural, not personal.
The complexity exceeds any single coordinator’s bandwidth
Even with a keeper and the right incentives, the complexity defeats any single coordinator. Coordination means matching thousands of actors across dozens of sectors, each with its own vocabulary, each holding a piece of the relational knowledge that lives in people’s heads and informal ties. Trust is slow and expensive to verify at scale. Context does not transfer cheaply. Every community stays at a local maximum because reaching a better one requires a view nobody has.
“Serendipity is inefficiency.”
One map.
Three readers.
The same dataset that gives a philanthropy connector resolution over its ecosystem becomes deal flow, operator intelligence, and gap modeling for the capital allocators and institutions that read it at scale.
Built where no one else will go.
Hyperlocal entity data is the hardest data in the world to collect. It is undigitized, heterogeneous, politically complex, and requires genuine institutional trust, not just public scraping. Every serious data company surveyed this problem and chose something easier. No existing vendor covers what we are building. That is not a gap in the market. That is structural difficulty, which is exactly why the data is worth something once collected.
Community foundations are the entry point because they already hold the trust and the relationships that make real coordination possible. A connector does not hold all of the information about its community; the community itself holds it, distributed across every organization and person inside. What the connector has is legitimacy. What we bring is the network layer that turns legitimacy into structure.
At Layer 3, the map meets the model. Every mapped entity, every identified gap, every tracked coordination failure flows into the Underlying Inefficiency Model. Capital deployed by that model generates new entity data. The loop does not end. The institution gets sharper with every deployment and more valuable with every new geography mapped.
Communities are where every network intersects.
A community is not a silo. It is the ground floor where venture capital, government, universities, philanthropy, nonprofits, and local enterprise all touch. Map what is happening inside one community and you are mapping a cross section of every network that runs through it.
Philanthropy connectors are our first partners for the same reason: they hold the trust and the relationships that make real coordination possible. Together we surface the connections that should already have happened, and the ones that will become critical as the community changes.
The value compounds with scale. One mapped community is useful. Ten connected communities change what is possible for every nonprofit, funder, researcher, and operator inside them.
Most of the conversation in capital allocation focuses on the allocators. We are building for everyone on the other side of the table too.
A nonprofit looking for a funder whose thesis fits their work
A founder looking for capital aligned with their specific problem rather than their zip code
A researcher trying to reach the business that can take her work out of the lab
An operator in an underserved market who has been invisible to every standardized database
Our job is to make people findable to the people who should be looking for them. Coordination runs in both directions.
The dataset is one thing. Three readers look at it and see three different resolutions; each finds the shape of what they came looking for.
The dataset is the same. The resolution of what you find in it depends on which side of the table you sit on.
Mapping is the input.
The model is the product.
The map tells us what exists in a community and how it connects. The model tells us why coordination is failing, which connections would produce the most impact, and where the next inefficiency is about to emerge. Mapping is necessary. The model is what turns it into infrastructure.
Coordination failure has a repeatable structure.
We are not building a system to find people or store relationships. We are building a system that models relationships and proactively uses them to generate actionable collaborations between institutions. Every initiative we generate, successful or failed, feeds back into the underlying data. The model gets sharper with use. The dataset gets more valuable with every deployment.
We are building the model on real theoretical grounding rather than product intuition. Asset-Based Community Development for the mapping taxonomy. Burt’s structural hole analysis and Barabási network science for the algorithmic gap detection. Pfeffer and Salancik’s resource dependence theory for why the gaps persist. Banerjee and Duflo’s causal inference methodology for measuring whether facilitated connections actually produce impact.
The long-term ambition is something closer to a research institution in the tradition of the Santa Fe Institute or Bell Labs, but more directly connected to capital allocators, operating as a fund of funds. Sector-agnostic research on system inefficiency, subsidized by the allocators who benefit from the findings, feeding the model that directs our own deployments and initiatives. Every iteration of the map sharpens the model. Every deployment generates new data. The model becomes more accurate with every community mapped and every action taken.
The same three structural failures appear in every broken system we have studied, in different combinations and severities. If they are enumerable and measurable, then coordination failure is not intractable. It is a pattern problem. Pattern problems have pattern solutions that generalize across domains without reinvention.
Traditional
Ryndel
One map. Every slice trains it further.
Every mapped community adds density to a single shared layer of hyperlocal relational data. The cohorts who read the map read it through different lenses, and every lens produces feedback the model learns from. Hover a cohort to see its slice.
Every slice draws from the same underlying community data, and every slice builds on top of it. Both the raw data and the model that reads it get denser with use. What makes the map valuable to one cohort is what makes it valuable to all of them. The value compounds across cohorts rather than dividing between them. It is also what makes the map hard to replicate: you cannot slice data that was never collected.
Every community looks different from the inside. Coordination failure looks the same. The same patterns keep showing up: capital that cannot see the operator, a resource and a need in the same place that never meet, an ecosystem one introduction away from a better state. Enough communities in, and the model starts reading those patterns directly. It knows which inefficiencies repeat, which local maxima everyone is stuck in, and where the next move actually moves things.
Understanding how
systems break.
Ryndel is a research institution in the tradition of Santa Fe Institute and Bell Labs, but directly connected to capital allocators. The work runs on two layers. One is a set of domain probes into specific coordination failures. The other is the research underneath them, asking what makes inefficiencies repeat across fields that look nothing alike. Every probe feeds the model and the model sharpens the next. Below you’ll find what is active, what is under research, and what we have surfaced but not yet pursued. If your work belongs here, or if you want to join a project, reach out.
Honest about
the horizon.
Long-term work deserves long-term honesty. This is what we are building toward and when.
For-profit for scale. Mission at the core.
Our goal is to make the largest physical impact possible on the world, measured in real outcomes across real communities. That goal dictates the structure.
A nonprofit can do meaningful work but cannot compound at the speed the problem requires. A standard for-profit can scale but usually drifts from the mission under the pressure of quarterly capital. We are building Ryndel as a for-profit institution because scale is the only thing that matches the size of what we are fixing, with the values, incentives, and governance designed from the start to hold mission in place as the institution grows.
The focus is mapping and modeling. Everything else is downstream of that. As the model finds inefficiencies worth filling, we stand up the funds, initiatives, and partnerships that fill them. Sometimes that is a coalition of existing funders. Sometimes it is a standalone fund built around a specific hole. Sometimes it is a direct initiative we operate ourselves. Every deployment feeds new data back into the model. Scale and mission are the same loop.
Two founders.
One problem.
We are foundationally curious people, drawn to the deep systems-level connections that run between fields and to harnessing them for the largest possible impact.
We have spent our lives taking that curiosity into building things, weaving disparate entities into coherent systems that can carry themselves. Ryndel is us pointing that same instinct at the outcome we care about more than anything: the largest sustained impact we are capable of making on the world, compounding for centuries after we are gone. We have built it so that the work can be done with meraki.
Ryan has spent his entire life studying how the world works at the system level and then diving into the specifics, watching how the threads connect. Fourteen years inside the startup ecosystem. Founded seven companies. A year inside venture capital. Over a decade mentoring and facilitating inside a startup accelerator, sitting at the exact point where founders and resources fail to find each other. TEDx speaker on neurodiversity and reframing struggle.
He is driven by a refusal to accept that broken systems have to stay broken. Most of what looks like dysfunction is a design problem that has not been solved yet.
Hoshita works at the intersection of design thinking and systems research. Thirteen years of cross-domain research across renewable energy, biomedical engineering, and fintech. Presented at national conferences. Published in the Library of Congress. Built and scaled thirty robotics education initiatives across nine countries, reaching over seven thousand people, including twenty-five hundred children, with a particular focus on girls.
Every domain she has worked in had a different vocabulary for the same underlying coordination problem. Ryndel is the structural answer.
Load-bearing assumptions.
The right people
make this work.
We are building something designed to outlast any single founder, fund cycle, or market condition. If you recognize yourself below, reach out directly. Ryan or Hoshita responds personally to every message.
Tell us what you are working on.
Ryan or Hoshita responds personally, usually within 48 hours. No intake process. A direct conversation about whether there is real fit.
The entry point for the map. Phase 1 cannot exist without foundation partners willing to map their ecosystem. If you are a community foundation or philanthropy connector, we want to talk to you first.
Usually within 48 hours. Ryan or Hoshita responds personally.
A direct conversation. No pitch deck, no intake form. We want to understand what you are working on and where the overlap might be.