“Open models win adoption. Closed models win margins.”

AI product lead

Redeployed is a weekly newsletter that breaks down one important AI story at a time for leaders in technology. Every issue explains what the shift means for technology companies and how smart leaders can use it to get ahead.

A product team is sitting in a planning meeting, trying to make a decision that did not exist a year ago.

They are choosing which model to build on. One option is open. It is cheaper, easier to customize, and gives them more control over how the system behaves. The other is closed. It performs better, handles edge cases more reliably, and requires less work to get to production. Both options are viable. Neither is perfect.

For the past year, that choice felt binary. You picked a side and built around it.

According to recent reporting, Meta plans to release parts of its next generation of AI models under an open-source license, while keeping other components proprietary and reserving its most advanced models as closed systems. At first glance, that looks like a compromise between two competing philosophies. In reality, it signals something more deliberate.

Meta is not trying to win the open versus closed debate. It is using both.

Then the Tradeoff Started to Break

For years, the conversation around open source in AI was framed as a belief system. Some companies leaned into openness as a way to accelerate innovation and build community. Others focused on closed systems to protect performance advantages and capture value.

That framing is starting to fall apart.

Open models are not just about collaboration. They are about distribution. They spread quickly, get embedded into tools, and become the default choice for developers experimenting, building prototypes, or launching early versions of products. Once a model becomes part of the workflow, it gains something more valuable than raw performance. It gains mindshare.

Closed models serve a different purpose. They concentrate. They offer higher performance, better reliability, and access to the latest advances. They are where companies maintain an edge when quality matters most.

What Meta is signaling is that these are not opposing strategies. They are complementary layers of the same system.

Open where reach matters. Closed where advantage matters.

So Teams Started Building Differently

For companies building products, this changes how decisions get made.

The question is no longer whether to adopt open or closed models. It is where each one fits.

A team building internal tooling might rely on open models for tasks like classification, tagging, or retrieval, where cost and flexibility are more important than absolute accuracy. The same team might switch to a proprietary model for customer-facing features, where mistakes are visible and harder to recover from.

Instead of committing to a single provider, the system becomes a mix of capabilities. One model for speed. Another for cost. Another for quality.

The architecture becomes less about picking the right model and more about designing how different models interact inside the same workflow.
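That kind of workflow can be reduced to a routing layer: a table that maps each task type to the model whose cost and quality tradeoffs fit it best. The sketch below is a minimal, hypothetical illustration of the idea; the model names, task categories, and the `ModelChoice` structure are assumptions for the example, not any specific provider's API.

```python
# Minimal sketch of task-based model routing. Model identifiers and the
# route table are illustrative assumptions, not a real provider's API.
from dataclasses import dataclass

@dataclass(frozen=True)
class ModelChoice:
    name: str        # illustrative model identifier
    deployment: str  # "open" (self-hosted) or "closed" (API provider)

# Route each task to the model whose cost/quality tradeoff fits it:
# open models for cheap internal tasks, a closed model where quality
# is visible to customers.
ROUTES = {
    "classification": ModelChoice("open-small", "open"),
    "tagging":        ModelChoice("open-small", "open"),
    "retrieval":      ModelChoice("open-embed", "open"),
    "customer_chat":  ModelChoice("closed-frontier", "closed"),
}

# Unknown tasks fall back to the highest-quality option.
DEFAULT = ModelChoice("closed-frontier", "closed")

def route(task_type: str) -> ModelChoice:
    """Pick a model for a task; fall back to the default on a miss."""
    return ROUTES.get(task_type, DEFAULT)
```

The design choice worth noting: the route table makes every model dependency explicit in one place, which is exactly what guards against the hidden-dependency problem discussed later in this issue.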

That shift puts more pressure on the people building the system, who need not just to implement it but to understand the tradeoffs behind every decision. In practice, it is pushing teams to look for engineers who can build across these mixed environments: people who understand how to integrate models, data, and workflows into real systems. Many companies are expanding in that direction through nearshore AI development teams that already work across open and proprietary stacks and can move quickly without adding unnecessary complexity.

This issue of Redeployed is brought to you by Tecla: The shift is not just in the models themselves, but in how companies build around them. Teams are no longer choosing a single approach. They are combining open and proprietary systems based on cost, performance, and control. That changes how products get built and who you need on the team. It’s not just about adding engineers, but bringing in people who can design and scale across multiple AI stacks. The companies moving fastest are hiring for that flexibility. Tecla helps companies hire senior tech talent in the U.S. and nearshore who already work in these environments, from AI leadership to the engineers building and scaling these systems, so teams can move faster without locking into one path.

And That’s When the Advantage Moved

As this pattern becomes more common, the competitive layer starts to shift.

Access to models is becoming less of a differentiator. Open ecosystems are expanding, and proprietary providers are competing aggressively on performance and pricing. Over time, most companies will have access to similar capabilities.

What begins to matter is everything around the model.

How easily can developers integrate your product into their workflow? How quickly can they adapt it to their own data? How well does it fit into the systems they already use?

In that context, open source is not just a technical choice.

It becomes a distribution strategy.

The companies that make it easiest to build, extend, and experiment often win early adoption. And early adoption tends to shape long-term defaults, even if the underlying models are not the most advanced.

But the Tradeoff Didn’t Disappear

This hybrid approach introduces its own set of challenges.

Flexibility can create hidden dependencies. A system that looks modular on the surface may still rely heavily on a proprietary provider for critical functions. If those dependencies are not visible, they become harder to manage over time.

There is also a tendency to over-index on open models in situations where performance still matters. Not every use case can tolerate tradeoffs in accuracy or reliability, especially when the output is exposed directly to users.

At the same time, as more companies build on similar open foundations, differentiation becomes harder. If everyone has access to the same tools, the advantage shifts to execution, integration, and product experience.

The technology becomes more accessible.

The bar for using it well gets higher.

What This Actually Means Going Forward

The market is moving away from simple choices.

Open versus closed is no longer the decision. It is one variable in a broader system design problem.

If you are building a product today, you are not just choosing a model. You are deciding where you want control, where you are willing to depend on external providers, and how easily you can adapt as the landscape changes.

The companies that succeed in this environment will not treat models as fixed foundations. They will treat them as interchangeable components, designing systems that can evolve as new options emerge.

Because the real risk is not choosing the wrong model.

It is building a system that cannot adapt when the right one changes.

And in a market that is moving this quickly, that might be the only mistake that matters.

Connect With Other Technology Leaders

If you want to connect with other technology leaders having real conversations about AI and how it is changing business, check out GILD Curated Circuit.

More to come…

Gino Ferrand, Founder @ Tecla
