After the pivot from hardware to software, the question became: what kind of software company should we build? We could build a product. We could build a platform. The distinction matters enormously — a product solves a specific problem for a specific user, while a platform enables others to solve problems we haven't thought of yet.
The more we examined the AI workflow space, the more convinced we became that the platform model was the right one — not just commercially, but structurally. Here's why.
The Unsolved Workflow Problem
The long tail of AI workflow needs is vast and largely unaddressed. There are thousands of specific, high-value workflows that could be built with today's AI capabilities — workflows that exist at the intersection of a particular domain, a particular file type, and a particular intent. Most of them haven't been built yet, not because they're technically hard, but because no single company has both the domain knowledge and the engineering capacity to build all of them.
Consider the diversity of people who have document-related problems that AI could solve:
- A hiring manager who receives 200 resumes and needs a ranked shortlist against a specific job description
- A compliance officer reviewing contracts for specific clause patterns across a portfolio of documents
- A researcher extracting structured data from a set of unstructured research papers
- A freelancer tailoring a portfolio document for different client pitches without rewriting from scratch
- A student condensing a semester of lecture notes into a structured study guide
Each of these is a real, high-value workflow. Each could be built once and used many times. But the person who has the hiring manager problem is not the same person who can build the AI system to solve it. And the developer who can build that system may not know that the hiring manager problem exists, or what its nuances are.
This is a matching problem. The demand exists. The supply of people who can build exists. What's missing is the connective tissue between them.
The Two-Sided Vision
The Workflow Workshop is what we're building toward. On one side: people who have workflow needs — specific, recurring tasks that could be automated if someone built the right AI system. On the other side: developers who can build on a proven, stable workflow architecture and deploy their solutions to the users who need them.
```
Demand Side (Users)                Supply Side (Builders)
────────────────────               ──────────────────────
"I need a workflow                 "I can build a workflow
 that does X with                   for X using the Goal Lane
 files of type Y"                   architecture and MCP tools"
        │                                    │
        └──────────── Workflow ──────────────┘
                      Workshop
                      Platform
                          │
                 Stable architecture
                 Proven tooling
                 Revenue sharing
```
The architecture is what makes this possible. If every builder had to design their own agent loop, their own tool invocation system, their own progress streaming, their own human-in-the-loop mechanism — the barrier to entry would be prohibitive. Building all of that correctly takes months. But if the architecture is already built and proven, a skilled developer could take a well-specified workflow requirement and build a high-quality solution in days, not months.
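To make that division of labor concrete, here is a minimal sketch in Python. Everything in it is illustrative: the names (`Phase`, `Workflow`, `run_workflow`) are assumptions for this post, not Convilyn's actual API. The point is the shape — the loop, retry, and validation plumbing belong to the platform, and the builder writes only the domain function and its quality check.

```python
from dataclasses import dataclass, field
from typing import Callable

# Hypothetical shapes, not the real platform API.

@dataclass
class Phase:
    name: str
    run: Callable[[dict], dict]        # builder-supplied domain logic
    validate: Callable[[dict], bool]   # builder-supplied quality gate

@dataclass
class Workflow:
    name: str
    phases: list[Phase] = field(default_factory=list)

def run_workflow(wf: Workflow, state: dict) -> dict:
    """Stand-in for the platform's agent loop: run phases in order,
    re-running a phase once if its output fails validation."""
    for phase in wf.phases:
        state = phase.run(state)
        if not phase.validate(state):
            state = phase.run(state)  # retry is the platform's job, not the builder's
    return state

# What a builder writes: domain logic only, no loop or plumbing.
def rank_resumes(state: dict) -> dict:
    ranked = sorted(state["resumes"], key=lambda r: -r["score"])
    return {**state, "shortlist": ranked[: state["top_n"]]}

shortlist_wf = Workflow(
    name="resume-shortlist",
    phases=[Phase("rank", rank_resumes, lambda s: len(s["shortlist"]) > 0)],
)

result = run_workflow(shortlist_wf, {
    "resumes": [{"name": "A", "score": 71}, {"name": "B", "score": 88}],
    "top_n": 1,
})
print(result["shortlist"])  # highest-scoring resume first
```

Under this split, the builder's entire contribution is `rank_resumes` and a one-line validation predicate — which is why "days, not months" becomes plausible.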
Convilyn — the platform we're building now — is simultaneously a product (you can use it today) and a proof of concept for that architecture. Every workflow we add validates that the architecture can support it. Every architectural improvement becomes available to every future builder on the platform.
Why the Community Layer Matters
Technical platforms succeed or fail on the quality of their ecosystems, not just on their technical merit. A workflow architecture that developers can build on confidently requires more than good documentation — it requires a community where builders share what works, identify what's missing, and collectively improve the platform.
The community we're cultivating has two distinct functions. First, it surfaces real demand. The most valuable input a developer can receive is: "I have this specific problem, and I'd pay for a solution." That signal is far more useful than guessing at what workflows might be valuable. Community members with problems are not just users — they're product managers for workflows that don't exist yet.
Second, the community accelerates quality. When one builder solves a hard problem — a tricky document parsing edge case, a workflow phase structure that handles user indecision gracefully, a quality validation criterion that distinguishes good output from bad — that knowledge should propagate. A community of builders working on a shared architecture turns individual solutions into collective intelligence.
Architecture as the Foundation, Not the Product
One of the most important decisions in platform design is choosing what to commoditize. We're deliberately commoditizing the infrastructure layer — the agent loop, the tool invocation system, the spec-driven workflow definition, the human-in-the-loop mechanism, the output validation. These should be available to every builder on the platform, free of charge, as table stakes.
What builders differentiate on is domain knowledge: the understanding of what a good resume looks like, what a compliant contract looks like, what a useful research summary contains. That knowledge can't be commoditized — it comes from domain expertise. The infrastructure can be. And when it is, domain experts can build high-quality workflows without needing to also be distributed systems engineers.
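One way to picture this split is output validation. A sketch, under assumed names: the harness below (`validate_output`) represents the commoditized platform layer, while the criteria functions encode the domain knowledge a builder differentiates on — here, hypothetical checks on what a good resume contains.

```python
from typing import Callable, Optional

# Platform side (commoditized): a generic validation harness.
def validate_output(output: str,
                    criteria: list[Callable[[str], Optional[str]]]) -> list[str]:
    """Run every criterion; return failure messages (empty list = pass)."""
    failures = []
    for criterion in criteria:
        msg = criterion(output)
        if msg is not None:
            failures.append(msg)
    return failures

# Builder side (differentiated): domain expertise encoded as criteria.
def resume_has_contact_info(text: str) -> Optional[str]:
    return None if "@" in text else "resume is missing an email address"

def resume_is_concise(text: str) -> Optional[str]:
    return None if len(text.split()) < 600 else "resume exceeds ~600 words"

problems = validate_output(
    "Jane Doe, jane@example.com, 5 yrs Python",
    [resume_has_contact_info, resume_is_concise],
)
print(problems)  # empty list: output passes both domain criteria
```

The harness never changes per vertical; only the list of criteria does. That is the commoditized/differentiated boundary in miniature.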
This division — commoditized infrastructure, differentiated domain knowledge — is what lets a workshop model work. The platform provides the stable base. The builders bring the vertical expertise. The users get workflows that combine both.
Convilyn as the First Plank
We describe Convilyn as the first plank of a bridge we're building. That metaphor is deliberate. A single plank isn't a bridge — but it's real, it's load-bearing, and it has to be built before anything else can follow.
The workflows we're building now serve two purposes simultaneously. They serve real users with real needs — the resume builder, the interview prep assistant, the compliance checker. And they stress-test the architecture against real-world conditions: edge cases in file parsing, unexpected user responses, workflows that run longer than expected, tool failures that have to be recovered from gracefully.
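"Recovered from gracefully" can be made concrete with a small sketch. The helper below is hypothetical, not Convilyn's actual recovery code; it shows the general pattern of retrying a failing tool call with exponential backoff before surfacing the error.

```python
import time

def call_tool_with_recovery(tool, args, retries=2, delay=0.0):
    """Retry a tool call with exponential backoff, assuming tools
    signal failure by raising. Illustrative sketch only."""
    last_err = None
    for attempt in range(retries + 1):
        try:
            return tool(**args)
        except Exception as e:
            last_err = e
            time.sleep(delay * (2 ** attempt))  # back off before retrying
    raise RuntimeError(f"tool failed after {retries + 1} attempts") from last_err

# A flaky parser that fails once, then succeeds -- the kind of
# transient failure real workflows hit in file parsing.
attempts = {"n": 0}

def flaky_parser(path):
    attempts["n"] += 1
    if attempts["n"] < 2:
        raise IOError("transient parse failure")
    return f"parsed:{path}"

result = call_tool_with_recovery(flaky_parser, {"path": "report.pdf"})
print(result)  # succeeds on the second attempt
```

Baking this kind of recovery into the shared loop is exactly the sort of edge-case work that, once done, no future builder has to repeat.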
Every edge case we encounter and solve in our own workflows is an edge case that future builders on the platform won't have to solve themselves. Every architectural improvement we make is an improvement to the foundation everyone builds on. The product and the platform evolve together, each making the other stronger.
