It shows up on your calendar like a completely normal thing.
"Data integration kickoff," scheduled for an hour, with an overly optimistic note that says, “This will probably only take about 30 minutes.” You glance at the attendees and count five other names. The implementation PM, a data engineer, the CSM who owns the account, a solutions analyst, someone from leadership who joined because it’s a big client. The employer sent five people of their own.
Halfway through the meeting, the PM asks the engineer for an estimate. They confidently reply, “No problem, this is just a sprint.”
What's not scheduled? Thinking about what that sprint actually costs.
Here's what the next couple of months actually look like.
The Kickoff
You spend the first twenty minutes on introductions and context that everyone mostly already knows. The employer describes their HR system (Workday, ADP, some regional thing built in 2009 that's still running). You share your data spec. They share theirs and talk about the five custom fields that they must have in order to map employees correctly into their org. Everyone agrees to review them, with each side thinking that the other's spec smells like work. You schedule a follow-up and agree to meet weekly until the data is live in production.
Spec Review
Your engineer reads the spec document in full, which takes 2-3 hours because the spec is a spreadsheet with multiple tabs and a comment thread from the last vendor who used it. They find four ambiguities. They send an email. The employer's HR contact replies three days later, answering two of the four questions. The other two require a call, so the PM schedules it and includes the analyst so they know how this affects plan configurations.
Two meetings later, the client and their data provider politely make it clear that they don’t have capacity to generate a special file feed just for you, so, inevitably, you agree to use their spec. Good thing your team has gotten so good at vibe coding in the past few months.
Test File Validation
The employer sends an initial extract. It doesn't match the spec in five places, three of which are surprises. One field you thought would be alphanumeric is numeric-only. The dependent records are formatted differently than the employee records. There's a column that isn't in the spec at all and nobody on either side knows what it is. Your engineer spends a day and a half in the data, builds a field map, logs the edge cases.
You run the file through your intake process. Some records load cleanly. Others fail validation: the formatting’s fixed, but there are unexpected values in three columns. The engineer fixes the mapping. The analyst cross-checks member counts against the employer's expected enrollment. They don't match, which means either your counts are wrong or the employer's are. This takes another meeting to resolve.
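To make "fails validation" concrete, here's a minimal sketch of the kind of intake check involved. The field names and rules are hypothetical stand-ins for whatever the spec actually defines; a real integration would load them from per-employer config.

```python
import re

# Hypothetical field rules inferred from a spec. The employee_id rule
# reflects the surprise above: a field assumed alphanumeric turned out
# to be numeric-only.
FIELD_RULES = {
    "employee_id": re.compile(r"^\d+$"),
    "plan_code": re.compile(r"^[A-Z0-9]{2,6}$"),
    "status": re.compile(r"^(ACTIVE|TERM|LEAVE)$"),
}

def validate_record(record: dict) -> list[str]:
    """Return a list of validation errors for one row of the extract."""
    errors = []
    for field, pattern in FIELD_RULES.items():
        value = record.get(field)
        if value is None:
            errors.append(f"missing field: {field}")
        elif not pattern.match(value):
            errors.append(f"unexpected value in {field}: {value!r}")
    # Columns the spec never mentioned get flagged, not silently dropped.
    for extra in sorted(set(record) - set(FIELD_RULES)):
        errors.append(f"unknown column: {extra}")
    return errors
```

The check itself is trivial. The expensive part is everything around it: deciding what the rules should be, and getting two organizations to agree on them.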
The Review Meeting
Fifteen minutes turns into 45 when it turns out the enrollment mismatch is because the employer's system uses different effective date logic than your spec assumed. The PM, the CSM, the engineer, and two people from the employer's HR team work through it together. Someone's director joins for the last ten minutes to hear the resolution.
Documentation and Go-Live
Your PM writes up what they agreed. Your engineer documents the custom mapping so the next person who touches this feed knows what they're looking at, and the solutions analyst updates their plan logic for the fifteenth time. You monitor the first few production loads and catch one more edge case: employees on leave aren't flagged the same way the spec said they'd be. But hey, the format stayed consistent, so that's a win!
The Bill
Add up the hours on your side of the table. An engineer running a clean integration from kickoff to go-live puts in somewhere between 20 and 30 hours. The PM is probably at 10-15. The analyst? 8-10 on meetings alone, never mind the spec and config updates. The CSM is on half a dozen calls and threads. Someone from leadership shows up twice. Fully loaded, you're looking at $4,000-$6,000 for an integration that goes well.
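If you want to sanity-check that range yourself, the arithmetic is simple. The loaded hourly rates below are illustrative assumptions, not benchmarks; plug in your own.

```python
# Back-of-the-envelope integration cost using the hour ranges above.
# Rates are assumed fully-loaded hourly costs, purely for illustration.
ROLES = {
    # role: (low_hours, high_hours, loaded_rate_usd)
    "engineer":   (20, 30, 90),
    "pm":         (10, 15, 80),
    "analyst":    (8, 10, 75),
    "csm":        (5, 8, 70),    # half a dozen calls and threads
    "leadership": (2, 3, 150),   # shows up twice
}

low = sum(lo * rate for lo, _, rate in ROLES.values())
high = sum(hi * rate for _, hi, rate in ROLES.values())
print(f"clean integration: ${low:,} - ${high:,}")
```

With these assumed rates the total lands right around the $4,000-$6,000 range, and that's the clean path, before any of the sideways scenarios below.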
When something goes sideways, that number can quickly double from the meeting overhead alone. Add a second employer whose HR system outputs data in a format nobody anticipated, and you're not running two integrations, you're running two custom projects.
Here’s the problem: the CTO’s carrying a much smaller number in their head, because they’re just thinking about engineering time.
When I hear someone say, "we can do this in a sprint," I know that they're technically right, but also massively underestimating the cost. Yes, a focused engineer can ingest a clean feed in a sprint. But the sprint estimate misses the rest of the team – the PM, CSM, analyst, and leadership appearances – as well as the meeting overhead from weekly check-ins and troubleshooting sessions. While we all know that engineers keep coding during those meetings, I'm not sure you can count it as productive time. And, of course, no one budgets for the moment three months later, when open enrollment rolls around and all of the plan codes change and fail validation.
This isn’t a failure, but it is a mistake. The process is rational, but the costs are hidden. Every integration is a custom project because every data set is unique: different systems, field conventions, logic, valid values. And every project team includes more than the engineers, because customer onboarding is as much a customer service activity as a technical one. When you forget that, the true costs pile up, even if they don’t show up directly in your budget.
The real pain comes from opportunity cost: the PM who can't take on other work, the engineer who's in the weeds on data mapping instead of advancing your product roadmap, the analyst who’s limited to four clients instead of six because they keep getting pulled into data meetings.
It might be “just a sprint,” but if so, it’s an expensive one.