- Issue: #1 – Validate CSV structures against BC master data requirements
- Status: 🔄 PR raised (#5) – reviewed and approved, awaiting merge
The last update left us with 73 CSV schema files mapped across 9 extensions and a worldbuilding knowledge base ready to drive data generation. Before we fired up the agent and started producing thousands of records, though, we needed to answer one question: are these schemas actually correct?
The validation process checked what already existed and caught up with work that had been completed since the issue was written. By the time the agent finished, we had 91 files, not 73. That was the job of Issue #1.
Time spent planning pays back many times over during execution.
Business Central’s master data import requirements are specific: field names need to match AL table field references exactly, required fields for configuration packages have to be present, and data types need to align with what BC expects at import time.
The issue was written against the 73 files in worldbuilding/data-planning/. Some had been built carefully against BC field references. Others had evolved organically as the worldbuilding developed, and there was a real chance that naming conventions had drifted, required fields had been skipped, or formats had been set up in ways that would cause silent failures at import. The only way to know was to actually check.
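The core of the check is mechanical: does each CSV's header row contain exactly the field names the AL table expects? A minimal sketch of that idea in Python follows; the field list and sample data are invented for illustration, since the real requirements live in the AL table definitions in PencilSketch.

```python
import csv
import io

# Hypothetical required-field list for illustration only; the real
# requirements come from the AL table references, not this example.
REQUIRED_CUSTOMER_FIELDS = ["No", "Name", "Address", "City", "Post_Code", "Blocked"]

def missing_required_fields(csv_text: str, required: list[str]) -> list[str]:
    """Return the required fields absent from the CSV's header row.

    Field names must match the AL field references exactly, so a
    near-miss like 'Postal_Code' vs 'Post_Code' counts as missing.
    """
    header = next(csv.reader(io.StringIO(csv_text)))
    return [field for field in required if field not in header]

sample = "No,Name,City,Postal_Code\nC0001,Example Bakery,Hearthvale,HV-100\n"
print(missing_required_fields(sample, REQUIRED_CUSTOMER_FIELDS))
# -> ['Address', 'Post_Code', 'Blocked']
```

Note that `Postal_Code` is flagged even though it is "almost" right; that exact-match strictness is what makes drifted naming conventions dangerous, because the file looks plausible to a human reviewer while still failing at import.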
Letting the Agent Do the Legwork
Rather than manually reviewing the files, I assigned Issue #1 directly to a GitHub Copilot agent. This was a first for me personally: I hadn't used GitHub's agentic issue assignment on this project before.
The task was well-suited to it. This wasn’t a code generation job. It was a structured review task with clear acceptance criteria: check field names, verify required fields, confirm data types, document findings. Exactly the kind of methodical work where an agent can move faster than a human without cutting corners.
The prompt was specific: review all CSV files in worldbuilding/data-planning/, validate against BC master data requirements, and produce a structured validation report at worldbuilding/data-planning/bc-validation-report.md.
One important constraint was included: don’t modify any of the existing CSV files already in the repository. The agent’s job was to report on what was there, not fix it. Any corrections identified would become separate work items. New files were a different matter. If something was genuinely missing from the data model, creating it was in scope.
Copilot created a branch automatically on assignment and got to work.
The Scope Problem – and How the Agent Handled It
The agent started with the 73 files as expected. Part way through, Jeremy commented directly on the PR: the CSV list was larger than the issue described. Had the agent cross-referenced the Nubimancy/PencilSketch repo of BC extensions? That repo had moved on since the issue was raised (so it's a good thing it took me three weeks to get around to this, right…).
The agent did exactly that. And it found that the entire RiniFighterProgressionAnalytics extension – covering 18 AL tables about Rini's fighter progression, injury tracking, career transitions, and security consulting work – had zero corresponding CSV schema files. Not gaps in existing files. The whole extension was simply absent from the data-planning folder.
The agent responded to Jeremy in the PR thread confirming what it had found, created all 18 missing CSV schema files (files 74-91) derived directly from the AL table definitions in PencilSketch, updated the validation report to cover the full 91-file scope, and flagged the discovery clearly.
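Deriving a CSV schema from an AL table definition is largely an extraction exercise: pull the field names out of the `field(...)` declarations and normalise them into column headers. A rough sketch of the idea, using an invented AL snippet (the agent worked from the real table objects in Nubimancy/PencilSketch):

```python
import re

# Invented AL table for illustration -- not a real PencilSketch object.
AL_SOURCE = '''
table 54010 "Fighter Injury Log"
{
    fields
    {
        field(1; "Entry No."; Integer) { }
        field(2; "Fighter No."; Code[20]) { }
        field(3; "Injury Date"; Date) { }
        field(4; "Severity"; Option) { }
    }
}
'''

# Match: field(<id>; "<name>"; <type>)
FIELD_PATTERN = re.compile(r'field\(\s*\d+\s*;\s*"([^"]+)"\s*;')

def csv_header_from_al(al_source: str) -> str:
    """Build a CSV header row from AL field names, dropping dots and
    replacing spaces with underscores (an assumed naming convention)."""
    names = FIELD_PATTERN.findall(al_source)
    return ",".join(n.replace(".", "").replace(" ", "_") for n in names)

print(csv_header_from_al(AL_SOURCE))
# -> Entry_No,Fighter_No,Injury_Date,Severity
```

This is exactly the kind of deterministic mapping that makes "create the 18 missing schema files from the AL definitions" a well-bounded agent task rather than a creative one.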
That back-and-forth between humans and agents is a good illustration of how these agentic workflows actually function. Not fully autonomous, but not purely manual either. A human pointing at something, an agent doing the legwork. The final scope: 91 files across 9 BC extensions.
What the Validation Found
The validation report covered all 91 files in a per-file table grouped by BC extension, with status, missing required fields, format concerns, and recommended action. The headline numbers:
| Status | Count |
|---|---|
| 🟢 READY | 59 |
| 🟡 MINOR issues | 26 |
| 🔴 REVIEW – must fix before import | 6 |
| ⚠️ ID CONFLICT | 2 |
The six critical blockers are the ones that matter most for the next phase:
- Chart of Accounts – missing `Direct_Posting`, `Blocked`, `Gen_Posting_Type`, and all VAT/posting group columns. `Balance_Sheet_Type` and `Income_Statement_Type` need consolidating into BC's single `Income/Balance` field
- Customers and Vendors – missing full address blocks and the `Blocked` flag. Records as structured are non-functional for document printing and tax jurisdiction. Vendor `Lead_Time_Days` is an integer; BC expects a Date Formula (`5D`, not `5`)
- Services file – conflates Item (Service type) and Resource schemas in a single file; the BC target table is ambiguous
- Posting Groups – one flat file conflating five separate BC posting group tables, missing payables and receivables account fields
- Volume Pricing Tiers – missing `Item_No`/`Item_Category_Code` and `Sales_Type`/`Sales_Code`; the file as structured cannot be consumed by BC's pricing engine
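The lead-time fix is a good example of a remediation that can be scripted once it becomes its own work item (the agent itself was instructed not to touch the files). A hedged sketch, assuming the column name `Lead_Time_Days` and plain-integer values:

```python
import csv
import io

def fix_lead_time_column(csv_text: str, column: str = "Lead_Time_Days") -> str:
    """Rewrite bare-integer lead times as BC Date Formula strings.

    BC rejects a bare integer like 5 at import; it expects a Date
    Formula such as '5D' (5 days). Values that are already formulas
    (e.g. '2W') are left untouched. Hypothetical helper, not part of
    the agent's deliverables.
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    rows = list(reader)
    for row in rows:
        value = row.get(column, "").strip()
        if value.isdigit():            # only rewrite plain integers
            row[column] = f"{value}D"
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=reader.fieldnames)
    writer.writeheader()
    writer.writerows(rows)
    return out.getvalue()

print(fix_lead_time_column("No,Name,Lead_Time_Days\nV0001,Example Vendor,5\n"))
```

Keeping fixes like this out of the validation PR – report first, remediate separately – is what kept the agent's scope clean and reviewable.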
Beyond the blockers, the report includes a full import dependency order for all 91 files, giving the correct sequencing for configuration package processing – and 15 prioritised action items across immediate, short-term, and pre-import phases.
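An import dependency order is essentially a topological sort over "this file references keys defined in that file". A minimal sketch with an invented subset of the graph (the real report sequences all 91 files):

```python
from graphlib import TopologicalSorter

# Toy dependency graph for illustration: file -> files it depends on,
# i.e. files whose records must be imported first. Filenames invented.
DEPENDS_ON = {
    "posting_groups.csv": [],
    "chart_of_accounts.csv": ["posting_groups.csv"],
    "customers.csv": ["posting_groups.csv"],
    "items.csv": ["posting_groups.csv"],
    "volume_pricing_tiers.csv": ["items.csv", "customers.csv"],
}

# static_order() yields each file only after all its dependencies,
# and raises CycleError if the graph contains a circular reference.
order = list(TopologicalSorter(DEPENDS_ON).static_order())
print(order)
```

Getting this ordering wrong is a classic configuration-package failure mode: the records are individually valid, but the import rejects them because their referenced posting groups or items don't exist yet.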
A Bug Found in PencilSketch
One finding went beyond the CSV validation scope entirely. The agent identified that table ID 54005 is declared twice in PencilSketch – once for ClearanceArenaRestriction and once for MentorshipAssignment in a different namespace. This is a BC compile error. It will prevent deployment and needs resolving in the PencilSketch repo before we get anywhere near import testing. That wasn't in the brief, but the agent found it, and we logged a new issue to track the resolution: Fix duplicate table ID 54005 in RiniFighterProgressionAnalytics extension · Issue #1 · Nubimancy/PencilSketch.
The PR Review
With the agent's work complete, the PR went to review. The verification confirmed:
- `bc-validation-report.md` created – 305 lines of structured analysis
- All 18 new CSV schema files (74-91) present and correctly derived from AL table definitions
- All 91 files assessed across 9 sections
Why This Mattered Before Anything Else
It would have been tempting to skip straight to data generation – there's more visible progress in building records than in validating schemas – but generating thousands of records against a broken schema compounds the problem downstream. Every subsequent issue in the four-task sequence would have been harder to debug.
We also now know the true scope of the data model: 91 files, not 73. That changes the planning for Issue #2 significantly.
Getting the structure right first is the unglamorous work that makes everything else reliable.
What’s Next
Once the PR is merged, the validation report becomes the authoritative reference document for the current state of all 91 CSV schemas, the import dependency order, and the prioritised remediation roadmap.
Issue #2 – Create representative sample data for 5 hero companies is next. This is a significantly more complex task – generating coherent, cross-referenced sample data for Bran, Thorin, Delyra, Rini, and Weltina, with records internally consistent across all five businesses. Because it involves creating and iterating on multiple files, we’ll tackle this one locally in VS Code with Copilot agent mode, using the validation report as direct input.
The full four-issue sequence:
| Issue | Task | Status |
|---|---|---|
| #1 | CSV Structure Validation | 🔄 PR under review |
| #2 | Sample Data Creation – 5 hero companies | ⬜ Pending |
| #3 | Cross-Reference Validation | ⬜ Pending |
| #4 | BC Import Testing via Configuration Packages | ⬜ Pending |
If you want to follow along, the Knowledge repository is public on GitHub – the full PR, validation report, and all 91 CSV schemas are open.
More updates as Issue #2 gets underway. If you've built BC demo environments before and wrestled with the master data problem, I suspect you'll find this AI/agent-backed approach interesting to follow.
