Eventbrite Sync, DeepL Translation, and Claude AI: The Integration Stack Behind City of Om

Four third-party integrations that saved hundreds of hours building Ottawa's wellness festival platform. How we connected Eventbrite, DeepL, Claude, and Google Sheets into a single Rails pipeline.

The City of Om festival platform doesn't exist in isolation. It connects to Eventbrite for ticketing, DeepL for bilingual content, Claude for content cleanup, and Google Sheets for legacy data import. Each integration solved a specific problem. Together, they eliminated hundreds of hours of manual work.

This post breaks down how each integration works, what went wrong, and what we learned.

Integration 1: Eventbrite (Attendees and Events)

The Problem

Tickets are sold on Eventbrite. The festival app needs to know who bought tickets so ticket holders can book classes. Two systems, one identity problem.

The Solution

Two sync mechanisms running in parallel:

Polling (every 5 minutes): The Eventbrite::AttendeesImporter hits the Eventbrite API, pulls recent attendee records, and creates or updates local Attendee and User records. Each attendee's email becomes their login. Each ticket maps to a ticket type (VIP, General, Add-on) which determines booking privileges.

Webhooks (real-time): An endpoint receives Eventbrite webhook payloads for check-ins, cancellations, and refunds. When someone checks in at the gate, their status updates instantly. When a ticket is refunded, their bookings are flagged for review.
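The dispatch side of that endpoint can be sketched as a small router. The action strings below are Eventbrite's documented webhook actions (delivered under `config.action` in the payload); the handler method names are assumptions, not the production code:

```ruby
# Hypothetical sketch of webhook routing -- action strings follow
# Eventbrite's webhook payload format; handler names are assumptions.
module Eventbrite
  class WebhookRouter
    HANDLERS = {
      "barcode.checked_in" => :mark_checked_in,
      "order.refunded"     => :flag_bookings_for_review,
      "attendee.updated"   => :resync_attendee
    }.freeze

    # Returns the handler to run for a payload, or nil for actions we ignore.
    def self.handler_for(payload)
      HANDLERS[payload.dig("config", "action")]
    end
  end
end
```

Unrecognized actions fall through to nil, so the endpoint can acknowledge and ignore anything it doesn't care about.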

The Tricky Parts

Deduplication. People buy multiple tickets, change their email, or have their assistant purchase on their behalf. We track unique_id from Eventbrite and match on email as a fallback. Duplicate detection runs on every import cycle.
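The matching order can be sketched as a framework-free function -- unique_id first, normalized email as the fallback. The hash keys here are assumptions standing in for the actual record attributes:

```ruby
# Sketch of the deduplication matching order described above: Eventbrite's
# unique_id wins; case-insensitive email is the fallback. Keys are assumptions.
def match_attendee(attendees, unique_id:, email:)
  attendees.find { |a| a[:unique_id] == unique_id } ||
    attendees.find { |a| a[:email]&.casecmp?(email.to_s) }
end
```

A nil return means no existing record matched, so the import creates a new one.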

Auto-generated users. When an attendee syncs, we create a User account with a default password. This means someone who bought a ticket at 3am can log in and book classes immediately without any manual provisioning. The trade-off is that first-time users need to know their email was used as their username -- we handle this with clear onboarding emails.
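The provisioning defaults can be sketched like this -- the post confirms the email becomes the login; the random-password scheme below is an illustrative assumption, not the production one:

```ruby
require "securerandom"

# Sketch of auto-provisioning defaults: normalize the email (it becomes the
# username) and generate a throwaway random password. Scheme is an assumption.
def default_credentials(email)
  { email: email.strip.downcase, password: SecureRandom.alphanumeric(16) }
end
```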

Rate limits. Eventbrite throttles API requests, so polling frequency is a trade-off. The 5-minute interval stays well within the limits while keeping data fresh enough that new ticket holders can book soon after purchase.

What We Learned

Don't try to replace the ticketing platform. Eventbrite handles payments, refunds, and ticket distribution. We handle the festival-specific booking logic that Eventbrite can't do. Clean separation of concerns.

Integration 2: DeepL Translation

The Problem

Ottawa is bilingual. Every teacher bio, class description, FAQ entry, and page section needs to exist in English and French. The festival has 80+ teachers, each with a bio and 1-3 class descriptions. That's potentially 300+ pieces of content to translate.

The Solution

The Translation::DeeplService wraps DeepL's API for async translation. When English content is created or updated, a TranslationJob queues up and translates it to French in the background.
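The request Translation::DeeplService builds can be sketched as follows -- the endpoint and parameter names follow DeepL's v2 REST API, but the wrapper structure itself is an assumption:

```ruby
# Hypothetical sketch of the DeepL v2 translate request body. DeepL accepts
# a JSON body with an array of texts plus source/target language codes.
module Translation
  module DeeplRequest
    ENDPOINT = "https://api.deepl.com/v2/translate"

    def self.body(texts)
      { text: Array(texts), source_lang: "EN", target_lang: "FR" }
    end
  end
end
```

Accepting an array lets the job batch several fields from one record into a single API call.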

The Mobility gem handles the storage layer -- each translatable field gets English and French versions stored transparently. The app reads the correct version based on the current locale.

The Pipeline

Content flows through a specific order:

  1. Raw content arrives (teacher application, admin entry, or import)
  2. AI cleanup runs if applicable (Claude suggestions generated)
  3. Admin reviews AI suggestions and approves/edits
  4. Translation fires on the approved English content
  5. Admin reviews French translation (optional -- most are good enough)

This ordering matters. You don't want to translate raw, unedited content (garbage in, garbage out). And you don't want to AI-clean the French version separately (it might change the meaning). Clean the English first, then translate the clean version.

The Tricky Parts

Translation quality for wellness terminology. DeepL is excellent for general text but sometimes stumbles on yoga-specific terms. "Vinyasa flow" doesn't need translation. "Heart-opening practice" translates oddly. We added a review step for admins to catch these, but most translations were usable without edits.

Async timing. Translations run as background jobs. If an admin creates content and immediately switches to the French version of the site, the translation might not be there yet. We added a "translation pending" indicator so admins know to wait a few seconds.

Mobility gem gotchas. The Mobility gem is powerful but has sharp edges with eager loading and query scoping. We had to be careful about N+1 queries when loading translated content in list views.

What We Learned

Translate late in the pipeline. Let humans and AI clean up the source content first. Machine translation of clean input produces dramatically better output than machine translation of messy input.

Integration 3: Claude AI (Content Cleanup)

The Problem

Teacher bios arrive in every possible format. Some are professional. Some are stream-of-consciousness. Some are written in third person ("Jane has been teaching yoga for 20 years") when the festival style guide calls for first person. Some are three sentences. Others are three paragraphs of unbroken text.

Manually editing 80+ bios to match a consistent style takes weeks.

The Solution

The AiContent::ClaudeClient sends teacher bios and class descriptions to Claude (Sonnet) with instructions to:

  • Clean up grammar and formatting
  • Add paragraph breaks where needed
  • Normalize to first person
  • Preserve the teacher's voice and personality
  • Keep it concise (150-200 words for bios)

The AI generates a "suggestion" -- it doesn't overwrite the original. Admins see the original and the AI version side-by-side and choose which to use. They can also edit the AI suggestion before accepting.
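The payload AiContent::ClaudeClient sends can be sketched like this -- the field names (model, max_tokens, system, messages) follow Anthropic's Messages API, while the prompt text and model string here are illustrative assumptions:

```ruby
# Hypothetical sketch of the Messages API payload. The system prompt
# condenses the instruction list above; the model string is an assumption.
module AiContent
  module ClaudePayload
    SYSTEM_PROMPT =
      "Clean up this teacher bio: fix grammar and formatting, add paragraph " \
      "breaks, normalize to first person, and keep it to 150-200 words. " \
      "Preserve the teacher's voice -- clean it up, don't rewrite it."

    def self.build(bio, model: "claude-sonnet-4-5")
      {
        model: model,
        max_tokens: 1024,
        system: SYSTEM_PROMPT,
        messages: [{ role: "user", content: bio }]
      }
    end
  end
end
```

Keeping the style rules in the system prompt, with the bio alone as the user message, makes the instructions apply uniformly across every submission.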

The Pipeline Integration

AI processing triggers on application creation. A background job calls Claude, stores the suggestion, and marks the application as "AI processed." The admin dashboard shows which applications have AI suggestions ready for review.

Once approved, the cleaned content feeds into the DeepL translation pipeline. The whole flow -- application to published bilingual profile -- can happen in minutes with just a few admin clicks.

The Tricky Parts

Preserving voice. The prompt engineering took iteration. Early versions of the AI cleanup made every bio sound the same -- polished but generic. We tuned the prompts to emphasize preserving the teacher's unique voice while fixing structural issues. The difference between "rewrite this" and "clean this up while keeping their personality" is huge.

Cost management. Running 80+ bios and 150+ class descriptions through Claude adds up. We batch-process during off-peak times and cache results aggressively. Re-running AI cleanup on unchanged content is wasteful, so we track whether the source text has changed before re-processing.
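The change check can be sketched as a digest comparison -- store a hash of the source text at processing time and skip the re-run when it matches. Method and column names are assumptions:

```ruby
require "digest"

# Sketch of the "has the source changed?" guard: compare the current text's
# SHA-256 digest against the one stored at last AI processing.
def needs_ai_cleanup?(source_text, last_processed_digest)
  Digest::SHA256.hexdigest(source_text) != last_processed_digest
end
```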

Non-English submissions. Some teachers submit their bio in French. The AI cleanup prompt is English-focused. We added language detection and skip AI cleanup for French submissions, routing them directly to the admin for manual review.

What We Learned

AI content cleanup is most valuable as a suggestion system, not an auto-replace system. Admins trust it more when they have a review step, and the occasional weird AI output gets caught before it goes live.

Integration 4: Google Sheets (Legacy Data)

The Problem

Before the festival app existed, teacher applications came through Google Forms and lived in Google Sheets. The organizing team had years of historical data in spreadsheets, plus their current workflow was built around Sheets.

We couldn't just say "stop using Google Sheets." We had to meet them where they were.

The Solution

A GoogleSheets::Client uses a service account to read from specified spreadsheets. The SheetsImportConfig model maps spreadsheet columns to application fields (teacher name in column A, bio in column B, etc.).

The SheetsImportJob polls every 10 minutes. New rows become new applications. Updated rows update existing applications (matched by unique_id or email).

The Tricky Parts

Column mapping is fragile. If someone inserts a new column in the middle of the spreadsheet, every mapping shifts. We store mappings by column letter, not by header name, which means structural changes break the import. In hindsight, mapping by header name would be more resilient.
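The header-name mapping we'd use in hindsight can be sketched like this -- resolve each field's position from the header row at import time, so an inserted column shifts positions but not the mapping. Field and header names are assumptions:

```ruby
# Sketch of header-name column mapping: look up each wanted header's index
# in the header row, then read that position from the data row.
def row_to_fields(header_row, data_row, wanted)
  wanted.transform_values do |header|
    idx = header_row.index(header)
    idx ? data_row[idx] : nil
  end
end
```

A missing header maps to nil instead of silently reading the wrong column, which is the failure mode letter-based mapping suffers from.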

Duplicate handling across sources. A teacher might apply via the website form AND appear in the Google Sheet (because the organizer added them manually). We deduplicate by email address, with the most recent update winning.
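The cross-source rule reduces to a small grouping pass -- group by normalized email, keep the most recently updated record. Hash keys are assumptions:

```ruby
require "time"

# Sketch of most-recent-wins deduplication across sources: one surviving
# record per normalized email address.
def dedupe_by_email(applications)
  applications
    .group_by { |a| a[:email].strip.downcase }
    .map { |_, group| group.max_by { |a| a[:updated_at] } }
end
```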

Permission scoping. The Google service account needs read access to each imported spreadsheet. The organizers manage multiple spreadsheets across different Google accounts. Setting up permissions for each one is a manual step that requires technical support.

What We Learned

Legacy integrations are about respect. The organizing team built their workflow around Google Sheets over multiple years. Replacing it entirely would be disruptive and arrogant. Syncing from it lets them keep their familiar tools while the app handles the heavy lifting downstream.

How They All Connect

The full pipeline for a teacher going from application to published festival profile:

  1. Teacher submits via Google Form -> Google Sheets captures the row
  2. Sheets Import syncs the row to a TeacherApplication
  3. Claude AI generates a cleaned bio and description
  4. Admin reviews and approves the AI suggestion
  5. DeepL translates the approved English content to French
  6. Admin converts the application to an AgendaHost (confirmed teacher)
  7. Eventbrite sync links the teacher's events to the festival calendar
  8. Attendees (synced from Eventbrite) book the teacher's classes

Four integrations, one seamless pipeline. Each service does what it's best at. The Rails app orchestrates the flow.

The Numbers

  • Eventbrite syncs: ~300 API calls/day during peak ticket sales
  • DeepL translations: 400+ content pieces translated EN->FR
  • Claude AI suggestions: 80+ bios and 150+ descriptions cleaned
  • Google Sheets imports: 3 spreadsheets, polling every 10 minutes
  • Total manual hours saved: Estimated 200+ hours over the festival planning cycle

The integrations cost about $50/month total in API usage (DeepL and Claude). The time savings are worth orders of magnitude more.


City of Om runs on a platform built by Loadout. Need to integrate multiple third-party services into a cohesive workflow? Let's talk.