Selected technical work
Platforms, products, and live systems
Five representative production programs: a multi-year K-12 platform, a commerce-grade booking product, an editorial and social automation engine, a telephony-backed voice bundle, and a managed conversational AI service. Together they show how VeraLux works across product strategy, full-stack implementation, packaging, and ongoing operations.
Serious systems, explicit roles, and delivery meant to stay running after launch.
Production intent
These are not mockups. Every entry includes shipping discipline: APIs, background work, configuration, and operator-facing surfaces where the domain requires them.
Stack breadth
Python and FastAPI, TypeScript and React, Node services, PostgreSQL, Redis, Docker-style packaging, and carrier-facing voice paths appear where the problem demands them.
Role clarity
Student, parent, staff, customer, barber, owner, tenant admin: boundaries between audiences are modeled explicitly, not implied by a single shared login.
Long-term fit
Architecture leaves room for policy change, new integrations, and operational handoff. The goal is maintainable systems, not one-off demos.
How to read each project
Projects are ordered on purpose: institutional platform depth first, then voice infrastructure, then commerce and media breadth, then managed AI operations. The summary lists what shipped and the primary technologies. At the top of each card, a compact stylized UI excerpt echoes colors and layout patterns from the real product (not a live embed). The expandable section is written for buyers—risk, time to value, and what class of vendor you are hiring when one team must span product, infrastructure, and operations.
Long-form case studies and deep-dives are linked from each project card on veralux.ai where noted.
Stylized UI excerpt · MODE web app (Next.js)
MØDE · Liberty Launch Academy
Product name MØDE is pronounced “mode” (the letter Ø is part of the brand spelling).
Learning, growth, and governed AI features for an operating school.
VeraLux designed and built MØDE as Liberty Launch Academy’s primary digital layer for instruction-adjacent workflows: multiple secure portals, structured data for learners, and AI-assisted experiences that stay tied to reviewable sources and staff oversight. The program reflects multi-year product ownership, not a one-off integration.
What shipped
Representative stack: FastAPI, Next.js, PostgreSQL with pgvector, Celery (or equivalent workers), Redis where appropriate, and vLLM-class model serving for local or controlled inference paths.
- Distinct experiences for students, parents, mentors, and staff backed by a shared domain model and permissioning.
- Retrieval and verification paths so AI-generated or AI-assisted outputs remain accountable to approved content.
- Portfolio-style growth artifacts, exports, and long-running tasks handled by background workers outside the request cycle.
- Vector search in PostgreSQL (pgvector), structured APIs, and inference aligned to institutional scale and cost.
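The verification bullet above can be sketched as a simple gate: an AI-assisted draft only ships if every source it cites resolves to staff-approved content. This is an illustrative Python sketch; the registry, document IDs, and function names are hypothetical, not MØDE's actual API.

```python
from dataclasses import dataclass

# Hypothetical approved-content registry: document IDs staff have reviewed.
APPROVED_SOURCES = {"doc-algebra-u3", "doc-writing-rubric", "doc-civics-syllabus"}

@dataclass
class DraftAnswer:
    text: str
    cited_sources: list[str]  # IDs the retrieval step attached to the draft

def verify_answer(draft: DraftAnswer) -> tuple[bool, list[str]]:
    """Accept a draft only if every citation resolves to approved content.

    Returns (ok, unapproved) so staff-facing UIs can show *why* a draft
    was held for review instead of silently dropping it.
    """
    unapproved = [s for s in draft.cited_sources if s not in APPROVED_SOURCES]
    has_citations = bool(draft.cited_sources)
    return (has_citations and not unapproved), unapproved

ok, _ = verify_answer(DraftAnswer("…", ["doc-algebra-u3"]))   # accepted
ok2, bad = verify_answer(DraftAnswer("…", ["doc-unknown"]))   # held for review
```

The useful property is that an answer with no citations at all is also held, which is what "accountable to approved content" implies in practice.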
Schools and training organizations rarely need “a chatbot.” They need role-aware software, audit-friendly workflows, and AI that respects boundaries. This project shows VeraLux operating at that level: schema design, safety-conscious features, and delivery that assumes real users and real semesters.
For a prospective client, it answers whether we can own ambiguous requirements, coordinate stakeholders, and ship something that still runs after launch week. Detailed architecture, screenshots, and narrative live in the linked case study once you publish it on your domain.
Stylized UI excerpt · neural ops console (static admin)
VeraLux Receptionist bundle
Phone-native conversational AI with installable services, control plane, and tenant configuration.
The Receptionist bundle packages voice runtime, audio processing options, and operator-facing HTML consoles so an organization can route real calls through an AI receptionist instead of only embedding chat on a website. It is built for Docker-oriented deployment, optional GPU-backed components where you choose to run them, and Telnyx-class carrier integration patterns.
What shipped
Representative stack: voice runtime (Node/TypeScript-family services), static and dynamic control-plane surfaces, Redis as a coordination surface, Python-side audio stacks where enabled, and carrier APIs as configured.
- Separated concerns: voice media path, tenant configuration, and control-plane UI rather than one opaque binary.
- Documentation for operations: ports, environment variables, offline bundles, and realistic failure handling.
- Audio pipeline flexibility (including optional GPU services) documented so teams know what they are signing up to run.
- Foundation for per-tenant branding and policy without collapsing everything into a single shared demo tenant.
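The ops-documentation point above implies a fail-fast configuration pattern: validate every documented port and environment variable at startup and report all gaps at once, instead of dying mid-call with an opaque stack trace. A minimal Python sketch with hypothetical variable names (the real bundle documents its own ports and variables):

```python
import os

# Hypothetical variable names for illustration only.
REQUIRED_VARS = {
    "VOICE_MEDIA_PORT": "port for the voice media path",
    "CARRIER_API_KEY": "credential for the carrier (e.g. Telnyx-class) API",
    "REDIS_URL": "coordination surface shared by runtime and control plane",
}

def load_config(env: dict[str, str]) -> dict[str, str]:
    """Collect every missing variable before failing, so operators fix
    the whole list in one pass rather than one restart per variable."""
    missing = [f"{k}  ({why})" for k, why in REQUIRED_VARS.items() if not env.get(k)]
    if missing:
        raise RuntimeError("missing required configuration:\n  " + "\n  ".join(missing))
    return {k: env[k] for k in REQUIRED_VARS}

# At startup: config = load_config(dict(os.environ))
```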
Voice exposes every gap that web demos hide: latency, barge-in, codec behavior, and carrier edge cases. This bundle shows that VeraLux will engineer through those layers, not outsource them invisibly.
It is the right reference when your buyer asks for phone lines, SLAs, or on-prem/customer-cloud deployment, and you need proof that the architecture is meant to be operated, not only demonstrated.
Stylized UI excerpt · public booking flow (React)
Loom (VeraLux Booking)
Scheduling, payments, staff tools, and owner oversight in one product codebase.
Loom is a full-stack barbershop operations product: public discovery and booking, authenticated client and barber experiences, and owner or admin workflows for services, staff, and money movement. It exists to prove that a vertical SaaS slice can stay coherent as features accrue, instead of collapsing into unmaintained scripts.
What shipped
Representative stack: React (Vite), Node API layer, PostgreSQL with Prisma, Stripe integration, and structured admin UI patterns for owners and internal operators.
- Customer-facing booking and account flows separated from staff and owner tools by authentication and routing.
- Commerce-related paths (for example Stripe-backed billing or customer records) wired with explicit server-side checks.
- Operational CRUD for services, schedules, and profiles without exposing internal data on public routes.
- Developer workflow suited to ongoing delivery: typed APIs, migrations, and tests appropriate to a product team.
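The "explicit server-side checks" bullet is the classic commerce guard: recompute the charge amount from server-side data and reject anything the client submits that disagrees. A Python stand-in for the pattern (Loom's real API layer is Node); service names and prices are hypothetical:

```python
# Server-side source of truth, standing in for a Prisma-backed services table.
SERVICES = {
    "fade": 3500,        # prices in cents
    "beard-trim": 1500,
}

def build_charge(service_id: str, client_submitted_total: int) -> dict:
    """Never trust a price that arrived over the wire: look the service up
    server-side and refuse a mismatch before any payment API is called."""
    if service_id not in SERVICES:
        raise ValueError(f"unknown service: {service_id}")
    server_total = SERVICES[service_id]
    if client_submitted_total != server_total:
        raise ValueError("client total does not match server-side price")
    return {"amount": server_total, "currency": "usd", "service": service_id}
```

The same shape applies to any money-moving route: the client picks *what*, the server decides *how much*.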
Many agencies can ship a landing page. Fewer can ship a domain with money, calendars, and multiple logged-in roles without security debt. Loom is the reference when your RFP sounds like "appointments plus payments plus staff dashboards," and you need evidence that the team has done it before.
A separate static client showcase exists in the Loom repository for marketing-style walkthroughs; the agency case study file is the right long-form companion for a hire-me audience. Point links at your hosted copies when ready.
Stylized UI excerpt · social workspace (React)
Vivaldi
Editor automation and a social publishing layer that share VeraLux control-plane patterns.
Vivaldi connects post-production reality (for example DaVinci Resolve-oriented workflows) to outbound social channels through explicit, namespaced APIs, workers, and React-based operator UIs. "Namespaced" here means each integration surface is isolated by contract: fewer accidental cross-calls and clearer failure modes when a vendor API changes.
What shipped
Representative stack: TypeScript, React, worker orchestration, REST-style interchange routes, and disciplined configuration and logging suitable for always-on editorial teams.
- Background workers that respect project state and editor handoff points instead of blind cron jobs.
- Structured HTTP boundaries between creative tooling, internal services, and external networks.
- Operator-focused screens that match the rest of VeraLux’s control-plane visual language for faster onboarding.
- Room to extend: new destinations, new approval rules, or new asset checks without rewriting the core.
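The worker bullets above combine two disciplines: exponential backoff for transient vendor-API failures, and a state gate so publishing never fires while an editor still holds the project. A hedged Python sketch (Vivaldi's workers are TypeScript; state names and limits here are hypothetical):

```python
def backoff_delay(attempt: int, base: float = 1.0, cap: float = 60.0) -> float:
    """Exponential backoff, capped so retries never sleep unboundedly."""
    return min(cap, base * (2 ** attempt))

def next_action(project_state: str, attempt: int, max_attempts: int = 5):
    """Decide what a publish worker does next, respecting editor handoff."""
    if project_state != "editor-released":
        # A blind cron job would fire anyway; this worker waits for handoff.
        return ("wait-for-editor", None)
    if attempt >= max_attempts:
        # Surface to operators via a dead-letter path instead of looping.
        return ("dead-letter", None)
    return ("attempt-publish", backoff_delay(attempt))
```

The point of returning a decision rather than sleeping inline is observability: the scheduler can log every ("action", delay) pair, which is what "UX that operators can trust" depends on.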
Use Vivaldi when the brief sounds like “connect our creative toolchain to outbound channels” or “stop duct-taping Zapier for every release.” It demonstrates integration architecture under real editorial pressure: retries, observability, and UX that operators can trust.
The repository includes a dedicated portfolio HTML page that mirrors control-plane styling; host or excerpt it alongside this summary when you want a technical reader to go deeper than marketing copy alone.
Stylized UI excerpt · operator dashboard (Node admin)
Solomon
Dedicated chat experiences, integrations, and operations for organizations that outgrow generic widgets.
Client-facing product name: Solomon. Internal repository title: Question Mark Bot.
Solomon is how VeraLux productizes managed conversational AI: per-organization isolation, admin governance, webhook-style delivery into your systems, and an explicit service relationship instead of a shared multitenant pool. The underlying application implements tenant-scoped configuration, queue-backed work, and RBAC-sensitive admin flows suitable for ongoing operations.
What shipped
Representative stack: Node.js service tier, structured admin UI, Redis-backed queues, worker processes, database-backed tenancy and roles, and provider keys handled as environment-level secrets.
- Visitor-facing chat surfaces paired with server-side policy, storage, and routing you can stand behind commercially.
- Outbound connections (for example to CRMs, ticketing, or internal APIs) framed as integration contracts, not afterthoughts.
- Operational posture: updates, monitoring, and support channels described as part of the engagement, not a GitHub readme alone.
- Pricing and scope narratives that treat AI usage, hosting, and labor as separate, legible line items for the buyer.
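Per-organization isolation starts at the naming layer: when every queue, config key, and conversation record carries an explicit tenant prefix, cross-tenant reads fail structurally rather than by convention. A minimal Python sketch of that idea (the key scheme is hypothetical, not Solomon's actual storage layout):

```python
def tenant_key(tenant_id: str, *parts: str) -> str:
    """Mint a storage/queue key that is unambiguously scoped to one org."""
    if not tenant_id or ":" in tenant_id:
        raise ValueError("tenant_id must be a non-empty, colon-free slug")
    return ":".join(["tenant", tenant_id, *parts])

def assert_same_tenant(key: str, tenant_id: str) -> None:
    """Guard run before any read: a key minted for one org must never
    resolve under another org's scope."""
    if not key.startswith(f"tenant:{tenant_id}:"):
        raise PermissionError("cross-tenant access denied")

queue = tenant_key("acme", "jobs", "outbound")  # "tenant:acme:jobs:outbound"
```

Pairing the minting function with a read-side guard is what turns isolation from a code-review habit into an enforced invariant.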
Buyers are rightfully tired of “chat demos” that break under traffic or leak data across customers. Solomon exists to show the opposite: isolation, operational ownership, and integration readiness bundled into how the product is sold and delivered.
Commercial copy and tiering live in the repository’s client-facing overview documents; link those materials from your site when you want procurement readers to see pricing posture and engagement rules in full.
