Local Development & Deployment
Where the live app lives, how it deploys, and how to set up a local environment when you need to change code.
Production environment
- Live URL: https://clients.flooringpros-marketing.com/
- Hosting: Hostinger (Node.js plan).
- Repo: GitHub, under Aniello Infantini's account. Push access is limited to Aniello today; collaborators are added by invitation.
- Branch: `main` is the deploy branch. Anything merged here becomes live within minutes via Hostinger's GitHub integration.
- Build step: Hostinger runs `npm install && npm run build` automatically on each push.
- Environment variables: stored in the Hostinger control panel, not in the repo. Updates require a redeploy to take effect.
- Database: Supabase (production project). Schema changes go through the Supabase SQL editor — see Schema Reference.
Deploys go straight from `main` to production. There is no staging server. Test changes locally before merging — once a commit hits `main` it is in front of paying clients.
When you actually need a local environment
For most data-analyst work — auditing metrics, mapping new variables, writing Supabase queries — you do not need to clone the repo. Use the live dashboard plus the Supabase SQL editor.
You will need a local environment when:
- You are changing the React UI (a new card, a renamed metric).
- You are adding or modifying an API route.
- You are debugging an integration that returns the wrong data and you need to inspect the raw API response.
Prerequisites
- Node.js 20.x or newer.
- npm 10+ (yarn and pnpm are not configured — package-lock.json is the source of truth).
- GitHub access to the repo. Ask Aniello to add you as a collaborator.
- Supabase access — either the production project (if Aniello shares the keys) or your own dev project.
- A Google account that is added as a test user on our Google Cloud OAuth client, OR access to the shared agency Google account.
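Before cloning, it is worth confirming the Node version meets the bar above. A minimal sketch of the check — shown here against a sample version string so it runs anywhere; in practice substitute `"$(node -v)"`:

```shell
# Extract the major version from a `node -v`-style string and compare.
ver="v20.11.1"              # sample value; use ver="$(node -v)" for real
major="${ver#v}"            # strip the leading "v"
major="${major%%.*}"        # keep only the major component
[ "$major" -ge 20 ] && echo "node ok ($ver)"
```

The same pattern works for `npm -v` against the 10+ requirement.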
Install
```shell
git clone git@github.com:aniello-infantini/<repo-name>.git
cd <repo-name>/coreboard
npm install
```
Environment variables
Copy .env.example to .env.local and fill in the values. A minimum working set is:
```
# Supabase — REQUIRED
NEXT_PUBLIC_SUPABASE_URL=https://xxxx.supabase.co
NEXT_PUBLIC_SUPABASE_ANON_KEY=eyJhbGc...
SUPABASE_SERVICE_ROLE_KEY=eyJhbGc...

# Google OAuth — needed for GA4/GSC/GBP/YouTube/Google Ads
GOOGLE_CLIENT_ID=xxx.apps.googleusercontent.com
GOOGLE_CLIENT_SECRET=GOCSPX-xxx
GOOGLE_REDIRECT_URI=http://localhost:3000/api/auth/google/callback
GOOGLE_API_KEY=AIza...

# Google Ads — REQUIRED to fetch ads metrics
GOOGLE_ADS_DEVELOPER_TOKEN=xxx
GOOGLE_ADS_LOGIN_CUSTOMER_ID=000-000-0000

# GHL (GoHighLevel) — agency API key
GHL_API_KEY=xxx
GHL_AGENCY_ID=xxx

# Meta Ads — for Meta Ads metrics
META_APP_ID=xxx
META_APP_SECRET=xxx

# Optional but recommended
OPENAI_API_KEY=sk-...   # for future AI features
N8N_WEBHOOK_SECRET=xxx  # for the metrics push endpoint
```
`.env.local` and `.env` are gitignored. Service role keys grant full database access — do not paste them into Slack screenshots, GitHub issues, or this documentation. The production env vars live in the Hostinger panel and should never be copied into a local file unless you are doing a controlled debug.
Running the app
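A missing key usually surfaces as a confusing 500 rather than a clear error, so it can be worth sanity-checking the file before starting the dev server. The helper below is not part of the repo — just a sketch, demonstrated against a throwaway sample file with one key deliberately absent:

```shell
# check_env FILE — list required keys that are absent from FILE (hypothetical helper).
check_env() {
  for key in NEXT_PUBLIC_SUPABASE_URL NEXT_PUBLIC_SUPABASE_ANON_KEY SUPABASE_SERVICE_ROLE_KEY; do
    grep -q "^${key}=" "$1" || echo "missing: $key"
  done
}

# Demo against a sample file; in practice run: check_env .env.local
printf 'NEXT_PUBLIC_SUPABASE_URL=x\nNEXT_PUBLIC_SUPABASE_ANON_KEY=y\n' > /tmp/env.sample
check_env /tmp/env.sample
# → missing: SUPABASE_SERVICE_ROLE_KEY
```

Extend the key list with whichever integration variables your change touches.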
```shell
npm run dev        # Start dev server at localhost:3000
npm run build      # Verify production build (run before pushing)
npm run lint       # ESLint
npx tsc --noEmit   # Type-check without emitting JS
```
First login
- Open `http://localhost:3000`. The middleware redirects you to `/login`.
- Sign in with your assigned admin email. The first time, Aniello or Dan must invite you via the Supabase Auth dashboard.
- Once logged in, you land on `/dashboard`.
- Click any client (Denver Carpet & Flooring is recommended for testing).
- Verify the data loads: GA4 metrics, GSC keywords, Google Ads spend, GHL pipeline KPIs.
Deployment workflow
- Pull latest `main`.
- Create a feature branch.
- Make changes locally. Run `npm run build` and `npm run lint`.
- Verify the change in your local browser (`npm run dev`) against Denver.
- Push the branch and open a PR against `main`.
- Aniello reviews and merges. Hostinger picks it up and rebuilds within a few minutes.
- Once it's live on clients.flooringpros-marketing.com, click through Denver in production to make sure it survived the build.
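The branch steps above, as commands. The branch name is invented, and the throwaway `git init` exists only to make the sketch self-contained — in the real clone you would start from the `pull`:

```shell
# Stand-in for the cloned repo (skip these three lines in the real one).
tmp=$(mktemp -d) && cd "$tmp"
git init -q -b main .
git -c user.email=dev@example.com -c user.name=dev \
    commit -q --allow-empty -m "init"

# The actual workflow:
git checkout -q main                          # in the real repo: git pull origin main
git checkout -q -b feat/rename-metric-card    # hypothetical branch name
# ...edit files, then: npm run build && npm run lint
git branch --show-current
# → feat/rename-metric-card
```

After `git push -u origin <branch>`, open the PR against `main` on GitHub.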
Connecting Google for the first time
The Google integration is agency-level. One OAuth flow grants access to every Google property (GA4, GSC, GBP, YouTube, Google Ads) the authenticating account can see, and that token is stored once in integrations.provider = 'google_master_agency'.
- Go to `/settings` or open the global Integrations panel.
- Click Connect Google. You will be redirected to Google's OAuth consent screen.
- Approve all scopes (analytics.readonly, webmasters.readonly, business.manage, youtube.readonly, adwords).
- You will be redirected back to `/api/auth/google/callback`, which writes the token to Supabase.
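For orientation, the consent URL behind the Connect Google button looks roughly like this. The endpoint and full scope URLs are Google's standard OAuth 2.0 ones; the exact parameters the app sends are an assumption, and parameter encoding is simplified here:

```shell
CLIENT_ID="xxx.apps.googleusercontent.com"   # placeholder
REDIRECT="http://localhost:3000/api/auth/google/callback"
# Full scope URLs behind the short names listed above, %20-separated:
SCOPE="https://www.googleapis.com/auth/analytics.readonly%20https://www.googleapis.com/auth/webmasters.readonly%20https://www.googleapis.com/auth/business.manage%20https://www.googleapis.com/auth/youtube.readonly%20https://www.googleapis.com/auth/adwords"
# access_type=offline asks for a refresh token, which matters because the
# agency token is stored once and reused.
URL="https://accounts.google.com/o/oauth2/v2/auth?client_id=${CLIENT_ID}&redirect_uri=${REDIRECT}&response_type=code&scope=${SCOPE}&access_type=offline&prompt=consent"
echo "$URL"
```

If the callback fails locally, check that `GOOGLE_REDIRECT_URI` matches the redirect URI registered on the OAuth client exactly.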
Testing API endpoints directly
The dashboard fetches from /api/metrics/* endpoints. You can hit these directly while logged in:
```shell
# GA4 metrics for a client (property_id required)
curl 'https://clients.flooringpros-marketing.com/api/metrics/ga4?property_id=000000000&start_date=2026-04-01&end_date=2026-05-01' \
  -H 'Cookie: <copy your session cookie from devtools>'

# GSC top keywords
curl 'https://clients.flooringpros-marketing.com/api/metrics/gsc/keywords?site_url=https://example.com&start_date=2026-04-01&end_date=2026-05-01'

# GHL pipeline (location_id + client_id)
curl 'https://clients.flooringpros-marketing.com/api/metrics/ghl?location_id=xxx&client_id=xxx&start_date=2026-04-01&end_date=2026-05-01'
```
For convenience there are also unauthenticated test endpoints under `/api/test/*` that skip the middleware. They are hard-coded to the Denver client and useful when you need to debug an API integration without going through the UI.
Database changes
Supabase is the source of truth. Schema lives in supabase/schema.sql with patch files for every additive migration:
- `fix-profile-trigger.sql` — fix on the auth.users → profiles trigger.
- `update-clients-schema.sql` — adds new columns to `clients`.
- `update-integrations-schema.sql` — same for `integrations`.
- `add-youtube-field.sql`, `add-gbp-place-id.sql` — newer per-feature migrations.
When you add a column, write a new add-XXX.sql file. Run it manually against the Supabase project via the SQL editor. We do not have automated migrations.
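A new patch file follows the same shape as the existing ones. As an illustration only — the column and file name below are hypothetical — an `add-tiktok-account-id.sql` adding a column to `clients` might look like:

```sql
-- add-tiktok-account-id.sql (illustrative; column name is invented)
alter table clients
  add column if not exists tiktok_account_id text;

comment on column clients.tiktok_account_id is
  'TikTok account linked to this client, if any';
```

`if not exists` keeps the patch safe to re-run, which matters here because these files are applied by hand in the SQL editor.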