How PadelPickle works

Behind the live UK padel and pickleball map sits a repeatable data pipeline. It discovers venues, checks their facilities, and updates the site so players can trust what they see.

Overhead view of multiple padel courts. The data pipeline keeps pace with new courts, so fresh venues appear on the map as soon as construction wraps.

Step-by-step pipeline

1. Map the search grid

We build a grid of search points covering the UK, spaced so every postcode area is swept. Each point triggers Google Places Nearby Search calls for “padel” and “pickleball”, capturing clubs, leisure centres, and new pop-up courts.
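A minimal sketch of the grid builder, assuming a simple bounding box and fixed spacing (the real bounds, spacing, and radius are not published here, so the values below are illustrative):

```python
import math

# Illustrative bounding box for Great Britain; not the production values.
UK_BOUNDS = {"south": 49.9, "north": 58.7, "west": -8.2, "east": 1.8}
SPACING_KM = 10  # assumed spacing, chosen so adjacent search radii overlap

def build_grid(bounds, spacing_km):
    """Yield (lat, lng) points covering the bounding box."""
    lat_step = spacing_km / 111.0  # ~111 km per degree of latitude
    points = []
    lat = bounds["south"]
    while lat <= bounds["north"]:
        # Longitude degrees shrink with latitude, so widen the step to match
        lng_step = spacing_km / (111.0 * math.cos(math.radians(lat)))
        lng = bounds["west"]
        while lng <= bounds["east"]:
            points.append((round(lat, 4), round(lng, 4)))
            lng += lng_step
        lat += lat_step
    return points

grid = build_grid(UK_BOUNDS, SPACING_KM)
# Each point then becomes one Nearby Search request per keyword, e.g.:
# GET https://maps.googleapis.com/maps/api/place/nearbysearch/json
#     ?location={lat},{lng}&radius=10000&keyword=padel&key=API_KEY
```

Widening the longitude step with latitude keeps the point density roughly uniform on the ground rather than in degrees.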

2. Collect structured details

Place IDs feed into the Google Place Details API to pull official names, addresses, coordinates, ratings, phone numbers, and website URLs. Duplicate results are merged so one venue equals one listing.
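The merge step can be sketched by keying on `place_id`, which Google guarantees is stable per place. The merge policy below (first record wins, later records fill gaps) is an assumption for illustration:

```python
def merge_places(results):
    """Collapse duplicate Nearby Search hits so one place_id = one venue."""
    venues = {}
    for r in results:
        pid = r["place_id"]
        if pid not in venues:
            venues[pid] = dict(r)
        else:
            # Later duplicates only fill fields the first record lacked
            for key, value in r.items():
                venues[pid].setdefault(key, value)
    return list(venues.values())

# Two grid points returning the same club produce two raw hits:
raw = [
    {"place_id": "abc", "name": "Padel Club", "rating": 4.6},
    {"place_id": "abc", "name": "Padel Club", "website": "https://example.com"},
    {"place_id": "xyz", "name": "Pickleball Hub"},
]
merged = merge_places(raw)  # two venues, with abc's fields combined
```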

3. Follow the links

For venues with websites, a secondary script crawls key pages to spot indoor/outdoor counts, floodlights, coaching, membership offers, and booking links. We keep the heuristics conservative to avoid guessing.
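A sketch of that conservative style of heuristic: a flag is only set when an unambiguous phrase appears, and absence of evidence leaves the field unknown rather than setting it to false. The phrase lists here are illustrative, not the production rule set:

```python
import re

# Hypothetical patterns; the real crawler's rules are not shown here.
AMENITY_PATTERNS = {
    "floodlights": re.compile(r"\bfloodlit\b|\bfloodlights?\b", re.I),
    "indoor": re.compile(r"\bindoor courts?\b", re.I),
    "coaching": re.compile(r"\bcoaching\b|\bpadel lessons?\b", re.I),
}

def extract_amenities(page_text):
    """Return {amenity: True} only for confident matches; unknowns stay absent."""
    found = {}
    for amenity, pattern in AMENITY_PATTERNS.items():
        if pattern.search(page_text):
            found[amenity] = True  # never guess False: no match != no amenity
    return found

sample = "Four indoor courts, all floodlit, with coaching for beginners."
amenities = extract_amenities(sample)
```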

4. Run quality checks

New datasets are compared with previous runs. If a venue disappears, we flag it for manual review before removing it. Added or changed fields are highlighted so we can spot-check their accuracy.
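The run-to-run comparison can be sketched as a three-way diff keyed on `place_id` (an assumed key, consistent with the de-duplication step):

```python
def diff_runs(previous, current):
    """Report disappeared, added, and changed venues between two crawls."""
    prev = {v["place_id"]: v for v in previous}
    curr = {v["place_id"]: v for v in current}
    return {
        "disappeared": sorted(prev.keys() - curr.keys()),  # flagged for review
        "added": sorted(curr.keys() - prev.keys()),
        "changed": sorted(
            pid for pid in prev.keys() & curr.keys() if prev[pid] != curr[pid]
        ),
    }

old = [{"place_id": "abc", "name": "Padel Club"},
       {"place_id": "old1", "name": "Closed Club"}]
new = [{"place_id": "abc", "name": "Padel Club & Bar"},
       {"place_id": "new1", "name": "New Courts"}]
report = diff_runs(old, new)
```

Nothing in the "disappeared" list is deleted automatically; it only queues the venue for manual review.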

5. Publish to the map

Once a batch passes checks, the JSON files under /data are updated. The map, city pages, and landing stats read directly from those files, so every deploy reflects the latest crawl.
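A minimal sketch of the publish step. The per-sport file names and /data layout below are assumptions based on the description, not the real repository structure:

```python
import json
import tempfile
from pathlib import Path

def publish(venues, data_dir):
    """Write one JSON file per sport under data_dir; return the file names."""
    data_dir = Path(data_dir)
    data_dir.mkdir(parents=True, exist_ok=True)
    for sport in ("padel", "pickleball"):
        subset = [v for v in venues if v.get("sport") == sport]
        path = data_dir / f"{sport}.json"  # hypothetical file name
        path.write_text(json.dumps(subset, indent=2, ensure_ascii=False))
    return sorted(p.name for p in data_dir.glob("*.json"))

venues = [
    {"place_id": "abc", "sport": "padel", "name": "Padel Club"},
    {"place_id": "xyz", "sport": "pickleball", "name": "Pickleball Hub"},
]
with tempfile.TemporaryDirectory() as tmp:
    written = publish(venues, tmp)
    padel_data = json.loads((Path(tmp) / "padel.json").read_text())
```

Because the map and city pages read these files directly, a deploy is just the static site plus the freshly written JSON.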

What the dataset includes

Core fields
Club name, formatted address, latitude/longitude, Google Maps link, sport, and website URL.
Amenities
Indoor or outdoor courts, floodlights, changing rooms, showers, parking, cafe/bar/shop, wheelchair access, coaching, equipment hire, membership, and drop-in sessions.
Signals
Ratings, review counts, booking links, noted surfaces, and minimum price if clubs publish it.
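Put together, a venue record might look like the following. All field names and values here are hypothetical, invented to show how the three groups combine; amenity fields use None when a fact could not be verified:

```python
venue = {
    # Core fields
    "name": "Example Padel Club",
    "address": "1 Court Lane, London",
    "lat": 51.5074,
    "lng": -0.1278,
    "sport": "padel",
    "website": "https://example.com",
    # Amenities: True/False when verified, None when unknown
    "indoor": True,
    "floodlights": True,
    "coaching": None,
    # Signals
    "rating": 4.6,
    "review_count": 128,
    "booking_url": "https://example.com/book",
    "min_price": None,  # only set when the club publishes it
}
```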

Update rhythm

Full crawls run at least monthly, with hotfix runs when clubs announce new builds or price changes. Because everything is scripted, pushing a refresh is quick: regenerate data, review the diff, and redeploy.

If you spot something outdated, raise an issue on GitHub or send a note via the contact details in the privacy policy. Verified updates can be merged into the next batch or, if urgent, patched immediately.

Technology under the hood

The collector scripts are written in Python, using the Google Maps Places API plus lightweight HTML parsing. The web app is static: Leaflet powers the map, data ships as JSON, and everything can be hosted on simple object storage or a static site platform.