
Reporting potholes with an ESP32, LoRa, and AI

Hacker News AI Top · by flakespancakes · April 2, 2026 · 1 min read

Article URL: https://thingswemake.com/pothole-in-one/
Comments URL: https://news.ycombinator.com/item?id=47620039
Points: 2 · Comments: 0

I made a one-tap solution for reporting roadway potholes to the NYC DOT using an ESP32 microcontroller, a GPS module, some long-range radio and just a pinch of agentic AI. (Just my little way to contribute to Mamdani’s ongoing “pothole blitz.”)

Finally filling a blank spot on my dashboard, after 20+ years

I drive a 2005 Toyota Prius, which has turned out to be a mostly indestructible car. But as time goes on, its suspension has really lost most of its will to live. So each spring — when pothole season is at its worst after a rough winter of freeze/thaw cycles, salt intrusion, and snow plows on top of daily traffic — I really feel every bump.

So for several years now I’ve pondered building some kind of automated pothole sensor for the Prius, thinking I could log lat/longs and then submit pothole reports later. (An important aside: as I’ve mentioned here before, the NYC DOT responds shockingly swiftly and reliably to pothole reports! But you have to fill out a kinda long and annoying old form, and of course you have to remember where your pothole was.) I never got around to it because it seemed like it’d probably be prohibitively challenging to get an accelerometer or vibration sensor to accurately distinguish between potholes and the other various and sundry bumps of street life. (Speed bumps, high curbs, and even door slams come to mind!)

The DOT’s pothole form, which includes helpful details on the difference between potholes, hummocks and other roadway defects!

But after spending so much time last year playing with ESP32s and LoRa radio, it suddenly dawned on me: why go to all the trouble of automating pothole sensing when a button would do the job just fine?

I’ve had some success reverse-engineering other DOT forms so I could POST more streamlined versions from my own server and at first I thought this would be no different. I grabbed a Raspberry Pi Zero 2 and used mitmproxy to sniff out what the form actually sends, but… womp womp. There was a backend ReCAPTCHA hash that got sent with the form, and try as I might I could not defeat it with any headless playwright-based browsing action running on the Pi. And I really did try. I even went as far as buying credits on Capsolver so I could generate CAPTCHA hashes remotely and then pass them with my form. But whatever I came up with continued to trip the ReCAPTCHA flag and I could never get the form reliably submitted.
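For flavor, replaying a sniffed form submission looks something like the sketch below. The endpoint and field names here are placeholders (the real ones came out of mitmproxy and aren't reproduced); the recaptcha token field is the one that kept tripping the flag no matter what I fed it.

```python
# Sketch of replaying a sniffed DOT form POST. All field names and the URL
# are hypothetical placeholders, not the real form's.
from urllib.parse import urlencode

FORM_URL = "https://example.nyc.gov/pothole-form"  # placeholder endpoint

def build_payload(street_address: str, borough: str, recaptcha_token: str) -> dict:
    """Assemble the form fields the way the browser would send them."""
    return {
        "location": street_address,
        "borough": borough,
        "defectType": "pothole",
        # Even with a Capsolver-generated token here, the backend kept
        # flagging the submission as non-human.
        "g-recaptcha-response": recaptcha_token,
    }

if __name__ == "__main__":
    payload = build_payload("123 Dekoven Ct", "Brooklyn", "<token>")
    print(urlencode(payload))
```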

So then I thought: okay, well, maybe it’s just a job for a bot? It seemed extremely over-engineered, but I was excited to see if I could kick the tires on any API-powered agentic AI tools that might be more successful at appearing human to ReCAPTCHA.

I stumbled upon Anchor Browser, which seemed to fit the bill perfectly. Using its playground, I was able to craft a test prompt with real pothole data and watch as it spun up a headful Chromium instance and worked to fill out my form:

It worked great, easily adapting the details of my prompt to the complexities of the DOT form. More impressively, if I ever gave it imperfect information it did a great job of proactively opening a new tab and discerning the right data before returning to the form. And most importantly, it generally didn’t trip the ReCAPTCHA flag — or if it did, it simply tried to fill out the form again, which was almost always sufficient to convince ReCAPTCHA of its humanity. The API was further capable of generating screenshots of the session, so I could not only extract confirmation details about my pothole reports but also see visually that the reports had run successfully. Wild.
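I won't reproduce Anchor Browser's actual API here, but the shape of the idea (a natural-language task plus the form's context, POSTed to an agent endpoint) can be sketched like this; the endpoint and request fields are purely illustrative:

```python
# Sketch of handing the form-filling job to an agentic-browser API.
# The endpoint and request shape are hypothetical; only the idea -- a
# natural-language task that includes the pothole details -- is the point.
import json
from urllib import request

AGENT_API_URL = "https://api.example-agent-browser.dev/v1/tasks"  # hypothetical

def build_task(street_address: str, borough: str) -> dict:
    prompt = (
        f"Open the NYC DOT pothole report form and file a report for a pothole "
        f"at {street_address}, {borough}. If the street name looks wrong, "
        f"verify it in a new tab before submitting. Capture a screenshot of "
        f"the confirmation page."
    )
    return {"task": prompt, "screenshot": True}

if __name__ == "__main__":
    body = json.dumps(build_task("Dekoven Court", "Brooklyn")).encode()
    req = request.Request(AGENT_API_URL, data=body,
                          headers={"Content-Type": "application/json"})
    # request.urlopen(req)  # would need a real endpoint and API key
```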

Whoops, the robot was right! It’s Dekoven Court, not Dekoven Road.

Since a Pi now seemed like far too much trouble, I settled on a simpler approach: an ESP32 with LoRa and GPS modules, plus a button to trigger the uplink of GPS coordinates to a remote LoRaWAN server, namely The Things Network. From there, I’d use webhooks to trigger a remote workflow that reverse-geocodes the lat/long via Nominatim, passes the resulting street address to the Anchor Browser API, and finally sends me a confirmation notification via ntfy.sh once the form has been submitted.
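The server-side chain can be sketched roughly as follows. The payload layout (two big-endian signed 32-bit ints, degrees scaled by 1e7) is my assumption about how coordinates might be packed into a LoRaWAN frame, not necessarily what the firmware actually sends; the Nominatim reverse-geocoding URL format, by contrast, is real and documented.

```python
# Sketch of the webhook workflow: decode the TTN uplink payload, build the
# Nominatim reverse-geocoding request, and compose the ntfy.sh confirmation.
import struct
from urllib.parse import urlencode

def decode_coords(frm_payload: bytes) -> tuple[float, float]:
    """Unpack lat/long packed as two big-endian int32s, degrees * 1e7
    (assumed encoding, chosen to fit a tiny LoRaWAN frame)."""
    lat_raw, lon_raw = struct.unpack(">ii", frm_payload)
    return lat_raw / 1e7, lon_raw / 1e7

def nominatim_reverse_url(lat: float, lon: float) -> str:
    """Nominatim reverse-geocoding request; the response's display_name
    gives a street address to hand to the form-filling agent."""
    qs = urlencode({"lat": f"{lat:.7f}", "lon": f"{lon:.7f}", "format": "jsonv2"})
    return f"https://nominatim.openstreetmap.org/reverse?{qs}"

def ntfy_message(address: str) -> str:
    """Body of the push notification sent once the form goes through."""
    return f"Pothole report submitted for {address}"

if __name__ == "__main__":
    payload = struct.pack(">ii", int(40.6782 * 1e7), int(-73.9442 * 1e7))
    lat, lon = decode_coords(payload)
    print(nominatim_reverse_url(lat, lon))
```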

(If this sounds like a preordained solution, it’s because I already had to figure out most of this stack for the drawbridge tilt sensor!)

So I grabbed some components I had lying around — a Heltec LoRa ESP32 v3 board, a GPS module, a ChromaTek neopixel button and a little piezo speaker — and quickly cobbled together a prototype.

Don’t mind the chaos. Also, I’ll note that the Heltec LoRa v4 board comes with a GPS module built-in, but I already had the older v3 board on hand and the v4 board doesn’t have an external antenna port anyway, so I just popped on a separate HiLetGo GPS module.

You can view my code here on GitHub, but I’ll call out a few considerations I worked through. They mainly revolve around one fact: the button is mounted on the dash of my car, which is moving through space and time, unlike many static installations. That means the button is only meaningful if I have a GPS lock and am located within the borders of NYC, so my code attempts to compensate for that. (Well, at least the former. I haven’t yet implemented a check against an NYC polygon!) It also means I want ways to reduce annoyance and distraction while driving, so the code tries to get a timestamp from GPS and automatically darkens the neopixel after the sun has set, and includes a long-press mode that can mute/unmute the piezo, which chirps when the LoRaWAN uplink has been successfully sent. And it means we might not come into proximity of a LoRaWAN gateway for a while (perhaps after the car has been powered off and turned back on), so the code maintains a cache of unsent GPS coords and keeps trying to connect to TTN on an ongoing basis.
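The unsent-coordinate cache described above can be sketched in hardware-agnostic form. Names and structure here are illustrative, not taken from the actual firmware: presses queue up only when there's a GPS lock, and the queue drains only as uplinks actually succeed.

```python
# Hardware-agnostic sketch of the firmware's unsent-coordinate cache.
# Class and method names are illustrative, not from the real repo.
from collections import deque

class PotholeQueue:
    def __init__(self, max_pending: int = 32):
        # Bounded queue: if it fills, the oldest report is dropped.
        self.pending = deque(maxlen=max_pending)

    def button_pressed(self, gps_fix) -> bool:
        """Queue a report; refuse the press if there's no GPS lock yet.
        gps_fix is a (lat, lon) tuple or None when no lock."""
        if gps_fix is None:
            return False
        self.pending.append(gps_fix)
        return True

    def drain(self, send_uplink) -> int:
        """Try to send everything queued; stop at the first failed uplink
        (e.g. no gateway in range) and keep the rest for later."""
        sent = 0
        while self.pending:
            if not send_uplink(self.pending[0]):
                break
            self.pending.popleft()
            sent += 1
        return sent
```

In the firmware loop this would run opportunistically: `drain()` is retried on a timer, so coords pressed out of gateway range get uplinked once the car comes back into coverage.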

When all is said and done, it’s super satisfying to press one button and know that some agentic browser is filling out an annoying form on my behalf, and also know that in just a couple of days the pothole I’ve reported will almost certainly have been filled.

And that’s about it! To everyone at the DOT who actually goes out and fills these potholes: thank you so much for your hard work. For everyone else whose cars, bikes, rollerblades, wheelchairs and strollers are struggling with NYC streets, I hope this helps a little.
