EngiSphere

Can Self-Driving Cars Handle Long Expressway Tunnels? 🚗


How engineers are solving GPS blackouts in ultra-long expressway tunnels for autonomous driving.

Published August 5, 2025 By EngiSphere Research Editors
Self-Driving Car Inside an Expressway Tunnel © AI Illustration

TL;DR

Self-driving cars struggle in long expressway tunnels because GPS signals are lost. Sensor fusion helps, but a maximum positioning error of 113.62 cm shows it isn't enough: smart infrastructure such as U-GPS, BLE, and UWB beacons is needed for safe, precise autonomous navigation underground.

The R&D

Ever tried using Google Maps in a long tunnel? 📱➡️🕳️ Suddenly, your little blue dot vanishes. The navigation stutters. You’re flying blind—until you emerge back into the sunlight and GPS signals return.

Now, imagine this happening not to a human driver, but to a self-driving car. 🚗💨 In autonomous driving, even a small navigation error can be dangerous. Lane-level precision—knowing exactly where the vehicle is—is critical. But what happens when GPS disappears, like in ultra-long underground expressways?

That’s the challenge tackled by a team of South Korean engineers in a recent study published at the 2025 Suwon ITS Asia Pacific Forum. Spoiler: They’ve made progress, but the solution isn’t perfect—yet. Let’s dive into their experiment, results, and what it means for the future of smart highways. 🛣️🔍

🌐 The GPS Problem in Tunnels

We all rely on GPS every day. But here’s the catch: GPS doesn’t work underground. Or in deep urban canyons. Or under thick tree cover. Why? Because GPS satellites orbit 20,000 km above Earth 🛰️, and their signals are weak. When you enter a tunnel, the concrete and rock block those signals completely. No signal = no positioning.

For human drivers, this is a mild annoyance. But for autonomous vehicles, it’s a serious safety issue. Self-driving cars need to know their position within 50 centimeters or less to stay in their lane, avoid collisions, and make smart decisions. Commercial GPS alone can drift by several meters—sometimes even tens of meters! 📏

So how do you keep a robot car on track when it can’t “see” the sky?

🧪 The Experiment: Testing Navigation in Real Tunnels

To answer this, researchers from the Korea Automotive Technology Institute (KATECH) and Korea Expressway Corporation Research Institute conducted real-world tests in two major tunnels on the Seoul Metropolitan Area 1st Ring Expressway:

  • Suri Tunnel (~1.8 km long)
  • Suam Tunnel (~1.2 km long)

They used a modified KIA EV6, equipped not just with standard GPS, but with a hybrid navigation system combining multiple sensors:

🛰️ GPS – for open-sky positioning
🧲 IMU (Inertial Measurement Unit) – tracks acceleration and rotation
🚘 IVN (In-Vehicle Network) – uses wheel speed and steering data
📸 Camera – sees lane markings and road features
🌫️ LiDAR – creates 3D maps of surroundings

This combo is known as sensor fusion—a smart way to keep navigating when one system fails. Think of it like your brain using sight, balance, and memory to walk in the dark. 🧠💡

But here’s the twist: tunnels are not kind to sensors. Poor lighting, reflective surfaces, dust, and moving vehicles cause signal noise and errors. So the team didn’t just drive through—they trained AI models to recognize 18 types of tunnel infrastructure, like:

🧯 Fire extinguishers
🔔 Emergency alarms
🚪 Evacuation signs

These fixed objects act like landmarks—helping the car re-calibrate its position, even without GPS. 🎯

🤖 AI That "Sees" the Tunnel

To make sense of the camera and LiDAR data, the team used three AI models:

  1. YOLOP – Detects lane markings (even faded ones)
  2. YOLOv9 – Spots 2D objects like signs and equipment
  3. CasA (Cascade Attention Network) – Builds 3D object detection from LiDAR point clouds

These models were trained on 30 km of tunnel data from Seoul and Gyeonggi Province. That’s a lot of concrete walls, lights, and emergency boxes! 🏗️📊

By recognizing these objects, the car could correct its position mid-tunnel—like saying, “Ah, there’s Fire Cabinet #7—I must be 300 meters in.”
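The idea of snapping a drifting estimate to a recognized landmark can be sketched in a few lines. This is an illustrative toy (not the study's code), and the landmark names and surveyed positions below are made up for the example:

```python
# Hypothetical surveyed positions of fixed tunnel objects, in meters
# from the tunnel entrance (illustrative values, not from the paper).
LANDMARK_MAP_M = {
    "fire_cabinet_7": 300.0,
    "emergency_exit_2": 650.0,
}

def correct_position(detected_landmark: str, distance_ahead_m: float) -> float:
    """Replace a drifting dead-reckoned estimate with the surveyed landmark
    position minus the camera/LiDAR-measured distance to that landmark."""
    return LANDMARK_MAP_M[detected_landmark] - distance_ahead_m

# The camera spots fire cabinet #7 about 5 m ahead, so the car must be
# at roughly 300 - 5 = 295 m into the tunnel, regardless of odometry drift.
estimate = correct_position("fire_cabinet_7", 5.0)
```

In the real system the "distance ahead" would come from the 3D object detector, and the map lookup would use the HD map rather than a hand-written dictionary.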

But AI isn’t perfect. Moving trucks, shadows, and dirty lenses can fool the system. So the team needed a smarter way to combine all the sensor data.

🔗 The Brain: A Smart Kalman Filter

Enter the Loosely Coupled Kalman Filter—a mathematical genius that fuses GPS, IMU, and IVN data in real time. 🧮⚡

Think of it like a conductor orchestrating different instruments. Each sensor plays a note:

  • GPS says: “I think we’re here.”
  • IMU says: “We turned left and sped up.”
  • IVN says: “The wheels spun 100 times—so we moved X meters.”

The Kalman Filter listens to all of them, weighs their reliability, and predicts the most likely position—even when GPS drops out. It also accounts for errors like:

  • Gyroscope bias (sensors drifting over time)
  • Wheel slip (speedometer inaccuracies)
  • Attitude errors (tilt and roll miscalculations)
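The predict/update rhythm of a Kalman filter can be shown with a deliberately simplified 1-D sketch: wheel odometry drives the prediction, and a GPS fix (when one exists) pulls the estimate back. This is an illustrative toy, not the paper's loosely coupled implementation, and the noise values are assumed:

```python
# Minimal 1-D Kalman-filter sketch (illustrative only): predict position
# from wheel odometry, then correct with a GPS fix whenever available.

def kalman_step(x, p, odo_delta, q, gps=None, r=4.0):
    """One predict/update cycle for a scalar position estimate.
    x: position estimate (m); p: estimate variance (m^2)
    odo_delta: distance travelled per wheel odometry (m); q: process noise
    gps: GPS position fix (m), or None inside a tunnel; r: GPS noise variance."""
    # Predict: dead reckoning pushes the estimate forward; uncertainty grows.
    x = x + odo_delta
    p = p + q
    if gps is not None:
        # Update: blend in the GPS fix, weighted by relative reliability.
        k = p / (p + r)            # Kalman gain: trust GPS more when p is large
        x = x + k * (gps - x)
        p = (1.0 - k) * p
    return x, p

x, p = 0.0, 1.0
for step in range(10):
    gps_fix = None if 3 <= step < 8 else float(step + 1)  # GPS blackout mid-run
    x, p = kalman_step(x, p, odo_delta=1.0, q=0.25, gps=gps_fix)
```

During the simulated blackout the variance `p` keeps growing (exactly the drift the study measured in the tunnels) and shrinks again once GPS returns. The real filter does the same thing in 3D, with IMU biases and attitude errors carried in the state vector.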

This system kept the car “aware” during the round-trip drive from Anyang–Pyeongchon to Pangyo, passing through both tunnels.

📊 The Results: 113.62 cm of Error

So, how well did it work?

In open areas, the system was highly accurate—thanks to GPS. But inside the tunnels, where GPS was blocked, the error grew.

The maximum positioning error reached 113.62 cm (~3.7 feet). That’s over 1.1 meters off from the true position. 🎯❌

Now, for your average GPS navigation app? That’s fine. You’ll still get to your exit. But for autonomous driving, that’s a red flag. 🚩

Why? Because 1.1 meters could mean drifting into another lane—especially on a high-speed expressway. Autonomous systems need sub-50 cm accuracy for safe lane-keeping, merging, and obstacle avoidance.

So while the hybrid system worked better than GPS alone, it’s not precise enough for full self-driving in tunnels.

🚧 The Verdict: We Need Smart Infrastructure

The researchers concluded something crucial:

Sensor fusion + AI vision is helpful, but not enough.
To achieve true high-precision navigation underground, we need infrastructure-based positioning.

In other words: the tunnel itself must help the car know where it is. 🏗️🚗

They propose building "K-Underground Expressways" with embedded tech like:

  • U-GPS (Virtual GPS) – Ground-based transmitters that mimic satellite signals
  • BLE (Bluetooth Low Energy) – Beacons on tunnel walls that ping nearby vehicles
  • UWB (Ultra-Wideband) – High-precision radio signals with centimeter-level accuracy

Imagine driving into a tunnel and instantly connecting to a local positioning network—like Wi-Fi for cars. 🌐 These signals aren’t blocked by concrete and can guide autonomous vehicles with lane-level precision.
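How would a car turn beacon signals into a position? One common approach (a sketch under assumed anchor positions, not the proposed K-Underground design) is trilateration: measure the range to three beacons at known wall positions and solve for the point that matches all three distances:

```python
import math

def trilaterate(anchors, dists):
    """2-D position from three beacons at known (x, y) positions and the
    measured range to each. Subtracting the first range equation from the
    other two linearizes the problem into a 2x2 linear system."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = dists
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1          # nonzero when anchors aren't collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Three hypothetical UWB anchors and ranges measured from a car at (3, 4):
pos = trilaterate([(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)],
                  [5.0, math.sqrt(65.0), math.sqrt(45.0)])
```

Real UWB ranges are noisy, so a deployed system would fuse many beacons with least squares and feed the result into the Kalman filter rather than trusting any single fix.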

This isn’t sci-fi. Similar systems are being tested in smart cities and mines. But scaling it to ultra-long expressways? That’s the next frontier.

🔭 Future Prospects: The Road Ahead

This study is a stepping stone. It shows that:

✅ Current navigation tech can function in long tunnels
❌ But it’s not accurate enough for full autonomy
🚀 The future lies in vehicle-to-infrastructure (V2I) integration

Here’s what’s coming next:

1. Smart Tunnels with Positioning Beacons

Install UWB or BLE anchors every few meters in tunnels. These act like “GPS towers” underground, giving cars real-time location fixes within 10–30 cm accuracy.

2. HD Maps + AI Landmark Matching

Create high-definition 3D maps of tunnels. Cars can then match live LiDAR scans to the map—like facial recognition, but for road infrastructure. 🗺️👀

3. 5G & C-V2X Connectivity

Use 5G networks or Cellular Vehicle-to-Everything (C-V2X) to stream positioning data from the road to the car. This enables real-time updates and traffic coordination.

4. Hybrid Navigation 2.0

Combine infrastructure signals + sensor fusion + AI into one seamless system. Even if one part fails, the others keep the car on track.

The goal? A fully autonomous drive from city to city, even through 10-km-long underground expressways, without losing a single centimeter of precision. 💡🛣️

🌍 Why This Matters

South Korea isn’t the only country building long underground highways. China, Norway, Switzerland, and even the U.S. are expanding underground expressways to reduce surface congestion and protect landscapes.

But without smart navigation, these tunnels become autonomous driving dead zones. This research highlights the urgent need to design roads with self-driving cars in mind—not just human drivers.

It’s a shift from vehicle-centric to infrastructure-enabled autonomy. The car is smart, but the road is smarter. 🤝

📌 Key Takeaways
  • GPS fails in tunnels → big problem for self-driving cars
  • Hybrid navigation (IMU + camera + LiDAR + AI) reduces error
  • Max error: 113.62 cm → too high for autonomy
  • Solution? Smart tunnels with U-GPS, BLE, UWB beacons
  • Future: Infrastructure-powered, lane-accurate underground driving

🏁 Final Thoughts

So, can self-driving cars navigate ultra-long underground expressways today?

Not reliably.

But thanks to studies like this, we’re getting closer. The KATECH team proved that sensor fusion and AI vision help, but also showed that infrastructure support is non-negotiable for safety and precision.

The future of autonomous driving isn’t just about better cars—it’s about smarter roads. 🛠️🛣️

And when the K-Underground Expressway rolls out with U-GPS, BLE, and UWB tech? That’s when robot cars will finally drive through tunnels—eyes closed, GPS off, and perfectly on track. 🤖✅

Stay tuned to EngiSphere for more breakthroughs in engineering that move the world—literally. 🌍⚙️


Concepts to Know

🧭 GPS (Global Positioning System) - Your car’s “I’m here!” signal from space. 🛰️ A network of satellites that tells your device (or car) where it is on Earth. Super handy—until you go underground and it goes poof! 📴 - More about this concept in the article "Revolutionizing Freight Wagon Tracking: How RFID Wake-Up Receivers Are Changing Rail Logistics 🚆 📡".

🔄 IMU (Inertial Measurement Unit) - The car’s inner sense of motion. 🌀 This sensor tracks how fast the car is accelerating and how much it’s turning. Think of it like your inner ear—helping you “feel” movement even with your eyes closed. But like us, it can get dizzy over time! 😵‍💫

🚗 IVN (In-Vehicle Network) - Your car’s nervous system. 🧠⚡ It’s how different parts of the car (like wheels, brakes, and computer) talk to each other. It shares data like wheel speed and steering angle—helping guess how far the car has moved, even without GPS.

🤖 Sensor Fusion - Teamwork makes the dream work. 🤝 Combining data from GPS, cameras, IMU, and more to get the best possible guess of where the car is. Like using sight, sound, and touch together to walk in the dark. - More about the concept in the article "Revolutionizing Autonomous Driving: MapFusion's Smart Sensor Fusion 🚗💡".

🛣️ Dead Reckoning (DR) - Guessing your location based on where you were. 🤔📍 When GPS is gone, the car says: “I was here, I moved this fast for this long, so I must be here now.” But small errors add up—like walking blindfolded and guessing each step.
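That "errors add up" effect is easy to simulate. A toy sketch (with an assumed 1% wheel-speed noise, not measured data) over a tunnel roughly the length of Suri Tunnel:

```python
import random

# Toy dead-reckoning drift demo: tiny per-step measurement errors
# compound into a large position error over a long tunnel.
random.seed(0)
true_pos = dr_pos = 0.0
for _ in range(1800):                        # 1 m per step, ~1.8 km tunnel
    true_pos += 1.0
    dr_pos += 1.0 + random.gauss(0.0, 0.01)  # each step slightly mis-measured
error_m = abs(dr_pos - true_pos)             # grows roughly with sqrt(steps)
```

Even with only centimeter-scale noise per meter, the accumulated error reaches tens of centimeters by the tunnel exit, which is why DR alone can't deliver lane-level accuracy.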

🎯 Kalman Filter - The math wizard that smooths out the noise. 🧮✨ A smart algorithm that takes messy sensor data and predicts the most likely position. It’s like a referee that says, “GPS is off, IMU is drifting, but based on all clues—here’s your best bet.” - More about this concept in the article "AI from Above 🏗️ Revolutionizing Construction Safety with Tower Crane Surveillance".

👁️ LiDAR (Light Detection and Ranging) - The car’s 3D vision with lasers. 🔦🌀 It shoots invisible laser beams and measures how long they take to bounce back. Creates a detailed 3D map of the world—perfect for spotting walls, signs, and other cars. - More about this concept in the article "Flying into the Future 🚁 How UAVs Are Revolutionizing Transportation Infrastructure Assessment".

📸 Camera (Vision System) - The car’s eyes. 👀 Uses AI to “see” lane lines, signs, and objects. But just like us, it struggles in dark, foggy, or reflective tunnels. 🌫️

🧠 AI Object Detection (YOLOP, YOLOv9, CasA) - Smart algorithms that play “spot the difference.” 🔍

  • YOLOP: Sees lane markings (even if faded).
  • YOLOv9: Spots 2D objects like signs and fire extinguishers.
  • CasA: Uses LiDAR point clouds to detect objects in 3D space.

They’re like the car’s brain saying: “Hey, I recognize this wall sign—I must be near exit 7!” - More about this concept in the article "Smarter Helmet Detection with GAML-YOLO 🛵 Enhancing Road Safety Using Advanced AI Vision".

📍 U-GPS (Virtual GPS / Underground GPS) - Fake GPS signals… but in a good way! 🌐🔧 Ground-based transmitters inside tunnels that mimic satellite signals. Gives cars GPS-like positioning even underground. Think of it as “Wi-Fi for location.”

🔗 BLE (Bluetooth Low Energy) - Tiny wireless beacons that whisper, “I’m here!” 📡💬 Small devices on tunnel walls that send signals to cars, helping them know their exact spot. Uses little power and works great in tight spaces.

📏 UWB (Ultra-Wideband) - Super-precise radar for centimeter-level accuracy. 🎯📡 A high-frequency radio tech that can locate objects within 10–30 cm. Like sonar, but way more accurate—perfect for self-driving in tight lanes.

🏗️ K-Underground Expressway - South Korea’s smart tunnel dream. 🇰🇷🛣️ A future network of ultra-long underground highways equipped with U-GPS, BLE, UWB, and AI systems to support safe autonomous driving—even with zero GPS.

🚦 V2I (Vehicle-to-Infrastructure) - When cars and roads become BFFs. 🤝🚗 Cars talk to traffic lights, signs, and tunnels to get real-time data. Imagine your car getting a text from the road: “Slow down—accident ahead.” 📱⚠️

🧩 HD Map (High-Definition Map) - Google Maps, but for robots. 🗺️🤖 Super-detailed 3D maps with exact positions of lanes, signs, and curbs. Self-driving cars use them like a puzzle reference—matching what they see to what’s expected. - More about this concept in the article "🗺️ GlobalMapNet: Revolutionizing HD Maps for Self-Driving Cars".


Source: Choi, K.-S.; Sa, Y.-H.; Choi, M.-G.; Kim, S.-J.; Lee, W.-W. A Study on High-Precision Vehicle Navigation for Autonomous Driving on an Ultra-Long Underground Expressway. Eng. Proc. 2025, 102, 10. https://doi.org/10.3390/engproc2025102010

From: Korea Automotive Technology Institute; Korea Expressway Corporation Research Institute.

© 2025 EngiSphere.com