

An autonomous driving road trip in San Francisco and Los Angeles

Apr 11, 2025

[Image: 20250411-US-road-trip-01]

I'm Koichi from TIER IV’s System Software team. Last autumn, I traveled to the U.S. with six colleagues to experience autonomous vehicles firsthand in San Francisco and Los Angeles. We all joined TIER IV in 2024 and were eager to see how the technology was being used abroad. In this report, we’ll share our observations as engineers working on autonomous driving software.



During the trip, we rented a Tesla and took multiple rides in Waymo robotaxis. The following is a brief summary of our experiences in each vehicle.



Waymo (SF)

  • Ride time: About 500 minutes
  • Impressions: Impressive. It handled low-lying obstacles well, and we never encountered a situation where rider support was needed.


Waymo (LA)

  • Ride time: About 200 minutes
  • Impressions: Quite sophisticated, but not as fine-tuned as the San Francisco service. Despite the shorter ride time, we encountered a situation where rider support had to intervene.


Tesla

  • Ride time: About 200 minutes
  • Impressions: High-quality autonomous driving, handling tasks like reverse parking in indoor garages and merging smoothly with traffic. However, since it doesn't use pre-mapped data, it sometimes attempted lane changes in no-passing zones, so the person in the driver's seat had to stay focused on the road.


Overall, the autonomous driving performance of the Tesla and Waymo vehicles was impressive, but one key difference stood out – where the autonomous functionality can be used. As of writing, Waymo’s taxi service is available in a limited number of cities: San Francisco, Los Angeles, Phoenix, and Austin. Meanwhile, Tesla’s mass-produced vehicles with autonomous driving features are being driven on roads across the world.



It's important to note that this does not necessarily mean Tesla outperforms Waymo. The Tesla we used during this trip occasionally displayed unstable behavior, such as attempting to overtake in no-passing zones or failing to enter the correct turn lane at an intersection. In contrast, Waymo's taxis have detailed maps of the areas where they operate. By referring to information about intersections, lanes, and surrounding buildings, Waymo is able to provide a reliable autonomous taxi service, especially in San Francisco.



Tesla

We rented a Tesla Model Y equipped with version 12.5.4 of the company's Full Self-Driving (Supervised) software. As the name suggests, it has advanced driving capabilities, but it is classified as Level 2 under the Society of Automotive Engineers (SAE) levels of driving automation.



There are clear distinctions between autonomous driving levels:



  • Level 5 represents fully autonomous driving anywhere, with no need for human intervention.
  • Level 4 describes autonomous driving within specific areas or under certain conditions, such as operating in San Francisco in clear weather. Waymo's robotaxi service and TIER IV's autonomous bus in Shiojiri, Nagano Prefecture, have received Level 4 certification.
  • Level 3 still requires a human at the wheel, but under certain conditions the driver can let the vehicle drive itself and do other things, such as reading a newspaper or watching a movie. However, the driver must remain ready to take control when necessary.
  • Level 2 requires a human driver to remain attentive at all times. Level 2 systems range from basic functions like lane-keeping and cruise control to more advanced features such as route navigation, traffic-signal recognition, and obstacle avoidance.
  • Level 1 involves very basic automation, such as adaptive cruise control, and Level 0 means no driving assistance.


The Model Y was able to handle most complex road situations on its own. In certain scenarios – such as pulling out of a parallel parking spot, following a navigation route, and stopping at the destination – it could drive fully autonomously.



When we got the car, the driving mode was set to "Assertive," the most aggressive setting. The ride felt a bit rough, making it hard to relax. However, after switching to the default "Chill" mode, which observes the speed limit, the driving became much safer and more comfortable. Even in this mode, the car was able to merge onto highways during rush hour and perform lane changes and overtakes when necessary.
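As a side note for fellow engineers, one way to picture what such a mode switch might do under the hood is a simple mapping from the selected profile to planner parameters. The sketch below is purely illustrative; the parameter names and values are our own assumptions, as Tesla hasn't published how its driving profiles are implemented.

```python
# Hypothetical mapping from a driving-style profile to planner parameters.
# Parameter names and values are illustrative assumptions, not Tesla's.

DRIVING_PROFILES = {
    # "Chill" sticks to the speed limit and keeps a longer following gap.
    "Chill":     {"speed_offset_mps": 0.0, "follow_gap_s": 2.0, "lane_change_bias": 0.2},
    # "Assertive" allows a higher cruising speed and changes lanes more readily.
    "Assertive": {"speed_offset_mps": 3.0, "follow_gap_s": 1.2, "lane_change_bias": 0.8},
}

def target_speed(speed_limit_mps: float, profile: str) -> float:
    """Cruising speed for the selected profile: the limit plus the profile's offset."""
    return speed_limit_mps + DRIVING_PROFILES[profile]["speed_offset_mps"]

print(target_speed(speed_limit_mps=29.0, profile="Chill"))      # 29.0
print(target_speed(speed_limit_mps=29.0, profile="Assertive"))  # 32.0
```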



All in all, the driving performance on public roads was impressive and generally felt safe. However, there were some navigation mistakes, such as when the car was confused by a new bike lane separating the right-turn and left-turn lanes. The car needed to turn right but missed the section where it could cross the bike lane and instead continued into the left-turn lane. We intervened before the vehicle attempted the right turn, ensuring there was no traffic violation. After steering it into the correct lane, we re-engaged FSD, and the car continued on the correct path. The bike lane didn't yet appear on Google Maps, suggesting it had been added recently, so we assume the issue was caused by the vehicle's cameras failing to accurately interpret the unfamiliar road layout.



After that incident, we had a 30-minute uninterrupted autonomous drive at night, and on this occasion, the vehicle handled the bike lanes correctly.



Overall, we were impressed with the self-driving performance. The Tesla handled multi-lane changes and frequent lane shifts on highways smoothly, just like a human driver. Even at night, its camera-based perception system performed reliably, and the drive generally felt safe.



Waymo

Sensors

The fifth-generation Waymo Driver is deployed on a Jaguar I-PACE equipped with about 25 cameras, five LiDARs, six radars, and other sensors, including microphones and ultrasonic sensors.

[Image: 20250411-US-road-trip-02]

Multiple sensors are housed in a unit on the roof of Waymo vehicles.



LiDAR

The vehicle is equipped with one 360-degree top LiDAR for long-range sensing and four short-range LiDARs, positioned at the front, rear, and by the front wheels. These perimeter LiDARs assist in navigating tight gaps and detecting small objects close to the car. The top LiDAR is mounted above an array of 12 cameras arranged in a circle, enabling sensor fusion that enhances both recognition accuracy and mapping precision.
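We obviously don't know how Waymo fuses these sensors internally, but the benefit of placing the top LiDAR directly above the camera ring is easiest to see in the standard first step of LiDAR-to-camera fusion: projecting LiDAR points into a camera image so that detections can be associated across sensors. The sketch below assumes a simple pinhole camera model, with made-up intrinsics and extrinsics standing in for real calibration data.

```python
import numpy as np

# Minimal sketch of LiDAR-to-camera projection, the usual first step of a
# LiDAR/camera fusion pipeline. All matrices below are illustrative values,
# not Waymo parameters.

# Intrinsic matrix of one camera in the roof ring (hypothetical).
K = np.array([[1000.0,    0.0, 640.0],
              [   0.0, 1000.0, 360.0],
              [   0.0,    0.0,   1.0]])

# Extrinsics: rotation and translation from the LiDAR frame to the camera
# frame (hypothetical; in practice these come from calibration).
R = np.eye(3)
t = np.array([0.0, -0.2, 0.1])

def project_lidar_to_image(points_lidar: np.ndarray) -> np.ndarray:
    """Project Nx3 LiDAR points into pixel coordinates, dropping points behind the camera."""
    points_cam = points_lidar @ R.T + t          # transform into the camera frame
    in_front = points_cam[:, 2] > 0.1            # keep points in front of the lens
    points_cam = points_cam[in_front]
    pixels_h = points_cam @ K.T                  # homogeneous pixel coordinates
    return pixels_h[:, :2] / pixels_h[:, 2:3]    # perspective divide

# Example: a few points a few metres ahead of the sensor.
sample = np.array([[0.5, 0.0, 5.0], [-1.0, 0.2, 8.0], [0.0, -0.5, 3.0]])
print(project_lidar_to_image(sample))
```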



Radar

Radar sensors are mounted in black pods at the front and rear corners of the vehicle: two at the front corners facing sideways and two at the rear corners primarily facing backward. Additionally, two forward-facing radar units are installed on the roof rack that houses the top LiDAR and vision system. Together, these sensors provide full 360-degree coverage of the vehicle.



Cameras

The units housing the LiDARs on the front, rear and sides of the vehicle also include downward-facing cameras and infrared LEDs. Additionally, there are six cameras in the radar pods, forward-facing cameras with different fields of view, and, as mentioned above, 12 cameras arranged in a circle below the top LiDAR.



The cameras with infrared LEDs and some of the roof-mounted cameras appear to be either RGB-infrared cameras or infrared-only cameras. This setup provides clear visibility even at night, when conventional RGB cameras are less effective at capturing details.



To maintain sensor performance, Waymo uses rotating LiDAR housings and small wipers to keep the sensors clean. This helps reduce the frequency of returns to service stations and enables the vehicle to continue operating even in bad weather. Speaking of which, during one late-night ride in thick fog, the vehicle was able to drive without issues.



Recognition performance

Waymo's perception capabilities were impressive in both San Francisco and Los Angeles, consistently detecting a wide range of objects. In addition to common road users like cars and pedestrians, it accurately recognized less conventional objects such as traffic cones and poles. On the display, detected objects appeared as point clouds, with moving objects rendered over a graphical underlay, possibly to help passengers distinguish them from stationary ones. The system also displayed turn signals, brake lights, and the status of emergency vehicles when applicable. While most objects were depicted as abstract representations, traffic cones were clearly identifiable as orange and white cones. Notably, detected objects did not carry text labels.



[Image: 20250411-US-road-trip-03]

Traffic cones are depicted as orange and white cones.
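We can only guess how the rider display decides which detections receive the underlay, but a simple display-side rule based on each object's estimated speed would reproduce the behavior we observed. In the sketch below, the class, its fields, and the 0.5 m/s threshold are all hypothetical.

```python
from dataclasses import dataclass

# Minimal sketch of a display-side rule for marking detections as moving,
# similar in spirit to the underlay shown on the rider screen. The class name,
# fields, and threshold are our own assumptions, not Waymo's.

@dataclass
class Detection:
    label: str          # e.g. "vehicle", "pedestrian", "traffic_cone"
    speed_mps: float    # estimated speed from object tracking

MOVING_SPEED_THRESHOLD_MPS = 0.5

def needs_moving_underlay(det: Detection) -> bool:
    """Return True if the detection should be rendered with the 'moving' underlay."""
    return det.speed_mps > MOVING_SPEED_THRESHOLD_MPS

detections = [
    Detection("vehicle", 8.3),
    Detection("traffic_cone", 0.0),
    Detection("pedestrian", 1.2),
]
for d in detections:
    print(d.label, "moving" if needs_moving_underlay(d) else "static")
```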



Night in the fog

We ordered a Waymo to collect us from the summit of a small mountain called Twin Peaks. The sun had already set, and thick fog made visibility poor. With little traffic and few pedestrians, there weren’t many objects to detect, but the vehicle still maintained stable recognition under these challenging conditions. At one point, while navigating a curve, a skateboarder suddenly appeared in front of us. The vehicle immediately detected the person and braked sharply, avoiding a collision.



[Image: 20250411-US-road-trip-04]

Thick fog made visibility poor during a journey from the summit of Twin Peaks.
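A common way to reason about the kind of hard stop we experienced with the skateboarder is time-to-collision (TTC): the distance to the obstacle divided by the closing speed, with braking triggered when TTC falls below a threshold. The numbers and threshold in the sketch below are rough guesses for that scenario, not values we know Waymo uses.

```python
# Minimal time-to-collision (TTC) sketch with illustrative numbers; Waymo's
# actual braking logic is not known to us.

def time_to_collision(distance_m: float, ego_speed_mps: float, obstacle_speed_mps: float) -> float:
    """TTC in seconds; infinity if the gap is not closing."""
    closing_speed = ego_speed_mps - obstacle_speed_mps
    if closing_speed <= 0:
        return float("inf")
    return distance_m / closing_speed

HARD_BRAKE_TTC_S = 2.0  # hypothetical threshold

# Skateboarder appearing ~12 m ahead while the car travels at ~30 km/h (~8.3 m/s).
ttc = time_to_collision(distance_m=12.0, ego_speed_mps=8.3, obstacle_speed_mps=1.0)
if ttc < HARD_BRAKE_TTC_S:
    print(f"TTC {ttc:.1f} s -> brake hard")
else:
    print(f"TTC {ttc:.1f} s -> continue and monitor")
```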



Emergency vehicles

When we were on board, a police car happened to pass by, and an emergency-vehicle indicator appeared on the monitor. It's not clear whether the car recognized it because its siren started to sound or because of its appearance. Incidentally, when a blood donation vehicle passed by, it was also flagged as an emergency vehicle on the monitor.



[Image: 20250411-US-road-trip-05]

Emergency vehicles are indicated with a red underlay in the Waymo user interface.



Rubbish on roads

At times, the vehicle avoided small pieces of trash and wood – obstacles it could have easily driven over – even when nothing was displayed on the monitor. On many occasions it also misidentified turn signals, recognizing them as being on even though they were not.



Route planning

Navigating narrow roads

Even in complex traffic scenarios, Waymo vehicles consistently demonstrated sound driving judgment. One such instance occurred on a narrow road in a residential part of LA.

[Image: 20250411-US-road-trip-06]

A garbage truck blocks our path on a residential street in Los Angeles.



A garbage truck collecting trash stopped in front of us. The adjacent lane was for oncoming traffic and cars were parked on both sides of the road, leaving only a narrow space to overtake the truck. Even for a human driver, this situation would be quite challenging. First, a decision must be made whether to overtake. Once the decision is made, the driver must carefully navigate the narrow space while observing oncoming traffic. The Waymo vehicle initially decelerated as it approached the rear of the truck.



[Image: 20250411-US-road-trip-07]

Oncoming traffic makes it difficult to overtake the truck.



It then detected an oncoming vehicle in the adjacent lane. This was likely detected by the LiDAR sensor on the front left of the Waymo vehicle. A significant advantage of autonomous driving systems is the ability to detect oncoming vehicles early and stop accordingly, even when they are hidden from a human driver's view. After safely yielding to the oncoming vehicle, the Waymo overtook the truck with minimal clearance.



[Image: 20250411-US-road-trip-08]

The vehicle navigates around the garbage truck when the lane is clear of oncoming traffic.
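One way to frame this yield-then-overtake decision is as a gap-acceptance check: move into the oncoming lane only if the time needed to clear the blocked stretch is comfortably shorter than the oncoming vehicle's arrival time. The sketch below is a minimal illustration with assumed distances, speeds, and safety margin; it is not Waymo's planner.

```python
# Minimal gap-acceptance sketch for overtaking a stopped vehicle via the
# oncoming lane. Numbers and margin are assumptions for illustration only.

def can_overtake(gap_length_m: float,
                 ego_pass_speed_mps: float,
                 oncoming_distance_m: float,
                 oncoming_speed_mps: float,
                 safety_margin_s: float = 3.0) -> bool:
    """Return True if the ego vehicle can clear the gap before the oncoming car arrives."""
    time_to_clear = gap_length_m / ego_pass_speed_mps
    if oncoming_speed_mps <= 0:
        return True  # no approaching traffic
    time_until_oncoming = oncoming_distance_m / oncoming_speed_mps
    return time_until_oncoming > time_to_clear + safety_margin_s

# Oncoming car close by: wait behind the truck.
print(can_overtake(gap_length_m=15.0, ego_pass_speed_mps=4.0,
                   oncoming_distance_m=30.0, oncoming_speed_mps=10.0))   # False
# Oncoming lane clear for a while: go.
print(can_overtake(gap_length_m=15.0, ego_pass_speed_mps=4.0,
                   oncoming_distance_m=120.0, oncoming_speed_mps=10.0))  # True
```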



Handling jaywalkers

Some pedestrians attempted to cross the road at locations other than crosswalks, requiring the vehicle to anticipate their actions and either stop or maneuver to avoid a collision. While driving through downtown San Francisco, we encountered a pedestrian who suddenly appeared from a blind spot.



[Image: 20250411-US-road-trip-09]

A pedestrian walks into the road from a blind spot.



To avoid the pedestrian, the vehicle gently steered to the left and gradually came to a stop, having predicted that the pedestrian would cross the road. However, the pedestrian paused when he noticed the car, looked around, and decided not to cross at that point. The Waymo vehicle immediately recognized the change of mind and resumed driving. This demonstrates the accuracy of Waymo's behavior prediction and the precision of its path planning.
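A simple way to reproduce this stop-then-resume behavior is a constant-velocity prediction of the pedestrian's motion toward the ego lane, re-evaluated every planning cycle: while the prediction says the pedestrian will reach the lane within the horizon, the car stays stopped; once the pedestrian pauses, the predicted conflict disappears and the car can proceed. The geometry, thresholds, and horizon in the sketch below are our own assumptions.

```python
# Minimal sketch of a constant-velocity crossing prediction, re-evaluated each
# planning cycle. Geometry, thresholds, and horizon are illustrative assumptions.

def pedestrian_will_enter_lane(ped_lateral_offset_m: float,
                               ped_lateral_speed_mps: float,
                               horizon_s: float = 3.0,
                               lane_half_width_m: float = 1.8) -> bool:
    """Predict whether the pedestrian reaches the ego lane within the horizon.

    ped_lateral_offset_m: distance from the lane centre (positive = outside the lane).
    ped_lateral_speed_mps: speed toward the lane centre (positive = approaching).
    """
    if ped_lateral_speed_mps <= 0.1:       # standing still or walking away
        return False
    time_to_lane = (ped_lateral_offset_m - lane_half_width_m) / ped_lateral_speed_mps
    return 0.0 <= time_to_lane <= horizon_s

# Cycle 1: pedestrian 4 m from the lane centre, walking toward it at 1.5 m/s -> stop.
print(pedestrian_will_enter_lane(4.0, 1.5))   # True
# Cycle 2: the pedestrian has paused (0.0 m/s) -> resume driving.
print(pedestrian_will_enter_lane(3.5, 0.0))   # False
```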



It’s worth noting that in certain situations, Waymo Driver can contact a human fleet response agent to get additional context about its environment. So in some of the cases we observed, it’s possible that remote support was quietly assisting behind the scenes.



UI/UX

The vehicle is equipped with one display at the front and one at the rear, both of which are touchscreens. You can change the background music, press the stop button, or contact support without needing a smartphone app. The display also shows the remaining ride time and the estimated arrival time at all times.



[Image: 20250411-US-road-trip-10]

When the vehicle stops at a red light, the traffic signal appears on the user interface.



The planned route (predicted path) is displayed in green, while the route actually taken (actual path) is shown in blue, making it easy to visually distinguish between the two. The screen also shows other road traffic and pedestrians using data from the vehicle's sensors. When stopped at a red traffic light, a red mark is shown, so even if the traffic lights are not visible to passengers, they can tell whether the vehicle is stopped due to traffic or waiting at a red light.



Before boarding
Nearly 300 Waymo vehicles are operating on San Francisco roads. To make it easy for users to identify their vehicle, a two-character display on the vehicle's roof can be personalized via the app.



During the ride
Passengers can stream music directly from their smartphones or choose from various playlists, including rock, jazz, and Disney hits, via iHeartRadio, the largest radio network in the U.S.



After arrival
If the trunk is opened before the ride, it opens again automatically at the destination, and a "Don't forget your items in the trunk" reminder appears on the in-car displays.



[Image: 20250411-US-road-trip-11]

Passengers who use the trunk are reminded to collect their items at the destination.



Rider support

During our stay in Los Angeles, we encountered a situation where the vehicle got stuck; however, Waymo's support response was quick and effective. While exiting a parking lot onto a four-lane road, our vehicle got stuck attempting to enter the second lane. That lane was congested, but the first lane was clear, so a human driver would have used it and changed lanes further along the road. With the vehicle unable to get out of the parking lot and about seven vehicles lined up behind us honking their horns, we decided to contact support.



[Image: 20250411-US-road-trip-12]

Rider support intervened to help our vehicle exit a parking lot after a steady flow of traffic prevented it from entering the targeted lane.



Rider support is a feature that allows passengers to communicate with an operator via voice call by pressing the support button on the in-car display. We used it to report the problem, and an operator assessed the situation and took action remotely, after which the vehicle changed its route and slowly edged out of the parking lot. According to the operator, the vehicle was not being driven remotely; the system changed its route based on an instruction the operator sent.
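Based on that explanation, the intervention sounds like remote assistance rather than teleoperation: the operator provides a high-level hint and the onboard planner continues to do the actual driving. The sketch below illustrates that division of responsibility; every name and structure in it is a hypothetical stand-in, not Waymo's interface.

```python
from dataclasses import dataclass, field

# Minimal sketch contrasting remote assistance with teleoperation, as we
# understood the operator's explanation. All names and structures here are
# hypothetical.

@dataclass
class RouteHint:
    """High-level guidance from a remote operator, e.g. 'lane 1 may be used to exit'."""
    allowed_exit_lanes: list[int] = field(default_factory=list)

@dataclass
class OnboardPlanner:
    target_exit_lane: int = 2  # the congested lane the vehicle was stuck trying to enter

    def apply_hint(self, hint: RouteHint) -> None:
        # The operator never steers or brakes; they only relax a routing constraint.
        if hint.allowed_exit_lanes:
            self.target_exit_lane = hint.allowed_exit_lanes[0]

    def plan(self) -> str:
        return f"edge out of the parking lot into lane {self.target_exit_lane}"

planner = OnboardPlanner()
planner.apply_hint(RouteHint(allowed_exit_lanes=[1]))
print(planner.plan())  # -> edge out of the parking lot into lane 1
```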



Wrap-up

In this post, we’ve summarized our impressions of autonomous driving on U.S. roads. While we were impressed by the quality of the rides, it’s clear there are still technical hurdles to overcome before Level 5 autonomous driving is realized. This is true even for Waymo and Tesla, underscoring the challenges ahead.






Koichi Imai | System Software team

Koichi joined TIER IV full-time in April 2024 and is currently developing middleware and operating systems. He first became involved with TIER IV in 2022 as a part-time engineer and student researcher. He holds a master’s degree from the University of Tokyo’s Graduate School of Information Science and Technology.



TIER IV engineers Yukinari Hisaki, Sho Iwasawa, Masahiro Kubota, Asami Morita, Masato Saeiki and Max Schmeller contributed to this blog post.


Another team of engineers went on a similar road trip in 2023. Check out their report here.






TIER IV is always on the lookout for passionate individuals to join our journey. If you share our vision of making autonomous driving accessible to all, get in touch.





Visit our careers page to view all job openings.

If you’re uncertain about which roles align best with your experience, or if the current job openings don’t quite match your preferences, register your interest here. We’ll get in touch if a role that matches your experience becomes available, and schedule an informal interview.



