Hey Reddit, we’re Satish Jeyachandran, Mizuki McGrath, and Nathaniel Fairfield, eager to hear your questions on hardware and software development of self-driving technology. Here’s a little background on each of us:

I’m Satish, and I joined Waymo in 2017 to lead the hardware organization focused on designing, integrating, and scaling Waymo’s self-driving system, including cameras, radar, lidar, compute, and more. I spent nearly two decades developing hardware and scaling products in the auto industry, most recently at Tesla. Hear what I previously said about Waymo’s history and why we’re focused on full autonomy: https://www.ridehome.info/waymos-head-of-hardware-satish-jeyachandran/

Hi, I’m Miz. I joined Waymo earlier this year, after 19 years at Google working on a range of search ranking systems in Web Search, Play, and Maps, and founding the Tokyo R&D center. I now lead the simulation and data science teams, inspired by how large-scale real-world and virtual driving can bring us closer to making autonomous vehicles an everyday reality. You can read more about some of those efforts here: https://www.theatlantic.com/technology/archive/2017/08/inside-waymos-secret-testing-and-simulation-facilities/537648/

I’m Nathaniel, and I'm one of the earliest members of Project Chauffeur, which went on to become the Google self-driving car project and now Waymo. I've worn many hats during my decade here, including leading our early explorations of transportation as a service and door-to-door driving. I lead the behavior prediction, motion planning, routing, and fleet assistance teams. I have a background in computer science and robotics. You might have recently seen me talking about how our Driver tackles the unusual task of driving on Halloween: https://twitter.com/Waymo/status/1189997577099366400

Proof:

Edit: We've got 10 minutes left so we're wrapping up now.

Edit: Redditors, this has been so fun! Thank you for your fantastic questions -- we’re excited to have our first AMA under our belt. We do plan to do more of these in the future, so stay tuned for dates!

Comments: 258 • Responses: 23  • Date: 

Unseeablething33 karma

Questions most likely for Nathaniel.

As riders (Waymo One), my wife and I get strange routing decisions. Curious what ultimately causes the vehicles to intentionally avoid quicker routes? Do the cars prefer less traffic at the expense of time? Will they avoid construction areas even if the road is still in use and more efficient? And sometimes they'll go all the way around an area, ignoring a straight line to our destination. Do the turns required to get into a destination also change their pathing?

waymo28 karma

Thanks so much for riding with us! I ride in the cars all over Mountain View, and I sometimes get the scenic route, so I know where you're coming from!

When the car is choosing its preferred route, it considers a number of factors: the length of different routes, traffic on the roads, and any construction or other slowdowns. As you guessed, some maneuvers may lead to a longer route time, so we’re able to take those into consideration too. - Nathaniel
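
For readers curious how a tradeoff like this might be expressed, here is a minimal sketch in Python. The factors mirror the ones Nathaniel lists, but the weights, field names, and numbers are purely illustrative assumptions, not Waymo's actual routing cost:

```python
# Illustrative only: a toy route scorer that trades off length, traffic,
# construction slowdowns, and awkward maneuvers (e.g. unprotected left turns).
# All weights and field names are assumptions made for this example.
from dataclasses import dataclass

@dataclass
class Route:
    name: str
    length_km: float              # total driving distance
    traffic_delay_min: float      # expected delay from current traffic
    construction_delay_min: float
    hard_maneuvers: int           # e.g. unprotected lefts, tight merges

def route_cost(r: Route,
               min_per_km: float = 1.5,
               maneuver_penalty_min: float = 0.75) -> float:
    """Return an effective travel time in minutes for ranking routes."""
    base = r.length_km * min_per_km
    return (base + r.traffic_delay_min + r.construction_delay_min
            + r.hard_maneuvers * maneuver_penalty_min)

candidates = [
    Route("direct", 4.0, 6.0, 3.0, 2),   # short but congested, with hard turns
    Route("scenic", 5.5, 1.0, 0.0, 0),   # longer but free-flowing
]
best = min(candidates, key=route_cost)
print(best.name, round(route_cost(best), 1), "min")
```

In this toy example the longer "scenic" route wins because congestion and hard maneuvers make the shorter route more expensive overall.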

rao7928 karma

Pedestrians will often make eye contact with drivers to make sure they have been noticed before crossing the road. How will you address the inability of pedestrians to know whether a driverless car is aware of their presence?

waymo29 karma

Overall, we are extremely cautious around pedestrians, and we slow and yield, which helps indicate that we see them. We've found that actions speak pretty clearly -- which is the same situation that humans are in at night, for example, when eye contact doesn't work! - Nathaniel

CallMeOatmeal22 karma

You have been scaling driverless tests and driverless rides, so you must have some high level of confidence that these vehicles will not cause an accident. What kind of internal metrics do you use that give you this confidence? Do you have a way to calculate the probability an accident would occur for a given route? Do you have an estimated accidents-per-mile figure (or at-fault accidents per mile) for those driverless vehicles you are dispatching in the Chandler area? Or do you measure confidence in an entirely different way?

(credit /u/Mattsasa for question)

waymo28 karma

What key metrics or milestones will you wait to hit before removing the safety driver?

We have been giving some riders completely driverless rides in parts of Phoenix, which is pretty exciting.

This is a nuanced challenge, and you can fool yourself if you use any one method or statistic to gain confidence in the safety of the system. So we use a combination of methods that lets us build a fuller picture, and we continue to improve and add to those methods. At a high level, they include approaches like:

  • driving millions of miles on real public roads and reviewing disengages and other events
  • simulating software over billions of miles, and against an extensive set of challenging scenarios, including many that we've never seen in real life but could happen!
  • real-world testing at our closed course. There are lots of rare situations that we want to test for but that don’t occur often enough in the real world to give us meaningful data, so at our private test track we can create those situations and measure our technology’s performance against them.
  • rigorous design principles (redundant actuation, fallback systems), safety-based engineering (hazard analysis, DFMEAs), and extensively validating that our system reacts as expected.
  • We actually cover some of this in the Waymo safety report at waymo.com/safety

- Nathaniel & Miz
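
As a minimal illustration of why no single statistic is enough, here is a sketch of just one ingredient: turning logged miles and events into a rate with an upper confidence bound. The mileage, event count, and the plain-Poisson model are assumptions made for this example, not a Waymo metric:

```python
# Illustrative only: turn hypothetical logged mileage into a rate estimate with
# a one-sided upper confidence bound. The numbers are invented.
import math

def poisson_upper_bound(events: int, miles: float, confidence: float = 0.95) -> float:
    """Crude upper bound on events per million miles under a Poisson model.
    For zero events this reduces to the 'rule of three' (~3 / exposure)."""
    exposure = miles / 1e6

    def prob_at_or_below(lam: float) -> float:
        mean = lam * exposure
        return sum(math.exp(-mean) * mean**k / math.factorial(k)
                   for k in range(events + 1))

    # Bisect for the rate at which observing this few events becomes unlikely.
    lo, hi = 0.0, 1000.0
    for _ in range(200):
        mid = (lo + hi) / 2
        if prob_at_or_below(mid) > 1 - confidence:
            lo = mid
        else:
            hi = mid
    return hi

miles_driven = 12_000_000   # hypothetical logged miles
observed_events = 2         # hypothetical events of interest
print("point estimate:", observed_events / (miles_driven / 1e6), "per million miles")
print("95% upper bound:", round(poisson_upper_bound(observed_events, miles_driven), 2))
```

Even with a bound like this, rare events and constantly changing software make any single number misleading on its own, which is why the answer above describes combining real-world driving with simulation, closed-course testing, and design-level analysis.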

realmariotorres20 karma

Great topic for an AMA. Waymo is famous for using deep neural nets for the perception in your cars. Do you also use any machine learning techniques at the behaviour level? If so, how do you guarantee the car will never do anything bad?

waymo27 karma

Thanks for the question. We use deep learning across our stack: perception, prediction, etc. In behavior, specifically, we use ML for predicting what other agents will do, or how they will react to us, or what is a natural baseline to drive down a narrow road. But your question is spot on: the key requirement when we incorporate ML is to make sure that it works within a framework where we can leverage the strengths of ML but combine it with our engineered solutions and rigorously evaluate the safety of the output. - Nathaniel
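
A minimal sketch of the general pattern Nathaniel describes: a learned component proposes, and engineered checks constrain what the planner is allowed to act on. The stub model, the lane-envelope check, and all values are assumptions for illustration, not Waymo's implementation:

```python
# Illustrative only: wrap a learned component in engineered checks so the planner
# never acts on an ML output that violates hard constraints.
from typing import List, Tuple

Point = Tuple[float, float]  # (x, y) in meters, ego frame

def learned_candidate_paths(scene) -> List[List[Point]]:
    """Stand-in for an ML model proposing candidate paths (stub for the example)."""
    return [[(0.0, 0.0), (5.0, 0.2), (10.0, 0.4)],
            [(0.0, 0.0), (5.0, 2.5), (10.0, 5.0)]]

def violates_hard_constraints(path: List[Point],
                              lane_half_width_m: float = 1.8) -> bool:
    """Engineered check: reject any path that strays outside the lane envelope."""
    return any(abs(y) > lane_half_width_m for _, y in path)

def safe_candidates(scene) -> List[List[Point]]:
    return [p for p in learned_candidate_paths(scene)
            if not violates_hard_constraints(p)]

print(len(safe_candidates(scene=None)), "of 2 learned proposals pass the hard checks")
```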

waymowaysong19 karma

What is the current cost of all of self driving hardware that goes into one vehicle with the 4th Gen platform? What is the current cost of all of self driving hardware that goes into one vehicle with the 5th Gen platform?

Aside from the cost of the vehicle, how long until you can get the cost of all of the self driving hardware (sensors / compute) under $10,000 per vehicle?

waymo25 karma

We’ve significantly reduced the price of our hardware suite, which my team designs in-house. The latest generation of our hardware suite is more than 50% cheaper than our last generation. And our next gen lidar delivers more than an order of magnitude improvement in cost and functionality. - Satish

borisst13 karma

What exactly do you mean by 'driverless'?

Could you confirm that there are no humans that monitor the car, either locally or remotely, and are able to brake or steer the car in case of emergency?

waymo36 karma

Thanks for the question! There is a lot of confusion around this term -- it's used pretty broadly in the industry. When we use “driverless” we mean there are no other humans in the car (besides our riders, of course!).

Our entire fleet is connected to our operations center, which is staffed by a fleet response team who can monitor the vehicles and give the vehicles contextual information as needed but they can’t brake or steer the car. The vehicle itself handles braking and steering at all times. - Nathaniel

turpauk3 karma

What about a natural disaster? What can be done in case of flooding or forest fires? Will your cars just stop?

waymo23 karma

WRT natural disasters: Of course, natural disasters can encompass a lot of different situations. In general, our cars detect water, smoke, dust, etc., and slow down or stop if that’s the right thing to do. In addition, our ops team constantly monitors local conditions, and can request that all vehicles return to the depot, pull over immediately, or reroute, depending on the situation.

There isn't a remote emergency *brake*, but there is the ability to request the vehicles to pull over ASAP.
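
Reading these two answers together, the implied interface is advisory rather than actuating. A tiny sketch of what that separation could look like, with message names invented purely for illustration:

```python
# Illustrative only: a sketch of a fleet-response interface where remote staff can
# send contextual hints and requests, but there is no brake or steer command.
from enum import Enum, auto

class FleetResponseMessage(Enum):
    CONFIRM_BLOCKED_LANE = auto()     # "yes, that lane really is closed"
    SUGGEST_ALTERNATE_ROUTE = auto()
    REQUEST_PULL_OVER = auto()        # the car still plans and executes the stop itself
    REQUEST_RETURN_TO_DEPOT = auto()
    # Note: no BRAKE or STEER message exists in this sketch; actuation stays onboard.

def handle(msg: FleetResponseMessage) -> str:
    # Onboard software treats every message as input to its own planner.
    return f"planner will consider: {msg.name.lower()}"

print(handle(FleetResponseMessage.SUGGEST_ALTERNATE_ROUTE))
```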

meiyouL512 karma

  • I've heard that the Waymo driver code started off being based on the Stanford Car code. Is that true?

  • Deep learning became popular several years after Google started the SDC project. How has this affected the approaches used? Was the project leveraging ML/DL significantly before, or have breakthroughs been incorporated over time?

  • How do you deal with ingesting the sheer amount of data generated by an ever growing fleet? What sort of engineering challenges stem from this?

  • What sort of redundancies have to exist for an L4 vehicle to run with 0 humans in it?

  • How do you deploy vehicle software? Is it containerized, or are vehicles imaged?

  • What sort of bring up / bring down problems have to be solved to turn a great AI driver into a functioning robotaxi service that can run unattended?

  • Why was Firefly killed, especially after prototypes were already on the road?

waymo15 karma

Deep learning became popular several years after Google started the SDC project. How has this affected the approaches used? Was the project leveraging ML/DL significantly before, or have breakthroughs been incorporated over time?

We greatly benefited from advances in deep learning and have been investing in the area for years, starting from the earliest days of the field around 2011.

Our first applications of deep learning were in perception and allowed us to greatly improve the accuracy of our object detection/classification. Since then, we’ve expanded our use of deep learning to other parts of the stack beyond perception, and currently use it in prediction, planning, mapping, and simulation. We continue to push on the leading edge of ML research, and we've also got great collaborations with other ML groups (like Brain and DeepMind). - Nathaniel

waymo10 karma

What sort of redundancies have to exist for an L4 vehicle to run with 0 humans in it?

We build a number of redundancies into our vehicles, including power, braking/steering, communication/connectivity, sensors/compute, and onboard software stack. - Satish
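
As a minimal sketch of how redundancy can feed a fallback decision, here is a toy health monitor over the subsystem categories Satish lists. The mode names and the decision logic are assumptions made for illustration, not Waymo's design:

```python
# Illustrative only: a toy health monitor that picks a fallback driving mode when
# a redundant subsystem reports a fault. Subsystem names mirror the answer above.
SUBSYSTEMS = ["power", "braking_steering", "connectivity", "sensors_compute", "software"]

def choose_mode(health: dict) -> str:
    """Return a driving mode given per-subsystem health flags."""
    if all(health.get(s, False) for s in SUBSYSTEMS):
        return "nominal"
    if health.get("braking_steering") and health.get("sensors_compute"):
        # Enough capability left to plan and execute a safe pull-over.
        return "minimal_risk_maneuver"
    return "immediate_stop"

print(choose_mode({s: True for s in SUBSYSTEMS}))
print(choose_mode({**{s: True for s in SUBSYSTEMS}, "connectivity": False}))
```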

bochen878712 karma

What are the top 3 tech challenges you’re working on right now?

waymo20 karma

From a hardware perspective, the top challenge we’re working on is ensuring our hardware sensor suite can perform optimally in all types of weather. We’re testing our vehicles in all types of weather to continuously improve our hardware stack, including severe rain in Florida, snow in the Upper Peninsula of Michigan, and haboobs (dust storms) in Arizona. We’re excited about the improved weather capabilities of the fifth generation of our hardware suite, which we’re putting the finishing touches on now. - Satish

turpauk11 karma

How dependent are you on your maps? Are your cars able to drive safely without any map?

waymo21 karma

Our cars rely on maps, but we designed our system so we can navigate safely where the roads have changed.

Maps are useful to let our vehicle know where it is, as well as to help it anticipate what is coming up next. For example, it’s useful for our vehicles to know there’s a stop sign coming around the corner before we visually detect it.

We build our own high definition 3D maps. While driving, our vehicles benefit from sharing new information with the rest of the fleet dynamically -- for example if there is a construction zone or some other new/temporary object or situation. Maps, combined with sensing, give us the best understanding of the world. - Miz
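
A minimal sketch of the map-versus-sensing reconciliation Miz describes, with the feature strings and set-based representation invented for the example:

```python
# Illustrative only: compare a prior map with live perception and flag changes
# worth sharing with the rest of the fleet.
def detect_map_changes(map_features: set, perceived_features: set) -> set:
    """Return features that disagree between the prior map and live sensing."""
    newly_observed = perceived_features - map_features   # e.g. a fresh construction zone
    no_longer_seen = map_features - perceived_features   # e.g. a removed stop sign
    return newly_observed | no_longer_seen

prior = {"stop_sign@5th_and_main", "lane_marking@main_st"}
live = {"stop_sign@5th_and_main", "lane_marking@main_st", "construction_zone@3rd_ave"}
print("share with fleet:", sorted(detect_map_changes(prior, live)))
```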

SlowFatGRT11 karma

How long do you think it will be, realistically, before a fully self-driving car is available on the market?

waymo21 karma

Think you’re asking about personally owned vehicles, is that right?

Right now, our top focus is ride-hailing and long-haul trucking, along with commercial B2B deliveries. For personal car ownership, we’re actively talking with our automaker partners to offer their customers personally owned vehicles with the Waymo Driver. Stay tuned for more on that...we've got a cool business approach in mind. - Satish

coryrenton10 karma

What would be the best form factor change to an automobile in terms of making it much easier to develop autonomous driving that you wish you could popularize, but can't because consumers would think it is too ugly?

waymo30 karma

sampleminded9 karma

Any plans to use sensors that are not attached to the vehicle? Say, if you know a certain place has a visibility problem in a mapped area, you add a camera that all Waymo cars have access to so they can see around a corner or a tall building? Or using quadcopters to route traffic around accidents and emergencies in real time?

waymo22 karma

So far in our testing we have not come across a need for a sensor that is external to one of our vehicles. - Satish

expnad8 karma

How do you deal with really dark objects, stuff that cameras or Lidars can’t see at close distance, like shiny black cars?

waymo18 karma

Our lidars are capable of seeing really dark objects, even things like a tire on a freeway at night, at a range where we can react safely. For example, shiny black cars have a number of features that reflect plenty of light back to our lidars. Our radars are also very good at detecting vehicles, regardless of their color. Our system is designed to detect all sorts of road users, like cars, people, cyclists, motorcycles, etc. - Satish

sampleminded7 karma

Are Waymo vehicles aware of each other? Do they communicate? Are there plans to do this?

waymo12 karma

Yes, our Waymo vehicles share information in real time. This is particularly useful when one of our vehicles encounters something unexpected in its environment -- for example, a construction zone. - Nathaniel

mewenger6 karma

Hi! All of you have impressive and unique histories which undoubtedly fit your current work. What would you say to others fascinated with the AV space and wanting to get involved but without an engineering pedigree? What in-demand roles do you see for non-engineering folks? What about in the ecosystem of AV - municipalities, suppliers, etc.? Are there trends or counterintuitive market needs you're seeing where someone could get involved?

waymo9 karma

Thank you so much for this question! We do have a super talented team and we’re growing across all areas including non-eng business units.

On top of that, our Eng team is expanding -- hardware, onboard software (including perception, prediction, and planning), simulation, and other areas. I hope you’ll keep an eye on our open roles at waymo.com/joinus - Satish

bladerskb6 karma

  • Does your in-car compute hardware consist of TPU/Edge TPU?
  • Do you currently perform any 'Waymo Engineering', 'Experimental' or more advanced future software tests that consist of driving with no lidar-type HD mapping in a city?

    Thanks again.

waymo11 karma

This is one of the best parts of working here! We explore all kinds of ideas: new sensors, snow, "less map", exotic stereo camera configurations, synthetic aperture radar, golf carts. - Nathaniel

waymo10 karma

We have designed our compute architecture with flexibility, so the exact mix of silicon we use can change over time. Today we use a combination of CPUs, GPUs, accelerators, and IO processing engines. - Satish

dheera6 karma

Satish, what are the best camera options available on the market now for self-driving vehicles? IMO it's kind of ridiculous that any GMSL camera solutions on the market (e.g. Leopard) are $400 per camera + $1000 for a daughter board of some sort, while the equivalent USB 3.0 cameras (same sensors!) cost <$75. Considering it's basically the same USB camera minus the USB hardware, plus some SerDes chips, one would think that GMSL should be cheaper than USB for the same sensors, but it isn't.

I understand economies of scale, but with the number of self-driving cars out there, there should be some scaling already.

waymo15 karma

The cameras that are part of our hardware sensor suite for our vehicles are actually designed in-house here at Waymo. I haven’t seen any off-the-shelf camera that would provide the same level of performance, low cost, and latency. - Satish

turpauk6 karma

What are your abilities to swerve? Do you think your cars are able to avoid a potential head-on collision?

waymo14 karma

Our cars have discretion to swerve. Our basic approach is to think through the possibilities ahead of time and choose the behavior that is safest across the most contexts. In some cases, that means swerving to avoid an oncoming collision (and we have extensive closed-course tests to validate our behavior). - Nathaniel
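
One simple way to read "think through the possibilities ahead of time and choose the behavior that is safest across the most contexts" is a worst-case comparison over candidate maneuvers. The maneuvers, scenarios, and costs below are invented for illustration; real planning is continuous and far richer:

```python
# Illustrative only: a tiny worst-case comparison over candidate maneuvers and
# foreseen scenarios. All names and numbers are made up for this example.
maneuvers = ["brake_in_lane", "swerve_right_to_shoulder", "swerve_left_across_center"]
scenarios = ["oncoming_car_stays", "oncoming_car_corrects", "debris_on_shoulder"]

# cost[maneuver][scenario]: lower is safer (made-up numbers)
cost = {
    "brake_in_lane":             {"oncoming_car_stays": 9, "oncoming_car_corrects": 2, "debris_on_shoulder": 2},
    "swerve_right_to_shoulder":  {"oncoming_car_stays": 1, "oncoming_car_corrects": 1, "debris_on_shoulder": 4},
    "swerve_left_across_center": {"oncoming_car_stays": 3, "oncoming_car_corrects": 8, "debris_on_shoulder": 3},
}

# Pick the maneuver whose worst case across scenarios is least bad.
best = min(maneuvers, key=lambda m: max(cost[m][s] for s in scenarios))
print(best)  # -> swerve_right_to_shoulder in this toy setup
```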

turpauk5 karma

Thank you for your AMA. Are you considering adding an additional type of sensor? If yes, what will it be?

waymo12 karma

Yes! In the fifth generation of our in-house designed hardware sensor suite, we will be introducing a new sensor modality. - Satish

realmariotorres5 karma

Hi Nathaniel, I believe predicting the behavior of other road users is very challenging. Can you say anything about what kinds of techniques you use at Waymo to make sure your predictions are correct? How do you make sure that Waymo will not cause accidents if some of those predictions are wrong?

waymo10 karma

Thanks for the question, Mario! There are two really key parts to the answer. First, we have a huge amount of real-world data against which we can measure our prediction performance. Second, we don't just predict one possibility! We predict all the potential actions of the other agents (together with the likelihood of each action). The planner then considers all these possibilities when coming up with a safe plan. - Nathaniel
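
A minimal sketch of planning against a full distribution of predicted actions rather than a single guess, as the answer describes. The pedestrian actions, probabilities, and costs are invented for illustration:

```python
# Illustrative only: pick the ego plan with the lowest expected cost over a set of
# predicted pedestrian actions. Probabilities and costs are made up for this example.
pedestrian_predictions = {          # predicted action -> probability
    "stays_on_curb": 0.7,
    "steps_into_crosswalk": 0.25,
    "jaywalks_midblock": 0.05,
}

ego_plans = ["maintain_speed", "slow_and_cover_brake", "stop_before_crosswalk"]

# cost[plan][pedestrian_action]: lower is better (made-up numbers)
cost = {
    "maintain_speed":        {"stays_on_curb": 0, "steps_into_crosswalk": 50, "jaywalks_midblock": 100},
    "slow_and_cover_brake":  {"stays_on_curb": 1, "steps_into_crosswalk": 5,  "jaywalks_midblock": 20},
    "stop_before_crosswalk": {"stays_on_curb": 3, "steps_into_crosswalk": 3,  "jaywalks_midblock": 10},
}

def expected_cost(plan: str) -> float:
    return sum(p * cost[plan][action] for action, p in pedestrian_predictions.items())

best = min(ego_plans, key=expected_cost)
print(best, round(expected_cost(best), 2))
```

Here the hedging plan ("slow and cover the brake") wins on expected cost even though "maintain speed" would be best in the single most likely outcome.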