What's it like to ride in a self-driving car?
The technology in a much-modified electric Nissan Leaf enables full driverless control, even on the UK’s most challenging roads. We’ve been for a ride to find out how it works...

I held my breath as our car approached an oncoming 44-tonne lorry and passed just a few feet to the left of it; each vehicle was doing around 55mph on the narrow country lane. I needn’t have worried, though, because the car I was in bristled with technology that enables it to genuinely drive itself along rural roads – with absolutely no human intervention.
The two-year-old Nissan Leaf in our images is one of two autonomous vehicles (AVs) developed in the evolvAD project (AD signifying autonomous driving) – the third and final phase of an eight-year, 16,000-mile self-driving research programme conducted by Nissan and four partners. A reassuring statistic that I kept in mind as the Leaf safely passed that truck is that all those self-driving miles were completed without a single accident.

Before evolvAD, true driverless cars (such as the fleet of self-driving Jaguar I-Paces operated by ride-share platform Waymo) had largely been confined to the spacious city streets and open roads typical of the US. Those are a far cry from the UK’s rural roads and residential streets, which are among the most challenging environments that a driver has to deal with.
On many country roads, the markings that the driver aids (such as lane-keeping assistance) fitted to many of today’s cars rely on for road positioning are either worn or non-existent, and the edges of the roads themselves can be poorly defined – or obscured in bad weather.

Among the challenges that our suburban streets pose for self-driving cars are that they can be unexpectedly narrowed by parked cars, criss-crossed by unpredictable pedestrians, and strewn with obstacles for the car to negotiate. In the absence of a human driver’s eyes and brain to deal with these factors, a car needs to recognise and process the hazards some other way if it is to truly drive itself on any road.
To cope, a self-driving car needs to know exactly where it is, so Nissan collaborated with a company called Catapult to create detailed, up-to-date maps of the roads that the test cars would drive on, using a combination of aerial photography and AI. In conjunction with GPS data, these maps tell the cars where junctions, pedestrian crossings and other important features (including poor surfaces or unmade roads) are before they encounter them.
On its own, a map isn’t enough to guide a car correctly, of course, so it’s continuously supplemented by real-time lidar (light detection and ranging) information and camera feeds, enabling it to recognise and identify hazards and obstructions, such as parked cars and pedestrians. The test car also monitors real-time traffic data (as is used by today’s navigation apps, such as Google Maps). So, if there’s a queue of traffic at a set of lights, the car knows about this in advance and can prepare to slow down before its sensors identify the vehicles.
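The layering described above – static map features checked against live sensor detections, with the nearest hazard setting the car’s behaviour – can be illustrated with a toy fusion step. This is a simplified sketch of the general idea, not Nissan’s actual software; all names and speed values here are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class MapFeature:
    kind: str          # e.g. "junction", "crossing", "poor_surface" (from the map + GPS)
    distance_m: float  # distance ahead of the car

@dataclass
class Detection:
    kind: str          # e.g. "parked_car", "pedestrian", "queue" (from lidar/cameras)
    distance_m: float  # distance measured live

def plan_speed(current_mph: float, features: list[MapFeature],
               detections: list[Detection]) -> float:
    """Pick a target speed from the nearest relevant hazard,
    whether it comes from the stored map or from live sensors."""
    # Hypothetical caution speeds (mph) for each hazard type
    CAUTION = {"junction": 20, "crossing": 15, "poor_surface": 30,
               "pedestrian": 10, "parked_car": 20, "queue": 5}
    hazards = [h for h in features + detections
               if h.kind in CAUTION and h.distance_m < 100]
    if not hazards:
        return current_mph
    nearest = min(hazards, key=lambda h: h.distance_m)
    return min(current_mph, CAUTION[nearest.kind])
```

The point of the design is that map data and sensor data feed the same decision: a junction the car has never seen and a queue its cameras have just spotted are handled by one rule.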

To help the Leaf predict how pedestrians and other road users might behave, its self-driving system is loaded with data based on research on road user behaviour. This was generated by a specialist company called Humanising Autonomy, with further information from the Transport Research Laboratory.
In all, the Leaf uses 15 cameras dotted around the car, plus four long-range and two wide-range lidar systems, with two GPS antennae mounted in a roofbox. Two cameras behind the windscreen are used to collect traffic light information; others work with the long-range lidar sensors to spot and identify everything from HGVs to squirrels, and then to work out their speed and predicted trajectory to ensure that the Leaf passes them safely. The short-range lidar sensors monitor the ground beside the car to analyse the road edge – vital on roads without edge markings.
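Working out another road user’s speed and predicted trajectory, as the camera and lidar systems above do, amounts to projecting its position forward and checking the clearance against the car’s own planned path. The sketch below uses the simplest possible model (constant velocity) purely to illustrate the idea; the real system will be far more sophisticated, and every function here is hypothetical.

```python
import math

def predict_position(x: float, y: float, speed_mps: float,
                     heading_rad: float, t: float) -> tuple[float, float]:
    """Constant-velocity prediction: where will the object be in t seconds?"""
    return (x + speed_mps * math.cos(heading_rad) * t,
            y + speed_mps * math.sin(heading_rad) * t)

def min_clearance(own_path, obj_start, obj_speed, obj_heading,
                  horizon=3.0, dt=0.1):
    """Smallest distance between our planned path (a function of time)
    and the predicted object positions over the time horizon."""
    best = float("inf")
    for i in range(int(horizon / dt)):
        t = i * dt
        ox, oy = predict_position(*obj_start, obj_speed, obj_heading, t)
        px, py = own_path(t)
        best = min(best, math.hypot(px - ox, py - oy))
    return best
```

For the lorry pass described at the start of the article, such a check would confirm before the vehicles meet that the closest approach stays above a safe lateral margin.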

Besides the add-on gadgetry, the Leaf itself was tweaked by Nissan with a view to making it as competent and comfortable as possible when tackling varied road conditions. The test cars have a bespoke computerised chassis control system, along with upgraded steering and adaptive suspension. The onboard computer continuously monitors the road and calculates how much traction the car’s tyres have, adjusting power delivery, braking and suspension to suit. The evolvAD team says this is another area in which it’s ahead of Waymo, whose I-Paces can’t adapt to the road conditions automatically and continuously like the Leaf.
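The continuous traction monitoring described above – estimating grip and scaling power and braking to suit – can be illustrated with a toy wheel-slip calculation. This is a hypothetical sketch of the principle, not the evolvAD chassis control code.

```python
def traction_estimate(wheel_speed_mps: float, vehicle_speed_mps: float) -> float:
    """Crude traction proxy: 1.0 means no slip; lower values mean the
    wheels are turning faster (or slower) than the car is moving."""
    if wheel_speed_mps <= 0:
        return 1.0
    slip = abs(wheel_speed_mps - vehicle_speed_mps) / wheel_speed_mps
    return max(0.0, 1.0 - slip)

def adjust_power(requested_power: float, traction: float) -> float:
    """Scale power delivery down as estimated traction falls,
    analogous to the continuous adjustment the article describes."""
    return requested_power * traction
```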
The evolvAD test cars have also been programmed to drive confidently (in the manner of an advanced driver, the team says), so they don’t hold up or frustrate other road users to the point of wanting to overtake. The test cars can even autonomously plan the best places for overtaking other vehicles, taking into account blind spots and the visibility of oncoming vehicles, while obeying speed limits.

From my ride in the Leaf, I can confirm that it emulates a skilled driver in most situations, accelerating strongly when it reaches clear stretches of national speed limit road and smoothly applying the brakes to wipe off speed as it approaches bends. On suburban streets, it can be hesitant at roundabouts, because it’s programmed to wait until another vehicle to its right has actually started to leave the roundabout, rather than moving off as soon as it sees the car indicating, as a human might. However, it is good at judging the speed of oncoming vehicles and pulling out ahead of slow traffic at crossroads and T-junctions.
Importantly, the evolvAD Leaf behaves in the way you’d expect of a capable and careful driver, holding its own in cut-and-thrust urban traffic and on open roads alike – which means it should be respected by most other drivers.
Of course, you can’t run before you can walk, so the evolvAD Leafs were initially trained to be at home in less complicated conditions. The process began with HumanDrive, a 30-month project based on trunk roads and motorways.
The objective was to enable self-driving cars to change lanes, merge with other traffic and stop and start as necessary. To achieve this, the cars used GPS data for location, radar and lidar sensors to measure distances in all weather and lighting conditions, and cameras to build up an accurate picture of the changing environment around the car. The culmination of this work, in 2020, was a 230-mile fully autonomous drive from Cranfield in Bedfordshire to Sunderland. It set a record as the longest journey completed by a self-driving vehicle in the UK.

HumanDrive was followed by ServCity, a three-year challenge that enabled the car to negotiate the complex urban roads and traffic situations found in a specific area in Greenwich, south London, the conditions here being analogous to those found in many UK cities. On top of the data used previously, ServCity made use of information from a dedicated series of roadside sensors that fed details of the positioning and speed of other vehicles outside the test car’s field of vision.
What happens next?
The long-term aim of evolvAD and the two previous projects is to create vehicles that can provide a self-driving mobility service in various locations around the UK from 2028, when it’s expected that legislation will be passed that allows them to operate here.
Nissan and its partners will soon begin to assess various cities and regions across the UK to determine which are the most suitable for the autonomous driving service.
In the meantime, Nissan has just launched a pilot scheme called Easy Ride in Yokohama, Japan, where autonomous vehicles are already permitted and where there is a shortage of human drivers due to the ageing population. Easy Ride’s fleet of autonomous Nissan Serena eight-seaters uses technology developed during the UK-based research, collecting passengers from designated points. It’s said to be cheaper to use than taxis – but pricier than bus travel.
Who are the other operators of self-driving cars?
Many companies around the world have been working on self-driving technology for more than a decade. The best known is Waymo, which was formerly known as the Google Self-Driving Car Project.

Its driverless Jaguar I-Paces provide pre-booked taxi services in the US cities of Austin, Los Angeles, Phoenix and San Francisco, and are expected to be available soon in Miami, as well as in some Japanese cities. They can be booked via the Waymo One app, and up to four people can travel in each vehicle.
Before the service goes live in a new area, the entire road system is mapped, and this information is used alongside real-time data and AI to plan the best routes and enable the cars to navigate them successfully. According to Waymo, the vehicles also use driving experience information built up from more than 20 million miles of driving.

Tesla – a big proponent of self-driving technology – is also planning an autonomous ride hailing service. Unveiled as a concept in October 2024, the Cybercab is a two-passenger, 35kWh electric self-driving car with no steering wheel or pedals. It has two butterfly doors but no door handles, because the doors open automatically. It also has no rear window and no door mirrors. Nor is there an external charging port; instead, the production vehicles will apparently charge from inductive pads placed beneath the vehicle.
The Cybercab fleet was originally proposed to go live this June, but the 145% tariff imposed by the US on Chinese imports has led the company to halt shipments of components for production of the vehicles, so that target date might not be achieved.
Can you teach driverless cars ethics?
By Dan Jones
One of the key questions that people have when it comes to autonomous vehicles (AVs) is how they might respond if a situation arises where they need to decide whether or not to risk one life to save many.
This moral dilemma – dubbed the Trolley Problem – has been debated by philosophers and others since Philippa Foot introduced it back in 1967. Now, with the advent of driverless cars, the AI in charge faces this ethical quandary in the absence of a human brain behind the wheel.
In the Trolley Problem, a train is approaching five people on a track. It can switch onto another track, but then it will definitely hit one person. What is the right thing to do? This is analogous to a situation where an AV carrying four people might avoid a fatal crash by driving on to the pavement and killing one – the kind of decision a human driver might have to make in a split second.

In a 2022 paper, Chris Gerdes, professor emeritus of mechanical engineering and co-director of the Centre for Automotive Research at Stanford (CARS), and experts at Ford Motor Co concluded that the value of one life over another shouldn’t be a consideration. Instead, in the event of an unavoidable accident, we should aim not to draw external factors into the crash scenario.
So, for example, if a motorbike suddenly pulls into the path of the AV, requiring it to swerve into oncoming traffic that isn’t otherwise involved in the situation, it shouldn’t.
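One reading of that principle can be expressed as a simple selection rule: prefer any manoeuvre whose consequences fall only on road users already involved in the situation. This is a hypothetical sketch of the idea as described above, not code from the paper or from any real AV.

```python
from dataclasses import dataclass

@dataclass
class Manoeuvre:
    name: str
    parties: set[str]  # road users this manoeuvre would bring into the crash

def choose_manoeuvre(options: list[Manoeuvre], involved: set[str]) -> Manoeuvre:
    """Prefer manoeuvres affecting only parties already involved; if none
    exists, pick the one that draws in the fewest external parties."""
    safe = [m for m in options if m.parties <= involved]
    if safe:
        return safe[0]
    return min(options, key=lambda m: len(m.parties - involved))
```

In the motorbike example, swerving into oncoming traffic draws in an uninvolved driver, so braking hard within the existing situation is preferred.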
The team also talked about the importance of teaching driverless cars the same social contract that road users have with each other, including – if necessary – breaking driving laws in order to avoid such situations.
By the time a human driver gains their driving licence, they’re likely to have already been exposed to countless moral dilemmas that help them to make reasoned decisions when faced with high-pressure situations on the road. It is anticipated that machine learning will enable the AI of driverless cars to behave in a way similar to humans.
GAIA-2, a technology from AI specialist Wayve, enables the electronic brains of autonomous cars to be exposed to complex simulated scenarios to hone their decision-making capabilities. This could enable AVs to respond to real-world driving situations – including potentially life-threatening ones – in a safe and consistent way. Doing this in a virtual space allows the car’s brain to be tested in thousands of scenarios much faster than in real-world testing and without risk to hardware or life.
While no one has produced a definitive answer to the Trolley Problem, the ability of AVs to behave more like human drivers is steadily improving.
For all the latest reviews, advice and new car deals, sign up to the What Car? newsletter here