A brief history of computing in Formula 1
Modern Formula 1 teams use thousands of cutting-edge computers to measure, control, analyse and simulate every aspect of a grand prix car. McLaren software engineer Chris Alexander investigates the history of computer technology in the sport.
From onboard specialist electronics to countless virtual servers in data-centres around the world, computers are pervasive in every aspect of Formula 1 engineering. But how did the technology get to where it is today? Much like the nature of the sport itself, and, indeed, the mobile or desktop machine upon which you’re reading this very article, the journey of computers in F1 is a story of both power and speed.
As you can well imagine, the early years of Formula 1 weren’t heavily influenced by computer technology. In fact, when the world championship started in 1950, the first programmable computer had been invented just the year before. The Electronic Delay Storage Automatic Calculator (EDSAC) machine, as it was called, was built at the University of Cambridge and programmed with five-hole punch tape. Due to the primitive nature of the technology, it took up the same amount of floor space as two McLaren MP4-31s, and it took many hours just to input a simple program!
Formula 1 cars remained completely mechanical devices through the 1960s, designed on traditional drawing boards by multi-skilled engineers armed with a retractable pencil and a fancy set of French curves.
When Bruce McLaren and Denny Hulme raced McLaren’s earliest grand prix cars in the late 1960s, the driver was the key instrument in analysing and understanding car performance. A simple mistake by a driver, or a failure to understand what the car was ‘saying’ to him, could easily result in a retirement.
For instance, in the 1967 Monaco Grand Prix, Bruce pitted with a misfire that he mistakenly thought had been caused by low fuel; his error was pointed out to him in the pits by Jack Brabham, and he returned to the fray, eventually finishing fourth.
In modern F1 cars, thousands of data parameters are measured every second, so engineers at the track and back at base can analyse issues with the car without it even needing to come into the pits.
It wasn’t until the 1970s that advances in electronic components and microprocessors allowed for the introduction of what we’d recognise today as a microcomputer. McLaren first deployed telemetry – collecting data about the car – in 1975, and not in F1 but in the company’s IndyCar effort, capturing 14 different pieces of information about the car that could be downloaded back at the garage.
To put that in perspective, that’s about the same number of different pieces of information a modern smartphone can capture about its environment.
As with the home computer boom, on-car electronic technology began to significantly improve in the 1980s. As both electronic and analogue systems became lighter, smaller and more powerful – key aspects of any piece of equipment added to an F1 car – teams and especially engine manufacturers began to run more complex systems onboard.
The first onboard electronics performed engine-management tasks in addition to telemetry, improving car reliability and performance. These management systems were precursors of what you would find in your modern road car, helping to improve the efficiency and reliability of the engine while performing diagnostics and tracking journeys.
In F1, the very first of these electronic systems were onboard only, lacking the ability to transmit back to the pits. Instead, technicians would download the data from the onboard memory when the car was back in the garage. Initially, storage was limited to just one lap’s worth of data, so the driver would be given a signal on the pit-board to turn on the telemetry for a particular lap, and the data would then be taken off the car when it returned to the garage. Tall, rack-mounted computers increasingly began to occupy garage space, alongside the more conventional mechanics’ tools.
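That one-lap onboard memory behaves like a fixed-size ring buffer: once it is full, the oldest samples are discarded, so only the most recent lap survives until the car returns to the garage. The sketch below models the idea; the channel names, sample counts and class design are illustrative assumptions, not details of any period system.

```python
from collections import deque

class OnboardRecorder:
    """Toy model of an early onboard logger with memory for one lap only.
    Channel names and sample counts are invented for illustration."""

    def __init__(self, samples_per_lap):
        # Ring buffer: once full, the oldest samples are overwritten,
        # so only the most recent lap's worth of data survives.
        self.buffer = deque(maxlen=samples_per_lap)
        self.enabled = False  # toggled by the driver on a pit-board signal

    def record(self, sample):
        if self.enabled:
            self.buffer.append(sample)

    def download(self):
        # 'Downloading' in the garage empties the onboard memory.
        data = list(self.buffer)
        self.buffer.clear()
        return data

recorder = OnboardRecorder(samples_per_lap=100)
recorder.enabled = True
for t in range(250):  # two and a half 'laps' of samples
    recorder.record({"t": t, "rpm": 9000 + t})
lap_data = recorder.download()
print(len(lap_data))  # only the last 100 samples remain onboard
```

The `deque(maxlen=...)` does the overwriting automatically, which is why only the final lap's samples come off the car.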
These faltering steps marked the beginnings of the data age in Formula 1.
The early 1980s also saw the introduction of electronic engine management systems. When McLaren introduced the TAG Turbo engine for the MP4/1E in 1983, it came with an advanced Bosch system that brought together the control of fuel injection and ignition within the same unit. This allowed the electronics to control power, drivability, and fuel efficiency to a much greater degree than had previously been possible.
Fuel usage was an important problem to solve at the time. In 1985, cars were limited to 220 litres of fuel with no refuelling allowed; in 1986, the limit was tightened further to 195 litres, meaning that accurate and efficient monitoring of fuel became increasingly important.
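The arithmetic behind such fuel monitoring is simple to sketch: project average consumption over the laps remaining and compare it with the fuel left in the tank. The figures below are illustrative, not period data.

```python
# Hedged sketch of the projection behind a cockpit fuel readout.
# All numbers here are invented for illustration.

def fuel_margin(fuel_remaining_l, avg_litres_per_lap, laps_remaining):
    """Litres to spare at the flag; negative means 'in the red'."""
    needed = avg_litres_per_lap * laps_remaining
    return fuel_remaining_l - needed

# With 40 litres left, averaging 2.6 litres a lap over 16 laps,
# the projection comes out negative: the driver must save fuel.
margin = fuel_margin(fuel_remaining_l=40.0,
                     avg_litres_per_lap=2.6,
                     laps_remaining=16)
print(f"Projected margin: {margin:+.1f} litres")
```

The projection is only as good as the consumption estimate, which is one reason readouts of the era could mislead a driver.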
McLaren’s 1985 MP4/2B was the team’s first to feature an electronic readout in the cockpit detailing the fuel level remaining. Using this technology, Alain Prost crossed the line first in that year’s San Marino Grand Prix after Ayrton Senna’s Lotus and Stefan Johansson’s Ferrari both ran out of fuel ahead of him (Prost was later disqualified when his car was found to be underweight).
The system was still unreliable, however; Prost famously threw caution to the wind to win his second world title in Adelaide in 1986, despite his fuel readout warning him that he was severely in the red. Luckily for the Frenchman, it was wrong!
As with all things in F1, speed was of the essence, and waiting to download the data from the car took too long before it could be made useful. By the second half of the ’80s, the first streams of data were becoming available in the garage before the car had made it back to the pitlane.
This was ‘burst’ telemetry – in which the car would use radio signals to broadcast key pieces of data back to the garage as it went past the pits on each lap – resulting in a ‘burst’ of data for that lap. This small sample was then available to engineers several minutes before the car came back to the pits and the fuller picture of recorded data could be examined.
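A burst of this kind can be thought of as a compact summary of a lap's key channels, radioed once per lap while the full per-sample record stays onboard. The sketch below illustrates the concept; the channel names and summary format are assumptions for the example, not the actual radio protocol.

```python
# Sketch of 'burst' telemetry: the car accumulates samples over a lap,
# then sends a small summary of key channels as it passes the pits.
# Channel names and the summary shape are invented for illustration.

def summarise_lap(samples, key_channels):
    burst = {}
    for ch in key_channels:
        values = [s[ch] for s in samples]
        burst[ch] = {"min": min(values),
                     "max": max(values),
                     "last": values[-1]}
    return burst

lap_samples = [
    {"rpm": 9500,  "fuel_l": 120.0, "water_c": 88},
    {"rpm": 11200, "fuel_l": 119.4, "water_c": 91},
    {"rpm": 10100, "fuel_l": 118.9, "water_c": 93},
]

# Only the chosen key channels make it into the burst; the rest of the
# data waits onboard until the car returns to the garage.
burst = summarise_lap(lap_samples, ["rpm", "fuel_l"])
print(burst)
```

Engineers then had minutes with this small sample before the complete recorded data could be downloaded.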
Despite the advances made during the sport’s first 40 years, it was the 1990s that finally saw an explosion of computing capability – on the car itself, but also throughout the whole team.
By 1993, the sport had fully exploited computer technology to run ‘active-ride’ cars. It was an era that relied on electronic control systems even more than today’s cars do: active-ride suspension kept the cars stable; power-steering assisted the driver; power-braking increased traction into corners; and traction control facilitated the smoothest possible exit.
Much more data needed to be collected from the car, and analysed at a much higher frequency than ever before. That job was given to a series of ever faster, more powerful machines. As the technology on the cars advanced, the equipment used to download and stream data in the garage became increasingly sophisticated. In turn, the machinery back at the factory became bigger and faster.
So while the ruling authorities began to rein in the use of assistive technology on the cars themselves, the sport ramped up its use of computers in every other area. It marked the beginning of the sport’s wider digital overhaul.
Nowadays, F1 relies on the internet to stream everything – from telemetry to television – around the world, connected at 10x the speed of regular home broadband.
In the 1960s, when electronic systems were just starting to be used in Formula 1, the internet had not even been invented; it was 1969 before ARPANET, the first large-scale network, connected together four machines at universities in America. This network was so slow by today’s standards that it would have taken more than five hours to send a three-minute music track from one machine to another.
In 2016, you could transfer the same amount of data from trackside at the Australian Grand Prix to the McLaren Technology Campus in just hundredths of a second!
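The comparison is straightforward to work through, assuming a three-minute track is roughly a 3 MB MP3 and taking illustrative link speeds (the effective throughput figures below are assumptions chosen to match the article's orders of magnitude, not measured values).

```python
# Back-of-envelope transfer times for a ~3 MB music track.
# Both throughput figures are illustrative assumptions.

def transfer_seconds(size_bytes, bits_per_second):
    return size_bytes * 8 / bits_per_second

SONG_BYTES = 3 * 1024 * 1024  # ~3 MB for a three-minute MP3 (assumption)

# Early ARPANET: effective end-to-end throughput of ~1 kbit/s (assumed,
# well below the nominal line rate) gives a transfer of several hours.
arpanet = transfer_seconds(SONG_BYTES, 1_000)
print(f"ARPANET-era: {arpanet / 3600:.1f} hours")

# A modern 1 Gbit/s trackside link (illustrative) moves the same file
# in a few hundredths of a second.
modern = transfer_seconds(SONG_BYTES, 1_000_000_000)
print(f"Modern link: {modern * 1000:.1f} milliseconds")
```

The ratio between the two results, roughly a million to one, is the real point of the comparison.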
With constraints on the number of personnel allowed at events, and the amount of equipment that can realistically be freighted around the world, there are now teams of engineers back at every team’s base who have the same capability to access data as their trackside colleagues.
Real-time telemetry data is streamed across the world when the car is on the circuit, and engineers can collaborate on analysis and share simulation data within seconds between the factory and the track. Speed in this process is essential to ensure that as much useful development data as possible is produced during the short but intense practice sessions held on grand prix weekends.
The computers used in a modern F1 team are without compromise: top-of-the-range.
Ultra-portable laptops provide on-the-road engineers with the access to data, simulation and analysis tools they need to optimise the car’s performance at the next event. High-performance workstations give groups at base the ability to quickly run complex simulations on data from a variety of sources. Specialist software such as SAP HANA allows engineers to search thousands of laps of data for exactly the piece of information they need to help them with performance in a race weekend.
And specially designed hardware clusters – groups of tens to hundreds of computers working together on complex mathematics – power computational fluid dynamics (CFD) systems, used for improving aerodynamic performance, as well as running the simulator that lets the driver develop the car when not on track.
In addition to these physical computers, all teams leverage cloud-computing: unlike traditional machines, cloud computers are fully virtual, running in massive data-centres around the world and accessed over the internet.
When an engineering team needs a complex problem solved or an enormous volume of data analysed, the cloud can provide thousands of these virtual machines on-demand, dedicated to the task at hand. This technology offers an unprecedented speed and throughput capability with an amount of power which could not feasibly be matched by computers on-site at the factory or track. Additionally, special links can be established over the internet allowing faster transfer of data between the team and the cloud servers, as well as providing first-class security for the sensitive data.
To harness the power of computing now available to them, F1 engineers use sophisticated and custom-made software that you wouldn’t find on your home or office computer.
At McLaren, we’ve developed our own highly sophisticated data analysis and simulation platform, which enables every single engineer within the team to leverage our historical and real-time systems data. This platform unifies access to a wide variety of data, from the cars on track at grands prix, to laps driven by our test drivers in the simulators; from aerodynamic data generated by the windtunnel to specialised test equipment for specific car components, such as the clutch or brakes.
Because all of this data can be accessed in the same way, new exploratory and analysis tools can be quickly and easily developed for emerging needs, as and when they occur in the fast-paced environment of Formula 1. This data platform also provides a solid foundation for numerous specialised, high-performance data applications for specific engineering disciplines. Virtually every engineering group within the team – from suspension to brakes, chassis to race engineering – has its own dedicated suite of software tools to analyse the data that matters most to it.
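The value of unifying access like this can be pictured with a small interface sketch: once every source, whether track telemetry or a simulator run, exposes its data the same way, an analysis tool written once works against all of them. The actual McLaren platform is proprietary; the class and channel names below are invented purely for illustration.

```python
# Toy illustration of unified data access: heterogeneous sources behind
# one interface. All names here are invented, not the real platform.
from abc import ABC, abstractmethod

class DataSource(ABC):
    """Common contract: every source exposes named channels of samples."""

    @abstractmethod
    def channels(self):
        ...

    @abstractmethod
    def read(self, channel):
        ...

class TrackTelemetry(DataSource):
    def __init__(self, samples):
        self._samples = samples

    def channels(self):
        return sorted(self._samples)

    def read(self, channel):
        return self._samples[channel]

class SimulatorRun(DataSource):
    def __init__(self, samples):
        self._samples = samples

    def channels(self):
        return sorted(self._samples)

    def read(self, channel):
        return self._samples[channel]

def peak(source, channel):
    # Written once against DataSource, so it works for any source type.
    return max(source.read(channel))

track = TrackTelemetry({"speed_kph": [280.1, 311.7, 296.4]})
sim = SimulatorRun({"speed_kph": [305.0, 298.2]})
print(peak(track, "speed_kph"), peak(sim, "speed_kph"))
```

The same pattern is what lets new exploratory tools be built quickly: they target the shared interface rather than each data format individually.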
The use of computers in Formula 1 has changed the face of the sport and contributed immeasurably to the engineering process of building fast cars. It continues to allow teams to push the boundaries of simulation, development and analysis technology, and go racing with well set-up and optimised cars at the beginning of race weekends.
As Formula 1 evolves, its use of computers and software continues to advance at the rapid pace required to support the sport’s ever-changing design and engineering challenges.