
A day in the life of an F1 IT Operations Engineer, with Matt Berger – presented by Dell Technologies

Part III: McLaren’s unsung heroes – how our IT Engineers keep the team on track


The McLaren Formula 1 Team is a data-driven organisation: anything we can measure, we can improve, and a Formula 1 team is in a state of continuous improvement.

Acquiring, transmitting and analysing data is what enables us to do that. And, when things are running as they should, it isn’t something the majority of the team needs to think about: data simply flows. Except, of course, it doesn’t simply do anything. There’s an IT operation running 24/7 that keeps McLaren in the game.

An F1 garage is somewhat like a shiny, high-tech version of Mary Poppins’ famous carpet bag. It’s a small space, but we pack a lot in: the front of the garage contains two bays for the cars, the engineers’ island in the centre, fuel bowsers, a lighting rig, workbenches, and tool drawers.

Behind that lies a larger warren of workshops and partitioned rooms, including a bodywork shop, power unit (PU) shop, gearbox shop, and a chemistry lab, in addition to storage and workbenches. It is also home to our IT systems engineers and their IT Tower.

The trackside team usually includes one or two IT systems engineers. They’re not responsible for the electronic systems on the car, but they are accountable for the acquisition, storage, and transmission of the telemetry that comes off it. They also look after communications and the myriad everyday IT helpdesk tasks that keep a workplace running, such as making sure the Wi-Fi works and that laptops run smoothly and stay fully updated.

In a four-part series with Dell Technologies, the McLaren Formula 1 Team’s Official Innovation Partner, we’re exploring how different teams across the business utilise data with the help of Dell Technologies on a typical working day. For this third piece, we spent the day with F1 IT Operations Engineer Matt Berger at a TPC (Testing of Previous Cars) event in Barcelona.


Trackside with IT

There’s a temptation to think of the F1 season as 24 separate weekend events, but the reality is rather more full-time. From set-up to pack-down, a Grand Prix ‘weekend’ often spans seven or eight days. On top of this, we have pre- and post-season tests, Pirelli tyre tests, and TPC events.

While in-season testing is largely prohibited, there is a dispensation to test cars that are two or more years old at TPC events. For many at the track, these events are a marked step down from the intensity of a Grand Prix weekend, but for IT Systems, the job list remains largely unchanged.

“The working day actually begins at the hotel, the moment I get up,” Matt says. “From my hotel room, I will remotely start up the servers in the garage. Then, when we arrive at the track, I’ll check the operational status and the connectivity back to the MTC – we’ll be ready to start running once all of that is confirmed.”

This is an important point: modern F1 cars can’t operate without the support of the garage IT infrastructure, whether that’s a PU Engineer looking for a board of green lights to confirm the engine is operating within acceptable parameters, or a Race Engineer talking to the driver while the car is racing at 360km/h, 4km away.

Without the IT systems, the car won’t leave the garage. This is why Matt calls the Dell VxRail servers in the IT Tower “the beating heart” of the operation.


Communications and telemetry

At a TPC, the day typically consists of two four-hour sessions, separated by a lunch hour. If the green light is scheduled for 09:00, the team will arrive at the track around 07:00. Then, at around 08:45, there will be a comms check, ensuring everyone, from the pit wall to the garage and those at the McLaren Technology Centre (MTC), can communicate with each other. Before that can happen, the IT systems engineer needs to ensure the comms system is functioning properly.

“The infrastructure managing the intercom will record four channels: the driver, the engineers, the mechanics, and information, to be played back later if required,” Matt says. “I’ll make sure it has connectivity and that the correct channel information is being displayed. I’ll start the recorder a few minutes before the comms check and, assuming everyone can hear each other loud and clear, we’ll start running soon after that.”

The recorded audio is sometimes analysed afterwards, while the telemetry gathered by the Dell VxRail servers is distributed in real time, as well as being recorded for later analysis.

“We use three servers for telemetry and data acquisition from the car, while there are three more at the bottom of the tower, which form the cluster that holds our virtual machines,” says Matt. “Today, we’re running 37 virtual machines. They have many functions – for example, if the driver sitting in the cockpit in the garage wants to see how he performed from one lap to the next on the previous run, that data can be replayed from the servers directly to the monitors sitting on the bulkhead of his car.


Preparing for all eventualities

“The tower also has a battery backup. If we have a prolonged power outage or any sort of issue with circuit power, it allows us to manage a graceful shutdown, safeguarding the data stored within. It is, as I say, the beating heart of our IT infrastructure, and we can’t run without it – or at least, it would be impossible to run effectively.”
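The logic Matt describes can be pictured as a simple watchdog: if mains power stays lost past a grace period, flush the data, stop the virtual machines, then power the hosts off. The sketch below is purely illustrative – every name, threshold, and callback is an assumption for the example, not McLaren’s actual tooling.

```python
# Illustrative sketch of a UPS graceful-shutdown watchdog.
# All names and thresholds are invented for this example.

import time

GRACE_SECONDS = 120  # assumed time to ride out a brownout before reacting

def watch_ups(on_battery, flush_data, stop_vms, power_off,
              grace=GRACE_SECONDS, poll=1.0,
              now=time.monotonic, sleep=time.sleep):
    """Poll the UPS; once mains has been lost for longer than `grace`
    seconds, shut down in a safe order and return."""
    battery_since = None
    while True:
        if on_battery():
            if battery_since is None:
                battery_since = now()          # mains just dropped
            elif now() - battery_since >= grace:
                flush_data()   # safeguard the stored telemetry first
                stop_vms()     # cleanly stop the virtual machines
                power_off()    # then take the hosts down
                return
        else:
            battery_since = None               # mains restored: reset
        sleep(poll)
```

The ordering is the point: data is made safe before any host goes down, which is what makes the shutdown “graceful” rather than abrupt.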

It’s very rare for things to go quite that wrong – though a complete power outage is not entirely unheard of. Smaller, more manageable issues, however, are par for the course: telemetry nodes around the track fail, circuit power suffers brownouts, and so on. The IT system is built with enough robustness to cope with problems of this nature, allowing the team to keep going regardless.


Plugging the [data] holes

“My interactions with the Race Engineers are predominantly from an operational standpoint,” says Matt. “If there were any issues with the telemetry during the run – if they experience a drop, or what we would call ‘gappy’ telemetry, where we’re not seeing continuous data streams – we would interact with our partners who provide the circuit telemetry.

“As the car is moving around, the telemetry is gathered over radio frequencies, but when the car is in the garage, it’s plugged in and has a hard line to the infrastructure. There’s a difference between what the car transmits out on track and what it downloads in the garage. Comparing the two helps us troubleshoot and eliminate any gaps where the data may be failing.”
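The comparison Matt describes can be sketched in a few lines: diff the timestamps received over radio against the complete log downloaded in the garage, and any windows present in the download but missing from the live stream are the gaps. The function and data layout below are invented for illustration, assuming one sample per timestamp step.

```python
# Illustrative sketch: find windows of 'gappy' telemetry by comparing
# the live radio stream with the full garage download.
# Assumes integer timestamps spaced `step` apart; names are hypothetical.

def find_gaps(live_ts, full_ts, step=1):
    """Return (start, end) timestamp windows present in the garage
    download but missing from the live radio stream."""
    missing = sorted(set(full_ts) - set(live_ts))
    gaps = []
    for t in missing:
        if gaps and t - gaps[-1][1] <= step:
            gaps[-1] = (gaps[-1][0], t)   # extend a contiguous gap
        else:
            gaps.append((t, t))           # start a new gap window
    return gaps
```

With a full log of samples 0–9 and a live stream that dropped samples 3–4 and 7–8, this returns `[(3, 4), (7, 8)]` – the windows an engineer would then chase with the circuit telemetry provider.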

IT deals with other issues that are rather more mundane, and one of the biggest is temperature. Just as the car needs an efficient cooling package, so too does our IT Tower.

The internal fans use colour-coded LEDs to indicate status, resembling a supersized version of what hobbyist PC builders would recognise. The tower is bathed in reassuring green when everything is cool, amber when temperatures are rising, and red when things are critical – though the systems engineers have warning protocols too.
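The green/amber/red scheme above amounts to a simple threshold mapping. A toy version is below; the temperature cut-offs are invented for illustration, not the tower’s real values.

```python
# Toy mapping of temperature to the LED status colours described above.
# Threshold values are assumptions for the example.

AMBER_AT = 35.0  # degrees C, assumed warning threshold
RED_AT = 45.0    # degrees C, assumed critical threshold

def led_colour(temp_c):
    if temp_c >= RED_AT:
        return "red"      # critical: act immediately
    if temp_c >= AMBER_AT:
        return "amber"    # rising: keep an eye on cooling
    return "green"        # everything within normal range
```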


Setting up for an event

This, adds Matt, is not the most challenging part of the job. That comes earlier in the event, when the garage is being set up. For a Grand Prix, the advance members of the crew typically begin arriving at the track a week before the race. They’ll unpack freight, begin building the shell of the garage and start connecting it to basic services and utilities. The next stage, before the team begins building a car, is to set up the IT infrastructure.

“Each garage we go to is different,” says Matt. “We’ll have a garage plan and know where we’re going to position the key pieces of equipment, but the differences between circuits can make access difficult. For example, running cables to various key pieces of infrastructure isn’t always straightforward. It’s not just a question of connectivity: we also need to keep everything neat and tidy.”

“Normally, we’ll have anywhere up to two days to fully set up the garage, but that isn’t always the case. A track may have an event on before us, which is the usual reason why we might have less time.

“Before the event begins, I’ll be in contact with the circuit infrastructure team to make sure we’ve got an internet connection when we arrive at the track. Once we arrive, the first thing I’ll check is that we’ve got a link back to the MTC. After that, I get started on powering up the IT kit. The tower is first, and once that’s up and running, I’ll move on to the rest of the garage.”


Packing down

What goes up, must, of course, come down. Pack-down tends to be an all-hands-to-the-pump operation: everyone dons a high-vis and pitches in, working under the direction of the garage techs and IT systems engineers.

The IT Tower is one of the final things to be packed into flight cases, once the final data transfers have been made to the MTC. It’s hard, physical work – but after a successful event, it’s also good-humoured.

“Everyone’s looking out for each other, and I’m very happy to be part of that,” says Matt. “When the shutters are open and the visor comes down, we’re very serious. There is some pressure, but the team is there to help you absorb that pressure. When we’re done for the day, we also know how to relax! I really enjoy working in IT, and I’ve been a fan of F1 for a long time, so getting to combine those two things… it's hard work… but it doesn’t necessarily feel like it.”

As Official Innovation Partner of McLaren Formula 1 Team, Dell Technologies enables McLaren Racing’s team to anticipate and solve problems before they occur. Find out more here.