Driving Innovation at Ford Motor Company - An Interview With Ken Johnston

An exclusive interview with Ken Johnston, VP Data Platforms and Telematics 

Ford Mustang Autonomic

Meet Ken Johnston, one of the visionary leaders behind Ford's Data Platforms and Telematics group. As an executive engineering manager, Ken brings over two decades of software engineering expertise to the table. His impressive record includes managing large multinational engineering teams ranging from forty to seven hundred employees across the United States, Canada, Mexico, Brazil, Europe, and Asia.

With a deep knowledge of cloud services, data science, and ethical AI, Ken has driven innovation in areas such as cloud-based payment processing, commercial entity graph development, web and local search optimization, cloud-scale capacity planning, and more.

Before joining Ford, Ken was Data Science and Data Engineering Manager for Responsible AI at Microsoft. At Ford, he has also served as CEO of Autonomic, a Ford subsidiary.

 


Connie:

Let's start with an overview of AI initiatives: Can you provide an overview of the AI and LLM initiatives currently underway at Ford in the auto sector? What are the key areas where AI is making a significant impact?

 

Ken:

AI is a very big part of Ford. Clearly, it is the centerpiece of our BlueCruise autonomous vehicle work (which was the highest-rated system in 2023 according to Consumer Reports). AI is used all over Ford. For example, we have a model that predicts optimal color contrasts for our in-vehicle displays under different lighting conditions in the car. We're partnering with Amazon to improve the Alexa integration in our cars. When we build cars, we have models for predicting part demand.

 


We're even working to combine physics models with AI to improve predictive maintenance. We know customers own and use our cars to get places, so keeping them running with the fewest interruptions is the best way to create happier customers.

 

So, I’d say AI is everywhere.   

 

Author's note - learn more about: BlueCruise Semi-Autonomous Driving

 

Ford BlueCruise

Connie:

On Predictive Maintenance, could you elaborate on how predictive maintenance powered by AI improves vehicle reliability and customer satisfaction? Are there any specific success stories or examples you can share?

 

Ken:

Predictive maintenance comes down to leveraging all the rich telemetry that can be emitted from vehicles. We have a responsibility to customers to protect their privacy, and also a responsibility for safety, so it is important that we collect data on vehicle performance while maintaining proper consent from all customers. With this vehicle data we can now build multiple types of models.

 

There are models we call "Prognostics," and what they really do is predict the RUL (Remaining Useful Life) of a part or fluid. This approach actually stems from the aviation sector, where it is vital to predict and schedule maintenance. Downtime for commercial aircraft is a huge cost, and missed maintenance is very risky.

 


For example, you may hear things like "you should change your oil every 3,000 miles." That is not a model; that is a rule built from experience. It is not necessarily accurate for your car, your driving habits, or your conditions. A RUL-based model will take actual vehicle telemetry and predict when your oil really needs changing. If you drive, say, at high speeds with a hot engine, your oil will break down faster than a typical daily commuter's would.

I'm not saying that's how I have driven my own 1966 Mustang, but I will say I have observed the impact speed and temperature have on oil. There are many other things we can do in the area of prognostics, like batteries, brakes, and even good old windshield wipers.
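Author's note: to make the contrast with the 3,000-mile rule concrete, here is a minimal, hypothetical sketch of how telemetry such as average speed and oil temperature could feed a remaining-useful-life estimate for engine oil. The feature names, wear factors, and wear budget are illustrative assumptions, not Ford's actual prognostics model.

```python
from dataclasses import dataclass

@dataclass
class OilTelemetrySample:
    """One aggregated telemetry window from the vehicle (illustrative fields)."""
    miles_driven: float    # miles covered in this window
    avg_speed_mph: float   # average speed in the window
    avg_oil_temp_c: float  # average oil temperature in the window

def oil_wear_units(sample: OilTelemetrySample) -> float:
    """Convert a telemetry window into 'wear units'.

    Hypothetical degradation rule: high speeds and hot oil accelerate
    breakdown, so each mile counts for more than one wear unit.
    """
    speed_factor = 1.0 + max(0.0, sample.avg_speed_mph - 65.0) * 0.02
    temp_factor = 1.0 + max(0.0, sample.avg_oil_temp_c - 100.0) * 0.03
    return sample.miles_driven * speed_factor * temp_factor

def remaining_useful_life_miles(history: list[OilTelemetrySample],
                                wear_budget: float = 6000.0) -> float:
    """Estimate how many more 'typical' miles the oil has left.

    wear_budget is an assumed total wear capacity for one oil fill;
    a real RUL model would learn this from fleet and lab data.
    """
    consumed = sum(oil_wear_units(s) for s in history)
    return max(0.0, wear_budget - consumed)

# A spirited driver burns through the budget faster than a daily commuter.
spirited = [OilTelemetrySample(100, 85, 115) for _ in range(20)]
commuter = [OilTelemetrySample(100, 45, 95) for _ in range(20)]
print(remaining_useful_life_miles(spirited))  # far fewer miles left
print(remaining_useful_life_miles(commuter))  # close to the nominal budget
```

The point of the pattern is that the same miles "cost" more or less oil life depending on how they were driven.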

 

Things like that, when predicted and scheduled, make everyone's life better.

 

Beyond these RUL models, we can also predict the risk of part failures to proactively help manage the risk of recalls. Even detecting likely failures early in a product line allows us to modify manufacturing to reduce that risk for vehicles yet to be built.

 

A lot of these models are physics-based, but we have found that deep learning can help boost their predictive power. Imagine being able to predict a service need six or nine months in advance instead of weeks. That allows you to schedule maintenance and ensure parts are available at your convenience instead of after a failure.
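Author's note: one common way to combine a physics model with deep learning, and a plausible reading of the hybrid approach described above, is a residual setup in which the physics formula supplies a baseline and a small network learns the correction from telemetry. The PyTorch sketch below uses invented features and a placeholder physics formula; it illustrates the pattern only, not Ford's implementation.

```python
import torch
import torch.nn as nn

def physics_rul_estimate(features: torch.Tensor) -> torch.Tensor:
    """Stand-in physics baseline: a hand-derived formula over telemetry.

    Assume column 0 is accumulated load cycles and column 1 is mean
    operating temperature (both normalized); real physics models are
    far richer than this placeholder.
    """
    return 1.0 - 0.6 * features[:, 0] - 0.3 * features[:, 1]

class ResidualCorrector(nn.Module):
    """Small network that learns what the physics baseline misses."""
    def __init__(self, n_features: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, 32), nn.ReLU(),
            nn.Linear(32, 1),
        )

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        baseline = physics_rul_estimate(features).unsqueeze(1)
        return baseline + self.net(features)  # physics prior + learned residual

# Training-loop sketch on synthetic data.
model = ResidualCorrector(n_features=4)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

features = torch.rand(256, 4)
true_rul = physics_rul_estimate(features).unsqueeze(1) + 0.05 * torch.randn(256, 1)

for _ in range(200):
    optimizer.zero_grad()
    loss = loss_fn(model(features), true_rul)
    loss.backward()
    optimizer.step()
```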

 

Connie:

Can you discuss the role of Large Language Models in fleet management, particularly in report generation? How are these models transforming the way Ford manages its vehicle fleets?

 


Ken:

Mostly this helps fleet managers analyze what is going on with their fleets. Imagine being able to use speech-to-text and ask your fleet management system how many of my police cars (Ford sells a lot of cars to police departments all over the world) will need brake pads changed next month. For now, we are working on simple scenarios. But in the near future, imagine being able to ask your fleet system, "which drivers are having the most harsh braking events, and is the in-vehicle feedback system having a positive impact on driver behavior?" That's when it gets really fun.
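Author's note: a rough, hedged illustration of the fleet-reporting idea is to let an LLM translate the manager's question into a query over a fleet prognostics table and then summarize the result. The schema, the canned call_llm stub, and the wording are assumptions for this sketch, not Ford's system.

```python
import sqlite3

# Hypothetical fleet prognostics schema used only for this sketch.
SCHEMA = """
CREATE TABLE brake_prognostics (
    vehicle_id TEXT,
    fleet TEXT,                  -- e.g. 'police'
    predicted_service_date TEXT  -- ISO date the pads are predicted to need changing
);
"""

def call_llm(prompt: str) -> str:
    """Placeholder for a real LLM call (e.g., a hosted chat-completion API)."""
    # In this sketch we return a canned translation of the manager's question.
    return (
        "SELECT COUNT(*) FROM brake_prognostics "
        "WHERE fleet = 'police' "
        "AND predicted_service_date BETWEEN '2024-07-01' AND '2024-07-31';"
    )

def answer_fleet_question(question: str, conn: sqlite3.Connection) -> str:
    sql = call_llm(f"Translate to SQL over this schema:\n{SCHEMA}\nQuestion: {question}")
    count = conn.execute(sql).fetchone()[0]
    return f"{count} vehicles are predicted to need brake pads next month."

conn = sqlite3.connect(":memory:")
conn.executescript(SCHEMA)
conn.execute("INSERT INTO brake_prognostics VALUES ('V1', 'police', '2024-07-15')")
print(answer_fleet_question(
    "How many of my police cars will need brake pads changed next month?", conn))
```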

 


 

Connie:

You mentioned deep learning models for optimizing dashboard content. Could you provide more details on how these models work and the benefits they offer to drivers in terms of user interface and content personalization?

 

Ken:

Yes, we build deep learning models to help us in R&D. For example, our dashboards are almost all going digital now, and we let you customize them and update them with all sorts of cool graphics and widgets. We need to think about what combinations of colors might work on, say, a bright sunny day when the light is hitting your dashboard display. I have a Mustang Mach-E, a Bronco Raptor, an F-150 Lightning, and a 1966 Mustang. The 1966 Mustang has dials, but in the other three the dashboard can be customized. I have Baja mode in one, and well, that's for sand dunes; I'd expect the sun to be much brighter when I'm driving on sand dunes than what I typically get here in the Seattle area.

 

Instead of designers picking color combinations and sending them through tests, we can quickly score these combinations across screens and brightness levels to find supported ones. Because it is a deep learning model, adding new displays and building upon the data from previous generations becomes pretty straightforward, saving us a lot of test time.
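Author's note: purely as an assumed sketch, a scoring model like the one described could take a foreground/background color pair plus an ambient-brightness value and emit a legibility score, letting thousands of combinations be ranked without a physical test rig. The architecture and threshold below are illustrative, not Ford's model.

```python
import torch
import torch.nn as nn

class ContrastScorer(nn.Module):
    """Scores how legible a foreground/background color pair is at a given
    ambient brightness. Input: [fg_r, fg_g, fg_b, bg_r, bg_g, bg_b, lux_norm]."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(7, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, 1), nn.Sigmoid(),  # 0 = unreadable, 1 = clearly legible
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

def rank_combinations(model: ContrastScorer,
                      candidates: torch.Tensor,
                      threshold: float = 0.8) -> torch.Tensor:
    """Return only the candidate rows whose predicted legibility clears the bar."""
    with torch.no_grad():
        scores = model(candidates).squeeze(1)
    return candidates[scores >= threshold]

# Score a batch of candidate color pairs for a bright-sun scenario (lux_norm = 1.0).
model = ContrastScorer()  # weights would come from labeled legibility studies
candidates = torch.rand(1000, 7)
candidates[:, 6] = 1.0
approved = rank_combinations(model, candidates)
print(f"{approved.shape[0]} of {candidates.shape[0]} combinations pass")
```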

 

Ford F-150 Lightning

 

Connie:

While you mentioned it's not your focus, can you briefly touch upon Ford's approach to autonomous driving and the role of AI in achieving this goal?

 

Ken:

You are right, autonomous driving is not my focus, but it is a big focus for Ford and the industry as a whole. Our division president Doug Field often talks about the importance of continuing to improve the hands-off driving experience and that full autonomy, while a goal, is not the most important goal.

 

The gift of time is the most important next step. I have driven hundreds of miles in autonomous mode, and while I have to keep my eyes on the road for safety reasons, just being able to rest my hands and take a drink is a nice break. I mean a really nice break. Not hunching over the steering wheel for tens of minutes at a time is a much better way to do a road trip.

 

Soon I will be able to take my eyes off the road and legally check text messages or answer interview questions for a blog post from a great friend like Connie Yang while my car drives me.

 

Today, believe it or not, autonomous driving is signed off mile by mile, road by road. At Ford we collect data from our cars and constantly train and evaluate our models, adding hundreds of thousands of miles of autonomous road as quickly as we can. We even know when a model was trained on a road that is now under construction, and at least for now, for safety reasons, we'll ask the driver to keep their hands on the wheel and eyes on the road. Eventually, soon we think, we'll be able to zip you through even construction zones.
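Author's note: the "signed off mile by mile" idea can be pictured as a per-segment lookup that gates hands-free mode and falls back to hands-on when a construction flag is set. The fields and decision rule below are a hypothetical sketch, not Ford's BlueCruise logic.

```python
from dataclasses import dataclass
from enum import Enum

class DrivingMode(Enum):
    HANDS_FREE = "hands_free"
    HANDS_ON = "hands_on"

@dataclass
class RoadSegment:
    segment_id: str
    model_validated: bool     # has the model been signed off on this stretch?
    under_construction: bool  # live map flag, e.g. from fleet reports

def allowed_mode(segment: RoadSegment) -> DrivingMode:
    """Gate hands-free driving: only validated segments that are not
    currently flagged for construction allow hands off the wheel."""
    if segment.model_validated and not segment.under_construction:
        return DrivingMode.HANDS_FREE
    return DrivingMode.HANDS_ON

highway = RoadSegment("I-5_mile_142", model_validated=True, under_construction=False)
workzone = RoadSegment("I-5_mile_143", model_validated=True, under_construction=True)
print(allowed_mode(highway))   # DrivingMode.HANDS_FREE
print(allowed_mode(workzone))  # DrivingMode.HANDS_ON
```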

 

So, a very big focus for Ford, just not my team.  I do help get data off the cars to help improve the models, but I don't actually work on the models.

 

Ford Autonomous Driving

Connie:

When it comes to driver safety, how is AI being utilized to enhance driver safety in Ford vehicles? Are there any advanced driver-assistance systems or safety features worth highlighting?

 

Ken:

ADAS (Advanced Driver Assistance Systems) serves both autonomy and safety. Much of the safety work, like collision alert and automatic braking, is more rules-based than AI, but we use AI to help develop those systems. How can the system tell the difference between a very slow-moving van that pulls suddenly in front of you versus you coming up too quickly on a car stopped at a light? In more advanced cars, sensors detect the risk and the system responds by automatically braking for you. Again, these are not really AI models, but we use AI to help us study and program for various conditions. We may use AI to improve these systems in the future, but right now the trigger to auto-brake is not AI.
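Author's note: the slow-van versus stopped-car distinction is often framed in terms of time-to-collision (TTC), where closing speed matters more than raw distance. The rules-based sketch below uses invented thresholds and is not Ford's production logic.

```python
def time_to_collision(gap_m: float, closing_speed_mps: float) -> float:
    """Seconds until contact if neither vehicle changes speed.
    Returns infinity when the gap is not closing."""
    if closing_speed_mps <= 0:
        return float("inf")
    return gap_m / closing_speed_mps

def brake_decision(gap_m: float, ego_speed_mps: float, lead_speed_mps: float,
                   warn_ttc_s: float = 2.5, brake_ttc_s: float = 1.2) -> str:
    """Rules-based trigger: warn first, then automatically brake.

    A van cutting in at nearly your speed closes the gap slowly (large TTC),
    while approaching a stopped car at speed closes it fast (small TTC).
    """
    ttc = time_to_collision(gap_m, ego_speed_mps - lead_speed_mps)
    if ttc <= brake_ttc_s:
        return "AUTO_BRAKE"
    if ttc <= warn_ttc_s:
        return "COLLISION_ALERT"
    return "NO_ACTION"

# Slow van cut-in: small gap but small closing speed -> alert, no automatic braking.
print(brake_decision(gap_m=7.0, ego_speed_mps=25.0, lead_speed_mps=22.0))
# Stopped car ahead at highway speed: large closing speed -> automatic braking.
print(brake_decision(gap_m=25.0, ego_speed_mps=30.0, lead_speed_mps=0.0))
```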

 

One place we do use AI is in improving our Distance Till Empty (DTE) predictions, your gas gauge's estimate of how far you can still drive. It can be adjusted by your own driving behaviors. Gas tanks are not uniform in size, and if your car is parked on a hill the fuel in the tank won't be level. Getting DTE right actually takes a pretty complex model.
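Author's note: as a toy illustration of why DTE is harder than it looks, the sketch below corrects a raw fuel-level reading for vehicle tilt and blends the baseline consumption figure with the driver's recent consumption. The correction formula and numbers are assumptions made for the example, not Ford's model.

```python
import math

def corrected_fuel_liters(raw_sensor_liters: float, pitch_deg: float,
                          tank_length_m: float = 0.9,
                          liters_per_meter_of_depth: float = 55.0) -> float:
    """Roughly undo the tilt-induced error in a float-style level sensor.

    When the car sits nose-up, fuel pools toward the rear and the sensor
    over- or under-reads depending on where it sits in the tank. This
    linearized correction is a stand-in for a real tank-geometry model.
    """
    depth_error_m = (tank_length_m / 2.0) * math.tan(math.radians(pitch_deg))
    return max(0.0, raw_sensor_liters - liters_per_meter_of_depth * depth_error_m)

def distance_to_empty_km(fuel_liters: float,
                         baseline_l_per_100km: float,
                         recent_l_per_100km: float) -> float:
    """Blend the rated baseline with the driver's recent consumption."""
    blended = 0.3 * baseline_l_per_100km + 0.7 * recent_l_per_100km
    return 100.0 * fuel_liters / blended

fuel = corrected_fuel_liters(raw_sensor_liters=30.0, pitch_deg=8.0)
print(round(distance_to_empty_km(fuel, baseline_l_per_100km=9.0,
                                 recent_l_per_100km=11.5), 1), "km")
```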

 

For EVs we call this predictive range, and it is a very AI-driven model. I have to admit that my EV range prediction, since I might be known to accelerate quickly on green lights, is lower than, say, a more sedate driver's.

 

Connie:

How do you see AI and LLM technologies evolving to further enhance the overall customer experience, both in terms of driving and vehicle ownership?

 

2022 Bronco Raptor

 

Ken:

One example is, of course, talking to your car. Today we ask users for voice feedback; they tell their car what went wrong, and we get it as a text file sent from the car. Soon, we want people to be able to leave, say, an NFL football game, get into their car, and say things like "set navigation to home and use as many streets with autonomous mode as possible," or heck, "take me home, KITT." That last one is a callback to the show Knight Rider from the '80s. As I mentioned before, we have Alexa in our newer vehicles and can certainly plan to build on that technology.

 

Connie:

Collaborations and partnerships: Are there any notable collaborations or partnerships with other companies or research institutions that are contributing to Ford's AI advancements in the auto sector?

 

Ken:

There is a lot of competition and collaboration in the autonomous vehicle space. I can't really share any details about what Ford is doing here.

 


I can say we are working with suppliers on how to better share usage and wear data so they can make better parts for us. I can also say that in my areas I partner heavily with our big cloud partners including Google, Amazon, Microsoft and Salesforce to build and pilot many new capabilities.  Those companies have experience working with other manufacturers and bring expertise to help us accelerate our work and progress. I wish I could say more but well, I can’t.

 

Connie:

Let's talk about future trends: What emerging AI and LLM trends do you anticipate will have a significant impact on the automotive industry in the coming years, and how is Ford preparing for them?

 



 

Ken:

I'm sure there is a lot of voice interaction coming, but for me I'm looking at shorter-term things like Microsoft Copilot for our developers to improve productivity, better integration of video search for our mechanics, better and faster diagnostics of vehicle problems, and better synthetic data. We have a lot of software developers, and Copilot from Microsoft is something we are heavily piloting. Google is working with us on how we can help find just the right video clip at the right timestamp when working on a car. Even better prognostics is one of my big focus areas.

 

Let me expand a little on synthetic data and why it is so important. One big area I didn't mention is the importance of modeling all conditions for autonomous vehicles. Well, it doesn't rain much in Arizona, but it does rain. Our ability to get video and sensor data to train models for wet conditions in drier climates like Arizona is limited. With AI models for synthetic data, we can create these variants, bootstrapping our models and accelerating development. We also use this data in testing.
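Author's note: a simple, hedged picture of the synthetic-data idea is to take dry-weather camera frames and overlay procedurally generated rain streaks plus a contrast drop, producing wet-looking variants for training and test. The NumPy sketch below only illustrates the augmentation pattern, not Ford's pipeline.

```python
import numpy as np

def add_synthetic_rain(image: np.ndarray, streak_count: int = 400,
                       streak_length: int = 12, seed: int = 0) -> np.ndarray:
    """Overlay bright diagonal streaks and mute contrast to mimic rain.

    image: H x W x 3 uint8 RGB frame captured in dry conditions.
    """
    rng = np.random.default_rng(seed)
    rainy = image.astype(np.float32)

    # Global effect of rain / a wet lens: lower contrast, slightly flatter scene.
    rainy = 0.75 * rainy + 0.25 * rainy.mean()

    h, w, _ = image.shape
    ys = rng.integers(0, h - streak_length, size=streak_count)
    xs = rng.integers(0, w - streak_length, size=streak_count)
    for y, x in zip(ys, xs):
        for i in range(streak_length):  # draw one diagonal streak
            rainy[y + i, x + i] = 0.6 * rainy[y + i, x + i] + 0.4 * 255.0
    return np.clip(rainy, 0, 255).astype(np.uint8)

# Turn a batch of dry frames into wet-weather training variants.
dry_frame = (np.random.rand(480, 640, 3) * 255).astype(np.uint8)
wet_variants = [add_synthetic_rain(dry_frame, seed=s) for s in range(4)]
print(wet_variants[0].shape, wet_variants[0].dtype)
```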

 

Connie:

Thanks so much for your time, Ken! Final question: could you talk to us about the challenges and ethical considerations you face? What are some of the key challenges and ethical considerations Ford encounters when implementing AI technologies in the auto sector, and how are these challenges being addressed?

 

Ken:

At Ford our mission is, "To help build a better world, where every person is free to move and pursue their dreams." Freedom is a big part of that promise, and to us that means privacy and safety.

 

Cars, in a way, are a more personal item for individuals than even their computers or smartphones. Smartphones have been connected from their inception; cars being connected is pretty new. People like the anonymity and freedom that come from cars, but they also like the cool new software-defined features that can come from being connected. How we go forward assuring our customers that we are protecting their privacy and improving their safety is an ongoing balancing act. In addition, we have constantly evolving government mandates to manage.

 

I'm certain we will run into other ethical challenges over time. If a car in autonomous mode injures someone, who is liable, and under what conditions? What if the camera was damaged in an earlier minor accident and the driver didn’t take immediate action? What if we didn’t detect and didn’t warn the driver about the impending accident?

 

Many of our vehicles are platforms used to build other equipment, like a bucket loader to repair downed power lines. Who is responsible when the vehicle is modified from its original specifications, whether for commercial reasons or because I just want ground-effect LEDs and a super bumping, heavy-bass sound system? Those kinds of modifications have always been part of the auto industry, and now we have AI as another complexity. I'm sure we'll find our way through the uncertainties, but it isn't all figured out just yet.

 


Thanks for the questions. AI is a big part of the current and future state of the auto industry, both in how we make, ship, and maintain vehicles and in the customer experience. Like all industries, we are embracing it rapidly, growing in knowledge, and pressing forward.

 

Connie Yang is Principal, AI and Data Science at DesignMind.  She is an accomplished AI and Data Science leader with a strong background in data engineering.  Learn about DesignMind's AI and Data Science solutions