Autonomous trucks: Close encounters of the dangerous kind

September 28, 2023

Mark Schremmer


My first job in journalism was at my hometown newspaper more than 20 years ago. I was attending the local university and started working part time at the daily paper to learn the profession.

One of my first tasks was to type in the obituaries of area residents. While it was an entry-level task, the level of importance was made clear. These were people’s life stories, and that wasn’t something to take lightly.

This meant we followed a strict process to do everything we could to avoid mistakes. After typing in the obituary, we called the funeral home if there were any inconsistencies in spelling or any other questions. An editor then read the obituary, looking for any typos or inconsistencies the writer may have missed.

Then before the obituary went to the press, the editor read it aloud while the writer read along. The editor not only read the text but also spelled out the names of every person and place as a final line of defense. The process was tedious but thorough.

Fast-forward 20-plus years, and at least some news outlets have decided to let artificial intelligence take charge of writing news and sports stories – and even obituaries. Proponents will tell you it’s about efficiency. They will say it’s about freeing up staff to work on more in-depth projects. In reality, though, it’s all about saving money.

Well, let’s check in to see how those cost savings are working out.

Earlier this month, Microsoft’s MSN news portal published what appears to have been an AI-generated obituary of former NBA player Brandon Hunter, who died on Sept. 12 at age 42. As you may have heard or guessed, the article was disastrous in nearly every way imaginable.

The headline read, “Brandon Hunter useless at 42.”

That was bad enough, but it didn’t end there. Using synonyms for various words in an apparent attempt to avoid plagiarism, the technology said that Hunter played “video games” instead of basketball games and that he received “All-MAC convention alternatives” instead of All-MAC conference selections.

This may be the worst example of AI-generated journalism so far, but it’s certainly not the only one. MSN also published a travel guide for Ottawa, Canada, that recommended tourists stop at a local food bank and suggested they arrive with an “empty stomach.” Some newspapers also have used AI to generate local sports stories, often leading to a garbled mess that for some unknown reason loves to use the phrase, “close encounters of the athletic kind.”

The technology obviously is being deployed before it is ready, and there are seemingly few, if any, checks and balances before these articles are published. At best, the articles are unreadable. At worst, they are terribly offensive.

But as bad as these stories are, they’re nothing compared to the physical harm that could be caused when AI technology is tasked with operating an 80,000-pound autonomous truck without a driver in the seat.

Just like with AI-generated articles, proponents will tell you that it’s about efficiency. They’ll say the technology will be used only on certain routes and that it’s needed to make up for a driver shortage, which doesn’t even exist.

But just like with media companies, it’s really about saving money. And instead of following all of the steps that professional truck drivers have complied with for years to ensure a safe trip, many autonomous companies want to cut regulatory corners to bring the technology to the marketplace.

A recent House Transportation and Infrastructure Committee hearing about autonomous trucks touched on some safety concerns.

Del. Eleanor Holmes Norton, D-D.C., asked Chris Urmson, chief executive officer of the autonomous company Aurora, about how the technology would work when faced with difficult decisions.

“Is Aurora able to guarantee that its technology will prioritize the safety of people, not property, not infrastructure, but people when making split-second decisions on the highway?” Norton asked.

Urmson said that “safety is paramount” but did not directly answer the question.

Witness Cathy Chase, president of Advocates for Highway and Auto Safety, said the technology is not ready and pointed to recent crashes involving autonomous vehicles. Chase also noted examples of San Francisco police and firefighters being prevented from reaching emergencies by autonomous vehicles obstructing the roads.

“I think we absolutely need prudence when discussing autonomous vehicles of any level,” Chase said. “The biggest example we have so far is San Francisco, and it’s not going so well, to say the least. It’s a problem. We can’t just stick our head in the sand and pretend these problems don’t exist.”

Proponents of AI-generated journalism did just that, and we’re seeing the horrendous results. But if officials stick their heads in the sand regarding the problems with autonomous trucks, the results won’t just be in poor taste. They will be catastrophic.