This year at CES, Intel is kicking off the 2021 trade show with not one but two press conferences. The company is reserving its computing announcements for the afternoon, starting at 1 p.m. PT. That’s when we expect to hear more about its 11th-gen Rocket Lake desktop processor platform and much more.
Meanwhile, Intel plans to use the morning hours to announce the latest advancements in autonomous driving made by its Mobileye division.
We’ll be updating this post throughout the week of CES, but here’s everything Intel has announced so far.
Mobileye announcements
Intel’s morning CES session is dedicated entirely to its autonomous driving endeavors through the company’s Mobileye unit. The news conference, headlined by Mobileye’s CEO and Intel SVP Amnon Shashua, has just kicked off, and we’ll be updating this post as more announcements roll in. If you’re expecting PC announcements, be sure to tune in this afternoon for news surrounding Intel’s processors.
“Mobileye is a growth engine for the company, and Intel is fully committed to the business,” Shashua said of the unit giving Intel’s press conference this year.
Mobileye is boosting the scale of its testing. Shashua pointed to driving policy algorithms and the building of high-definition maps as the factors that determine scalability. The company believes its driving policy is now transferable, so it can enter new markets for testing, and after five years it feels it can build production maps at scale.
With COVID ravaging the planet, the company said it had adapted its business model to scale in order to enter new markets. Rather than building an entirely new facility, Shashua said that Mobileye was able to ship a vehicle — in two weeks with two field employees — set it up, and start demonstrating to OEM partners in Munich, Germany.
“This gave us the feeling that now we can scale,” he said. The company wants to expand to Shanghai, Tokyo, New York, and Detroit, among other places globally.
Mobileye said that it measures failure rates in terms of hours of driving. In the U.S., Shashua said, there is a crash roughly every 5,000 miles, which works out to about one crash per 50,000 hours of driving.
At that rate, a fleet of 50,000 cars would average one at-fault accident every hour with human drivers. For an autonomous system that isn’t acceptable; it has to do better. That is why Mobileye built a redundant sensing system to help mitigate crashes in a Level 4 vehicle.
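As a quick back-of-the-envelope illustration of why that matters, here is a short sketch of how a human-level crash rate scales with fleet size. The one-per-50,000-hours rate comes from the figures Mobileye cited above; the fleet sizes are purely illustrative.

```python
# Back-of-the-envelope: how a human-level crash rate scales with fleet size.
# The one-crash-per-50,000-hours figure comes from Mobileye's presentation;
# the fleet sizes below are illustrative.
human_crash_rate_per_hour = 1 / 50_000  # roughly one at-fault crash per 50,000 driving hours

for fleet_size in (1, 1_000, 50_000):
    expected_crashes_per_hour = fleet_size * human_crash_rate_per_hour
    hours_between_crashes = 1 / expected_crashes_per_hour
    print(f"{fleet_size:>6} cars -> ~{expected_crashes_per_hour:.5f} crashes/hour "
          f"(one every ~{hours_between_crashes:,.0f} hours)")
```

With 50,000 cars on the road, the expected gap between at-fault crashes collapses from 50,000 hours to a single hour, which is the scaling problem an autonomous fleet has to beat.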
Cameras, radars, and lidars help to create that redundant system for autonomous driving, Shashua said.
“From a technological point of view, it’s so crucial to do the hard work,” he said. It’s not just about the radars and lidars: The company approaches autonomous driving by building an end-to-end camera system first, and then adding radar and lidar as a redundant system.
Mobileye is also building a Level 2 system for the Chinese market. Since it is camera-based, it would be affordable for consumers, the company said, and that is the path Mobileye is taking to monetize its efforts before reaching Level 4 reliability.
The company believes consumer autonomous vehicles will take some time to arrive, so it wants to tackle robotaxis first. The robotaxi will be built with Luminar lidars and no cameras to start, but cameras will be added before launch.
Come 2025, Mobileye is looking at the next generation of lidar, known as FMCW (frequency-modulated continuous wave). Intel has an advantage in this area, Shashua said, because its photonics work puts active and passive layers on chips.
On the radar side, the company is targeting imaging radar, which has higher resolution and is software-defined rather than analog-defined.
And as these technologies mature, Shashua believes we’ll have affordable Level 4 consumer autonomous vehicles by 2025.
Shashua called the company’s Responsibility-Sensitive Safety (RSS) framework one of its top achievements, because society will not accept lapses of judgment from a computer. An autonomous vehicle has to behave with human-like judgment because it interacts with humans, and the idea of being careful, such as yielding when you don’t have the right of way, isn’t mathematically defined. RSS defines mathematically what it means to “be careful” when driving. The framework takes the worst case under assumptions that have been defined with regulators; it’s a rule-based kind of thinking that defines what is reckless and what is careful, Shashua said, and the theory has to be fully transparent.
“RSS is one of our crown jewels of achievement, and we deliberately made it transparent,” Shashua said. In essence, RSS gave Mobileye a calculus of human behavior while driving, which extends beyond red lights and speed limits.
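Mobileye’s researchers have published the RSS model openly, and one concrete example of “careful” being defined mathematically is its minimum safe following distance. The snippet below is our own sketch of that longitudinal-distance rule as described in the published RSS paper; the parameter values are illustrative, not anything Mobileye announced at CES.

```python
def rss_min_safe_following_distance(
    v_rear: float,          # rear (following) car speed, m/s
    v_front: float,         # front (lead) car speed, m/s
    response_time: float,   # rear car's response time rho, seconds
    a_max_accel: float,     # worst-case acceleration of the rear car during rho, m/s^2
    a_min_brake: float,     # minimum braking the rear car is guaranteed to apply, m/s^2
    a_max_brake: float,     # maximum braking the front car might apply, m/s^2
) -> float:
    """Minimum safe longitudinal gap, following the published RSS formulation.

    Worst case: the rear car accelerates for its full response time before
    braking gently, while the front car brakes as hard as possible. If the gap
    is at least this distance, the rear car can always stop without a
    rear-end collision.
    """
    v_rear_after_response = v_rear + response_time * a_max_accel
    distance = (
        v_rear * response_time
        + 0.5 * a_max_accel * response_time ** 2
        + v_rear_after_response ** 2 / (2 * a_min_brake)
        - v_front ** 2 / (2 * a_max_brake)
    )
    return max(distance, 0.0)


# Illustrative values only: both cars traveling at roughly 100 km/h (27.8 m/s).
print(rss_min_safe_following_distance(
    v_rear=27.8, v_front=27.8,
    response_time=1.0, a_max_accel=2.0, a_min_brake=4.0, a_max_brake=8.0,
))
```

The point is that “yield” and “keep a safe distance” become checkable inequalities rather than fuzzy judgment calls, which is what makes the framework transparent to regulators.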
In building a perception system, the bottleneck is understanding all the risks and the semantics of the road, not identifying people or objects on the road. It’s about understanding priorities on the path to determine who needs to yield. “It’s so detailed that the probability of not making a mistake in a single path is almost unachievable,” Shashua said.
By piecing together information from multiple cars, the company realized it can do better than with data collected from any single car. This is what it calls building a map: not a navigational map like Google Maps, but one built from data coming off the cars. That is the basis of Mobileye’s REM technology, or Road Experience Management. The company wants to collect low-bandwidth data and do the hard work itself, since detailed, high-bandwidth data would be costly to upload and OEM partners may not want to invest in that bandwidth. Data collection isn’t about event recording, Shashua said; it’s about limiting the complexity.
The company says nearly 1 million vehicles from six car OEMs are collecting and sending data to the cloud today, giving it coverage that is global and scalable.
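Mobileye didn’t detail REM’s actual data format, but the idea of crowdsourcing a map from many low-bandwidth reports can be illustrated with a toy example. Everything below, from the observation fields to the simple averaging, is a hypothetical sketch rather than Mobileye’s implementation.

```python
from collections import defaultdict
from statistics import mean

# Toy model of crowdsourced mapping: each car uploads only a tiny observation
# per landmark (an ID plus an estimated position), and the cloud averages the
# noisy reports from many cars into one map entry. Hypothetical illustration
# only; not Mobileye's actual format or algorithm.
observations = [
    # (landmark_id, estimated_x_m, estimated_y_m) reported by different cars
    ("stop_sign_17", 120.4, 33.1),
    ("stop_sign_17", 120.9, 32.8),
    ("stop_sign_17", 119.8, 33.4),
    ("lane_split_03", 410.2, 12.0),
    ("lane_split_03", 409.7, 11.6),
]

grouped = defaultdict(list)
for landmark_id, x, y in observations:
    grouped[landmark_id].append((x, y))

road_map = {
    landmark_id: (mean(x for x, _ in points), mean(y for _, y in points))
    for landmark_id, points in grouped.items()
}
print(road_map)
```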
Intel completed its acquisition of the company in 2017 at an estimated enterprise value of $14.7 billion.
“The acquisition couples the best-in-class technologies from both companies, including Intel’s high-performance computing and connectivity expertise and Mobileye’s leading computer vision expertise to create automated driving solutions from the cloud through the network to the car,” the companies said at the time.
The acquisition has allowed Intel to make a serious pitch as a self-driving platform provider and expand beyond computing. In this space, it will have to compete against the likes of Google’s Waymo and Nvidia’s Drive platform.
Intel and Nvidia have found themselves competing in more areas in recent years; Intel’s entry into the graphics processor space also encroaches on territory dominated by Nvidia’s GeForce RTX graphics cards. Most recently, ahead of CES 2021, automaker Nio announced that it is working with Nvidia’s Drive platform to bring advanced autonomous driving capabilities to market.
Computing announcements
The computing keynote is expected to kick off at 1 p.m. PT. You’ll be able to watch the livestream from Intel’s website, but we’ll also be covering all the latest news as we update this post.
In addition to all the latest news from Intel, be sure to also follow Digital Trends’ CES 2021 hub, where we will have all the latest tech news and announcements from the show, along with hands-on sessions with products and in-depth analysis.