January 9, 2018
What Lyft’s self-driving car gets right: driving style
Riding in a self-driving Lyft car is way more boring than you’d expect. But that’s really the point.
The ride, which I took on the streets of Las Vegas at the outset of CES 2018, was remarkable only in how mundane it was. Turns, lane changes, braking for red lights, accelerating for green — it was all pretty much the same as if a human were doing the driving. Well, except for the display on the dash showing a LiDAR-constructed view of the streets around us, and the robotic female voice that occasionally chimed in with a "lane change checking" or some other status update.
And this is why the self-driving experience Lyft showed off — developed by its platform partner, Aptiv (formerly Delphi) — is so impressive: The drive felt just like having an attentive chauffeur. The driving style was clearly tuned for the passenger: There were no sudden accelerations to make a stale green light or catch up to traffic, for example.
How you begin a self-driving ride with Lyft
Image: Pete Pachal/Mashable
But it was also very human: When waiting for oncoming traffic to thin out so it could perform a U-turn, the car didn’t just position itself in the left turning lane and stop. It moved forward a bit, waited, then after a few seconds moved a bit more, turned the wheel slightly, then crept up a little more as the last couple of cars whizzed by, then did the full turn after they passed. If I didn’t know better, I would have suspected the human safety driver, who by law needs to “spot” the wheel with his hands, was the one driving.
That’s by design, says Lee Bauer of Aptiv. In developing its self-driving profile, the company optimized everything for the passenger experience, making sure the ride was as smooth as possible so people inside the car could actually be productive.
“As you transition the mindset from a driver to a passenger, the experience has to be different,” says Bauer. “You drive your car different than you want to be driven.”
That became even clearer to me later in the day, when I transitioned back to a regular Vegas cab, and, in the midst of architecting a particularly nuanced tweet, the driver took a hard left turn to beat some oncoming traffic. The force of the turn took me out of the moment, breaking my flow. Where’s a robot when you need it?
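Bauer's point about smoothness has a well-known engineering analogue: ride comfort is often quantified by "jerk," the rate of change of acceleration, which is what passengers actually feel as a lurch. As a rough illustration (not Aptiv's method — the function names, sample profiles, and numbers here are all hypothetical), you can estimate peak jerk from a velocity trace with finite differences and see why an eased launch feels gentler than a hard one:

```python
# Illustrative sketch only: comfort-oriented motion planning often tries to
# limit "jerk" (m/s^3), the rate of change of acceleration. All names and
# sample data below are hypothetical, not from Aptiv's actual system.

def peak_jerk(velocities, dt):
    """Peak absolute jerk of a velocity trace sampled every dt seconds."""
    accel = [(v2 - v1) / dt for v1, v2 in zip(velocities, velocities[1:])]
    jerk = [(a2 - a1) / dt for a1, a2 in zip(accel, accel[1:])]
    return max(abs(j) for j in jerk)

dt = 0.5  # seconds between samples
# Hard "beat the light" launch: full acceleration immediately, then cut.
abrupt = [0.0, 3.0, 6.0, 9.0, 12.0, 12.0, 12.0]
# Eased launch to the same speed: acceleration ramps up and back down.
smooth = [0.0, 0.5, 2.0, 5.0, 9.0, 11.5, 12.0]

print(peak_jerk(abrupt, dt))  # 12.0
print(peak_jerk(smooth, dt))  # 8.0
```

Both traces reach the same 12 m/s, but the eased profile's peak jerk is a third lower — the difference between a tweet-friendly ride and a spilled coffee.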
OK, so self-driving cars can behave better — for safety and comfort — but do they always have to look like cars with strange mechanical hats? Aptiv is making progress there, too, integrating sensors into the body of the vehicle in relatively unobtrusive ways. While the Lyft self-driving BMW 5 Series is loaded with two different kinds of LiDAR, radar, and more, it still manages to look like a car, something Uber’s self-driving Volvos, with their camera-laden mechanical hats, can’t claim.
The front grille of the car houses the long-range LiDAR; the short-range LiDAR sits beneath it.
Image: Pete Pachal/Mashable
More LiDAR (dark rectangle) and radar (white rectangle).
Image: Pete Pachal/Mashable
Lyft already has a self-driving trial underway in Boston, but at CES it’s making the service available to anyone with the Lyft app within 550 yards of the Las Vegas Convention Center. Starting Jan. 9, if you happen to be close by and a car is available, Lyft will give you the option to take a self-driving ride.
That’s definitely a taste of the future, though once riders see the normal-looking car and take the quite predictable ride, they might have trouble getting excited about it.
Read more: http://mashable.com/2018/01/08/lyft-aptiv-self-driving-ces/
January 19, 2018
Brilliant New Headlights Use a Million Pixels to Talk to the World
by MeDaryl • Cars • Tags: auto industry, ces, CES2018, transportation
For the most part, digital technology is all about dumping things that move. Complex engines are giving way to simpler computer-controlled electric motors. Mirrorless cameras no longer have to flip mirrors out of the way to take photos, the way DSLRs do. In the burgeoning lidar laser-sensor business, developers are embracing solid-state systems that do away with all that spinning. CD players pulling music tracks from spinning discs? Adios.
So it’s perhaps surprising that one of the flashier automotive innovations that debuted at CES this week centers not just on moving parts but on millions of the things. In Las Vegas, Texas Instruments (which makes way more than those big calculators) unveiled a headlight system that adapts the company’s digital light processing technology to generate beams that can be precisely controlled via more than a million addressable pixels.
This system, so far deployed mostly in cinema and display projectors, works thanks to tiny movable “micromirrors” that flip as many as 10,000 times a second, either reflecting light through a projection lens to generate a white pixel, or onto a black absorption surface to create a black pixel. It’s like turning each pixel on or off. The result is a new, smarter kind of headlight, one you can program to show the way without blinding oncoming drivers, or even project text, graphics, or animations onto the road ahead. (The company used CES to hawk the chipset that allows for this programming, known as DLP5531-Q1, to automakers and their lighting suppliers.)
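Because each micromirror is only ever fully on or fully off, intermediate brightness comes from flipping faster than the eye can follow: perceived intensity is simply the fraction of time the mirror points at the lens. A minimal sketch of that pulse-width idea (the function name and toy pattern are mine, not TI's):

```python
# Sketch of how a binary micromirror renders grayscale. Each entry in the
# pattern is one flip interval: True = light reflected to the lens (white),
# False = light dumped onto the absorber (black). The eye averages the
# flicker, so perceived brightness is just the on-time fraction.

def perceived_intensity(flip_pattern):
    """Fraction of intervals the mirror spends pointed at the lens."""
    return sum(flip_pattern) / len(flip_pattern)

# At up to ~10,000 flips per second, a mirror can render 75% gray by
# aiming at the lens for 3 of every 4 intervals.
pattern = [True, True, True, False] * 2500  # one second of flips
print(perceived_intensity(pattern))  # 0.75
```

Do that independently for each of the million-plus addressable mirrors and you have a headlight that doubles as a monochrome projector.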
Texas Instruments doesn’t have a monopoly on digitally programmable headlight technology—Audi unveiled a similar laser-based system last year—but it promises the highest resolution offering so far. Plus, its system works with any light source, so you don’t have to dump more common LED lights for the fancier laser tech.
The clearest immediate use case here is the opportunity to keep the high beams shining even as other vehicles approach from the opposite direction. The system tracks the oncoming car’s headlights via an onboard camera and dims the portion of its own beams aimed that way, following the car as it gets closer. This tech could also work in conjunction with vehicle sensors to spotlight things that demand the driver’s attention, like roadside signs or animals about to dash into the road.
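The selective-dimming behavior reduces to a masking problem: given where the camera says the oncoming car sits in the headlight's field of view, attenuate only the pixels aimed at it and leave the rest at full high beam. A toy sketch of that idea follows — the grid size, names, and dim level are all assumptions for illustration (the real chipset addresses more than a million pixels):

```python
# Hypothetical sketch of adaptive-driving-beam masking. A real system works
# on a dense 2D pixel grid; this toy version uses a single row of columns.

BEAM_COLUMNS = 16  # toy horizontal resolution of the beam

def beam_mask(oncoming_span, dim=0.1):
    """oncoming_span: (start, end) column indices the camera says the
    oncoming car occupies. Returns per-column intensity: full high beam
    everywhere except the columns aimed at the other driver."""
    start, end = oncoming_span
    return [dim if start <= col <= end else 1.0 for col in range(BEAM_COLUMNS)]

# Camera detects an oncoming car spanning columns 4-6 of the field of view:
print(beam_mask((4, 6)))
```

As the tracked car slides across the camera frame, the span — and therefore the dark notch in the beam — slides with it, which is exactly the "following the car" behavior described above.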
Eventually, these headlights could become essential for autonomous cars. “The chipset was developed to support adaptive driving beam headlight systems, but is capable of being programmed to project information on the road,” says Brian Ballard, TI’s exterior lighting manager. A driverless car won’t have hands to tell that man waiting at the curb that it’s safe to cross the street. Headlights that can project a crosswalk, or even write out “Go for it!,” could fill the communication gap and encourage the public to accept these crash-preventing vehicles onto their streets.
TI says its (so far unnamed) automotive customers are already integrating the tech into their headlights, but don’t expect to see it on US roads anytime soon. These adaptive driving beam systems aren’t legal in the United States, because a very dusty rule demands that cars have separate light sources for high and low beams — TI’s system combines them (as does Audi’s laser setup). As with most bureaucracy, the path to updating or killing that rule is arduous, requiring that manufacturers prove this is essential safety technology for new cars.
So while we’re thrilled by the prospect of broadcasting the words “Get out of the left lane!” onto the road ahead for slow-moving cars to see (reversed so it’s legible in mirrors, of course), the best case will likely be improved visibility that no longer has to be compromised just because someone else is on the road. We’ll take that too.
Read more: https://www.wired.com/story/texas-instruments-headlights/