
The hype cycle of self-driving cars

  1. Wipo 23 January 2018 23:48

    October 08, 2015
    Sony Acquires Belgian Innovator of Range Image Sensor Technology, Softkinetic Systems S.A., in its Push Toward Next-Generation Range Image Sensors and Solutions

    Tokyo, Japan - Sony Corporation ("Sony") is announcing that it has completed the acquisition of Softkinetic Systems S.A. ("Softkinetic"), after reaching an agreement with the company and its major shareholders. With this acquisition, Softkinetic - which possesses time-of-flight ("ToF") range image sensor technology, as well as related systems and software - has become a wholly-owned subsidiary of Sony.

    ToF is a method for resolving the distance to an object. ToF distance measurement pixels, which are laid on top of the sensor in two dimensions, measure the flight time (delay) it takes for light to leave the light source, reflect off the object, and return to the image sensor.
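    Editorial aside: the defining relation behind the ToF description above is that the measured range is half the round-trip delay times the speed of light. A minimal sketch in Python (illustrative only, not Sony code):

        # Range from a time-of-flight measurement: the pulse travels out
        # and back, so the one-way distance is half of c * delay.
        C = 299_792_458.0  # speed of light, m/s

        def tof_range_m(round_trip_ns: float) -> float:
            """Distance to the object for a round-trip delay given in nanoseconds."""
            return C * (round_trip_ns * 1e-9) / 2.0

        print(tof_range_m(10.0))  # a 10 ns round trip is ~1.5 m of range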
  2. forum rank 8 Beperktedijkbewaking 24 January 2018 07:55
    quote:

    Wipo wrote on 23 January 2018 10:10:

    [...]
    I've improved my rule of thumb to 0.3 metres per nanosecond, so I no longer make little arithmetic slips; the slice width then comes to 2.25 metres
    I thought as much...
    Anyway, better to turn back halfway (in the 'slice') than to stray all the way, haha.
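    Editorial check of the arithmetic above (hedged: the 15 ns gate time is inferred from the quoted 0.3 m/ns and 2.25 m figures; it is not stated in the thread):

        # Slice width from the 0.3 m/ns rule of thumb; halved for the round trip.
        C_M_PER_NS = 0.3  # rounded speed of light, metres per nanosecond

        def slice_width_m(gate_ns: float) -> float:
            """Depth of the range slice covered by a timing gate of gate_ns."""
            return C_M_PER_NS * gate_ns / 2.0

        print(slice_width_m(15.0))  # 2.25 m, matching the post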
  3. Wipo 24 January 2018 13:00
    quote:

    Wipo wrote on 23 January 2018 23:48:

    [...]
    See:
    www.sony.net/SonyInfo/News/Press/2017...

    And also:
    www.baslerweb.com/en/products/cameras...
  4. [deleted] 8 February 2018 09:29
    Musk in last night's conference call on the sensors. He sounds fairly convincing. Perhaps one of the tech people here can briefly explain or summarize this.

    And our next question comes from the line of David Tamberrino with Goldman Sachs. Your line is now open.

    David Tamberrino - Goldman Sachs & Co. LLC

    Great. Thank you. Elon, on your autonomous vehicle strategy, why do you believe that your current hardware set of only camera plus radar is going to be able to get you to fully-validated autonomous vehicle system? Most of your competitors noted that they need redundancy from lidar hardware to given the robustness of the 3D point cloud and the data that's generated. What are they missing in their software stack and their algorithms that Tesla is able to obtain from just the camera and plus radar?

    Further, what would be your response if the regulatory bodies required that level of redundancy is really needed from an incremental lidar hardware?

    Elon Reeve Musk - Tesla, Inc.

    Yes. Well, first of all, I should say there's actually three sensor systems. There are cameras, (22:25) redundant forward cameras, there's the forward radar, and there are the ultrasonics for near field. So, the third is also – the third set is also important for near-field stuff, just as it is for human.

    But I think it's pretty obvious that the road system is geared towards passive optical. We have to solve passive optical image recognition, extremely well in order to be able to drive in any given environment and the changing environment. We must solve passive optical image recognition. We must solve it extremely well.

    At the point at which you have solved it extremely well, what is the point in having active optical, meaning lidar, which does not – which cannot read signs; it's just giving you – in my view, it is a crutch that will drive companies to a local maximum that they will find very difficult to get out of.

    If you take the hard path of a sophisticated neural net that's capable of advanced image recognition, then I think you achieve the goal maximum. And you combine that with increasingly sophisticated radar and if you're going to pick active proton generator, doing so in 400 nanometer to 700 nanometer wavelength is pretty silly, since you're getting that passively.

    You would want to do active photon generation in the radar frequencies of approximately around 4 millimeters because that is equation (24:13) penetrating. And you can essentially see through snow, rain, dust, fog, anything. So, it's just I find it quite puzzling that companies would choose to do an active proton system in the wrong wavelength. They're going to have a whole bunch of expensive equipment, most of which makes the car expensive, ugly and unnecessary. And I think they will find themselves at a competitive disadvantage.

    Now perhaps I am wrong. In which case, I'll look like a fool. But I am quite certain that I am not.
  5. [deleted] 8 February 2018 10:30
    quote:

    groeibriljant wrote on 8 February 2018 09:29:

    [...]
    This must come from SA, judging by all the transcription errors. Read 'photon' wherever it says 'proton'. Passive means a camera that only receives light; active means a device that emits light itself and takes a measurement (range) from the reflected light. More than that I can't make of it for now. We'll have to wait for the real transcript before saying anything definitive.
    See the hype-cycle thread, BDB's post of 19 Jan 2018 at 08:59, for developments in time-of-flight sensors.
  6. [deleted] 8 February 2018 10:53
    Thanks. It is indeed SA, but you read right past those few spelling errors.
    What stands out is that Musk sticks to his position and sees radar + passive camera as the holy grail.

    Incidentally, the call also had a question about the coast-to-coast drive. That is coming in three to at most six months. They were delayed on the AI software and want a generic solution, which will then be delivered to customers right after that drive.
  7. Wipo 8 February 2018 13:41
    A passive camera system relies on daylight, so at night only radar is available.
    An active system uses an infrared light source (photon generator) that also works at night.
  8. forum rank 6 leonardus65 8 February 2018 13:52
    www.nieuwsblad.be/cnt/dmf20180207_033...
    "The company is contractually barred from disclosing customer names, but 'on average, ten Melexis chips go into every new car worldwide today,' management announced"
    TomTom is not the only company that (almost) never discloses the names of its customers.
  9. Wipo 8 February 2018 14:07
    quote:

    Wipo wrote on 8 February 2018 13:41:

    A passive camera system relies on daylight, so at night only radar is available.
    An active system uses an infrared light source (photon generator) that also works at night.
    Unless you settle for the light from the headlights, but that seems very limited to me.
  10. forum rank 8 Beperktedijkbewaking 10 February 2018 08:36
    quote:

    groeibriljant wrote on 8 February 2018 09:29:

    Musk in last night's conference call on the sensors. He sounds fairly convincing. Perhaps one of the tech people here can briefly explain or summarize this.

    [...]
    Musk has a point. With ordinary (passive) optical cameras combined with AI, an SDC can already form a decent picture of the road ahead, roughly the way Mobileye does it. Distance estimation can then be done with a dual camera or dual lens, via the parallax principle (the reason we have two eyes); see the sketch after this post. And of course also with radar (and with time-of-flight cameras, but he doesn't mention those). For sideways distance estimation, acoustic sensors can do fine.

    A small problem remains: driving in the dark. Just use the headlights, then? I wouldn't really call that 'passive' any more, and the headlights of oncoming traffic strike me as a real problem. Radar doesn't solve that easily (too little resolution). Lidar does.
    Incidentally, Elon's arrogant, or rather condescending, drivel about 'photon generation' in radar is beside the point.
    The particle character of light plays no role whatsoever in terrestrial radar applications.

    What the above doesn't address is the precise localization of the vehicle. For that you do need lidar (and TT's RoadDNA as a 'road curtain').
    Perhaps it can also be done with radar alone, via Bosch's RRS (Road Radar Signature). But TomTom says it has a stake in that.
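    Editorial sketch of the parallax depth estimate mentioned above: with two cameras a baseline B apart, a point's depth follows from its pixel disparity d and the focal length f (in pixels) as z = f * B / d. The rig values below are made up for illustration:

        # Stereo depth from parallax (illustrative numbers, not a real rig).
        def stereo_depth_m(focal_px: float, baseline_m: float, disparity_px: float) -> float:
            """Depth of a point seen by both cameras, from its disparity in pixels."""
            if disparity_px <= 0:
                raise ValueError("disparity must be positive")
            return focal_px * baseline_m / disparity_px

        print(stereo_depth_m(1000.0, 0.30, 10.0))  # 30.0 m

    Note that depth grows as disparity shrinks, so small pixel errors blow up at long range; that is one reason stereo alone struggles where lidar or radar does not.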

  11. [deleted] 10 February 2018 13:05
    Embark? Hadn't heard of them before....

    www.ttnews.com/articles/embark-self-d...

    Embark’s trucks use a combination of cameras, radar and lidar to track the vehicle’s environment, as well as a form of artificial intelligence to process the data captured by the sensors.

    The company said its use of machine learning enabled it to complete its recent coast-to-coast run without undergoing the expensive and time-consuming process of premapping the entire route.
  12. forum rank 8 Beperktedijkbewaking 10 February 2018 13:07
    quote:

    Beperktedijkbewaking wrote on 10 February 2018 08:36:

    [...]
    ...
    The particle character of light plays no role whatsoever in terrestrial radar applications.
    ...
    To be a bit more precise:
    The particle character of electromagnetic radiation plays no role in terrestrial applications at wavelengths above 0.1 nm (and that is deep in the X-ray region; ASML is already struggling at 7 nm). See the worked numbers after this post.

    So at radar's millimetre wavelengths, certainly not. That Musk nevertheless goes on about photons just shows how much he is talking through his hat.
    That he scored with PayPal doesn't mean he was a good 'science' student.

    On the contrary: by shooting a car into space he wants to be a modern pharaoh. He will be punished with ten plagues. Tesla's share-price fall is the first. Strikes at his production plants could well be the next.
    And a customer strike the third punishment. Read the Old Testament for the remaining plagues. Also the story of Absalom.

    Two thirds of Americans are perishing in poverty, but Elon shoots a car into space...
    Trump and Musk have gone completely mad. What did Willem van Oranje say again in 1584? Lightly modernized:
    "My God, my God, have pity on me and on this poor American people."

    Know, Trump and Musk, that your principles of independence were borrowed from the Dutch watergeuzen (Sea Beggars) and craftsmen. Around 1770-1800 we were your only allies, and a model for your constitution (Obama understood this).
    Please don't tell us fairy tales.
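    Putting editorial numbers on the photon argument above, using E = h*c/λ: a 4 mm radar photon carries roughly 7,000 times less energy than a 550 nm visible-light photon, which is why the particle character of the radiation is irrelevant at radar wavelengths:

        # Photon energy versus wavelength (standard constants; editorial sketch).
        H = 6.62607015e-34  # Planck constant, J*s
        C = 299_792_458.0   # speed of light, m/s

        def photon_energy_j(wavelength_m: float) -> float:
            """Energy of a single photon at the given wavelength."""
            return H * C / wavelength_m

        visible = photon_energy_j(550e-9)  # ~3.6e-19 J
        radar = photon_energy_j(4e-3)      # ~5.0e-23 J
        print(visible / radar)             # ~7.3e3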

  13. [deleted] 20 February 2018 14:46
    The car from the future - the latest VW concept car can be controlled just using your voice

    The I.D. Vizzion contains no steering wheel or visible controls, instead relying on voice and gestures.

    home.bt.com/tech-gadgets/tech-news/vo...

    ..“For the ‘driver’ and passengers this signifies a new dimension of safety and comfort”.... believe it if you like, hahahahaha, this is never going to happen. Wait a second, they in fact say so themselves:

    Since it is a concept vehicle there are no definitive plans to put it into production yet – many car giants show off such cars as a way of demonstrating what they think the future of motoring could look like – but a driverless car future is a likely destination in the coming decades.
  14. forum rank 6 pwijsneus 21 February 2018 21:00
    Nobody Wants to Let Google Win the War for Maps All Over Again
    Self-driving cars need painfully detailed data on every inch of street. Can automakers solve the problem without the reigning superpower of maps?


    By Mark Bergen
    21 February 2018 11:00 CET
    From: Bloomberg Hyperdrive
    www.bloomberg.com/news/features/2018-...

    On any given day, there could be a half dozen autonomous cars mapping the same street corner in Silicon Valley. These cars, each from a different company, are all doing the same thing: building high-definition street maps, which may eventually serve as an on-board navigation guide for driverless vehicles.

    These companies converge where the law and weather are welcoming—or where they can get the most attention. For example, a flock of mapping vehicles congregates every year in the vicinity of the CES technology trade show, a hot spot for self-driving feats. “There probably have been 50 companies that mapped Las Vegas simply to do a CES drive,” said Chris McNally, an analyst with Evercore ISI. “It’s such a waste of resources.”

    Autonomous cars require powerful sensors to see and advanced software to think. They especially need up-to-the-minute maps of every conceivable roadway to move. Whoever owns the most detailed and expansive version of these maps that vehicles read will own an asset that could be worth billions.

    Which is how you get an all-out mapping war, with dozens of contenders entering into a dizzying array of alliances and burning tens of millions of investment dollars in pursuit of a massive payoff that could be years away. Alphabet Inc.’s Google emerged years ago as the winner in consumer digital maps, which human drivers use to evade rush-hour traffic or find a restaurant. Google won by blanketing the globe with its street-mapping cars and with software expertise that couldn’t be matched by navigation companies, automakers and even Apple Inc. Nobody wants to let Google win again.

    The companies working on maps for autonomous vehicles are taking two different approaches. One aims to create complete high-definition maps that will let the driverless cars of the future navigate all on their own; another creates maps piece-by-piece, using sensors in today’s vehicles that will allow cars to gradually automate more and more parts of driving.

    Alphabet is trying both approaches. A team inside Google is working on a 3-D mapping project that it may license to automakers, according to four people familiar with its plans, which have not previously been reported. This mapping service is different than the high-definition maps that Waymo, another Alphabet unit, is creating for its autonomous vehicles.

    Google’s mapping project is focused on so-called driver-assistance systems that enable cars to automate some driving features and help them see what’s ahead or around a corner. Google released an early version of this in December, called Vehicle Mapping Service, that incorporates sensor data from cars into their maps.

    For now, Google is offering it to carmakers that use Android Automotive, the company’s embedded operating system for cars. Google has named three partners for that system to date, but other automakers are reluctant to hand their dashboards over to the search giant. So Google is looking to expand the features on the mapping service and find other ways to distribute it, these people said.

    “We’ve built a comprehensive map of the world for people and we are working to expand the utility of our maps to cars,” a Google spokeswoman said in a statement. She declined to comment on future plans.

    At the same time, Waymo and the other giants with sizable driverless research arms—including General Motors Co., Uber Technologies Inc. and Ford Motor Co.—are all sending out their own fleets to create rich, detailed HD maps for use in driverless cars. There are also smaller startups hawking gadgets or specialized software to build these maps for automakers that find themselves farther behind. Still other suppliers are working on mapping services for conventional cars with limited robotic features, such as adaptive cruise control or night vision.

    These self-driving maps are far more demanding than older digital ones, prompting huge investments across Detroit, Silicon Valley and China. "An autonomous vehicle wants that to be as precise, accurate and up-to-date as possible," said Bryan Salesky, who leads Argo AI LLC, a year-old startup backed by a $1 billion investment by Ford. The "off-the-shelf solution doesn't quite exist."
  15. forum rank 6 pwijsneus 21 February 2018 21:01
    (continued)


    The Cartographic Arms Dealers


    Making a driverless map, like making a driverless car, is a laborious task. Fleets of autonomous test cars, loaded with expensive lidar sensors and cameras, go out into the world with human backup drivers and capture their surroundings. Plotting the results helps train the next fleet, which will still have safety drivers at the wheel—and, in some cases, scores of additional humans sitting behind computer monitors to catalog all the footage.

    It’s an expensive ordeal with a payoff that’s years, if not decades, away. “Even if you could drive your own vehicles around and hit every road in the world, how do you update?” asked Dan Galves, a spokesman for Mobileye. “You’d have to send these vehicles around again.”

    Unlike conventional digital maps, self-driving maps require almost-constant updates. The slightest variation on the road—a construction zone that pops up overnight, or a bit of debris—could stop a driverless car in its tracks. “It’s the freak thing that happens that’s going to make autonomous not work,” said McNally, the analyst.

    Mobileye argues that it’s more efficient and cost-effective to let the cars we’re driving today see what’s ahead. In January, the Intel Corp. unit announced a “low-bandwidth” mapping effort, with its front-facing camera and chip sensor that it plans to place in 2 million cars this year. The idea is to get cars to view such things as lane markers, traffic signals and road boundaries, letting them automate some driving.

    Mobileye says this will take less computing horsepower than building a comprehensive HD map of the roads would; Mobileye’s Galves said the company will pair its sensor data with the maps from navigational companies and, over time, create a map that a fully driverless car could use.

    That’s also the tactic of Google’s longtime mapping foes: HERE and TomTom NV. These two European companies have positioned themselves as the primary alternatives to Google Maps, selling the dashboard screen maps to automakers today. Yet these “static” maps see only broad street shapes and capture snapshots in time. Now both companies are working on replacement products: “dynamic” maps that represent lanes, curbs and everything else on the road. The hope is that car manufacturers will stick with old-guard mapmakers as vehicles move from somewhat intelligent to fully automated vehicles without steering wheels.

    HERE, owned by a consortium of German automakers, has a few examples on the road. Its mapping system enables limited hands-free driving for Audi AG, one of its co-owners, and plans to support safety features this year for Bayerische Motoren Werke AG, another co-owner. (Intel also took a 15 percent stake in HERE last year.)

    Tesla Inc. is the car company most eagerly embracing the incremental march toward autonomous driving with its driver-assistance software, Autopilot. Tesla relies on cameras and sensors on its vehicles but has eschewed lidar. The company hasn’t disclosed what mapping service it’s using for Autopilot, and a company representative declined to comment. Tesla had a nasty public split with Mobileye two years ago.

    But Tesla has leaned on at least one other company, Mapbox Inc., to help assemble its maps. Tesla paid $5 million to Mapbox for a two-year licensing deal in December 2015, according to a regulatory filing. Mapbox has mostly sold its location data to apps such as Pinterest and Snapchat. Fresh off a $164 million financing round, the startup has started to inch into automotive maps. Through its software installed on phones, Mapbox said it plots some 220 million miles of road data globally a day, providing an updated snapshot of basic features like street lanes.

    “We have more sensors on the road today than the entire connected car space will have by 2020,” said Chief Executive Officer Eric Gundersen. Its pitch to carmakers is to use that location data as a base layer for future maps—pairing it with camera systems, such as Mobileye’s, or their own sensor data. And like other companies targeting automakers, Mapbox is happy to play neutral and work with anyone. “We don't know who is going to win,” Gundersen said.
  16. forum rank 6 pwijsneus 21 February 2018 21:01
    (continued)

    The New Hotshot Pathfinders

    It’s not just that no one knows who will come out on top. The mapping industry doesn’t even know which strategy is best. Every self-driving map looks different because each one depends on the sensor system of the vehicle that creates it. And there isn't a standard sensor package, said Spark Capital’s Nabeel Hyatt, an early investor in Cruise Automation, the autonomous-driving company bought by General Motors in 2016 for $581 million.

    As a result, a slew of HD mapping companies are taking different stabs at the problem, each gobbling up venture capital and competing for lucrative contracts. Some of them disparage Mobileye’s approach, which relies on a seamless transition from semi-autonomous driving (what’s called Level 2 and 3) to driving without human assistance (Level 4 or 5). “It’s very hard to climb the ladder from 2 to 3 and then to 4,” said Wei Luo, COO of DeepMap Inc. “There’s a very intense gap.” The best HD maps, Luo argues, are built with only driverless functions in mind.

    Waymo is in this camp, too. The effort formerly known as the Google self-driving car project started on maps in 2009, with Waymo’s Andrew Chatham and one other engineer doing the “super tedious” work of crafting them from scratch—shipping cars packed with sensors to capture a city’s surroundings, then coding those 3-D images into a digital landscape. Chatham said cars may rely on perception systems alone to drive on the highway but would be helpless in other traffic conditions. Imagine pulling up to a busy, double-left-lane intersection you’ve never seen before. Now imagine a self-driving car trying to do that.

    “That’s the advantage of having a detailed map,” said Chatham. “We can give the cars all the answers to the nasty questions.” He said Waymo is exploring solutions to mapping real-time factors such as construction updates, but declined to share details.

    Thanks to its years of effort and artificial intelligence arsenal, Waymo is considered the leader in HD maps. But to date, the company has pitched its entire suite to prospective partners and landed few. Chatham declined to say whether Waymo is considering selling its map as a separate product.

    Another potential force in this market is Uber. The ride-hailing giant is also working on HD maps for its driverless program, using test vehicles in a similar way to Waymo. Lisa Weitekamp, an Uber manager, said the private company is exploring ways to place map-generating sensors inside the millions of human-driven vehicles in its service. The maps those cars already use—the “static” navigation software in the app that takes in popular routes and driving decisions—helps inform Uber’s driverless maps, Weitekamp added. “It gives us a leg up,” she said.

    That would make access to ride-hailing maps a valuable asset. Currently, Uber uses a combination of TomTom, Google and its own data for the maps its drivers and riders see. The contract between Uber and Google is set to expire this year, according to two people familiar with the deal. Representatives from both companies declined to comment.

    Plenty of newcomers are pitching carmakers on the need to catch up with front-runners such as Waymo and Uber. DeepMap Inc., started by veterans of Google and Apple, is banking on its intelligent software to cut down the time and cost involved in converting the images pulled from self-driving car sensors into a single, high-resolution landscape. The startup said it's working with Ford, Honda Motor Co. and China’s SAIC Motor Corp. (Mobileye is also working with SAIC, and Waymo is in talks with Honda.)

    Civil Maps has tech that “fingerprints” sensor data, forming digital grids with each loop made by a mapping vehicle around the same area. It's a bit like the way the mobile app Shazam recognizes a piece of music, said CEO Sravan Puttagunta. Ford is an investor and Puttagunta said his company is in the process of raising additional money.

    For now, most car companies are testing the waters rather than cutting massive, multimillion-dollar deals for maps. A Ford spokesman described its work with startups as “research.” Argo, the automaker’s self-driving bet, has looked at a variety of suppliers but is currently relying on its own internal maps. GM spokesman Ray Wert said the company prefers to do its own mapping.

    The new entrants know they can’t all survive. “It’s very similar to navigational maps or even the search engine,” said DeepMap’s Luo, a former Googler. “Whoever has bigger scale will have the advantage.”