MOTOSHARE
Turning Idle Vehicles into Shared Rides & Earnings
From Idle to Income. From Parked to Purpose.
Earn by Sharing, Ride by Renting.
Where Owners Earn, Riders Move.
Owners Earn. Riders Move. Motoshare Connects.
With Motoshare, every parked vehicle finds a purpose.
Owners earn. Renters ride.
Everyone wins.
Source: wired.com
Fasten your harnesses, because the era of cloud computing's giant data centers is about to be rear-ended by the age of self-driving cars. Here's the problem: When a self-driving car has to make snap decisions, it needs answers fast. Even slight delays in updating road and weather conditions could mean longer travel times or dangerous errors. But those smart vehicles of the near future don't quite have the huge computing power to process the data necessary to avoid collisions, chat with nearby vehicles about optimizing traffic flow, and find the best routes that avoid gridlocked or washed-out roads. The logical source of that power lies in the massive server farms where hundreds of thousands of processors can churn out solutions. But that won't work if the vehicles have to wait the 100 milliseconds or so it usually takes for information to travel each way to and from distant data centers. Cars, after all, move fast.
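To make that 100-millisecond figure concrete, here is a minimal back-of-the-envelope sketch of how far a vehicle travels while a single request is in flight; the round-trip time comes from the passage above, while the vehicle speeds are illustrative assumptions:

```python
# Back-of-the-envelope: how far does a car travel during one cloud round trip?
# The ~100 ms round-trip figure comes from the article; the vehicle speeds
# below are illustrative assumptions.

ROUND_TRIP_S = 0.100  # ~100 ms to and from a distant data center

def distance_during_round_trip(speed_kmh: float) -> float:
    """Meters traveled while a request and its response are in flight."""
    speed_m_per_s = speed_kmh * 1000 / 3600  # km/h -> m/s
    return speed_m_per_s * ROUND_TRIP_S

for speed_kmh in (50, 100, 130):  # city street, highway, fast highway
    meters = distance_during_round_trip(speed_kmh)
    print(f"{speed_kmh:>3} km/h -> {meters:.1f} m traveled per round trip")

# At highway speed (~100 km/h) the car covers roughly 2.8 meters before an
# answer can even arrive, and that is before any server-side processing time.
```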
That problem from the frontier of technology is why many tech leaders foresee the need for a new "edge computing" network, one that turns the logic of today's cloud inside out. Today the $247 billion cloud computing industry funnels everything through massive centralized data centers operated by giants like Amazon, Microsoft, and Google. That's been a smart model for scaling up web search and social networks, as well as streaming media to billions of users. But it's not so smart for latency-intolerant applications like autonomous cars or mobile mixed reality.
"It's a foregone conclusion that giant, centralized server farms that take up 19 city blocks of power are just not going to work everywhere," says Zachary Smith, a double-bass player and Juilliard School graduate who is the CEO and cofounder of a New York City startup called Packet. Smith is among those who believe that the solution lies in seeding the landscape with smaller server outposts (those edge networks) that would widely distribute processing power in order to speed its results to client devices, like those cars, that can't tolerate delay.
Packet's scattered micro data centers are nothing like the sprawling facilities operated by Amazon and Google, which can contain tens of thousands of servers and squat outside major cities in suburbs, small towns, or rural areas, thanks to their huge physical footprints and energy appetites. Packet's centers often contain just a few server racks, but the company promises customers in major cities speedy access to raw computing power, with average delays of just 10 to 15 milliseconds (an improvement of roughly a factor of ten). That kind of speed is on the "must have" lists of companies and developers hoping to stream virtual reality and augmented reality experiences to smartphones, for example. Such experiences rely upon a neurological process, the vestibulo-ocular reflex, that coordinates eye and head movements. It occurs within seven milliseconds, and if your device takes 10 times that long to hear back from a server, forget about suspension of disbelief.
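A rough way to see why those numbers matter is to compare the latency figures cited here against that seven-millisecond reflex window. This is only an illustrative comparison of network transit times; real motion-to-photon delay also includes rendering, encoding, and display time, which the sketch below ignores:

```python
# Compare the latency figures cited in the piece against the ~7 ms
# vestibulo-ocular reflex window. Network transit is only one slice of total
# motion-to-photon delay, so this is an illustrative lower bound rather than
# a full latency budget.

VOR_WINDOW_MS = 7  # vestibulo-ocular reflex window, per the article

round_trips_ms = {
    "edge micro data center": (10, 15),       # Packet's quoted average range
    "distant cloud data center": (100, 100),  # typical round trip cited above
}

for name, (low, high) in round_trips_ms.items():
    print(f"{name}: {low}-{high} ms, i.e. {low / VOR_WINDOW_MS:.1f}x-"
          f"{high / VOR_WINDOW_MS:.1f}x the reflex window")
```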
Immersive experiences are just the start of this new kind of need for speed. Everywhere you look, our autonomously driving, drone-clogged, robot-operated future needs to shave more milliseconds off its network-roundtrip clock. For smart vehicles alone, Toyota noted that the amount of data flowing between vehicles and cloud computing services is estimated to reach 10 exabytes per month by 2025.
Cloud computing giants haven't ignored the lag problem. In May, Microsoft announced the testing of its new Azure IoT Edge service, intended to push some cloud computing functions onto developers' own devices. Barely a month later, Amazon Web Services opened up general access to AWS Greengrass software that similarly extends some cloud-style services to devices running on local networks. Still, these services require customers to operate hardware on their own. Customers who are used to handing that whole business off to a cloud provider may view that as a backwards step.
US telecom companies are also seeing their build-out of new 5G networks, which should eventually support faster mobile data speeds, as a chance to cut down on lag time. As the service providers expand their networks of cell towers and base stations, they could seize the opportunity to add server power to the new locations. In July, AT&T announced plans to build a mobile edge computing network based on 5G, with the goal of reaching "single-digit millisecond latency." Theoretically, data would only need to travel a few miles between customers and the nearest cell tower or central office, instead of hundreds of miles to reach a cloud data center.
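The distance argument can be checked with simple propagation arithmetic. Signals in optical fiber cover roughly 200 kilometers per millisecond; the specific distances below are assumptions chosen only to contrast "a few miles" with "hundreds of miles":

```python
# Why distance alone matters: signals in optical fiber travel at roughly
# two-thirds the speed of light in vacuum, about 200 km per millisecond.
# The distances below are illustrative assumptions; the article's claim is
# only that "a few miles" beats "hundreds of miles."

FIBER_KM_PER_MS = 200.0  # ~200,000 km/s propagation speed in fiber

def round_trip_propagation_ms(one_way_km: float) -> float:
    """Propagation delay alone, ignoring radio, queuing, routing, and processing."""
    return 2 * one_way_km / FIBER_KM_PER_MS

for label, km in [("nearby cell tower or central office", 8),
                  ("distant cloud data center", 800)]:
    print(f"{label}: ~{round_trip_propagation_ms(km):.2f} ms round-trip propagation")

# Roughly 0.08 ms versus 8 ms: physics leaves room for single-digit-millisecond
# latency only when the server sits close by; radio, switching, and processing
# all stack on top of that floor.
```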
AT&T claims it has a head start on rival telecoms because of its "network virtualization initiative," which includes the software capability to automatically juggle workloads and make good use of idle resources in the mobile network, according to Andre Fuetsch, AT&T's chief technology officer. It's similar to how big data centers use virtualization to spread out a customer's data processing workload across multiple computer servers.
Meanwhile, companies such as Packet might be able to piggyback their own machines onto the new facilities, too. "I think we're at this time where a huge amount of investment is going into mobile networks over the next two to three years," Packet's Smith says. "So it's a good time to say 'Why not tack on some compute?'" (Packet's own funding comes in part from the giant Japanese telecom and internet conglomerate Softbank, which invested $9.4 million in 2016.) In July 2017, Packet announced its expansion to Ashburn, Atlanta, Chicago, Dallas, Los Angeles, and Seattle, along with new international locations in Frankfurt, Toronto, Hong Kong, Singapore, and Sydney.
Packet is far from the only startup making claims on the edge. Austin-based Vapor IO has already begun building its own micro data centers alongside existing cell towers. In June, the startup announced its "Project Volutus" initiative, which includes a partnership with Crown Castle, the largest US provider of shared wireless infrastructure (and a Vapor IO investor). That enables Vapor IO to take advantage of Crown Castle's existing network of 40,000 cell towers and 60,000 miles of fiber optic lines in metropolitan areas. The startup has been developing automated software to remotely operate and monitor micro data centers to ensure that customers don't experience interruptions in service if some computer servers go down, says Cole Crawford, Vapor IO's founder and CEO.
Don't look for the edge to shut down all those data centers in Oregon, North Carolina, and other rural outposts: Our era's digital cathedrals are not vanishing anytime soon. Edge computing's vision of having "thousands of small, regional and micro-regional data centers that are integrated into the last mile networks" is actually a "natural extension of today's centralized cloud," Crawford says. In fact, the cloud computing industry has extended its tentacles toward the edge with content delivery networks such as Akamai, Cloudflare, and Amazon CloudFront that already use "edge locations" to speed up delivery of music and video streaming.
Nonetheless, the remote computing industry stands on the cusp of a "back to the future" moment, according to Peter Levine, general partner at the venture capital firm Andreessen Horowitz. In a 2016 video presentation, Levine highlighted how the pre-2000 internet once relied upon a decentralized network of PCs and client servers. Next, the centralized network of the modern cloud computing industry really took off, starting around 2005. Now, demand for edge computing is pushing development of decentralized networks once again (even as the public cloud computing industry's growth is expected to peak at 18 percent this year, before starting to taper off).
That kind of abstract shift is already showing up in concrete ways, unlocking experiences that could only exist with help from the edge. Hatch, a spinoff company from Angry Birds developer Rovio, has begun rolling out a subscription game-streaming service that allows smartphone customers to begin playing instantly, without waiting on downloads. The service offers low-latency multiplayer and social gaming features such as sharing gameplay via Twitch-style live-streaming. Hatch has been cagey about the technology it developed to slash the number of data-processing steps in streaming games, other than saying it eliminates the need for video compression and can stream mobile games at 60 frames per second. But when it came to figuring out how to transmit and receive all that data without latency wrecking the experience, Hatch teamed up with (guess who) Packet.
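Hatch's 60-frames-per-second target hints at why the network round trip matters so much for streamed games: at that rate a frame lasts about 16.7 milliseconds, so every extra round-trip millisecond shows up as input lag. Here is a minimal sketch using the latency figures cited earlier; nothing in it reflects Hatch's actual, undisclosed pipeline:

```python
# Streaming a game means each input crosses the network before the frame that
# reacts to it can come back. At 60 fps a frame lasts about 16.7 ms, so the
# round trip is usefully measured in "frames of added input lag." The
# round-trip figures come from earlier in the article; the rest is arithmetic.

FRAME_MS = 1000 / 60  # ~16.7 ms per frame at 60 frames per second

def frames_of_input_lag(round_trip_ms: float) -> float:
    """Frames that elapse while an input goes up and the resulting frame comes back."""
    return round_trip_ms / FRAME_MS

for label, rtt_ms in [("edge micro data center", 12.5), ("distant cloud", 100.0)]:
    print(f"{label}: {rtt_ms:.0f} ms round trip ~ "
          f"{frames_of_input_lag(rtt_ms):.1f} frames of added input lag")

# Under one frame of extra lag via a nearby edge node, versus roughly six
# frames via a distant data center.
```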
"We are one of the first consumer-facing use cases for edge computing," says Juhani Honkala, founder and CEO of Hatch. "But I believe there will be other use cases that can benefit from low latency, such as AR/VR, self-driving cars, and robotics."
Of course, most Hatch customers will not know or care how those micro data centers allow them to play games with friends instantly. The same blissful ignorance will likely surround most people who stream augmented-reality experiences on their smartphones while riding in self-driving cars 10 years from now. All of us will gradually come to expect new computer-driven experiences to be available anywhere, instantly, as if by magic. But in this case, magic is just another name for putting the right computer in the right place at the right time.
"There is so much more that people can do," says Packet's Smith, "than stare at their smartphones and wait for downloads to happen." We want our computation now. And the edge is the way we'll get it.