Friday 1st December saw a very healthy turnout for the last EEO-AGI seminar of 2017, given by the engaging and convivial Ed Parsons, Google’s Geospatial Technologist and self-professed geographer-in-residence. Having moved from GIS Applications Manager at Autodesk to Chief Technology Officer at Ordnance Survey, and having served as Executive Fellow at the University of Aberdeen and Visiting Professor at UCL, Ed has carved a career path in geospatial data management and visualisation. In his current role, he seeks to evangelise Google’s efforts to improve the world using geospatial data. As he sees it, Google may seem like a giant, frightening techno-monster, but Ed is here to act as the friendly geographer conduit.
Ed’s enjoyable talk focused on his near and distant predictions for the future of technology and the use of spatial data. 56% of all mobile Google searches are for local information, and with Google Maps now serving 1 billion users, a remarkable share of the world’s population now weaves spatial information into daily life. Suddenly the giant techno-monster analogy didn’t seem so far off…
As Ed joked, ‘only idiots do lectures about the future’, but he made a decent effort in presenting megatrends we can no longer ignore. We are moving into a world of urban living, where residents are comfortable with technology and businesses increasingly make successful use of ‘big data’, APIs and web services. Ambient location is becoming a natural part of life: the arrival of Google Maps on the iPhone in 2007 helped maps transition from static information to dynamic tools for daily tasks. Maps now operate egocentrically, placing the user at the centre of the data. With travel notifications appearing as you walk into a train or bus station, digital assistants like Alexa and Siri, and lights and thermostats that switch on as you near your house, ‘science fiction technology’ and the use of locational data are becoming part of ordinary life. Indeed, Ed quoted Mark Weiser, chief scientist at Xerox PARC: “The most profound technologies are those that disappear. They weave themselves into the fabric of everyday life until they are indistinguishable from it.” The question of data privacy was raised, and countered with the reassurance that at the heart of ambient location is the freedom to withhold your information. It is very rare nowadays to get lost with locational data at our fingertips, but we must be offered the choice to get lost if we want to.
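As a toy illustration of the proximity triggers behind such ambient-location features (the coordinates, radius and function names below are invented for the example, and nothing here reflects how Google or any smart-home vendor actually implements this), a geofence check can be as simple as a great-circle distance test:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in metres."""
    R = 6371000  # mean Earth radius in metres
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

HOME = (55.9533, -3.1883)  # Edinburgh city centre, purely illustrative
RADIUS_M = 100             # trigger when a location fix lands within 100 m

def near_home(lat, lon):
    """A device fix inside the radius would fire the 'lights on' action."""
    return haversine_m(lat, lon, *HOME) <= RADIUS_M

print(near_home(55.9600, -3.2000))    # a fix a few streets away: no trigger
print(near_home(55.95331, -3.18832))  # a fix at the front door: trigger
```

In practice the interesting engineering lies elsewhere (noisy GPS fixes, battery-friendly sampling, and the opt-out that Ed emphasised), but the core geometry really is this small.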
Voyaging into the world of virtual reality, Ed acknowledged the criticism and the lack of uptake of the Google Glass headwear but highlighted the potential applications and advantages of 3D modelling and user-friendly augmented reality software. The hardware still experiences limitations in true 3D movement, but with the use of SLAM (simultaneous localisation and mapping) software, photogrammetry techniques are helping to fill in the blanks of indoor mapping and movement in virtual space using just handheld processing power.
Ed next presented a playful analogy of technology over the last three decades as headgear: the hardhats of 1995-2005 (focused on defence, engineering and inexpensive solutions), the fedora and sunglasses of 2005-2015 (a gentle mockery of the hipsters who created a stylish, entertaining and mobile internet) and the robot head of the present (depicting the development of artificial intelligence). Earth Engine, Google’s cloud-based platform for remote sensing image analysis, now offers a fast, free, up-to-date alternative to traditionally slow and clunky remote sensing programs. With over 5 petabytes of data available, Ed gave a demo of the platform’s impressive ability to remove clouds from aerial imagery over the UK on-the-fly. As is the way with live demos, a minor snag required Ed to log back in to his Google account and then perform two-factor authentication, spawning laughter but allowing him to vouch for the usefulness of this security measure: “I could have been a malicious person trying to steal Ed’s details!”
The tech evangelist finished with some examples of machine learning: feature recognition in driverless cars, computer-controlled drone racing, landmark recognition derived from frequent congregations of people sharing their location (“no human was used in the making of this map”), and Global Fishing Watch combining 10 TB of ship-tracking data with aerial images to identify illegal fishing hotspots. Finally, Ed thanked everyone who has ever filled in a CAPTCHA, explaining how Google improved its photo-recognition of street signs and business names by training on the millions of CAPTCHA answers entered by unwitting humans. Overall, the seminar proved an enjoyable tour of the contributions of Google and others to the progression of geospatial data usage, and of where we may see ourselves (alongside the pitfalls, legalities and questions of morality we must be aware of) in stepping into a future of machine-and-human harmony.
(MSc in Geographical Information Science at University of Edinburgh)