Trailblazer trials, real-life applications and the internet of things are showing how big data and predictive analytics will underpin the smart cities of the future.
20 August 2018
Smart cities aim to measure as many aspects of the urban environment as possible, collecting data from sensors in buildings and street furniture to record anything from traffic and pedestrian movements to environmental variables such as temperature, seismic activity and pollution levels. When the data is analysed, the city’s infrastructure can be modified to respond to issues, potentially without human intervention.
In any discussion about smart cities, there is a temptation to refer to the concept in the abstract, as if they are part of some vision of a utopian future. In reality, some cities are already using the data they are capturing to transform the way they operate. Singapore, for example, has deployed an island-wide network of more than 1,000 sensors to collect data from busy areas such as traffic junctions, bus stops and taxi queues. The information is relayed over an ultra-high-speed broadband network to the relevant agencies to help them develop new services to reduce congestion.
In New York, Google-owned Sidewalk Labs has installed more than 200 out of a planned 7,500 LinkNYC kiosks across the city to replace old phone booths. The terminals offer pedestrians free super-fast WiFi and access to city services on a touchscreen, and in future they are likely to support a range of functions.
But the ability to roll out such innovations is not just limited to city authorities or big firms. The Things Network is a global community of more than 2,000 people working to establish free-to-use, crowd-sourced internet of things (IoT) networks that are connected via cheap, long-range, low-bandwidth antennas.
In Amsterdam, members were able to provide city-wide IoT coverage in just six weeks by installing 10 such “gateways”, costing $1,200 each. The network is based on low-power, long-range radio technology that has a reach of around 10km and requires no WiFi password or mobile subscription to access. Energy consumption is low – devices can run for up to three years on a single charge.
The service is being used to develop a range of open-source apps aimed at improving city life. One is designed to prevent Amsterdam’s canal boats sinking when they fill with water in rainy weather. An IoT device installed in the vessel detects the presence of water and sends a message to the owner. If they respond, a service called SaveMyBoat locates the vessel and clears the water out.
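The SaveMyBoat implementation itself is not public, but the detect-and-notify pattern it describes is simple. A minimal sketch, in which the threshold, payload fields and escalation step are all illustrative assumptions, might look like this:

```python
# Illustrative sketch of a bilge-water alert in the spirit of the
# SaveMyBoat service described above. The threshold value and the
# payload fields are hypothetical, not the service's real API.

def water_alert(reading_mm, threshold_mm=50.0):
    """Return an alert payload when the water level exceeds the threshold,
    or None when the boat is dry enough to leave alone."""
    if reading_mm <= threshold_mm:
        return None  # stay silent to conserve the device's battery
    return {
        "event": "water_detected",
        "level_mm": reading_mm,
        "action": "notify_owner",  # if unanswered, escalate to a pump-out crew
    }
```

Keeping the device silent below the threshold matters on a low-bandwidth, battery-powered network of this kind: transmitting only on state change is what makes multi-year battery life plausible.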
While vast amounts of data from sensors, databases or other forms of documentation can help cities respond to the needs of their citizens, further benefits come from being able to analyse the data to forecast events and allocate resources appropriately. Predictive analytics can be used to smooth traffic flows, or minimise the risk of overloading critical aspects of the city’s infrastructure by redirecting people and traffic into other areas.
The Bristol Is Open smart city test bed, a collaboration between the University of Bristol and Bristol City Council, has installed a “Data Dome” in the city’s former planetarium, where real-time data gathered from IoT sensors across the city is collected, analysed on a high-performance server and visualised in 3D on a giant screen. In one demonstration, the system collected CO2 emissions data and used it to build 3D models showing near-real-time pollution levels.
Dimitra Simeonidou, chief technology officer for Bristol Is Open, explains: “The system was configured to show the likely impact of interventions, such as changing pollution levels when traffic was rerouted. In future, it will enable many other functions: we will be able to visualise transport and citizen engagement, and make many smart interventions in the city.”
And at New York’s Mayor’s Office of Data Analytics (Moda), a small team interrogates spreadsheets and other documents, sourced from the city’s public bodies, to anticipate, predict and prevent problems and direct resources more effectively. Working with the New York Fire Department, Moda created an algorithm to more accurately predict the likelihood of fires in certain buildings. And, in the wake of Hurricane Sandy in 2012, it integrated data from city agencies, National Guard surveys and utilities suppliers to streamline the allocation of disaster response resources.
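Moda has not published its fire-risk model, so the following is only a hedged sketch of the general technique: combining building attributes into a score used to rank inspection candidates. The feature names and weights here are invented for illustration.

```python
# Toy illustration of building fire-risk scoring, in the spirit of Moda's
# work with the New York Fire Department. Features and weights are
# invented; the real model is not public.

WEIGHTS = {
    "open_violations": 0.4,   # outstanding safety violations on record
    "building_age": 0.3,      # decades since construction
    "prior_incidents": 0.3,   # fires previously recorded at the address
}

def risk_score(features):
    """Weighted sum of features; a higher score means inspect sooner."""
    return sum(WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS)

def prioritise(buildings):
    """Order inspection candidates by descending risk score."""
    return sorted(buildings, key=lambda b: risk_score(b["features"]), reverse=True)
```

In practice such a score would be learned from historical incident data rather than hand-weighted, but the output is the same: a ranked list that lets a limited pool of inspectors visit the riskiest buildings first.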
Inspired by Moda, London hopes to set up its own office of data analytics. “Using data that’s already available is the most fruitful area for public agencies to look into,” says Stefan Webb, head of projects at the Future Cities Catapult. He points to work carried out by London-based proptech firm Land Insight, which combines data sets drawn from the UK Land Registry and local authorities to provide an online tool for developers seeking new opportunities.
“There are different paradigms in terms of how local authorities make their data available. Whereas before it was sufficient to put it on a website, now, cities are curating their data and seeking solutions to different problems” - Stefan Webb, Head of projects, The Future Cities Catapult
When extremely large data sets are combined and analysed by computers, new patterns, trends and associations are revealed, which could result in many useful new insights or services.
Webb cites the UK city of Milton Keynes as a pioneer of this big data approach, with its smart city test bed MK:Smart. This “data analytics hub”, created by the Open University with telecoms group BT, collects and integrates information from energy, transport and water networks, weather and pollution statistics and crowd-sourced data from social media and mobile apps. The system “mashes up” various data points and runs “use cases” to identify potential applications.
The Community Action Platform for Energy is one such example. The tool combines aerial thermal survey data gathered using drones, and maps it against other data sources, such as energy performance certificates, to identify thermally inefficient properties. The project will use an online platform through which communities can manage and purchase energy-efficiency improvements, such as solar panels and loft insulation. The aim is to create a local marketplace for green energy products, bringing together suppliers and buyers.
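At its core, matching thermal survey readings against energy performance certificates is a join across two data sets keyed by address. A minimal sketch of that step, with illustrative field names and an assumed heat-loss threshold, might look like this:

```python
# Sketch of the kind of cross-referencing the Community Action Platform
# for Energy is described as doing: matching aerial thermal readings
# against energy performance certificate (EPC) ratings to flag
# thermally inefficient properties. Field names and the threshold
# are illustrative assumptions, not the project's real schema.

def flag_inefficient(thermal_readings, epc_ratings, loss_threshold=0.7):
    """Return addresses whose measured roof heat loss is high AND whose
    EPC rating is poor - i.e. both data sources agree the property leaks heat."""
    poor_ratings = {"E", "F", "G"}  # the weakest EPC bands
    flagged = []
    for address, heat_loss in thermal_readings.items():
        rating = epc_ratings.get(address)
        if heat_loss >= loss_threshold and rating in poor_ratings:
            flagged.append(address)
    return flagged
```

Requiring both sources to agree is the point of combining them: a warm roof alone might be a survey artefact, and an old certificate alone might predate an insulation retrofit.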
Upendra Dharmadhikary, client partner at IT company Tech Mahindra, a stakeholder in MK:Smart, comments: “The benefit of solar is limited unless the price point is reduced. We are saying, if you buy it as a community, there is a better business case.”
Bristol Is Open has also partnered with SMEs to write intelligent algorithms based on high-definition footage recorded by networked cameras placed around the city. One firm is developing an algorithm that can extract data on transport movements from the imagery. Another is translating that data into natural language that can be transmitted to citizens as tweets.
Simeonidou says: “Imagine the situation of a critical event in a city, such as a fire, flood, or a terrorist attack – the system will make it possible to send everyone a tweet explaining the situation. The system will then automatically update people as events progress.”
The work being done in Milton Keynes and Bristol highlights one other critical factor in making a smart city: its residents. In a world where 500 million tweets and six billion Facebook “likes” are generated every day, aggregating crowdsourced data can prove very useful in the drive to improve a city’s day-to-day functioning.
Civic authorities in India, for instance, have turned to smartphone apps to help improve air quality and reduce pollution. The Delhi state government has launched an app called Swachh Delhi, or Clean Delhi, that enables people to report illegally dumped rubbish by uploading photos. The state government in Bihar, meanwhile, is trying to clean up Patna – ranked among the dirtiest cities in India – with an app that allows local residents to report violations such as litter, flooding, dead animals, broken street lights and illegal construction.
Crowdsourcing apps launched by Singapore’s Smart Nation project include myResponder, developed with the Singapore Civil Defence Force, which aims to reduce fatalities caused by cardiac arrests in the city state’s ageing population. According to the Registry for Automated External Defibrillator Integration (R-AEDi), more than 1,800 people suffer an out-of-hospital cardiac arrest (OHCA) each year. Without intervention, the chance of survival drops by 7%-10% with every minute that passes after an OHCA.
MyResponder aims to increase the chances of intervention. Individuals trained in CPR and the use of defibrillators sign up to the app; if they are within 100 metres of a person who collapses and needs help, the app alerts them and directs them to the casualty’s location.
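The 100-metre radius check at the heart of such an alert is a great-circle distance test. A sketch using the standard haversine formula follows; only the alert radius comes from the article, and myResponder's actual matching logic is an assumption.

```python
import math

# Sketch of the proximity check myResponder is described as performing.
# The haversine formula gives the great-circle distance between two
# (latitude, longitude) points; the 100m radius is from the article,
# everything else is an illustrative assumption.

ALERT_RADIUS_M = 100.0
EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius in metres

def haversine_m(lat1, lon1, lat2, lon2):
    """Distance in metres between two points given in decimal degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def should_alert(responder, casualty):
    """Alert a trained responder only if they are within the radius.
    Both arguments are (lat, lon) tuples in decimal degrees."""
    return haversine_m(*responder, *casualty) <= ALERT_RADIUS_M
```

At production scale an app would not scan every registered responder this way; a spatial index would shortlist nearby candidates first, with a distance check like this as the final filter.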
The logical culmination of this confluence of big data, big business, local government and people power is the creation of the fully operational, truly smart city. But so far, this vision is proving difficult to realise. Why? The biggest obstacle could be the very thing that makes the concept so exciting: the diversity and reach of its applications. Data protection laws limit the amount of information a company can hold on a subject. Privacy laws restrict access to many public datasets, limiting the insights available. And many smart solutions have been developed in isolation and are unable to connect to each other in a secure and open ecosystem.
“Many companies offered a vision of an end-to-end smart city with mega efficiency, driven by a variety of technologies and data analysis techniques. The problem is that doesn’t exist,” concludes Webb. “Cities need to think about what applications they need to meet their specific challenges and continue building responses to that, designing around the citizen, discovering what technology is required to meet human needs.”
The World Built Environment Forum facilitates industry-leading discussions harnessing the enormous potential of the 21st century’s people and places.