

Walkability reflects the well-being of a city, and its measurement is evolving rapidly due to advancements in big data and machine learning technologies. This study examines the transformative impact of these technological interventions on the evaluation of walkability trends over the period 2015 to 2022. We create a framework consisting of big data sources, machine learning methods, and research purposes, revealing research trajectories and associated challenges. Despite diverse data usage, image data dominates walkability research. While street view and point-of-interest data were primarily used to depict the environment, social media and handheld/wearable data were more commonly employed to represent user behaviours or perceptions. Leveraging machine learning in conjunction with big data assists researchers in three aspects of walkability studies. First, researchers utilise classification and clustering to predict street quality and walkability, and to identify neighbourhoods with certain characteristics. Second, researchers unveil relationships between the built environment and pedestrian perceptions or behaviours through regression analysis. Third, researchers employ generative models to create streetscapes or urban structures, although their utilisation is limited. Meanwhile, challenges persist in data access, customisation of machine learning models for urban studies, and establishing standard criteria to guarantee data quality and model accuracy.

Original language: English
Article number: 102087
Pages (from-to): 1-20
Number of pages: 20
Journal: Computers, Environment and Urban Systems
Publication status: Published - Apr 2024
MoE publication type: A2 Review article, Literature review, Systematic review


  • Artificial intelligence
  • Built environment variable
  • Pedestrian behaviour
  • Pedestrian perception
  • Review
  • Urban study


Dive into the research topics of 'From intangible to tangible: The role of big data and machine learning in walkability studies'. Together they form a unique fingerprint.
