Strengthening Urban Resilience in the age of Fragmentation (SURF)
🌇

Tags
Python
R
Computer Vision
Machine Learning
geopandas
selenium
openpyxl
transbigdata
osmnx
cv2
scipy
matplotlib
seaborn
Published
January 10, 2024
Author
About the Project

Strengthening Urban Resilience in the age of Fragmentation (SURF) is a research project kickstarted by a research team at the Lee Kuan Yew Centre for Innovative Cities. The project aims to inform policymakers on (1) the likely impacts and tradeoffs of implementing different transport electrification policies, and (2) the limits and potential of improving urban infrastructure that aims to encourage walking.
Read more about the project here:

Role in this Project

I was recruited as a student researcher for this project under the Undergraduate Research Opportunity Programme (UROP) scheme to assist with the following tasks.

Scraping Traffic Data

The research team needed to extract useful traffic data to perform its analysis. My task was to scrape rush-hour traffic data from the web for the city of Phnom Penh (our area of research).
To solve this, I created a Python program that captures screenshots of the road network from a web map application. The program then analyzes the color-coded roads and extracts traffic information from them.
The main challenge was writing code that performs this task efficiently. We needed to extract data from numerous parts of the road network, so the program had to do so while using resources (computational time and storage space) as sparingly as possible. A simplified sketch of the approach is shown below.
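The sketch below illustrates the two-step idea of capturing a map screenshot with Selenium and classifying the traffic colors with OpenCV. The map URL, wait time, and BGR color ranges are illustrative placeholders I have assumed for this write-up, not the values used in the actual project.

```python
import time

import cv2
import numpy as np
from selenium import webdriver


def capture_tile(lat, lon, zoom=15, out_path="tile.png"):
    """Open a web map centred on (lat, lon) with its traffic layer on and save a screenshot."""
    driver = webdriver.Chrome()
    try:
        # Placeholder URL: substitute the traffic map page being scraped.
        driver.get(f"https://example-map-provider.com/traffic?lat={lat}&lon={lon}&zoom={zoom}")
        time.sleep(5)  # crude wait for the map tiles to finish rendering
        driver.save_screenshot(out_path)
    finally:
        driver.quit()
    return out_path


def traffic_breakdown(image_path):
    """Classify pixels by congestion colour and return each class's share of coloured pixels."""
    img = cv2.imread(image_path)
    # Approximate BGR ranges for a typical green / orange / red traffic palette (assumed values).
    ranges = {
        "free_flow": ((80, 180, 80), (160, 255, 160)),
        "moderate": ((0, 130, 220), (90, 200, 255)),
        "congested": ((0, 0, 180), (90, 90, 255)),
    }
    counts = {
        label: int(cv2.countNonZero(cv2.inRange(img, np.array(lo), np.array(hi))))
        for label, (lo, hi) in ranges.items()
    }
    total = sum(counts.values()) or 1
    return {label: count / total for label, count in counts.items()}


if __name__ == "__main__":
    path = capture_tile(11.5564, 104.9282)  # approximate centre of Phnom Penh
    print(traffic_breakdown(path))
```

In the real pipeline, screenshots would be taken across many parts of the network and reused where possible, since limiting browser launches and stored images was what kept the computational time and storage footprint manageable.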

Traffic Simulation ML Model (Ongoing)

Our next task was to develop a simulation that models the daily movement of 2,000,000 Phnom Penh residents and the mode of transport used in their travels. This was challenging because we needed to consider the various factors that may affect a commuter's choice of transport mode.
As such, it was necessary to develop an ML model to generate this data, and I was tasked with building it. A simplified sketch of what such a mode-choice model could look like is shown below.
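Since the model is still in progress, the sketch below only illustrates the general shape of a supervised mode-choice classifier. The feature names, the synthetic data, and the use of scikit-learn are my own illustrative assumptions rather than the project's final design.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000

# Hypothetical commuter attributes that could influence mode choice.
features = pd.DataFrame({
    "trip_distance_km": rng.gamma(2.0, 3.0, n),
    "household_income": rng.lognormal(7.0, 0.5, n),
    "owns_motorbike": rng.integers(0, 2, n),
    "departure_hour": rng.integers(5, 22, n),
})
# Placeholder labels: 0 = walk, 1 = motorbike, 2 = car, 3 = bus.
labels = rng.integers(0, 4, n)

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.2, random_state=0
)
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```

In the actual simulation, a trained model like this would be applied to each of the 2,000,000 simulated residents' daily trips to assign a transport mode, which is what makes the choice of features and training data so consequential.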