Kuldeep R Kurte

Research Scientist | Urban sciences | Data science | GeoAI | ML | DL | Reinforcement learning

Kuldeep Kurte is currently a research scientist in the Computational Urban Sciences Group (CUSG) at Oak Ridge National Laboratory. He received his Ph.D. in Image Information Mining from the Indian Institute of Technology Bombay, India, in 2017. During his Ph.D., he developed an interest in machine learning and high-performance computing for geoscience and remote sensing applications. After his Ph.D., he joined Oak Ridge National Laboratory as a postdoctoral researcher in scalable geo-computation in January 2018. In his first project at the lab, he developed a scalable end-to-end settlement detection workflow and deployed it on the Titan supercomputer; the workflow was later used to detect swimming pools in Texas from satellite imagery. As part of the Urban Exascale Computing Project (UrbanECP), he built a capability for running many instances of the Transims transportation simulation on Titan. He also analyzed the regional-scale impact of inclement weather on traffic speed and coupled Transims output with building energy simulations through an efficient spatial indexing approach. Continuing his interest in data-driven urban computing, he is currently working on intelligent HVAC control using reinforcement learning for building energy optimization. Kuldeep has experience developing applications on HPC platforms ranging from the NVIDIA Jetson TK1 to the Summit supercomputer.

Research Projects:

2018 Scalable Geo-computation for Global Human Settlement Discovery [ORNL]

  • Developed a scalable, deep-learning-based workflow for identifying man-made structures in remote sensing imagery.
  • I benchmarked the workflow's performance on the Titan supercomputer and implemented a dedicated load-balancing and checkpoint-restart strategy to handle the unbalanced remote sensing workload and prevent data loss from runtime failures.
  • Using this workflow, a 0.31 m resolution remote sensing dataset covering 685,675 km² of the Republic of Zambia was processed in under six hours on 5,426 nodes of the Titan supercomputer for the Bill and Melinda Gates Foundation.
  • The workflow was later used to detect swimming pools in Texas, US: 4,106 remote sensing images (2 GB–12 GB each) were processed in under ten minutes. I developed a GPU-parallel, index-based swimming pool detection algorithm and ran it as a large job on the Titan supercomputer. The algorithm can handle remote sensing images of any size, irrespective of the GPU hardware (see the sketch after this list).
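A minimal, illustrative Python sketch of the tiled, index-based detection idea is given below. The NDWI-style water index, the threshold, and the tile size are assumptions made here for illustration; the production algorithm ran as a GPU-parallel kernel on Titan, and its exact index and parameters are not shown.

    import numpy as np

    def pool_mask_tiled(green, nir, tile=1024, threshold=0.3):
        # Process the scene tile by tile so peak memory stays bounded; this is
        # how arbitrarily large images can be handled irrespective of the
        # accelerator's memory (illustrative CPU/NumPy stand-in for the GPU code).
        h, w = green.shape
        mask = np.zeros((h, w), dtype=bool)
        for r in range(0, h, tile):
            for c in range(0, w, tile):
                g = green[r:r + tile, c:c + tile].astype(np.float32)
                n = nir[r:r + tile, c:c + tile].astype(np.float32)
                ndwi = (g - n) / (g + n + 1e-6)   # NDWI-style water index (assumed)
                mask[r:r + tile, c:c + tile] = ndwi > threshold
        return mask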

2018–2019 Urban Exascale Computing Project (UrbanECP) [ORNL]

  • This was a multi-lab project funded by the US DOE. The objective of this collaborative effort was to couple multi-scale urban models, such as weather models, transportation simulations, and building energy models, to simulate the interdependencies of these systems in the urban environment.
  • I developed a capability to execute many instances of the transportation simulation models on the Titan supercomputer using Swift/T, a high-performance workflow management system (a simplified launcher pattern is sketched after this list).
  • As part of this effort, I performed a spatio-temporal study of the impact of inclement weather on the transportation system of Chicago, IL, USA, and its overall energy footprint.
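The ensemble capability itself was built with Swift/T; purely as an illustration of the many-instances pattern, here is a minimal Python sketch that launches independent simulation runs concurrently. The executable name and configuration files are hypothetical placeholders.

    import subprocess
    from concurrent.futures import ThreadPoolExecutor

    # Hypothetical scenario configuration files, one per simulation instance.
    SCENARIOS = [f"scenario_{i:03d}.cfg" for i in range(100)]

    def run_instance(cfg):
        # Placeholder command line; the actual runs were driven by Swift/T
        # across Titan compute nodes rather than a local process pool.
        return subprocess.run(["transims", cfg], capture_output=True).returncode

    if __name__ == "__main__":
        with ThreadPoolExecutor(max_workers=8) as pool:
            codes = list(pool.map(run_instance, SCENARIOS))
        print(f"{codes.count(0)}/{len(codes)} instances finished successfully")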

2019–present Building Energy Management using Reinforcement Learning [ORNL]

  • The goal of this project (funded by the US DOE) is to develop a cost-effective load management system that can be used in existing homes with minimal effort, enabling load flexibility that saves money for homeowners and improves resiliency for the power grid.
  • My ongoing contribution to this project is developing deep reinforcement learning (RL) based optimization techniques that enable data-driven observation, prediction, and control dispatch of end-use devices such as heating, ventilation, and air conditioning (HVAC) systems (a minimal control-loop sketch follows this list).
  • The developed models were deployed in a research house to control the HVAC system during the winter and summer periods of 2020.
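As a rough illustration of the control formulation only, the sketch below uses tabular Q-learning to choose a thermostat setpoint; the deployed work used deep RL, and the state discretization, reward weights, and toy environment here are all hypothetical.

    import random
    import numpy as np

    SETPOINTS = [19.0, 20.0, 21.0, 22.0]    # discrete setpoint actions in degC (assumed)
    N_STATES = 24                            # e.g., hour-of-day buckets (placeholder)
    q_table = np.zeros((N_STATES, len(SETPOINTS)))

    def step(state, action):
        # Toy environment: trade off HVAC energy use against a comfort penalty.
        # The real project used building simulations and a research house.
        energy_kwh = 0.5 + 0.1 * action
        comfort_violation = max(0.0, 21.0 - SETPOINTS[action])
        reward = -energy_kwh - 10.0 * comfort_violation
        return (state + 1) % N_STATES, reward

    alpha, gamma, eps = 0.1, 0.95, 0.1       # learning rate, discount, exploration rate
    state = 0
    for _ in range(10000):
        if random.random() < eps:
            action = random.randrange(len(SETPOINTS))
        else:
            action = int(np.argmax(q_table[state]))
        next_state, reward = step(state, action)
        q_table[state, action] += alpha * (
            reward + gamma * q_table[next_state].max() - q_table[state, action])
        state = next_state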

2019–2020 Large-scale Hypergraph Analytics [ORNL]

  • Phoenix is a high-performance, distributed streaming graph analytics framework developed under the Durmstrang project (funded by the US DoD); it focuses on enabling concurrent online and offline graph analysis.
  • The framework implements a high-performance, in-memory data structure for streaming property graphs that supports both online and offline queries; however, it did not support generating or processing streaming hypergraphs.
  • My main contribution to this project was extending Phoenix's capability to streaming hypergraphs (a toy incidence-map sketch appears after this list).
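To make the hyperedge idea concrete, here is a toy, single-process Python sketch of a streaming hypergraph kept as an incidence map; Phoenix's distributed, high-performance data structure is different, and the class and method names below are invented for illustration.

    from collections import defaultdict

    class StreamingHypergraph:
        # Toy in-memory streaming hypergraph: hyperedges connect arbitrary
        # vertex sets and arrive or depart as a stream of updates.
        def __init__(self):
            self.edges = {}                      # edge_id -> set of vertices
            self.incidence = defaultdict(set)    # vertex  -> set of edge_ids

        def add_hyperedge(self, edge_id, vertices):
            self.edges[edge_id] = set(vertices)
            for v in vertices:
                self.incidence[v].add(edge_id)

        def remove_hyperedge(self, edge_id):
            for v in self.edges.pop(edge_id, set()):
                self.incidence[v].discard(edge_id)

        def neighbors(self, vertex):
            # Online query: vertices sharing at least one hyperedge with `vertex`.
            return {u for e in self.incidence[vertex]
                    for u in self.edges[e]} - {vertex}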

2020–2021 Machine Learning for Design Space Exploration [ORNL]

  • This project (funded by the US Department of Defense) focused on benchmarking the performance of various computational problems on systems with different emerging memory types: DRAM, non-volatile memory (NVM), and hybrid (DRAM+NVM).
  • I used the Gem5 system simulator to run the benchmark problems and collect their memory traces. These traces were then fed to the NVMain memory simulator under three configurations (DRAM, NVM, and hybrid DRAM+NVM) to collect performance metrics such as average latency, memory bandwidth, and total memory read and write operations.
  • The objective was to build machine learning models that predict memory performance for different memory configurations. We built the end-to-end workflow for this process, and I was involved in developing every stage of it (a minimal model-training sketch appears after this list).
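The sketch below shows only the final prediction step, under heavy assumptions: synthetic features stand in for the Gem5/NVMain trace-derived ones, and a random forest regressor is chosen for illustration rather than as the project's actual model.

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split

    # Hypothetical feature table: one row per (benchmark, memory configuration),
    # e.g., [read_ratio, working_set_mb, config_id]; target is average latency.
    rng = np.random.default_rng(0)
    X = rng.random((200, 3))
    y = 50 + 200 * X[:, 0] + 30 * X[:, 2] + rng.normal(0, 5, 200)   # synthetic latency (ns)

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
    model = RandomForestRegressor(n_estimators=200, random_state=0)
    model.fit(X_train, y_train)
    print("R^2 on held-out rows:", model.score(X_test, y_test))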

2021–present Global Building Intelligence [ORNL]

  • This project (funded by the US National Geospatial-Intelligence Agency) focuses on developing an approach to identify characteristics (e.g., building purpose) of every building on Earth using proxy predictors such as rooftop material and spatial arrangement.
  • My contribution to this project is engineering an automated way to ingest spatial predictors into the building intelligence pipeline. So far, I have automated the ingestion of spatial indicators related to vegetation, proximity to roads, built-up area, and slope. Currently, I am working on Bayesian network-based building characterization using geospatial data (a small illustrative network follows).
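As a small illustration of the Bayesian network idea (assuming the pgmpy library), the sketch below ties two binary proxy predictors to a binary building-use variable; the variables, states, and probabilities are entirely hypothetical.

    from pgmpy.models import BayesianNetwork
    from pgmpy.factors.discrete import TabularCPD
    from pgmpy.inference import VariableElimination

    # Toy structure: two proxy predictors influence the building-use class.
    model = BayesianNetwork([("RoofMaterial", "BuildingUse"),
                             ("NearRoad", "BuildingUse")])

    cpd_roof = TabularCPD("RoofMaterial", 2, [[0.6], [0.4]])
    cpd_road = TabularCPD("NearRoad", 2, [[0.7], [0.3]])
    cpd_use = TabularCPD(
        "BuildingUse", 2,
        [[0.9, 0.6, 0.5, 0.2],    # P(use=0 | RoofMaterial, NearRoad) -- made-up numbers
         [0.1, 0.4, 0.5, 0.8]],   # P(use=1 | RoofMaterial, NearRoad)
        evidence=["RoofMaterial", "NearRoad"], evidence_card=[2, 2])

    model.add_cpds(cpd_roof, cpd_road, cpd_use)
    assert model.check_model()

    infer = VariableElimination(model)
    print(infer.query(["BuildingUse"], evidence={"RoofMaterial": 1, "NearRoad": 0}))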