Achievements

Game Dev Bootcamp – Best Technical Workshop 2016

Experience

CERN
Openlab, CMS Experiment
June 2018 – August 2018 · Geneva, Switzerland

  • Selected as one of 41 candidates from over 1,800 applicants to work on Data Quality Monitoring using deep learning at the CMS Experiment
  • Designed a custom neural network architecture, the Hierarchical Latent Autoencoder, to exploit the hierarchical design of the CMS Trigger System (illustrative sketch below)
  • Significantly improved reconstruction of trigger-rate behaviour, enabling better anomaly detection, and added a probabilistic reconstruction metric for enhanced interpretability
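
A minimal sketch of the hierarchical-latent idea, for illustration only: one small encoder per trigger subsystem feeds a shared global latent code, mirroring how the trigger hierarchy aggregates subsystem rates. Every dimension, layer size, and name here is an assumption, not the configuration used in the CERN work.

    # Hedged PyTorch sketch of a hierarchical latent autoencoder for
    # trigger-rate vectors; all sizes below are illustrative assumptions.
    import torch
    import torch.nn as nn

    class HierarchicalLatentAutoencoder(nn.Module):
        def __init__(self, n_subsystems=4, rates_per_subsystem=32,
                     local_dim=8, global_dim=4):
            super().__init__()
            # One encoder per trigger subsystem compresses its rate vector
            # into a small local latent code.
            self.local_encoders = nn.ModuleList([
                nn.Sequential(nn.Linear(rates_per_subsystem, local_dim), nn.ReLU())
                for _ in range(n_subsystems)])
            # A global bottleneck merges the local codes, mirroring how the
            # trigger hierarchy aggregates subsystem decisions.
            self.global_encoder = nn.Linear(n_subsystems * local_dim, global_dim)
            self.global_decoder = nn.Linear(global_dim, n_subsystems * local_dim)
            self.local_decoders = nn.ModuleList([
                nn.Linear(local_dim, rates_per_subsystem)
                for _ in range(n_subsystems)])

        def forward(self, x):
            # x: (batch, n_subsystems, rates_per_subsystem)
            codes = [enc(x[:, i]) for i, enc in enumerate(self.local_encoders)]
            z = self.global_encoder(torch.cat(codes, dim=1))
            parts = self.global_decoder(z).chunk(len(self.local_decoders), dim=1)
            return torch.stack(
                [dec(parts[i]) for i, dec in enumerate(self.local_decoders)], dim=1)

    model = HierarchicalLatentAutoencoder()
    x = torch.randn(16, 4, 32)                    # fake batch of trigger rates
    score = nn.functional.mse_loss(model(x), x)   # reconstruction error as anomaly score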

Cognibit
Founding Partner
November 2017 – Present · Kolkata, India

  • Raised precision in predicting elevator failures from 21% to 88% while collaborating remotely with Kone’s Analytics team
  • Piloting the deployment of the researched models into production in close collaboration with the IoT team in Finland
  • Leading research & development towards a generalized AI platform combining big-data analytics & deep learning to enable predictive maintenance for any industrial system that generates log data

Kone Corporation
Visiting Researcher, Global R&D Division
May 2017 – June 2017 · Hyvinkää, Finland

  • Researched anomaly-detection approaches using LSTM-based RNN architectures that model elevator logs as natural-language sequences (sketch below)
  • Designed & experimented with several predictive maintenance models to develop a system prototype that achieved record accuracy
  • Performed statistical modeling & diagnostics of Kone’s German elevator data extracted from the Remote Maintenance Program, yielding commercially valuable insights
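
A hedged sketch of the log-as-language approach: an LSTM is trained to predict the next log event ID, and sequences whose observed events receive low predicted probability are scored as anomalous. The vocabulary size, dimensions, and tokenization below are illustrative assumptions, not Kone’s actual pipeline.

    # Next-event language model over integer-encoded log events; high
    # per-sequence negative log-likelihood serves as an anomaly score.
    import torch
    import torch.nn as nn

    class LogLSTM(nn.Module):
        def __init__(self, vocab_size=500, embed_dim=64, hidden_dim=128):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, embed_dim)
            self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
            self.head = nn.Linear(hidden_dim, vocab_size)

        def forward(self, event_ids):
            # event_ids: (batch, seq_len) integer-encoded log events
            out, _ = self.lstm(self.embed(event_ids))
            return self.head(out)  # (batch, seq_len, vocab_size) logits

    model = LogLSTM()
    seq = torch.randint(0, 500, (8, 100))     # fake tokenized log windows
    logits = model(seq[:, :-1])               # predict each next event
    nll = nn.functional.cross_entropy(
        logits.reshape(-1, 500), seq[:, 1:].reshape(-1), reduction="none"
    ).view(8, -1).mean(dim=1)                 # per-sequence anomaly score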

Selected Publications

We propose a novel neural network architecture called Hierarchical Latent Autoencoder to exploit the underlying hierarchical nature of the CMS Trigger System at CERN for data quality monitoring. The results demonstrate that our architecture reduces the reconstruction error on the test set from $9.35 \times 10^{-6}$ with a vanilla Variational Autoencoder to $4.52 \times 10^{-6}$ with our Hierarchical Latent Autoencoder.
CERN Openlab Technical Report, 2018

With the addition of a dynamic memory access and storage mechanism, we present a neural architecture that serves as a language-agnostic text normalization system while avoiding the kind of unacceptable errors made by LSTM-based recurrent neural networks. Our proposed system requires significantly less data, training time, and compute resources.
Speech Communication (EURASIP & ISCA), Elsevier, 2018
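
For illustration, a toy sketch of one step of content-based dynamic memory access and storage, the general mechanism the abstract refers to: a controller key soft-reads a weighted sum of memory rows and soft-writes an update back. The published architecture is substantially more elaborate; all names and sizes here are assumptions.

    # One soft read/write step over an external memory matrix.
    import torch
    import torch.nn.functional as F

    def memory_step(memory, key, write_vec):
        # memory: (rows, width); key, write_vec: (width,)
        weights = F.softmax(memory @ key, dim=0)            # content addressing
        read = weights @ memory                             # weighted soft read
        memory = memory + weights.unsqueeze(1) * write_vec  # weighted soft write
        return read, memory

    mem = torch.zeros(16, 32)
    read, mem = memory_step(mem, torch.randn(32), torch.randn(32))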