Parth K. Thaker

Ph.D. Student

Arizona State University

Namaste!

I am working towards my PhD in the Electrical Engineering Department at Arizona State University. Currently, I am working with Prof. Gautam Dasarathy on interesting problems at the intersection of graph theory and optimization.

Recently, I interned at Mitsubishi Electric Research Laboratories (MERL) with Dr. Abraham P. Vinod, where I developed bandit-based algorithms for resource monitoring. Before that, I was a Systems Engineer at Netradyne, where I worked with sensor data from mobile devices and IMU chips to extract statistics on individual driving behavior and develop better driver-safety features.

I hold a dual degree (B.Tech + M.Tech) from the Electrical Engineering Department at IIT Madras. I did my senior thesis under the guidance of Dr. Radha Krishna Ganti on the broad topic of bi-level rank-preserving algorithms. My thesis work can be found here.

Interests

  • Nonconvex Optimization
  • Graph Theory
  • Bandit Learning
  • Reinforcement Learning

Education

  • PhD in Electrical Engineering, Ongoing

    Arizona State University

  • M.Tech in Communication, 2016

    Indian Institute of Technology, Madras

  • B.Tech in Electrical Engineering, 2015

    Indian Institute of Technology, Madras

Experience


Algorithms Intern

Mitsubishi Electric Research Laboratories

May 2022 – Aug 2022 Boston, Massachusetts

Graduate Student

Arizona State University

Aug 2017 – Present Phoenix, Arizona

Systems Engineer

Netradyne

Aug 2016 – May 2017 Bangalore, India

Intern

Securifi Systems Pvt. Ltd

May 2014 – Jun 2014 Hyderabad, India

Intern

Cisco Systems Pvt. Ltd

May 2013 – Jun 2013 Bangalore, India

Undergraduate Student

Indian Institute of Technology, Madras

Aug 2011 – Aug 2016 Chennai, India

Recent Posts

Optimal strategy for classroom behaviour

A take on the implications of information-theoretic capacity on classroom learning

Projected gradient descent with skipping

Effects of skipping the projection step in projected gradient descent.
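The idea behind the second post can be illustrated with a minimal sketch (this is a hypothetical example of the general technique, not code from the post): run gradient descent, but apply the projection onto the feasible set only every `skip` iterations rather than at every step.

```python
import numpy as np

def pgd(grad, project, x0, step=0.1, iters=100, skip=1):
    """Projected gradient descent, projecting only every `skip` iterations."""
    x = x0
    for t in range(1, iters + 1):
        x = x - step * grad(x)       # plain gradient step
        if t % skip == 0:
            x = project(x)           # occasional projection
    return project(x)                # final projection keeps the result feasible

# Toy problem: minimize ||x - c||^2 over the unit Euclidean ball,
# with c chosen outside the ball so the constraint is active.
c = np.array([2.0, 0.0])
grad = lambda x: 2 * (x - c)
project = lambda x: x / max(1.0, np.linalg.norm(x))

x_every = pgd(grad, project, np.zeros(2), skip=1)   # project at every step
x_skip = pgd(grad, project, np.zeros(2), skip=5)    # project every 5 steps
```

On this toy problem both variants reach the constrained minimizer at (1, 0); skipping trades per-step feasibility (and, in general, projection cost) against letting the iterates wander outside the set between projections.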