Welcome to the website of our research group
Theoretical Foundations of Artificial Intelligence

We work on the statistical theory of machine learning and artificial intelligence. We develop a mathematical understanding of learning algorithms, thereby placing black-box AI tools on formal statistical foundations. Our research focuses on three aspects of data science in the present era of big data analysis.

  • Data complexity: Many applications involve complex data, such as relational or semantic information, that lacks a natural Euclidean structure. Hence, learning from such data requires non-parametric methods or representation learning. We develop non-parametric machine learning algorithms for networks and ordinal data.

  • Statistical guarantees: The ubiquity of non-standard ML problems has led to a demand for new paradigms in learning theory to analyse the performance of the related algorithms. We perform mathematical analysis of ML algorithms, including statistical performance guarantees and information-theoretic limits of learning.

  • Computational cost: The question of statistical performance is inherently tied to the computational budget, which has become important in the era of big data. We study the trade-off between efficiency and accuracy, and develop efficient machine learning algorithms based on sound statistical principles.