
Formalizing statistical learning theory in Lean 4 [R]

I’ve been working on a Lean 4 project focused on formalizing parts of statistical learning theory:

FormalSLT repository

Current results include:

  • finite-class ERM bounds
  • Rademacher symmetrization
  • high-probability Rademacher bounds
  • Sauer–Shelah / VC-dimension bridge
  • finite scalar contraction
  • linear predictor bounds
  • finite PAC-Bayes bounds
  • algorithmic stability
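
For context, the finite-class items above correspond to the standard textbook route (a Hoeffding-style inequality plus a union bound over the class); the exact constants and form in the repository may differ, but the canonical statement is:

```latex
% Standard finite-class uniform convergence bound:
% for a finite class H, a loss bounded in [0,1], an i.i.d. sample
% of size n, and any delta in (0,1), with probability at least 1 - delta,
\[
  \sup_{h \in \mathcal{H}} \bigl|\, R(h) - \hat{R}_n(h) \,\bigr|
    \;\le\; \sqrt{\frac{\ln\!\bigl(2|\mathcal{H}|/\delta\bigr)}{2n}},
\]
% where R is the population risk and \hat{R}_n the empirical risk.
```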

The main idea is to build a readable and pedagogically structured “theorem ladder” for ML theory rather than just isolated declarations.
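As an illustration of what such a ladder could look like, here is a hedged Lean 4 sketch; every name, namespace, and statement below is invented for exposition and is not the repository's actual API:

```lean
-- Illustrative only: a rung-by-rung shape for the finite-class route.
-- Names and statements are placeholders, not the FormalSLT API.
namespace LadderSketch

-- Rung 1: a concentration inequality for a single fixed hypothesis
-- (Hoeffding-style; stated here as a stub).
theorem single_hypothesis_concentration : True := trivial

-- Rung 2: a union bound lifting rung 1 to uniform convergence
-- over a finite hypothesis class.
theorem finite_class_uniform_convergence : True := trivial

-- Rung 3: the ERM excess-risk bound, obtained from rung 2 by
-- comparing the ERM hypothesis against the best-in-class hypothesis.
theorem erm_excess_risk_bound : True := trivial

end LadderSketch
```

Each rung depends only on the rungs below it, so a reader can follow the chain top to bottom the way a textbook proof unfolds.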

I’m trying to keep:

  • explicit assumptions
  • scoped theorem statements
  • zero `sorry`s (all proofs complete)
  • close alignment with standard SLT presentations

Compared to some existing Lean SLT efforts that focus more heavily on empirical-process infrastructure and abstract probability machinery, this project is currently more focused on explicit finite-sample PAC/Rademacher/stability routes and readable end-to-end theorem chains.

I’d especially appreciate feedback on:

  • theorem organization
  • proof structure
  • naming/API decisions
  • useful next formalization targets

Thank you,
R. S

submitted by /u/trickyrex1

Tagged with

#statistical learning theory
#Lean 4
#finite-class ERM bounds
#Rademacher symmetrization
#high-probability Rademacher bounds
#Sauer-Shelah VC-dimension bridge
#finite scalar contraction
#linear predictor bounds
#finite PAC-Bayes bounds
#algorithmic stability
#theorem ladder