About

Joel Niklaus is a Postdoc and Lecturer at the University of Bern and the Bern University of Applied Sciences. Previously, he was an AI Resident at (Google) X, where he trained multi-billion-parameter LLMs on hundreds of TPUs and achieved state-of-the-art performance on LegalBench. At Thomson Reuters Labs he investigated efficient domain-specific pretraining approaches, and he visited Stanford University, supervised by Prof. Dan Ho, to conduct research on large language models in the legal domain. He has extensive experience in pretraining and finetuning large language models for diverse tasks and across diverse compute environments. His research focuses on curating datasets for training and evaluating multilingual language models in the legal domain, and his work laid the groundwork for legal NLP in Switzerland. He serves as an advisor to companies applying modern NLP to legal challenges. He holds a PhD in Natural Language Processing, a Master’s in Data Science, and a Bachelor’s in Computer Science from the University of Bern.


I’m currently operating at near capacity with my existing commitments, but I remain open to consulting on exceptional projects that pique my interest.