Someone has modified microgpt to build a tiny GPT that generates Korean first names, and created a web page that visualizes the entire process [1].
Users can interactively explore the microgpt pipeline end to end, from tokenization to inference.
[1] English GPT lab:
I have no affiliation with the website, but it's pretty neat if you are learning LLM internals. It covers tokenization, embedding, attention, loss & gradient, training, inference, and a comparison to a "real GPT".
Pretty nifty, even if you are not interested in the Korean language.
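For intuition, here is a minimal sketch of what the character-level tokenization step in such a pipeline might look like. The names, the `<end>` token convention, and the vocab scheme below are my own illustrative assumptions, not the site's actual code:

```python
# Toy character-level tokenizer in the spirit of microgpt.
# (Illustrative only: the names and the <end>-token scheme are assumptions.)
names = ["지민", "서연", "하준"]  # tiny toy training set of Korean first names

# Build the vocabulary: every unique character, with id 0 reserved for <end>.
chars = sorted(set("".join(names)))
stoi = {ch: i + 1 for i, ch in enumerate(chars)}  # char -> token id
itos = {i: ch for ch, i in stoi.items()}          # token id -> char
itos[0] = "<end>"

def encode(name):
    """Map a name to integer token ids, terminated by the <end> id (0)."""
    return [stoi[ch] for ch in name] + [0]

def decode(ids):
    """Map token ids back to a string, stopping at <end>."""
    out = []
    for i in ids:
        if i == 0:
            break
        out.append(itos[i])
    return "".join(out)
```

During training, the model would learn to predict the next token id in these sequences; at inference, sampling stops when it emits the `<end>` token.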