About 15% of all Stanford undergrads (across all majors) are learning how to build LLMs. Stats for @chrmanning's CS224N Natural Language Processing with Deep Learning below. That makes sense. LLMs are becoming a basic systems component, like compute, networking, and storage.
@appenz @chrmanning Feels like chasing the current thing. There are much deeper topics than LLMs. You can't think outside the box if everything looks like a nail.
@appenz @chrmanning 15% of students building LLMs from scratch would need a lot of GPUs. 15% of students finetuning existing LLMs with open-source tools may be within Paly students' reach.
@appenz @martin_casado @chrmanning Question is: does the wisdom of the Stanford undergrad crowd constitute a leading indicator or a lagging indicator? Admittedly a tiny, selective, well-connected crowd…
@appenz @chrmanning What other courses offer online lectures besides the excellent CS224N?
@appenz @chrmanning This is another way of saying "85% of Stanford undergrads are wasting their time."