Not really. Not everyone enjoys advanced mathematics, just as not everyone enjoys English literature, engineering, or arts and crafts. People have different interests, aptitudes, and skills. That’s how the world works.
To be more specific, this is an MLP (Multi-Layer Perceptron). “Neural network” is a catch-all term that includes other architectures such as Convolutional Neural Networks (CNNs), Recurrent Neural Networks (RNNs), diffusion models, and of course Transformers.
What you are arguing with online is some variant of a Generative Pre-trained Transformer, which does have MLP or MoE layers, but those are only one part of what it is. It also has multi-headed attention mechanisms and embedding and unembedding layers.
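To make the attention part concrete, here is a minimal sketch of single-head scaled dot-product attention in NumPy. The shapes and random values are made up purely for illustration; real Transformers also add learned projections, multiple heads, masking, and so on.

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention for one head (no masking)."""
    # scores[i, j]: how much query position i attends to key position j
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    # numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # each output row is a weighted average of the value vectors
    return weights @ V

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))  # 4 token positions, head dimension 8
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out = attention(Q, K, V)
print(out.shape)  # (4, 8)
```

The key point is that this is a different mechanism from the MLP layers: it mixes information *across* token positions, while the MLP blocks transform each position independently.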
I know all this and wouldn’t call myself a machine learning expert; I just use the things. Though I did once train a simple MLP like the one in the picture. I think it’s quite bad to call yourself a machine learning expert without knowing all of this and more.
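For anyone curious what “training a simple MLP like the one in the picture” actually involves, here is a rough sketch in plain NumPy: one hidden layer, sigmoid activations, trained on XOR by full-batch gradient descent. The layer sizes, learning rate, and iteration count are all made-up toy values, not a recommendation.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR inputs and targets
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# one hidden layer of 8 units
W1 = rng.normal(0, 1, (2, 8))
b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1))
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(10_000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # backward pass: gradients of mean squared error through the sigmoids
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # gradient descent step
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print((out > 0.5).astype(int).ravel())  # predicted classes for the four XOR inputs
```

That’s the whole thing: a forward pass, a loss, backpropagated gradients, and weight updates. A GPT stacks many such blocks alongside the attention machinery described above.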


Did you actually read what I wrote or the context behind it? I don’t think you did.
I am saying not everyone wants to learn these things just for the sake of it. Some people want to learn the parts of maths that are more practical and want to be given practical examples. I don’t see a problem with accommodating those students, and I don’t think they should be looked down on the way the original commenter was doing.
What I am not saying is that no one should learn any maths at all. I don’t know how you got that from my comment. It’s like you are deliberately trying to misinterpret what I am saying.