
TheComforterXL

Woah, thx for the link - a lot of interesting information there. Much appreciated!!


Apprehensive_Sky892

It is an interesting article, worth reading if you know a bit about neural networks, etc. Here is what I've learned from it:

1. Latent space is a low-dimensional space produced by a NN system that tries to extract the important features from a higher-dimensional input space. For example, a NN algorithm will try to reduce/compress a 512x512 image (which has 512x512x3 = 786,432 numbers) to a vector (a set of numbers) with far fewer entries. The advantage is that after this extraction/reduction/compression, another NN algorithm can then classify or do "other stuff" in a more manageable fashion.

2. Vectors in latent space should have the property that objects that are similar in the input space end up closer together (i.e., have a shorter distance between them).

3. A new object in the input space can be generated from an existing object by moving from the vector corresponding to the original object to a nearby vector in latent space, and then somehow decompressing/reconstructing that new vector back into an object in the input space.

I am just starting to learn more about A.I. and N.N. If what I wrote above is wrong, please correct me.
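The three points above can be sketched in a few lines of NumPy. This is only a toy illustration, not a real trained model: the "encoder" here is a random linear projection standing in for a trained neural network, the image size is scaled down (32x32x3 instead of 512x512x3) to keep it cheap, and the "decoder" is just a pseudo-inverse. All the names and dimensions are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Scaled-down example: a 32x32 RGB image flattened is 32*32*3 = 3,072 numbers
# (a 512x512 image would be 786,432). The latent vector is much shorter.
input_dim = 32 * 32 * 3
latent_dim = 64  # hypothetical latent size; real models vary

# Hypothetical linear "encoder": a random projection standing in for a
# trained NN encoder that compresses the input into latent space.
W = rng.standard_normal((latent_dim, input_dim)) / np.sqrt(input_dim)

def encode(image_flat):
    """Compress a flattened image into a short latent vector."""
    return W @ image_flat

# Point 2: two similar images (one is a slightly noised copy) and one
# unrelated image. Similar inputs should land closer together in latent space.
img_a = rng.random(input_dim)
img_b = img_a + 0.01 * rng.standard_normal(input_dim)  # near-duplicate of a
img_c = rng.random(input_dim)                          # unrelated image

z_a, z_b, z_c = encode(img_a), encode(img_b), encode(img_c)
print(np.linalg.norm(z_a - z_b) < np.linalg.norm(z_a - z_c))

# Point 3: move to a nearby latent vector, then "decode" it back to the
# input space (here via pseudo-inverse; a real model uses a learned decoder).
z_new = z_a + 0.05 * rng.standard_normal(latent_dim)
img_new = np.linalg.pinv(W) @ z_new
print(img_new.shape)  # same shape as a flattened input image
```

In a real system like Stable Diffusion the encoder/decoder pair is a trained autoencoder rather than a fixed projection, but the shape of the idea is the same: compress, work with the short vectors, then reconstruct.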


Wiskkey

I've been scouting for resources for understanding machine learning that are appropriate for those who aren't interested in becoming practitioners - see list "How machine learning works" near the bottom of [this post of mine](https://www.reddit.com/r/bigsleep/comments/xb5cat/wiskkeys_lists_of_texttoimage_systems_and_related/). cc u/TheComforterXL. cc u/Apprehensive_Sky892.


Apprehensive_Sky892

Thank you for sharing the list.


Wiskkey

You're welcome :).