Sparsity encourages serendipity
SPARSITY IS REGULARIZATION.
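Concretely: an L1 penalty drives weights to exactly zero, which is the textbook sense in which sparsity is regularization. A minimal sketch using ISTA (proximal gradient for the lasso) on a toy problem where only 2 of 10 features carry signal; the sizes and penalty here are made-up illustrative values.

```python
import numpy as np

# Toy lasso via ISTA: the soft-threshold step zeroes out weights whose
# gradient signal is weaker than the L1 penalty -- sparsity as regularization.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
true_w = np.zeros(10)
true_w[[1, 7]] = [3.0, -2.0]            # only two features actually matter
y = X @ true_w + 0.1 * rng.normal(size=100)

lam = 10.0                               # L1 penalty strength (illustrative)
step = 1.0 / np.linalg.norm(X, 2) ** 2   # 1/L, L = largest eigenvalue of X^T X
w = np.zeros(10)
for _ in range(500):
    grad = X.T @ (X @ w - y)             # gradient of 0.5 * ||Xw - y||^2
    z = w - step * grad
    w = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft-threshold

print(np.nonzero(w)[0])                  # the surviving (nonzero) coordinates
```

The recovered weights are exactly sparse: everything but the two true signal coordinates is thresholded to literal zero, not just made small.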
SPARSITY IS ONE OF THE MOST INTERESTING “autistic special interests” (given how autism is often the opposite of sparsity).
“doing more with less” (less memory, fewer NN layers, etc…)
When you go to the limit of extreme sparsity, your ratio of positive to negative interactions goes WAY up, and you drastically lower your extreme downside risk
[you also lower some extreme upside risk, but extra serendipity means you might be able to capture 0.5% of the value of a unicorn you seeded]
(some generalists are quite sparse!)
sparsity is knowing when not to say hi to everyone you know, knowing when not to remember everything, knowing how to preserve your valuable compute cycles so you can think as well as you did in your "ideal 12-year-old unschooled environment", knowing that ALL of this is what makes you more able to gain social access when it really matters, and not just with the early-stage
The Tao is sparse… (the tao is silent)…
(distributed computations require sparsity! this form of sparsity can give you better context)
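One way to see why distribution wants sparsity: when you split work across machines, communication cost is the number of dependency edges that cross the partition, and a sparse dependency graph has few of them. A toy count, sparse vs dense graph; sizes and the random partitioning are invented for illustration.

```python
import numpy as np

# Cross-partition edges = messages between workers. A sparse dependency
# graph makes distributing the computation cheap; a dense one does not.
rng = np.random.default_rng(0)
n, workers = 1000, 10
part = rng.integers(workers, size=n)     # random node -> worker assignment

def cross_edges(adj):
    i, j = np.nonzero(adj)               # endpoints of every dependency edge
    return int((part[i] != part[j]).sum())

dense = np.ones((n, n)) - np.eye(n)      # everyone depends on everyone
sparse = (rng.random((n, n)) < 5 / n).astype(float)  # ~5 deps per node

print(cross_edges(dense), cross_edges(sparse))
```

The sparse graph needs orders of magnitude fewer cross-machine messages, which is the extra slack ("better context") you get to spend elsewhere.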
sparse reinforcement learning
“reinforcement learning with sparse rewards” […suspense…]
[VC has... sparse reward structure…]
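The sparse-reward setting can be sketched in a few lines of tabular Q-learning: the only nonzero reward sits at the far end of a chain, so almost every step gives the agent zero signal and value has to trickle backward (that's the suspense). The chain environment and hyperparameters here are invented toy values.

```python
import numpy as np

# Sparse-reward chain: reward exists ONLY at the last state. Q-learning is
# off-policy, so even a uniform-random behavior policy eventually propagates
# that single distant reward back to the start.
N = 6                                    # states 0..5, goal (and reward) at 5
Q = np.zeros((N, 2))                     # actions: 0 = step left, 1 = step right
rng = np.random.default_rng(0)
alpha, gamma = 0.5, 0.95

for episode in range(400):
    s = 0
    for _ in range(50):
        a = int(rng.integers(2))         # explore at random
        s2 = max(s - 1, 0) if a == 0 else s + 1
        r = 1.0 if s2 == N - 1 else 0.0  # the ONLY nonzero reward in the MDP
        target = r if s2 == N - 1 else r + gamma * Q[s2].max()
        Q[s, a] += alpha * (target - Q[s, a])
        if s2 == N - 1:
            break
        s = s2

print(int(Q[0].argmax()))                # prints 1: head toward the distant reward
```

Note how little of the experience carried any reward at all; the learned values are almost entirely bootstrapped backward from one sparse event.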
Trenton Bricken was into sparse distributed memory (originally Pentti Kanerva's idea, later popularized by Jeff Hawkins)…
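A minimal sketch of Kanerva-style sparse distributed memory, with toy dimensions: a pattern is written into every "hard location" whose random address lies within a Hamming radius of the write address, and read back by summing those neighborhoods and thresholding. The sizes and radius below are illustrative, not Kanerva's original parameters.

```python
import numpy as np

# Sparse distributed memory: a tiny fraction of hard locations activates per
# address (that's the sparsity), yet recall survives a noisy cue because the
# activated neighborhoods of nearby addresses overlap heavily.
rng = np.random.default_rng(0)
n, m, radius = 256, 2000, 112            # address bits, hard locations, Hamming radius
hard = rng.integers(0, 2, size=(m, n))   # fixed random hard-location addresses
counters = np.zeros((m, n))              # one counter vector per hard location

def near(addr):                          # hard locations within the radius
    return (hard != addr).sum(axis=1) <= radius

def write(addr, data):
    counters[near(addr)] += 2 * data - 1 # store bits as +/-1 increments

def read(addr):
    return (counters[near(addr)].sum(axis=0) > 0).astype(int)

pattern = rng.integers(0, 2, size=n)
write(pattern, pattern)                  # autoassociative store
noisy = pattern.copy()
noisy[:20] ^= 1                          # corrupt 20 of 256 cue bits
print((read(noisy) == pattern).mean())   # recall accuracy from the noisy cue
```

Even with ~8% of the cue bits flipped, the overlapping activated locations reconstruct the stored pattern, which is the property Bricken's later work connects to attention.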