- **Google Gemini updates: Flash 1.5, Gemma 2 and Project Astra**
  We’re sharing updates across our Gemini family of models and a glimpse of Project Astra, our vision for the future of AI assistants.
- **Short Notes: Gradient of the Softmax Function for the Cross-Entropy Loss**
  The softmax function in neural networks ensures the outputs sum to one and lie within [0, 1]. Here's how to compute its gradients when the cross-entropy loss is applied.
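The result this note derives can be stated compactly: for a one-hot target `y` and softmax probabilities `p`, the gradient of the cross-entropy loss with respect to the logits is simply `p - y`. A minimal NumPy sketch (the array values are illustrative, not from the post):

```python
import numpy as np

def softmax(z):
    # Subtract the max for numerical stability before exponentiating.
    e = np.exp(z - z.max())
    return e / e.sum()

# For cross-entropy loss L = -sum(y * log(softmax(z))),
# the gradient with respect to the logits z simplifies to p - y.
z = np.array([2.0, 1.0, 0.1])   # example logits
y = np.array([1.0, 0.0, 0.0])   # one-hot target
p = softmax(z)
grad = p - y
```

Because both `p` and `y` sum to one, the gradient components always sum to zero.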
- **Obfuscating a Function – How Not to Write Code**
  A while back, I wrote a straightforward function to convert an integer into a new format, and the resulting code was clear. I then, inexplicably, chose to obscure its purpose. Read more in this post...
- **Short Notes: Eigendecomposition of a Matrix**
  The derivation of the eigendecomposition is surprisingly simple. Read more here!
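The decomposition the note derives, A = QΛQ⁻¹ (with Q⁻¹ = Qᵀ for a symmetric matrix), can be checked numerically with NumPy; the matrix below is an illustrative example, not one from the post:

```python
import numpy as np

# A symmetric matrix has real eigenvalues and orthogonal eigenvectors.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])

# eigh is the eigensolver specialized for symmetric/Hermitian matrices.
eigvals, Q = np.linalg.eigh(A)
Lam = np.diag(eigvals)

# Reconstruct A from its eigendecomposition: A = Q Λ Qᵀ
A_rebuilt = Q @ Lam @ Q.T
```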
- **Surprising Bank Investment**
  Discover how investing in a startup turned into a financial rollercoaster. Your bank offers you a chance to invest in a new venture, with the promise of keeping all the profits if successful. Excited, you dive in with $1,000,000, providing 99% of the investment while the bank chips in the remaining 1%.
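The split in the teaser implies a venture size slightly above your own contribution; a quick sketch of that arithmetic (the variable names are mine, not from the post):

```python
# You contribute $1,000,000, which is 99% of the venture;
# the bank supplies the remaining 1%.
your_stake = 1_000_000
your_share = 0.99

total = your_stake / your_share      # total size of the venture (~$1,010,101)
bank_stake = total - your_stake      # the bank's 1% contribution (~$10,101)
```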