📝 Softplus Function
Sep 03, 2024
1 min read
ML
−
$$-\log \sigma(t) = -\log \frac{1}{1+\exp(-t)} = \log\bigl(1+\exp(-t)\bigr)$$
The softplus function is a smoothed version of ReLU.
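A minimal numerical sketch of this relationship (assuming NumPy; the names `softplus` and `relu` are illustrative):

```python
import numpy as np

# Softplus: log(1 + exp(t)), computed stably via logaddexp.
def softplus(t):
    return np.logaddexp(0.0, t)

# ReLU for comparison.
def relu(t):
    return np.maximum(t, 0.0)

t = np.linspace(-5.0, 5.0, 11)
# -log(sigmoid(t)) equals softplus(-t), matching the identity above.
neg_log_sigmoid = -np.log(1.0 / (1.0 + np.exp(-t)))
assert np.allclose(neg_log_sigmoid, softplus(-t))
```

For large positive inputs softplus approaches ReLU, while near zero it stays smooth and differentiable, which is why it is often described as a smoothed ReLU.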