📝 Softplus Function
Nov 03, 2024
ML
$$-\log \sigma(t) = -\log \frac{1}{1+\exp(-t)} = \log\left(1+\exp(-t)\right)$$
This is a smoothed version of ReLU.
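The identity above and the relation to ReLU can be checked numerically. A minimal sketch, assuming NumPy; the `softplus` helper here is a stable reformulation ($\max(t,0)+\log(1+\exp(-|t|))$) that avoids overflow for large $|t|$:

```python
import numpy as np

def softplus(t):
    # Numerically stable softplus: log(1 + exp(t)) = max(t, 0) + log1p(exp(-|t|))
    return np.maximum(t, 0) + np.log1p(np.exp(-np.abs(t)))

def relu(t):
    return np.maximum(t, 0)

t = np.linspace(-5.0, 5.0, 101)

# -log sigma(t) equals softplus(-t), matching the derivation above
neg_log_sigma = -np.log(1.0 / (1.0 + np.exp(-t)))
assert np.allclose(neg_log_sigma, softplus(-t))

# Softplus approaches ReLU away from 0; the gap is largest at t = 0,
# where softplus(0) = log 2 while relu(0) = 0
gap = np.abs(softplus(t) - relu(t))
assert np.isclose(gap.max(), np.log(2))
```

For large negative $t$ both functions approach 0, and for large positive $t$ both approach $t$; only near the origin does softplus round off the kink of ReLU.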