Rectified Linear Unit (ReLU)
Nov 03, 2024
1 min read
Tags: ML

The Rectified Linear Unit (ReLU) is an Activation Function, defined piecewise as:
$$
f(x) = \begin{cases} x & \text{if } x > 0 \\ 0 & \text{otherwise} \end{cases}
$$
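The piecewise definition above can be sketched in a few lines of NumPy; this is a minimal illustration added here, not code from the original note, and the function name `relu` is just a placeholder:

```python
import numpy as np

def relu(x):
    """Element-wise ReLU: keeps x where x > 0, replaces the rest with 0."""
    return np.maximum(x, 0)

# Negative inputs are zeroed, positive inputs pass through unchanged.
print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))
```

`np.maximum(x, 0)` broadcasts the scalar 0 against the array, so the same function works for scalars, vectors, and batched tensors alike.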
Backlinks: Softplus Function