JAX compilation of RDDL description files, and a differentiable planner in JAX.
A header-only C++ Library for Optimization Algorithms
XCSF learning classifier system: rule-based online evolutionary machine learning
Dynamically adjusts load balancers coupled with auto scalers in response to workload changes using weakly coupled Markov Decision Processes (MDPs) and a two-timescale online learning approach.
A proof of concept of a recursive implementation of stochastic gradient descent for a simple neural network, written in Python 3 with NumPy
Stanford CS221 class practicals, assignments, and projects
This project covers some basic Artificial Intelligence topics from the course using Python, built mainly with NumPy. All files are written in Python and refer back to the school labs at Dalhousie University.
Flexible and extensible implementation of a multithreaded feedforward neural network in Java, including popular optimizers, wrapped in a console user interface
Recommender systems algorithms
Parametric estimation of multivariate Hawkes processes with general kernels.
AutoSGM
The ability to predict prices and the features affecting a property's appraisal can be a powerful tool for a lessor in such a cash-intensive market. Additionally, a predictor that forecasts the number of reviews a listing will receive may help identify the factors that drive a property's popularity.
A basic neural network with backpropagation programmed from scratch in C++
Logistic Regression with different optimizers in Python from scratch
Easy-to-use linear and non-linear solver
This code uses a computational graph and a neural network to solve a five-layer traffic demand estimation problem on the Sioux Falls network. It also includes model comparisons and 10 rounds of cross-validation.
Gradient Descent is a technique for optimizing machine learning models with differentiable loss functions. It iteratively computes the first-order derivative (gradient) of the loss with respect to the parameters and adjusts them in the direction that decreases the loss (see the sketch after this list).
This repository contains my solutions and implementations for assignments from the Machine Learning course.
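As a concrete illustration of the gradient descent update described above, here is a minimal NumPy sketch that fits a linear model with stochastic gradient descent. The synthetic data, learning rate, and variable names are illustrative assumptions, not taken from any of the repositories listed here.

```python
# Minimal sketch of stochastic gradient descent for linear regression.
# All data and hyperparameters below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))             # synthetic features
true_w = np.array([2.0, -1.0, 0.5])       # ground-truth weights for the toy data
y = X @ true_w + 0.1 * rng.normal(size=200)

w = np.zeros(3)                           # parameters to learn
lr = 0.1                                  # learning rate (step size)

for epoch in range(100):
    for i in rng.permutation(len(X)):     # one sample at a time -> "stochastic"
        xi, yi = X[i], y[i]
        grad = 2 * (xi @ w - yi) * xi     # gradient of the squared error w.r.t. w
        w -= lr * grad                    # step against the gradient

print(w)                                  # should be close to true_w
```

With a small enough learning rate, the per-sample updates drive the squared error down and the learned weights approach the ground-truth weights used to generate the toy data.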