Finding similar API functions between Pytorch and Tensorflow with Doc2Vec

Marton Trencseni - Wed 21 December 2022 • Tagged with similarity, python, word2vec, doc2vec, pytorch, tensorflow

I use Doc2Vec to try to find pairs of similar API functions between Pytorch and Tensorflow.
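A minimal sketch of the idea with gensim's Doc2Vec, assuming each API function's docstring is available as text; the tiny inline corpus and the tag names are purely illustrative, not the post's actual data or code.

```python
from gensim.models.doc2vec import Doc2Vec, TaggedDocument
from gensim.utils import simple_preprocess

# Hypothetical mini-corpus: API name -> docstring text.
docstrings = {
    'torch.abs': 'Computes the absolute value of each element in input.',
    'tf.math.abs': 'Computes the absolute value of a tensor.',
    'torch.add': 'Adds other, scaled by alpha, to input.',
}
corpus = [TaggedDocument(simple_preprocess(text), [name])
          for name, text in docstrings.items()]

model = Doc2Vec(vector_size=64, min_count=1, epochs=100)
model.build_vocab(corpus)
model.train(corpus, total_examples=model.corpus_count, epochs=model.epochs)

# For a Pytorch function, list the most similar documents by vector similarity.
print(model.dv.most_similar('torch.abs', topn=2))
```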

Tensorflow Pytorch

Continue reading

Classification accuracy of quantized Autoencoders with Pytorch and MNIST

Marton Trencseni - Fri 09 April 2021 • Tagged with python, pytorch, cnn, torchvision, mnist, autoencoder

I measure how the classification accuracy of a quantized Autoencoder neural network varies with the number of encoding bits on MNIST digits.
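A minimal sketch of what quantizing the latent code to a given number of bits can look like; the uniform quantizer and the random latent batch are illustrative assumptions, not the post's exact code.

```python
import torch

def quantize(z, bits):
    # Uniformly quantize activations assumed to lie in [0, 1] (e.g. after a
    # sigmoid) to 2**bits discrete levels.
    levels = 2 ** bits - 1
    return torch.round(z * levels) / levels

z = torch.rand(4, 8)         # stand-in for a batch of 8-dimensional latent codes
z_q = quantize(z, bits=2)    # only 4 distinct values per dimension survive
print(z_q.unique())
```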

Classifier accuracy on quantized Autoencoder output

Continue reading

Investigating information storage in quantized Autoencoders with Pytorch and MNIST

Marton Trencseni - Sun 04 April 2021 • Tagged with python, pytorch, cnn, torchvision, mnist, autoencoder

I investigate how much information an Autoencoder neural network encodes for MNIST digits.

Pytorch Autoencoder loss with encoding dimension and quantization bits

Continue reading

Building a Pytorch Autoencoder for MNIST digits

Marton Trencseni - Thu 18 March 2021 • Tagged with pytorch, autoencoder, mnist

I build an Autoencoder network to encode and reconstruct MNIST digits in Pytorch.
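A minimal fully connected autoencoder sketch for 28x28 MNIST images; the layer sizes and encoding dimension are illustrative, not necessarily the architecture used in the post.

```python
import torch
from torch import nn

class AutoEncoder(nn.Module):
    def __init__(self, encoding_dim=32):
        super().__init__()
        # Compress the 784-pixel image down to encoding_dim numbers...
        self.encoder = nn.Sequential(
            nn.Flatten(),
            nn.Linear(28 * 28, 128), nn.ReLU(),
            nn.Linear(128, encoding_dim), nn.Sigmoid(),
        )
        # ...and reconstruct the image from that code.
        self.decoder = nn.Sequential(
            nn.Linear(encoding_dim, 128), nn.ReLU(),
            nn.Linear(128, 28 * 28), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x)).view(-1, 1, 28, 28)

model = AutoEncoder()
x = torch.rand(16, 1, 28, 28)                    # stand-in for a batch of digits
loss = nn.functional.mse_loss(model(x), x)       # reconstruction loss
loss.backward()
```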

Conversion difference vs N

Continue reading

Training a Pytorch Wasserstein MNIST GAN on Google Colab

Marton Trencseni - Wed 03 March 2021 • Tagged with python, pytorch, torchvision, mnist, gan

I train a Pytorch Wasserstein MNIST GAN on Google Colab to generate beautiful MNIST digits.

Wasserstein GAN Generated MNIST digits

Continue reading

Training a Pytorch Classic MNIST GAN on Google Colab

Marton Trencseni - Tue 02 March 2021 • Tagged with python, pytorch, torchvision, mnist, gan

I train a Pytorch Classic MNIST GAN on Google Colab to generate MNIST digits.
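A minimal sketch of one classic GAN training step on flattened digits: the discriminator learns to separate real from generated images, and the generator learns to fool it. Network sizes, learning rates and the random "real" batch are illustrative only.

```python
import torch
from torch import nn

latent_dim = 64
G = nn.Sequential(nn.Linear(latent_dim, 256), nn.ReLU(), nn.Linear(256, 784), nn.Tanh())
D = nn.Sequential(nn.Linear(784, 256), nn.LeakyReLU(0.2), nn.Linear(256, 1), nn.Sigmoid())
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCELoss()

real = torch.rand(32, 784)              # stand-in for a batch of flattened real digits
fake = G(torch.randn(32, latent_dim))   # generated digits

# Discriminator step: real images are labeled 1, generated images 0.
d_loss = bce(D(real), torch.ones(32, 1)) + bce(D(fake.detach()), torch.zeros(32, 1))
opt_d.zero_grad()
d_loss.backward()
opt_d.step()

# Generator step: try to make the discriminator call generated images real.
g_loss = bce(D(fake), torch.ones(32, 1))
opt_g.zero_grad()
g_loss.backward()
opt_g.step()
```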

Classic GAN Generated MNIST digits

Continue reading

Training a Pytorch Lightning MNIST GAN on Google Colab

Marton Trencseni - Sat 20 February 2021 • Tagged with python, pytorch, gan, mnist, google-colab

I explore MNIST digits generated by a Generative Adversarial Network trained on Google Colab using Pytorch Lightning.

Softmax GAN after 5 epochs, 100 samples.

Continue reading

Pytorch in 2019

Marton Trencseni - Thu 12 December 2019 • Tagged with pytorch

2019 was another big year for Pytorch, one of the most popular Deep Learning libraries out there. Pytorch has become the de facto deep learning library for research thanks to its dynamic graph model, which allows fast model experimentation. It has also become production-ready, with support for mobile and infrastructure tooling such as Tensorboard.

Pytorch Google Trends 2019

Continue reading

Using simulated self-play to solve all OpenAI Gym classic control problems with Pytorch

Marton Trencseni - Thu 14 November 2019 • Tagged with python, pytorch, reinforcement, learning, openai, gym

I use simulated self-play by ranking episodes by their summed reward. Game outcomes are divided into two groups by cutting at the median: winners are assigned +1 rewards and losers -1 rewards, as in games like Go and Chess. Unlike the naive policy gradient descent used in previous posts, this version solves all the OpenAI classic control problems, albeit slowly.
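A minimal sketch of the median cut described above; the toy per-step reward lists are illustrative.

```python
import numpy as np

def assign_win_loss(episode_rewards):
    # Rank episodes by summed reward and cut at the median: the better half
    # is treated as winners (+1), the worse half as losers (-1).
    totals = np.array([sum(r) for r in episode_rewards])
    median = np.median(totals)
    return [1 if t > median else -1 for t in totals]

episodes = [[1, 1, 1], [1, 1, 0], [1, 0, 0], [0, 0, 0]]   # 4 toy episodes
print(assign_win_loss(episodes))                          # [1, 1, -1, -1]
```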

OpenAI mountaincar

Continue reading

Applying policy gradient to OpenAI Gym classic control problems with Pytorch

Marton Trencseni - Tue 12 November 2019 • Tagged with python, pytorch, reinforcement, learning, openai, gym

I try to generalize the policy gradient algorithm introduced earlier to solve all the OpenAI classic control problems. It works for the CartPole and Acrobot environments, but not for Pendulum and MountainCar.
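A minimal policy-gradient (REINFORCE-style) update sketch for a discrete-action environment; the observation and action dimensions, the small network and the constant episode return are illustrative assumptions, not the post's code.

```python
import torch
from torch import nn
from torch.distributions import Categorical

policy = nn.Sequential(nn.Linear(4, 32), nn.ReLU(), nn.Linear(32, 2))
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-2)

obs = torch.rand(10, 4)                  # stand-in for 10 observed states
dist = Categorical(logits=policy(obs))
actions = dist.sample()
episode_return = 1.0                     # e.g. +1 for a "won" episode, -1 otherwise

# Increase the log-probability of actions taken in good episodes.
loss = -(dist.log_prob(actions) * episode_return).sum()
optimizer.zero_grad()
loss.backward()
optimizer.step()
```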

OpenAI classic control environments

Continue reading

Solving the CartPole Reinforcement Learning problem with Pytorch

Marton Trencseni - Tue 22 October 2019 • Tagged with python, pytorch, reinforcement, learning, openai, gym, cartpole

The CartPole problem is the Hello World of Reinforcement Learning, originally described in 1983 by Barto, Sutton, and Anderson. The environment is a pole balanced on a cart. CartPole is one of the environments in OpenAI Gym, so we don't have to code up the physics. Here I walk through a simple solution using Pytorch.
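A minimal sketch of interacting with the CartPole environment, using a random policy in place of a trained network; note that the reset() and step() signatures follow the classic (pre-0.26) gym API that was current when the post was written.

```python
import gym

env = gym.make('CartPole-v1')
obs = env.reset()
total_reward, done = 0, False
while not done:
    action = env.action_space.sample()          # random left/right push
    obs, reward, done, info = env.step(action)  # classic gym API
    total_reward += reward
print(total_reward)
```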

Cartpole animation

Continue reading

Playing Go with supervised learning in Pytorch

Marton Trencseni - Sun 25 August 2019 • Tagged with python, pytorch, cnn, go

Using historic games between strong Go players as training data, I build a CNN model to predict good Go moves on a standard 19x19 Go board.
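A minimal move-prediction sketch: the board is encoded as stone-presence feature planes and a small CNN outputs logits over the 361 board positions. The number of input planes and the layer sizes are illustrative, not the post's actual architecture.

```python
import torch
from torch import nn

class GoMoveNet(nn.Module):
    def __init__(self, planes=2):   # e.g. one plane for black stones, one for white
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(planes, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
        )
        self.head = nn.Linear(32 * 19 * 19, 19 * 19)

    def forward(self, x):
        return self.head(self.conv(x).flatten(1))   # logits over 361 moves

board = torch.zeros(1, 2, 19, 19)                    # empty board
logits = GoMoveNet()(board)
print(logits.shape)                                  # torch.Size([1, 361])
```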

Go prediction sample

Continue reading

Arabic name classification with Scikit-Learn and Pytorch

Marton Trencseni - Fri 02 August 2019 • Tagged with pytorch, skl, arabic, fetchr

While working on arabic-vs-rest name classification, I was curious how well out-of-the-box models perform on publicly available data, and how that compares with what we can achieve using internal data and features derived from millions of deliveries. We train Scikit-learn and Pytorch models for this classification task and achieve 90% prediction accuracy with publicly available data and out-of-the-box models, while 99% is achievable internally.
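A minimal Scikit-learn sketch of the public-data baseline, using character n-grams of the name as features; the inline four-name dataset is purely illustrative.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

names = ['mohammed', 'fatima', 'john', 'maria']   # toy training names
labels = [1, 1, 0, 0]                             # 1 = arabic, 0 = rest

model = make_pipeline(
    TfidfVectorizer(analyzer='char', ngram_range=(1, 3)),
    LogisticRegression(),
)
model.fit(names, labels)
print(model.predict_proba(['ahmed'])[:, 1])       # probability the name is arabic
```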

ROC curve

Continue reading

MNIST pixel attacks with Pytorch

Marton Trencseni - Sat 01 June 2019 • Tagged with python, pytorch, cnn, torchvision, mnist, skl

It’s easy to build a CNN that does well on MNIST digit classification. How easy is it to break it, to distort the images and cause the model to misclassify?
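A minimal sketch of one simple distortion: flipping a few randomly chosen pixels to a fixed value, then comparing the model's prediction before and after. This is an illustrative perturbation, not necessarily the exact attacks explored in the post.

```python
import torch

def perturb_pixels(image, n_pixels=1, value=1.0):
    # Set n randomly chosen pixels of a 28x28 image to a fixed value.
    attacked = image.clone()
    idx = torch.randint(0, 28 * 28, (n_pixels,))
    attacked.view(-1)[idx] = value
    return attacked

image = torch.rand(1, 28, 28)                 # stand-in for an MNIST digit
attacked = perturb_pixels(image, n_pixels=5)
print((image != attacked).sum())              # number of pixels changed
```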

MNIST attack accuracy

Continue reading

Solving CIFAR-10 with Pytorch and SKL

Marton Trencseni - Tue 14 May 2019 • Tagged with python, pytorch, cnn, torchvision, cifar, skl

CIFAR-10 is a classic image recognition problem, consisting of 60,000 32x32 pixel RGB images (50,000 for training and 10,000 for testing) in 10 categories: plane, car, bird, cat, deer, dog, frog, horse, ship, truck. Convolutional Neural Networks (CNN) do really well on CIFAR-10, achieving 99%+ accuracy. The Pytorch distribution includes an example CNN for solving CIFAR-10, which achieves about 45% accuracy. I will take that and merge it with a Tensorflow example implementation to reach 75%. We use torchvision to avoid having to download and wrangle the dataset by hand. As in the MNIST example, I use Scikit-Learn to calculate goodness metrics and plots.
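For illustration, this is roughly what the torchvision part looks like: the dataset is downloaded and wrapped in a DataLoader with no manual wrangling. The batch size and normalization constants are illustrative.

```python
import torch
import torchvision
import torchvision.transforms as transforms

transform = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5)),
])
trainset = torchvision.datasets.CIFAR10(root='./data', train=True,
                                        download=True, transform=transform)
trainloader = torch.utils.data.DataLoader(trainset, batch_size=32, shuffle=True)

images, labels = next(iter(trainloader))
print(images.shape)    # torch.Size([32, 3, 32, 32])
```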

CIFAR examples

Continue reading

Solving MNIST with Pytorch and SKL

Marton Trencseni - Thu 02 May 2019 • Tagged with python, pytorch, cnn, torchvision, mnist, skl

MNIST is a classic image recognition problem, specifically digit recognition. It contains 70,000 labeled 28x28 pixel grayscale images of hand-written digits: 60,000 for training and 10,000 for testing. Convolutional Neural Networks (CNN) do really well on MNIST, achieving 99%+ accuracy. The Pytorch distribution includes a 4-layer CNN for solving MNIST, and here I will unpack and go through this example. We use torchvision to avoid having to download and wrangle the dataset by hand. Finally, instead of calculating the model's performance metrics by hand, I will extract the results in a format that lets us use Scikit-Learn's rich library of metrics.
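A minimal sketch of the hand-off: collect the model's predicted and true labels as plain lists and pass them to Scikit-Learn. The untrained linear model and random images below are stand-ins for the trained CNN and the MNIST test loader.

```python
import torch
from torch import nn
from sklearn.metrics import classification_report

model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))   # stand-in model
images = torch.rand(100, 1, 28, 28)                           # stand-in test images
labels = torch.randint(0, 10, (100,))                         # stand-in true labels

with torch.no_grad():
    y_pred = model(images).argmax(dim=1).tolist()
print(classification_report(labels.tolist(), y_pred))
```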

MNIST example digits

Continue reading

SVM with Pytorch

Marton Trencseni - Tue 16 April 2019 • Tagged with pytorch, svm, iris

I use the standard Iris dataset for supervised learning with a Support Vector Machine model using Pytorch's autograd.
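A minimal sketch of a linear SVM trained purely with autograd: minimize the hinge loss plus an L2 penalty by gradient descent. The random two-class data stands in for the Iris features, with labels in {-1, +1}.

```python
import torch

X = torch.randn(100, 4)                      # stand-in for Iris features
y = (X[:, 0] > 0).float() * 2 - 1            # stand-in labels in {-1, +1}

w = torch.zeros(4, requires_grad=True)
b = torch.zeros(1, requires_grad=True)
optimizer = torch.optim.SGD([w, b], lr=0.1)

for _ in range(100):
    margins = y * (X @ w + b)
    # Hinge loss with an L2 penalty on the weights.
    loss = torch.clamp(1 - margins, min=0).mean() + 0.01 * (w ** 2).sum()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

print(((X @ w + b).sign() == y).float().mean())   # training accuracy
```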

SVM

Continue reading

Hacker News Embeddings with PyTorch

Marton Trencseni - Tue 12 March 2019 • Tagged with pytorch, embedding

A PyTorch model is trained on public Hacker News data, embedding posts and comments into a high-dimensional vector space, using the mean squared error (MSE) of dot products as the loss function. The resulting model is reasonably good at finding similar posts and recommending posts for users.
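A minimal sketch of that setup: items get learned embedding vectors, and the MSE between dot products of item pairs and a relatedness target drives the training. Sizes and the random training pairs are illustrative, not the actual Hacker News data.

```python
import torch
from torch import nn

num_items, dim = 1000, 32
emb = nn.Embedding(num_items, dim)
optimizer = torch.optim.Adam(emb.parameters(), lr=0.01)

# Hypothetical training pairs: ids of two items and whether they are related.
a = torch.randint(0, num_items, (64,))
b = torch.randint(0, num_items, (64,))
target = torch.randint(0, 2, (64,)).float()   # 1 = related, 0 = unrelated

dots = (emb(a) * emb(b)).sum(dim=1)           # dot product of each pair
loss = nn.functional.mse_loss(dots, target)   # MSE of dot products vs targets
optimizer.zero_grad()
loss.backward()
optimizer.step()
```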

Vector space

Continue reading

PyTorch Basics: Solving the Ax=b matrix equation with gradient descent

Marton Trencseni - Fri 08 February 2019 • Tagged with pytorch

I will show how to solve the standard A x = b matrix equation with PyTorch. This is a good toy problem to show some of the guts of the framework without involving neural networks.
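A minimal sketch of the approach: treat x as a learnable tensor and minimize ||Ax - b||² by gradient descent, letting autograd do the differentiation. The concrete matrix, learning rate and step count are illustrative.

```python
import torch

A = torch.tensor([[3.0, 1.0], [1.0, 2.0]])
b = torch.tensor([9.0, 8.0])
x = torch.zeros(2, requires_grad=True)
optimizer = torch.optim.SGD([x], lr=0.05)

for _ in range(500):
    loss = ((A @ x - b) ** 2).sum()   # squared residual ||Ax - b||^2
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

print(x)   # approaches the exact solution [2, 3]
```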

PyTorch computational graph

Continue reading