Python types for Data Scientists - Part I
Marton Trencseni - Fri 08 April 2022 • Tagged with python, types
I show how to use basic type hints and get type checking working in ipython notebooks.
Marton Trencseni - Sat 26 March 2022 • Tagged with interview, python
Recently I was considering whether to introduce some CS-style algorithmic interview questions into our Data Science hiring loop, since an understanding of algorithms and data structures can be useful for Data Scientists. Not having done this sort of interview for a few years, I picked up my copy of Daily Coding Problem and started solving a few problems to refresh my sense of what it feels like as a candidate, and whether it would give us any useful signals.
Marton Trencseni - Tue 22 March 2022 • Tagged with probability, statistics
Given a biased coin, construct a fair coin.
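As a quick illustration (this is a sketch, not the article's code): the classic construction is von Neumann's trick of flipping the biased coin twice and discarding equal outcomes.

```python
import random

def biased_coin(p=0.7):
    # a biased coin: returns 1 (heads) with probability p
    return 1 if random.random() < p else 0

def fair_coin(p=0.7):
    # von Neumann's trick: flip twice; HT -> heads, TH -> tails,
    # HH/TT -> discard and flip again. Both accepted outcomes occur
    # with probability p*(1-p), so the result is fair for any 0 < p < 1.
    while True:
        a, b = biased_coin(p), biased_coin(p)
        if a != b:
            return a
```

The expected number of biased flips per fair flip is 1/(p(1-p)), so the trick gets slower as the coin gets more biased.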
Marton Trencseni - Sat 12 March 2022 • Tagged with statistics, war
I run Monte Carlo simulations to show the frequentist solution to the German tank problem.
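For context, a minimal sketch of the frequentist estimator and a Monte Carlo check (illustrative code, not the article's): given a sample of k serial numbers with maximum m, the standard unbiased estimate of the total is m + m/k - 1.

```python
import random

def tank_estimate(sample):
    # frequentist (minimum-variance unbiased) estimator for the
    # German tank problem: N_hat = m + m/k - 1, where m is the
    # largest serial number observed and k is the sample size
    m, k = max(sample), len(sample)
    return m + m / k - 1

def simulate(N=1000, k=10, trials=10000, seed=0):
    # Monte Carlo: draw k serials without replacement from 1..N
    # and average the estimator over many trials
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        sample = rng.sample(range(1, N + 1), k)
        total += tank_estimate(sample)
    return total / trials
```

Averaged over many trials, the estimate should land close to the true N, reflecting the estimator's unbiasedness.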
Marton Trencseni - Sat 12 February 2022 • Tagged with peter-principle, gervais-principle, dunning-kruger, dilbert
Five delightfully cynical half-truths about organizations: the Peter principle, the Dilbert principle, the Gervais principle, Negative selection and the Dunning-Kruger effect.
Marton Trencseni - Mon 31 January 2022 • Tagged with entropy, physics, spin, glass
I summarize the 5 previous posts on probabilistic spin glasses.
Marton Trencseni - Sat 22 January 2022 • Tagged with wordle, monte-carlo
I present a simple Monte Carlo solution which finds a 25-letter-unique Wordle wordlist in about 10 minutes.
Marton Trencseni - Fri 21 January 2022 • Tagged with wordle, monte-carlo
I improve on the previous brute-force Monte Carlo approach for attacking the Wordle coverage problem.
Marton Trencseni - Wed 19 January 2022 • Tagged with wordle, monte-carlo
I show a simple brute-force Monte Carlo approach for attacking the Wordle coverage problem.
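The brute-force idea can be sketched like this (illustrative only; the function name and toy wordlist are mine, not from the article): repeatedly sample random word combinations and keep the first one whose letters are all distinct.

```python
import random

def find_disjoint_words(words, k=5, tries=100000, seed=0):
    # brute-force Monte Carlo for the Wordle coverage problem:
    # sample k words at random and accept the first combination
    # in which no letter repeats across the words
    rng = random.Random(seed)
    # pre-filter: only words with no repeated letters can qualify
    candidates = [w for w in words if len(set(w)) == len(w)]
    for _ in range(tries):
        combo = rng.sample(candidates, k)
        covered = set().union(*(set(w) for w in combo))
        if len(covered) == k * len(combo[0]):
            return combo
    return None
```

On the real ~13k-word Wordle list the success probability per sample is tiny, which is why the later articles in the series improve on pure brute force.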
Marton Trencseni - Thu 06 January 2022 • Tagged with entropy, physics, spin, glass
I use Monte Carlo simulations to explore the dynamic behaviour of probabilistic spin glasses, specifically how saturation scales with $p$ and $N$.
Marton Trencseni - Fri 31 December 2021 • Tagged with entropy, physics, spin, glass
This is a continuation of the previous articles on probabilistic spin glasses. I run simulations to understand the scaling behaviour for large spin glasses.
Marton Trencseni - Sat 25 December 2021 • Tagged with entropy, physics, spin, glass
I run simulations to understand the dynamic probabilistic evolution of these toy models.
Marton Trencseni - Sat 18 December 2021 • Tagged with entropy, physics, spin, glass
This is a continuation of the previous article on probabilistic spin glasses, with improvements to the simulation code and improved entropy computation.
Marton Trencseni - Sat 11 December 2021 • Tagged with entropy, physics, spin, glass
I run Monte Carlo simulations on probabilistic spin glasses, a simple mathematical model of magnetized matter with short range interactions. I use entropy to characterize the model's order-disorder transition.
Marton Trencseni - Mon 29 November 2021 • Tagged with entropy, physics
I derive the Sackur-Tetrode equation for entropy of a monatomic ideal gas.
Marton Trencseni - Fri 19 November 2021 • Tagged with entropy, physics
I show the first steps of how to arrive at a definition of entropy for a monatomic ideal gas modeled as hard billiard balls.
Marton Trencseni - Fri 29 October 2021 • Tagged with startups, cocoon, facebook
The idea behind WeToddle came from the Baby Fanclub group we have on Messenger, which has most of our family in it. It turns out some ex-Facebook people had a similar idea in 2019, raised $3M, spent two years on it, and then gave up, presumably because it didn't go anywhere.
Marton Trencseni - Sun 24 October 2021 • Tagged with entropy
I discuss 4 uses of entropy in Data Science: (i) cross entropy as a loss function for training neural network classifiers; (ii) entropy as a splitting criterion for building decision trees; (iii) entropy for evaluating clustering algorithms; (iv) entropy for understanding relationships in tabular data.
Marton Trencseni - Mon 18 October 2021 • Tagged with meta
A review of, and reflections on, the first 100 articles written on Bytepawn.
Marton Trencseni - Sat 09 October 2021 • Tagged with entropy, cross-entropy, joint-entropy, conditional-entropy, relative-entropy, kullback-leibler-divergence
What's the difference between cross entropy, joint entropy, conditional entropy and relative entropy?
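The core relationships can be pinned down in a few lines (a sketch using the standard definitions, not the article's code): cross entropy H(p, q) decomposes as H(p) plus the relative entropy (KL divergence) from p to q.

```python
import math

def entropy(p):
    # Shannon entropy in bits: H(p) = -sum_i p_i log2 p_i
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def cross_entropy(p, q):
    # H(p, q) = -sum_i p_i log2 q_i; penalizes coding p with
    # a code optimized for q
    return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

def kl_divergence(p, q):
    # relative entropy D_KL(p || q) = H(p, q) - H(p) >= 0,
    # with equality exactly when p == q
    return cross_entropy(p, q) - entropy(p)
```

For example, cross entropy of a distribution with itself is just its entropy, so the KL divergence vanishes; against any other distribution it is strictly positive.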