<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<title>Tom Smeding</title>
<style>
/* Source: https://github.com/madmalik/mononoki/blob/master/style.css */
@font-face {
font-family: 'mononoki-webfont';
src: url('/fonts/mononoki-Regular.eot?#iefix') format('embedded-opentype'), /* IE6-IE8 */
url('/fonts/mononoki-Regular.woff2') format('woff2'), /* Super Modern Browsers */
url('/fonts/mononoki-Regular.woff') format('woff'), /* Pretty Modern Browsers */
url('/fonts/mononoki-Regular.ttf') format('truetype'); /* Safari, Android, iOS */
font-weight: normal;
font-style: normal;
}
@font-face {
font-family: 'mononoki-webfont';
src: url('/fonts/mononoki-Bold.eot?#iefix') format('embedded-opentype'), /* IE6-IE8 */
url('/fonts/mononoki-Bold.woff2') format('woff2'), /* Super Modern Browsers */
url('/fonts/mononoki-Bold.woff') format('woff'), /* Pretty Modern Browsers */
url('/fonts/mononoki-Bold.ttf') format('truetype'); /* Safari, Android, iOS */
font-weight: bold;
font-style: normal;
}
@font-face {
font-family: 'mononoki-webfont';
src: url('/fonts/mononoki-Italic.eot?#iefix') format('embedded-opentype'), /* IE6-IE8 */
url('/fonts/mononoki-Italic.woff2') format('woff2'), /* Super Modern Browsers */
url('/fonts/mononoki-Italic.woff') format('woff'), /* Pretty Modern Browsers */
url('/fonts/mononoki-Italic.ttf') format('truetype'); /* Safari, Android, iOS */
font-weight: normal;
font-style: italic;
}
@font-face {
font-family: 'mononoki-webfont';
src: url('/fonts/mononoki-BoldItalic.eot?#iefix') format('embedded-opentype'), /* IE6-IE8 */
url('/fonts/mononoki-BoldItalic.woff2') format('woff2'), /* Super Modern Browsers */
url('/fonts/mononoki-BoldItalic.woff') format('woff'), /* Pretty Modern Browsers */
url('/fonts/mononoki-BoldItalic.ttf') format('truetype'); /* Safari, Android, iOS */
font-weight: bold;
font-style: italic;
}
body {
font-family: mononoki, mononoki-webfont, "Courier New", Courier, Monospace;
text-align: center;
}
div.main-content {
display: inline-block;
max-width: 800px;
}
div.longtext {
text-align: left;
}
summary {
cursor: pointer;
}
#pubs-ul {
text-align: left;
}
@media (prefers-color-scheme: dark) {
body {
color: #eee;
background-color: #181818;
}
a {
color: #bbf;
}
a:visited {
color: #99f;
}
}
</style>
</head>
<body>
<div class="main-content">
<h1>Hi!</h1>
<p>
I'm Tom Smeding. I love programming, doing math and playing the piano.<br>
I'm a PhD candidate in computer science at Utrecht University in the Netherlands.
</p>
<p>
You can reach me on
<a href="https://matrix.to/#/@tom:tomsmeding.com">Matrix</a>,
<a href="https://libera.chat/">IRC</a>,
<a href="https://telegram.me/tomsmeding">Telegram</a>,
<a href="https://github.com/tomsmeding">Github</a>,
<a href="https://www.linkedin.com/in/tom-smeding">LinkedIn</a>,
<a href="https://www.facebook.com/tom.smeding">Facebook</a> and
via <a href="mailto:t.j.smeding@uu.nl">email</a>,
among others.
Besides GitHub, I also have a number of projects on my <a href="https://git.tomsmeding.com">own server</a>.
</p>
<p>
I've also <a href="/blog">written some notes</a> at some point.
</p>
<details><summary><u>For a list of my academic publications, click here.</u></summary>
<ul id="pubs-ul">
<!--PUBS-REPLACE-START--> (Info missing, server misconfigured, sorry) <!--PUBS-REPLACE-END-->
</ul>
</details>
<h2>My research</h2>
<div class="longtext">
<p>
Currently my primary research interest is in <a href="https://en.wikipedia.org/wiki/Automatic_differentiation" target="_blank">automatic differentiation</a> (AD), as seen through the lens of <a href="https://en.wikipedia.org/wiki/Functional_programming" target="_blank">functional programming</a> (FP), in my case mostly using the programming language <a href="https://haskell.org" target="_blank">Haskell</a>.
My PhD supervisors are Gabriele Keller and Matthijs Vákár.
</p>
<p>
In general I am enthusiastic about lots of things in computer science; I enjoy functional programming and thinking about programming techniques that make FP work well, but I also have some experience in more low-level programming (in C and C++), some of it in <a href="https://stats.ioinformatics.org/people/5681" target="_blank">competitive</a> <a href="https://2019.nwerc.eu/" target="_blank">programming</a>.
I like thinking about how to optimise code to make it run faster, and also about devising compiler optimisations to make other people's code run faster.
Furthermore, I like being able to do all of those things while letting the compiler catch as many of my mistakes as possible; typically I do this by leaning on the type system of the programming language I'm working in.
</p>
<p>
Also send me your favourite esolangs :)
</p>
<!--
<details><summary><u>An intro to automatic differentiation for programmers</u></summary>
<p>
In automatic differentiation (AD), we study how to generalise taking symbolic/algebraic derivatives from simple mathematical expressions to whole computer programs, while keeping computational efficiency in mind.
Being able to take the derivative of a numerical function implemented as a program is useful whenever you want to get some idea of a program's local behaviour when you change its inputs a little; in the case of AD, we consider continuous inputs — in practice, floating point numbers.
(If you're interested in the case where inputs are discrete, such as integers, look to <em>incremental computation</em> instead — about which I know very little.)
</p>
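<p>
(A minimal sketch of the idea, nothing more: forward-mode AD can be implemented with "dual numbers" that carry a value together with its derivative. Real AD systems, and my research, go well beyond this; the names below are purely illustrative.)
</p>
<pre>
-- A dual number pairs a value with its derivative.
data Dual = Dual Double Double

instance Num Dual where
  Dual x dx + Dual y dy = Dual (x + y) (dx + dy)
  Dual x dx * Dual y dy = Dual (x * y) (x * dy + dx * y)  -- product rule
  negate (Dual x dx)    = Dual (negate x) (negate dx)
  abs    (Dual x dx)    = Dual (abs x) (signum x * dx)
  signum (Dual x _)     = Dual (signum x) 0
  fromInteger n         = Dual (fromInteger n) 0

-- Differentiate a function by seeding the derivative component with 1.
diff :: (Dual -> Dual) -> Double -> Double
diff f x = let Dual _ dx = f (Dual x 1) in dx

-- diff (\x -> x * x + 3 * x) 2  ==  7.0
</pre>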
<p>
For example, in machine learning, the usual approach is to formulate a <em>model</em> with a lot of parameters, and then optimise those parameters to make the model compute some kind of target function as well as possible.
This model is a computer program; neural networks (as considered in machine learning) are simply computer programs of a particular, quite structured form.
Optimising a model involves changing the parameters in such a way as to get the output closer to the desired output on some set of inputs, and the derivative (gradient, in this case) of the model tells you, at least locally, how to change the parameters to make the model a <em>little bit</em> better.
Gradient descent, and other more sophisticated algorithms, use this derivative information to optimise a model in some fashion.
AD allows model implementors to have a lot of freedom in designing their models, because whatever they do, the AD algorithm will let the system compute the model's derivative.
</p>
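<p>
(As a toy illustration of how such a gradient is consumed, and not how any real framework is structured: a single gradient-descent step just moves every parameter a small distance against its partial derivative of the loss.)
</p>
<pre>
type Params = [Double]

-- One gradient-descent step; 'grad' would be supplied by an AD system.
step :: Double -> (Params -> Params) -> Params -> Params
step rate grad theta = zipWith (\t g -> t - rate * g) theta (grad theta)

-- Iterating the step gives plain gradient descent.
descend :: Double -> (Params -> Params) -> Params -> [Params]
descend rate grad = iterate (step rate grad)
</pre>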
<p>
Another application of AD is in statistical inference, such as in <a href="https://mc-stan.org/" target="_blank">Stan</a>, where users formulate a <em>statistical</em> model (with some parameters) of some part of the real world and then try to determine, given some real-world observations, where the parameters will approximately lie.
(This is the Bayesian inference approach to statistical inference — frequentist inference fairly directly reduces to optimisation, for example using gradient descent.)
Such inference algorithms need to compute a probability distribution (namely the result: the distribution telling you where the parameters lie). While it is relatively easy to indicate where this distribution is high (likely parameter values) and where it is low (unlikely ones), it is harder to <em>normalise</em> it, which means computing the actual <em>probability</em> of some parameter value (as opposed to just knowing that it is more or less likely than another parameter value).
This normalisation involves integrating the produced unnormalised probability distribution, and clever integration algorithms (for continuous parameters, <a href="https://en.wikipedia.org/wiki/Hamiltonian_Monte_Carlo" target="_blank">HMC</a>) use the local "relief" (derivative) of your statistical model to more quickly find areas of interest, and more quickly converge to a good estimate of the overall integral (and thus the actual probabilities).
Here, again, AD allows model implementors to not have to worry about writing their models in a specific form — any computation will do.
</p>
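<p>
(Again as a toy sketch, not Stan's actual implementation: the heart of HMC is a "leapfrog" integrator in which the gradient of the unnormalised log-density, computed by AD, steers the trajectory through parameter space.)
</p>
<pre>
-- One leapfrog step; 'gradLogP' would be computed by AD, 'eps' is the step size.
leapfrog :: Double -> ([Double] -> [Double])
         -> ([Double], [Double]) -> ([Double], [Double])
leapfrog eps gradLogP (q, p) =
  let pHalf = zipWith (\pj gj -> pj + 0.5 * eps * gj) p (gradLogP q)
      q'    = zipWith (\qj pj -> qj + eps * pj) q pHalf
      p'    = zipWith (\pj gj -> pj + 0.5 * eps * gj) pHalf (gradLogP q')
  in (q', p')
</pre>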
</details>
-->
</div>
</div>
</body>
</html>